Corporate Blogging Survey 2005


July 19, 2005

Survey Engines and Chatting with a Survey Vendor

The lessons learned from the blogging survey were not just lessons for marketing people about blogging; I think the Backbone Media team also picked up some great tips on how to conduct a survey. Backbone Media, Inc. used Zoomerang to manage the survey. In hindsight, I wish we had put each question on its own page rather than letting many of the questions appear on the same page; I believe this would have encouraged more responses from the many people who clicked through to the survey. We had a lot of questions, 32, and while the survey as a whole did not take all that much time to complete, a page full of questions was probably a little overwhelming. One useful feature of Zoomerang is the ability to skip questions depending upon the respondent's answer; we really used this skip function only once: if a respondent was not a blogger, we sent them off to another set of questions. We have not yet published the results of the survey.

The experience of using the engine got me thinking about survey engines in general. In Zoomerang it is not possible to send a respondent a secure, password-protected link, so a survey link might end up posted on the web. And while the skip function is useful, if I wanted to split the survey on more than one or two questions, the design of the survey could get rather complicated.

Arick Rynearson is Research Services Manager for DatStat, a Seattle-based survey engine software company; I knew Arick from when I lived in the city. We recently chatted, and he told me a few things about the application. Here's an overview:

"Designed from the ground up to handle the needs of multi-user, multi-site, multi-survey environments, the DatStat Illume platform provides flexibility and control for the survey user. The technology provides you with the actionable results you need to make better decisions," stated Arick.

I wanted to ask Arick a few questions about his product over time. I've always found that the best way to build a relationship with someone is to meet with him or her on a regular basis. Arick had sent me a list of his product's features, and I've put together the list of the features I am most interested in discussing. Over the next few weeks, I hope to conduct a back-and-forth between the two of us. This is a bit of a reversal of the usual way blogs are used to discuss a product on a company's website, but I thought it would be interesting to try an experiment where the vendor and I have a conversation about their product. We don't use DatStat at Backbone Media, Inc. at the moment, but I'd like to learn more about the product for our own use or for one of our clients. Here are the features I am interested in learning more about.

· Surveys are created as a dynamic series of definable objects, not centered on page or question numbers
· Re-use surveys, with new questions or edits, across time periods to easily monitor trends
· The Data Repository model is far more powerful than a question library: robust search capability to quickly locate historical questions, with click-and-drag re-use; repository management allows only approved questions and locks scale values for future cross-survey trend analysis
· Show-if logic for branching: Operator Wizards to easily define the logic that controls when questions, or collections of questions, are displayed; much more powerful than "skip to"
· Robust analysis interface: build and share powerful queries, analyze trends across time periods or even different surveys, and output to a variety of formats
· Multi-layered, role-based system administration controls what users can access and do for each survey, project, or group
· An XML translation module greatly streamlines the survey localization process

Arick, I understand the concept of objects in programming; how does this work in the survey world? Can you give me some simple examples?

Posted by johncass at July 19, 2005 7:02 PM



Hi John-

Thanks for the invitation to your blog! Excited to enter into this discussion with you here. As a blogging 'newbie', I'm interested to learn more about corporate blogging and how DatStat may be able to use it to connect with new audiences. So... I'm looking forward to learning from you, too, as we play out our exchange of ideas here.

Regarding your question about how the 'concepts of objects in programming' relate to the survey world... Most online survey systems have inherited the legacy of their paper predecessors when it comes to survey design. They organize by 'question number' and 'page number'.

DatStat has taken a different, object-oriented approach. For our purposes, a survey is just a series of objects. Using our Survey Designer GUI, you can build and define a variety of object types: questions, question matrices, images, HTML files, calculations, etc. You can also define the attributes (meta-data) of each object: type of question, what it looks like, associated scale values, what it's called, whether it's required or not, etc.

One of the attributes you can define for each object is whether it's shown or not ('show-if' logic). This show-if logic interface is very powerful for controlling survey branching. As you mentioned, managing survey skips with traditional products is limited and potentially daunting. Traditionally, to branch you ask a question and then 'skip to' the follow-up question by going to question number X or page number Y. The problem with this is that it can be very cumbersome to branch on multiple conditions, and if anything in your survey changes (you add, delete, or move questions), you must go back and recode all your skips because the follow-up question now has a different 'number' or is on a different 'page'.

Using show-if logic, it's quite simple to define the condition(s) for when an object is shown. Essentially, I can dynamically customize my survey to ask a specific follow-up question of any segment of my respondents. Example:

• I want to ask a follow-up question about corporate blogging: "How has corporate blogging impacted your business?"
• I would build that question as an object in the system and give it a name: e.g. 'Blog_Impact'.
• In this example, I really want to hear from 'small startups that work in the technology industry'.
• To achieve this, I would define the 'show-if' condition for Blog_Impact such that it is only shown if
o company_size = 1-100 employees
o in_business
o industry = technology
• For the above, company_size, in_business and industry would all be 'objects' in the system, and the answer ranges would come from their associated scale values.
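To make the Blog_Impact example concrete, here is a minimal sketch in Python of how a question object with a show-if condition could behave. The class and field names here are purely illustrative; this is not DatStat's actual API.

```python
# Illustrative model of a survey question object with 'show-if' logic.
# Names (SurveyObject, show_if, etc.) are hypothetical, not DatStat's API.

class SurveyObject:
    def __init__(self, name, prompt, show_if=None):
        self.name = name
        self.prompt = prompt
        # show_if is a function of the answers collected so far;
        # None means the question is always displayed.
        self.show_if = show_if

    def is_shown(self, answers):
        return self.show_if is None or self.show_if(answers)

# The follow-up question, shown only to small technology startups.
blog_impact = SurveyObject(
    name="Blog_Impact",
    prompt="How has corporate blogging impacted your business?",
    show_if=lambda answers: (
        answers.get("company_size") == "1-100 employees"
        and answers.get("industry") == "technology"
    ),
)

# A respondent who matches the segment sees the question...
print(blog_impact.is_shown(
    {"company_size": "1-100 employees", "industry": "technology"}))  # True
# ...and one who doesn't is skipped past it.
print(blog_impact.is_shown(
    {"company_size": "500+ employees", "industry": "technology"}))  # False
```

Because the condition is attached to the question object itself rather than to a question or page number, moving the object around doesn't break the branching.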

Now, Blog_Impact will only be shown if the conditions I've built have been met somewhere previously in the survey. The system keeps track of answer responses, so I can ask Blog_Impact at any point in the survey (assuming it comes after the other questions that define the logic conditions). If the participant doesn't meet these criteria, they will not see the follow-up question, saving them time and increasing the response rate. Because the logic is tied to the object Blog_Impact, I can 'click and drag' it anywhere in the survey (the logic moves with it!); I never have to recode.

Using 'show-if' you can build powerful surveys that adapt dynamically to the respondent as they answer questions. It enables researchers to 'profile on the fly' and get much deeper, more relevant responses, rather than getting to the analysis phase and wishing they'd been able to ask a certain question of a certain segment. You can also group similar objects into 'collections' and set 'show-if' conditions for the collection. Thus, if there is a series of questions I want to ask of the 'small startups that work in the technology industry', I can put all of those question objects into a 'collection' and set show-if logic for the single collection, rather than having to do it for each individual question: an easy way to organize your survey and a big time saver! You can even drive show-if conditions off of calculated variables or variables that you 'pre-load' into the survey from other data sources.
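As a rough sketch of the 'collections' idea: one show-if condition gates a whole group of questions, so the logic lives in one place. Again, all class and field names are hypothetical, not the real Illume interface.

```python
# Illustrative 'collection': a single show-if condition gates a group
# of questions. Names are hypothetical, not DatStat's actual API.

def is_tech_startup(answers):
    return (answers.get("company_size") == "1-100 employees"
            and answers.get("industry") == "technology")

class Collection:
    def __init__(self, name, question_names, show_if=None):
        self.name = name
        self.question_names = question_names
        self.show_if = show_if

    def visible_questions(self, answers):
        # The one condition decides for the whole group; no need to
        # attach logic to each question individually.
        if self.show_if is None or self.show_if(answers):
            return self.question_names
        return []

startup_followups = Collection(
    name="Startup_Blogging_Followups",
    question_names=["Blog_Impact", "Blog_Frequency", "Blog_Audience"],
    show_if=is_tech_startup,
)

print(startup_followups.visible_questions(
    {"company_size": "1-100 employees", "industry": "technology"}))
# ['Blog_Impact', 'Blog_Frequency', 'Blog_Audience']
print(startup_followups.visible_questions({"industry": "retail"}))
# []
```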

Okay, I fear I'm getting a little 'too' technical. I think I'll stop here… Hopefully this makes sense? Happy to clarify or give other examples if it helps. Can you see how this object-oriented architecture would be beneficial in your own research efforts?

Looking forward to continuing our conversation…

(As a blogging 'newbie', please let me know if there are any 'rules of engagement', etiquette, etc. for conducting our discussion. Hopefully we're off to a good start!)

Posted by: Arick Rynearson at July 22, 2005 6:10 PM


Does your system allow clients to review results while customers take the survey? That is, you ask a question, immediately show the results of the survey so far, and then ask the respondent to answer the next question?


Posted by: John Cass at August 4, 2005 6:36 PM


Our system, DatStat Illume, has a web interface (called Web Console) which enables clients to view, analyze and download respondent data in real-time.

As far as asking the respondent the next question: the system keeps track of how the respondent answers and then 'branches' to present relevant follow-up questions (or collections of questions) depending on the 'show-if' conditions that have been set. The show-if logic effectively tells the system how to branch depending on how the respondent is answering. This happens dynamically; depending on how the show-if conditions are set up and the way a respondent answers questions, they may only see a handful of questions out of a survey with hundreds of potential questions.

So, you can see the results in real time, and the Illume system dynamically presents the follow-up questions depending on how you've set up the 'show-if' logic.

Using our technology, it's also possible to 'pre-load' data from other external data sources (CRM, ERP, web trending software, etc.). This preloaded data becomes variables within the system. You can use these pre-load variables to customize the survey a particular participant sees by 'piping' the info into the survey or by driving 'show-if' conditions off the pre-load variables. You can also factor the pre-load data into future analysis, as it's now part of the data set.
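A small sketch of what 'pre-loading' and 'piping' might look like in practice, assuming a hypothetical CRM record; all field and variable names here are made up for illustration.

```python
# Illustrative pre-load: external CRM fields become survey variables,
# are 'piped' into question wording, and drive a show-if condition.
# All field and variable names here are hypothetical.

crm_record = {"first_name": "John", "last_product": "survey software"}

# Pre-loaded fields join the respondent's session variables.
session = dict(crm_record)

# 'Piping': substitute pre-load variables into the question text.
template = "Hi {first_name}, how satisfied are you with your {last_product}?"
question_text = template.format(**session)
print(question_text)
# Hi John, how satisfied are you with your survey software?

# Driving show-if off a pre-load variable: only respondents with a
# recorded product see the satisfaction question at all.
show_satisfaction = session.get("last_product") is not None
print(show_satisfaction)
# True
```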

One exciting way you can leverage online survey technology is by 'bridging' behavioral and attitudinal data. Essentially, dynamically adaptable surveys enable you to ask very specific follow-up questions about why people behaved the way they did. Why did some see the banner and visit the website while others didn't? Why did some abandon the shopping cart while others completed their purchase? Why could some find what they were looking for on the website while others got lost? The technology effectively enables researchers to understand the 'why' behind the behaviors they measure. Valuable information, I think you'll agree?

Posted by: Arick Rynearson at August 19, 2005 2:34 PM
