July 19, 2005
Survey Engines and Chatting with a Survey Vendor
The lessons learned from the blogging survey were not only lessons for marketing people about blogging; I think the Backbone Media team also picked up some great tips on how to conduct a survey. Backbone Media, Inc. used Zoomerang to manage the survey. In hindsight, I wish we had split each question onto its own page rather than leaving many of the questions on the same page; I believe this would have encouraged more responses from the many people who clicked through to the survey. We had a lot of questions, 32 in all, and while the survey as a whole did not take all that much time to complete, a page full of questions was probably a little overwhelming. One useful feature of Zoomerang is the ability to skip questions depending upon the respondent’s answer. We used this skip function only once: if a respondent was not a blogger, we sent them off to another set of questions. We have not yet published the results of the survey.
The experience of using the engine got me thinking about survey engines in general. In Zoomerang it is not possible to send a respondent a secure, password-protected link, so a survey link might end up posted on the web. And while the skip function is useful, if I wanted to split the survey on more than one or two questions, the design of the survey could get rather complicated.
Arick Rynearson is Research Services Manager for DatStat, a Seattle-based survey engine software company; I knew Arick from when I lived in the city. We recently chatted, and he told me a few things about the application. Here’s an overview:
“Designed from the ground up to handle the needs of multi-user, multi-site, multi-survey environments, the DatStat Illume platform provides flexibility and control for the survey user. The technology provides you with the actionable results you need to make better decisions,” stated Arick.
I wanted to ask Arick a few questions about his product over time. I’ve always found that the best way to build a relationship with someone is to meet with him or her on a regular basis. Arick had sent me a list of his product’s features, and I’ve put together the list of the features I am most interested in discussing. Over the next few weeks, I hope to conduct a back-and-forth between the two of us. This is a bit of a reversal of the usual way blogs are used to discuss a product on a company’s website, but I thought it would be interesting to try an experiment where the vendor and I have a conversation about their product. We don’t use DatStat at Backbone Media, Inc. at the moment, but I’d like to learn more about the product for our own use or for one of our clients. Here are the features I am interested in learning more about:
· Surveys are created as a dynamic series of definable objects - not page or question number centric
· Re-use surveys, with new questions or edits, across time periods to easily monitor trends
· Data Repository model is far more powerful than a question library: robust search capability to quickly locate historical questions; click and drag to re-use; repository management allows only approved questions; locks scale values for future cross-survey trend analysis
· Show-if logic for branching: operator wizards to easily define the logic that controls when questions or collections of questions are displayed; much more powerful than “skip to”
· Robust analysis interface; build and share powerful queries, analyze trends across time periods or even different surveys; output to a variety of formats
· Multi-layered, roles-based system administration provides control on what users can access and perform for each survey, project, or group.
· XML translation module greatly streamlines survey localization process
Arick, I understand the concept of objects in programming; how does this work in the survey world? Can you give me some simple examples?
Posted by johncass at July 19, 2005 7:02 PM
Comments
Hi John-
Thanks for the invitation to your blog! Excited to enter into this discussion with you here. As a blogging 'newbie', I'm interested to learn more about corporate blogging and how DatStat may be able to use it to connect with new audiences. So... I'm looking forward to learning from you, too, as we play out our exchange of ideas here.
Regarding your question about how the 'concepts of objects in programming' relate to the survey world... Most online survey systems have inherited the legacy of their paper predecessors when it comes to survey design. They organize by 'question number' and 'page number'.
DatStat has taken a different, object-oriented approach. For our purposes, a survey is just a series of objects. Using our Survey Designer GUI, you can build and define a variety of object types: questions, question matrices, images, HTML files, calculations, etc. You can also define the attributes (meta-data) of each object: type of question, what it looks like, associated scale values, what it’s called, whether it’s required or not, etc.
One of the attributes you can define for each object is whether it’s shown or not (‘show-if’ logic). This show-if logic interface is very powerful for controlling survey branching. As you mentioned, managing survey skips with traditional products is limited and potentially daunting. Traditionally, to branch you ask a question and then ‘skip to’ the follow-up question by going to question number X or page number Y. The problem with this is that it can be very cumbersome to branch on multiple conditions, and if anything in your survey changes (you add, delete or move questions) you must go back and recode all your skips because the follow-up question now has a different ‘number’ or is on a different ‘page’.
Using show-if logic, it’s quite simple to define the condition(s) for when an object is shown. Essentially, I can dynamically customize my survey to ask a specific follow up question of any segment of my respondents. Example:
• I want to ask a follow-up question about corporate blogging: “How has corporate blogging impacted your business?”
• I would build that question as an object in the system and give it a name: ex. ‘Blog_Impact’.
• In this example, I really want to hear from ‘small startups that work in the technology industry’
• To achieve this, I would define the ‘show-if’ condition for Blog_Impact such that it is only shown if
o company_size = 1- 100 employees
o in_business
o industry = technology
• for the above, company_size, in_business and industry would all be ‘objects’ in the system and the answer ranges would be from their associated scale values
Now, Blog_Impact will only be shown if the conditions I’ve built have been met somewhere previously in the survey. The system keeps track of answer responses, so I can ask Blog_Impact at any point in the survey (assuming it comes after the other questions that define the logic conditions). If the participant doesn’t meet these criteria, they will not see the follow-up question, saving them time and increasing the response rate. Because the logic is tied to the object Blog_Impact, I can ‘click and drag’ it anywhere in the survey (the logic moves with it!); I don’t ever have to recode.
Using ‘show-if’, you can build powerful surveys that adapt dynamically to respondents as they answer questions. It enables researchers to ‘profile on the fly’ and get much deeper, more relevant responses, rather than getting to the analysis phase and wishing they’d been able to ask a certain question of a certain segment. You can also group similar objects into ‘collections’ and set ‘show-if’ conditions for the collection. Thus, if there is a series of questions I want to ask of the ‘small startups that work in the technology industry’, I can put all of those question objects into a ‘collection’ and set show-if logic for the single collection, rather than having to do it for each individual question: an easy way to organize your survey and a big time saver! You can even drive show-if conditions off of calculated variables or variables that you ‘pre-load’ into the survey from other data sources.
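(Editor’s aside: to make Arick’s Blog_Impact example concrete, here is a rough Python sketch of what object-based show-if logic might look like conceptually. The class, field, and variable names are invented purely for illustration; they are not DatStat’s actual interface.)

```python
# Toy sketch of "show-if" logic: a question object carries its own display
# condition, so moving the question never requires recoding any skips.

class Question:
    def __init__(self, name, text, show_if=None):
        self.name = name          # object name, e.g. 'Blog_Impact'
        self.text = text
        self.show_if = show_if    # callable: answers dict -> bool

    def is_shown(self, answers):
        # With no condition attached, the question is always shown
        return self.show_if is None or bool(self.show_if(answers))

# Blog_Impact is shown only to small, in-business technology startups,
# mirroring the three conditions in the example above
blog_impact = Question(
    "Blog_Impact",
    "How has corporate blogging impacted your business?",
    show_if=lambda a: (a.get("company_size", 0) <= 100
                       and a.get("in_business", False)
                       and a.get("industry") == "technology"),
)

respondent = {"company_size": 40, "in_business": True, "industry": "technology"}
print(blog_impact.is_shown(respondent))   # True for this respondent
```

Because the condition travels with the object rather than pointing at a question or page number, reordering the survey leaves the branching intact.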
Okay, I fear I’m getting a little ‘too’ technical. I think I’ll stop here… Hopefully this makes sense? Happy to clarify or give other examples if it helps. Can you see how this object-oriented architecture would be beneficial in your own research efforts?
Looking forward to continuing our conversation…
(as a blogging ‘newbie’, please let me know if there are any 'rules of engagement', etiquette, etc. for conducting our discussion- hopefully we're off to a good start!)
Posted by: Arick Rynearson at July 22, 2005 6:10 PM
Arick,
Does your system allow clients to review results as customers take the survey, i.e. you ask the question and then immediately show the results of the survey and then ask the survey respondent to answer the next question?
John
Posted by: John Cass at August 4, 2005 6:36 PM
John-
Our system, DatStat Illume, has a web interface (called Web Console) which enables clients to view, analyze and download respondent data in real-time.
As far as asking the respondent to answer the next question, the system keeps track of how the respondent answers and then ‘branches’ to present relevant follow-up questions (or collections of questions) depending on the ‘show-if’ conditions which have been set. The show-if logic effectively tells the system how to branch depending on how the respondent is answering. This happens dynamically: depending on how the show-if conditions are set up and the way a respondent answers questions, they may see only a handful of questions out of a survey with hundreds of potential questions.
So, you can see the results in real-time and the Illume system dynamically presents the follow-up questions depending on how you’ve setup the ‘show-if’ logic.
Using our technology, it’s also possible to ‘pre-load’ data from other external data sources (CRM, ERP, web-trending software, etc.). This pre-loaded data becomes variables within the system. You can use these pre-load variables to customize the survey a particular participant sees by ‘piping’ the info into the survey or by driving ‘show-if’ conditions off the pre-load variables. You can also factor the pre-load data into future analysis, as it’s now part of the data set.
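(Editor’s aside: a minimal Python sketch of the ‘pre-load’ and ‘piping’ idea described above. The variable names and the CRM fields are hypothetical examples, not DatStat’s real API.)

```python
# Pre-loaded variables, e.g. pulled from a CRM system before the survey starts
preload = {"first_name": "Dana", "last_purchase": "running shoes"}

# Piping: substitute the pre-load variables into the question wording
template = "Hi {first_name}, how satisfied were you with your {last_purchase}?"
question_text = template.format(**preload)
print(question_text)
# Hi Dana, how satisfied were you with your running shoes?

# The same pre-load variables can also drive show-if conditions:
# only ask the follow-up if we actually know of a past purchase
show_followup = preload.get("last_purchase") is not None
```

Since the pre-loaded values become ordinary variables in the data set, they can later be crossed against survey answers during analysis.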
One exciting way you can leverage online survey technology is by ‘bridging’ behavioral and attitudinal data. Essentially, dynamically adaptable surveys enable you to ask very specific follow-up questions about why people behaved the way they did. Why did some see the banner and visit the website while others didn’t? Why did some abandon the shopping cart while others completed their purchase? Why could some find what they were looking for on the website while others got lost? The technology effectively enables researchers to understand the ‘why’ behind the behaviors they measure: valuable information, I think you’ll agree?
Posted by: Arick Rynearson at August 19, 2005 2:34 PM