Question Order Can Make a Difference in the Data You Collect


Previously, I've written about how to avoid asking the wrong questions in surveys. Today, I'm writing about how to avoid putting survey answers in the wrong order.

As source material, I'm using my recent experience taking a written survey on a television show about which I knew nothing -- including its existence. To start, the survey asked how often I watch the show, offering various frequencies as choices. Naturally, I selected, "Never."

The next question was, "Why did you say you never watch [the television show]?" Here, I could pick the "most correct" answer from several -- at least eight -- options. The first answer was, "Didn't know it was a television show."

I didn't know it was a television show. Again, I had never heard of it. So I promptly selected that first answer without bothering to read the rest.

I was just about to move on to the next page when my gaze happened to catch an option, much farther down the list, that said, "Never heard of it." Because that response was more accurate, I changed my answer accordingly.

This was serendipity, however. I was not particularly up for carefully reading through every one of the many options for each of several questions (a symptom of the "survey fatigue" discussed here, I suppose). I easily could have missed the correct response and given the survey less accurate data.

Indeed, "never heard of it" is even a more likely response than "[heard of it but] didn't know it was a television show," but a person ignorant of the show's existence would be well tempted (as I was) to select the latter option if it was presented first.

If you want the best data from your written surveys, make sure people understand the questions you're asking -- and that means making sure they read the responses in full. Keep this in mind when determining how to order those responses: put the more likely catch-all response before the less likely qualifier.
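To make the idea concrete, here is a minimal sketch -- purely illustrative, with made-up answer options and made-up selection-rate estimates -- of presenting the broad catch-all before the narrower qualifier:

```python
# Purely illustrative: order answer choices so the broad, more likely
# catch-all appears before narrower qualifiers. The options and the
# estimated selection rates below are invented.
options = [
    ("Didn't know it was a television show", 0.10),  # narrow qualifier
    ("Don't watch television", 0.25),
    ("Not interested in the genre", 0.20),
    ("Never heard of it", 0.45),                     # broad catch-all
]

# Present the most likely responses first, so a skimming respondent
# hits the catch-all before the less likely qualifier.
for label, _ in sorted(options, key=lambda pair: pair[1], reverse=True):
    print(label)
```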

Also, because you want to make sure people read what they're responding to (at least one major retailer asks survey respondents a special question, as a fellow blogger wrote about here, to see if they're paying attention), limit the number of possible responses. Precision is great, but any increase in the length of your survey is a tax on your respondents' time -- and thus potentially a tax against your return on investment.

As organizational consultant Doug Williamson mentioned in an AllAnalytics.com interview several months ago, intermediate "hedge" answers, though they seem more precise, may be less helpful. He recommends infusing your survey responses with sharp definition while limiting the number of possible responses, forcing respondents to pick the best option. (For questions asking people to rate something on a scale, Williamson advises limiting the scale to four points.)

If, for whatever reason, you deem that none of this will work for you, you have an alternative: Allow for open responses and employ text analytics to categorize the answers you get (such as in this example).
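As a very rough sketch of that approach -- the categories, keywords, and sample answers below are all hypothetical, and a real text-analytics pipeline would go well beyond simple keyword matching -- open responses might be bucketed like this:

```python
# Hypothetical categories, keywords, and responses; real text analytics
# would add spelling tolerance, stemming, topic modeling, and so on.
CATEGORIES = {
    "never heard of it": ["never heard", "don't know it", "what show"],
    "not interested": ["not interested", "boring", "don't care"],
    "no access": ["don't have cable", "no tv", "can't watch"],
}

def categorize(response: str) -> str:
    """Bucket an open-ended answer by simple keyword matching."""
    text = response.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"  # route these to a person or a richer model

for answer in [
    "Never heard of this show before today.",
    "I don't have cable anymore.",
    "Honestly, it just sounds boring.",
]:
    print(categorize(answer), "<-", answer)
```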

It all comes down to planning your survey efforts effectively (hint: plan backwards). Figure out to the penny how much each piece of information is worth to you. From that, determine which information is really worth finding out.
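As a back-of-the-envelope illustration (all of the dollar figures, time estimates, and questions below are invented), the exercise boils down to comparing what an answer is worth against the respondent time it costs to collect:

```python
# All figures are invented. Compare the estimated value of each answer
# with the respondent time it costs to collect across the whole sample.
RESPONDENTS = 500
COST_PER_RESPONDENT_MINUTE = 0.05  # dollars: incentives, attrition, goodwill

questions = {
    "How often do you watch the show?":   {"value": 400.0, "minutes": 0.25},
    "Why do you never watch it?":         {"value": 300.0, "minutes": 0.50},
    "Rate 12 aspects of the theme song.": {"value": 20.0,  "minutes": 2.00},
}

for text, q in questions.items():
    cost = RESPONDENTS * q["minutes"] * COST_PER_RESPONDENT_MINUTE
    verdict = "keep" if q["value"] > cost else "cut"
    print(f"{verdict:>4}: value ${q['value']:.2f} vs. cost ${cost:.2f} -- {text}")
```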

If you're spewing out survey questions, each with seven or more responses, you probably haven't done your homework -- and the answers probably aren't ordered to encourage understanding and accuracy, either.

The likely result of your haphazard selection of haphazardly ordered answers? Haphazard responses by people who just don't care.

What are your survey peeves? Share on the message board below.

Joe Stanganelli, Attorney & Marketer

Joe Stanganelli is founder and principal of Beacon Hill Law, a Boston-based general practice law firm.  His expertise on legal topics has been sought for several major publications, including U.S. News and World Report and Personal Real Estate Investor Magazine. 

Joe is also a communications consultant.  He has been working with social media for many years -- even in the days of local BBSs (one of which he served as Co-System Operator for), well before the term "social media" was invented.

From 2003 to 2005, Joe ran Grandpa George Productions, a New England entertainment and media production company. He has also worked as a professional actor, director, and producer.  Additionally, Joe is a produced playwright.

When he's not lawyering, marketing, or social-media-ing, Joe writes scripts, songs, and stories.

He also finds time to lose at bridge a couple of times a month.

Follow Joe on Twitter: @JoeStanganelli

Also, check out his blog.


Re: Ambiguous Manipulation
  • 5/1/2012 12:22:29 AM

Hi, Callmebob.

To be fair, as Doug Williamson discussed with me for an earlier piece (and as addressed here), making people commit to exacting language, without allowing them to hedge, actually yields better, more accurate, more truthful results.

At the same time, there are surveys with loaded questions that do try to steer people, particularly in survey contexts outside of customer surveys, and I may have a piece about this topic coming in the future.

Re: Peeves
  • 4/30/2012 10:58:07 PM

Bulk,

You would think the approach would be to gradually build up to the main questions; that way, you are led to be fully engaged and more prone to answer candidly.

Re: Peeves
  • 4/30/2012 3:50:37 PM

@Maryam:

Do-it-yourself web-based survey tools are not the problem. The main issue is the presentation of the information in the questionnaire and the way it is organized. The KISS (Keep It Simple and Straightforward) concept should be applied to online surveys as well.

Open responses
  • 4/30/2012 3:38:04 PM

"Allow for open responses and employ text analytics to categorize the answers you get."

Thank you for the advice. But personally I don't like spending my time writing open responses to survey questions. I prefer to have many answers to choose from.

Ambiguous Manipulation
  • 4/30/2012 1:24:37 PM

My biggest pet peeve is survey questions that force me to respond with an ambiguous and inexact answer -- when none of the responses reflects my true and accurate choice. My market research professor from many years ago explained how surveys can be constructed to steer answers toward a desired result. In these cases, when questions exclude a "none of the above" response, I feel manipulated, as if the survey provider is trying to steer me to its preferred response. If the survey included a "Does this question make you feel paranoid?" question, I would probably answer, "Yes."

Re: Peeves
  • 4/30/2012 1:13:01 PM

I have noticed this as well; more and more of the web-based surveys I have come across are leading off with loaded questions.

Re: Quickest way to incomplete responses
  • 4/30/2012 1:02:05 PM

Most participants begin surveys halfhearted and not fully engaged; many factors play crucial roles in fostering full engagement or leading to disinterest. Length of questions (having to read a short paragraph before answering a question) is at the top of my list. The increasing use of paid participants is evidence.

Peeves
  • 4/30/2012 12:40:07 PM

By far, leading and loaded questions are my greatest peeve; they scream of no analyst involvement. Unfortunately, they are getting more common with all the do-it-yourself web-based survey tools.

Re: Quickest way to incomplete responses
  • 4/30/2012 12:17:33 PM

"Figure out to the penny how much each piece of information is worth to you. From that, determine the information that it is really worth it to you to find out."

This could be said for a lot of research efforts - data mining, model building, decision management etc.  It's too easy to get involved in a fishing expedition!

Quickest way to incomplete responses
  • 4/30/2012 9:33:15 AM

Thank you for asking; I have so many survey pet peeves. Response ordering is high on my list, as is the number of options in sliding scales or any question asking me to rate the degree of my satisfaction. A big block of responses and radio buttons is a great way to encourage respondents to bail.

I have seen the verification questions in surveys. They usually say something like, "For quality purposes, please select ___." If you're reading, you'll answer correctly.
