Why One-Question Surveys Make for Bad Analytics

If I had to describe Netflix's telephone customer service in one word, that word would be "streamlined."

When you call Netflix customer service, you get no prompting menus, and you are placed on hold for only a few minutes at most. You can even skip to the head of the queue if you use an easily accessed service code from the Netflix website.

Additionally, Netflix refuses to accept calls when the lines get too full. If you happen to call while a major service issue is driving many customers to call in, you will hear -- just before being disconnected -- a curt recording telling you to call back another time. As terrible as this sounds, it may help keep Netflix's customer service representatives from getting jaded by protecting them from potential abuse from customers angry about being on hold for 45 minutes.

However, nowhere is Netflix's streamlined approach to customer service more apparent than at the end of the phone call. After you wrap up your call and the customer service representative wishes you well, you are immediately directed to a customer satisfaction survey.

The survey is one question long:

"We would like to hear about your experience with Netflix. If you are satisfied, press 3. If you are dissatisfied, press 4."

The problem with this question is that it involves major guesswork on the customer's part. Consequently, the response may say more about the respondent than about Netflix -- and therefore may be of little analytical value.

For one thing, it is unclear whether the question asks about satisfaction with the company in general or with the particular customer service phone call. I have always assumed the latter, given the context, but I could be wrong.

For another, once or twice in the past, an exceedingly friendly Netflix customer service representative has assured me that an issue would be handled; I pressed 3 on the survey, only to find later that the issue was handled incorrectly.

Now I have a dilemma. Do I call Netflix back to complain? If the representative handling that call takes care of it for me, do I tell the survey that I'm satisfied (truthful but doing nothing to mitigate the prior "satisfied" rating) or not satisfied (untruthful but potentially more accurate in the aggregate)? The additional problem with the latter approach is that I don't want the person actually helping me to be punished.

On a related note, what if the employees try their best, but I still have the problem? Recently, I called Netflix about some streaming issues I've been having. Ultimately, the representative could not assist me. I honestly have no way of knowing whether the representative was truly not in a position to assist me (in which case, I should vote satisfied, because he tried his best and was very courteous), or if there was something more he could have done, and he either didn't know what it was or just didn't bother (in which case, I should vote dissatisfied). Torn, I hung up on the survey without answering.

Netflix isn't the only company that has these one-question surveys. As many companies overwhelm customers with a laundry list of increasingly complex feedback questions, others -- in the interest of increasing customer participation -- are taking the opposite approach: brutal simplicity.

However, this kind of all-or-nothing survey, leaving no room between two extremes (satisfied and dissatisfied), has serious weaknesses. Sure, with only a straightforward yes-or-no question, the survey is quick and simple, but "quick and simple" is not synonymous with "efficient" -- or, for that matter, "accurate."

Moreover, its usefulness is limited. Netflix is getting little to no insight as to why the customer is satisfied or dissatisfied. Additionally, unless Netflix keeps track of customers who can't get through during a busy time, the customer survey fails to take into account perhaps the most dissatisfied customers of all.

Accuracy, not simplicity, is the touchstone of analytics. Organizations must therefore take a more balanced approach to surveys. Yes, it may make the analytics more difficult, but thinking like an actual customer yields far more accurate -- and actionable -- results.

Joe Stanganelli, Attorney & Marketer

Joe Stanganelli is founder and principal of Beacon Hill Law, a Boston-based general practice law firm.  His expertise on legal topics has been sought for several major publications, including U.S. News and World Report and Personal Real Estate Investor Magazine. 

Joe is also a communications consultant. He has been working with social media for many years -- even in the days of local BBSs (for one of which he served as co-system operator), well before the term "social media" was invented.

From 2003 to 2005, Joe ran Grandpa George Productions, a New England entertainment and media production company. He has also worked as a professional actor, director, and producer.  Additionally, Joe is a produced playwright.

When he's not lawyering, marketing, or social-media-ing, Joe writes scripts, songs, and stories.

He also finds time to lose at bridge a couple of times a month.

Follow Joe on Twitter: @JoeStanganelli

Also, check out his blog.

Re: Survey frustration
  • 10/4/2011 2:16:13 PM

I'm wondering if it is truly a one-question survey. If a person presses 4 for being dissatisfied, are they emailed a follow-up survey to get more information?

The reason I wonder is that Netflix is consistently rated one of the highest websites in customer satisfaction. This year they are only one point behind Amazon.

Re: Survey frustration
  • 9/24/2011 6:27:12 PM

@Cordell  Excellent point regarding banks, an industry fraught with "agnostic arrogance" regarding customer service. And you are so right, banks are missing the point: people are not changing banks as often, despite this disregard for their time and business, because they have little choice. So what is analytics to make of this? Not much, unless you measure the right variables and come from a place of genuine concern for the customer.

Cordell's point about customer service = cost
  • 9/24/2011 3:52:31 AM

When I read about the customer service, I felt a similar vibe to Cordell's point about cost and customer service. He's right, but it's not just banks. Many businesses consider customer service strictly a cost rather than a chance to change the value of the operation. That's essentially what Zappos proved with its customer service (and with its hiring practice of paying prospective employees to quit). Now that there's been a successful model, more consideration of how to improve customer service will come into practice. The only question left is which companies will continue to refine and succeed.

Re: Survey frustration
  • 8/26/2011 4:47:47 PM

That's an excellent point about banks and loyalty.

Re: Survey frustration
  • 8/26/2011 2:41:13 PM

An anecdote about phone trees: I was reading an article complaining about the elaborate phone trees your bank requires you to navigate when you call, and it included this gem -- "It's almost like they'd rather you didn't call at all." Um, that's exactly what it is. For a bank, calls = cost. If the CSR can't cross-sell something when they get you on the line, then they'd rather not take the call at all when a machine can do it. Shortsighted? Maybe, but seriously, when was the last time you switched banks? What some banks may interpret as loyalty is nothing more than being averse to switching!

Netflix might do the same, mistaking loyalty for a lack of alternatives.

Re: Thinking to deep
  • 8/26/2011 2:28:03 PM

Okay, let me turn this scenario on its head. I'm speculating, of course, but what if the survey isn't really designed to evaluate specific customer feedback per se? Instead, it's a tool to evaluate the CSRs. (Beth, a good question when/if you get the interview!) This would lead to all kinds of unusual behavior, like reps disconnecting before you could respond to a bad interaction, or encouraging you to call later to push you off on another rep.

Now let's really get crazy and imagine that someone at Netflix decided that this was useful market data, and upon observing that most customers are "satisfied" (because the data's been skewed), they conclude that customers must be less price sensitive -- hence the change in pricing structure. Well, you can't say they didn't take action!

Bad decision based on bad data? I'm positive I'm overthinking now. Surely Netflix bases its confidence that its loyal customer base isn't going anywhere on more than its phone surveys. The lesson here is that not thinking through the design of the survey and the process for data gathering can bring unintended consequences.

Re: Thinking to deep
  • 8/26/2011 1:08:50 PM


True. It's all about action from a service P.O.V. If you can't act decisively based on feedback, then, in essence, you have nothing. So, much like standard marketing with calls to action, surveys must provide the same answers to those who are looking for answers.

Re: Thinking to deep
  • 8/26/2011 11:44:39 AM

I haven't taken one of the customer support surveys, so I don't know what if any other options are available after selecting a choice.

If there aren't any, you get a high-level overview of what your customers think (as I mentioned, not very useful on its own, but it gives you the temperature). What they should do is break this down a few more steps afterwards.

This sounds to me like they're getting something started and are taking the approach very slowly and simplistically. Too much change for a user is scary.

If this is all they have planned for the future, then I agree: the data isn't going to be worth anything long term.

Re: Thinking to deep
  • 8/26/2011 10:18:34 AM

But scucci, the knowledge as to whether I am "satisfied" in some vague, generic way is not actionable.  It does not tell you why I feel that way, what I feel that way about, or what you can do to maintain or change that feeling.

It's meaningless, and can only lead to blind guesswork.

At least the e-mail surveys asking about picture and audio quality, while they do have their weaknesses (as an earlier commenter pointed out), tell you *something.* So too with the e-mail surveys asking when a particular DVD arrived or was shipped.

But "yes or no: are you satisfied" with no in-between or explanation (either of what the question really means or what the answers really mean)? You may be able to impress a clueless senior executive or board member with a measurement of the numbers on that metric, but it doesn't tell you anything productive. Strictly from the perspective of gathering and applying analytics, you may as well abandon the question and just look at sales volume and customer service call volume.

Re: Need Marketing researcher
  • 8/26/2011 10:12:38 AM

First, let me state that I don't work for Netflix (LOL) and that I might be the only person on this and other boards who likes the UI.

What we have to remember is that we're used to having the ability to customize our systems when we don't like an option.

What Netflix is doing with its UI is making it as EASY to use as possible for the AVERAGE user. When I say average user, I'm thinking about my son and grandmother. They're not concerned that it takes slightly longer or doesn't have the options that we want, because Netflix wants to streamline everything to make it easy to use.

This gives better customer satisfaction since they know how to use it without reading directions or calling customer support.

I think market research is right on the money if you're an average user.
