Why One-Question Surveys Make for Bad Analytics


If I had to describe Netflix's telephone customer service in one word, that word would be "streamlined."

When you call Netflix customer service, you encounter no menu prompts, and you are placed on hold for only a few minutes at most. You can even skip to the head of the queue by using an easily accessed service code from the Netflix website.

Additionally, Netflix refuses to accept new calls when the lines get too full. If you happen to call while a major service issue is driving heavy call volume, you will hear -- just before being disconnected -- a curt recording telling you to call back another time. As terrible as this sounds, it may keep Netflix's customer service representatives from getting jaded by shielding them from abuse from customers angry about being on hold for 45 minutes.

However, nowhere is Netflix's streamlined approach to customer service more apparent than at the end of the phone call. After you wrap up your call and the customer service representative wishes you well, you are immediately directed to a customer satisfaction survey.

The survey is one question long:

"We would like to hear about your experience with Netflix. If you are satisfied, press 3. If you are dissatisfied, press 4."

The problem with this question is that it involves major guesswork on the customer's part. Consequently, the response may say more about the respondent than about Netflix -- and therefore may be of little analytical value.

For one thing, it is unclear whether it is asking about satisfaction with the company in general or with the particular customer service phone call. I have always assumed the latter, given the context, but I could be wrong.

For another, once or twice in the past, an exceedingly friendly Netflix customer service representative has assured me that something would be handled; I pressed 3 on the survey, only to find out later that the issue was handled incorrectly.

Now I have a dilemma. Do I call Netflix back to complain? If the representative handling that call takes care of it for me, do I tell the survey that I'm satisfied (truthful but doing nothing to mitigate the prior "satisfied" rating) or not satisfied (untruthful but potentially more accurate in the aggregate)? The additional problem with the latter approach is that I don't want the person actually helping me to be punished.

On a related note, what if the representative tries his best, but I still have the problem? Recently, I called Netflix about some streaming issues I've been having. Ultimately, the representative could not assist me. I honestly have no way of knowing whether he was truly not in a position to assist me (in which case, I should vote satisfied, because he tried his best and was very courteous), or whether there was something more he could have done and he either didn't know what it was or just didn't bother (in which case, I should vote dissatisfied). Torn, I hung up on the survey without answering.

Netflix isn't the only company that has these one-question surveys. As many companies overwhelm customers with a laundry list of increasingly complex feedback questions, others -- in the interest of increasing customer participation -- are taking the opposite approach: brutal simplicity.

However, this kind of all-or-nothing survey, leaving no room between two extremes (satisfied and dissatisfied), has serious weaknesses. Sure, with only a straightforward yes-or-no question, the survey is quick and simple, but "quick and simple" is not synonymous with "efficient" -- or, for that matter, "accurate."

Moreover, its usefulness is limited. Netflix gets little to no insight into why the customer is satisfied or dissatisfied. Additionally, unless Netflix keeps track of customers who can't get through during a busy time, the survey fails to account for perhaps the most dissatisfied customers of all.
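To illustrate how much that blind spot can matter, here is a minimal sketch in Python, using purely hypothetical numbers (nothing here comes from Netflix), of how excluding blocked callers can inflate a measured satisfaction rate:

# Hypothetical illustration: ignoring callers who were turned away
# can make a one-question survey look rosier than reality.
answered_calls = 1000   # callers who got through and took the survey
satisfied = 850         # callers who pressed 3 ("satisfied")
blocked_calls = 400     # callers turned away during an outage, never surveyed

naive_rate = satisfied / answered_calls                       # counts only survey takers
adjusted_rate = satisfied / (answered_calls + blocked_calls)  # pessimistically assumes blocked callers were not satisfied

print(f"Naive satisfaction:    {naive_rate:.1%}")     # 85.0%
print(f"Adjusted satisfaction: {adjusted_rate:.1%}")  # 60.7%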

Accuracy, not simplicity, is the touchstone of analytics. Organizations must therefore take a more balanced approach to surveys. Yes, it may make the analytics more difficult, but thinking like an actual customer yields far more accurate -- and actionable -- results.

Joe Stanganelli, Attorney & Marketer

Joe Stanganelli is founder and principal of Beacon Hill Law, a Boston-based general practice law firm. His expertise on legal topics has been sought for several major publications, including U.S. News & World Report and Personal Real Estate Investor Magazine.

Joe is also a communications consultant.  He has been working with social media for many years -- even in the days of local BBSs (one of which he served as Co-System Operator for), well before the term "social media" was invented.

From 2003 to 2005, Joe ran Grandpa George Productions, a New England entertainment and media production company. He has also worked as a professional actor, director, and producer.  Additionally, Joe is a produced playwright.

When he's not lawyering, marketing, or social-media-ing, Joe writes scripts, songs, and stories.

He also finds time to lose at bridge a couple of times a month.

Follow Joe on Twitter: @JoeStanganelli

Also, check out his blog.



Survey frustration
  • 8/22/2011 10:49:27 AM

Joe, I know this frustration well, having recently had a similar experience with Comcast. I spent way too much time on a recent Sunday afternoon working with myriad Comcast techs as I tried to update my cable modem/router setup. Each time I had to initiate another call -- which happened multiple times during this process for a variety of reasons, including several instances of misrouting and disconnections -- I was asked if I'd mind taking a quick survey at the end of my call. I did once, after talking to the first guy, who was nice and helpful; even though he ended up not being able to help me, he at least acknowledged that and successfully passed me along to the next tech down the line. She, too, was helpful -- or so I thought, and so I rated the overall experience as positive. However, in thinking about the experience after the fact, I kind of think she just wanted me off the phone because she didn't know how to fix my problem. She told me to call back after my new modem had time to download all the appropriate software from the network, but that ended up resulting in about two hours and all sorts of misdirection. Ultimately, I had to reinstall my old modem. I should have taken multiple surveys expressing my irritation, but as the day wore on, all I wanted was my dang connection reestablished -- and the heck with providing Comcast with usable feedback!

Re: Survey frustration
  • 8/22/2011 11:27:10 AM

Hi, Beth.

Yes, I also have issues with Comcast's surveys.

I once made the mistake of agreeing to take a survey, then decided I didn't want to after hearing the loaded question and hung up. Comcast continued to auto-call me for WEEKS after the fact. Now I never take their surveys.

Plus, you can't take the survey when a particularly dull-witted Comcast employee abruptly disconnects you -- which happened to me about a week ago.

Re: Survey frustration
  • 8/22/2011 1:47:28 PM

I like Netflix's idea of the one-question picture-quality survey you randomly get after viewing a movie, but there is also a big problem with that question. Can you guess what it is? The email question reads:

Survey: How was the Picture and Audio Quality?

Dear Jaime,

You recently watched Goya's Ghosts. To help us ensure a great experience for all members, would you take a moment to tell us about the picture and audio quality?

The quality was very good

The quality was acceptable

The quality was unacceptable

-------

The problem is that it is trying to rate two different aspects of the movie. What if the video was good but not the audio? This composite question should be separated into two, or the survey should show me only the audio or video question and show the next person who watches the other one. With as many movie watchers as they have, I am sure Netflix would get a good sample and be able to react when a movie's audio or video is bad. Fortunately, I've always rated their instant viewing positively and never had a problem with video or audio.

Your post reminds me of the one-question survey you get from a restaurant waiter: "Was the food good?" I have to say that even when it wasn't, I always say yes. Being more specific would entice the customer to be more honest. For example, was the chicken seasoned correctly? Was the milkshake too thick? Restaurants could gather lots of info if they just selected the right one question.

Re: Survey frustration
  • 8/22/2011 2:44:36 PM

Jaime,
Agree totally. I think they've tried to make the process simple and, well, short, but they've overdone it. They are also trying to simplify the data being collected, but there are too many ways to interpret the answers for the process to be helpful to the company -- which should be the whole point of the survey in the first place.

Re: Survey frustration
  • 8/22/2011 3:41:57 PM

The Netflix satisfaction survey, and so many others out there, smacks to me of somebody THINKING they ought to be collecting data but not really analyzing the why of it or, more importantly, putting themselves in the customer's seat and judging the survey question from that perspective. (Or maybe they did, but determined there was no way to get to the info they really wanted in a simple, straightforward fashion and this is simply their "throw in the towel" approach!)

Re: Survey frustration
  • 8/22/2011 4:16:15 PM


Indeed, getting more customers to participate in satisfaction surveys may yield more data but does not guarantee better insights. Maybe we should start a movement called "little data"... :-)

As for Netflix, it is probably more interested in analyzing (or at least watching closely) the data they get from their customers anyway, without the need to beg for participation. This leads to reduced calls to the streamlined call center and to proactive customer service. Earlier this year, I was playing with their video streaming service, starting and stopping a number of movies after a few minutes. The next day (possibly the next hour), I got an email from Netflix apologizing for the difficulties I had in using their video service and offering me a few dollars off my next monthly bill. I'm sure Netflixers are much more excited about performing these kinds of customer delighters than about designing and analyzing traditional satisfaction surveys. Given that Netflix Instant accounts for 20% of peak U.S. bandwidth use, their analysts and data scientists probably prefer to spend their time analyzing Internet usage patterns, capacity constraints, and performance everywhere in the U.S. -- knowledge that must represent an enviable competitive and market differentiator and drive their proactive customer service.


Re: Survey frustration
  • 8/22/2011 9:10:59 PM

GilPress,

You bring up an interesting point, namely: is the customer survey much of a survey at all, or is it a kind of customer service -- one more example of how Netflix welcomes feedback even though they limit the calls they will receive? I remember when we started using Netflix in our home and how difficult it became to justify a trip to the video rental store...when there still was one in business locally! Why? Because for the cost and the number of titles offered, not only with streaming video but also with home delivery, there wasn't much competition. I don't mean this to sound like a Netflix ad. The point I am making is that it's almost like complaining about free services from Google. Given the adoption rate of these technologies, how helpful could such surveys really be, and aren't there likely better places to apply analytics in terms of mapping growth and examining the next new opportunity?

Re: Survey frustration
  • 8/22/2011 11:47:17 PM

I'm a Netflix subscriber too. I always long for a second question, or at least a comment box, on their one-question surveys. I don't want to spend a long time completing a survey, but I want to give accurate feedback. I have wondered what data they were most interested in with these seemingly one-off questions.

Wasted Effort
  • 8/23/2011 12:23:28 AM

@Joe   As I read your blog (which is excellent), I could not really believe what I was reading! One-question surveys? A sort of true-or-false scenario? You have got to be kidding me -- yes, I know you are not.

Anyhow, this is discouraging at best. It looks like Netflix and companies like it are trying to make analytics as easy as possible and are, as you say, missing the boat entirely.

What can they possibly hope to garner from such an exercise? I cannot for the life of me understand why a company would foolishly waste the effort.

Need Marketing researcher
  • 8/23/2011 2:21:27 AM

Truly sad; they might have used the same survey to determine the price increase they recently implemented, which has caused mass user backlash. It is sad that so many companies are cutting back on research as a result of the economy; if they had a marketing researcher on staff or on contract, such a mistake would not have been made. Some of the "do it yourself" research tools on the web have made companies think they can cut out researchers and the software will do it for them -- this is a critical mistake that they will see in their erroneous results.
