When you call Netflix customer service, there are no prompting menus, and you are placed on hold for only a few minutes at most. You can even skip to the head of the queue by using an easily accessed service code from the Netflix website.
Additionally, Netflix simply stops accepting calls when the lines fill up. If you happen to call while a major service issue is driving many customers to phone in, you will hear -- just before being disconnected -- a curt recording telling you to call back another time. As harsh as this sounds, it may keep Netflix's customer service representatives from getting jaded by shielding them from abuse by customers angry about being on hold for 45 minutes.
However, nowhere is Netflix's streamlined approach to customer service more apparent than at the end of the phone call. After you wrap up your call and the customer service representative wishes you well, you are immediately directed to a customer satisfaction survey.
The survey is one question long:
"We would like to hear about your experience with Netflix. If you are satisfied, press 3. If you are dissatisfied, press 4."
The problem with this question is that it involves major guesswork on the customer's part. Consequently, the response may say more about the respondent than about Netflix -- and therefore may be of little analytical value.
For one thing, it is unclear whether it is asking about satisfaction with the company in general or with the particular customer service phone call. I have always assumed the latter, given the context, but I could be wrong.
For another, once or twice in the past, an exceedingly friendly Netflix customer service representative has assured me that something would be handled; I pressed 3 on the survey, only to find later that the issue was handled incorrectly.
Now I have a dilemma. Do I call Netflix back to complain? If the representative handling that call takes care of it for me, do I tell the survey that I'm satisfied (truthful but doing nothing to mitigate the prior "satisfied" rating) or not satisfied (untruthful but potentially more accurate in the aggregate)? The additional problem with the latter approach is that I don't want the person actually helping me to be punished.
On a related note, what if the employees try their best, but I still have the problem? Recently, I called Netflix about some streaming issues I've been having. Ultimately, the representative could not assist me. I honestly have no way of knowing whether the representative was truly not in a position to assist me (in which case, I should vote satisfied, because he tried his best and was very courteous), or if there was something more he could have done, and he either didn't know what it was or just didn't bother (in which case, I should vote dissatisfied). Torn, I hung up on the survey without answering.
Netflix isn't the only company with these one-question surveys. While many companies overwhelm customers with a laundry list of increasingly complex feedback questions, others -- in the interest of increasing customer participation -- are taking the opposite approach: brutal simplicity.
However, this kind of all-or-nothing survey, leaving no room between two extremes (satisfied and dissatisfied), has serious weaknesses. Sure, with only a straightforward yes-or-no question, the survey is quick and simple, but "quick and simple" is not synonymous with "efficient" -- or, for that matter, "accurate."
Moreover, its usefulness is limited. Netflix is getting little to no insight into why the customer is satisfied or dissatisfied. Additionally, unless Netflix keeps track of customers who can't get through during a busy time, the survey fails to account for perhaps the most dissatisfied customers of all.
Accuracy, not simplicity, is the touchstone of analytics. Organizations must therefore take a more balanced approach to surveys. Yes, it may make the analytics more difficult, but thinking like an actual customer yields far more accurate -- and actionable -- results.