After nearly 10 minutes of answering questions, I asked the survey-taker how much longer the survey would take. When her answer was far more minutes than I was willing to spend on the phone, I begged off and hung up.
The problem for me wasn't too many questions. It was too many wrong questions.
The survey-taker had peppered me with a tedious series of loaded questions. Several were inapplicable. Many more were so poorly worded as to be virtually unanswerable. I was getting survey fatigue, which the blogger Sandra Gittlen discussed this week. The two of us will talk about that topic today, Feb. 2, at 2:00 p.m. ET in an instant e-chat here.
In one series of questions, the Blundersurv survey-taker repeatedly asked me about the quality of service I'd received from the company for as long as I'd used the service. She made a point of asking me to consider how I felt about my entire history of doing business with Blundersurv -- with the pointedly clear implication that I was to give equal weight to recent events and those from long ago.
I've been a Blundersurv customer for many years. Several years ago, Blundersurv gave me some of the worst customer service of my life, and I came very close to canceling. Since then, the customer service has improved considerably, and I am now a mostly satisfied customer. To answer the questions honestly, however, I'd have had to give a misleading series of "dissatisfied" responses.
Additionally, Blundersurv's survey asked nitpicky questions about quality-of-service ills I would have never thought to consider -- until I was asked. Suddenly, every minor frustration I'd had with Blundersurv in the past year flooded into my consciousness.
By asking me these questions, not only was Blundersurv obtaining irrelevant, misleading, and obsolete information about my present satisfaction, but it was also forcing me to reflect upon almost every single problem I have had with the company in the past -- problems from months and even years ago.
These reflections, combined with the burden the survey was placing on my time and energy, made me irritated with Blundersurv. I began to doubt the wisdom of continuing to do business with the company.
What can you learn from Blundersurv's blunder?
- Don't beat around the bush. Ask the customer what you really want to know. Have you satisfied the customer? Will the customer recommend your organization to others? Boil it down to what's important -- and keep it simple.
- Ask yourself, "Do I care?" Be realistic with yourself about what you plan to do with the data you collect. If a customer had a bad experience at your store two months ago, will you attempt to rectify it? If so, great. If not, don't ask about it.
- Understand the value of your data -- put a dollar figure on it. Asking a telecom customer about overall satisfaction with call quality is OK. Asking that customer about every dropped call or bad connection may be too much. At best, the customer may feel badgered. At worst, you may open a can of worms. These details may be valuable in efforts to improve your call quality, but you must determine if getting them is genuinely worthwhile. Remember that every question you ask your customers is a tax against the good will you have earned from them.
These tips alone may or may not lead to the perfect survey question, but they'll at least help you avoid asking the wrong one.
What are your best practices for ensuring your surveys produce great, usable data for analytics? Share on the message board below, and be sure to join our e-chat this afternoon. Again, we'll be chatting at 2:00 p.m. ET about the death of surveys and the rise of social media analytics. Join the chat, and share your insight.