3 Ways to Completely Mess Up Customer Surveys
Customer surveys are everywhere – and most of them aren’t going to offer one ounce of insight. Instead, learn to use them properly and for the right reasons.
Customer surveys are inescapable these days. They include pop-ups that ask how you are enjoying the product while you are trying to get something else entirely done. Then there are the ones that beg you to linger in your online session to ‘answer a few questions about our service’. Or, my latest all-time favorite, a bank whose employees make sure to tell you when your business is complete that you’re going to get a survey about them and “anything less than a perfect 10 will be problematic for me.” Not kidding, that happened to me just this past week. Here’s the problem – most of these things aren’t worth the digital ink that’s spilled on them. So, how to do better?
Use relative, not absolute, measures
The first big problem with most surveys is that they rely on absolute measures. Questions like, “on a scale of 1 to 10, how did we do?” are typical. Or “to what extent do you agree with the following statement?” The reason such questions are meaningless is that unless you know the standard your respondent is comparing you to, you have absolutely no idea what a score of, say, 8 means.
I mean, think about asking your significant other about the quality of your current relationship. Say they respond, “I don’t know, I guess about a three.” What do you do with that?
Instead, you need some kind of comparison standard. An easy one is, “Relative to your expectations, how well did we do?” A little more complex might be to compare your service to a specific alternative. Or to doing absolutely nothing. Or to alternative ways that a given outcome could be accomplished. At least then you know the respondents’ frame of reference and have something you can work with.
Include some kind of dependent variable
Another big issue with most surveys is that they go nowhere. At best, you find out that the customer says they might be willing to refer you to someone else. With surveys like that, you are assuming that positive results translate into some kind of behavior you would like to encourage – and you have absolutely no idea whether they do.
Think about it – how many times have you politely answered a question because you know it’s the answer the person asking wants to hear? So too with surveys.
Instead, see if you can get some insight into the customer’s future behavior. Asking something like “the next time you are looking for voice-recognition enabled widgets, how likely are you to get ours?” is better. “How much of your budget for voice-recognition enabled widgets do you plan to spend with us in the next six months?” is another.
The gold standard, which is hard to do but which is increasingly possible given how much software surrounds our every move, is to link survey responses to actual behavior in real life. Mastercard, for instance, under the leadership of our friend Gary Kearns, was able to do this because it could connect advertisements or other inducements to subsequent actual purchasing (Gary is a genius at building information products – we’re big fans).
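If you have both datasets, the mechanics of the linkage can be straightforward. Here is a minimal sketch in Python with pandas – not Mastercard’s method, just an illustration – using hypothetical files and column names (customer_id, score, survey_date, purchase_date, amount) to show the kind of join involved:

```python
import pandas as pd

# Hypothetical inputs: one row per survey response, one row per purchase.
surveys = pd.read_csv("survey_responses.csv", parse_dates=["survey_date"])
purchases = pd.read_csv("purchases.csv", parse_dates=["purchase_date"])

# Attach each respondent's score to their purchases, then keep only
# purchases made in the 90 days after the survey was answered.
merged = purchases.merge(
    surveys[["customer_id", "survey_date", "score"]], on="customer_id"
)
window = merged[
    (merged["purchase_date"] > merged["survey_date"])
    & (merged["purchase_date"] <= merged["survey_date"] + pd.Timedelta(days=90))
]
spend = window.groupby("customer_id")["amount"].sum().rename("spend_90d")

# Every respondent appears, with zero spend for those who never came back.
linked = surveys.set_index("customer_id").join(spend).fillna({"spend_90d": 0})

# The question that matters: does the stated score predict actual spend?
print(linked.groupby("score")["spend_90d"].agg(["count", "mean"]))
```

If mean spend barely moves as the scores climb, your survey is measuring politeness, not behavior.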
Remember the “zone of indifference”
I’ll borrow a graphic developed by the RFI Group here because it lays out the problem clearly. We tend to assume that some variable like satisfaction (“give me a 10! Give me a 10!”) correlates with something we care about, such as loyalty or share of wallet, as in the following graph:
[Graphic: RFI Group chart of satisfaction plotted against loyalty/share of wallet]
Here’s the trouble. It doesn’t. The first problem is that I may indeed have scored that bank employee a “10,” but that doesn’t mean I’m going to give the bank any more business. In fact, the thing I contacted them about was cancelling a credit card – they would have gotten a lot more insight into why I cancelled it by asking me, instead of wasting their time on a basically meaningless survey. They would have learned, for instance, that I got talked into an expensive card by a bank employee while I was there for something else, and had second thoughts once I got home.
The second problem is more serious: between being actively dissatisfied (see my previous rant on the topic with respect to Verizon) and being a totally exhilarated advocate lies a long, deep zone of indifference. In other words, I might well report more satisfaction, but that doesn’t mean it’s going to translate into anything that you, the provider, care about.
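You can check whether your own data has this flat middle by banding the scores and comparing actual behavior across the bands. Continuing the hypothetical linked dataset from the sketch above (the cut-offs are assumptions, not an industry standard):

```python
import pandas as pd

# Band the 1-10 scores; the cut-offs below are illustrative assumptions.
bands = pd.cut(
    linked["score"], bins=[0, 4, 8, 10],
    labels=["dissatisfied", "indifferent", "advocate"]
)

# If spend is flat across the wide middle band, nudging a 5 up to a 7
# buys you nothing – that is the zone of indifference in your own numbers.
print(linked.groupby(bands, observed=True)["spend_90d"].agg(["count", "mean"]))
```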
What this means is that you have two sweet spots when it comes to investing to increase customer satisfaction.
One is to invest just enough to get customers to neutral – this is a favorite strategy of firms whose customers have no or very little choice about which provider to use. Think airlines, phone companies, your cable company, etc. They know you are a hostage, not a customer, and they behave accordingly.
The other is to invest enough to provoke real delight on the part of your customer, to the point at which they become advocates for your company and brand. The Wall Street Journal recently published an analysis of companies that seem to be getting this right – satisfying customers AND getting more of their business.
So before you plunk down money or spend time on a customer survey, remember to ask whoever is trying to talk you into doing so:
· What comparison are you asking the respondent to make?
· How will we know if the answers lead to more business?
· Are we wandering through the zone of indifference?
Want to know how your organization is doing on getting beyond innovation theater?
You might find our Innovation Mastery Survey to be useful. We have a few different models for how you can do the survey and analyze the results, from a pretty inexpensive self-serve version to a more elaborate diagnostic and debrief with me. Get in touch – write us at growth@valize.com.
Get in Touch, Keep in Touch!
If you’d like to keep up to date on upcoming events you can sign up to the newsletter at this link: https://www.ritamcgrath.com/newsletters/