Creating Customer Experience Surveys That Actually Work
The goal of most service teams is to provide an exceptional support experience – one that differentiates your company from the pack. But how do you know if you are actually delivering on that goal? Many companies choose the survey route. You know, those “rate us from 1-10” experience surveys that follow up via phone call, email, or text. The ones most customers usually ignore. They might ignore them because they feel overwhelmed, or perhaps they wonder whether their feedback effects any change at all. The truth is, organizations rely on surveys to unlock hidden insights and drive measurable change in customer experience – but surveys are only useful if they are designed in a way that makes the customer want to respond.
Designing respondent-centric surveys results not just in more data, but in data that’s more accurate and reliable – data that can confidently shape improvements to your customer support processes.
Here are a few tips to keep in mind when designing your experience survey:
- Edit Ruthlessly
Ask only the questions you genuinely care about and will use. Though it may be tempting to ask a long laundry list of questions, being ruthless when editing out extra “fluff” will pay off in many ways. Survey fatigue sets in quickly and is a major threat to data quality. Shorter surveys result in higher response rates, and those responses will be of higher quality. To make sure you are only asking what you need to ask, review your survey draft or outline with a critical eye for the following considerations:
- Does each question have utility and value for what you need to know?
- Do we already know this?
- Are we really going to use this question in the actual analysis?
- Check your biases and assumptions
It is surprisingly easy for missteps and biases in question wording to seep into a well-intentioned questionnaire. Below are some common examples to avoid:
- Leading questions (e.g., “How much should tech-savvy professionals pay for this service?”) introduce bias and can prevent respondents from answering truthfully. Problems can also arise when the order of your questions suggests an expected or desired way of answering. Be mindful of how the wording and ordering of questions might be perceived by your respondents, and ensure that both are as unbiased as possible.
- Another type of question that isn’t necessarily biased, but is hard to answer for different reasons, is the double-barreled question (e.g., “How satisfied are you with the price and features?”). Each question should contain one distinct topic or concept.
- Speak the respondent’s language
- Not only should your survey questions be edited for length and bias, but they also need to be worded in clear, accessible language that is relevant to everyone responding to your surveys. You want to make the respondent’s job easy. Eliminate jargon and overly wordy language. If you’re unsure whether your questionnaire is accessible and clear, test it out with people outside of your team before launching it more widely.
- Make sure your question wording matches your response options/scale labels. All too often we’ve seen questions that ask something like, “How important are the following statements to you?” with scale points labeled from “Strongly Disagree” to “Strongly Agree.” This question would be easier to answer and more precise if its labels ran from “Very important” to “Not important at all.” These are simple adjustments to make, but they can make a world of difference.
- Think about the extremes or outliers in your population. We often design surveys with the “average” person in mind. However, this is limiting and biased, because this mindset can lead to excluding important answer options for corners of your population. One classic example is the best practice of including a “Not applicable” or “I don’t know” answer choice in your survey items. If you exclude these, certain respondents may find it more difficult to complete your survey, and you risk low completion rates and poor data quality.
Designing respondent-centric surveys can be quite easy once these best practices are in hand, and by doing so, everyone benefits. Your customers benefit from an easy and pleasant survey experience – they get the opportunity to feel heard but not burdened. You benefit from the actionable insights and increased confidence that come with higher response rates and better data quality, as well as from increased credibility in the eyes of your customers.
Did you know that Rescue offers a customer survey feature? Check out our on-demand webinar that addresses how you can get the most out of your customer feedback, as well as how to utilize this capability within Rescue.