An average business will not hear from 96% of its dissatisfied customers. When considering that 91% of those dissatisfied customers who go unheard will leave and not return, that statistic becomes an important bottom-line influencer. A dissatisfied customer, however, will often be willing to voice an opinion when prompted by the offending company. One of many ways to solicit this information is through a customer survey. The ultimate goals of this survey are to:
- Identify and fix meaningful problems customers have experienced with the company’s products or services.
- Assess and improve the performance of its customer-facing units (retail locations, call centers, digital care team, etc.) and staff (salespeople, call center reps, etc.).
- Improve its processes and standards for delivery.
- Understand customers’ needs as they use the company’s products or services so the company can provide them a better overall experience.
With these goals in mind, and in order to yield the most actionable feedback, the survey should be carefully designed in the following ways:
Targeted and Focused
The survey should be built around a well-defined objective. Designers should resist the temptation to solve more than one problem at a time, which can spread the data too thin. Questions should be few and direct, so they neither crowd out other important questions nor waste the respondent’s time. When preparing a survey, it is recommended to follow the SONAR guidelines for asking questions:
- Specific: Be direct and avoid asking questions within questions.
- Objective: Minimise bias by avoiding loaded or leading questions.
- Non-ambiguous: Use simple language; avoid jargon and unfamiliar acronyms.
- Actionable: Only ask a question if you can take action on the response.
- Relevant: Make sure each question is relevant both to your goals and objectives and to your respondents’ frame of reference.
Collects Only the Information Needed
‘Concise’ is the name of the game when looking for quality respondent data, so only questions that will yield valuable measurement or insight should be posed. A few well-placed open-ended questions allow respondents to take as much or as little time as they feel necessary, and often speak more candidly to the problem being addressed than simple metrics do. Be short, cover the bases, and allow for respondent elaboration.
Promotes a Good Customer Experience
The ultimate purpose of the survey is likely to improve the customer experience, so the survey itself should contribute to that end. Respondents often complain that surveys are long, boring, ambiguous, and hard to read. The solution is simple: write more engaging, more creative surveys. Remember that on the other side of the 10-question response sheet there is a human. Help that human feel their opinion counts for something, and, of course, avoid unnecessary length: a survey should rarely take longer than 10 minutes.
Works in Medium of Choice
The most actionable survey results come from first understanding the audience, and then deciding how best to reach them. Because data collection methodology can factor into the overall design of the survey, it should be chosen based on how easily potential respondents can be reached as well as on the survey content itself. For example, if respondents are not accessible via email but phone numbers are available, the survey should be administered over the phone. If the survey is designed for phone data collection, certain aspects need to be accounted for in the design, such as the interviewer reading the questions aloud to the respondent. For online administration, designers should consider mobile optimisation and recognise that open-ended responses on mobile tend to be shorter. An interactive voice response (IVR) survey will have to use shorter response scales so that respondents can indicate ratings with the number buttons on the phone. These are only some of the design elements to consider based on the chosen medium of survey administration.
Provides Actionable Information to Address Business Objectives
Be it IVR, paper, phone interview, or online, the end result of the survey should be to make the case for improving CX. To do that, survey designers should pay attention to question type, scale type, and answer choice so all necessary analyses can be conducted. These factors will greatly influence the understanding of where and how the customer experience should be improved.
Often, executives need tangible reasons to dedicate time to improving customer experience (CX). To claim that X causes Y, or that A improves B by C amount, the data points must be structured so that statistically sound answers to those questions can be drawn from them.
Asking the right questions in the right ways will help identify issues, key influencers, and driving factors contributing to the problem with higher confidence, thus revealing steps to the problem’s solution.
Action taken on survey data is one of many ways to improve the customer experience. Improved CX contributes to a company’s CX Maturity, which, when developed, is directly correlated with positive business outcomes such as year-over-year (YOY) financial gains and higher client retention. CX Maturity is governed by how well a business understands its customers, and the customer needs a way of communicating with the business to strengthen the relationship; in this case, that channel is a survey.
A common problem facing survey data collectors is non-response. If the survey cannot reach its intended sample size without bombarding customers with survey requests, another tool called PredictionCX may be needed to make survey responses go further. To learn more about this and other products and tools MaritzCX has to offer, click below: