This post originally appeared as a podcast on Allegiance Radio.
Hello, everybody, my name is Jeff Olsen, your host, and I want to welcome you to another episode of Allegiance Radio. Allegiance is a company in Salt Lake City, Utah, that provides data gathering and data analysis software. I also want to just give a shout-out to our listeners, those of you who help us promote our brand and help us keep the show going. Hopefully this information you’re getting helps give you some suggestions on how you can promote your own VOC programs, improve the quality of your surveys, or whatever motivates you to listen. We’re happy to have you.
I’d also like to invite everyone to give us a shout and let us know what you think of the show, or if you have any suggestions on what we can do to improve it, or what topics you’d like to see discussed in future episodes. We would certainly welcome that!
Today is part two of our mini-boot camp on surveys. Last time we aired, we started talking about first-timers and novice survey designers, and we're going to pick up where we left off. As those of you who listened to the previous show will remember, we talked a little bit about designing the survey with the end in mind, knowing who your audience is, and knowing who it has to be delivered to, as key elements of getting your survey noticed and increasing response rates.
Where we’d like to pick up today is what I refer to as “survey etiquette,” and also just some basic elements on good question design, because it doesn’t take a rocket scientist to get a Zoomerang license or a Survey Monkey license and to start popping some questions in there and sending them out. There is a little more to it than that, and I’d like to give you a few tips today on what I have learned, through my experience doing a lot of high-end customer VOC programs for very large enterprise companies, as well as very small ones, and doing mini-surveys for my own purposes.
Good Survey Etiquette
I think to start this off, I’d like to give you a story about a friend of mine who works in marketing and who was telling me about purchasing a new car. She said “I bought a new Kia, and I’ve never owned a Kia.” Turns out it wasn’t just the car she liked, she really loved the dealership and found the whole experience of buying a new car very nice.
I have to tell you, I rolled my eyes, because I hate shopping for a car. It’s like going to the dentist, only worse. But I was quite intrigued and asked her what she found so great about it. Here’s what she said:
“When I went there, the sales rep was low-pressure, really took into account my needs, and made me feel valued, and then when the time came that I signed the dotted line and bought the car, I didn’t want to trade in my old one, and so one of the sales reps actually got in my old car, followed me where I was going, to my house, and then I brought him back. He went out of his way to just help me do that, and I was so in love with this dealership, and they sent me a survey. The first time I took the survey, absolutely high marks. Tens across the board. Just loved it.”
Then it got funny . . .
“Five days later, I got another survey, and then pretty soon I was on this frequent survey list and they said every time I fill this out I get a dollar. After a while, there were so many surveys coming out, and I got so incensed with this company that I actually started hating them, and I started commenting inside the surveys saying, ‘I hate your company, don’t ever send me another survey. The dollar is not worth it. Don’t send me the dollar and don’t send me another survey.'”
The surveys really changed her original opinion of the dealership. So this is the idea of survey etiquette, and it's the first thing I want to cover, because we’re all intrigued and interested in getting data, in getting our customers or our employees to respond, and for some of us, our KPIs (Key Performance Indicators) are measured with this.
Because we keep our jobs as long as we keep more people giving us positive feedback, we tend to over-survey and not think of the actual respondent and what their experience is. So I’ve put together what I refer to as my four basic rules of survey etiquette, and that’s where I’d like to start today.
4 basic rules of survey etiquette
1. Don’t ask a question unless you can use the result
A lot of times we go to committee, we want to make sure we’re getting all the questions in a fair representation to all departments, and somebody squeezes in a question like, “How tall are you?” or, “How often do you buy fast food?” Something like that, it just makes absolutely no sense when the participant is reading through the survey. “Why do you care how tall I am? What are you going to do with this?” You need to differentiate between what I refer to as the “need to know questions” and the “nice to know questions.”
If you cannot use the results of the question to further the GOALS of your survey, do not ask it. Irrelevant questions make people drop off; they make them abandon the survey. The questions have to flow and make sense, and they have to make sense with the objective of the survey. Think of yourself: would you answer a question like that? Probably not.
2. Tell respondents why you’re surveying them
Respondents are a lot more likely to answer a survey if they know what you’re trying to accomplish and how it might affect them. For example: “We are conducting a customer satisfaction survey because we care what you think, and we will use the data to improve the overall customer experience.” They want to know their answers aren’t going into a vacuum somewhere, that you’re not just checking this off as something you have to do. Tell them how giving you data is going to make things better for them, or influence their experience the next time.
3. Don’t cause survey fatigue
This third rule of survey etiquette comes back to my friend who bought the Kia, don’t cause survey fatigue! Keep the survey as short as possible, keep it as simple and straightforward as possible, and don’t over-survey.
She might have been exaggerating because she knows I work in the business, but her opinion was that within two or three months after buying her car, the dealership had sent her somewhere in the neighborhood of eight or ten surveys. For relationship surveys, even once every six months is pretty frequent. Transactional surveys are a little bit different: every time customers have an interaction with you, you can offer a survey. Don’t farm out the data, don’t push the data out. And coordinate departments: if customer service is sending one out and then the marketing department or the sales department wants to send one out, get together. Be sure that you’re not over-inviting or causing survey fatigue.
In the survey itself, 25 questions is plenty. Try to keep it on one page, and keep it as short as possible. The general rule is that five minutes ought to be enough to take an average survey. Test the survey yourself and make it as succinct as possible. Keep it visually appealing and user-friendly. Keep it short and don’t over-survey.
4. Thank your respondents
Finally, at the end of the survey, just like you always want to have a good introduction page to tell them what the purpose of the survey is, you want to make sure that you have a page at the end that thanks them. Sometimes this can be just the last question in the survey, but you want to thank them for their participation. Everybody likes to be appreciated.
Acknowledge that you value their time, maybe reiterate that you value their expertise, you value the feedback. Maybe have a link to an e-mail or a Web site where they can go if they want more information about the data or about the study that you’re conducting with the survey. Again, a good thank you is a fundamental best practice as far as getting better participation on your future surveys.
Good Survey Question Design
Let’s talk a little bit about questions. How many questions should your survey have? There’s not really a set number, it all depends on the objectives of your survey and the target population. Once those things have been put together, then you can start formulating your questions. As with the objectives, the survey should be short, to the point, and effective. In order to keep the survey on target, it’s always a good idea to keep the objectives on hand to ensure that each question you ask relates back to your objectives.
Stay on Target
I teach at a local university here part-time, and one of the things I know, in developing curriculum and putting together good lesson plans, is you always have to have objectives in mind. What, at the end of the day, do you want the student to walk out with, and how are they going to get there? It’s the same thing with a survey. What is it you want to achieve from the survey? The questions are kind of the bread crumbs on the path that lead people through to get the data that you’re after.
To keep the survey on target, it’s always a good idea to keep in mind the objectives and make sure that each question that you ask relates back to one of those objectives. That way, you get rid of those red herring questions, the ones that stand out and make no sense. I’m asking about your customer experience, now all of a sudden there’s a question about your height and weight. “Why in the world are you asking me that?”
Tell a Story
There should be a logical flow to the questions. One of our statisticians and survey designers told me that a survey ought to read like a story. It ought to be as intriguing as a story, and each section of questions should match and flow nicely to the transition to the next set of questions.
A survey should be viewed as a launching pad for conversation between the interviewer and the respondent. Think as though you were interviewing this respondent: how would you conduct that interview? Think of it as a conversation between your company and the respondent. Not only is it frustrating for a respondent if that conversation is disjointed, but it can also influence the quality of the information that’s gathered. Each set of questions should flow logically through the survey, the questions within a group should all relate to that group’s objective, and they should transition nicely from one to the other.
Ask the Right Questions
People often ask about what questions they need to include on surveys. I’m sure if you Google it you’ll find thousands of people who have opinions on question design, and question design can be an entity in and of itself, especially if you’re in committee asking, “What questions do we put on the survey?” Everybody has an opinion. While I’m not going to say, “Here are the exact questions you need to put on the survey,” I am going to share a methodology that I use, and that we use here at Allegiance quite a bit. I refer to it as the S-O-N-A-R method: Specific, Objective, No jargon, Actionable, Relevant.
Be specific. Be direct about what you’re asking, and avoid double-barreled questions. What is a double-barreled question? A double-barreled question is essentially a question that is asking two questions. For instance, I might ask something like, “Thinking about the current Federal Government, do you feel like the House and the Senate, along with the Executive Branch, are making decisions that are good for your family?” I may feel like the President is doing a good job, but I may not feel like the House of Representatives is doing a good job. I’ve got literally three questions in this one. Questions within questions.
How do I answer that? What if I say Strongly Agree? What do I Strongly Agree with? Do I strongly agree with the question about the Senate, about the House, about the Executive Branch? Be very specific about what you’re asking, and understand that there should be only one answer, one question. Be specific.
Stay objective. Minimize your bias, avoid loaded or leading questions. Back to our question about the government. You wouldn’t want to pose your question as “Bearing in mind that the outgoing government caused us four years of incredible tax increases, do you agree that the current Administration has lowered taxes?” You’ve already set the stage for influencing opinion that yes, the previous Administration were stupid and raised taxes and whatever, and it’s totally set up to influence how I’m going to answer the question. Remain neutral with very unbiased questions.
Use very simple language. Do not use jargon or unfamiliar acronyms. Avoid words that could have multiple meanings without appropriate context. Just simple language, no jargon. Don’t say something like “Understanding that super PACs fund much of candidates’ campaigns these days, do you agree that super PACs should not be allowed to contribute?” I may not even know what a super PAC is, but I’m going to look stupid if I don’t answer the question.
Only ask a question if you can take action on the responses. Does knowing your family income level help me take action on the outcome of the survey? Probably not. If I ask a question like, “Do you agree that we should get out of the United Nations?”, why ask it if there’s nothing I can do about it?
Make sure that the questions are actionable. You’ll get better feedback, and more of it, if people know that this is coming from a market research company or a retail organization that has some clout in determining product quality, product definition, or production. Honestly, if you do ask an unactionable question, I can almost guarantee the person you surveyed this time won’t answer your survey the next time.
Make sure the question is relevant to your goals and objectives, and make sure the question is relevant to your respondent’s frame of reference. The example I like to use is this: I was helping a financial institution create a customer experience/product research survey. They were launching a new suite of products and services, and they were interested to see what their current customers felt about it. They weighted their sample based on demographics so they had a good balance of young contributors, middle-aged, and older people taking the survey. They had designed their questions and asked me to take a look.
One of the questions was, “Do you participate in a Roth IRA?” That was the question. I looked at it, looked at who they were aiming the survey at, and I would wager that if I walked up to the average 18-year-old and asked them what a Roth IRA was, they’d probably not know. It’s not in their frame of reference. So why are they asking a group of people, including 18-year-olds, what they think of their retirement products?
Probably not a good idea. That would probably be a legitimate question for somebody who is over 50, or somebody who is looking at investing and those types of things. They could have attacked the retirement products question by asking something along the lines of, “Do you participate in or use our retirement products? Yes or no?” If they say yes, then I can branch, because if they do, they’ll know whether it’s a traditional or Roth IRA. I’ve segmented my audience down to the ones I know will understand this, and the rest go on their merry way. So be sure your questions are relevant, not only to the survey itself but also to your respondent’s frame of reference.
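If you’re curious what that branching looks like under the hood, here is a minimal sketch of the “screen first, then branch” idea from the retirement-products example. This is only an illustration under my own assumptions, not Allegiance’s actual survey engine or any real tool’s API; the question IDs and structure are hypothetical.

```python
# A minimal sketch of survey skip logic: a screening question routes
# only qualified respondents to the follow-up question.
# The question IDs and data structure here are hypothetical.

SURVEY = {
    "uses_retirement": {
        "text": "Do you participate in or use our retirement products?",
        "answers": ["Yes", "No"],
        # Only respondents who answer "Yes" see the IRA follow-up.
        "branch": {"Yes": "ira_type", "No": None},
    },
    "ira_type": {
        "text": "Is your IRA a traditional IRA or a Roth IRA?",
        "answers": ["Traditional", "Roth", "Not sure"],
        "branch": {},
    },
}

def next_question(current_id, answer):
    """Return the next question ID for a given answer, or None to skip ahead."""
    return SURVEY[current_id]["branch"].get(answer)

# The average 18-year-old who answers "No" never sees the Roth IRA question,
# while a "Yes" respondent is routed to it:
assert next_question("uses_retirement", "No") is None
assert next_question("uses_retirement", "Yes") == "ira_type"
```

Most commercial survey tools expose this same idea through a point-and-click “skip logic” or “branching” setting rather than code, but the principle is identical: qualify the respondent before asking the specialized question.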
The wording of the questions can also have an effect, and I only have a couple of minutes left on the show today, so I want to leave you with this. There was a poll that asked two very similar questions; I believe it’s out of the Wall Street Journal a number of years ago. Question one: what if we asked, “Do you favor cutting government entitlements to reduce the budget deficit?” People say absolutely! How about this question: “Do you favor cutting programs such as Social Security, Medicaid, and farm subsidies to reduce the budget deficit?” Wait a second, maybe I don’t! We’re going to get into the wording of questions in another podcast segment, talking about proper wording and focusing your questions.
In closing, I just want to thank you all for participating or listening to the podcast today. Again, please, we really would like to hear from you. We’ve been doing this quite a while, we’re getting a lot of great listeners, people chatting us up, friending us on Facebook, following us on Twitter, following us on Blog Talk Radio. If you can’t find the survey link, Blog Talk Radio doesn’t do a real good job of making these apparent, but here’s a link right here: Allegiance Radio Survey, and also you can just e-mail me directly at firstname.lastname@example.org. I’m Jeff Olsen, your host, and I look forward to hosting another show soon. Thanks, and have a great day.
This post comes from an Allegiance Radio episode. We invite you to visit Blog Talk Radio and save us in your favorites to receive updates and invitations to upcoming shows. Also, look for us on Facebook and Twitter, and remember to contact us for a no-obligation assessment of your data gathering and data analysis needs.
We always welcome your feedback. If you have ideas for upcoming shows, or maybe an interview that you think would be good for us to do, leave us a comment or take a quick survey to let us know how we can make the show better!