Published in General

Talk to the Hand: New Approaches in Web Survey Design

June 29, 2007 changed the CX industry.  Any guess as to the product introduced a little over 10 years ago that has had a profound impact on not only the research industry, but human behavior overall?  If you need a hint, there is a good chance it may have woken you up this morning.  It’s probably in your presence right now and you might even be reading this blog post on it.  That’s right, the smartphone.  Steve Jobs released the first iPhone to the world just over ten years ago.  It’s tough to believe it’s been that long since the smartphone was unveiled and, at the same time, difficult to remember how we ever lived our lives without a small computer in our pocket.

How the Smartphone Impacted the CX Industry

So, what does the introduction of the smartphone have to do with the CX industry?  Mobile devices have impacted the CX industry in many ways.  One of these is the range of new digital experiences that now shape consumer behavior.  Websites and apps have become the new storefronts for many companies; they are the first place people go to learn about products and, increasingly, a channel consumers use for service.  Our blog and joint webinar with Protiviti in September further addressed the need for companies to incorporate digital channel measurement and management into their CX programs.  Another way that mobile culture has impacted the CX industry, and the focus of this blog post, is the effect mobile devices have had on how consumers provide feedback.  Web surveys were designed for big-browser devices, and in 2007 virtually all respondents completed surveys on their desktops and laptops.  The introduction of the iPhone, followed shortly after by Android, changed consumer expectations.  People started using the browsers on their phones to visit websites and mobile email clients to read their email.  As those experiences improved, consumers began using their phones more and more for activities typically reserved for big-browser devices, including customer satisfaction surveys.  This group of tech-savvy survey respondents, initially referred to as ‘unintentional mobile respondents,’ now accounts for the majority of all survey starts.  The chart below visualizes the growth in MaritzCX mobile survey starts over the past five-plus years.  As you can see, mobile devices now account for 56 percent of all survey starts, with 50 percent of all starts on mobile phones and another 6 percent on tablets.

A chart showing the growth of mobile surveys

Challenges Taking Surveys On Mobile Devices

Respondents face many challenges when taking surveys designed for big browsers on mobile devices.  The extent of the problems encountered depends on the programming standards applied and the types of questions being presented.  We have conducted dozens of research-on-research studies to better understand the challenges respondents and researchers/CX practitioners face when surveys do not display as intended on mobile devices.  For the respondent, poor survey displays result in fatigue, higher abandon rates and a poor image of the brand commissioning the research.  Researchers face data quality issues when questions do not display as intended on the mobile screen, along with fewer data points due to higher question non-response and higher survey abandon rates.  There is no shortage of information and best practices to share on this topic (a high-level overview is provided in this blog post), so we would like to focus on one of our recent studies that took a bit different approach than most of our other research.

Mobile Inherent Challenges

Through our testing, we have determined that even when surveys have been fully optimized, meaning they follow programming standards to display the appropriate font size for the device and use questions that display well across devices, there are still challenges mobile respondents face simply because they are mobile.  For example, mobile respondents abandon surveys at a greater rate as survey length increases than non-mobile respondents.  Being on the move, respondents can be more easily pulled away from the survey.  They could be on a train and pull up to their stop, or they may be at a restaurant when their food arrives.  Another mobile-related challenge involves the design of the mobile phone itself.  Mobile keyboards are much smaller than standard keyboards, and most use a touch screen, meaning the keys are not textured.  Such keyboards typically lead to 20-40 percent fewer responses to non-forced open-end questions when compared to responses to the same questions from non-mobile respondents.  Furthermore, those mobile respondents who do offer a response provide less commentary, 10-20 percent fewer words or characters on average.
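To make the stakes concrete, here is a back-of-the-envelope sketch of how those response-rate gaps compound into lost verbatims.  Only the 56 percent mobile share and the 20-40 percent penalty come from the numbers above; the function name, the baseline open-end response rate and the start counts are illustrative assumptions, not MaritzCX figures.

```typescript
// Illustrative arithmetic only: estimate how many open-end verbatims a study
// yields once the mobile response-rate penalty described above is applied.
// All inputs except mobileShare and mobilePenalty are hypothetical.

interface YieldInputs {
  totalStarts: number;        // total survey starts
  mobileShare: number;        // share started on mobile, e.g. 0.56 per the chart
  desktopOpenEndRate: number; // assumed fraction of desktop respondents answering
  mobilePenalty: number;      // 0.2-0.4 per the finding above; 0.3 used below
}

function expectedVerbatims(i: YieldInputs): number {
  const mobileStarts = i.totalStarts * i.mobileShare;
  const desktopStarts = i.totalStarts - mobileStarts;
  // Mobile respondents answer the open end at a reduced rate
  const mobileRate = i.desktopOpenEndRate * (1 - i.mobilePenalty);
  return Math.round(
    desktopStarts * i.desktopOpenEndRate + mobileStarts * mobileRate
  );
}

// 10,000 starts, 56% mobile, 60% desktop open-end rate, 30% mobile penalty:
// 4,400 * 0.60 + 5,600 * 0.42 = 2,640 + 2,352 = 4,992 verbatims
const withPenalty = expectedVerbatims({
  totalStarts: 10_000,
  mobileShare: 0.56,
  desktopOpenEndRate: 0.6,
  mobilePenalty: 0.3,
});

// The same study with the mobile gap closed would yield 6,000 verbatims,
// which is roughly the upside a successful voice-to-text prompt is chasing.
const withoutPenalty = expectedVerbatims({
  totalStarts: 10_000,
  mobileShare: 0.56,
  desktopOpenEndRate: 0.6,
  mobilePenalty: 0,
});
```

Under these assumed inputs, the penalty costs roughly a thousand verbatims per ten thousand starts, and the share of lost feedback grows as the mobile share grows.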

To address the problem of fewer responses to open-end questions (which will become more of an issue for companies that value verbatim feedback as more respondents choose mobile devices), we looked to the unique features of mobile devices and decided to prompt respondents to use the voice-to-text feature on their phones.  The voice-to-text feature is available on most newer mobile operating systems and allows tablet and smartphone users to tap the microphone at the base of their keyboard to enable audio transcription.  It allows the user to speak their response and have it transcribed to text, avoiding the challenges presented by the mobile keyboard.
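For teams that want the survey page itself to offer dictation, rather than relying on the keyboard’s mic key alone, the browser’s Web Speech API is one possible route.  The sketch below is a hypothetical illustration and not how our tests worked (our prompts simply directed respondents to the keyboard microphone); `SpeechRecognition` support varies by browser, and the function names here are assumptions.

```typescript
// Hypothetical sketch: offer in-page dictation for an open-end survey
// question via the Web Speech API, falling back to typing where the API
// is unavailable (support varies by browser).

// Pure helper: merge a transcribed phrase into the answer typed so far.
function appendTranscript(existing: string, phrase: string): string {
  const trimmed = phrase.trim();
  if (!trimmed) return existing;
  return existing ? `${existing} ${trimmed}` : trimmed;
}

// Browser-only wiring; `field` is any object with a writable `value`,
// such as a textarea element.
function attachDictation(field: { value: string }): boolean {
  const SR =
    (globalThis as any).SpeechRecognition ??
    (globalThis as any).webkitSpeechRecognition;
  if (!SR) return false; // no support: respondent types as usual

  const recognition = new SR();
  recognition.lang = "en-US";
  recognition.interimResults = false; // deliver final transcript only

  recognition.onresult = (event: any) => {
    const last = event.results[event.results.length - 1];
    field.value = appendTranscript(field.value, last[0].transcript);
  };
  recognition.start();
  return true;
}
```

Keeping the merge logic in a pure helper, separate from the browser wiring, makes it easy to test outside a browser.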

Testing the Voice-to-Text Feature in Surveys

We first tested the voice-to-text feature in a mobile research-on-research project in April of 2014.  At that time, less than 3 percent of mobile survey respondents were using the feature without being prompted.  Our 2014 test also indicated that most respondents who were prompted to use voice-to-text did not like the feature, with only 36 percent of the test group indicating a willingness to use it in the future.  We believe this low rate was partly attributable to the fact that 25 percent of those who attempted to use voice-to-text had to correct an error in transcription.

The voice-to-text button on a mobile device

We have monitored voice-to-text technology over time and have noted differences in the way the feature works.  When we first conducted the research, the iOS and Android operating systems required the user to tap the microphone to start dictating a response and then tap ‘Done’ to end transcription.  The user would then wait for the device to provide the transcription before going back into the text field to identify any errors and make corrections.  Operating system upgrades to each platform now capture text word-for-word as the response is spoken.  We believed that this upgrade, along with increased general use of the feature and the expectation that the accuracy of transcription may have improved, warranted another test.

2016 Re-Test

In August of 2016 we repeated our test and saw an increase in unprompted use of voice-to-text, with 7.2 percent of smartphone respondents and 8.5 percent of tablet respondents using the feature without being prompted.  Those who were prompted to use the feature still ran into some transcription issues, with roughly 25 percent of those using voice-to-text being forced to make a correction – the same rate as in April of 2014.  Despite the high percentage of transcription errors, over 50 percent of respondents in the test group indicated that they would be willing to use the voice-to-text feature in the future.  In a follow-up open-end response, many indicated that the prompt added a “cool” factor to their survey experience, and others, who already use voice-to-text outside of surveys, appreciated the prompt as a reminder to use the feature as an alternative to typing.  The main benefit of voice-to-text came through in the quality of responses, displayed in the chart below.

The quality of responses in mobile surveys

Given the increase in usage of the feature, respondent interest in being prompted, anticipated improvements in the accuracy of transcription over time and, most importantly, the improvement in quality of response, using voice-to-text prompts in your surveys should be strongly considered.  As our lives become increasingly digital and artificial intelligence becomes more advanced, voice-to-text will only become easier to use and more convenient.

As technology and customer expectations continue to evolve, there will undoubtedly be additional challenges and solutions, and we will continue testing new approaches to capturing respondent feedback.  Some interesting findings from testing around scale usage and video data capture will be shared in future blog posts.  In closing, it’s critical that CX practitioners continue to evolve the mechanisms they employ to capture feedback from their customers.  Using a platform that does not allow customers to easily share their thoughts will result in fewer responses, decisions made on less-than-optimal data and a negative perception of the brand commissioning the research.