By Heidi Dietzsch
Creating a well-designed survey that no one completes is a bit like throwing a huge party that no one pitches up to.
The market research process consists of many important steps, but probably the most vital is enticing potential respondents to participate in a study. All the other steps might have been perfectly executed – you even have a visually compelling online survey in place – but if people don’t complete your survey, it’s as good as doomed.
There are many reasons why people are hesitant to complete surveys, but there are ways to diminish this unwillingness.
Possibly the main reason is that the questionnaire is too long. Researchers often want to gather as much information as possible and feel they need to ask lots of important questions. However, overly long surveys can have the opposite of the intended effect. A 2017 study (Consumer Participation in Research) conducted by GreenBook investigated the impact of poorly designed surveys on research quality and respondent experience. It found that 45% of respondents believe that surveys should take less than 10 minutes to complete.[1]
Ensure your survey is short and concise. Expecting respondents to participate in a very long survey might convey the impression that you don’t value their time and regard them as unimportant. Respondents are the lifeblood of market research and should be treated fairly and with respect. Apart from low completion rates, lengthy surveys can also result in poor data quality because respondents might rush through them without really considering the questions.
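For research teams that script or review their questionnaires programmatically, a quick length check can flag drafts that risk breaching that 10-minute mark before anyone is invited to respond. The sketch below is a minimal illustration in Python; the per-question timings are assumptions chosen for demonstration, not benchmarks from the GreenBook study.

```python
# Rough estimate of survey completion time, to sanity-check questionnaire
# length before fielding. The per-question timings are illustrative
# assumptions, not measured benchmarks.

ASSUMED_SECONDS_PER_QUESTION = {
    "single_choice": 10,
    "multi_choice": 15,
    "rating_scale": 12,
    "open_ended": 45,
}

MAX_MINUTES = 10  # guideline: most respondents want surveys under 10 minutes


def estimated_minutes(questions):
    """questions: list of question-type strings, e.g. ['rating_scale', ...]."""
    total_seconds = sum(ASSUMED_SECONDS_PER_QUESTION[q] for q in questions)
    return total_seconds / 60


if __name__ == "__main__":
    draft = ["single_choice"] * 25 + ["rating_scale"] * 10 + ["open_ended"] * 8
    minutes = estimated_minutes(draft)
    print(f"Estimated completion time: {minutes:.1f} minutes")
    if minutes > MAX_MINUTES:
        print("Consider cutting questions: this draft exceeds the 10-minute guideline.")
```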
People are usually willing to participate in surveys, especially if they have a vested interest in the topic, or if they believe the research will lead to positive change. However, if participation requires additional effort – for instance, if respondents are asked to check the brands of all the electrical appliances in their homes – they will lose interest fast. You are already asking them to take time out to complete the survey, but now you are also expecting them to step away from their computers or cellphones and do extra work. Few respondents will comply with such a request.
Respondents’ boundaries should also be respected. To researchers, online surveys can seem like a conveniently impersonal way of asking people for information. After all, it’s easier to ask people sensitive questions when you are not dealing with them personally. However, for respondents, those questions can be just as sensitive as they would be in person. Asking questions that make respondents uncomfortable can cause a great deal of respondent fatigue.[2]
Researchers know all too well that not all topics are easy to talk about. People naturally shy away from certain subjects because they are deemed too personal, stressful or sacred, or because they fear stigmatisation.[3] Controversial topics that might cause respondents to think twice before taking part include personal income and finances, sexual behaviour, illegal behaviour, drug and alcohol use, religion and health – especially mental health.
It might be necessary to minimise the number of controversial questions or exclude them altogether, although this can affect the robustness and granularity of the data. Otherwise, such questions can still be included in the survey but shouldn’t be mandatory. If respondents cannot skip a sensitive question, they will most probably leave the survey at that point. Researchers should also think of clever ways to make these types of questions less intrusive so that respondents are more likely to answer them.
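For teams that define questionnaires in code or configuration rather than purely through a platform interface, making a sensitive question skippable is typically a small change. The snippet below is a rough sketch using a hypothetical question schema – the field names and answer options are illustrative and not tied to any particular survey tool.

```python
# Illustrative question definition for a scripted questionnaire.
# Field names ("required", "options", etc.) are hypothetical and not
# tied to any specific survey platform.

income_question = {
    "id": "q_household_income",
    "text": "Which of the following best describes your annual household income?",
    "options": [
        "Below average",
        "About average",
        "Above average",
        "Prefer not to say",   # softens a sensitive question
    ],
    "required": False,         # respondents can skip without abandoning the survey
}


def validate(question, answer):
    """Accept a blank answer when the question is not mandatory."""
    if answer is None or answer == "":
        return not question["required"]
    return answer in question["options"]


print(validate(income_question, ""))                   # True – skipping is allowed
print(validate(income_question, "Prefer not to say"))  # True
```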
Many respondents prefer to complete surveys on their smartphones. Researchers should ensure that surveys render properly on small-screen devices, with questions just as easy to read and answer as they would be on other devices. It’s extremely frustrating if a survey on a smartphone requires considerable vertical and horizontal scrolling, and this too is likely to result in many respondents abandoning the survey.
Non-response is not the only obstacle: another is respondent bias. This occurs when respondents are unable or unwilling to provide accurate or honest answers to a survey. There can be various reasons for this, but most often it’s unfamiliarity, respondent fatigue, faulty recall, question format or question context.
Acknowledging this, researchers need to be vigilant in framing their questions clearly and to the point. But even the best-framed and most thought-out questions are not immune to inaccurate answers. That is why it is imperative that every question has an opt-out choice, usually in the form of a “Don’t know”, “Not sure” or “Undecided” option. Not only will adding an opt-out choice eliminate a lot of inaccurate answers, but it will also provide researchers with valuable information. For instance, you can learn how many people have not made up their minds or are uneducated on a topic.[4]
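Opt-out answers are also worth tracking per question: an unusually high share of “Don’t know” responses can flag a question that is confusing, too sensitive or simply outside respondents’ knowledge. The sketch below assumes responses are stored as simple question-to-answer mappings – a made-up data shape used purely for illustration.

```python
from collections import Counter

# Opt-out labels treated as "no substantive answer".
OPT_OUT = {"Don't know", "Not sure", "Undecided"}


def opt_out_rates(responses):
    """
    responses: list of dicts mapping question id -> chosen answer,
    e.g. [{"q1": "Yes", "q2": "Don't know"}, ...] (illustrative shape).
    Returns the share of opt-out answers per question.
    """
    counts, opt_outs = Counter(), Counter()
    for response in responses:
        for question_id, answer in response.items():
            counts[question_id] += 1
            if answer in OPT_OUT:
                opt_outs[question_id] += 1
    return {q: opt_outs[q] / counts[q] for q in counts}


# Example: q2's high opt-out share suggests respondents are unsure or
# uneducated on that topic, which is itself useful information.
print(opt_out_rates([
    {"q1": "Yes", "q2": "Don't know"},
    {"q1": "No", "q2": "Undecided"},
    {"q1": "Yes", "q2": "Agree"},
]))
```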
Unwillingness to provide accurate or honest answers might also stem from a phenomenon called social desirability bias. This occurs when respondents feel pressured, either internally or externally, to provide a socially desired response. Accordingly, the information from these respondents will be biased and will not accurately reflect the target population.[5]
Social desirability bias is particularly prevalent in employee satisfaction surveys, political polls and behavioural studies. For instance, when employees are asked to rate their working environment or manager, they might give a more favourable rating than they actually believe is deserved, out of fear of being ostracised in the workplace. It can also lead to over-reporting of “good behaviour” or under-reporting of “bad behaviour”. This can happen even if employees are assured that the information they provide is anonymous.
Similarly, voters may tell pollsters that they are undecided or will vote for the socially acceptable option while planning to vote for a more controversial candidate on election day. And when confronted with the question, “How many glasses of alcohol do you consume per week?”, respondents will tend to downplay the number. Research on feelings of insecurity has likewise shown repeatedly that men tend to downplay their feelings of insecurity as they are – rather stereotypically – expected to have less fear than women.[6]
Of course, participation rates can be increased by offering incentives. Respondents can be incentivised in many ways, and incentives can be monetary or non-monetary. The value of the incentive will depend on the type of project, the amount of time the respondent will spend taking part in the survey, and the topic being researched. When research topics are very sensitive or personal in nature, the value of the incentive generally needs to be higher.
Studies have shown that cash is king and is most likely to pique the interest of potential respondents. Non-monetary incentives such as thank you gifts are less successful in increasing response rates.[7]
Disregarding the respondent experience in the research process is unproductive and senseless. Respondents are doing you a favour by participating in the research and this should be valued. Survey participation should be a pleasant and informative experience, and should not be associated with washing dishes, preparing taxes or standing in lines at government departments.
[1] https://greenbookblog.org/2017/04/25/why-most-respondents-dont-like-participating-in-research-and-what-we-can-do-about-it/
[2] https://www.surveymonkey.com/curiosity/eliminate-survey-fatigue-fix-3-things-respondents-hate/
[3] https://antedote.com/9-tips-research-sensitive-topics/
[4] http://fluidsurveys.com/university/tips-for-avoiding-respondent-bias/
[5] https://www.survata.com/market-research/resources/social-desirability-bias/
[6] https://www.checkmarket.com/blog/sensitive-topics/
[7] https://www.surveymonkey.com/curiosity/offer-survey-incentives-without-sacrificing-good-data/