The quality question: how to use it? | Izzy - 24/03/2020
The quality question: what is it used for?
At Made in Surveys, we use quality questions to ensure the data we collect meets the highest quality standards: unfortunately, some panelists do not take surveys seriously. One of the research manager’s tasks is to weed them out using different methods. One of these methods is the “quality question”, which we introduce today.
On our proprietary panels, we typically remove around 10% of all responses, either in field through the automatic checks in place, or manually after the field is closed and the data is reviewed by one of the research managers.
This is problematic, not just because of the time spent cleaning the data, but also because it means forecasting additional sample, which can be difficult, especially with complex profiles or studies with low incidence rates.
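As a rough illustration of how quality removals complicate sample forecasting, the over-sampling needed to hit a target number of valid completes can be sketched as follows. The function name, the formula, and the 10% default are illustrative assumptions, not MIS Group’s actual forecasting method:

```python
import math

def gross_sample_needed(target_completes: int, incidence_rate: float,
                        removal_rate: float = 0.10) -> int:
    """Estimate how many respondents to invite to reach a target number
    of valid completes. `incidence_rate` is the share of invited
    respondents who qualify; `removal_rate` is the share of responses
    later removed by quality checks (10% default, per the figure above)."""
    # valid completes = invited * incidence_rate * (1 - removal_rate)
    return math.ceil(target_completes / (incidence_rate * (1 - removal_rate)))

# 1,000 valid completes at 25% incidence with 10% quality removals:
print(gross_sample_needed(1000, 0.25))  # 4445
```

Note how a low incidence rate compounds with the removal rate: the lower the incidence, the more every removed response costs in additional invitations.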
The quality question is one of many ways to save a lot of time on manual quality checks, and it is also a great way to ensure the highest number of valid completes in the final report.
What kinds of quality questions are there, and how are they used?
Different quality questions can be used depending on what we are trying to achieve:
- Verification questions: to make sure the respondent is a real person and has all the tools at their disposal to answer correctly.
- Attention questions: usually used within long questionnaires to ensure that the participant remains attentive all the way through the questionnaire.
These questions stop or screen out participants who fail them. They are generally used at the beginning of a survey:
- The captcha: it helps prevent robots (bots) from accessing the survey. For studies with multiple partners, a captcha is a basic yet extremely useful tool to hinder bots that attempt to answer questionnaires.
- The video question: again, an extremely simple and basic question that lets you check in a few seconds that the respondent can hear and see the visuals/videos shown during the survey. We display a video whose image is unrelated to its sound, then ask the participant what they see and what they hear.
But be careful not to systematically use the same association of video and sound, or respondents may remember the combination. At MIS Group, we have over ten possible combinations to prevent false respondents from getting through.
Generally, these questions are used to make sure the respondent is paying attention to what they are doing within the survey and is answering all the questions honestly:
- The repeat question: an identical question is asked at the start and at the end of the questionnaire, and an automatic check then verifies that both answers match; respondents who do not tick the same response are filtered out.
- The instruction question: generally added midway through the survey to spot inattentive respondents. It is usually a very simple instruction such as « Tick the blue item »; respondents who do not give the right answer are automatically filtered out.
- The logic question: very similar to the instruction question; here we want to check that respondents are reading and thinking all the way through the questionnaire. For example: « What day is it today? »
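To make the checks above concrete, here is a minimal sketch of how the repeat and instruction questions could be verified automatically. The field names, answer codes, and pass/fail logic are assumptions for illustration, not MIS Group’s actual implementation:

```python
def passes_repeat_check(response: dict) -> bool:
    # The same question asked at the start and at the end of the survey
    # must receive the same answer.
    return response["q_brand_start"] == response["q_brand_end"]

def passes_instruction_check(response: dict, expected: str = "blue") -> bool:
    # e.g. « Tick the blue item »: only the instructed answer passes.
    return response["q_instruction"] == expected

# Two hypothetical respondents: the second changed their answer between
# the start and the end of the questionnaire, so they are filtered out.
responses = [
    {"q_brand_start": "A", "q_brand_end": "A", "q_instruction": "blue"},
    {"q_brand_start": "A", "q_brand_end": "C", "q_instruction": "blue"},
]
valid = [r for r in responses
         if passes_repeat_check(r) and passes_instruction_check(r)]
print(len(valid))  # 1
```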
However, be careful not to be overly enthusiastic with quality questions, or you risk losing the most honest respondents: we always suggest not making quality questions too complex or subjective.
For example, another type of quality question repeats a previously asked question with the items in a different order. This type of quality question has proven to filter out “honest” respondents as well, since a reflex answer does not necessarily mean poor quality.
Alternative solutions to use (that will be the topic of a future article)
Unfortunately quality questions alone are not enough to guarantee the quality of the data from an online quantitative study!
Other tools are also at our disposal, depending on your needs: automatic deletion of respondents who answer too quickly, manual verification of answers to open questions, manual and automatic verification of participants who respond too quickly or too systematically to matrix questions, etc.
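As a hedged sketch of two of these automatic tools, speeder detection (answering too quickly) and straight-lining detection (answering every row of a matrix question identically) might look like this; the threshold, field names, and data layout are illustrative assumptions, not our production rules:

```python
def is_speeder(duration_s: float, median_s: float,
               threshold: float = 1 / 3) -> bool:
    # Flag respondents who finish in less than a third of the median
    # completion time; the 1/3 cut-off is a common rule of thumb,
    # not a standard.
    return duration_s < threshold * median_s

def is_straightliner(grid_answers: list) -> bool:
    # A matrix question answered with the same value on every row is
    # "too systematic" and gets flagged.
    return len(set(grid_answers)) == 1

print(is_speeder(90, 600))             # True: 90s vs a 600s median
print(is_straightliner([3, 3, 3, 3]))  # True: identical grid answers
```

In practice, flagged respondents would typically be reviewed rather than deleted blindly, since a fast but attentive respondent can trip a pure time threshold.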
We’ll make sure to cover these in a future article. Stay tuned!