Summary: In this article, I'll explore why pretesting is essential for anyone running surveys in UX research. We'll walk through a real-world case that shows how testing your survey questions ahead of time can save your data from becoming useless.
I've been writing a lot about analyzing survey data lately. As a result, readers have been DMing me with basic questions about effective survey writing techniques. This got me thinking, and I want to talk with you today about how to pretest your surveys.
Imagine this: you've spent hours fine-tuning the wording of each and every question on a survey. Then, you set up multiple meetings with your team to get input from all angles, making sure your questions are the best they can possibly be. With the green light from your stakeholders, your survey goes out, and when the responses come back, it's clear: participants didn't understand a damn thing you were getting at in the survey.
All that time, effort, and budget, not to mention the goodwill of your users, is wasted because the data is essentially useless. This is where pretesting, or the "test of the test," becomes invaluable, especially when dealing with complex products that require intricate survey questions. By catching misunderstandings early, pretesting helps ensure that your survey delivers the insights you're aiming for, without the frustration of confusing or misinterpreted responses.
Pretesting: A Best Practice
Pretesting is like usability testing for your survey. It's a method that helps you see if your questions make sense to respondents so they can answer them accurately. You start by selecting a small sample of participants who resemble your target audience. Then, you walk them through the survey, typically asking them to think aloud as they respond. This allows you to catch any points of confusion or misunderstanding before you launch the survey at scale.
Just like usability testing, it doesn't matter how experienced you are or how many rules you've followed: you can't know whether your questions actually work until real respondents try to answer them.
Yes, having past experience and following best practices will put you in the top percentile of UX researchers when it comes to these methods, but we need to strive for better than that.
According to a report from the American Association for Public Opinion Research (AAPOR), pretesting helps identify problems that might not be apparent during the survey design phase. This can lead to significant improvements in question clarity, with some studies noting a reduction in problematic responses by as much as 50% after pretesting.
The only true way to have confidence when sending out a survey is to do a dry run and see how your survey performs in the real world. That's why pretesting isn't just a buzzword or a bullet point to check off on your resume. It's a critical step that all serious professionals should take, yet few actually do.
Real-World Example
As with most things on this blog, I think discussing a time I used pretesting in practice will be more helpful than talking about it in theory.
Scenario: Feature Satisfaction Survey
I worked with a company in the real estate technology sector that had developed a new feature for their mobile app. They wanted to run a survey to figure out how satisfied users were with this feature and whether they perceived it as meeting their needs. The data from this survey would guide further development and marketing strategies, and the project had a lot of eyes on it.
There was no room for a "redo" if the questions weren't right. (I'd argue there is never room for a redo, especially if we want to gain confidence and respect for the UX discipline within our organizations.)
Survey Setup
We decided on a quantitative survey that paired Likert scale questions with open-ended follow-ups, a technique known as "question pairing." This approach allows you to gather both quantitative data (e.g., how satisfied users are with a feature) and qualitative insights (e.g., why they feel that way). Pairing questions like this enriches the data, providing both numbers and narratives to better understand user sentiment and overall satisfaction.
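To make the "numbers and narratives" idea concrete, here's a minimal sketch of how paired responses could be summarized once results come back. It assumes the responses were exported to a CSV; the file name and column names are hypothetical, and pandas is just one convenient option, not a requirement.

```python
# Minimal sketch of summarizing a paired Likert + open-ended question.
# The file name and column names below are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("feature_satisfaction_survey.csv")

# Quantitative half of the pair: the 1-5 satisfaction rating.
print("Mean satisfaction:", round(responses["satisfaction_rating"].mean(), 2))
print(responses["satisfaction_rating"].value_counts().sort_index())

# Qualitative half of the pair: the open-ended "why" answers, grouped by
# rating so you can read the reasons behind low (and high) scores together.
for rating, group in responses.groupby("satisfaction_rating"):
    comments = group["satisfaction_reason"].dropna().head(3)
    print(f"\nRating {rating}, sample comments:")
    for comment in comments:
        print(" -", comment)
```

The point of pairing is exactly this read-across: the rating tells you how big a problem is, and the adjacent comments tell you what the problem actually is.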
The Pretest
For this survey, I chose to run a moderated pretest over Zoom. While you can run pretests unmoderated through tools like UserTesting, I prefer a moderated approach because it allows you to ask follow-up questions in real time and explore any issues that arise more deeply.
During the Zoom sessions, I guided participants through the survey, asking them to think aloud as they responded. This helped me understand their thought processes and identify where they stumbled.
Here's what I found:
Comprehension Issues: Several participants struggled with jargon-heavy questions, particularly those using industry-specific terms like "property valuation models." They weren't sure what we were asking and hesitated before answering.
The Fix: I rewrote these questions in plain language. For example, instead of "property valuation models," I used "ways to estimate property value," which resonated better with users. I also added brief explanations or examples where necessary to further clarify the terms.
Response Options Were Off: One of the questions offered response options that didn't cover all possible answers, which left some participants feeling frustrated. They either didn't select any option or picked one that didn't really match their thoughts.
The Fix: I expanded the range of response choices to ensure all potential answers were represented. For example, if a question asked about satisfaction with the feature, I added options that captured both specific frustrations and high satisfaction, ensuring that every participant could find an option that matched their experience.
Recall Challenges: A question asked participants to recall how they used the feature a month ago. Many struggled to remember specifics, leading to vague or inaccurate answers.
The Fix: I adjusted the question phrasing to be more specific, asking about a recent instance when they used the feature rather than referencing a vague period of time. (A sketch showing all three of these fixes applied to a single question pair follows below.)
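Here's a rough sketch of what one revised question pair could look like after those fixes, expressed as a simple data structure: plain language, a recent-instance recall frame, and an exhaustive, balanced option set. The wording is illustrative, not the client's actual survey.

```python
# Illustrative sketch of a revised question pair after the pretest fixes.
# The wording and options are hypothetical examples, not the real survey.
revised_question_pair = {
    "likert": {
        "text": (
            "Think about the most recent time you used the tool to "
            "estimate a property's value. How satisfied were you with it?"
        ),
        "options": [
            "Very dissatisfied",
            "Dissatisfied",
            "Neither satisfied nor dissatisfied",
            "Satisfied",
            "Very satisfied",
            "I haven't used this feature yet",  # catches the "none of these fit" case
        ],
    },
    "open_ended": {
        "text": "Why did you choose that rating?",
    },
}
```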
The Outcome
After making these adjustments based on the pretest findings, we launched the full survey. The data quality was excellent, providing clear, actionable insights that the design and product teams could immediately use. (Your outcomes may vary. Hahaha.)
How To Run A Moderated Pretest
If you're running a survey with complex questions or working on a high-stakes project, a moderated pretest is often the best way to ensure your survey will perform well. A moderated pretest allows you to interact directly with participants, guiding them through the survey and gaining deeper insights into their thought processes.
Here's how to set it up:
Recruit Users
Define Your Target Audience: Start by identifying the specific demographic or user group that your survey is targeting. Your pretest participants should closely resemble this group to provide relevant and accurate feedback.
Determine Sample Size: A small sample of 4-5 participants is typically all you'll need to uncover major issues. However, if your survey targets multiple user segments (e.g., different age groups or experience levels), you may want to include participants from each segment.
Recruitment Methods: Screen participants to ensure they match your desired demographic. Built-in panels and third-party recruitment services are an option, but use them at your own risk; I highly recommend screening participants yourself, especially since you need so few of them.
Use Think-Aloud Protocol
Explain the Process: Before starting the pretest, explain to participants that they should verbalize their thoughts as they complete the survey. Encourage them to share what they're thinking, any confusion they experience, and their reasoning behind each answer. Make sure to follow moderation best practices by not interrupting and staying neutral throughout.
Conduct the Pretest: Guide each participant through the survey, either in person or via a video conferencing tool like Zoom. As they respond to each question, listen carefully for signs of confusion, hesitation, or misunderstanding.
Ask Follow-Up Questions: If participants struggle with a question or seem unsure, don't hesitate to ask follow-up questions. For example, "Can you describe in your own words what this question is asking?" or "Can you explain why you chose that particular response?" These additional questions can help you pinpoint exactly where a question might be failing in the real world.
Take Detailed Notes: Document not only what the participants say but also their tone, body language, and any non-verbal cues. I recommend taking notes during the session and recording all sessions so you can review and analyze them later.
Analyze and Adjust
Review the Feedback: After completing the pretest sessions, carefully review the feedback you've gathered. Look for patterns in the confusion or hesitation that multiple participants experienced with the same questions (a small sketch of one way to tally this follows after this list).
Identify Key Issues: Determine whether participants misunderstood the questions, struggled with the response options, or found certain terms or concepts unclear.
Make Necessary Adjustments: Based on the feedback, revise the survey. This could involve rewording questions for clarity, simplifying complex terms, adding more relevant response options, or even restructuring the survey flow to make it more intuitive.
Consider a Follow-Up Pretest: If significant changes were made, you might consider running another round of moderated pretests to ensure the revisions have effectively resolved the issues.
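As a small illustration of the pattern-spotting step, here's one way you could tally logged pretest issues so repeat offenders jump out. It assumes each observed problem was recorded as a row in a CSV; the file name and column names are made up for the example.

```python
# A small sketch for spotting patterns in pretest feedback.
# Assumes each observed problem was logged as one row in a CSV with
# hypothetical columns: participant, question_id, issue_type, note.
import csv
from collections import Counter

issue_counts = Counter()
with open("pretest_issues.csv", newline="") as f:
    for row in csv.DictReader(f):
        issue_counts[(row["question_id"], row["issue_type"])] += 1

# Questions flagged by more than one participant deserve a rewrite first.
for (question_id, issue_type), count in issue_counts.most_common():
    if count > 1:
        print(f"{question_id}: {issue_type} flagged by {count} participants")
```

With only 4-5 participants this is hardly heavy analysis, but writing issues down in a consistent format makes it much easier to defend which questions you change and why.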
How To Run An Unmoderated Pretest
If a moderated pretest isn't feasible due to time or resource constraints, an unmoderated pretest is a great alternative. This method allows participants to complete the survey on their own, without a moderator, using tools like UserTesting, Userlytics, or similar platforms. While it lacks the interactive element of a moderated session, it's still a valuable way to gather feedback and catch potential issues before the full survey launch.
Here's how to conduct an unmoderated pretest:
Create Your Survey Link
Prepare the Survey: Ensure your survey is fully finalized, with all questions, logic, and flow in place. Double-check for any technical issues or errors that could affect the pretest results.
Generate the Link: Create a shareable link to your survey. This will be the main task participants complete during the unmoderated test.
Set Up the Test
Choose a Tool: Select a platform that supports unmoderated testing, like UserTesting, Userlytics, Maze, or PlaybookUX. These platforms allow you to distribute the survey and gather participant feedback asynchronously.
Create the Test: Set up a new test in your chosen platform. Provide a clear description of the test's purpose, and include the survey link as the main task.
Write Participant Instructions: Since you won't be present to guide participants, your instructions need to be very clear. Include directions like:
"Please complete the survey while thinking aloud about each question." (Be sure to explain that thinking aloud means verbalizing your thoughts, reasoning, and any confusion as you answer each question.)
"Focus on the clarity and ease of understanding of the questions."
"Note any terms or concepts that are confusing or unclear."
Additional Tasks (Optional): If your tool allows, you might add additional tasks to gather more feedback, such as asking participants to summarize their overall experience after completing the survey.
Recruit Participants
Define Your Audience: Just as with a moderated pretest, select participants who closely resemble your survey's target audience. If your survey has multiple segments, try to include participants from each segment.
Determine Sample Size: Again, 4-5 participants are enough to catch the major issues. If your survey is more complex or targets varied user groups, consider a slightly larger sample.
Recruitment Methods: As with a moderated pretest, screen participants to make sure they match your desired demographic. I highly recommend screening your own participants; use built-in panels and third-party recruitment services at your own risk.
Launch and Analyze
Launch the Test: Once set up, launch your unmoderated pretest. The platform will distribute the survey link to participants, who will complete it in their own time.
Collect Feedback: As participants complete the test, the platform will collect and compile the feedback. This may include screen recordings, responses to think-aloud prompts, and survey data.
Review and Analyze: Carefully review the data collected, paying close attention to any points where participants expressed confusion or struggled with the questions. Analyze the feedback to identify common issues.
Make Necessary Adjustments: Based on the feedback, revise the survey as needed. If significant changes are made, consider running another unmoderated pretest to ensure the adjustments were effective.
Conclusion
Pretesting might seem like an extra step, but it's one that can save you from a lot of headaches down the road. Whether you're dealing with complex questions or high-stakes projects, this method can help ensure your survey delivers the insights you need.
Once you see the benefits of pretesting, you won't want to skip it. So, next time you're working on a survey, give it a shot. Your future self, and your data, will thank you.
Let me know in the comments if you've used pretesting before or if you're planning to try it out. I'm always curious to hear how these techniques work for you in the real world!
If you need more tips on getting started with pretesting, drop me a DM or leave a comment. I'm here to help!