
70+ Great User Testing Questions To Ask Before, During, & After User Tests

What makes a “good” user test question? Take inspiration from these 70+ sample user testing questions to ask in your next research project.

What makes a “good” user testing question?

In short, effective testing questions are those that prompt participants to provide useful insights without introducing bias.

But writing them is harder than you might think. 

In this post, we’ve listed 70+ sample user test questions to use in your next research project. You’ll learn:

  • How to write good user test questions (without introducing bias)
  • The best questions to ask before a user test (including key screener survey questions)
  • The best questions to ask during a user test, categorized by test type
  • The best follow-up questions to ask after a user test
  • Questions to avoid

How to write good user testing questions (without introducing bias) 

Before we dig into all the example questions we’ve gathered for you, we wanted to talk about how to write good user testing questions. 

To be clear, when we say “good,” we mean: questions that prompt participants to provide useful insights without introducing bias.

That last part—without introducing bias—is critical to gathering valid data. Learn more about how to identify and reduce bias in your research.

Here are some steps for writing effective user testing questions: 

  1. Starting with your research plan, determine what specific information you need to learn in order to answer your overall research question. For example, instead of “improve the UX of our mobile app,” you’ll want to articulate a more specific goal, e.g. “understand which steps in the purchase flow are most likely to result in cart abandonment and why.”
  2. Decide what characteristics your testing participants should have to give you relevant, usable information. Learn more about identifying and recruiting the right audience.
  3. Choose your testing methodology. 
  4. Write questions based on your understanding of steps 1-3.
  5. Rephrase as needed to eliminate leading questions and to make sure your questions are human-centric rather than product-centric.

This Taxonomy of Cognitive Domain chart from Nikki Anderson at the UX Collective is a nifty cheat-sheet for writing questions that get to the heart of what you’re hoping to learn.

💡 Note for those interested: The taxonomy of cognitive domain comes from the idea that there are six categories of cognitive ability:

  • Knowledge (information recall)
  • Comprehension (actually understanding what you can recall)
  • Application (the ability to use that knowledge in a practical way)
  • Analysis (the ability to draw insights out of given information)
  • Synthesis (the ability to create something new from what is known)
  • Evaluation (the ability to make judgments about what is known)

When to use open vs. closed questions

There are two main types of usability testing questions: open and closed. 

  • Open questions are those that can’t be answered with a simple “yes” or “no” and allow for free-form answers, e.g. “tell us about yourself” or “how would you describe your confidence level with using smartphones?” Open-ended questions encourage storytelling and leave room for personal anecdotes.
  • Closed questions are those that must be answered with yes, no, or a static choice from a predefined list of responses, e.g. “on a scale of 1-10, how likely would you be to recommend this product?” or “did you like your experience?”

Neither one is inherently better or worse than the other—but using the wrong type of question at the wrong time can introduce bias and fail to get you the data you need. 

As a general rule of thumb, use open-ended questions when you’re looking for qualitative data about feelings, thoughts, or behaviors, and use closed-ended questions when you’re looking for quantitative data with a focused set of possible answers. 

📚 Related Reading: How to Write UX Research Interview Questions to Get the Most Insight

70+ user testing questions to ask before, during, and after a user test

Usability testing covers any testing of how a user works with or navigates the product you’ve created. It’s commonly used to make sure that navigation, onboarding, and other aspects of a site or tool work as intended. If there are any hiccups or confusing instructions, it’s how you’ll find out about them.

📌 Gearing up for your next user testing project? Take the pain out of recruiting participants with User Interviews. Tell us who you need, and we’ll get them on your calendar. Sign up today. 

Best questions to ask before a user test (including key screener questions)

Many of the questions you ask prior to a user test can be covered in your screener survey. However, you may want to repeat some or all of the questions in-session for the chance to confirm the participant’s answers and ask for follow-up information. 

The questions you ask prior to a user test should cover information about the participants’ demographics (if applicable) and current behaviors, experience, and attitudes.

Here are some examples of questions to ask pre-test:

  1. Tell us about yourself.
  2. How old are you?
  3. What is your gender identity?
  4. What is your profession? 
  5. What is your household income?
  6. What is the highest level of education you’ve completed?
  7. How much do you already know about [product or task]?
  8. How often do you do [tasks that your product solves for]?
  9. How would you rate your confidence level in using [product] on a scale of 1 to 10?
  10. How often (if ever) do you use [products]?
  11. How often (if ever) do you use [brand’s products]? 
  12. When was the last time you bought [product]?
  13. In an average week, how much money do you spend on [product]?
  14. In an average week, how much time do you spend doing [task]?
  15. Do you own or have access to [tools needed to complete the test]?

📚 Related Reading: 7 Common Screener Survey Mistakes Even Experienced Researchers Make—and How to Fix Them

Best questions to ask during a user test

The questions you ask during the session are the most important. In this section, we’ve laid out some effective questions to ask during a user test (categorized by test type), as well as some effective follow-up questions to help you dig for more information.

⛺ Product survey questions

A product survey can be used to generate ideas for improving an existing product, test an idea, or even get feedback on a beta version. If you’re looking for quick feedback, you can ask multiple-choice questions. But it’s worth throwing in a few free response questions to see what extra info you can get from respondents.

Multiple Choice Survey Questions:

  1. Is [product] something you need or don’t need?
  2. How easy is [product] to use?
  3. Have you recommended [product] to anyone you know?
  4. Do you know anyone who could use [product]?
  5. If yes, would you recommend it to them?
  6. Did anything about [product] frustrate you when you were using it?
  7. If they say yes, follow up with: “We’re sorry to hear that! Please explain.”
  8. How did you first hear about [product]?
  9. How often do you buy [product]?
  10. How often do you buy [competitors’ products]?
  11. When was the last time you used [product]?

Open-Ended Questions:

  1. Do you currently have anything that does what [product] does? Explain.
  2. What do you think [product] is best used for?
  3. What are all the ways you use [product]?
  4. What do you like best about [product]?

In addition to these types of questions, you can list a series of features or aspects of the product you’re testing and ask participants to rate them on a given scale.

🎯 Task analysis questions

How we think someone performs a task and how they actually perform the task can be very different. And unless you perform a task analysis study, you don’t know what you don’t know. 

Task analysis is best done before you’ve solidified the user flow for a new product. It helps you understand factors such as cultural or environmental influences on how someone performs that task. If being present to observe how people approach the task is critical, you’ll want to conduct an ethnographic field study. Otherwise, you can still get a wealth of information from user interviews.

Here are a few questions you might ask during task analysis:

  1. When do you need to complete [task]?
  2. Walk me through every step you take to complete [task].
  3. What additional tools do you need to complete [task]?
  4. I noticed you chose to do [action]. Why is that?
  5. What are the most annoying steps in [process to complete the task]?
  6. What are your favorite parts about [task]?
  7. How have you completed [task] when in a rush?
  8. How do your [colleagues/friends/family/etc.] complete [task]?
  9. Where did you learn how to do [task]?
  10. How did you feel when learning how to do [task]?
  11. Have you ever failed to complete [task]?
  12. What would happen if you couldn’t complete [task]?

🗂️ Card sorting questions

Card sorting can be moderated or unmoderated. The benefit of a moderated session is that you can remind test participants to think out loud, granting access to the reasoning process they go through to make sorting decisions. If all you care about is the end result, however, unmoderated is the way to go.

Most of the work in setting up a good card sort is done before the exercise starts. You’ll need to give participants some context as to what they’re doing ahead of time (at least for most studies).

Whether you give them categories for the cards or have them write their own is up to you.

If you do ask any questions during the process, consider these:

  1. Why did you put [card] with [group] and not the others?
  2. (If a participant left one or more cards unsorted) Why didn’t you sort [card]?
  3. Are there [card topics] you expected to see, but didn’t?
  4. If you could add a category beyond the given ones, what would it be?

🌱 Beta testing questions

Many of the questions and scenarios we’ve discussed so far (and will continue to discuss in the usability section below) would be appropriate during beta testing. But there are some beta testing questions we haven’t covered, like how to ask potential customers about pricing.

Since we’ve already had a great conversation with Marie Prokopets about how she and FYI co-founder Hiten Shah failed and then succeeded with a product launch, we’ll let you read all of their beta testing advice and recommended questions instead of rehashing it here.

💻 Website user testing questions

Website usability testing can be moderated or unmoderated. Conducting moderated tests means you can react to problems in the moment and better understand what the user is thinking. Unmoderated tests are convenient and can result in valuable insights as long as you leave good instructions and the testers explain what they’re thinking.

Preparing a focused test is arguably more important than the questions you ask during the session. But a few well-placed questions, especially if the participant isn’t as talkative as you hoped, can keep things on track.

Consider asking these website usability testing questions:

  1. How often do you use [your website]?
  2. (As they’re completing the test) What’s going through your mind right now?
  3. Is there anything on the page that confuses you?
  4. (If a user is surprised by something) What were you expecting to happen when you [task]?
  5. Was anything about your experience frustrating?
  6. Did [task] take as long as you expected it to?
  7. Was there something missing from [flow you had the user go through]?
  8. How did this compare to [similar process for competitor]?
  9. Do you think there is an easier way to accomplish [process] than what you just did?

During the testing session, keep an eye on how they respond to what’s in front of them. Does anything distract them from the specific tasks you gave them? Do they completely miss something you think is important? How quickly can they find what they need? Do they take a circuitous route to solve what you thought was a simple problem? How do they react to usability issues?

📱 Mobile app user testing questions

To be honest, most of the questions you need to ask for website usability testing are the same for mobile app usability testing sessions. While your testing environment may look a little different, the goals are mostly the same.

Still, there are a few extra questions worth mentioning:

  1. Do the permissions requested by this app seem reasonable to you?
  2. Why or why not?
  3. Have you ever had trouble signing into the app?
  4. (If the user is testing in the wild) Could you hear [notification/other cause of sound]?

As with website testing, pay attention to trouble spots — do they linger on a task you think should be easy? Are there places where the app is slow or annoys them? Do they give any verbal cues (“Ooh,” “Ah,” “Hmm,” and so forth) at specific parts of the flow you’re testing?

👉 Follow-up questions for moderated sessions

Sometimes, the best information you’ll get in the whole session comes from follow-up questions. Perhaps it’s just a matter of you mimicking what they just said (“The navbar disappeared ... ?”) and waiting for them to expound on that statement.

For the sake of variety, here are a few different ways of getting a participant to explain themselves:

  1. Tell me more.
  2. Could you explain what you mean by that?
  3. Oh?
  4. Why is that?
  5. Why do you think that is?
  6. Is there anything I should have asked you today that I didn’t?

Best questions to ask after a user test

Following a user test, you’ll mostly want to ask questions related to users’ overall impression. 

Effective questions to ask after a user test include:

  1. How would you describe your overall experience with [product]?
  2. What did you like most/least about your experience and why?
  3. What, if anything, surprised you about your experience?
  4. Would you use this product again in real life? If so, how frequently would you use this product?
  5. What features would make you use this product more often? 
  6. What would stop you from using this product in the future?
  7. If you could change anything about your experience, what would you change?
  8. How would you compare [our product] to [a competitor’s product]?
  9. On a scale of 1-10, how likely are you to recommend this product to a friend?

📚 Related Reading: How to Ask Great User Research Questions with Amy Chess of Amazon

A final warning: Questions to avoid

Lastly, here are a few questions to avoid asking in your next user test:

  • Do you like this product?
  • Do you like the upgraded feature better?
  • How easy was it to use the product?
  • How much did you enjoy using the product?
  • What do you think of this idea?
  • What do you want?
  • Would you buy ‘X’ if it were on the market?

The first four questions are leading, because they imply the desired answer within the question (e.g. identifying the "upgraded" feature will give participants the sense that it's the one they should like better).

These last three questions are poor because humans are notoriously bad at predicting their own behavior. Plus, there can be a real difference between what people think they want and what they’ll actually choose to solve a problem when the time comes.

Happy asking!

You definitely won’t need all 70+ questions from this guide for each project you undertake. You’ll probably need to write additional questions specific to your work. Hopefully, these are enough to get you started.

Even with the right questions and testing methodology, you won’t get very far without recruiting great participants. If you’d like to make recruiting quick and (relatively) painless, consider User Interviews.

Using our platform, you can recruit from nearly 3 million vetted participants simply by telling us which demographic characteristics matter to you, and by setting up a short screener survey based on behaviors your users exhibit. When you’re ready to launch your study, we’ll find as many participants as you need. 

Or, you can bring your own panel and seamlessly manage all the logistics using our panel management and recruitment automation platform, Research Hub.

Sign up for a free account to get started right away, or visit our pricing page to learn about our plans for teams of any size.

Lizzy Burnam
Product Education Manager

Marketer, writer, poet. Lizzy likes hiking, people-watching, thrift shopping, learning and sharing ideas. Her happiest memory is sitting on the shore of Lake Champlain in the summer of 2020, eating a clementine.
