Pay Attention to the Nuances: How To Make User Interviewing Your Superpower

How to prepare for a user interview, all the way to sharing the results with your team.

Ally Mexicotte
Product Coalition

--

The skill of running effective user interviews is key to defining your target users, finding product-market fit, growing your product, figuring out what to build next — or just simply understanding how users perceive your product.

Being good at user interviews is not the same as simply listening to your users. If all you do is listen without a plan or a decision-making framework, you might walk away with no insights, or the wrong ones.

On the other hand, if you don’t listen to your users at all, you risk losing touch with what they want.

User interviews are one of those “easy to understand, hard to master” skills because they require the ability to observe and respond to the nuances of each conversation with a user. Many companies expect product managers, designers, and other roles to deliver good user interviews, but the training is often trial-and-error.

I compiled this guide back when I was training product managers on my team to be able to run user interviews. It heavily references Laura Klein’s Build Better Products, as well as learnings from my own mistakes.

This document helped set clear expectations of what a user interview should entail and saved me from having to repeat myself as I onboarded new people into the process.

Set the right learning objectives

I’ve found it helpful to define a learning objective for each user interview I do. I sometimes like to frame it as a hypothesis that I’m testing. When I decide ahead of time what the goal of my research is, I clarify the most important insight I need to gather, as I’m usually trying to learn many things in any given interview.

I write my learning objectives as bullet points at the top of my notes, and if anyone from my team is shadowing, I share that document so they know what I’m trying to learn.

Examples of some poorly written learning objectives:

  • Will the [target persona] use these features?
  • How should I change the UX of [feature]?
  • Will the [target persona] buy this product?

Examples of better learning objectives:

  • “What kind of users get the most value out of this product?”
  • “Why do so many shoppers successfully start the registration process but then stop?”
  • “What are the biggest problems for [target users] when they’re trying to buy party supplies online?”

I try to be extremely intentional when I set these learning objectives, because it’s akin to writing a hypothesis. If I ask the wrong questions, I’ll probably get the wrong answers.

Follow a loose, semi-structured format

The semi-structured interview is probably the most commonly used format because the best interviewers adapt to the interviewee and go off script where it makes sense.

A semi-structured interview means you prepare a script of questions ahead of time, but you use them flexibly depending on what the user says rather than sticking rigidly to the script.

Caution: Even if you know you don’t want to be a robot following a script, it can still be challenging not to rush through questions when you feel like you’re falling behind. It takes intentionality, practice and retrospectives to know which questions are most important to ask in the limited interview time you have.

I try to keep an open mind and dig deeper if I come across an important desire or pain point for the user, even if it’s unrelated to my originally intended topic. If I hear a statement by the user that could warrant more clarity, I use the “What’s [blank] about it?” question to keep digging.

I was once interviewing users to understand why they were using our peer-to-peer sports betting app, and a college-aged user told me it was cool. When I asked him what was cool about it, he described how he liked the app because it automated payment to whoever won the bet, saving him from telling his friends to pay up after he won fair and square.

Although we had many features in our app that differentiated us from competitors, this was a key insight for us in understanding how we created a different experience than when friends bet each other using Venmo or CashApp.

Ease into the conversation

One way I start to build trust in the interview is by introducing myself and my role and describing the format of the user interview. Before jumping into the core learning objectives, I ask questions to get to know the user.

Starting with background questions is useful for two reasons:

  1. It follows the pattern of a natural conversation, where you ask basic get-to-know-you questions before moving on to more personal ones. This helps participants relax.
  2. I get a deeper understanding of whether this person fits the target demographic that I’m aiming to learn from.

The following are example questions I asked when I ran user interviews for a sports betting company. If I hear a potentially useful insight, I follow up with conversational questions and try to keep the vibe very comfortable and natural.

  • “How did you get into sports and sports betting? What’s your favorite part about betting on a game?”
  • “How many times did you place a bet on a sporting event this month/week?”
  • “How many times this month would you say you used the [insert name] app?”
  • “When you last placed a bet, who did you send the bet to?”

Put yourself in their shoes

How can you build trust within a 45-minute interview? And why does this even matter? These questions are easier to answer if you’ve recently been on the receiving end of an interview.

As interviewers, we may feel like we’re asking simple questions about a person’s behaviors with a piece of software, but from the interviewee’s perspective, someone they’ve never met is asking them very detailed questions about what they do and how they do it. User interviews can sometimes be a weird experience.

If we’re asking a user about how they use enterprise software, for instance, it helps to be sensitive to the fact that the behaviors we’re asking about are closely intertwined with how well that user does their job.

I once interviewed a business analyst about how she used an internal application, and it was only after she gave a few qualifications and disclaimers about her processes that I realized I was making her nervous.

From her perspective, I was basically quizzing her on whether she was using this work tool correctly, and therefore indirectly assessing how well she was doing her job. This wasn’t my intention at all. I was just trying to understand how people used the application — but I wasn’t considering how the experience might have felt for her.

Questions not to ask

There are such things as bad questions when you’re interviewing a user. Leading questions and yes/no questions, for instance, are big no-nos.

There are some questions that are extremely tempting to ask but will not elicit helpful insight. These questions include:

  • “Would you use this [feature]?”
  • “Would you buy this?”
  • “What features does it need to have?”
  • “How often do you think you’d use this?”

Why should we not ask these questions? They map directly to some of the questions you probably have about your product, but there are limits to what you can learn from a user interview.

There’s been a lot of research showing that people are extremely bad at distinguishing between their ideal behavior and their actual behavior. The left hemisphere of the brain often fabricates answers without the person even realizing it (many studies on split-brain patients have shown this).

People tend to give very positive answers, regardless of whether they’re true. They’re not intentionally deceiving you, but some questions are better answered by watching what people do rather than listening to what they say.

In these cases, design experiments or look at your user analytics data rather than running user interviews. Analytics can tell you what happens; user interviews help you find out why.

Dig into specific stories

The biggest tip in asking good questions: ask about a specific time when the person did something, not a high level question about how they might do it.

Example: “Can you tell me about the last time you looked at the leaderboard?” instead of “How often do you look at the leaderboard?”

When you dig into the complaints or pain points of your interviewees, use problem-finding and context questions.

A problem the user is having should always be observed in the context in which it exists. Simply asking people to list the things they don’t like isn’t as effective as asking them to tell you stories about the last time they encountered the problem. Understanding the context of a problem also helps you understand the best way to fix it.

  • “Start at the beginning. What happened first?”
  • “Where were you? Set the scene for me.”
  • “What happened before that?”
  • “What happened after that?”
  • “Can you walk me through what happened the last time you encountered this problem?”
  • “You said this was ‘interesting’. What do you mean by that?”
  • “What challenges did you encounter?”

If a user starts talking in generalizations, ask, “In this specific instance, did you face that challenge?”

By reorienting the user back into their story, you can learn a lot about when issues arise and when they don’t.

If a user asks for a specific feature, ask, “What would that do for you?” to try to understand the underlying need rather than a specific feature.

Show prototypes or mockups

Rather than asking the participant to imagine a feature, show them something. A wireframe or mockup can go a long way.

If you’re showing wireframes or mockups to a user, start by asking very open-ended questions.

  • “What do you think this screen or app is about?”
  • “Can you show me what you’d do if I weren’t here?”
  • “What would you click on next?”

If the user struggles with how to navigate the mock-up, do not intervene!

Do not tell them where to click. Whenever this happens in user interviews, this is a learning moment for me as a product manager — a user is struggling to understand how we’ve designed this app. That means we need to pay very careful attention to why this is confusing.

You as the interviewer need to be aware enough to recognize these moments of tension and resist the urge to relieve that tension by telling them the answer.

As Laura Klein says, when someone is failing to understand how to use your product in a usability test, other people are going to fail in the wild, and you’re not going to be sitting next to them to help.

Share the results with your team in a visual way

User interviewing is expensive, so it’s important to have a clear goal for learning in each interview and to leverage what you learned to influence decisions. Share condensed findings with the whole team early.

There are many tools out there that you can use. One I’ve heard positive things about is Dovetail: it lets you share snippets of user interviews with your stakeholders so they can hear the feedback directly from users’ mouths. I haven’t used it myself, but my teammate used it to share video snippets of customers giving feedback with his broader team.

If he had a major stakeholder who was pushing for a feature that wouldn’t deliver value for the users, he would record users’ reactions to said feature and send the snippet to the stakeholder. Bold. Sometimes we just need to hear a few users give us the feedback straight.

If you’re running user interviews at your company, I’d encourage you to invite cross-functional partners into a few of them so they can gain insight into the end customers. It can be super refreshing and fun to talk to real people who use your product. I’ve seen these conversations inspire my partners to truly empathize and put a face to the people we work for in our day-to-day.

When sharing the results of my user interviews, I adjust the medium and format based on how my stakeholders like to receive information. Do they crave immediate, quick-hit updates on Slack? Are they visual and prefer videos? Or are they more data-oriented and want to see trends? I fine-tune how I surface my learnings based on who needs them, how they want to receive that information, and what they’re most interested in.

Here’s a recap:

  1. Set learning objectives
  2. Follow a semi-structured format
  3. Ease into the conversation
  4. Put yourself in their shoes
  5. Know what kinds of questions to avoid
  6. Dig into specific stories
  7. Use prototypes/mock-ups
  8. Share the results with your team in a consumable format

Simply following these steps, however, isn’t enough to produce a great user interview. A fantastic user interviewer is sensitive to the nuances of each individual conversation, adjusting and truly listening to the person in front of them, even if it’s their 200th interview.

Making user interviewing my superpower has affected other parts of my life and other conversations I have as well. When I really pay attention to all the subtleties of how I can guide or interrupt the flow of a conversation, I can’t help but be more curious and ask more questions. As a result, I learn so much more and build deeper connections with the people I interact with. Every time I talk with a new user or a new teammate, I use these practices to be present and curious, and I’m almost always pleasantly surprised by how much more I learn by doing so.

If you liked this article, give it a 👏👏 and follow me on Medium to stay updated on the latest product content.

Looking for a product coach to level up your product game or a product consultant for your startup? Email allymexicotte@gmail.com to get on the waiting list for May 2023.

About The Author

Ally Mexicotte is a product manager who specializes in building strong user experiences, empowered teams and community-based products. She’s built products across multiple industries and types of products. Most recently she was the head of product at a peer-to-peer sports betting startup and prior to that she was the director of product at a Fortune 200 company where she built many products from 0 to 1.

Follow me on Twitter to talk product!

Special thanks to Tremis Skeete, Executive Editor at Product Coalition for the valuable input which contributed to the editing of this article.
