My colleague at L4 Digital, Lisa Kream, interviewed me to learn more about how to conduct user interviews. The post was originally written for the L4 Digital blog.
Lisa: When you begin a new project, one of the very first things that you and your team do is conduct user interviews. User interviews help your team answer some very crucial questions: Who will be using our product? Why do they need to use it? What’s their problem, and how can we solve it?
The answers to these questions ultimately determine how successful you and your team will be, as users’ problems inform the very vision and direction of your product. Conducting a meaningful user interview, then, in which you truly learn about your users and listen to their needs, is of the utmost importance.
That is, unfortunately, all the insight that I have to offer. I know that user interviews are a crucial part of the development process, but my expertise ends there.
To further explore why they’re important, and to provide guidance around conducting better, more productive user interviews, I’ve conducted an interview of my own. I sat down with Tricia Cervenan, a Senior Product Manager at L4 Digital, to learn more about this process. Tricia brings ten years of stellar product management experience to the table, and is more qualified than most (and definitely more qualified than I) to help you get the most out of your user interviews.
Lisa: Thanks for joining me, Tricia! Firstly, can you explain in a bit more detail what a user interview is?
Tricia: When those of us in product development start a new project, we have assumptions about the kinds of problems users have and what a solution to those problems might be. As user-focused practitioners, we need to validate that the problems we think users have are real. We do this through the user interview. User interviews were introduced in the 1990s as “context interviews” by Wixon, Holtzblatt, and Knox. In these interviews, you observe user behavior in a natural environment and ask users questions in an attempt to understand and verify what motivates their behavior.
The term “user interview” is a bit broader than “context interview”, and really implies that users will be asked a series of questions. It’s important to ensure that the spirit of the context interview is maintained, however, and that questions are focused on previous or observed behavior rather than what-if scenarios that ask users to predict what they will do in the future.
Lisa: Why bother conducting a user interview?
Tricia: We have assumptions about our users’ problems, and we need to validate whether those assumptions are accurate. Oftentimes, teams will build directly off their assumptions without having first proven that a problem is real by observing user behavior and asking questions.
Where quantitative data tells us that there is a problem, user interviews and qualitative research tell us what the problem is and why it exists. If we see one of our conversion metrics decline, digging into the quantitative data will only tell us so much. And if we know and understand our market, we can even make inferences about what might have changed. But without talking to people, we can’t truly understand how they feel about the change. The last thing we want is to be forced to run a hundred tests when a few conversations with users could narrow the field of possible and probable solutions down to something much more manageable (and time efficient).
Lisa: What if you conduct a bad user interview? What are the consequences for your team and your product?
Tricia: The biggest risk of a poorly conducted interview is the potential for introducing bias into our data. Bias leads us to believe we understand which problems users have when, in reality, this may not be true. There are many biases that come into play, and the ones I’ve encountered most often are confirmation bias and diagnosis bias. Confirmation bias shows up when we focus our line of questioning on the behavior we expect to see, or on proving our assumptions. Diagnosis bias occurs when we start interviewing, make a judgment early on, and then spend the rest of the interview asking questions that confirm that judgment. By injecting bias into our research, we don’t get a truthful answer as to whether a problem is real and whether it’s worth solving.
Lisa: How can our readers conduct better user interviews? What are your top three tips?
Tricia: First, build rapport with your interviewee. Remember that you’re asking someone to be a bit vulnerable and tell you why they’re making the decisions they are. You’re also asking them to be honest. Not everyone is introspective enough to reveal that much about themselves and how they feel without trusting the person on the other side. Start interviews by easing in and assuring participants that there is no right or wrong answer, that they won’t hurt your feelings, and that you’re simply trying to understand how they use a product. Then follow up with softball questions that are easy for them to answer.
What types of technology do you own?
Tell me about your favorite app. Why do you like it?
Tell me about the last time you went grocery/clothes/car/furniture/etc. shopping.
The goal is to get them to start opening up early so that when you get to the meatier questions, they feel more comfortable revealing their deeper feelings.
Second, stop talking. When you ask people to answer questions, you need to give them the space to do so. When someone stops talking and a pause follows, count to five; more often than not, they will continue talking, both because they have more to say and to fill the silence. One of your goals during the interview is for users to reveal insight into their behavior, and your silences make users feel there is room in the conversation for them to do so.
Third, don’t pitch your product or idea. We’re not salespeople on a grassroots mission to get five more people to use our product. We are researchers looking to learn from our users so that we can build solutions that solve their problems. Pitching our ideas only serves to make us feel better about our assumptions; it doesn’t prove or disprove them. Pause and reevaluate your line of questioning if you find yourself saying things like:
The product does that. Here’s how you do it.
We’re planning to build feature X. Would that be something you would like?
We’ve built this great new product that could make your life easier. Does that sound like something you’d be interested in?
I love talking about how to do user interviews. If you want to read more, I’ve also written a post for General Assembly on this topic.