Seven questions to uncover user goals and needs

Edited excerpt from The Fast (and Easy) Way to Uncover User Needs by Joe Natoli:

Asking this set of questions across even a small pool of people — ten or fewer — will show you clear, recurring themes and patterns that can be used to validate user needs:

1. How do you define a successful work day? What has to happen in order for you to feel good when you leave?

2. Does that definition of success (and your stated goals) change from day to day — or from week to week? Are there certain times of year where what you need to accomplish changes?

3. What are the top three things standing in the way of you accomplishing your goals or having a successful workday?

4. What are the biggest problems, obstacles or inefficiencies you deal with? Why do you think these things happen?

5. Did you have this same role at other organizations you’ve worked for? Was it better, worse or different – and why (or how)?

6. Did you perform these tasks in the same way at any of these other organizations? Was it better, worse or different – and why (or how)?

7. What frustrates you most about this? Why?

Notes:
(1) The full article provides the rationale for each question.
(2) The first two questions probe what users are trying to achieve and what defines success for them. In other words, they help you investigate the Job To Be Done. The third and fourth questions probe the context and challenges. As a result, these questions are helpful in completing a Job Outline.
(3) See also How to interview customers to get great product insights and How to learn about your customers.

Product development — how NOT to conduct a survey

Edited excerpt from Surveys and focus groups by Seth Godin:

The story is told of a focus group for a new $100 electronic gadget. The response in the focus group was fabulous. All the people talked about the features of the new device with excitement.

At the end of the session, the moderator said “Thanks for coming. As our gift to you, you can have your choice of the device or $25.”

Everyone took the cash.

Surveys that ask your customers about their preferences, their net promoter intent, their media habits — they’re essentially useless compared to watching what people actually do when they have a chance.

Notes:
(1) Seth Godin is right that surveys which ask people what they think or would do are ineffective. But that doesn’t mean your only option is to watch directly what people do. Sometimes you don’t have the opportunity to observe actual customer behavior. For example, if you’re trying to assess potential demand for a product you haven’t yet developed, you might want to know what similar products your customers buy. In cases like that, surveys can be very useful.
(2) How then should you run a survey to avoid the pitfall Seth describes? Ask questions that get your customers to share facts and experiences rather than opinions. In the words of Benson Garner: Don’t ask “Would you..?”. Ask “When is the last time you..?” or “Tell me about a time when you..?”. See How to interview customers to get great product insights.
(3) Cf. The survey question you should never ask.

Questions to ask your customers to validate product-market fit

Edited excerpt from 8 Customer Discovery Questions To Validate Product Market Fit For Your Startup by Tomasz Tunguz:

1. How did you hear about the product? Clarifies which customer acquisition mechanisms are working, and if they are consistent with the company’s perceptions.

2. What process did you use to pick this product over the competition? Sheds light on the sales process, key stakeholders and ultimate decision-maker, and sales cycle length.

3. Why did you choose this product? Clarifies product differentiation and whether there are different customer segments in the market who might use different products to satisfy unique needs.

4. Which teams in the company use the product, and how has that changed over time? Reveals the key users and the potential for account growth and negative churn.

5. How important is this product compared to other software? Validates that, in the words of Paul Graham, this is a hair-on-fire problem.

6. How much do you pay? Is it worth more or less than X product? Ascertains whether the customer believes the return on investment is wildly in their favor.

7. How quickly is the product evolving? How satisfied are you with customer support and working with the company? Indicates churn risk.

8. To whom would you recommend this product? Clarifies the breadth of potential customers the product can serve, and the vigor with which an existing customer would recommend.

Notes:
(1) “To whom would you recommend this product? Clarifies… the vigor with which an existing customer would recommend.” This is the net promoter score approach. For an alternative, see Surveying customers to answer Sean Ellis’ “One Question That Matters”.
(2) Note the similarity to Sachin Rekhi’s template for Documenting your product-market fit hypotheses. See also Three warning signs you’re not solving a meaningful problem for your customers and How to identify your customers’ “Job To Be Done”.

Net promoter score — how to set up the survey

Edited excerpt from A Practitioner’s Guide to Net Promoter Score (NPS) by Sachin Rekhi:

How NPS is calculated. Ask your customers: “How likely is it that you would recommend our company to a friend or colleague?”, with the possible answers ranging from 0 – 10. Group your customers into Promoters (9-10 score), Passives (7-8 score), and Detractors (0-6 score). Then subtract the percentage of detractors from the percentage of promoters and you have your NPS score. The score ranges from -100 (all detractors) to +100 (all promoters). An NPS score that is greater than 0 is considered good and a score of +50 is excellent.
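
To make the arithmetic concrete, here is a minimal Python sketch of the calculation described above (it is not from Sachin’s article; the example scores and the function name are illustrative):

    # Minimal sketch of the NPS calculation. "scores" holds hypothetical 0-10
    # answers to "How likely is it that you would recommend our company?"
    def net_promoter_score(scores):
        promoters = sum(1 for s in scores if s >= 9)   # 9-10
        detractors = sum(1 for s in scores if s <= 6)  # 0-6
        # NPS = % promoters minus % detractors, ranging from -100 to +100
        return 100 * (promoters - detractors) / len(scores)

    print(net_promoter_score([10, 9, 8, 7, 6, 3, 10, 9, 5, 10]))  # 20.0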

Additional NPS questions. It’s essential to also ask the open-ended question: “Why did you give our company a rating of [customer’s score]?” This turns the score from simply a past performance measure to an actionable metric to improve future performance.

Sample selection. Survey a random, representative sample of your customers for each NPS survey. We found a strong correlation between NPS results and both engagement and customer tenure, so ensure your sample reflects your user base.

Collection methods. The survey is normally sent via email to your customers or delivered through an in-product prompt. Use one of the NPS survey solutions that support collection and analysis across a variety of channels and interfaces, such as SurveyMonkey’s.

Survey frequency. At LinkedIn we found it best to administer our NPS survey quarterly, which aligned with our quarterly product planning cycle. This enabled us to have the most recent scores before going into quarterly planning and enabled us to react to any meaningful observations from the survey in our upcoming roadmap.

Notes:
(1) Net promoter score, writes Sachin, was “devised by Fred Reichheld at Bain & Company in 2003 and introduced in a seminal HBR article The One Number You Need to Grow, which I highly recommend anyone serious about NPS to read in detail. Fred found NPS to be a strong alternative to long customer satisfaction surveys as it was such a simple single question to administer and NPS score was correlated with long-term company growth.”
(2) See How to use net promoter score surveys to improve your product, also from Sachin Rekhi.
(3) Compare Net Promoter Score to Sean Ellis’ “One Question That Matters”.

Three warning signs you’re not solving a meaningful problem for your customers

Edited excerpt from 3 Warning Signs That Your Product Sucks by David Cancel:

If you repeatedly hear any of the following comments, chances are you are not solving a critical problem:

1. “If you made your app easier to use I would start using it.”
2. “I’m really busy right now but I’ll start using your app soon.”
3. “If your app was cheaper I would start using it.”

Notes:
(1) David Cancel says that this feedback indicates you’re not solving a critical problem. I’m not sure the problem needs to be “critical”; perhaps it’s enough that the customer has a “Job To Be Done”.
(2) Cf. What problem are you solving?

How to interview customers to get great product insights

Edited excerpt from 8 Tips For Conducting Interviews That Deliver Relevant Customer Insights by Benson Garner:

Adopt a beginner’s mindset
Listen with open ears and an open mind and avoid interpreting customer responses too early.

Get facts, not opinions
Ask questions that get your customers to share facts and experiences rather than questions that result in opinions. Don’t ask “Would you..?” Ask “When is the last time you..?” or “Tell me about a time when you..?”

Ask “why” to get real motivations
Ask “why?” frequently. You might ask “Why do you need to..?” or “Why is ___ important to you?” or “Why is ___ such a pain?”.

Don’t mention solutions too early
It’s easy to fall into the trap of offering your solution during learning interviews. Don’t do it. People inherently wish to please others, so it’s only natural for them to confirm your opinions.

Notes:
(1) On “Don’t ask Would you..?”, see The survey question you should never ask.
(2) Cf. Product discovery questions to ask potential customers.
(3) Cf. How to ask great questions.

Product discovery questions to ask potential customers

Edited excerpt from The Ultimate List of Customer Development Questions by Mike Fishbein:

1. What do you think could be done to help you with [problem]?
2. What would your ideal solution to this problem look like?
3. If you could wave a magic wand and instantly have any imaginable solution to this problem, what would it look like?
4. What’s the hardest part about [process you’re improving]?
5. What are you currently doing to solve this problem/get this value?
6. What do you like and dislike about [competing product or solution]?

Notes:
(1) On the question about the customer’s ideal solution (question 3), Mike writes: “I’ve found that about 80% of the time the answers I get to this question are not very informative – solutions that aren’t feasible or most certainly wouldn’t be profitable. But the other 20% of the time there are some really informative responses that make the other 80% acceptable.”
(2) See also: How to learn about your customers.

How to use net promoter score surveys to improve your product

Edited excerpt from A Practitioner’s Guide to Net Promoter Score (NPS) by Sachin Rekhi:

The most actionable part of the NPS survey is the categorization of the open-ended verbatim comments from promoters & detractors. After each survey we would categorize each promoter comment into primary promoter benefit categories and each detractor comment into primary detractor issue categories. The categories were initially deduced by reading every single comment and identifying the large themes across them. We conducted this analysis every quarter so we could see quarter-over-quarter trends in the results. This categorization became the basis for roadmap suggestions to address detractor pain points and improve their overall experience. While it can be daunting to read every comment, there is no substitute for the product team digging in and listening directly to the voice of the customer and how they articulate their experience with your product.

We found it equally helpful to spend time on promoters, understanding what was different about their experiences that made them successful. We correlated specific behavior within the product (logins, searches, profile views, and more) to NPS results and found a strong correlation between certain product actions and a higher NPS. This can help you deduce your product’s “magic moment”: the point at which users are truly activated and likely to derive delight from your product. Then you can focus on product optimizations to get more of your customer base to that point. The best way to get to these correlations is simply to look at every major action in your product and see whether there are any clear correlations with NPS scores. It’s easy to graph each action against NPS and check.
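
As a rough illustration of that correlation analysis, here is a minimal Python sketch (not from Sachin’s article); the per-user columns are hypothetical stand-ins for whatever usage data and survey responses you have joined together:

    import pandas as pd

    # Hypothetical per-user data: action counts joined to each user's 0-10 NPS answer.
    df = pd.DataFrame({
        "logins":        [2, 15, 4, 30, 1, 22],
        "searches":      [0, 40, 3, 55, 1, 35],
        "profile_views": [1, 20, 5, 25, 0, 30],
        "nps_score":     [3, 9, 6, 10, 2, 9],
    })

    # Pearson correlation of each action with the NPS answer; the actions with the
    # strongest positive correlation are candidates for the product's "magic moment".
    correlations = df.corr()["nps_score"].drop("nps_score")
    print(correlations.sort_values(ascending=False))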

Notes:
(1) “We correlated specific behavior within the product to NPS results… and found a strong correlation between certain product actions and a higher NPS”. This is similar to mining usage data to identify the moment a user becomes truly engaged. See also How to increase active users.
(2) Using NPS survey responses has an advantage over mining usage data: the verbatim comments from detractors can tell you what’s missing from your product or what’s wrong with it, whereas usage data can only tell you what’s successful.

Surveying customers to answer Sean Ellis’ “One Question That Matters”

Edited excerpt from Using Survey.io:

One of the most difficult decisions for a startup is determining when your product is good enough to drive sustainable, scalable customer growth. Trying to scale too early can easily kill your startup. Given the emotional roller coaster of being a founder, the decision can easily be swayed by your mood on a given day.

Here’s an objective metric that removes emotion from the scaling decision while also giving you other important qualitative information. The key question on the survey is:

How would you feel if you could no longer use [product]?

  1. Very disappointed
  2. Somewhat disappointed
  3. Not disappointed (it isn’t really that useful)
  4. N/A – I no longer use [product]

If you find that over 40% of your users are saying that they would be “very disappointed” without your product, there is a great chance you can build sustainable, scalable customer acquisition growth on this “must have” product. This 40% benchmark was determined by comparing results across hundreds of startups. Those above 40% are generally able to sustainably scale their businesses; those significantly below 40% always seem to struggle.
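
As a trivial illustration of applying that benchmark, here is a minimal Python sketch (not from the survey.io write-up); the answer list is hypothetical:

    # Hypothetical answers to "How would you feel if you could no longer use [product]?"
    answers = [
        "Very disappointed", "Somewhat disappointed", "Very disappointed",
        "Not disappointed", "Very disappointed", "N/A - I no longer use it",
    ]

    # Share of respondents answering "Very disappointed", checked against the 40% benchmark.
    share = sum(1 for a in answers if a == "Very disappointed") / len(answers)
    print(f"{share:.0%} very disappointed")                          # 50% very disappointed
    print("must-have signal" if share > 0.40 else "keep improving")  # must-have signal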

Customer development is about listening, not pitching

From Five techniques that measurably improved our customer development by Peter Nixey:

We stopped pitching and started listening

When we first started customer development we made a massive mistake. Instead of listening to a customer’s problems we would instead pitch them our solution. What we should have been doing was asking people what their issues were and listening to how we might solve them. What we actually did was wax lyrical about the product and debate with them on whether it really could solve their problems (at the time it couldn’t).

Customer development is a very different process from sales, though. Sales is about helping your customer to understand the product. Customer development is about helping the product to understand your customer. Don’t try to sell during the initial exploration.

Notes:
(1) On getting the balance right between talking and listening in meetings generally, see Startup founders’ most common mistake in meetings — and how to avoid it.
(2) On listening, see also (i) How to be a better listener and (ii) How to listen without judging — a guide for managers.

How to respond to critical customer feedback

Edited excerpt from We don’t care enough to give you constructive feedback by Seth Godin:

Most of the time, people won’t bother to give you feedback. But when someone does care enough (about you, about the opportunity, about the work or the tool), the ball is in your court.

You can react to the feedback by taking it as an attack, deflecting blame, or pointing fingers at policy or the CEO. Then you’ve just told me that you don’t care enough to receive the feedback in a useful way.

Or you can pass me off to a powerless middleman, a frustrated person who mouths the words but makes it clear that the feedback will never get used. Another way to show that you don’t care as much as I do.

One other option: you can care even more than I do. You can not only be open to the constructive feedback, but you can savor it, chew it over, amplify it. You can delight in the fact that someone cares enough to speak up, and dance with their insight and contribution.

Because then, if you’re lucky, it might happen again.

Notes:
(1) Thank you Chanie Weisenberg for the tip.
(2) Cf. When your product change is greeted by a torrent of complaints, what should you do?

How to use a welcome email to get valuable customer feedback

Edited excerpt from How We Got 2,000+ Customers by Doing Things That Didn’t Scale by Alex Turnbull:

From: Alex from Groove
Subject: You’re in 🙂 | Plus, a quick question

I really appreciate you joining us at Groove, and I know you’ll love it when you see how easy it is to deliver awesome, personal support to every customer.

We built Groove to help small businesses grow, and I hope that we can achieve that for you.

If you wouldn’t mind, I’d love it if you answered one quick question: Why did you sign up for Groove?

I’m asking because knowing what made you sign up is really helpful for us in making sure that we’re delivering on what our users want. Just hit “reply” and let me know.

By the way, over the next couple of weeks, we’ll be sending you a few more emails to help you deliver awesome support to your customers. We’ll be sharing some tips, checking in with you, and showing you how some of our customers use Groove to grow their businesses.

Thanks,
Alex
CEO, Groove

Best practices in getting user feedback on your product

From 5 Mistakes we all make with product feedback by Des Traynor:

1. Stop talking to “all users”: When you survey all your users together you ignore the specifics. You mix up yesterday’s sign-ups with lifelong customers. If you want to improve your onboarding, only listen to people who recently signed up. If you want to improve a feature, only talk to those who use it (see the sketch after this list).

2. Feedback should be on-going: The default approach to feedback is to solicit it on demand. But that means when you realise you need it you have to wait a week doing nothing while it comes in. Solution: Periodically check in with users. Ask users for feedback on day 30, 60, 120, 365, etc. Slightly more advanced — gather feature-specific feedback based on usage.

3. Distinguish free from paying feedback: To improve your product for your paying customers, only talk to your paying customers. To learn what makes people upgrade from free, only talk to customers who upgraded from free. When you want to improve your free product, only talk to your free customers.

4. Don’t fall for the vocal minority: Treat every clustering of feedback that you see as a hypothesis, and then verify it before you build it.

5. Don’t assume users request the right features: It’s essential to abstract a level or two above what’s requested, into something that makes sense to you and benefits all your customers.
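
As an illustration of points 1-3 above, here is a minimal Python sketch of segmenting users before soliciting feedback; the user records and field names are hypothetical stand-ins for whatever your own user store provides:

    from datetime import date, timedelta

    # Hypothetical user records; replace with data from your own user store.
    users = [
        {"email": "a@example.com", "signed_up": date(2024, 6, 1),  "plan": "paid", "uses_reports": True},
        {"email": "b@example.com", "signed_up": date(2021, 3, 15), "plan": "free", "uses_reports": False},
        {"email": "c@example.com", "signed_up": date(2024, 6, 20), "plan": "free", "uses_reports": True},
    ]
    today = date(2024, 6, 30)

    # Onboarding feedback: only people who signed up in the last 30 days.
    recent_signups = [u for u in users if today - u["signed_up"] <= timedelta(days=30)]

    # Feature feedback: only people who actually use the feature in question.
    report_users = [u for u in users if u["uses_reports"]]

    # Upgrade/pricing feedback: only paying customers.
    paying = [u for u in users if u["plan"] == "paid"]

    print(len(recent_signups), len(report_users), len(paying))  # 2 2 1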

Notes:
(1) Cf. When your product change is greeted by a torrent of complaints, what should you do? and The survey question you should never ask.
(2) Thank you Eran Ben-Shushan, co-founder of Bizzabo, for the recommendation.

Never ask if someone would buy your product

From Asking or announcing… by Seth Godin:

When you ask someone if they would use your new product, buy your new widget or participate in your new service once it’s ready, you will get a lie in response.

It might be a generous lie (“sure, I love this”) or it might be a fearful lie (“here are the six reasons I would never use this”). The fearful lies cause us to scale back, to shave off, to go for mediocre. And the generous lies push us to launch stuff that’s just not very good.

People don’t mean to mess you up, but you’ve made the error of asking them to imagine a future they have trouble imagining. It’s incredibly different than asking them to justify what they already do.

We have to create environments where people choose, then ask them why.

Notes:
(1) Characteristic Seth Godin brilliance: “People don’t mean to mess you up, but you’ve made the error of asking them to imagine a future they have trouble imagining.”
(2) Cf. How to set the price for your product.
(3) “Don’t ask people to imagine what they would do” applies to product as well as pricing. See: The survey question you should never ask.

Feedback versus vision in product management

From The 5 Traits of High Performing Product Teams by Eran Aloni:

Great product managers seek feedback and new ideas from users, customers, developers and all other stakeholders that challenge (rather than reinforce) their point of view. Processing incoming feedback is not enough, though. Active solicitation of feedback is key to getting more inputs from more sources in a shorter amount of time.

Soliciting feedback does not mean product plans change course with every new piece of input. The best product managers are those who can define and articulate clear and consistent product vision while constantly evaluating it. This duality is one of the hardest things in getting product management right.

The trick is finding the point where feedback merits changing course. Small changes are usually easy to digest and act on, but acknowledging that key parts of your vision are flawed and that so much energy and effort are about to go down the drain goes against our nature and our inherent tendency to stay on course and reach our goal, even if it’s no longer the right goal. This is why so many companies fail to pivot in the face of overwhelming evidence that their business models are not working and their products fail to fit market needs.

How to learn about your customers

From Intro to Effective Customer Development by Jim Gray:

Talking to random people to learn about your customer, instead of talking to your customer to learn about your customer, is so off target… This will end up either confirming your biases or giving arbitrary results, neither of which is a positive outcome for your business.

To learn about your customers, you have to find and talk to your customers. And we’re not talking broad demographic groups. You need to get specific. I’m talking laser-focused. We’re shooting for the 10% that will be your most rabid fans, not the other 90% who will just be satisfied customers.

If you’re doing it right, most of this should match up:
— What problem hurts really bad
— Why it’s a problem, what goal it’s interfering with
— How they experience the problem
— How they think about the problem
— How they talk about the problem
— How they currently deal with the problem
— Where you can find the target customer
— What approaches they are receptive to

Improving your product via customer support

From Customer Support is the Ultimate Learning Experience by Ben Yoskowitz:

I look at customer support as the ultimate learning experience, particularly early on in your company’s existence or with a new product. There’s no better way to understand how customers are doing than by handling customer support. Even if you did a lot of validation before launching (as you should!), the minute you put your product into people’s hands, they’ll do all kinds of interesting (read: crazy) things. You’ll probably be surprised and confused at how they’re using your product. Some features will get a ton of use, others will get very little; and I bet that you would have predicted the opposite.

Most often, you just don’t know what people are going to do when they get their hands on your product. And customer support is the learning engine that can drive the company forward in terms of resolving usability issues, fixing bugs, prioritizing features, increasing virality/word-of-mouth and more.

The survey question you should never ask

From Intercom:

Research study after research study has shown that people are very bad at predicting their future behaviour and attitudes. Therefore one of the worst, but sadly most common, research questions to ask is: “Would you use feature ‘x’ if we built it?”. Many biases shape the response. For example, people infer that by suggesting the feature you think building it is a good idea, so they fall victim to authority bias and a little social proof and tell you that they would definitely use it. A second problem is that people state preferences and opinions about something simply because they were asked, whereas without being asked they would never have thought about, or needed, the feature. This is called the query effect. People are incredible storytellers, and can create detailed accounts of things that don’t actually matter to them when they are asked about them.

A better way to discover whether a feature idea would be useful is to ask about specific recent usage. For example, “The last time you used feature ‘x’, what were you trying to do?”

When your product change is greeted by a torrent of complaints, what should you do?

Edited excerpt from Rework by Jason Fried:

After you introduce a new feature, change a policy, or remove something, knee-jerk reactions will pour in.

Resist the urge to panic or make rapid changes in response. Passions flare in the beginning. That’s normal. But if you ride out that first rocky week, things usually settle down. People are creatures of habit. That’s why they react to change in such a negative way. People often respond before they give a change a fair chance.

Also, remember that negative reactions are almost always louder and more passionate than positive ones. In fact, you may hear only negative voices even when the majority of your customers are happy about a change. Make sure you don’t foolishly backpedal on a necessary but controversial decision.

So when people complain, let things simmer for a while.

Notes:
(1) Even if you wait a week, how do you then know whether you made a genuine mistake? True, reacting to immediate complaints can be a mistake. But shutting yourself off from user feedback is risky.
(2) It’s easy to react emotionally to negative feedback. But as Eli Hoffmann (SA’s VP Content) points out, a torrent of user complaints shows that people really care about your product.