
The Framework That Product Managers Can Use to Almost See Into the Future


As Product Managers, our job is to maximize the value of the engineering team. We do this by coming up with ideas that have the highest possible probability of success.


How do Product Managers do that successfully? We leverage direct customer feedback and any data we have on hand.


Customer feedback can come in many forms. On one hand, customers can call in screaming to your customer service representatives about a terrible experience they've had. On the other, customers can send you flowers and a valentine because of how incredible their shopping journey was.


Between those two ends of the spectrum, it can be incredibly difficult to get a clear signal of how customers will react to a new feature before it’s actually built. The more you can "see into the future" and predict how customers will respond to a feature, the more valuable you’ll be as a Product Manager.


For example, if you always knew with 100% certainty how customers would respond to a feature, you wouldn’t just leave your backlog as is. Instead, you’d prioritize things totally differently: changing the sequence of some items, adding others that were missed, and removing certain features from the backlog entirely.


To further illustrate the point, if you knew customers would be beating down your doors to get access to new feature X, wouldn’t you want to do everything you could to deliver that feature faster? If, however, you knew in advance that customers would absolutely hate the feature you're working on, wouldn’t you want to avoid further investment in it?


That brings us to the million-dollar question: how can you see into the future? Or, in more realistic terms, how do you accurately and quickly gather customer feedback to determine how successful a feature will be before actually investing the time to build it?


Predicting Feature Performance




Some might say the best way to understand a feature's potential for success is to conduct surveys or customer studies, or to put wireframes in front of customers and gather their feedback.


All of those methodologies can provide valuable data, but they are not always a true reflection of how customers would actually use a feature. Surveys can unintentionally introduce bias. Most importantly, the questions we ask and the situations we put customers in aren’t real; we aren’t asking people to open up their wallets and give us their money. In real life, we are.


So how do we, as Product Managers, recreate that real-world context: understanding the point at which customers are willing to buy, and whether our feature changes that willingness?


Here is one approach I’ve found that gives a very clear sense of how customers would use a feature:


Coming soon


Different organizations have different names for this approach (vapor test, false door, etc.), but the idea is to surface the concept of your feature to the customer in some way and gauge their interaction. This does not mean building out the feature and A/B testing it; instead, you surface only the concept and monitor how customers react.


Time for a simple example:



Let’s say we own a hat company that has a very basic product page online. You can view images of the hat, see details about the material it’s made of, and check the price. Right now our hat company sells three colors: white, black, and red.

We’ve been receiving tons of requests from a small group of really passionate customers who would love blue hats. But sourcing a whole new color of hat is a lot of work. We would need to find a supplier, cost it out, place a big initial order, and roll the dice on whether our customers would actually buy them. Worst case, we’d spend thousands of dollars, sit on an avalanche of blue-hat inventory, AND likely go bankrupt given our company's current financial position.


So what do we do?


Before we place any sort of purchase order for 1,000,000 new blue hats that we don’t even know people would like, what if we took five minutes and added a new blue swatch to our product page, showing roughly what the offering would look like if we did place that big order? If customers click on it, a small pop-up says “coming soon!”, and over a two-week period we monitor what percentage of users click on that swatch.
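To make the mechanics concrete, here is a minimal sketch of what that swatch could look like on a plain web page, written in TypeScript. The container ID, the /analytics endpoint, and the trackEvent helper are hypothetical stand-ins for whatever your site and analytics tool already provide.

```typescript
// A minimal sketch of the "coming soon" swatch, assuming a plain DOM page.
// The element ID and trackEvent() helper are hypothetical placeholders for
// whatever analytics tooling your site already uses.

function trackEvent(name: string, properties: Record<string, string>): void {
  // Stand-in for your analytics SDK: fire-and-forget the event to an endpoint.
  navigator.sendBeacon("/analytics", JSON.stringify({ name, properties, ts: Date.now() }));
}

function addComingSoonSwatch(containerId: string, color: string): void {
  const container = document.getElementById(containerId);
  if (!container) return;

  const swatch = document.createElement("button");
  swatch.className = "swatch";
  swatch.style.backgroundColor = color;
  swatch.setAttribute("aria-label", `${color} (coming soon)`);

  swatch.addEventListener("click", () => {
    // This click is the signal we care about: interest in a color we don't sell yet.
    trackEvent("coming_soon_swatch_click", { color });
    alert("Coming soon!"); // or a nicer in-page tooltip/modal
  });

  container.appendChild(swatch);
}

// Render the fake blue swatch next to the real white/black/red ones.
addComingSoonSwatch("color-swatches", "blue");
```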


Does this directly tell us how many people would buy blue hats if we offered them tomorrow? No. But it is an incredibly clear gauge of feature interest.

We could even go a step further with our test. Instead of just popping up text that says “coming soon!”, we could also include an email signup form so customers can get notified when we do add a blue hat for purchase. Now, with each new signup, we reduce the risk of that first purchase order.
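If we wanted to capture those signups, the pop-up could be swapped for a tiny form. Again, this is only a sketch: the /api/notify-me endpoint is a hypothetical service that simply stores the email address and the color the customer asked about.

```typescript
// A sketch of the "notify me" variant, assuming a hypothetical /api/notify-me
// endpoint that stores the email and the color of interest.

async function submitNotifyMe(email: string, color: string): Promise<boolean> {
  const response = await fetch("/api/notify-me", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ email, color }),
  });
  return response.ok;
}

function showNotifyMeForm(color: string): void {
  const form = document.createElement("form");
  form.innerHTML = `
    <p>Blue hats are coming soon!</p>
    <input type="email" name="email" placeholder="you@example.com" required />
    <button type="submit">Notify me</button>
  `;

  form.addEventListener("submit", async (event) => {
    event.preventDefault();
    const email = (form.elements.namedItem("email") as HTMLInputElement).value;
    // Each signup is a much stronger signal than a click: the customer is
    // volunteering contact info for a product that doesn't exist yet.
    const ok = await submitNotifyMe(email, color);
    form.innerHTML = ok ? "<p>Thanks! We'll let you know.</p>" : "<p>Something went wrong.</p>";
  });

  document.body.appendChild(form);
}
```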


The upsides of the coming soon approach:


  • You don’t have the cost of building the actual feature


As long as you are scrappy about it, the cost of adding this to your experience should be minimal. Maybe it’s one image, some copy, or a tile on a page, but it in no way needs the back-end infrastructure (a new physical product, new microservices, etc.) that the actual feature would require.


  • You get true, live customer feedback at scale, free of survey bias


This gives you the data you need to decide whether a specific feature is worth continuing to refine and eventually build; a rough way of reading that data is sketched below.


Actual shoppers should always be preferred to survey or research participants. At the end of the day, your company makes money from what customers actually do, not from the intentions they state on a research panel.


Again, that research is another supplemental way of getting at customer feedback, but it doesn’t come close to actually exposing features to the customer.
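When the two weeks are up, you still need a way to decide whether the signal is strong enough to act on. Here is a rough sketch of that arithmetic, with entirely made-up traffic numbers and an interest bar you would want to agree on with stakeholders before the test starts.

```typescript
// A rough way to read the results, assuming you log swatch clicks and product
// page views. All numbers below are hypothetical.

function clickThroughRate(clicks: number, pageViews: number): number {
  return clicks / pageViews;
}

// Normal-approximation 95% confidence interval for a proportion.
function confidenceInterval95(clicks: number, pageViews: number): [number, number] {
  const p = clicks / pageViews;
  const margin = 1.96 * Math.sqrt((p * (1 - p)) / pageViews);
  return [Math.max(0, p - margin), Math.min(1, p + margin)];
}

const pageViews = 20_000;   // hypothetical two weeks of product page traffic
const swatchClicks = 1_400; // hypothetical clicks on the blue swatch

const ctr = clickThroughRate(swatchClicks, pageViews); // 0.07 -> 7%
const [low, high] = confidenceInterval95(swatchClicks, pageViews);

// If even the low end of the interval clears the bar you set up front
// (say, 5% of shoppers showing interest), the signal is worth acting on.
console.log(`CTR: ${(ctr * 100).toFixed(1)}% (95% CI: ${(low * 100).toFixed(1)}%-${(high * 100).toFixed(1)}%)`);
```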


The downsides of using an approach like this:




  • If done incorrectly, it can come across as dishonest or deeply frustrating to the customer


This is the biggest hurdle to an approach like this. Done poorly, you can not only permanently lose a customer who was exposed to the test, but leave them with such a negative experience that they tell their friends not to shop with you as well.


That is why this coming soon approach should only be used early in the shopper journey, in areas with a low level of user investment. Surfacing a new color on the product page and immediately telling the customer it isn’t actually available yet is one thing.

However, if you surface the color as an option and wait until the payment step, after the customer has spent hours deciding on and designing their product, to tell them “Whoops! We tricked you!”, that is a terrible customer experience.

This is of course an exaggeration, but hopefully the point comes across.



  • If you are not working in an agile organization, it will be harder to get stakeholder buy-in to test something like this


This approach doesn’t tend to sit well with people who aren’t open to taking a bit of calculated risk, plain and simple. At the 10,000-foot view, people who are open to fast-paced, agile, iterative ways of working will be receptive to ideas like this.

People who are used to waterfall, big-bang project environments, however, may not be. They might not understand the short-term tradeoff for long-term gain, and might not grasp that getting a significantly more accurate view of what customers actually want earlier in the design process can 10x the value your team delivers.



 


Are there places where you could try something like this on your own product? Is there a seemingly big project, commitment, or cost your organization hasn’t been willing to take on that could be easily tested for clear indicators of customer receptivity?


Take a leap and try a Coming Soon test!


About the author:

Ben Staples has over 7 years of Product Management and product marketing eCommerce experience. He is currently employed at Nordstrom as a Senior Product Manager responsible for their product pages on Nordstrom.com. Previously, Ben was a Senior Product Manager for Trunk Club responsible for their iOS and Android apps. Ben started his Product career as a Product Manager for Vistaprint where he was responsible for their cart and checkout experiences. Before leaving Vistaprint, Ben founded the Vistaprint Product Management guild with over 40 members. Learn more at www.Ben-Staples.com


I do Product Management consulting! Interested in finding out more? Want to get notified when my next Product article comes out? Interested in getting into Product Management but don't know how? Want book recommendations?! Contact me!


