
5 Epic Feature Adoption Metrics For Your Product Growth

Stop chasing lagging metrics! Try these 5 powerful feature adoption metrics that deliver true insights into user engagement and drive product success.

November 4, 2024
Avinash Patil

Launching a new feature is exciting, but true success lies in knowing if users actually find it valuable. That’s where feature adoption metrics come in—they offer a window into how users discover, engage with, and return to your feature.

Before diving into them, it’s crucial to ask:

  • Are users finding the feature?
  • Are they actively engaging with it?
  • Do they keep coming back?

Answering these questions can transform your data into actionable insights, helping you identify what’s working and where users might be stuck. With these feature adoption metrics, you can track and improve your feature success.

How to Measure the Adoption of Features

1. Discovery & Activation

Discovery and activation are the initial phases that determine awareness of your feature and the likelihood of users trying it, and both are critical feature adoption metrics.

Discovery measures the percentage of users who found the feature, while activation measures the share who go on to try it.

Discovery Rate = (Number of users who found the feature / Total number of users surveyed) × 100

You can further quantify awareness and interest using:

  • Time to first use: the average time between sign-up (or discovery) and first use. A shorter time indicates that users quickly see value in the feature.

To calculate it:

Time to First Use = Timestamp of first use − Timestamp of sign-up

  • Setup completion rate: the share of users who finish the feature’s setup. If too many users drop off during setup, the process is probably too complicated.

To calculate it:

Setup Completion Rate = (Number of users who completed setup / Number of users who started setup) × 100
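
To make these formulas concrete, here’s a minimal Python sketch, assuming a hypothetical per-user export; field names like signed_up_at and completed_setup are illustrative, not tied to any specific analytics tool.

```python
from datetime import datetime
from statistics import mean

# Hypothetical per-user records; None means the user never reached that step.
users = [
    {"signed_up_at": datetime(2024, 10, 1), "found_feature": True,
     "first_used_at": datetime(2024, 10, 8), "started_setup": True, "completed_setup": True},
    {"signed_up_at": datetime(2024, 10, 2), "found_feature": True,
     "first_used_at": None, "started_setup": True, "completed_setup": False},
    {"signed_up_at": datetime(2024, 10, 3), "found_feature": False,
     "first_used_at": None, "started_setup": False, "completed_setup": False},
]

# Discovery rate: users who found the feature / total users surveyed
discovery_rate = sum(u["found_feature"] for u in users) / len(users) * 100

# Time to first use: average days between sign-up and first use (only users who used it)
days_to_first_use = [(u["first_used_at"] - u["signed_up_at"]).days
                     for u in users if u["first_used_at"] is not None]
avg_time_to_first_use = mean(days_to_first_use) if days_to_first_use else None

# Setup completion rate: users who completed setup / users who started setup
started = sum(u["started_setup"] for u in users)
completed = sum(u["completed_setup"] for u in users)
setup_completion_rate = completed / started * 100 if started else 0

print(f"Discovery rate: {discovery_rate:.0f}%")
print(f"Average time to first use: {avg_time_to_first_use} days")
print(f"Setup completion rate: {setup_completion_rate:.0f}%")
```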

Here’s an example scenario. 

Consider yourself a product manager at a writing app that launches a personalized writing suggestion feature. While users discover it quickly, they take an average of 7 days to engage with it, and many drop off during the setup process, which requires uploading documents.

Action Plan

  • Simplify the document upload process
  • Communicate privacy safeguards
  • Prompt users with reminders during editing sessions

This reduces the Time to First Use and increases the Setup Completion Rate, improving feature activation and adoption.

2. Active Usage

Active usage measures how frequently users engage with the feature, whether daily, weekly, or monthly. It captures the feature's stickiness and complements the other feature adoption metrics.

Weekly active users is a good yardstick: it shows whether usage holds up over time. Alongside it, here are the feature adoption metrics you’d want to track (a quick calculation sketch follows the list):

  • Usage frequency per user: Understanding how often individual users engage can help identify power users and casual ones
  • Task completion rate: This measures workflow success; high rates suggest that users find the feature useful for accomplishing their goals  
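
As a rough illustration, here’s how you might compute both metrics from a raw event log; the event names (feature_used, task_started, task_completed) and the power-user threshold are assumptions for the sketch, not a standard.

```python
from collections import Counter

# Hypothetical weekly event log: (user_id, event_name) pairs.
events = [
    ("u1", "feature_used"), ("u1", "feature_used"),
    ("u1", "task_started"), ("u1", "task_completed"),
    ("u2", "feature_used"), ("u2", "task_started"),
    ("u3", "feature_used"), ("u3", "feature_used"), ("u3", "feature_used"),
    ("u3", "task_started"), ("u3", "task_completed"),
]

# Usage frequency per user: how many times each user engaged this week.
usage_per_user = Counter(user for user, event in events if event == "feature_used")
power_users = [u for u, n in usage_per_user.items() if n >= 3]   # assumed threshold
casual_users = [u for u, n in usage_per_user.items() if n < 3]

# Task completion rate: completed tasks / started tasks.
started = sum(1 for _, e in events if e == "task_started")
completed = sum(1 for _, e in events if e == "task_completed")
task_completion_rate = completed / started * 100 if started else 0

print("Weekly uses per user:", dict(usage_per_user))
print("Power users:", power_users, "| Casual users:", casual_users)
print(f"Task completion rate: {task_completion_rate:.0f}%")
```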

Imagine you’re a fintech product manager noticing declining weekly active usage of a "spending insights" feature. Casual users drop off after customizing their spending categories, while power users use the feature consistently.

Action Plan

  • Identify where casual users drop off (customizing categories)
  • Simplify this step with guided tips and auto-suggestions
  • Run A/B tests to see if the changes improve casual user engagement and task completion rates

The manager can increase active usage and retain more casual users by reducing friction. 

3. Retention & Growth 

Retention and growth measure the number of users who continue to use the feature after a certain time, which we like to call the honeymoon period. 

In this phase, you’d want to monitor the percentage of users still active after 30 days. This will help you spot signs of long-term engagement.

Retention Rate = (Number of users active after 30 days / Total number of users who signed up) × 100
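
Here’s a minimal sketch of that calculation, assuming you can export each user’s sign-up date and most recent feature activity; the 30-day window is a parameter, so the same function covers the 60-day (M2) check discussed next.

```python
from datetime import date, timedelta

def retention_rate(signups, last_activity, window_days=30):
    """Percentage of signed-up users still active `window_days` after sign-up.

    signups:       user_id -> sign-up date
    last_activity: user_id -> date of most recent feature use
    """
    retained = [
        uid for uid, signed_up in signups.items()
        if uid in last_activity
        and last_activity[uid] >= signed_up + timedelta(days=window_days)
    ]
    return len(retained) / len(signups) * 100 if signups else 0.0

# Hypothetical data
signups = {"u1": date(2024, 9, 1), "u2": date(2024, 9, 1), "u3": date(2024, 9, 2)}
last_activity = {"u1": date(2024, 10, 5), "u2": date(2024, 9, 10)}

print(f"30-day retention: {retention_rate(signups, last_activity, 30):.0f}%")
print(f"60-day (M2) retention: {retention_rate(signups, last_activity, 60):.0f}%")
```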

Next, you’ll want to measure M2: users still returning after 60 days. Beyond the raw rates, try these feature adoption metrics: 

  • Feature engagement trend: Analyzing trends over time can help identify whether user interest is growing or waning
  • User satisfaction score: Collecting feedback on user satisfaction helps pinpoint areas for improvement  

Here’s an illustration. 

As a product manager at a fitness app, you notice that after 30 days, fewer users are returning to the "workout planner" feature, with a significant drop by the 60-day mark.

Action Plan

  • Track feature engagement trends to identify when user interest declines
  • Gather user satisfaction scores to understand why users are disengaging
  • Improve the feature based on feedback, such as offering personalized workout recommendations and reminders

By addressing these pain points, the manager aims to boost retention and long-term engagement beyond the 60-day mark.

Feature Adoption Metrics & Frameworks 

1. Feature Adoption Funnel 

While your usage and stickiness metrics are helpful, they don’t tell the whole story on their own. Pairing them with a feature adoption funnel fills in the rest of the picture. 

It’s a four-step funnel that examines the behavioral actions of users in terms of Exposed, Activated, Used, and Used again. 

a. Exposed 

The ‘Exposed’ stage starts when a user comes in contact with the feature for the first time. Call it discovery or first contact; here you’d want to measure the number of users who actually saw the feature. 

To calculate it, divide the number of users who landed on the feature page by the number of users who visited your homepage or dashboard. 

Suppose 2,000 users visited your product dashboard but only 750 landed on the feature page; the exposed percentage would be about 38%. 

But you can’t call a score good or bad yet without finding the primary reasons behind it. 

While your instinct may be to blame the feature itself, that isn’t always the case. In our experience, low exposure usually comes down to too little awareness or simple feature blindness. 

Here are actionable tips for low exposure rates: 

  • Use targeted in-app messaging, emails, and push notifications to increase exposure 
  • Position the features on the right side of the screen for higher visibility—use a tooltip to draw attention 
  • Ensure the feature can directly merge into the workflow

Here’s an example. 

When Notion introduced its database feature, many users found it difficult to apply to their workflows. Most of them were new to structured data management.

Notion’s product team created pre-built templates tailored to common use cases, like project management, content calendars, and task tracking. 

They also added in-app guides showing how to customize these templates for specific needs. This approach helped users quickly grasp how databases could fit into their existing workflows, leading to higher adoption.

b. Activated 

A user becomes ‘Activated’ when they start to use the feature. It includes the steps required to use the feature such as setting up a dashboard, creating a note, etc. 

Again, activation depends heavily on how difficult these steps are. Make them too hard, and users will quickly drop off.  

Let’s assume that out of the 750 users who saw your feature, only 370 used it for the first time. This means your exposed-to-activated rate is 49% (370/750), and your overall activation rate is roughly 19% (370/2000). 

Here are reasons why activation rates could be lower: 

  • Disconnect between messaging and new feature—highlight the problem, agitate it, and state the solution
  • Cohort differences—new users might adopt the feature sooner compared to existing customers
  • Too much friction when activating the feature—multiple attempts required to complete tasks, dead clicks, rage clicks mean frustration 

Either delay the friction-causing step or remove it entirely, then check whether your activation rate improves. 

c. Used 

Once users complete the required steps, you have people who have successfully used the feature at least once. 

And, for the sake of the funnel, the time taken is irrelevant. Here, you’d want to find out if the users know how to use the feature, the difficulties faced, and their engagement levels. 

For instance, if 200 of the 370 activated users used the feature at least once, your activated-to-used rate is 54% (200/370). The overall used rate is then 10% (200/2000).

Normally, your used percentage is low for reasons such as:

  • Workflow integration problems—users might not be able to add this feature to their routine way of working; create a tutorial with proactive support to correct issues 
  • Limited use cases—if a feature lacks perceived value or simply isn’t worth the effort, users might just use it once; create use cases with real scenarios to persuade users 

Here’s an illustration. 

A dog walker app introduced a "GPS Route Tracking" feature, but adoption was low. Walkers found it to be a hurdle in their routine. Walking multiple dogs while manually starting and stopping the tracker was the key problem.

As a solution, the product manager created an in-app tutorial showing how to auto-start/stop the GPS tracker and smoothly use it. 

They put forth the value of sharing routes with dog owners as a trust-building exercise, and to boost feature adoption, they highlighted use cases where tracked routes led to higher tips and repeat bookings.

This solved both the workflow integration issues and perceived value, driving feature adoption. 

d. Used again 

This is the last step in the feature adoption funnel. Now you’ll want to assess your users’ recurring usage and keep engaging them until they become high-value customers.

Let’s suppose 190 of the 370 activated users used your new feature two to three times; your used-again rate is then 51% (190/370). 

The overall used-again rate is about 10% (190/2000). 
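
Pulling the walkthrough together, here’s a short sketch that computes the step-by-step and overall conversion rates for the whole funnel, using the illustrative counts from above:

```python
# Feature adoption funnel counts from the example above.
funnel = [
    ("Visited product", 2000),
    ("Exposed",          750),
    ("Activated",        370),
    ("Used",             200),
    ("Used again",       190),
]

total = funnel[0][1]
for (_, prev_count), (stage, count) in zip(funnel, funnel[1:]):
    step_rate = count / prev_count * 100   # conversion from the previous stage
    overall_rate = count / total * 100     # conversion from everyone who visited
    print(f"{stage:<12} {count:>4}   step: {step_rate:5.1f}%   overall: {overall_rate:5.1f}%")
```

The printed step rates (37.5%, 49.3%, 54.1%, 51.4%) match the rounded figures quoted in the walkthrough.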

Here’s how you can identify signs of repeat usage:

  • Track contextual usage patterns—if users return to the feature on a certain day or at a certain time of the week, create content around those moments to keep them engaged 
  • Monitor trends—track user actions such as referring to the help documentation, watching tutorial videos, or ramping up activity with the same feature 

A product manager for a home services app offering plumbing, lawn mowing, and carpentry services notices a drop in repeat bookings for carpentry. 

To understand the issue, they track contextual usage patterns and find that most users book carpentry services on weekends, but only a few return after their first booking.

The trends show users frequently referring to the help section with questions about carpentry pricing and availability, then watching tutorial videos on how to select a carpenter based on skill and expertise. This suggests confusion during the booking process. 

The product manager simplifies the carpentry booking steps by adding clear explanations in the app, and sends targeted weekend reminders and helpful content around carpentry services to keep users engaged. This should increase repeat usage of the feature.

2. TARS framework

TARS is a feature adoption framework that helps product teams evaluate the performance of features in a product. It breaks down feature performance into four key areas: Target Audience, Adoption, Retention, and Satisfaction.

a. Target Audience

Find the Target Audience to identify the percentage of users who experience the problem that the feature is designed to solve. You can find this via user research or by analyzing product analytics data.

b. Adoption

Adoption measures how many users in your target audience use the feature to address their problem. It's calculated by dividing the number of users who engage with the feature in a meaningful way by the number of active target users.

c. Retention

Retention assesses how many users return to the feature after their initial adoption. Measure it at the feature's natural frequency of use (daily, weekly, monthly, or yearly) to evaluate retention accurately.

d. Satisfaction

Satisfaction measures how happy users are with the feature, which is closely tied to how easy it is to use. Conduct a Customer Effort Score (CES) survey to collect this feedback and translate it into action.

Putting things into perspective

Combine all the TARS metrics into a funnel chart. This visualization helps identify areas where users are dropping off and pinpoint areas needing improvement. 

The TARS framework can also be used to create a feature strategy. By dividing the percentage of your satisfied users by the percentage of target users, product managers can calculate an S/T score for each feature. 

This score can then be plotted on a matrix that compares the feature's strategic importance with its S/T score. This matrix helps teams prioritize features and make informed decisions about product development.
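
As a rough sketch of how that strategy view comes together, the snippet below computes an S/T score per feature and buckets it against a strategic-importance rating; the feature names, numbers, and bucket labels are invented for illustration rather than a prescribed rubric.

```python
# Hypothetical TARS summary per feature: (% of users in the target audience,
# % of users satisfied with the feature, strategic importance on a 1-5 scale).
features = {
    "Scheduling":        (40,  8, 5),
    "Spending insights": (60, 30, 4),
    "GPS tracking":      (25, 20, 2),
}

for name, (target_pct, satisfied_pct, importance) in features.items():
    st_score = satisfied_pct / target_pct  # S/T: satisfied users relative to target users
    if importance >= 4 and st_score < 0.5:
        verdict = "strategically important but underperforming: prioritize fixes"
    elif importance >= 4:
        verdict = "core strength: keep investing"
    elif st_score >= 0.5:
        verdict = "works well but low priority: maintain"
    else:
        verdict = "low value and low performance: consider sunsetting"
    print(f"{name:<18} S/T = {st_score:.2f}  importance = {importance}  -> {verdict}")
```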

Here’s a scenario of a product manager putting TARS into action. 

A product manager for a home services app uses the TARS framework to improve a scheduling feature.

  • Target Audience: 40% of users face scheduling issues
  • Adoption: Only 20% of target users engage with the feature, revealing low awareness or discoverability
  • Retention: Only 10% of adopters reuse it, indicating usability issues
  • Satisfaction: A low Customer Effort Score (CES) shows users find the feature confusing

Improvements

  • Increase feature visibility through tutorials or notifications
  • Simplify the scheduling process for ease of use

The S/T score then helps the manager prioritize fixing the feature to boost adoption and satisfaction.

Winding Up

With these feature adoption metrics and a proven funnel to structure them, measurement becomes much easier. They work best in tandem with targeted feature adoption surveys and deliberate efforts to improve feature adoption rates.

Used in silos, their effect on feature adoption is minimal. 

If you feel you might need help with improving your feature adoption as a whole, sign up for a demo, and we’ll help you in the best way possible.  

More resources on Feature Adoption

How a B2B SaaS startup increased feature adoption by 80%