
What Is a Good Survey Response Rate in 2024?

What does it take to achieve a good survey response rate? Learn the key factors and benchmarks to help you get there.

July 26, 2024
Avinash Patil

Times have changed, and so have the trends. But one thing remains constant—the question of survey response rates. 

Whether it's determining what's acceptable, what's average, or what's good, this question persists. Since it can't be answered in just a few lines, we've created a step-by-step guide to help you find the answers you need.

What is a survey response rate? 

A survey response rate is the number of recipients who answered at least one question compared to the total number of recipients who received the survey.

However, many people mistake the survey response rate and the survey completion rate for the same thing. There’s a fine line separating the two.

What is the difference between a survey response rate and a completion rate?

The completion rate is the percentage of people who completed the survey out of those who started it.

The difference between response rate and completion rate lies in the base you measure against. Response rate measures how many of the people who received the survey engaged with it at all (answering at least one question). Completion rate measures how many people finished the survey out of those who started it.

For instance, if 200 people received the survey, 75 answered at least one question, and 60 answered all the questions, then the response rate is 37.5% (75 ÷ 200 × 100) and the completion rate is 80% (60 ÷ 75 × 100).

How do you calculate the survey response rate?

Calculating the survey response rate requires two numbers: the number of recipients who answered at least one question, and the total number of recipients the survey was sent to.

Survey response rate = (No. of respondents who answered at least 1 question ÷ No. of survey recipients) × 100

Let’s say that out of 500 survey recipients, 175 answered at least one question; the survey response rate would then be 35%.

Likewise, to calculate the survey completion rate, use the following formula:

Completion rate = (No. of completed surveys ÷ Total no. of respondents who started it) × 100

Assume 200 people start the survey and only 75 complete it; the completion rate is then 37.5%.
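If you prefer to see the two calculations side by side, here is a minimal Python sketch; the function names are illustrative, not from any survey tool, and the numbers mirror the examples above.

def response_rate(answered_at_least_one: int, recipients: int) -> float:
    """Percentage of recipients who answered at least one question."""
    return answered_at_least_one / recipients * 100

def completion_rate(completed: int, started: int) -> float:
    """Percentage of respondents who finished the survey out of those who started it."""
    return completed / started * 100

# Example from the text: 500 recipients, 175 answered at least one question -> 35.0
print(response_rate(175, 500))

# Example from the text: 200 started the survey, 75 completed it -> 37.5
print(completion_rate(75, 200))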

What is a good survey response rate? 

Good is subjective. The internet will throw up different numbers until someone comes along with new data.

Survey response rates differ by channel. Their effectiveness, in turn, comes down to many factors such as timing, reach, frequency, and, most importantly, asking the right questions.

If you dig deeper, response rates vary widely by channel. Here are the numbers:

  • In-app mobile surveys: 35% 
  • Email surveys: 7% 
  • In-app web surveys: 22%

Going further, response rates vary by industry segment, such as B2B (23-32%) and B2C (13-16%).

Ideally, a good survey response rate represents a large chunk of your target market. At a granular level, it should include your high responders, your low responders, and your mid-range responders so the results balance out rather than skewing toward one group. That balance is what helps you bridge the gap and get meaningful results.

Remember, your previous survey response rates should serve as a benchmark for your future ones, but only alongside on-the-ground improvements.

7 Factors that affect your survey response rate 

1. Channels 

You might already know this, but channels play a critical role in influencing survey response rates. Here’s how each of them fares:

a. Email: Email surveys have a low response rate because a crowded inbox means low visibility. Spam filters are another factor affecting your deliverability and delivery rate. Moreover, unless your emails are mobile-responsive, users will have a hard time accessing them.

b. SMS: While SMS messages have a high open rate (98%), they have a low read rate. With promotional messages flooding the phone, customers may not always read them. Plus, with scammers on the prowl, trust is declining.

c. Web surveys: Web surveys are often triggered at the wrong moment and lack contextual relevance: they appear at random and don’t tie the interaction to a specific touchpoint.

d. In-app surveys: In-app surveys engage users directly on the platform, where they are already active, rather than pulling them in from other channels such as email, SMS, and WhatsApp.

e. In-person surveys: In-person surveys are highly accurate, but they are time-consuming and can only reach a limited number of respondents. They also tend to be expensive, and respondents may give socially desirable answers to seek validation.

2. Sample size 

Sample size has a major effect on reaching statistical significance. Too small a sample means the data doesn’t adequately represent the target audience, which in turn leads to a higher margin of error. The ideal sample size we recommend is 200.

That can get you to roughly 90% confidence; going higher, around 500 responses, gets you to a 95% confidence level. However, the higher the confidence you chase, the larger the sample you need, which is a sign that you’re moving beyond your target audience (see the quick sketch at the end of this section for how these numbers relate).

As a rule of thumb, go for your highly engaged users. They spend more time than your average time on page and complete more micro-conversions, meaning actions with a higher chance of leading to a purchase. For an insurance app, that could be comparing policies or engaging with educational content.
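For readers who want to sanity-check a sample size, here is a minimal Python sketch using the standard sample-size formula for a proportion, n = z² · p(1 − p) / e². The confidence levels and margins of error below are illustrative assumptions chosen to land near the figures above, not numbers from this article.

import math

# z-scores for common confidence levels (standard normal quantiles)
Z_SCORES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def required_sample_size(confidence: float, margin_of_error: float, p: float = 0.5) -> int:
    """Minimum responses needed to estimate a proportion, assuming a large population.

    Uses n = z^2 * p * (1 - p) / e^2; p = 0.5 is the most conservative assumption
    about the underlying proportion.
    """
    z = Z_SCORES[confidence]
    n = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n)

# ~90% confidence with a ~6% margin of error needs just under 200 responses
print(required_sample_size(0.90, 0.06))   # 188
# ~95% confidence with a ~4.5% margin of error needs roughly 500 responses
print(required_sample_size(0.95, 0.045))  # 475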

3. Industry 

Survey response rates differ depending on the end users. The B2B industry has a higher response rate because the survey’s outcome has a greater effect on the respondent’s own satisfaction; with more at stake, they are more willing to comply.

In B2C, customers are laid back until a negative experience compels them to complain.

Likewise, survey length affects response rates across industries. The average length of a web survey is 10 to 14 minutes, with the number of questions ranging from 7 to 15.

Another often-overlooked factor is opportunity cost. Industries with a high opportunity cost, where finding an alternative hurts the wallet (think luxury brands), see higher survey response rates. On the flip side, industries with a low opportunity cost to switch, like groceries, tend to see lower response rates.

4. Timing 

According to studies, the best time to send surveys is when your customers are relaxing, including time away from work. Surprisingly, even certain days can improve your odds of a good response rate.

According to a Question Pro study, the following findings came to light:

  • Tuesday is the most popular day to send surveys in the US
  • The two most popular send times were 6 AM to 9 AM and 10 AM to 11 AM, though this can vary once you start segmenting
  • B2B audiences are most receptive on Monday (3-6 PM)
  • B2C audiences are most receptive between 6 AM and 9 AM on weekdays

Again, this could be different for you based on the channels you use or the demographics you cater to. Test different audiences and timings to find the sweet spot.

5. Language 

While using language your customers speak is a given, jargon and complex terms seriously hurt your response rates.

Running surveys in your customers’ native language, using plain wording, and localizing the survey can help users understand what you want to convey. It creates cognitive ease: respondents can focus on one thing at a time without straining.

The best tip is to write in language a fifth grader could understand.

6. Question Types 

Not all question types trigger the same response, because the effort required differs. Question types like multiple-choice and rating-scale questions have higher response rates since they take little effort to answer.

The only problem with rating scales arises when the scale gets too big. A 5-point scale works best, while a 7- or 10-point scale gets confusing because it’s hard to differentiate between the options. Yes or No questions are easier to answer, but they don’t give full context.

For instance, "Do you speak Spanish?" answered with just a Yes or No says nothing about the level of proficiency.

Finally, a lack of personalization based on previous answers produces flawed responses, something that conditional logic can prevent.
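As a rough illustration of conditional logic, here is a minimal Python sketch that branches a follow-up question on a previous answer. The questions and data structure are made up for the example and not taken from any specific survey tool.

from typing import Optional

def next_question(answers: dict) -> Optional[str]:
    """Pick the next question based on what the respondent has already answered."""
    if "speaks_spanish" not in answers:
        return "Do you speak Spanish? (Yes/No)"
    # Conditional logic: only ask the proficiency follow-up if the answer was Yes
    if answers["speaks_spanish"] == "Yes" and "proficiency" not in answers:
        return "How would you rate your Spanish? (Beginner/Intermediate/Fluent)"
    return None  # nothing left to ask this respondent

# A "Yes" respondent gets the follow-up; a "No" respondent skips it entirely
print(next_question({"speaks_spanish": "Yes"}))
print(next_question({"speaks_spanish": "No"}))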

7. Geographical regions 

Cultures unique to a region have a large influence on survey response rates. Nations with individualistic cultures uphold personal freedom and individual choice. Collectivist cultures, on the other hand, value community and group consensus.

The US has an individualistic culture where people are more open to participating in surveys, while China has a collectivist culture, which usually means hesitation to participate due to social anxiety.

Another obvious factor is follow-ups, which can increase survey response rates.

What is a good response rate for a customer satisfaction survey? 

A good response rate for a CSAT survey is around 5 to 30%. Anywhere between 30 and 50% is remarkable. But that may not be an achievable benchmark if it isn’t the status quo in your industry.

If your CSAT survey response rates aren’t close to your industry average or your ideal target, there could be several reasons behind it, such as:

  • Sending them too soon: in industries like retail, the customer may not have had time to use the product yet
  • Not sending them after every resolved ticket: a CSAT survey sent right after resolution tends to get a higher response rate
  • Leaving automated CSAT surveys to chance: automation is useful, but without guardrails or conditions the results will tank
  • Not segmenting your customers: cohorts such as first-time buyers, recurring customers, and discount-only shoppers give you a more granular view
  • Sticking to one CSAT survey and re-sending it until there’s a positive response
  • Ignoring neutral feedback: neutral responses help you balance both sides

Don’t forget to check out 15 Essential Customer Satisfaction Survey Questions for Actionable Insights.

What is a good NPS survey response rate? 

A good NPS survey response rate varies anywhere between 4.5% and 39.3%, with 12.4% being the sweet spot. But the picture takes an interesting turn when it comes to B2B versus B2C.

While B2B stands at an impressive 60%, B2C comes in at 40%. Again, don’t take this as the definitive word, because your real goal should be capturing responses from your most loyal customers as well as from those who aren’t.

To be frank, you aren’t likely to get a high NPS response rate if you’re in an industry with plenty of alternatives and a low cost of switching. Likewise, if you’re in an industry with frequent price fluctuations and no low-cost alternatives, response rates are going to be abysmal.

Before you go, read NPS vs CSAT: Selecting the Right Feedback Metric

"Shortly after deploying Blitzllama, we witnessed a remarkable surge in the number of surveys we received, surpassing all our previous records. For NPS surveys, the daily increase reached a remarkable 800%, and for persona surveys, the growth was even more impressive. One of the standout features of Blitzllama is its flexibility in data integration. We seamlessly integrated it with Amplitude in less than 10 minutes and conducted successful tests that went beyond our wildest expectations."

CENOA

Akıncan Akülker

Head of Product

Bottom line

Your survey response rates are shaped by a mix of internal and external factors, some within your control and some not. Choosing high-ROI channels, segmenting the right users, asking the right questions, sticking to an optimal sample size, and timing your surveys well are the only hacks you need to follow.

Ultimately it’s not a mere percentage but a segment of users who believe in the power of feedback.