Benchmarking a diverse dataset (across a large industry) is complicated and nuanced. As such, we expect you may have some questions about our data, conclusions, and study population.
Here are some of the most common questions we get. If we haven't answered your question, send it to firstname.lastname@example.org and we will do what we can!
ASP stands for Average Sales Price and is calculated by taking total bookings for a period and dividing it by the number of new deals closed during the same period.
For companies that typically sign multi-year contracts, ASP reflects the Total Contract Value, not just the annual value.
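The ASP calculation above can be sketched in a few lines of Python. The figures and variable names are hypothetical, for illustration only:

```python
# Hypothetical period figures; all names and values are illustrative.
total_bookings = 1_200_000   # total bookings ($) for the period
new_deals_closed = 40        # number of new deals closed in the same period

# ASP = total bookings / new deals closed in the same period
asp = total_bookings / new_deals_closed
print(asp)  # 30000.0
```

For a multi-year contract, `total_bookings` would include the full Total Contract Value of each deal, not just the first year's value.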
The participants in our study are InsightSquared customers who have allowed us to use their data anonymously. You can learn more about who they are (and what they do) in the participants section on this page.
All of the participant information comes directly from their operational data. It is gathered through InsightSquared's own product and reflects sales, finance, and customer data from their CRM.
It is important to note that this data does not come from surveys, which have been shown to be inaccurate.
We measured win rate in terms of value (as in total value of won deals divided by total value of open opportunities). This is often higher than when measured by count (total number of won deals divided by total number of open opportunities) because large won deals can pull the win rate up disproportionately.
We also tracked win rates only for opportunities, not leads. Companies that track win rates from lead to deal should expect to have significantly lower win rates.
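The gap between value-based and count-based win rates can be seen with a small worked example. The deal values below are hypothetical, and the denominator (total value or count of all opportunities in the pool) is an assumption made for illustration:

```python
# Hypothetical opportunity data (values in dollars), for illustration only.
won_values = [100_000, 5_000]              # values of won deals
other_values = [10_000, 10_000, 10_000]    # values of the remaining opportunities

total_value = sum(won_values) + sum(other_values)
total_count = len(won_values) + len(other_values)

# Win rate by value: one large won deal dominates the numerator.
win_rate_by_value = sum(won_values) / total_value   # ~0.78

# Win rate by count: the same outcome looks much more modest.
win_rate_by_count = len(won_values) / total_count   # 0.40
```

Here the two measures differ sharply (roughly 78% vs 40%) purely because one won deal is much larger than the rest of the pool, which is the effect described above.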
If you’re paying close attention, you may notice that the benchmarks are not always consistent across the study’s sub-components. For example, the win rate for companies with average deal sizes between $1k and $5k in our sales performance section is 25%, but only 21% in our pipeline management section. What’s the deal?
The answer is that the populations differ between the two analyses. Not all of our customers have configured every section of our product, so each benchmark group has a slightly different population. In this case we had nearly 70 participants for the sales performance analysis but just 30 for the pipeline management section. The difference in samples means the metrics can differ as well.
There are several ways to measure sales reps’ quota attainment, but we picked the one we believe to be the most meaningful and simple: the percent of quota attained by the average sales rep.
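One way to read "percent of quota attained by the average sales rep" is to compute each rep's attainment percentage and then average those percentages. That interpretation, along with the rep figures below, is an assumption for illustration:

```python
# Hypothetical per-rep figures; the averaging approach is one plausible
# reading of "percent of quota attained by the average sales rep".
reps = [
    {"attained": 90_000,  "quota": 100_000},
    {"attained": 120_000, "quota": 100_000},
    {"attained": 60_000,  "quota": 100_000},
]

avg_attainment = sum(r["attained"] / r["quota"] for r in reps) / len(reps)
print(f"{avg_attainment:.0%}")  # 90%
```

Note that averaging per-rep percentages can give a different answer than dividing total attainment by total quota when quotas vary across reps.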
Yes. Our benchmarking analysis is an ongoing effort to analyze and interpret trends in the tech/sales world. We will be continuously updating and adding to our benchmarks.
Yes. Our goal is to provide data to help answer the questions that sales leaders and startup executives need to make more informed decisions. If you have a business area or specific question you'd like to see answered with data, email email@example.com with the subject line "Benchmarking Data Request."
All of the data used in InsightSquared's Benchmarking Analysis is anonymous and aggregated. We are able to provide metadata about the participants (industry, location, size, age, etc.) but no specifics about the businesses whose data we are using to provide the benchmarks.
All of the benchmarks used in our analysis come directly from InsightSquared’s product. To see how you compare against the industry benchmarks, email firstname.lastname@example.org to request a demo of the product. Once you have connected Salesforce to the product, you will be able to compare your company’s performance to our industry benchmarks.