
This post is part of InsightSquared’s 2016 Sales Benchmark Report.

In late 2015, InsightSquared analyzed the performance data of a large subset of our customer base. The ultimate goal of this project was to provide definitive sales performance and revenue benchmarks that high-growth tech companies can use to assess and contextualize their own performance. Unlike traditional benchmarking studies, this was not conducted using surveys; it was based on the actual Salesforce data of real companies. How did we get this data? How did we analyze it to provide definitive benchmarks? What types of companies participated in the study? In this post we answer these questions by explaining the participant breakdown and methodology of the study.

2016 Sales Benchmarking Participant Breakdown

For our 2016 Sales Benchmarking Analysis, we studied the data from almost 150 businesses. These companies reflect a wide range of industries, sizes, ages, and locations, although the typical participant was a VC-backed, U.S.-based SaaS company with fewer than 100 employees, founded in the first decade of the 21st century. Here is a little more information about how the participants break down:

Global Headquarters

The vast majority of participants are located in North America, particularly in the U.S. However, there are several participants spread across the rest of the globe, mostly in Western Europe.

Of the participants headquartered in the U.S., the bulk are split between New York / Boston and California.

U.S. Headquarters

The typical company in our benchmarking analysis is a VC-backed SaaS company, but there is a sizable chunk of privately funded SaaS companies as well.

Most participants in our analysis were founded in the first decade of the 21st century. Only 4% were founded before 1990.

Founded Date

Company Size

Almost 40% of the companies we studied had 50 employees or fewer, and more than 95% clocked in under 500.

Now that we’ve broken down who participated in our 2016 Benchmarking Analysis, it’s time to explain how we analyzed their performance. Read on for the technical methodology behind these tech industry benchmarks.

2016 Sales Benchmarking Methodology

The InsightSquared 2016 Sales Benchmark Analysis was built by studying the sales practices and performance of the nearly 150 businesses described above. We combined creativity with statistical rigor to provide an in-depth look at how sales teams work. This analysis is more than just a lot of data; it’s the inside story of what makes real companies tick at every level of the organization.

Our goal was to produce a comprehensive, descriptive view of sales operations, so anyone can look at our analysis, understand how they compare to their peers, and identify areas where they can improve. We framed the analysis around the key metrics that sales managers and executives want to track, including benchmarks such as sales rep win rates, compound annual growth, and customer lifetime. Below is some more detailed information on how we performed this analysis.

The analysis was limited to:

  • Companies that use Salesforce.com as their CRM system
  • Performance metrics between 9/1/2013 and 9/30/2015
  • Companies that are consistently and accurately tracking their performance

Below are a few details on how some of the metrics were calculated:

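To make the definitions concrete, here is a minimal Python sketch (not the code used in the study) of how two of the core metrics, ASP and value-based win rate, can be computed from a set of opportunity records, following the definitions in the FAQ below. The `Opportunity` fields and the sample numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Opportunity:
    amount: float   # total contract value of the deal (hypothetical field name)
    is_won: bool    # True if the opportunity was closed-won

def average_sales_price(won_deals):
    """ASP: total bookings for a period divided by the number of new deals closed."""
    total_bookings = sum(d.amount for d in won_deals)
    return total_bookings / len(won_deals) if won_deals else 0.0

def win_rate_by_value(opportunities):
    """Win rate by value: total value of won deals over total value of opportunities."""
    total_value = sum(o.amount for o in opportunities)
    won_value = sum(o.amount for o in opportunities if o.is_won)
    return won_value / total_value if total_value else 0.0

# Hypothetical sample of opportunities from one period
opps = [Opportunity(12_000, True), Opportunity(8_000, False),
        Opportunity(45_000, True), Opportunity(5_000, False)]
won = [o for o in opps if o.is_won]

print(f"ASP: ${average_sales_price(won):,.0f}")             # ASP: $28,500
print(f"Win rate by value: {win_rate_by_value(opps):.0%}")  # Win rate by value: 81%
```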

Benchmarking a diverse dataset (across a large industry) is complicated and nuanced. As such, we expect you’ll have some questions about our data, conclusions, and study population.

Here are some of the most common questions we get. If your question isn’t answered below, send it to info@insightsquared.com and we will do what we can!

What do you mean by ‘ASP’?

ASP stands for Average Sales Price. It is calculated by dividing total bookings for a period by the number of new deals closed during that same period.

For companies that typically sign multi-year contracts, ASP reflects the Total Contract Value, not just the annual value.
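
For example (hypothetical numbers), $500,000 in bookings across 25 new deals works out to an ASP of $20,000; if each of those deals were a three-year contract at $20,000 per year, the ASP would instead be based on the $60,000 total contract value.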

Who are the study’s participants?

The participants in our study are InsightSquared customers who have allowed us to use their data anonymously. You can learn more about who they are (and what they do) in the participants section on this page.

How did the participants submit their data?

All of the participant information comes directly from their operational data. It is gathered through InsightSquared’s own product and reflects sales, finance, and customer data from their CRM.

It is important to note that this data does not come from surveys, which have been shown to be inaccurate.

Why do the win rates seem high?

We measured win rate in terms of value: the total value of won deals divided by the total value of opportunities opened. This is often higher than win rate measured by count (the total number of won deals divided by the total number of opportunities opened), because a few large won deals can pull the value-based rate up disproportionately.
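
To illustrate with hypothetical numbers: a team that wins one $100k deal and loses four $10k deals has a win rate of roughly 71% by value ($100k of $140k) but only 20% by count (1 of 5).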

We also tracked win rates only for opportunities, not leads. Companies that track win rates from lead to deal should expect to have significantly lower win rates.

Why aren’t benchmarks always uniform across the pages?

If you’re paying close attention, you may notice that the benchmarks are not always consistent across the study’s sub-components. For example, the win rate for companies with average deal sizes between $1k and $5k in our sales performance section is 25%, but only 21% in our pipeline management section. What’s the deal?

The answer is that the populations are different for these two analyses. Not all of our customers have configured each section of our product, so each benchmark group has a slightly different population. In this case, we had nearly 70 participants for the sales performance analysis but just 30 for the pipeline management section. The difference in samples means that the metrics can differ as well.

How did you measure quota attainment?

There are several ways to measure sales reps’ quota attainment, but we picked the one we believe to be the most meaningful and simple: the percent of quota attained by the average sales rep.
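
For example (hypothetical numbers), if five reps each carry a $100k quarterly quota and close $90k, $110k, $60k, $80k, and $100k, the average rep attains 88% of quota.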

Will you release more benchmarks?

Yes. Our benchmarking analysis is an ongoing effort to analyze and interpret trends in the tech/sales world. We will be continuously updating and adding to our benchmarks.

Can I request benchmarks?

Yes. Our goal is to provide data that helps sales leaders and startup executives answer the questions they need to make more informed decisions. If you have a business area or specific question you’d like to see answered with data, email info@insightsquared.com with the subject line “Benchmarking Data Request.”

Can I have a list of the participants?

All of the data used in InsightSquared’s Benchmarking Analysis is anonymous and aggregated. We are able to provide metadata about the participants (industry, location, size, age, etc.), but no specifics about the businesses whose data we are using to provide the benchmarks.

Can I ask you to create my benchmarks?

All of the benchmarks used in our analysis come directly from InsightSquared’s product. To see how you compare against the industry benchmarks, email sales@insightsquared.com to request a demo of the product. Once you have connected Salesforce to the product, you will be able to compare your company’s performance to our industry benchmarks.

Dan McDade
Dan McDade is the Customer Analytics Specialist at InsightSquared. He uses advanced statistical analytics and predictive modeling to analyze customer behaviors.