Customer satisfaction surveys are simple at first glance. They ask the respondent to rate a specific experience with a company on a clear scale. Pretty hard to mess up, right? Since they’re so easy to create and send, people tend to think so. But when we—a survey company—started sending our first CSAT surveys and evaluating the results, we quickly realized our method was flawed.

This was a big problem. We needed to find a reliable formula for measuring customer satisfaction if we were going to help other companies do it. So we set off to develop one.

Why We Rethought Our CSAT Surveys

Our CSAT score didn’t represent our customer experience.

When Campaign Monitor acquired GetFeedback in November 2014, I took over as the company’s main (and only) support agent. We were a startup in transition, facing all the characteristic growing pains: I was building our customer knowledge base while managing every customer need at once. Our co-founders—the company’s acting support agents before I stepped in—were doing an amazing job fueling the engine. But we were missing one rather large piece: feedback on overall customer satisfaction.

Our CSAT score didn’t tell us much because our customer feedback process was lacking. As a survey company with “feedback” in its name, that was a problem. We were missing a basic and critical function for understanding and improving customer experience.

Having no real processes in place, we quickly modeled our CSAT surveys after Campaign Monitor’s structure, which was showing great success: a quick, 10-point ranking survey with images representing customer satisfaction levels. What we ended up with resembled a pain scale in a pediatrician’s office.

3-option CSAT survey

Since we were trying to conform to Campaign Monitor’s existing scale, we used Poor = 1, Good = 5, and Excellent = 10. As you can imagine, this ranking system was far from perfect… but our CSAT score was amazing, at times hitting 100%. As great as that felt, it was unrealistic. By offering just three choices, we were pushing our customers toward favorable responses.

If you’ve ever researched CSAT survey best practices, then you know how limited and subjective the advice is. Our research really only told us one thing: everyone runs their CSAT surveys a bit differently. That answer is pretty frustrating for teams that need a quick blueprint for CSAT.

We needed a more reliable way to measure customer satisfaction.

After researching industry best practices, we finally settled on a true 1-5 scale for our CSAT surveys. This decision was based on a few factors:

  1. We wanted to reduce response bias as much as possible.
  2. We wanted to measure customer satisfaction accurately against our competitors, who were using a 1-5 scale. (The standard calculation for that scale is sketched after this list.)
  3. We hoped that running our CSAT surveys on this scale, alongside regular NPS® surveys, would give us a more complete picture of customer experience.
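
For teams who want the math spelled out: the post doesn’t give our exact formula, but the conventional CSAT calculation on a 1-5 scale counts a 4 or 5 as “satisfied” and reports the percentage. A minimal Python sketch of that standard calculation, with made-up response data:

    # Conventional CSAT scoring on a 1-5 scale: the percentage of
    # respondents who answered 4 or 5. The responses below are made up.
    def csat_score(responses: list[int]) -> float:
        """Return the CSAT score as a percentage of 4s and 5s."""
        if not responses:
            raise ValueError("need at least one response")
        satisfied = sum(1 for r in responses if r >= 4)
        return 100 * satisfied / len(responses)

    print(csat_score([5, 4, 3, 5, 2, 4, 5, 1]))  # 62.5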

The next step was actual survey distribution. We knew we needed to avoid customer fatigue, so we couldn’t send CSAT surveys after every single interaction. But we also wanted to generate as many responses as we possibly could.

This led to two key decisions for our email surveys:

  1. We would send CSAT surveys upon case close rather than with each response.
  2. We would embed the survey response options (1-5) within the support email, beneath our signatures.

Here’s what our final CSAT survey looked like:

5-option CSAT survey
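
If you want to try the embedded approach yourself, the mechanics are straightforward: each number beneath the signature is a link that records a score when clicked. Here’s a rough Python sketch of generating that fragment; the survey domain and query parameters are hypothetical stand-ins, not GetFeedback’s actual endpoints:

    # Sketch: build the 1-5 rating links that sit beneath an email
    # signature. The URL, path, and parameters below are hypothetical.
    def rating_links_html(case_id: str) -> str:
        """Return an HTML fragment with one clickable link per score."""
        links = []
        for score in range(1, 6):
            url = f"https://surveys.example.com/csat?case={case_id}&score={score}"
            links.append(f'<a href="{url}">{score}</a>')
        return "How did we do? " + " | ".join(links)

    # Appended to the case-close email only, not to every reply:
    print(rating_links_html("case-12345"))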

Immediately, our response rates increased by 11%. The reason? By giving customers more freedom (increasing the options from 3 to 5) and convenience (embedding the survey into our emails), we reduced the effort it took to respond.

Next, we added an open-ended question so our customers could share feedback in their own words immediately after selecting a numerical score. Advanced survey logic then served the follow-up question based on the 1-5 score. Here are two examples with different question combinations, depending on the score:

CSAT survey advanced logic

If a customer selected anything lower than a 4, we would show them the question above along with the score they gave us. (The latter is a quick survey personalization trick.)

If they gave us a 4 or a 5, we would show them this question instead:

CSAT survey advanced logic
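
Under the hood, this kind of branching is just a threshold check on the score. A minimal sketch of the routing, with hypothetical question wording (the originals are only visible in the screenshots above):

    # Follow-up routing: scores below 4 get a "what went wrong" question
    # personalized with the score; 4s and 5s get a positive variant.
    # The question wording here is hypothetical.
    def follow_up_question(score: int) -> str:
        """Pick the open-ended follow-up based on a 1-5 CSAT score."""
        if not 1 <= score <= 5:
            raise ValueError("score must be between 1 and 5")
        if score < 4:
            # Echoing the score back is the personalization trick noted above.
            return (f"You rated this interaction {score} out of 5. "
                    "What could we have done better?")
        return "Great! What did you enjoy most about this interaction?"

    print(follow_up_question(2))
    print(follow_up_question(5))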

This small change led to a 5% increase in monthly feedback.

Wrap-up

We all know a great CSAT score is important for your business. It not only signals success but can also inspire quality work from your team and drive long-term customer loyalty. But your method for measuring customer satisfaction has to be sound in order to generate data you can count on.

To get the most out of all customer exchanges, it’s crucial to consider—and reconsider—how you’re measuring them. When we embedded survey responses in the case resolution email, it gave us higher survey response rates. Increasing the customer’s response options delivered a truer score. And the improved accuracy we gained from both changes ultimately helped us identify internal processes that needed attention.

In the end, our CSAT score dropped a few points, but the customer insights we’ve gained have proven invaluable. Our customer feedback program guides the most important choices we make. That’s not limited to customer support. Feedback finds its way to the highest reaches of our business, impacting our strategy and roadmap every day.
