Are your support metrics telling you what you really need to know? We asked ourselves this question last year and the answer was… complicated.

For years, we’d used the Customer Satisfaction (CSAT) Score as our primary support quality metric. Every time an agent closed a case, it would automatically trigger a survey email with an embedded CSAT question. Customers could rate the quality of our support with one click.

The CSAT survey was simple, effective, and delivered valuable feedback on case-handling and agent performance. However, it wasn’t giving us the full picture.

You don’t know what you don’t know

Our monthly CSAT score rarely fell below 98%, which was satisfying but unrealistic. We knew our scores were inflated: customers were primarily rating the support agent, not the support experience as a whole.

Our case-closed customer satisfaction survey: a support follow-up email with an embedded CSAT question

We wanted to know more

While our CSAT score was certainly a confidence-booster, the results weren’t helping us answer our biggest questions:

  • How did their support experience go?
  • Did the agent answer their questions effectively?
  • How was their overall experience with our product?
  • How was the entire support process—from the time the issue arose to when it was resolved?

As a survey solution provider, it’s important for us to be methodical and proactive with our own customer feedback program. We want actionable insights from customers so we can provide the best possible experience. We already knew our agents were awesome, but what about the self-service experience? How did people feel before they reached our agents?

The contender: Customer Effort Score

So our Director of Support and Customer Experience decided to experiment with a different metric: Customer Effort Score (CES). Developed by CEB, this smart metric focuses on how much effort customers have to put forth to get their questions answered.

As most support folks know, low-effort customer experiences are the gold standard. If you want loyal customers, it’s better to aim for ease, not mind-blowing or “delightful” support experiences. In fact, CEB found that the Customer Effort Score was 1.8x more predictive of loyalty than CSAT.

With all this in mind, we swapped CSAT for CES.

Our spankin’ new Customer Effort Score question, embedded in the follow-up support email

Bingo.

The results? First, our overall score dropped to 85%. That was expected; we were asking a different question. Second, we started receiving more thoughtful responses from customers. Rather than focusing on the agent or the interaction, customers began telling us what else they’d like to see from us.
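If you’re curious how a single percentage comes out of either survey, both CSAT and CES are commonly reported as “top-box” scores: the share of respondents who picked one of the favorable answers. Here’s a minimal sketch of that calculation; the 1–5 scale, the choice of 4 and 5 as “favorable,” and the sample responses are illustrative assumptions, not the actual data behind our 98% and 85%.

```python
from typing import Iterable

# Assumed 1-5 rating scale where 4 and 5 count as favorable ("top box").
# Adjust to match the real survey's scale and labels.
FAVORABLE = {4, 5}

def top_box_score(responses: Iterable[int], favorable=FAVORABLE) -> float:
    """Return the percentage of responses that land in the favorable range."""
    responses = list(responses)
    if not responses:
        raise ValueError("no survey responses to score")
    hits = sum(1 for r in responses if r in favorable)
    return 100.0 * hits / len(responses)

# Hypothetical example batches, not our real survey data
csat_responses = [5, 5, 4, 5, 5, 4, 5, 5, 5, 3]  # agent-focused ratings skew high
ces_responses = [5, 4, 3, 5, 4, 2, 5, 4, 5, 3]   # effort ratings spread out more

print(f"CSAT-style score: {top_box_score(csat_responses):.0f}%")  # 90%
print(f"CES-style score:  {top_box_score(ces_responses):.0f}%")   # 70%
```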

Suddenly, we had tons of suggestions for how to improve our support experience. People wanted more onboarding resources, customer webinars, and explainer videos. What’s more, our response rates nearly doubled.

Want to learn more?

On February 20th, Kimberly will share the full story in a free webinar hosted by MindTouch. She’ll explain why we made the switch, discuss what we learned when we did, and offer more insights on customer effort. Read more about what’s on the agenda and register today to save your spot!

