How to Take Action on CSAT Feedback

A process map for taking action on CSAT feedback: how to transform, consume, and operationalize customer feedback collected via a CSAT survey.


Annette Franz

November 18, 2019


Author and businessman Harvey Mackay once said: “You learn when you listen. You earn when you listen–not just money, but respect.”

Those words could not be truer when it comes to customer experience (CX) and to your business. 

Listening (to customers) is one of the most powerful tools in your toolbox. As Harvey says, you learn when you listen. You learn about the customer, about her expectations, about how well you’re performing against her expectations, about how your products and services help her solve her problems, and more.

The respect part of Harvey’s quote comes from doing something with what you learn. Unfortunately, this is where a lot of businesses fall down. Don’t be one of those businesses. Let me help you take what you hear and what you learn from customers and do something with it. After all, insights without action are just expensive trivia!

The work to take action on feedback actually begins long before you even launch your CSAT survey. As you’re designing the survey, in order to ensure that you can take action on the feedback you get from customers, you need to consider the following for each question you ask.

  • What will we do if this question is rated low (or high)?

  • How will we act on it?

  • Who owns this question?

  • Who else needs this information?

  • Who will act on it?

  • How quickly can we make changes?

  • Is this something we can actually change?

  • Why are we asking this?

Asking for feedback about something you can’t change–or in such a way that you’re not sure what you need to change–is pointless. 

You’re wasting your customers’ time and your company’s time (and money). If you can’t succinctly answer these questions about everything you’re asking on the survey, then reconsider what you’re asking. Apply this test every time you design a survey.

Once you’ve thought about–and clearly answered–these questions, it’s time to think about question design. You can’t take action on bad data. So, how are you going to ask your survey questions to ensure that you get good data and can effect real change for the customer experience from what you get? In order to take action, you’ve got to have actionable feedback.

What makes feedback actionable? 

It needs to be:

  • Clear and unambiguous.

  • Specific (it explains or helps you understand “the why”).

  • Relevant (to the person/department who has to use it).

  • Contextualized (it provides insights, not just data).

  • Linked to customer outcomes, customer value.

  • Linked to business outcomes, business value.

How to get actionable feedback from a CSAT survey

Based on those parameters, here are some guidelines for designing the questions you’ll ask on your CSAT surveys.

Don’t ask double-barreled or compound questions

If you’re not familiar with this concept, here’s an example: “How satisfied are you with the speed and quality of the solution you were given?”

You’re asking about the speed of the solution and the quality of the solution, two very distinct things. 

First, this will confuse the respondent. What if speed was great, but the quality wasn’t? Or vice versa? 

Next, it will frustrate whoever needs to act on it because it’s not clear what needs to be fixed: one, the other, or both. Keep your questions to just one thought/concept.

Don’t ask leading or biased questions 

“We know you loved our new candy bar. How much did you love it?”

It’s a silly example, but you get the point. Don’t bias the question wording by putting a positive or negative spin on it. Just ask what you want to ask; don’t lead the witness.

Don’t only ask generic, high-level questions that aren’t specific enough to drive change

For example, asking customers to “rate your overall satisfaction with our website” without additional detailed attributes about the site or without an open-ended question to understand the why behind the rating is not helpful.

Be specific in open-ended questions

Ask exactly what you want to know, e.g., “What can we do to ensure you rate us a 10 on overall satisfaction the next time you do business with us?” Or, “Tell us the single most important reason you recommended us to your friends.”

Make sure your questions are not ambiguous

If a respondent pauses and says, “What do they mean by that?” then the question is poorly constructed. Poorly-constructed questions result in responses that are not actionable; nobody really knows what they mean or what to do with them.

Don’t overlap response choices

Your question response choices and rating scales should be mutually exclusive. When response choices overlap or don’t make sense, they become meaningless; and that means they are also not actionable.

Do your homework

Make sure you provide a complete list of response choices. I hate when the one answer that should be there is missing. 

Be sure to provide an “Other (please specify)” when appropriate. Not offering this latter option will either force people to skip the question or to select something that may not be accurate–and that’s not actionable; it’s misleading.

Only ask questions that are relevant to that customer and their experience

Don’t let others hijack your survey and force you to mix in a bunch of marketing research questions or other nice-to-knows. Those questions are out of context and are also not relevant to what you’re trying to achieve, which means they aren’t actionable for your cause.

Review verbatims for future surveys

For future question ideas, review verbatims from open-ended questions or “Other (please specify)” responses for emerging and actionable topics. These verbatims are a rich source of information, for a variety of reasons.

Append customer data  

Append customer data (from your CRM or call center platform) to the survey responses. This allows you to conduct a more robust analysis of the feedback and makes the insights far more actionable.
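As a minimal sketch of what appending customer data can look like, here is a hypothetical join of CRM attributes onto survey responses using pandas. The column names and values are illustrative assumptions, not a prescribed schema.

```python
import pandas as pd

# Hypothetical survey responses keyed by customer ID
surveys = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "csat_score": [5, 2, 4],
    "comment": ["Great service", "Slow response", "Solid overall"],
})

# Hypothetical CRM extract with segment and tenure attributes
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "segment": ["Enterprise", "SMB", "SMB"],
    "tenure_months": [36, 4, 18],
})

# Left-join CRM attributes onto responses so every answer carries
# the customer context needed for segmented analysis later on
enriched = surveys.merge(crm, on="customer_id", how="left")
print(enriched[["customer_id", "csat_score", "segment"]])
```

With the data joined this way, you can slice satisfaction by segment, tenure, or any other appended attribute instead of looking only at overall averages.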

OK, so you are asking the right questions. Questions that are relevant, meaningful, and actionable. And you’ve appended customer data to the feedback to add even more detail and actionability. Great! The survey is now properly designed.


How to take action on your CSAT feedback 

You’ve launched the survey, and the feedback is pouring in. You’ve got tons of data that you need to make sense of. Now what? Now it’s time to analyze the data and turn it into insights that can be used throughout the organization.
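One common first summary is the CSAT score itself: the percentage of respondents choosing the top ratings. As a sketch, assuming a 5-point scale where 4s and 5s count as “satisfied”:

```python
def csat_score(ratings, satisfied_threshold=4):
    """Percent of respondents rating at or above the threshold
    (commonly 4 and 5 on a 5-point scale)."""
    satisfied = sum(1 for r in ratings if r >= satisfied_threshold)
    return round(100 * satisfied / len(ratings), 1)

# 5 of these 8 hypothetical ratings are a 4 or 5
print(csat_score([5, 4, 3, 5, 2, 4, 5, 1]))  # 62.5
```

The score is a useful headline number, but as the steps below show, the real value comes from digging into the drivers and verbatims behind it.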

Here’s the process that I recommend you take to transform, consume, and operationalize the data. 

Step 1: set up service recovery

First up, close the loop on service recovery opportunities. You’ve likely set up some alerts so that when feedback comes in above or below a certain threshold, a notification gets sent to the appropriate people to follow up with customers.

Following up on negative feedback is a service recovery opportunity, a chance for you to make things better for the customer because she told you things didn’t go well. 

Make sure you’ve got a process in place to follow up with every customer who met the alert threshold, with every customer not happy with the experience.

To plan, ask yourself these questions:

  • Who will respond to the customer? 

  • Within what time frame?

  • In what mode (phone, email, in-person)? 

  • What will they say/ask (e.g., apologize, ask for more information to get to the root cause, schedule a follow-up call for more details, etc.)? 

  • How will you empower your staff to handle these calls? 

  • What information do they need to make the call? 

  • What is the intended outcome of the follow-up? 

  • When and how does the service recovery get escalated? 

  • How will you capture the discussion? Will you share best practices with others to learn from? 

  • How will you know if the customer is satisfied with the follow-up?

  • How will you know if you’ve saved the customer?

By the way, you may have some kudos alerts set up as well. 

When customers provide positive feedback or mention specific employees and a job well done, that feedback is routed to the appropriate teams or people. 

It’s not all about the bad experiences; share the good and reinforce the types of actions and behaviors that all employees should engage in.
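The alert logic described above can be sketched as a simple threshold-based router. The scale, thresholds, and queue names here are illustrative assumptions, not a prescribed setup.

```python
# Sketch of threshold-based alert routing, assuming a 1-5 CSAT scale.
DETRACTOR_MAX = 2   # scores at or below this trigger service recovery
KUDOS_MIN = 5       # scores at or above this trigger a kudos alert

def route_response(response: dict) -> str:
    """Return the follow-up queue a survey response belongs in."""
    score = response["csat_score"]
    if score <= DETRACTOR_MAX:
        return "service-recovery"   # follow up with the customer
    if score >= KUDOS_MIN:
        return "kudos"              # share the win with the team
    return "no-action"

print(route_response({"csat_score": 1}))  # service-recovery
print(route_response({"csat_score": 5}))  # kudos
```

In practice the routing would notify the owning team (via email, ticket, or chat), but the core idea is the same: every response above or below a threshold lands in someone’s queue for follow-up.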

Step 2: prepare an analysis plan

Before you analyze your feedback, you’ve got to develop your analysis plan, which is a roadmap for how to analyze your data–and probably more importantly–why you’re analyzing it.  

The plan is important because it systematizes how you look at the data–and it clearly spells out some of the things that I wrote about earlier in terms of how the data will be used, linkages, etc. Here’s what the plan should include.

Background

Include the following background information to help keep the analysis focused on what matters.

  • Objectives and goals of the survey. 

  • Purpose of the analysis.

  • What questions you are trying to answer/issues trying to solve.

  • Who owns each question being analyzed.

  • Data sources (survey data, customer data).

  • Population or subsets of the population.

  • Segments, i.e., how the data will be segmented.

  • The audience, i.e., who will view and consume the insights.

Details

Think about the types of analysis you’ll need to conduct to tease out the story from the data, how you want to prepare it, and how you want to present it. 

Address each of the following:

  • Tools to be used for analysis.

  • How to handle missing values (and other data rules).

  • Key outcomes (dependent variables).

  • Inputs (independent variables).

  • Statistical analysis to be performed, e.g., regression, correlation, chi-squared, factor analysis, cluster analysis, etc.

  • Descriptive statistics. 

  • Predictive and prescriptive analysis.

  • Outputs and formats, e.g., means, percentages, two decimal places, etc.

  • Deliverables, e.g., report, presentation, spreadsheets, etc.

  • QA requirements (yes, someone needs to QA your analysis!).

Questions

For each question in the survey, you’ll go back to the original questions you were supposed to ask about each one and note:

  • Question text and context.

  • Purpose of the question (the problem it solves).

  • Owner of the question (who’s going to do something with it). 

  • How you want to analyze it (e.g., crosstab by segments, correlate against dependent variable X, predictive analytics, etc.).

  • Against what? 

  • And why? What are you trying to uncover with that particular analysis?

  • To what outcome is this linked?

Step 3: analyze the data

Next, obviously, you’ll analyze the data. You’ve got to break it down so that you can better understand it.

The analysis takes many forms because there will be many different types of data to make sense of–not just the survey data but also the customer data that you’ve appended. 

You’ll need a way to crosstab, predict, identify key drivers, and prioritize improvements with survey data; mine and analyze your unstructured data; and conduct linkage analysis to link customer and employee data, customer feedback with operational metrics, and all data to financial measures.

And finally, you’ll need to conduct a root cause analysis to understand the why behind it all. I love to use the 5 Whys method for root cause analysis because it’s simple, yet powerful and effective. Ask “Why?” five times to get to the root of the problem.
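As one minimal sketch of key-driver analysis, you can correlate each attribute rating with overall satisfaction; attributes with the strongest correlation are candidate drivers to prioritize. The attribute names and ratings below are hypothetical, and a real analysis would use a larger sample and likely regression rather than simple correlation.

```python
import pandas as pd

# Hypothetical responses: attribute ratings plus overall satisfaction
df = pd.DataFrame({
    "speed":   [5, 2, 4, 3, 5, 1],
    "quality": [4, 3, 5, 2, 5, 2],
    "overall": [5, 2, 5, 2, 5, 1],
})

# Correlate each attribute with overall satisfaction; the strongest
# correlations point at candidate key drivers of the experience
drivers = df.drop(columns="overall").corrwith(df["overall"])
drivers = drivers.sort_values(ascending=False)
print(drivers)
```

Correlation only suggests where to look; pairing it with root cause analysis (like the 5 Whys) on the verbatims is what explains why a driver scores the way it does.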

Step 4: synthesize and contextualize the data

Next, you’ve got to synthesize and contextualize the data and the findings from your analysis. 

Once data has been broken down and analyzed for better understanding, it’s most useful for the end-user when it’s transformed into insights; that’s what synthesizing and contextualizing is all about. 

Put all the pieces of the analysis together to tell a story, to put it into context for those who need to act on it–a story that can be easily understood and translated into a better customer experience. 

This storytelling skill is not an easy one to find or to learn, but it’s an important part of making sure the data can be acted on.

Step 5: socialize both data and insights

The story is ready to be told. Next up, the data has to be socialized. Those insights and their corresponding stories must be shared across the organization, and in such a way that people know what to do with them. 

Insights and resultant recommendations have to get into the hands of the right people who will do something with them. The insights need to be shared with those teams or departments with a vested interest in the specific feedback. 

And you’ve got to get the insight into the hands of your executives, as well. Some of the improvements that need to be made are organization-wide and require C-level involvement to ensure the commitment is there for time, funds, and other resources.

Step 6: strategize a plan for taking action 

Now that the data and insights are in the hands of the people who need to use it to improve the experience, it’s time to develop an action plan. 

Ask yourself:

  • What are some of the common issues, themes, or trends that arise from the data? 

  • How can/will the department respond to these insights? 

  • What will they fix? 

  • What improvements do they need to make to their specific policies and processes? 

  • What additional training does their staff require? 

  • What tools do they need? 

  • What communication is required to ensure the entire team or department is on board with the required changes? 

  • Are there best practices that they can share with other departments? 

This is a critical step to turn insights into action. You’ve got to answer what you will do, how it will be done, and who will do it.

Step 7: take action on the feedback 

With a plan in place, it’s time to implement the changes. You’ve got to prototype the fix, test it with customers, and fail fast.

If they say that the experience is still problematic, then you’ve got to start over or fix the prototype you’ve developed. Let customers be your guide as to whether the change actually delivers a better experience. This is an important step that many companies fail to take.

Once you’ve implemented the new or improved experience, you’ve got to do a couple of things. Make sure employees are trained on new tools or processes that facilitate the experience and make sure they are trained on how to deliver the new experience. 

Close the loop with customers; let them know what you’ve done and how the experience will be different going forward. You’ve got to set the appropriate expectations so that there are no surprises.

In closing

As you can see, when you decide to ask customers for feedback, there’s a lot of work that follows. The key is to be prepared for it. Follow the steps outlined here, and you should be well on your way to delivering a better experience for your customers.

Customer Satisfaction Score: A Free Guide

If you’re on the hunt for a CSAT calculator that will help you measure your score and prove the possible revenue impact of satisfaction, GetFeedback has got the right thing for you: a free CSAT interactive calculator. Check it out! You’ll even get custom recommendations based on your score.

Learn how GetFeedback can help you exceed customers’ expectations—start your free trial today.

About the guest author

Annette Franz is the founder and chief experience officer of CX Journey Inc.

She’s got 25 years of experience in both helping companies understand their employees and customers and identifying what drives retention, satisfaction, engagement, and the overall experience—so that, together, you can design a better experience for all constituents. She has worked with both B2B and B2C brands in a multitude of industries. Connect with her: www.cx-journey.com | @annettefranz | @cxjourney | LinkedIn | Facebook
