Let’s be honest – customer-facing business functions (support, sales, marketing, success) are becoming indistinguishable from each other. If you want to succeed in all of them, you need to systematically review both your user feedback and your customer conversations on a regular basis.
Talking to customers ain’t what it used to be
Geoffrey Moore popularized the concept of technology adoption life cycles in his classic book “Crossing the Chasm”. The model is still highly relevant – probably now more than ever.
One of those phenomena that seems to be well into the early adopter stage is conversational marketing, to use Drift’s terminology.
This means that as new kinds of business communication platforms like Intercom, Drift and even Slack are blowing up, the lines between sales, support and marketing are becoming reaaaaallly blurry, to say the least. This is especially true for online-only businesses, where chat-like interactions are becoming the norm and every aspect of the business can be measured.
My conclusion from all of this is that by treating customer support, sales, success, or marketing as just that one isolated thing, you will lose out on business. The customer could not care less about what your internal org chart looks like.
Customers just want to talk to somebody competent who represents your company. If your salesperson does not know about integrations or your support rep does not know how you differ from your competitors, your users will go to somebody who does.
You’ll lose potential customers to companies who employ customer-facing people with excellent all-around knowledge.
The output of customer interactions: user feedback
Okay, so you have understood that sales, support and marketing are converging and should be treated as one whole. But other than bottom-line results (sales, sign-ups, conversion), what leading indicators tell you whether you are succeeding?
The most obvious one is of course user feedback – CSAT, CES and NPS are great measures for output. Not measuring those is like running a restaurant, never asking patrons whether they like your food, and just counting the money. Measuring CSAT is an art, but luckily Nicereply has literally written a book about it.
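To make these metrics concrete, here is a minimal sketch of how NPS and CSAT are typically computed from raw survey scores. The scoring thresholds are the standard industry definitions; the response data itself is hypothetical.

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)
    on a 0-10 "How likely are you to recommend us?" scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def csat(scores):
    """CSAT: share of respondents rating 4 or 5 on a 1-5
    satisfaction scale, expressed as a percentage."""
    satisfied = sum(1 for s in scores if s >= 4)
    return round(100 * satisfied / len(scores))

# Hypothetical survey responses
nps_scores = [10, 9, 9, 8, 7, 6, 3, 10]
csat_scores = [5, 4, 4, 3, 5, 2, 5]

print(f"NPS:  {nps(nps_scores)}")
print(f"CSAT: {csat(csat_scores)}%")
```

Note that NPS can range from -100 to 100, while CSAT is a plain percentage – which is one reason the two are not directly comparable across teams or tools.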
You will want to make sure that you’ve got user feedback covered well. Though I will go on to explain how user feedback is only half of the equation, you will be flying blind without it.
The input of customer interactions: what you tell your customers
To continue with the restaurant analogy, you will not only want to know if customers like the soup that you are serving – just as important is to be sure of what goes into the soup and how it is made.
The raw ingredients (price, product etc.) play an important role, but those alone do not determine the outcome – how you mix and serve them matters too.
This is where the concept of conversation reviews comes in (also referred to as internal QA, case/interaction/ticket reviews etc). If the aim is to improve how your customer support is perceived by users, you need to routinely search for examples of excellence and make more service experiences like them.
The only way to move in that “right” direction is to diligently review your conversations to find areas of improvement. It is the same reason why engineering has code review and traditional sales has coaching.
If you consider the concept of conversational marketing, where sales converges with support and marketing, you need conversation reviews just as much as you need to track user feedback. Now you can draw a direct line from the quality of your customer support to business metrics like revenue, conversion and churn.
That is the recipe for the soup. The input determines the output and you cannot have one without the other, if the aim is to be in control of either.
How to make the soup: tracking user feedback and doing conversation reviews
Now that we have covered the theory, it’s time for practice. Let’s say that you have a merry band of 25 sales/support/success folks and everybody uses a tool like Zendesk or Intercom to communicate with customers. It’s all a bit messy and you have no idea how you should take control of the quality of your conversations.
My suggestion: start small and see what works. Reining in the chaos of all the customer-facing interactions can be quite a task, especially if you are growing quickly and it’s tough to understand where support stops and sales/success/marketing begins.
Here are some ideas for bringing clarity and process into the situation, no matter how messy the status quo.
- Turn on feedback surveys for everybody who talks to customers
Whatever tool you are using (Nicereply is a great one, obviously), gathering as many data points as possible is key to success. Only a minority of all conversations will get a reaction from users, so it is important to push the response rate as high as possible. Otherwise there will be little to analyze or draw conclusions from.
Also, there is no real reason why sales or success should be exempt from this practice. It is important to acknowledge the differences between feedback left for conversations that were started by the company and those that were inbound. While these are not directly comparable, they still provide valuable insight on an individual level and within that category of feedback.
Special attention should be paid, of course, to any negative feedback that you are getting.
- Gather user feedback in a central repository
Whichever tools you use, make sure that all data ends up in the same place. Not only does a central repository simplify the analysis process, it also creates transparency within the team.
Give everybody access to the central repository and establish a common understanding of what your level of quality service is. This can be a specialized tool, a Slack channel or a spreadsheet.
- Establish a simple conversation review process.
You can do conversation reviews without any special tools if you do not have a lot of volume. Or, you can use an app like Qualitista for larger teams.
However, what matters more than the technical setup is that you actually do conversation reviews. Quite simply, it means that you regularly look at some examples of past conversations and give feedback to your team.
What you establish as review criteria or how you structure the process is up to you. However, bear in mind that if you make conversation reviewing too complicated, it will lower the chances of reviews being done diligently on a regular basis.
- Review, summarize, repeat
Decide who does conversation reviews, how often, and what happens with the results. It can be as simple as this:
- One senior person each from sales, support and marketing is responsible for the process.
- They review all customer feedback monthly and each covers a specific portion of the period.
- They take 5% of conversations that didn’t get user feedback and each review a part of that.
- Each gives individual feedback to agents or reps as necessary.
- Each writes a short summary report of the findings for the whole team to learn from.
- Repeat monthly.
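The sampling step above can be sketched in a few lines. This is a hypothetical illustration, not a prescribed implementation: the conversation records and the round-robin assignment are assumptions, while the 5% sample rate and the three senior reviewers come from the process described above.

```python
import random

def pick_review_sample(conversations, reviewers, sample_rate=0.05, seed=None):
    """Sample a fraction of the conversations that got no user feedback
    and split them round-robin between reviewers."""
    rng = random.Random(seed)
    no_feedback = [c for c in conversations if c.get("feedback") is None]
    sample_size = max(1, round(len(no_feedback) * sample_rate))
    sample = rng.sample(no_feedback, sample_size)
    # Round-robin assignment so each reviewer covers a part of the sample.
    assignments = {r: [] for r in reviewers}
    for i, convo in enumerate(sample):
        assignments[reviewers[i % len(reviewers)]].append(convo["id"])
    return assignments

# Hypothetical data: 200 conversations, only every tenth one has feedback.
conversations = [
    {"id": n, "feedback": (5 if n % 10 == 0 else None)} for n in range(200)
]
reviewers = ["sales lead", "support lead", "marketing lead"]

for reviewer, ids in pick_review_sample(conversations, reviewers, seed=42).items():
    print(f"{reviewer}: {len(ids)} conversations to review")
```

In practice you would pull the conversation list from your helpdesk tool’s export or API rather than build it by hand, but the shape of the process – filter, sample, assign – stays the same.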
Conclusion: direct the conversation or end up in a ditch
All of this is nothing new. In most business disciplines, some or all parts of the review process have been common knowledge and practice for a long time.
What is new, however, is the way that these disciplines overlap. Thus, it has become harder to understand how these practices need to be deployed.
That’s why a refreshed framework is needed for taking control of your customer interactions and, through that, business results. A lot of companies already practice this, and evolution is bound to catch up with the rest: those that do not offer great quality in how they talk to customers will be eliminated.
In the brave new world of ever faster reply times, reviews on social media and metrics for everything, you shouldn’t let things just happen and hope for the best. The good news is that taking control is also easier than ever before.
About the Author:
Martin Kõiva is the co-founder of Qualitista, a conversation review tool that makes giving feedback to your support team systematic. Previously, he spent four years building and scaling customer support globally at Pipedrive. Follow Martin on LinkedIn.