Do Your Customer Surveys Measure Up?

Do the following statements describe your company's approach to gathering customer data?

  • We ask customers to comment only on what they have experienced firsthand, because we need reliable data, not guesses.

  • We collect customers' views shortly after they have interacted with the company, while the event or activity is fresh in their minds, because people forget and we need to address service issues promptly.

  • The teams that do the work receive data about their performance, because they are the people who need to understand and improve the way service is delivered to customers.

  • We report customer data alongside financial data, because we know that customer loyalty improves financial performance.

If you answered yes to all of the above, you are a very rare breed.

Our experience suggests that most companies use a crude approach to gathering vital customer data: the norm is periodic, often annual, surveys disconnected in time from the events they purport to measure.

We call it the Annual "Do You Love Us?" Survey, and it has many pitfalls.

One of the biggest mistakes is treating customer satisfaction as the goal of measurement. Satisfaction is nice to know, but what companies really need to know is how well they perform in the activities that customers think are important.

Customers don't want to be satisfied; they want companies to perform well on things that matter to the customer.

Because companies collect feedback so infrequently, their surveys attempt to collect too much information at once. Customers receive lengthy surveys with questions that are irrelevant to them and are about things they have had no direct experience with.

Many surveys end up in the trash. Those who do complete them get question fatigue and guess at the questions about which they lack firsthand experience. The resulting data is of questionable reliability.

Periodic surveys limit the opportunities for data-driven improvement. An annual survey means only one data-driven learning and improvement opportunity each year. In a competitive market, can you wait a year for an opportunity to improve?

Even a monthly survey schedule means that data can be out of date, particularly when the capture and processing of that data adds weeks to the cycle. If a customer is dissatisfied (and, remember, most don't complain—to you), how long can you afford for that customer to be walking around with a grouse?

Identifying and addressing unspoken complaints is an effective way to improve loyalty, but speed is of the essence.

For many, action planning is also limited by access to the data. Paper-based reports and charts that are bound in documents as thick as telephone directories make widespread circulation difficult.

Even if the information is supplied on spreadsheets, finding the right data set poses problems. There is rarely an easy way to find the right information or navigate around hierarchical views of the data.

Many companies say they want to be customer-focused. A few companies (such as Avis Europe, First Direct, General Electric, and Rackspace Managed Hosting) have embedded a customer-driven approach to continual improvement into their cultures.

For them, regular collection of customer feedback at key points of customer interaction is the fuel that drives continual improvement. Avis Europe's rental stations receive regular customer feedback that local teams review and act on. All over Europe, the Middle East, and Africa, Avis Europe teams are working to make the service they deliver better.

For those companies that still rely on calendar-driven surveys rather than customer-driven surveys, a different approach is needed.
 
Much of the data collection has to be event-driven, referencing the specific interaction between customer and company.

Highly focused event-driven ''micro-assessments''—five to eight questions issued to customers who have recent, firsthand experience of the activity—raise both response rates and the reliability of the data collected. Careful design of those micro-assessments and the capability of underpinning systems allow data from different assessments to be aggregated, forming a comprehensive picture of overall performance.
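
To make the aggregation idea concrete, here is a minimal sketch of how responses to different micro-assessments might be rolled up through their shared theme questions. The survey names, theme keys, and scores are hypothetical illustrations, not any particular vendor's implementation.

    # Minimal sketch: rolling up event-driven micro-assessments into an
    # overall picture via shared theme questions. Survey names, theme
    # keys, and scores are hypothetical illustrations.
    from collections import defaultdict
    from statistics import mean

    # Each response comes from a short, event-specific survey but shares
    # common theme questions (e.g., responsiveness) with other surveys.
    responses = [
        {"survey": "post-rental", "themes": {"staff_attitude": 9, "responsiveness": 8}},
        {"survey": "call-center", "themes": {"staff_attitude": 7, "responsiveness": 4}},
        {"survey": "post-rental", "themes": {"staff_attitude": 8, "responsiveness": 9}},
    ]

    def aggregate_by_theme(responses):
        """Average each shared theme across all micro-assessments."""
        scores = defaultdict(list)
        for response in responses:
            for theme, score in response["themes"].items():
                scores[theme].append(score)
        return {theme: mean(values) for theme, values in scores.items()}

    print(aggregate_by_theme(responses))
    # -> {'staff_attitude': 8, 'responsiveness': 7}

Because every survey asks the same theme questions the same way, each short survey stays relevant to its event while still contributing to a single picture of overall performance.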

A new approach to data collection has to be matched with changes in how the data is used.

The primary audience for customer data is the people responsible for the work. Customer information is an essential requirement of work-based continual improvement.

Technology now allows organizations to put customer-feedback data on every desktop. Work teams can analyze the customer's view of their performance and drive continual improvement. Low scores from an individual customer that imply real dissatisfaction can be flagged for immediate follow-up, providing the opportunity to nip a potential customer defection in the bud. Such responsiveness often has a significant impact on repurchase intention.
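
As one way such flagging might work in practice, here is a minimal sketch of a low-score alert; the cutoff, field names, and notification hook are assumptions for illustration only.

    # Minimal sketch: flag an individual low score for immediate follow-up.
    # The cutoff, field names, and notify hook are hypothetical.
    LOW_SCORE_THRESHOLD = 4  # assumed cutoff implying real dissatisfaction

    def check_for_follow_up(response, notify):
        """Alert the account manager when a score implies dissatisfaction."""
        if response["overall_score"] <= LOW_SCORE_THRESHOLD:
            notify(
                account_manager=response["account_manager"],
                customer_id=response["customer_id"],
                reason=f"Overall score {response['overall_score']}: contact today",
            )

    # Example: a dissatisfied customer triggers an alert within hours of the
    # event, not at the end of an annual survey cycle.
    check_for_follow_up(
        {"customer_id": "C-1042", "overall_score": 2, "account_manager": "jsmith"},
        notify=lambda **details: print("ALERT:", details),
    )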

Managers can look at performance across the organization, recognizing good performance and focusing attention on correcting weak areas. They can review customer data against financial and other operating data, looking for insights into any systemic performance issues.

Here are 10 steps to get your feedback system in shape:

  1. Map out your customer journey, and for each key interaction ask "What matters to the customer?"
  2. Use the answers to that question to develop a suite of event-driven surveys.
  3. Identify like themes across those surveys (e.g., staff attitudes, responsiveness, overall satisfaction, repurchase and recommendation intention), and use common questions to enable performance across the customer journey to be tracked.
  4. Identify the business-process owners for questions in the surveys and use that as the basis for role-relevant reporting.
  5. Create automated alerts to notify business-process owners and account managers of any customers who report low satisfaction, thus ensuring a quick response to issues and maximizing the opportunity for service recovery.
  6. Build reporting mechanisms that deliver actionable data to drive change. Remember, it is the actions driven from feedback that improve the customer experience, not the feedback itself.
  7. Integrate feedback and customer relationship management (CRM) systems to automate the deployment of event-driven surveys and to personalize content (see the sketch after this list).
  8. Pass key feedback results back into CRM to maintain a single view of the customer.
  9. Use league tables to highlight best practices and foster a learning culture.
  10. Integrate feedback (attitudinal) data with transactional and financial (behavioral) data to identify the actions that really make a difference.
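
Steps 7 and 8 imply a small amount of plumbing between the CRM and the survey tool. The sketch below shows one possible shape for that integration, assuming hypothetical class, function, and field names throughout: a CRM event triggers the matching event-driven survey, and key results flow back into the customer record.

    # Minimal sketch of steps 7 and 8: a CRM event triggers the matching
    # event-driven survey, and key results are written back to the CRM.
    # All class, function, and field names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class CrmRecord:
        customer_id: str
        email: str
        feedback_history: list = field(default_factory=list)

    # Step 7: one place maps interaction types to their micro-assessments.
    SURVEY_FOR_EVENT = {"rental_return": "post-rental", "support_call": "call-center"}

    def on_crm_event(record, event_type, send_survey):
        """Deploy the survey that matches the interaction just completed."""
        if event_type in SURVEY_FOR_EVENT:
            send_survey(record.email, SURVEY_FOR_EVENT[event_type])

    def on_survey_completed(record, results):
        """Step 8: pass key results back into the CRM record, preserving
        a single view of the customer."""
        record.feedback_history.append(results)

    # Example wiring with a stand-in transport function.
    record = CrmRecord("C-1042", "jane@example.com")
    on_crm_event(record, "rental_return",
                 send_survey=lambda email, survey: print("Send", survey, "to", email))
    on_survey_completed(record, {"survey": "post-rental", "overall_score": 9})

Keeping the event-to-survey mapping in one place makes it easy to add new interaction types without touching the dispatch logic.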

Stop measuring satisfaction; instead, focus on performance on the things that matter to customers. Kill off the Annual "Do You Love Us?" Survey and its periodic offspring. Collect specific event data when the event happens, and combine it into an overall picture of the customer experience.

Recognize that managers are the secondary audience; the real audience is the people who do the work because that is where performance is delivered.

Following those steps will go a long way toward giving customers what they want—performance, not satisfaction.

So, does your approach to customer measurement really measure up?


ABOUT THE AUTHOR

David Jackson is a founder and the managing director of Clicktools, Ltd. (www.clicktools.com), a technology-based solutions provider of products and services for measuring and improving customer experience.