Report: 2015 Temkin Loyalty Index

We published a Temkin Group report, 2015 Temkin Loyalty Index. This report ranks the loyalty of consumers to 293 companies across 20 industries. Here’s the executive summary:

The 2015 Temkin Loyalty Index evaluates the loyalty of 10,000 U.S. consumers to 293 companies across 20 industries. The Index is based on evaluating consumers’ likelihood to do five things: repurchase from the company, recommend the company to others, forgive the company if it makes a mistake, trust the company, and try the company’s new offerings. Our research shows that USAA, H-E-B, Publix, and Trader Joe’s are at the top of the list when it comes to consumer loyalty, while Con Edison of NY, Coventry Health Care, Comcast, and Time Warner Cable are at the bottom. At an industry level, supermarkets, fast food chains, and retailers inspire the highest loyalty levels. At the other end of the spectrum, TV service providers and Internet service providers have the lowest levels of loyalty. USAA, JetBlue Airways, TriCare, credit unions, ACE Rent A Car, Apple, and Georgia Power have loyalty levels that most outperform their industry averages. Conversely, Con Edison of NY, RadioShack, Blackboard, Coventry Health Care, Citibank, Jeep, Bi-Lo, and McDonald’s fall farthest behind their peers.

Download report for $395
(includes report plus dataset in Excel)

Here are the leaders and laggards as well as the industry scores:

[Figure: 2015 Temkin Loyalty Index leaders and laggards]

[Figure: Temkin Loyalty Index ranges by industry]

Here are some other highlights of the research:

  • The Temkin Loyalty Index is an average rating across consumers’ likelihood to do five things (see the sketch after this list):
    • Repurchase from the company
    • Recommend the company to others
    • Forgive the company if it makes a mistake
    • Trust the company
    • Try the company’s new offerings
  • At an industry level, supermarkets, fast food chains, and retailers have the highest loyalty levels. At the other end of the spectrum, TV service providers and Internet service providers have the lowest.
  • USAA (for credit cards, banking, and insurance), JetBlue Airways, TriCare, credit unions, ACE Rent A Car, Apple, and Georgia Power have loyalty levels that most outperform their industry averages.
  • Con Edison of NY, RadioShack, Blackboard, Coventry Health Care, Citibank, Jeep, Bi-Lo, and McDonald’s fall farthest behind their peers.
  • The average likelihood to repurchase across all industries is the highest (67%), while the average likelihood to try new offerings is the lowest (42%).
  • H-E-B and USAA lead, and Con Edison of NY lags in repurchase.
  • Aldi and Hy-Vee lead, and Coventry Health Care lags in recommendations.
  • USAA and ACE Rent A Car lead, and Con Edison of NY lags in forgiveness.
  • ACE Rent A Car leads, and Citibank and Citizens lag in new product loyalty.
  • Credit unions and H-E-B lead, and Comcast lags in trust.
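
To make the arithmetic behind the Index concrete, here is a minimal sketch of that five-component average in Python. The component percentages and the loyalty_index helper are illustrative assumptions, not the report’s actual code or data:

    # Hypothetical sketch: a loyalty index as the simple average of five
    # likelihood-to-act percentages. All numbers here are invented.
    LOYALTY_COMPONENTS = ["repurchase", "recommend", "forgive", "trust", "try_new"]

    def loyalty_index(scores):
        """Average the five component likelihoods (each 0-100)."""
        return sum(scores[c] for c in LOYALTY_COMPONENTS) / len(LOYALTY_COMPONENTS)

    example = {"repurchase": 67, "recommend": 55, "forgive": 48,
               "trust": 60, "try_new": 42}
    print(f"Loyalty index: {loyalty_index(example):.1f}")  # -> 54.4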

Download report for $395
(includes report plus dataset in Excel)

If you want to know what data is included in this report and dataset, download this sample Excel dataset file.

Report: State of Voice of the Customer Programs, 2015

We published a Temkin Group report, State of Voice of the Customer Programs, 2015. This is the fifth year that we’ve benchmarked the competency & maturity of voice of the customer programs within large organizations. Here’s the executive summary:

For the fifth year, Temkin Group has benchmarked the voice of the customer (VoC) programs within large organizations. We found that while most organizations consider their VoC efforts to be successful, less than one-third of organizations actually believe they are good at making changes to the business based on these insights. Respondents think that the most important source of insights in the future will be customer interaction history, and they think that going forward, multiple-choice questions will be the least important. Respondents believe that technology will play an increasingly important role in their efforts, but the largest obstacle to VoC success remains integration across systems. In addition to asking questions about their VoC program, we also had respondents complete Temkin Group’s VoC Competency and Maturity Assessment, which examines capabilities across what we call the “Six Ds”: Detect, Disseminate, Diagnose, Discuss, Design, and Deploy. Although only 16% of companies have reached the two highest levels of VoC maturity, this is still an improvement from the 11% last year. When we compared high-scoring VoC programs with lower-scoring programs, we found that companies with mature programs are more successful, focus more on analytics, have more full-time staff, have more strongly coordinated efforts, and have more involved senior executives.

See the State of VoC reports from 2010, 2011, 2013, and 2014.

Download report for $195

Here are the results from Temkin Group’s VoC Competency & Maturity Assessment:

[Figure: Results from Temkin Group’s VoC Competency & Maturity Assessment]

Download report for $195

Report: ROI of Customer Experience, 2015

We published a Temkin Group report, ROI of Customer Experience, 2015. This research shows that CX is highly correlated with loyalty across 20 industries. Here’s the executive summary:

To understand the connection between customer experience (CX) and loyalty, we examined feedback from 10,000 U.S. consumers that describes both their experiences with and their loyalty to 293 companies across 20 industries. Our analysis shows a strong correlation between customer experience and loyalty factors such as repurchasing, trying new offerings, forgiving mistakes, and recommending the company to friends and colleagues. While all three components of customer experience—success, effort, and emotion—have a strong effect on loyalty, our research shows that emotion is the most important element. When we compared the consumers who gave companies a very good customer experience rating to those who gave companies a very bad customer experience rating, we found that at companies with high customer experience ratings, the percentage of customers who plan on purchasing more is 18 points higher, the percentage who will forgive the company if it makes a mistake is 12 points higher, the percentage who will try a new offering is 10 points higher, and the percentage who trust the company is 19 points higher. Additionally, companies with very good CX ratings have an average Net Promoter® Score that is 24 points higher than the scores of companies with poor CX. We built a model to evaluate how, over a three-year period, customer experience impacts the revenue of a $1 billion business within each of the 20 industries. This model shows that CX has the largest impact on the revenue of hotels ($823 million) and rental cars ($755 million) over three years. This report also includes a five-step approach for building a model that estimates the value of CX for your organization.
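
The report’s five-step modeling approach isn’t reproduced here, but a back-of-the-envelope sketch shows the general shape of such a model: translate loyalty-point differences into incremental revenue and compound them over three years. Every parameter below is an assumption for illustration, not a figure from the report:

    # Hypothetical back-of-the-envelope CX revenue-impact model. This is
    # NOT Temkin Group's actual model; the lift parameters are invented.
    def three_year_cx_impact(base_revenue, repurchase_lift, referral_lift, years=3):
        """Estimate incremental revenue from improved CX, compounding yearly."""
        impact, revenue = 0.0, base_revenue
        for _ in range(years):
            gain = revenue * (repurchase_lift + referral_lift)
            impact += gain
            revenue += gain  # retained gains carry into next year's base
        return impact

    # Example: $1B base revenue, assumed 18% repurchase lift and 6% referral lift
    print(f"${three_year_cx_impact(1e9, 0.18, 0.06):,.0f} over three years")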

Download report for $295

This is the first figure in the report, and it shows the high correlation between Temkin Experience Ratings (customer experience) and purchase intentions for 293 companies across 20 industries:
[Figure: Temkin Experience Ratings vs. purchase intentions]
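
If you want to reproduce this kind of analysis on your own data, the figure boils down to a correlation across companies. Here is a minimal sketch with invented ratings; the report uses the actual scores for all 293 companies:

    # Sketch: Pearson correlation between CX ratings and purchase intentions
    # across companies. The data points are invented for illustration.
    import statistics

    cx_ratings = [55, 60, 62, 68, 72, 75, 80]  # hypothetical experience ratings
    repurchase = [48, 55, 58, 63, 70, 74, 81]  # hypothetical % likely to repurchase

    def pearson(xs, ys):
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    print(f"r = {pearson(cx_ratings, repurchase):.2f}")  # near 1.0 for this toy data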

Here’s an excerpt from the graphic showing the three-year impact on revenues for a $1 billion company in 20 different industries:

[Figure: Three-year revenue impact of CX for a $1 billion company in 20 industries]

Download report for $295

To see the customer experience levels of all 293 companies, download the free 2015 Temkin Experience Ratings report.

P.S. Net Promoter Score, Net Promoter, and NPS are registered trademarks of Bain & Company, Satmetrix Systems, and Fred Reichheld.

Report: Net Promoter Score Benchmark Study, 2015

We published a Temkin Group report, Net Promoter Score Benchmark Study, 2015. This is the fourth year of this study, which includes Net Promoter® Scores (NPS®) for 291 companies across 20 industries, based on a study of 10,000 U.S. consumers. Here’s the executive summary:

Many companies use Net Promoter® Score (NPS) to evaluate their customer loyalty, so we measured the NPS of 291 companies across 20 industries. The three companies with the highest scores are USAA, with an NPS of 70, and Lexus and Mercedes-Benz, each with an NPS of 62. Additionally, USAA’s banking, credit card, and insurance businesses all outpaced their respective industries’ averages by more than any other company. Meanwhile, at the bottom of the list, Comcast, Time Warner Cable, and McDonald’s received the three lowest scores, and RadioShack, McDonald’s, and eMachines fell the farthest below their respective industries’ averages. On an industry level, auto dealers earned the highest average NPS, while Internet service providers and TV service providers earned the lowest. Thirteen of the 20 industries increased their average NPS from last year, with banks enjoying the biggest jump in scores. Out of all the companies, HSBC’s and AirTran Airways’ scores improved the most, whereas Fujitsu’s and Highmark’s scores declined the most. For most industries, older consumers gave companies a higher NPS, while younger consumers gave companies a lower NPS. Investment firms have the largest generation gap.

See the NPS Benchmark Studies from 2012, 2013, and 2014.

Here’s a list of companies included in this study (.pdf).

Download report for $495
(includes report plus dataset in Excel)

Here are the NPS scores across 20 industries:
[Figure: NPS ranges across 20 industries]

Here are some other highlights of the research:

  • USAA’s insurance business earned the highest NPS (70), followed by Lexus (62) and Mercedes-Benz (62). Other firms to earn an NPS of 55 or more are H-E-B, USAA’s banking and credit card businesses, Apple’s computer business, Chick-fil-A, Wegmans, JetBlue Airways, and Amazon.
  • Comcast TV service (-17) earned the lowest NPS, followed by two firms that also had scores below -10: Time Warner Cable TV service and McDonald’s. Other firms to earn an NPS of -5 or below are Commonwealth Edison, Pacific Gas and Electric, Charter Communications (TV service and Internet service), Comcast Internet service, RadioShack, Time Warner Cable Internet service, Cablevision Optimum, and Coventry Health Care.
  • USAA’s insurance, banking, and credit card businesses earned NPS levels that are 38 or more points above their industry averages. Eight other firms scored more than 25 points above their peers: Chick-fil-A, TriCare, credit unions, JetBlue Airways, H-E-B, Wegmans, Amazon, and Apple.
  • Nine companies fell 30 or more points below their industry averages, including RadioShack, McDonald’s, eMachines, Travelers, Super 8, 7-Eleven, and Spirit Airlines.
  • HSBC’s NPS increased by 29 points between 2014 and 2015, the largest increase of any company. Nine other companies improved by more than 15 points: AirTran Airways, Baskin-Robbins, Virgin America, Regions Bank, Citizens Bank, BMW, Southern California Gas, Morgan Stanley Smith Barney, and Food Lion.
  • Fujitsu, Highmark, Buick, and Humana had the largest decline in NPS between 2014 and 2015.

Download report for $495
(includes report plus dataset in Excel)

If you want to know what data is included in this report and dataset, download this sample Excel dataset file.

P.S. Net Promoter Score, Net Promoter, and NPS are registered trademarks of Bain & Company, Satmetrix Systems, and Fred Reichheld.

Report: Evaluating Mobile eGift Card Purchasing Experiences

We published a Temkin Group report, Evaluating Mobile eGift Card Purchasing Experiences. The report uses Temkin Group’s SLICE-B experience review methodology to assess the mobile sites of 10 retailers. Here’s the executive summary:

Although smartphones are a convenient interaction channel, their small screens pose serious design challenges for companies. To evaluate the customer experience of mobile websites, we used Temkin Group’s SLICE-B experience review methodology to assess the experience of buying an eGift Card from ten large retailers: Home Depot, Lowe’s, Walmart, Target, Walgreens, CVS, Starbucks, Dunkin’ Donuts, Best Buy, and RadioShack. Home Depot earned the top score for its functionality and minimalist processes, while the user could not complete the full purchasing goal at Lowe’s, Walmart, Target, Walgreens, CVS, Best Buy, or RadioShack.

Download report for $195

The report includes the scores for all 10 companies across each of the six SLICE-B categories, strengths and weaknesses of each retailer, and some best practices across all of the mobile sites. Here is a description of the user and her overall goal that we tested:

Our user was a middle-aged woman looking to send her niece a $25 electronic gift card to help her get settled into her new apartment. While she is reasonably proficient at operating a smartphone, she finds entering a lot of information to be difficult on the small keyboard. She has an iPhone 4s. She does not have an app for any of the companies being evaluated and does not know whether they sell $25 eGift Cards.

Here are the overall results:

[Figure: Overall mobile eGift card purchasing results]

Download report for $195

The bottom line: Gift cards should be easier to buy via mobile phones.

Report: Net Promoter Score Benchmark Study, 2014

We published a Temkin Group report, Net Promoter Score Benchmark Study, 2014. This is the third year of this study, which includes Net Promoter® Scores (NPS®) for 283 companies across 20 industries, based on a study of 10,000 U.S. consumers. Here’s the executive summary:

We measured the Net Promoter Score of 283 companies across 20 industries. USAA and JetBlue took the top two spots, each with an NPS of more than 60. USAA’s banking, credit card, and insurance businesses outpaced their industries’ averages by more than any other company. At the bottom of the list, HSBC and Citibank received the two lowest scores, and Super 8 and Motel 6 fell the farthest below their industry averages. On an industry level, auto dealers earned the highest average NPS, while TV service providers earned the lowest. Eleven of the 19 industries increased their average NPS from last year, with car rentals and credit cards enjoying the biggest score boosts. Out of all the companies, US Airways and Highmark BCBS improved the most, while Quality Inn and Baskin-Robbins declined the most. For most industries, the average NPS is highest with older consumers and is lowest with younger consumers. Investment firms have the largest generation gap.

Here’s a list of companies included in this study (.pdf).

Download report for $495
(includes report plus dataset in Excel)

Here are the NPS scores across 20 industries:

[Figure: NPS ranges across 20 industries]

Download report for $495
(includes report plus dataset in Excel)

If you want to know what data is included in this report and dataset, download this sample Excel dataset file.

P.S. Net Promoter Score, Net Promoter, and NPS are registered trademarks of Bain & Company, Satmetrix Systems, and Fred Reichheld.

Customer Effort, Net Promoter, And Thoughts About CX Metrics

There’s been a recent uptick in people asking me about Customer Effort Score (CES), so I thought I’d share my thoughts in this post.

As I’ve written in the past, no metric is the ultimate question (not even Net Promoter Score). So CES isn’t a panacea. Even the Temkin Experience Ratings aren’t the answer to your customer experience (CX) prayers.

The choice of a metric isn’t the cornerstone of great CX. Instead, how companies use this type of information is what separates CX leaders from their underperforming peers. In our report, the State of CX Metrics, we identify four characteristics that make CX metrics efforts successful: Consistent, Impactful, Integrated, and Continuous. When we used these elements to evaluate 200 large companies, only 12% had strong CX metrics programs.

Should we use CES, and how does it relate to NPS? I hear this type of question all the time. Let me start my answer by examining the four types of things that CX metrics measure: interactions, perceptions, attitudes, and behaviors.

[Figure: Four types of CX metrics]

CES is a perception measure, while NPS is an attitudinal measure. In general, perception measurements are better for evaluating individual interactions. So CES might be better suited for a transactional survey, while NPS may be better suited for a relationship survey. You can read more of what I’ve written about NPS on our NPS resource page.

Now, on to CES. I like the concept, but not the execution. As part of our Temkin Experience Ratings, we examine all three aspects of experience—functional, accessible, and emotional. The accessible element examines how easy a company is to work with. I highly encourage companies to dedicate significant resources to becoming easier to work with and removing obstacles that make customers struggle.

But CES uses an oddly worded question: How much effort did you personally have to put forth to handle your request? (Note: In newer versions of the methodology, they have improved the language and scaling of the question). This version of the question goes against a couple of my criteria for good survey design:

  • It doesn’t sound human. Can you imagine a real person asking that question? One key to good survey design is that questions should sound natural.
  • It can be interpreted in multiple ways. If a customer tries to do something online, but can’t, did they put forth a lot of effort? How much effort does it take to move a mouse and push some keys?!? Another key to good survey design is to have questions that can only be interpreted in one way.

If you like the notion of CES (measuring how easy or hard something is to do), then I suggest that you ask a more straightforward question. How about: How easy did you find it to <FILL IN THING>? And let customers pick a response on a scale between “very easy” and “very difficult.”
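
Scoring a question like that is simple. Here is a sketch that tallies responses on a five-point scale from “very difficult” (1) to “very easy” (5); reporting the top-two-box share is my own choice for the example, not a prescribed method:

    # Sketch: scoring the suggested ease question on a 1-5 scale, where
    # 1 = "very difficult" and 5 = "very easy". Responses are invented.
    from collections import Counter

    responses = [5, 4, 4, 2, 5, 3, 4, 1, 5, 4]

    def top_two_box(scores):
        """Share of respondents answering "easy" (4) or "very easy" (5)."""
        counts = Counter(scores)
        return (counts[4] + counts[5]) / len(scores)

    print(f"{top_two_box(responses):.0%} found it easy")  # -> 70%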

My last thought is not about CES, but more about where the world of metrics is heading. In the future, organizations will collect data from interactions and correlate them with future behaviors (like loyalty), using predictive analytics to bypass all of these intermediary metrics. Don’t throw away all of your metrics today, but consider this direction in your long-term plans.

The bottom line: There is no such thing as a perfect metric.

The Satisfaction Quarterly Report, Q1 2008

I recently mentioned the American Customer Satisfaction Index (ACSI) to someone and was surprised that she had not heard of it. It’s a great research effort, led by Claes Fornell at the University of Michigan, that tracks customer satisfaction on a quarterly basis. Here’s a chart of the national average since the index was created in 1994:

[Figure: ACSI national satisfaction scores since 1994]

As you can tell, satisfaction scores have been generally on the rise over the last few years.

The ACSI provides both company-specific and industry-specific data for a different set of industries every quarter. The Q1 2008 ACSI looked at the following industries: hotels, restaurants, hospitals, cable & satellite TV, cellular telephones, computer software, fixed line telephone service, motion pictures, network/cable TV news, newspapers, wireless telephone service, airlines, express delivery, U.S. Postal Service, and energy utilities.

Here are some of the highlights from that Q1 2008 data:

  • Best & Worst Organizations:
    • Top rated: FedEx Corporation (express delivery), UPS (express delivery), Olive Garden (restaurants), and Southern Company (energy utilities)
    • Largest improvement (since last year): Ameren Corporation (energy utilities), Reliant Energy (energy utilities), Energy Future Holdings (energy utilities), and McDonald’s (limited service restaurants).
    • Lowest rated: US Airways (airlines), Charter Communications (cable & satellite TV), Comcast Corporation (cable & satellite TV), and Sprint Nextel (wireless telephone services).
    • Largest decline (since last year): US Airways (airlines), Continental Airlines (airlines), Sprint Nextel (wireless telephone services), and Northwest Airlines (airlines).
  • Best & Worst Industries:
    • Top rated: Express Delivery and Ambulatory Care.
    • Largest improvement (since last year): Hotels and Fixed Line Telephone Services.
    • Lowest rated: Airlines, Cable & Satellite TV, and Newspapers.
    • Largest decline (since last year): Newspapers and Broadcast TV News.

The bottom line: This should be a wake-up call to many firms (are you listening airlines and cable & satellite companies?).

Lessons Learned From Chief Customer Officers

I just published a report called “The Chief Customer/Experience Officer Playbook.” To research the report, I interviewed executives with responsibility for customer experience that cuts across normal product and/or channel boundaries (we call them Chief Customer/Experience Officers, or CC/EOs) from several different organizations, including Air Transat, Alaska Air Group, Bank of America, Bombardier, the California State Automobile Association, Century Furniture, the Colorado Rockies, and Symantec. In addition, I spoke with Jeanne Bliss, author of the book Chief Customer Officer: Getting Past Lip Service to Passionate Action.

The research identified five categories of things that CC/EOs should do:

  1. Make sure that you’ve got the right environment.
  2. Prepare to take on a broad change agenda.
  3. Establish a strong operating structure.
  4. Kick off high-priority activities.
  5. Look ahead to the future.

The report goes into much more detail for each of these items. While I can’t share the whole report in my blog (that’s reserved for Forrester clients), I did want to share some of the most interesting quotes from the CC/EOs:

  • “It takes massive support from senior management. This role can destruct careers.”
  • “What’s more important, but less tactical and takes longer, is the realization that customer experience is culture. It’s the mindset of our associates and their empowerment. Not stuff, but attitudinal. We’ve recognized that this is a journey.”
  • “Each of the groups in our company already had some customer experience efforts, so I wanted to make sure that they were on board and not threatened. I needed to talk to each of those groups individually. It’s an ongoing issue – and it’s an ongoing effort for me.”
  • “We focus on employees first. Happy employees make a happy customer. They were very skeptical – so much of our communication is internally driven. We need to support the hell out of them.”
  • “I do a read out to the leadership team every month and tell them my perspectives on how we’re doing (fact-based); a no-holds-barred discussion. No attempt to keep any of that stuff under the rug.”
  • “Customers want one relationship with us and we’ve given them about 10. Our data sources and systems are isolated; the organizations are isolated. We’re trying to break down the silos.”
  • “We’re changing metrics in the call center to eliminate focus on average talk time.”
  • “If I did it over again, I would have focused earlier on consolidating our customer listening posts and voice of the customer efforts. We now look at the perception of reliability, not the actual reliability.”
  • “We’re looking for line of sight between our initiatives and NPS, which is a lagging indicator. We’ve worked on projects that have taken three quarters to improve the NPS.”

The bottom line: CC/EOs shouldn’t “own” customer experience, but they can really help support the organizational transformation required to improve it.

Let’s Learn From Delta’s [Continuing] Customer Experience Miscues

Let me start this post with a clear disclaimer — I never consider my personal experiences when evaluating customer experience in my research. Every large company periodically delivers subpar experiences, so anecdotes aren’t necessarily indicative of a company’s overall customer experience efforts. 

Having said that, I feel the need to share my experience with Delta Airlines over the last two days, because there’s something to learn (or maybe unlearn) — and, to be completely honest, I feel the need to vent.

The summary: It took me 13 hours to get from the airport in Richmond, Virginia to Boston’s Logan Airport. Along the way, Delta found many ways to make the experience miserable.

The painful details:

  • Yesterday, a colleague of mine and I boarded our plane to New York (JFK) in Richmond, Virginia and the plane pulled away from the gate at 6:15 PM — right on time.
  • Minutes after pulling away from the gate, the pilot said that we were on a ground hold and would need to wait there for a while. No real details. 75 minutes later we were brought back to the terminal and allowed to get off the plane.
  • By the time we got off the plane, there were no more options for us at the airport — either on Delta or on any other airline. (Note: JetBlue flight #1076 left ON TIME for JFK while we were sitting on the tarmac.)
  • The agent at the counter in Richmond was completely unhelpful. All she said was that we were now booked on a flight out of JFK for the following day. She was completely unwilling to explore any alternatives — even those that I suggested. She didn’t seem to care — even a tiny bit — that our four-hour trip was now going to span a couple of days.
  • Well, we finally got to the JFK terminal shortly after 10:00 PM. It turns out that our connecting flight left at about 9:49 PM (Delta didn’t think we were “important enough” to wait for us to make the connection). Now, on to the Delta customer service agent at JFK.
  • The agent told us we had no options to get home that night (although I have since found out that there was a JetBlue flight #1028 that left later that evening). We were booked on a 10:15 AM flight. Luckily, I knew about the Delta Shuttle — and was able to push him to book us on the 6:30 AM flight.
  • I asked him which hotel Delta was going to put us at. He then informed us that Delta was not going to provide a hotel because it was not responsible for the problem. He used some technical terms that (in his mind) absolved Delta from all responsibility for our situation. Then I mentioned that it was, of course, Delta’s fault — the JetBlue flight that left after we pushed from the gate seemed to get to JFK without a problem. His response was precious — “How do you know that?!” (As if I must be either mistaken or lying — neither of which was true.) It reminded me of a Seinfeld episode. He was obviously well trained in the techniques of avoiding responsibility.
  • Well, the agent did give us a phone number of a service that helps Delta’s stranded customers find hotels in the area. So we called the number. The guy on the phone gave us the phone number for one hotel. We called the hotel and they had no vacancies. Thanks for the help, Delta!
  • Well, we found a hotel in the area (on our own) and actually made it to LaGuardia the next day a bit early. So we tried to get on an earlier flight (6:00 instead of 6:30). You’ll never guess what the agent told us — “that will cost an additional $150.” That’s right, she wanted to charge us more money to get us home a day later! When we told the agent about the terrible experience that we had been through, she did a little research on her system and then said — it looks like your plane from Richmond left on time. The implication: Delta doesn’t need to go out of its way for us because it pushed the plane away from the gate at the scheduled time.
  • We finally “convinced” the agents at the desk to let us on the earlier flight (which was completely empty) without any additional charges.
  • Then, finally, we landed at Logan Airport at 7:00 AM today. 12 hours, 45 minutes later.

The analysis: Delta’s records probably show that we were on two flights that left on time — and therefore had a successful trip. Obviously, though, our experience doesn’t match that assessment. Hopefully Delta (and other firms) can learn to avoid the following customer experience miscues that we ran into:

  • Poor communications. I understand that delays happen. But the situation gets much worse when customers are left in the dark. We did not get a lot of accurate information about the status of our flight as we were waiting — raising our anxiety level and making it difficult for us to formulate potential solutions to the problem.
  • No accountability. Along the way, every Delta employee seemed to be trained in mechanisms for denying responsibility. The tone of our interactions may have been different if Delta trained its employees to recognize that stranding customers at an airport is ALWAYS its problem.
  • No empathy. Throughout the entire ordeal, we did not run into a single Delta employee who said “I’m sorry” or even acknowledged our inconvenience. Maybe Delta can just teach agents to start interactions with stranded customers like this: “I know this is really inconvenient, let me see what we can do…”
  • No advocacy. All of the agents that we met were just trying to get rid of us. Not one of them asked what we wanted to do — and they certainly didn’t go out of their way to explore alternatives. A good lesson to learn: the most important time for helping customers is when they are in need. These moments of truth can build or break loyalty. In this case, Delta clearly achieved the latter.

The bottom line: You need to look at interactions from the standpoint of your customers (note to Delta and other airlines: “on-time departure” is not a good customer experience metric). It can provide a dramatically different view!

Epilogue: I sent Delta’s customer service group a link to this blog in their complaint form. But rather than reading it, they sent me an email that said:

“…We appreciate the e-mail you sent. However, please send us your experience in a text form or letter.” 

Looks like Delta doesn’t really care what happened to me — but it is finding every possible way to avoid taking responsibility.

Epilogue #2: I finally got a response from a representative who seemed to have glanced at the feedback that I had to cut and paste into an email. So the airline decided that I qualified for a $75 credit, which it promised to send via another email. But two weeks later — there’s still no credit. The ineptitude of Delta’s customer experience efforts is truly comical. Where’s Ashton Kutcher? I must be getting Punk’d.

Don’t Neglect Your “Welcome Experience”

My wife and I just got back from golf camp at Stratton Mountain (it was actually called Stratton Golf University, but we liked to think of it more as “camp”). As we drove up on the first day, we were greeted by one of the instructors, who was standing in front of the parking lot. He showed us where to park, took our clubs, and showed us where to go next to sign in. Wow — what a welcoming experience!

Let’s dissect what went right:

  • We had no anxiety about what we needed to do.
  • We received an immediate “personal” connection.
  • We felt like the “University” was ready for us.
  • We had a great feeling about the week. 

Notice how I discussed what went right in terms of how my wife and I felt about the experience. When I work with companies, I don’t evaluate interactions based on my personal feelings, but in this case I was actually the target audience.

Some lessons learned about a good Welcome Experience:

  1. Assume customers don’t know as much as you think. We typically spend 40 hours or more per week at work — and many more hours thinking about work and our company when we’re not even there. So we know a whole bunch about our products, services, and processes. But, alas, customers don’t spend nearly that much time thinking (or caring) about our business. So firms have a tendency to assume that customers know more than they actually do — like where to park and what to do with your golf clubs.
  2. Make sure that customers know exactly how to start. If customers don’t know where to go first, then there’s a higher likelihood that they’ll get lost. But as obvious as that sounds, we still find that many experiences fail to get customers going in the right direction. What do these flaws look like? Website homepages that don’t provide clear evidence that the user can accomplish her goal; IVR menus that don’t offer a match to what a customer wants to do on the call; and large airports that don’t provide clear signage to the check-in locations for all of their airlines.
  3. Set the tone right away. If you want your customers to think that you are helpful — establish that context right away. Good or bad — the Welcome Experience shapes how customers view every interaction after that moment. As they say: you only get one chance to make a good first impression. 
  4. Provide feedback along the way. Don’t think of the Welcome Experience as a facade — it’s just the beginning of a continuous experience. Make sure that you provide customers with clear signals and insights as to what they should be doing next. The golf instructors didn’t just point to a building and say go there and register; they took us to the door and pointed to the registration table. We’ve all seen when this goes wrong. Think about a detour you were forced to take when you were driving — only to find a sparse set of detour signs along the way. Even if you were heading in the right direction, you still wanted to see a sign saying that you were on the correct detour route.

How can you tell if you have a good Welcome Experience? I can think of two great ways:

  • Ask your customers. Why not ask customers in your post-interaction surveys about specific elements of the Welcome Experience? Or even interrupt a few people early in the process and ask them what they like/dislike about the experience.
  • View the experience through your customers’ eyes. As you’ll find out in many of my posts, I often recommend that companies internalize the concept of Scenario Design. Think of your target customer and ask the questions: Who is that person? What are her goals? How are you helping her accomplish those goals?

At this point in my post, you’re probably waiting for me to get to the bottom line. So here it is: We had a great time at golf camp — and my wife and I are both hopeful that we cut at least 5 strokes off of our golf scores (which were pretty high to begin with).

Net Promoter And Satisfaction Battle For King Of The Ring

Let’s start with a confession: I’m a big professional wrestling fan, so I really enjoy a good battle. One thing that I’ve learned from the WWE is that it’s the storyline that makes a battle come to life. And the Net Promoter vs. Satisfaction debate has all of the story trappings of a great tag team match!

On one side of the ring, in the blue trunks, is the tag team of Fred Reichheld, “father” of the Net Promoter Score (NPS) concept, and Satmetrix Systems, implementer of NPS-based survey systems. On the other side of the ring, in the red trunks, we find Claes Fornell, “father” of the American Customer Satisfaction Index (ACSI), and ForeSee Results, implementer of ACSI-based survey systems.

Both of these teams are fighting for their approach to be recognized as “THE” measure for tracking customer relationships. To put this into perspective, this type of measure represents only one of the five levels of a voice of the customer program (see my earlier post on voice of the customer programs).

Let’s start by handing out some awards to the teams:

  • Best marketed: Net Promoter (Reichheld is very good at touting his concept — and at writing compelling books about it)
  • Most mature: Satisfaction (The ACSI has been tracking data since about 1994 and satisfaction has been around as long as I can remember)
  • Most quantitative: Satisfaction
  • Sexiest: Net Promoter (it’s caused a lot of hoopla)

Net Promoter has gained a lot of momentum over the last few years as many large companies have adopted it. The methodology is pretty straightforward: ask people if they’d recommend your firm. Based on their response, they get categorized as a Promoter, a Detractor, or neither. You take the percentage of Promoters and subtract the percentage of Detractors, and that leaves you with a Net Promoter percentage.
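
In the standard methodology, the recommend question is asked on a 0-to-10 scale: 9s and 10s count as Promoters, 7s and 8s as Passives, and 0 through 6 as Detractors. Here is that arithmetic as a small sketch (the survey responses are invented):

    # Standard Net Promoter arithmetic: NPS = %Promoters - %Detractors,
    # based on a 0-10 likelihood-to-recommend scale.
    def net_promoter_score(ratings):
        promoters = sum(1 for r in ratings if r >= 9)
        detractors = sum(1 for r in ratings if r <= 6)
        return 100.0 * (promoters - detractors) / len(ratings)

    # Hypothetical survey: 5 Promoters, 3 Passives, 2 Detractors -> NPS of 30
    ratings = [10, 9, 9, 10, 9, 8, 7, 7, 5, 3]
    print(net_promoter_score(ratings))  # 30.0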

This debate was stoked by a recent study in the Journal of Marketing, which found that…

Using industries Reichheld cites as exemplars of Net Promoter, the research fails to replicate his assertions regarding the “clear superiority” of Net Promoter compared with other measures in those industries.

Well, if you’re wondering what I really think about this Battle Royale, then here it is. Just like wrestling — the storyline is much more exciting than the reality of the battle. Here’s my take on the contest:

  • Net Promoter is not the “ultimate” measure for a customer relationship.
  • Then again, neither is satisfaction.
  • But companies are better off when they have more satisfied than dissatisfied customers and more Promoters than Detractors.

My recommendations:

  • Don’t expect any single measure to be utopia. Both measures are good, but neither one has enough information to fully gauge customer relationships and to provide enough diagnostic information to make all of the necessary improvements.
  • Focus on one measure to build alignment. Picking a single measure to focus on (whether or not it’s perfect) can be very valuable in aligning the organization. If you can get your entire company focused on either raising satisfaction or increasing the number of Promoters, then you will likely see some significant improvements in the really important metrics: retention, sales, etc. So, if in doubt, pick one and move on.
  • Evolve your metrics over time. The previous two bullets may seem to contradict each other, but they don’t when you look at it over time. The value of locking into a single measure like Net Promoter comes as much from aligning the organization as from the perfection of the metric itself. But after the organization gets aligned, firms will need to build out their portfolio of metrics — and find out for themselves which measures are both predictive and diagnostic.
  • Look at Customer Advocacy. The ring was too crowded to add another contestant to the match earlier in this post, but for some industries we’ve found another measure that is a powerful indicator of loyal customer behavior. So, in the purple trunks is Customer Advocacy, the perception that the firm does what’s best for customers, not just what’s best for its own bottom line. We strongly recommend that financial services and healthcare firms take a very close look at this measure.

The bottom line: Don’t get too caught up in determining the winner of this battle. Just make sure that you do something and are prepared to learn and evolve over time.

If you’re a client of Forrester, then I also recommend that you read these two research documents: Building Your Voice Of The Customer Program and Voice Of The Customer: Five Levels Of Insight.

Are you listening to the voice of the customer?

Voice Of The Customer (VoC) is a term that many people use, but few people can define. That’s the type of environment in which I love to do research. So I ended up writing two research documents on the topic: Building Your Voice Of The Customer Program and Voice Of The Customer: Five Levels Of Insight (as always, only Forrester clients can read the full reports). To start with, I developed the following definition for a VoC program:

A systematic approach for incorporating the needs of customers into the design of customer experiences

This definition contains three key elements:

  • A systematic approach. Most companies take an informal approach to gathering customer feedback. A VoC program should augment — not replace — those ad hoc approaches with a more structured way to gather and use customer insights.
  • Customer needs. Companies often have access to a great deal of customer data — but customer insights don’t automatically surface from data. A good VoC program uncovers the current and emerging needs of key customers — and helps identify areas where those needs are not being met.
  • Experience design. Gathering customer insights is only an interim step to improving customer experience. Why? Because VoC programs deliver the most value when companies actually make changes to better serve the customer needs uncovered by the research.

My research also identified five distinct levels of activities in a VoC program:

  1. Relationship tracking. Organizations need to track the health of customer relationships over time. That’s why companies often ask customers to fill out surveys — typically quarterly or annually — about their perception of the firm. Using this feedback, companies can create metrics that are simple to understand and easy to trend. Why is this important? Because an easy-to-grasp report card helps align everyone in the organization around a common purpose. (Note: I won’t get into the debate between “satisfaction” and “Net Promoter” metrics in this post, but I’ll definitely be touching on that in the future.)
  2. Interaction monitoring. Every customer interaction — from an online transaction to a call into the call center — is important. Firms need a way to monitor how effectively they handle these customer touches. That’s why many companies do post-interaction surveys — asking customers for feedback on recent interactions.
  3. Continuous listening. Structured feedback through customer surveys provides enormous opportunities for analysis. But one of the strengths of these approaches — providing data — is also a limitation. To avoid this data-only view of customer relationships, companies put in place processes for executives to regularly listen to customers. There are many opportunities to hear what customers are saying, such as listening to calls in the call center, reading blogs, reading inbound emails, and visiting retail outlets.
  4. Project infusion. The following statement is probably not too controversial: Projects that affect customers should incorporate insights about customers. Despite the clear need for this type of effort, many companies lack a formalized approach for infusing customer insights into projects. To make sure that this doesn’t happen, some firms are incorporating customer insight steps into the front end of their Six Sigma processes.
  5. Periodic immersion. Every so often, it’s valuable for all employees — especially executives — to spend a significant amount of time interacting directly with customers or working alongside frontline employees. These experiences, which should be at least a half day, provide an excellent opportunity for the company to question the status quo.

Here’s a graphic that shows more details on the five levels:

[Figure: Five Levels Of A Voice Of The Customer Program]

Hopefully this helps to create some common language around the Voice Of The Customer.