Domino’s Misses Customer Experience Mark
May 30, 2016
After we released the Temkin Experience Ratings for fast food restaurants, a Business Insider report asked Domino’s to comment on the pizza giant’s ranking (15th out of 19 fast food restaurants). Here’s how a Domino’s representative responded:
We’re the only restaurant chain of our size in the U.S. that has experienced seven consecutive years of positive same store sales growth and we’ve been rated the best-tasting pizza by independent research of all the national brands. We believe those mean something.
That response wasn’t directly related to customer experience, but we at Temkin Group were intrigued by the Domino’s claim to have the best tasting pizza. Since it was lunch time, and we were hungry, we put the claim to the test.
We enjoyed customizing and ordering our pizza online, which Domino’s estimated would be in our hands in approximately 33 to 43 minutes. After we placed the order, it was fun watching Pete the Pizzamaker keep track of the flow of our two pies. The experience started out great as we watched our pizzas progress from “order placed” to “prep” to “bake” to “quality check.”
Then our lunch took a turn for the worse, as our pizzas remained in the “quality check” stage for more than 20 minutes, longer than it took to cook them. What was the matter? What had happened to our customized pizza? Was there a quality problem? Did our pizza fail some important tastiness check? Was a corporate task force being called in to fix our lunch?
My best guess is that the pizzas were just sitting there until they were picked up by the delivery driver, at which point Pete the Pizzamaker could change the status to the final stage, “order has been delivered.”
Our pizzas finally arrived after 71 minutes, well beyond the promised 33 to 43. And they were cold.
I shared that story as context for the survey that Domino’s then asked me to fill out, which demonstrates several survey design practices to avoid. Here’s the survey…
I want to discuss two of the questions on the survey:
- Was your delivery driver Julio both punctual and polite?
- Our pizzas were half an hour late, so no way was Julio punctual. Was he polite? Sure. This survey question makes a common mistake: it asks for a rating of two things on one scale. In this situation, each of the two items deserves a different rating. Given the wording of the question, I answered with one star (we were very hungry when our lunch finally arrived). How will Domino’s know what to do with this data? They’re likely to misinterpret it and blame Julio (which is why I added a comment).
- Did Pelen make your order accurately and bake it to perfection?
- First of all, I was disappointed to find out that Pete the Pizzamaker did not actually make our pizzas (who the heck is Pelen?). Once again, Domino’s is asking a compound question. The ingredients were correct (so yes on the first part), but the pizza was cold (so no on the second part). Maybe I shouldn’t blame Pelen for the pizzas being cold, but the customer isn’t responsible for identifying who is to blame. The question asked whether the pizza was baked to perfection, and it wasn’t.
I don’t know what caused the service breakdown with our pizzas, but I also don’t think my numeric answers to those specific questions will help Domino’s understand what happened. The data could be used to blame Julio for not being polite or to show that Pelen made the wrong pizza, both of which are inaccurate.
As you can see, companies need to think carefully about the specific way they ask for feedback. Even poorly designed questions produce data, but the results can easily be misinterpreted.
So, what about Domino’s claim to have the “best-tasting pizza”? There may be studies supporting that assertion, but it did not match the findings from our little “focus group.” Having said that, we did enjoy our lunch of reheated pizza and a hearty discussion about Domino’s customer experience.