
Posts tagged with ‘customer satisfaction ratings’

Message to CX profession – Transparency begets trust

I get requests to complete surveys quite often. They come from my bank after in-branch transactions, from websites I have visited, and from the customer service departments of my credit card and cable providers. They all want to know how I would score whatever is important to them, and they leave a little space for my comments. Some of these surveys are just 2 or 3 questions long, but others expect me to answer pages of seemingly repetitive and circular questions.

I have never seen a survey request that explains coherently why my opinion is so important to them. In other words, they never indicate what is going to happen after I have completed the survey, carefully answered all the questions, and provided very detailed comments. Presumably, if the tabulated scores are high enough, whoever created or sponsored these surveys will high-five each other and cash their bonuses. But what about my needs? Would my contribution help anybody make a better selection? How would I know if my responses contributed to a better product or service? Sometimes a company proudly advertises its customer satisfaction success, but I wonder if such claims can be taken seriously when there is no way for a consumer to validate them. For these reasons I stopped answering survey requests a long time ago.

Amazon is considered by many to be the poster child of customer centricity. I have done business with Amazon for over 10 years and have made hundreds of purchases over that time. I cannot recall a single survey request from them, ever. Could it be that customer-centric Amazon does not care about the customer experience it provides? I think they don’t survey their customers because they understand the power of authenticity, which is growing fast with the advance of the social consumer. Amazon understood that consumers will never trust a brand more than they trust each other. A long time ago, instead of collecting self-serving survey ratings, they decided to enable their customers to share their experiences with each other in an open forum. Yes, over the years there were incidents of attempted manipulation. Yes, the Likert stars are not particularly informative. However, overall the customer reviews are extremely valuable to consumers, who have learned how to use them to reduce the uncertainty of their purchasing decisions.

“Amazon does not make money selling goods. Amazon makes money helping customers make good purchasing decisions.”

According to Keller Fay Group research, two primary reasons customers write reviews and publish them online are:

  1. (90%) Help other consumers make the right choice for them – a kind of “pay it forward”
  2. (70%) Help brands improve their performance; consumers rely on the transparency of their input to motivate brands to act

I can only guess that since Amazon does not survey their customers, they probably use the content of the reviews posted on their properties to measure customer satisfaction with doing business with them. There are plenty of very informative references in many product reviews that indicate how customers regard their experience with Amazon. The explosive and continuous growth of this company is also a pretty good indicator of consumers’ affinity.

So why do so many companies still shy away from exploring the content their customers provide without solicitation? The answers I’ve been given by Voice of Customer practitioners over the years have a common thread:

  • Lack of control over the process
  • Doubts about the authenticity of reviews
  • Fear of negative sentiments

In other words, it seems these companies do not trust the consumers who provide their feedback transparently. Yet these very companies expect consumers to trust them with feedback without any transparency at all. How reasonable is such an expectation?

Fake ROI and Customer Experience

Many of us are familiar with the request to justify any project from a return-on-investment perspective. Corporate management’s fiduciary obligation is to control the use of financial resources in the best interest of the company’s stakeholders. I have no quarrel with this notion. I do have a quarrel with how it is frequently practiced.

There are two major points of contention:

  • The practical definition of who a company’s stakeholders are – actions speak louder than words. Choices of internal funding that favor short-term returns at the expense of the long-term sustainability of the business indicate that the leadership does not consider employees and customers to be stakeholders.
  • A departmental (silo) approach to ROI analysis – reducing a single department’s operational cost without evaluating the effect it may have on the performance of other departments.

There are often inescapable Customer Experience consequences associated with efficiency initiatives that show fake ROI:

  • Replacing an inside sales force with automated phone bots may look like an excellent ROI initiative. However, every recorded call received by a potential or existing customer chips away at the brand’s value. What additional investment in marketing would it take to at least balance the negative effect?
  • Reducing the training budget for customer service representatives can jumpstart quarterly earnings and may inch up the share price for a week or two to please hedge fund managers. How will it affect the customer churn rate, and what is the expense of replacing the lost customers?
  • Implementing community-driven customer support seems like a sure winner until its effect on the conversion rate from free to paid subscriptions is examined. From that perspective, the higher cost of company-staffed customer support looks like a much better investment.
  • Streamlining the cost of the customer intelligence acquisition process is a no-brainer: just lay off the market researchers, delegate a DIY survey process to product managers, and offshore the tabulation and interpretation of results. How would it affect your product success rate? What would a mere 5% drop do to the company’s bottom line?

The point I am trying to make is that the silo approach to optimization of business processes often amounts to “penny wise, pound foolish” tactics. Excessive focus on efficiency, i.e. cost reduction, may cause a disproportionate increase in the expense of attracting and keeping customers, and that destroys an enterprise’s effectiveness.

“The purpose of business is to create and keep a customer.” – Peter F. Drucker

Do not confuse Customer Experience with Customer Service

Too many people use the terms customer experience and customer service/support interchangeably. Even well-respected authors and customer centricity consultants, like Don Peppers, occasionally slip into this ambiguous trap. Here are some basic definitions found on the web with a simple query:


“Customer experience (CX) is the sum of all experiences a customer has with a supplier of goods and/or services, over the duration of their relationship with that supplier. This can include awareness, discovery, attraction, interaction, purchase, use, cultivation and advocacy.”


“Customer Service is the assistance and advice provided by a company to those people who buy or use its products or services.”


Customer service is just one of the attributes that comprise customer experience, but it is most definitely not the same thing. For some businesses it could be the most important ingredient, and for others it could be a completely inconsequential one.


Here are some examples to make the distinction a little clearer:


• You can have great customer experience without the participation of the customer service department at all, but sometimes even the best customer support efforts cannot salvage overall customer experience:

o The most attentive waiter can’t improve a poorly cooked dish, but a scrumptious meal can be a remarkable experience even in a self-service establishment.

o Expertly installed TV cable service does not guarantee quality entertainment.

o Customer Success Managers can only help retain customers for a short period of time if the software does not perform as expected.


• A product plays the leading role in delivering customer experience, not the efforts of customer-facing employees. If a product sucks, no heroics of the front-line personnel can deliver an excellent customer experience. From this perspective it is difficult to understand how product managers, and even more so product marketing managers, manage to avoid the customer experience responsibility spotlight. These are the people who translate customer needs and wants into a product design. It is a best practice to have them handle customer support lines on a regular basis to learn firsthand how accurate their interpretations were.


• Marketing is the group that creates customer expectations, and when these expectations do not meet the reality of a product, customer experience suffers. Classical marketing is supposed to “learn” what customers need and translate this learning into product designs and advertising messages that attract the “right” customers to the “right” product. Instead, marketing is too often focused on “pimping” products designed by engineers overseas without any connection to actual consumers. Focus groups and surveys are designed to figure out how to sell what they have got, rather than to make what customers want. No wonder the distinction between “market research” and “marketing research” is so blurry. Customer service can be very helpful in facilitating the return of an unwanted product and can deliver a great product return experience, but it cannot deliver a great customer experience.


Confusing customer service/support with customer experience puts an unfair and unbearable load on the shoulders of an organization that is already the second most stressed group in the company, after sales. Even though its performance has a relatively limited ability to influence the delivery of customer experience, it is measured, dissected and optimized completely out of proportion. When you see that happen, it is the first sign that the company is focused on financial engineering – not on its customers.


Why doesn’t an abundance of analytics translate into actions?

Most companies have heard about the value of data-driven decisions, and every company has accumulated more data than it knows what to do with. Some enterprises invested in technology tools sold to capture transactions and mine oceans of data in hopes of increasing their revenues, reducing their expenses and improving every other KPI in between. While dashboards and interactive reports present us with an abundance of analytics results in real time, I would like to ask you Dr. Phil’s question: “How’s that working for ya?”

For many people, distilling insights from analytics is challenging enough—translating those insights into specific actions is often impossible.

“A perfection of means, and confusion of aims, seems to be our main problem.” Albert Einstein

The good news is that you cannot buy another tool or technology to resolve this problem for you. If you could, our unemployment rate would be even higher. The job of setting the right aims is still best performed by humans. Setting aims should not be confused with creating hypotheses or forming assumptions. If you do that, you limit potential actions to the validation or rejection of those hypotheses, while the most valuable potential actions may never be discovered.

I’ve seen the best results when:

1. The scope is focused on 3 to 5 potential targets directly associated with specific business goals at the time:

a. “Is there an opportunity to disrupt the XXX market segment?” or

b. “Is it possible to improve our customers’ experience by linking it to a performance and compensation model of employees from departments other than sales and support?”, or

c. “What is the confidence of executive management in our sales forecasting process?”

2. The targets are potentially achievable, i.e. specific actions could be taken if the insights guiding such actions were known. In other words, you have the authority or influence to act upon insights to reach the selected targets. If you don’t, the best insights are not likely to result in decisive action, because that requires changing somebody’s mind after it is already made up. The insight that the burgeoning MP3 player market segment could be disrupted by redesigning the user interface and offering a source of high-quality digital content would not have launched the iPod revolution without somebody who had the authority to act on it.

3. Appropriate data sources can be identified and triangulated. The intersection of signals from multiple sources provides better guidance for action. Think of it as the GPS requirement for signals: a minimum of 3 satellites is required to produce relatively accurate position information. The minimum accuracy (confidence level) and volume (statistical representation) factors have to be assessed before the aim is set. There is no point in producing an insight that does not give you the confidence to act.
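The triangulation idea can be sketched in a few lines of Python. This is a toy illustration only: the 1–5 scale, the minimum-source count, and the disagreement threshold are my assumptions, not part of any specific methodology.

```python
from statistics import mean, stdev

def triangulate(signals, min_sources=3, max_spread=0.5):
    """Combine sentiment signals from independent data sources.
    Like GPS, require a minimum number of sources, and refuse to
    produce a reading when the sources disagree too widely."""
    if len(signals) < min_sources:
        return None  # not enough independent sources to act on
    if stdev(signals) > max_spread:
        return None  # sources conflict; confidence too low to act
    return mean(signals)

# Hypothetical scores (1-5 scale) for the same product attribute,
# drawn from reviews, support logs and social posts:
print(triangulate([3.9, 4.1, 3.8]))  # three agreeing sources
print(triangulate([3.9, 4.1]))       # too few sources -> None
```

Returning `None` rather than a shaky number mirrors the point above: an insight that does not clear the confidence bar should not drive action.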

The short-term, “ready, fire, aim” approach is not likely to produce meaningful ROI from a technology investment in analytics.

Apple at the crossroads

As many products and services become more agile by design, even the best-designed products have less and less time to enjoy superior profit margins before the competition starts to catch on. Patent protection and brand recognition do help to extend this time, but the clock keeps accelerating.

Apple’s iPhone provides a good case study of this phenomenon. Ever since its initial introduction by Steve Jobs, the iPhone has been the gold standard for the smartphone product category. It became a status symbol of Silicon Valley technocrats, and every new model was greeted by millions of lined-up fan boys and girls. Apple bestowed the honor of selling it on a chosen few channels and demanded handsome subsidies in return. The reason for this success was the iPhone’s superior customer experience, designed into the product, which exceeded any other in the category by a wide margin. Well, this margin has finally shrunk.

Last week’s financial news shows that the iPhone’s hegemony may be over. Make no mistake, it is still a great product, but the competition has caught up in creating a customer experience as good as or better than Apple’s. People bought 31.2 million iPhones last quarter, but their experiences were rather underwhelming—not because there is anything wrong with their iPhones, but because their expectations were too high, as many new customers came to Apple for the first time. As Samsung, HTC and Nokia customers “upgraded” to the iPhone, they did not find the overwhelming difference they expected.

These scores are not likely to match company-sponsored survey results, as they are extracted from the sentiments customers express when recounting their experience in unsolicited reviews they share online, often anonymously. What these scores say is that only 6% of iPhone 5 customers (Net Promoters) would actively spread Word of Mouth to promote the phone, compared to 18% for the Galaxy S4.
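For readers unfamiliar with the arithmetic behind a Net Promoter figure, here is the conventional calculation. This is only a sketch of the standard 0–10 survey formula; the scores above are mined from review sentiment rather than survey answers.

```python
def nps(ratings):
    """Conventional Net Promoter Score on a 0-10 scale:
    % promoters (9-10) minus % detractors (0-6); passives ignored."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 2 promoters, 2 passives, 2 detractors out of 6 -> NPS of 0
print(nps([10, 9, 8, 7, 6, 3]))
```

Because passives drop out of the numerator, two products with identical average ratings can carry very different NPS values, which is why promoter percentages like 6% vs. 18% are worth reporting separately.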

The differences are even more pronounced when iPhone is compared to Blackberry Z10, HTC One or Lumia 928.

Out of the 25 attributes of customer experience that matter most to customers of these smartphones, Apple still dominates in one – Design – which customers rate 28% above the group average. Details are available on request.

It appears that the marginal improvements introduced in the last two iPhone models – the 4S and 5 – failed to separate the brand from the pack. In addition, US carriers have started to experiment with unbundling handsets from services, and the subsidies to manufacturers like Apple are threatened. At this point the category shows signs of maturity. In the past, Apple’s market researchers were able to discover and exploit the latent needs of increasingly demanding consumers. Is today’s Apple capable of inventing a new category of products to march on as the industry leader?

Social Consumer challenge to Traditional Brand Management

As most marketers are well aware, when consumers trust a brand, the products associated with it can deliver higher margins than the competition and withstand adverse economic conditions without losing market share. Unless consumers consider a product or service to be a commodity, or choices are limited by regulatory authorities, brand reputation often outweighs price considerations.

Brand Reputation over price

Historically, brands managed their reputation mostly by means of PR and advertising. However, with the advent of the Social Customer these efforts have become much less effective.

As more customers become publicly vocal about their experiences, many brands have started to see their reputation as being threatened. Since publicly expressed sentiments often do not match the customer satisfaction data collected by a company, it is easy to adopt a defensive attitude toward social media word of mouth. Some marketers feel that social customers are there to rant and that most online feedback is negative. The data does not support this theory: of over 49 million product reviews on Amazon, the median Likert score is 3.78 (out of 5). Analysis of customer reviews posted on Yelp and TripAdvisor produces similar results.

Here are a few ideas for managing brand reputation in the Age of the Social Consumer:

  1. Accept that you cannot control customer behavior. Word of Mouth has been around for a very long time. In the past, very few people could hear it and it was easy to out-scream it with paid PR. You cannot control it anymore. The more you try, the more your brand reputation suffers. If, or should I say when, you get caught manipulating Word of Mouth, it will damage your brand reputation a lot more than a bad review.
  2. The best way to improve the social reputation of a brand is not to hunt and attempt to destroy the negative comments of customers who were disappointed with their experience. The best practice is to understand the root cause of their disappointment and correct underlying problems with your product or service. If you do that and let them know publicly, the social reputation of your brand will soar.
  3. Most negative reviews point businesses to ways of improving their offers. They also help consumers decide whether negative comments resonate with their own expectations. Consumers are smart enough to understand the difference between a legitimate grievance and an angry rant.
  4. Public sentiment, regardless of the measurement scale, may not match your internal Customer Satisfaction or NPS® scores, but they often correlate. Most importantly, consumers trust social sentiment more than a brand’s internal metrics. Resistance is futile and amounts to the Ostrich Strategy.

Customer Satisfaction—the Ultimate Vanity Metric?

Almost every company measures Customer Satisfaction or its variations at considerable expense and effort.

Some companies attempt to use the metric for advertising. The metric is supposed to convince a shopper to join the ranks of the company’s customers because they are supposedly 97% satisfied. These numbers are impossible for a consumer to validate, methodologically or anecdotally. Besides, there may be information floating in Social Media that disputes the company’s customer satisfaction claims, however unfairly. In my opinion, brandishing customer satisfaction scores without complete transparency will more likely lead to an erosion of trust than to an increase in sales. Social Customers trust each other’s experiences more than they do brand claims.

Many companies use the customer satisfaction metric to judge departmental performance while their customers keep churning, for reasons the measured business unit may have no control over whatsoever. They do well if the last calendar period’s metric scores higher than the previous one. If the last score is lower, bonuses are not paid and changes to the status quo are demanded. Sometimes the change is a switch to a different methodology of measuring customer satisfaction.

Disconnecting the Customer Satisfaction score, regardless of methodology, from specific and systematic action that targets the improvement of customer experience makes the score the ultimate vanity metric.

Richard H. Levey wrote that

“True customer insight requires first knowing (discovering) which attributes matter to the customer and then determining how the firm is performing on those attributes. If a customer’s experience occurs over multiple interactions and various media, then each needs to be measured to drive insight precision.” – Stop Measuring Customer Satisfaction and Start Understanding It (emphasis mine).

I would add: stop counting the clicks, “likes” and re-tweets, and start understanding WHY customers do what they do. Stop tabulating survey scores and start reading the comments—you may learn something that would help promote an action of positive change.

Customer Satisfaction Is A Relative Term

Customer perceptions of products and services, or of companies and brands, are measured using different scales and methodologies. Regardless of any ambiguity of definitions or sophistication of methodology, any scale you choose reflects one fundamental consideration: how does the product (service/brand/company) experience compare to customer expectations? Those expectations are formed by a company’s marketing communications and advertising, by other consumers’ word of mouth and, in this age of the Social Customer, by pundits and existing customer reviews published online. There are many well-documented “purchasing journey” maps produced by respected researchers. Here is one example.

Most of the studies agree that the choice a customer makes is based on the expectation that the selected product will be more satisfying than most other products within the segment. Yet many businesses measure the Customer Satisfaction of their offerings without comparing the results to their market averages. Considering that these sentiments are very dynamic, competitive comparisons make the process even more volatile and difficult to measure. However, the results are often well worth the effort, as they generate ideas for differentiation, marcom optimization and operational improvements that could produce significant financial gains.

The example below shows Nokia Lumia products exceeding their customers’ expectations by a much wider margin than their top competitors and the smartphone segment average. If you are involved with Customer Experience Management, a deeper look into the reasons behind the trend may help to improve your customer journey.

CSAT is a Relative Term

The following example measures aggregated Customer Satisfaction with Small (Kitchen) Appliance Brands against the average satisfaction level within that category. It is based on content analysis of 65,379 customer reviews published online over a one-year period.

Kitchen App Brands CSI vs Average

Such measurements can be produced using the most popular scales (such as NPS or CSAT), done for any market segment that has Social Customer engagement, and the results can be aggregated by brand and/or broken down by channel.
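The relative measurement described in this section boils down to simple arithmetic: the brand's average score compared against the category average. A minimal sketch, with made-up scores:

```python
def relative_index(brand_scores, category_scores):
    """Express a brand's average satisfaction as a ratio to the
    category average; 1.0 means the brand tracks the market."""
    brand_avg = sum(brand_scores) / len(brand_scores)
    category_avg = sum(category_scores) / len(category_scores)
    return brand_avg / category_avg

# Hypothetical numbers: a brand averaging 4.3 in a category
# averaging 4.0 sits about 7.5% above the market baseline.
print(relative_index([4.2, 4.4], [3.8, 4.0, 4.0, 4.2]))
```

The same ratio works for any underlying scale (stars, CSAT, mined sentiment), which is what makes "above/below category average" a more comparable claim than a raw score.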

Valuable insights into channel performance

Knowledge of the customer satisfaction and experience delivered by a specific channel can be very illuminating from a brand manager’s perspective. It could be even more enlightening if customer satisfaction metrics were analyzed alongside the units sold and the units returned by each channel. When these streams of data consistently correlate and/or trend together negatively, they likely indicate systemic channel performance problems.

From the channel perspective, customer satisfaction with specific brands – and even more importantly, with specific products – can help optimize shelf space for maximum profitability.

The detailed analysis of customer feedback (reviews) and customer support communications associated with a troubled channel or brand can provide root cause(s) and ideas for corrective actions.

Below is an example of a report on customer satisfaction with smartphones by channel/carrier. The information was mined from 142,369 online customer reviews published prior to March 30, 2013. AT&T customers who use Nokia smartphones reported customer satisfaction 21% above average across all major carriers* and all major brands.

Social Customer Satisfaction per Channel

* Sprint did not offer Nokia smartphones during the reported period.

Deeper analysis may reveal customer satisfaction by model, time period, customer gender, age group, other personal characteristics, or geographic region. Please contact us to discuss the methodology for mining intelligence in your market segment.

HTC sweeps Customer Experience challenge

October is here, and that signals the arrival of the Piplzchoice quarterly smartphone Customer Experience report. Past reports are available upon request.

Here are a few words of explanation of the methodology used to produce this report:

  • We interpret and measure “Customer Experience” according to a definition and understanding articulated by Forrester Research analysts as “how customers perceive their interactions with your company.” However, we expand it further to “your brand” and “your product” to make the measurements more actionable by branding and product management.
  • We do not conduct any surveys, pose any questions, or contact any customers in any form or shape. That also means that no assumptions or keywords are constructed by Amplified Analytics personnel to produce this report. The information published here is based on proprietary, automated Opinion Mining of unsolicited, customer-generated descriptions of their experience with specific smartphone models.
  • We start with a view of a “universe” of 355 smartphone models described by 108,963 customers. We then focus on smartphones that have been reviewed during the last 3 months (39 models).
  • Opinion Miner® software discovers specific attributes of customer experiences with these smartphones and measures the customers’ sentiments for each attribute.
  • This is not a “buzz” sentiment monitoring exercise, as we ignore any content that cannot be reasonably attributed to experience of a paying customer. More on the methodology can be found here.
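To make the idea of attribute-level opinion mining concrete, here is a deliberately naive sketch. The hard-coded keyword lexicons are my own toy assumptions; the proprietary Opinion Miner discovers attributes and sentiment cues from the review corpus automatically.

```python
from collections import defaultdict

# Toy lexicons; a real system learns attributes and sentiment cues
# from the review corpus instead of hard-coding them.
ASPECTS = {"battery": {"battery", "charge"}, "screen": {"screen", "display"}}
POSITIVE = {"great", "excellent", "love", "gorgeous"}
NEGATIVE = {"poor", "terrible", "dies", "cracked"}

def score_review(text):
    """Attach a crude sentiment tally to each attribute mentioned."""
    words = set(text.lower().split())
    scores = defaultdict(int)
    for aspect, keywords in ASPECTS.items():
        if words & keywords:  # the review mentions this attribute
            scores[aspect] += len(words & POSITIVE)
            scores[aspect] -= len(words & NEGATIVE)
    return dict(scores)

print(score_review("love the gorgeous display"))  # screen: +2
print(score_review("the battery is terrible"))    # battery: -1
```

Aggregating such per-attribute tallies across thousands of reviews is what turns raw "buzz" into the attribute-level satisfaction scores reported below.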

Spotlight on Brand

The share of customer reviews illustrates the level of customer engagement with a brand and correlates with the dynamics of its market share performance. The chart below shows the share of customer engagement with smartphone brands during the third quarter of 2012.

As predicted in the last two reports, HTC’s share of engagement has finally come down as Thunderbolt customers stopped describing their experiences with the long-obsolete model. Apple’s share of attention will likely rise substantially in the fourth quarter, since the slow delivery of the iPhone 5 has just started to produce a trickle of customer reviews.

The big winner of the customer share of attention this quarter is Samsung at 39%. However, it is not a surprise, as its engagement with customers has grown consistently in every period on which we have reported. The only other brand that shows a consistent increase, albeit on a much smaller scale, is Nokia (6%).

The Average Customer Satisfaction per Brand chart paints a picture quite different from the results of the most popular surveys published in the recent past. We calculated the average Customer Satisfaction of a Brand by averaging the Customer Satisfaction scores of each model that belongs to the Brand.

This approach highlights how specific models can impact the overall Customer Perception of a Brand. The wide range of Customer Satisfaction scores across Samsung models, from Galaxy Note=1.47 to Intercept=0.88, is responsible for lowering the Brand average. I think our approach provides better guidance for proactive Category/Brand decision management.
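The brand-average calculation described above is an unweighted mean of model scores, which is why a single weak model drags the whole brand down. A sketch using only the two Samsung scores quoted here (the real calculation would include every model in the line-up):

```python
# Model-level CSAT scores quoted in the report; the other Samsung
# models would be included in the full calculation.
model_scores = {"Galaxy Note": 1.47, "Intercept": 0.88}

# Unweighted mean: every model counts equally, regardless of how
# many units it sold or how many reviews it received.
brand_avg = sum(model_scores.values()) / len(model_scores)
print(brand_avg)
```

An alternative would be weighting each model by review volume; the unweighted version is more sensitive to laggard models, which is exactly the proactive-portfolio signal the report is after.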

Spotlight on Operating Systems

Android keeps dominating the share of Customer Engagement, but the Windows OS slice of the pie continues to grow.

The trend picture below shows a surprisingly consistent increase in customers’ engagement with Windows OS from quarter to quarter. This is the first period in which WP customers out-reviewed Apple customers—by 83%.

The rate of engagement correlates with a high level of Customer Satisfaction: Windows-powered smartphones are locked in a statistical tie with Apple iPhones, and five of the ten most popular models are Windows phones from different manufacturers.

It will be interesting to see how the inflow of iPhone 5 customer reviews impacts that battle.

It is worth repeating that these scores reflect aggregate, average satisfaction with the phones, not with their operating systems. Let me know if you need a detailed view of Customer Satisfaction with the operating systems themselves.

Spotlight on Smartphone Models

Samsung Droid Charge customers generated the largest number of reviews of all smartphones reviewed this quarter, at 2,084. The latest arrival, the Apple iPhone 5, understandably has the smallest number at 56. I expect that to change dramatically during the last quarter of 2012.

The complete list of all models included in this report can be found here.

This quarter, the HTC Radar earned the top general satisfaction score of 1.78, exceeding its customers’ expectations by 78% (N=406). Another HTC Windows phone, the Titan II (CSAT=1.69/N=83), and the HTC Amaze 4G (CSAT=1.56/N=208) were the closest contenders. That is a sweep for HTC. The cellar is occupied by the Blackberry Bold 9900 (CSAT=0.69/N=84), LG Cosmos (CSAT=0.76/N=644), and Samsung Intercept (CSAT=0.87/N=441). Both the Intercept and the Cosmos have been at the bottom for the last three quarters, and I wonder why the brands’ managers continue to allow the equity erosion to perpetuate.
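On the scale used in this report, a score of 1.0 appears to correspond to expectations exactly met, so converting a CSAT score into "exceeds expectations by X%" is straightforward. A sketch of that reading, assuming the scale is linear:

```python
def vs_expectations(csat):
    """Express a CSAT score (1.0 = expectations met) as a percent
    above or below customer expectations."""
    return round((csat - 1.0) * 100)

print(vs_expectations(1.78))  # HTC Radar: 78% above expectations
print(vs_expectations(0.69))  # Blackberry Bold 9900: 31% below
```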

In previous installments of this report, I presented a detailed account of Customer Experience measurements for selected models. Since the volume of information has become too large for this format, and the level of readers’ interest in the details is not clear, I will conclude the analysis here. A detailed comparison of specific smartphone models for personal use can be found by following this link. If you are interested in a SmartPhone, or other market’s, Customer Experience Measurement (CXm) dashboard implementation, please email me directly.