If you are interested in engaging your customers, your listening has to be customer-centric. I know you want to talk about your product and your company. We all want to talk about things that are important to us, but we only engage with people who listen at least as much as they talk.
Many marketers are intrigued with the idea of using Social Media in their go-to-market campaigns for the next product launch. They are disappointed to learn that there is usually not enough customer feedback available at the time of a launch to propel their new product to instant, viral success.
Authentic word of mouth cannot be “manufactured” by marketers when they need it, but can be leveraged very successfully when customers are engaged with their brand/category. Engaging customers is not an event within a campaign, but a long term, customer-centric strategy.
The first mile of customer engagement is a post-commerce or post-transaction strategy that invests in an ongoing experience to keep customers happy now and over time. Doing so sparks positive word of mouth and in turn influences decisions along the dynamic customer journey that defines the new era of connected consumerism. If in fact getting closer to customers is a key objective, then why do many businesses neglect the first mile of customer experience?
Every product experience starts with an expectation. The expectation was originally initiated by product announcements, industry analysts’ interpretations of these announcements, pundits’ reviews and commentary, customers’ word of mouth, and eventually your own experience. When this experience exceeds the original expectation, the Social Customer has a propensity to generate authentic, positive word of mouth online that is read by scores of interested consumers, who view it as the most trusted source of information about your product.
Many companies monitor social media to supplement their Customer Support channels and help resolve specific customer issues. This is surely a part of Customer Experience, but only a part. Multiple and loud accolades for customer support satisfaction may spook potential buyers by making them think that the product quality is low, because it seems to require so much support effort. However, customers’ stories describing why they purchased the product and whether it met their expectations truly help potential buyers decide if it is the right selection for them.
The goal is to learn from a very large number of customers, in a very short time how they perceive your product and whether it has met their expectations. The techniques employed in the listening process can be used during go-to-market campaigns.
A lot has been written in the last few years about the importance of consumer engagement with brands in the age of the Social Customer. Most writings focus either on teaching how to get the most Facebook likes and Twitter followers for your brand or on how to manage PR disasters fueled by social media winds.
I always look for evidence of Social Customer impact on business growth. Intuitively, most people would agree that satisfied customers, who actively share their experience with other consumers, impact the growth of product and brand market share. However, intuition is not a very powerful agent of organizational transformation; I hope proof in the form of data has a better chance of success.
The most valuable insights often hide in the intersection of multiple data sources. For example, let’s look at how the combination of social media engagement and customer satisfaction information correlates to changes in a brand market share.
As smartphones represent one of the most dynamic and socially engaging market segments, they provide a good source of data for our example. See below the percent market share by operating system for the first quarters of 2012 and 2013:
Amplified Analytics online marketing research mines customer reviews to produce Social Customer Engagement and Customer Satisfaction scores. We intentionally used Social Customer data from preceding time periods to examine the influence of social feedback on customer behavior that produces market share change.
After combining the data from both sources we calculated the year-to-year change you see below.
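The kind of combination described above can be sketched in a few lines of code. This is a minimal illustration, not the report's actual pipeline; all numbers and field names are hypothetical.

```python
# Sketch: combine engagement/satisfaction scores from one source with
# market share from another, then compute year-over-year change.
# All figures below are illustrative placeholders, not real data.

def yoy_change(current, prior):
    """Percent change from the prior period to the current one."""
    return round((current - prior) / prior * 100, 1)

# Hypothetical per-platform records: (engagement, satisfaction, share %)
data_2012 = {"Windows": (1200, 0.9, 2.0), "Android": (30000, 1.1, 59.0)}
data_2013 = {"Windows": (4100, 1.2, 3.2), "Android": (33000, 1.1, 75.0)}

for platform in data_2012:
    eng_prev, sat_prev, share_prev = data_2012[platform]
    eng_cur, sat_cur, share_cur = data_2013[platform]
    print(platform,
          "engagement:", yoy_change(eng_cur, eng_prev), "%",
          "satisfaction:", yoy_change(sat_cur, sat_prev), "%",
          "share:", yoy_change(share_cur, share_prev), "%")
```

The interesting signal is not any single column but the pattern across them: engagement and satisfaction changes in the earlier period lining up with share changes in the later one.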
The Windows example suggests that strong growth of Social Customer Engagement combined with robust improvement in Customer Satisfaction lead to very meaningful change in the Market Share.
If you are like me, you probably receive requests to share your opinion about websites, products or services on a daily basis. That is not surprising if you consider the number of technology and service providers that claim to make VOC solicitation and collection cheap and easy for companies.
I often wondered why they go to the effort and expense of soliciting our feedback while at the same time ignoring the Word of Mouth that is shared online by their customers without any solicitation.
The other day I received an email survey request from the manufacturer of a vacuum cleaner we had purchased a few weeks prior. A very polite message from the Customer Experience Director intimated that their VOC program needed my feedback. I liked the product a lot and had left a very favorable review on the retailer’s website describing my experience. In fact, a few people marked my review as “helpful”. So why does this polite gentleman want me to answer 32 questions about my satisfaction with their product?
Have I not already published my feedback under my name? I posed this question to the Customer Experience Director and suggested that the survey questions were convoluted, too wordy and focused on details of no importance to this customer. The response indicated that the company does not include Social Media feedback in their Customer Experience and Satisfaction measurement efforts because it is a domain of their Digital Marketing group and their Customer Service department’s Social Media listening team.
Many companies are striving to engage with the Social Customer. However, they insist on controlling the method, level and form of engagement. That is not very smart, and it will not work, because the Social Customer has alternative options. The corporate belief that social media research does not produce market intelligence as valuable as VOC does is analogous to the conviction that bottled water is “better/cleaner/safer” than tap water.
Social Media research of WOM has to be included into VOC programs because:
WOM often produces much larger data samples. Low customer survey response rates always raise the concern that results are not statistically representative.
Social Media research of WOM does not carry a risk of creating a negative touch point of customer experience by poor execution of VOC effort. Considerable skills and efforts are required to craft the survey questions and to administer the program without introducing friction into customer communication with a brand. WOM analysis helps to ask questions that are more relevant to customers.
The results of VOC measurements sponsored by Customer Experience Management are not visible to consumers, and if they are, they are not believed by consumers, who see them as a form of advertising. VOC produces metrics; consumers communicate by telling stories.
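The sample-size argument above can be made concrete with the standard margin-of-error formula for a proportion. A minimal sketch with illustrative sample sizes:

```python
# Sketch: why larger WOM samples matter. The 95% margin of error for a
# proportion shrinks with the square root of the sample size, so a few
# hundred survey responses carry much wider error bars than tens of
# thousands of mined reviews. The sample sizes are illustrative.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an observed proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

survey_n = 200    # a typical yield after low survey response rates
wom_n = 50000     # reviews mined from social media

print(f"survey  +/- {margin_of_error(survey_n):.1%}")   # roughly +/- 7%
print(f"reviews +/- {margin_of_error(wom_n):.1%}")      # well under 1%
```

Going from hundreds of responses to tens of thousands of reviews shrinks the uncertainty by more than an order of magnitude.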
With the advent of Big Data, it is estimated that 70%-80% of all data collected and stored by an enterprise is in an unstructured form. There are various approaches, technologies and methods to automate the analysis of unstructured data such as text.
However, regardless of advances in technology, some Customer Experience Management, Marketing and Customer Service professionals continue to use the accuracy argument to deny their employers access to significant operational and financial benefits. They argue that the results, produced by the textual analysis software products, are substantially less accurate than results produced by humans, and therefore it is best to ignore the vast repositories of human knowledge and disregard the immense cost of storing them until the technologies mature.
It is humorous that people with such attachment to “accuracy” usually have difficulty clearly defining what it means to them in this context or how to measure it.
Given the ambiguous nature of unstructured data, the challenges of formal definition are easy to understand. At its core we deal with an interpretation, by a human or by a machine, of what was said or written by another human. A single individual will interpret the same text with different results depending on a multitude of conditions, such as the time of day, the context in which the text is framed, or the state of mind of the interpreter at that moment. In addition, no single individual can possibly handle the volumes of data available – and with each additional interpreter joining the task, the reproducibility of interpretation results declines exponentially.
The speed and cost are obvious arguments for automated processing, but a machine also offers a better solution to the problem of the “accuracy” of big, unstructured data analysis. An interpretation of a single piece of text may not agree with a detractor’s interpretation at a given moment, but the average result of a large data set analysis will consistently produce measurements within 10% of a human tester’s results*.
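The averaging argument can be demonstrated with a small simulation. This is an illustrative sketch, not the actual Opinion Miner method: we assign each text a "true" sentiment and add per-reading noise to stand in for interpretation error, then compare the corpus-level averages.

```python
# Sketch: individual machine readings of single texts are noisy, but the
# mean over a large corpus is stable. All numbers are illustrative.
import random

random.seed(42)
n_texts = 10000
true_scores = [random.uniform(-1, 1) for _ in range(n_texts)]

# A machine reading = true sentiment + noise (stand-in for
# interpretation error on any single piece of text)
machine = [s + random.gauss(0, 0.5) for s in true_scores]

true_mean = sum(true_scores) / n_texts
machine_mean = sum(machine) / n_texts
print(f"true mean {true_mean:+.3f}, machine mean {machine_mean:+.3f}")
```

Even with heavy noise on every individual reading, the two corpus averages land close together: the per-text errors cancel out as the sample grows.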
The debate isn’t whether or not automated analysis of unstructured data is “accurate” enough. The debate is whether an enterprise can ignore their vast data reserves in the Age of the Social Consumer.
* This number is based on our internal tests that we conduct at least 3 times per year.
This is our third annual analysis of customer perceptions of smartphones. This report is produced entirely by means of Social Media research. Customers became even more active in 2012, sharing their experiences with products they chose for the benefit of consumers who are shopping for smartphones.
The last year saw an increase in unsolicited Social Customer engagements with brands in this category of over 75% — from 29,971 in 2011 to 52,517 in 2012. The quarterly trends indicate that the rate of engagement is still accelerating. The following filters were applied to arrive at these numbers:
smartphone models that have at least 100 customer reviews published on multiple SM sites
content is unsolicited and volunteered by actual customers
content was published on or before 12/31/2012
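The filters above can be sketched as a small piece of code applied to review records. The field names and sample records are hypothetical; only the three filtering rules come from the report.

```python
# Sketch of the filtering rules above: keep only unsolicited reviews
# published on or before 12/31/2012, then keep only models with at
# least 100 such reviews. Record fields are hypothetical.
from collections import Counter
from datetime import date

reviews = [
    {"model": "Galaxy S III", "solicited": False, "published": date(2012, 6, 1)},
    {"model": "Galaxy S III", "solicited": True,  "published": date(2012, 7, 1)},
    {"model": "Obscure X",    "solicited": False, "published": date(2012, 8, 1)},
]

CUTOFF = date(2012, 12, 31)
MIN_REVIEWS = 100  # per-model threshold from the report

eligible = [r for r in reviews
            if not r["solicited"] and r["published"] <= CUTOFF]

# Keep only models with enough eligible reviews
counts = Counter(r["model"] for r in eligible)
kept = [r for r in eligible if counts[r["model"]] >= MIN_REVIEWS]
```

On this toy sample nothing survives the 100-review threshold, which is the point of the filter: thinly reviewed models are excluded from the analysis.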
We produce smartphone market reports to illustrate the power of our technology developed for DIY primary market research of Social Media and the Social Enterprise. Our software mines opinions in customers’ Word of Mouth to measure their engagement with brands and the difference between their expectations and their experience, then predicts their propensity for advocacy. For more information about the methodology used to produce this information, please visit our methodology page and/or contact the writer.
Spotlight on Customer Engagement
A brand cannot flourish without the advocacy of its customers, and advocacy is not likely without engagement. We have noticed a correlation between the number of reviews published online and the number of units shipped, and therefore found it important to use for further studies. Plenty of smartphones are launched every year, but only some engage customers sufficiently to inspire them to share their experience in the numbers required for meaningful, statistically representative analysis. Consumer exposure (i.e., advertising spend) is one factor, but not the deciding one. For example, the HTC 8X is exposed far more via TV advertising than the Nokia Lumia 920, but the latter has been reviewed online almost twice as much so far.
Samsung is the King of Customer Engagement, and it is not surprising that their sales numbers also lead the rest of the brands. More intriguing is that HTC keeps holding on to second place, considering its well-publicized problems. However, Samsung is not the only net gainer – Nokia has also seen dramatic improvement in its Social Customer engagement.
Below is a chart of the trends in Social Customer Engagement with brands year over year. As the overall engagement grows fast, only Samsung and Nokia show substantial gains, capturing the attention of this critically important demographic.
The Samsung Galaxy S III was the most often reviewed smartphone in 2012 (5,048), which is remarkable considering its late shipping date. It is not really a surprise considering its market penetration (it is available under the same name from every major US carrier) and advertising spend. This strategy does promote powerful Word of Mouth, but this smartphone was also launched before most of the others on this list. The Apple iPhone 5 was second (3,185) and the iPhone 4S third (3,096) in engaging with customers.
Customer Engagement with Platforms
It appears that Social Customers display more loyalty to smartphone platforms than to brands, as they describe their previous experiences with models from different manufacturers, but mostly the same operating systems. That is why we also looked at Customer Engagement by platform.
Again, the dominance of Android is to be expected, even though it came down from the prior year. The dramatic drop in iOS Customer Engagement (30% Y-O-Y) is very surprising, but there is enough evidence of a substantial number of iPhone customers who decided to experience the Android (9%) or Windows (14%) platforms. It is always easier to show dramatic gains from a low base, but the growth of engagement among Windows customers is nothing less than remarkable at 1,032% Y-O-Y.
Spotlight on Advocacy
In the previous reports we measured Customer Satisfaction with smartphones, and then aggregated those numbers to the brand level. This year we introduced a capability to estimate (or predict) how Social Customers would respond to the Net Promoter Score® question if they were asked, “On a scale of 0-10, how likely would you be to recommend this product to your friend or family?”. Since we work with unsolicited customer feedback, we have no ability to ask such a question. However, when customers are asked this question, their typical response is to replay their experience with the product in their minds, then decide how to answer based on those memories. These experiences are precisely the same “raw material” that is available in online customer reviews. Our software does the last piece: translating stories and experiences into a score. It takes common language and translates it into a scale of how strongly the “author” feels about the subject – either positively or negatively. The chart below shows the aggregated NPS for each brand, produced by calculating weighted scores of the individual smartphones associated with that brand.
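The mechanics of the NPS calculation and the review-count-weighted brand aggregate can be sketched as follows. The predicted 0-10 scores and review counts here are hypothetical; only the NPS formula and the weighting scheme come from the text.

```python
# Sketch: NPS from predicted 0-10 scores, then a review-count-weighted
# brand aggregate. All scores and counts are illustrative.

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical per-model predicted scores and review counts for a brand
models = {
    "Model A": ([9, 10, 9, 8, 10, 6], 6),
    "Model B": ([9, 8, 10, 4], 4),
}

total = sum(n for _, n in models.values())
brand_nps = round(sum(nps(scores) * n for scores, n in models.values()) / total)
print(brand_nps)  # weighted blend of the two per-model scores
```

Weighting by review count keeps a heavily reviewed model from being drowned out by a thinly reviewed one when scores are rolled up to the brand level.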
It would not be surprising if these numbers differed substantially from results obtained by a survey, considering that the Social Customer enjoys an anonymity that allows them to say how they really feel about their experience. The top Advocacy rating of 2012 is shared by two Windows smartphones – the HTC Titan II (NPS=55) and the Nokia Lumia 920 (NPS=55). The Motorola Atrix 2 (NPS=49), Nokia Lumia 822 (NPS=47) and Samsung Galaxy Note (NPS=45) complete the top five smartphones. The basement is occupied by the LG Cosmos 2 (NPS=-68), Motorola Droid 2 Global (NPS=-55) and LG Revolution (NPS=-44). These ratings change quite frequently; our software recalculates every time new customers publish their experiences online. Free access to real-time monitoring of a product category of your choice is available on a trial basis.
Advocacy by Platform
Say what you want about the shortage of applications, Windows users just love their experience with the platform. There are not a lot of them yet (4,152), but their numbers are growing fast (1,032% year-over-year) and they are very vocal. The quarterly growth trends were very consistent and apparently predictive of growing sales (139% in Q3).
Three out of top five smartphones are powered by Windows.
We cannot provide trending information for NPS this year, as this capability was launched only a few months ago. However, we can look at Customer Satisfaction as a proxy and see that it improved for all platforms except Blackberry.
The blog format does not allow enough space for detailed reports of each smartphone’s customer experience analysis, or their benchmarking and root cause analysis, but such information is available on request. Below is an example (a screenshot) illustrating the format of a Customer Experience dashboard.
In conclusion, this form of Social Media market research offers compounding benefits: faster time to insight translates into an advantage in time to market, and lower research costs mean more resources can be directed to improving the Customer Experience. All of this translates into a serious competitive advantage. In our experience, companies that leverage customer review data not only benefit from faster time to insight, but also find the information more actionable.
Here is the Q2 2012 update. These links – Q1 2012 and 2011 – will take you to the previous reports. This edition includes 353 smart phone models and the aggregation of 104,691 unsolicited customer reviews. We have used the Opinion Miner® software to extract specific attributes of customer experiences with these smart phones and to measure the customers’ sentiments for each attribute. In the interest of consistency, we again filtered for smart phones that have been reviewed during the last 3 months to focus on currently sold models.
Disclaimer – Nokia is our client, however neither Nokia, nor any other company or organization, have sponsored or influenced this study.
Customer Reviews per Model
Only the 51 models that were actually shipped to paying customers during the past quarter are included in this study, as we are interested only in actual customer experience, not in marketing accolades.
If you are a new reader, I would like to stress that no survey or focus groups methods were used to collect data for this study. Please check our methodology if you are interested.
The most customer-reviewed smart phones that made it through the filters for Q2 2012 are the same models as last quarter, but their content contributions have slowed down dramatically during the last couple of months – HTC Thunderbolt (5,892), Samsung Droid Charge (2,009), and Motorola Droid Bionic (1,633). Obviously, the timing of a phone’s introduction impacts the total, aggregated number of customer reviews published for these devices. Click on any image below to make it larger.
Customer Reviews per Brand
The enormous number of customer reviews of the Thunderbolt, which is almost obsolete by now, keeps HTC as the brand most reviewed by its customers. We anticipate that the next report will reflect a change.
It is worth mentioning the trend of a dramatic decline for LG in the number of customer reviews and the steady rise of popularity for Nokia (see figure below). We have previously studied the correlation between the number of customer reviews published online and the number of units shipped, and therefore found it important to use for comparison. These trends suggest similar dynamics in customer purchases.
Customer Reviews per Operating System
The reviews of Android phones absolutely dwarf ALL other operating systems, and the trend does not suggest any major change. One thing worth noting is the growth in the number of reviews of Windows phones from 1% at year-end to 4% at the end of Q2 2012. It would be interesting to see if this trend continues.
iPhone aficionados are a very finicky crowd, and even though the earlier models continue to sell, virtually no reviews of them are being published by their purchasers anymore. This fact explains the precipitous drop in iOS’s share of reviews after the end of 2011.
Customer Satisfaction per Operating System
Customer satisfaction with Android phones continues improving from the previous reports, but Windows phones satisfaction has seen the most dramatic increase since the previous quarter – 18%, while customer satisfaction with Blackberry phones keeps sliding down. In fact, five out of ten smart phones with highest customer satisfaction scores are Windows models.
It is worth repeating that these scores are the aggregate, average satisfaction with the phones and not with their operating systems. We will look at Customer Satisfaction with an operating system later in this post.
This time, the Samsung Galaxy S Blaze earned the highest general satisfaction score of 1.81, exceeding its customers’ expectations by 81% (N=106). The Motorola Atrix 2 (CSAT=1.78, N=186) and Nokia Lumia 710 (CSAT=1.72, N=318) were the closest contenders, while the LG Cosmos (0.79/590), Samsung Intercept (0.88/1,186) and Blackberry Curve 3G 9330 (0.91/410) disappointed their customers the most. Two of the last three were on this list last quarter, as the LG Cosmos’ reputation has continued to sink deeper.
Attributes of Customer Experience by Importance
Our Market Intelligence Analysis of the segment indicates that the following Attributes of Customer Experience are most important to the customers:
This 2-minute video explains the methodology for this chart in greater detail -
More specific insights into customer perception required the application of additional filters to select models representing different Operating Systems in significant numbers. The following models were selected:
The graph below shows Customer Satisfaction scores for the smart phones’ Reliability experience. The Blackberry Bold 9930 is the only model in this sample that disappointed its customers.
Not surprisingly, the iPhone continues to dazzle its customers with the quality of its Display. However, all models included in this analysis earned very high marks. A score of 1.0 represents the “Satisfied” value as defined in our methodology and is interpreted by our algorithms as equivalent to the statement, “I experienced what I expected”.
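The scale described above is easy to make explicit: a score of 1.0 means expectations were met exactly, and the distance from 1.0 reads as the percent by which experience exceeded or missed expectations. A small sketch using scores quoted elsewhere in this report:

```python
# Sketch of the CSAT scale described above: 1.0 = "I experienced what
# I expected"; 1.81 = expectations exceeded by 81%; 0.79 = the product
# fell 21% short. Model names/scores are the ones quoted in the report.

def expectation_delta(csat):
    """Percent by which experience exceeded (+) or missed (-) expectations."""
    return round((csat - 1.0) * 100)

for model, score in [("Samsung Galaxy S Blaze", 1.81), ("LG Cosmos", 0.79)]:
    sign = "exceeded" if score >= 1.0 else "missed"
    print(f"{model}: {sign} expectations by {abs(expectation_delta(score))}%")
```

Reading the scores this way makes the satisfaction charts directly comparable across models: anything above 1.0 beat expectations, anything below it disappointed.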
In the interest of space, more details and customer feedback verbatim are available on request via access to the dynamic dashboard for this segment. I will be happy to provide the link free of charge.
Based on the questions I often get from marketing practitioners after webinars and speaking engagements, there is considerable confusion about the difference between Social Media Monitoring and Customer Intelligence methodologies. Below is my first attempt to establish a clear demarcation line between the two approaches. Please help me to refine this matrix with your feedback, comments and disagreements.
Social Media Monitoring
Customer Intelligence Analytics
SMM captures and measures Word of Mouth communications generated by anybody or anything: consumers, bloggers, marketers, pundits, industry analysts, customers, automated reposting and SEO software. The content originates from Social Media networks and other public (Internet) sources.
CIA captures and measures only customer communications about a product or service they have purchased and experienced. The user/customer-generated content (UGC/CGC) can originate from public (Internet) or private (company) sources.
Transactional analytics, i.e., how much buzz there is about keyword=XYZ and whether it’s positive or negative. Focus is on BUZZ.
Contextual analytics - why customers purchased this product, what they do like and don’t like about their experience, and to what degree. Focus is on Customer Experience.
Provides two-dimensional measurements per keyword provided – velocity and sentiment, i.e., how fast and furious the communications are generated, and whether they are negative or positive.
Provides three-dimensional measurements per product/service – discovers what attributes of customer experience with the product are important to customers, measures how important each attribute is, and measures the difference between customer expectations and customer experience with each attribute.
Immediate to short time frame communications are monitored and trended.
Time frame is determined by a product life-cycle and trending of post-shipping customer feedback metrics.
Excellent for PR and Customer Support Crisis Management applications.
Excellent for Strategic Marketing, Marketing Communications Effectiveness, Product Management, Customer Support Management, and Purchasing Management applications.
Last night, I noticed that Brother started running its printer Reliability commercials, known as Printing Dreams, on TV again. If you have not seen it, the company claims that their printers are the most reliable on the market.
Apparently, this claim is somewhat supported by the 2011 satisfaction survey conducted by PC World, although the survey results had Brother sharing the Reliability honors with two other brands. A more detailed analysis of Customer Generated Content (CGC) indicates that Canon, not Brother, enjoys the highest reputation for Reliability. Below is the result of an Opinion Miner analysis of 32,309 customer experiences with 350 printers from major brands.
Personally, I understand that advertising dollars are better spent promoting a brand rather than a specific product, but from a user perspective not all products exhibit the best qualities a brand may be known for; hence I think a comparison of similar models is much more meaningful.
Since our approach to Customer Intelligence does not involve expensive surveys or the effort of crafting minimally biased questions, I decided to run a quick analysis of customer reviews of five similar printers to satisfy my own curiosity. I selected printers that have been available for purchase for at least a year, so their customers have had an opportunity to experience how reliable they are over time. The second selection criterion is that the printers are still current, i.e., they are still available for purchase and their customers continue to publish their experiences this month. Thirdly, each model has a representative number of customers who described their experience with it. The following printers, collectively reviewed by 1,224 customers, met these conditions and are listed alphabetically:
Brother HL-2270DW
Canon Pixma MX420
Epson WorkForce 645
HP LaserJet Pro P1102w, and
Lexmark Prospect Pro205
One thing Brother definitely got right: Reliability is the most important attribute of customer experience, measured at 20.91% importance. Here is the link to the short video explaining how it is measured. However, this particular printer (HL-2270DW) does not live up to Brother’s claim of highest Reliability at all, as it falls short of the scores earned by 3 out of 5 competitors.
Printing quality is the second most important attribute of customer experience, as 17.17% of all opinions expressed by customers relate to it. The Price/value attribute is third in importance (8.41%) and Usability is fourth (6.49%).
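The importance percentages quoted above read as each attribute's share of all opinions customers expressed. A minimal sketch of that calculation; the opinion counts below are invented to reproduce the quoted percentages, not the actual mined counts.

```python
# Sketch: attribute importance as the share of all customer opinions
# that mention each attribute. Counts are hypothetical, chosen only
# to reproduce the percentages quoted in the text.
opinion_counts = {
    "Reliability": 2091,
    "Printing quality": 1717,
    "Price/value": 841,
    "Usability": 649,
    "Other attributes": 4702,
}

total = sum(opinion_counts.values())
importance = {attr: round(100 * n / total, 2)
              for attr, n in opinion_counts.items()}
print(importance["Reliability"])  # share of opinions about Reliability
```

The point of the metric is that importance is inferred from what customers choose to talk about, rather than from a survey asking them to rank attributes.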
Canon is the clear leader in this group, as it earned the highest satisfaction scores from customers for most of the important attributes (7 out of 12):
Color/picture quality (exceeded customer expectations by 27%)
Customer support (met customer expectations)
Printing quality (exceeded customer expectations by 41%)
Reliability (exceeded customer expectations by 27%)
Scanning experience (exceeded customer expectations by 26%)
Speed/Performance (exceeded customer expectations by 68%)
Wireless (exceeded customer expectations by 18%)
Further details can be seen on the snapshot of the Customer Intelligence Analysis dashboard below.
If you care to dig deeper into the details, lower importance attributes and verbatim, please let me know as this information is available on request.
Until last year’s publicity crisis, Toyota enjoyed one of the most formidable brand reputations in the automotive market. Hyundai, on the other hand, used to dwell at the bottom of the brand totem pole. Their respective sales numbers and price differential have reflected the perception of value in their customers’ minds. The Times They Are a-Changin’, and here is an example of how customers describe their experience with their cars today. We focused on content generated by 386 customers who experienced the 2011 Camry and 2011 Sonata and shared their experiences with other consumers online. Below is a snapshot of the Market (segment) Intelligence Analysis dashboard. The actual dashboard is interactive and provides access to verbatim for contextual interpretation.
This is not a complete list of the Customer Experience Attributes the Opinion Miner has discovered; we selected the ones that are most important to customers. The complete list is available on request. The bars indicate the difference between customers’ expectations and their experience with a specific Attribute, while the green line across the bars shows the Importance of these Attributes to the customers.
Our methodology does not utilize surveys, focus groups, panels or other forms of leading questions/bias forming market research tools.
Two findings are worth pointing out:
Camry customers’ disappointment with their Customer Care experience is very surprising and troublesome. It may be worth measuring this Attribute across the Toyota vehicle line-up to get a better assessment of whether this is a data spike or the beginning of a trend threatening the Toyota brand value.
Road Noise and Transmission experiences have disappointed the customers of both contenders; however, the Sonata seems to provide a more tolerable experience than the Camry. In both cases the Transmission issues are illustrated in verbatim such as:
“Truly terrible [transmission] (violent shifts and indecision concerning gear choice.)”
“the transmission sucks! When ever the car shifts gears it always knocks and there is a ways a jerk while driving. It best to slap the gear shift in the manual mode and shift your own gears”
A deeper look into the verbatim describing the Attributes can provide valuable hints about the language that resonates with consumers in this market segment, and opportunities for targeted communication messages that clearly differentiate your product from its competitors.
One of the interesting challenges Marketers are charged with is making their product or service stand out in the minds of potential customers. Those who are not blessed with analytical talents commonly slide into the well-beaten path of differentiating by specifications or price. These approaches do not really require any expense or curiosity to seek a deep understanding of the customers, but they ultimately lead to erosion of profit margins and brand equity. If you, the brand “owner,” don’t care about the customers, the customers don’t care about your brand. Advertising alone could carry brands a great distance in the “good, old days,” but in the age of the Social Customer, an advertising message is expected to resonate with customer needs or it will cause more harm to the brand image and product sales than good. When it comes to product reputation or brand equity, the notion that “any publicity is good publicity” is not the best strategy.
None of this is new to most marketers, and some companies are spending serious money to develop processes for the discovery of consumer/customer insights. However, most are struggling to convert the findings into specific actions. Measuring the financial impact of these actions seems to be an even more formidable challenge. I would like to explore these challenges and perhaps offer some ideas for dealing with them.
Many marketers today are too insulated from their customers to develop a true, genuine understanding of, and empathy for, customer experience with the products or services they market. One of the reasons is the use of outdated market segmentation methodologies based on demographic data, which were developed to help marketers quantify and forecast but do not provide much help in understanding needs and discovering opportunities for differentiation. More evolved methodologies that attempt to develop customer “personas” are much more helpful in learning the needs of specific, pre-defined groups of customers. Scott Sehlhorst of Tyner Blain offered a wonderful explanation of how such a methodology can be used.
Using both of these approaches together is likely to improve your product’s traction, but it will fall short of the true understanding you need to differentiate your product, because everything you have learned so far is based on your own original assumptions. You start with a hypothesis about who your potential customers are, what functions and features they would like in your product, and how much they will pay for it. Then you proceed with validation and advisory activities that confirm or refute your assumptions with varying degrees of certainty. However, you still don’t know whether the group, and the personas within it, are your best potential customers, since you cannot possibly validate that against every potential segment. Additionally, I don’t think it is possible to differentiate effectively, whether by design, packaging, or message, without ultimately understanding how customers experience the product. All the steps you have taken so far cannot give you this knowledge, for two reasons:
You have started at the “wrong” place: demographic segmentation of the market is the wrong starting point. A much better starting question is: what products or services are my future best customers “hiring” today to do the job they need done? I use here the terminology and concepts developed by Clayton Christensen. If you are not familiar with his work, check this video, where he explains why the basic thinking taught in business schools and promulgated by consultants is killing innovation and the US economy.
Any knowledge of customer preference you have gained so far is company-biased, because it was obtained through methods of inquiry and/or moderation. Whoever frames the question or selects the subject of discussion ultimately influences any possible outcome. I do not believe there is such a thing as unbiased research, and for the purpose of learning how a customer experiences a product or service, I prefer the customer’s bias to the company’s. That is because, regardless of our opinions, the customer’s experience is what drives the choice between your product and a competitor’s.
I am not dismissing the value of traditional methodologies offhand, but I am suggesting that substantially better results can be achieved by triangulating them with true insights into customer experience. There is plenty of customer-generated content available online for aggregation and analysis. However, even if you find it difficult to locate good data, we have had very good results by asking customers wide-open questions designed not for validation and easy tabulation, but to help them tell their stories:
What made you interested in product XYZ?
How and where do you use it?
What was your experience so far?
Let them know that you are asking because you want to learn how to make their experience better, and promise that you will share the results of the study with them. Most people are motivated and willing to help. These types of questions are traditionally reserved for qualitative research, which in the past was considered expensive and whose results are often dismissed as not statistically representative because they normally come from a small number of customers. Those who try to find insights manually in large volumes of data quickly get overwhelmed, “drinking from a fire hose.” However, advances in opinion mining technologies have significantly reduced the cost of high-volume content analysis, and can offer the benefits of qualitative research along with statistically representative numbers to back up the value of the insights. In the words of Clay Shirky, “It’s not information overload, it’s filter failure.”
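To make the idea of filtering high volumes of customer stories concrete, here is a minimal sketch in Python of how free-text responses could be crudely scored and summarized. This is an illustration only, not any particular opinion-mining product: the lexicons, sample stories, and function names (`score_story`, `top_terms`) are all hypothetical, and real tools use far more sophisticated language models than word lists.

```python
# Minimal sketch: a naive filter over free-text customer stories.
# All lexicon entries and sample stories below are invented for illustration.
from collections import Counter

POSITIVE = {"love", "easy", "great", "reliable", "fast"}
NEGATIVE = {"broken", "slow", "confusing", "returned", "disappointed"}

def score_story(text: str) -> int:
    """Return a crude sentiment score: positive word hits minus negative hits."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def top_terms(stories, n=3):
    """Surface the most frequent non-trivial words across all stories."""
    stop = {"the", "a", "an", "it", "i", "to", "and", "was", "is", "my"}
    counts = Counter(
        w.strip(".,!?").lower()
        for s in stories
        for w in s.split()
        if w.strip(".,!?").lower() not in stop
    )
    return [term for term, _ in counts.most_common(n)]

stories = [
    "I love how easy the setup was.",
    "Shipping was slow and the manual is confusing.",
    "Great battery life, very reliable on my commute.",
]

print([score_story(s) for s in stories])  # → [2, -2, 2]
print(top_terms(stories))                 # most frequent terms to inspect by hand
```

The point of the sketch is the division of labor the paragraph describes: the machine does the cheap, high-volume filtering (scores and frequent terms), while the marketer reads the surfaced stories and does the interpretation.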
Good use of the right technology can provide a marketer with a substantial and representative number of clues and hints as to how customers think and feel about their experience with a given product or group of products. However, no automation or outsourcing can replace your creative power in interpreting these clues into actionable insights. You can see examples regularly published on our Google+ feed.
The language customers use to describe their experience will also show you how to communicate with the market in a way that resonates and connects on an emotional level.