Voice Of Customer Data Worth Collecting And Analyzing Despite Challenges

Analyzing customer opinions can provide valuable insights. This knowledge, however, is not easily gained. Knowing how to collect this information without biasing it, and how to aggregate qualitative (non-numeric) data, can be daunting. Chris Cottle, executive vice president of marketing and products for Allegiance, offers thoughts on best practices in aggregating and using customer opinions.

Chief Marketer: Where has the movement toward voice of customer (VOC) data come from, and why has it recently become an area of intrigue for marketers?

Cottle: So much of what customers are saying about a business may never be said to the business. And the words they use can carry a lot of weight. They have influence and sway. They can damage or bolster a brand. And the power to do that is readily available to a consumer in ways that were never there in the past.

CM: Are some channels for collecting customer voice information more reliable than others?

Cottle: The most reliable is also the most traditional – surveys. But even the nature of surveys is starting to change. They're shorter, more customer-focused and user-friendly. They're also administered closer to the moment of truth. In a retail environment, if you ask the person how the shopping experience was two weeks later, it's not effective. They don't remember it. People are busy. In today's environment you have to ask them very quickly.

This is the biggest example of a difference between business-to-business and business-to-consumer surveying. In B2B, there's generally more time. It's a different kind of relationship. Business-to-consumer marketing is very quick, and in many cases the consumer's loyalty is based on little other than price. Because there is so much choice, and it is so easy to click away or walk across the aisle, [customer-focused] businesses need to measure that customer experience quickly.

CM: What are some best practices in customer survey design and deployment?

Cottle: Surveys of the past were built around what market researchers ideally needed, with little thought about what customers taking the survey were experiencing. They were long, with awkward and cumbersome wording. Today the wording is different and friendlier. Surveys are designed to be a good and quick experience.

Marketers deploy different strategies to get information they need. [For instance] they may break it into small surveys and use sampling techniques.

They're also making greater use of unstructured questions – open dialog questions. Through text analytics, market researchers are able to ask more open-ended questions. That improves the customer experience. They can ask general questions and get quantified data out of that.

In the past, practitioners both liked and hated [these questions]. They had to go through the responses manually. But it is the area with the most gold, [where you'll find] anything you haven't perceived or predicted they would say.

Text analytics has made that possible. You can roll it all up into dashboards and reports, and it will reveal patterns across a lot of different data, including open-ended text, unsolicited feedback and operational data.
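[Editor's note: As a rough illustration of what "rolling up" open-ended text can look like, the sketch below tags each free-text comment with a few themes via simple keyword matching and counts how often each theme appears. The themes and keywords are invented for the example; commercial text-analytics tools go far beyond keyword lookup.]

# Toy illustration: quantifying open-ended survey comments by tagging
# each response with keyword-based themes (invented for this example)
# and rolling the tags up into counts a dashboard could display.
from collections import Counter

THEMES = {
    "price":   ["price", "expensive", "cheap", "cost"],
    "service": ["staff", "service", "rude", "helpful"],
    "speed":   ["slow", "fast", "wait", "line"],
}

def tag_response(text: str) -> set[str]:
    """Return the set of themes whose keywords appear in the comment."""
    lowered = text.lower()
    return {theme for theme, words in THEMES.items()
            if any(word in lowered for word in words)}

def roll_up(responses: list[str]) -> Counter:
    """Count how many comments touch each theme."""
    counts = Counter()
    for response in responses:
        counts.update(tag_response(response))
    return counts

if __name__ == "__main__":
    comments = [
        "The staff were helpful but the line was slow.",
        "Too expensive for what you get.",
        "Fast checkout, fair price.",
    ]
    print(roll_up(comments))  # counts of comments per theme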

CM: Why not just rely on operational data?

Cottle: There are lots of erroneous reads. Marketers have gone astray looking at one data source or cherry picking the comments. I wouldn’t want to rely only on operational or survey data.

CM: You mentioned dashboards. What elements of a VOC dashboard are truly useful, as opposed to just nice to have?

Cottle: It depends on what a marketer is looking for. A dashboard will, in a simple way, help unite the executive team around the most important drivers of the business as they relate to VOC. It will contain business metrics and scores that roll up into simple presentations, delivered in a timely manner.

I don’t believe many businesses on the B2B side have scores and data that the CEO should be looking at daily. Their business typically doesn’t change that much, so it could be a cadence of weekly, in some cases monthly. Whereas in B2C, where there are large numbers of customers who make purchases at low price points, who are not committed to a business, and who could slip away easily, [daily updates] might be of more interest.

CM: Are there survey questions, beyond basic satisfaction ratings, that are most useful?

Cottle: A lot of the newer programs are being constructed around a thing called driver questions. With these, there is a scoring question – rank us – but beyond that there are questions that will help you understand the reasons someone gives you a worse ranking. Having a simple score, like a satisfaction score, is only the first step.

By knowing the drivers [of consumer evaluations] you can connect the dots and make change. If I had a five-question survey, one could be a scoring and ranking question, and the others would be built around those drivers. Why did you give us this score?
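[Editor's note: To make the driver idea concrete, here is a minimal sketch, with invented driver names and made-up ratings, that checks how strongly each driver question tracks the overall score. Real key-driver analyses typically use regression over much larger samples; this only shows the connect-the-dots step.]

# Toy key-driver analysis: compute each hypothetical driver question's
# correlation with the overall score to see which drivers move it most.
from statistics import correlation  # requires Python 3.10+

overall = [9, 4, 7, 3, 8, 5]                 # the scoring question
drivers = {
    "checkout_speed": [8, 3, 7, 4, 9, 5],
    "staff_courtesy": [9, 5, 6, 2, 8, 6],
    "price_fairness": [5, 6, 7, 5, 6, 5],
}

for name, ratings in drivers.items():
    r = correlation(overall, ratings)        # Pearson correlation
    print(f"{name:15s} r = {r:+.2f}")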