In a slumping economy, any improvement in customer service equates to a competitive edge, and gathering customers' views of a service is the best way to measure, and improve, both customer satisfaction and the contact centre's revenue contribution.
Analysts agree on the value of surveying customers and point to three primary factors as to why contact centres should offer post-call satisfaction surveys more frequently:
* Improving the customer experience has risen to a board-level focus. This focus includes strategic initiatives such as Six Sigma, ISO 9000 and others to improve service processes, as well as the business processes behind them.
* In addition to contact centre managers and agents, the internal audience for customer feedback has been extended to business users and departments such as tech support.
* More channels are now available through which to collect feedback: IVR, call recording, e-mail, Web chat, e-services, instant messaging, and partners.
As a measurement tool, post-call surveys let contact centres collect customer feedback more diligently and use the resulting data to pinpoint lagging satisfaction drivers. Surveys also supplement metrics like call times and hold times to measure drivers such as agent availability and routing calls to the right person or department. Most importantly, though, customer surveys can help take service levels from “more than satisfactory” to superior.
Speech-enabled IVR surveys
Because consumers still prefer the phone by a wide margin over e-mail and Web interactions, post-call surveys have become the preferred method of customer feedback data collection.
Moreover, speech engines and intelligent new survey applications integrated with IVR systems have made it easy to implement surveys, automate the process (leaving agents out of the equation), and also automatically transfer callers to surveys, record and report on results.
In many cases, this integrated approach costs less than hard-to-integrate third-party systems or rudimentary survey applications developed in-house.
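The automated flow described above - transferring the caller to a survey, playing questions, and capturing keypad responses without agent involvement - can be sketched as a simple dialog loop. This is a minimal illustration only: the function and parameter names are hypothetical, and a real deployment would use the IVR platform's own call-control and prompt APIs.

```python
# Minimal sketch of an automated post-call survey dialog loop.
# All names here are hypothetical, not a real IVR vendor API.

QUESTIONS = [
    "How satisfied were you with the agent's knowledge? Press 1 to 9.",
    "How satisfied were you with the overall service? Press 1 to 9.",
]

def run_survey(play_prompt, collect_digit):
    """Play each question, validate the keypad response, return the answers.

    play_prompt  -- callable that speaks a prompt to the caller
    collect_digit -- callable that returns the next DTMF digit as a string
    """
    answers = []
    for question in QUESTIONS:
        digit = None
        # Re-prompt until the caller presses a valid key (1-9).
        while digit not in set("123456789"):
            play_prompt(question)
            digit = collect_digit()
        answers.append(int(digit))
    return answers
```

Injecting the prompt and digit-collection callables keeps the survey logic testable separately from the telephony layer, which is also where recording and reporting hooks would attach.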
10 best practices for IVR surveys
As for the makeup of a survey itself, CFI Group offers a few best practices from its American Customer Satisfaction Index, published annually by the University of Michigan and described by the New York Times as the “definitive benchmark for how buyers feel”.
1. Use scientific questionnaire design. “Did you find the agent knowledgeable and experienced?” This type of question addresses two issues at once, but fails to capture meaningful information about either attribute. Make sure each question maps to a single issue.
2. Define the goal. Set objectives upfront and don't deviate. If the goal is to collect feedback regarding an agent's knowledge level, don't let marketing add a question about a new giveaway promotion. A clearly focused survey also respects the customer's time.
3. Keep the surveys short. The goal of a survey is to collect actionable information. While there are no established rules for time frames, post-call surveys of two to three minutes have proven to collect enough information to be useful and still hold the respondent's interest.
4. Measure what matters. Say a survey asks travellers to rank an airline by “important characteristics” and respondents put safety first. Safety, however, is not what ultimately compels travellers to choose a particular carrier - price, schedule, and frequent flyer rewards programmes do. So, stick to real cause-and-effect analytics.
5. Use the right scale. A 10-point scale is standard for most consumer surveys, but for IVR surveys ACSI recommends a nine-point scale using the keypad digits one through nine, to shield respondents from “slow-finger” coding errors (keying 1 and then 0 too quickly for the IVR system to capture). With ACSI, nine-point scales are converted to 10-point, and final scores are converted to a 100-point scale for reporting and “rightmarking”.
6. Don't strive for a benchmark - find the “rightmark”. Never mind industry average benchmarks or those of current best performers. Integrate operational targets and satisfaction data to find the point where service and delivery converge for your customer base.
7. Co-ordinate with IT. Trust me, IT teams need advance warning of a survey initiative... and the company will need their buy-in. Integrate speech engines, survey apps, IVR systems and call recorders. Store and distribute survey data and make it more accessible.
8. Don't use survey results to evaluate individuals. Instead, use results to coach an agent and tailor a training regimen for improvement if scores are low. (Recording of high-scoring agents can be a helpful training tool.) If possible, let your IVR system automatically transfer callers for post-call surveys to prevent agent intervention and bias.
9. Report often and make results accessible. Survey results should ideally be made available to users as soon as a survey completes, whether via the IVR/survey solution or through dashboards, “heads-up” displays and other real-time (or near-real-time) alerts.
10. If the company's invested, make it work. Finally, don't lose sight of the fact that understanding customer satisfaction improves the contact centre. The company has spent the time and money to collect the data, now put it to work.
In the name of customer retention and revenue, feedback collected using post-interaction surveys provides a more reliable measure of how satisfied - or dissatisfied - customers are with a product, with the service they've received, and with the organisation as a whole.
* Dave Paulding is regional sales manager, UK and Africa, for Interactive Intelligence.