Customer Satisfaction: Results from February 2008

The purpose of the Customer Feedback Survey is to help us better understand what our customers think of the service they receive from the Customer Support Centre (CSC). Customer feedback is used to identify areas for improvement within the department, and it also provides a feedback mechanism on the performance of individual agents.

7,538 surveys were sent (via email) to PlusNet customers who had called the CSC. 3.0% of the surveys were completed, which represents a sample of 0.5% of all customers who had telephone contact with a CSC agent last month. It is important to highlight these statistics so that the data presented in this report can be taken in context: although the results gathered from the survey are a random sample of customers who have spoken to a CSC agent, they do represent a very small sample of the total calls handled within the CSC during the month.

The graphs below show a selection of the questions we ask customers, to demonstrate the differing types of feedback we receive each month.

The first graph shows the types of query our support team deals with on a daily basis. As you can see, the majority of the calls we receive are consistently technical calls. We are seeing a slight increase in faults calls for the third month in a row, which could well be due to the poor weather over the winter period.

If you have to call us for help or advice, you want the call answered as quickly as possible. There has been a big push within the CSC to drive down the longest wait time, and this is reflected in the graph by the increase in customers saying their call was answered quickly. This figure is now nearly 85%, up from 67% last month.

The wait time is shown in more detail on the next graph. Unfortunately, the number of customers waiting more than 10 minutes for an answer has increased slightly.

As you can see, there has been a dramatic decrease in the number of customers waiting more than 5 minutes for their call to be answered. This is down mainly to a huge amount of hard work from within the CSC.

Different customers have different knowledge levels, so it is important that we adjust our approach to ensure we can help each person as much as possible. Last month this figure was just under 91%; it has improved this month to 93%.

If you have a problem with your connection or services, you want the person helping you to understand the frustration this can cause. This is why we encourage our agents to empathise with customers on the phone. This month the figure for customers very or extremely satisfied with the level of empathy has increased from around 70% to over 75%.

It's equally important that our agents know what they are talking about and are confident in doing so. Again, well over 80% of customers were fairly satisfied or above with the agent's level of confidence.

We ask customers to rate the overall performance of the agent they spoke to. On the whole this is very high, but there is always room for improvement. The QoS and training in the CSC are paying off, as the proportion of customers rating the agent's performance as very or extremely satisfying has increased by 5% to 76.3%.

After last month's slight drop to 60%, there has been a lot of hard work at PN Towers analysing the results and identifying the main areas for improvement. I'm delighted to report that we have hit a new high of 67.1%.

I hope you enjoyed reading this, and I look forward to next month's survey.

Chris.

0 Thanks
4 Comments
361 Views
Grafter
Why aren't ticket responses monitored, with the same customer satisfaction tests performed on them? Why only calls?
Community Gaffer
That's a good question Cheesy The questions are structured to ascertain how the agent performed on the phone, and some of them would be irrelevant for ticket responses. When customers close a ticket after their issue is resolved, they can rate the response and add comments if required; these are picked up and checked for quality to ensure that any poorly rated tickets are improved upon in future.
Newbie
Ticket responses are currently monitored separately and assessed in a different way. Upon closing the ticket, the customer is asked to rate the support they received via the "smiley face" method, which we also use on the support pages. This feedback is reviewed daily by the managers in the business, and improvements and training issues are identified. This is then fed back to the agents concerned and improvements made. I am sure a blog is required here, so I am sure a TSM will pick this up and run with it. Carol