CSM Website


The Forum for Customer Service Managers & Professionals
Customer Service Manager Forum / Customer Service Forum
 

Customer Service Statistics

 
Author dscottcarlisle
Member 
#1 | Posted: 7 May 2008 12:42 
Hello,

I am a new Associate Director of Customer Service. We currently have no measured statistics in our group, and I would like to begin tracking some. What would you consider to be important internal measurements as they apply to customer service? The typical and easy ones would be number of calls, average call length, abandoned calls, wrap-up codes, and so on. Are there any others that reflect the size of our group or the quality of the service given? Are there any published industry standards? I am in the retail paint industry.

Thank you for the feedback.
Scott

Author Gruzlick
Member 
#2 | Posted: 13 May 2008 13:37 
I think you should start with a customer satisfaction survey.
I suggest this web survey service: http://www.esurveyspro.com/

Author ayaree
Member 
#3 | Posted: 13 May 2008 17:28 
Scott, there are two ironies that I am experiencing. One is that I sifted through this site to look at that very question this morning to think/re-think something I have been working on. The other is that I saw a link to "Customer Service Metrics" in the sponsored panel of this page (ha ha).

Gruzlick, my take on surveys is going to look like I am taking you to task or putting you on the spot, but please don't take it that way. If there is another way to look at them, I would like to see your reply, as I have a lot to learn (or re-learn) from a metrics management standpoint. (Any standpoint, really, but can you promise to keep that a secret?)

I have noticed people talking about surveys as a tool for a sleeves-rolled-up, start-up or revamp context. I am not going to say that is the wrong way to go AT ALL, but it is worth pointing out that we do not in fact see every flower in the field if we rely on the surveys that get turned back in to us. It makes sense that customers who are incredibly miffed will devote the energy required to fill us in on their bad experiences. It makes equally good sense that other customers in the same miffed state will choose not to so much as touch the paper the survey was printed on. And one more thing that makes sense: in my very best of moods, under the most livable weather imaginable, I did not give an overcooked tofu burger about a survey somebody wanted me to complete, even after I was thoroughly satisfied with the service the organization came together to pull off for me. (My life is good; please don't make me do "homework" as a customer; this is not my job.) Satisfaction can show up as non-responsiveness to surveys, and that does not mean performance was junk! So the returned results of satisfaction surveys are of limited value to me, from my own sense - and I will point out that I bring no science to that statement.

If I had to initiate the use of metrics in an organization and had nowhere to begin, I would think about what a good picture of customer satisfaction looks like in the context I am dealing with. What is considered a quality product? What are the typical reasons that product does not meet standards? What is considered "on time"? Why is it ever not on time? When do you have to get back to somebody on the phone or through some other communication channel and show you are there for them? What is considered an "issue"? What are the types of "issues," how many are there, and how long before they are eliminated? Having to re-do and re-ship something is an "issue," right? How many of those are there, and when does that stop seeming like no big deal? Do customers come back? Do you know why? Did you add staff somewhere? Did that do anything this month compared with three months ago? What are the activities people perform that should be tracked and that can easily flow into a report?

Then I guess there would have to be a look at what is considered "good" on any or all of those items, and then an attentive and supportive management puts actions in place to bring the instances that fall LESS than that "good" up to the mark.

Not sure whether the paint industry (B2B? B2C?) allows for my line of thinking about how to measure customer service levels, but it would be good to hear back from you some further thoughts all the same.

:)

Author AnnaKn
Member 
#4 | Posted: 27 May 2008 12:44 
Hi Scott,

We measure contacts per live working hour (if you have 6 live hours a day, it's calls divided by 6, not 8, so you aren't penalized for time we ask you to spend reviewing, studying, etc.). We ask our analysts to try to resolve issues with one incoming and one outgoing contact (our CRM helps us track this). For issues that remain open, we ask that they contact the client AT LEAST every other day to ensure they are satisfied.

I work in a software support center, so some of these things are more applicable to that world, but I think the overall ideas (quick response, regular contact) are good for any call center.
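The live-hour normalization described above could be sketched like this (a minimal illustration; the function name and sample numbers are made up for the example):

```python
# Contacts per live working hour: divide handled contacts by the hours an
# analyst is actually available for contacts, not the full shift length.
def contacts_per_live_hour(contacts_handled: int, live_hours: float) -> float:
    """Normalize contact volume by live (available) hours, not shift hours."""
    if live_hours <= 0:
        raise ValueError("live_hours must be positive")
    return contacts_handled / live_hours

# Example: 42 contacts in an 8-hour shift that includes 6 live hours.
# Dividing by 6 (not 8) avoids penalizing time set aside for review/study.
print(contacts_per_live_hour(42, 6))  # 7.0
```

The point of the design is that the denominator tracks availability, so analysts are not punished for scheduled off-phone work.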

Hope that helps!
Anna

Author barrymckeich
Member 
#5 | Posted: 2 Jun 2008 09:26 
Hi All,

I think Scott's question is a valid one; it is difficult to know where to start when there is so much you might want to know (as pointed out by Ayaree!). I work for a market research company called Aura Corporation, which specialises in customer feedback research, and although I know that customer 'surveys' are not the answer to everything, I think Ayaree actually counter-argued his own point.

Scott, if you are looking for results on how well your staff are performing in terms of responding to customers and so on, then Anna's advice is very much what you are looking for. However, if you are more interested in whether a customer is happy with the service they have received, and whether they will recommend your company or even use it again, then customer feedback is essential.

Although I agree with Ayaree up to a point - it is very easy to do bad surveys - it is clear from the rest of his explanation that he acknowledges there are a lot of questions where the only person who has the answer is the customer themselves.

The only way to find out what the customer thinks is to ask them! This can be done in many different ways, though, and it is important to use the best approach for each customer. I won't go into too much detail, as I could go on forever. If direct customer feedback is what you are after, I would advise speaking to a market research professional to understand what might or might not work for your business.

Let me know if you need any further details on this.

Hope this helps

Cheers

Barry

Author ayaree
Member 
#6 | Posted: 20 Jun 2008 19:39 
Barry, I regret not seeing this sooner, I LIKED your response. And I was forced to look back at what I had said back then. (Sometimes I enjoy reading what I wrote as much as watching myself on video, meaning not so much, but this time was not too bad.)

What I don't understand is where I have necessarily counter-argued my own point. I gave a litany of examples of where you could look at performance, after my personal ad-lib section on "what I would do with a survey, and is a customer also like me?" (where I ask how much you really get out of surveys if you do not get full participation in them). My litany of examples is about thinking inside an organization about what to do to improve or organize, and you can do that without conducting a "survey proper" - a thing you send out, wait to come back, and compile. My examples can be things learned through incoming complaints from end customers, and also internal incidents that are reported. If you are counting incoming complaints as the voice of a customer who has an experience and calling that a "survey," then I catch your drift, but that is not a "survey" you send out; that is reacting to a problem that was reported.

What I would not want is for you to gather from me that I don't think surveys mean anything, and this is why I mentioned that I didn't bring science to my answer. I have no doubt there is something I can learn from you, whether it is voluntary outbound surveys or other ways to mine information.

At the same time, I am reminded of he-said/she-said instances where what a customer had to say did not reflect fact. For example, a customer can report that they did not receive something, while internally it looks like non-receipt could not have happened. A good customer-facing perspective does not assume that what the customer identifies as truth fails to match reality (we're not calling a customer a LIAR). But it is within the realm of possibility that the customer who reported non-receipt of something actually DID receive it, threw it in the garbage, and still called in to complain that he did not get it. That very thing falls within my customer service experience.

What the customer reports is indicative of SOME kind of problem, but NOT necessarily directly related to performance in any way tied to my organization. In this case, what the customer says about my service is not quite like a "bible," whether he called in to complain or took the time to complete a survey for someone to compile. The "bible" is in what really happened.

Even "garbage" data of any kind does inform and lead to an understanding of some greater problem, and you could find out that a customer has employees who throw things away or the customer has a bad rapport somewhere or some other thing outside the unit of performance that is being observed. And there are ways to get past those episodes.

It would make sense to me that you would have expertise in how to WEED OUT factors such as these, given your background. But I still haven't arrived at where you begin to disagree with me. There is definitely a wealth of information that can be had (or gleaned) from a customer in an infinite number of situations (including our peers, bosses and employee team members). I get that on my worst day. How is something that is not true worth factoring in on any day from a performance metrics perspective on my best day?

Thanks a lot, Barry.

Author kbininger
Member 
#7 | Posted: 8 Jul 2008 07:18 
I agree with one of Ayaree's key points, which I think targets the answer. We first must fully understand what constitutes great service to the customer. We then must measure, on our own, our performance against those metrics. We should not need to ask the customer if the phone was answered quickly, if the delivery date was met, etc.

Keith

Author KarenSB
Member 
#8 | Posted: 10 Jul 2008 15:07 
Yowzer!! Poor Scott has only received one valid response to his actual question as this quickly disintegrated into a discussion about customer satisfaction surveys.

Calling all call center managers - - - can you help Scott out? What other metrics are you using to measure performance and productivity?

Scott, I know that there have been several requests similar to yours over time. If you don't get additional responses, I suggest you look back through the history here. I know you'll find more.

Good luck,
Karen

Author ayaree
Member 
#9 | Posted: 11 Jul 2008 19:43 
Karen, you're right, the survey tangent did dominate. I did try to address the question itself in my first reply, but I don't want to look at what I said too closely (it's really verbose - and I am not altogether happy with the way I went back and forth with Barry either, but that's another topic).

But on original questions:

It is fair to say there is a degree of drive-by questioning when we don't actually see the original questioner return.

And for those instances where I am willing to participate here (if not over-participate - and I do give thought to that), I think I should give in to the temptation to talk through the tangential. For one thing, tangents can increase participation, and there might be some benefit when someone decides to chime in.

I would rather watch a Karen, or anyone "engaged," embark on tangents that emerge from topics than feel a need to treat an original question as though it came to a help desk, when the "customer" may not return. I also don't think the participants here should be treated as customers, even though I went to the lengths of troubleshooting a product question at one point, probably stupidly; the participants here ought to simply be people talking shop, in my view.

Author barrymckeich
Member 
#10 | Posted: 15 Jul 2008 04:14 
Apologies, Scott and Ayaree - this has not been overly helpful.

The point I was making earlier - and yes, I see what you mean now, Ayaree, so sorry - is that in my experience the measurement of customer service can sometimes be blighted by internal measures. I agree that an increase in customer complaints is a sign of falling standards - if it continues over time.

However, the only really sound measure of customer service levels is how well your customers perceive the service they receive. Unless all the complaints you receive are about the same subject, they are never going to reach a high enough volume (I hope) to make a change to processes viable. But if, through regular tracking of customer satisfaction, you can identify trends in service levels, then clear strategies can be implemented.

Internal metrics, as you described them Scott, can then be correlated with the customer feedback.

What to measure (or what metrics to use) can be determined via up-front focus groups and the like, to find out from customers what is important (i.e., is the phone being answered within 5 seconds important to customers, or just to internal management?).

As you can probably tell, I like to focus very much on the customer input, and this probably comes from the fact that I am not dealing with customers directly like most of you. However, if you know what a customer wants and then put in place systems and people capable of delivering those things, you won't go far wrong.

I hope this has been helpful, apologies if this has become more complicated than it needed to be.

Cheers

Barry

Author ayaree
Member 
#11 | Posted: 22 Jul 2008 19:59 
Barry, nice to see you back, and I wanted to jump in to say exactly that. I don't have a reply that speaks to all the best specifics right now, but I will give it a try later.

The fact that you are not directly involved with customers (talking with them) doesn't mean there is not something worth taking a look at. Will come back later this week.

Thanks :)

Author bokhorstp
Member 
#12 | Posted: 30 Jul 2008 05:56 
Hi Scott,

Firstly, I'm a bit surprised that you have not received many specific answers to your question - it cannot be all that complicated, I think.

To measure performance and productivity, you had better split the metrics into two focus areas: one for quantity (productivity), one for quality. You obviously need both to ensure customer satisfaction (which is what we are all here for).

Quantitative measurements could be:

* number of calls answered within 6-8 seconds (considered world-class)
* service level percentage (minimum 80%)
* average call length
* abandon rate (should not be over 3% of total calls)
* log-in time
* outbound calls / length
* number of orders placed

Quality measurements could be:

* call quality (record calls)
* complaint handling (soft skills)
* complaint logging and compliance (minimum 80%)
* customer satisfaction survey (measured consistently, weekly/monthly)
* order accuracy (target around 99%)
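As a rough illustration, the quantitative thresholds above could be checked from raw daily counts along these lines (the function, field names, and sample figures are made up for the sketch; only the 80% / 3% / 99% targets come from the list):

```python
# Compute a day's headline call-centre metrics from raw counts and compare
# them against the suggested targets (service level >= 80%, abandon <= 3%,
# order accuracy ~ 99%).
def daily_metrics(total_calls, answered_within_target, abandoned,
                  orders, accurate_orders):
    """Return service level, abandon rate, and order accuracy as percentages."""
    return {
        "service_level_pct": 100.0 * answered_within_target / total_calls,
        "abandon_rate_pct": 100.0 * abandoned / total_calls,
        "order_accuracy_pct": 100.0 * accurate_orders / orders,
    }

# Example day: 500 calls, 420 answered within the 6-8 second target,
# 12 abandoned; 300 orders placed, 297 of them accurate.
m = daily_metrics(total_calls=500, answered_within_target=420,
                  abandoned=12, orders=300, accurate_orders=297)
print(m["service_level_pct"])   # 84.0  (meets the 80% minimum)
print(m["abandon_rate_pct"])    # 2.4   (under the 3% ceiling)
print(m["order_accuracy_pct"])  # 99.0  (on the ~99% target)
```

Keeping the raw counts and deriving the percentages in one place makes it easy to report both quantity and quality figures from the same daily log.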


Hope this helps - if you need any further details, don't hesitate to ask.

Good luck with your new role !

Kind regards

Pim
