Service Desk Agent KPIs and Remediation

You can’t have a winning service desk without a scoreboard. The metrics matter and the numbers don’t lie. What’s more, there’s no reason you’d want them to if continual service improvement is part of your game plan. A winning service desk not only delivers on SLAs (i.e. the cumulative performance metrics of all agents supporting a particular client) but also monitors individual agent metrics, or Key Performance Indicators (KPIs), and provides guidance at every opportunity. For this reason, service desk agent KPIs are typically made available so agents can track their progress in real time and compare it with the rest of the team. More importantly, the team lead conducts weekly or monthly KPI reviews or one-on-ones with each agent and discusses remediation strategies for any metric where that agent falls short. While service desks may vary on which metrics take priority or what targets to set, the importance of each measurement, and how it depicts the health of the solution, remains constant.

Average Contact Handle Time: 6.5 Minutes or Less

Though a 6.5-minute average may seem like an arbitrary number, years of Level 1 support metrics in a standard MS Office product support environment consistently bear out that industry benchmark. Getting off the phones quickly does not run counter to high resolution rates. In fact, agents who habitually refuse to escalate incidents that require an on-site presence or additional access not only prolong resolution but may also extend the Average Speed to Answer for the next caller. Understandably, conscientious service desk agents abhor playing a log-and-route role and would rather work an issue for 10, 15, or 20 minutes or even more before escalating it; however, considering that during a 20-minute contact as many as four other users can’t reach that agent for assistance puts that approach in proper perspective. Even though the remaining agents in the queue will pick up the slack on the latter metric, the lead conducts KPI reviews with all agents and coaches them on how to improve handle time lest more fall behind.

Leads remind agents that when abandon rates and hold times increase, they jeopardize the measurable service levels the service desk is contractually obligated to meet. Contacts of that length properly belong with Level 2 support, which can be handled by the client’s internal IT groups or the service desk vendor’s Remote Level 2 team. As such, agents need to quickly triage all calls and emails, and if an issue is going to take upwards of 30 minutes to resolve, promptly route it to the appropriate team, which typically has no time restrictions.
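
To make the arithmetic concrete, here is a minimal sketch of how a lead might compute this KPI from exported call records. The field names, sample figures, and layout are assumptions for illustration, not any particular ACD or ticketing report format.

```python
# Minimal sketch: average handle time per agent against the 6.5-minute target,
# plus a flag for contacts long enough that they belonged with Level 2.
# Field names and sample values are illustrative assumptions.
from statistics import mean

HANDLE_TARGET_MIN = 6.5      # KPI target: average contact handle time
TRIAGE_CUTOFF_MIN = 30.0     # contacts expected to run this long should be routed

calls = [
    {"agent": "amy", "handle_min": 4.2},
    {"agent": "amy", "handle_min": 9.8},
    {"agent": "raj", "handle_min": 6.1},
    {"agent": "raj", "handle_min": 33.5},   # should have been escalated, not worked
]

by_agent = {}
for call in calls:
    by_agent.setdefault(call["agent"], []).append(call["handle_min"])

for agent, times in by_agent.items():
    avg = mean(times)
    status = "OK" if avg <= HANDLE_TARGET_MIN else "review in next one-on-one"
    print(f"{agent}: avg handle {avg:.1f} min ({status})")
    for t in times:
        if t >= TRIAGE_CUTOFF_MIN:
            print(f"  {agent}: {t:.1f} min contact - candidate for Level 2 escalation")
```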

During KPI reviews, team leads also play back the calls and coach agents on how to improve their resolution dialogue. All too often, longer handle times are caused by agents not knowing the right questions to ask to identify the root cause. For other agents, the problem is a lack of sensitivity to how much small talk the end user is willing to engage in before getting down to troubleshooting. And wherever additional technical or procedural training is required to speed up resolution, the team lead schedules those sessions immediately.

Average Wrap-Up: 90 Seconds or Less

Service quality means more than swift and friendly resolution of incidents using in-depth technical knowledge. Completing the definition is accurate documentation of those activities with thorough attention to detail. Otherwise, service desk reporting, root cause analysis, and knowledgebase development will suffer. Moreover, if incidents are escalated, detailed ticket information, including contact details and previously attempted troubleshooting steps, must be included. Especially in instances where a warm transfer is not possible, thorough ticket notes enable the follow-up technician to pick up where the service desk agent left off. As with all KPIs, the agent must strike a balance between speed and accuracy. For instance, if an agent lingers in wrap-up mode, he or she misses the next inbound contact, adversely impacting ASA and abandon rates, both contractual measures (SLAs) with broader consequences. So, in order to maximize their availability, agents often enter detailed ticket information as they speak to the end users and keep wrap-up time within that 90-second target without shortchanging the documentation.
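
As a rough illustration, the same kind of quick check works for wrap-up time; the sample durations below are assumed values, not real data.

```python
# Minimal sketch: average wrap-up time against the 90-second target,
# plus a count of contacts where documentation time ran long.
# The data layout and values are assumptions for illustration only.
from statistics import mean

WRAP_TARGET_SEC = 90

wrap_times_sec = [45, 70, 88, 150, 60]   # wrap-up duration per contact, in seconds

avg_wrap = mean(wrap_times_sec)
over_target = sum(1 for t in wrap_times_sec if t > WRAP_TARGET_SEC)

print(f"Average wrap-up: {avg_wrap:.0f}s (target <= {WRAP_TARGET_SEC}s)")
print(f"Contacts over target: {over_target} of {len(wrap_times_sec)}")
```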

Percent of VoIP Contacts Longer than 11 Minutes Total Handle Time (Talk Time plus Wrap-Up plus On Hold): 15% or Less

Sometimes a longer-than-expected contact just can’t be helped. If resolution is imminent, no service desk would instruct the agent to hang up on an end user the second the call surpasses the 10-minute mark. However, if too many calls run longer than 11 minutes, the team lead needs to dive deep into the ticket details, play back the calls, and possibly recommend additional training and/or a varied mix of clients handled (i.e. short process versus long technical contact types).
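
A short sketch of how that percentage might be calculated from contact records follows; the field names and numbers are assumptions for illustration.

```python
# Minimal sketch: share of VoIP contacts whose total handle time
# (talk + wrap-up + hold) exceeds 11 minutes, against the 15% ceiling.
# Field names and sample values are illustrative assumptions.
LONG_CONTACT_MIN = 11.0
CEILING_PCT = 15.0

contacts = [
    {"talk_min": 5.0, "wrap_min": 1.0, "hold_min": 0.5},
    {"talk_min": 9.5, "wrap_min": 1.5, "hold_min": 2.0},   # 13.0 min total
    {"talk_min": 4.0, "wrap_min": 1.0, "hold_min": 0.0},
    {"talk_min": 12.0, "wrap_min": 1.5, "hold_min": 0.0},  # 13.5 min total
]

long_contacts = [
    c for c in contacts
    if c["talk_min"] + c["wrap_min"] + c["hold_min"] > LONG_CONTACT_MIN
]
pct_long = 100.0 * len(long_contacts) / len(contacts)

print(f"Contacts over {LONG_CONTACT_MIN} min: {pct_long:.1f}% "
      f"({'within' if pct_long <= CEILING_PCT else 'over'} the {CEILING_PCT}% ceiling)")
```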

VoIP Average Speed to Answer: 5 Seconds or Less

This KPI measures the agent’s preparedness to respond to an inbound voice contact. The clock runs from the time a call is offered to an agent in the “available” state until the agent answers it (an offered call being one that is ringing and not dropped). Most service desk agents average 2.9 seconds to answer calls, with approximately 90% averaging less than 5 seconds. Considering the primary benefit of a first-point-of-contact help desk is being able to reach a remote agent quickly, ASA is a critical measurement. Unlike high abandon rates, which may be a consequence of insufficient staffing levels, ASA leaves the onus with the individual agent. If an agent is unable to bring this statistic within range on their own, leads can set up auto-answer until the agent becomes accustomed to quickly picking up a call.
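
For illustration, here is a minimal sketch of how ASA figures like these could be summarized from answer-time data; the sample values are assumed, not measured.

```python
# Minimal sketch: average speed to answer and the share of offered calls
# answered within 5 seconds. Values are assumptions for illustration.
from statistics import mean

ASA_TARGET_SEC = 5.0

answer_times_sec = [2.1, 3.4, 1.8, 6.2, 2.9, 4.7]   # seconds from offer to answer

avg_asa = mean(answer_times_sec)
pct_within = 100.0 * sum(1 for t in answer_times_sec if t <= ASA_TARGET_SEC) / len(answer_times_sec)

print(f"Average speed to answer: {avg_asa:.1f}s")
print(f"Answered within {ASA_TARGET_SEC:.0f}s: {pct_within:.0f}%")
```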

Customer Satisfaction Survey Rating: 98% or More

At the conclusion of an incident, the service desk can issue an electronic survey asking end users to rate agent performance in various categories with a simple thumbs up or thumbs down on overall satisfaction. Using these parameters, it is not unusual for a majority of service desk agents to maintain perfect 100% scores from month to month, but even the best of them may eventually mishandle an incident or fail to meet a particularly critical user’s expectations. Should the service desk receive negative feedback from a satisfaction survey for any reason, it is the team lead’s responsibility to listen to the call, review the ticket, and circle back with both the end user and the agent to discuss how the incident could have been handled better. Due diligence tasks aside, high satisfaction scores are often the result of the agent meeting or beating all other KPIs with regular input from the team lead. When those results are favorable, there will be no greater compliment than to have end users take the time to highly rate their support experience and acknowledge a job well done.
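
As a final illustration, a minimal sketch of how the survey rating might be tallied and negative responses flagged for the team lead’s follow-up; the ticket numbers and layout are assumptions.

```python
# Minimal sketch: overall satisfaction rate from thumbs-up/thumbs-down survey
# responses, with negative surveys flagged for lead review.
# Ticket numbers and layout are illustrative assumptions.
SAT_TARGET_PCT = 98.0

surveys = [
    {"ticket": "INC1001", "thumbs_up": True},
    {"ticket": "INC1002", "thumbs_up": True},
    {"ticket": "INC1003", "thumbs_up": False},  # lead reviews call and ticket, then follows up
    {"ticket": "INC1004", "thumbs_up": True},
]

sat_pct = 100.0 * sum(s["thumbs_up"] for s in surveys) / len(surveys)
print(f"Satisfaction: {sat_pct:.1f}% (target >= {SAT_TARGET_PCT}%)")

for s in surveys:
    if not s["thumbs_up"]:
        print(f"Follow up on {s['ticket']}: review the call and ticket, then circle back with the end user and agent")
```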
