Measuring Quality – What Are Key Performance Indicators?
By Leigh Wilkinson, Performance Management
In order to measure the quality of a service, we first have to set a standard. Once the standard is set and agreed to by all the stakeholders, we can measure our performance against the stated goal or standard. In Information Technology, one such standard is an SLA, or service level agreement. An SLA states the goals for performance and the standards by which IT workgroups are measured.
One method used to track performance against goals is to measure what are known as Key Performance Indicators (KPIs). A complete set of KPIs tells us where we are performing well and where we need to improve. In the airline industry, for example, the number of times a flight is delayed in departing from the gate is a valuable KPI. If flight delays increase, so do related costs, and customer satisfaction declines.
According to David Parmenter, a leading expert on measuring performance, a KPI will have very specific characteristics:
- it is a non-financial measure
- it is measured frequently (daily, weekly, or monthly)
- it is acted on by the CIO or senior management team
- all staff can understand the metric and the corrective actions required
- it ties responsibility for results to an individual or workgroup
- it significantly impacts success in implementing high-level goals
Over the last few months, OIT leaders have been working to define the KPIs for their workgroups. We have been discussing questions such as:
- How do you measure your workgroup's success at providing service on a daily or weekly basis?
- What is the source of those measurements and how frequently are reports created?
- How do these KPIs relate to your stated goals and standards as defined in the OIT Service Catalog?
So far, we have found that, within our world of IT, KPIs are not as obvious or easy to measure as one might think:
- We have some very sophisticated reporting tools, such as Web N'M, but they are not yet configured to produce frequent reports and performance trends. We do an excellent job of managing current, in-the-moment performance but have not developed a reporting sequence to track short- and long-term performance.
- Footprints is used to track help tickets and workflow, but some of the critical metrics, such as escalating tickets based on their age or the time spent working toward resolution, are not enabled in the system. So much variability has evolved in the primary Footprints projects that many KPIs are not measurable.
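To make the idea of age-based escalation concrete, here is a minimal sketch of the kind of rule described above. This is a hypothetical illustration, not how Footprints itself works; the threshold values and function names are assumptions for the example.

```python
from datetime import datetime, timedelta

# Hypothetical escalation thresholds (assumed values, not Footprints settings).
ESCALATION_AGE = timedelta(days=5)    # escalate tickets older than 5 days
ESCALATION_WORK = timedelta(hours=8)  # or with more than 8 hours of work logged

def needs_escalation(opened_at, work_logged, now=None):
    """Return True if a ticket should be escalated based on age or effort."""
    now = now or datetime.now()
    return (now - opened_at) > ESCALATION_AGE or work_logged > ESCALATION_WORK

# Example: a ticket opened a week ago with 2 hours of logged work
# is escalated because its age exceeds the 5-day threshold.
ticket_opened = datetime.now() - timedelta(days=7)
print(needs_escalation(ticket_opened, timedelta(hours=2)))  # True
```

A rule this simple only becomes a usable KPI once the thresholds are agreed to by stakeholders and the counts of escalated tickets are reported on a regular schedule.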
In response to these findings, we are discussing methods within our existing systems to measure our performance against standards and KPIs. Finding the right measure is like finding the best place to sit in a movie theater: sit too close to the screen and the information is overwhelming and you get a headache; sit too far back and you miss the little details that can add up to a major part of the film. Sitting in the right spot allows you to comprehend and enjoy the film. Here are some of the things we are doing to find a good measure of performance:
- The network group is designing a report to measure network performance at a component or node level. Currently we can see that the entire network is performing at 99.997% uptime, but that figure hides the fact that some areas of the network are stressed and in need of infrastructure improvements.
- The Footprints application is currently the focus of a project titled “Back to Zero”. (The title refers to one of the deliverables which is to clean up and close old tickets) The Footprints project team is busy developing business rules and analyzing opportunities to improve the existing design. So far, the group has defined roles, a change management process that aligns with the existing OIT request for change process and has drafted a plan to clean up old tickets through an automated process.
- The Footprints application currently provides information on customer satisfaction based on an email survey sent to every customer when their help ticket closes. The short survey asks the customer to rate the service they received on timeliness, effectiveness, and courtesy. We estimate that about 30% of these surveys are completed, and the results are tracked monthly and on a year-to-date basis. From October 2007 to 2008, we received 26,770 surveys, averaging over 2,000 per month.
- Using Crystal Reports and the Footprints application, along with the Avaya Call Management Software, Steve Conary has been developing performance measures for the call center and service technicians.
- We have a draft version of the OIT service catalog online – it is in the final approval stages and should be available by the end of November. This catalog will be an easy way for our customers to search and find our rates and SLAs for each IT service.
- We are developing a rudimentary performance management web site connected to the OIT service catalog. It has two primary sections: “OIT by the numbers,” showing statistical information, and “OIT KPIs,” which will reflect our performance in several areas. This will be a manually updated performance dashboard that OIT leadership will use to refine the metrics and to clarify both the KPIs and their sources. We hope to move to a more automated version of this dashboard in 2010.
- We are reviewing the ITIL (Information Technology Infrastructure Library) standards and metrics for use in defining our own KPIs and SLAs. ITIL is an international standard for organizing, measuring, and managing IT departments. ITIL is divided into two main categories, each with related subcategories and very clear metrics for measurement:
- Service Support
  - Incident Management
  - Problem Management
  - Change Management
  - Configuration Management
  - Release Management
  - Service Desk
- Service Delivery
  - Service Level Management
  - Availability Management
  - Financial Management
  - Capacity Management
  - IT Service Continuity Management
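The network group's point above, that an aggregate 99.997% uptime figure can hide stressed segments, can be illustrated with a small sketch. The node names and downtime minutes below are invented for the example and are not actual OIT data.

```python
# Minutes of downtime per node over a 30-day month (43,200 minutes).
# Node names and downtime figures are hypothetical, for illustration only.
MONTH_MINUTES = 30 * 24 * 60
downtime = {"core-router": 0, "campus-a": 2, "campus-b": 130}

def uptime_pct(down_minutes):
    """Convert minutes of downtime in the month to an uptime percentage."""
    return 100 * (MONTH_MINUTES - down_minutes) / MONTH_MINUTES

# Averaging downtime across nodes produces a healthy-looking overall figure...
overall = uptime_pct(sum(downtime.values()) / len(downtime))
print(f"network-wide: {overall:.3f}%")

# ...while the per-node view reveals the one stressed segment.
for node, minutes in downtime.items():
    print(f"{node}: {uptime_pct(minutes):.3f}%")
```

This is exactly why a component- or node-level report is worth building: the network-wide average stays above 99.8% even though one segment lost more than two hours of service.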
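The survey figures in the Footprints bullet above hang together arithmetically, as a quick check shows. The 12-month window is an assumption based on the October-to-October framing, and the 30% completion rate is the article's own estimate.

```python
surveys = 26_770       # surveys received, per the article
months = 12            # assumed window (October 2007 to October 2008)
response_rate = 0.30   # estimated survey completion rate from the article

print(surveys / months)          # about 2,230 per month, i.e. "over 2,000"
print(surveys / response_rate)   # implies roughly 89,000 closed tickets
```

A steady response rate matters here: if completion drifts over time, month-to-month satisfaction scores stop being comparable.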
In the State of Maine, the Office of Information Technology's creation of a set of standards and KPIs could best be described as a “moving target,” partly because we are still managing the change from multiple agency-based groups to a centralized IT workgroup. Creating centralized standards that all stakeholders can agree to can be a long and difficult process of negotiation and decision making. Our challenge is to meet the needs of a wide and varied set of stakeholders. We are jointly managing the consolidation of resources and cost reduction while focusing on improving service levels and customer satisfaction. We will set initial metrics and adapt them over time to drive steadily better performance.
“What gets measured gets done” is an old business maxim. Remember the airline metric of delays leaving the gate? Anyone who has sat for hours on a plane that pushed only a few feet from the gate while waiting for clearance to taxi onto the runway has been victimized by that metric. As long as the plane pushes away from the gate, the pilots get a good score on that KPI, but the customers are far from satisfied. We need to be aware that we can sometimes measure the right thing and still have a negative impact on our performance. One last caution: not all metrics are KPIs. Albert Einstein had a sign on his office wall that read, “Not everything that matters can be measured, and not everything that can be measured matters.” Defining the right things to measure, and the right way to measure them, will take time and collaboration from all of us.