
Guidelines for Core
Key Performance Indicators

Interim Report on
Primary Service Channels

September 2004

Executive Summary
The development of key performance indicators (KPIs) for the Government of
Canada (GoC) became a priority as Canada’s Government Online (GOL) initiative
matured from 1998 through 2004. The rapid development of the Internet channel as a
means of providing effective public service delivery created an appetite for
revolutionary change in all types of service delivery. Prior to GOL, large-scale
improvements to service delivery were confined to specific government programs
and services; interdepartmental projects were rare. The advent of the Internet and
the preference of Canadians to access government services on-line have created
cutting-edge opportunities for change in delivering services to Canadians.
In the past three years, dozens of interdepartmental initiatives have taken hold and
have helped to foster citizen-centred service delivery. As more and more business
improvement opportunities were conceived, it became clear that the Government of
Canada needed a common performance language to support analytical decision making. Many
departments have made significant investments in performance management and
made progress towards the disciplined decision-making characteristic of the world’s
best corporations. Nevertheless, differences in terminology, definitions, usage, data
collection, and performance frameworks were quickly identified as limiting the ability
to monitor and affect ‘enterprise’-level performance.
The genesis of the Core KPI project came from the GoC’s Telephony Service
Working Group – an interdepartmental collection of GoC call centre managers and
executives that came together to share best practices, establish consistent service
standards and generally improve the capabilities of GoC call centre operations. In
2003, this working group quickly identified, and provided precise definitions of,
common KPIs.
Coincident with this achievement, the Treasury Board of Canada Secretariat developed
a modernized approach to the management of public sector organizations and
programs called the Management Accountability Framework (MAF). This
comprehensive set of tools, standards, and processes provided an over-arching
framework for the Core KPI project. The operational nature of KPIs strongly
supported the MAF and provided direct information to two of the primary MAF
categories – stewardship and citizen-focused service.
In 2003, as the GoC’s Internet channel rapidly matured and initial significant
transactional capability came online, new interdepartmental working committees
were formed to deal with the complexities of multi-service, multi-channel delivery
alternatives. Internet gateways and clusters rapidly evolved; this helped organize
services in parallel with client segments and life events. This has created
opportunities to effect corresponding changes in how GoC services are delivered in person and by mail. By 2004, there was a clear need to establish common core KPIs and to create a working environment in which to further develop a common performance language.
The Core KPI project brought together numerous government managers – experts in
delivering services to Canadians, visitors and businesses. Managers with
operational responsibility for call and mail processing centres, Internet sites, and in-person locations were engaged in several meetings to identify the KPIs that provide
maximum management value.
The result of these meetings was a small set of channel-specific core KPIs that
reflect specific MAF themes. These KPIs will be required for a variety of reporting
purposes, Treasury Board submissions, and ongoing reviews. Additional
operational KPIs were identified that are recommended by Treasury Board (but not
required) as effective indicators that provide strong operational benefits to service
delivery organizations.
The Core KPI project is not complete. There is an ongoing requirement for
implementation, improvement, and additions as the GoC service delivery strategy
evolves. Perhaps the most important and lasting benefit is the networking of the best
performance management people in the GoC. These experts continue to develop
new techniques and identify improvements to ensure that Canada remains one of the
world leaders in public sector service delivery – a position that clearly strengthens
Canada's competitiveness in the twenty-first century.

Record of Changes

Version    Date               Summary of Changes
V 0.9      August 30, 2004    First draft for formal review
V 1.0      Sept. 30, 2004     Minor edits

Detailed Description of Changes from Previous Version

Acknowledgements

Project Authority: Victor Abele – Director, Service Strategy, CIOB, Treasury Board Secretariat, Canada

Project Analyst: Phillip Massolin – Analyst, Service Strategy, CIOB, Treasury Board Secretariat, Canada

Author: Dan Scharf – Equasion Business Technologies

Contributors: Daryl Sommers, Colin Smith, Reina Gribovsky, Dolores Lindsay, Daniel Tremblay, Kyle Toppazzini, Marg Ogden

Web Content: Morris Miller

Table of Contents

1.0 Introduction
2.0 Defining the Channels
3.0 Management Accountability Framework
4.0 Service Standards
5.0 Accountability and Key Performance Measures
6.0 Risk Management
7.0 Policy and Programs
8.0 Key Performance Indicators – Phone Channel
9.0 Key Performance Indicators – In-Person Channel
10.0 Key Performance Indicators – Internet Channel
11.0 Key Performance Indicators – Mail Channel
12.0 Using Service Delivery KPIs in Departmental Reporting
13.0 Contact Information
Appendix A: Terms and Definitions
Appendix B: References
Appendix C: Summary of Core Key Performance Indicators

1.0 INTRODUCTION

Citizens are faced with a greater choice of channels than ever before to access government services (in-person, phone, internet and mail), creating corresponding challenges for organizations to manage service delivery across all channels. Key Performance Indicators (KPIs) are increasingly used by the private and public sectors to measure progress towards organizational goals using a defined set of quantifiable measures.

A series of government-wide workshops identified the requirement for a consistent approach to measuring service delivery performance across the GoC. Workshop results can be assessed to help create baseline frameworks for channel measurement and provide input into the process. For the GoC, KPIs are becoming an essential part of achieving Management Accountability Framework (MAF) compliance. Once approved, the KPI framework will constitute a key element of departments' annual monitoring.

You can navigate the KPIs in this document by:
• Channel – each channel includes standard metrics (KPIs) for managing performance, or
• Management Accountability Framework category.

2.0 DEFINING THE CHANNELS

Phone Service: the preferred service channel for most Canadians. Primary modes of interaction within this channel are:
• Interactive Voice Response (IVR) to provide self-service, routing, and broadcast services.
• Agent-based services – the most highly valued service channel for citizens today.

In-Person Service: Canada's extensive network of local offices provides a significant proportion of all government service delivery, using primarily queued and appointment-based service models. Some in-person points of service also offer assisted Internet and telephone services through kiosks, publicly available computers, and public phones. There are four service modes for the In-Person channel:
• Queued – a managed, multi-agent counter office which often has a reception counter to 'triage' visitors to the correct counter and answer simple questions.
• Scheduled – significant volumes of in-person services are provided on a pre-scheduled 'one-on-one' basis.
• Outreach – several service delivery organizations schedule seminars and training sessions in communities throughout Canada.
• Retail – some organizations provide 'storefront' operations where visitors can browse publications and utilize computers for self-service. Service staff are available to help and can either approach the visitor directly 'on the floor' or respond to visitor questions at service counters.

Internet Service: most surveys indicate this is the preferred channel of the future. Surveys in the past few years indicate that citizens will increase their use of more timely and interactive channels. Modes of interaction include:
• Self-service via online transactions, search and website navigation strategies.
• E-mail – which provides delayed support, both automated and agent-authored.
• Online chat technologies – which provide 'real time' agent-assisted Internet service delivery.

Mail: primary indicators suggest that this channel – the "paper channel" – is decreasing in popularity. Mail is sent using three methods (analogous to the modes of interaction in the other channels):
• Regular Mail – via regular postal services (3 to 10 days).
• Courier – expedited delivery via priority parcel carriers (within 48 hours).
• Fax – instant transmission via facsimile devices over phone or broadband networks.

Service channels and modes of interaction are affected by accessibility standards, which maximize the availability of channels to people with disabilities. For example, the use of TTY technology within the phone channel provides access for hearing-impaired individuals. The Internet channel uses W3C accessibility standards to ensure that government websites are accessible using assistive technologies such as screen readers, font magnifiers, and speech recognition. In the in-person channel, the use of ramps, lower counters, and powered doors facilitates access for people who use wheelchairs.

Several GoC organizations are using multi-channel service strategies to achieve higher service value with economic investments. For example, Service Canada uses publicly available computers within its service outlets to provide self-service and, as required, assisted Internet support to visitors. Several departments are experimenting with dedicated, specially trained call-centre agents to support Internet site visitors through toll-free direct assistance lines.

The KPIs listed in this document are not specifically intended to measure the important service delivery issues of accessibility for people with disabilities or integrated multi-channel implementation characteristics.

3.0 MANAGEMENT ACCOUNTABILITY FRAMEWORK

The Government of Canada has instituted a consistent management framework for its programs and services. MAF provides deputy heads and all public service managers with a list of management expectations that reflect the different elements of current management responsibilities. Comprehensive information on the Management Accountability Framework (MAF) can be found on Treasury Board's website (reference: http://www.tbs-sct.gc.ca/maf-crg/maf-crg_e.asp).

Key Performance Indicators for service delivery are grouped within the MAF categories: Policy and Programs, Citizen-Focused Service, Risk Management, People, Stewardship, and Accountability. The majority of service delivery indicators relate to the operational nature of the Stewardship category. Additional indicators measure progress to objectives under the Citizen-Focused Service and People categories. Specific assessment tools are used for the Policy and Programs and Risk Management categories. The Accountability category provides checklists and processes for establishing effective service level agreements.

4.0 SERVICE STANDARDS

The Service Improvement Initiative defines specific guidelines for all departments and agencies to establish and publish service standards for citizens, international visitors, and businesses using GoC services and programs. The overarching goal is to establish a 10% increase in client satisfaction by 2005. Departments are required to set standards and measure progress to this goal using primary criteria such as:

Timeliness – the time required to receive a service or product
Access – how accessible the service or ordering process was to the client
Outcome – whether the client received what was needed
Satisfaction – overall client satisfaction with the service/product request

Common high-level service standards include:

Average speed to answer – 5 minutes
Expected answer by e-mail – next business day
Queue time for in-person services – 15 minutes

The Citizen First survey provides specific information on client service expectations through a formal, comprehensive survey. Trends over the past five years indicate shifts in these expectations that give government service delivery managers effective direction for prioritizing service improvement initiatives.
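As a minimal sketch of how published standards can be checked against measured results (the standard names, field names, and threshold values below are illustrative assumptions, not values prescribed by the Initiative):

    # Python sketch: compare one period's measured performance against
    # published service standards. Names and targets are hypothetical.
    SERVICE_STANDARDS = {
        "phone_average_speed_to_answer_min": 5,   # answer within 5 minutes
        "email_response_business_days": 1,        # next business day
        "in_person_queue_time_min": 15,           # 15 minutes in queue
    }

    def meets_standards(measured: dict) -> dict:
        """Return pass/fail per standard for one reporting period."""
        return {name: measured.get(name, float("inf")) <= target
                for name, target in SERVICE_STANDARDS.items()}

    period = {"phone_average_speed_to_answer_min": 3.4,
              "email_response_business_days": 2,
              "in_person_queue_time_min": 12.0}
    print(meets_standards(period))
    # {'phone_average_speed_to_answer_min': True,
    #  'email_response_business_days': False,
    #  'in_person_queue_time_min': True}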

5.0 ACCOUNTABILITY AND KEY PERFORMANCE MEASURES

As specified in MAF, service delivery channels must employ clear accountability frameworks. Most frequently, departments and agencies use a Service Level Agreement (SLA), both for internally resourced services and for third-party and partnership teams. The following components are the minimum set which must be included in a Government of Canada SLA, which provides the foundation for service delivery to citizens, visitors and businesses.

Service Level Agreement Name – The name of the SLA; particularly useful when a single SLA is used across multiple service offerings.

Service Description – The details of the service the government intends to provide and the benefits the users can expect to receive. Normally identified in a Service Catalogue based on already defined metrics.

Service Criticality Level – This level of criticality should be based primarily on the service user's requirements.

Service Channels – Identifies which channels this service is available through (e.g. telephone, Internet, mail, in-person) and appropriate contact information for the channels.

Primary Service Provider – The department or agency which is primarily responsible for the service.

Service Partner Providers – Other partner departments that provide support to a Primary Service Provider for a service, e.g. GTIS provides the Internet server to Health Canada.

Service Pledge – Provides the details of the quality of service a client can expect, such as access, timeliness and accuracy.

Delivery Targets – Describe the key aspects of the service provided. This is frequently time based, e.g. passports will be processed in X number of days.

Dates – The effective start, end and review dates of the agreement. A review date must be identified so that performance measurements can be made and the SLA can be adjusted, or action can be taken to improve the performance of an SLA.

Scope – What is covered and what is not.

Responsibilities – Service providers, partners and the user.

Costs – Identifies a cost for service (even when user fees are not required) to ensure that users understand and form realistic expectations about services offered by the federal government.

Complaint and Redress – Provides the service user with mechanisms to resolve their concerns, for example when the SLA has not been met.

Service Hours – Service availability, e.g. 24x7. Public holidays must be identified, as well as the hours for each channel. Service hours should provide maximum cost-effective access for the service user.

Throughput – Describes the anticipated volumes and timing for activities within a specific service, e.g. UI applications Sep-May = 100,000, Jun-Aug = 50,000. This is important so that any performance issues caused by excessive throughput outside the terms of the SLA can be identified.

Change Management – Identifies the policies surrounding any changes that will affect the service provided to the user, and how the change will be managed to ensure that expectations are being met; for example, if UIC benefits are going to be mailed out every second month instead of every month.

Security and Privacy – Identifies inter-departmental policies on the sharing of user information for various services. Organizations must comply with PIPEDA and Treasury Board policies.

Service Reporting and Reviewing – Specifies the content, frequency and distribution of service reports, and the frequency of service review meetings.

Performance Incentives/Penalties – Identifies any agreement regarding financial incentives or penalties based upon performance against service levels. Penalty clauses can create a barrier to partnership if unfairly invoked on a technicality, and can also make service providers and partners unwilling to admit mistakes for fear of the penalties being imposed.
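As a minimal sketch of how such an agreement might be captured in structured form (the field names and example values are illustrative assumptions, not a mandated GoC schema):

    # Python sketch of an SLA record mirroring the minimum components above.
    from dataclasses import dataclass
    from datetime import date
    from typing import List

    @dataclass
    class ServiceLevelAgreement:
        name: str
        service_description: str
        criticality_level: str       # based on the service user's requirements
        channels: List[str]          # e.g. ["phone", "internet", "mail"]
        primary_provider: str
        partner_providers: List[str]
        service_pledge: str          # quality the client can expect
        delivery_target: str         # e.g. "processed in X days"
        start_date: date
        end_date: date
        review_date: date            # when performance is re-assessed
        cost: str                    # identified even when no user fee applies
        service_hours: str           # e.g. "24x7", holidays identified
        throughput: str              # anticipated volumes and timing

    sla = ServiceLevelAgreement(
        name="Example benefits processing SLA",
        service_description="Processing of benefit applications",
        criticality_level="high",
        channels=["mail", "phone"],
        primary_provider="Department A",
        partner_providers=["Department B"],
        service_pledge="Accurate, accessible, timely processing",
        delivery_target="Applications processed within 10 business days",
        start_date=date(2004, 4, 1),
        end_date=date(2005, 3, 31),
        review_date=date(2004, 10, 1),
        cost="No user fee; notional cost published",
        service_hours="Mon-Fri 08:00-17:00, public holidays excluded",
        throughput="Sep-May = 100,000; Jun-Aug = 50,000 applications",
    )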

6.0 RISK MANAGEMENT

The Risk Management category of MAF specifies a checklist for departmental management to establish comprehensive and transparent identification of risks, tolerances, mitigation strategies, and effective communication approaches. For detailed information, readers should refer to the MAF.

7.0 POLICY AND PROGRAMS

Policy and Programs in this MAF context refers to the relevant lines of activity within departments and agencies. Departmental Reports are the primary reporting tool used to document the overall policy effectiveness of specific programs. Readers should consult the MAF, relevant Treasury Board policies, and the Program Activity Architecture (P.A.A.) for information and guidance in this category. In the Fall/Winter of 2004/05, we will be consulting with the service community with a view to developing a more comprehensive service policy. This work will allow us to formalize the approach to KPIs, which is currently in draft format.

8.0 KEY PERFORMANCE INDICATORS – Phone Channel

MAF CATEGORY: CITIZEN FOCUSED SERVICE

Metrics for Access

KPI: Call Access
Description: Percentage of calls presented that get into the ACD.
Objective: Basic volume measure; measures overall service capacity from ACD to agent.
Definition: (Calls Answered + Calls Abandoned) divided by Calls Presented. Can also be derived from busy signals generated by the switch divided by total calls received in the reporting period.
Derivation: ACD
Suggested benchmark / Range: 40% to 60%
Status: Proposed as a Core KPI

KPI: Caller Access
Description: Percentage of unique callers who attempt and successfully access service.
Objective: Removes "repeat callers" from the accessibility measure.
Definition: Total unique phone numbers completed divided by total unique phone numbers attempted.
Derivation: ACD
Suggested benchmark / Range: 80% to 85%
Status: Proposed as a Core KPI

KPI: Abandoned Calls
Description: Percentage of calls which are abandoned while in queue due to prolonged delay waiting for service.
Objective: Key measure for overall service level; determines service level by counting "unserviced" callers.
Definition: (Calls abandoned within the agent queue + IVR abandons before 'success' markers) divided by (total calls answered + total calls abandoned).
Derivation: ACD
Suggested benchmark / Range: 10% to 15%
Status: Proposed as a Core KPI

Metrics for Delay

KPI: Average Speed to Answer (ASA)
Description: The average delay, expressed in seconds, while in queue before connecting to an agent, typically a live agent.
Objective: Primary indicator of caller satisfaction.
Definition: The total number of seconds from ACD queuing of a call to agent acceptance, divided by total agent calls.
Derivation: Measured by ACD
Status: Proposed as a Core KPI

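A minimal sketch of these access and delay calculations from ACD summary counts follows (the function and variable names are hypothetical; real ACDs report these fields under vendor-specific names):

    # Python sketch of the phone-channel access and delay KPIs defined above.
    def call_access(answered: int, abandoned: int, presented: int) -> float:
        """Call Access: (calls answered + calls abandoned) / calls presented."""
        return (answered + abandoned) / presented

    def caller_access(unique_completed: int, unique_attempted: int) -> float:
        """Caller Access: unique numbers completed / unique numbers attempted."""
        return unique_completed / unique_attempted

    def abandon_rate(queue_abandons: int, ivr_abandons: int,
                     answered: int, abandoned: int) -> float:
        """Abandoned Calls: (queue abandons + IVR abandons before 'success'
        markers) / (total answered + total abandoned)."""
        return (queue_abandons + ivr_abandons) / (answered + abandoned)

    def average_speed_to_answer(total_queue_seconds: float,
                                agent_calls: int) -> float:
        """ASA: seconds from ACD queuing to agent acceptance / agent calls."""
        return total_queue_seconds / agent_calls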

KPI: Service Level
Description: Percentage of calls that reach an agent or are abandoned within a specified time threshold.
Objective: This measure is required in order to set and publish telephone service standards.
Definition: (Calls answered within threshold + calls abandoned within threshold) divided by (total calls answered + total calls abandoned).
Derivation: Measured by ACD
Status: Proposed and required for phone service management

Metrics for Quality

KPI: Answer Accuracy
Description: Consistency of IVR and agent answers.
Objective: To ensure program integrity.
Definition: Best measured through the use of a mystery shopper program that uses specific planned calls placed to the call centre by a measurement organization. Can also be measured by exit surveys performed immediately after call completion.
Derivation: (# of calls answered in IVR terminated at 'success' markers + # of agent calls resulting in success status), multiplied by the accuracy evaluation ratio.
Status: Proposed as a Core KPI

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Local quality scorecard assessed by call monitoring and/or mystery shopper approaches.
Status: Recommended as an operational measure
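The Service Level definition above can be computed directly from per-call wait times; this sketch assumes a hypothetical list of (wait_seconds, outcome) records rather than any particular ACD export format:

    # Python sketch: calls answered or abandoned within the threshold,
    # over all answered plus abandoned calls.
    def service_level(calls, threshold_s=20.0):
        counted = [(wait, outcome) for wait, outcome in calls
                   if outcome in ("answered", "abandoned")]
        within = sum(1 for wait, _ in counted if wait <= threshold_s)
        return within / len(counted) if counted else 0.0

    calls = [(8.0, "answered"), (25.0, "answered"),
             (12.0, "abandoned"), (40.0, "abandoned")]
    print(service_level(calls))   # 0.5, i.e. 50% within the 20-second threshold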

Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to determine the core measures relevant to the telephone channel.
Status: Proposed as a Core KPI

KPI: Service Complaints
Description: Count and categorization of complaints received through all channels concerning the phone channel. Complaints received through other channels must be added to the total.
Objective: Provides a high-level indication and trend of overall service performance; a primary indicator of service quality, particularly when measured over time.
Derivation: Counted by incident tracking system.
Status: Recommended as a Core KPI, but not currently feasible as most GoC organizations do not integrate service feedback information.

MAF CATEGORY: STEWARDSHIP

Metrics for Agent Utilization

KPI: Cost per Call
Description: The total operational cost of the call centre over the reporting period divided by total calls handled during the reporting period.
Definition: Will require further working group consultation.
Status: Recommended as a Core KPI

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for telephone service for each full-time equivalent (FTE).
Objective: Ensures that agent resources are dedicated to required functions.
Status: Recommended as an operational measure

KPI: Resource Allocation
Description: A management indicator assessing allocated FTEs to service delivery.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: Locally defined
Status: Recommended as an operational measure

KPI: Agent Adherence
Description: An assessment of telephone agent adherence to schedule and making oneself available during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as an operational measure

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and wrap-up time.
Objective: Measures effective use of channel resources.
Definition: (Talk time + after-call wrap-up time) divided by total agent login time over the measured period.
Suggested benchmark / Range: 85%
Status: Recommended as an operational measure

Metrics for Service Effectiveness

KPI: First Call Resolution
Description: The degree to which client needs are met without further referral or call-back within a designated time interval.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single calls by unique phone number within a 48-hour period, not abandoned.
Status: Recommended as a Core KPI

KPI: Accurate Referral
Description: A redirect to the correct service for resolution of a client need (may be to a higher service tier or to a separate organization/jurisdiction providing the service).
Objective: Measures the key caller irritant of more than 2 transfers.
Definition: Will require further working group participation.
Status: Not recommended; not technically feasible at this time.
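A short sketch of the two schedule-related utilization ratios defined in this section, using hypothetical per-agent time totals in hours:

    # Python sketch of the agent utilization ratios defined above.
    def adherence(login_hours, scheduled_hours):
        """Agent Adherence: total agent login time / scheduled work time."""
        return login_hours / scheduled_hours

    def occupancy(talk_hours, wrap_hours, login_hours):
        """Agent Occupancy: (talk time + after-call wrap-up) / total login time."""
        return (talk_hours + wrap_hours) / login_hours

    print(round(adherence(7.0, 7.5), 2))       # 0.93
    print(round(occupancy(5.1, 1.2, 7.0), 2))  # 0.9, near the 85% benchmark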

e. Definition: Calls terminated at specific IVR marker after bulletin Status: Proposed as Core KPI KPI: Calls Answered by IVR Successfully Description: A call that terminates in IVR tree after success marker. Objective: Measures utility of IVR response tree to provide self-service answer.Key Performance Indicators for Service Delivery Channels KPI: Call Avoidance Description: A call that quickly exits the system after an introductory message or bulletin that provides a desired answer for a substantial portion of the calling population. secondary indicator of client satisfaction. Status: proposed as Core KPI KPI: Callers Description: Unique Callers Objective: Measures service demand more accurately. related to an immediate but temporary service outage.g. Definition: Unique phone numbers dialing the service Status: proposed as Core KPI . Definition: Calls terminated at all IVR ‘success’ markers. Note that this will include ‘repeat callers’ who are refused at the switch. Status: Proposed as Core KPI Metrics for Channel Take-up KPI: Calls Description: Total calls received Objective: Measures overall service demand Definition: Number of calls received at switch. an important indicator of IVR utility. Objective: Measures utility of IVR/bulletins to answer high-volume inquiries.

MAF CATEGORY: PEOPLE

At publishing time, KPIs for the MAF People category had not yet been proposed to the working group for review. Further discussion with departments and agencies will be conducted to identify effective KPIs under this MAF category. Some examples of KPIs that might be suitable include:

Total Months – Staff on Strength / Average Months on Strength per Agent: a measure of the total experience level of the agents within the call centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: a measure of the 'churn' rate within the agent team. Provides a secondary indicator of call centre health, and often correlates to overall customer satisfaction levels.

Training Days/Agent: total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.

Agent Coaching Ratio: number of hours of one-on-one coaching time per agent. Helps measure the utilization of call centre supervisor time, as well as the investment in agent skill improvement.

9.0 KEY PERFORMANCE INDICATORS – In-Person Channel

MAF CATEGORY: CITIZEN FOCUSED SERVICE

Metrics for Access

KPI: Visitor Access
Description: Count of visitors who either a) are serviced at agent stations or b) obtain self-service through in-location computers, OR a count of visitors entering the facility. This depends on the service model and facility.
Objective: Basic volume measure. Provides an indication of the utilization of self-service capabilities and overall operational capacity.
Definition: Total visitors entering the facility over the measurement period.
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI. Tracked by all operations.

KPI: Visitors Serviced
Description: Ratio of visitors receiving agent service to total visitors.
Definition: Total agent-visitor services divided by total visits.
Status: Recommended as an operational measure. Not relevant to, and not trackable within, the 'retail' service model.

Metrics for Delay

KPI: Average Wait Time (AWT)
Description: The average delay from time of entering the facility to introduction at an agent station.
Objective: Primary indicator of visitor satisfaction.
Definition: The total number of minutes from pulling of a service ticket to service.
Derivation: Measured by service management system.
Status: Recommended as an operational KPI. Measured by all queued service operations.

KPI: Service Level
Description: Percentage of visitors that reach an agent within the target wait time.
Objective: This measure is required in order to set and publish in-person service standards.
Definition: Visitors served within threshold divided by total visitors serviced.
Derivation: Measured by service ticketing system.
Status: Recommended as an operational KPI. Measured by all queued service models.
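For queued service models, both AWT and the in-person Service Level can be derived from ticket timestamps; this sketch assumes hypothetical (ticket_pulled, service_started) pairs rather than any particular ticketing system's export:

    # Python sketch of in-person wait metrics from service-ticket timestamps.
    from datetime import datetime

    def wait_metrics(tickets, target_minutes=15.0):
        """Average Wait Time and Service Level from timestamp pairs."""
        waits = [(served - pulled).total_seconds() / 60.0
                 for pulled, served in tickets]
        return {"average_wait_min": sum(waits) / len(waits),
                "service_level": sum(w <= target_minutes for w in waits) / len(waits)}

    day = [(datetime(2004, 9, 1, 9, 0), datetime(2004, 9, 1, 9, 8)),
           (datetime(2004, 9, 1, 9, 5), datetime(2004, 9, 1, 9, 27))]
    print(wait_metrics(day))  # {'average_wait_min': 15.0, 'service_level': 0.5}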

Metrics for Quality

KPI: Answer Accuracy
Description: Reliability of agent answers.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by supervisor monitoring and/or mystery shopper approaches and/or exit surveys.
Derivation: # of visitors answered successfully by agents.
Status: Under review. Possible as a recommended operational KPI.

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through the use of a mystery shopper program that uses specific planned visits to the service centre by a measurement organization. Can also be measured by exit surveys, either conducted by staff or at self-service computers.
Status: Under review. May be impractical in several service models.

KPI: Critical Error Rate
Description: Some operations monitor application/transaction errors (typically omission of required information) requiring additional interactions with clients.
Objective: Assessment of pre-visit instructions to clients and/or reception desk 'triage' procedures.
Status: Proposed as an operational measure. Applicability to be reviewed; may be impractical in several service models.

KPI: Transaction Duration Variability
Description: For operations providing specific transaction services, analysis of variance of transaction duration correlates strongly to application accuracy.
Objective: Assess process consistency across agents.
Status: Proposed to the working team. Applicable only to some operations.

Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to establish the "CSat" level for in-person services.
Status: Recommended as a core KPI.

KPI: Service Complaints
Description: Count and categorization of complaints received through all channels concerning the in-person channel. Complaints received through other channels must be added to the total.
Objective: Primary indicator of service quality, particularly when measured over time.
Definition: Total service complaints received during the reporting period divided by the total number of visits.
Derivation: Counted by incident tracking system.
Status: Recommended as a core KPI.
Caveat: Members noted that current systems do not support the collection and categorization of service complaints received through a wide variety of channels (e.g. Minister's correspondence, general e-mails, complaints at the end of a successful service phone call).

MAF CATEGORY: STEWARDSHIP

Metrics for Agent Utilization

KPI: Cost per Contact
Description: Total labour costs divided by total service requests. Definition of labour cost to be determined.
Objective: Provides a snapshot of current operational efficiency, specifically related to agents/manpower.
Definition: TBD
Status: Recommended as a Core KPI.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for counter service for each agent.
Objective: Ensures that agent resources are dedicated to required service functions.
Status: Recommended as operational KPI for queued service models.

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and wrap-up time.
Objective: Measures effective use of channel resources.
Definition: (Talk time + after-visit wrap-up time) divided by total agent login time over the measured period.
Suggested benchmark / Range: TBD
Status: Recommended as operational KPI for queued service models.

KPI: Resource Allocation
Description: A management indicator assessing allocated agent positions to service delivery.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: Locally defined
Status: Recommended as operational KPI for queued service models.

KPI: Agent Adherence
Description: An assessment of service agent adherence to schedule and making oneself available during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as operational KPI for queued service models.

KPI: Turn Around Time
Description: The average time to transaction completion (i.e. receipt by client), expressed as a percentage of target time.
Objective: Measures the response time to the client – a primary indicator of customer satisfaction.
Definition: TBD
Status: Under review.

Metrics for Service Effectiveness

The working group is asked to contribute suggestions for KPIs in this theme.

Metrics for Use of Technology

KPI: Self-Service Ratio
Description: A count of visitors to the service office who access self-service computers.
Objective: Measures the utility of computer facilities within the service office.
Definition: Count of computer accesses divided by total visitors during the measurement period.
Status: Proposed as a Core KPI.

Metrics for Channel Take-up

KPI: Visitors
Description: Total visitors entering the office.
Objective: Measures overall service demand.
Definition: See the ACCESS measure (Visitor Access).
Status: Proposed as a Core KPI.

MAF CATEGORY: PEOPLE

Further discussion with departments and agencies will be conducted to identify effective KPIs under this MAF category. Candidate measures include:

Total Months – Staff on Strength / Average Months on Strength per Agent: a measure of the total experience level of the agents/staff within the service centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: a measure of the 'churn' rate within the agent team. Provides a secondary indicator of service centre health, and often correlates to overall customer satisfaction levels.

Training Days/Agent: total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.

Agent Coaching Ratio: number of hours of one-on-one coaching time per agent. Helps measure the utilization of service centre supervisor time, as well as the investment in agent skill improvement.

10.0 KEY PERFORMANCE INDICATORS – Internet Channel

The Canadian Gateways team has published a definitive report on Internet measurement identifying the suitability and meaning of specific web measures (for example, hits versus visits). Readers are asked to review this document (see Appendix B).

MAF CATEGORY: CITIZEN FOCUSED SERVICE

Metrics for Access

In the Internet channel, the access theme includes measures concerning the availability of the site to potential site visitors. There are two primary components to site availability: a) how easily site visitors can locate the site through search engines, links from other sites, or publication of the URL through other channels such as phone and mail; and b) whether the site is available to site visitors once it has been located. Other qualitative characteristics contributing to access include compliance with W3C accessibility standards to ensure the site is fully inclusive and available to persons with disabilities.

KPI: Search Engine Ranking
Description: Relevance ranking weighted by the distribution of site visitors who entered the site through commercial search engines.
Objective: Measures overall site access through search engines. The metric assumes that a high search engine rank provides maximum accessibility to those visitors who access the site via search.
Definition: Sum of (relevance ranking multiplied by search engine referring count) divided by total search engine referrals.
Derivation: Relevance rank from the top five referring search engines, using a representative sample of visitors' search terms.
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI

KPI: Direct Access Ratio
Description: Percentage of visits which access the site directly via the same or a known URL, relative to total visitors.
Objective: Assessment of site 'memory' through a known URL or bookmarking. This metric assumes that visits accessing the site directly are either typing or pasting a URL from another source (e.g. a brochure) or have bookmarked the site as a result of repeated visits.
Definition: Visits arriving at any page in the site without a referring URL associated with the visit.
Derivation: Web traffic statistics counting visits arriving at the site without a referring URL.
Status: Proposed as a Core KPI
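A worked sketch of the weighted ranking formula above, with invented engine names, ranks, and referral counts:

    # Python sketch: Search Engine Ranking = sum(relevance rank * referral
    # count) over the top referring engines / total search referrals.
    referrals = {                 # engine -> (relevance rank, referral count)
        "engine_a": (1, 600),     # site is the 1st result for sampled terms
        "engine_b": (3, 250),
        "engine_c": (8, 150),
    }
    total = sum(count for _, count in referrals.values())
    weighted_rank = sum(rank * count for rank, count in referrals.values()) / total
    print(weighted_rank)  # 2.55; lower is better (closer to the top result)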

KPI: Server Availability Percentage
Description: Total available server hours over total planned server hours during the reporting period.
Objective: Indicative of overall Internet service capacity.
Definition: (Total available server hours less scheduled maintenance hours) divided by total planned server hours.
Derivation: Server/operating system logs.
Status: Proposed as a Core KPI

KPI: Referral Percentage
Description: Percentage of total visits arriving at the site from planned referral sites. This KPI can be further broken down by specific sites, e.g. GoC Gateways, other GoC sites, other jurisdictions, etc.
Objective: Measures another access route to the site and can be used to adjust access strategies.
Definition: Total visits arriving from specified websites divided by total visits.
Derivation: Web monitoring package.
Suggested benchmark / Range: TBD
Status: Proposed as a Core KPI

KPI: Conversion Rate
Description: Rate at which visitors initiate transactions and reach the 'submit' page.
Objective: Key measure of overall service level and visitor satisfaction.
Definition: Total visits reaching 'submit' pages divided by total visits viewing transaction start pages.
Status: Proposed as a Core KPI

KPI: Abandonment Rate
Description: Rate at which visitors initiate transactions but do not reach the 'submit' page, plus visitors exiting the site from non-content pages.
Objective: Key measure for overall service level.
Definition: Visits with unsatisfactory exit pages divided by total visits.
Derivation: Web traffic statistics.
Status: Proposed as an operational measure
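These two rates can be derived from page-view logs by tagging the transaction start and 'submit' pages; the page names and visit records below are hypothetical:

    # Python sketch of conversion and abandonment rates from visit page paths.
    START, SUBMIT = "apply_start.html", "apply_submit.html"
    CONTENT_PAGES = {"apply_start.html", "apply_submit.html", "program_info.html"}

    def funnel_rates(visits):
        """Conversion: submits / transaction starts. Abandonment:
        (starts without submit + non-content exits) / all visits."""
        started = [v for v in visits if START in v]
        converted = sum(1 for v in started if SUBMIT in v)
        bad_exits = sum(1 for v in visits if v and v[-1] not in CONTENT_PAGES)
        return {"conversion_rate": converted / len(started) if started else 0.0,
                "abandonment_rate": (len(started) - converted + bad_exits) / len(visits)}

    visits = [["program_info.html", START, SUBMIT],
              ["program_info.html", START, "error.html"],
              ["program_info.html"]]
    print(funnel_rates(visits))  # conversion 0.5, abandonment ~0.67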

Metrics for Delay

KPI: Average Visit Duration
Description: The average duration of a visit.
Objective: Assessment of site "stickiness" – the overall relevance of site content and transactions to visitors' requirements. This metric can provide some indication of visitor need. However, as more and more transactions are put online, statistics for visit duration may need to be separated according to the type of visit (e.g. browse, transactional, search).
Definition: Total elapsed seconds from site entry at any page to site exit, for all visits, divided by the number of visits.
Derivation: Measured by web traffic software.
Status: Recommended as an operational measure.

Metrics for Quality

KPI: Site Error Messages
Description: Capture of all computer-identified error conditions, such as 'page not found' messages, invalid links, transaction aborts, etc.
Objective: Contributes to program integrity.
Definition: Total 'error' page views divided by total visits.
Derivation: Web activity tracking.
Status: Recommended as a Core KPI

KPI: Internet Channel Feedback
Description: Total criticisms, complaints and compliments, received through all sources, categorized into effective topics.
Objective: Improves overall site quality and response.
Definition: Count of complaints by topic over the reporting period.
Derivation: E-mail, phone incident tracking system, ministerial correspondence system.
Status: Recommended as an operational measure, but not currently used within most departments. The working group recognizes that this is difficult to track today; however, it is recognized as high value.

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective web design and web authoring skills.
Definition: Best measured through the use of focus groups and independent testing organizations. Some input may be available from Media Metrics. As well, periodic exit surveys should be conducted upon site exit. Where implemented, the quality of e-mail responses can be verified by an e-mail response management system (# of QA corrections, etc.); the timeliness of e-mail responses can also be measured.
Status: Proposed as a Core KPI, but must be further developed by the working group.

Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels. Provides a high-level indication and trend of overall service performance.
Derivation: The multi-channel survey tool will be used (at a minimum) to determine the core measures relevant to the Internet channel.
Status: Proposed as a Core KPI

MAF CATEGORY: STEWARDSHIP

KPI: Cost per Visit, Cost per Visitor
Description: The total operational cost of the site over the reporting period divided by total visits/visitors handled during the reporting period.
Definition: Will require further working group consultation.
Status: Recommended as a Core KPI

Metrics for Agent Utilization

The following four measures can be tracked for agent-assisted calls concerning the Internet channel, and for all messages/e-mails submitted through the Internet site. All are recommended as operational measures.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for service for each full-time equivalent (FTE).
Objective: Ensures that agent resources are dedicated to required functions.

KPI: Resource Allocation
Description: A management indicator assessing allocated FTEs to service delivery.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: Locally defined

KPI: Agent Adherence
Description: An assessment of agent adherence to schedule and making oneself available during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct service, including talk and wrap-up time.
Objective: Measures effective use of channel resources.
Definition: (Talk time + after-call wrap-up time) divided by total agent login time over the measured period.
Derivation: User support metrics.
Suggested benchmark / Range: 85%

Metrics for Service Effectiveness

KPI: First Visit Resolution
Description: Unique visitors over an x-day period who exited the site from 'success' content pages.
Objective: Minimize cost and maximize client satisfaction.
Definition: Number of single unique visits within an x-day period that exited the site from specific 'success' (i.e. answer found) pages.
Status: Proposed as a Core KPI


Metrics for Use of Technology

As the Internet channel is itself used to provide self-service through technology, this theme is not applicable within the channel.

Metrics for Channel Take-up

Web channel take-up data is used in comparison with other channels to determine the impact of web site changes.

KPI: Visits
Description: Total site visits accepted.
Objective: Measures overall service demand.
Definition: Number of visit sessions initiated by web servers.
Status: Proposed as a Core KPI

KPI: Visitors
Description: Unique visitors.
Objective: Measures service demand more accurately.
Definition: Unique visitors counted either through registration/login processes or via cookies.
Status: Proposed as a Core KPI

MAF CATEGORY: PEOPLE

At publishing time, KPIs for the MAF People category had not yet been proposed to the working group for review. Further discussion with departments and agencies will be conducted to identify effective KPIs under this MAF category. Some examples of KPIs that might be suitable include:

Total Months – Staff on Strength / Average Months on Strength per Agent: a measure of the total experience level of the agents within the call centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: a measure of the 'churn' rate within the agent team. Provides a secondary indicator of call centre health, and often correlates to overall customer satisfaction levels.

Training Days/Agent: total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.

Agent Coaching Ratio: number of hours of one-on-one coaching time per agent. Helps measure the utilization of call centre supervisor time, as well as the investment in agent skill improvement.

11.0 KEY PERFORMANCE INDICATORS – Mail Channel

MAF CATEGORY: CITIZEN FOCUSED SERVICE

Metrics for Access

KPI: Applications/Pieces Opened
Description: Count of new envelopes opened during the reporting period.
Objective: Basic volume measure.
Definition: Total envelopes opened, less inappropriate mail (junk mail, wrongly addressed, etc.).
Derivation: Measured by mail tracking system.
Status: Proposed as a Core KPI.

KPI: Applications Completed
Description: Outbound mail for completed files.
Status: Proposed as a Core KPI.

KPI: Applications/Mail in Process
Description: All files remaining open at the end of the reporting period. Represents the 'work in progress' within the processing centre.
Definition: Previous open files + applications received, less applications completed.
Status: Recommended as a Core KPI.

Metrics for Delay

KPI: Average Cycle Time (ACT)
Description: The average elapsed time that an application/mail item was held within the processing centre prior to completion.
Objective: Primary indicator of client satisfaction.
Definition: The total number of minutes from opening of the envelope to mailing of the response.
Derivation: Measured by mail tracking system.
Status: Proposed as a Core KPI.
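The work-in-progress balance follows simple flow arithmetic; a worked sketch with invented monthly figures:

    # Python sketch: open files at period end equal previous open files
    # + applications received - applications completed.
    previous_open = 1_200
    received = 4_500
    completed = 4_300
    in_process = previous_open + received - completed
    print(in_process)   # 1400 files of 'work in progress' at period end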

KPI: Pass Through Ratio
Description: Ratio of total handling time to total cycle time.
Objective: Primary indicator of workflow efficiency. The ratio should approach 1.0, indicating zero delay between processes.
Definition: Total minutes of processing time (time with an agent) divided by total elapsed time.
Derivation: Measured by mail tracking system.
Status: Recommended as an operational KPI.

KPI: Service Level
Description: Percentage of mail items completed within the target processing time.
Objective: This measure is required in order to set and publish mail service standards.
Definition: Applications completed within the service threshold divided by total applications completed.
Derivation: Measured by mail tracking system.
Status: Recommended as a Core KPI.

Metrics for Quality

KPI: Response Accuracy
Description: Reliability of mail responses/completions.
Objective: To ensure program integrity.
Definition: Local quality scorecard assessed by quality assurance review of outbound mail, plus 'write backs' – one or more subsequent mail receipts for the same application.
Derivation: QA report.
Status: Under review.

KPI: Professionalism
Description: Encompasses a range of soft skills that govern the approach to delivering accurate information and reliable services.
Objective: Identifies and reinforces effective communication.
Definition: Best measured through the use of an enclosed feedback postcard or through alternate-channel surveys (e.g. a post-response phone call).
Status: Under review.

KPI: Critical Error Rate
Description: Some operations monitor application/transaction errors (typically omission of required information) requiring additional interactions with clients.
Objective: Assessment of application instructions to clients.
Status: Proposed as an operational measure. May be impractical in several service models; applicability to be reviewed.
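A short worked example of the Pass Through Ratio defined above, with invented figures:

    # Python sketch: Pass Through Ratio = total handling time / total cycle
    # time. A ratio approaching 1.0 indicates little idle time between steps.
    handling_minutes = 38          # time the file actually spent with agents
    cycle_minutes = 3 * 24 * 60    # 3 days elapsed from opening to response
    print(handling_minutes / cycle_minutes)  # ~0.009, large improvement headroom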

Metrics for Client Satisfaction

KPI: Client Satisfaction Level
Description: Application of the Common Measurement Tool (CMT) to assess and benchmark client satisfaction.
Objective: Primary indicator of client perception of service quality; measured repeatedly, trends provide strong evidence of service improvement levels.
Derivation: The multi-channel survey tool will be used (at a minimum) to establish the "CSat" level for mail services.
Status: Recommended as a core KPI.

KPI: Service Complaints
Description: Count and categorization of complaints received through all channels concerning the mail channel. Complaints received through other channels must be added to the total.
Objective: Primary indicator of service quality, particularly when measured over time.
Definition: Total service complaints received during the reporting period divided by the total number of mail items received.
Derivation: Counted by incident tracking system.
Status: Recommended as a core KPI.
Caveat: Members noted that current systems do not support the collection and categorization of service complaints received through a wide variety of channels (e.g. Minister's correspondence, general e-mails, phone calls).

MAF CATEGORY: STEWARDSHIP

Metrics for Agent Utilization

KPI: Cost per Contact
Description: Total labour costs divided by total service requests. Definition of labour cost to be determined.
Objective: Provides a snapshot of current operational efficiency, specifically related to agents/manpower.
Definition: TBD
Status: Recommended as a Core KPI.

KPI: Agent Capacity
Description: The anticipated number of hours of agent time available for mail service for each agent.
Objective: Ensures that agent resources are dedicated to required service functions.
Status: Recommended as operational KPI for mail processing service models.

KPI: Agent Adherence
Description: An assessment of service agent adherence to schedule and making oneself available during anticipated service periods.
Objective: Contributes to resourcing effectiveness.
Definition: Calculated as total agent login time divided by scheduled work time.
Status: Recommended as operational KPI for mail processing service models.

KPI: Agent Occupancy
Description: The percentage of agent time spent in direct mail service, including wrap-up time.
Objective: Measures effective use of channel resources.
Definition: (Response time + wrap-up time) divided by total agent login time over the measured period.
Status: Recommended as operational KPI for mail processing service models.

KPI: Resource Allocation
Description: A management indicator assessing allocated agent positions to service delivery.
Objective: Ensures accurate resourcing levels to achieve target service levels.
Definition: Locally defined
Suggested benchmark / Range: TBD
Status: Recommended as operational KPI for mail service models.

Metrics for Service Effectiveness

No KPIs have yet been identified for this theme; the working group is asked to contribute suggestions.

Metrics for Use of Technology

KPI: Automated Response Ratio
Description: Ratio of applications received and completed without agent handling to total applications received.
Status: Proposed as an operational KPI.

Metrics for Channel Take-up

KPI: Applications Received
Description: Total applications/mail entering the processing centre.
Objective: Measures overall service demand.
Definition: See the ACCESS measure (Applications/Pieces Opened).
Status: Proposed as a Core KPI.

MAF CATEGORY: PEOPLE

Further discussion with departments and agencies will be conducted to identify effective KPIs under this MAF category. Candidate measures include:

Total Months – Staff on Strength / Average Months on Strength per Agent: a measure of the total experience level of the agents/staff within the service centre. Monitoring this over time provides a measure of the impact of staff turnover.

Staff Turnover Ratio: a measure of the 'churn' rate within the agent team. Provides a secondary indicator of service centre health, and often correlates to overall customer satisfaction levels.

Training Days/Agent: total training days delivered during the measurement period divided by the number of agents. Training is required for program/service delivery, for technology, and for the development of skills related to professionalism and customer interaction.

Agent Coaching Ratio: number of hours of one-on-one coaching time per agent. Helps measure the utilization of service centre supervisor time, as well as the investment in agent skill improvement.

12.0 USING SERVICE DELIVERY KPIs IN DEPARTMENTAL REPORTING

The MAF provides the primary framework for departments to prepare required annual performance reports that provide formal feedback to Deputy Ministers. The Treasury Board of Canada Secretariat is developing a Web-based approach to support and streamline departmental performance reporting. Service delivery key performance indicators will become an important component of these departmental reports and will significantly contribute to a common understanding of overall service channel performance across government.

13.0 CONTACT INFORMATION

Further information, suggestions and contributions can be forwarded to:

Service Delivery Improvement
Treasury Board of Canada, Secretariat
2745 Iris Street
Ottawa, Ontario K1A 0R5

Victor Abele (abele.victor@tbs-sct.gc.ca)
Director, Service Delivery Improvement
Telephone: (613) 946-6264
Fax: (613) 952-7232

Shalini Sahni (Sahni.Shalini@tbs-sct.gc.ca)
Analyst
Telephone: (613) 948-1119
Fax: (613) 952-7232

APPENDIX A: Terms and Definitions

ACD – Automatic Call Distributor: a software/hardware device that manages call queues and routes calls from the queue to appropriate agents based on any number of caller parameters.

Channel – The primary service channels are telephone, Internet, mail and in-person.

CTI – Computer Telephony Integration: technology that provides an integrated phone/computer capability to the service agent. CTI provides features such as automatic caller file retrieval, soft phone, referral/call-back electronic forms with response script suggestion, and quick access to the mainframe and online reference material.

IVR/VR – Interactive Voice Response/Voice Recognition: two related terms describing two types of self-service technology employed in the telephone service channel. Interactive Voice Response provides the caller with a series of options to be selected using the telephone keypad, and delivers IVR recordings as selected by the caller. Voice Recognition allows the caller to speak the question or say an option from a recorded list.

KPI – Key Performance Indicator: a measurable objective which provides a clear indication of service centre capability, e.g. caller wait time, quality, customer satisfaction, etc.

APPENDIX B: References

Citizen First 3 report, Erin Research Inc., Institute of Public Administration of Canada / Institute for Citizen-Centred Service, 2003.

Service Improvement Initiative – How to Guide, Treasury Board Secretariat, Canada, 2000.

Summary Report on Service Standards, Fiona Seward, Service Delivery Improvement, Treasury Board Secretariat, Canada, October 1, 2003.

Key Performance Indicators Workshop, Treasury Board Secretariat Canada and Burntsands Consulting, 2003.

Common Web Traffic Metrics Standards, Ivackovic and Costa, Treasury Board Secretariat, Canada, Version 2.0, 2003.

Performance Measures for Federal Agency Websites: Final Report, McClure, Sprehe and Eschenfelder; joint report for the Defense Technical Information Center, Energy Information Administration and Government Printing Office, U.S., 2000.

Performance Management Metrics for DWP Contact Centres, Eppes, Department for Work and Pensions, United Kingdom, 2001.

Service Management Framework Report, Consulting and Audit Canada (Project 550-0743), Version 1.1, March 21, 2004.

APPENDIX C: Summary of Core Key Performance Indicators

These core indicators were vetted by the working group and are recommended for inclusion into the MAF.

Phone: Call Access; Caller Access; Abandoned Calls; Average Speed to Answer; Answer Accuracy; Client Satisfaction Level; Cost per Call; First Call Resolution; Call Avoidance; Calls Answered by IVR Successfully; Calls; Callers

In-Person: Visitor Access; Client Satisfaction Level; Service Complaints; Cost per Contact; Visitors

Internet: Search Engine Ranking; Direct Access Ratio; Server Availability Percentage; Referral Percentage; Conversion Rate; Site Error Messages; Professionalism; Client Satisfaction Level; Cost per Visit / Cost per Visitor; Visits; Visitors

Mail: Applications/Pieces Opened; Applications Completed; Applications/Mail in Process; Average Cycle Time; Pass Through Ratio; Client Satisfaction Level; Service Complaints; Cost per Contact; Applications Received