
CORPORATE LEADERSHIP COUNCIL

MARCH 2004
www.corporateleadershipcouncil.com

LITERATURE KEY FINDINGS

Practical Examples of Measuring Training
Effectiveness
TRAINING ROI CALCULATIONS
§ Cost-benefit analysis—Vulcan Materials, a US building materials and construction chemicals supplier, conducts cost-benefit analysis for many training programmes through the following methods:1

§ Before training—Prior to the start of a training programme, the organisation’s training staff asks middle managers to estimate the savings that will result from a proposed training intervention in their departments.
- The managers also assess, on a scale of 0 to 1.00, their confidence that the training will produce savings.
- Trainers then multiply savings by confidence ratings to obtain an estimate of total savings they can expect from the course.

§ After training—Following training, evaluators review actual cost savings or revenue increases.
- Line managers then estimate, by assigning a percentage, the extent to which they believe the improvement(s) can be attributed to training.
- The organisation multiplies actual savings or increases by line managers’ estimates, and compares the resulting figure to the forecast of total savings from the course, as illustrated in the sketch below.
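The arithmetic behind this before-and-after comparison is simple. The sketch below, in Python with invented figures (none of them from Vulcan Materials), forecasts expected savings from manager estimates and confidence ratings, then compares the forecast with manager-attributed actual savings.

```python
# Illustrative sketch of the before/after cost-benefit estimates described
# above. All figures are hypothetical examples, not Vulcan Materials data.

# Before training: middle managers estimate savings and rate their
# confidence (0 to 1.00) that the training will produce those savings.
manager_estimates = [
    {"estimated_savings": 40_000, "confidence": 0.70},
    {"estimated_savings": 25_000, "confidence": 0.50},
    {"estimated_savings": 15_000, "confidence": 0.90},
]

# Forecast of total savings = sum of (estimated savings x confidence rating).
forecast_savings = sum(e["estimated_savings"] * e["confidence"]
                       for e in manager_estimates)

# After training: evaluators record actual savings, and line managers
# estimate the percentage attributable to the training itself.
actual_savings = 62_000
attribution_to_training = 0.60  # line managers' estimate

attributed_savings = actual_savings * attribution_to_training

print(f"Forecast savings:      {forecast_savings:,.0f}")
print(f"Attributed savings:    {attributed_savings:,.0f}")
print(f"Variance vs. forecast: {attributed_savings - forecast_savings:,.0f}")
```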

To calculate the ROI of training, companies can use the following methods:
§ Business impact analysis
§ Cost-benefit analysis
§ Impact of training on performance improvement
§ Revenue increases through training
§ Value added per participant

§ Training ROI Formula—Merck & Company uses a unique formula, detailed below, to measure the impact of specific training programmes. With this model, Merck determined that its average ROI for training programmes is 84 percent and terminated 53 programmes that were not producing high enough returns.2
GAIN = D x SD$ x JSI x N

where:
D = shift in performance by the average individual undergoing training, expressed in standard deviations from the pre-training average
SD$ = the value in dollars of one standard deviation of performance shift
JSI = percentage of job skills affected by training
N = number of participants who underwent training
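A minimal numeric sketch of the GAIN formula follows, using made-up parameter values (the source does not publish Merck's inputs); the final step that nets the gain against programme cost to express an ROI percentage is an added assumption for illustration.

```python
# Sketch of the GAIN formula above with hypothetical inputs.
d = 0.4              # D: performance shift of the average trainee, in standard deviations
sd_dollars = 10_000  # SD$: dollar value of one standard deviation of performance
jsi = 0.5            # JSI: fraction of job skills affected by the training
n = 120              # N: number of participants who underwent training

gain = d * sd_dollars * jsi * n
print(f"GAIN = {gain:,.0f}")  # 240,000 with these inputs

# Netting the gain against programme cost to express an ROI percentage is an
# assumption added here for illustration; the source only states the GAIN formula.
programme_cost = 150_000
roi_percent = (gain - programme_cost) / programme_cost * 100
print(f"Illustrative ROI = {roi_percent:.0f}%")
```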

§ Training expenditure calculations—Financial services companies profiled in past Council research use the following metrics to evaluate their training functions (a calculation sketch follows the list):3

§ Training expenditures as a percentage of total payroll—Indicates the amount of training investment relative to labour costs.
§ Training expenditures per employee—Identifies the average dollar amount spent on training per employee. Allows HR professionals to compare a company’s investment in human development to industry competitors.
§ Training hours per employee—Measures the amount of training time each employee receives on average. Provides organisations with a non-financial quantity measure, allowing them to adjust for different uses of training investment while benchmarking.
§ Trainer-to-employee ratio—Measures the number of employees that each training employee supports in the organisation.
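The four metrics above are simple ratios. The sketch below computes them from hypothetical workforce and budget figures; none of the numbers come from the profiled companies.

```python
# Hypothetical inputs; not figures from the profiled financial services companies.
total_payroll = 50_000_000        # annual payroll, in euros
training_expenditure = 1_250_000  # annual training spend, in euros
total_training_hours = 36_000     # hours of training delivered in the year
employees = 1_000                 # headcount
trainers = 8                      # dedicated training staff

expenditure_pct_of_payroll = training_expenditure / total_payroll * 100
expenditure_per_employee = training_expenditure / employees
hours_per_employee = total_training_hours / employees
employees_per_trainer = employees / trainers  # reported as a 1:N ratio

print(f"Training expenditure as % of payroll: {expenditure_pct_of_payroll:.1f}%")
print(f"Training expenditure per employee:    €{expenditure_per_employee:,.0f}")
print(f"Training hours per employee:          {hours_per_employee:.1f}")
print(f"Trainer-to-employee ratio:            1:{employees_per_trainer:.0f}")
```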

Specific training metrics at profiled financial services companies follow in the table
below.
TRAINING STATISTICS AT PROFILED FINANCIAL SERVICES COMPANIES

Profiled    Training Expenditure as a      Training Expenditure    Training Hours    Trainer-to-
Company     Percentage of Total Payroll    Per Employee            Per Employee      Employee Ratio
A           2 percent                      €1,515                  n/a               1:200
B           2 percent                      €457                    31.5              1:144
C           3.9 percent                    €1,355                  131.6             1:88


By assigning numerical values to pre- and post-training employee performance, companies can calculate the value that training interventions add to each participant.

§ Value added per participant calculation—A medical products company profiled in past Learning and Development Roundtable research uses the following process to assess the ROI of training (a worked sketch follows these steps):4

§ Employee assessment—Prior to any training intervention, HR and business unit managers prepare job-specific performance criteria, assigning different importance scores to specific dimensions of a given job. Business unit managers then assess their employees’ current levels of performance along individual job dimensions, using a scale from 1 to 5, where 1 equals worst performance, 3 equals average performance, and 5 equals the best performance that the manager has ever experienced. The scale requires that managers assess employees relative to their peer group.
§ Performance improvement ROI survey—Next, business unit and HR managers complete a performance improvement ROI survey to jointly place a dollar amount on the value of one standard deviation in employee performance (e.g., if a sales representative currently assessed at 2 were to move up to 3 after sales training, what would be the dollar value of that improvement?). The survey takes into consideration the relative value of different jobs in the corporation to overall financial results as well as the relative value of individual skills.
§ Assessment of performance changes after training—Once employees complete their training, HR re-surveys business unit managers to assess employee performance changes. HR then calculates the “value-added per participant” based on their performance changes and the previously determined dollar value of those performance changes. In the final step, HR completes the ROI calculation, considering the total number of training participants, their post-training performance change, the value of that performance change, the duration of performance gain in years, and the total cost of the training activity.
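To make the process above concrete, the sketch below runs the three steps with invented figures; the rating data, dollar value per standard deviation, duration of gain, and cost are illustrative assumptions, and treating one scale point as roughly one standard deviation is a simplification for the example.

```python
# Illustrative value-added-per-participant calculation; all numbers are hypothetical.

# Step 1: managers rate participants on the 1-5 scale before and after training.
pre_scores  = [2, 3, 2, 4, 3]
post_scores = [3, 4, 3, 4, 4]

# Step 2: survey-derived dollar value of a one-standard-deviation performance shift.
value_per_sd = 8_000  # dollars (hypothetical survey result)
sd_of_scale = 1.0     # simplification: treat one scale point as roughly one standard deviation

# Step 3: value added per participant and the overall ROI calculation.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
value_added_per_participant = [c / sd_of_scale * value_per_sd for c in changes]

duration_years = 2    # assumed duration of the performance gain
total_cost = 30_000   # total cost of the training activity (hypothetical)

total_benefit = sum(value_added_per_participant) * duration_years
roi = (total_benefit - total_cost) / total_cost * 100

avg_value_added = sum(value_added_per_participant) / len(changes)
print(f"Average value added per participant: {avg_value_added:,.0f}")
print(f"Total benefit over {duration_years} years: {total_benefit:,.0f}")
print(f"ROI: {roi:.0f}%")
```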
§ Isolation of training impact—Accenture analysed the ROI of all its training initiatives for 261,000 employees over the history of the company to learn that the ROI of training was 353 percent.5
- Accenture used a patent-pending technique which forced out the effects of inflation, experience, market cycle, and employee level to isolate the training impact on a per-person margin. Accenture determined a return of €2.91 in net benefits for every euro invested in training.
- However, the analysis revealed negative return on most corporate management training. The company eliminated its courses with low or negative results, saving the company millions in training.
- As a result, the company realised a 1,200 percent ROI for its technical training curriculum and 600 percent for sales training.

§ Business impact of training—In addition to conducting end-of-course or training programme questionnaires, Boots The Chemists measures the impact of training on the bottom line. For instance, sales staff training in product knowledge is measured by its impact on sales, while the success of systems and operations training is measured in terms of reduction in time spent by staff on routine tasks. In some courses, such as the marketing excellence programme, the company undertakes follow-up work to measure the level of content understanding and how participants have applied their learnings within their role.6

KIRKPATRICK MODEL EVALUATIONS

Training evaluations with the Kirkpatrick model provide insight into the impact of training on employee satisfaction, knowledge, behavioural change, and business results.

§ All Kirkpatrick levels—Consignia (now Royal Mail) uses the Kirkpatrick model of training evaluation to monitor its e-learning programme. The company assesses different levels of the model as follows:7
§ Level One—Online tool asks questions such as what participants thought of the product and how they accessed it. The company also employs this tool to assess levels of usage.
§ Level Two—Tool incorporates tests of level of knowledge during the course.
§ Levels Three and Four—Sponsors and training professionals are responsible for measuring these levels.

§ Level Three evaluation—Arthur Andersen (now Accenture) uses video cameras to conduct a Level Three evaluation of a two-day course for executives on the essentials of giving good business presentations. Executives were first taped giving presentations prior to the course. Next, they were taped at the conclusion of the course. Finally, the executives were taped six months later as they delivered presentations to clients. Evaluators graded each tape against a set list of performance indicators.8

§ Level Three evaluation—For a leadership training course, evaluators at Motorola use 360-degree evaluations to measure Kirkpatrick’s Level Three. Evaluators send surveys to trainees, trainees’ supervisors and their subordinates that ask all three groups to rate the trainees on the frequency with which they demonstrate certain behaviors associated with leadership. The survey is conducted four times a year for two years, and evaluators attempt to identify an increase in scores.9

§ Level Three evaluation—Hutchinson Technology conducted Level Three evaluations by assessing employees’ skill levels prior to training by surveying customers, supervisors and peers on how frequently employees displayed particular behaviors. After training, the company surveyed the same individuals using a similar survey. The survey revealed that employees used the selected skills approximately 75 percent of the time, as opposed to 50 percent of the time prior to undergoing the training.10

ALTERNATIVE TRAINING EVALUATION MODELS

Measuring training waste such as the number of empty seats and cancelled classes can help organisations to optimise their training resource allocation.

§ Training waste analysis—Jeffers Company seeks alignment by ensuring that development investments produce the greatest possible benefit. Jeffers’s learning function developed a cost analysis process for identifying the largest categories of waste in internal training administration. The company tracks metrics such as cost of classes, number of participants, empty seats, and cancelled classes. For example, Jeffers identified “no shows” to training as a prominent cost driver, since excessive registrations led, in the past, to overscoping of capacity and unnecessary cost outlays. Addressing this issue helped Jeffers substantially reduce the cost of training delivery.11
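A short sketch of this kind of waste accounting follows, under the assumption that waste is costed as the pro-rated price of unused seats plus the full cost of cancelled classes; the figures and categories are illustrative, not Jeffers data.

```python
# Illustrative training-waste tally; figures are hypothetical.
classes = [
    # (cost of class, seats available, seats filled, cancelled?)
    (12_000, 20, 14, False),
    ( 9_000, 16, 16, False),
    (11_000, 20,  0, True),   # cancelled class: full cost counted as waste
    (10_000, 18, 12, False),
]

empty_seat_waste = 0.0
cancellation_waste = 0.0

for cost, seats, filled, cancelled in classes:
    if cancelled:
        cancellation_waste += cost
    else:
        # cost of unused capacity, pro-rated per seat
        empty_seat_waste += cost / seats * (seats - filled)

total_waste = empty_seat_waste + cancellation_waste
print(f"Empty-seat waste:   {empty_seat_waste:,.0f}")
print(f"Cancellation waste: {cancellation_waste:,.0f}")
print(f"Total waste:        {total_waste:,.0f}")
```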
§ Participant reaction assessment—The Travelers Companies use various qualitative and quantitative methods to evaluate training programmes. For example, participants’ reactions are immediately captured after a training programme through evaluator observations and post-course group discussion. If more than 25 percent of participants express dissatisfaction with some aspect of the programme as it relates to the job, a committee consisting of line managers, instructional designers, and the instructor reviews the evaluation data to determine a course of action.12

Qualitative training evaluations such as telephone surveys and longer e-mail responses can help organisations to gain a perspective on employees’ “feelings” about their training offerings.

§ Qualitative training evaluation—Training evaluation at Johnson & Johnson focuses on training participants’ change of behaviour on the job. The company bases most of its training decisions on qualitative data acquired from “worldwide employee surveys,” such as six-month post-training self-reports, annual surveys, trainer observations, and supervisor questionnaires. For example, to assess training programmes for first-line supervisors, participants have to define their perceptions of change in relation to a set of critical supervisory skills. In addition, participants receive evaluations from their supervisors as well as from five to eight subordinates via a survey.13

§ Telephone surveys and e-mails—The Defence Evaluation Research Agency (DERA) (now Defence Science and Technology Laboratory) evaluates its training courses through various methods. Systems for feedback include the usual end-of-course questionnaires prepared by participants as well as the following:14
§ Evaluation of outcome forms completed by resource managers and business group managers for members of their team to ensure that training offerings meet and satisfy key areas
§ More reflective, longer e-mailed responses from participants
§ Telephone surveys by training staff of 10 percent of participants to ask whether the original objectives have been met and the desired learning outcomes have been achieved
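As a small illustration of the last feedback channel above, the sketch below draws a roughly 10 percent random sample of participants for follow-up telephone surveys and tallies how many report that the original objectives were met; the roster and responses are simulated, not DERA data.

```python
import random

# Hypothetical participant roster for a completed course.
participants = [f"participant_{i:03d}" for i in range(1, 241)]

# Telephone-survey roughly 10 percent of participants to ask whether the
# original objectives and desired learning outcomes were achieved.
sample = random.sample(participants, k=max(1, round(0.10 * len(participants))))

# Simulated yes/no answers to "were the original objectives met?"
responses = {p: random.random() < 0.8 for p in sample}

objectives_met_rate = sum(responses.values()) / len(responses)
print(f"Sampled {len(sample)} of {len(participants)} participants by telephone")
print(f"Objectives reported met: {objectives_met_rate:.0%}")
```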

Training databases and balanced scorecards provide companies with a framework for tracking key indicators of training effectiveness.

§ Online training database—MCI Communications maintains a corporate training database that combines information from the company’s training sites. Using a PC-based information system, each training department uses software to evaluate its own training functions. A training manager can review, for example, an individual employee’s training history to identify gaps in development. The system can also accumulate training cost data, which the company uses to determine the ROI of training both companywide and by business unit.15

§ Course review triggers in online database—A company profiled in past Learning and Development Roundtable research collects course evaluations in a single database based on a consistent set of metrics related to course relevance and effectiveness. Any time a course scores below a pre-defined quality standard, a trigger is pulled for review of that course and its instructor. The manager responsible for a particular training course scoring below the threshold is accountable for presenting to training management an assessment of what caused the low evaluation scores and recommendations for improving or eliminating the course.16

§ Balanced scorecard—Qwest Communications uses a balanced scorecard that continually tracks financial and operational performance of the learning function and the business units with which it has partnered. The company is able to evaluate the effectiveness of its development interactions by measuring the following categories within the balanced scorecard:17
§ Client impact—Student ratings, classroom yield, content relevance and compression
§ Financial—Unit costs for assessment, development and delivery; productivity (delivery volume versus headcount increase)
§ Operational—Total training hours, enrollment volume
§ People—Percentage of development plans completed, L&D staff satisfaction and engagement

§ E-learning metrics—Cisco Systems tracks a number of metrics specific to its extensive e-learning offerings across the organisation. The impact of specific learning opportunities on employees is measured using the metrics outlined below:18
§ Pre-test and post-test scores per offering
§ Time to completion (i.e., time and visits required to complete an individual offering)
§ Time to competency, as measured by post-test scores and on-the-job behaviour against course objectives
§ Change in learner’s ability to perform job responsibilities against time spent learning
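The sketch below computes two of the e-learning measures listed above, the pre-test/post-test score change and time to completion, from a hypothetical learner log; the record layout is an assumed example, not Cisco's actual schema.

```python
# Hypothetical completion records for one e-learning offering; the field
# layout is an assumed example, not Cisco's actual schema.
records = [
    {"learner": "A", "pre_test": 55, "post_test": 82, "visits": 3, "minutes": 95},
    {"learner": "B", "pre_test": 61, "post_test": 78, "visits": 2, "minutes": 70},
    {"learner": "C", "pre_test": 48, "post_test": 74, "visits": 4, "minutes": 120},
]

avg_score_gain = sum(r["post_test"] - r["pre_test"] for r in records) / len(records)
avg_minutes = sum(r["minutes"] for r in records) / len(records)
avg_visits = sum(r["visits"] for r in records) / len(records)

print(f"Average pre/post score gain: {avg_score_gain:.1f} points")
print(f"Average time to completion:  {avg_minutes:.0f} minutes over {avg_visits:.1f} visits")
```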

NOTES

1 BNA’s Human Resources Library on CD [CD-ROM], “Training Program Assessment and Evaluation,” Available: The Bureau of National Affairs, Incorporated.
2 Dolezalek, H., T. Galvin, G. Johnson, and J. Barbian, “The 2003 Training Top 100,” Training (March 2003).
3 Corporate Leadership Council, Benchmarking Training Statistics, Washington: Corporate Executive Board (April 2002).
4 Learning and Development Roundtable, Reframing the Measurement Debate: Moving Beyond Program Analysis in the Learning Function, Washington: Corporate Executive Board (December 2001).
5 Schettler, J.
6 Author Unknown, “Training Programme Assessment and Evaluation,” IDS Study 679 (November 1999).
7 IRS Employment Review, “Delivering an E-learning Package,” IRS Employment Review 753 (June 2002).
8 Geber, B., “Does Your Training Make a Difference? Prove It!” Training (March 1995).
9 Geber, “Does Your Training Make a Difference? Prove It!”
10 Geber, “Does Your Training Make a Difference? Prove It!”
11 Learning and Development Roundtable, The Productivity and Efficiency Imperative, Washington: Corporate Executive Board (2002).
12 BNA’s Human Resources Library on CD [CD-ROM], “Training Strategies,” Available: The Bureau of National Affairs, Incorporated.
13 BNA’s Human Resources Library on CD [CD-ROM], “Training Strategies.”
14 Author Unknown, “Training Programme Assessment and Evaluation,” IDS Study 679 (November 1999).
15 BNA’s Human Resources Library on CD [CD-ROM], “Training Programme Assessment and Evaluation.”
16 Learning and Development Roundtable, Reframing the Measurement Debate: Moving Beyond Program Analysis in the Learning Function, Washington: Corporate Executive Board (December 2001).
17 Learning and Development Roundtable, The Productivity and Efficiency Imperative, Washington: Corporate Executive Board (2002).
18 Author Unknown, “Cross-Functional Task Force to Measure Effectiveness and Impact of e-Learning,” Cisco (2000). (Obtained through http://formare.erickson.it/archivio/maggio_giugno/cisco.pdf.)