Evaluating HRD Programs

Chapter 7

Werner & DeSimone (2006)


Effectiveness
- The degree to which a training program (or other HRD program) achieves its intended purpose
- Measures are relative to some starting point
- Measures how well the desired goal is achieved

Evaluation


HRD Evaluation
Textbook definition: The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.


In Other Words
Are we training:
- the right people
- the right stuff
- the right way
- with the right materials
- at the right time?


Evaluation Needs
- Descriptive and judgmental information needed
- Objective and subjective data
- Information gathered according to a plan and in a desired format
- Gathered to provide decision-making information

Purposes of Evaluation
- Determine whether the program is meeting the intended objectives
- Identify strengths and weaknesses
- Determine the cost-benefit ratio
- Identify who benefited most or least
- Determine future participants
- Provide information for improving HRD programs

Purposes of Evaluation (continued)
- Reinforce major points to be made
- Gather marketing information
- Determine if the training program is appropriate
- Establish a management database

Evaluation Bottom Line
- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are benefits of HRD readily evident to all?

How Often are HRD Evaluations Conducted?
- Not often enough!!!
- Frequently, only end-of-course participant reactions are collected
- Transfer to the workplace is evaluated less frequently

Why HRD Evaluations are Rare
- Reluctance to have HRD programs evaluated
- Evaluation requires expertise and resources
- Factors other than HRD can cause performance improvements, e.g., the economy, equipment, policies, etc.

Need for HRD Evaluation
- Shows the value of HRD
- Provides metrics for HRD efficiency
- Demonstrates a value-added approach for HRD
- Demonstrates accountability for HRD activities
- Everyone else has it; why not HRD?

Make or Buy Evaluation
- I bought it, therefore it is good.
- Since it's good, I don't need to posttest.
- Who says it's:
  - Appropriate?
  - Effective?
  - Timely?
  - Transferable to the workplace?

Evolution of Evaluation Efforts
1. Anecdotal approach: talk to other users
2. Try before buy: borrow and use samples
3. Analytical approach: match research data to training needs
4. Holistic approach: look at the overall HRD process, as well as individual training

Models and Frameworks of Evaluation
- Table 7-1 lists six frameworks for evaluation
- The most popular is that of D. Kirkpatrick:
  - Reaction
  - Learning
  - Job Behavior
  - Results

Kirkpatrick's Four Levels
- Reaction: focus on trainees' reactions to the program
- Learning: did they learn what they were supposed to?
- Job Behavior: was it used on the job?
- Results: did it improve the organization's effectiveness?

Issues Concerning Kirkpatrick's Framework
- Most organizations don't evaluate at all four levels
- Focuses only on post-training
- Doesn't treat inter-stage improvements
- WHAT ARE YOUR THOUGHTS?

Other Frameworks/Models
- CIPP: Context, Input, Process, Product (Galvin, 1983)
- Brinkerhoff (1987):
  - Goal setting
  - Program design
  - Program implementation
  - Immediate outcomes
  - Usage outcomes
  - Impacts and worth

Other Frameworks/Models (continued)
- Kraiger, Ford, & Salas (1993):
  - Cognitive outcomes
  - Skill-based outcomes
  - Affective outcomes
- Holton (1996), five categories:
  - Secondary Influences
  - Motivation Elements
  - Environmental Elements
  - Outcomes
  - Ability/Enabling Elements
- Phillips (1996):
  - Reaction and Planned Action
  - Learning
  - Applied Learning on the Job
  - Business Results
  - ROI

A Suggested Framework
- Reaction: Did trainees like the training? Did the training seem useful?
- Learning: How much did they learn?
- Behavior: What behavior change occurred?
- Results: What were the tangible outcomes? What was the return on investment (ROI)? What was the contribution to the organization?

Data Collection for HRD Evaluation
Possible methods:
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/Performance tests
- Archival performance information

Interviews
Advantages:
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact
Limitations:
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained interviewers needed

Questionnaires
Advantages:
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options
Limitations:
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate

Direct Observation
Advantages:
- Nonthreatening
- Excellent way to measure behavior change
Limitations:
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Need trained observers

Written Tests
Advantages:
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible
Limitations:
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias

Simulation/Performance Tests
Advantages:
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor, and affective domains
Limitations:
- Time consuming
- Simulations often difficult to create
- High costs to develop and use

Archival Performance Data
Advantages:
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects
Limitations:
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes

Choosing Data Collection Methods
- Reliability: consistency of results, and freedom from collection-method bias and error
- Validity: does the device measure what we want to measure?
- Practicality: does it make sense in terms of the resources used to get the data?
Type of Data Used/Needed
- Individual performance
- Systemwide performance
- Economic

Individual Performance Data
- Individual knowledge
- Individual behaviors
- Examples:
  - Test scores
  - Performance quantity, quality, and timeliness
  - Attendance records
  - Attitudes

Systemwide Performance Data
- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates

Economic Data
- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on investment (ROI)
- Financial utility calculations

Use of Self-Report Data
- Most common method
- Pre-training and post-training data
- Problems:
  - Mono-method bias: desire to be consistent between tests
  - Socially desirable responses
  - Response shift bias: trainees adjust expectations to the training

Research Design
Specifies in advance:
- the expected results of the study
- the methods of data collection to be used
- how the data will be analyzed

Research Design Issues
- Pretest and Posttest:
  - Shows the trainee what the training has accomplished
  - Helps eliminate pretest knowledge bias
- Control Group:
  - Compares the performance of the trained group against that of a similar group that received no training

Recommended Research Design
- Pretest and posttest with a control group (a minimal sketch follows below)
- Whenever possible:
  - Randomly assign individuals to the test group and the control group to minimize bias
  - Use a time-series approach to data collection to verify that the performance improvement is due to the training
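To make this design concrete, here is a minimal Python sketch (not from the textbook); all scores and group sizes are hypothetical. It estimates the training effect as the trained group's gain minus the control group's gain, so that improvement caused by factors other than training is netted out.

```python
# A minimal sketch (not from the textbook) of a pretest/posttest design with a
# control group. All scores and group sizes below are hypothetical.

def mean(values):
    return sum(values) / len(values)

# Hypothetical test scores (0-100) for randomly assigned groups
trained_pre  = [62, 58, 70, 65, 60]
trained_post = [78, 74, 85, 80, 77]
control_pre  = [61, 63, 66, 59, 64]
control_post = [64, 65, 69, 61, 66]

trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

# The control group's gain captures improvement caused by factors other than
# training (economy, equipment, retesting, etc.); subtracting it isolates the
# effect attributable to the training itself.
training_effect = trained_gain - control_gain

print(f"Trained group gain:        {trained_gain:.1f} points")
print(f"Control group gain:        {control_gain:.1f} points")
print(f"Estimated training effect: {training_effect:.1f} points")
```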

Ethical Issues Concerning Evaluation Research
- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results

Assessing the Impact of HRD
- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.

HRD Program Assessment
- HRD programs and training are investments
- Line managers often see HR and HRD as costs, i.e., revenue users rather than revenue producers
- You must prove your worth to the organization, or you'll have to find another organization

Two Basic Methods for Assessing Financial Impact
- Evaluation of training costs
- Utility analysis

Evaluation of Training Costs
- Cost-benefit analysis: compares the cost of training to the benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis: focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Return on Investment
Return on investment = Results/Costs


Calculating Training Return On Investment
Operational results area: Quality of panels
  How measured: % rejected
  Before training: 2% rejected (1,440 panels per day)
  After training: 1.5% rejected (1,080 panels per day)
  Difference: 0.5% (360 panels per day)
  Expressed in $: $720 per day; $172,800 per year

Operational results area: Housekeeping
  How measured: Visual inspection using a 20-item checklist
  Before training: 10 defects (average)
  After training: 2 defects (average)
  Difference: 8 defects
  Expressed in $: Not measurable in $

Operational results area: Preventable accidents
  How measured: Number of accidents; direct cost of the accidents
  Before training: 24 per year ($144,000 per year)
  After training: 16 per year ($96,000 per year)
  Difference: 8 per year ($48,000 per year)
  Expressed in $: $48,000 per year

Total savings: $220,800 per year

ROI = Return / Investment = Operational results / Training costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
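The arithmetic behind this example can be reproduced in a few lines of Python. This sketch only restates the slide's figures; the 240 working days per year is inferred from the $720/day and $172,800/year values.

```python
# Reproducing the ROI arithmetic from the table above (Robinson & Robinson, 1989).
# The 240 working days per year is inferred from $720/day and $172,800/year.

panel_quality_savings = 720 * 240          # $172,800 per year from fewer rejected panels
accident_savings      = 144_000 - 96_000   # $48,000 per year from fewer preventable accidents

operational_results = panel_quality_savings + accident_savings   # $220,800 total savings
training_costs      = 32_564                                     # cost of the training program

roi = operational_results / training_costs
print(f"Operational results: ${operational_results:,} per year")
print(f"ROI = {roi:.1f}")   # about 6.8, as on the slide
```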

Types of Training Costs
- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants

Direct Costs
- Instructor:
  - Base pay
  - Fringe benefits
  - Travel and per diem
- Materials
- Classroom and audiovisual equipment
- Travel
- Food and refreshments

Indirect Costs
- Training management
- Clerical/administrative support
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs

Development Costs
- Fee to purchase the program
- Costs to tailor the program to the organization
- Instructor training costs

Overhead Costs
- General organization support
- Top management participation
- Utilities, facilities
- General and administrative costs, such as HRM

Compensation for Participants
- Participants' salary and benefits for time away from the job
- Travel, lodging, and per-diem costs
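As a rough illustration of how these cost categories roll up into the training-cost figure used as the ROI denominator, here is a short Python sketch; every dollar amount in it is hypothetical and not from the textbook.

```python
# Hypothetical roll-up of the five training-cost categories into a total
# training cost (the denominator of the ROI calculation). All amounts are
# invented for illustration; they are not from the textbook.

costs = {
    "direct":                    9_000,   # instructor pay/benefits, materials, classroom, food
    "indirect":                  3_500,   # training management, clerical support, shipping
    "development":               6_000,   # program purchase, tailoring, instructor training
    "overhead":                  2_000,   # organizational support, utilities, G&A allocation
    "participant_compensation": 12_000,   # trainee salaries/benefits and travel during training
}

total_training_cost = sum(costs.values())
print(f"Total training cost: ${total_training_cost:,}")

# With the measurable benefits in hand, ROI = benefits / total training cost
benefits = 220_800   # e.g., the operational results figure from the earlier slide
print(f"ROI = {benefits / total_training_cost:.1f}")
```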

Measuring Benefits
- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in the dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs

Utility Analysis
- Uses a statistical approach to support claims of training effectiveness (a numeric sketch follows the definitions below):

  ∆U = (N)(T)(dt)(SDy) - C

  where:
  - N = Number of trainees
  - T = Length of time benefits are expected to last
  - dt = True performance difference resulting from training
  - SDy = Dollar value of untrained job performance (in standard deviation units)
  - C = Cost of training

Critical Information for Utility Analysis
- dt = the difference in units produced between trained and untrained employees, divided by the standard deviation in units produced by the trained employees
- SDy = the standard deviation, in dollars, of the overall productivity of the organization
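A minimal numeric sketch of this utility calculation follows; every value in it is hypothetical and used only to show how the terms combine.

```python
# Minimal sketch of the utility formula  ∆U = (N)(T)(dt)(SDy) - C.
# Every value below is hypothetical and used only for illustration.

N   = 50         # number of trainees
T   = 2.0        # years the training benefit is expected to last
dt  = 0.6        # true performance difference, in standard deviation units
SDy = 10_000.0   # dollar value of one SD of untrained job performance
C   = 75_000.0   # total cost of training all 50 trainees

delta_U = N * T * dt * SDy - C   # estimated net dollar gain from training
print(f"Estimated utility gain: ${delta_U:,.0f}")   # $525,000 with these numbers
```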

Ways to Improve HRD Assessment
- Walk the walk, talk the talk: MONEY
- Involve HRD in strategic planning
- Involve management in HRD planning and estimation efforts
  - Gain mutual ownership
- Use credible and conservative estimates
- Share credit for successes and blame for failures

HRD Evaluation Steps
1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute evaluation strategy.

Summary
- Training results must be measured against costs
- Training must contribute to the bottom line
- HRD must justify itself repeatedly as a revenue enhancer
