
GAP Analysis

Using Traditional Pre-Employment Tools to Identify Skill Gaps in Incumbent Populations

Michael Blair, CenturyLink
Amanda Evans, PreVisor
Andy Solomonson, PreVisor

IPAC, July 21, 2010


PreVisor
• Leading provider of assessments for pre-employment and post-hire use
• Assessments for all jobs and competencies in the U.S. economy
• Serve more than 10,000 organizations worldwide; over 100 of Fortune 500
• Team includes more than 70 Industrial/Organizational Psychologists
• Industry leading Select2Perform™ online assessment platform
• Offices in US, UK, and Australia

CenturyLink
• Formed via the merger of CenturyTel and Embarq
• The fourth largest telecommunications provider in the U.S.
  – S&P 500
  – Headquartered in Monroe, LA
  – Operates in 33 states
  – Serves rural and urban markets
  – 7.1 million access lines
  – 2.2 million high speed internet subscribers
  – 450,000 video subscribers
  – Approximately 19,500 employees

Talent Management through Talent Measurement

[Diagram: talent measurement applications - Reorganization, Recruiting, Performance Benchmarking, Pre-Employment Testing, Succession Planning, Interviewing, Leadership Development, Promotion, Career Development, Placement, Training Needs and Skill Certification, Talent Redeployment]

Talent Measurement helps answer the questions:
• Who should I redeploy to different roles within my company?
• Which employees have skills that are critical in other parts of the company?
• Who should we train and develop? On what competencies or skills?
• How will a reorganization affect the talent landscape of my organization?
• How do our people stack up against changing roles, increasing expectations?

Background

Organizational context
• Experienced significant organizational change due to restructurings and realignments of employees & roles
• Workforce skill sets diverse and in some cases mismatched with job roles
Technical evolution
• Change from voice-centric analogue world to data-centric, high speed, digital world
Skills assessment was recommended by outgoing Director
• Well respected, helped to establish buy-in early on

Goals & Objectives

Desired Outcomes
• Understand the cumulative impact of changes on the organization’s ability to perform its role effectively and support the company
• Provide objective data to understand and address the needs of the organization
• Equip managers to better understand and address the needs of individual employees
• Identify technical and non-technical skills gaps
Reports/Tools
• Organization-wide gap analysis roll-up (Leadership & Directors)
• Regional gap analysis roll-up (Leadership & Directors)
• Manager gap analysis team roll-up (Managers)
• Individual gap assessment report (Managers)

Job Analysis to Gap Analysis – Conceptual Flow

[Flow diagram: Job Analysis identifies Technical Work Activities & Knowledge/Skill Areas and Non-Technical Behaviors and Competencies. The technical areas drive the Technical Knowledge/Skill Test and Technical Performance Ratings, each feeding a technical Gap Analysis; the non-technical areas drive the PreVisor Competency Assessment and Non-Technical Performance Ratings, each feeding a non-technical Gap Analysis.]

Job Analysis Approach and Participants

Job Analysis Approach
• Background data review
• Focus groups and onsite observations
• Job analysis questionnaire (JAQ)
  - Non-technical work behaviors and competencies
  - Technical work activities and knowledge/skill areas

Participants

Category            Employee Count    Responded    % Complete
Engineer I                 32              29           91%
Engineer II               335             243           73%
Engineer III                3               6          200%
Total Engineers           370             276           75%

Job Analysis Outcomes

• Compared job requirements across jobs, levels, regions
• Established 8 Technical Knowledge Areas
  - Examples: Fiber-based Design, Transmission Engineering
• Provided initial technical knowledge test “blueprint”
• Defined technical performance rating categories
• Confirmed Non-Technical Competencies
• Recommended 4 existing personality assessment components:
  - Business Acumen, Drive for Results, Self Motivation, Building Relationships
• Recommended 16 non-technical performance rating dimensions
  - Examples: Adaptability, Customer Focus, Decision Making

Technical Knowledge Test Development

• Identified SMEs and hosted item writing workshops
• Refined items, created content validation form
• SMEs provided content validation ratings
• Refined item pool, defined final 100-item test form

Job Analysis and Assessment Summary
Access Engineering partnered with HR and PreVisor
• Regular communication on plan, progress, uses, benefits
Advisory panel
• Corporate and regional managers involved in all phases of project
Job analysis
• Identified technical and non-technical KSAOs
• 276 of 370 engineers participated
Technical Knowledge Test development
• 8 technical knowledge areas
• Advisory panel + 9 Senior Engineers
Assessment
• Online assessment – technical knowledge test and personality test
• 327 of 332* engineers participated
Performance ratings
• Online job performance rating form – technical and non-technical areas
• 40 managers provided ratings on participating engineers

*Turnover, promotions, & JA results reduced the target engineer pool


Sharing Results: Executive Briefing

Briefed VP and Regional Directors


Reviewed results
• Focused on explaining and interpreting “unexpected”
or “uncomfortable” outcomes
Walked through reports
• Provided leaders with complete set of reports
• Leaders knew what their organization would be
receiving
Allowed time for review and digestion
• Leaders were prepared for questions and concerns

Technical Assessment Results Summary

Assessment validity reinforced by results


• Test content supported by job analysis and content validation
• Test scores are predictive of technical job performance ratings
- Conducted concurrent, criterion-related validation study
- Overall score-performance r = .50
- Content area r ranged from .30 to .60
• Engineer IIIs outperformed IIs; Engineer IIs outperformed Is
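
For context on what the criterion-related validation above computes, the sketch below correlates knowledge-test scores with manager ratings of technical performance. It is illustrative only: the values are invented and this is not the study's data or code.

    # Illustrative criterion-related validity check: correlate test scores
    # (% correct) with manager ratings of technical performance (7-point scale).
    # All values below are made up for the example.
    from scipy.stats import pearsonr

    test_scores = [52, 58, 61, 66, 69, 70, 74, 81]
    perf_ratings = [3.5, 4.0, 4.0, 4.5, 5.0, 5.0, 5.5, 6.0]

    r, p = pearsonr(test_scores, perf_ratings)
    print(f"validity coefficient r = {r:.2f} (p = {p:.3f})")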

Technical assessment results across all Engineers


• Average overall score = 66% (66/100 correct)
• Content area scores ranged from 53% to 77%
- Highest scores = Telephony Knowledge and OSP Engineering
- Lowest scores = New & Emerging Technologies and COE Engineering

Challenge: Technical Knowledge Results

[Chart: Access Engineering Knowledge Test - Total Test Score (% Correct) by Engineer Role]
  All Engineers (average): 66%
  Engineer I (N=30): 55%
  Engineer II (N=297): 67%
  Engineer III (N=3): 86%

• Results consistent with expected knowledge/skill level by role
• 66% average is indicative of expected level of proficiency and consistent with SME feedback on test difficulty

Support: Technical Knowledge Manager Ratings

[Chart: Technical Job Knowledge Ratings - Overall Rating Average by Role (average rating by direct managers, 7-point scale)]
  All Engineers (average): 4.67
  Engineer I (N=30): 3.33
  Engineer II (N=297): 4.81
  Engineer III (N=3): 5.71

• Rating results follow a similar pattern as knowledge test results by Engineer level

Additional Support: Scores by Content Area

[Chart: Access Engineering Knowledge Test Scores by Content Area - All Engineers (average % correct). Content areas: Telephony Knowledge, Copper-Based Design, Fiber-Based Design, New & Emerging Technologies, OSP Engineering, Construction Design, COE Engineering, Transmission Engineering. Area averages ranged from 53% to 77%, highest for Telephony Knowledge (77%) and OSP Engineering (76%), lowest for New & Emerging Technologies and COE Engineering (53%-54%).]

• Highest proficiency = Telephony Knowledge; OSP Engineering
• Lowest proficiency = New & Emerging Technologies; COE Engineering
• Scores by level (not shown) were consistent with overall results

Problem: Non-Technical Assessment Results

[Chart: Percentile scores (national norms) by non-technical assessment component - Business Acumen, Building Relationships, Drive for Results, Self Motivation, and Total Non-Technical Score. Component and total percentiles ranged from 23 to 42, all below the 50th percentile.]

Resolution: Non-Technical Knowledge Manager Ratings

[Chart: Average supervisor rating by non-technical performance dimension (7-point scale). Dimension averages ranged from 5.14 to 5.76; recoverable values include Adaptability 5.14, Communication 5.17, Customer Focus 5.60, Decision Making 5.14, Drive for Results 5.25, Productivity 5.47, Initiative 5.46, Professionalism 5.76, Organization 5.34, and Problem Solving 5.26.]

Cascading Results to Organization

Met with Directors, Regional Managers, and Managers for each region
• Overview of project and results
• Reviewing and using roll-up reports
• Strategies for meeting with engineers
• Notes on interpreting results
• Provided a point of contact from HR and PreVisor for ongoing support
  - Conducted follow-up meetings/training as needed

Reviewing and Using Roll-Up Reports
Reports are a tool to identify organizational, regional, and
manager/group level needs
Organizational and regional data:
• Strategic planning
• Organizational training and development
• Budget projections
• Comparisons across/within regions
Manager/Group data:
• Tactical planning
• Team training and development
• Assigning and managing teams
• Allocating work load
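
As a rough illustration of what these roll-ups involve, the sketch below aggregates individual gap-assessment results to manager, regional, and organization-wide averages. Column names and values are hypothetical, not the project's data or report code.

    # Hypothetical roll-up of individual results; field names and values are
    # illustrative only.
    import pandas as pd

    results = pd.DataFrame({
        "region":      ["East", "East", "West", "West"],
        "manager":     ["Mgr A", "Mgr A", "Mgr B", "Mgr B"],
        "engineer":    ["E1", "E2", "E3", "E4"],
        "pct_correct": [62, 71, 55, 77],
    })

    manager_rollup = results.groupby(["region", "manager"])["pct_correct"].mean()
    regional_rollup = results.groupby("region")["pct_correct"].mean()
    org_average = results["pct_correct"].mean()

    print(manager_rollup)
    print(regional_rollup)
    print(f"Organization-wide average: {org_average:.1f}% correct")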

Reviewing & Using Individual Reports
Organizational and group reports provide baseline and context
Individual gap assessment reports
• Identify areas of strength and developmental opportunities
• Identify disconnects between assessment results and performance
ratings
• Anticipate employee’s questions and areas of concern
Individual review/feedback sessions
• Establish organizational and group level results as baseline
• Review individual report
• Use results to help prepare 2009 performance plan
Additional/future uses
• Gap analysis results are a tool to assist with:
- Performance management
- Coaching and mentoring
- Training and development opportunities
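
One of the review tasks above is spotting disconnects between assessment results and performance ratings. A hypothetical rule-of-thumb check is sketched below; the cut points and field names are assumptions for illustration, not criteria from the project.

    # Hypothetical disconnect check: flag cases where a low assessment
    # percentile pairs with a high manager rating, or vice versa.
    # Thresholds are illustrative assumptions.
    def flag_disconnect(assessment_percentile: float, manager_rating: float) -> bool:
        low_score_high_rating = assessment_percentile <= 30 and manager_rating >= 5.5
        high_score_low_rating = assessment_percentile >= 71 and manager_rating <= 3.0
        return low_score_high_rating or high_score_low_rating

    print(flag_disconnect(23, 5.8))  # True: worth exploring in the feedback session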

GAP Analysis Report Example

[Report screenshot with callouts:]
• Yellow shaded row indicates group rollup scores. % correct is the group average; All Eng %ile is the percentile associated with the group’s average score, based on comparison to the All Engineer norm group.
• % correct = % of questions answered correctly, by section or overall. All Eng %ile = percentile rank based on comparison to All Engineers.
• Gray bar indicates no data available.
• Total test score is based on all 100 test questions.
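
For readers who want the arithmetic behind "All Eng %ile", the sketch below expresses a score as a percentile rank within an "All Engineer" norm group. The norm distribution is invented, and the exact percentile method used in the reports may differ.

    # Illustrative percentile rank against an "All Engineer" norm group.
    from scipy.stats import percentileofscore

    all_engineer_scores = [48, 53, 55, 60, 62, 66, 67, 70, 74, 77, 86]  # % correct
    group_average = 70

    all_eng_pctile = percentileofscore(all_engineer_scores, group_average, kind="mean")
    print(f"All Eng %ile for a score of {group_average}%: {all_eng_pctile:.0f}")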

GAP Analysis Report Example

[Report screenshot with callouts:]
• Individual percentile scores based on comparison to PreVisor national norm group.
• Total score = percentile score based on an equally-weighted combination of the four component scores.
• Manager ratings of individual employee on 7-point job knowledge scale.
• Yellow shaded row indicates group average (mean) rating.
• Average (mean) rating across all eight job content areas.
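
As a hedged illustration of the "equally-weighted combination of the four component scores", the sketch below takes an unweighted mean of four component percentiles. The component values are invented, and the actual report may re-norm the combined score rather than simply averaging percentiles.

    # Illustrative equally-weighted non-technical total score.
    components = {
        "Business Acumen": 40,
        "Building Relationships": 35,
        "Drive for Results": 30,
        "Self Motivation": 25,
    }

    total_score = sum(components.values()) / len(components)
    print(f"Equally-weighted total: {total_score:.1f}")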

GAP Analysis Report Example

[Report screenshot with callouts:]
• Manager ratings of individual employee on 7-point job performance scale.
• Yellow shaded row indicates group average (mean) rating.
• Average (mean) rating across all 15 performance dimensions.

GAP Analysis Report Example

[Report screenshot with callouts:]
• “1” and “2” buttons allow summary (collapsed) and detailed (expanded) views, respectively, for report rows and columns.
• “+” and “-” buttons allow expanded and collapsed views of specific report sections.

Individual Assessment Report Overview

[Report screenshot with callouts:]
• “Score” for the Technical knowledge test is the total number correct, overall and by section.
• Technical test percentile scores based on comparison to the All Engineer norm group. Non-technical percentiles based on comparison to PreVisor national norms. Percentile range = 0 to 100. Score zones: Low = 0 to 30, Medium = 31 to 70, High = 71 to 100.
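
The Low/Medium/High score zones described above map directly onto percentile cut points; a minimal, illustrative helper:

    # Map a 0-100 percentile to the report's score zones
    # (Low = 0-30, Medium = 31-70, High = 71-100) as described above.
    def score_zone(percentile: int) -> str:
        if not 0 <= percentile <= 100:
            raise ValueError("percentile must be between 0 and 100")
        if percentile <= 30:
            return "Low"
        if percentile <= 70:
            return "Medium"
        return "High"

    print(score_zone(84))  # -> High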

Individual Assessment: Non-Technical Content
This section provides in-depth interpretation and feedback
based on individual scores for each Non-Technical
assessment component, including component description,
score level interpretation, and development tips based on
score level (high, medium, or low).

Summary

Understand business needs


Determine employee skills and
competencies
Identify gaps
Close gaps
