
References and Testing
© 2013 Cengage Learning

Why Check References?


• Check for resume fraud
• Find new information about the applicant
• Check for potential discipline problems
• Predict future performance
© 2013 Cengage Learning

Checking for Resume Fraud


• Why Check?
  – 1/3 of resumes contain inaccurate info
  – over 500,000 people have bogus degrees
• Verifying Information
  – truth
  – error
  – embellishment
  – fabrication
• Obtaining Missing Information
  – unintentional omission
  – strategic omission
  – deceptive omission
• Alternative Methods
  – bogus application items
  – social security reports
  – hire professional reference checkers
http://www.youtube.com/watch?v=8u7WBlSIXWI
http://www.youtube.com/watch?v=B09DU_cXkR8
© 2013 Cengage Learning

Getting Info Can Be Difficult


Type of Information       % Asking   % Releasing
Employment dates              97          98
Eligible for re-hire          64          42
Salary history                66          41
Reason for leaving            94          19
Performance                   86          18
Employability                             16
Work habits                               13
People skills                             11
© 2013 Cengage Learning

Finding New Information About the Applicant
• Types of Information
  – personality
  – interpersonal style
  – background
  – work habits
• Problems
  – references seldom agree
  – people act in different ways in different situations
• Alternative Measures
  – psychological tests
  – letters of recommendation
  – biodata
  – resumes
  – interviews
© 2013 Cengage Learning

Checking for Potential Discipline Problems
• Criminal Records
• Previous employers
• Motor vehicle records
• Military records
• Credit reports
• Colleges and universities
• Neighbors and friends
© 2013 Cengage Learning

Criminal Records
• Obtained from local and state agencies
• Check with each jurisdiction
• Only convictions can be used (EEOC Decision No. 72-
1460)
– “Reasonable amount of time” between release and
decision to hire
– In using convictions, employer must consider
• Nature and gravity of offense
• Amount of time that has passed since the conviction and/or
completion of the sentence
• The nature of the job held or being sought
© 2013 Cengage Learning

Credit Checks
• Purpose
– Predict motivation to steal
– Determine character of applicant
• Fair Credit Reporting Act
– Order through a Consumer Reporting Agency (CRA)
– Provide written notice to the applicant that you will be checking credit
– Get applicant’s written authorization to check credit
– If adverse action is to be taken
• Provide applicant with “Pre-adverse Action Disclosure” which includes
copy of credit report
• Inform the applicant that they will not be hired due to the credit check, and
provide the name of the CRA and notice of the applicant's right to appeal within 60
days
http://www.youtube.com/watch?v=512GkwoZEFs
© 2013 Cengage Learning

Predicting Future Performance


• References are not good predictors of performance
– Uncorrected validity is .18
• References are not reliable (r = .22)
– There is a higher correlation between two letters written by the same person for
two different people than between letters written by two different people for the
same person
– They say more about the person writing the letter than the person
being written about
• References are lenient
– Fewer than 1% of applicants are rated below average!
© 2013 Cengage Learning

Why the Leniency?


• Applicants often choose their own
references
• Applicants often have the right to
see their files
• Former employers fear legal
ramifications
© 2013 Cengage Learning

[Figure: reference ratings shown along a continuum from positive through neutral to negative]
© 2013 Cengage Learning

References Often Have a Limited Opportunity to View Behavior
[Figure: bar chart of the percentage of an applicant's behavior that is observed, processed, remembered, and recalled by a reference (scale 0 to 120)]
© 2013 Cengage Learning

Potential Legal Ramifications


• Negligent hiring
http://www.youtube.com/watch?v=fpQeHuAe4E4
http://www.youtube.com/watch?v=ozMVeRT3pec

• Invasion of privacy

• Negligent reference

• Defamation
© 2013 Cengage Learning

Defamation
• Three types
– libel (written)
– slander (oral)
– self-publication
• Employers have a conditional privilege that
limits their liability
© 2013 Cengage Learning

Avoiding Liability for Defamation

Employers will not be liable if their statements were
• Truthful
  – statements were true
  – not true, but a reasonable person would have believed them to be true
  – opinions are protected unless the reference infers the opinion is based on facts that don't exist
• Made for a legitimate purpose
• Made in good faith
  – don't offer unsolicited information
  – statements cannot be made for revenge
  – avoid personal comments
• Made with the permission of the applicant
  – use waivers
  – let the former employee know if the reference will not be positive
© 2013 Cengage Learning

Extraneous Factors Surrounding the Reference
• Reference giver's ability to articulate
• The extent to which the referee remembers the applicant
• The words used by the reference giver
  – cuter than a baby's butt
  – she has no sexual oddities that I am aware of
  – I have an intimate and caring relationship with the applicant
  – Jill is a bud that has already begun to bloom
© 2013 Cengage Learning

The Real Meaning of Recommendations

Recommendation → Actual Meaning
• He is a man of great vision → He hallucinates
• He is definitely a man to watch → I don't trust him
• She merits a close look → Don't let her out of your sight
• He's the kind of employee you can swear by → He likes dirty jokes
• She doesn't mind being disturbed → She spent 10 years in a mental hospital
• When he worked for us, he was given many citations → He was arrested several times
• She gives every appearance of being a loyal, dedicated employee → But appearances are deceiving
© 2013 Cengage Learning

The Real Meaning of Recommendations

Recommendation → Actual Meaning
• If I were you I would give him sweeping responsibilities → He can handle a broom
• She commands the respect of everyone with whom she works → But she rarely gets it
• I am sure that whatever task he undertakes, no matter how small, he will be fired with enthusiasm → He will foul up any project
• You would be very lucky to get this person to work for you → She is lazy
• You will never catch him asleep on the job → He is too crafty to get caught
© 2013 Cengage Learning

Personnel Selection Methods


• Training & Education
• Experience
  – Applications/Resumes
  – Biodata
  – Interviews
• Knowledge
• Ability
  – Cognitive
  – Physical
  – Perceptual
• Skills
  – Work Samples
  – Assessment Centers
  – References
• Personality & Character
  – Personality Tests
  – Integrity Tests
• Medical
  – Medical Exams
  – Psychological Exams
  – Drug Testing
© 2013 Cengage Learning

What types of employment tests have you taken?
© 2013 Cengage Learning

Predicting Performance Using Training and Education
© 2013 Cengage Learning

Ratings of Training
• Education
• Work-Related Training
• Military
© 2013 Cengage Learning

Does Education
Predict
Performance?
© 2013 Cengage Learning

Summary of Meta-Analyses

Meta-analysis                                    Occupation              K     N        ρ
Aamodt (2002)                                    Police                  38    9,007    .34
Vineberg & Joyner (1982)                         Military                35             .25
Ng & Feldman (2009)                              Many                    85    47,125   .09
Hunter (1980); Hunter & Hunter (1984);           USES database           425   32,124   .10
  Schmidt & Hunter (1998)
Dunnette (1972)                                  Entry-level petroleum   15             .00
© 2013 Cengage Learning

Education and Incremental Validity


• Schmidt & Hunter (1998) say no
– Cognitive ability (r = .51)
– Cognitive ability and education (r = .52)
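To make the incremental-validity comparison above concrete, here is a minimal sketch using simulated data (not the Schmidt & Hunter dataset): incremental validity is the gain in explained variance when education is added to a regression that already contains cognitive ability. All variable names and effect sizes below are illustrative assumptions.

```python
import numpy as np

# Simulated applicants: education is correlated with cognitive ability, and
# performance is driven mostly by cognitive ability (illustrative numbers only).
rng = np.random.default_rng(0)
n = 500
cognitive = rng.normal(size=n)
education = 0.4 * cognitive + rng.normal(scale=0.9, size=n)
performance = 0.5 * cognitive + 0.05 * education + rng.normal(size=n)

def r_squared(predictors, y):
    """R-squared from an ordinary least squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    return 1 - residuals.var() / y.var()

r2_cog = r_squared([cognitive], performance)
r2_both = r_squared([cognitive, education], performance)
print(f"R2, cognitive ability only:      {r2_cog:.3f}")
print(f"R2, cognitive ability + education: {r2_both:.3f}")
print(f"Incremental validity (delta R2):   {r2_both - r2_cog:.3f}")
```

With predictors this highly overlapping, the delta is tiny, which is the pattern the slide describes.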
© 2013 Cengage Learning

Validity of GPA
• GPA is a valid predictor of performance on the
job, training performance, starting salary,
promotions, and grad school performance
• GPA is most predictive in the first few years after
graduation (Roth et al., 1996)
• GPA will result in high levels (d=.78) of adverse
impact (Roth & Bobko, 2000)
• People with high GPAs
– Are intelligent (r = .50; Jensen, 1980)
– Are conscientious (r = .34; Bevier et al., 1998)
© 2013 Cengage Learning

Validity of GPA
Meta-Analysis Results (r / ρ)

Work-Related Criteria
  Job performance (Roth et al., 1996): .16 / .36
  Training performance (Dye & Reck, 1989): .29
  Promotions (Cohen, 1984): .16
Salary (Roth & Clarke, 1996)
  Starting salary: .13 / .20
  Current salary: .18 / .28
Graduate School Performance (Kuncel et al., 2001)
  Grades: .28 / .30
  Faculty ratings: .25 / .35
© 2013 Cengage Learning

Lingering Questions
• Is the validity of education job specific?
• What is the actual incremental validity of
education over cognitive ability?
• Why would education predict performance?
– Knowledge
– Liberal arts skills
– Mental ability
– Motivation
© 2013 Cengage Learning

Predicting Performance Using Applicant Knowledge
• Taps job-related knowledge
• Good validity (ρ = .48)
• Face valid
• Can have adverse impact
© 2013 Cengage Learning

Predicting Performance Using Applicant Ability
© 2013 Cengage Learning

Cognitive Ability Tests

• High validity (ρ = .51)
• Predicts training and job performance for all jobs (Hunter, 1986)
• The more complex the job, the better cognitive ability tests predict performance
© 2013 Cengage Learning

Cognitive Ability Tests


Strengths
– Highest validity of all selection
measures (ρ = .51)
– Easy to administer
– Relatively inexpensive
– Most are not time consuming
© 2013 Cengage Learning

Cognitive Ability Tests


Weaknesses
– Likely to cause adverse
impact
– Low face validity
– Not well liked by applicants
© 2013 Cengage Learning

Perceptual Ability Tests


• Perceptual Ability (Fleishman & Reilly, 1992)
– Vision (near, far, night, peripheral)
– Depth perception
– Glare sensitivity
– Hearing (sensitivity, auditory attention, sound
localization)
© 2013 Cengage Learning

Psychomotor Ability Tests


• Psychomotor Ability (Fleishman & Reilly, 1992)
– Dexterity (finger, manual)
– Control precision
– Multilimb coordination
– Response control
– Reaction time
– Arm-hand steadiness
– Wrist-finger speed
– Speed-of-limb movement
© 2013 Cengage Learning

Physical Ability
• Used for jobs with high physical demands
• Three Issues
– Job relatedness
– Passing scores
– When the ability must be present
• Two common ways to measure
– Simulations
– Physical agility tests

http://www.youtube.com/watch?v=9BfqWGWzrfI
© 2013 Cengage Learning

Physical Ability
Physical Abilities (Fleishman & Reilly, 1992)
– Dynamic strength (strength requiring repetitions)
– Trunk strength (stooping or bending over)
– Explosive strength (jumping or throwing)
– Static strength
– Dynamic flexibility (speed of bending or stretching)
– Extent flexibility (Degree of bending or stretching)
– Gross body equilibrium (balance)
– Gross body coordination (coordination)
– Stamina
© 2013 Cengage Learning

Predicting Performance Using Applicant Skill
© 2013 Cengage Learning

Work Samples
• Applicants perform tasks that replicate actual
job tasks
• Advantages
– Directly related to the job
– Good criterion validity
• Verbal work samples (ρ = .48)
• Motor work samples (ρ = .43)
– Good face validity
– Less adverse impact than cognitive ability
– Provide realistic job previews
• Disadvantages
– Can be expensive to develop and maintain
© 2013 Cengage Learning

Assessment Centers
What are They?
• A selection technique that uses multiple job-related
assessment exercises and multiple assessors to
observe and record behaviors of candidates
performing job-related tasks
© 2013 Cengage Learning

Guidelines for Assessment Center Practices
Joiner (2000)
• Based on job analysis
• Behavioral classification
• Assessment techniques
• Use multiple assessment exercises
• Simulations
• Use multiple assessors
• Assessor training
• Recording behavior
• Reports
• Overall judgment based on integration of information
© 2013 Cengage Learning

Assessment Center Exercises


• Leaderless group discussions
• In-basket technique
• Simulations
– Situational exercises
– Work samples
• Role plays
• Case analyses and business
games
http://www.youtube.com/watch?v=eyWxjNECRBE&feature=related
http://www.youtube.com/watch?v=4eKuQ-RcHqY
© 2013 Cengage Learning

Evaluation of Assessment Centers


Reliability
– Can have low inter-rater agreement
– Test-retest reliability is fairly high (.70)
Validity (Arthur et al., 2003)
– Uncorrected .28
– Corrected .38
– Good face validity
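The jump from the uncorrected (.28) to the corrected (.38) coefficient reflects statistical corrections for measurement artifacts. Below is a minimal sketch of the classic attenuation correction for criterion unreliability; the reliability value is an assumed, illustrative number, and published meta-analyses such as Arthur et al. (2003) typically apply additional corrections (for example, for range restriction).

```python
import math

# Correction for attenuation: observed validity divided by the square root of
# criterion reliability. The reliability of .55 is an illustrative assumption.
observed_r = 0.28             # uncorrected assessment-center validity (slide value)
criterion_reliability = 0.55  # assumed reliability of performance ratings

corrected_rho = observed_r / math.sqrt(criterion_reliability)
print(f"Corrected validity: {corrected_rho:.2f}")  # about .38 under this assumption
```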
© 2013 Cengage Learning

Evaluation of Assessment Centers


Weaknesses
– Very expensive
– Time consuming
– Can have low inter-rater agreement
– Behaviors can overlap into several
dimensions
– Safety of candidates for some work
samples
© 2013 Cengage Learning

When are assessment centers most appropriate?


– Most useful for promotion rather than selection
– When candidates have some knowledge of the job
– When you have the money to develop and maintain
assessment centers
– When you have the time and trainers
© 2013 Cengage Learning

Predicting Performance Using Prior Experience
© 2013 Cengage Learning

Experience Ratings
• Past behavior predicts future behavior
– Experience is a valid predictor of future
performance (ρ = .27; Quinones et al.,
1995)

• Types of Experience
– Work
– Life
© 2013 Cengage Learning

Experience
• Evaluated through:
– Application blanks
– Resumes
– Interviews
– Reference checks
– Biodata instruments
© 2013 Cengage Learning

Experience
• Considerations
– How much experience?
– How well did the person perform?
– How related is it to the current job?
© 2013 Cengage Learning

Experience Predicts Best…


• Credit prior work experience only:
– In the same occupational area as that in which performance is to be
predicted
– In the performance of tasks or functions that have direct
application on the job
• Recency of experience should be used as a decision rule for awarding
credit only when justified on a case-by-case basis
• Credit for duration of work experience should be limited to a few
years.
• High prediction up to about 3 years of experience, declining to low
prediction for more than 12 years of experience.
© 2013 Cengage Learning

Experience for Selection: Some Concerns
• Sullivan (2000) claims that “experience in solving ‘past
problems’ is rapidly losing its applicability to current and
future problems.”
• Organizations will increase their applicant pool if they
delete the “ancient history” requirements (i.e. “Ten years
experience required”).
© 2013 Cengage Learning

Sullivan (2000)
1) Reduce or eliminate the number of years required in your
ads and replace them with "the demonstrated ability to
solve problems with our required level of difficulty."
2) Use simulations and actual problems to assess applicants.
3) Develop “future-oriented” questions for applicants.
4) Train evaluators and compensation professionals to put
less weight on experience of candidates.
5) Revise job descriptions to include level of difficulty.
6) Identify the amount and type of experience and
competencies that would predict job performance.
7) Check to see if there is a correlation between the number
of years of experience an employee has and their success
in your firm.
© 2013 Cengage Learning

Experience: Some More Concerns


• Performance matters
• “Haven’t done” doesn’t mean “can’t do”
• Experience has a shelf life
• Listing something on a resume is not experience
• Where you get your experience matters
• Experience does not guarantee success
• Experience is expensive
• More experience might be bad (old ways and
ideas)
© 2013 Cengage Learning

Biodata
A selection method that considers an
applicant’s life, school, military,
community, and work experience
© 2013 Cengage Learning

Example of Biodata Items

Member of high school student government?   Yes   No
Number of jobs in past 5 years?   1   2   3-5   More than 5
Transportation to work:   Walk   Bus   Bike   Own Car   Other
© 2013 Cengage Learning

Development of Biodata Items


• Choose a job
• Create pool of potential biodata items
• Choose a criterion to measure behavior
• Prescreen items and test on employees
• Retest items on second sample of
employees
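A minimal sketch, with simulated data, of the last two steps: derive an empirical key on one employee sample, then retest the scored scale on a second sample. The drop in validity from the derivation sample to the holdout sample is the "shrinkage" discussed later in the deck. The items, sample sizes, and sign-based keying rule are illustrative assumptions, not a published procedure.

```python
import numpy as np

# Simulated yes/no biodata responses for 400 employees; only the first 5 items
# actually relate to the criterion (e.g., tenure or rated performance).
rng = np.random.default_rng(1)
n_items, n_emp = 20, 400
responses = rng.integers(0, 2, size=(n_emp, n_items))
criterion = responses[:, :5].sum(axis=1) + rng.normal(size=n_emp)

derive, holdout = slice(0, 200), slice(200, 400)

# Empirical keying on the derivation sample: weight each item by the sign of
# its correlation with the criterion.
corrs = [np.corrcoef(responses[derive, j], criterion[derive])[0, 1] for j in range(n_items)]
weights = np.sign(corrs)

def validity(sample):
    scores = responses[sample] @ weights
    return np.corrcoef(scores, criterion[sample])[0, 1]

print(f"Derivation-sample validity: {validity(derive):.2f}")
print(f"Holdout-sample validity:    {validity(holdout):.2f}")  # typically lower (shrinkage)
```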
© 2013 Cengage Learning

Biodata Standards
Gandy & Dye, 1989; Mael, 1991
• Historical (good): "How old were you when you got your first paying job?"
  vs. Future or Hypothetical (bad): "What position do you think you will be holding in 10 years?"
• External (good): "Did you ever get fired from a job?"
  vs. Internal (bad): "What is your attitude toward friends who smoke marijuana?"
• Objective (good): "How many hours did you study for your bar exam?"
  vs. Subjective (bad): "Would you describe yourself as shy?"
• First-hand (good): "How punctual are you about coming to work?"
  vs. Second-hand (bad): "How would your teachers describe your punctuality?"
© 2013 Cengage Learning

• Discrete (good): "At what age did you get your driver's license?"
  vs. Summative (bad): "How many hours do you study during an average week?"
• Verifiable (good): "What was your grade point average in college?"
  vs. Non-verifiable (bad): "How many servings of fresh vegetables do you eat every day?"
• Controllable (good): "How many tries did it take you to pass the CPA exam?"
  vs. Non-controllable (bad): "How many brothers and sisters do you have?"
• Equal Access (good): "Were you ever class president?"
  vs. Non-equal Access (bad): "Were you ever captain of the football team?"
• Job Relevant (good): "How many units of cereal did you sell during the last calendar year?"
  vs. Not Job Relevant (bad): "Are you proficient at crossword puzzles?"
• Noninvasive (good): "Were you on the tennis team in college?"
  vs. Invasive (bad): "How many young children do you have at home?"
© 2013 Cengage Learning

Biodata Scoring

Variable         Long Tenure (%)   Short Tenure (%)   Difference in %   Unit Weight
Education
  High School          40                 80                -40              -1
  Bachelor's           59                 15                +44              +1
  Masters               1                  5                 -4               0
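A minimal sketch of the unit-weighting logic in the table above: items that separate long-tenure from short-tenure employees by a wide margin receive a weight of +1 or -1, small differences receive 0, and an applicant's biodata score is the sum of the weights for the responses they endorse. The 10-point threshold is an illustrative assumption, not a published cutoff.

```python
# Percentages of long- and short-tenure employees endorsing each response
# (taken from the table above).
item_stats = {
    "High School": {"long": 40, "short": 80},
    "Bachelor's":  {"long": 59, "short": 15},
    "Masters":     {"long": 1,  "short": 5},
}

def unit_weight(long_pct, short_pct, threshold=10):
    """Return +1, -1, or 0 depending on how strongly the item favors long tenure."""
    diff = long_pct - short_pct
    if diff >= threshold:
        return 1
    if diff <= -threshold:
        return -1
    return 0

weights = {item: unit_weight(s["long"], s["short"]) for item, s in item_stats.items()}
print(weights)  # {'High School': -1, "Bachelor's": 1, 'Masters': 0}

# Scoring an applicant is then just summing the weights of the responses endorsed.
applicant_responses = ["Bachelor's"]  # hypothetical applicant
score = sum(weights[r] for r in applicant_responses)
print("Biodata score:", score)
```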
© 2013 Cengage Learning

Strengths of Biodata

– Good validity (r = .36, ρ= .51)


– Can predict for variety of
criterion measures
– Easy to administer
– Relatively inexpensive
– Fairly valid
– Can have good face validity
© 2013 Cengage Learning

Weaknesses of Biodata

– Low face validity


– Can invade privacy
– Items can be offensive
– Expensive to develop
– Not always practical to
develop
© 2013 Cengage Learning

Validity Issues
• Shrinkage?
• Good validity but not sure why
• Validity seems to drop when items are keyed rationally (based on a job
analysis) rather than empirically
© 2013 Cengage Learning

Personality Inventories
Personality is a collection of traits that
persist across time and situations and
differentiate one person from another
© 2013 Cengage Learning

Differences in Personality
Inventories
• Types of Personality Inventories
– Measures of normal personality
– Measures of psychopathology
• Basis for Personality Dimensions
– Theory based
– Statistically based
– Empirically based
• Scoring
– Objective
– Projective
© 2013 Cengage Learning

Five-Factor Model (The Big 5)


Openness to Experience
– imaginative, curious, cultured
Conscientiousness
– organized, disciplined, careful
Extraversion
– outgoing, gregarious, fun-loving
Agreeableness
– trusting, cooperative, flexible
Neuroticism (emotional stability)
– anxious, insecure, vulnerable to stress
© 2013 Cengage Learning

Validity of Personality
Meta-Analysis (values shown as observed / true validity)

Dimension            Hurtz & Donovan (2003)   Barrick & Mount (1991)   Tett et al. (1991)
Openness                  .03 / .06                .03 / .04                .18 / .24
Conscientiousness         .15 / .24                .13 / .22                .12 / .16
Extroversion              .06 / .09                .08 / .13                .10 / .13
Agreeableness             .07 / .12                .04 / .07                .22 / .28
Neuroticism              -.09 / -.15              -.05 / -.08              -.15 / -.19
© 2013 Cengage Learning

Comparison of Meta-Analyses
Conscientiousness

Hurtz & Donovan (2003)
– Studies included: only tests developed to tap the Big 5
– k = 42, n = 7,342, observed validity = .15
Barrick & Mount (1991)
– Studies included: any test that could be assigned to a Big 5 dimension
– k = 123, n = 19,721, observed validity = .13
Tett et al. (1991)
– Studies included: only studies in which a Big 5 dimension was hypothesized to be related to performance
– k = 7, n = 450, observed validity = .12
© 2013 Cengage Learning

Evaluation of Personality
Strengths
– Relatively cheap
– Easy to administer
– Little adverse impact
– Predicts best when based on a
job analysis
Weaknesses
– Scale development
– Validity
– Faking
© 2013 Cengage Learning

Interest Inventories
• Tap an applicant’s interest in particular
types of work or careers
• Poor predictors of job performance (ρ = .13)
• Better predictors of job satisfaction
© 2013 Cengage Learning

Integrity Tests
• Estimate the probability that applicants
will steal money or merchandise
• Used mostly in retail, but gaining
acceptance for other occupations
© 2013 Cengage Learning

Types of Integrity Tests

Electronic Testing
• Polygraph testing

Paper and Pencil Testing


• Overt
• Personality based
© 2013 Cengage Learning

Polygraph Testing
• Polygraph (lie detector) is a machine that measures the physiological
responses that accompany the verbal responses an individual makes to
direct questions asked by a polygraph operator.
© 2013 Cengage Learning

Limitations of the Polygraph


• Emotions other than
guilt can trigger
responses
• Countermeasures used
to avoid detection
• Frequency of false
positives
• Frequency of false
negatives
© 2013 Cengage Learning

Legal Guidelines for Polygraph Testing
Employee Polygraph Protection Act of 1988
makes it illegal to:
• Directly or indirectly require an employee to take a
polygraph
• Use, accept, refer to, or inquire about the results of any
polygraph test of any applicant or employee
• Discharge, discipline, discriminate against, or deny
employment or promotion to (or threaten such actions)
against any prospective or current employee who
refuses, declines, or fails to take or submit to a
polygraph
© 2013 Cengage Learning

Legal Guidelines for Polygraph Testing
The following are exempt from these prohibitions
– Private employers providing security services
– Employers who manufacture, distribute, or dispense controlled substances
– Federal, state, and local government employees

© 2013 Cengage Learning

Paper and Pencil Integrity Tests


Overt integrity tests
• Directly ask for attitudes about theft and
occurrences of theft behavior
Personality based measures
• Measure traits linked to several theft related
employee behaviors that are detrimental to the
organization
© 2013 Cengage Learning

Overt Integrity Tests


Rationale is to measure job applicants’ attitudes and cognitions
toward theft that might predispose them to steal at work,
especially when both the need and opportunity to steal are
present.
Research has shown that the “typical” employee-thief:
• Is more tempted to steal
• Engages in many of the common rationalizations for theft
• Would punish thieves less
• Often thinks about theft related activities
• Attributes more theft to others
• Shows more inter-thief loyalty
• Is more vulnerable to peer pressure to steal than an honest employee
© 2013 Cengage Learning

Personality-Based Integrity
Measures
Employee theft is just one element in a larger syndrome of
antisocial behavior, or organizational delinquency. Therefore,
overt integrity tests overlook a number of other
counterproductive behaviors that are costly to the organization
© 2013 Cengage Learning

Other Behaviors Integrity Tests Can Predict
• Drug and alcohol abuse
• Vandalism
• Sabotage
• Assault behaviors
• Insubordination
• Absenteeism
• Excessive grievances
• Bogus workers compensation claims
• Violence
© 2013 Cengage Learning

The Validity and Reliability of Integrity Tests
Validity
• Theft
• .41 for predicting probability of theft by employees
• Performance (Ones et al. 1993)
• Observed = .21
• True = .34
Reliability
• Reported test-retest reliabilities range from .70 to .90
© 2013 Cengage Learning

Evaluation of Integrity Tests


• Advantages
– Good validity (ρ = .34)
– Inexpensive to use
– Easy to administer
– Little to no racial adverse impact
• Disadvantages
– Males have a higher fail rate than females
– Younger people have a higher fail rate than older people
– Failure has a negative psychological impact on
applicants.
© 2013 Cengage Learning

Conditional Reasoning Tests


• Designed to reduce faking
• Applicants are given a series of statements and asked to select the
reason that justifies each statement
• Aggressive individuals tend to believe
– most people have harmful intentions behind their behavior (hostile
attribution bias)
– it is important to show strength or dominance in social interactions
(potency bias)
– it is important to retaliate when wronged rather than try to maintain a
relationship (retribution bias)
– powerful people will victimize less powerful individuals (victimization
bias)
– evil people deserve to have bad things happen to them (derogation of
target bias)
– social customs restrict free will and should be ignored (social discounting
bias).
© 2013 Cengage Learning

Graphology
• Concept
– A person’s handwriting is a reflection of his or her personality and
character
• Use
– 6,000 U.S. organizations
– 75% of organizations in France
– 8% of organizations in the United Kingdom
• Evaluation
– Few studies
– Validity depends on the writing sample (Simner & Goffin, 2003)
• Autobiographical (r = .16, ρ = .22)
• Non-autobiographical (r = .09, ρ = .12)
© 2013 Cengage Learning

Drug Testing
• Use
– In 2001, 80% of U.S. organizations tested for drugs
– In 2003, 4.6% of applicants tested positive for drugs
– In 2007, 8.2% of employees admitted to using drugs in
the past month
• Drug users are more likely to
– Miss work
– Use health care benefits
– Be fired
– Have an accident
© 2013 Cengage Learning

Drug Testing
Forms of Testing
– Pre-employment testing
– Random selection at predetermined times
– Random selection at random times
– Testing after an accident or disciplinary action
Responses to the Presence of Drugs
– 98% of job offers withdrawn
– Current employees who test positive
• 25% are fired after a positive test
• 66% are referred to counseling and treatment
© 2013 Cengage Learning

Two Stages of Drug Testing


• Initial screening of hair or urine
– Cheaper method (about $50)
– Enzyme Multiplied Immunoassay Technique (EMIT)
– Radioimmunoassay (RIA)
• Confirmation test
– Typically used only after a positive initial screening
– Thin layer chromatography/mass spectrometry
– More expensive

http://www.youtube.com/watch?v=aVLDkXj4K2A
© 2013 Cengage Learning

Should Organizations Test for Drugs?


© 2013 Cengage Learning

Typical Corrected Validity Coefficients for Selection Techniques

Method                          Validity
Structured interview              .57
Cognitive ability                 .51
Biodata                           .51
Job knowledge                     .48
Work samples (verbal)             .48
Assessment centers                .38
Integrity tests                   .34
College grades                    .32
References                        .29
Experience                        .27
Situational judgment tests        .26
Conscientiousness                 .24
Unstructured interviews           .20
Interest inventories              .10
Handwriting analysis              .02
Projective personality tests      .00
© 2013 Cengage Learning

Adverse Impact (standardized group differences, d)

Technique               White-Black   White-Hispanic   Meta-analysis
Cognitive ability           1.10           .72         Roth et al. (2001)
GPA                          .78                       Roth & Bobko (2000)
Work sample                  .73                       Roth et al. (2008)
Assessment centers           .52           .28         Dean et al. (2008)
Job knowledge                .48           .47         Roth et al. (2003)
Situational judgment         .38           .24         Whetzel et al. (2008)
Biodata                      .33                       Bobko et al. (1999)
Structured interview         .23                       Huffcutt & Roth (1998)
Recommendations              .22                       Aamodt (2002)
Personality                  .09                       Schmitt et al. (1996)
References                   .08                       Aamodt & Williams (2005)
Integrity tests              .07          -.05         Ones & Viswesvaran (1998)
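The values in the table are standardized group differences (d). Below is a minimal sketch of how such a d is computed from two groups' test scores; the scores are hypothetical and are not data from the cited meta-analyses.

```python
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a = sum(group_a) / n_a
    mean_b = sum(group_b) / n_b
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical test scores for two applicant groups (illustration only).
group_1_scores = [52, 55, 49, 61, 58, 50, 57]
group_2_scores = [48, 51, 45, 56, 50, 47, 49]
print(f"d = {cohens_d(group_1_scores, group_2_scores):.2f}")
```

Larger d values mean a larger average score gap between groups, and therefore more potential adverse impact when the measure is used with a fixed cutoff.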
© 2013 Cengage Learning

Applied Case Study: New London, CT Police Department
© 2013 Cengage Learning

Focus on Ethics
Using Personality Inventories
• In your class, your professor will probably ask you to take
the Employee Personality Inventory in your workbook.
After you do, consider whether or not you want your job
performance to be judged based on the results of such a
test. Would you say that this test would fairly predict your
ability to perform in certain jobs?
• Does it accurately portray how you would fit into an
organization’s culture or how you would get along with
others? If it doesn’t accurately portray you, would you then
say such a test is unethical?
• Should the tests be better regulated? Are companies right
in using them in their selection process?
© 2013 Cengage Learning

Focus on Ethics
Using Personality Inventories
• Do you see any other ethical concerns
related to using personality inventories?
• Is there a fairer and more ethical way for
companies to determine if applicants will fit
into the organizational culture and get along
with others?
