
Conceptualization and

Measurement
Moving from vague ideas to
recognizing and measuring them in
organizations

Topics for this Chapter

Concepts and conceptions


Conceptualization
Indicators and dimensions
Creating order
Definitions and research purposes
Criteria for measurement quality

Conceptions and Concepts


Terms used to communicate your
conception of discrimination
Include observations that are related to
your concept of discrimination

Coming to an agreement:
Conceptualization → Concept

Conceptions and Concepts


We measure three classes of things:
Directly observable
Indirectly observable
Constructs

Constructs:
Theoretical creations
Neither directly nor indirectly observable

Concept = Family of constructs

Conceptualization
What will we mean when we use the
term "discrimination"?
Study the question
Meaning of discrimination
Agree on the answer

Indicators and Dimensions


Identify indicators of discrimination
The meanings of words and the actions of
the employees under study
Need to identify groups or dimensions of
discrimination
Subdivide discrimination along several
dimensions
Conceptualization
Dimensions &
Indicators

Creating Order
Nominal definition:
Working definition of the term

Operational definition:
What we are going to observe in the
organization
How we will observe it
What interpretations we will place on
possible observations

Group exercise 1
Answer the following questions using
the article by Allen and Ortlepp (2002):
Definition of work and career salience?
Any relationships with other concepts?
Why distinguish between work and
career salience?
How is each measured?

Group exercise 2
Answer the following questions using
the article by Snyder (1995):
Definition of hope?
Any relationships with other concepts?
Why distinguish between hope and
optimism?
How is hope measured?

Creating Order (cont.)


Conceptualization →
Nominal definition →
Operational definition →
Measurements in organizations

Definitions and Research Purpose


Descriptive vs explanatory research
Definitions determine descriptions
A change in definitions produces a
change in descriptive interpretations

Criteria for Measurement Quality


Precision vs Accuracy
Fineness of distinctions
Impacts on operational definition

Focus also on:


Reliability
Validity

Reliability
Sources of inconsistency
Multiple observers
Self-reports

Techniques:

Test-retest
Equivalent form
Split-half
Internal consistency
Established tests

Test-Retest
Consistency of scores from one test
administration to the next
Administer test to group
Readminister same test to same group at
a later time
Correlate first set of scores with the
second
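The procedure above can be sketched in a few lines of Python. This is an illustrative example, not part of the chapter: the scores are made-up, and the test-retest coefficient is simply the Pearson correlation between the two administrations.

```python
def pearson(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores: the same group tested twice, some weeks apart
first_administration = [12, 18, 15, 22, 9, 17]
second_administration = [13, 17, 16, 21, 10, 18]

# The correlation between the two sets of scores is the
# coefficient of temporal stability
r_tt = pearson(first_administration, second_administration)
```

A high coefficient (close to 1) indicates that people kept roughly the same rank order across the two administrations.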

Test-Retest (cont.)
Problems with test-retest:
Attribute being measured may change
between first test and retest
Reactivity
Experience of taking test
Carry-over effects
Requires two test administrations
Yields a coefficient of temporal stability

Alternate Forms
Equivalent in terms of:
Content
Response processes
Statistical characteristics

Procedure:
Administer form A to group
Administer form B to same group at a
later time
Correlate scores on form A and form B

Alternate Forms (cont.)


Counters problems associated with:
Carry-over effects
Reactivity
Controls for real changes over time

Problems:
Two administrations
Difficult to develop alternate forms

Split-Half
Procedure:
Administer test to group of employees
Split the test in half
Correlate scores on one half of the test
with scores on the other half

Advantage:
No need for two test administrations

The way the test is split is its greatest
weakness: different splits can give
different coefficients
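As a sketch (not from the chapter, with made-up item scores), split-half reliability is usually computed by splitting the items into odd and even halves, correlating the half-scores, and then applying the Spearman-Brown correction, since each half is only half as long as the full test.

```python
def pearson(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """item_scores: one list of item scores per respondent.
    Splits items into odd- and even-numbered halves, correlates
    the half-scores, then steps the estimate up to full test
    length with the Spearman-Brown correction."""
    odd_half = [sum(person[::2]) for person in item_scores]
    even_half = [sum(person[1::2]) for person in item_scores]
    r_half = pearson(odd_half, even_half)
    return 2 * r_half / (1 + r_half)  # Spearman-Brown correction

# Hypothetical data: 5 respondents answering a 4-item test
responses = [
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 4],
    [1, 2, 1, 2],
    [3, 3, 3, 4],
]

r_sh = split_half_reliability(responses)
```

Note that an odd/even split is only one of many possible splits, which is exactly the weakness mentioned above.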

Internal Consistency
Determines reliability from:
Number of items in the test
Average intercorrelation among test
items
Results in coefficient alpha
(Cronbach's alpha)

Compares each item to every other item
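The standard internal-consistency coefficient is Cronbach's alpha, which combines the number of items with the item variances and the variance of the total scores. The sketch below uses the same kind of made-up item data as before; it is an illustration, not part of the chapter.

```python
def cronbach_alpha(item_scores):
    """item_scores: one list of item scores per respondent.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of items."""
    k = len(item_scores[0])

    def sample_var(values):
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    # Variance of each item across respondents
    item_vars = [sample_var([p[i] for p in item_scores]) for i in range(k)]
    # Variance of each respondent's total score
    totals = [sum(p) for p in item_scores]
    return (k / (k - 1)) * (1 - sum(item_vars) / sample_var(totals))

# Hypothetical data: 5 respondents answering a 4-item test
responses = [
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 4],
    [1, 2, 1, 2],
    [3, 3, 3, 4],
]

alpha = cronbach_alpha(responses)
```

More items, or higher average intercorrelation among items, both push alpha upward, which matches the two determinants listed above.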

Validity
Face validity
Criterion-related validity
Concurrent validity
Predictive validity

Construct validity
Content validity
What is the connection with
conceptualization?