

RELIABILITY

Created by Nurliana Herlianti

www.themegallery.com
RELIABILITY

• Reliability refers to the consistency of
  scores or answers provided by an
  instrument.

• Why?

Distinction between reliability and validity

ERRORS OF MEASUREMENT

• Errors of measurement refer to the
  variation in scores obtained by the
  same individuals on the same
  instrument.

– The degree of consistency is expressed
  as a RELIABILITY COEFFICIENT.

Contents

1 Test Retest Method

2 Equivalent Forms

3 Internal Consistency

4 Scoring / Observer Agreement

TEST RETEST METHOD

• The test-retest method involves
  administering the same instrument twice
  to the same group of individuals after a
  certain time interval has elapsed.
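The reliability estimate here is simply the correlation between the two sets of scores. A minimal sketch in Python, with hypothetical scores for six individuals:

```python
# Test-retest reliability: correlate scores from two administrations
# of the same instrument to the same group (illustrative data).

def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

first = [12, 15, 11, 18, 14, 16]   # scores at time 1 (hypothetical)
second = [13, 14, 12, 17, 15, 16]  # same individuals at time 2
print(round(pearson_r(first, second), 3))  # high r = stable scores
```

The same calculation applies to the equivalent-forms method on the next slide; only what is correlated changes (two forms instead of two occasions).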

EQUIVALENT FORMS METHOD

• Involves administering two different, but
  equivalent, forms of an instrument to the
  same group of individuals at the same
  time.

• A high reliability coefficient indicates a
  very reliable instrument.

INTERNAL CONSISTENCY
METHOD

• Involves comparing two different sets of
  items that are part of the same
  instrument.

• Several internal consistency methods:

1. Split-half procedure
2. Kuder-Richardson approaches
3. Alpha coefficient

Internal Consistency – Split Half
Procedure

• Score the two halves separately
• Usually odd items versus even items
• Spearman-Brown formula:

  Reliability of scores on total test =
      (2 × reliability for ½ test) / (1 + reliability for ½ test)
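The Spearman-Brown step-up can be sketched directly from the formula above; the half-test correlation here is a hypothetical value:

```python
# Split-half reliability: the correlation between the two half-test
# scores understates full-test reliability, so the Spearman-Brown
# formula "steps up" to the full-length estimate.

def spearman_brown(r_half):
    """Reliability of the full-length test from the half-test correlation."""
    return 2 * r_half / (1 + r_half)

r_half = 0.6  # hypothetical correlation between odd-item and even-item halves
print(spearman_brown(r_half))  # full-test estimate, 0.75
```

Note that the stepped-up value (0.75) is always at least as large as the half-test correlation (0.6), reflecting the gain from doubling the test length.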

Internal Consistency – Kuder
Richardson Approaches

• KR20 and KR21

• The frequently used KR21 reliability
  coefficient:

  KR21 = [K / (K − 1)] × [1 − M(K − M) / (K × SD²)]

  where K = number of items in the test,
  M = mean of the scores,
  SD = standard deviation of the scores

• For research purposes, a reliability
  coefficient of at least 0.70 is generally
  regarded as acceptable.
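KR21 needs only three summary statistics, which makes it easy to compute by hand or in code. A minimal sketch with hypothetical test statistics:

```python
# KR21 reliability from summary statistics of a dichotomously
# scored test: K items, mean M and standard deviation SD of the
# total scores (all numbers below are illustrative).

def kr21(k, mean, sd):
    """KR21 = [K/(K-1)] * [1 - M(K-M)/(K*SD^2)]."""
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * sd ** 2))

r = kr21(k=50, mean=40, sd=4.5)
print(round(r, 3))  # below the 0.70 rule of thumb for research use
```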


Internal Consistency – Alpha
Coefficient
• Cronbach's alpha is a general form of
  the KR20 coefficient
• It can be used with items that are not
  scored right/wrong, such as essay
  questions
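Cronbach's alpha compares the sum of the item variances with the variance of the total scores. A minimal sketch, using a hypothetical matrix of essay ratings (rows = respondents, columns = items):

```python
# Cronbach's alpha from an item-score matrix; suits multi-point
# items such as essay ratings (all scores below are illustrative).

def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(rows):
    """alpha = [K/(K-1)] * [1 - (sum of item variances)/(total variance)]."""
    k = len(rows[0])
    item_vars = [variance([row[i] for row in rows]) for i in range(k)]
    total_var = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

scores = [  # hypothetical ratings of 4 respondents on 4 items
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 2, 1],
]
print(round(cronbach_alpha(scores), 3))
```

With 0/1 item scores, this same calculation reduces to KR20.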

Standard Error of Measurement

• An index that shows the extent to which a
  measurement would vary under changed
  circumstances.

  SEM = SD × √(1 − r₁₁)

  where SD = the standard deviation of the scores
  r₁₁ = the reliability coefficient
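The formula shows that higher reliability shrinks the error band around an observed score. A minimal sketch with hypothetical values:

```python
# Standard error of measurement: SEM = SD * sqrt(1 - r11),
# where SD is the score standard deviation and r11 the
# reliability coefficient (both values below are illustrative).

def sem(sd, r11):
    """Standard error of measurement from SD and reliability."""
    return sd * (1 - r11) ** 0.5

print(round(sem(sd=10, r11=0.91), 2))  # 3.0
```

For example, with SD = 10 and r₁₁ = 0.91 the SEM is 3.0, so roughly two-thirds of repeated measurements would fall within ±3 points of a person's true score.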
SCORING AGREEMENT

• Instruments that rely on direct observation
  are highly vulnerable to observer
  differences.

• Scoring agreement is enhanced by training
  the observers and by increasing the
  number of observation periods.

• At least 80 percent agreement is
  expected.
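The simplest index of scoring agreement is the percentage of observations on which two observers record the same code. A minimal sketch with hypothetical observation codes:

```python
# Percent agreement between two observers coding the same events
# (codes below are illustrative); 80% or higher is the usual benchmark.

def percent_agreement(obs_a, obs_b):
    """Percentage of paired observations where both observers agree."""
    matches = sum(a == b for a, b in zip(obs_a, obs_b))
    return 100 * matches / len(obs_a)

a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # observer A's codes
b = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]  # observer B's codes
pct = percent_agreement(a, b)
print(pct, pct >= 80)  # meets the 80% benchmark
```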

Summary

Method                   Content    Time Interval  Procedure
Test Retest              Identical  Varies         Give the identical instrument twice
Equivalent Forms         Different  None           Give two forms of the instrument
Equivalent Forms/Retest  Different  Varies         Give two forms of the instrument,
                                                   with a time interval between
Internal Consistency     Different  None           Divide the instrument into halves,
                                                   or use KR
Scoring Agreement        Identical  None           Compare scores obtained by two
                                                   or more observers