

Session Topic:

Validity & Reliability

Session Objectives

The purpose of this session is to:

1) Define validity and reliability
2) Distinguish between valid and invalid inferences
3) Understand how to apply knowledge of validity and reliability to the selection and development of assessments
Defining Validity

Validity refers to the accuracy of inferences drawn from an assessment.

It is the degree to which the assessment measures what it is intended to measure.
Types of Validity

Construct validity – the assessment actually measures what it is designed to measure.

A actually is A

Concurrent validity – the assessment correlates with other assessments that measure the same construct.

A correlates with B

Predictive validity – the assessment predicts performance on a future assessment.

A predicts B
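Concurrent and predictive validity are typically quantified with a correlation coefficient between scores on the two assessments. As a minimal sketch (the student scores below are invented for illustration, not taken from any real assessment), Pearson's r can be computed directly:

```python
# Hypothetical scores for 8 students on a new assessment (a) and an
# established assessment (b) of the same construct; values are illustrative.
a = [72, 85, 90, 60, 78, 95, 55, 82]
b = [70, 88, 86, 65, 75, 92, 58, 80]

def pearson_r(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = sum((xi - mx) ** 2 for xi in x) ** 0.5
    sy = sum((yi - my) ** 2 for yi in y) ** 0.5
    return cov / (sx * sy)

# An r close to +1 is evidence of concurrent (or, with a later
# assessment, predictive) validity.
print(round(pearson_r(a, b), 2))  # → 0.98
```

In practice you would use many more students and an established statistics package, but the logic is the same: the closer the correlation is to 1, the stronger the concurrent or predictive validity evidence.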
Valid Inferences

Validity is closely tied to the purpose or use of an assessment.

DON’T ASK: “Is this assessment valid?”

ASK: “Are the inferences I’m making based on this assessment valid for my purpose?”
Evidence-Centered Design
• Validity is about providing strong evidence

• Evidence-centered design boosts validity

– What do you want to know?
– How would you know?
– What should the assessment look like?
Defining Reliability

• Reliability refers to the consistency and stability of assessment results.

• A reliable assessment provides a consistent picture of what students know, understand, and are able to do.

An assessment that is highly reliable is not necessarily valid. However, for an assessment to be valid, it must also be reliable.
Purchasing & Developing
• Purpose – What are you trying to measure?
• Review – What assessments do you already have that purport to measure it?
• Purchase or Develop – If necessary, consider purchasing assessments or creating a new assessment.
• Using what you have
– Is it carefully aligned to your purpose?
• Purchasing a new assessment
– Is it carefully matched to your purpose?
– Do you have the funds (for assessment, equipment, training)?
• Developing a new assessment
– Do you have the in-house content knowledge?
– Do you have the in-house assessment knowledge?
– Does your team have time for development?
– Does your team have the knowledge and time needed for
proper scoring?
Improving Validity & Reliability
• Ensure questions are based on taught curricula
• Ensure questions are based on standards
• Allow students to demonstrate knowledge/skills in
multiple ways
• Ensure a variety of item types (multiple-choice,
constructed response)
• Ask questions at varying Depth of Knowledge levels
• Ensure accurate test administration
• Include items that address the full range of standards
• Include multiple items that assess the same standard
• Review scorer reliability, when necessary
V&R : Student Learning Objectives
What makes high-quality evidence for SLOs:

• Aligned to the content standards (construct validity)
• Used for the purpose for which it was intended
• Administered properly
Objective: Students will demonstrate grade-level proficiency in reading, writing, and speaking French, including the accurate use of past and present tenses.

How would you know if students were proficient in reading, writing, and speaking French?

1. Written final exam measuring reading comprehension, vocabulary, and conjugation/agreement in past and present tenses
2. 300-word written composition in French, using past and present tenses in a familiar context
3. 5-minute conversation on one of 3 pre-selected topics, using past and present tenses
Administration & Scoring

• The exam and composition will be part of the written final, administered during the final exam period.
  – I will score the compositions using the Foreign Language Department level 2 writing rubric, which includes vocabulary, tense, subject-verb agreement, spelling, level of detail, etc.
  – Approximately 20% of the compositions will also be double-scored by the other French teacher.

• The oral assessment will be administered one-on-one in the last week of school, prior to the exam period. I will develop the rubric with the other French teacher and have it approved by the Department Chair.

• I will administer and score most oral exams myself, though I will schedule my Department Chair to sit in on and double-score the first 20%.
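Double-scoring a sample of responses, as described above, makes scorer reliability checkable with simple agreement statistics. A minimal sketch (the rubric scores below are invented for illustration) of two common measures, percent agreement and Cohen's kappa:

```python
# Hypothetical rubric scores (1-4) assigned by two scorers to the same
# 10 double-scored compositions; data are illustrative only.
scorer1 = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
scorer2 = [3, 4, 2, 2, 1, 4, 3, 2, 3, 3]

def percent_agreement(x, y):
    """Share of items where both scorers gave the same rubric score."""
    return sum(a == b for a, b in zip(x, y)) / len(x)

def cohens_kappa(x, y):
    """Agreement corrected for the agreement expected by chance."""
    n = len(x)
    categories = sorted(set(x) | set(y))
    po = percent_agreement(x, y)                  # observed agreement
    pe = sum((x.count(c) / n) * (y.count(c) / n)  # chance agreement
             for c in categories)
    return (po - pe) / (1 - pe)

print(round(percent_agreement(scorer1, scorer2), 2))  # → 0.8
print(round(cohens_kappa(scorer1, scorer2), 2))       # → 0.72
```

Kappa is usually the more informative of the two, since raw agreement can look high simply because most students cluster at the same rubric level; low values on the double-scored sample would signal that the rubric or scorer training needs revisiting.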
Upcoming Webinars

January 11th, 9:30-10:30

Cultural & Linguistic Demands of Assessment