Objective scoring does not require the scorer to make a decision about the correctness of the
answer. It is either right or wrong, it either matches the key or it does not. A machine could
easily accomplish this task.
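Because objective scoring involves no judgment, it is easy to automate. A minimal sketch in Python (the item numbers, answer key, and responses below are hypothetical):

```python
# Objective scoring: compare each response to the answer key.
# A response either matches the key or it does not; no judgment
# about quality is involved.

def score_objectively(responses, key):
    """Return how many responses exactly match the answer key."""
    return sum(1 for item, answer in key.items()
               if responses.get(item) == answer)

# Hypothetical answer key and one student's responses.
key = {1: "T", 2: "F", 3: "b", 4: "d"}
responses = {1: "T", 2: "T", 3: "b", 4: "d"}

print(score_objectively(responses, key))  # 3 of 4 items match the key
```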
Subjective scoring requires the scorer to make judgments about the correctness and/or quality
of the response. It may be done holistically (all or nothing) or analytically (partial credit).
Analytic scoring requires the use of a checklist or rating scale, or some combination of the two.
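Analytic scoring can be sketched the same way: the scorer's checklist becomes a set of criteria, each worth partial credit. The human scorer still makes the judgments; the code only totals them. The rubric criteria and point values below are hypothetical:

```python
# Analytic (partial-credit) scoring: sum points over a checklist.
# Unlike objective scoring, a person must judge whether each
# criterion was met; this function only totals those judgments.

def analytic_score(judgments, rubric):
    """Total the points for every criterion the scorer marked as met."""
    return sum(points for criterion, points in rubric.items()
               if judgments.get(criterion))

# Hypothetical checklist for a short essay item.
rubric = {"states a thesis": 2,
          "gives supporting evidence": 2,
          "uses correct grammar": 1}
judgments = {"states a thesis": True,
             "gives supporting evidence": False,
             "uses correct grammar": True}

print(analytic_score(judgments, rubric))  # 3 of 5 possible points
```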
Subjective scoring is necessarily less reliable than objective scoring.
The items are “objective” because that is how they are scored.
Objectively scored items are called Selected-Response Items. These include alternate-response, matching, multiple-choice, and sometimes keyed-response items.
Subjective Test Items (and Alternative Assessments) require students to create or construct
a correct or best answer.
The items (or assessments) are “subjective” because that is how they are scored.
Subjectively scored items are called Constructed-Response Items. These include keyed
response items, fill-in-the-blank, short answer, and essay items. Alternative assessments, such
as performance and product-oriented tasks, are also scored subjectively.
They are called “alternative assessments” because they are an alternative to paper and pencil
assessments.
3. Congruence of Items with Objectives
Help ensure that your score-based decisions are valid (you measured what you intended
to measure) by paying attention to the “match” between the behavior described in the
objective and the task the student will perform for the assessment. Every item on your
test (or task in your assessment) must have a corresponding objective that it is
measuring.
Analyze the performance described in the instructional objective and create a test item
or assessment task that calls forth that performance.
Examples:
4. Developing Selected-Response Test Items
A. Alternate-Choice Items
There are two choices to select from: T/F, Y/N, Noun/Verb, etc.
1. Make sure that the choices (T/F, Y/N, Norm/Criterion Referenced) logically
match the judgments.
3. Rarely use negative statements (never use them with very young children or
children with special needs!), and never use double negatives. Always bold or
underline negative words in your items.
5. Never quote statements directly from the text.
Quotes taken directly from the text often lose their meaning without the
context of the paragraph they are taken from.
HINT: If you can write both a true and a false version of an alternate choice item, it is
probably a good alternate choice item.
C. Matching Items
Consist of the premises (questions) and responses. They are used to measure
knowledge of simple associations, such as between terms and their definitions.
D. Matching Guidelines
2. Use only homogeneous material for each set of items to avoid giving clues to
the correct response.
POOR Example:
Column A Column B
1.___Third level in Bloom’s Taxonomy a. Test
2.___Criteria for writing objectives b. Application
3.___An instrument used to collect c. Measurable
samples of student behavior d. Observable
3. Label your columns so you can refer to those labels in your directions.
4. Keep the responses brief and arrange them in logical order
(e.g., alphabetically, numerically, or chronologically). Premises are read
once; the list of responses is read many times. Students should be able to
read over that list quickly.
POOR Example:
Column A Column B
1. ____First United States Astronaut a. Alexander Graham Bell
2. ____First president of the U.S. b. Christopher Columbus
3. ____Invented the Telephone c. John Glenn
4. ____Discovered America d. George Washington
6. Place all premises and responses for a matching item on a single page.
E. Keyed-Response Items
These are a cross between Matching and Fill-in-the-blank item types. Questions are presented
with a key word missing (the blank). Responses are presented as a “word bank” or “answer
bank”. Students write the missing word from the bank on the blank you provide.
We recommend using these items to help young students get used to writing on tests and to
reinforce writing and spelling skills in general. However, using this type of item can introduce
new sources of error to your scoring scheme.
F. Multiple-Choice Items
Consist of a stem (the question) and the alternatives, which are the distractors and
one correct answer.
PROS: Allows for diagnosis of anticipated errors and misconceptions when plausible
alternatives are used.
Can be used to measure high levels of learning!
Can measure multiple concepts in one item
Do not require homogeneous material
Easy to administer and score
CONS: Writing plausible alternatives can be difficult and time-consuming
Possible to guess correct answers when distractors are not plausible or clues
are given in the question
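The diagnostic use of distractors noted above can also be automated once responses are collected: if each distractor targets a known misconception, a simple tally shows which misconceptions are most common in the class. A small sketch (the student responses below are hypothetical):

```python
# Tally how many students chose each alternative on one item.
# High counts on a particular distractor point to the
# misconception that distractor was written to catch.
from collections import Counter

responses = ["b", "a", "b", "c", "a", "a", "b", "d"]  # hypothetical class data
tally = Counter(responses)

for choice in "abcd":
    print(choice, tally[choice])  # count of students picking each alternative
```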
Examples:
Direct Question:
What is the capital of Florida?
a. Miami
b. Tallahassee
c. Tampa
d. Ft. Lauderdale
Incomplete Statement:
The capital of Florida is ___________.
a. Miami
b. Tallahassee
c. Tampa
d. Ft. Lauderdale
2. The stem should be clearly stated and should carry as much of the
information as possible, so that the alternatives can be kept short.
3. Avoid negatively stated items. Always emphasize negatives when you do use
them.
6. Avoid unintended clues by not having verbal associations between the stem and
response.
POOR Example:
Which of the following agencies should you contact to find out about a tornado warning in your
locality?
A. State farm bureau
B. Local radio station
C. United States Post Office
D. U.S. Weather Bureau
9. Avoid “none of the above” and “all of the above”. Teachers tend to use these
when they can’t think of another distractor, so they usually are not the
correct response and don’t distract anyone; they are just wasted space on
the test. However, “none of the above” can be useful on some math tests
because it forces students to work problems all the way out.
POOR Example:
Which of the following birds does not fly?
A. Turkey
B. Heron
C. Lark
D. None of the above
7. Check for Quality
1. Write the test at least several weeks before you plan to administer it. Read
over the test several days after finishing it. You will find things to correct!
Make out the answer key before you copy the test!!!!!
2. Take the test yourself to check for errors and time restrictions. Remember,
it will take your students four or five times longer than it takes you.
3. Have a colleague take the test to check for errors and time
restrictions.
4. Get feedback from your high-achieving students about the clarity of
directions and items. Use their suggestions to improve your test.