Test Construction
The easiest way to ensure a representative sample of content and cognitive objectives on the
test is to prepare a test blueprint. This table is simply a two-way chart listing the content topics
on one dimension and the cognitive skills on the other. We want to include content and skills in
the same proportion as they were stressed during instruction. Table 1 shows a simple test
blueprint; it is intended to be illustrative, not comprehensive. Remember to ask more questions
that require higher order thinking skills (HOTS: analyze, evaluate, create) and fewer that
require only lower order thinking skills (LOTS: remember, understand, apply).
This table indicates the content topics, the objectives to be covered and the proportion of the
test that will be devoted to each. Evidently, more class time was spent on objectives 2 and 4,
because 40% of the test questions deal with them, compared with only 10% on each of the other
objectives. The column totals indicate that 50% of the items will be written at the analyze level,
20% each at the apply and understand level, and only 10% at the remember level. Using the
percentages assigned to each cell, one writes the appropriate number of items.
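The last step above is simple arithmetic, and it can be sketched as a short script. The blueprint cells, the percentages, and the 50-item test length below are illustrative assumptions, not values taken from Table 1.

```python
def items_per_cell(percent: float, total_items: int) -> int:
    """Number of items to write for one blueprint cell."""
    return round(total_items * percent / 100)

# Illustrative blueprint: (topic, cognitive skill) -> percent of the test.
# These values are assumed for the example, not copied from Table 1.
blueprint = {
    ("Topic A", "remember"): 10,
    ("Topic A", "understand"): 20,
    ("Topic B", "apply"): 20,
    ("Topic B", "analyze"): 50,
}

TOTAL = 50  # assumed test length

counts = {cell: items_per_cell(pct, TOTAL) for cell, pct in blueprint.items()}
for (topic, skill), n in counts.items():
    print(f"{topic} / {skill}: write {n} items")
```

Because the percentages sum to 100, the cell counts sum back to the planned test length, which is a quick sanity check on any blueprint.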
Coordinating test content with instruction content ensures content validity of the test. Using a
table of specifications also helps an instructor avoid one of the most common mistakes in
classroom tests, namely writing all the items at the knowledge level.
PART II. CONSTRUCTING / WRITING THE TEST
Suggestions for writing Multiple Choice Items which measure Higher Order Thinking Skills:
It is difficult and time-consuming to write multiple-choice items that measure the higher thinking
skills. The item writer has to be creative in order to develop challenging questions. The following
suggestions may provide some ideas for writing these kinds of questions.
1. Present practical or real-world situations to the students. These problems may use short
paragraphs describing a problem in a practical situation. Items can be written which call
for the application of principles to the solution of these practical problems, or the
evaluation of several alternative procedures.
2. Present the student with a diagram of equipment and ask for application, analysis, or
evaluation, e.g., "What happens at point A if...?" or "How is A related to B?"
3. Present actual quotations taken from newspapers or other published sources or
contrived quotations that could have come from such sources. Ask for the interpretation
or evaluation of these quotations.
4. Use pictorial materials that require students to apply principles and concepts.
5. Use charts, tables or figures that require interpretation.
BINARY ITEMS
• A declarative sentence that the student is asked to mark true or false, right or wrong,
correct or incorrect, yes or no, fact or opinion, agree or disagree.
• There are only two possible answers (e.g., true or false; fact or opinion).
• Measures the ability to:
o identify the correctness of statements of fact, definitions of terms, statements of
principles, and the like
o distinguish fact from opinion
o recognize cause-and-effect relationships
o do some simple aspects of logic
MATCHING ITEMS
• Presented in groups as a series of stems or prompts that the student must match to one
of a group of possible answer options.
• A useful format when the objective to be measured involves association skills or the
ability to recognize, categorize, and organize information.
• Can measure high levels of understanding, but is most typically used at the knowledge
level and with younger students.
• Can cover a large amount of material in a relatively brief period of time.
• Even more efficient than multiple-choice items, because each stem acts as a separate
multiple-choice item with all the answer options as possible answers; functionally, a
matching exercise containing ten stems operates as ten multiple-choice items.
• When well written, all the wrong answer options act as distractors, guessing is difficult,
and the validity and reliability of classroom tests are improved.
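The claim that guessing is difficult on a well-constructed matching exercise can be made concrete with a rough probability sketch. The model below is an assumption for illustration: it treats a ten-stem matching exercise as a one-to-one pairing (each option used exactly once, so a blind guess is a random permutation) and compares it with ten independent four-option multiple-choice items.

```python
import math

def p_all_correct_mc(n_items: int, n_options: int) -> float:
    """Chance of guessing every item right on independent
    multiple-choice items with n_options choices each."""
    return (1 / n_options) ** n_items

def p_all_correct_matching(n_stems: int) -> float:
    """Chance of guessing a one-to-one matching exercise entirely
    right: a blind guess is one of n_stems! equally likely pairings."""
    return 1 / math.factorial(n_stems)

mc = p_all_correct_mc(10, 4)        # ten independent 4-option items
match = p_all_correct_matching(10)  # ten stems, each option used once
print(f"All correct by guessing: MC {mc:.2e}, matching {match:.2e}")
```

Under these assumptions the matching exercise is even harder to guess through than the ten separate items, since 10! pairings exceed 4^10 answer combinations; if options may be reused, the matching case behaves instead like ten 10-option items.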
SHORT-ANSWER AND COMPLETION ITEMS
• Both are supply-type test items that can be answered by a word, phrase, number, or
symbol.
• They are essentially the same, differing only in the method of presenting the problem:
o SHORT-ANSWER ITEM - uses a question
o COMPLETION ITEM - consists of an incomplete statement
• Suitable for measuring a wide variety of relatively simple learning outcomes:
o knowledge of terminology, specific facts, principles, methods, or procedures
o simple interpretation of data
ESSAY ITEMS
• This item allows the student to supply, rather than select, the correct answer.
• The student must compose a response to a question for which no single response or
pattern of responses can be cited as correct to the exclusion of all other answers.
• The accuracy and quality of such a response can often be judged ONLY by a person
skilled and informed in the subject area being tested.
• It should measure complex cognitive skills or processes.
• Distinct feature: freedom of response.
REFERENCES
Cross, T.L. (1990). Testing in the College Classroom. Paper presented at the Annual Meeting of the American
Educational Research Association, Boston.
Milton, O. and Associates. (1978). On College Teaching: A Guide to Contemporary Practices. San Francisco: Jossey-
Bass.
Anderson, L.W., Krathwohl, D.R., Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J., Wittrock,
M.C. (2001). A Taxonomy for Learning, Teaching, and Assessing: A revision of Bloom's Taxonomy of Educational
Objectives. New York: Pearson, Allyn & Bacon.
Bloom, B.S. (Ed.). Engelhart, M.D., Furst, E.J., Hill, W.H., Krathwohl, D.R. (1956). Taxonomy of Educational
Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.
Clark, R., Chopeta, L. (2004). Graphics for Learning : Proven Guidelines for Planning, Designing, and Evaluating
Visuals in Training Materials . San Francisco: Jossey-Bass/Pfeiffer.
BINARY ITEM
Padua, R. and Santos, R. (1997). Educational Evaluation and Measurement: Theory, Practice and Application. Katha
Publishing Co., Inc.
Study Guides and Strategies Website
University of Wisconsin-Eau Claire (2011). Assessment Toolkit: Writing True/False Test Questions.
MATCHING TYPE
Kubiszyn, T. and Borich, G. (2013). Educational Testing and Measurement: Classroom Application and Practice (10th
ed.). John Wiley and Sons, USA.
David Miller, M., Linn, R., and Gronlund, N. (2009). Measurement and Assessment in Teaching (10th ed.). Pearson
Education, Inc., Upper Saddle River, New Jersey.
COMPLETION ITEM
Kubiszyn, T. and Borich, G. (2013). Educational Testing and Measurement: Classroom Application and Practice (10th
ed.). John Wiley and Sons, USA.
David Miller, M., Linn, R., and Gronlund, N. (2009). Measurement and Assessment in Teaching (10th ed.). Pearson
Education, Inc., Upper Saddle River, New Jersey.
ESSAY ITEMS
Gronlund, N. & Linn, R. (1990). Measurement and Evaluation in Teaching (6th ed.). New York:
Macmillan.
Kubiszyn, T. & Borich, G. (2000). Educational Testing and Measurement (10th ed.). New York: John
Wiley & Sons.
Special thanks to Drs. Remedios Dee-Chan, Gerald Alcid, Maria Rosario Cabansag, Elaine Cunanan, Carmelita
Dalupang, Mina Laguesma, Therese Mallen, Anita Sangalang and Julie Visperas.