
Test Assessment

WHAT IS A TEST?
•A set of written or
spoken questions used
for finding out how much
someone knows about a
topic
TYPES OF TESTS
• Multiple-choice questions
• Fill-in-the-blank (good for grammar)
• Matching exercises (good for vocabulary)
• True or false
• Written sections, such as answering questions or writing short essays
• Reading comprehension
• Open tests
• Close tests
EDUCATIONAL TESTS
- Primary function is the
measurement of results or
effects of instruction.

❖ Ex. Achievement Tests.


MASTERY TESTS
Achievement tests which measure the
degree to which an individual has
mastered certain instructional
objectives or specific learning
outcomes.
SURVEY TESTS
Measure a student’s general level of achievement across a broad range of learning outcomes.
INDIVIDUAL TESTS
Administered on a one-to-one basis
using questioning

Ex. Individual Intelligence tests


GROUP TESTS

Administered to groups
of individuals
POWER TESTS
• Items are arranged in increasing
order of difficulty
• Measures the individual’s ability to answer increasingly difficult items within a given field.
SPEED TESTS
Measure the speed and accuracy with which the pupil is able to respond to the items.
VERBAL TESTS

- Make use of words

- A verbal test consists of items measuring vocabulary, verbal reasoning, comprehension, etc.

Examples: verbal reasoning tests, aptitude tests, etc.


NONVERBAL TESTS

- Paper and pencil tests or oral tests


- May involve drawings or physical
objects
Example: Non-verbal reasoning test
INFORMAL TESTS

- Constructed by classroom teachers

Ex: quizzes, long tests, etc.


STANDARDIZED TEST
Constructed by test experts, administered and scored under standard conditions.
CRITERION-REFERENCED TEST
- Compares an individual's performance to the
acceptable standard of performance
- Requires completely specified objectives.

Applications
- Diagnosis of individual skill deficiencies
- Evaluation and revision of instruction
NORM-REFERENCED TEST

- Compares an individual's performance to the performance of others.
- Requires varying item difficulties.

Ex: College entrance exams


CONSTRUCTING / IMPROVING
MAIN STEM

• The main stem of the test item may be constructed in question form, completion form, or direction form.
QUESTION FORM
Which is the same as four hundred
seventy?
a.
b.
c.
COMPLETION FORM
Four hundred seventy is the same
as_____.
a.
b.
c.
DIRECTION FORM
Add: 22
+ 43
a.
b.
c.
• The main stem should be clear.
• The question should not be trivial.
• Questions that tap only rote learning
and memory should be avoided.
• Questions should tap only one ability.
• Each question should have only one
answer, not several possible
answers.
CONSTRUCTING/IMPROVING
ALTERNATIVES

• Alternatives should be as closely related to each other as possible.
• Alternatives should be arranged in natural order.
• Alternatives should be arranged according to length: from
shortest to longest or vice versa.
• Alternatives should have grammatical parallelism.
• Arrangement of correct answers should not follow any pattern.
RULES FOR CONSTRUCTING
ALTERNATIVE-RESPONSE ITEMS

• Avoid specific determiners.


• Avoid a disproportionate number of either true or false
statements.
• Avoid the exact wording of the textbook.
• Avoid trick statements.
• Limit each statement to the exact point to be tested.
• Avoid double negatives.
• Avoid ambiguous statements.
• Avoid unfamiliar, figurative, or literary language.
• Avoid long statements, especially those involving complex
sentence structures.
• Avoid quantitative language wherever possible.
• Commands cannot be “true” or “false”.
• Require the simplest possible method of indicating the
response.
• Indicate with a short line or parentheses ( ) where the response is to be recorded.
• Arrange the statements in groups.
RULES FOR CONSTRUCTING COMPLETION
ITEMS
• Avoid indefinite statements.
• Avoid over-mutilated statements (those with too many blanks).
• Omit key words and phrases, rather than trivial details.
• Avoid lifting statements directly from the text.
• Make the blanks of uniform length.
• Avoid grammatical clues to the correct answer.
• Try to choose statements in which there is only one
correct response for the blanks.
• The required response should be a single
word or a brief phrase.
• Arrange the test so that the answers are in
the column at the right of the sentences.
• Avoid unordered series within an item.
• Prepare a scoring key that contains all
acceptable answers.
• Allow one point for each correctly filled
blank.
SUGGESTIONS FOR CONSTRUCTING
MATCHING EXERCISES
• Be careful about what material is put into the
question column and what is put into the option
column.
• Include only homogeneous material in each matching exercise.
• Check each exercise carefully for unwarranted
clues that may indicate matching parts.
• Be sure that the students fully understand the bases
on which matching is to be done.
• Put items on the left and number them; put options on the right and designate them by letters.
• Arrange items and options in systematic order.
• Place all the items and options for a matching type
exercise on a single page, if possible.
• Limit a matching exercise to not more than 10-15
items.
SUPPLY TESTS
- Require examinees to recall and supply the answer

Ex. essay tests


ADVANTAGES OF USING ESSAY
QUESTIONS

• Allows the student to express himself in his own words.
• Measures complex learning
outcomes.
• Promotes the development of
problem-solving skills.
ADVANTAGES OF USING ESSAY QUESTIONS

• Easy and economical to administer.
• Encourages good study habits in
students.
• Does not encourage guessing
and cheating during testing.
Example:
Poor: Compare the Democratic and Republican
parties.

Better: Compare the current policies of the Democratic and Republican parties with regard to the role of
government in private business. Support your
statements with examples when possible. (Your
answer should be confined to two pages. It will be
evaluated in terms of the appropriateness of the facts
and examples presented and the skill with which it is
organized.)
4. Indicate an approximate time limit for each question.

• As each question is constructed, the teacher should estimate the approximate time needed for a satisfactory response.
5. Avoid the use of optional questions.

• The use of optional questions can compromise the validity of the test results, since students end up answering different sets of questions.
SCORING ESSAY QUESTIONS
Tips to remember…
• Use clear specifications of scoring criteria
• Inform students of scoring criteria
• Use an initial review to find “anchor” responses for
comparison
• Use descriptive rather than judgmental scores or
levels (“writing is clear and thoughts are complete”
vs. “excellent”)
SCORING FOR RESTRICTED RESPONSE ESSAY
QUESTIONS
• In most instances, the teacher should write an
example of an expected response
• For example, if the student is asked to describe
three factors that contributed to the start of the
Civil War, the teacher would construct a list of
acceptable reasons and give the student 1 point
for each of up to three reasons given from the
list
SCORING FOR EXTENDED-RESPONSE ESSAY
QUESTIONS

Analytic Scoring Rubrics


• Consist of a rubric broken down into key
dimensions that will be evaluated
• Enables teacher to focus on one characteristic
of a response at a time
• Provides maximum feedback for students
Holistic Scoring Rubrics
• Yield a single overall score taking into account the
entire response
• Can be used to grade essays more quickly
• Does not provide as much specific feedback as
analytic rubric
• Should not consist of scores alone, but rather
contain scores accompanied by statements of the
characteristics of the response
SUGGESTIONS FOR SCORING ESSAY QUESTIONS

• Prepare an outline of the expected answer in advance and use a clear scoring rubric
• Use the scoring rubric that is most
appropriate
• Decide how to handle factors that
are irrelevant to the learning
outcomes being measured
ESTABLISHING TEST VALIDITY

• Validity can best be defined as the degree to which a test is capable of achieving certain aims. It is sometimes defined as truthfulness.
KINDS OF VALIDITY

Content Validity
- Related to how adequately the content of the test samples the domain about which inferences are to be made.

Criterion-Related Validity
- Pertains to the empirical technique of studying the relationship between the test and some independent external measures (criteria).

Construct Validity
- The degree to which the test scores can be accounted for by certain explanatory constructs in a psychological theory.
