
Concepts of Testing and Assessment

Testing and assessment are not synonymous with each other, but they are partners in the teaching-learning process.

Meaning of Testing and Assessment

Testing is defined as the administration of tests and the use of test results to determine whether learners can be promoted to the next grade/year level or must be retained in the same grade/year level and undergo a restudy of the same lessons.

Assessment refers to collecting data on performance and analyzing and interpreting those data, for example rubric evaluations, by using statistical techniques to arrive at valid results.

Scope of Assessment

1. Assessment of curricular offerings
• The course offerings must be assessed to determine if they are relevant, realistic, and responsive to the changing needs and problems of society.
2. Assessment of school programs
• The school programs must be appraised to determine if teachers are not overloaded or underloaded.
3. Assessment of instructional materials
• The instructional materials like books, reference books, visual aids, and devices must be evaluated to ascertain if they are adequate and updated. Books and reference books that are more than ten years old must be revised.
4. Assessment of instructional facilities
• The instructional facilities like classrooms, computers, projectors, audio-visual equipment, laboratory equipment, chemicals, physical plant, library holdings, and many others must be assessed to determine if they are adequate.
5. Assessment of teachers and professors
• The teachers and professors must be evaluated to determine if they are qualified and competent and can deliver efficient services to the students effectively and excellently.
6. Assessment of students
• The students must be assessed to determine whether they have reached the goals of the learning tasks. Each student must have an individual portfolio for easy reference about his or her achievements.
7. Assessment of graduates
• Graduates must be evaluated to determine if they have passed the PRC Licensure Examination for Teachers (LET) and whether they are employed, underemployed, or unemployed.
8. Assessment of school managers
• School managers must be assessed to determine if they are democratic leaders who are approachable to their subordinates; understand the problems and needs of the teachers/professors and students; and are not corrupt but honest.
9. Assessment of research
• Research activities must be evaluated to determine if all teachers/professors have conducted research efficiently and effectively.
• There are three functions of teachers in the Department of Education (DepEd) and private institutions, namely Instruction, Research, and Extension.
• Professors in State Universities and Colleges (SUCs) perform four mandated functions, namely Instruction, Research, Extension, and Production.
• The SUCs are evaluated yearly by the DBM on their performance of the four mandated functions.
10. Assessment of extension
• Extension activities must be assessed to determine if all teachers/professors hold extension classes in the different adopted barangays.

Functions of Assessment

1. Measures Students' Achievement
• By giving tests to students and assessing the test results, it can be determined whether the students have reached the goals of the learning tasks or not.
2. Evaluates Instruction and Teaching Strategies
• For instance, the teacher uses an unstructured approach in teaching Statistics to Grade 10 students and gives an achievement test after presenting the lesson; the test results then indicate whether that strategy was effective.
3. Assesses Lessons to be Re-taught
• If teaching is ineffective, as evidenced by poor test results, item analysis comes in. Items with a difficulty index of 49% and below need to be re-taught (a small worked sketch appears after this list).
4. Evaluates the School's Programs
• Assessment evaluates the school's programs to determine if they are relevant, realistic, and responsive to the needs of society.
5. Motivates Learning
• Upon knowing the results of an achievement test, a student's enthusiasm is aroused if he gets a high score. Otherwise, if his score is low, he strives hard to get a higher score in the next examination.
6. Predicts Success or Failure
• Success or failure of students is predicted through assessment of their learning.
7. Diagnoses the Nature of Difficulties
• The weaknesses of the students can be detected through assessment of their learning.
8. Evaluates Teachers' Performance
• The performance of teachers is determined through assessment of students' learning.
9. Evaluates the School's Facilities and Resources
• The adequacy or inadequacy of the school's facilities and resources is determined through assessment of students' learning.
10. Evaluates School Managers' Performance
• Assessment evaluates the performance of school managers.
• The result of accreditation rests on the performance of school managers.
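A minimal sketch of the item-analysis rule in function 3, assuming each item has already been scored 1 for a correct response and 0 for a wrong one; the function name, the sample data, and the 0.49 cut-off variable are illustrative only, not taken from these notes.

def items_to_reteach(item_scores, cutoff=0.49):
    # item_scores maps an item label to a list of 1/0 scores, one entry per student
    flagged = []
    for item, scores in item_scores.items():
        difficulty = sum(scores) / len(scores)  # proportion of students who answered the item correctly
        if difficulty <= cutoff:                # 49% and below: candidate for re-teaching
            flagged.append((item, difficulty))
    return flagged

print(items_to_reteach({"Item 1": [1, 1, 1, 0, 1], "Item 2": [1, 0, 0, 1, 0]}))
# prints [('Item 2', 0.4)], since only 2 of 5 students (40%) got Item 2 right

Here the difficulty index is simply the proportion of examinees who answered the item correctly, so an item answered correctly by only 40% of the class falls at or below the 49% cut-off and is flagged for re-teaching.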
Kinds of Tests

Intelligence test
• This test measures the intelligence quotient (IQ) of an individual and classifies him as genius, very superior, high average, average, low average, borderline, or mentally defective.

Personality test
• This test measures the ways in which the individual interacts with other individuals, or the roles an individual has assigned to himself and how he adjusts in society.

Aptitude test
• This kind of test is a predictive measure of a person's likelihood of benefiting from instruction or experience in a given field such as the arts, music, clerical work, mechanical tasks, or academic studies.

Prognostic test
• This kind of test forecasts how well a person may do in a certain school subject or line of work.

Performance test
• This is a measure in which the examinee accomplishes the learning task through actual performance or manipulation of materials, with minimum use of language or none at all.

Diagnostic test
• This test identifies the weaknesses of an individual's achievement in any field, which serve as the basis for remedial instruction. Ex. Iowa Silent Reading Test

Achievement test
• This test measures how much the students have attained in the learning tasks. Ex. NAT

Preference test
• This test measures the vocational or academic interest, or the aesthetic judgment, of an individual by forcing the examinee to make forced choices between members of paired or grouped items. Ex. Kuder Preference Record

Scale test
• This test is a series of items arranged in order of difficulty. Ex. Binet-Simon Scale

Speed test
• This test measures the speed and accuracy of the examinee within an imposed time limit. It is also called an alertness test. It consists of items of uniform difficulty.

Power test
• This test is made up of a series of items arranged from the easiest to the most difficult.

Standardized test
• This test provides exact procedures for controlling the method of administration and scoring, with norms and data concerning the reliability and validity of the test.

Teacher-made test
• This test is prepared by classroom teachers based on the contents stated in the syllabi and the lessons taken by the students.

Placement test
• This test is used to determine the job an applicant should fill in the school setting, or the grade or year level in which a student should be enrolled after having stopped schooling.

Characteristics of Assessment Methods

1. VALIDITY
• means the degree to which a test measures what it intends to measure, or the truthfulness of the response
• the validity of a test concerns what the test measures and how well it does so
• the validity of a test must always be considered in relation to the purpose it is to serve
• validity is always specific in relation to some definite situation
• a valid test is always reliable, although a reliable test is not always valid

Types of Validity

1. Content Validity
• means that the content or topics covered are truly representative of the course
• is described by the relevance of a test to different types of criteria
• depends on the relevance of the individual's responses to the behavior area under consideration rather than on the apparent relevance of the item content
• is commonly used in assessing achievement tests
• is particularly appropriate for criterion-referenced measures

2. Concurrent Validity
• the degree to which the test agrees or correlates with a criterion set up as an acceptable measure

3. Predictive Validity
• this validity is determined by showing how well predictions made from the test are confirmed by evidence collected at some later time

4. Construct Validity
• the extent to which the test measures a theoretical trait
• Ex. intelligence and aptitude tests
2. RELIABILITY
• means the extent to which a test is consistent and dependable
• the test agrees with itself
• it is concerned with the consistency of responses from moment to moment
• however, a reliable test may not always be valid

Techniques in Testing the Reliability of an Assessment Method

Test-retest Method
• the same test is administered twice to the same group of students and the correlation coefficient is determined

Disadvantages of the Test-retest Method
• When the time interval is short, the respondents may recall their previous responses, and this tends to make the correlation high.
• When the time interval is long, factors such as unlearning and forgetting, among others, may occur and may result in a low correlation of the test.
• Regardless of the time interval separating the two administrations, other varying environmental conditions such as noise, temperature, lighting, and other factors may affect the correlation coefficient of the test.

Spearman rank correlation coefficient (test-retest method formula)
• This is the statistical tool used to measure the relationship between paired ranks assigned to individual scores on two variables, X and Y: the first administration (X) and the second administration (Y).

To substitute into the formula, the steps are as follows:

Step 1. Rank the scores of the respondents from highest to lowest in the first administration (X) and mark this rank as Rx. The highest score receives rank 1; the second highest, 2; the third highest, 3; and so on.

Step 2. Rank the second set of scores from the second administration (Y) in the same manner as in Step 1 and mark the ranks as Ry.

Step 3. Find the difference for every pair of ranks to get D.

Step 4. Multiply each difference by itself to get D².

Step 5. Add all the D² values to get ΣD².

Step 6. Compute Spearman rho (rs) by using the formula.
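The formula referred to in Step 6 is not reproduced in this copy of the notes. The standard rank-difference form of Spearman rho is

rs = 1 - (6 * ΣD²) / [N(N² - 1)]

where D = Rx - Ry for each pair of ranks and N is the number of paired scores. A minimal Python sketch of Steps 1 to 6 follows; the function names and the sample scores are illustrative, not taken from the notes, and the ranking assumes no tied scores.

def ranks(scores):
    # Steps 1-2: the highest score gets rank 1, the next highest rank 2, and so on (ties not handled)
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    r = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)                         # Steps 1-2: Rx and Ry
    d_squared = [(a - b) ** 2 for a, b in zip(rx, ry)]  # Steps 3-4: D and D² for every pair
    n = len(x)
    return 1 - (6 * sum(d_squared)) / (n * (n ** 2 - 1))  # Steps 5-6: ΣD² and the formula

x = [38, 25, 47, 30, 42]   # first administration
y = [36, 28, 45, 27, 40]   # second administration
print(spearman_rho(x, y))  # prints 0.9, a high positive correlation between the two administrations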
Interpretation of Correlation Value

Bear in mind that the perfect correlation is 1.0. If the computed correlation is more than 1.0, there is something wrong in the computation.

Parallel Forms Method
• Two parallel forms of the test are administered to the same group of students and the paired observations are correlated.
• In constructing parallel forms, the two forms of the test must be built so that the content, type of test item, difficulty, and instructions for administration are similar but not identical.
• The Pearson product-moment correlation coefficient is the statistical tool used to determine the correlation of the parallel forms.

Parallel Forms Method formula:
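The Pearson formula itself is not printed in this copy. Its usual raw-score form, where X and Y are the paired scores of each student on the two forms and N is the number of examinees, is

r = [NΣXY - (ΣX)(ΣY)] / √{[NΣX² - (ΣX)²][NΣY² - (ΣY)²]}

A value close to +1.0 means the two forms rank the students in nearly the same way, which is the evidence of reliability this method looks for.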
Split-Half Method
• The test is administered once, but the test items are divided into two halves.
• The two halves of the test must be similar but not identical in content, number of items, difficulty, means, and standard deviations.
• Each student obtains two scores, one on the odd items and the other on the even items of the same test.
• The scores obtained on the two halves are correlated.
• The result is the reliability coefficient for a half test.
• Since this reliability holds only for a half test, the reliability coefficient for the whole test is estimated by using the Spearman-Brown formula.

Split-Half Method formula: the Spearman-Brown formula
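The Spearman-Brown formula is not reproduced in this copy. In its usual split-half form it steps the half-test correlation (r_half) up to an estimate for the whole test:

r_whole = (2 * r_half) / (1 + r_half)

For example, if the correlation between the odd-item and even-item halves is 0.60, the estimated reliability of the whole test is (2 * 0.60) / (1 + 0.60) = 0.75.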
Internal Consistency Method
• is used in psychological tests that consist of dichotomously scored items
• the examinee either passes or fails an item
• a rating of 1 (one) is assigned for a correct answer and 0 (zero) for an incorrect response
• this method is applied by using the Kuder-Richardson Formula 20

Internal Consistency formula: the Kuder-Richardson Formula 20
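The Kuder-Richardson Formula 20 is likewise not printed in this copy; its standard form is

KR20 = [k / (k - 1)] * [1 - (Σpq / σ²)]

where k is the number of items, p is the proportion of examinees who answered an item correctly, q = 1 - p, and σ² is the variance (the squared standard deviation) of the total test scores, which is why the mean and standard deviation formulas below are needed.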
Formula to get the mean:
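The formula is missing from this copy; the mean of the students' total scores is ordinarily computed as

Mean = ΣX / N

where ΣX is the sum of all the scores and N is the number of students.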
Formula to get the standard deviation:
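This formula is also missing from this copy; one common form of the standard deviation of the total scores is

SD = √[ Σ(X - Mean)² / N ]

Some texts divide by N - 1 instead of N. Squaring the SD gives the σ² used in the KR-20 formula above.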


3. PRACTICABILITY
• is the third characteristic of an assessment method
• it means the test can be satisfactorily used by teachers and researchers without undue expenditure of time, money, and effort
• in other words, practicability means usability

Factors that Determine Practicability

1. Ease of Administration
• To facilitate ease of administration of the test, the instructions must be complete and precise.
2. Ease of Scoring
• Ease of scoring depends upon the following: (1) the construction of the test is objective, (2) answer keys are adequately prepared, and (3) scoring directions are fully understood.
3. Ease of Interpretation and Application
• Results of the test are easy to interpret and apply if tables of norms are presented.
• As a rule, norms must be based on age and grade/year level, as in the case of school achievement tests.
• It is also desirable for achievement tests to be provided with separate norms for rural and urban learners and for various degrees of mental ability.
4. Low Cost
• It is more practical if the test is low-cost, material-wise.
• It is also economical if the test can be reused by future teachers.
5. Proper Mechanical Make-up
• A good test must be printed clearly in a font size appropriate for the grade or year level for which the test is intended.

4. JUSTNESS
• is the fourth characteristic of an assessment method
• it is the degree to which the teacher is fair in assessing the grades of the learners
• the learners must be informed of the criteria on which they are being assessed

Morality in Assessment
• it is the degree of secrecy of the grades of the learners
• Morality or ethics in assessment requires that test results or grades be kept confidential to save slow learners from embarrassment.
Assessment Tools

The assessment tools of teacher-made tests are divided into (1) objective tests and (2) essay tests.

Objective Tests

There are two main types of objective tests: the recall type and the recognition type.

Recall Type

a. Simple-recall Type
• This type of test is the easiest to construct among the objective types of tests because the item appears as a direct question, a sentence, a word, or a phrase, or even a specific direction.
• The response requires the examinee to recall previously learned lessons, and the answers are usually short, consisting of either a word or a phrase.
• This test is applicable in Mathematics and in natural science subjects like Biology, Chemistry, and Physics, where the stimulus appears in the form of a problem that requires computation.

Rules and Suggestions for the Construction of the Simple-recall Type

1. The test items must be so worded that the response is as brief as possible, preferably a single word, number, symbol, or brief phrase. This objectifies and facilitates scoring.
2. The direct question is preferable to the statement form. It is easier to phrase and more natural to the examinees.
3. The blanks for the responses must be in a column, preferably at the right of the items. Although writing the answer before the item at the left has been practiced for more than a century, it is now obsolete; the right-column arrangement facilitates scoring and is more convenient for the examinees because they write their responses directly in the right column without having to turn back to the left.
4. The question must be so worded that there is only one correct response. Whenever this is impossible, all acceptable answers must be included in the scoring key.
5. Make minimum use of textbook language in wording the questions. Unfamiliar phrasing reduces the possibility of correct responses that represent mere rote verbal association.

b. Completion Test
• This test consists of a series of items that require the examinee to fill the blank with the missing word or phrase to complete a statement.
• A test item may contain one or more blanks.
• This type of test is also called a fill-in-the-blank test.

Rules and Suggestions for the Construction of the Completion Test

1. Give the student a reasonable basis for the responses desired. Avoid indefinite statements.
2. Avoid giving the examinee unwarranted clues to the desired response. There are several ways in which clues are often carelessly given. The following suggestions may help prevent common errors in constructing a completion test.
a. Avoid lifting statements directly from the book.
b. Omit only key words or phrases rather than trivial details.
c. Whenever possible, avoid "a" or "an" before a blank. These articles give a clue as to whether the response starts with a consonant or a vowel.
d. Do not indicate the expected response by varying the length of the blank or by using a dot for each letter of the correct answer.
e. Guard against the possibility that one item or part of the test may suggest the correct response to another item.
f. Avoid giving grammatical clues to the expected answer.
3. Arrange the test so as to facilitate scoring.
a. Allow one point for each blank correctly filled. Avoid fractional credits or unequal weighting of items in a test.
b. Select items to which only one correct response is possible.
c. Arrange the items so that the examinees' responses are in a column at the right of the sentences.
d. Scoring is more rapid if the blanks are numbered and the examinee is directed to write in the appropriately numbered blanks.
e. Prepare a scoring key by writing all acceptable answers on a separate sheet of the test paper.

Recognition Type

Alternative-Response Test
• This test consists of a series of items, each of which admits only one correct response chosen from two or three constant options. This type is commonly used in classroom testing, particularly the two-constant-alternative type, namely the true-false or yes-no type.
• Other types of constant-alternative-response tests are the three-constant-alternative type, i.e., true-false-doubtful, and the constant alternative with correction, i.e., the modified true-false type.
Suggestions for the Construction of a True-False Test

1. The test items must be arranged in groups of five to facilitate scoring. The groups must be separated by two single spaces and the items within a group by a single space.
2. The manner of indicating the response must be as simple as possible; a single letter is enough to facilitate scoring, for instance, T for True and F for False, or X for True and O for False. It is better if the responses are placed in one column at the right margin to facilitate scoring.
3. The use of statements lifted from the book must be avoided to minimize rote memorization in studying.
4. The items must be carefully constructed so that the language is within the level of the examinees; hence, flowery statements should be avoided.
5. Specific determiners like "all," "always," "none," "never," "not," "nothing," and "no" are more likely to make a statement false and should be avoided.
6. Determiners such as "may," "some," "seldom," "sometimes," "usually," and "often" are more likely to make a statement true; hence, these determiners must be avoided because they give an indirect suggestion of the probable answer.
7. Qualitative terms like "few," "many," "great," "frequent," and "large" are vague and indefinite and must be avoided.
8. Statements which are partly right and partly wrong must be avoided.
9. Statements must be so constructed that they are wholly true or wholly false.
10. Ambiguous and double-negative statements must be avoided.

Multiple-choice Test
• This test is made up of items consisting of three or more plausible options for each item. The choices are multiple, so the examinees may choose only one correct or best option for each item.
• The multiple-choice test is regarded as one of the best test forms for testing outcomes. This test form is most valuable and widely used in standardized tests due to its flexibility and objectivity in scoring.
• For teacher-made tests, it is applicable in testing vocabulary, reading comprehension, relationships, interpretation of graphs and tables, and drawing of inferences from a set of data.

Suggestions for the Construction of a Multiple-Choice Test

1. Statements borrowed from the textbook or reference book must be avoided. Use unfamiliar phrasing to test the comprehension of students.
2. All options must be plausible relative to one another so that the distractors, or incorrect responses, attract the students and only those at a high intellectual level get the correct answer.
3. All options must be grammatically consistent. For instance, if the stem is singular, the options are also singular.
4. The articles "a" and "an" are avoided as the last word of an incomplete stem. These articles give students a clue that the probable answer starts with a consonant or a vowel.
5. Four or more options must be included in each item to minimize guessing.
6. The position of the correct answer across items must be randomly arranged rather than follow a regular pattern.
7. A uniform number of options must be used in all items. For instance, if there are four options in Item 1, the rest of the items must also have four options.
8. The correct option must be of about the same length as the distractors or wrong answers.
9. The homogeneity of the options must be increased so that the examinee cannot pick out the correct option by mere logical elimination.
10. The simplest method of indicating a response must be used to facilitate scoring. For instance, the options in each item are numbered or lettered; hence, the choice is either a number or a letter.

Kinds of Multiple-Choice Test

Stem-and-options variety
• This variety is most commonly used in classroom testing, board examinations, civil service examinations, and many others. The stem serves as the problem and is followed by four or more plausible options from which the examinees choose the best option or correct answer.

Setting-and-options variety
• The optional responses in this type of multiple-choice test depend upon a setting or foundation of some sort. A setting can be in the form of a sentence, a paragraph, a graph, an equation, a picture, or some other form of representation.

Group-term variety
• This variety consists of a group of words or terms in which one does not belong to the group.

Structured-response variety
• This variety makes use of structured responses, which are commonly used in natural science classroom testing, for instance: directly, indirectly, and in no way. It tests how good the examinees are at determining which statements are related to each other.

Contained-options variety
• This variety is designed to identify errors in a word, phrase, or sentence within a paragraph.