Downloaded from ltj.sagepub.com at Scientific library of Moscow State University on January 2, 2014
ESL/EFL instructors’ classroom assessment practices: purposes, methods, and procedures

Liying Cheng, Queen’s University; Todd Rogers and Huiqin Hu, University of Alberta
Student assessment plays a central and important role in teaching and learning. Teachers devote a large part of their preparation time to creating instruments and observation procedures, marking, recording, and synthesizing results in informal and formal reports in their daily teaching. A number of studies of the assessment practices used by teachers in regular school classrooms have been undertaken (e.g., Rogers, 1991; Wilson, 1998; 2000). In contrast, less is known about the assessment practices employed by instructors of English as a Second Language (ESL) and English as a Foreign Language (EFL), particularly at the tertiary level. This article reports a comparative survey conducted in ESL/EFL contexts represented by Canadian ESL, Hong Kong ESL/EFL, and Chinese EFL, in which 267 ESL or EFL instructors participated, and documents the purposes, methods, and procedures of assessment in these three contexts. The findings demonstrate the complex and multifaceted roles that assessment plays in different teaching and learning settings. They also provide insights into the nature of assessment practices in relation to ESL/EFL classroom teaching and learning at the tertiary level.
1 We have chosen to use the term ‘instructors’ to refer to those who are teaching ESL/EFL at the tertiary level, and the term ‘teachers’ to refer to those who are teaching in the school system.
III Methodology
For the purposes of the present study, assessment is defined as the process of collecting information about a student to aid in decision-making about the progress and language development of the student. Evaluation is defined as the interpretation of assessment results that describes the worth or merit of a student’s performance in relation to a set of learner expectations or standards of performance.
1 Survey questionnaire
The survey questionnaire (see Appendix 1) comprised five parts illustrating major constructs in classroom assessment (see Code of Fair Testing Practices for Education, 1988; Rogers, 1993; Standards for Teacher Competence in Educational Assessment of Students, 1990), the design of which was based on the studies reviewed in Section I above. The questionnaire was piloted among a small number of instructors in each of the three contexts. Approximately 40 minutes was required to complete the survey questionnaire.
2 Samples
Purposive sampling was employed to select ESL/EFL instructors teaching at universities in the provinces of Alberta, British Columbia, and Ontario in Canada, and in Hong Kong and Beijing in China. The three samples represented, respectively, three ESL/EFL instructional settings: English-dominant, bilingual (English and Cantonese), and Mandarin-dominant, as it was expected that the teaching and learning contexts would differ. In each of these locations, ESL/EFL instructors at each university that formally offered an ESL/EFL program were sent a questionnaire and a self-addressed envelope. Four researchers coordinated the study: two in the West and East of Canada respectively, one in Hong Kong, and one in China. Wherever possible, meetings were held with the coordinator at each university to explain the purpose of the study and to answer any questions prior to the distribution of the questionnaires. A glossary of assessment terms was also provided to promote the validity of the questionnaire. Altogether 461 questionnaires (191 in Canada, 140 in Hong Kong, and 130 in Beijing) were distributed. Of this number, 98 (51.3%) were returned in Canada, 45 (32.0%) in Hong Kong, and 124 (95.3%) in Beijing.
3 Analyses
The responses to the survey questionnaire were entered into a computer file with 100% verification. Examination of the item-level data revealed enough missing responses that 4 respondents were excluded from the analysis. The final sample sizes were 95 for Canada, 44 for Hong Kong, and 124 for Beijing.

Descriptive analyses were used to summarize the bio-demographic information provided by the respondents. These analyses revealed that it was not possible to cross any of the bio-demographic and teaching variables with setting due to insufficient sample sizes in some of the cells. Consequently, the comparative
2 For example, as will be shown, there was a significant difference for the purpose ‘formally document growth in learning of my students’. However, the pattern of differences among the three settings did not meet the property of transitivity. While the percentage for Beijing was significantly lower than the percentage for Canada, the differences between Canada and Hong Kong and between Beijing and Hong Kong were not significant. In cases such as these, no significant difference is claimed.
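The pairwise comparison logic described in this footnote can be sketched with pooled two-proportion z-tests. This is an illustrative sketch only: the counts below are reconstructed approximately from the reported percentages for ‘formally document growth’ (roughly 76/93 for Canada, 26/41 for Hong Kong, and 57/111 for Beijing), and the Bonferroni-adjusted alpha of 0.05/3 is an assumption about the multiple-comparison adjustment, not the study’s documented procedure.

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns the two-sided p-value
    under the normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = 0.5*(1+erf(x/sqrt(2)))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Approximate counts reconstructed from the reported percentages
# (hypothetical reconstruction, not the study's raw data):
canada, hongkong, beijing = (76, 93), (26, 41), (57, 111)

alpha = 0.05 / 3  # assumed Bonferroni adjustment for three pairwise tests
results = {
    "C vs B": two_prop_ztest(*canada, *beijing) < alpha,
    "C vs H": two_prop_ztest(*canada, *hongkong) < alpha,
    "B vs H": two_prop_ztest(*beijing, *hongkong) < alpha,
}
# Only the Canada-Beijing contrast comes out significant; the other two do
# not, so the ordering is not transitive and no difference would be claimed.
print(results)  # {'C vs B': True, 'C vs H': False, 'B vs H': False}
```

Under these assumptions, the Beijing percentage is significantly below Canada’s while neither setting differs significantly from Hong Kong, reproducing the non-transitive pattern the footnote describes.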
IV Results
This section provides background information on the instructors in the three contexts, as these variables influence assessment practices (Breen et al., 1997). This is followed by the results on the purposes, methods, and procedures of assessment used by ESL/EFL instructors.
1 Description of participants
The percentage of male instructors was higher in Hong Kong (36.4%) than in either Canada (14.7%) or Beijing (14.5%). The instructors in Beijing were younger than those in Canada and Hong Kong (83.9% vs. 38.9% and 34.1%, respectively, under 41 years of age). Regarding educational qualifications, all but 3 instructors in Canada and 5 instructors in Beijing possessed a university degree, but there were differences in the highest degree attained: 90.7% of the instructors in Hong Kong possessed a master’s or doctoral degree, in comparison to 55.8% in Canada and 59.5% in Beijing. Lastly, slightly more than 2 out of 5 instructors in Beijing reported that they had attended a course or workshop of more than three hours in which the topic of assessment and evaluation was considered. In contrast, approximately 4 out of 5 instructors in Canada and Hong Kong indicated they had attended a full course on assessment and evaluation (45.3%, 43.1%) or a course in which assessment and evaluation were key topics (31.6%, 52.3%).
The instructors in Hong Kong had more years of ESL/EFL teaching experience than their counterparts in Canada and in Beijing; respectively, 17.16, 13.58, and 10.32 years. More than 90% of the instructors in Hong Kong and Beijing had full-time teaching appointments, as compared to approximately 75% in Canada. The majority of courses taught in Hong Kong and Beijing were university degree courses (93.2% and 93.7%); in Canada they were evenly divided between degree (52.6%) and diploma/certificate courses (45.3%). The ranges and mean numbers of classes taught were similar across the three groups, with the average class size in Beijing the largest (45 students) followed, in turn, by Hong Kong (19 students) and Canada (15 students).
Purpose                                              Canada        Hong Kong     Beijing       Pattern
                                                     n      %      n      %      n      %
Student centered
  Obtain information on my students’ progress        94     98.8   42     97.4   124    97.9
  Provide feedback to my students as they
    progress through the course                      95     98.8   44    100.0   119    86.6   B < (C = H)
  Diagnose strengths and weaknesses in my students   94     97.6   43     89.7   121    86.6
  Determine final grades for my students             94     91.7   43     92.3   115    75.3   B < (C = H)
  Motivate my students to learn                      94     86.9   43     79.5   124    93.8
  Formally document growth in learning of
    my students                                      93     82.1   41     64.1   111    51.6
  Make my students work harder                       91     64.3   41     69.2   119    94.8   (C = H) < B
  Prepare students for tests they will need to take
    in the future (e.g., TOEFL, MELAB, CET)          94     53.4   41      7.7   121    68.0   H < (C = B)
Instruction
  Plan my instruction                                94     92.9   42     66.7   117    87.6   H < (C = B)
  Diagnose strengths and weaknesses in my own
    teaching and instruction                         94     76.2   43     64.1   119    92.8   (H = C) < B
  Group my students at the right level of
    instruction in my class                          91     65.5   41      5.1   119    60.8   H < (C = B)
Administration
  Provide information to the central administration  93     83.3   44     89.7   116    65.0   B < (C = H)
  Provide information to an outside funding agency   88     16.7   40     12.8   108     8.2
Beijing reported using the results from their student assessments and evaluations to plan their instruction, in contrast with two-thirds of the instructors in Hong Kong. Further, slightly more than nine out of 10 instructors in Beijing used their assessment results to diagnose strengths and weaknesses in their own teaching and instruction, compared with approximately three out of four instructors in Canada and fewer than two out of three instructors in Hong Kong. Lastly, while between six and seven out of 10 instructors in Canada and Beijing used their assessments and evaluations to group their students for instructional purposes, only one out of 20 instructors in Hong Kong used the results for this purpose.
Method                                         Canada   Hong Kong   Beijing   Pattern
Instructor-made
  Short answer items                           82.9     66.7        86.1
  Matching items                               75.7     27.7        61.5      H < (C = B)
  Interpretive items                           75.7     40.0        50.8      (H = B) < C
  True-false items                             72.9     33.3        86.9      H < (C = B)
  Multiple-choice items                        68.6     46.7        91.8      (H = C) < B
  Cloze items                                  62.9     40.0        69.7
  Sentence completion items                    61.4     20.0        63.9      H < (C = B)
  Editing                                      50.0      6.7        48.4      H < (C = B)
  Completion of forms (e.g., application)      27.1      6.7        20.5
Student-conducted
  Student summaries of what they read          91.4     66.7        89.3
  Student journal                              61.4      6.7        19.7      (H = B) < C
  Oral interviews/questioning                  58.6     40.0        85.2      (H = C) < B
  Peer assessment                              47.1     20.0        28.7
  Read aloud/dictation                         37.1     20.0        68.8      (H = C) < B
  Self assessment                              40.0     26.7        41.0
  Student portfolio                            35.7     20.0        13.9
Non-instructor developed
  Standardized reading test                    27.1      6.7        83.6      H < C < B
Method                                         Canada   Hong Kong   Beijing   Pattern
Instructor-made
  Short essay                                  91.9     86.0        88.6
  Editing a sentence or paragraph              86.5     53.5        69.5      (H = B) < C
  Long essay                                   55.4     72.1        13.3      B < (C = H)
  Multiple-choice items to identify
    grammatical errors in a sentence           32.4     14.0        48.6      H < (C = B)
  Matching items                               25.7      2.3        25.7      H < (C = B)
  True-false items                             16.2      4.6        26.7
Student-conducted
  Student journal                              73.0     18.6        31.4      (H = B) < C
  Peer assessment                              60.8     41.9        38.1
  Student portfolio                            55.4     44.2        10.5      B < (C = H)
  Self assessment                              51.4     32.6        34.3
Non-instructor developed
  Standardized writing test                    27.0     14.0        75.2      (H = C) < B
writing did not vary significantly among the three settings (see
Table 3).
• Standardized testing: As with reading, the use of a standardized measure, in this case of writing, was significantly greater in Beijing (75.2%) than in Canada (27.0%) and Hong Kong (14.0%).
Method                                               Canada   Hong Kong   Beijing   Pattern
Instructor-made
  Take notes                                         70.3     20.5        70.5      H < (C = B)
  Prepare summaries of what is heard                 69.1     30.8        75.6      H < (C = B)
  Multiple-choice items following listening
    to a spoken passage                              58.0     12.8        83.2      H < C < B
Student-conducted
  Oral presentation                                  95.1     94.9        88.2
  Oral interviews/dialogues                          77.8     51.3        79.8      H < (C = B)
  Oral discussion with each student                  72.8     43.6        73.1      H < (C = B)
  Retell a story after listening to a passage        71.6     10.2        85.7      H < (C = B)
  Provide an oral description of an event or thing   62.9     23.1        71.4      H < (C = B)
  Peer assessment                                    51.8     48.7        21.8      B < (C = H)
  Oral reading/dictation                             49.3     12.8        79.0      H < (C = B)
  Self assessment                                    40.7     35.9        20.2
  Follow directions given orally                     39.5     15.4        41.1      H < (C = B)
  Public speaking                                    37.0     20.5        53.8
  Give oral directions                               29.6      7.7        39.5      H < (C = B)
Non-instructor developed
  Standardized speaking test                         18.5     12.8        32.8
  Standardized listening test                        25.9      5.1        79.0      H < C < B
feedback and to report to their students, and how much time they devoted to assessment-related activities in relation to their teaching.
Proportion of total time    Canada (n = 94)    Hong Kong (n = 43)    Beijing (n = 122)
Acknowledgements
Support for the project was made possible in part through funds from the Social Sciences and Humanities Research Council of Canada.
VI References
Alderson, J.C. and Hamp-Lyons, L. 1996: TOEFL preparation courses: a study of washback. Language Testing 13, 280–97.
Anderson, J.O. 1989: Evaluation of student achievement: teacher practices and educational measurement. The Alberta Journal of Educational Research 35, 123–33.
—— 1990: Assessing classroom achievement. The Alberta Journal of Educational Research 36, 1–3.
Andrews, S. and Fullilove, J. 1993: Backwash and the Use of English oral: speculations on the impact of a new examination upon sixth form English language testing in Hong Kong. New Horizons 34, 46–52.
Breen, M.P., Barratt-Pugh, C., Derewianka, B., House, H., Hudson, C., Lumley, T. and Rohl, M. 1997: Profiling ESL children: how teachers interpret and use national and state assessment frameworks. Canberra City, Australia: Department of Employment, Education, Training and Youth Affairs.
Calderhead, J. 1996: Teachers: beliefs and knowledge. In Berliner, D.C. and Calfee, R.C., editors, Handbook of educational psychology. New York: Macmillan Library Reference, 709–25.
Cheng, L. 1999: Changing assessment: washback on teacher perspectives and actions. Teaching and Teacher Education 15, 253–71.
Cheng, L. and Gao, L. 2002: Passage dependence in standardized reading comprehension: exploring the College English Test. Asian Journal of English Language Teaching 12, 161–78.
Code of Fair Testing Practices for Education 1988: Washington, DC: Joint Committee on Testing Practices. Available online from: http://www.apa.org/science/fairtestcode.html (March 2004).
Cumming, A. 2001: ESL/EFL instructors’ practices for writing assessment: specific purposes or general purposes? Language Testing 18, 207–24.
Davison, C. and Leung, C. 2001: Researching teacher-based language assessment: whose criteria, whose language? Colloquium presented at the American Association of Applied Linguistics, St. Louis, MO.
—— 2002: Problems in (re)interpreting construct validity: diverse communities of practice in school-based teacher assessment. Colloquium presented at the American Association of Applied Linguistics, Salt Lake City, UT.
Purpose/Reason    Yes    No
Reading
If you do not teach reading, please put a check mark here and
go to the next page.
Instruction: Please put a check mark (X) in the space to the left for each method you use to evaluate your students in reading. Spaces have been provided at the end of the list for methods not on the list. If you use other methods, please be sure to write or describe what the other methods are.

1. Read aloud/dictation
2. Oral interviews/questioning
3. Teacher-made tests containing
   a. cloze items
   b. sentence completion items
   c. true-false items
   d. matching items
   e. multiple-choice items
   f. interpretive items (e.g., reading passage; interpret a map or a set of directions)
   g. forms such as an application form or an order form of some kind
   h. short answer items
   i. editing a piece of writing
4. Student summaries of what is read
5. Student journal
6. Student portfolio
7. Peer assessment
8. Self assessment
9. Standardized reading tests
10. Other:
11. Other:
Writing
If you do not teach writing, please put a check mark here and
go to the next page.
Instruction set 1: Please put a check mark (X) in the space to the
left for each method you use to evaluate your students in writing.
Spaces have been provided at the end of the list for methods not on
the list. If you use other methods, please be sure to write or describe
what the other methods are.
Oral skills

Instruction set 1: Please put a check mark (X) in the space to the left for each method you use to evaluate your students’ oral skills. Spaces have been provided at the end of the list for methods not on the list. If you use other methods, please be sure to write or describe what the other methods are.
1. Oral reading/dictation
2. Oral interviews/dialogues
3. Oral discussion with each student
4. Oral presentations
5. Public speaking
6. Teacher-made tests asking students to
   a. give oral directions
   b. follow directions given orally
   c. provide an oral description of an event or object
   d. prepare summaries of what is heard
   e. answer multiple-choice test items following a listening passage
   f. take notes
   g. retell a story after listening to a passage
7. Peer assessment
8. Self assessment
9. Standardized speaking test
10. Standardized listening tests
11. Other:
12. Other:
a. part-time. ———
b. full-time. ———
6. I am teaching
a. Yes ———
b. No ———
End of Questionnaire