
CRILE Working Papers No. 58 (2004)

Is there life beyond language testing? An introduction to alternative language assessment

Dina Tsagari

This paper aims to be an introduction to the so-called ‘movement of alternative assessment’ (Alderson and Banerjee, 2001) that has recently made its appearance within the field of language testing and assessment. The paper attempts to familiarise readers interested in the area with the fundamental principles, much of the associated terminology and methods. It also raises a number of issues in the hope that they will serve as a springboard for further discussion, research and experimentation in the field.


1. Introduction
Language testing, generally associated with formal assessment procedures such as tests and examinations carried out at specified times and serving a variety of purposes (i.e. diagnostic, achievement, progress, etc.), is a vital component of instructional language programmes throughout the world. While this type of assessment is a mainstay of educational programmes (Butterfield et al., 1999), educators and critics from various backgrounds have raised a number of concerns about its usefulness as the primary measure of student achievements. Before attempting to discuss ‘alternative assessment’ at any length, it is useful first to look at some of the issues that have contributed to the need for assessment reform.

2. Concerns about language testing

2.1 Dissatisfaction with types of information gathered
Proponents of process-oriented curricula and instruction argue that traditional testing techniques, e.g. multiple-choice, fill-in-the-gaps, matching, etc., are often incongruent with current second/foreign language classroom practices. In particular, they argue that rich, descriptive information about the products and, more importantly, about the process of learning and the ongoing measurement of student growth needed for formative evaluation and for planning instructional strategies cannot be gathered by conventional testing methods (Barootchi & Keshvarz, 2002). As Genesee and Hamayan (1994: 229) stress “... tests can be useful for collecting information about student achievement under certain restricted conditions, but they are not particularly useful for collecting information about students' attitudes, motivation, interests, and learning strategies” (for similar discussions see also Archbald, 1991; Herman and Winters, 1994; Madaus, 1988; Resnick and Resnick, 1992; Wiggins, 1989a, 1989b, 1994; Wolf et al., 1991).

2.2 Dissatisfaction with high-stakes/standardised tests
The literature also presents an array of criticisms with regard to the 'washback effects', or consequences, of high-stakes standardised tests and exams, experienced on a number of levels:


i) Curricular level

Critics of high-stakes tests attest that these are responsible for narrowing the school curriculum by directing teachers to focus only on those subjects and skills that are included in the examinations. As a result, such tests are said to "dominate and distort the whole curriculum" (Vernon, 1956: 166; see also Kirkland, 1971; Madaus, 1988; Shepard, 1990; Dietel, Herman and Knuth, 1991, inter alia).

ii) Educational level

Critics also point out that high-stakes examinations affect a. the methodology teachers use in the classroom, b. the range, scope and types of instructional materials teachers use, and c. students' learning and studying practices. With regard to methodology, teachers restrict the methods they use and employ various exam preparation practices (also known as "coaching" or "cramming") at the expense of other learning activities which do not always contribute directly to passing the exam (Alderson and Wall, 1993; Mehrens & Kaminsky, 1989; Lam, 1994; Cheng, 1997; Hamp-Lyons, 1998, inter alia). Similarly, high-stakes exams gradually turn instructional materials into replicas of the actual examination papers (Bailey, 1996; Hilke and Wadden, 1997; Haladyna et al., 1991, inter alia). As far as learning practices are concerned, in high-stakes examination contexts students tend to adopt 'surface' approaches to learning as opposed to 'deep' approaches (Crooks, 1988; Entwistle and Entwistle, 1991; Newstead and Findlay, 1997, inter alia). As a consequence, students' 'reasoning power' is impeded, rote-memorisation is encouraged by concentrating on recall of isolated details, and students resist attempts to engage in risky cognitive activities which can prove both effective and potentially beneficial for their future improvement (Black and Wiliam, 1998; see also Paris et al., 1991).

iii) Psychological level

Furthermore, high-stakes standardised tests are also said to have undesirable effects on a. students' psychology and b. teachers' psychology. With regard to the former, it is believed that the role of the students in contexts where high-stakes tests are introduced is that of passive recipients of knowledge, and that their needs and intentions are generally ignored. High-stakes tests are also said to have detrimental consequences on students' intrinsic motivation, self-confidence, effort, interest and involvement in the language learning experience, and to induce negative feelings in students such as anxiety, worry, fear and boredom (Gipps, 1994; Spielberger, 1972; Zeidner, 1998; see also Harlen & Deakin-Crick, 2002; Johnstone et al., 2003, inter alia). In addition, in such contexts there is a tendency to use a normative rather than a criterion approach to assessment, which is likely to create competition between pupils rather than personal improvement, leading to de-motivation and making students lose confidence in their own capacity to learn (see also Black, 1993; Crooks, 1988). With regard to teachers, it is argued that the dictates of high-stakes tests reduce the professional knowledge and status of teachers and exercise a great deal of pressure on them to improve test scores, which eventually makes teachers experience negative feelings of shame, guilt, embarrassment, anxiety and anger (Gipps, 1994; Smith, 1991, inter alia).

2.3 Dissatisfaction with teacher-made tests

In addition to the above, it is also argued that teacher-made tests, if used as the sole indicators of ability and/or growth of students in the classroom, may generate faulty results which cannot monitor student progress in the school curriculum (Barootchi & Keshavarz, 2002). It is also believed that the use of tests in classroom settings tends to overemphasise the grading function more than the learning function of the language learning process, which, according to the literature, is not conducive to learning (Broadfoot, 1986; O'Malley & Valdez Pierce, 1996, inter alia). As Black and Wiliam (1998) point out, teachers do not generally review the assessment questions or tasks they use in their classroom tests and do not discuss them critically with peers; as a consequence, there is little reflection on what is being assessed. Teachers also do not trust or use their test results, as these do not tell them what they need to know about their students' learning, and appear to be unaware of the assessment work of their colleagues.

2.4 Equity in education

Other than the above, interest groups representing both linguistically and culturally diverse students and students with special education needs have called for a change towards approaches to assessment that are more multiculturally sensitive and free of the normative, linguistic, and cultural biases found in traditional testing, in order to ensure equity in educational opportunities and achieve educational excellence for all students (Hamayan, 1995; Fradd et al., 1994; Hart, 1994; Soodak, 2000, inter alia).

3. What is alternative assessment?

As a consequence of all the above criticisms, a shift in practice from psychometrics to educational assessment made its appearance, and a variety of labels has been used to refer to ways of assessing students' language achievements without the use of tests. This new tendency in assessment has come to be known as the 'alternative assessment movement' in recent state-of-the-art articles (Alderson and Banerjee, 2001; Bachman, 2000; Clapham, 2000, inter alia).

3.1 Definitions

There is no single definition of 'alternative assessment' in the relevant literature. For some educators, alternative assessment is a term adopted to contrast with standardised assessment, e.g. professionally-prepared objective tests consisting mostly of multiple-choice items, especially in the US tradition (Huerta-Macias, 1995; Worthen, 1993, inter alia). Others look at alternative assessment in more general terms. For instance, Hamayan (1995) sees that alternative assessment "refers to procedures and techniques which can be used within the context of instruction and can be easily incorporated into the daily activities of the school or classroom" (ibid: 213). To this Smith (1999) adds that "[a]lternative assessment might take place outside the classroom or even the institution at various points in time, and the subjects being tested may be asked to present their knowledge in various ways" (ibid: 703). Kohonen (1997) makes the point that alternative assessment (the author uses the term 'authentic assessment') "… emphasises the communicative meaningfulness of evaluation and the commitment to measure that which we value in education. It uses such forms of assessment that reflect student learning, achievement, motivation and attitudes on instructionally-relevant classroom activities", and its results can be used to improve instruction, based on the knowledge of learner progress (ibid: 13). In a more recent publication, Alderson and Banerjee (2001) provide the following definition:

'Alternative assessment' is usually taken to mean assessment procedures which are less formal than traditional testing, which are gathered over a period of time rather than being taken at one point in time, which are usually formative rather than summative in function, are often low-stakes in terms of consequences, and are claimed to have beneficial washback effects (ibid: 228).

3.2 Some further terminology

Other than the diversity of definitions as to what alternative assessment is, there is also a plethora of terms used to refer to ways of assessing students' language products and processes without the use of tests. The most frequent are:

• 'authentic' assessment (Cumming and Maxwell, 1999; Wiggins, 1989a, 1989b; Terwilliger, 1997, inter alia)
• 'performance' assessment (Aschbacher, 1991; Shavelson et al., 1992, inter alia)
• 'continuous assessment' (Bruton, 1999; Puhl, 1997, inter alia)

• 'on-going assessment' (Carbery, 1999; Croker, 1999, inter alia)

as well as 'informal assessment', 'direct assessment', 'complementary assessment', 'instructional assessment', 'formative assessment', 'descriptive assessment', 'dynamic assessment', 'responsive assessment', 'portfolio evaluation', 'situated/contextualised assessment' and 'assessment by exhibition'. Due to lack of space, the differences in meaning and use of these terms cannot be discussed here; the interested reader could explore them through the references mentioned. However, the term 'alternative assessment' will be used in this paper, since it is more generic than the other terms and incorporates characteristics of the other commonly-used labels.

4. Benefits of alternative assessment

Researchers and practitioners in the field believe that alternative assessment should be seen as an integral part of students' assessment and that it can:

a. Evaluate the process and product of learning as well as other important learning behaviours

It is stressed that because most alternative assessment is ongoing in nature, it is possible to focus on both the process and the product of language learning (Belanoff & Dickson, 1991; Wiggins, 1989a, 1989b). Thus, through alternative assessment, the picture that emerges about the learner and his or her language proficiency also reflects the developmental processes that take place in language learning over time. Furthermore, educationists also claim that through alternative assessment it is possible to collect information about some of the factors that influence achievement found in the students' linguistic, cultural, familial or educational backgrounds, e.g. their prior educational experiences, their family education, etc. Genesee and Upshur (1996) stress that alternative assessment methods can also gather information about those factors that affect student achievement which, according to the authors, can be especially important when planning and evaluating the effectiveness of instruction (see also Genesee & Hamayan, 1994).

Alternative assessment is also said to capture other important learning behaviours, such as students':

• learning strategies (e.g. whether the student takes risks, focuses on meaning/form, improvises, self-corrects, uses first language strategies)
• affective and personality styles (e.g. whether the student is enthusiastic, self-reliant, resourceful, passive)
• work habits (e.g. whether the student is punctual, prepares for class homework, meets goals, follows instructions well, requires extra guidance, seeks assistance when needed)
• social behaviour (e.g. whether the student works cooperatively, socialises with peers, participates in class discussion)
• reactions to the course (e.g. whether the student participates actively in class activities, shows initiative)

b. Evaluate and monitor instruction

Alternative assessment is also believed to provide a strong link between instruction and assessment by forming part of a feedback loop that allows classroom teachers to monitor and modify instruction continually in response to the results of student assessment. This process is illustrated in Figure 1 (adapted from Genesee and Hamayan, 1994: 215).

[Figure 1. Classroom-based assessment: instruction is followed by assessment; if the objective is achieved, instruction proceeds; if the objective is not achieved, instructional plans are revised.]

c. Relate to cognitive psychology and related fields

Furthermore, alternative assessment is also said to be in line with views expressed in cognitive psychology which, as Dietel et al. (1991: 4) argue, suggest that learning is not linear but proceeds in many directions at once and at an uneven pace. Under this perspective, students should be given the opportunity to use the strategies they have acquired at the right time and in the right way so as to apply them for the realization of particular tasks. Moreover, the higher-order thinking skills of synthesis and analysis are required of learners when participating in alternative assessment activities. Dietel et al. also stress that alternative assessment techniques allow learners plenty of time to 'generate' rather than 'choose' a response: after recently-acquired knowledge is brought to the forefront of their minds, learners produce a response which they can later reconsider by critically working together with the teacher or other learners in sharing perceptions.

d. Produce meaningful results to a variety of stakeholders

It is also believed that information obtained from alternative methods of assessment can be much more useful and informative compared to test scores, and easy to interpret and understand (Alderson and Banerjee, 2001; Clapham, 2000). Hamayan (1995) makes the point that this represents a tremendous benefit not only for teachers but also for other 'clients' of assessment, e.g. students, parents and administrators. In particular, she sees that alternative assessment methods allow students to "see their own accomplishments in terms that they can understand and, consequently, it allows them to assume responsibility for their learning" (ibid: 215), while parents are offered a clear insight into what their children are doing in school. Teachers are provided with "data on their students and their classroom for educational decision-making" (ibid: 215); alternative assessment also gives them the opportunity to chronicle the success of the curriculum and can present them with a framework for organising students' work. Even administrators, who are typically least convinced of the advantages of alternative assessment, "can benefit from the clear information about student and teacher attainment over time" (1995: 215).

e. Represent a collaborative approach to assessment

Alternative assessment also represents a collaborative approach to assessment that enables teachers and students to interact in the teaching/learning process (Barootchi & Keshavarz, 2002; Genesee & Hamayan, 1994, inter alia). In the context of alternative assessment, collaborative work is reinforced among students and/or between students and teachers within a relaxed classroom atmosphere.

f. Promote autonomous and self-directed learning

It has also been argued that participating in alternative assessment can assist learners in becoming skilled judges of their own strengths and weaknesses and in setting realistic goals for themselves, which can develop their capacity to become self-directed and autonomous learners (by acquiring the necessary metacognitive knowledge and strategies, e.g. language learning strategies and cognitive styles) and thus develop lifelong learning skills (Brindley, 2001; Cohen, 1994; Council of Europe, 2001, inter alia).

g. Provide new roles for teachers

With regard to the role of teachers within the alternative assessment paradigm, Genesee (2001) points out that "[t]hese new evaluation approaches recognise classroom teachers as reflective, self-motivated professionals" (ibid: 150), while Kohonen (1997) points out that alternative assessment allows teachers more space for developing criteria (ibid: 14) and strengthens "the importance of the teacher's professional judgement and commitment to enhancing student learning" (ibid: 13).

h. Support students psychologically

In addition to the above, alternative assessment is said to enhance learners' self-esteem and feelings of efficacy as a growing person. Furthermore, it is believed that alternative assessment can foster intrinsic learning motivation and learner involvement (Broadfoot, 1986; Kohonen, 1999, inter alia).

3.3 Alternative Methods of Assessment

The following list summarises some of the most commonly used types or methods of alternative assessment (based on Brown, 1998; Genesee and Upshur, 1996; Hamayan, 1995; Ioannou-Georgiou and Pavlou, 2003; Kohonen, 1999; O'Malley and Valdez Pierce, 1996, inter alia):

• Conferences
• Debates
• Demonstrations
• Diaries/Journals
• Dramatizations
• Exhibitions
• Games
• Observations
• Peer-assessment
• Portfolios
• Projects
• Self-assessment
• Story retelling
• Think-alouds

It is important to note here, following Hamayan's suggestion (1995: 218), that the above methods of assessment need to be distinguished from the tools or ways which educators can use to record alternative assessment information. The author cites the following as the most frequent ways of recording alternative assessment:

• Anecdotal records
• Checklists
• Learner profiles
• Progress cards
• Questionnaires
• Rating scales

(For a different classification of methods of alternative assessment, see also Herman et al., 1992; Navarrete et al., 1990; Newman and Smolen, 1993; Short, 1993.)

5. Concerns raised about certain qualities of alternative assessment

Although alternative assessment provides new possibilities for language evaluation, concerns about how certain of its qualities (i.e. conceptual, technical, practical, etc.) may be realised and/or appropriately investigated have been voiced by educational measurement and language testing specialists. For instance, it is argued that although alternative assessment documentation provides rich data about learning, it is much more costly and time-consuming for the teacher to administer and analyse thoughtfully in order to give accurate feedback to the learner, especially in classes with large numbers of learners (Alderson and Banerjee, 2001; Brindley, 2001; Clapham, 2000).

Another concern is related to the special skills needed by teachers in order to successfully implement alternative methods of assessment (Breen et al., 1997; Clark and Gipps, 2000, inter alia). As Cizek (2000: 2) comments in the context of general education in the USA: "Perhaps the peskiest pocket of resistance in the assessment revolution is the inadequate preparation of teachers and administrators in the fundamentals of educational assessment". To this Kohonen (1997) adds that learners also need a great deal of personal supervision and clear guidelines, as it is quite likely that certain learners, being accustomed to more traditional language assessment practices, may resist the new practices.

Brown and Hudson (1998a, 1998b) also point out that alternative assessment needs to satisfy the same standards or psychometric qualities as do conventional tests, that is validity, reliability and practicality, and should be critically evaluated for its 'fitness for purpose' (what Bachman and Palmer (1996) called 'usefulness'). Brown and Hudson also emphasise that decisions for the use of any alternative assessment procedures should be informed by considerations of consequences (washback) and the significance, need for, and value of, feedback based on the assessment results (see also Alderson and Banerjee, 2001; Clapham, 2000; Gipps and Stobbart, 2003). Hamp-Lyons (1996) and Hamp-Lyons and Condon (2000), on the basis of their studies of portfolio assessment mainly conducted in the US, also argue the case for the adoption of a number of practices to ensure an ethical basis for the evaluation of alternative assessments, focusing their discussion on the following criteria: 1. transfer and generalizability, 2. cognitive complexity, 3. content quality, 4. content coverage, 5. meaningfulness and 6. cost and effect.

The question of whether alternative assessment can be used for large-scale evaluation is also discussed in the literature devoted to alternative assessment. Worthen (1993: 447-453) proposes that alternative assessment can reach its full potential in education for large-scale assessment applications if:

1. conceptual clarity is achieved to ensure consistency in the applications of alternative assessment;
2. a mechanism for evaluation and self-criticism of alternative assessment practices is established;
3. the users of alternative assessment, whether they are teachers or administrators, become well versed in issues of assessment and measurement;
4. standardisation of assessment judgements is introduced;
5. the ability to assess complex thinking skills can be established;
6. the fiscal and logistic feasibility of alternative assessment for large-scale assessment is shown; and
7. education's key stakeholders (e.g. teachers, students, legislators, school boards, associations of professional educators, etc.) are persuaded of its importance and usefulness.

Worthen also suggests that 'in the interim, it would seem prudent to develop and test alternative assessment approaches in low-stakes settings where they can serve needs for better classroom assessment' (ibid: 451). Hamp-Lyons (1997) also sees the need for further studies: "We must conduct studies of the impact of alternative assessment, on the same basis that we apply to traditional forms of assessment. We cannot assume that because alternative assessments start from humanistic concerns they produce outcomes that do only good and no harm …" (ibid: 300). Van Daalen (1999: 21) concurs that there is a need for on-going research on psychometric features of alternative assessment as part of the development of alternative assessment procedures (see also Brindley, 1998; Gipps and Stobbart, 2003; Stansfield, 1994).

6. Responses to concerns raised

Advocates of alternative assessment object to the above views on philosophical grounds. For instance, Huerta-Macias (1995) argues that alternative assessment is valid and reliable by virtue of its close integration with learning and teaching: the trustworthiness of a measure consists of its credibility and auditability. Alternative assessments are in and of themselves valid, due to the direct nature of the assessment. Consistency is ensured by the auditability of the procedure (leaving evidence of decision-making processes), by using multiple tasks, by training judges to use clear criteria, and by triangulating any decision-making process with varied sources of data (for example, students, families and teachers). Alternative assessment thus consists of valid and reliable procedures that avoid many of the problems inherent in traditional testing, including norming, linguistic, and cultural biases (ibid: 10).

Hamayan (1995), a strong supporter of alternative assessment, also argues that alternative assessment approaches provide a wealth of information which could serve as a context for more valid interpretations of standardised test results. She also stresses that information from alternative assessment procedures can constitute the sole basis for educational and instructional decision-making.

Other researchers have also suggested that the application of psychometric criteria for technical adequacy may result in comparisons that reflect unfairly on alternative methods of assessment (Gipps, 1994; Garcia & Pearson, 1994; Moss, 1992; Van Daalen, 1999). In this regard, it has been argued that new rules of evidence are needed for alternative assessment. In an attempt to address this issue, Linn et al. (1991) have proposed that a different set of validation criteria needs to be applied to alternative assessment, for instance:

• the extent of transfer and generalizability of the assessment tasks beyond the assessment situation
• the cognitive complexity of students' responses to the assessment tasks
• the content quality of the tasks
• the adequacy of sampling
• the meaningfulness of the assessment to students and
• the cost efficiency of the assessment system

Lynch (2001) further argues that alternative assessment represents a different paradigm (an 'assessment culture') and therefore cannot be evaluated from within the traditional positivist framework of educational measurement (a 'testing culture').

7. Conclusion

The alternative assessment paradigm, as discussed in the present paper, is seen to embody a different concept of assessment, i.e. assessment as an essential part of the learning process. However, we need to reconceptualise alternative assessment and its relationship to standardised testing. Further theoretical and empirical work needs to be done to examine alternative assessment practices in depth, to understand how the aspects of alternative assessment are actually accomplished in classroom interaction, and to develop appropriate theory and research methods in the study of this highly complex and dynamic teaching-learning-assessing interface before any definite conclusions about its positive effects on teaching and learning are drawn. Therefore, the present paper makes an urgent appeal to future researchers with an interest in the area to conduct empirical research in this exciting field within foreign/second language settings.


BIBLIOGRAPHY

Alderson, J. C. and Banerjee, J. 2001. Language testing and assessment (Part 1). Language Teaching 34, 4: 213-236.

Alderson, J. C. and Wall, D. 1993. Does washback exist? Applied Linguistics 14, 2: 115-129.

Archbald, D. A. 1991. Authentic assessment: principles, practices and issues. School Psychology Quarterly 6, 4: 279-293.

Aschbacher, P. R. 1991. Performance assessment: state activity, interest and concerns. Applied Measurement in Education 4, 4: 275-288.

Bachman, L. F. 2000. Modern language testing at the turn of the century: assuring that what we count counts. Language Testing 17, 1: 1-42.

Bachman, L. F. and Palmer, A. S. 1996. Language Testing in Practice. Oxford: Oxford University Press.

Bailey, K. M. 1999. Washback in Language Testing. TOEFL Monograph Series MS-15. Princeton, NJ: Educational Testing Service.

Balliro, L. 1993. What kind of alternative? Examining alternative assessment. TESOL Quarterly 27, 3: 558-561.

Barootchi, N. and Keshavarz, M. H. 2002. Assessment of achievement through portfolios and teacher-made tests. Educational Research 44, 3: 279-288.

Belanoff, P. and Dickson, M. eds. 1991. Portfolios: Process and Product. Portsmouth, NH: Boynton/Cook.

Black, P. 1993. Formative and summative assessment by teachers. Studies in Science Education 21: 49-97.

Black, P. and Wiliam, D. 1998. Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice 5, 1: 7-74.

Breen, M. P., Barratt-Pugh, C., Derewianka, B., House, H., Hudson, C., Lumley, T. and Rohl, M. 1997. Profiling ESL Children. Volume 1: Key Issues and Findings. Canberra: Department of Employment, Education, Training and Youth Affairs.

Brindley, G. 1998. Outcomes-based assessment and reporting in language learning programmes: a review of the issues. Language Testing 15, 1: 45-85.

Brindley, G. 2001. Outcomes-based assessment in practice: some examples and emerging insights. Language Testing 18, 4: 393-407.

Broadfoot, P. 1986. Profiles and Records of Achievement. London: Holt, Rinehart and Winston.

Broadfoot, P. 2003. Dark alleys and blind bends: testing the language of learning. Paper presented at the 25th Language Testing Research Colloquium, University of Reading, 22-25 July.

Brown, J. D. ed. 1998. New Ways of Classroom Assessment. USA: TESOL.

Brown, J. D. and Hudson, T. 1998a. The alternatives in language assessment. TESOL Quarterly 32, 4: 653-675.

Brown, J. D. and Hudson, T. 1998b. The alternatives in language assessment: advantages and disadvantages. University of Hawai'i Working Papers in ESL 16, 2: 79-103.

Bruton, A. 1999. Testing round the world: continuous assessment in Spanish state schools. Language Testing Update.

Butterfield, S., Williams, A. and Marr, A. 1999. Talking about assessment: mentor-student dialogues about pupil assessment in initial teacher training. Assessment in Education 6, 2: 225-246.

Carbery, S. 1999. Fundamentals of on-going assessment. Shiken: JALT Testing & Evaluation SIG Newsletter 3, 1: 3-7.

Cheng, L. 1998. The Washback Effect of Public Examination Change on Classroom Teaching. PhD thesis, University of Hong Kong.

Cizek, G. J. 2000. Pockets of resistance in the assessment revolution. Educational Measurement: Issues and Practice 19, 2: 16-23.

Clapham, C. 2000. Assessment and testing. Annual Review of Applied Linguistics 20: 147-161.

Clark, S. and Gipps, C. 2000. The role of teachers in teacher assessment in England 1996-1998. Evaluation and Research in Education 14: 38-52.

Cohen, A. D. 1994. Assessing Language Ability in the Classroom. 2nd edition. Boston, MA: Heinle and Heinle.

Council of Europe. 2001. Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Cambridge: Cambridge University Press.

Croker, R. 1999. Fundamentals of on-going assessment. Shiken: JALT Testing & Evaluation SIG Newsletter 3, 1: 8-12.

Crooks, T. J. 1988. The impact of classroom evaluation practices on students. Review of Educational Research 58: 438-481.

Cumming, J. J. and Maxwell, G. S. 1999. Contextualising authentic assessment. Assessment in Education 6, 2: 177-194.

Darling-Hammond, L. 1994. Performance-based assessment and educational equity. Harvard Educational Review 64, 1: 5-30.

Dietel, R. J., Herman, J. L. and Knuth, R. A. 1991. What Does Research Say About Assessment? NCREL. Available online at http://www.ncrel.htm

Elliott, S. N. 1991. Authentic assessment: an introduction to a neo-behavioral approach to classroom assessment. School Psychology Quarterly 6, 4: 273-288.

Entwistle, N. J. and Entwistle, A. 1991. Contrasting forms of understanding for degree examinations: the student experience and its implications. Higher Education 22: 205-227.

Fradd, S. H., Larrinaga McGee, P. and Wilen, D. K. 1994. Instructional Assessment. Reading, MA: Addison-Wesley.

Garcia, G. E. and Pearson, P. D. 1991. The role of assessment in a diverse society. In Literacy for a Diverse Society, ed. E. Hiebert. New York: Teachers' College Press.

Gardner, H. 1992. Assessment in context: the alternative to standardized testing. In Changing Assessments: Alternative Views of Aptitude, Achievement and Instruction, eds. B. R. Gifford and M. C. O'Connor. USA: Kluwer Academic Press.

Genesee, F. 2001. Evaluation. In The Cambridge Guide to Teaching English to Speakers of Other Languages, eds. R. Carter and D. Nunan. Cambridge: Cambridge University Press. Pp. 144-150.

Genesee, F. and Hamayan, E. 1994. Classroom-based assessment. In Educating Second Language Children, ed. F. Genesee. Cambridge: Cambridge University Press. Pp. 212-239.

Genesee, F. and Upshur, J. 1996. Classroom-based Evaluation in Second Language Education. Cambridge: Cambridge University Press.

Gipps, C. 1994. Beyond Testing: Towards a Theory of Educational Assessment. London: Falmer Press.

Gipps, C. and Stobbart, G. 2003. Alternative assessment. In International Handbook of Educational Evaluation, eds. T. Kellaghan and D. L. Stufflebeam. Dordrecht: Kluwer Academic Press. Pp. 549-576.

Glover, P. and Thomas, R. 1999. Coming to grips with continuous assessment. Assessment in Education 6, 1: 117-127.

C. 2:169-208. Pp. 1992. Saville. 2002. 1:12-14. 5:2-7. Hamayan. L. Approaches to Alternative Assessment. Harlen. Pp. Hancock. Cambridge: Cambridge University Press.ioe.. 2:48-55. and Golan. USA: National Textbook Company.1) Research Evidence in Education Library. Milanovic and N. The effects of standardized testing on teaching and schools. 2000. Raising standardized achievement tests scores and the origins of test score pollution. 151 164. Applying ethical standards to portfolio assessments in writing in English as a second language. L. and Deakin-Crick. and Haas.Lincolnwood. Portfolio research: a slim collection. Hart. 1997. TESOL QUARTERLY (Forum Commentary) 32(2): 329-337. Policy & Practice 10. Ethical Test Preparation Practice: The Case of the TOEFL. Herman. Institute of Education. Assessing the Portfolio. Hamp-Lyons. Testing and Assessment: Making the Connection. Assessment in Education: Principles. Glossary of Selected Terms. M.Centre Review. 1994. Annual Review of Applied Linguistics 15:212-226. Language Testing. Harlen. L. L. 1991. Testing and motivation for learning. New York: AddisonWesley. J. TESOL Journal 5. A practical guide to alternative assessment. R. 4:20-25. Aschbacher. Cognition and Assessment. Herman. 14/3:295-303. Educational Measurement: Issues and Practices 12.. Available online at http://eppi. 1995. New Jersey: Hampton Press Inc. Hamp-Lyons. London: EPPI-Centre. R. P. In Teaching. 1995. S. 2003.. 1994. 235-240. S. S. L. Hamp-Lyons. R. Washback. W. 1994. L. W. R. M. D. VA: Association for Supervision and Curriculum Development. Herman. Nolen. R. and Condon W. A systematic review of the impact of summative assessment and tests on students’ motivation for learning (EPPI. Authentic assessment: a handbook for educators. B. Social Science Research Unit. T. Nurturing student learning through portfolios. Haladyna. V. ed. version 1. J. and Deakin-Crick. and Winters. and Winters. Alexandria. Educational Research 20. L. Educational Leadership 52. 
In Performance Testing. 1996.CWP 58 (2004) Gottlieb. 1993. Illinois. 19 . eds. L. impact and validity: ethical concerns. E. Hancock . N.Cresskill. C. Hamp-Lyons. J.

N. S. S. 1995. Brown. 2003. eds.. UK: MA dissertation. Humanizing Language Teaching 2. RELC Journal 28(1): 29-53. 1993. TESOL Journal 5. Alternative assessment: responses to commonly asked questions.. performance-based: expectations and validation criteria. Guice. In Affect in Language Learning. 6: 471-485.Complex. V. Kohonen. Baker. Educational Researcher 20:15-21.. University of Leeds. 4:440-465. 1991. N. eds. Authentic Assessment as an Integration of Language Learning. Review of Educational Research 41:303-350. J. Thompson. S. Oxford: Oxford University Press. Pp. Malone. L. M. S. M. P. In Current Developments and Alternatives in Language Assessment: Proceedings of LTRC 1996. K. Luoma. Pp. International Journal of Educational Research 31. A.CWP 58 (2004) Hilke. ed. and Wadden. England: Kogan Page. A. A. Creating a self-rating instrument for L2 writing: from idea to implementation. Authentic assessment in affective foreign language education. Lam. Available online at: http://www. Teaching. and Butureira. and Michelson. Kurki-Suonio and S. and Dunbar. S. Motivation in assessment. 1: 8-11.. 1998. 279-294. Cambridge: CUP. V. Johnstone. Neutze G. R. Pp. Arnold. The effects of tests on students and schools. Assessing Young Learners. M. J. L. In Motivating Students. Cohonen. 1999. V. Teaching and Teacher Education 11:359-371. P. The TOEFL and its Imitators: Analyzing the TOEFL and Evaluating TOEFL-prep texts.. Armstrong.722. Student reflection in portfolio assessment: making language learning more visible. Huhta. R. 201209 Leites. E. P. 1999. H. Ioannou-Georgiou. Department of 2000. Kohonen. Language Testing 20. University of Jyvaskyla: Jyvaskyla. Self-directed learning as a growing trend in incompany EFL. 2003. Huerta-Macias. Linn. Washback-Can it be quantified ? A study on the impact of English English Examinations in Hong Kong. Kohonen. 1995. 20 . and Pavlou.. 2000. Luoma. P. S. 1997. 1971. 6:1-7. C. V. Students’ goals and self-regulation in the classroom. 
Assessment of teaching and learning in literature-based classrooms.1997. and Tarnanen. Babylonia 1:15-18. Evaluation and the Teacher’s Professional Growth. Leach.hltmag. Baker. M. Kirkland. G. and Zepke.htm Lemos.

. 2001. An exchange of views on ‘semantics.. G. ed. Language Testing 18. Pp. Pp. S. J. and diversified assessment: addressing equity issues at the classroom level. Educational Researcher 27. 1992. Brandt. O’Malley. 1989. Martinez. Turner. P. 83-121. 5:1220. J. L. C. Paris. Shifting conceptions of validity in educational measurement: implications for performance assessment. psychometrics. G. A. 4: 351-372. Nelson. Review of Educational Research 62. 173-187 Moss. A.. 1:14-22. Brown. and Smolen. K. and Roth. 6:19-22. feedback. B. Standards.. Reading & Writing Quarterly 16: 239–256 Mehrens. In Critical Issues in Curriculum: 87th Yearbook for the National Society for the Study of Education. 1993. N. C. C. Methods for improving standardized test scores: fruitful. Some problems with using examination performance as a measure of teaching ability. L. L. 1990. 1:2330. 1998. Psychology Teaching Review 6. England: Kogan Page. W. Thompson.C. and Kaminski. Lawton. E. and Valdez Pierce. and Wiggins. Wilde. F.. G. Chicago: University of Chicago Press. 1996. Newman. fruitless or fraudulent? Educational Measurement: Issues and Practice 8. S. New York: Addison-Wesley. J. G. 1991. Authentic assessment for English language learners. Tanner. S. 21 . R. M. J. Motivating Student Learning through Facilitating Independence: Self and Peer Assessment of Reflective Practice – An Action Research Project.CWP 58 (2004) Lynch. L. eds. T.. R. The influence of Testing on the Curriculum. 1998. Rethinking assessment from a critical perspective. Mortimer.. Newman. Navarrete. A developmental perspective on standardized achievement testing. Washington. G. O. 2000. and assessment reform: a close look at “authentic” assessments. Martin-Kniep. Informal assessment in educational evaluation: Implications for bilingual education programs. 1997. Portfolio assessment in our schools: implementation. A. In Motivating Students. DC: National Clearinghouse for Bilingual Education. 1988. 3:229258. F. and Findlay. 
Madaus. S. advantages and concerns. Newstead. Educational Researcher 20. Armstrong and G. J. Mid-Western Educational Researcher 6: 28-32. and Hargett.

J. Pp. 5:8-11. C. Test Usefulness in Alternative Assessment. Stansfield. P. Assessing integrated language and content instruction. Develop. eds. and Resnick. Boulder. Performance assessments: political rhetoric and measurement reality. MA: Kluwer. Performance assesement: exploring issues of equity and fairness.gseis. C. Educational Researcher 26. psychometrics. 4:22-27. B. B. R. Dialog on 22 . R. Shavelson. 37–75. Semantics. Alternative assessment in language testing: applying a Multiplism approach. Hong Kong: Language Centre. 1999. J. Li. 6:22-23. L. Van Daalen. Giþord and M. J. D. Pp.CWP 58 (2004) Puhl. Amsterdam: Elsevier. Spielberger. and G. Connor. J. 99-114 Short. Shepard. and Pine. 703-706. TESOL Quarterly 27. In Concise Encyclopedia of Educational Linguistics. Hancock. C. Will national tests improve student learning? CSE Technical Report 342. Baxter. 1991. achievement and instruction. 4: 627-656. D. 1972. Shepard. Terwilliger. L. A. Shohamy. Pp. Hong Kong University of Science and Technology. 1994. J. Anxiety: Current trends in theory and research. Put to the test: the effects of external testing on teachers’ Educational Researcher 20. Lincolnwood. CREEST.. and assessment reform: a close look at “authentic” assessments. Not Judge: Continuous Assessment in the ESL Classroom. 1990. Reading and Writing Quarterly 16: 175–178. Smith. L. 1998. 1991. testing and assessment: Making the connection eds. Boston. M. Pp. James.43-67. In Changing assessments: Alternative views of aptitude. L. Rejoinder: response to Wiggins and Newman. 1999. 1997. E. A. K. Available online at http://www. University of Colorado. 8:24-27. O. D. New York: Academic Press. C.. Developments in foreign language testing and instruction: A national perspective. E. In Testing and Evaluation in Second Language Education. English Teaching Forum 35. 1992. 1992. L. Resnick. Smith. 2:2-9. Spolsky. C. eds. Terwilliger. 1997. A. Educational Researcher 21. 1998. 
Assessing the thinking curriculum : New Tools for educational reform. Educational Researcher 26. ed.ucla. In Teaching. Language Testing: Alternative Methods. 2000. Inflated test score gains: is the problem old norms or teaching the test? Educational Measurement: Issues and Practice 9:15-22. C. M. G. Soodak. IL: National Textbook Company. 1993.

D. M. Wiggins. M. Introducing new tests into traditional systems: insights from general education and from innovation theory. R 1993. Bixby. Wall. and Gardner. 1996. ed. San Francisco: Jossey Bass. G. H. Test Anxiety – The State of the Art. J. Worthen. Toward more authentic assessment of language performances.7:4147. R. London: University of London Press. Educational Leadership 46. A true test: Toward more authentic and equitable assessment. 1989b. New York: Plenum Press. Review of Research in Education 17: 31-74. Phi Delta Kappan 74. Glenn. 94:69-85. Lincolnwood. Illinois USA: National Textbook Company. 1989a. 1956. 23 . Wiggins.CWP 58 (2004) Language Instruction 13. G. J. 1993. Teaching to the (authentic) test. Assessing student performance. Wiggins. G. Wolf. To use their minds well: Investigating new forms of student assessment.1&2:1-26. 1998. 1994. P. Language Testing 13. C. The Measurement of Abilities (2nd edition). 6:444-456. How do high school and college students cope with test situations? British Journal of Educational Psychology 66:115-128. Wiggins. E. 1991.. Testing and Assessment: Making the Connection. In Teaching. Pp. 1996. Zeidner. Vernon. G. B. Critical issues that will determine the future of alternative assessment. Hancock. D. Phi Delta Kappan 70:703-713. Zeidner. 3:334-354.