
Assessing Writing 33 (2017) 1–11


The TOEFL iBT writing: Korean students’ perceptions of the TOEFL iBT writing test
Eun-Young Julia Kim
Andrews University, Berrien Springs, MI, USA
E-mail address: keun@andrews.edu

Article info

Article history: Received 12 June 2016; Received in revised form 29 December 2016; Accepted 15 February 2017

Keywords: Language proficiency tests; Standardized writing tests; Testing establishments; TOEFL preparation; Korean EFL students

Abstract

The TOEFL is one of the most widely recognized language proficiency tests developed to measure international students’ level of readiness for degree study. Whereas a number of correlational studies have been conducted by various affiliates of ETS based on large-scale quantitative data, there is a dearth of studies that explore test-takers’ perceptions and experiences concerning the TOEFL iBT. Writing skills have paramount importance for academic success, and high-stakes tests such as the TOEFL tend to influence test-takers’ perceptions of what defines good academic writing. To date, no research has specifically focused on test-takers’ perceptions of the writing section of the TOEFL iBT. To fill this gap, this study analyzes online forum data to explore Korean students’ perceptions of effective strategies for preparing for the TOEFL iBT writing test, the challenges they face in the test-taking and test-preparation processes, and the implications such findings have for various stakeholders. Findings indicate that scores on the writing section of the TOEFL iBT, albeit helpful as an initial benchmarking tool, may conceal more than they reveal about Korean students’ academic writing ability. The study suggests that the format, questions, and scoring of the TOEFL iBT writing test be critically examined from test-takers’ perspectives.

© 2017 Elsevier Inc. All rights reserved.

1. Introduction

Most English-medium colleges and universities require scores from standardized English language tests to screen qualified international students. The TOEFL (Test of English as a Foreign Language) has been considered one of the most recognized and widely used language proficiency tests since its development in the 1960s by the Educational Testing Service (ETS). Accepted by more than 9,000 colleges, agencies, and other institutions in over 130 countries (ETS, 2016, “Who Accepts TOEFL Scores?”), the TOEFL has garnered international recognition and power over the last few decades. As a growing number of international students perceive receiving academic degrees from English-speaking countries, such as the U.S. or the U.K., as a ticket to better jobs and career advancement, many continue to invest large amounts of time and money in preparing for and taking these high-stakes tests each year.
In the case of Korea, a plethora of profit-driven TOEFL preparation institutions operate throughout the country (Senior, 2009), cashing in on highly motivated students seeking to improve their TOEFL scores in order to enter desired academic institutions in an English-speaking country. Koreans constitute the third-largest group of international students in the U.S. (Newman, 2014), and the well-known ‘English fever’ among Koreans has become a test case for scholars who critically examine the ethical
dimensions of teaching and using English as a global language (Kim, 2006; Kim, 2002; Joseph S. Park, 2009; Jin-Kyu Park,
2009; Joo-Kyung Park, 2009).
First introduced a decade ago, the newest version, the Internet-based TOEFL iBT, was designed to assess test-takers’
abilities to communicate in a real academic setting by testing integrated language skills in reading, listening, speaking, and
writing (Zareva, 2005). It draws on an extensive line of research, and the validity evidence for the interpretation and use of its scores has been fully elaborated by Chapelle, Enright, and Jamieson (2008) and Chapelle, Enright, and Jamieson (2010), based on six inferences: domain description, evaluation, generalization, explanation, extrapolation, and utilization. Although the test has a strong theoretical and empirical basis for its construction, ETS (2011) recognizes that the TOEFL test
validation is an ongoing process because it will continue to “grow and be refined” (p. 10) “as the TOEFL is used and more
evidence pertaining to validity appears” (Chapelle et al., 2008).
Various scholars have argued that in order to establish the validity of a test, the perceptions and attitudes of all stakeholders
should also be examined. For instance, Brown (1993) noted the importance of incorporating test-takers’ feedback into the
revision of test items and the writing of test rubrics for a test of spoken Japanese for the tourism and hospitality industry. In
a study of language teachers’ perceptions of the TOEFL Junior, So (2014) recommended incorporating teachers’ perspectives
into the development of large-scale international language assessments to improve test validity. In their study of Japanese and Korean university students’ perceptions of past entrance examinations, Sato and Ikeda (2015) recommended that test-takers’ perceptions be integrated into test development to promote positive washback on students. Stricker and
Attali (2010) asserted that “acceptance by test takers, test users, and the public is essential to the continued viability of the
TOEFL” (p. 1). To date, however, few studies have explored test-takers’ perceptions and experiences concerning the TOEFL
iBT, although some studies on test-takers’ perceptions focused on the computer-based TOEFL (TOEFL CBT) (e.g., Powers &
O’Neill, 1993; Stricker et al., 2004) or the Test of Written English (TWE). The TOEFL CBT was administered before the TOEFL iBT was introduced, and the TWE is the precursor of the TOEFL writing tests administered by ETS to foreign students. He and Shi’s (2008) study of Chinese students’ perceptions and experiences of the TWE raised concerns, as the students reported having received a passing score because they had memorized “general structures or ‘general sentences’ that they could use in almost any
essay or any opening and concluding paragraphs” (p. 137).
Whereas Jamieson et al. (1999) and Stricker et al. (2004) found students’ attitudes toward computer-based testing to
be mainly positive, Puspawati (2012) reported mostly negative attitudes towards the test, as the participants strongly felt that the TOEFL failed to measure their real language proficiency and perceived that the topics used in the tests were discipline-specific and therefore disadvantaged those who came from different educational and cultural
backgrounds.
Stricker and Attali (2010) observed that test-takers’ perceptions of the TOEFL iBT differed by region. For instance, students
from Germany had either neutral or more negative attitudes towards the test compared to students from China, Egypt, and
Colombia. Although useful, Stricker and Attali’s (2010) study provides limited information about test-takers’ perceptions of
the TOEFL iBT writing test as it examined students’ perceptions through a single survey item, “The TOEFL gave me a good
opportunity to demonstrate my ability to write in English,” which was rated on a three-point scale (Agree, Disagree, and Do Not Know). More recently, Malone and Montee (2014) found that Saudi and Korean students felt that there was a
discrepancy between their true English proficiency and the TOEFL iBT scores. In addition, the South Korean participants
reported using a template to formulate their response. One Korean student in their study reported that his performance on
the TOEFL was better than his actual writing ability because he practiced extensively using a response template.
In order to understand how test-takers perceive the tests and what challenges they face, I believe it is worthwhile to
investigate their perceptions and preparation processes in each skill area in greater depth. It can be safely assumed that
writing skills have paramount importance for academic success, and yet no research has focused exclusively on test-takers’ experiences and perceptions of the writing section of the TOEFL iBT.
To fill this gap, this study investigates what preparation strategies Korean students perceive to be effective, what problems
and challenges they face in the test-taking and test-preparation processes, and what implications such findings have for
various stakeholders. Although the findings of this study may not be directly applicable to all other language groups, the insights provided by this study will shed important light on the validity of the current TOEFL iBT writing test. The study centers
on the following questions:

1. What strategies do Korean test-takers perceive to be effective for preparing for the TOEFL iBT writing test?
2. What difficulties do they experience in taking and/or preparing for the test?

A brief description of the current TOEFL iBT writing test will precede the presentation of the research.

2. TOEFL iBT writing test

The writing section of the TOEFL iBT consists of two parts. The first part measures integrated writing skills by incorporating
reading, listening, and writing. In the integrated writing task, students are provided with a 250–300-word passage on the computer screen to be read within three minutes, immediately followed by a lecture of similar length on a related topic, delivered without a script. Test-takers are then given 20 minutes to summarize the points made in the lecture, explaining how those points cast doubt on the points made in the reading. The topics cover a range of academic subjects.
