Kurniawan Aprianto
Abstract
The study investigates the validity of the English tests in the National Examination of Senior High Schools and their washback on the teaching and learning process in the classroom. It is a combination of a documentary and a survey study. The data consisted of the test items of the English tests in the National Examination in 2010 and 2011 and teachers' responses to questionnaires about the English tests. The data were analyzed using simple descriptive statistics and content analysis. The results show that there were significant incompatibilities between the English test items in the National Examination and the school-based curriculum, and that the test items lacked authenticity. Teachers used drills, exercises and tricks to help students answer the questions. In addition, the teachers conducted regular teaching and learning activities during the first two years but then focused on practicing test items that might appear in the National Examination.
© 2013 Universitas Negeri Semarang
ISSN 2087-0108
Correspondence address: Kampus Unnes Bendan Ngisor, Semarang 50233
E-mail: jurnalpps@unnes.ac.id
Kurniawan Aprianto / English Education Journal 3 (1) (2013)
concerning the school condition); the Regulation from the Ministry of National Education was also considered. Moreover, data about teachers' perspectives on the English test in the National Exam were also taken into consideration.

There are two kinds of objects in this study. The first is the written object, which consists of the questions of the National Exams and the competence standards and basic competences in the senior high school syllabus. The questions of the National Exams in 2010 and 2011 were taken as the first object. The second object is information from senior high school teachers in Mataram about the process of teaching and learning English in the classroom. This information was gained through questionnaires. Teachers from various senior high schools were the participants of the study. In conducting this survey, all respondents were asked the questions that were appropriate to them, and, when those questions were asked, they were always asked in exactly the same way (Brace, 2004). There are 23 senior high schools in Mataram, consisting of 10 public schools and 13 private schools. A combination of probability and non-probability sampling techniques was used. Randomized quota sampling was conducted to determine the participants of the study (Cohen et al., 2000). Thus, each school was represented by one English teacher as an object of the study.

The written data, the last two consecutive sets of NE questions (2010-2011), were collected, including the listening materials. As for the school-based curriculum, the competence standards and basic competences of the English subject enclosed in Ministry Rule no 26 year 2006 were taken as the data.

A survey using a questionnaire as the instrument was employed to collect the second set of data. The instrument was partly developed as a three-point Likert-scale questionnaire covering the teachers' views and opinions on: the role of the NE in improving students' competences in English as mentioned in the school-based curriculum; a choice between the NE and teacher-made evaluation; academic advantages; material development with regard to the NE; the components of English skills to be examined; the function of scores from the NE; and their readiness to conduct the evaluation. Each question was completed with an open-ended question for further explanation of the choice made. The table below shows some sample questions in the questionnaire.
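The sampling and response-coding steps described above can be sketched as follows. This is only an illustration, not the study's own procedure as implemented: the school and teacher names and the per-item answer counts are invented, and the tabulation mirrors the coding scheme (1 = "Yes", 2 = "In between", 3 = "No") used later in the analysis of the 19 returned questionnaires.

```python
import random
from collections import Counter

# Hypothetical sampling frame: the 23 senior high schools in Mataram
# (10 public, 13 private); all names below are invented for illustration.
schools = [f"Public SHS {i}" for i in range(1, 11)] + \
          [f"Private SHS {i}" for i in range(1, 14)]

random.seed(0)  # reproducible illustration

def sample_one_teacher_per_school(teachers_by_school):
    # Quota sampling: exactly one English teacher per school,
    # chosen at random from that school's list of English teachers.
    return {school: random.choice(teachers)
            for school, teachers in teachers_by_school.items()}

teachers_by_school = {s: [f"{s} teacher {j}" for j in range(1, 4)]
                      for s in schools}
respondents = sample_one_teacher_per_school(teachers_by_school)
print(len(respondents))  # 23 schools -> 23 sampled teachers

def tabulate(codes):
    # Frequency and percentage of each coded answer:
    # 1 = "Yes", 2 = "In between", 3 = "No".
    counts = Counter(codes)
    n = len(codes)
    return {code: (counts[code], round(100 * counts[code] / n, 2))
            for code in (1, 2, 3)}

# Invented answers to one questionnaire item from 19 respondents.
answers = [1] * 9 + [2] * 3 + [3] * 7
print(tabulate(answers))  # {1: (9, 47.37), 2: (3, 15.79), 3: (7, 36.84)}
```

With 19 respondents, a single answer shifts a percentage by about 5.26 points, which is why the percentages reported in the results (47.37%, 36.84%, 15.79%, and so on) all fall on multiples of roughly 5.26.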
The written objects (the questions of the latest English tests in the National Exam and the latest English curriculum) were analyzed using qualitative descriptive analysis, in the manner of content analysis. They were categorized based on similarities and differences, and inferences were generated from those similarities and differences. Three analyses were done: 1) comparing the curriculum (the school-based curriculum) with the operational indicators of the English tests in the National Exam from two consecutive years (2010 and 2011); 2) comparing the questions of the English tests in the National Exam from the two consecutive years (2010 and 2011) with the operational indicators of the tests from the same years; and 3) analyzing the compatibility of the questions of the English tests in the NE (2010 and 2011) with the school-based curriculum. This analysis was conducted to find out the validity of the tests with respect to what had been mandated in the curriculum. This is crucial because the English test in the NE should evaluate what students have learned.

The questions of the questionnaire employing the Likert scale and closed-ended questions were summarized using descriptive statistics. Open-ended questions were analyzed in the tradition of content analysis and were categorized based on similarities. In other words, these were analyzed qualitatively to reveal the patterns of relation among the verbal responses made by the respondents (Sulistyo, 2009). Respondents' answers in the questionnaire were coded as 1 (for
a "Yes" answer), 2 (for an "In between" answer), and 3 (for a "No" answer). The data were then tabulated to find out the frequency of each answer. To obtain deeper and more detailed answers, the respondents were asked to give reasons elaborating their choices. After the coding was complete, the data were further analyzed. First, simple descriptive statistics (the percentage of each answer) were applied to find the trend, and then content analysis was conducted to establish inferences by categorization. Second, the data gathered from these analyses were described and used to answer the present research questions.

RESULTS AND DISCUSSION

As most experts consider that language ability consists of four language skills (listening, reading, speaking, and writing), a language proficiency test should contain all of them. One of the biggest problems is how to conduct the assessment of the productive skills. These two skills, speaking and writing, require different instruments compared to the other two. They cannot be scored by machines, because assessing them involves human cognitive activity that machinery cannot replace. Conducting such an assessment for a very large number of test takers would face problems, especially when carried out at relatively the same time. The following discussion therefore focuses on the two receptive language skills, listening and reading, found in the English Test in the National Examination.

Based on the data, the representativeness of the test is discussed in a number of aspects:

a. Test Coverage
In the listening section, the questions covered all indicators of the Graduate Competency Standard. But in the reading section (NE 2011), some points in the indicators were not covered by the test items.

b. Test Relevance
Communicative competence requires a test which accommodates all skills in an integrated way, while the English test of the NE measures skills in isolation from one another. The English tests in the National Examination were far from relevant; we could say that they were of little relevance to achieving the goal of the Competency-Based Curriculum.

c. Program Coverage
If the English test in the National Examination was supposed to show the mapping of educational progress in Indonesia, it seemed to succeed to some extent. The test would reveal how well students answer the questions. However, this is only part of students' proficiency, since the tests only portrayed students' receptive skills.

Authenticity is the quality of how far a test task relates to the target language use (TLU) task. In other words, it provides an investigation of the extent to which score interpretations generalize from performance on the test to language use in the target language use domain. We realize that it is not easy to capture target language use tasks, since students in Indonesia commonly do not have real-life use of English in their daily lives. Instead, what we need to do is construct a language-instructional TLU domain, that is, situations in which language is used for the purpose of teaching and learning the language. I adopt some characteristics of authenticity from Mueller (2012). Judged against the characteristics proposed by Mueller, the English Test in the National Examination was relatively low in authenticity.

Interactiveness of the test shows how far the test taker's individual characteristics are involved in accomplishing a test task. This includes language ability (language knowledge and strategic competence), topical knowledge, and affective schemata. In judging whether a test task has relatively high interactiveness or not, all these components must be regarded. But sometimes a test task does not need to have a high level of interactiveness; the minimum acceptable level is enough. Based on the indicators of the listening and reading sections of the English test in the NE, the degree of interactiveness of the test tasks was relatively high.

Most teachers (47.37%) believed that the English test in the National Exam has improved students' language ability, but a relatively high percentage of teachers (36.84%) did not really believe that it supported students' improvement, for various reasons. They said that the ET in the NE did not assess all skills, so it was far from a whole description of students' improvement. Some others, 15.79% of the teachers, actually said the same thing as those who did not really believe that the ET in the NE improved students' competences: it only assessed receptive skills (reading and listening), which is not communicative, as not all skills are included, whereas they are actually inseparable.

Regarding the curriculum, almost 100% (94.74%) of the teachers shared the opinion that the current curriculum was already ideal for the time being, i.e. it has communicative competence as the teaching-learning goal. Only one respondent gave an 'in between' opinion, and the reason was that, in her opinion, the evaluation of the learning process should be a collaboration of the two stakeholders of national education, i.e. the government and the teachers. She actually agreed that the current curriculum provided students with a more communicative goal, but the final test (NE) did not really assess students' achievement. There were still gaps between the school-based curriculum and the final test (NE).

Final assessment, as one of the most influential and critical factors in deciding whether a student has completed study at the senior high level, should be conducted by the right party. Twelve teachers (63.16%) agreed that it was the government's job to conduct the assessment. In contrast, 4 teachers (21.05%) answered 'No', with differing perspectives. Some of them stated that only the teachers know their students best, so that the teachers should be in charge of developing the final assessment. Moreover, the current curriculum was actually built up by the teachers, which is why it is called the School-based Curriculum. The rest of the respondents (3 teachers, or 15.79%) answered 'in between'.

Regarding the regulation that the final test held by the school also counted as part of the whole score, teachers played a role here. They tried to make the score as high as possible to mark up the final score. This could be done by setting relatively easier questions or by holding remedial tests for those who got relatively low scores.

Ten teachers (52.63%) felt that they gained something positive from the NE, directly or indirectly. Some other teachers (5, or 26.32%) felt that they did not get anything from the NE so far. They stated that the pressure on teachers that students must succeed in the National Exam made the teaching-learning process dull and tiring, with no fun at all. So they came to the conclusion that it has a negative effect on the teaching-learning process: it should have been a communicative learning process (elaborating all skills), but it turned into non-communicative activities. Meanwhile, four teachers (21.05%) seemed to benefit indirectly from the presence of the National Exam. They felt alright as long as the UN was conducted well, and they indirectly learned about the variation in test tasks for measuring students' achievement.

A certain passing grade was set as an exit requirement, but this leads to some disadvantages for students and teachers. In the teachers' (the respondents') opinion, students carried an additional burden, as they had to surpass the passing grade in order to graduate. This made them focus more on the score than on the actual competences they had to achieve, and it tended to make students commit dishonesty. The students lacked motivation to learn the language and to practice it. On the teachers' side, the impact was more or less the same. As teachers were part of the educational system, they would feel ashamed if their students got bad scores in the NE. This could make them do everything possible to make their students succeed in the final examination. In line with this, what most respondents (15 out of 19) did was prepare everything related to the National Examination, practicing the kinds of test items modeled on the National Examination. Giving question drills to the students during the last semester was one option, besides giving students more practice time in the afternoon.

The findings about the teachers' point of view on the carrying out of the final assessment of English in the National Examination showed that only 26.32% of the respondents believed that the English test in the National Exam was still good for their students, as it had a significant effect on students' motivation. The test was a kind of qualified standardized test, because the government developed it by involving highly qualified teachers and researchers. But most respondents (63.16% who answered 'No' and 10.53% who answered 'in between') said that there were still many things to reconsider about the test, such as the areas (skills) assessed, the scoring system, its passing grade, its reliability, and some other external aspects (e.g. socio-cultural aspects).

All respondents mentioned that there was nothing they could do except prepare their students to be able to answer the test and motivate them to be honest when doing it. They simply tried to deliver what they called communicative language learning before the last semester, because the sixth semester was devoted entirely to preparing students to face the English test in the NE. Furthermore, they also gave some points of view on reducing the disadvantages they already faced. Here are their expectations of the NE:
1. There should be an effort to bring back more communicative language teaching, because schools are not cram courses; the aim is to make students really acquire communicative competence. A test is only a part of the learning process, so they hoped that the government would no longer use it as an exit requirement from school.
2. The government should provide a more comprehensive test, not only assessing receptive skills.
3. Highlighting the function of the English
test. The test is meant to measure students' level of ability in English. Cheating on the test can then more likely be avoided, because students would not be afraid of the penalty, i.e. failing to finish their study.
4. The passing grade should be removed as the exit requirement, because it likely keeps students from showing their actual performance. Standardized tests such as TOEFL and IELTS, which can portray the level of competence, can be alternative assessments for students to take.
5. Schools should be allowed to be the institutions that manage graduation themselves, though with some kind of monitoring system from the Ministry of National Education. This hopefully leads to school independence and responsibility.

CONCLUSION

Due to the relatively low validity, teachers actually wanted a better test system for assessing students' competences. They preferred to conduct a more comprehensive test task which includes all language skills. This kind of test would encourage students to focus on language ability. The teachers also proposed that there should not be a passing grade for students to reach in order to finish their study. A test system such as IELTS or the Cambridge English Exam would give a description of the level of achievement rather than judging whether a student has passed the exam or not.

REFERENCES

Bachman, L. F. and A. S. Palmer. 1996. Language Testing in Practice: Designing and Developing Useful Language Tests. Oxford: Oxford University Press.
Bharati, D. L. and Suwandi. 2006. UAN and its Relevance to the New Curriculum, KTSP. Proceedings of TEFLIN International Conference 2006.
Brace, I. 2004. Questionnaire Design: How to Plan, Structure and Write Survey Material for Effective Market Research. London: Kogan Page Ltd.
Brown, H. D. 2004. Language Assessment: Principles and Classroom Practices. New York: Pearson.
BSNP. 2006. Panduan Penyusunan Kurikulum Tingkat Satuan Pendidikan Jenjang Pendidikan Dasar dan Menengah.
Cohen, L., L. Manion and K. Morrison. 2000. Research Methods in Education. London: Routledge Falmer.
Fulcher, G. 2010. Practical Language Testing. London: Hodder Education.
Government Regulation No. 45 Year 2010.
Government Regulation No. 22 Year 2006.
Hughes, A. 1989. Testing for Language Teachers. Cambridge: Cambridge University Press.
Kattington, L. E. 2010. Handbook of Curriculum Development. New York: Nova Science Publishers, Inc.
Krippendorff, K. 2004. Content Analysis: An Introduction to its Methodology. California: Sage Publications, Inc.
Lightbown, P. M. and N. Spada. 2006. How Languages are Learned. Oxford: Oxford University Press.
Ministry of National Education Regulation No. 22 Year 2006.
Ministry of National Education Regulation No. 59 Year 2011.
Razmjoo, S. A. 2007. High Schools or Private Institutes Textbooks? Which Fulfill Communicative Language Teaching Principles in the Iranian Context? Asian EFL Journal, Vol. 9, No. 4, pp. 126-140.
Richards, J. C. 2001. Curriculum Development in Language Teaching. Cambridge: Cambridge University Press.
Spratt, M. 2005. Washback and the Classroom: the Implications for Teaching and Learning of Studies of Washback from Exams. Language Teaching Research, Vol. 9, No. 1, pp. 5-29.
Sulistyo, G. H. 2009. English as a Measurement Standard in the National Examination: Some Grassroots Voice. TEFLIN Journal, Vol. 20, No. 1, February 2009, pp. 1-24.