Department of English
A Case Study of 3rd-Year LMD Students of the Department of English, Larbi Ben M'hidi University
2019-2020
General Introduction
EFL learners, throughout their language learning process, aspire to achieve a high level of language proficiency. This endeavor can ultimately be accomplished through the appropriate use of the communicative skills, namely speaking and writing. Writing is often thought to have a higher status than speaking, especially among the academic community. Writing is an extremely versatile tool that can be used to accomplish a variety of goals (Graham, Gillespie, & McKeown, 2013, p. 3). It has become an everyday requirement for nearly any vocational position students may pursue beyond school. For high school seniors, writing skills are among the best predictors of success in coursework during their freshman year of college (Geiser & Studley, 2001). Most importantly, writing provides the ability to articulate ideas, argue opinions, and synthesize multiple perspectives (McNamara, Crossley, & McCarthy, 2010, p. 58). Thus, learners need to master the writing skill and compose high-quality pieces of writing that satisfy professional raters. Teachers, as professional assessors of students' writings, implement rubrics and writing assessment methods to "evaluate writers' performance or potential through a writing task" (Behizadeh & Engelhard, 2011). Although teachers of the "written expression" module invest considerable effort in improving learners' writing skills, many students still struggle to compose a good-quality piece of writing. Many studies have investigated what is considered to be good-quality writing (Witte & Faigley, 1981; Ferris, 1994; Jarvis, Grant, Bikowski, & Ferris, 2003; McNamara et al., 2010). Jarvis et al. (2003) state that "there have been numerous attempts to quantify second language (L2) writing quality in terms of the frequency and distribution of linguistic features that occur in written texts." This is of great benefit to teachers as well as students, in that it provides an efficient scheme for writing pedagogy and assessment.
However, pinpointing the relationship between linguistic features and writing quality remains elusive.
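As a concrete illustration of how writing quality can be quantified through the frequency and distribution of linguistic features, the sketch below computes two simple, widely used indices from a text: type-token ratio (a rough measure of lexical diversity) and mean sentence length. The sample text and the tokenization rules are illustrative assumptions only; tools such as Coh-Metrix compute far richer and better-validated indices.

```python
import re

def lexical_indices(text):
    """Compute two illustrative linguistic indices for a text:
    type-token ratio (lexical diversity) and mean sentence length."""
    # Naive sentence split on terminal punctuation (illustrative only)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # Lowercased word tokens; "types" are the distinct tokens
    tokens = re.findall(r"[a-zA-Z']+", text.lower())
    types = set(tokens)
    return {
        "type_token_ratio": len(types) / len(tokens),
        "mean_sentence_length": len(tokens) / len(sentences),
    }

sample = "Writing is a versatile tool. It articulates ideas. Ideas matter."
indices = lexical_indices(sample)
```

Here the repeated token "ideas" lowers the type-token ratio below 1.0, showing how even a trivial index captures a property of the text that a rater might perceive as lexical repetitiveness.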
Students' writings vary greatly in many aspects, one of which is the linguistic features used in their compositions. While assessing, teachers take into account numerous aspects, for instance sentence structure, punctuation, and relevance of ideas. A study by Varner, Roscoe, and McNamara (2013) concludes that teachers' quality ratings cover linguistic features at both "surface- and deep-levels of the text," while students tend to judge the text at its surface level, which indicates a misalignment between students' and teachers' judgments. Assessors also follow rubrics, which are descriptive in nature, to match students' writings against the descriptions the rubric provides. However, it is unclear which of these linguistic features determine whether learners' writing is rated high or low, and which features raters perceive as markers of high or poor writing quality.
In the present study, an investigation is conducted in order to pinpoint the differences between highly and poorly rated learners' writings in terms of three types of linguistic features.
Research Questions
Do highly rated learners' writings differ in terms of linguistic features from poorly rated writings?
Do assessors take into consideration all these linguistic features to rate learners’ writings?
Research Hypotheses
Null hypotheses:
H0: Highly and poorly rated writings do not differ in terms of linguistic features.
H0: If there are any differences, they do not influence the assessors' ratings.
Alternative hypotheses:
H1: Highly and poorly rated writings differ in terms of linguistic features.
H1: These differences influence the assessors' ratings.
Research design
1. The sample:
The target population is university students in Algeria majoring in English; however, due to resource limitations, the accessible population is the students of the English department of Larbi Ben M'hidi University in the academic year 2019/2020. Henceforth, third-year students constitute the sample of this study.
2. The method:
A writing task will be assigned to 3rd-year students by their "written expression" teachers. A corpus of the writings will be created and analyzed with Coh-Metrix, a web tool developed by Arthur C. Graesser and Danielle S. McNamara, which provides a bank of indices; its authors use the term "measure" to describe a theoretical construct (e.g., referential cohesion, lexical diversity, word frequency), and the terms "index" or "indices" to describe any one of the quantitative implementations of such constructs (McNamara, Graesser, McCarthy, & Cai, 2014, p. 61). Using the descriptive statistics provided by Coh-Metrix, an inferential statistical tool called discriminant analysis will be used to assess the adequacy of the classification of highly and poorly rated writings in relation to the linguistic features, in order to fulfill the above-stated aim, assess the hypotheses, and answer the research questions.
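A minimal sketch of the discriminant-analysis step is given below, using Fisher's linear discriminant on two hypothetical groups of essays. The feature vectors (labeled here as cohesion and lexical-diversity scores) are invented for illustration; an actual analysis would use the full set of Coh-Metrix indices and a dedicated statistics package.

```python
import numpy as np

# Hypothetical feature vectors [cohesion, lexical diversity] per essay
high = np.array([[0.62, 0.71], [0.58, 0.69], [0.65, 0.75], [0.60, 0.72]])
low = np.array([[0.41, 0.52], [0.45, 0.50], [0.38, 0.55], [0.43, 0.49]])

# Fisher's linear discriminant: w = Sw^-1 (mu_high - mu_low)
mu_h, mu_l = high.mean(axis=0), low.mean(axis=0)
Sw = np.cov(high.T) + np.cov(low.T)   # pooled within-class scatter
w = np.linalg.solve(Sw, mu_h - mu_l)  # discriminant direction
threshold = w @ (mu_h + mu_l) / 2     # midpoint decision boundary

def classify(x):
    """Label an essay's feature vector as highly or poorly rated."""
    return "high" if w @ x > threshold else "low"

# Adequacy of classification: proportion of essays correctly classified
preds = [classify(x) for x in np.vstack([high, low])]
accuracy = sum(p == t for p, t in zip(preds, ["high"] * 4 + ["low"] * 4)) / 8
```

The classification accuracy over the corpus is what indicates whether the chosen linguistic features adequately separate highly from poorly rated writings, which is the quantity the hypotheses above turn on.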
The study will mainly be comprised of three chapters. The first chapter will deal with Natural Language Processing (NLP), how it contributed to the creation of Coh-Metrix, and the technology used in this analysis tool. The second chapter will include two sections: the first section will be concerned with defining writing assessment and the linguistic features intended to be studied, and how they affect writing quality, while the background studies and related literature will be left to the second section. The third chapter will be devoted to data collection, analysis, and discussion of the findings.
The relationship between linguistic features and writing quality has been addressed in previous research but is still not answered clearly. Thereby, it is expected that the findings of this study will open new prospects in writing assessment and students' writing.
References
Behizadeh, N., & Engelhard, G. (2011). Writing assessment. Retrieved November 17, 2019, from
https://en.wikipedia.org/wiki/Writing_assessment#cite_note-1
Ferris, D. R. (1994). Lexical and syntactic features of ESL writing by students at different levels of L2 proficiency. TESOL Quarterly, 28(2), 414–420.
Geiser, S., & Studley, R. (2001). UC and the SAT: Predictive validity and differential impact of the SAT I and SAT II at the University of California. Educational Assessment, 8(1), 1–26.
Graham, S., Gillespie, A., & McKeown, D. (2013). Writing: Importance, development, and instruction. Reading and Writing, 26(1), 1–15.
Jarvis, S., Grant, L., Bikowski, D., & Ferris, D. (2003). Exploring multiple profiles of highly rated learner compositions. Journal of Second Language Writing, 12(4), 377–403. https://doi.org/10.1016/j.jslw.2003.09.001
McNamara, D. S., Crossley, S. A., & McCarthy, P. M. (2010). Linguistic features of writing quality. Written Communication, 27(1), 57–86.
McNamara, D. S., Graesser, A. C., McCarthy, P. M., & Cai, Z. (2014). Automated evaluation of text and discourse with Coh-Metrix. Cambridge University Press.
Varner, L. K., Roscoe, R. D., & McNamara, D. S. (2013). Evaluative misalignment of 10th-grade student and teacher criteria for essay quality: An automated textual analysis. Journal of Writing Research, 5(1), 35–59.
Witte, S. P., & Faigley, L. (1981). Coherence, cohesion, and writing quality. College Composition and Communication, 32(2), 189–204.