
People’s Democratic Republic of Algeria

Ministry of Higher Education and Scientific Research

Larbi Ben M'Hidi University, Oum El Bouaghi

Faculty of Letters and Languages

Department of English

Assessing lexical diversity and language sophistication: A comparative study between highly and poorly rated learners' writings

A case study of third-year LMD students of the Department of English, Larbi Ben M'hidi University

A research proposal submitted to the Faculty of Letters and Languages, Department of English, for the degree of Master in Language Sciences and Teaching English as a Foreign Language

Supervisor: Mr Bouri Hadj

By: Djebar Ahmed Houssam

2019-2020

General Introduction
EFL learners, throughout their language learning process, aspire to achieve a high level of language proficiency. This endeavor can ultimately be accomplished through the appropriate use of the communicative skills, namely speaking and writing. Writing is often thought to hold a higher status than speaking, especially within the academic community. Writing is an extremely versatile tool that can be used to accomplish a variety of goals (Graham, Gillespie, & McKeown, 2013, p. 3). It has become an everyday requirement for nearly any vocational position that students may pursue beyond school. For high school seniors, writing skills are among the best predictors of success in coursework during their freshman year of college (Geiser & Studley, 2001). Most importantly, writing provides the ability to articulate ideas, argue opinions, and synthesize multiple perspectives (McNamara, Crossley, & McCarthy, 2010, p. 58). Thus, learners need to master the writing skill and to compose high-quality pieces of writing in order to achieve better results in their academic careers.

A major determiner of learners' success is the grades assigned to their writings by professional raters. Teachers, as professional assessors of students' writings, implement rubrics and writing assessment methods to "evaluate writers' performance or potential through a writing task" (Behizadeh & Engelhard, 2011). Although teachers of the written expression module invest considerable effort in improving learners' writing skills, many students still struggle to compose a good-quality piece of writing. Many studies have investigated what is considered to be good-quality writing (Witte & Faigley, 1981; Ferris, 1994; Jarvis, Grant, Bikowski, & Ferris, 2003; McNamara et al., 2010). Jarvis et al. (2003) state that "there have been numerous attempts to quantify second language (L2) writing quality in terms of the frequency and distribution of linguistic features that occur in written texts." This is of great benefit to teachers as well as students in that it provides an efficient scheme for writing pedagogy and assessment.
However, pinpointing the relationship between linguistic features and writing quality remains elusive.

Statement of the Problem

Students' writings vary greatly in many aspects, one of which is the linguistic features utilized in their compositions. Teachers, while assessing, take into account numerous aspects, for instance sentence structure, punctuation, and relevance of ideas. Varner et al. (2013) conclude that teachers' quality ratings cover linguistic features at both "surface- and deep-levels of the text," while students tend to judge a text at its surface level, which indicates a misalignment between students' and teachers' judgments. Assessors also follow rubrics, which are descriptive in nature, to match students' writings against the descriptions the rubric provides. However, it is unclear which of these linguistic features determine whether a learner's writing is rated high or low, and what raters perceive as high or poor writing quality.

Aim of the Study

The present study investigates the differences between highly and poorly rated learners' writings in terms of three types of linguistic features: syntactic complexity, lexical diversity, and word characteristics (e.g., frequency, concreteness, and imageability), thereby determining whether writing with complex, sophisticated language is mirrored in teachers' ratings.
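As an illustration of how such features can be quantified (rather than a description of the study's actual tooling), lexical diversity is often indexed by measures such as the type-token ratio, the proportion of unique words among all words in a text; Coh-Metrix reports more robust variants such as MTLD and vocd-D. A minimal Python sketch, using an invented sample sentence, shows the idea:

    import re

    def type_token_ratio(text: str) -> float:
        """Return unique words (types) divided by total words (tokens)."""
        tokens = re.findall(r"[a-z']+", text.lower())
        return len(set(tokens)) / len(tokens) if tokens else 0.0

    # Invented essay fragment, for illustration only.
    sample = "The study explores writing quality and the features of writing."
    print(round(type_token_ratio(sample), 3))  # prints 0.8 (8 types / 10 tokens)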

Research Questions

• Do highly rated learners' writings differ in terms of linguistic features from poorly rated writings?

• If so, do these differences influence assessors' ratings?

• Do assessors take into consideration all these linguistic features when rating learners' writings?

Research Hypotheses

Null hypotheses:

• H0: Highly and poorly rated writings do not differ in terms of linguistic features.

• H0: If any differences exist, they do not influence assessors' ratings.

Alternative hypotheses:

• H1: Highly and poorly rated writings differ in terms of linguistic features.

• H1: These differences influence assessors' ratings.

Research Design

1. The sample:

The target population is Algerian university students majoring in English; however, due to limited resources, the accessible population is the students of the Department of English at Larbi Ben M'hidi University in the academic year 2019/2020. Accordingly, third-year students will constitute the sample of the study.

2. The method:

A writing task will be assigned to third-year students by their written expression teachers. A corpus of the resulting writings will be compiled and analyzed with the Coh-Metrix web tool, developed by Arthur C. Graesser and Danielle S. McNamara, which provides "a bank of indices [that] describes a group of conceptually or mathematically similar indices or measures, [while] the term 'measure' [is used] to describe a theoretical construct (e.g., referential cohesion, lexical diversity, word frequency), [and] the terms 'index' or 'indices' to describe any one of the ways Coh-Metrix assesses that measure" (McNamara, Graesser, McCarthy, & Cai, 2014, p. 61). Using the descriptive statistics provided by Coh-Metrix, an inferential statistical technique, discriminant analysis, will be applied to assess the adequacy of the classification of highly and poorly rated writings in relation to the linguistic features, in order to fulfill the above-stated aim, test the hypotheses, and answer the research questions.
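To make the intended analysis concrete, the sketch below illustrates how such a discriminant analysis could be run in Python, assuming a hypothetical file cohmetrix_indices.csv in which each row is one essay, the columns are Coh-Metrix indices, and a "rating" column labels each essay as high or low; the file name and column names are placeholders rather than actual study data, and scikit-learn is used only as one possible implementation:

    import pandas as pd
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Hypothetical export of Coh-Metrix indices, one row per essay.
    data = pd.read_csv("cohmetrix_indices.csv")
    X = data.drop(columns=["rating"])  # linguistic indices (predictors)
    y = data["rating"]                 # teacher rating: "high" or "low"

    # Linear discriminant analysis classifies essays from their indices;
    # cross-validated accuracy indicates how well the linguistic features
    # separate highly rated from poorly rated writings.
    lda = LinearDiscriminantAnalysis()
    scores = cross_val_score(lda, X, y, cv=5)
    print(f"Mean classification accuracy: {scores.mean():.2f}")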

3. Structure of the study:

The study will comprise three chapters. The first chapter will deal with Natural Language Processing (NLP), how it contributed to the creation of Coh-Metrix, and the technology underlying this analysis tool. The second chapter will include two sections: the first will define writing assessment and the linguistic features to be studied and explain how they affect writing quality, while the second will review the background studies and related literature. The third chapter will be devoted to data collection, analysis, and discussion of the findings.

4. Significance of the study:

The present study is significant because it focuses on pinpointing the relationship between linguistic features (syntactic complexity, lexical diversity, and word characteristics) and writing-quality ratings, a relationship that has been considered in previous research but not yet answered clearly. It is therefore expected that the findings of this study will open new prospects for writing assessment and for students' writing instruction and strategies.

References

Behizadeh, N., & Engelhard, G. (2011). Writing assessment. Retrieved November 17, 2019, from

https://en.wikipedia.org/wiki/Writing_assessment#cite_note-1

Ferris, D. R. (1994). Lexical and Syntactic Features of ESL Writing by Students at Different

Levels of L2 Proficiency. TESOL Quarterly, 28(2), 414. https://doi.org/10.2307/3587446

Geiser, S., & Studley, R. (2001). UC and the SAT: Predictive validity and differential impact of the SAT I and SAT II at the University of California.

Graham, S., Gillespie, A., & McKeown, D. (2013). Writing: Importance, development, and instruction. Reading and Writing, 26(1), 1–15. https://doi.org/10.1007/s11145-012-9395-2

Jarvis, S., Grant, L., Bikowski, D., & Ferris, D. (2003). Exploring multiple profiles of highly

rated learner compositions. Journal of Second Language Writing, 12(4), 377–403.

https://doi.org/10.1016/j.jslw.2003.09.001

McNamara, D. S., Crossley, S. A., & McCarthy, P. M. (2010). Linguistic Features of Writing Quality. Written Communication, 27(1), 57–86. https://doi.org/10.1177/0741088309351547

McNamara, D. S., Graesser, A. C., McCarthy, P. M., & Cai, Z. (2014). Automated Evaluation of Text and Discourse with Coh-Metrix. Cambridge University Press.

Varner, L. K., Roscoe, R. D., & McNamara, D. S. (2013). Evaluative misalignment of 10th-grade student and teacher criteria for essay quality: An automated textual analysis. Journal of Writing Research, x(2013), 35–59.

Witte, S. P., & Faigley, L. (1981). Coherence, Cohesion, and Writing Quality. College Composition and Communication, 32(2), 189–204.
