
Second Language Learning and Teaching

Alia Moser

Written Corrective
Feedback: The Role
of Learner
Engagement
A Practical Approach
Second Language Learning and Teaching

Series Editor
Mirosław Pawlak, Faculty of Pedagogy and Fine Arts, Adam Mickiewicz
University, Kalisz, Poland
The series brings together volumes dealing with different aspects of learning and
teaching second and foreign languages. The titles included are both monographs and
edited collections focusing on a variety of topics ranging from the processes under-
lying second language acquisition, through various aspects of language learning in
instructed and non-instructed settings, to different facets of the teaching process,
including syllabus choice, materials design, classroom practices and evaluation.
The publications reflect state-of-the-art developments in those areas; they adopt a
wide range of theoretical perspectives and follow diverse research paradigms. The
intended audience are all those who are interested in naturalistic and classroom
second language acquisition, including researchers, methodologists, curriculum and
materials designers, teachers and undergraduate and graduate students undertaking
empirical investigations of how second languages are learnt and taught.

More information about this series at http://www.springer.com/series/10129


Alia Moser

Written Corrective Feedback:


The Role of Learner
Engagement
A Practical Approach
Alia Moser
Bundeshandelsakademie Baden
Baden, Austria

ISSN 2193-7648 ISSN 2193-7656 (electronic)


Second Language Learning and Teaching
ISBN 978-3-030-63993-8 ISBN 978-3-030-63994-5 (eBook)
https://doi.org/10.1007/978-3-030-63994-5

© Springer Nature Switzerland AG 2020


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of
the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology
now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Acknowledgements

Written corrective feedback has always been an important part of my teaching, and
especially the need to understand why learners engage with written CF led me
to conduct a small-scale study investigating the mediating factors that impact
learners’ engagement with it. The English Department at Graz University,
especially, plays an important role in this respect, and I have to thank Margit Reit-
bauer for being a superb supervisor during my studies and the writing of the book at
hand. Her views on the respective chapters and the advice she provided guided
me in my quest to get to the bottom of learner engagement with written CF.
Teaching at school and writing a book at the same time has been rather challenging,
hence I must acknowledge the administration of my school, who permitted me to
conduct my study with my former students. Additionally, I want to thank those
colleagues at school who showed genuine interest in my research and encouraged me
along the way.
What would I do without my family? Education has always been very important
for all of us and my family is one of the reasons that I could study at Graz University
and teach at various institutions abroad. Knowing that they are interested in what I
am doing and that they always asked how I was getting on helped me through the ups
and downs while writing. I want to thank them all for believing in me.
I thank my partner, Christian Kostial, for being there every day and for encouraging me to
commence this endeavour of writing a book while teaching full-time. He had to
endure me constantly talking about written CF and learner engagement, but never
got tired of listening to me. Not only did he support me all the way to the finish line,
he also gave me valuable feedback. I could not have met a better person to share
my life with.
Finally, I am especially grateful to all my students past and present, who challenge
and encourage me to continually make adjustments to my teaching practice. I dedicate
this book to those twelve students who took part in my pilot study as well as the
actual study. Without their honesty, curiosity and willingness to share their opinions
on written CF with me, this book would never have been possible.

Contents

1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 The Structure and Focus of Chapters 2–6 . . . . . . . . . . . . . . . . . . . . . . 3
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2 The Engagement Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1 Emergence of Student Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.1.1 Terminological Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.1.2 School Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.2 Concepts of Student Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.2.1 Finn’s Participation-Identification Model . . . . . . . . . . . . . . . . 16
2.2.2 Skinner and Pitzer’s Model of Motivational Dynamics . . . . . 17
2.2.3 Achievement Goal Theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
2.2.4 Self-Determination Theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
2.3 Dimensions of Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.3.1 Pittaway’s Engagement Framework . . . . . . . . . . . . . . . . . . . . . 23
2.3.2 Researching Dimensions of Engagement . . . . . . . . . . . . . . . . 24
2.4 Measurability of Student Engagement . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.4.1 Measurement Instruments at Secondary Level . . . . . . . . . . . . 26
2.4.2 Measurement Instruments at Tertiary Level . . . . . . . . . . . . . . 27
2.5 Engagement and Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.5.1 Mastery/Performance—Intrinsic/Extrinsic . . . . . . . . . . . . . . . 29
2.5.2 The Engagement Motivation Dilemma . . . . . . . . . . . . . . . . . . 29
2.5.3 Indicators, Facilitators and Outcomes of Engagement . . . . . 31
2.6 Engagement and Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
3 A Short History of Written Corrective Feedback . . . . . . . . . . . . . . . . . . 45
3.1 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.2 The Question of Errors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
3.2.1 Systematic and Unsystematic Errors . . . . . . . . . . . . . . . . . . . . 50
3.2.2 Treatable and Untreatable Errors . . . . . . . . . . . . . . . . . . . . . . . 50
3.2.3 Positive and Negative Evidence . . . . . . . . . . . . . . . . . . . . . . . . 51
3.3 Providing Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

3.3.1 Formative and Summative . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53


3.3.2 Indirect and Direct . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
3.3.3 Implicit and Explicit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
3.3.4 Metalinguistic Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
3.3.5 Focused vs. Unfocused . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
3.3.6 Focus on Form vs. Focus on Content . . . . . . . . . . . . . . . . . . . . 57
3.3.7 Effective Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
3.4 Types of Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
3.4.1 Error Correction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
3.4.2 Error Indication/No Correction . . . . . . . . . . . . . . . . . . . . . . . . . 59
3.4.3 Reformulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
3.4.4 Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
3.4.5 Self-Correction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
3.4.6 Peer Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3.4.7 Personal Written Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3.5 The Engagement-Mediator-Feedback Model . . . . . . . . . . . . . . . . . . . 63
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
4 Methodological Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
4.1 Austrian School System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
4.2 Teaching Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
4.3 Teacher Researcher . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.4 Research Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
4.5 Context and Participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
4.6 Discussion of Feedback Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
4.7 Data Collection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
5 Findings and Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
5.1 Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
5.1.1 Emotional Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
5.1.2 Cognitive Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
5.1.3 Behavioural Engagement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
5.2 Feedback Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
5.2.1 Error Correction by Teacher . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.2.2 Error Indication/No Correction . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.2.3 Reformulation of Sentences . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
5.2.4 Codes in Margin . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
5.2.5 Self-Correction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
5.2.6 Peer Correction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
5.2.7 Personal Written Statement by Teacher . . . . . . . . . . . . . . . . . . 121
5.2.8 No Feedback . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
5.2.9 Combination of Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
5.3 Mediating Factors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
5.3.1 Teachers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
5.3.2 Peers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135

5.3.3 Level of English and Grades . . . . . . . . . . . . . . . . . . . . . . . . . . . 137


5.3.4 Feedback Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
5.3.5 Attitudes and Emotions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
5.4 Strategies for Working with Feedback . . . . . . . . . . . . . . . . . . . . . . . . . 148
5.4.1 Strategies for Writing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
5.4.2 Cooperation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
5.4.3 Time Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
6.1 Research Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
6.2 Possible Limitations of the Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
6.3 Future Implications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167

Appendices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
List of Figures

Fig. 2.1 Finn’s participation-identification model (Finn & Zimmer, 2012, p. 101) . . . . . . . . . . . . . . . . . . . . . . 17
Fig. 2.2 Skinner and Pitzer’s model of motivational dynamics
(Skinner & Pitzer, 2012, p. 23) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
Fig. 2.3 A motivational conceptualization of engagement
and disaffection in the classroom (Skinner & Pitzer, 2012,
p. 25) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Fig. 2.4 Pittaway’s engagement framework (Pittaway, 2012, p. 40) . . . . . . 24
Fig. 2.5 Model of associations between context, engagement,
and student outcomes (Reschly & Christenson, 2012, p. 10) . . . . . 32
Fig. 2.6 Dynamic-engagement-framework . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Fig. 3.1 Engagement-Mediator-Feedback Model . . . . . . . . . . . . . . . . . . . . . 64
Fig. 4.1 Austrian school system (Online 10) . . . . . . . . . . . . . . . . . . . . . . . . . 74
Fig. 4.2 Online platform—weekly tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Fig. 4.3 Student’s first draft . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Fig. 4.4 Student’s second draft . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Fig. 5.1 Areas focused on in class . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Fig. 5.2 Level of engagement with feedback methods . . . . . . . . . . . . . . . . . 111
Fig. 5.3 Mediating factors for engagement with feedback . . . . . . . . . . . . . . 128
Fig. 5.4 Commitment—Worked harder than expected . . . . . . . . . . . . . . . . . 142
Fig. 5.5 Writing—Areas worked on . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Fig. 5.6 Work put into activities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
Fig. 5.7 Development of areas improved through experience at school . . . 152

List of Tables

Table 4.1 Participants and languages spoken . . . . . . . . . . . . . . . . . . . . . . . . 81


Table 4.2 Length of interviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
Table 5.1 Homework record . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Table 6.1 Homework record pilot study students and non-participants . . . . 165

Chapter 1
Introduction

Why another book on written CF? Is this really necessary? Some might argue there
have been enough studies on it already and others might disagree, because there
are still areas that need to be investigated more thoroughly. Hence, I would like to
give at least two reasons for the need for yet another book on written CF: First,
research on written CF has not ceased to be popular among researchers and they
vary in their opinions on its effectiveness for language learning (e.g., Cohen, 1987;
Chandler, 2003; Ellis, 2012). Second, giving feedback is undeniably one of the
activities teachers engage in repeatedly every day they teach, yet the learners who
should benefit from it do not always engage with it.
Both the oral and the written feedback that learners receive should help them progress
in their development as learners, so their willingness to work with it seems
worth investigating. Moreover, from a pedagogical perspective, insights are
needed into the effectiveness of written CF, as teachers should know whether the feedback
they provide to their learners is beneficial to them (see Bitchener & Storch, 2016).
The first essential decision when undertaking research in the area of feedback
is whether to investigate oral and/or written feedback. One of the advantages of
written CF is definitely the permanence of written texts. In contrast to oral feedback,
learners can always go back to their written texts, and when producing them
they have more time to think about the linguistic structures they want to use, to develop
their arguments, and so on. This permanence of written texts is one of the reasons I am
particularly interested in feedback on them. For quite a while I have been asking
myself what the most beneficial way to give written CF might be, one that results in most
learners engaging positively with the feedback they get. As Ellis (1997) points out, a
“theory should be restricted in domain, focusing on a limited set of phenomena on the
grounds that some facts are more important than others and that trying to account for
all known facts only results in incoherence.” (p. 99). Consequently, Ellis’ statement
reinforced my view that researching both types of feedback might merely lead to
superficial research. For this reason, I decided to focus on written CF and learners’
reactions to it as these two areas have always been at the core of my research interests.


While written CF as one area of investigation was there from the very beginning of
my teaching, the concept for researching learners’ reasons for working with feedback
was not. Through the study of literature on motivation and numerous discussions with
fellow students and university teachers, I came across the concept of engagement,
which has not been investigated extensively with respect to feedback (Handley, Price, &
Millar, 2011; Han & Hyland, 2015; Price, Handley, & Millar, 2011), but appeared to
be an innovative way to tackle the issue of learners’ engagement with it.
Not surprisingly, written CF is one of the areas any language teacher has to deal
with very often, and research has also investigated this area quite extensively. Some
learners engage with written CF provided by teachers, others hardly ever, or never.
The teacher’s role in written CF has been researched in manifold ways, but few
studies focus on the role of the learners, and how they react to the feedback provided
(see Handley et al., 2011; Han & Hyland, 2015; Lee, 2005; Price et al., 2011). Various
researchers (e.g., Bitchener & Knoch, 2009; Chandler, 2003; Ellis, 2012; Hyland &
Hyland, 2006) have commented on the importance of written CF, and their studies
show a variety of different approaches to it, but most of them do not investigate the
learners’ point of view extensively. Hyland and Hyland (2006) stressed that although
“feedback is a central aspect of L2 writing programs across the world, the research
literature has not been unequivocally positive about its role in L2 development, and
teachers often have the sense they are not making use of its full potential.” (p. 83).
Exactly this notion of not making the best use of feedback was one of the reasons
for this book. One question has kept coming up during my teaching too: What kind
of feedback should be used to foster learners’ willingness to work with it?
As Ellis (2009) reports in “A typology of written corrective feedback types”, there
are various ways of dealing with feedback, ranging from direct (i.e. providing the
correct answer), indirect or metalinguistic corrective feedback to electronic feedback
as well as reformulation (see Ellis, 2009, p. 98). There has been an ongoing debate
about the usefulness of written CF and Cohen (1987, p. 58), for example, points out
that “[…] research into feedback on compositions has mainly concerned itself with
the ‘best’ means of teacher correction on written work, rather than with the issue of
how students actually respond to each of these methods”, an observation that still holds
true and indicates that there is a need to investigate learners’ beliefs about written CF
as well as why they engage with certain kinds of feedback. As a consequence, the objective of my book
is to investigate these areas with a specific focus on the learners, what they expect
from written CF, and how they react to and engage with it. It should be noted that my
teaching context is a secondary business school in Lower Austria, where English is
taught as EFL (English as a Foreign Language). For the most part, English is taught as
one of the compulsory subjects in schools at Secondary Level I and Secondary Level
II (see Sect. 4.1. for a more thorough discussion of the Austrian school system) and
is not one of the official languages spoken in Austria. Hence, most learners use
English only in the classroom.
How, then, to investigate learner engagement with written CF? If insights into the
learners’ points of view are one of the desired outcomes, then practical action research
seems to be the logical choice. The term ‘action research’ has been associated with
social psychologist Kurt Lewin (1946 and 1948 as cited in Burns, 2005) who is
said to be the founder of this research area (cf. Burns, 2005; Dörnyei, 2007). Action
research has been tried out by many researchers but is not without its critics. One
benefit of action research is that monitoring one’s own classroom gives the teacher
researcher the opportunity to critically evaluate his or her own teaching and make
a useful contribution to the body of knowledge. Some researchers, Dörnyei (2007,
p. 191) among them, point out that action research “just does not seem to work in
practice.” On the one hand, it might be true that if teachers are collaborating with
researchers, the research agenda might be difficult to coordinate. On the other hand,
if teacher and researcher are the same person, this might be easier to tackle. It has to
be stressed, however, that as a teacher researcher one needs to be aware of this bias.
Consequently, so that valuable insights into classroom practices can be the outcome,
one has to stay true to the data gathered in the classroom context and be objective
when reporting any findings. Dörnyei stressed that action research can be a desirable
research practice, but the necessary support to provide teachers with the opportunity
to conduct it (e.g., a reduction of their workload) is missing in many teaching
contexts. As Richards believes:
Most ESOL [i.e. EFL/ESL] teachers are natural researchers. We’re used to working out the
needs of our students, evaluating the effects of particular approaches, spotting things that
work or don’t work and adjusting our teaching accordingly. Very few teachers approach their
teaching mechanically and nearly all of us reflect on what we do in the classroom. (Richards,
2003, p. 232 as cited in Dörnyei, 2007, p. 194)

Although action research is viewed rather critically by some researchers, I decided
to adopt an action research approach, as I firmly believe that learner
engagement with written CF needs to be studied at its core: in the classroom.
Moreover, when investigating learner engagement with feedback, I believe that studies
conducted by teachers in their own classrooms are paramount to portraying
the complexity of this research area. Therefore, my book seeks to share my find-
ings with the research community and to draw out implications concerning the importance of
learner engagement with written CF in the EFL classroom.

1.1 The Structure and Focus of Chapters 2–6

In Chap. 2, before looking into engagement more deeply, the origins of the term
‘engagement’ as well as its usage in the late twentieth and twenty-first centuries are
briefly explored. I then continue with a general outline of the engagement concept,
namely the difficulties of terminology it is still facing. The manifold definitions of
engagement, which have recently been termed ‘student engagement’ and ‘learner
engagement’ and are therefore used interchangeably in this book, are examined. The
overview begins with the first instances in which the term was used and traces its development to
date.
Particular emphasis is put on the engagement concept at secondary
and tertiary level. To show concepts which have been used in various engagement
frameworks, Finn’s Participation-Identification Model, Skinner and Pitzer’s Model


of motivational dynamics, achievement goal theory and self-determination theory are
explained to provide a framework for the dimensions of engagement which are crucial
for the engagement concept. Among these, Pittaway’s (2012) as well as Fredricks,
Blumenfeld and Paris’ (2004) views on the dimensions of engagement are portrayed
to show how diverse the various opinions are.
The measurability of engagement is of interest to many researchers, and there are
well-established tools at secondary and tertiary level a researcher can draw on. The
most common ones are discussed in this chapter and some of these were the basis
for the questionnaire used in the small-scale study in this book.
I then discuss what I termed The Engagement Motivation Dilemma, showing that
these concepts are either seen as separate or closely interlinked ones. I conclude this
section of the chapter with its relevance for my study and how these two concepts
are used in this book.
The final section of this chapter outlines the Dynamic-Engagement-Framework
which I developed on the basis of existing research on engagement. It shows how
engagement and motivation are integral parts of this framework which is a dynamic
process influenced by mediating factors. This framework lays the foundation for
engagement with written CF.
In Chap. 3, I present a very brief review of the literature on written CF, starting with
the definition of errors, especially systematic and unsystematic, treatable and untreat-
able errors, as well as positive and negative evidence. Giving-Feedback-Guidelines,
based on Sadler’s (1989) conditions for feedback, are introduced to strengthen the
importance of explaining one’s chosen feedback method to the learners to enhance
its effectiveness. Such guidelines are essential when providing feedback, because teachers have
so many options, for example formative and summative or indirect and direct feed-
back. Beyond these broad distinctions, teachers need to decide which type of feedback to use
with their respective learners. The most common types, such as error corrections,
codes, peer correction among others, are considered in this section of the chapter.
This chapter concludes with the Engagement-Mediator-Feedback Model which
is based on the Dynamic-Engagement-Framework discussed in Chap. 2. This model
shows how engagement, feedback, mediating factors and strategies for working with
written CF interrelate and can inform teachers about its underlying dynamic process.
In Chap. 4 the setting, methods and research tools of my qualitative study are
discussed and critically evaluated. As the context is specific, namely an Austrian
secondary business school, the Austrian school system in general is described to give
an overview of its complexities. This is followed by my particular teaching context
where this distinct type of school within the Austrian school system is explained in
some detail. In addition, as I am a teacher researcher, the tricky issue of investigating
my own classroom and all its implications is addressed. While stressing the rele-
vance of undertaking action research, I critically explore my role as both the research
participants’ teacher and the researcher.
In the next section of this chapter, reasons for conducting this study are discussed
and I also show the process of formulating and changing my research questions due
to thorough engagement with the literature in both areas. In order to fully understand
the research context, it is vital not only to give an insight into the context and participants,
but also to discuss the feedback methods these participants encountered in their
English lessons – especially as these methods are an essential element of the empirical part of the
small-scale study.
The final section of this chapter summarises the data collection. Starting with a
general overview of the data collected, I continue with a thorough explanation of
the various research tools used. The transcription and coding process of the data
gathered concludes this chapter.
In Chap. 5 I present the results of my small-scale study, where excerpts from the
interviews conducted are used to demonstrate learner engagement with written CF.
If appropriate, other tools (e.g., learners’ blog entries, pieces of homework assign-
ments) are taken into account as well to underline the different dimensions of engage-
ment. First, I start with a brief introduction to engagement, where the learners’ views
on this term are included too. This is followed by an explanation of the learners’
emotional engagement with written CF to show how this can have an impact on
their level of engagement. Not only are the learners’ relationships with their peers
and teachers explored in this respect, but also their emotional engagement with
school and a specific subject. The interplay between their emotional engagement
with their teachers and the feedback methods provided concludes this section of the
chapter. In line with the three dimensions of the Engagement-Mediator-Feedback
Model, learners’ cognitive engagement with the feedback they get is explored.
Excerpts from the interviews are used to demonstrate the manifold methods upon
which learners rely in order to succeed in working with written CF. Additionally,
ways in which teachers can foster engagement and why this might influence cogni-
tive engagement, or even lead to a lack of it, are pointed out. Learners’ behavioural
engagement is looked into and differences to cognitive engagement are portrayed.
Besides learners’ reasons for engaging positively with written CF, one section deals
with homework assignments to illustrate how the type of task can influence learners’
willingness to engage with it.
The next section of Chap. 5 deals with feedback methods. First, the level of the
learners’ engagement with some of the most common feedback methods such as error
correction by the teacher, reformulation of sentences, colour coding, or the personal
written statement by the teacher is depicted. Opinions on these methods, voiced
directly by learners, indicate their effectiveness in engaging them. Second, insights
gained from the learners’ voices are used to investigate which kind of feedback
method might be an efficient way to engage learners with feedback. To illustrate
how this could work in practice, the feedback method these participants encountered
in English in their final year is discussed.
As established within the Dynamic-Engagement-Framework, mediating factors
can influence learners’ engagement with written CF. Looking at teachers and their
way of giving feedback, unclear instructions and the concept of feedback seen as
a dialogue are explored to show (un)successful ways of learner engagement. Other
factors, such as peers, grades and the feedback method itself, can additionally
foster or hinder engagement. Attention is drawn to the differing opinions among
learners about the degree to which these factors influenced them. Thus, attitudes
and emotions are finally highlighted to show how factors such as individuality, self-
confidence, commitment, laziness or anxiety (among others) play a role in learner
responses to written CF.
The final section of this chapter summarises strategies needed for working with
feedback. Clearly, teachers cannot assume that learners know how to deal with feed-
back they get, hence teachers need to provide them with the tools to do so. The
data gathered is used to stress what is needed to enhance learners’ engagement with
feedback.
The final chapter summarises the value of the concept of learner engagement with
written CF. I reflect on the research questions and critically evaluate their effective-
ness within the Engagement-Mediator-Feedback Model. Once again, excerpts from
the interviews are used to show why such a model may or may not result in engaging
learners with written CF. This also includes taking a critical stance on possible limi-
tations of the research design. I conclude this chapter with indications for future
research in this area.

References

Bitchener, J., & Knoch, U. (2009). The value of a focused approach to written corrective feedback.
ELT Journal, 63(3), 204–211.
Bitchener, J., & Storch, N. (2016). Written corrective feedback for L2 development. Bristol:
Multilingual Matters.
Burns, A. (2005). Action research: An evolving paradigm? Language Teaching, 38(2), 57–74.
Chandler, J. (2003). The efficacy of various kinds of error feedback for improvement in the accuracy
and fluency of L2 student writing. Journal of Second Language Writing, 12, 267–296.
Cohen, A. D. (1987). Student processing of feedback on their compositions. In A. Wenden & J. Rubin
(Eds.), Learner strategies in language learning (pp. 57–69). Language Teaching Methodology
Series. New York et al.: Prentice Hall.
Dörnyei, Z. (2007). Research methods in applied linguistics: Quantitative, qualitative, and mixed
methodologies. Oxford: Oxford University Press.
Ellis, R. (1997). Second language acquisition. Oxford: Oxford University Press.
Ellis, R. (2009). A typology of written corrective feedback types. ELT Journal, 63(2), 97–107.
Ellis, R. (2012). Language teaching research and language pedagogy. Chichester: John Wiley &
Sons Ltd.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the
concept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Handley, K., Price, M., & Millar, J. (2011). Beyond ‘doing time’: Investigating the concept of
student engagement with feedback. Oxford Review of Education, 37(4), 543–560.
Han, Y., & Hyland, F. (2015). Exploring learner engagement with written corrective feedback in a
Chinese tertiary EFL classroom. Journal of Second Language Writing, 30, 31–44.
Hyland, K. & Hyland, F. (2006). Feedback on second language students’ writing. Language
Teaching, 39, 83–101. Retrieved from http://hub.hku.hk/bitstream/10722/57356/1/133933.pdf.
Lee, I. (2005). Error correction in the L2 writing classroom: What do students think? TESL
Canada Journal, 22(2), 1–16. Retrieved from http://teslcanadajournal.ca/index.php/tesl/article/
view/84/84.
Lewin, K. (1946). Action research and minority problems. Journal of Social Issues, 2(4), 34–46.
Lewin, K. (1948). Resolving social conflicts: Selected papers on group dynamics. New York: Harper
& Row.
Pittaway, S. (2012). Student and staff engagement: Developing an engagement framework in a faculty of education. Australian Journal of Teacher Education, 37(4), 37–45.
Price, M., Handley, K., & Millar, J. (2011). Feedback: Focusing attention on engagement. Studies
in Higher Education, 36(8), 879–896.
Richards, K. (2003). Qualitative inquiry in TESOL. Basingstoke: Palgrave Macmillan.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional
Science, 18(2), 119–144. Retrieved from http://pdf.truni.sk/eucebnice/iktv/data/media/iktvv/
Symposium_LTML_Royce%20Sadler_BFormative_Assessment_and_the_design_of_instructi
onal_systems.pdf.
Chapter 2
The Engagement Concept

The first two decades of the twenty-first century have seen a growing interest in
engagement, most often known as research on student engagement or learner engage-
ment. At the advent of research in this area student engagement was the commonly
used terminology, whereas recently learner engagement seems to be the preferred
term. Thus, as already mentioned in the introduction, I use both terms interchange-
ably. Before outlining research on student engagement, it might be worthwhile to take
a brief look at the etymology of the word engage. Dating back to late Middle English
in the mid-sixteenth century, it derives from the French verb engager or rather from
the base of gage. Originally, the word meant ‘to pawn or pledge something’ which
changed to ‘pledge oneself (to do something)’ and other meanings before it was
defined as ‘involv[ing] someone or something else’ (cf. Soanes & Stevenson, 2003).
The noun engagement then developed in the early seventeenth century, meaning ‘a
legal or moral obligation’ (cf. ibid.). Its use declined in the late nineteenth century
and became extremely popular again in the twenty-first century.
Since then, ‘engagement’ has been used in research contexts to refer to
‘student engagement’, a concept which is fairly new in Second Language Acquisition (SLA)
compared to other research interests, but one that has been used for some time
in Higher Education (HE) in terms of student participation in academic work and
extracurricular activities, among other aspects. What is more, some researchers
(Dunne & Derfel, 2013a) interpret student engagement as working with and not
providing for students, as ‘engagement’ also implies that not only learners them-
selves, but several other factors come into play. If the emphasis is on working with
learners and consequently engaging them is the ultimate aim, learners need to be seen
as co-enquirers (Bryson, 2014a). That being the case, teachers and learners together
can create a positive learning environment, boosting the student experience in the
various teaching contexts.
Furthermore, research on student engagement in HE is also concerned with enhancing the learner
experience of disadvantaged students (e.g., underrepresented groups) as well as the
many international students who might feel alienated at universities abroad (Krause,

2005). This is a noble notion, especially in comparison to engagement being seen solely as student
satisfaction, which is the case in some areas of HE. Sometimes—as surveys in the
UK show—students are viewed as customers who need to be satisfied (Solomonides,
2013), and the real issue of how to engage students and enhance the student experience
recedes into the background. One reason for student engagement being equated with customer
satisfaction is undoubtedly our globalised and competitive world, which has found its
way into educational institutions and is every now and then the driving force behind
educational policies.
Another reason—perhaps even one of the most prominent ones—for the emer-
gence of student engagement was to improve the performance of students with
mediocre or poor results in school (cf. Finn & Zimmer, 2012) and to reduce school
dropout (e.g., Finn, 1989; Rumberger, 1983). During early research into engage-
ment, psychologists looked at participation in academic settings and the behavioural
aspects influencing that (Wolters & Taylor, 2012). What is more, over time it has
developed from a one- to a multidimensional construct (Fredricks, Blumenfeld, &
Paris, 2004; Furlong et al., 2003; Jimerson, Campos, & Greif, 2003) and the debate
on the number of dimensions it comprises is still ongoing. Thus, it is not surprising
that defining the term ‘student engagement’ is not as easy as one might
think. In addition to educational settings, engagement has also found its way into
the world of work, where the construct has been embraced as a tool to counteract
burnout. Marcum (2000) has developed a particular interest in engagement and tries
to explain it as a mathematical equation: E = L (I + Cp + Ch) x Inv (A + Co +
Cm) → IK/Ef → E (ibid., p. 59). In his opinion engagement equals learning, which
consists of interest, competence and challenge, multiplied by involvement, which
includes activity, communication and commitment. These components then produce
increased knowledge and effectiveness leading to increased engagement (see Garrett,
2011; Marcum, 2000).
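To make this shorthand easier to parse, the equation can also be set out in display form. The following LaTeX rendering is my own and simply maps Marcum’s symbols onto the readings given in the prose above, so the glossary is interpretive rather than taken verbatim from the original:

\[
  E \;=\; L\,(I + C_p + C_h) \;\times\; Inv\,(A + C_o + C_m)
  \;\longrightarrow\; IK/Ef \;\longrightarrow\; E
\]
% Symbol glossary (as read from Marcum's prose): E = engagement, L = learning,
% I = interest, C_p = competence, C_h = challenge, Inv = involvement,
% A = activity, C_o = communication, C_m = commitment,
% IK = increased knowledge, Ef = effectiveness.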
A formula for defining engagement would definitely be welcomed by some, but
is it possible to calculate engagement with a mathematical formula? I would argue
not, for two reasons: First, individual learners are involved who cannot simply be
represented as symbols in an equation, and second, many factors can influence the
level of engagement of the respective learner, thus making it even more difficult to
express in mathematical terms. Not without reason have researchers commented on
the difficulty of defining student engagement as so many aspects are involved in it,
thus the terminology related to engagement is vast: student engagement is used by the
majority of researchers as an umbrella term, but student involvement (Astin,
1984, 1985; Pace, 1984; Pascarella & Terenzini, 1991, 2005), school engagement
(e.g., Finn & Zimmer, 2012; Fredricks et al., 2004; Reschly & Christenson, 2012),
classroom engagement (Reeve, 2012), engagement in school, academic engagement,
engagement in class, engagement in school work, engagement with language (Storch,
2008; Svalberg, 2009) and learner engagement (Ellis, 2010; Han & Hyland, 2015;
Moyer, 2014) are also used on a regular basis. Hardy and Bryson (2010), when discussing
the concept and measurement of student engagement, state “[t]here can be
no ‘quick fix’ solution, it is a multi-faceted, social constructivist concept that should
take account of students’ sense of self and aspirations and contexts they are in” (p. 19).
Their view is one among many, as various researchers interpret the term differently,
making it necessary to declare which definition is being used in the different research
contexts (see Sect. 2.1. for a thorough discussion).
Undeniably, it involves learners being engaged in a process; the only question
remaining is how they are engaged in it. The teacher plays a role in this respect as
he or she is the one giving instructions and providing feedback—but to what extent?
Just how much influence does a teacher have in regard to the learners’ engagement?
Skinner and Belmont (1993) in their study on ‘reciprocal effects of teacher behaviour
and student engagement’ point out that both psychological and educational literature
should be considered when investigating learners and their motivations for learning,
as these two research areas complement one another. Their findings also show that the
interpersonal relationship between teachers and students can foster or hinder student
engagement. Consequently, it should be kept in mind that as educators “we cannot
afford to have a bad day simply because the students we work with never deserve
to have a bad day because of us […]” (Whitaker, Whitaker, & Lumpa, 2009, pp.
xv, xviii cited in Dörnyei & Ushioda, 2011, p. 159). Easily said and definitely true
but achieving this sentiment every single day might be quite a challenge at times. In
line with that, if teachers are not engaged in their subject/teaching, then it is highly
unlikely that they can engage learners (cf. Middlecamp, 2005). Another aspect to be
considered when talking about engagement was pointed out by Egbert (2015), saying
“it is a rare occasion when all students are fully engaged at the same time. However,
all students can be engaged a lot of the time” (p. 70). With that statement she identifies
the root of the problem: engagement is a complex and dynamic construct which can
be influenced by many different factors, thus making it challenging for teachers to
figure out which methods might work for engaging learners.
Another dilemma researchers in student engagement face is whether motivation
and engagement are two distinct constructs or closely linked ones. The Quality Assur-
ance Agency for Higher Education (QAA) in the UK, for example, even includes
motivation in their definition of engagement (Online 1). No matter how they are
viewed, one criticism is linked to both: a lack of students’ voices. Ellis (2012),
for example, pointed out motivation as a key factor in influencing the success of
instruction, but “surprisingly, there are remarkably few studies that have specifi-
cally examined how motivation affects the way learners respond to instruction or
how instruction affects learners’ motivation” (p. 325). Similarly, Dunne and Derfel
(2013b) emphasised in their introduction to The Student Engagement Handbook that
“[i]ronically, direct student voices are rarely apparent in texts on the ‘student voice’
or on student engagement” (p. xv). Yazzie-Mintz and McCormick (2012) reported
on some of the reasons why, in the school context, students’ voices are often left out of the
equation. On the one hand, there are school staff who do not trust what students
say, believing they know what the students’ perspective on school-related matters
is. On the other hand, researchers often do not study students directly due to several
restrictions (e.g., consent forms, parent/guardian permission, authority permission)
when working with minors. Nevertheless, when studying student engagement, the
students’ voices are paramount in effecting positive change to existing systems. As a
consequence, to make the students’ voices heard the students might need to be seen
as co-enquirers who are part of the dialogue between the (teacher) researcher and
the students; otherwise real engagement cannot begin.

2.1 Emergence of Student Engagement

The term ‘student engagement’ has been used since the late twentieth century mainly
in literature on Higher Education, especially in the USA and Australia (and a little later in the
UK), and in research on school engagement (Fredricks et al., 2004). The exact starting
point of this concept might be difficult to determine, but student engagement theory
dates back to the work of Astin (1984, 1985, 1993), Chickering and Gamson (1987),
Kuh, Schuh, Whitt, and Associates (1991), and Pace (1984) to name just a few. It
has to be stated, however, that Astin and Pace used the term ‘student involvement’
instead of ‘student engagement’. Hence, it is not surprising that the earliest review
that included the term ‘student engagement’ by Mosher and MacGowan (1985) did
not mention Astin and Pace, but instead reviewed the work of Natriello (1984)
and Rumberger (1983), who did use the term ‘engagement’ (see Sect. 2.1.1.), whereas Astin and Pace
referred to ‘student involvement’, a term which is still used in research today.
Generally speaking, student engagement includes the “degree of attention,
curiosity, interest, optimism, and passion that students show when they are learning
or being taught” (Abbott, 2014—Online 2). Abbott argues that students are engaged
if these areas are stimulated, and very likely disengaged if not. Student engagement
has also been depicted as a mediator between contextual factors (e.g., classroom setting) and
personal factors (e.g., interest, boredom) on the one hand and academic performance or achievement
(e.g., school completion) on the other. Drawing on these, it seems that research
on student engagement emerged partly because of worrying dropout rates and student
alienation, as well as the desire to find a way to enhance the student experience at
secondary and tertiary level. In the light of low levels of academic achievement, high
levels of boredom among students and high dropout rates in certain areas (cf. Finn,
1989; Fredricks et al., 2004), it is not surprising then that research in student engage-
ment is becoming more and more popular, resulting in intervention programmes
(e.g., Christenson & Reschly, 2010) and student engagement questionnaires (e.g.,
HSSSE, NSSE), for example.
At the beginning at-risk students were often researched and factors such as
parents, peers, teachers, working hard, student behaviour, etc. were identified as
influencing student engagement. What is more, these studies showed that not only
at-risk students, but all students are likely to be affected by these factors (cf. Finn &
Rock, 1997; Reschly & Christenson, 2006; Yazzie-Mintz, 2007). In the past student
engagement has mainly been defined as students’ participation in activities offered by
schools/universities and their attitudes towards these, which lead to engagement and
positive outcomes (e.g., achievement, academic knowledge). This has changed as
more and more researchers explore the construct in other contexts—Guthrie’s studies
with Anderson (1999) and Wigfield (2000) on student engagement with reading being
one example.

2.1.1 Terminological Issues

Already at the beginning of research on student engagement, Mosher and MacGowan


(1985) criticised the rather unclear meaning of the concept—a criticism
which can still be found in the literature on engagement today (Reschly & Chris-
tenson, 2012; Skinner, Kindermann, & Furrer, 2009) and which urges researchers to clearly
conceptualise their notion of engagement (Trowler & Trowler, 2010a). Fredricks et al.
(2004) also commented on this issue in their review of school engagement, stating
that, while it is definitely a valuable concept, it needs clarification as it not only
partly overlaps with other areas of research (see Sect. 2.3.) but is also not as precise
as existing research in motivation, for instance. Likewise, Dunne and Derfel (2013b)
explicitly state in their handbook the problems research on student engagement
is still facing, showing that this area is still in its initial stages:
[Student engagement] is used in the same breath as student participation, involvement,
commitment, effort, time on task or motivation. It is associated with teamwork, leadership,
community or civic engagement, democracy, with partnership, co-creation and collabora-
tion, with developing new relationships between staff and students, and with students as
customers. It is linked to curriculum design and is claimed to be promoted by a variety of
modes of teaching or learning, as well as by the nature of assessment and feedback provided
for learners, to the role of peers as mentors, or to the power of technology to promote
engagement. (Dunne & Derfel, 2013b, p. xv)

Researchers in this area are clearly challenged by the lack of a distinct definition
of the term ‘student engagement’ (Bryson & Hardy, 2011; Dunne & Derfel, 2013a;
Han & Hyland, 2019; Trowler & Trowler, 2010b), as its meaning varies so much.
While some only link it with extracurricular activities, others use it to describe student
participation in class or even equate it with student satisfaction. I believe that the latter
is highly problematic, especially when student engagement is defined as a complex
and dynamic system, a process that can vary over a period of time. Hence, students
could be highly engaged on several levels but still not be satisfied with the outcome.
Unsurprisingly, researchers also debate whether student engagement should
be seen as a process or an outcome. I strongly argue that it is a process that can
change during the course of time. If students decide to engage with their learning, it
is a process that will proceed through several stages and will definitely be influenced
by many factors, such as group dynamics, family, friends and teachers, to name just
a few. It is essential to keep in mind that:
[…] the L2 classroom should never be treated as monolithic in nature (and there is danger
in some of the research of viewing it this way). Classrooms are complex social contexts
and they vary enormously. Furthermore, they are not fixed and static social contexts. They
are dynamic, constantly changing, in part at least because of the part played by learners in
helping to construct and reconstruct them […]. (Ellis, 2012, p. 192)

Furthermore, I would argue that engagement cannot be seen solely as a process; rather, it
is a process which can result in an outcome. If students are engaged in their
academic work, for example, they will complete their education in the end, hence
their engagement process will result in their graduation (see also Sect. 2.5.3.1. and
2.6.).
As illustrated by the example above, student engagement is a highly complex
concept; because researchers interpret it differently, establishing a common framework
is immensely challenging. But it is definitely a concept worth investigating
when the desired outcome is to improve learner engagement at school and
university level. The three dimensions
of engagement (see Sect. 2.3.) and how to tap into them are definitely one area that
could help improve the student experience at school/university.

2.1.2 School Context

Natriello (1984) was one of the first researchers to mention the concept of student
engagement, in terms similar to researchers today, namely as participation in school
activities. Although he mentioned student engagement, he focused more on the
concept of disengagement, which he seemed to view as the complete opposite of
engagement (cf. Mosher & MacGowan, 1985). Nevertheless, Natriello’s view of
engagement implies that it is a dynamic system which can change according to the various
factors influencing it. To research disengagement at two suburban high schools in
the USA Natriello used interviews to ask students, for example, about unexcused
absences, skipping school or cheating on a test. Even his early study showed that
students’ voices appear to be absolutely necessary to be able to get to the core of
engagement (or disengagement). Likewise, Rumberger (1983) focused on disengage-
ment, especially on reasons for dropping out of school. In other words, it appears the
first studies on engagement concentrated more on the negative effects of it and how
to overcome them—a notion still present in research today. Rumberger and Roter-
mund (2012) reported that economist James Heckman, after analysing several sets
of data on dropout rates, concluded that the USA’s graduation rate is 77% at present,
and hence has decreased over the last 40 years (Heckman & LaFontaine, 2010 as cited
in Rumberger & Rotermund, 2012). It is not surprising, then, that research on student
engagement has increased in order to find solutions to enhance graduation rates in
the USA and tackle the difficult issue of student dropout.
Furthermore, research on disengagement also focused on psychological factors.
Mosher and MacGowan (1985), for example, named ability, cognitive complexity,
student morale and attitudes toward learning as having an impact on students’ engage-
ment. Sprinthall and Collins (1984) looked at psychological alienation, where the
structure of secondary schooling was seen as the root of students’ alienation. They
concluded that the majority of students “drift through school, not really understanding
the point of the curriculum, not grasping the concepts […] not having the requisite
skills to succeed in extracurricular activities” (Sprinthall & Collins, 1984, p. 404 as
cited in Mosher & MacGowan, 1985, p. 10). When these psychological factors are compared
to current research on student engagement, the majority of them can be found in
the three dimensions of engagement, namely emotional, behavioural and cognitive
(see Sect. 2.3.). Mosher and MacGowan believed in the potential of student engagement back in the 1980s, concluding that a proper theory of the concept was still missing,
and that further study was not just needed but essential. This point of view is still relevant,
because many researchers note that the concept still lacks clarity
and its definitions vary tremendously.
Despite the lack of a common definition, student engagement nevertheless provides
a theoretical framework for researching dropout and supporting school completion. Engaged students are said to be eager to do more than simply complete their
school/academic work, but to thrive on challenge. Additionally, engagement is seen as a multidimensional construct in which the academic environment as well as student behaviour
are crucial factors, thus both the context and the individual are essential components.
What is more, effective instruction is another important factor that increases student
engagement.

2.1.2.1 Shared Responsibilities

Schafer and Polk (1972) had already emphasised the idea that active involvement of
all parties concerned in education might generate better academic outcomes among
students. A little later, Newmann (1981) stressed the importance of the school context
when looking at student alienation, which he saw as one cause of other problems such as
vandalism, violence and poor achievement. His research focused on the secondary
level, because this area of education appeared to be the one with the highest levels of
student alienation. After reviewing an extensive body of literature from the 1950s
onwards, he came up with six guiding principles to improve engagement and decrease
student alienation:
1. Reforms that encourage voluntary choice on the part of students.
2. Student participation in policy decisions.
3. Maintain clear and consistent educational goals.
4. Keep school sizes small.
5. Encourage cooperative student-staff relationships.
6. Provide an authentic curriculum. (Finn & Zimmer, 2012, p. 99)
These guidelines are still relevant for research today: choice, taking part in
school decisions or being part of the governing body, clearly communicated learning
outcomes, school size, the student-teacher relationship as well as authenticity in the
curriculum and teaching materials are still areas that are stressed by policymakers
but not always adhered to in the school context. Consequently, researchers are
investigating ways to improve the student experience. Reschly and Christenson (2012),
for example, identified four sources that can move research on student engagement
forward:
1. Dropout and intervention theory.
2. General school reform perspective.
3. Motivational literature.
4. Subdisciplines of psychology (e.g., educational psychology).

All these instances show that there seems to be a need for programmes that engage
students in their academic work. Student engagement, with its wide range of definitions and the
various settings in which it has been investigated so far, offers a great opportunity
to provide constructs to do so. What Reschly and Christenson termed conceptual
haziness might also be understood as an advantage: the umbrella term ‘student
engagement’ can serve as a general framework to be contextualised in manifold ways,
contributing to the body of knowledge. The question remains as to whether or not
its lack of definition mentioned by numerous researchers should be seen as a strength
which aids researching this construct in many areas, consequently moving learners
forward in their engagement with school in general and academic work specifically.

2.2 Concepts of Student Engagement

The diversity to be found in defining student engagement emerges in its conceptualisation as well. Different researchers draw on various existing fields in research
(e.g., motivation, psychology, education) to conceptualise their definitions of student
engagement, hence it might be valuable to outline some of those that have been used
by many researchers. This subchapter will briefly describe several models that have
served researchers as a framework for their engagement constructs: from one of the
earliest of these—Finn’s participation-identification model—to more recent ones,
such as self-determination theory or achievement goal theory.

2.2.1 Finn’s Participation-Identification Model

A model some researchers draw on in their definitions of engagement is Finn’s (1989) participation-identification model (see Fig. 2.1). His model emphasises the
importance of students’ active participation (e.g., school and classroom activities)
and a form of identification with school.
On the behavioural level, for instance, students can display lower or higher levels
of behaviour (Finn & Voelkl, 1993). Being in class, paying attention to the teacher
and reacting appropriately to questions, directions and assignments can be attributed
to lower levels of engagement. If students are not only at the receiving end but also
initiating discourse or investing more time into academic work, for example, it can
be credited to higher levels of behaviour. Besides participation and identification,
factors such as the quality of instruction and students’ abilities play an essential role in
successful performance outcomes as well. All in all, this construct can certainly be
seen as one that helps counteract high dropout rates in schools. A similar view is expressed
by Voelkl (2012), who describes school identification “as an affective form of
engagement comprised of students’ sense of belonging in school and feeling that
school is valuable” (ibid., p. 212).

Fig. 2.1 Finn’s participation-identification model (Finn & Zimmer, 2012, p. 101)

In the same vein, Finn emphasised in his article “Withdrawing From School”
that instead of focusing on school failure, students’ engagement (he used the term
‘involvement’) within and outside school should be investigated to aid students’
success in completing school—a notion that can also be found in student engage-
ment questionnaires (e.g., NSSE) today. What is more, drawing on Finn’s model,
the programme Check & Connect was developed, which is designed to
boost student success and engagement with school and learning on a more individual
level (Christenson & Reschly, 2010; Sinclair, Christenson, Lehr, & Anderson, 2003).
A so-called monitor is responsible for checking, for instance, a student’s attendance,
grades, or suspensions which provide the basis for the monitor to aid the student in
keeping and increasing school connectedness to prevent dropout. Communicating
with families is viewed as another essential part of the programme to make it work.

2.2.2 Skinner and Pitzer’s Model of Motivational Dynamics

Skinner and Pitzer (2012) specify four levels of engagement that have been studied
so far (see Fig. 2.2). Generally, the first level shows, for example, engagement with
church, school and family which supports positive developments and reduces delin-
quent behaviour. The second level—engagement with school—facilitates school
completion and protects against dropout. Engagement in the classroom—the third
level—includes the teacher, the curriculum, peers, friends and classmates as essential
factors. Finally, the fourth level—engagement with learning activities—is crucial for
students to acquire knowledge and skills to succeed in their school career, as well as
later on in their lives.

Fig. 2.2 Skinner and Pitzer’s model of motivational dynamics (Skinner & Pitzer, 2012, p. 23)

Teachers and peers, as well as self-systems as the motivational context, have an
impact on students’ learning, coping and resilience. Self-systems go back to the
so-called self-systems process model (Connell, 1990; Connell & Wellborn, 1991),
which promoted the fulfilment of basic needs such as competence, autonomy and
relatedness (see also Ryan, 1995). Contexts that meet these basic needs foster engagement
and are said to result in higher levels of student engagement
and academic success (Connell, Spencer, & Aber, 1994; Finn & Zimmer, 2012).
In Skinner and Pitzer’s model, engagement is part of the students’ motivation and
has an effect on how others react to them. What is more, they suggest that teachers can
shape student engagement: First, providing students with challenging or fun tasks,
which they are allowed to tackle of their own accord. Clear instructions and
constructive feedback are crucial to foster student engagement. Second, establishing
a classroom environment in which students can develop ways to fulfil tasks that are
not that much fun, but still need to be done as part of the curriculum.
In their motivational dynamics model Skinner and Pitzer view engagement as
action, where behaviour, emotion and cognitive orientation all contribute to engage-
ment. Their model also includes disaffection (or burnout) as the opposite of engage-
ment (see Fig. 2.3) and depicts the various factors having an impact on the students’
engagement with school/school work. Whether it is behaviour, emotion or cognitive
orientation, all three rely on initiation, ongoing participation and re-engagement,
hence action is needed to make engagement work. Their conceptualization clearly
shows how many different areas (e.g., involvement, interest, mastery) can have an
impact on students’ engagement and disaffection, hence stressing the multidimensionality of the construct, with behavioural, emotional and cognitive factors at
the core of their model. In line with that, it depicts engagement as a dynamic system,
as all the factors mentioned can change over time and their impact on students’
engagement or disaffection can vary immensely.

The components behavior, emotion and cognitive orientation each span initiation, ongoing participation and re-engagement, and are contrasted in terms of markers of engagement and disaffection:
Behavior: engagement is marked by action initiation; effort, exertion; working hard; attempts; persistence; intensity; focus, attention; concentration; absorption; and involvement; disaffection by passivity, procrastination; giving up; restlessness; half-heartedness; being unfocused, inattentive or distracted; mental withdrawal; burnout, exhaustion; being unprepared; and absence.
Emotion: engagement is marked by enthusiasm; interest; enjoyment; satisfaction; pride; vitality; and zest; disaffection by boredom; disinterest; frustration/anger; sadness; worry/anxiety; shame; and self-blame.
Cognitive orientation: engagement is marked by purposefulness; approach; goal strivings; strategy search; willing participation; preference for challenge; mastery; follow-through, care; and thoroughness; disaffection by aimlessness; helplessness; resignation; unwillingness; opposition; avoidance; apathy; hopelessness; and feeling pressured.
Fig. 2.3 A motivational conceptualization of engagement and disaffection in the classroom (Skinner & Pitzer, 2012, p. 25)

2.2.3 Achievement Goal Theory

A framework for unattainable goals, which could lead to disengagement and frus-
tration, was presented by Ford and Nichols (1987) who named five umbrella cate-
gories: affective goals, cognitive goals, subjective organization goals, social rela-
tionship goals and task goals. Although Ford and Nichols used these categories to
explain negative outcomes, many can be found in student engagement today and also
in research on achievement goal theory which looks into students’ motivations and
behaviours in relation to “reasons and purposes they adopt while engaged in academic
work” (Wolters, 2004, p. 236). Some research on achievement goal theory distin-
guished between target goals, general goals and achievement goals (see Pintrich,
2000 for a thorough discussion of these). Target goals are set by the individual, for
example to get a B on the next written exam in English, but do not include how
to achieve this goal. The means of achieving such a goal can be found in general goals, which
range from happiness and the self to mastery. Last but not least, achievement goals
look at why individuals strive for achievement on tasks, not only in general terms,
but also relative to certain criteria or standards of high achievement. In order to
measure their progress, students, for instance, need to be able to compare it to set
standards or an established criterion.
Anderman and Patrick (2012) connected the concept of achievement goal theory
to student engagement. They stressed that opinions among researchers about its
terminology vary somewhat, but at the advent of goal models, researchers identified
performance and learning goals (see Grant & Dweck, 2003). Nowadays, these goals
are very often labelled mastery and performance goals (Pintrich, 2000). In general,
“mastery” refers to an individual’s progress, e.g., when mastering a task, and “per-
formance” relates to the individual’s performance compared to others. Recently,
researchers have developed a 2×2 framework (Elliot & McGregor, 2001; Harack-
iewicz, Barron, Pintrich, Elliot, & Thrash, 2002; Middleton & Midgley, 1997;
Pintrich, 1999), namely mastery-approach, mastery-avoid, performance-approach
and performance-avoid goals. This construct still faces criticism because of the diffi-
culty in distinguishing between approach and avoid goals. Similar criticism is levelled
at the many different definitions of performance-approach goals (cf. Harackiewicz
et al., 2002).
Another aspect on which researchers in the achievement goal domain disagree is
whether or not performance and mastery goals are mutually exclusive. Some believe
they oppose each other and that students should not be encouraged to pursue both
goals. In contrast to that, others came to the conclusion that both goal perspec-
tives have benefits for students, hence teachers should not discourage students from
striving to engage in both goals (Senko, 2016). This view has also been supported by
the multiple goals perspective on achievement goal theory, where researchers have
acknowledged that students might pursue multiple achievement goals simultane-
ously. Hofer and Fries (2016) pointed out, however, that once again only goals within
the school context are researched, neglecting goals that go beyond it,
which might be necessary to understand students’ achievement.

A slightly different approach that has emerged in this field is the goal standards
model, where goals are viewed in respect of standards of competence that need to
be achieved. More recently this has been advanced to the goal complex approach
(Elliot, 2005; Senko, 2016), which still has standards at its core, but acknowledges that students
have various reasons to achieve goals. The reasons for the students’ engagement
can lie either within the students themselves or in other contexts (social, learning).
When trying to explain the benefits of student engagement for achievement goal
theory, Anderman and Patrick (2012) pointed out the value of the engagement
construct and named the three dimensions (see Sect. 2.3.) in various contexts, but
interestingly enough mostly used the term “motivation” to explain it. In other words,
this stresses once again the ongoing debate about whether motivation and
engagement are closely linked or separate concepts (see Sect. 2.5.).

2.2.4 Self-Determination Theory

Some researchers (e.g., Reeve, 2012; Skinner & Pitzer, 2012) based their research
on self-determination theory (SDT), an area of special interest to psychologists Deci
and Ryan (Deci & Ryan, 1985, 2000, 2008a, 2008b; Ryan & Deci, 2000). SDT is a
so-called macro-theory of motivation and consists of basic needs theory (concepts
of psychological needs), organismic integration theory (types of extrinsic motiva-
tion), goal contents theory (intrinsic vs. extrinsic goals and impact on psychological
needs), cognitive evaluation theory (effect of external events on intrinsic motiva-
tion) and causality orientations theory (individual differences). Findings suggest that
competence, autonomy and relatedness are basic psychological needs (Ryan, 1995).
When these needs are satisfied, this can lead to higher self-motivation and mental health, whereas when
they are not, the result is lower self-motivation and well-being (Ryan & Deci, 2000). The under-
lying assumption is that “people are by nature active and self-motivated, curious and
interested, vital and eager to succeed because success itself is personally satisfying
and rewarding” (Deci & Ryan, 2008a, p. 14). Rather than looking at motivation as
a unitary concept, SDT differentiates types of motivation, with the type of motivation for each
individual being more important than its totality for predicting positive outcomes.
The types of motivation Deci and Ryan (2008b) referred to are autonomous and
controlled motivation. Autonomous motivation consists of intrinsic and extrinsic
motivation (Deci & Ryan, 2000, 2008a) with people not only valuing activities but
identifying themselves with them. In this context they are more likely to invest time
in the activities at hand. In contrast to that, controlled motivation is influenced by
external factors, such as imposed rules, as well as notions like avoiding shame, and
people will behave in a particular way they believe is expected of them. Furthermore,
extrinsic motivation is said to consist of four types: first, external and introjected regulation,
which are both connected to negative emotions that can lead to low levels of
engagement; and second, identified and integrated regulation, which are usually associated with
positive emotions as well as value and importance (Kaplan & Patrick, 2016; Ryan &
Deci, 2016). Its broad scope has allowed researchers to apply SDT in areas ranging
from healthcare and education to psychotherapy. Applying the theory
to engagement means that at its core is the belief that students possess inner motiva-
tional resources and that teachers can aid these in countless ways to increase students’
engagement (Niemiec & Ryan, 2009; Reeve & Halusic, 2009).

2.3 Dimensions of Engagement

Although researchers differ with respect to defining and conceptualising engagement,
they agree on the existence of dimensions of engagement—whether two, three, four
or even more is, once again, viewed differently by them (e.g., Appleton, Christenson,
& Furlong, 2008; Finn & Zimmer, 2012; Fredricks et al., 2004; Maroco, Maroco,
Campos, & Fredricks, 2016; Reeve, 2012; Reeve, Jang, Carrell, Jeon, & Barch, 2004;
Skinner & Pitzer, 2012; Trowler, 2010). In his early work, Finn (1989), though not
using the term ‘student engagement’, already addressed many aspects of it that are still relevant
for research today, such as the behavioural and emotional dimension, which can be
found in his report on withdrawing from school. Johnson, Crosnoe, and Elder (2001),
when reporting on the role of race and ethnicity in relation to student engagement
and student attachment, stress that a common definition of the concept would be
welcomed. Furthermore, they emphasise the need to distinguish between affective
(or emotional) and behavioural engagement.
Nowadays, a lot of researchers name three distinct dimensions of engage-
ment, namely behavioural, cognitive, and emotional (sometimes also termed affec-
tive). Whereas behavioural engagement “draws on the idea of participation […]”
(Mahatmya, Lohman, Matjasko, & Feldman Farb, 2012, p. 47) and is said to be
necessary for positive academic outcomes, cognitive engagement employs “the idea
of investment, it incorporates thoughtfulness and willingness to exert necessary
effort to comprehend complex ideas and master difficult skills” (ibid.). Emotional
engagement, on the other hand, includes students’ positive and negative reactions to
classmates, teachers, etc., hence fostering or hindering their willingness to complete
their work (cf. ibid.). Some researchers (e.g., Fredricks et al., 2004) have concluded
that the three dimensions of engagement draw on existing research in other areas.
Behavioural engagement, for instance, can be found in literature on student conduct
and on-task behaviour (Karweit, 1989; Peterson, Swing, Stark, & Waas, 1984). When
looking at emotional engagement, it is associated with research on student attitudes
(Epstein & McPartland, 1976; Yamamoto, Thomas, & Karns, 1969) and student
interest and values (Eccles et al., 1983). Finally, cognitive engagement is linked to
motivational goals and self-regulated learning (Boekaerts, Pintrich, & Zeidner, 2005;
Zimmermann, 1990).
Behavioural engagement in particular has been interpreted differently by
numerous researchers. Mosher and MacGowan (1985) point out that being in school
does not equal participating but might be a necessary behaviour to make engage-
ment possible. Some researchers (Appleton, Christenson, Kim, & Reschly, 2006;
Furlong & Christenson, 2008) even divide behavioural engagement further and term
it academic engagement. Generally, the differences are that behavioural engagement is more about classroom participation, attendance, effort and extracurricular
activities, whereas academic engagement includes time spent on tasks or quantity
of assignments completed. What is more, some behaviours are said to be universal,
like regular class attendance, staying out of trouble, obeying rules, etc. (Furlong
et al., 2003), whereas others can vary in different contexts. Furthermore, behavioural
engagement can have different levels as students might show more enthusiasm or
spend more time on tasks than others. Participation in extracurricular activities can
also indicate a higher level of engagement in the behavioural dimension.
Similarly, Finn and Zimmer (2012) name academic, social, cognitive and affective
engagement as the most commonly used dimensions of engagement. Their definition
treats the first three as behavioural, with the cognitive dimension seen as
internal behaviour, a definition that is questionable as the cognitive dimension
relates to internal cognitive processes, which contribute to shallow (e.g., memorising
facts) or deep engagement (e.g., getting to the root of a problem) with academic work.
Distinguishing between the behavioural and the cognitive dimension is crucial, as
cognition is much more difficult to measure because we do not know exactly what is
going on in the minds of the students. Hence, researchers have to rely on the students’
own reflection, using, for example, stimulated recall (Bloom & Broder, 1958 as cited
in Finn & Zimmer, 2012), which later developed into think-alouds to research that
particular area of engagement. Collaboration with neuroscientists might be another
way forward in researching and measuring the cognitive dimension of engagement.
Alternatively, Pekrun and Linnenbrink-Garcia (2012) even propose five cate-
gories of engagement—cognitive, motivational, behavioural, cognitive-behavioural
and social-behavioural. In their view, emotional engagement needs to be seen as a
precursor of engagement. They stress that research on emotion has received a lot of
attention lately, and that emotion is now seen as essential for students’ learning, achievement, etc.
Furthermore, they favour motivation in the context of task involvement as a form of
engagement.

2.3.1 Pittaway’s Engagement Framework

Another area where engagement can be especially beneficial is in research on pre-service teachers, as they should be able to use methods that can engage students
with the various subjects taught at all levels of education (Newton & Newton, 2011;
Pittaway, 2012). With the three dimensions of engagement at its core, the engagement
framework designed by Pittaway could be used for all disciplines, levels and courses
(Pittaway, 2012; Pittaway & Moss, 2013). One of its purposes is to illustrate how
students engage. In Pittaway’s framework the term ‘student engagement’ is seen as a
complex and dynamic system. Figure 2.4 shows the five elements of engagement—
personal, academic, intellectual, professional and social—which are non-hierarchical
and clearly interlinked with each other. Personal engagement is at the centre of
all elements of engagement. Pittaway claims that without personal engagement, the
other dimensions would not function: the respective student must believe in him-
or herself and in his or her ability to achieve the pre-set goals.

Fig. 2.4 Pittaway’s engagement framework (Pittaway, 2012, p. 40)
Correspondingly, Reeve and Tseng (2011) introduce agentic engagement as their
fourth dimension of engagement, which they claim makes the construct complete.
Agentic engagement refers to the student actively contributing to the learning activities/environment he or she encounters—more precisely, he or she contributes to the
instruction received (e.g., by making a suggestion). In their opinion the well-established
three dimensions mean that students react to, for example, instruction, and the agentic
dimension requires proactivity on the students’ part. Is the cognitive dimension one
where students only react to an action? Is it not true that they, for instance, draw on
existing strategies to tackle an assignment set? The notion of students being active
participants (Han & Hyland, 2015; Storch & Wigglesworth, 2010) in their learning is
a vital one, but needs to be seen as part of the existing dimensions and not a separate
one. I strongly argue that without agency learner engagement will not occur, as action
and willingness to engage on the part of the learner are necessary for engagement to
commence.

2.3.2 Researching Dimensions of Engagement

While there is contention as to how many dimensions there are to student engage-
ment, researchers also disagree on how they should be researched. Finn (1989), for
instance, talked about participation as being one prerequisite for formal learning to
happen and also used the terms ‘behavioural’ and ‘emotional dimension’, which, in
his opinion, need to be seen separately, a view some researchers in student engage-
ment disagree with. Fredricks et al. (2004) argue that researching all three dimensions
of engagement in one study is desirable but might be less precise than researching
one dimension in more depth. Taking all three dimensions into account can arguably
lead to less accuracy but might be worthwhile as it can depict a very multifaceted
picture of students’ engagement. Thus, making students’ voices heard through qual-
itative research might be one essential tool to gain more insight into this research
area. Moreover, it might be beneficial to see engagement as a complex, multidimen-
sional concept, where its three dimensions are researched together and not, as in so
many studies to date, separately. As Guthrie suggested in his work on reading and
engagement with Anderson (1999) and Wigfield (2000), researching
engagement should include multiple components of it.

2.4 Measurability of Student Engagement

How to measure student engagement has been addressed by various researchers
(Appleton, 2012; Betts, 2012; Darr, 2012; Fredricks & McColskey, 2012; Samuelsen,
2012; Yazzie-Mintz & McCormick, 2012). Behavioural engagement is supposedly
easier to measure than cognitive and emotional engagement as the latter are seen as
more abstract concepts and either difficult or even impossible to observe. To believe
measuring behavioural engagement is simple does not reflect reality, however.
It might be the case for attendance, but, for instance, Reeve’s (2012) assumption that
teachers can easily observe if a student is paying attention is contradicted by Peterson
and colleagues’ (1984) study, which suggests that a researcher observing behaviour
in class might think a student is engaged when, in fact, the opposite is true. Treating
student engagement as observable behaviour, as Reeve does, is controversial, as various
aspects of the behavioural dimension as well as emotional and cognitive engagement
are particularly difficult to observe (cf. Appleton et al., 2008).
One of the most common tools to measure the latter two dimensions of engagement
is student self-reports (Appleton et al., 2006). The students respond to a number of items
describing these dimensions, choosing the answer that fits
best. The majority of these questions are general and not subject specific. One of the
few exceptions, for example, is Wigfield and colleagues’ (2008) survey on reading.
Another popular tool is teacher ratings of students using checklists or rating scales.
Skinner and colleagues (2008) even used student self-reports and teacher ratings to
cross-reference the results. Their results showed that on the behavioural dimension
the ratings correlated much more strongly than on the emotional dimension. These results
are not surprising as emotional engagement is much more difficult to observe than the
behavioural one, hence what teachers might observe about the students’ emotional
engagement level can be totally different from the students’ actual level.

2.4.1 Measurement Instruments at Secondary Level

At school level two measurement instruments have been used frequently: the High
School Survey of Student Engagement (HSSSE) and the Student Engagement Instru-
ment (SEI). The HSSSE is a survey designed on the basis of the National Survey of
Student Engagement (NSSE), measuring students’ attitudes, perceptions and beliefs
about school work, school learning environment and interactions within the school
community (cf. Online 3). The rationale for conducting the survey, as stated on its
website, is:
• To provide teachers and school administrators with valid and meaningful data on
student engagement at their specific school.
• To help educators use student engagement data to develop and implement
strategies and practices that increase student engagement at their specific school.
• To help schools interpret their own student engagement data relative to the
aggregate results from other schools (Online 3).
The HSSSE measures three dimensions, namely cognitive/intellectual/academic
engagement, social/behavioural/participatory engagement and emotional engage-
ment (Yazzie-Mintz & McCormick, 2012). The survey contains 35 major items
which, including all sub-questions, elicit students’ responses to more than 100 items. Additionally, the last question is an open-response question where students can elabo-
rate on their school experience. The SEI, on the other hand, consists of 33 items,
measuring cognitive and emotional/affective engagement, using five subtypes. The
cognitive part looks at future plans and the relevance of schoolwork, whereas the
emotional/affective part investigates the relationships between teachers and students,
students and students as well as students and families. The students have four options
to answer these items: Strongly Agree, Agree, Disagree and Strongly Disagree.
Researching two (SEI) or three (HSSSE) dimensions is essential to further enhance
the student engagement construct, but these examples are rare because, more often
than not, only one dimension of engagement is researched.
Understandably, a lot of studies look at one dimension or a single construct when
researching engagement in order to make the sheer amount of data manageable. Very
often student behaviour is researched on its own without looking at its connection
to the student experience. This approach, however, might lack a broader view where
the school structure and its environment (Finn & Voelkl, 1993), the teacher’s role
(Reeve et al., 2004) and interaction at several levels in the school context (Johnson
et al., 2001) appear to be neglected. Hence, further research is clearly needed to move
research into multi-dimensionality forward.

2.4.2 Measurement Instruments at Tertiary Level

Even though C. Robert Pace used the term ‘student involvement’, connecting it with quality
of effort (how much students invested on their own) to enhance the student
experience at college, some researchers see him as the forefather of student engagement, as he designed the College Student Experiences Questionnaire, which was in
use until 2014.
from the work of Astin and others is the National Survey of Student Engagement
(NSSE). This is a questionnaire which was developed in 1998, piloted in 1999 and
then became an annual survey instrument in 2000. Nowadays it is also used in Canada
as well as other countries. Adapted versions of it (see Solomonides, 2013 for a detailed
discussion) include the South African Survey of Student Engagement (SASSE), the
Australasian Survey of Student Engagement (AUSSE) and the National
Survey on Student Engagement-China (NSSE-C). In 2016 NSSE was used by 560
universities and colleges (cf. Online 4). According to NSSE student engagement
represents two critical features of collegiate quality. The first is the amount of time and effort
students put into their studies and other educationally purposeful activities. The second is
how the institution deploys its resources and organizes the curriculum and other learning
opportunities to get students to participate in activities that decades of research studies show
are linked to student learning. (Online 4)

This definition seems to be in line with other researchers’ beliefs (e.g., Garrett, 2011;
Kuh, 2001; Kuh, Cruce, Shoup, Kinzie, & Gonyea, 2008) that in some respects student
engagement can be seen as a “phenomenon confined to the curriculum” (Ratcliffe
& Dimmock, 2013, p. 60). As a result, NSSE has become a very popular instrument
and many institutions have used this questionnaire to measure student engagement.
Basically, the underlying belief is that levels of engagement can be measured by
involvement in particular curricular and extracurricular activities (cf. Solomonides,
2013, p. 45). As can be expected, it is also becoming a tool that stakeholders “relate
[…] to the quality assurance and enhancement agenda” (Hardy & Bryson, 2010,
p. 19). Because of this, Hardy and Bryson voice their concern that, in the UK especially,
student engagement seems to be equated with student satisfaction and the collective
voice rather than acknowledging the individuality of each student.
Lately, the structure of the NSSE survey instrument has faced criticism: not only
are its definitions said to be too broad, but the survey, more related to habits (e.g.,
studying) and student life (Kahu, 2013; Maroco et al., 2016), does not really take the
emotional dimension into account. Its validity for other educational contexts
as well as its five engagement scales—academic challenge, active learning, interactions, enriching educational experiences and supportive learning environment—have
been questioned as well (see Kahu, 2013; Bryson, 2014b).
How should we measure engagement then, if this rather abstract concept includes
so many diverse factors and might differ from one individual to the other? Mosher
and MacGowan (1985) point out that class attendance, for example, can be easily
measured, but engagement not so much. What implications does this have for existing
tools to measure engagement? Whereas the NSSE has student engagement already
embedded in its name, the UK’s survey instrument National Student Survey (NSS)
does not. In the UK the students are asked to give their opinion on their student experi-
ence to provide institutions with information on their performance. The NSS has been
conducted since 2005 and focuses on final year undergraduate students. In contrast to
that the NSSE includes first year and senior students, which might provide a deeper
insight into how student engagement develops over time. Nevertheless, the validity of
the NSSE survey instrument has been challenged because the dimensions
of engagement are not depicted as clearly as might be necessary for a thorough
understanding of the student engagement construct. The question then remains as to
whether a quantitative approach is the right one when investigating student engage-
ment or if either a more qualitative or mixed methods approach (quantitative and
qualitative) would provide deeper insights into this construct.

2.5 Engagement and Motivation

The term ‘motivation’ is used in numerous contexts and is often a cause of heated
debate among teachers in the context of school students and their alleged lack of
motivation. The literature on motivation is extensive and Dörnyei (1994, p. 516)
points out that “terminological issues are more than mere technological questions;
the labelling of key terms and the use of certain metaphors are at the core of any
theory”. Although motivation is said to be a more widely researched area than student
engagement, terminological issues and their exact definitions are essential too. As
Dörnyei and Ushioda (2011, p. 3) state “while intuitively we may know what we
mean by the term ‘motivation’, there seems to be little consensus on its conceptual
range of reference”. The American Psychological Association even thought about
“replacing the word ‘motivation’ as a search term […] because, as a concept, it had
too much meaning and therefore was not very useful” (Dörnyei & Ushioda, 2011,
p. 3). It is hardly surprising then that student engagement faces similar terminological
challenges.
Likewise, researchers in motivation view certain areas in a different light. Self-
determination theory, for example, differentiates between types of motivation,
namely intrinsic and extrinsic, and proposes that these can complement each other—
a view not shared by all researchers (cf. Ryan & Deci, 2016). Some researchers found,
for example, that monetary incentives for doing intrinsically motivated tasks had
a negative effect on students, while positive feedback (compared with no feedback)
had a positive effect.

2.5.1 Mastery/Performance—Intrinsic/Extrinsic

Literature on motivation has also looked into mastery and performance goals
(Midgley, 2001; Schunk, Meece, & Pintrich, 2013). Mastery goals are said to be
linked to intrinsic motivation whereby the learning process is influenced by the indi-
vidual’s desire to master a certain topic, skill, etc. Performance goals, on the other
hand, are about getting good grades and performing well for the sake of others.
Hence, deep strategy use is related to mastery goals which can be reinforced by
teachers if they draw more attention to critical thinking, problem solving, etc. instead
of grades and performing well on (standardized) tests. The psychologists Deci and Ryan
investigated intrinsic and extrinsic motivation too, in their development of self-determination theory (see Sect. 2.2.4.). In contrast to other conceptualizations of
motivation, they used the concept of autonomous motivation, where people act of
their own accord, and controlled motivation, where “people act without a sense of
personal endorsement” (Deci & Flaste, 1995, p. 2 as cited in Brooks, Brooks, &
Goldstein, 2012, p. 545). Thus, intrinsic and extrinsic motivation reflect students’
personal involvement in the learning process, whereas controlled motivation is part
of external entities (e.g., teacher) regulating how students should behave in the school
context. Brooks and colleagues (2012) argued that variables of intrinsic motivation
are closely linked to engagement and extrinsic motivation can sometimes even lead
to less engagement.

2.5.2 The Engagement Motivation Dilemma

As can be expected, researchers’ opinions on the relationship between engagement
and motivation differ enormously. Mahatmya and colleagues (2012), for instance,
outline the various points of view on this relationship. These include engagement
and motivation being separate constructs, motivation being a prerequisite of engage-
ment, engagement as an outward manifestation of motivation (Skinner & Pitzer,
2012; Taras, 2015) or Connell’s process model of motivation and how these influ-
ence engagement (Blumenfeld, Kempler, & Krajcik, 2006; Connell & Wellborn,
1991; Connell et al., 1994; Davis & McPartland, 2012; Dunne & Derfel, 2013a).
Engagement and motivation have also been closely linked (Krenn, 2017; Martin,
2012; Pekrun & Linnenbrink-Garcia, 2012; Schunk & Mullen, 2012). Nichols and
Dawson (2012), for example, viewed engagement as a specific type of motivation,
which involves students’ active participation and effort in learning processes and
achievement outcomes. A lot of researchers, however, appear to believe that moti-
vation precedes engagement (Janosz, 2012; Wentzel & Wigfield, 2009). Mahatmya
and colleagues’ own definition of student engagement, on the other hand, includes
motivation as an integral part of all three dimensions. In line with that, Griffiths and
colleagues (2012) view motivation as an internal process improving the chances that
a student will engage. Engagement, in their view, is the “more visible manifestation of
such motivational tendencies” (ibid., p. 565). Once again, engagement and motivation
are seen as concepts that are closely linked.
In contrast to that, Cleary and Zimmerman (2012) promote a social-cognitive theo-
retical framework of self-regulatory engagement which has a cyclical feedback loop
(before, during and after a task) at its core, with motivation and engagement being
two essential components of this framework, as it is their belief that these complement each other but are distinct constructs. It has to be noted that self-regulation
theorists are particularly interested in mechanisms of strategic thinking and cogni-
tive management of learning which foster student engagement, hence their approach
focuses more on cognitive engagement.
Similarly, Newmann and colleagues (1992) looked at engagement in academic
work and concluded that engagement and motivation are related but differ in
several ways. They claimed engagement in academic work is defined by three
factors: “students’ underlying need for competence, the extent to which students
experience membership in the school, and the authenticity of the work they are
asked to complete” (Newmann, Wehlage, & Lamborn, 1992, p. 17). Likewise,
the National Research Council published a report in 2004 that also established a
difference between engagement and motivation. Furthermore, they stated engage-
ment consisted of observable behaviour (e.g., completing homework), unobservable
behaviour (e.g., attention) and emotions (e.g., interest, boredom), thus including
two of the three dimensions of engagement widely accepted by researchers (see
Sect. 2.3.). Both acknowledge that engagement and motivation are linked, but differ
in their conceptualizations—a view not shared by all researchers.
Ainley (2012), who investigated interest as part of motivation, says motivation
is an “underlying psychological process and engagement […] a descriptor for the
level of involvement or connection between person and activity” (ibid., p. 285, see
also Russell, Ainley, & Frydenberg, 2005). In her view engagement can be found
on a personal level where it shows a person’s involvement with an activity. In line
with that, Bempechat and Shernoff (2012) point out that motivation has had a long
tradition and has often been viewed as a psychological construct, which, in their
opinion, is not true of engagement as it refers to involvement or commitment. Like
others they argue that the increasing interest in engagement is also due to the fact
that it is malleable, a view that certainly holds true, as engagement has been used in so many
different contexts where supporting students and engaging them within a certain
area (school, tasks, feedback, preventing dropout, etc.) is at the core of all of these
constructs. Its malleability seems to be one factor for its increasing popularity in
research.
Schunk and Mullen (2012), for example, stressed that motivation affects students’
engagement, hence seeing motivation and engagement as two different concepts.
Guthrie, Wigfield, and You (2012) portray motivation and engagement as different
constructs as well, stating that “motivation […] energizes and directs behavior and
often is defined with respect to beliefs, values, and goals individuals have for different
activities” (p. 602, see also Eccles & Wigfield, 2002; Wigfield, Eccles, Schiefele,
Roeser, & Davis-Kean, 2006). Some view motivation and engagement as paramount
for learning processes (see Martin, 2012), and a lot of researchers acknowledge that agency
is part of motivation and engagement. Reeve (2012) and Reeve and Tseng
(2011), for example, even include it in their engagement framework. These few exam-
ples of researchers’ views on engagement and motivation depict the current debate
on engagement and motivation being either one concept, two separate concepts, or
two concepts complementing one another.

2.5.3 Indicators, Facilitators and Outcomes of Engagement

Skinner and Pitzer (2012) stress the need for a distinction between indicators, facil-
itators and outcomes of engagement (cf. also Skinner et al., 2008; Skinner, 2016)
to further provide the construct with a clearer definition. Besides the behavioural,
emotional and cognitive dimensions, observable activities, such as homework completion, would be indicators. Academic performance (e.g., tests, grades), however, would be
a possible outcome of engagement. As facilitators they name personal and social
facilitators. Personal ones include, for example, perceptions of the self and self-
systems (see Fig. 2.2), whereas social facilitators show the relationship between
teachers, peers, parents and whether or not these are dependable, controlling, etc.
Drawing on Skinner and Pitzer’s definition, Lam, Wong, Yang, and Liu (2012) stress
that facilitators should not be part of the concept of student engagement, because
otherwise researchers cannot research the influence of contextual factors (e.g., teachers, peers)
on student engagement. Additionally, they emphasise that outcomes (e.g., grades,
school completion) should also not be included in the engagement construct—for
the same reason mentioned above. This would only leave indicators to be researched
within the engagement construct, which I believe would limit research on student
engagement tremendously.
It could be argued that the dimensions of engagement consist of internal
processes (e.g., cognitive and emotional) and external indicators (e.g., behavioural
and academic) which impact on the level of engagement. It is likely that research on
engagement needs to distinguish more rigorously between engagement inside and
outside the classroom. To obtain a clearer picture of student engagement it might
even be necessary not only to distinguish between these two areas but also to look
at how one area might have an influence on the other.

2.5.3.1 Outcome or Process?

Reschly and Christenson (2012), who promote four dimensions of engagement,
namely academic, behavioural, cognitive and affective, view engagement as a process
and an outcome (see also Bryson, 2014b), with many variables influencing it (see
Fig. 2.5). In their research, they looked at the context (family, peers, school and
community) that might have an influence on indicators of engagement. Indicators
are on the individual student’s level as they are related to school connectedness,
ability or the attendance record.

Fig. 2.5 Model of associations between context, engagement, and student outcomes (Reschly & Christenson, 2012, p. 10)

Facilitators, on the other hand, can be influenced
by the school body (e.g., disciplinary measures), parents (homework supervision)
and peers (value of academic achievement), and are needed for intervention. If prob-
lems arise, indicators and facilitators can be the basis for intervention to re-engage
students, which can lead to successful learning outcomes and finally graduation from
high school, hence engagement is both an outcome and a process. Reschly and Chris-
tenson also emphasised that the students’ own point of view is important for change
in their learning and behaviour.
Reeve (2012) stresses that the distinction between motivation and engagement
lies in the former being an unobservable process and the latter an observable product
(= behaviour). But taking the emotional and cognitive dimensions into account,
engagement cannot be described as observable behaviour (cf. Furlong & Christenson,
2008) as internal processes in the above-mentioned dimensions are rather difficult
to observe, but vital for students’ engagement. Reeve argues that those researching
student engagement from a motivational perspective see it as an outcome and engage-
ment theorists view motivation as a source of engagement. The educational psychol-
ogists Gettinger and Walter (2012), for example, postulate engagement as being an
outcome of motivation. They stress that engagement is a mediator between moti-
vation and learning, and is responsible for achievement in the end. These differing
views among researchers portray the dilemma of engagement, making it paramount to
define (once again) whether engagement is seen as an outcome or a process—or
both—in the various studies.

2.6 Engagement and Feedback

As outlined in the previous subchapters, engagement is a multifaceted construct,
interpreted rather differently in research. Therefore, engagement in this study is
seen as a multidimensional construct, where the three dimensions of engagement
(Fredricks et al., 2004)—emotional, cognitive and behavioural—are at the core of its
definition. Not one but all three dimensions will be discussed in chapter five when
the findings of my data are presented. Although some researchers name academic
engagement (e.g., on-task behaviour, academic activities) as a fourth dimension, I
argue that this should not be seen as a separate entity but as an integral part of the behavioural
dimension (Lam et al., 2012; Skinner & Belmont, 1993). A way forward would be
to distinguish between lower and higher levels of behavioural engagement, where,
for instance, attendance alone suggests lower levels of behavioural engagement and
academic engagement a higher level. Likewise, agentic engagement, named by Reeve
and Tseng (2011) as a separate dimension, is inherent to engagement, because without
agency engagement is hardly possible (cf. Finn & Voelkl, 1993) and should therefore
be part of the behavioural dimension of engagement.
Additionally, engagement is seen as dynamic, especially as it can vary
over time, which will be portrayed, for instance, in the subchapter on individuality in
Chap. 5. The level of engagement can fluctuate on all three dimensions and students
cannot be engaged on all three dimensions with the same intensity at all times. Not
only that, there are a lot of mediating factors which can have an influence on engagement as
well, and these can be seen as mediators between the feedback method and the learners’
engagement. Consequently, engagement is a process that evolves over time and
changes constantly until it manifests itself in outcomes such as deep engagement with
feedback (see Sect. 5.2.). Skinner and Pitzer’s (2012) distinction between indicators,
facilitators and outcomes of engagement (see Sect. 2.5.3.) is a vital one, as the
feedback method is an indicator of the students’ engagement on all three dimensions.
The facilitators—researched as mediating factors in this study (see Sect. 5.3.)—can
have a huge impact on engagement as well.
Finally, motivation is seen as an integral part of engagement (see Sect. 2.5.) as
I believe that engagement without motivation and vice versa cannot exist. Skinner
and Pitzer’s (2012) notion of engagement being the outward manifestation of moti-
vation is valuable for the study at hand. I would even go further and argue that
engagement and motivation are inherently linked, given that the three dimensions
can be intrinsically or extrinsically motivated, as engagement with feedback will
show (see Sect. 5.2.). Whether deep or shallow engagement (Crick, 2012; Lam
et al., 2012) occurs can be aided not only by the feedback method used, but
also by mediating factors impacting on engagement (see Fig. 3.1). Finally, investigating mediating factors (see Sect. 5.3.) influencing learners’ engagement with
written corrective feedback also suggests that motivation and engagement are closely
linked.

Fig. 2.6 Dynamic-engagement-framework
Drawing on the above-mentioned areas of engagement, I propose the Dynamic-
Engagement-Framework (see Fig. 2.6), where engagement and motivation are seen as
integral parts. Engagement consists of the three aforementioned emotional, cognitive
and behavioural dimensions as well as motivation in its intrinsic and extrinsic forms. Mediating factors have an impact on these and shape the learners’ engage-
ment. Their engagement is dynamic, thus can change over time and is therefore
seen as a process that can eventually lead to an outcome (e.g., passing an exam,
graduation).
The idea of commitment can also be found in research on engagement, which I,
among other researchers, argue is undeniably linked with engagement. It also features
in research on alienation and school failure (cf. Finn, 1989; Firestone & Rosenblum,
1988a, 1988b; Polk & Halferty, 1972), where student attachment to school (subject,
teacher, peers, etc.) can lead to identification with school, thus influencing their
willingness to stay in school. An engaged student is, for example, one committed
to doing his or her assignments or participating in class as well as extracurricular
activities.

The level of engagement might vary—for instance, on the behavioural level it
might range from simply doing the assigned pieces of homework to doing a second
draft or extra pieces in order to improve. Cognitive engagement can be memorising
facts and figures, but also drawing on learning strategies to get to the root of a
problem. Emotionally, students can, for example, engage with feedback because they
like their teacher or even identify with him or her and invest more effort into doing
their written assignments. Their level of engagement can vary immensely across the
three dimensions and is unique to every learner, which will also be portrayed in
chapter five. What needs to be kept in mind, though, is that emotional engagement
may not only lead to heightened behavioural and cognitive engagement but also to
the complete opposite—disengagement (Newton & Newton, 2011).
Is it true that engagement is paramount for learning to occur, as some researchers
argue (Skinner, 2016)? What indeed makes student engagement a powerful construct
is its value for learning—especially as engagement is malleable,
dynamic and subject to influence from peers, teachers, interests, etc. If engagement
is prone to influence, then looking at the engagement construct from a feedback
perspective might not only be worthwhile but necessary. Giving feedback is one
aspect of teaching that teachers have to do constantly, but how learners react to it and
benefit from it is sometimes less clear-cut.
Many studies have researched learner engagement on the macro level, more
precisely school engagement and all its facets. I, however, look at the concept of
engagement at the micro level, exploring the construct in respect of written CF.
In order to gain a better understanding of this construct, the involvement of the
students’ voices appears to be indispensable. As Yazzie-Mintz and McCormick (2012)
state, research on student engagement should be about finding the humanity in the
data. In addition, engagement is seen as a dynamic system (see Lewis & Granic,
2000) where the chosen feedback method should be the first step in engaging students.
Student-teacher as well as teacher-student relationships have an impact on students’
engagement (Pianta, Hamre, & Allen, 2012), hence a positive atmosphere in which
to work might be crucial when engaging students. Regarding feedback, person-
alizing it to the individual student could be one way to engage students with their
written work (see Sect. 5.2.).
Engagement research would be incomplete if it did not include student engage-
ment with written CF. Many studies have been conducted on feedback, with
researchers disagreeing on its value (Bitchener & Knoch, 2009; Chandler, 2003; Ellis,
2012; Ferris, 1999; Hendrickson, 1980; Truscott, 1999), but providing feedback for
learners is still one of the major tasks of teachers everywhere. Hence, the three-
dimensional construct of engagement is a valuable way to work out how we can
effectively engage more students with written CF.
References

Abbott, S. (Ed.). (2014). The glossary of education reform. Retrieved from http://edglossary.org/
hidden-curriculum.
Ainley, M. (2012). Students’ interest and engagement in classroom activity. In S. L. Christenson,
A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 283–302).
New York: Springer.
Anderman, E. M., & Patrick, H. (2012). Achievement goal theory, conceptualization of
ability/intelligence, and classroom climate. In S. L. Christenson, A. L. Reschly, & C. Wylie
(Eds.), Handbook of research on student engagement (pp. 173–191). New York: Springer.
Appleton, J. J. (2012). Systems consultation: Developing the assessment-to-intervention link with
the student engagement instrument. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.),
Handbook of research on student engagement (pp. 725–741). New York: Springer.
Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and
psychological engagement: Validation of the student engagement instrument. Journal of School
Psychology, 44(5), 427–445.
Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school:
Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45,
369–386.
Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal
of College Student Personnel, 25(4), 297–308.
Astin, A. W. (1985). Involvement: The cornerstone of excellence. Change: The Magazine of Higher
Learning, 17(4), 35–39.
Astin, A. W. (1993). What matters in college? Four critical years revisited (Vol. 1). San Francisco:
Jossey-Bass.
Bempechat, J., & Shernoff, D. J. (2012). Parental influences on achievement motivation and student
engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on
student engagement (pp. 315–342). New York: Springer.
Betts, J. (2012). Issues and methods in the measurement of student engagement: Advancing the
construct through statistical modeling. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.),
Handbook of research on student engagement (pp. 783–803). New York: Springer.
Bitchener, J., & Knoch, U. (2009). The value of a focused approach to written corrective feedback.
ELT Journal, 63(3), 204–211.
Bloom, B. S., & Broder, L. J. (1958). Problem-solving processes of college students. Chicago:
University of Chicago Press.
Blumenfeld, P. C., Kempler, T. M., & Krajcik, J. S. (2006). Motivation and cognitive engagement in
learning environments. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences
(pp. 475–488). New York: Cambridge University Press.
Boekaerts, M., Pintrich, P. R., & Zeidner, M. (2005). Handbook of self-regulation. San Diego:
Academic Press.
Brooks, R., Brooks, S., & Goldstein, S. (2012). The power of mindsets: Nurturing engagement,
motivation, and resilience in students. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.),
Handbook of research on student engagement (pp. 541–562). New York: Springer.
Bryson, C. (2014a). Reflections, and considerations about the future of student engagement. In C.
Bryson (Ed.), Understanding and developing student engagement (pp. 231–240). The staff and
educational development series. London and New York: Routledge.
Bryson, C. (2014b). Clarifying the concept of student engagement. In C. Bryson (Ed.), Under-
standing and developing student engagement (pp. 1–22). The staff and educational development
series. London and New York: Routledge.
Bryson, C., & Hardy, C. (2011). Clarifying the concept of student engagement: A fruitful approach
to underpin policy and practice. Paper presented at the HEA Annual Conference, 5–6 July,
Nottingham.
Chandler, J. (2003). The efficacy of various kinds of error feedback for improvement in the accuracy
and fluency of L2 student writing. Journal of Second Language Writing, 12, 267–296.
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate
education. AAHE Bulletin, 39(7), 3–7.
Christenson, S. L., & Reschly, A. L. (2010). Check & connect: Enhancing school completion
through student engagement. In E. Doll & J. Charvat (Eds.), Handbook of prevention science
(pp. 327–348). Mahwah, NJ: Lawrence Erlbaum Associates Inc.
Cleary, T. J., & Zimmerman, B. J. (2012). A cyclical self-regulatory account of student engagement:
Theoretical foundations and applications. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.),
Handbook of research on student engagement (pp. 237–257). New York: Springer.
Connell, J. P. (1990). Context, self and action: A motivational analysis of self-system processes
across the life span. In D. Cicchetti & M. Beeghly (Eds.), The self in transition: Infancy to
childhood (pp. 61–97). Chicago: University of Chicago Press.
Connell, J. P., Spencer, M. B., & Aber, J. L. (1994). Educational risk and resilience in African-
American youth: Context, self, action, and outcomes in school. Child Development, 65, 493–506.
Connell, J. P., & Wellborn, J. G. (1991). Competence, autonomy and relatedness: A motivational
analysis of self-system processes. In M. R. Gunnar & L. A. Sroufe (Eds.), Self processes and
development (pp. 43–77). Hillsdale: Lawrence Erlbaum Associates Inc.
Crick, R. D. (2012). Deep engagement as a complex system: Identity, learning power and authentic
enquiry. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student
engagement (pp. 675–694). New York: Springer.
Darr, C. W. (2012). Measuring student engagement: The development of a scale for formative
use. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student
engagement (pp. 707–723). New York: Springer.
Davis, M. H., & McPartland, J. M. (2012). High school reform and student engagement. In S. L.
Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 515–539). New York: Springer.
Deci, E. L., & Flaste, R. (1995). Why we do what we do: Understanding self-motivation. New York:
Penguin.
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior.
New York and London: Plenum Press.
Deci, E. L., & Ryan, R. M. (2000). Intrinsic and extrinsic motivations: Classic definitions and new
directions. Contemporary Educational Psychology, 25, 54–67.
Deci, E. L., & Ryan, R. M. (2008a). Facilitating optimal motivation and psychological well-being
across life’s domains. Canadian Psychology, 49(1), 14–23.
Deci, E. L., & Ryan, R. M. (2008b). Self-determination theory: A macrotheory of human motivation,
development, and health. Canadian Psychology, 49(3), 182–185.
Dörnyei, Z. (1994). Understanding L2 motivation: On with the challenge! The Modern Language
Journal, 78(4), 515–523.
Dörnyei, Z., & Ushioda, E. (2011). Teaching and researching motivation (2nd ed.). Harlow: Pearson.
Dunne, E., & Derfel, O. (Eds.). (2013a). The student engagement handbook: Practice in higher
education. Bingley: Emerald.
Dunne, E., & Derfel, O. (2013b). Introduction. In E. Dunne & O. Derfel (Eds.), The student
engagement handbook: Practice in higher education (pp. xv–xxv). Bingley: Emerald.
Eccles, J., Adler, T. F., Futterman, R., Goff, S. B., Kaczala, C. M., Meece, J. L., & Midgley, C.
(1983). Expectancies, values, and academic behaviors. In J. T. Spence (Ed.), Achievement and
achievement motives (pp. 75–146). San Francisco: W. H. Freeman.
Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values and goals. Annual Review of
Psychology, 53(1), 109–132.
Egbert, J. (2015). Plenary: Engagement and practice in classroom learning, language and technology.
In T. Pattison (Ed.), IATEFL 2015: Manchester conference selections (pp. 64–72). Faversham,
Kent, UK: IATEFL 2016.
Elliot, A. J. (2005). A conceptual history of the achievement goal construct. In A. J. Elliot & C. S.
Dweck (Eds.), Handbook of competence and motivation (pp. 52–72). New York: Guildford.
Elliot, A. J., & McGregor, H. A. (2001). A 2×2 achievement goal framework. Journal of Personality
and Social Psychology, 80(3), 501–519.
Ellis, R. (2010). A framework for investigating oral and written corrective feedback. Studies in
Second Language Acquisition, 32, 335–349. Retrieved from https://www.cambridge.org/core.
Ellis, R. (2012). Language teaching research and language pedagogy. Chichester: Wiley.
Epstein, J. L., & McPartland, J. M. (1976). The concept and measurement of the quality of school
life. American Educational Research Journal, 13(1), 15–30.
Ferris, D. R. (1999). The case for grammar correction in L2 writing classes: A response to Truscott
(1996). Journal of Second Language Writing, 8(1), 1–11. Retrieved from http://www.sciencedi
rect.com/science/article/pii/S1060374399801106.
Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59(2), 117–142.
Finn, J. D., & Rock, D. A. (1997). Academic success among students at risk of school failure.
Journal of Applied Psychology, 82(2), 221–234.
Finn, J. D., & Voelkl, K. E. (1993). School characteristics related to student engagement. The
Journal of Negro Education, 62(3), 249–268.
Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter? In S.
L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 97–131). New York: Springer.
Firestone, W. A., & Rosenblum, S. (1988a). Building commitment in urban high schools.
Educational Evaluation and Policy Analysis, 10(4), 285–299.
Firestone, W. A., & Rosenblum, S. (1988b). The alienation and commitment of students and teachers
in Urban High Schools. Office of Educational Research and Improvement (ED), Washington.
Retrieved from http://files.eric.ed.gov/fulltext/ED294959.pdf.
Ford, M. E., & Nichols, C. W. (1987). A taxonomy of human goals and some possible applications.
In M. E. Ford & D. H. Ford (Eds.), Humans as self-constructing living systems: Putting the
framework to work (pp. 289–311). Hillsdale: Lawrence Erlbaum Associates, Inc.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the
concept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Fredricks, J. A., & McColskey, W. (2012). The measurement of student engagement: A comparative
analysis of various methods and student self-report instruments. In S. L. Christenson, A. L.
Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 763–782). New
York: Springer.
Furlong, M. J., & Christenson, S. L. (2008). Engaging students at school and with learning: A
relevant concept for all students. Psychology in the Schools, 45(5), 365–368.
Furlong, M. J., Whipple, A. D., St. Jean, G., Simental, J., Soliz, A., & Punthuna, S. (2003). Multiple
contexts of school engagement: Moving toward a unifying framework for educational research
and practice. The California School Psychologist, 8, 99–113.
Garrett, C. (2011). Defining, detecting, and promoting student engagement in college learning
environments. Transformative Dialogues: Teaching & Learning Journal, 5(2), 1–12.
Gettinger, M., & Walter, M. J. (2012). Classroom strategies to enhance academic engaged time. In S.
L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 653–673). New York: Springer.
Grant, H., & Dweck, C. S. (2003). Clarifying achievement goals and their impact. Journal of
Personality and Social Psychology, 85(3), 541–553.
Griffiths, A. J., Lilles, E., Furlong, M. J., & Sidhwa, J. (2012). The relations of adolescent student
engagement with troubling and high-risk behaviours. In S. L. Christenson, A. L. Reschly, & C.
Wylie (Eds.), Handbook of research on student engagement (pp. 563–584). New York: Springer.
Guthrie, J. T., & Anderson, E. (1999). Engagement in reading: Processes of motivated, strategic,
knowledgeable, social readers. In J. T. Guthrie & D. E. Alvermann (Eds.), Engaged reading:
Processes, practices, and policy implications (pp. 17–45). New York: Teachers College Press.
Guthrie, J. T., & Wigfield, A. (2000). Engagement and motivation in reading. In M. L. Kamil, P. B.
Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research (Vol. 3, pp. 403–422).
New York and London: Routledge.
Guthrie, J. T., Wigfield, A., & You, W. (2012). Instructional contexts for engagement and achieve-
ment in reading. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research
on student engagement (pp. 601–631). New York: Springer.
Han, Y., & Hyland, F. (2015). Exploring learner engagement with written corrective feedback in a
Chinese tertiary EFL classroom. Journal of Second Language Writing, 30, 31–44.
Han, Y., & Hyland, F. (2019). Learner engagement with written feedback: A sociocognitive
perspective. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts
and issues (2nd ed., pp. 247–264). Cambridge: Cambridge University Press.
Harackiewicz, J. M., Barron, K. E., Pintrich, P. R., Elliot, A. J., & Thrash, T. M. (2002). Revision of
achievement goal theory: Necessary and illuminating. Journal of Educational Psychology, 94(3),
638–645.
Hardy, C., & Bryson, C. (2010). Student engagement: Paradigm change or political expediency?
Networks, 9, 19–23.
Heckman, J. J., & LaFontaine, P. A. (2010). The American high school graduation rate: Trends and
levels. The Review of Economics and Statistics, 92, 244–262.
Hendrickson, J. M. (1980). The treatment of error in written work. The Modern Language Journal,
64(2), 216–221. Retrieved from http://www.jstor.org/stable/pdf/325306.pdf.
Hofer, M., & Fries, S. (2016). A multiple goals perspective. In K. R. Wentzel & D. B. Miele (Eds.),
Handbook of motivation at school (2nd ed., pp. 440–458). New York: Routledge.
Janosz, M. (2012). Part IV commentary: Outcomes of engagement and engagement as an outcome:
Some consensus, divergences, and unanswered questions. In S. L. Christenson, A. L. Reschly,
& C. Wylie (Eds.), Handbook of research on student engagement (pp. 695–703). New York:
Springer.
Jimerson, S. R., Campos, E., & Greif, J. L. (2003). Toward an understanding of definitions and
measures of school engagement and related terms. The California School Psychologist, 8(1),
7–27.
Johnson, M. K., Crosnoe, R., & Elder, G. H., Jr. (2001). Students’ attachment and academic
engagement: The role of race and ethnicity. Sociology of Education, 74(4), 318–340.
Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education,
38(5), 758–773.
Kaplan, A., & Patrick, H. (2016). Learning environments and motivation. In K. R. Wentzel & D. B.
Miele (Eds.), Handbook of motivation at school (2nd ed., pp. 251–274). New York: Routledge.
Karweit, N. (1989). Time and learning: A review. In R. E. Slavin (Ed.), School and classroom
organization (pp. 69–95). Hillsdale, NJ: Lawrence Erlbaum Associates.
Krause, K. (2005). Understanding and promoting student engagement in university learning commu-
nities. Paper presented as keynote address: ‘Engaged, inert or otherwise occupied? Deconstructing
the 21st century undergraduate student’, James Cook University, Queensland. Retrieved from
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1&rep=rep1&type=pdf.
Krenn, S. (2017). The students’ perception of their engagement in their EFL classes. Unpublished
Ph.D. thesis, University of Graz.
Kuh, G. D. (2001). The national survey of student engagement: Conceptual framework and
overview of psychometric properties (pp. 1–26). Bloomington, IN: Indiana University Center
for Postsecondary Research.
Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of
student engagement on first-year college grades and persistence. The Journal of Higher Education,
79(5), 540–563.
Kuh, G. D., Schuh, J. H., Whitt, E. J., & Associates. (1991). Involving colleges: Successful
approaches to fostering student learning and development outside the classroom. San Francisco:
Jossey-Bass.
Lam, S. F., Wong, B. P., Yang, H., & Liu, Y. (2012). Understanding student engagement with a
contextual model. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research
on student engagement (pp. 403–419). New York: Springer.
Lewis, M. D., & Granic, I. (Eds.). (2000). Emotion, development, and self-organization: Dynamic
systems approaches to emotional development. Cambridge: Cambridge University Press.
Mahatmya, D., Lohman, B. L., Matjasko, J. L., & Feldman Farb, A. (2012). Engagement across
developmental periods. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of
research on student engagement (pp. 45–63). New York: Springer.
Marcum, J. W. (2000). Out with motivation, in with engagement. Global Business and Organiza-
tional Excellence, 19(4), 57–60.
Maroco, J., Maroco, A. L., Campos, J. A. D. B., & Fredricks, J. A. (2016). University student’s
engagement: Development of the University Student Engagement Inventory (USEI). Psicologia:
Reflexão e Crítica, 29(1), 21. Retrieved from https://prc.springeropen.com/articles/10.1186/s41
155-016-0042-8.
Martin, A. J. (2012). Part II commentary: Motivation and engagement: Conceptual, operational, and
empirical clarity. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research
on student engagement (pp. 303–311). New York: Springer.
Middlecamp, C. H. (2005). The art of engagement. Peer Review, 7(2), 17–20.
Middleton, M. J., & Midgley, C. (1997). Avoiding the demonstration of lack of ability: An
underexplored aspect of goal theory. Journal of Educational Psychology, 89(4), 710–718.
Midgley, C. (2001). A goal theory perspective on the current status of middle level schools. In T.
Urdan & F. Pajares (Eds.), Adolescence and education (Vol. I, pp. 33–59). Greenwich: Information
Age Publishing.
Mosher, R., & MacGowan, B. (1985). Assessing student engagement in secondary schools: Alter-
native conceptions, strategies of assessing, and instruments. A Resource Paper for the University
of Wisconsin Research and Development Centre. Retrieved from http://files.eric.ed.gov/fulltext/
ED272812.pdf.
Moyer, A. (2014). Exceptional outcomes in L2 phonology: The critical factors of learner engagement
and self-regulation. Applied Linguistics, 35(4), 418–440. Retrieved from http://applij.oxfordjou
rnals.org.
Natriello, G. (1984). Problems in the evaluation of students and student disengagement from
secondary schools. Journal of Research and Development in Education, 17(4), 14–24.
Newmann, F. M. (1981). Reducing student alienation in high schools: Implications of theory.
Harvard Educational Review, 51, 546–564.
Newmann, F. M., Wehlage, G. G., & Lamborn, S. D. (1992). The significance and sources of
student engagement. In F. M. Newmann (Ed.), Student engagement and achievement in American
secondary schools (pp. 11–39). New York: Teachers College Press. Retrieved from http://files.
eric.ed.gov/fulltext/ED371047.pdf.
Newton, D. P., & Newton, L. D. (2011). Engaging science: Pre-service primary school teachers’
notions of engaging science lessons. International Journal of Science Mathematics Education,
9(2), 327–345.
Nichols, S. L., & Dawson, H. S. (2012). Assessment as a context for student engagement. In S. L.
Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 457–477). New York: Springer.
Niemiec, C. P., & Ryan, R. M. (2009). Autonomy, competence, and relatedness in the classroom:
Applying self-determination theory to educational practice. Theory and Research in Education,
7, 133–144.
Pace, C. R. (1984). Measuring the quality of college student experiences. Higher Education Research
Institute, Graduate School of Education, University of California, Los Angeles. Retrieved from
http://files.eric.ed.gov/fulltext/ED255099.pdf.
Pascarella, E. T., & Terenzini, P. T. (1991). How college affects students: Findings and insights from
twenty years of research. San Francisco: Jossey-Bass.
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research
(Vol. 2). San Francisco: Jossey-Bass.
Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S.
L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 259–282). New York: Springer.
Peterson, P. L., Swing, S. R., Stark, K. D., & Waas, G. A. (1984). Students’ cognitions and time on
task during mathematics instruction. American Educational Research Journal, 21(3), 487–515.
Pianta, R. C., Hamre, B. K., & Allen, J. P. (2012). Teacher-student relationships and engage-
ment: Conceptualizing, measuring, and improving the capacity of classroom interactions. In S.
L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 365–386). New York: Springer.
Pintrich, P. R. (1999). The role of motivation in promoting and sustaining self-regulated learning.
International Journal of Educational Research, 31(6), 459–470.
Pintrich, P. R. (2000). An achievement goal theory perspective on issues in motivation terminology,
theory, and research. Contemporary Educational Psychology, 25(1), 92–104.
Pittaway, S. (2012). Student and staff engagement: Developing an engagement framework in a
faculty of education. Australian Journal of Teacher Education, 37(4), 37–45.
Pittaway, S., & Moss, T. (2013). Student engagement in and through engagement. In E. Dunne & O.
Derfel (Eds.), The student engagement handbook: Practice in higher education (pp. 275–290).
Bingley: Emerald.
Polk, K., & Halferty, D. (1972). School cultures, adolescents commitment, and delinquency: A
preliminary study. In K. Polk & W. E. Schafer (Eds.), Schools and delinquency (pp. 70–90).
Englewood Cliffs: Prentice Hall.
Ratcliffe, A., & Dimmock, A. (2013). What does student engagement mean to students? In E. Dunne
& O. Derfel (Eds.), The student engagement handbook: Practice in higher education (pp. 59–76).
Bingley: Emerald.
Reeve, J. (2012). A self-determination theory perspective on student engagement. In S. L. Chris-
tenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 149–172). New York: Springer.
Reeve, J., & Halusic, M. (2009). How K-12 teachers can put self-determination theory principles
into practice. Theory and Research in Education, 7, 145–154.
Reeve, J., Jang, H., Carrell, D., Jeon, S., & Barch, J. (2004). Enhancing students’ engagement by
increasing teachers’ autonomy support. Motivation and Emotion, 28(2), 147–169.
Reeve, J., & Tseng, C. M. (2011). Agency as a fourth aspect of students’ engagement during learning
activities. Contemporary Educational Psychology, 36(4), 257–267.
Reschly, A., & Christenson, S. L. (2006). Prediction of dropout among students with mild disabili-
ties: A case for the inclusion of student engagement variables. Remedial and Special Education,
27(5), 276–292.
Reschly, A., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and
future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie
(Eds.), Handbook of research on student engagement (pp. 3–19). New York: Springer.
Rumberger, R. W. (1983). Dropping out of high school: The influence of race, sex, and family
background. American Educational Research Journal, 20(2), 199–220.
Rumberger, R. W., & Rotermund, S. (2012). The relationship between engagement and high school
dropout. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student
engagement (pp. 491–513). New York: Springer.
Russell, V., Ainley, M., & Frydenberg, E. (2005). Student motivation and engagement. Schooling
Issues Digest 2. Department of Education, Science and Training. Retrieved from http://web.arc
hive.org/web/20120721015202/http://www.dest.gov.au/sectors/school_education/publications_
resources/schooling_issues_digest/schooling_issues_digest_motivation_engagement.htm.
Ryan, R. M. (1995). Psychological needs and the facilitation of integrative process. Journal of
Personality, 63(3), 397–427.
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic
motivation, social development, and well-being. American Psychologist, 55(1), 68–78.
Ryan, R. M., & Deci, E. L. (2016). Facilitating and hindering motivation, learning, and well-being
in schools: Research and observations from self-determination theory. In K. R. Wentzel & D.
B. Miele (Eds.), Handbook of motivation at school (2nd ed., pp. 96–119). New York: Routledge.
Samuelsen, K. M. (2012). Part V commentary: Possible new directions in the measurement of student
engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on
student engagement (pp. 805–811). New York: Springer.
Schafer, W. E., & Polk, K. (1972). School conditions contributing to delinquency. In K. Polk & W.
E. Schafer (Eds.), Schools and delinquency (pp. 181–238). Englewood Cliffs: Prentice Hall.
Schunk, D. H., Meece, J. R., & Pintrich, P. R. (2013). Motivation in education: Theory, research,
and applications (4th ed.). Harlow: Pearson Education Limited.
Schunk, D. H., & Mullen, C. A. (2012). Self-efficacy as an engaged learner. In S. L. Christenson,
A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 219–235).
New York: Springer.
Senko, C. (2016). Achievement goal theory: A story of early promises, eventual discords, and future
possibilities. In K. R. Wentzel & D. B. Miele (Eds.), Handbook of motivation at school (2nd ed.,
pp. 75–95). New York: Routledge.
Sinclair, M. F., Christenson, S. L., Lehr, C. A., & Anderson, A. R. (2003). Facilitating student
engagement: Lessons learned from check & connect longitudinal studies. The California School
Psychologist, 8(1), 29–42.
Skinner, E. A. (2016). Engagement and disaffection. In K. R. Wentzel & D. B. Miele (Eds.),
Handbook of motivation at school (2nd ed., pp. 145–168). New York: Routledge.
Skinner, E. A., & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher
behaviour and student engagement across the school year. Journal of Educational Psychology,
85(4), 571–581.
Skinner, E. A., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping
and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of
research on student engagement (pp. 21–44). New York: Springer.
Skinner, E., Furrer, C., Marchand, G., & Kindermann, T. (2008). Engagement and disaffection in the
classroom: Part of a larger motivational dynamics? Journal of Educational Psychology, 100(4),
765–781.
Skinner, E., Kindermann, T. A., & Furrer, C. J. (2009). A motivational perspective on engagement
and disaffection: Conceptualization and assessment of children’s behavioral and emotional partic-
ipation in academic activities in the classroom. Educational and Psychological Measurement,
69(3), 493–525.
Soanes, C., & Stevenson, A. (Eds.). (2003). Oxford Dictionary of English (2nd ed.). Oxford: Oxford
University Press.
Solomonides, I. (2013). A relational and multidimensional model of student engagement. In E.
Dunne & O. Derfel (Eds.), The student engagement handbook: Practice in higher education
(pp. 43–58). Bingley: Emerald.
Sprinthall, N. A., & Collins, W. A. (1984). Adolescent psychology: A developmental view (2nd ed.,
1988). Crown Publishing Group/Random House.
Storch, N. (2008). Metatalk in a pair work activity: Level of engagement and implications for
language development. Language Awareness, 17(2), 95–114.
Storch, N., & Wigglesworth, G. (2010). Learners’ processing, uptake, and retention of corrective
feedback on writing: Case Studies. Studies in Second Language Acquisition, 32, 303–334.
Retrieved from https://www.researchgate.net/profile/Gillian_Wigglesworth/publication/232025
193_Learners_processing_uptake_and_retention_of_corrective_feedback_on_writing_Case_s
tudies/links/53d1f0150cf2a7fbb2e956d0.pdf.
Svalberg, A. M. L. (2009). Engagement with language: Interrogating a construct. Language
Awareness, 18(3), 242–258.
Taras, A. (2015). Teacher conceptualizations of student engagement: An exploratory study.
Unpublished MA thesis, University of Graz.
Trowler, V. (2010). Student engagement literature review. York: The Higher Education Academy.
Retrieved from https://www.heacademy.ac.uk/system/files/studentengagementliteraturereview_
1.pdf.
Trowler, V., & Trowler, P. (2010a). Framework for action: Enhancing student engagement at the
institutional level. Retrieved from http://www.heacademy.ac.uk/assets/documents/studentengag
ement/Frameworkforaction_institutional.pdf.
Trowler, V., & Trowler, P. (2010b). Student engagement evidence summary. Retrieved from http://
eprints.lancs.ac.uk/61680/1/Deliverable_2._Evidence_Summary._Nov_2010.pdf.
Truscott, J. (1999). The case for “The case against grammar correction in L2 writing
classes”: A response to Ferris. Journal of Second Language Writing, 8(2), 111–122.
Retrieved from http://www2.fl.nthu.edu.tw/old_site/FL2/faculty/John/The%20case%20for%
20the%20case%20against%201999.pdf.
Voelkl, K. E. (2012). School identification. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.),
Handbook of research on student engagement (pp. 193–218). New York: Springer.
Wentzel, K. R., & Wigfield, A. (Eds.). (2009). Handbook of motivation at school. New York:
Routledge.
Whitaker, T., Whitaker, B., & Lumpa, D. (2009). Motivating and inspiring teachers: The educational
leader’s guide for building staff morale (2nd ed.). Larchmont, NY: Eye on Education.
Wigfield, A., Eccles, J. S., Schiefele, U., Roeser, R. W., & Davis-Kean, P. (2006). Development
of achievement motivation. In E. Eisenberg, W. Damon, & R. M. Lerner (Eds.), Handbook of
child psychology: Vol. 3, social, emotional and personality development (6th ed., pp. 933–1002).
Hoboken: Wiley.
Wigfield, A., Guthrie, J. T., Perencevich, K. C., Taboada, A., Klauda, S. L., McRae, A., & Barbosa,
P. (2008). Role of reading engagement in mediating effects of reading comprehension instruction
on reading outcomes. Psychology in the Schools, 45(5), 432–445.
Wolters, C. A. (2004). Advancing achievement goal theory: Using goal structures and goal orien-
tations to predict students’ motivation, cognition, and achievement. Journal of Educational
Psychology, 96(2), 236–250.
Wolters, C. A., & Taylor, D. J. (2012). A self-regulated learning perspective on student engage-
ment. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student
engagement (pp. 635–651). New York: Springer.
Yamamoto, K., Thomas, E. C., & Karns, E. A. (1969). School-related attitudes in middle-school
age students. American Educational Research Journal, 6(2), 191–206.
Yazzie-Mintz, E. (2007). Voices of students on engagement: A report on the 2006 high school
survey of student engagement. Center for Evaluation and Education Policy, Indiana University.
Retrieved from http://files.eric.ed.gov/fulltext/ED495758.pdf.
Yazzie-Mintz, E., & McCormick, K. (2012). Finding the humanity in the data: Understanding,
measuring, and strengthening student engagement. In S. L. Christenson, A. L. Reschly, & C.
Wylie (Eds.), Handbook of research on student engagement (pp. 743–761). New York: Springer.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview.
Educational Psychologist, 25(1), 3–17.

Online Sources

Online 1: http://www.qaa.ac.uk/assuring-standards-and-quality/the-quality-code/quality-code-
part-b [23.03.2017].
Online 2: http://edglossary.org/student-engagement/ [30.11.2016].
Online 3: http://ceep.indiana.edu/nais/about/index.html [03.03.2017].
Online 4: http://nsse.indiana.edu/html/about.cfm [04.12.2016].
Chapter 3
A Short History of Written Corrective Feedback

This chapter is divided into two sections: First, I briefly outline the history of written corrective feedback, including types of errors and their role in written CF. Second, I describe various feedback methods teachers can choose from. These are of particular interest for Chap. 5, as many of them are part of the small-scale study I conducted to determine learners' perceptions of different kinds of written CF. Several researchers (Bitchener, 2019; Bitchener & Ferris, 2012;
Ferris, 2004; Guénette, 2007; Lyster & Saito, 2010; Storch, 2010) have pointed out
that studies on written corrective feedback are often difficult to compare due to
varying contexts, participants, the type of errors focused on, or whether the studies
were conducted in a classroom setting or in a laboratory. What many studies on
written CF lack is not arguments for its usefulness but information on which kind of
written feedback might be the most effective for learners as well as teachers. As a
consequence, further research might confirm that not one but a combination of various
kinds of feedback methods proves to be most effective. As Hattie and Timperley (2007) stress, more research is needed on how feedback—oral or written—works in
the classroom.
As mentioned in the introduction, focusing on a set of phenomena is important
when deciding which kind of feedback should be used for exploring feedback and
learner engagement. Hence the focus of this small-scale study is on written CF. This is because, on the one hand, teachers spend a great deal of time marking learners' work, arguably more than on any other area of teaching (Ferris & Hedgcock, 2011), and do not always seem to be satisfied with the results; on the other hand, as Hyland and Hyland (2006a, 2006b) note, research on second language (L2) acquisition has established that feedback is a central aspect of L2 classrooms, yet it has still not been discovered which kind of feedback works most effectively.
Before exploring the feedback aspect, it has to be kept in mind that L2 can refer
to learning a second, third, etc. language (see Ellis, 1997, 2008, 2015). Research
has established that the acquisition of your L1 (first language/mother tongue) differs
from L2 acquisition. As can be expected, the acquisition of your L1 mostly occurs in its natural setting, whereas L2 acquisition can take place in a natural setting or in the classroom, which naturally makes for some differences. When using the term 'L2 acquisition' it has to be noted that some researchers distinguish between second language and foreign language acquisition: they argue that second language acquisition takes place in a setting where, for example, English is a means of communication and is learned by people living in an English-speaking country who speak another first language. In comparison, foreign language acquisition mainly takes place in a classroom and the language plays no major role in the surrounding community (for a thorough discussion see Ellis, 2008, 2015). I agree with Ellis (2008) that second language acquisition refers to the acquisition of any language other than the mother tongue. It is true that the context is not the same when learning a language as a second language in the country where it is spoken on a daily basis, as opposed to a context where it is mainly spoken in the classroom. But the fact that the context is different does not imply that the process of acquiring the language is (see Ellis, 2015). Hence, when the term 'L2' is used in this study, it also includes the notion of 'foreign language'.
Further, some researchers (cf. Bitchener & Storch, 2016) also distinguish between
L2 development and L2 acquisition. According to them “L2 development is, arguably,
more about specific stages in the learning process. L2 acquisition can be understood
in terms of either the acquired end-product […] or the process of acquiring the L2
and, in this regard, is similar to the process of L2 learning and L2 development”
(p. 2). This is a useful distinction for written CF as well, as a written assignment in the school context can be seen either as an end-product or, as is more popular nowadays, as a process, which calls for a process approach to writing (see Lee, 2014). Producing multiple drafts of one assignment helps the learner to work not only on his or her linguistic errors, but also on paragraphing and content, to name just a few areas, and hence offers plenty of opportunities to reflect on and respond to feedback.

3.1 Terminology

As Bitchener and Storch (2016, p. 1) state, "written CF is a written response to a linguistic error [and it] seeks to either correct the incorrect usage or provide information about [the error.]" This is a precise definition, but in the teaching context corrective feedback has been in use for several decades and has proven to be a very complex process, as it "varies from teacher to teacher depending on such factors as the broader
institutional context, the kind of instructional activity they are engaged in, and the
teachers’ assessment of the relevance of correction to the particular learners they are
teaching” (Ellis, 2012, p. 12). But why does it vary so much? Not only are teachers
influenced by the educational setting in which they are teaching, but also by their own
beliefs about corrective feedback and its usefulness. Like researchers, teachers too
have preferences, and due to the many feedback methods available they can choose
one that caters to the needs of their learners in their specific contexts (Hyland &
Hyland, 2006b; Brown, 2012). When consulting the relevant literature on written feedback in L2 learning to decide which method to adopt, teachers will come across
an extensive body of work and will encounter numerous positions that have shifted through the decades. Not surprisingly, researchers have questioned the
value of feedback in general (Freedman, Greenleaf, & Sperling, 1987), error correc-
tion specifically, and some even found it to be ineffective (see e.g., Hendrickson,
1980; Truscott, 1996, 1999, 2007). Truscott (1999), for example, “emphasized that
although most L2 students clearly desire grammar correction, teachers should not
give it to them” (Liu, 2008, p. 66). Truscott’s viewpoint has not only sparked a heated
debate among researchers, but has also shifted the focus to providing a theoretical
framework for written CF. Many studies at the advent of research on the effectiveness of written CF were conducted by teacher researchers and based on the assumption that written CF can facilitate the learning of the target language, but this hypothesis lacked actual evidence (Bitchener, 2019). As a consequence,
researchers looked at existing theories and analysed how they could aid research on
the effectiveness of written CF and many studies now provide evidence of written CF
being helpful to acquire target language features (Ferris & Kurzer, 2019). In contrast
to this, Truscott even concluded in his research that sometimes corrective feedback
can be detrimental, hence should not be given. Based on several studies he claimed
that there is no evidence that corrective feedback is effective, as it cannot support
L2 competence and teachers’ feedback creates all sorts of issues, e.g., anxiety on the
students’ part (cf. Sheen, 2011). His view has been challenged by Ferris (1999) and
also by other studies which showed that written CF can be useful (e.g., Bitchener,
2012; Bitchener & Ferris, 2012; Bitchener & Knoch, 2009a; Chandler, 2003; Ellis,
2012). How should learners acquire target language structures when they are not
supported along the way to achieve their goal? When you learn your L1, you also
make errors which will be corrected by parents, teachers, etc. in order to acquire the
desired language structures. Why should that, then, not be desirable in the L2? Unde-
niably, Truscott’s view on written CF has sparked many debates among researchers,
which contributed to research in key areas, such as if focused CF is more beneficial
than unfocused CF, whether or not explicit types of written CF are more effective
than implicit types, or the need to look into individual and contextual factors when
providing feedback.
The term ‘corrective feedback’ has been used by interactionist-cognitive theories
of L2 acquisition (Ellis, 2012) and refers to directly correcting an error. How this is
accomplished can vary due to the feedback method used. Conversational analysts,
who focus more on social and contextual aspects of language use—which Firth and
Wagner (1997) claim has been neglected by SLA—prefer the term ‘repair’, where not
a single error but a longer chunk is analysed (Ellis, 2012). As outlined by Ellis (2010),
there has been more research conducted on oral than written corrective feedback.
In studies on oral feedback, opinions on how corrective feedback should be given
differ regarding the theoretical background: theories based on Chomsky’s Universal
Grammar claim that negative corrective feedback has no place in L2 learning, whereas
cognitive interactionist theories stress that it can aid learners to acquire target-like
structures. Sociocultural theories, however, are of the opinion that there is no single
type of corrective feedback that is best for learning, because individual learners’
needs have to be taken into account (cf. Ellis, 2010 for a more thorough discussion).
No matter which approach is taken to written CF, there seems to be no consensus among researchers as to which kind might be effective or even what should be corrected. A teacher, therefore, must decide which errors to correct, when to correct them and how to do so—and should the teacher do the correcting, or the students? Hendrickson (1978) concluded that research on corrective feedback needs
to answer five questions:
(1) Should learner errors be corrected?
(2) If so, when should learner errors be corrected?
(3) Which learner errors should be corrected?
(4) How should learner errors be corrected?
(5) Who should correct learner errors? (Hendrickson, 1978, p. 389)
This shows how complex written CF is and how many decisions teachers need
to make before providing learner feedback. Not only is the timing of correction crucial, but a teacher also needs to decide whether to give focused or unfocused feedback (see Sect. 3.3.5), which feedback method supports learners in their learning process, and whether the teacher should correct the learners or whether learners should correct each other. No matter who the feedback provider is, there are
three questions that should be answered by the teacher as well as the learner:
(1) Where am I going? (What are the goals?)
(2) How am I going? (What progress is being made towards the goal?)
(3) Where to next? (What activities need to be undertaken to make better progress?)
(Hattie & Timperley, 2007, p. 86)
I firmly believe that regardless of the feedback method and who provides the
feedback, both parties need to ask themselves the aforementioned questions. The
teacher needs to justify why he or she uses a certain feedback method and whether progress
is being made on the part of the learners. Every now and then the feedback method
should be critically evaluated and if necessary adjusted to make sure that the learners
benefit from it. Likewise, learners should ask themselves what the purpose of the
feedback they receive is and whether they can work with it or not. If no progress is made, they should question their strategies and seek a dialogue with their teacher to work
out what they might need to change to be more successful.
Cognitivist theories emphasise the importance of correcting errors for language acquisition, as it might help learners to recognise specific erroneous language patterns, which could then be avoided over time. In contrast, sociocultural theorists view learning as participation rather than acquisition; corrective feedback is also desirable here, but they acknowledge that it might not work for all learners in the same way, as the corrected form might not be understood due to a lack of comprehension on the learner's part (see Ellis, 2012 for a more thorough discussion).
What is the purpose of feedback then and why do students need it? According to
Winne and Butler (1994) “feedback is information with which a learner can confirm,
add to, overwrite, tune, or restructure information in memory, whether that informa-
tion is domain knowledge, meta-cognitive knowledge, beliefs about self and tasks, or
cognitive tactics and strategies" (p. 5740, as cited in Hattie & Timperley, 2007, p. 82). This definition shows that learners already have ideas, concepts, etc. about
language in their minds, because they have already acquired their L1. Feedback learners get in the L2, for example, helps them to establish that their language forms are either correct or need to be repaired, or it confronts them with a completely new form or structure they can add to their existing knowledge. Looking at feedback from this perspective, written CF
seems to be useful to guide learners and provide them with opportunities to improve
their knowledge of the second language. Consequently, it is even more important
to choose the feedback method wisely and challenge its effectiveness on a regular
basis.

3.2 The Question of Errors

Unsurprisingly, when learning a foreign language, making errors is an essential part of achieving L2 competence. Are L2 errors, then, comparable to errors chil-
dren make when acquiring their L1? Or are L2 errors just a figment of teachers’
imaginations when correcting learners’ work with finding errors in mind? In the
1960s Brooks (1960, p. 58) claimed “[l]ike sin, error is to be avoided and its influ-
ence overcome, but its presence is to be expected” (as cited in Hendrickson, 1978,
p. 387). This is a rather strong statement to which researchers and teachers nowadays (as well as in the past) might object. At the core of it is one assumption, though:
errors exist, made by native and non-native speakers alike. Naturally, teachers are
very likely to have one goal: to reduce these errors. How teachers go about it and
how much emphasis they put on correcting errors might have an influence on the
atmosphere in the classroom and the learners’ willingness to engage with corrective
feedback. As Hattie (2012) stresses “[e]rrors invite opportunity. They should not be
seen as embarrassments, signs of failure or something to be avoided […] they are
signs of opportunities to learn and they are to be embraced” (p. 124). Exactly that
sentiment should be in a teacher’s mind when working with learners on reducing
their linguistic errors in the foreign language.
Nevertheless, definitions of the term 'error' are manifold and, depending on which one a teacher prefers, errors can be seen from two perspectives. Using Brooks'
definition, one will view them as incorrectness in the target language that needs to be
avoided at all costs. If, for example, one prefers Ferris’ (2011) definition of errors—
“morphological, syntactic and lexical forms that deviate from rules of the target
language, violating the expectations of literate adult native speakers” (p. 3)—they
will be seen as a developmental stage in the learners’ L2 acquisition showing the gap
between existing knowledge and the target language structure that needs to be worked
on (see Bitchener & Ferris, 2012 for an extensive discussion). These gaps can be caused not only by a lack of L2 knowledge, but also by transfer of L1 structures which do not work in the L2. These issues as well as differing views among researchers
(avoidance vs. developmental stage), once again, have an impact on the teacher’s
attitude towards error correction and certainly have an influence on which type of
feedback will be implemented in the classroom.
3.2.1 Systematic and Unsystematic Errors

What kind of errors can learners make? Corder (1967) claims that errors can occur in
two forms: systematic and unsystematic (or non-systematic). He elaborates further
that systematic errors can be linked with competence, where a certain form, for
example, has not been fully acquired. In comparison, unsystematic errors can be
attributed to performance (e.g., in conversations), where the learner can easily correct
unsystematic errors him- or herself. To make a more precise distinction, he proposed
the term ‘error’ should be used for systematic errors and the term ‘mistakes’ for
unsystematic errors. In other words, Corder distinguishes “between ‘errors’ resulting
from gaps in learners’ L2 knowledge and ‘mistakes’ due to lapses of concentration”
(Ellis, 2012, p. 136; see also Ellis, 1997). Is it easy then for a teacher to distinguish
between an error and a mistake? I would argue not—take spelling mistakes as an
example. It is true that these can sometimes be attributed to lapses in concentration and, when pointed out to the learner, might be easily corrected by him or her. But
if the same mistakes occur several times, then these are spelling errors as they are
systematic. If this is the case, the next step for a teacher in order to aid the learner
to be aware of and/or overcome these errors would be to determine whether or
not the learner suffers from a learning disability like dyslexia, because traditional
means of written CF might not be sufficient. This shows that feedback needs a lot of
consideration on the teachers’ part for it to be effective for learners.

3.2.2 Treatable and Untreatable Errors

Ferris (1999), in her response to Truscott's (1996) claim that grammar correction is ineffective, identified 'treatable' and 'untreatable' errors in respect of feedback. 'Treatable' errors are, for example, subject-verb agreement errors, missing articles, etc. that can be amended by the learners. As 'untreatable' errors she described errors such as lexical and syntax errors (including word order problems), which are much harder (or sometimes even impossible) for the learners to correct themselves, because they cannot simply consult a rule book, dictionary, etc. to correct their errors. As a
consequence, some researchers, for example Bitchener (2008) and Sheen (2007),
conclude that focused error correction should be favoured in written CF—a view
which might not be shared by all researchers (see Sect. 3.3.5). One aspect highlighted by Ferris' distinction between 'treatable' and 'untreatable' errors is that feedback is very complex and that a lot of work is required on the learner's and the teacher's part if feedback is to aid language proficiency. To accomplish this, a teacher needs to decide which
kind of input is needed to foster that.
Although some researchers stress that focused error correction should be favoured,
the participants of my small-scale study disagreed, as they wanted to know about all
their errors. What definitely aided their cognitive engagement with written CF was the colour code (see Sect. 5.2.4.1), which was used as part of their feedback method.
Consequently, the learners realised in which category (grammar, spelling, expression, etc.) they made the most errors. True, most of them were not able to correct all their errors, but they were more aware of their treatable and untreatable errors and were thus able to avoid some of them in the written exam.

3.2.3 Positive and Negative Evidence

According to Gass (1997, 2003) there are two types of input with which the teacher
can provide learners when giving feedback: positive and negative evidence. Positive
evidence includes informing learners about forms that are acceptable in the L2, whereas negative evidence informs learners about the incorrectness of their utterances, very likely in the form of corrective feedback. As Li (2010) points out, some
researchers claim that only positive evidence is needed (e.g., Krashen, 1981; Truscott,
2007), hence negative evidence should be avoided, and learners should be exposed
to as much positive evidence as possible. Others, however, disagree and believe that
both positive and negative evidence is needed to acquire the target language (Swain,
1985).
Hardly anyone would question that positive evidence is needed to help learners
acquire the L2, but negative evidence will always be a part of it too as learners produce
forms that deviate from the target language. Instead of abandoning negative evidence,
it should be embraced and used to explain why a particular form does not work in the L2. I believe that learners can greatly benefit from such an approach, because it raises their language awareness, which might help them to spot non-target forms themselves.

3.3 Providing Feedback

According to Sigott (2013), teaching always follows three stages: the teacher provides
input, the learners use or produce language and then receive feedback on their perfor-
mance and/or make use of the information given. Gass (1997), on the other hand, states in her computational framework that there are five key stages. Applied to written CF, these five stages are: (1) attending to feedback, (2) processing and working with feedback, (3) analysing feedback and comparing it with the actual written text, (4) integrating feedback, and (5) producing new written text. In addi-
tion to that, teachers first need to decide which type of written CF to choose. Ellis
(2009, p. 97) points out there are “various options (both familiar and less familiar)
for correcting students’ written work.” Hence, before investigating learners’ written
work the teacher needs to have a clear concept of the kind of written feedback to
be used in the classroom setting in mind. Will learners benefit more from direct
(i.e. providing the correct answer), indirect or metalinguistic corrective feedback?
Or should electronic feedback as well as reformulation be chosen (see Bitchener &
Storch, 2016; Ellis, 2009)? Moreover, peer correction also needs to be considered as one of the feedback options, as it "represents a move away from the teacher-centred
classroom and promotes learner-centredness” (Meštrović Štajduhar, 2013, p. 87).
A rather controversial view in this respect is that “research has found correction to
be a clear and dramatic failure” (Truscott, 2007, p. 271). Supporters of no grammar
correction will welcome this notion; those in favour of it probably less so. One could conclude from Truscott's statement that he rejects correction altogether. But this is not entirely true, as he had already made clear in his (1999) response to Ferris.
In his opinion many studies have proven that grammar correction is ineffective, but
he acknowledges that there could be instances when it might prove to be effective.
Feedback on content, for example, is an area which Truscott does not dismiss as
being ineffective. His claim that grammar correction does not improve accuracy in
student writing is questionable, though, as most studies on it are not longitudinal. To
make such a claim, would it not need a study where learners are investigated over a
very long period—years even? Arguably, this might be difficult to achieve, but this
is exactly what would be needed before being in a position to declare an approach
ineffective. Due to the differing viewpoints among researchers, it is not surprising
that teachers have varying opinions on the effectiveness of feedback too.
Consequently, one essential question teachers need to ask themselves is: What is
meant by feedback? Is it teachers giving feedback and learners responding to it? Is it
different in the classroom context compared to other contexts? Or is it learners inter-
acting with their teachers? As can be expected, providing feedback, either written
or oral, is one of the manifold tools teachers use in their teaching. Undeniably,
written feedback is an essential part of teachers’ lives and the literature also points
out that many researchers have focused on various feedback types. Hence, practi-
tioners can draw on a variety of different methods, but establishing which mode
of written feedback might be engaging for learners has yet to be investigated. Ellis
(2012) emphasises that an ongoing “debate has raged regarding the value of written
corrective feedback in helping learners achieve greater accuracy in their writing and
also the need to investigate such issues on different populations of instructed learn-
ers” (p. 3). One feedback method might work perfectly for one population but not
the other, as not only the teaching environment needs to be taken into account, but
also the attitude towards learning within a certain culture might play a crucial role.
It needs to be investigated as to whether or not it is possible to establish a written
feedback method that works most effectively or if that notion will remain a myth as
too many factors influence learners’ engagement with feedback.
Another dilemma teachers constantly face in their teaching profession, as
Widdowson (2003) stresses, is that
[t]here is no shortage of people recommending what language teachers should do, whether
they call themselves methodologists, teacher trainers, or applied linguists, whether they base
their recommendations on practical experience, empirical evidence or theoretical expertise.
But they are in no position to recommend particular courses of action though they can, of
course, point out the possibilities it might be profitable to explore. (Widdowson, 2003, p. 15)

This implies that although there are endless recommendations as to what teachers
ought to do, in the end it is the teacher’s own decision about which recommendation
is implemented in his or her teaching. In other words, it is a rather difficult endeavour, as research on feedback covers so many different aspects. To name just a few, these include praise (e.g., Mercer & Ryan, 2013) and attributional (behavioural) feedback, where feedback is connected to either the learners' ability or their effort (Brophy & Evertson, 1976; Burnett, 2003; Dohrn & Bryan, 1994; Mueller & Dweck, 1998). In another study, high-quality feedback has been described as a dialogue between the teacher and his or her learners, in which they not only get the correct answer but also strategies to tackle incorrect linguistic forms on their own (see Brophy, 1986). Other studies show that the highest effects of feedback occur when learners receive feedback on a task together with direction on how to improve their performance of it (see Hattie & Timperley, 2007). No matter which feedback method is adopted, learners might be able to perform at higher levels when teachers interact with them and strive for deep rather than shallow engagement. Feedback needs to be about improvement and how to accomplish this goal (cf. Evans, Hartshorn, McCollum, & Wolfersberger, 2010): not only pointing out the negative aspects or what has gone wrong, but also offering possible solutions (cf. Hills, 2016).

3.3.1 Formative and Summative

Feedback can basically be used for two different functions: formative or summative.
Scriven (1967) coined these terms and distinguished formative, where the focus is
on learning processes and how to support learners to improve their performance,
from summative, which concentrates on assessing learners’ performances, e.g., tests
to grade students (cf. Hyland & Hyland, 2006b). The distinction between forma-
tive and summative in research is very often found in connection with assessment.
Sadler (1989) reports that formative assessment is usually used to help learners
improve their competence. In contrast to that, summative assessment generally sums
up the learners’ achievement at the end of a course, school year, etc. Whereas forma-
tive assessment should have an impact on learning, summative assessment does not
immediately have one, but can have a delayed effect on the learners’ personal and
educational decisions (cf. ibid.). Although the importance of formative feedback has been stressed by many researchers and it is a practice that should be implemented in teaching, I disagree with Hyland and Hyland's (2006a, p. 83) claim that summative feedback "has generally been replaced or supplemented by formative feedback".
Not only would an extensive study be needed to verify that claim, but often with
very large class sizes and increasing workloads for teachers it seems to be inevitable
that summative feedback will be used extensively as well (cf. Campbell & Schumm
Fauster, 2013).
In classroom practice, both formative and summative forms of feedback are used. Formative feedback should be non-evaluative; hence the learner should be provided with information, for example on his or her writing assignments, to help improve these (cf. Juwah et al., 2004; Shute, 2008). As Shute (2008) points out, an interesting
distinction of formative feedback was mentioned by Black and Wiliam (1998): directive and facilitating feedback. The former explicitly states what needs to be corrected, while the latter indicates, through suggestions or comments, what needs to be done (cf. Shute, 2008). This stresses the complexity of feedback, because so many factors come into play before feedback becomes useful for learners. Like many other areas of feedback, formative feedback can take many forms, and teachers need to decide which form is most suitable for their teaching context and their learners. Put more simply, when providing feedback on writing, the purpose of the task decides which approach to take. For homework (e.g., articles, essays, reports), where the focus might be more on the process of writing, formative feedback that aids students in mastering different writing tasks might prove more effective, whereas when grading students' written performance, a focus on the product is inevitable, and summative feedback will very likely be used more often.

3.3.2 Indirect and Direct

A common distinction in written corrective feedback is between direct (e.g., the correct form is given) and indirect (e.g., the incorrect form is indicated, but no correction provided) feedback,
but researchers have used these terms differently. Hendrickson (1978, 1980), for
example, uses the terms ‘indirect correction treatment’, which means indication or
exact location of an error, but no explanation, and ‘direct correction treatment’ where
in addition to indication or location clues or tips are provided to help learners to self-
correct their errors. To give some examples, direct feedback can be crossing out the
wrong form, providing the correct form, adding items that were omitted, or including
metalinguistic feedback, explaining the type of error and giving an example (see
Sect. 5.1.2). Indirect feedback includes highlighting, underlining, circling or only
indicating the number of errors in the margin with a check, for example (cf. Haswell, 1983). In other words, indirect feedback also varies in its directness, as in the latter
example the learner needs to find the errors him- or herself whereas the other forms
mentioned locate the error.
Ferris, Liu, Sinha, and Senna (2013) point out that indirect feedback (e.g.,
Hendrickson, 1980; Lalande, 1982) might be more valuable for writing develop-
ment. Learners focused on self-correcting their errors are engaged cognitively with
the feedback they have been given. Evaluating and re-evaluating their written text,
helps them to recognise certain error patterns and deepens their existing knowledge
of the foreign language. Direct feedback (e.g., Bitchener & Knoch, 2010), on the
other hand, might be more useful for language acquisition. Essential is the learner’s
deep engagement with his or her errors, because only then acquisition can take place.
What is more, the proficiency level of the learner is another aspect that definitely
needs to be part of the equation. Direct feedback, for example, might be neces-
sary for learners with a lower proficiency level as they might lack the necessary
language skills to work with indirect feedback (see Ellis, 2009). This implies that
teachers need to be aware of this difference when providing learner feedback. The teacher has to think about the learners' level as well as the purpose of the feedback, that is, whether it should aid the development of their writing skills or their language acquisition. Writing skills especially need training, as it is
crucial that learners are familiar with elements of language, such as spelling, word
choice and the mechanics of writing, as well as planning and organising a written
text (Richards & Renandya, 2002; Sadeghi & Mosalli, 2013; Seow, 2002). Providing
feedback is a complex issue, as so many factors foster or hinder learners’ engagement
with written CF. Consequently, the teacher’s role in feedback is a crucial one in so
many ways.

3.3.3 Implicit and Explicit

Research on oral corrective feedback is more extensive than on written corrective feedback, but more and more studies on the effectiveness of written corrective feedback are being conducted (see Ferris et al., 2013 for a detailed discussion). The distinction between implicit and explicit feedback can be found in oral corrective feedback, where two distinctions are drawn: input-providing (e.g., recasts) versus output-pushing feedback (where the aim is the learner's self-correction) on the one hand, and implicit (e.g., reformulation of an utterance, with the focus on meaning) versus explicit (e.g., explicit correction or metalinguistic feedback) feedback on the other (cf. Ellis, 2010; Ellis, Loewen, & Erlam, 2006).
Ellis (2010) argues that any kind of written CF is explicit as no matter which method is
being used, the learner notices immediately that parts of his or her writing are not quite
right. If one sees written CF as an explicit act, then a distinction between implicit-
explicit (e.g., colour code) and explicit-explicit (e.g., metalinguistic) feedback might
be necessary. One crucial factor to keep in mind, no matter which kind of feedback is provided, is that the fact that an error was once pointed out to the learner does not mean he or she can produce the correct linguistic form (Hyland & Hyland, 2006a). For acquisition to take place, many repetitions and revisions are necessary so that the information is stored in long-term memory and the target linguistic forms can be produced correctly.

3.3.4 Metalinguistic Feedback

Metalinguistic feedback includes explaining errors to learners and can be delivered in two forms (for a thorough discussion see Ellis, 2009). First, the most commonly
used form is error codes either in the text (e.g., located above the error) or in the
margin, where learners need to find the error themselves. How explicit these codes
should be is still an issue for debate. Very often rather broad definitions are used, for
example, article or tense. The question is whether or not these broad definitions of
codes are sufficient for learners or if they need more exact clues (e.g., definite and
indefinite article, past tense form needed). Consequently, the degree of explicitness
depends on the proficiency level.
Second, the other type of metalinguistic feedback is metalinguistic explanations.
In this case the learners’ errors are explained (for an example see Sect. 5.1.2). This
form of feedback seems to be less common as a feedback method as it is much
more time-consuming than error codes (cf. Ellis, 2009). Nevertheless, if learners do
not understand why their linguistic form is incorrect, a metalinguistic explanation
might be what is needed to help them improve their mastery of the L2. It can be
easily implemented within an existing feedback method, especially when learners
make recurring errors. In this case metalinguistic explanations help learners better
understand the underlying problem of the error.

3.3.5 Focused vs. Unfocused

Teachers can decide to use either focused error correction, where they only indicate
and/or correct two or three categories of errors, or unfocused, where all errors are
marked (see Van Beuningen, 2010). Researchers disagree on which of the two is
more useful in respect of written CF. Lee (2005), in her study on learners' beliefs about error correction, reported that 82.9% of the participants preferred unfocused error correction because they wanted to know about their errors and thought this might help them to avoid them. Amrhein and Nassaji (2010), in their study on what learners and teachers prefer regarding written CF, found that 93.9% of the learners chose 'mark all errors', showing that they would like to know what went wrong in their writing; the majority (77.8%) wanted to use the feedback as a learning tool. This indicates that unfocused error correction seems to be valuable when learning and improvement are at the centre of attention.
On the other hand, Bitchener (2008) and Sheen (2007), for example, concluded in
their respective studies that for certain grammatical structures, focused error correc-
tion is more beneficial than unfocused and should therefore be used. It might be true that learners would welcome focused error correction concentrating on only two or three types of errors. That way, progress in these areas might be more likely, as learners pay more attention to these forms. What about the other error categories, though? In the Austrian educational setting, for example, when writing an exam, learners are judged on their overall performance and not only on two or three error categories. The same is true when they are applying for jobs, for instance. A lack of accuracy could lead to their rejection in certain areas. In other words, teachers need to decide when to use focused or unfocused error correction to prepare their learners for the next step in their professional lives.

3.3.6 Focus on Form vs. Focus on Content

Another issue teachers need to keep in mind when providing feedback on student
writing is whether to focus on content or on form. But can content and form be
seen separately? Or can one not survive without the other? Which area to focus on
when correcting learners’ written texts has been a debate for decades (cf. Fathman
& Whalley, 1990) and it is very unlikely that it is going to be resolved in the near
future. One can argue that it might not need to be resolved, because the purpose determines whether to focus on form or content. I believe that these two areas
complement each other and when giving feedback both areas should be considered
(cf. Hyland & Hyland, 2006b). If the aim is linguistic accuracy, then focus on form
is inevitable. If it is well-structured arguments, a focus on content is definitely more
suitable. I believe it is the teacher’s responsibility to make an informed decision
about which form is more appropriate for the different writing tasks when correcting
student writing.

3.3.7 Effective Feedback

Feedback should guide learners in their learning, so that they can eventually take
the next step to improve on their own (cf. Black & Wiliam, 1998; Sadler, 1989).
Ramaprasad (1983), from a management theory perspective, came up with the defi-
nition that feedback “is information about the gap between the actual level and the
reference level of a system parameter which is used to alter the gap in some way”
(p. 4). Based on Ramaprasad’s definition of feedback, Sadler (1989) promotes three
conditions for feedback to be effective for the learner:
(1) possess a concept of the standard (or goal, or reference level) being aimed for;
(2) compare the actual (or current) level of performance with the standard;
(3) engage in appropriate action which leads to some closure of the gap. (Sadler,
1989, p. 121)
This is a lot for the learner to grasp in order to get the most out of feedback. Hence, which kind of feedback is effective in altering the gap? Naturally, the kind of feedback given also varies according to the context. Learners at secondary level need a different kind of feedback on their writing than learners at tertiary level, as the purpose is different—writing a blog post compared to writing a term paper, for instance. What some researchers say is lacking at the tertiary level is a cycle of feedback and revision, where drafts can be produced, learners then receive formative feedback, edit their draft and resubmit it (Carless, 2007; Taras, 2006; Weaver, 2006). This way, learners get the opportunity to engage with their writing and receive sufficient feedback to be able to achieve the standard for writing a paper.
This idea of producing drafts and giving learners the chance to improve their
pieces of writing should also be implemented at secondary level, because learners
would be encouraged to work on their writing skills. For that reason, guidelines for
providing learner feedback should be established. Drawing on Sadler’s aforemen-
tioned conditions, I strongly propose Giving-Feedback-Guidelines where learners
are informed about:
(1) purpose of the feedback method;
(2) how the feedback method works;
(3) strategies to engage with it.
These could ensure that learners understand the feedback they get. I strongly
argue that for feedback to be effective, the Giving-Feedback-Guidelines for learner
feedback need to be in place. First, learners need to know why their teacher has
decided to use a certain feedback method. This way, they can relate to the feedback method more easily, and the chances that they will be willing to engage with it are heightened. Second, for deep engagement to happen, learners need to know how the feedback method works. Investing time to explain it to them thoroughly ensures that learners can benefit from it. Finally, strategies for working with it need to be made explicit, because teachers cannot expect their learners to know how to work effectively with the feedback they get. Cohen (1987), for example, reported that learners had difficulties dealing with feedback because they only had limited strategies for working with it. On the other hand, Ferris (1995a) showed that in her study the majority of research participants used all kinds of strategies to work with the feedback they got. Obviously, several factors can influence the use of strategies, ranging from the educational context and the strategies learners have already encountered to the way feedback is given, all of which might impact on their ability to work with feedback. In other
words, the Giving-Feedback-Guidelines might be a way forward in the feedback
debate.

3.4 Types of Feedback

There are many types of feedback a teacher can rely on, but an essential factor that
accompanies this is whether learners are actually able to deal with written feedback.
As established, learners might need strategies for processing these different kinds
of feedback (cf. Cohen, 1987). Therefore, if a teacher wants his or her learners to
succeed in working with written feedback, he or she needs to provide them with
strategies to do so. When deciding which type of feedback to use, this aspect needs to be taken into consideration, as the feedback method should help the learner to progress in his or her proficiency in the L2.
Another factor to consider before choosing a feedback type is the learners’ reac-
tion to it. Cohen (1987) reported in his study on how learners respond to teacher feedback that those learners claiming to be poorer learners are the ones who appear
to implement only a few strategies to work with teacher feedback. Moreover, “stu-
dents who were perhaps in greatest need of input from the teacher—namely, those
who rated themselves poorer learners—were also least likely to read over their papers
and attend to corrections” (p. 66). How can teachers then help these learners?
Drawing on my own experience as a language teacher, I have noticed that not only
the feedback method but also the learners’ motivation to work with feedback they
get is an important issue within the classroom setting. Some learners are keener on
doing and correcting their written homework, while others simply refuse to do any kind of written assignment. Which type of feedback, then, could help learners to engage?

3.4.1 Error Correction

One of the most common explicit forms of written corrective feedback is error correc-
tion—at least in the Austrian educational system. It is said to be a fast way of
responding to learners’ texts and immediately shows learners what went wrong as
their errors are located and the correct version is given. Research has shown that direct
error correction on specific linguistic forms (e.g., English articles) can be effective
(Bitchener, 2008; Bitchener & Knoch, 2008; Sheen, 2007), and under certain circum-
stances “busy teachers should not necessarily feel that they need to go beyond this
approach” (Bitchener & Knoch, 2009b, p. 328).
Generally speaking, error correction is a good idea, but the crucial part is that the
learners have to do a second draft with this kind of feedback; otherwise, they will very
likely not reflect on the feedback they get (cf. Irwin, 2017). This can be supported by
my small-scale study, where Katharine confirmed in her individual interview: “it’s
when we don’t have to do a second draft […] or something. Then it’s just like ..
Okay. …. Okay. I don’t really look at it” (Katharine, Paragraph 484). Several studies
have shown that provision of the correct form is something learners want to get (cf.
Amrhein & Nassaji, 2010; Bitchener & Ferris, 2012; Lee, 2005), but it should be the
last step in the feedback process (cf. Ferris, 2011; Milton, 2006) and not the first and
only one.

3.4.2 Error Indication/No Correction

Error indication, where no correction is provided, is one of the implicit forms of written corrective feedback. How the error is indicated can vary somewhat. Common forms
are circling, highlighting, underlining the word/phrase or check(s) in the margin indi-
cating how many errors occurred but not indicating the exact location. An important
aspect to keep in mind with this kind of feedback is that learners need to produce
a second draft, because otherwise it will be ineffective in respect of acquiring the
target forms.
When using error indication another issue to consider is how learners react to
finding the errors themselves. It seems to be logical that learners will be challenged
cognitively when only the number of errors is indicated but not their location (cf.
Bitchener & Ferris, 2012). However, if learners cannot locate the errors, they might
get frustrated quickly and simply give up. It is often the case that learners have implicit
knowledge of a rule, but cannot explicitly use it, as they have not yet reached the
autonomous stage, where they use certain linguistic forms more automatically and
unconsciously. Accordingly, it seems that this type of feedback should be used with more advanced learners who have already attained a higher proficiency level.

3.4.3 Reformulation

Originally, reformulation derived from the concept of reconstruction, where the teacher reconstructed the learners' texts so that they sounded native-like while keeping the learners' ideas. The learners then decided which of the reconstructions they would
like to take on board or dismiss (cf. Ellis, 2009). This method includes a form of
error correction as well as a revised form of the learners’ texts.
In the school context, reformulation of single sentences is sometimes used, where
the learners get a new version of their sentences. The reasoning behind this technique
is to provide learners with a structure that sounds more native-like which they can
adopt in subsequent writing. Learners, however, sometimes react negatively to this
kind of correction as they feel their sentences are no longer theirs, hence they have lost
ownership of parts of their texts (see Sect. 5.2.3). Another reason why reformulation of sentences might not be well received by learners could be that the teacher has misunderstood the learner's intention and provided a completely new version (Amrhein & Nassaji, 2010). As a consequence, this could even lead to frustration on the learners' part and less engagement with the feedback provided. So, when using reformulation, the teacher should inform his or her learners why he or she believes this type of feedback is effective.

3.4.4 Codes

Another form of implicit error correction is using codes. These can take on many
forms, from symbols (e.g., Art = article, VT = verb tense, Spell = spelling) to colours
which indicate the error category (see Sect. 4.6). When using codes, the teacher
needs to make sure that the learners understand these, which research has shown is
not always the case (see Lee, 2005 for a more thorough discussion). If learners have
no concept of what an article, a noun, a verb, etc. is, for example, they will not be
able to work with this type of feedback. This implies that the number and type of
codes needs to be considered as well. Too many codes might be confusing for the learners, making them difficult to decipher and work with, and in the end they might prove to be ineffective.
When using codes, teachers need to take the time to explain their codes thoroughly to their learners to ensure they can make sense of them. Ideally, returning the first assignment should include a session where learners can check the written CF they got and immediately ask questions about how to correct it. If certain errors occur in several pieces of the learners' written work, they can be addressed and discussed in class. Hopefully, this creates a learning environment in which learners understand that making errors is part of their L2 acquisition and are encouraged to ask language-related questions after each homework assignment.

3.4.5 Self-Correction

Self-correction is one type of feedback that is used when feedback is seen as a process: "From a sociocultural perspective, progress also includes a greater ability to self-correct" (Bitchener & Storch, 2016, p. 2). Many of the other types (e.g., error indication, colour code) can serve as a first step. Self-correction definitely demands a lot from the learners, as they need to invest time to be able to correct their errors. In other words, it might be most effective with advanced learners, especially as their ability to self-correct can vary, ranging from small steps to moving forwards and backwards when processing and working with feedback. This is a natural process, especially when writing is seen as dynamic, and it is immensely influenced by the learner's engagement with written CF.
One can speculate that self-correction might not be so popular among learners
due to the simple fact that it is rather time-consuming and involves a lot of effort
on the learners’ part (see Sect. 5.2.5). From a writing skills perspective it might
be very important for becoming a skilled writer. But learners definitely need to be
familiarised with this type of feedback (Ferris, 1995b) to be able to use it effectively.
Without doubt, being able to self-correct one’s own pieces of writing is a desirable
skill. If teachers implement it into their written CF, they will undeniably need to
explain its purpose and how to work with it to their learners (see Sect. 3.3.7), so that
learners can improve their proficiency level. Or as Susan put it: “if you can’t work
with this kind of feedback [mm hmm] then you won’t improve” (Susan, Paragraph
316).
What is more, metalinguistic clues might be necessary to aid the self-correction
process. Spelling mistakes (or errors) might easily be solved by the learners, but
especially in L2 acquisition grammatical patterns and idiomatic expressions are often
very difficult for them to self-correct. Using this type of feedback, the teacher needs
to consider how many clues are necessary to support the learners’ L2 acquisition,
because too few could lead to frustration on the learners’ part.

3.4.6 Peer Feedback

Peer feedback, also termed peer response, peer correction, peer review, peer revision,
peer editing, peer tutoring, peer critiquing, writing group and collaborative writing,
was developed for L1 writing and has been implemented into L2 writing as well,
especially in respect of writing development. Its effectiveness is still an issue for
debate among researchers (cf. Hyland & Hyland, 2006c), but recent reviews (Yu
& Lee, 2016) show that there is an “increased recognition of the potential of [peer]
feedback to direct and facilitate learning” (Hu, 2019, p. 45). Some studies indicate that
peer feedback is beneficial for L2 writing under certain circumstances (see Nelson
& Carson, 2006). Nevertheless, many learners are still uncertain if their feedback
is useful to their peers, or are reluctant to use peer feedback they received in their
revised drafts (Storch, 2019). Like many other types of feedback, peer feedback needs to be introduced into the classroom setting step by step, as learners need to be familiarised with it. After a transition phase, learners know how it works, and peer feedback can then contribute to a dialogic view of writing, as learners feed off each other and support each other in developing an idea within a paragraph, for example.
It has to be mentioned that peer feedback is very often investigated within socio-
cultural theory (SCT), which is related to Vygotsky (1978). In this theory the social
element is part of cognition and learning, and higher thinking skills, for example, are
shaped by social interaction (for a thorough discussion see Villamil & de Guerrero,
2006; Bitchener & Storch, 2016). According to Vygotsky (1978) a zone of proximal
development (ZPD) exists, where learners are ready to process information about,
for example, a specific linguistic form and respond to intervention by others (e.g.,
teachers, peers). In line with that, SCT views peer feedback on writing as a process,
where peers can help each other in the writing process.
One of the reasons why peer feedback has been promoted is that it gives more
control to the learners. What is more, the learners gain an audience for their writing beyond the teacher. Another argument for using peer feedback is that learners might listen more to feedback given by peers than by teachers. To use
peer feedback effectively, however, teachers would need to train themselves as well
as learners on its usage (cf. Clarke, 2014; Storch, 2019). Further, it has to be noted
that peer correction cannot replace teacher feedback, but only precede it in most
cases.

3.4.7 Personal Written Statement

Evidence that written comments are effective dates back to the late 1950s (Butler,
1987; Black & Wiliam, 1998; Crooks, 1988; Page, 1958). My learners, even if they
did not do their second draft, were keen on reading them. As Julie put it: “But I always
.. look for it, because I always want to know […] what your statement is […] to the,
the written text of mine” (Julie, Paragraph 488). This shows that a personal written
statement (or comment) by the teacher is valued by learners. One of the reasons is definitely the personalised aspect, especially if it starts off with their names and states explicitly what they have done well and which areas need to be improved. Vague or general comments like 'Good effort!' will either not be appreciated by learners (Weaver, 2006) or will very likely be ignored (Zellermayer, 1989); they are not effective, because how is a learner to act on such a statement? What learners need are tips on what should be improved and how they can accomplish that. What is more, the personal written statement needs to be written in a manner the learners can understand, because otherwise they will not be able to work with it (cf. Knoblauch & Brannon, 1981).
Unsurprisingly, modern-day communication tools add to the effect of using personal written statements within a feedback method. Platforms such as Coursesites by Blackboard have a blog function where the teacher and his or her learners can communicate with each other (see Sect. 4.6). As all learners can read one another's posts, the teacher needs to make sure that the statements are tailored to the individual learner. Although it might sometimes be difficult, teachers should always start on a positive note and then give the learner tips on how to improve his or her written text. The participants of my small-scale study stated that the personal written statement gave them a sense of appreciation and that it should be implemented in any kind of feedback method.

3.5 The Engagement-Mediator-Feedback Model

To support engagement with feedback, a feedback-rich environment is necessary, in which formative assessment and a learning-oriented atmosphere foster learner engagement. Several educational systems rely much more on summative assessment, which cannot be changed by one teacher, but formative assessment can still be part of the overall summative assessment. Especially when giving written CF, learners should be allowed to experiment with language, which can only be achieved when they are not awarded grades on it. Instead, clear instructions and writing tasks related to the topics covered in class or the learners' personal lives should be set in order not to thwart their engagement. It is crucial that learners are motivated and want to receive feedback on their written texts. Only if this criterion is met can learners engage with the written CF provided and eventually learn from it (Ellis, 2008; Hyland, 2003; Kormos, 2012).
The Engagement-Mediator-Feedback Model (Fig. 3.1) is one method to accom-
plish engagement with written CF. It consists of four pillars, namely mediating
factors, the feedback method, engagement and strategies. At its core is the Dynamic-
Engagement-Framework that lays the foundation for the Engagement-Mediator-
Feedback Model. As already noted in Chap. 2, engagement is a dynamic process
prone to change throughout a learner’s educational path. The three dimensions
of engagement (emotional, behavioural and cognitive), the learner’s intrinsic and
extrinsic motivations, and at the same time mediating factors impeding or fostering engagement, are the heart of this dynamic engagement process (Fig. 3.1). Thus, the Dynamic-
Engagement-Framework acts as a driver to facilitate engagement with the chosen
feedback method—always influenced by strategies and mediating factors—which manifests itself in the Engagement-Mediator-Feedback Model. One pillar of the model that should not be underestimated is the mediating factors, as these play a crucial role when the ultimate aim is to engage learners with written CF.
To get an idea of just how powerful mediating factors in this multi-dimensional model are, I use three brief examples from my data (Chap. 5). First, in the emotional dimension, anxiety was a prominent factor. Learners reported being anxious because they had not developed a positive relationship with their teacher, very often even being afraid of them, which affected their emotional engagement with the feedback provided by that particular teacher. Teachers need to remind themselves that their attitude towards learners plays a crucial role in this respect. Second, homework assignments are a good example of the behavioural dimension. The analysis of my data shows that many factors contributed to the learners' willingness to complete homework assignments. Their own laziness was one of the reasons given; learners stated that sometimes relaxing was more important than doing assignments. Like laziness, many mediating factors in this area are beyond the teacher's control, as they concern the learners' personal lives. Strategies for overcoming their own laziness might
help, but in the end it is the learners' decision to act upon these. Awareness of mediating factors can often explain learners' changing attitudes towards written CF and is thus a useful tool for teachers to better understand their learners. Third, the cognitive dimension is needed for deep engagement to happen. When self-correction is part of the feedback method, learners need to know how to work with the feedback to be able to self-correct their errors. Once cognitively engaged, learners reported recognising recurring errors more easily, because they had to self-correct these in their writing tasks.
Closely interlinked with the above-mentioned awareness of error patterns is another pillar of the model: strategies. When using this model in classroom practice, the necessary first step for written CF to be effective is explaining the chosen feedback method in detail to the learners, which can be accomplished by using the Giving-Feedback-Guidelines (see Sect. 3.3.7). As stressed in these, providing strategies for working with written CF is essential to guide learners in their writing process. This, however, can only be achieved if teachers implement such strategies into their teaching routines, thus providing learners with ample tools they can benefit from when correcting their written texts. Ideally, after adequate training, learners should be able to develop their own strategies, thereby enhancing learner autonomy.
In their handbook on student engagement, Dunne and Derfel (2013, p. 271) stress
that “student involvement in their learning is intrinsically linked […] to the quality
of teaching provided […]. In short, students are likely to engage when teachers make
concerted efforts to engage them." Undoubtedly, concerted efforts on the teachers' part are as crucial as those on the learners' part in the engagement process. An issue
not explained by Dunne and Derfel is which concerted efforts teachers could make
to engage their learners. For that reason, the Engagement-Mediator-Feedback Model
could be the answer, as it gives insights into the complexity of learner engagement
with written corrective feedback.
For feedback to be effective, knowledge of the mediating factors impacting on written CF can be used to provide individual learners with feedback that is tailored to their needs. Sometimes teachers need to accept the fact that mediating factors can be so powerful that learners occasionally withdraw entirely from the feedback process. Ultimately, feedback is a form of assistance to learners so that they can accomplish their goals in mastering the language, and as feedback is a dynamic process, it is prone to constant change. The teacher's role is to support the respective learners in their engagement with feedback and to regularly re-evaluate the feedback method used. The unique way teachers respond to learners' written texts, for example in their personal written statements (see Sect. 5.2.7), can not only affect the learners' reaction to it, but also their engagement with it (Hyland & Hyland, 2019). As my data shows, the editing of their written texts occasionally takes learners longer than anticipated, and hence they often find it difficult to maintain their level of engagement. Helping learners to use strategies to sustain their engagement with written CF is a key factor, especially when working with the written CF they received on their first draft.
When we think of written CF, we can see that feedback is one key to improving
language accuracy. Its effectiveness, however, depends on many factors, such as the
teaching context and learners’ beliefs about feedback methods. The teacher is one
part of the equation in learner engagement. He or she can provide all the necessary
tools for engaging learners in the feedback process, but if learners are not willing
to improve their writing skills, for example, they will very likely not benefit from
the feedback they get (cf. Guénette, 2007). Nevertheless, teachers always try to find
ways to engage as many learners as possible, a goal which can be supported by this model. Understanding the underlying mediating factors can aid teachers in
their quest for an effective feedback method. Embracing the multi-dimensionality
of the Engagement-Mediator-Feedback Model is a chance to cater to the needs of
individual learners.

References

Amrhein, H. R., & Nassaji, H. (2010). Written corrective feedback: What do students and teachers
think is right and why? Canadian Journal of Applied Linguistics/Revue canadienne de linguis-
tique appliquée, 13(2), 95–127. Retrieved from https://journals.lib.unb.ca/index.php/CJAL/art
icle/view/19886/21713.
Bitchener, J. (2008). Evidence in support of written corrective feedback. Journal of Second
Language Writing, 17(2), 102–118. Retrieved from http://www.jimelwood.net/students/grips/
tables_figures/bitchener_2008.pdf.
Bitchener, J. (2012). Written corrective feedback for L2 development: Current knowledge and future
research. TESOL Quarterly, 46(4), 855–860. Retrieved from http://www.jstor.org/stable/pdf/432
67894.pdf.
Bitchener, J., & Ferris, D. R. (2012). Written corrective feedback in second language acquisition
and writing. New York and London: Routledge.
Bitchener, J., & Knoch, U. (2008). The value of written corrective feedback for migrant and interna-
tional students. Language Teaching Research, 12(3), 409–431. Retrieved from http://www.csudh.
edu/ccauthen/575S12/bitchener-knoch.pdf.
Bitchener, J., & Knoch, U. (2009a). The value of focused approach to written corrective feedback.
ELT Journal, 63(3), 204–211.
Bitchener, J., & Knoch, U. (2009b). The relative effectiveness of different types of direct written
corrective feedback. System, 37(2), 322–329.
Bitchener, J., & Knoch, U. (2010). Raising the linguistic accuracy level of advanced L2 writers with
written corrective feedback. Journal of Second Language Writing, 19(4), 207–217.
Bitchener, J., & Storch, N. (2016). Written corrective feedback for L2 development. Bristol:
Multilingual Matters.
Bitchener, J. (2019). The intersection between SLA and feedback research. In K. Hyland & F.
Hyland (Eds.), Feedback in second language writing: Contexts and issues (2nd ed., pp. 85–105).
Cambridge: Cambridge University Press.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education:
Principles, Policy & Practice, 5(1), 7–74.
Brooks, N. (1960). Language and language learning. New York: Harcourt, Brace and World Inc.
Brophy, J. (1986). Teacher influences on student achievement. American Psychologist, 41(10),
1069–1077.
Brophy, J., & Evertson, C. (1976). Learning from teaching: A developmental perspective. Boston:
Allyn & Bacon.
Brown, D. (2012). The written corrective feedback debate: Next steps for classroom teachers and
practitioners. TESOL Quarterly, 46(4), 861–867. Retrieved from http://www.jstor.org/stable/pdf/
43267895.pdf.
Burnett, P. C. (2003). The impact of teacher feedback on student-self talk and self-concept in reading
and mathematics. Journal of Classroom Interaction, 38(1), 11–16.
Butler, R. (1987). Task-involving and ego-involving properties of evaluation: Effects of different
feedback conditions on motivational perceptions, interest, and performance. Journal of Educational
Psychology, 79(4), 474–482. Retrieved from http://www.richardnelsononline.co.uk/blog/wp-
content/uploads/2014/07/Effects-of-No-Feedback-Task-Related-Comments-and-Grades-on-Int
rinsic-Motivation-and-Performance.pdf.
Campbell, N., & Schumm Fauster, J. (2013). Learner-centred feedback on writing: Feedback as
dialogue. In M. Reitbauer, N. Campbell, S. Mercer, J. Schumm Fauster, & R. Vaupetitsch (Eds.),
Feedback matters: Current feedback practices in the EFL classroom (pp. 55–68). Frankfurt am
Main: Peter Lang.
Carless, D. (2007). Conceptualizing pre-emptive formative assessment. Assessment in Educa-
tion, 14(2), 171–184. Retrieved from http://www.tandfonline.com/doi/full/10.1080/096959407
01478412.
Chandler, J. (2003). The efficacy of various kinds of error feedback for improvement in the accuracy
and fluency of L2 student writing. Journal of Second Language Writing, 12, 267–296.
Clarke, S. (2014). Outstanding formative assessment: Culture and practice. London: Hodder
Education.
Cohen, A. D. (1987). Student processing of feedback on their compositions. In A. Wenden & J. Rubin
(Eds.), Learner strategies in language learning (pp. 57–69). Language Teaching Methodology
Series. New York et al.: Prentice Hall.
Corder, S. P. (1967). The significance of learners’ errors. International Review of Applied Linguistics,
5, 161–169. Retrieved from http://files.eric.ed.gov/fulltext/ED019903.pdf.
Crooks, T. J. (1988). The impact of classroom evaluation practices on students. Review of
Educational Research, 58(4), 438–481. Retrieved from https://www.researchgate.net/profile/
Terry_Crooks/publication/240314226_The_Impact_of_Classroom_Evaluation_Practices_on_
Students/links/57c0a6dc08aeda1ec38a5222.pdf.
Dohrn, E., & Bryan, T. (1994). Attribution instruction. Teaching Exceptional Children, 26(4), 61–63.
Dunne, E., & Derfel, O. (2013). Students taking responsibility for their learning. In E. Dunne & O.
Derfel (Eds.), The student engagement handbook: Practice in higher education (pp. 271–274).
Bingley: Emerald.
Ellis, R. (1997). Second language acquisition. Oxford: Oxford University Press.
Ellis, R. (2008). The study of second language acquisition (2nd ed.). Oxford: Oxford University
Press.
Ellis, R. (2009). A typology of written corrective feedback styles. ELT Journal, 63(2), 97–107.
Ellis, R. (2010). A framework for investigating oral and written corrective feedback. Studies in
Second Language Acquisition, 32, 335–349. Retrieved from https://www.cambridge.org/core.
Ellis, R. (2012). Language teaching research and language pedagogy. Chichester: Wiley.
Ellis, R. (2015). Understanding second language acquisition (2nd ed.). Oxford: Oxford University
Press.
Ellis, R., Loewen, S., & Erlam, R. (2006). Implicit and explicit corrective feedback and the acqui-
sition of L2 grammar. Studies in Second Language Acquisition, 28(2), 339–368. Retrieved from
https://www.cambridge.org/core.
Evans, N. W., Hartshorn, K. J., McCollum, R. M., & Wolfersberger, M. (2010). Contextualizing
corrective feedback in second language writing pedagogy. Language Teaching Research, 14(4),
445–463. Retrieved from http://journals.sagepub.com/doi/pdf/, https://doi.org/10.1177/136216
8810375367.
Fathman, A., & Whalley, E. (1990). Teacher response to student writing: Focus on form versus
content. In B. Kroll (Ed.), Second language writing: Research insights for the classroom (pp. 178–
190). Cambridge et al.: Cambridge University Press.
Ferris, D. R. (1995a). Student reactions to teacher response in multiple-draft classrooms. TESOL
Quarterly, 29(1), 33–53.
Ferris, D. R. (1995b). Teaching students to self-edit. TESOL Journal, 4(4), 18–22.
Ferris, D. R. (1999). The case for grammar correction in L2 writing classes: A response to Truscott
(1996). Journal of Second Language Writing, 8(1), 1–11. Retrieved from http://www.sciencedi
rect.com/science/article/pii/S1060374399801106.
Ferris, D. R. (2004). The “grammar correction” debate in L2 writing: Where are we, and where do
we go from here? (and what do we do in the meantime…?). Retrieved from http://citeseerx.ist.
psu.edu/viewdoc/download?doi=10.1.1.110.4148&rep=rep1&type=pdf.
Ferris, D. R. (2011). Treatment of error in second language student writing (2nd ed.). Michigan
Series on Teaching Multilingual Writers. Michigan: The University of Michigan Press.
Ferris, D. R., & Hedgcock, J. S. (2011). Teaching ESL composition: Purpose, process, and practice
(2nd ed.). New York, London: Routledge.
Ferris, D. R., Liu, H., Sinha, A., & Senna, M. (2013). Written corrective feedback for individual
L2 writers. Journal of Second Language Writing, 22(3), 307–329. Retrieved from https://www.
researchgate.net/profile/Aparna_Sinha2/publication/259142527_Written_corrective_feedback_
for_individual_L2_writers/links/559ae16408ae21086d2778b7.pdf.
Ferris, D., & Kurzer, K. (2019). Does error feedback help L2 writers? In K. Hyland & F. Hyland
(Eds.), Feedback in second language writing: Contexts and issues (2nd ed., pp. 106–124).
Cambridge: Cambridge University Press.
Firth, A., & Wagner, J. (1997). On discourse, communication, and (some) fundamental concepts in SLA research. The Modern Language Journal, 81(3), 285–300. Retrieved from http://www.
jstor.org/stable/pdf/329302.pdf.
Freedman, S. W., Greenleaf, C., & Sperling, M. (1987). Response to student writing. Urbana,
IL: National Council of Teachers of English. Retrieved from http://files.eric.ed.gov/fulltext/ED2
90148.pdf.
Gass, S. M. (1997). Input, interaction, and the second language learner. Mahwah: Lawrence
Erlbaum Associates.
Gass, S. M. (2003). Input and interaction. In C. Doughty & M. Long (Eds.), The handbook of second
language acquisition (pp. 224–255). Malden, MA: Blackwell Publishing.
Guénette, D. (2007). Is feedback pedagogically correct? Research design issues in studies of
feedback on writing. Journal of Second Language Writing, 16(1), 40–53. Retrieved from
http://s3.amazonaws.com/academia.edu.documents/34377841/Guenette_2007_Is_feedback_p
edagogically_correct.pdf?AWSAccessKeyId=AKIAIWOWYYGZ2Y53UL3A&Expires=149
2624320&Signature=keJz9lkedyYVfNty6WQE2ZVcjRI%3D&response-content-disposition=
inline%3B%20filename%3DIs_feedback_pedagogically_correct.pdf.
Haswell, R. H. (1983). Minimal marking. College English, 45(6), 600–604.
Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning. London and New
York: Routledge.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1),
81–112. Retrieved from https://insightplatform-public.sharepoint.com/SiteAssets/feedback-and-
reporting/characteristics-of-effective-feedback/power_feedback.pdf.
Hendrickson, J. M. (1978). Error correction in foreign language teaching: Recent theory, research
and practice. The Modern Language Journal, 62(8), 387–398. Retrieved from https://elanolden
burg.files.wordpress.com/2008/11/hendricksonerrorcorrection2.pdf.
Hendrickson, J. M. (1980). The treatment of error in written work. The Modern Language Journal,
64(2), 216–221. Retrieved from http://www.jstor.org/stable/pdf/325306.pdf.
Hills, J. (2016). The feedback debate. Retrieved from: https://www.headheartbrain.com/brain-
savvy-hr/the-feedback-debate/.
Hu, G. (2019). Culture and peer feedback. In K. Hyland & F. Hyland (Eds.), Feedback in second
language writing: Contexts and issues (2nd ed., pp. 45–63). Cambridge: Cambridge University
Press.
Hyland, F. (2003). Focusing on form: Student engagement with teacher feedback. System,
31(2), 217–230. Retrieved from http://www.sciencedirect.com/science/article/pii/S0346251X
03000216.
Hyland, K., & Hyland, F. (2006a). Feedback on second language students’ writing. Language
Teaching, 39, 83–101. Retrieved from http://hub.hku.hk/bitstream/10722/57356/1/133933.pdf.
Hyland, K., & Hyland, F. (2006b). Contexts and issues in feedback on L2 writing: An introduction.
In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues
(pp. 1–19). New York: Cambridge University Press.
Hyland, K., & Hyland, F. (Eds.). (2006c). Feedback in second language writing: Contexts and
issues. New York: Cambridge University Press.
Hyland, K., & Hyland, F. (2019). Interpersonality and teacher-written feedback. In K. Hyland & F.
Hyland (Eds.), Feedback in second language writing: Contexts and issues (2nd ed., pp. 165–183).
Cambridge: Cambridge University Press.
Irwin, B. (2017). Written corrective feedback: Student preferences and teacher feedback practices.
IAFOR Journal of Language Learning, 3(2), 35–58.
Juwah, C., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D., & Smith, B. (2004). Enhancing student learning through effective formative feedback. The Higher Education Academy
(Generic Centre), 1–42. Retrieved from http://ctlt.illinoisstate.edu/downloads/modules/design/
enhancing_learning-through_formative_feedback.pdf.
Knoblauch, C. H., & Brannon, L. (1981). Teacher commentary on student writing: The state of the
art. Freshman English News, 10(2), 1–4.
Kormos, J. (2012). The role of individual differences in L2 writing. Journal of Second Language
Writing, 21, 390–403.
Krashen, S. D. (1981). Second language acquisition and second language learning. Pergamon Press
Inc. Retrieved from http://www.sdkrashen.com/content/books/sl_acquisition_and_learning.pdf.
Lalande, J. F. II (1982). Reducing composition errors: An experiment. The Modern Language
Journal, 66(2), 140–149. Retrieved from http://www.jstor.org/stable/pdf/326382.pdf.
Lee, I. (2005). Error correction in the L2 writing classroom: What do students think? TESL
Canada Journal, 22(2), 1–16. Retrieved from http://teslcanadajournal.ca/index.php/tesl/article/
view/84/84.
Lee, I. (2014). Revisiting teacher feedback in EFL writing from sociocultural perspectives. TESOL
Quarterly, 48(1), 201–213.
Li, S. (2010). The effectiveness of corrective feedback in SLA: A meta-analysis. Language
Learning, 60(2), 309–365. Retrieved from https://www.researchgate.net/profile/Shaofeng_Li2/
publication/229940242_The_Effectiveness_of_Corrective_Feedback_in_SLA_A_Meta-Ana
lysis/links/563d21bb08aec6f17dd7edad.pdf.
Liu, Y. (2008). The effects of error feedback in second language writing. Arizona Working Papers
in SLA & Teaching, 15(1), 65–79. Retrieved from http://slat.arizona.edu/sites/default/files/page/
awp15liu.pdf.
Lyster, R., & Saito, K. (2010). Oral feedback in classroom SLA. Studies in Second Language
Acquisition, 32(2), 265–302. Retrieved from http://kazuyasaito.net/SSLA2010.pdf.
Mercer, S., & Ryan, S. (2013). Praising to learn: Learning to praise. In M. Reitbauer, N. Campbell,
S. Mercer, J. Schumm Fauster, & R. Vaupetitsch (Eds.), Feedback matters: Current feedback
practices in the EFL classroom (pp. 21–35). Frankfurt am Main: Peter Lang.
Meštrović Štajduhar, I. (2013). Web-based peer feedback from the students’ perspective. In M. Reit-
bauer, N. Campbell, S. Mercer, J. Schumm Fauster, & R. Vaupetitsch (Eds.), Feedback matters:
Current feedback practices in the EFL classroom (pp. 87–102). Frankfurt am Main: Peter Lang.
Milton, J. (2006). Resource-rich web-based feedback: Helping learners become independent writers.
In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues
(pp. 123–139). New York: Cambridge University Press.
Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can undermine children's motivation
and performance. Journal of Personality and Social Psychology, 75(1), 33–52.
Nelson, G., & Carson, J. (2006). Cultural issues in peer response: Revisiting "culture". In K. Hyland
& F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (pp. 42–59). New
York: Cambridge University Press.
Page, E. B. (1958). Teacher comments and student performance: A seventy-four classroom experiment in school motivation. Journal of Educational Psychology, 49(4), 173–181.
Ramaprasad, A. (1983). On the definition of feedback. Systems Research and Behavioural Science,
28, 4–13.
Richards, J. C., & Renandya, W. A. (Eds.). (2002). Methodology in language teaching: An anthology
of current practice. Cambridge: Cambridge University Press.
Sadeghi, K., & Mosalli, Z. (2013). The effect of task complexity on the quality of EFL learners’
argumentative writing. Iranian Journal of Language Teaching Research, 1(2), 115–134.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional
Science, 18(2), 119–144. Retrieved from http://pdf.truni.sk/eucebnice/iktv/data/media/iktvv/
Symposium_LTML_Royce%20Sadler_BFormative_Assessment_and_the_design_of_instructi
onal_systems.pdf.
Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), Curriculum evaluation
(pp. 39–83). Chicago: Rand McNally.
Seow, A. (2002). The writing process and process writing. In J. C. Richards & W. A. Renandya
(Eds.), Methodology in language teaching: An anthology of current practice (pp. 315–320).
Cambridge: Cambridge University Press.
Sheen, Y. (2007). The effect of focused written corrective feedback and language aptitude on ESL
learners’ acquisition of articles. TESOL Quarterly, 41(2), 255–283. Retrieved from http://www.
jstor.org/stable/pdf/40264353.pdf.
Sheen, Y. (2011). Corrective feedback, individual differences and second language learning.
Dordrecht et al.: Springer.
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
Retrieved from http://www.jstor.org/stable/pdf/40071124.pdf.
Sigott, G. (2013). A global perspective on feedback. In M. Reitbauer, N. Campbell, S. Mercer, J.
Schumm Fauster, & R. Vaupetitsch (Eds.), Feedback matters: Current feedback practices in the
EFL classroom (pp. 9–20). Frankfurt am Main: Peter Lang.
Storch, N. (2010). Critical feedback on written corrective feedback research. International Journal
of English Studies, 10(2), 29–46. Retrieved from http://revistas.um.es/ijes/article/view/119181/
112311.
Storch, N. (2019). Collaborative writing as peer feedback. In K. Hyland & F. Hyland (Eds.),
Feedback in second language writing: Contexts and issues (2nd ed., pp. 143–161). Cambridge:
Cambridge University Press.
Swain, M. (1985). Communicative competence: Some roles of comprehensible input and compre-
hensible output in its development. In S. Gass & C. Madden (Eds.), Input in second language
acquisition (pp. 235–252). Rowley: Newbury House.
Taras, M. (2006). Do unto others or not: Equity in feedback for undergraduates. Assessment &
Evaluation in Higher Education, 31(3), 365–377. Retrieved from https://doi.org/10.1080/02602930500353038.
Truscott, J. (1996). The case against grammar correction in L2 writing classes. Language Learning,
46(2), 327–369. Retrieved from http://web.ntpu.edu.tw/~language/file/grammarcorrection.pdf.
Truscott, J. (1999). The case for “The case against grammar correction in L2 writing
classes”: A response to Ferris. Journal of Second Language Writing, 8(2), 111–122.
Retrieved from http://www2.fl.nthu.edu.tw/old_site/FL2/faculty/John/The%20case%20for%
20the%20case%20against%201999.pdf.
Truscott, J. (2007). The effect of error correction on learners’ ability to write accurately. Journal of
Second Language Writing, 16(4), 255–272. Retrieved from http://epi.sc.edu/ar/AS_4_files/Tru
scott%202007.pdf.
Van Beuningen, C. (2010). Corrective feedback in L2 writing: Theoretical perspectives, empirical
insights, and future directions. International Journal of English Studies, 10(2), 1–27.
Villamil, O. S., & de Guerrero, M. C. M. (2006). Sociocultural theory: A framework for under-
standing socio-cognitive dimensions of peer feedback. In K. Hyland & F. Hyland (Eds.), Feedback
References 71

in second language writing: Contexts and issues (pp. 23–41). New York: Cambridge University
Press.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes.
Cambridge: Harvard University Press.
Weaver, M. R. (2006). Do students value feedback? Students’ perceptions of tutors’ written
comments. Assessment & Evaluation in Higher Education, 31(3), 379–394. Retrieved from http://
www.tandfonline.com/doi/full/10.1080/02602930500353061.
Widdowson, H. G. (2003). Defining issues in English language teaching. Oxford: Oxford University
Press.
Winne, P. H., & Butler, D. L. (1994). Student cognition in learning from teaching. In T. Husen & T.
Postlewaite (Eds.), International encyclopaedia of education (2nd ed., pp. 5738–5745). Oxford:
Pergamon.
Yu, S., & Lee, I. (2016). Peer feedback in second language writing (2005–2014). Language Teaching,
49(4), 461–493.
Zellermayer, M. (1989). The study of teachers’ written feedback to students’ writing: Changes in
theoretical considerations and the expansion of research contexts. Instructional Science, 18(2),
145–165. Retrieved from http://www.jstor.org/stable/pdf/23369144.pdf.
Chapter 4
Methodological Design

Many studies have been conducted on written CF (see Chap. 3), and research has shown that there are multiple ways of working with written CF, but the most effective written CF method has still not been identified, if there even is one.
One of the main issues with written CF might be that “[…] research into feedback on
compositions has mainly concerned itself with the ‘best’ means of teacher correction
on written work, rather than with the issue of how students actually respond to each
of these methods” (Cohen, 1987, p. 58). The question remains why learners engage
with the written corrective feedback method provided.
Learner engagement, on the other hand, is a relatively new field in Second
Language Acquisition (SLA) research which has received a lot of attention in the last
couple of years (see Chap. 2). So far it has been investigated in the Higher Education
sector as well as in the context of schools (e.g., Fredricks, Blumenfeld, & Paris, 2004;
Yazzie-Mintz, 2007). To investigate how learners engage with, for instance, academic work or extracurricular activities, several questionnaires have been piloted and have been in use ever since. The most common ones are NSSE (National Survey of Student Engagement—Online 5) for the USA, Canada and the UK, AUSSE (Australasian Survey of
Student Engagement—Online 6), HSSSE (High School Survey of Student Engage-
ment—Online 7) and CLASSE (Classroom Survey of Student Engagement—see
Quimet, 2011).
In order to find out why learners engage with written CF, I decided to investigate my own students’ engagement with it at a secondary business school in Lower Austria. I strongly believe that if the aim is to understand why students work with feedback, this can only be achieved through participant action research, where the learners’ voices are heard.

4.1 Austrian School System

All Austrian schools are obliged to follow the legal requirements for Austrian schools
from 1962 as well as the national curricula (Online 8) which are updated on a regular
basis. According to Euroguidance Austria (Online 9) the Austrian school system is
divided into Primary Level, Secondary Level I and Secondary Level II (see Fig. 4.1—
Online 10). Additionally, Special Needs Education/Integrative Education (Grade 1–
9) and Integrative Education (Grade 10–13/14) are offered as well. Children usually
start their formal schooling at the age of six (if they are born before 1 September)
and have to undergo nine years of compulsory education. Primary Level consists
of four years (Grade 1–4) of primary school.

Fig. 4.1 Austrian school system (Online 10)

After that the students continue at Secondary Level I (Grade 5–8), where they can choose between two types of schools:
Mittelschule (secondary school) and Gymnasium (academic secondary school lower
cycle). At primary level all schools have to adhere to educational standards in German
and Maths, and at lower secondary level in German, Maths and English, implemented by
the BIFIE (Education research, innovation and development of the Austrian school
system—Online 11).
Going into Secondary Level II, students have several options: Those that attended
general secondary or new secondary school can go to a four-year academic secondary
school upper cycle (these schools do not have an academic secondary school lower
cycle), a school for intermediate vocational education (Grade 9, Grade 9–10, Grade
9–11 or Grade 9–12 depending on the type of school) or a college for higher vocational
education (Grade 9–13). In addition, these students can also opt for dual training,
attending a pre-vocational school to finish compulsory education and then contin-
uing with part-time vocational education and an apprenticeship. Students from an
academic secondary school lower cycle very often continue to the upper level within
their school. If they decide against it, they have the same options as mentioned above.
Students attending an academic secondary school upper level or a college for
higher vocational education finish with their A Level exams, called Matura and
Reife- und Diplomprüfung respectively. The written and oral exams, which usually
take place in May and June, are a prerequisite for studying at tertiary level. It has to
be stated that Secondary Level II in Austria is multifaceted and for the purpose of
this study I only take a closer look at one secondary business school (a particular type
of college for higher vocational education) in order to explain my research context
in more detail.

4.2 Teaching Context

The secondary business school I teach at is in Lower Austria where students can
either attend a school for intermediate vocational training lasting three years, which
they finish with a school-leaving exam, or a college for higher vocational educa-
tion lasting five years and finishing with their Reife- und Diplomprüfung. In contrast
to our traditional grammar schools (academic secondary schools lower and upper
cycle), Accountancy and Business Studies are two core subjects at business schools,
alongside general subjects such as German, Maths, Sciences, English and an addi-
tional foreign language (Italian, Spanish, Russian, French, to name just a few) that
are taught at both types of school.
After their final year, which our students usually finish around the end of April,
they first take their written A Level exams at the beginning of May. Up to the academic
session 2014/15 all students had to do an exam in German and a combined exam
of Accountancy and Business Studies. The third subject was either Maths, a foreign
language (English, French or Italian) or IT. The oral exams always included Busi-
ness Studies, a foreign language and an optional subject (e.g., Religious Education,
History, Geography, or Economics). English was compulsory (either a written or
an oral exam) until June 2015, but that changed with the implementation of the
new standardised A Level exams. The students I investigated were the last ones
being examined according to the old guidelines for A Level exams, meaning that
the written exams were designed by the respective teachers at each school. From
the academic session 2015/16 onwards the written tasks for German, English (as
well as all other foreign languages) and Maths were designed by the BIFIE. This
changed in 2017 when the Austrian Ministry of Education (Bundesministerium für Bildung—BMB, formerly known as Bundesministerium für Unterricht, Kunst
und Kultur—BMUKK) took responsibility for the development of these. With the
new A Level exams students also have to do an A Level exam in Maths (written or
oral) which had not been compulsory up to the academic session 2014/15. One of
the reasons for implementing these standardised tests was to ensure comparability
across all schools in Austria as well as to provide equal chances for all students.
The oral exams, however, are not designed by an external organisation. The guide-
lines, which must be adhered to, have been implemented by the Austrian Ministry
of Education to ensure that oral A Level exams are designed according to pre-
defined competences in the Austrian curriculum. A handbook has been designed by
CEBS (Center für berufsbezogene Sprachen—Online 12) as guidance for teachers.
Although the new guidelines for the oral A Level exams only came into effect in
the academic session 2015/16, the English teachers at our school decided to change
the layout of these exams for the academic session 2014/15. Hence, all tasks for
the English A Level exams follow the two-stage system of sustained monologue
followed by spoken interaction, or vice versa.1 The layout is always the same: First,
the situation is explained, followed by the two tasks. To give an example, one of the topics in the academic session 2015/16 was society and politics, with one of the sub-topics being asylum policy:
Situation:
In your final year at XXXX2 you have been discussing asylum policies in Europe and their
impact on people’s lives.
Task 1 (Sustained Monologue)
After having read the article3 on helping people not borders, you are now ready to present
your findings at the A Level exams. In your presentation you should

• point out what Martin Popp claims to be wrong with border protection
• explain which problems refugees come across before being able to seek asylum in Europe
• judge whether or not the suggestions made by various organisations for legal ways for
refugees to reach Europe are realistic

Task 2 (Spoken Interaction)


After your presentation, you discuss refugee issues with your teacher. In your discussion you

• describe what you can see in the graph4
• contrast the refugee situation in Austria to other European countries
• state your opinion on how to assist asylum seekers in a humane way

1 The format of the oral exams at academic secondary schools and schools for higher vocational education differs.
2 Name of school has been removed.
3 See Appendix A.
4 See Appendix A.

4.3 Teacher Researcher

Being a teacher and investigating your own students from a research perspective is
not without its drawbacks. There has been some harsh criticism by several researchers
in the past “concerning the impracticality of asking teachers to engage in research”
(Ellis, 1997, p. 25) as well as “a belief that many teachers will not be able to achieve
the standards professional researchers deem necessary” (ibid.). Why should teachers
not acquire the necessary skills to do research adhering to the same standards as
any other researcher? What is essential for doing research is familiarising yourself
with the relevant literature, methodology and methods in addition to well-designed
research questions, which should be tackled with the appropriate tools. Furthermore,
being able to manage your subjectivity and “to attest to the credibility of your data
and the trustworthiness of your results – in spite of any limitations” (O’Leary, 2014,
p. 60) is crucial in any kind of research. Bitchener and Storch (2016) also stress that
classroom-based research is necessary to fully understand “how and why learners
engage with the feedback and feedback providers” (p. 134).
Practical action research is becoming more and more popular among researchers,
as monitoring one’s own classroom gives the teacher researcher the opportunity to
critically evaluate his or her own teaching, and to make a useful contribution to the
body of knowledge. As Ellis (2012) points out teachers are very often “expected
to perform the ‘script’ dictated by the method rather than their own script” (p. 52).
Hence, teachers should look critically at their classroom practice and the problems they encounter. Instead of just doing what theory tells them to do, they should engage
with the process, formulate research questions and conduct their own research to
enhance their teaching experience.
This was exactly my starting point. Having taught for eleven years in several teaching contexts, I had constantly changed one aspect of my teaching: written corrective feedback. It is undeniably an essential part of teachers’ lives and there are
various feedback methods (Chap. 3) a teacher could use. No matter which of these methods I used, it always came down to one question: Why do students engage, or fail to engage, with the different written CF methods? As Ellis (2012, p. 27) emphasises
“action research is a form of self-reflective enquiry undertaken by practitioners in
their own contexts of action”. But how, then, do you separate the teacher from the
researcher? Or should you embrace both aspects at the same time?
I was fully aware of the fact that I was my students’ teacher, who graded them at the end of each school year, and at the same time a researcher who wanted to investigate their engagement with written CF—feedback provided by myself as well. Consequently,
I addressed this issue with the students that participated in this study and told them
that their being honest was crucial as I wanted to share the insights I had gained from
them with a broader community who could then better understand what students’
reasons might be for engaging or not engaging with the written CF method provided.
The students’ views certainly influenced my style of teaching, as I critically reflected on my own teaching practice because of the research I had conducted, but this did not influence my critical engagement with the collected data to establish which feedback methods they found engaging, in what ways they engaged with them and
what students perceived as mediating factors (see Sect. 4.4 for a thorough discussion
of the research questions).
Before I analysed my data, I stepped back from my teacher’s role, thought about
my research questions and tried to stay true to the data I had gathered. It was not
always easy, but in order to ensure the credibility of my data, that was precisely
what I had to do. I constantly reminded myself of the responsibility I had towards
the research community as well as my students. During this process I distanced
myself from being their teacher and analysed the data as the researcher who wanted
to answer her research questions in order to gain a better understanding of learner
engagement with written CF.
Stepping back from my role as a teacher was far easier when conducting the focus
group interviews. On the one hand, because four participants took part in each of
them, there was less guidance necessary on my part. On the other hand, I created
prompt cards that gave the focus group interviews a natural structure. On these cards
I printed the following terms:

Student engagement              Written statement by teacher
Reformulation of sentences      Codes in margin
Error correction by teacher     Error indication—no correction
Self-correction                 Colour code
Peer review                     Teacher’s role
My preferred feedback method    Reasons for (not) doing homework
Suggestions for improvement of current feedback method

The learners picked them up and immediately started talking about these. Thus,
I mostly asked questions which arose from their discussions about the prompts. The
individual interviews, however, were quite challenging to conduct. Several times I had to remind myself that I should not influence the learners’ views simply because I needed answers to my research questions. But being aware of this bias certainly helped
me to restrain myself and step back from being their teacher to being a researcher who
wanted to investigate learner engagement with written CF (cf. Taylor & Robinson,
2014).
In order to be prepared for a possible breakdown in communication I designed
questions for each of the eight participants (see Appendix C). For that reason, I
analysed the two focus group interviews as well as the questionnaires they filled in
before the individual interviews (see Sect. 4.7. for a more thorough discussion of
the research tools). These steps clearly helped me in staying true to my role as a
researcher.

4.4 Research Questions

Correcting learners’ writing seems to make up most of my teaching load. During my teacher training, teacher trainers had always commented on the importance of
correcting and giving feedback on written homework, but the strategies mentioned
were mostly locating errors and/or providing students with the correct form. More
than sixteen years of teaching at educational institutions abroad as well as in Austria
have proven that this kind of feedback is not very satisfactory for me. I kept asking
myself what the purpose of providing the learners with the correct form was, when
they would probably not even look at my corrections.
One aspect that certainly fosters the focus on error correction seems to be that within the Austrian educational system a focus on the product rather than the process is still the more common practice. This approach is definitely backed by the design
of our curricula where assessment in order to grade students still plays a major role.
A reflection of a focus on the product can also be detected among learners, as a
lot of them, for example, do not see the writing assignments as a process-oriented
approach, but as a product where they only hand in one draft and do not work on it
anymore. The question remains whether learners are willing to invest time in writing,
as a focus on process also implies more work on their part (e.g., writing a first
draft, then a second draft, where they not only try to correct the errors themselves
but should also work on their sentence structure, for instance).
At the beginning of my research project, where the focus was still more on
designing a written CF method a lot of learners find engaging, I wanted to answer
the following three research questions:
• Research Question 1: What kinds of written corrective feedback do students find
motivating?
• Research Question 2: Why do students engage with written corrective feedback?
• Research Question 3: Which kind(s) of written corrective feedback is effective
for improving writing skills?
Through my focus group interviews (see Sect. 4.7) not only had my focus shifted towards engagement with written CF, but as a result my research questions had changed too. First, the problem with Research Question 1 was definitely the term motivating.
As Dörnyei and Ushioda (2011, p. 3) state “while intuitively we may know what
we mean by the term ‘motivation’, there seems little consensus on its conceptual
range of reference”. Motivation is multifaceted and an abstract concept “that refers
to various mental (i.e. internal) processes and states [and] this means that there are no
objective measures of motivation” (ibid., p. 197). The aim of the study was definitely not to measure the learners’ level of motivation but to find out which methods they were interested in and why; hence the term motivating in Research Question 1 no longer fit my purpose.
Second, while conducting and then transcribing the pilot study focus group inter-
view, it dawned on me that Research Question 3 (Which kind[s] of written corrective feedback is effective for improving writing skills?) was not the real issue; what mattered was what influenced the learners’ decision to work or not work with the feedback methods. Thus,
I drew the conclusion that learner engagement was the determining factor in my
research. As a result, the research questions I wanted to answer were:
• Research Question 1: What kinds of written corrective feedback do students find
engaging?
• Research Question 2: In what ways do students engage with written corrective
feedback?
• Research Question 3: What do students perceive as mediating factors when
engaging with written corrective feedback?

4.5 Context and Participants

My small-scale study was conducted as part of my doctoral thesis in a secondary business school in Lower Austria, and in order to carry out the research project I had to seek
permission from the principal of my school. Moreover, I was only allowed to conduct
research with students from one of my classes, preferably 16 years or older. As a
consequence, my research population were my final year students of the academic
session 2014/15 who graduated from school after their written (May 2015) as well as
oral A Level exams in June 2015. 18 students were in my English class, four of those
took part in my pilot study focus group interview and another eight students aged
between 18 and 21 voluntarily participated in the actual study. The reasons for not
participating in the study were diverse, from not wanting to be recorded to not wanting
any of their work being published, although they knew that every single participant
would be anonymised, and all their rights protected. To ensure their privacy all
students participating in the study signed a consent form (see Appendix D).
It has to be stated that one aspect definitely distinguishing our type of school from
others is the diversity of nationalities attending. In the academic session 2014/15
students from 26 different nations speaking 31 different mother tongues attended our
business school. This diversity can also be seen in my participants (see Table 4.1),
where only five out of twelve participants (pilot study included) had German as their
mother tongue.
Table 4.1 Participants and languages spoken

Pilot Study Participants

Participant   Mother tongue   Second language   Foreign languages
Angela        Polish          German            English, French, Spanish
Dana          Romanian        German            English, French, Spanish
Marvin        German          –                 English, Italian, Spanish
Tara          German          –                 English, Italian, Spanish

PhD Study Participants

Participant   Mother tongue   Second language   Foreign languages
Alan          German          –                 English, Italian, Spanish
Anne          Serbian         German            English, Italian, Spanish
Emily         Romanian        German            English, Italian, Spanish
Julie         Serbian         German            English, Italian, Spanish
Katharine     Slovak          German            English, Italian, Spanish
Michael       German          –                 English, Italian, Spanish
Susan         Albanian        German            English, French, Spanish
Veronica      German          –                 English, Italian, Spanish

All participants’ names have been anonymised.

Last but not least, to understand the assessment system of my school it has to be mentioned that it is our policy that written exams account for 40% of the overall grade, and students get a positive grade when they achieve 51% altogether. According to the curriculum for our type of school students are expected to reach level B2 on the Common European Framework of Reference (see Online 13 and Online 14) in English in their final year. My final year students had been learning English for between nine and thirteen years and since they were attending a business school, they had had an average of three hours of English per week. Each semester the students
received a grade based on one written exam, the so-called Schularbeit, usually four
vocabulary quizzes, participation in class and homework (ranging from writing tasks,
listening tasks and reading tasks to watching video clips to be discussed in class).

4.6 Discussion of Feedback Method

The final year students I investigated encountered several kinds of written CF methods. In their first two years, I used direct corrective feedback, providing them
with the correct answer, as I had been trained to do in various didactics seminars at university level as well as in courses in the teacher training programme. I
was very dissatisfied with this kind of feedback, because my learners showed little
improvement in their writing and recurring errors seemed to prove my point that direct CF was ineffective. In their third year I tried to combine a marking code, using symbols, i.e. G for grammar, Spell for spelling, WW for wrong word, with
short written comments to indicate areas they were already good at and areas they
should work on. When reviewing the literature on feedback (e.g., Chandler, 2003;
Ellis, 2009; Hendrickson, 1980; Hyland & Hyland, 2006; Scriven, 1967; Widdowson,
2003) I realised that the focus in respect of feedback had always been more on the
teacher than on the learner. Two essential questions needed to be addressed then:
First, which kind of written feedback method was the most effective? And second,
which method(s) did learners find engaging?
At the beginning of the academic session 2013/14 I changed two aspects of my
teaching. First, I introduced our platform 4a’s English Corner, a closed commu-
nity learning platform offered for free through CourseSites by Blackboard, which
only my students and I had access to. In the section called “Weekly Tasks” (see
Fig. 4.2), I uploaded supplementary articles, video clips and listening comprehen-
sion tasks related to the various topics (e.g., globalisation, customer service, retail
chain) covered during the course of the school year.
Second, on 4a’s English Corner, learners also had a homework blog, where each
of them uploaded his or her piece of written homework. This way they all had access
to each other’s pieces of writing, which could be used for peer correction as well. What is more, with all the data accessible online, learners can check their own as well as their peers’ work at a later date. Hence, the platform is a tool where all drafts
of written texts can be stored and function as a resource which can be easily accessed
(Hewings & Coffin, 2019). Furthermore, I used a colour code to indicate areas that
needed to be improved: word order, grammar, spelling, expression, content, and ¥: something is missing (the colour indicates in which area). What should be corrected
in each area had been discussed with the learners at the beginning of the academic
session 2013/14. In the first step I only used the colour code and uploaded their
documents on their homework blog on our platform. The learners then handed in the
second draft with their corrections. They kept the colour code too and any additional
corrections they had made were in bold type. Therefore, I did not need to read the whole piece of written homework once again, only the improved parts. This time, in addition to error correction, where I underlined everything I changed and/or added, they also got a personal written statement on what they had done exceptionally well and where they still needed to work on certain aspects of their writing.

Fig. 4.2 Online platform—weekly tasks
To illustrate the feedback method, I included an excerpt from a student’s first and
second draft (see Figs. 4.3 and 4.4) of a piece of homework. The assignment was
adapted from a task in our course book Focus on Modern Business 4/5 (Ashdown,
Clarke, & Zekl, 2012). We had been talking about sponsorship for a couple of weeks
and I then asked them to write an article for our school magazine First Choice. They
had to argue whether or not they were in favour of Fryalot, a fast food company, sponsoring the annual school ball. Obviously, Tara’s first draft did not resemble an
article, although I had uploaded a file on types of text (see Online 15) relevant for
the written A Level exam in English. I pointed that out to her (and others who forgot
to check the file) in class, which clearly helped with the second draft. Still, I realised that there were areas that needed adjustment, so that learners could benefit more from written CF.
Fig. 4.3 Student’s first draft

Sponsorship

At first, I would like to say that it’s very thankful that the company Fryalot would offer us a sponsorship ¥. But there are many things to overthink about this sponsorship. Fryalot is a fast food company, so if they would offer snacks on our Ball then it will be unfortunately unhealthy. Otherwise when I am thinking about some party’s then I never eat something healthy there For our students it would be very cheap because fast food isn’t as expensive then healthy food. Maybe if we could have some volunteers they could make a healthy buffet with smoothies and some fruits I think. Indeed I also have to say I don’t know if they would make many profits. In the case, what is very likely, teenagers are drunk and then they would rather prefer fast food than fruits or smoothies.

Fig. 4.4 Student’s second draft

Schools

Snacks of Fryalot for the ball?

By Tara Darby

At first, I would like to say that we’re very thankful that the company Fryalot would offer us a sponsorship for our ball. But there are many reasons to overthink about this sponsorship.

Fryalot is a fast food company, so if they offered snacks on our ball then unfortunately it would be unhealthy. Indeed, when I am thinking about some parties then I never eat something healthy there or at a ball. For our students it would be very cheap because fast food isn’t as expensive as healthy food. Maybe if we could have some volunteers, they could make a healthy buffet with smoothies and some fruits, I think. To be honest I also have to say I don’t know if they would make much profit. In the case, which is very likely, that teenagers are drunk and then they would rather prefer fast food than fruits or smoothies.

Analysing my data (see Chap. 5) made it obvious that this feedback method needed to be improved in order to help the learners to engage with working on their errors. What some of them commented on was that they needed more help in categories such as grammar, expression and content. As Emily stressed, without the clues “it’s really like a quiz. […] grammar, absolutely, I need more. I, I need hints. Expression, … oh, …. maybe” (Foc Group II, Paragraphs 667–669). For these areas a simple colour code took them far too long to figure out what was wrong in the first place. Subsequently, I added clues in brackets to help them with their corrections, e.g., for grammar, saying
in brackets tense, personal pronoun needed, etc. Furthermore, all of them mentioned
that a personal written statement should be part of the first and second draft (as well
as part of any feedback method being used for that matter), telling them what they
did well, which areas they still needed to work on or could even improve further.
As Veronica put it, the personal written statement is “motivation. […] you know
what you can do better” (Foc Group II, Paragraph 377). They especially liked the
personalised aspect of this feedback method but stressed that it had to be different
for every learner and not just a formula a teacher used for every learner (see also
Higgins, Hartley, & Skelton, 2002).

4.7 Data Collection

The data discussed and analysed were collected during and after the academic session
of 2014/15 within an eleven-month period. They consisted of two focus group interviews
with four students in each group, a questionnaire, a follow-up interview conducted
individually, and written samples produced either in school or at home by the students.
For each student the written data consisted of three texts (two articles and one report)
written during their written exams in the first and second semester, fourteen pieces
of written homework (first and second draft) from most students as well as several
blog entries on our online platform. All learners provided permission to use their
oral (see Appendix D) and (if relevant) written data for my study.
Although questionnaires are said to be an objective way of answering research
questions, I first decided that they were not the best means for my purpose, espe-
cially as I wanted to find out which feedback methods learners engaged with, and
why. To gather a more in-depth insight into learners’ ideas, I opted for focus group
interviews as they “are ideal for exploring people’s experiences, opinions, wishes
and concerns” (Kitzinger & Barbour, 1999, p. 5). Contrary to this, focus groups have been criticised on the grounds that they “are artificial situations which would not
exist without the intervention of the researcher” (see Kitzinger, 1994 as cited in
Green & Hart, 1999, p. 24). Then again, would any kind of research exist without
the researcher? It is therefore not a question of artificiality, but of credibility in the
process of gathering and analysing data. If the researcher is also the interviewer, a
“more productive approach would be to get all participants to feed off each other
with the moderator’s role being ‘relegated’ to that of being one of the discussants,
with occasional clarifying or directional questions” (Stewart & Shamdasani, 2014,
p. 85). Making the interviewees the centre of the interview is crucial then.
Taking all these considerations into account, a semi-structured approach seemed
to be appropriate to give the learners the opportunity to express their own ideas. The
participants were divided into two focus groups with four participants in each group.
They themselves decided on who joined Focus Group I or II, gave me a date they
wanted to do the interview, and told me they preferred the school premises as our
interview location. Moreover, they wanted the recording to take place in their own
classroom as they felt comfortable there. The focus group interviews took place in December 2014.

Table 4.2 Length of interviews

Participant   Focus group   Follow-up
Alan          1:05:14       00:41:26
Anne          1:06:13       00:51:51
Emily         1:06:13       00:50:22
Julie         1:05:14       00:48:05
Katharine     1:05:14       00:50:42
Michael       1:06:13       00:41:23
Susan         1:05:14       00:37:28
Veronica      1:06:13       00:54:23
I used the prompt cards (see Appendix B) generated for the pilot study and placed
these on the table during the interview (see Table 4.2), which was recorded. The
order in which the learners discussed the various aspects was chosen by them. The
overall aim of the interview was to let the learners discuss these areas with as little
interference in their interactions from myself as possible in order not to influence
them in any way (see also Taylor & Robinson, 2014). In case their conversation
should break down, I also prepared interview questions (see Appendix E), which I
did not need to use at all, as all questions asked developed through their discussions.
The focus group interviews were transcribed and analysed with audiotranskription
software using f4transkript for transcribing the audio recordings and f4analyse for
analysing the transcripts. During that process it became evident that an individual
follow-up interview (see Table 4.2) was absolutely necessary. On the one hand, I
wanted to give the more introverted learners the opportunity to voice all their ideas,
and their discussion had raised new questions. For example, what they claimed they
did and what they actually did sometimes differed enormously. The constant research
into literature on learner engagement throughout my study, on the other hand, had
made it clear that in order to answer all my research questions, I needed a follow-
up interview too. Besides clarifying aspects from the focus group interviews, I also wanted to analyse the behavioural, cognitive and emotional dimensions of engagement
(see Sect. 2.3).
A semi-structured interview as in the focus group did not seem to be suitable for
an individual interview—especially when investigating the above-mentioned three
dimensions of engagement. Although I had dismissed using a questionnaire at the
beginning of my study, I realised that for investigating dimensions of engagement, this
research tool proved to be invaluable. During extensive research I found several ques-
tionnaires on student engagement, all of which I analysed with regard to how relevant they were for answering my research questions. After thorough research I created
a questionnaire (see Appendix F) based on NSSE, AUSSE, HSSSE and CCSSE (see
Online 5–7, Online 16), adapted some of their items and added items relevant to my
research context of student engagement with written corrective feedback.
As the main focus of my study was on learner engagement with written CF, simply
using one of the existing questionnaires, such as the above-mentioned, proved to be
ineffective as these focused more on extracurricular activities among other areas.
I, however, wanted to research learner engagement with written CF. Hence, I had
to design my own questionnaire, but decided to use questions from the existing
questionnaires that suited my study. For all questionnaire items generated by NSSE,
AUSSE and CCSSE that I used I got permission from the respective organisations
(see Appendix G). HSSSE stated on their website (Online 7) in June 2015 that
questionnaire items could be used for educational purposes and therefore no item
user agreement was needed then. As this policy had changed in the meantime, I
contacted them in August 2016 and got their permission as well (see Appendix G).
One might argue that using items from different research tools is problematic.
However, if you combine two areas of research that have not been combined before,
it is crucial to evaluate what has been done in the respective areas so far. Hence,
using parts of existing tools, which have already been piloted and been in use for
years, seems to be adequate. After careful consideration I chose items which served
my purpose and added items of my own to be able to answer my research questions.
As a result of using interview and questionnaire data I had to determine how
to best present the collected data. It has to be stated that this study is not a mixed
methods research (quantitative and qualitative research combined) study but draws
on mixed methods as I used the quantitative part to complement the qualitative one.
The analysis of the questionnaire is a simple count of numbers, a method Becker
termed quasi statistics (see Becker, 1970 as cited in Maxwell, 2010). Hence, each
student’s response to all the questions in the questionnaire was collated in one docu-
ment and then the overall percentage was calculated. To give an overview of the
students’ responses and to visualise the data, Microsoft Excel 2010 was used in the Findings and Discussion chapter. As the questionnaire was designed after the
focus group interviews and used as the basis for the individual interviews, the graphs
using percentages are always supported by statements from the students. This study by no means claims generalizability, but rather what Maxwell (1992) called internal generalizability, which means “generalization within the setting or collection
of individuals studied, establishing that the themes or findings identified are in fact
characteristic of this setting or set of individuals as a whole” (Maxwell, 2010, p. 478).
To illustrate the results of the questionnaire and to underline the data gathered from the focus group and individual interviews, using numbers was paramount.
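The quasi-statistics step itself is easy to reproduce outside a spreadsheet. As a purely illustrative sketch (the item names and response options below are invented and do not correspond to the actual items in Appendix F), the tally-and-percentage calculation could look as follows in Python:

from collections import Counter

# Each questionnaire item maps to the eight participants' answers.
# Item names and response options are invented for illustration only.
responses = {
    "engages_with_colour_code": ["often", "often", "sometimes", "never",
                                 "often", "sometimes", "often", "often"],
    "writes_second_draft": ["often", "sometimes", "sometimes", "never",
                            "often", "often", "sometimes", "often"],
}

for item, answers in responses.items():
    counts = Counter(answers)        # the simple count of numbers ("quasi statistics")
    total = len(answers)
    percentages = {option: round(100 * n / total, 1) for option, n in counts.items()}
    print(item, percentages)

Each item’s percentages can then be charted and, as described above, read alongside the students’ statements from the interviews.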
The questionnaire (see Appendix F) consisted of nine questions and took 15
minutes to complete. The first five questions focused on the students’ general expe-
rience at school, asking more generally about how much teachers focused on memo-
rising facts and figures for classes or evaluating a point of view as well as how much
the learners experience contributed to their development in areas such as writing
clearly and effectively, working effectively with others or learning effectively on
their own. The remaining four questions asked about their engagement with various
feedback methods, factors influencing their decision to work with them as well as
aspects of their English homework. These questions included, for example, questions about how many hours the learners had spent completing homework for class or writing the second draft of an assignment outside school. They were also asked about how often
they prepared a draft of an assignment before turning it in, received feedback from
teachers or attended class with all assignments completed. Reflecting on their own skills, abilities, motivations and aspects related to the three dimensions of engagement was part of question five. The remaining four questions all centred around
feedback methods they engaged with, strategies they used while doing their English
homework, recurring errors learners corrected as well as factors influencing their
engagement with written CF.
The students filled in the questionnaire before the individual interview sessions
which were conducted in July and August 2015. This time the learners were given
several dates they could choose from to do their respective individual interview.
These interviews were once again recorded at the school premises in their former
classroom. It was the students’ idea to do the follow-up interview in July and August
after they had finished their A Level exams. They argued that they were no longer
part of our school, hence any doubts they might have had beforehand about revealing
certain kinds of thoughts on school-related issues were no longer a problem. The
learners had clearly been thinking about my request to be absolutely honest with
me, which shows that they took their role in this study seriously and underlines the
credibility of my data too.
During the follow-up interview I talked the questionnaire through with each of
them individually and asked additional questions when necessary. Questions for each
participant were prepared on the basis of their answers in the focus group interview
as well as the questionnaire (see Appendix C). Once again, the individual inter-
views were recorded, then transcribed and analysed using f4transkript and f4analyse
respectively.
After transcribing the focus group and individual interviews using f4transkript,
I coded the data using f4analyse. Keeping the three research questions in mind, I
not only needed to code the data according to the obvious categories of engagement
and feedback methods, but also to mediating factors that influenced the learners’
decision to work with written CF. Hence, during the coding process it became evident that besides the main codes of attitudes and motivation, the teacher’s role needed to be included as well. Working through all transcripts several times, I
divided the code system into the ten main categories assessment, attitudes, demotiva-
tion, engagement, feedback methods, homework, motivation, pressure, strategies and
teacher’s role. Some of these categories were then divided further into subcategories.
Attitudes included self-confidence, peers, individuality, error-driven, anxiety, fixed
roles, commitment, interest and laziness. Engagement was subdivided into learning
by heart, students’ definition, emotional, cognitive, lack of cognitive engagement
and behavioural. Feedback methods was divided into no feedback, combination
of methods, peer review, colour-code, error indication, error correction, reformu-
lation, codes in margin, statement by teacher and self-correction. Homework had the
subcategory tasks, motivation consisted of extrinsic, intrinsic and feedback. While
coding I added time and time management to the main category pressure. Strategies
was expanded by writing and cooperation. Finally, teacher’s role included unclear
instructions, feedback as dialogue and feedback by teacher. Unsurprisingly, some
of the learners’ statements had more than one code, as they referred to two or three
categories at the same time.
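For readers who prefer a compact overview of the code system, its hierarchy can be written out as a simple mapping from main categories to subcategories. The sketch below merely restates the categories listed above in Python; it is an illustration of the structure, not an export from f4analyse:

# The code system described above, as a nested mapping (main category -> subcategories).
# Main categories without subcategories map to an empty list.
code_system = {
    "assessment": [],
    "attitudes": ["self-confidence", "peers", "individuality", "error-driven",
                  "anxiety", "fixed roles", "commitment", "interest", "laziness"],
    "demotivation": [],
    "engagement": ["learning by heart", "students' definition", "emotional",
                   "cognitive", "lack of cognitive engagement", "behavioural"],
    "feedback methods": ["no feedback", "combination of methods", "peer review",
                         "colour-code", "error indication", "error correction",
                         "reformulation", "codes in margin",
                         "statement by teacher", "self-correction"],
    "homework": ["tasks"],
    "motivation": ["extrinsic", "intrinsic", "feedback"],
    "pressure": ["time", "time management"],
    "strategies": ["writing", "cooperation"],
    "teacher's role": ["unclear instructions", "feedback as dialogue",
                       "feedback by teacher"],
}

# As noted above, a single statement can carry more than one code, for example:
example_segment_codes = ["motivation > extrinsic", "teacher's role > feedback by teacher"]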

References

Ashdown, S., Clarke, D., & Zekl, C. (2012). Focus on modern business 4/5 (p. 17). Berlin and Linz:
Cornelsen.
Becker, H. S. (1970). Field work evidence. In H. Becker (Ed.), Sociological work: Methods and
substance (pp. 39–62). New Brunswick, NJ: Transaction Books.
Bitchener, J., & Storch, N. (2016). Written corrective feedback for L2 development. Bristol:
Multilingual Matters.
Chandler, J. (2003). The efficacy of various kinds of error feedback for improvement in the accuracy
and fluency of L2 student writing. Journal of Second Language Writing, 12, 267–296.
Cohen, A. D. (1987). Student processing of feedback on their compositions. In A. Wenden & J. Rubin
(Eds.), Learner strategies in language learning (pp. 57–69). Language Teaching Methodology
Series. New York et al: Prentice Hall.
Dörnyei, Z., & Ushioda, E. (2011). Teaching and researching motivation (2nd ed.). Harlow: Pearson.
Ellis, R. (1997). SLA research and language teaching. Oxford: Oxford University Press.
Ellis, R. (2009). A typology of written corrective feedback styles. ELT Journal, 63(2), 97–107.
Ellis, R. (2012). Language teaching research and language pedagogy. Chichester: Wiley.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the
concept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Green, J., & Hart, L. (1999). The impact of context on data. In R. S. Barbour & J. Kitzinger (Eds.),
Developing focus group research: Politics, theory and practice (pp. 21–35). London et al: Sage.
Hendrickson, J. M. (1980). The treatment of error in written work. The Modern Language Journal,
64(2), 216–221. Retrieved from http://www.jstor.org/stable/pdf/325306.pdf.
Hewings, A., & Coffin, C. (2019). Fostering formative online forums: Feedback, dialogue and
disciplinarity. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts
and issues (2nd ed., pp. 184–205). Cambridge: Cambridge University Press.
Higgins, R., Hartley, P., & Skelton, A. (2002). The conscientious consumer: Reconsidering the
role of assessment feedback in student learning. Studies in Higher Education, 27(1), 53–64.
Retrieved from http://nitromart.co.uk/jem/docs/tt/assessment%20feedback%20on%20student%
20learning%20journal%20article.pdf.
Hyland, K., & Hyland, F. (Eds.). (2006). Feedback in second language writing: Contexts and issues.
New York: Cambridge University Press.
Kitzinger, J. (1994). The methodology of focus groups: The importance of interaction between
research participants. Sociology of Health & Illness, 16(1), 103–121.
Kitzinger, J., & Barbour, R. S. (1999). Introduction: The challenge and promises of focus groups.
In R. S. Barbour & J. Kitzinger (Eds.), Developing focus group research: Politics, theory and
practice (pp. 1–20). London et al: Sage.
Maxwell, J. A. (1992). Understanding and validity in qualitative research. Harvard Educational
Review, 62, 279–300.
Maxwell, J. A. (2010). Using numbers in qualitative research. Qualitative Inquiry, 16(6), 475–482.
O’Leary, Z. (2014). The essential guide to doing your research project (2nd ed.). Los Angeles et al:
Sage.
Quimet, J. A. (2011). Enhancing student success through faculty development: The classroom survey
of student engagement. 高等教育ジャーナル: 高等教育と生涯学習 = Journal of Higher
Education and Lifelong Learning, 18, 115–120.
Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), Curriculum evaluation (pp. 39–83). Chicago: Rand McNally.
Stewart, D. W., & Shamdasani, P. N. (2014). Focus groups: Theory and practice (Vol. 20). Thousand
Oaks, CA: Sage.
Taylor, C., & Robinson, C. (2014). ‘What matters in the end is to act well’: Student engagement and
ethics. In C. Bryson (Ed.), Understanding and developing student engagement (pp. 161–175).
The Staff and Educational Development Series. London and New York: Routledge.
Widdowson, H. G. (2003). Defining issues in English language teaching. Oxford: Oxford University
Press.
Yazzie-Mintz, E. (2007). Voices of students on engagement: A report on the 2006 high school
survey of student engagement. Center for Evaluation and Education Policy, Indiana University. Retrieved from http://files.eric.ed.gov/fulltext/ED495758.pdf.

Online Sources

Online 5: http://nsse.indiana.edu/html/survey_instruments.cfm [26.06.2015].
Online 6: http://www.acer.edu.au/ausse/reports [26.06.2015].
Online 7: http://ceep.indiana.edu/hssse/index.html [26.06.2015].
Online 8: https://www.bmbwf.gv.at/Themen/schule/schulpraxis/lp.html [18.04.2020].
Online 9: http://www.bildungssystem.at/en/footer-boxen/euroguidance-austria/about-us/
[03.08.2016].
Online 10: www.bildungssystem.at, OeAD-GmbH / Euroguidance Österreich [18.04.2020].
Online 11: https://www.bifie.at/bildungsstandards-und-kompetenzorientierter-unterricht/
[18.04.2020].
Online 12: https://www.cebs.at/service-angebote/wegweiser/wegweiser-muendliche-rdp/
[18.04.2020].
Online 13: https://www.coe.int/en/web/common-european-framework-reference-languages/level-descriptions [18.04.2020].
Online 14: http://www.bildungsstandards.berufsbildendeschulen.at [18.04.2020].
Online 15: https://www.hpt.at/sites/default/files/fileadmin/documents/SchulbuchPlus/
170022_SBP_Going_for_Finals_BHS/Textsorten_mit_Model_Task_for_BHS_2016.pdf
[18.04.2020—newest version].
Online 16: https://www.ccsse.org/aboutsurvey/CCSR_2005.pdf [26.06.2015].
Chapter 5
Findings and Discussion

How to present the data of the eight participants was the main concern at the beginning
of the analysis. Would it make more sense to discuss each participant individually
or should the results be presented according to the different categories that emerged
while analysing the data? Having read a lot of studies and compared the various styles, I decided not to discuss each participant individually in the presentation of the findings, the reason being that it was not designed as a study with a focus on individuals’
beliefs, but was meant to gather an overall understanding of engagement with written
CF.
While coding the data a lot of categories and subcategories arose (see Appendix
H), hence it was essential to decide how to structure these. After careful consideration
and taking the research questions into account, the analysed data are portrayed in the subchapters engagement, feedback methods and mediating factors. Each of these categories includes several subcategories (for example the three dimensions of engagement, various feedback methods, or attitudes and emotions) to show the diversity of these. The learners’ answers to and opinions on engagement, feedback
and mediating factors created another important category in this respect, namely
strategies for working with feedback. This subchapter discusses why this aspect might
be essential for learners when asked to work with any kind of feedback method the
teacher chooses.
To what extent the research questions were answered is discussed in the conclu-
sion. For that reason, the results of the analysed data were critically evaluated, and
attention was drawn to any pitfalls that might have arisen due to the methodological
design. Implications for practice and future research are discussed in the final chapter
as well.


5.1 Engagement

Engagement is multifaceted and research has shown that it consists of three distinct
dimensions: emotional, cognitive and behavioural (see Chap. 2). Combining this
concept with written CF entails investigating how much influence these dimensions
have on learners’ engagement with it. Is one dimension more prominent than the
other two or is the level of engagement unique to every learner? This uniqueness
might be an essential feature for the learners’ level of engagement. The results of
the data analysis indicate that engagement is at the core of working with written
CF. As the concept of learner engagement is a relatively new one and opinions on
it among researchers are quite diverse (see Chap. 2), I included the term as one of
the prompts (see Appendix B) in the focus group interviews as well. Like a lot of
researchers my students also had quite different ideas about what engagement could
mean. Veronica, for example, defined it in “the sense of, um, like marriage” (Foc
Group II, Paragraph 862).1 This was supported by Julie, who viewed engagement
as an interplay between students and their teachers: “[W]hen it comes to homework
that we, students, should do it, in order to improve ourselves and to help the teacher
help us improve ourselves. And not only homeworks, but like in the lessons that we
are doing something with the teacher” (Foc Group I, Paragraph 304).
Katharine, who agreed with Julie that students and teachers should be working
together, added that in “the end it’s all on us. […] Because we have to learn it, we
have to do the homework” (Foc Group I, Paragraph 331). Hence, besides the teachers, learners have a responsibility too when working with feedback. Susan went even
further by saying “you know that you are doing the homework for you and not for
the teacher” (Foc Group I, Paragraph 305). This notion was further underlined by
Alan’s and Anne’s opinions that the level of engagement with written corrective
feedback was your personal decision and no one else’s (cf. Foc Group I & II). Emily
emphasised “I think that for me engagement means that […] students are focused
on […] school or on how to improve theirselves to get … forwards” (Foc Group
II, Paragraph 935). The participants realised that their emotional engagement with the teacher was one factor which influenced their overall engagement, but that ultimately they should engage with written CF for their own benefit. This, as they explained, was not always easy when they had negative feelings towards one of their teachers, as is best illustrated by Julie, who stressed:
I mean, … if we like the teacher, then we’ll … most likely also have … fun in the lessons,
and do […] the things the teacher asked from us. But like in other subjects where […] it’s
boring, or […] nobody is paying attention to what the teacher is saying, … then … there
[…] won’t be like a student enchangement (= engagement). (Foc Group I, Paragraph 326)

Even in their short statements on this matter one aspect became clear: learner
engagement is a complex system, which definitely needs to be considered when
working with written CF. In order to better understand learners’ decisions to work, or not work, with it, engaging with this concept appears to be essential.

1 Key to transcription conventions see Appendix I.


5.1.1 Emotional Engagement

Emotional engagement includes, for example, learners’ reactions to peers and teachers—two factors that were mentioned by the participants in the focus group
and individual interviews as well. Although some claimed that peers had rather
little impact on them (see Sect. 5.3.2), this issue came up quite a lot during the
interviews. When talking about peer correction, for instance, Susan stressed that she
had thought about how to give her peers feedback, because she argued: “don’t want
to hurt the other one” (Foc Group I, Paragraph 135). As a consequence, she would
have never given a peer harsh feedback as she had experienced from some teachers.
Julie emphasised that getting feedback from peers was something she was "looking
forward to […] what he or she has to say about it. If it’s good, if it’s bad.” (Foc
Group I, Paragraph 139–141). Seeing peers as someone who could motivate them,
when teachers failed to do so, was another key factor in emotional engagement (cf.
Foc Group II). Emily especially highlighted that in her case peers sometimes really
helped her to get back on track after she had received a bad result. That peers could
be important for a learner’s level of engagement could also be illustrated with Julie.
She stressed “I liked the class so much, because […] whenever I .. was saying, uh,
what was in my mind […] everyone was kind to me” (Julie, Paragraph 60). The
cooperation of her peers was something she acknowledged as well, which every now and
then resulted in her doing a second draft because Emily reminded her that homework was
due. Without Emily, Julie would have simply forgotten about the deadline.
Peers having a positive and negative influence at the same time can be best demon-
strated with Katharine. She appeared to be a rather shy student, and when asked in
the individual interview about it she reported that this was only the case in "classes I
think. .. But […] Susan knows all my life” (Katharine, Paragraph 272). When asked
what she was thinking about while her classmates were discussing a topic, given that she
rarely participated actively in discussions, Katharine emphasised: "I just listened to
everything […] and […] I think about it and then that’s … Uh, yeah, that would .. fit
in, right into the topic. And then […] boom, Michael said it!” (Katharine, Paragraph
250). What is more, she explained whenever she felt disappointed by teachers, she
could rely on Susan to support her and cheer her up. In connection with feedback,
however, comparing her own written work to her peers' sometimes resulted in her not
finishing her assignments (see Sect. 5.3.5.7), as she felt that her writing as well as her
ideas were not good enough. In her case the emotional dimension played a huge role
and was occasionally her reason for not engaging with written CF, which is an interesting
aspect for teachers to consider when trying to figure out the motives behind a learner's
lack of engagement.
Every now and then one peer seemed to be the reason for other students finding it
difficult to engage with feedback. Several students reported difficulties with another
peer when working on their final year project (the students had to write a thesis on
a topic of their choice related to business in a group of a maximum of five people).
Some of them said that these difficulties caused an inner conflict, because “we had
a person in our group that […] nobody really … could work with […]. That was
really hard [..] we had to, because […] it would also be unfair to leave her alone
and out of this […], because we know she was in […] final version of it (= thesis)”
(Julie, Paragraph 100). In order to get a good grade on their project, some students
of that group ended up writing parts for that particular classmate. As a consequence,
this contributed to their level of frustration and made the writing process as well as
working with the feedback they got much harder. Finishing their project posed quite
a challenge to them as this student did not change her attitude. This instance shows
how much impact a single peer can have on the level of engagement of others.
Additionally, the teacher’s role featured prominently when students were talking
about their engagement with written CF (see Sect. 5.3.1). When asked why they
engaged with the feedback method I used with them, even though many of them stated that
it was a lot of work for them, they explained:
Emily: Because of you. (laughter)
Researcher: Okay.
Veronica: Because, um, the way how we should correct it with the, um, this one
Michael: Colours, colours.
Veronica: Yeah!
Researcher: The colour-code.
Veronica: With the colour code, um, is a good way [yeah] and not, not like the, um,
underlining something. And, … yeah … [with the colour-code I]
Emily: [And it’s something new.]
Veronica: I correct it.
Michael: [It just makes sense, it,]
Emily: [Yeah, and it’s something new, or?]
Michael: [definitely.] It makes sense and you [Yeah.] you can improve that’s why I do
it. [Yeah.] ……. Basically, because
Emily: And I think that if someone gives us the possibility … you waste, uh, you
invest a lot of time in it [yeah]. So, we should really appreciate that.
Michael: Mm hmm.
Emily: That’s why.
Veronica: Mm hmm.
Emily: Because I don’t want to … Ich möchte Sie nicht enttäuschen.
Researcher: I don’t want to disappoint you.
Michael: Ohhh!
Emily: I don’t want to disappoint you. … That’s th ….., uh, … at other teachers it
doesn’t matter to me, so to say.
Michael: Yeah.
Emily: If they, if I disappoint them. (Foc Group II, Paragraphs 885–906)

This exchange shows how teachers' attitudes and reactions towards learners also had an effect
on their willingness to work with written CF. Explaining to them why I used the
colour code and self-correction, for example, made a huge difference to my learners
and heightened their willingness to work with it although it meant more work for
them. While analysing the focus group and individual interviews it became even
more evident that the participants’ emotional engagement with their teachers had a
rather huge impact on working with feedback: “when I knew it was not so good and
then I don’t want to disappoint you and I don’t want to disappoint myself.” (Susan,
Paragraph 334). Data from the questionnaire indicate that the teacher was a reason
for working with feedback for a majority of learners (see Sect. 5.3.1). During one
of the focus group interviews, the participants were discussing reasons for engaging
with certain topics as well as the feedback they got. Susan voiced an interesting
notion, saying
I think .. motivation is something that … you can’t give another person, they must find it
for … themself. [Mm hmm.] Because ….., it, I think, it doesn’t work if you are trying to …
motivate Julie for globalization, because … she must find her motivation […]. If she don’t
want to be motivated, you, as a teacher, you can’t do anything. Because then she simply
don’t want. (Foc Group I, Paragraph 383)

Although the emotional engagement students share with their teachers might help
to foster engagement with feedback, it cannot be taken for granted. When I asked
Julie—who really disliked the topic of globalization—in her individual interview
about whether or not I could have motivated her for this topic she said: “Not really
(laughs). […] It has nothing to do with you” (Julie, Paragraph 40–42). She high-
lighted the fact that she got a lot of ideas about this topic from her peers as well as
from me, but she simply could not relate to this matter. Hence, she decided not to write
her homework assignment on globalization. Michael also believed that the motiva-
tion to work with feedback was something inherent to an individual, which could
be supported by a teacher, but only to a certain extent. In the end it was his own
responsibility whether he engaged with the feedback he got or not. Further, Michael
elaborated that the reason for engaging with the feedback in English but not in Italian
and Spanish was because it was important to him and “I have the feeling that the
teacher really engages with […] my text […]. And I feel like, um, it’s interesting for
the teacher, and not just their job […], you know. That’s […] my point” (Michael,
Paragraph 254).
Emily, on the other hand, believed that teachers could motivate their learners.
In her individual interview she emphasised that she was sad that more teachers had
not given her the opportunity to self-correct her homework or provided her with
strategies to work with feedback, "[b]ecause I could do it better if […] the teacher
say […] you should yourself do it." (Emily, Paragraph 398). What was more, appreciation
from the teacher was often missing, as "I […] want to have like a
compliment […]. So to say […], you have good worked […]. But .. that was not
always […]. So you worked just harder […], but it really […] doesn’t […] count for
the teacher” (Emily, Paragraph 252). That some teachers did not emotionally engage
with their learners was also mentioned by Anne, who said “they don’t care about
how we do that or how difficult it […] could be for us.” (Foc Group II, Paragraph 67).
This was supported by Veronica: “It’s .. just the grade that counts and […] nobody
thinks about .. uh, the day … where you have written the […] text or something like
that […] Because it’s .. can also be that that’s .. not your best day and […] you don’t
feel .. very good.” (Veronica, Paragraph 366). Teachers only doing their jobs, namely
teaching the subject matter and not caring about the learners as individuals, was a concern
among the majority of the population. Consequently, it seems that for a lot of these
learners the emotional dimension was essential when working with feedback.
Another factor that emerged during the interviews was that a teacher's reaction to a
student's question regarding the feedback they got could create a strong emotional
response in them. Veronica, who had difficulties working with the feedback she received
from her German teacher, was not happy about her teacher’s reaction, because “the
German homework I gave it to her, and one week later I ask her what’s wrong, and
she said […] I can’t remember” (Foc Group II, Paragraph 49). To her mind working
with feedback in this subject was more or less pointless, because she did not know
what her errors were, and her teacher could not help her. Ultimately, she gave up and
either asked another peer for help, who was good at German, or basically did not do
her homework assignments.
How much influence a teacher can have on learners' attitudes is illustrated
by the relationship between most of the participants and their Italian teacher, which
seemed to have been somewhat problematic. Veronica reported on an incident where their
Italian teacher gave feedback rather differently on her written exam and Alan’s: “[H]e
has […] the correction with […] okay that’s wrong, you should write it that way, and
at my test it was just wrong. ……. And I don’t know what’s wrong.” (Foc Group II,
Paragraph 231). Several participants agreed that they did not receive the same kind of
feedback as their Italian teacher corrected the errors of some learners, whereas others
only got feedback that something was wrong. Further, Julie stressed that often “last
week it was okay to write, uh, the sentence like this and then in the exam […] it’s
wrong. Okay? Good.” (Julie, Paragraph 242). This led to not putting too much effort
into this subject and even resulted in her not wanting to go to school on Mondays,
“because we had two […] periods of … Italian” (Julie, Paragraph 321). Katharine
could relate to Julie’s emotions towards Italian, saying it “was just like …. grrr […]
I hate it so much!” (Katharine, Paragraph 624–626), hence being the reason why
most of the time she did not engage with feedback in this subject.
Not investing the same level of engagement in written corrective feedback due to
their emotional relationship with their teacher was mentioned by Michael as well:
“[I]n English, I think, […] I’m …. very […] keen on that […]. But, for example,
[…] in subjects like Spanish or Italian …. I never did that […] very thoroughly.”
(Michael, Paragraph 106–108). In his opinion these teachers did not give him the
feeling that his improvement in the respective language was important to them.
Although constructive feedback could lift the learners' spirits and encourage them to
engage with it, this positive incentive could fade rather quickly:
Alan: ….. Yeah, and it’s also exciting … to wait of a statement, because, […] if you are
proud of a text, which you have written, and you think it’s, it’s very good. Then you
can’t wait […] of the statement. So ……. Yeah ……. If the statement is good, then
you are … happy … for the rest of the day. (laughter)
Susan: Yeah.
Julie: Or the next two hours.
Alan: Yeah. … Until Italian. ……. Yeah.” (Foc Group I, Paragraph 184–187)
It appears that teachers, their feedback methods and how the students emotionally
engage with them have a great influence on the students' level of engagement. As can be seen
in the quote above, this level sometimes drops rather quickly from very high to very low.
Hence, the students’ level of engagement can fluctuate immensely, which can be
seen as a result of various mediating factors (see Sect. 5.3) having an impact on it. In
line with that, the Engagement-Mediator-Feedback Model (Chap. 3) captures precisely this
notion. Alan, for instance, was really excited about getting his statement on one of the
English homework assignments, hence his level of engagement was rather high. But
within hours that changed, because they had a double period of Italian. Consequently,
a learner's engagement is multifaceted, dynamic and highly influenced by mediating
factors, which further strengthens the Engagement-Mediator-Feedback Model as a
valuable concept for designing feedback methods.
Discussing reasons for not engaging with feedback in more detail, two participants
said the following:
Emily: But now we are, we have five years of school with this […] teachers, and we
are really struggling with them. But, um, I don’t know. If there […] were […]
different teachers, like we now have, I would do it.
Michael: Yeah.
Emily: But [… like]2
Michael: [Yeah, because] I think we are all interested in the […] languages […], but ….. I
mean it’s simply annoying and exhausting.” (Foc Group II, Paragraph 212–215)

This view was backed by other participants as well, thus the emotional dimension
of engagement should not be underestimated. Veronica, for example, believed that the
level of engagement on the teacher’s part had an influence on the learners’ willingness
to work with feedback as it was linked to “how motivated the teacher is.” (Veronica,
Paragraph 532).
The learners’ emotional engagement with school or a specific subject emerged as
another factor for (not) engaging. Being unhappy about her choice of type of school
influenced Julie’s engagement with subjects in general and feedback specifically. At
times she was rather upset, because “I could have gone to IT,3 uh, because that’s
something I like more, … than […] Accountancy or Italian, or whatever. I’d rather
be sitting in front of a computer figuring out what’s wrong with it.” (Foc Group
I, Paragraph 101). In her case a lack of engagement with written CF was often the
result of wanting to fix a computer problem rather than working on sentence structure,
phrases, etc. Katharine on the other hand, engaged with feedback in English, because
“I just knew that it was a .. really important language so […] I put every […] effort
that I have .. into it to […] improve [..] .. more and more.” (Katharine, Paragraph 92).
As she was of the opinion that her English was not good enough, although she really
enjoyed the subject, she invested a lot of time in getting better. Despite doing most of
her assignments and every now and then actively participating in class, she also spent

2 Brackets here are used to indicate overlapping speech.


3 When Julie attended our school, she could decide between a language, information technology or
specialised accounting focus in her studies. She chose the languages specialisation (English and
Italian/French all five years; Spanish three years).
some time abroad during one of her school holidays to interact with native speakers.
Her emotional engagement with English as a language she really liked triggered her
desire to engage with it fully.
Last but not least, as the feedback system in English was a bit more work than
in other subjects (see Sects. 4.6. and 5.4.3), I asked the learners why they engaged
with it anyway. Their responses showed that to a certain extent their relationship
with me contributed to their engagement with written corrective feedback. Emily,
for instance, who constantly emphasised the lack of time, said that she did her written
assignments because of me (cf. Emily, Paragraph 452). What was more, in her opinion
“if someone gives us the possibility [… and] you invest a lot of time in it […]. So
we should really appreciate that.” (Foc Group II, Paragraph 897). A view that was
backed by Anne, Michael and Veronica as well. Anne added an interesting aspect,
saying “because in some subjects […] we didn’t get any feedback […] but …. in,
uh, English, for example, .. we get .. very … many feedbacks from you and …. so it
influenced myself to .. work in a other […] way […] to force more myself […] to do
the work better, because .. as I said […] it lasts you also time to correct it.” (Anne,
Paragraph 534). Like Emily, she reasoned that because, in her view, I invested more
time in giving them feedback, the same could be expected of her when working on
her second draft. Additionally, Emily and Susan both stated that another reason for
doing their second drafts was because they did not want to disappoint me (cf. Foc
Group II, Paragraph 904; Susan, Paragraph 334). When I asked Emily whether she
did her second draft because I wanted her to do so, she elaborated:
Emily: …. No, not because of that [aha]. Because …
Researcher: Yeah?
Emily: … You give … uh, give me the feeling that you want to .. us to be better [mm
hmm] than we could be as it, .. We can work on it [mm hmm] .. and you …,
um, support us […]” (Emily, Paragraph 454–456)

Julie voiced a similar opinion, as she was proud of herself when she got feedback
where not only her weaknesses but especially her strengths were pointed out. In
addition, she stressed that her not doing a written assignment was due to her laziness
(see Sect. 5.3.5.4), but she appreciated that if she did one, she got extensive feedback
on it.
In conclusion, peers and teachers seem to have a notable influence on learners’
engagement with written CF. The results of the data analysis show that although
teachers more often than not are the crucial factor in it, they sometimes have very
little influence on their learners’ emotional engagement with written CF. But their
way of giving feedback has a considerable effect on the learners—which can be positive
or negative—and needs to be kept in mind. I believe that as teachers we should discuss
the feedback method with our learners on a regular basis to make necessary changes
to it. Like engagement, feedback is constantly changing too, because we are working
with human beings whose individual differences shape our way of giving feedback
as well.
5.1.2 Cognitive Engagement

Within the Engagement-Mediator-Feedback Model the three dimensions play an
equal role. Interlinked with the emotional dimension, the cognitive dimension is an
intriguing one, as it is considerably harder to measure than emotional or behavioural
engagement. Cognitive engagement means, for instance, that learners draw on strate-
gies they have already acquired and use effectively. Additionally, they try to get to the
bottom of a problem and deeply engage with their own learning process. According
to Barkley (2010)
active learning means that the mind is actively engaged. Its defining characteristics are
that students are dynamic participants in their learning and that they are reflecting on and
monitoring both the process and the result of their learning. […] An engaged student actively
examines, questions, and relates new ideas to old, thereby achieving the kind of deep learning
that lasts. (Barkley, 2010, p. 17)

These areas appear to be essential when engaging actively with written CF as well.
Deeper cognitive strategies, in particular, come into play when learners are trying to
make sense of learning material and draw on learning strategies that go beyond
summarising and memorising facts (Wolters & Taylor, 2012). The analysis of the
data shows that the learners’ cognitive engagement with the feedback they got helped
them to avoid certain errors when writing their written exams:
Michael: What I really like is, when you, for example, in one homework you have lots of, I
don’t know, um, spelling mistakes, or grammar mistakes or expression mistakes
and then you think that special kind of, of, of um, errors in the next homework
[mm hmm]. And then you really, um, focus on that and try to avoid this, these
errors. Or what’s also cool is when you see, um, your improvement … [mm hmm]
in the, in one homework. If you see, okay, last time I was really bad or had, uh,
had many grammar mistakes and then in the next homework you see okay, I, I
got better. I think, that’s.
Veronica: And, and I, I remember my mistakes, because …
Michael: Mm hmm.
Veronica: at the last homework I, I wrote some if-sentences with the conditional II and I, I
wrote would instead of past tense, I think, and I now I know it! (laughter) And
… I never knew it.
Anne: And I think in English it helps the commentary you, uh, Ms Moser writes us,
because, um, in the business letters, for example, I wrote always Austira, uh,
instead of Austria (laughter) and then in the, uh, exam or so, I think, oh, I should,
um, take a little care of this and [mm hmm] ……. Focus more on this one. (Foc
Group II, Paragraph 718–722)

Several participants commented on the fact that when self-correcting their pieces
of homework they had to think about how to correct their own errors “[b]ecause I
really thought of the […] text I […] wrote” (Michael, Paragraph 158), hence the
result being, as Emily pointed out, “you know it for a longer time.” (Foc Group II,
Paragraph 60). Most of the population stated that they checked their second drafts
before each written English exam, because “I know where my mistakes are, and then
in the exam I can … use … the corrected form.” (Julie, Paragraph 376). What was
more, Emily, for example, stated that “I sometimes .. just look on the homework ..
also after two weeks […] And I .. try to think of how would I really write that […]
and try to .. think of, okay, what .. was, was I thinking about that? .. And just try it
again.” (Emily, Paragraph 596). These learners realised that going through the final
corrections raised their awareness for recurring errors they made and consequently
helped them to avoid some of these in the respective written exam.
Further, Michael pointed out why he believed engaging with the feedback he got
was beneficial:
[I]f we would have a really, I don’t know, hard homework to do, and really learn with that,
then we wouldn’t have to learn that much for a test, for example, like in English. […] I mean
[…] for the globalization, um, article, um, for example, I didn’t really have to, …. learn
again for that topic, because I wrote it […] once, so ….. I know how it works. (Foc Group
II, Paragraph 70)

To his mind doing his first and second drafts thoroughly and actively participating
in class discussions on the various topics saved him a lot of time when studying for
an exam. Additionally, he thought that not talking in class meant that you did not
think about the respective topic. Veronica strongly disagreed with him, because “I ..
say not every time something, but I think always.” (Foc Group II, Paragraph 824).
The same was true for Katharine as most of the time she was listening intently to
what was being said in class, but did not always contribute to class discussions. Only
because learners remain silent in class does not necessarily mean that they do not
cognitively engage with the subject matter at hand.
The first question in the questionnaire was: "During your final year, how much
have your teachers in various subjects focused on the following?" Items 1a–e asked
about memorizing facts and figures for classes, understanding information and ideas
for classes, analysing ideas in depth for classes, applying knowledge or methods to a
problem or new situation as well as evaluating a point of view, decision, or informa-
tion source. The results of the questionnaire (see Fig. 5.1) show that understanding
information and ideas for classes were focused on very much (87.5%) followed
closely by memorizing facts and figures (75%). Less focus was put on analysing
ideas in depth, applying knowledge or methods and evaluating decisions, etc.
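To make the arithmetic behind such figures transparent, the short Python sketch below tabulates a set of Likert-scale answers into percentages of the kind reported in Fig. 5.1. It is only an illustration: the answers listed, the item they refer to and the function name are invented and do not reproduce the study's raw data.

from collections import Counter

# Hypothetical answers of eight learners to one questionnaire item
# (e.g., "understanding information and ideas for classes").
responses = [
    "very much", "very much", "very much", "very much",
    "very much", "very much", "very much", "some",
]

def tabulate(answers):
    # Return the share of answers (in per cent) for each response category.
    counts = Counter(answers)
    categories = ["very much", "some", "very little", "not at all"]
    return {c: 100 * counts.get(c, 0) / len(answers) for c in categories}

print(tabulate(responses))
# -> {'very much': 87.5, 'some': 12.5, 'very little': 0.0, 'not at all': 0.0}

Since every percentage reported in this chapter is a multiple of 12.5, each participant presumably corresponds to 12.5 percentage points in such a tabulation.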
The data suggest that the majority of teachers of this population put more emphasis
on shallow cognitive engagement where learners were not asked to engage more
deeply with certain areas. Like Michael stressed “in our school system […] you have
to memorise a lot of facts, […] but you don’t learn how to use these facts, I guess. […]
sometimes I think we, we forgot […] the facts after a test or a revision. And that’s
not good I think.” (Michael, Paragraph 6). Deep cognitive engagement occurred too
seldom but the majority of the students believed that it should have been fostered
more strongly. They argued that the tight timetable, too many subjects and only two
to three lessons a week for most subjects might be the reason for a lack of deep
cognitive engagement. What is more, they mentioned that especially in their final
year most teachers placed more focus on applying knowledge, analysing ideas in depth
as well as evaluating a point of view. To their mind one of the reasons could have
been the approaching A Level exams where these skills were necessary to perform
Fig. 5.1 Areas focused on in class (bar chart; y-axis: percentage; response categories: very much, some, very little, not at all)

well. Several participants voiced the opinion that they would have appreciated more
focus on deep cognitive engagement a bit earlier. As Katharine highlighted, “it was
mostly in the final year when […] it was like … ‘Wait!’ […] We heard that […] in, I
don’t know, Accounting and then now it […] was in, […] Business Studies […] and
[…] everything makes […] sense now!” (Katharine, Paragraph 396–398).
During the focus group and individual interviews, the students stressed that doing
their second draft and working on their errors took quite some time (see Sect. 5.2.9.1),
especially at the beginning when the feedback method was still alien to them. For
most of them adjusting to self-correcting their errors was somewhat difficult, because
they were not used to the idea that they should think about their errors and figure
out the correct solution by themselves, before they got the teacher’s final feedback.
As Veronica stated “the concept is, is good […]. It’s, yeah, hard work sometimes
[…] but .. I remember my errors.” (Veronica, Paragraph 508). In order to ensure
that students also get the opportunity to engage with feedback on a cognitive level,
they might sometimes need further help from the teacher. Katharine, for example,
believed “there are just those mistakes that you can’t see by yourself when you read
through it again. You just overlook it." (Foc Group I, Paragraph 110). Hence, most
learners need support from their teachers to realise where their weak—as well as
their strong—spots are. For this population the colour code (see Sect. 5.2.4.1) in
combination with the clues as well as the personal written statement seemed to have
fostered the learners’ engagement with feedback.
To further improve the feedback method and enhance learner engagement the
learners had some ideas of their own as well. Alan, who often struggled with idiomatic
expressions, pointed out that he would have liked to either get an explanation in
his personal written statement (see Sect. 5.2.7) or in the final correction by the
teacher, as he wanted to know “[w]hy I can’t use some expressions.” (Foc Group I,
Paragraph 226). This was particularly relevant as he very often checked the frequency of
expressions on google.co.uk to make sure that he picked an idiomatic phrase. Every now and
then, however, the expression he chose could not be used in that context, so explaining
why it did not fit might have further promoted his cognitive engagement with his own
errors.
Another area in which students would have liked an explanation as well was
grammar. As established after the focus group interviews, clues (see Sect. 4.6) were
absolutely necessary for the students to be able to work on their errors themselves:
“[I]f the hints were not there, I think, I would have …. most of the time have no idea
[…] what […] … shall I do.” (Susan, Paragraph 376). In addition, Anne and Veronica
would have needed an explanation regarding tense use in the teacher’s final feedback,
because, as they stated, for them there was no logic behind grammatical patterns.
To help them to better understand why they had to use a certain tense, I started
giving them (as well as all the other learners in their English class) explanations
in my final corrections of their texts. Emily, for instance, found it difficult to choose
the correct progressive tense. In one of her written assignments she wrote the following
sentence: Since 2012 he works as a chief editor for the Austrian weekly journal called
“Falter”. In her second draft she replaced ‘works’ with ‘is working ’, hence I not only
corrected it to ‘has been working ’ in the final correction, but provided the following
explanation as well: “You need a present perfect progressive here, because he started
in 2012 but is still working there. Hence, the process started in 2012 but is not over
yet.” (see Appendix J). Emily as well as other learners appreciated explanations
they got either in the personal written statement or in the final correction, a practice that was
implemented into the feedback method after the focus group interviews. Michael, for
example, was unsure whether to use "in the picture" or "on the picture". Although
he checked on linguee.de he did not quite grasp the difference, hence he asked for
clarification, which he got in the personal written statement on his homework blog:
Posted by Alia on Sunday, 12 October 2014 09:56:50 o’clock EDT
Hi Michael,
:-) The correct one to use is “in the picture”, simply because “on the picture” is mostly used
in connection with e.g., impact on the picture, effect on the picture, seen on the picture (of a
CD cover). The German equivalent for “on the picture” would be “auf dem Bild”, whereas
“in the picture” is used for the German “im Bild”.
Probably it is easier to explain with the following example: There is a fly in the picture vs.
the fly is on the picture (meaning a fly landed on the picture, but is not part of the picture).
I hope, you are a little bit less confused now:-)
See you tomorrow. AM

When discussing whether the first or second draft involved more work, the partic-
ipants disagreed strongly on which one needed more time to do. Alan, Emily, Julie
and Susan stated that more effort was needed for the first draft, because they had to
think about ideas, the structure of the text as well as which phrases to use. In contrast
to that, Anne, Katharine, Michael and Veronica believed that more work was put into
the second draft:
Veronica: The correction is always longer than […] the first version.
Researcher: Why is that the case?
Anne: Because, you … need more time to think about your mistakes.”
Veronica: Because at the first draft, you just write it. And, then you, you don’t think
about your mistakes. (Foc Group II, Paragraph 783–787)

The third question asked “In a typical 7-day week during your final year, about
how many hours have you done the following outside school?”: Completing home-
work for class, studying for tests or quizzes, writing your final year project, writing
the first draft of an assignment (e.g., for English), writing the second draft of an
assignment (e.g., for English) and working with classmates outside class to prepare
class assignments. The results show that half the population spent 1 h or
less on their first draft and the other 50% spent 2–3 h, whereas 62.5% claimed to have spent
1 h or less on their second draft in contrast to 37.5% who needed 2–3 h to complete
it (see Appendix K). They all agreed, though, that they deeply engaged with their
errors and realised which areas they had to put more work into. Because of doing
a first and a second draft, most of the population also recognised what they should
focus on more when doing their homework assignments. A lot of the participants
mentioned that another by-product of reworking the first draft was changing phrases
and adding ideas, although that had not been specifically pointed out in the personal
written statement. As Katharine explained: “[W]ell, sometimes I read it through and
then like … I could write that too. [… a]nd just added it into it.” (Katharine, Paragraph
730–732), thus the second draft provided the learners with enough opportunities to
not only correct their errors but also to work on the arguments they developed, which
very often resulted in deep engagement with their texts and enhanced their cognitive
engagement with the subject matter.
The strategies the students drew on when correcting their errors were rather
diverse. Whereas Julie stated “I don’t want to .. like check it online, because I want
to think […] on my own.” (Julie, Paragraph 470), most of the other participants used
the Internet to correct some of their errors. Emily mentioned: “If I think how much
work I … put on Google […] and Englishhilfen.de […] or ego4u [..] to check it out
what can be wrong […]. And […] I can say that .. I just memorise it better.” (Emily,
Paragraph 350). The majority of the population stressed that due to self-correcting
their errors they really had to think about what could be wrong and their awareness
of certain recurring errors they made was heightened—like Anne who stressed that
when “I .. get my correction back […] and see my errors or mistakes … I know that
I should, uh, focused more .. the next time.” (Anne, Paragraph 156). How reflective
some of the learners were about their progress in English can be best portrayed with
Emily, saying “for sure, that I improved […]. Also with, uh, tenses and so on …, but
there are still …. mistakes too, that have been … /throughout/.” (Emily, Paragraph
566). Reflecting on their own errors as well as the personal written statement (see
Fig. 5.6) on their written assignments appears to be part of deep cognitive engage-
ment with the feedback they got. What is more, most of the students tried to
transfer that knowledge, like Susan, who stressed, "I compare it, and [..] try […] to
figure out what shall I do to make it … better […] the next time.” (Susan, Paragraph
344).
Last but not least, another way to engage students cognitively might be when
feedback is seen as a dialogue between learners and teachers (see Sect. 5.3.1.3). As
already mentioned, one part of the feedback method in English was a personal written
statement by the teacher on our online platform (see Sects. 4.6. and 5.2.7). How the
participants engaged cognitively with the feedback they got can be best illustrated
with two examples: After writing the article on globalization in the first semester,
where they had to outline issues related to current trends in globalization and how
these have been tackled so far (cf. Ashdown, Clarke, & Zekl, 2012, p. 111), Michael
was dissatisfied with his first draft and asked me for tips on how to improve it:
Globalisation
Posted by Michael at Tuesday, 21 October 2014 11:53:51 o’clock EDT
Last Edited: Sunday, 26 October 2014 13:11:01 o’clock EDT
Hi Ms Moser!
Here is my article about globalisation. To be honest I am not quite happy with my text - but
I don’t know why. I read through it several times and thought of any other ways of saying
what I want to say, but for some reason I couldn’t. So, I would be very thankful if you could
give me some response on my text and tell me what you think about it. I am really not sure if
my article outlines the problems that were asked in the instruction, so I am looking forward
to your answer.
See you tomorrow! :-)
Greetings from a desperate Michael

Michael had clearly engaged cognitively with this topic and was trying to figure
out how to improve his article. As he could not manage on his own, he asked for my
advice, which he got as follows:
Posted by Alia on Sunday, 26 October 2014 13:10:28 o’clock EDT
Hi Michael,
As you said yourself, it is a good article, but there is room for improvement. First, in the
introduction, you need to clearly state why you picked out global natural resources. The
reader should know from the start why you believe this issue to be of importance.
Second, in your second paragraph, I would add a reason why the environment suffers
from goods being transported from the USA to Europe. Furthermore, I would add another
paragraph where you include another example of how we waste our natural global resources.
Third, for your conclusion, just add one example what you could buy in your local area
instead of getting it from the USA, Australia, etc. to stress your suggestion as to what can
be done to improve the situation.
See you tomorrow.
Best AM

Michael included a paragraph on how we waste our natural global resources, where
he stated: “Another example of how our natural resources get destroyed is how we
deal with the huge amount of wrappings that are being used nowadays. The resources
that are made use of to produce wrapping-material only achieve one thing—they get
wasted. Why? Well, it does not matter if it is packaging for a new iPhone or simply
for cheese—the bottom line is that consumers throw it into the rubbish bin.” Using
the pieces of advice given in the personal written statement, Michael changed quite
a lot in his second draft and produced a cohesive and coherent text on globalization
in the end (see Appendix L).
Another instance that proves a learner’s cognitive engagement with her written
assignment was Emily, who was dissatisfied with her article on sponsorship
(Ashdown et al., 2012, p. 17) and supposed it was because of the topic, thus asking
for advice on our homework blog as well which I responded to:
Re: Homework Sponsorship deal
Posted by Alia on Friday, 6 March 2015 11:43:17 o’clock EST
Hi Emily,
Don’t be so critical of yourself. I didn’t fall asleep reading it:-)
The first and second paragraph are well-argued, I would suggest that you work on your
final paragraph. You could argue, for example, that it is the students’ choice and that they
themselves have to decide whether or not they can live with a bad reputation for their school,
because some people might not go to the ball because of a fast food chain sponsoring the
event.
Have a nice weekend.
Best AM

These two examples, which illustrate learners' deep cognitive engagement,
strengthen the Engagement-Mediator-Feedback Model, in which feedback is seen as
a process. Learners and teachers feed off each other in order to increase learner
engagement. Feedback as dialogue is clearly the way forward to engage learners
cognitively. Using a blog where learners upload their written pieces, as I did with
this population and still do with my present students, is definitely a beneficial
way to enhance the dialogue with them. Not only can they ask questions or get
clarification on their written text, but they can also compare their pieces to those of other
classmates, which contributes to their cognitive development when they actively engage with
their classmates' written texts.

5.1.2.1 Lack of Cognitive Engagement

One of the reasons for not engaging with the feedback learners received appears to be a
lack of cognitive engagement. When discussing error correction, which was amongst
the least engaging feedback methods (see Fig. 5.2), Alan explained he did not see
any point in engaging with error correction as the correct version was already given,
hence there was no real reason to cognitively engage with this feedback method. In addition to
that Susan said “sometimes you get the correction and you accept it, but […] you can’t
understand why maybe your expression wasn’t right, and then […] you have no, um,
chance to, to do it again better. And that’s […] not more interesting.” (Foc Group I,
Paragraph 35). Veronica believed that you could probably understand the kind of error
you made, “but you .. don’t remember it.” (Veronica, Paragraph 570) as they did not
have to work on their errors themselves, thus error correction neither triggered further
engagement with these nor helped them to make any kind of progress in this respect.
Similar statements were given in connection with the feedback method error
indication/no correction: “If I have just […] underlined the wrong word […] and
I don’t know […] how […] and .. what is wrong.” (Anne, Paragraph 478–480).
Or as Veronica stressed: “it’s wrong but I don’t know why it’s wrong” (Foc Group
II, Paragraph 162). Once again, no cognitive engagement on the learner’s part was
the result as she did not know how to work with this feedback method. Emily, when
talking about her experience with writing clearly and effectively, stated “it was for me
a […] not very easy […]. Because how can I […] change something when nobody
says (= tells me).” (Emily, Paragraph 76). All these instances seem to show that
learners need help and, more precisely, strategies (see Sect. 5.4), to be able to engage
with the feedback they receive.
Several participants stressed that they could have improved in the respective
subjects if their teachers had pointed out their weaknesses in writing and how to
overcome these sooner and not only shortly before the written A Levels exam: “And
that’s why […] I always think my texts are alright […], but then in the end they turn
out they were not okay […]. And then I never really …. know … where .. was the
mistake […]. But then in the end I realised it was always the same ones […]. That I
could have improved.” (Julie, Paragraph 132). More support from her teacher might
have led to more cognitive engagement and could have helped her to succeed more
easily in writing texts in German. Emily, who also struggled in German, reported that
the sessions with her teacher between the written and oral A Level exams, where they
thoroughly worked through her errors, were extremely beneficial. She was convinced
that she would have profited from this kind of feedback had she gotten it much earlier.
Another interesting aspect regarding a lack of engagement with feedback was
mentioned by Michael in his individual interview, when commenting on not doing
homework assignments in some of the languages: “I didn’t have the, the motivation
[…] to do that […]. I was like okay…. also if I write that text now on my own
[…] without any help […] I don’t learn anything, because […] I don’t get […]
feedback […] that I can use to improve […] my skills …. and my language.” (Michael,
Paragraph 110). It seems that not being able to use the feedback he got in order
to improve in the respective subject led to a lack of cognitive engagement
with it. The data suggest that in order to foster engagement not only the learners need
to be willing to engage with feedback, but also the teacher needs to provide them
with plenty of opportunities to engage with it.
Finally, a heavy workload might be another reason for students not engaging
cognitively with feedback: “because if you have so much pressure for doing home-
works, making presentations, and this and that, then […] you will reach the point
that you just do it, because you have [to], and you don’t think about that what you
are doing." (Foc Group I, Paragraph 89). In conclusion, besides the various reasons
already mentioned, too many assignments to complete might have an influence on the
learners' level of cognitive engagement and might sometimes even lead to a lack
of it.
5.1.3 Behavioural Engagement

Behavioural engagement includes not only following rules and adhering to classroom
norms, but also the learners' involvement in their own active learning, engaging
with tasks and showing behaviours such as effort or attention (cf. Fredricks, Blumenfeld,
& Paris, 2004). In contrast to cognitive engagement, where active learning
also plays a crucial part, behavioural engagement does not include getting to the
bottom of a problem and thoroughly engaging with one's own learning process. When
analysing the focus group and individual interviews, the above-mentioned areas emerged
as well, especially during the focus group interviews where a lot of the participants
commented on their own involvement in doing their homework assignments. Susan,
for example, stressed “sometimes there, there are so many teachers that want some-
thing from you and then you think, okay I will do it tomorrow, I will do it the day after
tomorrow. And then .. the deadline is over.” (Foc Group I, Paragraph 4). Sometimes
something as simple as "in summer, for example, when all .., uh, others are outside and
[…] .. enjoy the .. beautiful weather […] and I have to study at home it is very …
difficult.” (Anne, Paragraph 138) was the reason for delaying an assignment. Post-
poning doing an assignment was something Anne and Veronica knew well too, but in
their cases most of the time they managed to meet the deadline and finish it on time.
Another aspect Susan mentioned in this respect was that she persuaded herself into
thinking “one homework is not the problem, when I don’t do it and then at the end of
the year it’s like […] a lot of homework I don’t have done.” (Foc Group I, Paragraph 6).
How learners react to tasks, as well as what keeps them going, appears to be something
the teacher can sometimes influence only a little and at other times a great deal. On the
one hand, a teacher can tell his or her students why doing homework
assignments and engaging with feedback is important, but every now and then their
attitudes will very likely not be influenced by that, as Julie emphasised:
Researcher: Yeah. So, was that because the […] teacher wanted you to do so?
Julie: No.
Researcher: No?
Julie: Because I simply wanted to.
Researcher: Aha, you simply wanted to. So, you wanted to what … impress … yourself
[Na] … or impress the teacher, or?
Julie: I actually wanted to improve. (Julie, Paragraph 270–275)

Many participants reported they knew they should do their homework assignments
or that working with feedback was beneficial, but still could not motivate themselves
to do so and the reasons for not doing them were manifold (see Sect. 5.3). At the
same time teachers did have a rather huge impact on learners’ engagement with
doing homework tasks. The majority of the population stressed that some of their
teachers either did not even correct their pieces of homework, hence they had no clue
what they should be working on. Veronica was of the opinion that “when there is
homework teachers should correct it […] wirklich gut (= really well) .” (Foc Group
II, Paragraph 94). What was more, most of the students stressed that when they got
feedback it should include tips and/or strategies on how to improve.
The analysis of the data also shows that little engagement took place when
the learners had a rather short period of time for completing assignments. Several
participants reported that the shortest period was only one day and then more often
than not they compared the solutions either on the overhead projector or the data
projector, e.g., in Accounting. Some of them questioned the value of that method as
it frequently resulted in not doing the assignments and simply copying it from the
projector or their peers. Writing tasks in German produced a similar effect with most
of the learners, as the participants reported that they only got the chance to do one
assignment before the actual written exam, thus engagement with feedback in order
to improve was rather limited or even impossible due to a lack of time (see Sect. 5.2).
An interesting aspect that surfaced during the interviews was the tension between
the learners' own responsibility for their engagement and the fixed roles that
were anchored in their beliefs about school in general. When talking about whether deadlines
were necessary, Julie stressed that “if you have no deadline, you won’t do it anyway.”
(Foc Group I, Paragraph 311). Only a minority of the participants claimed that they
would do written assignments for their own benefit even if they had no deadline. The
same was true when the students started a discussion on whether self-correcting
their written assignments in English should be compulsory or voluntary:
Anne: The self-correction should, um, uh, also be like a kind of, um, voluntaries,
because, um, some did it just because they have to did it, uh, do this. And some,
um, take more time in this, put more time [in the self-correction.]4
Michael: [Yeah, but with ..] Na, that wouldn’t work.
Veronica: Nobody would do it [if it would be voluntary]
Emily: [To be honest to you],
Michael: We are students.
Emily: I would not do it either. (Foc Group II, Paragraph 730–735)

Even though many of the participants stressed that they had benefitted tremen-
dously from self-correcting their errors because of the deep engagement with it,
the majority believed they would not have done it had it been voluntary. Most of
them thought the reason for that was the perception they had about school in general
and their roles as students specifically, which had been instilled into them from an
early age onwards. Like Susan pointed out: “we have learned it so. Also in primary
school.” (Foc Group I, Paragraph 335). Julie portrayed their—in her opinion some-
times passive—role as something “already pre-made for you, and the only thing that’s
missing is … that you just come by at school, and sit there, and learn something,
and go home, and … the whole thing repeats itself. It’s like a circle.” (Foc Group I,
Paragraph 338).
In contrast to that, a lot of the students held the belief that doing their homework
assignments and working with the feedback they got was their own responsibility.
While self-discipline and motivation were a prerequisite, Susan stated that "the

4 Brackets here are used to indicate overlapping speech.


teacher can’t say, you don’t have to do [your homework], because he or she needs
something to give you your grade, […] you’re not doing it for the teacher, to be honest,
you are doing it to get better in school.” (Foc Group I, Paragraph 321). Anne, who
sometimes had difficulties getting herself to do the homework assignments, found a
simple solution for herself: “I .. was in school because I thought if I go home then I
.. […] won’t […] do it immediately […] so I stayed at school […] and I told myself
so I have to do it .. now my homework […] and then I go home.” (Anne, Paragraph
118). Both learners believed that the reason for doing any kind of assignment should
be because they themselves wanted to improve and not because somebody else told
them to do so. In reality, however, this was not always so easy to execute, hence the
support and encouragement of others at times helped them to fulfil their academic
commitments.
Additionally, reasons for doing assignments and engaging with feedback included
being prepared for written exams as well, as Alan emphasised “you don’t have to learn
so much for […] a test if you have done all the homework.” (Foc Group I, Paragraph
13). To Anne’s mind paying attention in class was also important for being able to do
your homework assignments as well as crucial for the written exam as “in English,
for example, you need every lesson, every lesson is important for the exam […]. And
in other, um, subject matters you don’t need every lesson.” (Foc Group II, Paragraph
833). All in all, the majority of the population believed that it was in their own hands
how much they engaged with the subject matter as well as feedback. As Susan put it
“I mean, if you want to get feedback, then it’s … your responsibility […] to write it
(= homework).” (Foc Group I, Paragraph 363). She added a very remarkable point
of view in the individual interview, saying “I’ve chosen to go to this school and […]
sometimes you have to do things you don’t want […] and … that’s .. like my job
[…]. And I have to do my job.” (Susan, Paragraph 432).

5.1.3.1 Homework Assignments

Analysing the data showed that the students' view of the relationship between grades and
doing their homework assignments was somewhat ambivalent. One reason for that might
have been the teachers' differing attitudes towards it. Anne, Emily, Michael and
Veronica believed that for the majority of their teachers homework assignments
were “[…] not so important, because they just want to do homework to say, yes, we
do homework […] for the grades .” (Foc Group II, Paragraph 63–65) and not to help
their learners in order to improve in the respective subjects. Whereas the participants
claimed that grades were not the most important factor for engaging with the feed-
back method (see Fig. 5.3), grades sometimes did play a role when they talked
about reasons for (not) doing homework. Several students highlighted skipping homework
assignments in one subject they were rather good at to concentrate more on subjects
they struggled with. That one’s own attitude towards doing homework can change
from time to time can be best illustrated with Michael. When asked about his reasons
for doing homework, he stated he might still be affected by his teachers in grammar
school, because they “were very strict about homework […]. And that’s why […] I
always did homework, because it really, um, influenced […] the grade […] at the end
of the year.” (Michael, Paragraph 72). This view had changed, however, in his last
couple of years at the secondary business school—he did his homework assignments
as well as engaged with the feedback he got, because he wanted to improve and not
necessarily to get a better grade (cf. Michael, Paragraph 172, 269ff.).
Last but not least, the type of homework assignment had an influence on the
learners’ engagement with it as well. Several students reported that boring or unnec-
essary assignments were every now and then the reason for not doing them. When
asked what they meant by boring, their answers ranged from tasks that did not require critical
thinking, writing something very similar to one of the tasks in the respective course book and
fill-in exercises to doing the same type of text over and over again. But Alan, Julie and
Susan also stressed that some assignments were boring for them, because they did
not like the topic, and thus found it difficult to write something about it. Assignments
with no apparent learning outcome they termed unnecessary. As Susan stressed, "I
think that I don’t learn from this.” (Susan, Paragraph 64). As a consequence, home-
work assignments should be tailored to learning outcomes and the feedback should
give learners plenty of opportunities to experiment with language to foster learner
engagement with it. Discussing writing assignments with the learners is another
factor that helps them to better understand the task at hand. Brainstorming together
in class can trigger their engagement with the topic too, as the teacher as well as
peers can point out strategies for writing about this topic. Again, the Engagement-
Mediator-Feedback Model stresses the importance of mediating factors for learner
engagement, as the teacher and peers can obviously prompt engagement.

5.2 Feedback Methods

Many feedback methods have been researched (see Chap. 3) and it is obvious that
learners need some kind of feedback to improve in their writing. Researchers are
of differing opinions when it comes to giving feedback but so are learners. Even
among my small research population opinions about the value of each feedback
method very often varied enormously. For question 6 of the questionnaire I chose
feedback methods that were commonly used among my population, which they had
mentioned in the focus group interviews, in order to get a clearer picture of which ones
they preferred. Question 6 “How much do each of the feedback methods interest or
engage you?” included error correction by teacher, error indication/no correction,
reformulation of sentences, codes in margin, self-correction, peer correction and
personal written statement by teacher.
Looking at each feedback method separately, the results of the questionnaires
(see Fig. 5.2) would suggest that their preferred feedback methods were the personal
written statement by the teacher and codes in margin. Error indication and self-
correction, on the other hand, seemed to be among the least popular feedback
methods.

Fig. 5.2 Level of engagement with feedback methods (bar chart showing the percentage of learners
who engaged very much, some, very little or not at all with each feedback method)

Their responses in the focus group interviews, the answers they gave in the ques-
tionnaire and then in the individual interviews, however, portrayed a slightly different
picture. Although they stated that one method engaged or interested them either very
much, some, very little or not at all, it became evident that the respective degree of
engagement was a totally different matter (see Sect. 5.1).
Unsurprisingly, my research population had encountered various feedback
methods before coming to our school, hence some of them might have had negative
experiences with some of the above-mentioned feedback methods. It should not be
underestimated that this can affect the learners’ willingness to engage with a certain
type of written CF (see Hu & Ren, 2012), and changing their mind-set takes some
time. So, teachers need to explain to their learners why they believe their feedback
method is beneficial to them and give them manifold opportunities to familiarise
themselves with it (see Giving-Feedback-Guidelines, Chap. 3). Without the desire
to actively engage with the feedback they receive on their written texts, learners are
less likely to engage with it cognitively (see Bitchener & Storch, 2016).

5.2.1 Error Correction by Teacher

One feedback method frequently used by a lot of teachers seems to be error correction.
My population also encountered this kind of feedback method in many of their
subjects. The results of the questionnaire showed that error correction engaged 37.5%
of the students very much, whereas 25% and 37.5% engaged with it some or very little,
respectively (see Fig. 5.2). When asked what engaged or interested them
about it, Alan said “[s]o, yeah, maybe we, we don’t look at it …. when we get it back,
but later … for a test or so, we can check, okay, [mm hmm] what we can do better.”
(Alan, Paragraph 222). He also emphasised his opinion in the focus group interview:

“to be honest I’m, I’m just looking at the corrected homework ….., um, before a test.
Because then I, I look […] what mistakes I have done and what I can do better in the
test.” (Foc Group I, Paragraph 56). Susan confirmed that “[e]xcept when you have a
test, then you look maybe at it.” (Foc Group I, Paragraph 28). Michael explained his
negative attitude towards this method saying you “don’t have to […] correct it, so we,
we get it back and then you like, okay. That’s fine, but you … don’t have the chance
to improve it .” (Foc Group II, Paragraph 295). Michael and Veronica both stressed
that they read through the corrected version of a written assignment immediately,
but then put it away and never looked at it again (cf. Michael, Paragraph 189–194;
Veronica, Paragraph 573–574).
Their statements clearly show that the learners disagreed about this feedback
method. It seems that some of them appreciated error correction as they could look
at it at a later stage to prepare for a test or written exam. The value of this kind of
feedback method, however, was questioned even by those learners who were in favour of it.
Alan, for example, stated that “[b]ecause, yeah, it is already corrected ….. And, yeah,
so I do not have to look at it again.” (Foc Group I, Paragraph 36). Susan voiced her
opinion in the focus group interview, saying “I think with the error correction, when
they correct everything that’s ….. […] we don’t then improve ourself.” (Foc Group
I, Paragraph 28). Although the students saw the importance of checking their errors
before a written exam, for instance, they were rather critical of its value for
improving their writing skills and avoiding recurring errors.
This fact was also supported by the learners who said that error correction by the
teacher engaged them some or very little. Michael was of the opinion that
as a student you see that and then you think, okay. … I mean .. that’s not a motivation for a
student to work [mm hmm] on, on that errors, because [okay] you don’t get a feedback on
how you can improve [mm hmm] or your, um, yourself. Or .. how to um, …. ts, um, …. na
….. vermeiden?
Researcher: Uh, to avoid!
Michael: To avoid, thank you. To avoid errors. (Michael, Paragraph 184–186)

Anne had similar thoughts “because […] there is no, not as much on .. it like
a commentary (= personal written statement). …. There are .. not as many ideas
how to improve myself.” (Anne, Paragraph 426). The correct version was there,
but the students kept wondering how they should improve their writing. Veronica
commented that with corrected errors “[y]ou .. see it, you can understand it but
you .. don’t remember it.” (Veronica, Paragraph 570). Julie went a step further and
emphasised “when it comes to grammar or something and you didn’t understand it
earlier […] if the teacher didn’t explain it to you, and you don’t understand it right
away, then you just ….. lose the interest in learning and studying.” (Foc Group I,
Paragraph 34).
What had obviously been missing for most of the students were strategies (see
Sect. 5.4) to work with error correction. Veronica, for example, understood that her
version was wrong but did not know why it was wrong (cf. Veronica, Paragraph
162). She would have needed an explanation of her errors which she very rarely
got from teachers (see Sect. 5.3). Katharine stated that she would work with error

correction when she had to do a second draft: “then .. yeah .. I just really look into
it.” (Katharine, Paragraph 492). Error correction on its own seemed to be of little
interest to many learners, but it should be considered as the last part of a combination
of feedback methods to use with your learners (see Sect. 5.2.9).

5.2.2 Error Indication/No Correction

Error indication but no correction was not very popular among the students. Only
12.5% found it very engaging, while 25%, 37.5% and 25% of them engaged with it
some, very little or not at all, respectively (see Fig. 5.2). Some voiced their opinions about this feedback method rather
strongly and showed a rather negative attitude towards it. Michael, for instance, stated
“I mean what the, the most bad way to correct is simply to … underline anything,
and just say I don’t like that.” (Foc Group II, Paragraph 129). Julie said “that’s what
we have in German. She gives it back to you, or to us, with nothing on it, and says
here and there are some mistakes. But what are the mistakes? And what do you do
then?” (Foc Group I, Paragraph 63). Katharine seemed to face similar problems: “I
mean I wrote it because I think .. it was true. And if I don’t … If it’s not so obvious
…. for me at least then I don’t know … Okay, great you underlined it and I don’t
know why.” (Katharine, Paragraph 502).
Once again, the students seemed to miss instructions on how to work with this
particular feedback method (see Sect. 5.4): “[I] don’t know, okay, what I have done
wrong? …. What, what is fine? What, where I can do …. it better.” (Alan, Paragraph
228). Susan said that you could see error indication/no correction “like a challenge
to find your errors. You know there are mistakes, but it’s the challenge is to find
it.” (Foc Group I, Paragraph 62), but stated in the individual interview that a second
draft should also be the next step, because “if you correct it then you have to think
about it.” (Susan, Paragraph 218).
Alan pointed out an interesting aspect when working with any kind of written
CF method in the focus group interview: “I think […] it depends […] on yourself,
how, if you are ambitious …. then you maybe, or as you said you are interested in
something, then maybe, um, look at their homework if it is corrected or not, and to
find out what you have done wrong.” (Foc Group I, Paragraph 80). His statement
shows that not only the feedback method provided but also the learners’ own attitude
towards certain methods influences their level of engagement with it (see Sect. 5.3.5).

5.2.3 Reformulation of Sentences

The population’s opinion on the reformulation of sentences was very similar to the
results on error indication/no correction (see Sect. 5.2.2). 12.5% said it engaged them
very much, 37.5% each said some or very little, and 12.5% said not at all
(see Fig. 5.2). Some of the learners were distressed by the idea that the teacher had

reformulated their sentences, giving them the feeling that something they created
was taken from them. Julie stated “I mean if he’s … changing your content, then …
you know the teacher is doing something wrong. …. Because that’s your opinion.
[…] like you all said before he shouldn’t touch your opinion, he shouldn’t change it
[…].” (Foc Group I, Paragraph 197). It appears that some learners felt rather strongly
about their own writing and saw reformulation of sentences as a form of the teacher
putting their opinions right. As Katharine put it, “I think it would, um, kind of stop
the way that we would think. Like rather [mm hmm] putting ideas in our head than
making think of our own.” (Foc Group I, Paragraph 193). Alan’s opinion, that a text
“is something what you have written [mm hmm]. With your ideas, and … if they
are reformulated […] your personal … impact get lost.” (Foc Group I, Paragraph
196) underlines the learners’ belief that either their opinion was wrong or their
personal touch got lost by reformulating sentences. Several learners mentioned that
these reformulated sentences were no longer theirs (cf. Anne, Paragraph
460; Alan, Paragraph 236; Julie, Paragraph 360; Susan, Paragraph 226). Something
they created and probably put a lot of effort into had been changed dramatically.
As a consequence, the learners no longer acknowledged it as their own work, and
experienced a loss of ownership.
One interesting aspect was Katharine’s response to this method. During the focus
group interview (see above) she was rather critical of this method, and then ticked
very much in the questionnaire. When questioned about her change of opinion, she
said “I don’t know it’s kind of […] getting to know more ways [mm hmm] to express
yourself.” (Katharine, Paragraph 518). But she immediately stressed that only “if they
don’t change what I really wanted to say.” (Katharine, Paragraph 520). On the one
hand, she understood how she could benefit from this kind of feedback method. On
the other hand, she was afraid, like some of the others, that the teacher wanted to
impose his or her thoughts and ideas on her, hence changing her way of writing
and thinking. Learners rejecting some of the feedback provided is not uncommon,
especially when they believe that the teacher changes their intended meaning (see
Swain & Lapkin, 2002).
Others voiced similar concerns regarding their style of writing. Emily, for
example, emphasised that “sometimes .. the teacher doesn’t want the style of writing
[mm hmm]. But I think […] that should not be criticised.” (Emily, Paragraph 370).
Emily’s statement suggests that teachers should keep in mind that the individual
learners’ personalities are reflected in their writing as well. Because of that they might
sometimes feel taken aback when too many of their sentences are reformulated. To
their minds, not only was their style of writing being criticised, but they themselves
as well. Very often the learners seemed to think that teachers (see Sect. 5.3.1) did
not like their style of writing: “just because as .. the teacher doesn’t like your style
of writing [okay] that wouldn’t be good.” (Michael, Paragraph 206), a notion that
Veronica knew well, stressing “that’s my style […] and I want to write it that way.”
(Paragraph 488). She was of the opinion that her German teacher frequently refor-
mulated her sentences, because she did not like her way of expressing herself very
much. What Veronica and the others might have needed to be able to appreciate
reformulation of sentences was to be told how they could benefit from that method.

Moreover, learners should be made aware that reformulating their sentences does not
mean that something is wrong with their style of writing, but that some types of text
simply need a different style of writing they very likely have not yet mastered. This
feedback method allows the teacher to provide learners with expressions/phrases they
can use for a specific type of text. Thus, the aim is showing them ways to express
themselves more idiomatically in the foreign language.
Within this research population several wondered about the effectiveness of refor-
mulation of sentences. Emily, for example, said “the teacher reform[ulates] your own
sentence, your own style of writing […] into a new form. Like maybe a completely
new form. And maybe […] say something else that you wanted to say. And you
don’t improve.” (Foc Group II, Paragraph 502). Many of them asked themselves
how they could improve when they received reformulated sentences. For them, a
strategy for working with this kind of written corrective feedback was missing; hence,
they did not see its effectiveness for their own improvement in writing. Telling
learners why a certain feedback method has been implemented is crucial. Therefore,
the Giving-Feedback-Guidelines are a useful tool to ensure just that.

5.2.4 Codes in Margin

Codes in margin, where the error is indicated and symbols tell the learners which
category their error is in, engaged 62.5% very much and 37.5% to some degree (see
Fig. 5.2). As Alan put it “I think it’s also an easy way to correct it by yourself, because
you have, as Katharine said, […] grammar, uh, spelling and so. And you know what
you have done wrong, and so you can easily correct it .” (Foc Group I, Paragraph
112). In his individual interview he further stressed:
codes of margin, I think [mm hmm], yeah, it’s a, a very good method, because, um, …. you
can read through it and … okay, you see … yeah, um, expression is [mm hmm] not fine or,
um, grammar or … yeah, whatever. And … okay then you read through it, and you know
okay, yeah, that’s, that’s wrong, that’s wrong and you … correct yourself. (Alan, Paragraph
256)

Unsurprisingly, as with other types of written CF, Alan stressed the importance of
self-correction, indicating once again that the combination of feedback methods
engages learners. Susan also commented on the value of this particular feedback
method, saying that “you have, um, many mistakes in spelling, then you know that
you have to do something for that […] and then you can focus on this part, and …..
improve.” (Foc Group I, Paragraph 115).
Michael, on the other hand, although rather interested in this feedback method,
questioned its value as it was “like an, an oral […] feedback, when your teacher
tells you that your, um, argument is wrong [mm hmm], and doesn’t give a reason
for that.” (Michael, Paragraph 220). He also wondered if all learners would be able
to work with codes in margin, because “I mean it’s also just an indication [mm
hmm], but …. I don’t know, I think .. students don’t really, don’t really have …..

Mm really can, can work with that, I think” (Michael, Paragraph 208). When asked
he confirmed that doing a second draft where you have to correct your errors is
definitely necessary to work with this feedback method more effectively.

5.2.4.1 Colour Code

When discussing codes in margin with the participants, they also mentioned the
colour code (see Sect. 4.6) they had in English, as it is a very similar concept where
instead of symbols colours are used to indicate different categories of errors. Veronica
stated that “you have the, the colour-code and then […] you have to think about it.
And then you .. can remember it easily […] with the other methods it was .. not so
easily to remember” (Veronica, Paragraph 566). Interestingly enough, they always
connected the colour code with doing a second draft and Susan, for example, said
“you get a second chance to […] do your text, and the first […] text […] is like a,
… Probedurchlauf (= trial run).” (Foc Group I, Paragraph 203). She even stressed
that “it’s also like a […] challenge when you, you know […] there are mistakes, and
then you have to figure out what […] was …. the problem.” (Foc Group I, Paragraph
207), which the others agreed with as well.
Both feedback methods seemed to engage the students a lot when they knew that
they had to correct their pieces of writing and try to figure out the solution for the
correction of their errors on their own first. As Veronica stated “you know what’s
wrong, and you, so you can do it better. And you can think of why it’s wrong, and
[…] then you can give your own correction.” (Foc Group II, Paragraph 611). Some
areas, however, like grammar or clarifying arguments, proved to be more difficult
for them to deal with, hence as Emily said for “grammar, absolutely, I need more. I,
I need hints.” (Foc Group II, Paragraph 669). Julie stressed that
[m]ost of the time when you, uh, finish writing your text or a homework, and you read
through it, we don’t find any mistakes, because we think it’s okay like that. And when one or
two days pass, and we get like that back with the colour code, […] after that you see finally
the mistake. So, we need a help from someone else to see our mistakes, we can’t do that on
our own. […] And it’s a good way to do that. (Foc Group I, Paragraph 209)

Engaging the learners on a cognitive level (see Sect. 5.1.2) appears to be the reason
why they enjoyed working with either codes in margin or a colour code. Not giving
learners the right solution but getting them to engage with their errors and trying to
tackle them on their own enhances learner engagement.

5.2.5 Self-Correction

According to the results of the questionnaire, self-correction was amongst the least
favourite feedback methods of the research population. 25% engaged with it very
much and 37.5% showed some engagement, whereas 12.5% and 25% respectively
engaged very little or not at all (see Fig. 5.2). Several participants, for example Emily,
Michael, Susan and Veronica, pointed out that it took a lot of time to correct their own
errors, which might have been one of the reasons why some took a dislike to this kind of feedback
method. Even Michael, who was in favour of it, stated “[t]hat’s great. I like it. I mean,
the problem is that the, the second draft takes me much more [time.]” (Foc Group II,
Paragraph 692). When using this feedback method, the teacher should probably think
about ways to make it less time-consuming for his or her students and explain to them
why it could be beneficial to their improvement in writing (see Sect. 5.2.9.1). Being
familiar with a feedback method helps learners realise its benefits, like Veronica
stated: “the first draft, you just write it. And, then you, you don’t think about your
mistakes. When you have to correct it, and you see okay that’s wrong, you have to
think why it’s wrong” (Foc Group II, Paragraph 789).
Dismissing self-correction because learners think it is time-consuming is definitely
not the way forward; using it as one method within a combination of several
methods is. Analysing the data proves this notion: Although some learners claimed to
be engaged very little or not at all with it, they did not mind doing it in combination
with other feedback methods—best demonstrated with Katharine. In the question-
naire she ticked that this feedback method did not engage her at all, but while talking
about the feedback method we used in English she stated that she did not see that as
self-correction, because “I know what I did wrong” (Katharine, Paragraph 568) and
she saw it as an opportunity to first work on her errors herself. This is a strong indicator
that any kind of feedback method used in isolation might not engage the learners at
all, whereas in combination with a second one the learners’ level of engagement can
definitely be increased.
In the focus group interview the participants were discussing why they would
engage with self-correction, even though this method might not be amongst their
preferred ones. Michael even raised the question why self-correction was good in
English, but not in other subjects. Emily replied that it “depends on the teacher, I think.
[…] If the teacher is strict, then you will do it […], and really carefully. Because
the teacher recognises if you just do it like …. half-hearted[ly].” (Foc Group II,
Paragraph 743). Anne was also of the opinion that a teacher “also knows how …
much time you put into […] the self-correction.” (Foc Group II, Paragraph 744–746),
hence he or she would recognise how thoroughly the self-correction had been done.
Michael emphasised that he took the time to self-correct his homework in English
but would not have done so in other subjects like Italian, for example:
Michael: (Exhales.) I mean in English, I think, um, …. I’m …. very ….. ts, uh, keen on
that [mm hmm]. But, for example, in, in, in subjects like Spanish or Italian ….
I never did that [mm hmm] very, … um, …. I forgot the word for genau?
Researcher: Very thoroughly.
Michael: Very thoroughly, yeah, exactly.
Researcher: Uh, and why is that? Why English and not the other subjects?
Michael: I don’t know, because I didn’t have the, the motivation for, to do that [aha].
Because I, … I was like okay…. also if I write that text now on my own
[mm hmm] without any help [mm hmm], without the Internet …. I don’t learn
anything, because I, …. I don’t get a, um, a feedback [mm hmm] that I can
use to improve my, my, my skills …. [and my language.] (Michael, Paragraph 106–110)

One reason for his not engaging with written CF was definitely his relationship
with his Italian teacher. In conversations I had during my English lessons with many
learners, Michael very often emphasised his relationship with her was not the best.
To his mind, she had her teacher’s pets, whom she treated rather differently from
the rest, and was unfair in her assessment. Very often he did not do his homework
assignments as he believed he could not benefit from her error correction. What
is more, because he had taken a personal dislike to his Italian teacher, he believed
he would not have self-corrected his homework assignments had she used that for
giving feedback. Veronica had her difficulties too, as she believed her Italian teacher
personally disliked her. Like many others, Veronica voiced her opinion that her Italian
teacher’s feedback was unfair and unevenly distributed among students:
Veronica: For example, um, Alan’s first Italian test was like okay, that’s wrong with the
correction. And my, my Italian test was, okay that’s wrong. …
Emily: Yeah.
Researcher: So, no correction?
Michael: Mm hmm.
Veronica: Yeah.
Researcher: Like error correction, but no correction.
Emily: And if you ask her, she just say … no, I didn’t like that.
Veronica: … But, but he has the, the, the correction with, ah, okay that’s wrong, you
should write it that way, and at my test it was just wrong. ……. And I don’t
know what’s wrong, and yeah /xxx/ (Foc Group II, Paragraphs 224–231)

Emily emphasised the importance of self-correction for her future studies, saying “I want to do it in
German, because now I have to write projects and so on [mm hmm]. .. I have to do that
over and over again. Maybe I need […] your help (researcher laughs) in German .”
(Emily, Paragraph 392). Although her main concern with self-correction was always
the time issue, she realised the benefits, saying “I know now .. that it helped me a lot.
[…] And it will help me [mm hmm] … for the next three years.” (Emily, Paragraph
384–386). Moreover, she stated that working on her errors and trying to figure out
what was wrong engaged her even more with self-correction and helped her to avoid
certain errors. Veronica backed that notion, because it was “hard work sometimes
[mm hmm] …. but .. I remember my errors.” (Veronica, Paragraph 508). During
the discussion the participants started thinking about the value of self-correction and
said that they would have liked it in German as well:
Veronica: Especially in German.
Michael: Especially in German. Because Italian or Spanish we maybe, with a, with another
teachers, but
Emily: With that one not.
Veronica: But I think we would need it in German. (Foc Group II, Paragraph 755–758)

Another aspect some students valued with this kind of feedback method was, like
Veronica said, “you know what’s wrong, and […] so you can do it better. And you can

think of why it’s wrong, and […] then you can give your own correction.” (Foc Group
II, Paragraph 611). Being able to work on their errors and to come up with a new
word or phrase was unquestionably what engaged the learners with self-correction
(see Sect. 5.1.2). Contrary to reformulation of sentences (see Sect. 5.2.3), where
they felt that the teacher imposed their style of writing on them, self-correction gave
them the freedom to experiment with language. Creativity, which self-correction
allows for, is another aspect teachers should consider when they want to engage as
many learners as possible with written CF, especially if learners know that writing
texts as homework assignments is their space to be creative in the foreign language.

5.2.6 Peer Correction

One of the more popular feedback methods among the research population was peer
correction, with 37.5% each engaging with the method very much or to some degree, in
contrast to 12.5% each engaging very little or not at all (see Fig. 5.2). This result would suggest
that the majority of the learners were in favour of peer correction, but their opinions
in the focus group as well as the individual interviews were rather critical at times.
Some really liked the idea of working on a peer’s piece of writing, whereas others did
not appreciate that at all. Michael, who said that peer correction did not engage him
at all, argued that one problem was that most learners made the same errors, hence
he could not see any advantage of students correcting each other’s pieces of writing
(cf. Foc Group II, Paragraph 337). He was also wondering about the effectiveness of
peer correction, saying
Researcher: [Y]ou say it doesn’t engage you at all.
Michael: No, because….. um, peers are, I think, not that mo, keen on correcting […]
other texts.
Researcher: Yeah?
Michael: And I am very thoroughly with that […], and […] I don’t think that […]
a student, um, can improve with that […] style of correction. (Michael,
Paragraph 241–244)

In his opinion a teacher was much better at correcting learners’ writing and helping
them to improve in mastering languages. When doing peer correction in German he
stated: “I saw the results I was like, okay, well and now?” (Michael, Paragraph
248). In addition to peer correction, he would have liked to get the teacher’s feedback as
well. In general, as proposed in the Engagement-Mediator-Feedback Model, teachers
absolutely need to point out the benefits of the feedback method to their learners as
well as which strategies (see Sect. 5.4) need to be implemented to be able to use it
effectively (Kaplan & Patrick, 2016).
Julie, who was critical about peer correction in the focus group interview, then
ticked very much in the questionnaire. When asked what made her change her mind,
she stated “I can then see the mistakes from.. my peer [okay], and he sees mine [mm
hmm]. And then we can discuss about it.” (Julie, Paragraph 378). Although she could

see the benefits of peer correction, she was worried that learners might start fighting
with each other, because her peer might not like the suggestions she made:
everyone has.. different kind of expressions and [mm hmm] if I try to correct, uh, the
expression of my peer.. she would like uh, so, that’s wrong, why is it wrong [mm hmm]
and then [mm hmm] attack me by that and I’ll like okay, I didn’t say anything. (Julie,
Paragraph 394)

She was even wondering if the result then would be not to make any suggestions
at all in order to avoid friction between peers. Anne, in favour of peer correction,
raised the question whether a peer could help you recognise your own errors, because
“[m]aybe for Michael is my mistake logical, and, uh, he couldn’t explain me why,
why, uh, what I could do better.” (Foc Group II, Paragraph 352). Emily contradicted
her, saying “when you do it with Michael […], he can do grammar good… And he
explain it me […] again.” (Emily, Paragraph 400). This also shows that the learners’
opinions on being able to help each other differed widely.
Katharine, on the other hand, who was only engaged very little by peer correction,
could get positive aspects out of it as well. Feeding off each other’s ideas and maybe
pointing out errors to one another were especially appealing to her. Veronica
expressed the same idea in her individual interview saying “peer correction is..
good, because, um, the other person has also ideas […] and then you… can connect
that.” (Veronica, Paragraph 514).
Another reason for learners not wanting to work with this feedback method was
pointed out by Susan: “I mean I have no problem to give my homework [mm hmm]
to someone else [mm hmm]. But I know that there are people who don’t like this at
all.” (Susan, Paragraph 246; cf. Foc Group I, Paragraph 128). In a discussion after
one of our English lessons, Susan assumed that some of her classmates were simply
shy or not very self-confident, thus they were reluctant to exchange their work with
peers. Not only because they were probably not satisfied with their piece of writing,
but also because they were afraid of their peers’ reaction to it. Her statement clearly
shows that the learners’ personality as well as group dynamics might also play a role
when (not) engaging with certain feedback methods.
The learners’ ideas on how to use peer correction in class differed as well. Whereas
Anne thought that she would benefit more from working with a learner at the same
level as herself, Emily emphasised that
good students go with the bad students, together. [Mm hmm] And they try to explain them.
….. What is okay, what do they have to think of, like in another way. But not alone, on, not
only to compare the mistakes, but to tell them in an easier way how they understand it. So
that way, I think, it’s really good, so. I’m really, I like it. (Foc Group II, Paragraph 357)

Katharine and Alan both thought that you should work with a different peer every
time you do peer correction. That way you could get to know different opinions on
the same topic or encounter various styles of writing.
The participants of the study mentioned reasons why learners might be especially
interested in this idea too. Alan pointed out that feedback from a teacher might be
seen as something negative by the learners, but “with peer correction you don’t feel
it maybe as a negative criticism.” (Alan, Paragraph 276). Susan believed that “it’s

not so […] binding as we, when the teacher corrects it, because you are, the people
in school are your friends, and then it’s more fun to do it this way.” (Foc Group I,
Paragraph 122).

5.2.7 Personal Written Statement by Teacher

The most popular feedback method among the participants was the personal written
statement by the teacher which was favoured by 100% (see Fig. 5.2). As Alan put
it “I think a written statement by the teacher is also a good method of, of feedback.
Because so you get a text from your teacher, what you could do better, what you
have done wrong. […] and […] know what to do and you can communicate with a
teacher.” (Foc Group I, Paragraph 159). Moreover, the personal touch was another
reason for the students’ appreciation of that feedback method. Emily stated that not only
that aspect was appealing:
Emily: You are not.. only one of the group.
Researcher: Okay.
Emily: Because.. then.. you only see the two with the two good grades [mm hmm]
that get always [mm hmm] motivation and feedback [yeah]. But you are one
that.. counts too.
Researcher: [Aha. I]5
Emily: [You have,] you think of me [mm hmm] when you write that down. (Emily,
Paragraph 416–420)

5 Brackets here are used to indicate overlapping speech.

Michael voiced a similar opinion saying “because then I have the feeling that the
teacher really engages with […] with my text [mm hmm]. And I feel like, um, it’s
interesting for the teacher, and not just their job.” (Michael, Paragraph 254). Susan
mentioned that “it’s like a little letter and then you are excited [mm hmm] when it’s
in.” (Foc Group I, Paragraph 177). Julie even said that it feels like being “a little
child before Christmas.” (ibid., Paragraph 178). While they valued the personalised
feedback, the learners also had some very clear ideas about what a well-done personal
written statement looked like.
During the focus group interviews the participants, for instance, were discussing
the length of the personal written statement. They agreed that it depended on the
kind of assignment whether a statement should be short or long—as Julie explained
it “doesn’t have to be a written essay of a statement (laughter). Just short and simple
is also fine.” (Foc Group I, Paragraph 170). Again, the individuality of each learner
was stressed, and they asked themselves what they should do with Well done, which
they received rather often on pieces of writing. Or as Emily stressed, “not on every
sheet of paper, good work […], good work, good work. […] Something that makes
us […] special.” (Foc Group II, Paragraph 491). Questions raised by statements such
as these were, for example, which part of the assignment was well done, what they
should be working on, etc. Michael emphasised that
a good statement should first […] include, um, the parts where you were good [mm hmm].
Um, or were you, um, improved yourself [mm hmm], but then, um, as you, as you always
told us, there is always room for improvement [mm hmm]. And I think you should, um,
point that out as well [mm hmm], where you can work on […]. I think for a student it is
important to hear […] that you, um, accomplished something [mm hmm] or improved […]
in anything. (Michael, Paragraph 344)

Susan shared Michael’s view and stressed that it was “like the teacher… really
[…] thinks about your homework” (Susan, Paragraph 256). Susan’s and Michael’s
views underline that learners need to receive feedback that is tailored to their needs
to engage with it. It seems to be important for them to be seen as individuals (see
Sect. 5.3.5.1) and not only as one among a group of learners who all get a similar kind
of feedback. It is even worse for learners when only a couple of good learners are
singled out for all the praise and feedback. Thus, feedback is definitely dialogic,
where teachers need to remind themselves that they are responding to an individual
and not solely to a written text.
Another aspect the participants mentioned in this respect was the role of the
teacher (see Sect. 5.3.1 for a more thorough discussion). As Julie said “it’s nice to
see, […] the teacher puts that much effort to correct, um, our mistakes and to tell
us how to improve […]. And that’s also […] a rare case […], we only have that in
English really.” (Julie, Paragraph 398). What seemed to be lacking for the learners
with other feedback methods was at the core of this one: tips on how to improve
in various aspects of their writing (see Sect. 5.4), which is in line with the ‘zone
of proximal development’ (Vygotsky, 1978). Depending on the learners’ level, they
need different degrees of scaffolding when correcting their errors. As was the case
with this cohort, the longer they had been working with my feedback method, the
less scaffolding they needed. Some of them reached a very high level of autonomy
in this respect. Emily stressed that for her “the encouragement, the statement […]
is […] where you can work on […]. That you can take with me, yeah, I want to be
better, I want to,.. to.. make you, like.. pride (= proud).” (Emily, Paragraph 426).
Alan said it was “exciting … to wait of a statement, because […] if you are proud
of a text, which you have written, and you think […] it’s very good. Then you can’t
wait […] of the statement. […] If the statement is good, then you are … happy …
for the rest of the day.” (Foc Group I, Paragraph 184).
Anne believed that the personal written statement, where recurring errors she made
in her pieces of writing were pointed out, helped her to remember to avoid these in
a written exam (cf. Foc Group II, Paragraph 722). Emily, Katharine and Veronica
highlighted that the personal written statements also included an explanation (see
Sect. 5.1.2) for some of their errors, e.g. why a certain tense had to be used or an
expression did not work in the way they had used it. This helped them a lot while
writing the next assignments they were given.
The personal written statement was a method most of the learners had never
encountered before or had last got when they were still at Secondary Level I (see
Sect. 4.1). All of them favoured this method and agreed that it would have been

beneficial for them in German as well—or any (foreign) language for that matter. In
Julie’s opinion this was
what I think makes also doing homework more fun, because you know you’ll get a statement
[yeah] later on. And maybe that could motivate, uh, students that hate homework, like Susan.
Susan: I don’t hate it.
Julie: Okay, you don’t like it.
Susan: Yeah, we are not good friends. (laughter) (Foc Group I, Paragraphs 180–183).

Important to consider, though, is that the statements should be different for all
learners, so they get the feeling that the teacher engages with their work.

5.2.8 No Feedback

What my students commented on as well was that in many subjects they did not
get any feedback at all. Their opinions on how much or little feedback they got
differed quite a bit, but they agreed that in German they mostly received it shortly
before a written exam. In Spanish and Italian they generally got grammar exercises as
homework and not so many texts to write, which is the reason for little or no feedback
on their written work in these subjects. Some of them mentioned that even when they
got feedback “you literally don’t learn anything from it, […] the teacher doesn’t even
correct it properly”, as Julie explained (Foc Group I, Paragraph 14). Michael stressed
that he did not like question marks on some parts of the written work, saying “that’s
what I don’t like […] when teachers does that. It’s just, uh, … yeah, it’s like I don’t
know either, but something is missing.” (Foc Group II, Paragraph 675). Katharine
was really angry at times, because “I don’t get why we have to give it to them, when
we don’t get it back, um, corrected.” (Foc Group I, Paragraph 16). According to Julie
it was even worse when in “the end we [only] learn it from the test.” (Foc Group
I, Paragraph 23) and she had no opportunity beforehand to work on her errors. The
purpose for doing written assignments or developing strategies on how to improve
was clearly missing, making it impossible for the students to engage with written
corrective feedback (see Sect. 5.4).
Their feedback in German mainly consisted of a tick (✓), about which Veronica
wondered “now what should I do with the tick?” (Veronica, Paragraph 462), and
sometimes handwritten comments on the page that a lot of learners could not decipher.
What really seemed to get at them was that “[s]he gives it back to you, or to us, with
nothing on it, and says here and there are some mistakes. But what are the mistakes?”
(Julie, Foc Group I, Paragraph 63). Katharine was really frustrated with this lack of
feedback “when they just tick it and say, yeah,.. just look over it I found some things
like… I don’t know.. I wrote it […], because I think that it’s true so […] just tell me
what’s wrong!” (Katharine, Paragraph 146). Most of the learners did their written
assignments to get feedback on them with tips on how to improve and they also
wanted a corrected version to check again before a written exam. Michael pointed
out that he sometimes received similar feedback stating he should improve his text,

without any clues as to which areas he should look into more thoroughly. Veronica
was rather upset by her German teacher as well, because when she asked her why
she put a question mark next to a sentence or what she should improve, the teacher’s
answer was sometimes that she could not remember what she had wanted her to do.
Some of the learners were asking themselves whether some teachers only set
homework tasks to be able to show parents that homework was given in their respec-
tive subjects. Veronica assumed that some teachers just saw it as “if you did the
homework.. it was good […].. and if not.. yeah, then not. But.. what’s.. in the text
[…].. is not important.” (Veronica, Paragraph 468). The learners’ concerns about
getting little or no feedback at all from their teachers underline the fact that if
teachers do not value feedback, their learners possibly will not either.

5.2.9 Combination of Methods

The results of the questionnaire would suggest that the feedback method that engaged
the learners the most was the personal written statement, followed by codes in margin.
Should all other feedback methods then be dismissed? The focus group as well
as individual interviews, however, show that this was not the case. When asked
whether only one feedback method should be used, they all agreed that a combination
of methods would be the best way of giving feedback as this would engage more
learners. They did emphasise, though, that a personal written statement should be
part of any feedback method (see Sect. 5.2.7).
Some of the participants mentioned their frustration with no feedback (see
Sect. 5.2.8) and all of them emphasised that the teacher’s feedback on their correction
was crucial. To Alan’s mind this was also true for peer correction, “because if you
exchange your homework with a, […] school partner (= peer), […] you have […]
one correction already. And then you handle it into your […] teacher. So you have
two corrections of your homework from two persons.” (Foc Group I, Paragraph 117).
Susan stressed “I think all these methods depends on if […] you, um, handle it to
your teacher a second time.” (Foc Group I, Paragraph 106). This notion of a second
step where the learners had to work with the provided feedback method and then got
final feedback featured prominently among the participants. Julie was of the opinion
that
the most important part, I think, is that the teacher has to be there for the students … a second
time. Not only when you do the homework, and you get it back, and that’s it. So if it … it,
he has, or she has to take his or her time for the students. Even if it takes longer than just
one period. ……. To, to sit back and explain what is wrong, why ……. To help the student
improve himself or herself. (Foc Group I, Paragraph 113)

Michael, who did not fancy codes in margin but really liked the similar concept
of the colour code I use, explained the reason for that with “but then we had to do a
second draft, you know. […] And that’s the point.” (Michael, Paragraph 214–216).
Susan claimed that self-correction did not engage her at all, because “I don’t like

to…. read again over my work and then thinking of… what is good, what is bad.”
(Susan, Paragraph 230). In the focus group, however, she stated that she liked the idea
of a second chance. She explained this apparent contradiction with “but because there are also this, um,
colour codes […] and we have to hand it a second time out (= in).” (ibid., 232–234).
Their statements strengthen the proposed Engagement-Mediator-Feedback Model,
which has a combination of several feedback methods as one of its pillars. My data
clearly show this is one of the reasons for learner engagement to take place.
This can best be illustrated with the feedback method (Sect. 4.6) I used with
these learners, which shows that in combination, certain feedback methods were not
seen as very little or not at all engaging. Despite the colour code and a personal
written statement in the first step, this feedback method included self-correction
by the learners in the second step, followed by error correction and a final written
statement by the teacher as a third step. The various types of written CF ensure that
many learners can engage with the feedback provided, guiding them towards less
scaffolding and more autonomy in their writing process.

5.2.9.1 Aspect of Time

While it could be argued that the above-mentioned kind of written CF is overly time-
consuming for both learner and teacher, I would argue that it is not. It is true that
the learners have to write first and second drafts and that the teacher has to colour
code and provide a personal written statement for the first draft and to error correct
and comment on the second draft, but these, as part of a routine, need not take up
too much time. As the learners must keep the colour code and mark anything they
add in bold print, you put the first and second draft onto your computer screen and
simply compare the colour coded parts. This ensures that the teacher does not need
to spend too much time on the second draft.
When using a combination of feedback methods, one crucial aspect is to opt for
quality not quantity. The main aim is to help the learners to improve in their writing.
Hence, a couple of assignments they work on thoroughly are more beneficial than
many where they, for example, only get error correction and no further engagement
with the feedback method is involved (see Sect. 5.1). To engage learners even more
a credit system for their homework record could be used to give them the possibility
of choosing how they would like to work with the feedback method. As a result of
this study, my current students get a maximum of three credits for each piece of
homework. One credit for only doing the first draft, two credits for doing a first and
second draft (including working on some areas indicated by the colour code and the
personal written statement) and three credits for both drafts as well as working on
all areas.
Furthermore, with this method it is not just the teacher who decides on focused
or unfocused error correction (see Sect. 3.3.5) as learners can follow the colour code
indicating areas to be worked on, hence they have a choice (Price, Handley, & Millar,
2011). The participants in this study sometimes opted for unfocused error correction
where they corrected everything and at other times chose focused correction, concen-
trating on one particular area and ignoring others. Sometimes they opted for both. It

has to be stressed, however, that some of them always corrected everything that was
pointed out to them (see Table 5.1). Thus, this approach also gives the learners more
autonomy in working with this particular feedback method. Moreover, if they have
little time to spend on their homework due to demands from other teachers, they still
get one credit for doing a first draft and do not miss out because they did not do a
second draft.
Time is always an issue for teachers when correcting learners’ work. Although
this feedback method might sound like a lot of work, it is not. One advantage is
that my learners used an online platform where they uploaded their documents (see
Sect. 4.6), hence all corrections were done electronically. In the first part I colour
code (including clues if necessary) the learners’ written assignment and write the
personal written statement giving them tips on how to further improve their piece
(e.g., type of text, structuring, etc.). As said before, since the students keep the colour
code for their corrections and use bold type for everything they add, there is no need to
read the whole piece again, just the colour coded areas as well as the ones in bold
type, which I then correct if necessary. If the learners still do not get some parts right,
I also give them an explanation for the right solution in the final personal written
statement. It takes a bit more time than simple error correction, but the chance that
more learners engage with this kind of feedback is higher. As Anne put it “you forced
us to […] work through our mistakes and […] to think more” (Anne, Paragraph 224)
about their errors, because I did not give them error correction right away, but asked
them to figure it out on their own first.

5.3 Mediating Factors

Some learners engage with written CF, others little or not at all. But what are these
mediating factors that influence the learners’ decisions to work with written CF? Is
it the teachers, the peers or the learners’ intrinsic desire that makes them interested
in it? Or something entirely different?
In the questionnaire, question 7 “In your experience during your school career, how
much have the following aspects influenced your decision to engage with written
corrective feedback?” focused on whether teachers, peers, getting good grades or the
feedback method influenced learners to work with it. Alongside these factors, the
learners were also asked whether improving their level of English played a role.
The results of the questionnaire show that improving their level of English came
first when engaging with feedback, closely followed by the teacher as well as the
feedback method provided (see Fig. 5.3).
Naturally, the obvious factors are not the only ones which impact engagement
with written CF. During the focus group and individual interviews other interesting
aspects surfaced that were a factor in the learners’ level of engagement. A lot of
their statements included attitudes and emotions (see Sect. 5.3.5) that had an impact
on their willingness to work with written CF. The teacher’s role (see Sect. 5.3.1)
in it, for example, was rather diverse and ranged from useful teacher’s feedback to
unclear instructions.

Table 5.1 Homework record (1st and 2nd draft for each assignment)

Assignments: (1) Analysing a cartoon, (2) Article globalization, (3) Poverty (type of
text—free choice) [first semester]; (4) What I always wanted to tell you (creative)
[first or second semester]; (5) Commentary Investigative Journalism, (6) Article
sponsorship, (7) Report outsourcing [second semester]

            (1)     (2)     (3)     (4)     (5)     (6)     (7)
Alan        ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓
Anne        ✓ ✓     ✓ ✓     ✓ ✗     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓
Emily       ✓ ✓     ✓ ✓     ✗ ✗     ✓ ✗     ✓ ✓     ✓ ✓     ✓ ✓
Julie       ✓ ✗     ✗ ✗     ✓ ✓     ✓ ✗     ✗ ✗     ✓ ✓     ✗ ✗
Katharine   ✓ ✓     ✓ ✗     ✓ ✗     ✓ ✓     ✓ ✓     ✓ ✓     ✗ ✗
Michael     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓
Susan       ✗ ✗     ✗ ✗     ✗ ✗     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓
Veronica    ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓     ✓ ✓

Fig. 5.3 Mediating factors for engagement with feedback (bar chart showing the percentage of
learners who rated teachers, peers, good grades, improving English and the feedback method as
influencing them very much, some, very little or not at all)

Many participants commented on the lack of feedback from their German teacher, which
Anne, among others, clarified in her individual interview, saying:
Researcher: […] And you said that you received feedback from teachers often….. So not
very often, but often.
Anne: … Yes, in.. English every time…
Researcher: Yeah, okay. And in the other subjects? [And what]
Anne: [And in the] other subjects… Yes, in…. German in the last.. two weeks…
before.. [the Matura]
Researcher: [In the last] two weeks?
Anne: Yes.
Researcher: Okay.
Anne: We, um,.. She did really.. thoroughly with the critical6 ones in German. Uh,
and.. gaved us really feedbacks how… we could do it better and.. [on what]
Researcher: [But just].. the two weeks before the written..
Anne: Yes.
Researcher: A-levels? So, she didn’t do it throughout the five years?
Anne: No, because she focused on the whole class and we had, we are too much.
(Anne, Paragraphs 275–286)

Learners who were already very good at writing texts in German minded the lack of
written CF in German less, but the weaker ones struggled a lot, because they were not
provided with strategies to improve in their text composition skills. These instances
demonstrate the various factors that had an impact on the learners’ engagement
and sometimes differed greatly among them, a notion even Mosher and MacGowan
(1985) commented on in their early report on student engagement. This underlines
the complexity of the Dynamic-Engagement-Framework, hence making mediating
factors a very valuable aspect of the Engagement-Mediator-Feedback Model.

6 She means learners probably failing the subject at the end of the academic session.

5.3.1 Teachers

That the student-teacher relationship is important in the learning environment has
been the subject of many studies (e.g., Bryson & Hand, 2007; Garcia, Agbemakplido,
Abdela, Lopez, & Registe, 2006; Hamre & Pianta, 2006). When asked about the
teacher’s role, my learners simply said: “Giving feedback, of course” (Foc Group I,
Paragraph 27), but some participants stressed that “the point of view is in… some
subjects, um, only… the, the teacher’s point of view.” (Veronica, Paragraph 56). In
the questionnaire 75% of the participants claimed that the teacher influenced them
very much and 25% some when working with written CF (see Fig. 5.3). The results
seemed to state the obvious, namely that teachers are important with respect to feedback,
but just how much influence does a teacher have on learners’ engagement with it (see also
Foster & Southwell-Sander, 2014)?
indicated that teachers could even be the cause of a learner’s withdrawal from written
CF altogether. This is something Julie encountered with her Italian teacher, which even
resulted in her not wanting to go to school on Mondays, because she had a double
period of Italian that day. Her dislike was so strong that she did not see any point in
doing Italian homework assignments as in her view she would not get any valuable
feedback on her written texts. Thus, we teachers need to keep in mind just how much
impact our comments, our way of giving feedback or the way we treat our learners can
have on them. Do we know anything at all about our learners, their lives outside
their learning environment, what kind of problems they encounter which, combined,
might have an effect on their attitude towards learning?
When commenting on grades and their performances in all kinds of exams, Anne,
for example, stated that “some teachers are not interested […] how the feelings are
from some students. Or, or why they write a Five (= fail). Because maybe they have,
uh, problems at home or something like that. And not just, yes, they … can’t do this.
That can’t […] manage the exam or something like that.” (Foc Group II, Paragraph
1042). Some of the learners explained that in their opinion some of their teachers
simply wanted to deliver their subject matter, but were not interested in them as
human beings.
Teachers’ reactions when giving the learners feedback were also a topic of discussion
in the focus group interviews. Veronica and Anne were talking about me and
comparing me to other teachers they had, saying:
Veronica: She is always nice, (laughter) when you ask.
Anne: Also if you are a little bit stupid, but she is (laughter) nice. And some, and some
teachers are, oh God, I will kill you! (Foc Group II, Paragraph 622–623)

Veronica and Anne were both under the impression that some teachers thought
that learners asked silly questions which they should already know the answer to
as they “are in the fifth form (= final year)!” (Foc Group II, Paragraph 628). Even
worse, the participants explained that some teachers really got angry at them for not
knowing the correct answers. As Michael put it: “that’s the worst thing of teachers,
really.” (Foc Group II, Paragraph 621).

What is more, the learners expressed their astonishment that some teachers seemed
to enjoy their bad grades—“like the teacher has fun with your bad grades.” (Foc
Group II, Paragraph 981) which they did not like at all. Emily gave an example of
a teacher’s response to a learner’s bad grade that was not at all appreciated as an
appropriate reaction to a learner’s performance:
Emily: Or … Mrs Brown told, tell to Moira, because she has a Five (= F, fail) now in,
in Sp, French, ‘Yeah, alle Jahre wieder, Moira.’ (= Every year, Moira.)
Researcher: I see.
Veronica: Oh!
Emily: I think [that it’s like]7
Anne: [Yes, and it’s really]
Michael: [Like a punch in the face.]” (Foc Group II, Paragraph 970–975)

7 Brackets here are used to indicate overlapping speech.

While it was true that this particular learner struggled with French every year, the
others nevertheless believed that a teacher did not need to point it out to her when
she was already upset about her bad grade on the written exam. More helpful for the
learner would have been to discuss her exam and to give her tips on what she had to
do to get a positive grade in the end.
This dialogic aspect of feedback, which was obviously lacking with their French
teacher for some learners, was also mentioned during the focus group interviews.
While doing their corrections, some of them would have liked to be able to ask their teachers
for clues to tackle the problem at hand. That was one aspect they liked about my
feedback system, because they had the chance to get in touch with me outside school.
As Julie said:
What I like the most about it, is that, um, … when you see you have done something wrong,
or like grammar, and you don’t understand what is actually wrong, you get the chance to ask
the teacher. Like in our case [mm hmm], even if it’s, uh, I don’t know, in the afternoon, we
can write on the platform, and ask you. And sooner or later, you will respond. And then we
know what is wrong, and we’ll learn from it. (Foc Group I, Paragraph 202)

Unsurprisingly, what learners need most is support from their teachers. As Anne
pointed out “I like that when we are talking about a topic in English and someone,
um, says something unnecessary, then you, um, say, ‘aha, interesting’, and not like,
‘oh, why do you thinking about that? That’s totally wrong’ or something (laughter)
like that and so it’s good.” (Foc Group II, Paragraph 926). Even if the learners said
something that might not be correct or a little bit off topic, they wanted their teachers
to react nicely and steer them in the right direction, e.g., by giving them another
chance to put it right.

5.3.1.1 Teacher’s Feedback

A lot of the learners commented on how their various teachers gave feedback. They
not only commented on written CF but also on oral feedback they got on presentations.

7 Brackets here are used to indicate overlapping speech.


Although oral feedback was not part of this study, some of the learners' statements
were used to highlight the teachers' role in the learners' reactions to feedback.
One instance they talked about was that they were sometimes asked to give feedback
on what they were satisfied with and which changes they would like in the subject.
Emily, for example, was really annoyed when
[y]ou give her a feedback, you don’t write your name on it. Okay, maybe she know my, uh,
writing style (= handwriting). But I wrote like, what, what, one, once, one wrote that …
we have too much homework. ‘Do you really think we have too much homework, Emily?’
And she wants from me an answer, and it’s a feedback, an anonymous feedback! And what
would do you say? No? Or yeah, uh, would you say yes? Because she really …. puts you
under pressure. (Foc Group II, Paragraph 467)

What especially got to Emily was that other learners had voiced similar concerns,
but the teacher simply responded that she did not think they had too much homework.
The learners asked themselves what the purpose of anonymous feedback was when
the teacher did not act upon it. They believed that when teachers asked for feedback
on the subject as well as on themselves and their feedback method(s), they should
only do so when they were prepared to reflect honestly on their teaching methods
and think about the suggestions made by the learners. If they then decided that they
did not agree with all of the learners' suggestions, they should explain their decision
to them.
Another aspect the participants commented on was the feedback they got on the
presentations they had to give quite frequently in various subjects. Some of them
stressed that particular teachers treated them differently, namely that two students
did something similar but got rather different feedback on it. Emily was critical of
teachers who had their favourites and always gave them the best feedback no
matter whether their performance was good or not. Katharine was especially upset
by one of her teachers when she had put a lot of work into her presentation, but had
problems delivering it due to her nervousness speaking in front of an audience:
Katharine: And…. but then the reading.. That I.. did read it off [mm hmm] was kind of..
more…, um,….. it doesn’t… had more a level on to my grade that I got from
it… then… what I said.
Researcher: What you said. [So]
Katharine: [And] then.. when somebody else.. did the presentation and that.. did the same
thing it was… Perfect!… You are so great! And then I was like…. seriously?!
Uh. (Katharine, Paragraph 644–646)

What Katharine had been criticised for in her presentation (reading from the
prompt cards) was not criticised at all in another peer's presentation. What troubled
her most was that not all learners were graded according to set standards but according
to a teacher's preferences. In the end, after several presentations where the content was good
but her style of delivering it was not approved of, she decided "then it was just
like.. okay.. what… whatever and giving up." (Katharine, Paragraph 630).
The quality of the feedback was another issue the learners were thinking about
and this varied considerably from teacher to teacher. Julie said “mostly […] when
you are so proud of your text, and then you get only bad feedback. […] the teacher
doesn’t tell you if the text you’ve written is okay, or if it’s …. well-written. ….
Or well-expressed.” (Foc Group I, Paragraph 157). Katharine went a step further,
saying that “mostly what you hear is what you did wrong.” (Katharine, Paragraph
416). Both learners wanted to improve in the respective language, but did not know
how to achieve that because only their errors were pointed out to them, but no
strategies for correcting these (see Sect. 5.4). Like Alan said, what learners needed
was “a positive feedback […] you know okay, that could, that I can do better and
[…] in a matter of time you […] improve yourself.” (Alan, Paragraph 306).
He was also of the opinion that “the teacher should support us in learning and
… give us, as I said, feedback what we can do better, and so on.” (Foc Group I,
Paragraph 24). Alan’s view was supported by Veronica “because […] in English I, I
know okay that are my mistakes [..], that can I do better, and […] in German I don’t
know what I can do better.” (Foc Group II, Paragraph 102). One of the teacher’s roles
is to help learners and encourage them to work on their errors in order to get better.
As Anne said, if "we get a good feedback, or … a good motivation from the teacher,
so we … want to improve more. ….. Or to do it the next time also." (Foc Group
II, Paragraph 1004). Ideally, teachers can foster motivation by emphasising the
learners' strengths and not only their weaknesses. As Veronica said: "Yeah, because
[…] the teachers are… not so motivating every time. […] Because… they, they don’t
say, okay,.. that was good […]. It’s just.. okay, that was bad. […] all the time they
see the bad.. things and not the good.” (Veronica, Paragraphs 308–310).
Being left alone with little support from some teachers bothered the
learners as well, especially with regard to the final-year projects they had to
do in groups (see Sect. 5.1.1). Emily stressed that "nobody helped us, so to say.
Also in the project […]. We just tried it out ourselves […] and she just said no, no
that’s not correct, try it again […]. But there are no […] that I can see how I change
it […]. There was just […] that’s not correct, write it again.” (Emily, Paragraph
74). How should learners correct their written work when they were clearly at a
loss about how to do so? Michael emphasised that “if she would have told me,
for example, that my, um, style of writing doesn’t fit or, […] there are too many
informations, information […] I would have known what I should improve […]. But
just improve your, your text is like… okay." (Michael, Paragraph 98). Instead of a
general statement like the one Michael got, specific feedback is needed, telling the learners
which areas need to be improved and sometimes also giving the reason why.
The Giving-Feedback-Guidelines can definitely aid this process within a
Dynamic-Engagement-Framework.

5.3.1.2 Unclear Instructions

One reason why learners did not work with written CF was unclear, or even non-existent,
instructions on the part of the teacher: "Um, yeah, but.. she was ni, never really trying
to sit down with us and discuss the things how you write a text, and stuff like that
[…], so we had kind of to figure it out on our own” (Julie, Paragraph 76). As Michael
stressed “because if […] the … teacher doesn’t explain anything, then [it] will be
difficult. […] I think so, like in Italian, I mean?” (Foc Group II, Paragraph 28). When
talking about Accounting, Emily added that the teacher "didn't explain it, you just see the
Buchungssätze (= accounting records) and just okay how […] does she come up with
it? […] So no […] explation (= explanation) behind it.” (Foc Group II, Paragraph 36).
Unclear instructions sometimes seemed to be closely linked with a lack of dialogue
between teachers and learners (see Sect. 5.3.1.3). Some of them explained that in
certain subjects they did not do their homework assignments because they did not
know how to do them.
Doing homework and getting feedback on the type of text, their errors, structuring,
etc. should contribute to the learners’ ability to perform certain tasks in written exams.
The lack of it could pressurise learners in a situation where the stakes are high. Anne
(cf. Foc Group II) said that in their written exam they sometimes had to write a text
in Italian on a topic they had not done before. What is more, they did not write a
lot of texts in Italian anyway, hence it took the learners rather a long time to write texts in
the exam, because they did not have enough practice. In that case, engagement with
written CF was impossible, as they either had no piece of writing to receive feedback on
or only got error correction. Moreover, the learners sometimes did not know how to write a certain
type of text, because the teacher's instructions were unclear. When they asked her
to explain it again, the response was that she had already done that and once was
enough. This is an attitude which, for most learners, has a negative impact on doing their
homework assignments: learners need input on a new type of text so that they can
engage with the writing task and develop their writing skills further. Not providing
that kind of input can lead to a lack of engagement or even withdrawal from doing
any assignments at all.

5.3.1.3 Feedback as Dialogue

It can be argued that learners' beliefs about written CF have an impact on their
engagement with it. My learners had already had eight years of experience with
written CF before they started at our school. Unsurprisingly, many had a clear picture
of different types of feedback in mind. Therefore, some were rather sceptical about
the written CF I gave them, because it was radically different from what they had
encountered before. As Anne stressed:
Anne: Um,.. at the begin it was for me.. really annoying with the platform, because…
I.. forgot sometimes, uh, how to.. log in and.. where was the homework and
how to upload my homework. But… after one year, um, I…., uh, gewöhnen?
Researcher: .. Uh, you got used to it.
Anne: Yes [mm hmm], I got used to it and so it was quite fine. And with your
commentaries on it, it was really.. helpful…. (Anne, Paragraphs 328–330)

One part they were especially critical about was the fact that they had to write a
first and second draft of their written text. This can be explained by the fact that some learners
as well as teachers view feedback as a product rather than a process (see Sect. 4.6):
they write their homework assignment, hand it in and are done with it. Learners get
the teacher's feedback and no further engagement with it seems to take place. To ensure
the learners' willingness to engage with feedback, however, dialogue between
the learners and the teacher is definitely needed (cf. Campbell & Schumm Fauster,
2013; Higgins, Hartley, & Skelton, 2001; Sambell, 2013). Not knowing how to work
with the feedback method, or receiving no support from the teacher to help them correct their
errors, also featured prominently among my participants. Feedback, therefore, needs
to be seen as a dialogue between learners and teachers.
Feedback as a one-way system and not a dialogue was experienced by Susan
“because sometimes you get the correction and you accept it, but you, you can’t
understand why maybe your expression wasn’t right, and then ….. you, you have no,
um, chance to, to do it again better.” (Foc Group I, Paragraph 35). Julie commented
on not having a second chance, but also added that “you don’t have a clue how to
go […] further on.” (Foc Group I, Paragraph 43). When challenged why they did
not ask their teachers to tell them what to do they replied that most of the time they
either said they had no time to spare (cf. Foc Group II) at that moment or “had kind
of to figure it out on our own.” (Julie, Paragraph 76). What is more, Emily reported
one incident where she asked her Business Studies teacher how she could improve
one part and the response was “I didn’t like that.” (Foc Group II, Paragraph 230).
Something similar happened with their German teacher:
Emily: And if you ask her she says …. “I don’t know what I meant” (laughter)
Veronica: [“I can’t remember”]8
Emily: [“I can’t remember.”]
Michael: [“I don’t remember] I don’t like it”, so okay. (Foc Group II, Paragraph 140–143)

The teacher's role is undoubtedly to aid and guide their learners. The instances
my research population reported do not support that notion. Instead of giving them
tips on how to improve, all my learners got was that either their style of writing was
disliked by the teacher, or he or she could not even remember why he or she marked
that particular section in the written text. Naturally, it might happen sometimes that a
teacher marked something and when asked by a learner could not remember exactly
what was meant by it. In the case of their German teacher, however, the participants
stressed this happened frequently. As Julie said: “the teachers should be […] there to
lead the students in the right way to correct their homework.” (Foc Group I, Paragraph
309). Some of the learners emphasised that they felt a couple of their teachers did
not want to help them. Susan pointed out that “maybe I don’t understand it again,
and then I also need my teacher to tell me what’s the problem. And if there is no
teacher and the teacher […] maybe don’t want to tell me, I don’t know, then I again
can’t […] improve.” (Foc Group I, Paragraph 106). Learners need their teachers
for guidance when working with written CF, hence it is crucial to see feedback as
dialogue, especially as autonomy is one of the aims of written CF. The dialogic nature
of feedback supports that, as learners need less and less scaffolding once they have
been working with the chosen feedback method for a longer period of time. Denying
learners this process very likely leads to less or no engagement with written CF.

8 Brackets here are used to indicate overlapping speech.


A good example of the dialogic nature of feedback and of what some learners truly
want from their teachers was voiced by Emily: “I think that you should not be behind
the student, you should be beside the student. Not pushing him, […] going with him
the way.” (Foc Group II, Paragraph 962). The majority of the learners were of the
opinion that most teachers were on one side of the fence and the learners on the
other, but they would benefit more from working together. Feeding off each other
was something the learners really appreciated. This was also expressed in one of the
English classes where the students commented on the way I returned their written
exams. Instead of just returning it to them, I asked every learner individually to
come to my desk and then discussed his or her exam in detail. Most learners were
a bit taken aback when they first encountered this kind of feedback on a written
exam, Julie included, because it was “at first new to me […], and then I liked it […],
because it was, oh, okay, she really tries.. to help me with that.” (Julie, Paragraph
406). Respecting your learners and showing genuine interest in them as individuals
is a necessary prerequisite for engagement to happen.
Once a teacher has decided which feedback method works for him or her and the
learners, the learners should be told why it is being used and be given the opportunity to ask
questions—even if they ask the same question repeatedly. Most learners want to get
better and therefore they need the dialogue with their teachers. Working together and
feeding off each other is crucial when giving feedback. As the Dynamic-Engagement-
Framework shows, writing texts and working with written CF is a dynamic process,
constantly changing as not only learners develop, but their teachers too. Learners
and teachers alike need to reflect on their feedback culture. Asking teachers
might be easy for some learners and rather difficult for others. Promoting feedback
as dialogue starts with the teacher, but learners need to seize that opportunity, as
Julie stressed: "most of the times when I have asked you, […].. you explained it to
me and then I got it, and then I corrected it.” (Julie, Paragraph 478).

5.3.2 Peers

The participants’ opinions on how much influence their peers had regarding their
engagement with written CF varied immensely: 25% and 37.5% respectively claimed
that their peers influenced them very much or to some extent, in contrast to 12.5%
very little and 25% not at all (see Fig. 5.3). Emily and Katharine both stated that their
peers sometimes reminded them that they had to do a piece of homework or that a
second draft was due. Without that they would sometimes have simply forgotten to
do an assignment. This was also true for Susan who explained that her peers were
of very little influence on her decision to work with feedback but reminded her to do
the homework and correction. In contrast to that Emily and Katharine also looked
at their peers’ pieces of writing and tried to help each other while working on the
second draft.
Veronica and Julie claimed that peers only influenced them to some extent, but
also sometimes asked them for advice when they got stuck while working on their
second draft. Contrary to that was Michael’s and Alan’s view that their peers did not
influence them at all. By no means could they imagine how peers might help them
to engage with written CF. The learners’ responses show the diversity of opinions
amongst them, which has an impact on giving feedback: What engages one learner
might be quite ineffective for another one.
Although some learners were rather critical about peers having an influence on
them, they agreed that peers could indeed help you:
Emily: And when you are alone.
Veronica: You give up.
Emily: You give up.
Researcher: Mm hmm.
Anne: Yeah.
Michael: Yeah, …. and I think others can help. (Foc Group II, Paragraph 1014–1019)

Sharing their thoughts and doubts with some of their peers very often helped them to
keep going. Had my research population been on their own, they would have given
up. Similarly, some of them realised that forming study groups was quite effective,
because “what I, um, saw was that when you learn, um, in, in, within a group […],
then there are always people that are motivated and they help the others.” (Michael,
Paragraph 334). Especially in their final year they had quite a few study groups for
various subjects, which they realised would have been beneficial throughout their
school career. Emily reported that some of them talked about errors they made in
English which, in her opinion, helped them raise their awareness of them.
Anne and Katharine both commented on their peers’ influence on their attitude
towards work. Anne, on the one hand, stated that she did not put a lot of effort into
her school work in her first year at our school, but “I changed my friends from the
first form to the.. second form […] And with my new friends, um, it was easier for
me to learn” (Anne, Paragraph 346–348). Because her friends liked studying, she
realised that it was not so bad after all and might be good for her future. Katharine, on
the other hand, had a negative experience with her peers’ attitude towards learning
in her English group and at one point decided to switch groups:
Katharine: I was kind of.. influen… I.. It was not that.. the other teachers.. the teacher
was… bad at what she did. It was just that… It was, um, the other class it was
all to.. because we had two classes..
Researcher: Mm hmm?
Katharine: .. in it. And… I was the D class and they.. didn’t really.. wanted to be good a…
in English [mm hmm] and I wanted to be. (Katharine, Paragraph 98–100)

This lack of engagement on her peers' part triggered her desire to change
groups. This was possible for Katharine, because at our school two classes are some-
times divided into three English groups, which was the case with hers. I taught
18 learners and her other classmates were in an English group with one part of
another class within her year group. She was very pleased to be allowed to change
groups, and her statement illustrates how peers can have an impact on learner
engagement.

5.3.3 Level of English and Grades

Although teachers and the feedback method provided ranked high amongst the partic-
ipants’ reasons for engaging with written CF, improving their level of English scored
highest: 87.5% of the learners claimed that it engaged them very much and 12.5%
to some extent. Grades, on the other hand, as a reason for engaging with the
feedback method (see Fig. 5.3), presented a slightly different picture. In contrast to
improving their level of English, only 25% claimed grades influenced them very much.
The majority stated that grades had some (62.5%) and very little (12.5%) influence
on their decisions.
Alan, who stated that improving his level of English engaged him very much and
getting good grades to some extent, said:
Alan: It, it, um, it’s also relating to improving […] yourself with the, um, corrective
feedback, um, because you are, as I said, you get a positive feedback […] you
know okay, […] that I can do better and […] in a matter of time you….., um,
improve yourself.
Researcher: Yeah, so […] improving […] the […] level of language […] is more important
than […] grades for you. […] Also in the other […] languages would you say
that was also a mo, motivator? […]
Alan: Yeah, … I would, I would say that. (Alan, Paragraph 306–308)

In English in particular, doing their homework counted for 20% of the final
grade (see Sect. 4.6), which might have been one of the reasons for the learners to
engage with it. Nevertheless, the learners engaged more with the feedback method
because of their desire to improve their knowledge of English. Michael, who was
rather critical of the feedback method I used with them at the beginning, explained
“I mean, I was not sceptical about that, but I, I couldn’t imagine if that really…
helps me to […] improve my English […].” (Michael, Paragraph 156). He was the
only learner who claimed that getting good grades influenced him very little. He
reasoned that “because I, I don’t define myself.. by my grades […], or something
like that […]. I mean, […] I’m kind of proud […] of my grades, but…. they are not
the most important thing.” (Michael, Paragraph 172). Talking about our feedback
method after one of our English classes he told me that when doing his second draft
he never thought about it having a positive impact on his grade, just that he could
get better and better in English.
This notion of grades not being that important was also expressed by Emily,
because “[m]otivation and statements from you… are more for me than… good
grades.” (Emily, Paragraph 466). Improving her level of English and getting a
personal written statement were far more significant in her case. Unlike Emily,
Veronica named her lack of participation in class—due to her shyness—as the main
driving force behind her efforts to improve:
Veronica: Because I think… Eng.. English is, is very important and if I don’t [mm hmm]..
really talk so much in class you.. it’s very important to.. do the homework and..
to work, um, with the feedback.
Researcher: …. I see. So that is, um,..
Veronica: That’s.. just I think [mm hmm].. my [mm hmm].. way [mm hmm].. to get
better. (Veronica, Paragraph 558–560)

If improving their current level of the respective subject is the main reason for
learners to engage with written CF, the teacher consequently needs to choose the
feedback method(s) carefully and to find ways to support the individual learner to
achieve his or her goal. Grades might be the motivator for some, but other factors
seem to be more significant than that.

5.3.4 Feedback Method

The results of the questionnaire show that the feedback method provided engaged
75% of the students very much with written CF and 25% to some extent (see Fig. 5.3).
As feedback methods have already been discussed more thoroughly in Sect. 5.2, two
things need to be pointed out here: First, the feedback method(s) played a major role
in the learners’ willingness to engage with it. Second, very often not one factor but
several contributed to the learners’ engagement with written CF. Anne’s reason, for
example, was not only the feedback method:
Anne: …… Um,.. yes because in some subjects it.. we didn’t get any feedback [mm
hmm] but…. in, uh, English, for example,.. we get.. very… many feedbacks
from you and…. so it influenced myself to.. work in a other [mm hmm] way
[mm hmm]… When I do my homeworks… to force more myself to… to do
the work better, because.. as I said you.. it lasts you also time to correct it
and…
Researcher: And that’s why you said, okay,… put more effort into it.
Anne: Yes. (Anne, Paragraph 534–536)

Anne reasoned that because I, too, had a lot of work, as I checked their second drafts,
corrected some of them and gave them a second personal written statement, she engaged
more with the feedback method. Susan, who stated that the feedback method provided
and teachers influenced her to some extent, said in the individual interview that after
thinking about it she thought that teachers played a lesser role than the feedback
method, because “I mean, I know that I’m doing my homework for me […] and not
[…] for the teachers […]. And I’m doing it to improve myself […]. So, if I…. do it
only for the teachers, that’s not the.. aim […], I think.” (Susan, Paragraph 266).

5.3.5 Attitudes and Emotions

That other factors such as students' attitudes and emotions also influenced their will-
ingness to engage with written CF surfaced to some extent in the focus group interviews, but
especially in the individual interviews. On the one hand, teachers, peers or the feed-
back method were sometimes less important than their own laziness or commitment
towards doing their homework and engaging with feedback. Interestingly enough,
some students mentioned that they had to force themselves to do the writing tasks,
as they did not like the writing process as such, because "it takes so much time"
(Veronica, Paragraph 82). Emotions, on the other hand, played a part too, ranging
from simply having a bad day to anxiety.

5.3.5.1 Individuality

The majority of learners wanted to be seen as individuals and not only part of a peer
group. Julie remarked that “when we talked about poverty, […] I really got into it
and I wanted to […] write my opinion. And I was really proud of the text that I also
uploaded. Because that was my opinion, my […] statement.” (Foc Group I, Paragraph
371). Julie mentioned that she always tried to write the homework assignments, but
if she was not able to express her opinion clearly, she decided not to finish it as she
did not want to copy any ideas from her peers or from the Internet. In her case the
individual touch when writing her first draft and then editing it was the main reason
for working with the feedback she got. Alan voiced a similar view, because “I wanted
to […] express my own ideas” (Alan, Paragraph 420). Being able to convey their
messages and to show how they thought about a certain topic was an important factor
influencing their engagement as well. As Emily said:
Emily: .. I don’t like to think in a box […]….. I like [I see] to think out of the box
[….]…. I, I like to have the free t,…. to be not in a box like in… Business
Studies […] where you always had… you have to on that format, you have to
have that, that, that and that.
Researcher: I see… Aha.
Emily: For me.. I like to think […] of the box […]. To be creative.. to try it out
something new.. And… I know.. it.. takes more time.., but it’s also.. I like it
more. It’s more,… […] more.. not that stress..ful for me…. I just figured out in
the Business Studies project […]…, because it was… just stressful. Because
you always have to think… is it… Is it right in the box? Or is it… then outside?..
Or? […] You have to be… back again. (Emily, Paragraphs 310–312)

Being more than just part of a whole seemed to be very important for most of the
learners. Emily often commented on the fact that she wanted her teachers to see her
as an individual and not “just one of.. hundreds […] It’s not, you are not possible (=
you are invisible)” (Emily, Paragraph 610). One of the reasons why she appreciated
the personal written statement so much was that "you think of me […] when
you write that down” (Emily, Paragraph 420) and “you give your heart to that.. […]
and give me the right, um,….. give me motivation” (Emily, Paragraph 422). Her
individuality was acknowledged as every statement started with Dear Emily and
not only her strengths and weaknesses were pointed out, but tips on how she could
improve were included as well. Michael stressed that “when the teachers, um, tells the
class, yeah, the homework was okay […]. Then it’s like, yeah, and what’s with me?
[…] I mean I can’t, I can’t do anything with that information. […] I want feedback
for my own.” (Michael, Paragraph 258). Learners appreciated individual feedback
they could work with and ask the teacher for help if they did not know how to tackle
a certain problem.
Another good example of how different learners are was Susan, who did not do
any homework assignments in the first semester (see Table 5.1). A teacher could
assume that she did not do it, because she was not interested in the feedback method
or simply did not want to do her homework assignments. Her reason was one that
had nothing to do with the feedback method, peers, teachers, etc. Susan was too
preoccupied, as she was interested in politics and was taking part in the local
elections in her hometown. Her time was rather limited, and at some point, she had
spent more time on the election campaign than on school work. That changed rapidly
in the second semester, where she did all the homework assignments—including the
second draft—because “then when I’ve learned for the, um, test I’ve realised that it
would have been better to do the interpreting, interpretation of the cartoon.” (Foc
Group I, Paragraph 4). It was rather difficult for her in the written exam of the first
semester to analyse the cartoon as she had not practised it beforehand. Her conclusion
was that doing her homework and engaging with the feedback she got was beneficial,
hence she did every first and second draft in the second semester.
In some cases, the learners’ individualities determine the level of engagement with
written CF. Activities outside school might have such an impact on the learners’ lives
that they simply decide to put less energy into school work. Occasionally, learners’
interests shift their engagement from written CF to something completely different—
a factor teachers should keep in mind as well when wondering about reasons for their
learners’ lack of engagement.

5.3.5.2 Self-Confidence

Katharine, Anne and Veronica were all rather shy and quiet learners at the beginning.
Speaking in front of a large group was not one of their favourite activities in class,
which was one reason for their engagement with written CF. Veronica stated “it was
always about […] when I do my homework […] I don’t have to participate in class
so much." (Veronica, Paragraph 372). Anne, who hardly spoke at all in her first year
at our school, realised after some time that she "saw that.. others are also not perfect
in English […] and.. it helped me.” (Anne, Paragraph 532). The same was true for
her written English as she also thought that others were far better than her, but after
a while she understood that doing the first and second draft helped her improve in
her written and spoken English. Katharine, on the other hand, was quite good at
writing, but did not think so herself. Very often, when reading what some of her
peers had written, she felt that it was “really bad […] what I wrote. I just don’t
want to look at it again […] God, you are such an idiot!” (Katharine, Paragraph
746). This view was contradicted by Michael who explained “I often read through,
um, Katharine’s homework […], because I was, I was very, very interesting in that.”
(Michael, Paragraph 340). Furthermore, he stressed that he could understand why his
classmates sometimes did not work with the feedback they got: “[I]t’s just annoying
sometimes when you always hear that you are bad in that kind of thing […]. But
[…] you never get a clue of, of what you are good at.” (Michael, Paragraph 346).
He believed that it would boost learners’ self-confidence if they were told what their
strengths were and how to work on their weaknesses.
In contrast to Katharine was Emily’s view of herself. Although she sometimes
struggled with getting good grades or positive feedback she claimed “when I have..
worked a lot […] on it.. and.. it doesn’t… really have been good […] I don’t want
to be sad of it […], because I do a lot of work and I want to […] be… proud of the
work that I do” (Emily, Paragraph 470). Not being able to live up to the expectations
of some of her teachers did not bother Emily: “I know I’m not… stupid.. […] I know
I’m intelligent” (Emily, Paragraph 280). She even admitted that every now and then
a bad grade or negative feedback was a result of her not putting enough effort into
her school work.
Although some learners might think highly of a classmate's piece of writing,
the learner him- or herself does not necessarily share the same view. Thus,
a learner's level of self-confidence could have a positive or negative influence on
his or her engagement and is a factor to consider when giving feedback, as a harsh
comment could negatively affect learners' confidence in their own abilities.

5.3.5.3 Commitment

Another essential factor influencing learner engagement with written CF, which
emerged during the focus group and individual interviews, was the learners' level of commit-
ment. It is crucial to view commitment as an interplay between students and teachers
where working with and giving feedback might be a driving force behind learner
engagement. Some of the learners, for example, believed teachers recognised how
much effort they put into correcting their homework, because they knew "if the
student had tried to find the mistakes or not.” (Foc Group I, Paragraph 77). For
Susan commitment was something which “you’re not doing […] for the teacher, to
be honest, you are doing it to get better in school.” (Foc Group I, Paragraph 321). She
thought that too many of her classmates did their homework because of a teacher,
their parents or peers and not to get the most out of it for themselves, an attitude Susan
disliked: "I mean we are all here because we want to be here […], and not because we
have, like the Schulpflicht (= compulsory education).” (Foc Group I, Paragraph 89).
The level of commitment was something the learners themselves could influence,
and as Julie said “we, students, should do it, in order to improve ourselves and to
help the teacher help us improve ourselves.” (Foc Group I, Paragraph 304). Veronica,
too, emphasised that “in the end… I have to be good at the exam […]. And.. when I..
don’t.. improve myself…. and try to get better then… I won’t get better” (Veronica,
Paragraph 348).
Their opinions on why learners were committed to working on their pieces of
homework were rather diverse. One aspect they disagreed on was whether homework
should be done voluntarily. Alan suggested that it would be a good idea to give the learners more
autonomy, but the others disagreed with his view. Julie, for example, believed that it
would have had a huge impact on her commitment, because “if you have no deadline,

Fig. 5.4 Commitment—Worked harder than expected (strongly agree 12%, agree 25%, disagree 50%, strongly disagree 13%)

you won’t do it anyway.” (Foc Group I, Paragraph 311). The deadline actually helped
her to do the homework most of the time: “[F]rom my personal…. point of view […]
I need a deadline, […] because then (= without it) I can’t finish it, because I’ll be like,
oh, there is… still time […]. And maybe… if you would have given us no deadline
at all, [it would] be like […] 2020” (Julie, Paragraph 192–194). Susan believed the
purpose of homework should be that “you don’t do it, because you must, but you
do it, because […] you want to get better in English or in German.” (Foc Group I,
Paragraph 316).
In the fifth question the participants had to agree or disagree with several state-
ments, and item h “I have worked harder than I expected to do in school” asked about
the learners' level of commitment to school work in general. 50% of the students
disagreed and 13% disagreed strongly (see Fig. 5.4) that they had worked harder
than expected. This was also mirrored in the focus group and individual interviews.
Some of them commented that their lack of commitment was linked to their own
laziness from time to time (see Sect. 5.3.5.4).
Susan and Emily both stated that their level of commitment changed depending
on how much work they had to complete for other subjects: “[B]efore a Accountings
test… I won't be so motivated for English." (Susan, Paragraph 342). Other
factors influenced them as well. Anne, for example, said that her level of
commitment always seemed to decrease in May and June, because she wanted to
enjoy the beautiful weather instead of doing her school work. Emily, on the other
hand, had a part-time job that sometimes cut into her school work, making it difficult
for her to meet all her obligations. Susan's political ambitions steered her commitment
in that direction, resulting in a decline in her performance, which she rapidly reversed
in the second semester because she realised that her achievements at school might
be important for her future career. Another factor that sometimes influenced her was:
“I know that I can do it better bu, but if I.. don’t force myself then I […] sometimes
[…] just do it because I have to do it and not because I really want it and then it’s
not so good." (Susan, Paragraph 40). This was a notion Veronica was familiar with too: "at
the beginning of the year.. you think, okay, you do that and then you do that […]…
And then… in the end of the year you think… okay… I….. I could be… could be
better.” (Veronica, Paragraph 422).

5.3.5.4 Laziness

Another mediating factor the students named in the focus group and individual inter-
views was their own laziness, which had an impact on their engagement with written
CF. They stressed, however, that their laziness was also affected by outside influ-
ences such as part-time jobs, friends, family, etc. Various reasons were mentioned
why they sometimes simply could not motivate themselves to engage with feedback.
Julie stressed that “it’s at some part also all our fault. That we just wait too long for
it, and we are just like I can do it next, on the next day.” (Foc Group I, Paragraph
88). Julie believed that, most of the time, when learners had missed a deadline, they
would not do it "even if the teacher gives you a second chance." (ibid.). Anne, Emily,
Katharine and Veronica agreed that their own laziness every now and then prevented
them from engaging with feedback.
Thinking about causes for not engaging, Emily admitted “maybe it’s also my
mistake. I didn't put myself into it." (Emily, Paragraph 106). She realised that engaging
with written CF not only helped her during her school career but might also be beneficial
for her future career, where writing papers would be an essential part, which is one of
the reasons she decided to do her second drafts in English and fight her own laziness.
Michael, one of the learners who hardly ever forgot to do a homework
assignment, wondered about that too:
I think for students the most, um, important thing, and the most motivating thing is when
you see your improvements […] And I think that’s really, um, the one thing that can help
you to overcome laziness, or to fight laziness. When you see okay, I got better, probably I, I
got a better grade […]. Because, um, what I, um, saw was that when you learn, um, in, in,
within a group […] there are always people that are motivated and they help the others […]
to fight […] their laziness. (Michael, Paragraph 334).

Alan thought this notion of getting better was sometimes lacking, because “I think
[…] some students are …. too lazy to look at it, and they simply ask the teacher, …
what they have done wrong.” (Foc Group I, Paragraph 76).
The issue of being lazy resurfaced several times during English lessons as well.
Many learners were wondering why it was so hard to overcome their own laziness.
Several learners tried to fight their own laziness by drawing up a schedule and if
they managed to keep to it, they rewarded themselves with a treat. Setting yourself
goals is a worthwhile strategy and a useful tool which teachers can promote as well.
My research population stressed that being well-equipped with strategies and being
able to talk with teachers about their problems with engaging in school work helped
them to some extent. Fighting their own laziness is definitely a difficult endeavour,
especially when so many other interesting things are going on in the lives of our
learners.

5.3.5.5 Interest

The learners named being interested in a particular topic or subject as another factor
impacting on their engagement with written CF. Julie voiced her opinion on that
matter rather strongly, because “in my case it depends on the topic. […] Like, like I
said with ….. doing the homework. It depends on the topic, if it’s interesting for me,
if I can … write a statement of my own, then I’ll do it, and then […] I’ll even correct
it.” (Foc Group I, Paragraph 216). Her interest in a certain topic was not only the
reason for doing her homework, but also for writing her second draft. What is more,
she explained “I mean when we … discussed in, in class about globalization ……. It
wasn’t my topic, it wasn’t … interesting that much for me.” (Foc Group I, Paragraph
371). Although she participated actively in discussions on globalization, Julie could
still not get herself to write the article on it. In contrast to that, "when we
talked about poverty, I was, …. I don’t know. ……. I really got into it and I wanted
to […] write my opinion. And I was really proud of the text that I also uploaded.
Because that was my opinion, my […] statement.” (Foc Group I, Paragraph 371).
Not only was the topic interesting for her, she could also incorporate her individuality
(see Sect. 5.3.5.1) into her commentary on poverty.
For some it might be the topic, for others like Susan it “depends also if you like
the subject, or not. Because when, for example, I like German, and then I overlook
(= check) it when […] Mrs Edwards had, um, correct it. But in Spanish I never
would overlook … it, because I really don’t like Spanish.” (Foc Group I, Paragraph
46). Besides her dislike of Spanish, error correction and reformulation (which were
the preferred feedback methods of her Spanish teacher) also contributed to her lack of
engagement with written CF. Her responses in the questionnaire backed up this statement,
as these feedback methods engaged her very little or not at all.
Alan and Veronica were thinking about the connection between the topic and
their performance. Alan indicated “if you are interested in something, then you are
proud. […] You want to make it and you are proud that you have made it.” (Foc
Group I, Paragraph 10). Veronica, on the other hand, believed “[s]ome topics are
more interesting than others and if a topic is not as much interesting as the last
were, you are not so good.” (Foc Group II, Paragraph 811). Both of them stressed
that their interest in a topic helped them to write better texts and express themselves
more clearly. This was supported by Julie: "because when, when I was interested
I had the motivation for it, and I had […] the ideas for it without.. checking up on
the Internet” (Julie, Paragraph 82). Obviously, this way it was far easier for them
to argue their point of view well. As a result, not doing their first and second drafts
was sometimes due to a lack of interest in the topic/subject on the learners’ parts.
Katharine pointed out an interesting aspect as well, as she stated that she was
interested in a topic, but had difficulties in expressing her opinion. That was why she
sometimes did not do her homework assignments. She would even start writing, but
not finish, as she was dissatisfied with what she had already written. Contrary
to that, learners sometimes felt that they were forced to express the teacher's and not
their own opinions. Julie highlighted “[s]o I, I know it’s not really my text, I was kind
of forced into doing it. So that I get my points or my grade.” (Foc Group I, Paragraph
216). Veronica shared Julie’s opinion, because “I think it’s… most[ly] about her point
of view. […] And the others are […] not so good.” (Veronica, Paragraph 60–62).
Apart from that, a lot of the learners mentioned that many of the set tasks in the various
topics were "just boring and you (= it) don't makes fun." (Foc Group II, Paragraph
217). Susan was really upset by a lack of interest on the one hand, and the obligation
to do certain things learners either did not want to do or did not understand the reason
for doing on the other hand:
Susan: Yeah, but, when, when you learn to play the guitar, for example, there is also nobody
that forces you. You do it, because you like it and why [yeah] can’t school be also
… like learning an instrument?
Alan: Yeah, I think the, our school system is wrong. ……. But, yeah, I think … if you do
something with passion [mm hmm] then you … are, yeah, you get used to it longer
and you can learn it easier. But in school it, hmm, I think it’s a different, you can’t
compare with learning, uh, to play a guitar. (Foc Group I, Paragraph 95)

In the learners’ view learning and studying should be fun. They were well aware
there were tasks they did not like but still had to do; however, they wanted to know
why. The same was true for feedback methods—being told the purpose behind these
made all the difference for them, which strengthens the need for the Engagement-
Mediator-Feedback Model where explaining the purpose of the feedback method is
essential for its effectiveness.

5.3.5.6 Fixed Roles

During the focus group interviews the participants also started talking about the
Austrian school system and the perceptions they had of it. Alan thought that many
people had a rather fixed mind-set about what school should be like and especially
in their case forgot “that we are here voluntary[ly].” (Foc Group I, Paragraph 95),
which was true because their education was no longer compulsory (see Sect. 4.1).
They added that one of the pre-defined perceptions about being a student was that
teachers were authority figures and students had little or no say in the feedback
they got, which topics they discussed, etc. Thus, teachers told them what to do but
working together as a team did not take place. This was the students’ opinion of their
relationship with some of their teachers as well.
Julie emphasised “like everything is already pre-made for you […]. And in the
end of the year, … everyone is asking so much from you that they (= teachers) are
already thinking, okay she studied it, she knows it.” (Foc Group I, Paragraph 338).
In their opinion, some teachers assumed that because the learners had heard something once,
they should have stored the information, which is not the case, as it takes much longer to
process information before it is stored in long-term memory. The same is true for
recurring errors—it takes quite a while to get it right and some errors simply persist.
Teachers probably sometimes need to remind themselves that students are just that:
learners who make mistakes, some of which will take longer to be erased, even if
a teacher might think that the students should know better by now.

On the other hand, learners very often join our school and are astonished by the
level of self-discipline and autonomy that is expected from them. Many of them say,
as did my research population, they have been spoon-fed in their previous schools.
Very often they lack creativity and self-organisation, as they were given pre-prepared
schedules which they simply had to fulfil in order to pass the school year. In line
with that, it takes these learners quite a while to change from their fixed role of being
spoon-fed to actively engaging in the learning process. This is one of the reasons it usually
takes learners several months to get used to my feedback method; some learners are
rather critical of it at the beginning. As Michael said:
with the whole platform stuff, and, and, and second and… first and second draft, um, I was….
I mean, I was not sceptical about that, but I, I couldn’t imagine if that really… helps me to
[…] improve my English […]. But, as I said, when I first saw my better results,…. then I
was like, hey… it, it works out and I, I was motivated. (Michael, Paragraph 156)

First, it takes them some time to familiarise themselves with the colour code.
Second, working on their errors and self-correcting these poses a challenge to many
learners as they are not used to it. Finally, getting personal written statements that
include further tips on improving their written text on the first and second draft is
new to most. At the beginning some learners forget to read the statement and realise
a little bit later that they could have benefitted even more from the feedback, had
they read it. This is part of the Dynamic-Engagement-Framework as working with
feedback needs to be learned as well. Learners need time to adjust to something new
but if they are willing to try it out, their supposedly fixed role as feedback receivers
changes too.

5.3.5.7 Anxiety

Last but not least, the participants mentioned anxiety as another factor influencing
their engagement with any kind of feedback. When talking about peer correction,
they were thinking about reasons why learners might not want to exchange their
piece of writing with someone else:
Susan: Yeah, but what’s with students tha, they don’t like to … that others read their
texts, because they are not so […]
Katharine: They are ashamed of it, or what?
Susan: Yeah. Maybe.
Katharine: Maybe.
Julie: I don’t understand. Why should they be ashamed of it?
Susan: Maybe they think their text is not so good? They [Uh] don’t want to [okay]
……. that the others see it. (Foc Group I, Paragraph 128–133)

Although they had known each other for more or less five years, some learners
were really anxious when they were asked to exchange their work with others. Inter-
estingly enough, these learners had an online platform in English where they uploaded
their homework onto their blogs, which all learners had access to (see Sect. 4.6).
When asked in the individual interviews whether they checked each other’s pieces
of writing online, their answers were rather diverse. Veronica stated that she never
checked anything on the blogs, because “I think…. it’s about… privacy.” Similarly,
Alan always asked his peers whether it was okay to read their article, commentary, etc.
Michael, on the other hand, checked his peers’ work on a regular basis after he had
uploaded his own piece as he was interested in the ideas the others had. He was
astonished that “often it was very…, yeah, it was not the same.” (Michael, Paragraph
342).
In contrast to that, Katharine sometimes checked the blogs of her peers before
she started or in the middle of writing, especially on a topic she was not so inter-
ested in, to get some ideas. The result of that was her being even more anxious as
“[i]t was horrible…. So… Uh, okay,.. you had like [… an]… A Level topic.. what-
ever, perfectly done […]. And.. me just no… (laughs)…. It’s just the comparing..
to others.” (Katharine, Paragraph 766). Every now and then this resulted in her
not finishing her piece, because she thought hers was not good enough (see also
Sect. 5.3.5.2). Susan, who also sometimes checked her peers' pieces before she started
writing, reacted totally differently:
Researcher: […] Have you ever checked onlike, online!, on our homework blog what the
others did,… um, for this particular homework? […]
Susan: Yeah, when I have no idea [Yeah?] what [okay] I should write.
Researcher: Mm hmm, and then you checked?
Susan: Yeah, then I needed some inspiration.
Researcher: Mm hmm. And you did get inspiration from the others?
Susan: Yes. (Susan, Paragraph 351–356)

Whereas Susan simply checked the others’ pieces and got ideas for her own,
Katharine got really anxious and concluded that some of her peers were far better
than she was. These statements clearly show that whereas some learners profited from
checking their peers’ homework, others were demotivated by it as they could not see
the benefit. This is one problem teachers need to bear in mind when working with
feedback methods as well: one method might work for one learner straight away,
but another learner might need an explanation to be able to work with it. Sometimes even that
might not help, as Julie said: “When I understand my errors then.. I can… improve
and then it interests me […]. Sometimes there are, there, I do mix, mistakes […]
and the teachers tries to explain it to me and I still don’t get it and then I’m like I’ll
leave it […]. I don’t want… to do it.” (Julie, Paragraph 325). One possible outcome,
if learners are not provided with the type of feedback (or assistance with feedback)
they need, may be that they lose interest and disengage from the written CF they
receive (Pitt & Norton, 2017).
Finally, Emily believed one of the reasons for not engaging with a certain feedback
method was “you are afraid of them, …. that’s the problem.” (Foc Group II, Paragraph
987). Other learners agreed with her, highlighting that some of their peers were so
afraid of getting it wrong or being told off by the teacher for making mistakes that their
solution was not doing most of the homework assignments to avoid disappointment.
As Susan put it, "you always hear what you have done wrong, but nobody tells
you […] what you have done good. […] you are afraid doing it wrong again.” (Foc
Group I, Paragraph 156). In one of our English lessons anxiety was addressed as well
and a lot of the learners reported that they had one teacher who they were simply
afraid of, resulting in their lack of engagement with—or even withdrawal from—
written CF. Not only was that particular teacher very strict, but she was also often rather
harsh in her comments on learners' written texts. When they dared to ask her how
they could improve their writing, she replied that they should practise more, without
any recommendation as to which area exactly they needed to work on. Similarly,
when they asked her to explain a type of text once more, she often told them that they should
know that by now and that she would definitely not revise it with them. Because of her
reactions, many learners did not dare to ask anything at all, in order to avoid being
told off or even shouted at. Unsurprisingly, anxiety is a powerful mediating factor.
If a learner is introverted and then encounters a teacher like the one my research population described,
disengagement from feedback is a very natural reaction, as learners want to protect
themselves.

5.4 Strategies for Working with Feedback

The analysis of the data clearly shows that no matter which feedback method is
used, the crucial part is to explain to the learners the reason for choosing a particular
feedback method as well as to provide them with strategies to work with it, because
“teachers should be the, there to lead the students in the right way to correct their
homework” (Foc Group I, Paragraph 309). Otherwise “if […] you can’t work with
this kind of feedback […] then you won’t improve.” (Susan, Paragraph 316). As Julie
pointed out in the focus group interview: “[H]ow can she give it (= homework) back
to you and just underline what’s wrong, when you don’t have a clue how to go […]
further on?” (Foc Group I, Paragraph 43). Her problem was “what are the mistakes?
And what do you do then?” (Foc Group I, Paragraph 63). Most of the time, after
consulting a teacher and not getting a (satisfactory) answer, Julie decided to not put
more thought and energy into it.
Therefore, one strategy is to see feedback as a dialogue (see Sect. 5.3.1.3) between
learners and teachers. This issue surfaced during one of the focus group interviews
too. Asking the learners about the feedback method(s) and enquiring if something
needs to be changed or added, can definitely lead to an improved dialogue between
learners and their respective teachers. Julie commented on the aforementioned lack
of communication by stating that “we do the homework and lots of people don’t
understand why there is a solution for this, um, like why we have to go in that
direction and not in a other.” (Foc Group I, Paragraph 25). What was missing for the
learners was “for example in Maths, or in […] German, […] she doesn’t sit with us
down […], so we can do it together. … And then we don’t know, what is the right
correction.” (Foc Group I, Paragraph 25). Even working together with other peers
did not help them tackle the problems they encountered. The participants stressed
that they would have needed the teacher’s help and also advice on a strategy (or
strategies) they could use to solve problems on their own.
In the focus group interviews the participants commented on the feedback method
we used in English as well. Some interesting issues surfaced that led to a change of
certain areas of it (see Sect. 4.6). At the beginning the learners just had the colour
code, about which Alan said: “sometimes […] it’s okay if there is only […] the colour,
[…] But sometimes, as Katharine said, […] you don’t know what could be wrong,
and […] you think again, and think, and think, and without end.” (Foc Group I,
Paragraph 239). Katharine supported Alan’s view as “sometimes it takes me way too
long to figure out what’s wrong. So, because […] I wrote it, I think it’s okay.” (Foc
Group I, Paragraph 238). Susan stated that she sometimes saw it as a guessing game,
simply trying out something and hoping it was correct (cf. ibid., Paragraph 240).
When asked which areas they definitely needed clues on, they identified grammar
and content as well as sometimes expression. As a consequence, I discussed this
with the whole class and most of them agreed that it would be easier if they got
clues for these areas. Emily, for example, emphasised that she would need more
clues, especially in grammar, because otherwise “it’s like really a quiz.” (Group II,
Paragraph 667). In her case the clues she got saved time and helped her in correcting
her errors. What one learner prefers, however, can be rejected by another, like Julie,
whose reaction to clues was rather different. Her needing less assistance shows her
evolving autonomy in responding to the feedback she got:
Julie: Yeah, maybe I would.. correct it, but then it would be too many hints, and then
I […] I’m forced to [mm hmm] pick that tense in that word and.. everything…
But, um,… when I don’t have that, when it’s only [mm hmm] like… [the
colour]
Researcher: [Blue or whatever, yeah]
Julie: Blue [aha], then I have to think about it. What fits more.. [mm hmm] what is,
what is better, then I know what kind of mistake I did.
Researcher: Yeah, okay.
Julie: And then I fix it. And if it’s correct, then I learn something by it, but if [mm
hmm] I have too many hints, I will be like, yeah, […] what’s the point then.
(Julie, Paragraph 496–500)

As a result, Julie and I agreed she would only get the colour code in order to ensure
her engagement with the feedback method. Taking Vygotsky’s ZPD into account, in
Julie’s case too much assistance would have been detrimental, because she wanted
to figure out her errors on her own. Due to the dialogic approach towards feedback,
we co-constructed a way that benefitted her engagement with written CF. When she
did not know what was wrong, she always asked me in class or contacted me on our
blog on CourseSites. Her response to the planned adjustments proves that not only
strategies but also the learner’s individuality (see Sect. 5.3.5.1) could be a determining
factor when engaging with feedback (cf. Bitchener, 2019).
Because of the learners’ comments on the feedback method during the focus
group interviews, I included question 9 “In your experience during your final year,
about how often have you done each of the following when working with written
corrective feedback on your English homework?” in the questionnaire I used for the
follow-up interview. They had to state how often they corrected spelling, grammar,
words/phrases, word order, improved sentence structure, worked on writing style,
added ideas or voluntarily re-wrote parts of the assignment. The results (see Fig. 5.5)
show that grammar, words/phrases and added ideas were mentioned as areas they
corrected very often or often.

[Fig. 5.5 Writing—Areas worked on: how often each area was worked on (Very often, Often, Sometimes, Never; percentages)]

These correlate with the areas grammar, expression
and content of their feedback in English, which the learners remarked on as being
difficult to correct in the focus group and individual interviews as well. Therefore,
clues in these areas are a necessary strategy in order to make it easier for most learners
to engage with this feedback method.
Interestingly enough, learners spent less time on their sentence structure or on
re-writing parts of their written text. When I talked to them in class about this issue, many
of them said these areas needed a lot of time and they were not always willing to put
much time into correcting their texts. For them writing was still more a product than
a process, as English was the only subject where they had to do a second draft. Some
learners speculated that had they had this feedback method from the very start, they
might have been more used to writing being a process and not a product.
On the other hand, more than two-thirds of the learners added ideas very often
or often, showing that slowly but steadily they embraced the idea of writing
being a process, like Michael, who said that sometimes “I correct something that isn’t
highlighted.” (Foc Group II, Paragraph 694). Alan, for example, was dissatisfied with
his article on sponsorship and decided to add other ideas, which led to him changing
most of his first draft. Similarly, Michael, who at first had difficulties writing reports,
stated in his blog entry:
Rural outsourcing
Posted by Michael on Saturday, 21 March 2015 10:00:34 o’clock EDT Last Edited: Sunday,
22 March 2015 15:17:21 o’clock EDT

Hello Miss Moser,


here is my corrected report about rural outsourcing. I hope that I was able to improve my text.
As you will see I put in a completely new paragraph about the definition of rural outsourcing
as I thought that it is actually a different issue than the reasons of the development. Please
let me know what you think about that and my other corrections.
See you on Monday!
Best wishes
Michael

Both instances show that the learners deeply engaged with the written CF they
got and autonomously added new ideas to improve their respective texts.

5.4.1 Strategies for Writing

Question 8 of the questionnaire “How much work have you put into each of the
following activities in your English homework?” asked about brainstorming ideas,
planning the structure, writing and editing a first draft, writing a second draft and
reflecting on the teacher’s final feedback. The participants, for example, knew that
brainstorming ideas and planning the structure of a text was important and the results
of the questionnaire show that in English 50% of the population spent very much
time on these. Less time was devoted to writing and editing a first draft, as well
as writing a second draft. One reason for not doing their second draft was that
“sometimes I know.. when it’s really bad […] what I wrote. I just don’t want to look
at it again [… Once] is enough” (Katharine, Paragraphs 746–752). Reflecting on the
final feedback they got, on the other hand, was what 75% of them did very much
(see Fig. 5.6).

[Fig. 5.6 Work put into activities: amount of work reported per activity (Very much, Some, Very little, None at all; percentages)]

The majority of the learners reported that writing all types of texts was quite a
challenge for them. What is more, a lot of them stated that they lacked the knowledge
of how to structure their writing process. This is an aspect that has to be taken into
account when teaching writing, too. Learners are more likely to engage with the writing
process if they are not only well-equipped with strategies for writing, but also know
effective tools that support it. If plenty of these are available to learners, it becomes
easier for them to tackle the difficult issue of text composition.
The results of question 8 emphasise that learners need to be familiarised with
strategies to be able to write effectively. Michael pointed out that one subject (=
Personal Development and Social Skills) they had in their first year at our Business
School had helped him to realise how important strategies were to succeed in school.
Nevertheless, he explained “one thing […] I never learned, basically, [was] how to
brainstorm […] when I wrote a text I was like okay, just do it” (Michael, Paragraph
280). He mentioned that he structured his texts, but “I was kind of in the middle
[…] my text, then I began.. to structure it.” (Michael, Paragraph 288). Consequently,
strategies need to be explained to learners, so they can draw on them when needed.
Michael, for instance, used a couple of strategies, but applied some of them a bit
differently and others not at all. Alan, on the other hand, who had hardly used any strategies in the
first two years, changed that “because I noticed […] if I do… brainstorming… and
planning […] before a text […], it’s much easier to write then.” (Alan, Paragraph
328).
“How much has your experience at your school contributed to your development
in the following areas?” was question 2 of the questionnaire, where the participants
commented on writing clearly and effectively, thinking critically and effectively,
working effectively with others and learning effectively on your own. Throughout
their school careers the learners had to produce many texts, but how much their
experience at school contributed to writing clearly and effectively differed a bit. 50%
of the population said very much, whereas 37.5% and 12.5% claimed that it had little
or very little effect (see Fig. 5.7).

[Fig. 5.7 Development of areas improved through experience at school: writing clearly and effectively, thinking critically and effectively, working effectively with others, learning effectively on one’s own (Very much, Some, Very little, Not at all; percentages)]

Anne stated that she had difficulties improving in
German, because in her view she did not get enough support from her teacher. Anne,
Emily and Julie then stated in their individual interviews that they finally got support and
were given a couple of strategies by their German teacher before their written A
Level exams in May. As Emily illustrated, it “was good […] in German that… we did
it together […]. And… we underline […] every… grammar part together.” (Emily,
Paragraph 360). What is more, they highlighted that they discussed common errors
as well as strategies they could employ during the written A Level exam with their
teacher. All of them stressed, though, that this was what they would have needed a
long time ago in order to improve in German and not just before their final exam.

5.4.2 Cooperation

Cooperation was one aspect that helped a lot of the learners when they struggled
while writing or brainstorming ideas for a type of text as well as while doing their
final year project (see Sect. 5.1.1 for an explanation of this concept). Looking at
the results of the questionnaire (see Fig. 5.7) 50% of the population stated that their
experience at school contributed very much to working effectively with others. Alan
highlighted “I think especially if you are unsure about your text, […] then it’s the
right way if you ask somebody of your […] colleagues …. if they could help you”
(Foc Group I, Paragraph 149). He argued that peers looked differently at your text
and sometimes pointed out areas that should be improved, or had ideas that could be
implemented in the text. Katharine had a similar experience as it “helps me when I
get, um, ideas from other person about a topic, and then I can write about it.” (Foc
Group I, Paragraph 158).
Anne also emphasised that working as a team and being able to draw on the
expertise of her peers had helped them to finish the final year project in the end.
Working in small groups had further contributed to their cooperation. The participants
also mentioned that working in small groups was not always easy, especially when there
were peers not doing their share of the work. In order to tackle this particular problem,
they tried talking to the respective peers, which sometimes helped. Every now and
then “the others don’t do their work […]. But we need it for this week […] and
then…. you have sometimes to write it for the others” (Susan, Paragraph 98) as not
handing in their parts by the deadline had an impact on the final grade for the project.
Some of the learners ended up writing parts for others, but some, like Michael, refused
to do so, because “I did not want to write the whole project. I mean […] we were
five people […] and in my opinion everybody should do something for the project.”
(Michael, Paragraph 80). He was lucky, though, as in his group all students did hand
in their respective parts on time. The same was true for Veronica, and after getting
their final feedback this group even corrected their project together at school after
they had finished for the day. Katharine and Susan, on the other hand, were not that
fortunate and ended up doing some of their peers’ work.
All in all, the learners agreed that cooperating with each other had a huge impact on
their attitude towards working together and working with the feedback they got. The
final year project groups, where all students finished their parts on time, benefitted
from each other, whereas the others were really frustrated at times when they had
to write yet another part for some of their peers who did not cooperate fully. The
students were of the opinion that the concept of teamwork should be used more
often but should also be explained to them in more detail to ensure that the majority
of students understood the importance of doing their share.

5.4.3 Time Management

When analysing the focus group and individual interviews the aspect of time emerged
as one factor that influenced learner engagement with written CF. The majority
of learners linked time—mostly a lack of it—with pressure they had experienced
throughout their school career, which was caused by peers, teachers, parents, etc.
Susan, for example, explained that there were so many teachers who demanded some-
thing from them that they postponed doing their homework to the last minute and
repeatedly missed the deadline. Alan felt “I have to learn for three tests, and have to
do much homework. […] I have so much […] pressure.. that I don’t want to do the
homework.” (Foc Group I, Paragraph 5).
Susan made an interesting point about the relationship between pressure and
engaging with feedback “because if you have so much pressure for doing homeworks,
making presentations, and this and that […] you will reach the point that you just
do it, because you have, and you don’t think about that what you are doing.” (Foc
Group I, Paragraph 89). She believed that the amount of work you had to do for
other subjects sometimes resulted in not engaging with feedback thoroughly. For
Katharine, however, the workload led to her being stressed and sometimes even panicking
that she might not be able to finish all her assignments on time.
Emily was especially worried about a lack of time, because it was “one reason
for not doing your homework […] Because I just want to take more time for that
to be concentrated on it. […] Like you could do it better, but you don’t have the
time.” (Foc Group II, Paragraph 17). In her case her part-time job was one of the
reasons for not having enough time to spend on school work. She admitted, however,
that time management played an important part as well, which she was not so good
at. Further, very often she did her homework in a hurry, resulting in her not being
satisfied with the first draft, because it did not meet her expectations. Susan was of
a similar opinion: “I want to do my homework […], and then I think oh, no, this is
more important and I can do it later.” (Susan, Paragraph 80). Time management was
a problem for her as well as for a lot of the other students. Learning how to prioritise
tasks and manage their time wisely seems to be one strategy teachers should work
on with their learners.
To illustrate that time management was indeed an issue with most of the popu-
lation, their creative assignment What I always wanted to tell you best proves the
point. I told the students in September that they could choose the topic as well as
the type of text and set the deadline for the end of April. I believed that most of the
learners would do this assignment right away as the first couple of weeks at school
were rather quiet. I was proven wrong as only Veronica uploaded it in February and
everyone else in April. Anne, Alan and Susan, for example, uploaded it very close
to the deadline. When questioned in one of the English classes why they had submitted
it so late, they named having had too much time and their own laziness as
reasons. This shows that time management is a skill students need, but which prob-
ably has to be implemented in school from early on for learners to manage their time
more effectively.

References

Ashdown, S., Clarke, D., & Zekl, C. (2012). Focus on modern business 4/5 (p. 17). Berlin and Linz:
Cornelsen.
Barkley, E. F. (2010). Student engagement techniques: A handbook for college faculty. San
Francisco: John Wiley & Sons.
Bitchener, J. (2019). The intersection between SLA and feedback Research. In K. Hyland & F.
Hyland (Eds.), Feedback in second language writing: Contexts and issues (2nd ed., pp. 85–105).
Cambridge: Cambridge University Press.
Bitchener, J., & Storch, N. (2016). Written corrective feedback for L2 development. Bristol:
Multilingual Matters.
Bryson, C., & Hand, L. (2007). The role of engagement in teaching and learning. Innovations in
Education and Teaching International, 44(4), 349–362.
Campbell, N., & Schumm Fauster, J. (2013). Learner-centred feedback on writing: Feedback as
dialogue. In M. Reitbauer, N. Campbell, S. Mercer, J. Schumm Fauster, & R. Vaupetitsch (Eds.),
Feedback matters: Current feedback practices in the EFL classroom (pp. 55–68). Frankfurt am
Main: Peter Lang.
Foster, E., & Southwell-Sander, J. (2014). The NTSU Outstanding Teaching Awards: Student
perspective on engagement. In C. Bryson (Ed.), Understanding and developing student engage-
ment. The Staff and Educational Development Series (pp. 139–150). London and New York:
Routledge.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the
concept, state of the evidence. Review of Educational Research, 62, 60–71.
Garcia, V., Agbemakplido, W., Abdela, H., Lopez, O., & Registe, R. (2006). High school students’
perspectives on the 2001 No Child Left Behind Act’s definition of a highly qualified teacher.
Harvard Educational Review, 76(4), 698–724.
Hamre, B. K., & Pianta, R. C. (2006). Student-Teacher relationships. In G. G. Bear & K. M.
Minke (Eds.), Children’s need III: Development, prevention, and intervention (pp. 151–176).
Washington: National Association of School Psychologists.
Higgins, R., Hartley, P., & Skelton, A. (2001). Getting the Message Across: The problem of commu-
nicating assessment feedback. Teaching in Higher Education, 6(2), 269–274. Retrieved from
http://nitromart.co.uk/jem/docs/tt/getting%20the%20message%20across.pdf.
Hu, G., & Ren, H. (2012). The impact of experience and beliefs on Chinese EFL student writers’
feedback preferences. In R. Tang (Ed.), Academic writing in a second or foreign language (pp. 67–
87). London: Continuum.
Kaplan, A., & Patrick, H. (2016). Learning environments and motivation. In K. R. Wentzel & D. B.
Miele (Eds.), Handbook of motivation at school (2nd ed., pp. 251–274). New York: Routledge.
Mosher, R., & MacGowan, B. (1985). Assessing student engagement in secondary schools: Alter-
native conceptions, strategies of assessing, and instruments. A Resource Paper for the University
of Wisconsin Research and Development Centre. Retrieved from http://files.eric.ed.gov/fulltext/ED272812.pdf.
Pitt, E., & Norton, L. (2017). ‘Now that’s the feedback I want!’: Students’ reactions to feedback on
graded work and what they do with it. Assessment and Evaluation in Higher Education, 46(2),
499–515.
Price, M., Handley, K., & Millar, J. (2011). Feedback: Focusing attention on engagement. Studies
in Higher Education, 36(8), 879–896.
Sambell, K. (2013). Engaging students through assessment. In E. Dunne, & O. Derfel (Eds.), The
student engagement handbook: Practice in higher education (pp. 379–396). Bingley: Emerald.
Swain, M., & Lapkin, S. (2002). Talking it through: Two French immersion learners’ response to
reformulation. International Journal of Educational Research, 37(3–4), 285–304.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes.
Cambridge: Harvard University Press.
Wolters, C. A., & Taylor, D. J. (2012). A self-regulated learning perspective on student engage-
ment. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student
engagement (pp. 635–651). New York: Springer.
Chapter 6
Conclusion

Simply giving learners feedback on their written work once will not be effective,
as I firmly believe that feedback cannot be seen as a one-way street. One reason
to provide learners with feedback on their written texts should definitely be that
they can work with it in order to improve their writing skills. This seems to be
logical. Nevertheless, exactly that is sometimes rather difficult to manage, because
if learners do not understand the feedback they get (cf. Chanock, 2000) it is quite
unlikely they will engage with it. Learners’ reasons for (not) engaging with written
CF are manifold and getting to the root of these might be one way to tackle the
difficult issue of engaging many learners with the feedback method(s) they get. The
concept of engagement seems to be one way to explain the complex issue of learners’
willingness to engage with it.
The overall aim of my study was to investigate the interplay between learner
engagement and written CF. On the one hand, feedback methods learners find
engaging, as well as the various ways they could engage with them, were explored
which led to interesting results (see Chap. 5). On the other hand, mediating factors
that influenced the level of engagement were looked into as well, which result in
the conclusion that strategies appear to be one crucial factor in ensuring learners’
willingness to work with written CF (see ibid.). That teachers play an essential role
in learner engagement might be accepted without a moment’s hesitation by many (cf.
Hattie, 2009), but just how much impact they can have on their respective learners
is one result of this study.
The way teachers react to learners’ questions is one of the various factors
influencing their engagement. Additionally, how teachers provide feedback can not
only foster engagement, but lead to a lack of it on the learners’ part too, especially
when, for example, one learner gets specific feedback and another one is simply told
that his or her version is incorrect without any further explanation. The
feedback method I used with this population is one way to engage more learners
with written CF. It has to be stated, though, that this study does not claim that the
feedback method used here is the best way to give feedback. On the contrary, it has
been used to illustrate how a combination of methods could help to engage more
learners with it. The proposed Giving-Feedback-Guidelines are an essential feature
of this method as learners need to know how to work with the feedback provided.
What is more, learners get feedback tailored to their needs where strengths and
weaknesses are pointed out. Furthermore, learners are encouraged to work on their
errors and then get final feedback with additional corrections if necessary. Finally,
the personal written statement on the first and second draft appears to trigger their
engagement even more. All in all, the aim of giving feedback should be to help
learners to improve, but one needs to be realistic, as it is very unlikely there will
ever be a method that engages all learners. This can also be supported by my data,
when you compare the research population with the participants of the pilot study
as well as the non-participants (see Tables 5.1 and 6.1), as the level of engagement
differed among all of them. The findings of my data clearly show that there are
many different factors influencing the learners’ level of engagement. Sometimes the
teacher has no influence on these, because, for instance, he or she cannot address
the learner’s laziness for him or her (see Sect. 5.3.5.4.). What should be considered,
however, is that learners need to be told why a certain method is being used.
What is more, learners need strategies to be able to work with it to enhance the
chances of as many of them engaging with the feedback they get, as we cannot
assume that learners automatically know how to deal with it (cf. Goldstein, 2006).
Like Skinner and Belmont (1993) emphasise, teachers “can provide structure by
clearly communicating their expectations, by responding consistently, predictably,
and contingently, by offering help and support, and by adjusting teaching strategies
to the level of the [learner]” (p. 572). Regarding feedback, I would argue that the
notion of the ‘best’ way to give feedback is, and will remain, a myth, as one feedback
method might work perfectly for some learners but not for others (cf. Ellis, 2012),
hence being flexible and adapting feedback methods to the needs of the learners is
an essential key in learner engagement.

6.1 Research Questions

Just taking the results of the questionnaire for the follow-up interviews into account,
it seems the findings for research question 1—What kinds of written corrective feed-
back do students find engaging?—are that learners engage with the personal written
statement, followed by codes in margin (see Fig. 5.2). All other methods did not score
very highly in the learners’ responses and hence might be ineffective. This conclu-
sion, however, would be far too simplistic and shows, to my mind, the limitation of
using only a questionnaire in contexts such as this one. That filling in a questionnaire
can be influenced by the learners’ state of mind or the time of the day they do it
can be best portrayed with data from Veronica and Katharine. In her questionnaire
Veronica, for instance, who ticked that good grades influenced her to some extent to
work with the feedback method, stated in the individual interview that it was actually
very much. When asked why she ticked some in the first place, she said “I don’t know
why I did that. […] It’s like with the .. questionnaire. You […] do that and […] then
two hours later you, you read … through that and then you think, okay, why ..?”
(Veronica, Paragraph 544–548). Katharine, on the other hand, ticked some for her
engagement with codes in margin, when she actually meant very much:
Researcher: Codes in margin. You said […] it interests or engages you some.
Katharine: …. Hmm. ……… I don’t know why I did that.
Researcher: What would, um, …
Katharine: It’s the thing that you do, right? […]
Katharine: Colour code.
Researcher: Yeah.
Katharine: Yeah, .. I know. … I don’t know why I ticked so.
Researcher: What would you like to tick? …. I mean that’s why we talk about it [really].
Katharine: [Yeah.] …. I like it a lot.
Researcher: Very much?
Katharine: Yeah (Katharine, Paragraph 529–542).

Katharine and Veronica’s statements highlight the fact that questionnaires, which
are arguably more objective than qualitative research tools, might sometimes produce
statements by research participants that in retrospect are inaccurate. As mentioned
before, only relying on the results of the questionnaire would have led to the conclu-
sion that two feedback methods (personal written statement and codes in margin)
were most engaging and all others not. During the individual interviews, however,
when I asked my informants about statements they had made in the focus group inter-
views and the questionnaire, it turned out that learners found other feedback methods
engaging as well, provided that they were used in conjunction with other methods.
Therefore, the individual interviews proved to be paramount as they showed that
not only the personal written statement and the codes in margin (or colour code) were
engaging, but also that error correction and self-correction were engaging when combined
with other methods. On its own, error correction was not appreciated by most students
because it provides no opportunity for them to improve their written work themselves. This
notion can also be backed by research, because several studies have shown that error
correction, especially in the foreign language classroom, should be the last step in
the feedback process (cf. Ferris, 2011; Milton, 2006).
As with other feedback methods, a lot of the learners were rather critical about
self-correction, mainly because they thought it was too much work and additionally,
they believed that without getting a final correction from their teacher they would not
benefit from it either. Nevertheless, many learners did not mind self-correction as part
of our feedback method (see Sect. 5.2.5.). As many of them stated, the majority of the
feedback methods had their value when a second step was involved. Consequently,
engaging learners with the various feedback methods can be accomplished when
writing is seen as a process and feedback as a dialogue by both learners and teachers
(see also Juwah et al., 2004; Price, Handley, & Millar, 2011).
Error indication/no correction, reformulation of sentences as well as peer correc-
tion were viewed rather critically by the participants of the study at hand. In their
opinion it was crucial to combine any of these methods with another one to ensure
their engagement with it. Otherwise most learners held the belief they would not
profit from these methods. The reformulation of sentences, especially, was not very
popular with them. The majority of the population believed their personal impact
got lost when teachers reformulated complete sentences or even a paragraph. Rather
controversial was their point of view on peer correction—ranging from very much to
not at all engaging (see Fig. 5.2)—as some of them could not see any advantage in
having their written work corrected by a peer. Nevertheless, they agreed that working
with peer correction might be beneficial for them if they got their peers’ feedback
first and then handed it in to their teacher to get a second piece of feedback.
The analysis of the data shows that according to this research population the
personal written statement should be included in any feedback method used, as the
personalised feedback (see also Bitchener & Storch, 2016; Handley, Price, & Millar,
2011; Murtagh & Baker, 2009) made them feel special and not that they were simply
a learner who received the same feedback as all the others. Consequently, the
statement needs to be individualised for each learner and not just a pre-designed
formula to be used by the teacher (see Sect. 5.2.7.). In conclusion, it is the teacher’s
responsibility to explain the feedback method used to his or her learners and provide
them with strategies to work with these. Furthermore, my data strongly supports the
fact that the combination of several feedback methods might be one way to foster
learner engagement with written CF. Using several methods could cater to the needs
of more learners, thus increasing chances of learner engagement with written CF.
If learners engage with feedback, then, what are the learners’ reasons for
engaging? To answer research question 2—In what ways do students engage with
written corrective feedback?—the three dimensions of engagement (emotional,
cognitive and behavioural) revealed a fascinating picture. First, on the emotional
dimension the data highlighted the influence of teachers and peers, as both could
cause an increase or a decline in a learner’s level of engagement. My findings demon-
strate how much the emotional relationship between learners and their teachers can
foster or hinder engagement (see Sect. 5.1.1.). Teachers having an impact on learners’
engagement (Wentzel, 2016) was also one of the findings of Skinner and Belmont’s
study where they looked at Reciprocal Effects of Teacher Behaviour and Student
Engagement Across the School Year (cf. 1993), stating teachers “shape the extent
to which children feel that their needs are met, not only for relatedness but also for
competence and self-determination. When teachers are less involved with students,
students not only miss the involvement but also experience [them] as less consistent
and more coercive” (Skinner & Belmont, 1993, p. 577). Likewise, Michael explained
his lack of engagement with feedback in the other foreign languages, because his
teachers were not showing much interest in his abilities and progress in the respec-
tive subjects (see Sect. 5.1.1.). In a project by the International School Psychology
Association where twelve countries (Austria among them) took part, Lam, Wong,
Yang, and Liu (2012) in their study on personal and contextual antecedents of learner
engagement in Chinese schools found that teacher support was the most powerful
predictor of learner engagement, ahead of parent and peer support. In line with that,
we should probably sometimes remind ourselves that our learners have a life outside
school as well and it might be beneficial for their engagement to ask them about it.
As Garrett (2011, p. 9) put it “I think it may be important for reminding students that
you value their efforts to engage in the learning process and such introspection helps
them remember their accountability for their own learning.”
In addition, the analysis of the data shows that the learners’ emotional engagement
with school in general and the various subjects in particular had an effect too. As Julie,
for instance, pointed out, she sometimes did not engage with feedback because she
was much more interested in computers and all problems related to them than in most
subjects taught at her business school. In fact, she believed that this type of school had
been the wrong choice for her (see Sect. 5.1.1.) and influenced her (un)willingness
to engage with feedback to a great extent. In her case, conflicting mediators thwarted
her engagement with written CF, a problem Hofer and Fries (2016) reported on in
the so-called Theory of Motivational Action Conflict. This theory looks at academic
and non-academic goals that can be rather conflicting, hence learners need to decide
which of these are more important for them at the moment.
Second, the cognitive dimension is another area that needs to be taken into account
with engagement. In order to help learners to improve in their writing, for example,
engagement on a cognitive level seems to be inevitable. Several learners commented
on the fact that working on their errors and expressions, as well as improving content
and using more linking words in their second drafts, resulted in making fewer and
fewer errors in this respect. As most of the population emphasised in the various inter-
views, they had to think about their errors, hence they were cognitively engaged in
their learning process (see Sect. 5.1.2.) and made use of that knowledge in their
written exams and homework assignments. Consequently, teachers need to give
learners explanations for the recurring errors they make to help them overcome
some of these. What is more, the cognitive dimension includes using strategies as
well; thus, one task of the teacher includes providing these for the learners, in order
to help them succeed in working with the teacher’s chosen feedback method.
Third, the behavioural dimension, especially, had an impact on the learners’
engagement when they had to do homework assignments but got little or no feedback
on them to work with. Similarly, assignments with unclear instructions led to less
engagement, as did a rather short period of time for completing them. In contrast
to that, the data clearly show that fixed roles (regarding being a student and their own
responsibility towards doing school work) had an influence as well. In addition,
their own attitudes either hindered or fostered their engagement with written CF.
Closely linked to the three dimensions of engagement are mediating factors. The
analysis of my data underlines that learner engagement with written CF can be influ-
enced immensely by them. In other words, my findings for research question 3—What
do students perceive as mediating factors when engaging with written corrective
feedback?—show the manifold factors increasing or reducing learner engagement
(see Sect. 5.3.). Obviously, the teacher giving feedback is one of the most influential
mediating factors (see Fig. 5.3). In this respect, the learners’ statements highlighted
several areas having an effect on them, ranging from the teacher’s feedback, the
(im)possibility to ask questions, unclear instructions, and no strategies to work with
feedback, to a lack of feedback as dialogue (see Sect. 5.3. for a thorough discussion
of these). Additionally, peers and the feedback method itself impacted their level of
engagement too.
Despite the above-mentioned factors, it should not be underestimated how much
the learners’ own attitudes affected their engagement with written CF as well. Their
interest in the respective subject and commitment to school work was as much a factor
as their own laziness, which was identified as one of the most common reasons for
not doing their homework assignments and, as a consequence, not engaging with
feedback. Hence, the teacher might have provided everything the learners needed,
but they still could not get themselves to sit down and actually do the school work.
What could be discussed with the learners is ways to encourage them to do the tasks
at hand and how to tackle the difficult issue of time management. In the end, however,
it is the learners’ own decision whether or not they make use of the tips given, but
the teacher can definitely encourage them to do so.
My data also portray the individuality (see also Han & Hyland, 2015) of each
learner having an effect on their level of engagement, as different learners identified
different reasons for their engagement, or a lack of it. Sometimes other areas of their
lives were so prominent that they neglected everything else, which was illustrated
with Susan (see Sect. 5.3.5.1.) who did not do any homework assignments in the first
semester, but did all of them in the second semester. She realised that doing them and working
with the feedback made it easier to compose texts in the written exam. This implies
that in some cases a lack of engagement is caused by other factors than teachers,
peers or the feedback method, to name just a few. Moreover, self-confidence was a
factor in engagement too. Veronica and Anne, for example, engaged actively with the
feedback they got on their written work, because they were not so keen on speaking
in class. Their reasoning was that working hard on their first and second drafts could
compensate for less active participation in class discussions. Katharine, on the other
hand, sometimes did not complete a piece of homework as she was convinced others
were better than her and she could not write a text that was good enough.
Finally, anxiety was identified as another mediating factor. This was not only
mentioned in contexts where learners were too afraid of their teachers to ask ques-
tions, and feared being shouted at or told off about errors they, according to their
teacher, should not make any longer. Peer correction was brought up too, as some
learners were more introverted and afraid of sharing their written work with others.
Given the above, teachers should be aware of mediating factors influencing the
learners’ willingness to work with written CF and how their way of giving feedback
could contribute to that.
In conclusion, the findings of my data suggest that the personal written statement
should be implemented in any feedback method used. Codes in margin (or the colour
code) was the second most popular feedback method, but all other methods (e.g.,
error correction, self-correction, peer correction) are valuable, especially when they
are used in combination with other methods. When choosing a feedback method, the
proposed Giving-Feedback-Guidelines are a valuable tool to make one’s intentions
clear to the respective group of learners. The three dimensions of engagement, namely
emotional, cognitive and behavioural, play a crucial role in written CF as well. The
data illustrate how mediating factors influence the various dimensions, and as a result,
strategies for working with feedback (see Sect. 5.4.) appear to be paramount when
engaging learners with written CF. Research on superficial learning strategies (e.g.,
memorising facts) compared to deep learning strategies (e.g., getting to the root of
a problem using various strategies, such as cognitive or metacognitive) to promote
understanding (Blumenfeld, Kempler, & Krajcik, 2006) could enhance written CF
too. My data suggest that shallow and deep engagement (Crick, 2012) with feedback
could be fostered by providing learners with various strategies to work with it. Finally,
when giving feedback the aim should be to focus on quality and not quantity. One way
to achieve this goal is to set only a few written assignments (e.g., three)
each semester, but to promote deep engagement with them on the learners’ part. I firmly
believe that learners need to work with their errors and try to edit their texts (sentence
level, content, etc.) if engagement with written CF and learners’ progress in their
writing is the ultimate aim. This can be achieved with the Engagement-Mediator-
Feedback Model, based on the Dynamic-Engagement-Framework, illustrated in this
book.

6.2 Possible Limitations of the Study

Using an online tool to give feedback could be an obstacle for teachers and learners in
various countries around the globe. Consequently, the feedback method introduced
in this book would have to be used in a different way. At our school, for example,
the students did not have laptops in their first two years, hence they handed their
homework in on paper. Instead of the colour code (which would be impractical
to apply by hand), codes in the margin (e.g., Art = article, T =
tense, etc.) were used, giving them clues for grammar, expression and content. The
personal written statement was done as well. The students self-corrected their errors,
underlined their corrections and handed in both drafts to make it easier for me to
check what they had edited. Similarly to the platform (see Sect. 4.6.), I corrected
errors if necessary, gave metalinguistic explanations for errors and wrote a personal
written statement on their second draft.
In addition, it could be argued this study cannot generalise findings for the research
community because of the small number of participants. Qualitative research,
however, never claims generalisability, as its research population is very often
much smaller than in quantitative research, for the simple reason that spoken data
takes much longer to be transcribed and/or coded and is difficult to express in
numbers. Kahu (2013, p. 769) stressed that research with small populations, even
within one single institution, is necessary, as in her opinion “a broad generalisation of
the student experience is ill-advised”. What is more, if little is known about a certain
research area, the study of a small number of participants is suitable—especially as
you cannot rely on research that has already been conducted (cf. Dörnyei, 2007). For
this reason, it is my belief that this small-scale study has shown how important the
concept of learner engagement is in relation to written CF. Even among this small
research population it became clear that all of them were influenced by different
mediating factors which had an impact on their level of engagement. Moreover, the
emotional, cognitive and behavioural dimensions were different for each learner as
well and were subject to change from time to time. The analysis of the data empha-
sises how important the three dimensions of engagement are for written CF. Finally,
the analysed data highlight the value of the various feedback methods—not used in
isolation but when combined with other methods they can engage many learners in
the feedback process.
One criticism of the study could be that the learners only engaged with written
CF because they took part in my study. Nonetheless, this can be contradicted by
those learners who participated in the pilot study, as well as the six learners who did
not take part at all. If it were true that helping me with my study triggered more
engagement, the learners who were part of the pilot study should have done most
of their written assignments. This was not the case, however, as two out of four did
not do a single piece and another one completed only the first draft of the creative
assignment. Of the six non-participants, all did at least two assignments
and three completed all of them (see Table 6.1).
One explanation for the learners’ engagement can be best illustrated with data from
Anne. While talking about working with feedback, she mentioned a very interesting
aspect in her individual interview:
Anne: And, it’s like, um, .. you forced us […] to work, um, … through our mistakes
and
Researcher: Aha [And how did I force you?]
Anne: [to think more, because] you .. told us .. we have to .. look, uh, at paragraph
two, for example, and so and we should upload it again [mm hmm] and then I
.. um, I thought, uh, you .. it lasts you a .. long time to correct our homeworks
and then ..
Researcher: That it takes me a long time?
Anne: Yes [mm hmm], and then if I .. just, um, .. don’t, uh, force .. on my … homework
and upload it just .. again. .. Uh, so it wouldn’t be fair to you
(Anne, Paragraph 224–228)

What she called forcing was actually making learners aware that they should
work on their errors themselves. Engaging them on a cognitive level, where they draw
on strategies to tackle and, in the long term, remember these errors so as to avoid
them, is one aim of this feedback method. Moreover, her emotional engagement
clearly encouraged Anne to work thoroughly with the feedback she received, though
it was not the only driving force.
While it might be true that some of the learners put more effort into doing their first
and second drafts, as they believed I put considerable work into giving them feedback
too, that was not the sole reason. Despite their emotional engagement with me (see
Sect. 5.1.1.), wanting to improve in English for their own benefit was as much a cause
for their engagement as the feedback method provided (see Fig. 5.3). Hence several
factors prompted their engagement, but the findings of my data clearly show that
the emotional dimension can have a huge impact on the learners. Another example
of emotional engagement was demonstrated with the rather difficult relationship
Table 6.1 Homework record pilot study students and non-participants
First semester 1st or 2nd Second semester
semester
Drafts: 1st 2nd 1st 2nd 1st 2nd 1st 2nd 1st 2nd 1st 2nd 1st 2nd
Analysing a Article globalization Poverty (type of What I always Commentary Article sponsorship Report outsourcing
cartoon text—free wanted to tell investigative
6.2 Possible Limitations of the Study

choice) you (creative) journalism


Angela ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥
Dana ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥
Marvin ¥ ¥ ¥ ¥ ¥ ¥ ✓ ¥ ¥ ¥ ¥ ¥ ¥ ¥
Tara ✓ ✓ ✓ ✓ ¥ ¥ ✓ ¥ ✓ ✓ ✓ ✓ ✓ ✓
Stud. 1 ¥ ¥ ✓ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ¥ ✓ ✓
Stud. 2 ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ¥ ¥ ✓ ✓ ✓ ✓
Stud. 3 ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
Stud. 4 ✓ ✓ ✓ ¥ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
Stud. 5 ✓ ✓ ✓ ¥ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
Stud. 6 ¥ ¥ ¥ ¥ ¥ ¥ ✓ ✓ ¥ ¥ ¥ ¥ ✓ ✓
165
166 6 Conclusion

between many students and their Italian teacher (see Sect. 5.1.1.). As teachers we
should therefore consider what impact we can have on our learners. It seems there is
no doubt that a healthy emotional relationship between students and teachers is one
aspect amongst many that can foster learner engagement with written CF.
Finally, using a questionnaire to accompany the individual interviews was defi-
nitely a good choice. The questionnaire itself, however, could be further improved.
Drawing on existing questionnaires for engagement was inevitable, but question five
“How much do you agree or disagree with the following statements?”, including
items such as “I have the skills and ability to complete my work” or “In general,
I am excited about my classes”, could have been left out. Although the questions
on school in general were quite interesting, they proved to be irrelevant for learner
engagement with written corrective feedback. In addition, in question 4 “In your final
year, about how often have you done each of the following?”, item c “Completed a
formal writing assignment”, item e “Attended class with all assignments completed”
and item g “Worked harder than you thought you could to meet a teacher’s stan-
dards or expectations” did not contribute to answering my research questions either,
hence they were obsolete. While coding my data, the categories assessment
and motivation seemed to make sense, but when analysing the codes most of what I
had termed motivation actually fitted much better into my engagement category. As a
result, I merged motivation with engagement, in line with my Dynamic-Engagement-
Framework where motivation and engagement are integral parts. Assessment, on the
other hand, did not generate a lot of codes, and these were therefore excluded from
the analysis. However, this would definitely be an area worth pursuing in the
future, where the intersection between the feedback method and assessment could be
investigated.

6.3 Future Implications

Although this small-scale study shows that researching written CF with the concept
of learner engagement has provided an interesting insight into learners’ reasons for
their engagement with it, much research is still needed. It would be worthwhile
to investigate the response of more learners to get an even better understanding of
mediators influencing feedback, thus providing teachers with more facts to tailor
feedback methods to the needs of their learners. Like Kahu (2013) I believe that
when researching engagement as a multidimensional construct which is dynamic,
qualitative methods might be more fruitful (Bryson, 2014; Fredricks, Blumenfeld, &
Paris, 2004; Harris, 2008; Storch, 2010) than solely using quantitative ones. Drawing
on quantitative tools to support qualitative research might be another way forward
in learner engagement research.
In addition to researching the positive aspect of engagement, investigating disen-
gagement could be another way to inform research on learner engagement by providing
a better understanding of learners whose engagement decreases over time (Hyland,
2003; Ferris, Liu, Sinha, & Senna, 2013). As demonstrated with my pilot study
learners (see Sect. 6.2.), some students did not engage with the feedback. Therefore,
it would be very informative to get a clearer perspective on their reasons. These
learners’ voices (see also Quaglia & Corso, 2014) might provide us with data that
could be translated into strategies/measures helping even more learners to (re)engage
with feedback. In general, learners’ voices in research on positive and negative aspects
of engagement are essential and can be incorporated in various ways (think-aloud
protocols, self-reports, individual interviews, etc.) to deepen the understanding of
their engagement (Finn & Zimmer, 2012; Wylie & Hodgen, 2012).
Researching the cognitive dimension in more depth was beyond the scope of
this study. It could be achieved by comparing learners’ first and second drafts
throughout a semester and cross-referencing these with their written exams to eval-
uate to what extent the learners had transferred their knowledge from working on
their first drafts to the exam situation. This way learners’ progress on detecting recur-
ring errors, and their cognitive engagement with them, could be measured and might
provide interesting insights into learning processes (Han & Hyland, 2019).
Another way of looking at research on engagement could be in terms of dialogic
learning. Dialogic learning itself is not a new concept, as it dates back to Socrates,
Rousseau, Dewey and Piaget, but dialogic learning in the context of learner engage-
ment certainly is. Sambell (2013) stressed the need for dialogue when talking about
engagement, feedback, and assessment (also see Campbell & Schumm Fauster, 2013;
Hyland & Hyland, 2006). According to Sambell’s findings, a dialogic dynamic in
learning and assessment is what is needed to make sure that learners achieve their
pre-set goals. In her view engagement is a dynamic process, which consists of many
facets; therefore, a holistic approach towards learner engagement is mandatory.
To sum up, regardless of the feedback method used, the learners need to be aware
of the reasons for using it and how feedback ties in with their assessment at the end of
a semester or school year. This awareness is also crucial when learners are working on
their own errors, because without knowing where their weak points are, improving
these will be difficult, if not impossible, for most learners. Avoiding errors in the
exam situation might be impossible due to the learners’ lack of awareness of these.
In conclusion, it might be necessary not to view awareness, feedback, engagement
and assessment as separate concepts, but as a model that interlinks these four concepts
to feed forward and engage our learners with their writing process.

References

Bitchener, J., & Storch, N. (2016). Written corrective feedback for L2 development. Bristol:
Multilingual Matters.
Blumenfeld, P. C., Kempler, T. M., & Krajcik, J. S. (2006). Motivation and cognitive engagement in
learning environments. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences
(pp. 475–488). New York: Cambridge University Press.
Bryson, C. (2014). Clarifying the concept of student engagement. In C. Bryson (Ed.), Understanding
and developing student engagement (pp. 1–22). The staff and educational development series.
London and New York: Routledge.
Campbell, N., & Schumm Fauster, J. (2013). Learner-centred feedback on writing: Feedback as
dialogue. In M. Reitbauer, N. Campbell, S. Mercer, J. Schumm Fauster, & R. Vaupetitsch (Eds.),
Feedback matters: Current feedback practices in the EFL classroom (pp. 55–68). Frankfurt am
Main: Peter Lang.
Chanock, K. (2000). Comments on essays: Do students understand what tutors write? Teaching in
Higher Education, 5(1), 95–105. Retrieved from http://dx.doi.org/10.1080/135625100114984.
Crick, R. D. (2012). Deep engagement as a complex system: Identity, learning power and authentic
enquiry. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student
engagement (pp. 675–694). New York: Springer.
Dörnyei, Z. (2007). Research methods in applied linguistics: Quantitative, qualitative, and mixed
methodologies. Oxford: Oxford University Press.
Ellis, R. (2012). Language teaching research and language pedagogy. Chichester: Wiley.
Ferris, D. R. (2011). Treatment of error in second language student writing (2nd ed.). Michigan
series on teaching multilingual writers. Michigan: The University of Michigan Press.
Ferris, D. R., Liu, H., Sinha, A., & Senna, M. (2013). Written corrective feedback for individual
L2 writers. Journal of Second Language Writing, 22(3), 307–329. Retrieved from https://www.
researchgate.net/profile/Aparna_Sinha2/publication/259142527_Written_corrective_feedback_
for_individual_L2_writers/links/559ae16408ae21086d2778b7.pdf.
Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter? In S.
L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement
(pp. 97–131). New York: Springer.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the
concept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Garrett, C. (2011). Defining, detecting, and promoting student engagement in college learning
environments. Transformative Dialogues: Teaching & Learning Journal, 5(2), 1–12.
Goldstein, L. (2006). Feedback and revision in second language writing: Contextual, teacher, and
student variables. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing:
Contexts and issues (pp. 185–205). New York: Cambridge University Press.
Han, Y., & Hyland, F. (2015). Exploring learner engagement with written corrective feedback in a
Chinese tertiary EFL classroom. Journal of Second Language Writing, 30, 31–44.
Han, Y., & Hyland, F. (2019). Learner engagement with written feedback: A sociocognitive
perspective. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts
and issues (2nd ed., pp. 247–264). Cambridge: Cambridge University Press.
Handley, K., Price, M., & Millar, J. (2011). Beyond ‘doing time’: Investigating the concept of
student engagement with feedback. Oxford Review of Education, 37(4), 543–560.
Harris, L. R. (2008). A phenomenographic investigation of teacher conceptions of student engage-
ment in learning. The Australian Educational Researcher, 35(1), 57–79. Retrieved from http://
files.eric.ed.gov/fulltext/EJ793463.pdf.
Hattie, J. A. C. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. London and New York: Routledge.
Hofer, M., & Fries, S. (2016). A multiple goals perspective. In K. R. Wentzel & D. B. Miele (Eds.),
Handbook of motivation at school (2nd ed., pp. 440–458). New York: Routledge.
Hyland, F. (2003). Focusing on form: Student engagement with teacher feedback. System,
31(2), 217–230. Retrieved from http://www.sciencedirect.com/science/article/pii/S0346251X
03000216.
Hyland, K., & Hyland, F. (2006). Contexts and issues in feedback on L2 writing: An introduction.
In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues
(pp. 1–19). New York: Cambridge University Press.
Juwah, C., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D., & Smith, B. (2004). Enhancing
student learning through effective formative feedback. The Higher Education Academy (Generic
Centre), 1–42. Retrieved from http://ctlt.illinoisstate.edu/downloads/modules/design/enhancing_learning-through_formative_feedback.pdf.
Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education,
38(5), 758–773.
Lam, S. F., Wong, B. P., Yang, H., & Liu, Y. (2012). Understanding student engagement with a
contextual model. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research
on student engagement (pp. 403–419). New York: Springer.
Milton, J. (2006). Resource-rich web-based feedback: Helping learners become independent writers.
In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues
(pp. 123–139). New York: Cambridge University Press.
Murtagh, L., & Baker, N. (2009). Feedback to feed forward: Student response to tutors’ written
comments on assignment. Practitioner Research in Higher Education, 3(1), 20–28. Retrieved
from http://194.81.189.19/ojs/index.php/prhe/article/viewFile/30/28.
Price, M., Handley, K., & Millar, J. (2011). Feedback: Focusing attention on engagement. Studies
in Higher Education, 36(8), 879–896.
Quaglia, R. J., & Corso, M. J. (2014). Student voice: The instrument of change. Thousand Oaks:
Corwin.
Sambell, K. (2013). Engaging students through assessment. In E. Dunne & O. Derfel (Eds.), The
student engagement handbook: Practice in higher education (pp. 379–396). Bingley: Emerald.
Skinner, E. A., & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher
behaviour and student engagement across the school year. Journal of Educational Psychology,
85(4), 571–581.
Storch, N. (2010). Critical feedback on written corrective feedback research. International Journal
of English Studies, 10(2), 29–46. Retrieved from http://revistas.um.es/ijes/article/view/119181/
112311.
Wentzel, K. R. (2016). Teacher-student relationships. In K. R. Wentzel & D. B. Miele (Eds.),
Handbook of motivation at school (2nd ed., pp. 211–230). New York: Routledge.
Wylie, C., & Hodgen, E. (2012). Trajectories and patterns of student engagement: Evidence from a
longitudinal study. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research
on student engagement (pp. 585–599). New York: Springer.
Appendices

Appendix A—Oral A Level Exam

Society and Politics—Asylum Policy


Situation:
In your final year at XXXX1 you have been discussing asylum policies in Europe
and their impact on people’s lives.
Task 1 (Sustained Monologue)
After having read the article on helping people not borders, you are now ready to
present your findings at the A Level exams. In your presentation you should
• point out what Maximilian Popp claims to be wrong with border protection
• explain which problems refugees come across before being able to seek asylum
in Europe
• judge whether or not the suggestions made by various organisations for legal
ways for refugees to reach Europe are realistic.
Task 2 (Spoken Interaction)
After your presentation, you discuss refugee issues with your teacher. In your
discussion you
• describe what you can see in the graph
• contrast the refugee situation in Austria to other European countries
• state your opinion on how to assist asylum seekers in a humane way.
Spiegel Online
Europe Should Protect People, Not Borders
By Maximilian Popp/20 April 2015

1 Name of school has been removed.


The mass deaths of refugees like those seen this weekend on the European
Union’s external borders are not a consequence of politicians looking away. We
are in fact causing the problem with our Fortress Europe policies.
Workers at the Warsaw headquarters of Frontex, the European border protec-
tion agency, track every single irregular boat crossing and every vessel filled with
refugees. Since December 2013, the authority has spent hundreds of millions of euros
deploying drones and satellites to surveil the borders.
The EU registers everything that happens near its borders. In contrast to the claims
that are often made, they do not look away when refugees die. They are watching very
closely. And what is happening here is not negligent behavior. They are deliberately
killing refugees.
People have been perishing as they sought to flee to Europe for years now. They
drown in the Mediterranean, bleed to death on the border fences of the Spanish
North African enclaves of Ceuta and Melilla or freeze to death in the mountains
between Hungary and Ukraine. But the European public still doesn’t appear to be
entirely aware of the dimensions of this humanitarian catastrophe. We have become
accomplices to one of the biggest crimes to take place in European postwar history.
Barbarism in the Name of Europe
It’s possible that 20 years from now, courts or historians will be addressing this
dark chapter. When that happens, it won’t just be politicians in Brussels, Berlin and
Paris who come under pressure. We the people will also have to answer uncomfortable
questions about what we did to try to stop this barbarism that was committed in all
our names.
The mass deaths of refugees at Europe’s external borders are no accidents—they
are the direct result of European Union policies. The German constitution and the
European Charter of Fundamental Rights promise protection for people seeking flight
from war or political persecution. But the EU member states have long been torpe-
doing this right. Those wishing to seek asylum in Europe must first reach European
territory. But Europe’s policy of shielding itself off from refugees is making that next
to impossible. The EU has erected meters-high fences at its periphery, soldiers have
been ordered to the borders and war ships are dispatched in order to keep refugees
from reaching Europe.
For those seeking protection, regardless whether they come from Syria or Eritrea,
there is no legal and safe way to get to Europe. Refugees are forced to travel into
the EU as "illegal" immigrants, using dangerous and even fatal routes. Like the one
across the Mediterranean.
A Darwinist situation has emerged on Europe’s external borders. The only people
who stand a chance of applying for asylum in Europe are those with enough money to
pay the human-traffickers, those who are tenacious enough to try over and over again
to scale fences made of steel and barbed wire. The poor, sick, elderly, families or
children are largely left to their fates. The European asylum system itself is perverting
the right to asylum.
EU Policies Are Causing Refugee Crisis
There is widespread dismay in Europe over the latest ship sinking last weekend—
an incident in which more than 650 people died off the coast of Libya on their way
to Italy, the greatest number in such an incident yet. Once again, people are saying
that a tragedy like this cannot be allowed to be repeated. But the same words were
uttered following the disaster off the coast of Lampedusa in autumn 2013 and off
the coast of Malta last September. On Monday, just hours after the latest incident,
history threatened to repeat itself, with hundreds of people aboard a refugee ship in
the Mediterranean in distress.
European politicians lament the refugee drama. But then they continue to seal the
borders—the very act which is the precondition for the disaster. The leaders of the
EU member states and their interior ministers can no longer be allowed to continue
to get away with the status quo. The EU must move immediately to create legal
ways for refugees to reach Europe. The United Nations Refugee Agency (UNHCR),
human rights organizations like Germany’s Pro Asyl and Human Rights Watch have
long pointed out ways in which this might be done.
• The Italian Navy’s Operation Mare Nostrum rescue mission, which protected
hundreds of thousands of refugees from drowning, needs to be resumed without
delay. The Italian government suspended the program due to a lack of funding.
And Frontex’s own Operation Triton, whose aim is to parry migrants, should be
eliminated.
• The EU should create asylum procedures at the embassies of its member states in
the same way Switzerland has done. This would mean that in the future, refugees
could apply for asylum at the embassies of EU member states outside of Europe.
This would spare them the potentially deadly path across the borders.
• The EU also needs to finally begin participating seriously in the UNHCR resettle-
ment program. For years now, the UN has been helping bring refugees from acute
crisis areas for a limited period of time to secure states without subjecting them
to bureaucratic asylum procedures. UNHCR is currently seeking guest countries
for several hundred thousand refugees who need to be resettled. In 2013, North
America took in more than 9,000, but Germany only accepted 300.
• The visa requirement for people from countries in crisis like Syria or Eritrea
should also be temporarily lifted. That would allow asylum-seekers to request
admission at European border control posts without being given blanket rejection
by police. The EU’s Dublin Regulation, which only allows refugees to apply for
asylum in their country of arrival, also needs to be eliminated. Instead, asylum-
seekers should be distributed among EU countries through a quota system. The
freedom of movement that has long applied to EU citizens should then also be
extended to recognized refugees.
• People fleeing their home country largely for economic reasons rather than polit-
ical persecution, should be given the possibility of labor migration—through the
creation of a Green Card for immigrants from poorer countries, for example.
These reforms wouldn’t suddenly eliminate irregular migration, but they would help
to reduce the suffering. Contrary to what European leaders and interior ministers
claim, deaths at Europe’s borders can be prevented. At the very least, their numbers
could be dramatically reduced. But that requires a readiness on the part of Europeans
to protect people and not just borders.

Source: http://www.spiegel.de/international/europe/opinion-europe-should-protect-people-not-borders-a-1029594.html (11 June 2015).

Source: https://bnicholson3.files.wordpress.com/2014/09/asylum_eur_2014_08_465px.png?w=470 (11 June 2015).

Appendix B—Prompt Cards for Focus Group Interviews

Student engagement

Written statement by teacher

Reformulation of sentences

Codes in margin

Error correction by teacher

Error indication – no correction

Self-correction

Colour code

Peer review

Teacher’s role

Suggestions for improvement of current feedback method

My preferred feedback method

Reasons for (not) doing homework



Appendix C—Interview Questions

Possible questions for individual interviews:


Alan:
1. Why do you need pressure to do your homework?
2. Which kind of homework is boring for you?
3. How much time do you spend on your first draft?
4. Why have you worked on your English homework?
5. Have you compared your text to your classmates’ texts on the homework blog?
6. How do you feel when a teacher reformulates some of your sentences?
7. Which methods do you use to correct your errors?
8. How can you motivate yourself to do assignments that do not really interest you?
Anne:
1. How long does it take you to complete a writing assignment in English?
2. Have you compared your text to your classmates’ texts on the homework blog?
3. What kind of homework is boring for you?
4. Should teachers spend more time on helping students to improve, for example,
their writing style?
5. Do you think that you could have benefitted from peer review?
6. What is a good written statement for you?
7. How do you feel when a teacher reformulates some of your sentences?
8. How can you motivate yourself to do assignments that do not really interest you?
Emily:
1. Do you think that time management plays a role in doing your homework?
2. Has working part-time taken away much time from your school work?
3. Have you compared your text to your classmates’ texts on the homework blog?
4. In your last school, did the teacher correct your errors and then you had to write
the writing assignment again?
5. What is a good written statement for you?
6. What is too much homework for you?
7. How do you feel when a teacher reformulates some of your sentences?
8. How can you motivate yourself to do assignments that do not really interest you?
Julie:
1. What kind of pressure do you get from your teachers?
2. Is there anything a teacher can do when you are not interested in a topic (e.g.
globalization)?
3. What kind of feedback would you like to get from the teacher?
4. How much time do you spend on your first draft?
5. Have you compared your text to your classmates’ texts on the homework blog?
6. How do you feel when a teacher reformulates some of your sentences?
7. Why would you not correct grammar when you know it is the tense, a personal
pronoun, etc. that you have to correct?
8. Which methods do you use to correct your errors?
9. How can you motivate yourself to do assignments that do not really interest you?
Katharine:
1. Why do you sometimes not want to do the homework?
2. How much time do you spend on your first draft?
3. Do you think that peers are not honest when giving feedback on a classmate’s
piece of homework?
4. Have you compared your text to your classmates’ texts on the homework blog?
5. What do you do to get ideas for writing a text?
6. How do you feel when a teacher reformulates some of your sentences?
7. Which methods do you use to correct your errors?
8. How can you motivate yourself to do assignments that do not really interest you?
Michael:
1. Is there anything you could do to fight laziness?
2. Why have you always done your homework?
3. Have you compared your text to your classmates’ texts on the homework blog?
4. Which role should homework play in your opinion?
5. Why are you willing to take the time for doing the English homework but not,
for example, the Italian homework?
6. What is a good written statement for you?
7. How do you feel when a teacher reformulates some of your sentences?
8. How much time do you spend on the second draft?
9. How can you motivate yourself to do assignments that do not really interest you?
Susan:
1. Do you think that time management plays a role in doing your homework?
2. How much time do you spend on your first draft?
3. Do you believe that students only do their homework because they have to?
4. Why do you not especially like doing homework?
5. Have you compared your text to your classmates’ texts on the homework blog?
6. How do you feel when a teacher reformulates some of your sentences?
7. Which methods do you use to correct your errors?
8. How can you motivate yourself to do assignments that do not really interest you?
Veronica:
1. Would you do your homework if it did not count for the grade?
2. Have you compared your text to your classmates’ texts on the homework blog?
3. When is a writing assignment well-corrected for you?
4. Have you ever asked your teachers when you did not know how to complete a
writing assignment?
5. What is a good written statement for you?
6. How do you feel when a teacher reformulates some of your sentences?


7. Why do you think hardly anyone would do the second draft if it were voluntary?
8. In which way do you engage in the lessons when you do not participate in the
discussion?
9. How can you motivate yourself to do assignments that do not really interest you?

Appendix D—Consent Form

Consent for Participation in the PhD study


Student Engagement with Written Corrective Feedback in the EFL Classroom
I volunteer to participate in a PhD study conducted by Alia Moser at the University
of Graz. I understand that the study is designed to gather information about student
engagement with written corrective feedback in the EFL classroom.
1. My participation in this study is voluntary and I can withdraw my participation
at any time. I understand that I will not be paid for my participation.
2. If I feel uncomfortable in any way during the focus group interview session (and
a possible follow-up individual interview), I have the right to refuse to answer
any question or to end the interview.
3. Participation involves being interviewed by the researcher, your English teacher,
Alia Moser. The interviews (focus group and if necessary follow-up individual
interview) will last approximately 60 min. An audio recording of the focus group
interview (and a possible follow-up individual interview) and transcripts of these
will be made.
4. The researcher will not reveal my name and will use a pseudonym to protect my
identity in any reports using information from this interview/these interviews.
5. I have read and understand the consent form. All my questions have been
answered and I voluntarily agree to participate in this study.
Name: …………………………………………………………………………...
Signature: …………………………………………………………………….....

Appendix E—Interview Questions

Possible questions for focus group interview:


1. Why do you (not) work with the written feedback provided?
2. What kinds of written feedback methods do you find motivating?
3. Which kind of written corrective feedback would you like to get?
4. Is there anything a teacher can do to motivate you to work with the feedback
provided?
5. Is written corrective feedback unnecessary?
6. Does written corrective feedback help you to improve your writing skills?
7. Would you like to make any changes to the colour-code?


8. Is the colour-code on its own enough to help you to correct your written
assignment?
9. Which factors play a role in (not) doing your written assignments?
10. Do you just do your written assignments before a written exam?
11. Does the teacher influence your decision to do your written assignments?
12. What do you believe is student engagement?

Appendix F—Questionnaire

Student Engagement with Written Corrective Feedback


This study is conducted as part of my doctoral thesis at the English Department
of the University of Graz. It aims at a better understanding of why or why not
students engage with written corrective feedback. It is also the basis for your
individual interview during which I will discuss your answers in detail with
you. The results of this questionnaire will be used for research purposes only,
so please answer honestly.
This questionnaire consists of nine questions. Please tick (✓) the appropriate
box for your answers in the spaces provided. It will take you approximately
15 min to complete this questionnaire. Thank you very much for your help!
1. During your final year, how much have your teachers in various subjects
focused on the following?

Very much Some Very little Not at all


a. Memorizing facts and figures for classes
b. Understanding information and ideas for
classes
c. Analysing ideas in depth for classes
d. Applying knowledge or methods to a
problem or new situation
e. Evaluating a point of view, decision, or
information source

2. How much has your experience at your school contributed to your development in the following areas?

Very much Some Very little Not at all


a. Writing clearly and effectively
b. Thinking critically and effectively
c. Working effectively with others
d. Learning effectively on your own

3. In a typical 7-day week during your final year, about how many hours have
you done the following outside school?

1 or less 2–3 4–7 8 or more


a. Completing homework for class
b. Studying for tests or quizzes
c. Writing your final year project
d. Writing the first draft of an assignment (e.g., for
English)
e. Writing the second draft of an assignment (e.g., for
English)
f. Working with classmates outside class to prepare class
assignments

4. In your final year, about how often have you done each of the following?

Very often Often Sometimes Never


a. Prepared a draft of a paper or assignment
before turning it in
b. Completed a creative writing assignment (e.g.,
poems, short stories)
c. Completed a formal writing assignment (e.g.,
article, report, commentary)
d. Received feedback from teachers on
assignments or other class work
e. Attended class with all assignments completed
f. Connected ideas or concepts from one class (or
subject) to another in written assignments or
discussions
g. Worked harder than you thought you could to
meet a teacher’s standards or expectations

5. How much do you agree or disagree with the following statements?

Strongly agree Agree Disagree Strongly disagree


a. I have the skills and ability
to complete my work
b. I put forth a great deal of
effort when doing my
school work
c. I am motivated by my
desire to learn
d. I am motivated by my
desire to get good grades
e. I am motivated by teachers
who encourage me
f. I am motivated by my desire
to succeed in the world
outside school
g. I take pride in the quality of
my school work
h. I have worked harder than I
expected to do in school
i. I enjoy working on tasks
that require a lot of thinking
and mental effort
j. In general, I am excited
about my classes

6. How much does each of the following feedback methods interest or engage you?

Very much Some Very little Not at all


a. Error correction by teacher
b. Error indication/no correction
c. Reformulation of sentences
d. Codes in margin
e. Self-correction
f. Peer correction
g. Personal written statement by teacher

7. In your experience during your school career, how much have the following
aspects influenced your decision to engage with written corrective feedback?

Very much Some Very little Not at all


a. Teachers
b. Peers
c. Getting good grades
d. Improving my level of English
e. Feedback method provided

8. How much work have you put into each of the following activities in your
English homework?

Very much Some Very little None at all


a. Brainstorming ideas before writing a text
b. Planning the structure of a text
c. Writing a first draft
d. Editing a first draft before handing it in
e. Writing a second draft
f. Reflecting on teacher’s final feedback

9. In your experience during your final year, about how often have you done
each of the following when working with written corrective feedback on your
English homework?

Very often Often Sometimes Never


a. Corrected spelling
b. Corrected grammar
c. Corrected words/phrases
d. Corrected word order
e. Improved sentence structure
f. Worked on writing style
g. Added ideas (e.g., content—argument
unclear)
h. Voluntarily re-wrote parts of assignment

Thank you for your cooperation!


Items 1a–c, 3a–b, 4a–f and 5a–j used from High School Survey of Student Engage-
ment, CEEP, Copyright © 2012 Indiana University. Items 4b, 4c and 4f were adapted
for this questionnaire.
Items 1d–e and 2a–c used with permission from The College Student Report,
National Survey of Student Engagement, Copyright 2001–15 The Trustees of Indiana
University. Items 1d, 2b were adapted for this questionnaire.
Items 2d and 4g used with permission from the Center for Community College
Student Engagement, Community College Survey of Student Engagement Copyright
[2005], The University of Texas at Austin. Item 4g was adapted for this questionnaire.
Item 3f used with permission from Australasian Survey of Student Engagement,
Australian Council for Educational Research (ACER), Copyright © 2012. Item 3f
was adapted for this questionnaire.

Appendix G—Usage Agreements

Usage Agreement NSSE



Usage Agreement AUSSE


Alia Moser
ausse@acer.edu.au
07/20/15 at 10:18 AM
Dear Sir or Madam,
My name is Alia Moser and I am a PhD student at the University of Graz (Austria)
where I am currently working on my PhD thesis “Student engagement with written
corrective feedback in the EFL classroom”.
This study is a small-scale study with eight participants. I am going to conduct
individual interviews with business high school students and wanted to ask if I could
get permission to use one survey item from AUSSE as part of the questionnaire the
students fill out before the interview.
The item is:
Working with classmates outside class to prepare assignments (adapted)
I am looking forward to hearing from you.
Yours faithfully
Alia Moser
AUSSE <ausse@acer.edu.au>
Alia Moser
07/20/15 at 5:48 PM
Dear Alia,
Many thanks for your email and your interest in the AUSSE questionnaire.
We would be very happy to grant permission for you to use this adapted question
from the AUSSE questionnaire in your PhD research.
Please let me know if you require any further information about the AUSSE and best
of luck with your research project.
Best wishes,
Ali

Usage Agreement CCSSE



Usage Agreement HSSSE


HSSSE <hssse@indiana.edu>
Alia Moser
Aug 22 at 8:54 PM
Hi Alia-
Sorry for the delay in response—yes, you may use the HSSSE items in your study.
Please be sure to cite the Center for Evaluation and Education Policy (CEEP) at
Indiana University, in any work that you produce. Also please be sure to note any
adapted or modified items as such.
Sincerely,
Kathleen Lorenzen
HSSSE Project Associate
From: Alia Moser
Sent: Thursday, August 4, 2016 5:32 AM
To: HSSSE
Subject: HSSSE survey items
Dear Sir or Madam,
I am a PhD student at the University of Graz (Austria) where I am currently writing
up my PhD thesis "Student engagement with written corrective feedback in the EFL
classroom".
My study is a small-scale study with eight participants. For my follow-up inter-
views I designed a questionnaire last year, where I used items from NSSE, AUSSE,
CCSSE as well as your questionnaire. I contacted NSSE, AUSSE and CCSSE as it
stated on their respective websites and got permission to use items from their ques-
tionnaires. Your website then (http://ceep.indiana.edu/hssse/index.html—no longer
active) stated that the questionnaire can be used for educational purposes without
any further permission. When checking all my references I realised that this policy
has been changed in the meantime. I therefore wanted to ask what I have to do to be
allowed to include the items I used for my questionnaire in my PhD thesis.
I have attached a copy of my questionnaire where the HSSSE items as well as the
citation are highlighted. From the HSSSE questionnaire Copyright © 2012 Indiana
University I used the following items: 6a–6c, 8a–b, 9d, 9e–f (adapted), 9g–h, 9m
(adapted) as well as items 18a–h, 18k and 18m.
I am looking forward to hearing from you.
Yours faithfully
Alia Moser

Appendix H—Code System

Assessment
Pressure
Time
Attitudes
Time management
Self-confidence
Peers
Strategies
Individuality
Writing
Error-driven
Cooperation
Anxiety
Fixed roles
Teacher's role
Commitment
Unclear instructions
Interest
Feedback as dialogue
Laziness
Feedback by teacher

Demotivation

Engagement
Learning by heart
Students' definition
Emotional
Cognitive
Lack of cog eng
Behavioural

Feedback methods
No feedback
Combination of methods
Peer review
Colour-code
Error indication
Error correction
Reformulation
Codes margin
Statement by teacher
Self-correction

Homework
Tasks

Motivation
Extrinsic
Intrinsic
Feedback

Appendix I—Key to Transcription Conventions

• .. One second pause


• An extra dot is added for each additional one second pause
• Underline marks emphatic stress
• /xxx/ indicates transcription impossible
• /Words/ within slashes indicate uncertain transcription
• [Lines in brackets indicate overlapping speech]
• [Two people talking at the same time]
• (Italicised words in brackets are used to comment on context, e.g. both laughing;
exhales, etc.)
• Italics used for German words/expressions.

Appendix J—Commentary by Emily

Commentary2

Last week on Wednesday Mr XXX visited us again in our school and held a speech, which lasted for
two hours. Florian XXX is an Austrian lawyer, investigative journalist and writer. Since 2012 he has
been working (You need a present perfect progressive here, because he started in 2012 but is still
working there. Hence, the process started in 2012 but is not over yet.) as a chief editor for the
Austrian weekly journal called “Falter”. He told us about the different types of journalists, for
example, lap dogs, hyenas and watchdogs. Moreover, he also described his procedure, when he
investigates a new case. His main articles deal with corruption, human rights abuse, but also human
trafficking, justice and police organization (Use another term for System). Furthermore, he told us
about the freedom of press and how far the press is allowed to go.

First of all, I have to say that last year Mr XXX was also in our school and told us more or less the
same topics, but it was not boring to hear it again. During the speech I realised that there is a big
difference between daily newspapers and weekly newspapers, not only is the time the difference,
but also the way of investigation is different. Anyway I do not agree with Mr XXX’s opinion that daily
newspapers just copy the news and don’t check the news correctly. Even if the expenditure of time
is a difference, the daily press also has to investigate, for example, if the news is correct or not.
What’s more, the topics of the daily press and weekly newspapers are different. Therefore,
investigative journalists are more focused on a range of topics and research in more depth.

In my opinion the most interesting topic was about the right of privacy of each person. Take, for
example, the case of Natascha Kampusch. Her face has been shown all over the world, everybody
recognizes (You need a third person –s here.☺) her and she is always surrounded by paparazzi. Most
probably she will never lead a normal life, she will always be confronted with her past instead
of letting bygones be bygones. That is why I think that every person whether private or famous
should have the right of privacy.

Summing up I can say that it was a very interesting speech and I would also enjoy it to hear more
about these topics and I especially want to point out that it was a great opportunity to take part in
this speech. Because as a student of a business high school, you rarely get in touch with journalisms,
so it was a great opportunity to catch a glimpse of this topic. But I would suggest extending the
speech, since there was no time for questions anymore.

2 Name of journalist has been removed.



Appendix K—Results Questionnaire in Percent

1. During your final year, how much have your teachers in various subjects
focused on the following?

Very much (%) Some (%) Very little (%) Not at all
a. Memorizing facts and figures 75 12.5 12.5
for classes
b. Understanding information and 87.5 12.5
ideas for classes
c. Analysing ideas in depth for 37.5 62.5
classes
d. Applying knowledge or 37.5 50 12.5
methods to a problem or new
situation
e. Evaluating a point of view, 37.5 37.5 25
decision, or information source

2. How much has your experience at your school contributed to your development in the following areas?

Very much (%) Some (%) Very little (%) Not at all
a. Writing clearly and effectively 50 37.5 12.5
b. Thinking critically and 100
effectively
c. Working effectively with others 50 50
d. Learning effectively on your 75 25
own

3. In a typical 7-day week during your final year, about how many hours have
you done the following outside school?

1 or less (%) 2–3 (%) 4–7 (%) 8 or more (%)


a. Completing homework for class 25 62.5 12.5
b. Studying for tests or quizzes 12.5 75 12.5
c. Writing your final year project 12.5 12.5 25 50
d. Writing the first draft of an 50 50
assignment (e.g., for English)
e. Writing the second draft of an 62.5 37.5
assignment (e.g., for English)
f. Working with classmates outside class 25 50 12.5 12.5
to prepare class assignments

4. In your final year, about how often have you done each of the following?

Very often (%) Often (%) Sometimes (%) Never (%)


a. Prepared a draft of a paper 12.5 87.5
or assignment before
turning it in
b. Completed a creative 25 75
writing assignment (e.g.,
poems, short stories)
c. Completed a formal 62.5 37.5
writing assignment (e.g.,
article, report,
commentary)
d. Received feedback from 37.5 12.5 50
teachers on assignments or
other class work
e. Attended class with all 25 62.5 12.5
assignments completed
f. Connected ideas or 12.5 37.5 50
concepts from one class (or
subject) to another in
written assignments or
discussions
g. Worked harder than you 12.5 62.5 25
thought you could to meet
a teacher’s standards or
expectations

5. How much do you agree or disagree with the following statements?

Strongly agree (%) Agree (%) Disagree (%) Strongly disagree (%)
a. I have the skills 100
and ability to
complete my
work
b. I put forth a great 62.5 37.5
deal of effort
when doing my
school work
c. I am motivated by 50 25 25
my desire to learn
d. I am motivated by 12.5 75 12.5
my desire to get
good grades
e. I am motivated by 62.5 37.5
teachers who
encourage me
f. I am motivated by 75 12.5 12.5
my desire to
succeed in the
world outside
school
g. I take pride in the 37.5 62.5
quality of my
school work
h. I have worked 12.5 25 50 12.5
harder than I
expected to do in
school
i. I enjoy working 50 25 25
on tasks that
require a lot of
thinking and
mental effort
j. In general, I am 25 62.5 12.5
excited about my
classes

6. How much does each of the following feedback methods interest or engage you?

Very much (%) Some (%) Very little (%) Not at all (%)
a. Error correction by teacher 37.5 25 37.5
b. Error indication/no 12.5 25 37.5 25
correction
c. Reformulation of sentences 12.5 37.5 37.5 12.5
d. Codes in margin 62.5 37.5
e. Self-correction 25 37.5 12.5 25
f. Peer correction 37.5 37.5 12.5 12.5
g. Personal written statement 100
by teacher

7. In your experience during your school career, how much have the following
aspects influenced your decision to engage with written corrective feedback?

Very much (%) Some (%) Very little (%) Not at all (%)
a. Teachers 75 25
b. Peers 25 37.5 12.5 25
c. Getting good grades 25 62.5 12.5
d. Improving my level of 87.5 12.5
English
e. Feedback method provided 75 25

8. How much work have you put into each of the following activities in your
English homework?

Very much (%) Some (%) Very little (%) None at all
a. Brainstorming ideas before 50 37.5 12.5
writing a text
b. Planning the structure of a text 50 25 25
c. Writing a first draft 25 75
d. Editing a first draft before 12.5 50 37.5
handing it in
e. Writing a second draft 37.5 25 12.5
f. Reflecting on teacher’s final 75 12.5 12.5
feedback

9. In your experience during your final year, about how often have you done
each of the following when working with written corrective feedback on your
English homework?

Very often (%) Often (%) Sometimes (%) Never (%)
a. Corrected spelling 37.5 50 12.5
b. Corrected grammar 62.5 25 12.5
c. Corrected words/phrases 37.5 50 12.5
d. Corrected word order 12.5 37.5 50
e. Improved sentence structure 25 37.5 37.5
f. Worked on writing style 37.5 25 12.5 25
g. Added ideas (e.g., 25 50 25
content—argument unclear)
h. Voluntarily re-wrote parts of 12.5 25 25 37.5
assignment

Appendix L—Article on Globalization by Michael

Globalization
Living in a small village called earth
By Michael
It is probably the most important change of our modern world and community—
globalization. It has an impact on every living creature of this earth, but especially
on us human-beings.

Well, at first let me tell you, that I am not going to solve the problems of globalization
in this short article. I am just an Austrian student who wants to outline some of the
current problems of this unbelievably complicated process that happens in every
second of our lives. Due to the complexity of that issue I will limit my article to one
topic which is very important for me. In the following text I will write about global
natural resources and in which way globalization destroys them. In my opinion the
impact (no plural) of the changing nature is the most radical one, and especially for
us human-beings they could be very dramatic.

Thinking of the future as well as the way we are wasting our global resources, it is
clear that we have to face many huge problems. As modern human-beings it is normal
for us, that we are able to enjoy products from all over world. We do not really think
about how they are produced, which resources get wasted for that and especially how
they get transported to us. When Austrians buy some products from, for example, the
United States, it is obvious that the environment suffers from that – even if we are
dreaming of living in a small village. But every long transport routes damage and
pollute the environment because of the CO2 emissions. Consumers should definitely
think about that before buying anything.

Another example of how our natural resources get destroyed is how we deal with the
huge amount of wrappings that are being used nowadays. The resources that are made
use of to produce wrapping-material only achieve one thing (You know I dislike the
use of thing but here it totally fits☺) – they get wasted. Why? Well, it does not matter
if it is packaging for a new iPhone or simply for cheese – the bottom line is that
consumers throw it into the rubbish bin.

To come to an end, I want the reader of this article to remember that there is only this
one world and that the global natural resources are our highest properties. We are
responsible for the other living creatures and should question our style of living. It is
necessary to live in a sustainable way and think about the next generations. So, instead
of wanting everything that is not immediately available in your home country, like
exotic fruits, enjoy the great local goods that are around you. And believe me: you will
be as happy as before – and even take care of the environment.

Blue: grammar
Green: spelling
Orange: expression
Red: content
Magenta: word order
¥: something is missing (colour indicates which area)
Bibliography

Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.). (2012). Handbook of research on student
engagement. New York: Springer.
Griffiths, A. J., Lilles, E., Furlong, M. J., & Sidhwa, J. (2012). The relations of adolescent student
engagement with troubling and high-risk behaviours. In S. L. Christenson, A. L. Reschly, & C.
Wylie (Eds.), Handbook of research on student engagement (pp. 563–584). New York: Springer.
National Research Council and Institute of Medicine. (2004). Engaging schools: Fostering high
school students’ motivation to learn. Committee on Increasing High School Students’ Engage-
ment and Motivation to Learn, Board on Children, Youth and Families, Division of Behav-
ioral and Social Sciences and Education. Washington: The National Academy Press. Retrieved
from https://www.nap.edu/catalog/10421/engaging-schools-fostering-high-school-students-mot
ivation-to-learn.
Newmann, F. M. (Ed.). (1992). Student engagement and achievement in American secondary
schools. New York: Teachers College Press.
Rosenblum, S., & Firestone, W. A. (1987). Alienation and commitment of high school students
and teachers. Paper presented at the Annual Meeting of the American Educational Research
Association. Washington. Retrieved from http://files.eric.ed.gov/fulltext/ED289930.pdf.
Wentzel, K. R., & Miele, D. B. (Eds.). (2016). Handbook of motivation at school (2nd ed.). New
York: Routledge.
