
Available online at www.sciencedirect.com

System 38 (2010) 211–227
www.elsevier.com/locate/system
doi:10.1016/j.system.2010.03.003

Development of English language teaching reflection inventory


Ramin Akbari*, Foad Behzadpoor, Babak Dadvand
English Department, Tarbiat Modares University, Chamran Expressway, Tehran, Iran
* Corresponding author. E-mail addresses: akbari_r@modares.ac.ir, akbari_ram@yahoo.com (R. Akbari), behzadpoor_f@yahoo.com (F. Behzadpoor), babak.dadvand@gmail.com (B. Dadvand).

Received 2 November 2009; revised 18 December 2009; accepted 29 January 2010

Abstract

The present study was carried out to develop and validate an L2 teacher reflection instrument. For this purpose, a six component
model of second language (L2) teacher reflection, encompassing practical, cognitive, meta-cognitive, affective, critical and moral
reflection, was developed using experts’ opinion and a comprehensive review of the related literature. This initial model was then
operationalized in an instrument consisting of 42 items, i.e. 7 behavioral items for each component. The piloting and testing of the
tentative model through exploratory and confirmatory data analyses on a sample of 300 ESL teachers reduced the instrument to 29
items, removing the ‘‘morality’’ factor, and reducing the items in the ‘‘affective’’ factor to half.
© 2010 Elsevier Ltd. All rights reserved.

Keywords: Teacher reflection; L2 teacher reflection; Teacher reflection instrument; Reflective teaching model

1. Introduction

The disappearance of the concept of method from applied linguistics discussions (Crandall, 2000;
Kumaravadivelu, 1994, 2001; Pennycook, 1989; Pica, 2000; Richards and Rodgers, 2002) has been a positive
change, bringing a sense of practice and realism to the profession. This development has been due to the ELT community's
recognition of the complexity of L2 teaching/learning processes, and the social/political forces that are at play in any
typical pedagogical context.
An area which has been indirectly affected by this conspicuous absence of method is teacher education. In the past,
methods provided compatible frameworks for novice teachers’ initiation into the world of practice envisioned by the
method of the day; now that method is gone, there is no such framework, and teacher educators must look for
sophisticated alternatives capable of responding to the pedagogical/socio-political demands made of ELT teachers.
The concept which seems to have the informal agreement of teacher educators of the field is reflective teaching
(Akbari, 2007; Halliday, 1998; Richards and Lockhart, 1994).
A problem with this model of teacher education, however, is the lack of evidence as to its effectiveness; there is no
published piece of research in applied linguistics (or even in mainstream education), to the best of our knowledge, to
indicate that teacher reflection will have any positive (or negative) effect on L2 learners’ achievement or efficiency of
instruction (Griffiths, 2000; Korthagen and Wubbels, 1995; Thiessen, 2000). This absence of evidence is largely due to
the fact that the construct of reflection has not been well elaborated on, that is, we still do not know what reflection
could entail and what components it consists of. In other words, the construct has not been defined in its operational
terms to allow for its quantification, mainly due to the absence of any instrument for measuring teacher reflection.
The present study, therefore, was conducted to fill this gap. There were two motives behind this project; first to
come up with a model of teacher reflection in applied linguistics, and second, to design an instrument to allow for the
quantification of the construct and consequently, its empirical investigation. The following sections will provide
a background to the concept of teacher reflection and details of the development and validation of English Language
Teacher Reflective Inventory (ELTRI).

2. What is reflection?

In most of the articles and books dealing with reflective teaching, the roots of the term reflection are traced back to
John Dewey (1933/1993) and his influential book ‘How we think: a re-statement of the relation of reflective thinking to
the educational process’ and Schon (1983, 1987, 1991). Dewey defines reflection as action based on ‘‘the active,
persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that
support it’’ (p. 9). Reflective action is contrasted with impulsive and routine actions. Impulsive action is based on trial
and error, drawing on biological/instinctive principles, while routine action is ‘‘based largely on authority and
tradition...undertaken in a passive, largely unthinking way’’ (Griffiths, 2000: 540). A reflective teacher, according to this
definition, is one who critically examines his/her practices, comes up with some ideas as how to improve his/her
performance to enhance students’ learning, and puts those ideas into practice, what Schon (1983) calls the cycle of
appreciation, action, and re-appreciation.
Schon makes a distinction between reflection-in-action and reflection-on-action. Reflection-in-action is the real
life, online reflection that teachers get engaged in as they confront a problem in the classroom while teaching. It
happens when ‘‘professionals are faced with a situation which they experience as unique or containing an element of
surprise. Rather than applying theory or past experience in a direct way, professionals draw on their repertoire of
examples to reframe the situation and find new solutions’’ (Griffiths, 2000: 542). Reflection-on-action, on the other
hand, is the type of reflection that teachers get involved in after the event. It is the most common type of
reflection which is encouraged and practiced in universities or centers of higher education, and unlike reflection-in-
action, which is an individual activity, reflection-on-action is normally exercised collectively and in groups.
The starting point for reflection is usually a problem, or what Munby and Russell (1990) call puzzles of practice.
‘‘There is some puzzling or troubling or interesting phenomenon with which the individual is trying to deal. As he tries
to make sense of it, he also reflects on the understandings which have been implicit in his action, understandings which
he surfaces, criticizes, restructures, and embodies in further action’’ (Schon, 1983: 50).
Reflection, however, seems to be much older than Dewey’s formulation and that of Schon. Fendler (2003), in
addition to Dewey and Schon, refers to reflection as having a Cartesian basis, which views knowing about the self or
self-knowledge as a valid means of knowledge generation. From this viewpoint, any reflection is a positive activity,
since it will result in self-understanding (the same idea is also echoed by Socrates and Freud). In the realms of teaching
and education, ‘‘when teachers are asked to reflect on their practices, the Cartesian assumption is that self-awareness
will provide knowledge and understanding about the self’’ (Fendler, 2003: 17). This knowledge will result in the
development of what Inchausti (1991) has called the second self, which can be viewed as a kind of intrapersonal
knowledge contributing to the self-actualization of the individual, helping him/her to live a rich, full life. Most of the
techniques resorted to in training reflective teachers indirectly draw upon this Cartesian view of reflection; when
teachers are encouraged to write reflective journals or autobiographical notes, the idea is to encourage them ‘‘to value
their own lives and experiences as a source of knowledge about what they may expect to encounter in their own
classrooms and lives of children they will teach’’ (Braun and Crumpler, 2004: 61; also see Carter, 1993; Connelly and
Clandinin, 1990). The use of autobiography also justifies teachers’ personal knowledge and enables them to move
beyond ‘‘the normalizing constraints that objective knowledge claims impose’’ (Davis and Adams, 2000: 18).
Another trend, according to Fendler, which has contributed to the reflective movement is feminism. The basic
assumption here is that expert knowledge is usually generated and controlled by masculine mechanisms, and this
masculine nature of knowledge has put women in a subordinate role (p. 19). Reflection on knowledge generation
mechanisms and subverting those mechanisms to give more voice to women, or more feminine approaches, can be
viewed as a force behind the rise of the reflective movement in education. The fact that in language teaching, and also
in mainstream education in general, most of the academics are males and most of the practitioners are females
(Pennycook, 1989) is a good piece of evidence supporting the validity of this claim. By assigning value to the
knowledge generated by practitioners, feminists hope to tilt the balance in favor of women.
The fact that reflection has been influenced by different philosophies and motivations in its genesis makes an exact
definition of the term problematic. Different attempts have been made to define the concept of reflective practice by
either its components or the processes it entails. Van Manen (1977), for example, viewed reflection as comprising
three elements of technical rationality, practical reflection, and critical reflection, while Valli (1990) adds moral
reflection to the list. Korthagen (2001), on the other hand, regards reflection as consisting of organized, rational,
language-based decision making processes that also include non-rational, gestalt type operations. In another classi-
fication, Jay and Johnson (2002) regard reflective practice as consisting of three crucial steps of description,
comparison, and criticism. The descriptive stage is devoted to problem setting, during which the teacher determines
which aspect of the classroom or his/her practice should form the core of his/her reflective attention. The second stage,
i.e., comparison, is the phase during which the teacher starts ‘‘thinking about the matter for reflection from a number
of different frameworks’’ (p.78). It is during the comparative stage that the practitioner tries to make sense of other
people’s viewpoints, or develops a new frame of reference (Schon, 1983), which will enable him/her to comprehend
viewpoints which may run counter to the ones he/she holds. This ability to detach oneself from the limits of one’s
experience will enable us to ‘‘discover meaning we might otherwise miss’’ (Jay and Johnson, 2002: 78). The ultimate
result will be a more comprehensive understanding of the teaching context and its complexity. The last stage of
reflection is what is termed as the critical stage. At this stage, the reflective practitioner evaluates different choices and
alternatives and integrates the newly-acquired information with what he/she already knows. It is, in fact, the decision
making stage resulting from careful analysis of the situation and deliberation. This last stage will form the basis for the
formulation of alternative ways of teaching or approaching the problem on the part of the teacher.

3. A proposed model of teacher reflection

As pointed out earlier, not much has been done to operationalize the construct of reflection, and this is largely
due to lack of consensus as to what reflection actually entails. In the current study, therefore, the first priority was to
develop an instrument based on a tentative model of teacher reflection and its components.
Following the standard procedure for developing a valid and reliable measurement instrument (Brown, 2001;
Dornyei, 2003), we initially carried out a comprehensive review of related literature to check for any available model
of teacher reflection and/or its components. This literature review provided an initial draft of the constructs and
behaviors perceived as relevant to teacher reflection.
More specifically, the review resulted in the accumulation of more than six hundred reflective categories and
behaviors out of which a temporary data driven model of teacher reflection was to be developed. For this to happen,
however, a cycle of item accumulation, item arrangement, model development, and model test was followed.
The term ‘categories’ here is used to refer to those ideas that were still general and not formulated as an actual
instance of behavior by the teachers. For example, Ward and McCotter (2004) refer to a reflective practitioner as
someone who focuses on teaching tasks such as management and planning. This was viewed as an example of
category since it did not require the teacher to engage in any specific, classroom based teaching behavior. As another
example of a category, Zeichner and Liston (1987) encourage teachers to reflect on the socio-political and cultural
context of their classes. In contrast, the idea of keeping a journal and writing about one’s teaching experiences was an
example of ‘behavior’ since it represented an instance of reflection on one’s actual course of pedagogical practice. As
a further example, consulting with a colleague on an issue pertaining to actual teaching practice was taken to be
another instance of reflective behavior due to its specified practice-oriented nature.
The next stage included two phases; during the first phase we eliminated those items that overlapped or were mere
repetitions of one another, for both categories and behaviors, reducing the list to 302 items. In the second phase of
stage two, an effort was made to translate the existing categories into actual instances of reflective behavior. For
example, the category of being sensitive to political and cultural context of teaching was rephrased as ‘‘I think about
instances of social injustice in my own surroundings and try to discuss them in my classes’’ or ‘‘In my teaching, I
include less-discussed topics, such as old age, AIDS, discriminations against women and minorities, and poverty’’.
The final stage was devoted to grouping all the developed items and finding themes or commonalities among them,
i.e., going over all the items and grouping them based on what they purportedly measured. This analysis stage gave
rise to the following six overarching components of reflection, to be measured and validated in the subsequent phases
of the study:

3.1. Practical element

This component includes those items that deal with the tools and the actual practice of reflection. Different tools/
procedures for the reflective practice include ‘journal writing,’ ‘lesson reports,’ ‘surveys and questionnaires,’ ‘audio
and video recordings,’ ‘observation,’ ‘action research,’ ‘teaching portfolios,’ ‘group discussions,’ ‘analyzing critical
incidents’ (Farrell, 2004; Murphy, 2001; Richards and Farrell, 2005; Richards and Lockhart, 1994). In this study,
however, doing action research has been categorized under the ‘cognitive element’ of a reflective practice.

3.2. Cognitive element

This element is concerned with teachers’ attempts aimed at professional development. Conducting small-scale
classroom research projects (action research), attending conferences and workshops related to one’s field of study,
and reading the professional literature are among the behaviors included in this domain (Farrell, 2004; Richards and
Farrell, 2005).

3.3. Learner element (affective)

This component includes those items that deal with a teacher’s reflecting on his/her students, how they are learning
and how learners respond or behave emotionally in their classes. According to Zeichner and Liston (1996), this
tendency ‘‘emphasizes reflection about students, their cultural and linguistic backgrounds, thinking and under-
standings, their interests, and their developmental readiness for particular tasks’’ (p. 57). This element concentrates
also on teachers’ reflecting on their students’ emotional responses in their classes (Hillier, 2005; Pacheco, 2005;
Pollard et al., 2006; Richards and Farrell, 2005; Richards and Lockhart, 1994).

3.4. Meta-cognitive element

This component deals with teachers and their reflections on their own beliefs and personality, the way they define
their practice, their own emotional make up, etc. (Hillier, 2005; Pollard et al., 2006; Richards and Lockhart, 1994;
Stanley, 1998; Zeichner and Liston, 1996). As Akbari (2007) states, ‘‘Teachers’ personality, and more specifically
their affective make up, can influence their tendency to get involved in reflection and will affect their reaction to their
own image resulting from reflection’’ (p. 10).

3.5. Critical element

This component consists of items that refer to the socio-political aspects of pedagogy and reflections upon those.
Items falling in this category deal with teachers’ reflecting on the political significance of their practice and intro-
ducing topics related to race, gender and social class, exploring ways for student empowerment (Bartlett, 1997; Day,
1993; Jay and Johnson, 2002; Zeichner and Liston, 1996).

3.6. Moral element

Items included here check for teachers’ reflecting on moral issues. Hansen (1998) refers to Valli’s (1990) three
strands of reflection which take into account the notion of morality. The ‘deliberative approach’ ‘‘urges teachers to
think critically about their purposes and how to justify them from a moral point of view’’ (Hansen, 1998, p. 644). The
second approach called the ‘relational approach’ ‘‘draws upon moral philosophy and feminist theory which centers the
moral life around issues of personal character and how individuals regard and treat other individuals’’ (p. 645). The
third approach, called the ‘critical approach’, which is, according to Hansen (1998), very much similar to critical
reflection, is also highlighted by Goodman (1986), Apple (1979), and Giroux and McLaren (1986) (Cited in Hansen,
1998); items dealing with this third approach were not included here since the 5th element (above) dealt specifically
with this aspect.
Table 1 includes a list of the proposed components, their definitions, sample item(s) for each component, and the
references that justify the components’ inclusion in the model.
To complement the conceptual relevance of our tentative model, we conducted interviews with 29 domain experts
(applied linguistics’ university professors and PhD students) familiar with reflective practice and its theoretical
underpinnings. The interviews, ranging from 13 to 45 min in length, were conducted based on a guide designed to
elicit responses – both general and specific – dealing with the nature of reflective teaching and the components that
can be subsumed as its constituent elements (see Appendix I for the interview questions). All the interviews were tape
recorded and later transcribed for final content analysis. The purpose of this qualitative content analysis was to find out
whether any alternative reflective model can be developed, and whether the potential categories experts referred to
matched the ones we had developed for the study. The interview results corroborated our model and did not yield
any new themes or patterns to add.
In the next step of our instrument development effort, the tentative model, along with the developed items (i.e.,
reflective behaviors), underwent a second round of item assessment/reduction by ten of the participants in the
interview phase who agreed to have an analytic look at the instrument. In fact, our objective in this stage was twofold:
to get a second professional opinion on the component make-up of the model and to make use of ‘experts’ judgment’
for item redundancy, clarity and readability (Dornyei, 2003). This expert analysis of the instrument further polished
the questionnaire and led to a more truncated model since some remaining redundant items were discarded at this
phase. In addition, minor changes were also made in the wording of a few questions based on the experts’ opinion on
the items’ clarity and readability; hence preparing the instrument for the subsequent validation phase. The participants
also rank ordered the items based on their perceived degree of relevance to the element they belonged to.
Forty-two items were selected for inclusion in the instrument (7 behavioral items for each component), based on the
frequency with which each item was selected as relevant by the ten experts. Then, following the standard outlines for
questionnaire development (Brown, 2001; Dornyei, 2003), we chose a 5-point Likert scale ranging from ‘‘always’’ to
‘‘never’’ to assess English language teachers’ recourse to reflective practice. Once the draft version of the ques-
tionnaire was ready, it was given to two applied linguistics professors with language teacher education background for
proofreading and face validity assessment, resulting in some minor alterations in the wording of a few items. (See
Table 2 for the draft version.) The instrument was then piloted on a group of 32 ELT teachers, and the Cronbach alpha
reliability of the questionnaire was estimated to be .91.
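For readers who wish to reproduce this kind of reliability estimate, the sketch below shows how Cronbach's alpha can be computed from a respondents-by-items matrix of Likert scores. It is only an illustration under stated assumptions: the data are synthetic, the cronbach_alpha helper is ours rather than part of any package used in the study, and the original analyses were run in STATISTICA rather than Python.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of the sum scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 32 teachers x 42 items, scored 1 ("never") to 5 ("always").
# With the real pilot matrix the study reports .91; random data will give a value near zero.
rng = np.random.default_rng(0)
pilot = rng.integers(1, 6, size=(32, 42))
print(round(cronbach_alpha(pilot), 2))
```

With the actual 32 x 42 pilot matrix in place of the synthetic one, the same function would return the reported value of .91.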

4. Instrument validation

The validation procedure proceeded by distributing a total of 650 instruments to practicing English teachers at
different institutes, schools, and centers of higher education in Tehran and six other provinces of Iran.
Table 1
The tentative model, its components and sample items.

Practical. Definition: the actual act of reflection, using different tools such as keeping journals or talking to colleagues. Sample item: ‘‘After each lesson, I write about the accomplishments/failures of that lesson or I talk about the lesson to a colleague.’’
Cognitive. Definition: conscious efforts for professional development, by attending conferences and reading professional books and journals. Sample item: ‘‘I look at journal articles or search the internet to see what the recent developments in my profession are.’’
Affective. Definition: deals with knowledge of learners and their affective/cognitive states. Sample item: ‘‘I think about my students’ emotional responses to my instructions.’’
Meta-cognitive. Definition: deals with teachers’ knowledge of their personality, their definition of learning and teaching, and their view of their profession. Sample item: ‘‘I think about my strengths and weaknesses as a teacher.’’
Critical. Definition: deals with the socio-political dimension of teaching. Sample item: ‘‘I think of ways to enable my students to change their social lives in fighting poverty, discrimination, and gender bias.’’
Moral. Definition: deals with issues of justice, empathy, and values. Sample item: ‘‘I believe in the concept of justice and try to show it in my classroom practice.’’

Table 2
The tentative structure and items of the instrument.
Component Items
1. Practical 1. I write about my teaching experiences in a diary or a notebook.
2. I have a file where I keep my accounts of my teaching for reviewing purposes.
3. I talk about my classroom experiences with my colleagues and seek their advice/feedback.
4. After each lesson, I write about the accomplishments/failures of that lesson or I talk about the lesson
to a colleague.
5. I discuss practical/theoretical issues with my colleagues.
6. I observe other teachers’ classrooms to learn about their efficient practices.
7. I ask my peers to observe my teaching and comment on my teaching performance.
2. Cognitive 8. I think of using/introducing new teaching techniques in my classes.
9. I read books/articles related to effective teaching to improve my classroom performance.
10. I participate in workshops/conferences related to teaching/learning issues.
11. I think of writing articles based on my classroom experiences.
12. I look at journal articles or search the internet to see what the recent developments in my profession are.
13. I carry out small scale research activities in my classes to become better informed of learning/teaching processes.
14. I think of classroom events as potential research topics and think of finding a method for investigating them.
3. Learner 15. I think about my students’ emotional responses to my instructions.
16. When a student is having an emotional problem or is neglected by his/her peers, I try to spend more time with
him/her.
17. Before and after teaching, I think about aspects of my lessons my students liked/disliked.
18. I ask my students to write/talk about their perceptions of my classes and the things they liked/disliked about it.
19. I talk to my students to learn about their learning styles and preferences.
20. I talk to my students to learn about their family backgrounds, hobbies, interests and abilities.
21. I ask my students whether they like a teaching task or not.
4. Meta-Cognitive 22. As a teacher, I think about my teaching philosophy and the way it is affecting my teaching.
23. I think of the ways my biography or my background affects the way I define myself as a teacher.
24. I think of the meaning or significance of my job as a teacher.
25. I try to find out which aspects of my teaching provide me with a sense of satisfaction.
26. I think about my strengths and weaknesses as a teacher.
27. I think of the positive/negative role models I have had as a student and the way they have affected me in
my practice.
28. I think of inconsistencies and contradictions that occur in my classroom practice.
5. Critical 29. I think about instances of social injustice in my own surroundings and try to discuss them in my classes.
30. I think of ways to enable my students to change their social lives in fighting poverty, discrimination, and
gender bias.
31. In my teaching, I include less-discussed topics, such as old age, AIDS, discrimination against women and
minorities, and poverty.
32. I think about the political aspects of my teaching and the way I may affect my students’ political views.
33. I think of ways through which I can promote tolerance and democracy in my classes and in the society
in general.
34. I think about the ways gender, social class, and race influence my students’ achievements.
35. I think of outside social events that can influence my teaching inside the class.
6. Moral 36. I think of my job as showing care and sympathy to others.
37. I regard myself as a role model for my students and as a result try to act as a moral example.
38. I believe in the concept of justice and try to show it in my classroom practice.
39. I talk about my moral standards and values to my students.
40. I establish a clear set of rules for my students to follow in terms of their classroom attendance and the way they
will be evaluated at the end of the course.
41. I provide equal opportunities for all my students in the class regardless of their capabilities.
42. I have a clear set of general class rules and what constitutes acceptable behavior for my students to follow.

Both face-to-face methods and emails were used for instrument distribution; 425 instruments were completed by the respondents and
returned to the researchers (a response rate of 65%). Upon initial inspection, 117 of the completed instruments were
discarded since they were either incomplete or carelessly completed (for example, those questionnaires in which one
response was systematically selected). This left us with 308 instruments for model validation. Of the respondents, 149
were male (48%) and 157 female (52%), while two respondents failed to specify their gender. The participants’
teaching experience ranged from 1 to 40 years (Mean = 6.3, SD = 5.9) across different private and public contexts.

4.1. Data analysis framework

There is no single agreed-upon framework for model construction and evaluation. However, there are some
studies that have proposed a more consistent approach to such a practice. In the present study, the validation scheme
proposed by Mulaik and Millsap (2000), consisting of Exploratory Factor Analysis (EFA), Confirmatory Factor
Analysis (CFA) and model evaluation, was used. However, for ease of discussion, we have divided our validation
process into two Macro-Phases: Exploratory Data Analysis (EDA) and Confirmatory Data Analysis (CDA), each of
which comprised a number of well-grounded micro-procedures. At the same time, experts’ opinions as well as
domain knowledge were interactively employed for verifying the rationality of the yielded results at each stage of data
analysis (Fig. 1). What follows is a brief descriptive account of our data analysis framework for the study.

Fig. 1. Iterative process of model validation: the hypothetical model undergoes fitting and evaluation, informed by experts and domain knowledge; if unsatisfactory, it is re-specified and re-evaluated until a satisfactory final model is obtained.

4.1.1. Phase 1 – exploratory data analysis


Exploratory Data Analysis (EDA) is an approach to analyzing data for the purpose of formulating hypotheses
worth testing; to this end, it draws upon a variety of techniques to maximize insights into a dataset, uncover
underlying structure and extract important latent factors. As Fig. 1 shows, our analyses started with data cleaning by
manually discarding 117 questionnaires due to their sloppy completion, followed by calculation of descriptive
statistics. Along with this phase of descriptive statistics, cluster analysis, i.e. partitioning of a data set into subsets or
clusters sharing certain common traits, was conducted. Two objectives were pursued in this phase of the analysis:
first, to identify subgroups within the data, and second, to detect outlier cases which had survived the data cleaning
phase. These clusters, later divided into Fitting Data and Validation Data, then set the stage for the last
phase of the EDA, i.e. Exploratory Factor Analysis. In fact, in the exploratory phase, efforts were made to uncover
the latent variables which could explain as much of the variance in the data as possible (Shultz and Whitney, 2005).
For this purpose, the Fitting Data was fed into STATISTICA to see whether it uncovered any meaningful factor
structure.

4.1.2. Phase 2 – confirmatory data analysis


Following the EDA phase, the developed model was further extended and verified in the subsequent Confirmatory
Data Analysis (CDA) stage. This was done first through a round of analyses in which the reliability of the sum scale
and each of the yielded factors from the exploratory phase was computed using Cronbach’s Alpha. Then, the Fitting
Data, from the previous stage of clustering and EFA, underwent Confirmatory Factor Analysis (CFA), a data reduction
procedure which sets a priori the number of expected factors for model validation (Shultz and Whitney, 2005). In fact,
what CFA does with the data is to verify ‘‘that the factor structure obtained in the exploratory factor analysis is robust
and not merely the consequence of the whims of random variability in one’s data’’ (Howitt and Cramer, 2000: 329).
Finally, in order to objectively evaluate the model’s overall fit for the data being examined, both Fitting and
Validation, heuristic measures called ‘Goodness of Fit Indices’ (GFI) (Ho, 2006; Hu and Bentler, 1999) were
employed as the last step in our model evaluation/validation scheme. This included the most commonly used indices
for empirical examination of model fit, i.e. tests of absolute fit and tests of incremental fit. In addition to normed Chi-
Squared statistic (chi-square divided by the degrees of freedom) which is not very sensitive to sample size, three other
commonly used absolute fit indices of Goodness of Fit Index (GFI), Adjusted Goodness of Fit Index (AGFI) and the
relatively new Root Mean Square Error of Approximation (RMSEA) were used in this study. In fact, the simultaneous
use of these three fit indices was due to the absence of a single universally accepted criterion for assessing model fit
(Heubeck and Neill, 2000).
The cut-off value for model acceptance is <3 for the normed Chi-Squared statistic, while the corresponding criteria for
GFI, AGFI and RMSEA are >.9, >.85, and <.08, respectively (Sharma, 1996). At the same time, Incremental Fit Indices,
including the Incremental Fit Index (IFI), Tucker-Lewis Index (TLI), and Comparative Fit Index (CFI), all with cut-off
values >.9, were also calculated to estimate model fit. The computation of Incremental Fit Indices as an aid in
assessing model fit is recommended since they employ more complex computational matrices and therefore can
provide researchers with more accurate estimations (Smith and McMillan, 2001). In brief, a model which meets
these acceptance criteria can be regarded as a valid tool. For such a model to be attained, an iterative
process had to be applied (Fig. 1): if the resulting model is satisfactory, the final model is obtained; otherwise,
the model is re-specified until the most satisfactory model emerges.
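As an illustration of how these criteria can be checked, the sketch below computes the normed Chi-Square, RMSEA and the incremental indices (IFI, TLI, CFI) from a model's chi-square statistic and that of the independence (null) model, using the standard formulas; GFI and AGFI are omitted because they additionally require the model-implied covariance matrix. The numerical inputs are hypothetical, and the fit_indices function is ours, not part of the authors' STATISTICA workflow.

```python
import math

def fit_indices(chi2, df, chi2_null, df_null, n):
    """Common SEM fit indices from model and null (baseline) chi-square values."""
    normed_chi2 = chi2 / df                                # should be < 3
    rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))  # should be < .08
    cfi = 1 - max(chi2 - df, 0) / max(chi2_null - df_null, chi2 - df, 1e-12)
    tli = ((chi2_null / df_null) - (chi2 / df)) / ((chi2_null / df_null) - 1)
    ifi = (chi2_null - chi2) / (chi2_null - df)            # Bollen's IFI
    return {"chi2/df": normed_chi2, "RMSEA": rmsea, "CFI": cfi, "TLI": tli, "IFI": ifi}

# Hypothetical chi-square values for a 29-item, five-factor model fitted to n = 210 cases.
print(fit_indices(chi2=420.0, df=367, chi2_null=2500.0, df_null=406, n=210))
```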

5. Results

5.1. Descriptive data analysis and clustering

After making sure that there was no excess kurtosis, high skewness, bi-modality, or other strong deviation from
a normal distribution, we used Ward’s Method to identify the appropriate number of clusters within the dataset. The
analyses using STATISTICA gave rise to four distinct clusters with a strong degree of association among the
members within the same group and weak association among members of different groups. Afterwards, the K-Means
Method was used to cluster the respondents into the four groups. Analysis of Variance (ANOVA) confirmed
significant mean differences among these four clusters (p < 0.01), which included 114, 93, 93 and 8
respondents, respectively.
However, as Fig. 2 indicates, cluster 4, with 8 members, stood far away from the other clusters in terms of its relative
mean over the indicators. This discrepancy is due to the existence of outlier cases within the cluster which had
survived the manual cleaning stage. Therefore, as a measure to further refine the data, this cluster was discarded, reducing the
dataset to 300 cases. In the next step, and in order to create the Validation and Fitting Datasets from the remaining
three clusters for the subsequent analyses, each cluster was randomly divided into two sections with a ratio of
3:1. The first parts of the 3 clusters were then added en masse to constitute the Fitting Dataset (containing 210 cases to be
used in EFA, CFA and model-fit analysis), whereas the second subsets made up the Validation Dataset (containing 90
cases reserved as an independent sample for the final model-fit analysis).

Fig. 2. Plot of means for each cluster over indicators.
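The clustering-and-splitting procedure described above can be sketched as follows. This is only an illustrative re-implementation under assumptions: the study used STATISTICA, whereas the sketch uses SciPy and scikit-learn, and the data frame of responses (columns Q1–Q42) is synthetic.

```python
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

# Hypothetical data frame: one row per respondent, columns Q1..Q42 with 1-5 Likert scores.
rng = np.random.default_rng(1)
responses = pd.DataFrame(rng.integers(1, 6, size=(308, 42)),
                         columns=[f"Q{i}" for i in range(1, 43)])

# Ward's hierarchical clustering to settle on the number of clusters (4 in the study).
tree = linkage(responses.values, method="ward")
ward_labels = fcluster(tree, t=4, criterion="maxclust")

# K-Means to assign respondents to the chosen number of clusters.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(responses.values)
responses["cluster"] = kmeans.labels_

# Drop the smallest (outlier) cluster and split each remaining cluster 3:1
# into Fitting and Validation subsets (stratified by cluster membership).
sizes = responses["cluster"].value_counts()
kept = responses[responses["cluster"] != sizes.idxmin()]
fitting, validation = train_test_split(kept, test_size=0.25, random_state=0,
                                       stratify=kept["cluster"])
print(len(fitting), len(validation))
```

In the study itself, the smallest of the four clusters (8 respondents) was the one discarded, leaving 210 Fitting and 90 Validation cases.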

5.2. Exploratory factor analysis (EFA)

In the next stage, in order to determine whether there is any empirical support for the existence of separate factors
for reflective practice, the Fitting Data from the earlier clustering stage underwent Principal Components
Factoring (PCF) with varimax rotation. However, prior to this analysis and as a measure against multicollinearity, the
determinant was calculated and turned out to be higher than .00001. In addition, the Kaiser-Meyer-Olkin (KMO)
measure of Sampling Adequacy (.865) and Bartlett’s Test of Sphericity (p = .000) were both acceptable, indicating that
the data were factorable.
PCF with varimax rotation on the 42 items of the Fitting Dataset yielded 6 factors with eigenvalues greater than
one, accounting for 51% of the total variance (see Table 3). Cattell’s (1966) scree test of eigenvalues was also used to
plot the appropriate number of factors substantiated by the dataset and suggested that 6 factors could be extracted.
Having examined the very first factor – with an eigenvalue of 10.2 – and as a conservative heuristic measure, we set
.45 as the minimum item loading threshold (Raubenheimer, 2004). As a result, items 36 and 39 with loadings smaller
than .45 were discarded from the subsequent analyses since they had failed to reach the acceptable loading value on
their given factors. In fact, these two questions, ‘I think of my job as showing care and sympathy to others’ and ‘I talk
about my moral standards and values to my students’, which enquired about moral dimensions of teaching practices
showed no significant statistical relationship with any of the uncovered factors and were thus removed from the
instrument.
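For illustration, the exploratory step reported above (factorability checks, principal-components extraction with varimax rotation, and the .45 loading threshold) can be sketched with the third-party factor_analyzer package. This is an assumption on our part, since the original analysis was carried out in STATISTICA; the data frame below is synthetic, so the printed loadings will not reproduce Table 3.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical Fitting Dataset: 210 respondents x 42 items (columns Q1..Q42).
rng = np.random.default_rng(2)
fitting = pd.DataFrame(rng.integers(1, 6, size=(210, 42)),
                       columns=[f"Q{i}" for i in range(1, 43)])

# Factorability checks reported in the study: Bartlett's test of sphericity and KMO.
chi_square, p_value = calculate_bartlett_sphericity(fitting)
kmo_per_item, kmo_total = calculate_kmo(fitting)
print(f"Bartlett p = {p_value:.3f}, KMO = {kmo_total:.3f}")

# Principal-components extraction with varimax rotation; six factors retained
# (eigenvalues > 1 and the scree test suggested six in the study).
efa = FactorAnalyzer(n_factors=6, rotation="varimax", method="principal")
efa.fit(fitting)

# Keep only loadings above the .45 threshold used by the authors.
loadings = pd.DataFrame(efa.loadings_, index=fitting.columns,
                        columns=[f"F{i}" for i in range(1, 7)])
print(loadings.where(loadings.abs() > 0.45).dropna(how="all"))
```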
Next, the obtained factor structure was further scrutinized through systematic interaction with domain knowledge
and experts’ judgments. The analyses at this stage showed that all the factors that emerged were clearly
identifiable with reference to our tentative model (Table 3): Meta-cognitive (Factor 1 with 8 items accounting for
24% of the variance), Cognitive (Factor 2 with 7 items accounting for 8% of the variance), Critical (Factor 3 with
7 items explaining 6% of the variance), Practical (Factor 4 with 7 items explaining 5% of the variance), Moral
(Factor 5 with 5 items accounting for 4% of the variance) and Affective (Factor 6 with 9 items explaining 3% of
the total variance). Yet, closer inspection showed that Question 28 with primary loading on Factor 6 did not
belong to the items in this category. That is, question 28 of the instrument, ‘I think of inconsistencies and
contradictions that occur in my classroom practice’, relates to teachers’ meta-cognitive reflection whereas the
factor on which it has loaded represents the affective make-up of teachers. Therefore, this item was also deleted
from the instrument in this phase.
As for cross-loadings of items on more than one factor, instead of eliminating them from the model, we consulted
domain experts and assigned such items to the factors that looked logically more relevant; one of our objectives in this
phase was to retain for the confirmatory stage of the study as many of the items that survived the exploratory phase as
possible. Therefore, items 34 and 35, ‘I think about the ways gender, social class, and race
influence my students’ achievement’ and ‘I think of outside social events that can influence my teaching inside the
class’, which had simultaneously loaded on factors 1 and 3 were further content-examined for locating their more
relevant factor(s). As a result, these items, which underlined reflection on the socio-political aspects of pedagogy, were
included under Factor 3, due to their content proximity. In the same vein, Question 25, ‘I try to find out which aspects of my teaching provide me with a sense of satisfaction’, loading simultaneously on Factors 1 and 6, was assigned to the Meta-cognitive element.
Table 3
The results of exploratory factor analysis.
Factor loadings (varimax raw); extraction: principal components; marked loadings are >.45.

Item (intended component): loading(s) on the extracted factor(s)
Q1 (Practical): Practical .53
Q2 (Practical): Practical .58
Q3 (Practical): Practical .53
Q4 (Practical): Practical .74
Q5 (Practical): Practical .48
Q6 (Practical): Practical .53
Q7 (Practical): Practical .64
Q8 (Cognitive): Cognitive .53
Q9 (Cognitive): Cognitive .72
Q10 (Cognitive): Cognitive .59
Q11 (Cognitive): Cognitive .77
Q12 (Cognitive): Cognitive .76
Q13 (Cognitive): Cognitive .70
Q14 (Cognitive): Cognitive .69
Q15 (Affective): Affective .65
Q16 (Affective): Affective .63
Q17 (Affective): Affective .56
Q18 (Affective): Affective .54
Q19 (Affective): Affective .58
Q20 (Affective): Affective .54
Q21 (Affective): Affective .67
Q22 (Meta-cognitive): Meta-cognitive .48
Q23 (Meta-cognitive): Meta-cognitive .67
Q24 (Meta-cognitive): Meta-cognitive .57
Q25 (Meta-cognitive): Meta-cognitive .45, Affective .53
Q26 (Meta-cognitive): Meta-cognitive .53
Q27 (Meta-cognitive): Meta-cognitive .52
Q28 (Meta-cognitive): Affective .47
Q29 (Critical): Critical .78
Q30 (Critical): Critical .76
Q31 (Critical): Critical .63
Q32 (Critical): Critical .70
Q33 (Critical): Critical .65
Q34 (Critical): Meta-cognitive .48, Critical .53
Q35 (Critical): Meta-cognitive .52, Critical .50
Q36 (Moral): no loading >.45
Q37 (Moral): Moral .56
Q38 (Moral): Moral .47
Q39 (Moral): no loading >.45
Q40 (Moral): Moral .69
Q41 (Moral): Moral .55
Q42 (Moral): Moral .77


5.3. Confirmatory factor analysis

Based on the EFA results, substantiated by the domain knowledge and experts’ opinion, a six factor model of
reflective practice was extracted from the Fitting Dataset. This hypothetical model, then, had to be validated so that it
could be used as a valid measurement instrument for reflective teaching. At this stage, since judgments were made
a priori as to the number and nature of the latent variables, Confirmatory Factor Analysis (CFA) was conducted on the
R. Akbari et al. / System 38 (2010) 211e227 221

Fitting Data using STATISTICA. However, prior to proceeding with this analysis, the Cronbach’s alpha estimates for
the indicators of Practical, Affective, Critical, Meta-Cognitive, Moral, and Cognitive were calculated and turned out to
be .73, .78, .84, .82, .67, and .83 respectively.
As Fig. 3 shows, CFA corroborated a five factor model, i.e. Practical, Affective, Cognitive, Critical and Meta-
Cognitive, in which all the loadings between the indicators and the latent factors, as well as the covariances among
the factors, were significant at α = .001 (p-value < .001). Yet, none of the items dealing with the construct of morality
from the exploratory phase showed a significant relationship with their corresponding factor. Neither did Morality
show any significant covariance with the other latent factors; hence it was deleted from the model, reducing it to a five
factor instrument. In addition, the loadings of only three items, 19, 20 and 21, belonging to the Affective factor turned
out to be salient in the confirmatory stage. For this reason, this component is presented as a three-indicator factor in the
fitted CFA model. Finally, item 1 of the instrument did not load on its perceived factor either and was hence discarded
from the model. This left us with an instrument with five factors and 29 items for measuring teacher reflectivity (see
Appendix II for the final version of the instrument).
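For readers who wish to re-examine this structure, the five-factor, 29-item measurement model can be written out in lavaan-style syntax and estimated with an open-source SEM package such as semopy; the sketch below is an assumption on our part (the study itself used STATISTICA), the data are synthetic (so the fit will of course be poor), and the item groupings simply follow the numbering of Table 2 as retained in the text.

```python
import numpy as np
import pandas as pd
import semopy

# Hypothetical Fitting Dataset containing only the 29 retained items (numbering follows Table 2).
rng = np.random.default_rng(3)
items = [2, 3, 4, 5, 6, 7] + list(range(8, 15)) + [19, 20, 21] + list(range(22, 28)) + list(range(29, 36))
fitting = pd.DataFrame(rng.integers(1, 6, size=(210, len(items))).astype(float),
                       columns=[f"Q{i}" for i in items])

# Five-factor measurement model retained after CFA, in lavaan-style syntax.
model_desc = """
Practical     =~ Q2 + Q3 + Q4 + Q5 + Q6 + Q7
Cognitive     =~ Q8 + Q9 + Q10 + Q11 + Q12 + Q13 + Q14
Affective     =~ Q19 + Q20 + Q21
Metacognitive =~ Q22 + Q23 + Q24 + Q25 + Q26 + Q27
Critical      =~ Q29 + Q30 + Q31 + Q32 + Q33 + Q34 + Q35
"""

model = semopy.Model(model_desc)
model.fit(fitting)
print(model.inspect())           # loadings and factor covariances with p-values
print(semopy.calc_stats(model))  # chi-square, RMSEA, CFI, TLI and related indices
```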
In order to determine whether this hypothetical model adequately fits the data, both the Fitting and Validation
Datasets underwent model-fit analysis. Crucial to such an analysis is the Validation data, since it had been reserved
from the clustering stage and had not met the given model before. Nonetheless, in order to cross-validate the findings,
the Fitting Data, which had already been subject to EFA and CFA, was also incorporated into this phase of fitting analysis.

Fig. 3. Fitted CFA model: the five latent factors – Practical (Q2–Q7), Affective (Q19–Q21), Cognitive (Q8–Q14), Critical (Q29–Q35) and Meta-cognitive (Q22–Q27) – with their significant indicator loadings and the covariances among the factors.

Table 4
Absolute and incremental fit indices for the CFA model.

Dataset               Chi-Sq/DF   GFI    AGFI   RMSEA   IFI    TLI    CFI
Fitting dataset       1.155       .904   .884   .023    .943   .935   .942
Validation dataset    1.424       .899   .880   .038    .884   .854   .872

Chi-Sq/DF, GFI, AGFI and RMSEA are absolute fit indices; IFI, TLI and CFI are incremental fit indices.

As Table 4 shows, the assessment indices for both the Validation and Fitting Data outstripped the minimum cut-off
points, i.e. <3 for the normed Chi-Squared statistic; >.9, >.85, and <.08 for GFI, AGFI and RMSEA respectively; and >.9 for
TLI and CFI. Despite a slight decrease in the assessment indices of the Validation dataset, these model-fit estimates
confirm the correspondence of the data to the CFA model. All this, in turn, verifies the construct validity of the final
version of the instrument for its intended purpose.

6. Discussion

The present study described the development and validation of an instrument for measuring teachers’
reflection in English language pedagogy. For this purpose, a model construction framework consisting of
exploratory and confirmatory analyses was used to examine the construct validity of a proposed six factor model,
i.e. meta-cognitive, cognitive, practical, critical, moral and affective. This hypothetical model, which was derived
from the literature, was then tested on a sample of 300 EFL teachers using EFA, CFA and Model Evaluation
estimates. While EFA corroborated all the initial components of our tentative model, CFA only gave statistical
support to five components, reducing the instrument to a five-factor model (the moral component did not survive
the confirmatory stage). The calculated model-fit estimates also verified this CFA model as a valid measure of
reflective teaching.
Despite the overall data-to-model fit observed in this study, there are some aspects of our initially proposed
instrument which were not substantiated by the gathered data. The most striking of these incongruities is perhaps
the omission of the construct Morality in the CFA stage, in spite of the partial affirmation this factor received in the
exploratory analyses. In fact, none of the items tapping into moral issues had a significant statistical relationship
with their corresponding factor; neither did Morality show an acceptable covariance with the other constructs. Yet,
as related literature vividly attests, teachers’ reflection on their moral, in addition to pedagogical, responsibilities is
considered to be an integral component of any instance of reflective practice. The lack of empirical support for this
crucial construct here can be explained in light of the myriad contextual factors shaping language teachers’
perception of their professional duties. This is to say that, because of differences in the status of the
teaching profession in general – and English teaching in particular – as well as the conditions under which
instruction occurs in each context, a universal and homogeneous role definition is difficult to conceptualize for
language teachers; i.e. ‘‘a TESOL knowledge base would be shaped by national expectations as well as local and
regional demands’’ (Fradd and Lee, 1998, p. 763). In the same vein, one possible reason for the exclusion of
Morality can be teachers’ role perceptions, which bar them from getting overtly involved in issues of moral
significance (Hansen, 2001).
Of the other components outlasting CFA, Affective emerged as an independent factor with only three of its
indicators confirmed (see Fig. 3). The items which did not load on this component are questions 15–18, all dealing
with teachers’ reflection on the emotional make-up of their students. The reason why these four indicators were
discarded in the confirmatory analyses – while the other three were confirmed – cannot be explicated in the light of
the current findings and further research is needed in this regard. In the meantime, item 1, belonging to the Practical
component of reflection, ‘I write about my teaching experiences in a diary or a notebook’, also did not survive the CFA
phase of the study. This can be partly justified with reference to the context of the present study and, more specifically, to
teachers’ lack of sufficient time and motivation to keep a diary. Besides, diary writing is not a very common practice
among many ESL teachers and the majority lack familiarity with the basic premises of this technique. Together, these
can explain why item 1 was not supported by our dataset in the CFA.
R. Akbari et al. / System 38 (2010) 211e227 223

7. Conclusion

In the absence of any other instrument measuring teacher reflection, we had to rely on our own data and it was not
possible to check the concurrent validity of the instrument with any similar measure. The true test of ELTRI’s
relevance and validity would be the results it will produce in the empirical studies that try to further validate its
structure, a move we strongly recommend. Given the interplay of contextual variables with knowledge and behavior of
language teachers, it would not be surprising if the model developed here undergoes modifications once applied in
different pedagogical contexts. Thus replication studies are encouraged since they can help to better operationalize
teacher reflection and refine the model’s factor structure.

Appendix I: Interview questions

1. Are you a reflective teacher?


2. How do you define reflective teaching? (Or what is your understanding of reflective teaching?)
3. What are the characteristics of reflection or of a reflective teacher? How should a reflective teacher reflect on his/
her practice?
4. What should a reflective teacher reflect on while developing a reflective teaching practice? (Or what constitutes
the content of reflective teaching?)
5. What components do you think are missing in the literature on reflective teaching, and should be included in this
practice?
6. What procedures should a reflective practitioner employ in his/her practice?
7. Some educators say that teachers should reflect only on their classroom teaching, whereas other educators say
that teachers should also take societal, political, and cultural influences into consideration. Where do you stand
on this issue? How do you define critical reflection?
8. What about moral/ethical aspects?
9. Do you think affective factors play a role in reflective practice?

Appendix II: The reflective teaching instrument

Dear respondent,
This questionnaire is devised with the aim of looking into your actual teaching practices as a professional teacher.
To that end, your careful completion of the questionnaire will definitely contribute to obtaining real data which is
crucial for more accurate findings. Therefore, please check the box which best describes your actual teaching
practices. The information will be kept confidential and will be used just for research purposes. Thank you very much
in advance for your time and cooperation.

References

Akbari, R., 2007. Reflections on reflection: a critical appraisal of reflective practice in L2 teacher education. System 35, 192–207.
Apple, M., 1979. Ideology and the Curriculum. Routledge & Kegan Paul, London.
Bartlett, L., 1997. Teacher development through reflective teaching. In: Richards, J.C., Nunan, D. (Eds.), Second Language Teacher Education. Cambridge University Press, New York, pp. 202–214.
Braun, J.A., Crumpler, T.P., 2004. The social memoir: an analysis of developing reflective ability in a pre-service methods course. Teaching and Teacher Education 20, 59–75.
Brown, J.D., 2001. Using Surveys in Language Programs. Cambridge University Press, Cambridge, UK.
Carter, K., 1993. The place of story in the study of teaching and teacher education. Educational Researcher 22, 5–12.
Cattell, R.B., 1966. The scree test for the number of factors. Multivariate Behavioral Research 1, 245–276.
Connelly, M., Clandinin, J., 1990. Stories of experience and narrative inquiry. Educational Researcher 19, 2–14.
Crandall, J.A., 2000. Language teacher education. Annual Review of Applied Linguistics 20, 34–55.
Davis, J., Adams, N., 2000. Exploring early adolescent identity through teacher autobiography. Middle School Journal 42, 18–25.
Day, C., 1993. Reflection: a necessary but not sufficient condition for teacher development. British Educational Research Journal 19 (1), 83–93.
Dewey, J., 1933/1993. How We Think: A Re-Statement of the Relation of Reflective Thinking to the Education Process. D.C. Heath & Co., Boston.
Dornyei, Z., 2003. Questionnaires in Second Language Research: Construction, Administration, and Processing. Lawrence Erlbaum Associates, New Jersey.
Farrell, T., 2004. Reflective Practice in Action: 80 Reflection Breaks for Busy Teachers. Corwin Press, California.
Fendler, L., 2003. Teacher reflection in a hall of mirrors: historical influences and political reverberations. Educational Researcher 32 (3), 16–25.
Fradd, H.D., Lee, O., 1998. Development of a knowledge base for ESOL teacher education. Teaching and Teacher Education 14, 761–773.
Giroux, H., McLaren, P., 1986. Teacher education and the politics of engagement: the case for democratic schooling. Harvard Educational Review 56, 213–238.
Goodman, J., 1986. Teaching preservice teachers a critical approach to curriculum design: a descriptive account. Curriculum Inquiry 16 (2), 179–201.
Griffiths, V., 2000. The reflective dimension in teacher education. International Journal of Educational Research 33, 539–555.
Halliday, J., 1998. Technicism, reflective practice and authenticity in teacher education. Teaching and Teacher Education 14, 597–605.
Hansen, D.T., 1998. The moral is in the practice. Teaching and Teacher Education 14 (6), 643–655.
Hansen, D., 2001. Exploring the Moral Heart of Teaching. Teachers College Press, New York.
Heubeck, B.G., Neill, J.T., 2000. Internal validity and reliability of the 30-item Mental Health Inventory for Australian adolescents. Psychological Reports 87, 431–440.
Hillier, Y., 2005. Reflective Teaching in Further and Adult Education. Continuum, London.
Ho, R., 2006. Handbook of Univariate and Multivariate Data Analysis and Interpretation with SPSS. Taylor & Francis Group, Boca Raton, Florida.
Howitt, D., Cramer, D., 2000. An Introduction to Statistics in Psychology: A Complete Guide for Students, second ed. Prentice Hall, Hemel Hempstead.
Hu, L., Bentler, P.M., 1999. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Structural Equation Modeling 6 (1), 1–55.
Inchausti, R., 1991. Ignorant Perfection of Ordinary People. SUNY Press, Albany.
Jay, J.K., Johnson, K.L., 2002. Capturing complexity: a typology of reflective practice for teacher education. Teaching and Teacher Education 18, 73–85.
Korthagen, F.A.J., 2001. Linking Practice and Theory: The Pedagogy of Realistic Teacher Education. Lawrence Erlbaum, Mahwah, NJ.
Korthagen, F.A.J., Wubbels, T., 1995. Characteristics of reflective practitioners: towards an operationalization of the concept of reflection. Teachers and Teaching: Theory and Practice 1, 51–72.
Kumaravadivelu, B., 1994. The postmethod condition: (e)merging strategies for second/foreign language teaching. TESOL Quarterly 28 (1), 27–48.
Kumaravadivelu, B., 2001. Toward a postmethod pedagogy. TESOL Quarterly 35 (4), 537–560.
Mulaik, S.A., Millsap, R.E., 2000. Doing the four-step right. Structural Equation Modeling 7 (1), 36–73.
Munby, H., Russell, T., 1990. Metaphor in the study of teachers’ professional knowledge. Theory into Practice 29, 116–121.
Murphy, J.M., 2001. Reflective teaching in ELT. In: Celce-Murcia, M. (Ed.), Teaching English as a Second or Foreign Language. Heinle & Heinle, Boston, pp. 499–514.
Pacheco, A.Q., 2005. Reflective teaching and its impact on foreign language teaching. Número Extraordinario 5. Available at: http://revista.inie.ucr.ac.cr/articulos/extra-2005/archivos/reflective.pdf.
Pennycook, A., 1989. The concept of method, interested knowledge, and the politics of language teaching. TESOL Quarterly 23 (4), 589–612.
Pica, T., 2000. Tradition and transition in English language teaching methodology. System 28, 1–18.
Pollard, A., et al., 2006. Reflective Teaching. Continuum, London.
Raubenheimer, J.E., 2004. An item selection procedure to maximize scale reliability and validity. South African Journal of Industrial Psychology 30 (4), 59–64.
Richards, J.C., Farrell, T., 2005. Professional Development for Language Teachers. Cambridge University Press, New York.
Richards, J.C., Lockhart, C., 1994. Reflective Teaching in Second Language Classrooms. Cambridge University Press, New York.
Richards, J.C., Rodgers, T.S., 2002. Approaches and Methods in Language Teaching. Cambridge University Press, Cambridge.
Schon, D.A., 1983. The Reflective Practitioner: How Professionals Think in Action. Basic Books Inc., New York.
Schon, D.A., 1987. Educating the Reflective Practitioner. Jossey-Bass, San Francisco.
Schon, D.A. (Ed.), 1991. The Reflective Turn: Case Studies in and on Educational Practice. Teachers College Press, New York.
Sharma, S., 1996. Applied Multivariate Techniques. John Wiley & Sons, Inc., New York.
Shultz, K.S., Whitney, D.J., 2005. Measurement Theory in Action: Case Studies and Practices. Sage Publications, Thousand Oaks, CA.
Smith, T.D., McMillan, B.F., 2001. A primer of model fit indices in structural equation modeling. Paper presented at the Annual Meeting of the Southwest Educational Research Association, New Orleans, LA.
Stanley, C., 1998. A framework for teacher reflectivity. TESOL Quarterly 32, 584–591.
Thiessen, D., 2000. A skillful start to a teaching career: a matter of developing impactful behaviors, reflective practices, or professional knowledge? International Journal of Educational Research 33, 515–537.
Valli, L., 1990. Moral imperatives in reflective teacher education programs. In: Clift, R.T., Houston, W.R., Pugach, M. (Eds.), Encouraging Reflective Practice: An Examination of Issues and Exemplars. Teachers College Press, New York, pp. 39–56.
Van Manen, M., 1977. Linking ways of knowing with ways of being practical. Curriculum Inquiry 6, 205–228.
Ward, J.R., McCotter, S.S., 2004. Reflection as a visible outcome for preservice teachers. Teaching and Teacher Education 20, 243–257.
Zeichner, K.M., Liston, D.P., 1987. Teaching student teachers to reflect. Harvard Educational Review 57 (1), 23–48.
Zeichner, K.M., Liston, D.P., 1996. Reflective Teaching: An Introduction. Lawrence Erlbaum Associates, Mahwah, New Jersey.
