
Studies in Educational Evaluation 64 (2020) 100831


Using data on school strengths and weaknesses for school improvement


Tali Aderet-German a,*,1, Miriam Ben-Peretz b
a Department of Education, Ben-Gurion University of the Negev, P.O.B. 653, Beer-Sheva 84105, Israel
b Faculty of Education, University of Haifa, Haifa, Israel

ARTICLE INFO

Keywords: School self-evaluation; School improvement; Data use; School change; School evaluation

ABSTRACT

This paper examines evaluation data use practices of a network of schools implementing an internal, independent, school self-evaluation process for more than a decade. This network currently uses data on its strengths alongside data indicative of its weaknesses, collecting and utilizing both positive and negative data for improvement and accountability purposes. We conducted multiple formal and informal interviews with 24 school management members and teachers, and gathered and analyzed 50 school documents in order to understand how the network used evaluation findings for school improvement. We identified types of data use described in the literature (instrumental, conceptual, and symbolic), and propose a complementary type of use we termed "reinforcement data use". Our findings suggest that identifying strengths is one of the valid goals and outcomes of evaluation, shedding light on its potential to enhance school ethos, and to promote a positive attitude toward evaluation processes and their subsequent effects.

1. Introduction

Over the past decade, the use of evaluation data in schools has become one of the most actively researched topics in education (see for example: Coburn & Turner, 2012b; Schildkamp et al., 2012b, 2013; Datnow & Hubbard, 2015a). The use of data refers to decision making in schools based on various types of data, such as assessment data, student demographics, and parent and student surveys. Data can be used for both accountability and improvement purposes (Schildkamp & Kuiper, 2010; Schildkamp, Earl et al., 2013). School improvement is commonly associated with educational change (Hopkins, Stringfield, Harris, Stoll, & Mackay, 2014), thus many of the ways in which schools use data entail change.

School evaluation data conveys strong possibilities for school improvement, yet schools face barriers to using evaluation findings due to teachers fearing evaluation processes (Conley & Glasman, 2008; Terhart, 2013), their limited capacity regarding data use (Datnow & Hubbard, 2015a), and the common misuse, and even nonuse, of data (Alkin & King, 2016). In this article we discuss the different ways a network of schools uses data indicative of its strengths, as well as its weaknesses.

We present a case study of "Rishonim" (all names in this paper are pseudonyms), an autonomous network of schools which has been conducting an internal, independent school self-evaluation (SSE) for more than a decade. The school network conducted evaluations on a wide variety of topics and, unlike many school evaluations (Marsh, 2012), was not limited to student performance data, providing a potentially fruitful platform to explore different types of use of evaluation findings. In light of the common barriers to data use (fear of evaluations, limited capacity, misuse and nonuse), Rishonim's ongoing efforts to use the data they collect stand out, leading us to the research question at the center of this study: How does an autonomous school network use data collected through the school's self-evaluation process?

The first part of this paper provides the theoretical background for the study, detailing the terminology used throughout the paper and elaborating on existing types of data use. This part examines the purpose each type of data use serves, to enable a more complete understanding of how schools use data in practice. The second part describes the context of the study, the SSE enacted at the school network at the center of this study. Following this background, we detail the methods we employed and present our analysis of the network's data use practices, which identified types of use described in the literature. In this section we suggest an additional, complementary type of data use that we termed "reinforcement data use". The fifth and final part of the paper discusses these findings, as well as the possible implications reinforcement data use has for school improvement.


* Corresponding author.
E-mail addresses: tali100@gmail.com (T. Aderet-German), mperetz@edu.haifa.ac.il (M. Ben-Peretz).
1 The study was conducted at University of Haifa, Haifa, Israel.

https://doi.org/10.1016/j.stueduc.2019.100831
Received 26 February 2019; Received in revised form 3 November 2019; Accepted 5 December 2019
0191-491X/ © 2019 Elsevier Ltd. All rights reserved.

2. Evaluation data use

One of the questions that must be addressed by school leadership at the end of an evaluation process is "How do we use the data?". This question is widely dealt with in the literature, referring mainly to the changes in schools and other organizations led by evaluation use (Alkin & Taut, 2002; Cousins & Leithwood, 1986; Johnson et al., 2009; Contandriopoulos & Brousselle, 2012). While many studies in this area deal with how teachers use data to improve their classroom instruction (for instance Assessment Reform Group, 2002; Bertrand & Marsh, 2015; Birenbaum, Kimron, & Shilton, 2011), this study follows Schildkamp, Earl et al.'s (2013) data use framework, which expands this focus on classroom use to explore the use of data by the school for informing policy decisions in addition to improving instructional practices.

Accordingly, in this paper we define data as "systematically collected information on students, schools, school leaders, and teachers, which can be obtained from qualitative (e.g., classroom observations) and quantitative (e.g., assessment results) methods" (Schildkamp, Lai et al., 2013, p. 137). In the school at the center of this study, the data was obtained through SSE, therefore we use the terms 'data' and 'evaluation data' interchangeably. Furthermore, we define data use (sometimes termed 'data-based decision-making', e.g. Schildkamp, Earl et al., 2013; Staman, Timmermans, & Visscher, 2017, or 'data-driven decision making', e.g. Mandinach, 2012; Reeves & Chiang, 2018) as the ways school leaders and teachers shape local policy and practice toward school improvement based on these data.

2.1. Main types of evaluation data use

There are several differentiations when examining the use of evaluation findings. The traditional three types are instrumental, conceptual, and symbolic uses (Alkin & King, 2016; King & Pechman, 1984; Leviton & Hughes, 1981).

Instrumental use refers to cases when the findings are used to directly influence action (King & Pechman, 1984; Leviton & Hughes, 1981). For example, a teacher decides to emphasize a certain topic which seems to have been misunderstood by the students, or a principal decides to host a lecture for the teachers on a pedagogical subject that is lacking according to a recent evaluation. Conceptual use refers to the influence of evaluation processes or their results over time on changing the thinking of users (King & Pechman, 1984; Leviton & Hughes, 1981). For instance, when teachers are involved in evaluating a program they are exposed to the details of the program, its advantages and limits, and consequently take a position as to its ability to enrich students. Symbolic use refers to the instances in which users apply the evaluation process for their own agenda, such as attaining political support or confirming a policy (King & Pechman, 1984; Leviton & Hughes, 1981). An example is a principal using student achievements on standardized tests to persuade stakeholders to support a newly implemented program which the principal has been promoting.

King and Pechman (1984) introduced two additional types of use in their analysis of a case study of a research and evaluation department of a large city: signaling use and charged use. Their discussion of the two types emphasized the interrelations and overlaps among the types they identified and the previous typology. Signaling use is viewed as a routine accountability signal to outside agencies, regarding the fulfillment of expectations and requirements of the system (King & Pechman, 1984). Charged use borrows a term from the field of chemistry: evaluation results are viewed as "positive" or "negative" and are used accordingly to initiate change. Negative data point to school weaknesses and areas in need of change. The way in which schools use positive data that highlight strengths is less developed in the theoretical literature on data use.

2.2. Using data in practice: purposes of evaluation data

A comprehensive review by Cousins and Leithwood (1986) of 65 empirical studies on evaluation data use found that users of evaluations should be involved in their design and implementation to ensure the reliability and relevance of the findings, creating a greater commitment to the use of the findings in practice. Johnson et al. (2009) corroborated these findings, adding the importance of stakeholder involvement in facilitating evaluation use.

Schildkamp and Kuiper (2010) reviewed studies regarding the purposes of data use by schools. Some of these purposes were intended to lead to improvement and change, for instance: instructional purposes; policy development and planning; shaping professional development; meeting accountability demands; legitimizing existing programs; motivating students and staff; and supporting personnel decisions. The researchers also identified other types of data use with different purposes, such as accountability demands and legitimizing actions.

Combining the two purposes of improvement and accountability is one of Nevo's (1995, 2006) arguments for a dialog between external and internal evaluation in schools. Nevo (2006) claims this dialog can enable schools to use various kinds of evaluation data, from a wide range of variables that are relevant and meaningful to the specific school's context, for both purposes.

Much of the recent work on data use in schools refers to the way teachers use data to improve their own instruction and their students' learning, focusing on data generated through tests and other evidence of students' learning (see for example: Coburn & Turner, 2012b; Schildkamp et al., 2012b; Datnow & Hubbard, 2015b; Van Geel, Keuning, Visscher, & Fox, 2016). Similarly, assessment for learning is a type of data use which refers to a form of formative assessment that incorporates teachers' continuous feedback to students as part of ongoing classroom practices and focuses on the quality of the learning process (Assessment Reform Group (ARG), 2002; Birenbaum et al., 2011; Wiliam, 2011; Heitink, Van der Kleij, Veldkamp, Schildkamp, & Kippers, 2016).

These types of data use are more local, and usually do not involve managerial decision-making or suggest policy change. Using data for whole-school development is a complex task that has many enablers and barriers. Schildkamp, Karbautzki, and Vanhoof (2014) found that the main barriers to data use were limited accessibility to high quality data, lack of professionalism in using data, and staff collaboration issues around the use of data. In their examination of schools' data use in five European countries they noted that "a positive attitude toward data use" was mentioned by respondents in all countries as an enabling influence on data use (p. 21). These findings concur with previous findings on the importance of principals' and teachers' positive attitude towards self-evaluation as a pre-condition for its successful enactment. They emphasize the wariness of school staff toward the process, explaining that "Self-evaluation is not a popular activity among many teachers because it is seen as a form of social control" (p. 27) and as a threat to their autonomy.

2.3. Evaluation data use and school improvement

As presented above, a considerable amount of literature has been published on evaluation data use. Fig. 1 presents a classification of the types of data use we reviewed, according to their purposes (i.e., accountability and improvement purposes) and the level of data sources used in the school (i.e., school-level and teacher-level).

These studies have typically examined the way in which schools use data on areas in which they are lacking in order to initiate change, improve practices, and support decision-making (Coburn & Turner, 2012a; Hargreaves, Lieberman, Fullan, & Hopkins, 2010; Hopkins & West, 2002; Schildkamp, Ehren, & Lai, 2012; Schildkamp & Kuiper, 2010; Chapman & Sammons, 2013).


Fig. 1. Classification and examples of types of data use, according to their purposes and the level of generation of data used.

Scholars inherently link school improvement with educational change (de Clercq, 2007; Fullan, 2006; Hopkins & West, 2002; Hopkins, Stringfield, Harris, Stoll, & Mackay, 2014). Does school improvement implicitly entail change? When the data gathered implies that an issue requires the school's attention and action, the answer to this question is yes. However, this raises a follow-up question: how do schools use data that identifies and confirms school strengths?

2.4. Identifying school strengths

MacBeath, Schratz, Meuret, and Jakobsen (2000) presented an instrument consisting of a set of twelve areas of school life as the starting point of a school evaluation process. One of the aims of this instrument was to help the schools "identify and prioritize areas for deeper inquiries" (p. 97). This instrument served as the starting point for systematic investigation and evidence collection. MacBeath et al. (2000) found that the areas chosen for further inquiry included issues on which there was a wide divergence of opinion, areas with an apparent absence of evidence, areas with an obvious weakness, or a practice the school wished to enhance. The Scottish HM Inspectorate of Education (HMIE) called this approach a "proportionate approach… [since it] enables you to focus on areas of priority rather than routinely covering all aspects of the school's work in turn" (HMIE, 2007, p. 15). School evaluation frameworks by both the HMIE (2007) and MacBeath et al. (2000) included indicators concerning the stronger school practices; however, the focus in both frameworks was on uncovering problematic areas.

An example of a study which concentrated mainly on school strengths is Lawrence-Lightfoot's (1983) study, in which she constructed portraits of high schools, selecting "good high schools" as identified by faculty, students, parents, and communities. These portraits explored the "goodness" of schools "…that refers to what social scientists describe as school's 'ethos', not discrete additive elements. It refers to the mixture of parts that produce a whole." (p. 23). Similarly, research approaches such as the "success case method" (SCM; Brinkerhoff, 2003) and Marzano's review of What works in schools (2003) highlight the positive impact which research and evaluation can have in practice.

2.5. How could schools use evaluation data on school strengths?

While many researchers recognize the utility and advantages of learning from school strengths (for example Brinkerhoff, 2003; MacBeath et al., 2000; Marzano, 2003; Reynolds, 2010), they are not always clear as to how the schools themselves should use this data. The orientation of these studies is in contrast with the claim of Bubb, Earley, Ahtaridou, Jones, and Taylor (2007) that one of the aims of SSE is "shining a light into dark corners" (p. 35), requiring schools to pay attention to neglected areas in order to inform the school improvement plan, together with data gathered on school strengths. Similarly, Reynolds (2010) views identifying "the excellent departments" (p. 605) in school as an important part of school improvement, but neither study details how information on school strengths could be later used.

In a project involving 101 schools from eighteen European countries, one of the SSE benefits noted by MacBeath and colleagues (MacBeath et al., 2000) was the affirmation of "the good thing the schools were doing, recognizing and celebrating aspects of school life which had previously gone unnoticed" (p. 184). Yet, while acknowledging the importance and value of identifying school strengths, MacBeath and colleagues do not explain how to use the school strengths which have been identified.

The conclusions from these studies (Bubb et al., 2007; MacBeath et al., 2000; Reynolds, 2010) emphasize the significance of understanding the ways in which schools can benefit from positive SSE findings, calling for further analysis of positive data use in schools. The current study follows this call by attending to the ways schools can use data that confirm school strengths.

3. Study context: "Rishonim" school network

This paper presents data from a case study of a network of schools called "Rishonim", a semi-private Israeli school network. Semi-private schools in Israel are partially funded by the Ministry of Education, and are also allowed to collect tuition from the parents of the students. Students enrolled in the school come from a wide geographical area, and it is a highly reputed and in-demand school, consistently doing well in external and internal evaluations.

The network employs approximately 340 teachers, who teach 4000 students between the ages of five and eighteen, at six different branches located on four campuses. The school network is headed by a Chief Executive Officer (CEO) and has a pedagogical department headed by a Chief Pedagogical Officer (CPO).

Rishonim has been engaged in independent SSE for the past decade, and for the past six years has employed a full-time skilled evaluation coordinator who leads and implements the evaluation processes undertaken by school management.

The SSE at Rishonim addresses various subjects, such as the effectiveness of homework, student learning characteristics and preferences, school climate from the perspective of both teachers and students, and many other aspects of school life. The evaluation coordinator employs qualitative and quantitative methods, such as interviews, focus groups, and questionnaires, as suitable to the objectives of the evaluations. Usually, the evaluation method is designed by the coordinator, adapting existing tested methods to the particular school context, or constructing new instruments to address specific issues. Thus, the SSE enacted at Rishonim is not based on a predetermined evaluation model, and is aligned to the research questions the school management decides upon.

4. Methods

The findings presented in this paper are based on data from a case study that utilized several qualitative methods in order to investigate the SSE process at the "Rishonim" network of schools.

4.1. Participants and sources

We analyzed school documents and conducted semi-structured interviews with school management and key figures in the school evaluation process, as well as with teachers from two of the network's branches: a junior high school (i.e., grades seven to nine) and an elementary school (i.e., grades one to six). The interviews were conducted with 24 participants: five teachers from each branch, and 14 management-level administrators (i.e., principals at different levels of the school hierarchy and other non-teaching staff members who were key figures in the school network).

Participants were purposively sampled (Patton, 1990) for variation in role in school and involvement in the SSE. As Rubin and Rubin (2012) recommended, the participants in this case study were not selected to be representative of the whole school staff population, but rather on the basis of their ability to illuminate issues. Participants were suggested by the school management and by other participants, as well as chosen to participate in the study because of their role in the school.

The length of the semi-structured interviews varied from 45 min (the shortest interview) to two hours (the longest interview); the average interview was approximately an hour long. These formal interviews consisted of questions focused on understanding the participants' perspectives on the SSE processes in all their phases, for instance, what they thought of the topics which were evaluated, how the process was actually implemented, how much they were a part of the design of the process, and how the findings were communicated. The participants were asked to elaborate on SSEs in which they were involved, how they viewed the process, and how the findings were useful in the classroom or school context, if at all. Interview transcripts were based on notes taken during the interview, completed in detail by the researcher on the same day as each interview.

Many of the participants were interviewed, formally and informally, several times, in order to follow up and elaborate on underdeveloped issues (see Table 1 for precise details regarding the number and type of interviews). These interviews and informal conversations were opportunities for the participants to expand on ideas from their previous interview, and for the researcher to ascertain the interpretations made from the previous interview. These follow-up interviews had another major benefit, as they facilitated the formation of a trusting and listening-oriented "research relationship" (Josselson, 2013, p. 24) between the researcher and the participants.

An additional method used for understanding the way in which the school utilizes data was gathering and analyzing approximately 50 school documents. The documents included presentations, reports and other material on the SSE, reports to the school board, booklets produced by the school with regard to school programs and activities, and school documentation (e.g., the school newsletter) available online about school activities and school life in general. These documents shed light on the context, background, and outcomes of the evaluation process enacted at Rishonim.

4.2. Data analysis

We analyzed the data using thematic analysis, identifying recurrent themes in the texts transcribed from the interviews with the participants, as well as those in the first author's field notes on the interviews (Ryan & Bernard, 2003). The analysis was carried out manually, using iterative readings to search for evidence of the three main types of data use presented above (instrumental, conceptual, and symbolic), allowing for the addition of emerging new types. This integration of deductive and inductive analysis (Patton, 1990) enabled us to expand the existing typology of data use found in the literature.

Ryan and Bernard (2003) suggest reading the text over and over in order to identify themes through topics that recur throughout the text; "the more the same concept occurs in a text the more likely it is a theme" (Ryan & Bernard, 2003, p. 89). On the first reading, major themes (i.e., data use types) can be quickly marked; in the next stage less obvious themes can be uncovered. In qualitative research the process of analyzing data is an ongoing endeavour, commencing at the data collection phase and proceeding throughout the remainder of the study. During the course of the data collection and the analysis process, the first and second author discussed the ideas and themes that were constructed, facilitating both a better understanding of the data and triangulation of our interpretations.

In addition, we implemented another triangulation strategy by collecting data from different sources in the school. We analyzed the school documents we collected as to the context they gave to understanding the SSE enacted at Rishonim, and specifically the network's data use practices. These documents provided information on school policy and infrastructure, allowing us to triangulate our thematic interview analysis and form a comprehensive picture of how the network used data. We matched each data use example our participants described to the documents that could complement it. These data aided us in developing questions for the follow-up interview with the participant. Thus, our findings are presented in this paper through excerpts from interview transcripts which represented the types of data use we identified, while document data was used at the analysis stage.

Special attention has been given in this study to the researchers' position in the school. The first author had no formal status in the school and developed relationships with the school staff through regular interviews and visits throughout the two-year data collection, spending many hours in the school getting to know the staff and the school context. The school management and staff cooperated fully and willingly with the research and data collection. This allowed the first author to develop a unique "insider-outsider" point of view.
Table 1
Number and type of interviews with teachers and management members.

                               One semi-structured   Two semi-structured   Three semi-structured      Documented informal
                               interview             interviews            interviews and more        conversations
Management members (n = 14)    2                     8                     4                          10
Teachers (n = 10)              2                     8                     0                          8
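To make the combination of deductive and inductive coding described in Section 4.2 more concrete, the sketch below tallies how often keywords associated with the three deductive codes occur across interview texts, and flags frequently recurring uncoded terms as candidate emergent themes. It is an illustration only, under assumed inputs: the transcripts, codebook keywords, and threshold are hypothetical, and the authors' analysis was carried out manually rather than computationally.

from collections import Counter

# Hypothetical codebook: keywords standing in for the deductive codes
# (instrumental, conceptual, symbolic) described in Section 4.2.
codebook = {
    "instrumental": ["set goals", "changed practice", "added"],
    "conceptual": ["discussion", "awareness", "thinking"],
    "symbolic": ["support the decision", "legitimize", "show stakeholders"],
}

def tally_codes(transcripts, codebook):
    # Count keyword occurrences per deductive code across all transcripts.
    counts = Counter()
    for text in transcripts:
        lowered = text.lower()
        for code, keywords in codebook.items():
            counts[code] += sum(lowered.count(keyword) for keyword in keywords)
    return counts

def candidate_emergent_terms(transcripts, codebook, min_occurrences=2):
    # Flag frequent terms not covered by the codebook as possible new themes,
    # mirroring the idea that recurring concepts are likely themes (Ryan & Bernard, 2003).
    words = Counter()
    for text in transcripts:
        words.update(word.strip('.,;"()').lower() for word in text.split())
    known = {keyword for keywords in codebook.values() for keyword in keywords}
    return [word for word, count in words.most_common()
            if count >= min_occurrences and len(word) > 4 and word not in known]

# Hypothetical interview excerpts (not actual study data).
transcripts = [
    "We used the findings to set goals and changed practice during recess.",
    "The survey mainly raised awareness and started a discussion; we sustain what works.",
    "It helped us sustain good practices and show stakeholders we are on track.",
]

print(tally_codes(transcripts, codebook))               # tally of the deductive codes
print(candidate_emergent_terms(transcripts, codebook))  # e.g., 'sustain' surfaces as a candidate theme

In this toy run, the recurring, uncoded term "sustain" would surface as a candidate for a new code, analogous to how the "sustaining and improving" language led the authors to the reinforcement data use category.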


The second author has a long-time close acquaintance with the school, in various roles, thus adding her own "insider-outsider" point of view from a different, longitudinal, perspective. This involvement in the school's life provided the study with an insider's point of view on the school's vision and history. The combination of this insider's point of view and the first author's insider-outsider knowledge of the school aided in the analysis process and in ensuring the validity of the findings (Surra & Ridley, 1991).

5. Findings: types of data use at Rishonim

In order to answer the question "How does Rishonim use self-evaluation data?", data from this case study was categorized according to the three traditional types of data use: instrumental, conceptual, and symbolic use. These terms have been repeatedly examined in the literature since their initial introduction (Alkin & King, 2016; Alkin & Taut, 2002; Cousins & Leithwood, 1986), and expanded and evolved by leading scholars in the field (Johnson et al., 2009; Shulha & Cousins, 1997). Accordingly, in addition to identifying these three traditional data-use types, the analysis process described above identified an additional type of data use, namely, "reinforcement data use". In this section we present each of the four types of data use, and demonstrate the various ways in which they were enacted following the SSE process.

5.1. Instrumental data use: action-oriented SSE

Instrumental data use occurs when the evaluation findings lead directly to specific changes in practice. In this case study we found instrumental data use both by the teachers and by the management of Rishonim.

One of the recurring issues described by teachers in their interviews was their use of SSE data to actively change specific classroom and school conduct. For instance, one of the teachers said, "I use evaluation findings [on school violence] to set specific goals for my class, in order to change recess activities" (F²). Another example of teachers using data to change and improve the school in a concrete way was given by a teacher who described a problem the school had with the cafeteria lines, where they were constantly overcrowded and caused many students to be late to classes. The teacher explained how, following an evaluation process, the problem and possible solutions were deliberated with the students and a practical course of action was found: the cafeteria added another cashier, and the student council organized timetables for priority in the lines in order to control the flow of students.

The management emphasized the importance of using the knowledge created from SSEs: the CPO stated, "when you ask questions, you must use the answers; asking in order to know is not enough". In line with this, the CEO said, "The challenge is not only to conduct a survey, but to do something with the findings". This perception of the management is mirrored in one of the teachers' interviews:

The surveys aren't done in vain, they are used for improvement, and that is very important to us – there is serious consideration [of the findings]. That is how it should be done, and if you give feedback then there must be a reference to this feedback – it is supposed to help growth (S).

The school administration aimed to generate evaluation findings that could be used in the short term by the teachers, for instance, by considering the options available for acting upon evaluation findings when selecting questionnaire items. The CPO emphasized, "I can't ask about something that can't be dealt with this year", referring to not asking teachers if they were happy with the availability of computers for their use in the teachers' lounge, since there were no resources for adding computers if the answers pointed to such a need.

Another strategy for generating findings usable by the teachers in the short term is to provide them with specific data on their practices. For example, one of the school's recent pedagogical changes involved teachers implementing a mentorship program for every student. This program is accompanied by continuous periodical evaluations, in which the students are asked about their relationship with their mentor-teacher. Each mentor receives their own mentees' confidential evaluation, to which only they have access, so that they can determine the best way to proceed with their role as mentors.

5.2. Conceptual data use: using accumulative knowledge to change conceptions

Conceptual data use addresses instances when the process of the evaluation, as well as its results, are used to change the users' thinking about the issues examined, although evaluation data is not used directly in order to change specific practices. Teachers referred to the way in which the school management used SSE data to initiate discussion: "We always learn about the findings in our staff meetings, so it feels we know what is done with the questionnaires we and the students answer." (L). The school staff has routine weekly meetings in each of the branches in order to provide a consistent venue for these discussions. In addition, once every few months the pedagogical department organizes whole-school staff meetings in which various issues are presented and debated.

In recent years Rishonim has been gradually implementing a school-wide pedagogical and organizational reform. When asked about the origins of the reform, the management interviewed could not point to any one evaluation process which had pointed to an immediate action the school needed to take toward change, or which had provided evidence of its direct necessity. It may be that the accumulated knowledge gained over the years of SSE, as well as the revelations from some of the findings, raised the need for profound change which was answered by the school-wide reform.

One of the processes which could exemplify the incremental power SSE has on the school's decision-making was described by several management-level study participants. At the center of a specific SSE process was an evaluation of students' ways of learning, which showed the diversity of the students' preferences and the limited variety of opportunities these preferences were given for expression. The participants claimed this process led to the understanding that the school needs to find ways to enable teachers to allow for diverse learning tendencies. This conclusion may have been one of the SSE findings which initiated the discussion with regard to profound changes in pedagogic and organizational aspects of the school. The findings from these, and other, SSE processes were viewed as part of an accumulative sequence which led to the school-wide reform, as one of the management members said: "The reform is based on the continuum of evaluations we executed" (D).

An evaluation of homework practices as perceived by teachers, parents, and students serves as an example of an evaluation with findings that caught the management by surprise, and led to a process of discussing the school's educational objectives. This evaluation revealed that most of the high school students were struggling with homework overload and were aided by private tutors. One of the school leaders stated: "I was rattled by these findings [of the evaluation on homework practices], it required a deep thinking, I had no idea how many students were assisted by private tutoring lessons" (N). Another school leader explained that "these disturbing findings …led to a professional symposium on this subject, involving the school community: teachers, parents, and alumni, using the presentation of the findings as a motor to start a discussion on how to change the situation, and to build the awareness of teachers." (E). A central school leader concluded that "the product [of this evaluation process] is the discussion and the awareness of staff and parents [with regard to this issue]" (N).

² These unique codes are used to allow the reader to differentiate between the participants (teachers and management members).


5.3. Symbolic data use: validating prior decisions

Symbolic data use refers to data used to retroactively support decisions already made, as well as to political and personal use of findings. Referring to the school-wide reform initiated by the management, a school leader explained how the school management used surveys on the new programs as a legitimizing tool in order to "ignite" (in his words) the innovative reform, and to support the new initiative by providing information on the process to stakeholders such as parents and the school board.

A member of the school's management described one of the ways the findings could be used politically and in order to support existing processes:

The school has implemented significant procedures…and this is the role of the surveys, to support the direction the school wants to go, or suggest new paths. If you want to start a communication major in school…a survey can support or reject this direction because you check with the teachers, the parents and the students if there is interest and value in this initiative, and according to the outcome decide if you implement. It gives corroboration… the survey is not the only source leading to new directions, but it has significance, it can support or confirm… [the school-wide reform] was not initiated because of surveys, but rather they started [the reform] and learned from the evaluation how it is done, and then fine-tuned it. (K).

This school leader also emphasized the importance of the background for the evaluation process, and how it was initiated after an unstable era in the school's history, both economically and organizationally. The school needed a tool to verify that it was on the right path, and the school's stakeholders wanted an accountability tool.

Another example of symbolic data use refers to the use of an evaluation process on teachers' perceptions of the school as a way to underscore the management's commitment to the teachers in the school. It was important for a new CTO of the school to highlight how important and central teachers are to him specifically, and to the school in general, and conducting a large comprehensive SSE on their perceptions was a means not only for understanding their needs and views, but also a way to underscore their significance to him.

5.4. Reinforcement data use: the value of identifying strengths

Reinforcement data use is the fourth type of data use that we are introducing, having emerged from the inductive thematic analysis. This type is similar to conceptual data use as it refers to a change in people's thoughts and beliefs and is not directly linked to an explicit action. Whereas conceptual data use involves using the process, or its results, for gradual conceptual change, which is often inadvertent, reinforcement data use involves a conscious, intentional act of utilizing evaluation findings on school strengths, rather than a gradual change in perceptions (as in conceptual data use).

5.4.1. Unofficial shared language recognizing strengths alongside weaknesses

During data collection in this case study, it became clear that Rishonim had an unofficial shared language. Many of the participants, independently and spontaneously, used the same terms to refer to activities related to the SSE process. One of these terms (in Hebrew, the language in which the interviews were conducted) was "Leshamer ve Leshaper", an expression that means sustaining and improving. This expression is commonly used in schools when assessing strengths and weaknesses of student achievements, but usually not to describe school processes. Both teachers and management members used this expression to describe SSE as a process of sustaining and improving school practices. One of the new teachers used it when describing the weekly staff meeting in which the teachers learned about SSE findings:

I answered the survey, and I saw the results at the teachers meeting – the results were projected on the big screen, and the management didn't only present them – 'come see the results', but also – 'let's see, where were we wrong, where were we right, what should we sustain and what improve.' (L)

An experienced teacher described her approach to the SSEs and what is learned from them, using the same expression:

I am all for the surveys etc., it is very convenient that we have a way of knowing what to improve and what to sustain….I know that lessons are learned, and I benefit from it. There is feedback to the teachers, for instance there was a whole staff meeting [on one of the SSE findings]… we can see that someone has thought this through – anyway, it is a way to ventilate your feelings, it doesn't have to imply significant change all the time (R)

Teachers and management members used this expression to explain how the SSE was used not only for changing and improving bad processes and practices, but also to intentionally recognize, understand, and celebrate what is working and what the school should keep doing.

5.4.2. Three ways to use data on school strengths

Our findings suggest three ways in which Rishonim school staff members used positive data: 1. recognizing and sustaining good practices; 2. enhancing a sense of pride; and 3. promoting a positive attitude toward evaluation. As presented in the previous section, Rishonim staff viewed the evaluation process as a process aimed at improvement, but not necessarily entailing change. Many participants explained that evaluation findings included both good and bad aspects, and that both positive and negative findings should be taken into consideration when setting the school agenda. When discussing the use of evaluations including positive data, one of the principals described his goals for evaluation, stressing the need for reinforcing existing practices by systematic evidence: "Evaluation is multidimensional and should answer the questions: Am I happy and satisfied with the situation? What am I happy with? Is it what we aimed for? [The last survey] gave me corroboration to the climate I am leading." (A). This principal obtained positive and negative evaluation findings and used both to decide how to continue leading the school while aiming for continual school improvement. This example demonstrates how positive evaluation data could be used to better the school: by understanding which practices worked, the principal was able to reinforce them.

One of the management members explained the importance of receiving data on both the good and bad aspects of the school as being accountable to stakeholders in terms of areas in need of improvement, while also publicizing and celebrating its strengths:

We want to really understand, not from informal conversations – to understand where we need to improve and what we are good at. What we do well, what less so and needs improving. It is essential for us, also because of the school's special situation in the education system [as a semi-private school]. (O)

Several other administration members expressed the value of SSE outcomes that were not straightforward change: "The outcome is discussion and awareness…a sense of pride in what we are" (D). This view was also evident in the process of choosing the questions for an evaluation of teachers' perceptions of the school. The administration added survey items that concerned strengths: school ethos, goals, and vision, as well as perceptions of collegial work at the school. These areas were known to be school strengths, as opposed to areas the management felt were in need of change, such as the issue of trust and communication with the management, as well as professional development options for teachers.

One of the principals, who was previously a teacher in the school network, explained the importance of emphasizing strengths when presenting SSE findings:


We showed the staff what our strengths were, what the good things were. If you want the staff to cooperate, and to see the benefits of cooperation, you need to tell them when they are good at cooperating. I think it strengthens the solidarity, 'esprit de corps' (team spirit), gives a sense of fulfillment and worth when you see the good things that come up. It also puts the bad in perspective – this is a good school and good things happen here, it is important to have this feedback. (Y)

In addition to explaining how SSE can strengthen team spirit and school staff solidarity, this principal adds another benefit to recognizing school strengths through SSE: giving the staff the opportunity to see a more comprehensive picture of school activity. In his opinion, a more positive attitude towards evaluations is facilitated by the acknowledgement, through evaluations' findings, that "good things happen" alongside that which needs fixing.

It is important to note that the school network management received periodic updates on the evaluation processes implemented at the school, and that the reports explicitly stated both negative and positive findings. The management was proud of the school's progress and the satisfactory findings (see Study Context), but at the same time they were careful to question the methodology employed in order to ascertain the reliability of these positive findings.

6. Discussion

This case study examined how SSE data on school-wide issues, beyond individual instructional practices and student achievements, are used for different purposes. The findings we presented in the previous section answer the question "How does Rishonim use self-evaluation data?" by elaborating on the various ways in which the school enacted the three traditional types of data use – instrumental, conceptual and symbolic – and by proposing a fourth type, reinforcement data use. This section discusses the implications for research and practice of the fourth type of data use we identified.

6.1. Four types of data use based on data on school strengths and weaknesses

Our findings regarding Rishonim's data use suggest that data collected by and for schools could be differentiated into two categories (see Fig. 2): "positive data", which point to school strengths, and "negative data", which point to school weaknesses. This is similar to King and Pechman's (1984) reference to the term "charged data use", which views evaluation data as positive or negative, using both to initiate change. Fig. 2 shows an overview of how these two categories of data could be used internally and externally, and how this relates to the four types of data use we presented in the findings.

Instrumental data use is usually based on negative data (data indicative of school weaknesses) in order to change existing practices at either the teacher/classroom level or at the school level. Positive data (data indicative of school strengths) does not have a part in instrumental data use, as it does not point to areas in need of improvement and does not relate directly to an action. Conceptual data use also focuses on negative data in order to change beliefs held on specific issues, but could use positive data for this goal. Symbolic data use occurs when the school or teacher uses evidence retrospectively in order to justify actions.

Reinforcement data use, the fourth type of data use identified in this case study, is the use of positive data for reinforcing existing school strengths. This kind of use and its meaning for schools has been infrequently discussed in the literature, yet it could be used as an important tool with which schools can enhance their ethos and team spirit. One of the barriers to school evaluations is the staff's resistance to them, associating evaluations with ways to control and inspect them rather than with tools for improvement (Bubb et al., 2007; Vanhoof et al., 2009). It seems that reinforcement data use has the potential to dissolve some of this resistance and enable ongoing non-intimidating evaluations that are not perceived as inspections.

It is important to note that we do not claim that reinforcement data use is opposed to current typologies. Rather, our findings suggest that this type of use could be viewed as complementary to existing types, facilitating a more comprehensive understanding of data use both theoretically and in practice. In a similar way that Zwart et al.'s (2015) conceptual framework on professional learning explicitly addresses the value of confirming existing conceptions, reinforcement data use highlights this aspect in data use frameworks, as well as the affordances of intentionally using data on school strengths.

6.2. Using data on school strengths to enhance school ethos

When discussing the goals of evaluation, the literature notes that evaluation can reveal strengths as well as weaknesses (Bubb et al., 2007; HMIE, 2007; MacBeath et al., 2000; Reynolds, 2010). One of Leithwood et al.'s (1998) findings when examining conditions that foster organizational learning in schools refers to using positive data: "Teachers' commitments to their own learning appeared to be reinforced by shared celebrations of successes by staff…" (p. 263). Nonetheless, most of the research in the field argues for the need to change weaknesses and strive for improvement, without explaining what could be done when strengths are identified. This study's findings show that the management at Rishonim deliberately collected data that was meant to uncover positive aspects of the school, in order to confirm and celebrate them. Thus, along with conducting evaluations that were meant to improve the school and change areas that were identified as weaknesses through the collection of negative data, the school network systematically and intentionally collected positive data and communicated it to the school community.

Teachers, administration, and other key figures in the school's self-evaluation process emphasized the importance of knowing what they should sustain and what they should change, providing multiple examples of both. Some of the examples of change were of actions limited to specific contexts, such as adding teachers to yard duty during recess to prevent violence, while some were of broad educational policy decisions, such as the network-wide reform. The examples of how data was used to sustain practices focused on the collection and use of positive data, which expressed the school leadership's agenda of enhancing school ethos and team spirit. The school ethos at Rishonim celebrated the network's high academic achievements, but also the emphasis the school gave to character education and values, striving for educational excellence and innovation in all aspects of school life. The school network emphasized the importance of cherishing strengths, and the school evaluation process identifying positive data enabled this ethos to be uncovered and celebrated. This approach to using positive data coincides with and expands upon arguments made by previous researchers with regard to the importance of identifying the good aspects of schools in order to build on these aspects for improvement (Brinkerhoff, 2003; Lawrence-Lightfoot, 1983; Marzano, 2003).

6.3. Limitations and implications for further research

This study followed the need to expand the research on data use practices (Coburn & Turner, 2012a; Johnson et al., 2009) by gaining insights into the various ways data was used by a network of schools conducting independent evaluation processes. However, the methodological choices that were made entail limitations that should be addressed by future research. This case study relies on qualitative data collected in a unique network of schools. The specific characteristics of this network, as well as the methods of the study, limit the causal generalizability of the findings.


Fig. 2. Classification of types of data use, according to the kind of data (“positive data” and “negative data”), and the kind of data use (internal and external).
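The classification Fig. 2 depicts, as described in Sections 6.1 and 6.4, can be summarized as a small lookup structure. The sketch below is illustrative only: it is derived from the prose description rather than reproduced from the figure, the enum and function names are hypothetical, and the pairing of negative data with external, symbolic use is an assumption based on the discussion of accountability-oriented use rather than an explicit claim in the text.

from enum import Enum

class DataKind(Enum):
    POSITIVE = "data indicative of school strengths"
    NEGATIVE = "data indicative of school weaknesses"

class UseContext(Enum):
    INTERNAL = "internal use, aimed at improvement"
    EXTERNAL = "external use, aimed at accountability"

# Data use types typically associated with each combination, following the
# prose of Sections 6.1 and 6.4. The NEGATIVE/EXTERNAL cell is an assumption.
TYPICAL_USES = {
    (DataKind.NEGATIVE, UseContext.INTERNAL): ["instrumental", "conceptual"],
    (DataKind.NEGATIVE, UseContext.EXTERNAL): ["symbolic"],
    (DataKind.POSITIVE, UseContext.INTERNAL): ["reinforcement", "conceptual"],
    (DataKind.POSITIVE, UseContext.EXTERNAL): ["symbolic"],
}

def typical_uses(kind, context):
    # Return the data use types the discussion associates with this combination.
    return TYPICAL_USES[(kind, context)]

# Example: positive data used internally corresponds to reinforcement data use.
print(typical_uses(DataKind.POSITIVE, UseContext.INTERNAL))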

However, Maxwell (2012) emphasized that "causal generalizations are necessarily secondary to particular causal understanding" (p. 657). Thus, this case study adopts Maxwell's approach in exploring the specific conditions and context in which a network of schools successfully implemented an ongoing systematic self-evaluation process, to understand how the network used data for school improvement. Future research could explore the concept of reinforcement data use in other organizations and contexts.

In addition to the study's setting, another limitation of the study concerns the interview transcription process. Experts in the field explicitly recommend not taking notes during an interview in order to promote concentration and rapport, viewing audio recording as an obvious and necessary method (Josselson, 2013). However, following several participants' refusals to be recorded, the current study was compelled to carry out most of the interviews without audio-recording them. Interview transcriptions were based on handwritten notes taken during the interview and completed in detail later that day by the researcher. Despite this drawback, and partly due to the multiple, lengthy meetings conducted with many of the participants, the majority of the interviews were aligned with Josselson's (2013) main characteristic of "good" interviews – creating an empathic and dynamic "research relationship" (p. 24).

In this paper we examined data use mainly from the perspectives of the leaders of Rishonim, as the data we collected emphasized this perspective. Further research could examine the impact of reinforcement data use on teachers' ownership of evaluation processes, especially in light of the current study's findings on the affordance of this type of use for the acceptance of evaluation processes, and in light of emerging participatory models of evaluation.

6.4. Implications for policy and practice

Fig. 2 demonstrates possible ways schools can use different types of data for different types of purposes: accountability and improvement. Both purposes can be based on data that highlights school strengths (i.e., positive data) alongside the traditional use of data pointing to school weaknesses (i.e., negative data). Using internal and external data on school weaknesses is widely discussed in the literature on school evaluation and school improvement (for instance Coburn & Turner, 2012a; Hargreaves, Lieberman, Fullan, & Hopkins, 2010; Hopkins & West, 2002; Schildkamp, Ehren & Lai, 2012; Schildkamp & Kuiper, 2010). This type of data is usually used conceptually and instrumentally in order to change areas in which the school is lacking. External use of positive data, aimed at accountability, is typically symbolic and political, either promoting the school to various stakeholders, or using the data to legitimize actions after they are carried out. As discussed previously, internal use of positive data, aimed at improvement, could enhance school ethos, support educational innovations, and has meaningful practical implications for organizations.

An important practical implication of reinforcement data use is the potential positive data has in enabling ongoing evaluation processes. The findings of this study suggest that by using positive data in conjunction with negative data, school evaluations were perceived as a process oriented to learning and improvement rather than an accountability process with an inspection orientation. Acknowledging and sustaining strengths encouraged teachers to accept the evaluation process, and enabled educational innovations building on the strong aspects of the school, reducing the threatening facet of evaluations previous research has noted (for example Conley & Glasman, 2008; Terhart, 2013; Vanhoof et al., 2009).

The concept of reinforcement data use has implications for practice in schools. It raises questions regarding the role of evaluators in presenting and communicating positive data together with negative data, rather than focusing exclusively on school weaknesses. This study sheds light on the importance of the school management's awareness of the potential in utilizing positive data as a constructive force in school. These findings may help us understand better ways to use data in schools and other organizations for accountability and improvement purposes, recognizing that identifying strengths is one of the valid goals and outcomes of evaluation aimed at improvement. Yet, it is important to keep in mind that in order to fully benefit from the use of data on school strengths, schools need to have access to high quality data on school weaknesses that point to areas in need of improvement (as emphasized by Schildkamp & Kuiper, 2010). Using data only for reinforcement limits the purposes of data use. Both practice and further research may benefit from examining how data indicative of school strengths could be used for school improvement and for facilitating continuous non-threatening evaluations.


References

Alkin, M. C., & King, J. A. (2016). The historical development of evaluation use. The American Journal of Evaluation, 37(4), 568–579. https://doi.org/10.1177/1098214016665164.
Alkin, M. C., & Taut, S. M. (2002). Unbundling evaluation use. Studies in Educational Evaluation, 29(1), 1–12. https://doi.org/10.1016/S0191-491X(03)90001-0.
Assessment Reform Group (ARG) (2002). Assessment for learning: 10 principles. Cambridge, UK: University of Cambridge School of Education.
Bertrand, M., & Marsh, J. A. (2015). Teachers' sensemaking of data and implications for equity. American Educational Research Journal, 52(5), 861–893. https://doi.org/10.3102/0002831215599251.
Birenbaum, M., Kimron, H., & Shilton, H. (2011). Nested contexts that shape Assessment for Learning: School-based professional learning community and classroom culture. Studies in Educational Evaluation, 37, 35–48. https://doi.org/10.1016/j.stueduc.2011.04.001.
Brinkerhoff, R. (2003). The success case method: Find out quickly what's working and what's not. Berrett-Koehler Publishers.
Bubb, S., Earley, P., Ahtaridou, E., Jones, J., & Taylor, C. (2007). The self-evaluation form: Is the SEF aiding school improvement? Management in Education, 21(3), 32–37. https://doi.org/10.1177/0892020607079991.
Chapman, C., & Sammons, P. (2013). School self-evaluation for school improvement: What works and why? England: CfBT Education Trust.
de Clercq, F. (2007). School monitoring and change: A critical examination of whole school-evaluation. Education as Change, 11(2), 97–113.
Coburn, C. E., & Turner, E. O. (2012a). The practice of data use: An introduction. American Journal of Education, 118(2), 99–111. https://doi.org/10.1086/663272.
Coburn, C. E., & Turner, E. O. (2012b). The practice of data use [Special issue]. American Journal of Education, 118(2).
Conley, S., & Glasman, N. S. (2008). Fear, the school organization, and teacher evaluation. Educational Policy, 22(1), 63–85. https://doi.org/10.1177/0895904807311297.
Contandriopoulos, D., & Brousselle, A. (2012). Evaluation models and evaluation use. Evaluation, 18(1), 61–77.
Cousins, J. B., & Leithwood, K. A. (1986). Current empirical research on evaluation utilization. Review of Educational Research, 56(3), 331–364. https://doi.org/10.3102/00346543056003331.
Datnow, A., & Hubbard, L. (2015a). Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. Journal of Educational Change, 17(1), 7–28.
Datnow, A., & Hubbard, L. (2015b). Teachers' use of assessment data to inform instruction: Lessons from the past and prospects for the future. Teachers College Record, 117(4), n4.
Fullan, M. (2006). Change theory: A force for school improvement. Seminar series no. 157. Centre for Strategic Education. Retrieved from http://www.michaelfullan.ca/Articles_06/06_change_theory.pdf.
Hargreaves, A., Lieberman, A., Fullan, M., & Hopkins, D. (Eds.). (2010). Second international handbook of educational change (Vol. 23). Springer Science & Business Media.
Heitink, M. C., Van der Kleij, F. M., Veldkamp, B. P., Schildkamp, K., & Kippers, W. B. (2016). A systematic review of prerequisites for implementing assessment for learning in classroom practice. Educational Research Review, 17, 50–62.
Hopkins, D., Stringfield, S., Harris, A., Stoll, L., & Mackay, T. (2014). School and system improvement: A narrative state-of-the-art review. School Effectiveness and School Improvement: An International Journal of Research, Policy and Practice, 25(2), 257–281. https://doi.org/10.1080/09243453.2014.885452.
Hopkins, D., & West, M. (2002). Evaluation as school improvement: A developmental perspective from England. In D. Nevo (Ed.), School-based evaluation: An international perspective (pp. 89–112). Oxford, UK: Elsevier Science.
HM Inspectorate of Education (HMIE) (2007). How good is our school? – The journey to excellence, Part 3. https://www.educationscotland.gov.uk/Images/HowgoodisourschoolJtEpart3_tcm4-684258.pdf (accessed 30 September 2014).
Johnson, K., Greenseid, L. O., Toal, S. A., King, J. A., Lawrenz, F., & Volkov, B. (2009). Research on evaluation use: A review of the empirical literature from 1986 to 2005. The American Journal of Evaluation, 30(3), 377–410. https://doi.org/10.1177/1098214009341660.
Josselson, R. (2013). Interviewing for qualitative inquiry: A relational approach. Guilford Press.
King, J. A., & Pechman, E. M. (1984). Pinning a wave to the shore: Conceptualizing evaluation use in school systems. Educational Evaluation and Policy Analysis, 6(3), 241–251.
Leithwood, K., Leonard, L., & Sharratt, L. (1998). Conditions fostering organizational learning in schools. Educational Administration Quarterly, 34(2), 243–276.
Leviton, L. C., & Hughes, E. F. X. (1981). Research on the utilization of evaluations: A review and synthesis. Evaluation Review, 5, 525–549. https://doi.org/10.1177/0193841X8100500405.
Lawrence-Lightfoot, S. (1983). The good high school: Portraits of character and culture. Jossey-Bass Incorporated Pub.
MacBeath, J., Schratz, M., Meuret, D., & Jakobsen, L. (2000). Self-evaluation in European schools: A story of change. London: RoutledgeFalmer.
Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71–85. https://doi.org/10.1080/00461520.2012.667064.
Marsh, J. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11).
Marzano, R. J. (2003). What works in schools: Translating research into action. Alexandria, VA: ASCD.
Maxwell, J. A. (2012). The importance of qualitative research for causal explanation in education. Qualitative Inquiry, 18(8), 655–661.
Nevo, D. (2006). Evaluation in education. In I. F. Shaw, J. C. Greene, & M. M. Mark (Eds.), The Sage handbook of evaluation. Sage.
Nevo, D. (1995). School-based evaluation: A dialogue for school improvement. Oxford: Pergamon.
Patton, M. Q. (1990). Qualitative evaluation and research methods. SAGE Publications.
Reeves, T. D., & Chiang, J. L. (2018). Online interventions to promote teacher data-driven decision making: Optimizing design to maximize impact. Studies in Educational Evaluation, 59, 256–269.
Reynolds, D. (2010). Smart school improvement: Towards schools learning from their best. In A. Hargreaves, A. Lieberman, M. Fullan, & D. Hopkins (Eds.), Second international handbook of educational change (pp. 595–610). Netherlands: Springer.
Rubin, H. J., & Rubin, I. S. (2012). Qualitative interviewing: The art of hearing data. Sage Publications.
Ryan, G. W., & Bernard, H. R. (2003). Techniques to identify themes. Field Methods, 15(1), 85–109. https://doi.org/10.1177/1525822X02239569.
Schildkamp, K., Ehren, M., & Lai, M. K. (2012a). Editorial article for the special issue on data-based decision making around the world: From policy to practice to results. School Effectiveness and School Improvement, 23(2), 123–131. https://doi.org/10.1080/09243453.2011.652122.
Schildkamp, K., Ehren, M., & Lai, M. K. (2012b). Data-based decision making around the world: From policy to practice to results [Special issue]. School Effectiveness and School Improvement, 23(2).
Schildkamp, K., Karbautzki, L., & Vanhoof, J. (2014). Exploring data use practices around Europe: Identifying enablers and barriers. Studies in Educational Evaluation, 42, 15–24. https://doi.org/10.1016/j.stueduc.2013.10.007.
Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496. https://doi.org/10.1016/j.tate.2009.06.007.
Schildkamp, K., Lai, M., & Earl, L. (Vol. Eds.). (2013). Data-based decision making in education. Studies in educational leadership: Vol. 17. Dordrecht: Springer. https://doi.org/10.1007/978-94-007-4816-3.
Schildkamp, K., & Lai, M. (2013). Conclusions and data-use framework. In K. Schildkamp, M. Lai, & L. Earl (Vol. Eds.), Data-based decision making in education. Studies in educational leadership: Vol. 17 (pp. 137–191). Dordrecht: Springer. https://doi.org/10.1007/978-94-007-4816-3_10.
Shulha, L. M., & Cousins, J. B. (1997). Evaluation use: Theory, research, and practice since 1986. Evaluation Practice, 18, 195–208.
Staman, L. L., Timmermans, A. A., & Visscher, A. A. (2017). Effects of a data-based decision making intervention on student achievement. Studies in Educational Evaluation, 55, 58–67.
Surra, C. A., & Ridley, C. A. (1991). Multiple perspectives on interaction: Participants, peers, and observers. In B. M. Montgomery & S. Duck (Eds.), Studying interpersonal interaction (pp. 25–55). New York: Guilford Press.
Terhart, E. (2013). Teacher resistance against school reform: Reflecting an inconvenient truth. School Leadership and Management, 33(5), 486–500. https://doi.org/10.1080/13632434.2013.793494.
Van Geel, M., Keuning, T., Visscher, A. J., & Fox, J. P. (2016). Assessing the effects of a school-wide data-based decision-making intervention on student achievement growth in primary schools. American Educational Research Journal, 53(2), 360–394.
Vanhoof, J., Van Petegem, P., & De Maeyer, S. (2009). Attitudes towards school self-evaluation. Studies in Educational Evaluation, 35(1), 21–28.
Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3–14.
Zwart, R. C., Korthagen, F. A., & Attema-Noordewier, S. (2015). A strength-based approach to teacher professional development. Professional Development in Education, 41(3), 579–596.
