Article

Educational Administration Quarterly
47(1) 18–70
© The University Council for Educational Administration 2011
Reprints and permission: http://www.sagepub.com/journalsPermissions.nav
DOI: 10.1177/0011000010378610
http://eaq.sagepub.com

How Graduate-Level Preparation Influences the Effectiveness of School Leaders: A Comparison of the Outcomes of Exemplary and Conventional Leadership Preparation Programs for Principals

Margaret Terry Orr, Bank Street College, New York, NY, USA
Stelios Orphanos, Frederick University, Limassol, Cyprus

Corresponding Author:
Stelios Orphanos, Frederick University, Mariou Agathangelou 18, Ag. Georgios Havouzas, 3080 Limassol, Cyprus
Email: s.orphanos@frederick.ac.cy

Abstract

Purpose: This study attempted to determine the influence of exemplary leadership preparation
on what principals learn about leadership, their use of effective leadership practices, and how
their practices influence school improvement and the school’s learning climate. The authors
also investigated how the frequency of effective leadership practices related to the strength
of district support and the extent of school problems and student poverty. Finally, the authors
examined the contribution of exemplary leadership preparation to variations in school
improvement progress and school effectiveness climate. Research Design: The study, using
survey research conducted in 2005, compared 65 principals who had graduated from one of
four selected exemplary leadership preparation programs to a national sample of
111 principals. The authors used structural equation modeling to find the best-fitting model.
Findings: Participation in an exemplary leadership preparation program was significantly
associated with learning about effective leadership and engaging in these practices, particularly
where stronger preparation program and internship quality existed. Frequent use of effective
leadership practices was positively associated with school improvement progress and school
effectiveness climate. Taken together, exemplary leadership preparation had a positive but
mediated influence on variations in school improvement progress and school effectiveness
climate; the relationship was even stronger when focusing on preparation program and
internship quality measures. Conclusions: Faculty investments in preparation program and
internship quality will positively contribute to the leadership knowledge of graduates and their
leadership practices and school improvement progress. These results yield significant
implications for policy makers, universities, and other providers of leadership preparation.

Keywords
leadership preparation, internship, principals, school improvement

Public demands for more effective schools have placed growing attention on
the influence of school leaders, primarily principals and assistant principals.
Numerous studies have consistently found positive relationships between
principals’ practices and various school outcomes, including student achieve-
ment (Hallinger & Heck, 1996; Leithwood & Jantzi, 2008; Robinson, Lloyd,
& Rowe, 2008). This growing body of research has tried to account for the
ways that school leaders influence the academic achievement of students in
their schools. The research has yielded strong evidence that leaders’ influence
is felt primarily through their direct effects on staff and organizational condi-
tions. Given these findings, policy makers and educational experts are increas-
ingly turning to educational leadership preparation and development as a
strategy for improving schools and student achievement (Educational Research
Service, 2000; Farkas, Johnson, Duffett, & Foleno, 2001; Hale & Moorman,
2003). The collective aim of these efforts is to develop leaders who can pro-
mote powerful teaching and effective learning for all students in their schools
(Bottoms & O’Neill, 2001).
Existing research has contributed to important changes in the nature and
quality of leadership preparation. In recent years, many university-based edu-
cational leadership preparation programs have redesigned their content and


delivery to meet national standards that are based on effective leadership


research (National Policy Board for Educational Administration [NPBEA],
2005). By setting standards, NPBEA (2002) has attempted to develop excel-
lent schools through the preparation of high-quality leaders trained in pro-
grams that reflect the new conditions faced by school principals.
As well, many states, districts, and funders (both public and private) are
developing new policies and investing resources to improve strategic school
leadership preparation and development (McCarthy, 1999; Sanders & Simpson,
2005; Southern Regional Education Board, 2006). Several school districts
(particularly in urban areas, such as Hartford, Connecticut; Jefferson County,
Kentucky; San Diego, California; and New York, New York) are collaborat-
ing with universities to produce aspiring leaders who are well qualified to
implement district reform strategies and improve schools for higher student
achievement (Darling-Hammond, LaPointe, Meyerson, Orr, & Cohen, 2007;
Orr, King, & LaPointe, 2009). These efforts place greater emphasis than ever
before on university-based preparation that develops leaders’ capacity for
creating and implementing a school’s vision and facilitating instructional qual-
ity, teacher development, and parent and community engagement (McCarthy,
1999; NPBEA, 2002).
The expectation, then, is that the quality of a preparation program can
positively influence leaders’ work and their school’s efforts to improve stu-
dent achievement. Yet only limited research exists on the relationship between
preparation program approaches and features and effective leadership prac-
tices (Leithwood, Jantzi, Coffin, & Wilson, 1996; Orr, 2009; Orr & Barber,
2007). This article addresses the research gap by investigating the relation-
ship between leadership preparation and the success of graduates as princi-
pals in improving their school’s educational climate. We build on prior
conceptual work on the effects of leadership preparation on leadership prac-
tices and its indirect effects on school improvement work (Orr, 2009). We
also draw on the leadership preparation research of a larger Stanford University
study conducted in collaboration with the Finance Project and WestEd
(Darling-Hammond et al., 2007). That study was funded by the Wallace
Foundation as part of its mission to support and share effective ideas and
practices about leadership and education. The findings presented here reflect
the authors’ views.

Study Framework
Leadership preparation program models and features, which are critical to the
development of effective leadership practices that generate school improvement


and effectiveness, provide the framework for this study. The presumed path-
way from leadership preparation to principals’ school improvement work is
based on heretofore only partially tested assumptions about mediated relation-
ships in a causal chain of influences (i.e., the relationship of leadership prac-
tices to teacher practices and school climate and, in turn, their relationship to
student outcomes, including engagement and achievement). In studies of both
leadership and effective school improvement, several researchers have
mapped the relationship among leadership practices, changes in teaching and
school organization practices, and student outcomes. Both types of research
identify student, teacher, and organizational conditions that are most predictive
of student achievement gains and examine which leadership practices and
school improvement changes are most commonly at play.
Based on a review of available research, Orr (2003, 2006b) established a
conceptual model that stepped back further to suggest how leadership prepa-
ration influences the practices of principals and school improvement and
effectiveness. She delineated a series of staged outcomes that took into
account participants’ program experiences, what they learned, and how they
applied that knowledge in their schools, which is similar to the four-stage
outcome model for evaluating adult development programs (Kirkpatrick,
1998). An overriding assumption of Orr’s conceptual model is that better
quality leadership preparation has a positive influence on graduates’ leader-
ship practices, which in turn influences the quality of their school improve-
ment practices and the educational climate of their schools. The primary
independent variable of this model is quality leadership preparation: The ini-
tial and direct outcome of leadership preparation is graduates’ knowledge
about school leadership and the extent to which they exercise effective lead-
ership practices. Thus, there may be an important, albeit distant, relationship
between leadership preparation and student achievement.
The study reported here uses this hypothesized model as its framework
and tests whether there is a series of mediated relationships among variables
in a causal model that links leadership preparation to school improvement. In
this section we review prior evidence about leadership preparation and these
relationships.

The Nature of Exemplary Leadership Preparation
Much of the research on leadership preparation has consisted of case
studies of innovative program models and survey-based investigations of the
efficacy of specific program features (Orr, 2009). There have been extensive


reviews of research on exemplary leadership preparation programs and quality
program features (Davis, Darling-Hammond, Meyerson, & LaPointe, 2005;
Jackson & Kelley, 2002; McCarthy, 1999; Orr, 2006a; Young, Crow, Ogawa,
& Murphy, 2009), and they draw similar conclusions about the features of
exemplary programs. Such programs have the following elements:

• A well-defined theory of leadership for school improvement that frames and integrates the program features around a set of shared values, beliefs, and knowledge
• A coherent curriculum that addresses effective instructional leader-
ship, organizational development, and change management and that
aligns with state and professional standards
• Active learning strategies that integrate theory and practice and stim-
ulate reflection
• Quality internships that provide intensive developmental opportuni-
ties to apply leadership knowledge and skills under the guidance of
an expert practitioner–mentor
• Knowledgeable (about their subject matter) faculty
• Social and professional support, including organizing students into
cohorts that take common courses together in a prescribed sequence,
formalized mentoring, and advising from expert principals
• The use of standards-based assessments for candidate and program
feedback and continuous improvement that are tied to the program
vision and objectives

Definitions of these quality features have now become foundational to developing constructs for research on leadership preparation (Young et al., 2009).
Other research reviews have characterized conventional preparation as a con-
struct for the absence of effective preparation (McCarthy, 1999; U.S. Department
of Education, 2004). The U.S. Department of Education (2004), for example, in
an analysis of federally funded programs, characterized conventional programs
as lacking vision, purpose, and coherence. Furthermore, according to the analy-
sis, in this program type students self-enroll without admissions consideration of
their prior leadership experience, and they progress through discrete, often unre-
lated, courses, without connection to actual practice or local schools.

The Influence of Leadership Preparation


A small body of research has tried to identify how the nature of leadership
preparation influences leadership practices and school outcomes. This is


a very new area of research, limited by scholarly skepticism over the perceived
legitimacy of such research and difficulties of launching large-scale com-
parative research studies (Kottkamp & Rusch, 2009; McCarthy & Forsyth,
2009). Available research results, however, are positive, highlighting exem-
plary preparation approaches that are most influential to leadership develop-
ment and have focused on specific program features (e.g., program content
and organization) or the combination of features in innovative programs.
Three studies have investigated the relationship between individual pro-
gram features and graduate outcomes for innovative or conventional pro-
grams. Leithwood et al. (1996) documented 11 innovative graduate-level
leadership preparation programs that were redesigned through a Danforth
Foundation grant initiative and surveyed teachers who worked in schools led
by program graduates. The authors found that programs’ innovative use of
several features—instructional strategies, cohort membership, and program
content—was most predictive of teachers’ positive perceptions of principals’
leadership effectiveness (e.g., in setting direction, developing staff, fostering
a positive school culture, and focusing on curriculum and instruction).
Two more recent studies investigated the relationship between quality pro-
gram features and initial graduate outcomes: what graduates learned about
leadership, their beliefs about the principalship as a career, and their actual
career advancement. The studies were modeled on the theory of planned
behavior, which asserts that career intentions are strongly predictive of sub-
sequent career advancement and are influenced by individuals’ perceived
efficacy and beliefs about the position (Ajzen, 1991), and on other related
research on beliefs about the principalship and career aspirations (Pounder &
Merrill, 2001). Orr and Barber (2007) compared the outcomes for graduates
of two university–district partnership programs (both designed to include
many of the innovative features identified above) with outcomes for graduates
of a conventional program (with few such features). They found that three
program features—supportive program structures (e.g., accessibility and
scheduling convenience), a comprehensive and standards-based curriculum,
and broader, more intensive internships—were significantly but differentially
related to three types of outcomes: self-assessed leadership knowledge and
skills, leadership career intentions, and actual career advancement.
Similarly, Orr, Silverberg, and LeTendre (2006) examined how differ-
ences in five programs’ incorporation of these innovative features and overall
program redesign to meet national and state standards were associated with
graduate learning and career outcomes. They found that the five programs
varied most on measures of three types of program features: program chal-
lenge and coherence, use of active student-centered instructional practices,


and internship length and quality. How graduates rated the extent to which
these features were attributes of their preparation was significantly related to
how much they learned about instructional and organizational leadership. The
length and quality of internships, however, were uniquely associated with
graduates’ career intentions and subsequent advancement. Thus, the authors
concluded, programs with well-implemented, innovative features yield posi-
tive and significantly better outcomes than more typical programs.
These three studies used different outcome measures for evaluating the
influence of leadership preparation. Leithwood and others (1996) focused on
the association of preparation to teachers’ perceptions of leadership practices.
In comparison, Orr and Barber (2007) and Orr et al. (2006) evaluated more
near-term outcomes, including graduates’ self-assessed leadership knowledge
and skills, extent of learning about instructional and organizational leadership,
positive and negative beliefs about the principalship, career intentions, and
actual career advancement—all of which were associated with one or more
preparation program features.
Finally, a few studies are now including measures of leadership prepara-
tion quality as influential to leadership outcomes, such as leadership self-
efficacy and leadership practices (Tschannen-Moran & Gareis, 2005). For
example, Tschannen-Moran and Gareis (2005), in their study of 558 princi-
pals, found that perceived quality and utility of leadership preparation signifi-
cantly contributed to principals’ sense of leadership self-efficacy.

Antecedents of Leadership Preparation


Critics of the leadership preparation field identify a potential influence on the
preparation program–leadership practice relationship: differences in the lead-
ership potential and the prior skills of students admitted to programs. Some
critics argue that programs’ admissions decisions are more determinant of
graduate outcomes than are their program features (e.g., Levine, 2005).
Moreover, many programs have been shown to use weak selection criteria (e.g.,
low or nonexistent admissions expectations) based on prior academic perfor-
mance or leadership experiences (Lad, Browne-Ferrigno, & Shoho, 2005).
Two of the studies summarized above (Orr & Barber, 2007; Orr et al.,
2006) included consideration of students’ personal characteristics, and they
found that gender and race/ethnicity were not significantly related to gradu-
ates’ ratings of the extent to which program features existed (e.g., instruc-
tional leadership focus and challenging content) or their learning and career
outcomes. Both found that candidates’ leadership aspirations prior to enroll-
ing and, for Orr et al. (2006), the extent of their prior leadership experience


(e.g., a position as department chair or instructional coach) were positively


associated with program feature ratings and graduate outcomes.

Influence of Leadership on Schools


The basic conceptual model (presented as Model 1 in Figure 1) that we used
for the study reported here hinges on leadership quality, both as the outcome
of leadership preparation and what graduates learn and as an influence on
school improvement and school educational climate. Thus, the conceptualiza-
tion of leadership quality and its hypothesized relationships with both prepa-
ration and subsequent practice are critical. Several reviews of research on
school leader practices provide converging findings on both issues and serve
as a foundation for our work (Hallinger & Heck, 1996; Leithwood, Louis,
Anderson, & Wahlstrom, 2004; Robinson et al., 2008; Waters, Marzano, &
McNulty, 2003). The authors of these reviews confirm two hypotheses:
(a) that leadership practices influence school performance and (b) that what
we know about the effects of leadership on the school climate and students’
achievement is dependent on how the research is conceptualized.
Hallinger and Heck (1998) identified three types of models for the causal
relationship between leadership and school effectiveness where leadership
beliefs and behaviors are treated as independent variables and measures of
school performance are dependent variables. Most of the 40 studies that they
reviewed were nonexperimental and used cross-sectional, correlational
design with surveys or interviews as their source of data. The three research
design models were direct effects, mediated effects (both with and without
consideration of antecedent effects), and reciprocal effects of the relationship
between leadership and student achievement.1 Of the studies that used either
direct effects only or direct effects plus antecedent effects designs, all showed
no or weak leadership effects on school outcomes. The studies that investi-
gated mediated effects (with and without consideration of antecedent effects)
all yielded a range of effects, primarily indirect. The authors concluded that
leadership effects on schools are indirect and that improvement in student
achievement is through the results of others, primarily teachers and other
professional staff. They noted that research on leadership–school effects has
evolved over time in two ways: more sophisticated conceptualization and
practice of leadership (including instructional and transformational leader-
ship) and their linkages to school outcomes and more advanced statistical
analysis techniques to test theoretical models. Accounting for these effects is


Figure 1. Conceptual Model 1 (path diagram not reproduced)
DS = district support; ESC = effective school climate; EXPP = principalship experience; ILP = instructional leadership practices; IQUAL = internship quality; ISP = index of school problems; OILL = organizational and instructional leadership learning; PBP = positive belief about principalship; PELI = prior experience in leading instruction; PQUAL = program quality; PREP = participation in an exemplary preparation program; SIP = school improvement progress; SPL = school poverty level.

in part dependent on the research’s conceptual richness and methodological quality.
In their review of research, Leithwood and others (2004) concluded that
the leadership practices that most strongly influence teachers and organiza-
tional conditions are those related to setting direction (through vision, goals,
and expectations), helping individual teachers (through support and model-
ing), redesigning the organization (to foster collaboration and engage fami-
lies and community), and managing the organization (providing organizational
resources and support). Waters and others (2003) drew similar conclusions in
their research review, defining these elements as balanced leadership
practices.
Finally, Robinson and others (2008) evaluated the findings of 27 published
research studies to determine which leadership practices most influenced stu-
dent learning. They distilled their findings into five leadership dimensions,
compared and contrasted instructional and transformational leadership, and
calculated the average effect size that each dimension contributed to show
differences among leadership practices:


Dimension 1: Establishing goals and expectations (goal setting) had an average effect size of 0.42 standard deviations.

Dimension 2: Strategically deploying resources (securing, aligning, and allocating resources, including staffing and teaching resources) had an average effect size of 0.31 standard deviations.

Dimension 3: Planning, coordinating, and evaluating teaching and the curriculum was shown to have a moderate impact on student achievement (effect size = 0.42), particularly strategies such as being actively involved in collegial discussion of instructional matters, actively overseeing and coordinating the instructional program, coordinating the curriculum across levels, conducting classroom observations and providing feedback, and helping staff systematically monitor student progress.

Dimension 4: Promoting and participating in teacher learning and development (formally and informally) had a large average effect size of 0.84.

Dimension 5: Ensuring an orderly and supportive environment had a modest effect size of 0.27 standard deviations.

From their meta-analysis, the authors concluded that instructional leadership


has 3 to 4 times more impact than transformational leadership on student
learning.2 But they acknowledged some of the measurement problems in
assessing transformational leadership, noting that some researchers are modi-
fying transformational leadership to be more applicable to educational settings,
thus merging instructional and transformational leadership. They agree that
more in-depth research on the relationship among leadership practices, tasks,
and the work of schools to improve teaching and learning will yield larger
impacts on student outcomes.
Most of the above studies assess leadership practices using teacher sur-
veys. Few studies use principals’ self-reports on their own behaviors. One
exception is Goldring and Pasternack (1994), who combined survey data
from 36 Israeli principals with their schools’ performance data. Dichotomous
measures of whether principals emphasized six leadership tasks and their
perceived influence only marginally discriminated between the more and less
effective schools. Principals’ goal emphasis, particularly having staff consen-
sus about educational goals, did discriminate between more and less effective
schools, however, after controlling for parents’ socioeconomic status (as
reported by the principal).


Effects of Leadership on Schools


Leithwood and others (2004) identified critical teacher and organizational
conditions that are directly influenced by leadership practices and in turn
influence student engagement, effort, and achievement. These conditions,
therefore, are conceptualized in various studies they reviewed as both depen-
dent variables in the leadership–school outcome relationship and as anteced-
ents to student performance outcomes. According to Leithwood et al., the
modifiable organizational conditions include having shared purposes and goals,
enabling school structures (e.g., using instructional and staff time effectively
and limiting disruptive external influences), an organizational commitment
to purpose and change, a school learning culture, quality content and instruc-
tion, and organizational learning practices and other school environment
factors conducive to student achievement (parental involvement, an orderly
environment, and collegiality). The teacher effects include job satisfaction,
improved teaching practices, participation in distributed leadership, and col-
lective teacher development and engagement.
A small body of school improvement research has similarly examined the
leader, teacher, and organizational factors that influence school change and
improvement and contribute most to gains in student outcomes (Muijs, Harris,
Chapman, Stoll, & Russ, 2004; Sebring, Allensworth, Bryk, Easton, &
Luppescu, 2006; Sweetland & Hoy, 2000). Taken together, these investiga-
tions suggest that the effectiveness of leadership practices would first be evi-
dent in progress in creating a school improvement culture (in which teachers
and other school staff engage in practices that are most commonly associated
with effective school improvement) and an academically positive school cli-
mate. Although much of the research in this area is fairly new, the findings
suggest ways to conceptually model and test the relationships.
School improvement progress. In their review of school improvement and
reform research (both peer-reviewed and evaluation reports), Muijs et al.
(2004) identified organizational conditions that characterize improving schools:
focusing on teaching and learning, having an information-rich environment,
creating a positive school culture, building a learning community, fostering
continuous professional development, involving parents, and providing external
support and resources. Such findings were confirmed by the qualitative research
on high- and low-performing schools by Brown, Anfara, and Roney (2004).
They found that the two sets of schools differed on measures of the technical,
managerial, and institutional levels of the schools’ organizational health (e.g.,


teacher efficacy, curriculum articulation, student expectations, collegiality,


and instructional and institutional integrity; Brown et al., 2004). It would
appear, then, that a school’s progress on adopting strategies shown to improve
schools—and the leadership practices to affect such adoption—would be
positively associated with improved student outcomes.
School climate. As noted by Muijs et al. (2004), the quality of the academic
environment of the school has been found to be positively associated with
student achievement gains. Sebring and others (2006), through their research
on Chicago schools, further found that a student-centered learning climate and
ambitious instruction were critical to school success. Sweetland and Hoy
(2000) similarly constructed a measure of an academically positive school
climate, which they termed “academic press” and defined as a stress on aca-
demics by students and teachers, resource support, and principal influence. In
their study of a diverse sample of New Jersey middle schools, they found that
the extent of academic press in a school is highly correlated with student math
and reading achievement scores. Both studies found teachers’ capacity—
defined by Sebring and others (2006) as professional capacity and by Sweet-
land and Hoy (2000) as teacher empowerment to make decisions on curriculum
and instruction and a view of the school as a professional community—to be
important as well. Similarly, Brand, Felner, Shim, Seitsinger, and Dumas
(2003) assessed students’ and teachers’ perspectives of whole-school climate
for middle schools (which includes student commitment, teacher support,
safety, peer interactions, and discipline approach among others) and student
commitment and achievement orientation, drawn from samples of 188 schools
in 25 states. They found their school climate measures to be highly predictive
of students’ reading and math scores and GPA as well as the prevalence of
student and school problems (Brand et al., 2003).
Taken together, these studies show that school climate is an important
associated outcome of the school leadership–school performance relation-
ship and may be an important mediating factor to school performance given
how much it predicts student outcomes.

Moderating Effects of Districts and School Communities
The above leadership and school-related outcomes focus primarily on leader-
ship influences on within-school processes. Two external factors have been
identified in prior research as critical and are included here as possible
moderating influences on the leadership–school improvement relationship.3
They are district support and school community context.


There is limited research on how district support influences the leadership–school relationship. Very few large-scale studies have tried to model the
influence of district support on the leadership–school relationship. However,
some prior research, primarily qualitative, has shown that the scope and
nature of district support (in its coherence and degree and nature of support)
positively influence effective leader actions (McLaughlin & Talbert, 2002;
Osterman & Sullivan, 1996). More recently, Leithwood and Jantzi (2008),
using path analysis techniques in a study of 79 schools, found that both dis-
trict leadership and district conditions (that are most focused on producing
student learning) are strong influences on principals’ leadership self-efficacy
and that the latter is a strong influence on the principals’ sense of collective
leadership efficacy. They define these two types of efficacy as a leader’s
“belief about one’s own ability (self-efficacy), or the ability of one’s col-
leagues collectively (collective efficacy), to perform a task or achieve a goal”
(Leithwood & Jantzi, 2008, p. 497).
Several studies have yielded mixed results when incorporating one or
more measures of the school and community context as moderating influ-
ences in the leadership–school performance relationship. Sebring and others
(2006) found that the extent of poverty, school size, and community resources
can stress or enhance school improvement efforts. Sweetland and Hoy (2000),
in contrast, found that students’ socioeconomic status had no independent
effect on measures of teacher empowerment and their schools’ student achieve-
ment scores. In their path analysis study, Leithwood and Jantzi (2008) found
that district size (not school size) and school level were powerful moderators
in the principal–school performance relationship. Finally, Tschannen-Moran
and Gareis (2005) found that the extent of school poverty was unrelated to
principals’ sense of self-efficacy.
None of these studies considered how challenging the schools were to lead
based on measures of student and teacher problems (e.g., absenteeism and
verbal and physical behavior problems). Yet these factors are often cited as
reasons why low-performing schools are difficult to lead and why leadership
preparation improvements are needed (Bottoms & O’Neill, 2001; Institute for
Educational Leadership, 2000; Leithwood & Riehl, 2005; Shen, Rodriguez-
Campo, & Rincones-Gomez, 2000).

Method
Drawing from the above research, we concluded that a positive relationship
might exist between completion of a quality leadership preparation program
and the following: leadership knowledge, effective leadership practices, and


school improvement and effective school climate. Each set of influences,


however, must be strong enough to transform its target outcome. Moreover,
given the existing leadership preparation research, exemplary preparation
programs should have a more positive influence on schools and students than
conventional programs because they are likely to incorporate more quality
features. Consequently, we drew on available survey data on exemplary (i.e.,
encompassing the above quality features) and conventional preparation to
examine the influence of principals’ leadership preparation on the leadership
knowledge, leadership practices, school improvement, and school climate
outcome relationships.
We analyzed the survey data using structural equation modeling (SEM)
techniques. This line of analysis is appropriate for a number of reasons. First,
we made use of multiple indicators representing theoretical constructs; con-
ventional methods, such as multiple regression analysis, cannot be used to
estimate such models. Second, the observed variables were not perfect reflec-
tions of their underlying constructs; SEM analysis takes measurement error
explicitly into consideration. Finally, SEM can test the relationships among
multiple variables simultaneously, enabling us to estimate direct, indirect,
and total effects for various variables of interest.
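
To make the distinction concrete, the sketch below traces direct, indirect, and total effects through a simple mediation chain. The path values are invented for illustration and are not estimates from this study.

```python
# Hypothetical standardized path coefficients for a simple mediation chain:
# PREP -> OILL -> ILP -> SIP, plus a direct path PREP -> ILP.
# Values are illustrative only, not estimates from this study.
prep_to_oill = 0.40
oill_to_ilp = 0.55
prep_to_ilp_direct = 0.20
ilp_to_sip = 0.35

# The indirect effect of PREP on ILP runs through OILL (product of the paths).
indirect_prep_ilp = prep_to_oill * oill_to_ilp
total_prep_ilp = prep_to_ilp_direct + indirect_prep_ilp

# The effect of PREP on the school outcome SIP is entirely mediated here.
total_prep_sip = total_prep_ilp * ilp_to_sip

print(f"indirect PREP->ILP: {indirect_prep_ilp:.3f}")
print(f"total PREP->ILP:    {total_prep_ilp:.3f}")
print(f"total PREP->SIP:    {total_prep_sip:.3f}")
```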
The following research questions guided our investigation of the relation-
ships among these measures to be tested through SEM:

1. How does principals’ completion of an exemplary leadership prepa-


ration program, consisting of both high-quality features and intern-
ships, relate to the acquisition of knowledge about instruction and
organization leadership and use of effective leadership practices?
2. What is the relationship for principals among effective leadership
practices, school improvement progress, and school effectiveness
climate? How are these relationships moderated by the degree of
district support and the extent of challenging problems and students
in poverty?
3. What is the contribution of types and quality of leadership prepara-
tion to variations in school improvement progress and school effec-
tiveness climate?

Sample
The study compares a sample drawn from all principals who completed one
of four exemplary leadership preparation programs between 1999 and 2005
to a national sample of comparison principals. The initial source of these two


samples was the Stanford University study described above. The origins of
this source and how the two samples were narrowed for the purposes of this
study are described below.
The first sample was constructed through a two-step process. It began with
selecting the four exemplary leadership programs that coherently organized
preparation around core assumptions of effective leadership and preparation,
had stringent student selection criteria, used program designs that empha-
sized instructional leadership and intensive internships, engaged in univer-
sity–district collaborations and types of institutions and state policy contexts
in support of leadership preparation, and represented different geographic
regions. The four programs were identified through interviews with experts,
a review of the published professional research on leadership preparation,
and initial consideration of a much larger sample of programs. Appendix A
contains a brief description of the four programs.
Next, in 2005 all of the graduates (who had graduated between 1999 and
2005) of these programs were identified and surveyed (as explained in the
study’s report; Darling-Hammond et al., 2007). Response rates ranged between
50% and 71% and yielded 198 exemplary program graduates.
The comparison sample was drawn from a survey of two national lists of
principals.4 Of 1,229 principals surveyed, 661 responded (54%).
To make the two sample groups more comparable and relevant for our
study, we restricted each to only those principals who had completed a lead-
ership preparation program between 1999 and 2005, had an internship expe-
rience while in their program, and were currently principals. These limitations
reduced the exemplary prepared sample to 65 respondents. Of them, 37%
were from the Educational Leadership Development Academy (ELDA) pro-
gram at the University of San Diego in California, 35% had graduated from
Delta State University in Mississippi, 23% were from the University of
Connecticut, and 5% were from Bank Street College’s Principals Institute in
New York City. We applied the same restrictions to the comparison sample.
We eliminated from the sample those comparison principals who were not
current principals in 2005 and those who lacked an internship. These steps
reduced the comparison sample to 111 principals. One consequence is that we
may have reduced the differences between the two groups by overstating
the quality of conventional preparation on leadership outcomes because we
excluded a much larger share of the comparison group for not having had an intern-
ship experience (one critical requisite of quality preparation). Specifically,
we eliminated 23% of the comparison principal sample but only 7% of the
exemplary prepared principals for not having had an internship as part of
their preparation experience.
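
As a rough illustration of this restriction step (the file and column names below are hypothetical, not the study's actual variables), the filtering might look like:

```python
import pandas as pd

# Hypothetical combined respondent file for both samples; column names
# are illustrative only.
df = pd.read_csv("all_respondents.csv")

# Keep only respondents who completed their program between 1999 and 2005,
# had an internship during preparation, and were serving as principals in 2005.
restricted = df[
    df["grad_year"].between(1999, 2005)
    & (df["had_internship"] == 1)
    & (df["current_principal"] == 1)
]
print(restricted.groupby("sample_group").size())
```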


Assumption of Program Differences


We assumed the leadership preparation program experiences of the compari-
son principals were representative of what existed nationwide among pro-
grams with internship components at the time they completed their preparation.
Although some comparison principals may have experienced exemplary
preparation, we assumed that most had had the primarily conventional prepa-
ration that was typically available. Our SEM model evaluated this assumption
by including quality ratings of program features as separate measures.

Consideration of Potential Sample Bias


There were several potential sources of sample bias that could confound the
investigation of the study measures: the regional distribution differences
between the samples, nested groupings of the exemplary prepared principals,
demographic and professional characteristic differences in the two samples,
and program selection and district support differences.
Regional bias. The exemplary program sample represented an uneven geo-
graphic distribution of principal respondents. To limit potential bias because
of regional differences, we constructed sample weightings for the two sam-
ples. A weighting scheme was designed and used when comparing the pro-
gram to its state or national comparison group. The data were first weighted,
using the weight WET, to the number of possible respondents by program
and by state (for the national comparison sample) to its corresponding popu-
lation. A second weight, WT2PRINC, was calculated for the national com-
parison principals, which took into account the total number of principals in
each state (i.e., the total number of schools in each state) and adjusted the
proportion of respondents to the total population of principals within the state
and similarly for the program sample. Appendix B presents the formulas for
these weightings.
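
The actual WET and WT2PRINC formulas appear in Appendix B and are not reproduced here; purely as an illustration of this general kind of post-stratification weighting, with hypothetical state counts:

```python
import pandas as pd

# Illustrative respondent records; the real survey variables differ.
df = pd.DataFrame({
    "state": ["CA", "CA", "MS", "MS", "NY"],
    "respondent_id": [1, 2, 3, 4, 5],
})

# Hypothetical counts of principals (schools) per state.
population = {"CA": 1200, "MS": 300, "NY": 900}

# A simple post-stratification weight: each stratum's population share
# divided by its sample share, so over- and under-represented states are
# weighted back toward their share of the population.
counts = df["state"].value_counts()
total_pop = sum(population.values())
df["weight"] = df["state"].map(
    lambda s: (population[s] / total_pop) / (counts[s] / len(df))
)
print(df)
```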
Nested grouping of the exemplary graduates in four programs. The exemplary
prepared principals are drawn in unequal numbers from four programs. The
sample weighting process corrected for this problem in part by weighting
these two groups of principals to their state but did not correct for the unequal
distribution among the four programs. Thus, the relationships here might be
somewhat influenced by the two larger programs (ELDA, San Diego, and the
University of Connecticut) that account for two thirds of the respondents.
Demographic and professional differences. Table 1 compares the two princi-
pal samples, exploring possible demographic and professional differences
that may independently contribute to their leadership effectiveness. As
shown, the exemplary prepared principals had a higher percentage of female


Table 1. Sample Characteristics of Exemplary and Conventionally Prepared Principals

Characteristic                       Sample Size      M      Exemplary Prepared    Conventionally Prepared    Difference
                                                              Principals (EPP)      Principals (CPP)           (EPP – CPP)
Age                                      167          44.36        42.85                 45.16                  −2.31
% female                                 176          57.05        71.17                 49.29                  21.88*
% minority                               176          25.83        45.64                 14.95                  30.69**
Teaching experience (years)              176          13.86        11.11                 15.37                  −4.26*
Principalship experience (years)         176           3.02         2.30                  3.42                  −1.12**

*p < .05. **p < .01.

principals, were more demographically diverse (specifically, there was a


higher percentage of minority principals), and had fewer years of teaching
and principal experience. The two groups were of similar age. Analysis of the
correlations among demographic and professional variables and the outcome
measures of interest showed mostly weak associations, suggesting that these
sample differences are not meaningful in our SEM model. They are shown in
Appendix C. Nonetheless, we included in our conceptual model one charac-
teristic that may be related to leadership learning and practices: prior leader-
ship experiences.
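
A correlation screen of this kind is simple to reproduce; a minimal sketch, assuming a respondent-level file with hypothetical variable names:

```python
import pandas as pd

# Hypothetical respondent-level file; all column names are illustrative.
df = pd.read_csv("principal_survey.csv")

background = ["age", "female", "minority", "teaching_years", "principal_years"]
outcomes = ["oill", "ilp", "sip", "esc"]

# Pearson correlations between background characteristics and the outcome
# scales; small coefficients suggest sample differences are not driving results.
corr = df[background + outcomes].corr().loc[background, outcomes]
print(corr.round(2))
```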
Program selection and district support differences. The four exemplary programs
all used different criteria to select candidates and to support candidate participa-
tion (which could have influenced candidates’ enrollment decisions): for two
programs, district officials nominated potential candidates and were active in
final selection decisions; two programs required candidates to have substantial
teaching experience; one program provided a state-funded full-time paid intern-
ship; and two programs offered reduced tuition and districts paid some or all of
the tuition balance. These selection and support differences may contribute to
the outcomes in ways that are unaccounted for, such as motivation.
Other considerations. Our dichotomous program measure (PREP) con-
tained three core differences in the two samples: The first—having or not
having completed an exemplary leadership preparation program—is the mea-
sure of interest. The second difference is district influence and support in
program selection and participation, which is part of the exemplary program
design that we wanted to account for separately. The third is the difference in


the number of years since program completion. To control for the influence
of years since program completion, we limited the sample to only recently
prepared principals and included principal experience in the SEM models.
Measures of district support in program participation and in school leader-
ship were included in initial analyses to investigate these influences but
yielded weak associations, so they were discarded for the purposes of this
analysis. Still, the sample of exemplary prepared principals was much more
likely to have had district support when selected to participate.

Data Source
Both samples of principals completed a standardized 48-question survey proto-
col, made available to them online and by mail. The survey instrument was based
on a graduate survey developed and piloted by the University Council for
Educational Administration/Teaching Educational Administration Special
Interest Group of the American Educational Research Association (UCEA/TEA-
SIG) Taskforce on Evaluating Leadership Preparation Programs (for details on
the initial survey construction, see Orr et al., 2006). That survey had been aligned
in part with national leadership preparation standards (NPBEA, 2002) and with
measures of leadership drawn from Leithwood and Jantzi’s (2000) leadership
effectiveness research (for further details on the survey construction, see
Darling-Hammond et al., 2007). The survey was fielded by WestEd Associates
and other research staff between February and May 2005 using mail and online
survey strategies and phone follow-up with nonresponders.

Study Limitations
The research was limited by its cross-sectional nature and its reliance on
principals’ self-reports. Any bias that self-reporting creates was assumed to
be similar across the two samples.

Data Analysis
We performed a three-step analysis, beginning with a preliminary analysis of
our data set to test specific assumptions regarding the distribution of the data
to be analyzed. Then we conducted confirmatory factor analysis (CFA) to
verify the structure of latent variables used in the analysis and computed
validity and reliability measures to assess the adequacy of their measurement
models.5 The last step was to examine the relationships presented in Figure 1
using SEM techniques.


A preliminary descriptive analysis provided us with information regarding the location and the variability of the data set. A complete table with the
means, standard deviations, range, and measures of skewness and kurtosis
of all variables used in the final analyses (excluding dummy variables) is
provided in Appendix D. Although there are no definite cutoff points as to
which skewness and kurtosis values are indicative of non-normally distrib-
uted data, methodologists argue that data are considered to be moderately
non-normal if skewness values are between 2 and 3 and kurtosis values are
greater than 7.0 (Curran, Finch, & West, 1996). Using these guidelines, we
concluded that our data were approximately normally distributed because
the majority of the observed variables have absolute skewness values of less
than 1 and absolute kurtosis values of less than 5. This conclusion is corrobo-
rated by the mean (absolute) skewness and kurtosis values of 0.79 and 3.72,
respectively.
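
A screening step like this can be reproduced directly with pandas; a minimal sketch, assuming a data file of the observed indicators and applying the Curran, Finch, and West (1996) guideline values:

```python
import pandas as pd

# Hypothetical file containing the observed indicator variables.
df = pd.read_csv("indicators.csv")

# Per-variable skewness and excess kurtosis, flagged against the moderate
# non-normality guidelines (|skewness| >= 2, kurtosis >= 7).
summary = pd.DataFrame({
    "skew": df.skew(numeric_only=True),
    "kurtosis": df.kurtosis(numeric_only=True),
})
summary["flag_nonnormal"] = (summary["skew"].abs() >= 2) | (summary["kurtosis"] >= 7)

print(summary.round(2))
print("mean |skewness|:", round(summary["skew"].abs().mean(), 2))
print("mean |kurtosis|:", round(summary["kurtosis"].abs().mean(), 2))
```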

Measurement and the Study’s Conceptual Model


For purposes of the analyses here, our basic conceptual model (Model 1; see
Figure 1) narrows Orr’s original model by selecting key factors for each set
of study measures (drawing on both latent variables and indicator variables),
as guided by the research literature. The study’s conceptual model includes
both endogenous (determined by the model through direct or mediated rela-
tionships) and exogenous (not determined by the model but included as
either antecedent or moderating) variables. The validity of the analysis to
investigate our hypotheses was based on the assumption that no important exoge-
nous or endogenous variables had been omitted. Such omissions result in
specification errors, which could potentially bias parameter estimates derived
from the analysis. The latent variables and their indicator variables used in
this study are presented in Appendix E.
Our first measurement consideration was the selection of the primary
school outcomes most likely to be influenced by leadership practices and
leader preparation. In this study, our conceptual model was concerned with
the factors associated with two primary school outcomes: school improve-
ment progress (SIP; how much of an increase there had been over the past
12 months in areas such as consensus among staff about school goals, teacher
collaboration in making curriculum and instructional decisions and sharing
practices, and improvement in teachers’ instruction) and an effective school
climate (ESC; the extent to which principals agree that in their school teach-
ers currently feel responsible to help each other, are continually learning and
seeking new ideas, and use time together to discuss teaching and learning).


Both outcomes are endogenous (determined by the model) variables and


were drawn from the above leadership effects research.
In our model, both endogenous variables were conceptualized as being
influenced by four latent variables: instructional leadership practices (ILP),
the extent of organizational and instructional leadership capacity that princi-
pals developed through the program (OILL), the preparation program’s qual-
ity as perceived by the principals (PQUAL), and internship quality as perceived
by the principals (IQUAL).
The indicator variables for ILP measured how often the principal engaged in
an instructional leadership activity (including foster teacher professional devel-
opment on instruction, use data to monitor and improve school progress, work
with teachers to improve instruction, and work with faculty to develop prac-
tice-related goals). These behaviors were drawn from the federal Schools and
Staffing Survey for principals, the NPBEA standards for leadership preparation,
and measures used in effective leadership research (Leithwood et al., 2004;
Leithwood & Jantzi, 1999). The indicator variables for the extent of OILL were
drawn in part from the Educational Leadership Constituent Council (ELCC)
standards for leadership preparation outcomes (NPBEA, 2002). Although the
standards and substandards from which our survey items were drawn included
a larger number of leadership knowledge and skills topics, our preliminary
analysis identified topics on instructional leadership and leadership for organi-
zational learning as being the strongest. They were also substantively consis-
tent with the exemplary leadership practices as measured above.
The indicator variables for PQUAL were drawn from the research on qual-
ity leadership preparation cited above and assessed the extent to which prin-
cipals agreed that their preparation program had four qualities (content that
emphasized instructional leadership, integration of theory and practice,
knowledgeable faculty, and strong orientation to the principalship). The indi-
cator variables IQUAL were based on the ELCC internship quality standards
(NPBEA, 2002). Principals rated the extent to which their internships reflected
three internship qualities: program members had responsibilities typical of an
educational leader, members developed a leader’s perspective for school
improvement, and the internship was an excellent learning experience for the
principalship. Although the survey contained other program and internship
quality measures, these three were found to be the strongest. These measures
are also endogenous because they are affected by other variables.
PQUAL and IQUAL were hypothesized to have a direct positive effect on
OILL and, through this measure, to have a mediated effect on the frequency
of effective ILP and on the two school outcomes of interest. OILL was
hypothesized to have a direct positive effect on frequency of ILP and, through


Figure 2. Nested Model 2 (path diagram not reproduced)
ESC = effective school climate; ILP = instructional leadership practices; IQUAL = internship quality; ISP = index of school problems; OILL = organizational and instructional leadership learning; PQUAL = program quality; PREP = participation in an exemplary preparation program; SIP = school improvement progress.

that, to have a mediated effect on the two outcome variables, whereas ILP
was hypothesized to positively affect the two outcomes. The only exogenous
variable in the model was PREP, which was hypothesized to have a direct
positive effect on program attributes (PQUAL and IQUAL) and ILP.
Other important variables were also considered to account for possible
specification errors. Figure 2 shows the hypothesized effects of six moderat-
ing factors (principals’ attributes and contextual factors). The three moderat-
ing attributes of principals are their prior experience in leading instruction
while in a nonsupervisory position (PELI), such as department chair, team
leader, instructional specialist, or coach; their number of years of experience
as a principal (EXPP); and their positive belief that they can influence school
change (PBP). All three are hypothesized to have direct positive effects on
leadership practices and, through that variable, a moderating effect on the
two outcome variables.
The three moderating variables on the principal practices–school out-
comes relationship are district support (DS), an index of challenging school
conditions (ISP), and a school poverty level measured as the percentage of
poor students in a school (SPL). The DS variable, drawn from McLaughlin
and Talbert’s (2002) district research on school improvement (Center for


Research on the Context of Teaching, 2002), refers to whether the district


supports a principal’s efforts to improve the school’s performance, encour-
ages and facilitates a principal’s professional development, and helps a prin-
cipal promote and nurture a focus on teaching and learning and on school
improvement. ISP includes perceived lack of parental involvement, student
absenteeism, and students coming to school unprepared to learn and is drawn
from the federal School and Staffing Survey. DS was hypothesized to posi-
tively affect the two school outcomes, whereas ISP was hypothesized to neg-
atively affect the outcome variables. The effect of SPL on the outcome variables
was also assumed to be negative.

Results
Any structural equation model is composed of two components: the mea-
surement model and the structural model. In the section that follows we
present the results of fitting the measurement models for the latent variables
using CFA.

Measurement Models
First, we evaluated the measurement models underlying the latent variables
used in the analysis. We examined the relationships between our proposed
latent variables and their indicators using CFA to determine the validity and
reliability of the measures. If the latent variables were not satisfactorily mea-
sured with appropriate measurement models, there would be little point in
thinking of estimating statistical relationships between them (Bollen, 1989;
Diamantopoulos & Siguaw, 2000).
The validity of the indicators can be assessed by examining the magnitude
and significance of the paths between each latent variable and its indicators.
If any given indicator is a valid measure of a specific latent variable, then
the relationship between them must be substantial. The standardized validity
coefficient, which gives the expected number of standard deviation units that
the indicator changes for a one standard deviation change in the latent vari-
able, was used to assess these relationships. The standardized validity coeffi-
cients also allow the possibility of comparing the validity of different indicators
measuring a particular construct, which is impossible with unstandardized
loadings (Diamantopoulos & Siguaw, 2000).
Appendix E presents the complete measurement models for all latent vari-
ables. In total, 30 indicator variables loaded on eight different latent variables,
whereas five other latent variables were single-indicator variables that we
assumed were measured with no measurement error. Inspection of the range of

Table 2. Validity and Reliability of Latent Variables' Measurement Models

Latent Variable                                         Range of Standardized Loadings   Composite Reliability   Average Variance Extracted
Program quality                                         .63–.86                          .84                     .56
Internship quality                                      .80–.89                          .89                     .74
Instructional leadership practices                      .64–.69                          .75                     .44
Organizational and instructional leadership learning    .76–.92                          .92                     .69
Index of school problems                                .69–.91                          .84                     .63
District support                                        .86–.89                          .93                     .76
School improvement progress                             .76–.82                          .86                     .62
Effective school climate                                .68–.89                          .80                     .58

standardized loadings for each latent variable revealed that overall the observed
variables were valid measures for their respective latent variables. The majority
of the loadings were well above .70, and the lowest loading for any observed
variable was .63. The construct of ILP was the only latent variable with notably
lower loadings for its four observed variables, but they are considered adequate
although not as strong as the loadings for other latent variables.
Table 2 presents a summary of the measurement model for each latent
variable with information regarding the validity and reliability for each latent
construct. The reliability for most indicators was satisfactory. The composite
reliability value is an overall measure of each latent variable’s reliability. All
latent variables had values well above .60, which is considered the lowest
desired value (Bagozzi & Yi, 1988). The average variance extracted (AVE)
values are complementary measures of reliability and offer information not
provided by the composite reliability value. They show the amount of vari-
ance that is captured by the construct in relation to the amount of variance
because of measurement error (Fornell & Larcker, 1981) and should exceed
.50, which means that the underlying latent variable accounts for a greater
amount of variance in the indicators than does the measurement error. Five
latent variables have AVE values greater than .50, but two (PQUAL, ESC)
have values between .50 and .60 and another variable (ILP) has an AVE
of .44. The reason for the low AVE values for ILP is the small amount of

variance explained by almost all indicators (close to 50%). This was expected
given the complicated nature of the construct we are trying to measure. As a
result, we considered the AVE values to be satisfactory and believed that
overall we had good evidence of validity and reliability for all latent variables.
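
To make these reliability computations concrete, the following sketch (in Python; it is an illustration, not part of the original analysis) computes composite reliability and AVE from standardized loadings, using the three IQUAL loadings reported in Appendix E. The results approximate the .89 and .74 values reported for internship quality in Table 2.

```python
import numpy as np

def composite_reliability(loadings):
    """Composite reliability from standardized loadings:
    (sum of loadings)^2 / [(sum of loadings)^2 + sum of error variances],
    where each indicator's error variance is 1 - loading^2."""
    loadings = np.asarray(loadings, dtype=float)
    numerator = loadings.sum() ** 2
    error_var = (1.0 - loadings ** 2).sum()
    return numerator / (numerator + error_var)

def average_variance_extracted(loadings):
    """AVE: mean of the squared standardized loadings."""
    loadings = np.asarray(loadings, dtype=float)
    return (loadings ** 2).mean()

# Standardized loadings for the three internship quality (IQUAL) items
# reported in Appendix E (q13c, q13e, q13f).
iqual_loadings = [0.80, 0.88, 0.89]

print(round(composite_reliability(iqual_loadings), 2))       # ~0.89, as in Table 2
print(round(average_variance_extracted(iqual_loadings), 2))  # ~0.74, as in Table 2
```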

Estimation of Structural Models


The purpose of the estimation is to minimize the differences between ele-
ments of the sample covariance matrix and the model-implied covariance
matrix. Appendix F provides intercorrelations among the variables used in
the SEM analysis. The method of estimation was maximum likelihood because
it provides efficient estimation under the assumption of multivariate normal-
ity and is relatively robust against moderate departures from the latter. We
used the model comparison strategy where a number of alternative (and
nested) models were a priori specified and fitted to the data. This approach
is regarded as superior for model specification when compared with other
approaches, such as single-model estimation and the model modification strat-
egy (Bollen, 1989; MacCallum, 1995). The evaluation of each model is based
on certain fit criteria, which are summarized below.
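
As an illustration of the estimation principle described above, the sketch below (a simplified illustration, not the authors' LISREL code) evaluates the maximum likelihood discrepancy function for a sample covariance matrix S and a model-implied matrix Sigma; estimation seeks the parameter values that minimize this function, and (N − 1) times its minimum gives the model χ².

```python
import numpy as np

def ml_fit_function(S, Sigma):
    """Maximum likelihood discrepancy function for covariance structures:
    F_ML = log|Sigma| + tr(S Sigma^-1) - log|S| - p,
    where p is the number of observed variables. F_ML = 0 when Sigma equals S."""
    p = S.shape[0]
    _, logdet_s = np.linalg.slogdet(S)
    _, logdet_m = np.linalg.slogdet(Sigma)
    return logdet_m + np.trace(S @ np.linalg.inv(Sigma)) - logdet_s - p

# Toy 2x2 example: a slightly misspecified implied matrix yields F_ML > 0;
# scaling by (N - 1) expresses the discrepancy on the chi-square metric.
S = np.array([[1.00, 0.45], [0.45, 1.00]])
Sigma = np.array([[1.00, 0.40], [0.40, 1.00]])
print(ml_fit_function(S, Sigma))          # small positive discrepancy
print(175 * ml_fit_function(S, Sigma))    # scaled as a chi-square with N = 176
```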

Estimation of Hypothesized Models


We estimated three models. Model 1 is the basic conceptual model depicted
in Figure 1 that includes all hypothesized relationships among the variables
of the study. Model 2 (shown in Figure 2) is a nested version of this general
model but differs from the general model in that we kept only the paths with
significant coefficients from Model 1. In this way we can test whether the
deleted paths have any collective significance or not. Some of the significant
coefficients in Model 1 were not significant in Model 2 after removing the
insignificant paths. Consequently, we tested a second nested model, Model 3
(shown in Figure 3), in which we kept only the paths with significant coef-
ficients from Model 2. Appendix G provides the complete structural equation
estimates.

Comparison of Models
Given the nature of the three models, we can explicitly compare them by
estimating the difference in the χ2 statistic between any two models and then
assessing its significance with the critical values of the χ2 distribution. When
two models were found to be equivalent, our decision rule was to select the

Figure 3. Nested Model 3 (path diagram, not reproduced, in which PREP predicts PQUAL and IQUAL, PQUAL and IQUAL predict OILL, OILL predicts ILP, ILP predicts SIP and ESC, and ISP predicts SIP).
ESC = effective school climate; ILP = instructional leadership practices; IQUAL = internship quality; ISP = index of school problems; OILL = organizational and instructional leadership learning; PQUAL = program quality; PREP = participation in an exemplary preparation program; SIP = school improvement progress.
*p < .05.

most parsimonious of them. Table 3 reports fit indexes for the three estimated
models and differences in the χ2 statistic between Models 1 and 3.
The three models have similar satisfactory fit as indicated by the fit
indexes. Based on the calculated χ2 difference between Model 1 and Model 3,
we concluded that Model 3 was statistically equivalent to the general concep-
tual Model 1 because this difference was not statistically significant. As a
result, our model of choice was Model 3.
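
A χ² difference test of this kind can be carried out as in the sketch below (Python, illustrative only). The degrees-of-freedom difference between Models 1 and 3 is not reported in Table 3, so the value used here is a placeholder assumption.

```python
from scipy.stats import chi2

def chi_square_difference_test(chisq_nested, df_nested, chisq_full, df_full):
    """Compare two nested covariance structure models.
    The nested (more constrained) model has the larger chi-square and df;
    a non-significant difference means the simpler model fits as well."""
    diff = chisq_nested - chisq_full
    df_diff = df_nested - df_full
    p_value = chi2.sf(diff, df_diff)
    return diff, df_diff, p_value

# Model 3 vs. Model 1: chi-square values from Table 3 (617.30 and 601.35,
# difference 15.95). Model 3 has 536 df; Model 1's df is not reported in the
# article, so 524 below is a hypothetical value used for illustration only.
diff, df_diff, p = chi_square_difference_test(617.30, 536, 601.35, 524)
print(diff, df_diff, round(p, 3))  # retain the simpler Model 3 if p > .05
```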

Assessment of Model Fit for Selected Model


A model’s fit is the extent to which the hypothesized model is consistent with
the sample data. The model’s absolute fit is primarily measured with the χ2
statistic, which is the only inferential statistic available (Hoyle & Panter,
1995). The χ2 statistic provides a test of perfect fit in which the null hypoth-
esis states that the model fits the population data perfectly. For our selected
model the χ2 statistic is 617.30, with 536 degrees of freedom (p = .008). A
statistically significant statistic indicates significant discrepancies between
the population and the model implied covariance matrices. However, the

Table 3. Fit Indexes for the Basic Model 1 and Nested Models 2 and 3

Model                                                                        χ²        χ²/df   RMSEA   CFI   GFI
Model 1 (basic conceptual model with all proposed paths)                     601.35*   1.14    .020    .95   .85
Model 2 (basic conceptual with only paths with significant
  coefficients from Model 1)                                                 610.87*   1.14    .021    .95   .85
Model 3 (basic conceptual with only paths with significant
  coefficients from Model 2)                                                 617.30*   1.15    .023    .95   .85
χ² difference between Model 1 and Model 3                                    15.95

CFI = comparative fit index; GFI = goodness of fit index; RMSEA = root mean square error of approximation.
*p < .05.

assumption that the model is a true representation of reality is unrealistic
for most covariance structure models. Therefore,
researchers have promoted the use of functions of the statistic. One such mea-
sure is the χ2 statistic divided by its degrees of freedom, which estimates how
many times larger the χ2 value is compared to its expected value (Bollen, 1989).
Ratios of 3 or less have been suggested as representing a good fit (Carmines
& McIver, 1981). For our third model, the measure had a value well below
3.0 (1.15). This result provided evidence that our third model was a plausible
approximation of reality and could not be rejected.
We also assessed each model’s overall fit with the root mean square error
of approximation (RMSEA), which is a measure of the discrepancy between
the sample and implied covariance matrix per degree of freedom (Jöreskog
& Sörbom, 2001). RMSEA values of .05 or less indicate a close fit of the model,
whereas index values less than .08 indicate reasonable errors of approxima-
tion (Browne & Cudeck, 1993). In our case, the RMSEA was below the
suggested cutoff point (.023).
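
The two descriptive fit measures just discussed can be computed directly from the reported χ², degrees of freedom, and sample size, as in the sketch below (illustrative; exact values depend on the software's formula and on the sample weighting, so they need not match the reported indexes to the third decimal).

```python
import math

def chi_square_ratio(chisq, df):
    """Chi-square divided by its degrees of freedom; values of 3 or less
    are commonly taken to indicate good fit."""
    return chisq / df

def rmsea(chisq, df, n):
    """Root mean square error of approximation, using the common formula
    sqrt(max(chisq - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chisq - df, 0.0) / (df * (n - 1)))

# Model 3 values from the text: chi-square = 617.30 with 536 df, N = 176 principals.
print(round(chi_square_ratio(617.30, 536), 2))  # about 1.15, well below the cutoff of 3
print(round(rmsea(617.30, 536, 176), 3))        # a small value, below the .05/.08 cutoffs
```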

A model’s incremental fit concerns the degree to which the model in ques-
tion is superior to an alternative model (usually the null model where no
covariances among variables are specified). We used two incremental fit
indexes: the comparative fit index and the goodness of fit index. The values
of these indexes are either above or close to the commonly recommended
cutoff point of .90 (.95 and .85, respectively), indicating a good fitting model.
Based on all measures, we concluded that our third model had a reason-
ably good fit. The fact that we obtained good fit does not mean that our sug-
gested model is the correct one. Rather, it implies that it is a plausible model
and consistent with the sample data at hand (Bollen, 1989). Given the accept-
able model fit, parameter estimates are presented in the following section.

Parameter Estimates
The standardized coefficients from the completely standardized estimation
of Model 3 are presented in Figure 4, indicating that the sign of all reported
coefficients was in the expected direction. Participation in an exemplary
preparation program positively affects PQUAL and IQUAL, which in turn
positively affect principals’ OILL. OILL positively affects principals’ ILP
and finally ILP has a positive effect on both SIP and ESC. Appendix G pro-
vides all the standardized coefficients in the model’s structural equations.
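
For readers who wish to re-express Model 3's structure in open-source software, the sketch below writes the measurement and structural equations in the lavaan-style syntax used by the Python package semopy. The indicator names follow Appendix E; this is a re-expression of the published model for illustration, not the authors' original LISREL setup, and the commented fitting step assumes a data frame of item-level responses that is not reproduced here.

```python
# Lavaan-style specification of Model 3 for the Python package semopy
# (or, with minor changes, for lavaan in R). Indicator names follow Appendix E.
model_3_spec = """
# Measurement model
PQUAL =~ q6a + q6k + q6l + q6m
IQUAL =~ q13c + q13e + q13f
OILL  =~ q14b + q14h + q14o + q14r + q14s
ILP   =~ q39g + q39i + q39n + q39p
ISP   =~ q42a + q42c + q42d
SIP   =~ q41a + q41b + q41c + q41n
ESC   =~ q40a + q40b + q40c

# Structural model (paths retained in Model 3)
PQUAL ~ PREP
IQUAL ~ PREP
OILL  ~ PQUAL + IQUAL
ILP   ~ OILL
SIP   ~ ILP + ISP
ESC   ~ ILP
"""

# Hypothetical fitting step, assuming a pandas DataFrame `data` with these columns:
# from semopy import Model
# model = Model(model_3_spec)
# model.fit(data)
# print(model.inspect())
```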

Evidence for Research Question 1


Research Question 1 investigates whether participation in an exemplary
leadership preparation program is positively related to OILL and to the fre-
quent use of ILP. According to Figure 4, program and internship quality as
reported by the principals was higher for the exemplary leadership preparation
programs compared to the conventional preparation programs. The program
quality of exemplary programs was .24 standard deviations greater than the
program quality of conventional programs. Exemplary programs also seem
to provide principals with better internship experiences, as evidenced by the
.13 standard deviation difference between the two types of programs.
PQUAL and IQUAL had a significant impact on what principals learn
about organizational and instructional leadership. A 1 standard deviation
increase in program and internship quality of the principals’ preparation pro-
gram was associated with increases of .70 and .50 standard deviations, respectively,
in principals' organizational and instructional leadership learning. The two program
features also accounted for 77% of the variance observed in OILL
(see Appendix G), further strengthening the links among these variables.
Therefore, exemplary leadership preparation was positively associated with

Figure 4. Standardized coefficients from Model 3 (path diagram not reproduced): PREP → PQUAL = .24*, PREP → IQUAL = .13*, PQUAL → OILL = .70*, IQUAL → OILL = .50*, OILL → ILP = .40*, ILP → SIP = .50*, ILP → ESC = .49*, ISP → SIP = −.28*.
ESC = effective school climate; ILP = instructional leadership practices; IQUAL = internship quality; ISP = index of school problems; OILL = organizational and instructional leadership learning; PQUAL = program quality; PREP = participation in an exemplary preparation program; SIP = school improvement progress.
*p < .05.

what principals learned in their programs about organizational and instruc-
tional leadership through the extent to which program quality and internship
quality were present.
The amount of learning through program content and field experience
about organizational and instructional leadership also has a positive effect on
ILP. The more that principals learn during their preparation programs, the
more often they practice specific leadership actions, as indicated by the .40
coefficient linking leadership learning and practices. However, it should be
noted that such learning accounts for only 26% of the observed variance of
effective leadership practices. Obviously, there are other factors that have an
impact that cannot be identified in this study. Combined, these results show
that exemplary leadership preparation positively affects leadership practices
(ILP) through both the extent of exemplary preparation (PQUAL and IQUAL)
and how much is learned through the program (OILL).

Evidence for Research Question 2


Research Question 2 investigates whether the frequent use of specific ILP is
positively related to SIP and school effectiveness climate while controlling for
various principals’ attributes (principalship experience, beliefs about the prin-
cipalship, and prior experience in leading instruction) and contextual factors

(DS, SPL, ISP). The results presented in Figure 4 show positive associations
between leadership practices and the two outcome variables in Model 3. A 1
standard deviation increase in the frequency of leadership practices was asso-
ciated with a .50 standard deviation increase in school improvement. The
effect of leadership practices on promoting ESC was similar (.49).
The only contextual factor with a significant effect on either SIP or ESC is
the extent of various school problems. Even when taking ILP into account, a
1 standard deviation increase in school problems (ISP, as defined for the pur-
poses of this study) was associated with a .28 standard deviation decrease in
school improvement progress. DS and SPL had no significant effects on SIP.
The three principals’ attributes (positive beliefs, principalship experience,
and prior experience in leading instruction) also had insignificant effects on
both SIP and ESC and were dropped from the final model. ILP and ISP
explained 37% of the variance observed in SIP.

Evidence for Research Question 3


By combining all of these measures, SEM analysis enabled us to identify the total
effects for model variables, as addressed by Research Question 3. Table 4 presents
the total effects of PREP, IQUAL, and PQUAL on OILL, leadership practices,
and the two school outcomes. Table 4 also reports total effects for OILL and
leadership practices. All total effects are significant at the .05 level of significance.
The results show that participation in an exemplary leadership preparation
program has a measurable effect on the other core measures, supporting our
assumption that preparation effects can be ascertained in leadership practices
and school improvement work. First, exemplary program participation was
associated with a .23 of a standard deviation increase in OILL, answering our
first research question. This is a significant effect, much of which is mediated
through the extent to which program quality and internship quality was expe-
rienced rather than program affiliation alone. OILL had a total effect of .40 on
ILP and sizeable positive total effects of approximately .20 on both outcome
measures. However, ILP had a total positive effect of a .50 standard deviation
increase on SIP (which equals its direct effect) and a .49 standard deviation
increase on ESC (also equal to its direct effect), answering our second research
question.
Most important, in answering our third research question, participation in
exemplary programs had an estimated total effect of .09 of a standard devia-
tion on the frequency of effective leadership practices; the effect was indirect,
however, mediated through program and internship quality and the extent of
learning about leadership. Exemplary leadership preparation programs also
seem to have a small positive effect on school improvement and promoting

Table 4. Total Effects of Latent Variables

                     Total Effects on Variable
Latent Variable      OILL    ILP     ESC     SIP
PREP                 .23*    .09*    .05*    .05*
PQUAL                .70*    .28*    .14*    .14*
IQUAL                .50*    .20*    .10*    .10*
OILL                         .40*    .19*    .20*
ILP                                  .49*    .50*

ESC = effective school climate; ILP = instructional leadership practices; IQUAL = internship quality; OILL = organizational and instructional leadership learning; PQUAL = program quality; PREP = participation in an exemplary preparation program; SIP = school improvement progress.
*p < .05.

an ESC. Both total effects are indirect effects that are mediated through other
variables (OILL and ILP).
The results reveal, however, that the quality of preparation—through pro-
gram quality and internship quality—is more strongly associated with the
outcome measures than is program affiliation (including the association between
program type and the two preparation quality measures). Combined, program
quality and internship quality had an estimated total effect of .48 of a stan-
dard deviation on the frequency of effective leadership practices and .24 on
each of the school outcomes: SIP and ESC.
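
Because Model 3 is recursive, the total effects in Table 4 can be recovered from the standardized direct effects in Figure 4 by summing the products of coefficients along all connecting paths. The sketch below (illustrative, not the authors' code) does this with the matrix identity total = (I − B)⁻¹ − I and closely reproduces the tabled values, up to rounding of the published coefficients.

```python
import numpy as np

# Standardized direct effects from Figure 4 / Appendix G (Model 3).
# B[i, j] is the direct effect of variable j on variable i.
variables = ["PREP", "PQUAL", "IQUAL", "OILL", "ILP", "SIP", "ESC"]
B = np.zeros((7, 7))
B[1, 0] = 0.24   # PREP  -> PQUAL
B[2, 0] = 0.13   # PREP  -> IQUAL
B[3, 1] = 0.70   # PQUAL -> OILL
B[3, 2] = 0.50   # IQUAL -> OILL
B[4, 3] = 0.40   # OILL  -> ILP
B[5, 4] = 0.50   # ILP   -> SIP
B[6, 4] = 0.49   # ILP   -> ESC

# Total effects for a recursive model: direct effects plus all indirect paths.
total = np.linalg.inv(np.eye(7) - B) - np.eye(7)

for source in ["PREP", "PQUAL", "IQUAL", "OILL", "ILP"]:
    j = variables.index(source)
    effects = {t: round(total[variables.index(t), j], 2)
               for t in ["OILL", "ILP", "SIP", "ESC"]}
    print(source, effects)
# PREP, for example, yields roughly .23 on OILL, .09 on ILP, and .05 on SIP and
# ESC, matching Table 4 (ISP, a control on SIP, is omitted from this sketch).
```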
In summary, our analysis uncovered positive, although mediated, effects
for participating in an exemplary leadership preparation program on SIP and
promoting an ESC. The results also underscore the importance of program and
internship quality as influencing leadership learning and effective leadership
practices. Their effects on SIP, however, are moderated by challenging school
conditions, as measured by the extent to which school problems exist (ISP).

Discussion and Conclusion


Our study addressed three questions. First, we investigated how the type of
preparation program (exemplary or conventional) relates to a principal’s
school leadership practices. Our results indicate a modest relationship between
program type and leadership practices but a stronger relationship between
program and internship quality and leadership practices. The results also
confirm that this influence is indirect, mediated through the extent to which
graduates learn about organizational and instructional leadership.

These results are consistent with prior findings about the influence of pro-
grams that embody quality principles (e.g., coherently organized, leadership-
relevant content, and quality internship) on the outcomes of graduates, in both
the extent of their learning about leadership and their attention to effective
instructional and organizational leadership practices as school principals
(Leithwood et al., 1996; Orr & Barber, 2007). Our two dimensions of program
features—program quality and internship quality—were moderately strongly
related to the extent of organizational and instructional leadership capacity that
principals developed through the program, suggesting that the quality of candi-
dates’ programs and their field experiences contribute significantly to what and
how much they learn about effective leadership and, through what they learn,
how they subsequently function as school leaders. This finding underscores the
importance of programs’ attention to both dimensions in the learning experi-
ences they provide for their candidates and the synergy they attain from both.
The weaker, positive association between program type and these two quality
dimensions suggests variability in candidates’ experiences within programs,
particularly around internship quality. Importantly, the higher the quality of
programs and internship experiences, the more positive the effects on candidate
learning and subsequent use of effective leadership practices.
A surprising result was that personal characteristics, although often argued to be
independently critical to principal readiness and leadership efficacy, had no influence
on the program quality–leadership practices relationship: principals' prior leadership
experience, their number of years as a principal, and their beliefs about the
principalship. This finding may be due in part, as shown in Appendix D, to the
skewness of positive belief about the principalship (M = 4.78 on a 5-point scale,
SD = 0.51) and the low number of prior leadership experiences (M = 1.1). The lack
of influence of the number of years of principal experience may reflect the fact that
both samples had modest experience. The results also suggest that quality preparation and
learning override the influence of years of experience, particularly for those
in their early years as principal.
The implications, therefore, are that leadership candidates who complete
an exemplary leadership preparation program increase the likelihood that
they will have superior preparation, thereby increasing the scope and quality
of what they learn about leadership. Being affiliated with a program recog-
nized as strong is not sufficient, however. Candidates must have both high-
quality preparation and high-quality internships to experience learning
benefits that positively influence their subsequent leadership practices.
Consequently, programs that are designed to incorporate research-recom-
mended quality features are recognizably different to their graduates (based

on graduates’ program feature ratings) and yield discernibly better learning
outcomes, as evidenced by comparisons of principals who completed one of
four exemplary programs to conventionally prepared principals.
Our second research question investigated whether a positive relationship
exists among effective leadership practices, school improvement progress,
and school effectiveness climate and whether these relationships are influ-
enced positively by district support and negatively by school conditions (the
extent of challenging problems and percentage of students in poverty). In
terms of the first part of our question, we found that a moderately strong effect
exists for both relationships, and this effect is consistent with prior research on
the influence of effective leadership practices (Leithwood & Jantzi, 2008;
Muijs et al., 2004; Robinson et al., 2008). Of particular interest is which lead-
ership practices were found to be most influential, given the debates around
the efficacy of instructional and transformational leadership practices for
school progress and student performance (Hallinger, 2005; Leithwood &
Jantzi, 2005; Robinson et al., 2008). Our measure of instructional leadership
practices was based on four indicators of frequent principal practices. As
shown in Appendix E, all four represent a focus on instructional effectiveness
for student learning, development of teachers’ capacity both individually and
collectively, and use of data to guide their school improvement work. To us,
these practices represent the integration of instructional and transformational
leadership, with its purposeful focus on developing teachers and other staff
and the organization as a whole for the improvement of teaching and learning.
These findings are consistent with Leithwood and Jantzi’s (2008) most recent
analysis of leadership efficacy and leadership behavior, which found strong
effects associated with organizational redesign and supporting the work of
others but less from direction setting. They are also consistent with Robinson
et al.’s (2008) findings of which leadership practices are most influential for
improved student learning, which stress promoting teacher learning and devel-
opment, establishing goals and expectations, and coordinating and evaluating
teaching and curriculum over other more managerial practices.
Similarly, we were surprised to find that fostering SIP and having an ESC
were separate outcomes that were independently influenced by principals’ ILP.
Both sets of outcomes included indicators of teachers’ investment in improving
teaching and learning. Among the school improvement indicators of progress
made over the prior year that were most significant were development of a
consensus among staff about school goals, teachers’ collaboration in instruc-
tional decisions, and teachers’ efforts to improve instructional strategies and
share practices. The ESC measure included three indicators of how teachers
in the school work collaboratively: by feeling responsible to help each other,
continually learning, and using time together to discuss teaching and learning.

These results are consistent with prior research on school improvement and
climate (Muijs et al., 2004; Sebring et al., 2006) and underscore the leadership
influence on these two critical and complementary outcomes.
In terms of the second part of this research question, however, we found
limited moderating effects. Only the severity of school problems influenced
any of the dependent measures—only SIP—reducing the positive influence of
school leadership practices. This relationship is consistent with prior research
(Leithwood & Jantzi, 2008; Muijs et al., 2004; Robinson et al., 2008). DS and
student poverty levels, which had been found to be significant in other studies
of the leadership practice–school performance relationship, added little here.
In addressing the third research question, we found that, taken together,
the results are very encouraging, showing that quality preparation matters
and contributes significantly to what graduates learn and, ultimately, to how
they practice leadership and work to improve their schools. The results also
show that the quality of the program features—focus, content, faculty, and
internships—is more important for a candidate’s success than simply enroll-
ing in an exemplary program.
The results yield important implications for universities and other sponsors
of leadership preparation programs, districts, state policy makers, and educa-
tional researchers. First, the results confirm the importance of program and
internship quality for how much candidates learn about leadership and how, as
school leaders, they frequently use effective leadership practices. Moreover,
how preparation programs are designed and organized to operationalize key
quality features influences these outcomes. Of the seven features stressed most
by the research literature above, four are shown here to be most influential as a
combined influence: instructional leadership-focused program content, integra-
tion of theory and practice, knowledgeable faculty, and a strong orientation to
the principalship as a career. Their relationship suggests that there is a syner-
gistic effect when these elements are combined coherently. A fifth quality feature
has an equally important but independent influence; it embodies several elements
similar to the program quality features (an orientation to the principalship and a
focus on leadership for school improvement) while also offering opportunities to
take on responsibilities for leading, facilitating, and making decisions typical of an
educational leader. Such features, therefore, appear to be critical for universities
and other providers as they redesign their programs.
For districts, the implications are twofold: their role in leadership prepara-
tion and their selection decisions in hiring school leader candidates. Although
not shown here, the case study research on the four leadership preparation pro-
grams (Darling-Hammond et al., 2007) used as the exemplary programs for
this research shows the strong role of districts in program design and delivery
in two of them (ELDA in San Diego and the Bank Street College Principals

Institute in New York City) and their role (combined with essential external
funding) in providing paid full-time internships for these two programs, as at
Delta State University's program. The importance of the district's role can be inferred
further from the strong emphasis on the principal career and instructional lead-
ership for school improvement, both priorities for the participating districts
whose programs were designed to address leadership shortages while contrib-
uting to school improvement efforts. The preparation–school outcomes rela-
tionship confirms the value of investing in leadership preparation as part of
district school improvement efforts. Finally, given these positive findings, dis-
tricts should give greater attention to the nature and quality of leadership prepa-
ration in screening candidates for school leadership positions.
The results yield similar implications for policy makers who look for strat-
egies to strengthen leadership preparation quality as part of a more compre-
hensive effort to improve school performance and student achievement. The
results stress the importance of the coherence and focus of program content
around instructional leadership and school improvement and the enabling of
high-quality internships. State policy makers can use program guidelines to
reinforce these program features and direct funding for paid internships, par-
ticularly to serve the districts most in need.
Finally, the results have implications for researchers. This research had its
roots within two initiatives—a national taskforce on evaluating leadership
preparation and a national study of exemplary leadership preparation and
development programs (Darling-Hammond et al., 2009; Orr & Pounder,
2006). Through the efforts of these initiatives, significant conceptual and
methodological developments were accomplished, leading to the positive
investigation of our research questions here. But this work is just the begin-
ning and needs to be extended in three ways. First is to obtain concurrent
validity for the findings. One step would be to connect the survey results to the
schools’ achievement outcomes to validate whether principals’ perceptions of
SIP and ESC are yielding intended student learning gains. A second step
would be to solicit data on teachers’ experiences to compare to and validate
principals’ perceptions of their leadership work, school effectiveness climate,
and SIP. Second, the findings need to be verified by undertaking similar
research with other program samples to replicate the findings and validate the
relationships found here. Third, further conceptual and measurement work
is needed on the central measure—ILP—on which the findings hinge, which
had the weakest measurement characteristics of all the study variables.
Further work is needed to capture effectively the nature and intensity of what

principals do to improve schools in order to differentiate better the influence
of preparation and how their practices contribute to school improvement.

Appendix A
Exemplary Preparation Programs
Program: Delta State University, Cleveland, MS
Delta State University, a regional public institution, redesigned its leadership preparation into a 14-month master's degree program to focus on instructional leadership and provided a full-time internship (with multiple placements) and full-year financial support for teachers to prepare to become principals. The program's aim is to prepare candidates who can transform schools in the poor, mostly rural region. Applicants are recommended by their districts, based on teaching and informal instructional leadership, and about 15 candidates are selected for each cohort. Courses are organized as intensive seminars alternating with various internship placements where they are supervised by experienced administrators. Course work and internships are closely linked through field-based projects and problem-based learning. The program is supported by local districts and the state of Mississippi (through its sabbatical leave program).

Program: University of Connecticut's Administrator Preparation Program (UCAPP), Storrs, CT
The UCAPP program is a 2-year, 32-credit post–master's degree program that combines leadership-related course work and a 2-year internship for working professionals. Through superintendent referral and rigorous selection, the program admits 15 candidates per cohort. The program combines conventional course work in administration and supervision with course work on school improvement (including a curriculum laboratory and teacher evaluation and development). Candidates complete 80 days of internship (including summer school) in a different district, under the supervision of a mentor principal. Each candidate develops his or her own plan of study and produces a portfolio of documented work, including a school–community analysis. Program faculty are committed to continuous improvement to blend course work and field work, using an analytic, reflective approach.

Program: Principal's Institute at Bank Street College, New York, NY
Working with Bank Street College, a private school of education in New York City, Region 1 (one of 10 divisions of NYC public schools) developed a continuum of leadership preparation, including principal preparation, induction, and in-service support, using public and private funding. This continuum aims to create leadership for improved teaching and learning closely linked to the district's instructional reforms. For its preparation program, the region nominates applicants based on their teaching quality and instructional leadership; applicants complete a multistage interview process, including a panel interview. Candidates enroll in an 18-month program at Bank Street College, leading to building and district leader certification, and earn a master's degree. The program combines course work in instructional leadership and organizational change, offered both at the regional offices and at the college, a full-time summer internship and two other robust field-based experiences, and an advisory and conference group structure that fosters reflective learning under the supervision of an experienced educator. Through advisement and conference group meetings, candidates reflect on practice and develop new skills and strategies. The program's candidate goals are lifelong learning, reflective practice, inquiry, and advocacy and are aligned with the region's goals for its principals.

Program: Educational Leadership Development Academy at the University of San Diego, San Diego, CA
San Diego's continuum of leadership preparation and development reflects a closely aligned partnership between the school district and the University of San Diego, made possible through foundation support. The continuum's preservice and in-service programs support the development of instructional leaders within a context of district instructional reform. The district nominates applicants based on excellent teaching and instructional leadership. Candidates complete a yearlong program of study, which includes a full-year paid internship mentored by an expert principal and opportunities to network as part of a cohort. Program content combines instructional leadership, organizational development, and change management, with an emphasis on school planning and teacher professional development.

Source: Darling-Hammond, LaPointe, Meyerson, Orr, and Cohen (2007).

Appendix B
Weights Used in Data Analysis
Program sample weight:

WT2PRINC_sp = (N_sp / n_sp) × (Σ_{s=AL}^{WY} Σ_{p=1}^{8} n_sp) / (Σ_{s=AL}^{WY} Σ_{p=1}^{8} N_sp)

Where:
N = number of schools represented by respondents
n = number in the sample
s = state (STATE)
p = program (PROGID = 1–8)

National sample weight:

WT2PRINC_sp = (N_sp / n_sp) × (Σ_{s=AL}^{WY} Σ_{p=98}^{99} n_sp) / (Σ_{s=AL}^{WY} Σ_{p=98}^{99} N_sp)

Where:
N = number of schools in state
n = number in the sample
s = state (STATE)
p = program (PROGID = 98, 99), reflecting the elementary and secondary
principal samples
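
A post-stratification weight of this form can be computed from cell-level sample and population counts as in the sketch below (Python, for illustration only; the function name, column names, and counts are hypothetical and are not drawn from the study's data files).

```python
import pandas as pd

def principal_weights(counts):
    """Compute a WT2PRINC-style weight for each state-by-program cell:
    (N / n) rescaled so that the weights of sampled principals sum to the
    total sample size, where N is the population count and n the sample
    count in the cell."""
    scale = counts["n"].sum() / counts["N"].sum()
    counts = counts.copy()
    counts["WT2PRINC"] = (counts["N"] / counts["n"]) * scale
    return counts

# Hypothetical counts for two state-by-program cells (not the study's actual data).
cells = pd.DataFrame({
    "state":  ["MS", "CT"],
    "progid": [1, 2],
    "N":      [40, 60],   # schools represented in the population
    "n":      [10, 20],   # principals responding in the sample
})
print(principal_weights(cells))
```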

Appendix C
Correlations (Absolute Values) Between Sample
Characteristics and Variables Measuring School Improvement
Progress and Effective School Climate
                            Effective School Climate        School Improvement Progress
Characteristic              q40a    q40b    q40c            q41a    q41b    q41c    q41n
Age                         .25     .24     .26             .04     .09     .10     .13
% female                    .14     .13     .32             .14     .21     .23     .22
% minority                  .07     .17     .13             .08     .15     .05     .23
Teaching experience         .19     .18     .22             .04     .07     .04     .11
Principalship experience    .25     .24     .26             .04     .03     .00     .07

Principals N = 176. q40a = teachers in this school feel responsible to help each other do their best; q40b =
teachers in this school are continually learning and seeking new ideas; q40c = teachers use time together
to discuss teaching and learning; q41a = consensus among staff about school’s goals; q41b = collaboration
among teachers in making curriculum and instructional decisions; q41c = focus by teachers on improving and
expanding their instructional strategies; q41n = efforts among teachers to share practices with each other.

Appendix D
Univariate Summary Statistics
Variable Latent Variable M SD Skewness Kurtosis
q6a PQUAL 4.52 0.69 −1.23 3.56
q6k PQUAL 4.39 0.76 −0.95 2.88
q6l PQUAL 4.51 0.65 −1.23 4.39
q6m PQUAL 4.28 0.88 −0.98 2.98
q13c IQUAL 4.24 0.99 −1.30 4.20
q13e IQUAL 4.34 0.87 −1.33 4.55
q13f IQUAL 4.44 0.92 −1.74 5.43
q14b OILL 3.84 0.97 −0.59 2.76
q14h OILL 4.13 0.84 −0.82 3.49
q14o OILL 3.96 0.99 −0.76 3.41
q14s OILL 3.69 0.92 −0.33 2.69
q14r OILL 3.93 0.93 −0.76 3.41
q39g ILP 2.84 0.70 0.15 2.26
q39i ILP 2.79 0.81 0.20 1.94
q39n ILP 2.84 0.72 0.16 2.13
q39p ILP 2.61 0.72 0.39 2.53
q40a ESC 4.09 0.82 −0.74 3.18
q40b ESC 4.01 0.78 −0.46 2.81
q40c ESC 3.97 0.81 −0.55 3.27
q41a SIP 3.93 0.76 −0.35 2.84
q41b SIP 4.17 0.66 −0.04 3.24
q41c SIP 4.20 0.66 −0.48 3.32
q41n SIP 4.06 0.67 −0.42 3.40
q42a ISP 2.83 1.25 0.12 2.12
q42c ISP 2.41 1.07 0.39 2.61
q42d ISP 2.80 0.99 0.37 2.87
q43c DS 3.38 0.73 −1.18 4.39
q43d DS 3.36 0.73 −0.94 3.44
q43e DS 3.05 0.73 −0.34 2.68
q43f DS 3.29 0.70 −0.68 3.02
instldex PELI 1.11 1.10 0.73 2.69
frl SPL 49.88 29.99 1.95 1.91
pbelief PBP 4.78 0.51 −3.28 19.29
pexp EXPP 2.80 1.93 1.47 6.55

DS = district support; ESC = effective school climate; EXPP = principalship experience; ILP = instructional
leadership practices; IQUAL = internship quality; ISP = index of school problems; OILL = organizational and
instructional leadership learning; PBP = positive belief about principalship; PELI = prior experience in leading
instruction; PQUAL = program quality; SIP = school improvement progress; SPL = school poverty level.
Variables are on a 1–5 scale. Exceptions are instldex (1–4), which was drawn from a federal survey, frl, which
is a continuous variable measuring the percentage of students on free and reduced-price lunch (0%–100%),
and pexp, which is a continuous variable measuring principalship experience in years. For variable definitions,
see Appendix E.

Appendix E
Loadings of Survey Items (Indicator Variables) on Latent
Variables
Latent variables, survey items (indicator variables), standardized loadings, and R²:

Participation in exemplary preparation program (PREP): dichotomous variable indicating participation in an exemplary or conventional preparation program (loading = 1.00, R² = 1.00).

Internship quality (IQUAL): the extent to which their educational leadership internship experience(s) reflected the following attributes (5-point agreement scale).
  q13c: principal had responsibilities for leading, facilitating and making decisions typical of an educational leader (loading = 0.80, R² = 0.64).
  q13e: principal was able to develop an educational leader's perspective on school improvement (loading = 0.88, R² = 0.78).
  q13f: internship experience was an excellent learning experience for becoming a principal (loading = 0.89, R² = 0.79).

Program quality (PQUAL): the extent to which the following qualities were true of their educational leadership program (5-point agreement scale).
  q6a: program content emphasized instructional leadership (loading = 0.63, R² = 0.39).
  q6k: program integrated theory and practice (loading = 0.69, R² = 0.47).
  q6l: faculty members were very knowledgeable about their subject matter (loading = 0.80, R² = 0.64).
  q6m: program gave a strong orientation to the principalship as a career (loading = 0.86, R² = 0.75).

Organizational and instructional leadership learning (OILL): how effectively their formal leadership program prepared them to do the following (5-point effectiveness scale).
  q14b: create a coherent educational program across the school (loading = 0.76, R² = 0.57).
  q14h: create a collaborative learning organization (loading = 0.83, R² = 0.69).
  q14o: use data to monitor school progress (loading = 0.73, R² = 0.54).
  q14r: engages in comprehensive planning for school improvement (loading = 0.92, R² = 0.84).
  q14s: redesigns school organization to enhance productive teaching and learning (loading = 0.90, R² = 0.81).

Instructional leadership practices (ILP): in the past month, approximately how often the principal engaged in the following activities in their role as principal of this school (5-point frequency scale).
  q39g: fosters teachers' professional development for instructional knowledge and skills (loading = 0.61, R² = 0.38).
  q39i: use data to monitor school progress, identify problems and propose solutions (loading = 0.63, R² = 0.40).
  q39n: work with teachers to change teaching methods where students are not succeeding (loading = 0.74, R² = 0.55).
  q39p: works with faculty to develop goals for their practice and professional learning (loading = 0.63, R² = 0.39).

Prior experience in leading instruction (PELI): whether the principal had prior experience as department chair, team or grade-level leader, instructional specialist, or coach (excluding assistant principal) (dichotomy) (loading = 1.00, R² = 1.00).

Index of school problems (ISP): extent to which principal perceives each of the following to be a problem in his or her school.
  q42a: lack of parental involvement (loading = 0.78, R² = 0.61).
  q42c: student absenteeism (loading = 0.69, R² = 0.47).
  q42d: students coming to school unprepared to learn (loading = 0.91, R² = 0.82).

Experience as a principal (EXPP): number of years as a principal of any school (loading = 1.00, R² = 1.00).

Positive belief about the principalship (PBP): extent to which the principal agrees that the "principalship enables me to influence school change" (5-point agreement scale) (loading = 1.00, R² = 1.00).

School poverty level (SPL): percentage of students in school who are eligible for free- or reduced-price lunch (federal poverty standard) (loading = 1.00, R² = 1.00).

District support (DS): how strongly the principal agrees or disagrees with the following statements regarding his or her district (5-point agreement scale).
  q43c: district supports school's efforts to improve (loading = 0.86, R² = 0.75).
  q43d: district promotes the principal's professional development (loading = 0.87, R² = 0.76).
  q43e: district encourages principals to take risks in order to make change (loading = 0.87, R² = 0.76).
  q43f: district helps the principal to promote and nurture a focus on teaching and learning (loading = 0.89, R² = 0.79).

School improvement progress (SIP): extent to which the principal agrees that there has been an increase or decrease in the following in his or her school since last year (5-point agreement scale).
  q41a: consensus among staff about school's goals (loading = 0.80, R² = 0.64).
  q41b: collaboration among teachers in making curriculum and instructional decisions (loading = 0.82, R² = 0.67).
  q41c: focus by teachers on improving and expanding their instructional strategies (loading = 0.76, R² = 0.58).
  q41n: efforts among teachers to share practices with each other (loading = 0.76, R² = 0.57).

Effective school climate (ESC): the extent to which the principals felt each statement describes their school currently (5-point extensiveness scale).
  q40a: teachers in this school feel responsible to help each other do their best (loading = 0.69, R² = 0.47).
  q40b: teachers in this school are continually learning and seeking new ideas (loading = 0.89, R² = 0.80).
  q40c: teachers use time together to discuss teaching and learning (loading = 0.68, R² = 0.46).

Note: Loadings represent standardized loadings.

Appendix F
Intercorrelation Matrices
Variable 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
Principals (N = 176)  
  1.q6a —   .50   .41   .50 .26 .35 .35 .38 .38 .38 .34 .33 .05 .13 .25 .07 .05 .13
  2.q6k —   .58   .62 .11 .34 .29 .40 .43 .38 .39 .30 .09 .13 .18 .11 .15 .14
  3.q61 —   .66 .21 .39 .29 .39 .50 .43 .42 .34 .13 .09 .25 .22 .15 .06
  4.q6m — .25 .42 .46 .42 .40 .49 .51 .45 .14 .21 .28 .20 .14 .12
  5.q13c — .68 .68 .45 .17 .33 .27 .33 .19 .04 .13 .03 −.06 .02
  6.q13e — .82 .60 .36 .53 .51 .49 .31 .17 .28 .14 .06 .09
  7.q13f — .54 .29 .38 .37 .41 .24 .11 .15 .10 .03 .11
  8.q14b — .52 .52 .54 .56 .22 .16 .311 .14 .12 .16
  9.q14h — .50 .60 .55 .13 .18 .32 .21 .32 .29
10.q140 — .62 .56 .24 .24 .28 .15 .09 .13
11.q14r — .71 .21 .18 .34 .15 .19 .18
12.q14s — .12 .26 .31 .19 .21 .15
13.q39g — .29 .45 .43 .09 .19
14.q39i — .43 .44 .12 .28
15.q39n — .49 .12 .19
16.q39p — .14 .16

17.q40a — .57
Variable 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35  
  1.q6a .20   .09   .12   .17 .15 .0 .00 −.05 .08 .24 .15 .13 −.03 .16 .14 .11 .42  
  2.q6k .18   .22   .20   .19 .26 .06 −.13 −.09 .07 .07 .04 .12 .01 .28 .20 −.02 .37  
Variable 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
  3.q61 .19   .08   .18   .15 .17 −.09 −.07 −.16 .06 .09 .12 .16 .05 .22 .12 .06 .22  
  4.q6m .21   .11   .18   .15 .198 .00 .02 −.05 .02 .06 .05 .06 .04 .24 .14 .03 .34  
  5.q13c .00 −.07 −.07 −.08 −.05 .10 .17 .07 .16 .15 .19 .14 .01 .30 .22 .24 .12  
  6.q13e .05   .09   .07   .11 .05 .04 .11 −.09 .20 .16 .21 .27 −.07 .19 .25 .20 .30  
  7.q13f .09   .04   .07   .05 .10 .02 .07 −.05 .19 .18 .18 .21 −.08 .21 .21 .11 .28  
  8.q14b .19   .14   .17   .15 .15 .00 −.01 .00 .24 .22 .22 .29 .05 .23 .24 .08 .27  
  9.q14h .30   .23   .21   .22 .26 −.13 −.25 −.17 .15 .13 .10 .27 .16 .20 .09 .05 .23  
10.q140 .16   .10   .13   .07 .16 −.08 −.14 −.14 .17 .12 .109 .19 .04 .20 .20 .18 .35  
11.q14r .21   .21   .18   .29 .25 .00 −.12 −.18 .10 .13 .09 .22 .06 .22 .14 .12 .24  
12.q14s .22   .20   .09   .18 .24 −.04 −.09 −.15 .16 .19 .09 .28 .06 .24 .16 .05 .25  
13.q39g .06   .15   .22    .14 .15 .00 −.06 −.07 .10 .06 .11 .18 −.13 .13 .09 .25 .20  
14.q39i .16   .16   .06   .15 .10 .09 −.17 −.07 .11 .13 .04 .11 .08 .12 .08 .06 .17  
15.q39n .22   .18   .26   .31 .29 −.04 −.16 −.10 .10 .17 .18 .19 −.10 .07 .19 .14 .27  
16.q39p .16   .12   .20   .25 .22 −.03 −.16 −.06 .05 .04 .02 .14 −.07 .06 .03 .15 .12  
17.q40a .47   .16   .23   .20 .26 −.14 −.22 −.23 .13 .10 .00 .25 .26 .16 .00 .00 .00  
18.q40b .57   .22   .27   .27 .30 .00 −.11 −.10 .07 .12 .03 .15 .17 .20 .13 −.04 −.01  
19.q40c   .16   .43   .40 .46 −.03 −.15 −.16 −.02 .05 .01 .12 .22 .15 .14 .10 .12  
20.q41a   .51   .41 .37 −.15 −.10 −.12 −.05 .06 .02 .12 −.07 .13 .18 .05 .13  

21.q41b   .57 .52 −.10 −.09 −.12 .00 .16 .07 .17 −.03 .00 .11 .08 .13  
22.q41c .54 −.10 −.16 −.17 −.09 .07 .02 .13 .00 .07 .07 .04 .10  
23.q41n −.03 −.21 −.19 −.10 .07 .02 .16 .07 .08 .23 .08 .18  
24.q42a .34 .50 .02 −.04 .05 .00 −.09 −.04 .38 .11 .03  

Variable 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
25.q42c .51 −.08 −.04 −.07 −.08 −.22 −.06 .16 .06 −.06  
26.q42d −.06 −.10 .00 −.14 −.10 −.03 .25 .02 .01  
27.q43c .62 .59 .62 −.02 .06 −.03 −.08 −.08  
28.q43d .67 .65 −.02 .15 .04 .04 .00  
29.q43e .64 .01 .09 .10 .00 .03  
30.q43f .02 .05 .04 .05 −.05  
31.pexp .02 −.07 .01 −.20  
32.pbel .07 .01 .16  
33.frl .19 .30  
34.ili .09  
pexp = principalship experience; pbel = positive belief about principalship; frl = % of free and reduced-price lunch; ili = index of leading instruction;
pr = type of preparation (1 = exemplary preparation program).

Appendix G
Parameter Estimates (Standardized Coefficients) for
Structural Equations in Three Models
Model 1
Structural Equations Estimates
Parameters IQUAL PQUAL OILL ILP SIP ESC
PREP .13* .24*  
IQUAL .49*  
PQUAL .70*  
OILL .39*  
ILP .47* .49*
ISP −.41* −.29*
PELI .05 .05  
PBP .00 .00  
EXPP −.06 .19
SPL .00 .00
DS −.10 −.07
R2 .03 .12 .77 .16 .40 .36

Model 2.
Structural Equation Estimates

Parameters IQUAL PQUAL OILL ILP SIP ESC


PREP   .13*   .24*  
IQUAL .50*  
PQUAL .69*  
OILL .39*  
ILP .47* .47*
ISP −.39* −.24
R2 .03 .12 .77 .15 .37 .28

Model 3.
Structural Equation Estimates

Parameters IQUAL PQUAL OILL ILP SIP ESC


PREP   .13*   .24*  
IQUAL .50*  
PQUAL .70*  
OILL .40*  
ILP .50* .49*
ISP −.28*  
R2 .03 .12 .77 .16 .33 .24

DS = district support; ESC = effective school climate; EXPP = principalship experience; ILP =
instructional leadership practices; IQUAL = internship quality; ISP = index of school prob-
lems; OILL = organizational and instructional leadership learning; PBP = positive belief about
principalship; PELI = prior experience in leading instruction; PQUAL = program quality; PREP
= participation in an exemplary preparation program; SIP = school improvement progress; SPL
= school poverty level.
*p < .05.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interests with respect to the authorship
and/or publication of this article.

Funding
The authors received no financial support for the research and/or authorship of this
article.

Notes
1. According to Hallinger and Heck (1998), a mediated effects framework hypoth-
esizes that leaders affect school outcomes through indirect paths, which include
“other people, events, and organizational factors” (p. 167). Antecedent factors
are other variables, such as a school’s socioeconomic status, that may have an
effect on school outcomes but not the leadership–school outcome relationship. A
reciprocal effects model proposes that “the relationships between the administra-
tor and features of the school and its environment are interactive” (p. 167).
2. Robinson, Lloyd, and Rowe (2008) apply James MacGregor Burns's (1978) defi-
nition of transformational leadership: the ability to “engage with staff in ways
that inspire them to new levels of energy, commitment and moral purpose” and,
through such leadership and use of a common vision, to transform “the organiza-
tion by developing its capacity to work collaboratively to overcome challenges
and reach ambitious goals" (p. 639).

3. A moderator variable influences the strength or direction of the relationship
between independent and dependent variables, whereas a mediator variable
accounts for the relationship between the independent and dependent variables
(Baron & Kenny, 1986).
4. See the technical report for a description of this sample as drawn from lists
provided by the National Association of Elementary School Principals and
the National Association of Secondary School Principals (Darling-Hammond,
LaPointe, Meyerson, Orr, & Cohen, 2007).
5. A latent variable is a measure that is not directly observed but is inferred from
other (indicator) variables that are observed and measured. Sets of indicator vari-
ables can be used to define a latent variable (Schumaker & Lomax, 2004).

References
Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and
Human Decision Processes, 50, 179-211.
Bagozzi, R. P., & Yi, Y. (1988). On the evaluation of structural equation models. Jour-
nal of the Academy of Marketing Science, 16, 74-94.
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in
social psychological research: Conceptual, strategic, and statistical considerations.
Journal of Personality and Social Psychology, 51, 1173-1182.
Bollen, K. (1989). Structural equations with latent variables. New York, NY: John
Wiley.
Bottoms, G., & O’Neill, K. (2001). Preparing a new breed of school principals: It’s
time for action. Atlanta, GA: Southern Regional Education Board.
Brand, S., Felner, R., Minsuk, S., Seitsinger, A., & Dumas, T. (2003). Middle school
improvement and reform: Development and validation of a school-level assess-
ment of climate, cultural pluralism, and school safety. Journal of Educational
Psychology, 95, 570-588.
Brown, K. M., Anfara, V. A., Jr., & Roney, K. (2004). Student achievement in high
performing, suburban middle schools and low performing, urban middle schools:
Plausible explanations for the differences. Education and Urban Society, 36,
428-456.
Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In
K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 136-162).
Newbury Park, CA: Sage.
Burns, J. M. (1978). Leadership. New York, NY: Harper & Row.
Carmines, E. G., & McIver, J. P. (1981). Analyzing models with unobserved
variables: Analysis of covariance structures. In G. W. Bohrnstedt & E. F. Borgatta
(Eds.), Social measurement: Current issues (pp. 65-115). Beverly Hills, CA:
Sage.

Center for Research on the Context of Teaching. (2002). Bay Area School Reform Col-
laborative, Phase 1 (pp. 65-115). San Francisco, CA: Bay Area School Reform
Collaborative.
Curran, P. J., Finch, J. F., & West, S. G. (1996). The robustness of test statistics to non-
normality and specification error in confirmatory factor analysis. Psychological
Methods, 1, 16-29.
Darling-Hammond, L., LaPointe, M., Meyerson, D., Orr, M. T., & Cohen, C. (2007).
Preparing leaders for a changing world. Palo Alto, CA: Stanford University, Edu-
cational Leadership Institute.
Darling-Hammond, L., Meyerson, D., LaPointe, M. M., & Orr, M. T. (2009). Preparing principals for a changing world. San Francisco, CA: Jossey-Bass.
Davis, S., Darling-Hammond, L., Meyerson, D., & LaPointe, M. (2005). Review of
research. School leadership study. Developing successful principals. Palo Alto,
CA: Stanford University, Educational Leadership Institute.
Diamantopoulos, A., & Siguaw, J. (2000). Introducing LISREL. Thousand Oaks, CA:
Sage.
Educational Research Service. (2000). The principal, keystone of a high-achieving
school: Attracting and keeping the leaders we need. Arlington, VA: Author.
Farkas, S., Johnson, J., Duffett, A., & Foleno, T. (2001). Trying to stay ahead of the
game: Superintendents and principals talk about school leadership. New York,
NY: Public Agenda.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unob-
servable variables and measurement error. Journal of Marketing Research, 18, 39-50.
Goldring, E. B., & Pasternack, R. (1994). Principals’ coordinating strategies
and school effectiveness. School Effectiveness and School Improvement, 5,
239-253.
Hale, E. L., & Moorman, H. N. (2003). Preparing school principals: A national per-
spective on policy and program innovations. Washington, DC: Institute for Edu-
cational Leadership, Illinois Education Research Council.
Hallinger, P. (2005, April). Instructional leadership: How has the model evolved and what have we learned? Paper presented at the meeting of the American Educational Research Association, Montreal, Canada.
Hallinger, P., & Heck, R. (1996). Reassessing the principal’s role in school effec-
tiveness: A review of empirical research, 1980–1995. Educational Administration
Quarterly, 32, 5-44.
Hallinger, P., & Heck, R. (1998). Exploring the principal’s contributions to school effec-
tiveness. School Effectiveness and School Improvement, 9(2), 157-191.
Hoyle, R., & Panter, A. (1995). Writing about structural equation models. In R. Hoyle
(Ed.), Structural equation modeling: Concepts, issues and applications (pp. 158-176).
Thousand Oaks, CA: Sage.


Institute for Educational Leadership. (2000). Leadership for student learning: Reinvent-
ing the principalship. Report of the Taskforce on the Principalship. Washington,
DC: Author.
Jackson, B. L., & Kelley, C. (2002). Exceptional and innovative programs in educa-
tional leadership. Educational Administration Quarterly, 38, 192-212.
Jöreskog, K., & Sörbom, D. (2001). LISREL 8: User’s reference guide. Lincolnwood,
IL: Scientific Software International.
Kirkpatrick, D. L. (1998). Evaluating training programs: The four levels (2nd ed.).
San Francisco, CA: Berrett-Koehler.
Kottkamp, R. B., & Rusch, E. A. (2009). The landscape of scholarship on the educa-
tion of school leaders, 1985–2006. In M. D. Young, G. M. Crow, J. Murphy, &
R. T. Ogawa (Eds.), Handbook of research on the education of school leaders
(pp. 23-85). New York, NY: Routledge.
Lad, K., Browne-Ferrigno, T., & Shoho, A. (2005, November). Leadership preparation admission criteria: Examining the spectrum from open enrollment to elite selection. Paper presented at the annual convention of the University Council for Educational Administration, Nashville, TN.
Leithwood, K., & Jantzi, D. (1999). The relative effects of principal and teacher
sources of leadership on student engagement with school. Educational Adminis-
tration Quarterly, 35(Suppl.), 679-706.
Leithwood, K., & Jantzi, D. (2000). Principal and teacher leadership effects: A repli-
cation. School Leadership & Management, 20(4), 415-434.
Leithwood, K., & Jantzi, D. (2005, April). A review of transformational school lead-
ership research. Paper presented at the meeting of the American Educational
Research Association, Montreal, Canada.
Leithwood, K., & Jantzi, D. (2008). Linking leadership to student learning: The con-
tributions of leader efficacy. Educational Administration Quarterly, 44, 496-528.
Leithwood, K., Jantzi, D., Coffin, G., & Wilson, P. (1996). Preparing school leaders:
What works? Journal of School Leadership, 6, 316-342.
Leithwood, K., Louis, K. S., Anderson, S., & Wahlstrom, K. (2004). How leadership
influences student learning. Toronto, Canada: Center for Applied Research and
Educational Improvement and Ontario Institute for Studies in Education.
Leithwood, K., & Riehl, C. (2005). What we know about successful school leader-
ship. In W. Firestone & C. Riehl (Eds.), A new agenda: Directions for research
on educational leadership (pp. 22-47). New York, NY: Teachers College Press.
Levine, A. (2005). Educating school leaders. Washington, DC: The Education Schools
Project.
MacCallum, R. (1995). Model specification: Procedures, strategies, and related issues.
In R. Hoyle (Ed.), Structural equation modeling: Concepts, issues and applica-
tions (pp. 16-36). Thousand Oaks, CA: Sage.


McCarthy, M. M. (1999). The evolution of educational leadership preparation programs. In J. Murphy & K. S. Louis (Eds.), Handbook of research on educational administration: A project of the American Educational Research Association (pp. 119-139). San Francisco, CA: Jossey-Bass.
McCarthy, M. M., & Forsyth, P. B. (2009). An historical review of research and devel-
opment activities pertaining to the preparation of school leaders. In M. D. Young,
G. M. Crow, J. Murphy, & R. T. Ogawa (Eds.), Handbook of research on the edu-
cation of school leaders (pp. 86-128). New York, NY: Routledge.
McLaughlin, M., & Talbert, J. (2002). Reforming districts. In A. M. Hightower, M.
Knapp, J. A. Marsh, & M. McLaughlin (Eds.), School districts and instructional
renewal (pp. 173-192). New York, NY: Teachers College Press.
Muijs, D., Harris, A., Chapman, C., Stoll, L., & Russ, J. (2004). Improving schools in
socioeconomically disadvantaged areas—A review of research evidence. School
Effectiveness and School Improvement, 15, 149-175.
National Policy Board for Educational Administration. (2002). Instructions to imple-
ment standards for advanced programs in educational leadership for principals,
superintendents, curriculum directors and supervisors. Arlington, VA: Author.
National Policy Board for Educational Administration. (2005). A listing of nationally recognized educational leadership preparation programs at NCATE accredited colleges and universities. Retrieved from http://www.npbea.org/ELCC/ELCC-approved_and_denied_list_0805.pdf
Orr, M. T. (2003, April). Evaluating educational leadership development: Measuring
leadership, its development and its impact. Paper presented at the meeting of the
American Educational Research Association, Chicago, IL.
Orr, M. T. (2006a). Mapping innovation in leadership preparation in our nation’s
schools of education. Phi Delta Kappan, 87, 492-499.
Orr, M. T. (2006b). Research on leadership education as a reform strategy. Journal of
Research on Leadership Education, 1, 1-5.
Orr, M. T. (2009). Program evaluation in leadership preparation and related fields. In
M. D. Young & G. Crow (Eds.), Handbook of research on the education of school
leaders (pp. 457-498). New York, NY: Routledge.
Orr, M. T., & Barber, M. E. (2007). Collaborative leadership preparation: A compara-
tive study of innovative programs and practices. Journal of School Leadership,
16, 709-739.
Orr, M. T., King, C., & LaPointe, M. M. (2009). Districts developing leaders: Eight
districts’ lessons on strategy, program approach and organization to improve the
quality of leaders for local schools (Report prepared for the Wallace Foundation).
Newton, MA: Education Development Center.
Orr, M. T., & Pounder, D. G. (2006). UCEA/TEA-SIG Taskforce on Evaluating Leadership Preparation Programs. Taskforce report: Six years later and future directions. Austin, TX: University Council for Educational Administration. Retrieved from http://www.ucea.org/evaluation/about.html
Orr, M. T., Silverberg, R., & LeTendre, B. (2006, April). Comparing leadership development from pipeline to preparation to advancement: A study of multiple institutions’ leadership preparation programs. Paper presented at the meeting of the American Educational Research Association, San Francisco, CA.
Osterman, K. F., & Sullivan, S. (1996). New principals in an urban bureaucracy: A
sense of efficacy. Journal of School Leadership, 6, 661-690.
Pounder, D. G., & Merrill, R. J. (2001). Job desirability of the high school principal-
ship: A job choice theory perspective. Educational Administration Quarterly, 37,
22-57.
Robinson, V. M. J., Lloyd, C. A., & Rowe, K. J. (2008). The impact of leadership on
student outcomes: An analysis of the differential effects of leadership types. Edu-
cational Administration Quarterly, 44, 635-674.
Sanders, N. M., & Simpson, J. (2005). State policy framework to develop highly qual-
ified administrators. Washington, DC: Council of Chief State School Officers.
Schumacker, R. E., & Lomax, R. G. (2004). A beginner’s guide to structural equation modeling (2nd ed.). Mahwah, NJ: Lawrence Erlbaum.
Sebring, P. B., Allensworth, E., Bryk, A. S., Easton, J. Q., & Luppescu, S. (2006). The
essential supports for school improvement. Chicago, IL: University of Chicago,
Consortium on Chicago School Research.
Shen, J., Rodriguez-Campo, L., & Rincones-Gomez, R. (2000). Characteristics of
urban school principals: A national trend study. Education and Urban Society,
32, 481-491.
Southern Regional Education Board. (2006). Schools can’t wait: Accelerating the
redesign of university principal preparation programs. Atlanta, GA: Author.
Sweetland, S., & Hoy, W. K. (2000). School characteristics and educational outcomes:
Toward an organizational model of student achievement in middle schools. Edu-
cational Administration Quarterly, 36, 703-729.
Tschannen-Moran, M., & Gareis, C. R. (2005, November). Cultivating principals’ sense of efficacy: Supports that matter. Paper presented at the annual convention of the University Council for Educational Administration, Nashville, TN.
U.S. Department of Education. (2004). Innovations in education: Innovative path-
ways to school leadership. Washington, DC: U.S. Department of Education,
Office of Innovation and Improvement.
Waters, J. T., Marzano, R. J., & McNulty, B. A. (2003). Balanced leadership: What 30 years of research tells us about the effect of leadership on student achievement. Aurora, CO: McREL.
Young, M. D., Crow, G. M., Murphy, J., & Ogawa, R. T. (Eds.). (2009). Handbook of research on the education of school leaders. New York, NY: Routledge.


Bios

Margaret Terry Orr (PhD, Columbia) is on the faculty of Bank Street College of Education, where she directs its multidistrict partnership program, the Future School Leaders Academy. She co-chairs a national taskforce on evaluating leadership preparation programs and has published widely, including co-authoring Preparing Principals for a Changing World (Jossey-Bass).

Stelios Orphanos (PhD, Stanford) is a lecturer in educational administration at Frederick University, Cyprus. His research interests include the preparation and effectiveness of educational leaders, ethical leadership, school climate, and organizational behavior.
