Article
Educational Administration Quarterly
1 Bank Street College, New York, NY, USA
2 Frederick University, Limassol, Cyprus

Corresponding Author:
Stelios Orphanos, Frederick University, Mariou Agathangelou 18, Ag. Georgios Havouzas, 3080 Limassol, Cyprus
Email: s.orphanos@frederick.ac.cy

Abstract
Purpose: This study attempted to determine the influence of exemplary leadership preparation on what principals learn about leadership, their use of effective leadership practices, and how their practices influence school improvement and the school’s learning climate. The authors also investigated how the frequency of effective leadership practices related to the strength of district support and the extent of school problems and student poverty. Finally, the authors examined the contribution of exemplary leadership preparation to variations in school improvement progress and school effectiveness climate. Research Design: The study, using survey research conducted in 2005, compared 65 principals who had graduated from one of four selected exemplary leadership preparation programs to a national sample of 111 principals. The authors used structural equation modeling to find the best
fit. Findings: Participation in an exemplary leadership preparation program was significantly associated with learning about effective leadership and engaging in these practices, particularly where stronger preparation program and internship quality existed. Frequent use of effective leadership practices was positively associated with school improvement progress and school effectiveness climate. Taken together, exemplary leadership preparation had a positive but mediated influence on variations in school improvement progress and school effectiveness climate; the relationship was even stronger when focusing on preparation program and internship quality measures. Conclusions: Faculty investments in preparation program and internship quality will positively contribute to the leadership knowledge of graduates and their leadership practices and school improvement progress. These results yield significant implications for policy makers, universities, and other providers of leadership preparation.
Keywords
leadership preparation, internship, principals, school improvement
Public demands for more effective schools have placed growing attention on
the influence of school leaders, primarily principals and assistant principals.
Numerous studies have consistently found positive relationships between
principals’ practices and various school outcomes, including student achieve-
ment (Hallinger & Heck, 1996; Leithwood & Jantzi, 2008; Robinson, Lloyd,
& Rowe, 2008). This growing body of research has tried to account for the
ways that school leaders influence the academic achievement of students in
their schools. The research has yielded strong evidence that leaders’ influence
is felt primarily through their direct effects on staff and organizational condi-
tions. Given these findings, policy makers and educational experts are increas-
ingly turning to educational leadership preparation and development as a
strategy for improving schools and student achievement (Educational Research
Service, 2000; Farkas, Johnson, Duffett, & Foleno, 2001; Hale & Moorman,
2003). The collective aim of these efforts is to develop leaders who can pro-
mote powerful teaching and effective learning for all students in their schools
(Bottoms & O’Neill, 2001).
Existing research has contributed to important changes in the nature and
quality of leadership preparation. In recent years, many university-based edu-
cational leadership preparation programs have redesigned their content and
Study Framework
Leadership preparation program models and features, which are critical to the
development of effective leadership practices that generate school improvement
and effectiveness, provide the framework for this study. The presumed path-
way from leadership preparation to principals’ school improvement work is
based on heretofore only partially tested assumptions about mediated relation-
ships in a causal chain of influences (i.e., the relationship of leadership prac-
tices to teacher practices and school climate and, in turn, their relationship to
student outcomes, including engagement and achievement). In studies of both
leadership and effective school improvement, several researchers have
mapped the relationship among leadership practices, changes in teaching and
school organization practices, and student outcomes. Both types of research
identify student, teacher, and organizational conditions that are most predictive
of student achievement gains and examine which leadership practices and
school improvement changes are most commonly at play.
Based on a review of available research, Orr (2003, 2006b) established a
conceptual model that stepped back further to suggest how leadership prepa-
ration influences the practices of principals and school improvement and
effectiveness. She delineated a series of staged outcomes that took into
account participants’ program experiences, what they learned, and how they
applied that knowledge in their schools, which is similar to the four-stage
outcome model for evaluating adult development programs (Kirkpatrick,
1998). An overriding assumption of Orr’s conceptual model is that better
quality leadership preparation has a positive influence on graduates’ leader-
ship practices, which in turn influences the quality of their school improve-
ment practices and the educational climate of their schools. The primary
independent variable of this model is quality leadership preparation: The ini-
tial and direct outcome of leadership preparation is graduates’ knowledge
about school leadership and the extent to which they exercise effective lead-
ership practices. Thus, there may be an important, albeit distant, relationship
between leadership preparation and student achievement.
The study reported here uses this hypothesized model as its framework
and tests whether there is a series of mediated relationships among variables
in a causal model that links leadership preparation to school improvement. In
this section we review prior evidence about leadership preparation and these
relationships.
Research on the outcomes of leadership preparation is a very new area, limited by scholarly skepticism over the perceived
legitimacy of such research and difficulties of launching large-scale com-
parative research studies (Kottkamp & Rusch, 2009; McCarthy & Forsyth,
2009). Available research results, however, are positive, highlighting the exemplary preparation approaches that most influence leadership development; this work has focused on specific program features (e.g., program content and organization) or on the combination of features in innovative programs.
Three studies have investigated the relationship between individual pro-
gram features and graduate outcomes for innovative or conventional pro-
grams. Leithwood et al. (1996) documented 11 innovative graduate-level
leadership preparation programs that were redesigned through a Danforth
Foundation grant initiative and surveyed teachers who worked in schools led
by program graduates. The authors found that programs’ innovative use of
several features—instructional strategies, cohort membership, and program
content—was most predictive of teachers’ positive perceptions of principals’
leadership effectiveness (e.g., in setting direction, developing staff, fostering
a positive school culture, and focusing on curriculum and instruction).
Two more recent studies investigated the relationship between quality pro-
gram features and initial graduate outcomes: what graduates learned about
leadership, their beliefs about the principalship as a career, and their actual
career advancement. The studies were modeled on the theory of planned
behavior, which asserts that career intentions are strongly predictive of sub-
sequent career advancement and are influenced by individuals’ perceived
efficacy and beliefs about the position (Ajzen, 1991), and on other related
research on beliefs about the principalship and career aspirations (Pounder &
Merrill, 2001). Orr and Barber (2007) compared the outcomes for graduates
of two university–district partnership programs (both designed to include
many of the innovative features identified above) with outcomes for graduates
of a conventional program (with few such features). They found that three
program features—supportive program structures (e.g., accessibility and
scheduling convenience), a comprehensive and standards-based curriculum,
and broader, more intensive internships—were significantly but differentially
related to three types of outcomes: self-assessed leadership knowledge and
skills, leadership career intentions, and actual career advancement.
Similarly, Orr, Silverberg, and LeTendre (2006) examined how differ-
ences in five programs’ incorporation of these innovative features and overall
program redesign to meet national and state standards were associated with
graduate learning and career outcomes. They found that the five programs
varied most on measures of three types of program features: program chal-
lenge and coherence, use of active student-centered instructional practices,
and internship length and quality. How graduates rated the extent to which
these features were attributes of their preparation was significantly related to
how much they learned about instructional and organizational leadership. The
length and quality of internships, however, were uniquely associated with
graduates’ career intentions and subsequent advancement. Thus, the authors
concluded, programs with well-implemented, innovative features yield posi-
tive and significantly better outcomes than more typical programs.
These three studies used different outcome measures for evaluating the
influence of leadership preparation. Leithwood et al. (1996) focused on
the association of preparation to teachers’ perceptions of leadership practices.
In comparison, Orr and Barber (2007) and Orr et al. (2006) evaluated more
near-term outcomes, including graduates’ self-assessed leadership knowledge
and skills, extent of learning about instructional and organizational leadership,
positive and negative beliefs about the principalship, career intentions, and
actual career advancement—all of which were associated with one or more
preparation program features.
Finally, a few studies are now including measures of leadership preparation quality as influences on leadership outcomes, such as leadership self-efficacy and leadership practices (Tschannen-Moran & Gareis, 2005). For
example, Tschannen-Moran and Gareis (2005), in their study of 558 princi-
pals, found that perceived quality and utility of leadership preparation signifi-
cantly contributed to principals’ sense of leadership self-efficacy.
[Figure omitted: conceptual framework diagram showing variables including SIP, IQUAL, PELI, PBP, PQUAL, and ESC.]
Method
Drawing from the above research, we concluded that a positive relationship might exist between completion of a quality leadership preparation program and the following: leadership knowledge, effective leadership practices, and school improvement and effectiveness outcomes.
Sample
The study compares a sample drawn from all principals who completed one
of four exemplary leadership preparation programs between 1999 and 2005
to a national sample of comparison principals. The initial source of these two
samples was the Stanford University study described above. The origins of
this source and how the two samples were narrowed for the purposes of this
study are described below.
The first sample was constructed through a two-step process. It began with
selecting the four exemplary leadership programs that coherently organized preparation around core assumptions of effective leadership and preparation, had stringent student selection criteria, used program designs that emphasized instructional leadership and intensive internships, engaged in university–district collaborations, operated in types of institutions and state policy contexts supportive of leadership preparation, and represented different geographic regions. The four programs were identified through interviews with experts,
a review of the published professional research on leadership preparation,
and initial consideration of a much larger sample of programs. Appendix A
contains a brief description of the four programs.
Next, in 2005 all of the graduates (who had graduated between 1999 and
2005) of these programs were identified and surveyed (as explained in the
study’s report; Darling-Hammond et al., 2007). Response rates ranged between
50% and 71% and yielded 198 exemplary program graduates.
The comparison sample was drawn from a survey of two national lists of
principals.4 Of 1,229 principals surveyed, 661 responded (54%).
To make the two sample groups more comparable and relevant for our
study, we restricted each to only those principals who had completed a lead-
ership preparation program between 1999 and 2005, had an internship expe-
rience while in their program, and were currently principals. These limitations
reduced the exemplary prepared sample to 65 respondents. Of them, 37%
were from the Educational Leadership Development Academy (ELDA) pro-
gram at the University of San Diego in California, 35% had graduated from
Delta State University in Mississippi, 23% were from the University of
Connecticut, and 5% were from Bank Street College’s Principals Institute in
New York City. We applied the same restrictions to the comparison sample.
We eliminated from the sample those comparison principals who were not
current principals in 2005 and those who lacked an internship. These steps
reduced the comparison sample to 111 principals. One consequence is that we may have narrowed the differences between the two groups, overstating the quality of conventional preparation on leadership outcomes, because we excluded proportionally more principals from the comparison group for not having had an internship experience (one critical requisite of quality preparation). Specifically, we eliminated 23% of the comparison principal sample but only 7% of the exemplary prepared principals for not having had an internship as part of their preparation experience.
Another potential confound was the number of years since program completion. To control for the influence
of years since program completion, we limited the sample to only recently
prepared principals and included principal experience in the SEM models.
Measures of district support in program participation and in school leader-
ship were included in initial analyses to investigate these influences but
yielded weak associations, so they were discarded for the purposes of this
analysis. Still, the sample of exemplary prepared principals was much more
likely to have had district support when selected to participate.
Data Source
Both samples of principals completed a standardized 48-question survey proto-
col, made available to them online and by mail. The survey instrument was based
on a graduate survey developed and piloted by the University Council for
Educational Administration/Teaching Educational Administration Special
Interest Group of the American Educational Research Association (UCEA/TEA-
SIG) Taskforce on Evaluating Leadership Preparation Programs (for details on
the initial survey construction, see Orr et al., 2006). That survey had been aligned
in part with national leadership preparation standards (NPBEA, 2002) and with
measures of leadership drawn from Leithwood and Jantzi’s (2000) leadership
effectiveness research (for further details on the survey construction, see
Darling-Hammond et al., 2007). The survey was fielded by WestEd Associates
and other research staff between February and May 2005 using mail and online
survey strategies and phone follow-up with nonresponders.
Study Limitations
The research was limited by its cross-sectional nature and its reliance on
principals’ self-reports. Any bias that self-reporting creates was assumed to
be similar across the two samples.
Data Analysis
We performed a three-step analysis, beginning with a preliminary analysis of
our data set to test specific assumptions regarding the distribution of the data
to be analyzed. Then we conducted confirmatory factor analysis (CFA) to
verify the structure of latent variables used in the analysis and computed
validity and reliability measures to assess the adequacy of their measurement
models.5 The last step was to examine the relationships presented in Figure 1
using SEM techniques.
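The preliminary distributional screen described above can be sketched in code. This is a minimal illustration, not the authors' actual procedure: it assumes the common SEM rule of thumb that univariate skewness below |2| and excess kurtosis below |7| are acceptable, and uses made-up data.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def normality_screen(x, max_skew=2.0, max_kurt=7.0):
    """Flag a variable whose skewness or excess kurtosis exceeds common
    SEM cutoffs (|skew| <= 2, |excess kurtosis| <= 7 are assumed here)."""
    s = float(skew(x))
    k = float(kurtosis(x))  # Fisher definition: 0 for a normal distribution
    return {"skew": s, "kurtosis": k,
            "acceptable": bool(abs(s) <= max_skew and abs(k) <= max_kurt)}

# Illustrative data: a roughly normal survey-scale variable passes the screen
rng = np.random.default_rng(42)
result = normality_screen(rng.normal(loc=3.5, scale=0.8, size=500))
```

A variable failing such a screen would typically prompt transformation or the use of estimators robust to nonnormality before fitting the SEM.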
[Figure omitted: hypothesized structural model showing variables including SIP, IQUAL, PQUAL, and ESC.]
OILL was hypothesized to positively affect ILP and, through that, to have a mediated effect on the two outcome variables, whereas ILP
was hypothesized to positively affect the two outcomes. The only exogenous
variable in the model was PREP, which was hypothesized to have a direct
positive effect on program attributes (PQUAL and IQUAL) and ILP.
Other important variables were also considered to account for possible
specification errors. Figure 2 shows the hypothesized effects of six moderat-
ing factors (principals’ attributes and contextual factors). The three moderat-
ing attributes of principals are their prior experience in leading instruction
while in a nonsupervisory position (PELI), such as department chair, team
leader, instructional specialist, or coach; their number of years of experience
as a principal (EXPP); and their positive belief that they can influence school
change (PBP). All three are hypothesized to have direct positive effects on
leadership practices and, through that variable, a moderating effect on the
two outcome variables.
The three moderating variables on the principal practices–school out-
comes relationship are district support (DS), an index of challenging school
conditions (ISP), and a school poverty level measured as the percentage of
poor students in a school (SPL). The DS variable, drawn from McLaughlin
and Talbert’s (2002) district research on school improvement (Center for
Results
Any structural equation model is composed of two components: the mea-
surement model and the structural model. In the section that follows we
present the results of fitting the measurement models for the latent variables
using CFA.
Measurement Models
First, we evaluated the measurement models underlying the latent variables
used in the analysis. We examined the relationships between our proposed
latent variables and their indicators using CFA to determine the validity and
reliability of the measures. If the latent variables were not satisfactorily mea-
sured with appropriate measurement models, there would be little point in
thinking of estimating statistical relationships between them (Bollen, 1989;
Diamantopoulos & Siguaw, 2000).
The validity of the indicators can be assessed by examining the magnitude
and significance of the paths between each latent variable and its indicators.
If any given indicator is a valid measure of a specific latent variable, then
the relationship between them must be substantial. The standardized validity
coefficient, which gives the expected number of standard deviation units that
the indicator changes for a one standard deviation change in the latent vari-
able, was used to assess these relationships. The standardized validity coeffi-
cients also allow the possibility of comparing the validity of different indicators
measuring a particular construct, which is impossible with unstandardized
loadings (Diamantopoulos & Siguaw, 2000).
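The rescaling behind the standardized validity coefficient can be sketched as follows. This is a generic illustration with hypothetical numbers, not values from the study: the standardized loading multiplies the unstandardized loading by the ratio of the latent variable's standard deviation to the indicator's, yielding the expected SD change in the indicator per 1 SD change in the latent variable.

```python
def standardized_loading(unstd_loading, sd_latent, sd_indicator):
    """Standardized validity coefficient: SD units of change in the
    indicator expected for a one-SD change in the latent variable."""
    return unstd_loading * sd_latent / sd_indicator

# Hypothetical indicator: unstandardized loading 2.0, latent SD 0.5,
# indicator SD 2.0 -> standardized loading 0.5
coef = standardized_loading(2.0, 0.5, 2.0)
```

Because standardized loadings share a common metric, they can be compared across indicators of the same construct, which is what the unstandardized loadings do not allow.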
Appendix E presents the complete measurement models for all latent vari-
ables. In total, 30 indicator variables loaded on eight different latent variables,
whereas five other latent variables were single-indicator variables that we
assumed were measured with no measurement error. Inspection of the range of
standardized loadings for each latent variable revealed that overall the observed
variables were valid measures for their respective latent variables. The majority
of the loadings were well above .70, and the lowest loading for any observed
variable was .63. The construct of ILP was the only latent variable with notably
lower loadings for its four observed variables, but they are considered adequate
although not as strong as the loadings for other latent variables.
Table 2 presents a summary of the measurement model for each latent
variable with information regarding the validity and reliability for each latent
construct. The reliability for most indicators was satisfactory. The composite
reliability value is an overall measure of each latent variable’s reliability. All
latent variables had values well above .60, which is considered the lowest
desired value (Bagozzi & Yi, 1988). The average variance extracted (AVE)
values are complementary measures of reliability and offer information not
provided by the composite reliability value. They show the amount of vari-
ance that is captured by the construct in relation to the amount of variance
because of measurement error (Fornell & Larcker, 1981) and should exceed
.50, which means that the underlying latent variable accounts for a greater
amount of variance in the indicators than does the measurement error. Five
latent variables have AVE values greater than .50, but two (PQUAL, ESC)
have values between .50 and .60 and another variable (ILP) has an AVE
of .44. The reason for the low AVE values for ILP is the small amount of
variance explained by almost all indicators (close to 50%). This was expected
given the complicated nature of the construct we are trying to measure. As a
result, we considered the AVE values to be satisfactory and believed that
overall we had good evidence of validity and reliability for all latent variables.
Comparison of Models
Given the nature of the three models, we can explicitly compare them by
estimating the difference in the χ2 statistic between any two models and then
assessing its significance with the critical values of the χ2 distribution. When
two models were found to be equivalent, our decision rule was to select the
most parsimonious of them. Table 3 reports fit indexes for the three estimated
models and differences in the χ2 statistic between Models 1 and 3.
The three models have similar satisfactory fit as indicated by the fit
indexes. Based on the calculated χ2 difference between Model 1 and Model 3,
we concluded that Model 3 was statistically equivalent to the general concep-
tual Model 1 because this difference was not statistically significant. As a
result, our model of choice was Model 3.
Table 3. Fit Indexes for the Basic Model 1 and Nested Models 2 and 3

Model                                          χ²        χ²/df   RMSEA   CFI   GFI   χ² diff
Model 1 (basic conceptual model with
  all proposed paths)                          601.35*   1.14    .020    .95   .85
Model 2 (basic conceptual with only paths
  with significant coefficients from
  Model 1)                                     610.87*   1.14    .021    .95   .85
Model 3 (basic conceptual with only paths
  with significant coefficients from
  Model 2)                                     617.30*   1.15    .023    .95   .85
Difference between Model 1 and Model 3                                               15.95

CFI = comparative fit index; GFI = goodness of fit index; RMSEA = root mean square error of approximation.
*p < .05.
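The nested-model comparison refers the χ² difference to a χ² distribution whose degrees of freedom equal the number of paths fixed to zero. A sketch of the test; the Δdf of 10 used here is an assumed, illustrative value (the article reports Δχ² = 15.95 but not the df difference, which the reported χ²/df ratios suggest is roughly 10):

```python
from scipy.stats import chi2

def chi2_difference_test(chi2_full, chi2_restricted, delta_df):
    """Chi-square difference test for nested SEM models: returns the
    difference statistic and its p-value under chi2(delta_df)."""
    delta = chi2_restricted - chi2_full
    return delta, chi2.sf(delta, delta_df)

# Chi-square values from Table 3; delta_df = 10 is a hypothetical value
delta, p = chi2_difference_test(601.35, 617.30, delta_df=10)
# A nonsignificant p (> .05) means the restricted Model 3 fits as well as
# the fuller Model 1, so the more parsimonious model is preferred.
```

With these inputs the p-value comfortably exceeds .05, consistent with the authors' conclusion that Model 3 is statistically equivalent to Model 1.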
A model’s incremental fit concerns the degree to which the model in ques-
tion is superior to an alternative model (usually the null model where no
covariances among variables are specified). We used two incremental fit
indexes: the comparative fit index and the goodness of fit index. The values
of these indexes are either above or close to the commonly recommended
cutoff point of .90 (.95 and .85, respectively), indicating a good fitting model.
Based on all measures, we concluded that our third model had a reason-
ably good fit. The fact that we obtained good fit does not mean that our sug-
gested model is the correct one. Rather, it implies that it is a plausible model
and consistent with the sample data at hand (Bollen, 1989). Given the accept-
able model fit, parameter estimates are presented in the following section.
Parameter Estimates
The standardized coefficients from the completely standardized estimation
of Model 3 are presented in Figure 4, indicating that the sign of all reported
coefficients was in the expected direction. Participation in an exemplary
preparation program positively affects PQUAL and IQUAL, which in turn
positively affect principals’ OILL. OILL positively affects principals’ ILP
and finally ILP has a positive effect on both SIP and ESC. Appendix G pro-
vides all the standardized coefficients in the model’s structural equations.
[Figure 4 omitted: standardized path coefficients for Model 3, showing variables including SIP, IQUAL, PQUAL, and ESC.]
The results presented in Figure 4 show positive associations
between leadership practices and the two outcome variables in Model 3. A 1
standard deviation increase in the frequency of leadership practices was asso-
ciated with a .50 standard deviation increase in school improvement. The
effect of leadership practices on promoting ESC was similar (.49).
The only contextual factor with a significant effect on either SIP or ESC is
the extent of various school problems. Even when taking ILP into account, a
1 standard deviation increase in school problems (ISP, as defined for the pur-
poses of this study) was associated with a .28 standard deviation decrease in
school improvement practices. DS and SPL had no significant effects on SIP.
The three principals’ attributes (positive beliefs, principalship experience,
and prior experience in leading instruction) also had insignificant effects on
both SIP and ESC and were dropped from the final model. ILP and ISP
explained 37% of the variance observed in SIP.
an ESC. Both total effects are indirect effects that are mediated through other
variables (OILL and ILP).
The results reveal, however, that the quality of preparation—through pro-
gram quality and internship quality—is more strongly associated with the
outcome measures than is program affiliation (including the association between
program type and the two preparation quality measures). Combined, program
quality and internship quality had an estimated total effect of .48 of a stan-
dard deviation on the frequency of effective leadership practices and .24 on
each of the school outcomes: SIP and ESC.
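In a path model, a mediated (indirect) effect along a single chain is the product of the standardized coefficients on that chain. A sketch using the magnitudes reported above, treating the combined quality-to-practices effect (.48) and the practices-to-outcome effects (.50 and .49) as given:

```python
from math import prod

def indirect_effect(path_coefficients):
    """Standardized indirect effect along one mediating chain:
    the product of the standardized path coefficients."""
    return prod(path_coefficients)

quality_to_ilp = 0.48  # combined program + internship quality -> leadership practices
ilp_to_sip = 0.50      # leadership practices -> school improvement progress
ilp_to_esc = 0.49      # leadership practices -> school effectiveness climate

total_on_sip = indirect_effect([quality_to_ilp, ilp_to_sip])
total_on_esc = indirect_effect([quality_to_ilp, ilp_to_esc])
```

Both products come out at roughly .24, matching the total effects on SIP and ESC reported in the text and illustrating why the preparation–outcome relationship is described as positive but mediated.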
In summary, our analysis uncovered positive, although mediated, effects
for participating in an exemplary leadership preparation program on SIP and
promoting an ESC. The results also underscore the importance of program and
internship quality as influencing leadership learning and effective leadership
practices. Their effects on SIP, however, are moderated by challenging school
conditions, as measured by the extent to which school problems exist (ISP).
These results are consistent with prior findings about the influence of pro-
grams that embody quality principles (e.g., coherently organized, leadership-
relevant content, and quality internship) on the outcomes of graduates, in both
the extent of their learning about leadership and their attention to effective
instructional and organizational leadership practices as school principals
(Leithwood et al., 1996; Orr & Barber, 2007). Our two dimensions of program
features—program quality and internship quality—were moderately strongly
related to the extent of organizational and instructional leadership capacity that
principals developed through the program, suggesting that the quality of candi-
dates’ programs and their field experiences contribute significantly to what and
how much they learn about effective leadership and, through what they learn,
how they subsequently function as school leaders. This finding underscores the
importance of programs’ attention to both dimensions in the learning experi-
ences they provide for their candidates and the synergy they attain from both.
The weaker, positive association between program type and these two quality
dimensions suggests variability in candidates’ experiences within programs,
particularly around internship quality. Importantly, the higher the quality of
programs and internship experiences, the more positive the effects on candidate
learning and subsequent use of effective leadership practices.
A surprising result was that personal characteristics, although often argued to be independently critical to principal readiness and leadership efficacy, had no influence on the program quality–leadership practices relationship: principals’ prior leadership experience, their number of years as a principal, and their beliefs about the principalship. This finding may stem in part, as shown in Appendix D, from the skewness of positive beliefs about the principalship (M = 4.78 on a 5-point scale, SD = 0.51) and the low number of prior leadership experiences (M = 1.1). The lack of influence of years of principal experience may reflect the fact that both samples had modest experience. The results also suggest that quality preparation and learning override the influence of years of experience, particularly for those in their early years as principal.
The implications, therefore, are that leadership candidates who complete
an exemplary leadership preparation program increase the likelihood that
they will have superior preparation, thereby increasing the scope and quality
of what they learn about leadership. Being affiliated with a program recog-
nized as strong is not sufficient, however. Candidates must have both high-
quality preparation and high-quality internships to experience learning
benefits that positively influence their subsequent leadership practices.
Consequently, programs that are designed to incorporate research-recom-
mended quality features are recognizably different to their graduates (based
These results are consistent with prior research on school improvement and
climate (Muijs et al., 2004; Sebring et al., 2006) and underscore the leadership
influence on these two critical and complementary outcomes.
In terms of the second part of this research question, however, we found
limited moderating effects. Only the severity of school problems influenced
any of the dependent measures—only SIP—reducing the positive influence of
school leadership practices. This relationship is consistent with prior research
(Leithwood & Jantzi, 2008; Muijs et al., 2004; Robinson et al., 2008). DS and
student poverty levels, which had been found to be significant in other studies
of the leadership practice–school performance relationship, added little here.
In addressing the third research question, we found that, taken together,
the results are very encouraging, showing that quality preparation matters
and contributes significantly to what graduates learn and, ultimately, to how
they practice leadership and work to improve their schools. The results also
show that the quality of the program features—focus, content, faculty, and
internships—is more important for a candidate’s success than simply enroll-
ing in an exemplary program.
The results yield important implications for universities and other sponsors
of leadership preparation programs, districts, state policy makers, and educa-
tional researchers. First, the results confirm the importance of program and
internship quality for how much candidates learn about leadership and how frequently, as school leaders, they use effective leadership practices. Moreover,
how preparation programs are designed and organized to operationalize key
quality features influences these outcomes. Of the seven features stressed most in the research literature above, four are shown here to be most influential in combination: instructional leadership-focused program content, integration of theory and practice, knowledgeable faculty, and a strong orientation to the principalship as a career. Their relationship suggests a synergistic effect when these elements are combined coherently. A fifth quality feature has an equally important but independent influence when it embodies several elements similar to the program quality features: an orientation to the principalship and a focus on leadership for school improvement, as well as opportunities to take responsibility for leading and facilitating and for making decisions typical of an educational leader. Such features, therefore, appear to be critical for universities and other providers to incorporate as they redesign their programs.
For districts, the implications are twofold: their role in leadership prepara-
tion and their selection decisions in hiring school leader candidates. Although
not shown here, the case study research on the four programs used as exemplars
for this study (Darling-Hammond et al., 2007) shows the strong role of districts
in program design and delivery in two of them (ELDA in San Diego and the Bank
Street College Principals Institute in New York City) and, combined with
essential external funding, their role in providing paid full-time internships
in these two programs and in Delta State University's program. The importance
of the district's role can be inferred
further from the strong emphasis on the principal career and instructional lead-
ership for school improvement, both priorities for the participating districts
whose programs were designed to address leadership shortages while contrib-
uting to school improvement efforts. The preparation–school outcomes rela-
tionship confirms the value of investing in leadership preparation as part of
district school improvement efforts. Finally, given these positive findings, dis-
tricts should give greater attention to the nature and quality of leadership prepa-
ration in screening candidates for school leadership positions.
The results yield similar implications for policy makers who look for strat-
egies to strengthen leadership preparation quality as part of a more compre-
hensive effort to improve school performance and student achievement. The
results stress the importance of the coherence and focus of program content
around instructional leadership and school improvement and the enabling of
high-quality internships. State policy makers can use program guidelines to
reinforce these program features and direct funding for paid internships, par-
ticularly to serve the districts most in need.
Finally, the results have implications for researchers. This research had its
roots within two initiatives—a national taskforce on evaluating leadership
preparation and a national study of exemplary leadership preparation and
development programs (Darling-Hammond et al., 2009; Orr & Pounder,
2006). Through the efforts of these initiatives, significant conceptual and
methodological advances were made, enabling the productive investigation of
our research questions here. But this work is just the beginning and needs to
be extended in three ways. First is to obtain concurrent
validity for the findings. One step would be to connect the survey results to the
schools’ achievement outcomes to validate whether principals’ perceptions of
SIP and ESC are yielding intended student learning gains. A second step
would be to solicit data on teachers’ experiences to compare to and validate
principals’ perceptions of their leadership work, school effectiveness climate,
and SIP. Second, the findings need to be verified by undertaking similar
research with other program samples to replicate the findings and validate the
relationships found here. Third, further conceptual and measurement work
is needed on the central measure—ILP—on which the findings hinge, which
had the weakest measurement characteristics of all the study variables.
Further work is needed to capture effectively the nature and intensity of what
Appendix A
Exemplary Preparation Programs
Program Description
Delta State University, Cleveland, MS: Delta State University, a regional public
institution, redesigned its leadership
preparation into a 14-month
master’s degree program to focus on
instructional leadership and provided
a full-time internship (with multiple
placements) and full-year financial
support for teachers to prepare to
become principals. The program’s
aim is to prepare candidates who
can transform schools in the poor,
mostly rural region. Applicants are
recommended by their districts, based
on teaching and informal instructional
leadership, and about 15 candidates
are selected for each cohort. Courses
are organized as intensive seminars
alternating with various internship
placements where they are supervised
by experienced administrators. Course
work and internships are closely
linked through field-based projects and
problem-based learning. The program
is supported by local districts and
the state of Mississippi (through its
sabbatical leave program).
University of Connecticut's Administrator Preparation Program (UCAPP),
Storrs, CT: The UCAPP program is a 2-year, 32-credit post–master's degree
program that combines leadership-related course work and a 2-year
internship for working professionals.
Through superintendent referral and
rigorous selection, the program admits
15 candidates per cohort. The program
combines conventional course work
in administration and supervision with
course work on school improvement
(including a curriculum laboratory and
teacher evaluation and development).
Candidates complete 80 days of
internship (including summer school)
in a different district, under the
supervision of a mentor principal. Each
candidate develops his or her own
plan of study and produces a portfolio
of documented work, including a
school–community analysis. Program
faculty are committed to continuous
improvement to blend course work
and field work, using an analytic,
reflective approach.
Principals Institute at Bank Street College, New York, NY: Working with
Bank Street College, a private school of education in New York City,
Region 1 (one of 10
divisions of NYC public schools)
developed a continuum of leadership
preparation, including principal
preparation, induction, and in-service
support, using public and private
funding. This continuum aims to create
leadership for improved teaching
and learning closely linked to the
district’s instructional reforms. For
its preparation program, the region
nominates applicants based on their
teaching quality and instructional
leadership; applicants complete a
multistage interview process, including
a panel interview. Candidates enroll in
an 18-month program at Bank Street
College, leading to building and district
leader certification, and earn a master’s
degree. The program combines course
work in instructional leadership and
organizational change, offered both at
the regional offices and at the college,
a full-time summer internship and two
other robust field-based experiences,
and an advisory and conference
group structure that fosters reflective
learning under the supervision of
an experienced educator. Through
advisement and conference group
meetings, candidates reflect on
practice and develop new skills and
strategies. The program’s candidate
goals are lifelong learning, reflective
practice, inquiry, and advocacy and are
aligned with the region’s goals for its
principals.
Educational Leadership Development Academy at the University of San Diego,
San Diego, CA: San Diego's continuum of leadership preparation and
development reflects a closely aligned partnership between the school
district and the University
of San Diego, made possible through
foundation support. The continuum’s
preservice and in-service programs
support the development of
instructional leaders within a context
of district instructional reform. The
district nominates applicants based on
excellent teaching and instructional
leadership. Candidates complete a
yearlong program of study, which
includes a full-year paid internship
mentored by an expert principal
and opportunities to network as
part of a cohort. Program content
combines instructional leadership,
organizational development, and
change management, with an emphasis
on school planning and teacher
professional development.
Appendix B
Weights Used in Data Analysis
Program sample weight:
Where:
N = number of schools represented by respondents
n = number in the sample
s = state (STATE)
p = program (PROGID = 1–8)
National principal sample weight:
Where:
N = number of schools in state
n = number in the sample
s = state (STATE)
p = program (PROGID = 98, 99), reflecting the elementary and secondary
principal samples
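The weight expressions themselves did not survive extraction. Under standard stratified survey-weighting conventions (an assumption; the formulas are not shown in the source), the definitions above imply a weight of the form

```latex
% Hypothetical reconstruction: each respondent in state s and program
% (or sample) p is weighted by population count over sample count.
\[
  w_{sp} = \frac{N_{sp}}{n_{sp}}
\]
```

so that each stratum's respondents are scaled up to the number of schools the stratum represents.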
Appendix C
Correlations (Absolute Values) Between Sample
Characteristics and Variables Measuring School Improvement
Progress and Effective School Climate
Variables Measuring Effective School Climate
Variables Measuring School Improvement Progress
Principals N = 176. q40a = teachers in this school feel responsible to help each other do their best; q40b =
teachers in this school are continually learning and seeking new ideas; q40c = teachers use time together
to discuss teaching and learning; q41a = consensus among staff about school’s goals; q41b = collaboration
among teachers in making curriculum and instructional decisions; q41c = focus by teachers on improving and
expanding their instructional strategies; q41n = efforts among teachers to share practices with each other.
Appendix D
Univariate Summary Statistics
Variable Latent Variable M SD Skewness Kurtosis
q6a PQUAL 4.52 0.69 −1.23 3.56
q6k PQUAL 4.39 0.76 −0.95 2.88
q6l PQUAL 4.51 0.65 −1.23 4.39
q6m PQUAL 4.28 0.88 −0.98 2.98
q13c IQUAL 4.24 0.99 −1.30 4.20
q13e IQUAL 4.34 0.87 −1.33 4.55
q13f IQUAL 4.44 0.92 −1.74 5.43
q14b OILL 3.84 0.97 −0.59 2.76
q14h OILL 4.13 0.84 −0.82 3.49
q14o OILL 3.96 0.99 −0.76 3.41
q14s OILL 3.69 0.92 −0.33 2.69
q14r OILL 3.93 0.93 −0.76 3.41
q39g ILP 2.84 0.70 0.15 2.26
q39i ILP 2.79 0.81 0.20 1.94
q39n ILP 2.84 0.72 0.16 2.13
q39p ILP 2.61 0.72 0.39 2.53
q40a ESC 4.09 0.82 −0.74 3.18
q40b ESC 4.01 0.78 −0.46 2.81
q40c ESC 3.97 0.81 −0.55 3.27
q41a SIP 3.93 0.76 −0.35 2.84
q41b SIP 4.17 0.66 −0.04 3.24
q41c SIP 4.20 0.66 −0.48 3.32
q41n SIP 4.06 0.67 −0.42 3.40
q42a ISP 2.83 1.25 0.12 2.12
q42c ISP 2.41 1.07 0.39 2.61
q42d ISP 2.80 0.99 0.37 2.87
q43c DS 3.38 0.73 −1.18 4.39
q43d DS 3.36 0.73 −0.94 3.44
q43e DS 3.05 0.73 −0.34 2.68
q43f DS 3.29 0.70 −0.68 3.02
instldex PELI 1.11 1.10 0.73 2.69
frl SPL 49.88 29.99 1.95 1.91
pbelief PBP 4.78 0.51 −3.28 19.29
pexp EXPP 2.80 1.93 1.47 6.55
DS = district support; ESC = effective school climate; EXPP = principalship experience; ILP = instructional
leadership practices; IQUAL = internship quality; ISP = index of school problems; OILL = organizational and
instructional leadership learning; PBP = positive belief about principalship; PELI = prior experience in leading
instruction; PQUAL = program quality; SIP = school improvement progress; SPL = school poverty level.
Variables are on a 1–5 scale. Exceptions are instldex (1–4), which was drawn from a federal survey, frl, which
is a continuous variable measuring the percentage of students on free and reduced-price lunch (0%–100%),
and pexp, which is a continuous variable measuring principalship experience in years. For variable definitions,
see Appendix E.
Appendix E
Loadings of Survey Items (Indicator Variables) on Latent
Variables
Latent Variable Survey Items Loading R2
Participation in exemplary preparation program (PREP): Dichotomous variable
indicating participation in an exemplary or conventional preparation
program 1.00 1.00
Internship quality (IQUAL): The extent to which their educational leadership
internship experience(s) reflected the following attributes (based on a
5-point agreement scale)
q13c-principal had responsibilities for leading, facilitating and making
decisions typical of an educational leader. 0.80 0.64
q13e-principal was able to develop an educational leader's perspective on
school improvement. 0.88 0.78
q13f-internship experience was an excellent learning experience for becoming
a principal. 0.89 0.79
Program quality (PQUAL): The extent to which the following qualities were
true of their educational leadership program (based on a five-point
agreement scale)
q6a-program content emphasized instructional leadership. 0.63 0.39
q6k-program integrated theory and practice. 0.69 0.47
q6l-faculty members were very knowledgeable about their subject matter. 0.80 0.64
q6m-program gave a strong orientation to the principalship as a career. 0.86 0.75
Organizational and Instructional Leadership Learning (OILL): How effectively
their formal leadership program prepared them to do the following (based on
a five-point effectiveness scale)
q14b-create a coherent educational program across the school. 0.76 0.57
q14h-create a collaborative learning organization. 0.83 0.69
q14o-use data to monitor school progress. 0.73 0.54
q14r-engages in comprehensive planning for school improvement. 0.92 0.84
q14s-redesigns school organization to enhance productive teaching and
learning. 0.90 0.81
Instructional Leadership Practices (ILP): In the past month, approximately
how often the principal engaged in the following activities in their role
as principal of this school (based on a five-point frequency scale)
q39g-fosters teachers' professional development for instructional knowledge
and skills. 0.61 0.38
q39i-use data to monitor school progress, identify problems and propose
solutions. 0.63 0.40
q39n-work with teachers to change teaching methods where students are not
succeeding. 0.74 0.55
q39p-works with faculty to develop goals for their practice and professional
learning. 0.63 0.39
Prior Experience in Leading Instruction (PELI): Whether the principal had
prior experience as department chair, team or grade-level leader,
instructional specialist, or coach (excluding assistant principal)
(dichotomy) 1.00 1.00
Index of School Problems (ISP): Extent to which the principal perceives each
of the following to be a problem in his or her school:
q42a-lack of parental involvement. 0.78 0.61
q42c-student absenteeism. 0.69 0.47
q42d-students coming to school unprepared to learn. 0.91 0.82
Experience as a Principal (EXPP): Number of years as a principal of any
school. 1.00 1.00
Positive Belief about Principalship (PBP): Extent to which the principal
agrees that the "principalship enables me to influence school change"
(based on a five-point agreement scale) 1.00 1.00
School Poverty Level (SPL): Percentage of students in school who are eligible
for free- or reduced-price lunch (federal poverty standard) 1.00 1.00
District Support (DS): How strongly the principal agrees or disagrees with
the following statements regarding his or her district (based on a
five-point agreement scale)
q43c-district supports school's efforts to improve. 0.86 0.75
q43d-district promotes the principal's professional development. 0.87 0.76
q43e-district encourages principals to take risks in order to make
change. 0.87 0.76
q43f-district helps the principal to promote and nurture a focus on teaching
and learning. 0.89 0.79
School Improvement Progress (SIP): Extent to which the principal agrees that
there has been an increase or decrease in the following in his or her
school since last year (based on a five-point agreement scale)
q41a-consensus among staff about school's goals. 0.80 0.64
q41b-collaboration among teachers in making curriculum and instructional
decisions. 0.82 0.67
q41c-focus by teachers on improving and expanding their instructional
strategies. 0.76 0.58
q41n-efforts among teachers to share practices with each other. 0.76 0.57
Effective School Climate (ESC): The extent to which the principals felt each
statement describes their school (currently) (based on a five-point
extensiveness scale)
q40a-teachers in this school feel responsible to help each other do their
best. 0.69 0.47
q40b-teachers in this school are continually learning and seeking new
ideas. 0.89 0.80
q40c-teachers use time together to discuss teaching and learning. 0.68 0.46
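As a consistency check on the table above: for a single indicator with standardized loading on its latent variable, the reported R² column is simply the squared loading,

```latex
% Standardized loading lambda; R^2 is the variance explained in the indicator.
\[
  R^2 = \lambda^2, \qquad \text{e.g., for q13c: } 0.80^2 = 0.64 .
\]
```

which holds (to rounding) for every row in the table.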
Appendix F (continued)
Variable 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
25.q42c .51 −.08 −.04 −.07 −.08 −.22 −.06 .16 .06 −.06
26.q42d −.06 −.10 .00 −.14 −.10 −.03 .25 .02 .01
27.q43c .62 .59 .62 −.02 .06 −.03 −.08 −.08
28.q43d .67 .65 −.02 .15 .04 .04 .00
29.q43e .64 .01 .09 .10 .00 .03
30.q43f .02 .05 .04 .05 −.05
31.pexp .02 −.07 .01 −.20
32.pbel .07 .01 .16
33.frl .19 .30
34.ili .09
pexp = principalship experience; pbel = positive belief about principalship; frl = % of free and reduced-price lunch; ili = index of leading instruction;
pr = type of preparation (1 = exemplary preparation program).
Appendix G
Parameter Estimates (Standardized Coefficients) for
Structural Equations in Three Models
Model 1
Structural Equation Estimates
Parameters IQUAL PQUAL OILL ILP SIP ESC
PREP .13* .24*
IQUAL .49*
PQUAL .70*
OILL .39*
ILP .47* .49*
ISP −.41* −.29*
PELI .05 .05
PBP .00 .00
EXPP −.06 .19
SPL .00 .00
DS −.10 −.07
R2 .03 .12 .77 .16 .40 .36
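Standardized path estimates multiply along mediating chains. If, as the column layout suggests, the Model 1 paths are PREP → PQUAL = .24 and PQUAL → OILL = .70 (an assumption about the table's alignment, which was flattened in extraction), the implied indirect effect of exemplary-program participation on leadership learning via program quality is

```latex
% Indirect effect = product of standardized coefficients along the path.
\[
  .24 \times .70 \approx .17 .
\]
```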
Model 2.
Structural Equation Estimates
Model 3.
Structural Equation Estimates
DS = district support; ESC = effective school climate; EXPP = principalship experience; ILP =
instructional leadership practices; IQUAL = internship quality; ISP = index of school prob-
lems; OILL = organizational and instructional leadership learning; PBP = positive belief about
principalship; PELI = prior experience in leading instruction; PQUAL = program quality; PREP
= participation in an exemplary preparation program; SIP = school improvement progress; SPL
= school poverty level.
*p < .05.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the
authorship and/or publication of this article.
Funding
The authors received no financial support for the research and/or authorship of this
article.
Notes
1. According to Hallinger and Heck (1998), a mediated effects framework hypoth-
esizes that leaders affect school outcomes through indirect paths, which include
“other people, events, and organizational factors” (p. 167). Antecedent factors
are other variables, such as a school’s socioeconomic status, that may have an
effect on school outcomes but not the leadership–school outcome relationship. A
reciprocal effects model proposes that “the relationships between the administra-
tor and features of the school and its environment are interactive” (p. 167).
2. Robinson, Lloyd, and Rowe (2008) apply James MacGregor Burns's (1978) defi-
nition of transformational leadership: the ability to “engage with staff in ways
that inspire them to new levels of energy, commitment and moral purpose” and,
through such leadership and use of a common vision, to transform “the organiza-
tion by developing its capacity to work collaboratively to overcome challenges
and reach ambitious goals” (p. 639).
References
Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and
Human Decision Processes, 50, 179-211.
Bagozzi, R. P., & Yi, Y. (1988). On the evaluation of structural equation models.
Journal of the Academy of Marketing Science, 16, 74-94.
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in
social psychological research: Conceptual, strategic, and statistical considerations.
Journal of Personality and Social Psychology, 51, 1173-1182.
Bollen, K. (1989). Structural equations with latent variables. New York, NY: John
Wiley.
Bottoms, G., & O’Neill, K. (2001). Preparing a new breed of school principals: It’s
time for action. Atlanta, GA: Southern Regional Education Board.
Brand, S., Felner, R., Minsuk, S., Seitsinger, A., & Dumas, T. (2003). Middle school
improvement and reform: Development and validation of a school-level assess-
ment of climate, cultural pluralism, and school safety. Journal of Educational
Psychology, 95, 570-588.
Brown, K. M., Anfara, V. A., Jr., & Roney, K. (2004). Student achievement in high
performing, suburban middle schools and low performing, urban middle schools:
Plausible explanations for the differences. Education and Urban Society, 36,
428-456.
Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In
K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 136-162).
Newbury Park, CA: Sage.
Burns, J. M. (1978). Leadership. New York, NY: Harper & Row.
Carmines, E. G., & McIver, J. P. (1981). Analyzing models with unobserved
variables: Analysis of covariance structures. In G. W. Bohrnstedt & E. F. Borgatta
(Eds.), Social measurement: Current issues (pp. 65-115). Beverly Hills, CA:
Sage.
Center for Research on the Context of Teaching. (2002). Bay Area School Reform
Collaborative, Phase 1. San Francisco, CA: Bay Area School Reform
Collaborative.
Curran, P. J., Finch, J. F., & West, S. G. (1996). The robustness of test statistics to non-
normality and specification error in confirmatory factor analysis. Psychological
Methods, 1, 16-29.
Darling-Hammond, L., LaPointe, M., Meyerson, D., Orr, M. T., & Cohen, C. (2007).
Preparing leaders for a changing world. Palo Alto, CA: Stanford University, Edu-
cational Leadership Institute.
Darling-Hammond, L., Meyerson, D., La Pointe, M. M., & Orr, M. T. (2009). Prepar-
ing principals for a changing world. San Francisco, CA: Jossey-Bass.
Davis, S., Darling-Hammond, L., Meyerson, D., & LaPointe, M. (2005). Review of
research. School leadership study. Developing successful principals. Palo Alto,
CA: Stanford University, Educational Leadership Institute.
Diamantopoulos, A., & Siguaw, J. (2000). Introducing LISREL. Thousand Oaks, CA:
Sage.
Educational Research Service. (2000). The principal, keystone of a high-achieving
school: Attracting and keeping the leaders we need. Arlington, VA: Author.
Farkas, S., Johnson, J., Duffett, A., & Foleno, T. (2001). Trying to stay ahead of the
game: Superintendents and principals talk about school leadership. New York,
NY: Public Agenda.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unob-
servable variables and measurement error. Journal of Marketing Research, 18, 39-50.
Goldring, E. B., & Pasternack, R. (1994). Principals’ coordinating strategies
and school effectiveness. School Effectiveness and School Improvement, 5,
239-253.
Hale, E. L., & Moorman, H. N. (2003). Preparing school principals: A national per-
spective on policy and program innovations. Washington, DC: Institute for Edu-
cational Leadership, Illinois Education Research Council.
Hallinger, P. (2005, April). Instructional leadership: How has the model evolved and
what have we learned? Paper presented at the American Educational Research
Association, Montreal, Canada.
Hallinger, P., & Heck, R. (1996). Reassessing the principal’s role in school effec-
tiveness: A review of empirical research, 1980–1995. Educational Administration
Quarterly, 32, 5-44.
Hallinger, P., & Heck, R. (1998). Exploring the principal’s contributions to school effec-
tiveness. School Effectiveness and School Improvement, 9(2), 157-191.
Hoyle, R., & Panter, A. (1995). Writing about structural equation models. In R. Hoyle
(Ed.), Structural equation modeling: Concepts, issues and applications (pp. 158-176).
Thousand Oaks, CA: Sage.
Institute for Educational Leadership. (2000). Leadership for student learning: Reinvent-
ing the principalship. Report of the Taskforce on the Principalship. Washington,
DC: Author.
Jackson, B. L., & Kelley, C. (2002). Exceptional and innovative programs in educa-
tional leadership. Educational Administration Quarterly, 38, 192-212.
Jöreskog, K., & Sörbom, D. (2001). LISREL 8: User’s reference guide. Lincolnwood,
IL: Scientific Software International.
Kirkpatrick, D. L. (1998). Evaluating training programs: The four levels (2nd ed.).
San Francisco, CA: Berrett-Koehler.
Kottkamp, R. B., & Rusch, E. A. (2009). The landscape of scholarship on the educa-
tion of school leaders, 1985–2006. In M. D. Young, G. M. Crow, J. Murphy, &
R. T. Ogawa (Eds.), Handbook of research on the education of school leaders
(pp. 23-85). New York, NY: Routledge.
Lad, K., Browne-Ferrigno, T., & Shoho, A. (2005, November). Leadership prepara-
tion admission criteria: Examining the spectrum from open enrollment to elite
selection. Paper presented at the annual convention of the University Council of
Educational Administration, Nashville, TN.
Leithwood, K., & Jantzi, D. (1999). The relative effects of principal and teacher
sources of leadership on student engagement with school. Educational Adminis-
tration Quarterly, 35(Suppl.), 679-706.
Leithwood, K., & Jantzi, D. (2000). Principal and teacher leadership effects: A repli-
cation. School Leadership & Management, 20(4), 415-434.
Leithwood, K., & Jantzi, D. (2005, April). A review of transformational school lead-
ership research. Paper presented at the meeting of the American Educational
Research Association, Montreal, Canada.
Leithwood, K., & Jantzi, D. (2008). Linking leadership to student learning: The con-
tributions of leader efficacy. Educational Administration Quarterly, 44, 496-528.
Leithwood, K., Jantzi, D., Coffin, G., & Wilson, P. (1996). Preparing school leaders:
What works? Journal of School Leadership, 6, 316-342.
Leithwood, K., Louis, K. S., Anderson, S., & Wahlstrom, K. (2004). How leadership
influences student learning. Toronto, Canada: Center for Applied Research and
Educational Improvement and Ontario Institute for Studies in Education.
Leithwood, K., & Riehl, C. (2005). What we know about successful school leader-
ship. In W. Firestone & C. Riehl (Eds.), A new agenda: Directions for research
on educational leadership (pp. 22-47). New York, NY: Teachers College Press.
Levine, A. (2005). Educating school leaders. Washington, DC: The Education Schools
Project.
MacCallum, R. (1995). Model specification: Procedures, strategies, and related issues.
In R. Hoyle (Ed.), Structural equation modeling: Concepts, issues and applica-
tions (pp. 16-36). Thousand Oaks, CA: Sage.
Bios
Margaret Terry Orr (PhD, Columbia) is on the faculty of Bank Street College of
Education where she directs its multi-district partnership program, the Future School
Leaders Academy. She co-chairs a national taskforce on evaluating leadership prepa-
ration programs and has published widely, including co-authoring Preparing Leaders
for a Changing World (Jossey-Bass).