In the fall of 2007, the New York City Department of Education (DOE) launched a quantitative school accountability system known as the School Progress Reports. These reports are released annually and are meant to be a tool that “enables students, parents, and the public to hold the DOE and its schools accountable for student outcomes and improvement.”

They grade each New York City public school along three dimensions: school environment, student performance, and student progress, and then combine them into a single School Progress Report grade. These grades carry practical implications for schools in terms of rewards and consequences. For example, schools that in any given year receive an A or B are eligible for increased funding in the form of bonuses for principals, while schools that receive a D, an F, or a third consecutive C face the possibility of adverse consequences, including the dismissal of the principal or even closure of the school. In addition to direct administrative consequences, schools are affected by their progress report scores indirectly, as many parents and guardians use them to inform their own decisions when choosing schools.

Given the important implications of progress report grades, it is essential that the DOE’s methodology for computing them be as successful as possible in fulfilling its goal, which is to “reflect each school’s contribution to student achievement, no matter where each child begins his or her journey to career and college readiness.”
To shed some light on the DOE’s success in identifying the contribution schools make to student learning, the Independent Budget Office has analyzed the data and methods used by the DOE to produce the reports for the school years 2006-2007, 2007-2008, 2008-2009, and 2009-2010. As this report was underway, the DOE released the 2010-2011 reports; we have incorporated those results into two of our three research questions below.

This report considers three key research questions on the reliability of these reports for measuring a school’s effectiveness in improving student outcomes:
Have the progress reports from 2006-2007 through 2010-2011 for all levels of schools completely controlled for variables that may systematically affect student outcomes but cannot be controlled by teachers or school administrators?
Have the progress reports for all levels of schooling captured differences between schools that persist in the long run, rather than differences that can disappear over the course of just a year or two?
Have the progress reports produced estimates that are reasonably robust to modest changes in methodology, that is, to how measurements of the same data are combined?
Summary of School Progress Reports Methodology
Goals of the DOE School Progress Reports.
School Progress Reports are meant both as a descriptive tool and as a guide for decisionmakers. In the publicly available Introduction to the Progress Report and Educator Guides to the School Progress Report, the DOE specifies the descriptive goal as to provide “an overall assessment of the school’s contribution to student learning,” one that should “produce outcomes that are minimally correlated with socioeconomic status, Special Education populations, or other demographic characteristics.”
The practical goals of the project are stated as follows: “The report is designed to help teachers and principals accelerate academic achievement for all city students. It enables students, parents, and the public to hold the DOE and its schools accountable for student outcomes and improvement.”
This dual purpose implies a degree of trade-off between descriptive accuracy and practical applicability. On the one hand, progress reports must “Measure student outcomes as accurately as possible given the different challenges that schools face;” on the other, their goal is to “Ensure that schools can verify and re-create metrics so schools understand how they are measured and how they can improve their performance.”
Accurately measuring the contribution of schools to student learning is a task of enormous complexity. Student achievement and progress are affected by a large set of variables; in addition, those variables are nested in a hierarchy of interacting levels (individual, class, and schoolwide). In the presence of such a complicated environment, estimating true school effects based on observational data alone requires very sophisticated (and complex) statistical models. At the same time, the School Progress Report is not an academic exercise; it is meant as a way to give teachers and administrators tools to monitor and improve the performance of their schools. Because of this, the methodology must have a degree of transparency that makes it possible for school managers to anticipate what type of policies could have a positive influence on their students’ education.
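The estimation problem described above can be illustrated with a toy simulation. This sketch is purely hypothetical — the two-school setup, the numbers, and the baseline-adjustment approach are illustrative assumptions, not the DOE’s actual model — but it shows why raw outcome levels conflate a school’s contribution with the incoming achievement of its students, while a growth-style measure comes closer to isolating the school effect:

```python
import random

random.seed(0)

# Hypothetical "true" contribution of each school to student learning.
TRUE_SCHOOL_EFFECT = {"A": 5.0, "B": 0.0}
# Incoming achievement differs systematically between the two schools,
# a variable outside the schools' control.
BASELINE_MEAN = {"A": 70.0, "B": 60.0}

def simulate(school, n=2000):
    """Simulate (baseline, outcome) pairs for one school's students."""
    rows = []
    for _ in range(n):
        baseline = random.gauss(BASELINE_MEAN[school], 5.0)
        outcome = baseline + TRUE_SCHOOL_EFFECT[school] + random.gauss(0.0, 3.0)
        rows.append((baseline, outcome))
    return rows

data = {s: simulate(s) for s in ("A", "B")}

# Raw outcome gap between schools: mixes school effect with baseline differences.
raw_gap = (sum(o for _, o in data["A"]) / len(data["A"])
           - sum(o for _, o in data["B"]) / len(data["B"]))

# Growth-style gap: mean (outcome - baseline) strips out where students started.
growth_gap = (sum(o - b for b, o in data["A"]) / len(data["A"])
              - sum(o - b for b, o in data["B"]) / len(data["B"]))

print(round(raw_gap, 1))     # near 15: true effect (5) plus baseline gap (10)
print(round(growth_gap, 1))  # near 5: close to the true school effect
```

In this stylized setting a single baseline adjustment suffices; with the real nesting of students in classes and schools, and many correlated background variables, the same logic requires the far more elaborate statistical models noted above.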