SDP DIAGNOSTIC

INTRODUCTION AND BACKGROUND

A core strategy of the Strategic Data Project is to execute “diagnostic analyses” using our partner agencies’ data. We conduct these diagnostic analyses in two core areas: human capital (primarily focusing on teacher effectiveness) and postsecondary success and attainment for students. We focus on these two areas precisely because of the leverage they offer. Research consistently shows that among all factors controlled by school systems, a teacher’s effectiveness matters the most for student learning. Similarly, a student’s successful graduation from high school (and beyond) can dramatically shape the opportunities for his or her future.

WHAT IS “THE DIAGNOSTIC”?

As their names suggest, our analyses are meant to be “diagnostic” in nature. A primary goal is to reveal opportunities for management or policy changes that may improve outcomes. For instance, an education leader might ask: How successfully are we recruiting effective teachers? How long do these teachers stay, and where do they teach in our system? Similarly, within schools, how many of our students stay on track for high school graduation? Of those who stay on track and those who do not, how many transition to postsecondary education?

Our analyses were developed after a close examination of the data that is available in many districts and states, working closely with our partners to understand their core questions. Our diagnostics do not try to explain the root cause of our findings, just as a blood test does not reveal the underlying cause of high cholesterol. For example, we may find that high-poverty schools have the greatest proportion of novice teachers, but we do not try to provide an exact reasoning for this finding. Rather, our aim is to broach conversation about our findings in a manner similar to how a patient and a doctor might converse: carefully understanding the results of various diagnostic tests that may in turn help each to understand any underlying health problems and how they might be alleviated by changing certain choices in the patient’s life (more exercise, fewer french fries, etc.).

THE DIAGNOSTICS

WHAT THEY ARE
1. Standardized analyses designed to help agencies better understand their current performance, uncover issues, and strategically plan responses
2. Illustrations of how existing data can be used to improve decision making
3. Starting point for district-led “deep-dive” explorations on specific strategic issues
4. A way to explore how agencies perform relative to others in the covered areas

WHAT THEY ARE NOT
1. Root-cause analyses for specific issues uncovered
2. A set of specific recommendations of actions agencies should take to improve performance
3. Comprehensive collection of all that can be done with existing data
4. Ranking of agencies or departmental performance

The diagnostic is a set of analyses that frames actionable questions for education leaders. To this end, we designed both the human capital and college-going diagnostics to offer insight into an agency’s current performance in these areas and to demonstrate how that performance varies (for different schools, for different groups of students, and over time).
HUMAN CAPITAL DIAGNOSTIC PATHWAY: FIVE STAGES IN A TEACHER’S CAREER

RECRUIT
• Does teacher effectiveness vary by certification pathway (traditional vs. alternative)?
• Where do agencies recruit their most effective teachers?

PLACE
• Where are highly effective teachers placed (which schools and with which students)?
• Does teacher effectiveness vary across type of school (grades, poverty level, geography)?

DEVELOP
• Are teachers with advanced degrees more effective?
• Do teachers become more effective as they gain experience?

EVALUATE
• How predictive is novice teacher effectiveness of future teacher effectiveness?

RETAIN/TURNOVER
• Are more effective teachers retained at higher rates than less effective teachers?
• Are there patterns in the movement to different jobs (into/out of teaching jobs, to/from higher-needs schools, etc.)?

Human Capital

We organize each diagnostic around a framework or “pathway.” The human capital diagnostic includes 20–30 analyses, each falling under one of five stages in a teacher’s career.1 The chart above identifies each stage in the human capital pathway and lists an example or two of the kind of questions our analyses are designed to answer.

We measure a teacher’s effect on student achievement by using a statistical model that is often referred to as “value-added.”2 When we employ the value-added model, we measure a teacher’s effect on student achievement growth by isolating the part of that growth attributable to a particular teacher. We obtain this isolated “value-added” measure by controlling for other factors in the student’s background (including prior performance, demographics, income status, and peers in the classroom) to account for other differences in performance unrelated to the teacher. For all of the diagnostic analyses, we examine average teacher effectiveness rates for groups of teachers; in no case do we report teacher effectiveness measures for individual teachers.

1 The analyses we are able to complete for each agency are dependent on data availability and quality.
2 For a technical explanation of the model we use, please see http://www.gse.harvard.edu/~pfpie/pdf/CMS_Technical_Appendix_Teacher_Effects.pdf.
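To illustrate the idea behind a value-added measure, the sketch below regresses current test scores on prior scores and a background control, then averages each teacher’s residuals. This is a deliberately simplified sketch with hypothetical data and a made-up poverty indicator; it is not the model SDP actually uses (the technical appendix cited above describes that specification, which includes additional controls such as classroom peers).

```python
import numpy as np

def value_added(current, prior, controls, teacher_ids):
    """Toy value-added sketch: regress current scores on prior scores
    plus background controls, then average each teacher's residuals.
    (Illustrative only; production models add classroom/peer effects,
    multiple prior years, and shrinkage toward the mean.)"""
    X = np.column_stack([np.ones(len(current)), prior, controls])
    beta, *_ = np.linalg.lstsq(X, current, rcond=None)
    residuals = current - X @ beta
    return {t: residuals[teacher_ids == t].mean()
            for t in np.unique(teacher_ids)}

# Hypothetical data: six students, two teachers. Teacher "B"'s students
# gain more than prior scores and poverty status alone would predict.
prior    = np.array([0.5, -0.2, 0.1, 0.4, -0.5, 0.0])
poverty  = np.array([0.0, 1.0, 0.0, 0.0, 1.0, 1.0])
current  = np.array([0.4, -0.4, 0.0, 0.9, 0.1, 0.5])
teachers = np.array(["A", "A", "A", "B", "B", "B"])

effects = value_added(current, prior, poverty, teachers)
print(effects)  # teacher "B" has the higher (positive) average residual
```

Averaging residuals by teacher is the simplest way to see what “isolating the part of that growth attributable to a particular teacher” means; full models estimate teacher effects jointly and shrink noisy estimates for teachers with few students.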
College-Going

For the college-going diagnostic, we focus on student trajectories from high school into college. In this diagnostic we repeat several analyses using various student samples to track postsecondary outcomes for each stage in the pathway. For example, for most of the stages below, we examine high school completion and postsecondary attainment rates:
• by high school,
• by student ethnic group,
• by student socioeconomic status, and
• by prior student achievement.

For example, one might ask: What are the college enrollment rates for students who score in the lowest quartile on their eighth-grade math and English tests? What if we explore these rates by high school? (To date, we see a wide range here, indicating that some high schools significantly outperform other schools in ensuring lower performing students successfully matriculate to college.)

Below we list the four “stages” in the college-going pathway, along with several factors and questions our analyses seek to answer for each stage.

COLLEGE-GOING DIAGNOSTIC PATHWAY

9TH TO 10TH
• How do transition rates vary for different schools and different groups of students?
• When do students fall “off track” in terms of credit accumulation and when do they get back on track?

HIGH SCHOOL GRADUATION
• What are high school graduation rates? By high school? Controlling for prior student achievement?
• For how much of high school are students “on track” vs. “off track”?

COLLEGE ENROLLMENT
• What are two- and four-year college enrollment rates?
• How do enrollment rates vary by seamless enrollers (students who go to college the fall after graduating from high school) vs. delayed enrollers?

COLLEGE PERSISTENCE
• How do persistence rates vary for specific groups of students and types of colleges?
• By seamless enrollers vs. delayed enrollers?
• By two-year vs. four-year school?
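The lowest-quartile question above is straightforward to compute once graduates are bucketed by eighth-grade score. The sketch below, using hypothetical records and only the Python standard library, assigns quartiles and tabulates enrollment rates per (quartile, high school) cell; in the actual diagnostic the enrollment flag comes from National Student Clearinghouse matches.

```python
from collections import defaultdict

def enrollment_by_quartile(records):
    """records: (eighth_grade_score, high_school, enrolled) tuples.
    Assigns each student a score quartile (1 = bottom, 4 = top), then
    returns college enrollment rates keyed by (quartile, high_school)."""
    ranked = sorted(records, key=lambda r: r[0])
    n = len(ranked)
    counts = defaultdict(lambda: [0, 0])  # (quartile, school) -> [enrolled, total]
    for i, (score, school, enrolled) in enumerate(ranked):
        quartile = min(4 * i // n + 1, 4)  # rank-based quartile assignment
        counts[(quartile, school)][0] += int(enrolled)
        counts[(quartile, school)][1] += 1
    return {key: e / total for key, (e, total) in counts.items()}

# Hypothetical graduates: (8th-grade composite score, high school, enrolled)
records = [
    (310, "East", False), (320, "East", True),  (330, "West", False),
    (340, "West", False), (350, "East", True),  (360, "East", True),
    (370, "West", False), (380, "West", True),  (390, "East", True),
    (400, "East", True),  (410, "West", True),  (420, "West", False),
    (430, "East", True),  (440, "East", True),  (450, "West", True),
    (460, "West", True),
]
rates = enrollment_by_quartile(records)
print(rates[(1, "East")], rates[(1, "West")])  # bottom-quartile rates per school
```

Comparing cells within a quartile across schools is exactly the “explore these rates by high school” step: similar incoming students, different enrollment outcomes.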
HOW ARE THE DIAGNOSTIC ANALYSES COMPLETED?

CEPR commits a three-person research team for four to five months to complete both diagnostics (human capital and college-going). With each agency, we work with leadership to understand the most pressing issues and to ensure the diagnostic analyses address these whenever possible. Because we strive to develop a body of work with a standard set of analyses that can successfully be used across an array of agencies, we must strike a balance in the level of customization we can provide each agency. We find, however, that customization can be enhanced with a greater commitment from the agency, such as allocating staff research time to become part of the diagnostic work team. In partner agencies where we have also placed Data Fellows (another core element of the Strategic Data Project), we often integrate these Fellows onto the diagnostic research team in order to equip them to conduct deeper and agency-specific analyses stemming from the diagnostic work.

Along the way, we report interim findings to internal audiences in our partner agency to ensure we are accurately interpreting and using data. We find that these conversations are invaluable, both in grounding our team in the history, politics, and priorities of the agency and in helping agency leaders understand what the analyses are conveying and how they may be used for meaningful decision making. We also work with agency leaders to plan an appropriate public release of the findings and then publish them on our web page.

WHY DO WE EXECUTE THESE ANALYSES?

The diagnostics are a core element of the Strategic Data Project for four reasons. We complete them in order to:
1. Convey actionable and salient information about teacher effectiveness and postsecondary transitions to our partners.
2. Demonstrate to agencies what can be done with the growing trove of data they are collecting (primarily for compliance purposes, which nonetheless can be used to deeply understand performance).
3. Develop an industry-standard set of metrics that become “need-to-know” information about any education agency, not unlike widely recognized measures such as the price-to-earnings ratio for a publicly traded company.
4. Build a body of work that relays the findings of similar analyses across a growing number of agencies, allowing for a means of comparing and identifying best practices.

We also believe that the process of executing and delivering the diagnostics builds the analytic capacity (and appetite) of an organization. We often find that the analyses can help confirm what leaders may know intuitively. One agency recently learned that its high school that uses block scheduling is remarkably successful at getting students who fall behind on course credits back on track for on-time high school graduation. When a team sees that teachers with advanced degrees are not, on average, more effective than those without such degrees, they want to know more: Does it matter if a high school math teacher has a master’s degree in math? Does it matter from which institution the teacher earns the advanced degree? These questions require deeper digging and may often lead districts to realize that such essential data elements are not even collected.

HOW DID WE SELECT THE ANALYSES THAT COMPRISE THE DIAGNOSTIC?

In our pilot study, our CEPR-based research team worked closely with Charlotte-Mecklenburg Schools Superintendent Peter C. Gorman and members of his leadership team to: 1) sift through the data that was available to create the analyses that were possible, and 2) sort through the myriad of analyses to hone in on those that are most salient and most actionable. We also attempted to identify analyses that would be replicable given data quality and availability in other education agencies. Our goal is to present new analyses, though certainly some of our work (particularly in the college-going diagnostic) is fairly well understood given No Child Left Behind high school graduation accountability requirements and the excellent work of the National Student Clearinghouse. We attempt to take this information and dive deeper: not just student graduation rates by high school, or by race and ethnicity, but also controlling for prior student achievement or family income.
SAMPLE ANALYSES AND THE KINDS OF MANAGEMENT AND POLICY QUESTIONS THEY PROMPT

Our stated goal is that the diagnostic analysis findings help form the basis of management and policy changes intended to improve student results. But what does this really look like? Two sample analyses follow, one each from the human capital diagnostic and the college-going diagnostic, along with key questions district leaders may want to explore based on the analytic results.

Human Capital: Teacher Placement

The graphs below show how teachers are placed with students across the district (left graph) and within schools (right graph). The bars indicate the prior-year average achievement of students taught by novice teachers and by teachers with one, two, and three years of experience, compared to the prior average achievement of students whose teachers have four or more years of experience. Both across the district and within schools, we see that novice teachers are disproportionately placed with lower performing students. This result occurs not just across schools, but within schools, and this pattern is consistent across several districts for which we have conducted this analysis. Many district leaders may know this to be true, at least anecdotally, across the district. But we find they are quite surprised to learn how systemic this can be within schools.

[Figure: Average Prior Elementary Student Performance for Teachers with 3 or Fewer Years of Experience Relative to Teachers with 4 or More Years of Experience. Two panels (Districtwide and Within Specific Schools) chart prior-year math scores for novice teachers and teachers with one, two, and three years of experience; starred values are statistically significant.]

This finding is widely found in other research as well. Other parts of our analysis indicate that novice teachers are, on average, less effective than teachers with several years of experience. Thus, in this case, as in many, this analysis shows that the agency is systematically placing lower performing teachers with the students who need to learn the most. One could imagine several key questions agency leaders (and principals) would want to understand after learning such findings:
1. What are current methods for placing teachers?
2. How are student rosters created?
3. What student and teacher data is used to determine teacher placement?
4. What strategies are used to place teachers in ways that are intended to increase student achievement? How could this be tracked?
5. Are principals held accountable for teacher placement in any way?
6. What data is collected (and how is it analyzed) about teacher placement?

The goal of the analysis is to help these decision makers approach their decisions with more knowledge. It is probably true that any decision would require more analysis and evidence-gathering, and significant changes would clearly require groups within the agency to discuss next steps and options to consider.
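The districtwide versus within-school contrast in this analysis comes down to one preprocessing step: demeaning prior scores by school before comparing teacher-experience groups. Below is a minimal sketch with hypothetical data and names; the diagnostic itself breaks the gap out by exact experience level, as in the figure, and tests statistical significance.

```python
import numpy as np

def placement_gap(prior, experience, school, within_school=False):
    """Mean prior-year score of students whose teachers have 3 or fewer
    years of experience, minus the mean for students whose teachers have
    4 or more. Negative values mean newer teachers are placed with lower
    performing students. With within_school=True, prior scores are first
    demeaned by school, so the gap reflects placement inside schools
    rather than differences across schools."""
    prior = np.array(prior, dtype=float)   # copy so demeaning is safe
    experience = np.asarray(experience)
    school = np.asarray(school)
    if within_school:
        for s in np.unique(school):
            mask = school == s
            prior[mask] -= prior[mask].mean()
    newer = experience <= 3
    return prior[newer].mean() - prior[~newer].mean()

# Hypothetical data: two schools where novices get the lower scorers.
prior  = [0.3, -0.4, 0.5, -0.1, 0.8, 0.1, 1.0, 0.4]
years  = np.array([1, 1, 5, 6, 2, 1, 7, 9])
school = np.array(["X", "X", "X", "X", "Y", "Y", "Y", "Y"])

print(placement_gap(prior, years, school))                      # districtwide gap
print(placement_gap(prior, years, school, within_school=True))  # within-school gap
```

A districtwide gap can arise purely from low scorers and novices clustering in the same schools; a negative within-school gap is the more surprising finding the text describes.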
College-Going: College Enrollment Rates by Prior Student Achievement

Many education leaders know roughly what percent of their high school graduates enroll in college. Not many, however, know this rate by high school or with any degree of granularity about different groups of students. By matching student records with National Student Clearinghouse data, we are able to present K-12 leaders with fine-grained analysis of the college enrollment and college persistence of their high school graduates. We created the analysis below to reveal variability that may yet be undetected.

For this analysis, we first divide all students in the sample into quartiles according to their eighth-grade composite math and English test scores. Then, we report the college enrollment rates for each high school for the students in each quartile. Each blue bar of the graph represents one high school. Notice that there is nearly a 30-point gap between the top high school and the bottom high school in the college enrollment rates for high school graduates who perform in the top two quartiles based on eighth-grade test scores. The gaps are even larger for the bottom two quartiles. Amazingly, more bottom quartile students from one high school enroll in college (70 percent) than do top quartile students from another high school (64 percent). The analysis shows rather clearly that prior student achievement does not always predict whether a student goes to college. It at least appears that particular high schools have significant influence on this success.

[Figure: Distribution of College Enrollment Rates by Prior Student Achievement. Percent of high school graduates enrolling in college, plotted by eighth-grade composite score quartile (bottom, 2nd, 3rd, top); blue bars show individual high schools and a marker shows the average for each quartile.]

This introduces a number of questions:
1. What are the top producing high schools doing to prepare students for college? What are they doing to encourage students to enroll in college?
2. Do high schools have the same success in sending students from all quartiles to college, or are some high schools more successful with certain groups of students?
3. Are there high school course enrollment patterns that are evident for students who enroll in college at high rates?
4. Are principals or guidance counselors aware of this variation?

This particular analysis might also be highly meaningful for parents and students.

CONCLUSION

The diagnostic analyses are a starting point, not an ending point. They are intended to help agencies better understand current performance and uncover issues, and they create methods for analyzing data. We hope this also allows leaders to strategically plan responses. The Strategic Data Project will publish summary findings for each partner’s diagnostic results, working carefully with each partner on the public release of the findings. The Charlotte-Mecklenburg Schools’ human capital findings and Fulton County Schools’ findings for both diagnostics are currently available on the Center for Education Policy Research at Harvard University’s website. SDP will also release a compilation of diagnostic findings across SDP partners in summer 2011 (please refer to our website at www.gse.harvard.edu/cepr).
©2011 by the Center for Education Policy Research at Harvard University. All rights reserved.