IBO Report on Grade-to-Grade Achievement
Published by GothamSchools.org on Jul 12, 2012
Fiscal Brief 
New York City Independent Budget Office 
 July 2012
Meeting State Education Standards?
Tracking Grade-to-Grade Achievement for a Group of 46,400 Students
New York City
Independent Budget Office
Ronnie Lowenstein, Director
110 William St., 14th floor
New York, NY 10038
Tel. (212) 442-0632
Fax (212) 442-0350
Schools Brief 
Parents, educators, and taxpayers all want to know if the city's schools are increasing the share of students meeting educational standards. The city's Department of Education typically assesses progress in raising proficiency rates by comparing scores on the state English Language Arts and math tests of this year's third through eighth graders with last year's.

While that approach will tell you how this year's third graders are doing compared with last year's third graders, it will not tell you how individual students are performing from one year to the next: whether they are gaining, losing, or maintaining grade-level proficiency in English and math. In this report, IBO uses data on English Language Arts results provided by the education department to track the annual performance of more than 46,400 students from school year 2005-2006 through 2009-2010 as they moved from third grade through seventh grade. (Because of changes in the state scoring in 2009-2010, some of our comparisons were limited to sixth grade rather than seventh grade.) Among our findings:
• Nearly 62 percent of the students in the cohort studied ended up at the same proficiency level in sixth grade as where they started in third grade; just over 30 percent ended up at a higher level and about 8 percent at a lower level.
• Of all the students whose proficiency levels improved between third grade and sixth grade, 80 percent began at Level 1 (not meeting standards) or Level 2 (partially meeting standards), compared with 20 percent who began at Level 3 (meeting standards).
• The largest group in the cohort is the nearly 27,700 students who started out meeting standards (Level 3) in third grade. These students tended to maintain their proficient ranking, with over 82 percent ending sixth grade at Level 3.
• Of the small number of students who started out at Level 4 (exceeding standards) in grade 3, just 36 percent retained their Level 4 standing at the end of sixth grade, with nearly all of the others slipping to Level 3.
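The grade-3-to-grade-6 comparisons above boil down to tabulating, for each student, whether the proficiency level rose, fell, or stayed the same between the two grades. A minimal sketch of that tabulation follows; the student records here are invented for illustration and are not IBO's data:

```python
from collections import Counter

# Hypothetical (grade 3 level, grade 6 level) pairs, one per student.
# State proficiency levels run from 1 (not meeting standards) to 4 (exceeding standards).
pairs = [(3, 3), (2, 3), (3, 2), (1, 2), (3, 3), (4, 3)]

# Classify each student's movement between the two grades.
movement = Counter(
    "higher" if g6 > g3 else "lower" if g6 < g3 else "same"
    for g3, g6 in pairs
)

# Convert counts to the kind of percentage shares reported above.
shares = {k: round(100 * v / len(pairs), 1) for k, v in movement.items()}
```

Applied to the real cohort, the same tabulation yields the roughly 62/30/8 percent split reported in the first finding.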
IBO also sought to identify any shifts in the achievement gap between students of different racial and ethnic backgrounds in the cohort we examined. We found little evidence of a narrowing of the achievement gap between 2005-2006 and 2009-2010, except for a 3 percent narrowing of the gap between Hispanic and white students. Gaps between Asian students and each of the other ethnic groups widened.
Background: The Value of Longitudinal Analysis
Parents, educators, and taxpayers alike want to know if public school students are making progress in developing the underlying skills they will need as they advance in school and enter the workplace. The Department of Education (DOE) traditionally attempts to answer this question by presenting one-year snapshots of student achievement, meant to answer the question: How are this year's third graders through eighth graders doing compared with last year's? In this report, we take a different approach by looking at the same children over time and asking the question: How did individual students perform this year compared with last year and the year before? Increasingly, schools and educators are being held responsible for the year-to-year change in the achievement scores of individual students, yet there has been no public report of the school system's overall performance on this account.

For this report, IBO examined the test records of 46,419 New York City students, tracking their annual performance over five years (school year 2005-2006 through 2009-2010) as they progressed from third grade to seventh grade. The data follow them from third grade, the first year they are required to take statewide English Language Arts (ELA) and math tests, until seventh grade. IBO derived information for this longitudinal study from student-level files maintained by the Department of Education and provided to IBO.
The number of elementary and middle school students tested normally varies from year to year and grade to grade. Each year students enter or leave the test population with the arrival of new test-eligible youngsters and the departure of others. Note that in the grades studied in this paper (third through seventh), students who leave are generally not "drop-outs" but instead are transferring to a private or parochial school or moving to another school district. Absence from school is another factor that removes students from the testing pool. There are also shifts in the numbers due to exclusions based on language or special education status. For example, in 2007 the federal government required that more English-language learners be tested. As a result, there were many fewer exemptions or waivers granted to English-language learners, and this boosted the size of the group subject to testing citywide.

In some cases, the influence of changes in student population on overall performance measures is hard to predict. As the composition of the group varies over time, there will be a certain amount of variation in outcomes that has little to do with instruction but instead reflects the net impact of the upward or downward tendency each change in the make-up of the population being tested has on the exam results.
Studying the same students longitudinally has the benefit of removing a large source of variation in outcomes, namely, changes in the make-up of the group whose performance is being examined. But eliminating such unknown and unwanted "noise" from the analysis comes at a cost. The factors that enable a stable cohort of students to be identified and evaluated over five years will also render it less than fully representative of the entire population of students in those grades. Nonetheless, longitudinal analysis can enhance our understanding of the performance of a substantial portion of the students who took these tests in the five years.
The longitudinal design allowed results to be seen from perspectives not generally presented by the DOE in announcements about student outcomes and made it possible to investigate:

• Changes in grade-to-grade achievement levels by the same students
• Performance level patterns that appear over two and three years
• Assumptions that students will make continuous progress
• Data that shed light on the achievement gap among ethnic and racial groups.

This work used the reported performance levels and scale scores each student received on the annual statewide exams. Performance level thresholds and the transformation of raw scores on the tests to scale scores are determined by the New York State Education Department (SED). It should be noted that our approach could not be used by the DOE or SED in their annual report on student outcomes because of the need to exclude students who have not been tested in consecutive years. Thus, this work is intended as a complement to, rather than a substitute for, the official reports of test scores.
Our Analytic Cohort
Department of Education files were searched to identify students who had complete ELA test data from school year 2005-2006 through 2009-2010. Any student who was not tested in any individual year was dropped from our analysis. This yielded 49,333 students from an original group of 76,437 third graders who had been tested at least once in grade 3 in 2005-2006 or grade 4 in 2006-2007 and so on, up to grade 7 in 2009-2010. Upon further inspection, an additional 2,914 of these students were excluded because they had been tested out of their grade level at least once during the five-year period, confounding interpretation of the results. Thus, this report is based upon the performance patterns of a cohort of 46,419 students, 60.7 percent of the students tested at least once in these grades in these years. Each of the students in the cohort took the state tests for five consecutive years and advanced to the next grade level each year, beginning in grade 3 and ending in grade 7. The benefits inherent in this analysis come with a trade-off: The survivors, having gotten past these restrictions, will differ in character from those who did not and may yield findings that do not generalize to the citywide test population.

Given how the cohort was assembled, it is important to understand how students in the cohort differ from the population of students being tested and what these differences imply for IBO's findings. Table 1 compares characteristics of the cohort to all citywide test takers. The make-up of the cohort in 2005-2006 is described by gender and ethnicity as well as by special education status and English proficiency when students were third graders. Corresponding breakdowns are given for all students who took the ELA test in 2005-2006.

While the analytic cohort is similar to the citywide population in terms of gender and ethnicity, it differs in terms of the percentage of students with special needs and English-language learners. The 50:50 female to male ratio of the cohort is close to the 49:51 citywide ratio of students in grade 3. The ethnic mix of the cohort appears to match the percentages observed in the overall school year 2005-2006 test population, with each ethnic group in the cohort within 1 percentage point of its representation among all test takers. Black and Hispanic students make up 70.6 percent of the cohort; white and Asian students are the other 29.4 percent. We observe larger differences between our analytic cohort and the overall population in terms of the percent of students with special education needs and students classified as English-language learners. We lose some students with special needs from our cohort because they are tested off grade level if their Individualized Education Plan calls for that. Some English-language learners are exempt from testing upon their entry to the school system and therefore are not included in our cohort. Overall, the analytic cohort performed somewhat better than the entire group of students taking the ELA exams.
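The cohort-selection rule described above (a score in every year, always on the expected grade level) amounts to a simple filter over per-student test histories. The sketch below is illustrative only: the student records are invented, and the real study worked from DOE student-level files, not this toy structure.

```python
# (test year, grade the student should have been tested in that year)
REQUIRED = [(2006, 3), (2007, 4), (2008, 5), (2009, 6), (2010, 7)]

def in_cohort(tests):
    """tests maps test year -> grade actually tested; a student qualifies
    only with a score in every year, always on the expected grade level."""
    return all(tests.get(year) == grade for year, grade in REQUIRED)

# Invented records for illustration:
students = {
    "A": {2006: 3, 2007: 4, 2008: 5, 2009: 6, 2010: 7},  # complete, on level: kept
    "B": {2006: 3, 2007: 4, 2009: 6, 2010: 7},           # missing 2008: dropped
    "C": {2006: 3, 2007: 4, 2008: 4, 2009: 6, 2010: 7},  # off grade level in 2008: dropped
}
cohort = sorted(sid for sid, tests in students.items() if in_cohort(tests))
```

Student B illustrates the 27,104 students dropped for missing a test year, and student C the further 2,914 dropped for off-grade-level testing.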
Table 1: Composition of the Cohort Compared with All Students Tested Citywide, 2005-2006 Through 2008-2009

                  Cohort                 All Students            All Students
                  2005-2006 Grade 3      2005-2006 Grade 3       2008-2009 Grade 6
General Ed        42,579    92.2%                   84.9%        54,976    80.8%
Special Ed         3,609     7.8%         9,259     15.1%
Total             46,188   100.0%        61,504    100.0%        68,002   100.0%
ELL                  990     2.1%                    3.6%         7,631    11.2%
EP                45,198    97.9%        59,271     96.4%        60,371    88.8%
Total             46,188   100.0%        61,504    100.0%        68,002   100.0%

SOURCES: IBO analysis of Department of Education data, http://schools.nyc.gov/Accountability/data/TestResults/ELAandMathTestResults
NOTE: Totals differ because some student records were missing information on gender, ethnicity, special education status and language proficiency.
