MICHAEL MULGREW, as President of the UNITED FEDERATION OF TEACHERS, Local 2, American Federation of Teachers, AFL-CIO, on behalf of all represented employees in the City School District of the City of New York,

Petitioner,

-against-

BOARD OF EDUCATION OF THE CITY SCHOOL DISTRICT OF THE CITY OF NEW YORK, and JOEL I. KLEIN, as Chancellor of the City School District of the City of New York,

Respondents,

For a Judgment Pursuant to Article 78 of the CPLR and for Declaratory Relief Pursuant to CPLR 3001

Index No. 113813/2010

AFFIDAVIT OF JACKIE BENNETT

STATE OF NEW YORK )
                  ) ss.:
COUNTY OF NEW YORK )
JACKIE BENNETT, being duly sworn, deposes and says:
1. I am a Special Representative for the United Federation of Teachers, Local 2,
American Federation of Teachers, AFL-CIO (the "UFT"), which is the recognized bargaining
agent for all nonsupervisory pedagogical personnel and classroom paraprofessionals employed
by the Board of Education of the City School District of the City of New York. My job
responsibilities include conducting research to assist in the development of UFT educational
policy, with a special focus on school and teacher accountability, including teacher evaluation.
In this role, I have been the point person for the UFT on the Teacher Data Initiative and have
advised UFT officers, staff and members regarding the information produced by the DOE on the
Teacher Data Reports ("TDRs").
2. I respectfully make this affidavit in further support of the UFT's Verified Petition
seeking an Order that Respondents not release publicly TDRs that are unredacted as to teachers'
names and in response to the Affidavit of Joanna Cannon, sworn to November 19, 2010
("Cannon Aff."), with respect to the issue of the data verification process utilized by the New
York City Department of Education ("DOE") as part of the Teacher Data Initiative. The
information set forth below was assembled under my direction and control.
Quality of Inputs
3. As explained in the Verified Petition, some of the information upon which the
TDRs are supposedly based is inaccurate. (See Verified Petition, at ¶ 45). At the time the
Verified Petition was filed, the UFT had the opportunity to review only a small sample of TDRs.
Since that time, the UFT has sought a sampling of teachers who believed that their TDRs were
based on inaccurate information. The UFT's review of these TDRs has revealed that a large
portion of the reports received are materially flawed as they have been calculated based on errors
in student lists. Significantly, as discussed infra, at ¶¶ 10-11, teachers did not have the
opportunity to review the student lists upon which their TDRs were based prior to the reports
being prepared or after the reports were issued. Indeed, teachers still have not had the
opportunity to review the student lists. Despite this failure in the verification process, categories
of errors can be identified based upon the subject listed, the numbers of students and the
identification of the comparison group. These errors include teachers receiving TDRs for
subjects they didn't teach; student results being attributed to the wrong teacher; the
misidentification of class type; and that many teachers have such large numbers of students
missing from their reports that serious errors must have occurred. Further undermining any confidence in the integrity of the TDRs is that some teachers with results for prescribed periods were not even teaching at the time.
4. As indicated above, teachers received value-added results for subjects they never
taught (e.g., a teacher who did not teach math received a math TDR). This error seems to have occurred most frequently in elementary schools where teachers typically teach the same class of students all day and, therefore, would usually teach both English language arts ("ELA") and math. However, some elementary schools "departmentalize," meaning that, for example, one 5th grade teacher is responsible for teaching all the 5th grade math and another 5th grade teacher is responsible for teaching all the 5th grade ELA. In some instances, a 5th grade teacher might not teach math or ELA but, rather, social studies or science (in which case, they should not have received any TDR). Because the DOE failed to provide the proper guidance regarding data verification (see, infra, at ¶ 12), and teachers were not involved in the process, principals apparently submitted the homeroom list for each teacher without indicating whether the classes were departmentalized. Accordingly, teachers who taught only math or only ELA received TDRs for both subjects, even though they did not teach both subjects. See Bronx Wrong Subject Example, a copy of which is annexed hereto as Ex. 1 (demonstrating that Teacher A and Teacher B each received a TDR for ELA and math, despite the fact that Teacher A taught only ELA and Teacher B taught only math); see also Ex. N to the Verified Petition, Departmentalization Examples 1 and 2. Likewise, some teachers who taught only social studies, at both the elementary and middle school levels, received both a math and an ELA TDR. See Staten Island Wrong Subject Example, a copy of which is annexed hereto as Ex. 2 (demonstrating that a teacher who taught only social studies was listed as co-teacher on ELA and math TDRs, and
therefore was held accountable for student achievement in subjects she never taught and shared the credit or blame with a teacher with whom she did not co-teach); Tribeca Wrong Subject Examples, copies of which are annexed hereto as Ex. 3 (demonstrating that a 6th grade social studies teacher received TDRs for ELA and math, as did a 6th grade science teacher).
5. The "wrong subject error" discussed above raised another concern with regard to
the validity of the value-added methodology utilized by the DOE. While a teacher who taught only ELA received both ELA and math reports, the companion teacher who taught math also received two reports. Thus, the results for each teacher were split between two reports rather than being averaged together into a single report (as they should have been). This error demonstrated how unstable the value-added results may be because the two separate reports sometimes gave the teacher wildly divergent ratings. For example, the ELA TDRs demonstrated that Teacher A (who should have been accountable for all ELA achievement) received a percentile ranking of 67% for her purported effectiveness with one group of students and a percentile ranking of 4% for her purported effectiveness with the other group. See Ex. 1. Likewise, the math TDRs demonstrated that Teacher B (who should have been accountable for all math achievement) received a percentile ranking of 69% for her purported effectiveness with one group of students and a percentile ranking of 8% for her purported effectiveness with the other group. Id.; see also Ex. N to the Verified Petition, Departmentalization Example 1 (reflecting a 97% percentile ranking in one ELA class and a 14% percentile ranking in another ELA class for the same teacher).
6. Another common error found by the UFT was that student scores had been
attributed to the wrong teacher. This error manifested itself in several different ways:
• Grouping/Regrouping: In some schools, teams of teachers group students according to need for certain subjects and may change these groupings based on student need during the course of the school year (meaning the students' homeroom teacher may not necessarily be their math and/or ELA teacher). The nuances of this effective practice, however, are not reflected in the TDRs for such schools. When it comes time to link students to teachers there is little option but for principals to connect each teacher with his or her homeroom class, regardless of the students actually taught, given the structure of the DOE's information management system. See Brooklyn Regrouping Example, a copy of which is annexed hereto as Ex. 4 (reflecting the teacher's homeroom students on the math TDR, despite the fact that students were re-grouped for math, so that the submitted list would have reflected some students she never taught and would exclude some students she did teach); Queens Regrouping Example, a copy of which is annexed hereto as Ex. 5 (reflecting teacher's homeroom students on the ELA TDR, despite the fact that every nine weeks the students were regrouped with a different ELA teacher).
• Small Groups: The DOE provided little guidance as to how to account for small groups of students who were removed from the class on a daily basis to be taught by a teacher other than the teacher of record. As a result, such situations were treated differently from school to school, creating erroneous reports across the City and meaningless comparisons among teachers. For example, in some schools these students were attributed to the teacher of record, even though that teacher was not responsible for teaching the students at issue.1 Other schools took the opposite approach, attributing the students to the small group teacher. See Ex. Q to the Verified Petition, AIS Example (discussing the varying treatment from school to school of students receiving small group instruction outside of the regular classroom). In yet another approach, a school listed the small group teacher as a co-teacher on the TDRs of the teachers of record, as if the two teachers taught all of the students together at the same time, which was not the case. See Manhattan Small Group Example, a copy of which is annexed hereto as Ex. 6 (evidencing that the small group teacher was listed as a co-teacher on the ELA and math TDR for each classroom teacher, despite the fact that the small group teacher taught only a small percentage of the students in each class and never co-taught with the classroom teachers); Brooklyn Small Group Example, a copy of which is annexed hereto as Ex. 7 (listing a 4th grade small group teacher as a co-teacher for a 5th grade class with a teacher who was actually on leave during the year in question).
• Wrong Students: In some instances, erroneous students (in some instances, entire classes) never taught by the teacher were included on the teacher's list. See Ex. M to the Verified Petition, Extra Students Example (reflecting 57 students in one class, 27 of whom the DOE has since verified were never taught by the teacher for whom the TDR was prepared). In other instances, teachers received TDRs for grade levels they never taught. See Brooklyn Wrong Students Example, a copy of which is annexed hereto as Ex. 8 (evidencing a 4th grade teacher, who never taught 5th grade, receiving both an ELA and a math TDR for 5th grade). Notably, this type of error is very difficult to detect because the TDRs rarely reflect the total number of students in the class, given that the model utilized by the DOE discards the scores of certain students for various reasons (e.g., it is the student's first year in the district). Accordingly, it is impossible to determine from the number of students indicated on the TDR whether the students on the teacher's list are the correct students. I have, however, been able to verify this error with the teachers themselves on several TDRs that came under specific scrutiny.

1 This input error was discovered as a result of the teacher of record receiving a TDR that presumably included all of the students who were taught in the small group, while the teacher of the small group received no TDR. Accordingly, there exists no "sample TDR" to demonstrate this error.
• Teachers on Leave: Several teachers on leave for a large portion (or even all) of the school year received TDRs for the students assigned to their classes, despite the fact that they had little or no responsibility for actually teaching these students. See Brooklyn Leave Example, a copy of which is annexed hereto as Ex. 9 (reflecting an ELA percentile ranking for a teacher who was on maternity leave from the last week in September until just one week before the ELA assessment was administered in January; the teacher taught the students for only a few weeks yet was held accountable for their achievement); Queens Leave Example, a copy of which is annexed hereto as Ex. 10 (reflecting two ELA percentile rankings for a middle school teacher who was on medical leave for all of October, part of December and most of January; the teacher taught the students for approximately two months yet was held accountable for their achievement); Brooklyn Leave Example II, a copy of which is annexed hereto as Ex. 11 (reflecting a "last four years" percentile ranking on a 2008-09 TDR for a teacher who was on medical leave from 2000-2008 and, therefore, should have a blank in the "last four years" field).
7. The most common error found by the UFT related to the reporting of value-added results for Collaborative Team Teaching ("CTT") classes. TDRs either falsely identified teachers as working in a CTT class or failed to identify teachers as working in such a class. CTTs are inclusive settings for both students with disabilities and general education students. Because these classes have two teachers, there is no way to separate the impact of one teacher from the other and, therefore, it is DOE policy to list both teachers on the same TDR. In some instances, however, only one teacher was listed on the TDR and, therefore, held wholly accountable for student achievement. See Staten Island CTT Example, a copy of which is annexed hereto as Ex. 12 (reflecting only the general education teacher in reports that incorrectly mixed CTT and general education classes into a single report and did not include the special education teacher, thereby holding the general education teacher solely accountable for student achievement);2 Queens CTT Example, a copy of which is annexed hereto as Ex. 13 (reflecting only one teacher on the ELA TDR for a CTT class); Brooklyn CTT Example, a copy of which is annexed hereto as Ex. 14 (reflecting a single teacher on a TDR that apparently combined the special education students from three different CTT classes at three different grade levels);3 see also Ex. 4, Single Teacher (reflecting only a single teacher who taught in a CTT class); Ex. O to the Verified Petition, Missing Years and CTT Class Example (failing to identify the class on which the value-added result was based as a CTT class); Ex. P to the Verified Petition, CTT Example (combining student results from a 6th grade CTT class with student results from non-CTT classes). In other instances, teachers were identified as co-teachers in a CTT class when they were not in fact co-teachers, thereby being held accountable for students they did not teach. See Brooklyn CTT Example II, a copy of which is annexed hereto as Ex. 15 (reflecting CTT classes on several TDRs where the teachers in question never co-taught); Queens CTT Example II, a copy of which is annexed hereto as Ex. 16 (reflecting CTT classes on several TDRs where the teachers in question never co-taught); see also Ex. 7 (reflecting a CTT class where the teachers did not co-teach and one of the two teachers was on leave during the year in question).

2 This error may be further exacerbated by the fact that the comparison group for this teacher would presumably have been other teachers with a similar level of experience (as opposed to the citywide comparison group to which teachers of CTT classes are compared) given that the TDR did not reflect that it was a CTT class. Such a comparison could invalidate the percentile ranking given to this teacher.

3 The teacher in question co-taught at the 6th, 7th and 8th grade levels with three different teachers. Further complicating this TDR is the fact that it reflects a class of 22 students with disabilities (apparently a combination of the students from all three classes) and purports to be for the 8th grade. It is unclear what comparison group was used in compiling this TDR.

8. While the UFT understands that a certain number of students cannot be included in the value-added analysis for any given teacher due to missing information on the students (e.g., a student has no prior year test score), it is suspicious when the number of students reflected on the TDR is significantly less than the number of students taught, leading the UFT to believe that there was an error in the DOE's linking of students to teachers. See Staten Island Missing Students Example, a copy of which is annexed hereto as Ex. 17 (reflecting only 309 students for a teacher who reports teaching between 375 and 400 students); Brooklyn Missing Students Example, a copy of which is annexed hereto as Ex. 18 (reflecting only 131 students for a teacher who reports teaching 240 students); Queens Missing Students Example, a copy of which is annexed hereto as Ex. 19 (reflecting only 87 students for a teacher who reports teaching 112 students). Unfortunately, the UFT cannot determine whether the removal of students from teachers' lists is a function of missing information or DOE error, since teachers are not privy to the student lists upon which their TDRs are based.
9. The errors illustrated above constituted a significant cross section of the unredacted TDRs the UFT had the opportunity to examine. Importantly, since the inaccurate information thus generated became the basis for comparisons against other teachers, the flaws are geometrically increased as a percentage of the entire pool. In addition, most of the flaws identified came from the most recent year's TDRs, for which information was slightly less opaque and memories were fresher. Yet the TDRs contain three more years of historical student lists and information, lumped in aggregated numbers. The UFT found it very difficult, if not impossible, to penetrate that information, even in a superficial manner.
Data Verification Process
10. The input quality issues discussed above could be mitigated through a
comprehensive data verification process involving the teachers themselves. The DOE's data
verification process, however (as described in the Cannon Aff. at ~ 25), does not require teachers
to be involved in the process. Rather, the DOE merely suggests that the principal or data
specialist involve teachers in the process and, in the vast majority of schools, the TDR inputs are
verified by only the principal or a data specialist. What is more, principals and data specialists
were told that the TDRs were to be used only for low-stakes purposes and that, whatever the results, they would be kept confidential. As a result, there was little incentive to ensure accuracy in the data verification process. Simply stated, teachers, who not only have the greatest interest in ensuring accurate inputs but also the best information to verify accuracy, generally do not have the opportunity to review the class lists upon which their TDRs are based.
11. Notably, those reviewing the information receive no recommendations or
guidance as to how to include the actual teachers in the verification process. Moreover, when teachers receive their individual TDRs, there are no class lists attached so there is no way, on the face of the report itself, for any individual teacher to verify that the TDR accurately represents the students he or she taught.
12. Further, those individuals responsible for verifying the information receive no
formal training in what information should be reviewed or how. Instead, these individuals were provided with a document explaining how to access the system and how to fill in the various screens. Those conducting the verification process were asked to link classes of students to the correct ELA teacher and then to the correct math teacher using the information available in the central computer system and other resources at the school, relying on the best judgment (or guess) of the person conducting the verification process.
13. To ensure accuracy of the inputs to the value-added model, the person conducting
the verification process would be required to have a detailed understanding of the many nuanced ways that students and teachers are actually programmed in the school and then figure out how to record this information in the computer system in such a way as to ensure proper matches between students, teachers and subjects taught. This currently is not done consistently across the City.
14. The data verification process is extremely cumbersome and is often completed by
only one person in the school. In some schools, this means one person is responsible for
ensuring over 1,000 students were assigned to the correct ELA teacher and then again to the
correct math teacher. This process involves the review of thousands of screens of information,
making the process ripe for error.
15. Finally, not only were teachers left out of the verification process, but the UFT has also
recently discovered that many members could not even access their TDRs and, therefore, had no
way of knowing whether their reports even existed, let alone whether they contained accurate
data. When some teachers attempted to access the TDRs online, the DOE website stated the
teacher had no report. This, however, was not the case. In approximately 40 separate instances,
I personally contacted the DOE via email seeking confirmation that no TDR was prepared for the
teacher in question. In about half of these instances, the DOE reported that the teacher did, in
fact, have a TDR, which the DOE then sent directly to the teacher.
Sworn to before me this ___ day of December, 2010

CHERYL FIDDLER
Notary Public, State of New York
No. 01FI6099004
Qualified in Kings County
Commission Expires September 22, 2014