
More Than a Score:

Overview of Standardized Testing and Student Data Privacy

Where did all this testing come from?


- Standardized testing developed as a way to test large numbers of people quickly in order to rank and sort them by score (college admission, military).
- Elementary and Secondary Education Act passed in 1965 as part of the War on Poverty.
- In 1994, a requirement was added that states test children in Title I (>40% poverty) schools every year.
- No Child Left Behind Act (2001) added even more testing for all schools, plus a requirement that test scores improve every year (=AYP).
- Race To The Top competitions made funding contingent on continued use of high-stakes standardized testing.

Problems with high-stakes standardized testing

- Norm-referenced standardized testing is biased toward upper- and middle-class, white, male, native speakers of the prestige dialect of English.
  o Scores mostly measure these factors, not what a child knows or has learned.
  o Family income explains the vast majority of the variance in test results.
- Curriculum narrows toward what is on the test.
- Only knowledge and skills that can be tested easily are assessed. Not: creativity, curiosity, empathy, leadership.
- High-stakes testing for schools, districts, teachers, and administrators means a higher likelihood of cheating.

Standardized testing in CPS


- Some tests are required by the state, some by the district, and some are "optional"; ELL students take ACCESS for English proficiency.
- PreK: ESI-R, REACH x2, TSG
- K-1st: Literacy and Math x3 (NWEA MPG or mClass TRC or other), mClass DIBELS x3, Common Core benchmarks x3 (reading), REACH x2 (reading, math, possibly specials)
- 2nd: Literacy and Math x3 (NWEA MPG or mClass TRC or other), mClass DIBELS x3, Common Core benchmarks x3 (reading and math), REACH x2 (reading, math), NWEA MAP (reading and math) x2
- 3rd-8th: NWEA MAP (reading and math) x2, ISAT, Common Core benchmarks x3 (reading, math, science, social science), REACH x2 (reading, math, possibly specials)
- High school: Common Core benchmarks x3 (reading, math, science, social science), REACH x2 (reading, math, possibly specials), PSAE, EXPLORE/PLAN/ACT, AP and IB exams

High-stakes uses of test scores


Because the ISAT is phasing out, this year the NWEA MAP test will be used for:
- Promotion in 3rd, 6th, and 8th grades
- Performance policy for elementary schools
- Growth, also used as part of teacher evaluation
- 70% of elementary-school quality rating based on tests
- 45% of high-school quality rating based on EPAS

No info yet on what will be used for SE admissions

Problems with CPS Promotion Policy

- CPS is now replacing the ISAT with another norm-referenced test, the NWEA MAP.
- School-year and summer-school supports are heavily based on a software program, Compass Learning.
- Extensive research shows retention is harmful academically, leads to increased dropouts, and vastly disproportionately affects minority students.
- MTAS's proposed alternative promotion policy relies primarily on report card grades and uses standardized test scores only as diagnostic tools.

Opting Out of Standardized Testing


1. Investigate: Which standardized tests are being administered in your school and to your child? What are the results being used for?

2. Opt out: Write a letter (or use the universal opt-out letter) to your child's principal and share it with their teacher(s). Talk to your child so they know what to expect on whole-class testing days. If they are old enough to refuse testing themselves, provide them with the information necessary to decide.

3. Connect with other families: Share your research with parents who haven't opted out yet. Hold an event with a member of MTAS. Join the Opt Out Chicago Facebook group.

InBloom, ISLE, and Data Privacy


- The Illinois State Board of Education is creating a statewide longitudinal database, ISLE, with architecture created by InBloom.
- InBloom software was built by Rupert Murdoch-owned Amplify, a standardized-testing software and hardware company.
- Only Illinois and New York are still working with InBloom.
- CPS is scheduled to start participating as early as spring 2014.
- Participation in ISLE/InBloom is claimed to be required to receive $19M in RTTT funds.
- The existing IMPACT system cost $50M.
- ISBE says districts have a choice whether to participate in ISLE and whether to use the InBloom component.
- ISLE is longitudinal: it will eventually collect information from birth to age 20 for school districts, higher-education institutions, and employers.

Concerns about ISLE/InBloom




- Intended to include much more data than IMPACT, e.g. output of learning-software programs your child may use and IEP/504 info; possibly even health, immigration, DCFS, criminal, and disciplinary records.
- Data is intended to be shared, or possibly rented, to software companies so that they can train algorithms used in software to be sold back to schools. Eventually schools will pay for data storage and will also pay for software built from this data.
- InBloom denies liability for data breaches and does not guarantee security.
- There is currently no plan to grant parents rights of disclosure and consent for uploading children's data, sharing it with third parties, or using it for commercial purposes.
- FERPA, state, and CPS regulations were changed to make all of this legal. HIPAA and COPPA would protect such data in other situations.

Relevant Legislation on Standardized Testing and Data Privacy


SB2156: Introduced by Sen. Bill Cunningham last year. Would limit standardized academic achievement tests to four per school year (including the two administrations of the REACH exams).

New York State: currently attempting to pass bill(s) giving parents the right to opt out of participating in InBloom. Could provide legislative language to be used in IL.

(105 ILCS 13/) P-20 Longitudinal Education Data System Act (2009). Established the creation of ISLE so IL could compete for RTTT funds. No mention of InBloom. "The data warehouse, as integrated with the longitudinal data system, must include [...] unique statewide student identifier that connects student data across key databases across years. The unique statewide student identifier must not be derived from a student's social security number and must be provided to institutions of higher learning to assist with linkages between early learning through secondary and postsecondary data."

Problems with MAP and growth


What's wrong with the NWEA MAP?
- Parents and teachers are not allowed to see the questions.
- Not designed to be high-stakes for any purpose.
- Research showed that use of the information provided by the MAP test is not beneficial (study on 4th and 5th grades in IL).
- Computer-adaptive, whole-class testing is terrible for early grades.
- Not norm-referenced for special-needs or ELL students.

What's wrong with measuring growth?
- Educational research has shown that attributing growth to a cause (like a particular teacher), usually called a Value-Added Measure, is unreliable.
- Growth is minimal in upper grades and can be smaller than the standard error of the test.
- Campbell's Law: "The more an indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."
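The point that growth can be smaller than the test's standard error can be illustrated with a small simulation. The numbers below (a 3-point true annual gain, a 5-point standard error of measurement per test) are hypothetical, chosen only to show the effect, not actual MAP statistics:

```python
import random

# Hypothetical illustration: when true growth is small relative to the
# test's standard error of measurement (SEM), the measured fall-to-spring
# "growth" for an individual student is dominated by noise.

random.seed(0)

TRUE_GROWTH = 3.0  # assumed true gain in scale-score points (hypothetical)
SEM = 5.0          # assumed standard error of measurement per test (hypothetical)

trials = 10_000
wrong_sign = 0
for _ in range(trials):
    fall = 200 + random.gauss(0, SEM)                   # observed fall score
    spring = 200 + TRUE_GROWTH + random.gauss(0, SEM)   # observed spring score
    if spring - fall < 0:                               # measured "growth" is negative
        wrong_sign += 1

print(f"Measured growth came out negative in {100 * wrong_sign / trials:.0f}% of trials")
```

Under these assumed numbers, roughly a third of students who genuinely improved would still appear to lose ground, because the noise in the two test scores (combined spread of about 7 points) swamps the 3-point true gain. Any evaluation built on such per-student growth inherits that unreliability.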
