
Assessment | Description | Purpose | Grade Level | Date of Admin. | Date of Calculation | Teacher Distribution | Parent Distribution | Description of Analysis | Next steps
KRA-L | State assessment | Kindergarten readiness - literacy | K | August | Immediate | Immediate | Immediate | Teacher & administration | Kindergarten placement
Get it/Got it/Go | State assessment | Kindergarten readiness - literacy | K | August | Immediate | Immediate | | Teacher & administration | Kindergarten placement
STAR | Renaissance Learning | Reading level | 1-4 | Aug, Dec, May | Immediate | Immediate | | Teacher | Used for reading curriculum
Accelerated Reader | Renaissance Learning | Reading comprehension & progress | 1-4 | Throughout year | Immediate | Immediate | | Teacher | Used for reading curriculum
OAA | State assessment | Achievement based on state standards | 3-8 | Oct 7-11 & April 21-May 9 | | | | RTI team/administration | 3rd grade reading guarantee benchmark
Reading A-Z | Learning A-Z | Reading literacy/level | K-4 | Aug, Dec, May | | | | Teacher | 3rd grade guarantee benchmark
Dibels Next | Dynamic Measurement Group | Reading/literacy level | K-4 | Aug, Dec, May | Immediate | Immediate | | Teacher & RTI team | 3rd grade reading guarantee benchmark; RTI screener & progress monitoring
OLSAT | Pearson assessment | Gifted identification | 2 | April 7 & 8 | | | | Gifted consultant | Benchmark scores for placement: reading, math, superior cognitive
OTELA | State assessment | English as a second language achievement | K-12 | February | | | | Special Education Department | Periodic testing until proficiency; 1 student in 2014
SRI | Scholastic | Reading comprehension & progress | 5-8 | Aug, Dec, May | Immediate | Immediate | | Teacher & RTI team | RTI screener & progress monitoring
SMI | Scholastic | Math achievement | 5-8 | Aug, Dec, May | Immediate | Immediate | | Teacher & RTI team | RTI screener & progress monitoring
Reading Counts | Scholastic | Reading comprehension & progress | 5-8 | Throughout the year | | | | Teacher | Used for reading curriculum
Explore | Provided by ACT | General achievement | 8 | October | | | | Teacher & administration | Pre-AP placement going into 9th grade; ACT predictor
PLAN | Provided by ACT | General achievement | 9 | October | | Given to students immediately | | | College readiness; identifies strengths & weaknesses in math, reading, English, & science
Compass | Provided by ACT | General achievement | 10 | Spring | | Given to students immediately | | | Dual enrollment placement
OGT | State assessment | Achievement based on state standards | 10 | Fall & Spring | Immediate | Immediate | | Teacher, administration, RTI team |
ACT | | College readiness/placement | 11-12 (students choose) | | | | | | College readiness
SAT | | College readiness/placement | 11-12 (students choose) | | | | | | College readiness
OCTCA (WebXam) | State assessment | Career readiness/CTE | 11-12 | Spring | | | | Teacher & CTE supervisor | State uses for completer status; part of new CTE state school report card
PSAT | | College readiness/placement | 11 | Spring | | | | Guidance counselors/teacher | Predictor assessment for ACT and SAT

(Blank cells indicate details the Director of Technology could not provide.)




The Process

Over a four-day period:
1) Email sent to the literacy specialist
2) Literacy specialist directed me to the Director of Technology
3) Technology director emailed me an assessment document covering the entire district
4) Technology director agreed to meet with me to discuss the assessments in further detail

To begin the process, I emailed a member of the high school RTI team who is also a central office employee regarding the assessments given across the entire district. She serves as the Literacy Specialist but always seems knowledgeable about testing as well. She redirected me to the Director of Technology, who emailed me back right away and shared a Google document he was completing, similar to the matrix provided above. He also agreed to meet with me that week to discuss the assessments in further detail. During OGT testing, the technology director spoke with me about the purpose of the assessments and the analysis portion of the data collection process. He also discussed the strengths and weaknesses of our district's current data collection and analysis process.

Analysis

The Director of Technology for my school district and I compiled a list of "What's Working" and "What Needs Improvement" with regard to assessment administration, data collection, data analysis, and outcomes for the building and district.

What's Working
- Multiple assessments: a good variety covering several content areas
- Balance of testing: avoiding overuse of, and overemphasis on, testing overall
- Constant analysis of assessments (e.g., currently evaluating the efficacy of STAR, SRI, and SMI and looking into i-Ready to serve as a new assessment system for diagnostic and progress monitoring)
- TestingWerks: data warehouse, cost-effective
- Increases student achievement marginally but lacks long-term impact

What Needs Improvement
- Consistency of analysis across the district, building to building and within each building
- Longitudinal data: year-to-year silos; need to look at more big-picture data
- TestingWerks: professional development needed for teachers to understand how data should be analyzed and what the numbers mean
- Communication of data significance and findings to all stakeholders (parents)
- Validity of data: monitoring whether and how systems/programs are being implemented with fidelity
- Transitional systems: assessments being added and removed on a regular basis
- Connecting all the data
Analysis (continued)

After speaking with my district's Director of Technology, it became clear that our greatest strength is the type and variety of assessments we administer, as well as how we constantly evaluate the efficacy of those assessments to ensure the validity of the data rather than using tests simply because we always have. The district's greatest weakness is the consistency with which the data are analyzed and connected longitudinally. The CALDER Center (National Center for Analysis of Longitudinal Data in Education Research) considers a dataset longitudinal if it tracks the same type of information on the same subjects at multiple points in time; for example, part of a longitudinal dataset could contain specific students and their standardized test scores in six successive years (CALDER, 2011). Currently, my district does a sufficient job of identifying student strengths and weaknesses within a single year's dataset, but we are not looking closely enough at the numbers across time to determine which specific programs are working and at which grade levels students are finding success.
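To make CALDER's definition concrete, the sketch below uses entirely hypothetical student IDs, grades, and scores (not district data) to show the shape of a longitudinal dataset and the kind of year-over-year growth question a single-year snapshot cannot answer.

```python
# Illustrative sketch only: hypothetical students and scores, not actual district data.
# Shows a "longitudinal" dataset per CALDER's definition: the same measure tracked
# for the same students at multiple points in time.

# Each record: (student_id, school_year, grade, scaled_reading_score)
records = [
    ("S001", 2011, 5, 410), ("S001", 2012, 6, 428), ("S001", 2013, 7, 441),
    ("S002", 2011, 5, 395), ("S002", 2012, 6, 402), ("S002", 2013, 7, 430),
    ("S003", 2011, 6, 450), ("S003", 2012, 7, 455), ("S003", 2013, 8, 471),
]

# Group each student's scores in chronological order.
by_student = {}
for student_id, year, grade, score in sorted(records, key=lambda r: (r[0], r[1])):
    by_student.setdefault(student_id, []).append((year, grade, score))

# Year-to-year growth by grade transition: the cross-year view needed to ask
# "at which grade levels are students growing the most?"
growth_by_transition = {}
for history in by_student.values():
    for (y1, g1, s1), (y2, g2, s2) in zip(history, history[1:]):
        growth_by_transition.setdefault((g1, g2), []).append(s2 - s1)

for (g1, g2), gains in sorted(growth_by_transition.items()):
    print(f"Grade {g1} -> {g2}: average gain {sum(gains) / len(gains):.1f} points")
```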

Currently, a great need for the district is a screener for incoming 9th grade students to determine their need for intervention services through the Response to Intervention (RTI) program that has slowly been implemented. As a member of the high school RTI team, I am aware of the difficulty the district has had in determining the appropriate assessment to use as an RTI screener that may also serve as a diagnostic tool. The RTI program at the high school lacks a consistent data system. In October 2010, Educational Leadership published an article entitled "Doing RTI Right" in which a Texas school discussed its first roadblocks with implementing RTI, roadblocks I believe my district is currently facing: "There was an entrenched belief in our district that a teacher's intuition was as reliable as quantitative student data in defining student progress. Intuition, which is often based on qualitative data such as anecdotal notes and classroom observations, should always be a part of the equation. However, qualitative data should not drive instructional decisions" (Bryson, 2010). At the high school, we often make decisions in committees, department meetings, grade-level meetings, and elsewhere based upon teachers' recommendations, student behavior, and, as stated above, intuition. We are not using data as often as we should to make curricular and placement decisions. I believe the biggest reason is that we do not have the screening and diagnostic tools needed to make early decisions. As I discussed with the technology director, we are looking into i-Ready to serve as a 9th grade screening tool to help place students in appropriate learning environments and provide necessary interventions, including gifted education. i-Ready is considered a Common Core State Standards screener and serves as a math and reading screener for grades 3-8. The test was created by Curriculum Associates, LLC, through a panel of expert advisors. It is intended to pinpoint the standards with which students are most likely to struggle and to help identify gaps in the current curriculum.
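As an illustration of what data-driven (rather than intuition-driven) placement might look like, the sketch below uses hypothetical screener scores and cut points, not i-Ready's actual scale or any criteria the district has adopted, to show how a universal screener could sort incoming 9th graders into RTI tiers for follow-up.

```python
# Illustrative sketch only: hypothetical scores and cut points, not i-Ready's
# actual scale or the district's adopted criteria. Demonstrates a quantitative
# screening rule feeding RTI tier recommendations for incoming 9th graders.

TIER_2_CUTOFF = 480   # hypothetical "some risk" benchmark
TIER_3_CUTOFF = 440   # hypothetical "at risk" benchmark

def assign_tier(screener_score: int) -> str:
    """Return an RTI tier recommendation from a single screener score."""
    if screener_score < TIER_3_CUTOFF:
        return "Tier 3: intensive intervention + diagnostic follow-up"
    if screener_score < TIER_2_CUTOFF:
        return "Tier 2: targeted intervention + progress monitoring"
    return "Tier 1: core instruction"

# Hypothetical incoming 9th graders and their screener scores.
incoming_ninth_graders = {"S101": 455, "S102": 512, "S103": 430}
for student_id, score in incoming_ninth_graders.items():
    print(student_id, assign_tier(score))
```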

Lastly, another recommendation for the district as a whole is to do a better job of communicating the ins and outs of the data to parents. Parental involvement can be one of the greatest indicators of student achievement. A meta-analysis on how parental involvement affects student motivation found that students' perceptions of their parents' values about learning and achievement have the strongest relationships with both motivation and competence (Gonzalez-DeHass et al., 2005, p. 110). Parents simply receiving data from the school does not guarantee parental involvement, but the more the school can educate parents on what each assessment means and how it affects their child, the more parents can become part of the progress monitoring process. As the matrix above shows, the technology director was unable to provide exact dates of distribution to teachers and/or parents, and for many of the middle school and high school assessments he was not familiar with each building's distribution dates for assessments that are actually distributed to parents. From a teacher's perspective, I know I have never been given data from the assessments above other than the OGT data. With more time, I would like to meet with building principals to get a more detailed view of how and what assessment data is distributed to parents and teachers.
District Overview

The high school in which I work belongs to one of the largest districts in the county as far as district lines are concerned. The district stretches across several townships and cities, creating a unique demographic mix of rural, suburban, and urban environments. According to the 2012-2013 building report card, the average daily enrollment for grades 9-12 is 2,133. The student population is 8.5% Black/Non-Hispanic, 0.6% Asian or Pacific Islander, 0.8% Hispanic, 4.7% Multi-racial, and 85% White. Economically disadvantaged students make up 45.5% of the student body, and 15.6% are students with disabilities. At the high school, 96.3% of the faculty holds at least a Bachelor's degree and 67.4% at least a Master's degree. The district's current Superintendent has a strong belief in Phillip Schlechty's "designing engaging work" philosophy of education. She has provided multiple professional development opportunities regarding the process of designing engaging work and developing a school culture centered on design. The following is an excerpt from the Schlechty Center's website:
Our work begins with the assumption that there is a direct link between the caliber of schoolwork students are provided and
the willingness of students to engage in schoolwork. When students engage in and persist with their work, they are much more
likely to learn that which schools, parents, and the community deem important. However, the production of engaging
experiences for students requires a commitment to continuous innovation and the constant creation of new ways of doing
things: in the classroom and the principal's office, as well as in the central office and the boardroom. This means that the
schools and the districts in which schools are embedded must be organized as learning organizations rather than as
bureaucracies. The reason this is so is that bureaucracies require certainty and predictability, whereas learning organizations are
designed to give order, meaning, and discipline to innovation and creativity. Therefore, in a school district that functions as
a learning organization, the core business is to ensure that every student, every day, is provided challenging, interesting, and
satisfying work (Schlechty, 2011).
This philosophy has shifted teachers' view of themselves from providers of information to designers of work. With this in mind, a great deal of negativity surrounds the amount of testing in school, along with a belief that a focus on testing does not improve how schools function. This does not mean testing data have no validity or that there is no place for data analysis, but the culture of the school favors a view in which testing takes a back seat to designing engaging work. Testing and data are not often discussed in staff meetings, committee meetings, or even district-wide staff in-services.


Interviews

1) High school Math teacher
2) Literacy specialist
3) High school English teacher
4) Guidance Counselor
5) Intervention Specialist


Insights and questions from five informal interviews:

- How long has this document existed? Why have I never seen it?
- Reading comprehension appears more frequently than any other purpose for assessment; why?
- Why are we only monitoring reading progress and not also math?
- "I knew we gave these tests, but I don't always know why."
- There is so much testing; what are we communicating to our students at such a young age about testing?
- Want to know more about each type of test and how to interpret the scores
- Recommendation from nearly everyone that we need to meet more regularly to analyze this data; professional development needed
- We need to do more progress monitoring at the middle and high school levels
- Would like to see a screener used at the high school level for RTI
- Teachers need to be more involved in the analysis process
- Need more professional development on TestingWerks
- How do we plan to use the screener at the high school if we decide on one?
- Most people had no idea we would now be receiving a CTE report card


References
Bryson, M., Maden, A., Mosty, L., & Schultz, S. (2010). Doing RTI right. Educational Leadership, 68(2).

Elliot, K. (2011). i-Ready Common Core State Standards Screener helps educators prepare students for testing on new standards. eSchool News. Retrieved from http://www.eschoolnews.com

Gonzalez-DeHass, A., Willems, P., & Doan Holbein, M. (2005). Examining the relationship between parental involvement and student motivation. Educational Psychology Review, 17(2), 99-123.

National Center for Analysis of Longitudinal Data in Education Research (CALDER). (2011). Retrieved from http://www.caldercenter.org

Schlechty, P. (2011). Schlechty Center on engagement. Schlechty Center, 5-7. Retrieved from http://www.schlechtycenter.org







Screencast Links:

Part 1: http://screencast.com/t/2WfBlfymlDoz

Part 2: http://screencast.com/t/jXxGxfKj8Crv

Part 3: http://screencast.com/t/WCIRdxPU