
Assessment and Evaluation

International Literacy Association: ILA Standard 3: Assessment and Evaluation

International Dyslexia Association: IDA Standard D: Interpretation and Administration of Assessments for Planning Instruction

Course/Artifact:
EDCS 647: Pre/Post Intervention Assessment Comparison - Literacy Learners Case Study (artifact 1)
EDCS 647: Informal Reading Inventory - Literacy Learners Case Study (artifact 2)
SPED 639: Assessment Summary Table - Student Case Study Report (artifact 3)

Synthesis of Assessment Standards


ILA Standard 3 and IDA Standard D focus on assessing students and using the results to
evaluate their skills and plan appropriate instruction based on the assessment findings. Literacy
specialists must be able to choose and administer appropriate assessments, analyze the data,
select instruction grounded in evidence-based practices, and use progress monitoring to
guide decisions. It is essential for literacy specialists to use reliable, appropriate assessments
to identify students who are at risk for reading failure and to determine why. Assessment is also
used to identify students who are achieving above age/grade expectations. Furthermore, the
literacy specialist must be able to analyze the data to determine whether a student is making
progress and to report the findings, pinpointing the specific areas in which the student is
experiencing difficulty. Assessing all students with a variety of assessments also highlights
individuals who are not experiencing difficulty in reading and writing, and the results assist
teachers in differentiating instruction effectively to include all students.

Both ILA Standard 3 and IDA Standard D recognize the importance of the proper
administration and analysis of assessments and the importance of using standards-based tools.
Both standards place emphasis on collecting data, both initially and continuously through
progress monitoring. Results are then used to shape and guide instruction in order to optimize
student learning and maximize positive outcomes, which is the ultimate goal. Both standards
highlight the importance of creating a comprehensive report that is easily communicated. The
standards differ in that IDA Standard D places greater emphasis on a thorough
understanding of the phonological, decoding, oral reading, spelling, and writing skills that are
typically assessed; this understanding is necessary in order to recognize atypical results when
they appear in assessments.

The main reason for administering assessments is to discover which skills students have
and which skills they still need. To use the appropriate assessment, literacy specialists must be very
familiar with the different assessment types used in their school and also with other
research-based assessments that are available. The ultimate goal of assessing and evaluating student
performance is to optimize student learning outcomes. Analysis of assessment information can
indicate whether the student is reading at, above, or below grade level, as well as areas of strength
and need, such as oral reading fluency, spelling, and comprehension. Students' progress should
be continuously monitored and, through data collection and analysis, instruction should be
adjusted accordingly. In my current position as a trained literacy specialist, collaborating with
classroom teachers is essential for student success. Reports of my findings are generated and
shared with the RTI team and classroom teachers. Literacy specialists must be able to create
comprehensive reports that are easily communicated to inform teachers, parents, and
administrators and to recommend further instruction.

Summary of Artifacts

Artifact #1 Pre/Post Intervention Assessment Comparison

The pre/post assessment information shown below in artifact 1 was extracted from a
semester-long Literacy Learner's Case Study (LLCS) from a Literacy Assessment class (EDCS
647). The subject of the case study was a first-grade student who struggled with reading. The
assessment data shown were collected using the Observation Survey (Clay, 2013). The
intervention implemented between the two assessments was based on Clay's (2005) Reading
Recovery program and sought to develop oral reading fluency by building sight word
fluency. The assessment information on the left was collected after the first week of school, and
instruction was modified to reflect those results. On the right, the post-test information was
collected in December and shows the student's skill level after the intervention.

Artifact #2 Informal Reading Inventories


The second artifact is an informal reading inventory (IRI) in the form of a running record
from the same student in the LLCS discussed in artifact 1. Running records were taken each day
as the student worked directly with me during the intervention. Clay (2005) recommends
implementing the Reading Recovery intervention program five days a week, and this was the case
for this student apart from two absences due to illness. The running record provided a
daily opportunity for me to apply a miscue analysis after each oral reading. The books were
chosen specifically at the student's instructional level and included at least two target words for the
day's lesson as well as a number of practice words from the student's sight word list. Analysis of
the daily running records indicated, based on the accuracy rate obtained, when it was time to move
the student up a reading level. The running records also indicated when the student was
making progress in specific decoding areas that had been identified as challenges (e.g., vowel
sounds, vowel combinations, consonant blends). Further, the running records indicated when she
had mastered particular sight words and where more practice was needed.
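The accuracy-rate arithmetic behind a running record can be sketched in a few lines. This is a minimal illustration, not part of the case study itself; the 90%/95% cut-offs follow commonly cited running-record conventions, but the exact thresholds a given program uses should be checked against its manual.

```python
# Sketch of running-record scoring: accuracy rate and the text-difficulty
# band it implies. Thresholds (95% / 90%) are assumed conventions, not
# figures taken from the case study.

def running_record_accuracy(running_words: int, errors: int) -> float:
    """Accuracy = (running words - errors) / running words * 100.
    Self-corrections are tallied separately and do not count as errors."""
    return (running_words - errors) / running_words * 100

def text_difficulty(accuracy: float) -> str:
    """Classify the text level implied by an oral-reading accuracy rate."""
    if accuracy >= 95:
        return "easy"           # candidate for moving up a level
    if accuracy >= 90:
        return "instructional"  # appropriate for guided practice
    return "hard"               # frustration level; move down

# Hypothetical example: 120 running words with 7 errors
acc = running_record_accuracy(120, 7)
print(round(acc, 1), text_difficulty(acc))  # 94.2 instructional
```

A daily record like this is what signals when a student is ready to move up a reading level.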

Artifact #3 Assessment for Case Study Student


Artifact 3 is a table extracted from a Case Study Project in SPED 639
Advanced Fundamentals of Language and Literacy, showing the results from a student whom I
assessed with the QRI-6, a Spelling Inventory, and a CBM writing assessment. I received full
training in how to use these three tools effectively and analyze the results. The student's sight word
level, reading fluency and comprehension, spelling, and writing were assessed.

Justification of Artifacts

Assessment and Evaluation (ILA 3).

Interpretation and Administration of Assessments for Planning Instruction (IDA D).

I demonstrated my ability to meet ILA Standard 3 and IDA Standard D in my analysis of
the results of the Observation Survey (Clay, 2013) pre-assessment, which indicated the student was
a struggling reader: her oral reading fluency was slow and choppy due to a limited sight
word inventory. More specifically, the assessment indicated that the student struggled with
foundational concepts regarding vowel sounds, vowel combinations, and consonant blends, and
that her identification of high-frequency words was limited, which further impacted
her oral reading fluency. As part of this assignment, I used assessment to guide instructional
decisions. I implemented an intervention that targeted building sight word fluency while clearing
up confusion regarding irregular words and word parts through direct and explicit
instruction. Based upon these initial findings, a 16-week intervention was implemented.
Comparing the post-assessment data to the pre-assessment data (see artifact #1), we can see the
student increased in sight word fluency and oral reading fluency after the 16-week intervention.
The results indicate that my interpretations of the pre-assessment data were correct. By
monitoring student progress through an informal reading inventory in the
form of daily running records (see artifact #2), I was able to continually adjust the instruction as
needed. The daily running records indicated the student was making gains. Through direct
instruction in building a stronger sight word bank while concurrently learning the 'rules' that
apply to irregular words through manipulation of words and word parts, the student demonstrated
favorable post-test results. This indicates that the intervention I designed specifically for this
individual student, based on analysis of the pre-assessments and on progress monitoring through
daily running records, was successful. As can be seen in artifact #1, the student progressed to
Reading Recovery (Clay, 2005) level 16 (the exit level for the program), which placed her
at a first-grade reading level. I demonstrated my professional ability to create and implement an
effective reading program for an individual case that resulted in significant progress by that
student. As ILA Standard 3 and IDA Standard D state, the purpose of assessment is to gather
information to guide instruction and ultimately produce positive learner outcomes. Artifacts 1
and 2 from the literacy learner's case study demonstrate my ability to understand and apply these
two standards.

A third artifact (artifact #3) is added to further demonstrate that I am able to use a
variety of assessment tools so that I may choose the best tool for the
student and situation. ILA Standard 3 states that literacy professionals are able to choose and
administer an appropriate assessment and interpret the results to evaluate and plan instruction.
The Qualitative Reading Inventory (QRI-6), the Spelling Inventory, and the CBM Writing
assessment are three tools I received training in through SPED 639 Advanced Fundamentals of
Language and Literacy. Competence in the administration and analysis of these assessment tools
is demonstrated through the summary of implications/analysis portion of the table shown in
artifact 3 below.

The student scored at the pre-primer instructional level for the word list, oral
reading fluency, and comprehension. Word automaticity is very low, affecting oral reading
fluency, which is slow and halting. Reading words in context was slower and choppier than
reading the sight word list in isolation, which suggests a need for an increased sight word bank
to assist fluency. Analysis of the expository text passage at the primer level, on which the
student scored at a frustration level, revealed that when the student comes to a word he doesn't
know, he will fill in, omit, or miscue subsequent words to create a meaningful sentence. It seems
as though his strategy is to invent his own story following the unknown word. This greatly
disrupts the text's authentic meaning and throws the student off for subsequent sentences.
Analysis of the narrative text passage at the pre-primer level, on which the student scored at an
instructional level, revealed that he uses visual strategies to read words; that is, his miscues have
similar letter and/or sound patterns. He read there for where, here for he, and stable for table
(which affected his retell, in which he talked about horse stables, something not in the story at
all). Additionally, it should be noted that the student self-corrected once during passage
reading, saying here for he and then correcting. Analysis of the concept questions, retell, and
comprehension questions revealed that the student tends to go off topic; repeating the questions
seemed to bring him back. He was unable to answer concisely and gave
non-relevant information in the concept questions and the retelling of the story. The
comprehension questions revealed that he was better able to answer explicit questions than
implicit questions, which suggests that concrete information may be easier for this student to
process than inference.

The Elementary Spelling Inventory revealed that the student's overall spelling
stage is Middle Within Word Pattern. He correctly spelled 6 of 25 words and
accumulated 29 of 62 feature points. The student's areas of strength were initial and final
consonants and digraphs. He did not know the ch digraph when it occurred in the middle of a
word (marched), and he did not know wh. The student did well on blends, attaining 6/7; he did
not know pl. The student needs work with vowel blends (oi, er, ew, ar, ow). He experiences
confusion with syllable junctures and made errors with double r's, t's, l's, and p's.

This student seems to enjoy writing, and this is a strength. He is also very imaginative, which
fuels his writing. However, he writes significantly more slowly than same-aged peers: he produced
19 words in 3 minutes, and of the words he wrote, 18 of 19 were spelled correctly. The story
starter from the CBM written expression probe piqued and maintained his interest. He used
periods but not capitals.
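The raw counts reported above reduce to a handful of rates and percentages. The short sketch below recomputes them; the metric labels (whole-word percentage, words per minute, spelling-accuracy rate) are my own shorthand, not terminology from the inventories themselves.

```python
# Recomputing the scores reported in the case study from their raw counts.
# Counts come from the text above; the derived labels are assumptions.

def percent(correct: int, total: int) -> float:
    """Simple percentage helper."""
    return correct / total * 100

# Elementary Spelling Inventory: 6 of 25 words, 29 of 62 feature points
words_correct = percent(6, 25)     # 24.0% of whole words spelled correctly
feature_points = percent(29, 62)   # ~46.8% of feature points earned

# CBM written-expression probe: 19 words in 3 minutes, 18 spelled correctly
total_words, spelled_correctly, minutes = 19, 18, 3
words_per_minute = total_words / minutes                   # ~6.3 WPM
spelling_rate = percent(spelled_correctly, total_words)    # ~94.7%

print(round(words_correct, 1), round(feature_points, 1),
      round(words_per_minute, 1), round(spelling_rate, 1))
```

Expressing the counts as rates makes it easier to compare this student's writing fluency against benchmarks for same-aged peers.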
The above analysis demonstrates my ability to effectively apply ILA Standard 3 and IDA
Standard D. Through detailed explanation of the artifacts, I have demonstrated that I am able to
choose the appropriate assessment and interpret and evaluate the results to guide individual
student instruction.