
Focus:

I have often noticed that many of my students are unwilling to attempt to answer

questions that require more than simple solutions. I have also observed that my students tend to

provide short and simplistic responses to open-ended questions in what may be an attempt to

avoid higher orders of thinking. Answering open-ended questions is an important way to

challenge students to think critically and execute higher orders of thinking, which is a critical

component of effective pedagogy (Danielson 3C). Other teachers in the building have advised

me that practicing higher-order thinking skills with my student population is something that I

should avoid because they have limited experience with this skill. I reject that advice because it

is important for teachers to set high expectations for learning and hard work (Danielson 2B). For

these reasons, student performance on critical response questions (CRQs) is the concept that I

plan to tackle for my study of student achievement. CRQs are open-ended questions that are

asked on the Biology Keystone. My students will take the Biology Keystone during their

sophomore year, but the goal of this study is to encourage increased effort and higher-order thinking rather than standardized test preparation. In addition to their inexperience, effort on open-ended questions is an area in which I know that my students can improve, so that is another goal for this study. I plan on using classwork assessments (released CRQs) and one summative assessment to measure student progress.

I specifically focused on one class for this study to avoid getting overwhelmed with data.

The class that I studied consisted of ninth graders ranging from 14 to 16 years of age. The class

originally had nineteen students, but one student was added to the class during the middle of the

study. At the end of the study, the class consisted of one caucasion student (the student who
joined during the middle of the study), two Hispanic students, and seventeen African American

students. Both of the Hispanic students are English Learners, but only one of them seems to have

a communication barrier based on her level of English proficiency. Throughout my placement as

a student teacher, I have frequently observed that students have struggled to answer open-ended

questions during class. I recorded these observations in a journal (Danielson, 4B). My students

have also performed less favorably on open-ended questions on summative assessments than on other types of questions (multiple choice, true/false, short answer, etc.). My students have scored 47% on average on the open-ended questions on summative assessments throughout the year, which is much lower than their 65% average on other types of

questions. It is also important to note that summative assessments refer to tests and quizzes for

this study, because the averages would arguably be inflated if performance on projects was

included.

Pre-Assessment Lessons:

Wednesday, January 30

Teacher: Mr. Berenson

Grade: 9th

Content Area: Environmental Science

1. Content and Standards:​ 4.1.10.A: Examine the effects of limiting factors on population

dynamics: Analyze possible causes of population fluctuations.

2. Prerequisites: ​Students should have background knowledge on populations


3. Essential Questions: ​How do interactions among organisms affect their ability to survive?

Why is it important to keep track of population growth? How do communities respond to

disturbance? How are population sizes in nature regulated?

4. Materials and Equipment: Writing utensil, Do Now sheet, study guide, SMART BOARD, notebook,

CRQ

5. Instructional Objective*:

- SWBAT complete a CRQ related to predation with at least 33% accuracy (1 out of 3)

IOT understand the impact that predator and prey populations have on each other within

the same ecosystem.

- SWBAT answer analysis questions based on a graph modeling lynx and hare population

size with at least 75% accuracy IOT identify the impact that predation has on population

size.

6. Instructional Procedures:

I. Before​:

A. Students answer the following question in their “Do Now” tracker: What is a

population? The answer to the question will be reviewed as a class (7 Min).

II. During​:

A. Direct Instruction: The students will take notes on population dynamics (size,

distribution, density) as well as predation. Sample discussion questions: Mr. G.

and George Washington both live in the United States; are they part of the same population? Since there are 20 people in the classroom right now, would we have

a dense population? What about after dismissal when Mr. B. and Mr. G. are the
only ones in the room? How would a population of gazelles be impacted if the

population of cheetahs increased? (20 min)

B. Guided Practice: ​The class will complete a CRQ. The teacher will model the

beginning of the CRQ by tagging the first part of the question with the students

(25 min).

C. Independent Practice: The students will complete the rest of the graph on their

own. They will also answer the analysis questions with a goal of answering at least six questions correctly (15 min).

III. After:

A. Exit Ticket: Students will answer the following question before exiting the class:

How would the population of lynxes be impacted if the population of hares

decreased? (6 Min)

7. Assessment:

A. Do Now: The Do Now is a pre-assessment to gauge students' prior knowledge. Students will

receive full credit as long as they attempt to answer the question.

B. Formative: I will pay attention to student participation and engagement during direct

instruction. I will ask students for feedback about their understanding of the assignment

for the day. I will also monitor progress as they work on their graphs and analysis

questions.

C. CRQ: Students will be expected to explain their answers using evidence from the graph, the data, or their own background knowledge.


D. Predation Worksheet: Students will be expected to answer the analysis questions with

75% accuracy (6 out of 8).

E. Exit Ticket: I will be interested in the percentage of students that identified the key

relationship between predator and prey dynamics.

8. Differentiated Instruction: ​Students in need of time accommodations will only need to

complete the four analysis questions. ​(Name redacted) can have a scribe for the CRQ.

Tuesday, February 18

Teacher: Mr. Berenson

Grade: 9th

Content Area: Environmental Science

1. Content and Standards:​ 4.1.10.A: Examine the effects of limiting factors on population

dynamics: Analyze possible causes of population fluctuations.

2. Prerequisites: ​Students need to have completed the majority of their invasive species

research worksheet because they will be using it to construct their Google Slides presentation.

3. Essential Questions: ​How do interactions among organisms affect their ability to survive?

Why is it important to keep track of population growth? How do communities respond to

disturbance? How are population sizes in nature regulated?

4. Materials and Equipment: ​Chromebooks, writing utensil, Do Now Sheet, Invasive species

research worksheet, CRQ.

5. Instructional Objective*:
- SWBAT answer a CRQ about invasive species with at least 33% accuracy IOT assess their knowledge of the material and practice critical thinking skills by explaining their answers using evidence.

- SWBAT use the internet to complete the invasive species research worksheet with at least 95% accuracy IOT gather information about their invasive species.

- After a day of research on their invasive species, SWBAT create Google Slides with a cover page, four facts of background information, and four facts about the native location and habitat for their invasive species IOT identify key characteristics of their invasive species.

6. Instructional Procedures:

I. Before​:

A. Students will complete a CRQ about Invasive species (25 Min).

II. During​:

A. Direct Instruction: The rubric will be distributed and explained to the students with a focus on the first three slides (cover page, background info, and native location/habitat info). Students will be told to use the information on their research worksheets to create the slides, that the expectation is to finish at least the first three slides during today's lesson, and that their progress will be graded later today (10 min).


B. Independent Practice: Students will complete the invasive species research

worksheet. Students will create the first three slides of their presentation using the

rubric and their invasive species research worksheets. (52 Min).

III. After:

A. Exit Ticket: Quick check-in: What progress did everyone make today? How much

time do they need to finish the entire presentation? (3 Min)

7. Assessment:

A. CRQ: I will be looking for students to know that the organism is non-native because it

was not originally from Australia. They must also be able to predict one consequence that

the non-native species will have on native populations (predation, competition, etc.). They need to explain their answers using evidence (from their own background knowledge or from the excerpt before the question).

B. Formative: I will pay attention to student participation and engagement during direct

instruction. I will ask students for feedback about their understanding of the project task

for the day. I will also monitor progress as they work on their Google Slides.

C. Invasive species research worksheet: This should be accurately completed by the end of the day's class.

D. Exit Ticket: I will be monitoring student progress and determining how much time is

needed to complete the final presentation.

8. Differentiated Instruction: ​Students in need of time accommodations will only need to

complete the invasive research worksheet and CRQ. ​(Name redacted) can have a scribe for

the CRQ.
Analysis​:

A total of 16 students were present for the first pre-assessment, which was much higher than the number present for the second pre-assessment. Of the 16 students, 3 did not attempt to answer the question, 6 answered the question but scored 0 out of 3, 5 scored 1 out of 3, and 2 scored 2 out of 3. No students scored 3 out of 3. Unfortunately, only 10 students were present for the second pre-assessment. Of the 10 students, 3 did not attempt to answer the question, 4 attempted the question but scored 0 out of 3, 2 scored 1 out of 3, 1 scored 2 out of 3, and no students scored 3 out of 3. At a glance, it appears that the students performed better on the first pre-assessment, but they actually performed similarly on both assessments. One student increased his score by 1 point, and two students decreased their scores by 1 point; all of the other students had the same score on both pre-assessments. The numbers are most likely lower on the second pre-assessment because many students who scored points on the first pre-assessment were not present for the second. The mean score for the pre-assessments was 0.5 out of 3, and the median score was 0.

My strategies to help students achieve were limited on the pre-assessments. My goal for

these assessments was to establish a baseline for students so that I could develop interventions

that would lead to improvement with the post-assessments. I broke down the CRQ scoring rubric

with the students before administering the pre-assessments, so they understood that they needed

to answer the questions correctly (there are limited objectively correct answers) and explain their

answers using evidence from the graph, data, excerpt, or their own background knowledge. As far as students' grades were concerned, students received credit for the assignment if they answered the questions and explained their answers, regardless of their score on the assessment. The results of the pre-assessments were not surprising to me, as I anticipated that the students would struggle with open-ended questions, which is why I decided to focus on them for this study. It was

clear that a lot of work needed to be done to help students succeed on the post-assessments.

Post-Assessment Lessons:

Tuesday, March 3

Teacher: Mr. Berenson

Grade: 9th

Content Area: Environmental Science

1. Content and Standards: ​4.1.10.A: Examine the effects of limiting factors on population

dynamics: Analyze possible causes of population fluctuations.

2. Prerequisites: ​Students need to understand how populations change based on predation,

limiting factors, the presence of invasive species, and competition.

3. Essential Questions: ​How do interactions among organisms affect their ability to survive?

Why is it important to keep track of population growth? How do communities respond to

disturbance? How are population sizes in nature regulated?

4. Materials and Equipment: Writing utensil, Population Dynamics Exam.

5. Instructional Objective*:

- SWBAT complete the first three parts of the Ecosystem Dynamics Exam with at least 70% accuracy IOT demonstrate content mastery of populations and how they change.


- SWBAT complete the CRQ at the end of the Ecosystem Dynamics Exam with at least 60% accuracy IOT demonstrate improvement with higher-order thinking and offering explanations using evidence.

6. Instructional Procedures:

I. Before​:

A. The students will answer the following question in their “Do Now” tracker: What

are your thoughts about the test? (7 min)

II. During​:

A. Direct Instruction: Teacher goes over today’s objectives. Teacher introduces some

test taking strategies (5 min)

B. Guided Practice: N/A

C. Independent Practice: Students will work independently to complete the

Populations Test (78 min).

III. After:

A. N/A

7. Assessment:

A. Population Dynamics Exam: Students will be expected to demonstrate their knowledge of

populations, predation limiting factors, invasive species, population curves, survival

strategies, survivorship, and competition. ​There will be a CRQ on limiting factors that

will be worth 20% of the test grade. Students will be asked to identify a limiting factor

for a moose population, explain how wolf and moose populations are related, and predict

the future population trends of the moose population based on a graph. Students must
think critically and explain their answers using evidence in order to successfully answer

the questions. Students will have access to a checklist of steps for completing a CRQ.

8. Differentiated Instruction: ​A modified version of the test is available. The modified

version bolds key words. There are three answer choices instead of four on the multiple-choice questions

of the modified test. ​(Name redacted) can use a scribe on the CRQ.

Monday, March 9

Teacher: Mr. Berenson

Grade: 9th

Content Area: Environmental Science

1. Content and Standards: ​4.5.12.A Research how technology influences the sustainable use

of natural resources. Analyze how consumer demands drive the development of technology

enabling sustainable use of natural resources.

2. Prerequisites: ​Students need to understand population density. Students need to have the

skills necessary to have a productive discussion with their peers. Students need to be able to

determine the meaning of a scientific article.

3. Essential Questions: ​How do environmental policies protect the environment? What

impacts do human populations and resource use have on the environment? How can we

balance our needs for housing and jobs with that of the environment? How does the

environment’s health affect our own?

4. Materials and Equipment: ​Writing utensil, Do Now, current event, SMART BOARD,

notebook
5. Instructional Objective*: ​Following a discussion about infectious disease and coronavirus,

SWBAT complete a CRQ on the coronavirus IOT examine the impact that pathogens and

infectious disease can have on ecosystems.

6. Instructional Procedures:

I. Before​:

A. Do Now: Students will answer the following question in their Do Now sheets:

What are some thoughts, feelings, or questions that you have about the

coronavirus? (7 min).

II. During​:

A. Direct Instruction: Teacher goes over today’s objective and introduces the idea of

infectious disease. Based on this, the class will have a conversation about the

coronavirus. Technology will be emphasized in the discussion. The following

probing questions will be asked: In which areas is disease more likely to spread?

What are some of the factors that influence the spread of disease? What have you

heard about the coronavirus? What can you do to prevent the spread of disease?

(30 min).

B. Guided Practice: Think pair share about disease in urban areas (10 Min).

C. Independent Practice: ​The students will read a current event article about the

virus. They will then answer a CRQ about infectious disease and the coronavirus.

The article and CRQ checklist can be used as evidence (40 min).

III. After:
A. Closing Discussion: The teacher will talk with the students about their opinions

on the articles. Did anyone change their minds after reading the articles? (3 Min).

7. Assessment:

A. Formative: I will use a Kahoot as a formative assessment to ensure that students are able

to answer level one questions about infectious disease.

B. CRQ: Students will be expected to explain their answers using evidence from the article

about infectious disease and the coronavirus.

C. Think Pair Share: Students will relate the previously discussed topic of population

density to the new topic of infectious disease.

8. Differentiated Instruction: ​I will add visuals to the Kahoot. Students with reading

difficulty will be allowed to work with a partner. (Name redacted) can have a scribe for

the CRQ.

Analysis:

Before getting to the results of the post-assessments, it would be beneficial to discuss the

changes that I made to my teaching that would support students as they worked to improve on

CRQs. In between the pre-assessments and the post-assessments, I made three major changes to

my teaching strategies with the goal of increasing student success. The first change was to have

the students peer-review each other's answers on the first pre-assessment since more students

were present for that assessment. This was an idea that was suggested in our student teaching

seminar, and I decided to utilize it because it aligns well with Danielson by giving students agency over their learning (Danielson, 2B). Before the peer review, I went over sample answers with the students and then allowed them to help each other improve their answers. Some students were reluctant to participate, but most took this opportunity to work towards

improvement. The next change that I made was to provide students with a checklist that they

could fill out as they answered the CRQ. This was another idea discussed in our seminar, and it

was a simple tool that supported students as they answered CRQs. The checklist was very

simple, as it only served to ensure that students restated the question in their answer, answered

all parts of the question, and explained their answers using evidence. The final change that I

made was a mini-competition that would incentivize engagement (Danielson, 3C). It is important to engage students in learning because, “When teachers carefully structure the delivery of their content so as to ensure...cognitive engagement by every learner, they help ensure that the learning will be lasting and meaningful” (Witkowski & Cornell, 2017). I am also a proponent of behaviorism, so I provided incentives for students who improved their scores on the pre-assessment following the peer review. The incentives I selected were a choice of a donut or extra credit for any student who edited their CRQ and improved their score during the peer review. This helped to motivate and engage the students in higher orders of thinking, and

signaled to me that the students were ready for the two post-assessments.

There were a total of 19 students present for the first post-assessment. It is necessary to

note that a new student joined the class for the post-assessment who was not present for the

pre-assessments. Of the 19 students who were present, 1 student did not attempt to answer the

CRQ (the new student), 4 students attempted the CRQ but scored 0 out of 3, 7 students scored 1

out of 3, 5 students scored 2 out of 3, and 2 students scored 3 out of 3. The second

post-assessment was a little bit different because I had to create the CRQ myself, since the topic of the lesson was the coronavirus. This likely meant that the assessment was less reliable and less valid than the other three CRQs, which were released items from Biology Keystone exams. A total of 15 students were present for the second post-assessment. Of the 15 students, 1 did not attempt to answer the CRQ, 5 attempted the CRQ but scored 0 out of 3, 7 scored 1 out of 3, and 2 students

scored 2 out of 3. The mean score for the post-assessments was exactly 1.0 out of 3, and the

median score was also 1 out of 3. I also recorded in my journal that the students seemed to put more effort into the post-assessments, as the length of their responses increased.
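As with the pre-assessments, this mean can be checked from the tallies (again counting non-attempts as 0): the first post-assessment totals (7 × 1) + (5 × 2) + (2 × 3) = 23 points across 19 students, the second totals (7 × 1) + (2 × 2) = 11 points across 15 students, and (23 + 11) / (19 + 15) = 34 / 34 = 1.0 out of 3.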

Conclusions:

The results of the post-assessment were incredibly encouraging because the mean score

on the post-assessment was twice as high as the mean score on the pre-assessment. While this

study was not sufficient to conclusively state that my changes in teaching strategy were the cause

of this increase, the evidence seems to suggest that my original teaching strategies had likely been a limiting factor. The mean score may not be the best indicator of student performance because absences

likely caused the average score to decrease on the pre-assessment. That being said, there were 8

students who scored higher on the first post-assessment than on the first pre-assessment. This is a clear example of student improvement, and my teaching strategies were likely a major

contributing factor that impacted that improvement.

While it was encouraging to see an increase in scores because that indicates that students

improved in their ability to answer questions requiring higher-order thinking skills, it is also important to note that the length of responses increased on the post-assessments according to my classroom observations. I view the length of a response as an indicator of student effort. It was also true that fewer students refused to answer the CRQ on the post-assessments when compared to the pre-assessments, so the evidence suggests that both goals of the study were met. The data
suggests that students performed better on tasks requiring critical thinking skills and improved

with their effort on open-ended questions.

The goals of the study are important because engagement and a culture of learning are keys to academic success (Danielson, 2B and 3A). A focus on these goals led to positive results, so I should continue to focus on them throughout my career as

an educator. The improvement demonstrated in this study was significant, but many of my

students still struggle with higher-order thinking skills and CRQs. A score of 1 out of 3 is still

below the state average for CRQs, so there is room for further improvement. That being said,

checklists, peer-review, and competitive incentives are teaching strategies that I will replicate in

the future because my students have demonstrated their effectiveness during my time as their

student-teacher.
References

Danielson, C. (2013). The 2013 Framework for Teaching Evaluation Instrument.

Witkowski, P., & Cornell, T. (2017). A Model for Total Participation and Higher-Order Thinking. In Total Participation Techniques: Making Every Student an Active Learner, 2nd ed. (pp. 14–31). Association for Supervision & Curriculum Development.
