
Wang, Y., Chung, C., & Yang, L. (2014). Using Clickers to Enhance Student Learning in Mathematics. International Education Studies, 7(10), 1-13.

Title: “Using Clickers to Enhance Student Learning in Mathematics”

Authors: Ye Wang, Chia-Jung Chung, & Lihua Yang

Summary of Literature Review:

The literature review defined Clickers as a type of student response system: small remotes that students use to answer questions. The research also focused on the benefits of Clickers and on how to integrate them correctly in the classroom. The benefits noted include the ability to assess student learning immediately and provide instant feedback. Wang, Chung, and Yang (2014) state, “This technology allows teachers to review individual scores and re-teach mathematics concepts where students are struggling” (p. 3). The literature also stated that the anonymity of student responses made students feel less vulnerable and more eager to respond to questions: “In math class, many students are hesitant to respond to an answer until they know how others will respond” (Wang et al., 2014, p. 3). Finally, the literature review illuminates the importance of implementing Clickers with well-developed questions that identify misconceptions.

Analysis of Methodology:

The participants consisted of 47 geometry students. These students were in two different classes and ranged from grades 8-12 (but all were taking the same course). Pre- and post-tests were used to compare a control group to a group that used student response systems. The student response system was used in daily lessons to respond to questions from homework, quizzes, and warm-ups. Data were obtained from class-work completion rates, informal assessments, participation, summative assessment data, and the teacher’s observation journal.

Summary of Results:

The study found that students in the trial group performed better on post-tests after using the student response systems in the classroom. The study also observed that students were more willing to participate with the student response systems than without them. The data showed that all of the warm-up and homework-quiz class averages were higher for the students using the student response systems. The results on the summative tests and the benchmark assessment also showed that the trial group performed better.

My Opinion:

Overall, the study had clearly defined trial and control groups with clear results. In my opinion, this study would benefit from being conducted on a larger scale with more students involved. When looking at the breakdowns of the trial group and control group, there seemed to be some differences in the student population between the two groups. Setting up this study to involve more classes could help even the balance of students. The results were also given as class averages, so we don’t know whether one particular student pulled the average up or down. With such a small sample population, factors such as private tutors, after-hours help sessions, or classroom discussion leaders could impact the results of the study.

Lynch, L. A. (2013, January 1). The Effects of Clickers on Math Achievement in 11th Grade Mathematics. ProQuest.


Title: The Effects of Clickers on Math Achievement in 11th Grade Mathematics

Author: Lycinda Lynch

Summary of Literature Review:

Lynch (2013) defined student response systems as “wireless devices that create interactivity between teacher and student” (p. 3). Lynch described student participation in a classroom that does not use student response systems as “spotty,” meaning only a small percentage of students are willing to participate or ask questions. In a classroom that uses student response systems, or clickers, the teacher is able to check students’ understanding throughout the entire lesson. In the literature review, Lynch focused on research promoting student-teacher interaction and active engagement. Clicker systems allow not only the teacher to gather data, but also the students to self-reflect on their understanding, giving immediate feedback to both teacher and student. According to Lynch’s literature review, several studies have been conducted on the use of Clickers, but they were limited to studying the effects student response systems have on large college-level courses.

Analysis of Methodology:

Lynch gathered data from 61 students taking an 11th grade mathematics class in rural Northwest Georgia. All 61 students took the same pretest and posttest and had the same teacher. In the experiment, one class used the student response system every day for 4 weeks, while the other class did not use the technology. Lynch used ANCOVA (analysis of covariance) to assess the differences between the two groups of students and statistically control for pre-existing differences.

Summary of Results:

Lynch found no significant difference in the posttest scores between the class that used student response systems and the class without them. According to the data, the control group scored a mean of 24.82 on the pretest and 46.55 on the posttest, while the treatment group scored 25.43 on the pretest and 43.29 on the posttest.
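The reported means can be turned into raw gain scores with simple arithmetic. Note that ANCOVA, as Lynch used it, adjusts posttest scores for pretest differences rather than simply comparing raw gains, so the sketch below (using only the means quoted above) is a rough illustration of the direction of the result, not a reproduction of the analysis:

```python
# Recomputing raw mean gains from the pretest/posttest means Lynch reports.
control = {"pre": 24.82, "post": 46.55}
treatment = {"pre": 25.43, "post": 43.29}

def mean_gain(group):
    """Raw gain = posttest mean minus pretest mean, rounded to 2 decimals."""
    return round(group["post"] - group["pre"], 2)

control_gain = mean_gain(control)      # 21.73
treatment_gain = mean_gain(treatment)  # 17.86
print(control_gain, treatment_gain)
```

The raw gain actually favors the control group slightly, which is consistent with the finding of no significant advantage for the clicker class.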

My Opinion:

The study was limited to only 61 students in two classes. I also found the study limited by only using the technology for 4 weeks and analyzing the results from a single test. In my opinion, I appreciated that the same teacher taught both the control group and the treatment group, but in order to get more reliable results, a larger sample size is needed across a variety of schools.
Dunham, V. K. (2011, January 1). The Impact of a Student Response System on Academic Performance. ProQuest.


Title: The Impact of a Student Response System on Academic Performance

Author: Victoria Dunham

Summary of Literature Review:

Dunham focused the first section of her literature review on No Child Left Behind in order to establish the importance of the CRCT scores. No Child Left Behind emphasizes stronger accountability for results and encourages closing the achievement gap by using proven teaching methods.

In the literature review, Dunham (2011) also described research supporting the use of student response systems as technology that “enhanced critical thinking and the ability to utilize knowledge” (p. 2). Student response systems are used to engage students and provide immediate feedback. Other stated benefits were increased participation, tracking student progress, reducing paperwork, and using students’ misconceptions to create class discussion. Dunham also stated that this generation has grown up in a world of technology and has a need for instant gratification.

Analysis of Methodology:

This study consisted of 97 7th grade students picked from four math classes. The school is located in the 2nd fastest growing county in Georgia, which was at the time the top-ranked county in the state. Two classes were assigned the student response system, ACTIVote, while the other two classes did not use the technology. Dunham (2011) compared the Criterion-Referenced Competency Test (CRCT) results from the prior year and the year after using the technology. Dunham focused on four specific units: integers, rational numbers, linear equations, and constructions. She analyzed pre- and posttest data from these four units as well as the CRCT scores.

Summary of Results:

This study found that the use of a student response system made no significant difference in the Criterion-Referenced Competency Test scores of 7th grade math students. There was also no significant difference in the posttest scores. Dunham (2011) stated in the summary that, “even if the difference between the scores of users and non-users was statistically insignificant the mere fact that students enjoyed using them made them a viable tool for the classroom teacher” (p. 60).

My Opinion:

It was valuable that this study included four classes, with two control groups and two trial groups. To improve this research, a description of how the teacher used the student response systems and the frequency of their use would be needed.

Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education, 94, 102-119. doi:10.1016/j.compedu.2015.11.013
Title: A Meta-Analysis of the Effects of Audience Response Systems (Clicker-based

Technologies) on Cognition and Affect

Authors: Nathaniel Hunsu, Olusola Adesope, Dan James Bayly

Summary of Literature Review:

The purpose of this meta-analysis is to resolve the conflicting findings of previous researchers on the effects of Audience Response Systems (ARS) in classrooms. The authors noted the increasing popularity of ARS in classrooms as more technology has become available that allows instructors to easily implement it with their classes. They noted that while teachers have always engaged in the “practice of punctuating lecture periods with questions” (p. 103), ARS allows them to get better and more responses than by just having students raise their hands. They also mentioned that teachers use ARS for other reasons, such as taking attendance, improving interaction among students, facilitating discussion, and encouraging students to be more invested in their answers.

Hunsu, Adesope, and Bayly (2016) seek to answer several questions in their analysis. First of all, they wanted to know how effective clicker-based technologies are. Second, they wanted to know what they are most useful for, and who benefits most from their use (p. 105). They note that previous studies produced mixed results on the effectiveness of using ARS in the classroom. They suggested that some of the studies did not control for factors outside of the actual use of clicker-based technology, and that some of the studies involved vastly different learning environments. The intended outcome of the meta-analysis is to review the published experimental studies on the effects of clicker use and to estimate the effects of their use.
Analysis of Methodology:

Eligible studies had to meet a list of criteria in order to be included in the meta-analysis. The initial search produced 2,269 articles, and in the end only fifty-three met all of the requirements for eligibility. Hunsu et al. extracted thirty-one study variables from the articles and organized them into seven sections: reference information, study research questions, sample information, conditions and treatments, research design, dependent variable, and effect size computation (p. 107). They then coded the studies as high or low for fidelity, confidence, and methodological quality (p. 107).
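Meta-analyses of this kind typically express each study’s result as a standardized mean difference; a common choice is Hedges’ g, which applies a small-sample bias correction to Cohen’s d. The sketch below uses made-up means, standard deviations, and group sizes (none of these numbers are from the paper) to show how one such effect size is computed:

```python
import math

def hedges_g(m_treat, m_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardized mean difference (Cohen's d) with Hedges' small-sample correction."""
    # Pool the two groups' variances, weighted by degrees of freedom.
    pooled_var = ((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2) / (n_treat + n_ctrl - 2)
    d = (m_treat - m_ctrl) / math.sqrt(pooled_var)
    # Approximate correction for upward bias in small samples.
    correction = 1 - 3 / (4 * (n_treat + n_ctrl) - 9)
    return d * correction

# Hypothetical example: clicker class averages 75 vs. 70 for the control,
# both with SD 10 and 30 students per group.
g = hedges_g(75.0, 70.0, 10.0, 10.0, 30, 30)
print(round(g, 3))  # → 0.494, a small-to-moderate positive effect
```

Once each eligible study is reduced to an effect size like this, the meta-analysis can pool them and compare effects across moderators such as class size and outcome type.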

Summary of Results:

The analysis found small positive effects on cognitive outcomes in the measures of knowledge transfer and final achievement for the clicker group versus the non-clicker group. The largest positive effects were on non-cognitive outcomes such as learner self-efficacy, engagement, and participation. No significant difference in retention and recall was found between the two groups. Hunsu et al. noted that the potential benefits of the clickers diminished as class size increased (p. 111). They also noted that clickers had a more positive outcome in studies in which pre-tests were given before the clickers were used.

My Opinion:

The meta-analysis took great care to include only studies that met a set of specific criteria, and therefore the results seem valid. I found it very interesting that the clickers did not produce any statistically significant benefits for retention, as I have seen many classes use these types of response systems for questions involving rote memorization. The findings of this analysis suggest that clickers may have a small positive impact on higher-order thinking such as knowledge transfer.

Rigdon, J. (2010). The Effect of Student Response Systems in Middle School Math Classrooms (Doctoral dissertation, Walden University, 2010) (pp. 4-77). Ann Arbor, MI: ProQuest.

Title: The Effect of Student Response Systems in Middle School Math Classrooms

Author: Jason Rigdon

Summary of Literature Review:

Rigdon (2010) emphasized using technology as a means of engaging students. He framed the problem as teachers having to compete with video games and the internet for students’ attention (p. 4). Rigdon stated that his study was framed around John Dewey’s constructivist theory that learners must be actively involved in their learning. He suggested that the use of wireless electronic student response systems could help students become actively involved. Rigdon posed two questions that served as the basis for his study. The first question was whether student response systems affected students’ attitudes toward learning. The second question was about the possible effect of student response systems on learning in the middle school math classroom.

Analysis of Methodology:

Rigdon conducted his study at a rural middle school in southeast Georgia. The study included three 8th grade math classrooms, all taught by the same teacher. Two of the classes would use Qwizdom, a wireless student response system; the other classroom would not. There were 56 students in the treatment classrooms and 25 students in the control classroom. The study included surveys, observations, interviews, and evaluations of pretests and posttests. All 56 students in the treatment group were surveyed about the student response systems. Rigdon did observations in all three classrooms. Eight students were selected by Rigdon and the classroom teacher for interviews. Pretests and posttests were evaluated from all students in all three classrooms.

Summary of Results:

Rigdon’s surveys indicated that most students enjoyed using the student response system and felt that it helped them learn. The interviews with the eight students chosen by Rigdon and the classroom teacher were overwhelmingly positive in support of the use of student response systems. Rigdon’s classroom observations also revealed that more students were engaged in the lesson while using the student response system. Therefore, Rigdon concluded that the use of student response systems did lead to more positive attitudes toward learning among students, which in turn would help them learn more. “Student response systems provide immediate feedback and reinforcement which leads to a deeper understanding of the topic” (p.


Rigdon also evaluated pretest and posttest data and concluded that the use of student

response systems had no significant effect on student learning outcomes.

My Opinion:

I think that Rigdon glossed over the second of his two research questions, and did not provide much information about his evaluation of the classroom pretest and posttest scores. In fact, the majority of his conclusion focused on his findings that student response systems do affect students’ attitudes toward learning. I am not sure if he was disappointed that his study did not show a major difference in learning outcomes, but he certainly did not discuss it much. I appreciated the fact that he owned up to bringing some of his own personal bias to the study. He admitted that as an instructional technology specialist he had been very excited about purchasing the student response systems and seeing them utilized in the classroom. I also would have liked for him to have interviewed students at random instead of hand-selecting the eight students. I think this might have given a wider variety of answers about the student response systems. The students that he did interview did not elaborate much beyond saying that they really liked the student response systems.