Using Student Response Systems to Improve Student Outcomes

Karen L. Mahon, Ed.D.
For me, this was a moment of revelation…for the first time in over 20 years of lecturing I knew…that over half the class didn't "get it"…Because I had already explained the phenomenon as clearly as I could, I simply asked the students to debate briefly with their neighbors and see who could convince whom about which answer was correct. The class erupted into animated conversation. After a few minutes, I asked for a revote, using a show of hands, and now over 90% gave the correct answer… (Wood, 2004)

Student response systems (SRSs) are described in current research as being instrumental to what is commonly known as "active learning" or "active student responding." A student response system is technology that "allows an instructor to present a question or problem to the class, allows students to enter their answers into some kind of device, and instantly aggregates and summarizes students' answers for the instructor" (Beatty, 2004). These systems, in one form or another, have always been part of education. Even the simple raising of hands is a student response system. As Caldwell (2007) points out, "The idea behind [SRS] is not new – teachers have used interactive, instructive questioning to teach students since at least the time of Socrates" (p.11).

Early methods were a bit more high-tech than raised hands, but not by much. One popular approach was using colored construction paper. Educators recognized that methods that allowed all students to select an answer simultaneously would increase the opportunities for every student to participate. This worked well for multiple-choice questions: Students raised the piece of paper with the color corresponding to their answer choice, and the teacher could see the array of responses easily.

Technology devices that allow individual responses to questions have been used since the 1960s. In the early days, these types of units were used to record audience responses to pilot television programs and movies. One of the earliest applications of the technology in education took place at Rice University (Lane & Atlas, 1996), where students in a computer-equipped classroom answered questions about how well they understood portions of the lecture. Results were tallied and displayed to the class. Since that time the technology has evolved (Cue, 1998), moving away from wired hardware to portable and wireless devices that work together with software, making integration of the process easier for those who use it.

We will endeavor to review the history of these devices, along with successful implementations of these devices that improve student learning outcomes, and endeavor to provide some guidance here about the pedagogical practices that have been identified as effective. It is important to remember, as Beatty and Gerace (2009) remind us, that we should not confuse the technology with the pedagogy as we consider the use of an SRS: "Like any tool, [an SRS] may be used in many possible ways for many possible ends." The use of the tool does not guarantee positive student outcomes: "Unfortunately, the majority of uses of technology in education consist of nothing more than a new implementation of old approaches…and therefore technology is not the magic bullet it is often presumed to be" (Mazur, 2009).

Current Status and Research

As of mid-2011, close to 30% of school districts in the United States had substantial implementations of SRSs, in a wide range of academic settings and a wide range of disciplines. With the advent of easier systems, the use of these tools has spread to a range of topic areas in education, including science, engineering, math, computer science, psychology, political science, business, law, communications, education, health professions, and more.

Although the currently available research on the use of SRSs is inconclusive, there is evidence that the use of SRSs is effective in improving student outcomes. Not only do students and teachers who use SRSs think they are fun and prefer using them over traditional, non-interactive lectures (Cutts et al., 2004; d'Inverno et al., 2003; Draper et al., 2002; Draper & Brown, 2004), but research suggests that daily use of an SRS contributes to increased attendance as well (Burnstein & Lederman, 2001; Caldwell, 2007), particularly when participation in interactive instruction is linked to grade incentives (Burnstein & Lederman, 2003).

In the instances where using an SRS in education has shown itself to be most effective, instructors appear to be taking the advice of Beatty (2004):

[An SRS] can be used to insert occasional audience questions into an otherwise traditional lecture, to quiz students for comprehension, or to keep them awake. These uses are a waste of the system's potential. To truly realize the benefits of [an SRS], an instructor must rethink her entire instructional model and the role class time plays within it and make [SRS] use an integral part of an organic whole. (p. 3-4)

We agree. Beatty and Gerace further point out:

We argue that tools should be evaluated on their affordances, whereas approaches and methodologies should be evaluated on their student impacts. In other words, don't ask what the learning gain from [SRS] use is; ask what pedagogical approaches an [SRS] can aid or enable or magnify, and what the learning impacts of those various approaches are. (p.147)
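The core loop Beatty (2004) describes, posing a question, collecting each student's answer, and instantly aggregating and summarizing the results, can be sketched in a few lines. The function and data below are hypothetical illustrations, not any vendor's API:

```python
from collections import Counter

def summarize_responses(responses):
    """Tally one SRS question's answers and return the percentage per choice.

    `responses` maps a student identifier to the chosen option (e.g., "A"-"D").
    """
    counts = Counter(responses.values())
    total = len(responses)
    return {choice: round(100 * n / total, 1) for choice, n in counts.items()}

# Example: ten students answer one multiple-choice question.
votes = {f"s{i}": answer for i, answer in enumerate("AABBBBCDBB")}
print(summarize_responses(votes))  # {'A': 20.0, 'B': 60.0, 'C': 10.0, 'D': 10.0}
```

The instructor sees only the distribution, which is exactly the summary a teacher would scan before deciding whether to reteach or move on.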
Peer Instruction

Eric Mazur's method of peer instruction (Mazur, 1997) involves regularly inserting multiple-choice questions into what he calls "strategic junctures" in Physics lessons. In the common scenario of asking the students a question, displaying the responses, holding class discussion, and then re-asking the question, Mazur (1996) reports, "The proportion of students that choose the correct answer always increases after the discussion." Mazur (2009) further shares, "Data obtained in my class and in classes of colleagues worldwide…show that learning gains nearly triple with an approach that focuses on the student and on interactive learning" (p.51). The assertion that this increase in student interaction and feedback to the student increases student understanding is supported by quantitative evidence from use in undergraduate science courses across multiple topics (e.g., Crouch & Mazur, 2001; Fagen et al., 2002; Hake, 1998; Hestenes et al., 1992; Knight & Wood, 2005; McDermott & Redish, 1999; Nicol & Boyle, 2003; Smith et al., 2009). Some feel that the availability of the SRS is what encourages the use of peer instruction (Burnstein & Lederman, 2001; Wood, 2004).

A word to the wise when using these methods, however, is provided by Perez et al. (2010): Be careful about showing graphs of voting results to the class. Perez et al. found that "if students saw the bar graph [with results from the first vote], they were 30% more likely to switch from a less common to the most common response." The shift toward the most common response occurred even when that most common response was incorrect, and the effect was more pronounced in true/false questions (38%) than in multiple-choice questions (28%). These results suggest that seeing the most common response to a question can bias a student's second vote.

Assessing-to-Learn (A2L)

In a separate effort, the University of Massachusetts Physics Education Research Group (UMPERG) developed a similar approach to using SRS in the classroom in 1993. This method, called Assessing-to-Learn, or A2L, included formative assessment explicitly. Similar to the peer instruction of Mazur, Dufresne's A2L has question cycles in which students read questions and then discuss those questions in small groups. The students are required to choose an answer, enter responses, and view the results. If a number of students answer a question incorrectly, the group is asked to stop the lesson, discuss the question and its topic among themselves, and then reanswer the question. Unlike Mazur's peer instruction, in A2L the question cycle is the core class activity; instruction from the teacher is only added as needed. Dufresne et al. (2000) argued that this practice "informs teachers about what students think, it informs students what their classmates think, and it informs individuals what they themselves think" (p.14).

Deliberate Practice

Perhaps the most interesting recent study examining the use of SRSs was conducted in 2011 at the University of British Columbia (Deslauriers et al., 2011). In this study, two groups of students were compared. Weeks 1-11 of the course were the baseline condition: Both the control and experimental groups received lectures from an experienced faculty member, which included instructor-led presentation of information interleaved with questions. In week 12, the control group continued to get the baseline treatment and the experimental group received the intervention. During the intervention, the experimental group was taught by a postdoctoral fellow who used instructional methods based in learning research. This method, known as "deliberate practice," required the students "to practice physicist-like reasoning and problem solving during class time while provided with frequent feedback" (p.862). Students did pre-reading outside of class, moving the "simple transfer of factual knowledge" outside of class, and spent class time working on activities and tasks, followed by discussion and problem solving, with feedback from peers and the instructor.

It is critical to note that both sections used SRSs. But the way in which those SRSs were implemented was very different. In the control section, SRS questions were used as summative evaluation; in the experimental section, SRS questions were used as in-class formative evaluations, and participation credit was given for submitting answers. Results from the week of the experiment showed no change in engagement or attendance in the control section. In the experimental section, however, student engagement nearly doubled and attendance increased by 20%. Moreover, more than twice the learning occurred in the experimental group versus the control group, with an effect size for the difference between the two sections of 2.5 standard deviations.

Why Does the Use of SRSs Improve Student Outcomes?

The compelling question about the use of SRSs is not whether it tends to improve student outcomes, but why it does so. Importantly, this assertion does not suggest that use of the SRS itself is what improves student outcomes. However, there are two likely contributors that have long been recognized as critical in the learning literature: increasing the opportunities for students to respond and the role of feedback in instruction.

Increasing Opportunities to Respond

Barnett (2006) points out that SRSs are "a tool that provides for interactivity." A number of strategies that increase the frequency of active student responding have demonstrated improvement in academic achievement. These include class-wide peer tutoring (Cooke et al., 1983; Delquadri et al., 1986), choral responding (Heward et al., 1989; Sindelar et al., 1986), use of response cards (Heward et al., 1996; Munro & Stephenson, 2009; Narayan et al., 1990), computer-assisted instruction (Balajthy, 1984; Stallard, 1982; Tudor & Bostow, 1991), flashcards, timed trials (Van Houten et al., 1974; Van Houten & Thompson, 1976), self-directed learning (Kosiewicz et al., 1982), and guided lecture notes (Kline, 1986; Lovitt et al., 1985; Yang, 1988). In all cases, the strategy is the same: increase active student responding. It is only the tactic used to increase the responding that varies.
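The 2.5-standard-deviation figure reported for the Deslauriers et al. (2011) study is an effect size: the difference between the two sections' mean scores divided by their pooled standard deviation (Cohen's d). The scores below are invented for illustration only; they are not the study's data:

```python
import statistics

def cohens_d(group_a, group_b):
    """Effect size: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Invented quiz scores for an experimental and a control section.
experimental = [9, 8, 9, 7, 8, 9, 8]
control = [5, 6, 4, 5, 6, 5, 5]
print(round(cohens_d(experimental, control), 2))
```

An effect size of 2.5 means the average student in the experimental section scored two and a half pooled standard deviations above the control-section mean, which is an unusually large difference for an educational intervention.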
Role of Feedback in Instruction

The performance data that are generated by active student responding provide the opportunity for feedback to both the students and the instructors. Barnett (2006) points out that "one of its major attractive qualities is the provision of swift feedback to students." The fact that feedback changes behavior has been accepted in education for years, and is now part of common wisdom. The learning research literature addressing the important role of feedback in behavior change goes back some 40 years and spans a large number of areas, such as energy consumption (Bekker et al., 2010; Seaver & Patterson, 1976), infection-control procedures (Babcock et al., 1992), sports skills (Boyer et al., 2009; Brobst & Ward, 2002; Smith & Ward, 2006), flight training (Rantz et al., 2009; Rantz & Van Houten, 2011), teacher behavior (Cossairt et al., 1973; Harris et al., 1975), and student academic achievement (Fink & Carnine, 1975; Reichow & Wolery, 2011; Trap et al., 1978; Van Houten et al., 1974).

The quantity of data provided by increased active student responding gives the instructor more awareness of student problems (Johnson & McLeod, 2004; Roschelle et al., 2004a, 2004b), and therefore more possible occasions to modify the instruction to address the learners' needs. Increasing opportunities to evaluate student performance allows real-time adjustment of instruction, diagnosis of any areas of difficulty for individual learners, and planning to address these difficulties.

Benefits of High Tech Versus Low Tech

It is clear that there is nothing about using an automated SRS that is fundamentally better or more effective than using a lower-tech method, such as flashcards (Lasry, 2008), just as it is clear that numerous methods allow for increasing active student responding and the associated feedback opportunities. Judson and Sawada (2002) point out that modern devices have changed little from lower-tech approaches, except for the display of the students' answers and the ease of the record-keeping. So if high-tech solutions are not any more effective than low-tech solutions, why spend the money on the new devices? The simple answer is that there are advantages to using a high-tech SRS that cannot be accomplished with lower-tech approaches. We explore four benefits of using a high-tech SRS here.

The most important benefit is that using a high-tech SRS allows data to be collected automatically, saving the student response data to an onboard gradebook in the software and allowing evaluation of each student's performance. These data can be examined later by the teacher. Not only is automatic data collection impossible with low-tech solutions, but manually recording responses from a low-tech solution is too labor-intensive and time-consuming to be amenable to the classroom. Moreover, there is no easy method for evaluating the data if they are collected manually.

The second important advantage of using a high-tech SRS is that the ease of implementing the tool and the convenience of collecting the data make it more likely that teachers will design and use activities that offer high numbers of student response opportunities in the classroom. Designing such activities can be time-consuming, and if using those activities leads only to difficult data collection and management, the appeal of these activities is likely to be low.

Third, a high-tech SRS allows for immediate feedback to every learner. Unlike the lower-tech approaches, in which one or several students may receive feedback from the teacher, with a high-tech SRS each and every student may receive immediate feedback directly from the device itself. A methodology that allows feedback to be delivered automatically and on an individual basis makes its implementation much easier for classroom teachers.

Finally, Barnett suggests that the privacy that a high-tech SRS allows a normally shy or reticent student increases the likelihood that such a student will participate. A lower-tech approach requires that learners perform a publicly-observable action, such as saying an answer aloud or raising a hand. Using a high-tech SRS allows each learner to respond to the device without being observed directly by his or her peers, so the student may respond without fear of embarrassment when an incorrect choice is selected. The teacher may still see the individual responses of each learner through the data recorded in the gradebook.

Conclusion

Student Response Systems (SRS) can be instrumental in ensuring that students are engaged and in providing continuing formative evaluation of how well students are learning the material. SRS are flexible and can be used with large groups all at once, with small groups working collaboratively, or with individual students. When used with a carefully designed pedagogy, SRS can provide immediate observable outcomes in student performance.

The following are selected best practices for helping to ensure the successful use of SRS. These guidelines have proven useful when using SRS in the classroom:

1. Remember that the primary use of SRS should be for formative assessment.
2. Include only those questions that are pertinent to the targeted student learning outcomes; questions that are arbitrary or irrelevant should not be used.
3. Integrate questions throughout the lesson so that student understanding can be evaluated frequently and regularly. Leaving all questions until the end of the lesson does not allow for changing the instruction along the way.
4. When working on new skill acquisition, include enough questions with novel examples to ensure that students are getting sufficient practice and generalization opportunities.
5. Multiple-choice questions are not restricted to low-level skills. If written properly, they can do more; endeavor to write questions that target some of the higher-level skills described by Bloom's Taxonomy (Pear et al., 2001).
6. Be careful not to give away an answer through irrelevant cues, such as a pattern of correct answers or the wording of the question.
7. Ensure that the correct answer is clearly the best one. The point is not to trick the learners, but do use plausible distracters.
8. If you include items in which the student must identify the answer that does NOT belong, write the word "NOT" in all capital letters and in bold, so that it is as clear as possible.
9. Be willing to throw out or regrade questions that are unclear. The point is to use the questions to evaluate the instruction the learners have received.
10. If you want to increase attendance in your class, use the SRS daily.
11. When using Vote-Discuss-ReVote methods in class, do not show graphs of student response distribution following the first vote, in order to avoid biased response shifting.
12. Make sure you review and analyze the data after the class is over. By examining the patterns of what worked and what did not, you can improve the instruction for next time!
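Reviewing the recorded responses after class, as the last guideline recommends, amounts to simple aggregation over the gradebook data. The sketch below is hypothetical (it does not reflect any particular SRS vendor's data format), but it shows the kind of per-student summary an onboard gradebook makes cheap:

```python
def per_student_accuracy(gradebook, answer_key):
    """Percent correct per student from an SRS-style gradebook.

    `gradebook` maps student -> {question_id: response};
    `answer_key` maps question_id -> correct option.
    """
    report = {}
    for student, responses in gradebook.items():
        correct = sum(1 for q, r in responses.items() if answer_key.get(q) == r)
        report[student] = round(100 * correct / len(answer_key), 1)
    return report

# Hypothetical three-question lesson with two students.
key = {"q1": "A", "q2": "C", "q3": "B"}
book = {
    "ana": {"q1": "A", "q2": "C", "q3": "B"},
    "ben": {"q1": "A", "q2": "B", "q3": "B"},
}
print(per_student_accuracy(book, key))  # {'ana': 100.0, 'ben': 66.7}
```

The same loop, grouped by question instead of by student, identifies which questions most of the class missed, which is the diagnostic signal used to adjust the next lesson.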
Dr. Karen Mahon Information

Dr. Karen Mahon is a Learning Scientist and Educational Psychologist. She advises education technology companies in instructional design and digitized content practices. She is committed to instruction that produces meaningful and measurable student learning outcomes and has dedicated her career to helping kids and their teachers. Dr. Mahon blogs at www.KarenMahon.com and can be found on Twitter @KarenLMahon.

To learn about the MimioVote assessment system, visit mimio.com/SRS

References

1. Babcock, R., Sulzer-Azaroff, B., Sanderson, M., & Scibak, J. (1992). Increasing nurses' use of feedback to promote infection-control practices in a head-injury treatment center. Journal of Applied Behavior Analysis, 25(3), 621-627.
2. Balajthy, E. (1984). Using student-constructed questions to encourage active reading. Journal of Reading, 27(5), 408-411.
3. Barnett, J. (2006). Implementation of personal response units in very large lecture classes: Student perceptions. Australasian Journal of Educational Technology, 22(4), 474-494.
4. Beatty, I. (2004). Transforming student learning with classroom communication systems. Educause Center for Applied Research, Research Bulletin, 3, 1-13.
5. Beatty, I.D., & Gerace, W.J. (2009). Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology. Journal of Science Education Technology, 18, 146-162.
6. Bekker, M., Cumming, T., Osborne, N., Bruining, A., McClean, J., & Leland, L. (2010). Encouraging electricity savings in a university residential hall through a combination of feedback, visual prompts, and incentives. Journal of Applied Behavior Analysis, 43(2), 327-331.
7. Boyer, E., Miltenberger, R., Batsche, C., & Fogel, V. (2009). Video modeling by experts with video feedback to enhance gymnastics skills. Journal of Applied Behavior Analysis, 42(4), 855-860.
8. Brobst, B., & Ward, P. (2002). Effects of public posting, goal setting, and oral feedback on the skills of female soccer players. Journal of Applied Behavior Analysis, 35(3), 247-257.
9. Burnstein, R., & Lederman, L. (2001). Using wireless keypads in lecture classes. The Physics Teacher, 39, 8-11.
10. Burnstein, R., & Lederman, L. (2003). Comparison of different commercial wireless keypad systems. The Physics Teacher, 41, 272-275.
11. Caldwell, J., Zelkowski, J., & Butler, M. (2006). Using personal response systems in the classroom. WVU Technology Symposium, Morgantown, WV, April 11, 2006.
12. Caldwell, J.E. (2007). Clickers in the large classroom: Current research and best-practice tips. Life Sciences Education, 6(1), 9-20.
13. Cooke, N.L., Heron, T.E., & Heward, W.L. (1983). Peer tutoring: Implementing classwide programs in the primary grades. Columbus, OH: Special Press.
14. Cossairt, A., Hall, R.V., & Hopkins, B.L. (1973). The effects of experimenter's instructions, feedback, and praise on teacher praise and student attending behavior. Journal of Applied Behavior Analysis, 6(1), 89-100.
15. Crouch, C.H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970-977.
16. Cue, N. (1998). A universal learning tool for classrooms? Proceedings of the "First Quality in Teaching and Learning Conference," Hong Kong SAR, China, December 10-12, 1998.
17. Cutts, Q., Kennedy, G., Mitchell, C., & Draper, S. (2004). Maximizing dialogue in lectures using group response systems. Presented at the 7th IASTED International Conference on Computers and Advanced Technology in Education, Hawaii, August 16-18, 2004. www.dcs.gla.ac.uk/~quintin/papers/cate2004.pdf [accessed 30 January, 2012].
18. Delquadri, J., Greenwood, C.R., Whorton, D., Carta, J.J., & Hall, R.V. (1986). Classwide peer tutoring. Exceptional Children, 52, 535-542.
19. Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332, 862-864.
20. d'Inverno, R., Davis, H., & White, S. (2003). Using a personal response system for promoting student interaction. Teaching Mathematics and Its Applications, 22(4), 163-169.
21. Draper, S.W., & Brown, M.I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20(2), 81-94.
22. Draper, S., Cargill, J., & Cutts, Q. (2002). Electronically enhanced classroom interaction. Australian Journal of Educational Technology, 18(1), 13-23.
23. Dufresne, R., Gerace, W., Mestre, J., & Leonard, W. (2000). ASK-IT/A2L: Assessing student knowledge with instructional technology (Tech. Rep. dufresne-2000ask). University of Massachusetts Amherst Scientific Reasoning Research Institute.
24. Fagen, A.P., Crouch, C.H., & Mazur, E. (2002). Peer instruction: Results from a range of classrooms. The Physics Teacher, 40(4), 206-207.
25. Fink, W.T., & Carnine, D.W. (1975). Control of arithmetic errors using informational feedback and graphing. Journal of Applied Behavior Analysis, 8, 461.
26. Hake, R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64-74.
27. Harris, V.W., Bushell, D., Jr., Sherman, J.A., & Kane, J.F. (1975). Instructions, feedback, praise, bonus payments, and teacher behavior. Journal of Applied Behavior Analysis, 8, 462.
28. Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141-158.
29. Heward, W.L., Courson, F.H., & Narayan, J.S. (1989). Using choral responding to increase active student response during group instruction. Teaching Exceptional Children, 21(3), 72-75.
30. Heward, W.L., Gardner, R., Cavanaugh, R.A., Courson, F.H., Grossi, T.A., & Barbetta, P.M. (1996, Winter). Everyone participates in this class. Teaching Exceptional Children, 28(2), 4-10.
31. Johnson, D., & McLeod, S. (2004). Get answers: Using student response systems to see students' thinking. Learning & Leading With Technology, 32(3).
32. Judson, E., & Sawada, D. (2002). Learning from past and present: Electronic response systems in college lecture halls. Journal of Computers in Mathematics and Science Teaching, 21(2), 167-182.
33. Kline, C. (1986). Effects of guided notes on academic achievement of learning disabled high school students. Unpublished master's thesis, The Ohio State University, Columbus.
34. Knight, J., & Wood, W. (2005). Teaching more by lecturing less. Cell Biology Education, 4, 298-310.
35. Kosiewicz, M.M., Hallahan, D.P., Lloyd, J.W., & Graves, A.W. (1982). Effects of self-instruction and self-correction procedures on handwriting performance. Learning Disability Quarterly, 5, 71-78.
36. Lane, D., & Atlas, R. (1996). The networked classroom. Paper presented at the 1996 meeting of Computers and Psychology, York, UK.
37. Lasry, N. (2008). Clickers or flashcards: Is there really a difference? The Physics Teacher, 46, 242-244.
38. Lovitt, T., Rudsit, J., Jenkins, J., Pious, C., & Benedetti, D. (1985). Two methods of adapting science material for learning disabled and regular seventh graders. Learning Disabilities Quarterly, 8, 275-285.
39. Martin, T.L., Pear, J.J., & Martin, G.L. (2002). Feedback and its effectiveness in a computer-aided personalized system of instruction course. Journal of Applied Behavior Analysis, 35(4), 427-430.
40. Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.
41. Mazur, E. (2009). Farewell, lecture? Science, 323, 50-51.
42. McDermott, L.C., & Redish, E.F. (1999). Resource letter PER-1: Physics education research. American Journal of Physics, 67(9), 755-767.
43. Munro, D.W., & Stephenson, J. (2009). The effects of response cards on student and teacher behavior during vocabulary instruction. Journal of Applied Behavior Analysis, 42, 795-800.
44. Narayan, J.S., Heward, W.L., Gardner, R., Courson, F.H., & Omness, C.K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23(4), 483-490.
45. Nicol, D.J., & Boyle, J.T. (2003). Peer instruction versus class-wide discussion in large classes: A comparison of two interaction methods in the wired classroom. Studies in Higher Education, 28(4), 457-473.
46. Pear, J.J., Crone-Todd, D.E., Wirth, K., & Simister, H. (2001). Assessment of thinking levels in students' answers. Academic Exchange Quarterly, 5(4), 94-98.
47. Perez, K.E., Strauss, E.A., Downey, N., Galbraith, A., Jeanne, R., & Cooper, S. (2010). Does displaying the class results affect student discussion during peer instruction? CBE Life Sciences Education, 9, 133-140.
48. Rantz, W., Dickinson, A., Sinclair, G., & Van Houten, R. (2009). The effect of feedback on the accuracy of checklist completion during instrument flight training. Journal of Applied Behavior Analysis, 42, 497-509.
49. Rantz, W., & Van Houten, R. (2011). A feedback intervention to increase digital and paper checklist performance in technically advanced aircraft simulation. Journal of Applied Behavior Analysis, 44(1), 145-150.
50. Reichow, B., & Wolery, M. (2011). Comparison of progressive prompt delay with and without instructive feedback. Journal of Applied Behavior Analysis, 44, 327-340.
51. Roschelle, J., Penuel, W.R., & Abrahamson, L. (2004a). Classroom response and communication systems: Research review and theory. Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.
52. Roschelle, J., Abrahamson, L., & Penuel, W.R. (2004b). Integrating classroom network technology and learning theory to improve classroom science learning: A literature synthesis. Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.
53. Seaver, W.B., & Patterson, A.H. (1976). Decreasing fuel-oil consumption through feedback and social commendation. Journal of Applied Behavior Analysis, 9(2), 147-152.
54. Sindelar, P.T., Bursuck, W.D., & Halle, J.W. (1986). The effects of two variations of teacher questioning on student performance. Education and Treatment of Children, 9(1), 56-66.
55. Smith, M.K., Wood, W.B., Adams, W.K., Wieman, C., Knight, J.K., Guild, N., & Su, T.T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323, 122-124.
56. Smith, S.L., & Ward, P. (2006). Behavioral interventions to improve performance in collegiate football. Journal of Applied Behavior Analysis, 39, 385-391.
57. Stallard, C.K. (1982). Computers and education for exceptional children: Emerging applications. Exceptional Children, 49(2), 102-104.
58. Trap, J.J., Milner-Davis, P., Joseph, S., & Cooper, J.O. (1978). The effects of feedback and consequences on transitional cursive letter formation. Journal of Applied Behavior Analysis, 11, 381-393.
59. Tudor, R.M., & Bostow, D.E. (1991). Computer-programmed instruction: The relation of required interaction to practical application. Journal of Applied Behavior Analysis, 24(2), 361-368.
60. Van Houten, R., Morrison, E., Jarvis, R., & McDonald, M. (1974). The effects of explicit timing and feedback on compositional response rate in elementary school children. Journal of Applied Behavior Analysis, 7, 547-555.
61. Van Houten, R., & Thompson, C. (1976). The effects of explicit timing on math performance. Journal of Applied Behavior Analysis, 9(2), 227-230.
62. Wood, W.B. (2004). Clickers: A teaching gimmick that works. Developmental Cell, 7(6), 796-798.
63. Yang (1988). Effects of guided lecture notes on sixth graders' scores on daily science quizzes. Unpublished master's thesis, The Ohio State University, Columbus.

©2012 DYMO, a Newell Rubbermaid company. www.dymo.com