Action Research Report:

Using Google Slides as an Alternative Medium of Assessment

Joseph Scogin

Coastal Carolina University

EDIT 677: Assessment Technology and Learning Analytics

Dr. Joe Winslow & Dr. Corey Lee

May 1st, 2022



Introduction

Being fresh out of college with a degree in elementary education, I am always looking for ways and ideas to make my future classroom run efficiently, not only in instruction but also in assessing whether that instruction was valuable or lacking. As of writing this report, I do not have a formal teaching job lined up, but I aspire to one day run an elementary classroom that fosters my students' passion for learning. Not having a classroom of my own did initially present a challenge when the time came to start collecting data and completing my research. Fortunately, one of my close friends and a former peer in the Spadoni College of Education at Coastal Carolina was open to working with me on this exciting endeavor. Mrs. Grace Batten is a 2nd grade teacher at Pee Dee Elementary School in Horry County. Because I do not have a formal classroom setting with learners of my own, she was willing to facilitate my research and assist me in collecting and administering the assessments and activities necessary to complete this action research report.

As unfortunate and devastating as the Coronavirus pandemic has been for the world, it has inadvertently caused every branch of education to take a closer look at how it does things. College professors posted lectures online, high schools gave students their own Chromebooks to attend classes online, and elementary students sat in one spot in their houses attending

video chats with their class to complete their school instruction and work. Now, the days of

distance learning may be behind us, but its impact is still being felt, and a paradigm is shifting in the world of education. During distance learning, teachers were not able to collect paper assignments from students, and this posed a problem for many educators when the time came to assess student progress and understanding. These issues made me wonder whether the teachers who were facing these dilemmas changed or evolved their methods of

assessing their students. My thinking shifted into the realm of which medium worked best for student assessment: paper and pencil quizzes and tests, or digital assessments taken on some form of multimedia device. My curiosity was an excellent place to start with this project, and I began working with Mrs. Batten, who allowed me to work alongside her in a unit of math

instruction. This unit was about fractions and parts of a whole. In order to assist her in the best way, I asked if she already used a form of digital assessment. She told me that she gives weekly assignments that the students submit by filling out their own personal Google Slides presentations, and that if I could continue this method of assessment, there would be fewer complications for her in administering any tests or assessments that we gave them. So, my question shifted into the final research topic that we worked on this semester: Does using Google Slides as a medium for assessing students yield better results than paper and pencil assessments?

Review of Literature

Since my original line of questioning pertained to the general differences between concrete and digital assessments, my review of the academic literature begins along those same lines.

Paleczek, Seifert, and Schöfl (2021) designed a study for a group of 85 Austrian

kindergarteners. They digitalized an assessment on receptive vocabulary knowledge. This study

was designed to examine whether the digital assessment met the same level of quality as the paper one, as well as whether students and instructors preferred the digital or the paper version. The study found that the digital assessment held up to the quality of the paper assessment and was preferred by both the students and the instructors: the students claimed that the digital test was easier, and the instructors preferred it for its convenience of administration.

Klein (2019), along with assistance from master’s-level and doctoral-level graduate

students in the School Psychology program at St. John’s University, administered the Wechsler

Intelligence Scale for Children (WISC-V) in order to examine possible scoring errors associated with each administration method. They created a digital version of the WISC-V to go alongside the paper and pencil version so that they could draw the comparison. Since this study was about the scoring of the assessments, they gave both versions of the WISC-V to a research assistant to fill out from a script. The study concluded that the digital scoring system had a lower mean error rate than its paper counterpart.

Garofalo and Farenga (2021) compared digital and physical models of DNA molecules in order to gauge which one better enhances spatial ability. To do so, they used a sample of 132 students in a tenth-grade science class at an all-male school. Designing a course of study reliant on the different models of DNA, they gave two identical sets of instruction to two groups. The results from the assessments of both groups indicated greater 3-D representational and conceptual understanding when the students constructed the paper models.

Dieck-Assad (2018) set out to test whether a specific technological resource, Microsoft

OneNote Class Notebook (MONCN), could be used to promote sustainable development and

make students into leaders. They used this resource and conducted this experiment in

undergraduate courses at Tecnológico de Monterrey in Mexico. After using MONCN throughout the courses and running classes incorporating it, they concluded that their hypothesis was correct: the resource was a positive influence and promoted efficient learning in those courses.

Based on the research done and the articles that were consulted, it is my hypothesis that

the students will score better on the online variants of the assessment, meaning that the online manipulative assessment will be the more effective test.

Methodology

In order to complete this action research project, as stated above, I had to rely on my friend Mrs. Batten at Pee Dee Elementary in Horry County, where she is a first-year teacher of a 2nd grade class. After discussing with her, we decided that the best course of action was to find a suitable academic subject to form this project around. At Mrs. Batten's request, we chose a unit in mathematics; the unit we chose was on fractions and parts of a whole.

When designing the layout for how we would complete this report, we chose to give two separate pre-tests, have the students complete the unit, and then have them complete two separate post-tests. Mrs. Batten had the pre-tests already made, so she sent them to me to transfer into Google Slides. Once the Google Slides were created, Mrs. Batten administered both versions of the pre-test, taught the unit over the course of two and a half weeks, and then gave both versions of the post-test.

The digital versions of the pre-test and post-test were derived from the same documents as their paper copies. This was by design so that we could negate any disparities and keep the content being assessed as similar as possible. We wanted to create a valuable digital version of each test that assessed the students on the same information while presenting it through a different medium.

Qualitative data was collected by the teacher in the form of gauging questions while the

students were taking each version of the assessment. Mrs. Batten asked questions along the lines of, "Which test did you like better and why?"



Analysis

After collecting all of the test scores, the data were analyzed by looking at the average score on each specific test. Figure 1.1 compares the averages for each assessment from the sample of eighteen 2nd grade students, presented in the order the tests were given by Mrs. Batten. It can be seen that, initially, the students scored marginally better on the paper pre-test. After going through the unit of instruction with Mrs. Batten, they performed better on the paper post-test as well.
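To illustrate how the averages behind Figure 1.1 were tallied, a minimal sketch is shown below. The score lists are placeholder values for illustration only, not the actual class data, and the variable names are my own.

    # Minimal sketch of computing per-assessment class averages (as in Figure 1.1).
    # The score lists below are placeholders for illustration, not the real class data.
    scores = {
        "paper pre-test": [72, 85, 90],
        "digital pre-test": [70, 82, 88],
        "paper post-test": [88, 93, 95],
        "digital post-test": [84, 90, 91],
    }

    for assessment, values in scores.items():
        average = sum(values) / len(values)  # class average for this assessment
        print(f"{assessment}: {average:.1f}")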

On top of the quantitative data collected, Mrs. Batten also kept tabs on which form of assessment the students preferred. Figure 1.2, below, shows which assessment medium the class preferred to take. Again, this information came from Mrs. Batten asking the students questions after they completed their last assessment.



Findings

The results of the research that was reviewed beforehand somewhat conflict with what was seen in the data collected from the assessments. It can be seen that the students performed better on both the paper pre-test and the paper post-test. However, from the qualitative data collected, it can be observed that the students would rather take the digital variation of the fractions test. Mrs. Batten also brought up that it was more convenient for her to administer the Google Slides version; she could assign each student their own copy, and she did not have to keep track of them after completion.

Looking back at the results and wondering where the discrepancy could come from raises numerous questions. Were the students merely clicking through the questions online so they could complete the test faster? Is math a subject better suited for paper and pencil forms of assessment? These questions sit alongside many more that could possibly account for the difference between the students' preference and their performance. My hypothesis of the Google Slides outperforming the paper variant was not supported. The class preferred the Google Slides, but when they had to actually take the assessment, they scored better on the paper and pencil tests.

This study allowed me to test out many different aspects of digital assessments that I was

curious about. It also showed me that a melding of both mediums could possibly be what is best for students. Allowing them the option of choice is a solution that could yield even greater results in the future.



References

Dieck-Assad, F. A. (2018). Digital Teaching: In Search of an Effective Paperless Platform for Classroom Activities. Journal of International Education Research, 14(2), 1–8.

Garofalo, S. G., & Farenga, S. J. (2021). Cognition and Spatial Concept Formation: Comparing

Non-Digital and Digital Instruction Using Three-Dimensional Models in

Science. Technology, Knowledge and Learning, 26(1), 231–241.

Klein, M. (2019). Comparing Digital and Paper-and-Pencil Methods of Administering and Scoring the WISC-V. ProQuest LLC.

Paleczek, L., Seifert, S., & Schöfl, M. (2021). Comparing Digital to Print Assessment of

Receptive Vocabulary with GraWo-KiGa in Austrian Kindergarten. British Journal of

Educational Technology, 52(6), 2145–2161.
