Cycle 7 – Using Video Feedback for Written Work: 25 Respondents

ACTION TAKEN: This cycle focused on students' written work and used the screencasting methods from previous cycles to provide feedback. The methodology remained similar: I began with a rubric and then showed the students' written work compared against the directions for the assignment. The sample size was larger this cycle, with 25 respondents. Rather than testing the process with a single class of six students, I provided video feedback to all 25 of the middle school photography students who completed this assignment.

RESEARCH QUESTION: How will video feedback apply to written work and affect student learning outcomes, as evaluated by the students?

PREDICTED OUTCOME: The promise of screencasting for feedback applied to both visual and written work, and I predicted that video feedback would continue to be an effective tool when used with written work. This method should therefore provide a more effective means of feedback than traditional written notes.

EVIDENCE USED TO EVALUATE THE ACTION: The evidence used to evaluate the actions in this cycle was a survey with the following questions:
1. How useful do you feel video feedback was to you?
2. What was the worst part of video feedback?
3. Did you find video feedback helpful in any way? If so, what were those ways? (Please describe in detail.)
4. What do you get from video feedback that is different from what you would get from written feedback?
5. Does receiving verbal feedback change your impression of how I assess your work? For example, do you think I do something differently than I would if providing written feedback?
6. Ultimately these changes are an attempt to help your learning. What do you think about being asked to provide input into my system of assessment? Do you think having a chance to provide your opinion will make a difference in your learning?

EVALUATION: The baseline question for this cycle asked how useful video feedback was to the individual students (Figure 7.1).
This larger sample supported the validity of the method: 76% of the students stated that they found video feedback more helpful than traditional written feedback, and the remaining 24% found it at least as helpful as traditional methods. This suggests that even when screencasting is not seen as more beneficial, it does not become an obstacle to student learning.

[Pie chart: "How useful is video feedback?" — Much more useful than written feedback: 40%; More useful than written feedback: 36%; The same as written feedback: 24%; Less useful than written feedback: 0%; Much less useful than written feedback: 0%.]
Figure 7.1. How useful do you feel video feedback was to you?

When asked to identify the worst part of the video feedback method (Figure 7.2), the largest group of students (32%) either indicated that there was no worst part or offered a positive quality of video feedback. 24% of the students stated that because the videos were placed on the school's server, they did not have access at home. While this is accurate, two things must be noted. First, in the primary cycle of research the video files were emailed to the students, and challenges arose due to their size; after the first cycle the students suggested using the school's server because it has unlimited storage. Second, the students are able to download the videos from the server and take them home on their computers, so with more experience they could download the files as part of their daily routine. 8% of the students said that not being able to see written suggestions on their papers was a drawback; this issue will be addressed in cycle eight of the research by marking the papers in Adobe Acrobat before the video is recorded. 4% of the respondents suggested that the video moved too quickly. In attempting to make the videos concise, I spoke quickly; in response to this suggestion I will slow down and refer to more details in the upcoming cycle. Another 8% of the students said that reading both their work and the directions simultaneously was a challenge. This is a fair point, and one of the reasons Cognitive Load Theory was used in refining the rubric. However, the students have the ability to pause their video, which would give them time to look at the corrections and then make the modifications to their writing.

[Pie chart: "What was the worst part of video feedback?" — Positive responses: 32%; Not having access at home: 24%; Difficult to read both the directions and their work at the same time: 8%; Cannot see the corrections on the paper: 8%; It moved too quickly: 4%; Shows what I already know: 4%; Not colorful enough: 4%; Other: 16%.]

Figure 7.2. What was the worst part of video feedback?

Question three searched for a deeper level of insight by asking the students in what ways they found video feedback helpful (Figure 7.3). The two most frequent responses were the two reasons I am continuing with this study: that visual feedback is more easily understood (39%) and that the audio recording makes criticism less confrontational and more personal (35%). In the students' view these are the clear benefits of video feedback over written feedback. Other responses to the question were: it is more specific or clear (9%), the rubric was helpful in clarifying the feedback (5%), it is better than written feedback (4%), and the video can be paused and replayed (4%). It should be noted that the answer "the rubric helped clarify the feedback" was recorded as a separate category from the general response "visual feedback is more easily understood," because the general response measured the quality of video feedback as a whole, while the rubric response suggested that the rubric made the feedback more specific.

[Pie chart: "In what ways did you find video feedback helpful?" — Visual feedback is more easily understood: 39%; The voice makes the feedback more personal: 35%; It is more specific or clear: 9%; The rubric clarified the feedback: 5%; It is better than written feedback: 4%; The video can be paused or replayed: 4%; It is the same as written feedback: 4%.]

Figure 7.3. Did you find video feedback helpful in any way? If so, what were those ways? (Please describe in detail.)

The fourth survey question, how video feedback differs from written feedback, continued to look for more thoughtful responses from the students, and it produced a range in the depth of insight (Figure 7.4). The two most common answers were that "it is more visual" (33%) and that "it is more verbal" (30%). Neither of these responses indicated reflection upon the question; they are only cursory answers. 20% of the respondents stated that video feedback felt more personal, a response demonstrating a deeper level of thought about the question. It is important to note that only 2 of the 25 respondents mentioned that it was more personal in both questions three and four. Because these were open-response questions, there was room for multiple-concept answers; the limited overlap showed that almost half of the respondents felt that video feedback is more personal. The formula used to determine this was (6 + (8 − 2)) / 25 = 48%. Six respondents mentioned it in question three and eight in question four; to avoid double-counting the two students who said it in both questions, two answers were removed from the question-four total. The results from both questions were then added together to get twelve individual respondents, and these twelve were divided by the total number of respondents, twenty-five, to get forty-eight percent.
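The overlap correction described above is simple enough to verify mechanically. The short Python sketch below reproduces the (6 + (8 − 2)) / 25 calculation; the variable names are my own for illustration, and the counts are the figures reported in the paragraph.

```python
# Overlap-corrected share of respondents saying video feedback felt
# "more personal" (counts taken from the survey results above;
# variable names are illustrative only).
q3_mentions = 6   # respondents saying "more personal" in question three
q4_mentions = 8   # respondents saying "more personal" in question four
overlap = 2       # students who said it in both questions
respondents = 25  # total survey respondents

# Count each student once: subtract the overlap from question four.
unique_mentions = q3_mentions + (q4_mentions - overlap)  # 6 + 6 = 12
share = unique_mentions / respondents                    # 12 / 25 = 0.48
print(f"{share:.0%}")  # prints "48%"
```

This is the standard way of combining counts from two questions whose respondents overlap: the two students appearing in both tallies are subtracted once so no one is counted twice.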

[Pie chart: "How is video feedback different from written feedback?" — It is more visual: 33%; It is more verbal: 30%; It feels more personal: 20%; It is easier to understand: 7%; It is pretty much the same: 7%; Other: 3%.]

Figure 7.4. What do you get from video feedback that is different from what you would get from written feedback?

The fifth question asked the students whether the video method revealed or changed their impressions of my level of effort in assessing them. The majority of respondents (65%) said that it did not change their perception of my effort. Only 5% suggested that they believed I worked harder in creating screencasts. The other responses did not directly relate to the question, suggesting instead that this method explains the information better (26%) and that it makes criticism easier to absorb (4%). These results may suggest that while video feedback is a different form of assessment, it does not imply a change in effort by the teacher. It is also possible that the students did not fully understand the meaning of this question, which would explain the answers that did not quite make sense in relation to it.

The final question regarded action research rather than screencasting: it asked whether the students believed offering input into my style of assessment would benefit their learning. The overwhelming majority (74%) replied that it would make a difference, 9% responded that it would not, and the remaining 17% replied with statements that did not answer the question. This emphatic reply showed that students believe they can make a difference in their own education when given the opportunity.

REFLECTION: This cycle of research has shown that video feedback can be applied to written work as well as visual artwork. The screencast was created by using Microsoft Word to display the written work and Adobe Illustrator to include a rubric, and it was recorded in Snapz Pro X. The process takes only slightly longer

than editing written work by hand, but it is capable of providing a deeper level of explanation and a more personal approach. By its very nature, speaking, as opposed to writing, allows for more information transfer in a shorter period of time. A conclusion can be drawn from the fact that we are able to present more information about students' work by speaking about that work than by writing comments in the margins. Personally, this allowed for a freer flow of ideas. Writing restricted my ability to deliver my thoughts; while my speech may not have been grammatically correct, it was easier to share my thoughts and thereby give more information. I was also able to think slightly ahead while speaking, which created more opportunities to make mental connections and thereby give greater depth to my message. Student responses in this cycle have shown that they find video feedback more easily understood. These responses could be attributed to the ability to speak my thoughts rather than write them, because I could think more freely and was not restricted by the rules of writing. Almost half of the respondents (48%) stated that hearing my voice creates a more personal approach to assessing their work, which makes my criticism easier to take. These responses align with the research in my literature review on Social Presence Theory (SPT). SPT indicates Western societies' preference for face-to-face methods of communication; when face-to-face contact is not an option, methods that more closely resemble it are preferred. This suggests that we are more comfortable hearing a person's voice than reading the same words, because we can hear the nuances in the voice and thereby gain a deeper understanding of its meaning. This applies to feedback because criticism becomes easier to accept and praise more personal; it is closer to having a conversation with your teacher, and thereby more socially comfortable.
This cycle reached more middle school students, creating a respondent pool of 25 people, approximately half of the students who completed the assignment. This larger group corroborated the results from previous cycles indicating that video feedback was more useful than traditional written feedback (Figure 7.1): 76% of the respondents found video feedback more beneficial than traditional written feedback, and the remaining 24% found it no better or worse. The increased sample size further reinforced that video feedback is a beneficial tool from the students' perspective. Two problems with the feedback were pointed out this cycle. The first relates to the file-size challenges I had in the first cycle. I have been placing the feedback videos on the school's server because the large files take up too much space in the students' email. Some of the students mentioned that being unable to access the server from home was a challenge, whereas written feedback could simply be carried home. The solution to this challenge is a simple one: the students could download the feedback video to their computers before leaving for home and make this part of their daily practice. This way they could watch the video whenever they liked, with no storage issues in their email. The second problem in this

cycle was that I spoke too quickly during the videos (Figure 7.2). Balancing an appropriate duration with a useful amount of depth is challenging, and it is a process I am still refining; I will consider this in the next cycle of research and add more depth to my feedback. The students did have the ability to pause the videos, which some of them indicated was an asset, though it is difficult to tell from the responses whether pausing would have benefited the students who believed I moved too quickly. The ability to control the sequencing of my feedback has recently been pointed out as a benefit of this practice. It could prevent students from looking only at the final grade of the assignment by first showing them the quality of their work and then leading the explanation to a place where the final grade has a depth of meaning. The students could easily fast-forward the video to where the final grade is displayed, just as they might look at the bottom of a paper for a score; the advantage, however, is that rather than having the student read written feedback in no particular order, I can explain the issues in a logical sequence. This further helps to clarify the feedback presented. A potential challenge of this method arises in the context of non-native English speakers: my comfortable speaking cadence may be too fast for them, and because the feedback is delivered verbally, they may have to focus more on the individual words being spoken and might not follow the message in its entirety. In thinking about this issue I have begun to write more at the bottom of the rubrics, and in the next cycle I will be marking the students' papers with corrections. This challenge can be addressed as long as the teacher is aware of the audience and takes the necessary steps to present information clearly.
However, there may be a way to use this beneficially in English as a Second Language classes, or with English speakers learning another language; both sides of this concept would require more testing to reach conclusions. It was reassuring to see that 74% of the students believed that by providing their viewpoints they could improve their education. This shows that even though students are required to attend school, they are invested in their education and may benefit from the opportunity to provide feedback on their classes. Since this survey was taken I have had informal conversations with students about this topic; they are interested in evaluating their classes and would welcome the opportunity to provide more input into what they are being taught. In the next cycle of research I will use video to assess more student work. The work will be a rough draft of writing, where I believe video feedback will be most beneficial. By providing detailed feedback before the project is due and incorporating written suggestions on the paper, I hope to improve the quality of the final writing. I will also attempt to slow down and provide more thorough feedback to the students.

Average length of recording: 1:26.
