Helen S. Lambert
Introduction
“Creating educationally effective multimedia programs means taking seriously the
idea of learning by doing. Good educational software is active, not passive, and
ensures that users are doing, not simply watching.” (Schank, 1994, p. 69)
The quotation above is sound advice that epitomises the essence of creating and
evaluating multimedia projects. These projects incorporate audio, video and other
production elements such as graphics, under the umbrella of a multimedia authoring tool.
The team (Bovell, Rachel Edwards, Ophilia Boyd and Helen Lambert) developed its project
using Articulate Rise 360.
Each team member produced a storyboard of a programme they were interested in, to
be considered for further development as the team project; Ophilia Boyd’s project, ‘The
Solar System’, was selected. Development meant adding user interactivity, ease of
navigation, content sequencing, pacing and access to learning support.
Three peers of the UWIOC’s Instructional Design and Technology programme for
Semester 1, 2021/2022 were invited by the group to try out the programme and provide
feedback based on the pre-determined criteria of the Project Peer Evaluation Rubric, rated
between 1 and 4, with each score having a corresponding description; two peers, Andreen
Green-Walters and Nicholas Dillon, responded. The criteria and the evaluation results,
comprising their scores and feedback, are presented below.
Evaluation Results
EVALUATION OF GROUP MULTIMEDIA PROJECT (INDIVIDUAL REPORT 320028011) 3
| Item | Criteria | Andreen Green-Walters Score (1-4) | Nicholas Dillon Score (1-4) | Feedback |
|---|---|---|---|---|
| 1 | Storyboard or Planning Sheet | 3.0 | 3.0 | Each slide was numbered; however, some instructions seem to be missing. You were able to capture Gagne’s Nine Events of Instruction. The introduction was also appropriate. |
| 2 | Organisation of Content | 3.0 | 3.5 | Logical sequence of information. Menus and paths to more information were clear. There could have been a home button to lead participants back home if they wished. |
| 3 | Originality | 2.7 | 2.7 | While the topic is not original, the presentation format is original. The introduction offered a little spin to the presentation, which was good. You could have incorporated more original work created by the group. |
| 4 | Copyright and Documentation | 1.0 | 0.0 | Sources have not been properly cited and permissions have not been received. |
| 5 | Format and Platform Transferability | 3.5 | 3.0 | The stack, presentation, or webpage plays easily on both Mac and PC. Care has been taken in naming files, selecting technologies, or creating enhancements to produce a final product that is cross-platform. |
| 6 | Subject Knowledge | 2.8 | 3.0 | There is need for more original content. This would have shown subject knowledge. The use of videos that were already created did not allow you to show much of this. |
| 7 | Graphical Design | 3.0 | 3.4 | The design elements and content were combined effectively. It would have been good to use added reinforcement. |
| 8 | Mechanics | 4.0 | 4.0 | Presentation has no misspellings or grammatical errors. |
| 9 | Screen Design | 4.0 | 4.0 | There were adequate navigation tools. The screen design was well laid out; there was no confusion in deciding where to go next. |
Evaluation Model
that all the things that need to be done are included. However, process evaluation
(gathering data and feedback on the programme’s processes, as was done in this exercise) is
critical to alert the designers to potential areas of deficiency. In this instance, the team was
guided by criteria for evaluating instructional materials and multimedia: Gagne’s Nine
Instructional Events, and the tabled criteria. Tyler (1942), cited in Owston (n.d.), indicated
that the goal-based evaluation model is best suited for the attainment of the programme’s
goals and objectives, and programme improvement. The evaluator’s task in the goal-oriented
evaluation model is to “measure the
Review of Feedback
I was pleased with the feedback received; it adequately reflected our output. The
comments for item 3, Originality, are worthy of reflection, since the storyboard was selected
because we all felt that it lent itself to original development. Time constraints caused us to
cut corners, but the unmistakable West Indian voice was represented in our work. We
made a conscious attempt to create a theme in our graphical design, using the starred
background for the slides. It was satisfying to see that our fastidiousness regarding
mechanics and screen design was acknowledged. One unfortunate oversight was copyright
and documentation; we discussed it, well aware of the implications, but inadvertently omitted
it.
Revision Plan
According to the findings, there are four areas of concern based on the ratings
allocated: Storyboard – 3.0; Originality – 2.7; Copyright and Documentation – 1.0/0.0;
Subject Knowledge – 2.8. Schwier (1992) identifies three levels of interaction, the highest of
which is mutual interaction, consistent in today’s environment with artificial intelligence
(AI). The two other levels are reactive and proactive. Our evaluators’ feedback indicates
that we are between levels one and two. The table below contains suggestions for project
improvement.
Conclusion
The quality of the interaction between the learner and the multimedia tutorial is a measure of
its success. This becomes evident when an evaluation is done to determine its effectiveness
and as a gauge of programme improvement. The key is to consider design from the
would be ideal.
References
MERLOT Peer Review Report Form - V 17.6. (n.d.). Retrieved from http://www.merlot.org/
Owston, R. (n.d.). Models and methods for evaluation. York University, Toronto, Canada.
University of Saskatchewan. https://www.youtube.com/watch?v=xKKhiaUMHw8