Running head: EVALUATION OF GROUP MULTIMEDIA PROJECT (INDIVIDUAL REPORT) 1

Evaluation of Group Multimedia Project (Individual)

Helen S. Lambert

EDID6508, Developing Instructional Materials

University of the West Indies Open Campus (UWIOC)

STUDENT NAME and I.D.: Helen S. Lambert - 320028011

COURSE: EDID6508 – Assignment 3, Section 3

INSTRUCTOR: Dr LeRoy Hill

DUE DATE: December 2021


Introduction
“Creating educationally effective multimedia programs means taking seriously the
idea of learning by doing. Good educational software is active, not passive, and
ensures that users are doing, not simply watching.” (Schank, 1994, p. 69)
The quotation above is sound advice that epitomises the essence of creating and
evaluating multimedia projects. These projects incorporate audio, video and other
production elements, such as graphics, under the umbrella of a multimedia authoring
tool. This paper is an evaluation report on the development of an instructional
programme created by team Systemic Performance Solutions, comprising Keebah Brown,
Malissa Bovell, Rachel Edwards, Ophilia Boyd and Helen Lambert, using Articulate
Rise 360.

Each team member produced a storyboard of a programme they were interested in, to
be considered for further development as the team project; Ophilia Boyd's project,
'The Solar System', was selected. Development entailed the inclusion of user
interactivity, ease of navigation, content sequencing, pacing and access to learning
support. Overall, the project would be evaluated on its relevance, appropriateness
and sufficiency; its adherence to Gagné's Nine Events of Instruction; and its
functionality.

Three peers from the UWIOC's Instructional Design and Technology programme for
Semester 1, 2021/2022 were invited by the group to try out the programme and provide
feedback based on the pre-determined criteria of the Project Peer Evaluation Rubric,
with each criterion rated between 1 and 4 and each score carrying a corresponding
description. Two peers, Andreen Green-Walters and Nicholas Dillon, responded. The
criteria and the evaluation results, comprising their ratings, are tabled below.

Evaluation Results
Table 1. Abridged version of the Project Peer Evaluation Rubric with findings:
scores (1-4) from Andreen Green-Walters (AGW) and Nicholas Dillon (ND), with
combined feedback.

1. Storyboard or Planning Sheet (AGW: 3.0; ND: 3.0). Each slide was numbered;
however, some instructions seem to be missing. You were able to capture Gagné's
Nine Events of Instruction. The introduction was also appropriate.

2. Organisation of Content (AGW: 3.0; ND: 3.5). Logical sequence of information.
Menus and paths to more information were clear. There could have been a home button
to lead participants back home if they wished.

3. Originality (AGW: 2.7; ND: 2.7). While the topic is not original, the
presentation format is original. The introduction offered a little spin to the
presentation, which was good. You could have incorporated more original work
created by the group.

4. Copyright and Documentation (AGW: 1.0; ND: 0.0). Sources have not been properly
cited and permissions have not been received.

5. Format and Platform Transferability (AGW: 3.5; ND: 3.0). The stack,
presentation, or webpage plays easily on both Mac and PC. Care has been taken in
naming files, selecting technologies, or creating enhancements to produce a final
product that is cross-platform.

6. Subject Knowledge (AGW: 2.8; ND: 3.0). There is need for more original content,
which would have demonstrated subject knowledge. The use of videos that were
already created did not allow you to show much of this.

7. Graphical Design (AGW: 3.0; ND: 3.4). The design elements and content were
combined effectively. It would have been good to use added reinforcement.

8. Mechanics (AGW: 4.0; ND: 4.0). The presentation has no misspellings or
grammatical errors.

9. Screen Design (AGW: 4.0; ND: 4.0). There were adequate navigation tools. The
screen design was well laid out; there was no confusion in deciding where to go
next.

10. Use of Enhancements (AGW: 3.5; ND: 3.5). Appropriate amounts of video and audio
enhancements were used effectively to entice users to learn and to enrich the
experience. Clips are long enough to convey meaning without being too lengthy. The
audio created by the group added value to the presentation.

TOTAL: AGW 30.5; ND 30.1
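As a quick illustrative check (not part of the evaluation instrument itself), each
evaluator's total in Table 1 is simply the sum of their ten criterion scores. A
minimal Python sketch, using the scores transcribed above:

```python
# Each criterion maps to a pair of scores:
# (Andreen Green-Walters, Nicholas Dillon), transcribed from Table 1.
scores = {
    "Storyboard or Planning Sheet": (3.0, 3.0),
    "Organisation of Content": (3.0, 3.5),
    "Originality": (2.7, 2.7),
    "Copyright and Documentation": (1.0, 0.0),
    "Format and Platform Transferability": (3.5, 3.0),
    "Subject Knowledge": (2.8, 3.0),
    "Graphical Design": (3.0, 3.4),
    "Mechanics": (4.0, 4.0),
    "Screen Design": (4.0, 4.0),
    "Use of Enhancements": (3.5, 3.5),
}

# Sum each evaluator's column; round to one decimal place as in the rubric.
total_green_walters = round(sum(a for a, _ in scores.values()), 1)
total_dillon = round(sum(b for _, b in scores.values()), 1)
print(total_green_walters, total_dillon)  # 30.5 30.1
```

The two totals agree with the TOTAL row of Table 1.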

Evaluation Model

Programme evaluation is particularly important when resources are limited, to
ensure that everything that needs to be done is included. Process evaluation,
gathering data and feedback on the programme's processes as was done in this
exercise, is critical for alerting the designers to potential areas of deficiency.
In this instance, the team was guided by the tenets of mutual relevance,
appropriateness and sufficiency between instructional materials and multimedia;
Gagné's Nine Events of Instruction; and the tabled criteria. Tyler (1942), cited in
Owston (n.d.), indicated that the goal-based evaluation model is best suited for
the attainment of the programme's goals and objectives, and for programme
improvement. The evaluator's task in the goal-oriented evaluation model is to
“measure the extent to which the goals were achieved”, expressed as measurable
objectives.

Review of Feedback

I was pleased with the feedback received; it adequately reflected our output. The
comments for item 3, Originality, are worthy of reflection, since the storyboard
was selected because we all felt that it lent itself to original development. Time
constraints caused us to cut corners, but the unmistakable West Indian voice was
represented in our work. We made a conscious attempt to create a theme in our
graphical design, using the starred background for the slides. It was satisfying
to see that our fastidiousness regarding mechanics and screen design was
acknowledged. One unfortunate oversight was copyright and documentation; we
discussed it, well aware of the implications, but inadvertently omitted it.

Revision Plan

According to the findings, there are four areas of concern based on the ratings
allocated: Storyboard, 3.0; Originality, 2.7; Copyright and Documentation, 1.0 and
0.0; Subject Knowledge, 2.8. Schwier (1992) identifies three levels of interaction,
the highest of which is mutual interaction, consistent in today's environment with
artificial intelligence (AI). The two other levels are reactive and proactive. Our
evaluators have told us through their feedback that we are between levels one and
two. The table below contains suggestions for project improvement.

1. Storyboard
   • Review the order and relevance of all the slides and remove those that are
     superfluous.
   • Create an overview that describes the subject matter and the features of the
     project.

2. Originality
   • Create a 360° virtual video tour of the solar system with manipulatives, such
     as the papier-mâché orbs Ophilia created in her instructional video.
   • Introduce sounds and animation to represent the distinctive individual
     “voices” of each planet. Present in the active rather than the passive voice.

3. Copyright and Documentation
   • The video What is Creative Commons? by Esther Wojcicki
     (https://www.youtube.com/watch?v=xKKhiaUMHw8) is useful in this area.
   • Correctly attributing open-source material and obtaining permissions for
     private resources created by others relieves the instructional designer from
     having to create and use only their own work.

4. Subject Knowledge
   • Add a search bar that enables learner control and allows learners to broaden
     their knowledge of the subject.
   • Create a repository of learning material in different formats, accessible via
     links from the primary authoring tool.
Conclusion

The quality of the interaction between the learner and the multimedia tutorial is
a measure of its success. This becomes evident when an evaluation is done to
determine the tutorial's effectiveness and to gauge programme improvement. The key
is to consider design from the learner/user's point of view. A combination of
reactive, proactive and mutual interactions would be ideal.
References

MERLOT. (n.d.). Peer review report form (V 17.6). Retrieved from
http://www.merlot.org/

Owston, R. (n.d.). Models and methods for evaluation. York University, Toronto,
Canada.

Schank, R. C. (1994). Active learning through multimedia. IEEE MultiMedia, 1(1),
69-78.

Schwier, R. A. (1992). A taxonomy of interaction for instructional media.
Saskatoon: University of Saskatchewan.

Wojcicki, E. (n.d.). What is Creative Commons? [Video]. Retrieved from
https://www.youtube.com/watch?v=xKKhiaUMHw8
