
Courseware for

Participants

Certified
Instructional Designer
This document is confidential. It is provided to the recipient for the limited purpose of
supporting the participant in developing a better understanding of the skill set delivered in the
program. Any commercial use of this document is prohibited. Copyright is reserved with
Middle Earth HR and Carlton Advanced Management Institute.

Visit www.middleearthhr.com

All rights reserved by MEHR© version 2.2, February 2017

Session 5
Implementation and Evaluation

MEHR ID Model

Overview: I. Basics of Instructional Design; II. Requirement Analysis

Analyse: I. What existing knowledge do they have (BARs) & Bloom's Taxonomy; II. Converting the BARs into TLOs

Design: I. Creating the top-level design using Reigeluth; II. Creating the motivational design using ARCS

Storyboard: I. Understanding the storyboard; II. Creating e-content using video editing

Develop: I. Identifying different exercises; II. Making Gagne's lesson plan; III. Selecting the right exercises

Implement & Evaluate: I. Implementation of training, integrating with the LMS; II. Evaluation of training

SESSION OVERVIEW
By the end of this session, participants will be able to:

• Make a checklist for program implementation.
• Learn about the legal & ethical standards.
• Learn & understand the difference between formative & summative evaluation and criterion-referenced & norm-referenced evaluation.
• Learn to evaluate using Kirkpatrick's 4 levels.
• Learn about LMS and LCMS tools.
• Learn about the standard for e-learning implementation, SCORM.
• Learn how to evaluate a training program.

Have you ever faced any issue in implementing any training program?

What went wrong?

CASE
• Amazon pays for 95% of tuition, fees and textbooks — up to $12,000 over four years.
• Employees with one year of tenure are eligible to earn certificates.
• Participants can earn certifications/associate degrees in high-demand occupations such as computer-aided design, machine-tool technologies, medical lab technologies and nursing.

Implementation:

• The "Career Choice Program" was successfully implemented by Amazon.
• It is aimed at high-demand occupations.
• According to Bezos, over 16,000 Amazon associates in over 10 countries have taken advantage of the Career Choice Program.

How many programs are you conducting in your company?

How do you ensure the consistency and quality of all the programs?

Answer:
The secret of quality and consistency is effective implementation.

IMPLEMENTATION OF TRAINING PROGRAM:

Implementation can be done in two parts:

1. Making sure the preliminary implementation requirements & prerequisites are fulfilled.
2. Checking that the implementation standards are met and that the relevant laws are followed during implementation.

Check List:

Implementation of the training program is done by ensuring that all the tasks/activities in the checklist are completed.
The main tasks at implementation start-up are to roll out the program, book the venue and manage facilities according to the requirements, schedule the date and time slots for the program, and book the faculty and resources required for the program in advance. Care should also be taken to ensure that any technical glitches that may arise are not overlooked.
Check Handout 4 for sample checklists for a training program.

After the preliminary checklist is verified, implementation should also ensure that any training standards are strictly adhered to and that legal & ethical laws and norms are followed.

Sample checklist

Legal & Ethical Considerations:

Reiser & Dempsey (2002) enumerate the following issues as possible legal areas of concern if the training program is hosted online or is a web-based training program:

Copyright:
Taking material from the internet without getting consent, thereby violating copyright laws or intellectual property rights.

Discrimination:
The training should abide by the various anti-discrimination laws that apply, including age discrimination laws, the Civil Rights Act and gender discrimination laws.
Lawsuits can result from comments made, or material handed out, during a training that is discriminatory (based on age, sex, race, ethnicity, etc.).

Injuries:
Injury of a trainee or employee during a training if the training personnel were negligent in providing a safe training environment.
Injury due to missing or incorrect information in the training materials.

Other Issues:
Failure to develop and offer training that is mandated by the government can result in litigation initiated by the government.
Possible lawsuits due to employees feeling that they were kept from promotions because they were denied access to the training necessary for the promotion.
An organization must monitor and audit for criminal activity, periodically evaluate the effectiveness of the program, and create and communicate procedures for employees and agents to report criminal activity without fear of retaliation.

508 Compliance Standards:

Section 508 was enacted to eliminate barriers in information technology, to make available new opportunities for people with disabilities, and to encourage the development of technologies that will help achieve these goals.

Under Section 508, agencies must give disabled employees and members of the public access to information that is comparable to the access available to others.

Summary of Section 508 technical standards:

Software Applications and Operating Systems: covers accessibility of software, e.g. keyboard navigation and focus as supplied by a web browser.

Web-based Intranet and Internet Information and Applications: assures accessibility of web content, e.g. text descriptions for any visuals, so that users with a disability, or users who need assistive technology such as screen readers and refreshable Braille displays, can access the content (a toy check of this requirement appears after this summary).

Telecommunications Products: addresses accessibility for telecommunications products such as cell phones or voice mail systems. It includes addressing technology compatibility with hearing aids, assistive listening devices, and telecommunications devices for the deaf (TTYs).

Videos or Multimedia Products: includes requirements for captioning and audio description of multimedia products such as training or informational multimedia productions.

Self-Contained, Closed Products: covers products where end users cannot typically add or connect their own assistive technologies, such as information kiosks, copiers, and fax machines. This standard links to the other standards and generally requires that access features be built into these systems.

Desktop and Portable Computers: discusses accessibility related to standardized ports, keyboards and touch screens.
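As a toy illustration of the web-content standard above (text equivalents for visuals), the sketch below scans a markup string for img tags that lack an alt attribute. The markup is invented, and a regex scan is a deliberate simplification; real accessibility audits rely on tools built on proper HTML parsing:

```typescript
// Naive illustration of one Section 508 web-content requirement: every
// visual needs a text equivalent. This toy check flags <img> tags that
// lack an alt attribute. (Regex scanning is a simplification for the
// example; real audit tools parse the HTML properly.)

const markup = `
  <img src="chart.png" alt="Quarterly training completion rates">
  <img src="logo.png">
`;

const imgTags = markup.match(/<img\b[^>]*>/gi) ?? [];
for (const tag of imgTags) {
  if (!/\balt\s*=/i.test(tag)) {
    console.warn(`Missing text equivalent: ${tag}`);
  }
}
```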

Implementation of ELT

Selection of LMS

A learning management system (LMS) is a software application for the administration, documentation, tracking, reporting and delivery of education courses or training programs.

An LMS not only delivers content but also handles registering for courses, course administration, skills gap analysis, tracking, and reporting.

Student self-service (e.g. self-registration for instructor-led training), training workflow (e.g. user notification, manager approval, wait-list management), the provision of online learning (e.g. computer-based training, read & understand), online assessment, management of continuous professional education (CPE), collaborative learning (e.g. application sharing, discussion threads), and training resource management (e.g. instructors, facilities, equipment) are all important dimensions of learning management systems.

Modern techniques now employ competency-based learning to discover learning gaps and guide training material selection.

Selection of LCMS:

A learning content management system (LCMS) is a related software technology that provides a multi-user environment where developers, authors, instructional designers, and subject matter experts may create, store, reuse, manage, and deliver digital e-learning content from a central object repository. An LCMS focuses on the development, management and publishing of the content that will typically be delivered via an LMS.

Rather than developing entire courses and adapting them to multiple audiences, an LCMS provides the ability for single course instances to be modified and republished for various audiences while maintaining versions and history. Some systems have tools to deliver and manage instructor-led synchronous and asynchronous online training based on learning object methodology.

LCMSs provide tools for authoring and reusing or re-purposing content (mutated learning objects, or MLOs) as well as virtual spaces for student interaction (such as discussion forums, live chat rooms and live web-conferences).

Usage of LCMS Tools:

Authoring systems are authoring environments that have on-screen tools such as menus, prompts and icons that let users enter text, graphics, branching logic, etc., and that generate the underlying code.

Functions of an authoring system:

• A computer-based system that allows a group of people (including non-programmers) to create content for intelligent tutoring systems.
• Provides the graphics, interaction, and other tools that educational software needs.
• Has pre-programmed elements for the development of interactive multimedia software titles.
• An authoring tool can also be a word template that helps guide authors in writing to a pre-established pattern.

With multimedia authoring software, you can make video productions including CDs and DVDs, user interfaces, animations, interactive training and simulations.

Editing:

Editing is a process that involves preparing written, visual, audio and film media to convey information, through correction or modification, resulting in a correct, accurate and consistent finished work.

Interactivity:

Interactivity refers to an artifact's interactive behavior as experienced by the human user. It is not the visual appearance, the internal workings or the meaning of the signs it might mediate, but the behavior of its user interface as experienced by its user.

For example, the interactivity of an iPod includes the way you move your finger on its input wheel and the way this allows you to select a tune in the playlist.

Make a sample checklist for an induction program which is going to be held in your
organisation for new sales employees.

Self-Study
Ensuring SCORM

SCORM is a collection of specifications adapted from multiple sources to provide a comprehensive suite of e-learning capabilities that enable interoperability, accessibility and reuse of web-based learning content.

Sharable Content Object Reference Model: the SCORM system is produced and maintained by the Advanced Distributed Learning (ADL) initiative. An important goal of SCORM is to separate the "content" from the "system" that delivers the content, to increase compatibility. The challenge is to make it so that neither the content nor the delivery system relies on the other's special properties to create learning experiences that are adaptive and that follow accepted instructional design principles. S-C-O-R-M stands for:

Sharable. The goal is to make learning content readily available to virtually all members of the learning community. That means the content should run on multiple platforms and be launchable from any number of SCORM-conformant learning management systems. It also means the content should carry information that enables identification and search of the content (meta-data).

Content. The choice of the word "content" rather than "course" is important. A piece of content can be as small as a single page, a single image, a single audio file, or a word or character. This granularity provides great flexibility for learning developers.

Object. This term, from the world of information technology, implies that the existence of learning chunks or objects containing data and behaviors will make it easier to develop reusable content.

Reference Model. This term refers to SCORM's role as a roadmap to standards work, like a bookshelf of reference materials. SCORM-based standards model the learning content so that everyone needing to combine that content into larger composites can understand it thanks to the SCORM framework.

The Sharable Content Object Reference Model (SCORM) is thus a collection of standards and specifications for web-based e-learning.

• It defines communications between client-side content and a host system called the run-time environment, which is commonly supported by a learning management system (a minimal sketch of this communication follows this list).
• SCORM also defines how content may be packaged into a transferable ZIP file called the "Package Interchange Format".
• SCORM 2004 introduced a complex idea called sequencing: a set of rules that specifies the order in which a learner may experience content objects. In simple terms, the rules constrain a learner to a fixed set of paths through the training material, permit the learner to "bookmark" their progress when taking breaks, and assure the acceptability of test scores achieved by the learner. The standard uses XML, and it is based on the results of work done by AICC, IMS Global, IEEE, and Ariadne.
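To make that run-time communication concrete, here is a minimal sketch of how a launched content object might report progress through the SCORM 1.2 JavaScript API. The call names and cmi data-model elements are defined by SCORM 1.2; the score and status values below are invented for illustration:

```typescript
// Minimal sketch of SCORM 1.2 run-time communication from a content object.
// The hosting LMS exposes an object named "API"; the method and data-model
// names below are defined by the SCORM 1.2 specification.

interface Scorm12API {
  LMSInitialize(arg: ""): string; // returns "true" or "false" as strings
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: ""): string;
  LMSFinish(arg: ""): string;
}

declare const API: Scorm12API; // supplied by the hosting LMS at run time

API.LMSInitialize("");                                // begin the session
if (API.LMSGetValue("cmi.core.lesson_status") === "not attempted") {
  API.LMSSetValue("cmi.core.lesson_status", "incomplete");
}
API.LMSSetValue("cmi.core.score.raw", "85");          // report a test score
API.LMSSetValue("cmi.core.lesson_status", "passed");
API.LMSCommit("");                                    // persist to the LMS
API.LMSFinish("");                                    // end the session
```

SCORM 2004 renames these calls (Initialize, Terminate, GetValue, SetValue, Commit) and expands the data model, but the pattern is the same.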

Making e-learning content SCORM compatible

Content is generally compatible with SCORM if:

• It can be delivered via a web browser
• It can be self-contained (i.e. packaged with all dependencies wholly in a ZIP file)
• It does NOT depend on server-side scripting languages (such as JSP, ASP, and PHP)
• It does NOT depend on external files or external URLs
• It does NOT depend on downloadable components that must be installed by an administrator

General steps for making e-learning content SCORM conformant:

• Ensure the content meets the SCORM compatibility requirements (above)
• Organize all content files (including dependencies) into a single directory structure
• Define and describe the content using an XML manifest file as described by SCORM
• Package all the content and necessary files into a ZIP file

SCORM conformant e-learning content can be packaged, deployed to, and delivered via any SCORM conformant learning management system (LMS); e.g. SumTotal's authoring products, LCMS and ToolBook, are SCORM compliant. (To use the run-time communication described earlier, launched content must first locate the API object the LMS exposes; a sketch of the conventional discovery routine follows.)
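The sketch below shows the conventional discovery approach: walk up the frame hierarchy and then try the opener window. The 10-hop limit is a common defensive convention, not a SCORM requirement:

```typescript
// Sketch of the conventional SCORM 1.2 API discovery algorithm: the content
// object walks up its frame hierarchy (and then tries its opener window)
// looking for the "API" object that a conformant LMS exposes.

function findAPI(win: Window): unknown {
  let hops = 0;
  // Climb towards the top-level frame; the hop limit guards against loops.
  while (!(win as any).API && win.parent && win.parent !== win && hops < 10) {
    win = win.parent;
    hops++;
  }
  return (win as any).API ?? null;
}

// Check the current frame chain first, then the window that opened us.
const api = findAPI(window) ?? (window.opener ? findAPI(window.opener) : null);
if (!api) {
  console.error("Unable to locate the LMS-provided SCORM API adapter.");
}
```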

Have you ever seen a soccer match?

Have you tried guessing who would be the winner or loser of the game?

How do you do that?

Solutions:

• You will closely watch the game and see how the players are performing throughout the game.

• You will monitor their reactions.

• You will see how well they are scoring.

• You will see the final scores.

So, how is training evaluated?

EVALUATION OF TRAINING PROGRAM:

Types of evaluation

Training evaluation refers to the process of collecting the outcomes needed to determine whether training is effective. The logic of evaluation, as developed by Scriven, states that there are four steps to evaluation: selecting criteria of merit, setting standards of performance, gathering data, and integrating the results to pass a final judgment of value. Based on this logic, the evaluation types are:

Formative vs. Summative Evaluation:

Formative Evaluation: Formative evaluation is an on-going evaluation, separated into two sub-categories (implementation and progress).

Implementation evaluation may answer questions about: participant selection, participant involvement, activities matching the grant plan, strategies matching the grant plan, changes to the protocol, hiring and training of staff members, possession of materials and equipment, the timeline, the appropriateness of personnel, and the development and fulfillment of the management plan.

Progress evaluation may answer questions such as: what progress participants have made toward certain outlined goals, and what activities and strategies aided the participants in reaching predetermined goals.

Summative evaluation is meant to evaluate the program at its conclusion. This type of evaluation will attempt to determine: the success of the project, whether goals were met, participant satisfaction and benefit, effectiveness, end results versus cost, and whether the program should be repeated or replicated.

Criterion-Referenced vs. Norm-Referenced Assessments:

Norm-referenced tests (NRTs) compare an examinee's performance to that of other examinees. Standardized examinations such as the SAT are norm-referenced tests. The goal is to rank the set of examinees so that decisions about their opportunity for success (e.g. college entrance) can be made.

Criterion-referenced tests (CRTs) differ in that each examinee's performance is compared to a pre-defined set of criteria or a standard. The goal with these tests is to determine whether or not the candidate has demonstrated mastery of a certain skill or set of skills. The results are usually "pass" or "fail" and are used in making decisions about job entry, certification, or licensure. A national board medical exam is an example of a CRT: either the examinee has the skills to practice the profession, in which case he or she is licensed, or does not.
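A small sketch can make the contrast concrete: the same raw score leads to a relative judgment under norm-referencing and an absolute pass/fail judgment under criterion-referencing. The scores and the 80-point cut score below are invented for illustration:

```typescript
// Contrast between norm-referenced and criterion-referenced judgments on
// the same raw scores. The scores and the cut score are invented.

const scores = [52, 61, 68, 74, 74, 81, 88, 93];

// Norm-referenced: judge each examinee against the group, e.g. by
// percentile rank (the share of the group scoring below them).
function percentileRank(score: number, group: number[]): number {
  const below = group.filter((s) => s < score).length;
  return Math.round((below / group.length) * 100);
}

// Criterion-referenced: judge each examinee against a fixed standard.
const CUT_SCORE = 80; // hypothetical mastery standard

for (const s of scores) {
  const norm = `${percentileRank(s, scores)}th percentile`;
  const mastery = s >= CUT_SCORE ? "pass" : "fail";
  console.log(`score ${s}: norm-referenced ${norm}; criterion-referenced ${mastery}`);
}
```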

5.2.2 Evaluation strategies

• Methods and procedures for formative and summative course evaluation should be carefully planned in the course design process.
• Methods and procedures for evaluating student learning must be well articulated and directly linked to the stated learning objectives.
• The content of the course evaluation should be closely linked to the course objectives for the purpose of course improvement.

Kirkpatrick's Model for Evaluation:

In his 1975 book Evaluating Training Programs, Donald Kirkpatrick outlined a four-level model of evaluation, which is addressed here in the context of e-learning, e-training, or blended instruction. Kirkpatrick's four levels are:

• Reaction of student - what they thought and felt about the training
• Learning - the resulting increase in knowledge or capability
• Behavior - the extent of behavior and capability improvement and implementation/application
• Results - the effects on the business or environment resulting from the trainee's performance

Level One: Learner Reaction

This level measures how your trainees (the people being trained) reacted to the training. Obviously, you want them to feel that the training was a valuable experience, and you want them to feel good about the instructor, the topic, the material, its presentation, and the venue.

Start by identifying how you'll measure reaction. Consider addressing these questions:

• Did the trainees feel that the training was worth their time?
• Did they think that it was successful?
• What were the biggest strengths of the training, and the biggest weaknesses?
• Did they like the venue and presentation style?
• Did the training session accommodate their personal learning styles?

1. Reaction

Description:
Reaction evaluation is how the delegates felt, and their personal reactions to the training or learning experience, for example:
• Did the trainees like and enjoy the training?
• Did they consider the training relevant?
• Was it a good use of their time?
• Did they like the venue, the style, the timing, the domestics, etc.?
• Level of participation.
• Ease and comfort of the experience.
• Level of effort required to make the most of the learning.
• Perceived practicability and potential for applying the learning.

Tools and methods:
• Typically 'happy sheets': feedback forms based on subjective personal reaction to the training experience.
• Verbal reactions, which can be noted and analyzed.
• Post-training surveys or questionnaires.
• Online evaluation or grading by delegates (a small aggregation sketch follows this grid).
• Subsequent verbal or written reports given by delegates to managers back at their jobs.

Relevance and practicability:
• Can be done immediately the training ends.
• Very easy to obtain reaction feedback, and feedback is not expensive to gather or to analyze for groups.
• Important to know that people were not upset or disappointed.
• Important that people give a positive impression when relating their experience to others who might be deciding whether to experience the same.
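As a minimal illustration of analyzing reaction feedback for a group, the sketch below averages invented 1-5 ratings from a few happy sheets; the question names are hypothetical:

```typescript
// Minimal sketch of aggregating Level 1 "happy sheet" feedback for a group.
// Each response rates aspects of the training on a 1-5 scale; the question
// names and values are invented for illustration.

type HappySheet = Record<string, number>;

const responses: HappySheet[] = [
  { relevance: 5, venue: 4, useOfTime: 5 },
  { relevance: 4, venue: 3, useOfTime: 4 },
  { relevance: 2, venue: 4, useOfTime: 3 },
];

for (const question of Object.keys(responses[0])) {
  const mean =
    responses.reduce((sum, r) => sum + r[question], 0) / responses.length;
  console.log(`${question}: average ${mean.toFixed(1)} / 5`);
}
```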

Level Two: Learning Results

Kirkpatrick defines learning as the extent to which participants change attitudes, increase knowledge, and/or increase skill as a result of attending a program. So, to measure learning we need to determine the following:

• What knowledge was learned
• What skills were developed or improved
• What attitudes were changed

2. Learning

Description:
Learning evaluation is the measurement of the increase in knowledge or intellectual capability from before to after the learning experience:
• Did the trainees learn what was intended to be taught?
• Did the trainees experience what was intended for them to experience?
• What is the extent of advancement or change in the trainees after the training, in the direction or area that was intended?

Tools and methods:
• Typically, assessments or tests before and after the training (a small pre/post comparison sketch follows this grid).
• Interviews or observation can be used before and after, although this is time-consuming and can be inconsistent.
• Methods of assessment need to be closely related to the aims of the learning.
• Measurement and analysis are possible and easy on a group scale.
• Reliable, clear scoring and measurements need to be established, so as to limit the risk of inconsistent assessment.
• Hard-copy, electronic, online or interview-style assessments are all possible.

Relevance and practicability:
• Relatively simple to set up, but more investment and thought are required than for reaction evaluation.
• Highly relevant and clear-cut for certain training, such as quantifiable or technical skills.
• Less easy for more complex learning, such as attitudinal development, which is famously difficult to assess.
• Cost escalates if systems are poorly designed, which increases the work required to measure and analyze.
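Following up the pre/post testing mentioned in the grid, here is a minimal sketch of a Level 2 comparison: raw gain is the simple score difference, while normalized gain expresses how much of each trainee's available headroom the training closed. All scores are invented and assume a 100-point test:

```typescript
// Minimal sketch of a Level 2 pre-test/post-test comparison. Scores are
// invented and assume a 100-point test (so the headroom is 100 - pre).

interface TestPair { trainee: string; pre: number; post: number }

const results: TestPair[] = [
  { trainee: "A", pre: 55, post: 82 },
  { trainee: "B", pre: 70, post: 74 },
  { trainee: "C", pre: 48, post: 90 },
];

for (const r of results) {
  const rawGain = r.post - r.pre;                 // simple score difference
  const normalizedGain = rawGain / (100 - r.pre); // share of headroom closed
  console.log(`${r.trainee}: raw gain ${rawGain}, ` +
              `normalized gain ${(normalizedGain * 100).toFixed(0)}%`);
}
```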

Different techniques used to measure the learning:

Multiple Choice
Pros: More answer options (4-5) reduce the chance of guessing that an item is correct; many items can aid in student comparison and reduce ambiguity.
Cons: Reading time increases with more answers; reduces the number of questions that can be presented; difficult to write four or five reasonable choices; takes more time to write questions.

True/False
Pros: Can present many items at once; easy to score objectively; used to assess popular misconceptions and cause-effect relations.
Cons: Most difficult question type to write; ambiguous terms can confuse many students; few answer options (2) increase the chance of guessing that an item is correct, so many items are needed to overcome this effect.

Matching
Pros: Efficient; used to assess student understanding of associations, relationships and definitions.
Cons: Difficult to assess higher-order outcomes (i.e. analysis, synthesis, evaluation goals).

Interpretive Exercise
Pros: A variation on multiple choice, true/false, or matching, the interpretive exercise presents a new map, short reading, or other introductory material that the student must analyze; tests the student's ability to apply and transfer prior knowledge to new material; useful for assessing higher-order skills such as application, analysis, synthesis, and evaluation.
Cons: Hard to design; appropriate introductory material must be located; students with good reading skills are often at an advantage.

Supplied Response
Pros: Chances of guessing are reduced; measures knowledge and fact outcomes, terminology and formulas well.
Cons: Scoring is not objective; can cause difficulty for computer scoring.

Essay
Pros: Less construction time, easier to write; encourages more appropriate study habits; measures higher-order outcomes (i.e. analysis, synthesis, or evaluation goals), creative thinking and writing ability.
Cons: More grading time, hard to score; can yield a great variety of responses; not efficient for testing large bodies of content; if students choose among three or four essay options, you find out what they know, not what they don't.

Level Three: Behavior in the Workplace

Level three can be defined as the extent to which a change in behavior has occurred because someone attended a training program. It can be challenging to measure behavior effectively. This is a longer-term activity that should take place weeks or months after the initial training. In order for a change in behavior to occur, four conditions are necessary:

• The person must have a desire to change
• The person must know what to do and how to do it
• The person must work in the right climate
• The person must be rewarded for changing

Performance Assessments
Pros: Measure higher-order outcomes (i.e. analysis, synthesis, or evaluation goals).
Cons: Labor- and time-intensive; rating reliability needs to be established when using more than one rater.

Example of behavior improvement based on an assessment center:

1st assessment score | Mid assessment score (after training) | % improvement | Range of performance
65 | 80 | 23 | 21-30% POSITIVE
65 | 73 | 12 | 11-20% POSITIVE
64 | 70 | 10 | 0-10% POSITIVE
60 | 37 | -38 | 30-40% NEGATIVE
61 | 70 | 14 | 11-20% POSITIVE
64 | 43 | -32 | 30-40% NEGATIVE
68 | 76 | 11 | 11-20% POSITIVE
66 | 74 | 12 | 11-20% POSITIVE
58 | 72 | 25 | 21-30% POSITIVE
73 | 77 | 6 | 0-10% POSITIVE
81 | 70 | -14 | 10-20% NEGATIVE
62 | 77 | 25 | 21-30% POSITIVE
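The arithmetic behind the table can be reproduced with a short sketch: percentage improvement is the change from the first assessment score relative to that starting score, and its sign gives the positive/negative classification (the range bands in the table follow from the same percentage):

```typescript
// Sketch of the computation implied by the assessment-center table:
// % improvement from the first assessment to the mid assessment.

function improvement(first: number, mid: number) {
  const pct = Math.round(((mid - first) / first) * 100);
  return { pct, performance: pct >= 0 ? "POSITIVE" : "NEGATIVE" };
}

console.log(improvement(65, 80)); // { pct: 23, performance: "POSITIVE" }
console.log(improvement(60, 37)); // { pct: -38, performance: "NEGATIVE" }
```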

Example of behavior improvement based on BARS:

Before training and again after training, John and Julia are each rated on a BARS grid covering five skills (Emotion Control, Presentation Skills, Voice and Accent, Knowledge, and Sensitivity & Response) at Levels 1 to 4. Comparing each person's level on every skill before and after the training shows where behavior improved.

3. Behavior

Description:
Behavior evaluation is the extent to which the trainees applied the learning and changed their behavior; this can be measured immediately after the training and again several months later, depending on the situation:
• Did the trainees put their learning into effect when back on the job?
• Were the relevant skills and knowledge used?
• Was there noticeable and measurable change in the activity and performance of the trainees when back in their roles?
• Was the change in behavior and new level of knowledge sustained?
• Would the trainee be able to transfer their learning to another person?
• Is the trainee aware of their change in behavior, knowledge and skill level?

Tools and methods:
• Observation and interviews over time are required to assess change, the relevance of change, and the sustainability of change.
• Arbitrary snapshot assessments are not reliable because people change in different ways at different times.
• Assessments need to be subtle and ongoing, and then transferred to a suitable analysis tool.
• Assessments need to be designed to reduce the subjective judgment of the observer or interviewer, which is a variable factor that can affect the reliability and consistency of measurements.
• The opinion of the trainee, while a relevant indicator, is also subjective and unreliable, and so needs to be measured in a consistent, defined way.
• 360-degree feedback is a useful method and need not be used before the training, because respondents can make a judgment as to change after the training, and this can be analyzed for groups of respondents and trainees.
• Assessments can be designed around relevant performance scenarios and specific key performance indicators or criteria.
• Online and electronic assessments are more difficult to incorporate; assessments tend to be more successful when integrated within existing management and coaching protocols.
• Self-assessment can be useful, using carefully designed criteria and measurements.

Relevance and practicability:
• Measurement of behavior change is less easy to quantify and interpret than reaction and learning evaluation.
• Simple quick-response systems are unlikely to be adequate.
• The cooperation and skill of observers, typically line managers, are important factors and difficult to control.
• Management and analysis of ongoing subtle assessments are difficult, and virtually impossible without a well-designed system from the beginning.
• Evaluation of implementation and application is an extremely important assessment: there is little point in a good reaction and a good increase in capability if nothing changes back in the job, so evaluation in this area is vital, albeit challenging.
• Behavior change evaluation is possible given good support and involvement from line managers or trainees, so it is helpful to involve them from the start and to identify benefits for them, which links to the level 4 evaluation below.

Level Four: Business Results

Of all the levels, measuring the results of the training is likely to be the most costly and time-consuming. The biggest challenges are identifying which outcomes, benefits, or results are most closely linked to the training, and coming up with an effective way to measure these outcomes over the long term. They can include increased production, improved work quality, reduced turnover, etc.

Measuring outcomes:
o Sales volumes
o Customer retention
o Customer support
o Time for task completion
o Defect reduction
o Increased employee retention
o Increased production
o Higher morale
o Reduced waste
o Increased sales
o Higher quality ratings
o Increased customer satisfaction
o Fewer staff complaints

The above will vary widely depending on the business and the product or service provided.

CASE:

• You want to increase customer retention by X% and you need to improve the listening skills of front-end employees.
• Step 1: Set a target for customer retention.
• Step 2: Give training on listening skills.
• Step 3: Evaluate your training program in steps.

4. Results

Description:
Results evaluation is the effect on the business or environment resulting from the improved performance of the trainee; it is the acid test. Measures would typically be business or organizational key performance indicators, such as: volumes, values, percentages, timescales, return on investment, and other quantifiable aspects of organizational performance, for instance numbers of complaints, staff turnover, attrition, failures, wastage, non-compliance, quality ratings, achievement of standards and accreditations, growth, retention, etc.

Tools and methods:
• It is possible that many of these measures are already in place via normal management systems and reporting.
• The challenge is to identify which measures relate to the trainee's input and influence, and how.
• It is therefore important to identify and agree accountability and relevance with the trainee at the start of the training, so they understand what is to be measured.
• This process overlays normal good management practice; it simply needs linking to the training input.
• Failure to link to the training input, type and timing will greatly reduce the ease with which results can be attributed to the training.
• For senior people particularly, annual appraisals and ongoing agreement of key business objectives are integral to measuring business results derived from training.

Relevance and practicability:
• Individually, results evaluation is not particularly difficult; across an entire organization it becomes very much more challenging, not least because of the reliance on line management, and the frequency and scale of changing structures, responsibilities and roles, which complicates the process of attributing clear accountability.
• Also, external factors greatly affect organizational and business performance, which can cloud the true cause of good or poor results.

Exercise

Develop a four-level Kirkpatrick model evaluation for a training program. For each level, state the methods of evaluation used and whether they are formative/summative and criterion-referenced/norm-referenced evaluation types.

Level 1: Reaction

REVIEW OF LEARNING

• Check the legal & ethical compliances if the training program is to be implemented as an online/web-based training.
• Explain the purpose of an LCMS and the difference between an LCMS and an LMS.
• Discuss the tools & technology used for an LCMS.
• Use the 4 levels of Kirkpatrick's evaluation model to analyze, improve and evaluate a training program.
