
Annex B

JCLT(04)18

Evaluation Strategy for E-Learning Programme


16 June 2004

1. Purpose of this Paper


The purpose of this paper is to describe the evaluation strategy for the JISC e-Learning Programme,
to request the JISC Committee on Learning and Teaching (JCLT) to approve this strategy, and to
request approval to proceed with developing an Invitation to Tender (ITT) to select an evaluator.

2. Evaluation Strategy
The e-Learning programme is developing an integrated set of strategies covering evaluation,
communications and embedding, sustainability, and technology and standards. The evaluation
strategy is the most important of the four strategies and the foundation upon which the others will
build.

Objectives
The overall aim of the evaluation is to ensure that the e-Learning Programme and its strands
result in outcomes with clear benefits for the FE/HE communities and are perceived as offering
value for money. The objectives of the evaluation are to ensure that:

• The e-Learning Programme achieves its aims and objectives


• Programme outputs are wanted by the community and meet the needs of stakeholders
• Programme outcomes have an impact on the community and facilitate/enable positive change
• The programme can respond flexibly to changes in the technical and political environment
and is not overtaken by events.

As with any programme, in principle many aspects could be evaluated. For the e-Learning
programmes, the focus is on achievements and outcomes and ensuring they are useful to the
community. The programme will put in place a framework to ensure effective project and programme
management and quality of outputs. The evaluation will review evidence that this framework is
successful, but evaluating the quality of outputs or the specifics of programme management is not
itself a key objective of the evaluation.

Overall Approach
In the past, JISC has undertaken traditional formative and summative evaluations for its programmes.
The e-Learning programme will develop a new approach to evaluation; if this is successful, it can be
used as a model for future programmes. There are several reasons for this new approach:

• The typical three-year programme lifecycle does not take account of the rapidly changing
environment and factors such as newly developed technologies, evolving technical standards, new
or more evident user needs, or changing priorities of institutions. A development programme
can be overtaken by events if it lacks the flexibility to respond to changing circumstances.
• The overall e-Learning Programme is a group of programmes that need to operate as a
coherent whole. Their objectives and plans need to be aligned and will need to be re-aligned
as the programme progresses and responds to changes in the environment.
• Traditional formative and summative evaluations tend to review a programme in depth at one
point in time, resulting in a single report. Continuous evaluation with regular and timely
outputs will be more suitable for the dynamic nature of the e-Learning programme.

The new approach is based on the Kolb cycle of ‘plan, do, evaluate, review’ and is shown in the diagram
below. The overall aims and objectives of the e-Learning Programme will be agreed at the start. The
individual programme strands will develop their plans and implement their activities, resulting in
various outputs and outcomes. Progress and achievements will be evaluated annually. The results of
this evaluation will inform the development of future plans and enable the programmes to fine-tune
their objectives on an annual basis or add new objectives where these are needed, for example to
respond to a new area where activity is needed.

JISC will commission an evaluator to undertake the programme evaluation to achieve the objectives
outlined above. The evaluation will run alongside the programme for its duration, providing both
formative and summative information for making strategic programme decisions. The evaluator will
review the programme and its outcomes annually (in May) to assess progress against overall
programme aims. These results will feed iteratively into the programme to improve it, informing the
plans and priorities set for the next annual cycle.

This approach to evaluation should deliver important benefits. As noted above, it keeps the
programme tuned to drivers in the environment and the changing requirements of stakeholders. By
being tuned in, the programme can be flexible and respond in appropriate ways. For example, it can
close down strands, merge them or develop new ones. Where gaps are spotted, they can be filled
with new short-term initiatives. There will always be a high level of activity within the programme, but
it will change in response to priorities set by the evaluation and review process.

Scope
The evaluation will cover all four strands of the e-Learning programme, i.e. e-Learning and Pedagogy,
Technical Frameworks and Tools, Innovation, and Distributed e-Learning.

Reporting
The evaluation will be commissioned by and report to JCLT, the JISC subcommittee responsible for
JISC’s strategy in the learning and teaching area, for ensuring that this is reflected in the overall JISC
strategy, for administering funds in that area, and for ensuring value for money from the programmes
funded. The evaluator will report to JCLT through a nominated member of the JISC Executive.

Audience
The main audience for outputs of the evaluation will be:

• JCLT
• The programme steering committees
• The programme teams

These are the key groups responsible for the e-Learning programmes, and evaluation reports and
other outputs are mainly intended for their use. The evaluation will provide them with useful and
timely information to inform their decisions about the programme and its strands, particularly the
results of the annual review to develop plans for the following year.

Other stakeholders will also have a vested interest in the evaluation, and might be considered to be
secondary audiences. These would include:

• Funding bodies that have provided the funds JISC is administering, e.g. DfES, or HEFCE in
the case of the Distributed e-Learning programme
• Other members of the JISC Executive, e.g. the Communications team
• The JISC community, particularly those involved in community consultation associated with
the programmes.

These groups are likely to want short, targeted reports related to their stake in the programmes. The
evaluation will provide reports to fulfil this need when directed to do so by the programme team on
behalf of JCLT.

Key Principles
• The evaluation will operate as a project alongside the e-Learning programme and its strands
for their duration
• The evaluation must be external and objective, while still working closely with the programme
teams without being a member of them
• There will still be a need for a summative evaluation at the end, as a check on what has been
done, but not a full-scale summative evaluation starting from scratch (e.g. £10k)
• Effective QA and programme management should be built into the fabric of the programme
strands (e.g. via programme support). The evaluator reviews and validates their evidence
rather than evaluating these aspects directly
• The evaluator works with the programme and support teams to minimise the involvement of
projects.

3. Implementation Framework
The evaluation strategy outlined above indicates the purpose, overall approach, and key principles of
evaluation for the e-Learning programmes. The strategy will be supported by an implementation
framework that indicates how it will be achieved in an operational context at programme, strand, and
project level. This section briefly summarises the framework.

Levels of Evaluation
Evaluation will be conducted at project, strand, and programme level to ensure that outputs and
outcomes are of high quality, relevant to stakeholder needs, and that the programme and its strands
achieve their aims. The table below summarises evaluation at each level.

Programme level
Evaluation responsibilities:
• Evaluation at programme level and planning to achieve programme outcomes (above strand level)
• Coordinate evaluation at strand level, e.g. minimise gaps and overlaps
• Coordinate community consultation across strands
• Annual review of evaluation evidence and plans at strand and programme level
Focus: Impacts, Outcomes

Strand level
Evaluation responsibilities:
• Develop an evaluation framework at strand level appropriate to the outputs and outcomes envisaged
• Provide evaluation and technical support to projects
• Develop community consultation processes
• Engage with stakeholders
Focus: Impacts, Outcomes, Outputs

Project level
Evaluation responsibilities:
• Adhere to JISC standards guidelines and QA processes
• Evaluate project outputs, providing evidence of quality, e.g. fitness for purpose, adherence to standards
Focus: Outputs


Though each strand is different, there will be common approaches to aspects of the evaluation. Each
strand will provide for:

• Community consultation – Engage with key agencies in learning and teaching to influence
their policies and agendas, and feed into priority setting within the strand, e.g. agencies in HE,
FE, ACL, inspectorates, funding bodies, standards bodies
• Engagement with stakeholders – Ensure that project and strand outputs and outcomes are
wanted and meet stakeholder needs
• Project support – Provide advice and guidance to projects on evaluation, JISC technical
standards, QA processes and testing for conformance.

Evaluation at Strand Level


The detailed planning for evaluation will be done at strand level. Each strand will need to decide on
the questions the evaluation will answer, success indicators, evaluation criteria, appropriate
methodologies, data collection and analysis, and the role of projects in the evaluation. The evaluation
framework and methods that each strand sets up will reflect the nature of its outputs and envisaged outcomes.
e-Learning and Pedagogy
This strand aims to ensure that e-learning as practised in UK FE/HE is pedagogically sound,
learner-focused, and accessible. The outputs are largely non-technical, comprising a wide range of
studies, models, and guidelines to be released over the life of the strand. Evaluation of outputs will
involve peer review by experts, and community consultation will be used to assess priorities for future
work. The strand has therefore created a panel of 70 experts to review outputs and raise awareness,
and developed a community consultation process with key stakeholder agencies. Success will be
judged by factors like whether the models and guidelines developed are being used and valued by
practitioners.
Technical Frameworks and Tools
This strand aims to develop a technical framework to facilitate interoperability across learning,
teaching, research, and their supporting systems. Here the outputs are largely technical, so the
approach for evaluation is different. The technical outputs will be created by funded projects, and the
strand will have a programme support role to guide them in technical development and evaluation.
This will include providing projects with advice and guidance on how to implement JISC standards
and developing processes and procedures for conformance testing. Consultation with the community
will be undertaken at different levels and will include the technical development community, standards
bodies, relevant vendors, and international initiatives. Success will be judged by factors like
interoperability, support for institutional and pedagogical diversity, and adoption of the framework by
institutions and vendors.
Distributed e-Learning
The DEL strand will build on the others to support e-learning across institutions in a regional and
subject community context, widening participation and encouraging progression into HE. The strand
is organised into sub-strands covering architectures, repositories, tools, culture, and regional pilots. The
strand will build a wide range of technical and non-technical outputs including architectures, tools, e-
learning content, question banks, a model lifelong learner record, and guidelines on topics like
building repositories and sharing resources. DEL will adopt evaluation methods similar to Technical
Frameworks for the technical outputs and methods similar to Pedagogy for the non-technical outputs.
However, DEL will also appoint an evaluator at strand level to implement the evaluation over the
various sub-strands and range of outputs. Like the other strands, DEL will have a community
consultation process, in this case based in the HE Academy, a strategic partner. Success will be
judged by factors like adoption of the various architectures, tools, and models, and the extent to which
links can be facilitated across institutions and communities.

Evaluation at Project Level


For strands with funded projects, the strand will provide detailed information about the evaluation
framework and procedures to be followed for evaluation at project level. Technical projects must
follow JISC standards guidelines and any procedures for QA and conformance testing.
All projects must follow guidelines given on methodologies, data collection, and evidence of quality.
Each project will be expected to develop and implement its own evaluation plan as part of and
contributing to the overall plan at strand level. Evaluation results and evidence will be collected and
assessed at strand level.

Evaluation at Programme Level


An evaluator will be appointed at programme level to implement the evaluation strategy and achieve
the evaluation aims outlined above. This will involve evaluation at programme level and coordination of
the evaluations at strand level.
Coordinating Strand Evaluations
The ‘nuts and bolts’ of evaluation will be conducted at strand and project level, as outlined above.
The evaluator will need to coordinate their work, ensuring coherence across the strands and
minimising gaps and overlaps. The role might be similar to that of an orchestra conductor who
‘orchestrates’ the evaluation, acts as a mentor, and gets the best performance from the teams. This
will involve various activities:

• Liaison with JCLT – Work closely with JCLT to understand their priorities, provide them with
timely and objective information about the programme and its outcomes, and ensure they
have the necessary information for making decisions
• Evaluation planning – Ensure that the evaluation plans and implementation frameworks at
strand level are consistent, coherent, methodologically sound, and will successfully answer
the evaluation questions
• Review – Objectively review evaluation evidence provided by the strands during the annual
evaluation, and review their plans for the following year, ensuring they are aligned and
respond to new developments in teaching and learning.
• Community consultation – Coordinate the plans of each strand for community consultation,
and where appropriate participate in consultation events that feed into the evaluation
process but are also informed by it.
Programme Evaluation
However, a programme is more than the sum of its parts, and evaluation at programme level is more
than planning and coordination. The evaluator must ensure that the overall e-Learning Programme
achieves its outcomes and has impact. The terms of reference for the evaluation will provide specific
questions that the evaluation at programme level should answer, e.g.

• What are the key outcomes of the Programme and the benefits for stakeholders in the
community?
• What is the impact of the Programme in the community, and what cultural change is it
fostering?
• What are the key developments and changes in the e-learning environment, and what
problems or opportunities do they pose?

The evaluator will develop a plan at programme level to answer these questions, coordinated with the
plans of each strand and where appropriate drawing on their evidence. The evaluator will work with
programme directors, JCLT, and other members of the JISC Executive to develop strategies for
achieving outcomes and impacts common to all strands or above strand level, e.g. influencing the
policies and agendas of key agencies and international initiatives. For example, influencing DfES
policy on e-learning standards may need to be initiated above strand level, and the evaluator should
plan with JISC how to achieve such outcomes.

4. Selecting an Evaluator
Process
The usual JISC tendering and contracting procedures will be followed. An open tender process
announced by an ITT is suggested. The Programme Director for E-Learning will draft the ITT for
approval by JCLT. It may be useful to invite the top candidates to give presentations as part of the
decision process.

Terms of Reference


The terms of reference in the ITT will cover the following topics:

• Aims and objectives of the evaluation


• Description of the programmes to be evaluated
• Scope of the evaluation
• Audience and users of the evaluation
• Key evaluation questions
• Roles and responsibilities
• Evaluation outputs
• Appointment and reporting
• Time commitment and budget

Timing
• 11 May 2004 – Evaluation strategy is circulated to JCLT
• 25 May 2004 – JCLT approves/amends the evaluation strategy
• xxxxxx 2004 – ITT is circulated
• xxxxxx 2004 – Deadline for proposals (allows 4 weeks)
• xxxxxx 2004 – Decision to select evaluator (allows 3 weeks)
• 1 Sept 2004 – Appointment of the evaluator starts

Appointment
The appointment will commence on 1 August 2004, will be renewed annually subject to a satisfactory
annual performance review, and will end on 31 July 2007 (3 years).

Time Commitment and Budget


The terms of reference will give a guideline regarding the time commitment (e.g. 60-80 days per year)
and the budget (£30,000 per year).
