THE DIFFERENCE YOU MAKE

An evaluation toolkit for adult learning and career change programmes
BACKGROUND AND ACKNOWLEDGEMENTS

The future of work is changing, providing opportunities for new careers and novel ways of working. Recent work by Nesta, Pearson and The Oxford Martin School predicts that approximately a fifth of workers are in occupations that are likely to shrink over the coming decade. Changes to the world of work will be driven by long-term trends such as technological and demographic change, as well as the ongoing impact of the COVID-19 pandemic. Future skills requirements will increasingly emphasise interpersonal and higher-order cognitive skills (such as fluency of ideas and active learning) and systems skills (such as judgement and decision making). The future workforce will need a combination of broad-based and specialised knowledge. There is currently limited evidence of ‘what works’ to support people to build these skills and support them into new careers.

This work has been developed by Learning and Work Institute (L&W) based on our experience as the evaluation partner on the Nesta and Department for Education CareerTech Challenge in 2020/21, a £5.75 million programme to encourage bold solutions to improve people’s working lives and unlock employment opportunities for the future. L&W worked closely with 11 innovators to evaluate the effectiveness of their digital and online learning solutions in building career adaptability skills and learner motivation amongst people who are working in shrinking sectors and occupations. The evaluations developed will help to identify ‘what works’ to upskill and retrain those most at risk of losing employment. Working in collaboration with Nesta, L&W wants to build capacity to conduct rigorous, innovative and high-quality evaluations in adult learning, skills and employment.

ABOUT LEARNING AND WORK INSTITUTE

Learning and Work Institute (L&W) is an independent policy, research and development organisation dedicated to lifelong learning, full employment and inclusion. We research what works, develop new ways of thinking and implement new approaches. We want everyone to have an opportunity to realise their ambitions and potential in learning, work and throughout life. We believe a better skilled workforce, in better paid jobs, is good for business, good for the economy, and good for society. We want learning and work to count. To find out more visit www.learningandwork.org.uk

@LearnWorkUK @LearnWorkCymru (Wales)

ABOUT NESTA

We are Nesta. The UK’s innovation agency for social good. We confront challenges that affect millions of people, from inequality and ill-health to the climate crisis. We believe that innovation offers more potential now than ever before. We see opportunities to mobilise citizens and influence behaviour. Private and public capital that can be used more creatively. A wealth of data to mine.

And so we draw on these rich resources by bringing together diverse teams. Data scientists, designers and behavioural scientists. Practitioners, academics, entrepreneurs and people with lived experience.

Together, we design, test and scale new solutions to society’s biggest problems. We partner with frontline organisations, build new businesses and work to change whole systems. Harnessing the rigour of science and the creativity of design, we work relentlessly to put new ideas to the test. We’ll keep going until we change millions of lives, for the better. Find out more at nesta.org.uk

AUTHORS

Fay Sadro, Lucy Marsh, Hazel Klenk, Corin Egglestone and Seana Friel.

DESIGN

Storythings | hello@storythings.com
CONTENTS

• Introduction
• What is evaluation and why is it important?
• Guidance on how to develop your Theory of Change
• Guidance on how to design and deliver a process evaluation
• Guidance on how to design and deliver an impact evaluation
• Ethical considerations for your evaluation
• Top tips on developing key findings and lessons learned
• Question bank tool
• Toolkit question sets

INTRODUCTION
WHAT IS EVALUATION AND WHY IS IT IMPORTANT?
The power of evaluation: how it supports us to make big decisions about how we design and deliver policies and programmes

Back to Work Finland: evaluation that identifies effective practice
An evaluation of a range of active labour market policies found that job rotation schemes in Finland get roughly 6 out of 10 displaced workers into jobs within three months – far outperforming other approaches used in the country. This is now one of the policies Finland has in place to address economic restructuring and to assist people losing their jobs for economic reasons.

Behavioural nudges to support attendance in adult learning: evaluation that identifies when an intervention does not work
After an initially promising pilot evaluation, an independent evaluation found that students who received motivational text messages or had study supporters did not, on average, have higher pass rates in GCSE re-sits.

Per Scholas: evaluation that supports economic growth
Three years on from delivery, the US Per Scholas IT programme has been found to have ‘large and growing’ impacts on employment and earnings, with participant earnings $4,800, or 27 per cent, higher on average than those of the control group. It has now been adopted across several other states in the US.

Further reading
• Better Evaluation: a global collaboration to improve evaluation use and theory
• HM Treasury’s Magenta Book: central government guidance on evaluation
• The Innovation Growth Lab Toolkit: a guide to build an understanding of how adopting an experimental approach can be used to make policies more effective
• For general guidance on using research evidence, see this guide from The Alliance for Useful Evidence.
• For more examples of how evaluation can help non-profit organisations, take a look at Evidence for Good from the Alliance for Useful Evidence, the Nesta Standards of Evidence and the Nesta Guidance for Evaluating Social Innovation to Create Lasting Change.
• Nesta also has an Evidence & Experimentation practice team who are developing the future focus for Nesta’s work in this area. You can contact them here.
GUIDANCE ON HOW TO DEVELOP YOUR THEORY OF CHANGE
A Theory of Change sets out the relationship between the inputs and activities associated with your project and the outputs and outcomes which you hope to see as a result.

Why develop a Theory of Change?
Developing a project Theory of Change supports organisations to set out how the activities they are delivering are directly connected to the change they expect to see. Developing a Theory of Change is often an iterative process and it may evolve over time as your understanding develops. It should be seen as a live document that you return to and adapt as you learn more about your programme.

The Theory of Change can be summarised in diagrammatic form, which can be useful in several ways:
• Communicates the rationale for your project
• Manages risks and identifies assumptions
• Guides your evaluation activity

What does a good Theory of Change look like?
• Clear links between activities, outputs and outcomes (causal pathways)
• The Theory of Change aligns with the rationale and expected outcomes
• Consider the assumptions about the links you make between different factors (causal pathways)
• Consider the key risks for the project intervention

There are a number of ways to present a Theory of Change; the examples provided are just a starting point. Do consider how to visualise your Theory of Change and be as creative as you like!
GUIDANCE ON HOW TO DEVELOP YOUR OWN THEORY OF CHANGE
Grant funding Development of innovative learning Learning platform/product Attitudinal Outcomes: Increased use of online learning
products and platforms fully developed and in use • Increased motivation for learning to provide retraining for people
• Increased awareness of value of who are most vulnerable to
learning workplace disruption
Number of course starts
Online delivery mechanisms e.g.
Capacity building • Machine learning/AI Increased demand for online
Behavioural Outcomes:
support and training • Gamifications learning amongst the targeted
Number of learners • Increased participation in
• Chatbot learners
completing the provision learning, particularly online
Engagement and recruitment e.g. Learner retention rates Improved progression of the
Skills Outcomes:
Stakeholder and • Partnership working (employers, unions) (proportion of learners targeted learners into new
• Increased skills development
media engagement • Outreach and engagement who continue to engage in occupations and roles
especially English/Maths/Digital skills
provision after the first course • Industry-specific skills
is completed) • Employability skills development
The Career Adaptability
Evaluation: Framework first proposed by
• Development of project evaluation Robust data capture tools Mark Savickas (and added to
Programme plan and MI systems in place Career Adaptability Outcomes by Jenny Bimrose, Warwick
evaluation and • Programme shared outcomes to monitor progress and • Reduced Concern University), has been adopted
evaluation support framework capture outcomes • Increased Control by a number of researchers
• Increased Curiosity to describe an individual’s
• Increased Confidence capability to respond to career
• Increased Commitment challenges. Career Adaptability
is important because these
Rationale - Nesta committed to this programme of work following a period of capabilities will enable a person
research into the future of skills and employment, particularly the Precarious to adapt and learn throughout
Evaluation Outcomes: their lives.
to Prepared manifesto. The research identified a need for more opportunities • Findings on effect of different
for workers to retrain in order to be resilient to labour market shocks (such as approaches on learner
engagement and retention
technological and environmental change). A particular need was identified • Findings on effect of different Increased knowledge of ‘what
for the development of training that increases learner motivation and career approaches on learner outcomes works’ in the use of online
adaptability. • Increase in innovator awareness learning for the targeted
of and commitment to evaluation learners
11
GUIDANCE ON HOW TO DESIGN AND DELIVER A PROCESS EVALUATION

What is Process Evaluation?
A process evaluation is focused on understanding key successes or challenges encountered in the delivery of the project and whether the project was delivered as intended. You should design and deliver a process evaluation to help you understand why you may be seeing certain outcomes improving, staying the same or getting worse.

Why is process evaluation valuable to my organisation and project?
• It can help you to identify how and why new ways of working have been effective – and which of these practices you might want to roll out across your organisation

When planning your process evaluation, think about creative ways you could conduct the evaluation that build on existing mechanisms you have in place. We have included some example process evaluation questions later on in this chapter.
1. Decide which elements of your project to examine as part of your process evaluation.
This decision will be guided by the activities and outputs you have put in your Theory of Change. These include:
• Your marketing or recruitment strategy
• Your approach to recruiting and engaging learners
• Demographic characteristics of learners
• Partnerships with other organisations
• Context in which the project is delivered
• Project management arrangements
• Quality and intensity of the learning provision
• Staffing and training

2. Develop a set (3-4) of overarching questions that you want to answer through your process evaluation.
Typical generic questions include:
• What worked well and less well, and why?
• What could be improved?
• How has the context influenced delivery?

3. Identify and prepare the research tools you will use to collect information.
There are different research tools you can use in your evaluation. Through these you can collect information such as factual details as well as perceptions of what worked well and less well in the project delivery. See ‘Evidence collection for your process evaluation’.
How do I identify learners to take part in the evaluation?
• You can recruit learners via an ‘opt in’ method whereby you contact them about the evaluation and ask them to opt in if they are interested. Alternatively, you can ask a gatekeeper (e.g. tutors, mentors, employers or other delivery staff) to identify learners who would be willing to take part.
• Use your monitoring information to select a range of learners based on their characteristics (e.g. age, gender, ethnicity, employment type/sector and qualifications); an illustrative sketch of this follows at the end of this section.
• Keep a record of who you have contacted and who has opted in.

How do I incentivise learners to take part in an interview or focus group?
• Highlight the value of taking part. Emphasise that their involvement will contribute to wider learning that will feed into a better, evidence-based understanding of how best to design online learning for adults like them, so that future programmes of work can support people more effectively.
• Using a gatekeeper as a ‘warm’ contact can be an effective approach to recruitment.
• Advertise the opportunity via a range of channels – for example on your website, social media or via text or email.
• Use clear and accessible language to encourage learners to take part. Reassure them that it is not a test and that there are no right or wrong answers.

How do I gain informed consent?
• Provide learners with information about the evaluation, its purpose and why you would like to talk to them before they take part in any formal activity with the evaluation.
• If you are planning to collect monitoring information, you must inform learners of this upon engagement so that they can opt in (or out) of you collecting information about them.

For more information about informed consent and other ethical considerations, please see Chapter 5: Ethical considerations for your evaluation.
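If your monitoring information is held in a spreadsheet, a few lines of code can help you select a spread of learners to invite. The sketch below is purely illustrative: it assumes a hypothetical file named monitoring.csv with columns learner_id, age_group and sector, which you would rename to match your own monitoring table.

```python
# Illustrative only: draw a spread of interview invitees from monitoring data.
# Assumes a hypothetical file 'monitoring.csv' with columns 'learner_id',
# 'age_group' and 'sector' - rename these to match your own monitoring table.
import pandas as pd

learners = pd.read_csv("monitoring.csv")

# Sample up to two learners from every age-group/sector combination so the
# invited group reflects the range of people taking part, not just one type.
invitees = (
    learners.groupby(["age_group", "sector"], group_keys=False)
    .apply(lambda g: g.sample(min(len(g), 2), random_state=1))
)

# Keep a record of who has been contacted, as recommended above.
invitees.assign(contacted=False).to_csv("interview_invitees.csv", index=False)
```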
Evidence collection for your process evaluation

You will need to identify and develop research tools to use to collect evidence for your process evaluation. Process evaluation commonly uses two types of research tools to gather information:

1. One-to-one interviews or focus groups with those delivering the project and those who take part (learners) to understand what worked well and less well about the project. For more information about interviews and focus groups, see ‘Conducting interviews and discussion groups’.

2. Output information to understand whether processes such as recruitment, retention and completion have worked as expected. This will be collected using your monitoring table.

Through these you can collect evidence such as factual details as well as perceptions of what worked well and less well in the project delivery. It may also highlight any key aspects of the context of the intervention that affected project delivery.

In the table below, we have suggested the range of questions you may consider exploring with key participant groups across your process evaluation. This is a starting point and you should consider further topic areas you wish to explore with participants relevant to your evaluation aims.
Process evaluation questions

Topic: Your marketing and recruitment strategy
Ask: Project staff – at the start, during and post-project delivery
• How has the process for marketing and recruiting to the project been working?
• Was the marketing and recruitment strategy delivered exactly as it was planned? What changes were required and why?
• How has the external advice and guidance on the marketing and recruitment strategy been put into practice? What worked well or less well?
Ask: Project staff and learners – at post-project delivery
• What aspects of the marketing and recruitment strategy worked well or less well to recruit learners into the project? E.g. targeting learners through existing partnerships with employers, targeting learners on existing courses, using online communities, social media, etc.

Topic: Outreach and engagement of learners
Ask: Project staff – during and post-project delivery
• How would the respondent describe the learners who have engaged with the project? E.g. demographics, skills level, employment status, sector, background, learning needs.
• Was this group as it was expected to be when the project was planned? If it differed, how and why?
Ask: Project staff and learners – during and post-project delivery
• How has the mechanism of learning delivery influenced learners’ initial experience of the project? E.g. chatbot function, learning app, online collaboration, machine learning.
• What worked well in engaging learners into the project? What worked less well?
• Were there any challenges keeping learners engaged with the project once they enrolled? How does this differ by types of learners?
• What lessons have been learned about engaging learners into the project?

Topic: Project management arrangements, staffing and training
Ask: Project staff – during and post-project delivery
• Were the inputs and activities specified in the Theory of Change the same as what was delivered?
• Did the project meet budgetary expectations, or were there unforeseen issues and hidden costs?
• Was the project managed and staffed as it was planned? What changes were required and why?
• Was any training provided for staff? By whom? How useful was it? Was any additional training needed?

Topic: Model of learning delivery
Ask: Project staff – during and post-project delivery
• Was the project delivered exactly as it was planned? What changes were required and why?
• Were there different variations to project delivery? What sort of variations were made? How did this affect delivery?
• Was the project delivered in the same way consistently (e.g. was each module delivered the same way)?
• Were the inputs and activities specified in the Theory of Change the same as what was delivered?
Ask: Project staff and learners – at the start, during and post-project delivery
• How has the design/mechanism of learning delivery influenced learners’ experience of the project? E.g. chatbot function, learning app, online collaboration, machine learning.
• Which aspects of delivery were most relevant and most valued? Was this different for different groups of learners?
• Which aspects caused difficulties among learners? Was this different for different groups of learners?
• What did learners and staff feel worked in delivering the project? What facilitated this?
• What did learners and staff feel worked less well in delivering the project? Why were these aspects less successful?
• Are there any changes that could be made to the delivery model?

Topic: Partnership working
Ask: Project staff – at the start, during and post-project delivery
• What partners have been involved in delivering the project? What was their role? E.g. designing the technology, communication, facilitating referrals to the project, co-creators.
• What’s worked well? What were the key challenges? How could these be addressed?
• Could anything be improved about partnership working?

Topic: Pathways to outcomes
Ask: Project staff and learners – during and post-project delivery
• What outcomes have been achieved through the project?
• What types of learners are achieving these outcomes (e.g. certain age group, sector, employment history)?
• Have there been any unintended outcomes amongst learners?
• What might act as facilitators and barriers to desired outcomes? How can barriers be overcome, and facilitators harnessed?

Topic: Context in which the project is delivered
Ask: Project staff and learners – at the start, during and post-project delivery
• Are there any organisational factors influencing the project’s delivery? To what extent do these act as facilitators or barriers to effective project delivery?
• Are there any wider contextual factors influencing the delivery of the intervention (e.g. national or local policy priorities, Coronavirus, labour market context)?
• In what ways have these factors influenced the project’s delivery?
Ask: Project staff and learners – during and post-project delivery
• Are there any external factors which have influenced learners’ experience of the project?

Topic: Summary questions: strengths, weaknesses and lessons
Ask: Project staff and learners – during and post-project delivery
• Overall, how well has the project been going so far?
• To what extent has the project met its aims? How could it better meet these aims in the future?
• What are the key strengths of the project?
• What have been the main challenges? How have these been overcome?
• Do you think the project was delivered successfully? Why or why not?
• What would they change if they were to do the project again?
• What design and delivery factors would they consider if they were to scale up the intervention in the future?
Research tools: descriptions, strengths and limitations

Tool: Interviews and focus groups
Description: Interviews enable in-depth exploration of the intervention with participants. Focus groups are useful to elicit views from a group of people rather than an individual.
Strengths: Can be used to elicit views of individuals involved in an intervention. Can be used to collect in-depth insight about an intervention and shed light on patterns emerging in other pieces of evidence collected (such as quantitative monitoring data).
Limitations: Can be resource intensive; requires time to conduct and analyse; does not provide numerical estimates; there may be a risk of bias in the views collected.

Tool: Case studies
Description: In-depth investigation of a person, group or event within its real-world context. Subjects are often purposely selected because they are unusual and reveal information. Often uses multiple sources of evidence and data collection methods. Can be descriptive, exploratory or explanatory.
Strengths: Can capture real-life situations in depth and detail and help understand complex phenomena. Works well in combination with or supplementing other methods, such as surveys. Can be helpful for communicating to stakeholders what interventions have worked for organisations in certain contexts.
Limitations: It is difficult to generalise findings to different contexts, situations or phenomena.

Tool: Surveys or questionnaires
Description: Commonly used to collect information from a number of individuals, such as beneficiaries, or from a large organisation with numerous members of staff. They can be administered face-to-face, by post, online, by telephone or as a handout.
Strengths: An effective method of obtaining information from a large number of learners. Provides data suitable for statistical analysis that, if properly designed and conducted, can be generalised to the whole population of interest.
Limitations: Less useful for providing in-depth insight into an intervention. There can be response-rate issues that decrease the quality of its findings.

Tool: Output and outcomes monitoring
Description: Continuous measurement and performance review of an intervention. Monitoring plans are developed based on the Theory of Change to allow the tracking of the inputs, outputs and outcomes of an intervention.
Strengths: Can provide a relatively low-cost and rapid method to identify if an intervention is being delivered and creating outputs as intended.
Limitations: Can feel onerous for both learners and the staff (usually delivering an intervention) who collect it.

1 HM Treasury (2020) Magenta Book: Central government guidance on evaluation
Collecting learner information

You will need to collect information about your learners (you may hear it referred to as monitoring or management information). This allows you to explore who is taking part in your project. Collecting this information can also support you – alongside other evaluation evidence – to understand what works for different groups of people. For example, if you work with adults of all ages, it may be helpful to know whether your project outcomes are better for some age groups than others. Characteristics you should consider collecting include (a minimal sketch of a monitoring table follows this list):
• Age
• Gender
• Ethnicity
• Employment sector
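As a minimal sketch of what such a monitoring table might look like, the code below writes a CSV with one row per learner covering the characteristics listed above. The file name, column names and example row are illustrative assumptions, not a prescribed format.

```python
# Illustrative only: a minimal monitoring table covering the characteristics
# suggested above. The file name and column names are assumptions.
import csv

FIELDS = ["learner_id", "age", "gender", "ethnicity", "employment_sector"]

with open("learner_monitoring.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # Example row - in practice, append one row per learner as they enrol.
    writer.writerow({
        "learner_id": "L001",
        "age": 42,
        "gender": "Female",
        "ethnicity": "White British",
        "employment_sector": "Wholesale, retail & repair of motor vehicles",
    })
```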
Conducting interviews and discussion groups

• Try to make the interview environment safe and private. Interviewees may want to share information that they would not want others to hear.

• Respect for interviewees’ confidentiality and anonymity is paramount. This is especially important in the reporting phase, where interviewee names (and organisation names, where relevant) should not be included in the final report.

• Before starting the interview or focus group, you should double check that participants are still happy to take part. It should be made clear to participants that their involvement is voluntary and can be discontinued at any time. You should also explain that they don’t have to answer any questions if they don’t want to, and that they can end the interview/leave the group at any time without explanation.

• You should use a discussion or ‘topic’ guide to help aid the discussion. The discussion guide is a document that outlines the key issues and subtopics to be explored with participants. The Process Evaluation Questions section includes some example questions you may want to consider using in the discussion guide.

• As the interviewer or group facilitator, you should be as neutral as possible and seek out the interviewees’ views about the project in a balanced way. Try to avoid asking ‘leading’ questions and start questions in an open way, e.g. “What difference, if any, did the project make to X?” and “Could you talk me through what you considered worked well about the project, if anything?”

• With participant consent, you can record the interview or focus group, as this will give you an accurate record of what is said. If this is not possible, you should write notes during the interview or discussion group so that you can capture the key points and use these notes to support your analysis.

• If you are conducting small group discussions, do consider carefully who you are inviting to the groups and ensure that the group mix is appropriate for the discussion. For example, having employers in the same discussion group as learners (i.e. employees) may influence certain individuals’ contributions.
Online interviews and focus groups

If you are planning to conduct online or virtual interviews or focus groups, consider the following points:

• These methods assume that learners have access to hardware, good quality wifi and a private space. Don’t forget to check these things in advance and establish a plan to engage with those who may be digitally excluded.

• Groups should take place in real time and last no longer than 90 minutes.

• Direct messaging can be used for one-to-one follow-ups.

• It is recommended that groups use a typed interface (as not everyone likes or has webcams, and this can create ‘tech barriers’ and affect timings).

• The same (and additional) stimulus techniques can be used as in face-to-face groups: whiteboard functions, word clouds, quick polls, short films, pictures, screen shots, screen sharing and website checking. This level of interactivity helps maintain engagement but may require more prep time to ensure participants can use them comfortably.

• Give time for responses: typing an answer can take longer than saying one, and not being able to see people’s facial expressions may delay responses.

• It is recommended to have two people leading the focus group.

Avoiding bias in process evaluation

Project beneficiaries, colleagues, and wider stakeholders may be reluctant to criticise the project, especially when the evaluation is conducted by someone who is closely involved in the project. All respondents should be asked to express honest views and be assured that their responses will not have negative repercussions. It is useful to emphasise that their involvement will contribute to wider learning in the programme evaluation that will feed into a better, evidence-based understanding of ‘what works’ to engage and motivate individuals.
Example: Process evaluation of DigiSkill – an online basic skills course for retail workers, with one-to-one mentoring

DigiSkill delivered an online basic skills course with sector-specific content to 200 adults working in the retail sector. The provision included support from a one-to-one mentor, who worked with employers as gatekeepers to incentivise take-up. The process evaluation aimed to understand the extent to which the project was delivered according to the plan (as set out in the Theory of Change) and the key successes and challenges. The team developed three key research questions:

1. To what extent is online provision of basic skills effective?
2. What role do mentors play in adults’ experience of online learning?
3. What role does employer engagement play in incentivising adults to take up learning?

To answer their evaluation questions, the DigiSkill project team conducted the following activities:

1. Developed an evaluation plan, including completing a Theory of Change and a monitoring table.
3. Conducted 15 phone interviews with learners to get an in-depth understanding of how and why they chose to engage with the provision, their views and experiences of the course, the content and the mentor support, their perspectives on the benefits of the course, and suggestions for improvements. The mentors and employers identified 15 learners who were willing to take part in the interviews.
4. Conducted a focus group with 6-8 mentors to explore their views and experiences of the online provision, their views of working with employers, what they felt worked well (e.g. regular sessions with the learner) and what the challenges were (e.g. lack of training), and suggestions for improvement.
GUIDANCE ON HOW TO DESIGN AND DELIVER AN IMPACT EVALUATION
1. Develop a set of key questions you want answered
Typical impact evaluation questions include:
• Did the intervention work?
• What changes occurred as a result of the intervention?
• Have different groups been impacted in different ways, and how and why?
• How has the context influenced outcomes?

2. Identify and prepare the tools you will use to collect information
You can use:
• Questionnaires or output data to collect quantitative information, i.e. numbers that can be measured. For example, you can ask learners to rate on a scale of 1-5 how confident they feel about applying for new jobs. See ‘When to collect information if you are using questionnaires’ for more information. The Toolkit Question Sets section gives some example questions that you could use to help you identify which questions may be helpful. You can also use or develop your own questions.
• Interviews and focus groups to collect information that cannot readily be measured using numbers. For example, if you wanted to better understand any changes to learners’ attitudes to learning, you may want to explore this through an interview or focus group.

Use the research tool table to help you decide.
When to collect information if you are using questionnaires

[Timeline diagram: example questionnaire timings – before and after delivery; after delivery only; several weeks after delivery.]
[Decision diagram: depending on your project design, explore a randomised control trial or explore comparison groups.]

2 Adapted from HM Treasury Magenta Book (2020)
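If you do explore a randomised control trial, the core mechanic is simply assigning each learner to the intervention or the control group at random. A minimal sketch, assuming nothing more than a plain list of learner IDs:

```python
# Illustrative only: randomly split a list of learner IDs into an
# intervention group and a control group of roughly equal size.
import random

learner_ids = [f"L{n:03d}" for n in range(1, 101)]  # hypothetical IDs

rng = random.Random(42)  # fixed seed so the allocation can be reproduced
shuffled = learner_ids[:]
rng.shuffle(shuffled)

midpoint = len(shuffled) // 2
intervention_group = shuffled[:midpoint]
control_group = shuffled[midpoint:]

print(len(intervention_group), "in intervention,", len(control_group), "in control")
```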
Will your project have a waiting list, i.e. engage some people first and others later? Consider using people on the waiting list for the comparison group. This is called a phased intervention.

Will your project allow for different levels of engagement or differentiation, i.e. some participants completing the whole course and others doing one module? Consider using people who only access one module as a comparison group.

Example: TechLearn
Learners enrolled on one of two interactive online learning courses – one in July and one in August. All learners completed a short questionnaire before the July course started, asking about their attitudes to and experiences of learning. At the end of the July course, both groups were asked to repeat the questionnaire. By doing this, the August cohort could act as a comparison group for those who did the course in July, because they had not yet done the course.

Whilst the July course was being delivered, a national educational campaign about learning online and its benefits took place, including adverts on social media and TV. This meant that both groups felt more motivated to engage in online learning. Without the comparison group, it would have been difficult to know how much of the increased motivation of the July learners was due to TechLearn, and how much would have happened anyway.
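To make the TechLearn logic concrete, here is a minimal worked sketch with invented numbers: the change seen in the August (comparison) cohort is subtracted from the change seen in the July cohort, leaving the part of the improvement plausibly attributable to the course rather than to the national campaign.

```python
# Illustrative only, with invented numbers: estimate the course's effect by
# comparing the change in the July (course) group against the change in the
# August (comparison) group - a simple difference-in-differences.
july_before, july_after = 3.0, 4.0      # average motivation scores (1-5 scale)
august_before, august_after = 3.0, 3.4  # comparison group, same questionnaire

july_change = july_after - july_before        # 1.0, includes campaign effect
august_change = august_after - august_before  # 0.4, campaign effect alone

estimated_course_effect = july_change - august_change
print(f"Estimated effect of the course: {estimated_course_effect:+.1f} points")
# -> +0.6 points attributable to the course rather than the wider campaign
```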
ETHICAL CONSIDERATIONS FOR YOUR EVALUATION
As an evaluator you have a responsibility to work ethically. This is important so that the participants in your research are protected from undue harm. Consider the following points when designing and undertaking your evaluation:

Informed consent: You should explain clearly the purpose of collecting the information you need for the evaluation, so participants can make an informed choice about whether they are happy to take part.

Safeguarding vulnerable adults: Anyone undertaking research with vulnerable adults must have an up-to-date DBS check.

Confidentiality: You should reassure participants that all information they share will be reported anonymously to protect their privacy, and ensure there are processes in place to protect personal information – for example, anonymising quotes used in reporting.

Preventing harm to participants: It is important to avoid causing any undue harm to participants taking part in your evaluation. For example, questionnaires should not include any questions that may upset participants. In situations where participants do become upset, the project must have safeguarding provisions and processes in place to signpost the participant to someone who can help.
TOP TIPS ON DEVELOPING KEY FINDINGS AND LESSONS LEARNED
Quantitative analysis

Qualitative analysis
How to report findings

Reporting enables us to draw out lessons learned about ‘what works’ in the design and delivery of your intervention (i.e. mechanisms for learning), and ‘what works’ to generate positive outcomes for learners. We can then share these lessons with external audiences to generate knowledge and learning.

There is no one-size-fits-all approach to reporting your findings. The following are key elements of a good quality report:
• Using a clear style and writing in plain English – you can find advice in ‘How to write reports in Plain English’, which you can download from the Plain English Campaign website
• Writing a clear and concise executive summary that presents the key findings from the evaluation
• Describing what is being evaluated and why
• Setting out the research questions
• Explaining the steps and activities used to answer the questions
• Presenting findings supported by credible evidence in response to the questions
• Presenting data in an unbiased, clear way
• Acknowledging any limitations of the evaluation
• Drawing conclusions and lessons learned about findings
• Proposing concrete and usable recommendations derived from conclusions and lessons learned
• Tailoring report findings to the needs of different audiences – demonstrating how the findings relate to specific audiences’ areas of interest can improve the usability of findings. If you have comms or marketing professionals working in your organisation, get them involved early in developing approaches to disseminating findings and translating these for different audiences.
When reporting findings, it will normally be useful to report both numbers and percentages, although you should only use percentages if you have at least 100 results. For example, you could calculate:

• The percentage of participants who agree or strongly agree with a statement (or equivalents, such as having confidence or not having confidence in a statement), the percentage who are neutral and the percentage who disagree or strongly disagree. For an example of what this could look like, see Table 1 below.
• The percentage of people who complete an outcome, such as finishing the course or participating in additional courses (the completion rate). For an example of what this could look like, see Table 2 below.
• The percentage of people with each overall score (or the percentage in different score ranges). For an example of what this could look like, see Table 3 below.
• Where you have overall scores, it may also be useful to calculate average scores across your participants.

You should consider breaking down findings by participant characteristics. This will enable you to analyse whether your project has a different impact for different groups. For example, you could calculate the completion rate per age group, or the percentage of each gender who agree or disagree with a statement (for an example of how this could be presented, see Table 2 below). However, before doing so you should ensure that no participants are identifiable through such breakdowns – for example, if you only have a small number of people in a particular category. In addition, you should be aware that breakdowns may not be reliable if you only have a small sample size.

You should consider presenting your findings with a mixture of tables and charts. Tables can be a useful way to summarise detailed information, whereas charts can help readers to quickly engage with key findings. If using pre- and post-intervention questionnaires, consider using charts to display before and after findings side-by-side. For examples of the types of charts you could use, see Figures 1 and 2 below.
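A minimal sketch of how these calculations might look in practice. The response data and the group breakdowns are invented; the 100-result threshold and the small-group suppression mirror the guidance above.

```python
# Illustrative only: report counts, and percentages only where the base is
# large enough, following the guidance above. All data here is invented.
from collections import Counter

responses = (["Agree"] * 90 + ["Strongly agree"] * 60 + ["Neutral"] * 60 +
             ["Disagree"] * 60 + ["Strongly disagree"] * 30)

counts = Counter(responses)
total = sum(counts.values())

# Combine 'agree' and 'strongly agree' into a single top-level figure.
agree = counts["Agree"] + counts["Strongly agree"]
print(f"Agree/strongly agree: {agree} of {total}")

# Only quote percentages when there are at least 100 results.
if total >= 100:
    print(f"Agree/strongly agree: {100 * agree / total:.0f}%")

# Breakdown by age group, suppressing categories too small to report safely.
completions_by_age = {"25-34": (57, 100), "55-65": (26, 50), "65+": (2, 4)}
for group, (completed, n) in completions_by_age.items():
    if n < 10:  # illustrative threshold to avoid identifying individuals
        print(f"{group}: suppressed (fewer than 10 participants)")
    else:
        print(f"{group}: {100 * completed / n:.0f}% completion rate (n={n})")
```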
Table 1: Participant attitudes to learning

Statement | Agree/strongly agree (%) | Neutral (%) | Disagree/strongly disagree (%)
I do not consider myself ready to learn something new | 30% | 20% | 50%
The thought of learning something new makes me anxious | 20% | 20% | 60%
I am often anxious when learning | 25% | 50% | 25%

Table 2: Completion rate by age group

Age group | Completion rate
25-34 | 57%
55-65 | 52%
Base: all participants, n = 300

Table 3: Participants by overall score range

Score range | Percentage of participants
9-17 | 20%
18-26 | 45%
27-35 | 35%
[Figure 1: Continuing my education would make me feel better about myself]
[Figure 2: Self-control and belief – average score by age group]
QUESTION BANK TOOL

How to use this question bank

The outcomes we have identified as part of the question bank are not an exhaustive list of all the changes you will expect to see. You should consider which of the shared outcomes align with what your project may achieve, as identified in your Theory of Change.
Shared outcomes and suggested question sets

• Increase in confidence. This includes confidence in everyday life and confidence in gaining employment.
Questions: Confidence in gaining employment questions; Self-control and belief questions.

• Improvements to attitudes and motivation towards learning. This includes reduced anxiety around learning and an increased readiness to learn.
Questions: Attitudes towards learning questions; Readiness to learn questions; Learning anxiety questions.

• Improvements to career adaptability attitudes. This includes motivation for adapting to changes (career control) and perception of transferable skills (career confidence).
Questions: Career adaptability questions; Self-control and belief questions.

• Increased confidence to engage with digital learning.
Questions: Confidence in digital learning measure.

• Improved emotional wellbeing. This includes both mental and physical wellbeing.
Questions: Mental wellbeing questions; Wellbeing and life satisfaction questions.

• High course completion rates.
Questions: Course completion measure.

• Improvements to career adaptability behaviours. This includes both resilience in the labour market (e.g. when applying for jobs, also known as career commitment) and being open to exploring career options (career curiosity).
Questions: Career adaptability questions; Career commitment questions.

• High rate of engagement with the course.
Questions: User engagement questions.

• Increased participation in learning. This includes, for example, progression onto a further learning course.
Questions: Participation in learning questions.

• High rate of career progression. This includes, for example, progressing into another job, increased salary, becoming more secure at work or achieving specific career goals.
Questions: Progression in employment questions.

• Digital skills.
Questions: Digital skills questions.

• Employability skills. For example, leadership, personal development and team working.
Questions: Employability skills questions.
Use of partial question sets

You may wish to use only part of a set of questions in your questionnaire, for example in order to cut down its length or if you consider certain questions to be inappropriate for your project. In this situation, we would still recommend that you use the full set of questions per outcome as outlined in the question bank. If questions are cut, there is a risk that certain aspects of an outcome will not be measured.

Length of questionnaire

We recommend a maximum length of 15 minutes (dependent on the length of participant involvement in your project). Before using your questionnaire with participants, it is important to test it for both accuracy and length. You can do this by asking members of your team and wider organisation to complete it, record the length of time taken to do so and feed back any issues. You should also check their responses to make sure it is being filled out correctly. To keep the questionnaire to a maximum of 15 minutes, you may need to prioritise which outcomes you measure via the questionnaire and explore alternative methods (such as qualitative research) to measure other outcomes.

Pre- and post-intervention questionnaires

We recommend that you ask participants to complete both pre- and post-intervention questionnaires. Including the same questions before and after your intervention allows you to identify a baseline for participants, in order to assess changes that have occurred after completing your project. It is important for pre- and post-intervention questionnaires to be completed by the same participants (a minimal matching sketch follows at the end of this section). All questions in this toolkit are suitable for use in pre- and post-intervention questionnaires unless otherwise stated.

Introductory text

Your questionnaire should contain some introductory text to explain to participants the information you are collecting and why you are doing so. You should also explain any specific instructions for completing the questionnaire and reassure participants about confidentiality. Some example introductory text is given below:

“We are asking everyone who is taking part in our programme to complete a short survey. We would like to ask you some questions to help us to understand how taking part in the programme has made a difference to you. The survey contains questions about different aspects of your work, life and skills to see where you may have experienced benefits or changes from taking part. Your involvement in this survey is entirely voluntary/optional, but we would very much appreciate your input. The survey will take about 15 minutes to complete, and should be completed in one sitting. We will use your response to explore and demonstrate the impact of our work. Your response will remain confidential, with any findings shared or published from this survey remaining strictly anonymous.”
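A minimal sketch of matching pre- and post-intervention responses, assuming hypothetical files pre.csv and post.csv that each contain a participant_id column and a score column. Only participants who completed both questionnaires are kept, as the guidance above requires.

```python
# Illustrative only: pair pre- and post-intervention questionnaires by
# participant ID and calculate each participant's change in score.
# 'pre.csv' and 'post.csv', with columns 'participant_id' and 'score',
# are assumptions - adapt to your own files and question-set scoring.
import pandas as pd

pre = pd.read_csv("pre.csv")
post = pd.read_csv("post.csv")

# An inner merge keeps only participants who completed both questionnaires.
matched = pre.merge(post, on="participant_id", suffixes=("_pre", "_post"))

matched["change"] = matched["score_post"] - matched["score_pre"]

print(f"{len(matched)} participants completed both questionnaires")
print(f"Average change in score: {matched['change'].mean():+.2f}")
```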
Participant characteristics

If you don’t already collect participant demographic information, we strongly suggest including a set of questions in your questionnaire to do so. This will enable you to analyse whether your project has a different impact for different groups.
Characteristic: Age
Question: What age were you on your last birthday?
Response options: 25-34; 35-44; 45-54; 55-65

Characteristic: Employment sector
Question: What sector would you consider your employer to be in?
Response options:
• Agriculture, forestry & fishing
• Mining, energy & water supply
• Manufacturing
• Construction
• Wholesale, retail & repair of motor vehicles
• Transportation & storage
• Accommodation & food services
• Information & communication
• Financial & insurance activities
• Real estate activities
• Professional, scientific & technical activities
• Administrative & support service activities
• Public admin & defence; social security
• Education
• Human health & social work activities
• Other services
TOOLKIT QUESTION SETS
Confidence in gaining employment questions

When should you use these questions?
These questions may be useful if your project aims to increase participants’ confidence, particularly their confidence in gaining employment. The questions have been previously developed by L&W and have been used in several national government surveys.

How to use these questions
These questions can be copied and pasted directly into a questionnaire. It is important to include all six questions, in order to cover all aspects of confidence in gaining employment.

For the first four questions, choosing ‘agree’ or ‘strongly agree’ shows a higher level of confidence. This is also shown by choosing ‘confident’ or ‘completely confident’ for the last two questions.

To analyse the results of the first four questions, you can combine, separately for each question, the number of participants who agreed/strongly agreed, disagreed/strongly disagreed or had no preference. To analyse the results of the last two questions, you can combine, separately for each question, the number of participants who were confident/completely confident, not confident/not at all confident or didn’t know. (An illustrative tally sketch follows this question set.)

Please select how much you agree, or how confident you are, with each statement.
(Response options: Strongly agree / Agree / No preference / Disagree / Strongly disagree)
• Having almost any type of paid work is better than not working
• The thought of being in paid work is better than not working
• I am confident that I can find a job that suits me
• I would be a happier, more fulfilled person if I was in paid work

(Response options: Not at all confident / Not confident / Confident / Completely confident / Don’t know)
• You can do well in job interviews
• You can cope with rejections and knock backs

5 Social Metrics Report
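A minimal sketch of the tallying described in ‘How to use these questions’, combining the top-two and bottom-two categories per question. The response data is invented.

```python
# Illustrative only: combine response categories per question as described
# above (agree/strongly agree vs disagree/strongly disagree vs no preference).
# The responses below are invented.
answers = {
    "I am confident that I can find a job that suits me":
        ["Agree", "Strongly agree", "No preference", "Disagree", "Agree"],
}

for question, responses in answers.items():
    agree = sum(r in ("Agree", "Strongly agree") for r in responses)
    disagree = sum(r in ("Disagree", "Strongly disagree") for r in responses)
    neutral = sum(r == "No preference" for r in responses)
    print(f"{question}: {agree} agreed, {disagree} disagreed, {neutral} neutral")
```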
Self-control and belief questions

Please select how true you consider each statement to be for you.

6 Questions are from the General Self-Efficacy Scale, available here
Attitudes towards learning questions

Please select how much you agree or disagree with each statement.

Readiness to learn questions

Please select how much you agree or disagree with each statement.
• If I don’t understand something, I look for additional information to make it clearer.

8 Questions are from the Readiness to Learn Scale, available here
Learning anxiety questions

How to use these questions
To analyse the results, you can combine, separately for each question, the number of participants who agreed/strongly agreed, disagreed/strongly disagreed or had no preference.

Please select how much you agree or disagree with each statement.
(Response options: Strongly agree / Agree / No preference / Disagree / Strongly disagree)
• I have no desire to participate in learning
• The thought of learning something new makes me anxious
• I am often anxious when learning
Career adaptability questions

When should you use these questions?
These questions are likely to be useful to most projects, as they measure career adaptability. The questions were developed by Savickas & Porfeli and include four sets of questions (sub-scales) which measure career concern, career control, career curiosity and career confidence.

9 Questions are from the Career Adaptability Scale, available here
Different people use different strengths to build their careers. No one is good at everything; each of us emphasises some strengths more than others. Please rate how strongly you have developed each of the following abilities, using the scale below.

Career concern

Career control
• Keeping upbeat
• Counting on myself

Career curiosity
• Exploring my surroundings

Career confidence
• Working up to my ability
• Overcoming obstacles
• Solving problems
If you need to reduce the length of your questionnaire, you may wish to use the short form of the career adaptability questions. These have not been as extensively tested as the long version, but have been found to give similar results. Questions can be included in your questionnaire and analysed in the same way as the long version. Your evaluation manager will be able to advise on which set of questions is best for your questionnaire.

Career concern

Career control
• Counting on myself

Career curiosity

Career confidence
• Working up to my ability
Confidence in digital learning measure

When should you use this measure?
This measure may be useful if your project aims to improve participants’ confidence in engaging with digital learning. The measure contains two parts: firstly, a bespoke set of participant questions developed by L&W; secondly, some suggestions on proxy outcomes that can be directly measured.

Please select how much you agree or disagree with each statement.
(Response options: Strongly agree / Agree / No preference / Disagree / Strongly disagree)
• I know how to find appropriate digital learning courses
Mental wellbeing questions

Please select how much you agree or disagree with each statement.

10 Questions are from the Short Warwick Edinburgh Mental Wellbeing Scale, and are available here
Wellbeing and life satisfaction questions

When should you use these questions?
These questions may be useful if your project aims to improve participants’ emotional wellbeing. The questions have been developed by the Office for National Statistics and are used in national surveys to measure personal wellbeing and life satisfaction.

How to use these questions
For each question, the higher the score chosen, the higher the level of personal wellbeing. To analyse the results, simply record each participant’s score for each question.

Please answer each question on a scale of 1 to 10.
• Overall, how satisfied are you with your life nowadays?
• Overall, how happy did you feel yesterday?
• On a scale where 0 is “not at all anxious” and 10 is “completely anxious”, overall, how anxious did you feel yesterday?

11 Questions are from the ONS Wellbeing questions, and are available here
Course completion measure

When should you use this measure?
This measure may be useful if your project aims include course completion or drop-out rates.
Career commitment questions

• Are you currently considering changing jobs? (Yes / No)
User engagement questions

How to use these questions
For each of the initial five questions, a higher level of engagement is shown by choosing ‘agree’ or ‘strongly agree’. The last two questions are open-ended; to analyse them, responses should be read and grouped into different themes.

Please select how much you agree or disagree with each statement.
• I would like to learn about similar project content in the future

Please describe anything that made it difficult to participate in the project.
Participation in learning questions

• Considered participating in any additional learning activities (Yes / No)
Progression in employment questions

(Response options: Yes / No / Don’t know)
• Felt more secure in your current job
• Moved into a new job at the same level in your current sector
Digital skills questions

12 Questions are from the Digital Skills to Tangible Outcomes Measure, and are available here
Please select how true each of the following statements is for you.
(Response options: Not at all true of me / Not very true of me / Neither true nor untrue of me / Mostly true of me / Very true of me)

Operational skills
• I know how to use shortcut keys (e.g. CTRL-C for copy, CTRL-S for save)
• I know how to change who I share content with (e.g. friends, friends of friends or public)

• I know how to create something new from existing online images, music or video
• I know how to make basic changes to the content that others have produced

Mobile skills
Employability skills questions

When should you use this measure?
These questions may be useful if your project aims to improve employability skills. The questions were developed by Llinares-Insa, González-Navarro et al. and measure employability skills across three different areas: behaviours to protect employment, risks to employment and self-learning in employment.

How to use this measure
These questions can be copied and pasted directly into a questionnaire. The questions cover three areas of employability skills: the first set of questions covers behaviours to protect employment, the second risks to employment and the third self-learning in employment.

For the first and third sets of questions (behaviours to protect employment and self-learning in employment), choosing ‘agree’ or ‘strongly agree’ shows a higher level of employability skills. For the second set of questions (risks to employment), choosing ‘disagree’ or ‘strongly disagree’ shows a higher level of employability skills.

It is not necessary to include all three areas in your questionnaire, only those which are relevant to your project. If you are including an area in your questionnaire, you must include all questions which make it up.

Each area should be analysed separately. For the first and third sets of questions (behaviours to protect employment and self-learning in employment), assign a 5 to ‘strongly agree’, a 4 to ‘agree’, a 3 to ‘neither agree nor disagree’, a 2 to ‘disagree’ and a 1 to ‘strongly disagree’. For the second set of questions (risks to employment), reverse this and assign a 5 to ‘strongly disagree’, a 4 to ‘disagree’, a 3 to ‘neither agree nor disagree’, a 2 to ‘agree’ and a 1 to ‘strongly agree’.

Calculate the average score for each area for each participant. The higher the score, the greater the level of employability skills. (An illustrative scoring sketch follows the question items below.)

13 Questions are from the Employability Appraisal Scale, and are available here
Behaviours to protect employment

Please select how much you agree or disagree with the following statements.
• For me, it is more important to feel good about myself than to receive the approval of others
Risks to employment

Self-learning in employment
• I like to learn new things about my work even if it’s about small details
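A minimal sketch of the scoring described in ‘How to use this measure’, with reverse-coding applied to the ‘risks to employment’ items. The participant’s responses are invented.

```python
# Illustrative only: score the three employability areas as described above.
# 'Risks to employment' items are reverse-coded; the responses are invented.
SCORES = {"Strongly agree": 5, "Agree": 4, "Neither agree nor disagree": 3,
          "Disagree": 2, "Strongly disagree": 1}

def item_score(response: str, reverse: bool) -> int:
    """Convert a response to 1-5, flipping the scale for reverse-coded items."""
    score = SCORES[response]
    return 6 - score if reverse else score

# One hypothetical participant's responses, grouped by area.
participant = {
    "behaviours_to_protect_employment": ["Agree", "Strongly agree"],
    "risks_to_employment": ["Disagree", "Strongly disagree"],  # reverse-coded
    "self_learning_in_employment": ["Agree", "Agree"],
}

for area, responses in participant.items():
    reverse = (area == "risks_to_employment")
    scores = [item_score(r, reverse) for r in responses]
    average = sum(scores) / len(scores)
    print(f"{area}: average score {average:.1f} (higher = greater skills)")
```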
CREATED BY
Learning and Work Institute for the
CareerTech Challenge
DESIGNED BY
Storythings