Article information:
To cite this document: Ian Miles (2012), ''Dynamic foresight evaluation'', foresight, Vol. 14, Iss. 1, pp. 69-81.
Permanent link to this document:
http://dx.doi.org/10.1108/14636681211210378
Dynamic foresight evaluation
Ian Miles
Albert Einstein remarked that ''Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted''.
DOI 10.1108/14636681211210378 VOL. 14 NO. 1 2012, pp. 69-81, © Emerald Group Publishing Limited, ISSN 1463-6689, foresight, PAGE 69
process of the exercise. For example, it is rather plausibly claimed that the impact of taking
part in a scenario workshop is liable to be far more than that achieved as a result of being
handed the report of that workshop.
Sometimes evaluation is taken to mean simply auditing the activity in question – producing
an assessment of whether funds have been spent efficiently (or even effectively). But this is a
very limited understanding of the task, with similarly limited scope for learning from the
evaluation. Miles and Cunningham (2006) discussed the evaluation of innovation programmes in this light, arguing that it requires understanding how, how far, and why programmes have succeeded or failed to realise the desired effects. Given the complexity of innovation systems, the evaluation of such programmes is needed to help us better understand the interaction between policies and innovation processes, and to identify how programmes and broader policies might be designed so as to have more of the desired effects. In other words, evaluation can help us better understand the system we are trying to
influence, by telling us what can be learned from our efforts to exert influence. This means
that we are interested in the impact, the outcome of the work – and in what processes have
led to this, and why. This point applies equally to foresight practice – evaluation should
enable us to understand the object of foresight, and the ways in which foresight can be used
more generally in our social and political contexts. We term this ‘‘dynamic foresight
evaluation’’.
It can be helpful to see foresight as a service that a supplier is providing to (or facilitating within) a user; as such, it can be intensive in terms of user-supplier interaction, and
can be extensive over time. This is very different from, for example, providing a one-off
forecast to an interested party.
The EFMN database features many cases with relatively few participants – more than half of them involve fewer than 50 participants, and are quite possibly simply desk exercises run by relatively few people. This is not to discount the value of such activities – but they are quite different in nature from the large-scale TFPs. (Given the way the EFMN database is constructed, it is possible that some of the activities described as individual cases are actually part of a single large exercise with several specific projects within it.) Most of the
exercises captured had quite long time horizons, but around a quarter were focused on the
relatively short-term future (less than ten years). Longer-term time horizons mean that the
definitive evidence about impact of a TFP (and about how well it went about its task of
appraising future prospects)[3] may not be available for decades. It will be necessary to
assess the exercise in terms of more immediate outcomes, which will need to be
persuasively linked to the longer-term objectives.
Not all were linked to policy or decision-making processes in an obvious way, and a range of
objectives were encountered. Most of the exercises had three to four specific objectives,
spread across two or three of nine families of objectives, ranked here in order of prevalence
across some 200 cases:
1. Orienting policy formulation and decisions [33 cases].
2. Supporting STI strategy- and priority-setting [30 cases].
3. Fostering STI cooperation and networking [29 cases].
4. Generating visions and images of the future [24 cases].
5. Triggering actions and promoting public debate [21 cases].
6. Recognising key barriers and drivers of STI [20 cases].
7. Identifying research/investment opportunities [16 cases].
8. Encouraging strategic and futures thinking [15 cases].
9. Helping to cope with Grand Challenges [12 cases].
Though the families are here defined in rather broad and potentially overlapping ways, the
diversity of objectives is apparent. This also poses a challenge for evaluation – or at least, for
efforts to benchmark exercises – they are not all trying to achieve the same things.
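As a rough sketch, the prevalence ranking above can be reproduced from mapped exercise records with a simple tally. The records and family labels below are invented placeholders, not EFMN data; real mapping records would carry many more attributes.

```python
from collections import Counter

# Hypothetical mapping records: each exercise lists the objective
# families it pursues (illustrative stand-ins for EFMN cases).
exercises = [
    {"id": "ex1", "objectives": ["policy orientation", "STI priorities", "networking"]},
    {"id": "ex2", "objectives": ["visions", "public debate", "policy orientation"]},
    {"id": "ex3", "objectives": ["STI priorities", "networking", "futures thinking"]},
]

# Count how many exercises cite each family at least once
# (prevalence across cases, not total mentions).
prevalence = Counter(obj for ex in exercises for obj in set(ex["objectives"]))

# Rank families by prevalence, as in the numbered list above.
for rank, (family, n_cases) in enumerate(prevalence.most_common(), start=1):
    print(f"{rank}. {family} [{n_cases} cases]")
```

Because most exercises pursue three to four objectives spread across several families, the column of case counts sums to well over the number of exercises.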
This is even the case if we consider the national TFPs outlined in Georghiou et al. (2008).
Here we find cases whose main objectives are informing particular decisions about critical
technologies, priority setting across fields of research, enhancing awareness of future issues
and the opportunities for applying technology to major problems, helping to build consensus on desirable targets and aligning various stakeholders behind these, and promoting networking in the STI system (the famous ''wiring up'' of the national innovation system; see Martin and Johnston, 1999).
Strikingly, we can even point to a substantial reorientation of objectives in the UK foresight
programme over its 16 year history. Originally its aims were to aid research priority setting
and help ‘‘wire up’’ the innovation system; it attempted to map out opportunities across a
very wide range of areas of technology development and application. Its current mission is
seen much more as that of promoting better interaction and networking across different parts of the policy system – bringing together different parts of government that need to have an understanding of each other's actions if policies to deal with long-term challenges (that have a strong technology dimension) are to be coordinated. The criteria for evaluation will need to change to take account of this evolution of objectives. The current objectives also highlight the point that policy impact may involve interaction across different fields of policy, not just the adoption or shaping of specific policy measures.
More generally, while a foresight exercise may often be linked to specific decisions and
planning processes, there will probably be other objectives that are intended, often explicitly
(and we would suspect that when these other objectives are not spelled out explicitly, they
will nonetheless be implicit in the design of the exercise and narratives of the programme
managers). Evaluation will thus need to be conducted against multiple criteria, and understanding how far different sorts of objectives have been realised may call for rather different research methods – surveys of stakeholders and participants, analysis of official documents, interviews with the primary ''users'' of the work, and so on.
If we focus on the large-scale TFPs, then, we still face a wide range of activities, even if we
have excluded the small-scale, short time horizon activities. The TFPs typically take from one
to three years, though sometimes they last longer. But these exercises are often not just
one-offs; it is common to find a national programme becoming a more or less continuous
activity, a series of interlinked activities, often divided into distinct cycles, waves, or stages.
Sometimes the connection between stages is rather loose – a new exercise is launched that
draws to a greater or lesser extent on lessons learned and personnel active in the earlier
programme. Sometimes there is tight coupling. The current UK Foresight, for instance, is a
rolling programme, with a stream of projects that typically last for about two years – at any
one time there will be some projects that are fairly new ones and others nearing completion
or into follow-up stages[4].
the part of programme managers, but could not have been achieved without demonstrable
achievements – most likely the approval of key stakeholders, as well as any more formal
evaluations.
An interesting sidelight on the social and political embedding of foresight, and of TFPs in
particular, can be gleaned from analysis of the use of terminology in publications. An earlier
application of Harzing’s Publish or Perish data analysis system (Harzing, 2007) was to
demonstrate the surge of publications using ‘‘Technology Foresight’’ in their titles, dating
from the mid-1990s, so that it eclipsed the similar use of ‘‘Technology Forecasting’’ and
‘‘Technological Forecasting’’. (The data are drawn from Google Scholar, and thus emphasise
journal articles and academic books, though some policy and business texts and some
more popular literature also slip in; the data are very ‘‘noisy’’, but are generally valuable for
looking at broad trends.) What happens when we consider use of the term ‘‘Foresight
Programme’’? Figure 1 displays the count of publications using the term in their title, Figure 2
those using it in their text. The results are intriguing.
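The title-versus-full-text distinction behind Figures 1 and 2 can be sketched as a simple yearly tally. The records and term below are illustrative placeholders for the noisy Google Scholar output that Publish or Perish returns; the field names are assumptions, not the tool's actual export schema.

```python
from collections import defaultdict

# Hypothetical bibliographic records, standing in for Google Scholar results.
records = [
    {"year": 1996, "title": "The UK Foresight Programme", "text": "..."},
    {"year": 2005, "title": "Wiring up innovation systems", "text": "... foresight programme ..."},
    {"year": 2005, "title": "Evaluating STI policy", "text": "... the foresight programme ..."},
]

TERM = "foresight programme"

def yearly_counts(records, term):
    """Tally, per year, publications using the term in the title
    versus only in the body text (the Figure 1 / Figure 2 split)."""
    in_title = defaultdict(int)
    in_text_only = defaultdict(int)
    for rec in records:
        if term in rec["title"].lower():
            in_title[rec["year"]] += 1
        elif term in rec["text"].lower():
            in_text_only[rec["year"]] += 1
    return dict(in_title), dict(in_text_only)

titles, texts = yearly_counts(records, TERM)
```

In practice the two tallies would be plotted as separate time series; the divergent trends they show are what the following paragraphs interpret.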
In the case of the use of the term in titles, we can expect much of this to refer to discussion of the TFP as a policy innovation – what this actually was, why it was happening, how to create and manage such programmes, and the like. Uses rapidly peaked around the time of the
take-off of foresight programmes in the 1990s, and have declined subsequently. The TFP as an object of intellectual curiosity seems to have declined in salience. This could be because TFPs have become less important in the policy mix – which does not accord with our discussion above. Perhaps, then, they have become less interesting for this sort of discourse because the novelty has worn off, and are only foregrounded when there is something really new happening in TFPs.
In contrast, the number of publications that refer to, but are not centrally about, the TFPs
broadly increases over time. There are many times more publications of this sort, and the
trend is clearly upward, though uneven. This level and trend could well be an indicator of the
impact of TFPs on intellectual discussion, that this has been growing alongside the policy
influence of the programmes. Thus TFPs remain of intellectual interest, but now because of their results (and perhaps the way they are used).
Figure 3 The foresight cycle
Figure 4 Knill and Tosun: five stages of the policy cycle
This can be important in that policy implementation may be dependent on, or much affected
by, the engagement of stakeholders in foresight. The ‘‘Advising’’ and ‘‘Renewing’’ stages of
the Foresight cycle are liable to be contributing to policy implementation in other ways, too,
as the policy plans are translated into new routines and concrete actions for civil servants
and other officials and stakeholders; and ‘‘Renewing’’ and evaluation can coincide implicitly
or explicitly – or in some cases, we might say that they collide. For example, we know of one
case where Foresight recommendations were rapidly translated into a strategy statement. In
a striking example of the uptake of a product into policy, the report of a roadmapping
exercise and scenario workshop looked to be the statement of official policy for the UK
nanotechnology sector. But then implementation was captured by a different agenda; a full
assessment of the impacts of the foresight and the action would be bound to foreground the
tensions and interests that led to this[6].
In reality, of course, a policy is not formulated in a vacuum. At any one time, there may be
several policy cycles underway, and several more in the offing. Even in the STI field, there
may be initiatives underway on IPR, training, tax relief, small firm support, regional
strategies, cluster strategies, research funding, and so on. Policies in some other fields may also be related to the topic – examples might be migration policy, health policy, or trade policy. Different policies could well be at different stages of development, and thus a
technology foresight exercise is liable to feed into (and draw from) them in very different
ways. This will, of course, include links achieved through engagement with some of the
stakeholders involved with the other policies. It is not uncommon for stakeholders to have
interests in multiple policy areas – indeed, one persistent problem with policy networks is
that often the ‘‘usual suspects’’ are hauled in as relevant stakeholders and experts across a
wide range of topics. Foresight programmes should use systematic stakeholder analysis to ensure that they are not simply drawing on this small but loquacious pool – stakeholders who have been drawn into the exercise because they are involved in some way with the main policy process to which the Foresight has been designed to contribute.
With these caveats in mind, we can briefly refer to a helpful account of the role of foresight in
innovation policymaking (Havas et al., 2010), which deploys the idea of the policy cycle and
suggests three major forms of influence. Rephrasing their account, the three roles are:
1. Policy-informing – where formal products (codified information, results and reports) are
supplied to policymakers as inputs into policy design. Havas et al. note that the inclusion
of a wide range of stakeholders into the process is important here.
2. Policy advising / strategic policy counselling – which relates the information generated in
the foresight process to the strategies, instruments and policy options of particular
policymaking bodies.
3. Policy-facilitating – using foresight for collective learning, cognitive alignment of
appraisals of risks and opportunities, and building networks and infrastructures to
support this.
For each of these roles Havas et al. (2010) list a number of concrete outcomes that might be
expected, and that can in principle be the basis of an evaluation effort. Some are much
easier to assess than others, of course. One general message is that evaluation will be much more thorough if it can be built into the process from the outset, so that relevant information is not lost in the pressure of implementing the exercise.
The policy cycle of course influences the foresight process – how it is set up, how
stakeholders (especially civil servants) participate, how results are used, and so on. The
options and choices considered, the stakeholders considered legitimate, the scope for
challenging as opposed to just working within an established framework are all important
conditions where the foresight practitioners may be able to influence outcomes, but may find
themselves constrained by the policy context. (An interesting case of a foresight programme that was mandated to challenge the institutional framework of an innovation system was the French FUTURIS exercise; see Barré, 2008.)
responses from other parts of the system – a Ministry of Research and a Ministry of Industry may have their own agendas to pursue, a research funding agency with a particular sectoral or S&T focus may be concerned that a national exercise will not place sufficient emphasis on its concerns – which may lead to competing exercises (which may be labelled as ''key technologies'', ''roadmapping'', or ''horizon scanning'', while having very much a foresight agenda)[7]. This may create confusion or even some delegitimation of one or both activities
– but even if this does not happen, then disentangling the consequences of the distinct
activities can be problematic. Expert informants may be able to clarify affairs, though they
may also be aligned with one or another side! In addition to various parts of government,
there may be other stakeholder organizations that are also active on the foresight front –
industry and employee bodies, third sector organizations, independent foundations funding
S&T or investigation into the implications of various technology developments.
In such instances, it could be hailed as a success for foresight that it has helped establish a broader foresight culture – by triggering activities in other institutions – even if the way in which this has happened was unintended; in the worst cases it may undermine some of the other goals of the programme.
Beyond impacts
This highlights the inadequacy of the term ‘‘impact’’ for thinking about foresight
interventions. The term is best used for describing physical collisions between inanimate
objects (a ball of putty is impacted by hitting the floor, but it has not learned anything).
People may be impacted by cars, in terms of physical trauma. But people can learn, and
experience and learning are crucially involved in using foresight results (which is one reason
why the results are generally more meaningful to those who have been involved in the
process), and of course in active engagement in foresight activities. The consequences of
foresight activities will very much depend on the orientations of the ‘‘users’’ of the activity –
their existing appraisal of the topic, the effort they are prepared to put into understanding
alternative perspectives, the extent to which they can think beyond existing policy
perspectives. We began this essay by discussing the role of users in coproducing services; if we want to continue to talk of impacts, we should note that the impactee is a coproducer (to a greater or lesser extent) of the impact (which may itself be greater or lesser). Given that the impactee may also have played a role in setting up the foresight activity in the first place, any unidirectional language will be too one-dimensional to capture the effectivity of foresight.
Foresight practitioners may seek to engage users in the most productive way, but –
especially when dealing with large bureaucracies (possibly especially international
organizations) – they are institutionally limited in what they can accomplish[8]. Creating
communities of practice in Foresight, where professionals can be mutually supportive of
efforts to maintain the quality of activities in the face of restrictions that users and sponsors
may want to put into place, and be constructively critical of actual practice, is liable to help
matters. Evaluation can also play a role – if users and sponsors are prepared to learn the lessons it offers about the ways in which the framing of foresight can limit its ultimate value. Thus, evaluation needs to focus less on simple ''impacts'', and much more on how the outcomes of foresight have been coproduced by the various actors engaged (or that should have been engaged) in the process. This is what makes it dynamic foresight evaluation.
Notes
1. A wide-ranging guide to current service research is Maglio et al. (2010).
2. The EFMN web site is still operational at www.efmn.info/; the EFP is available at www.foresight-platform.eu/ (accessed 22/11/2011).
3. Which is definitely not the same thing as forecasting accuracy, though the accuracy of specific
forecasts may be part of this appraisal.
4. It should be noted that the UK Programme is often officially described as having involved three
‘‘cycles’’; the terminology is not precisely the same as that used here. See Miles (2005).
5. An interesting account of the use of expertise at various stages of the policy cycle is provided by
Barkenbus (1998).
6. The ‘‘Taylor Report’’ (Advisory Group On Nanotechnology, 2002) – heavily based on the report of a
Foresight process (documented as Miles and Jarvis, 2001) – was a strategy for nanotechnology in
the UK. However, the strategy was watered down in a policy entitled the Micro and Nano Technology
Manufacturing Initiative. (The ''Micro'' element of this title is, perhaps, suggestive of the industrial interests involved.) This was almost immediately criticised by a Parliamentary Committee as ''an
inadequate response to the Taylor Report. The £90 million over six years on offer from DTI would
have been better spent on the establishment of one or two nanofabrication facilities, along the lines
of those recommended by the Taylor Report. . . . at the cost of a clearly focussed strategy, based on
areas of existing UK strength. . . . the pressures for short term financial returns and for a wide
regional distribution of funding will result in a disparate range of microtechnology facilities being
supported instead of the few world class nanotechnology centres necessary to raise the UK’s
nanotechnology profile. . . . The education and training strategy called for by the Taylor Report has
still to be developed'' (Science and Technology Committee, 2004, p. 3).
7. This element of the politics of foresight exercises is not well-documented, to my knowledge, though
it is often discussed among practitioners. The phrase ''politics of foresight'' returns startlingly few Google hits, most pointing to the same two essays (search performed 22/11/2011). In contrast, the ''politics of expertise'' is the focus of too vast a body of literature to attempt to summarise here.
8. See Loveridge (2009) for further reflections on what he calls institutional foresight.
References
Advisory Group on Nanotechnology (2002), New Dimensions for Manufacturing: A UK Strategy for
Nanotechnology, Department of Trade and Industry, London, DTI Pub 6182 2k/06/02/NP, URN 02/1034,
originally published online at www.dti.gov.uk/innovation/nanotechnologyreport.pdf (but removed);
now available at: www.innovateuk.org/_assets/pdf/taylor%20report.pdf
Barkenbus, J. (1998), Expertise and the Policy Cycle, Energy, Environment, and Resources Center
(EERC), The University of Tennessee, Knoxville, TN, available at: www.oocities.org/hgheleji/en/
publicpolicy/policy_cycle.pdf (accessed 22 November 2011).
Barré, R. (2008), ‘‘Foresight in France’’, in Georghiou, L., Cassingena, J., Keenan, M., Miles, I. and
Popper, R. (Eds), The Handbook of Technology Foresight, Edward Elgar, Cheltenham and
Northampton, MA.
Coates, J.F. (1985), ‘‘Foresight in federal government policymaking’’, Futures Research Quarterly,
Vol. 2, pp. 29-53.
EFMN (2009), Mapping Foresight: Revealing how Europe and Other World Regions Navigate
into the Future, Directorate-General for Research, European Commission, Brussels, EUR 24041 EN,
available at: ftp://ftp.cordis.europa.eu/pub/fp7/ssh/docs/efmn-mapping-foresight.pdf (accessed
22 November 2011).
Georghiou, L., Cassingena, J., Keenan, M., Miles, I. and Popper, R. (Eds) (2008), The Handbook of
Technology Foresight, Edward Elgar, Cheltenham and Northampton, MA.
Harzing, A.W. (2007), Publish or Perish, available at: www.harzing.com/pop.htm (accessed
22 November 2011).
Havas, A., Schartinger, D. and Weber, M. (2010), ‘‘The impact of foresight on innovation policy-making:
recent experiences and future perspectives’’, Research Evaluation, Vol. 19 No. 2, pp. 91-104.
Irvine, J. and Martin, B. (1984), Foresight in Science, Edward Elgar, Aldershot.
Knill, C. and Tosun, J. (2008), ‘‘Policy making’’, in Caramani, D. (Ed.), Comparative Politics,
Oxford University Press, Oxford.
Martin, B. and Irvine, J. (1989), Research Foresight, Edward Elgar, Aldershot.
Martin, B. and Johnston, R. (1999), ‘‘Technology foresight for wiring up the national innovation system.
Experiences in Britain, Australia, and New Zealand’’, Technological Forecasting and Social Change,
Vol. 60 No. 1, pp. 37-54.
May, J.V. and Wildavsky, A. (Eds) (1978), The Policy Cycle, Sage, London.
Miles, I. (2002), Appraisal of Alternative Methods and Procedures for Producing Regional Foresight,
report for the European Commission’s DG Research funded STRATA – ETAN Expert Group Action, CRIC,
Manchester, available at: http://ec.europa.eu/research/social-sciences/pdf/appraisalof-alternative-
methods_en.pdf
Miles, I. (2005), ‘‘UK foresight: three cycles on a highway’’, International Journal of Foresight and
Innovation Policy, Vol. 2 No. 1, pp. 1-34.
Miles, I. and Cunningham, P. (2006), Smart Innovation: A Practical Guide to Evaluating Innovation
Programmes, European Commission, Luxembourg, ISBN 92079-01697-0, available at: ftp://ftp.cordis.
lu/pub/innovation/docs/sar1_smartinnovation_master2.pdf
Miles, I. and Jarvis, D. (2001), Nanotechnology – A Scenario for Success in 2006. HMSO, Teddington,
National Physical Laboratory Report Number: CBTLM 16, available at: http://libsvr.npl.co.uk/npl_web/
pdf/cbtlm16.pdf
Science and Technology Committee (House of Commons) (2004), Too Little Too Late? Government
Investment in Nanotechnology – Fifth Report of Session 2003-04, Report Volume I, The Stationery Office
Limited, London.
Wells, H.G. (1932), ‘‘Wanted professors of foresight!’’, BBC radio programme broadcast
19 November 1932, published in Futures Research Quarterly, Vol. 3 No. 1 (1987), pp. 89-91.
Further reading
Miles, I. (2010), ‘‘The development of technology foresight: a review’’, Technological Forecasting and
Social Change, Vol. 77 No. 9, pp. 1448-56.