Unit Description
This unit provides an opportunity for students to select a topic of interest in an area and/or application of
Science, Technology, Engineering or Mathematics (STEM). Students will then research and develop a STEM
communication that informs and educates a generalist audience (i.e. an audience without detailed knowledge
of science) on this area and/or application of STEM. Evaluation of the effectiveness of the STEM
communication will be an integral part of the planning, delivery and reflective components of the unit.
For 2022, the STEM communication task will be the development of audio-video resources (e.g. a narrated
PowerPoint presentation) on a STEM topic of your choice. Students will work individually to research and
develop a STEM communication that provides background understanding of the topic and explains its
relevance and benefit to society. Throughout the unit, students will reflect on the development and delivery
of their STEM communication and take part in group discussions to provide feedback to other students on
their STEM communication development and delivery.
To pass this unit you need to achieve at least 50% overall for the unit.
Details of each assessment item and their alignment with the Intended Learning Outcomes (ILOs) are provided
in the Assessment Schedule in the Unit Outline.
Examples of STEM Communication Topics (but you can choose the area / topic)
Assessment Task 1 – Literature Review and STEM Communication Planning (Weighting 30%)
Task description
Select an emerging field or application of STEM and undertake a literature review of this field or application for
the purposes of developing knowledge and resources for preparation of a presentation to a generalist audience.
(Feedback will be provided by peers and teaching staff, through group discussion, prior to submission for
assessment.)
Due date – 11:59 pm Sunday August 21
Assessment criteria
• Demonstrate cognitive-analytical, critical, creative skills by gathering, synthesising and critically evaluating
information from a range of sources.
• Demonstrate knowledge in at least one STEM disciplinary area through development of a STEM
Communication plan.
• Demonstrate an understanding of the role and relevance of STEM in society.
• Demonstrate the capacity for independent, autonomous, self-directed learning.
• Demonstrate the ability to work in a professional manner following relevant Work Health and Safety (WHS),
ethical and inclusive practices.
Task length - Indicative length, 1000 – 1500 words
Assessment Task 2 – STEM Communication
Task description
Prepare and deliver a communication on your selected emerging field or application of STEM to a generalist audience that
informs and educates the audience of the emerging field or application of STEM.
(Feedback will be provided by peers and teaching staff, through review and practice delivery of the
communication, prior to assessment.)
Due date – 11:59 pm Sunday October 2
Assessment criteria
• Be an effective communicator by communicating STEM information to a generalist audience using appropriate
language.
• Demonstrate knowledge in at least one STEM disciplinary area.
• Demonstrate the ability to work effectively, responsibly and safely in an individual or team context as required.
• Demonstrate the ability to work in a professional manner following relevant WHS, ethical and inclusive practices.
Task length - An audio-video resource of 10 to 15 minutes duration
Assessment Task 3 – Reflective Journal
Task description
Complete a reflective journal throughout the preparation and delivery of your STEM communication, reflecting,
reviewing and evaluating your work in developing the communication and the effectiveness of the
communication in communicating an emerging field or application of STEM to your selected generalist
audience. (Feedback will be provided by peers and teaching staff, through group discussion, prior to
submission for assessment.)
Due date – 11:59 pm Sunday October 23
Assessment criteria
• Demonstrate the ability to evaluate effective programs for learning through the review, reflection and
evaluation of STEM communication(s).
• Demonstrate the capacity for independent, autonomous, self-directed learning.
• Critically evaluate your own contribution to the development and delivery of your STEM communication,
including:
  • application of initiative and innovative approaches
  • ability to work independently
  • ability to lead
• Demonstrate the ability to work in a professional manner following relevant WHS, ethical and inclusive
practices.
Task length - Indicative length, 1000 words
References
• J. K. Gilbert & S. Stocklmayer (Eds.), Communication and Engagement with Science and Technology: Issues
and Dilemmas, 2013, Routledge, New York
(https://www-taylorfrancis-com.ezproxy.utas.edu.au/books/e/9780203807521)
• L. A. Orthia, Chapter 5: Negotiating public resistance to engagement in science and technology, in Gilbert &
Stocklmayer, 2013
• S. Stocklmayer, Chapter 2: Engagement with science: Models of science communication, in Gilbert &
Stocklmayer, 2013
• S. Perea & S. Stocklmayer, Chapter 11: Science communication and science education, in Gilbert &
Stocklmayer, 2013
• L. J. Rennie, Chapter 12: The practice of science and technology communication in science museums, in
Gilbert & Stocklmayer, 2013
• S. Spicer, The nuts and bolts of evaluating science communication activities, Seminars in Cell &
Developmental Biology, 2017, 70, 17–25
(https://www-sciencedirect-com.ezproxy.utas.edu.au/science/article/pii/S1084952117304640)
• Office of Science and Technology (OST) & The Wellcome Trust, Science and the Public: A Review of Science
Communication and Public Attitudes to Science in Britain, 2000, The Wellcome Trust, London
(https://wellcome.ac.uk/sites/default/files/wtd003419_0.pdf)
• P. Weingart, L. Guenther & M. Joubert, Science communication is on the rise – and that’s good for
democracy, The Conversation, August 4, 2016
(http://theconversation.com/science-communication-is-on-the-rise-and-thats-good-for-democracy-62842)
Lecture Slides
Part 1 – Public perception of science and scientists
Scientists and science communicators often wish to communicate science with the public because they believe
that science is exciting, important, and useful. Not everyone, however, feels wholly positive about
science. Some members of the public may be offended by certain scientific theories, or feel that
particular technologies are frightening and dangerous.
Orthia, in Gilbert & Stocklmayer, 2013, Chapter 5
Since 2008 there has been a focus on fostering a culture of public engagement in higher education
plus an impact agenda that demands scientists provide evidence of how their work, including their
science communication, is making a difference.
Spicer, 2017, p. 17
A study by the Office of Science and Technology (OST) and the Wellcome Trust explored public attitudes to
science, engineering and technology in the United Kingdom (OST, 2000). A survey of 1839 people about their
attitudes towards science identified six attitudinal clusters:
• Confident Believers
• Technophiles
• Supporters
• Concerned
• Not Sure
• Not for Me
From the qualitative research, respondents had four main requirements regarding their relationship with
scientific debates. First, they wished to be put in a position in which they were able to have a reasonable
opinion. Second, they wanted a framework in which to place both breakthroughs and disasters, and
everything in between. They felt the need for information that was genuinely objective and distanced from the
very many, often very powerful interests participating in the debate. Third, they wanted to feel not only
generally informed but also educated. Finally, they wished to be consulted, although respondents had
no idea how that consultation might work.
OST, 2000, p. 26
Survey statements and response percentages (OST, 2000):
• “Science is driven by business – at the end of the day it’s all about money”: 22 / 39 / 17 / 15 / 2
• “Scientists seem to be trying new things without stopping to think about the risks”: 11 / 45 / 18 / 17 / 2
• “Rules will not stop researchers doing what they want behind closed doors”: 20 / 50 / 13 / 10 / 2
Emotional controversies
Science, as a source of medical, environmental, agricultural or industrial technologies, gives rise to many
emotional controversies, with three examples given in Chapter 5:
• The Deaf community and the cochlear implant
• Nuclear power, nuclear disasters and climate change
• Animal homosexual behaviour and anti-gay science
Orthia, in Gilbert & Stocklmayer, 2013, Chapter 5
• Vaccinations (a further example)
Part 2 – STEM communication
One-way information
• To inform the reader, listener or viewer (no other effect)
• To inform research in science communication
• To inform policy
• To affect attitudes (and possible behaviour)
• To inform as an expert witness
• To facilitate creation of theoretical models
• To ‘educate’ (as understood by the ‘deficit model’)
Knowledge sharing
• To assist in formulation of policy
• To mediate diverse perspectives by exchange of knowledge
• To facilitate and integrate interdisciplinary approaches
Knowledge building
• To create new meaning or understanding from different knowledge systems
• To enable action in complex environments through integration of knowledge in order to construct new
meaning
Stocklmayer, in Gilbert and Stocklmayer, 2013, Chapter 2, Table 2.2
Weingart et al., 2016, Science communication is on the rise – and that’s good for democracy
• “The mid-1950s saw the birth of a science literacy movement in the US called “Public Understanding of
Science”. Americans were reeling after the USSR sent a satellite, Sputnik 1, into space. The “Public
Understanding of Science” initiative was designed to mobilise public support for the costly project to put a
man on the moon. It was also hoped that more young high school graduates would choose to study maths,
physics and engineering rather than creative writing and philosophy.”
• “The 1970s brought controversies about the risks of nuclear power; in the 1980s people started to fear
recombinant DNA and genetically modified crops. These concerns brought about a new paradigm
known as “Public Engagement with Science and Technology”, or PEST.”
• ““Public Understanding of Science” had been based on the assumption that knowing more about science
implied trust in and acceptance of science. PEST was based on the belief that there needed to be more of
a dialogue between scientists and the public.”
• “Since then there’s been an increasing “democratisation” of science. Governments actively promote
public accounting by science in a bid to secure legitimacy for the considerable expenditures of an enterprise
that receives public resources but is largely opaque to the outside observer.”
• “This puts the scientific community in a position where it has to convince the public of two important
things. First, that it’s delivering “value for money” by doing a good job and, second, that it is responsive to
the general public’s needs and interests. To achieve both of these aims, scientists must communicate.”
Recommended reading
P. Weingart, L. Guenther & M. Joubert, Science communication is on the rise – and that’s good for democracy,
The Conversation, August 4, 2016
If scientists or science-defenders make a decision that directly puts people in danger, our trust in science
can plummet dramatically. Regardless of the benefits of science and technology, it only takes one
frightening risk to make people wary.
In the United Kingdom, this occurred in the late 1980s and early 1990s during the controversy over Bovine
Spongiform Encephalopathy (BSE), a lethal syndrome found in cattle, commonly known as “mad cow disease”.
British scientists and politicians foolishly reassured the public that eating British beef was not
dangerous, despite the prevalence of BSE amongst British cattle. This turned out to be incorrect advice,
with over 150 people dying in the UK of a disease caused by eating BSE-contaminated meat. Millions of
cattle have since been slaughtered. This incident has been linked to record low levels of trust in science
among the British public (House of Lords Select Committee on Science and Technology, 2000; Jasanoff, 1997).
Orthia, in Gilbert & Stocklmayer, 2013, Chapter 5
Recommended listening
Danielle Torresan, John Hunt, Kate Hughes, Scott Warner and Cathy van Extel (moderator), Cleaning up chemical
contaminants, ABC Big Ideas, September 23 2019 (54:05 minutes)
It is common for scientific controversies to be framed in narrow terms that artificially separate the
scientific aspects from other aspects such as economics and ethics. The result is that people in the
community often feel frustrated and powerless, and see scientists as out of touch with the rest of society.
Three basic principles can help science communication efforts succeed in this way (adapted from Chilvers,
2008):
1. Widespread involvement of affected people from the beginning of the process;
2. Transparency in explaining the relevant science;
3. Discussions over a long time period that include diverse alternative perspectives.
The first step is to stop thinking of members of the public as a “target audience”, and to start thinking of them
as communication partners (Kirk, 2009).
The second step is to challenge our own assumptions about science and open our minds to the possibility
that people hostile to science have many good reasons for their views. … We all need to know that we have
been heard, that our communication efforts have been taken seriously.
Orthia, in Gilbert & Stocklmayer, 2013, Chapter 5
Forcing partners to learn scientific details will only lead to frustration. … we as science communicators must
be prepared to provide the scientific details in everyday language but without oversimplifying.
… people communicating science so often claim that science is simply the truth, … communicating in a
technical language that most people cannot understand. … While this may win arguments by forcing others to
back down and bow to an “expert”, it is not effective or helpful.
… scientists may leave out the uncertainties inherent in science, worrying that expressing uncertainty will
weaken their argument. Research has shown, however, that scientists who … admit the uncertainties of
science … are seen to be more trustworthy and believable by members of the public. (Jensen, 2008)
Orthia, in Gilbert & Stocklmayer, 2013, Chapter 5
… communication must continue to take place over a long period of time. Time gives people an opportunity
to become fully informed and to develop complex understandings of the issues, because they have time to
think about them, talk about them, and digest the information.
Orthia, in Gilbert & Stocklmayer, 2013, Chapter 5
Constructivist learning
The important role of inquiry and hands-on experience in learning about science has been implicit for
decades, even centuries. …
Inquiry-based teaching and learning are characterised by the responsibility placed on the learner, which
promotes learning experiences that have personal significance (Simon & Johnson, 2008). To achieve this
outcome, teachers should focus less on transmitting information and more on facilitating active
learning experiences for their students.
This learner-centred approach is closely allied with constructivist learning, which stipulates that new knowledge
should be built on existing understandings, so that actual learning translates into a more active,
personally meaningful exercise. Learning science through inquiry requires students to scaffold (build a
connected network of) prior knowledge, construct understandings, evaluate alternative conceptions, apply
ideas in a socio-cultural context, and engage in open-ended questions, co-operative learning, and reflection
(Shymansky et al., 1997), all of which are elements of a model constructivist classroom.
Perea & Stocklmayer, in Gilbert and Stocklmayer, 2013, Chapter 11
In 2000, an important report was formulated by the UK Government which influenced thinking about science
and the public in most countries practising and teaching Western science (House of Lords, 2000). It effectively
accused the PUS movement of being top-down and arrogant, and advocated a different approach to the
relationship between science and society:
Although scientists are a minority of the population, democratic citizenship in a modern society depends,
among other things, on the ability of citizens to comprehend, criticise and use scientific ideas and claims …
the applications of science raise, or feed into, complex ethical and social questions.
(Section 1.11)
The emphasis had thus shifted from factual knowledge to an understanding of the impact and
implications of science in an increasingly technological world.
As long as students see science as irrelevant and divorced from everyday experiences, they are unlikely to
become scientifically engaged citizens. In the classroom, this implies that students’ beliefs and opinions are
afforded respect and consideration, which is an important factor in inquiry learning but one that is often
overlooked.
Perea & Stocklmayer, in Gilbert and Stocklmayer, 2013, Chapter 11
The communication of science and technology in everyday, informal environments is characterised by choice.
People may choose to notice and engage in various opportunities to learn about science, or they may
not. If they do choose to engage, then they are generally in control of how they interpret the science
information offered. However, the science that is on offer is not usually science as scientists see it. Instead,
these informal sources of learning present their information in a narrative or story form. Developing that
science story requires selecting, packaging, and presenting science information in such a way that the
intended audience is motivated to engage with it, can understand and make use of it, according to their
own needs, interests, and experience.
Because choice is so central to the process of science communication in informal environments, successful
communication requires that an informal organisation, such as a science museum, understands its audience
well enough to present science stories that are interesting and relevant, and with which the audience is willing
to engage.
… given the constraints of institutional agendas, visitors are most likely to have opportunities to learn science
via engagement with exhibits and exhibitions that are designed mainly to communicate knowledge. … most of
this will be “knowledge that”, rather than “knowledge how” or “knowledge why”, but importantly,
“knowing that” is a precursor to learning how and why.
Rennie, in Gilbert and Stocklmayer, 2013, Chapter 12
Part 3 – Evaluation of STEM communication
Since 2008 there has been a focus on fostering a culture of public engagement in higher education plus an
impact agenda that demands scientists provide evidence of how their work, including their science
communication, is making a difference. ... Effective evaluation needs to be planned
(Spicer, 2017, p. 17)
In the Research Councils UK’s guide to evaluating public engagement (see Table 2), evaluation is described as
a process that takes place before, during and after an activity or programme of activities. It not only allows
you to determine if the content and delivery of your science communication has worked, but more importantly,
to identify what has not worked and why. Evaluation can provide evidence to demonstrate the value,
benefits and impact of your engagement in relation to your aspired objectives, for example an increase in
awareness or understanding, or a change in behaviour. It helps you learn from your actions so you can be
more successful in achieving your engagement objectives in the future.
Spicer, 2017, p. 20
The nuts and bolts of evaluating science communication activities (Spicer, 2017)
Three examples based on actual experiences of developing and conducting evaluation have been selected
to illustrate the steps in the process (see Table 1).
The first example is ‘Wriggling Rangoli’ from Manchester implemented by Pennock, Cruickshank and Else [12]
who aimed to raise awareness of the dangers of parasitic infection and to share knowledge and
experiences with those who had lived in affected areas before moving to the UK. Working in partnership with
community organisations, they ran a workshop for local Asian women and their children, at which they
shared their knowledge of the science of parasitic infections, inviting the women to share their own experiences
and knowledge.
The second example is called ‘From Supermarket to Sewers’ and is based on a science show originally
developed by the Museum of Science and Industry for school children aged 8–14 years to deepen
understanding of how the human digestive system works and encourage healthy eating. Offered as
part of the Museum’s school and family programme, the 25-min interactive show used theatrical
demonstrations and on-stage experiments to engage young people with the science of the human
digestive system and address a key health issue.
The final example is ‘Science Spectacular’ [14], a one-day interactive science event for families run as the
University of Manchester’s main contribution to the Manchester Science Festival (see Fig. 2). Based on campus
in two buildings, the event involves 200 researchers offering 40 different table-top interactive activities
covering all areas of science such as discovering how a plane flies, investigating the science behind music and
how to help prevent antibiotic resistance. … There are about 2000 visitors and the majority are families from
the Greater Manchester region.
Spicer, 2017, p. 20
How to start
Ideally you should start planning your evaluation at the same time as you are planning your science
communication activity …
It is important to be clear about your purpose right from the start: is it to learn from and improve your
practice, to demonstrate impact and success in achieving what you set out to do, or a mixture of both?
There are two main modalities of evaluation, formative and summative.
• Formative evaluation is about process and is when you assess if your activity is working: think of it as
testing your ideas. Does the activity work? Can it be improved? Is it suitable for the audience you are
engaging with? This allows you to modify what you are doing later on.
• Summative evaluation usually happens at the end and it assesses and evidences the impact of your
science communication activity, project or programme. It addresses the question as to whether your activity
is making a difference. To successfully demonstrate this you need to plan how you will establish a
baseline from which you can evidence any change.
Table 2 - useful guides and resources from other organisations and higher education institutions.
Spicer, 2017, pp. 20-21
To keep focused, it is a good idea to have a simple evaluation plan, a step-by-step guide which summarises
the whole process from the aims and objectives to how the results will be reported.
Start by considering your aims and objectives.
• aims are what you want to achieve overall from your science communication activity
• objectives are how you will implement your science communication activity to achieve your aims.
Keep them SMART [15]:
• Specific: do your objectives state what you will do and with whom?
• Measurable: can you measure their success?
• Achievable: do you have enough time and resources to achieve your objectives?
• Relevant: do they meet your aims?
• Time-bound: do they include timescales?
It is important to identify who will be involved in your evaluation and what challenges may arise during
the evaluation process. Remember to include not only your public audiences but also yourself and your
team, and any partners you are working with.
Spicer, 2017, p. 21
Careful thought has to be invested in … identifying appropriate and effective key evaluation questions. …
should reflect your aims and objectives, and relate to what you consider success will look like. Ideally have a
minimum of two questions and a maximum of six, and ensure they relate to the evidence you can collect.
They should not only measure outputs (the results of your activity, event or programme) but also outcomes
(a change or benefit such as an increase in awareness, knowledge and understanding; the
development of skills or confidence; a change in behaviour or attitude; or a change in practice or
procedure).
Outputs are often easier to identify and measure, whereas outcomes can be more challenging but can be more
insightful and reveal the impact of your activity.
The AHRC provides a definition of outputs, outcomes and impact in their guide to self-evaluation (see Table 2).
Spicer, 2017, p. 22
Data collection
To answer your evaluation questions you need to collect data, which is likely to comprise a mixture of
quantitative data (numerical, factual answers that can be counted, such as visitor numbers, the selection of
predetermined answers to questions, or web page downloads) and qualitative data (open responses such as
answers to open-ended questions, drawings, videos or observations).
You can find more information on sampling on the Better Evaluation website (see Tables 1 and 2) and
resources with evaluation techniques are provided in Table 2.
Data collection can vary from the traditional questionnaire and interview to more creative methods such as
drawing or graffiti walls. Better Evaluation (see Table 2) gives examples and categorises them into:
A - Information from individuals; for example, in-depth interviews or questionnaires, comment cards, online
surveys, drawings, emails, placing stickers on a scale/image, recording experiences on scales radiating
outwards like a dart board, using an event passport [16], self-selecting voting by placing a ball in a tube or a
coloured token in a jar, using an electronic voting pad or voting app, project diaries or logs, social media
Facebook wall.
B - Information from groups; for example, focus groups, mapping knowledge and understanding to show
change after the outreach, World Café [17], graffiti walls, moving physically into spaces to rate experiences
such as good, OK or not good.
C - Observation; for example, observation of an outreach activity by a neutral observer, use of video or
photography to record the activity and/or interviews with participants.
D - Physical measurements; for example, collecting postcode data, web site analytics, head count.
E - Reviewing existing records and data; for example, planning notes, meeting minutes, personal logs and
diaries.
Spicer, 2017, p. 22
Data collection
When collecting your evaluation data remember to adopt the same ethical approach that you would
when conducting your research. Treat all participants with respect and inform them that evaluation is
taking place. Always ask permission to record or observe them and, particularly when working with young
people, follow your institution’s consent procedures or use the guidelines provided by the UK’s NSPCC [18]. If
you ask people for sensitive information such as their ethnic background, religious beliefs or political opinions,
you need to be aware of the legal implications of the UK’s Data Protection Act [19] or equivalent in other
countries. Be clear why you need personal information and, if it is essential to your evaluation, how you will
anonymise the data, who will have access to it, where you will safely store it and how long you will keep it. If
you are planning to publish your evaluation results then you need to go through your institution’s
ethics board.
When writing questions, be clear and avoid using language and jargon that could be misunderstood. Keep your
language simple and easy to understand. Check that none of your questions are leading or biased in the
way that could possibly skew your results. … Be careful you are not ambiguous or too vague, … And do not ask
multiple questions in one, … Aim to have a balanced number of questions that are positive and negative, as
this will help to ensure that participants answer truthfully, without feeling that they have to give the answers
that they think you want. Finally, you might also like to consider having a question that looks for
unanticipated positive (and negative) outcomes so you do not miss the results you had not planned for.
For example: was there anything you were not expecting, or is there anything else you would like to add?
Spicer, 2017, p. 23
When planning your evaluation, decide how much data you want to collect and how much time and
capacity you will have available to analyse it. There is no point in spending energy on collecting data unless
you plan to analyse it and learn from the results. Now consider how you will analyse the information you
have collected to assess if you have answered your key evaluation questions. How you do this depends on
whether the data is quantitative (numbers) or qualitative (words and images). Useful information is given in
various resources such as the RCUK’s guide to evaluating public engagement (see Table 2).
The BMJ has an excellent collection of resources on conducting and analysing qualitative research (see Table
2).
Spicer, 2017, p. 23
Once you have analysed your data then you should interpret what you have found. List your key findings
both positive and negative and link them to your evaluation questions and critically reflect on what you
have learned. If you are conducting formative evaluation, ask yourself: What has worked well? What has
not worked so well? What could be improved? What should we do differently next time? Identify
recommendations that can be taken forward. If you are evaluating the impact of your science communication
activity, assess if you have been successful in achieving your intended outcomes. What evidence has
supported your conclusions? What stories are emerging? Are there any unanticipated outcomes that you had
not intended?
You should be aiming to integrate evaluation into your science communication work so it becomes an
everyday part of developing activities, events or programmes.
Spicer, 2017, p. 24