
The impact of mobile technology on students’ experience of learning and assessment in practice settings: a focus group


Jackie Haigh

University of Bradford

Christine Dearnley

University of Bradford

As part of the ALPS CETL (HEFCE 2007), 29 student midwives and their link lecturers were given an electronic version of a clinical portfolio on Personal Digital Assistants (PDAs). These devices were used during a seven-week clinical practice placement to record tripartite assessment interviews and clinical experiences relevant to the performance indicators of the placement.

Focus groups explored the impact of the electronic portfolio on the students’ experience of clinical practice and its assessment. Data were analysed from an activity theory perspective, in that the electronic portfolio was viewed as an artefact mediating situated knowing about student assessment in a particular socio-historical context. Findings suggest that the changes made in the electronic version of the marking criteria mediated a shared understanding of the assessment process that was pragmatic and less contested by students; however, this still required skilful facilitation by the link lecturer and needed to be introduced sensitively to clinicians.

Changing the assessment tool has the potential to change the shared understanding of the assessment process. Before seeking to change the tool radically for all professional groups, it is essential to understand current processes and how the introduction of mobile technology and new tools might affect them.

This paper analyses student experiences of using an electronic assessment tool on a PDA to facilitate self-assessment, feedback and grading in clinical practice. It explores the potential of the new tool to mediate a shared understanding of the assessment
process and to enhance student learning from experience in practice. The significance of the project lies in its capacity to illuminate how new technologies and tools can change the shared meaning of professional practice assessment.

SOLSTICE 2007 Conference, Edge Hill University

Keywords: clinical assessment, activity theory, mobile technology

This paper reports on one aspect of the findings of a case study conducted as part of an
Assessment & Learning in Practice Settings (ALPS) IT pilot project. ALPS is one of the
74 Centres for Excellence in Teaching and Learning (CETLs) funded by the Higher
Education Funding Council for England. It is a collaborative programme between five
Higher Education Institutions with proven reputations for excellence in learning and
teaching in medicine, dentistry, health and social care. Practice settings in health and social care education refer to hospital- and community-based work placements for students. The findings of this study may also be relevant to other assessed work-based learning situations.

ALPS aims to develop and improve assessment, and thereby learning, in practice
settings for all Health & Social Care students and to develop the competence of people
who support and assess Health and Social Care students in practice settings. Crucially,
it is intended that assessment and learning in practice settings will be enhanced by
electronic mobile delivery of learning objects and assessment documents. Exploring the feasibility and potential of mobile learning and assessment has therefore become a central strand of the work of ALPS. This paper focuses specifically on the impact of assessment tools on the student experience, arguing that use of a mobile format in itself necessitates change. Such change, if informed by clear pedagogical principles, can enhance the student experience of being assessed.

A cohort of thirty first year student midwives was selected as the case study group to
trial using a PDA (personal digital assistant) in a clinical assessment context. The
midwifery programme already assessed practice through a paper portfolio document. At
the beginning of each placement students have a preliminary interview with their clinical assessor and link lecturer, during which an action plan is agreed. The learning opportunities in the placement are identified and the performance indicators to be achieved are discussed. An intermediate interview midway through the placement is a chance to
review progress towards the performance indicators and amend the action plan if
necessary. At the end of the placement the student discusses her achievement in all
performance indicators and these are signed off by the assessor if an adequate
standard of performance has been achieved. The assessor then grades the overall
performance using a marking criteria grid to assess the level of student attainment of
the module learning outcomes for the particular stage of the programme. Changing this well-established process to include a PDA format was thought to be relatively unproblematic: the portfolio was already available electronically as a Word document.

The pilot group had completed one placement using the paper portfolio and would be in placement again within the time span of the project; they were therefore ideally placed to compare the two tools. The assessment process included three clinical interviews
involving student, link lecturer and clinician. This allowed the PDA to be tested over time
as a tool to facilitate an assessment interaction between three people in a clinical
setting. Our initial project plan was to create electronic performance indicator and
assessment interview forms using Microsoft Word© which students could work on using
their personal computers or the PDA. These forms would replicate, to the extent that it
was possible, the paper based format with which students and staff were familiar. In
addition, a new form would be created which amalgamated the learning outcomes and
assessment criteria into a simple dropdown assessment form, since it was recognised
that huge chunks of text and the necessity to scroll should be avoided on the PDA if at
all possible.
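Amalgamating the learning outcomes and assessment criteria into drop-downs effectively reduces the form to a short list of criterion and grade-band pairs, so the screen shows one selection per criterion rather than large blocks of scrolling text. The following Python sketch illustrates the idea only; the criterion names, grade bands and functions are invented for this example and are not those of the actual ALPS form:

```python
# Illustrative sketch (not the actual ALPS form): the amalgamated
# assessment form as one drop-down entry per criterion.
# Criterion names and grade bands here are invented for the example.

GRADE_BANDS = ["Refer", "Pass", "Good", "Very good", "Excellent"]

def make_form(criteria):
    """Build one drop-down entry per assessment criterion."""
    return [{"criterion": c, "bands": list(GRADE_BANDS), "selected": None}
            for c in criteria]

def select(form, criterion, band):
    """Record the band chosen from a criterion's drop-down list."""
    for entry in form:
        if entry["criterion"] == criterion and band in entry["bands"]:
            entry["selected"] = band
            return entry
    raise ValueError(f"unknown criterion or band: {criterion}, {band}")

form = make_form(["Communication", "Clinical safety", "Record keeping"])
select(form, "Communication", "Good")
```

Because each criterion carries its own small band list, nothing on screen ever needs more than a single drop-down's worth of text, which was the design constraint the PDA imposed.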

We anticipated that forms could be prepared in advance on the student’s own computer, added to by link lecturers and mentors during the interviews, signed electronically and beamed by Bluetooth from the student’s PDA to the link lecturer’s PDA on completion of the interview. The link lecturer could then feed the electronic form back to a central database held by the divisional administrator. Both the student and the Midwifery Division would have a complete and verified record of the student’s placement experience stored electronically and available for printing.
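The intended flow amounts to a simple pipeline of states that each assessment record passes through. The sketch below is purely illustrative of the plan described above; the state names, field names and function are invented for this example, not taken from the ALPS system:

```python
# Illustrative state machine for one assessment record in the planned
# workflow. The states mirror the steps described in the text:
# prepare on PC, complete in interview, sign, beam, archive centrally.

WORKFLOW = ["prepared", "completed_in_interview", "signed",
            "beamed_to_lecturer", "archived_centrally"]

def advance(record):
    """Move a record to the next workflow state, stopping at the last."""
    i = WORKFLOW.index(record["state"])
    if i + 1 < len(WORKFLOW):
        record["state"] = WORKFLOW[i + 1]
    return record

# One student's record for one placement, stepped through the pipeline.
record = {"student": "midwife-01", "placement": 2, "state": "prepared"}
while record["state"] != "archived_centrally":
    advance(record)
```

Modelling the record as a linear sequence of states captures the design intention: at the end of the pipeline both the student and the division hold the same verified copy.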

As we tried to put this plan into action we came up against a few technical hitches. Pocket Word© did not support tables, drop-down boxes or many of the features we had taken for granted in our own use of the professional word-processing software. We soon realised that it was not sophisticated enough for our purpose. We spent a frantic few days searching for alternatives and chose Pocket Forms© as a programme more suited to our needs; however, it was far from ideal. Our IT expert was able to create the required forms on a PC, but the cost and complexity of providing the full Pocket Forms© programme to students was prohibitive and would not have adequately solved the problem of allowing students easy access to their records whether on PC or PDA. We were forced to compromise and offer a system which required all data to be added via the PDA.

A second problem was bringing all these forms together to create a database of the
placement assessments for the whole group. In the paper version the students had
handed in their portfolio at the end of the year so that marks could be recorded.
Creating a contemporary record of student achievement in placement was felt to be
necessary for the security of the electronic version. This was not easily achieved
through Pocket Forms© and necessitated the recruitment of a freelance computer software consultant to design a bespoke central database which would accept the Pocket Forms© data.

Several forms were needed to suit the different purposes of the portfolio. We had a form for students to record their experience in each performance indicator and to note any relevant references. Another form recorded the details of each placement interview. Finally, an assessment form integrating learning outcomes with assessment criteria was created.

At this point we found another limitation of Pocket Forms©: it was unable to calculate an average mark from a series. We overcame this by creating an Excel form for the link lecturer to use to record the marks. This left our designed assessment form somewhat redundant, so we decided to use it as a student self-assessment form instead. This idea was not purely opportunistic but rather grew out of existing practice. Students had been asked to self-assess in the paper version, but only as a comment on the interview sheet; they had not been asked to grade their own performance using the assessment criteria as a guide. In academic assignments, however, these students had been grading themselves using assessment criteria for some time.
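The calculation that Pocket Forms© could not perform, and that the Excel form supplied, is straightforward arithmetic: the overall placement grade is the mean of the marks awarded against the individual criteria. A minimal sketch, assuming percentage marks for illustration:

```python
# Sketch of the calculation the link lecturer's Excel form performed:
# averaging a series of criterion marks into one overall grade.
# Percentage marks are assumed here for illustration.

def overall_grade(criterion_marks):
    """Return the mean of the criterion marks, rounded to one decimal place."""
    if not criterion_marks:
        raise ValueError("no criterion marks recorded")
    return round(sum(criterion_marks) / len(criterion_marks), 1)

print(overall_grade([65, 72, 58, 73]))  # prints 67.0
```

Trivial as it is, needing a separate spreadsheet for this one step is what freed the designed assessment form to become the student self-assessment tool.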

The new forms were used during the students’ second placement, at the end of which three focus groups were conducted to explore the student experience of using mobile devices to support assessment in clinical practice. My interest in this project was in discovering how, if at all, the new PDA forms impacted on student understanding of clinical assessment. I therefore used an activity theory framework to analyse the focus group data.

Activity Theory

Diagram: the clinical assessment activity system
Mediating artefacts: paper/electronic portfolio, assessment criteria, learning outcomes
Subject: student midwife
Object: valid and reliable assessment of clinical competence
Rules: QAA benchmarks, NMC standards, competencies etc.
Community: clinicians, academics, students
Division of labour: tripartite system of student, mentor and link lecturer

Activity theory (Engeström 1994) rejects the idea that thinking and learning are best
studied as individual cognitive processes; rather it argues that learning is a social
process of reaching a shared understanding or meaning and should be studied as embedded collective activities. Thus we can see from the diagram that what we mean
by assessment of clinical practice is embedded in a framework of rules, community and
division of labour and mediated by the tools we use to articulate that meaning. In our
case study the main tool or artefact used to mediate understanding of the assessment
process was the clinical portfolio. By changing this in the process of converting it for
PDA use there was the potential to change the shared meaning of assessment for the
actors involved.

Findings suggest that the new tools, i.e. the electronic portfolio and particularly the new assessment forms, created a new understanding of assessment for students, and that they felt it helped develop a shared understanding with their assessor. The new tool seemed to encourage a more detailed appraisal of student performance. In particular, discussion based on the student’s self-assessment of grade in each criterion grounded the discussion in how the student felt she had performed.
    “It’s just a much better way of getting a grade.”

    “The way I was graded from my first placement to my second placement was a lot better… it was ‘what do you think to this? Where would you put her within this…’”
This process of discussing each criterion in turn seemed fairer to the students and also appeared to ensure that the assessor gave full consideration to each aspect. Students had complained about the previous process using the paper forms, where it was easier for the mentor to give an intuitive assessment of the student’s overall performance without necessarily pinpointing strengths and areas for development.
    “I think the new grading of assessment is better, it’s fairer, because you can be very good in some things, for example communication skills, and not as good in…”

    “But that makes them have to do it, because I know the grades are in the book but it doesn’t say you have to go through each one and give a percentage for them, it just says look at them – you know what I mean?”

Interviews with link lecturers confirmed the favourable student response and made the lecturers rethink how tools can influence a particular interaction. They were so impressed with the new assessment form that, even though the PDAs were not available for subsequent placements, they translated the form back to a paper copy, trying to preserve as many of its advantages as they could, i.e. full student self-assessment and the marking of each criterion. Thus it could be argued that the design of the forms, rather than the electronic format, was the important factor; however, without the PDA acting as a catalyst these changes would not have been made. Tools are not neutral. They can change the way we think and act, and have the potential to improve the practice of learning and assessment.

We have reported elsewhere (Dearnley and Haigh 2006) that this pilot of mobile
technology was not an easy process for staff or students to undertake, mainly because of the changed practice involved in using a new device – some of these students still
shudder at the very mention of a PDA! However, the pilot was feasible and we did succeed in developing our practice in ways unanticipated at the beginning. I feel that this success was only possible because we were not imposing a completely new assessment process but building on current best practice, thus minimising the associated stress for students and staff. Technology moves on, and since last year there have been
developments in making web-based personal learning systems accessible from PDAs
(Pebble Learning Ltd 2006 personal communication). This has the potential to make
future work in this area much less logistically challenging but will still necessitate some
rethinking of paper based formats.

We live in exciting times. New technologies allow for the development of learning and assessment tools that change the way we think and act. Tools are not neutral; they are by definition interactive and therefore stimulate interaction, which is multi-dimensional in nature and has the potential to affect relationships between the student, mentor and lecturer and the assessment process. New technologies demand new approaches to learning and assessment. It is not enough simply to replicate old and trusted paper friends in digital format; the rewards of such an exercise would barely prove worthwhile in terms of value added.

To obtain such rewards we must allow new technologies to work for us. This case study clearly demonstrated that, to do this successfully, the new approaches need to be acceptable to the actors involved, which means building on current best practice and so minimising the associated stress for students and staff. Crucially, pedagogy must drive technological advancement. We must build on what we know about how people learn and how people respond to assessment, and be creative in our technological responses.


References

Dearnley, C. and Haigh, J. (2006) Using mobile technology for assessment and learning in practice settings: the Bradford pilot. In Assessment for Excellence, Northumbria EARLI SIG Assessment Conference, Northumbria University.

Engeström, Y. (1994) Teachers as collaborative thinkers: activity-theoretical study of an innovative teacher team. In Carlgren, I., Handal, G. and Vaage, S. (eds) Teachers’ Minds and Actions: Research on Teachers’ Thinking and Practice: 6th International Conference: Selected Papers. London: Falmer Press, pp. 43–61.

HEFCE (2007) Complete list of funded CETLs, updated 1st March 2007. Accessed 9th April 2007.

Pebble Learning Ltd (2006) PebblePAD.
