
Case Study Methodology and E-Learning: Reflections on Evaluation Activities for Blended Modules

Richard Walker & Wendy Fountain

University of York, rw23@york.ac.uk & wf501@york.ac.uk

ABSTRACT
This paper offers a description of the case study research methodology at the University
of York and reflects on the way that it has been applied to capture student learning
experiences for a series of blended learning pilot projects. Discussion focuses on the
interpretive research approach, which has been adopted to provide a rich picture of
student working patterns across a range of pilot projects. The York approach aims to
establish a rolling evaluation programme, rather than a snapshot of current practice
through e-benchmarking. This work in progress highlights the challenges to the
successful implementation of this programme, and to the long-term sustainability of
case study research for e-learning.

KEYWORDS
Case study, interpretive research, blended learning

INTRODUCTION
The University of York may be categorised as a ‘greenfield’ site for e-learning, without a
legacy of institutional usage, although there have been isolated pockets of e-learning
activity at a departmental level. This has enabled the University to introduce a centrally
supported e-learning service in a holistic way, through a managed rollout strategy
(Beastall & Walker, 2007). The University procured Blackboard Academic Suite as its
institutional platform in December 2004, and has embarked on a four-year
implementation cycle (2005-09), during which time the University’s VLE Implementation
Group aims to establish quality assurance and enhancement processes to guide e-
learning activities across the university.

The rollout of the platform has been phased through the delivery of a series of pilot
projects, selected by the VLE Implementation Group to explore student-focused course design approaches that place the emphasis on active
learning through collaboration and the performance of assessed activities online. To
date the University has launched two rounds of pilot projects with 21 projects delivered
between January and July 2006, and a further 42 projects being delivered during the
current academic year (2006-07). It is anticipated that the pilot projects will generate
models of good practice for blended learning, and the lessons learned from course
design and delivery will inform staff training activities in a virtuous cycle of course
development. Consequently, the Implementation Group has placed a strong emphasis
on evaluation practices in the rollout strategy, to record the outcomes from these pilot
projects.

RATIONALE FOR CASE STUDY EVALUATION


The approach aims to evaluate e-learning practices through the adoption of an
interpretive research agenda, drawing on multiple data collection methods to provide a
rich picture of student learning within these pilot projects.

“Interpretive studies assume that people create and associate their own
subjective and intersubjective meanings as they interact with the world around
them. Interpretive researchers thus attempt to understand phenomena through
accessing the meanings participants assign to them”. (Orlikowski & Baroudi
1991)

Our research approach has therefore focused on student perceptions of their experience, establishing ‘meaning’ from the standpoint of the actors, rather than through objective measurements of student learning. As Cohen & Manion observe:

“The central endeavour in the context of the interpretive paradigm is to understand the subjective world of human experience. To retain the integrity of the phenomena being investigated, efforts are made to get inside the person and to understand from within”. (Cohen & Manion 1994: 36)

Case study research is well suited to this purpose, helping us to grasp meaning from a real-life situation, where the experiences of actors are important and the context of action is critical. As Yin (1994) notes, a case study “investigates a contemporary phenomenon within its real-life context”, providing a holistic picture of the phenomenon
under observation. In the context of the University’s e-learning research plan, the case
study approach has been selected to help us to construct a multi-dimensional picture of
student learning across each pilot project. Through the use of multiple data collection
methods, drawing on both qualitative and quantitative techniques, we aim to record
individual perceptions of the learning experience for these experimental course designs.
We are therefore interested in exploring the range of design approaches across each
pilot course, identifying issues which may influence student acceptance of the new
learning methods.



EVALUATION APPROACH
The evaluation plan (see Appendix) for each pilot project seeks to investigate six key
themes in relation to a range of stakeholders: student e-learning profile; induction to the
VLE; student work patterns; student learning experience; the lecturer/tutor experience;
and receptiveness to the VLE. The overarching research questions that we have selected for the case study research are:

• How are the VLE tools used by students to support their learning in formal /
informal study activities?
• What are the students’ affective and attitudinal responses to the blended course
experience?
• How did the lecturer/tutor perceive students’ learning relative to previous
performance and what action would be taken for future course development?

DATA COLLECTION METHODS

Entry and exit surveys


For each pilot project, we aim to establish the ‘e-learning profile’ of the cohort under investigation at the beginning of the course. During the induction session, students complete an entry survey instrument for the module they are following, which focuses on four main areas, namely:
(i) computer access
(ii) IT literacy levels
(iii) familiarity with e-learning for educational purposes; and
(iv) expectations towards the use of the VLE to support learning activities.

The instrument also invites students to comment on the level of VLE training they have
received at this point and any access problems they have encountered in logging on to
the VLE. The survey design employs a nominal scale to reflect issues such as IT literacy and familiarity with e-learning, whilst students are asked to comment on VLE expectations against a range of items using a five-point Likert scale, with items worded positively.
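
By way of illustration, the following Python sketch shows one way such a mix of nominal and five-point Likert items might be encoded and validated for later analysis. The item wording, identifiers and options are invented for the example and do not reproduce the York instrument.

```python
# A minimal sketch of encoding mixed-scale survey items.
# All item wording and identifiers below are invented examples.

NOMINAL = "nominal"
LIKERT5 = "likert5"  # 1 = strongly disagree ... 5 = strongly agree

entry_survey = [
    {"id": "computer_access", "scale": NOMINAL,
     "text": "Where do you normally access a computer?",
     "options": ["own machine", "campus cluster", "other"]},
    {"id": "vle_supports_learning", "scale": LIKERT5,
     "text": "I expect the VLE to support my learning on this module."},
]

def valid_response(item, answer):
    """Check that a response fits the item's scale before recording it."""
    if item["scale"] == NOMINAL:
        return answer in item["options"]
    return isinstance(answer, int) and 1 <= answer <= 5

assert valid_response(entry_survey[0], "campus cluster")
assert valid_response(entry_survey[1], 4)
```

Recording the scale type alongside each item makes it straightforward to apply the appropriate descriptive treatment later on: frequency counts for nominal items, rating distributions for Likert items.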

Attitudes for each pilot module are recorded through an exit survey instrument and focus group interviews, administered at the end of the course but before the issuing of grades, with students invited to comment on the strengths and weaknesses of their blended learning experience. This timing is intentional, in an attempt to minimise the ‘halo and horns’ effect of student perceptions being influenced by course grades. As Hiltz observes:

“Student evaluations are strongly related to grades received in the course. There is argument about which is the cause and which is the effect. If grades are ‘objective’ measurements of amount of learning, then we would expect that students with higher grades in a course would also subjectively report more positive outcomes. However, it may be that a student who has a good grade in a course rates that course and instructor positively as a kind of ‘halo effect’ of being pleased with the course because of receiving a good grade”. (Hiltz 1994: 154)

We therefore distinguish the focus group and survey methods from the standard end-of-module feedback instrument, presenting them as serving a different purpose, in an attempt to decouple the student evaluation of the learning experience from an evaluation of the course instructor.

The exit survey instrument invites students to reflect on their learning experience using
the VLE and to review the same set of attitudinal statements that they considered in the
entry survey instrument. They are asked to submit a further set of responses to the
same list of items, and are also presented with an open set of questions at the end of
survey, which invite comments on the contribution of the VLE to their learning and the
link between the online and class-based components of the course. This enables us to
track trends in satisfaction ratings against expectations, for those respondents who completed both surveys.
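
As an illustration of the kind of matched comparison this enables, the Python sketch below pairs entry and exit ratings and reports the mean shift per item, assuming responses can be matched through an anonymised respondent identifier. The identifiers, item names and ratings are all invented for the example.

```python
# A sketch of tracking satisfaction against expectations across the
# entry and exit surveys. Respondent IDs, items and ratings are invented.
from statistics import mean

entry = {
    "s01": {"vle_useful": 4, "link_to_class": 3},
    "s02": {"vle_useful": 5, "link_to_class": 4},
    "s03": {"vle_useful": 3, "link_to_class": 2},
}
exit_responses = {
    "s01": {"vle_useful": 5, "link_to_class": 4},
    "s03": {"vle_useful": 2, "link_to_class": 3},
}

# Only respondents who completed both surveys can be tracked.
matched = sorted(entry.keys() & exit_responses.keys())

for item in ("vle_useful", "link_to_class"):
    shifts = [exit_responses[r][item] - entry[r][item] for r in matched]
    print(f"{item}: n={len(shifts)}, mean shift = {mean(shifts):+.2f}")
```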

Focus group interviews


The focus group interviews are loosely structured around the overarching research
questions, and are designed to elicit more detailed answers from students on their
usage of the VLE in the pilot module (interaction and work patterns). They are also used
to obtain further data on attitudes to the blend of online and class-based learning.
Students are invited to comment on the value of the online learning and the ways in
which the VLE could be used to support their learning in future modules. Practicality demands that focus groups are composed of volunteers rather than randomly selected representatives of the student population for each pilot course, which may skew the sample in terms of the range of views expressed. This influences the way that
we interpret the output from these sessions, which are treated as illustrative
perspectives on student learning, rather than reflections of the whole class experience.

VLE activity logs


In addition to these methods, VLE activity logs are viewed for each module, with student
interaction patterns recorded for collaborative activities hosted within the site. These
statistics are gathered as contextual information for the module in question, and are not
used to measure and evaluate student behaviour. What concerns us is the process by which students learn, rather than establishing objective outcomes for the learning under observation. Our data gathering may be restricted to noting, for example, the range of discussion forum posts against the number of views, or classifying the nature of responses received (informational, analytical, confirmatory etc.) within a forum or blog.
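
The Python sketch below illustrates this kind of descriptive log summary, counting posts against views per forum and tallying response categories. The log records and category labels are invented, and in practice the categories would be assigned by a researcher reading the posts rather than derived automatically.

```python
# A sketch of summarising forum activity as contextual information.
# Log records and category labels below are invented examples.
from collections import Counter

posts = [
    {"forum": "week 1", "views": 40, "category": "informational"},
    {"forum": "week 1", "views": 25, "category": "analytical"},
    {"forum": "week 2", "views": 12, "category": "confirmatory"},
    {"forum": "week 2", "views": 30, "category": "analytical"},
]

# Posts against views per forum.
post_counts = Counter(p["forum"] for p in posts)
view_counts = Counter()
for p in posts:
    view_counts[p["forum"]] += p["views"]

for forum in sorted(post_counts):
    print(f"{forum}: {post_counts[forum]} posts, {view_counts[forum]} views")

# Tally of the hand-assigned response categories.
print(Counter(p["category"] for p in posts))
```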

We acknowledge, however, that students do not restrict themselves to the use of centrally-supported tools in collaborative activities, and are equally proficient in using commercial software, mobile technology or web services such as Facebook to support informal learning activities. The focus group sessions are designed to probe the level of
online activity that takes place both within and outside the course environment, to
present us with a full assessment of student learning for the module under investigation.

Staff interviews
To provide a rounded view of the course experience and additional context to help
interpret the student learning experience, we also record the perspectives of the module
leader and tutors in the delivery of the module. Staff are encouraged to deliver snapshot progress reports on their module and are invited to a debrief meeting at the end to reflect on the course design and the lessons learned from its delivery.

OUTPUT AND WORK IN PROGRESS


For each pilot module, we collate the data and feed this back to module leaders, with
the aim of engaging staff in further discussion on the strengths and weaknesses of the
blended course design. This may lead to a follow-up meeting with the central e-learning
team to review the module and discuss future delivery plans. For selected pilot projects,
we also publish a case study report, which is developed for staff training purposes, as
well as for wider dissemination across the community of practice at York. The criteria for
selecting projects for formal case writing are:
(i) the innovative nature of the course design approach
(ii) a clear focus on student-centred learning
(iii) completeness of the data set (survey responses, focus group feedback) for
the course; and
(iv) the richness of the lessons learned and their transferability to other teaching
contexts.

Case study template and reports


For the case study compilation, a common template has been introduced. Each case
study report consists of:
• An overview of the module, including the pedagogic theme which has been
addressed in the design of the course and a list of conclusions /
recommendations emerging from the pilot.
• The rationale for the course and description of the blended approach, including
comments on the learning activities and tools for the online component of the
pilot.
• The e-learning profile of the cohort, touching on their IT skills, experience with
computers for learning and expectations towards the VLE.
• Description of student perceptions on the outcomes from the pilot, drawing on
focus group and survey feedback. Activity logs and feedback from the module
leader and tutorial team are also used to provide context to their observations.
• Actions for further development, highlighting the lessons learned for future
modules.



The module leader is asked to sign off the report, which is then associated with a blended learning model, as part of the University’s framework of blended course design models; the framework reflects a graduated approach, from supplemental to fully integrated course designs. This is intended to help staff interpret the course design
approach and relate it to their own context. The case study reports are presented on a
showcase website targeted at staff, highlighting the outcomes from the pilot projects
(see http://vlesupport.york.ac.uk/webapps/portal/frameset.jsp?tab_id=_192_1 ).

In order to provide further context to the case study report, we have presented each
report in conjunction with an archived (read-only) copy of the module site, with sensitive
student data removed. Staff are encouraged to explore these course sites and draw inspiration from the course design approach, which may, for example, feature good practice in the structuring of content and activities or the observance of accessibility standards in the site design.

Looking to the future, we also aim to introduce a further enhancement to the showcase
through linking original courses to revised designs which incorporate the lessons
learned from the student learning experience, demonstrating the evolution of course
design approaches to new staff. The combined case study report and module site
access represents a valuable learning resource and footprint for each module leader, to
review and build on in future iterations of the course.

LESSONS LEARNED AND SUSTAINABILITY OF APPROACH


Rather than capture a snapshot of student e-learning across departments through
benchmarking practices, we have described an attempt to embed e-learning evaluation
across modules as a standard practice using the case study method. Clearly, though, this has been established with a controlled level of system usage, and challenges to this
evaluation approach will emerge in terms of the sustainability of data collection and
case writing as the volume of blended modules increases and as we approach full
availability of service. Nevertheless, there have been some useful lessons learned from
the implementation of the evaluation process.

The coupling of the induction process with the release of the entry survey has ensured a
high response rate from students (> 70%), and useful feedback has been gathered on
the varied e-learning profiles and expectations of students toward VLE usage. In
contrast, the exit survey instrument has generally recorded lower response rates, which
may reflect a level of survey fatigue, with students asked to complete two separate
evaluation forms as part of the general course evaluation and reflection on the blended
learning experience. The future points to a joined-up evaluation process where
questions on the online learning experience are embedded within one evaluation
instrument, as the VLE becomes integral to teaching and learning.

A further challenge relates to the visibility of student learning for out-of-class activity,
with students opting to use multiple communication tools to facilitate group-based tasks.
This complicates the tracking process for student learning, with VLE activity logs
recording online activity only in part. We have therefore placed a strong emphasis on focus group reviews to provide an insight into the range of tools that students are using, and the informal learning methods which they employ to support their learning. Moving forward, the challenge will be how to support these sessions across a wider range of modules. Indeed, there are associated issues of access to data and the
willingness of module leaders to support this level of scrutiny on the student learning
experience. Whilst collaboration between staff and the central e-learning team in the
evaluation of projects has been a feature of the pilot phase, there may be less
enthusiasm by staff for this activity as the VLE is rolled out across the university.

Aside from the practicality of conducting case study research, we should also reflect on
the wider value of lessons learned from this approach. A weakness often associated
with case study research is the degree to which we can draw general conclusions from
individual case studies and apply results to different contexts. The application of results
is restricted to one event or situation, with generalisation proving quite problematic. For
the purposes of staff training and the dissemination of good practices to new course
developers, this represents a potential obstacle. However, Lawler et al. (1985) suggest
that single case studies can be helpful in developing and refining generalisable frames
of reference. Further, when multiple case studies are used, it is possible to relate
variability in context to constants in processes and outcomes. Yin (1994) concurs with
this view, arguing that case study research findings can be generalised at a theoretical
level:

“…case studies, like experiments, are generalizable to theoretical propositions and not to populations or universes. In this sense, the case study, like the experiment, does not represent a “sample”, and the investigator’s goal is to expand and generalize theories (analytic generalization) and not to enumerate frequencies (statistical generalization)”. (Yin 1994: 10)

It is at this level of “analytic generalization” that we seek to draw lessons learned from
the pilot projects. The perceptions of participants on the blended course experience will
be used to revise the framework of instructional responsibilities on course design and
delivery which we present to staff in our training workshops. Consequently, we aim to
revise our guidance to staff on a cyclical basis, taking account of generalisable features
of the learning experience and characteristics of the student population in terms of IT
skills and e-learning profile.

CONCLUSION
This paper reflects a significant work in progress in embedding the case study evaluation of blended learning modules. The selection of interpretive research
methods has been intentional in helping us to focus on the student learning experience,
seeking understanding of student interaction with VLE tools through the capture of
student perceptions of the new learning methods. The case study methodology
complements this approach, enabling us to build up a rich picture of the student learning
experience for individual modules, whilst also providing scope for generalisation on
course design and delivery methods. We acknowledge that the establishment of
research activities is at an early stage, and the sustainability of data collection and data analysis processes will be tested as the volume of blended modules increases, with a
requirement to integrate methods with existing course evaluation processes.

REFERENCES

BEASTALL, L. and WALKER, R. (forthcoming, Spring 2007). Effecting institutional change
through e-learning: An implementation model for VLE deployment at the University of York,
Organisational Transformation and Social Change.

COHEN, L. and MANION, L., 1994. Research Methods in Education. Fourth Edition. London:
Routledge.

HILTZ, S., 1994. The Virtual Classroom: Learning Without Limits Via Computer Networks. Norwood, NJ: Ablex.

LAWLER, E., MOHRMAN, S., LEDFORD, G. and CUMMINGS, T., 1985. Doing Research that is
Useful for Theory and Practice. San Francisco: Jossey-Bass.

ORLIKOWSKI, W. and BAROUDI, J., 1991. Studying Information Technology in Organizations: Research Approaches and Assumptions. Information Systems Research, 2 (1), pp. 1-28.

YIN, R., 1994. Case Study Research: Design and Methods. Second Edition. Newbury Park,
CA: Sage.



APPENDIX: VLE PROJECT EVALUATION PLAN

The evaluation plan maps six themes to sub-themes, over-arching questions, stakeholders, methods and data collection sources, as follows.

1. Student e-learning profile
• Sub-themes: IT literacy; IT socialisation; affective / attitudinal variables
• Over-arching questions: What skills do you bring to the VLE? What is your experience of tools that support online learning? What is your motivation for, and what are your expectations of, online learning?
• Stakeholders: Students; module leaders / tutors
• Methods: Survey; focus groups; case study interviews
• Data collection: Entry survey data; Round 1 & 2 focus group records; case study data

2. Induction
• Sub-themes: Training; IT access
• Over-arching questions: Was your technical training adequate? Was the module-specific training adequate? Did you encounter any technical or accessibility barriers to the VLE & Library resources?
• Stakeholders: Students; instructors; Info Desk
• Methods: Survey; focus groups; ETS (Enquiry Tracking Service)
• Data collection: Entry survey data; exit survey data; Round 1 & 2 focus group records; ETS logs

3. Student work patterns
• Sub-themes: Access & usage patterns of online resources; interaction patterns (students-staff, staff-students, students-students); communication tools preferences; overall access & work patterns
• Over-arching questions: How extensively have you used the VLE and for what purposes? With whom have you interacted and for what purposes? Which communication tools do you prefer to use and why? Overall, how have you worked with the VLE?
• Stakeholders: Students; module leaders / tutors; ETS
• Methods: Survey; focus groups; VLE system logs
• Data collection: Entry survey data; Round 1 & 2 focus group records; case study data; ETS logs; VLE activity logs

4. Student learning experience
• Sub-themes: Satisfaction & attitudes to the VLE; learning outcomes; skills & competencies (acquired / required online)
• Over-arching questions: How satisfied are you with the VLE (relative to the entry survey)? Is there evidence that the VLE supported changed learning outcomes? What skills are required to work effectively online? What skills have you acquired from working online?
• Stakeholders: Students; module leaders / tutors
• Methods: Survey; focus groups
• Data collection: Entry survey data; exit survey data; Round 1 & 2 focus group records; case study data

5. Instructor experience
• Sub-themes: Course design; training; course delivery approach; role of the instructor online
• Over-arching questions: What approach did you take to blended learning? How has your future course design been informed? What training issues and needs emerged for you? What is involved in effective delivery? How would you describe your role in the blended module, and how did it differ from your role previously?
• Stakeholders: Module leaders / tutors
• Methods: Survey; focus groups; case study interviews; snapshot interviews
• Data collection: Case study data; snapshot reports

6. Receptiveness to the VLE
• Sub-themes: Usability / accessibility; look and feel
• Over-arching questions: Is Blackboard easy to use? Have you had any difficulty using the VLE and/or accessing learning materials in the VLE? How do you rate the Blackboard environment overall?
• Stakeholders: Students; Disability Services; Comp. Services; module leaders / tutors
• Methods: Focus groups; case studies
• Data collection: Entry survey data; exit survey data; Round 1 & 2 focus group records; special focus groups with Disability Services; case study data