Peeling The Onion Without Tears: A Layered Approach To

Researching E-Learning

Cathy Sherratt & Professor Andrew Sackville

Edge Hill University.

sherrattc@edgehill.ac.uk, sackvila@edgehill.ac.uk

ABSTRACT
Online technology is now widely accepted as a major teaching method, but do we fully
understand the processes involved in learning via this approach? And how can we
obtain meaningful information?

In this paper, we will draw on our experience of delivering and researching a supported
online learning course (Postgraduate Certificate in Teaching & Learning in Clinical
Practice) over an eight-year period, and we will discuss our approach to researching e-
learning, especially considering students’ engagement and interaction online.

Our research to date has identified a number of ‘layers’ of enquiry, which exploit a
variety of methods and types of data. As each individual ‘layer’ is revealed, we can gain
new perspectives and fresh insights, bringing our understanding of online learning into
increasingly sharp focus.

We will outline the main questions that have guided our enquiry to date, and we will
explore the techniques we have used in collecting and analysing data in each ‘layer’.
Areas will also be identified as foci for future research.

KEYWORDS
E-learning research; online discussion; online interactivity; research methods.

INTRODUCTION TO THE E-LEARNING PROGRAMME
The programme which is the subject of this research is the Postgraduate Certificate in
Teaching and Learning in Clinical Practice, which was developed in 1999 by a
partnership of Edge Hill University, the Mersey Deanery for Postgraduate Medical and
Dental Education, and Chester University. This course was the first step into e-learning
for all partners.

The course is aimed at senior health professionals engaged in clinical practice, who
have responsibilities for teaching or training students and junior staff within their clinical
areas. The course is multi-professional, and has been undertaken by doctors, dentists,
nurses and allied health professionals.

The content of the programme focuses on broad aspects of teaching and learning in the
clinical context, as well as more specific aspects: understanding student learning;
approaches and methods of teaching and learning support; and assessment and evaluation.
Within the programme there is an emphasis on learning from experience, on relating
experience to published theory and research, on reflective and collaborative learning;
and on peer review of each other’s teaching/ learning facilitation activities. The
programme is accredited by the Higher Education Academy.

Because of the nature and needs of the audience, the programme team devised, right
from the start, a ‘supported online learning’ approach to programme delivery. This
involves five face-to-face meetings over the duration of a year-long programme; but the
main learning opportunities are structured using a Virtual Learning Environment (VLE) –
in this case WebCT. The VLE presents the course material and resources around a
range of topics, and includes originally written text; links to electronic journal articles and,
latterly, e-books; and links to electronically published papers available on other
websites. A series of online ‘activities’ guides participants through the material, where
they engage in one-to-one discussions with their tutor; in small group work; and in group
discussion using the Discussion Board facility in the VLE.

The Model of Pedagogy and the significance of “engagement and interaction”.


The model of pedagogy accepted by the programme team is based largely on ideas of
social constructivism. Whilst the programme team accept the critical importance of
individually-focused constructivism, where each person constructs their own knowledge
based on individual experience, the team also strongly believe that we learn from each
other, and thus learning is enhanced through the processes of dialogue and discussion.
In this, we have been influenced by Brookfield and Preskill’s (1999) seminal work on
discussion as a way of teaching.

There is an emphasis in the learning outcomes and design of the programme on
engagement and interaction. In earlier work we have identified five dimensions of this
interaction (Sackville, 2002; Sackville et al., 2002):
• Interaction with the technology
• Interaction with the programme material and resources
• Interaction with peers
• Interaction with tutors
• Interaction with their wider professional community.

The emphasis on interaction and engagement is often a challenge to many of our
mature, part-time participants who may only have experienced a didactic method of
being taught in their previous education. Time is therefore spent in the induction
sessions for the programme, emphasising this active approach to learning which
underlies the pedagogy.

Discussion, dialogue and debate.


Before outlining the research approach we have taken, it is important to recognise the
interplay between our research into online interaction and interactivity, and our
underlying social constructivist pedagogy.

Right from the start, the programme team realised that not all activities would
necessarily produce the same level and type of discussion. Some activities are primarily
focussed on participants sharing their past and current experiences. These activities are
expected to lead to more personal, reflective statements, rather than the cut-and-thrust
of disputation. Other activities focus on the discussion of specific topics, which the
programme team felt might lead to a form of dialogue where the sharing of ideas would
help participants modify and develop their own thinking. On the one hand, the
programme team wanted to stimulate intrapersonal dialogue and reflection; whilst on
the other hand they also wanted to encourage interpersonal dialogue. This interpersonal
dialogue might be socially orientated or subject-matter orientated, or a mixture of both
(see Gorsky & Caspi, 2005).

Since our underlying pedagogy has at its heart a recognition of the importance of the
social co-construction of knowledge, our research relating to e-learning has focused
significantly on interactivity and the achievement of dialogue in the online discussion
board.

RESEARCHING E-LEARNING

A cursory review of published research into aspects of online learning reveals a wide
variety of different topics which have been explored by different authors, and using
different research methods. The topics include:
• Threading of messages (Hewitt, 2003, 2005);
• Online group cohesiveness (Swan, 2002);
• Individual interactions with technology (Garland & Noyes, 2004);
• The influence of the tutor in online discussions (Mazzolini & Maddison, 2003).

Whilst these specific studies are of great interest, there is also a need for an
overarching conceptualization of e-learning research as a whole. We offer one early
attempt at such a conceptualization, based on our own research experience, in the
following analogy.

The onion as an analogy for the research experience


In this section of the paper, we will explore the different ‘layers’ of research evidence
that form part of our own research experience. Why an onion? The onion consists of
edible ‘flesh’ with an attractive flavour, formed in a series of layers, under an outer skin.
But the onion is a product of nature, so not all layers are the same thickness - which is
highly appropriate when describing the research context, since not all evidence has the
same impact, and we find greater richness of understanding arising from certain types
of evidence.

Furthermore, not all onions are perfectly symmetrical - often we find bulges at one point,
and again, this appeals as a metaphor in the context of researching e-learning, since
some aspects of certain ‘layers’ are also larger and more significant than others.

So if we explore the metaphor in more detail, we find that a typical onion will have a
tough brown outer layer, naturally representing our most straightforward evidence. This
is usually followed by an initial inner layer, which we observe is paler, but still with
brown edges - representing slightly more sophisticated evidence, but still relatively
simple and not yielding huge theoretical insights. These two outer layers are usually
discarded by cooks, and likewise the value of this information is often overlooked in the
research context.

We then come to a series of fleshy white or pink layers, which are more succulent and
more valuable, and these we use to represent our experience of more detailed analysis.
Finally, we sometimes find a green core at the very centre of the onion, which can
represent as-yet untapped evidence.

THE LAYERS OF RESEARCH EVIDENCE

1. The Outer Skin: WebCT ‘Tracking’ Data


In this initial ‘layer’, we exploit one of the unique affordances of online learning, since
there is a wealth of numerical data recorded in WebCT, about the use of the web
materials, which can be readily accessed by the instructor. This data includes:

• the patterns of access to the website by programme participants - in terms of how
many times participants accessed the site; and the times and dates when they
accessed the course site;
• the use made of each page of the materials, the number of times a page was ‘hit’,
the average time spent in reading/ working on each page (including reading
Discussion Board postings).

This is indicative, macro-level data, which has to be treated with a degree of caution.
Whilst the number of ‘hits’ may prove a useful marker to a pattern of interactivity, we
must recognise that some individuals clearly download large amounts of the online
material for printing as hard-copy, so they may only ‘use’ a certain page on one
occasion. However, the number of times they enter the site can be an indication of their
commitment to engaging with the online discussion board, even if they do not always
indulge in a posting themselves. This final point is especially important in identifying the
engagement of the silent ‘lurkers’ or ‘witness learners’ (Beaudoin, 2002), who will read
but not make postings themselves.
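
To make this layer concrete, here is a minimal sketch (in Python, using pandas) of the
kind of macro-level summary described above. The file names and columns
(access_log.csv, discussion_log.csv, participant, timestamp, page, author) are
illustrative assumptions rather than WebCT's actual export format, and the
median-based rule for flagging possible 'witness learners' is simply one crude heuristic.

```python
import pandas as pd

# Hypothetical exports: one row per page view (access_log.csv) and one
# row per discussion posting (discussion_log.csv). Column names are
# assumptions for illustration, not WebCT's real tracking schema.
access = pd.read_csv("access_log.csv", parse_dates=["timestamp"])
posts = pd.read_csv("discussion_log.csv")

# How often each participant entered the site (distinct active days).
visits = access.groupby("participant")["timestamp"].apply(
    lambda t: t.dt.date.nunique())

# How heavily each page of material was 'hit'.
page_hits = access.groupby("page").size().sort_values(ascending=False)

# Crude flag for possible 'witness learners' (Beaudoin, 2002):
# above-median visiting combined with below-median posting.
n_posts = posts.groupby("author").size().reindex(visits.index, fill_value=0)
witness = visits[(visits > visits.median()) & (n_posts < n_posts.median())]

print(page_hits.head())
print("Possible witness learners:", list(witness.index))
```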

2. Course Evaluation Data.


Just inside the ‘outer skin’ is a layer of course evaluation data. This in-built and on-
going evaluation takes several forms:

• Informal evaluation by tutors, who attend formal Team Meetings once a month and
share experiences, with this dialogue supplemented by regular e-mail
communications. This evaluative data is based primarily on impressions of the
interactions taking place on the programme. As such, it is a somewhat crude but still
useful measure.
• Evaluation incorporated into the face-to-face ‘contact’ days - for example, in the third
of the face-to-face days, half of the day is focused on evaluation, and the
programme team model the Nominal Group Technique, which produces a wealth of
formative evaluative data, mostly from the students’ perspective, although a
balancing view from tutors is also sought.
• Finally, there are conventional end-of-module and end-of-programme questionnaires,
which ask students to rate various aspects of the programme on a five-point Likert
scale, as well as allowing them space for more unstructured (but still anonymous)
feedback.
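
As one small illustration of how the questionnaire strand of this layer might be
summarised, the following sketch computes mean ratings and percentage agreement per
Likert item; the file and column layout are our own assumptions, not the programme's
actual instrument.

```python
import pandas as pd

# Hypothetical questionnaire export: one row per respondent, one numeric
# column per five-point Likert item. File and column names are assumed.
responses = pd.read_csv("module_evaluation.csv")

items = responses.select_dtypes("number")
summary = pd.DataFrame({
    "mean rating": items.mean().round(2),
    "% rating 4-5": (items >= 4).mean().mul(100).round(1),
})
print(summary.sort_values("mean rating"))
```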

We can use this evaluative data to give an indication of the effectiveness of our course
design; and we can also draw on this data to illustrate and illuminate findings from the
layers to either side - it can bring greater meaning to the WebCT data, and can also
start to bring the ‘participant voice’ into our analyses of the online discussion postings
(see below).

3. Analysis of the Discussion-Board Archive, in terms of Groups.


This is the first of the rich inner layers of our research onion, and again relies on a
particular affordance of online learning, i.e. that the discussion board is archived and
therefore readily amenable to analysis.

We started with the purely quantitative, identifying the number of postings made by
each group, and for each activity. This gave an initial guide regarding general levels of
activity and enabled us to identify differences between groups, even within the same
cohort. For example, four apparently ‘balanced’ learning sets within a single cohort
revealed the following varied patterns of activity:

Total Postings

Discussion Forum    Module 1   Module 2   Module 3
Learning set 1           131        164         78
Learning set 2           194        227         98
Learning set 3           260        161         97
Learning set 4           306        349        202

(Sherratt & Sackville, 2006c)
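
A tally of this kind is straightforward to reproduce from the archive. The sketch below
assumes a hypothetical flat export of the discussion board (discussion_archive.csv) with
one row per posting and columns learning_set, module and author; these names are our
own illustration rather than the VLE's actual format. Adding an activity identifier to
the index would give the per-activity breakdown mentioned above.

```python
import pandas as pd

# Hypothetical flat export: one row per posting. Column names are assumed.
archive = pd.read_csv("discussion_archive.csv")

# Total postings per learning set, per module (compare the table above).
counts = archive.pivot_table(index="learning_set", columns="module",
                             values="author", aggfunc="count", fill_value=0)
print(counts)
```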

In order to understand the interactions within groups, we also identified affective /
cohesive / interactive indicators, as devised by Karen Swan (2002). We have found this
scale particularly useful in capturing the social aspects of interaction which underlie the
actual words and academic focus of postings; and in this way, we started to identify
some differences in online behaviour that seemed to influence how active and how
interactive each group became (Sherratt & Sackville, 2006b, 2006c).

Combining quantitative and more qualitative analysis allowed us to build up a picture of
the engagement, interaction and styles of discussion that took place within each group.
Of course, we recognise that groups will “form, storm, norm and perform” in different
ways (Tiberius, 1999), and for this reason, we have felt a reluctance to impose a regular
pattern of discussion on groups – spontaneity, innovation and creativity are valued. On
the other hand, social constructivism emphasises the additional value to be gained
through collaborative learning, and it is therefore important to the programme team that,
wherever possible, we achieve an element of dialogue to facilitate the co-construction of
knowledge that is an essential component of our social constructivist pedagogy.
However, it is apparent that some groups achieve this more readily than others, and this
is where this third ‘layer’ of analysis can become important - in helping us to understand
how and why this might be the case.

Having recognised that not all postings to an online discussion board will be the same, it
has become important to categorise postings into different types of
engagement; and so we devised our own scale for making a qualitative judgement
about postings (Sackville & Sherratt, 2006). This Typology of Online Responses,
representing a continuum ranging from ‘statement’ through to ‘dialogue’, is currently
expressed as a four-point scale:

Typology of Online Responses


• Statement. A view expressed. A “closed” statement. Not inviting response or
comparison. A position statement.
• Limited response. Refers back to an earlier posting, but only in a limited way.
May be encouragement, e.g. “yes, I agree”.
• Questioning response. Opens up the topic. Expands on ideas. Makes
comparisons.
• Dialogue. Building on ideas, taking them further, introducing new
interpretations, joint problem-solving, disagreements and disputes.
Sackville & Sherratt, 2006.

Application of this typology has allowed us to characterise the types of interaction which
occur in each group, and to create a ‘profile’ for each different group. For example:

[Figure: example group profiles derived from the Typology of Online Responses (Sherratt & Sackville, 2006b).]

In creating these group profiles, we can start to draw out differences between groups
where discussion has been more or less successful, considering the question: What
are the constituents of a good online group/ learning community?
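
To make the profiling step concrete, here is a minimal sketch of how a group profile
might be computed once postings have been hand-coded against the typology. The four
levels come from the typology above; the data structures and the example distribution
are invented for illustration.

```python
from collections import Counter
from enum import IntEnum

# The four levels of the Typology of Online Responses (Sackville & Sherratt, 2006).
class Response(IntEnum):
    STATEMENT = 1    # closed position statement
    LIMITED = 2      # brief reference back, e.g. "yes, I agree"
    QUESTIONING = 3  # opens up the topic, expands, compares
    DIALOGUE = 4     # builds on ideas, joint problem-solving, dispute

def group_profile(coded_postings):
    """Percentage of a group's postings at each level of the typology."""
    tally = Counter(coded_postings)
    total = sum(tally.values())
    return {r.name: round(100 * tally[r] / total, 1) for r in Response}

# Invented coding for one learning set: heavy on statements, light on dialogue.
postings = ([Response.STATEMENT] * 10 + [Response.LIMITED] * 6 +
            [Response.QUESTIONING] * 3 + [Response.DIALOGUE])
print(group_profile(postings))
# {'STATEMENT': 50.0, 'LIMITED': 30.0, 'QUESTIONING': 15.0, 'DIALOGUE': 5.0}
```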

4. Analysis of the Discussion-Board Archive, in terms of Individuals.
In this next layer, we carried out similar analyses to those conducted in Layer 3, but at
the level of individual participants rather than considering their performance as a group.
In this, we seek particularly to answer the question: What distinguishes a “good” online
learner/ discussant?

Analysis of individuals’ postings, in terms of number and frequency, allowed the
programme team to identify five different types of participants in the first cohort
(Sackville, 2002). There were the “express trains” – often the first to respond to
activities, who steamed ahead and were almost hooked on the technology. Secondly
there were “reliable” and “consistent” participants, who maintained a steady rate of
participation throughout the programme, who often initiated new topics for debate, and
who genuinely engaged in discussion with each other. There was thirdly a group of
“slow starters” who picked up momentum as the programme progressed. They were
complemented by a group of “slowing-downers” who lost their initial momentum some
months into the programme. Finally there was a group of “witness learners” or non-
participants, who generally downloaded all the material and rarely appeared online.
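
One way such categories might be recovered from the archived postings is sketched
below: compare each participant's posting rate in the first and second halves of the
programme. The file layout, thresholds and labels are illustrative assumptions;
'express trains', who are distinguished partly by how quickly they respond to new
activities, would additionally require response-latency data that this sketch omits.

```python
import pandas as pd

# Hypothetical export: one row per posting, with author and timestamp.
posts = pd.read_csv("discussion_archive.csv", parse_dates=["timestamp"])

start, end = posts["timestamp"].min(), posts["timestamp"].max()
midpoint = start + (end - start) / 2

def classify(history: pd.DataFrame) -> str:
    """Assign a rough participant type from early vs late posting counts."""
    early = int((history["timestamp"] <= midpoint).sum())
    late = int((history["timestamp"] > midpoint).sum())
    if early + late <= 2:
        return "witness learner / non-participant"
    if early >= 3 * max(late, 1):
        return "slowing-downer"      # lost initial momentum
    if late >= 3 * max(early, 1):
        return "slow starter"        # picked up momentum later
    return "reliable / consistent"

labels = {author: classify(g) for author, g in posts.groupby("author")}
print(labels)
```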

These categories can be seen to describe how individual participants engage with the
course as a whole. Subsequent analysis has also identified specific roles played by
individuals within the group, such as the “peer facilitator” whose presence is crucial to
achieving dialogue and prolonged discussion threads, as compared to the “strategic
learner”, whose engagement is purely outcomes-driven (Sherratt & Sackville, 2006a).

Identification of these additional types of participants arose out of an analysis of profiles
of individual participants, using our Typology of Online Responses (Sackville &
Sherratt, 2006). This improves our understanding of how online discussion might
flourish, and thus, at this point we can be seen as having entered a slightly deeper
‘layer’ of research.

Other profiling of individual participants is also possible here, such as a consideration of
individual Learning Styles (Honey & Mumford, 1992), which may offer additional insights
into how and why individual participants engage and interact in particular ways,
especially when considered alongside the individual profile of engagement resulting
from applying our Typology (see above).

Thus, it can be seen that each of these ‘fleshy’ inner layers of research actually
comprises a number of analyses, yielding different elements of evidence.

5. Participant Voice - including both students & tutors:


In this layer, we explore the actual reported experience of all participants (including
tutors as well as students). This can be achieved by multiple methods, including
questionnaires, individual interviews, or focus groups. The binding element here is the
notion of individual testimony, which makes this an incredibly rich layer of evidence.

As an example, in the autumn of 2002, a grant from ESCalate allowed the programme
team to employ an independent researcher to conduct in-depth interviews with a sample
of nine participants who were members of the first two cohorts (2000, 2001). The sample
was constructed to include representatives of both of the first two cohorts;
representatives of the three main groups of participants – doctors, dentists and other
health professionals; and representatives from each of the four tutor-groups. The
interviewer used a semi-structured face-to-face interview and asked the participants to
assess whether the programme had helped them with their tasks of teaching junior staff;
as well as asking them about the form of course delivery and the positive and negative
aspects of using a virtual learning environment. The interviewer also asked them to
scale the success of five types of interactivity (see above) that had guided the
programme team in the design of the programme (Sackville, 2002).

Another example of the ‘participant voice’ layer of evidence would be our subsequent
research into ways in which participants choose to engage with online resources, and
the extent to which online material is used in printed form. This enquiry was carried out
using questionnaires addressed to two cohorts of students (Sherratt, 2005).

Other research exploring participants’ individual experiences (both as students and as
tutors) is currently ongoing.

Another distinct aspect of the ‘participant voice’ is that afforded by assessed work. The
nature of this programme requires participants to reflect on their own learning
experiences as part of the assessed work within the programme. In particular, there is a
requirement that for one of the assessment tasks, the participants must review the
discussion postings on one particular activity and must write a reflective statement on
the activity and their own role in the discussion which ensued. This also leads to them
making comments about the online aspects of the programme, and on their perceptions
of the usefulness of online discussion as a tool for learning.

These assessed statements provide a wealth of insights into the participants’ approach
and motivation, and their attitudes to participating in online discussion. But although
they do lend themselves to analysis, we need to gain the agreement of the participants
before we can use their assessed work for our research. We must also recognise that,
as these are assessed artefacts, the participants may sometimes write what they expect
the tutors will want to hear rather than always expressing their own unfettered ideas.

Nevertheless, the possibility of analysing reflective statements based around online
discussion activities offers an additional element to this layer of enquiry, especially
useful when triangulated with other sources of data.

6. Participant Observation
This is the deepest layer we have identified and begun to use to date. We recognise,
however, its potential for greater exploitation in future research. In 2005, an
external researcher joined one of our MA in E-learning modules to carry out
ethnographic research into this method of programme delivery.

In this ethnographic analysis, two broad themes emerged, namely motivational issues
pertinent to e-learning in general, and issues of identity formation in asynchronous
discussion. The research focused on both aspects, attempting to illuminate the potential
of e-learning both for enhancing learning among self-motivated learners and for
building motivation amongst other participants, thereby constructing social
identity. Initial findings were reported to an international conference (Kruger, 2006), and
this initial report provided useful insights into group interaction which have influenced
our teaching and course design, and also our research approach.

A similar study was therefore commenced with the 2006 cohort of the Postgraduate
Certificate, although results have yet to be analysed in detail. Insights developed by
means of participant observation offer an ideal method of triangulating other sources of
evidence, and this is therefore recommended for future work.

FUTURE WORK

Future work will involve further analysis of each of the layers of the research onion that
we have identified above.

The data collected from the outer layers can be revisited and compared to the data
arising from the analysis in the inner layers. For example, we propose to explore the
engagement profiles of identified ‘peer facilitators’ and compare these with those of the
more strategic learners.

In Layer 4, we propose to add to individual profiles by including data on achievement in
assessment. We recognise, however, that both achievement and engagement profiles
may relate to broader personal attributes. For example, a high level of engagement
online may contribute to high achievement in assessment; or, conversely, their online
engagement may be a product of the confidence engendered by academic excellence.

Similarly, as we have identified above, there is great potential to extend our work
around the participant voice. This will include exploring the interaction between different
tutors and participants.

CONCLUSIONS

In this paper we have described a ‘layered’ approach to researching an e-learning
programme. By employing a variety of sources of data, we can build a more complete
picture, and at the same time, we can capitalise on natural sources of evidence that are
built into the VLE.

This model also allows us to incorporate evidence from other researchers, which both
enhances our understanding as designers and teachers using e-learning, and gives us
additional insights as researchers into other approaches, conceptualizations and
methodologies which we might apply in our own ongoing research.

It is clear that all the different pieces of evidence bring additional and unique insights.
This means that we need to continue to peel the onion, but in order to gain the best
overview and deepen our understanding, perhaps we also need to slice the research
onion, so that we can see all six layers of evidence at the same time!

REFERENCES

BEAUDOIN, M.F., 2002. Learning or lurking? Tracking the “invisible” online student. The
Internet and Higher Education, 5, 147-155.

BROOKFIELD, S.D. & PRESKILL, S., 1999. Discussion as a Way of Teaching.
Buckingham: Open University Press.

GARLAND, K.J. & NOYES, J.M., 2004. CRT monitors: Do they interfere with learning?
Behaviour & Information Technology, 23(1), 43-52.

GORSKY, P. & CASPI, A., 2005. Dialogue: a theoretical framework for distance
education instructional systems. British Journal of Educational Technology, 36(2), 137-144.

HEWITT, J., 2003. How habitual online practices affect the development of
asynchronous discussion threads. Journal of Educational Computing Research, 28(1), 31-45.

HEWITT, J., 2005. Toward an understanding of how threads die in asynchronous
computer conferences. Journal of the Learning Sciences, 14(4), 567-589.

HONEY, P. & MUMFORD, A., 1992. The Manual of Learning Styles. 3rd Edition.
Maidenhead: Peter Honey.

KRUGER, S., 2006. Students’ experiences of e-learning: a virtual ethnography into
blended online learning. Paper presented to the 5th International Conference on
Networked Learning, April 2006, Lancaster University, Lancaster, UK.

MAZZOLINI, M. & MADDISON, S., 2003. Sage, guide or ghost? The effect of instructor
intervention on student participation in online discussion forums. Computers &
Education, 40, 237-253.

SACKVILLE, A.D., 2002. Designing for interaction. Proceedings of the 3rd International
Conference on Networked Learning, Sheffield University, 534-541.

SACKVILLE, A., SCHOFIELD, M. & DAVEY, J., 2002. Why should we be concerned
about ‘patterns of interactivity’ in cyberspace education? Paper presented to the Ideas
in Cyberspace Education Symposium, Higham Hall, Cockermouth, UK.

SACKVILLE A.D. & SHERRATT C.A., 2006. Styles of Discussion: Online Facilitation
Factors. Paper presented to the 5th International Conference on Networked Learning,
April 2006, Lancaster, UK.

SHERRATT, C.A., 2005. e-Learners' engagement with online learning resources. Paper
presented to the 4th Edge Hill CLTR conference, July 2005, Ormskirk, UK.

SHERRATT, C.A. & SACKVILLE, A.D., 2006a. Styles of Discussion: Influences & Trends.
Paper presented to the 1st SOLSTICE Conference, May 2006, Edge Hill, Ormskirk, UK.

SHERRATT C.A. & SACKVILLE A.D., 2006b. Styles of Discussion: Influences and
Trends. Paper presented to the Symposium on Medical Interactive eLearning (SMILE),
September 2006, Sestri Levante, Italy.

SHERRATT C.A. & SACKVILLE A.D., 2006c. Online learning: developing styles of
discussion. Paper presented to the 30th Collaborative Action Research Network
(CARN) conference, November 2006, Nottingham, UK.

SWAN, K., 2002. Building Learning Communities in Online Courses: the importance of
interaction. Education, Communication & Information, 2(1), 23-49.

TIBERIUS, R.G., 1999. Small Group Teaching. London: Kogan Page.
