Bacsich 1 23 September 2011
The Distance Learning Benchmarking Club – Final Summary Report

Paul Bacsich, September 2011
Summary
Spinning off from the DUCKLING project,¹ a Distance Learning Benchmarking Club of four institutions (initially seven) was set up across the world. All four were successfully benchmarked using a slightly modified version of Pick&Mix. This short report describes the history and outcomes of the Club, within the wider context of developments in Pick&Mix and in particular the progress of benchmarking at two further institutions outside the Club.
History
Arising out of the planning for the DUCKLING project bid, I was asked by Professor Gilly Salmon, then head of BDRA at the University of Leicester, in early 2009 to organise a small club of higher education institutions similar to the University of Leicester in their approach to distance learning, who would benchmark their e-learning in a collaborative way. The University would also be a member of the club.

A preliminary project plan was prepared in June 2009 and it was agreed that, via the consultancy company that I own (Matic Media Ltd), I would be paid a small amount (a fee of £2000) to organise the club, but that any methodology development and support of the individual institutions would be supplied free of charge by myself, on the basis that it would assist my further development of the Pick&Mix methodology. It was further agreed that BDRA staff would contribute to the support and organisation of the Club on the basis of their expertise in Pick&Mix benchmarking, the University (via BDRA) being at the time one of the earliest and most knowledgeable users of Pick&Mix. No formal agreement was signed – in some ways this was a problem, but in others it was vital to the success, as it is likely that any detailed business analysis of the proposition would have led my colleagues in Matic Media Ltd to recommend against the proposal, especially since the rate I charged the University of Leicester was about half of the commercial rate.

As a matter of principle, I never accept any funding for the development of Pick&Mix (for many reasons, including to protect the IPR and to continue to make it available via Creative Commons), but I am normally paid for its support. The rough tariff, dating from the HE Academy era of the Pilot Phase of benchmarking in the UK, is about 10 days of work per institution. Try as one will, the tariff rarely comes out to much less than this in the end with institutions new to benchmarking.

Since all the benchmarking I had done up till then was for institutions within the UK, neither the University nor I had any experience of supporting benchmarking “at a distance”, and the partners assumed that use of video conferencing tools would make sense (in fact the very first idea was to use GRID conferencing, not just Elluminate or similar) – but in the end these were not used to any large extent.

The original plan was that:
¹ The DUCKLING project (but not the Club) was funded by JISC. See http://www.jisc.ac.uk/whatwedo/programmes/elearning/curriculumdelivery/duckling.aspx
 
A Distance Learning Benchmarking Club of seven universities across the world, all active in distance online learning in a dual-mode fashion, will benchmark their online distance learning activity twice in the next two years, once in Autumn 2009 and once in Autumn 2010.

However, in only one institution did this pure paradigm of benchmarking twice occur – most activity took place in the period September 2010 to February 2011.

The benchmarking system was described as follows:
The benchmarking system to be used is a new version of the Pick&Mix system used already by 24 institutions in the UK for benchmarking e-learning and currently in use by two more (another UK institution and an Australian institution). It has been slightly adapted to have more focus on distance e-learning and “serious implementation” (step-change), but without going beyond the guidelines on numbers of criteria used. The basis for the new set of core criteria is the set of Critical Success Factors defined by the Re.ViCa project using extensive international input from a wide-ranging International Advisory Committee of e-learning experts – usefully, there is a substantial overlap with the current UK set of core criteria for Pick&Mix...
The Club and the Pick&Mix methodology draw on five phases of benchmarking with UK HEIs using the Pick&Mix system, and on wider experience of benchmarking in Europe and internationally (in particular, but not only, the Re.ViCa project).

Early in the history of the Club there was already interest in it from open universities and additional dual-mode institutions, and requests to join. It was decided early on that the Club was a closed club – but that I would take steps and soundings to see how to set up another club later. For that reason, the Club is more pedantically described as:

The first dual-mode distance learning benchmarking club

Plans are now being formed to set up a second benchmarking club involving some but not all of the institutions in the first Club, and some new members.
Project partners
Earlier scoping work on such a Club suggested that between 4 and 8 universities should take part – at least 4 (to ensure sufficient diversity and common working) and at most 8 (to ensure viable operation of virtual project meetings using synchronous video tools). In general terms, even numbers are better than odd – more suitable for subgroups – but this is not essential.

It was further agreed that:

each partner should be a university with at least 10,000 students whose distance online learning offering has in excess of 1000 students and also has a wide range of programmes.
The final list of 7 institutions was stable by September 2009:

1. (lead) University of Leicester, UK
2. University of Liverpool, UK
3. University of Southern Queensland, Australia
4. Massey University, New Zealand
5. Thompson Rivers University, Canada
6. Lund University, Sweden
7. KTH (Royal Institute of Technology), Sweden.

This was the list of universities reported when I presented an invited paper on benchmarking and quality to the ENQA invitational workshop on quality in e-learning in Sigtuna (Sweden) in October 2009.² I had met all of them personally at conferences in 2009, and all had confirmed to me, face to face and in writing (email), that they would join the Club and do benchmarking.
The finalists
However, despite the best endeavours of myself and Professor Gilly Salmon, we lost three institutions along the way. It is not the place of this paper to speculate on the motives and the changes behind this – that is more the role of a competitor analyst – but my summary observations would be as follows:

- The University of Liverpool, UK had seen some changes in personnel, including a new VC, and a growing and deepening relationship with Laureate – perhaps they felt that benchmarking had no value since their path was fixed by higher powers.
- The University of Southern Queensland, Australia suffered the death of a key individual – they took some time to re-engage, but by that time they had recruited Professor Gilly Salmon and it seems that their priorities went off in a different direction.
- Massey University, New Zealand were initially enthusiastic – indeed, so were both of the senior staff I met – but for whatever reason (perhaps changes of LMS, and recently some reorganisations) they never got started. Interestingly, though, in the last year they have collaborated well on other projects.
This left the “final four”, one in the UK, two in Sweden, and one in Canada:

1. University of Leicester, UK – due to some staff and internal issues they did not finish the benchmarking until late 2010
2. Thompson Rivers University, Canada – they managed to do two phases and I managed to attend their final scoring meeting (and two of their team met me in London)
3. Lund University, Sweden – they did a thorough job, as one of three methodologies they used – I attended the kick-off meeting
4. KTH (Royal Institute of Technology), Sweden – they took longer to get going but in the end did a very thorough job – I met their team three times, twice at KTH and once in London.

In line with earlier work for the Higher Education Academy, four is a sufficient basis on which to make useful comparisons.
What is “benchmarking distance learning”?
Across the world, by no means all distance learning is yet imbued with substantial amounts of e-learning. So in theory we had to decide between two options:

1. Benchmarking e-learning within the distance learning “slice” of the institution, and
2. Benchmarking distance learning, using a distance learning “mood” of Pick&Mix.
² See the presentation linked to http://www.enqa.eu/eventitem.lasso?id=249
