
Library Performance Indicators

Does it really make sense to measure them?


Contents
▪ Introduction
▪ LPIs for traditional libraries (ISO)
▪ LPIs for digital/electronic libraries (ISO)
▪ Measuring service quality:
LibQUAL+/Rodski
▪ Performance Analysis of Polish Research
Libraries Project (Marzena)
▪ New context of the library/Googlization
▪ Place of the future library
▪ LPIs for libraries of the 21st Century?
▪ Conclusions

CASLIN 2006, Český Ráj, 11–15.6.2006


Introduction

How good are we (the libraries)?



Introduction.
Quality – is it important?

High quality of library performance is crucial for each research library to survive. Wide on-line access to information makes researchers and students demand the highest quality library services. It is the quality of library services that decides on the perception of the library within its parent institution and the society.

Derfert-Wolf, Górski, Marcinek @ IFLA 2005


Introduction. What is quality?

Glossary: quality = fitness for purpose, fitness for use, conformity to requirements and absence of defects

ISO Standard 11620 (Performance Indicators for Libraries) defines "quality" as:
Totality of features and characteristics of a product or service that bear on the library's ability to satisfy stated or implied needs.

In the TQM context:
"The quality of service is defined by the customer's perception of both the quality of the product and the service providing it" (Barnard, 1994)


Introduction. Quality.

Quality assessment depends not only on the product or service as it is, but also on the person or institution involved in the assessment process.


Introduction. Quality.
Who decides about quality, and who evaluates its level and assesses the quality ("fitness for purpose") of the library?

"Many librarians maintain that only they, the professionals, have the
expertise to assess the quality of library service. They assert that users
cannot judge quality, users do not know what they want or need, and
professional hegemony will be undermined if they kowtow to users. Such
opinions about services, in fact, are irrelevant. The only thing that matters
is the customer opinions, because without users there is no need for
libraries except to serve as warehouses… After all, customers (present,
potential, and former ones) believe that the library's reason for being open
is to meet their needs. Each customer evaluates the quality of service
received and decides when (or if) there will be further interaction with that
organization”

(Hernon, Altman 1998)



Introduction. Quality.

▪ The quality of a library is defined and assessed from the perspectives of different groups of people
▪ A basic element of quality is user satisfaction
▪ Users in different countries, or even different user groups, may have different needs and expectations, and therefore different levels of satisfaction with the same service
▪ User satisfaction is NOT an objective value (though it is measurable)
▪ User satisfaction "is the emotional reaction to a specific transaction or service encounter" (Hernon, Altman)
▪ User satisfaction with a single transaction is determined by many different factors, including service quality, the user's past experience with the service provider, the user's current emotional state, etc. (Hernon, Altman)


Introduction. Service quality.

The better the service quality, the higher the satisfaction of users, but:

The "perceived quality" is different from the "objective quality".

"Service quality" is dependent on the customers' perception of what they can expect from a service and what they believe they have received, rather than on any "objective" standard as determined by a professional group or in conventional performance measurement.
R. Cullen, Library Trends, vol. 49, no. 4, 2001


Introduction. User satisfaction.

A user is satisfied when provision of the service meets his/her expectations.

If there is a gap between the service delivered and the user's expectations, the user is not satisfied with the service.


Introduction. Gap analysis.

The gaps between users' expectations and perceptions (SERVQUAL model):
1. The discrepancy between users' expectations and management's perception of these expectations
2. The discrepancy between management's perception of users' expectations and service quality specifications
3. The discrepancy between service quality specifications and actual service delivery
4. The discrepancy between actual service delivery and what is communicated to users about it
5. The discrepancy between users' expected service and their perception of the service delivered.
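Gap 5, the difference between expected and perceived service, is the one a user survey measures directly. A minimal sketch of how per-statement gap scores could be computed (the statement names and ratings below are hypothetical, not from any real survey):

```python
def gap_scores(expectations, perceptions):
    """Per-statement Gap 5 score: perception minus expectation.

    A negative score means the service fell short of what users expected.
    """
    return {statement: perceptions[statement] - expectations[statement]
            for statement in expectations}

# Illustrative mean ratings on a 1-7 scale (hypothetical survey data).
expected = {"opening hours": 6.2, "staff helpfulness": 6.5}
perceived = {"opening hours": 5.1, "staff helpfulness": 6.7}
scores = gap_scores(expected, perceived)
# "opening hours" shows a service gap; "staff helpfulness" exceeds expectations.
```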



Introduction.

▪ Quality
• User satisfaction
• Other parameters
▪ measuring user satisfaction
▪ measuring other parameters
▪ quality surveys (LibQUAL+, Rodski)
▪ Library performance indicators
▪ ISO norms



Introduction

▪ Why measure library performance?


• Library management and decision-making process
• Monitoring implementation of strategic plans
• Optimization of library activities; service enhancements
• Acquiring and rational allocation of financial resources
• Library marketing
• Accreditation
• Benchmarking, rankings, to compare…
• …

▪ academic vs. public libraries context



to compare…

"How's your wife?" "Compared to what?"

M. Lynch: Compared to What? Or, Where to Find the Stats. American Libraries, September 1999


Introduction
How to measure library performance?
▪ ISO 11620:1998 and ISO 11620:1998/AD1:2003 Information and Documentation. Library performance indicators
▪ Technical Report ISO/TR 20983:2003 Information and documentation. Performance indicators for electronic library services
▪ Poll R., te Boekhorst P. "Measuring Quality: International Guidelines for Performance Measurement in Academic Libraries". IFLA 1996
▪ ICOLC Project: Guidelines for Statistical Measures of Usage of Web-Based Information Resources. ICOLC, 2001 (http://www.library.yale.edu/consortia/2001webstats.htm)
▪ COUNTER Project (www.projectcounter.org)



ISO 11620 (1998)

Information and documentation –
– Library Performance Indicators

▪ This International Standard is applicable to all types of libraries in all countries.
▪ Indicators may be used for comparison over time within the same library. Comparisons between libraries may also be made, but only with extreme caution.
▪ This International Standard does not include indicators for the evaluation of the impact of libraries either on individuals or on society.


ISO 11620 (1998)
▪ User perception
• General
• USER SATISFACTION
▪ Public Services
• General
• Percent of Target Population Reached
• Cost Per User
• Library Visits per Capita
• Cost per Library Visit
• Providing Documents
• Titles Availability
• Required Titles Availability
• Percentage of Required Titles in the Collection
• Required Titles Extended Availability
• In-library Use per Capita
• Document Use Rate
ISO 11620 (1998)
▪ Public Services (cont.)
• Retrieving Documents
• Median Time of Document Retrieval from Closed Stacks
• Median Time of Document Retrieval from Open Access Areas
• Lending Documents
• Collection Turnover
• Loans per Capita
• Documents on Loan per Capita
• Cost per Loan
• Loans per Employee
• Document delivery from external sources
• Speed of Interlibrary Lending
• Enquiry and reference services
• Correct Answer Fill Rate



ISO 11620 (1998)
▪ Public Services (cont.)
• Information searching
• Title Catalogue Search Success Rate
• Subject Catalogue Search Success Rate
• User education
• NO INDICATOR
• Facilities
• Facilities Availability
• Facilities Use Rate
• Seat Occupancy Rate
• Automated Systems Availability
▪ Technical Services
• Acquiring documents
• Median Time of Document Acquisition
• Processing documents
• Median Time of Document Processing
ISO 11620 (1998)
▪ Technical Services (cont.)
• Cataloguing
• Cost per Title Catalogued
▪ Promotion of services
• NO INDICATOR
▪ Availability and use of human resources
• NO INDICATOR



ISO 11620 Amendment 1 (2003)

Additional indicators:

▪ Public services
• Providing documents
• Proportion of Stock Not Used
• Shelving Accuracy
• Lending Documents
• Proportion of Stock on Loan
▪ User services
• User Services Staff per Capita
• User Services Staff as Percentage of Total Staff



ISO/TR 20983 (2003)
Information and documentation –
– Performance indicators for electronic library services

The performance indicators described in this Technical Report are used as tools to compare the effectiveness, efficiency and quality of the library's services and products to the library's mission and goals. They can be used for evaluation purposes in the following areas:

▪ comparing a single library's performance over the years
▪ support for management decisions
▪ demonstrating the library's performance and its cost to the funders, the population and the public
▪ comparing performance between libraries of similar structure
▪ determining whether the library's performance or the use of its services has changed over the years
▪ determining how far the performance or use in one library differs from that in other libraries


ISO/TR 20983 (2003)
▪ Public Services
• General
• Percentage of Population Reached by Electronic Services
• Providing electronic library services
• Percentage of Expenditure on Information Provision Spent
on the Electronic Collection
• Retrieving documents
• Number of Documents Downloaded Per Session
• Cost Per Database Session
• Cost Per Document Downloaded
• Percentage of Rejected Sessions
• Percentage of Remote OPAC Sessions
• Virtual Visits as Percentage of Total Visits
• Enquiry and reference services
• Percentage of Information Requests Submitted
Electronically
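Several of these indicators are simple ratios over usage statistics. A sketch of how three of them could be computed from raw counts (the field names and figures below are illustrative; an actual implementation would use the data elements defined in ISO 2789):

```python
def electronic_service_indicators(stats):
    """Derive three ISO/TR 20983-style indicators from raw usage counts.

    The keys of `stats` are simplified, hypothetical names for the
    underlying data elements.
    """
    total_attempts = stats["sessions"] + stats["rejected_sessions"]
    return {
        "documents_downloaded_per_session":
            stats["documents_downloaded"] / stats["sessions"],
        "cost_per_database_session":
            stats["database_expenditure"] / stats["sessions"],
        "percentage_of_rejected_sessions":
            100 * stats["rejected_sessions"] / total_attempts,
    }

# Hypothetical annual figures for one database:
indicators = electronic_service_indicators({
    "sessions": 40_000,
    "rejected_sessions": 2_500,
    "documents_downloaded": 180_000,
    "database_expenditure": 120_000.0,
})
```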



ISO/TR 20983 (2003)
▪ Public Services (cont.)
• User education
• Number of User Attendances at Electronic Service Training
Lessons Per Capita
• Facilities
• Workstation Hours Available Per Capita
• Population Per Public Access Workstation
• Workstation Use Rate
▪ Availability and use of human resources
• Staff training
• Number of Attendances at Formal IT and Related Training
Lessons Per Staff Member
• Deployment of Staff
• Percentage of Library Staff Providing and Developing
Electronic Services



Measuring library service quality

▪ LibQUAL+
▪ RODSKI
▪ Performance Analysis for Polish
Research Libraries



LibQUAL+ (ARL)

▪ LibQUAL+ has 22 standard statements and the option to select five local service quality assessment statements. The client is asked to rate each of these three times – for the minimum, desired and perceived levels of service quality. These are all scaled 1–9, with 9 being the most favourable. There is an open-ended comments box about library services in general.
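Rating each statement three times is what allows LibQUAL+ analyses to compute the two standard gap measures, service adequacy and service superiority. A minimal sketch (the ratings below are hypothetical):

```python
def libqual_gaps(minimum, desired, perceived):
    """The two standard LibQUAL+ gap measures for a single statement.

    adequacy    = perceived - minimum   (negative: below the tolerable level)
    superiority = perceived - desired   (usually negative; positive means the
                                         service exceeds even the desired level)
    All inputs are mean ratings on the 1-9 scale.
    """
    return {"adequacy": perceived - minimum,
            "superiority": perceived - desired}

# Hypothetical mean ratings for one of the 22 statements:
gaps = libqual_gaps(minimum=5.8, desired=7.9, perceived=6.6)
# Perceived service is above the minimum but short of the desired level.
```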

RODSKI (rodski.com.au)

▪ Rodski is an Australian behavioural research company which develops its own surveys.
▪ Rodski has 38 statements and the option to include up to 15 local service quality assessment statements, which clients are asked to rate twice – first to measure the importance of each statement to them, and second to measure their impression of the library's performance on each statement. These are scaled 1–7, with 7 being the most favourable. There are two comments boxes at the end of the survey – one for general comments and one for "the one area we could improve on to assist you".
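Rating each statement twice makes an importance-performance ranking possible: the statements where the library falls furthest short of what users care about surface first. A sketch of such a ranking (this is not necessarily Rodski's actual analysis; the statements and ratings are hypothetical):

```python
def improvement_priorities(ratings):
    """Rank statements by the gap between importance and perceived performance.

    `ratings` maps each statement to (importance, performance) mean ratings
    on the 1-7 scale. Statements with the largest shortfall come first.
    """
    gaps = {statement: importance - performance
            for statement, (importance, performance) in ratings.items()}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

# Hypothetical survey means:
ranked = improvement_priorities({
    "journal coverage": (6.4, 4.9),
    "opening hours": (5.8, 5.5),
    "staff courtesy": (6.1, 6.0),
})
# The first entry is the most promising "one area we could improve on".
```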

Performance Analysis for Polish
Research Libraries

Marzena Marcinek



Research Libraries in Poland
◼ Total 1225
◼ National Library 1
◼ Academic Libraries 989
◼ Libraries of the Polish Academy of Sciences (PAS) 94
◼ Libraries of branch R&D units 99
◼ Public libraries 11
◼ Other 31
Collection of Polish research libraries (excl. e-collection)

                           Total       Books       Serials     Special collections
                           (thousand   (thousand   (thousand   (thousand
                           vols.)      vols.)      vols.)      physical units)
Total                      73.503      57.546      15.957      25.967
National Library            2.865       2.129         736       2.977
Academic Libraries         52.804      42.238      10.566      19.308
Libraries of the PAS        4.729       2.960       1.769         583
Libraries of branch
R&D units                   2.867       2.062         805       1.093
Public libraries            6.386       5.391         995       1.463
Other                       3.852       2.766       1.086         543


Readers, loans and staff of research libraries

                           Readers      Loans for individual     Staff
                           (thousand)   users (thousand
                                        physical units)
Total                       2.102       17.297                   9.461
National Library               35           25                     590
Academic Libraries          1.657       14.712                   6.680
Libraries of the PAS           42          270                     380
Libraries of branch
R&D units                      37          138                     234
Public libraries              250        1.904                     975
Other                          81          247                     602


Academic libraries
Regulations
◼ the Library Act of 1997
◼ the Higher Education Act of 2005

Funds
◼ budgets of parent institutions from the
resources of the appropriate ministries,
e.g. the Ministry of Science and Higher
Education (usually cover only current
expenditure)
Official library statistics in Poland

◼ Central Statistical Office (CSO) – data collected every second year
◼ "The Higher Education", published by the Ministry of Science and Higher Education – data collected every year
Assessment of higher education
institutions (and libraries)

◼ State Accreditation Commission


◼ Journals
◼ University / parent institution bodies
◼ Libraries
Characteristics of library statistics and
performance measurement in Poland
▪ lack of national library statistics system
▪ data on libraries are gathered every second year by the
Central Statistical Office - insufficient for comparable
analyses and not consistent with ISO 2789
▪ lack of unified criteria to evaluate and compare library
performance
▪ lack of tools for systematic data gathering
▪ lack of body/institution responsible for developing methods
and tools for library evaluation
▪ the State Accreditation Commission - dealing with library
issues in a very general manner
Quality initiatives and user surveys
in Polish academic libraries
◼ Development of Library Management
as Part of the University Total Quality
Management (EU Tempus grant, 1998-2000)
• "Analysis of current state of libraries
with selected performance indicators"
• user survey (LIBRA package)
◼ Comparative studies of Polish
research libraries (national conference, Krakow
2001)

◼ many separate research projects and surveys
The Group for Standardisation
for Polish Research Libraries

▪ formed in 2001, initially as an informal team
▪ activities incorporated into the overall plan of tasks of the Standing Conference of the Directors of Higher Education Libraries
▪ "Performance Analysis for Polish Research Libraries" – a project based on the agreement on cooperation signed by 8 institutions employing members of the Group (2004)
▪ project co-financed by the Ministry of National Education and Sport, 2004
A Common Project of Polish
Research Libraries on
Comparable Measures
Objectives

▪ to define methods for the assessment of Polish research libraries
▪ to select a set of performance indicators and standards for library performance (quantity, quality and effectiveness)

Goals

▪ to collect libraries' statistical data in a computer database
▪ to conduct comparative research
▪ to prepare and publish yearly reports
Tasks

▪ identification of publications on library performance and national solutions in different countries
▪ preparation and further modification of a questionnaire for the survey of library performance
▪ preparation and further modification of dedicated software for the acquisition and analysis of data collected in the surveys
▪ data collection
▪ promotion
▪ detailed analysis of data
Questionnaire
• Staff
• Collection
• Budget
• Infrastructure
• Circulation
• Information services
• Didactics
• Publications and databases created by the library
• Library cooperation, organisation of library events,
professional activity of library staff
Patterns for the Polish Questionnaire

• EU TEMPUS PHARE JEP 13242-98 "Development of Library Management as part of the University TQM"
▪ ISO 11620:1998, AD1:2003 Information and Documentation.
Library performance indicators
▪ ISO 2789:2003 Information and Documentation. International
Library Statistics
▪ R. Poll, P. te Boekhorst “Measuring Quality : International
Guidelines for Performance Measurement in Academic
Libraries”. IFLA 1996
Changes and modifications
to the questionnaire (2004)

▪ more indicators and formulas based mainly on the ISO 11620 and ISO 2789 standards (information services, electronic sources and usage)
▪ problems reported by librarians or observed by the
administrator of the database
▪ more notes and comments (financial and staff issues)
Questionnaire

▪ 48 questions of various types
▪ questions refer to easily accessible or computable data (e.g. size of collection, number of users, etc.)
▪ closed questions about the services offered (e.g. on-line reservation: Yes/No)
▪ 88 performance indicators:
▪ 19 calculated by librarians
▪ 69 calculated automatically
Why so many indicators ?

▪ the need for a comprehensive analysis of the current state of Polish research libraries
▪ the need to cover all aspects of library activities included in the questionnaires
▪ the need to develop standards for library evaluation in the future, on the basis of current performance indicators
▪ usefulness for different purposes, both for libraries and for other institutions and authorities
▪ selected indicators calculated "three times" (for lack of an FTE student equivalent)
Examples of performance indicators
required to complete the questionnaire

▪ library expenditure per student/user
▪ expenditure on library materials/books per student/user
▪ ratio of the library budget to the budget of its parent university
▪ time required for the technical processing of a document
▪ collection in the computer system as a % of the whole collection of the library
▪ percentage of catalogue descriptions acquired from outside resources
Examples of performance indicators
calculated automatically

▪ Registered users as % of potential users
▪ Total books per student/user
▪ Books added per student/user
▪ Number of students/users per one library staff member
▪ Total library area per student/user
▪ Number of students/users per one study place in
reading rooms
▪ Loans per registered user
▪ Loans per library staff member
▪ User services staff as % of total staff
▪ Staff with higher LIS education as % of total staff
▪ Open access printed books as % of total printed books
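Indicators of this kind are plain ratios over questionnaire fields, which is what makes automatic calculation possible. A sketch of four of them, with hypothetical field names and data (the real questionnaire has 48 questions and derives 69 indicators this way):

```python
def automatic_indicators(data):
    """Compute a few indicators of the automatically calculated kind.

    The keys of `data` are illustrative names for questionnaire fields.
    """
    return {
        "registered_users_as_pct_of_potential":
            100 * data["registered_users"] / data["potential_users"],
        "total_books_per_user":
            data["books_total"] / data["potential_users"],
        "loans_per_registered_user":
            data["loans"] / data["registered_users"],
        "users_per_staff_member":
            data["potential_users"] / data["staff"],
    }

# Hypothetical questionnaire data for one library:
example = automatic_indicators({
    "potential_users": 20_000,
    "registered_users": 14_500,
    "books_total": 400_000,
    "loans": 116_000,
    "staff": 50,
})
```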
Software for the acquisition
and analysis of data – requirements
▪ on-line access to the questionnaire (submission, modification)
▪ selected performance indicators automatically calculated and presented
▪ automatic control and verification of the accuracy of data in the fields
▪ multi-aspect comparative analysis of selected data and performance
indicators
▪ access to analysing functions for individual libraries
▪ Internet website - information about the Project, a set of instructions,
questionnaires, useful links, results of research
▪ module for librarians - an on-line questionnaire, multi-aspect analysis of data
concerning one’s own library
▪ administrator’s module - registration of libraries and direct contacts
▪ database - incorporate and register data from the questionnaires (dynamic
form)
▪ module for the Group - statistical analyses on data and performance
indicators
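The automatic control and verification requirement boils down to per-field plausibility checks. A sketch of two such checks (thresholds and field names are illustrative; the real application defines its own rules per question):

```python
def check_range(field, value, minimum=0, maximum=None):
    """Flag a questionnaire value that falls outside a plausible range."""
    problems = []
    if value < minimum:
        problems.append(f"{field}: {value} is below {minimum}")
    if maximum is not None and value > maximum:
        problems.append(f"{field}: {value} is above {maximum}")
    return problems

def check_ratio(field, numerator, denominator, max_pct=100):
    """Flag an implausible ratio, e.g. more registered than potential users."""
    if denominator == 0:
        return [f"{field}: denominator is zero"]
    pct = 100 * numerator / denominator
    return [] if 0 <= pct <= max_pct else [f"{field}: implausible {pct:.1f}%"]

# Both checks fail for this deliberately bad input:
problems = (check_range("loans", -5)
            + check_ratio("registered users / potential users", 25_000, 20_000))
```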
Elements of Software Application

• an Internet web-site with direct links to information about the Project, a set of instructions, questionnaires, results of research, and useful links to sites dealing with performance indicators and library statistics;

• a module for librarians – an on-line questionnaire with tools for automatic control and verification of the accuracy of data entered in each field, as well as adequate formulae to calculate selected performance indicators. There are two versions of the questionnaire: for academic libraries and for public libraries. The module for librarians also enables multi-aspect analysis of data concerning one's own library according to various criteria;
Software (2)

• an administrator's module enables registration of libraries and of the individual persons entitled to transmit data and work out analyses. It is also used for direct contacts with the library staff responsible for filling in the questionnaires;

• the database designed to incorporate and register data from the questionnaires has been given a dynamic form, i.e. the administrator can change, add or delete any fields corresponding to the questions from the questionnaire;

• a module for the Task Group for Standardisation designed as a tool to carry out statistical analyses on data and performance indicators.
Data collection
◼ Since autumn 2003 the programme for statistical data collection has been accessible to every library registered in the system
◼ By 15 May 2005, 57 libraries were registered in the Project database:
▪ 52 academic libraries (41 state-owned and 11 non-state-owned)
▪ 3 public libraries
▪ 2 special libraries
◼ Questionnaires for 2003 were completed by 29 libraries:
▪ 23 state-owned academic libraries
▪ 2 non-state-owned academic libraries
▪ 2 special libraries
▪ 2 public libraries
◼ Questionnaires for 2002 were completed by 17 libraries:
▪ 16 academic libraries
▪ 1 public library
Problems with data collection
◼ lack of some statistical data required to
complete the questionnaire or difficulties in
obtaining them
◼ lack of comparable data on the use of electronic
resources (incl. differences in usage statistics
generated by various providers)
◼ differences in library structure and budgeting within universities
◼ difficulties with validation – mistakes (e.g. a wrong ratio) need correction; misunderstanding of data requirements; wrong interpretation of questions
◼ participation in the Project is not compulsory
The analysis of data for 2002-2003
- examples
▪ performance indicators calculated by librarians
▪ performance indicators calculated automatically
▪ ratio of expenditures in library budgets
▪ groups of the analysed libraries:
▪ state-owned academic libraries of different types (university
libraries, technical university libraries, agricultural university
libraries and other),
▪ non state-owned academic libraries,
▪ public libraries
▪ special libraries
▪ average values, medians, maximum and minimum values
                                              University         University of
                                              libraries          technology libraries
Performance indicators                        2002      2003     2002      2003
Cost per user in PLN                          191,22    182,09   157,79    171,35
Acquisition cost per user in PLN               45,65     53,81    56,59     56,78
Library budget as % of institutional budget     4,81      4,75     2,57      2,59
Registered users as % of potential users       73,81     70,56    72,22     65,88
Total books per user                           21,84     20,03    11,44     12,45
Books added per user                            0,20      0,26     0,20      0,18
Loans per registered user                       7,7       8,9      7,9       7,9
Loans per library staff member               2037,6    2446,8   2904,9    2215,2
Users per library staff member                  341       364      451       432
Total library space per user                    0,34      0,26     0,18      0,18
Users per seat                                116,0     109,2     86,0      95,04
Open access printed books as % of
total printed books                             8,7      10,7     11,13      9,59
User services staff as % of total staff        48,9      54,0     57,9      58,6
Staff with higher LIS education as %
of total staff                                 34,8      43,6     50,0      51,5
Time of document acquisition and
processing in days                             16,17     37,57    14,38     16
[Pie charts: structure of library expenditure (Collection, Staff expenditures, Automation, Premises, Other expenditure) in 2002 and 2003, shown for university libraries, technical university libraries, and all examined academic libraries; in each group the largest single share grew from roughly 45% in 2002 to roughly 49–51% in 2003.]
User satisfaction

The IFLA guidelines recommend two indicators to examine user opinion:

◼ user satisfaction measured at two levels:
• general user satisfaction
• user satisfaction with individual services or components of those services

◼ user satisfaction with services offered for remote use
So far and in the future
◼ Now:
• Libraries conduct their own user surveys
• Libraries involved in the Tempus Project analysed user needs within a common user survey project, using a professional computer programme – the LIBRA software package
◼ Future:
• A unified, nation-wide user survey conducted with common tools; the methodology based on ISO standard 11620 and the IFLA guidelines. The results …
Conclusions
• The Project “Performance Analysis for Polish Research Libraries” is
focused on the development of methods and standards for the
evaluation of quality of research libraries including the academic ones
• The Task Group for Standardisation is convinced that such a
development of methods and standards ought to be preceded by a
several-year examination of performance indicators based on library
statistics and user satisfaction research
• In the next stage the results of such research will be used for the assessment of the degree to which libraries comply with the required standards
• The evaluation of current performance of research libraries is the first
stage of that task
• The methodology and tools used in the Project need to be improved,
completed and developed
Plans for the future (1)

▪ to continue the process of standardisation of statistical data
▪ to prepare guidelines for the interpretation of the indicators used
▪ to improve the database – more possibilities for comparative studies
▪ to develop more performance indicators based on ISO 11620 that can be calculated from the data already collected
▪ to select more performance indicators concerning the electronic environment
Plans for the future (2)

▪ to prepare a comment form for questions from the respondents
▪ to calculate more performance indicators from ISO 11620 on the basis of data already existing
▪ to develop standard user surveys and computer software for data analysis, for determining quantified user satisfaction as qualified indicators
▪ to develop a nation-wide set of standards and clearly determine a set of performance indicator formulas and interpretations for each standard
▪ to promote, to promote, to promote ...
The Internet in the library



Internet as a competitor to library

Here is how Americans line up when probed about specific topics and
whether they think the Internet will satisfy their information needs:

[Bar chart: for each topic – Information, Commerce, News, Health – the percentage of Internet users, compared with the remaining population, who think the Internet will satisfy their information needs.]

Pew Internet Project 2002: www.pewinternet.org



Internet as a competitor to library

[Slide image; figures shown: 71%, 85%, 87%, 76%]



Internet as a competitor to library – why?

▪ Easy access to the Internet and simple-to-use search mechanisms
▪ Independence in conducting searches
▪ Apparent proficiency in using internet browsers/search engines
▪ Excess of information available
▪ Availability of the same or similar sources of information
▪ Variety of information types available with the same tool



Internet context = Library context

Entertainment
Workplace
Learning
Research
Neighbourhood



New context

▪ Changes in users' needs and patterns of behaviour:
✓ Cellular phones
✓ Multimedia online (music, movies)
✓ News and other information online
• Podcasts (what's this?)
• RSS (what's this?)
✓ Trade online
✓ Personalization of services

▪ Societies online
✓ Blogs (what's this?)
✓ Wikis (what's this?)
✓ Chat rooms
✓ Tagging, commenting, opinions
✓ Virtual realities, lives…



150,000-250,000
visits A DAY!



▪ Technologies behind



PL = 73%
W. Europe → 100%



Where can I find the podcast from the latest lecture by…?



Palmtops



Nano Phone, Cardphones, ...



E-books



E-books



E-books



E-paper



E-paper



E-paper

Do you know how to catalogue THIS?
Can it be catalogued at all?
Do we NEED to catalogue this?



Google invests in wired …

A $189,000,000 pilot
Bidirectional wireless module



▪ Googlization



Google as the dominant information provider

• Google "Classic"
• Google News
• Google Print
• Google Earth
• Google Video
• Google Alerts
• Google SMS
• Google Answers
• Google Groups
• Google Labs
• Google College
• Google Scholar
• .....???



Books



Next Massive Wave of Broadband Expands

[Diagram: transition to 3G; real-time infrastructure; service-oriented architecture; low-power-consumption mobile/display devices; secure broadband wireless, 2006/7.]
WEB 2.0
▪ RSS – really simple syndication
▪ Wikis
▪ New Programming Tools: AJAX, API
▪ Blogs and blogging
▪ Recommender Functionality
▪ Personalized Alerts
▪ Web Services
▪ Folksonomies, Tagging and Tag Clouds
▪ Photos (e.g. Flickr, Picasa)
▪ Social Bookmarking
▪ Commentary and comments
▪ Personalization and My Profiles
▪ Podcasting and MP3 files
▪ Streaming Media – audio and video
▪ User-driven Reviews
▪ Rankings & User-driven Ratings
▪ Instant Messaging and Virtual Reference
▪ Social Networking
▪ Socially Driven Content
▪ Open access, Open Source, Open Content
Use your imagination:

▪ Mobile devices
▪ Electronic paper
▪ Global digitalization of resources
▪ Open access to knowledge
▪ Wireless networks



▪ Place of the library



Results of round II/III (Feret, Marcinek 2005)

[Chart: Delphi panel answers to the question "In 10 years from now, what percentage of information will be accessed by people via electronic, and not by printed, media?", given separately for book reading, book distribution, journal reading, journal distribution, and electronic information reading and distribution; the answers spread from a few percent up to 100%.]
Results of round II/III (Feret, Marcinek 2005)

[Charts: Delphi panel answers to two further questions: "What percentage of queries asked by academic library users will in the year 2015 be directed to the Internet instead of their university library?" (shown for reference and research queries) and "What percentage of library users will visit the library in person at least once a year, in the university of 2015?"; the answers spread from 0 up to 100%.]

The Long Tail of QUESTIONS
Place of the library

• Library – a social place?


• Library – information sorter?
• Library – warehouse of obsolete resources?
• Library – consultation centre?

▪ …we need to move on from the mindset of the local 'library' as the core
supplemented by digital resources from external providers and the wider
internet – to a different mindset where the 'library' is a value-added
overlay on the wider canvas of readily available digital information
content, which provides value-added presentation and personalised
delivery of information resources to match the specific needs of
researchers, students and staff in the University, integrated with their
other working/study materials. (Di Martin)



Library performance indicators in 2020

What indicators… ?
▪ Some of the "classic" indicators will survive
▪ The basic indicators will be:
• Demand for library services (percentage of target
population, which uses the library) (what for?,
how often?)
• User satisfaction (definitely!)
• Impact of library on the quality of scientific
research (VERY difficult to measure)
• Ranking in user-driven ratings
• … any other suggestions?
Conclusions

▪ Library performance indicators will not die, though formal measurement of different aspects of library activities will be less important than it is now
▪ The basic factor driving changes will be user satisfaction; this will also be important for university management as proof of the importance of the library



• It's an "Exploration Space", not a collection space
• It's an Information Ocean, not a Highway
• The future is already here, it's just not evenly distributed yet

You don't have to agree with us, but be warned…



We tend to overestimate
changes to happen in the
coming year, but to
underestimate changes in the
coming decade…
Andrew Odlyzko



Many thanks to Stephen Abram (SirsiDynix) for sharing
slides, some of which were used in this presentation.

