Telephone survey research for library managers

Philip Calvert
Victoria University of Wellington, Wellington, New Zealand

Adam Pope
Government Actuary’s Department, London, UK

Received June 2004
Revised August 2004
Accepted October 2004
Abstract
Purpose – To explore and evaluate the evidence about the value of using telephone surveys,
especially in market research for a library.
Design/methodology/approach – A critical summary and review of the literature in this field.
Findings – This paper demonstrates that there are five major reasons for using this method of
surveying customer preferences: response rates are higher; data can be analysed sooner; the cost of
surveys is lower than alternative methods; calls can be monitored for quality; and the telephone offers
the benefits of spontaneity.
Practical implications – The paper contains extensive information on “best practice” of telephone
surveying including designing the questionnaire and conducting the interview. There is information
on getting a representative sample, plus coping with “no answers”, unlisted numbers and answering
machines.
Originality/value – Library managers with a commitment to using innovative techniques for
market research will find telephone surveys offer a useful and cheap alternative to other survey
methods. No previous paper has examined the use of, and the value of, telephone surveys in libraries.
Keywords Libraries, Market research, Surveys
Paper type Research paper
The use of telephone surveys has not figured largely in LIS research in the past. For
many years they have been used to good effect in the private sector, especially in
market research and customer satisfaction surveys, and this makes it worth asking
why they have not been more widely used by LIS researchers and practitioners. As
library managers pay increasing attention to the library as a product, and its place in a
market, the relevance of telephone surveys should become more apparent. Using a
telephone comes more naturally to people than reading and writing, yet most library
surveys are conducted using printed survey forms that require a written response,
whereas telephone surveys need only the ability to hear and speak.
The survey is still the most common method used in LIS research (Powell, 1999).
Surveys are usually treated as a single class and few writers have separated telephone
surveys from other methods, but one point is clear: telephone surveys represent only a
very small proportion of all LIS research.
The American Marketing Association regards telephone surveys as equivalent to
mail surveys, a quantitative method (Dutka, 1993), and quantitative research is the
simplest and most efficient method for collecting data for manipulation by statistical
methods. On the other hand, qualitative research is best when seeking free-form
responses in which the respondent uses his or her own words. That method allows for
the collection of in-depth information such as personal opinions, and the reasons why,
where and how people choose to do things.

(Library Management, Vol. 26 No. 3, 2005, pp. 139-151. Emerald Group Publishing
Limited, ISSN 0143-5124. DOI 10.1108/01435120510580889)
When the survey is conducted by telephone, the respondent is immediately
available so the survey can be completed straight away, and clearly this is an
advantage of the method. The person being questioned can only respond with audible
signals, though, hence it is much harder to gauge the mood and demeanor of the
respondent. Nor can the respondent see visual signals such as rating scales, or the
color, size and shape of objects (Alreck and Settle, 1995). This lack of “observational
data” has been cited as a weakness in telephone surveys (Blankenship, 1977, p. 48). For
this reason alone, telephones lend themselves to structured surveys rather more than to
open-ended interviews, though they can be used for the latter by competent
interviewers.
Two techniques that assist telephone surveys should be mentioned at this point,
random digit dialing (RDD) and computer assisted telephone interviewing (CATI).
RDD in its purest form is simply the random selection and dialing of area code,
exchange and suffix numbers. It can be done manually, though there is also computer
software to make this process faster and more accurate. The advantage is that it
ensures all numbers have an even chance of being called, hence it is truly random. The
disadvantage is that many numbers could be out of service, or belong to people or
organizations that are not in the target population. This problem can be limited by the
use of systematic RDD, in which only numbers that meet specified criteria are dialed.
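The two flavours of RDD described above can be sketched in a few lines of Python. This is an illustrative sketch only: the area codes, the exchange range and the `valid_exchanges` filter are assumptions for the example, not details from any particular survey.

```python
import random

def random_digit_dial(area_codes, n, valid_exchanges=None):
    """Pure RDD: randomly assemble area code, exchange and suffix.
    Passing valid_exchanges turns this into systematic RDD, in which
    only numbers meeting a specified criterion are kept."""
    numbers = []
    while len(numbers) < n:
        area = random.choice(area_codes)
        exchange = random.randint(200, 999)      # illustrative exchange range
        if valid_exchanges is not None and exchange not in valid_exchanges:
            continue                             # systematic RDD: discard and redial
        suffix = random.randint(0, 9999)
        numbers.append(f"{area}-{exchange:03d}-{suffix:04d}")
    return numbers
```

Pure RDD gives every number an even chance of being called but wastes effort on out-of-service numbers; the exchange filter trades a little of that pure randomness for fewer dead calls.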
CATI systems integrate traditional telephone survey methods with computer
telephone technology. Typically, each interviewer is equipped with a hands-free
headset and is seated at a computer. The computer dials the number and provides the
interviewer with the appropriate introduction screen. Simple keystrokes bring up the
first screen of questions, and if relevant, possible answer choices, then on to further
screens for alternative paths if Yes or No answers lead to different questions. One
advantage of CATI systems is that several members of a household can be
interviewed, yet only one set of common data need be collected for the family and is
then “shared”, saving the need for multiple calls. A second advantage is that
respondents who must leave during a call can be automatically called back at a set time
and the interview resumed with no loss of data. The major disadvantage of CATI
systems for LIS research is the initial investment they require and their operating
costs. Staff using such systems must possess computer skills in addition to the more
traditional skills of the interviewer. There is a simple alternative for researchers
without access to CATI systems, for data can easily be entered into a spreadsheet such
as Microsoft Excel or SPSS. Using the “freeze panes” option in Microsoft Excel allows
the interviewer to progress from one respondent to the next without losing sight of the
questions.
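The spreadsheet alternative can be as simple as appending one row per completed interview, with a header row of question labels playing the role of the frozen pane. A minimal sketch; the file name and question labels are invented for illustration.

```python
import csv
import os

# Illustrative question labels; a real survey would list its own.
QUESTIONS = ["used_library_this_year", "visits_per_month", "birth_year"]

def record_response(path, respondent_id, answers):
    """Append one completed interview as a row; write the header
    (the question labels) only when the file is first created."""
    is_new = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["respondent"] + QUESTIONS)
        writer.writerow([respondent_id] + list(answers))
```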
The questionnaire
There are no aesthetics in the design and construction of telephone surveys, only utility
(Frey, 1983, p. 89). Ease of administration by the interviewer is a prime concern.
The survey instrument should be designed as a whole, so the introduction, and the
instructions to the interviewer and respondent are of great importance. For
the interviewer: the response categories must be clear; question wording must allow
for a conversational tone; and transitional statements, adequate question introductions,
and branching statements may all be necessary to aid the flow between questions. For
the respondent: the logic of question order makes a difference; long questions simply
become confusing; and the level of motivation must be maintained.
The design of the questionnaire can be divided into three groups of questions: an
initial section that introduces the surveyor and assesses the respondent's ability to
answer, then the body of the questionnaire, and finally the demographic and
miscellaneous questions. Leaving the simple demographic questions until the end
helps avoid respondent fatigue (Frey, 1983, p. 110).
The initial sentence requires careful design as it will make or break the response
rate. It requires a friendly greeting, the interviewer’s name, a statement about the
purpose of the call, and how long it will take. At this point the intended recipient makes
the decision whether to participate or not, and “Obtaining a favourable decision on
participation is largely dependent on the nature of the introduction to the survey”
(Frey, 1983, p. 87). Here, a library survey has an advantage because the topic is
non-threatening and for a large part of the population, a service that they mostly like.
Mentioning the library several times during the introduction is likely to raise the
response rate. If the respondents’ interest is raised by the introduction, the first
questions must continue the theme – there is no point in saying the survey will help
improve library services only to then ask the respondent’s age, or how often they use
the internet.
The simplest questions are the best in telephone surveys. Closed questions, such as
Yes/No, “How often/how many?”, “Tell me all that you recognise from this list”, or
“Please rate the following on a scale of one to five”, are all suitable for the telephone. Open
questions can be asked, of course, but they pose problems in coding, and they are
potentially ambiguous (Blankenship, 1977, p. 93).
Questions are best grouped by topic, and in a way that matches the respondent’s
perception of the relationship between items. Pre-testing may discover that
respondents would not group questions in the same way as the survey’s authors.
Turning to practical interviewing, it is necessary to consider how to get a
satisfactory sample. To incorporate an element of chance into the survey the
interviewer should first ask to speak to the member of the household whose birthday
comes next.
The issue of age needs careful consideration. Interviewing people under the age of
18 should require the consent of a parent or guardian, and the questions must be
worded for the expected vocabulary of younger respondents. If people under 18
years of age are nevertheless an important group to be surveyed, there are other
methods of reaching them, such as through their school.
Those elements of the community with a personal stake in the outcome of the
survey also ought to be excluded to avoid any skewing of the data in their favor. In the
case of a library survey, ask a question early on in the survey, such as “Do you work in
any of the following occupations: library work, information management, or library
supply?”
Order response bias occurs when respondents become fatigued by the repetitive
nature of questions, with interviewees responding with awareness to the first questions
then more carelessly towards the end. To avoid this bias occurring, when half of the
required sample has been interviewed, simply swap portions of the questionnaire
around, bringing the second half of the body to the beginning, for instance.
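A minimal sketch of this halfway swap, assuming the body of the questionnaire is held as an ordered list of question blocks:

```python
def question_order(blocks, interviews_done, target_sample):
    """Return the question blocks in their original order for the first
    half of the sample, then with the two halves of the body swapped,
    so order-response bias is spread evenly across the questions."""
    if interviews_done < target_sample // 2:
        return list(blocks)
    mid = len(blocks) // 2
    return blocks[mid:] + blocks[:mid]
```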
Demographic questions are almost always a valuable tool – not only to assess how
well the sample fits the demographic of the population, but also to be able to tailor any
resulting solutions to specific age, gender, educational, income or ethnic groups. Some
questions, however, can be disturbing to the respondent, and it is important to phrase
them in such a way as not to offend. To find out the respondent’s age, it might be
asked, “in what year were you born?” – gender is sometimes obvious from the
respondent’s voice – and ethnicity can be determined by asking, “Would you consider
yourself to be a New Zealander?” (for example). If the reply is “yes”, then ask “of Maori
descent? ... European descent? ... Chinese descent? ...” If the respondent answers “no”
to all such questions, then “Which ethnic group do you most identify with?” is the
logical next question.
Income can be divided into tiers by asking, “Is your income above or below $35,000
per year?” If below, ask “... above or below $20,000?”, and similarly if above $35,000.
Finally, education can easily be covered with a question such as “What was your
highest educational qualification?”
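The two income questions form a small decision tree: at most two questions place the respondent in one of three tiers. A sketch, with `ask` standing in for the interviewer (it returns True when the answer is "above"):

```python
def income_tier(ask):
    """Classify income into three tiers with at most two questions,
    using the $35,000 and $20,000 thresholds from the example."""
    if ask("Is your income above or below $35,000 per year?"):
        return "above $35,000"
    if ask("Is it above or below $20,000?"):
        return "$20,000 to $35,000"
    return "below $20,000"
```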
In order to validate that the research has been conducted, respondents can be asked
towards the end of the survey if they are happy to have their phone number released to
a trusted and independent body, such as a supervisor, Justice of the Peace, or member
of the Library Board, in order that a third party can verify that the interview took
place. If the respondent declines, their phone number is deleted and this can be
recorded with a code such as FA (full, anonymous), indicating an interview was
conducted but the respondent wished to remain anonymous.
Another question towards the end of the survey asking whether the respondent
would like a summary of the results is important for the profession’s credibility as well
as the ethical rigor of the survey. If the respondent asks for one, the interviewer should
explain that he/she needs to open a separate file independent of their answers and enter
the respondent’s address into it whilst assuring them of confidentiality. E-mail
addresses can be used when possible as this can save a great deal of money and effort
when delivering the final results. Depending on the interest value of the survey to
respondents, about one third will be keen to see a summary.
Finally, thank the respondents for their time and assure them of the usefulness of
their contribution.
Telephone numbers
There are a number of methods of sourcing telephone numbers to ring. Many
commercial market research companies are willing to sell lists of randomly chosen
numbers that have a good chance of being answered, though it is far cheaper to go
through the phone directory and randomly pick phone numbers. It is most advisable to
run a telephone survey shortly after a new directory is published to avoid disconnected
numbers and to catch transient members of the population. Using a ruler and marking
every fifth centimetre down each column, then transposing the telephone number beside
each mark into a spreadsheet, is an efficient way to produce a near-random (strictly, a
systematic) list. A computer-based alternative is RDD software, described earlier.
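The ruler-and-column method is a physical form of systematic sampling: take entries at a fixed interval from a random starting point. In code, assuming the directory's numbers have already been typed or scanned into a list:

```python
import random

def systematic_sample(directory_numbers, n):
    """Take every k-th entry from a random starting point, the code
    analogue of marking every fifth centimetre down each column.
    Assumes the directory holds at least n numbers."""
    step = len(directory_numbers) // n
    start = random.randrange(step)
    return [directory_numbers[start + i * step] for i in range(n)]
```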
Phone numbers should be stored in a separate, password-protected file to ensure
respondents’ identities are kept secure and apart from their answers.
If the population needed for the research covers a larger area than a free-calling area
there are a number of demographic factors to consider. To check what the demographic
is, look up the latest census figures. If the demographic in the free-calling area does not
closely match that of the population to be surveyed it will be difficult to claim in the
results that the sample is in any way representative of the larger population.
There are options to solve this. Calling outside of the free-calling area is a possibility
if funds allow. Be sure to check all the options for calling – often pre-paid cards or
other telecommunications companies can offer better deals than the predominant
market player.
Another option is to attempt to replicate the demographic of the population in the
sample. This is simply solved when gender is involved by asking “. . . may I speak to
the lady/gentleman of the house whose birthday comes next please?” However, if
20 per cent of the people in the target population are of Chinese origin, and only
10 per cent of your free-calling area is of Chinese origin then, towards the end of the
survey, it will be necessary to exclude potential respondents of races other than
Chinese. Randomly finding a 60-70 year-old female of Indian descent with no formal
education who is willing to do the survey, for instance, can be particularly vexing. You
may have to settle for “close enough” and mention any demographic limitations in your
results.
The statistical solution is to bring the responses obtained from one segment of the
population in line with their proportion of the demographic. For example, if conducting
a survey on use of the internet within a library, and if all females used the library for
internet services but were only 40 per cent of the sample – and 50 per cent of the
population – the proportion of females answering “Yes” would increase by 25 per cent
(10 per cent of the total “Yes” responses). Where only half of the male population used
the library for internet services but were 60 per cent of the sample, the proportion of
male “Yes” answers would decrease by 16.6 per cent (5 per cent of the total “Yes”
responses) (Tables I and II).
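The adjustment shown in Tables I and II is post-stratification: each group's within-group "Yes" rate is reweighted from its share of the sample to its known share of the population. A sketch using the example's own numbers:

```python
def weighted_yes(yes_rate, share):
    """Overall 'Yes' proportion: each group's within-group rate times
    that group's share (of the sample, or of the population)."""
    return sum(yes_rate[g] * share[g] for g in yes_rate)

# Within-group rates from Table I: all females said Yes, half the males did.
yes_rate = {"female": 1.0, "male": 0.5}
sample_share = {"female": 0.4, "male": 0.6}   # shares in the sample
pop_share = {"female": 0.5, "male": 0.5}      # shares in the population (census)

raw = weighted_yes(yes_rate, sample_share)    # 0.70, the unadjusted figure
adjusted = weighted_yes(yes_rate, pop_share)  # 0.75, as in Table II
```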
Response rate
It is important that the response rate of the survey is as high as possible. There are a
number of ways to increase the willingness of respondents to answer the survey. The
depth of the interviewer’s desire to learn from any potential respondent is invariably
obvious at once, and will be strongly reflected in the results.
Perhaps the second biggest negative for respondents is the time the interviewer
says the survey will take. Though honesty is the best policy, it is often hard to judge
exactly how long a survey will take from the outset. Furthermore, the shorter the time
quoted at the beginning the higher the chance of the responder accepting the invitation.
It is therefore advisable to quote the average time it takes to conduct an interview
whilst taking out the longest 20 per cent of interviews since the respondents in that
category invariably dithered and cared not how long the interview took. As a rough
guide, if the questioner quotes 5 minutes for the interview at the start, people who
answer the survey speedily will start to question the length of the interview once 12 to
15 minutes is up. A 15-minute quote, though getting a lower response rate, will start
similar questions arising after 30 minutes. Any longer than 15 minutes and it is better
not to give a time. If queried, issue the responder with a challenge such as “I had one
respondent who managed to knock off a survey within 20 minutes. Care to better her?”

Table I. With 60 per cent of the sample male

               Internet use
              Yes       No
60 males       30       30
40 females     40        0
100 people     70       30

Table II. After analysis

               Internet use
              Yes       No
50 males       25       25
50 females     50        0
Total (%)      75       25
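The suggested quote, the average interview length after discarding the longest 20 per cent, is a one-sided trimmed mean. A sketch, assuming durations are recorded in minutes:

```python
def time_to_quote(durations_minutes):
    """Mean interview length after dropping the longest 20 per cent,
    on the grounds that those respondents dithered anyway."""
    kept = sorted(durations_minutes)[: max(1, int(len(durations_minutes) * 0.8))]
    return sum(kept) / len(kept)
```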
Another effective method of increasing the response rate of a survey is to mirror the
language of the respondents. Proficiently adopting the traditional reply to the initial
phone greeting (such as “Ni hao ma?” when a respondent answers the phone with
“Wei?”), speaking at a similar pace, using comparable language, and responding with
similar intensity all help to build rapport with the respondent and lift the response
rate.
Once the interviews are complete, unprotect the phone number spreadsheet and
forward it to the independent body, asking that they phone a random set of the
completed interviews to verify that the interviews took place. Once this is complete the
spreadsheet must be destroyed.
Limitations
Telephone surveys are by no means enumerative (they can never canvass the whole
population) and this has to be considered a major limitation. There are, furthermore, a
number of biases arising from phone surveys that need to be taken into consideration,
and avoided as far as possible.
Telephone surveys can expect refusals to be in the range of 10-25 per cent of those
called. In comparison with a mail survey, where refusal rates of 90 to 95 per cent are not
uncommon (Alreck and Settle, 1995, p. 35), the telephone survey is still the efficient
choice. Nevertheless, when confronted with the introductory patter about a short
survey on library services, some people will say they are not interested. It is a good
idea to formalise a response to draw them back into the survey, such as “You haven’t
used a library in the last year?” Invariably the answer will be “No”, and the interview
can be continued with other questions pertinent to their information needs and why
they do not use a library. This will draw a markedly better response rate.
To make the results clearer in the final analysis and show any population biases the
survey may have, in the figures and tables which have demographic data for the entire
sample, incorporate a column containing equivalent data from the latest census to
highlight any differences between the norm for the population and the sample.
Notes
1. New Zealand data from Darren Lim, a representative of Telecom Yellow Pages Data
Solutions Team. UK data from Noble et al. (1998).
2. In order to calculate the number required for smaller populations, or to find the confidence
ratings of smaller samples, there are look-up tables in many books on statistics (Hernon,
1994). Also try the calculator at www.surveysystem.com/sscalc.htm.
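The look-up tables and the online calculator both implement the standard formula for estimating a proportion, with a finite-population correction. A sketch at the usual 95 per cent confidence level (z = 1.96), using the most conservative assumption p = 0.5:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Required sample for estimating a proportion, with the
    finite-population correction used by the printed look-up tables.
    p = 0.5 gives the largest (safest) answer."""
    n0 = z * z * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))
```

For a service population of 1,000 this gives 278 interviews at plus or minus 5 per cent, matching the figure commonly tabulated in statistics texts.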
References
Alreck, P.L. and Settle, R.B. (1995), The Survey Research Handbook: Guidelines and Strategies for
Conducting a Survey, 2nd ed., McGraw-Hill, New York, NY.
Blankenship, A.B. (1977), Professional Telephone Surveys, McGraw-Hill, New York, NY.
Collins, M. (1999), “Editorial: sampling for UK telephone surveys”, Journal of the Royal Statistical
Society, Vol. 162 No. 1, pp. 1-4.
Colombotos, J. (1969), “Personal versus telephone interviews: effect on responses”, Public
Health Reports, September, pp. 773-82.
D’Elia, G., Jorgensen, C., Woelfel, J. and Rodger, E.J. (2002), “The impact of the internet on public
library use: an analysis of the current consumer market for library and internet services”,
Journal of the American Society for Information Science and Technology, Vol. 53 No. 10,
pp. 802-20.
Dutka, A. (1993), AMA Handbook for Customer Satisfaction, NTC Business Books, Lincolnwood,
IL.
Frey, J.H. (1983), Survey Research by Telephone, Sage, Beverly Hills, CA.
Groves, R.M. and Kahn, R.L. (1979), Surveys by Telephone: A National Comparison with Personal
Interviews, Academic Press, New York, NY.
Gunn, H. (2002), “Web-based surveys: changing the survey process”, First Monday, Vol. 7 No. 12,
available at: http://firstmonday.org/issues/issue7_12/gunn/index.html (accessed 12 April
2004).
Hernon, P. (1994), Statistics: A Component of the Research Process, revised ed., Ablex, Norwood,
NJ, p. 121.
Klobas, J.E. and Clyde, L.A. (2000), “Adults learning to use the internet: a longitudinal study of
attitudes and other factors associated with intended internet use”, Library & Information
Science Research, Vol. 22 No. 1, pp. 5-34.
McGuckin, N., Keyes, M.A. and Liss, S. (2001), Hang-Ups: Looking at Non-Response in Telephone
Surveys, available at: www.fhwa.dot.gov/ohim/hang_ups.htm (accessed 27 April 2004),
Federal Highway Administration, US Department of Transportation, Washington, DC.
Mossberger, K., Tolbert, C.J. and Stansbury, M. (2003), Virtual Inequality: Beyond the Digital
Divide, Georgetown University Press, Washington, DC.
Noble, I. et al. (1998), “Bringing it all back home. . .”, Proceedings of the Annual Conference of the
Market Research Society, p. 48.
Powell, R.R. (1999), “Recent trends in research: a methodological essay”, Library & Information
Science Research, Vol. 21 No. 1, pp. 91-119.
Roselle, A. and Neufeld, S. (1998), “The utility of electronic mail follow-ups for library research”,
Library & Information Science Research, Vol. 20 No. 2, pp. 153-62.
United States Department of Commerce (1998), Falling Through the Net: Defining the Digital
Divide, available at: www.ntia.doc.gov/ntiahome/fttn99/part1.html (accessed 1 April 2004),
National Telecommunications and Information Administration, Washington, DC.
Ward, S.M. (2000), “The client satisfaction survey as a tool for evaluating library fee-based
information services”, Journal of Interlibrary Loan, Document Delivery & Information
Supply, Vol. 10 No. 3, pp. 63-76.
Further reading
International Telecommunications Union (ITU) (2004), Main Telephone Lines, Subscribers per
100 People, available at: www.itu.int/ITU-D/ict/statistics/at_glance/main01.pdf (accessed
2 June 2004).