Incentives. Relative to other strategies for increasing response rates in Web
surveys, many studies have examined the use of incentives. Much of this work is
summarized in a meta-analysis by Göritz (2006a; see also Göritz, 2010). Across 32
experimental studies, she found that incentives significantly increased the propor-
tion of invitees who started the survey (with an average odds ratio for the incentive
effect of 1.19). But what types of incentives are most effective? The general findings
in the survey literature are that prepaid incentives are more effective than promised
or conditional ones, and that cash incentives are more effective than alternatives
such as in-kind incentives, prize draws, sweepstakes, loyalty points, and the like (see
Church, 1993; Singer, 2002). Despite this, sweepstakes or loyalty-point incentives,
conditional on completion, are popular in Web surveys, especially among volunteer
panels.
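To get a feel for what an average odds ratio of 1.19 implies in response-rate terms, the odds can be converted back to proportions. A minimal sketch follows; the 20 percent baseline rate is an illustrative assumption, not a figure from the meta-analysis:

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Return the response proportion implied by multiplying the
    odds of the baseline proportion by the given odds ratio."""
    odds = p_baseline / (1 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# With an assumed 20% response rate absent incentives, an odds
# ratio of 1.19 corresponds to roughly a 22.9% rate with them:
# odds go from 0.25 to 0.2975, i.e., 0.2975 / 1.2975 ≈ 0.229.
p_with_incentive = apply_odds_ratio(0.20, 1.19)
```

The modest gain illustrates why the odds-ratio metric can look larger than the corresponding change in the response rate itself.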
There are several reasons why Web researchers prefer conditional incentives to
prepaid ones and noncash incentives to cash. First, prepaid cash incentives cannot
be delivered electronically; they require mailing addresses and entail more expen-
sive processing and mailing of materials. Second, if the response rate is likely to be
in the single digits (as is often the case), the return on investment may be low (but
see Alexander, Divine, Couper, McClure, Stopponi, Fortman, Tolsma, Strecher, and
Johnson, 2008, discussed below). Third, as Göritz (2006b) notes, the costs of sweep-
stakes are usually capped, since the size of the prize stays the same regardless of
how many people participate. This makes it easier to manage survey costs. Although
sweepstakes and loyalty points are attractive to the researchers, are they effective in
encouraging response from sample persons?
Göritz (2006a) found that sweepstakes incentives produced higher response rates
than no incentives in her meta-analysis of 27 experimental studies involving sweep-
stakes, most of them based on commercial panels. However, in a meta-analysis of six
incentive experiments in a nonprofit (academic) panel, she found that offering a cash
sweepstakes provided no significant benefit over no incentive at all (Göritz, 2006b).
Thus, while sweepstakes may be better than nothing, at least for commercial panels,
it is not clear whether they are better than alternative incentive strategies.
In one of the few studies comparing different types of incentives, Bosnjak and
Tuten (2002) conducted an experiment in a survey among real estate agents and brokers
for whom they had email addresses. They tested four incentive types: 1) $2 prepaid
via PayPal with the first contact produced a 14.3 percent response rate; 2) $2 promised via
PayPal upon completion produced a 15.9 percent response rate; 3) a prize draw for
two $50 and four $25 prizes upon completion produced a 23.4 percent response
rate; and 4) a control group with no incentive produced a 12.9 percent response rate. One
reason why the prize draw may have outperformed the prepaid and promised incen-
tives is that cash was not used for the latter; for the PayPal incentive to be of value,
one had to have a PayPal account. Another study (Birnholtz, Horn, Finholt, and
Bae, 2004) compared 1) a mail invitation with $5 cash, 2) a mail invitation with a
$5 Amazon.com gift certificate, and 3) an email invitation with a $5 Amazon.com
e-certificate in a sample of engineering faculty and students at 20 universities. The
study found the highest response rate (56.9 percent) for the cash incentive group,
followed by the mail (40.0 percent) and email gift certificate groups (32.4 percent).
This study suggests that cash outperforms a gift certificate (consistent with the ear-
lier literature on survey incentives) and is also consistent with the studies showing an
advantage of mail over email invitations.
Alexander and her colleagues (2008) conducted an incentive experiment as part
of the recruitment effort for an online health intervention. Invitations to enroll online
were sent by mail to members of a health maintenance organization. The experiment
tested six different enrollment incentives: no incentive, prepaid incentives of $1, $2,
or $5, and promised incentives of $10 or $20. The highest enrollment rates were
for the three prepaid incentive groups, with 7.7 percent enrolling with the $5 incen-
tive, 6.9 percent with the $2 incentive, and 3.9 percent with the $1 incentive. The
promised incentives produced enrollment rates of 3.4 percent with the $10 promised
incentive and 3.3 percent with the $20 promised incentive. The no-incentive group
had a 2.7 percent enrollment rate. This provides further evidence in support of the
effectiveness of prepaid incentives in online surveys. In terms of cost, the $5 prepaid
group cost approximately $77.73 per enrolled case, the $2 prepaid group cost about
$43.37, the $1 prepaid group $51.25, the no-incentive group $36.70, the $10 prom-
ised group $41.09, and the $20 promised group $50.94. Despite the relatively low
enrollment rates, a small prepaid incentive (a $2 bill) proved cost-effective relative to
the promised incentives, though not compared to no incentive at all.
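The per-case figures above fold together fixed mailing costs and incentive payouts: prepaid incentives are paid to every invitee, while promised ones are paid only to those who enroll. The bookkeeping can be sketched as follows; the invitation count and per-invitation cost below are wholly hypothetical, since this passage does not report the study's underlying cost inputs:

```python
def cost_per_enrolled_case(n_invited, cost_per_invite,
                           incentive, prepaid, enroll_rate):
    """Total recruitment cost divided by the number of enrollees.

    Prepaid incentives are sent to every invitee; promised
    (conditional) incentives go only to those who enroll.
    """
    n_enrolled = n_invited * enroll_rate
    incentive_cost = incentive * (n_invited if prepaid else n_enrolled)
    total_cost = n_invited * cost_per_invite + incentive_cost
    return total_cost / n_enrolled

# Hypothetical inputs: 10,000 mailed invitations at $1 each for
# printing and postage, with the enrollment rates reported above.
prepaid_5 = cost_per_enrolled_case(10_000, 1.00, 5.00, True, 0.077)
promised_20 = cost_per_enrolled_case(10_000, 1.00, 20.00, False, 0.033)
```

The structure makes the trade-off visible: a prepaid incentive multiplies its cost by the full invitation list, so it pays off only when the resulting lift in enrollment is large enough to spread that outlay over more completed cases.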
This brief review suggests that incentives seem to work for Web surveys in pretty
much the same way as in other modes of data collection and for pretty much the
same reasons. Although it is impractical for Web panels to send mail invitations with
prepaid incentives when they are sending tens of thousands of invitations a day, the
combination of an advance letter, a small prepaid cash incentive, and an email invi-
tation may be most effective for list-based samples.