
Internet Research

The value of online surveys: a look back and a look ahead


Joel R. Evans, Anil Mathur
Article information:
To cite this document:
Joel R. Evans, Anil Mathur, "The value of online surveys: a look back and a look ahead", Internet Research, https://doi.org/10.1108/IntR-03-2018-0089
Permanent link to this document:
https://doi.org/10.1108/IntR-03-2018-0089



The value of online surveys: a look back and a look ahead

Abstract

Purpose – To present a detailed and critical look at the evolution of online survey research since
Evans and Mathur’s 2005 article on the value of online surveys. At that time, online survey research
was in its early stages. Also covered are the present and future states of online research. Many
conclusions and recommendations are presented.


Design/methodology/approach – The look back focuses on online surveys, strengths and
weaknesses of online surveys, the literature on several aspects of online surveys, and online survey
best practices. The look ahead focuses on emerging survey technologies and methodologies, and
new non-survey technologies and methodologies. Conclusions and recommendations are provided.
Findings – Online survey research is used more frequently and is better accepted by researchers than
in 2005. Yet, survey techniques are still regularly transformed by new technologies. Non-survey
digital research is also more prominent than in 2005 and can better track actual behavior than
surveys can. Hybrid surveys will be widespread in the future.
Practical implications – The paper provides insights for researchers with different levels of
online survey experience, for both academics and practitioners.
Social implications – Adhering to a strong ethics code is vital to gain respondents’ trust and to
produce valid results.
Originality/value – Conclusions and recommendations are offered in these specific areas: defining
concepts, understanding the future role of surveys, developing and implementing surveys, and a
survey code of ethics. The literature review cites more than 200 sources.
Keywords Online survey, Research trends, Theories of survey research, Survey methodology,
Content analysis, Future of the online survey, Survey ethics
Paper type General review

Note: We thank the editor for her support of this project, and all of the reviewers for their suggestions that
have improved this paper.

INTRODUCTION

Internet-based research has come of age. About thirteen years ago, when “The Value of Online

Surveys” was published in Internet Research (Evans and Mathur, 2005), online surveys were not

yet well respected. Although now popular, SurveyMonkey – founded in 1999 – and Qualtrics –

founded in 2002 – had not attained any real sophistication or critical mass as of 2005. Thus, they and

other newer online survey methodologies were not even discussed in the earlier article. Things

have certainly changed since 2005 – with the rapid growth of online survey research and advances

in technology, including the Internet of Things.

The 2005 article focused on the VALUE of doing online surveys as compared with other

modalities. It included the strengths and potential weaknesses of online surveys, aspects of

respondent methodology for online surveys (as well as mail surveys), how to address the potential

weaknesses of online surveys, and a directory of online survey services.

In this article, the focus is on the evolution of online survey research from 2005 to the present,

as well as a look at what is ahead for survey research. Online surveys not only continue to grow in

popularity and to mature as a research technique, but new technologies have also changed survey

methodologies – and possibilities. While earlier research and interest in online surveys were

driven by the potential benefits of doing research online, the growth of online surveys has further

brought to the forefront some limitations and pitfalls of doing surveys online.

In view of its continuing growth, potential for future growth, and issues that could negatively

affect online survey growth, it is important to examine online survey research’s current and future

usefulness to researchers and practitioners. The objectives of this paper are to look at what has

changed since the 2005 article, to examine evolving trends, and to see where they are heading.

Extensive conclusions and guidelines are then presented as to the future role of survey research,

developing and implementing survey methodologies, and a code of ethics for survey research.

A LOOK BACK

In this section, we cover these topics regarding the period from 2005 to 2018: the growth of online

surveys, the strengths and weaknesses of online surveys, research on online surveys, online survey

best practices, and online survey firms.

The Growth of Online Surveys

During 2005, online survey research amounted to just over $1 billion in spending (Aster, 2004). At

that time, predictions were that online survey research would grow significantly. That has certainly

been the case. Since its infancy in the mid-1990s, online research has found ready acceptance

among academic researchers and practitioners.

In 2005, the revenues of the global marketing research industry were estimated at $23 billion,

rising to $32 billion in 2008. Then, after a decline amidst the Great Recession, the industry

rebounded. According to Terhanian et al. (2016), online survey research alone accounted for $6

billion in expenditures in 2015. For 2016, global marketing research industry revenues were estimated at

nearly $45 billion (Statista 2017a). Thus, based on our analysis, it is conservatively estimated that

the share of all forms of marketing research devoted to online surveys is 20 percent. As a result, the

revenues accruing from online surveys could have been as high as $9 billion in 2016 – an eight-fold

increase in comparison to 2005. This does not include the value of academic research based on

online surveys. In looking ahead, the dramatic effects of new technologies on marketing research in

general and online surveys more specifically need to be considered.
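Restated as simple arithmetic (the 20 percent share is our own conservative assumption from above, and the 2005 base is taken as roughly $1.1 billion):

$$0.20 \times \$45\ \text{billion} \approx \$9\ \text{billion}, \qquad \frac{\$9\ \text{billion}}{\$1.1\ \text{billion (2005)}} \approx 8$$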

After an extensive review of information from trade associations, industry experts, and

academic journals, this is what we believe about the status of online survey research today:

• Overall data as to the current use of online surveys are hard to break out because of the way

that survey data are often reported. For example, ESOMAR (2016) uses “online” as a

catchall term for any type of research done online, yet it considers “automatic

digital/electronic” separately. Statista (2017a), based on ESOMAR data, separates “online

quantitative/research”, “automated digital/electronic surveys”, “mobile/smartphone online

quantitative research”, and “online research communities” in noting their usage.

• The best source for determining the present status of research is the GreenBook’s GRIT

report (2017). More than 1,500 marketing research professionals (both suppliers and buyers

of data) from about 75 countries participated in the 2017 study and gave their input on

various research topics. According to that study, for those engaged in quantitative research,

56 percent most often utilize online surveys and 14 percent most often utilize mobile

surveys. For those engaged in qualitative research, a much smaller percentage engage in

online surveys. This report is published free of charge on a regular basis.

• Pew Research Center (2018b) reports that the dramatic increase in survey research via the

Internet over the past 10 years is driven by three powerful industry trends: broadband

connectivity, smart mobile device ownership, and social media and social networking.

• Virtually all major players in the global marketing research industry are engaged in at least

some online survey research; and a number of them have their own online consumer panels.

For example, Kantar (2018) has 5.5 million online research respondents covering 45

countries. It also monitors social media in 67 languages. At Ipsos (2017),

80 percent of its research involves surveys. And now, more than half of its surveys are

online/mobile – up from 19 percent in 2005.

• Over the same time period, considerably more research has been conducted regarding the

theoretical bases needed for good online surveys, as well as the issues related to the

implementation of online surveys. Much of this research addresses data quality and online

survey usefulness.

So, how well do the concepts from the original article hold up? Let us review how online survey

research has evolved from the vantage point of the 2005 article: strengths and weaknesses of online

surveys, online surveys and other research formats, best uses and best practices, and online survey

firms. Then, we will look at future possibilities.

The Strengths and Weaknesses of Online Surveys



In the 2005 paper, a number of major strengths and weaknesses associated with online surveys were

highlighted and discussed. Those strengths and weaknesses are shown in Figure 1. And they hold up

remarkably well. The strengths are still in play; and the potential weaknesses still exist despite more

than a decade to address them. They are often inherent in this format. Next, consider how

these factors have evolved over the past thirteen years.

Figure 1 goes about here.

Major Strengths

Global reach continues to be a big advantage of online surveys. It is estimated that 52 percent of

the world population has access to the Internet via traditional devices or via mobile devices. Yet,

there is a big difference among countries. The range includes countries with close to 100 percent

coverage – such as Iceland – as well as countries in Asia and Africa that have more limited

coverage – such as India at 34 percent and Central African Republic at 5 percent (Miniwatts

Marketing Group, 2018). Institutions and governments are investing heavily in the Internet

infrastructure so as to connect as many people as possible. Thus, greater coverage can be expected

for researchers interested in reaching global populations for their online studies. As of 2021,

eMarketer (2017b) projects that nearly 54 percent of the worldwide population will use the Internet

at least once per month, representing more than 4.1 billion people.

Flexibility has increased tremendously due to advances in technology. In the past, a survey

designer had to learn to program and had to write the code in HTML to create a survey. Today, with

the availability of various online survey programs, creating and launching an online survey has

become very easy. Those programs provide flexibility in terms of question type, question format,

response categories, layout, fonts, and visual aspects. They also offer flexibility in terms of

automatic logic control, branching, randomization of questions, and assignment of respondents to

experimental conditions. In addition, the programs provide flexibility in restricting/eliminating

invalid responses.

Speed and timeliness continue to be an advantage of online surveys. With the increase in those

who have access to high-speed Internet connections, survey speed and timeliness have reached

another level. Moreover, now that a significant number of people access the Internet via their

smartphones, researchers can reach potential participants virtually anywhere and at any time.

Technological innovations continue to influence and improve online research. It is estimated

that about three-quarters of U.S. households/individuals have access to high-speed Internet

connections (Pew Research Center, 2018a). At the same time, innovations related to mobile

communications and multiple sensors/technologies available in mobile devices enable researchers

to collect complex data in more convenient ways. These technologies also enable better targeting of

potential participants (such as geotargeting of potential participants via GPS tracking information).

Convenience continues to be an advantage of online research. In fact, its importance and value

have risen significantly. Because study participants can connect to the Internet using various types

of mobile devices (tablets, smartphones, etc.), it is becoming more convenient for them to

participate in online research. It is estimated that 14 percent of survey respondents already use

mobile devices to participate in online surveys (GreenBook, 2017).

Ease of data entry and analysis continue to be key advantages of online surveys. Most online

survey platforms offer built-in analytic tools that can carry out analysis immediately; some also

offer graphic display capabilities. As such, researchers can do live

analysis of the available data as more participants complete the survey and their data are collected.

Question diversity remains a major advantage. Since the 2005 article, it has become possible to

add more question formats. In fact, some online survey platforms (such as Qualtrics) can

automatically reformat a survey to suit the device the participant uses to connect to the Internet.

Large samples are easy to obtain. Although low response rates continue to be an important issue

for online researchers, it is possible to obtain large samples at a fraction of the cost of traditional

mail or telephone surveys.

Required or forced completion of survey questions is often an advantage. Although some

participants may be reluctant to answer certain questions, surveys can be programmed so that the

site alerts the participant to answer all questions. Thus, skipping or jumping to the next

question without answering the question on the screen can be prevented. However, special

consideration may be needed for sensitive topics.

Go to capabilities. Logical trees – whereby respondents are automatically directed to specific

sections of the surveys/questions based on their responses to previous questions – have become

more advanced and capable. Random assignment of participants to specific experimental

conditions, or randomization within specific questions, is also possible.
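To make capabilities like branching, forced completion, and random assignment concrete, here is a minimal illustrative sketch in Python; the question identifiers and structure are invented, not drawn from any particular survey platform:

```python
import random

# Illustrative survey spec: each question may declare a display condition
# (skip logic), a required flag (forced completion), and allowed answers.
QUESTIONS = [
    {"id": "own_pet", "text": "Do you own a pet?",
     "options": ["Yes", "No"], "required": True},
    {"id": "pet_food", "text": "Which pet food brand do you buy?",
     "options": ["Brand A", "Brand B", "Other"], "required": True,
     "show_if": ("own_pet", "Yes")},  # logical tree: only pet owners see this
]

def run_survey(get_answer):
    """Administer the survey; get_answer(question) supplies a response."""
    # Random assignment to an experimental condition (e.g., two stimuli).
    answers = {"condition": random.choice(["control", "treatment"])}
    for q in QUESTIONS:
        branch = q.get("show_if")
        if branch and answers.get(branch[0]) != branch[1]:
            continue  # automatically skip questions that do not apply
        answer = get_answer(q)
        while q["required"] and answer not in q["options"]:
            answer = get_answer(q)  # forced completion: re-prompt until valid
        answers[q["id"]] = answer
    return answers

# Example: simulate a respondent who owns a pet and buys Brand B.
scripted = {"own_pet": "Yes", "pet_food": "Brand B"}
print(run_survey(lambda q: scripted[q["id"]]))
```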

Major Weaknesses

Perception as junk mail. The perception of E-mail invitations to participate in online surveys

as junk mail remains a large problem. It is estimated that a majority of E-mail is junk mail.

According to Statista (2018a), 59 percent of all E-mails were spam in September 2017. As such,

specialized firms have developed tools that automatically filter junk mail/spam/unsolicited mail.

Unfortunately, online researchers may be victims of this problem. Also, many unscrupulous players

have disguised themselves as researchers and attempted to collect personal information that

they then misuse to sell to people or to cheat them. This has resulted in many people mistrusting

any invitations they receive to participate in online surveys. As such, a vast majority of invitations

to participate end up in the junk mail folder and/or are ignored by people.

Sampling. Concerns remain with regard to the skewed attributes of the Internet population,

sample selection and implementation, some respondents’ lack of online experience/expertise, and

technological variations.

Unclear answering instructions. While unclear answering instructions have been a weakness

in the past, and continue to be a disadvantage for poorly designed questionnaires, a lot of research

has been conducted on online questionnaire design. Thus, guidelines are available so that

researchers can learn from past mistakes and make their instructions clearer.

Impersonal. Although surveys often were designed to be impersonal in the past, technological

advances now enable researchers to send out personalized invitations to participate. Surveys can

also be designed to incorporate some personal attributes of each potential participant. Furthermore,

data can be linked to previously collected information for each participant. While these approaches

can minimize this weakness of impersonal surveys, they may cause a problem with privacy.

Privacy issues. Vulnerabilities associated with privacy issues have increased. Now that

researchers can collect enormous amounts of data and create detailed profiles of individuals, people

are more concerned about their privacy and the security of the information they share. Many people

refuse to participate in surveys because of privacy concerns. For online researchers, it is crucial to

have clear and well-articulated privacy policies. And researchers have to convince potential

participants that the researchers will adhere to those privacy policies. Given the number of mergers

and acquisitions, as well as bankruptcies, what happens to the data collected and owned by the firms

involved? Laws are unclear about the privacy of data once it changes hands via a merger/acquisition

and/or a bankruptcy/sale of a company.

Low response rates continue to be a concern. Based on a meta-analysis, Manfreda et al. (2008)

found that the average response rate was 11 percent for online surveys and that the 95 percent

confidence interval was 6 percent to 15 percent. That low rate has dropped even further since then.

Many researchers have investigated the reasons for the low response rates and suggested ways to

improve them. Some factors that affect response rates include survey length (Galesic and Bosnjak,

2009), content and wording of surveys (Hansen and Smith, 2012; Fazekas et al., 2014),

personalization issues (Sauermann and Roach, 2013), invitation wording and reminders (Petrovcic

et al., 2016; Sauermann and Roach, 2013), incentives (Pedersen and Nielsen, 2014; Sauermann and

Roach, 2013), researcher and sponsor identity (Pan et al., 2013), and the default settings in the

survey (Jin, 2011).

Based on another meta-analysis, Fan and Yan (2010) concluded that online survey response rates

are influenced by such factors as the sponsoring organization, survey topic, survey length, question

wording, question order, question display (such as screen-by-screen or scrolling, backgrounds, logo

display, graphics and progress display, navigational instructions, and question format), sampling

methods, contact delivery modes, pre-notification, design of invitation, and incentives.

Research on Online Surveys

In this section, a number of topics regarding how online surveys have evolved since the 2005 article

are addressed. Our greatest emphasis is on theory-based research and research on data quality.

Theory-Based. There have been a number of articles on the theoretical underpinnings of good

survey methods. In particular, what can be done to persuade respondents that the benefits of survey

participation far outweigh the costs? And how can survey data quality be optimized? These are

among the topics addressed by researchers [In addition, several of the articles cited in the

following sections are theoretical in nature.]:

• Response behavior models (Greszki et al., 2015; Neslin et al., 2009; Rogelberg et al., 2006)

• Total survey error paradigm (Biemer, 2010; Ha et al., 2015)

• Theoretical assumptions about real and falsified data (Landrock and Menold, 2016)

• Social psychology and online social networks/communities (Kugler, 2014; Li, 2011; Riedl et

al., 2013)

• Validity of different scales in online surveys (Revilla and Ochoa, 2015)



• Insights about online data collection from academics teaching research methods courses

(Kilinç and Firat, 2017)

• Factors affecting self-disclosure on social media sites (Chen et al., 2015; Cheung et al.,

2015)

• Respondent conditioning with online panel surveys (Struminskaya, 2016)

Data Quality. Related to the expanding theory-based literature on online survey methods is

the recent literature on data quality. These are some examples:

• Validating customer profile data (Park et al., 2012)

• Satisficing response behavior and data quality (Barge and Gehlbach, 2012)

• Online consumer review credibility (Cheung et al., 2012)

• Online conflict of interest disclosures (Jensen and Yetgin, 2017)

• Methodologies for open-ended question analysis (Fielding et al., 2013; Holland and

Christian, 2009; Scholz and Zuell, 2012)

• Data quality and sampling techniques (Dutwin and Buskirk, 2017)

• Carelessness in answering online surveys (Ward et al., 2017)

• Completion rates for online surveys (Antoun et al., 2017; Liu and Wronski, 2017)

• Effects of topic sensitivity on data quality (Roster et al., 2017)

• Data quality and the Internet of Things (Karkouch et al., 2016)

Survey formats. These are among the survey format topics that have been studied: respondent

engagement (Downes-Le Guin et al., 2012); visualization effects (McDevitt, 2005); survey fatigue

(Papachristos, 2014); trends in quantitative research methods (Henning, 2014; Pecáková, 2016); and

survey delivery mode – such as face-to-face versus online surveys (Heerwegh, 2009; Revilla and

Saris, 2013), telephone versus online surveys (Braunsberger et al., 2007; Herian and Tomkins,

2012; Xing and Handy, 2014; Yeager et al., 2011), mail versus online surveys (Deutskens et al.,

2006), and paper versus online surveys (Schneider, 2015).

Panels. To improve the quality and representativeness of online survey responses, a number of

researchers – both academic and non-academic – have turned more to consumer panels. One firm,

SSI (2018), has a global panel with 11.5 million members across more than 100 countries. As such,

the literature in this area has grown a lot since 2005 and covers such aspects of panels as these:

types of panels (Krajicek, 2015), recruiting online panels (Martinsson et al., 2017), why people join

panels (Brüggen and Dholakia, 2010), topic interest and salience (Keusch, 2011), opt-in panels

(Brown et al., 2012), early versus late panel respondents (Sharp et al., 2011), response styles

(Menictas et al., 2011), and response metrics (Callegaro and Disogra, 2008).

Mobile surveys. Because of the worldwide increase in mobile device usage (see Deloitte,

2017), more survey research is now conducted through these devices. For example, according to

Forrester Research (Statista, 2017b), well over 90 percent of U.S. smartphone users access the

Internet from their phones at least once per week. As a result, there has been substantial research on

mobile survey topics, such as: panel surveys via mobile devices (Revilla, 2016; Weber et al., 2008),

the effect of survey formats and design (Antoun et al., 2017; De Bruijne and Wijnant, 2014), the

choice of survey modes (Wells, 2015), assessing mobile surveys (Lynn and Kaminska, 2012;

Okazaki, 2007), and extracting social behavior (Palaghias et al., 2016).

Hybrid surveys. With the advances in technology and survey research tools, some researchers

have turned to hybrid/multimedia surveys to enhance the quality of survey responses. Consider this

observation from iModerate (2018): “The companies that win are the ones that have a 360-degree

view of their consumer and their competitors. There is no better way to get this view than to blend

methods for a complete picture of your customers”. Research on hybrid surveys has involved such

analyses as these: invitation modes and non-response rates (Maxi et al., 2010; Porter and

Whitcomb, 2007), response modes and non-response rates (Millar and Dillman, 2011), combining

quantity and quality measures (Mauceri, 2016), the use of open-ended questions (Altintzoglou et al.,

2017), and mode effects on answers (Vannieuwenhuyze et al., 2010).

Social media. As Cheung et al. (2014) note: “The rise of social media has facilitated consumer

social interactions. Consumers use online social platforms, including social networking sites, blogs,

social shopping communities, and consumer review sites, to communicate opinions about products

and exchange purchase experiences”. Thus, consumer attitudes and other valuable information can

be gleaned from social media. [Later in the paper, we discuss how survey research can

utilize social media.] These are among the aspects of social media research that have been

studied: undertaking surveys at social media sites (Gregori and Baltar, 2011), critical mass and

collective intention (Shen et al., 2013), the level of engagement (Abitbol and Lee, 2017; Wiese et

al., 2017), friend recommendations (Shen et al., 2013), and microblogging (Li and Liu, 2017).

Big data. Today, people are inundated with information about big data and its transformational

impact. For instance, Google can read, analyze, and sort every digital volume in the U.S. Library of

Congress in one-tenth of a second; and data analysis that cost $1 million in 2000 cost $10 in 2016

(Springer, 2017). But in the midst of the excitement about the possibilities of big data, realism is

also required. As Clarke (2016, p. 77) observes:

The “big data” literature, academic as well as professional, has a very strong focus on
opportunities. Far less attention has been paid to the threats that arise from repurposing data,
consolidating data from multiple sources, applying analytical tools to the resulting collections,
drawing inferences, and acting on them. [We] draw attention to the responsibility of researchers
and professionals to temper their excitement and apply reality checks.

These are among the relevant research projects on big data: the evolution of big data techniques

(Lee, 2017; Yaqoob et al., 2016), transformational issues (Baesens et al., 2016), social network

analysis (Dabas, 2017), and big data SWOT analysis (Ahmadi et al., 2016).

Data analytics. The availability of big data does not add much to the knowledge base unless

appropriate data analytics are performed well. This is not always the case. According to an

Accenture global survey (2017), weak personalization efforts cost companies $2.5 trillion

worldwide in lost revenues annually. And that may be tied to poor use of big data. Important

research in this area relates to: big data analytics challenges (Lee, 2017; Pouyanfar et al., 2018), big

data reduction methods (Rehman et al., 2016), customer experiences on social media (Ting et al.,

2017), and debunking myths (Gandomi and Haider, 2015; Kimble and Milolidakis, 2015).

Survey incentives. As survey response rates have fallen, the issue of incentives has become

more hotly debated. The discussion typically focuses on the benefits of higher response rates versus

the possible answer bias caused by incentives. SurveyMonkey (2018) says this about incentives:

Incentivizing surveys may seem like a no-brainer. But consider this: Your reward may be
attracting the wrong kind of respondent. By offering everybody a reward to take your survey, it
can encourage satisficers – people who misrepresent themselves or rush through surveys to the
detriment of your survey results – just to collect a reward. However, incentivizing isn’t all bad.
Offering survey rewards can help you encourage hard-to-reach audiences to take your survey.
You can even offer indirect rewards to your respondents to benefit a third party, like a charity.
Decide whether or not to incentivize your survey by carefully considering the circumstances.

The literature on incentives has covered such areas as these: types of incentives (Göritz and Luthe,

2013; LaRose and Tsai, 2014; Pedersen and Nielsen, 2016), incentive size (Hsu et al., 2017; Mercer et

al., 2015), survey appeal (Walker and Cook, 2013), and panelist response motives (Smith et al., 2017).

Ethical issues. The use of online surveys poses special ethical challenges, even for well-

intended researchers. In 2009, Buchanan and Hvizdak did a study involving 750 U.S. human

research ethics boards: “Respondents indicated that the electronic and online nature of these survey

data challenges traditional research ethics principles such as consent, risk, privacy, anonymity,

confidentiality, and autonomy, and adds new methodological complexities surrounding data storage,

security, sampling, and survey design” (p. 37). It is interesting that some respondents believe their

privacy is being breached even if it is not. According to Sitecore, companies are less likely to

collect customer data than their customers believe (eMarketer, 2017a). Other studies have dealt with

ethics in the context of such topics as acceptable standards (Williams, 2012), social media research

(Moreno et al., 2013), online communities (Pan et al., 2017), notice and consent (Liu, 2014), and

the effects of fake online information (Burkhardt, 2017; Sirajudee et al., 2017).

False information from respondents. A mounting problem involves respondents who

intentionally provide false information about their attitudes, behavior, and/or demographics. There

has been a lot of research on this, such as: predicting and detecting falsifications (Akbulut, 2015;

Simmons et al., 2016), answering without really reading the survey (Anduiza and Galais, 2017;

Jones et al., 2015), Web surfing to find information about a study topic (Motta et al., 2017),

differences in falsification by sample motives (Clifford and Jerit, 2016), justifications for false

answers (Punj, 2017), and online reviews (Banerjee and Chua, 2017).

Computer literacy. This affects both the number of people with access to the Internet (which

influences study representativeness) and some people’s ability to properly complete an online

survey. According to Techopedia (2018a), a computer-literate person is someone who has the

“knowledge and skills to use a computer and other related technology”. OECD collected data from

2011 to 2015 in 33 countries and published its results in 2016. It defined several levels of

computer proficiency: “below level 1”, 14 percent of the adult population; “level 1”, 29 percent;

“level 2”, 26 percent; “level 3”, only 5 percent; and no computer ability, 26 percent (Nielsen, 2016).

Unfortunately, this issue has not had much attention in the literature. In one relevant study,

Greenland and Kwansah-Aidoo (2012) examined the research challenges in Sub-Saharan Africa.

Business-to-business surveys. Companies do a lot of survey research with their supply chain

partners as well as with their own employees. In some cases, online surveys are done via secure

Internet connections; other times, they are conducted through firms’ internal Intranets. To date,

there has been insufficient research on B2B online surveys. Here are some of the research projects

that have been undertaken: B2B online communities and social media utilization (Gharib et al.,

2017; Murphy and Sashi, 2018; Salo, 2017), online job applications (Morelli et al., 2014), user-

generated cross-cultural media behavior (Thakur and AlSaleh, 2017), and B2B customer insights

(Durr and Smith, 2017).

Online Survey Best Practices

A wide variety of researchers have written about the best practices to use in building, implementing,

analyzing, and presenting the results of online surveys. Later in the paper, we offer our conclusions

and recommendations. Because of the importance of this topic, Table 1 is for those who want to

access online resources about best survey practices. These are resources from practitioner (non-

academic) sources.

Table 1 goes about here.

Online Survey Firms

Online survey firms come in many types and sizes. There are traditional firms that began before

online surveys were introduced. Many of them have added online surveys to their portfolio. An

analogy can be made with the bricks-and-mortar retailers that expanded into bricks-and-clicks

formats. Specialized online survey firms also now exist; they focus on digital research. These

include Qualtrics and SurveyMonkey. In addition, there are more non-research firms that conduct

surveys for their own use. Sometimes, they sell their findings to other companies. An example is

Experian, the credit-monitoring firm.

Large traditional research firms that have branched into the digital arena dominate the industry

worldwide. According to 2016 ESOMAR data (Statista, 2018b), the largest six market research

firms – Nielsen, Kantar, Quintiles IMS, Gartner, Ipsos, and GfK – accounted for 42 percent of all

revenues attained by firms in the industry. Their dominance is due to the full-service approach they

take. However, for smaller clients, other research companies are utilized more often.

A LOOK AHEAD

In looking to the future of research, especially online survey research, let us examine a number of

relevant topics. The focus is on emerging online survey technologies and methodologies, and

emerging non-survey techniques and methodologies. After that, we will present our own extensive

conclusions and recommendations.

Emerging Online Survey Technologies and Methodologies

Even with the rapid growth of newer online survey techniques and methodologies, there will be

further advances as tools evolve.

Qualitative studies. By using video chat features available on platforms such as Skype and

Zoom, it is now possible to conduct online personal interviews without having to meet in person.

These interviews can be recorded for later analysis in the traditional way as well as by using

enhanced analytical tools (such as facial analysis and voice analysis). This method will save time

and costs. Also, newer technology (such as Zoom) can enable participants to engage in online focus

groups. Thus, focus group discussions can be held live without people having to meet – saving time

and money. However, this approach may not allow the moderator to observe all nonverbal

communications within the group. Focus group sessions can be recorded to analyze later. There

have been studies related to online personal interviews using Skype (Deakin and Wakefield, 2014;

Seitz, 2015; Hanna, 2012) and the use of online focus groups (Stewart and Shamdasani, 2017).

Crowdsourcing as a source of participants. Since the publication of the 2005 paper, a new

method of data collection has gained acceptance – the use of crowdsourcing platforms to recruit

participants and collect data for academic and practitioner research; and its popularity will grow

further in the future as the tool is better understood. Investopedia (2018) states: “Crowdsourcing

involves obtaining information or opinions from a large group of people who submit their data via

the Internet, social media, and smartphone apps. People involved in crowdsourcing sometimes work

as paid freelancers, while others perform small tasks on a voluntary basis”.

In the past, most online surveys were done by seeking participants from traditional consumer

panels, personal contacts, or E-mail lists. Then, Amazon’s Mechanical Turk (MTurk –

www.mturk.com) was established in 2005, with a platform to crowdsource participants in a more

efficient and cost-effective way (Buhrmester et al., 2011). It is by far the most popular

crowdsourcing platform used by academics and practitioners. A search in the Google Scholar

database with the terms “Mechanical Turk” or “MTurk” produced nearly 15,000 article hits (as of

mid-April 2018) – up about one-third from the total at the start of 2017. And a study (Goodman and

Paolacci, 2017) found that between June 2012 and April 2016 nearly 27 percent of papers published

in the Journal of Consumer Research relied on data collected online via MTurk.

Initially, Amazon set up the MTurk marketplace so employers could post tasks/jobs; and workers

could opt to complete the tasks for fixed monetary compensation pre-determined by the employer,

with Amazon keeping a commission. Then, academic and practitioner researchers found this

marketplace to be attractive, easily accessible, and reasonably priced for them to recruit participants

for their online studies. Although MTurk is the most popular platform, other crowdsourcing sites

have emerged – even in such languages as Japanese (see https://crowdworks.jp).
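As a hypothetical sketch of this workflow – the survey URL, reward, and quotas below are invented for illustration – a researcher might post a survey task programmatically through the MTurk API via Amazon's boto3 library:

```python
import boto3

# Connect to the MTurk sandbox (a free test environment); drop endpoint_url
# to post to the live marketplace. AWS credentials come from the local config.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion points workers to a survey hosted elsewhere
# (the URL here is hypothetical).
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/my-online-survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>"""

hit = mturk.create_hit(
    Title="10-minute consumer attitudes survey",
    Description="Answer a short survey about online shopping habits.",
    Keywords="survey, research, opinions",
    Reward="0.75",                      # fixed payment set by the requester
    MaxAssignments=200,                 # target number of respondents
    LifetimeInSeconds=7 * 24 * 3600,    # how long the task stays posted
    AssignmentDurationInSeconds=30 * 60,
    Question=question_xml,
)
print("HIT posted:", hit["HIT"]["HITId"])
```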

The crowdsourcing of data collection using platforms like MTurk is driven by these advantages:

global reach, potential sample size, ease of use, speed, low costs, participant diversity, flexibility in

question formats, respondent interest, and survey length. Some disadvantages are: sample self-

selection, too much participant knowledge on certain topics, misrepresentations, and the effect of

platforms on study design. In addition, there are some unscrupulous users who only seek to be

compensated. Despite these limitations, experts have devised guidelines and processes to help

researchers minimize their effects. Goodman and Paolacci (2017) and Wessling et al. (2017) offer

specific guidelines to deal with these limitations.

Omnichannel. There will be more coordination of research efforts across the distribution channel

(supply chain). According to BRP (2017): “Successfully delivering a seamless customer experience

requires an approach that is enabled by unified commerce. This puts the customer experience first

by leveraging real-time data, which is delivered by utilizing one common, centralized, real-time

platform for all customer engagement points”. Thus, coordinated efforts to develop and analyze big

data – whether obtained through surveys or tracking – will expand (see Bradlow et al., 2017;

Raphaeli et al., 2017) to include attitudes, intentions, the purchase process, purchase history, and

demographics. One stumbling block will be the reluctance of some firms in the distribution channel

to share information with one another; because of the Internet, manufacturers and other channel

members are now more apt to compete with one another.

Internet of Things. Through the Internet of Things, more and more people – and firms – will be

“always connected”. As a result, security/privacy will remain a concern (see Airehrour et al., 2016;

Zarpelão et al., 2017). For survey researchers, the challenge will be how to utilize surveys to gather

opinions and intentions regarding the use of connected devices. Good insights into the role of the

Internet of Things in future research efforts are provided by Ng and Wakenshaw (2017) and Qin et al. (2016).

Social media content analysis. With the continuing boom in social media platforms, more

researchers will engage in content analysis of people’s attitudes, intentions, and feedback by visiting

those platforms. This will be a major source of information for big data and data analytics. There is

considerable research on this subject to help us better understand the possibilities of social media

content analysis. For example, look at these studies: electronic word of mouth (Liu et al., 2016;

Raassens and Haans, 2017), Facebook likes and fan pages (Chang and Fan, 2017; Hu et al., 2017),

social tags (Nam et al., 2017), information from tweets (Søilen et al., 2017); and sentiment analysis

(Bohlouli et al., 2015).
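As a small illustration of sentiment analysis on such content, the sketch below scores a few invented posts with the VADER analyzer that ships with the NLTK Python library:

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

# Hypothetical social media posts about a brand.
posts = [
    "Love the new checkout experience - so fast!",
    "Worst customer service I have ever dealt with.",
    "The package arrived on time.",
]

sia = SentimentIntensityAnalyzer()
for post in posts:
    # compound ranges from -1 (most negative) to +1 (most positive)
    score = sia.polarity_scores(post)["compound"]
    label = ("positive" if score > 0.05
             else "negative" if score < -0.05 else "neutral")
    print(f"{label:>8}  {score:+.2f}  {post}")
```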

Customer reviews. With the eroding trust in some commercial sources, the number of people

turning to customer reviews for company and product information will increase, due to their greater

perceived honesty. Thus, researchers will increase their content analysis of reviews, just as with

social media. The literature on this topic is quite extensive, on such aspects as these: why people

post customer reviews (Cheung and Lee, 2012), the helpfulness and use of reviews (Singh et al.,

2017; Zhang et al., 2017), sentiment analysis (Salehan and Kim, 2016; Wang et al., 2017), and the

dark side of reviews (Liu and Karahanna, 2017; Zhang et al., 2014).

Netnography. According to a 2002 article by Kozinets (p. 61):

Netnography is ethnography adapted to the study of online communities. As a method,
netnography is faster, simpler, and less expensive than traditional ethnography and more
naturalistic and unobtrusive than focus groups or interviews. It provides information on the
symbolism, meanings, and consumption patterns of online consumer groups. Guidelines
[should] acknowledge the online environment, respect the inherent flexibility and openness of
ethnography, and provide rigor and ethics in the conduct of marketing research.

Only recently, due to the evolution of the Internet and computer software, has netnography become

a truly viable content analysis tool for the researcher. And it will be used more widely in the future.

To learn about the current status of netnography, see Costello et al. (2017).

Emerging Non-Survey Technologies and Methodologies

Although the main focus of this paper is online surveys, it must also be noted that new technologies

are being developed and deployed to carry out further types of data collection for research purposes.

The discussion here covers how some of these technologies could be deployed. Note: The Future

Today Institute (2017) has published a comprehensive overview of 158 different technology trends

that are in process and that will influence future research efforts. It is available as a free download.

Observational research. Although survey research depends on people to directly or indirectly

provide information through communications, observational research relies on noting behavior in

natural and/or artificial settings. Observational research generally generates richer data because it

does not rely on the subject to provide information. In many instances, surveys may be used as a

supplement to observation (through hybrid research). These are examples of how observational

research is being done today – with a strong possibility of more extensive use in the future:

• To track online customer search and shopping (see Park, 2017)

• To monitor in-store behavior (see Sorensen et al., 2017)

• To track behavior through wearable devices (see Webster et al., 2017)

• To gather data from the use of mobile apps (see Han et al., 2016; Kim et al., 2017)

• To study locational data – of both people and devices – with GPS software (see Korpilo et al.,

2017; Shoval and Ahas, 2017)

• To see how touchscreens influence behavior (see Zhu and Meyer, 2017)

• To collect data using Google Trends – which tracks Google Search terms (see Castelnuovo

and Tran, 2017; Siliverstovs and Wochner, 2018; Stocking and Matsa, 2017); a brief sketch follows this list

• To use advanced eye-tracking, facial recognition, neuroscience, and other high-tech tools to

gauge physiological reactions to stimuli (see Hsu, 2017; Losbichler and Michaels-Kim, 2017).
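As a brief illustration of the Google Trends item above, the sketch below pulls relative search-interest data with pytrends, an unofficial third-party Python wrapper for Google Trends (the search terms are arbitrary examples):

```python
from pytrends.request import TrendReq  # pip install pytrends

# Unofficial wrapper around Google Trends' endpoints.
pytrends = TrendReq(hl="en-US", tz=360)

# Compare relative U.S. search interest in two terms over five years.
pytrends.build_payload(["online survey", "focus group"],
                       timeframe="today 5-y", geo="US")
interest = pytrends.interest_over_time()  # returns a pandas DataFrame
print(interest.tail())
```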

Other new technologies. There are other emerging technologies that will provide researchers

with still more tools. They will offer opportunities for hybrid research, real-time results, large

databases, and in-depth insights. Below are a few examples:

• The number of sophisticated research labs will grow and operate in both the academic and

non-academic arenas. These labs will combine state-of-the-art observational research with

survey components.

• Now that voice-activated electronic digital assistants (such as Alexa) and voice-activated

software (such as Siri) have been around for a few years, their research capabilities are

coming into wider acceptance. They can track all sorts of activities and do sentiment and

other types of content analysis. The databases are huge.

• Online conversation agents are computerized assistants that engage with customers through

real-time, popup query boxes. For the most part, these agents have been used for customer

service purposes to engage people in chats, answer questions, guide participants through

problems, and hold down costs. The consumer information stored with these agents has been

under-utilized from a research perspective. Through content analysis, researchers will be

able to determine the most prevalent consumer complaints, which products receive the most

comments, and so forth. Data analytics will benefit from such big-data analysis; a minimal sketch of such complaint mining follows.
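To suggest what such content analysis might look like, here is a minimal keyword-counting sketch over invented chat transcripts; a production system would use far richer text classification:

```python
from collections import Counter

# Hypothetical chat-agent transcripts.
chats = [
    "My order arrived late and the box was damaged.",
    "The app keeps crashing when I try to pay.",
    "Delivery was late again. Very frustrating.",
]

# Naive keyword-to-topic map; real projects would use trained classifiers.
TOPICS = {"late": "delivery", "damaged": "packaging",
          "crash": "app stability", "pay": "payments"}

counts = Counter()
for chat in chats:
    text = chat.lower()
    for keyword, topic in TOPICS.items():
        if keyword in text:
            counts[topic] += 1

# Most prevalent complaint topics first.
for topic, n in counts.most_common():
    print(f"{topic}: {n}")
```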

CONCLUSIONS AND RECOMMENDATIONS ABOUT RESEARCH IN THE FUTURE

After looking at the present and trending status of online surveys, we are now ready to discuss our

own conclusions and recommendations about research in the future – with an emphasis on

prescriptive guidelines for online surveys. We cover these aspects of research: better defining

concepts, better understanding the future role of surveys, better developing and implementing

surveys, and adhering to a strong code of ethics. These topics are highlighted in Figure 2.

Figure 2 goes about here.

Better Defining Concepts

In particular, two areas need to be better clarified. First, in moving forward, the term “online research”

should give way to “digital research”. Online research focuses on data collected via the Internet.

However, this concept is becoming too narrow. Digital research encompasses online research – as well

as any research conducted through connected devices, GPS software, company Intranets, text

messaging, Skype and other video services, bulletin boards, high-tech research labs, and more.

Second, the concept of a “survey” is evolving. In the past, this involved asking questions of

respondents through various formats. In a traditional survey, the researcher controls the questions

and all other aspects of the methodology in data collection – sampling, question wording, answer

typologies, etc. But now, researchers must determine whether a review of user-controlled comments

can be considered a survey. For example, if we systematically examine social media comments on

a topic or firm, are we doing a survey? And if we systematically look at reviews at Web sites such

as Amazon, are we doing a survey? Our response is yes; these are types of surveys. Here’s why.

As per Business Dictionary (2018), a survey “gathers data on attitudes, impressions, opinions,

satisfaction level, etc., by polling a section of the population”. Cambridge Dictionary (2018) notes

that a survey is used “to ask people questions in order to find out about their opinions and behavior”.

Finally, as to online surveys, Techopedia (2018b) states that:

Companies often use online surveys to gain a deeper understanding of their customers’ tastes
and opinions. Online surveys can be used in two basic ways: To provide more data on
customers, including everything from basic demographic information (age, education level, and
so on) to social data (causes, clubs, or activities the customer supports). To create a survey about
a specific product, service, or brand in order to find out how consumers are reacting to it.

For us, the biggest difference between gathering data from traditional “surveys” and newer

formats, such as content analysis of social media comments and customer reviews at Web sites, is

that the former involve direct communication based on researcher-generated surveys; and the latter

entail indirect researcher-respondent communication whereby the researcher studies and compiles

user-generated content. And then the researcher inputs, categorizes, and analyzes the user-generated

data. Attitudes, intent, behavior, etc. can all be examined via user-generated content. However, with

user-generated information, there are issues regarding self-selected samples, the tendency for bipolar

commentaries (yea-sayer and nay-sayer effects), the lack of demographic data, and other

methodological issues. On the other hand, information may be more “real” rather than hypothetical,

comments do highlight attitudes, and comments may better pinpoint opportunities and problems than

researcher-generated data.

In sum, aren’t both traditional formats and user-generated content consistent with the

premise of surveys – to gather data on “attitudes, impressions, opinions, satisfaction level, etc., by

polling a section of the population?” If we answer yes, we must reconfigure the way we define

“survey” to include the analysis of user-generated content. The latter can also contribute to the

knowledge base in ways that traditional surveys cannot.

Here are some recent “real-life” applications of social media content analysis, as described in the

literature. They show the vast future possibilities of content analysis: social media and smoking

prevention in China (Jiang and Beaudoin, 2016), body imagery in fashion blogs (Kraus and Martins,

2017), health-related content at Reddit (Derksen et al., 2017), complaints on Twitter (Gunarathne et

al., 2017), YouTube public service announcements for healthy eating (Zhang, 2017), content at the

PlayStation blog (Tichon and Mavin, 2017), and issues and challenges with online news portals

(Ahmad and Buyong, 2017).

Better Understanding the Future Role of Surveys

Despite the current popularity of surveys, there are questions about the importance of surveys in

looking to the future, including online surveys. How much formal survey-based research will we

undertake in the evolving high-tech environment in effect now, which is expected to be even more

dominant in the future? In the previous section, the definition of “survey” was broadened to

include other methods of obtaining consumers' input or feedback on given topics.

The biggest threat to the survey method is the growing access to real, inexpensive, ongoing data

through Web site analytics, tracking online behavior, monitoring the online stages in the purchase

process, studying people’s actions via the Internet of Things, and more. In looking ahead, what

should be the major role of surveys, no matter their mode of delivery?



To us, the best synopsis on the future for surveys is by Miller (2017). As he notes in his abstract:

For decades, sample surveys have provided information of value to sponsors in government,
academia, business, and the public. That value proposition is threatened now by declining
survey participation and the advent of competition from alternative data sources. In this
environment, some developments in survey practice include new thinking about how to recruit
respondents, new methods for applying communication technology, and new approaches to
blending survey and non-survey data. Going forward, survey data may increasingly be one
component of information products, formed from various sources, including administrative
records and unstructured (“big”) data. [p. 205]

And he addresses the premise of blending data sources:

The future of surveys likely will involve more commingling of the data they produce with
information from other sources. This trend will be propelled in part by the study of
administrative records for policy analysis. The U.S. decennial Census has relied for decades on
a combination of questionnaire self-reports and in-person non-response follow-up interviews to
enumerate the population. These methods have become increasingly costly, leading to the
examination of alternatives. Current plans for the 2020 Census involve using administrative data
(e.g., tax filing information, health records) to reduce the nonresponse follow-up workload and
enumerate some households without conducting interviews. [p. 209]

Next, let’s look at two additional valuable observations about the future of the survey method.

These views are illustrative of what is being discussed in the current literature:

Sometimes, we as survey methodologists fall into the trap of thinking that surveys are the only
possible tool. We also get caught up in building the perfect tool, and forget that the tools are not
a goal in themselves, but are used for a purpose. Our job is to make better tools, to give the users
a range of tools to use in their work, and to guide them in which tool is best for which job. The
ultimate goal is to use the tools to make sense of the world around us and, in doing so, help to
make a better world. [Couper, 2013, p. 154]

A rising chorus is asserting that polls are passé. They claim public opinion, consumer behaviors,
and other sociopolitical outcomes can be better measured (less expensively, more quickly, more
easily) by the analysis of Internet usage in general and of social media in particular, by the data
mining of administrative databases (including the merging of disparate information sources
through such techniques as data fusion), or by a combination of these alternatives to traditional
surveys. The low cost of mined data is often touted in contrast to the high and rising cost of
traditional survey research. Another attractive aspect of social media and other Internet analysis
is its speed. Others emphasize the ease of amassing Internet data: that it can be compiled
routinely, even automatically, and it can be more easily accessed. [Smith, 2013, p. 219]

Finally, the future of survey research also depends on the researcher’s ability to generate

informative, actionable big data. Many more firms will insist on this. Already, there is a developing

literature base on this topic. Illustrations include: big data, brand marketing, and social innovation

(Calder et al., 2016), the risks of repurposing big data (Clarke, 2016), uses of big data in survey

research (AAPOR Task Force, 2015), better customer data in real time (Macdonald et al., 2013), and

bringing big data to life (Etheridge, 2016).

Table 2 shows our conclusions and recommendations as to the future role of survey research,

based on a detailed review of the existing literature. Here are several of the sources that we consulted

that offer insights on this topic: Bansal et al. (2017), Cooke and Buckley (2008), Couper (2017),

Dillman (2015), GreenBook (2017), Kennedy et al. (2016), Link et al. (2014), National Science

Foundation (2012), Pew Research Center (2018b), Statista (2017), Stipp (2016), and Žák (2015).

Table 2 goes about here.

Better Developing and Implementing Surveys

An ongoing problem with survey research – especially online survey research – is the lack of consistency in survey methodology. There are no uniformly accepted guidelines. And this greatly contributes to the varying quality of surveys. “Quick and dirty” is not an acceptable way to conduct survey research if the goal is meaningful results.

As the American Association for Public Opinion Research (AAPOR) (2012) states:

Today’s practitioners and consumers of survey data are exposed to a wide array of
methodologies, each with its own challenges. Many users of survey data are skeptical of
contemporary survey methods, whether new, innovative approaches or more traditional ones.
Innovators are taking advantage of the growth in new technologies, exploring new data sources,
and experimenting with new methods. And while a well-designed and carefully executed survey
still can deliver valid and reliable results, methods and details matter more than ever.

Furthermore, the full details of the methodology employed always need to be stated – so researchers can judge the validity of surveys conducted by others. The well-respected Pew Research Center (2017) is a good example of a source that provides methodology details:

The American Trends Panel, created by Pew Research Center, is a nationally representative
panel of randomly selected U.S. adults who are recruited from landline and cellphone random-
digit dial surveys. The panelists participate via monthly self-administered Web surveys. Those
panelists who do not have Internet access are provided with a tablet and wireless Internet
connection. The panel is being managed by Abt Associates. Members of the panel were
recruited from several large, national landline and cellphone random-digit dial surveys in
English and Spanish. At the end of each survey, respondents were invited to join the panel.

Table 3 shows our conclusions and recommendations as to the methodology of future survey research, based on a detailed review of the literature. Our intent is to present an overall set of survey methodology principles – in sequential order – that apply to the majority of online surveys. Here are several of the sources we consulted on this topic: American Association for Public Opinion Research (2018), Baatard (2012), Cho et al. (2011), Constant Contact (2018), Courtright (2015), Fazekas et al. (2014), Galesic and Bosnjak (2009), Interactive Marketing Research Organization (2015), National Science Foundation (2012, pp. 6-11), PeoplePulse (2011), Singh et al. (2009), Smyth et al. (2009), Sue and Ritter (2012), Vannette (2015), Wong (2016), and Wouters et al. (2014).

Table 3 goes about here.

Abiding by a Strong Code of Ethics

Earlier, it was noted that many respondents and potential respondents do not trust survey researchers. They think survey practices are unethical (and dishonest). To overcome this view, a strong code of ethics must be adhered to by both academic and non-academic survey practitioners.

Why is a survey research ethics code so important? According to Resnik (2015):

Ethics are norms for conduct that distinguish between acceptable and unacceptable behavior.
They promote the aims of research, such as knowledge, truth, and error avoidance. They promote
values essential for collaborative work, such as trust, accountability, mutual respect, and fairness.
They mean researchers can be accountable for conflicts of interest, misconduct, and research
with humans or animals. They ensure that the public can trust research quality and integrity.

To illustrate, the Pew Research Center (2018c) can again be cited for its ethical performance:

Independence, objectivity, accuracy, rigor, humility, transparency, and innovation are
indispensable to the mission and success of Pew Research Center. To promote and preserve
these values, the Center’s code of ethics includes the following topics: conflicts of interest,
prohibitions on electioneering, and integrity of research.

It is also imperative that researchers comply with any new regulations related to privacy on the Internet. And that may not be easy to do in the always evolving and complex digital world. For example, the European Union’s General Data Protection Regulation (GDPR) went into effect in May 2018. As reported by eMarketer (2018):

All multinational firms will have to comply with the General Data Protection Regulation, which
governs consumer data collection, storage, and usage practices. But many companies remain
unsure about what they need to do. The legislation, which is designed to give consumers in the
EU more control over their personal data, lays out requirements and will impose potentially
devastating fines on companies with poor data-handling practices or that experience data
breaches in which they are found at fault. Regulations may be limited to the personal data of
consumers residing in the EU, but they apply to any company handling, transmitting, or storing
that data, whether it has a physical location in the EU or not.

Table 4 shows our conclusions and recommendations about having a strong code of survey research ethics, based on a detailed literature review. Our intent is to present a comprehensive set of ethical principles to adopt in conducting survey research and analyzing data. Here are several of the sources that were consulted for information on this topic: AAPOR (2015), Barchard and Williams (2008), Buchanan and Zimmer (2018), CASRO (2013), ESOMAR (2011), European Pharmaceutical Market Research Association (EphMRA) (2018), Markham and Buchanan (2012), MRIA (2014), and Resnik (2015).



Table 4 goes about here.

Figure 1

The Strengths and Potential Weaknesses of Online Surveys



Source: Evans, J.R. and Mathur, A. (2005), “The value of online surveys”, Internet Research, Vol. 15, No. 2,
p. 197.

Figure 2

Considerations in Planning and Executing Future Survey Research Projects

Better defining concepts:
• “Online” versus “digital” research
• Broadened notion of a “survey”
• Traditional versus new kinds of surveys
• Social media content analysis

Better understanding the future role of surveys:
• Best uses of surveys
• Best uses of other forms of data collection
• Blended (hybrid) research formats
• In-depth coverage of the future role of surveys (see Table 2)

Better developing and implementing surveys:
• Consistency in survey methodology
• Full methodology communicated
• In-depth methodology guidelines (see Table 3)

Abiding by a strong code of ethics:
• Importance of an ethics code
• Academics and practitioners
• Numerous ethical issues
• In-depth ethical guidelines (see Table 4)

Table 1

Selected Resources as to Online Survey Best Practices

Clear Seas Research (2015), “Best practices for conducting surveys on mobile devices”, November 1,
available at: https://goo.gl/nvjXFY.
Courvoisie, K. (2016), “Online survey best practices for surveys that get results”, January 5, available at:
https://goo.gl/XTBhtG.
Formstack (2018), “How to conduct an online survey”, available at: https://goo.gl/Qh9wo2 (accessed February 20).
Raymond, S. (2017), “Survey building: online survey best practices for b2b businesses”, March 7, available
at: https://goo.gl/GYK4ko.
Rimsevica, S. (2016), “Online survey design best practices – Webinar recap”, September 28, available at:
https://goo.gl/4jCQrm. [with embedded slideshow].
Survata (2018), “Survey best practices”, available at: https://goo.gl/36ue4w (accessed February 20).
SurveyGizmo (2018), “Best practices: understanding and reducing bias in your surveys”, available at:
https://help.surveygizmo.com/help/survey-bias (accessed February 20).
SurveyMonkey (2018), “How to analyze survey results”, available at: https://goo.gl/goXXwf (accessed
February 20).
SurveyMonkey (2018), “The difference between quantitative vs. qualitative research”, available at:
https://goo.gl/ScfCQu (accessed February 20).
Usability.gov (2018), “Online surveys”, available at: https://goo.gl/YkR1CK (accessed February 20).
Ward, C. (2017), “Best practices for designing engaging customer surveys”, July 31, available at:
https://goo.gl/EajKhx.
WBA Research (2013), “Best practices in mobile survey design: the impact of mobile phones”, November
13, available at: https://goo.gl/sm5KPy.
Zarc Interactive (2018), “Considerations for deploying online survey research”, available at:
https://goo.gl/LMjkXW (accessed February 20).

Note: Lengthy URLs have been shortened using Google URL Shortener. If clicked, those URLs will open
at the sites of the original URLs.

Table 2
Conclusions and Recommendations as to the Future Role of Survey Research

• Academics and business professionals will have different uses for surveys. For academics, surveys will
continue to be the leading research tool. They will utilize tools such as Qualtrics, SurveyMonkey, and
other researcher-controlled online surveys. On the other hand, because they have more access to high-
tech research tools, business professionals will rely less on surveys.
• In both academic and non-academic arenas, these will be the best uses for surveys:
o To systematically study such factors as people’s attitudes, intentions, and motivation. This is vital
when researchers study why people have certain views and why they behave in a particular way.
o To get in-depth responses for qualitative research.
o To better control respondent selection and balance.
o To do longitudinal opinion research with the same respondents through consumer panels.
o To learn whether attitudes about the future prove to be correct.
o To study special topics, such as consumer sentiment, feelings about specific brands, intended brand
loyalty, feelings about the customer experience, job satisfaction, and more.
o To provide the inputs for informative, actionable big data.
• In both academic and non-academic arenas, there will be more use of tools such as online consumer
panels and high-tech eye-tracking combined with surveys. In the non-academic arena, behavioral surveys
will mostly be replaced by online tracking, Internet of Things tracking, simulations, etc.
• With regard to surveys:
o Hybrid research techniques will be more prevalent, such as tracking behavior through connected
devices combined with online surveys on attitudes and intentions. Each tool must have a clear
purpose. This will also allow more points of contact.
o Sampling issues will remain, with researchers more often deciding that random sampling is not
feasible. Data collection must be done systematically whether the researcher controls the sample and
methodology or the user controls the data and is part of a self-selected sample.
o Social media comments, reviews, and related sources of information must be better incorporated into
survey research – and be considered a valued way of gaining insights not likely to be gathered in
traditional surveys. But the researcher must be careful in assessing the data, since they may not be
representative and some comments may be “planted” by unscrupulous parties. Also, the researcher needs
to view discussions at online communities differently than those at more general social media sites.
o A major advantage of traditional online surveys is that questions, answer categories, sequencing,
moving from one question to the next, etc. are structured. This will continue to make researcher-
controlled surveys desirable.
o Response rates for online surveys must be increased for more representative results. The lack of
Internet access by some people must be considered when sampling. Quota sampling will be a must
(a minimal quota-check sketch follows this table).
o To get better quality answers and less respondent abandonment during a survey, the survey length
must be tightened up. It is unrealistic to assume that respondents will patiently answer surveys that
take more than a few minutes to complete.
o The use of photos, videos, etc. may stimulate survey participants to complete surveys and to be more
likely to participate again.
o As long as they are carefully compiled and analyzed, surveys can be an input for quantitative research.
• The growing popularity of mobile surveys should not be underestimated. These surveys must be formatted
to fit smaller screens and to make questions easy to answer. Tools such as Qualtrics do this.
• Because many researchers want to study offline behavior, not just online behavior, surveys may be the
only format to elicit this type of data. In this instance, the best mode of survey delivery must be set by
the researcher. Thus, hybrid surveys with different possible points of contact may be desirable.
• Many researchers will use “do-it-yourself” survey tools such as SurveyMonkey, with the attendant
advantages and limitations.
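
As one concrete illustration of the quota sampling recommended above, here is a minimal sketch – our own illustration in Python, with hypothetical quota targets and field names rather than a prescribed implementation – of screening incoming respondents so that a sample fills predefined demographic cells:

# Minimal sketch (hypothetical quotas): admit incoming respondents
# only while their demographic cell still has room.
targets = {"18-34": 200, "35-54": 200, "55+": 100}  # desired completes per cell
filled = {"18-34": 0, "35-54": 0, "55+": 0}         # completes so far

def accept(age_group: str) -> bool:
    """Admit a respondent only if their quota cell is not yet full."""
    if age_group not in targets or filled[age_group] >= targets[age_group]:
        return False  # screen out: unknown cell or quota already met
    filled[age_group] += 1
    return True

# Example: route one incoming respondent.
if accept("35-54"):
    print("Proceed to survey")  # respondent enters the questionnaire
else:
    print("Screened out")       # show a thank-and-terminate message

In practice, the same check would run inside a survey platform’s screener logic, with cells mirroring known population proportions.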

Table 3
Conclusions and Recommendations as to the Methodology of Survey Research
1. Establish the purpose/goals of the survey before the methodology is formed. Purpose/goals give guidance
as to the best survey methodology to deploy. Where possible, include testable hypotheses and methods
based on sound theoretical foundations.
2. Set the type of survey (mode of delivery).
o Traditional online survey (researcher-generated questions)
o Non-traditional online survey (user-generated content)
o Hybrid survey
o Intranet for employee survey
3. Decide upon the sampling method, with appropriate guidelines.
o How respondents will be accessed/reached
o Random vs. non-random sampling
o Opting-in vs. opting-out
o Control of survey access so as to be available only to the chosen respondents
o Representativeness
o Utilization of panels
o Sample size and sampling error
o Non-response rates and attrition in panel membership
o Use of incentives to increase response rate
4. Determine who constructs the survey.
o Do-it-yourself
o Professional survey firm
5. Disclose the true purpose of the survey and how data will be used (transparency). Note the survey author.
6. Create questions and answer categories in a user-friendly and objective manner:
o Are all instructions clear?
o What information is requested?
 Only asking questions that are necessary to fulfill survey goals
 Being sure respondents are capable of answering the questions asked
 Considering an opt-out policy for personal questions
 Including filters to include only serious respondents
o What is the best question format?
 Multiple-choice, scales, ranking, open-ended, etc. (difficulty of inputting open-ended answers on
mobile devices)
 Mix of questions
 Question wording (avoid bias)
 Jargon-free wording
o Should multi-media question features be utilized?
 Photos
 Videos/animations
o How should questions be ordered?
 Sequencing (logical order)
 “Skip to” directions
o What are the proper answer categories?
 Whether single or multiple answers should be chosen (or a mix of the two)
 Mutually exclusive answer choices
 Answer ranges (such as “age 45-54” in a close-ended question) vs. specific answers (such as
“how old are you” in an open-ended question)
 Rotation of close-ended answers to minimize “yea sayer/nay sayer” effect
 Not forcing responses (by offering a “don’t know” or similar choice)
 Careful coding and categorization of answers for open-ended questions

7. Devise surveys long enough to get the required information, but not so long as to affect response rates.
o Realistic survey length
o Upfront information to the respondent about the expected time to complete the survey
8. Use an appealing survey design that encourages survey completion. Assure respondents of anonymity.
o Ease of navigation
o Accessible layout and font size, with adaptation to screen size for mobile surveys (drop-down menus
are harder for respondents on mobile devices)
o Standardization of design across multiple surveys with the same respondents/panelists
o Including a progress bar to indicate the percentage of survey completion
9. When possible, pre-test the survey to pinpoint any weaknesses that should be corrected.
10. Determine who administers the survey, collects the information, and analyzes the data.
o Researcher-controlled
o Firm such as Qualtrics or SurveyMonkey
o Other
11. Establish a time frame for all aspects of a survey research project, including data collection.
12. After the survey, preserve the raw data in an electronic file to be able to verify results.
13. Use data analytics appropriate for the given survey. And provide analysis based on study goals (and
hypotheses, if utilized).
o Quantitative results
o Qualitative results
14. In any publication arising from the survey, acknowledge study limitations and offer recommendations for
further research. And maintain respondent confidentiality in reporting results.
15. Act upon the findings. Do not conduct a survey that does not increase the knowledge base or does not
lead to a change in an organization’s strategy or tactics.

Special Considerations for Non-Traditional Online Surveys (User-Generated Content)

• As with other types of research, establish the purpose of the study (including hypotheses, if appropriate).
• For each project, determine the specific content areas/topics to be examined.
• Carefully select the online source of information, such as reviews at Amazon or comments at a company
Facebook page.
• Read a number of reviews and/or comments on the topic(s). Then set up questions upon which to focus
(such as “On Twitter, what are tweeters saying about the prices of product x?” and “At Amazon, what
are the main reasons for negative customer reviews about product x?”)
• Set a time frame for content to be included in the study (such as the last three months).
• Collect enough comments to provide a good cross-section of responses.
• Results can be treated qualitatively and/or quantitatively (a minimal tallying sketch follows this table).
• Engage in longitudinal analysis to learn about shifting attitudes and intentions.
• Use other methods to obtain information about demographics and other personal data.
• Recognize both the advantages and limitations of this type of content analysis.
• Act upon the findings. Do not conduct research that does not increase the knowledge base or lead to a
change in an organization’s strategy or tactics.
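
To make the quantitative treatment of user-generated content concrete, here is a minimal sketch – our own illustration, with hypothetical comments, field names, and keyword lists – of tallying how often focal topics appear in collected comments within a set time frame:

# Minimal sketch (hypothetical data and keywords) for quantitatively
# tallying focal topics in collected user-generated comments.
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical comments as (timestamp, text) pairs, e.g., collected reviews.
comments = [
    (datetime(2018, 1, 15), "The price is too high for what you get."),
    (datetime(2018, 2, 3), "Shipping was slow, but the price was fair."),
    (datetime(2018, 2, 20), "Battery life is poor; returned the product."),
]

# Focal topics and the keywords that signal them (set per study goals).
topics = {
    "price": ["price", "cost", "expensive", "cheap"],
    "shipping": ["shipping", "delivery"],
    "quality": ["battery", "poor", "broken", "returned"],
}

# Include only comments within the study's time frame (e.g., last 90 days).
cutoff = datetime(2018, 2, 28) - timedelta(days=90)

counts, included = Counter(), 0
for posted, text in comments:
    if posted < cutoff:
        continue  # outside the study's time frame
    included += 1
    lowered = text.lower()
    for topic, keywords in topics.items():
        if any(keyword in lowered for keyword in keywords):
            counts[topic] += 1

# Report topic frequencies as a simple quantitative summary.
for topic, n in counts.most_common():
    print(f"{topic}: mentioned in {n} of {included} comments")

A real project would substitute systematically collected data, a validated coding scheme, and human checks on a sample of the automatically coded comments.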

Table 4

Conclusions and Recommendations about Abiding by a Strong Code of Ethics

• Abiding by strong codes of ethics will help researchers gain greater trust among all constituencies.
• All laws regarding privacy, data security, vulnerable groups, personally-identifiable information, etc.
must be followed. These laws may differ by country or state. [Global and European firms must
understand and adapt to the EU’s stringent new GDPR – General Data Protection Regulation.]
• The concept of ethics differs by culture and geographic region. However, this does not mean that survey
practices widely known as unethical are ever acceptable.
• The researcher should always commit to a high level of integrity, independence, objectivity, and
transparency – and be qualified to carry out a research project.
• When respondents are asked to participate in a survey, that survey should not involve “sugging”, which
is selling under the guise of research. Ads should not appear during the survey process.
• The researcher’s identity should be clear. If there is a sponsor, that should be noted.
• Special care should be taken with disguised surveys.
• Survey participation should be voluntary.
o Potential respondents should not be aggressively pursued to participate.
o Respondents should be asked only about subjects they are competent to answer.
o Respondents should be able to withdraw from answering survey questions at any point.
• Respondents should be asked for their informed consent before completing a survey.
o If tracking is to be done, this must be known and approved by the respondent.
o If cookies are placed on respondent devices, they should be temporary in nature and used only for
the specific survey undertaken.
o Active agent technology and required software downloads must be used judiciously, and tied to
informed consent.
o Care should be taken in using vulnerable people as respondents.
• Personal data should be securely collected and stored, and cookies used with care.
o Individual responses should be treated confidentially. Personal identifiers should not be included in a
survey’s database. Personally-identifiable information should be kept separate from survey responses,
and such data should be maintained under code numbers rather than stored by name (an illustrative
sketch follows this table).
o The https designation should be utilized with all online surveys.
o If panels are used, respondents must be informed that data are stored and reviewed longitudinally,
and given the opportunity to opt out of surveys and drop off the panel (for any reason).
o Access to survey materials must be restricted, and identifiable information never sold to third parties.
• Expected survey duration (completion time) should be fairly estimated and told to respondents upfront.
• Both questions and answer choices should be presented in an objective manner to reduce bias.
o The researcher must weigh the trade-off between response rate and potential respondent bias due to
incentives to participate.
o If incentives are given, they should be small in nature (and their use noted in reporting results).
• The validity and value of survey results should not be overstated, especially as to the representative
nature of the findings.
o Ethical standards must apply to both quantitative and qualitative surveys.
o Where possible, sampling should include those without access to the Internet.
o Computer literacy may be an issue in surveys of respondents in less-developed countries.
• The use of the proper statistical techniques must be tied to the quality and types of data collected.
• In presenting survey results, analysis should be objective and based on the whole data set – and not
distorted to misrepresent the findings.
• When issuing press releases or otherwise publicizing results, methodologies should be detailed.
• Clients should be treated in the same ethical manner as survey respondents – honestly and transparently.
• Backup records should be maintained so as to verify findings.
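
To illustrate the recommended separation of personally-identifiable information (PII) from responses, here is a minimal sketch – our own illustration, with hypothetical field names – that assigns random code numbers and writes the identifiers and the answers to separate files:

# Minimal sketch (hypothetical fields) of separating PII from survey
# responses via assigned code numbers.
import csv
import secrets

# Hypothetical raw records as collected: PII mixed with answers.
raw_records = [
    {"name": "A. Respondent", "email": "a@example.com", "q1": "5", "q2": "Agree"},
    {"name": "B. Respondent", "email": "b@example.com", "q1": "3", "q2": "Neutral"},
]

link_table = {}   # code number -> PII, kept in a separate, restricted file
responses = []    # answers only, keyed by code number

for record in raw_records:
    code = secrets.token_hex(4)  # random code number, not derived from PII
    link_table[code] = {"name": record["name"], "email": record["email"]}
    responses.append({"code": code, "q1": record["q1"], "q2": record["q2"]})

# Write the de-identified response file used for analysis.
with open("responses.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["code", "q1", "q2"])
    writer.writeheader()
    writer.writerows(responses)

# Write the link table separately; store it under restricted access.
with open("pii_link_table.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["code", "name", "email"])
    writer.writeheader()
    for code, pii in link_table.items():
        writer.writerow({"code": code, **pii})

The link table would be held under restricted access (and deleted once verification is complete), so that analysts work only with the coded response file.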

References

Note: Lengthy URLs have been shortened using Google URL Shortener. If clicked, those URLs will open
at the sites of the original URLs.

Abitol, A. and Lee, S.Y. (2017), “Messages on CSR-dedicated Facebook pages: What works and what doesn’t”, Public
Relations Review, Vol. 43, No. 4, pp. 796-808.
Accenture (2017), “Put your trust in hyper-relevance”, available at: https://goo.gl/zV6wHk.
Ahmad, U., Zahid, A., Shoaib, and AlAmri, A. (2017), “HarVis: An integrated social media content analysis framework for YouTube platform”, Information Systems, Vol. 69, pp. 25-39.
Ahmad, Z.A. and Buyong, M. (2017), “Content analysis of online news portal: Issues and challenges”, Journal of Social Sciences and Humanity, Vol. 12, Special Issue 2, pp. 164-174.
Ahmadi, A., Dileepan, P., and Wheatley, K.K. (2016), “A SWOT analysis of big data”, Journal of Education for Business,
Vol. 91, No. 5, pp. 1-6.
Airehrour, D., Gutierrez, J., and Kumar, S. (2016), “Secure routing for Internet of Things”, Journal of Network and Computer Applications, Vol. 66, No. C, pp. 198-213.
Akbulut, Y. (2015), “Predictors of inconsistent responding in Web surveys”, Internet Research, Vol. 25, No. 1, pp.
131-147.
Altintzoglou, T., Done, I., Voldnes, G., Nøstvold, B., and Sogn-Grundvåg, G. (2018), “Hybrid surveys: A method for the
effective use of open-ended questions in quantitative food choice surveys”, Journal of International Food &
Agribusiness Marketing, Vol. 30, No. 1, pp. 49-60.
American Association for Public Opinion Research (AAPOR) (2015), “The code of professional ethics and practices”,
November 30, available at: https://goo.gl/1t8hgv.
American Association for Public Opinion Research (AAPOR) (2018a), “Best practices for survey research”, available at:
www.aapor.org/Standards-Ethics/Best-Practices.aspx#best1 (accessed February 5).
American Association for Public Opinion Research (AAPOR) (2018b), “Evaluating survey quality in today’s complex
environment”, available at: https://goo.gl/MAAqpA (accessed February 5).
American Association for Public Opinion Research (AAPOR) Task Force (2015), “Big data in survey research”, Public
Opinion Quarterly, Vol. 79, No. 4, pp. 839-880.
Anduiza, E. and Galais, C. (2017), “Answering without reading: IMCs and strong satisficing in online surveys”,
International Journal of Public Opinion Research, Vol. 29, No. 3, pp. 497–519.
Antoun, C., Couper, M.P., and Conrad, F.G. (2017), “Effects of mobile versus pc Web on survey response quality: A
crossover experiment in a probability Web panel”, Public Opinion Quarterly, Vol. 81, No. S1, pp. 280-306.
Baatard, G. (2012), “A technical guide to effective and accessible Web surveys”, Electronic Journal of Business Research Methods, Vol. 10, No. 2, pp. 101-109.
Baesens, B., Bapna, R., Marsden, J.R., Vanthienen, J., and Zhao, J.L. (2016), “Transformational issues of big data and
analytics in networked business”, MIS Quarterly, Vol. 40, No. 4, pp. 807-818.
Banerjee, S. and Chua, A.Y.K. (2016), “Authentic versus fictitious online reviews”, Journal of Information Science, Vol.
43, No. 1, pp. 122-134.
Bansal, H.S., Eldridge, J., Haider, J., Knowles, R., Murray, M., Sehmer, L., and Turner, D. (2017), “Shorter interviews,
longer surveys”, International Journal of Market Research, Vol. 59, No. 2, pp. 221-238.
Barchard, K.A. and Williams, J. (2008), “Practical advice for conducting ethical online experiments and questionnaires for
United States psychologists”, Behavior Research Methods, Vol. 40, No. 4, pp. 1111-1128.
Barge, S. and Gehlbach, H. (2012), “Using the theory of satisficing to evaluate the quality of survey data”, Research in
Higher Education, Vol. 53, pp. 182-200.
Biemer, P.P. (2010), “Total survey error: design, implementation, and evaluation”, Public Opinion Quarterly, Vol. 74, No.
5, pp. 817-848.
Bohlouli, M., Dalter, J., Dornhöfer, M., Zenkert, J., and Fathi, M. (2015), “Knowledge discovery from social media using
big data-provided sentiment analysis (SoMABiT)”, Journal of Information Science, Vol. 41, No. 6, pp. 779-798.
Bradlow, E.T., Gangwar, M., Kopalle, P., and Voleti, S. (2017), “The role of big data and predictive analytics in retailing research”, Journal of Retailing, Vol. 93, No. 1, pp. 79-95.
Braunsberger, K., Wybenga, H., and Gates, R. (2007), “A comparison of reliability between telephone and Web-based surveys”, Journal of Business Research, Vol. 60, No. 7, pp. 758-764.
Brown, G., Weber, D., Zanon, D., and de Bie, K. (2012), “Evaluation of an online (opt-in) panel for public participation geographic information systems surveys”, International Journal of Public Opinion Research, Vol. 24, No. 4, pp. 534-545.
BRP Consulting (2017), “2017 Unified commerce survey”, available at: https://goo.gl/DVrTYh.
Brüggen, E. and Dholakia, U.M. (2010), “Determinants of participation and response effort in Web panel surveys”,
Journal of Interactive Marketing, Vol. 24, No. 3, pp. 239-250.
Buchanan, E.A. and Hvizdak, E.E. (2009), “Online survey tools: Ethical and methodological concerns of human research
ethics committees”, Journal of Empirical Research on Human Research Ethics, Vol. 4, No. 2, pp. 37–48.
Buchanan, E.A. and Zimmer, M. (2018), “Internet research ethics”, Stanford Encyclopedia of Philosophy, Spring, available at: https://plato.stanford.edu/entries/ethics-Internet-research/.
Buhrmester, M., Kwang, T., and Gosling, S. D. (2011), “Amazon’s Mechanical Turk: A new source of inexpensive, yet
high-quality, data?” Perspectives on Psychological Science, Vol. 6, No. 1, pp. 3-5.
Burkhardt, J. (2017), “Combating fake news in the digital age”, Library Technology Reports, Vol. 53, No. 8, pp. 5-33.
Business Dictionary (2018), “Survey”, available at: www.businessdictionary.com/definition/survey.html (accessed January 29).
Calder, B.J., Malthouse, E.C., and Maslowska, E. (2016), “Brand marketing, big data, and social innovation as future research directions for engagement”, Journal of Marketing Management, Vol. 32, Nos. 5-6, pp. 579-585.
Callegaro, M. and Disogra, C. (2008), “Computing response metrics for online panels”, Public Opinion Quarterly, Vol. 72,
No. 5, pp. 1008-1032.
Cambridge Dictionary (2018), “Survey”, available at: https://dictionary.cambridge.org/us/dictionary/english/survey (accessed January 29).
CASRO (2013), “Code of standards and ethics for market, opinion, and social research”, available at:
https://goo.gl/Ec4HYZ.
Castelnuovo, E. and Tran, T.D. (2017), “Google it up! A Google Trends-based uncertainty index for the United States and
Australia”, Economics Letters, Vol. 161, December, pp. 149-153.
Chang, S-W. and Fan, S-H. (2017), “Cultivating the brand-customer relationship in Facebook fan pages”, International
Journal of Retail & Distribution Management, Vol. 45, No. 3, pp. 253-270.
Chen, J.V., Widjaja, A.E., and Yen, D.C. (2015), “Need for affiliation, need for popularity, self-esteem, and the moderating
effect of big five personality traits affecting individuals’ self-disclosure on Facebook”, International Journal of Human-
Computer Interaction, Vol. 31, pp. 815-831.
Cheung, C., Lee, Z.W.Y., and Chan, T.K.H. (2015), “Self-disclosure in social networking sites”, Internet Research, Vol. 25, No. 2, pp. 279-299.
Cheung, C.M.K. and Lee, M.K.O. (2012), “What drives consumers to spread electronic word of mouth in online consumer-opinion platforms”, Decision Support Systems, Vol. 53, No. 6, pp. 218-225.
Cheung, C.M.K., Xiao, B.S., and Liu, I.L.B. (2015), “Do actions speak louder than voices? The signaling role of social
information cues in influencing consumer purchase decisions”, Decision Support Systems, Vol. 65, No. C, pp. 50-58.
Cheung, C. M-Y., Sia, C-L., and Kuan, K. K.Y. (2012), “Is this review believable? A study of factors affecting the
credibility of online consumer reviews from an ELM perspective”, Journal of the Association for Information Systems, Vol. 13, No. 8, pp. 618-635.
Cho, Park, J., Han, S.H., and Kang, S. (2011), “Development of a Web-based survey system for evaluating affective
satisfaction”, International Journal of Industrial Ergonomics, Vol. 41, pp. 247-254.
Clarke, R. (2016), “Big data, big risks”, Information Systems Journal, Vol. 26, No. 1, pp. 77–90.
Clifford, S. and Jerit, J. (2016), “Cheating on political knowledge questions in online surveys”, Public Opinion Quarterly,
Vol. 80, No. 4, pp. 858–887.
Constant Contact (2018), “Top 12 survey best practices”, available at: https://goo.gl/info/QseukS (accessed February 6).
Cooke, M. and Buckley, N. (2008), “Web 2.0, social networks, and the future”, International Journal of Market Research,
Vol. 50, No. 2, pp. 267-292.
Costello, C., McDermott, M-L., and Wallace, R. (2017), “Netnography: Range of practices, misperceptions, and missed
opportunities”, International Journal of Qualitative Methods, Vol. 16, No. 1, pp. 1-12.
Couper, M.P. (2013), “Is the sky falling? New technology, changing media, and the future of surveys”, Survey Research
Methods, Vol. 7, No. 3, pp. 145-156.
Couper, M.P. (2017), “New developments in survey data collection”, Annual Review of Sociology, Vol. 43, pp. 121-145.
Courtright, M. (2015), “A reality check for online data quality best practices”, GreenBook Blog, August 24, available at:
https://goo.gl/4AP3J3.
Dabas, C. (2017), “Big data analytics for exploratory social network analysis”, International Journal of Information
Technology and Management, Vol. 16, No. 4, pp. 348-359.
Deakin, H. and Wakefield, K. (2013), “Skype interviewing: Reflections of two Ph.D. researchers”, Qualitative
Research, Vol. 14, No. 5, pp. 603-616.
De Bruijne, M. and Wijnant, A. (2014), “Improving response rates and questionnaire design for mobile Web surveys”,
Public Opinion Quarterly, Vol. 78, No. 4, pp. 951–962.
Deloitte (2017), “Global mobile consumer trends”, 2e, available at: https://goo.gl/xdeS8e.
Derksen, C., Serlachius, A., Petrie, K.J., and Dalbeth, N. (2017), “What say ye gout experts?” BMC Musculoskeletal Disorders, Vol. 18, pp. 1-5.
Deutskens, E., de Ruyter, K. and Wetzels, M. (2006), “An assessment of equivalence between online and mail surveys in service research”, Journal of Service Research, Vol. 8, No. 4, pp. 346-355.
Dillman, D.A. (2015), “Future surveys”, Monthly Labor Review, November, available at: https://goo.gl/hbjsrE.
Downes-Le Guin, T., Baker, R., Mechling, J., and Ruyle, E. (2012), “Myths and realities of respondent engagement in
online surveys”, International Journal of Market Research, Vol. 54, No. 5, pp. 613-633.
Durr, J. and Smith, J. (2017), “Best of 2017: Insights from thousands of b2b customer interviews”, Gallup Business
Journal, December 21, available at: https://goo.gl/CEErsJ.
Dutwin, D. and Buskirk, T.D. (2017), “Apples to apples or gala versus golden delicious?” Public Opinion Quarterly, Vol.
81, Special Issue, pp. 213-249.
eMarketer (2017a), “Your customers think you know what they’ve been buying”, October 18, available at:
https://goo.gl/nuRSb6.
eMarketer (2017b), “Internet users and penetration worldwide, 2016-2021”, October 20, available at: https://goo.gl/eKtc9a.
eMarketer (2018), “Many companies feel unprepared for GDPR and all that it requires”, available at: https://goo.gl/v83nfj
(accessed February 23).
ESOMAR (2008), Global market research 2008, Amsterdam.
ESOMAR (2011), “ESOMAR guideline for online research”, available at: https://goo.gl/1vHfZx.
ESOMAR (2016), Global market research 2016, Amsterdam.
Etheridge, A. (2016), “Bringing data to life”, Computers in Libraries, Vol. 36, No. 3, pp. 8-11.
European Pharmaceutical Market Research Association (EphMRA) (2018), “Code of conduct: Researchers responsibilities
by research approach”, available at: https://goo.gl/SEFsLS (accessed February 12).
Evans, J.R. and Mathur, A. (2005), “The value of online surveys”, Internet Research, Vol. 15, No. 2, pp. 195-219.
Fan, W. and Yan, Z. (2010), “Factors affecting response rates of the Web survey: A systematic review”, Computers in
Human Behavior, Vol. 26, pp. 132–139.
Fazekas, Z., Wall, M.T., and Krouwel, A. (2014), “Is it what you say, or how you say it? An experimental analysis of the effects
of invitation wording for online panel surveys”, International Journal of Public Opinion Research, Vol. 26, No. 2, pp. 235-244.
Fielding, J., Fielding, N., and Hughes, G. (2013), “Opening up open-ended survey data using qualitative software”, Qual
Quant, Vol. 47, pp. 3261-3276.
Future Today Institute (2017), “2017 Tech Trend Report”, available at: https://futuretodayinstitute.com/2017-tech-trends/.
Galesic, M., and Bosnjak, M. (2009), “Effects of questionnaire length on participation and indicators of response quality in
a Web survey”, Public Opinion Quarterly, Vol. 73, No. 2, pp. 349–360.
Gandomi, A. and Haider, M. (2015), “Beyond the hype: Big data concepts, methods, and analytics”, International Journal
of Information Management, Vol. 35, No. 2, pp. 137-144.
Gharib, R., Philpott, E., and Duan, Y. (2017), “Factors affecting active participation in B2B online communities”,
Information & Management, Vol. 54, No. 4, pp. 516-530.
Goodman, J.K. and Paolacci, G. (2017), “Crowdsourcing consumer research”, Journal of Consumer Research, Vol. 44, No.
1, pp. 196-210.
Göritz, A.S. and Luthe, S.C. (2013), “Lotteries and study results in market research online panels”, International Journal
of Market Research, Vol. 55, No. 5, pp. 611-626.
GreenBook (2017), GreenBook research industry trends (GRIT) report, Q3-Q4, available at: www.greenbook.org/grit.
Greenland, S. and Kwansah-Aidoo, K. (2012), “The challenges of market research in emerging markets: A practitioner
perspective from Sub-Saharan Africa”, Australasian Journal of Market & Social Research, Vol. 20, No. 2, pp. 9-22.
Gregori, A. and Baltar, F. (2011), “Ready to complete the survey on Facebook”, International Journal of Market Research,
Vol. 55, No. 1, pp. 131-148.
Greszki, R., Meyer, M., and Schoen, H. (2015), “Exploring the effects of removing ‘too fast’ responses and respondents
from Web surveys”, Public Opinion Quarterly, Vol. 79, No. 2, pp. 471-503.
Gunarathne, P., Rui, H., and Seidmann, A. (2017), “Whose and what social media complaints have happier resolutions?” Journal of Management Information Systems, Vol. 34, No. 2, pp. 314-340.
Ha, L., Hu, X., Fang, L., Henize, S., Park, S., Stana, A., and Zhang, X. (2015), “Use of survey research in top mass
communication journals 2001–2010 and the total survey error paradigm”, Review of Communication, Vol. 15, No. 1,
pp. 39-59.
Han, S.P., Park, S., and Oh, W. (2016), “Mobile app analytics: A multiple discrete-continuous choice framework”, MIS
Quarterly, Vol. 40, No. 4, pp. 983-1008.
Hanna, P. (2012), “Using Internet technologies (such as Skype) as a research medium”, Qualitative Research, Vol. 12, No.
2, pp. 239-242.
Hansen, J.M., and Smith, S.M. (2012), “The impact of two-stage highly interesting questions on completion rates and data
quality in online marketing research”, International Journal of Market Research, Vol. 54, No. 2, pp. 241-260.
Heerwegh, D. (2009), “Mode differences between face-to-face and Web surveys”, International Journal of Public Opinion Research, Vol. 21, No. 1, pp. 111-121.
Herian, M.N. and Tomkins, A.J. (2012), “Citizen satisfaction survey data: A mode comparison of the derived importance–
performance approach”, American Review of Public Administration, Vol. 42, No. 1, pp. 66-86.
Holland, J.L. and Christian, L.M. (2009), “The influence of topic interest and interactive probing on responses to open-
ended questions in Web surveys”, Social Science Computer Review, Vol. 27, No. 2, pp. 196-212.
Hsu, J.W., Schmeiser, M., Haggerty, C., and Nelson, S. (2017), “The effect of large monetary incentives on survey
completion”, Public Opinion Quarterly, Vol. 81, No. 3, pp. 736–747.
Hsu, M. (2017), “Neuromarketing: Inside the mind of the consumer”, California Management Review, Vol. 59, No. 4, pp. 5-22.
Hu, K-C., Lu, M., Huang, F-Y., and Jen, W. (2017), “Click ‘like’ on Facebook: The effect of customer-to-customer
interaction”, International Journal of Human-Computer Interaction, Vol. 33, No. 2, pp. 135-142.
iModerate (2018), “Today’s hybrid research: It may not mean what you think it means”, available at:
https://goo.gl/oSvXQn (accessed February 19).
Interactive Marketing Research Organization (IMRO) (2015), “IMRO guidelines for best practices in online sample and
panel management”, October 13, available at: https://goo.gl/91wsU8.
Investopedia (2018), “Crowdsourcing”, available at: www.investopedia.com/terms/c/crowdsourcing.asp (accessed February 21).
Ipsos (2017), “2016 annual results”, February 27, available at: https://goo.gl/oY1Vo4.
Jensen, M.L. and Yetgin, E. (2017), “Prominence and interpretation of online conflict of interest disclosures”, MIS
Quarterly, Vol. 41, No. 2, pp. 629-643.
Jiang, S. and Beaudoin, C.E. (2016), “Smoking prevention in China: A content analysis of an anti-smoking social media
campaign”, Journal of Health Communication, Vol. 21, pp. 755-764.
Jin, L. (2011), “Improving response rates in Web surveys with default setting: The effects of default on Web survey
participation and permission”, International Journal of Market Research, Vol. 53, No. 1, pp. 75-94.
Jones, M.S., House, L.A., and Gao, Z. (2015), “Respondent screening and revealed preference axioms”, Public Opinion
Quarterly, Vol. 79, No. 3, pp. 687–709.
Kantar (2018), “Kantar: Inspiration for an extraordinary world”, available at: www.kantar.com/about (accessed February
15).
Karkouch, A., Mousannif, H., Al Moatassime, H., and Noel, T. (2016), “Data quality in Internet of Things: A state-of-the-art survey”, Journal of Network and Computer Applications, Vol. 73, No. C, pp. 57-81.
Kennedy, C., Mercer, M., Keeter, S., Hatley, N., McGeeney, K., and Gimenez, A. (2016), “Evaluating online
nonprobability surveys”, May 2, available at: https://goo.gl/jYP6o9.
Keusch, F. (2011), “The role of topic interest and topic salience in online panel Web surveys”, International Journal of
Market Research, Vol. 55, No. 1, pp. 59-80.
Kilinç, H. and Firat, M. (2017), “Opinions of expert academicians on online data collection and voluntary participation in
social sciences research”, Educational Sciences: Theory & Practice, Vol. 17, No. 5, pp. 1461-1486.
Kim, M., Kim, J., Choi, J., and Trivedi, M. (2017), “Mobile shopping through applications: Understanding application
possession and mobile purchase”, Journal of Interactive Marketing, Vol. 39, August, pp. 55-68.
Kimble, C. and Milolidakis, G. (2015), “Big data and business intelligence: Debunking the myths”, Global Business and
Organizational Excellence, Vol. 35, No. 1, pp. 23–34.
Korpilo, S., Virtanen, T., and Lehvävirta, S. (2017), “Smartphone GPS tracking – Inexpensive and efficient data collection
on recreational movement”, Landscape and Urban Planning, Vol. 157, January, pp. 608-617.
Kozinets, R.V. (2002), “The field behind the screen: Using netnography for marketing research in online communities”,
Journal of Marketing Research, Vol. 39, No. 1, pp. 61-72.
Krajicek, D. (2015), “A panel for every purchase”, Marketing Insights, Vol. 27, pp. 8-10.
Kraus, A. and Martins, N. (2017), “On the street: A content analysis of body imagery in streetstyle fashion blogs”, Journal of Broadcasting & Electronic Media, Vol. 61, No. 2, pp. 351-367.
Kugler, K. (2014), “Keeping online reviews honest”, Communications of the ACM, Vol. 57, No. 11, pp. 20-23.
Landrock, U. and Menold, N. (2016), “Validation of theoretical assumptions with real and falsified survey data”, Statistical
Journal of the IAOS, Vol. 32, pp. 305-312.
LaRose, R. and Tsai, H.Y.S. (2014), “Completion rates and non-response error in online surveys”, Computers in Human
Behavior, Vol. 34, May, pp. 110-119.
Lee, I. (2017), “Big data: Dimensions, evolution, impacts, and challenges”, Business Horizons, Vol. 60, No. 3, pp. 293-303.
Li, D.C. (2011), “Online social network acceptance: a social perspective”, Internet Research, Vol. 21, No. 5, pp. 562-580.
Li, Q. and Liu, Y. (2017), “Exploring the diversity of retweeting behavior patterns in Chinese microblogging platform”,
Information Processing & Management, Vol. 53, No. 4, pp. 945-962.
Link, M.W., Murphy, J., Schober, M.F., Buskirk, T.D., Hunter Childs, J. and Langer Tesfaye, C. (2014), “Mobile
technologies for conducting, augmenting, and potentially replacing surveys”, Public Opinion Quarterly, Vol. 78, No. 4,
pp. 779-787.
Liu, L., Cheung, C.M.K., and Lee, M. (2016), “An empirical investigation of information sharing behavior on social
commerce sites”, International Journal of Information Management, Vol. 36, No. 5, pp. 686-699.
Liu, M. and Wronski, L. (2017), “Examining completion rates in Web surveys over 25,000 real-world surveys”, Social
Science Computer Review, Vol. 36, No. 1, pp. 116-124.
Liu, Q.B. and Karahanna, E. (2017), “The dark side of reviews: The swaying effects of online product reviews on attribute
preference construction”, MIS Quarterly, Vol. 41, No. 2, pp. 427-448.
Liu, Y. (2010), “User control of personal information concerning mobile-app: Notice and consent?” Computer Law & Security Report, Vol. 30, No. 5, pp. 521-529.
Losbichler, H. and Michaels-Kim, N. (2017), “Eye-tracking for better reports”, Strategic Finance, October, pp. 37-43.
Lynn, P. and Kaminska, O. (2012), “The impact of mobile phones on survey measurement error”, Public Opinion
Quarterly, Vol. 77, No. 2, pp. 586–605.
Macdonald, E.K., Wilson, H.N., and Konuş, U. (2012), “Better customer insight – in real time”, Harvard Business Review,
Vol. 90, No. 9, pp. 102-108.
Manfreda, K.L., Bosnjak, M., Berzelak, J., Haas, I., and Vehovar, V. (2008), “Web surveys versus other survey modes: A
meta-analysis comparing response rates”, International Journal of Market Research, Vol. 50, No. 1, pp. 79-104.
Markham, A. and Buchanan, E. (2012), “Ethical decision-making and Internet research: Recommendations”, Version 2.0.
available at: http://aoir.org/reports/ethics2.pdf.
Martinsson, J., Dumitrescu, D., and Riedel, K. (2017), “Recruiting an online panel from another online survey”,
International Journal of Public Opinion Research, Vol. 29, No. 2, pp. 339-351.
Mauceri, S. (2016), “Integrating quality into quantity: Survey research in the era of mixed methods”, Quality & Quantity,
Vol. 50, No. 3, pp. 1213-1231.
Maxi, E., Haring, W., Tarkus, A., Altenstrasser, M., and Dolinar, M. (2010), “Effects of mobile Web survey invitation
modes on non-response”, International Journal of Mobile Marketing, Vol. 5, No. 1, pp. 5-14.
McDevitt, P. (2005), “Mode and visualization effects in online marketing research”, Marketing Management Journal, Vol.
15, No. 1, pp. 149-161.
Menictas, C., Wang, P., and Fine, B. (2011), “Assessing flat-lining response style bias in online research”, Australasian
Journal of Market & Social Research, Vol. 19, No. 2, pp. 34-44.
Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015), “How much gets you how much? Monetary incentives and
response rates in household surveys”, Public Opinion Quarterly, Vol. 79, No. 1, pp. 105–129.
Millar, M.M. and Dillman, D.A. (2011), “Improving response to Web and mixed-mode surveys”, Public Opinion
Quarterly, Vol. 75, No. 2, pp. 249–269.
Miller, P.V. (2017), “Is there a future for surveys?” Public Opinion Quarterly, Vol. 81, Special Issue, pp. 205–212.
Miniwatts Marketing Group (2018), “Internet world stats”, February 16, available at: www.Internetworldstats.com.
Moreno, M.A., Goniu, N., Moreno, P.S., and Diekema, D. (2013), “Ethics of social media research: Common concerns and
practical considerations”, Cyberpsychology, Behavior, and Social Networking, Vol. 16, No. 9, pp. 708-713.
Morelli, N.A., Mahan, R.P., and Illingworth, A.J. (2014), “Establishing the measurement equivalence of online selection
assessments delivered on mobile versus nonmobile devices”, International Journal of Selection and Assessment, Vol. 22,
No. 2, pp. 124-138.
Motta, M.P., Callaghan, Y.H., and Smith, B. (2017), “Looking for answers: Identifying search behavior and improving
knowledge-based data quality in online surveys”, International Journal of Public Opinion Research, Vol. 29, No. 4, pp.
575–603.
MRIA (2014), “MRIA code of conduct for market and social media research”, December, available at:
https://goo.gl/af4ZHU.
Murphy, M. and Sashi, C. (2018), “Communication, interactivity, and satisfaction in B2B relationships”, Industrial
Marketing Management, Vol. 68, No. 1, pp. 1-12.
Nam, H., Joshi, Y.V., and Kannan, P.K. (2017), “Harvesting brand information from social tags”, Journal of Marketing,
Vol. 81, No. 4, pp. 88-108.
National Science Foundation (2012), “The future of survey research: Challenges and opportunities”, May, available at:
https://goo.gl/b1wrLJ.
Neslin, S.A., Novak, T.P., Baker, K.B., and Hoffman, D.L. (2009), “An optimal contact model for maximizing online
panel response rates”, Management Science, Vol. 55, No. 5, pp. 727-737.
Ng, I.C.L. and Wakenshaw, S.Y.L. (2017), “The Internet of Things: Review and research directions”, International Journal of Research in Marketing, Vol. 34, No. 1, pp. 3-21.
Nielsen, J. (2016), “The distribution of users’ computer skills: Worse than you think”, November 13, available at:
www.nngroup.com/articles/computer-skill-levels/.
Okazaki, S. (2007), “Assessing mobile-based online surveys”, International Journal of Market Research, Vol. 49, No. 5, pp. 651-675.
Palaghias, N., Nati, M., Hoseinitabatabaei, S.A., Gluhak, A., and Moessner, K. (2016), “A survey on mobile social signal processing”, ACM Computing Surveys, Vol. 48, No. 4, pp. 1-52.
Pan, B., Woodside, A.W., and Meng, F. (2013), “How contextual cues impact response and conversion rates of online
surveys”, Journal of Travel Research, Vol. 53, No. 1, pp. 58-68.
Pan, Y., Wan, Y., Fan, J., Liu , B., and Archer, N. (2017), “Raising the cohesion and vitality of online communities by
reducing privacy concerns”, International Journal of Electronic Commerce, Vol. 21, No. 2, pp. 151-183.
Papachristos, A. (2014), “Consumer survey fatigue’s impact on brand perception”, 1to1 Media, May, available at:
https://goo.gl/DqjeZv.
Park, C.H. (2017), “Online purchase paths and conversion dynamics across multiple Web sites”, Journal of Retailing, Vol.
93, No. 3, pp. 253-265.
Park, S-H., Huh, S-Y., Oh, W., and Han, S.P. (2012), “A social network-based inference model for validating customer
profile data”, MIS Quarterly, Vol. 36, No. 4, pp. 1217-1237.
Pecáková, I. (2016), “Pitfalls of quantitative surveys online”, Acta Oeconomica Pragensia, Vol. 24, No. 6, available at:
https://goo.gl/x1EvLX.
Pedersen, M.J., and Nielsen, C.V. (2016), “Improving survey response rates in online panels: Effects of low-cost incentives
and cost-free text appeal interventions”, Social Science Computer Review, Vol. 34, No. 2, pp. 229-243.
PeoplePulse (2011), “10 best practices in opinion survey design”, January 24, available at: https://goo.gl/LMkuu2.
Petrovcic, A., Petric, G., and Manfreda, K.L. (2016), “The effect of email invitation elements on response rate in a Web
survey within an online community”, Computers in Human Behavior, Vol. 56, pp. 320-329.
Pew Research Center (2017), “The American Trends Panel survey methodology”, August 31, available at:
https://goo.gl/ADTxJA.
Pew Research Center (2018a), “Internet/broadband fact sheet”, February 5, available at: https://goo.gl/7CMXRR.
Pew Research Center (2018b), “Collecting survey data”, available at: https://goo.gl/XTSRuf (accessed February 8).
Pew Research Center (2018c), “Our Mission”, available at: www.pewresearch.org/about/our-mission/ (accessed February 12).
Porter, S.R. and Whitcomb, M.E. (2007), “Mixed-mode contacts in Web surveys”, Public Opinion Quarterly, Vol. 71, No.
1, pp. 635–648.
Pouyanfar, S., Yang, Y., Chen, S.C., Shyu, M-L., and Iyengar, S. (2018), “Multimedia big data analytics: A survey”, ACM
Computing Surveys, Vol. 51, No. 1, pp. 10-34.
Punj, G. (2017), “Consumer intentions to falsify personal information online: Unethical or justifiable?” Journal of
Marketing Management, Vol. 33, No. 15-16, pp. 1402-1412.
Qin, Y., Sheng, Q.Z., Falkner, N.J.G., Dustdar, S., Wang, H., and Vasilakos, A.V. (2016), “When things matter: A survey
on data-centric Internet of Things”, Journal of Network and Computer Applications, Vol. 64, No. C, pp. 137-153.
Raassens, N. and Haans, H. (2017), “NPS and online WOM”, Journal of Service Research, Vol. 20, No. 3, pp. 322-334.
Raben, F. (2014), “Foreword”, in Global Market Research 2014: An ESOMAR Industry Report, ESOMAR, Amsterdam.
Raphaeli, O., Goldstein, A., and Fink, L. (2017), “Analyzing online consumer behavior in mobile and PC devices”,
Electronic Commerce Research and Applications, Vol. 26, November-December, pp. 1-12.
Rehman, M. H., Liew, C.S., Abbas, A., Jayaraman, P.R., Wah, T.Y., and Khan, S.U. (2016), “Big data reduction methods:
A survey”, Data Science and Engineering, Vol. 1, No. 4, pp. 265–284.
Resnik, D.B. (2015), “What is ethics in research & why is it important?” December 1, available at: https://goo.gl/91B6RV.
Revilla, M. and Ochoa, C. (2015), “Quality of different scales in an online survey in Mexico and Colombia”, Journal of
Politics in Latin America, Vol. 7, No. 3, pp. 157-177.
Revilla, M.A. and Saris, W.E. (2013), “A comparison of the quality of questions in a face-to-face and a Web survey”,
International Journal of Public Opinion Research, Vol. 25, No. 2, pp. 242-253.
Revilla, M.A., Toninelli, D., Ochoa, C., and Loewe, G. (2016), “Do online access panels need to adapt surveys for mobile
devices?” Internet Research, Vol. 26, No. 5, pp. 1209-1227.
Riedl, C., Köbler, Goswami, S., and Krcmar, H. (2013), “Tweeting to feel connected: A model for social connectedness in
online social networks”, International Journal of Human-Computer Interaction, Vol. 29, pp. 670-687.
Rogelberg, S.C., Spitzmüeller, C., Little, I., and Reeve, C.L. (2006), “Understanding response behavior to an online special
topics organizational satisfaction survey”, Personnel Psychology, Vol. 59, pp. 903-923.
Roster, C.A., Albaum, G., and Smith, S.M. (2017), “Effect of topic sensitivity on online survey panelists’ motivation and
data quality”, Journal of Marketing Theory & Practice, Vol. 25, No. 1, pp. 1-16.
Salehan, M. and Kim, D.J. (2016), “Predicting the performance of online consumer reviews: A sentiment mining approach
to big data analytics”, Decision Support Systems, Vol. 81, January, pp. 30-40.
Salo, J. (2017), “Social media research in the industrial marketing field”, Industrial Marketing Management, Vol. 66, No. 1, pp. 115-129.
Sauermann, H., and Roach, M. (2013), “Increasing Web survey response rates in innovation research: An experimental
study of static and dynamic contact design features”, Research Policy, Vol. 42, pp. 273-286.
Schneider, E. (2015), “Rock, paper, survey”, Marketing Health Services, Vol. 35, No. 2, pp. 22-27.
Scholz, E. and Zuell, C. (2012), “Item non-response in open-ended questions”, Social Science Research, Vol. 41, pp. 1415-1428.
Seitz, S. (2015), “Pixilated partnerships, overcoming obstacles in qualitative interviews via Skype”, Qualitative Research,
Vol. 16, No. 2, pp. 229-235.
Sharp, A., Moore, P., and Anderson, K. (2011), “Are the prompt responders to an online panel survey different from those
who respond later?” Australasian Journal of Market & Social Research, Vol. 19, No. 1, pp. 25-33.
Shen, X-L, Cheung, C.M.K., and Lee, M.K.O. (2013), “Perceived critical mass and collective intention in social media-
supported small group communication”, International Journal of Information Management, Vol. 33, No. 5, pp. 707-715.
Shen, X-L., Sun, Y., and Wang, N. (2013), “Recommendations from friends anytime and anywhere”, Cyberpsychology, Behavior, and Social Networking, Vol. 16, No. 5, pp. 349-356.
Shoval, N. and Ahas, R. (2017), “The use of tracking technologies in tourism research: The first decade”, Tourism
Geographies, Vol. 18, No. 5, pp. 587-606.
Siliverstovs, B. and Wochner, D.S. (2018), “Google Trends and reality: Do the proportions match?” Journal
of Economic Behavior & Organization, Vol. 145, January, pp. 1-23.
Simmons, K., Mercer, A., Schwarzer, S., and Kennedy, C. (2016), “Evaluating a new proposal for detecting data
falsification in surveys”, Statistical Journal of the IAOS, Vol. 32, No. 3, pp. 327–338.
Singh, A., Taneja, A., and Mangalaraj, G. (2009), “Creating online surveys: Some wisdom from the trenches”, IEEE Transactions on Professional Communication, Vol. 52, No. 2, pp. 197-212.
Singh, J.P., Irani, S., Rana, N.P., Dwivedi, Y.K., Saumya, S., and Roy, P.K. (2017), “Predicting the ‘helpfulness’ of online
consumer reviews”, Journal of Business Research, Vol. 70, January, pp. 346-355.
Sirajudee, S.M., Fatihah, N., Adamu, A., and Abubakar, I. (2017), “Online fake news detection algorithm”, Journal of
Theoretical and Applied Information Technology, Vol. 95, No. 17, pp. 4114-4122.
Smith, T.W. (2013), “Survey-research paradigms old and new”, International Journal of Public Opinion Research, Vol. 25, No. 2, pp. 218-229.
Smith, V.K., Harlan, S.L., McLaen, M., Fishman, J., Valcarcel, C., and Nation, M. (2017), “Does reputation enhance
response rates?” Applied Economics Letters, Vol. 24, No. 17, pp. 1228-1231.
Smyth, J.D., Dillman, D.A., Christian, L.M., and McBride, M. (2009), “Open-ended questions in Web surveys”, Public
Opinion Quarterly, Vol. 73, No. 2, pp. 325-337.
Søilen, K.S., Tontini, T., and Aagerup, U. (2017), “The perception of useful information derived from Twitter: A survey of
professionals”, Journal of Intelligence Studies in Business, Vol. 7, No. 3, pp. 50-61.
Sorensen, H., Bogomolova, S., Anderson, K., Trinh, G., Sharp, A., Kennedy, R., Page, B., and Wright, M. (2017), “Fundamental patterns of in-store shopper behavior”, Journal of Retailing and Consumer Services, Vol. 37, July, pp. 182-194.
Springer, J. (2017), “Google exec: Analysis goes deeper, cheaper”, Supermarket News, March 23, available at:
https://goo.gl/ncfsxy.
SSI (2018), “Online surveys”, available at: https://goo.gl/A1scYH (accessed February 15, 2018).
Statista (2017a), “Dossier: market research”, available at https://goo.gl/A2e6hb (accessed February 8, 2018).
Statista (2017b), “Dossier: mobile search”, available at https://goo.gl/e8prXU (accessed February 8, 2018).
Statista (2018a), “Global spam volume as percentage of total E-mail traffic from January 2014 to September 2017, by month”, available at: https://goo.gl/iJbRCz (accessed February 16, 2018).
Statista (2018b), “Leading market research companies worldwide in 2016”, available at: https://goo.gl/YK2T6k (accessed February 16, 2018).
Stewart, D.W. and Shamdasani, P. (2017), “Online focus groups”, Journal of Advertising, Vol. 46, No. 1, pp. 48-60.
Stipp, H. (2016), “What 80 years of study means for the future of advertising research”, Journal of Advertising Research, Vol. 56, No. 3, pp. 231-234.
Stocking, G. and Matsa, K. (2017), “Using Google Trends data for research? Here are 6 questions to ask”, April 27, available
at: https://goo.gl/5LznUc.
Struminskaya, B. (2016), “Respondent conditioning in online panel surveys”, Social Science Computer Review, Vol. 34, No. 1, pp. 95-115.
Sue, V.M. and Ritter, L.A. (2012), “Planning the online survey”, Conducting Online Surveys, 2e, Sage Publications, pp. 14-32.
SurveyMonkey (2018), “The pros and cons of incentivizing”, available at: www.surveymonkey.com/mp/survey-rewards/ (accessed February 12, 2018).
Techopedia (2018a), “Computer literate”, available at: www.techopedia.com/definition/23303/computer-literate (accessed February 24, 2018).
Techopedia (2018b), “Online survey”, available at: www.techopedia.com/definition/27866/online-survey (accessed January 29, 2018).
Terhanian, G., Bremer, J., Olmsted, J., and Guo, J. (2016), “A process for developing an optimal model for reducing bias in
nonprobability samples”, Journal of Advertising Research, Vol. 56, No. 1, pp. 14-24.
Thakur, R. and AlSaleh, D. (2018), “A comparative study of corporate user-generated media behavior: Cross-cultural B2B
context”, Industrial Marketing Management, full citation available at: https://doi.org/10.1016/j.indmarman.2018.02.004.
Tichon, J.G. and Mavin, T. (2017), “Experiencing resiliency via video games”, Social Science Computer Review, Vol. 35,
No. 5, pp. 666-675.
Ting, P-J.L., Chen, S-L., Chen, H., and Fang, W.C. (2017), “Using big data and text analytics to understand how customer
experiences posted on yelp.com impact the hospitality industry”, Contemporary Management Research, Vol. 13, No. 2,
pp. 107-130.
Vakharia, D. and Lease, M. (2015), “Beyond Mechanical Turk: An analysis of paid crowd work platforms”, Proceedings of iConference 2015, available at: https://goo.gl/tvwy2f.
Vannette, D. (2015), “10 tips for building effective surveys”, August 10, available at: www.qualtrics.com/blog/10-tips-for-building-effective-surveys/.
Vannieuwenhuyze, J., Loosveldt, G., and Molenberghs, G. (2010), “A method for evaluating mode effects in mixed-mode surveys”, Public Opinion Quarterly, Vol. 74, No. 5, pp. 1027-1045.
Vehovar, V. and Manfreda, K.L. (2016), “Overview: Online surveys”, in N.G. Fielding, R.M. Lee, and G. Blank (eds.), The Sage Handbook of Online Research Methods, 2e, Sage Publications, London.
Walker, R.W. and Cook, W.A. (2013), “You can’t put a price tag on a survey participant’s enjoyment”, Journal of
Advertising Research, Vol. 53, No. 3, pp. 254-257.
Ward, M.K., Meade, A.W., Allred, C.M., Pappalardo, G., and Stoughton, J.W. (2017), “Careless response and attrition as
sources of bias in online survey assessments of personality traits and performance”, Computers in Human Behavior, Vol.
76, pp. 417-430.
Weber, M., Denk, M., Oberecker, K., Strauss, C., and Stummer, C. (2008), “Panel surveys go mobile”, International
Journal of Mobile Communications, Vol. 6, No. 1, pp. 88-107.
Webster, E., Sukaviriya, N., Chang, H-Y., and Kozloski, J. (2017), “Predicting cognitive states from wearable recordings
of autonomic function”, IBM Journal of Research and Development, Vol. 61, No. 2/3, pp. 2-11.
Wells, T. (2015), “What market researchers should know about mobile surveys”, International Journal of Market
Research, Vol. 57, No. 4, pp. 521-532.
Wessling, K.S., Huber, J., and Netzer, O. (2017), “MTurk character misrepresentation: Assessment and solutions”, Journal
of Consumer Research, Vol. 44, No. 1, pp. 211-230.
Wiese, M., Van Heerden, G., and Maree, T. (2017), “Dynamite in small packages: The engaged elite as a Facebook emerging niche market”, African Journal of Information Systems, Vol. 9, No. 1, pp. 36-61.
Williams, S.G. (2012), “The ethics of Internet research”, Online Journal of Nursing Informatics, Vol. 16, No. 2,
available at: http://ojni.org/issues/?p=1708.
Wouters, K., Maesschalck, J., Peeters, C.F.W., and Roosen, M. (2014), “Methodological issues in the design of online
surveys”, Journal of Business Ethics, Vol. 120, No. 2, pp. 275-289.
Xing, Y. and Handy, S. (2014), “Online versus phone surveys”, Transportation Planning and Technology, Vol. 37, No. 6,
pp. 554-567.
Yaqoob, I., Hashem, I.A.T., Gani, A., Mokhtar, S., Ahmed, E., Anuar, N.B., and Vasilakos, A.V. (2016), “Big data: From beginning to future”, International Journal of Information Management, Vol. 36, No. 6, pp. 1231-1247.
Yeager, D.S., Krosnick, J.A., Chang, L., Javitz, H.S., Levendusky, M.S., Simpser, A., and Wang, R. (2011), “Comparing the accuracy of RDD telephone surveys and Internet surveys conducted with probability and non-probability samples”, Public Opinion Quarterly, Vol. 75, No. 4, pp. 709-747.
Žák, S. (2015), “The identification of innovative research methods and techniques utilized in marketing research in the digital era”, Studia commercialia Bratislavensia, Vol. 8, No. 29, pp. 139-152.
Zarpelão, B.B., Miani, R.S., Kawakani, C.T., and de Alvarenga, S.C. (2017), “A survey of intrusion detection in Internet of Things”, Journal of Network and Computer Applications, Vol. 84, No. C, pp. 25-37.
Zhang, K.Z.K., Cheung, C.M.K., and Lee, M.K.O. (2014), “Examining the moderating effect of inconsistent reviews and its gender differences on consumers' online shopping decision”, International Journal of Information Management, Vol. 34, No. 2, pp. 89-98.
Zhang, M., Ding, S. and Bian, Y. (2017), “The online reviews’ effects on Internet consumer behavior”, Journal of
Electronic Commerce in Organizations, Vol. 15, No. 4, pp. 83-94.
Zhang, X., Baker, K., Pember, S., and Bissell, K. (2017), “Persuading me to eat healthy”, Southern Communication Journal, Vol. 82, No. 1, pp. 38-51.
Zhu, Y. and Meyer, J. (2017), “Getting in touch with your thinking style: How touchscreens influence purchase”, Journal
of Retailing and Consumer Services, Vol. 38, September, pp. 51-58.

Biographies:
Joel R. Evans, Ph.D., is the RMI Distinguished Professor at the Zarb School of Business, Hofstra
University, New York. He is author or co-author of numerous scholarly journal articles, including
two published in Internet Research. He serves on the editorial board of Industrial Marketing Management, a leading business journal. Dr. Evans is co-author of Marketing in the 21st Century,
12e and co-author of Retail Management: A Strategic Approach, 13e. He has a popular marketing
blog, “Evans on Marketing” (www.evansonmarketing.com), with about 1,800 posts and a global
audience.

Anil Mathur, Ph.D., is the Brodlieb Distinguished Professor of Business and Chairperson of the
Marketing and International Business Department at the Frank G. Zarb School of Business, Hofstra
University, New York. He is co-author of Baby Boomers and their Parents: Surprising Findings
about Their Lifestyles and Well-Being (Paramount Publishing) and The Maturing Marketplace:
Buying Habits of Baby Boomers and their Parents (Quorum Books). Dr. Mathur has also authored
or co-authored over 100 articles and papers that have been published in scholarly journals and
conference proceedings.
Figure 1

The Strengths and Potential Weaknesses of Online Surveys

Source: Evans, J.R. and Mathur, A. (2005), “The value of online surveys”, Internet Research, Vol. 15, No. 2, p. 197.

[NOTE: THERE SHOULD BE AN ELECTRONIC VERSION OF THIS FILE AVAILABLE FROM THE PUBLISHER]
Figure 2

Considerations in Planning and Executing Future Survey Research Projects

Better defining concepts:
• “Online” versus “digital” research
• Broadened notion of a “survey”
• Traditional versus new kinds of surveys
• Social media content analysis

Better understanding the future role of surveys:
• Best uses of surveys
• Best uses of other forms of data collection
• Blended (hybrid) research formats
• In-depth coverage of the future role of surveys (see Table 2)

Better developing and implementing surveys:
• Consistency in survey methodology
• Full methodology communicated
• In-depth methodology guidelines (see Table 3)

Abiding by a strong code of ethics:
• Importance of an ethics code
• Academics and practitioners
• Numerous ethical issues
• In-depth ethical guidelines (see Table 4)