
The current issue and full text archive of this journal is available on Emerald Insight at:

www.emeraldinsight.com/2046-469X.htm

JIEB
11,2

Reducing statistics anxiety using limited teaching resources

Jonas Nilsson and Jeanette Carlsson Hauff
Goteborgs Universitet, Goteborg, Sweden

Received 23 March 2018
Revised 2 May 2018
Accepted 4 May 2018

Abstract
Purpose – Students in the marketing discipline have been reported to struggle with quantitative methods.
This paper aims to focus on whether it is possible to increase student confidence and reduce anxiety with
quantitative data analysis even when limited teaching resources are available. It reports on two half-day
initiatives to teach quantitative methods that followed the principles of integration of method into a
substantive course (as opposed to stand-alone course) and hands-on approach (as opposed to using a
theoretical and hands-off approach).
Design/methodology/approach – Over the course of three semesters, 92 students who took part in the
sessions answered a survey in which they reported their basic understanding, confidence, practical abilities
and anxiety with quantitative methods.
Findings – The results indicate significant improvements in self-reported basic understanding, confidence
and practical abilities, as well as reduced anxiety. Further analysis indicated that neither gender nor previous
statistical background had an impact on the perceived benefit of the initiative.
Practical implications – In all, the study indicates that integration and hands-on approaches may be
beneficial in reducing anxiety and increasing confidence with quantitative data analysis, even when the
initiative is limited in time and resources.
Originality/value – The study presents an approach to reducing anxiety and increasing confidence with
quantitative data analysis. Teaching initiatives like this may be beneficial in situations when students
experience high levels of statistics anxiety.
Keywords Student confidence, Student anxiety, Teaching quantitative data analysis
Paper type Research paper

Introduction
In many different academic disciplines, skills in, and an understanding of, quantitative
methods is important (Onwuegbuzie and Wilson, 2003; Bridges et al., 1998). Knowledge of
quantitative methods is not only needed for students to perform research that answers
certain “quantitative” research questions but also serves a societal function: to understand
and critically evaluate research and societal trends (Markham, 1991). For students, a good
foundation in quantitative methods is often also necessary for many tasks that academia
prepares them for. The marketing discipline is no exception here (Tarasi et al., 2013;
Dobni and Links, 2008), especially considering the recent focus on marketing analytics,
forecasting and data-driven decision making in firms (Wilson et al., 2018; Liu and Levin,
2018; Spralls and Wilson, 2016). Moreover, a good knowledge of quantitative methods can
aid with tasks such as estimating the size of a market or, through surveys, performing
market research that gives insight into consumer purchasing patterns.
Journal of International Education in Business, Vol. 11 No. 2, 2018, pp. 312-323
© Emerald Publishing Limited, 2046-469X
DOI 10.1108/JIEB-03-2018-0010

However, despite its practical and theoretical usefulness, students and teachers within
the social sciences often struggle with the subject. In the student population, courses on
quantitative methods have been observed to cause negative feelings and anxiety (Chew and
Dillon, 2014; Einbinder, 2014; Bridges et al., 1998). Students have, for instance, been known
to adopt a defeatist attitude based on poor self-confidence in mathematical skills
and may withdraw psychologically instead of developing learning strategies (Sundt, 2010;
Paxton, 2006). Teachers often confirm this anxiety and poor knowledge and have been
reported to experience trouble in adapting their teaching to accommodate these issues
(Sankowsky, 2006; McBride, 1994). It has even been suggested that teachers, due to the
negative view among students, actually avoid teaching quantitative method courses for fear
of poor student evaluations (Bridges et al., 1998).
Within the business discipline, marketing is often seen as one of the least quantitative
disciplines. Thus, compared to more quantitatively oriented subjects, it is not surprising that
studies have shown that marketing students hold negative attitudes toward quantitative methods.
For example, in a study on students in Business Administration, Tarasi et al. (2013) found
that marketing majors had lower levels of “quantitative affinity” than non-marketing majors.
To deal with the specific difficulties encountered when teaching quantitative methods,
previous research within different social sciences has put forth a number of ideas and
strategies. Two of the “remedies” that have been suggested are to use integration (to deal
with methods within a substantive topic class) and a practical hands-on approach (to
emphasize “doing” as opposed to theory) (Atkinson et al., 2006; Bridges et al., 1998; Paxton,
2006). Here, previous research has highlighted that integration can increase perceived
relevance of quantitative methods by highlighting questions that are perceived as
interesting and important within the discipline, while a hands-on approach can build
confidence in “doing” and increase practical skills.
This paper reports on an initiative that applies these suggestions in the context of a
Swedish undergraduate marketing program, with the overall aim to investigate whether an
integrated hands-on approach to teaching quantitative data analysis can yield positive
learning, confidence and anxiety outcomes, even when limited time is paid to the topic. The
initiative that is reported on is two “quantitative method half-days” that took place within an
undergraduate course in Consumer Behavior. It was specifically designed along the lines of
integration and hands-on thinking: students were confronted with empirical questions from
consumer behavior and had to design their own questionnaire and later analyze the data
that the questionnaire generated.
Using this initiative as a case, the following research questions are addressed in this paper:

RQ1. Can a teaching initiative using limited time and resources, where quantitative
method is integrated into the course in a hands-on fashion, be a way to generate basic
understanding of, and build practical abilities for, quantitative data analysis, as well as
reduce student anxiety and build confidence regarding quantitative data analysis?
RQ2. What is the influence of previous statistical background and gender on the
perceived benefit of the initiative?

The paper proceeds as follows. First, previous literature regarding teaching quantitative
data analysis is reviewed. After that, the initiative is described in detail. This is followed by
a description of the method and the results. Finally, the results are discussed and
conclusions are drawn.

Previous literature
Previous research has highlighted several aspects of the pedagogy of quantitative methods
within the general social science domain. Below, we review two aspects of this literature: the
difficulties of quantitative methods from a student perspective and the difficulties from a
teacher perspective.
Student difficulties and reactions to quantitative methods: anxiety and low confidence
In many academic programs, courses on research methods are obligatory (Parker et al.,
1999). The reason for this is sound; making sense of data is an essential skill both in
academia and in many roles in society (Bridges et al., 1998; Markham, 1991). However,
previous research highlights that this enthusiasm for skills in research methods is not
shared by all students. Instead, students in several disciplines have been reported to want to
delay or avoid research method courses in general, and quantitative method courses in
particular are often seen as negative by students in the social sciences (Paxton, 2006;
Sankowsky, 2006).
Among teachers of quantitative methods, who have to deal with students who hold
negative attitudes toward the topic, both anxiety and a lack of motivation have been
observed in the student population (Wilder, 2010). Onwuegbuzie and Wilson (2003)
argue that “statistics anxiety” can seriously harm academic performance and found
evidence that certain groups of students, such as women and older students, perceive a
higher level of anxiety than younger male students. Anxiety often goes hand-in-hand
with a lack of motivation to learn (Onwuegbuzie and Wilson, 2003; Sundt, 2010). Here,
research methods in general, and quantitative methods in particular, are often
described as unpleasant, uninteresting or boring (Markham, 1991). As personal interest
is a strong motivational factor for learning, the lack of interest is a serious inhibitor of
learning (Sundt, 2010).
Several factors have been suggested to be the root causes of anxiety and lacking
motivation. One such, which has received considerable attention, is that of self-confidence or
self-efficacy (Paxton, 2006; Sundt, 2010). Self-confidence has in previous psychologically
oriented studies been related to various cognitive capabilities, such as problem solving
(Metcalfe, 1986) and recognition (Schachter, 1983). Low levels of perceived competence have
further been shown to have an impact on behavioral outcomes within the educational
system, such as choice of specialization of college majors (Hackett and Betz, 1989). The
connection to anxiety is highlighted by Bridges et al. (1998), who argue that “Those who feel
incapable of doing mathematical operations often experience extreme anxiety about the
simplest statistical operations” (p. 15). This observation is confirmed in a study on
determinants of anxiety in which Onwuegbuzie (2000) highlights that the students with the
lowest level of academic confidence also experience the highest levels of anxiety. As many
students have low self-esteem when it comes to mathematics (Lalayants, 2012), it is thus not
surprising that anxiety is a widespread phenomenon among students.
On top of the issue of self-confidence, another reason that students may lack
motivation is a failure to see “the use” of quantitative methods (Sundt, 2010; Lalayants,
2012). Students may simply not see when they will be able to apply the knowledge,
especially if this knowledge is perceived as abstract (Einbinder, 2014). In explaining why
students with low statistics affinity choose to study marketing, Tarasi et al. (2013) note that
“without knowing much about the complexity of marketing, students may anticipate that it
is less quantitatively oriented than other disciplines.” This failure to see the relevance of
quantitative methods to a future career is likely a cause of a lack of motivation to learn.
However, as noted above, the reality is that many tasks actually involve quantitative
literacy. Thus, students may be in for a tough awakening when applying for positions after
their graduation (Aggarwal et al., 2007).

Teaching difficulties and strategies to enhance understanding and practical abilities


Due to anxiety and lack of motivation, as mentioned above, teachers of quantitative methods
face unique challenges (Paxton, 2006). Against this background, teachers and researchers
have made suggestions on how to overcome the difficulties associated with teaching Statistics
quantitative data analysis. Two such strategies include integrating quantitative methods in anxiety
substantive courses and using a practical hands-on approach. These two are discussed
further below.
Teaching strategy 1: integrating quantitative methods in substantive courses. One of the
main strategies that has been suggested to increase students’ knowledge of quantitative
methods is to integrate methodology into substantive “non-method” courses. Within the
topic of sociology, Markham (1991) argues that statistics should be taught within
substantive courses as it is the most effective way to reach students. Since then, several
other researchers and teachers have highlighted the benefits of doing so. Here, Atkinson
et al. (2006) show that students display a better grasp of course material when quantitative
methods are integrated into the introductory course. Bridges et al. (1998) showed that
students’ abilities to interpret empirical data increased significantly after quantitative
reasoning was included in the course.
One of the major benefits of integrating quantitative methods into substantive courses
is likely that it highlights “the use” of quantitative methods. By connecting quantitative
methods to the actual discipline, understanding the methods becomes a way to understand
and grasp the core content of the discipline. In this way, integration may be a way to address
the lack of motivation reported on above. An increased sense of relevance is also likely to
result in increased effort (Sundt, 2010).
Teaching strategy 2: applying a “hands-on” practice approach. In many ways, the
“language” and theory of quantitative methods is technical, and understanding this requires
a lot from students in terms of meta- and higher order thinking (Sundt, 2010). For many
students, the use of formulas and Greek letters is likely to be seen as something that makes
the understanding of statistics extra difficult. For example, in a survey of graduate students
by Lalayants (2012), more than half admitted that they felt intimidated dealing with
formulas and statistical calculations.
One way to reduce the negative association of a technical language may be to emphasize
the “doing” of quantitative data analysis along with theory (Sundt, 2010). When teaching
quantitative methods, this can be done through labs or cases that clearly teach
practical skills and connect theory to application. Several studies have highlighted such
initiatives. For example, Paxton (2006) devised an exercise to address student anxiety and
motivation by making statistics seem relevant and showing students that they could “do”
statistics. Liau et al. (2015) found positive effects of giving students hands-on exercises and
data sets in a course on statistics. Harder (2010) presents a model for teaching where
students are allowed to work with real data, and Stork (2003) highlights positive results
using student survey data. In a survey of alumni, Wilder (2010) finds that a large share of
the written comments ask specifically for active learning opportunities and computer
skills.
Much of the research reviewed above thus indicates that the transition from “reading
formulas” to “practicing” statistics is good for learning and confidence. A hands-on
approach, starting from actual issues and emphasizing practical skills, may thus be a way to
reduce the anxiety and reluctance associated with quantitative methods.

The case: two half-days of quantitative research


The initiative that is used as a case in this paper is “two quantitative method half-days,”
specifically designed to highlight the principles of integration and hands-on approach that
were reviewed above. The sessions, and the principles involved, are briefly described below.
The course and context: integration of research method into a substantive course
To increase the understanding of research methods among undergraduate students, a
“research-in-action” week was developed and given within an undergraduate course on
Consumer Behavior. The research-in-action week was designed so that research
methods would be integrated into the course; students had to choose a research
question that relates to the topic of consumer behavior and then, through the use of
various methodological techniques, try to answer the question. This meant that
research methods were presented to the students as an integral part of
understanding consumer behavior and not something that was external to the topic.
Several research methods, such as interviews, observation and survey research, were
highlighted during the week. Of the four days allocated to the week, two half-day sessions
were specifically earmarked for quantitative survey research. These sessions were taught
by one of the authors of this paper.

Structure of the sessions: a hands-on approach


The basic structure of the two sessions followed the steps of an actual research process. The
basic idea was that students would go through the steps of doing quantitative survey
research, from constructing statements for a questionnaire to analyzing the output, all
within two sessions. The sessions thus followed a strict hands-on approach: the major focus
was on “doing” research, with little theory or lecturing.
Prior to the sessions, students were divided into groups and introduced to the objectives
of the research-in-action week. They were instructed that for the session, their group would
have to develop a research question relating to consumer behavior that they wanted to do
research on. This question would then be the base of their activities during the week.
The sessions on quantitative survey research were then structured into four distinct
phases. The first two dealt with constructing a questionnaire, while the following two dealt
with how to analyze data in SPSS.
First half-day session: constructing a questionnaire. The first half-day session was
dedicated to constructing a questionnaire regarding the research question that the
students had previously developed. After a short recap of different types of questions and
answer scales (Likert, semantic differential, etc.) and a quick walk-through of a Web-
survey program, the students were given an assignment to develop a questionnaire. For
practical reasons, the questionnaire was to contain only a limited number of questions
(approximately 5-6 metric and 2-3 categorical).
After approximately 1-1.5 h of working on their questionnaire, the groups were
allocated to a “feed-back group.” The purpose of the feed-back group session was to
improve the questionnaires by allowing other groups to take the survey and give
constructive feedback.
After the feed-back sessions, the survey was sent out to the other groups in the class
through the Web-based survey program. Depending on the semester, there were 14-18 groups in
the course. Each participant in the class thus had to answer 13-17 short questionnaires in the
final minutes of the session. This procedure meant that each group had about 60-85 answers
to their survey. These data sets would make up the data to be used in the subsequent
sessions on data analysis.
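As a rough consistency check of these numbers, the arithmetic above can be sketched as follows. The group count and group size are assumptions chosen for illustration; the paper reports only the ranges (14-18 groups, 13-17 questionnaires per student, 60-85 answers per survey).

```python
# Back-of-envelope check of the survey-response arithmetic described above.
# The exact group count and size are illustrative assumptions.
n_groups = 15          # assumed number of groups in one semester (paper: 14-18)
group_size = 5         # assumed students per group
total_students = n_groups * group_size

# Each student answers every other group's questionnaire once, so each
# group's survey receives one answer per student outside that group.
questionnaires_per_student = n_groups - 1
answers_per_group = total_students - group_size

print(questionnaires_per_student)  # 14, within the reported 13-17
print(answers_per_group)           # 70, within the reported 60-85
```

Under these assumptions, the illustrative figures fall inside the ranges reported in the text.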
Second half-day session: becoming familiar with SPSS and analyzing own questionnaire.
The primary objective of the second session was to build confidence with quantitative data
analysis by familiarizing participants with SPSS and practicing the final stages of a research
process. In the first stage, labs were performed with data provided by the instructor. For
this, the students were divided into separate lab-groups, with about 20 students in each
group. During the lab, tasks were solved collectively in the seminar-group, and aspects such
as statistical significance, hypothesis-testing, inferential statistics and the design and structure
of SPSS were reviewed and touched upon. At the end of this lab session, the goal was that
each participant should have become familiar with how to perform basic descriptive and
inferential tasks in SPSS.
After this general review of SPSS, the final session commenced. In the final 2-h session,
the students had to work independently (in groups) with the data that they themselves had
generated during the morning session. Here, the groups could “try their wings,” performing
the analyses previously touched upon on their own data. This process also allowed the
students to discover strengths and weaknesses in the data material that they had generated
and learn from their mistakes. For instance, several groups noticed that it was difficult to
analyze certain questions due to the manner in which the question had been asked. The
instructor was at hand for support and continuously maintained a dialogue with, and
monitored the progress of, the groups.
Examination and grading. The research-in-action week was part of the examination of
the overall course, and the students’ grade was thus in-part determined by their performance
during the sessions. However, the teaching team highlighted that the results of the research
would not be graded, only methodological awareness at the end of the sessions. Thus, if a
group discovered that the way they had asked questions was not good, it would still be
acceptable if they showed that they understood this and outlined how they should have
proceeded instead. This was examined with a small reflection assignment handed in at the
end of the week.

Method
To test whether the two sessions increased perceived confidence and reduced anxiety with
quantitative research methods, a questionnaire was distributed to the participants
approximately three weeks after the sessions took place. Data collection took place over
three semesters.
To test RQ1, on whether a short initiative using integration and hands-on strategies
could improve student confidence and reduce anxiety, one part of the questionnaire asked the
respondents to rate their own perceived learning during the two sessions. To capture this,
four aspects were included in the survey:
 perceived understanding of quantitative data analysis (two items);
 perceived practical abilities in quantitative data analysis (two items);
 perceived confidence in quantitative data analysis (two items); and
 perceived anxiety for quantitative data analysis (one item).

The questions were structured so that respondents first stated what their level of
understanding/abilities was before they took part in the sessions and then estimated their
level after having taken part in the sessions. The statements were all measured on five-step
scales.
On top of this, background data on previous statistics courses and gender were collected.
Previous research has highlighted that gender and statistical background can influence how
students relate to quantitative research methods. For example, Tarasi et al. (2013) found that
female students and students who have not taken a marketing research class had a lower
quantitative affinity than male students and students who had taken a course on marketing
research. Gender and statistical background data were thus collected to see if they
influenced the perceived benefit of the initiative, in accordance with RQ2.
Sample
During the three semesters, a total of 215 students were enrolled in the class. Of these, 123
answered the survey, which represents a response rate of approximately 57 per cent. After
removing students who took part in less than 80 per cent of the activities during the
two half-days, 92 respondents remained for the analysis, representing 43 per cent of the
students enrolled in the course over the three semesters.
The respondents of the survey were predominately female (64.5 per cent) and had an
average age of 25. Regarding previous courses in statistics (or other quantitative data
analysis), 24 per cent had taken one course or less, while 20 per cent had taken more than two courses.

Results of the survey


Two research questions were asked at the beginning of this paper. RQ1 focused on whether
a teaching initiative using limited time and resources, where quantitative method is
integrated into the course in a hands-on fashion, can be a good way to generate basic
understanding and practical abilities and to reduce student anxiety and build confidence
regarding quantitative method. RQ2 focused on whether statistical background and gender
would have an impact on the perceived benefit of the initiative.

Can a limited teaching initiative build confidence and reduce anxiety?


To test RQ1, students were asked to evaluate their basic understanding, abilities, confidence
and anxiety prior to and after taking part in the sessions. To evaluate whether the sessions
had an impact, the average scores of their self-evaluations before and after the sessions
were compared using paired samples t-tests. The results are presented in Table I.
As can be seen in the table, the sessions resulted in significant self-reported
improvements on all measured aspects. Students taking part in the sessions felt that they knew
more, had better practical abilities, had an increased level of confidence and experienced less
anxiety. Particularly large differences were observed for comfort with SPSS. Given the
hands-on nature of the initiative, it is not surprising that such practical abilities showed the
largest improvement.
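The paired samples t-test behind these comparisons can be illustrated with a short sketch. The eight before/after ratings below are invented for illustration; they are not the study's data, and the study's own analysis was run in SPSS on 92 respondents (Table I).

```python
# Illustrative paired samples t-test on invented five-point self-ratings
# from the same (hypothetical) students before and after the sessions.
from scipy import stats

before = [2, 3, 2, 3, 2, 3, 2, 2]
after = [3, 4, 3, 4, 2, 4, 3, 3]

# ttest_rel pairs each student's two ratings and tests whether the
# mean difference is zero.
t_stat, p_value = stats.ttest_rel(after, before)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Because nearly every invented pair improves by one point, the test rejects the null hypothesis of no change at p &lt; 0.01, mirroring the pattern of significant improvements reported in Table I.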

Did gender or statistical background influence the perceived benefit of the initiative?
To examine whether gender and previous background in statistics influenced what students
thought of the initiative, the survey contained two questions on satisfaction with the
sessions and whether the sessions made them feel better prepared for future research tasks.
Tables II and III show the results of a t-test (for gender) and an ANOVA (for previous statistics
background). As can be seen, all groups rated the initiative fairly high, between 3.6 and 3.9
on a five-grade scale, and none of the tests was significant.
In all, this indicates that perceived benefit and satisfaction with the initiative were not
related to previous statistical background or gender.
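The two group comparisons can likewise be sketched: an independent samples t-test for the two gender groups and a one-way ANOVA across the three background levels. All ratings below are invented for illustration; the study's own figures appear in Tables II and III.

```python
# Illustrative versions of the two group comparisons: an independent
# samples t-test (two groups) and a one-way ANOVA (three groups),
# all on invented five-point satisfaction ratings.
from scipy import stats

# Independent samples t-test, e.g. male vs female ratings
male = [4, 4, 3, 4, 5, 4]
female = [4, 3, 4, 4, 3, 4]
t_stat, t_p = stats.ttest_ind(male, female)

# One-way ANOVA, e.g. none/low, medium and high prior-statistics credits
low = [4, 3, 4, 4]
medium = [4, 4, 3, 4]
high = [3, 4, 4, 3]
f_stat, f_p = stats.f_oneway(low, medium, high)

print(f"t-test: p = {t_p:.3f}; ANOVA: p = {f_p:.3f}")
```

With group means this close together, neither test reaches significance, which is the same qualitative outcome the paper reports for gender and statistical background.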

Analysis and discussion


This paper started out with the observation that students and teachers alike struggle with
quantitative data analysis. Students have been known to experience “statistics anxiety” and
a lack of motivation, often anchored in poor self-confidence, and a failure to see the use of
quantitative data analysis. Teachers have experienced troubles dealing with these issues
and have been known to shy away from teaching quantitative courses out of frustration and
fear of poor student reviews.
Table I. Perceived understanding, abilities, confidence and anxiety prior to and after the sessions

Item | Prior to sessions (Mean) | After sessions (Mean) | t | Sig.*

Perceived understanding of quantitative data analysis(1)
What is (was) your level of understanding of basic statistical concepts (such as statistical significance, independent and dependent variables, etc.)? | 3.1 | 3.7 | 7.29 | <0.01
What is (was) your level of knowledge about the consequences of different ways to ask questions for subsequent statistical analysis (for example, the consequence of asking questions on a 1-5 scale or a yes/no scale)? | 2.9 | 3.7 | 9.47 | <0.01

Perceived practical abilities in quantitative data analysis(2)
What is (was) your level of comfort using SPSS (or other statistics software) for your bachelor thesis or other research task? | 2.0 | 3.2 | 11.32 | <0.01
What is (was) your level of comfort in performing bivariate or multivariate analyses, such as t-test and ANOVA, for your bachelor thesis or other research task? | 2.3 | 3.1 | 8.38 | <0.01

Perceived confidence in quantitative data analysis(3)
How confident are (were) you in knowing what statistical test is appropriate for different situations and data? | 2.5 | 3.3 | 9.31 | <0.01
How confident are (were) you in your abilities to use quantitative methodology (such as sending out a questionnaire and analyzing it statistically) for your upcoming bachelor thesis or other similar research task? | 2.8 | 3.6 | 7.91 | <0.01

Perceived anxiety for quantitative data analysis(4)
How intimidating or “scary” do (did) you find quantitative data analysis if you would have to use it as the main method for your bachelor thesis or other research task? | 2.8 | 2.5 | 3.04 | <0.01

Notes: *Paired samples t-test; (1) five-step scale, anchored with low understanding-high understanding; (2) five-step scale, anchored with uncomfortable-comfortable; (3) five-step scale, anchored with very unconfident-very confident; (4) five-step scale, anchored with not scary at all-very scary

Table II. Perceived benefit of the initiative by gender

Item | Male | Female | Sig.
I feel better prepared for writing a bachelor thesis (or performing other research task) after having taken part in the quantitative sessions of the research-in-action week | 3.9 | 3.7 | 0.374
Overall, I am satisfied with the sessions | 3.9 | 3.8 | 0.499

Note: n: 92 (32 men and 60 women)
Against this background, this paper addressed the effectiveness of two measures that can be
taken with limited resources and time: a hands-on integration approach to
teaching quantitative methods. It reported on two “quantitative method sessions” that took
place within a substantive marketing course in Consumer Behavior within an
undergraduate program in Business Administration in Sweden.
The results of the survey highlight that using integration (as opposed to stand-alone
courses) and a hands-on approach (as opposed to a more theoretical, lecture setting) may be
a good way to teach quantitative research methods. While no objective measure of learning
was used, the self-reported items speak for themselves: students experienced an increase in
confidence, practical abilities and basic understanding, while anxiety was reduced. In a
situation where anxiety is high, and where this anxiety can lead to poor performance
(Onwuegbuzie, 2000), anything that can positively influence student perceptions of their own
abilities, understanding, confidence and potential anxiety is valuable. This may encourage
students to develop good learning strategies, learn more and make them more receptive in
classes and other initiatives dealing with quantitative data analysis.
The results revealing an increase in confidence may also be viewed in the light of the vast
literature on self-assessed levels of statistical capabilities. We note that two of the
suggestions brought forward to increase confidence are similar to the teaching techniques
used in our study: obtaining feedback and completion of (supposedly) difficult statistical
tasks (Warwick, 2008). Beginning with feedback given, the groups of students were
particularly instructed to give constructive feedback to each other during the first half-day
session. The relatively small groups (20 students) also enabled the teacher to, in the
subsequent session, give individual feed-back regarding SPSS, in particular. This aligns
well with the suggestions of Warwick (2008) to give feedback on particular aspects of the
tasks given, not just a general “well done.” The second suggested teaching strategy to
increase confidence was enabling a completion of seemingly difficult statistical tasks. The
use of self-generated data sets, and the continuously increasing level of difficulty (e.g.
introducing the explicit task of scanning for weaknesses and deficiencies in the material),
most likely encouraged the students to perform tasks that they a priori had not felt
comfortable with.
Prior to going into the sessions, it was thought that the benefit of such an initiative would
differ depending on gender and previous statistical background. As for previous statistical
background, the working hypothesis was that the benefit of the initiative would diminish for
students who had a strong background in statistics. After all, the initiative was only made
up of two sessions, covering the very basics of constructing questionnaires, using
SPSS and performing hypothesis testing. However, analysis of the surveys proved this
wrong; there were no differences in perceived usefulness depending on previous statistics
credits (Table III).

Table III. Perceived benefit of the initiative by previous statistical background

                                                     None or low     Medium           High (15.1 ECTS
                                                     (0-7.5 ECTS)*   (7.6-15 ECTS)*   or more)*         Sig.
I feel better prepared for writing a bachelor
thesis (or performing other research task) after
having taken part of the quantitative sessions of
the research in action week                          3.8             3.9              3.6               0.684
Overall, I am satisfied with the sessions            3.9             3.7              3.8               0.844

Notes: *Although statistics courses can be different in extent, one course usually encompasses 7.5 ECTS
credits; n: 93 (none or low – 30; medium – 44; high – 19)

One likely explanation for this is that statistics courses often are quite general,
whereas this initiative focused on practical skills within the marketing discipline. Getting an
insight into how quantitative data analysis is performed in the actual topic of interest can
possibly increase the value and make the theory more understandable.
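The three-group comparison reported in Table III was performed in SPSS. For readers who wish to replicate this type of analysis, the sketch below shows how a one-way ANOVA across the three background groups could look in Python with SciPy. The response values are hypothetical, invented purely for illustration; they are not the study's survey data.

```python
from scipy import stats

# Hypothetical 5-point Likert responses to "I feel better prepared...",
# grouped by previous statistical background. Illustrative values only,
# not the data summarized in Table III.
none_low = [4, 3, 4, 5, 4, 3, 4, 4, 3, 4]  # 0-7.5 ECTS
medium = [4, 4, 3, 5, 4, 4, 3, 4, 5, 3]    # 7.6-15 ECTS
high = [3, 4, 4, 3, 4, 3, 4, 4, 3, 4]      # 15.1 ECTS or more

# One-way ANOVA: does mean perceived benefit differ between groups?
f_stat, p_value = stats.f_oneway(none_low, medium, high)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

# A p-value above 0.05 (as in Table III, where sig. = 0.684) would
# indicate no significant difference between background groups.
```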
As for gender, it was initially thought that female students would find the
initiative less useful than male students. Previous research has highlighted a gender
divide in attitudes toward quantitative methods, with male students expressing more
positive attitudes (Tarasi et al., 2013). Analysis of the surveys, however, showed no
gender differences in perceived usefulness of the initiative. A likely explanation for
this could be the strict integration of research methods into the substantive topic of the
course. While the sessions focused on methods, they never lost contact with the
consumer behavior framework and theory, which should be of interest to both male and
female participants in the course. Thus, by integrating methods, it may be possible to
reduce the negative feelings that come with research methods and thus limit the gender
gap in quantitative methods.
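A minimal sketch of such a two-group comparison, using Welch's t-test in Python with invented ratings (the study itself used SPSS, and these values are illustrative, not the actual survey data):

```python
from scipy import stats

# Hypothetical perceived-usefulness ratings by gender (1-5 scale);
# invented for illustration, not the study's survey responses.
female = [4, 3, 4, 4, 5, 3, 4, 4, 3, 4]
male = [4, 4, 3, 5, 4, 4, 3, 4, 4, 3]

# Welch's t-test (equal_var=False) avoids assuming equal group variances.
t_stat, p_value = stats.ttest_ind(female, male, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```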

Conclusions, reflections, and future approaches
The topic of this paper, student anxiety and poor confidence regarding statistics and
quantitative data analysis, is likely to remain an important question for the future.
As information becomes ever more widespread, building critically thinking individuals
will be a central task for academia, and statistical ability is one key component of this.
However, with high levels of anxiety, poor confidence and defeatist attitudes among many
students, and a resulting lack of knowledge, there is a clear risk that these ambitions may
fall short. Addressing the anxiety and poor confidence among students, to open the door to
independent and critical analysis, is thus an important task for universities.
This paper highlights that it is possible to integrate statistical method into substantive
courses and that doing so may be one way to reduce anxiety and build practical abilities
with quantitative data analysis. For the authors of this paper, who teach and supervise
students in quantitative methods, this is certainly positive news. Hopefully, this will address
the notion among students that they are not good enough in statistics to actually use
quantitative methods for papers and projects. Many times, the authors have been exposed to
some version of the attitude expressed in the statement “Well, I would like to do a
questionnaire study, but I am not going to since I don’t know enough to analyze the data.”
Addressing confidence here may result in fewer students avoiding quantitative methods, in
cases where it is the most suitable method to use to answer the research question.
However, it should be noted that the increased confidence and reduced anxiety are only
measures of subjective knowledge, which can be seen as what our participants think or feel
that they know (Brucks, 1985; Raju et al., 1995). It is unclear from the
study whether objective knowledge, the actual knowledge of the participants, was affected.
Thus, there is also a risk with these types of initiatives if they build confidence without
generating actual abilities. Future studies may address this aspect to see if, on top of
confidence and anxiety, objective measures of knowledge are affected.
This study also has a number of methodological limitations worth mentioning. First, like
many studies on statistics anxiety, the study did not use a control group (Chew and Dillon,
2014). Moreover, data were also collected at one point in time, as opposed to collecting data
prior to and after the initiative.
Finally, it should also be noted that the strategies included here in no way are aimed to
replace traditional courses but rather to complement them. A one-day (or one-week) initiative
can never replace introductory courses in statistics. Instead, what initiatives like this may do
is to build upon already acquired knowledge, giving confidence to use this knowledge and
highlight that quantitative data analysis is not a theoretical spectator sport, but rather a
craft that needs to be practiced. In this way, a holistic approach to quantitative data
analysis, where basic courses are complemented and coordinated with methodological parts
in substantive courses, is likely a good way forward.

References
Aggarwal, P., Vaidyanathan, R. and Rochford, L. (2007), “The wretched refuse of a teeming shore? A
critical examination of the quality of undergraduate marketing students”, Journal of Marketing
Education, Vol. 29 No. 3, pp. 223-233.
Atkinson, M., Czaja, R. and Brewster, Z. (2006), “Integrating sociological research into large introductory
courses: learning content and increasing quantitative literacy”, Teaching Sociology, Vol. 34 No. 1,
pp. 54-64.
Bridges, G., Gillmore, G., Pershing, J. and Bates, K. (1998), “Teaching quantitative research methods: a
quasi-experimental analysis”, Teaching Sociology, Vol. 26 No. 1, pp. 14-28.
Brucks, M. (1985), “The effects of product class knowledge on information search behavior”, Journal of
Consumer Research, Vol. 12 No. 1, pp. 1-16.
Chew, P.K.H. and Dillon, D.B. (2014), “Statistics anxiety update”, Perspectives on Psychological Science:
a Journal of the Association for Psychological Science, Vol. 9 No. 2, pp. 196-208.
Dobni, D. and Links, G. (2008), “Promoting statistical intelligence in marketing research students: best
practice ideas”, Marketing Education Review, Vol. 18 No. 1, pp. 61-64.
Einbinder, S. (2014), “Reducing research anxiety among MSW students”, Journal of Teaching in Social
Work, Vol. 34 No. 1, pp. 2-16.
Hackett, G. and Betz, N.E. (1989), “An exploration of the mathematics self-efficacy/mathematics
performance correspondence”, Journal for Research in Mathematics Education, Vol. 20 No. 3, pp. 261-273.
Harder, J. (2010), “Overcoming MSW students’ reluctance to engage in research”, Journal of Teaching in
Social Work, Vol. 30 No. 2, pp. 195-209.
Lalayants, M. (2012), “Overcoming graduate students’ negative perceptions of statistics”, Journal of
Teaching in Social Work, Vol. 32 No. 4, pp. 356-375.
Liau, A., Kiat, J. and Nie, Y. (2015), “Investigating the pedagogical approaches related to changes in
attitudes toward statistics in a quantitative methods course for psychology undergraduate
students”, The Asia-Pacific Education Researcher, Vol. 24 No. 2, pp. 319-327.
Liu, Y. and Levin, M.A. (2018), “A progressive approach to teaching analytics in the marketing
curriculum”, Marketing Education Review, Vol. 28 No. 1, pp. 14-27.
McBride, A. (1994), “Teaching research methods using appropriate technology”, PS: Political Science
and Politics, Vol. 27 No. 3, pp. 553-557.
Markham, W. (1991), “Research methods in the introductory course: to be or not to be?”, Teaching
Sociology, Vol. 19 No. 4, pp. 464-471.
Metcalfe, J. (1986), “Feeling of knowing in memory and problem solving”, Journal of Experimental
Psychology: Learning, Memory, and Cognition, Vol. 12 No. 2, pp. 288-294.
Onwuegbuzie, A. (2000), “Statistics anxiety and the role of self-perceptions”, The Journal of
Educational Research, Vol. 93 No. 5, pp. 323-330.
Onwuegbuzie, A. and Wilson, V. (2003), “Statistics anxiety: nature, etiology, antecedents, effects, and
treatments – a comprehensive review of the literature”, Teaching in Higher Education, Vol. 8
No. 2, pp. 195-209.
Parker, R., Pettijohn, C. and Keillor, B. (1999), “The nature and role of statistics in the business school
curriculum”, Journal of Education for Business, Vol. 75 No. 1, pp. 51-54.
Paxton, P. (2006), “Dollars and sense: convincing students that they can learn and want to learn
statistics”, Teaching Sociology, Vol. 34 No. 1, pp. 65-70.
Raju, P.S., Lonial, S.C. and Glynn Mangold, W. (1995), “Differential effects of subjective knowledge,
objective knowledge, and usage experience on decision making: an exploratory investigation”,
Journal of Consumer Psychology, Vol. 4 No. 2, pp. 153-180.
Sankowsky, D. (2006), “Inhibitors to learning and teaching in required quantitative methods courses: a
multidisciplinary analysis”, Journal of Student Centered Learning, Vol. 3 No. 1, pp. 11-19.
Schachter, D.L. (1983), “Feeling of knowing in episodic memory”, Journal of Experimental Psychology:
Learning, Memory and Cognition, Vol. 9, pp. 39-54.
Spralls, S.A., III and Wilson, J.H. (2016), “Forecasting in marketing education: empirical findings”,
International Journal of Business, Marketing, and Decision Sciences, Vol. 9 No. 1, p. 97.
Stork, D. (2003), “Teaching statistics with student survey data: a pedagogical innovation in support of
student learning”, Journal of Education for Business, Vol. 78 No. 6, pp. 335-339.
Sundt, J. (2010), “Overcoming student resistance to learning research methods: an approach based on
decoding disciplinary thinking”, Journal of Criminal Justice Education, Vol. 21 No. 3, pp. 266-284.
Tarasi, C.O., Wilson, J.H., Puri, C. and Divine, R.L. (2013), “Affinity for quantitative tools:
undergraduate marketing students moving beyond quantitative anxiety”, Journal of Marketing
Education, Vol. 35 No. 1, pp. 41-53.
Warwick, J. (2008), “Mathematical self-efficacy and student engagement in the mathematics
classroom”, MSOR Connect, Vol. 8 No. 3, pp. 31-37.
Wilder, E. (2010), “A qualitative assessment of efforts to integrate data analysis throughout the
sociology curriculum: feedback from students, faculty, and alumni”, Teaching Sociology, Vol. 38
No. 3, pp. 226-246.
Wilson, E.J., McCabe, C. and Smith, R.S. (2018), “Curriculum innovation for marketing analytics”,
Marketing Education Review, Vol. 28 No. 1, pp. 52-66.

Corresponding author
Jonas Nilsson can be contacted at: jonas.nilsson@handels.gu.se
