DOI: 10.1177/00222429231181071

Author Accepted Manuscript

From Mentally Doing to Actually Doing: A Meta-Analysis of Induced Positive Consumption Simulations

Journal: Journal of Marketing
Manuscript ID: JM-21-0851.R3
Manuscript Type: Revised Submission
Research Topics: Advertising, Marketing Communications, Brand/Product Choice
Methods: Meta-Analysis
From Mentally Doing to Actually Doing: A Meta-Analysis of Induced Positive Consumption Simulations

Gizem Ceylan (1), Kristin Diehl (2), Wendy Wood (2)

(1) Yale University
(2) University of Southern California

Author Note

Gizem Ceylan is a postdoctoral researcher in Marketing, Yale School of Management, Yale University, New Haven, CT 06511, U.S.A. (gceylan@gmail.com).

Kristin Diehl is Professor of Marketing, Marshall School of Business, University of Southern California, Los Angeles, CA 90089, U.S.A. (kdiehl@marshall.usc.edu).

Wendy Wood is Provost Professor Emerita of Psychology and Business, Department of Psychology and Marshall School of Business, University of Southern California, Los Angeles, CA 90089, U.S.A. (wendy.wood@usc.edu).

We have no conflicts of interest to disclose.

The authors thank Sebastian Contreras for assistance with the coding of studies, Blake McShane for his methodological advice, and Debbie MacInnis and Evan Weingarten for their thoughtful and extensive comments on earlier versions of this paper.

Correspondence concerning this article should be addressed to Gizem Ceylan, Marketing, 165 Evans Hall, Yale School of Management, Yale University, New Haven, CT 06511, U.S.A. (gceylan@gmail.com).
Abstract

Mental simulation is an important tool for managers who want consumers to imagine what life would be like if they engaged in positive consumption behaviors. However, research has found mixed effects of mental simulation on behavior. To understand this inconsistency, we conduct a meta-analysis to quantify the effect of different mental simulation prompts. Our multivariate three-level meta-analysis of 237 effect sizes spanning four decades (1980-2020) and representing 40,705 respondents yields a positive but small effect of mental simulation on behavioral responses. Managers and researchers can amplify this effect by using dynamic visual inductions (e.g., AR), inductions involving both visuals and verbal instructions, and repeated inductions spaced over time (e.g., weekly, akin to real-world marketing campaigns). Inducing simulations repeatedly but massed (e.g., using the same message at the same time across different platforms or retargeting ads) actually reduces subsequent behavioral performance. We explain the implications of these findings for theory and practice and identify novel avenues for research.

Keywords: Mental simulation, future, behavior, purchase, advertisement, meta-analysis

Introduction
Mental simulation, imagining behavioral episodes that have not (yet) taken place, has been widely used in marketing and communication. For example, a commercial for EasyJet, a leading European airline, asks people to "Imagine Where We Can Take You," along with visuals of flying over clouds and of different holiday locations from beaches to cities. In a Facebook ad, Pillsbury prompts consumers to "imagine the memories" they will make using their cookie dough. Similarly, the Moraine Park Technical College radio ad asks listeners to "imagine what's next." In addition to explicit calls to imagine, managers employ strong visuals to implicitly prompt consumers to simulate a future scenario. For example, to enable consumers to simulate how new furniture would fit into their existing environment, furniture companies such as West Elm and Pottery Barn now offer AR-based room planners. Similarly, realtors have turned to 360-degree videos to showcase homes and apartments.

Mental simulation has been shown to improve action readiness (van Boven and Ashworth 2007) and hence is used in advertisements and other communication to facilitate and ultimately elicit purchase and consumption. However, research has revealed divergent effects of simulation on behavior. For example, although some studies noted positive influences on behavioral intentions and behavior (e.g., Elder and Krishna 2012; Shiv and Huber 2000; Zhao, Hoeffler, and Zauberman 2007), others found minimal or even negative effects (e.g., Pecher and van Dantzig 2016; Rajagopal and Montgomery 2011). It is difficult to interpret these findings given that the modality of simulation techniques, frequency of induction, type of consumption experience, and target populations vary widely in research as well as in practice. In addition, over 25 years have passed since the highly influential review of simulation effects by MacInnis and Price (1987), and the literature has grown considerably since that time. For these reasons, a quantitative synthesis of mental simulation's effects on behavioral outcomes is timely and important.

We systematically review the literature in a meta-analysis and integrate the extensive empirical research in this multidisciplinary literature, which spans marketing, advertising, psychology, and health. We connect and compare these results to the ways simulation has been used by managers and policymakers. Hence, our results can directly inform their decisions on simulation-based communication. Through this meta-analysis, we integrate 237 effect sizes and capture mental simulation's overall impact on behavior. We further contribute a comprehensive and empirically grounded account of the conditions under which mental simulation is effective in eliciting consumption and purchase. We thereby provide researchers and practitioners with a benchmark and strong foundation for understanding how to use this important technique.

Our research makes several important contributions. First, we find that, across studies, mental simulation increases behavioral responses. However, the average effect is small, suggesting that, while mental simulation works on average, marketers and researchers must identify ways to strengthen its impact. Second, we identify more powerful mental simulation prompts, such as dynamic visuals (AR, 360-degree video) and the combination of verbal instructions with pictures, and we guide marketers to use such interactive, rich media. Third, we demonstrate that the frequency and spacing of mental simulation determine its effect on behavior, and we offer direct guidance to managers for effective ad planning and delivery. Fourth, we demonstrate that simulation has limited impact on behavior in online samples in which participants may not be sufficiently motivated to engage in mental simulation.

Theoretical Development

Mental Simulation Readies for Behavior
Mental simulation is the process of projecting oneself and one's action into alternate temporal, spatial, social, or hypothetical realities (Kosslyn 1980; Taylor and Schneider 1989). When people simulate future events, they think about their potential behaviors, creating a mental representation in which they are engaging in the target behavior and experiencing its consequences. The mental events involved in simulation are closely aligned with those of performing the behavior. In a classic analysis, James (1890) argued that "every mental representation of a movement awakens to some degree the actual movement which is its object." Echoing this idea, mental representations of future events have been termed action plans (Gilbert and Wilson 2007) and organizers of activity (Steels 2003). Supporting a close connection between simulation and action, neuroimaging research reveals that simulating a behavior and engaging in the actual behavior take similar amounts of time (Parsons 1994) and activate similar neural mechanisms (Decety 1996; Guillot et al. 2009). This link between thinking and doing suggests a common neural coding of action or shared action schemas for simulated and actual behavior (Chartrand and Lakin 2013).

Given that mental simulation is essentially mentally performing a behavior, simulating a future behavior should increase the likelihood of action and improve actual performance. For instance, mental simulations improve athletes' performance by increasing their preparation for and motivation to act (Driskell et al. 1994; Conroy and Hagger 2018; Pham and Taylor 1999). Other skills, like playing the piano, can be learned effectively through mental practice in addition to actual physical practice (Pascual-Leone et al. 1995). In addition, simulating future consumption enables people to evaluate the emotional and cognitive consequences of that consumption, and imagining positive consequences increases people's willingness to engage in the imagined experience (Frijda, Bower, and Hamilton 1988; Higgins 2006). Furthermore, imagining events in concrete and specific form often makes those events seem true (Taylor et al. 1998). For example, imagining an idyllic vacation of lying on a beach, swimming, sailing, and snorkeling may increase the likelihood that consumers turn these episodes into reality and book a vacation. Overall, mental simulation aims to facilitate and elicit the imagined behavior directly.

Mental Simulation and Its Mixed Effects on Behavior
While ample research supports the idea that mental simulation increases readiness for behavior, research on the effectiveness of mental simulation in inducing actual behavior has yielded mixed outcomes. As noted, some studies found a positive effect of simulation prompts on behavioral intentions and behavior (e.g., Elder and Krishna 2012; Shiv and Huber 2000; Zhao, Hoeffler, and Zauberman 2007), while others failed to find such an effect (e.g., Pecher and van Dantzig 2016; Rajagopal and Montgomery 2011). Note that this was the case even though both research and practice predominantly focus on simulating pleasurable consumption experiences such as driving a new car (Burns, Biswas, and Babin 1993) or staying at a nice hotel (Petrova and Cialdini 2005). Similarly, we focus our meta-analytic investigation on the mental simulation of positive consumption experiences.

To understand these inconsistent findings, researchers have recently begun to identify factors that may inhibit the effect of mental simulation on eliciting the simulated behavior. For instance, Cornil and Chandon (2016) found that mentally simulating the consumption of an unhealthy food option (e.g., a slice of chocolate cake) using all five senses simultaneously reduced the amount of food people consumed. Also, Powell and Barasch (2019) found that exposure to photos limited the extent to which consumers mentally simulated an experience (vs. no photos), and limited simulation reduced purchase intentions. Given positive, null, and negative findings in the literature, a meta-analysis is particularly well suited to quantify mental simulation's overall effect and to identify possible reasons for varying results.

In line with marketing practice that uses verbal and visual inductions in different media (e.g., verbal on the radio or visual in print), researchers have used both explicit verbal and more implicit visual methods to activate mental simulation. Yet researchers have rarely compared these different induction methods in a single study. Instead, they used different approaches but treated them as equivalent in activating mental simulation. Furthermore, researchers used different simulation targets (i.e., focal purchases) and respondent samples (e.g., in person, online) seemingly based on convenience, without examining the potential effects of these decisions. We examine potential conditions that could alter the effectiveness of mental simulation, including how simulation was induced, its frequency, consumers' familiarity with the simulation target, and the nature of the target (e.g., material products, experiential purchases, healthy behavior).

Does Mental Simulation Also Affect Attitudes?
Mental simulation involves imagining behavioral episodes that have not (yet) taken place, such as driving a new car, putting on sneakers, or eating an apple. Neuroimaging research suggests a common neural coding of action or shared action schemas for simulated and actual behavior (D'Argembeau and Van Der Linden 2004). Collectively, the literature suggests that mental simulation should be able to affect behavior directly by increasing action readiness and the realism of the experience, as indicated, for example, by feelings of greater narrative transportation (Escalas 2004).

Further, attitudes and behaviors may be related, as people often infer attitudes from their behavior and accessible thoughts (e.g., Kiesler, Nisbett, and Zanna 1969). It is possible that mental simulations of future behavior also influence attitudes. Mental simulation can bring to mind past experiences with the simulation target and related thoughts. These thoughts and feelings might then be updated based on the simulated behavior. For instance, a mental simulation of going to the gym might increase the likelihood of performing this behavior, thus also yielding more favorable attitudes toward going to the gym.

This effect of mental simulation on attitudes via behavior augments other well-established relationships between attitudes and behavior. A large body of work in research and practice examines persuasive communication that attempts to change behavior by first changing attitudes and intentions (i.e., an indirect effect of mental simulations on behavior). For example, the Theory of Reasoned Action (Fishbein and Ajzen 2011) provides a comprehensive framework, supported by copious empirical evidence, demonstrating that changing attitudes can lead to changes in behavior. Further, some work implies that mental simulation would influence attitudes first by creating feelings of fluency (e.g., Petrova and Cialdini 2005) and that this change could carry over to behavior.

Given the importance of understanding the antecedents of behavioral responses to marketing practice and theory, in this meta-analysis we purposefully focus on the causal effect of mental simulation on behavioral outcomes. Still, researchers may wonder whether and when mental simulations influence attitudes (e.g., Petrova and Cialdini 2005). Per our inclusion criteria, studies included in the meta-analysis must have assessed a behavioral response. Expanding the scope to also, independently, include attitudinal outcomes was not feasible. To offer an answer to whether mental simulation influences attitudes, we use the studies included in the meta-analysis that also measured attitudes. Within the limits of this meta-analysis, we treat attitudes and behavior as separate outcomes and examine whether mental simulations yield a positive effect on attitudes, without disentangling the relationship between behavior and attitudes.

When Does Mental Simulation Affect Behavior?
To understand when mental simulations affect behavior, we examine different ways in which mental simulation can be induced, focusing specifically on induction modality and simulation frequency. Empirically, we also explore other moderators, as shown in Figure 1.

Induction modality. Researchers have activated simulations through verbal inductions, comparable to radio and podcast advertising, and visual inductions, comparable to print ads and 360-degree video presentations. Verbal inductions explicitly encourage individuals to simulate a future event or action related to a target behavior (e.g., Burns, Biswas, and Babin 1993; Silvera et al. 2014; Zhao, Hoeffler, and Zauberman 2007). Visual inductions provide photos, logos, drawings, and illustrations of the target to activate simulation (e.g., Krishna, Morrin, and Sayin 2014; Petrova and Cialdini 2005). A common assumption in the literature is that verbal and visual inductions similarly activate mental representations of the simulation target (Lutz and Lutz 1978). Hence, researchers have used them interchangeably. However, to the extent that pictures and words are processed and encoded differently, these different modalities may have different effects on behavior.

Pictures might provide a basis for more impactful simulations, given that they depict a specific representation of the referent object (Amit, Algom, and Trope 2009) and are processed faster than words (Nelson, Reed, and McEvoy 1977; Paivio 1969). Visuals that provide extensive detail may also evoke more intense emotional reactions (Rossiter and Percy 1980). For these reasons, visual simulation prompts could produce stronger effects on behavior than verbal ones.

Alternatively, verbal simulation prompts (i.e., instructions to imagine) may be more effective in activating detailed (Johnson et al. 1988) and self-relevant mental simulations (Anderson 1983; Escalas 2007). A verbal simulation prompt that denotes a broad set of referent objects (e.g., imagine yourself driving a sedan) allows consumers to simulate the features (e.g., color) that are most relevant and appealing to them. Thus, the personal relevance of verbal prompts might increase individuals' likelihood of engaging in the target behavior. The present meta-analysis compares the effectiveness of verbal versus visual inductions, even though prior research did not systematically study this question.

We also compare the effect of either visual or verbal simulation prompts to cases in which both modalities are used in combination. Activating a simulation in two different modalities may enhance the likelihood and elaboration of the simulation compared with using a single modality. Hence, combined inductions may have a stronger effect on behavior than single-modality inductions. However, it is also possible that simulations in two modalities have no additional effect, or that the combination may even have a negative effect through habituation.

To further understand the effects of different simulation inductions, we also compare the effect of standard static visual prompts with those involving dynamic visuals. Using emerging dynamic technologies, marketers and researchers can potentially create more interactive, engaging consumer experiences. Studies with dynamic visuals used 3D images aided by interactive features, such as zooming, moving, and rotating, or augmented reality (AR). Control conditions in this work involve 2D images or static product pictures taken from the front, back, and side angles. Given the ability of dynamic visual prompts to evoke realistic and elaborate simulations of the target (Schlosser 2003), we expect dynamic simulation prompts to have a stronger effect on behavior than verbal and static (i.e., photo-based) visual inductions.

H1: Dynamic visual simulation inductions have a stronger effect on behavioral outcomes than verbal or static visual simulation inductions.
Simulation frequency. Prior research in advertising has identified the benefits of frequent exposure (termed repetition effects by Pechmann and Stewart 1988), largely due to more frequently advertised brands being recalled better at the point of purchase (Tellis 2003). In simulation-based ads, by contrast, simulating a behavior more (vs. less) frequently may increase action readiness, suggesting that greater simulation frequency leads to a greater likelihood of engaging in the simulated behavior. However, simulation frequency may also lead to adaptation, such that greater simulation frequency reduces one's likelihood of engaging in the simulated behavior and eventually diminishes performance. Given these opposing mechanisms, it may not just be the frequency of mental simulation but also the spacing of mental simulation that influences behavioral outcomes.

Our review compares three approaches that researchers and practitioners have used to vary exposure frequency and spacing in mental simulation-based ads. The majority of studies in marketing evoke mental simulation of the target consumption experience only once in a single session, akin to a single exposure. Single exposures are likely in today's fragmented, cluttered media landscape, despite advertising practice's striving for greater exposure (Moorman, Ryan, and Tavassoli 2022). Furthermore, proponents of the minimalist (vs. frequentist) perspective on advertising have argued for and shown effects even after a single exposure (Jones 1995; Gibson 1996). Indeed, managers and researchers typically rely on a single exposure to assess ad effectiveness in A/B tests, a common industry practice. In addition, marketing practice stresses the importance of the first impression brands create on consumers (e.g., when visiting a brand's web page; Ghimire 2022). Finally, for consumable products, such as choosing what to eat for lunch from menus or apps, a single induction mimics the decision context in which marketers use vivid pictures to simulate the consumption experience and induce ordering (Pierce 2022). As such, single-simulation studies speak to an important aspect of marketing reality. Since mentally simulating a positive behavior (e.g., eating a cookie) is very close to actually performing the behavior and involves the same cognitive and neural circuitry, we expect it to increase the likelihood of engaging in the target behavior (e.g., purchasing or consuming the cookie).

Spaced simulations prompt participants to simulate the target consumption experience multiple times across temporally spaced sessions. Marketers use this strategy when delivering the same message to consumers on a daily or weekly basis. Across repetitions, individuals can bring different aspects of the simulation target to mind, eventually creating a more detailed and vivid mental model of the target. Given that consistency of mental simulation practice helps to increase and maintain behavioral performance (Janiszewski et al. 2003), we expect repeated simulations in temporally spaced sessions over time to have a stronger positive effect on behavior than a single simulation.

Massed simulations prompt participants to engage in simulation multiple times without temporal spacing. Marketers may (inadvertently) expose consumers to the same advertisement content in a massed way when reaching them across different platforms (e.g., Instagram and Twitter), when the inventory of non-skippable ads on online platforms is limited (a frequent complaint on YouTube), or with retargeting ads that follow consumers across different websites. For example, Li et al. (2021) speculated that consumer annoyance might explain why exposure to retargeting ads within a short period (within 30-60 minutes) negatively affects purchases. Relatedly, in a sports setting, Cooley et al. (2013) found a directionally negative relationship between the number of repetitions of a single image in one session and performance. In the context of simulation-based ads, we expect that repeatedly simulating the same target behavior in a short time frame leads to habituation and reduced behavioral responses. As a result, we expect massed simulations to reduce the likelihood of engaging in the target behavior compared with single simulation inductions.

H2a: Spaced, repeated simulation inductions increase behavioral responses compared with single simulation inductions.
H2b: Massed, repeated simulation inductions reduce behavioral responses compared with single simulation inductions.
We tested these a priori theoretical predictions in our meta-analysis. Further, in line with meta-analytic practice, we also tested a number of other potential factors that arose during the review and differed between studies (see Figure 1), as well as operational factors (e.g., journal field). Given that we did not have a priori predictions about these factors, we introduce them in the Method section and report the results only when they yield significant findings.

FIGURE 1. MODEL TESTED IN META-ANALYSIS
[Figure 1 depicts the conceptual model tested in the meta-analysis: mental simulation (presence vs. absence; high vs. low level) predicts behavioral intentions and behavior, with moderators grouped into simulation characteristics (induction modality, simulation frequency), features of the simulation target (audience's familiarity, type of simulation target), contextual factors (simulation context, simulation focus, population), and research operational factors (journal field, year, country, scale, multi-item). Gray text in the figure indicates moderators we tested but for which we did not have a specific prediction.]

Method

Literature Search
We searched PsycINFO, PubMed, Web of Science, Google Scholar, Oxford Scholarship Online, ProQuest - Multiple Databases, and ProQuest Digital Dissertations for papers published or authored between 1980 and 2020. In line with the focus of this meta-analysis, the search used mental simulation-related terms in conjunction with outcome-related terms. Specifically, we used the following search string: the independent variable of interest (simulation OR mental simulation OR imagery OR mental imagery OR mental practice OR visuali*) in conjunction with a dependent variable (purchase intent(ion) OR behavioral intent(ion) OR persuasion OR behavior OR intervention OR program). For searches conducted on PubMed, we added a third string (randomized trial) to correctly identify studies with an experimental design. The term "persuasion" was included because it was sometimes used as an umbrella term to refer to outcomes of interest, such as consumption behavior. The terms "intervention" and "program" were especially relevant to studies found in PubMed. Descendancy searches were conducted with Google Scholar on papers that cited foundational articles (Bone and Ellen 1992; Burns, Biswas, and Babin 1993; Rossiter and Percy 1980). Ancestry searches were conducted on the reference sections of foundational articles and review papers (Babin and Burns 1997; Blondé and Girandola 2016; Burns, Biswas, and Babin 1993; Conroy and Hagger 2018; MacInnis and Price 1987).

We also conducted a search for unpublished data (Rothstein and Bushman 2012). Searching ProQuest Digital Dissertations returned 13 documents that were a potential fit. Among them, four dissertations fit our criteria and were included in the analysis (Babin 1992; Ross-Stewart 2009; Thompson 2006; Walters 2007). Only one dissertation was later published (Babin and Burns 1997); in this case, studies from both the published article and the unpublished dissertation were included. We further searched ProQuest - Multiple Databases for conference proceedings. None of the reports we located fit our inclusion criteria (described below). Additionally, we included two presentations at the Society for Consumer Psychology Conference in 2019 in our analysis. Finally, we sent requests through the listservs of the Society for Personality and Social Psychology, the Society for Judgment and Decision Making, and the Association for Consumer Research. We received 16 papers (both published and unpublished work). Among the 16, two unpublished manuscripts met our inclusion criteria, and we included them in the meta-analysis. In addition, studies from an unpublished working paper by the authors of this manuscript were included in the analysis (see Web Appendix A for a graphical display of the literature search).

Inclusion Criteria
In line with the focus of this meta-analysis on the causal effect of mental simulation on behavioral outcomes, we included research reports that met the following criteria:

1. Studies must have had an experimental manipulation in which participants were randomly assigned to mental simulation conditions.

2. Mental simulation must have been induced using verbal or visual prompts. As such, we included papers that induced participants' mental simulation using static pictures (e.g., Krishna, Morrin, and Sayin 2014), dynamic pictures (e.g., interactive pictures, Schlosser 2003; augmented reality, Fritz, Hadi, and Stephen 2022), or verbal instructions (e.g., Adaval and Wyer 1998). Using this criterion, we excluded only three studies that activated mental simulation as a function of the number of claims (e.g., many vs. few claims; Spears, Ketron, and Ngamsiriudom 2016) or text type (e.g., dialogue vs. narration; Avramova, de Pelsmacker, and Dens 2017).

3. Eligible experiments must have had a control condition. We identified two different types of control conditions in the literature: (a) one that used no simulation prompts (e.g., Adaval and Wyer 1998; Krishna, Morrin, and Sayin 2014) or (b) one that used prompts to induce a relatively lower level of simulation (e.g., Elder and Krishna 2012; Kim and Lennon 2008). Four studies that compared two different types of simulation conditions (e.g., process- vs. outcome-focused mental simulation manipulations) without a specific control were excluded from the analysis (Cian, Longoni, and Krishna 2020; Rennie et al. 2014).

4. Studies must have focused on simulating positive future consumption behaviors or positive sensory experiences derived from these behaviors (eating fruits, Knäuper et al. 2011; vacationing in Europe, Petrova and Cialdini 2005) with the intention to increase the simulated behavior. We did not include studies (a total of 17 studies) that instructed participants specifically to simulate reducing their future consumption behavior (e.g., reducing alcohol consumption; Conroy, Sparks, and de Visser 2015) or that instructed participants to simulate risks, problems, and negative sensory experiences (e.g., consequences of not screening for skin cancer, Block and Keller 1997; potential problems buying really new products, Dahl and Hoeffler 2004; disgust from neglecting gum health, Dillard and Shen 2018).

5. Studies must have assessed a behavioral response in the form of (a) intentions, (b) choice between two alternatives, (c) actual consumption, or (d) amount of behavior (e.g., exercise). Given that mental simulation affects behavior directly, we did not include studies that solely assessed attitudes toward the brand or the company (e.g., DeRosia and McQuarrie 2019), attitudes toward the advertisement through which mental simulation was encouraged (e.g., Bolls and Muehling 2007), or memory recall (e.g., Klepacz et al. 2016). However, we computed attitude effects in studies that assessed attitudes in addition to behavioral responses in order to test for any evidence of an inference process.

6. Studies needed to have sufficient information (means, standard deviations, F ratios, t-tests, etc.) to calculate effect sizes. We contacted authors for additional information whenever the necessary information was not available, and we used this information when provided (which recovered a total of 12 effect sizes).

Meta-Analytic Strategy
We calculated the effect size, Hedges's g (Rosenthal 1991), from means and standard deviations, t-tests, F ratios, and log-odds ratios, using standard formulas (Borenstein et al. 2009; Johnson and Eagly 2014) and Comprehensive Meta-Analysis (CMA) software (Borenstein et al. 2009). We used the bias-corrected Hedges's g (similar to Cohen's d but with a small-sample correction) as our main measure of effect size (Rosenthal 1991). To estimate heterogeneity, we computed I² (Borenstein et al. 2009), which reflects the heterogeneity present in a sample as a percentage of the total variation that is not random sampling variation. We also estimated τ² as a measure of absolute heterogeneity.
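For readers who wish to reproduce this step outside CMA, the following is a minimal sketch in R using the metafor package (which we also rely on below); the data frame `dat` and its values are hypothetical placeholders, not data from the meta-analysis.

```r
# Minimal sketch (not the authors' CMA workflow): bias-corrected Hedges's g
# from group means, SDs, and sample sizes, plus tau^2 and I^2 estimates.
library(metafor)

dat <- data.frame(
  m1i = c(5.2, 4.8), sd1i = c(1.1, 1.3), n1i = c(50, 60),  # simulation conditions
  m2i = c(4.7, 4.6), sd2i = c(1.2, 1.2), n2i = c(50, 55)   # control conditions
)

# measure = "SMD" returns the bias-corrected standardized mean difference (Hedges's g)
dat <- escalc(measure = "SMD", m1i = m1i, sd1i = sd1i, n1i = n1i,
              m2i = m2i, sd2i = sd2i, n2i = n2i, data = dat)

res <- rma(yi, vi, data = dat)  # random-effects summary
res$tau2                        # absolute heterogeneity
res$I2                          # % of total variation not attributable to sampling error
```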

In all, our meta-analysis synthesized 237 published and unpublished effect sizes (including all induction methods and all dependent variables), obtained from research conducted in the United States and internationally. To account for the hierarchical structure of our data (i.e., multiple effect sizes derived from one paper), we estimated the overall effect size using a three-level model, reflecting the set of experiments nested in a given paper (level 3), individual effect sizes nested within experiments (level 2), and subjects nested in each experiment (level 1; Cheung 2014). The analysis used robust variance estimation (RVE; Hedges, Tipton, and Johnson 2010) with hierarchical weights and small-sample corrections (Tipton, Pustejovsky, and Ahmadi 2019). When sufficient data were available (k > 4), we tested for the influence of moderating variables. All analyses were conducted using the R packages metafor (Viechtbauer 2010) and robumeta (Fisher et al. 2017).
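As a rough illustration of this modeling step (a sketch under assumed variable names, not the authors' exact code), a three-level model and an RVE model with hierarchical weights can be specified as follows; `yi`, `vi`, `paper_id`, `study_id`, and `modality_visual` are hypothetical column names.

```r
# Sketch of the three-level / RVE specification described above.
library(metafor)
library(robumeta)

# Three-level model: effect sizes nested in studies nested in papers
m3 <- rma.mv(yi, vi, random = ~ 1 | paper_id / study_id, data = dat)
summary(m3)

# Robust variance estimation with hierarchical weights and small-sample correction
rve <- robu(formula = yi ~ 1, data = dat, studynum = paper_id,
            var.eff.size = vi, modelweights = "HIER", small = TRUE)
print(rve)

# Example moderator test (run only when k > 4), e.g., a visual-vs-verbal dummy
rve_mod <- robu(formula = yi ~ modality_visual, data = dat, studynum = paper_id,
                var.eff.size = vi, modelweights = "HIER", small = TRUE)
```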
Finally, in order to better communicate the substantive importance of a given finding, we translated Hedges's g into an improvement index based on Cohen's U3 index, which converts an effect into a percentile gain shown by the treatment group compared with the control group (Durlak 2009). For example, an effect size of g = 0.2 moves the treatment group from the median of a normal distribution to roughly its 58th percentile and represents an approximately 8% change in the focal outcome (in standardized terms) between the treatment group and the control group.
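The conversion itself is just the standard normal cumulative distribution function; a one-line sketch (the helper name is ours):

```r
# Cohen's U3-based improvement index: the percentile gain of the treatment
# group relative to the control distribution, in percentage points.
improvement_index <- function(g) 100 * (pnorm(g) - 0.5)

improvement_index(0.19)  # ~7.5, matching the overall effect reported below
improvement_index(0.18)  # ~7.1, matching the intentions effect in Table 3
```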
Publication Bias

Publication bias is a threat to the validity of any meta-analysis. The published literature documents only a proportion of all research carried out, and the unpublished proportion may be systematically different from the published proportion because selectivity may exist in what gets published (Sutton 2009). To address these concerns, we assessed potential bias using three different methods: Egger's test, trim-and-fill analyses (Duval and Tweedie 2000b), and p-curve (Simonsohn, Nelson, and Simmons 2014).

First, we used Egger's test to examine publication bias. Egger's test estimates the association between the observed treatment effects and their standard errors; a strong association implies publication bias (Egger, Smith, and Phillips 1997). Second, we employed trim-and-fill on funnel plots with contours to verify the extent to which the effect size would change once accounting for unpublished results (Duval and Tweedie 2000a; Palmer et al. 2008; Peters et al. 2008). Third, we created a p-curve from all published studies in our final dataset to test whether the underlying studies had evidential value, meaning effects were not due to selective reporting (Simonsohn, Nelson, and Simmons 2014).
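A minimal sketch of the first two checks using metafor (the p-curve analysis is typically run with the p-curve app and is omitted here); `res_uni` is a hypothetical univariate fit of the same effect sizes.

```r
# Sketch of the funnel-plot-based bias checks. Note that regtest() and
# trimfill() operate on a standard univariate random-effects fit, not on
# the three-level rma.mv model.
res_uni <- rma(yi, vi, data = dat)

regtest(res_uni)        # Egger's regression test: standard error as moderator
tf <- trimfill(res_uni) # trim-and-fill: impute presumed missing effect sizes
summary(tf)             # pooled estimate adjusted for the imputed studies
funnel(tf)              # funnel plot including the imputed (filled) studies
```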
Jointly, these analyses suggest that mental simulation has a small but robust true effect on behavioral responses despite some evidence of publication bias (detailed analyses can be found in Web Appendix B). Separately, we conducted a publication bias analysis for studies with attitudes toward the simulation target as the dependent variable (see Web Appendix C). Similar to the average effect size from studies with behavioral outcomes, the average effect size from studies with attitudes was reduced slightly but remained significantly different from zero after controlling for publication bias.

Dependent Variables
Behavioral responses. Behavioral responses included purchase intentions, choice, actual (food) consumption, and actual behavior (e.g., exercise). Purchase intention measures included single- or multi-item measures of intentions to buy (e.g., Jeong and Jang 2016), willingness to obtain information (e.g., Gregory, Cialdini, and Carpenter 1982), and willingness to recommend an item to others (Silvera et al. 2014). We also included studies that combined purchase intentions with brand attitudes (e.g., Hartmann and Apaolaza-Ibáñez 2009). Some studies measured consumers' choice as a preference between two alternatives in hypothetical settings (e.g., Shiv and Huber 2000). In addition, several studies measured actual behavior, including actual food or drink consumption (either in grams or in units of the food or drink item; e.g., Cornil and Chandon 2016) and physical activity in terms of exercise minutes or weekly frequency (Kim et al. 2011). We report all of these as behavioral responses.

Attitudes toward the simulation target. Some studies in our sample also reported attitudes toward the focal target and thus allow us to test whether mental simulation also affects attitudes. We calculated a separate effect size reflecting attitudes toward the simulation target. These studies typically measured attitudes using multiple items (good/bad, favorable/unfavorable, pleasant/unpleasant, etc.) and reported these as a composite score. Note that, in the reporting, prior research treated attitudes and behavior as separate outcomes and did not report the relationship between them. Hence, we cannot assess the indirect effect of mental simulation on attitudes via behavior as implied by a self-perception mechanism. However, we can assess the direct effect of mental simulation on attitudes. Such a finding may still be of practical importance because attitudes can be measured more easily than behavior.

Effects and Moderator Coding
The first author and a research assistant, who held a master's degree in psychology and was trained by the first author, calculated all effect sizes and coded the moderator variables discussed below (intraclass correlation = .90). The two coders resolved any disagreement through discussion. To capture prior experience with the target behavior, we used text analysis (see below).
Induction modality. In studies with verbal inductions, the treatment group was instructed to mentally simulate a future behavior (e.g., driving a sedan, vacationing at a beach resort). These studies compared the effect of mental simulation prompts in the treatment group with a control group that entailed (a) a list of product attributes (e.g., Escalas 2004) or (b) a different scenario without instructions to engage in mental simulation (e.g., Zhao, Hoeffler, and Zauberman 2007).

Studies with visual inductions activated mental simulation in the treatment condition by showing a high-quality picture of the intended future behavior (e.g., colorful and clear, Petrova and Cialdini 2005; larger visuals, Rossiter and Percy 1980). In the control condition, these studies employed two different operationalizations: (a) no picture, in which participants did not receive a visual at all, or (b) a less vivid picture, in which participants received either a picture unrelated to the intended behavior or a lower-quality, less vivid picture of the intended behavior (e.g., black-and-white and fuzzy, or smaller visuals).
We also coded studies that combined visual and verbal inductions in the same experimental condition. We compared the combined treatment condition to the available control conditions (which constituted the single-modality treatment conditions): (a) verbal induction alone, a control that used instructions, or (b) visual induction alone, a control that used static visuals to mentally simulate a behavior. We conducted a publication bias analysis for this separate set of studies that used combined inductions (see Web Appendix D) and did not find any evidence of publication bias.

We further coded studies in which the treatment condition involved dynamic visual simulation inductions or technologically enhanced visuals. Dynamic simulation inductions were either images rendered in augmented reality (e.g., an image of a food item on a table, Fritz, Hadi, and Stephen 2020), rotating images (e.g., an image of a jacket that participants can zoom, move, or rotate, Choi and Taylor 2014), or a 360-degree video (e.g., of a digital camera, Schlosser 2003). The control group involved static visuals (e.g., pictures of the product from front, back, and side angles). Note that in this comparison the control condition constituted the treatment condition discussed earlier (i.e., high-quality pictures); hence, we analyzed studies with dynamic visuals separately. We also conducted a publication bias analysis for this separate set of studies that used dynamic visuals (see Web Appendix E) and did not find any evidence of publication bias. We present the four simulation induction methods and respective control conditions in Table 1.

TABLE 1. INDUCTION MODALITY IN TREATMENT AND CONTROL CONDITIONS

Modality           k    Treatment                                                   Control
Verbal             60   Instructions                                                No instructions
Visual (static)    88   High-quality picture (static)                               No picture or low-quality picture
Combined           22   Instructions + high-quality picture (static)                Instructions OR high-quality picture (static)
Visual (dynamic)   20   Dynamic visuals (360-degree pictures or augmented reality)  High-quality picture (static)
Simulation frequency. Studies in our review that involved either verbal or visual inductions used three distinct presentation frequencies and spacings to activate mental simulation: single, spaced, and massed. In single simulations, participants simulated the target behavior once (k = 116). For instance, participants mentally simulated consuming popcorn (e.g., Hildebrand, Harding, and Hadi 2019) or using a notebook computer (e.g., Shiv and Huber 2000). In spaced simulations (k = 10), participants repeatedly simulated the target behavior in multiple sessions distributed over time (e.g., weekly). For instance, participants mentally simulated engaging in physical activity (Kim et al. 2011) or eating healthy snacks (Rennie et al. 2014) once a week for four weeks. These studies measured the outcome at the end of the final intervention session. Finally, in massed simulations (k = 22), participants simulated the same target behavior repeatedly within a short period (generally in the same session). In a single experimental session, respondents mentally simulated performing the behavior multiple times (e.g., eating 30 M&Ms, Morewedge, Huh, and Vosgerau 2010) or using all five sensory modalities (e.g., focusing on the sound, smell, taste, look, and sensation of eating cake, Cornil and Chandon 2016).

Familiarity with the simulated target. The reviewed studies used simulation targets that may have differed in participants' prior familiarity. To estimate the familiarity of each target at the time of publication, we tracked mentions of the simulation targets in the news with Dow Jones Interactive, an online news database that records the frequency of use of a topic, word, or phrase (now known as Factiva.com [https://global.factiva.com/]). Following Fast, Heath, and Wu's (2009) demonstration that the prevalence of words or names in news media is an indicator of familiarity, we counted the number of times a simulation target appeared in publications in the country where the research was conducted and in the year the paper was published. We created an index of this word or phrase relative to each of four commonly used words (news, land, buy, market) that may have appeared in publications frequently, to control for any increase in word appearances over time. We then created a composite index by averaging these four indexes. We used the composite index as a continuous moderator that reflects the relative prevalence of a stimulus target at the time of publication (see Web Appendix G for further details).
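A toy sketch of how such a composite familiarity index could be computed from raw news-mention counts (all counts and names below are hypothetical; the authors' exact procedure is described in Web Appendix G):

```r
# Hypothetical illustration of the composite familiarity index: the target's
# yearly news mentions are indexed against four common benchmark words and
# the four ratios are averaged.
familiarity_index <- function(target_count, benchmark_counts) {
  mean(target_count / benchmark_counts)
}

benchmarks <- c(news = 120000, land = 45000, buy = 60000, market = 90000)  # hypothetical counts
familiarity_index(target_count = 1500, benchmark_counts = benchmarks)
```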
Type of simulation target. Simulation targets were typically not manipulated within a given study but rather varied across studies. We coded four distinct types of simulation targets that would be relevant to managers and researchers: (a) material, which included tangible products such as a car, laptop, or jacket; (b) experiential, which included ephemeral experiences such as vacationing at a resort or visiting a park; (c) food, which included items to be consumed such as hamburgers, cookies, M&Ms, or orange juice; and (d) health, which included engaging in beneficial activities (e.g., exercising).

Simulation focus. Only two studies in our review explicitly manipulated whether instructions focused on the outcome versus the process of consumption (vs. a separate control condition; Marszał-Wiśniewska and Jarczewska-Gerc 2016; Zhao, Hoeffler, and Zauberman 2007). For these papers, we compared the process and outcome simulation conditions separately to the control. For all other studies, we coded simulation prompts as focusing on the outcome or the process, in line with the definition from Taylor et al. (1998).

Simulation context. We also coded whether manipulations were embedded in an explicit persuasive context (i.e., an advertising context) or not. In some studies (e.g., Elder and Krishna 2012; Escalas 2004), prompts were embedded within an advertisement, which we coded as an advertising context. In other studies (Duncan et al. 2012; Meslot et al. 2016), prompts were presented without a specific persuasive context.

Population. Some studies recruited participants online, through Amazon's Mechanical Turk. Other studies were conducted in person at a college or university with student participants, and the remaining studies were conducted in person using general population samples.

Journal field. Some studies appeared in marketing journals (e.g., Journal of Marketing, Journal of Consumer Research) and others in non-marketing journals (e.g., Appetite, Frontiers in Psychology). In the case of working papers, dissertations, or conference presentations, we based the coding on the primary appointment of the authors or the domain of the conference. Of all effect sizes included in this paper, 79% were based on a paper in the field of marketing.

A summary of all moderators tested can be found in Table 2.

TABLE 2. MODERATORS

Induction modality: 1 = visual, 0 = verbal (we additionally tested combined [visual + verbal] and dynamic inductions)
Simulation frequency: 1 = spaced, 0 = otherwise; 1 = massed, 0 = otherwise (reference category when both variables are 0 is single simulation)
Prior experience with simulated target: familiarity index based on Factiva mentions
Type of simulation target: 1 = material, 0 = otherwise; 1 = food, 0 = otherwise; 1 = health, 0 = otherwise (reference category when all variables are 0 is experiential)
Simulation focus: 1 = process-focused, 0 = outcome-focused
Simulation context: 1 = in advertising context, 0 = no advertising context
Population: 1 = general population, 0 = otherwise; 1 = students, 0 = otherwise (reference category when both variables are 0 is MTurk)
Journal field: 1 = marketing, 0 = outside of marketing
Year: year of publication (1980 to 2020)
Published: 1 = published, 0 = unpublished data (working paper, dissertation, or conference proceedings)
Country: 1 = US, 0 = otherwise
Dependent variable is a scale: 1 = continuous, 0 = otherwise (binary, count)
Dependent variable is a multi-item scale: 1 = multiple items, 0 = single item

Results

Does Mental Simulation Affect Behavior?
Across induction methods and dependent variables, the meta-analysis included 237 total effect sizes nested in 126 studies from 55 different manuscripts. In aggregate, these reflected the responses of 40,705 participants. Of these, combined (i.e., visual + verbal) simulation inductions (22 effect sizes), dynamic simulation inductions (20 effect sizes), and studies with attitudes as the dependent variable (47 effect sizes) were analyzed separately, leaving the largest number of effect sizes (148) for the analyses of static inductions on behavior.

Table 3 shows the results from the best-fitting three-level hierarchical linear model (HLM; see Web Appendix H for alternative models). Overall, mental simulation created a small positive increase in consumers' behavioral responses. On average, simulation inductions increased behavior by 7.5%, expressed in Cohen's Improvement Index. When the effect size was broken down into intentions and behavior, the effect size was significant for intentions but not for behavior, possibly due to the smaller sample size assessing actual behavior. Figure 2 depicts the effect sizes of all 148 studies, grouped by induction modality, as well as the overall effect size. The nonrandom variability (I²), or dispersion attributable to true heterogeneity and not sampling error (Higgins and Thompson 2002; Huedo-Medina et al. 2006), was sizable, suggesting that outcomes were influenced by simulation frequency and other study features.
TABLE 3. THE INFLUENCE OF MENTAL SIMULATION INDUCTIONS (VS. NO SIMULATION CONTROL)

Dependent variable                    Number of effects   Total sample size   Hedges's g [95% CI]       % change (1)   I² (%)
Behavioral responses                  148                 40,705              0.19***(a) [0.11, 0.27]   7.5            83.0
  Behavioral intentions               116                 36,868              0.18***(a) [0.11, 0.25]   7.1            76.2
  Behavior                            32                  3,837               0.13(a) [-0.20, 0.45]     5.1            91.2
Attitudes toward simulation target    47                  23,828              0.15***(a) [0.07, 0.24]   6.1            63.0

*** p < .001.
(1) Standardized percentage change in the outcome as indicated by Cohen's Improvement Index.
(a) Identical letters indicate no significant difference between effect sizes.

FIGURE 2. FOREST PLOT OF ALL EFFECT SIZES WITH BEHAVIORAL OUTCOMES GROUPED BY INDUCTION MODALITY
Does Mental Simulation Affect Attitudes?

Our review included 47 attitude effect sizes from 18 papers that reported both attitudes toward the simulation target and behavioral responses in 35 separate studies (Table 3). All of these studies induced mental simulation using the single-simulation paradigm. Echoing the overall findings, simulation in this smaller sample increased behavior by 5% and attitudes by 6% (compared with the control), expressed in Cohen's Improvement Index. Simulations thus had comparable effects on behavior and attitudes.

How to Amplify Mental Simulation's Effect on Behavior?

Use modalities interchangeably, as the modality of the simulation prompt has no differential effect. All effect sizes by subgroup, along with the HLM results for the impact of the other examined factors on behavioral responses, are reported in Table 4. Verbal instructions and static visual photos had statistically comparable effects on behavior. Specifically, expressed in Cohen's Improvement Index, simulations that used a verbal induction or a static visual increased behavioral responses by 9.1% and 6.1%, respectively.

Combine multiple modalities. We further analyzed whether combining both modalities had a stronger effect than each modality alone (Table 4). Indeed, both modalities in combination increased behavioral responses by 9.8%, exceeding the effect of either modality alone.

Use dynamic simulations. Studies that activated mental simulation using dynamic visual inductions (e.g., 360-degree videos, AR) increased behavioral responses by 15.5% compared with the static visual control, revealing a medium-sized effect (Table 4). To test H1, we compared the effect size from studies that used dynamic visuals with that from studies using static visual inductions. Supporting H1, studies that used dynamic visuals produced a significantly larger behavioral response than studies that used static visuals (z = -2.66, 95% CI [-0.43, -0.06], p = .008) and a marginally larger response than studies that used verbal instructions (z = -1.94, 95% CI [-0.37, 0.001], p = .052).
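These subgroup comparisons are Wald-type contrasts between two summary effects. The R sketch below illustrates that general calculation under this assumption; it is not the authors' code, and because Table 4 reports rounded estimates it will only approximate the z-values reported above.

    # Illustrative Wald-type contrast between two subgroup summary effects,
    # using rounded point estimates and standard errors from Table 4
    # (so the result only approximates the reported z = -2.66).
    subgroup_contrast <- function(g1, se1, g2, se2) {
      diff <- g1 - g2
      se   <- sqrt(se1^2 + se2^2)
      z    <- diff / se
      c(difference = diff, z = z, p = 2 * pnorm(-abs(z)),
        ci_lower = diff - 1.96 * se, ci_upper = diff + 1.96 * se)
    }

    # Static visual (g = 0.16, SE = 0.04) vs. dynamic visual (g = 0.40, SE = 0.09)
    subgroup_contrast(0.16, 0.04, 0.40, 0.09)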
Space repeated simulations to increase behavior; massed simulations decrease behavior. Recall that single-simulation studies produced a small effect, increasing consumers' behavioral responses by 5.7% compared with the control. Spaced simulations, which were presented multiple times with breaks in between, produced a large positive effect and increased behavioral responses by 33.8% relative to single simulations. Finally, massed simulations, which repeated mental simulations without a break, decreased consumers' behavioral responses by 12.6% relative to single simulations. Supporting H2a, spaced simulations had a larger positive effect than single simulations. Furthermore, supporting H2b, the effect of single simulations was positive and hence larger than the effect of massed simulations, which was negative.

Use in-person and, if possible, general population samples. Mental simulation increased behavioral responses in studies conducted in person by 18.1% with general population samples and by 5.1% with student samples. However, simulation had no significant effect in studies conducted with MTurk samples (note that only single and massed simulation studies were conducted online). When we compared the general population and student samples, simulation had a stronger influence on the general population than on students (QM(1) = 8.89, p < .001).
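The estimates in Table 4 come from three-level models in which effect sizes are nested within studies and papers. The R sketch below shows one way such a model could be specified with the metafor package; the data frame and moderator column names (dat, paper_id, es_id, visual, massed, and so on) are hypothetical placeholders that mirror the coding in Table 2, and the snippet illustrates the general approach rather than the authors' exact model.

    # Illustrative three-level mixed-effects meta-regression (metafor::rma.mv).
    # `dat` is a hypothetical data frame with one row per effect size: yi (Hedges' g),
    # vi (its sampling variance), paper_id, es_id, and dummy-coded moderators.
    library(metafor)

    fit <- rma.mv(yi, vi,
                  mods   = ~ visual + massed + spaced + familiarity +
                             material + food + health + in_advertising +
                             outcome_focus + general_population + students,
                  random = ~ 1 | paper_id / es_id,   # effect sizes nested in papers
                  data   = dat,
                  method = "REML")
    summary(fit)

    # Wald-type test of a single moderator, analogous to the QM(1) tests above
    anova(fit, btt = "general_population")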
TABLE 4. RESULTS OF MODERATOR ANALYSES

Moderators                        k studies   M effect size (SE)   Fully Nested (3-Level) Model,
                                                                   Without Research Operational Variables
Induction modality
  Visual (+1)                     88          0.16 (0.04)          0.03 (0.03)
  Verbal (0)                      60          0.23 (0.05)
Simulation frequency
  Massed (+1)                     22          -0.32 (0.09)         -0.31* (0.13)
  Spaced (+1)                     10          0.99 (0.13)          0.85*** (0.21)
  Single (0)                      116         0.18 (0.03)
Familiarity                       148                              0.001 (0.01)
Simulation target
  Material (+1)                   51          0.21 (0.06)          0.09 (0.08)
  Food (+1)                       60          0.07 (0.06)          0.08 (0.06)
  Health (+1)                     14          0.74 (0.13)          0.17 (0.19)
  Experiential (0)                23          0.13 (0.08)
Context
  In advertising (+1)             67          0.18 (0.06)          -0.03 (0.07)
  No advertising (0)              81          0.18 (0.06)
Focus
  Outcome (+1)                    32          0.31 (0.07)          0.03 (0.07)
  Process (0)                     116         0.14 (0.04)
Population
  General population (+1)         25          0.47 (0.09)          0.21* (0.09)
  Students (+1)                   65          0.13 (0.05)          0.05 (0.07)
  MTurk (0)                       58          0.07 (0.06)
AIC                                                                63.96
BIC                                                                114.32
df                                                                 16

Analyzed Separately               k studies   M effect size (SE)   [95% CI]
  Combined Inductions             22          0.25 (0.05)          [0.14, 0.36]
  Dynamic Inductions              20          0.40 (0.09)          [0.23, 0.56]
General Discussion

Mental simulation had a small but robust effect of increasing behavioral responses in our dataset of 237 effect sizes spanning four decades of research and representing 40,705 respondents. At the same time, the studies revealed several ways to strengthen this effect. Most important and relevant to both managers and researchers, the type of prompt used to stimulate mental simulation was a major influence on simulation effectiveness. When consumers simulated the target behavior using either a combination of verbal and static visual prompts or a dynamic visual prompt (such as AR or 360-degree videos), the behavioral impact far exceeded that of verbal or static visual prompts alone. Thus, simulation-based communication can benefit from multiple modalities or from technological developments that facilitate easier, more vivid, and more realistic mental simulation.

The frequency of the simulation also provides a way to increase simulation effects. Prior advertising research has highlighted the advantages of frequent presentation because it can improve encoding and memory, as reflected in brand recall. However, repeated presentations did not always enhance the effectiveness of mental simulation. Single simulation inductions, often used in academic research and in A/B testing in industry, produced a positive but small increase in behavioral response. When the simulation was repeated and spaced over time, the simulation's effect was on average five times larger than that of single inductions. In contrast, massed simulations actually reduced behavior. Small time intervals between exposures likely lead consumers to habituate and ultimately disengage from the simulated behavior.

It is also worth noting that studies conducted online (e.g., with MTurk samples) yielded nonsignificant results. Studies conducted with in-person samples, however, produced significant effects. These findings suggest that in-person samples offer a more robust context for researchers to examine mental simulation and its consequences.
Managerial Implications

An important finding for practitioners is that more interactive and engaging simulation prompts, such as 360-degree videos and AR tools, are especially effective in increasing behaviors. Fritz, Hadi, and Stephen (2022) find that, by superimposing objects onto a consumer's real-time environment, AR increases the ease with which consumers can mentally simulate behavior. This ease of simulation can, in turn, increase their desire, purchase likelihood, and consumption enjoyment. Beyond the initial attraction of novel technologies, investing in such technologies and approaches may be particularly important for companies that rely on consumers simulating a future experience or outcome. Further, certain existing technologies and channels can be leveraged for simulation-based communications. For instance, animated graphics play an increasingly important role in digital advertising and email marketing for product reveals and demonstrations (White 2021). In addition, 1.5 million beauty videos are uploaded each month on YouTube, accounting for 4.6 billion views that can stimulate simulation (Hudson, Kim, and Moulton 2018). Additionally, many luxury brands, from shoes to watches, employ unboxing videos on TikTok and Instagram to stimulate viewers' imagination and influence their future brand purchases.

Even if practitioners rely on traditional prompts to activate mental simulation, they can achieve double the effect size by combining visual and verbal prompts. This is in line with research on visual and verbal communication in consumer-generated content. For instance, in the context of online reviews, Ceylan, Diehl, and Proserpio (2023) find that when a review photo and the review text convey similar aspects of one's experience, consumers find the review easier to process, which, in turn, increases that review's perceived helpfulness. Similarly, in simulation-based ads, practitioners can more easily activate mental simulation and reap its benefits when they use both modalities in combination.

Although researchers and managers have previously stressed the importance of frequency when planning campaigns, they were generally concerned with ensuring brand recall. Our findings suggest that frequency and spacing are uniquely important for campaigns involving mental simulation. Ads prompting mental simulation are most effective when consumers are exposed to an ad multiple times. However, in today's age of viewers fragmented across platforms, some consumers may be reached only once, while others may be exposed to the same ad multiple times in brief succession. Both types of exposure are particularly problematic for simulation-based ads. Our findings show that simulating a consumption experience only once has a much smaller effect on behavior than repeated and spaced simulations. Worse, we find that repeated simulation prompts without spacing inhibit behavior. Hence, for marketers employing mental simulation in their campaigns, controlling, and especially limiting, daily exposure is particularly important. Hulu, for example, has taken steps to ensure its viewers encounter the same commercial only twice per hour, four times per day, or 25 times per week. Other platforms such as Facebook and Instagram now allow marketers to place limits on daily or weekly exposure, which, given our findings, should be set even lower than those employed by Hulu. Yet, with the practice of retargeting, exposure across venues may still be substantial (Smith 2021). Hence, simulation-based ads may particularly benefit from technological advances that allow marketers to place frequency limits across platforms, such as Google's Display & Video 360 (Bulbul 2022).

Furthermore, we found that mental simulation prompts were ineffective for online respondents, possibly because they were not sufficiently involved or engaged in the simulation process. This finding may be particularly alarming for managers because a large share of advertising spending goes to TV and digital channels that may be consumed alongside distracting activities and may involve active disengagement from ads. Ads that include mental simulation may better fit channels in which consumers initiate the marketing activity (Wiesel, Pauwels, and Arts 2011), such as search ads that ensure greater consumer attention and engagement based on their declared interests (Shankar and Malthouse 2007). For other channels, it may be particularly important to target mental simulation messaging well to ensure that consumers are sufficiently motivated to engage in the effortful simulation process. Implications of our findings for managers and researchers are summarized in Table 5.
TABLE 5. MANAGERIAL AND RESEARCH IMPLICATIONS

Key finding: On average, mental simulation inductions have a small effect on consumption and purchase behavior.
Managerial and research implications: Both managers and researchers should carefully select simulation paradigms that heighten effectiveness.

Key finding: Simulation inductions are more effective when they combine visual and verbal modalities than simulations that use either modality alone.
Managerial and research implications: Multi-modality can enhance the ease with which consumers can process simulation-based ads, which managers and researchers alike can utilize in their communication and research, respectively.

Key finding: Dynamic visual inductions (rotating pictures or AR tools) are more effective than verbal and static visual inductions.
Managerial and research implications: Engaging techniques such as rotating pictures and AR tools are more effective in increasing behaviors and can be used more often by managers and researchers.

Key finding: Simulations are more effective when repeated over time with breaks (e.g., across weeks).
Managerial and research implications: Both researchers and managers may refrain from relying solely on single simulation interventions, such as investing in a simulation-based commercial that is shown only once to consumers (e.g., Scarlett Johansson imagining life with a mind-reading Alexa during Super Bowl 2022; Rawlings 2022). Marketing communication largely strives for repeated exposures. However, ensuring frequency is not sufficient to amplify the effectiveness of simulation-based communication; it is also critical that simulation prompts are appropriately spaced. Researchers may benefit from repeating simulation inductions to strengthen the effect of their manipulations.

Key finding: Repeating the same simulation in a short period of time (i.e., massed simulations) reduces behavior.
Managerial and research implications: While focusing on frequency, managers should be mindful that repeated simulations without spacing (i.e., massed) can dampen behavior. Hence, presenting the exact same message and visuals repeatedly within a short period of time should be avoided (e.g., by platforms allowing limits on exposure).

Key finding: Simulation inductions are least effective in online studies (such as MTurk), while they are more effective for in-person samples (students or general population).
Managerial and research implications: Managers should employ simulation-based ads when consumers are motivated to engage in the simulation (e.g., when they actively search for them). When studying mental simulation with online samples (e.g., MTurk), researchers must find ways to keep participants sufficiently motivated and engaged in order to induce simulation.
Future Research

Our extensive analyses and broad survey of the available literature allowed us to test novel questions that were not addressed in the original research, such as the impact of repeated versus single simulation prompts. However, the majority of studies we identified focused on a limited range of visuals (e.g., static, 360-degree, AR) and thus were less informative about other types of visuals, such as videos or VR, which offer great promise for both academics and managers. For instance, Kristofferson, Daniels, and Morales (2021) found that VR (vs. 360-degree visuals) increased donations through increased immersion in the experience, suggesting that VR is a promising platform for simulation-based campaigns. Future research into the effects of new visual technologies on consumers' mental simulation and behavior will be important. Furthermore, future investigations might address whether these more engaging simulation methods can improve the effectiveness of mental simulation for targets about which consumers have little knowledge on which to base their simulation (e.g., low-familiarity targets) or for contexts in which people are not highly motivated to engage in simulation (e.g., online samples, low-involvement contexts).

Although our analysis did not identify familiarity with the simulation target as a significant predictor, familiarity may still play a role in both theory and practice. In the literature that we analyzed, researchers generally chose products and services with which participants had moderate to high familiarity (cf. Dahl and Hoeffler 2004). However, brands also use simulation-based ads to introduce novel products. For example, Mercedes's print ads promote new models as coming from the "Dream Factory" and aim to activate the audience's mental simulation. In line with initial findings by Dahl and Hoeffler (2004) that simulation did not benefit really new products, Mercedes's campaign may be differentially effective when launching major innovations or radically new cars (e.g., autonomous cars) because consumers may not have a solid mental model of the behavior (i.e., driving an autonomous car). How consumers respond to simulation-based ads of different modalities in these instances, and whether consumers need greater handholding through vivid visual prompts or easy-to-understand simulation starters, can be the focus of future research.

Furthermore, simulation prompts in the reviewed literature were often associated with a company and an explicit persuasion attempt (i.e., advertising). Although persuasive context did not influence simulation effectiveness, future research might examine whether simulation prompts by third parties (e.g., a prompt to eat more vegetables from the American Diabetes Association) would be more effective than prompts by entities with a financial interest in the simulated behavior.

The literature we reviewed focused on engaging in a particular behavior (e.g., eating a cookie). However, particularly in the context of health and well-being, policymakers often employ simulation-based ads to discourage a particular behavior (e.g., reducing consumption of unhealthy foods). The effect of mental simulation may be more complex when reducing an existing behavior via simulation than when encouraging a new behavior. Further factors such as familiarity may also play a role. For example, Hagger et al. (2012) found that mental simulation prompts to reduce alcohol consumption were more effective for those with higher baseline consumption. Future research may systematically examine the effect of simulating not engaging in certain behaviors and may even contrast these prompts with massed simulations, which effectively reduced behavior via habituation.

Given that most current research tested positive consumption experiences and that positive consumption behavior is highly relevant to marketing practice, we focused solely on such experiences. However, managers and policymakers sometimes leverage negative mental simulations to raise awareness for certain causes (e.g., the UK Alzheimer's Society asked viewers to imagine what it would be like to go through a Covid lockdown while experiencing Alzheimer's) or to demarket products (e.g., vivid photos of mouth lesions on cigarette packs to reduce smoking). Simulating negative events involves different psychological processes than simulating positive events and may thus affect behavior differently (Barsics, van der Linden, and D'Argembeau 2016). On the one hand, negative simulation prompts may be more effective because people weigh negative (vs. positive) information more heavily in their decision processes (Skowronski and Carlston 1989). On the other hand, counterarguing, motivated reasoning, or focusing on emotional responses (such as fear) may reduce the effectiveness of negative simulation prompts. For example, the graphicness of negative visuals did not affect quitting thoughts for heavy smokers (Andrews et al. 2014), and extensive elaboration on harmful consequences interfered with processing recommended changes in behavior (Keller and Block 1996). Given these conflicting possibilities, future research could examine the effect of negative simulation prompts on behavior more closely.

Our investigation focused on purchases and consumption due to their direct link with mental simulations as well as their importance to researchers and managers. However, other simulation-induced outcomes may also be important. For example, earlier research found that savoring an upcoming consumption experience heightened enjoyment of the experience (Chun, Diehl, and MacInnis 2017). Similarly, it is possible that mental simulation increases readiness to enjoy the experience beyond just readiness to act, a potentially critical yet unexplored effect. Given the importance of the actual consumption experience for both satisfaction and repeat purchases, future research may want to examine mental simulation outcomes that extend beyond purchase or use.

In sum, with our quantitative review, we contribute to the important literature on mental simulation by offering novel insights to inform academic research and marketing practice. Using these insights, researchers can pursue novel avenues of inquiry and practitioners can more effectively use mental simulation in their marketing strategies and communication plans.
References

*Denotes papers included in the meta-analysis
7 *Adams, Catherine, Laura Rennie, Ayse K. Uskul, and Katherine M. Appleton (2015), “Visualising
8 future behaviour: Effects for snacking on biscuit bars, but no effects for snacking on fruit,”
9 Journal of Health Psychology, 20 (8), 1037–48.
10 *Adaval, R and R Wyer (1998), “The role of narratives in consumer information processing,” Journal
11
of Consumer Psychology, 7 (3), 207–45.
12
13
Albarracín, Dolores, Aashna Sunderrajan, Sophie Lohmann, Sally Chan, and Duo Jiang (2018), “The
14 psychology of attitudes, motivation, and persuasion,” in The Handbook of Attitudes, Routledge,
15 3–44.
16 Albarracin, Dolores, Blair T. Johnson, Martin Fishbein, and Paige A. Muellerleile (2001), “Theories
17 of reasoned action and planned behavior as models of condom use: a meta-analysis.,”
Pe
18 Psychological Bulletin, 127 (1), 142.
19
Amit, E, D Algom, and Y Trope (2009), “Distance-dependent processing of pictures and words.,”
20
e

21 Journal of Experimental Psychology: General, 138 (3), 400.


Anderson, John R. (1983), “A spreading activation theory of memory,” Journal of Verbal Learning
rR
22
23 and Verbal Behavior, 22 (3), 261–95.
24 *Andrade, Jackie, Marina Khalil, Jennifer Dickson, Jon May, and David J. Kavanagh (2016),
25 “Functional Imagery Training to reduce snacking: Testing a novel motivational intervention
ev

26 based on Elaborated Intrusion theory,” Appetite, 100, 256–62.


27
28
Andrews, J. Craig, Richard G. Netemeyer, Jeremy Kees, and Scot Burton (2014), “How graphic
iew

29 visual health warnings affect young smokers’ thoughts of quitting,” Journal of Marketing
30 Research, 51 (2), 165–83.
31 Avramova, Yana R., Patrick de Pelsmacker, and Nathalie Dens (2017), “Brand placement in text: The
32 short-and long-term effects of placement modality and need for cognition,” International Journal
33 of Advertising, 36 (5), 682–704.
Ve

34
*Babin, Laurie A. (1992), “Effects of imagery-eliciting strategies on imagery processing, memory,
35
36
beliefs, attitudes, and intentions from print advertisements,” ProQuest Dissertations and Theses.
*Babin, Laurie A. and Alvin C. Burns (1997), “Effects of print ad pictures and copy containing
rsi

37
38 instructions to imagine on mental imagery that mediates attitudes,” Journal of Advertising, 26
39 (3), 33–44.
on

40 Barsics, Catherine, Martial van der Linden, and Arnaud D’Argembeau (2016), “Frequency,
41 characteristics, and perceived functions of emotional future thinking in daily life,” Quarterly
42
Journal of Experimental Psychology, 69 (2), 217–33.
43
44 Block, Lauren G. and Punam Anand Keller (1997), “Effects of self-efficacy and vividness on the
45 persuasiveness of health communications,” Journal of Consumer Psychology, 6 (1), 31–54.
46 Blondé, Jérôme and Fabien Girandola (2016), “Revealing the elusive effects of vividness: a meta-
47 analysis of empirical evidences assessing the effect of vividness on persuasion,” Social
48 Influence, 11 (2), 111–29.
49 Bolls, Paul D and Darrel D Muehling (2007), “The effects of dual-task processing on consumers’
50
51
responses to high- and low-imagery radio advertisements,” Journal of Advertising, 36 (4), 35–47.
52 Bone, Paula Fitzgerald and Pam Scholder Ellen (1992), “The generation and consequences of
53 communication-evoked imagery,” Journal of Consumer Research, 19 (1), 93–104.
54 Borenstein, Michael, Larry v. Hedges, Julian P.T. Higgins, and Hannah R. Rothstein (2009),
55 Introduction to Meta-Analysis, John Wiley & Sons.
3 van Boven, Leaf and Laurence Ashworth (2007), “Looking forward, looking back: Anticipation is
4
5
more evocative than retrospection,” Journal of Experimental Psychology: General, 136 (2), 289–
6 300.
7 Bulbul, Cenk (2022), “YouTube joins the Upfronts,” YouTube Official Blog, (accessed July 29, 2022),
8 [available at https://blog.youtube/news-and-events/youtube-joins-the-upfronts-building-the-
9 youtube-of-the-future-for-viewers-creators-and-advertisers/].
10 *Burns, Alvin C., Abhijit Biswas, and Laurie A. Babin (1993), “The operation of visual imagery as a
11
mediator of advertising effects,” Journal of Advertising, 22 (2), 71–85.
12
13
Ceylan, Gizem, Kristin Diehl, and Davide Proserpio (2021), "Words Meet Photos: When and Why
14 Visual Content Increases Review Helpfulness." Available at SSRN 3896569.
15 *Chang, Betty P.I., Maria D.G.H. Mulders, Renata Cserjesi, Axel Cleeremans, and Olivier Klein
16 (2018), “Does immersion or detachment facilitate healthy eating? Comparing the effects of
17 sensory imagery and mindful decentering on attitudes and behavior towards healthy and
Pe
18 unhealthy food,” Appetite, 130 (November 2017), 256–67.
19
Chartrand, Tanya L., and Jessica L. Lakin (2013), "The antecedents and consequences of human
20
e

21 behavioral mimicry." Annual review of psychology, 64, 285-308.


Cheung, M. W. L. (2014), “Modeling dependent effect sizes with three-level meta-analyses: a
rR
22
23 structural equation modeling approach.,” Psychological Methods, 19 (2), 211–29.
24 *Choi, Yung Kyun and Charles R. Taylor (2014), “How do 3-dimensional images promote products
25 on the internet,” Journal of Business Research, 67, 2164–70.
ev

26 Chun, Haeeun Helen, Kristin Diehl, and Deborah J. MacInnis (2017), “Savoring an upcoming
27
28
experience affects ongoing and remembered consumption enjoyment,” Journal of Marketing, 81
iew

29 (3), 96–110.
30 Cian, Luca, Chiara Longoni, and Aradhna Krishna (2020), “Advertising a desired change: When
31 process simulation fosters (vs. hinders) credibility and persuasion,” Journal of Marketing
32 Research, 57 (3), 489–508.
33 Conroy, Dominic and Martin S. Hagger (2018), “Imagery interventions in health behavior: A meta-
Ve

34
analysis,” Health Psychology, 37 (7), 668.
35
36
Conroy, Dominic, Paul Sparks, and Richard de Visser (2015), “Efficacy of a non-drinking mental
simulation intervention for reducing student alcohol consumption,” British Journal of Health
rsi

37
38 Psychology, 20 (4), 688–707.
39 Cooley, Sam J., Sarah E. Williams, Victoria E. Burns, and Jennifer Cumming (2013),
on

40 "Methodological variations in guided imagery interventions using movement imagery scripts in


41 sport: A systematic review." Journal of Imagery Research in Sport and Physical Activity, 8 (1),
42
13-34.
43
44 *Cornil, Yann and Pierre Chandon (2016), “Smaller food portions,pleasure as a substitute for size:
45 How multisensory imagery can make people happier with,” Journal of Marketing Research, 53
46 (5), 847–64.
47 Dahl, Darren W. and Steve Hoeffler (2004), “Visualizing the self: Exploring the potential benefits and
48 drawbacks for new product evaluation,” Journal of Product Innovation Management.
49 D’Argembeau, Arnaud, and Martial Van der Linden (2004), "Phenomenal characteristics associated
50
51
with projecting oneself back into the past and forward into the future: Influence of valence and
52 temporal distance," Consciousness and Cognition, 13 (4), 844-858.
53 Decety, Jean (1996), "Do imagined and executed actions share the same neural substrate?," Cognitive
54 brain research, 3 (2), 87-93.
3 DeRosia, Eric D. and Edward F. McQuarrie (2019), “Lost and found: Individual differences in
4
5
propensity to process visual elements of persuasion,” Psychology and Marketing, 36 (4), 266–75.
6 Dillard, James Price, and Lijiang Shen (2018), "Threat appeals as multi-emotion messages: An
7 argument structure model of fear and disgust." Human Communication Research, 44 (2), 103-
8 126.
9 Driskell, James E., Carolyn Copper, and Aidan Moran (1994), “Does Mental Practice Enhance
10 Performance?,” Journal of Applied Psychology, 79 (4), 481–92.
11
*Duncan, Lindsay R., Craig R. Hall, Philip M. Wilson, and Wendy M. Wilsons (2012), “The use of a
12
13
mental imagery intervention to enhance integrated regulation for exercise among women
14 commencing an exercise program,” Motivation and Emotion, 36 (4), 452–64.
15 Durlak, Joseph A. (2009), “How to Select, Calculate, and Interpret Effect Sizes,” Journal of Pediatric
16 Psychology, 34 (9), 917–28.
17 Duval, Sue and Richard Tweedie (2000a), “Trim and fill: A simple funnel-plot-based method of
Pe
18 testing and adjusting for publication bias in meta-analysis,” Biometrics, 56 (2), 455–63.
19
Duval, Sue and Richard Tweedie (2000b), “A nonparametric ‘Trim and Fill’ method of accounting for
20
e

21 publication bias in meta-analysis,” Journal of the American Statistical Association, 95 (449), 89–
98.
rR
22
23 Egger, M, GD Smith, and AN Phillips (1997), “Meta-analysis: principles and procedures,” BMJ, 315
24 (7121), 1533–37.
25 *Elder, Ryan S. and Aradhna Krishna (2012), “The ‘visual depiction effect’ in advertising:
ev

26 Facilitating embodied mental simulation through product orientation,” Journal of Consumer


27
28
Research, 38 (6), 988–1003.
iew

29 *Escalas, Jennifer Edson (2004), “Imagine yourself in the product : Mental simulation, narrative
30 transportation, and persuasion,” Journal of Advertising, 33 (2), 37–48.
31 *Escalas, Jennifer Edson (2007), “Self-referencing and persuasion: Narrative transportation versus
32 analytical elaboration,” Journal of Consumer Research, 33 (4), 421–29.
33 Eyal, Peer, Rothschild David, Gordon Andrew, Evernden Zak, and Damer Ekaterina (2021), "Data
Ve

34
quality of platforms and panels for online behavioral research." Behavior Research Methods, 1-
35
36
20.
Fast, Nathanael J., Chip Heath, and George Wu (2009), “Common ground and cultural prominence:
rsi

37
38 How conversation reinforces culture,” Psychological Science, 20 (7), 904–11.
39 Fishbein, Martin and Icek Ajzen (2011), Predicting and changing behavior: The reasoned action
on

40 approach, Predicting and Changing Behavior: The Reasoned Action Approach, Psychology
41 Press.
42
Fisher, Zachary, Elizabeth Tipton, and Hou Zhipeng (2017), “Package ‘robumeta.’”
43
44 * Flavián, Carlos, Raquel Gurrea, and Carlos Orús (2017), “The influence of online product
45 presentation videos on persuasion and purchase channel preference: The role of imagery fluency
46 and need for touch,” Telematics and Informatics, 34, 1544–56.
47 Frijda, Nico H., Gordon H. Bower, and Vernon Hamilton (1988), Cognitive perspectives on emotion
48 and motivation, Cognitive Perspectives on Emotion and Motivation, (V. Hamilton, G. H. Bower,
49 and N. H. Frijda, eds.), Dordrecht: Springer Netherlands.
50
51
*Fritz, William, Rhonda Hadi, and Andrew Stephen (2022), "From tablet to table: How augmented
52 reality influences food desirability." Journal of the Academy of Marketing Science, 1-27.
53 Gevelber, Lisa, and Oliver Heckmann (2015), “Travel trends: 4 mobile moments changing the
54 consumer journey.” article in ‘think with Google’, November.
55
3 Gilbert, Daniel T., and Timothy D. Wilson (2007), “Prospection: experiencing the future,” Science 17,
4
5
1351–1354. doi: 10.1126/science.1144161.
6 Ghimire, Summit (2022), “The Power Of First Impressions: A Marketing Strategy That Optimizes
7 Search For A Transformed Digital Landscape,” Forbes, (accessed July 29, 2022), [available at
8 https://www.forbes.com/sites/forbesbusinesscouncil/2022/03/11/the-power-of-first-impressions-
9 a-marketing-strategy-that-optimizes-search-for-a-transformed-digital-
10 landscape/?sh=785cb4c159fb].
11
Gibson, Lawrence D (1996), “What can one TV exposure do?,” Journal of Advertising, 36 (2), 9–19.
12
13
*Goodrich, Kendall (2011), “Anarchy of effects? Exploring attention to online advertising and
14 multiple outcomes,” Psychology and Marketing, 28 (4), 417–40.
15 *Gregory, W. Larry, Robert B. Cialdini, and Kathleen M. Carpenter (1982), “Self-relevant scenarios
16 as mediators of likelihood estimates and compliance: Does imagining make it so?,” Journal of
17 Personality and Social Psychology, 43 (1), 89–99.
Pe
18 Guillot, Aymeric, Christian Collet, Vo An Nguyen, Francine Malouin, Carol Richards, and Julien
19
Doyon (2009), "Brain activity during visual versus kinesthetic imagery: an fMRI study." Human
20
e

21 Brain Mapping, 30(7), 2157-2172.


*Hagger, Martin S., Adam Lonsdale, and Nikos L.D. Chatzisarantis (2011), “Effectiveness of a brief
rR
22
23 intervention using mental simulations in reducing alcohol consumption in corporate employees,”
24 Psychology, Health and Medicine, 16 (4), 375–92.
25 Hagger, Martin S., Adam Lonsdale, and Nikos L.D. Chatzisarantis (2012), “A theory-based
ev

26 intervention to reduce alcohol drinking in excess of guideline limits among undergraduate


27
28
students,” British Journal of Health Psychology, 17 (1), 18–43.
iew

29 *Hartmann, Patrick and Vanessa Apaolaza-Ibáñez (2009), “Green advertising revisited,”


30 International Journal of Advertising, 28 (4), 715–39.
31 Hedges, Larry v., Elizabeth Tipton, and Matthew C. Johnson (2010), “Robust variance estimation in
32 meta-regression with dependent effect size estimates,” Research Synthesis Methods, 1 (1), 39–
33 65.
Ve

34
* Heller, Jonas, Mathew Chylinski, Ko de Ruyter, Dominik Mahr, and Debbie I. Keeling (2019), “Let
35
36
Me Imagine That for You: Transforming the Retail Frontline Through Augmenting Customer
Mental Imagery Ability,” Journal of Retailing, 95 (2), 94–114.
rsi

37
38 Higgins, E Tory (2006), “Value from hedonic experience and engagement,” Psychological Review,
39 113 (3), 439–60.
on

40 Higgins, Julian P.T. and Simon G. Thompson (2002), “Quantifying heterogeneity in a meta-analysis,”
41 Statistics in Medicine, 21 (11), 1539–58.
42
*Hildebrand, Diogo, R. Dustin Harding, and Rhonda Hadi (2019), “Culturally Contingent Cravings:
43
44 How Holistic Thinking Influences Consumer Responses to Food Appeals,” Journal of Consumer
45 Psychology, 29 (1), 39–59.
46 *Homer, Pamela M and Sandra G Gauntt (1992), “The Role of Imagery in the Processing of Visual
47 and Verbal Package Information,” Journal of Mental Imagery, 16 (3–4), 123–44.
48 Hudson, Sara, Aimee Kim, and Jessica Moulton (2018), “What beauty players can teach the consumer
49 sector about digital disruption | McKinsey,” McKinsey, (accessed July 29, 2022), [available at
50
51
https://www.mckinsey.com/industries/consumer-packaged-goods/our-insights/what-beauty-
52 players-can-teach-the-consumer-sector-about-digital-disruption].
53 Huedo-Medina, Tania B., Julio Sánchez-Meca, Fulgencio Marín-Martínez, and Juan Botella (2006),
54 “Assessing heterogeneity in meta-analysis: Q statistic or I 2 Index?,” Psychological Methods, 11
55 (2), 193.
3 *Jeong, Eun Ha and Soo Cheong Jang (2016), “Imagine yourself being healthy: The mental
4
5
simulation effect of advertisements on healthy menu promotion,” International Journal of
6 Hospitality Management, 53, 81–93.
7 Jiang, Y and RS Jr Wyer (2009), “The role of visual perspective in information processing,” Journal
8 of Experimental Social Psychology, 45 (3), 486–95.
9 Johnson, Blair T and Alice H Eagly (2014), “Meta-analysis of social-personality psychological
10 research,” in Handbook of research methods in social and personality psychology, 675–722.
11
*Johnson, Dan R., Grace K. Cushman, Lauren A. Borden, and Madison S. McCune (2013),
12
13
“Potentiating empathic growth: Generating imagery while reading fiction increases empathy and
14 prosocial behavior,” Psychology of Aesthetics, Creativity, and the Arts, 7 (3), 306–12.
15 Johnson, Marcia K., Mary A. Foley, Aurora G. Suengas, and Carol L. Raye (1988), “Phenomenal
16 characteristics of memories for perceived and imagined autobiographical events.,” Journal of
17 Experimental Psychology: General, 117 (4), 371–76.
Pe
18 Jones, John Philip (1995), “Single-source research begins to fulfill its promise,” Journal of
19
Advertising, 35 (3), 9–17.
20
e

21 *Kardes, Frank R., Maria L. Cronley, and Steven S. Posavac (2005), “Using implementation
intentions to increase new product consumption: A field experiment,” in Applying Social
rR
22
23 Cognition to Consumer-Focused Strategy, 219–33.
24 Keller, Punam Anand and Lauren Goldberg Block (1996), “Increasing the Persuasiveness of Fear
25 Appeals: The Effect of Arousal and Elaboration,” Journal of Consumer Research, 22 (4), 448–
ev

26 59.
27
28
*Kim, Bang Hyun, Roberta A. Newton, Michael L. Sachs, Peter R. Giacobbi, and Joseph J. Glutting
iew

29 (2011), “The Effect of Guided Relaxation and Exercise Imagery on Self-Reported Leisure-Time
30 Exercise Behaviors in Older Adults,” Journal of Aging and Physical Activity, 19 (2), 137–46.
31 *Kim, Minjeong and Sharron Lennon (2008), “The Effects of Visual and Verbal Information on
32 Attitudes and Purchase Intentions in Internet Shopping,” Psychology & Marketing, 25 (2), 146–
33 78.
Ve

34
Klepacz, Naomi A., Robert A. Nash, M. Bernadette Egan, Charo E. Hodgkins, and Monique M. Raats
35
36
(2016), “When is an image a health claim? A false-recollection method to detect implicit
inferences about products’ health benefits,” Health psychology : official journal of the Division
rsi

37
38 of Health Psychology, American Psychological Association, 35 (8), 898–907.
39 *Knäuper, Bärbel, Amanda McCollam, Ariel Rosen-Brown, Julien Lacaille, Evan Kelso, and
on

40 Michelle Roseman (2011), “Fruitful plans: Adding targeted mental imagery to implementation
41 intentions increases fruit consumption,” Psychology and Health, 26 (5), 601–17.
42
Kosslyn, Stephen M. (1980), Image and Mind, Harvard University Press.
43
44 *Krishna, Aradhna, Maureen Morrin, and Eda Sayin (2014), “Smellizing Cookies and Salivating: A
45 Focus on Olfactory Imagery,” Journal of Consumer Research, 41 (1), 18–34.
46 Kristofferson, Kirk, Michelle E. Daniels, and Andrea C. Morales (2021), “Using Virtual Reality to
47 Increase Charitable Donations,” Marketing Letters.
48 *Landau, Mark J, Jamie Arndt, and Linda D Cameron (2018), “Do metaphors in health messages
49 work? Exploring emotional and cognitive factors,” Journal of Experimental Social Psychology,
50
51
74, 135–49.
52 *Lange, Christine, Camille Schwartz, Célia Hachefa, Yann Cornil, Sophie Nicklaus, and Pierre
53 Chandon (2020), “Portion size selection in children: Effect of sensory imagery for snacks
54 varying in energy density,” Appetite, 150.
3 *Li, Hairong, Terry Daugherty, and Frank Biocca (2003), “The Role of Virtual Experience in
4
5
Consumer Learning,” Journal of Consumer Psychology, 13 (4), 395–407.
6 Li, Jing, Xueming Luo, Xianghua Lu, and Takeshi Moriguchi (2021), “The double-edged effects of e-
7 commerce cart retargeting: does retargeting too early backfire?,” Journal of Marketing, 85 (4),
8 123–40.
9 Lutz, Kathy A. and Richard J. Lutz (1978), “Imagery-Eliciting Strategies: Review and Implications of
10 Research,” Advances in Consumer Research.
11
MacInnis, Deborah J. and Linda L. Price (1987), “The Role of Imagery in Information Processing:
12
13
Review and Extensions,” Journal of Consumer Research, 13 (4), 473–91.
14 *Marszał-Wis̈niewska, Magdalena and Ewa Jarczewska-Gerc (2016), “Role of mental simulations in
15 the weight loss process,” Journal of Psychology: Interdisciplinary and Applied, 150 (1), 1–14.
16 *Meslot, Carine, Aurélie Gauchet, Benoît Allenet, Olivier François, and Martin S. Hagger (2016),
17 “Theory-based interventions combining mental simulation and planning techniques to improve
Pe
18 physical activity: Null results from two randomized controlled trials,” Frontiers in Psychology, 7
19
(NOV), 1–16.
20
e

21 Moorman, Christine, Megan Ryan and Nader Tavassoli (2022), “Why Marketers Are Returning to
Traditional Advertising,” Harvard Business Review, (accessed July 29, 2022), [available at
rR
22
23 https://hbr.org/2022/04/why-marketers-are-returning-to-traditional-advertising].
24 *Morewedge, Carey K., Young Eun Huh, and Joachim Vosgerau (2010), “Thought for food:
25 Imagined consumption reduces actual consumption,” Science, 330 (6010), 1530–33.
ev

26 Nelson, Douglas L, Valerie S. Reed, and Cathy L. McEvoy (1977), “Learning to order pictures and
27
28
words: A model of sensory and semantic encoding.,” Journal of Experimental Psychology:
iew

29 Human Learning & Memory, 3 (5), 485–97.


30 Paivio, Allan (1969), “Mental imagery in associative learning and memory,” Psychological Review,
31 76 (3), 241.
32 Palmer, Tom M., Jaime L. Peters, Alex J. Sutton, and Santiago G. Moreno (2008), “Contour-
33 enhanced funnel plots for meta-analysis,” Stata Journal, 8 (2), 242–54.
Ve

34
Parsons, Lawrence M (1994), "Temporal and kinematic properties of motor behavior reflected in
35
36
mentally simulated action." Journal of experimental Psychology: Human perception and
Performance 20, 4, 709.
rsi

37
38 *Pecher, Diane and Saskia van Dantzig (2016), “The role of action simulation on intentions to
39 purchase products,” International Journal of Research in Marketing, 33 (4), 971–74.
on

40 Pechmann, Cornelia, and David W. Stewart (1988), “Advertising Repetition: A Critical Review of
41 Wearin and Wearout,” Current Issues and Research in Advertising, 11 (1–2), 285–329.
42
Peters, Jamie L, Alex J Sutton, David R Jones, Keith R Abrams, and Lesley Rushton (2008),
43
44 “Contour-enhanced meta-analysis funnel plots help distinguish publication bias from other
45 causes of asymmetry,” Journal of Clinical Epidemiology, 61 (10), 991–96.
46 *Petit, Olivia, Charles Spence, Carlos Velasco, Andy T. Woods, and Adrian D. Cheok (2017),
47 “Changing the influence of portion size on consumer behavior via imagined consumption,”
48 Journal of Business Research, 75, 240–48.
49 *Petrova, Petia K. and Robert B. Cialdini (2005), “Fluency of Consumption Imagery and the Backfire
50
51
Effects of Imagery Appeals,” Journal of Consumer Research, 32 (3), 442–452.
52 Petty, Richard E. and John T. Cacioppo (1986), The elaboration likelihood model of persuasion. In L.
53 Berkowitz (Ed.), Advances in experimental social psychology (vol. 19, pp. 123-205),
54 Communication and Persuasion.
3 Pham, Lien B and Shelley E Taylor (1999), “From thought to action: Effects of process- versus
4
5
outcome-based mental simulations on performance,” Personality and Social Psychology
6 Bulletin, 25 (2), 250–60.
7 Philips, Barbara J (2017), “Consumer imagination in marketing: a theoretical framework,” European
8 Journal of Marketing, 51 (11/12), 2138–2155.
9 Pierce, Marshall (2022), “High-quality food photos can increase orders on restaurant delivery apps by
10 35%,” Snappr Workflows Blog, (accessed February 21, 2023), [available at
11
https://www.snappr.com/enterprise-blog/high-quality-food-photos-can-increase-orders-on-
12
13
restaurant-delivery-apps-by-35].
14 *Powell, Emily and Alixandra Barasch (2019), “When a Photo is Not Worth 1000 Words: How
15 Photos of Experiences Can Harm Expectations for Those Experiences,” in Society for Consumer
16 Psychology.
17 *Rajagopal, Priyali and Nicole Votolato Montgomery (2011), “I Imagine, I Experience, I Like: The
Pe
18 False Experience Effect,” Journal of Consumer Research, 38 (3), 578–94.
19
Rawlings, Charlotte (2022), “Scarlett Johansson and Colin Jost imagine life with 'mind reader' Alexa
20
e

21 in Super Bowl ad, ” Campaign US, (accessed February 22, 2023), [available at
https://www.campaignlive.com/article/scarlett-johansson-colin-jost-imagine-life-mind-reader-
rR
22
23 alexa-super-bowl-ad/1739536].
24 *Renner, Fritz, Fionnuala C. Murphy, Julie L. Ji, Tom Manly, and Emily A. Holmes (2019), “Mental
25 imagery as a ‘motivational amplifier’ to promote activities,” Behaviour Research and Therapy,
ev

26 114, 51–59.
27
28
Rennie, Laura J., Peter R. Harris, and Thomas L. Webb (2014), “The impact of perspective in
iew

29 visualizing health-related behaviors: First-person perspective increases motivation to adopt


30 health-related behaviors,” Journal of Applied Social Psychology, 44 (12), 806–12.
31 *Rennie, Laura J., Ayse K. Uskul, Catherine Adams, and Katherine Appleton (2014), “Visualisation
32 for increasing health intentions: Enhanced effects following a health message and when using a
33 first-person perspective,” Psychology & Health, 29 (2), 237–52.
Ve

34
*Roggeveen, Anne L., Dhruv Grewal, Claudia Townsend, and R. Krishnan (2015), “The impact of
35
36
dynamic presentation format on consumer preferences for hedonic products and services,”
Journal of Marketing, 79 (6), 34–49.
rsi

37
38 Rosenthal, Robert (1991), “Meta-analytic procedures for social research (rev. ed.),” in Applied social
39 research methods series, Vol, 18–20.
on

40 *Rossiter, John R. and Larry Percy (1980), “Attitude change through visual imagery in advertising,”
41 Journal of Advertising, 9 (2), 10–16.
42
*Ross-Stewart, Lindsay (2009), “The effect of a one time imagery intervention on self-efficacy and
43
44 exercise frequency in a non-exercising population,” ProQuest Dissertations and Theses, 101.
45 Rothstein, Hannah R. and Brad J. Bushman (2012), “Publication bias in psychological science:
46 Comment on Ferguson and Brannick (2012),” Psychological Methods, 17 (1), 129–36.
47 *Schlosser, Ann E. (2003), “Experiencing Products in the Virtual World: The Role of Goal and
48 Imagery in Influencing Attitudes versus Purchase Intentions,” Journal of Consumer Research, 30
49 (2), 184–98.
50
51
Shankar, Venkatesh and Edward C Malthouse (2007), “The growth of interactions and dialogs in
52 interactive marketing,” Article in Journal of Interactive Marketing, 21 (2), 2–4.
53 *Shiv, Baba and Joel Huber (2000), “The Impact of Anticipating Satisfaction on Consumer Choice,”
54 Journal of Consumer Research, 27 (2), 202–16.
3 *Silvera, David H., Bruce E. Pfeiffer, Frank R. Kardes, Ashley Arsena, and R. Justin Goss (2014),
4
5
“Using imagine instructions to induce consumers to generate ad-supporting content,” Journal of
6 Business Research, 67 (7), 1567–72.
7 Simonsohn, Uri, Leif D. Nelson, and Joseph P. Simmons (2014), “P-curve: A key to the file-drawer,”
8 Journal of Experimental Psychology: General, 143 (2), 534–47.
9 Skowronski, John J. and Donal E. Carlston (1989), “Negativity and Extremity Biases in Impression
10 Formation: A Review of Explanations,” Psychological Bulletin, 105 (1), 131.
11
Smith, Gerry (2021), “Hulu, Discovery+, Peacock: Streaming Ads Are a Confusing $11 Billion
12
13
Business - Bloomberg,” Bloomberg.com, (accessed November 23, 2021), [available at
14 https://www.bloomberg.com/news/articles/2021-05-04/hulu-discovery-peacock-streaming-ads-
15 are-a-confusing-11-billion-business].
16 Spears, Nancy, Seth Ketron, and Waros Ngamsiriudom (2016), “Three peas in the pod of consumer
17 imagination: Purchase task, involvement, and ad information,” Journal of Consumer Behaviour,
Pe
18 15 (6), 527–37.
19
Steels, Luc (2003), “Intelligence with representation,” Philosophical Transactions of the Royal
20
e

21 Society of London. Series A: Mathematical, Physical and Engineering Sciences, 361, 2381–2395.
doi: 10.1098/rsta.2003.1257
rR
22
23 Sutton, Alex J (2009), “Publication bias,” in Handbook of Research Synthesis and Meta-Analysis.
24 *Szpunar, Karl K. and Kathleen B. McDermott (2008), “Episodic future thought and its relation to
25 remembering: Evidence from ratings of subjective experience,” Consciousness and Cognition,
ev

26 17 (1), 330–34.
27
28
Taylor, Shelley E., Lien B. Pham, Inna D. Rivkin, and David A. Armor (1998), “Harnessing the
iew

29 Imagination: Mental Simulation, Self-Regulation, and Coping,” American Psychologist, 53 (4),


30 429.
31 Taylor, Shelley E., and Sherry K. Schneider (1989), "Coping and the Simulation of Events." Social
32 Cognition, 7 (2), 174.
33 Tellis, Gerry J (2003), Effective advertising: Understanding when, how, and why advertising works,
Ve

34
Sage Publications.
35
36
*Thompson, Debora Viana (2006), “Influencing Consumers’ Preferences: The Effects of Mental
Construal and Mode of Information Processing,” ProQuest Dissertations and Theses.
rsi

37
38 Tipton, Elizabeth, James E. Pustejovsky, and Hedyeh Ahmadi (2019), “Current practices in meta-
39 regression in psychology, education, and medicine,” Research Synthesis Methods, 10 (2), 180–
on

40 94.
41 Viechtbauer, W (2010), “Conducting meta-analyses in R with the metafor package,” Journal of
42
Statistical Software, 36 (3), 1–48.
43
44 *Walters, Gabrielle (2007), “Consumption Vision, Emotion and the Tourism Consumer’s Purchase
45 Decision,” ProQuest Dissertations and Theses.
46 White, Chad S. (2021), “Animated Gifts: Best Uses and Best Practices,” Modern Marketing Blog,
47 (accessed June 29, 2022), [available at https://blogs.oracle.com/marketingcloud/post/animated-
48 gifs-best-uses-and-best-practices].
49 Wiesel, Thorsten, Koen Pauwels, and Joep Arts (2011), “Practice prize paper—Marketing’s profit
50
51
impact: Quantifying online and off-line funnel progression,” Marketing Science, 30 (4), 604–11.
52 *Zhao, Min, Steve Hoeffler, and Gal Zauberman (2007), “Mental simulation and preference
53 consistency over time: The role of process- versus outcome-focused thoughts,” Journal of
54 Marketing Research, 44 (3), 379–88.
55
From Mentally Doing to Actually Doing: A Meta-Analysis of Induced Positive Consumption Simulations

Gizem Ceylan (Yale University)
Kristin Diehl (University of Southern California)
Wendy Wood (University of Southern California)

WEB APPENDICES

A. Graphical Display of the Literature Review
B. Publication Bias Analyses for Behavioral Outcomes
C. Publication Bias Analyses for Attitude Towards the Simulation Target
D. Publication Bias Analyses for Studies that Used Combined Inductions
E. Publication Bias Analyses for Dynamic Visual Studies
F. Moderator Coding and Correlation Matrix
G. Familiarity Analyses Using Factiva
H. Sensitivity of Effect Sizes Using Different HLM Models

Web Appendix A

Graphical Display of the Literature Review

[The graphical display of the literature review is not recoverable from this text extraction.]
4
5 Web Appendix B
6
7 Publication Bias Analyses for Behavioral Outcomes
8
9
10 Publication bias is a threat to the validity of any meta-analysis. The published literature
11 documents only a proportion of all research carried out and the unpublished proportion may be
12 systematically different from the published, because selectivity may exist in deciding what to
13 publish (Sutton 2009). In our analysis we use three-level models and robust variance estimation
14 to account for the hierarchical structure of our data (i.e., multiple effect sizes are derived from
15 one paper). We note that there is not yet an accepted method in the field to correct for
16
17
publication bias when using robust variance estimation models (Bediou et al. 2018; Viechtbauer
Pe
18 2010). One recent recommendation in the case of non-independence is to aggregate effect sizes
19 per study or sample resulting in one effect size per study (Nakagawa et al. 2021). Following this
20 guidance, we aggregated effect sizes to the study level and then used Egger’s tests, trim-and-fill
e

21 analyses (Duval and Tweedie 2000b), and a p-curve (Simonsohn, Nelson, and Simmons 2014) to
rR
22 assess publication bias.
23 Egger’s test estimates the association between the observed treatment effects and their
24
25
standard errors; a strong association implies publication bias. The original Egger’s test regresses
ev

26 the standardized effect (i.e., the effect size divided by its standard error) on the corresponding
27 precision (i.e., the inverse of the standard error) (Egger, Smith, and Phillips 1997).
28 We employed trim-and-fill on funnel plots with contours to verify the extent to which the
iew

29 effect size would change once accounting for unpublished results (Duval and Tweedie 2000a;
30 Palmer et al. 2008; Peters et al. 2008). Funnel plots depict effect size against the standard error
31
for a study; larger studies are more precise and have smaller standard errors (Sterne, Egger, and
32
33 Smith 2001). Trim-and-fill estimates the shape of the coordinates on the funnel plot and, based
Ve

34 on the mean weighted effect size, fills in the funnel in spaces where studies are potentially
35 missing (Duval and Tweedie 2000a; b).
We also created a p-curve from all published studies included in our final dataset to test whether the underlying studies had evidential value (Simonsohn, Nelson, and Simmons 2014). Evidential value conveys that selective reporting cannot entirely explain the results (Simonsohn, Nelson, and Simmons 2014), as indicated by right-skewed p-curves. When the distribution of p-values is flat, this suggests no real effect, because a real effect would produce more p-values below .01 (Simonsohn, Nelson, and Simmons 2014). When most p-values are close to p = .05 (left skew), the p-curve is suggestive of p-hacking in the literature.
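The right-skew test behind a p-curve can be sketched as follows; this is a minimal illustration that assumes the standard Stouffer combination of pp-values for results significant at the .05 level, is not the code used for the analyses reported here, and uses hypothetical p-values.

import numpy as np
from scipy import stats

def p_curve_right_skew(p_values, alpha=0.05):
    # Keep only significant results, rescale them to pp = p / alpha
    # (uniform under the null of no effect), convert to z-scores, and
    # pool with Stouffer's method; a large negative Z indicates right skew.
    p = np.asarray(p_values, dtype=float)
    p = p[p < alpha]
    pp = p / alpha
    z = stats.norm.ppf(pp)
    Z = z.sum() / np.sqrt(len(z))
    return Z, stats.norm.cdf(Z)  # one-tailed p-value for right skew

Z, p_skew = p_curve_right_skew([0.001, 0.012, 0.030, 0.004, 0.021, 0.047])
print(f"full p-curve: z = {Z:.2f}, p = {p_skew:.4f}")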
Results (Behavioral Outcomes)
We first examined a contour-enhanced funnel plot (see Figure B1) that displays the treatment effects from individual studies (i.e., Hedges's g) on the horizontal axis against their standard errors on the vertical axis (Sterne, Egger, and Smith 2001). Whereas the funnel plot indicated a slight asymmetry, Egger's regression did not reach significance, b = 0.02, 95% CI [-0.19, 0.22]. We then used the nonparametric trim-and-fill procedure, which revealed that trimming 11 studies would make the plot more symmetric. After adjusting for potential publication bias in this way, the magnitude of the treatment effect reduced slightly but remained significantly different from zero, adjusted Hedges's g = 0.10, 95% CI [0.01, 0.19], z = 2.077, p = .03. Finally, we conducted a p-curve analysis (see Figure B2). The p-curve only examines studies with a significant p-value (N = 54 in this case). Approximately 66% of the p-values associated with published studies in the literature are below .025. The p-curve was significantly right-skewed (full p-curve: z = -6.95, p < .001; half p-curve: z = -6.14, p < .001), indicating that these estimates cumulatively contain evidential value and that selective reporting alone cannot explain the obtained results. Jointly, these analyses suggest that mental simulation has a small but robust true effect on behavioral responses despite some evidence of publication bias.
FIGURE B. PUBLICATION BIAS ANALYSES FOR BEHAVIORAL OUTCOMES

B1. Trim and fill analysis    B2. P-curve analysis

Note. Trim-and-fill (B1) and p-curve analyses (B2) are presented for studies with behavioral outcomes. In the trim-and-fill graph (B1), effect sizes are on the x-axis, and the standard error (precision) is on the y-axis. Unfilled circles in the graph indicate study findings from the literature; filled circles indicate findings that the trim-and-fill procedure added. Contour lines indicate levels of statistical significance (p < .05); random-effects modeling was used.
Web Appendix C

Publication Bias Analyses for Attitudes
We followed the same steps as outlined for behavioral outcomes for studies that reported effects on attitudes towards the simulation target (k = 47). These studies revealed an average effect size that was significantly different from zero (unadjusted Hedges's g = 0.15; 95% CI [0.08, 0.24]).
When visually inspecting the funnel plot, we observed a slightly higher concentration of studies towards the right side of the funnel, causing funnel asymmetry (see Figure C1). This asymmetry may suggest publication bias (Sterne and Harbord 2004). In line with this observation, Egger's regression was significant (b = -0.21, 95% CI [-0.40, -0.01], z = 3.84, p < .001).
Given the slight asymmetry, Duval and Tweedie's trim-and-fill procedure indicated a total of six studies to be trimmed to make the plot more symmetric. After adjusting for possible publication bias by trimming and filling six studies, the estimated effect size of mental simulation reduced slightly (adjusted Hedges's g = 0.11, 95% CI [.02, .20]) but remained significantly different from zero.
Finally, we conducted a p-curve analysis. The p-curve only examines studies with a significant p-value (N = 13 in this case). Approximately 69% of the p-values associated with published studies in the literature are below .025. The p-curve was significantly right-skewed (full p-curve: z = -1.84, p = .03; see Figure C2), indicating that these estimates cumulatively contain evidential value and that selective reporting alone cannot explain the obtained results.
FIGURE C. PUBLICATION BIAS ANALYSES FOR ATTITUDES

C1. Trim and fill analysis    C2. P-curve analysis

Note. Trim-and-fill (C1) and p-curve analyses (C2) are presented for studies with attitudinal outcomes.
Web Appendix D

Publication Bias Analyses for Studies that Used Combined Inductions
Studies that used two modalities in combination (i.e., visual + verbal; k = 22) revealed an average effect size that was medium-sized and significantly different from zero (unadjusted Hedges's g = 0.25; z = 4.33, p < .001).
When visually inspecting the plot, we did not observe any concentration of studies, and Egger's regression was not significant (b = -0.20, 95% CI [-1.00, 0.61], p > .40), suggesting no clear evidence of publication bias. The trim-and-fill procedure also did not suggest trimming or adding any studies.
Further, the p-curve was significantly right-skewed (full p-curve: z = -4.13, p < .001; see Figure D2), indicating that these estimates cumulatively contain evidential value and that selective reporting alone cannot explain the obtained results.
FIGURE D. PUBLICATION BIAS ANALYSES FOR STUDIES THAT USED COMBINED SIMULATIONS

D1. Trim and fill analysis    D2. P-curve analysis

Note. Trim-and-fill (D1) and p-curve analyses (D2) are presented for combined simulations.
Web Appendix E

Publication Bias Analyses for Dynamic Visual Studies
We again followed the steps outlined in Web Appendix B. Studies that used dynamic visual simulation prompts (k = 20) revealed an average effect size that was medium-sized and significantly different from zero (unadjusted Hedges's g = 0.39; z = 8.75, p < .001).
When visually inspecting the plot, we observed a slightly higher concentration of studies towards the right side of the funnel, causing funnel asymmetry (see Figure E1). This asymmetry may suggest publication bias (Sterne and Harbord 2004). However, Egger's regression was not significant (b = 0.08, 95% CI [-0.38, 0.55], p > .20), suggesting no clear evidence of publication bias.
Still, given the slight asymmetry, Duval and Tweedie's trim-and-fill procedure indicated a total of three studies to be trimmed to make the plot more symmetric. After adjusting for possible publication bias by trimming and filling three studies, the estimated effect size of mental simulation reduced slightly (adjusted Hedges's g = 0.34, 95% CI [.24, .44]) but remained medium-sized and significantly different from zero (see Figure E1).
The p-curve only examines studies with a significant p-value (N = 11 in this case). Approximately 73% of the p-values associated with published studies in the literature are below .025. The p-curve was significantly right-skewed (full p-curve: z = -3.63, p < .001; see Figure E2), indicating that these estimates cumulatively contain evidential value and that selective reporting alone cannot explain the obtained results.
FIGURE E. PUBLICATION BIAS ANALYSES FOR STUDIES THAT USED DYNAMIC VISUAL SIMULATIONS

E1. Trim and fill analysis    E2. P-curve analysis

Note. Trim-and-fill (E1) and p-curve analyses (E2) are presented for dynamic visual simulations.
Web Appendix F

Moderator Coding and Correlation Matrix
Moderator | Name in the Correlation Matrix | Coding Scheme
Induction modality | induction_modality_dummy | 1 = visual, 0 = verbal
Simulation frequency | simulation_frequency_dummy | 1 = spaced, 0 = otherwise; 1 = massed, 0 = otherwise; reference category if both variables are 0 is single simulation
Prior experience with simulated target | familiarity | Familiarity index based on Factiva mentions
Type of simulation target | target_type_dummy | 1 = material, 0 = otherwise; 1 = food, 0 = otherwise; 1 = health, 0 = otherwise; reference category if all variables are 0 is experiential
Simulation focus | focus_dummy | 1 = process-focused, 0 = outcome-focused
Simulation context | context_dummy | 1 = in advertising context, 0 = no advertising context
Population | population_dummy | 1 = General population, 0 = otherwise; 1 = Students, 0 = otherwise; reference category if both variables are 0 is MTurk
Journal field | journal_field_dummy | 1 = marketing, 0 = outside of marketing
Year | year_median_centered | Year of publication
Published | published_dummy | 1 = published, 0 = unpublished data (working paper, dissertation, or conference proceedings)
Country | country_dummy | 1 = US, 0 = otherwise
Dependent variable is a scale | scale_dummy | 1 = continuous, 0 = otherwise (binary, count)
Dependent variable is a scale composed of multiple items | is_multiitem_dummy | 1 = multiple items, 0 = single item
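To illustrate the coding scheme above, a minimal Python sketch with hypothetical study records follows (not the code used for the analyses reported here); column names other than those listed in the table are placeholders.

import pandas as pd

# hypothetical study-level records
studies = pd.DataFrame({
    "population": ["MTurk", "Students", "General population", "Students"],
    "induction_modality": ["visual", "verbal", "visual", "visual"],
})

# induction modality: 1 = visual, 0 = verbal
studies["induction_modality_dummy"] = (studies["induction_modality"] == "visual").astype(int)

# population: two dummies, with MTurk as the omitted reference category
studies["general_population_dummy"] = (studies["population"] == "General population").astype(int)
studies["students_dummy"] = (studies["population"] == "Students").astype(int)

print(studies)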
[Correlation matrix among the moderators]
Web Appendix G

Familiarity Analyses Using Factiva
We used Dow Jones Interactive, an online news database (now known as Factiva.com [https://global.factiva.com/]), which allows users to track how often a topic, word, or phrase appears in the news over time. For each study, we counted how many times the simulation target appeared in publications in the country where the research was conducted and in the year the article was published, the dissertation was submitted, or the paper was presented at a conference. For working papers, we used the year of the working paper version we had access to. For instance, the phrase running shoes (Escalas 2004) appeared 3,871 times in various publications in the US (the country where the research was conducted) during 2004 (from 1/1/2004 to 12/31/2004). To take a general increase in word appearances over time into account, we also noted the frequency of four commonly used words (news, land, buy, market) during that year. For example, the term news appeared 888,062 times in 2004, whereas land appeared 140,431 times, buy 228,761 times, and market 630,798 times. We then created an index by (1) dividing the frequency of the target word or phrase by the frequency of each of the four common words and then (2) averaging across these four quotients.
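A minimal Python sketch of this two-step index follows (not the code used for the analyses reported here); the counts are those of the running shoes example above.

def familiarity_index(target_count, baseline_counts):
    # Step 1: divide the target frequency by each common-word frequency.
    # Step 2: average the resulting quotients.
    quotients = [target_count / count for count in baseline_counts]
    return sum(quotients) / len(quotients)

# running shoes in US publications during 2004, against news, land, buy, and market
index = familiarity_index(3871, [888_062, 140_431, 228_761, 630_798])
print(round(index, 4))  # approximately 0.0137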
Web Appendix H

Sensitivity of Effect Sizes Using Different Models
Table G presents the HLM results for the impact of various moderators on effect sizes of behavioral responses. We calculated four models. Model 1 represents our baseline model, explaining 36.9% of the Level 1 variance and 63.1% of the Level 2 variance. Next, we estimated a fully nested model with three levels, given the interdependencies among effect sizes from the same paper. To account for the hierarchical structure of our data (i.e., multiple effect sizes derived from one paper), we estimated the overall effect size using a three-level model, reflecting the set of experiments nested in a given paper (Level 3), individual effect sizes nested within experiments (Level 2), and each effect size (Level 1; Cheung 2014). We refer to this three-level model as "fully nested" in the table below. Model 2 explains 27.9% of Level 1 variance, 0% of Level 2 variance, and 72.1% of Level 3 variance. This suggests that there is substantial between-paper heterogeneity at the third level. More importantly, the added complexity is justified given that Model 2 fits our data better than Model 1 (χ²(1) = 49.9, p < .001), with lower AIC and BIC. In Model 3, we added familiarity interactions to Model 2. Adding a single interaction effect is a common procedure in meta-regression to keep multicollinearity at an acceptable level (e.g., You, Vadakkepatt, and Joshi 2015). This led to a small increase in explained Level 1 variance from 27.9% to 28.9%, improving model fit compared with Model 2 (χ²(3) = 10.5, p = .02). Finally, in Model 4, we added research operational variables to Model 2. Model 4 did not explain any more variance than Model 2 (Level 1 = 28.1%, Level 2 = 0%, Level 3 = 71.9%), and the additional complexity of the model is not justified given that Model 4 did not fit better than Model 2 (χ²(7) = 11.4, p = .12). Since Model 2 and Model 3 had the best fit and the lowest AIC and BIC values, and since we did not have a theoretical prediction for the interaction between familiarity and outcome measures, we report Model 2 in the main body of the paper.
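To illustrate these comparisons, a minimal Python sketch of the chi-square difference tests follows (not the code used for the analyses reported here); the chi-square values are those reported above, and the degrees-of-freedom differences are taken from the df row of Table G.

from scipy import stats

def lr_test(chi_sq, df_diff):
    # p-value of a likelihood-ratio (chi-square difference) test for nested models
    return stats.chi2.sf(chi_sq, df_diff)

# Model 2 vs. Model 1: 16 - 15 = 1 df
print(f"Model 2 vs. 1: p = {lr_test(49.9, 1):.2g}")   # p < .001
# Model 3 vs. Model 2: 19 - 16 = 3 df
print(f"Model 3 vs. 2: p = {lr_test(10.5, 3):.3f}")   # approximately .015
# Model 4 vs. Model 2: 23 - 16 = 7 df
print(f"Model 4 vs. 2: p = {lr_test(11.4, 7):.3f}")   # approximately .12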
TABLE G. SENSITIVITY OF MENTAL SIMULATION EFFECTS

DV: effect sizes of behavioral responses and attitudes. Cells report coefficient (SE).
Model (1): Simple Nested (2-level). Model (2): Fully Nested (3-level) without Research Operational Variables. Model (3): Fully Nested (3-level) Interaction Model without Research Operational Variables. Model (4): Fully Nested (3-level) with Research Operational Variables.

Predictor | (1) | (2) | (3) | (4)
Intercept | -0.01 (0.08) | 0.26* (0.12) | 0.29* (0.12) | 0.82** (0.30)
Theoretical moderators
  Induction modality | 0.01 (0.04) | 0.03 (0.03) | 0.05 (0.03) | 0.04
  Simulation frequency: Massed | -0.37*** (0.10) | -0.31* (0.13) | -0.35** (0.13) | -0.33* (0.14)
  Simulation frequency: Spaced | 0.72*** (0.18) | 0.85*** (0.21) | 0.87*** (0.22) | 0.78*** (0.22)
  Familiarity | 0.001* (0.01) | 0.001 (0.01) | 0.005* (0.003) | 0.01 (0.01)
  Simulation target: Food | 0.08 (0.06) | 0.08 (0.06) | 0.06 (0.06) | 0.07 (0.06)
  Simulation target: Health | 0.21 (0.16) | 0.17 (0.19) | 0.15 (0.19) | 0.14 (0.21)
  Simulation target: Material | 0.05 (0.06) | 0.09 (0.08) | 0.05 (0.08) | 0.10 (0.08)
  Context | 0.01 (0.04) | -0.03 (0.07) | -0.03 (0.08) | -0.13 (0.08)
  Focus | 0.01 (0.07) | 0.03 (0.07) | -0.04 (0.08) | -0.14 (0.21)
  Population: General population | 0.21* (0.07) | 0.21* (0.09) | 0.21* (0.09) | 0.24* (0.11)
  Population: Students | 0.08 (0.05) | 0.05 (0.07) | 0.002 (0.07) | 0.01 (0.08)
Theoretical outcomes
  Behavior | -0.11 (0.09) | -0.22* (0.09) | -0.14 (0.10) | -0.39** (0.14)
  Attitude | -0.03 (0.05) | 0.03 (0.03) | -0.006 (0.03) | 0.04 (0.03)
Familiarity interactions
  Fam X Induction modality | | | -0.004 (0.02) |
  Fam X Behavior | | | -0.01 (0.01) |
  Fam X Attitude | | | 0.005** (0.002) |
Research operational factors
  Publication | | | | -0.05 (0.09)
  Journal field | | | | -0.18 (0.15)
  Year | | | | 0.01 (0.01)
  Country | | | | -0.13 (0.12)
  Scale | | | | -0.16 (0.12)
  Multi item | | | | -0.18* (0.07)
Model fit
  AIC | 111.82 | 63.96 | 59.42 | 66.57
  BIC | 159.04 | 114.32 | 118.88 | 138.01
  df | 15 | 16 | 19 | 23