
Running head: BELIEF DYNAMICS FRAMEWORK

Integrating Social and Cognitive Aspects of Belief Dynamics:

Towards a Unifying Framework

Mirta Galesic1,2*, Henrik Olsson1, Jonas Dalege1, Tamara van der Does1, & Daniel L.
Stein3,4

1Santa Fe Institute, Santa Fe, New Mexico; 2Complexity Science Hub Vienna, Austria; 3Department of Physics and Courant Institute of Mathematical Sciences; 4NYU-ECNU Institutes of Physics and Mathematical Sciences at NYU Shanghai, Shanghai, China. *Santa Fe Institute, 1399 Hyde Park Rd, Santa Fe, NM 87501. E-mail:
galesic@santafe.edu

Abstract
Belief change and spread have been studied in many disciplines, from psychology, sociology, economics, and philosophy to computer science and statistical physics, but we still do not have a firm grasp on why some beliefs change and spread more easily than others. To fully capture the complex social-cognitive system that gives rise to belief dynamics, we need to integrate the findings of these disciplines into a single framework. At present, communication between disciplines is limited, and there is a lack of theoretical comparisons and empirical tests of the many different models of belief dynamics. Here we first review insights about structural components and processes of belief dynamics studied in different disciplines, focusing particularly on previously neglected but important areas such as cognitive representations and the strategies used to integrate information. We then outline a unifying framework that enables theoretical and empirical comparisons of different belief dynamics models. The framework is quantified using a statistical physics formalism, grounded in cognitive and social theory as well as empirical observations. We show how the framework can be used to integrate numerous previous models and develop a more comprehensive science of belief dynamics.
Keywords: belief dynamics models, model comparison, statistical physics, theoretical framework

Why do individuals change some beliefs quickly in light of new information while
they fiercely resist changing other beliefs? Why do some beliefs spread faster than others in
societies? Why do societies sometimes come to consensus about an issue and at other times
splinter into clusters with widely different beliefs? Questions about belief dynamics have
long occupied scholars across a wide range of disciplines and have been studied on
individual and societal levels. At the individual level, researchers in psychology and
cognitive science, evolutionary anthropology, political science, and philosophy have been
studying cognitive processes that underlie belief dynamics, including individual and social
learning, social judgment and decision making, social influence, and belief formation and
change. At the societal level, researchers in disciplines ranging from sociology and
economics to statistical physics and computer science have been focusing on the spread of
beliefs in social networks, and the two-way relationship between the structure of these
networks and people’s beliefs.
While these efforts have considerably improved our understanding of belief dynamics, a comprehensive grasp of the processes and outcomes of belief dynamics, and of how they can be influenced, remains elusive for at least three reasons. First, while
most researchers agree that beliefs form and change through a dynamic interplay of social
network structure and cognitive processes, most disciplines focus predominantly on only
one of these aspects. This makes it difficult to adequately capture the underlying social-
cognitive complex system, leading to crucial gaps in our understanding of belief dynamics.
For example, while models of belief dynamics can predict the emergence of full consensus
or strong polarization 1 , they generally cannot predict for which beliefs a society will reach
consensus and for which beliefs polarization will increase. Similarly, these models often
cannot predict why some beliefs are easier to change than others. Here we propose that the
key to understanding these puzzles might be to integrate social and cognitive components
of the complex system underlying belief dynamics into a single unifying framework, as
illustrated in Figure 1.
The second reason for a lack of understanding of belief dynamics is that there is
little theoretical and empirical comparison of different models, both within and especially
between disciplines (for a rare exception, see 2). Many models are developed on an abstract
level making empirical measurement difficult, or they remain untested because they are
developed within fields that do not traditionally conduct empirical studies. Coupled with
relatively scarce communication between different fields, this leads to a relatively slow buildup of fundamental principles of belief dynamics. We therefore review insights from
different disciplines organized according to the structural and process components of belief
dynamics identified in Figure 1. We go beyond several excellent reviews that focus mostly
on developments within a single discipline 1,3–6 and follow examples of reviews that cross
interdisciplinary boundaries 7–10. In particular, we review research efforts in areas that are
essential to understanding the complex system underlying belief dynamics but are typically
not considered in this literature, such as the work on cognitive representations of beliefs
and social networks, and on the many simple strategies people have been shown to use to
integrate information about their own and others’ beliefs.
Third, there is a lack of quantitative modeling frameworks that could be used to
adequately integrate both cognitive and social network aspects and make empirically
testable predictions about belief dynamics. Typically, models rich in cognitive and social
detail tend to be formulated verbally, while quantitative models tend to lack sufficient
realism. Agent based models are well positioned to fill this gap 11–14. Here we propose a
quantitative framework that is rich in cognitive and social detail yet also simple enough to
provide quantitative predictions for integrating these details. The framework integrates the
main structural and process components of belief dynamics and enables systematic
theoretical and empirical comparisons of different implementations (Box 1 and Figure 2).

Main Components of Belief Dynamics

Figure 1. Main structural and process components of a complex system framework of belief dynamics. Individuals interact with their social networks to form representations
of their own beliefs and beliefs of others. Cognitive and social dissonance experienced
because of differences between one's own and others’ beliefs can be resolved by updating
one’s beliefs or network connections. This in turn modifies the social network, starting
another round of belief dynamics.

1. Structural Components
Structurally, the complex system underlying belief dynamics consists of two main
components: individual beliefs and their connections, and people’s social contacts and the
relationships between them.
1.1. Individual Beliefs
Individual beliefs are an essential component of belief dynamics, but there is little
consensus on how they should be represented and measured. Here we adopt an inclusive
definition of beliefs that encompasses conceptualizations used in different fields, including
beliefs as assumptions about states of the world (e.g., “vaccination is safe for children”, “I
believe there is a 30% chance of rain tomorrow”), views on moral and political issues (e.g.,
“genetically modified foods are against nature”, “a free market is good for everyone”),
evaluations or cognitive aspects of attitudes (e.g., “cats are fun”, “this politician is
trustworthy”; 15,16), or as one's own preferences (e.g., “I prefer having more time to earning more
money”, “I like summers more than winters”). Support for a belief can be measured on
different scales, from dichotomous true-false or agree-disagree, to rating scales with several
points, to subjective probabilities and probability distributions. Different domains
emphasize different ways of measurement: rating scales are often used in social psychology
and political science 17,18, while subjective probabilities are relatively more often used in
economics 19–21, philosophy 22,23, artificial intelligence 24, and judgment and decision
making 25–28. Some theories in the attitude literature combine both evaluations of beliefs
and their subjective probabilities 29–32.
Extant models of belief dynamics typically consider changes of a single belief over
time, but beliefs are associated with other beliefs which develop and change together, thus
forming a belief network. Although the idea of belief networks has a long history in
psychology 33,34, formal theories that can provide predictions of how the network structures
of individual beliefs affect belief dynamics and related behaviors have only been developed
relatively recently 35–37. For example, Dalege and his colleagues 38,39 have developed a
quantitative model of belief networks that underlie people’s attitudes towards different
issues. They use a statistical physics framework to formalize the dynamic interactions
between different beliefs related to a particular attitude object. They show that the structure
of belief networks can have important consequences for the likelihood of belief change
(e.g., strongly connected belief networks are more stable than weakly connected belief
networks; 40). Other models combine the structure of the internal belief network and the
social network. For example, Rodriguez and colleagues 41 developed a model of belief dynamics where each
individual is endowed with a network of interacting beliefs that changes through interaction
with other individuals in a social network.
1.2. Social Networks
Information from people’s social environment is an important contributor to the
complexity of belief formation and change. The relevant social environments can include many different social contacts: friends and family, online communities, media figures, companies, educators, health workers, and politicians. These contacts are embedded in ever-evolving social networks that to a large extent determine what information spreads and to whom.
The importance of social networks for understanding how beliefs change has long
been acknowledged, with Durkheim noting as early as 1893 42 the need to study
individuals within their social networks. Moreno introduced the first network visualizations
and methods for measuring them in 1951 43. Since then, a large number of network
properties important for belief dynamics have been identified (for an overview of early
work, see 44; for recent work, see 5,45–48). Some of the most frequently studied properties are
various measures of centrality (i.e., structural importance) of individuals, strength and
length of paths between them (e.g., how long does it take on average for information to
flow from one individual to another individual in the social network), balance of
relationships in which they are embedded (e.g., are two friends of an individual generally
also friends?), and homophily (i.e., the tendency to have friends who hold similar beliefs).
For example, homophily can constrain the spread of beliefs and behaviors such as smoking,
obesity, and even divorce 49–51, and the resulting beliefs can influence processes such as
network updating that in turn influence homophily 52. A related concept is echo chambers
in which beliefs are amplified or reinforced by communication to similar others and
repetition of what one already believes 53,54.
2. Process Components
The same structure of a complex social system can give rise to different belief
dynamics: Differences in cognitive and social processes governing people’s representation
of beliefs and social networks, their cognitive and social dissonance, and the way they
update their beliefs and network links can all contribute to different belief dynamics. In the
following sections, we will discuss each of them in turn, even though in reality these
processes often occur in parallel and influence each other.
2.1. Representing Beliefs and Social Networks
Most models of belief dynamics assume that people have stable and accurate
representations of their own beliefs and beliefs of their social contacts. However, research
in cognitive psychology suggests that people often do not have a readily formed belief
about an issue, but must (re)construct it on the spot based on different considerations that
come to mind 55,56. These personal considerations can include other related beliefs, values,
assumptions, and preferences one holds about the issue at hand. Specific considerations
come to mind depending on processes of memory, such as priming and forgetting,
processes of self-deception 57,58, and the inter-relationships between one’s beliefs.
Similarly, people might not have a ready-made representation of what others in their social
networks believe. These representations have to be formed by integrating available social
samples and are affected by memory processes 59,60 as well as by one’s social contacts’
impression management 61 and other signaling strategies 62–65.
The many relevant personal considerations and perceived beliefs of social contacts
need to be integrated into an overall representation of one’s own and others’ beliefs about
an issue. Many disciplines study integration of either personal considerations or
information from social networks, but rarely both, although they might involve similar
processes.
Some approaches to studying integration processes are primarily normative,
suggesting how integration should be done to achieve a coherent set of beliefs. A standard
normative approach is Bayesian learning, which assumes that individuals update their
beliefs optimally given an underlying model of the world that includes their own and
others’ beliefs, in line with Bayesian calculus. Bayesian approaches to information
integration have been intensively explored in artificial intelligence 24, cognitive science
66,67, and economics 20,68. In addition, several logical approaches to information integration have been developed in the context of belief dynamics 69–71. In dynamic epistemic logic 70,71, the updating of knowledge and beliefs takes place through plausibility models with
which individuals consider different worlds as possible candidates for the actual world,
where each world has a plausibility order in relation to the other worlds. In its simplest
form, belief revision occurs by removing non-plausible worlds from the total set of worlds.
Some approaches combine logical integration and specific social network structures 72,73.
These theories include information about what others know and believe and analyze what
brings about knowledge and belief change.
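As a concrete illustration of the Bayesian benchmark, the sketch below (our illustration, not a model from the reviewed literature) updates a binary belief after a series of independent reports from social contacts, assuming every contact reports the true state with a fixed accuracy; the function name and the accuracy value are hypothetical.

```python
def bayesian_update(prior, reports, accuracy=0.7):
    """Posterior probability that a proposition is true after independent 0/1 reports.

    prior    -- prior probability that the proposition is true
    reports  -- iterable of 0/1 endorsements from social contacts
    accuracy -- assumed probability that a contact reports the true state
    """
    posterior = prior
    for r in reports:
        # Likelihood of this report if the proposition is true vs. false.
        like_true = accuracy if r == 1 else 1 - accuracy
        like_false = (1 - accuracy) if r == 1 else accuracy
        numerator = like_true * posterior
        posterior = numerator / (numerator + like_false * (1 - posterior))
    return posterior

# A sceptical prior (0.3) revised after three endorsements and one rejection.
print(bayesian_update(0.3, [1, 1, 0, 1]))
```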
Other approaches to studying integration processes are primarily descriptive,
aiming to explain cognitive strategies. These approaches are motivated by observations that
people tend to use relatively simple strategies that can work with limited information and
cognitive capacity, sometimes approximating the results of normative ideals 5,21,74–77. One
prominent class of strategies are frequency-dependent strategies. Individuals following
these rules update their beliefs to those supported by a certain number or fraction of their
social contacts, or of their personal considerations about an issue. The most studied rule in
this class is the majority rule 78–82 and its plurality variants (for more than two options) that
can be used both for integrating one’s own considerations (e.g., tallying, 83,84) and external
social information 85. Across many disciplines, rules using other thresholds have been
studied empirically and theoretically, including minority rule, unanimity rule, and others
(in political science, 78,79; economics, 86; sociology, 81,87,88; law, 89; organizational science, 90; evolutionary anthropology, 91; animal learning, 92; psychology, 82,93–96; machine learning, e.g., 97; and statistical physics, 80,98).
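A minimal sketch of such a frequency-dependent rule, with the majority rule as the default threshold (our illustration; the function and parameter names are hypothetical):

```python
from collections import Counter

def frequency_dependent_update(own_belief, contact_beliefs, threshold=0.5):
    """Adopt the most common belief among contacts if its share exceeds the threshold.

    threshold = 0.5 corresponds to a simple majority rule; other thresholds give
    minority, (near-)unanimity, and other frequency-dependent variants.
    """
    if not contact_beliefs:
        return own_belief
    belief, count = Counter(contact_beliefs).most_common(1)[0]
    if count / len(contact_beliefs) > threshold:
        return belief
    return own_belief

# Two of three contacts agree, so the agent switches under the majority rule.
print(frequency_dependent_update("disagree", ["agree", "agree", "disagree"]))
```

The same function can be read as tallying one's own considerations rather than social contacts' beliefs.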
Another prominent class of descriptive approaches for information integration are
averaging strategies (and the closely related weighted additive rules). In psychology, they
have been studied as rules for integrating personal considerations about an issue using
different weighting schemes 84,99, as well as for integrating social information in belief
updating 100, advice taking 101, and social judgment 100,102. In other disciplines they have
been studied mostly for integrating social information. For example, DeGroot updating 21
and its variants 103 studied in sociology and economics involve weighted averaging of one’s
own and neighbors’ beliefs 51,104–107. Several rules studied in statistical physics, such as
voter rule, Ising and Potts models, etc., are essentially averaging rules as well 108–110.
Random updating or “unbiased” copying that is often used as a benchmark in evolutionary
anthropology is also an averaging rule in the long run 111. Epidemiological models of belief
dynamics, where individuals are described with a certain probability of getting “infected
with” and “recovering from” a belief can also be thought of as examples of averaging 112.
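The core of these averaging rules can be sketched as a single DeGroot-style step in which each individual takes a weighted average of her own and her neighbors' beliefs; the weight values below are illustrative, not estimated from data.

```python
import numpy as np

def degroot_step(beliefs, weights):
    """One round of DeGroot-style updating: each agent averages the beliefs she listens to.

    beliefs -- length-N array of current belief values (e.g., support on a 0-1 scale)
    weights -- N x N row-stochastic matrix; weights[i, j] is how much agent i weights agent j
    """
    return weights @ beliefs

beliefs = np.array([0.9, 0.2, 0.5])
weights = np.array([[0.5, 0.5, 0.0],   # agent 0 listens to herself and agent 1
                    [0.3, 0.4, 0.3],   # agent 1 listens to everyone
                    [0.0, 0.5, 0.5]])  # agent 2 listens to agent 1 and herself
for _ in range(20):
    beliefs = degroot_step(beliefs, weights)
print(beliefs)  # repeated averaging drives the group toward a common value
```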
A final prominent group of descriptive approaches to information integration are
various non-compensatory strategies. Unlike frequency-dependent and averaging rules,
they do not take into account all personal considerations or social contacts, but choose just
one or a smaller subset based on particular properties of these considerations or contacts.
For example, when people follow a take-the-best rule, they rely on a single best
consideration, disregarding all others 113. Similarly, they might disregard other
considerations when following memory-based cues such as recognition 113, perceptual or
memory fluency 114–116, or familiarity 117,118. When it comes to integrating social information, many non-compensatory model-based strategies have been described 119–121.
Prominent examples are strategies that entail following a leader, a perceived expert, a close
friend, a family member, a successful individual, or the most trusted individual. Variants of
model-based strategies also exist in bounded confidence models in which an individual is
more likely to interact and copy a person with similar opinions. As the similarity of
opinions changes over time, so does the likelihood that any two individuals will influence
one another 122–124. The classic model of dissemination of culture by Axelrod 125 uses
similar assumptions: probability of interaction between two agents is proportional to the
similarity of opinions of these agents.
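As one example of how similarity can gate influence, here is a minimal bounded-confidence-style pairwise interaction in the spirit of the models cited above 122–124; the parameter names epsilon (confidence threshold) and mu (convergence rate) follow common usage, but the specific values are illustrative.

```python
import random

def bounded_confidence_step(opinions, epsilon=0.2, mu=0.5):
    """One pairwise interaction: two random agents compromise only if their
    continuous opinions already differ by less than epsilon."""
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < epsilon:
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift

# Repeated interactions typically leave one or more opinion clusters,
# depending on the confidence threshold epsilon.
opinions = [random.random() for _ in range(100)]
for _ in range(20000):
    bounded_confidence_step(opinions)
```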
All of these simple strategies are plausible ways of integrating information both
when forming one’s own beliefs and when forming beliefs about others, and most likely
people use all of them in different contexts. However, current models of belief dynamics
typically assume only one type of strategy (e.g., a majority rule or simple averaging), and
there is a lack of theoretical and empirical comparisons of signatures of different strategies.
The framework we propose in Section 3 enables such comparisons.
2.2. Cognitive and Social Dissonance
If one’s beliefs are inconsistent with each other or if one perceives to have different
beliefs than one’s social contacts, one might experience cognitive and/or social dissonance
34,126. Cognitive dissonance is often expressed in terms of inconsistency or imbalance
between different beliefs people have 34,127. Formal models of cognitive dissonance have
been implemented within the parallel constraint satisfaction framework 35 and related
connectionist frameworks 36,37, as well as within a statistical physics framework 38–40.
Social dissonance can also be represented as imbalance in perceived beliefs in one’s overall
social network 128.
Even though dissonance might be strong, it does not have to result in a
corresponding belief change. Actual belief change will depend on how much attention one
pays to the inconsistent beliefs, which in turn depends on the presence of other distractors,
noise in the perception process, and subjective importance of the beliefs 34,129. A related
subjective feeling is felt dissonance or felt ambivalence 129–131, a psychological state that
has been described as being uncomfortable, uneasy, and bothered 34,36,127,128,132–134.
2.3. Belief and Network Updating
People might resolve cognitive or social dissonance in at least two ways. The first is
to change their own beliefs to fit better with their other beliefs and social contacts. When
relationships with other beliefs or individuals are positive, one is likely to adjust her beliefs
to be more in line with these other beliefs or individuals. When these relationships are
negative, a backlash can occur, whereby people make their beliefs as dissimilar as possible
from beliefs and individuals they have negative relationships with 40,126,135. Alternatively,
people can try to resolve the dissonance by changing the connections between their
different beliefs, and/or between themselves and their social contacts 133,136. In general,
people tend to prefer socializing with others who are similar to them 51,137,138, and even
when this preference is very subtle, it can lead to strong segregation patterns in a society
139,140.
There is substantially less research on network updating strategies than on belief
updating strategies, in particular when it comes to updating links between one’s own
beliefs. Models of belief dynamics with a social network updating component started to
occur relatively recently, partially building on the literature studying behavioral games on
adaptive networks 141. For example, Holme and Newman 142 develop a coevolving voter
model where a single parameter determines whether an individual adopts a belief of a
random neighbor, or replaces one of her social connections with a connection to an
individual with the same belief she has. Recent empirical and modelling work has also
shown that individuals in social networks can dynamically modify their local connections
in an adaptive way in response to changing environments, making individual and collective
beliefs more accurate.143 In any case, updated beliefs and network links change the
structure of the social networks in which people are embedded, starting another round of
belief dynamics.
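The coevolution of beliefs and links can be sketched as follows, loosely following the verbal description of the Holme and Newman model 142; the rewiring details (e.g., how a like-minded target is chosen) and the use of the networkx library are our simplifying assumptions.

```python
import random
import networkx as nx

def coevolving_voter_step(G, beliefs, phi=0.3):
    """One update: with probability phi the focal agent rewires a link to a
    randomly chosen like-minded individual; otherwise she copies a random
    neighbor's belief."""
    i = random.choice(list(G.nodes))
    neighbors = list(G.neighbors(i))
    if not neighbors:
        return
    j = random.choice(neighbors)
    if random.random() < phi:
        like_minded = [k for k in G.nodes
                       if beliefs[k] == beliefs[i] and k != i and not G.has_edge(i, k)]
        if like_minded:
            G.remove_edge(i, j)
            G.add_edge(i, random.choice(like_minded))
    else:
        beliefs[i] = beliefs[j]

# A small random network with two belief states.
G = nx.erdos_renyi_graph(50, 0.1)
beliefs = {n: random.choice([0, 1]) for n in G.nodes}
for _ in range(5000):
    coevolving_voter_step(G, beliefs)
```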
3. An Integrative Framework for Components of Belief Dynamics
In this section we propose a unifying quantitative framework that builds upon the
disparate models and empirical observations reviewed in the previous sections. The
framework uses a statistical physics formalism 1 grounded in cognitive and social theory
and observations 39,144,145 to enable constructing and comparing models with different
assumptions about individual and social factors that influence belief formation, change, and
spread. Its main advantage is that it can express a large number of different models and
reproduce a variety of empirically observed patterns of belief spread, using only a few
components. Each of its variables and parameters can be either directly observed or
estimated from empirical data, and can be given a plausible psychological meaning
reflecting factors that have been recognized as important for belief dynamics. In what
follows, we outline how our framework incorporates the main aspects of belief dynamics
reviewed in the previous sections.
3.1. Structural Components
Individual Beliefs. We adopt a widely used statistical physics approach 1, where the
focal belief under consideration by individual i is represented by a discrete-valued variable
σ𝑖 that is traditionally (for historical reasons) called a spin. Depending on the issue in
question, the variable σ𝑖 can take anywhere from two (yes/no, agree/disagree) to many
(e.g., ranging from strongly agree to strongly disagree) values corresponding to different
belief states. Belief states can be measured by attitude questions or approximated from
behaviors (e.g., voting or purchases). The spin σ𝑖 is usually treated as a scalar variable
which can change its state among a discrete or continuous set of possibilities over time, but
at any given time is found in one state only. We introduce a more general vector approach
(See Box 1), which includes the usual scalar spin models as a special case. Vector models
of cultural dynamics have been studied by others 125, but the nature and implementation of
those are different from that discussed here. Our conceptualization of beliefs is also similar
to quantum and Markov models of belief change during evidence accumulation described
by Busemeyer and colleagues 146. They, however, do not include the social network
component and are typically applied to relatively short time scales.
Consider an example of our general vector approach in which an individual fills out
a survey and is asked to choose between five possible belief states regarding the issue in
question: strongly agree, agree, neutral/don't know, disagree, strongly disagree. We define
a vector $\vec{\sigma}_i$ with (in this example) five components that describe probabilities or weights assigned to each of the belief states, so the sum of all components of $\vec{\sigma}_i$ at any time is equal to one. For example, we might answer “agree” to a survey question, but might feel our true belief state is somewhere in between “agree” and “strongly agree”, in which case the vector (0.25, 0.75, 0, 0, 0) is a more accurate representation. This framework
includes standard scalar-spin statistical physics models as a special case, in which σ𝑖 can
still be in only one of five possible states, now written as (1,0,0,0,0), (0,1,0,0,0),
(0,0,1,0,0), (0,0,0,1,0), and (0,0,0,0,1). Depending on how fine-grained the different
choices are, the framework can accommodate both discrete (or categorical) and continuous
representations of beliefs. For example, one’s beliefs about the safety of genetically
modified food can be described as a probability of choosing each of the points on a scale
from very unsafe to very safe; and one’s beliefs about different political options might be
described as 20% support for party A and 80% for party B.
Social Networks. Individuals are represented by nodes on a social network.
Weighted links connect two nodes, i and j (or neighbors) and represent the possibility that
information can flow between the two. The strength and sign of the interaction between
two nodes is denoted by a positive or a negative link weight. A positive link weight
between two neighbors indicates a tendency for individuals 𝑖 and 𝑗 to want to agree, while
a negative coupling indicates that a change in 𝑖’s belief induces a change in the opposite
direction in 𝑗. Moreover, the graph can be directed, meaning that the communication
between two individuals is only one-way or else two-way but with different link weights in
each direction. These social networks can be measured objectively using sociometric
methods or subjectively using people's perceived associations between their social contacts.
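In practice, such a network can be stored as a weighted, signed, and possibly asymmetric adjacency (link-weight) matrix; the small example below uses made-up values purely to illustrate the representation.

```python
import numpy as np

# L[i, j] is the weight of the link from j to i: positive weights pull i toward
# agreement with j, negative weights push toward disagreement, zero means no
# direct influence. Asymmetry (L[i, j] != L[j, i]) encodes directed communication.
L = np.array([
    [0.0,  0.6, -0.2],   # individual 0 is attracted to 1 and repelled by 2
    [0.5,  0.0,  0.0],   # individual 1 listens only to 0
    [0.0,  0.3,  0.0],   # individual 2 listens only to 1
])
assert L[0, 2] != L[2, 0]  # directed: influence need not be reciprocal
```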
3.2. Process Components
Representing Beliefs and Social Networks. Starting from some initial belief states,
beliefs evolve in time according to specific dynamical rules. These rules depend on
interactions with both the internal cognitive beliefs that individuals hold and with the
perceived beliefs of others around them. As mentioned, the focal belief under consideration
is represented by the vector $\vec{\sigma}_i$. The framework also includes other internal beliefs related to that focal belief, which are assumed to form one's cognitive field $\vec{h}_i^{cog}$, as well as perceived beliefs of one's social contacts, assumed to form one's social field $\vec{h}_i^{soc}$ (see Box 1).
An individual can use different strategies (e.g., frequency-dependent or averaging rules) to integrate internal beliefs into her cognitive field $\vec{h}_i^{cog}$, and perceived social contacts' beliefs into her social field $\vec{h}_i^{soc}$ (see Box 1). The choice of strategy will depend on the particular task and cultural context and one's own cognitive capacities and personal
preferences. These strategies can be probed directly through survey and experimental
measurement, or can be inferred by comparing predictions of models implemented in the
framework.
Cognitive and Social Dissonance. When a belief does not fit well with one’s
cognitive and/or social field, one can experience cognitive and/or social dissonance. In our framework, these dissonances, $d_i^{cog}$ and $d_i^{soc}$, correspond to the squared difference between
probability (or support) distributions that are currently assigned to different beliefs. This
difference is then used to calculate a new belief distribution that would best minimize the
dissonance, using a Boltzmann entropy framework (equivalent to the softmax function
often used in decision theory, Box 1).
While attempting to minimize dissonance, individuals can experience a tension
between minimizing their cognitive and social dissonance 34,126. For different individuals
and issues, one or the other source of dissonance can be more important. In our framework
this is represented by the parameter w, which weights the social relative to the cognitive
dissonance (Box 1).
The effect of dissonance on the evolution of beliefs is greatest for those who have
the most certainty about their own beliefs and those of their neighbors. In our framework,
this certainty is modeled by the (inverse of the) parameter β. This parameter captures
random error in updating processes that can occur in at least three different stages of belief
dynamics. People can be distracted and make mistakes in perceiving their own and others’
beliefs. They can also over- or under-estimate the dissonance between their different
beliefs. And they can waver in their beliefs and not adhere completely to achieving the
lowest dissonance state possible at every time step. All of these errors result in a certain
amount of noise and unpredictability. This is represented as temperature in statistical
physics models, with zero temperature corresponding to the complete absence of noise. In
our model, temperature corresponds to 1/β (Box 1). The higher the temperature (i.e., the
smaller β is), the greater the level of uncertainty and therefore the less of an effect
dissonance has on belief updating.
Both parameters w and β can be estimated from the data or approximated by
empirical measurements of subjective relative importance of staying aligned with one’s
own or with one’s social network’s beliefs (for w), and of attention and subjective certainty
(for β).
Belief and Network Updating. Dissonance can be reduced by modifying one's own
beliefs, but it can also be alleviated by changing or cutting links to beliefs and individuals
that disagree with one's firmly held belief. Our framework allows for modeling different
processes of belief and network updating, which can include random rewiring of edges,
rewiring based on some threshold of similarity between any two spins, rewiring aimed at
preserving balance in triads of spins or the overall network, and other processes.
3.3. Implementing and Testing Models Within the Framework
Models formulated within the framework can reproduce commonly observed
outcomes of belief dynamics as well as empirically measured beliefs about diverse real-
world issues. Figure 2A shows predictions of simple models that reproduce several
common phenomena studied in belief dynamics models: consensus, clustering, and
polarization. What ultimately matters most is how well these models explain empirically
observed belief trends. Figure 2B and 2C show results from two empirical studies
investigating beliefs people have about genetically modified (GM) food and their political
preferences. Our framework enables comparison of models using different assumptions
about social networks, strategies for belief integration, relative importance of cognitive and
social dissonance, and noise in the updating process. As described in the caption of Figures
2B and 2C (also see Appendix), some models describe empirical trends better than others,
allowing insights into belief dynamics in different contexts. This kind of modeling could be
used to assess in advance the impacts of different educational interventions designed to
increase public science literacy, as well as to explore and predict societal trends and
understand their underlying mechanisms.
Box 1
Quantitative implementation of the integrative framework for models of belief dynamics

Consider a network with $N$ nodes, each corresponding to an individual. Each node $i$ is connected to $K_i$ other nodes by links which can be directed or undirected, and have unit or other weights. Each node $i$ is assigned a focal belief $\vec{\sigma}_i$, a cognitive field $\vec{h}_i^{cog}$, and a social field $\vec{h}_i^{soc}$. The focal belief is an $m$-component belief state vector $\vec{\sigma}_i = p_{i,\alpha}$, $\alpha = 1, \ldots, m$, where the vector components (indexed by $\alpha$) correspond to $m$ possible belief states on a particular issue. In terms of its $m$ components the vector is written as

$\vec{\sigma}_i = (p_{i1}, p_{i2}, \ldots, p_{im})$   (1)

where $p_{i,\alpha} \in [0,1]$, $\sum_{\alpha=1}^{m} p_{i,\alpha} = 1$, are probabilities or weights assigned to each belief state. (For ease of readability, the time dependence of the quantities $\vec{\sigma}_i$, as well as $d_i^{cog}$, $d_i^{soc}$, and $\vec{H}_i$ to be defined below, is understood.)

We represent an individual's orientation with respect to a given issue by an internal cognitive field 145. This individual belief orientation consists of all of the individual's beliefs relevant for the focal belief under consideration. The cognitive field for individual $i$ is represented by an $m$-component vector $\vec{h}_i^{cog}$. Components of $\vec{h}_i^{cog}$ denote probabilities or weights corresponding to the same belief states as in vector $\vec{\sigma}_i$. Similarly, we represent the perceived orientation of one's social contacts regarding the given issue as an $m$-component vector $\vec{h}_i^{soc}$, whose components denote the probabilities that one's social contacts assume different belief states.

The framework can incorporate different strategies for integrating individual beliefs and perceived social contacts' beliefs into the cognitive and the social field, respectively. For instance, a model can assume that beliefs are formed based on a subset of relevant considerations and social contacts' beliefs, which are integrated into one's cognitive and social fields using different integration strategies. Common examples of strategies include a random updating rule, $\vec{h}_i = \vec{\sigma}_j$, where $\vec{\sigma}_j$ is a randomly chosen one of one's own other beliefs or social contacts, as in classic voter models 108,110; different averaging strategies, $\vec{h}_i = \sum_{j \neq i} l_{ij} \vec{\sigma}_j$, where $\sum_j l_{ij} = 1$ 21,103; majority rule (or, more generally, conformism, $\vec{h}_i = \mathrm{sgn}(\sum_{j} \vec{\sigma}_j)$) and other frequency-dependent strategies 98,111; as well as various non-compensatory and other strategies such as similarity-based rules (as in bounded confidence models 122), importance- or expertise-based rules (as frequently observed in humans and other animals 76,77; $\vec{h}_i = \vec{\sigma}_e$, where the chosen $\vec{\sigma}_e$ stays fixed over time), and validity- or recognition-based rules 75.

A simple but reasonable choice for modelling the cognitive dissonance $d_i^{cog} = d_{i,\alpha}^{cog}$ experienced by individual $i$ when considering a particular belief state $\alpha$ is

$d_{i,\alpha}^{cog} = (\vec{\delta}_\alpha - \vec{h}_i^{cog})^2, \quad \alpha = 1, \ldots, m,$   (2)

where the components of the “sampling vector” $\vec{\delta}_\alpha$ are Kronecker $\delta$-functions:

$\vec{\delta}_\alpha = \delta_{\alpha,\alpha'} = \begin{cases} 1 & \text{if } \alpha' = \alpha \\ 0 & \text{if } \alpha' \neq \alpha \end{cases}$   (3)

The social dissonance $d_i^{soc} = d_{i,\alpha}^{soc}$ is calculated similarly. In (3), $\alpha$ is held fixed and $\alpha'$ runs from 1 to $m$. We employ here the usual vector notation in which the square of a vector is a dot (or inner) product of the vector with itself: $v^2 = v \cdot v$.

To get a probability distribution over possible belief states for each node, we define the total dissonance function for individual $i$ as

$\vec{H}_i = d_i^{cog} + w\, d_i^{soc}$   (5)

or, expressed in terms of components,

$H_{i,\alpha} = d_{i,\alpha}^{cog} + w\, d_{i,\alpha}^{soc}$   (6)

where the parameter $w \in [0, \infty)$ controls the relative weights of cognitive and social inputs.

Finally, we compute the updated components of the probability vector $\vec{\sigma}_i$ from the dissonance function (6) using the softmax function

$p_{i,\alpha} = e^{-\beta H_{i,\alpha}} \Big/ \sum_{\alpha'=1}^{m} e^{-\beta H_{i,\alpha'}},$   (7)

where $\beta \in [0, \infty)$ is a parameter that measures the “noisiness” of the belief updating process: larger $\beta$ values correspond to smaller noise or uncertainty (as noted in the main text, $\beta$ corresponds to an inverse temperature in physics applications).
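To make the formalism in Box 1 concrete, the following sketch (our reading of equations (2)-(7), not the authors' code) computes one belief update for a single individual from given cognitive and social fields; all numerical values are illustrative.

```python
import numpy as np

def update_belief(h_cog, h_soc, w=1.0, beta=5.0):
    """One belief update for a single individual, following Box 1.

    h_cog, h_soc -- m-component cognitive and social fields (probability vectors)
    w            -- weight of social relative to cognitive dissonance
    beta         -- inverse "temperature"; larger beta means less noise
    Returns the updated m-component belief vector sigma_i.
    """
    h_cog = np.asarray(h_cog, dtype=float)
    h_soc = np.asarray(h_soc, dtype=float)
    m = len(h_cog)
    delta = np.eye(m)                            # rows are the sampling vectors delta_alpha, Eq. (3)
    d_cog = ((delta - h_cog) ** 2).sum(axis=1)   # Eq. (2): squared distance to the cognitive field
    d_soc = ((delta - h_soc) ** 2).sum(axis=1)   # analogous social dissonance
    H = d_cog + w * d_soc                        # Eq. (6): total dissonance for each belief state
    weights = np.exp(-beta * (H - H.min()))      # Eq. (7); subtracting H.min() only adds numerical stability
    return weights / weights.sum()

# Five belief states (strongly agree ... strongly disagree): own considerations
# lean toward "agree", while social contacts lean toward disagreement.
h_cog = np.array([0.25, 0.75, 0.0, 0.0, 0.0])
h_soc = np.array([0.0, 0.2, 0.2, 0.3, 0.3])
print(update_belief(h_cog, h_soc, w=2.0, beta=10.0))
```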
Figure 2. Examples of theoretical and empirical tests of models constructed using the framework presented in Box 1 (see Appendix for details). The leftmost subpanels show
the networks used and the node colors represent different belief states. Panel A shows the
results of theoretical simulations with random initial conditions and cognitive field
distributions. Depending on the values of parameters w and 𝛽 (shown at the top of each
graph), patterns resembling consensus, polarization, and clustering occur in otherwise
identical social networks. Panel B presents results from models of empirical changes in
beliefs about safety of GM food, observed in a national longitudinal study that measured
them six times over the course of several months.147 Different models used different rules
for integrating relevant individual beliefs and perceived social contacts’ beliefs into
cognitive and social fields (averaging rule, importance-weighted averaging, or only the
most important belief/contact). Parameters w and 𝛽 were fitted for each time point on half
the participants (middle subpanel), and then used to predict the data for the other half (right
subpanel; see legend for average values of the parameters across all time points). The
model assuming weighted averaging (full red line) proved best in predicting the empirical trends (thick black line). Panel C shows how the framework can be used to
extrapolate empirical survey findings on political preferences and network connections 148
to broader populations. Here we simulated networks of voters in the 2018 elections in Ohio
assuming lower (top subpanels) or higher (bottom subpanels) homophily, and used
different assumptions about how voters integrate their internal preferences and information
from their social circles to predict likely belief dynamics (average or majority rule). The
predictions are compared to actual observed trends in political preferences in a probabilistic
election poll. Assuming a high homophily network leads to better correspondence of
predicted and empirical trends.

4. Conclusion
We reviewed a range of findings and models about belief dynamics from different
disciplines and proposed a quantitative framework that could be used to compare the many
different existing models theoretically and empirically. We hope that this review and
framework will enable integration of important findings so far largely developed
independently in different research fields, and contribute to a better understanding of
dynamics of beliefs about science, health, politics, and other issues arising in human
societies.
Appendix
To implement models within the framework and test them theoretically and
empirically, we start by assigning numerical values to each of the components of the belief
vector 𝜎𝑖 . Our first implementation (Figure 2A) corresponds to theoretical tests of the
framework: we initially assign every individual one of 𝑚 belief states (1,0,0, … ,0,0),
(0,1,0, … ,0,0), ..., (0,0,0, … ,0,1), chosen randomly and independently of all neighbors and the individual's own cognitive field $\vec{h}_i^{cog}$. In all implementations shown here, the belief
updating is done synchronously; at every timestep, all agents simultaneously measure the distance of all possible future belief states (represented by $\vec{\delta}_\alpha$, where $\alpha$ is an index for the different belief states).
The three patterns in Figure 2A, showing consensus, clustering, and polarization,
were produced by varying only the parameters 𝛽 and 𝑤, and keeping the network structure
and the belief updating rule the same. The networks were generated by stochastic block
models with low homophily, where individuals were parts of social circles loosely
connected to each other and their initial beliefs were assigned randomly (Figure 2A,
leftmost subpanel). At each time step, individuals updated their beliefs using a simple
unweighted average of beliefs of their social contacts. When both the weight 𝑤 on the
social dissonance and the noise strength were relatively high (𝑤 = 2, β = 5, left panel),
populations always reached consensus. With the same high weight on social dissonance but
much lower levels of noise (𝑤 = 2, β = 30, middle panel), patterns akin to polarization
emerged in around 10% of the runs. Finally, when social and cognitive dissonances were
weighted equally (𝑤 = 1) and with the same level of low noise (β = 30, right panel),
individuals clustered into subgroups, with similar beliefs within each subgroup but
differing beliefs between subgroups. Further study of different combinations of the
parameters w and β on different social network structures and with different cognitive
strategies will enable comparisons between the predictions of different models.
Figure 2B and 2C show results from two empirical studies investigating belief
trends. Figure 2B shows the results of a study of beliefs about genetically modified (GM)
food, in which participants were asked to report their beliefs about GM food two times in
each of three study waves across several months 147. The initial conditions used here are the
beliefs reported by participants at the start of the survey on the safety of GM food.
Participants also reported twelve other beliefs about moral issues related to GM food 149, as
well as perceived beliefs of different social groups (family, online community, scientists,
doctors, etc.). The underlying social network consists of a series of ego networks where
participants are connected with directed links to each of those social groups (see the
leftmost panel of Figure 2B), weighted by the reported relevance of each of those groups
for participants’ own beliefs about GM food safety. In the second wave participants
received information about recent scientific studies showing no evidence that GM food is
not safe to eat. As a result, most participants’ beliefs became more positive for a while. We
compared these empirical trends with model predictions assuming three different rules for
determining cognitive and social fields based on participants moral beliefs and social
perceptions: simple averaging rule; importance-weighted averaging (where moral beliefs
and social groups were weighted by their reported relevance for participants’ beliefs about
the safety of GM food), and a rule that only took into account the most important moral
belief and social group. The parameters 𝑤 and β were fitted at each time step by a grid
search restricted to half of the participants; these results were then used to predict the
beliefs of the other half. As the middle and right subpanels of Figure 2B show, weighted
averaging proved to be best in prediction of empirical trends. The models correctly
predicted that the greatest increase in positive opinions about GM food occurred
immediately after the intervention, which was followed by a decrease later on.
Finally, Figure 2C presents a study of political preferences as they developed in the
months before the 2018 US elections. We used a probabilistic survey 148 to determine initial
beliefs, internal preferences, and network connections of voters in different US states
(Figure 2C, leftmost panel), and then used our models to predict trajectories of their
support for Republicans, Democrats, other candidates, or for not voting. Voters are
assumed to update their preferences to minimize the dissonance between their own political
leanings (inferred from their vote in previous elections), and their current social
surroundings. We investigated predicted trajectories of their beliefs using two different
rules, unweighted averaging and majority rule, to integrate information from their social
environments. We also compared two different assumptions about their social networks
(Figure 2C, leftmost panel), characterized by either low or high homophily. The results
show that the empirical trends inferred from the probabilistic survey corresponded best to
predictions assuming a network with high homophily. Preferences for voting Republican,
and for not voting at all, were best described by a model assuming unweighted averaging,
while preferences for voting Democratic were best described by a model assuming majority
rule.
References

1. Castellano, C., Fortunato, S. & Loreto, V. Statistical physics of social dynamics. Rev. Mod. Phys. 81, 591–646 (2009).
2. Wiley, J. B., Hunter, J. E., Danes, J. E. & Cohen, S. H. Mathematical models of
attitude change: Change in single attitudes and cognitive structure. (Academic
Press, 1984).
3. Acemoglu, D. & Ozdaglar, A. Opinion dynamics and learning in social networks.
Dyn. Games Appl. 1, 3–49 (2011).
4. Golub, B. & Sadler, E. Learning in social networks. in The Oxford Handbook of the
Economics of Networks (eds. Bramoulle, Y., Galeotti, A. & Rogers, B.) (Oxford
University Press, 2016).
5. Jackson, M. O. Social and economic networks. (Princeton University Press, 2008).
6. Computational social psychology. (Routledge, 2017).
7. Proskurnikov, A. V. & Tempo, R. A tutorial on modeling and analysis of dynamic
social networks. Part I. Annu. Rev. Control 43, 65–79 (2017).
8. Proskurnikov, A. V. & Tempo, R. A tutorial on modeling and analysis of dynamic
social networks. Part II. Annu. Rev. Control 45, 166–190 (2018).
9. Xia, H., Wang, H. & Xuan, Z. Opinion dynamics. Int. J. Knowl. Syst. Sci. 2, 72–91
(2011).
10. Jędrzejewski, A. & Sznajd-Weron, K. Statistical physics of opinion formation: Is it a
SPOOF? Comptes Rendus Phys. 20, 244–261 (2019).
11. Miller, J. H. & Page, S. E. Complex adaptive systems: An introduction to
computational models of social life. (Princeton University Press, 2007).
12. Epstein, J. M. & Axtell, R. L. Growing artificial societies: Social science from the
bottom up. (The Brookings Institution, 1996).
13. Epstein, J. M. Agent_Zero: Toward neurocognitive foundations for generative social
science. (Princeton University Press, 2014).
14. Hammond, R. A. Considerations and best practices in agent-based modeling to inform policy. in Assessing the use of Agent-Based models for tobacco regulation
(The National Academies Press, 2015).
15. Breckler, S. J. Empirical validation of affect, behavior, and cognition as distinct
components of attitude. J. Pers. Soc. Psychol. 47, 1191–1205 (1984).
16. Eagly, A. H. & Chaiken, S. The psychology of attitudes. (Harcourt Brace
Jovanovich, 1993).
17. Allport, G. W. Attitudes: A handbook of social psychology. (Clark University Press,
1935).
18. Campbell, D. T. Social attitudes and other acquired behavioral dispositions. in
Psychology: A study of a science. Study II. Empirical substructure and relations with
other sciences. Volume 6. Investigations of man as socius: Their place in psychology
and the social sciences (ed. Koch, S.) 94–172 (McGraw-Hill, 1963).
19. Banerjee, A. & Fudenberg, D. Word-of-mouth learning. Games Econ. Behav. 46, 1–
22 (2004).
20. Bikhchandani, S., Hirshleifer, D. & Welch, I. A theory of fads, fashion, custom, and
cultural change as informational cascades. J. Polit. Econ. 100, 992–1026 (1992).
21. DeGroot, M. H. Reaching a consensus. J. Am. Stat. Assoc. 69, 118–121 (1974).
22. Dempster, A. P. Upper and lower probability inferences based on a sample from a
finite univariate population. Biometrika 54, 515–528 (1967).
23. Shafer, G. A mathematical theory of evidence. (Princeton University Press, 1976).
24. Pearl, J. Causality. (Cambridge University Press, 2009).
25. de Finetti, B. Logical foundations and measurement of subjective probability. Acta
Psychol. (Amst). 34, 129–145 (1970).
26. Savage, L. J. Elicitation of personal probabilities and expectations. J. Am. Stat.
Assoc. 66, 783–801 (1971).
27. Suppes, P. The measurement of belief. J. R. Stat. Soc. Ser. B 36, 160–175 (1974).
28. Wyer, R. S. & Goldberg, L. A probabilistic analysis of the relationships among
belief and attitudes. Psychol. Rev. 77, 100–120 (1970).
29. Ajzen, I. Attitudes, traits, and actions: Dispositional prediction of behavior in
personality and social psychology. in Advances in Experimental Social Psychology
vol. 20 1–63 (1987).


30. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 50,
179–211 (1991).
31. Ajzen, I. & Fishbein, M. Understanding attitudes and predicting social behavior.
(Prentice Hall, 1980).
32. Fishbein, M. & Ajzen, I. Belief, attitude, intention, and behavior: An introduction to
theory and research. (Addison-Wesley, 1975).
33. Converse, P. E. The nature of belief systems in mass publics. in Ideology and
discontent (Free Press, 1964).
34. Festinger, L. A theory of cognitive dissonance. (Stanford University Press, 1957).
35. Kunda, Z. & Thagard, P. Forming impressions from stereotypes, traits, and
behaviors: A parallel-constraint-satisfaction theory. Psychol. Rev. 103, 284–308
(1996).
36. Monroe, B. M. & Read, S. J. A general connectionist model of attitude structure and
change: The ACS (Attitudes as Constraint Satisfaction) model. Psychol. Rev. 115,
733–759 (2008).
37. Van Overwalle, F. & Siebler, F. A connectionist model of attitude formation and
change. Personal. Soc. Psychol. Rev. 9, 231–274 (2005).
38. Dalege, J. et al. Toward a formalized account of attitudes: The Causal Attitude
Network (CAN) model. Psychol. Rev. 123, 2–22 (2016).
39. Dalege, J., Borsboom, D., van Harreveld, F. & van der Maas, H. L. J. The
Attitudinal Entropy (AE) framework as a general theory of individual attitudes.
Psychol. Inq. 29, 175–193 (2018).
40. Dalege, J., Borsboom, D., van Harreveld, F. & van der Maas, H. L. J. A network
perspective on attitude strength: Testing the connectivity hypothesis. Soc. Psychol.
Personal. Sci. 10, 746–756 (2019).
41. Rodriguez, N., Bollen, J. & Ahn, Y.-Y. Collective dynamics of belief evolution
under cognitive coherence and social conformity. PLoS One 11, e0165910 (2016).
42. Durkheim, E. The division of labour in society (G. Simpson, Trans.). (Free Press, 1933). (Original work published 1893).
43. Moreno, J. L. Sociometry, experimental method and the science of society. (Beacon
House, 1951).
44. Abelson, R. P. Mathematical models in social psychology. Adv. Exp. Soc. Psychol.
3, 1–54 (1967).
45. Albert, R. & Barabási, A.-L. Statistical mechanics of complex networks. Rev. Mod.
Phys. 74, 47–97 (2002).
46. Easley, D. & Kleinberg, J. Networks, crowds, and markets. (Cambridge University
Press, 2010).
47. The structure and dynamics of networks. (Princeton University Press, 2006).
48. Watts, D. J. Six degrees: The science of a connected age. (Norton, 2004).
49. Christakis, N. A. & Fowler, J. H. Connected: The amazing power of social networks
and how they shape our lives. (Harper Collins, 2010).
50. Centola, D. An experimental study of homophily in the adoption of health behavior.
Science 334, 1269–1272 (2011).
51. Golub, B. & Jackson, M. O. How homophily affects the speed of learning and best-
response dynamics. Q. J. Econ. 127, 1287–1338 (2012).
52. Shalizi, C. R. & Thomas, A. C. Homophily and contagion are generically
confounded in observational social network studies. Sociol. Methods Res. 40, 211–
239 (2011).
53. Jasny, L., Waggle, J. & Fisher, D. R. An empirical examination of echo chambers in
US climate policy networks. Nat. Clim. Chang. 5, 782–786 (2015).
54. Key, V. O. The responsible electorate. (Vintage, 1966).
55. Tourangeau, R., Rasinski, K. A. & D’Andrade, R. Attitude structure and belief
accessibility. J. Exp. Soc. Psychol. 27, 48–75 (1991).
56. Judd, C. M., Drake, R. A., Downing, J. W. & Krosnick, J. A. Some dynamic
properties of attitude structures: Context-induced response facilitation and
polarization. J. Pers. Soc. Psychol. 60, 193–202 (1991).
57. Paulhus, D. L. Self-deception and impression management in test responses. in
Personality assessment via questionnaires 143–165 (Springer, 1986).
58. von Hippel, W. & Trivers, R. The evolution and psychology of self-deception.
Behav. Brain Sci. 34, 1–16 (2011).
59. Galesic, M., Olsson, H. & Rieskamp, J. A sampling model of social judgment.
Psychol. Rev. 125, 363–390 (2018).


60. Nisbett, R. E. & Kunda, Z. Perception of social distributions. J. Pers. Soc. Psychol.
48, 297–311 (1985).
61. Leary, M. R. & Kowalski, R. M. Impression management: A literature review and
two-component model. Psychol. Bull. 107, 34–47 (1990).
62. Berger, J. & Heath, C. Who drives divergence? Identity signaling, outgroup
dissimilarity, and the abandonment of cultural tastes. J. Pers. Soc. Psychol. 95, 593–
607 (2008).
63. Leonardelli, G. J., Pickett, C. L. & Brewer, M. B. Optimal distinctiveness theory: A
framework for social identity, social cognition, and intergroup relations. in Advances
in Experimental Social Psychology vol. 43 63–113 (Springer International
Publishing, 2010).
64. McElreath, R., Boyd, R. & Richerson, P. J. Shared norms and the evolution of ethnic
markers. Curr. Anthropol. 44, 122–130 (2003).
65. Smaldino, P. E. Social identity and cooperation in cultural evolution. Behav.
Processes 161, 108–116 (2019).
66. Chater, N., Tenenbaum, J. B. & Yuille, A. Probabilistic models of cognition:
Conceptual foundations. Trends Cogn. Sci. 10, 287–291 (2006).
67. Cook, J. & Lewandowsky, S. Rational irrationality: Modeling climate change belief
polarization using Bayesian networks. Top. Cogn. Sci. 8, 160–179 (2016).
68. Banerjee, A. A simple model of herd behavior. Q. J. Econ. 107, 797–817 (1992).
69. Segerberg, K. Default logic as dynamic doxastic logic. Erkenntnis 50, 333–352
(1999).
70. Benthem, J. van. Logical dynamics of information and interaction. (Cambridge
University Press, 2011).
71. van Ditmarsch, H., van der Hoek, W. & Kooi, B. Dynamic epistemic logic.
(Springer, 2008). doi:10.1007/978-1-4020-5839-4.
72. Baltag, A., Christoff, Z., Rendsvig, R. K. & Smets, S. Dynamic epistemic logics of
diffusion and prediction in social networks. Stud. Log. 107, 489–531 (2019).
73. Seligman, J., Liu, F. & Girard, P. Logic in the community. in Indian Conference on
Logic and Its Applications 178–188 (Springer, 2011). doi:10.1007/978-3-642-
18026-2_15.
74. Petty, R. E. & Cacioppo, J. T. The elaboration likelihood model of persuasion. in
Advances in Experimental Social Psychology vol. 19 123–205 (1986).
75. Gigerenzer, G., Todd, P. M. & the ABC Research Group. Simple heuristics that
make us smart. (Oxford University Press, 1999).
76. Hertwig, R., Hoffrage, U. & the ABC Research Group. Simple heuristics in a social
world. (Oxford University Press, 2012).
77. Hoppitt, W. & Laland, K. N. Social learning: An introduction to mechanisms,
methods, and models. (Princeton University Press, 2013).
78. Condorcet, M. Essai sur l’application de l’analyse à la probabilité des décisions
rendues à la pluralité des voix [Essay on the application of analysis to the
probability of majority decisions]. (Imprimerie Royale, 1785).
79. Grofman, B., Owen, G. & Feld, S. L. Thirteen theorems in search of the truth.
Theory Decis. 15, 261–278 (1983).
80. Galam, S. Sociophysics: A review of Galam models. Int. J. Mod. Phys. C 19, 409–
440 (2008).
81. Rogers, E. M. Diffusion of innovations. (Simon & Shuster, 1995).
82. Hastie, R. & Kameda, T. The robust beauty of majority rules in group decisions.
Psychol. Rev. 112, 494–508 (2005).
83. Einhorn, H. J. & Hogarth, R. M. Unit weighting schemes for decision making.
Organ. Behav. Hum. Perform. 13, 171–192 (1975).
84. Dawes, R. M. The robust beauty of improper linear models in decision making. Am.
Psychol. 34, 571–582 (1979).
85. List, C. & Goodin, R. E. Epistemic democracy: Generalizing the condorcet jury
theorem. J. Polit. Philos. 9, 277–306 (2001).
86. Schelling, T. Micromotives and macrobehavior. (Norton, 1978).
87. Centola, D. & Macy, M. Complex contagions and the weakness of long ties. Am. J.
Sociol. 113, 702–734 (2007).
88. Granovetter, M. Threshold models of collective behavior. Am. J. Sociol. 83, 1420–
1443 (1978).
89. Glaeser, E. L. & Sunstein, C. R. Extremism and social learning. J. Leg. Anal. 1,
263–324 (2009).
90. Berkes, F. Evolution of co-management: Role of knowledge generation, bridging
organizations and social learning. J. Environ. Manage. 90, 1692–1702 (2009).
91. Boyd, R. & Richerson, P. J. The origin and evolution of cultures. (Oxford University
Press, 2005).
92. Kendal, R. L. et al. Social learning strategies: Bridge-building between fields.
Trends Cogn. Sci. 22, 651–665 (2018).
93. Asch, S. E. Forming impressions of personality. J. Abnorm. Soc. Psychol. 41, 258–
290 (1946).
94. Claidière, N. & Whiten, A. Integrating the study of conformity and culture in
humans and nonhuman animals. Psychol. Bull. 138, 126–145 (2012).
95. Moscovici, S. & Zavalloni, M. The group as a polarizer of attitudes. J. Pers. Soc.
Psychol. 12, 125–135 (1969).
96. Sorkin, R. D., West, R. & Robinson, D. E. Group performance depends on the
majority rule. Psychol. Sci. 9, 456–463 (1998).
97. Breiman, L. Bagging predictors. Mach. Learn. 24, 123–140 (1996).
98. Krapivsky, P. L. & Redner, S. Dynamics of majority rule in two-state interacting
spin systems. Phys. Rev. Lett. 90, 238701 (2003).
99. Payne, J. W., Bettman, J. R. & Johnson, E. J. The adaptive decision maker.
(Cambridge University Press, 1993).
100. Anderson, N. H. Integration theory and attitude change. Psychol. Rev. 78, 171–206
(1971).
101. Soll, J. B. & Larrick, R. P. Strategies for revising judgment: How (and how well)
people use others’ opinions. J. Exp. Psychol. Learn. Mem. Cogn. 35, 780–805
(2009).
102. Brehmer, B. Social judgment theory and the analysis of interpersonal conflict.
Psychol. Bull. 83, 985–1003 (1976).
103. Friedkin, N. E. & Johnsen, E. C. Social influence and opinions. J. Math. Sociol. 15,
193–206 (1990).
104. Golub, B. & Jackson, M. O. Naïve learning in social networks and the wisdom of
crowds. Am. Econ. J. Microeconomics 2, 112–149 (2010).
105. Grimm, V. & Mengel, F. Experiments on belief formation in networks. J. Eur. Econ.
Assoc. 18, 49–82 (2020).
106. Acemoglu, D., Ozdaglar, A. & ParandehGheibi, A. Spread of (mis)information in
social networks. Games Econ. Behav. 70, 194–227 (2010).
107. Jadbabaie, A., Molavi, P., Sandroni, A. & Tahbaz-Salehi, A. Non-Bayesian social
learning. Games Econ. Behav. 76, 210–225 (2012).
108. Clifford, P. & Sudbury, A. A model for spatial conflict. Biometrika 60, 581–588
(1973).
109. Holley, R. A. & Liggett, T. M. Ergodic theorems for weakly interacting infinite
systems and the voter model. Ann. Probab. 3, 643–663 (1975).
110. Redner, S. A guide to first-passage processes. (Cambridge University Press, 2001).
111. Boyd, R. & Richerson, P. J. Culture and the evolutionary process. (University of
Chicago Press, 1985).
112. Newman, M. E. J. The structure and function of complex networks. SIAM Rev. 45,
167–256 (2003).
113. Gigerenzer, G. & Goldstein, D. G. Reasoning the fast and frugal way: Models of
bounded rationality. Psychol. Rev. 103, 650–669 (1996).
114. Reber, R. & Schwarz, N. Effects of perceptual fluency on judgments of truth.
Conscious. Cogn. 8, 338–342 (1999).
115. Schooler, L. J. & Hertwig, R. How forgetting aids heuristic inference. Psychol. Rev.
112, 610–628 (2005).
116. Oppenheimer, D. M. The secret life of fluency. Trends Cogn. Sci. 12, 237–241
(2008).
117. Schwarz, N. et al. Ease of retrieval as information: Another look at the availability
heuristic. J. Pers. Soc. Psychol. 61, 195–202 (1991).
118. Weaver, K., Garcia, S. M., Schwarz, N. & Miller, D. T. Inferring the popularity of
an opinion from its familiarity: A repetitive voice can sound like a chorus. J. Pers.
Soc. Psychol. 92, 821–833 (2007).
119. Bandura, A. Psychotherapy as a learning process. Psychol. Bull. 58, 143–159 (1961).
120. Henrich, J. & Gil-White, F. J. The evolution of prestige: Freely conferred deference
as a mechanism for enhancing the benefits of cultural transmission. Evol. Hum.
Behav. 22, 165–196 (2001).
121. Laland, K. N. Social learning strategies. Anim. Learn. Behav. 32, 4–14 (2004).
122. Deffuant, G., Neau, D., Amblard, F. & Weisbuch, G. Mixing beliefs among
interacting agents. Adv. Complex Syst. 03, 87–98 (2000).
123. Hegselmann, R. & Krause, U. Opinion dynamics and bounded confidence: Models,
analysis and simulation. JASSS 5 (2002).
124. Weisbuch, G., Deffuant, G., Amblard, F. & Nadal, J.-P. Meet, discuss, and
segregate! Complexity 7, 55–63 (2002).
125. Axelrod, R. The dissemination of culture: A model with local convergence and
global polarization. J. Conflict Resolut. 41, 203–226 (1997).
126. Festinger, L. A theory of social comparison processes. Hum. Relations 7, 117–140
(1954).
127. Gawronski, B. & Strack, F. Cognitive consistency as a basic principle of social
information processing. in Cognitive consistency: A fundamental principle in social
cognition (eds. Gawronski, B. & Strack, F.) 1–16 (Guilford Press, 2012).
128. Heider, F. Attitudes and cognitive organization. J. Psychol. 21, 107–112 (1946).
129. Newby-Clark, I. R., McGregor, I. & Zanna, M. P. Thinking and caring about
cognitive inconsistency: When and for whom does attitudinal ambivalence feel
uncomfortable? J. Pers. Soc. Psychol. 82, 157–166 (2002).
130. Priester, J. R. & Petty, R. E. The gradual threshold model of ambivalence: Relating
the positive and negative bases of attitudes to subjective ambivalence. J. Pers. Soc.
Psychol. 71, 431–449 (1996).
131. van Harreveld, F., van der Pligt, J. & de Liver, Y. N. The agony of ambivalence and
ways to resolve it: Introducing the MAID model. Personal. Soc. Psychol. Rev. 13,
45–61 (2009).
132. Elliot, A. J. & Devine, P. G. On the motivational nature of cognitive dissonance:
Dissonance as psychological discomfort. J. Pers. Soc. Psychol. 67, 382–394 (1994).
133. Heider, F. The psychology of interpersonal relations. (Wiley, 1958).
134. Shultz, T. R. & Lepper, M. R. Cognitive dissonance reduction as constraint
satisfaction. Psychol. Rev. 103, 219–240 (1996).
135. Schweighofer, S., Schweitzer, F. & Garcia, D. A weighted balance model of opinion
hyperpolarization. JASSS 23, 1 (2020).
136. Cartwright, D. & Harary, F. Structural balance: A generalization of Heider’s theory.
Psychol. Rev. 63, 277–293 (1956).
137. Duggan, M. & Smith, A. The political environment on social media. (2016).
138. McPherson, M., Smith-Lovin, L. & Cook, J. M. Birds of a feather: Homophily in
social networks. Annu. Rev. Sociol. 27, 415–444 (2001).
139. Schelling, T. C. Models of segregation. Am. Econ. Rev. 59, 488–493 (1969).
140. Schelling, T. C. Dynamic models of segregation. J. Math. Sociol. 1, 143–186 (1971).
141. Skyrms, B. & Pemantle, R. A dynamic model of social network formation. Proc.
Natl. Acad. Sci. 97, 9340–9346 (2000).
142. Holme, P. & Newman, M. E. J. Nonequilibrium phase transition in the coevolution
of networks and opinions. Phys. Rev. E 74, 056108 (2006).
143. Almaatouq, A. et al. Adaptive social networks promote the wisdom of crowds. Proc.
Natl. Acad. Sci. 117, 11379–11386 (2020).
144. Belaza, A. M. et al. Statistical physics of balance theory. PLoS One 12, 1–19 (2017).
145. Galesic, M. & Stein, D. L. Statistical physics models of belief dynamics: Theory and
empirical tests. Phys. A Stat. Mech. Its Appl. 519, 275–294 (2019).
146. Busemeyer, J. R., Kvam, P. D. & Pleskac, T. J. Markov versus quantum dynamic
models of belief change during evidence monitoring. Sci. Rep. 9, 18025 (2019).
147. van der Does, T., Galesic, M., Stein, D. L. & Fedorov, N. Moral and social
foundations of beliefs about scientific issues: The case of GM food and vaccination.
(2020).
148. Olsson, H., Bruine de Bruin, W., Galesic, M. & Prelec, D. Bayesian weighting using
wisdom of crowds improves election forecasts. (2020).
149. Graham, J. et al. Mapping the moral domain. J. Pers. Soc. Psychol. 101, 366–385
(2011).
Acknowledgements: MG, HO, and JD were supported in part by grants from the
National Science Foundation (BCS-1918490); MG and TvdD were supported in part by a
grant from the National Institute of Food and Agriculture (NIFA 2018-67023-27677). The
funders had no role in study design, data collection and analysis, decision to publish, or
preparation of the manuscript. Any opinions, findings, and conclusions or recommendations
expressed in this material are those of the authors and do not necessarily reflect the views
of the funding agencies.
Competing interests: The authors declare no competing interests.
