
Received: 15 August 2019 Revised: 23 April 2020 Accepted: 9 May 2020

DOI: 10.1111/mila.12326

SUBMITTED ARTICLE

A tribal mind: Beliefs that signal group identity or commitment

Eric Funkhouser

Philosophy Department, University of Arkansas, Fayetteville, Arkansas

Correspondence
Eric Funkhouser, 318 Old Main, University of Arkansas, Fayetteville, AR 72701.
Email: efunkho@uark.edu

Abstract
People are biased toward beliefs that are welcomed by their in-group. Some beliefs produced by these biases—such as climate change denial and religious belief—can be fruitfully modeled by signaling theory. The idea is that the beliefs function so as to be detected by others and manipulate their behavior, primarily for the benefits that accrue from favorable tribal self-presentation. Signaling theory can explain the etiology, distinctive form, proper function, and alterability of these beliefs.

KEYWORDS
belief, climate change denial, cooperation, group identity, religious belief, signaling

1 | INTRODUCTION

Recent work in epistemology and philosophy of mind has taken a social turn. Psychologists have long known that the mind is shaped for social functioning, but despite our social turn these insights have not fully penetrated philosophical understandings of the mind. I will focus on belief. My central claim is that beliefs often have social functions that can be fruitfully modeled by signaling theory. Theories in this vicinity have been developed by social scientists (Kahan, 2012, 2017; “cultural cognition”), evolutionary biologists (Trivers, 2002; “self-deception”), evolutionary psychologists (Richerson & Boyd, 2005; “symbolic markers”; Bliege-Bird & Smith, 2005; “symbolic capital”), and political scientists (Petersen, 2016; “coalitional psychology,” drawing on the work of Tooby & Cosmides, 2010). But none has articulated and developed a theory of the mind—in particular, socially relevant beliefs—as a signaling system. According to the signaling theory developed here, some beliefs are literally signals designed to be detected by others and manipulate[1] them. This article focuses on beliefs that show us to be

[1] “Manipulation” as used here can be harmful, beneficial, or neutral to the audience. See Krebs and Dawkins (1984) for this liberal use of “manipulation.”

444 © 2020 John Wiley & Sons Ltd wileyonlinelibrary.com/journal/mila Mind Lang. 2022;37:444–464.

cooperative members of a tribe.[2] Signaling theory can explain the etiology, distinctive form,
proper function, and alterability of these beliefs.
Signaling is a kind of presentation, and people often present themselves through their beliefs. Social psychologists study self-presentation under the banner of impression management, which is the goal-directed control of information to influence the impressions formed by an audience in a particular social environment (Schlenker & Pontari, 2000, p. 201). It is well known that there are conscious, deliberate, or effortful forms of impression management that we perform before particular audiences—for example, on a first date or job interview. These efforts often fall into one of two categories: self-enhancement (agentic) or prosociality (communal) (Paulhus & Trapnell, 2008). The driving idea behind belief-signaling is that there are strategic reasons to more broadly project self-enhancement and prosociality, and these forces can shape our belief-forming mechanisms and the function of belief.[3] Such signaling does not require conscious, deliberate, or effortful manipulation.
This article makes a philosophical and empirical case for tribal beliefs that signal group identity or commitment for the sake of cooperation with the tribe. An influential group of social psychologists has argued for in-group biases or social identity theory, claiming that people hold group-favorable beliefs largely so as to regulate self-esteem or the self-concept (Tajfel & Turner, 1979; Turner, Brown, & Tajfel, 1979). I argue for an alternative explanation for some group-favorable beliefs: The beliefs themselves are literal signals that reveal tribal identity or willingness to work with the tribe as a cooperative partner. The novel contribution is in the rigorous application of multidisciplinary signaling theory (Bliege-Bird & Smith, 2005; Krebs & Dawkins, 1984; Maynard-Smith & Harper, 2003; Searcy & Nowicki, 2005; Spence, 1973; Zahavi, 1975) to these in-group biases. The belief-signaling theory is a critical piece to solving the problem of cooperation, filling a gap left by kinship selection and reciprocal altruism in accounting for human cooperation in very large groups of unrelated individuals. Rather than investigating the varied ways in which people control their behavior for favorable presentation to the tribe, I will argue that there is a logic leading to the (typically unconscious) selection of beliefs for tribal presentation. The idea is that the pressure to be socially appropriate can permeate to the beliefs by which we lead our lives and present to others. The incentive is not mere conformity or self-esteem regulation, but belief-signaling for the sake of cooperation with the tribe. Two examples of beliefs that signal group identity or commitment will be investigated: climate change denial and religious beliefs. After considering objections and alternative explanations to belief-signaling, I conclude with lessons to be learned regarding beliefs that function to signal tribal cooperation.

2 | BACKGROUND: SIGNALING AND MANIPULATION

We should start with a basic understanding of signaling, and this formulation will serve our
purposes well.

[2] “Tribe” refers to any self-perceived group affiliation that contributes to one's self-identity. This includes political, religious, racial, cultural, professional, or team affiliations, among other possibilities. The word, while problematic in some ways (e.g., colonial connotations), will be preserved for a few reasons: It is commonly used in the literature on identity protective cognition, it is contained in the commonplace idea of tribal epistemology, and it correctly connotes something fundamental to human psychology.
[3] Signaling pressure can alter mental states besides belief. I will not here discuss the ways that signaling pressure can shape our desires, values, and emotions.

Signal:

1. Any object that is successfully designed or selected to communicate information,
2. so as to be detected by some receiver,
3. in order to modify its behavior. (Funkhouser, 2017, pp. 811–812)

A belief-signal is any belief meeting these conditions: It was specifically designed or selected
because of how it is detected by others and manipulates their behavior. This design or selection
could occur via a process of biological evolution, cultural evolution, or individual learning. The
“selection” is supposed to capture the idea that the belief has a signaling function. It is not just
that the belief (e.g., that human-caused climate change is a myth) is a cue or evidence that, say,
the person is a Republican (Maynard-Smith & Harper, 2003, p. 3). Rather, she has that belief
because it communicates this information.
Belief-signals can always be further specified along these values: S's belief B serves as a signal to audience A of some informational content I, which induces behavioral manipulation M in A. In any particular case of belief-signaling, there can be multiple values for A-I-M. That is, a given belief can signal to diverse audiences, multiple informational contents, which induce multiple behavioral responses. The costs and benefits of these responses, as determined by signaler/receiver preferences, determine the type of signaling game “played.”[4] While B is selected
because of the effect it has on a certain audience A, B is nevertheless a genuinely held belief
that is presented to all audiences. However, it might be more prominently displayed in
A-environments (e.g., when with fellow Republicans).
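To make the A-I-M parametrization concrete, here is a minimal sketch in Python. The class and example values are my own illustration, not the author's formalism; it merely shows how one genuinely held belief B can carry several audience-content-manipulation specifications at once.

```python
from dataclasses import dataclass

@dataclass
class BeliefSignal:
    """One A-I-M specification of a belief-signal (illustrative only)."""
    signaler: str      # S: who holds the belief
    belief: str        # B: the genuinely held belief
    audience: str      # A: who is meant to detect the belief
    information: str   # I: the content communicated to that audience
    manipulation: str  # M: the behavioral response induced in A

# One belief, multiple A-I-M values (hypothetical example):
denial = [
    BeliefSignal("S", "human-caused climate change is a myth",
                 audience="fellow Republicans",
                 information="I am a loyal Republican",
                 manipulation="extend trust and cooperation"),
    BeliefSignal("S", "human-caused climate change is a myth",
                 audience="political opponents",
                 information="I am not one of you",
                 manipulation="establish boundaries of interaction"),
]

# The same belief is presented to all audiences, but it is specified
# separately for each audience-content-manipulation triple:
assert len({s.belief for s in denial}) == 1
assert len({s.audience for s in denial}) == 2
```

The point of the data structure is only that A, I, and M vary independently while B stays fixed, matching the claim that a single belief can play several signaling roles.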
The claim is that beliefs can be signals in the same technical sense that we find with paradigm signals in the animal world or in human culture. A peacock's tail feather is a signal of genetic fitness because it was designed in light of the fact that peahens differentially respond to that attribute and mate with peacocks according to how that information is received (Zahavi, 1975). Or, formal education is a signal of applicant quality in the job market because employers differentially hire applicants based on that information (Spence, 1973). The interplay of detection and behavioral manipulation (plus the high costs—not just economic—of getting an education, which makes it difficult to fake quality) gives rise to the signaling function. So too for some beliefs: They are shaped by how others detect them and differentially respond to us.[5] It is widely accepted that behavior can function as a signal, but the more original and stronger claim is that the underlying beliefs can do so as well. We will need evidence that these beliefs are designed or selected because of how they are detected by and manipulate an audience, typically to the signaler's advantage.
Belief-signaling for the sake of tribal cooperation is a form of self-presentation, but it differs from impression management. Impression management is tailored to a specific social environment, and the behavior does not necessarily reflect the person's genuine beliefs. But the same logic that leads to situation-specific impression management—communicating information about the self to be detected by an audience in order to manipulate their behavior—can lead to more entrenched psychological dispositions, including outright belief. The claim is not that people are consciously planning to adopt these beliefs, but that in typical cases this strategy is

[4] Formal models of signaling games have their structure determined by the signaler's (unobservable) type, signal options, receiver response to signal, and signaler/receiver incentives.
[5] See Funkhouser (2017) for the full argument for this claim. Williams (2020) does a good job making the more general case for what he calls “socially adaptive belief.”

nonconsciously acquired through individual learning, enculturation, or natural selection. This contrasts with some accounts of social signaling, which require purposive and strategic choice (Przepiorka & Berger, 2017). The present account merely requires of a signal that it acquires that function, through whatever means. This more liberal approach fits with interdisciplinary signaling theory, which covers phenomena as diverse as job market signaling and animal signals. Such signals can be conditioned without purposive choice, or they might even be acquired
by natural selection. Indeed, it has been shown that repeated self-presentations in the service of
impression management become habitual—they transform into self-deceptive beliefs
(Hogan, 1983). Similarly, Robert Trivers (2002) argues that we self-deceive in order to better
deceive and manipulate others. The main benefits of outright belief over controlled impression
management are twofold:

1. Genuine belief allows the person to reap the benefits of automaticity and effortlessness. There is no need for dual representations or conscious calculation about how to appear to others. This reduces cognitive load and makes missed opportunities for positive self-presentation less likely.
2. Since the person is not being intentionally deceitful, there is less risk of retaliation for misrepresentation. Liars are punished more than the mistaken.

There is a well-known worry that this logic could incentivize the dishonest signaling of all sorts of false beliefs. But since, in this coevolutionary battle, the audience is incentivized to ignore dishonest signals, the system should collapse (Krebs & Dawkins, 1984). Yet, signaling systems exist because most signals are honest or at least worth attending to. The most well-known mechanism to ensure honesty is costly signaling (Spence, 1973; Zahavi, 1975). The idea here is that the signal is costly to produce, so only those who genuinely have the advertised trait can afford to produce the signal (e.g., only people who are genuinely affluent can afford to purchase designer brands). But costly signaling is not the only honesty-inducing mechanism. Honesty can also arise due to common interest, intrinsic difficulty of faking the signal, detection and punishment of cheaters, and the promise of future benefits (Przepiorka & Berger, 2017; Searcy & Nowicki, 2005). The latter kinds of honesty mechanisms supplement, if not replace, costly signaling for tribal beliefs.
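The honesty condition behind costly signaling can be put in a toy numerical form. The payoff numbers below are my own illustrative assumptions, not drawn from the article or the cited literature: the signal separates honest from dishonest types when its cost is affordable for the genuine type but not for the faker.

```python
# Toy costly-signaling check (illustrative numbers only).
# A signal supports honesty when producing it is worthwhile for those
# who genuinely have the advertised trait but not for fakers.

TRUST_BENEFIT = 10.0  # hypothetical value of being treated as a loyal tribe member

def net_payoff(benefit: float, cost: float) -> float:
    """Signaler's net payoff from producing the signal."""
    return benefit - cost

# Holding the tribe-characteristic belief is cheap for genuine partisans
# but expensive for outsiders (ostracism from their own group, epistemic
# distrust):
cost_for_genuine = 2.0
cost_for_faker = 15.0

assert net_payoff(TRUST_BENEFIT, cost_for_genuine) > 0  # genuine types signal
assert net_payoff(TRUST_BENEFIT, cost_for_faker) < 0    # fakers are priced out
```

Because fakers face a negative net payoff, only genuine types produce the signal, which is what keeps it informative; the other honesty mechanisms listed above work by altering these same cost and benefit terms.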
We typically understand the audience, A, to be distinct from the signaler, S. Yet, a complication worth considering is that sometimes beliefs function as signals to the self (S = A) for self-manipulation (Mijovic-Prelec & Prelec, 2010). It may be that certain tribe-characteristic beliefs have a self-signaling function for reassurance as to one's tribal identity.

3 | SIGNALING TRIBAL IDENTITY OR COMMITMENT

In his novel Cat's Cradle, Kurt Vonnegut introduced the term “granfalloon” to describe “a seeming team that was meaningless in terms of the ways God gets things done…examples of granfalloons are the Communist Party, the Daughters of the American Revolution, the General Electric Company, the International Order of Odd Fellows—and any nation, anytime, anywhere” (Vonnegut, 1963/2010, pp. 91–92). Vonnegut seemed to think that it was foolish to give much weight to arbitrary or shallow associations. But research has shown that not only are such associations robust, but there is a wisdom to them when it comes to social behavior and belief.
McElreath, Boyd, and Richerson (2003) developed a mathematical model showing that ethnic

markers can evolve to signal strategies and beliefs that are adaptive for local environments. Our tribal-signaling proposal argues in the opposite direction: Some tribe-characteristic beliefs signal ethnic or tribal affiliation (Richerson & Boyd, 2005, p. 212). Significantly, a belief system need not have a distinctively moral or prosocial character to mark off a tribal identity that encourages in-group cooperation. It is well-established that almost any arbitrary marker, including belief systems, can function in this manner, especially in competitive environments (Brewer, 1979). The tribal identity could be as frivolous as being a New York Yankees fan, iPhone user, or Hoosier.
Our specific claim is that beliefs reflecting tribal identity or commitment can signal for cooperative or other prosocial ends. Two major categories of informational content (I) are in play here. First, tribal-characteristic beliefs can signal intentions and behavioral strategies. These give tribal members a reason to trust us and predict our behavior. Second, these beliefs can signal that we are worthy of receiving benefits from the tribe (and perhaps others). The social sciences have discovered signals that function in these very ways. Nelissen and Meijers (2011), drawing on Veblen's (1899/1994) theory of conspicuous consumption, showed that wearing luxury brands results in more compliance and generosity from others. They interpreted the luxury brands as costly signals of social status—since the clothes are so expensive, it is likely that the person wearing them really does have significant financial resources. The benefits received by the confederates in those studies (e.g., better evaluations, more pay) are supposedly due to the signaled social status which, in effect, shows that they are worthy. In their studies, the benefits were garnered from nontribal members, as well. Subsequent research found evidence that some social signaling—luxury brands, again, and proenvironmentalism—sometimes induced favorable treatment only among tribal peers. Similarly, nontribal members (i.e., those in poorer neighborhoods) viewed wealthy individuals as less trustworthy (Berger, 2017).
There is much diversity in the formal structure of the “signaling games” for tribal beliefs. We commonly signal tribal identity so as to advertise to the like-minded (homophily) and separate ourselves from others, especially so as to create networks for association, trust, and epistemic legitimacy. In the examples discussed below, we will encounter these possibilities for signaled informational content, responses, and incentives:

Information content signaled (I): I am a Republican; I am a loyal Republican—I can be trusted to advance the cause; I share Republican values; I am a trusted epistemic source. (Signalers might not belong to these types, in which case it is dishonest.)

Behavioral response by receiver (M): Welcomes the signaler as a social partner, bestowing the typical social benefits (companionship, advice, sharing goods, etc.); shares information with the signaler and seeks information from them as well.

Signaler incentives to signal: Social companionship and exchange based on trust/similarity; desire to broadcast social-political information and receive welcome information in return.

Receiver incentives to respond: Desire to associate with and reward the like-minded; looking for mutually beneficial exchanges based on trust/similarity; wants social-political information that they judge to be reliable or which furthers their goals.

These are some of the many possible incentives, generating a wide range of formal models to fit
to social reality.
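As a sketch of how such incentive lists translate into a formal model, one can enumerate signaler types and receiver responses and compute the receiver's best reply to the signal. The structure and payoff numbers below are my own illustrative assumptions, not the author's model.

```python
# Minimal signaling-game fragment (illustrative payoffs only).
# Receiver payoffs depend on the signaler's unobservable type and the
# receiver's chosen response to the tribe-characteristic belief.

RECEIVER_PAYOFF = {
    # (signaler type, receiver response) -> receiver payoff
    ("loyal",    "welcome"): 4,   # mutually beneficial cooperation
    ("loyal",    "ignore"):  0,
    ("disloyal", "welcome"): -3,  # misplaced trust is exploited
    ("disloyal", "ignore"):  0,
}

def best_response(p_loyal: float) -> str:
    """Receiver's best reply, given the probability that a signaler who
    displays the tribe-characteristic belief is really loyal."""
    expected_welcome = (p_loyal * RECEIVER_PAYOFF[("loyal", "welcome")]
                        + (1 - p_loyal) * RECEIVER_PAYOFF[("disloyal", "welcome")])
    return "welcome" if expected_welcome > 0 else "ignore"

# When the signal is mostly honest, welcoming signalers pays off:
assert best_response(0.9) == "welcome"
# If fakers flood the signal, receivers stop responding and the
# signaling system collapses (the Krebs & Dawkins worry):
assert best_response(0.2) == "ignore"
```

Varying the payoffs and the honesty rate generates the "wide range of formal models" the text mentions; the receiver's incentive to respond is what ultimately sustains or collapses the system.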

3.1 | Climate change denial

I will argue that anthropogenic climate change denial—the genuine underlying belief—often functions so as to signal political affiliation, along with related facts about one's worldview: For some of those with Republican political affiliation (S), their belief that human-caused climate change is a myth (B) serves as a signal to both those within their political party and without (A) that they are loyal Republicans who distrust scientists, environmental activists, and so forth (I). The communication of this information induces their fellow party members to differentially trust and work with them, while establishing boundaries of interaction for those with opposed political affiliations (M). The claim is not that every case of climate change denial is due to signaling pressure; undoubtedly, many people have those beliefs simply due to poor or motivated reasoning.
This example was selected because, though climate change denial strongly correlates with Republican political affiliation, it is not essential to that ideology. It is a question of fact rather than value. It may, however, be a fact that sits uncomfortably with one's values and policy preferences. One may reasonably wonder if climate change denial based on tribal identity is simply a case of normal motivated reasoning. Note, though, that most climate change deniers do not have a robust vested interest in policies grounded in climate change denial. Most are not factory owners who would be burdened by regulations, nor do they have major financial investments in companies that contribute significantly to climate change. Of course, people can be impacted by trickle-down effects, or they may simply have a vested interest in maintaining their worldview.
The first step to proving that a given belief functions to signal group identity is to establish that it actually communicates tribal membership to others. Not all Republicans are climate change deniers, of course, and not all Democrats accept climate change. But the differences are striking. According to Gallup polling from 2016, 84% of Democrats worry about climate change a “great deal” or “fair amount” compared to only 40% of Republicans (Egan & Mullin, 2017, p. 218). Still, there is overlap on this issue between the two parties, and one might think that other demographic variables have even stronger correlations with climate change denial. However, a meta-analysis performed by Hornsey, Harris, Bain, and Fielding (2016), testing 27 demographic and psychological variables, found political affiliation to be the largest demographic correlate of climate change belief. Political ideology was a distant second with less than half the effect. The fact that affiliation is much more significant than ideology is very important. It suggests that climate change denial is embraced not so much because it is justified by one's values and principles, but because it conforms to one's tribal identity.
In order to be a genuine signal of tribal identity, rather than a mere cue or evidence for it, there
must be some design or selection toward that functionality. Evidence for this can be found with
trends toward polarization or extreme belief, though these effects can have other motivational
explanations as well. Signals function better if they are vivid and easy to detect, which is why they
are often ostentatious or otherwise striking. In the animal world, signals often evolve and acquire
their function through a process of ritualization (Maynard-Smith & Harper, 2003, p. 3). An attribute
becomes exaggerated (in length, color, etc.) as it starts to communicate specific information,
extending well beyond its original shape as it develops a signaling function. If a belief originally
was only loosely associated with one tribe but over time became an issue of polarization among
tribes, this could be due to a process of ritualization.
Indeed, this is what we find for climate change denial. Up until the late 1990s, there was not a significant divide between Republicans and Democrats when it came to concern about climate change and belief that it had already begun. Gallup poll data show that a polarizing trend

emerged in the United States from 2001 to 2010, when the issue of climate change became
emblematic for the larger culture wars (McCright & Dunlap, 2011, p. 171). Egan and
Mullin (2017, p. 218) use that same Gallup polling data to show that what was an 18% gap
between Republicans and Democrats concerning climate change in 2001 grew into a 44% gap
by 2010.
Extreme beliefs within groups and belief polarization across groups are well-documented, especially for groups that go through a deliberative process. Cass Sunstein (2000, p. 75) posits both motivated and cognitive explanations for the extremes that often result from group deliberation—group members want to maintain good reputations within their group, and they are exposed to a skewed body of evidence. These explanations are compatible with the beliefs also functioning as signals, as a belief-signaling function could underlie the motivational explanation. This depends on whether the motive for a good reputation is fundamentally hedonic (e.g., to think of themselves as good tribal members) or socially manipulative (e.g., to use their good reputation to reap benefits from others). Many tribes take their tribe-characteristic beliefs as being settled, and the informational exchange is better described as indoctrination than deliberation. In these nondeliberative settings, tribal members might embrace polarizing beliefs so as to make clear their degree of tribal commitment, much like nondeliberating peacocks develop runaway tail feathers. In fact, group polarization on some issue could precede signaling, with the antecedent division between two groups on some issue (cue) being selected for its informational efficacy (e.g., vivid and clear displays) to now function as a signal (Guilford & Dawkins, 1991). This is one reason why our definition of “signal” does not demand signaler-side design, allowing signaling functions to arise due to receiver-side selection.
In addition to polarization, other clues that a belief functions to signal tribal identity are
increased salience and normative force within that tribe. It is not just that de facto polarization
occurs as the result of a bias, but rather there is something wrong with a party member who
does not go along with this. The combination of salience and normative force has led climate
change denial to be a qualifying standard to receive the full benefits of tribal membership. As
McCright and Dunlap put it:

Indeed, the rise of the Tea Party and rightward drift of the Republican party created
a situation in which skepticism toward climate change became a litmus test for
party candidates in the 2010 election. (McCright & Dunlap, 2011, p. 179)

The salience and normative force of these beliefs are frequently displayed on conservative media. When unusually cold weather strikes a region, Drudge Report will often post a link with a sarcastic allusion to global warming claims. Followers who comment on the linked article will often use pejoratives like “libtard” or “RINO” to describe those counter-normative (would-be) tribal members who accept anthropogenic climate change. In this way, it can be costly for one who is a Republican to have a belief in anthropogenic climate change. For this reason (costly signaling), that belief is a reliable signal of Democratic (or at least non-Republican) affiliation. Conversely, it is costly (ostracism or decreased support from the group, epistemic distrust) for a Democrat to be a climate change skeptic, so it is a reliable signal that one is indeed a Republican. Typically, only one who truly has those partisan sympathies could afford to take on that belief.
It is noteworthy that, rather than emerging organically or as a grassroots effort, climate change denial is a top-down phenomenon that began with “elites” and then spread to the masses (McCright & Dunlap, 2011, p. 163). Whether there was a deliberate effort toward

manipulation is a further question. Congressional roll call votes and surveys of convention delegates show a significant divide between Democrats and Republicans concerning environmental priorities going back to the 1970s, though this division did not emerge among the rank-and-file until a couple decades later (Egan & Mullin, 2017, pp. 218–219). Further, climate change denial will often function differently for elites than it does for the masses. There are two salient possibilities for the elites. First, they do not believe their denials but espouse them to rally their followers or to further their own vested interests (e.g., appease lobbyists, protect financial investments). This is dishonest impression management rather than belief-signaling. Second, they could genuinely possess the belief, but simply due to motivational biases given their vested interests. For these reasons, their climate change denials often are not selected for a signaling function (i.e., to convey information to others). The mass consumers of political information, in contrast, often do acquire the belief so as to establish their tribal membership. This points to an interesting fact about signaling systems: They can be initiated by receivers or third parties who wish to detect allegiance, submission, or other forms of acquiescence among a population. We might call these imposed signals.[6]
The signaling function often produces deviations from reasons-responsiveness or outright perverts it. Climate change denial is notoriously insensitive to factors that would normally count as evidence and reasons to believe. Among Republicans in particular, education and science comprehension bear weakly or even negatively on belief in climate change. Egan and Mullin (2017, p. 217) conclude that “providing more information to climate skeptics will do little to lead them to belief, and it may even backfire.” This suggests that the bias results straightforwardly from motivation or else serves some kind of strategic function. Dan Kahan and his colleagues have argued for both claims. In support of the motivated reasoning proposal, Kahan, Jenkins-Smith, and Braman (2011) conducted studies showing that group values (“cultural cognition”) lead to biased assimilation of evidence, specifically relating to climate change. While this is true, we have also seen that political affiliation is an even stronger factor than is ideology. This suggests that the ultimate explanation is not simply that motivated reasoning (protecting one's values/beliefs, which so happen to be socially shared) is at work, but that the bias exists more fundamentally to preserve tribal identity (protecting those values/beliefs because one has that tribal identity). In other words, the values are a means to group cohesion. At times, Dan Kahan advocates a view that seems to be equivalent to our signaling hypothesis (Kahan, 2017, p. 28; Kahan, 2012, p. 255; Kahan et al., 2012, p. 734). The present work aims to give that claim greater clarity.
Climate change denial for the sake of signaling produces distinctive benefits for signalers
and receivers:

a. Establishing a tribe of loyalists. This is likely the most important function of beliefs that signal tribal identity. By abiding by tribal norms concerning belief, we display our general tendency to conform. We show that we are worthy of support and can be trusted to support others, opening up opportunities for both unilaterally and mutually beneficial transactions. The specific value garnered differs according to the status of the signaler. For elites, climate change denial might signal willingness to work together on political and policy matters. This furthers one's standing in the tribe and allows for quid pro quo exchanges. For the masses,

6 This is a rather interesting arrangement in which the audience (i.e., the elite) is “manipulated” by the signaler in the benign sense that the audience responds in light of the signal. But, more fundamentally, with imposed signals the audience is manipulating the signaler. Compare this to Robert Trivers' (2002) notion of “imposed self-deception.”
14680017, 2022, 3, Downloaded from https://onlinelibrary.wiley.com/doi/10.1111/mila.12326 by Universidade Federal De Minas Gerais, Wiley Online Library on [13/09/2023]. See the Terms and Conditions (https://onlinelibrary.wiley.com/terms-and-conditions) on Wiley Online Library for rules of use; OA articles are governed by the applicable Creative Commons License
452 FUNKHOUSER

the denial likely yields the social benefits that come with being a party member in good
standing and that result from sharing a worldview with one's peers. The point of climate
change signaling is often to get others in line.
b. Establishing epistemic communities. It is especially significant that networks of like-minded
individuals serve the epistemic function of determining who is a reliable information source.
Climate change denial/support often functions to signal one's reliability as an epistemic enabler: the signaler shows that they will provide the right kind of information and encouragement to maintain the desired worldview, and thereby that they are a reliable information source for matters besides climate change, which can improve their standing in the group.
Epistemic communities are especially important, as protecting against challenges to the
group's facts and values buttresses the “tribe of loyalists” function.
c. Self-signaling a sense of belonging. With self-signaling, a person holds a belief so as to send a message to themselves. Possessing tribe-characteristic beliefs could be a way of letting oneself know that one belongs to the tribe. This can reassure one of one's place in the world and
relieve social anxiety. It can produce a sense of belonging to a community, leading one to
experience the tribe's successes as one's own.

It is a commonplace that peer pressure and group dynamics contribute to our beliefs and
values. But why think this is also a case of signaling? What is the evidence that the beliefs are
detected by others at all? Signals must have symbolic significance to the receiver. The audience
need not have conscious awareness of this significance (e.g., a peahen need not recognize that
tails of a certain kind signify genetic fitness), but the audience must respond in a certain way
because the signal communicates information that does have symbolic significance to them
(as genetic fitness does for the peahen). Detection and manipulation go hand in hand. We have
already seen that climate change denial serves as a litmus test for Republican affiliation. The
case for detection is made even stronger by uncovering the manipulations and benefits that sig-
naling induces.

3.2 | Religion

While almost any kind of behavior could function as a signal of group identity or commitment,
much attention has been given to religious practices. I will argue that religious beliefs can func-
tion to signal commitment to both the tribe and a worldview.7 For some religious believers (S),
their belief in god and an afterlife (B) serves as a signal to both fellow believers and nonbe-
lievers (A) that they are trustworthy, believe in an ultimate punishment for immoral behavior,
and so forth (I), which leads to greater trust and social interaction (M) among both fellow
believers and nonbelievers (A). Of course, religious beliefs can also be due to either cognitive or
motivated biases, as well as being the product of good reasoning. It would be foolish to claim a
particular etiology or function for all religious beliefs.
Leak and Fish (1989) made an initial case for self-deceptive (intrinsic) religiosity—in con-
trast to mere impression management—due to a social desirability bias. Subsequent research
has shown that inducing vulnerability to loneliness enhances intrinsic religiosity, further
pointing to a social function for religious beliefs (Burris, Batson, Altstaedten, & Stephens, 1994).
Burris et al. concluded that religious beliefs serve as a defense mechanism for the sake of

7 See also Levy (2018) for discussion of this possibility, though he calls these only “fledgling beliefs.”

mental health, the idea being that genuine religious belief enhances socialization. These beliefs
are supposedly acquired for the sake of better social functioning, but not necessarily for belief-
signaling—for example, the religious belief might simply motivate more social interactions.
How do we make the extra step to get to signaling? We will need to show not merely that the
religious belief motivates social behavior, but that it signals that the believer is worthy of social
inclusion.
The bulk of the work on religion and signaling has focused on religious practices as costly
signals of genuine belief, loyalty, and willingness to cooperate with the in-group (Irons, 2001;
Power, 2017; Soler, 2012; Sosis, 2003). The present claim is that not only are religious practices
signals of religious belief or devotion (Posner, 2000), but the religious beliefs themselves often
function as signals for the direct purpose of marking off tribal and worldview commitment.
Whereas costly signaling may be the norm for religious practices that signal belief, the religious
beliefs themselves could be had on the cheap. Religions vary, of course, when it comes to their
standards for conversion or acceptance into the group—some never allow it, some require
extensive training or displays of devotion, still others require merely an apparently sincere
avowal. Many religious markers do not seem to be costly signals—for example, wearing a cross
necklace or particular garb. Many nonreligious organizations and communities also have mini-
mal requirements for group membership (e.g., becoming a fan of a sports team), but member-
ship can nevertheless come with extensive social benefits.
Beliefs that signal tribal identity need not impose much of a cost on the average individual, beyond committing them to a political tribe as discussed earlier (Kahan, 2017, p. 28). Costly signaling can explain much tribal belief-signaling, but its honesty is buttressed by other mechanisms as well. In addition to handicaps, signaling theory has identified several other mechanisms that justify receiver-side attention. Four of these apply to beliefs signaling tribal identity or commitment:

1. Even if these beliefs are not costly, they are still difficult to fake. It is intrinsically difficult to
will to believe or to hide your true thoughts over the long term, especially on emotionally
charged topics like religion and politics. (Some may see the intrinsic difficulty of faking a sig-
nal as costly signaling, but we should avoid this verbal dispute.)
2. When parties have common interests—as tribal group members do to a significant degree—
signaling can arise without handicaps. (Searcy & Nowicki, 2005)
3. Dishonest signalers can be punished through social mechanisms. Those who do not truly
share the tribal identity or commitment can be found out over time and punished for their
misrepresentation. Tribe members have reputations, and tribes have memories. It has been
empirically demonstrated that social enforcement mechanisms alone can sustain honest sig-
naling even among parties with conflicting interests. (Lachmann, Szamado, &
Bergstrom, 2001)
4. Rather than attending exclusively to the costs and other constraints on the signaler, theorists
should pay more attention to the net benefits signalers can gain in future interactions by sig-
naling honestly (Przepiorka & Berger, 2017). In many situations, it is better for signalers to
live up to the signaled identity or commitment, reaping the benefits of cooperative interac-
tion. Audiences tend to benefit by taking a trusting attitude at first, reverting to social
enforcement if the signal was dishonest.
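The combined force of mechanisms 3 and 4 can be sketched with a toy expected-payoff model (my own illustration, not drawn from the article; the function and all numbers are stipulated): receivers extend trust by default, fakers are caught with some probability per interaction, and tribes remember and punish.

```python
# Toy model (an illustration, not from the article): expected payoffs for
# honest vs. dishonest tribal belief-signaling under mechanisms 3 and 4.
# Receivers trust a signal by default, but fakers are detected with some
# probability each round and thereafter punished (social enforcement).
# All parameter values below are stipulated for illustration.

def lifetime_payoff(honest: bool, rounds: int, coop_benefit: float,
                    fake_benefit: float, detect_prob: float,
                    punishment: float) -> float:
    """Expected payoff over repeated interactions with the tribe."""
    total = 0.0
    undetected = 1.0  # probability a faker remains undetected so far
    for _ in range(rounds):
        if honest:
            total += coop_benefit  # trusted cooperation every round
        else:
            total += undetected * fake_benefit        # gains while undetected
            total -= (1 - undetected) * punishment    # sanctions once exposed
            undetected *= (1 - detect_prob)
    return total

payoff_honest = lifetime_payoff(True, rounds=20, coop_benefit=1.0,
                                fake_benefit=1.5, detect_prob=0.2,
                                punishment=0.5)
payoff_faker = lifetime_payoff(False, rounds=20, coop_benefit=1.0,
                               fake_benefit=1.5, detect_prob=0.2,
                               punishment=0.5)
print(payoff_honest > payoff_faker)  # → True: with memory and punishment, honesty pays

# If fakers are never caught (detect_prob=0), faking dominates instead:
print(lifetime_payoff(False, rounds=20, coop_benefit=1.0,
                      fake_benefit=1.5, detect_prob=0.0, punishment=0.5))  # → 30.0
```

On these assumptions, honest signaling dominates whenever detection and punishment are non-negligible relative to the one-shot gain from faking; set `detect_prob` to zero and faking dominates instead, which is why honesty-enforcing mechanisms matter.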

Our first bit of evidence that some religious beliefs function to signal trustworthiness and
morality (I) is that they actually communicate this information. If religious belief functions as a

signal of prosociality (to tribal or nontribal members), then such beliefs almost certainly were, at least at one time, a good indicator (cue) of prosociality. If that is no longer the case, or if other signals of prosociality emerge, then we should expect religious belief to decrease correspondingly. Indeed, religious belief has decreased significantly in certain places where secular beliefs (signals) have taken its place (Gervais, Shariff, & Norenzayan, 2011, p. 1203).
Those who advertise religious beliefs receive preferential treatment in return. It is certainly
true that those who are perceived to be nonreligious often suffer negative repercussions. A Gallup (2015) survey found that 40% of American adults would not vote for an otherwise well-qualified atheist candidate for president nominated by their favored political party. (Though
this is down from 77% in 1958, showing that the signaling pressure for religiosity is not as
strong as it once was.) Of course, most people will not be running for president, but the point is
that this response reveals a strong national prejudice against the character of those who do not
display a belief in god (or who positively display their nonbelief). This permeates everyday life.
Religious belief is seen by many as a prerequisite for trustworthiness.8 Gervais et al. (2011, p. 1200) conducted six studies confirming this very effect, finding that only rapists were considered less trustworthy than atheists. More fundamentally, Pew Research Center (2017) reveals
that 42% of American adults think that religious belief is essential to being moral and having
good values. This prejudice even bears on reproductive success: Parents are more likely to dis-
approve of their children marrying atheists, by a significant margin, as compared to other
groups that are common targets of discrimination (e.g., Muslims, African Americans) (Edgell,
Gerteis, & Hartmann, 2006). It should be obvious that there are strong social pressures to signal
religious belief, or at least to not signal atheism. The pressure is not merely to conform, but to
display conformity in order to receive preferential social treatment.
Religious “believers” often fail to act on their beliefs as we would expect of someone who is
a genuine and rational believer. For example, many more people identify as belonging to religions than actually attend services. The Pew Research Center (2018) found that 60% of Americans who identify as Christian rarely or never attend religious services “for reasons other than
nonbelief.” This is significant. Such people avow their religious beliefs even though they are not
driven to act as seems customary and rational for such a believer. One reason for this could be
that it is more valuable to them to appear religious than to be religious, especially given the
supposed connections between religious belief and morality or goodness.
Next, as with climate change denial, the salience and normative force of certain religious
beliefs indicate design or functionality for signaling. Religious believers often go out of their
way to publicly display their beliefs. This is glaringly evident with the creeds or testimonials
central to the world's major religions—for example, the Apostles' or Nicene Creed for
Christianity, the Shahada in Islam. Significantly, these creeds are uttered before others and are
taken as authoritative for admission into the religious group. They are highly overt statements
of belief (e.g., “I believe in God, the Father Almighty…” ). Mechanisms are put in place to make
the context of utterance quite solemn, to ensure proper understanding of the creed, and to
incentivize sincerity. These rituals increase the chances of honest signaling.
Public displays of belief present something of a paradox for the belief-signaling hypothesis.
On one hand, if religious beliefs are social signals, then we should expect them to be promi-
nently displayed (e.g., avowed). But when they are prominently displayed, then the avowals
themselves are good candidates for signals of the underlying belief. One might then think that

8 Note that the tribes in this case are quite large—religious believers versus atheists. There can also be signaling games modeled at different levels of granularity, for example, tribes of Catholics or Jews, rather than generic “believers.”

the avowals are the signals, and the underlying beliefs are that which is signaled rather than
being signals themselves. However, we should not be so frugal about signaling. We can allow
that the avowals are signals of underlying belief and that those beliefs signal something further.
Beliefs can be parts of signaling chains:

avowal (signals) → belief (signals) → group-desirable traits

The important point is that the belief exists due to the signaling pressure; if there were not so
much value to the display, the belief would not exist. This is unlike many cases of animal signaling, which do not give rise to signaling chains. The peacock's tail feather is a display or signal
of genetic fitness, but the genetic fitness is not itself a signal of something further. Animals have
an incentive to be genetically fit even if they do not communicate that fitness to others. But the
incentive to have these tribal beliefs is fundamentally communicative, and those beliefs would
not exist without that social signaling function. In this regard, the belief functions like a pea-
cock's tail feather—it is held because of its signaling value.
Further evidence for a signaling function is the fact that religious beliefs often fail to meet
the normal evidential and behavioral standards for belief. It is commonplace that religious
beliefs are often held without good epistemic reasons (e.g., faith, tradition), and they often run
counter to what would be believed in other contexts (e.g., accepting miracles, tolerating abuse).
It is debatable whether motivated reasoning, misapplied cognitive mechanisms, or functional
mechanisms besides signaling can fully explain this.
Thus far the case for religious belief-signaling is that: (1) religious beliefs are prominently
displayed in environments designed to produce sincerity; (2) such beliefs successfully communi-
cate prosociality and tribal affiliation; (3) this perception benefits the believer; and (4) these
beliefs are often contrary to reason and evidence. Beliefs are generally assumed to be truth-
tracking, so claim (4) motivates the search for an alternative function. Claims (1) to (3) support
the signaling function as a specific candidate. However, one might worry that these conditions
are too weak. Is the idea that belief-signaling theory predicts that people would acquire any
belief that, once detected, causes others to treat them better? If so, we should expect massive
amounts of belief-signaling. This is concerning, because it does not seem plausible that beliefs
are so commonly acquired for the sake of social manipulation rather than for personal goal pur-
suit. Belief-signaling, then, likely requires that other conditions are in place. These include:

Epistemic costs: The deviation from epistemic norms must not produce too great an epistemic cost. Beliefs are governed, at least somewhat, by considerations of rationality and holism. Accepting an epistemically compromised belief normally requires making modifications elsewhere in one's web of belief. Climate change denial and religious fundamentalism are not good signaling strategies if they lead to harms due to a broader distrust of science.
Practical reasoning costs: We are less likely to achieve our goals when we act on false or
unjustified beliefs. One might suffer from a Christian Science treatment for an illness or
from an environmental disaster brought about by inaction born of denial.
Honesty: It must not be too easy to fake belief and get away with it.
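These three conditions can be summarized in a small decision rule (again my own sketch, not the author's formalism; all utilities are stipulated): belief-signaling is predicted only when the social payoff exceeds the epistemic and practical-reasoning costs, and genuine belief beats mere faking only when honesty mechanisms make faking at least as expensive.

```python
# A sketch (an illustration, not from the article) of the three conditions
# as a decision rule. All quantities are stipulated utilities.

def predicted_strategy(signal_benefit: float, epistemic_cost: float,
                       practical_cost: float, faking_cost: float) -> str:
    """Which strategy does the belief-signaling hypothesis predict?"""
    belief_net = signal_benefit - epistemic_cost - practical_cost
    faking_net = signal_benefit - faking_cost
    if max(belief_net, faking_net) <= 0:
        return "no signaling"          # social payoff too small either way
    if belief_net >= faking_net:
        return "genuine belief"        # honesty mechanisms make faking dearer
    return "impression management"     # cheap to fake, so no belief needed

# Religious belief: big payoff, costly to fake (creeds, rituals, monitoring).
print(predicted_strategy(signal_benefit=10, epistemic_cost=2,
                         practical_cost=1, faking_cost=6))
# Climate change denial: smaller payoff, but low costs all around for the
# typical individual, so genuine belief is still predicted.
print(predicted_strategy(signal_benefit=3, epistemic_cost=0.5,
                         practical_cost=0.5, faking_cost=1.5))
# The hypocrite case: high epistemic/practical costs and trivial faking cost
# yield impression management without belief.
print(predicted_strategy(signal_benefit=3, epistemic_cost=2,
                         practical_cost=2, faking_cost=0.5))
```

The rule captures the tradeoff discussed below: the greater the benefit of the signal, the more costly faking must be for genuine belief, rather than mere impression management, to be the predicted outcome.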

The honesty condition deserves special attention. Some people fake religious belief as an
exercise of impression management, but honesty mechanisms are in place that make this somewhat taxing or difficult. This contrasts with climate change denial, which is comparatively easy to fake. But the benefits of climate change denial are also much smaller. This is what we should expect, as

there are tradeoffs between our honesty standards and the importance/benefits of communi-
cating that information. The more important the matter and the greater the opportunity to
benefit, the higher the standard and the more obstacles that are put in place to ensure honest
signaling. All of this must also be tempered by considerations of the epistemic and practical
reasoning costs.

3.3 | Theoretical benefits of the tribal signaling hypothesis

The signaling approach coheres with, if not outright predicts, many of the distinctive belief-
forming practices central to tribal identity. I will identify five such theoretical benefits. Other
proposed explanations can garner some of these benefits, but I do not think that any alternative
gains them all. Regardless, the alternatives discussed in the next section do not offer a unified
and robust conceptual framework.

1. Signaling explains the etiology of certain tribe-characteristic beliefs that are widespread
within circumscribed populations.

Every tribe has its distinctive beliefs. But how and why do these distinctive beliefs become
widespread? Some of these beliefs are not central to the ideology or values of the tribe, but nevertheless become prevalent because they establish affiliation, strengthen intra-group connections, and draw lines of demarcation with out-groups. Climate change denial is not intrinsically central to conservative political beliefs, but it has nevertheless served as a good signal of political affiliation. At other times a belief is central to the tribe's ideology or values, but its
signaling value came first—the tribe only grew into having that ideology afterwards. For exam-
ple, some early Christians may have been marked off by their beliefs about the special—but not
necessarily divine—status of Jesus. This could have evolved into the more ideological beliefs
about Jesus's divinity. Consistency and other logical considerations could have the same
effect—for example, one begins by modestly denying climate change, but this eventually leads
to skepticism about the scientific method more generally. Or, as Christians develop a theory
concerning the divinity of Jesus, they are led to accept many corollary claims concerning his eti-
ology (e.g., virgin birth, the immaculate conception of Mary, etc.). I am not making a claim
about the actual history of theology. Rather, the point is that ideology could follow signaling in
this way.

2. Signaling explains the salience and normativity of such beliefs within that population.

People are especially vocal about climate change and religious belief, prominently dis-
playing these beliefs in the right contexts. They serve as litmus tests or have creedal authority.
Signaling explains the heightened advertisement and detection mechanisms for such beliefs.
People go out of their way to advertise their religious beliefs, and others go out of their way to
search for the signal and ensure its honesty. But there is no general incentive to advertise cases
of motivated reasoning or to be on the lookout for it. This social significance is left unaccounted
for by purely cognitive or motivational accounts of the biases. Beliefs that are the product of
motivated reasoning or defensive ego protection are not ipso facto more likely to be publicly
salient and group normative, though this is true of tribe-signaling beliefs.

3. Signaling explains the tendency toward polarization for group-contrastive beliefs.

Polarization and extremes are common with signals, so as to better establish intergroup con-
trast and illustrate the degree of intragroup commitment. So, people will avow that Jesus was
completely without sin and offers the only path to salvation.

4. Signaling explains the distribution of benefits and harms in light of displayed beliefs.

If a belief functions as a signal, then there should be some social benefit from its communi-
cation. But there is no general reason to expect that we will receive social benefits from our cog-
nitive or motivated biases.9

5. Signaling explains the profound resistance to rational or evidence-based challenges.

Beliefs that function as signals are, fundamentally, not held for epistemic reasons (though
they are buttressed by them), and they might not be altered by attempts to improve reasoning.
Relative immunity from correction by attempts to remove cognitive and motivated biases
(e.g., symmetrical evidence search, constructing a pro/con list) could be a distinguishing feature
of belief-signaling. However, things are rather complicated, as cognitive and motivated biases
could be recruited as means to acquire and maintain a belief for signaling purposes, making
beliefs that function as signals amenable to correction after all.

4 | OBJECTIONS AND ALTERNATIVE EXPLANATIONS

Of course, there are alternative explanations for these biases or belief-forming practices. I will
consider three categories of response: (1) denying the belief attribution altogether; (2) accounting for it as a cognitive bias; and (3) assimilating it under motivated reasoning.

4.1 | Objection 1: These are not really beliefs

One response is to deny that the candidate belief-signalers genuinely possess the belief.
Instead, they might be issuing empty avowals or engaging in a pretense. The idea is that the
partisans might only be giving lip service to climate change denial without having the underlying
belief. This need not be deceitful or hypocritical; their avowals could be conformist speech
acts without any significant underlying agenda. Alternatively, there are cases of outright
deception, as when someone regularly attends church and proclaims belief, though
inwardly rejecting it.
There are practical problems discerning what a person “really believes,” especially when the
issue is abstract or there are limited opportunities to express that belief in meaningful behavior.
There is little that the typical person can do to express their disbelief in climate change other
than to avow, reason, and vote in certain ways. But these can be very significant. The avowals
can be as full-throated and emotionally charged as any other avowal that we take to be grounds

9 See Mercier and Sperber (2017) for a controversial alternative. On their view, reasoning is a tool primarily in the service of persuading others. Some biases could further this goal.

for belief. Deniers can engage in systematic evidence search and rationalization, incorporating
the apparent belief in a much larger cognitive framework. They can put their money where
their mouth is by voting for or against candidates for these reasons. These seem to meet our
normal standards for belief attribution, and those who research climate change opinions often
take answers to polling questions as sufficient for attributing belief. Things are a bit clearer for religious beliefs, as there are more opportunities to manifest them robustly in everyday life. Of
course, it is certainly possible for people to misrepresent these beliefs, intentionally or not.
When they do so, their avowals and behaviors very well might (dishonestly) serve a signaling
function, but there is no belief-signaling. However, there is no reason to think that all climate
change deniers or religious people who project these attitudes for social consumption are dis-
honest. Given that they vote, reason and discuss, donate their money, spend their time, and
strongly feel like genuine believers, it is highly implausible to say that the bulk of them do not
believe.
In addition to the worry that the so-called believers might be pretenders or hypocrites, there
are concerns about whether someone can genuinely believe in the face of a lack of evidence.
But such concerns are not daunting. It is possible for arational methods to generate belief, say
by simple repetition or indoctrination from youth. The same social forces that induce impres-
sion management can induce biased belief. As with any kind of self-deception, there is probably
still a role for reason in the process, even if it is distorted. Tribes construct their own epistemic
networks, which selectively admit information and arguments so that the objectively irrational
belief might actually be rational given the factors allowed into consideration.

4.2 | Objection 2: The biased belief is explained by cognitive biases

Cognitive biases undoubtedly explain some of this belief distortion. Prominent among these
include:

• Evidence exposure biases: People operating within tribes are not exposed to a random sam-
pling of evidence. Tribe dynamics skew the available information, resulting in biased judg-
ment. This could occur even without individual tribal members having a motive to avoid
confronting certain information. They read only the climate change skeptics or have never
met an atheist.
• Confirmation bias: When someone begins with a default hypothesis—perhaps suggested by
their tribe—they tend to give greater weight to confirming evidence over disconfirming evi-
dence (Nickerson, 1998). It is easier to be a climate change skeptic if you start by accepting
that position as your default.

Some might argue that these biases alone can explain the polarization.
But as Kahan and others have pointed out, the facts do not support this. If cognitive biases
were the culprit, then we would expect that the effect would be mitigated when given more evidence and opportunities to reflect. In fact, the opposite frequently occurs. Enhanced “numeracy” (i.e., facility with quantitative data), indicative of System 2 functioning, is actually negatively correlated with perceived climate change risk (Kahan
et al., 2012, pp. 732–733). This finding is in dispute, though, as Pennycook and Rand (2019) con-
ducted large studies showing that higher performance on the cognitive reflection test (CRT) cor-
related with improved ability to detect fake news. They interpreted this result as showing that

lazy reasoning, as opposed to motivated reasoning, is more to blame for susceptibility to fake
news. However, there are some problems with this inference. First, the CRT measures one's
general ability to engage in a certain kind of analytical reasoning. The Pennycook and Rand
studies did not involve heightened reflection on the particular subject matter contained in the
news stories under consideration. Second, the supposed fact that distorted belief
(e.g., susceptibility to fake news) can be corrected by improved reasoning does not show that
the original distortion was due to poor reasoning as opposed to motivational factors. Motivation
can influence belief at nonreflective stages, and even when motivation does influence reflective
cognition it is certainly defeasible. (In fairness, it should be noted that Pennycook and Rand
were opposing Kahan's specific “Motivated System 2 Reasoning” account.) Finally, greater sci-
entific comprehension leads to more polarization across those with political values inclining
toward acceptance/denial, showing that tribal identity continues to influence cognition (Kahan
et al., 2012, p. 733). This seems like a motivated bias, though it takes further argumentation to
conclude that the motivation is for the sake of signaling.
There are other aspects of climate change denial and religious belief that are not so plausibly explained by cognitive biases. Cognitive biases alone do not explain the efforts people make to acquire skewed information, the fact that these beliefs are litmus tests for tribal
membership, the unusual efforts made to publicly avow or display such beliefs, their normative
appeal, or the strong resistance to challenge.

4.3 | Objection 3: This is simply a case of motivated bias without signaling

The main rival to the belief-signaling hypothesis is that the candidate belief-signals are caused
by familiar motivational biases. The motives could take different forms:

• To believe facts that align with one's values/ideology, prior beliefs, or vested interests.
• Incentives for prosociality that hold regardless of whether that mindset is detected (e.g., one
benefits from one's prosocial behavior, incentivizing the underlying belief).
• To conform to the tribe for personal, hedonic reasons (e.g., self-esteem, defensiveness).

We should first make a concession: In some instances, such motives produce the bias without
any contribution from signaling pressure. Some people are straightforwardly self-deceived about
climate change because they want to believe that the free market cannot possibly cause such
great environmental harms. And some have religious beliefs because those beliefs motivate
moral behavior for which they are rewarded, without those underlying beliefs even being
detected. Yet, these motives do not explain many features of our best candidates for tribal sig-
naling: The tribal salience and normative force of the beliefs, the efforts made to publicly dis-
play them, heightened detection and honesty mechanisms, and the issuing of social benefits/
harms on their basis. These attributes support the underlying logic of the belief-signaling
hypothesis: We not only have an incentive to act prosocially; we have an incentive to appear
prosocial.
For many, the biased belief is acquired because it is tribe-characteristic: There is a motive to
conform to the tribe (Kahan et al., 2011). Group dynamics can account for many of the features
just listed. But there are two importantly different kinds of underlying motives for conformity.
Conformity could be for personal, hedonic reasons (defensiveness), or it could be for the sake of
signaling (social adjustment) (Uziel, 2010, pp. 244–245). What is the difference, and how can
this be tested? With the former, the conformity is ultimately for the sake of manipulating the
believer herself. She enjoys a sense of belonging simply in virtue of believing with the tribe. This
does not require that others detect her conformity. With the latter—interpersonal signaling—
there must be function or selection for social manipulation. The biased belief exists for the sake
of detection by and manipulation of others.
There are ways to test between these two different motives for conformity. One way is to
manipulate self-esteem. High self-esteem or security should lower the pressure to conform out
of defensiveness or to secure hedonic benefits. If the bias remains through such a manipulation,
that is some evidence that it has a distinct function. Signaling is a strong alternative. While this
method has been primarily applied to impression management (Uziel, 2010, p. 248), it can also
be applied to genuine belief. In addition to self-esteem, we can appeal to evidence that such
beliefs are prominently displayed and play a significant role in the social marketplace
(e.g., tribal epistemology, the distribution of social benefits) to further buttress a signaling func-
tion. It is not hard to imagine experimental manipulations on these very factors. In the end,
belief-signaling is best supported by evidence that the belief has an interpersonal communica-
tive function.
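The proposed test can be made concrete with a toy decision model. This is only a sketch: every parameter name and payoff value below is hypothetical, chosen to contrast the two motives for conformity. The key structural difference is that hedonic conformity pays whether or not the belief is detected, while signaling pays only when it is detectable, so crossing a self-esteem manipulation with a public/private manipulation separates the two.

```python
# Toy model (illustrative only): an agent adopts the tribe-characteristic
# belief when its total payoff exceeds the payoff of accurate belief.
# All payoff values are hypothetical placeholders.

def adopts_tribal_belief(hedonic_need, observed,
                         truth_value=0.5, hedonic_gain=0.8, signal_gain=0.8):
    """Return True if the biased (tribe-characteristic) belief pays.

    hedonic_need: 0..1, high when self-esteem is low (defensiveness).
    observed:     whether others can detect the belief; the signaling
                  payoff accrues only in that case.
    """
    hedonic_payoff = hedonic_gain * hedonic_need          # accrues privately
    signaling_payoff = signal_gain if observed else 0.0   # requires detection
    return hedonic_payoff + signaling_payoff > truth_value

# Crossing the two manipulations separates the motives:
for need in (1.0, 0.1):             # low vs. high self-esteem
    for observed in (True, False):  # public vs. private belief report
        print(need, observed, adopts_tribal_belief(need, observed))
```

In this parameterization the bias disappears only for the secure agent whose belief is private. A bias that survives a high self-esteem manipulation but fades when the belief cannot be detected is the empirical signature that would point toward a signaling function rather than hedonic conformity.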
I have discussed belief-signaling on the assumption that it serves an interpersonal function.
This is the default understanding of signaling, but self-signaling is also a possibility. Climate
change denial and religious belief could be signals to the self for reassurance. There is a worry
that this trivializes the thesis, though, as self-signaling could even explain conformity for the
sake of self-esteem.
The belief-signaling hypothesis can co-opt motivational explanations that appeal to social
conformity—they are not mutually exclusive. In fact, belief-signaling pressure can explain the
existence of certain motivational biases. The distinction between proximate and distal explana-
tions is helpful here. The signaling function (distal) can explain why there is a motive to conform
(proximate) in the first place. The overall picture produced by signaling theory will differ from
motivated reasoning explanations. It is not so much that these biased individuals have a motive
that distorts the proper functioning of the belief (truth-tracking). Rather, the belief-signaling function generates motives that further help achieve the goal of social manipulation.

5 | LESSONS AND FUTURE RESEARCH

5.1 | Treatment

If a belief functions as a signal of tribal identity, then this will have repercussions for how to
treat the belief. We may not always want to modify or eliminate the belief, as it might serve a
valuable social function without introducing outweighing epistemic or practical harms. But
assuming that it should be changed, there are various treatments to consider:

Treatment #1: Rational scrutiny. We can appeal to reason and evidence. Belief-signals
are still beliefs, so they should be at least somewhat sensitive to reason. While perhaps
ideal, this approach will often fail as social forces can swamp rationality. Indeed, insen-
sitivity to rational scrutiny is a hallmark of belief-signaling. This treatment does not
confront the real force of belief-signaling unless it also explicitly acts to mitigate the
influence of the motivational biases brought about by group dynamics. Each of the next four treatments, in contrast, works by eliminating or disincentivizing the signaling pressure in the first place.
Treatment #2: Challenge the tribal affiliation. The tribal identity itself could be criticized—
for example, one should not be a Republican. This will probably work best when sup-
plemented with an alternative tribal identity with its own appeal. If the belief is had because
it signals tribal membership, then losing that tribal identity undermines the belief by elimi-
nating the incentives to believe and by breaking up the epistemic network that sustains
it. Evidence from political science suggests that, if the issue is important enough, citizens will change parties rather than their position on the issue (Carsey & Layman, 2006). Changing tribal membership will not always be effective, however: Beliefs can persist due to inertia or the past influence of the group.
Treatment #3: Identify in-group dissenters. We could point to salient members of the
tribe who hold the contrary belief. Studies have shown that when a Republican politician corrects climate change misinformation, the correction is taken much more seriously by both Republicans and Democrats (Benegal & Scruggs, 2018). Appealing to such sources
is a way to show that not all members of the tribe share that belief and that one can reap
tribal benefits without possessing that signal. This might require the display of other
signals that indicate tribal identity.
Treatment #4: Deny that the belief reflects tribal values. We can make the case that “real” or
“genuine” Republicans would not be climate change deniers, so it should not serve as a sig-
nal of that tribal affiliation. More modestly, we can cast the alternative belief in Republican-
friendly terms—for example, “green” technology and investment as the future for economic
growth (Hornsey et al., 2016, p. 625).
Treatment #5: Change the tribal norm. We could modify what it means to belong to that
tribe, such that climate change denial is no longer seen as symptomatic or essential for tribe
membership. We can redefine what it means to be a Republican. This often happens as con-
clusive evidence mounts for a view, the denial of which was once a tribal-identity signal. As
evidence for the Copernican theory or Darwin's theory of evolution mounted, it eventually
became implausible or unwise for various churches to deny those truths even though that
denial was once essential to being a member in good standing.

If the belief is responsive to treatments 2 to 5, then this can be a good indication that the belief
functioned as a tribal-identity signal all along.

5.2 | Future research

Belief-signaling is a research programme with a conceptual foundation and empirical application. We often can best control and treat a phenomenon only when we apply a new conceptual
framework to it. When we recognize that certain tribe-characteristic beliefs function as signals,
remedies arise that would not otherwise present themselves. If we take belief-signaling of tribal
identity seriously, then we can set for ourselves the following practical tasks:

1. Find prominent beliefs that signal tribal identity. People categorize themselves into many dif-
ferent kinds of tribes, with corresponding belief systems. Which of these tribal affiliations
are especially important? For these tribes, which beliefs function as signals? We should
pursue empirical tests to distinguish signaling from mere conformity (e.g., by manipulating
self-esteem, advertising and detection, or social benefits).
2. Identify the beneficial and harmful belief-signals, as well as equilibria. Much belief-
signaling is relatively harmless. The epistemic costs, if any, will be outweighed by the
social benefits. But some tribe-signals will prove to be positively harmful or at least inef-
ficient. It would be helpful to discern particular examples of good/bad belief-signals, as
well as the general principles for identifying healthy and responsible belief-signaling of
tribal identity. Formal modeling—and experimentation guided by it—will prove very
valuable here.
3. Assess the ways to spread beneficial signals and eliminate harmful signals. We want to propa-
gate healthy signaling systems, and we should use empirical means to discover the best ways
of initiating and growing beneficial signals. Laboratory and field studies can help us distin-
guish the most effective treatments for eradicating harmful signals.
4. Discover the limits and role of rationality. It is an open question how disconnected belief-
signals can be from the norms of rationality, but religious and political systems show us that
the gulf is sometimes wide. Still, as with motivational biases, there is a necessary role for rea-
soning. Some mental gymnastics must be performed to sustain beliefs that are radically dis-
connected from the norms of rationality. What are the limits to irrational belief-signaling?
How are rationalizations employed to preserve beneficial signals and eliminate harm-
ful ones?
5. Search for better ways to establish and signal tribal identity. Belief-signaling is sometimes
harmful or suboptimal. When that is the case, it is better to embrace other ways to signal
identity or achieve the goals of social cohesiveness, coordination, and cooperation.
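As a first step toward the formal modeling mentioned in task 2, here is a minimal Spence-style condition for when a tribe-characteristic belief can function as an honest signal. This is a sketch only, and the cost and benefit figures are hypothetical: the idea is simply that holding (and displaying) the belief must pay for committed members but not for free riders who would fake it.

```python
# A minimal Spence-style separating condition (illustrative only; all
# numbers below are hypothetical placeholders for formal modeling).

def separating_equilibrium(benefit, cost_committed, cost_uncommitted):
    """An honest signaling equilibrium requires that displaying the
    belief pays for committed members but not for free riders."""
    return cost_committed < benefit < cost_uncommitted

# Example: tribal benefits worth 5; the belief is cheap for committed
# members (cost 2) but epistemically costly for outsiders to fake (cost 8).
print(separating_equilibrium(5, 2, 8))   # honest signaling is stable
print(separating_equilibrium(5, 2, 4))   # cheaply fakeable: signal degrades
```

When the cost of faking drops below the tribal benefit, the separating equilibrium collapses and the belief loses its diagnostic value, which is one way to formalize the harmful or inefficient signals described in task 2.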

ACKNOWLEDGEMENTS


I would like to thank the three anonymous referees for Mind & Language who provided me
with very valuable comments that helped me improve this article considerably.

ORCID
Eric Funkhouser https://orcid.org/0000-0003-3703-5197

REFERENCES
Benegal, S. & Scruggs, L. (2018). Correcting misinformation about climate change: The impact of partisanship in
an experimental setting. Climatic Change, 148, 61–80.
Berger, J. (2017). Are luxury brand labels and “green” labels costly signals of social status? An extended replica-
tion. PLoS One, 12(2), e0170216.
Bliege-Bird, R. & Smith, E. (2005). Signaling theory, strategic interaction, and symbolic capital. Current Anthro-
pology, 46(2), 221–248.
Brewer, M. (1979). In-group bias in the minimal intergroup situation: A cognitive-motivational analysis. Psycho-
logical Bulletin, 86(2), 307–324.
Burris, C., Batson, C. D., Altstaedten, M. & Stephens, K. (1994). What a friend …: Loneliness as a motivator of
intrinsic religion. Journal for the Scientific Study of Religion, 33(4), 326–334.
Carsey, T. & Layman, G. (2006). Changing sides or changing minds? Party identification and policy preferences in the American electorate. American Journal of Political Science, 50(2), 464–477.
Edgell, P., Gerteis, J. & Hartmann, D. (2006). Atheists as “other”: Moral boundaries and cultural membership in
American society. American Sociological Review, 71, 211–234.
Egan, P. & Mullin, M. (2017). Climate change: US public opinion. Annual Review of Political Science, 20,
209–227.
Funkhouser, E. (2017). Beliefs as signals: A new function for belief. Philosophical Psychology, 30(6), 809–831.
Gallup (2015). In US, socialist presidential candidates least appealing. Retrieved from https://news.gallup.com/
poll/183713/socialist-presidential-candidates-least-appealing.aspx?g_source=link_newsv9&g_campaign=
item_254120&g_medium=copy
Gervais, W., Shariff, A. & Norenzayan, A. (2011). Do you believe in atheists? Distrust is central to anti-atheist prejudice. Journal of Personality and Social Psychology, 101(6), 1189–1206.
Guilford, T. & Dawkins, M. S. (1991). Receiver psychology and the evolution of animal signals. Animal Behaviour, 42, 1–14.
Hogan, R. (1983). A socioanalytic theory of personality. In M. M. Page (Ed.), Nebraska symposium on motivation
(Vol. 30, pp. 55–89). Lincoln, NE: University of Nebraska Press.
Hornsey, M., Harris, E., Bain, P. & Fielding, K. (2016). Meta-analyses of the determinants and outcomes of belief
in climate change. Nature Climate Change, 6, 622–627.
Irons, W. (2001). Religion as a hard-to-fake sign of commitment. In R. Nesse (Ed.), Evolution and the capacity for
commitment (pp. 290–309). New York, NY: Russell Sage Foundation.
Kahan, D. (2012). Why we are poles apart on climate change. Nature, 488, 255.
Kahan, D. (2017). The expressive rationality of inaccurate perceptions. Behavioral and Brain Sciences, 40, 26–28.
Kahan, D., Jenkins-Smith, H. & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk
Research, 14(2), 147–174.
Kahan, D., Peters, E., Wittlin, M., Slovic, P., Ouellette, L., Braman, D. & Mandel, G. (2012). The polarizing
impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2,
732–735.
Krebs, J. & Dawkins, R. (1984). Animal signals: Mind-reading and manipulation. In J. R. Krebs & N. B. Davies
(Eds.), Behavioural ecology: An evolutionary approach (2nd ed., pp. 380–402). Oxford: Blackwell.
Lachmann, M., Szamado, S. & Bergstrom, C. (2001). Cost and conflict in animal signals and human language.
PNAS, 98(23), 13189–13194.
Leak, G. & Fish, S. (1989). Religious orientation, impression management, and self-deception: Toward a clarifica-
tion of the link between religiosity and social desirability. Journal for the Scientific Study of Religion, 28(3),
355–359.
Levy, N. (2018). Showing our seams: A reply to Eric Funkhouser. Philosophical Psychology, 31(7), 991–1006.
Maynard-Smith, J. & Harper, D. (2003). Animal signals. Oxford: Oxford University Press.
McCright, A. & Dunlap, R. (2011). The politicization of climate change and polarization in the American public's
views of global warming, 2001–2010. The Sociological Quarterly, 52, 155–194.
McElreath, R., Boyd, R. & Richerson, P. (2003). Shared norms and the evolution of ethnic markers. Current
Anthropology, 44(1), 122–130.
Mercier, H. & Sperber, D. (2017). The enigma of reason. Cambridge, MA: Harvard University Press.
Mijovic-Prelec, D. & Prelec, D. (2010). Self-deception as self-signalling: A model and experimental evidence. Phil-
osophical Transactions of the Royal Society, B: Biological Sciences, 365(1538), 227–240.
Nelissen, R. & Meijers, M. (2011). Social benefits of luxury brands as costly signals of wealth and status. Evolution
and Human Behavior, 32(5), 343–355.
Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychol-
ogy, 2(2), 175–220.
Paulhus, D. & Trapnell, P. (2008). Self-presentation of personality: An agency-communion framework. In O. John, R.
Robins & L. Pervin (Eds.), Handbook of personality: Theory and research (3rd ed.). New York, NY: Guilford Press.
Pennycook, G. & Rand, D. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by
lack of reasoning than by motivated reasoning. Cognition, 188, 39–50.
Petersen, M. (2016). Evolutionary political psychology. In D. Buss (Ed.), Handbook of evolutionary psychology
(pp. 1084–1102). Hoboken, NJ: Wiley.
PEW Research Center (2017). A growing share of Americans say it's not necessary to believe in God to be moral.
Retrieved from https://www.pewresearch.org/fact-tank/2017/10/16/a-growing-share-of-americans-say-its-
not-necessary-to-believe-in-god-to-be-moral/
PEW Research Center (2018). Why Americans go (and don't go) to religious services. Retrieved from https://www.
pewforum.org/2018/08/01/why-americans-go-to-religious-services/
Posner, E. (2000). Laws and social norms. Cambridge, MA: Harvard University Press.
Power, E. (2017). Discerning devotion: Testing the signaling theory of religion. Evolution and Human Behavior,
38(1), 82–91.
Przepiorka, W. & Berger, J. (2017). Signalling theory evolving: Signals and signs of trustworthiness in social
exchange. In B. Jann & W. Przepiorka (Eds.), Social dilemmas, institutions, and the evolution of cooperation
(pp. 373–392). Berlin: De Gruyter.
Richerson, P. & Boyd, R. (2005). Not by genes alone: How culture transformed human evolution. Chicago, IL: Uni-
versity of Chicago Press.
Schlenker, B. & Pontari, B. (2000). The strategic control of information: Impression management and self-
presentation in daily life. In A. Tesser, R. B. Felson & J. M. Suls (Eds.), Psychological perspectives on self and
identity (pp. 199–232). Washington, DC: American Psychological Association.
Searcy, W. & Nowicki, S. (2005). The evolution of animal communication. Princeton, NJ: Princeton University Press.
Soler, M. (2012). Costly signaling, ritual and cooperation: Evidence from Candomble, an Afro-Brazilian religion.
Evolution and Human Behavior, 33, 346–356.
Sosis, R. (2003). Why aren't we all Hutterites? Costly signaling theory and religious behavior. Human Nature, 14(2), 91–127.
Spence, M. (1973). Job market signaling. The Quarterly Journal of Economics, 87(3), 355–374.
Sunstein, C. (2000). Deliberative trouble? Why groups go to extremes. The Yale Law Journal, 110(1), 71–119.
Tajfel, H. & Turner, J. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.),
The social psychology of intergroup relations (pp. 33–47). Monterey, CA: Brooks/Cole.
Tooby, J. & Cosmides, L. (2010). Groups in mind: The coalitional roots of war and morality. In H. Hogh-Olesen (Ed.),
Human morality and sociality: Evolutionary and comparative perspectives (pp. 191–234). London: Red Globe Press.
Trivers, R. (2002). Self-deception in service of deceit. Reprinted in his Natural selection and social theory.
New York, NY: Oxford University Press.
Turner, J. C., Brown, R. J. & Tajfel, H. (1979). Social comparison and group interest in ingroup favouritism.
European Journal of Social Psychology, 9, 187–204.
Uziel, L. (2010). Rethinking social desirability scales: From impression management to interpersonally oriented
self-control. Perspectives on Psychological Science, 5(3), 243–262.
Veblen, T. (1899/1994). The theory of the leisure class: An economic study of institutions. New York, NY: Dover
Publications.
Vonnegut, K. (1963/2010). Cat's cradle. New York, NY: Random House.
Williams, D. (2020). Socially adaptive belief. Mind & Language, http://dx.doi.org/10.1111/mila.12294.
Zahavi, A. (1975). Mate selection—A selection for a handicap. Journal of Theoretical Biology, 53, 205–214.

How to cite this article: Funkhouser E. A tribal mind: Beliefs that signal group identity
or commitment. Mind Lang. 2022;37:444–464. https://doi.org/10.1111/mila.12326
