
Economics Letters 124 (2014) 195–198


Persuasion, binary choice, and the costs of dishonesty


Roland Hodler a, Simon Loertscher b,*, Dominic Rohner c

a Department of Economics, University of St. Gallen, Switzerland
b Department of Economics, University of Melbourne, Australia
c Department of Economics, Faculty of Business and Economics (HEC Lausanne), University of Lausanne, Switzerland

Highlights
• This paper studies persuasion when lies are costly and decisions are binary.
• It shows that the equilibrium probability that a biased sender gets his way is a non-monotone function of these costs.
• The results suggest that if the sender can determine these costs ex ante, he will choose intermediate costs.

Article info

Article history:
Received 27 March 2014
Received in revised form
5 May 2014
Accepted 16 May 2014
Available online 27 May 2014

abstract
We study the strategic interaction between a decision maker who needs to take a binary decision but is
uncertain about relevant facts and an informed expert who can send a message to the decision maker but
has a preference over the decision. We show that the probability that the expert can persuade the decision
maker to take the expert's preferred decision is a hump-shaped function of his costs of sending dishonest
messages.
© 2014 Elsevier B.V. All rights reserved.

JEL classification:
C72
D72
D82
Keywords:
Persuasion
Costly signaling
Expert advice
Information distortion

1. Introduction
Many decision problems are binary in nature and characterized
by the uncertainty that the decision maker faces about crucial
decision-relevant facts. Examples include a policy maker's decision
whether or not to realize a given infrastructure project, a Board's
decision whether or not to replace a company's CEO, the voters'
decision whether or not to re-elect an incumbent government,
or a judge's decision whether or not to convict a defendant. To

The authors would like to thank an anonymous referee whose comments have
helped us improve the paper. Financial support by the Faculty of Business and
Economics at the University of Melbourne via a Visiting Scholar Grant is also
gratefully acknowledged. This paper supersedes an earlier version titled "Biased
Experts, Costly Lies, and Binary Decisions".
* Corresponding author.
E-mail addresses: roland.hodler@unisg.ch (R. Hodler), simonl@unimelb.edu.au
(S. Loertscher), dominic.rohner@unil.ch (D. Rohner).

http://dx.doi.org/10.1016/j.econlet.2014.05.013
0165-1765/© 2014 Elsevier B.V. All rights reserved.

reduce uncertainty, decision makers often consult experts who
are better informed about the underlying facts. However, experts
may themselves have a preference over the decision, and this
preference may not be well-aligned with the decision maker's
preference. Examples include industry experts who are interested
in benefitting from public investment, CEOs and incumbent
governments who want to remain in power and who know more
about their performance and competence than Boards and voters,
respectively, and (expert) witnesses in trials who have private
information about the level of fault of the defendant, but may be
biased towards a specific outcome of the trial.
The question of how likely an expert is to persuade the decision
maker is of obvious interest and relevance for public policy and
the economics of organization. In this letter, we focus on one key
aspect that may affect persuasion: the expert's costs of dishonesty,
which can represent mental or moral costs of lying, reputational
concerns, or expected punishment when misreporting is unlawful.
Our main result is that experts will not be able to persuade a
critical decision maker if costs of dishonesty are very low, in which
case the decision maker rarely follows the expert's advice, or if
costs of dishonesty are very high, in which case the expert rarely
deviates from telling the truth. However, the expert frequently
succeeds in persuading the decision maker to take the expert's
preferred binary decision when the costs of dishonesty are intermediate.
Persuasion started to become an influential concept in economics with McCloskey and Klamer (1995). Recent theoretical contributions include Mullainathan et al. (2008) who focus on how
a sender can persuade receivers who are coarse thinking (rather
than Bayesian), and Kamenica and Gentzkow (2011) and Kolotilin
(2013) who study Bayesian persuasion in a setting in which the
sender can choose the signal, but cannot misreport its realization.
In contrast, we focus on Bayesian persuasion through misreporting
the truth.
The canonical model to study strategic interactions between a
sender (or expert) and a receiver (or decision maker) is Crawford
and Sobel (1982). We depart from their framework in two
important ways. First, we assume that it is costly for the sender to
misreport the state of the world. Second, the choice variable of the
receiver is binary rather than continuous. This second difference
implies that there is a conflict of interest between the sender and
the receiver for some, but not all states. While in Crawford and
Sobel's model the sender and the receiver perpetually disagree
about the optimal policy (even under complete information), their
disagreement is only partial in our setup.
Banks (1990) introduced lying costs into the literature on
strategic information transmission. Subsequent contributions
include Callander and Wilkie (2007), Kartik et al. (2007) and Kartik
(2009). Kartik et al. (2007) and Kartik (2009) add lying costs to the
framework of Crawford and Sobel, but maintain the assumptions
that the receiver's choice variable is continuous and that the
sender's preferred action increases in the state of the world. Our
analysis thus complements theirs with the key modification that
the receiver's choice is binary, which makes our model suitable to
study the real-world problems discussed above.
Our framework is relevant for various applications. First, it complements contributions showing how CEOs influence continuous
outcome variables, such as the market price of the firm (Fischer and
Verrecchia, 2000), their compensation (Goldman and Slezak, 2006),
or the range of possible projects (Adams and Ferreira, 2007). Second, when applied to incumbent government behavior, our model
is related to the aforementioned works by Banks (1990) and Callander and Wilkie (2007). While they analyze how two symmetric
candidates make costly lies about future policies, our model applies
to asymmetric elections in which an incumbent with an informational advantage about the state of the world runs for re-election.
Hence, our model is also related to Rogoff and Sibert (1988) and
Hodler et al. (2010), where an incumbent with private information
about his competence or the state of the world may choose socially
inefficient policies to improve his re-election prospects, and to Edmond (2013), where a dictator manipulates information to reduce
the risk of an uprising.
This letter is organized as follows: Section 2 describes the setup,
Section 3 provides the results and Section 4 concludes.
2. The setup
There are two strategic players: sender (or expert) S and receiver (or decision maker) R. The state of the world θ is a random draw from the distribution F(θ) with density f(θ) > 0 and support [0, 1], which is common knowledge. Timing and actions are as follows: first, S observes θ and sends message m ∈ [0, 1].¹ Second, R observes m (but not θ) and then has the binary choice of realizing or rejecting the project. We denote the probability with which she realizes the project by v. A strategy for R is thus a function v : [0, 1] → [0, 1], with v(m) ∈ [0, 1] denoting the probability of accepting, given message m. The joint assumption of binary choice and continuous states is a sensible approximation to many real-world problems, including shareholders' and voters' decisions (not) to re-elect an incumbent, and a policymaker's decision to accept or reject an infrastructure project with uncertain returns.

Payoffs are as follows: S receives a benefit of 1 if and only if the project is realized, but has to bear costs whenever his message is not truthful. These costs of dishonesty are kc(d), where k ≥ 0, d ≡ |m − θ|, c(0) = 0, c′(d) > 0 and c″(d) ≥ 0. R's net utility from realizing the project is uR(θ), which satisfies u′R(θ) > 0 and uR(0) < 0 < uR(1). Let θ̂ be the unique number such that uR(θ̂) = 0. To make the problem interesting, assume ∫₀¹ uR(θ)f(θ)dθ ≤ 0. This ensures that R's expected net utility from realizing the project would be negative in the absence of any information about θ other than F(θ).
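For concreteness, the primitives above can be instantiated numerically. The sketch below is purely illustrative and not from the paper: it assumes a uniform F on [0, 1], linear lying costs c(d) = d, and uR(θ) = θ − 2/3, and merely checks that this parameterization satisfies the maintained assumptions (uR(0) < 0 < uR(1) and ∫₀¹ uR(θ)f(θ)dθ ≤ 0, with θ̂ = 2/3).

```python
# Illustrative parameterization (assumed for this sketch, not from the paper):
# F uniform on [0, 1], linear lying costs c(d) = d, and u_R(theta) = theta - 2/3.

def f(theta):          # density of the uniform state distribution
    return 1.0

def c(d):              # cost-of-dishonesty function: c(0) = 0, c' > 0
    return d

def u_R(theta):        # receiver's net utility from realizing the project
    return theta - 2.0 / 3.0

def integrate(g, lo, hi, n=10_000):
    """Simple midpoint-rule integral of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

# The model's maintained assumptions hold for this parameterization:
assert u_R(0) < 0 < u_R(1)
assert integrate(lambda t: u_R(t) * f(t), 0, 1) <= 0   # E[u_R] = -1/6
theta_hat = 2.0 / 3.0          # unique root of u_R, i.e. u_R(theta_hat) = 0
assert abs(u_R(theta_hat)) < 1e-12
```

Any other increasing uR with a single interior root and negative prior mean, and any convex c with c(0) = 0, would serve equally well.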
The solution concept is perfect Bayesian equilibrium (PBE), and we focus on PBE that satisfy the restrictions on off-equilibrium beliefs proposed by Grossman and Perry's (1986) concept of Perfect Sequential Equilibria (PSE). Intuitively, the PSE concept agrees with the Intuitive Criterion that after receiving an off-equilibrium message m̂, R should put zero probability on states at which S could not possibly benefit from a deviation, but adds the requirement that R should put probability that is proportional to the prior over θ on the possibility that S has deviated at any state at which the deviation could potentially be profitable for him.

To be precise, denote by μ(θ) and v(μ(θ)) the actions a PBE prescribes S and R to take after observing θ and m, respectively. Fix an equilibrium and let uS(v, m|θ) be S's expected payoff given θ, when he plays m and R plays v, and let β(θ|m) be R's posterior belief that the state is θ when the message is m. Consequently, uS(v(μ(θ)), μ(θ)|θ) is S's expected equilibrium payoff, given θ. Then:
Definition 1. A PSE is a perfect Bayesian equilibrium in which, after observing some m̂ that S does not play in equilibrium, R's beliefs satisfy (i) β(θ|m̂) = 0 if uS(1, m̂|θ) < uS(v(μ(θ)), μ(θ)|θ), and (ii) β(θ₁|m̂)/β(θ₀|m̂) = f(θ₁)/f(θ₀) if uS(1, m̂|θᵢ) ≥ uS(v(μ(θᵢ)), μ(θᵢ)|θᵢ) for i = 0, 1.
In what follows we focus on monotone PSE, i.e., PSE in which μ(θ₀) ≤ μ(θ₁) and v(m₀) ≤ v(m₁) for θ₀ ≤ θ₁ and m₀ ≤ m₁, respectively.
3. Results
3.1. Equilibrium

Let θ* be the unique number such that ∫_{θ*}^1 uR(θ)f(θ)dθ = 0. That is, if R's posterior satisfies β(θ|m) ∝ f(θ) for all θ ∈ [θ*, 1] and β(θ|m) = 0 otherwise, she is indifferent between accepting and rejecting the project. This is the case if some message m is sent with equal probability in all states θ ∈ [θ*, 1] and with zero probability otherwise. Notice that θ* < θ̂.
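The cutoff θ* is easy to compute numerically. The sketch below again uses the illustrative parameterization of a uniform F and uR(θ) = θ − 2/3 (an assumption for this sketch, not from the paper): bisection on the condition ∫_{θ*}^1 uR(θ)f(θ)dθ = 0 recovers θ* = 1/3, which indeed lies below θ̂ = 2/3.

```python
# Find theta* with int_{theta*}^1 u_R(t) f(t) dt = 0, for the illustrative
# parameterization (assumed, not from the paper): F uniform, u_R(t) = t - 2/3.

def u_R(theta):
    return theta - 2.0 / 3.0

def tail_expectation(cutoff, n=1_000):
    """Midpoint-rule value of the integral of u_R(t)*f(t) over [cutoff, 1], f uniform."""
    h = (1.0 - cutoff) / n
    return sum(u_R(cutoff + (i + 0.5) * h) for i in range(n)) * h

def bisect(g, lo, hi, tol=1e-6):
    """Bisection root finder; assumes g changes sign on [lo, hi]."""
    glo = g(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        gm = g(mid)
        if glo * gm <= 0:
            hi = mid
        else:
            lo, glo = mid, gm
    return 0.5 * (lo + hi)

# tail_expectation is negative at 0 (the prior mean of u_R is -1/6)
# and rises through 0 at theta* = 1/3:
theta_star = bisect(tail_expectation, 0.0, 0.66)
theta_hat = 2.0 / 3.0
assert abs(theta_star - 1.0 / 3.0) < 1e-4
assert theta_star < theta_hat            # theta* < theta_hat, as the text notes
```

For this parameterization the integral condition reduces analytically to 3θ*² − 4θ* + 1 = 0, whose root in [0, 1) is θ* = 1/3, matching the numerical result.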
The equilibrium behavior depends on whether S would be willing to play μ(θ*) = 1 if v(1) = 1 but v(m) = 0 for any m < 1, i.e., on whether S would misreport state θ* by d = 1 − θ* if doing so helps to get the project realized. Hence, the equilibrium behavior depends on whether or not the cost parameter k exceeds 1/c(1 − θ*). Let d̄ ≡ c⁻¹(1/k) > 0 and denote by θ̄ the unique number such that ∫_{θ̄−d̄}^{θ̄} uR(θ)f(θ)dθ = 0. The following proposition describes the unique monotone PSE:

¹ Assuming that S observes θ eases the exposition. All our results go through if S only observes a noisy signal of θ, provided that θ and the signal are affiliated random variables.

Fig. 1. The monotone PSE. (a) k ≤ 1/c(1 − θ*). (b) k > 1/c(1 − θ*).

Proposition 1. There exists a unique monotone PSE. Given k ≤ 1/c(1 − θ*), S plays μ(θ) = θ if θ < θ*, and μ(θ) = 1 if θ ≥ θ*, and R plays v(m) = 0 for all m < 1, and v(1) = kc(1 − θ*) < 1. Given k > 1/c(1 − θ*), S plays μ(θ) = θ if θ < θ̄ − d̄ or θ ≥ θ̄, and μ(θ) = θ̄ if θ ∈ [θ̄ − d̄, θ̄), and R plays v(m) = 0 for all m < θ̄ and v(m) = 1 for all m ≥ θ̄.

The proof of Proposition 1 is provided in the Supplementary material (see Appendix A). Fig. 1 provides the intuition for Proposition 1.

The left-hand graph shows the equilibrium if k is small. In this case, R does not realize the project for any message m < 1 and plays a mixed strategy when observing m = 1. She thereby mixes in such a way that, at state θ*, S is indifferent between the truthful message μ(θ*) = θ* and not having the project realized, and the distorted message μ(θ*) = 1 and having the project realized with probability v(1). If R does not realize the project for any off-equilibrium message m̂ ∈ [θ*, 1), which is consistent with the PSE requirements, S finds it indeed optimal to send the truthful message μ(θ) = θ for θ < θ* and μ(θ) = 1 for θ ≥ θ*.
As k increases beyond the threshold 1/c(1 − θ*), the constraint v(m) ≤ 1 becomes binding. That is, R cannot possibly give S more than acceptance with probability 1 upon a sufficiently high message m. This reduces the size of the maximum distortion that can be supported in equilibrium, and thereby the size of the interval for which S sends a dishonest message. The right-hand graph in Fig. 1 shows the resulting equilibrium when k is large. In this case, R does not realize the project for any message m < θ̄, but realizes the project for m ≥ θ̄. When observing message m = θ̄, which is sent for all states θ ∈ [θ̄ − d̄, θ̄], R is indifferent and therefore willing to realize the project. It is consistent with the PSE requirements that R would not realize the project for any off-equilibrium message m̂ ∈ [θ̄ − d̄, θ̄). Given this strategy of R, S finds it indeed optimal to send message μ(θ) = θ̄ for all θ ∈ [θ̄ − d̄, θ̄), and the truthful message μ(θ) = θ for all other θ.
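The two equilibrium regimes can be written out directly in code. The sketch below is a reading of Proposition 1 under the illustrative parameterization used earlier (assumed, not from the paper): F uniform on [0, 1], uR(θ) = θ − 2/3, c(d) = d, so θ* = 1/3 and the regime threshold is 1/c(1 − θ*) = 3/2. For this parameterization the indifference condition defining θ̄ reduces to θ̄ = 2/3 + d̄/2.

```python
# Equilibrium strategies of Proposition 1 under an illustrative
# parameterization (assumed for this sketch, not part of the paper):
# F uniform on [0, 1], u_R(theta) = theta - 2/3, c(d) = d.
# Then theta* = 1/3 and the regime threshold is 1/c(1 - theta*) = 3/2.

THETA_STAR = 1.0 / 3.0
K_BAR = 1.0 / (1.0 - THETA_STAR)             # threshold 1/c(1 - theta*) = 3/2

def equilibrium(k):
    """Return (mu, v): S's message strategy and R's acceptance probability."""
    if k <= K_BAR:
        # Regime (a): all states theta >= theta* pool on m = 1; R mixes at m = 1.
        def mu(theta):
            return theta if theta < THETA_STAR else 1.0
        v1 = k * (1.0 - THETA_STAR)          # v(1) = k c(1 - theta*) < 1
        def v(m):
            return v1 if m == 1.0 else 0.0
    else:
        # Regime (b): states in [theta_bar - d_bar, theta_bar) pool on m = theta_bar.
        d_bar = 1.0 / k                      # c^{-1}(1/k) for linear c
        theta_bar = 2.0 / 3.0 + d_bar / 2.0  # solves the indifference integral
        def mu(theta):
            return theta_bar if theta_bar - d_bar <= theta < theta_bar else theta
        def v(m):
            return 1.0 if m >= theta_bar else 0.0
    return mu, v

mu, v = equilibrium(1.0)     # small k: pooling at m = 1, R accepts with prob 2/3
assert mu(0.5) == 1.0 and abs(v(1.0) - 2.0 / 3.0) < 1e-9
mu, v = equilibrium(3.0)     # large k: d_bar = 1/3, theta_bar = 5/6
assert abs(mu(0.6) - 5.0 / 6.0) < 1e-9 and mu(0.2) == 0.2 and v(0.9) == 1.0
```

The assertions mirror the figure: for k = 1 the sender pools on m = 1 and is believed only probabilistically, while for k = 3 only states in [1/2, 5/6) distort, pooling on the message 5/6.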
3.2. Main result
In this section we study the effect of changes in the costs of
dishonesty on persuasion. We thereby define persuasion as the ex
ante probability P(k) that the project is realized in the equilibrium
of the game and study how P(k) depends on the cost parameter k:
Proposition 2. P(k) increases in k for k ≤ 1/c(1 − θ*) and decreases in k for k > 1/c(1 − θ*).

Proof. Proposition 1 implies P(k) = [1 − F(θ*)]kc(1 − θ*) for k ≤ 1/c(1 − θ*), and P(k) = 1 − F(θ̄ − d̄) for k > 1/c(1 − θ*). Observe first that θ* is independent of k. Hence, it directly follows that P′(k) > 0 for k ≤ 1/c(1 − θ*). Observe second that since d̄ decreases in k, it follows from ∫_{θ̄−d̄}^{θ̄} uR(θ)f(θ)dθ = 0 that θ̄ − d̄ increases in k (while θ̄ decreases in k). Hence, it directly follows that P′(k) < 0 for k > 1/c(1 − θ*).
The intuition for this hump-shaped relationship is as follows: if dishonesty is not very costly, i.e., if k is relatively small, S sends the message μ(θ) = 1 for all θ ≥ θ*, but R responds by only realizing the project with probability v(1) = kc(1 − θ*). As k and the associated costs of dishonesty increase, R realizes the project with ever higher probability v(1) in order to keep S indifferent between the truthful message μ(θ*) = θ* and μ(θ*) = 1 at state θ*. Persuasion then peaks when k = 1/c(1 − θ*). In this case, S persuades R with probability one for all states θ ≥ θ* as his costs of misreporting by d = 1 − θ* at state θ* are just equal to his benefit from the project's realization. As dishonesty becomes even costlier, S's willingness to misreport decreases and he only persuades R for ever fewer states of the world.²
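As a numerical check of the hump shape, the formulas in the proof of Proposition 2 give an explicit P(k) under the illustrative parameterization used above (these functional forms are an assumption of the sketch, not from the paper): F uniform, uR(θ) = θ − 2/3, c(d) = d, so θ* = 1/3 and the peak sits at k = 1/c(1 − θ*) = 3/2.

```python
# Ex ante persuasion probability P(k) from the proof of Proposition 2,
# evaluated under an illustrative parameterization (assumed, not from the
# paper): F uniform on [0, 1], u_R(theta) = theta - 2/3, c(d) = d,
# hence theta* = 1/3 and the peak is at k = 1/c(1 - theta*) = 3/2.

THETA_STAR = 1.0 / 3.0
K_BAR = 1.5                                   # 1/c(1 - theta*) for linear c

def P(k):
    if k <= K_BAR:
        # P(k) = [1 - F(theta*)] k c(1 - theta*)
        return (1.0 - THETA_STAR) * k * (1.0 - THETA_STAR)
    # P(k) = 1 - F(theta_bar - d_bar), with d_bar = 1/k and
    # theta_bar - d_bar = 2/3 - d_bar/2 in this parameterization
    d_bar = 1.0 / k
    return 1.0 - (2.0 / 3.0 - d_bar / 2.0)

ks = [0.25 * i for i in range(1, 41)]         # grid over k in (0, 10]
values = [P(k) for k in ks]

# Hump shape: increasing up to k = 3/2 (grid index 5), decreasing afterwards,
# with maximum P(3/2) = 2/3 (full persuasion for all states theta >= theta*).
assert all(a < b for a, b in zip(values[:6], values[1:6]))
assert all(a > b for a, b in zip(values[5:], values[6:]))
assert abs(max(values) - 2.0 / 3.0) < 1e-9
```

At the peak the project is realized whenever θ ≥ θ*, i.e., with probability 1 − F(θ*) = 2/3; on either side of the threshold persuasion strictly falls, exactly as Proposition 2 states.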
4. Conclusions
We have shown that an expert is most persuasive if misreporting the truth is neither too cheap nor too costly. Hence, one should
expect experts to be most influential in circumstances in which
their costs of dishonesty are intermediate.
Appendix A. Supplementary material
Supplementary material related to this article can be found
online at http://dx.doi.org/10.1016/j.econlet.2014.05.013.
References
Adams, Renée, Ferreira, Daniel, 2007. A theory of friendly boards. J. Finance LXII, 217–250.
Banks, Jeffrey S., 1990. A model of electoral competition with incomplete information. J. Econom. Theory 50, 309–325.
Callander, Steven, Wilkie, Simon, 2007. Lies, damned lies and political campaigns. Games Econom. Behav. 60, 262–286.

² Related to Proposition 2, it holds that S's expected utility EuS is maximized by some k ≤ 1/c(1 − θ*). To see this, observe that EuS = λk if k ≤ 1/c(1 − θ*), where λ ≡ [1 − F(θ*)]c(1 − θ*) − ∫_{θ*}^1 c(1 − θ)f(θ)dθ > 0.


Crawford, Vincent, Sobel, Joel, 1982. Strategic information transmission. Econometrica 50, 1431–1451.
Edmond, Chris, 2013. Information manipulation, coordination, and regime change. Rev. Econom. Stud. 80, 1422–1458.
Fischer, Paul, Verrecchia, Robert, 2000. Reporting bias. Account. Rev. 75, 229–245.
Goldman, Eitan, Slezak, Steve, 2006. An equilibrium model of incentive contracts in the presence of information manipulation. J. Financ. Econ. 80, 603–626.
Grossman, Sanford J., Perry, Motty, 1986. Perfect sequential equilibrium. J. Econom. Theory 39, 97–119.
Hodler, Roland, Loertscher, Simon, Rohner, Dominic, 2010. Inefficient policies and incumbency advantage. J. Public Econ. 94, 761–767.
Kamenica, Emir, Gentzkow, Matthew, 2011. Bayesian persuasion. Amer. Econ. Rev. 101, 2590–2615.
Kartik, Navin, 2009. Strategic communication with lying costs. Rev. Econom. Stud. 76, 1359–1395.
Kartik, Navin, Ottaviani, Marco, Squintani, Francesco, 2007. Credulity, lies and costly talk. J. Econom. Theory 134, 93–116.
Kolotilin, Anton, 2013. Experimental design to persuade. Working Paper, University of New South Wales.
McCloskey, Donald, Klamer, Arjo, 1995. One quarter of GDP is persuasion. Amer. Econ. Rev. Pap. Proc. 85, 191–195.
Mullainathan, Sendhil, Schwartzstein, Joshua, Shleifer, Andrei, 2008. Coarse thinking and persuasion. Quart. J. Econ. 123, 577–619.
Rogoff, Kenneth, Sibert, Anne, 1988. Elections and macroeconomic policy cycles. Rev. Econom. Stud. 55, 1–16.
