
Lecture Notes 11

Game Theory II

Xu Le
National University of Singapore
Outline
Ø  Nash equilibrium
Ø  Repeated Games
Ø  Sequential Games
•  Backward Induction/Subgame Perfection
•  Centipede Game
Ø  Strategic Moves
The Nash Equilibrium Revisited
Dominant Strategies: I’m doing the best I can no matter what you do. You’re doing
the best you can no matter what I do.
Nash Equilibrium: I’m doing the best I can given what you are doing. You’re doing
the best you can given what I am doing.

Example: You (Y) and a competitor (C) plan to sell soft drinks on a beach.
If sunbathers are spread evenly across the beach and will walk to the closest vendor,
the two of you will locate next to each other at the center of the beach. This is the
only Nash equilibrium.
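This location logic can be checked numerically. Below is a minimal sketch (assuming, purely for illustration, a beach discretized into 11 evenly spaced spots with one sunbather at each): a vendor's payoff is the number of sunbathers strictly closer to it, with ties split, and a brute-force search confirms that both vendors at the center is the only Nash equilibrium.

```python
# Discretized beach-location game (hypothetical discretization for illustration):
# 11 spots, one sunbather per spot; each buys from the nearer vendor, ties split.

def payoff(me, rival, spots=11):
    share = 0.0
    for s in range(spots):
        d_me, d_rival = abs(s - me), abs(s - rival)
        if d_me < d_rival:
            share += 1.0        # sunbather strictly closer to me
        elif d_me == d_rival:
            share += 0.5        # equidistant: split the sale
    return share

def nash_equilibria(spots=11):
    eq = []
    for a in range(spots):
        for b in range(spots):
            best_a = max(payoff(x, b) for x in range(spots))
            best_b = max(payoff(y, a) for y in range(spots))
            # (a, b) is a Nash equilibrium if each location is a best
            # response to the other vendor's location.
            if payoff(a, b) == best_a and payoff(b, a) == best_b:
                eq.append((a, b))
    return eq

print(nash_equilibria())  # → [(5, 5)]: both vendors at the center
```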
THE PRISONERS’ DILEMMA

TABLE 13.5 PRISONERS’ DILEMMA

                                    Prisoner B
                           Confess        Don’t confess
Prisoner A  Confess         –5, –5            0, –10
            Don’t confess   –10, 0            –1, –1

The ideal outcome is one in which neither prisoner confesses, so that both get 1 year
in prison. However, if A chooses “Don’t confess”, B has an incentive to deviate
from “Don’t confess” to “Confess”.

Let us try to find the game’s most likely outcome: the Nash equilibrium.


What would be the outcome (equilibrium) of such a game?

How to solve for the Nash equilibrium?

Step 1:
Find one player’s best response to each of the possible strategies played by the
other.
Circle the payoffs of this player that result from his/her best response and the
given play of the other.

Step 2: Repeat this procedure for the other player.

The combination of strategies that results in two circles in the same cell is a Nash equilibrium.
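The two steps above amount to a best-response search over the payoff matrix: a cell where both players are simultaneously best-responding is a cell with “two circles”. A minimal Python sketch, using the Table 13.5 payoffs (strategy 0 = Confess, 1 = Don’t confess):

```python
# "Circle method" as a best-response search over the prisoners' dilemma.
# Payoffs are (A's, B's); strategies: 0 = Confess, 1 = Don't confess.
payoffs = {
    (0, 0): (-5, -5), (0, 1): (0, -10),
    (1, 0): (-10, 0), (1, 1): (-1, -1),
}

def nash_equilibria(payoffs, n=2):
    eq = []
    for a in range(n):
        for b in range(n):
            # Step 1: circle A's payoff if a is a best response to b.
            a_best = payoffs[a, b][0] == max(payoffs[x, b][0] for x in range(n))
            # Step 2: circle B's payoff if b is a best response to a.
            b_best = payoffs[a, b][1] == max(payoffs[a, y][1] for y in range(n))
            if a_best and b_best:  # two circles in the same cell
                eq.append((a, b))
    return eq

print(nash_equilibria(payoffs))  # → [(0, 0)]: (Confess, Confess)
```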
Prisoners’ Dilemma: Solve the Game

TABLE 13.5 PRISONERS’ DILEMMA

                                    Prisoner B
                           Confess        Don’t confess
Prisoner A  Confess         –5, –5            0, –10
            Don’t confess   –10, 0            –1, –1
Prisoner’s Dilemma– cont.

Ø  The Nash equilibrium of the game is the strategy profile (Confess, Confess).
    §  Both prisoners will choose to confess.

Ø  Does this strategy profile maximize their collective payoffs?
    §  It doesn’t. If they both deny, they end up with –2 in total.
    §  Yet (Don’t confess, Don’t confess) is not an equilibrium.
    §  Each prisoner’s individual incentive conflicts with their collective interest!

Ø  When individual incentives conflict with the collective interest, individual
    decision making without enforcement cannot reach the collective optimum.

An example of the Prisoners’ Dilemma:

https://www.youtube.com/watch?v=TKaYRH6E36U
Finding a Nash Equilibrium

Consider a two-player game with five strategies for each player:

                              Player 2
                F       G       H       I       J
           A  9, 9    7, 1    5, 6    3, 4    1, 1
           B  7, 8    5, 2    3, 6    1, 4    3, 3
Player 1   C  5, 6    3, 3    1, 8    9, 7    1, 5
           D  3, 9    1, 9    9, 4    7, 9    5, 9
           E  1, 2    9, 8    7, 7    5, 6    3, 7

What are the Nash equilibria of this game?
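As a check (not part of the original notes), the same circle method can be run on this 5×5 matrix by brute force. Ties count as best responses, which matters here because Player 2 has a four-way payoff tie in row D:

```python
# Best-response search on the 5x5 matrix: rows A..E for Player 1,
# columns F..J for Player 2. Each entry is (Player 1's, Player 2's) payoff.
rows, cols = "ABCDE", "FGHIJ"
P = [
    [(9, 9), (7, 1), (5, 6), (3, 4), (1, 1)],
    [(7, 8), (5, 2), (3, 6), (1, 4), (3, 3)],
    [(5, 6), (3, 3), (1, 8), (9, 7), (1, 5)],
    [(3, 9), (1, 9), (9, 4), (7, 9), (5, 9)],
    [(1, 2), (9, 8), (7, 7), (5, 6), (3, 7)],
]

equilibria = []
for i in range(5):
    for j in range(5):
        # Player 1 best-responds down column j; Player 2 along row i.
        best_1 = P[i][j][0] == max(P[x][j][0] for x in range(5))
        best_2 = P[i][j][1] == max(P[i][y][1] for y in range(5))
        if best_1 and best_2:
            equilibria.append(rows[i] + cols[j])

print(equilibria)  # → ['AF', 'DJ', 'EG']
```

Note that (D, J) qualifies because J ties with F, G, and I as Player 2’s best response to D, while D is Player 1’s unique best response to J.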


Why is it called a Coordination Game?

Ø  If the two players communicate with each other before they act, they will follow
    what they agreed on when they take their actions, because…
    •  They have a common interest!
    •  Their agreement is a Nash equilibrium.
Ø  The best choice depends on what each player thinks the other party is likely to
    do.

Coordination Game 2: Meeting at Canteen

You: “Hey, do you want to discuss the homework?”
Friend: “Sure, I’m at the Central Library. Where are you?”
You: “YIH. Shall we meet at a canteen?”
Friend: “OK, which…”
[Her cellphone runs out of power before the two of you have agreed on which canteen
to meet at.]
Which canteen will you head for?
Coordination Game– cont.

How many Nash equilibria are there in the game?

Two: (ARTS, ARTS) and (BIZ, BIZ)

Which NE will be picked?

(ARTS, ARTS) is more likely to be played than (BIZ, BIZ). This is called a focal point.

A focal point may stem from custom, common sense, tradition, etc.
II. Repeated Games
Repeated Games
● repeated game Game in which actions are taken and payoffs received over and over
again.
TABLE 13.8 PRICING PROBLEM

                              Firm 2
                    Low price       High price
Firm 1  Low price    10, 10          100, –50
        High price   –50, 100         50, 50

Suppose this game is repeated over and over again—for example, you and your
competitor simultaneously announce your prices on the first day of every month. Should
you then play the game differently?
Tit-for-Tat Strategy
Ø  Consider the following “trigger strategy” for each firm:
    •  “High Price, provided the rival has never chosen Low Price in the past. If
       the rival has ever done so, ‘punish’ it by charging Low Price forever after.”

Ø  In effect, each firm agrees to “cooperate” so long as the rival hasn’t “cheated” in
    the past.
    •  “Cheating” triggers punishment in all future periods.

● tit-for-tat strategy  Repeated-game strategy in which a player responds in kind to
an opponent’s previous play, cooperating with cooperative opponents and retaliating
against uncooperative ones.
Infinitely Repeated Game

When my competitor and I repeatedly set prices month after month, forever,
cooperative behavior (i.e., charging a high price) is the rational response to a
tit-for-tat strategy.

With infinite repetition of the game, the expected gains from cooperation outweigh
those from undercutting.

Finite Number of Repetitions

Now suppose the game is repeated a finite number of times—say, N months. (N can be
large as long as it is finite.) Backward induction then unravels cooperation: both
firms know the rival will charge the low price in month N, so there is no reason to
cooperate in month N – 1, and so on all the way back to the first month.
Example: Oligopolistic Cooperation in the Water Meter Industry

For some four decades, almost all the water meters sold in
the United States have been produced by four American
companies: Rockwell International, Badger Meter, Neptune
Water Meter Company, and Hersey Products.
Most buyers of water meters are municipal water utilities,
who install the meters in order to measure water
consumption and bill consumers accordingly.

With inelastic and stable demand and little threat of entry by new firms, the existing four
firms could earn substantial monopoly profits if they set prices cooperatively. If, on the
other hand, they compete aggressively, profits would fall to nearly competitive levels.
The firms thus face a prisoners’ dilemma. Can cooperation prevail?
Example: Competition and Collusion in the Airline industry

In March 1983, American Airlines proposed that all


airlines adopt a uniform fare schedule based on mileage.
This proposal would have done away with the many
different fares then available. Most other major airlines
reacted favorably to the plan and began to adopt it.

Was it really to “help reduce fare confusion”? No, the aim was to reduce price
competition and achieve a collusive pricing arrangement. Prices had been driven down
by competitive undercutting, as airlines competed for market share. The plan failed, a
victim of the prisoners’ dilemma.
How to induce and sustain cooperation

Ø  Cooperation yields higher profits, so why not cooperate?
    •  The trade-off: current gain vs. future gain
    •  Defect: current, short-run gain
    •  Cooperate: future, long-run gain

Ø  How to induce and sustain cooperation?
    •  The promise of future rewards and the threat of future punishments.
“End-of-Period” Problem
Ø  Lame ducks
    •  Presidents and CEOs close to the end of their terms have little incentive
       to cooperate.
    •  So do employees approaching retirement.

Ø  As the relationship comes to an end, threats and promises about future behavior
    disappear.

Ø  If everyone knows the relationship will end at a certain time, cooperation is
    hard to sustain.
The Likelihood of Cooperation

Cooperation is more likely to be sustained when:

Ø  The players are patient.

Ø  Interactions between the players are frequent.

Ø  Cheating is easy to detect.

Ø  The one-time gain from cheating is relatively small.


III. Sequential Games
Sequential-Move Games

Ø  On many occasions, players move sequentially: one player may know what the
    other has done before s/he decides on the next move.

Example: Defending your island

Ø  You are the commander of a Roman military force in charge of defending an
    island against the threat of barbarians. The barbarians, whose force slightly
    outnumbers yours, may launch an attack.
Ø  You may either fight, or retreat via a bridge.
    •  If you fight, both sides suffer heavy casualties, and you may lose the
       battle and be captured.
    •  If you retreat, you lose your honor, but avoid risking your life.
Defending Your Island
Two Nash equilibria exist. But which one is more plausible?
•  The only plausible equilibrium: (Barbarians attack, you retreat if attacked).
Why?
•  Suppose you commit to fight before Barbarians attack. Do you really like to
fight Barbarians after they have launched the attack?

This is a sequential-move game.

•  (Barbarians stay put, you fight if attacked) is not a subgame perfect
   equilibrium.
Game Tree and Extensive-Form Game

We represent the game in extensive form to account for the timing of moves:
A Game Tree
Nodes and Branches

Ø  Node: a point at which a player has to make a move.

Ø  Branch: each branch leading from a node represents one of the choices available
    to the player at that node.

Ø  Payoffs: the left number is the first mover’s payoff for an outcome; the right
    number is the second mover’s payoff.

Ø  We call the strategic interaction that starts from a single, separate node a
    “subgame”.
How to solve such a game

Ø  How would you make your decision if you were the pilot?

Ø  A strategic player anticipates the future, but reasons backward!
    •  Anticipate the response of the other player, and choose the action that
       induces the most favorable response.

Ø  Backward induction: starting from the last stage of the game, find the best
    response to each preceding action.
Use Backward Induction to Solve the Game

Ø  The Unique Subgame Perfect Equilibrium: (Schiphol, Give In).


Backward Induction
Ø  Start from the final nodes (the last mover’s potential action choices), and find
    the best action the player can take at each final node (in each contingency).

Ø  Highlight the branches that represent those best actions.

Ø  Move back to the nodes at the preceding level of the tree. Find the best actions
    the player can take at those nodes, and highlight the branches that represent
    them. Do not stop until you reach the root of the game tree.

Ø  The equilibrium we find is called a Subgame Perfect Equilibrium. A Subgame
    Perfect Equilibrium must be a Nash equilibrium, but a Nash equilibrium need not
    be a Subgame Perfect Equilibrium.
Experiment: Centipede Game
§  Original Source: Rosenthal (1981, Journal of Economic Theory)

A --R--> B --R--> A --R--> B --R--> A --R--> B --R--> (8, 6)
|        |        |        |        |        |
D        D        D        D        D        D
|        |        |        |        |        |
(2, 0)   (1, 3)   (4, 2)   (3, 5)   (6, 4)   (5, 7)

Payoffs: (A’s, B’s)

How would you play the game if you were A? How about B?
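Backward induction gives a stark answer. The following sketch (the node indexing and payoff lists are just one way to encode the tree above) folds the game back from the last node; at every node the mover prefers D, so the subgame perfect equilibrium has A stopping immediately, with payoffs (2, 0):

```python
# Backward induction on the centipede game. At node k (0-indexed), the
# mover is A if k is even, B if k is odd. D(own) ends the game with
# down_payoffs[k]; R(ight) continues, and R at the last node yields (8, 6).
down_payoffs = [(2, 0), (1, 3), (4, 2), (3, 5), (6, 4), (5, 7)]
final = (8, 6)

def solve(k=0):
    """Return the backward-induction outcome from node k onward,
    plus the mover's chosen action at each remaining node."""
    if k == len(down_payoffs):
        return final, []
    mover = k % 2                        # 0 = A moves, 1 = B moves
    cont, plan = solve(k + 1)            # value of playing R (continuing)
    if down_payoffs[k][mover] >= cont[mover]:
        return down_payoffs[k], ["D"] + plan
    return cont, ["R"] + plan

outcome, plan = solve()
print(outcome, plan)  # → (2, 0) ['D', 'D', 'D', 'D', 'D', 'D']
```

Experiments find that real players usually continue for several rounds before stopping, which is why the game is a classic test of backward-induction reasoning.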

END of LECTURE NOTES 11
