
Information

BUSI3008 Risk, Information & Insurance


Contents

• Dimensions and characteristics

• Describing, comparing, and updating information

• Information quality
Dimensions and Characteristics
Dimensions

Technically speaking, information has at least two dimensions:


• Engineering aspect: Addressing questions such as how to convey a maximum of
information from sender to recipient, given that
• The transmission channel has limited capacity
• The transmission may be disturbed by some external noise
• Semantic aspect: Showing that messages have a meaning, i.e., they refer to
concepts known to the sender and—hopefully—to the recipient

Economists have borrowed from both the engineering and the semantic approaches to information.
As an Economic Good

Information can be viewed as an economic good.

• It can be produced, stored, consumed, invested, or sold.

• It has some value.


Characteristics

Characteristics that distinguish information from most other goods:

• Can be sold without being given away


• Is cheap to reproduce
• Cannot be actively disposed of
• Cannot be detected in a person
• Can often not be prevented from spreading
• Can often not be valued before it is known
• Can be about facts or about other people's information
Describing, Comparing, and Updating Information
State-Space Approach

Assumptions
• The world can take one of several different states.
• Each state is a complete description of reality.
• Only one of the states can hold at any time.
• The list, collection, or set of all possible states is called the state space, denoted by Ω = {ω₁, ω₂, …, ωₙ}.
• The set of possible states is finite (the small-world assumption).
• There are no states of which individuals are not aware.

Example
• Rolling a die leads to the state space Ω = {1, 2, 3, 4, 5, 6}.
Information Structure

An information structure is the result of splitting the state space into subsets, i.e., of forming a partition of the state space.

The subsets, i.e., the elements of the partition, are called events or, as they are often referred to in game theory, information sets.

In the dice example, “odd-even” can be an information structure. “Odd” and “even” are then two events.
Signal

A signal is an information source telling individuals in which event the true state lies.

The signal can take different values or, equivalently, have different
realizations.

In the dice example, a signal can take the values {odd, even} and indicates which of these events has happened.
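
As a small illustration, here is a minimal Python sketch of the dice example: the state space, the odd/even information structure, and the corresponding signal (the variable and function names are our own):

# State space for rolling a die
omega = {1, 2, 3, 4, 5, 6}

# Information structure: a partition of omega into events
odd, even = {1, 3, 5}, {2, 4, 6}
structure = [odd, even]

def signal(true_state):
    # Report which event of the information structure contains the true state
    return "odd" if true_state in odd else "even"

print(signal(4))  # -> even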
Perfect and Void Signals

A perfect signal takes different values for each possible state, thus
creating an information structure of events with only one state in each.
[highest information content]

A void signal returns an information structure consisting of only one event, i.e., the entire state space. [no information content]

In reality, most signals fall between these two extremes. They do contain some information but are often imperfect. For example, a signal with values {odd, even} is imperfect, as both realizations contain more than one state.
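
Continuing the sketch above, the two extreme information structures can be written down explicitly:

omega = {1, 2, 3, 4, 5, 6}  # the dice state space from above

# Perfect signal: one event per state (the finest partition; highest information content)
perfect = [{w} for w in sorted(omega)]  # [{1}, {2}, {3}, {4}, {5}, {6}]

# Void signal: a single event equal to the whole state space (no information content)
void = [omega]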
Noisy Signal

A noisy signal indicates several events in an information structure.


• The dice example: Signal low = {1, 2, 3} and signal high = {4, 5, 6}. Signal low or signal high is noisy with respect to the events “odd” and “even” because neither low nor high unambiguously indicates the event.

A noisy signal is not perfect, but can still be informative.


• Signal low, which indicates the event “odd” in two out of three cases, is more informative than the prior information that “odd” and “even” are equally likely.
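
Counting states (assuming a fair die, so all six states are equally likely) confirms this claim:

from fractions import Fraction

low, odd = {1, 2, 3}, {1, 3, 5}

# Among the three states consistent with signal low, two are odd
pr_odd_given_low = Fraction(len(low & odd), len(low))
print(pr_odd_given_low)  # 2/3, versus a prior Pr(odd) of 1/2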
Homogeneous and Heterogeneous Information

Homogeneous information refers to information structures whose events do not overlap across individuals.

Heterogeneous information refers to information structures whose events overlap across individuals.
Heterogeneous Information

Here the information is heterogeneous because events in the information structures of different individuals overlap.

Alice knows whether the outcome of rolling a die will be “odd” or “even”, while Bob knows whether it will be “low” or “high”.
Asymmetric Information

There is asymmetric
information when one
information structure is finer
than another.

Bob has a finer information structure, i.e., superior information, than Alice. There is thus asymmetric information between Alice and Bob.
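
A minimal Python sketch of these comparisons, assuming for illustration that Bob observes the exact number while Alice only observes odd/even (the helper names are our own):

from itertools import product

# A partition is modelled as a list of disjoint sets covering the state space
alice = [{1, 3, 5}, {2, 4, 6}]          # "odd" / "even"
bob   = [{1}, {2}, {3}, {4}, {5}, {6}]  # exact number (assumed for this example)

def is_finer(p, q):
    # p is finer than q if every event of p lies inside some event of q
    return all(any(e <= f for f in q) for e in p)

def overlaps(p, q):
    # Events overlap if some pair intersects without one containing the other
    return any(e & f and not (e <= f or f <= e) for e, f in product(p, q))

print(is_finer(bob, alice))  # True: Bob's structure is finer -> asymmetric information

low_high = [{1, 2, 3}, {4, 5, 6}]
print(overlaps(alice, low_high))  # True: "odd/even" vs "low/high" -> heterogeneous information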
Bayes’ Rule

Pr(E) · Pr(F | E) = Pr(E ∩ F) = Pr(F) · Pr(E | F)

• E is an event.
• F is a signal related to the event.
• | means “conditional on”.
• Pr(E ∩ F) denotes the probability that one observes event E and signal F.
Bayes’ Rule

Bayes’ rule tells an individual how to update their information after


receiving a signal.

Assume that an individual originally thinks that event E will occur with probability Pr(E), the prior probability. Now a signal F arrives. The prior information then needs to be updated for the information contained in signal F. From the prior probability Pr(E), the likelihood Pr(F | E), and the probability Pr(F) of the signal, the individual can infer the posterior probability of event E, Pr(E | F), by using Bayes’ rule:

Pr(E | F) = Pr(E) · Pr(F | E) / Pr(F)
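
To make the formula concrete, here is a minimal Python sketch using the dice numbers (the probabilities follow from counting equally likely states; the function name is our own):

from fractions import Fraction

def posterior(pr_e, pr_f_given_e, pr_f):
    # Bayes' rule: Pr(E | F) = Pr(E) * Pr(F | E) / Pr(F)
    return pr_e * pr_f_given_e / pr_f

# E = "odd" = {1, 3, 5}; F = signal low = {1, 2, 3}
pr_odd     = Fraction(1, 2)  # prior Pr(E)
pr_low_odd = Fraction(2, 3)  # Pr(F | E): states 1 and 3 out of {1, 3, 5}
pr_low     = Fraction(1, 2)  # Pr(F)

print(posterior(pr_odd, pr_low_odd, pr_low))  # 2/3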
Bayes’ Rule

The fraction Pr(F | E) / Pr(F) can be interpreted as a correction factor affecting the prior probability Pr(E) after observing signal F.

• It exceeds 1 if Pr(F | E) is greater than Pr(F), that is, if signal F is closely related to event E.

• It falls below 1 if the occurrence of signal F hints at a low probability of event E.
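
In the dice example, Pr(low | odd) = 2/3 and Pr(low) = 1/2, so the correction factor is (2/3) / (1/2) = 4/3 > 1: observing signal low raises the probability of “odd” from the prior 1/2 to the posterior 1/2 × 4/3 = 2/3.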


Information Quality
Fineness

One criterion to rank information quality is fineness.

An information structure is said to be more informative than another if it has a finer splitting of the state space.

Alice’s information structure is less informative than Bob’s slightly finer information structure.
Fineness

Unfortunately, the fineness criterion does not make it possible to rank all information structures.

The fineness criterion applies to asymmetric information, but not to heterogeneous information.

Neither of the two information structures is necessarily more informative. Although the splitting of the right information structure looks finer (it has more subsets), not all of its individual subsets are finer than those of the left information structure.
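
Using the is_finer helper from the earlier sketch, one can check that a structure with more subsets need not be finer (the partitions here are chosen for illustration):

def is_finer(p, q):  # as in the earlier sketch
    return all(any(e <= f for f in q) for e in p)

p = [{1, 3, 5}, {2, 4, 6}]    # "odd" / "even"
q = [{1, 2}, {3, 4}, {5, 6}]  # more subsets, but not a refinement of p

# Neither structure refines the other, so fineness cannot rank them
print(is_finer(p, q), is_finer(q, p))  # False False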
Precision of a Signal

Another criterion to rank information quality is the precision of a signal.

The precision, q, is equal to the probability that an event leads to the respective signal.

• In the dice example, a signal low (high) indicating the event “odd” (“even”) has a precision of 2/3, i.e., q = 2/3.
• In terms of Bayes’ rule, the precision of signal F in event E is Pr(F = E | E).

Unfortunately, the precision of a signal only makes sense in special cases like binary signals. Moreover, not all binary signals have a defined precision.
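
A minimal sketch of the precision computation for the dice example (uniform states assumed):

from fractions import Fraction

odd, low = {1, 3, 5}, {1, 2, 3}

# Precision of signal low for event "odd": q = Pr(F = low | odd)
q = Fraction(len(odd & low), len(odd))
print(q)  # 2/3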
