
“Information is received when
uncertainty is removed.”
- Claude E. Shannon, 1948

https://www.itsoc.org/about/shannon
https://www.bell-labs.com/claude-shannon/
“One cannot perceive something
about which he is perfectly certain.”
- Kenneth H. Norwich, 1978
Professor Emeritus, University of Toronto

http://www.biopsychology.org/norwich/isp/isp.htm
Information Theory
Information gained ~ Uncertainty (Entropy) reduced

Entropy
H = log2 N, where N = number of possible equally likely outcomes
HS = Stimulus Information Content
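
As a quick numeric check, here is a minimal Python sketch of H = log2 N (the function name is ours, chosen for illustration):

import math

def entropy_equally_likely(n: int) -> float:
    # Information content, in bits, of one outcome among n equally likely outcomes.
    return math.log2(n)

print(entropy_equally_likely(8))  # a fair 8-sided die carries 3.0 bits per outcome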
All Outcomes Not Equally Likely
H = log2 (1/Pi), where Pi = probability of outcome i
Maximum Total Information occurs when all outcomes are equally likely
Average Information
Hav = Σ (i=1 to n) Pi log2(1/Pi)

e.g.: Event Probabilities: P(A) = .5, P(B) = .25, P(C) = .125, P(D) = .125

Hav = .5 log2(2) + .25 log2(4) + .125 log2(8) + .125 log2(8)

Hav = .5 + .5 + .375 + .375 = 1.75 bits
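
The same arithmetic in a short Python sketch (the function name is ours); it reproduces the 1.75-bit result above:

import math

def average_information(probs):
    # Hav = sum over outcomes of Pi * log2(1/Pi), in bits.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(average_information([0.5, 0.25, 0.125, 0.125]))  # 1.75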


Redundancy = 1 - Hav/Hmax, where Hmax = log2 N
e.g.: Hmax = log2 4 = 2 bits
Redundancy = 1 - (1.75/2) = 1 - .875 = 12.5%
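
Extending the sketch above to compute redundancy for the same four-outcome example (again, names are illustrative):

import math

def redundancy(probs):
    # Redundancy = 1 - Hav/Hmax, with Hmax = log2 N for N outcomes.
    h_av = sum(p * math.log2(1 / p) for p in probs if p > 0)
    h_max = math.log2(len(probs))
    return 1 - h_av / h_max

print(redundancy([0.5, 0.25, 0.125, 0.125]))  # 0.125, i.e. 12.5%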
R-d-nda-cy -s imp-rt-nt in a n-is- ch-n-el
Information Transmitted

[Diagram: stimulus information HS passes through a noisy channel to the receiver as HR; HLOSS is the information lost to noise]

If HS ≠ HR → Equivocation (HSR)
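
To make HS, HR, and the equivocation concrete, here is a sketch with an assumed joint distribution for a noisy binary channel (the 10% flip rate is invented for illustration); it uses the identity H(S|R) = H(S,R) - H(R):

import math

def H(probs):
    # Entropy in bits of a probability distribution.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Assumed joint distribution P(stimulus, response): noise flips the response 10% of the time.
joint = {('s0', 'r0'): 0.45, ('s0', 'r1'): 0.05,
         ('s1', 'r0'): 0.05, ('s1', 'r1'): 0.45}

p_s, p_r = {}, {}
for (s, r), p in joint.items():
    p_s[s] = p_s.get(s, 0) + p  # marginal over stimuli
    p_r[r] = p_r.get(r, 0) + p  # marginal over responses

HS = H(p_s.values())            # stimulus information
HR = H(p_r.values())            # received information
HSR = H(joint.values()) - HR    # equivocation H(S|R)
print(HS, HR, HSR, HS - HSR)    # transmitted information = HS - HSR (~0.53 bits)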
The Magical Number 7 ± 2
Channel Capacity of Sensory Modalities
Auditory Signal Sensitivity

Importance of Redundancy and Context:


Alpha Bravo Charlie Delta Echo Foxtrot…

https://www.liveatc.net/
Redundancy
"Aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it
dseno't mtaetr in waht oerdr the ltteres in a wrod are, the
olny iproamtnt tihng is taht the frsit and lsat ltteer be in the
rghit pclae. Tihs is bcuseae the huamn mnid deos not raed
ervey lteter by istlef, but the wrod as a wlohe."
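
The effect is easy to reproduce: a small Python sketch that shuffles only the interior letters of each word (punctuation handling is omitted for brevity):

import random

def scramble_interior(word):
    # Shuffle every letter except the first and last.
    if len(word) <= 3:
        return word
    inner = list(word[1:-1])
    random.shuffle(inner)
    return word[0] + ''.join(inner) + word[-1]

print(' '.join(scramble_interior(w) for w in
               "the human mind does not read every letter by itself".split()))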

All Outcomes Not Equally Likely
H = log2 (1/Pi), where Pi = probability of outcome i
With context: H = log2 (1/Pi|X), where Pi|X = probability of outcome i given X (the context)

Maximum information occurs when all outcomes are equally likely
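
A toy illustration of how context shrinks surprisal; the probabilities below are assumed, round illustrative values, not measured English-letter statistics:

import math

p_u = 0.028          # assumed unconditional probability of the letter 'u'
p_u_given_q = 0.98   # assumed probability of 'u' when the previous letter is 'q'

print(math.log2(1 / p_u))          # ~5.2 bits of surprise without context
print(math.log2(1 / p_u_given_q))  # ~0.03 bits once the 'q' context is known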


Sound Cue Salience

R. W. Bailey, Human Performance Engineering, 3rd ed., Prentice Hall, 1996

Importance of Redundancy and Context
