Information Theory: Lecture 1

Introduction to information theory


Information theory is the science of operations on data such as compression, storage, and
communication. Most scientists agree that information theory began in 1948 with the
publication of Claude Shannon's famous article "A Mathematical Theory of Communication".
In that paper, he provided answers to the following questions:
- What is “information” and how to measure it?
- What are the fundamental limits on the storage and the transmission of information?
The answers were both satisfying in their ability to reduce complicated problems to simple
analytical forms. Since then, information theory has guided the design of devices that reach
or approach these limits.
Here are two examples which illustrate the results obtained with information theory methods
when storing or transmitting information.
Facsimile transmission:
The page to be transmitted consists of dots represented by binary digits ("1" for a black dot
and "0" for a white dot). Its dimensions are 8.5 × 11 inches, and the resolution is 200 dots
per inch. Consequently, the number of binary digits needed to represent this page is
(8.5 × 200) × (11 × 200) = 3,740,000, i.e. 3.74 Mbits.
With a modem at the rate of 14.4 kbps, the transmission of the page takes 4 minutes and 20
seconds.
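
The arithmetic behind these figures can be checked directly; here is a minimal sketch in
Python, using only the page dimensions, resolution, and modem rate quoted above:

    # Fax page: 8.5 x 11 inches scanned at 200 dots per inch, one bit per dot.
    width_in, height_in, dpi = 8.5, 11, 200
    bits = (width_in * dpi) * (height_in * dpi)
    print(f"{bits:.0f} bits = {bits / 1e6:.2f} Mbits")      # 3740000 bits = 3.74 Mbits

    # Uncompressed transmission over a 14.4 kbps modem.
    seconds = bits / 14_400
    print(f"{seconds // 60:.0f} min {seconds % 60:.0f} s")  # 4 min 20 s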
Using coding techniques such as run-length coding or Huffman coding, the transmission time
can be reduced to about 17 seconds.
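
To see why run-length coding helps on a fax page, note that scan lines consist mostly of long
runs of identical dots (usually white). Below is a minimal run-length encoder in Python, given
as an illustration rather than as the actual facsimile standard:

    from itertools import groupby

    def run_length_encode(bits: str) -> list[tuple[str, int]]:
        # Replace each run of identical symbols by a (symbol, run length) pair.
        return [(symbol, len(list(group))) for symbol, group in groupby(bits)]

    # A typical fax scan line: long white runs broken by a few black dots.
    line = "0" * 180 + "1" * 6 + "0" * 14
    print(run_length_encode(line))   # [('0', 180), ('1', 6), ('0', 14)]

Storing three (symbol, length) pairs takes far fewer bits than the 200 original dots, which is
the effect exploited above.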
MP3 audio files:
MP3 stands for MPEG (Moving Picture Experts Group) Audio Layer 3. It is a standard for
compressed audio files based on psychoacoustic models of human hearing: it reduces the
amount of information needed to represent an audio signal in such a way that the human ear
can hardly distinguish between the original sound and the coded sound.

Let us consider a stereo analog music signal. For CD quality, the left and right channels are
each sampled at 44.1 kHz, and the samples are quantized to 16 bits. One second of stereo
music in CD format therefore generates 2 × 44,100 × 16 = 1,411,200 bits ≈ 1.411 Mbits.
By using the MP3 encoding algorithm, this rate drops to 128 kbits per second without
perceptible loss of sound quality. As a result, one minute of stereo music requires about
1 Mbyte.
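
These figures follow directly from the sampling parameters; a quick check in Python:

    # CD audio: 2 channels, 44,100 samples per second, 16 bits per sample.
    cd_bits_per_second = 2 * 44_100 * 16
    print(cd_bits_per_second)            # 1411200 bits/s, about 1.411 Mbits/s

    # MP3 at 128 kbits/s: bytes needed for one minute of stereo music.
    mp3_bytes_per_minute = 128_000 * 60 // 8
    print(mp3_bytes_per_minute)          # 960000 bytes, about 1 Mbyte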

The Shannon paradigm
Transmitting a message from a transmitter to a receiver can be sketched as the following
chain:

Source → Source encoder → Channel encoder → Channel → Channel decoder → Source decoder → Destination

This model, known as the "Shannon paradigm", is general and applies to a great variety of
situations.
- An information source is a device that randomly delivers symbols from an alphabet.
As an example, a PC (Personal Computer) connected to the Internet is an information source
that produces binary digits from the binary alphabet {0, 1}.
- A channel is a system that links a transmitter to a receiver. It includes signaling
equipment and a pair of copper wires, a coaxial cable, or an optical fiber, among other
possibilities. Given a received output symbol, you cannot be sure which input symbol
has been sent, due to the presence of random ambient noise and the imperfections of
the signaling process.
- A source encoder allows one to represent the source data more compactly by
eliminating redundancy: its aim is to reduce the data rate.
- A channel encoder adds redundancy to protect the transmitted signal against
transmission errors.
- Source and channel decoders perform the converse operations of the source and channel
encoders, respectively.
There is a duality between "source coding" and "channel coding": the former tends to reduce
the data rate while the latter raises it, as the sketch below illustrates.
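
To make this duality concrete, here is a minimal Python sketch of channel coding (an
illustrative triple-repetition code, chosen for simplicity rather than taken from the text):
the encoder triples the data rate, and the decoder uses that redundancy to correct a single
flipped bit in each group.

    def channel_encode(bits: str) -> str:
        # Repetition code: send each bit three times (the data rate is tripled).
        return "".join(b * 3 for b in bits)

    def channel_decode(received: str) -> str:
        # Majority vote over each group of three corrects a single flipped bit.
        groups = [received[i:i + 3] for i in range(0, len(received), 3)]
        return "".join("1" if g.count("1") >= 2 else "0" for g in groups)

    sent = channel_encode("1011")        # '111000111111', three times the data rate
    noisy = sent[:4] + "1" + sent[5:]    # the channel flips one bit
    print(channel_decode(noisy))         # '1011', the message is recovered

A source encoder does the opposite: like the run-length encoder sketched earlier, it removes
redundancy in order to lower the data rate.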
