Lecture 1: Introduction to Information Theory

Textbooks:
• Raymond Yeung, Information Theory and Network Coding, Springer, 2009, ISBN 978-1441946300
• T. Cover and J. Thomas, Elements of Information Theory, 2nd edition, Wiley-Interscience, 2006, ISBN 978-0471241959
• What is the course about?
• Introduction to probability
• Memoryless and memory (stochastic) sources

Key questions:
• What is information? How can information be quantified?
• What is the fundamental limit of the data transfer rate?

Information Theory

Shannon's information theory deals with the limits on data compression (source coding) and reliable data transmission (channel coding):
– How much can data be compressed?
– How fast can data be reliably transmitted over a noisy channel?

Two basic "point-to-point" communication theorems (Shannon 1948):
– Source coding theorem: the minimum rate at which data can be compressed losslessly is the entropy rate of the source.
– Channel coding theorem: the maximum rate at which data can be reliably transmitted is the capacity of the channel.

Introduction to probability

Stochastic sources
Example 1: A text is a sequence of symbols, each taking its value from the alphabet A = {a, …, z, A, …, Z, 1, 2, …, 9, !, ?, …}.

Example 2: A (digitized) grayscale image is a sequence of symbols, each taking its value from the alphabet A = {0, 1} or A = {0, …, 255}.
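As a concrete illustration of the source coding theorem (not part of the lecture itself), the sketch below estimates the entropy of a memoryless source from the empirical symbol frequencies of a sequence. For such a source, the entropy in bits per symbol is the minimum rate of lossless compression; the function name `entropy_bits` and the example sequences are illustrative choices, not notation from the course.

```python
# Minimal sketch: empirical entropy of a memoryless (i.i.d.) source,
# H = -sum_x p(x) * log2 p(x), estimated from symbol counts.
from collections import Counter
from math import log2

def entropy_bits(symbols):
    """Empirical entropy (bits/symbol) of a sequence over any alphabet."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fair binary source: 1 bit/symbol is the lossless compression limit.
print(entropy_bits("01" * 500))  # → 1.0

# A biased binary source (p = 0.9, 0.1) carries less information per
# symbol, so it can be compressed below 1 bit/symbol on average.
print(round(entropy_bits("0" * 900 + "1" * 100), 3))  # → 0.469
```

The biased example shows why entropy, not alphabet size, sets the limit: both sources use the two-symbol alphabet {0, 1}, but the skewed one is more predictable and therefore more compressible.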