ECNG 6703 - Principles of Communications

Introduction to Information Theory - Channel Coding: Part I

Sean Rocke

September 30th, 2013


Outline

1. Channel Coding Preliminaries
2. Channel Models
3. Channel Capacity
4. Conclusion

Channel Coding Preliminaries - Channel Coding in Context

Consider the following digital comms examples:
- Taking & posting a narcissistic picture of you ziplining in Chaguaramas ... on your Facebook profile
- A GSM phone conversation
- Sending instrumentation data to a control system for a manufacturing plant
- Downloading a legal copy of an e-book from amazon.com
- Live transmission of a Machel Montano concert over the Internet

Source Coding: Representing the information to be transmitted with as few bits as possible (Data Compression). Last lecture.
Channel Coding: How do we ensure that the compressed data is transmitted reliably on a possibly unreliable channel? (Error Detection & Correction). This lecture.

Channel Coding Preliminaries - Recall: Elements of a Digital Communications System

- Information source and input transducer
- Source encoder
- Channel encoder
- Digital modulator
- Channel
- Digital demodulator
- Channel decoder
- Source decoder
- Output transducer

Elements not specifically included in the illustration:
- Carrier and symbol synchronization
- A/D interface
- Channel interfaces (e.g., RF front end (RFFE), fiber optic front end (FOFE), BAN front end (BANFE), ...)

Channel Models - Channel Coding Defined

Channel encoding: To introduce, in a controlled manner, some redundancy in the binary information sequence, which can be used at the receiver to overcome the effects of noise & interference encountered during signal transmission through the channel.

Channel coding challenge: How can the source output be transmitted across an unreliable channel and reliably received, with as little power, bandwidth, and implementation complexity as possible?

Key performance metrics: coding rate, bit error rate, power efficiency, bandwidth efficiency, implementation complexity.

To answer the above, it is essential to model the channel.

Channel Models

A general communication channel is described in terms of:
1. Input alphabet: Set of possible inputs, X = {x1, ..., xm}
2. Output alphabet: Set of possible outputs, Y = {y1, ..., yn}
3. Transition probabilities: Conditional probability for each possible input-to-output mapping, P(Y = yj | X = xi)

Note: For hard decoding, |Y| = |X|; for soft decoding, |Y| > |X| (the detector output is more finely quantized).

Memoryless channel: For a length-n input sequence x = (x[1], ..., x[n]) and length-n output sequence y = (y[1], ..., y[n]), the output at time i depends only upon the input at time i, i.e., P(y|x) = prod_{i=1}^{n} P(y[i] | x[i]) for all n.

Channel Models - Binary Symmetric Channel (BSC)

Binary Symmetric Channel (BSC):
1. Input alphabet: X = {0, 1}
2. Output alphabet: Y = {0, 1}
3. Transition probability matrix:

       P(Y|X) = [ P(Y=0|X=0)  P(Y=1|X=0) ]  =  [ 1-p   p  ]
                [ P(Y=0|X=1)  P(Y=1|X=1) ]     [  p   1-p ]

4. p: the average probability of bit error in the transmitted sequence (i.e., due to channel noise and other disturbances)
5. The channel is obviously memoryless!
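
A minimal MATLAB sketch (not from the slides; the crossover probability and block length are assumed values) that passes random bits through a BSC and estimates the bit error rate:

    % Simulate a binary symmetric channel (BSC) and estimate its bit error rate.
    p     = 0.1;                     % assumed crossover probability
    nBits = 1e5;                     % assumed number of transmitted bits
    x     = randi([0 1], 1, nBits);  % equiprobable input bits
    flips = rand(1, nBits) < p;      % each bit is flipped independently with probability p
    y     = xor(x, flips);           % channel output
    ber   = mean(y ~= x);            % empirical bit error rate, close to p for large nBits
    fprintf('Empirical BER = %.4f (p = %.2f)\n', ber, p);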

Channel Models - Discrete Memoryless Channel (DMC)

Discrete Memoryless Channel (DMC):
1. Input alphabet: X = {x0, ..., xm} (discrete alphabet)
2. Output alphabet: Y = {y0, ..., yn} (discrete alphabet)
3. Transition probability matrix:

       P(Y|X) = [ P(Y=y0|X=x0)  ...  P(Y=yn|X=x0) ]
                [      ...      ...       ...     ]
                [ P(Y=y0|X=xm)  ...  P(Y=yn|X=xm) ]
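
As an illustration (the 2x3 matrix and the equiprobable input below are assumptions, not from the slides), DMC outputs can be drawn row by row from the transition matrix:

    % Draw outputs from a discrete memoryless channel defined by its matrix P(Y|X).
    P = [0.5 0.3 0.2;                % assumed example: row i is P(Y = y_j | X = x_i)
         0.1 0.2 0.7];
    nUses = 1e5;                     % number of channel uses
    x   = randi(size(P, 1), 1, nUses);   % equiprobable input symbol indices 1..m
    cdf = cumsum(P, 2);                  % per-row CDFs over the output alphabet
    cdf(:, end) = 1;                     % guard against floating-point rounding
    y = zeros(1, nUses);
    for k = 1:nUses
        y(k) = find(rand <= cdf(x(k), :), 1, 'first');   % sample Y from row x(k)
    end
    % The empirical transition frequencies should approach the rows of P.
    Phat = zeros(size(P));
    for i = 1:size(P, 1)
        for j = 1:size(P, 2)
            Phat(i, j) = mean(y(x == i) == j);   % estimate of P(Y = y_j | X = x_i)
        end
    end
    disp(Phat)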

Channel Models - Channel Examples

Example: Sketch the channels and discuss the relationships between the input and output for the following channels:

1. Lossless channel:

       P(Y|X) = [ 3/4  1/4   0    0   0 ]
                [  0    0   2/3  1/3  0 ]
                [  0    0    0    0   1 ]

2. Deterministic channel:

       P(Y|X) = [ 1 0 0 ]
                [ 1 0 0 ]
                [ 0 1 0 ]
                [ 0 1 0 ]
                [ 0 0 1 ]

3. Noiseless channel:

       P(Y|X) = [ 1 0 0 0 ]
                [ 0 1 0 0 ]
                [ 0 0 1 0 ]
                [ 0 0 0 1 ]
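
A short MATLAB check (illustrative, not from the slides) of the structure behind these labels: in a lossless channel each output column of P(Y|X) has at most one nonzero entry, so the output identifies the input; in a deterministic channel each row contains a single 1, so the input fixes the output; a noiseless channel satisfies both:

    % Structural checks for the lossless, deterministic and noiseless examples.
    P_lossless  = [3/4 1/4 0   0   0;
                   0   0   2/3 1/3 0;
                   0   0   0   0   1];
    P_determ    = [1 0 0; 1 0 0; 0 1 0; 0 1 0; 0 0 1];
    P_noiseless = eye(4);

    is_lossless      = @(P) all(sum(P > 0, 1) <= 1);   % each column: at most one nonzero
    is_deterministic = @(P) all(sum(P == 1, 2) == 1);  % each row: exactly one entry equal to 1
    fprintf('Lossless example:      lossless=%d, deterministic=%d\n', ...
            is_lossless(P_lossless), is_deterministic(P_lossless));
    fprintf('Deterministic example: lossless=%d, deterministic=%d\n', ...
            is_lossless(P_determ), is_deterministic(P_determ));
    fprintf('Noiseless example:     lossless=%d, deterministic=%d\n', ...
            is_lossless(P_noiseless), is_deterministic(P_noiseless));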

Channel Models - Discrete-Input, Continuous-Output Channel

Discrete-Input, Continuous-Output Channel:
1. Input: X = {x0, ..., xm} (discrete alphabet)
2. Output: Y = R (unquantized/continuous detector output)
3. Transition probabilities: P(Y = y | X = xi) = f_{Y|X}(y, xi), xi ∈ X, y ∈ R

Example: AWGN channel - Y = X + N, where N is a zero-mean Gaussian RV with variance σ² (i.e., N ∼ N(0, σ²)), and

    f_{Y|X}(y, xi) = (1/√(2πσ²)) e^{-(y - xi)² / (2σ²)}
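
A minimal MATLAB sketch of this channel (the antipodal alphabet X = {-1, +1}, the value of σ, and the ML detection step are illustrative assumptions, not from the slides):

    % Discrete-input, continuous-output AWGN channel: Y = X + N, N ~ N(0, sigma^2).
    X     = [-1 +1];                 % assumed antipodal input alphabet
    sigma = 0.5;                     % assumed noise standard deviation
    nSym  = 10;                      % a few channel uses for display
    x     = X(randi(numel(X), 1, nSym));    % random input symbols
    y     = x + sigma * randn(1, nSym);     % continuous-valued channel output
    % Conditional pdf f_{Y|X}(y, xi) = exp(-(y - xi)^2/(2 sigma^2)) / sqrt(2 pi sigma^2)
    f = @(y, xi) exp(-(y - xi).^2 / (2*sigma^2)) / sqrt(2*pi*sigma^2);
    likelihoods = [f(y, X(1)); f(y, X(2))]; % one row per candidate input symbol
    [~, idx] = max(likelihoods);            % maximum-likelihood decision per sample
    xhat = X(idx);
    fprintf('Symbol error rate on this short run: %.2f\n', mean(xhat ~= x));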

Channel Models - Other Channel Models

Discrete-Time AWGN Channel:
1. Input: X = R (unquantized/continuous-valued input)
2. Output: Y = R (unquantized/continuous-valued detector output)
3. At time instant i, y[i] = x[i] + n[i], where n[i] ∼ N(0, σ²)
4. Transition probabilities: P(Y = y[i] | X = x[i]) = f_{Y|X}(y[i], x[i]), x[i], y[i] ∈ R
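
A small sketch of the discrete-time model (the input waveform and the noise variance are arbitrary assumptions):

    % Discrete-time AWGN channel: y[i] = x[i] + n[i], n[i] ~ N(0, sigma2).
    sigma2 = 0.25;                          % assumed noise variance
    nSamp  = 1e5;                           % number of samples
    x = sqrt(2) * cos(0.1*pi*(1:nSamp));    % an arbitrary continuous-valued input
    n = sqrt(sigma2) * randn(1, nSamp);     % white Gaussian noise samples
    y = x + n;
    fprintf('Empirical noise variance: %.4f (target %.2f)\n', var(y - x), sigma2);
    fprintf('Empirical SNR: %.2f dB\n', 10*log10(mean(x.^2) / var(y - x)));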

Channel Models - Other Channel Models

AWGN Waveform Channel:
1. Input: x(t) (continuous-time, unquantized/continuous-valued input)
2. Output: y(t) (continuous-time, unquantized/continuous detector output)
3. Continuous-time interpretation: At time t, y(t) = x(t) + n(t), where n(t) is a sample function of the AWGN process with power spectral density N0/2
4. Transition probabilities: P(Y = y[i] | X = x[i]) = f_{Y|X}(y[i], x[i]), x[i], y[i] ∈ R

Having looked at various channel models: for any given channel model, how much information can the channel reliably convey?

Channel Capacity - Channel Capacity Calculation

Channel capacity: C = max_p I(X; Y) bits/transmission, where the maximization is performed over all PMFs of the form p = (p1, p2, ..., p|X|) on the input alphabet X.

Questions: Sketch the channels and determine if equiprobable input symbols maximize the information rate through the channel.

1.     P(Y|X) = [ 0.6  0.3  0.1 ]
                [ 0.1  0.3  0.6 ]

2.     P(Y|X) = [ 0.6  0.3  0.1 ]
                [ 0.3  0.6  0.1 ]
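
The question can be checked numerically. The sketch below is assumed helper code (not part of the slides); it evaluates I(X;Y) over all binary input PMFs p = (q, 1 - q) for the first matrix, and the same loop can be rerun with the second matrix:

    % Numerical check: does an equiprobable input maximize I(X;Y) for this DMC?
    P = [0.6 0.3 0.1;             % transition matrix as reconstructed above (rows: inputs)
         0.1 0.3 0.6];
    q = 0:0.001:1;                % candidate input PMFs p = (q, 1-q)
    I = zeros(size(q));
    for k = 1:numel(q)
        p   = [q(k); 1 - q(k)];   % column vector of input probabilities
        Pxy = p .* P;             % joint PMF P(X = x_i, Y = y_j)
        Py  = p' * P;             % output PMF P(Y = y_j)
        T   = Pxy .* log2(P ./ repmat(Py, 2, 1));
        T(Pxy == 0) = 0;          % zero-probability terms contribute nothing
        I(k) = sum(T(:));         % I(X;Y) in bits per channel use
    end
    [Imax, kmax] = max(I);
    [~, k05]     = min(abs(q - 0.5));
    fprintf('max I(X;Y) = %.4f bits at P(x1) = %.3f\n', Imax, q(kmax));
    fprintf('I(X;Y) at P(x1) = 0.5: %.4f bits\n', I(k05));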

Channel Capacity - Channel Capacity Limits

Shannon's 2nd Theorem - The Noisy Channel Coding Theorem:
Reliable communication over a DMC is possible if the communication rate R satisfies R < C, where C is the channel capacity. At rates higher than capacity, reliable communication is impossible.

- Shannon's noisy channel coding theorem indicates the maximum achievable rate for reliable communication over a DMC
- This is a yardstick to measure the performance of communications systems
- However, the theorem does not indicate how to achieve this rate
- For reliable communication we must have R < C, where C = W log2(1 + SNR) (Shannon-Hartley limit)
- For a band-limited AWGN channel, R < W log2(1 + P/(N0 W))
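
A quick numerical illustration of the Shannon-Hartley limit (the W, N0, and P values below are assumed, chosen to give a 0 dB SNR):

    % Shannon-Hartley capacity of a band-limited AWGN channel.
    W   = 1e6;                    % bandwidth in Hz (assumed)
    N0  = 1e-9;                   % one-sided noise PSD in W/Hz (assumed)
    P   = 1e-3;                   % received signal power in W (assumed)
    SNR = P / (N0 * W);
    C   = W * log2(1 + SNR);      % bits per second
    fprintf('SNR = %.1f dB, C = %.2f Mbit/s\n', 10*log10(SNR), C/1e6);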

Channel Capacity

Example: Sketch the channel defined by:

    P(Y|X) = [ 0.5  0.5   0    0  ]
             [  0   0.5  0.5   0  ]
             [  0    0   0.5  0.5 ]
             [ 0.5   0    0   0.5 ]

1. What type of channel is this?
2. Is the channel capacity achieved with a uniform input PMF (i.e., pX(0) = pX(1) = pX(2) = pX(3))?
3. What happens if the input PMF is changed to pX(0) = pX(2) = 0.5, pX(1) = pX(3) = 0?
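
The same mutual-information computation can answer questions 2 and 3 numerically (a self-contained sketch; MATLAB indices 1..4 stand for the inputs x0..x3):

    % I(X;Y) for the 4-input/4-output channel above under two input PMFs.
    P = [0.5 0.5 0   0  ;
         0   0.5 0.5 0  ;
         0   0   0.5 0.5;
         0.5 0   0   0.5];
    pmfs = [0.25 0.25 0.25 0.25;  % uniform input
            0.5  0    0.5  0   ]; % pX(0) = pX(2) = 0.5, pX(1) = pX(3) = 0
    for r = 1:2
        p   = pmfs(r, :)';
        Pxy = p .* P;             % joint PMF
        Py  = p' * P;             % output PMF
        T   = Pxy .* log2(P ./ repmat(Py, 4, 1));
        T(Pxy == 0) = 0;          % skip zero-probability terms
        fprintf('Input PMF %d: I(X;Y) = %.4f bits/use\n', r, sum(T(:)));
    end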

Channel Capacity

Questions: For a band-limited AWGN channel, R < W log2(1 + P/(N0 W)).
1. Can the channel capacity be increased indefinitely by increasing the transmit power (i.e., as P → ∞)?
2. Can the channel capacity be increased indefinitely by increasing the bandwidth (i.e., as W → ∞)?
3. What is the fundamental relation between bandwidth and power efficiency of a communications system?

To answer the above, note the following:
1. Energy per bit: εb = P Ts / log2 M = P / R
2. Solve for εb/N0
3. Observe what happens when r = R/W → 0
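
A numerical look at question 3 (an illustrative sketch, not from the slides): on the capacity boundary R = W log2(1 + P/(N0 W)), substituting εb = P/R and r = R/W gives εb/N0 = (2^r - 1)/r, which falls toward ln 2 ≈ -1.59 dB as r → 0:

    % Minimum Eb/N0 on the capacity boundary vs spectral efficiency r = R/W.
    r = [4 2 1 0.5 0.1 0.01 0.001];          % bits/s/Hz (arbitrary sample points)
    ebno_min = (2.^r - 1) ./ r;              % from r = log2(1 + (Eb/N0)*r)
    for k = 1:numel(r)
        fprintf('r = %6.3f bits/s/Hz -> min Eb/N0 = %6.2f dB\n', ...
                r(k), 10*log10(ebno_min(k)));
    end
    fprintf('Limit as r -> 0: 10*log10(ln 2) = %.2f dB\n', 10*log10(log(2)));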

Channel Capacity - Bandwidth vs Power Efficiency

[Figure: bandwidth vs power efficiency trade-off.]

Channel Capacity - Channel Coding Intuition

Repetition coding example: Consider a BSC with p = 0.1.
1. Determine the probability of a bit error.
2. Assume a repetition code is used, where the bit is repeated 3 times and a majority vote is used at the receiver to decide what was transmitted. What is the probability in this case?
3. What happens for a repetition code where each bit is repeated n times?
4. What is the impact on the rate due to repetition coding use?
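
A short MATLAB check of these questions, using the binomial expression for a majority-vote error (at least ⌈(n+1)/2⌉ of the n repeated bits flipped); p = 0.1 as in the example, and the odd repetition lengths below are arbitrary:

    % Error probability of an n-fold repetition code over a BSC(p) with majority voting.
    p = 0.1;
    for n = [1 3 5 7 9]
        k  = ceil((n+1)/2):n;    % numbers of flipped copies that cause a decision error
        Pe = sum(arrayfun(@(kk) nchoosek(n, kk), k) .* p.^k .* (1-p).^(n-k));
        fprintf('n = %d: P(bit error) = %.5f, code rate = 1/%d\n', n, Pe, n);
    end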

Channel Capacity - Error Control Mechanisms

Error Control
- Stop & Wait ARQ
- Continuous ARQ
  - Go-Back-N
  - Selective Repeat
- FECC
  - Block codes
    - Non-linear
    - Group (Linear)
      - Non-cyclic
      - Polynomially generated (cyclic)
        - Golay
        - BCH
          - Binary BCH
            - Hamming (e = 1)
            - e > 1
          - Reed-Solomon
  - Convolutional codes

Channel Capacity - MATLAB Nuggets

Let's take a look at Simulink...

Conclusion

We covered:
- Channel models
- Channel capacity
- Shannon's noisy channel coding theorem
- Channel coding intuition
- Introducing Simulink

Your goals for next class:
- Continue ramping up your MATLAB & Simulink skills
- Review the channel coding handout on the course site
- Complete HW 3 for submission next week
- Complete the At-Home Exercise for submission next week

Q&A

Thank You

Questions?