Channel Capacity
A discrete channel is a system (X, p(y|x), Y) consisting of an input alphabet X, an output alphabet Y, and a transition matrix p(y|x) giving the probability of observing output y when input x is sent. The channel is memoryless if the output distribution depends only on the current input, independently of previous channel inputs and outputs.

The information channel capacity of a discrete memoryless channel is

C = max_{p(x)} I(X;Y),

where the maximum is taken over all possible input distributions p(x).
EXAMPLES OF CHANNEL CAPACITY
Noisy Typewriter
In this channel, each input letter is either received unchanged or transformed into the next letter of the alphabet, each with probability 1/2. With a 26-letter alphabet,

C = max I(X;Y) = max (H(Y) - H(Y|X)) = log 26 - 1 = log 13 bits,

achieved by using only every alternate input symbol, so that each input can be recovered from the output without error.
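A quick numerical check of this value (a minimal Python sketch of my own, not part of the original slides, assuming a 26-letter alphabet and crossover probability 1/2):

    # Capacity of the noisy typewriter with a uniform input distribution.
    import numpy as np

    A = 26
    P = np.zeros((A, A))                     # P[x, y] = p(y | x)
    for x in range(A):
        P[x, x] = 0.5                        # letter received unchanged
        P[x, (x + 1) % A] = 0.5              # or shifted to the next letter

    px = np.full(A, 1.0 / A)                 # uniform input distribution
    py = px @ P                              # output distribution (also uniform)
    HY = -np.sum(py * np.log2(py))
    logP = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    HYgX = -np.sum(px[:, None] * P * logP)   # H(Y|X) = 1 bit
    print(HY - HYgX, np.log2(13))            # both ≈ 3.7004 bits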
Binary Symmetric Channel
The binary symmetric channel (BSC) is shown in the figure below. This is a binary channel in which each input symbol is complemented with probability p and received correctly with probability 1 - p.
Bounding the mutual information,

I(X;Y) = H(Y) - H(Y|X)
       = H(Y) - Σ_x p(x) H(Y | X = x)
       = H(Y) - Σ_x p(x) H(p)
       = H(Y) - H(p)
       ≤ 1 - H(p),

where H(p) is the binary entropy function and the last inequality holds because Y is binary. Equality is achieved when the input distribution is uniform, so the capacity of the BSC is

C = 1 - H(p) bits.
Binary Erasure Channel
The analog of the BSC in which some bits are lost (rather than corrupted) is the binary erasure channel. In this channel, a fraction α of the bits are erased, and the receiver knows which bits have been erased. The binary erasure channel has two inputs and three outputs (0, e, 1), as shown in the figure.
We calculate the capacity of the binary erasure channel as follows:

C = max_{p(x)} I(X;Y)
  = max_{p(x)} (H(Y) - H(Y|X))
  = max_{p(x)} H(Y) - H(α),

since given X, the output is erased with probability α and determined otherwise, so H(Y|X) = H(α).
We might guess that the maximum of H(Y) is log 3, but we cannot achieve this by any choice of input distribution p(x). Let E be the event {Y = e}. Using the expansion

H(Y) = H(Y, E) = H(E) + H(Y | E),

and letting Pr(X = 1) = π, we have

H(Y) = H(α) + (1 - α) H(π).
Hence the capacity is

C = max_{p(x)} H(Y) - H(α)
  = max_π (1 - α) H(π)
  = 1 - α,

achieved by π = 1/2. The interpretation is immediate: a fraction α of the bits are lost in the channel, so we can recover at most a fraction 1 - α of them.
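A short numerical check (Python; my own sketch, not from the slides) evaluates I(X;Y) = (1 - α)H(π) on a grid of input distributions and confirms that the maximum 1 - α is attained at π = 1/2:

    # Numerically maximize I(X;Y) = (1 - alpha) * H(pi) for the erasure channel.
    import math

    def H(p):
        return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

    alpha = 0.3
    grid = [i / 1000 for i in range(1001)]
    best_pi = max(grid, key=lambda pi: (1 - alpha) * H(pi))
    print(best_pi, (1 - alpha) * H(best_pi))   # -> 0.5, 0.7 = 1 - alpha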
Symmetric Channel
Consider a channel with the following transition matrix:

p(y|x) =
  0.3  0.2  0.5
  0.5  0.3  0.2
  0.2  0.5  0.3

with entry p(y|x) in row x, column y. All the rows are permutations of each other, and so are all the columns; such a channel is said to be symmetric. By the theorem below, its capacity is C = log 3 - H(0.5, 0.3, 0.2).
Another example of a symmetric channel is one of the form

Y = X + Z (mod c),

where Z has some distribution on the integers {0, 1, ..., c - 1}, X has the same alphabet as Z, and Z is independent of X.

A channel is symmetric if the rows of the transition matrix p(y|x) are permutations of each other and the columns are permutations of each other. A channel is weakly symmetric if every row of the transition matrix is a permutation of every other row and all the column sums Σ_x p(y|x) are equal.
For a weakly symmetric channel, letting r denote a row of the transition matrix,

I(X;Y) = H(Y) - H(Y|X) = H(Y) - H(r) ≤ log|Y| - H(r),

with equality if the output distribution is uniform. But p(x) = 1/|X| achieves a uniform distribution on Y, as seen from

p(y) = Σ_x p(y|x) p(x) = (1/|X|) Σ_x p(y|x) = c / |X|,

where c is the sum of the entries in one column of the transition matrix (the same for every column by weak symmetry).
Theorem: For a weakly symmetric channel,

C = log|Y| - H(r),

where r is a row of the transition matrix. The capacity is achieved by a uniform distribution on the input alphabet.
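The theorem makes the capacity of the 3x3 example above a one-liner. The sketch below (Python, my own illustration) checks weak symmetry and evaluates C = log|Y| - H(r) for that matrix:

    # Capacity of a weakly symmetric channel: C = log|Y| - H(row).
    import numpy as np

    P = np.array([[0.3, 0.2, 0.5],
                  [0.5, 0.3, 0.2],
                  [0.2, 0.5, 0.3]])            # p(y|x), rows indexed by x

    # Weak symmetry: every row is a permutation of row 0, equal column sums.
    assert all(sorted(row) == sorted(P[0]) for row in P)
    assert np.allclose(P.sum(axis=0), P.sum(axis=0)[0])

    r = P[0]
    C = np.log2(P.shape[1]) + np.sum(r * np.log2(r))   # log|Y| - H(r)
    print(C)   # log2(3) - H(0.5, 0.3, 0.2) ≈ 0.0995 bits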
Properties of Channel Capacity
1. C ≥ 0, since I(X;Y) ≥ 0.
2. C ≤ log|X|, since C = max I(X;Y) ≤ max H(X) = log|X|.
3. C ≤ log|Y|, for the same reason.
4. I(X;Y) is a continuous function of p(x).
5. I(X;Y) is a concave function of p(x). Since I(X;Y) is a concave function over a closed convex set (the probability simplex), a local maximum is a global maximum, and the maximum in the definition of capacity is attained.
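These properties are what make capacity computable in practice: I(X;Y) is concave in p(x) over the simplex, so a simple search finds the global maximum. Below is a minimal sketch (Python, my own, restricted to binary-input channels) that scans p(X = 1) over a grid and checks the bounds 0 ≤ C ≤ min(log|X|, log|Y|):

    # Grid-search capacity of a binary-input channel; this works because
    # I(X;Y) is concave in the input distribution (property 5 above).
    import numpy as np

    def mutual_information(px, P):
        """I(X;Y) in bits for input dist px and transition matrix P[x, y]."""
        py = px @ P
        ratio = P / py                        # p(y|x) / p(y)
        terms = np.where(P > 0, P * np.log2(np.where(P > 0, ratio, 1.0)), 0.0)
        return float(px @ terms.sum(axis=1))

    P = np.array([[0.9, 0.1],                 # an arbitrary binary channel
                  [0.3, 0.7]])
    grid = np.linspace(0.0, 1.0, 1001)
    C = max(mutual_information(np.array([1 - t, t]), P) for t in grid)
    assert 0.0 <= C <= min(np.log2(P.shape[0]), np.log2(P.shape[1]))
    print(f"C ≈ {C:.4f} bits")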
Preview of the Channel Coding Theorem
In the following sections we prove Shannon's second theorem, which gives an operational meaning to the definition of capacity as the number of bits we can transmit reliably over the channel.
The basic idea is that for large block lengths, every channel looks
like the noisy typewriter channel (Figure 7.4) and the channel has a
subset of inputs that produce essentially disjoint sequences at the
output.
For each (typical) input n-sequence, there are approximately 2^{nH(Y|X)} possible Y sequences, all of them equally likely (Figure 7.7). We should ensure that no two X sequences produce the same Y output sequence. Otherwise, we will not be able to decide which X sequence was sent.

The total number of possible (typical) Y sequences is ≈ 2^{nH(Y)}. This set has to be divided into sets of size 2^{nH(Y|X)} corresponding to the different input X sequences. The total number of disjoint sets is less than or equal to 2^{n(H(Y) - H(Y|X))} = 2^{nI(X;Y)}. Hence, we can send at most ≈ 2^{nI(X;Y)} distinguishable sequences of length n.
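To get a feel for these counts, the sketch below (Python; my own numbers, assuming a BSC with p = 0.11 so that I(X;Y) = 1 - H(0.11) ≈ 0.5 bits) works in log2 throughout, since the raw counts overflow for interesting n:

    # Counting typical sets for a BSC(p = 0.11) with uniform input.
    import math

    p, n = 0.11, 1000
    Hp = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # H(Y|X) = H(p)
    HY = 1.0                                              # uniform input -> uniform Y
    I = HY - Hp                                           # I(X;Y) = 1 - H(p)

    print(f"log2 #typical Y sequences       ≈ n*H(Y)   = {n * HY:.0f}")
    print(f"log2 fan-out per input sequence ≈ n*H(Y|X) = {n * Hp:.0f}")
    print(f"log2 #distinguishable inputs    ≈ n*I      = {n * I:.0f}")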
Some Definitions
Suppose a message W, drawn from the index set {1, 2, ..., M}, results in the signal X^n(W) being sent by the transmitter, and that this signal is received as a random sequence Y^n ~ p(y^n | x^n). The receiver then guesses the index W by an appropriate decoding rule Ŵ = g(Y^n); an error occurs if Ŵ is not the same as the index W that was transmitted.
Definition: An (M, n) code for the channel (X, p(y|x), Y) consists of the following:

1. An index set {1, 2, ..., M}.

2. An encoding function X^n : {1, 2, ..., M} → X^n, yielding codewords x^n(1), x^n(2), ..., x^n(M). The set of codewords is called the codebook.
3. A decoding function

g : Y^n → {1, 2, ..., M},

which is a deterministic rule that assigns a guess to each possible received vector.
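As a concrete instance of this definition (a toy example of my own, not from the slides), here is a (2, n) repetition code for the BSC: the index set is {1, 2}, the encoder maps index i to a block of n identical bits, and the decoder g is a majority vote over the received vector:

    # A (2, n) repetition code for the BSC: encoder, channel, and decoder g.
    import random

    def encode(i: int, n: int) -> list:
        """Codeword x^n(i): all-zeros for index 1, all-ones for index 2."""
        return [i - 1] * n

    def bsc(x: list, p: float) -> list:
        """Flip each bit independently with probability p."""
        return [b ^ (random.random() < p) for b in x]

    def g(y: list) -> int:
        """Decoding function g: Y^n -> {1, 2} by majority vote."""
        return 2 if sum(y) > len(y) / 2 else 1

    random.seed(0)
    n, p, trials = 5, 0.2, 10_000
    errors = sum(g(bsc(encode(w, n), p)) != w
                 for w in (random.choice((1, 2)) for _ in range(trials)))
    print(f"empirical error probability ≈ {errors / trials:.4f}")  # ≈ 0.058

Note that the rate of this code is R = (log 2)/n = 1/n, which tends to zero as n grows; the channel coding theorem below shows we can do far better, keeping R close to C while the error probability still vanishes.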
Conditional Probability of Error: Let

λ_i = Pr(g(Y^n) ≠ i | X^n = x^n(i)) = Σ_{y^n} p(y^n | x^n(i)) I(g(y^n) ≠ i)

be the conditional probability of error given that index i was sent, where I(·) is the indicator function. The maximal probability of error λ^{(n)} for an (M, n) code is

λ^{(n)} = max_{i ∈ {1, ..., M}} λ_i.
Average Probability of Error: The (arithmetic) average probability of error P_e^{(n)} for an (M, n) code is defined as

P_e^{(n)} = (1/M) Σ_{i=1}^{M} λ_i.
Rate: The rate R of an (M, n) code is

R = (log M) / n  bits per transmission.

A rate R is said to be achievable if there exists a sequence of (⌈2^{nR}⌉, n) codes such that the maximal probability of error λ^{(n)} tends to 0 as n → ∞.
Capacity: The capacity of a channel is the supremum of all achievable
rates.
Thus, rates less than capacity yield arbitrarily small probability of error
for sufficiently large block lengths.
Jointly Typical Sequences
Definition: The set A_ε^{(n)} of jointly typical sequences {(x^n, y^n)} with respect to the distribution p(x, y) is the set of n-sequences with empirical entropies ε-close to the true entropies:

A_ε^{(n)} = { (x^n, y^n) ∈ X^n × Y^n :
    | -(1/n) log p(x^n) - H(X) | < ε,
    | -(1/n) log p(y^n) - H(Y) | < ε,
    | -(1/n) log p(x^n, y^n) - H(X,Y) | < ε },

where

p(x^n, y^n) = Π_{i=1}^{n} p(x_i, y_i).
Theorem 7.6.1 (Joint AEP): Let (X^n, Y^n) be sequences of length n drawn i.i.d. according to p(x^n, y^n) = Π_{i=1}^{n} p(x_i, y_i). Then:

1. Pr((X^n, Y^n) ∈ A_ε^{(n)}) → 1 as n → ∞.

2. |A_ε^{(n)}| ≤ 2^{n(H(X,Y) + ε)}.

3. If (X̃^n, Ỹ^n) ~ p(x^n) p(y^n) [i.e., X̃^n and Ỹ^n are independent with the same marginals as p(x^n, y^n)], then

Pr((X̃^n, Ỹ^n) ∈ A_ε^{(n)}) ≤ 2^{-n(I(X;Y) - 3ε)}.

Also, for sufficiently large n,

Pr((X̃^n, Ỹ^n) ∈ A_ε^{(n)}) ≥ (1 - ε) 2^{-n(I(X;Y) + 3ε)}.
Proof:
1. By the weak law of large numbers (LLN),

-(1/n) log p(X^n) → H(X) in probability,

and similarly -(1/n) log p(Y^n) → H(Y) and -(1/n) log p(X^n, Y^n) → H(X,Y) in probability. Hence the probability that (X^n, Y^n) violates any of the three typicality conditions tends to 0, so

Pr((X^n, Y^n) ∈ A_ε^{(n)}) → 1 as n → ∞.
2. To prove the second part of the theorem, we have

1 = Σ p(x^n, y^n) ≥ Σ_{A_ε^{(n)}} p(x^n, y^n) ≥ |A_ε^{(n)}| 2^{-n(H(X,Y) + ε)},

and hence

|A_ε^{(n)}| ≤ 2^{n(H(X,Y) + ε)}.
3. Now if X̃^n and Ỹ^n are independent but have the same marginals as X^n and Y^n, then

Pr((X̃^n, Ỹ^n) ∈ A_ε^{(n)}) = Σ_{(x^n, y^n) ∈ A_ε^{(n)}} p(x^n) p(y^n)
    ≤ 2^{n(H(X,Y) + ε)} · 2^{-n(H(X) - ε)} · 2^{-n(H(Y) - ε)}
    = 2^{-n(I(X;Y) - 3ε)}.
For sufficiently large n, Pr(A_ε^{(n)}) ≥ 1 - ε, and therefore

1 - ε ≤ Σ_{A_ε^{(n)}} p(x^n, y^n) ≤ |A_ε^{(n)}| 2^{-n(H(X,Y) - ε)},

and

|A_ε^{(n)}| ≥ (1 - ε) 2^{n(H(X,Y) - ε)}.

By similar arguments to the upper bound above, we can also show that for n sufficiently large,

Pr((X̃^n, Ỹ^n) ∈ A_ε^{(n)}) ≥ (1 - ε) 2^{n(H(X,Y) - ε)} 2^{-n(H(X) + ε)} 2^{-n(H(Y) + ε)} = (1 - ε) 2^{-n(I(X;Y) + 3ε)}.
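Part 3 is the heart of the channel coding theorem: independently drawn pairs look jointly typical only with probability about 2^{-nI(X;Y)}. The Monte Carlo sketch below (Python; my own construction, using a doubly symmetric binary p(x,y) with crossover 0.1, so the marginal typicality conditions hold automatically) estimates this probability directly:

    # Monte Carlo check of Joint AEP part 3: X ~ Bern(1/2), Y = X xor Bern(0.1);
    # independent copies are jointly typical with probability ≈ 2^{-n I(X;Y)}.
    import numpy as np

    rng = np.random.default_rng(0)
    n, eps, trials, q = 20, 0.1, 200_000, 0.1

    Hq = -q*np.log2(q) - (1-q)*np.log2(1-q)
    HXY, I = 1 + Hq, 1 - Hq                   # H(X,Y) and I(X;Y) in bits

    # Independent X~^n, Y~^n with the correct (uniform) marginals.
    x = rng.integers(0, 2, (trials, n))
    y = rng.integers(0, 2, (trials, n))

    # p(x^n) = p(y^n) = 2^{-n} exactly, so only the joint condition can fail:
    # p(x_i, y_i) = 0.45 if x_i == y_i else 0.05.
    logp = np.where(x == y, np.log2(0.45), np.log2(0.05))
    emp = -logp.mean(axis=1)                  # empirical joint entropy rate
    hits = np.mean(np.abs(emp - HXY) < eps)

    print(f"empirical Pr ≈ {hits:.1e}")       # ≈ 1.8e-4 for this setup
    print(f"bounds: [{(1-eps)*2**(-n*(I+3*eps)):.1e}, {2**(-n*(I-3*eps)):.1e}]")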
Channel Coding Theorem
Theorem 7.7.1 (Channel coding theorem): For a discrete memoryless channel, all rates below capacity C are achievable. Specifically, for every rate R < C, there exists a sequence of (2^{nR}, n) codes with maximum probability of error λ^{(n)} → 0.

Conversely, any sequence of (2^{nR}, n) codes with λ^{(n)} → 0 must have R ≤ C.
Outline of the achievability proof (random coding): Fix p(x) and generate a random (2^{nR}, n) code by drawing each symbol of each codeword i.i.d. ~ p(x). Reveal the codebook to sender and receiver. A message W, uniform over {1, ..., 2^{nR}}, is sent as X^n(W); the receiver declares Ŵ = i if (X^n(i), Y^n) is jointly typical for exactly one index i, and otherwise declares an error.

Two error events must be controlled: (a) the transmitted codeword and the output are not jointly typical, which by part 1 of the joint AEP has probability → 0; and (b) some wrong codeword is jointly typical with Y^n. Since a wrong codeword is independent of Y^n, part 3 of the joint AEP gives each such event probability at most 2^{-n(I(X;Y) - 3ε)}, and the union bound over the 2^{nR} - 1 wrong codewords gives a total of at most 2^{nR} 2^{-n(I(X;Y) - 3ε)} → 0 whenever R < I(X;Y) - 3ε.

Hence the error probability, averaged over messages and over random codebooks, tends to 0, so there exists at least one codebook whose average error probability tends to 0. Choosing p(x) to be the capacity-achieving distribution gives all rates R < C. Finally, throwing away the worst half of the codewords converts a small average error probability into a small maximal error probability λ^{(n)} at a negligible cost in rate.
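The random-coding argument can be watched in action. The sketch below (Python; my own experiment, using minimum-Hamming-distance decoding in place of joint typicality decoding, which is equivalent to ML decoding for a BSC with p < 1/2) draws one random codebook per block length at a rate R < C and shows the empirical error rate falling as n grows (slowly at these small n, and with some codebook-to-codebook variance):

    # Random coding over a BSC(p = 0.05): C = 1 - H(0.05) ≈ 0.71 bits.
    # We pick R = 0.3 < C and watch the empirical error probability drop with n.
    import numpy as np

    rng = np.random.default_rng(1)
    p, R = 0.05, 0.3

    def trial_error_rate(n: int, trials: int = 500) -> float:
        M = max(2, int(2 ** (n * R)))             # number of codewords, 2^{nR}
        book = rng.integers(0, 2, (M, n), dtype=np.uint8)
        errors = 0
        for _ in range(trials):
            w = rng.integers(M)                   # message to send
            noise = (rng.random(n) < p).astype(np.uint8)
            y = book[w] ^ noise                   # received word
            dists = (book ^ y).sum(axis=1)        # Hamming distances
            errors += int(dists.argmin() != w)
        return errors / trials

    for n in (10, 20, 30, 40):
        print(f"n = {n:2d}: P_e ≈ {trial_error_rate(n):.3f}")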
Zero-error Codes
We first prove the converse in the special case of zero error probability, which motivates the general argument. Assume that we have a (2^{nR}, n) code with zero probability of error, so that the index W is determined by the received sequence Y^n, i.e., H(W | Y^n) = 0. Taking W uniform over {1, ..., 2^{nR}},

nR = H(W) = H(W | Y^n) + I(W; Y^n)
    = I(W; Y^n)
    ≤ I(X^n; Y^n)          (data-processing inequality)
    ≤ Σ_{i=1}^{n} I(X_i; Y_i)
    ≤ nC,

and hence R ≤ C: zero-error codes cannot beat capacity. The inequality I(X^n; Y^n) ≤ Σ I(X_i; Y_i) uses the memorylessness of the channel.
The converse for arbitrary codes follows the same chain, with Fano's inequality replacing the zero-error step: H(W | Y^n) ≤ 1 + P_e^{(n)} nR, so

nR = H(W) ≤ 1 + P_e^{(n)} nR + nC.

Dividing by n and letting n → ∞ with P_e^{(n)} → 0 gives R ≤ C. Equivalently, if R > C, then P_e^{(n)} ≥ 1 - C/R - 1/(nR), which is bounded away from zero for large n.
Universal Codes and Channel Capacity
Relative entropy: Recall that

D(p ∥ q) = Σ_x p(x) log ( p(x) / q(x) )

is the penalty, in bits per symbol, for using a code designed for the distribution q when the true distribution is p.

Now suppose that the source distribution is known only to belong to a family {p_θ}, θ ∈ {1, 2, ..., m}, and we must pick a single coding distribution q. The worst-case redundancy of q is max_θ D(p_θ ∥ q), and the minimax redundancy is

R* = min_q max_θ D(p_θ ∥ q).

Theorem: The minimax redundancy equals the capacity of the "channel" {θ, p_θ(x), X} whose input is the parameter θ and whose output is the source symbol:

R* = C = max_{π(θ)} I(θ; X).
For a prior π(θ) on the parameter, the mutual information between the parameter and the source symbol can be written as an average relative entropy:

I(θ; X) = Σ_θ π(θ) D(p_θ ∥ q_π),

where

q_π(x) = Σ_θ π(θ) p_θ(x)

is the corresponding mixture (output) distribution.
For any coding distribution q we have the decomposition

Σ_θ π(θ) D(p_θ ∥ q) = Σ_θ π(θ) D(p_θ ∥ q_π) + D(q_π ∥ q),

and therefore

min_q Σ_θ π(θ) D(p_θ ∥ q)

is achieved when q = q_π, with minimum value I(θ; X). Combining this with a minimax (saddle-point) argument,

R* = min_q max_θ D(p_θ ∥ q) = max_π min_q Σ_θ π(θ) D(p_θ ∥ q) = max_π I(θ; X) = C,

and therefore the best universal coding distribution is the output distribution q_π* induced by the capacity-achieving prior π*.
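A small numerical check (Python; my own sketch with a hypothetical two-distribution family) verifies the decomposition above: the average redundancy under any q exceeds the average under the mixture q_π by exactly D(q_π ∥ q), so q_π is optimal for the prior π:

    # Verify: sum_theta pi(theta) D(p_theta || q)
    #       = sum_theta pi(theta) D(p_theta || q_pi) + D(q_pi || q).
    import numpy as np

    def D(p, q):
        """Relative entropy in bits (full supports assumed)."""
        return float(np.sum(p * np.log2(p / q)))

    # A hypothetical family of two source distributions and a prior.
    p1 = np.array([0.7, 0.2, 0.1])
    p2 = np.array([0.1, 0.3, 0.6])
    pi = np.array([0.4, 0.6])

    q_pi = pi[0] * p1 + pi[1] * p2                 # mixture distribution
    q = np.array([1/3, 1/3, 1/3])                  # some other coding choice

    lhs = pi[0] * D(p1, q) + pi[1] * D(p2, q)
    rhs = pi[0] * D(p1, q_pi) + pi[1] * D(p2, q_pi) + D(q_pi, q)
    print(np.isclose(lhs, rhs))                    # True: q_pi is optimal for pi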