(i) What is the capacity C of the binary symmetric channel (BSC) with transition/crossover probability p? Write down the expression.
(ii) Plot the capacity C as a function of p; identify the maximum and minimum values and the corresponding
p values.
1 + 1 = 2 marks
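For part (ii), the values to plot can be tabulated numerically; a minimal sketch, assuming the standard BSC capacity formula C = 1 − h2(p) (in bits per channel use):

```python
import math

def h2(p):
    """Binary entropy function (in bits); h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the BSC with crossover probability p: C = 1 - h2(p)."""
    return 1.0 - h2(p)

# Maximum C = 1 at p = 0 and p = 1; minimum C = 0 at p = 1/2.
print(bsc_capacity(0.0), bsc_capacity(0.5), bsc_capacity(1.0))  # 1.0 0.0 1.0
```

Feeding a grid of p values into `bsc_capacity` gives the data for the requested plot.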
2. Let C ⊆ {0, 1}^n be a code with minimum distance d ≜ dmin(C), where d is odd. Now let us construct a new code C′ as follows: add an overall parity bit to each codeword of C to generate the corresponding new codeword in C′. Then,
(i) What is Vol(S(z, r))? Express Vol(S(z, r)) as an exponential with base 2 (basically, use the trick x = 2^{log2(x)}). Use Stirling’s approximation (state it first; look up any resource online) and simplify. The resulting expression will contain, as one term, the binary entropy function h2(p) ≜ −p log2(p) − (1 − p) log2(1 − p), as shown in class.
(ii) Let z0 be such that wtH(z0) = n/4. Then, what is Vol(S(z0, r))?
(iii) Define B(z, r) ≜ {y ∈ {0, 1}^n : dH(z, y) ≤ r}. Repeat (i) and (ii) for B(z, r).
(i) Show that this is a valid distance measure (show that it satisfies all the three properties as discussed in class) for all values p ≥ 1.
(ii) Let the ‘n-ball’ be defined as B∞(x, r) ≜ {z ∈ R^n : d∞(x, z) ≤ r}, where r is a non-negative real number. This is an n-dimensional ball centered at x and with radius r (radius measured under the distance function d∞). Draw the ball B∞(0, 1) centered at the origin 0 ∈ R^n with radius 1.
2 + 1 = 3 marks
1. Recall the binary symmetric channel (BSC) with crossover probability p, p ∈ [0, 1]; we define it below:
X = {0, 1}, Y = {0, 1} and the “channel law” PY|X given by
PY|X(y|x) = 1 − p if y = x, and PY|X(y|x) = p if y ≠ x.    (1)
For clarity, draw this as the 2 × 2 graph with 4 edges, as done in class!
Now consider a binary block repetition code C ⊆ {0, 1}^3, where C = {000, 111}. Let the channel be a BSC(p), p ∈ [0, 1]. For the decoder g : {0, 1}^3 → C (note that g is a mapping from received vectors to messages, which are uniquely mapped to the codewords 000 and 111 here) specified by
g(y) = 000 if y contains at least two 0s, and g(y) = 111 otherwise (i.e., majority decoding),    (2)
(i) determine the probability of error averaged over the two equally likely messages (codewords); show your
calculations.
(ii) how does this quantity depend on p (examine what happens when p < 1/2 and when p > 1/2)? Is it better or worse than sending ‘uncoded’ messages (here a message corresponds to exactly one bit, and NOT three repeated bits)?
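A quick numerical cross-check for parts (i) and (ii), assuming g is the majority-vote decoder (an assumption on my part, matching the natural choice for this code):

```python
from itertools import product

def majority_decode(y):
    """Assumed decoder g: output 000 on a majority of 0s, else 111."""
    return (0, 0, 0) if sum(y) <= 1 else (1, 1, 1)

def p_error(p):
    """Block error probability over BSC(p) for C = {000, 111}; by symmetry it
    is the same for both equally likely codewords, so 000 alone suffices."""
    c = (0, 0, 0)
    err = 0.0
    for y in product((0, 1), repeat=3):
        flips = sum(a != b for a, b in zip(c, y))
        if majority_decode(y) != c:
            err += p**flips * (1 - p)**(3 - flips)
    return err

# Matches the closed form 3*p**2*(1-p) + p**3, which is below p iff p < 1/2.
for p in (0.01, 0.1, 0.25, 0.45):
    print(p, round(p_error(p), 6), round(3*p**2*(1 - p) + p**3, 6))
```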
Lemma 1. Let C ⊆ {0, 1}^n be a code with minimum distance d ≜ dmin(C). Then, there exists a decoder g : {0, 1}^n → C that correctly decodes every pattern of up to ⌊(d − 1)/2⌋ errors (or bit flips) over the channel.
Lemma 2. Let C ⊆ {0, 1}^n be a code with minimum distance d ≜ dmin(C). Then, there exists a decoder g : {0, 1}^n → C ∪ {⊥} that correctly detects every pattern of up to d − 1 errors (or bit flips) over the channel.
Then, the minimum distance decoder gMDD correctly decodes every pattern of up to ⌊(d − 1)/2⌋ errors (or bit flips) over the channel and correctly detects every pattern of up to d − 1 errors over the channel.
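The two lemmas can be sanity-checked exhaustively on a tiny code; a sketch using the length-5 repetition code (my choice of example, not from the problem set), with ⊥ represented by None:

```python
from itertools import combinations

C = [(0, 0, 0, 0, 0), (1, 1, 1, 1, 1)]  # length-5 repetition code, d = dmin(C) = 5

def d_hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def g_mdd(y):
    """Minimum-distance decoder: output a codeword nearest to y."""
    return min(C, key=lambda c: d_hamming(c, y))

def g_detect(y):
    """Detection-only decoder into C ∪ {⊥}: output y if y ∈ C, else ⊥ (None)."""
    return y if y in C else None

def flip(c, positions):
    y = list(c)
    for i in positions:
        y[i] ^= 1
    return tuple(y)

d = 5
for c in C:
    # Lemma 1: every pattern of up to floor((d-1)/2) = 2 flips is corrected.
    for e in range((d - 1) // 2 + 1):
        for pos in combinations(range(5), e):
            assert g_mdd(flip(c, pos)) == c
    # Lemma 2: every pattern of 1 .. d-1 = 4 flips is detected.
    for e in range(1, d):
        for pos in combinations(range(5), e):
            assert g_detect(flip(c, pos)) is None
print("checked all error patterns")
```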
5. (Bit erasures instead of bit errors/ bit flips)
Consider the binary erasure channel (BEC) defined below:
X = {0, 1}, Y = {0, 1, ?} (where ? denotes an erasure) and the “channel law” PY|X given by
PY|X(y|x) = 1 − p if y = x; 0 if y ∈ {0, 1} and y ≠ x; p if y = ?.    (3)
Compute the decoding error probability under decoder g; namely, compute the probability that decoder
g produces either ⊥ or a wrong codeword.
(vi) Show that the value computed in part (iv) bounds from above the probability that the decoder g in part (v) produces a wrong codeword (the latter probability is called the decoding misdetection probability and does not count the event that the decoder produces ⊥).
9. Recall the binary erasure channel (BEC) with erasure probability p (described above). Let the erasure probability be p = 0.1. A codeword of the binary block parity code C ⊆ {0, 1}^4, where |C| = 8 and dmin(C) = 2, is transmitted through the BEC(p) and the following decoder g : Y^4 → C ∪ {⊥}, where Y = {0, 1, ?}, is applied to the received vector:
g(y) = x if y agrees with exactly one x ∈ C on the unerased entries (those in {0, 1}), and g(y) = ⊥ otherwise.
Compute the probability that decoder g produces ⊥. Does this probability depend on which codeword is
transmitted?
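The probability that g produces ⊥ can be brute-forced by enumerating all erasure patterns; a sketch assuming the decoder defined above, with the single parity-check code as C (for this linear code, whether a pattern is uniquely decodable depends only on the erased positions, not on the transmitted codeword):

```python
from itertools import product, combinations

p = 0.1
n = 4
# Single parity-check code: all even-weight length-4 vectors (|C| = 8, dmin = 2).
C = [v for v in product([0, 1], repeat=n) if sum(v) % 2 == 0]

def decode(y):
    """y has entries in {0, 1, '?'}; return the unique agreeing codeword, else None (⊥)."""
    agree = [c for c in C if all(y[i] in (c[i], '?') for i in range(n))]
    return agree[0] if len(agree) == 1 else None

prob_bottom = 0.0
c0 = C[0]  # by linearity/symmetry the answer is the same for every codeword
for k in range(n + 1):
    for S in combinations(range(n), k):
        y = tuple('?' if i in S else c0[i] for i in range(n))
        if decode(y) is None:
            prob_bottom += p**k * (1 - p)**(n - k)

print(round(prob_bottom, 6))  # 0.0523
```

The enumeration confirms that ⊥ occurs exactly when two or more positions are erased, i.e. with probability 1 − (1 − p)^4 − 4p(1 − p)^3.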
10. Let z ∈ {0, 1}^n and r ∈ {0, 1, · · · , n}. Define S(z, r) ≜ {y ∈ {0, 1}^n : dH(z, y) = r}. Let Vol(S(z, r)) ≜ |S(z, r)| (the size of S(z, r), i.e., the number of vectors in S(z, r)). Then
(i) What is Vol(S(z, r))?
(ii) Express Vol(S(z, r)) as an exponential with base 2 (basically, use the trick x = 2^{log2(x)}). Use Stirling’s approximation (state it first; look up any resource online) and simplify. The resulting expression will contain, as one term, the binary entropy function h2(p) ≜ −p log2(p) − (1 − p) log2(1 − p), as shown in class.
(iii) Let z0 be such that wtH(z0) = n/4. Then, what is Vol(S(z0, r))?
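Parts (i) and (ii) can be sanity-checked numerically, assuming the standard counting answer Vol(S(z, r)) = C(n, r) (note it does not depend on z, which is relevant to part (iii)) and the Stirling-based approximation C(n, r) ≈ 2^{n·h2(r/n)} up to polynomial factors in n:

```python
import math

def vol_sphere(n, r):
    """Number of binary vectors at Hamming distance exactly r from any fixed z."""
    return math.comb(n, r)

def h2(q):
    """Binary entropy function in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# Compare exponents: log2 C(n, r) versus n*h2(r/n); the gap is O(log n).
n, r = 1000, 250
exact = math.log2(vol_sphere(n, r))
approx = n * h2(r / n)
print(exact, approx, exact / approx)  # ratio close to 1
```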
11. Let A, B be m × n matrices over R (each entry in the matrix comes from the set of real numbers).
Define the rank distance between A and B by rank(A − B). Show that the rank distance is a metric over the
set of all m × n matrices over R.
(Here you will need to verify all the 3 properties any distance/metric function needs to satisfy; this was
discussed in class.
A couple of hints which might be useful:
(a) rank(A + B) ≤ rank(A) + rank(B),
(b) rank(A) = rank(−A). These should be sufficient to complete the proof.)
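The two hints, and the three metric properties themselves, can be spot-checked numerically (a sanity check on random low-rank matrices, not a proof), e.g. with numpy:

```python
import numpy as np

def rank_dist(A, B):
    """Rank distance between real m x n matrices: rank(A - B)."""
    return np.linalg.matrix_rank(A - B)

rng = np.random.default_rng(0)
for _ in range(100):
    # Random 4 x 5 integer matrices of rank at most 2, so ranks vary.
    A = rng.integers(-2, 3, (4, 2)) @ rng.integers(-2, 3, (2, 5))
    B = rng.integers(-2, 3, (4, 2)) @ rng.integers(-2, 3, (2, 5))
    C = rng.integers(-2, 3, (4, 2)) @ rng.integers(-2, 3, (2, 5))
    assert (rank_dist(A, B) == 0) == np.array_equal(A, B)        # d(A,B) = 0 iff A = B
    assert rank_dist(A, B) == rank_dist(B, A)                    # symmetry (hint (b))
    assert rank_dist(A, C) <= rank_dist(A, B) + rank_dist(B, C)  # triangle (hint (a))
print("metric properties hold on all samples")
```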
12. Let x, y ∈ R^n be real vectors of length n. Let us define the following function dp(x, y), where p ∈ [0, ∞), between the vectors x, y as follows:

dp(x, y) ≜ ( Σ_{i=1}^{n} |xi − yi|^p )^{1/p} .
(i) Show that this is a valid distance measure (show that it satisfies all the three properties as discussed in
class) for all values p ≥ 1.
(ii) Let the ‘n-ball’ be defined as Bp(x, r) ≜ {z ∈ R^n : dp(x, z) ≤ r}, where r is a non-negative real number. This is an n-dimensional ball centered at x and with radius r (radius measured under the distance function dp). Draw the ball Bp(0, 1) centered at the origin 0 ∈ R^n with radius 1, for all values of p; in particular, draw for p = 0, p = 1/2, p = 1, p = 2, p = 3, and p = ∞.
(iii) Is dp(·, ·) a valid distance function for p ∈ [0, 1)? If not, which property fails?
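For part (iii), a single concrete counterexample settles it; a sketch checking the triangle inequality at p = 1/2 (the three points are my choice):

```python
def d_p(x, y, p):
    """The function dp from the problem: (sum |xi - yi|^p)^(1/p), for p > 0."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

x, y, z = (0, 0), (1, 1), (1, 0)
p = 0.5
# d(x,y) = (1 + 1)^2 = 4, but d(x,z) + d(z,y) = 1 + 1 = 2.
print(d_p(x, y, p), d_p(x, z, p) + d_p(z, y, p))  # 4.0 2.0
assert d_p(x, y, p) > d_p(x, z, p) + d_p(z, y, p)  # triangle inequality fails
```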
13. Let G = (V, E) be a graph comprising vertices V = {v1, v2, · · · , vk} and edges E = {e1, e2, · · · , el}, k, l ∈ N. Let G be a weighted (every edge has a ‘weight’ associated with it), undirected (this means that the weight of the edge from node u to node v is the same as the weight from node v to node u), connected (every pair of nodes is joined by a path, either direct or via hops over intermediate nodes) graph, in which all the weights are strictly positive. For nodes u, v ∈ V , let d(u, v) denote the ‘length’ or ‘weight’ of the shortest path between u and v.
Show that the shortest path distance between two nodes in the graph is a distance function (metric).
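The claim can be spot-checked on a small example graph using Floyd–Warshall (an illustrative instance of my choosing, not part of the problem):

```python
# All-pairs shortest paths on a small weighted undirected graph, then a check
# of the three metric properties on the resulting distance table.
INF = float("inf")
n = 4
w = [[INF] * n for _ in range(n)]
for u, v, wt in [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.5), (0, 3, 10.0)]:
    w[u][v] = w[v][u] = min(w[u][v], wt)   # undirected: symmetric weights

d = [[0.0 if i == j else w[i][j] for j in range(n)] for i in range(n)]
for k in range(n):                          # Floyd–Warshall relaxation
    for i in range(n):
        for j in range(n):
            d[i][j] = min(d[i][j], d[i][k] + d[k][j])

for i in range(n):
    for j in range(n):
        assert d[i][j] == d[j][i]                    # symmetry
        assert d[i][j] > 0 or i == j                 # positivity (weights > 0)
        for k in range(n):
            assert d[i][j] <= d[i][k] + d[k][j]      # triangle inequality
print(d[0][3])  # 4.5: the path 0-1-2-3 beats the direct edge of weight 10
```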