
1. What is mutual information and explain the properties associated with it.
Ans:

2. Prove that the mutual information of the channel is symmetric, i.e., I(X; Y) = I(Y; X).
Ans:
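The answer is left blank in the source. As a numerical illustration only (not the formal proof), the sketch below evaluates I(X; Y) = Σx Σy p(x,y) log2[p(x,y) / (p(x)p(y))] for a toy joint distribution, then repeats the computation with X and Y swapped; the two results agree because the formula is symmetric in x and y. The distribution pxy is an invented example, not taken from the source.

    import numpy as np

    # Assumed toy joint distribution p(x, y); rows index x, columns index y.
    pxy = np.array([[0.30, 0.10],
                    [0.15, 0.45]])

    def mutual_information(p):
        # I(X; Y) = sum over x, y of p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
        px = p.sum(axis=1, keepdims=True)   # marginal distribution p(x)
        py = p.sum(axis=0, keepdims=True)   # marginal distribution p(y)
        return float((p * np.log2(p / (px * py))).sum())

    # Transposing the joint distribution swaps the roles of X and Y, so the
    # two printed values agreeing (up to rounding) illustrates I(X;Y) = I(Y;X).
    print(mutual_information(pxy))
    print(mutual_information(pxy.T))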


3. Explain the code redundancy and the code variance.
Ans: In digital communication, code redundancy refers to the
deliberate addition of extra bits or symbols to the transmitted
data to detect and correct errors that may occur during
transmission. Redundancy is added through techniques such
as error detection codes (e.g., parity bits, checksums) and
error correction codes (e.g., Hamming codes, Reed-Solomon
codes). These codes provide the ability to identify and
recover from transmission errors, ensuring reliable data
communication.
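As a concrete illustration of redundancy, the sketch below appends a single even-parity bit to a data word so the receiver can detect any single-bit error. This is a minimal illustrative example, not one of the production codes named above.

    def add_parity(bits):
        # Append one redundant bit so the total number of 1s is even.
        return bits + [sum(bits) % 2]

    def check_parity(word):
        # An odd count of 1s means at least one bit flipped in transit.
        return sum(word) % 2 == 0

    tx = add_parity([1, 0, 1, 1])   # transmitted word: [1, 0, 1, 1, 1]
    rx = tx.copy()
    rx[2] ^= 1                      # simulate a single-bit channel error
    print(check_parity(tx))         # True  - received word is consistent
    print(check_parity(rx))         # False - the error is detected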
Code variance, in the context of digital communication, refers
to the variability in the encoding or modulation schemes used
to represent data. Different encoding or modulation
techniques can be employed to transmit the same
information over a communication channel. Code variance
allows for flexibility in adapting to different channel
conditions and optimizing the efficiency and robustness of
the transmission. For example, in digital modulation,
techniques like Amplitude Shift Keying (ASK), Frequency Shift
Keying (FSK), and Phase Shift Keying (PSK) represent different
ways to vary the carrier signal to encode digital information.
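The contrast between the three schemes can be made concrete by generating one bit interval of each carrier. In the NumPy sketch below, the sample rate, carrier frequency and bit duration are all assumed illustrative values:

    import numpy as np

    fs, fc, bit_dur = 8000, 1000, 0.005   # assumed sample rate (Hz), carrier (Hz), bit time (s)
    t = np.arange(0, bit_dur, 1 / fs)

    def ask(bit):
        # ASK: vary the carrier amplitude (here, on-off keying).
        return (1.0 if bit else 0.0) * np.sin(2 * np.pi * fc * t)

    def fsk(bit):
        # FSK: vary the carrier frequency (fc for 0, 2*fc for 1).
        return np.sin(2 * np.pi * (2 * fc if bit else fc) * t)

    def psk(bit):
        # PSK: vary the carrier phase (0 for 0, pi radians for 1).
        return np.sin(2 * np.pi * fc * t + (np.pi if bit else 0.0))

    signal = np.concatenate([psk(b) for b in (1, 0, 1)])   # PSK waveform for bits 101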

4. Calculate the amount of information if binary digits occur with equal likelihood in binary PCM.
Ans:
In binary Pulse Code Modulation (PCM) where binary digits
occur with equal likelihood, the amount of information can
be calculated using Shannon's formula for entropy.
Entropy (H) is a measure of the average amount of
information contained in each symbol of a discrete source.
For binary digits with equal likelihood, the probability of
occurrence for each digit is 0.5.
The formula to calculate entropy is:
H = - Σ (p * log2(p))
Where:
- Σ represents the summation over all possible symbols.
- p represents the probability of each symbol occurring.
For binary PCM with equal likelihood, we have two symbols
(0 and 1) each occurring with a probability of 0.5.
Plugging these values into the formula, we get:
H = - (0.5 * log2(0.5) + 0.5 * log2(0.5))
Simplifying the equation:
H = - (0.5 * (-1) + 0.5 * (-1))
H = - (-0.5 - 0.5)
H = 1
Therefore, in binary PCM where binary digits occur with equal
likelihood, the amount of information per symbol is 1 bit.
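The result is easy to check numerically. The short sketch below evaluates Shannon's entropy formula for an arbitrary probability list and confirms the 1-bit figure for two equiprobable PCM symbols:

    import math

    def entropy(probs):
        # H = -sum(p * log2(p)), skipping zero-probability symbols.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # 1.0 bit per binary digit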

5. Explain the properties of entropy.
Ans:
6. Briefly describe the Code Tree, Trellis and State Diagram for a Convolutional Encoder.
Ans:
7. Draw the state diagram, tree diagram, and trellis diagram for the k = 3, rate-1/3 code generated by g1(x) = 1 + x², g2(x) = 1 + x and g3(x) = 1 + x + x².
Ans:
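The answer is left blank in the source. The diagrams themselves must be drawn by hand, but every edge they contain comes from the encoder's state-transition table, which the sketch below enumerates. It assumes the common convention that the state is the pair of previous inputs (u(n-1), u(n-2)); textbooks differ on bit ordering, so treat this as a sketch rather than the definitive answer.

    # State-transition table for the k = 3, rate-1/3 convolutional code with
    # g1 = 1 + x^2 (taps 101), g2 = 1 + x (taps 110), g3 = 1 + x + x^2 (taps 111).
    G = [(1, 0, 1), (1, 1, 0), (1, 1, 1)]

    for s1 in (0, 1):
        for s2 in (0, 1):                   # current state (s1, s2)
            for u in (0, 1):                # current input bit
                regs = (u, s1, s2)          # shift-register contents
                out = [sum(g * r for g, r in zip(taps, regs)) % 2 for taps in G]
                print(f"state {s1}{s2} --in {u} / out {''.join(map(str, out))}--> state {u}{s1}")

Each printed line is one branch of the trellis (and one arrow of the state diagram); unrolling the table from the all-zero state gives the code tree.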

8. A memoryless source emits six messages with probabilities 0.3, 0.25, 0.15, 0.12, 0.1 and 0.08. Find the Huffman code. Determine its average word length, efficiency and redundancy.
Ans:
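The answer is left blank in the source. Below is a sketch of the computation using the standard Huffman construction; for these probabilities it yields codeword lengths (2, 2, 3, 3, 3, 3), giving an average length L = 2.45 bits/symbol against an entropy H ≈ 2.42 bits/symbol, i.e. efficiency ≈ 98.9% and redundancy ≈ 1.1%.

    import heapq, math

    probs = [0.3, 0.25, 0.15, 0.12, 0.1, 0.08]

    # Build the Huffman tree by repeatedly merging the two least-probable
    # nodes; each merge adds one bit to every codeword inside the merged node.
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, m1 = heapq.heappop(heap)
        p2, m2 = heapq.heappop(heap)
        for i in m1 + m2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, m1 + m2))

    L = sum(p * l for p, l in zip(probs, lengths))   # average word length
    H = -sum(p * math.log2(p) for p in probs)        # source entropy
    print(f"lengths={lengths}  L={L:.2f}  H={H:.3f}")
    print(f"efficiency={H / L:.3%}  redundancy={1 - H / L:.3%}")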

9. A code is composed of dots and dashes. Assume that the dash is three times as long as the dot and has one-third the probability of occurrence.
(i) Calculate the information in a dot and that in a dash
(ii) Calculate the average information in the dot-dash code.
(iii) Assume that a dot lasts for 10 ms and that this same time interval is allowed between symbols. Calculate the average rate of information transmission.

Ans:
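The answer is left blank in the source. Since the dash has one-third the dot's probability and the two probabilities must sum to one, P(dot) = 3/4 and P(dash) = 1/4; the sketch below carries that through parts (i)-(iii):

    import math

    p_dot, p_dash = 0.75, 0.25               # from p_dash = p_dot / 3 and p_dot + p_dash = 1

    # (i) Self-information of each symbol, I = -log2(p)
    i_dot = -math.log2(p_dot)                # ~0.415 bits
    i_dash = -math.log2(p_dash)              # 2 bits

    # (ii) Average information (entropy) of the dot-dash source
    h = p_dot * i_dot + p_dash * i_dash      # ~0.811 bits/symbol

    # (iii) Dot = 10 ms, dash = 30 ms, plus a 10 ms gap after every symbol
    t_avg = p_dot * 0.020 + p_dash * 0.040   # average symbol time: 0.025 s

    print(f"I(dot) = {i_dot:.3f} bits, I(dash) = {i_dash:.3f} bits")
    print(f"H = {h:.3f} bits/symbol, rate = {h / t_avg:.2f} bits/s")   # ~32.4 bits/s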
