- R.S.Kiruthika, L/ECE
1. Define probability.
2. What is joint & conditional probability?
3. What is a random process?
4. What is a random variable?
5. What is a probability distribution function? Define the joint PDF (JPDF).
6. What is a cumulative probability distribution function? Define the joint CPDF (JCPDF).
7. Define mean, moments, and variance.
8. What is standard deviation?
9. What is a Gaussian process? List its properties.
10. Define stationary and non-stationary processes.
11. Define strict-sense stationary and wide-sense stationary processes.
12. What are stochastic and ergodic processes?
13. Define noise.
14. Classify noise. What parameters are used for the classification?
15. What is white noise? What is thermal (Johnson) noise? Give its voltage equation.
16. What is noise equivalent resistance?
17. What is shot noise? Give its equation.
18. What is partition noise? Give its voltage equation.
19. What is low-frequency (flicker/modulation) noise?
20. Define resistor noise.
21. Define mixer noise, burst noise, and avalanche noise.
22. What is transit-time noise?
23. Give the power spectral density of thermal noise.
24. Give the power spectral density of shot noise.
25. What is SNR? What is its application in noise analysis?
26. Define noise figure. Why is SNR alone not always used for noise analysis?
27. Write the equation for noise figure.
28. Express the noise figure using Friis' formula.
29. Write the noise figure using Ra and Rt.
30. Give the noise-figure equation in terms of R'eqv.
31. Define narrow-band noise.
32. Write the properties of narrow-band noise.
33. Define noise temperature.
34. Write the equation for noise temperature.
35. Express the noise temperature using Friis' formula.
36. Define space noise.
37. What is atmospheric noise?
38. What is internal noise?
39. What is industrial noise?
40. Write the equation for equivalent noise resistors connected in parallel and in series.
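Questions 15, 28, and 35 above ask for the thermal-noise voltage equation and Friis' formula. A minimal Python sketch of both, using hypothetical stage values purely for illustration:

```python
import math

k = 1.38e-23  # Boltzmann constant, J/K

def thermal_noise_voltage(R, T=290.0, B=1.0):
    """RMS thermal (Johnson) noise voltage of a resistor:
    v_n = sqrt(4 k T B R), with R in ohms, T in kelvin, B in Hz."""
    return math.sqrt(4 * k * T * B * R)

def friis_noise_figure(stages):
    """Friis' formula for cascaded stages:
    F = F1 + (F2 - 1)/G1 + (F3 - 1)/(G1*G2) + ...
    `stages` is a list of (F, G) pairs as linear ratios (not dB)."""
    F_total, gain = 0.0, 1.0
    for i, (F, G) in enumerate(stages):
        F_total += F if i == 0 else (F - 1.0) / gain
        gain *= G
    return F_total

# Hypothetical two-stage example: F1 = 2 (3 dB), G1 = 10; F2 = 5, G2 = 20
# F = 2 + (5 - 1)/10 = 2.4
print(friis_noise_figure([(2.0, 10.0), (5.0, 20.0)]))  # → 2.4

# A 1 kΩ resistor at 290 K over 1 MHz gives roughly 4 µV rms
print(thermal_noise_voltage(1000.0, T=290.0, B=1e6))
```

The first stage dominates the cascade noise figure when its gain is large, which is why Friis' formula motivates a low-noise first stage.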
UNIT-4
2 MARKS
4. What is the ratio of output SNR to input SNR, (S/N)_o/(S/N)_i, for AM with envelope detection in the small-noise case?
7. Define SNR.
12. What is fidelity?
18. When a superheterodyne receiver is tuned to 555 kHz, its local oscillator provides the mixer with an input at 1010 kHz. What is the image frequency of the receiver?
38. Why is the noise performance of an AM receiver poor compared with DSB and SSB receivers?
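For question 18 above: with high-side local-oscillator injection, the intermediate frequency is IF = f_LO − f_s and the image lies at f_img = f_s + 2·IF. A quick check in Python:

```python
def image_frequency(f_signal, f_lo):
    """Image frequency for high-side LO injection:
    IF = f_lo - f_signal, f_img = f_signal + 2 * IF."""
    f_if = f_lo - f_signal
    return f_signal + 2 * f_if

# Question 18: tuned to 555 kHz, LO at 1010 kHz
# IF = 1010 - 555 = 455 kHz, so f_img = 555 + 2*455 = 1465 kHz
print(image_frequency(555e3, 1010e3))  # → 1465000.0
```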
UNIT-5
INFORMATION THEORY
1. What is entropy?
2. What is prefix code?
3. Define information rate.
4. What is the channel capacity of a binary symmetric channel with an error probability of 0.2?
5. State channel coding theorem.
6. Define entropy for a discrete memory less source.
7. What is channel redundancy?
8. Write down the formula for the mutual information.
9. When is the average information delivered by a source of alphabet size 2 maximum?
10. Name the source coding techniques.
11. Write down the formula for mutual information.
12. Write the expression for code efficiency in terms of entropy.
13. Is the information of a continuous system non negative? If so, why?
14. Explain the significance of the conditional entropy H(X/Y) of a communication system, where X is the transmitter and Y is the receiver.
15. An event has six possible outcomes with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Find the entropy of the system.
16. State channel coding theorem.
17. A source emits symbols X1, X2, X3 with probabilities 0.6, 0.3, and 0.1 respectively. What is the entropy of the source?
18. When is the average information delivered by a source of alphabet size 2 maximum?
19. Define discrete entropy.
20. State the properties of entropy.
21. Define source coding.
22. Define average length per code.
23. How is the code efficiency calculated?
24. State the procedure followed in Shannon-Fano coding.
25. Define the bit of information.
26. Why is Huffman coding said to be optimal?
27. Give the relation between average mutual information and average information per pair for a noise-free channel.
28. Give the mathematical expression for joint entropy of independent events.
29. What is the need for data compression?
30. Define rate distortion theory.
31. What are the drawbacks of data compression?
32. Define differential entropy.
33. List the steps involved in Lempel-Ziv coding.
34. What are advantages of coding?
35. Define code variance.
36. When is a code said to be uniquely decodable?
37. Why are the channels called discrete memoryless?
38. Write short notes on the binary erasure channel.
39. Define symmetric channel.
40. List the properties of mutual information.
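Several questions above (4, 15, 17, and the alphabet-size-2 questions) reduce to evaluating the entropy formula H = −Σ p·log₂ p and the binary-symmetric-channel capacity C = 1 − H(p). A minimal Python sketch of the arithmetic:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - entropy([p, 1.0 - p])

# Question 15: dyadic probabilities give an exact answer
print(entropy([1/2, 1/4, 1/8, 1/16, 1/32, 1/32]))  # → 1.9375 bits

# Question 17: H(0.6, 0.3, 0.1) ≈ 1.295 bits
print(entropy([0.6, 0.3, 0.1]))

# Question 4: C = 1 - H(0.2) ≈ 0.278 bits/symbol
print(bsc_capacity(0.2))

# Alphabet-size-2 questions: entropy peaks at 1 bit for equiprobable symbols
print(entropy([0.5, 0.5]))  # → 1.0
```

A binary source delivers maximum average information (1 bit/symbol) when both symbols are equally likely, which is also why the BSC capacity is achieved with equiprobable inputs.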