
**CS414 – Spring 2007**

By Roger Cheng, Karrie Karahalios, Brian Bailey

**Communication system abstraction**

[Figure: sender side — Information source → Encoder → Modulator → Channel; receiver side — Demodulator → Decoder → Output signal]

**The additive noise channel**

• Transmitted signal s(t) is corrupted by noise source n(t), and the resulting received signal is r(t)

• Noise could result from many sources, including electronic components and transmission interference

[Figure: r(t) = s(t) + n(t)]
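The additive noise channel above is straightforward to simulate. The sketch below (not from the slides; the function name and parameters are illustrative) draws each noise sample independently from a zero-mean Gaussian and checks that the empirical noise variance matches σ²:

```python
import random

def awgn_channel(signal, sigma, seed=0):
    # r(t) = s(t) + n(t), with n(t) i.i.d. Gaussian, mean 0, variance sigma^2
    rng = random.Random(seed)
    return [s + rng.gauss(0.0, sigma) for s in signal]

# Send a constant signal through the channel and inspect the noise.
sent = [1.0] * 10_000
received = awgn_channel(sent, sigma=0.5)

noise = [r - s for r, s in zip(received, sent)]
mean = sum(noise) / len(noise)
var = sum((n - mean) ** 2 for n in noise) / len(noise)
# Empirical mean should be near 0 and variance near sigma^2 = 0.25
```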

**Random processes**

• A random variable is the result of a single measurement
• A random process is an indexed collection of random variables, or equivalently a nondeterministic signal that can be described by a probability distribution
• Noise can be modeled as a random process

**WGN (White Gaussian Noise)**

• Properties
  • At each time instant t = t0, the value of n(t) is normally distributed with mean 0 and variance σ² (i.e. E[n(t0)] = 0, E[n(t0)²] = σ²)
  • At any two different time instants, the values of n(t) are uncorrelated (i.e. E[n(t0)n(tk)] = 0)
  • The power spectral density of n(t) has equal power in all frequency bands

**WGN continued**

• When an additive noise channel has a white Gaussian noise source, we call it an AWGN channel
• Most frequently used model in communications
• Reasons why we use this model
  • It's easy to understand and compute
  • It applies to a broad class of physical channels

**Signal energy and power**

• Energy is defined as E_x = ∫_{−∞}^{∞} |x(t)|² dt
• Power is defined as P_x = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt
• Most signals are either finite energy and zero power, or infinite energy and finite power
• Noise power is hard to compute in the time domain
• Power of WGN is its variance σ²

**Signal to Noise Ratio (SNR)**

• Defined as the ratio of signal power to the noise power corrupting the signal
• Usually more practical to measure SNR on a dB scale
• Obviously, want as high an SNR as possible
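On a dB scale the ratio becomes 10·log₁₀(P_signal / P_noise). A minimal sketch (the helper name is illustrative, not from the slides):

```python
import math

def snr_db(signal_power, noise_power):
    # SNR in decibels: 10 * log10(P_signal / P_noise)
    return 10 * math.log10(signal_power / noise_power)

# A unit-power signal against WGN of variance 0.01: 10 * log10(100) = 20 dB
ratio = snr_db(1.0, 0.01)
```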

**Analog vs. Digital**

• Analog system
  • Any amount of noise will create distortion at the output
• Digital system
  • A relatively small amount of noise will cause no harm at all
  • Too much noise will make decoding of received signal impossible
• Both
  • Goal is to limit effects of noise to a manageable/satisfactory amount

**Information theory and entropy**

• Information theory tries to solve the problem of communicating as much data as possible over a noisy channel
• Measure of data is entropy
• Claude Shannon first demonstrated that reliable communication over a noisy channel is possible (jumpstarted the digital age)

**Review of Entropy Coding**

• Alphabet: finite, non-empty set
  • A = {a, b, c, d, e, …}
• Symbol (s): element from the set
• String: sequence of symbols from A
• Codeword: sequence representing a coded string
  • 0110010111101001010
• p_i: probability of symbol i in the string, with Σ_{i=1}^{N} p_i = 1
• L_i: length of the codeword of symbol i in bits

"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point." –Shannon, 1948

**Measure of Information**

• Information content of symbol s_i (in bits): −log₂ p(s_i)
• Examples
  • p(s_i) = 1 has no information
  • smaller p(s_i) has more information, as it was unexpected or surprising

**Entropy**

• Weigh the information content of each source symbol by its probability of occurrence; the resulting value is called entropy (H):

  H = −Σ_{i=1}^{n} p(s_i) log₂ p(s_i)

• Produces a lower bound on the number of bits needed to represent the information with code words

**Entropy Example**

• Alphabet = {A, B}
• p(A) = 0.4, p(B) = 0.6
• Compute Entropy (H)
  • −0.4·log₂ 0.4 + −0.6·log₂ 0.6 = 0.97 bits
• Maximum uncertainty (gives largest H) occurs when all probabilities are equal
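The entropy formula and the {A, B} example can be checked numerically. A small sketch (the `entropy` helper is ours, not part of the slides):

```python
import math

def entropy(probs):
    # H = -sum_i p_i * log2(p_i), in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

h = entropy([0.4, 0.6])      # the slide's example: about 0.97 bits
h_max = entropy([0.5, 0.5])  # equal probabilities maximize H: 1 bit
```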

**Entropy definitions**

• Shannon entropy
• Binary entropy formula
• Differential entropy

**Properties of entropy**

• Can be defined as the expectation of −log p(x) (i.e. H(X) = E[−log p(x)])
• Is not a function of a variable's values; is a function of the variable's probabilities
• Usually measured in "bits" (using logs of base 2) or "nats" (using logs of base e)
• Maximized when all values are equally likely (i.e. uniform distribution)
• Equal to 0 when only one value is possible

**Joint and conditional entropy**

• Joint entropy is the entropy of the pairing (X, Y)
• Conditional entropy is the entropy of X if the value of Y was known
• Relationship between the two: H(X, Y) = H(Y) + H(X|Y)

**Mutual information**

• Mutual information is how much information about X can be obtained by observing Y: I(X; Y) = H(X) − H(X|Y)

**Mathematical model of a channel**

• Assume that our input to the channel is X, and the output is Y
• Then the characteristics of the channel can be defined by its conditional probability distribution p(y|x)

**Channel capacity and rate**

• Channel capacity is defined as the maximum possible value of the mutual information
• We choose the best input distribution f(x) to maximize C
• For any rate R < C, we can transmit information with arbitrarily small probability of error

**Binary symmetric channel**

• Correct bit transmitted with probability 1−p
• Wrong bit transmitted with probability p
  • Sometimes called "cross-over probability"
• Capacity C = 1 − H(p, 1−p)

**Binary erasure channel**

• Correct bit transmitted with probability 1−p
• "Erasure" transmitted with probability p
• Capacity C = 1 − p
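Both capacity formulas are easy to evaluate. A sketch (the helper names are ours, not from the slides):

```python
import math

def binary_entropy(p):
    # H(p, 1-p) in bits; 0 at the endpoints by convention
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Binary symmetric channel: C = 1 - H(p, 1-p)
    return 1 - binary_entropy(p)

def bec_capacity(p):
    # Binary erasure channel: C = 1 - p
    return 1 - p
```

At p = 0 both channels are noiseless (C = 1); at p = 0.5 the symmetric channel carries nothing, while the erasure channel still has capacity 0.5 because erasures are flagged rather than silently flipped.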

**Coding theory**

• Information theory only gives us an upper bound on communication rate
• Need to use coding theory to find a practical method to achieve a high rate
• 2 types
  • Source coding – compress source data to a smaller size
  • Channel coding – adds redundancy bits to make transmission across noisy channel more robust

**Source-channel separation theorem**

• Shannon showed that when dealing with one transmitter and one receiver, we can break up source coding and channel coding into separate steps without loss of optimality
• Does not apply when there are multiple transmitters and/or receivers
  • Need to use network information theory principles in those cases

**Coding Intro**

• Assume alphabet K of {A, B, C, D, E, F, G, H}
• In general, if we want to distinguish n different symbols, we will need to use log₂ n bits per symbol, i.e. 3 for n = 8
• Can code alphabet K as:
  A 000, B 001, C 010, D 011, E 100, F 101, G 110, H 111

**Coding Intro**

• "BACADAEAFABBAAAGAH" is encoded as the string of 54 bits (fixed length code):
  001000010000011000100000101000001001000000000110000111

**Coding Intro**

• With this coding:
  A 0, B 100, C 1010, D 1011, E 1100, F 1101, G 1110, H 1111
• 100010100101101100011010100100000111001111
• 42 bits, saves more than 20% in space
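The fixed-length and variable-length encodings above can be compared programmatically. The two code tables and the message come from the slides; the rest is an illustrative sketch:

```python
fixed = {'A': '000', 'B': '001', 'C': '010', 'D': '011',
         'E': '100', 'F': '101', 'G': '110', 'H': '111'}
variable = {'A': '0', 'B': '100', 'C': '1010', 'D': '1011',
            'E': '1100', 'F': '1101', 'G': '1110', 'H': '1111'}

msg = "BACADAEAFABBAAAGAH"
fixed_bits = ''.join(fixed[s] for s in msg)   # 18 symbols x 3 bits = 54 bits
var_bits = ''.join(variable[s] for s in msg)  # 42 bits
savings = 1 - len(var_bits) / len(fixed_bits) # just over 22%
```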

**Huffman Tree**

A (8), B (3), C (1), D (1), E (1), F (1), G (1), H (1)

**Huffman Encoding**

• Use probability distribution to determine how many bits to use for each symbol
  • higher-frequency symbols assigned shorter codes
• entropy-based, block-variable coding scheme

**Huffman Encoding**

• Produces a code which uses a minimum number of bits to represent each symbol
  • cannot represent same sequence using fewer real bits per symbol when using code words
  • optimal when using code words, but this may differ slightly from the theoretical lower limit
• lossless
• Build Huffman tree to assign codes

**Informal Problem Description**

• Given a set of symbols from an alphabet and their probability distribution
  • assumes distribution is known and stable
• Find a prefix free binary code with minimum weighted path length
  • prefix free means no codeword is a prefix of any other codeword

**Huffman Algorithm**

• Construct a binary tree of codes
  • leaf nodes represent symbols to encode
  • interior nodes represent cumulative probability
  • edges assigned 0 or 1 output code
• Construct the tree bottom-up
  • connect the two nodes with the lowest probability until no more nodes to connect
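The bottom-up construction above can be sketched with a priority queue. This implementation is ours, not from the slides; tie-breaking between equal-probability nodes is arbitrary, so individual code assignments may differ from the in-class solution while the code lengths match:

```python
import heapq

def huffman_codes(freqs):
    # Each heap entry: (probability, tiebreak id, {symbol: partial code})
    heap = [(p, i, {sym: ''}) for i, (sym, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        # Connect the two nodes with the lowest probability...
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        # ...assigning 0 to one subtree's edge and 1 to the other's.
        merged = {s: '0' + c for s, c in left.items()}
        merged.update({s: '1' + c for s, c in right.items()})
        heapq.heappush(heap, (p1 + p2, next_id, merged))
        next_id += 1
    return heap[0][2]

# The distribution from the in-class example
codes = huffman_codes({'A': 0.25, 'B': 0.30, 'C': 0.12, 'D': 0.15, 'E': 0.18})
```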

**Huffman Example**

• Construct the Huffman coding tree (in class)

Symbol (S)   A     B     C     D     E
P (S)        0.25  0.30  0.12  0.15  0.18

**Characteristics of Solution**

• Lowest probability symbol is always furthest from root
• Assignment of 0/1 to children edges arbitrary
  • other solutions possible, lengths remain the same
• If two nodes have equal probability, can select any two

Symbol (S)  Code
A           11
B           00
C           010
D           011
E           10

• Notes
  • prefix free code
  • O(n lg n) complexity

**Example Encoding/Decoding**

Symbol (S)  Code
A           11
B           00
C           010
D           011
E           10

• Encode "BEAD" => 001011011
• Decode "0101100"
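The encode and decode steps with this code table can be sketched as follows (the helper functions are illustrative; the table is the slide's):

```python
codes = {'A': '11', 'B': '00', 'C': '010', 'D': '011', 'E': '10'}
inverse = {c: s for s, c in codes.items()}

def encode(msg):
    return ''.join(codes[s] for s in msg)

def decode(bits):
    # The prefix-free property lets us decode greedily, bit by bit:
    # the moment the buffer matches a codeword, it is the only match.
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)

encoded = encode("BEAD")     # 001011011, as on the slide
decoded = decode("0101100")  # the slide's decoding exercise
```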

**Entropy (Theoretical Limit)**

H = −Σ_{i=1}^{N} p(s_i) log₂ p(s_i)

Symbol  P (S)  Code
A       0.25   11
B       0.30   00
C       0.12   010
D       0.15   011
E       0.18   10

H = −.25·log₂ .25 + −.30·log₂ .30 + −.12·log₂ .12 + −.15·log₂ .15 + −.18·log₂ .18
H = 2.24 bits

**Average Codeword Length**

L = Σ_{i=1}^{N} p(s_i)·codelength(s_i)

Symbol  P (S)  Code
A       0.25   11
B       0.30   00
C       0.12   010
D       0.15   011
E       0.18   10

L = .25(2) + .30(2) + .12(3) + .15(3) + .18(2)
L = 2.27 bits
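Both numbers can be verified straight from the table. A quick sketch (variable names are ours):

```python
import math

# (probability, codeword) per symbol, from the slides' table
table = {'A': (0.25, '11'), 'B': (0.30, '00'), 'C': (0.12, '010'),
         'D': (0.15, '011'), 'E': (0.18, '10')}

H = -sum(p * math.log2(p) for p, _ in table.values())  # rounds to 2.24 bits
L = sum(p * len(code) for p, code in table.values())   # 2.27 bits
```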

**Code Length Relative to Entropy**

L = Σ_{i=1}^{N} p(s_i)·codelength(s_i)    H = −Σ_{i=1}^{N} p(s_i) log₂ p(s_i)

• Huffman reaches the entropy limit when all probabilities are negative powers of 2
  • i.e. 1/2, 1/4, 1/8, 1/16, etc.
• H <= Code Length <= H + 1

**Example**

Symbol  P (S)  Code
A       0.01   1
B       0.99   0

H = −.01·log₂ .01 + −.99·log₂ .99 = .08
L = .01(1) + .99(1) = 1

**Exercise**

• Compute Entropy (H)
• Build Huffman tree
• Compute average code length
• Code "BCCADE"

Symbol (S)   A    B    C    D    E
P (S)        0.4  0.2  0.2  0.1  0.1

**Solution**

• Compute Entropy (H)
  • H = 2.12 bits
• Build Huffman tree
• Compute average code length
  • L = 2.2 bits

Symbol  P(S)  Code
A       0.4   0
B       0.2   100
C       0.2   101
D       0.1   110
E       0.1   111

• Code "BCCADE" => 1001011010110111

**Limitations**

• Diverges from lower limit when probability of a particular symbol becomes high
  • always uses an integral number of bits
• Must send code book with the data
  • lowers overall efficiency
• Must determine frequency distribution
  • must remain stable over the data set

**Error detection and correction**

• Error detection is the ability to detect errors that are made due to noise or other impairments during transmission from the transmitter to the receiver
• Error correction has the additional feature that enables localization of the errors and correcting them
• Error detection always precedes error correction
• (more next week)


