How fast can we transmit information over a communication channel? Suppose a source sends r messages per second, and the entropy of a message is H bits per message. The information rate is R = rH bits/second. One might intuitively expect that, for a given communication system, the number of errors per second increases as the information rate increases. Surprisingly, however, this is not the case.

Shannon's theorem: a given communication system has a maximum rate of information C, known as the channel capacity. If the information rate R is less than C, then one can achieve arbitrarily small error probabilities by using intelligent coding techniques. To obtain lower error probabilities, the encoder has to work on longer blocks of signal data, which entails longer delays and higher computational requirements. Thus, if R ≤ C, transmission may be accomplished without error in the presence of noise. Unfortunately, Shannon's theorem is not a constructive proof: it merely states that such a coding method exists, so the proof cannot be used to develop a coding method that reaches the channel capacity. The converse of this theorem is also true: if R > C, then errors cannot be avoided regardless of the coding technique used.
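As a quick numerical sketch of the definition R = rH (the source rate and alphabet size below are made-up values, not from the notes):

```python
import math

def information_rate(r, H):
    """Information rate R = r*H in bits/second, for a source emitting
    r messages per second, each carrying H bits of entropy."""
    return r * H

# Hypothetical source: 1000 equally likely 8-level messages per second,
# so H = log2(8) = 3 bits/message.
H = math.log2(8)
R = information_rate(1000, H)
print(R)  # 3000.0 bits/second
```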

1 Shannon-Hartley theorem

Consider a bandlimited Gaussian channel operating in the presence of additive Gaussian noise:

[Figure: channel model — white Gaussian noise is added to the input signal, which then passes through an ideal BPF to the output.]

The Shannon-Hartley theorem states that the channel capacity is given by
\[
C = B \log_2 \left( 1 + \frac{S}{N} \right)
\]
where C is the capacity in bits per second, B is the bandwidth of the channel in hertz, and S/N is the signal-to-noise ratio. We cannot prove the theorem, but can partially justify it as follows: suppose the received signal is accompanied by noise with an RMS voltage of $\sigma$, and that the signal has been quantised with levels separated by $a = \lambda\sigma$. If $\lambda$ is chosen sufficiently large, we may expect to be able to recognise the signal level with an acceptable probability of error. Suppose further that each message is to be represented by one voltage level. If there are to be M possible messages, then there must be M levels. The average signal power is then
\[
S = \frac{M^2 - 1}{12} (\lambda\sigma)^2 .
\]
The number of levels for a given average signal power is therefore
\[
M = \left( 1 + \frac{12 S}{\lambda^2 N} \right)^{1/2}
\]
where $N = \sigma^2$ is the noise power. If each message is equally likely, then each carries an equal amount of information
\[
H = \log_2 M = \frac{1}{2} \log_2 \left( 1 + \frac{12 S}{\lambda^2 N} \right) \quad \text{bits/message}.
\]
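A minimal numeric sketch of this level-counting argument (the values of S, N, and the level-spacing factor, written `lam` for λ, are arbitrary):

```python
import math

def num_levels(S, N, lam):
    """Number of distinguishable levels: M = sqrt(1 + 12*S/(lam^2 * N))."""
    return math.sqrt(1 + 12 * S / (lam**2 * N))

def bits_per_message(S, N, lam):
    """H = log2(M) = 0.5 * log2(1 + 12*S/(lam^2 * N)) bits/message."""
    return 0.5 * math.log2(1 + 12 * S / (lam**2 * N))

# Arbitrary example: S/N = 63 with lam = sqrt(12), so 12*S/(lam^2*N) = S/N.
S, N, lam = 63.0, 1.0, math.sqrt(12)
print(num_levels(S, N, lam))        # 8.0 levels
print(bits_per_message(S, N, lam))  # 3.0 bits/message
```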

To find the information rate, we need to estimate how many messages can be carried per unit time by a signal on the channel. Since the discussion is heuristic, we note that the response of an ideal LPF of bandwidth B to a unit step has a 10-90 percent rise time of $\tau = 0.44/B$. We estimate therefore that with $T = 0.5/B$ we should be able to reliably estimate the level. The message rate is then
\[
r = \frac{1}{T} = 2B \quad \text{messages/s}.
\]
The rate at which information is being transferred across the channel is therefore
\[
R = rH = B \log_2 \left( 1 + \frac{12 S}{\lambda^2 N} \right).
\]
This is equivalent to the Shannon-Hartley theorem with $\lambda = \sqrt{12} \approx 3.5$. Note that this discussion has estimated the rate at which information can be transmitted with reasonably small error; the Shannon-Hartley theorem indicates that with sufficiently advanced coding techniques, transmission at channel capacity can occur with arbitrarily small error.

The expression for the channel capacity of the Gaussian channel makes intuitive sense. As the bandwidth of the channel increases, it is possible to make faster changes in the information signal, thereby increasing the information rate. As S/N increases, one can increase the information rate while still preventing errors due to noise. For no noise, $S/N \to \infty$ and an infinite information rate is possible irrespective of bandwidth.
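The claim that the heuristic rate coincides with the Shannon-Hartley capacity when the spacing factor is √12 can be checked numerically (the 4 kHz / S/N = 7 channel is just an example):

```python
import math

def shannon_capacity(B, snr):
    """Shannon-Hartley capacity C = B*log2(1 + S/N) in bits/second."""
    return B * math.log2(1 + snr)

def heuristic_rate(B, snr, lam):
    """Heuristic rate R = B*log2(1 + 12*(S/N)/lam^2) from the
    level-counting argument, with lam the level-spacing factor."""
    return B * math.log2(1 + 12 * snr / lam**2)

B, snr = 4000.0, 7.0  # example channel: 4 kHz bandwidth, S/N = 7
print(shannon_capacity(B, snr))               # 12000.0 bits/s
print(heuristic_rate(B, snr, math.sqrt(12)))  # 12000.0 bits/s: lam = sqrt(12) recovers C
```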

Thus we may trade off bandwidth for SNR. For example, if S/N = 7 and B = 4 kHz, then the channel capacity is $C = 12 \times 10^3$ bits/s. If the SNR increases to S/N = 15 and B is decreased to 3 kHz, the channel capacity remains the same. However, as $B \to \infty$, the channel capacity does not become infinite since, with an increase in bandwidth, the noise power also increases. If the noise power spectral density is $\eta/2$, then the total noise power is $N = \eta B$, so the Shannon-Hartley law becomes
\[
C = B \log_2 \left( 1 + \frac{S}{\eta B} \right) = \frac{S}{\eta} \frac{\eta B}{S} \log_2 \left( 1 + \frac{S}{\eta B} \right).
\]
Noting that
\[
\lim_{x \to 0} (1 + x)^{1/x} = e
\]
and identifying x as $x = S/(\eta B)$, the channel capacity as B increases without bound becomes
\[
C_\infty = \lim_{B \to \infty} C = \frac{S}{\eta} \log_2 e = 1.44 \frac{S}{\eta}.
\]
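Both the bandwidth-SNR trade-off and the infinite-bandwidth limit are easy to verify numerically (S and η are set to 1 purely for illustration):

```python
import math

def shannon_capacity(B, snr):
    """C = B*log2(1 + S/N) in bits/second."""
    return B * math.log2(1 + snr)

# Trading bandwidth for SNR: both channels have the same capacity.
print(shannon_capacity(4000, 7))   # 12000.0 bits/s
print(shannon_capacity(3000, 15))  # 12000.0 bits/s

# With N = eta*B, the capacity saturates as B grows, approaching
# (S/eta)*log2(e) = 1.44*S/eta rather than diverging.
S, eta = 1.0, 1.0
for B in (1e3, 1e6, 1e9):
    print(B * math.log2(1 + S / (eta * B)))
print((S / eta) * math.log2(math.e))  # limiting value, about 1.4427
```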

This gives the maximum information transmission rate possible for a system of given power but no bandwidth limitations. The power spectral density can be specified in terms of an equivalent noise temperature by $\eta = k T_{eq}$.

There are literally dozens of coding techniques: entire textbooks are devoted to the subject, and it remains an active research area. Obviously all of them obey the Shannon-Hartley theorem. Some general characteristics of the Gaussian channel can be demonstrated. Suppose we are sending binary digits at a transmission rate equal to the channel capacity: R = C. If the average signal power is S, then the average energy per bit is $E_b = S/C$, since the bit duration is 1/C seconds.

With $N = \eta B$, we can therefore write
\[
\frac{C}{B} = \log_2 \left( 1 + \frac{E_b}{\eta} \frac{C}{B} \right).
\]
Rearranging, we find that
\[
\frac{E_b}{\eta} = \frac{B}{C} \left( 2^{C/B} - 1 \right).
\]
This relationship is plotted below.
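The rearranged relation can be evaluated directly; letting the spectral efficiency C/B shrink toward zero shows the required $E_b/\eta$ approaching its asymptote:

```python
import math

def ebno(spectral_efficiency):
    """Required Eb/eta = (B/C)*(2^(C/B) - 1), where spectral_efficiency
    is C/B in bits/s per Hz."""
    r = spectral_efficiency
    return (2**r - 1) / r

# As C/B -> 0, the required Eb/eta approaches ln(2) = 0.693...
for r in (2.0, 1.0, 0.1, 0.001):
    print(r, ebno(r))
print(10 * math.log10(math.log(2)))  # about -1.59 dB: the Shannon limit
```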

[Figure: B/C (Hz per bit/s) plotted against $E_b/\eta$ (dB), showing the region occupied by practical systems and a vertical asymptote at low $E_b/\eta$.]

The asymptote is at $E_b/\eta = -1.59$ dB; below this value there is no error-free communication at any information rate. This is called the Shannon limit.
