
Chapter 3: Discrete Memoryless Channel and Its Capacity

Vahid Meghdadi, University of Limoges
meghdadi@ensil.unilim.fr

Outline
- Discrete memoryless channel
- Transmission rate over a noisy channel: repetition code, transmission rate
- Capacity of DMC: capacity of a noisy channel, examples


The input of a DMC is a random variable (RV) X taking its values in a discrete finite set X; the output is a random variable Y. The cardinality of X is the number of points in the constellation used. In an ideal channel, the output is equal to the input. In a non-ideal channel, the output can be different from the input with a given probability.

Vahid Meghdadi, University of Limoges. Chapter 3: Discrete memoryless channel and its capacity.


All the transition probabilities from x_i to y_j are gathered in a transition matrix. The (i, j) entry of the matrix is P(Y = y_j | X = x_i), called the forward transition probability. In a DMC, the output of the channel at a given instant depends only on the input at the same instant, not on earlier or later inputs.


DMC model

[Figure: DMC with M inputs and M outputs; the edge from input i to output j is labeled p_ij.]

Define, for i, j in {1, ..., M}:

    p_i = P(X = i),   q_j = P(Y = j),   p_ij = P(Y = j | X = i).

Then the output distribution follows from the input distribution and the transition probabilities:

    q_j = Σ_{i=1}^{M} p_i p_ij


The last equation can be written in matrix form:

    [q_1]   [p_11 p_21 ... p_M1] [p_1]
    [q_2] = [p_12 p_22 ... p_M2] [p_2]
    [ : ]   [ :    :        :  ] [ : ]
    [q_M]   [p_1M p_2M ... p_MM] [p_M]

In addition,

    Σ_{j=1}^{M} p_ij = 1   and   p_e = Σ_{i=1}^{M} p_i Σ_{j=1, j≠i}^{M} p_ij.
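These relations are easy to check numerically. A minimal sketch (the 3-symbol alphabet and the particular matrix below are made up for illustration, not from the slides):

```python
import numpy as np

# Hypothetical 3-symbol DMC. Row i of P holds the forward transition
# probabilities p_ij = P(Y = j | X = i); each row sums to 1.
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

p = np.array([0.5, 0.3, 0.2])   # input distribution p_i = P(X = i)

q = p @ P                       # q_j = sum_i p_i * p_ij
print(q)                        # output distribution; sums to 1

# Average symbol error probability: p_e = sum_i p_i * sum_{j != i} p_ij
pe = sum(p[i] * (1 - P[i, i]) for i in range(len(p)))
print(pe)                       # equals 0.2 for this matrix
```

Note that `p @ P` computes the row vector q = p P, which is the same relation as the column-vector form above with the matrix transposed.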


Normally the sizes of the constellations at the input and at the output are the same, i.e., |X| = |Y|. In this case the receiver employs hard-decision decoding: the decoder makes a firm decision about the transmitted symbol. It is also possible that |X| ≠ |Y|; in this case the receiver employs soft-decision decoding.


Repetition code

Consider a binary channel with error probability p = 0.1 (figure not reproduced). Observing a one at the output, can we be sure what was sent? We can only say that X was one with probability 90%, i.e., p_e = 0.1. If instead we send each bit three times and make a majority decision at the output, the error probability becomes

    p_e = 1 - p_c = 1 - [(1-p)^3 + 3(1-p)^2 p] = 0.028

This means that by reducing the rate, we can make the error probability arbitrarily small.
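The majority-vote computation generalizes to any odd number of repetitions n; a small sketch (the function name `repetition_pe` is mine, not from the slides):

```python
from math import comb

def repetition_pe(p, n):
    # Majority vote over n repetitions (n odd) fails exactly when more
    # than n // 2 of the n transmitted copies are flipped.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

print(repetition_pe(0.1, 1))   # 0.1: no coding
print(repetition_pe(0.1, 3))   # 0.028, as computed above
print(repetition_pe(0.1, 5))   # ~0.00856: lower rate, fewer errors
```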


Transmission rate

- H(X) is the amount of information per symbol at the input of the channel.
- H(Y) is the amount of information per symbol at the output of the channel.
- H(X|Y) is the amount of uncertainty remaining about X once Y is known.

The information transmitted over the channel is

    I(X;Y) = H(X) - H(X|Y)   bits per channel use.

For an ideal channel, X = Y, so there is no uncertainty about X once we observe Y; all the information is transmitted on each channel use: I(X;Y) = H(X). If the channel is so noisy that X and Y are independent, the uncertainty about X remains the same whether or not Y is known, i.e., no information passes through the channel: I(X;Y) = 0.
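These quantities can be computed for any DMC from its transition matrix; a sketch using the equivalent form I(X;Y) = H(Y) - H(Y|X) (helper names are mine):

```python
import numpy as np

def entropy(dist):
    # Shannon entropy in bits; zero-probability entries contribute nothing.
    dist = np.asarray(dist, dtype=float)
    dist = dist[dist > 0]
    return -np.sum(dist * np.log2(dist))

def mutual_information(p_x, P):
    # I(X;Y) = H(Y) - H(Y|X), with H(Y|X) = sum_i p_i * H(row i of P)
    p_y = p_x @ P
    h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in P]))
    return entropy(p_y) - h_y_given_x

# Ideal binary channel (Y = X): all information gets through, I = H(X) = 1 bit
print(mutual_information(np.array([0.5, 0.5]), np.eye(2)))
# Useless channel (Y independent of X): nothing gets through, I = 0
print(mutual_information(np.array([0.5, 0.5]),
                         np.array([[0.5, 0.5], [0.5, 0.5]])))
```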


The question is: can we send information over a noisy channel without error, but at a nonzero rate? The answer is yes, and the limiting rate is given by Shannon.

Example: For the following channel (figure not reproduced), two bits are sent on each channel use. With a uniform input variable X, the input entropy is H(X) = 2 bits per symbol. There is no way to transmit 2 bits per channel use without error through this channel. However, it is possible to realize an ideal channel if we reduce the rate.


Transmission scheme

Continuing the previous example, we change the probability distribution of X to P(X = 0) = P(X = 2) = 0.5 and P(X = 1) = P(X = 3) = 0, so that H(X) = 1 bit per symbol. At the receiver, we can now detect the transmitted symbol exactly, so the channel has been transformed into an ideal channel with a nonzero rate. This strategy gives an information transmission rate of 1 bit per symbol. Is it the maximum rate? The answer is yes, because:

- The entropy of Y is at most 2 bits: H(Y) ≤ 2.
- The conditional entropy H(Y|X) is the uncertainty about Y given X. When X is known, there are only two possibilities for Y, giving H(Y|X) = 1.

So [H(Y) - H(Y|X)] ≤ 2 - 1 = 1: the information rate cannot be greater than 1 bit per channel use, and the proposed scheme is optimal.
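The slides' channel figure is not reproduced here. Assuming it is the usual "noisy typewriter"-style channel on four symbols (each input received as itself or as the next symbol, mod 4, with probability 1/2 each, which matches H(Y|X) = 1 above), the claim can be checked numerically:

```python
import numpy as np

def entropy(dist):
    # Shannon entropy in bits, ignoring zero-probability entries.
    dist = np.asarray(dist, dtype=float)
    dist = dist[dist > 0]
    return -np.sum(dist * np.log2(dist))

# Assumed channel: input i goes to output i or (i+1) mod 4, each w.p. 1/2.
P = np.zeros((4, 4))
for i in range(4):
    P[i, i] = 0.5
    P[i, (i + 1) % 4] = 0.5

def info_rate(p_x):
    # I(X;Y) = H(Y) - H(Y|X)
    p_y = p_x @ P
    h_y_given_x = np.sum(p_x * np.array([entropy(row) for row in P]))
    return entropy(p_y) - h_y_given_x

print(info_rate(np.array([0.25, 0.25, 0.25, 0.25])))  # uniform input: 1 bit
print(info_rate(np.array([0.5, 0.0, 0.5, 0.0])))      # inputs {0, 2}: also 1 bit
```

Both distributions reach 1 bit per channel use, but only the {0, 2} scheme makes the output sets disjoint, allowing error-free decoding.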


Definition

The channel capacity of a discrete memoryless channel is defined by:

    C = max_{p(x)} I(X;Y)
      = max_{p(x)} [H(Y) - H(Y|X)]
      = max_{p(x)} [H(X) - H(X|Y)]


[Figure: binary symmetric channel with inputs X ∈ {0, 1} and outputs Y ∈ {0, 1}; each bit is received correctly with probability 1 - p and flipped with probability p.]

Choose P_X(x) to maximize I(X;Y). Solution (bis): the channel can be modeled using a new binary RV Z with

    Z = 1 with probability p,   Z = 0 with probability 1 - p.

Therefore we can write Y = X ⊕ Z. When X is given, the information in Y is the same as the information in Z; therefore H(Y|X) = H(Z), and H(Z) = H(p, 1-p). Since H(Y) ≤ 1, we can write I(X;Y) ≤ 1 - H(Z). The maximum of this quantity is obtained when H(Y) = 1, i.e., when P_X(0) = P_X(1) = 0.5. The capacity is

    C = 1 - H(p, 1-p).
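A numerical check of the BSC capacity formula (helper names mine):

```python
from math import log2

def h2(p):
    # Binary entropy H(p, 1 - p) in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # C = 1 - H(p, 1 - p)
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0: noiseless channel, one full bit per use
print(bsc_capacity(0.5))   # 0.0: output independent of input, useless
print(bsc_capacity(0.1))   # ≈ 0.531 bits per channel use
```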


The binary erasure channel (BEC) is a channel in which some bits are lost (rather than corrupted) with erasure probability a; the receiver knows which bits have been erased. What is the capacity of the BEC?

    C = max_{p(x)} [H(Y) - H(Y|X)] = max_{p(x)} [H(Y) - H(a, 1-a)]

By symmetry, the maximizing input is P(X = 0) = P(X = 1) = 1/2. Then Y has three possible values, with probabilities P(Y = 0) = P(Y = 1) = (1-a)/2 and P(Y = e) = a. So we can write:

    C = H((1-a)/2, (1-a)/2, a) - H(a, 1-a) = 1 - a bits per channel use.
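A quick check that H((1-a)/2, (1-a)/2, a) - H(a, 1-a) does equal 1 - a (helper names mine):

```python
from math import log2

def entropy(*probs):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

def bec_capacity(a):
    # Uniform input: P(Y=0) = P(Y=1) = (1-a)/2, P(Y=e) = a,
    # and H(Y|X) = H(a, 1-a).
    return entropy((1 - a) / 2, (1 - a) / 2, a) - entropy(a, 1 - a)

for a in (0.0, 0.25, 0.5, 0.9):
    print(a, bec_capacity(a))   # matches 1 - a up to rounding
```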


Exercise

Another way to obtain the capacity of the BEC: define the RV E for the BEC as

    E = 1 if an erasure occurs (probability a),   E = 0 otherwise (probability 1 - a).

Because C = max [H(Y) - H(Y|X)] and H(Y|X) = H(a, 1-a), we calculate H(Y) using this new RV. By the chain rule, H(Y, E) = H(E) + H(Y|E) = H(Y) + H(E|Y). Since H(E|Y) = 0 (why?), we get H(Y) = H(E) + H(Y|E). Now H(E) = H(a, 1-a) and H(Y|E) = P(E = 0) H(Y|E = 0) + P(E = 1) H(Y|E = 1). On the other hand, H(Y|E = 1) = 0 and H(Y|E = 0) = H(X). So H(Y) = H(a, 1-a) + (1-a)H(X), and with the maximizing uniform input H(X) = 1. The capacity is therefore

    C = H(a, 1-a) + (1-a) - H(a, 1-a) = 1 - a.


How can we achieve this rate? Suppose a large packet of N symbols is transmitted and a return path is available. The receiver, knowing which symbols were not received, can ask the transmitter to retransmit the lost symbols. In such a block, only N - aN symbols reach the receiver on the first pass, so the rate 1 - a is achieved. Note: feedback does not increase the capacity of a DMC.
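The retransmission argument can be simulated; a sketch (the ARQ-style loop and the function name are my illustration, not from the slides):

```python
import random

def arq_rate(a, n=100_000, seed=1):
    # Each of n information symbols is sent over a BEC with erasure
    # probability a, and re-sent until it arrives (the feedback path
    # tells the sender which symbols were erased).  The achieved rate is
    # symbols delivered per channel use, which should approach 1 - a.
    rng = random.Random(seed)
    uses = 0
    for _ in range(n):
        while True:
            uses += 1
            if rng.random() >= a:   # this copy got through
                break
    return n / uses

print(arq_rate(0.25))   # close to 1 - 0.25 = 0.75
```

Each symbol needs on average 1/(1-a) channel uses, so the simulated rate converges to 1 - a, matching the BEC capacity.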
