Abridged, annotated notes on Chapter 5 of the communications systems textbook.


5.1.1 Sample Space, Events, and Probability

The fundamental concept in any probabilistic model is the concept of a random experiment, which is any experiment whose outcome cannot be predicted with certainty. A random experiment has certain outcomes, which are the elementary results of the experiment. In flipping a coin, head and tail are the possible outcomes. In throwing a die, 1, 2, 3, 4, 5, and 6 are the possible outcomes. The set of all possible outcomes is called the sample space and is denoted by Ω. Outcomes are denoted by s, and each outcome lies in Ω, i.e., s ∈ Ω.

Events are subsets of the sample space; in other words, an event is a collection of outcomes. For instance, in throwing a die, the event "the outcome is odd" consists of the outcomes 1, 3, and 5. Events are disjoint if their intersection is empty. For instance, in throwing a die, the events "the outcome is odd" and "the outcome is divisible by 4" are disjoint.

A probability P is a rule that assigns a number P(E) to events E such that the following conditions are satisfied:

1. P(E) ≥ 0 for every event E.
2. P(Ω) = 1.
3. If E1, E2, E3, . . . are disjoint events, then P(E1 ∪ E2 ∪ E3 ∪ · · ·) = P(E1) + P(E2) + P(E3) + · · ·.

The conditional probability of the event E1, given the event E2, is defined by

P(E1 | E2) = P(E1 ∩ E2) / P(E2),

provided that P(E2) > 0. If knowledge of E2 does not change the probability of E1, i.e., P(E1 | E2) = P(E1), the events E1 and E2 are said to be independent. For independent events, P(E1 ∩ E2) = P(E1)P(E2).
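As a brief, illustrative check (not part of the original text; it assumes Python with only the standard library), the die experiment above can be used to verify the conditional-probability and independence definitions numerically:

from fractions import Fraction

# Fair die: sample space Omega = {1, ..., 6} with a uniform probability rule.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    return Fraction(len(event & omega), len(omega))

odd = {1, 3, 5}      # the event "the outcome is odd"
div4 = {4}           # the event "the outcome is divisible by 4"
low = {1, 2, 3}      # the event "the outcome is at most 3"

def cond_prob(e1, e2):
    # P(E1 | E2) = P(E1 and E2) / P(E2), assuming P(E2) > 0.
    return prob(e1 & e2) / prob(e2)

print(prob(odd & div4))     # 0: the two events are disjoint
print(cond_prob(odd, low))  # 2/3, which differs from P(odd) = 1/2, so these events are dependent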

A random variable is a mapping from the sample space to the set of real numbers. In other words, a random variable is an assignment of real numbers to the outcomes of a random experiment. A schematic diagram representing a random variable is given in Figure 5.1.

Figure 5.1 A random variable as a mapping from Ω to R.

The cumulative distribution function (CDF) of a random variable X is defined as FX(x) = P(X ≤ x). A random variable that takes values from a finite or countable set is called a discrete random variable; a random variable whose CDF is continuous is called a continuous random variable.

The probability density function (PDF) of a continuous random variable is defined as the derivative of its CDF. It is denoted by fX(x), i.e.,

fX(x) = d FX(x) / dx.

Two commonly encountered random variables are the following:

Uniform random variable. This is a continuous random variable taking values between a and b with equal probabilities for intervals of equal length. The density function is given by

fX(x) = 1/(b − a) for a < x < b, and fX(x) = 0 otherwise.

Gaussian or normal random variable. The Gaussian, or normal, random variable is a continuous random variable described by the density function

fX(x) = (1/(√(2π) σ)) exp(−(x − m)²/(2σ²)).

A Gaussian random variable with mean m and variance σ² is denoted by N(m, σ²). The random variable N(0, 1) is usually called standard normal.

The Gaussian random variable is the most important and frequently encountered

random variable in communications. The reason is that thermal noise, which is the

major source of noise in communication systems, has a Gaussian distribution.

Assuming that X is a standard normal random variable, we define the function Q(x) as P(X > x). The Q-function is given by the relation

Q(x) = ∫_{x}^{∞} (1/√(2π)) e^(−t²/2) dt.

Geometrically, Q(x) is the area under the tail of the PDF of a standard normal random variable to the right of x.

Table 5.1 gives the values of this function for various values of x.
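As an illustrative sketch (assuming Python with NumPy and SciPy available, which the text does not mention), Q(x) can be evaluated numerically through the complementary error function, since Q(x) = (1/2) erfc(x/√2); the printed values can be compared against Table 5.1:

import numpy as np
from scipy.special import erfc

def q_func(x):
    # Tail probability of a standard normal random variable: Q(x) = P(X > x).
    return 0.5 * erfc(x / np.sqrt(2.0))

for x in (0.0, 1.0, 2.0, 3.0):
    print(x, q_func(x))   # approximately 0.5, 0.1587, 0.0228, 0.00135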

For an N(m, σ²) random variable, a simple change of variable in the integral that defines P(X > x) shows that

P(X > x) = Q((x − m)/σ).

Example 5.1.6

X is a Gaussian random variable with mean 1 and variance 4. Find the probability that

X is between 5 and 7.

Solution We have m = 1 and σ = √4 = 2. Thus,

P(5 < X < 7) = P(X > 5) − P(X > 7) = Q((5 − 1)/2) − Q((7 − 1)/2) = Q(2) − Q(3) ≈ 0.0228 − 0.0013 ≈ 0.0214.
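A quick numerical cross-check of this example (a sketch assuming SciPy is available; not part of the original solution):

from scipy.stats import norm

m, sigma = 1.0, 2.0   # X ~ N(1, 4), so the standard deviation is 2
p = norm.cdf(7, loc=m, scale=sigma) - norm.cdf(5, loc=m, scale=sigma)
print(p)              # approximately 0.0214, i.e., Q(2) - Q(3)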

Example 5.1.7

Assuming X is a Gaussian random variable with m = 0 and σ = 1, find the probability density function of the random variable Y given by Y = aX + b.

Solution In this case, g(x) = ax + b; therefore, g′(x) = a. The equation ax + b = y has only one solution, which is given by x1 = (y − b)/a. Using these results, we obtain

fY(y) = fX((y − b)/a)/|a| = (1/(√(2π)|a|)) exp(−(y − b)²/(2a²)).

In other words, Y is a Gaussian random variable N(b, a²).

Example 5.1.8

Assume X is an N(3, 6) random variable. Find the density function of Y = −2X + 3.

Solution We know Y is a Gaussian random variable with the mean m = −2 · 3 + 3 = −3 and variance σ² = 4 · 6 = 24. Therefore, Y is an N(−3, 24) random variable and

fY(y) = (1/√(48π)) exp(−(y + 3)²/48).

From this example, we arrive at the important conclusion that a linear function of a

Gaussian random variable is itself a Gaussian random variable.
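As a hedged numerical illustration of this conclusion (assuming NumPy; the sample size is an arbitrary choice), a Monte Carlo estimate of the mean and variance of Y = −2X + 3 from Example 5.1.8 can be compared with −3 and 24:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=np.sqrt(6.0), size=200_000)   # X ~ N(3, 6)
y = -2.0 * x + 3.0                                          # Y = -2X + 3

print(y.mean(), y.var())   # approximately -3 and 24, consistent with Y ~ N(-3, 24)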

Statistical Averages. The mean, expected value, or expectation of the random variable X is defined as

E(X) = ∫_{−∞}^{∞} x fX(x) dx.

Note that E(X) is just a real number. In general, the nth moment of a random variable X is defined as

E(Xⁿ) = ∫_{−∞}^{∞} xⁿ fX(x) dx.

The variance of a random variable X is defined as Var(X) = E[(X − E(X))²] = E(X²) − (E(X))². It is also easy to verify that the variance has the following properties (c denotes a constant):

1. Var(c) = 0.
2. Var(X + c) = Var(X).
3. Var(cX) = c² Var(X).

These properties are checked numerically in the sketch below.
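A minimal numerical check (a sketch assuming NumPy; the uniform random variable on (2, 6) and the constants are arbitrary choices, and the reference values (a + b)/2 and (b − a)²/12 are the standard mean and variance of a uniform random variable, not stated in this excerpt):

import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(2.0, 6.0, size=500_000)   # uniform on (a, b) = (2, 6)

print(x.mean(), x.var())            # approximately (a + b)/2 = 4 and (b - a)**2 / 12 = 4/3
print(np.var(5.0 * x) / x.var())    # approximately 25 = c**2, i.e., Var(cX) = c**2 Var(X)
print(np.var(x + 7.0))              # approximately Var(X): adding a constant leaves the variance unchanged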

Example 5.2.1

Assume that we have a signal generator that can generate one of the six possible

sinusoids. The amplitude of all sinusoids is one, and the phase for all of them is zero,

but the frequencies can be 100, 200, . . . , 600 Hz. We throw a die, and depending on its

outcome, which we denote by F, we generate a sinusoid whose frequency is 100 times

what the die shows (100F). This means that each of the six possible signals will be

realized with equal probability. This is an example of a random process. This random

process can be defined as X(t) = cos(2π · 100Ft).
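A small sketch of this random process (assuming NumPy; the time grid is an arbitrary choice) generates the six possible sample functions, one per die outcome:

import numpy as np

t = np.linspace(0.0, 0.02, 1000)                  # 20 ms of each sample function
for f_die in range(1, 7):                         # die outcome F = 1, ..., 6
    x_t = np.cos(2 * np.pi * 100 * f_die * t)     # sample function x(t; F) = cos(2*pi*100*F*t)
    print(f_die, x_t[:3])                         # inspect the first few samples of each realization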

Example 5.2.2

Assume that we uniformly choose a phase Θ between 0 and 2π and generate a sinusoid with a fixed amplitude and frequency but with a random phase Θ. In this case, the random process is X(t) = A cos(2πf0t + Θ), where A and f0 denote the fixed amplitude and frequency and Θ denotes the random phase. Some sample functions for this random process are shown in Figure 5.11.


Figure 5.11 Sample functions of the random process given in Example 5.2.2.

Corresponding to each outcome ωi of the random experiment underlying a random process, there exists a deterministic time function x(t; ωi), which is called a sample function or a realization of the random process. At each time instant t0 and for each ωi, we have the number x(t0; ωi). For the different outcomes (ωi's) at a fixed time t0, the numbers x(t0; ωi) constitute a random variable denoted by X(t0). In other words, at any time instant, the value of a random process is a random variable.

Example 5.2.4: In Example 5.2.1, determine the values of the random variable X(0.001).

Solution The possible values are cos(0.2π), cos(0.4π), . . . , cos(1.2π), and each has probability 1/6.

Example 5.2.5: Let Ω denote the sample space corresponding to the random experiment of throwing a die. Obviously, in this case Ω = {1, 2, 3, 4, 5, 6}. For all ωi, let x(t; ωi) = ωi e^(−t) u−1(t) (where u−1(t) denotes the unit step) define a random process. Then X(1) is a random variable taking the values e^(−1), 2e^(−1), . . . , 6e^(−1), and each has probability 1/6. Sample functions of this random process are shown in Figure 5.14.

Figure 5.14 Sample functions of the random process given in Example 5.2.5.

We know that, at any time instant t0, the random process at that time, i.e., X(t0), is an ordinary random variable; it has a density function, and we can find its mean and its

variance at that point. Obviously, both mean and variance are ordinary deterministic

numbers, but they depend on the time t0. That is, at time t1, the density function, and

thus the mean and the variance, of X(t1) will generally be different from those of X(t0).

Definition 5.2.1. The mean, or expectation, of the random process X(t) is a

deterministic function of time denoted by mX(t) that at each time instant t0 equals the

mean of the random variable X(t0). That is, mX(t) = E[X(t)] for all t.

Since at any t0 the random variable X(t0) is well defined with a probability density function fX(t0)(x), we have

mX(t0) = E[X(t0)] = ∫_{−∞}^{∞} x fX(t0)(x) dx.

Example 5.2.7

The mean of the random process in Example 5.2.2 is obtained by noting that Θ is uniformly distributed on [0, 2π], so fΘ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π and zero otherwise. Hence,

mX(t) = E[A cos(2πf0t + Θ)] = ∫_{0}^{2π} A cos(2πf0t + θ) (1/(2π)) dθ = 0,

i.e., the mean is independent of t.

Another statistical average that plays a very important role in our study of

random processes is the autocorrelation function. The autocorrelation function

is especially important because it completely describes the power spectral

density and the power content of a large class of random processes.

The autocorrelation function of the random process X(t), denoted by RX(t1, t2), is defined by RX(t1, t2) = E[X(t1)X(t2)]. From this definition, it is clear that RX(t1, t2) is a deterministic function of two variables t1 and t2, given by

RX(t1, t2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 fX(t1),X(t2)(x1, x2) dx1 dx2.

Example 5.2.8

The autocorrelation function of the random process in Example 5.2.2 is

RX(t1, t2) = E[A cos(2πf0t1 + Θ) A cos(2πf0t2 + Θ)] = (A²/2) cos(2πf0(t1 − t2)).
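As an illustrative Monte Carlo check of Examples 5.2.7 and 5.2.8 (assuming NumPy; A, f0, t1, and t2 are arbitrary choices), averaging over many independently drawn phases should reproduce mX = 0 and RX(t1, t2) = (A²/2) cos(2πf0(t1 − t2)):

import numpy as np

rng = np.random.default_rng(2)
A, f0 = 1.0, 50.0
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)   # random phase, uniform on [0, 2*pi)

t1, t2 = 0.013, 0.020
x1 = A * np.cos(2 * np.pi * f0 * t1 + theta)
x2 = A * np.cos(2 * np.pi * f0 * t2 + theta)

print(x1.mean())                                        # approximately 0 = m_X
print((x1 * x2).mean())                                 # sample estimate of R_X(t1, t2)
print(0.5 * A**2 * np.cos(2 * np.pi * f0 * (t1 - t2)))  # theoretical value for comparison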

It can happen that some of the properties of this random variable are independent of time. Depending on which properties are independent of time, different notions of stationarity can be defined. One of the most useful notions of stationarity is the notion of wide-sense stationary random processes.

Definition 5.2.3. A process X(t) is wide-sense stationary (WSS) if the following

conditions are satisfied:

1. mX(t) = E[X(t)] is independent of t .

2. RX(t1, t2) depends only on the time difference τ = t1 − t2 and not on t1 and t2

individually.

Hereafter, we will use the term stationary as a shorthand for wide-sense stationary

processes, and their mean and autocorrelation will be denoted by mX and RX(τ).

Example 5.2.10

For the random process in Example 5.2.2, we have already seen that mX = 0 and RX(t1, t2) = (A²/2) cos(2πf0(t1 − t2)), which depends only on τ = t1 − t2. Therefore, the process is WSS.

Example 5.2.11

Let the random process Y(t) be similar to the random process X(t) defined in Example 5.2.2, but assume that Θ is uniformly distributed between 0 and π. In this case,

mY(t) = E[A cos(2πf0t + Θ)] = ∫_{0}^{π} A cos(2πf0t + θ) (1/π) dθ = −(2A/π) sin(2πf0t),

which depends on t; therefore, Y(t) is not wide-sense stationary.

From the definition of the autocorrelation function, it follows that RX(t1, t2) = RX(t2, t1). This means that if the process is stationary, we have RX(τ) = RX(−τ); i.e., the autocorrelation function is an even function for stationary processes.

We are assuming that a stationary process X(t) is the input to a linear time-invariant system with the impulse response h(t) and the output process is denoted by Y(t), as shown in Figure 5.16.

We next demonstrate that if a stationary process X(t) with mean mX and autocorrelation function RX(τ) is passed through a linear time-invariant system with impulse response h(t), the input and output processes X(t) and Y(t) will be jointly stationary with

mY = mX ∫_{−∞}^{∞} h(t) dt,

RXY(τ) = RX(τ) ⋆ h(−τ),

RY(τ) = RX(τ) ⋆ h(τ) ⋆ h(−τ).

By using the convolution integral to relate the output Y(t) to the input X(t), i.e., Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du, we have

mY(t) = E[Y(t)] = ∫_{−∞}^{∞} h(u) E[X(t − u)] du = mX ∫_{−∞}^{∞} h(u) du,

which is independent of t. The cross-correlation function between the output and the input is

RXY(t1, t2) = E[X(t1)Y(t2)] = ∫_{−∞}^{∞} h(u) RX(t1 − t2 + u) du = RX(τ) ⋆ h(−τ), where τ = t1 − t2.


5.2.5 Power Spectral Density of Stationary Processes

A useful function that determines the distribution of the power of the random process at

different frequencies is the power spectral density or power spectrum of the random

process. The power spectral density of a random process X(t) is denoted by SX(f) and describes the strength of the power in the random process as a function of frequency. The unit for power spectral density is watts/Hz.

Theorem [Wiener-Khinchin] For a stationary random process X(t), the power spectral density is the Fourier transform of the autocorrelation function, i.e.,

SX(f) = F[RX(τ)] = ∫_{−∞}^{∞} RX(τ) e^(−j2πfτ) dτ.

Example 5.2.15

For the stationary random process in Example 5.2.2, we had RX(τ) = (A²/2) cos(2πf0τ). Hence,

SX(f) = F[(A²/2) cos(2πf0τ)] = (A²/4) [δ(f − f0) + δ(f + f0)].

The power spectral density is shown in Figure 5.17. All the power content of the process is located at f0 and −f0. This is expected because the sample functions of this process are sinusoids with their power at those frequencies.

Figure 5.17 The power spectral density of the random process of Example 5.2.15.
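As a rough numerical illustration (assuming NumPy and SciPy; A, f0, the sampling rate, and the record length are arbitrary, and the time-averaged periodogram of a single realization is used as a stand-in for the power spectral density, an assumption not discussed in this excerpt), the estimated spectrum of one realization of Example 5.2.2 concentrates its power around f0, and its integral is close to the total power A²/2:

import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
A, f0, fs = 1.0, 50.0, 1000.0
t = np.arange(0, 20.0, 1.0 / fs)
x = A * np.cos(2 * np.pi * f0 * t + rng.uniform(0, 2 * np.pi))   # one realization of X(t)

f, s_x = welch(x, fs=fs, nperseg=4096)      # one-sided spectral estimate
print(f[np.argmax(s_x)])                    # close to f0 (the power at -f0 is folded onto +f0)
print(s_x.sum() * (f[1] - f[0]))            # close to A**2 / 2, the total power of the process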


The power content, or simply the power, of a random process is the sum of the powers at all frequencies in that random process. In order to find the total power, we have to integrate the power spectral density over all frequencies:

PX = ∫_{−∞}^{∞} SX(f) df.

Since SX(f) is the Fourier transform of RX(τ), RX(τ) will be the inverse Fourier transform of SX(f). Therefore, we can write

RX(τ) = ∫_{−∞}^{∞} SX(f) e^(j2πfτ) df.

Substituting τ = 0 into this relation gives PX = RX(0).

Example 5.2.17

Find the power in the process given in Example 5.2.15.

Solution We can use either the relation

PX = ∫_{−∞}^{∞} SX(f) df = ∫_{−∞}^{∞} (A²/4)[δ(f − f0) + δ(f + f0)] df = A²/4 + A²/4 = A²/2,

or the relation

PX = RX(0) = (A²/2) cos(0) = A²/2.

Power Spectra in LTI Systems. We have seen that when a stationary random process with mean mX and autocorrelation function RX(τ) passes through a linear time-invariant system with the impulse response h(t), the output process will also be stationary with mean

mY = mX ∫_{−∞}^{∞} h(t) dt

and autocorrelation

RY(τ) = RX(τ) ⋆ h(τ) ⋆ h(−τ),

together with the cross-correlation RXY(τ) = RX(τ) ⋆ h(−τ). Noting that the Fourier transform of h(τ) is H(f), that the Fourier transform of h(−τ) is H*(f) for real h(t), and that H(f)H*(f) = |H(f)|², we can compute the Fourier transform of both sides of these relations to obtain

SY(f) = SX(f) |H(f)|².

Let us define the cross spectral density SXY(f) as the Fourier transform of the cross-correlation, i.e., SXY(f) = F[RXY(τ)]. Then

SXY(f) = SX(f) H*(f).

Example 5.2.18

If the process in Example 5.2.2 passes through a differentiator, we have H(f) = j2πf; therefore,

SY(f) = |j2πf|² SX(f) = 4π²f² (A²/4)[δ(f − f0) + δ(f + f0)] = π²f0²A² [δ(f − f0) + δ(f + f0)],

and

SXY(f) = SX(f) H*(f) = −j2πf (A²/4)[δ(f − f0) + δ(f + f0)] = −j(πA²f0/2)[δ(f − f0) − δ(f + f0)].


Figure 5.18 Input-output relations for the power spectral density and the cross spectral density.


5.3.1 Gaussian Processes

Assume that X(t) is a stationary Gaussian random process. We know that at any time instant t0, the random variable X(t0) is Gaussian, and at any two time instants t1, t2, the random variables (X(t1), X(t2)) are distributed according to a two-dimensional jointly Gaussian distribution.

Example 5.3.1

Let X(t) be a zero-mean stationary Gaussian random process with a given power spectral density SX(f). Determine the probability density function of the random variable X(3).

Solution Since X(t) is a Gaussian random process, the probability density function of the random variable X(t) at any value of t is Gaussian. Therefore, X(3) ∼ N(m, σ²). Now we need to find m and σ². Since the process is zero mean, at any time instant t we have E[X(t)] = 0; this means m = E[X(3)] = 0. To find the variance, we note that

σ² = Var[X(3)] = E[X²(3)] = E[X(3)X(3)] = RX(3 − 3) = RX(0),

where, in the last step, we have used the fact that the process is stationary and hence E[X(t1)X(t2)] = RX(t1 − t2). But from Equation (5.2.12), we have

RX(0) = ∫_{−∞}^{∞} SX(f) df,

so the variance equals the total power of the process.


The term white process is used to denote processes in which all frequency components

appear with equal power, i.e., the power spectral density is a constant for all

frequencies. This parallels the notion of white light, in which all colors exist.

Definition 5.3.3. A process X(t) is called a white process if it has a flat spectral density,

i.e., if SX(f ) is a constant for all f .

In practice, the importance of white processes stems from the fact that thermal noise

can be closely modeled as a white process over a wide range of frequencies. Figure 5.19

shows the power spectrum of a white process.

Figure 5.19 Power spectrum of a white process.

Figure 5.20 Power spectrum of thermal noise.

If we find the power content of a white process using SX(f) = C, a constant, we will have

PX = ∫_{−∞}^{∞} C df = ∞.

Strictly speaking, therefore, a white process has infinite power and is only an idealization; thermal noise, however, is closely modeled as white over the range of frequencies of interest, with power spectral density Sn(f) = kT/2, where k is Boltzmann's constant (equal to 1.38 × 10⁻²³ J/K) and T denotes the temperature in kelvins. The value kT is usually denoted by N0; therefore, the power spectral density of thermal noise is usually written as Sn(f) = N0/2.

Looking at the autocorrelation function for a white process, we see that it is the inverse Fourier transform of a constant, i.e.,

RX(τ) = C δ(τ).

This shows that for all τ ≠ 0, we have RX(τ) = 0. Thus, if we sample a white process at two points t1 and t2 (t1 ≠ t2), the resulting random variables will be uncorrelated.

If the random process is white and also Gaussian, any pair of random variables X(t1), X(t2), where t1 ≠ t2, will also be independent (recall that for jointly Gaussian random variables, uncorrelatedness and independence are equivalent).
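A small numerical illustration of this uncorrelatedness (assuming NumPy; a discrete-time white Gaussian noise sequence stands in for the continuous-time process): the sample autocorrelation is large at lag 0 and near zero at every other lag:

import numpy as np

rng = np.random.default_rng(4)
n = rng.normal(0.0, 1.0, size=100_000)     # discrete-time white Gaussian noise

for lag in (0, 1, 5, 50):
    r = np.mean(n * n) if lag == 0 else np.mean(n[:-lag] * n[lag:])
    print(lag, r)                          # roughly 1 at lag 0, near 0 otherwise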


Properties of the Thermal Noise. The thermal noise that we will use in

subsequent chapters is assumed to have the following properties:

1. Thermal noise is a stationary process.

2. Thermal noise is a zero-mean process.

3. Thermal noise is a Gaussian process.

It is clear that the power spectral density of thermal noise increases as the ambient temperature increases; therefore, keeping electric circuits cool keeps their noise level low.
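As a numeric illustration of the kT relation (a sketch in Python; the reference temperature of 290 K is an assumed value, not taken from this excerpt):

import math

k = 1.38e-23                        # Boltzmann's constant, J/K
T = 290.0                           # an assumed ambient temperature, in kelvins
N0 = k * T                          # approximately 4.0e-21 W/Hz
print(N0)
print(10 * math.log10(N0 / 1e-3))   # approximately -174 dBm/Hz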

5.3.3 Filtered Noise Processes

In many cases, the white noise generated in one stage of the system is filtered by the next stage; therefore, in the following stage we encounter filtered noise, which is a bandpass process. Assume that a white noise process n(t) is filtered by an ideal bandpass filter with bandwidth W and center frequency fc, and denote the filter output by X(t), so that X(f) = H(f)N(f).

For example, one such filter can have a transfer function of the form

H(f) = 1 for |f − fc| ≤ W/2 or |f + fc| ≤ W/2, and H(f) = 0 otherwise.

Since thermal noise is white and Gaussian, the filtered thermal noise will be Gaussian but not white. The power spectral density of the filtered noise will be

SX(f) = Sn(f) |H(f)|² = (N0/2) |H(f)|² = (N0/2) H(f),

where we have used the fact that for ideal filters |H(f)|² = H(f). In other words, the filtered noise has power spectral density N0/2 inside the passband of the filter and zero outside it.
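As a rough simulation sketch of this result (assuming NumPy and SciPy; the sampling rate, center frequency, bandwidth, and the FIR approximation of the ideal bandpass filter are all illustrative choices), discrete-time white Gaussian noise is bandpass filtered and its estimated power spectral density comes out roughly flat inside the passband and much smaller outside it:

import numpy as np
from scipy.signal import firwin, lfilter, welch

rng = np.random.default_rng(5)
fs, fc, W = 1000.0, 200.0, 50.0                  # assumed sampling rate, center frequency, bandwidth
n = rng.normal(0.0, 1.0, size=200_000)           # white Gaussian noise input n(t)

h = firwin(401, [fc - W / 2, fc + W / 2], pass_zero=False, fs=fs)   # bandpass filter approximation
x = lfilter(h, 1.0, n)                           # filtered noise X(t)

f, s_x = welch(x, fs=fs, nperseg=4096)
print(s_x[(f > fc - W / 4) & (f < fc + W / 4)].mean())   # roughly the flat in-band level
print(s_x[f < fc - W].mean())                            # much smaller outside the passband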

