
**EE department, USC, Fall 2014**

Instructor: Prof. Salman Avestimehr

Homework 3 Solutions

1. (Uncorrelated vs. Independent)

(a) We have seen that independence of two random variables X and Y implies that they are uncorrelated, but that the converse is not true. To see this, let Θ be uniformly distributed on [0, π] and let random variables X and Y be defined by X = cos Θ and Y = sin Θ. Show that X and Y are uncorrelated but they are not independent.

(b) Now show that if we make the stronger assumption that random variables X and Y have the property that for all functions g and h, the random variables g(X) and h(Y) are uncorrelated, then X and Y must be independent. That is, uncorrelatedness of the random variables themselves is not enough, but uncorrelatedness of all functions of the random variables is enough to ensure independence. For simplicity assume that X and Y are discrete random variables.

Hint: For all a and b, consider the indicator functions

$$g(x) = \mathbf{1}_a(x) = \begin{cases} 1 & x = a \\ 0 & x \neq a \end{cases}, \qquad h(y) = \mathbf{1}_b(y) = \begin{cases} 1 & y = b \\ 0 & y \neq b \end{cases}$$

Problem 1 Solution

(a) X and Y are uncorrelated if cov(X, Y) = 0.

$$E[X] = E[\cos\Theta] = \int_0^{\pi} \frac{1}{\pi}\cos\theta\, d\theta = 0, \qquad E[Y] = E[\sin\Theta] = \int_0^{\pi} \frac{1}{\pi}\sin\theta\, d\theta = \frac{2}{\pi}$$

$$\mathrm{cov}(X, Y) = E[XY] - E[X]E[Y] = \int_0^{\pi} \frac{1}{\pi}\sin\theta\cos\theta\, d\theta = \int_0^{\pi} \frac{1}{2\pi}\sin(2\theta)\, d\theta = 0 \tag{1}$$

Hence, X and Y are uncorrelated. Moreover, if you know X, you can exactly tell the value of Y; in other words, $Y = f(X) = \sqrt{1 - X^2}$. This is due to the fact that θ ∈ [0, π], so sin θ ≥ 0. Therefore, they are not independent.
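A quick Monte Carlo sanity check of part (a) (an illustrative sketch, not part of the graded solution; the seed and sample size are arbitrary choices):

```python
import random, math

# Sample Theta ~ Uniform[0, pi], form X = cos(Theta), Y = sin(Theta),
# and estimate E[X], E[Y], and cov(X, Y).
random.seed(0)
n = 200_000
xs, ys = [], []
for _ in range(n):
    theta = random.uniform(0.0, math.pi)
    xs.append(math.cos(theta))
    ys.append(math.sin(theta))

mx = sum(xs) / n
my = sum(ys) / n
cov = sum(x * y for x, y in zip(xs, ys)) / n - mx * my
print(f"E[X] ~ {mx:.4f}   (theory 0)")
print(f"E[Y] ~ {my:.4f}   (theory 2/pi ~ {2/math.pi:.4f})")
print(f"cov  ~ {cov:.4f}  (theory 0)")

# Dependence shows up through functions of X and Y: since Y^2 = 1 - X^2,
# cov(X^2, Y^2) = -var(X^2) is nonzero even though cov(X, Y) = 0.
cov_sq = sum(x*x * y*y for x, y in zip(xs, ys)) / n \
         - (sum(x*x for x in xs) / n) * (sum(y*y for y in ys) / n)
print(f"cov(X^2, Y^2) ~ {cov_sq:.4f}  (nonzero => dependent)")
```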

(b) By assumption, for all a and b the random variables g(X) = 1ₐ(X) and h(Y) = 1_b(Y) are uncorrelated, so cov(g(X), h(Y)) = 0. Using the functions given in the hint,

$$E[g(X)] = P(X = a), \qquad E[h(Y)] = P(Y = b), \qquad E[g(X)h(Y)] = P(X = a, Y = b)$$

$$\begin{aligned}
\mathrm{cov}(g(X), h(Y)) &= E[g(X)h(Y)] - E[g(X)]E[h(Y)] \\
&= E[g(X)h(Y)] - P(X = a)P(Y = b) = 0 \\
\Rightarrow\; E[g(X)h(Y)] &= P(X = a)P(Y = b) \\
\Rightarrow\; P(X = a, Y = b) &= P(X = a)P(Y = b) \quad \forall a, b \\
\Rightarrow\; P_{X,Y}(a, b) &= P_X(a)\,P_Y(b) \quad \forall a, b
\end{aligned} \tag{2}$$

which means X and Y are independent.
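Part (b) can be illustrated numerically on a hypothetical discrete pair that is uncorrelated but dependent (X uniform on {−1, 0, 1} and Y = X²; this example is not from the problem set):

```python
# Joint pmf of (X, Y) for X uniform on {-1, 0, 1}, Y = X^2.
pmf = {(-1, 1): 1/3, (0, 0): 1/3, (1, 1): 1/3}

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

# X and Y themselves are uncorrelated ...
cov_xy = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)

# ... but the indicators g(X) = 1_{X=0}, h(Y) = 1_{Y=0} are not,
# which certifies that X and Y are dependent.
g = lambda x, y: 1.0 if x == 0 else 0.0
h = lambda x, y: 1.0 if y == 0 else 0.0
cov_gh = E(lambda x, y: g(x, y) * h(x, y)) - E(g) * E(h)

print(cov_xy)   # 0 -> uncorrelated
print(cov_gh)   # nonzero (2/9) -> some functions are correlated, so dependent
```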

2. (Bayes’ Rule for PDFs) Consider a communication channel corrupted by noise. Let X be the value of the transmitted signal and Y be the value of the received signal. Assume that the conditional density of Y given {X = x} is Gaussian, that is,

$$f_{Y|X}(y|x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(y-x)^2}{2\sigma^2}\right),$$

and X is uniformly distributed on [−1, 1]. What is the conditional probability density function of X given Y (i.e., f_{X|Y}(x|y))?

Problem 2 Solution

$$f_{X|Y}(x|y) = \frac{f_{Y|X}(y|x)\, f_X(x)}{f_Y(y)}, \qquad f_X(x) = \frac{1}{2}\,\mathrm{rect}\!\left(\frac{x}{2}\right)$$

$$\begin{aligned}
f_Y(y) &= \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx = \int_{-\infty}^{\infty} f_{Y|X}(y|x)\, f_X(x)\, dx \\
&= \int_{-1}^{1} \frac{1}{2} \cdot \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(y-x)^2}{2\sigma^2}}\, dx \\
&= \frac{1}{2} \int_{\frac{-1-y}{\sigma}}^{\frac{1-y}{\sigma}} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{u^2}{2}}\, du \qquad \left(u = \frac{x-y}{\sigma}\right) \\
&= \frac{1}{2}\left[Q\!\left(\frac{-1-y}{\sigma}\right) - Q\!\left(\frac{1-y}{\sigma}\right)\right] \\
&= \frac{1}{2}\left[Q\!\left(\frac{y-1}{\sigma}\right) - Q\!\left(\frac{1+y}{\sigma}\right)\right]
\end{aligned}$$

$$\Rightarrow\; f_{X|Y}(x|y) = \frac{\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(y-x)^2}{2\sigma^2}}\, \mathrm{rect}\!\left(\frac{x}{2}\right)}{Q\!\left(\frac{y-1}{\sigma}\right) - Q\!\left(\frac{1+y}{\sigma}\right)} \tag{3}$$
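As a sanity check on (3), the posterior should integrate to 1 over [−1, 1]. A sketch using `math.erfc` for the Q-function (the test values of y and σ below are arbitrary choices):

```python
import math

def Q(x):
    """Gaussian tail probability: Q(x) = P(N(0,1) > x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def posterior(x, y, sigma):
    """f_{X|Y}(x|y) from equation (3), valid for |x| <= 1 where rect(x/2) = 1."""
    num = math.exp(-(y - x) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)
    den = Q((y - 1) / sigma) - Q((1 + y) / sigma)
    return num / den

# Trapezoidal integration of the posterior over x in [-1, 1].
y, sigma, n = 0.3, 0.5, 20_000
h = 2.0 / n
total = sum(posterior(-1 + k * h, y, sigma) for k in range(1, n)) * h
total += 0.5 * h * (posterior(-1.0, y, sigma) + posterior(1.0, y, sigma))
print(f"integral of f_X|Y(x|y) over [-1, 1] ~ {total:.6f}")  # should be ~1
```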

3. (Cauchy-Schwarz Inequality)

(a) Let f₁(X) and f₂(X) be functions of the random variable X. Show that
$$(E[f_1(X)f_2(X)])^2 \leq E[f_1(X)^2]\, E[f_2(X)^2].$$

(b) Use (a) to deduce that
$$P(X = 0) \leq 1 - \frac{(E[X])^2}{E[X^2]}.$$

(c) Generalize (a) and prove that for any two random variables X and Y,
$$(E[XY])^2 \leq E[X^2]\, E[Y^2].$$

Hint: Use the fact that E[(tX + Y)²] ≥ 0 for all t ∈ ℝ.

Problem 3 Solution

(a) You can either prove this part using the hint for part (c), or use the Cauchy-Schwarz inequality for integrals, i.e.
$$\left|\int f(x)g(x)\, dx\right|^2 \leq \int |f(x)|^2\, dx \int |g(x)|^2\, dx,$$
with $f(x) = f_1(x)\sqrt{f_X(x)}$ and $g(x) = f_2(x)\sqrt{f_X(x)}$, so that each integral becomes an expectation with respect to $f_X$.

(b) Assume X is a discrete RV (for continuous X the result is trivial as long as f_X(x) does not include any impulse functions, since then P(X = 0) = 0). Let f₁(X) = X and f₂(X) = 1 − δ(X), where δ(X) = 1 if and only if X = 0. Note that E[Xδ(X)] = 0 and E[(1 − δ(X))²] = E[1 − δ(X)]. Applying the result in part (a) to these functions we get the desired result:

$$\begin{aligned}
(E[f_1(X)f_2(X)])^2 &= (E[X(1 - \delta(X))])^2 = (E[X])^2 \\
E[f_1(X)^2] &= E[X^2] \\
E[f_2(X)^2] &= E[1 - \delta(X)] = 1 - P(X = 0) \\
\Rightarrow\; (E[X])^2 &\leq E[X^2]\,(1 - P(X = 0)) \\
\Rightarrow\; P(X = 0) &\leq 1 - \frac{(E[X])^2}{E[X^2]}
\end{aligned} \tag{4}$$

(c)

$$E[(tX + Y)^2] \geq 0 \quad \forall t \in \mathbb{R}
\;\Rightarrow\; t^2 E[X^2] + 2E[XY]\,t + E[Y^2] \geq 0 \tag{5}$$

Now by setting $t = -\frac{E[XY]}{E[X^2]}$ we get

$$(E[XY])^2 \leq E[X^2]\, E[Y^2] \tag{6}$$
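The bound in (b) can be spot-checked numerically; the pmf below is an arbitrary illustrative choice, not from the problem:

```python
# P(X = x) for a small discrete random variable.
pmf = {0: 0.5, 1: 0.3, -2: 0.2}

EX  = sum(p * x for x, p in pmf.items())        # E[X]
EX2 = sum(p * x * x for x, p in pmf.items())    # E[X^2]
p0  = pmf.get(0, 0.0)                           # P(X = 0)

bound = 1 - EX ** 2 / EX2
print(f"P(X=0) = {p0:.3f} <= bound = {bound:.3f}")
assert p0 <= bound + 1e-12  # the inequality of part (b) holds
```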

4. (Linear Estimation) The output of a channel is Y = X + N, where the input X and the noise N are independent, zero-mean random variables.

(a) Find the correlation coefficient between the input X and the output Y.

(b) Suppose we estimate the input X by a linear function g(Y) = aY. Find the value of a that minimizes the mean squared error E[(X − aY)²].

(c) Express the resulting mean squared error in terms of the ratio between the variance of X and the variance of N (i.e., σ²_X/σ²_N).

(d) Find cov(X − aY, X) for your choice of a in part (b).

Problem 4 Solution

(a)

$$\rho_{XY} = \frac{\mathrm{cov}(X, Y)}{\sigma_X \sigma_Y}$$

$$\begin{aligned}
\mathrm{cov}(X, Y) &= E[XY] - E[X]E[Y] = E[XY] \\
&= E[X(X + N)] = E[X^2] + E[XN] = E[X^2] \\
E[X^2] &= \mathrm{var}(X) + (E[X])^2 = \mathrm{var}(X) = \sigma_X^2 \\
\Rightarrow\; \rho_{XY} &= \frac{\sigma_X^2}{\sigma_X \sigma_Y} = \frac{\sigma_X}{\sigma_Y} = \frac{\sigma_X}{\sqrt{\sigma_X^2 + \sigma_N^2}}
\end{aligned} \tag{7}$$

Here we used E[X] = E[N] = 0, E[XN] = E[X]E[N] = 0 (by independence), and $\sigma_Y^2 = \sigma_X^2 + \sigma_N^2$.

(b)

$$\begin{aligned}
E[(X - aY)^2] &= E[X^2] - 2aE[XY] + a^2 E[Y^2] \\
&= \sigma_X^2 - 2a\sigma_X^2 + a^2(\sigma_X^2 + \sigma_N^2)
\end{aligned}$$

$$\frac{d}{da}\, E[(X - aY)^2] = 0 \;\Rightarrow\; a = \frac{\sigma_X^2}{\sigma_X^2 + \sigma_N^2} \tag{8}$$

(c)

$$\begin{aligned}
\mathrm{MSE} &= \sigma_X^2 - 2\left(\frac{\sigma_X^2}{\sigma_X^2 + \sigma_N^2}\right)\sigma_X^2 + \left(\frac{\sigma_X^2}{\sigma_X^2 + \sigma_N^2}\right)^2 (\sigma_X^2 + \sigma_N^2) \\
&= \sigma_X^2 - \frac{\sigma_X^4}{\sigma_X^2 + \sigma_N^2} = \frac{\sigma_X^2 \sigma_N^2}{\sigma_X^2 + \sigma_N^2} \\
&= \frac{\sigma_X^2}{1 + \frac{\sigma_X^2}{\sigma_N^2}}
\end{aligned} \tag{9}$$

(d)

$$\begin{aligned}
\mathrm{cov}(X - aY, X) &= E[X(X - aY)] - E[X]\,E[X - aY] \\
&= E[X^2] - aE[XY] = \sigma_X^2\left(1 - \frac{\sigma_X^2}{\sigma_X^2 + \sigma_N^2}\right) = \sigma_X^2\,(1 - \rho_{XY}^2)
\end{aligned} \tag{10}$$
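A Monte Carlo spot-check of (8) and (9) (a sketch; the variances, seed, and sample size are arbitrary choices):

```python
import random

# X ~ N(0, sx^2), N ~ N(0, sn^2) independent; Y = X + N.
random.seed(1)
sx, sn = 2.0, 1.0
sx2, sn2 = sx * sx, sn * sn
n = 100_000
samples = [(random.gauss(0, sx), random.gauss(0, sn)) for _ in range(n)]

def mse(a):
    """Empirical mean squared error E[(X - aY)^2] with Y = X + N."""
    return sum((x - a * (x + nz)) ** 2 for x, nz in samples) / n

a_star = sx2 / (sx2 + sn2)           # optimal coefficient from (8)
theory = sx2 / (1 + sx2 / sn2)       # minimum MSE from (9)
print(f"MSE at a*       = {mse(a_star):.4f} (theory {theory:.4f})")
print(f"MSE at a*-0.05  = {mse(a_star - 0.05):.4f} (should be larger)")
print(f"MSE at a*+0.05  = {mse(a_star + 0.05):.4f} (should be larger)")
```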

5. (a) Show that cov(X, E[Y|X]) = cov(X, Y).

(b) Show that E[Y|X = x] = E[Y] for all x implies that X and Y are uncorrelated.

Problem 5 Solution

(a)

$$\begin{aligned}
\mathrm{cov}(X, E[Y|X]) &= E[X\,E[Y|X]] - E[X]\,E[E[Y|X]] \\
&= E[E[XY|X]] - E[X]E[Y] \\
&= E[XY] - E[X]E[Y] = \mathrm{cov}(X, Y)
\end{aligned} \tag{11}$$

(b)

$$\begin{aligned}
E[Y|X = x] = E[Y] \;\forall x \;&\Rightarrow\; E[Y|X] = E[Y] \\
\Rightarrow\; \mathrm{cov}(X, Y) = \mathrm{cov}(X, E[Y|X]) &= E[X\,E[Y|X]] - E[X]\,E[E[Y|X]] \\
&= E[X\,E[Y]] - E[X]E[Y] = E[X]E[Y] - E[X]E[Y] = 0
\end{aligned} \tag{12}$$

⇒ X and Y are uncorrelated.
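Identity (11) can be verified numerically on a small joint pmf (the pmf values below are an arbitrary illustrative choice):

```python
from collections import defaultdict

# Joint pmf P(X = x, Y = y).
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 2): 0.4}

EX  = sum(p * x for (x, y), p in pmf.items())
EY  = sum(p * y for (x, y), p in pmf.items())
EXY = sum(p * x * y for (x, y), p in pmf.items())
cov_xy = EXY - EX * EY

# Marginal of X and conditional mean E[Y | X = x].
px = defaultdict(float)
eyx_num = defaultdict(float)
for (x, y), p in pmf.items():
    px[x] += p
    eyx_num[x] += p * y
eyx = {x: eyx_num[x] / px[x] for x in px}

# cov(X, E[Y|X]) = E[X * E[Y|X]] - E[X] * E[E[Y|X]]
cov_x_eyx = sum(px[x] * x * eyx[x] for x in px) \
            - EX * sum(px[x] * eyx[x] for x in px)

print(cov_xy, cov_x_eyx)   # the two covariances agree
```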

6. (Linear Combination of Two RVs) The characteristic function of a continuous random variable X is defined as

$$\Phi_X(\omega) = E[\exp(j\omega X)] = \int_{-\infty}^{\infty} \exp(j\omega x)\, f_X(x)\, dx,$$

where j = √−1. Thus Φ_X(−ω) is the Fourier transform of the PDF. In particular, the characteristic function uniquely determines the PDF.

(a) Suppose that X is N(µ, σ²). Show that its characteristic function equals
$$\Phi_X(\omega) = \exp(j\mu\omega - \sigma^2\omega^2/2).$$
Hint: Complete the square in the exponent.

(b) Suppose X and Y are independent. Determine the characteristic function of aX + bY in terms of a, b, and the characteristic functions of X and Y.
Hint: You do not need to compute the density of aX + bY.

(c) Suppose that X and Y are independent and Gaussian. Use the results in (a) and (b) to show that aX + bY is also Gaussian.

Problem 6 Solution

(a) First note that for Y = aX + b,

$$\Phi_Y(\omega) = E[e^{j\omega(aX + b)}] = e^{jb\omega}\,\Phi_X(a\omega). \quad (*)$$

Extend the definition to $\Phi_X(s) = E[e^{sX}]$, and let $Z = \frac{X - \mu}{\sigma}$, so that Z is N(0, 1). Completing the square in the exponent,

$$\Phi_Z(s) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{sz}\, e^{-\frac{z^2}{2}}\, dz = e^{\frac{s^2}{2}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{(z - s)^2}{2}}\, dz = e^{\frac{s^2}{2}} \tag{13}$$

since the remaining integral is of a (shifted) Gaussian density and equals 1. Setting s = jω gives $\Phi_Z(\omega) = e^{-\omega^2/2}$. Since X = σZ + µ, applying (*) yields
$$\Phi_X(\omega) = e^{j\mu\omega}\,\Phi_Z(\sigma\omega) = \exp(j\mu\omega - \sigma^2\omega^2/2).$$

(b)

$$\begin{aligned}
\Phi_{aX+bY}(\omega) &= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{j\omega(ax + by)}\, f_{X,Y}(x, y)\, dx\, dy \\
&= \int_{-\infty}^{\infty} e^{j\omega ax}\, f_X(x)\, dx \int_{-\infty}^{\infty} e^{j\omega by}\, f_Y(y)\, dy \\
&= \Phi_X(a\omega)\,\Phi_Y(b\omega)
\end{aligned} \tag{14}$$

where the factorization uses the independence of X and Y, i.e. f_{X,Y}(x, y) = f_X(x)f_Y(y).

(c)

$$\begin{aligned}
\Phi_{aX+bY}(\omega) &= \Phi_X(a\omega)\,\Phi_Y(b\omega) \\
&= e^{j\mu_X a\omega - \frac{\sigma_X^2 a^2 \omega^2}{2}}\; e^{j\mu_Y b\omega - \frac{\sigma_Y^2 b^2 \omega^2}{2}} \\
&= e^{j(a\mu_X + b\mu_Y)\omega - \frac{(a^2\sigma_X^2 + b^2\sigma_Y^2)\omega^2}{2}}
\end{aligned} \tag{15}$$

Since the characteristic function uniquely determines the PDF, aX + bY is Gaussian with mean aµ_X + bµ_Y and variance a²σ²_X + b²σ²_Y.
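A Monte Carlo spot-check of this conclusion (the parameters below are arbitrary choices): the empirical mean and variance of aX + bY should match aµ_X + bµ_Y and a²σ²_X + b²σ²_Y.

```python
import random, statistics

# Z = aX + bY for independent X ~ N(mu_x, sx^2), Y ~ N(mu_y, sy^2).
random.seed(2)
a, b = 2.0, -1.0
mu_x, sx = 1.0, 3.0
mu_y, sy = -2.0, 0.5
n = 200_000
z = [a * random.gauss(mu_x, sx) + b * random.gauss(mu_y, sy) for _ in range(n)]

mean_th = a * mu_x + b * mu_y                  # theoretical mean
var_th = a * a * sx * sx + b * b * sy * sy     # theoretical variance
print(f"mean ~ {statistics.fmean(z):.3f} (theory {mean_th})")
print(f"var  ~ {statistics.pvariance(z):.3f} (theory {var_th})")
```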
