Solution

Sep 15, 2018

1. Given that ∫_{−∞}^{∞} p1(y) dy = ∫_{−∞}^{∞} p0(y) dy = 1, the constants c0 and c1 can be found as c0 = 3/2, c1 = 1/9. Hence, the prior ratio is η = 1 and the likelihood ratio L(y) is

    L(y) = p1(y)/p0(y) = { 2(3−|y|)/(27y²),  |y| ≤ 1;   ∞,  1 < |y| ≤ 3 },

which, by using L(y) ≷ η, gives the Bayes rule ĤB(y) and the minimum risk rB as

    ĤB(y) = { 1,  y ∈ Λ1 = {y : |y| ≤ c ∨ 1 < |y| ≤ 3};   0,  y ∈ Λ0 = {y : −1 ≤ y < −c ∨ c < y ≤ 1} },    c = (√163 − 1)/27 ≃ 0.43582,

    rB = (1/2)(Pr(y ∈ Λ1 | H = 0) + Pr(y ∈ Λ0 | H = 1)) = (11423 − 326√163)/39366 ≃ 0.184446.
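As a numerical sanity check (a small sketch of mine, not part of the original solution), the constants, the threshold, and the risk above can be reproduced directly from the closed-form integrals:

```python
from math import sqrt

# Normalization: p0(y) = c0*y^2 on |y| <= 1 and p1(y) = c1*(3 - |y|) on |y| <= 3
# integrate to one for c0 = 3/2, c1 = 1/9.
c0, c1 = 3/2, 1/9

# The threshold c solves L(c) = 1, i.e. 27*c^2 + 2*c - 6 = 0 on |y| <= 1.
c = (sqrt(163) - 1) / 27

# Conditional errors of the Bayes rule.
p_false_alarm = 2 * c0 * c**3 / 3        # Pr(y in Lambda_1 | H0), integral of c0*y^2 over |y| <= c
p_miss = 2 * c1 * (5/2 - 3*c + c**2/2)   # Pr(y in Lambda_0 | H1), integral of c1*(3-|y|) over c < |y| <= 1
r_B = 0.5 * (p_false_alarm + p_miss)

print(c, r_B)  # ≈ 0.43582, 0.184446
```

The printed values match the closed forms stated above.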

3. The densities are the same as in Problem 1, so the likelihood ratio is again

    L(y) = p1(y)/p0(y) = { 2(3−|y|)/(27y²),  |y| ≤ 1;   ∞,  1 < |y| ≤ 3 },

which, under the assumption of uniform costs and by using L(y) ≷ η = π0/(1−π0), gives the Bayes rule ĤB(y, π0) as

    ĤB(y, π0) = { 1,  y ∈ Λ1(π0) = {y : |y| ≤ c(π0) ∨ 1 < |y| ≤ 3};   0,  y ∈ Λ0(π0) = {y : −1 ≤ y < −c(π0) ∨ c(π0) < y ≤ 1} },

in which c(π0) is

    c(π0) = { 1,  0 ≤ π0 ≤ 4/31;   (π0 − 1 + √(−161π0² + 160π0 + 1))/(27π0),  4/31 < π0 ≤ 1 },

and the minimum Bayes risk, written in terms of c(π0) (the fully expanded closed form follows by substitution), is

    rB(π0) = { π0,  0 ≤ π0 < 4/31;   π0 c(π0)³ + (1−π0)(5 − 6c(π0) + c(π0)²)/9,  4/31 ≤ π0 ≤ 1 }.


Reza Oftadeh

For the minimax rule we have to find the least favorable prior πl for which rB(πl) is maximal. The above function is continuous, and a simple plot shows that it attains its maximum in the interval 4/31 ≤ π0 ≤ 1.

[Plot: rB(π0) versus π0 on 0 ≤ π0 ≤ 1; vertical axis 0.05–0.20.]

Hence, evaluating c(π0) at the least favorable prior πl, the minimax rule is

    ĤM(y) = { 1,  y ∈ Λ1 = {y : |y| ≤ c ∨ 1 < |y| ≤ 3};   0,  y ∈ Λ0 = {y : −1 ≤ y < −c ∨ c < y ≤ 1} },    c = 0.587156.
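The numerical values of πl and of the minimax risk are not spelled out above; a short grid-search sketch of mine (the helper names c_of and r_B are my own) recovers them from the piecewise risk curve:

```python
from math import sqrt

def c_of(pi0):
    """Threshold c(pi0): positive root of 27*pi0*c^2 + 2*(1-pi0)*c - 6*(1-pi0) = 0,
    capped at c = 1 for pi0 <= 4/31."""
    if pi0 <= 4/31:
        return 1.0
    return (pi0 - 1 + sqrt(-161*pi0**2 + 160*pi0 + 1)) / (27*pi0)

def r_B(pi0):
    """Minimum Bayes risk under uniform costs, in terms of c(pi0)."""
    if pi0 < 4/31:
        return pi0
    c = c_of(pi0)
    return pi0 * c**3 + (1 - pi0) * (5 - 6*c + c**2) / 9

# Grid search for the maximizer of the (concave) risk curve.
grid = [i / 200000 for i in range(200001)]
pi_l = max(grid, key=r_B)
print(pi_l, c_of(pi_l), r_B(pi_l))  # ≈ 0.3414, 0.587156, 0.2024
```

The recovered threshold agrees with the value c = 0.587156 quoted above, and at πl the two conditional error probabilities coincide (the equalizer property), so the minimax risk is simply c³ ≈ 0.2024.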

4. (a) The minimum Bayes risk V(π0) is a concave function; the least favorable prior πl is its maximizer, and the minimax risk is its value V(πl). V(π0) = 2 − π0² attains its maximum on [0, 1] at πl = 0, and the minimax risk is V(πl) = 2.

(b) Under the prior πl = 0 the Bayes rule is simply Ĥ(y) = 1 for every y ∈ ℝ, and hence Λ1 = ℝ and Λ0 = ∅. Therefore, the conditional probability of error given H0 is Pr(y ∈ Λ1 | H0) = 1.
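A quick numerical confirmation of part (a) (a sketch of mine, not in the original):

```python
# V(pi0) = 2 - pi0**2 is concave and decreasing on [0, 1], so the least
# favorable prior is pi_l = 0 with minimax risk V(0) = 2; there the Bayes
# rule always decides 1, so Pr(error | H0) = Pr(y in Lambda_1 | H0) = 1.
V = lambda pi0: 2 - pi0**2
grid = [i / 1000 for i in range(1001)]
pi_l = max(grid, key=V)
print(pi_l, V(pi_l))  # 0.0 2.0
```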

5. We have Hi : Y ∼ N((i−3)µ, σ²). Under uniform costs the Bayes risk, as a function of the priors π = (π1, …, π5), becomes

    r(δ, π) = Σ_{i=1..5} Σ_{j=1..5} cij πj Pr(δ(y) = i | Hj) = Σ_{i=1..5} Σ_{j≠i} πj Pr(δ(y) = i | Hj) = Σ_{i=1..5} πi (1 − Pr(δ(y) = i | Hi)).


The Bayes decision rule δ(y, π) minimizes the above risk and can be written as

    δ(y, π) = arg max_{1≤i≤5} πi p(y | Hi)
            = arg max_{1≤i≤5} { −(y − (i−3)µ)²/(2σ²) − (1/2) ln 2π − (1/2) ln σ² + ln πi }
            = arg max_{1≤i≤5} { ln πi − (y − (i−3)µ)²/(2σ²) }.

The above maximization is over the second-order polynomials si(y) = ln πi − (y − (i−3)µ)²/(2σ²). The crossing point of si and sj is where the decision regions change from i to j, but there are many subtleties to account for, since some priors can cause some regions to become empty. In general, si(y) = sj(y) results in

    yij = σ²/(µ(i−j)) · ln(πj/πi) + (i+j−6)µ/2,

which simplifies the rule to

    δ(y, π) = { 1,  y ∈ (−∞, min_{j>1} y1j];
                2,  y ∈ (max_{j<2} y2j, min_{j>2} y2j];
                3,  y ∈ (max_{j<3} y3j, min_{j>3} y3j];
                4,  y ∈ (max_{j<4} y4j, min_{j>4} y4j];
                5,  y ∈ (max_{j<5} y5j, ∞) }.

By setting d = µ/σ, let the normalized thresholds, centered at the mean of hypothesis i, be

    y′ij(d, π) = (yij − (i−3)µ)/σ = ln(πi/πj)/(d(j−i)) + (j−i)d/2,

so we have

    Pr(δ(y, π) = i | Hi) = { Φ(min_{j>1} y′1j(d, π)),  i = 1;
                             max( Φ(min_{j>i} y′ij(d, π)) − Φ(max_{j<i} y′ij(d, π)), 0 ),  i ∈ {2, 3, 4};
                             1 − Φ(max_{j<5} y′5j(d, π)),  i = 5 }.

Hence, the average Bayes risk for the Bayes rule is r(d, π) = Σ_{i=1..5} πi (1 − Pr(δ(y, π) = i | Hi)), which is a function of d and the priors π. Now, for the minimax rule, we have to find the least favorable prior πl that maximizes r(d, π) and, being an equalizer, makes Ri = 1 − Pr(δ(y, π) = i | Hi) the same for all i. The minimax rule is then δ(y, πl), with minimax error r(d, πl), which is a function of d only.

As requested, I wrote a Matlab code to calculate πl for d = 3, which is shown below:


function v = proberror(pi)
    d = 3;
    yp = zeros(5,5);
    for i = 1:5
        for j = 1:5
            if j == i
                yp(i,j) = 0;
            else
                yp(i,j) = 1/(d*(j-i))*log(pi(i)/pi(j)) + (j-i)*d/2;
            end
        end
    end
    pr(1) = phi(min(yp(1,2:5)));
    pr(2) = phi(min(yp(2,3:5))) - phi(max(yp(2,1)));
    pr(3) = phi(min(yp(3,4:5))) - phi(max(yp(3,1:2)));
    pr(4) = phi(min(yp(4,5))) - phi(max(yp(4,1:3)));
    pr(5) = 1 - phi(max(yp(5,1:4)));
    for i = 1:5
        if pr(i) < 0
            pr(i) = 0;
        end
    end
    % fmincon minimizes, so return the negated average error to maximize it over the priors
    v = -sum(pi.*(1 - pr'));
end

function pr = phi(x)
    pr = (1/2)*erfc(-x/sqrt(2));
end

pil = fmincon(@proberror, [1/5; 1/15; 1/15; 1/15; 1/5], ...
    [], [], [1,1,1,1,1], 1, [0;0;0;0;0], [1;1;1;1;1]);

The optimization provides the least favorable priors πl and the respective errors. For the comparison with the Bayes rule under uniform costs and equal priors, πi = 1/5 ∀i, the prior terms cancel and the rule reduces to the nearest-mean rule

    δ(y) = arg min_{1≤i≤5} (y − (i−3)µ)² = { 1,  y ∈ (−∞, −3µ/2];
                                             2,  y ∈ (−3µ/2, −µ/2];
                                             3,  y ∈ (−µ/2, µ/2];
                                             4,  y ∈ (µ/2, 3µ/2];
                                             5,  y ∈ (3µ/2, ∞) }.


6. (a) With the Neyman–Pearson ROC PD = PF^{1/4} and costs C10 = 3, C01 = 2, C00 = 2, C11 = 0, the Bayes risk is

    r(δ(y)) = π0 (c10 PF + c00 (1−PF)) + π1 (c11 PD + c01 (1−PD)) = 2(π0 − 1)PF^{1/4} + π0 PF + 2.

Now the Bayes rule, for a given π0, minimizes the above risk by setting PF(δ(y)) through δ(y). The PF = PF,m that minimizes r(δ(y)) is obtained by setting

    ∂r(δ(y))/∂PF = 0  ⟹  PF,m = ((1−π0)/(2π0))^{4/3}  ∧  0 ≤ PF,m ≤ 1.

Now for π0 = 1/3 the above expression gives PF,m = 1, and PF,m > 1 for π0 < 1/3. Hence, for π0 ≤ 1/3, the minimum Bayes risk is achieved by setting

    PF,m = Pr(δ(y) = 1 | H0) = 1  ⟹  δ(y) = 1.

(b) From the previous part we know that PF,m = ((1−π0)/(2π0))^{4/3} minimizes the Bayes risk. For π0 = 1/2 we have PF,m = 2^{−4/3}, and the minimum Bayes risk becomes

    rB = 2(π0 − 1)PF,m^{1/4} + π0 PF,m + 2  ⟹  rB = 2 − 3/(4·∛2) ≃ 1.40472.
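Both parts can be checked by a direct grid search over PF (a sketch of mine; the helper name risk is my own):

```python
def risk(PF, pi0):
    """Bayes risk with PD = PF**(1/4) and costs C10 = 3, C01 = 2, C00 = 2, C11 = 0:
    r = pi0*(3*PF + 2*(1-PF)) + (1-pi0)*2*(1-PF**0.25)."""
    return 2 * (pi0 - 1) * PF**0.25 + pi0 * PF + 2

grid = [i / 100000 for i in range(100001)]

# pi0 = 1/3: the risk is decreasing in PF, so the minimizer is PF = 1 (delta(y) = 1).
assert min(grid, key=lambda PF: risk(PF, 1/3)) == 1.0

# pi0 = 1/2: minimizer PF_m ≈ 2**(-4/3) ≈ 0.3969, minimum risk ≈ 1.40472.
PF_m = min(grid, key=lambda PF: risk(PF, 0.5))
print(PF_m, risk(PF_m, 0.5))
```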

Optional Problems:

1. The minimum Bayes risk rB is

    rB = π0 (c10 PF + c00 (1−PF)) + π1 (c11 PD + c01 (1−PD)) = π0 (PF + 1) + (1−π0)(4 − 2PD)
       = π0 (PF + 2PD − 3) + (4 − 2PD);

comparing with the given V(π0) = 2 − π0², we have PD = 1 and PF = 1 − π0. Hence, the ROC plot is just the horizontal line PD(PF) = 1 for all 0 ≤ PF ≤ 1.
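The identification PD = 1, PF = 1 − π0 can be verified symbolically over a grid of priors (my own check, not in the original):

```python
# With PD = 1 and PF = 1 - pi0, the risk pi0*(PF + 2*PD - 3) + (4 - 2*PD)
# collapses to V(pi0) = 2 - pi0**2 for every prior, which confirms the ROC.
for k in range(101):
    pi0 = k / 100
    PF, PD = 1 - pi0, 1.0
    rB = pi0 * (PF + 2*PD - 3) + (4 - 2*PD)
    assert abs(rB - (2 - pi0**2)) < 1e-12
print("identity verified over pi0 in [0, 1]")
```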



Assignment 3

Problems:

1. Find the Bayes rule and the minimum Bayes risk for a binary hypothesis testing problem with equal priors and uniform costs, with

    p0(y) = { c0 y²,  |y| ≤ 1;   0,  otherwise },
    p1(y) = { c1 (3 − |y|),  |y| ≤ 3;   0,  otherwise }.

2. A communication system uses binary signaling by sending one of the following two signals: s1(t) to send one, and s0(t) to send zero, where s1(t) = −As(t), s0(t) = As(t), A > 0, and

    s(t) = { 1,  0 < t < 1;   0,  otherwise }.

The transmitted signal is corrupted by zero-mean white Gaussian noise n(t) with autocorrelation function Rn(τ) = σ²δ(τ), so that the received signal is given by

    y(t) = { −As(t) + n(t),  1 sent;   As(t) + n(t),  0 sent }.

The receiver computes the statistic

    Z = ∫_{−∞}^{∞} y(t) s(t) dt.

The receiver would like to decide whether 0 or 1 was sent, or (if it is not confident about a 0/1 decision) erase the signal. We would like to obtain a Bayesian decision rule for doing this as follows.

Let Θ = {0, 1} and A = {0, 1, e}. The observation is Z. Assume the cost structure

    C(a, θ) = { 1,  a ≠ θ, a ∈ {0, 1};   0,  a = θ;   c,  a = e }.

(a) Find the Bayes rule for equal priors. Simplify the form of the rule as much as possible and specify its dependence on the problem parameters c, A, and σ².

(b) For d = A/σ = 5, find c and a corresponding Bayesian decision rule δA so that the probability of erasure p1 is twice the probability of error p2. Compare with the values of p1 and p2 for a Bayesian decision rule δB corresponding to c = 1/2.


3. Find the minimax rule and minimax risk for the binary hypothesis testing problem with

    p0(y) = { c0 y²,  |y| ≤ 1;   0,  otherwise },
    p1(y) = { c1 (3 − |y|),  |y| ≤ 3;   0,  otherwise }.

4. The minimum Bayes risk for a binary hypothesis testing problem with costs C00 = 1, C11 = 2, C10 = 2, C01 = 4 is given by

    V(π0) = 2 − π0²,

where π0 is the prior probability of hypothesis H0.

(a) Find the minimax risk and the least favorable prior.

(b) What is the conditional probability of error given H0 for the minimax rule in (a)?

5. A communication system uses multi-amplitude signaling to send one of 5 signals. The decision statistic when i (i = 1, …, 5) is sent is given by

    Y = (i − 3)µ + N,  i = 1, …, 5,

where N is Gaussian noise with mean zero and variance σ 2 . For uniform costs, specify in

its simplest form a Bayes rule which is an equalizer, and hence is minimax. Show that the

minimax error probability depends only on d = µ/σ, and give a numerical value for the

minimax error probability (use a computer program if needed) for d = 3. You may use a

computer program if you wish, but you may save time by deriving an approximation based on

large d instead. Compare with the worst-case error probability using a Bayes rule for uniform

costs and equal priors.

6. Consider a binary hypothesis testing problem in which the ROC for the Neyman–Pearson rule is given by PD = (PF)^{1/4}. For the same hypotheses, consider a Bayesian problem with cost structure C10 = 3, C01 = 2, C00 = 2, C11 = 0.

(a) Show that the decision rule δ(y) = 1 is optimal for π0 ≤ 1/3.

(b) Find the minimum Bayes risk for π0 = 1/2.

Optional Problems:

1. The minimum Bayes risk for a binary hypothesis testing problem with costs C00 = 1, C11 = 2, C10 = 2, C01 = 4 is given by

    V(π0) = 2 − π0²,

where π0 is the prior probability of hypothesis H0. Find the receiver operating characteristic of optimal detectors.
