UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Saturday June 15 1996, 9:00 – 12:00

Answer any 5 out of the 9 questions. All questions carry equal weight. Students seeking a Ph.D. pass should answer at least two questions from Part B.

Part A: M.S. Questions

1. Suppose that Y1, . . . , Yn are independent random variables with the PDF f(y | θ), where θ is an unknown scalar parameter. If the decision rule δ is to estimate θ by the estimator T = t(Y1, . . . , Yn), then the loss is measured by L(T, θ) and the risk function is R(θ, δ).
(a) If π(θ) is the prior PDF for θ, then the Bayes risk of T is ∫ R(θ, δ) π(θ) dθ. Show that the estimator which minimizes Bayes risk minimizes E{L(T, θ) | Y1, . . . , Yn}.
(b) Let f(y | θ) = θ exp(−θy) for y ≥ 0, π(θ) = α exp(−αθ) for θ ≥ 0, and L(T, θ) = (T − θ)²/θ.
(i) Obtain the posterior PDF of θ given Y1, . . . , Yn [it is proportional to θ^n exp{−(Σ Yj + α)θ}].
(ii) Show that the estimator that minimizes Bayes risk is T = n / (Σ Yj + α).

2. Let Y1, . . . , Yn be independent random variables with the Uniform distribution on [0, 1].
(a) Find the PDF of −log_e{(Y1 Y2)²}.
(b) Let Zn = (∏_{i=1}^n Yi)^{1/n}, the geometric mean of Y1, . . . , Yn. Show that as n → ∞, Zn → c in probability for some c.

3. The independent random variables X1, X2, . . . , Xn each have density
f(x | µ) = (2π)^{−1/2} x^{−3/2} exp{−(x − µ)² / (2xµ²)},  where 0 < x, µ < ∞.
(a) Express the PDF as an exponential family PDF.
(b) Calculate E(X) and Var(X).
(c) We wish to test the null hypothesis H0 : µ = 1 against the alternative hypothesis HA : µ > 1. Show that the uniformly most powerful test at level α rejects H0 if Σ_{j=1}^n Xj ≥ cα, and obtain an approximation for cα that is suitable for large n.
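As a numerical companion to Question 1(b) (not part of the original paper), the sketch below assumes NumPy, uses invented values for n, α and the data, samples from the Gamma posterior stated in (i), and checks that minimizing the posterior expected loss reproduces the estimator quoted in (ii).

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha = 20, 2.0                          # illustrative sample size and prior rate
y = rng.exponential(scale=1 / 1.5, size=n)  # synthetic data, true theta = 1.5

# Posterior from (i): theta | y ~ Gamma(shape = n + 1, rate = sum(y) + alpha)
rate = y.sum() + alpha
theta = rng.gamma(shape=n + 1, scale=1.0 / rate, size=200_000)

# For L(T, theta) = (T - theta)^2 / theta the minimizer of posterior expected loss
# is 1 / E(1/theta | y); compare with the closed form n / (sum(y) + alpha) from (ii).
print(1.0 / np.mean(1.0 / theta), n / rate)
```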

4. Suppose that certain electronic components have independent lifetimes (measured in days) that are exponentially distributed with PDF f(t | τ) = (1/τ) exp(−t/τ), t ≥ 0. Five new components are put on test simultaneously. We observe that the first failure occurs immediately after 100 days, and nothing else is observed.
(a) What is the likelihood function of τ?
(b) Obtain the maximum likelihood estimator of τ.
(c) Find a 95% confidence interval for τ.

5. A certain plant can appear in any one of three genotypes labelled AA, Aa and aa. The genetic theory of random mating says that the ratios E(X) : E(Y) : E(Z) are θ² : 2θ(1 − θ) : (1 − θ)², 0 ≤ θ ≤ 1. In a random sample of n of these plants, x are of type AA, y are of type Aa, and z are of type aa.
(a) Write down the likelihood function for θ given x, y, z.
(b) We wish to test the hypothesis H0 : θ = 1/2 versus HA : θ < 1/2. Derive the uniformly most powerful test.
(c) Explain how to test the goodness of fit of the genetic theory, and show how to calculate the P-value.

Part B: Ph.D. Questions

6. Suppose that Y1, . . . , Yn are i.i.d. from the uniform distribution on [θ, 2θ], θ > 0.
(a) Obtain the minimal sufficient statistic for θ.
(b) Show that the maximum likelihood estimator of θ is (1/2) Y(n).
(c) Prove or disprove: the minimal sufficient statistic is complete.

7. Let Y1, . . . , Yn be independent random variables with the PDF f(y | µ, κ) = c(κ) exp{κ cos(y − µ)}, 0 ≤ y, µ ≤ 2π, where κ is a known positive constant and c(·) is a known function.
(a) Show that the MLE of µ is µ̂ = tan⁻¹(Σ sin Yj / Σ cos Yj), such that cos µ̂ < 0 iff Σ cos Yj < 0.
(b) Prove that µ̂ → µ in probability as n → ∞. [Hint: it may help to prove that E{sin(Y − µ)} = 0 and to assume that E(cos Y) ≠ 0.]
(c) Calculate an approximation for the variance of µ̂.

8. Suppose that Y1, . . . , Yn are independent N(0, σ²) random variables.
(a) Show that S = Σ Yi² is a sufficient statistic for σ. Is it a minimal sufficient statistic? Is it a complete sufficient statistic?
(b) Show that n⁻¹ S is the UMVU estimator of σ².
(c) Among all estimators of the form cS for some constant c, show that S/(n + 2) has the minimum mean squared error.
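Question 4 lends itself to a quick numerical check. The sketch below (SciPy assumed) uses the fact that the minimum of five Exp(τ) lifetimes is Exp(τ/5), so that 2·5·T(1)/τ is chi-squared on 2 degrees of freedom, and evaluates the resulting MLE and an exact 95% interval for the stated data; it is an illustration, not a model solution.

```python
from scipy.stats import chi2

t1, k = 100.0, 5                 # first failure at 100 days among k = 5 components
tau_mle = k * t1                 # maximizes L(tau) = (k/tau) * exp(-k * t1 / tau)

# Pivot: 2*k*T(1)/tau ~ chi-squared(2); invert it for an exact 95% interval for tau.
lower = 2 * k * t1 / chi2.ppf(0.975, df=2)
upper = 2 * k * t1 / chi2.ppf(0.025, df=2)
print(tau_mle, (lower, upper))
```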

9. An experiment to compare two treatments (A and B) is performed on n subjects as follows. For the jth individual an initial response Xj is measured, which is Poisson with mean µj. If Xj ≤ a, then Treatment A is applied to the subject and the further response Yj is measured, where Yj is Poisson with mean αµj. However, if Xj > a then Treatment B is applied and the further response Yj is measured, where Yj is Poisson with mean βµj. All responses are mutually independent, conditional on the µj's, which are all unknown. The parameters α and β are also unknown, and we are interested in comparing them. The experimental data are therefore paired outcomes (x1, y1), . . . , (xn, yn).
(a) Write down the joint likelihood function for µ1, . . . , µn, α, β. What are the sufficient statistics for these parameters?
(b) Show that if α = β, then the likelihood function is
L(µ1, . . . , µn, α) = α^{Σ Yj} ∏_{j=1}^n µj^{Xj + Yj} exp{−µj(1 + α)}.
What are the sufficient statistics now?
(c) Write β = γα, and consider testing the null hypothesis H0 : γ = 1 versus HA : γ > 1. Show that the score test statistic is proportional to
Σ_{j : Xj > a} { Yj − [ Σ_{k=1}^n Yk / Σ_{k=1}^n (Xk + Yk) ] (Xj + Yj) }.
[You need not standardize this, but do explain briefly how you would.]
(d) Explain in one or two sentences what part (b) has to do with testing H0 in part (c).
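To make the score statistic in part (c) concrete, here is a small simulation sketch (NumPy assumed; the threshold a, the treatment effect, and the subject means are all invented for illustration) that evaluates the displayed statistic on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, a = 50, 3                         # hypothetical number of subjects and threshold
mu = rng.gamma(2.0, 1.5, size=n)     # hypothetical subject-specific means mu_j
x = rng.poisson(mu)                  # initial responses X_j
y = rng.poisson(1.2 * mu)            # further responses, simulated with alpha = beta = 1.2

# Score statistic from (c): sum over subjects with X_j > a of
#   Y_j - (sum Y_k / sum (X_k + Y_k)) * (X_j + Y_j)
ratio = y.sum() / (x + y).sum()
B = x > a
print(np.sum(y[B] - ratio * (x[B] + y[B])))
```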

Unused Questions

A. (a) Explain the use of complete sufficient statistics in minimum variance unbiased estimation.
(b) Consider independent random samples X1, . . . , Xn ∼ N(µ1, 1) and Y1, Y2, . . . , Ym ∼ N(µ2, 1), −∞ < µ1, µ2 < ∞. Both µ1 and µ2 are unknown. Derive the uniform minimum variance unbiased estimator of P(X1 < Y1).

B. Observations y1, . . . , yn are a random sample from the normal distribution with mean zero and variance φ. The prior distribution of ξ = νλ/φ is chi-squared with ν degrees of freedom, where ν and λ are known constants.
(a) Find the prior means of φ and φ⁻¹.
(b) Show that the posterior distribution of (νλ + Σ_{i=1}^n yi²)/φ is chi-squared with ν + n degrees of freedom.
(c) Find the posterior means of φ and φ⁻¹.
(d) Find an interval [0, a) such that P[φ ∈ [0, a) | y1, y2, . . . , yn] = 1 − α.
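For Question B(d), taking the posterior result of part (b) as given, the endpoint a can be read off a chi-squared quantile. A short sketch (SciPy assumed, with invented values of ν, λ and the data) follows.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
nu, lam, n = 4, 2.0, 30                      # illustrative known constants and sample size
y = rng.normal(0.0, np.sqrt(1.5), size=n)    # synthetic data

s = nu * lam + np.sum(y**2)
alpha = 0.05
# If s/phi | y ~ chi-squared(nu + n), then P(phi < a | y) = 1 - alpha
# when s/a equals the lower alpha quantile of that chi-squared distribution.
a = s / chi2.ppf(alpha, df=nu + n)
print(a)
```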


UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Sunday September 21 1997, 9:00 – 12:30

Answer FIVE questions. All questions carry equal weight.

1. Let x1, . . . , xn denote a random sample drawn from an exponential distribution with probability density function f(x | θ) = θ exp(−θx) (x > 0), where θ has a gamma prior distribution with probability density function
π(θ) = [1/Γ(g)] h^g θ^{g−1} exp(−hθ)  (θ > 0)
with g and h known positive constants.
(a) Derive the posterior distribution of θ given the data.
(b) Obtain the Bayes estimator t of θ for the loss function L(t, θ) = (t − θ)²/θ². Explain briefly how you would construct a Bayesian confidence interval for θ.

2. Suppose that X and Y are independent Poisson random variables with means λ and µ. Derive the exact (uniformly most powerful unbiased) test of NH: λ = µ versus AH: λ > µ.

3. Suppose that Xij ∼ N(µi, σ²) for i = 1, . . . , k and j = 1, . . . , r with r > 1. The parameters µ1, . . . , µk and σ² are all unknown. Define X̄i = r⁻¹ Σ_{j=1}^r Xij. We want to find an accurate 95% confidence interval for the first mean µ1. Consider the interval X̄1 ± c σ̂ t_{d,p}, where t_{df,p} denotes the p quantile of the Student-t distribution with df degrees of freedom. Specify c, d, p, and the estimate σ̂² so that the interval is an exact 95% interval for µ1.
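Question 1 is a standard conjugate update. As a sketch (SciPy assumed; g, h and the data are invented), the posterior is Gamma with shape g + n and rate h + Σxᵢ, and an equal-tailed credible interval is one simple Bayesian confidence interval of the kind part (b) asks about.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)
g, h, n = 3.0, 2.0, 25
x = rng.exponential(scale=1 / 0.8, size=n)   # synthetic exponential data, rate 0.8

shape_post = g + n                           # Gamma(g, rate h) prior times exponential likelihood
rate_post = h + x.sum()
ci = gamma.ppf([0.025, 0.975], a=shape_post, scale=1 / rate_post)
print(shape_post / rate_post, ci)            # posterior mean and 95% equal-tailed interval
```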

4. Consider a random sample of observations x1, x2, . . . , xn from the two-parameter Weibull distribution with probability density function f(x | α, β) = αβ x^{β−1} exp(−α x^β) (x > 0), where α and β are positive constants.
(a) Show that, if β is known, then there is a one-dimensional sufficient statistic for α.
(b) Show that, if α is known, then the vector of order statistics (X(1), X(2), . . . , X(n)) is minimal sufficient for β.
(c) Let α = 1 in the Weibull probability density function given above. Obtain equations that determine (i) the maximum likelihood estimator for β, and (ii) a method of moments estimator for β.

5. Suppose that (X, Y) has a bivariate normal distribution with mean vector 0, marginal variances equal to 1, and correlation ρ. [You may quote properties of the conditional distribution of Y given X without proof.]
(a) Prove that E(X | X ≥ a) = φ(a)/Φ̄(a), where Φ̄(a) = 1 − Φ(a), with φ and Φ the PDF and CDF of the N(0, 1) distribution.
(b) Prove that E(X | X ≥ a) is an increasing function of a.
(c) Show that E(Y | X ≥ a) = ρφ(a)/Φ̄(a).

6. Suppose that X1 ∼ N(µ, τ²) and that Xj = µ + ρ(Xj−1 − µ) + εj for j = 2, . . . , n, where ε2, . . . , εn are IID N(0, σ²) independent of the Xj's, and τ² = σ²/(1 − ρ²). These conditions guarantee that Xj ∼ N(µ, τ²) for all j, and note that Cov(Xj, Xk) = ρ^{|j−k|} τ². Assume that ρ and σ² are known.
(a) The joint PDF of X1, . . . , Xn in this case may be written in the form
f_{X1,...,Xn}(x1, . . . , xn | µ) = f_{X1}(x1 | µ) ∏_{j=2}^n f_{Xj | Xj−1}(xj | xj−1, µ).
Develop an explicit expression for the likelihood function for µ.
(b) Show that the MLE of µ is
µ̂ = [X1 + (1 − ρ)(X2 + · · · + Xn−1) + Xn] / [2 + (n − 2)(1 − ρ)].
(c) Calculate the efficiency of X̄ = n⁻¹ Σ_{j=1}^n Xj relative to µ̂.

7. Suppose that Y1, . . . , Yn are IID with some unknown PDF f(y). Two hypotheses about f are NH: f = f0 and AH: f = fA, where both f0 and fA are completely specified.
(a) State the Neyman-Pearson Lemma for testing NH versus AH.
(b) Suppose that f0(y) = φ(y), the N(0, 1) PDF, and that fA(y) = (2τ)⁻¹ exp{−|y|/τ}, where τ is such that Var(Y) = 1. Derive the most powerful size α test of NH versus AH, and show that the critical region can be expressed in the form {(y1, . . . , yn) : Σ yj² + a Σ |yj| ≥ c}. (Specify a numerically.)
(c) Obtain a formula for approximate numerical evaluation of c in (b).

8. Suppose that Y1, . . . , Yn are independent and identically distributed with a distribution depending on the scalar parameter θ. We are interested in point estimation and confidence intervals for θ. The consistent estimator θ̃ is defined implicitly through the estimating equation Σ_{j=1}^n a(Yj, θ̃) = 0.
(a) Derive the influence function for θ̃, and use it to derive a formula for the variance of the asymptotic distribution of n^{1/2}(θ̃ − θ).
(b) If Σ_j a(Yj, θ) is continuous and monotone decreasing with respect to θ, verify that for any d,
Pr(θ̃ < θ + d) = Pr{ Σ_{j=1}^n a(Yj, θ + d) < 0 }.
Explain carefully how this identity can be used to calculate an approximate 1 − α confidence interval for θ.
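Part 7(c) asks for a numerical route to the critical value; one option is plain simulation under NH. The sketch below (NumPy assumed) uses the Laplace scale τ = 1/√2 implied by Var(Y) = 1, so that the log likelihood ratio is, up to additive constants, Σ yj²/2 − √2 Σ |yj|, and estimates its null 1 − α quantile.

```python
import numpy as np

rng = np.random.default_rng(4)
n, alpha, reps = 20, 0.05, 100_000

y = rng.standard_normal((reps, n))                   # samples under NH: N(0, 1)
stat = 0.5 * np.sum(y**2, axis=1) - np.sqrt(2) * np.sum(np.abs(y), axis=1)
c = np.quantile(stat, 1 - alpha)                     # reject NH when the statistic exceeds c
print(c)
```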


UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Thursday September 23 1999, 9:00 – 12:00

Answer FIVE (5) questions. All questions carry equal weight.

1. Discuss, using appropriate brief examples, the advantages and disadvantages of maximum likelihood as a method of statistical estimation.

2. Suppose that X1, . . . , Xn are independent Binomial random variables with index m and rate θ, i.e. B(m, θ).
(a) Show that T = Σ Xj is complete and sufficient.
(b) Using the Lehmann-Scheffe Theorem or otherwise, find the Uniform Minimum Variance Unbiased Estimator for q(θ) = (1 − θ)^m + mθ(1 − θ)^{m−1} = Pr(X ≤ 1).
(c) Say whether or not the estimator in (b) should attain the Cramer-Rao bound, and give a reason for your answer.

3. Suppose that X1, . . . , Xn, Xn+1 are independent Bernoulli random variables with Pr(Xj = 1) = θ = 1 − Pr(Xj = 0). Only the values of X1, . . . , Xn, and not Xn+1, are observed. Prior information about θ is summarized by the prior PDF π(θ) = 2θ, 0 < θ < 1.
(a) Calculate the posterior PDF for θ, given X1, . . . , Xn.
(b) Calculate Pr(Xn+1 = 1 | X1, . . . , Xn).

4. Suppose that Y1, . . . , Yn are independent and identically distributed with probability density f(y | θ), where θ is a scalar parameter, and θ has a prior distribution with density π(θ). For estimation of θ by T = t(Y1, . . . , Yn), the loss function is L(T, θ).
(a) Define the risk function and the Bayes risk for any estimator T.
(b) If θ ∈ R, the loss function is L(T, θ) = (T − θ)⁴, and the posterior density for θ is f(θ | Y1, . . . , Yn) = g(θ − Ȳ), prove that the Bayes estimator for θ is T = Ȳ if g(z) = g(−z) for all z.

PLEASE TURN OVER
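Question 3 has a tidy conjugate structure: the prior π(θ) = 2θ coincides with a Beta(2, 1) density, so with s successes in n trials the posterior is Beta(s + 2, n − s + 1) and the predictive probability Pr(X_{n+1} = 1 | data) is its mean. The sketch below (SciPy assumed, invented data) just evaluates these quantities.

```python
from scipy.stats import beta

n, s = 12, 8                       # invented: 8 ones observed in 12 Bernoulli trials
post = beta(s + 2, n - s + 1)      # prior 2*theta is Beta(2, 1); the Bernoulli likelihood is conjugate
print(post.mean())                 # equals (s + 2) / (n + 3) = Pr(X_{n+1} = 1 | data)
print(post.interval(0.95))         # a 95% equal-tailed posterior interval for theta
```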

5. (a) Define Y = Σ aj Xj, where a1, . . . , a4 are real constants and X1, X2, . . . , X4 are independent N(0, σ²). If Y² and Q = X1X2 − X3X4 are independent, determine a1, . . . , a4.
(b) Find the distribution of Q = (1/(2σ²))(X1² + X3²) + (1/σ²)(X2² + X1X3).

6. Let X1, . . . , Xm and Y1, . . . , Yn be random samples from two exponential distributions with unknown scale parameters θ1 and θ2 respectively, i.e. means 1/θ1 and 1/θ2.
(a) Show that the critical region of the Generalized Likelihood Ratio Test of H0 : θ1 = θ2 versus H1 : θ1 ≠ θ2 depends only on the ratio Ȳ/X̄. Explain how one would obtain the necessary critical values for the test.
(b) If m = n = 1, show that S(X1, Y1) = {(θ1, θ2) : θ1X1 + θ2Y1 ≤ c} is a confidence region for (θ1, θ2) with confidence coefficient 1 − (1 + c) exp(−c).

7. Suppose that the sequence of random variables X1, X2, . . . are independent Poisson with mean λ.
(a) Set up the Wald sequential probability ratio test for testing H0 : λ = 1 versus H1 : λ = 3, with both error probabilities set at 0.2.
(b) In using the above test for the composite hypotheses H0* : λ ≤ 1 versus H1* : λ ≥ 3, find approximations to the power function β(λ) and the average sample number function Eλ(N).
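For Question 7(a), the Wald boundaries with both error probabilities 0.2 are approximately A = (1 − 0.2)/0.2 = 4 and B = 0.2/(1 − 0.2) = 0.25, and each Poisson observation adds x log 3 − 2 to the log likelihood ratio. The sketch below (NumPy assumed) runs that SPRT on simulated data; it is meant only to illustrate the mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)
logA, logB = np.log(4.0), np.log(0.25)      # Wald approximations with alpha = beta = 0.2
lam_true = 1.0                              # simulate under H0: lambda = 1

llr, n_obs = 0.0, 0
while logB < llr < logA:
    x = rng.poisson(lam_true)
    llr += x * np.log(3.0) - 2.0            # log[ p_1(x) / p_0(x) ] for Poisson(3) vs Poisson(1)
    n_obs += 1

print("accept H1" if llr >= logA else "accept H0", "after", n_obs, "observations")
```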

ÍÆÁÎ ÊËÁÌ Ç ÄÁ ÇÊÆÁ ¸ Ë ÆÌ ÈÖÓ ÄÁ Ê Ð ØÝ ÁÆ Ê Ô ÖØÑ ÒØ Ó ËØ Ø ×Ø × ² ÔÔÐ Å ÌÀ Å ÌÁ Ä ËÌ ÌÁËÌÁ Ë ÉÍ Å Ö ÓÙ × ÓÙÐ Ö Ý Ë ÔØ Ñ Ö ¾¾ ¾¼¼¼ ¿¼ ß ½¾ ¿¼ ÕÙ ×Ø ÓÒ Ö ÙÐÐݺ Ò×Û Ö ÁÎ ´ µ ÕÙ ×Ø ÓÒ׺ ÐÐ ÕÙ ×Ø ÓÒ× ÖÖÝ ÕÙ Ð Û Øº ÇÇ ÄÍ Ã ½º ËÙÔÔÓ× Ø Ø × ÓÑ ØÖ Ö Ò ÓÑ Ú Ö Ð Û Ø ×Ù ×× ÔÖÓ Ð ØÝ Ô ¾ ¼ ½℄¸ ×Ó Ø Ø ÈÖ´ µ Ô´½   Ôµ  ½ ÓÖ ½¾ º Ú Ò ÜÔÐ Ø ÓÖÑÙÐ ÓÖ Ø ÍÅÈ ÙÒ × Ð Ú Ð « Ø ×Ø Ó À¼ Ô Ô¼ Ò×Ø À½ Ô Ô¼ Û Ò ¼ Ô¼ ½ Ò ¼ « ½º ´ ×ÙÖ ØÓ Ú Ö Ý Ø Ø ÝÓÙÖ Ò×Û Ö × ÍÅÈ ÙÒ × ¸ Ø Ö Ö ØÐÝ ÓÖ Ý ÕÙÓØ Ò Ò Ö Ð Ø ÓÖ Ñ׺µ ¾º ÄØ ½ ÖÓÑ Ø Ò ´ µ Ë ÓÛ Ø Ø ´Òµ Ñ Ü ½ ´ µ Ï Ø ÓÒ× ÕÙ Ò × Ó × ´ µ ¿º Í ¼ ℄ ×ØÖ ÙØ ÓÒ Û Ø ½ ÙÒ ÒÓÛÒº × ×ÙÆ ÒØ ÓÖ ÙØ × ÒÓØ ÓÑÔÐ Ø º Ò Ú ÓÖ ÔÓ ÒØ ×Ø Ñ Ø ÓÒ Ó ÜÔРغ ÄØ ½ ¾ Ò Ô Ò ÒØ Æ ´ ½ µ Ö Ò ÓÑ Ú Ö Ð ×¸ Ò Ð Ø Ú Æ ´¼ ½µ ÔÖ ÓÖ ×ØÖ ÙØ ÓÒº Ï Ø × Ø Ý × ×Ø Ñ ØÓÖ ÓÖ ÙÒ Ö ×ÕÙ Ö ÖÖÓÖ ÐÓ×× ÓÑÔ Ö Ø Ö × ÙÒ Ø ÓÒ Ó Ø Ý × ×Ø Ñ ØÓÖ Û Ø Ø Ø Ó Ø Ñ Ü ÑÙÑ Ð Ð ÓÓ ×Ø Ñ ØÓÖº º ´ µ Á × Æ ´¼ ½µ¸ ¬Ò Ø È Ó ´ µ ½ ´ÑÓ ½µ¸ Ø Ö Ø ÓÒ Ð Ô ÖØ Ó ¸ Ò ´ µ ¾ ℄¸ Ø ÒØ Ö Ô ÖØ Ó º ´ µ ËÙÔÔÓ× Ø Ø ´ ½ ¾ µ Ú Ú Ö Ø ÒÓÖÑ Ð ×ØÖ ÙØ ÓÒ Û Ø Þ ÖÓ Ñ Ò Ú ØÓÖ¸ ÙÒ Ø Ú Ö Ò ×¸ Ò ÓÖÖ Ð Ø ÓÒ º ÈÖÓÚ Ø Ø ½ ½ ¾ ´½   ¾ µ ½ ¾ ´ ½   ¾ µ Ö Ò Ô Ò ÒØ Æ ´¼ ½µº ÜÔÐ Ò ÓÛ Ø × Ö ×ÙÐØ Ñ Ø Ù× ØÓ Ò Ö Ø Ú Ö Ø ÒÓÖÑ Ð Ö Ò ÓÑ Ú Ö Ð × ¾ ¾ ´ ½ ¾ µ Û Ø Ñ Ò× ´ ½ ¾ µ¸ Ú Ö Ò × ½ ´ ÐÐ Ô Ö Ñ Ø Ö× ×Ô ¬ µ ¾ Ò ÓÖÖ Ð Ø ÓÒ ÖÓÑ Ò Ô Ò ÒØ ×Ø Ò Ö ÒÓÖÑ Ð Ö Ò ÓÑ Ú Ö Ð ×º ÈÄ Ë ÌÍÊÆ ÇÎ Ê .

ÓÒ× Ö Ø Ð Ò Ö ÑÓ Ð Ý ¬¼ · ¬½ Ü · ¸ ½ Ò¸ Û Ö Ø Ö Ò ÓÑ ÖÖÓÖ× ¾ Ú Þ ÖÓ Ñ Ò Ò Ö Ò Ô Ò Òظ ÙØ Î Ö´ µ Û Û Ø Û ³× Ú Ò Û Ø׺ ´ µ Ò Ø Û Ø Ð ×Ø ×ÕÙ Ö × ×Ø Ñ Ø × Ó ¬¼ ¬½ Ò ¾ º ´ µ Ò Ø Ú Ö Ò × Ó Ø ×Ø Ñ Ø × Ò ´ µº ´ µÁ Ø Û Ö ÒÓÖÑ ÐÐÝ ×ØÖ ÙØ ¸ ÛÓÙÐ Ø Ñ Ü ÑÙÑ Ð Ð ÓÓ ×Ø Ñ Ø × Ó ¬¼ ¬½ Ò ¾ Ó Ò Û Ø Ø Ó× Ø Ø ÝÓÙ Ó Ø Ò Ò ´ µ ÜÔÐ Ò Ö ­Ýº º º ÄØ ½ Ò Ò Ô Ò ÒØ ÑÑ ÛØ È ´ ܵ  ½ ÜÔ´  ܵ  ´ µ ´ µ Ò Ö Ø Ô Ö Ñ Ø Ö× Ó ÒØ Ö ×غ ´ µ ÏÖ Ø ÓÛÒ Ø Ð Ð ÓÓ ÙÒ Ø ÓÒ ÓÖ ´ µ¸ Ò Ó Ø Ò Ô Ö Ó ÕÙ Ø ÓÒ× ÓÖ Ø ´ÙÒÖ ×ØÖ Ø µ ÅÄ Ó ´ µº ´ µ ËÙÔÔÓ× Ø Ø × ÔÖ ÓÖ ×ØÖ ÙØ ÓÒ ÓÒ Ø ÔÓ× Ø Ú ÒØ Ö× Û Ø ÔÖÓ ÐØ × ÔÖ ÈÖ´ Öµ¸ Ö ½ ¾ º ËÙÔÔÓ× Ð×Ó Ø Ø Ò Ö Ò Ô Ò ÒØ ÔÖ ÓÖ ¸ Ò Ø Ø × Ú Ò Ø ÑÔÖÓÔ Ö ÔÖ ÓÖ Ò× ØÝ Ð Ñ ÒØ  ½ º Ç Ø Ò Ø Ý × ×Ø Ñ ØÓÖ ÓÖ ÙÒ Ö ×ÕÙ Ö ÖÖÓÖ ÐÓ×׺ ´ µ ÁÒ ×ÓÑ ÔÔÐ Ø ÓÒ× Ø ÛÓÙÐ Ö ×ÓÒ Ð ØÓ ÜÔ Ø Ø Ø × ÔÓ× Ø Ú ÒØ Ö¸ ÙØ ÝÓÙ Ñ Ø Û ÒØ ØÓ Ø ×º ÜÔÐ Ò ÓÛ Ø Ð Ð ÓÓ ÙÒ Ø ÓÒ Ò ´ µ Ñ Ø Ù× ØÓ Ø ×Ø Ø ÓÑÔÓ× Ø ÒÙÐÐ ÝÔÓØ × × Ø Ø × Ò ÒØ Ö¸ Ú ÐÙ ÒÓØ ×Ô ¬ º º ËÙÔÔÓ× Ø Ø ´ ½ Ò Ô Ò ÒØ Ö Ò ÓÑ Ú Ö Ð ×¸ ×Ù Ø Ø × Òµ Ö Ø È ´Ý µ ¬ ÜÔ´ ¬ Ý µ ´Ý ¼µ Û Ø ¬ Ü ÓÖ ÒÓÛÒ ÓÒ×Ø ÒØ× Ü½ ÜÒ º ÇÙÖ Ó Ø Ú × ØÓ Ø ÖÑ Ò ÓÛ ØÓ Ð ÙÐ Ø ÓÓ ½   « ÓÒ¬ Ò ÒØ ÖÚ Ð ÓÖ º ´ µ Ö ÙÐÐÝ ¬Ò Û Ø × Ñ ÒØ Ý ÓÒ¬ Ò ÒØ ÖÚ Ð ¸ Ò Û Ø ÓÓ Ñ Ò׺ ´ µ ÏÖ Ø ÓÛÒ Ø ÈÐÓ Ð Ð ÓÓ ÙÒ Ø ÓÒ ÄÒ ´ µ ÓÖ ¸ Ò × ÓÛ Ø Ø Ø Ð Ð ÓÓ ÕÙ Ø ÓÒ × Ò  ½   Ò ½ Ü ¼º ´ µ Ò ÔÔÖÓÜ Ñ Ø ×ÓÐÙØ ÓÒ ØÓ Ø ÓÒ¬ Ò × Ø ÔÖÓ Ð Ñ × ÓÙÒ Ý Ù× Ò Ï Ð ×³× Ì ÓÖ Ñ¸ Û × Ý× Ø Ø ¼ × Ø ØÖÙ Ú ÐÙ Ó Ø Ò Ï ¾ ×ÙÔ Ò ¼ ÄÒ ´ µ   ÄÒ ´ ¼ µ Ó ¾ ½ Ò ½ ÔÔÐÝ Ø × ØÓ Ø ÔÖ × ÒØ ÔÖÓ Ð Ñ¸ × ÓÛ Ò Ð ÖÐÝ ÓÛ Ø Ò ÔÓ ÒØ× Ó Ø ÒØ ÖÚ Ð Û ÐÐ Ð ÙÐ Ø º ´ Úµ Á× Ø Ö Ò Ü Ø ×ÓÐÙØ ÓÒ ØÓ Ø × ÓÒ¬ Ò ÒØ ÖÚ Ð ÔÖÓ Ð Ñ Á ×Ó¸ ÜÔÐ Ò Ö ­Ý Û Ø Ø ×º .


ÍÆÁÎ ÊËÁÌ Ç Å ÌÀ Å ÌÁ ÄÁ ÇÊÆÁ Ä ËÌ ÌÁËÌÁ Ì Ë ÆÌ ÈÖÓ ÄÁ Ë ÉÍ Ê Ð ØÝ ÁÆ Ê Å Ô ÖØÑ ÒØ Ó ËØ Ø ×Ø × ² ÔÔÐ ÅÓÒ Ý Ë ÔØ Ñ Ö ¾¿ ¾¼¼¾ ½¼ ¼¼ Ñ ß ½ ¼¼ ÔÑ ÈÐ × Ò×Û Ö ÁÎ ´ µ ÕÙ ×Ø ÓÒ׺ ÐÐ ÕÙ ×Ø ÓÒ× ÖÖÝ ÕÙ Ð Û Øº ÓÙ × ÓÙÐ Ö ÕÙ ×Ø ÓÒ Ö ÙÐÐݺ ÁØ × Ô ÖÑ ØØ ØÓ Ù× Ø ÓÓ ËØ Ø ×Ø Ð ÁÒ Ö Ò Ý × ÐÐ ² Ö Öº ½º ÄØ ½ ÖÓÑ Ø ÜÔÓÒ ÒØ Ð ×ØÖ ÙØ ÓÒ Û Ø Ñ Ò ¸ Ò Ô Ò ÒØ Ò Ó ½ Ö ÖÓÑ Ø ÜÔÓÒ ÒØ Ð ×ØÖ ÙØ ÓÒ Û Ø Ñ Ò  ½ º ¬Ò Ò Û ÈÒ   ½ Ò Ò × Ñ Ð ÖÐÝ ÓÖ º ½ ´ µ Ë ÓÛ Ø Ø Ø Ô Ö Ë ´ µ × Ó ÒØÐÝ Ñ Ò Ñ Ð ×ÙÆ ÒØ ÓÖ º ´ µ ÈÖÓÚ Ø Ø Ë × ÒÓØ ÓÑÔÐ Ø º ½¾ ´ µ Ë ÓÛ Ø Ø Ø ÅÄ ÓÖ × º ´ Úµ Ö Ú Ø Ú Ö Ò Ó Ø ×ÝÑÔØÓØ ÒÓÖÑ Ð ×ØÖ ÙØ ÓÒ Ó Ò½ ¾ ´   µº ¾º ÄØ ½ ÖÓÑ Ø ÈÓ ××ÓÒ ×ØÖ ÙØ ÓÒ Û Ø Ñ Ò º ¬Ò Ò ÜÔ´ ¾ µº Ï Û × ØÓ ×Ø Ñ Ø Ù× Ò Ó × ÖÚ Ø ÓÒ× Ó º Ì ×Ø Ø ×Ø ½ Ò ÈÒ × ÓÑÔÐ Ø ×ÙÆ ÒØ ÓÖ ¸ Ò Ò ÓÖ ÝÓÙ Ñ Ý ××ÙÑ Ø ×º ½ ´ µ Ì × ÑÔÐ ×Ø Ñ ØÓÖ Í Ø × Ø Ú ÐÙ ½ ¼ Ò Ú ÐÙ ¼ ÓØ ÖÛ × º ½ ¾ ÈÖÓÚ Ø Ø Í × ÙÒ × º ´ µ Ý ×Ø ÖØ Ò Û Ø Í ÓÖ ÓØ ÖÛ × ¸ × ÓÛ Ø Ø Ì ½   ¾Ò ½ × Ø ÍÅÎÍ ÓÖ º ´ µ Í× Ø ×ØÖ ÙØ ÓÒ Ó ØÓ Ð ÙÐ Ø ´Ì Ö µ ÓÖ Ö ½ ¾¸ Ò Ù× Ø Ö ×ÙÐØ× ØÓ ÔÖÓÚ Ø Ø Ì × ÓÒ× ×Ø ÒØ Ò ×ÝÑÔØÓØ ÐÐÝ Æ ÒØ Ö Ð Ø Ú ØÓ Ø Ö Ñ Ö¹Ê Ó ÐÓÛ Ö ÓÙÒ º ¿º  ½ ÜÔ  ´Ü   µ Ì Ø ×¸   × ÜÔÓÒ ÒØ Ð Û Ø Ñ Ò º ¬Ò Ø ÓÖ Ö Ú ÐÙ × Ó ½ Òº ´ µ ÈÖÓÚ Ø Ø ´Ü µ ËÙÔÔÓ× Ø Ø ½ Ò Ö ÛØ È Ü ´½µ ¼ ´¾µ ¡¡¡ ´Òµ ØÓ ´Ò   · ½µ ´ µ   ´  ½µ ¾ Ò ½ Ò ´½µ   Ö Ò Ô Ò ÒØ ÜÔÓÒ ÒØ Ð Û Ø Ñ Ò º ´ µ ÈÖÓÚ Ø Ø Ì × Ñ Ò Ñ Ð ×ÙÆ ÒØ ÓÖ ´ µ¸ Ò Ú Ö Ý Ø Ø Ì × ´½µ ÈÒ Ó Ë ÓÒ ¹ÓÒ ÙÒ Ø ÓÒ Ó Ë ´½µ ¾ º Ï Ø × Ø Ó ÒØ È ÈÄ Ë ÌÍÊÆ ÇÎ Ê .

ÙÒ × ´ µ Í× Ö ÓÒ À º ÓÖ Ø Ø Ò ÑÓ Ð × Ö Ò ÉÙ ×Ø ÓÒ ¿¸ Û Û × ØÓ Ø ×Ø À¼ ¼ Ú Ö×Ù× ¼¸ Û Ø ÓÑÔÐ Ø ÐÝ ÙÒ ÒÓÛÒº Ì Ö Ö × Ú Ö Ð Û Ý× ØÓ Ó Ø Ò Ø ÍÅÈ ÈÒ Ø ×ظ ÒÒ Ò Ø Ö Û Ø ½ Ò ÓÖ Û Ø Ë ´½µ ¾ º Ø Ö ÒÚ Ö Ò ÓÖ Ò Ö Ð Þ Ð Ð ÓÓ Ö Ø Ó ØÓ Ö Ú Ø ÙÒ × Ø ×Ø Ö Ø Ð Ê« Ü Ü´½µ Ü « ´ µ Í× ÉÙ ×Ø ÓÒ ¿ ´ µ ØÓ ÜÔÖ ×× Ø ×Ø Ò Ö Ø Ð ×ØÖ ÙØ ÓÒº ËÙÔÔÓ× Ø Ø ½ Ò Ö Ô Ö ÑØÖ × ÑÔÖÓÔ Ö ÔÖ ÓÖ È Ò ÓÖÑÙÐ ÓÖ Ø Ò ÔÓ ÒØ× Ó Ø ÜÒ º º º Ö Ø Ð Ð Ú Ð « Ò Ø ÖÑ× Ó Ø ½   « ÕÙ ÒØ Ð Ó Û Ø Ø ÍÒ ÓÖÑ ×ØÖ ÙØ ÓÒ ÓÒ   ¼¸ ÓÖ ×ÓÑ ¼º ¢ ´ µ »  ´ ·½µ ½   « ÀÈ ÒØ ÖÚ Ð ÓÖ Ú Ò ½ ܽ ℄º Ì Ò Ö ×Ø Ñ Ø Ò Ø ÒÓÑ Ð Ô Ö Ñ Ø Ö ¸ ¼ ½¸ × ÓÒ Ö Ò ÓÑ × ÑÔРܽ ÜÒ ÖÓÑ Ø ÖÒÓÙÐÐ ×ØÖ ÙØ ÓÒ ÈÖ´ Ü µ Ü ´½   µ½ Ü Ü ¼ ½ Ì ¾  µ º Ø ÐÓ×× ÙÒ Ø ÓÒ Ä´ µ ´ ´½  µ ´ µ Ë ÓÛ Ø Ø Æ ´Ü½ ÜÒ µ Ü × Ø Ý × ×Ø Ñ Ø ÓÖ Û Ø Ö ×Ô Ø ØÓ Ø ÙÒ ÓÖÑ ÔÖ ÓÖ ×ØÖ ÙØ ÓÒ ÓÒ ¼ ½℄ ÓÖ º ´ µ ÜÔÐ Ò Û Ý Æ ´Ü½ ÜÒ µ × Ð×Ó Ø Ñ Ò Ñ Ü ×Ø Ñ Ø ÓÖ º ÓÒ× Ö Ò Ô Ò ÒØ Ó × ÖÚ Ø ÓÒ× ½ ¾ ¿ ÖÓÑ Ø Ô Ö Ñ Ø Ö× Ò ¾ º À Ö Ö Ø Ö ÔÓ×× Ð ×Ø Ñ Ø × Ó Ø º ÓÒ× ÆÓÖÑ Ð Ú Ö Ò ×ØÖ ÙØ ÓÒ Û Ø ¾ ¾ ½ ½ ¿ ´ ¿ ½   µ¾ ¾ ¾ ´ µ Ï Ó Ø × ×Ø Ñ ´ µ Ï ×Ø Ñ ØÓÖ × ´ µÏ Ø ×Ø Ö Ð ØÚ ´ Úµ Ï Ó Ø Ø Ö Ø ÐÓ×× ÙÒ Ø ÓÒ ØÓÖ× × ÙÒ × ÓÖ ¾ Ø ×Ñ ÐÐ ×Ø Ú Ö Ò ¾ Ú Ö×Ù× Æ Ò Ý Ó ¾ ×Ø Ñ ØÓÖ× ½ ¸ ¾ Ò ½ ¿ ´ ¾ ½   ¾ ¿ ¿Ó ¾ µ¾ µ¾ ¾ ¿ ´ ½   ¾ µ¾ ¾ ÑÓÖ Ø Ò ÓÒ ºµ ´ÁØ ÓÙÐ ×Ø ×Ñ ÐÐ ×Ø Ö × Û Ò Ù× Ò Ä´ µ ´   ÈÄ Ë ÌÍÊÆ ÇÎ Ê .

ËÙÔÔÓ× Ø Ø ØÛÓ ÜÔ Ö Ñ ÒØ× Ö ÖÖ ÓÙغ ÜÔ Ö Ñ ÒØ ½ Ö Ø × Ø ´Ü Ý µ × Ö Ý Ø ÑÓ Ð ¬Ü · ¸ ½ Ò¸ Û Ö × ÜÔ Ö Ñ ÒØ ¾ Ö Ø × Ø ´Ü Þ µ × Ö Ý Ø ÑÓ Ð « · ¬Ü · Æ ¸ ½ Òº Ì ÖÖÓÖ× Ò Æ Ö ¾ ÑÙØÙ ÐÐÝ ÙÒ ÓÖÖ Ð Ø Æ ´¼ µº ÆÓØ Ø Ø Ø Ü Ú Ð٠׸ Ø × ÑÔÐ × Þ ¸ Ø Ö Ö ×× ÓÒ ×ÐÓÔ ¸ Ò Ø ÖÖÓÖ Ú Ö Ò Ö Ø × Ñ ÓÖ ÓØ ÜÔ Ö Ñ ÒØ׺℄ ´ µ Ú ÓÖÑÙÐ ÓÖ ´ µ Ø Ð ×ع×ÕÙ Ö × ×Ø Ñ ØÓÖ ¬½ Ó ¬ ÖÓÑ ÜÔ Ö Ñ ÒØ ½¸ Ò ´ µ Ø Ð ×ع×ÕÙ Ö × ×Ø Ñ ØÓÖ× «¾ Ò ¬¾ Ó « Ò ¬ ÖÓÑ ÜÔ Ö Ñ ÒØ ¾º ´ µÇ Ø ÒØ ÄÍ ÓÖ « × ÓÒ Ø Ö ×ÙÐØ× Ó ÇÌÀ ÜÔ Ö Ñ ÒØ׺ ´ µ Ò ÙÒ × ×Ñ Ø Ó ¾ Ò Ó Ø Ò ÖÓÑ Ø ÓÚ Ö ÐÐ Ö × Ù Ð ×ÙÑ Ó ×ÕÙ Ö × ÓÖ Ø ÓÑ Ò Ø ¸Û Ò × ÓÛÒ ØÓ ÕÙ Ð º É Ï Ø Ö Ø Ò ½   ¬½ Ü Ò Ø ¾ · Ò ½   «¾   ¬¾Ü ¾ · ¬½   ¬¾ ¾ Ú ÐÙ × Ó Ö ×Ó Ö ÓÑ ÓÖ É .

½º Á ½ Ò Ö × ÑÔÐ Ö Ò ÓÑ × ÑÔÐ ÖÓÑ Ø « ´Üµ ×ØÖ ÙØ ÓÒ Û Ø Ò× ØÝ « ½ ´½ · ܵ  «·½ « ´ µ ´ µ Ò Ø Ñ Ü ÑÙÑ Ð Ð ÓÓ ×Ø Ñ Ø Ó «º Ò Ø Ö Ñ Ö¹Ê Ó ÐÓÛ Ö ÓÙÒ ÓÒ Ø Ú Ö Ò Ó ÙÒ ÓÙ Ò Ù× Ø ÓÐÐÓÛ Ò ÜÔ Ø Ø ÓÒ× ´ÐÓ ´½ · µµ × ¾«¾ ×Ø Ñ Ø × Ó «º « ÐÓ ´½ · µ℄¾ ´ µ Á× Ø ¾º Á Ñ Ü ÑÙÑ Ð Ò Ð ÓÓ ×Ø Ñ ØÓÖ Ø ×Ø ×Ø Ñ ØÓÖ Ó « ×ØÖ ÙØ ÓÒ Û Ø ¼ « ¼ Ò× ØÝ ½ Ö × ÑÔÐ Ö Ò ÓÑ × ÑÔÐ ÖÓÑ Ø « ´Üµ « ½ ´½ · ܵ  «·½ « Ü ´ µ Ò Ø Ñ Ü ÑÙÑ Ð Ð ÓÓ ×Ø Ñ Ø Ó «º ´ µ Ë ÓÛ Ø Ø Ø ÅÄ Ú × Ø Ö Ñ Ö¹Ê Ó ÐÓÛ Ö ÓÙÒ º ¿º ÓÒ× Ð ØÝ ÓÖ Ü ´ µ Ö Ö Ò ÓÑ × ÑÔÐ Ó Ò Ó × ÖÚ Ø ÓÒ× ½ ×ØÖ ÙØ ÓÒ× ´   ½µÜ È Ü Ü·½ ¼ ½ ¾ Ò ½º Ò Ø Ñ Ü ÑÙÑ Ð Ð ÓÓ ×Ø Ñ ØÓÖ Ó º Ò ÖÓÑ Ø × Ö Ø ÔÖÓ ¹ ´ µ Ë ÓÛ Ø Ø Ø ÅÄ Ú × Ø Ö Ñ Ö¹Ê Ó ÐÓÛ Ö ÓÙÒ º ´ ÓÙ Ò Ù× Ø Ø× Ø Ø ´ µ   ½ Ò Î Ö´ µ ´½   µºµ º ÓÒ× Ö Ø Ø Û Ø Ô Ö Ñ Ø Ö× ½ Ò ¾ ¿ Ò Ô Ò ÒØ Ó × ÖÚ Ø ÓÒ× ÖÓÑ ÒÓÖÑ Ð ×ØÖ ÙØ ÓÒ ¾ º À Ö Ö Ø Ö ÔÓ×× Ð ×Ø Ñ Ø × Ó Ø Ú Ö Ò ¾ ¾ ½ ½ ¿ ´ ¿ ½   ܵ¾ ¾ ¾ ½ ¿ ´ ¾ ½ ×   Ü µ¾ ´ÁØ ÓÙÐ ¾ ¿ ´ ½   ¾ µ¾ ¾ ÑÓÖ Ø Ò ÓÒ ºµ ´ µ Ï Ó Ø × ×Ø Ñ ØÓÖ× × ÙÒ ´ µ Ï ×Ø Ñ ØÓÖ × Ø ×Ñ ÐÐ ×Ø Ú Ö Ò ¾ Ú Ö×Ù× ¾ ´ µ Ï Ø × Ø Ö Ð Ø Ú Æ Ò Ý Ó ¾ ¿ ´ µ Ï ×Ø Ñ ØÓÖ × Ø ×Ñ ÐÐ ×Ø Ö × Ú Ö×Ù× Ø ÐÓ×× ÙÒ Ø ÓÒ µ¾ Ä´ ¾ ¾ µ º ËÙÔÔÓ× Û Ú Ò Ó × ÖÚ Ø ÓÒ× ÖÓÑ ´   ¾ Æ´ ¾µ Ò Û Û ÒØ ØÓ ×Ø Ñ Ø ¾º .

´ µ Ë ÓÛ Ø Ø ×¾ Ò × ×ÙÆ ÒØ ÓÖ ×Ø Ñ Ø Ò ¾ º ´ µ Ò Ø ÓÒ×Ø ÒØ ×Ù Ø Ø Ø ½ ´   ܵ¾ ×Ø Ñ ØÓÖ ¾ ×¾ ¾   ¾ µ¾ ´ ÑÒÑÞ ×Ø Ö× ÓÖ Ø ÐÓ×× ÙÒ Ø ÓÒ Ä ´ µ Á× Ø × Ñ Ò Ñ Ü ×Ø Ñ Ø Ó ¾ ¾ ¾ ÓÖ Ä´ ¾ ¾µ .

ÇÖ Ò Ð Ú Ö× ÓÒ Ó ¿ Ò × Ò ÓÑ Ò ÕÙ ×Ø ÓÒ ËÙÔÔÓ× Ø Ø ½ Ö µ ÛØ È ´Ü  ½ ÜÔ  ´Ü   µ ¬Ò ´ Ü ´½µ µ ×Ë Ì Ø ×¸   × ÜÔÓÒ ÒØ Ð Û Ø Ñ Ò º Ø ÓÖ Ö Ú ÐÙ × Ó ½ Òº ´ µ ÈÖÓÚ Ø Ø Ø Ñ Ò Ñ Ð ×ÙÆ ÒØ ×Ø Ø ×Ø ÓÖ ´ µ ÈÖÓÚ Ø Ø ´¾µ ´½µ ¾ ¡¡¡ ÈÒ ´Òµ ØÓ ½ º ½ Ò ´½µ   ´Ò   · ½µ ´ µ   ´  ½µ Ò Ö Ò Ô Ò ÒØ ÜÔÓÒ ÒØ Ð Û Ø Ñ Ò º ´ µ Ï Û × ØÓ Ø ×Ø À¼ ¼ Ú Ö×Ù× À ÈÒ Ø Ø Ë¼ × Ñ Ò Ñ Ð ×ÙÆ ÒØ ÓÖ ½ È Ó Ë Ú Ò Ë¼ Ø × Ë Ë¼ ´Þ ¼¸ Û Ø ÓÑÔÐ Ø ÐÝ ÙÒ ÒÓÛÒº Ë ÓÛ ÙÒ Ö À¼ ¸ Ò ÔÖÓÚ Ø Ø Ø ÓÒ Ø ÓÒ Ð Ø¼ Ø µ Ò´Ò   ½µ ´Ø   ÒÞ µÒ ¾ ´Ø   Ò µÒ ½ ؼ Ø Þ Ø Ò ×Þ « Ò ¼ ÓÖ Ø¼ غ À ÒØ Ù× ´ µº℄ ´ Úµ À Ò Ö Ú Ø ÍÅÈ ÙÒ × Ø ×Ø Ó À¼ Ú Ö×Ù× À ¸ Ò × ÓÛ Ø Ø Ø Ö Ø Ð Ö ÓÒ × Ó Ò Ê« ´Ü½ ÜÒ µ Ü´½µ ½   «½ ´Ò ½µ Ü ÐØ ÖÒ Ø Ú ÐÝ ÜÔÖ ×× Ø Ö Ø Ð Ú ÐÙ Ò Ø ÖÑ× Ó ×ØÖ ÙØ ÓÒº ´Úµ Ë ÓÛ Ø Ø ÒÚ Ö Ò Ñ Ø Ó × Ð ØÓ Ø × Ñ Ø ×غ .


UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Friday September 9 2005, 9:00 – 12:00+

Answer SIX (6) of the ten questions. All questions carry equal weight.

1. (i) Suppose that the random variables X = (X1, X2, . . . , Xn) have joint distribution depending on parameter θ, and that S = s(X) is a complete sufficient statistic for θ. If E{a(X) | θ} = c(θ) for all θ, and E{b(S) | θ} = c(θ), show that E{a(X) | S} = b(S).
(ii) If X1, . . . , Xn are independent Poisson random variables, one statistic used to test the hypothesis H that the n means are equal is T = Σ_{j=1}^n (Xj − X̄)²/X̄, where X̄ = n⁻¹ Σ Xj. Using part (i) or otherwise, show that E(T | H) = n − 1.

2. Suppose that X is N(µ, 1), and define θ = µ².
(i) Find a simple unbiased estimator for θ.
(ii) Show that the estimator in (i) is inadmissible under the squared-error loss function.
(iii) If µ has the N(ξ, τ²) prior distribution, what is E(θ | X)?

3. Suppose that Xij are independent Poisson random variables for i = 1, 2 and j = 1, 2. The mean of Xij is µij, which can be represented by the loglinear model log µij = α + βi + γj with Σ βi = Σ γj = 0. Find a suitable prediction for X22 given one observation of the triple (X11, X12, X21).

4. Suppose that X1, X2, . . . , Xn are IID random variables each with the PDF f(x | θ) = exp{−(x − θ)}, x > θ.
(a) Show that this family has the monotone likelihood ratio (MLR) property in T = X(1).
(b) Write down the UMP level α test for testing H0 : θ = θ0 versus HA : θ > θ0. Obtain the formula for the exact level α critical value.
(c) Derive the uniformly most accurate (UMA) lower 1 − α confidence bound for θ.

5. Suppose that X1, . . . , Xn are IID with mean ξ and that Y1, . . . , Yn are IID with mean η. Define θ = ξ/η and let T = X̄/Ȳ be chosen as estimator of θ.
(i) Use the CLT and the Delta Method to obtain a Normal approximation for the distribution of n^{1/2}(T − θ).
(ii) It may be that the exact variance of T does not exist. Give one example each of situations where this is (a) true, (b) untrue.

PLEASE TURN OVER
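For Question 5(i), the delta-method answer can be sanity-checked by simulation. The sketch below assumes, purely for illustration, that the X's and Y's are independent normal samples; under that assumption the approximate variance of n^{1/2}(T − θ) is (σx² + θ²σy²)/η², and the simulated variance should be close to it.

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 200, 20_000
xi, eta, sx, sy = 2.0, 4.0, 1.0, 1.5          # invented means and standard deviations
theta = xi / eta

x = rng.normal(xi, sx, (reps, n))
y = rng.normal(eta, sy, (reps, n))
t = x.mean(axis=1) / y.mean(axis=1)

print(np.var(np.sqrt(n) * (t - theta)))       # simulated variance
print((sx**2 + theta**2 * sy**2) / eta**2)    # delta-method approximation (independent case)
```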

6. For fixed constants xi, i = 1, . . . , n, the random responses Yi are given by Yi = α + βxi + Zi, where the Zi's are independent with means zero and variances σ². The parameters α and β will be estimated by Least Squares. If the values of the xi's are constrained to the interval [−c, c], find the choice of x1, . . . , xn that maximizes the accuracies of the two estimators. [You can take n to be an even number if it helps.]

7. The pairs (Xj, Yj), j = 1, . . . , n, are independent bivariate Normal with mean vector 0, Var(Xj) = σ1², Var(Yj) = σ2², Cov(Xj, Yj) = ρσ1σ2, with σ1, σ2, ρ unknown. We wish to test the null hypothesis H0 : σ1 = σ2 versus HA : σ2 > σ1.
(i) Transform Xj, Yj to Sj = Xj + Yj, Dj = Xj − Yj. What is the joint distribution of (Sj, Dj)?
(ii) Using the transformation in (i) or otherwise, construct the best test that you can for H0 versus HA.

8. X is a single non-negative integer random variable with PMF p(x). Find the most powerful test of level 0.019 for testing H0 : p = p0 versus H1 : p = p1, where p0 is the Poisson distribution with mean 1 and p1 is the geometric distribution with mean 1 [i.e. p1(x) = (1/2)^{x+1}].

9. An experimenter wants to estimate the rate µ of a homogeneous Poisson process of events. She decides to count the number of events N in the time interval [0, t] – which will be Poisson with mean µt – and then record the next m inter-event times Y1, . . . , Ym – which will be independent Exponential.
(i) Compute separately the MLEs µ̂N and µ̂Y based on N and on Y1, . . . , Ym, respectively.
(ii) Calculate the mean of µ̂Y.
(iii) Is it possible to obtain a best unbiased combination of µ̂N and µ̂Y? Explain.

10. Suppose that X1, X2, . . . , Xn are IID random variables with density fθ(x) = 1/(θx²) for x ≥ 1/θ, and 0 otherwise. Assume a prior distribution on θ that is uniform on [0, 1].
(i) What is the posterior density of θ?
(ii) What is the posterior mode of θ?
(iii) Find the Bayes estimator of θ under squared-error loss.

ABBREVIATIONS USED: IID = independent and identically distributed; PDF = probability density function; PMF = probability mass function; CDF = cumulative (probability) distribution function; MLE = maximum likelihood estimator; UMP = uniformly most powerful; UMA = uniformly most accurate; CLT = Central Limit Theorem.

University of California Santa Barbara
Department of Statistics and Applied Probability
Mathematical Statistics Qualifying Exam
Sept. 24, 2007

Please answer any 5 of these 8 questions.

1. Let X1, X2, . . . , Xn be independent and identically distributed random variables, each with uniform distribution on the interval [θ, 2θ] with θ > 0. Let X(1) and X(n) denote the smallest and largest of the Xi's.
(a) Write down the likelihood function L(θ) for θ and sketch the graph of L(θ).
(b) Show that T = X(n) and R = X(n)/X(1) are jointly minimal sufficient for θ.
(c) Verify that T/2 is the MLE of θ.
(d) Note that 1 ≤ R ≤ 2. What would you conclude about θ if R > 2?

2. Let X1, . . . , Xn be IID exponential with pdf f(x | θ) = θe^{−θx} for x > 0, and 0 for x ≤ 0.
(a) Show that f(x | θ) has the monotone likelihood ratio (MLR) property in T = −Σ_{i=1}^n Xi.
(b) Find a UMP test for testing H0 : θ ≥ θ0 versus H1 : θ < θ0 and find explicitly the critical value needed to insure a test of size α.
(c) Find a (1 − α) uniformly most accurate one-sided upper confidence interval for θ.
(d) If θ has the prior density π(θ) = δθ⁻² for θ > δ and 0 for θ ≤ δ, where δ > 0 is given, find the posterior pdf of θ and calculate the (1 − α) highest posterior density (HPD) credible interval for θ.

3. Independent lifetimes Y1, . . . , Yn have the common Gamma PDF f(y | α, β) = β^{−α} y^{α−1} exp[−y/β]/Γ(α). The special case α = 1 corresponds to the exponential distribution, and this is a null hypothesis to be tested against the alternative hypothesis that α > 1.
(a) Obtain an expression for the log likelihood function of (α, β).
(b) Under the null hypothesis, the mean and standard deviation of Y are equal, so a simple intuitive test might be based on the ratio of the sample mean and standard deviation, W = Ȳ/s. Show that √n(W − 1) converges in distribution to a standard normal under the null.
(c) Use the result from (3b) to suggest an approximate size α critical region based on W.
(d) Why might you expect that the test based on W is not the most powerful test available?

4. Suppose that we observe n independent pairs (Xi, Yi) where the Xi ∼ N(α, 1) and then, conditionally, Yi | Xi ∼ N(βXi, 1).
(a) Calculate the correlation of X and Y.
(b) Find a minimal sufficient statistic for estimating (α, β).
(c) Is this a complete statistic? Justify your response.

5. Let X1, . . . , Xn be IID observations from a distribution with density f(x | θ) for 0 < xi < 1 and θ > 0.
(a) Find the distribution of W = −log(∏_{i=1}^n Xi).
(b) Show that √n([∏_{i=1}^n Xi]^{1/n} − e^{−θ}) converges in distribution to a normal distribution with mean 0.
(c) Calculate the asymptotic variance of the estimator T = [∏_{i=1}^n Xi]^{1/n}.

6. Let X1, . . . , Xn; Y1, . . . , Yn be independent normal random variables with E Xi = µx, Var(Xi) = 1, E Yi = µy, Var(Yi) = 1.
(a) Find the MLE's of µx and µy.
(b) Find the GLRT of size α for the following tests:
i. H0 : µx + µy = 0 versus H1 : µx + µy ≠ 0;
ii. H0 : (µx, µy) = (0, 0) versus H1 : (µx, µy) ≠ (0, 0).
(c) Write down the appropriate large sample approximation to the distribution of the likelihood ratio in each of these tests.

7. Suppose that X1, . . . , Xn are IID with common pmf p(x, θ), x = 1, 2, . . . . We want to estimate τ = θ.
(a) Verify that this pmf belongs to an exponential family.
(b) Show that T = Σ_{i=1}^n Xi is sufficient for θ. Is it complete?
(c) Find the pmf of Y1 = (X1 − 1)/2.
(d) Using the result in (7c) or otherwise find the UMVUE for θ.

8. Suppose that the observations Yij are generated by a mixture distribution that first generates d independent exponential random variables Li that are not observed and all have mean β. Then for each i, there is a sample of ni independent Poisson random variables Yi1, Yi2, . . . , Yini with mean Li. Calculate the mean and variance of Y11.
(a) Calculate a method of moments estimator of β.
(b) Find the joint pmf for the Yij.
(c) Calculate the MLE for β.

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics and Applied Probability
Mathematical Statistics Qualifying Exam
Friday, September 22, 2006, 10:00 AM – 1:00 PM

Answer any 5 out of the 8 Questions.

1. Suppose that X1, . . . , Xn (n ≥ 2) are independent copies of the Bernoulli random variable X with Pr(X = 1) = p = 1 − Pr(X = 0). Obtain the uniformly minimum variance unbiased estimate of θ = p².

2. In life-testing and reliability, it is often assumed that the life-time X follows a Weibull(γ, β) distribution, with the Cumulative Distribution Function (cdf) F(x) = 1 − exp(−x^γ/β), x ≥ 0. Assuming that γ is KNOWN,
(a) find a (i) method of moments estimator and (ii) maximum likelihood estimator (MLE) for β based on an iid sample of size n.
(b) Discuss briefly the properties of these estimator(s), namely, unbiasedness, consistency, sufficiency and efficiency (whether the Cramer-Rao bound is attained). Does this attain the Cramer-Rao bound? If the bound is at fault, are there more precise bounds?
(c) Find also the MLE for P(X > 2) when X has the above cdf.

3. Suppose that we have n independent observations U1, . . . , Un from a uniform distribution on the interval [0, θ] for some unknown θ > 0. If we have an exponential prior distribution on the parameter θ with mean 10, i.e. π(θ) = (1/10) e^{−θ/10}, then what is the posterior mode?

4. Suppose that the pair of random variables (X, Y) has the joint density
f(x, y) = [Γ(α + β + γ) / (Γ(α)Γ(β)Γ(γ))] x^{α−1} y^{β−1} (1 − x − y)^{γ−1}  for x > 0, y > 0, x + y < 1.
(a) Find the joint density of S = X + Y and R = X/(X + Y).
(b) Assume that α and β are known. From n i.i.d. copies (Xi, Yi) of the pair (X, Y), suppose we want to estimate γ.
(i) Show that ∏_{i=1}^n (1 − Si) is a complete sufficient statistic for estimating γ.
(ii) Show that ∏_{i=1}^n Ri is an ancillary statistic for estimating γ. What does Basu's theorem say in this context?
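Question 1 can be checked empirically: with T = ΣXᵢ, the statistic T(T − 1)/{n(n − 1)} is unbiased for p² (since E[T(T − 1)] = n(n − 1)p²) and is a function of the sufficient statistic T. The sketch below (NumPy assumed; the value of p is invented) verifies the unbiasedness by simulation.

```python
import numpy as np

rng = np.random.default_rng(7)
n, p, reps = 10, 0.3, 200_000

t = rng.binomial(n, p, size=reps)                  # T = sum of the n Bernoulli trials
estimate = t * (t - 1) / (n * (n - 1))             # unbiased for p^2 and a function of T

print(estimate.mean(), p**2)                       # should agree up to Monte Carlo error
```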

5. Consider matrices of the form A = a·In + b·En, where a and b ≠ 0 are scalar constants, In is the n×n identity matrix, and En is the n×n matrix with all entries 1. Let Z be an n×1 column vector with the Nn(0, In) distribution.
(a) Show that for matrices A of the above form, Q = Z′AZ has a chi-squared distribution if and only if either (i) a = 1 and b = −(1/n) (call this resulting matrix A1), OR (ii) a = 0 and b = (1/n) (call the resulting matrix A2). Find the degrees of freedom of the chi-squared in each case.
(b) Show that Q1 = Z′A1Z and Q2 = Z′A2Z are independent using results on quadratic forms. Relate these quadratic forms to the sample mean and sample variance of the components of Z.

6. The 2n random variables Xij are independent Poisson with means µij, i = 1, . . . , n and j = 1, 2. We believe that µi1 = ai, µi2 = βai, and we want to test H0 : β = 1 versus HA : β > 1.
(a) Show that S = (X11 + X12, . . . , Xn1 + Xn2) is minimal sufficient for a = (a1, . . . , an) under H0.
(b) Derive the joint distribution of {Xij, i = 1, . . . , n, j = 1, 2} given S = s and use this to derive the UMPU test of H0 versus HA.
(c) Outline an appropriate test of the model form, i.e., a test of whether or not the belief µi1 = ai, µi2 = βai is correct.

7. Let X1, . . . , Xn be independent and identically distributed (i.i.d.) observations from the exponential pdf pθ(x) = (1/θ) e^{−x/θ}, x > 0, θ > 0.
(a) Explain if there is a UMPU test for testing H0 : θ = θ0 versus the 2-sided alternative HA : θ ≠ θ0. If there is, use this to find the Uniformly Most Accurate Unbiased confidence set of level (1 − α). What does "Unbiased" mean in the case of a confidence set?
(b) What is a good pivot in this case for constructing a confidence interval for θ? Use this pivot to construct the shortest level (1 − α) confidence interval for θ. (Hint: Recall this is a scale parameter family and use, if you can, the fact that 2 Σ Xi / θ has a chi-square distribution.)
(c) Find the MLE of θ and the Fisher information i(θ) in a single observation. Using these, find a large-sample confidence interval for θ.

8. Find the Generalized Likelihood Ratio Test for testing H0 : σ = σ0 versus the one-sided alternative HA : σ < σ0 based on a sample X1, . . . , Xn from N(µ, σ²) if (i) µ is known and (ii) µ is unknown. Do either of these tests result in a UMP or UMPU procedure?
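Question 5 is easy to probe numerically. With A1 = I − E/n and A2 = E/n, the quadratic forms reduce to Q1 = Σ(Zᵢ − Z̄)² and Q2 = nZ̄²; the sketch below (NumPy assumed) checks those identities on simulated draws and compares the simulated means with the degrees of freedom n − 1 and 1.

```python
import numpy as np

rng = np.random.default_rng(8)
n, reps = 6, 50_000
I, E = np.eye(n), np.ones((n, n))
A1, A2 = I - E / n, E / n

Z = rng.standard_normal((reps, n))
Q1 = np.einsum("ri,ij,rj->r", Z, A1, Z)
Q2 = np.einsum("ri,ij,rj->r", Z, A2, Z)

zbar = Z.mean(axis=1)
print(np.allclose(Q1, ((Z - zbar[:, None]) ** 2).sum(axis=1)))   # Q1 = sum (Z_i - Zbar)^2
print(np.allclose(Q2, n * zbar**2))                              # Q2 = n * Zbar^2
print(Q1.mean(), Q2.mean())                                      # approximately n - 1 and 1
```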

ÍÆÁÎ ÊËÁÌ Ç ÄÁ ÇÊÆÁ ¸ Ë ÆÌ ÈÖÓ ÄÁ Ê Ð ØÝ ÁÆ Ê Ô ÖØÑ ÒØ Ó ËØ Ø ×Ø × ² ÔÔÐ Å ÌÀ Å ÌÁ Ä ËÌ ÌÁËÌÁ Ë ÉÍ Å Ö Ò×Û Ö Ó Ø ÕÙ ×Ø ÓÒ׺ ÓÓ ËØ Ø ×Ø Ð ÁÒ Ö Ò Ý ÓÖ ÒÝ ÒÙÑ Ö Ð ÛÓÖ º ½º Ý Ë ÔØ Ñ Ö ½ ¾¼¼ ¼¼ ß ½¾ ¼¼ ÐÐ ÕÙ ×Ø ÓÒ× ÖÖÝ ÕÙ Ð Û Øº ÁØ × Ô ÖÑ ØØ ØÓ Ö Ö ØÓ Ø × ÐÐ Ò Ö Ö¸ ÙØ ÒÓØ Ò Ð× º ÓÙ Ñ Ý Ù× Ð ÙÐ ØÓÖ ÖÓ Ù×Ø Ú Ö× ÓÒ Ó Ø × ÑÔÐ ¬Ò Ý Ø ×Ø Ø ×Ø Ð ÙÒ Ø ÓÒ ÚÖ ½ ÓÖ × ÑÔÐ Ñ Ò × Ø Õ½ Ô ´ µ µ ØÖ ÑÑ Ñ Òº Ì × × Ø´ µ ½   ¾Ô ÕÔ ´ Ü ´Üµ × ÑÔÐ ØÖ ÑÑ ÔÔÖÓÜ Ñ Ø ±  ½ ´«µ × Ø «Ø ÕÙ ÒØ Ð Ó ÙÑÙÐ Ø Ú ×ØÖ ÙØ ÓÒ º Ì Û Ö Õ« ´ µ Ñ Ò Ì Ø Ò ×Ù ×Ø ØÙØ × Ø ÑÔ Ö Ð ÓÖ º ´ µ Ð ÙÐ Ø Ø Ò­Ù Ò ÙÒ Ø ÓÒ Ó Ø × ØÖ ÑÑ Ñ Ò ÙÒ Ø ÓÒº ´ µ ÜÔÐ Ò Ö ÙÐÐÝ ÓÛ Ø × Ò­Ù Ò ÙÒ Ø ÓÒ Ñ Ø Ù× ØÓ Ð ÙÐ Ø ÓÒ¬ Ò Ð Ñ Ø× ÓÖ Ø´ µº ¾º ËÙÔÔÓ× Ø Ø ½ ¾ Ò Ö Ò Ô Ò Òظ ÒÓÒ¹Ò Ø Ú ÒØ Ö Ö Ò ÓÑ Ú Ö Ð × Û Ø ÔÖÓ Ð ØÝ Ñ ×× ÙÒ Ø ÓÒ Ô´Üµº Ì Ö Ö ØÛÓ ÝÔÓØ × × ÓÙØ Ô¸ ¬Ö×Ø ´Ø ÒÙÐРݹ ÔÓØ × ×µ Ø Ø Ø × ÈÓ ××ÓÒ Ò × ÓÒ ÐÝ ´Ø ÐØ ÖÒ Ø Ú ÝÔÓØ × ×µ Ø Ø Ø × ÓÑ ØÖ ½ ´ÈÖ´ ܵ ´½   µÜ¸ Ü ¼ ½ µ¸ Û Ø ÙÒ ÒÓÛÒ Ñ Ò Ò Ø Ö × º ÈÒ × Ñ Ò Ñ Ð ×ÙÆ ÒØ ÓÖ Ø ÙÒ ÒÓÛÒ ´ µ Ë ÓÛ Ø Ø ÙÒ Ö Ø Ö ÝÔÓØ × ×¸ Ë ½ Ñ Òº Ì Ò ÓÒ¬ÖÑ Ø Ø Ë × ÓÑÔÐ Ø º ´ µ Ç Ø Ò Ø ÑÓ×Ø ÔÓÛ Ö ÙÐ × Ñ Ð Ö Ø ×Ø ´ º Û Ø × Þ Ò Ô Ò ÒØ Ó Ø ÙÒ ÒÓÛÒ Ñ Òµ Ó Ø ÒÙÐÐ ÝÔÓØ × × Ò×Ø Ø ÐØ ÖÒ Ø Ú ÝÔÓØ × ×º ´ µ ÁÐÐÙ×ØÖ Ø Ø Ø ×Ø ÓÖ Ø × Ò Û Ø ÐÐ ÓÙÖ Ó × ÖÚ Ø ÓÒ× ÕÙ Ð ØÓ ½º ¿º ÁÒ Ô Ò ÒØ Ö ×ÔÓÒ× × × Ø×ÝØ Ö Ö ×× ÓÒ ÑÓ Ð ½ ½ « · ¬Ü · Ò Ì Ö Æ ´¼ ¾ µ¸ º º « Ö ÒØ Ú Ö Ò ×º ¾ ¾ Ö ÒÓÛÒº ´ µ Ç Ø Ò Ø ÓÖÑÙÐ ÓÖ Ø ÄÍ Ó ¬ ××ÙÑ Ò Ø Ø ½ ´ µ Ë ÓÛ ÓÛ Ø Ö ×ÙÐØ Ò ´ µ Ò Ù× Û Ø Ø Ö Ø Ú ÐÝ Û Ø Ð ×Ø ×ÕÙ Ö × ØÓ Ó Ø Ò ¾ ¾ Ö ÙÒ ÒÓÛÒ Ò Ò ¿º ×Ø Ñ Ø × Ó «½ « ¬Û Ò ½ ÈÄ ½ Ë ÌÍÊÆ ÇÎ Ê Ø × « Ö× ÖÓÑ Ø ÓÖÑÙÐ Ø ÓÒ Ù× ÒØ Ø ÜØ ÓÓ ½ .

º Ò ÜÔ Ö Ñ ÒØ Ö Û ÒØ× ØÓ ×Ø Ñ Ø Ë × ØÓ ÓÙÒØ Ø ÒÙÑ Ö Ó Ú Û Ø Ñ Ò Ø ß Ò Ø Ò Ö ÓÖ Ø Ò Ô Ò ÒØ ÜÔÓÒ ÒØ Ðº ´ µ ÓÑÔÙØ × Ô Ö Ø ÐÝ Ø ÅÄ × Æ ´ µ Ð ÙÐ Ø Ø Ñ Ò Ó º ´ µ Á× Ø ÔÓ×× Ð ØÓ Ó Ø Ò ×Ø ÙÒ Ø Ö Ø Ó ÓÑÓ Ò ÓÙ× ÈÓ ××ÓÒ ÔÖÓ ×× Ó Ú ÒØ׺ ÒØ× Æ Ò Ø Ø Ñ ÒØ ÖÚ Ð ¼ Ø℄ ß Û Û ÐÐ ÈÓ ××ÓÒ Ò ÜØ Ñ ÒØ Ö¹ Ú ÒØ Ø Ñ × ½ ¡ ¡ ¡ Ñ ß Û Û ÐÐ Ò × × ÓÒ Æ Ò ÓÒ Æ Ò ½ ½ ¡¡¡ Ñ ¸ Ö ×Ô Ø Ú Ðݺ ÓÑ Ò Ø ÓÒ Ó ÜÔÐ Òº Ò¸ ÈÖ ´ º Ï ÔÐ Ò ØÓ Ó Ø Ò Ø Ö × ÑÔÐ × Ó Ò Ô Ò Ö ×ÔÓÒ Ò ØÓ Ð Ú Ð× Ü½  ½ ܾ ½ Ó ÓÚ Ö Ø × Ô ÐÒ ½ Ô Û Ö ¬ × ÙÒ ÒÓÛÒº ´ µ Ú Ò Ø × ÐÓ ×Ø ÑÓ Ð¸ Û Û ÒØ ØÓ Ø ×Ø Ø Ç Ø Ò Ø ÙÒ ÓÖÑÐÝ ÑÓ×Ø ÔÓÛ Ö ÙÐ Ø ×غ È ´ µ ÔÔÐÝ Ø Ø ×Ø ØÓ Ø × Ò ¿¸ ¿ ½ ´ µ ÜÔÐ Ò Ö ­Ý ÓÛ ØÓ Ø ×Ø Û Ø Ö ÓÖ ÒÓØ Ø ÓÙØ ÓÑ × Ó Ø ³×º ܺ Ì ÐÓ ×Ø ÑÓ Ð ÓÖ Ô ¬Ü ÒÙÐÐ ÝÔÓØ × × À¼ ¬ ÒØ Ò ÖÝ Ö ×ÔÓÒ× × ½ ¾¸ ÓÖ¹ ½ ܵ ¼ Ú Ö×Ù× À ¬ ¼º ¼ ¿ ÓÖ ½ ¾º ×Ô ¬ Ð Ò Ö ÐÓ ×Ø ÑÓ Ð ¬Ø׸ ÓÖ ÒÖ Ð º Ï × ÑÙÐØ Ò ÓÙ×ÐÝ Ø Ú Ø Ò Ð ØÖÓÒ ÓÑÔÓÒ ÒØ׸ Û Ó× Ð Ø Ñ × Ö Ò Ô Ò ÒØ ÜÔÓ¹ Ò ÒØ Ð Ö Ò ÓÑ Ú Ö Ð × Û Ø ÙÒ ÒÓÛÒ Ñ Ò º Ì ÔÖ ÓÖ ×ØÖ ÙØ ÓÒ ÓÖ ½ × ÑÑ Û Ø ÒÓÛÒ Ò Ü Ò × Ð Ô Ö Ñ Ø Ö׺ Ì ÓÖ Ö ÐÙÖ Ø Ñ × Û ÐР̽ ̾ ÌÒ º ´ µ Ë ÓÛ Ø Ø ½ Ò̽ ¸ ´Ò   · ½µ´Ì   Ì  ½ µ¸ ¾ Ò¸ Ö ÑÙØÙ ÐÐÝ Ò Ô Ò ÒØ ÜÔÓÒ ÒØ Ð Ö Ò ÓÑ Ú Ö Ð × Û Ø Ñ Ò º ´ µ Ï Ò Ø ´Ò   ½µ×Ø ÐÙÖ Ó ÙÖ× Û Û ÒØ ØÓ Ú Ò ÙÔÔ Ö ½   Ô ÔÖ Ø ÓÒ Ð Ñ Ø ÓÖ Ø ¬Ò Ð ÐÙÖ Ø Ñ ÌÒ º Ð ÙÐ Ø Ø × Ð Ñ Øº º ËÙÔÔÓ× Ø Ø Ò Ö Ò Ô Ò Ö ×Ô Ø Ú Ðݺ Ì ÔÖ ÓÖ ÔÖÓ Ð ØÝ ×ØÖ ÑÑ ´« ¬ µ ×ØÖ ÙØ ÓÒ Û Ø « ¿ Ò × Ø ÍÒ ÓÖÑ ×ØÖ ÙØ ÓÒ ÓÒ ¼ ½℄º ´ µ Ð ÙÐ Ø Ø Ó ÒØ ÔÖ ÓÖ Ò× ØÝ Ó ´ ´ µ Ò Ø Ó ÒØ ÔÓ×Ø Ö ÓÖ Ò× ØÝ Ó ´ ´ µ Ò Ø Ý × ×Ø Ñ Ø × Ó Ò ÒØ ÈÓ ××ÓÒ Ö Ò ÓÑ Ú Ö Ð × Û Ø Ñ Ò× Ò ÙØ ÓÒ ÓÒ ´ µ × ×Ù Ø Ø ´ µ · ×Ø ´ µ ­ Û Ö ­ × ÔÖ ÓÖ Ò Ô Ò ÒØ Ó Ò µº µ ÚÒ´ µº ÙÒ Ö ØÓØ Ð ×ÕÙ Ö ÖÖÓÖ ÐÓ×׺ ËÙÔÔÓ× Ò Ú Ù Ð׺ Ó Ø Ñ Ö ´ µ Ò Ø ´ µ Ë ÓÛ Ø ´ µ Ò Ø ´ µ Ò Ø º Ø Ø ÓÖ ÔÓÔÙÐ Ø ÓÒ Ó ÙÒ ÒÓÛÒ × Þ Æ Ø × ÒÓÛÒ Ø Ø Ø Ö Ö ½ ¼ Ñ Ö ÁÒ Ú Ù Ð× Ö × ÑÔÐ Ö Ò ÓÑÐÝ Û Ø Ö ÔÐ Ñ ÒØ ÖÓÑ Ø ÔÓÔÙÐ Ø ÓÒ ÙÒØ Ð ½ ÖÓÑ Ø Ñ Ö ×Ù × Øº Ä Ø Ø × Þ Ó Ø × ÑÔÐ º Ñ Ø Ó Ó ÑÓÑ ÒØ× ×Ø Ñ ØÓÖ Æ ÓÖ Æ º ØÆ ×Ø ×Ø ÙÒ × ×Ø Ñ ØÓÖ Ó Æ º ÅÄ Ó Ø ÔÖÓÔÓÖØ ÓÒ Ó Ø ÔÓÔÙÐ Ø ÓÒ Û × Ñ Ö º Ö Ñ Ö¹Ê Ó ÐÓÛ Ö ÓÙÒ ÓÖ Ø Ú Ö Ò Ó ×Ø Ñ ØÓÖ× Ó º ¾ .


Mathematical Statistics Qualifying Exam
September 8, 2010

Answer any 5 of these 8 questions. You may consult your textbook, but no notes. Please indicate on the front of your exam paper which 5 questions you have answered.

1. Suppose that we have n independent observations from a geometric distribution with pmf Pp{X = k} = p(1 − p)^{k−1}, k = 1, 2, . . . .
(a) Find the MLE of τ = (1 − p)⁴.
(b) Find the MVUE of τ = (1 − p)⁴.

2. Suppose that X1, . . . , Xn are independent trinomial with Pr(Xi = k) = pk for k = 1, 2, 3. All parameters p1, p2, p3 are unknown. We wish to test the null hypothesis NH : p1 = p3 versus the alternative hypothesis p1 > p3. Obtain the most powerful exact test and give a formula for calculating the P-value (significance probability).

3. Suppose that Y1, . . . , Yn are independent N(θ, cθ²), where c is a known constant and −∞ < θ < ∞.
(a) Obtain formulas for the two statistics which form the minimal sufficient statistic S for θ.
(b) Prove that S is not complete.
(c) What implications, if any, does the lack of completeness of S have for estimation of θ?
(d) Each component of S can be used to generate a simple estimate of θ. Which one is better? Explain your choice.

4. The random variables Yij are independent Poisson with means µij, where µij = λiψj for i = 1, 2 and j = 1, 2, . . . , n. All parameters λ1, λ2, ψ1, . . . , ψn are unknown. We want to test the null hypothesis that λ1 = λ2 versus the one-sided alternative λ1 < λ2. Derive a good test and explain why it is good.

5. Observations Y1, . . . , Yn are a random sample from the normal distribution with mean zero and variance φ. There is a prior distribution on φ where ξ = νλ/φ has a chi-squared distribution with ν degrees of freedom, and ν and λ are known constants.
(a) Find the prior means of φ and φ⁻¹.
(b) Show that the posterior distribution of (νλ + Σ_{i=1}^n yi²)/φ is chi-squared with ν + n degrees of freedom.
(c) Find an interval of the form [0, a) that is a 1 − α credible set for φ.

6. Let X1, X2, . . . , Xn be independent normal random variables with unknown parameters E(Xi) = µi and Var(Xi) = σ² for i = 1, 2, . . . , n. For µ = (µ1, µ2, . . . , µn)ᵀ, suppose that we know cjᵀµ = dj for j = 1, . . . , k, where cj and dj are given, 0 < k < n, and c1, c2, . . . , ck are linearly independent. What is the UMVUE of σ²?
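One natural route for Question 4 is to condition on the column totals: under the factorized means µij = λiψj, each Y2j given Y1j + Y2j is Binomial with success probability λ2/(λ1 + λ2), which is 1/2 under the null, so the combined count Σⱼ Y2j can be referred to a Binomial(Σⱼ(Y1j + Y2j), 1/2) tail. The sketch below (SciPy assumed, invented data) computes that one-sided P-value; it is offered as an illustration of the idea, not as the unique intended answer.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(9)
n = 15
psi = rng.gamma(2.0, 1.0, size=n)            # invented nuisance parameters psi_j
y1 = rng.poisson(1.0 * psi)                  # lambda_1 = 1.0
y2 = rng.poisson(1.5 * psi)                  # lambda_2 = 1.5 (alternative is lambda_1 < lambda_2)

total = (y1 + y2).sum()
s = y2.sum()
p_value = binom.sf(s - 1, total, 0.5)        # P(Binomial(total, 1/2) >= s) under lambda_1 = lambda_2
print(p_value)
```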

7. Suppose that we have n independent observations from a Laplace distribution f(xi | µ) = (1/4) e^{−|xi − µ|/2}. We hope to find a 95% confidence interval for µ².
(a) Construct a pivot that is a function of x̄.
(b) Calculate an approximate 95% confidence interval for µ² based on the pivot from part (a).
(c) Construct another approximate 95% confidence interval that is a function of the median of the xi.
(d) Which interval should we prefer? Why?

8. Let X1, X2, . . . , Xm be iid Uniform distributed on the interval (0, θ), and Y1, Y2, . . . , Yn iid Uniform on (0, τ). Assume the Xi's and Yi's are mutually independent.
(a) Derive the density for U = max Xi / max Yi.
(b) Suppose we wish to test the hypotheses H0 : θ ≤ τ vs. H1 : θ > τ. Consider the test which rejects H0 when U > c for some given critical value c. Show that the power of the test is monotonically increasing in ρ = θ/τ.
(c) If m = 4 and n = 2, find the critical value of the test in (b) so as to achieve significance level α = 1/24.
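For Question 8(c), the null exceedance probability Pr(U > c) at θ = τ can be estimated by simulation before (or after) doing the calculus. A sketch (NumPy assumed) with m = 4 and n = 2:

```python
import numpy as np

rng = np.random.default_rng(10)
m, n, reps = 4, 2, 500_000

# Under theta = tau (take both equal to 1) simulate U = max X_i / max Y_i.
u = rng.random((reps, m)).max(axis=1) / rng.random((reps, n)).max(axis=1)
for c in (2.0, 3.0, 4.0, 5.0):
    print(c, np.mean(u > c))        # compare with the target level 1/24 = 0.0417
```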

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Monday September 12 2011, 9:00 – 12:00

Answer 5 of the 9 questions. All questions carry equal weight. It is permitted to refer to the book Statistical Inference by Casella and Berger, but nothing else.

1. Let X1, . . . , Xn be independent Poisson with common mean µ.
(i) The sample average X̄ is the uniformly minimum variance unbiased estimator of µ. Carefully describe two ways to show this, one requiring calculation of Var(X̄) and the other not.
(ii) The sample variance S² is also an unbiased estimate for µ. Prove that (a) E(S² | X̄) = X̄ and (b) Var(S²) > Var(X̄).

2. Suppose that Y1, Y2, . . . , Ym are independent Poisson with different means µ1, . . . , µm. Those means are independently sampled from the Gamma distribution with probability density function
p(µ | α, β) = µ^{α−1} exp(−µ/β) / (β^α Γ(α)).
(i) Obtain the optimal Bayes estimates T1, . . . , Tm of µ1, . . . , µm with respect to the loss function Σ_{i=1}^m (Ti − µi)²/µi.
(ii) Specify exactly how you would apply these Bayes estimates if α and β were unknown.

3. Suppose that X1, X2, . . . , Xn are independent N(θ, θ) – i.e., with variance equal to mean – with θ > 0 unknown. Describe two distinct pivotal quantities that can be used to obtain exact confidence intervals for θ. Is it possible to say which interval is better?

4. Suppose that X1, X2, . . . , Xn are independent bivariate Normal with mean vector θ and variance matrix σ²I, with I the identity matrix and σ² known. We want to test the null hypothesis NH : θᵀθ ≤ 1 versus AH : θᵀθ > 1.
(i) Show that X̄ is minimal sufficient for θ.
(ii) Derive an appropriate powerful test for NH versus AH.
(iii) Explain how you would evaluate the P-value of your test, exactly or approximately.

5. The random variables Y1, Y2, . . . , Yn are independent with pdf
f(y | µ, τ) = [1/(2πc(τ))] exp{τ cos(y − µ)},
where 0 ≤ y, µ < 2π and τ ≥ 0. Both µ and τ are unknown.
(i) Derive the maximum likelihood estimator µ̂ of µ.
(ii) Obtain a formula for calculating an approximate variance of µ̂ that does not require knowing c(τ).
(iii) What is the exact distribution of µ̂ when τ = 0? (Note: you are not expected to derive this exact distribution using density function calculations, but briefly try and justify your answer.)

PLEASE TURN OVER

6. Let Yi = α + βxiᵞ + εi for i = 1, . . . , n, where the random errors εi are independent N(0, σ²) and σ is unknown. The parameters α, β, γ are all unknown. Obtain the score test for the null hypothesis that γ = 1 with alternative hypothesis that γ < 1.
[You may need the following matrix identity: if M = ( A  b ; bᵀ  c ), where c is scalar, then M⁻¹ = ( A⁻¹ + k d dᵀ  −k d ; −k dᵀ  k ), with d = A⁻¹ b and k = (c − bᵀ A⁻¹ b)⁻¹.]

7. Let Yij = α + βi xj + εij for i = 1, . . . , m and j = 1, . . . , n. The random errors εij are independent N(0, σ²).
(i) Obtain explicit formulae for the least squares estimates of α, β1, . . . , βm.
(ii) Explain carefully how to test whether or not the slope parameters are equal when σ is unknown.

8. Suppose that we are going to observe n independent Exponential failure times with mean µ = 1/θ. The ordered failure times are T1 < T2 < · · · < Tn. We want to predict Tn as soon as the (n − 1)st failure occurs. For the single unknown parameter θ we have the conjugate prior distribution π(θ) ∝ θ^a exp(−bθ).
(i) Show that Z1 = nT1 and Zj = (n − j + 1)(Tj − Tj−1), j = 2, . . . , n, are mutually independent Exponential with mean µ.
(ii) Obtain a formula for the upper 1 − p prediction limit for the final failure time Tn given T1 < T2 < · · · < Tn−1.

9. The independent pairs (Xj, Yj), j = 1, . . . , n, are non-negative integers with the X's and Y's having Poisson distributions with different unknown means, but not necessarily being independent. Rather, Xj = Uj + Wj and Yj = Vj + Wj, where the U's, V's and W's are mutually independent Poisson, the means being respectively α, β, γ. We want to test the null hypothesis NH : γ = 0, corresponding to independence of the X's and Y's, with one-sided alternative of positive dependence (i.e. γ > 0).
(i) Write down the joint probability density function of (Xj, Yj).
(ii) What is the minimal sufficient statistic S = s(X1, . . . , Xn, Y1, . . . , Yn) for (α, β) under NH?
(iii) Show that the score test statistic for testing NH is proportional to
T = Σ_{j=1}^n (Xj − X̄)(Yj − Ȳ) / (X̄ Ȳ).
(iv) Prove that E(T | S, NH) = 0.
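The statistic in Question 9(iii) is simple to compute; the sketch below (NumPy assumed) evaluates it on data simulated from the stated additive-Poisson construction, with all three means invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n, a, b, g = 100, 2.0, 3.0, 0.5              # invented alpha, beta, gamma (gamma > 0: dependence)
w = rng.poisson(g, n)
x = rng.poisson(a, n) + w
y = rng.poisson(b, n) + w

xbar, ybar = x.mean(), y.mean()
T = np.sum((x - xbar) * (y - ybar)) / (xbar * ybar)
print(T)
```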