
[ADVANCED] ANALYSIS, PROBABILITY AND STATISTICS EXERCISES [10]

Proof is the idol before whom the pure mathematician tortures himself. (SIR ARTHUR EDDINGTON)

Statistics: The only science that enables different experts using the same figures to draw different conclusions. (EVAN ESAR)

Life is a school of probability. (WALTER BAGEHOT)

1. [Bounds for order statistics] Prove that if $x_1 + x_2 + \ldots + x_n = 0$ and $x_1^2 + x_2^2 + \ldots + x_n^2 = 1$, where $x_1 \le x_2 \le \ldots \le x_n$, then

\[
-\sqrt{\frac{n-1}{n}} \le x_1 \le -\frac{1}{\sqrt{n(n-1)}}, \qquad
-\sqrt{\frac{n-i}{ni}} \le x_i \le \sqrt{\frac{i-1}{n(n+1-i)}}, \qquad
\frac{1}{\sqrt{n(n-1)}} \le x_n \le \sqrt{\frac{n-1}{n}}
\]

for all $i \in \{2, 3, \ldots, n-1\}$, where all the inequalities are the best possible.


[1.] A.V. BOYD: Bounds for order statistics. Univ. Beograd. Publ. Elektrotechn. Fak. Ser. Mat. Fiz. 380 (1971) 31-32.
[2.] D.M. HAWKINS: On the bounds of the range of order statistics. J. Amer. Statist. Assoc. 66 (1971) 644-645.
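These bounds are easy to probe numerically. The sketch below (plain Python, not part of the original sheet; the helper names are mine) normalizes random vectors so that the two constraints hold and checks every sorted component against the claimed bounds.

```python
import random
import math

def normalize(xs):
    """Shift and scale xs so that sum(xs) = 0 and sum of squares = 1; return sorted."""
    n = len(xs)
    mean = sum(xs) / n
    centered = [x - mean for x in xs]
    norm = math.sqrt(sum(x * x for x in centered))
    return sorted(x / norm for x in centered)

def bounds_hold(xs, tol=1e-12):
    """Check the order-statistic bounds for a sorted, normalized sample."""
    n = len(xs)
    ok = -math.sqrt((n - 1) / n) - tol <= xs[0] <= -1 / math.sqrt(n * (n - 1)) + tol
    ok &= 1 / math.sqrt(n * (n - 1)) - tol <= xs[-1] <= math.sqrt((n - 1) / n) + tol
    for i in range(2, n):  # i = 2, ..., n-1 in the 1-based indexing of the problem
        lo = -math.sqrt((n - i) / (n * i))
        hi = math.sqrt((i - 1) / (n * (n + 1 - i)))
        ok &= lo - tol <= xs[i - 1] <= hi + tol
    return ok

random.seed(1)
assert all(bounds_hold(normalize([random.gauss(0, 1) for _ in range(10)]))
           for _ in range(200))
```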
2. [Gamma distribution] The median $m(x)$ of the gamma distribution with (positive) parameter $x$ is defined implicitly by the formula

\[
\frac{1}{\Gamma(x)} \int_0^{m(x)} e^{-t} t^{x-1}\,dt = \frac{1}{2}
\qquad\text{or}\qquad
\int_0^{m(x)} e^{-t} t^{x-1}\,dt = \frac{1}{2} \int_0^{\infty} e^{-t} t^{x-1}\,dt.
\]

Prove that the function $x \mapsto m(x)$ is continuous and increasing on $(0, \infty)$, while the function $x \mapsto m(x) - x$ is continuous, decreasing and strictly convex on $(0, \infty)$.
[1.] C. BERG and H.L. PEDERSEN: The Chen-Rubin conjecture in a continuous setting. Methods Appl. Anal. 13 (2006) 63-88.
[2.] H. ALZER: A convexity property of the median of the gamma distribution. Statist. Probab. Lett. 76 (2006) 1510-1513.
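The monotonicity claims can be checked numerically with nothing but the standard library: compute the incomplete gamma integral by Simpson's rule and solve for the median by bisection (a sketch, restricted to parameters $x \ge 1$ to keep the integrand bounded; the function names are mine).

```python
import math

def gamma_cdf(m, x, steps=2000):
    """(1/Gamma(x)) * integral_0^m e^{-t} t^{x-1} dt, Simpson's rule, for x >= 1."""
    if m <= 0:
        return 0.0
    h = m / steps
    f = lambda t: math.exp(-t) * t ** (x - 1) if t > 0 else (1.0 if x == 1 else 0.0)
    s = f(0) + f(m) \
        + 4 * sum(f((2 * k - 1) * h) for k in range(1, steps // 2 + 1)) \
        + 2 * sum(f(2 * k * h) for k in range(1, steps // 2))
    return s * h / 3 / math.gamma(x)

def median(x):
    """Solve gamma_cdf(m, x) = 1/2 for m by bisection."""
    lo, hi = 0.0, x + 10.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if gamma_cdf(mid, x) < 0.5:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# m(x) is increasing, while m(x) - x is decreasing:
ms = {x: median(x) for x in (1.0, 2.0, 3.0, 4.0)}
assert ms[1.0] < ms[2.0] < ms[3.0] < ms[4.0]
diffs = [ms[x] - x for x in (1.0, 2.0, 3.0, 4.0)]
assert diffs[0] > diffs[1] > diffs[2] > diffs[3]
```

For $x = 1$ the median is $\ln 2 \approx 0.6931$, which the bisection recovers.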

3. [Skewness and kurtosis] Let $x_1, x_2, \ldots, x_n$ be real numbers and let $n\mu_m = x_1^m + x_2^m + \ldots + x_n^m$, where $m$ is an arbitrary real number. Prove that if $\mu_1 = 0$ and $\mu_2 = 1$, then the following inequalities hold for all $n \ge 2$:

\[
\mu_3^2 \le \frac{(n-2)^2}{n-1}, \qquad
\mu_4 \le \frac{n^2 - 3n + 3}{n-1}
\qquad\text{and}\qquad
\mu_4 \ge \mu_3^2 + 1.
\]

More generally, prove that if $m = p/q \ge 3$ and $p, q$ are integers with $q$ odd, then for every integer $n \ge 2$ we have

\[
\mu_{2m} \ge \mu_{m+1}^2 + \mu_m^2
\qquad\text{and}\qquad
\mu_m \le \frac{(n-1)^{m-1} + (-1)^m}{n(n-1)^{m/2-1}}.
\]

[1.] K. PEARSON: Mathematical contributions to the theory of evolution. XIX: Second supplement to a memoir on skew variation. Phil. Trans. Roy. Soc. A 216 (1916) 432.
[2.] J.E. WILKINS: A note on skewness and kurtosis. Ann. Math. Statistics 15 (1944) 333-335.
[3.] M.C. CHAKRABARTI: A note on skewness and kurtosis. Bull. Calcutta Math. Soc. 38 (1946) 133-136.
[4.] M. LAKSHMANAMURTI: On the upper bound of $\sum_{i=1}^n x_i^m$ subject to the conditions $\sum_{i=1}^n x_i = 0$ and $\sum_{i=1}^n x_i^2 = n$. Math. Student 18 (1950) 111-116.
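A quick randomized check of the first group of inequalities, together with the $m = 3$ instance $\mu_6 \ge \mu_4^2 + \mu_3^2$ of the general one (a sketch in plain Python, not part of the original sheet; `mu` standardizes the sample so that $\mu_1 = 0$, $\mu_2 = 1$ before taking power means):

```python
import math
import random

def mu(xs, m):
    """mu_m = (1/n) * sum x_i^m after standardizing to mu_1 = 0 and mu_2 = 1."""
    n = len(xs)
    mean = sum(xs) / n
    centered = [x - mean for x in xs]
    scale = math.sqrt(sum(x * x for x in centered) / n)
    zs = [x / scale for x in centered]
    return sum(z ** m for z in zs) / n

random.seed(2)
for _ in range(200):
    n = random.randint(2, 12)
    xs = [random.uniform(-5, 5) for _ in range(n)]
    m3, m4, m6 = mu(xs, 3), mu(xs, 4), mu(xs, 6)
    assert m3 ** 2 <= (n - 2) ** 2 / (n - 1) + 1e-9
    assert m4 <= (n ** 2 - 3 * n + 3) / (n - 1) + 1e-9
    assert m4 >= m3 ** 2 + 1 - 1e-9
    # the m = 3 case of the general inequality: mu_6 >= mu_4^2 + mu_3^2
    assert m6 >= m4 ** 2 + m3 ** 2 - 1e-9
```

The configuration with $n-1$ equal entries and one outlier attains equality in the upper bounds, which is how the right-hand sides arise.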

4. [Upper bound for the dispersion] Let $f : [a, b] \subset \mathbb{R} \to [0, \infty)$ be the probability density function of a random variable $X$ whose expectation and dispersion are respectively given by

\[
E(X) = \int_a^b t f(t)\,dt
\qquad\text{and}\qquad
D(X) = \sqrt{\int_a^b t^2 f(t)\,dt - \left(\int_a^b t f(t)\,dt\right)^2}.
\]

Prove that if the expectation and the dispersion of $X$ exist, then $D(X) \le \min\{\max\{|a|, |b|\}, b - a\}$.
[1.] N.K. AGBEKO: Some p.d.f.-free upper bounds for the dispersion $\sigma(X)$ and the quantity $\sigma^2(X) + (x - EX)^2$. JIPAM. J. Inequal. Pure Appl. Math. 7(5) (2006), Article 186, 3 pp. (electronic).
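The bound can be sanity-checked for concrete densities by computing $E(X)$ and $D(X)$ with a midpoint rule (an illustrative sketch; the uniform density on $[a,b]$ has $D(X) = (b-a)/\sqrt{12}$, comfortably below the bound):

```python
import math

def moments(f, a, b, steps=20000):
    """Return E(X), D(X) (standard deviation) and total mass of density f on [a, b]."""
    h = (b - a) / steps
    ts = [a + (k + 0.5) * h for k in range(steps)]
    m0 = sum(f(t) for t in ts) * h          # should be ~1 for a density
    m1 = sum(t * f(t) for t in ts) * h
    m2 = sum(t * t * f(t) for t in ts) * h
    return m1, math.sqrt(m2 - m1 * m1), m0

for a, b in [(-1.0, 3.0), (0.0, 1.0), (-2.0, -0.5)]:
    uniform = lambda t: 1.0 / (b - a)
    mean, disp, mass = moments(uniform, a, b)
    assert abs(mass - 1) < 1e-6
    assert disp <= min(max(abs(a), abs(b)), b - a) + 1e-9
```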
5. [Moments and central moments] Let $n$ be an arbitrary natural number and let $f : [a, b] \subset \mathbb{R} \to [0, \infty)$ be the probability density function of a random variable $X$ whose expectation and dispersion are respectively given by

\[
E(X) = \int_a^b t f(t)\,dt
\qquad\text{and}\qquad
D(X) = \sqrt{\int_a^b t^2 f(t)\,dt - \left(\int_a^b t f(t)\,dt\right)^2}.
\]

By definition the $n$th moment (about zero) and the $n$th central moment of a probability density function $f$ are the expected values of $X^n$ and of $(X - E(X))^n$, respectively, i.e.

\[
\mu_n' = E(X^n) = \int_a^b t^n f(t)\,dt
\qquad\text{and}\qquad
\mu_n = E\big((X - E(X))^n\big) = \int_a^b (t - \mu_1')^n f(t)\,dt.
\]

For $n = 1$ we have $\mu_1' = E(X)$, and $\mu_2 = D^2(X)$. Moreover, the skewness (or the so-called third standardized moment) is written as $\gamma_1$ and defined as $\gamma_1 = \mu_3/\sigma^3$, where $\mu_3$ is the third moment about the mean and $\sigma = D(X)$ is the standard deviation. The kurtosis, or fourth standardized moment, is defined as $\mu_4/\sigma^4$, where $\mu_4$ is the fourth moment about the mean and $\sigma$ is the standard deviation. Kurtosis is more commonly defined as the fourth cumulant divided by the square of the variance of the probability distribution, $\gamma_2 = \mu_4/\sigma^4 - 3$, which is known as excess kurtosis. The minus 3 at the end of this formula is often explained as a correction to make the kurtosis of the normal distribution equal to zero. Problem: find the mean, median, mode, variance, skewness, excess kurtosis, $n$th moment and $n$th central moment of the following continuous distributions: beta, uniform, rectangular, KUMARASWAMY, logarithmic, triangular, truncated normal, chi, non-central chi, chi-square, non-central chi-square, exponential, FISHER-SNEDECOR, non-central FISHER-SNEDECOR, gamma, ERLANG, half-normal, LÉVY, logistic, log-logistic, log-normal, PARETO, RAYLEIGH, RICE, WEIBULL, CAUCHY, GUMBEL, FISHER-TIPPETT, LAPLACE, normal, STUDENT, MAXWELL, VON MISES.
[1.] http://en.wikipedia.org/wiki/Probability_distribution
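As a worked instance of these definitions, the exponential density $f(t) = e^{-t}$ has skewness $2$ and excess kurtosis $6$, and both are recovered by direct numerical integration of the central moments (a stdlib-only sketch; the truncation point $40$ is an arbitrary cutoff far into the tail):

```python
import math

def central_moment(f, a, b, k, steps=100000):
    """k-th central moment of density f on [a, b] by the midpoint rule; also returns the mean."""
    h = (b - a) / steps
    ts = [a + (j + 0.5) * h for j in range(steps)]
    mean = sum(t * f(t) for t in ts) * h
    return sum((t - mean) ** k * f(t) for t in ts) * h, mean

# Exponential density with rate 1, truncated far into the tail:
f = lambda t: math.exp(-t)
m2, mean = central_moment(f, 0, 40, 2)
m3, _ = central_moment(f, 0, 40, 3)
m4, _ = central_moment(f, 0, 40, 4)
sigma = math.sqrt(m2)
skewness = m3 / sigma ** 3              # known value: 2
excess_kurtosis = m4 / sigma ** 4 - 3   # known value: 6
assert abs(mean - 1) < 1e-3
assert abs(skewness - 2) < 1e-2
assert abs(excess_kurtosis - 6) < 1e-1
```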
6. [WALLIS' inequality and WALLIS' formula] Let $\Gamma$ be the well-known EULER gamma function. Prove that for every integer $n \ge 1$ the following improved WALLIS' inequality holds:

\[
\frac{1}{\sqrt{n + \theta_1}} \le \frac{\Gamma\!\left(n + \frac{1}{2}\right)}{\Gamma(n + 1)} < \frac{1}{\sqrt{n + \theta_2}},
\qquad\text{where}\qquad
\frac{\Gamma\!\left(n + \frac{1}{2}\right)}{\Gamma(n + 1)} = \sqrt{\pi}\,\frac{(2n-1)!!}{(2n)!!},
\]

and the constants $\theta_1 = 4/\pi - 1$ and $\theta_2 = 1/4$ are the best possible. Using this WALLIS' inequality prove the following WALLIS' formulae:

\[
\frac{\pi}{2} = \prod_{n \ge 1} \frac{(2n)^2}{(2n-1)(2n+1)} = \lim_{n \to \infty} \left[\frac{(2n)!!}{(2n-1)!!}\right]^2 \frac{1}{2n+1}
\qquad\text{and}\qquad
\pi = \sum_{n \ge 0} \frac{(n!)^2\, 2^{n+1}}{(2n+1)!}.
\]

[1.] C.P. CHEN and F. QI: The best bounds in Wallis' inequality. Proc. Amer. Math. Soc. 133 (2005) 397-401.
[2.] S. KOUMANDOS: Remarks on a paper by Chao-Ping Chen and Feng Qi. Proc. Amer. Math. Soc. 134 (2005) 1365-1367.
7. [Monotone form of l'HOSPITAL's rule] For $a, b \in \mathbb{R}$ let $f_1, f_2 : [a, b] \to \mathbb{R}$ be continuous on $[a, b]$ and differentiable on $(a, b)$. Further, let $f_2'(x) \ne 0$ for all $x \in (a, b)$. Prove that if $f_1'/f_2'$ is increasing (decreasing) on $(a, b)$, then so are

\[
x \mapsto \frac{f_1(x) - f_1(a)}{f_2(x) - f_2(a)}
\qquad\text{and}\qquad
x \mapsto \frac{f_1(x) - f_1(b)}{f_2(x) - f_2(b)}.
\]

[1.] G.D. ANDERSON, M.K. VAMANAMURTHY and M. VUORINEN: Inequalities for quasiconformal mappings in space. Pacific J. Math. 160(1) (1993) 1-18.
[2.] G.D. ANDERSON, M.K. VAMANAMURTHY and M. VUORINEN: Conformal Invariants, Inequalities, and Quasiconformal Maps. John Wiley & Sons, New York, 1997.
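A concrete illustration of the rule: with $f_1 = \sin$ and $f_2 = \mathrm{id}$ on $[0, \pi/2]$, the ratio of derivatives $f_1'/f_2' = \cos$ is decreasing, so both quotient functions must be decreasing as well. A grid check in plain Python (illustrative only):

```python
import math

# f1 = sin, f2 = identity on [0, pi/2]; f1'/f2' = cos is decreasing there,
# so x -> (f1(x) - f1(0)) / (f2(x) - f2(0)) = sin(x)/x must be decreasing too.
xs = [0.01 * k for k in range(1, 158)]   # grid inside (0, pi/2)
vals = [math.sin(x) / x for x in xs]
assert all(vals[i] > vals[i + 1] for i in range(len(vals) - 1))

# ... and so must x -> (f1(x) - f1(b)) / (f2(x) - f2(b)) with b = pi/2,
# i.e. (sin x - 1) / (x - pi/2).
b = math.pi / 2
vals_b = [(math.sin(x) - 1) / (x - b) for x in xs]
assert all(vals_b[i] > vals_b[i + 1] for i in range(len(vals_b) - 1))
```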
8. [Log-concave distributions] Let $f : [a, b] \subset \mathbb{R} \to [0, \infty)$ be a continuously differentiable probability density function. Moreover, let us consider the cumulative distribution function $F : [a, b] \to [0, 1]$, the survival function $\bar{F} : [a, b] \to [0, 1]$, the left hand integral of the cumulative distribution function $G : [a, b] \to [0, \infty)$, the right hand integral of the reliability function $H : [a, b] \to [0, \infty)$ and the failure rate $r : [a, b] \to [0, \infty)$, defined by

\[
F(x) = \int_a^x f(t)\,dt, \quad
\bar{F}(x) = \int_x^b f(t)\,dt, \quad
G(x) = \int_a^x F(t)\,dt, \quad
H(x) = \int_x^b \bar{F}(t)\,dt, \quad
r(x) = \frac{f(x)}{\bar{F}(x)}.
\]

[Using the monotone form of l'HOSPITAL's rule] Prove that the following implications are true:
a. $f$ is log-concave $\implies$ $F$ is log-concave $\implies$ $G$ is log-concave.
b. $f(a) = 0$ and $f$ is log-convex $\implies$ $F$ is log-convex $\implies$ $G$ is log-convex.
c. $f$ is log-concave $\implies$ $\bar{F}$ is log-concave $\implies$ $H$ is log-concave.
d. $f(b) = 0$ and $f$ is log-convex $\implies$ $\bar{F}$ is log-convex $\implies$ $H$ is log-convex.
e. $f$ is log-concave (log-convex) $\implies$ $r$ is increasing (decreasing).
[1.] M. BAGNOLI and T. BERGSTROM: Log-concave probability and its applications. Econ. Theory 26(2) (2005) 445-469.
[2.] S. ANDRÁS and Á. BARICZ: Properties of the probability density function of the non-central chi-squared distribution. J. Math. Anal. Appl. 346(2) (2009) 395-402.
[3.] Á. BARICZ: Geometrically concave univariate distributions. J. Math. Anal. Appl. 363(1) (2010) 182-196.
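The last implication can be observed numerically for a standard normal truncated to $[-3, 3]$, whose density is log-concave: the discretized failure rate is monotonically increasing along the whole grid (a plain-Python sketch; grid size and interval are my choices):

```python
import math

# Truncated standard normal on [-3, 3]: its density is log-concave, so the
# failure rate r(x) = f(x) / survival(x) should be increasing.
a, b, steps = -3.0, 3.0, 6000
h = (b - a) / steps
grid = [a + (k + 0.5) * h for k in range(steps)]
phi = [math.exp(-t * t / 2) for t in grid]
mass = sum(phi) * h
f = [p / mass for p in phi]                      # normalized density values

# survival function by a right-to-left cumulative sum
surv, acc = [0.0] * steps, 0.0
for k in range(steps - 1, -1, -1):
    acc += f[k] * h
    surv[k] = acc

r = [f[k] / surv[k] for k in range(steps - 1)]   # skip the last cell (surv ~ 0)
assert all(r[k] < r[k + 1] for k in range(len(r) - 1))
```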
9. [Reliability function] Let $X$ be a continuous random variable and suppose that its expectation exists. Prove that $x(1 - F(x))$ tends to zero as $x$ tends to infinity, where $F : [0, \infty) \to [0, 1]$ is the cumulative distribution function of the random variable $X$. Using this property prove that

\[
E(X) = \int_0^{\infty} \bar{F}(t)\,dt = \int_0^{\infty} (1 - F(t))\,dt,
\]

where $\bar{F} : [0, \infty) \to [0, 1]$, defined by $\bar{F}(t) = 1 - F(t)$, is the survival (or reliability) function of the random variable $X$.
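Both claims are easy to see for the exponential distribution with rate $2$, where $\bar{F}(t) = e^{-2t}$ and $E(X) = 1/2$ (an illustrative stdlib-only check; the truncation point $30$ is an arbitrary far tail cutoff):

```python
import math

# Exponential with rate 2: F(t) = 1 - exp(-2 t), E(X) = 1/2.
F_bar = lambda t: math.exp(-2 * t)

# x * (1 - F(x)) -> 0 as x -> infinity:
assert 50 * F_bar(50) < 1e-40

# E(X) recovered as the integral of the survival function (midpoint rule):
steps, b = 100000, 30.0
h = b / steps
integral = sum(F_bar((k + 0.5) * h) for k in range(steps)) * h
assert abs(integral - 0.5) < 1e-4
```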
10. [NBU property] Let us consider a continuously differentiable function $\varphi : [0, \infty) \to (0, \infty)$. Prove that if $\varphi(0) \ge 1$ and $\varphi$ is log-concave, then for all $x, y \ge 0$ we have $\varphi(x + y) \le \varphi(x)\varphi(y)$. Moreover, if $\varphi(0) \le 1$ and $\varphi$ is log-convex, then the above inequality is reversed. Now, let $f$ be a continuously differentiable probability density function which has support $[0, \infty)$. Using the above result prove that if $f$ is log-concave, then for all $x, y \ge 0$ we have $\bar{F}(x + y) \le \bar{F}(x)\bar{F}(y)$, where $\bar{F} : [0, \infty) \to [0, 1]$, defined by $\bar{F}(x) = 1 - F(x)$, is the survival (or reliability) function of the random variable $X$. Moreover, if $f$ is log-convex, then the above inequality is reversed. In economic theory the above inequality is called the new-is-better-than-used property, since if $X$ is the time of death of a physical object, then the probability $P(X \ge x) = \bar{F}(x)$ that a new unit will survive to age $x$ is greater than the probability

\[
P(X \ge x + y \mid X \ge y) = \frac{\bar{F}(x + y)}{\bar{F}(y)}
\]

that a unit which has survived to age $y$ will survive for an additional time $x$.
[1.] M.Y. AN: Log-concave probability distributions: Theory and statistical testing. Technical report, Economics Department, Duke University, Durham, N.C. 27708-0097, 1995.
[2.] Á. BARICZ: A functional inequality for the survival function of the gamma distribution. JIPAM. J. Inequal. Pure Appl. Math. 9(1) (2008), Article 13, 5 pp. (electronic).
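Two gamma distributions illustrate both directions on a grid (a sketch, not part of the original sheet; shape $2$ has the log-concave density $t e^{-t}$ with survival $(1+x)e^{-x}$, while shape $1/2$ has a log-convex density with survival $\operatorname{erfc}(\sqrt{x})$):

```python
import math

# Gamma(shape 2): log-concave density, survival (1 + x) exp(-x), so the NBU
# inequality survival(x+y) <= survival(x) * survival(y) should hold.
# Gamma(shape 1/2): log-convex density, survival erfc(sqrt(x)), so the
# inequality should be reversed (new-is-worse-than-used).
nbu = lambda x: (1 + x) * math.exp(-x)
nwu = lambda x: math.erfc(math.sqrt(x))

pts = [0.1 * k for k in range(0, 51)]
for x in pts:
    for y in pts:
        assert nbu(x + y) <= nbu(x) * nbu(y) + 1e-12
        assert nwu(x + y) >= nwu(x) * nwu(y) - 1e-12
```

For shape $2$ the inequality reduces to $1 + x + y \le (1 + x)(1 + y)$, which makes the NBU property transparent in this case.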
