
# Probability Generating Functions

HT 2004

It is important to realise that you cannot have intuition about p.g.f.s because they do not correspond to anything which is directly observable.

- A p.g.f. is nothing more than a mathematician's trick.
- You should think of it in terms of the definition.

The p.g.f. of a discrete random variable $X$ is defined by
$$g_X(s) = E\left(s^X\right).$$
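To make the definition concrete, here is a small numerical sketch (not part of the original notes; the fair-die distribution is an arbitrary example chosen for familiarity):

```python
from fractions import Fraction

# g_X(s) = E(s^X) = sum over x of s^x P(X = x), straight from the definition
def pgf(pmf, s):
    return sum(s ** x * px for x, px in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}  # score of one fair die roll

print(pgf(die, 1))               # every p.g.f. satisfies g_X(1) = 1
print(pgf(die, Fraction(1, 2)))  # 21/128
```

Exact `Fraction` arithmetic avoids any floating-point doubt about equalities like $g_X(1) = 1$.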

Why bother with p.g.f.s?

- They make calculations of expectations and of some probabilities very easy.
- The distribution of a random variable is easy to obtain from its p.g.f.
- They are easy to calculate and can almost always be found by using one of 3 standard tricks.
- They make sums of independent random variables easy to handle.

Using p.g.f.s

1. Calculating $E(X)$. Since $g_X(s) = E\left(s^X\right)$,
$$g_X'(s) = E\left(X s^{X-1}\right),$$
so
$$g_X'(1) = E(X).$$

2. Calculating $V(X)$. Differentiating again,
$$g_X''(s) = E\left(X(X-1)s^{X-2}\right),$$
so
$$g_X''(1) = E\left(X^2\right) - E(X) \quad\text{and}\quad g_X''(1) + g_X'(1) = E\left(X^2\right).$$

Thus
$$V(X) = E\left(X^2\right) - E(X)^2 = g_X''(1) + g_X'(1) - g_X'(1)^2.$$

3. Calculating $P(X = 0)$. Since
$$g_X(s) = E\left(s^X\right) = \sum_{x=0}^{\infty} s^x P(X = x) = P(X = 0) + sP(X = 1) + \cdots,$$
we have
$$g_X(0) = P(X = 0).$$

Other probabilities may be calculated by repeated differentiation and choosing $s = 0$: e.g. $g_X'(0) = P(X = 1)$, and in general $P(X = k) = g_X^{(k)}(0)/k!$.
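The calculations in 1–3 can be mirrored with exact polynomial arithmetic. This is an illustrative sketch (not part of the original notes), assuming a Binomial(2, 1/2) random variable with p.g.f. $g(s) = \tfrac14 + \tfrac{s}{2} + \tfrac{s^2}{4}$:

```python
from fractions import Fraction

def deriv(c):
    # derivative of a polynomial stored as coefficients [c0, c1, c2, ...]
    return [k * ck for k, ck in enumerate(c)][1:]

def ev(c, s):
    # evaluate the polynomial at s
    return sum(ck * s ** k for k, ck in enumerate(c))

g = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]  # p.g.f. of Binomial(2, 1/2)
g1, g2 = deriv(g), deriv(deriv(g))

EX = ev(g1, 1)        # g'(1)  = E(X) = 1
EX2 = ev(g2, 1) + EX  # g''(1) + g'(1) = E(X^2)
VX = EX2 - EX ** 2    # variance = 1/2
P0 = ev(g, 0)         # g(0)  = P(X = 0) = 1/4
P1 = ev(g1, 0)        # g'(0) = P(X = 1) = 1/2
print(EX, VX, P0, P1)
```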


4. Calculating $P(X \text{ is odd})$ and $P(X \text{ is even})$. We have
$$g_X(1) = 1 = \sum_{x=0}^{\infty} P(X = x), \qquad g_X(-1) = \sum_{x=0}^{\infty} (-1)^x P(X = x),$$
so
$$\frac{1 + g_X(-1)}{2} = \sum_{m=0}^{\infty} P(X = 2m), \qquad \frac{1 - g_X(-1)}{2} = \sum_{m=0}^{\infty} P(X = 2m + 1).$$

Probabilities from p.g.f.s

It is straightforward to obtain any or all probabilities from a p.g.f.

Example. What is the probability mass function whose p.g.f. is $g(s) = \dfrac{1}{2 - s}$?

$$g_X(s) = \frac{1}{2-s} = \frac{1}{2}\left(1 - \frac{s}{2}\right)^{-1} = \frac{1}{2}\left[1 + \frac{s}{2} + \left(\frac{s}{2}\right)^2 + \cdots\right] = \sum_{k=0}^{\infty} \left(\frac{1}{2}\right)^{k+1} s^k.$$

Thus the p.m.f. of $X$ is
$$p(x) = \left(\frac{1}{2}\right)^{x+1}, \qquad x = 0, 1, 2, \ldots$$

Example. If the p.g.f. of $X$ is
$$g_X(s) = \frac{2 + s}{(2 - s^2)(4 - s)},$$
what is $P(X = 3)$?

$$g_X(s) = \frac{2+s}{8}\left(1 - \frac{s^2}{2}\right)^{-1}\left(1 - \frac{s}{4}\right)^{-1} = \left(\frac{1}{4} + \frac{s}{8}\right)\left(1 + \frac{s^2}{2} + \frac{s^4}{4} + \cdots\right)\left(1 + \frac{s}{4} + \frac{s^2}{16} + \frac{s^3}{64} + \cdots\right),$$

so
$$P(X = 3) = \text{coefficient of } s^3 = \frac{1}{8 \times 2} + \frac{1}{4 \times 2 \times 4} + \frac{1}{8 \times 16} + \frac{1}{4 \times 64}.$$

Calculating p.g.f.s

Trick 1: Use the fact that a probability mass function sums to 1 to motivate a re-arrangement.
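As a sanity check on the $P(X = 3)$ computation above, the power-series expansion can be reproduced with exact rational arithmetic; this sketch (not part of the original notes) truncates each series at order 8:

```python
from fractions import Fraction

N = 8  # truncation order: keep coefficients of s^0 .. s^7

def mul(p, q):
    # multiply two truncated power series (coefficient lists)
    out = [Fraction(0)] * N
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            if i + j < N:
                out[i + j] += pi * qj
    return out

# g(s) = (2+s)/((2-s^2)(4-s)) = (2+s)/8 * (1 - s^2/2)^{-1} * (1 - s/4)^{-1}
front = [Fraction(2, 8), Fraction(1, 8)] + [Fraction(0)] * (N - 2)
inv_a = [Fraction(1, 2) ** (k // 2) if k % 2 == 0 else Fraction(0) for k in range(N)]
inv_b = [Fraction(1, 4) ** k for k in range(N)]

g = mul(mul(front, inv_a), inv_b)
print(g[0], g[3])  # P(X=0) = 1/4, P(X=3) = 27/256
```

The coefficient list `g` gives every probability up to the truncation order at once.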

Example (deliberately chosen to be a hard one).
$$p(x) = \binom{r + x - 1}{x} p^r q^x, \qquad x = 0, 1, 2, \ldots, \quad p + q = 1.$$

$$g(s) = \sum_{x=0}^{\infty} \binom{r+x-1}{x} p^r q^x s^x = (1-q)^r \sum_{x=0}^{\infty} \binom{r+x-1}{x} (qs)^x = \frac{(1-q)^r}{(1-qs)^r} \sum_{x=0}^{\infty} \binom{r+x-1}{x} (1-qs)^r (qs)^x = \left(\frac{1-q}{1-qs}\right)^r.$$

(The last sum equals 1 because its terms form a negative binomial mass function with $qs$ in place of $q$.)

Example (deliberately chosen to be a very hard one you have never seen).
$$p(x) = \frac{\mu^x e^{-\mu}}{x!\,(1 - e^{-\mu})}, \qquad x = 1, 2, \ldots$$

$$g(s) = \sum_{x=1}^{\infty} s^x \frac{\mu^x e^{-\mu}}{x!\,(1 - e^{-\mu})} = \frac{e^{-\mu}(1 - e^{-\mu s})}{(1 - e^{-\mu})\, e^{-\mu s}} \sum_{x=1}^{\infty} \frac{(\mu s)^x e^{-\mu s}}{x!\,(1 - e^{-\mu s})} = \frac{e^{-\mu}\left(e^{\mu s} - 1\right)}{1 - e^{-\mu}}.$$

Trick 2: Use the partition theorem for expectation.

Example. A game consists of a number of independent turns at each of which three, and only three, mutually exclusive events can occur:

- A: the game terminates without addition to the score;
- B: the game continues without addition to the score;
- C: the game continues with the addition of one point to the score.

Suppose the respective probabilities of A, B, C are $p$, $q$, $r$, with $p + q + r = 1$. The events are not equiprobable and the average score per game is $m$. Show that the probability of scoring $x$ points in one game is
$$\frac{m^x}{(1+m)^{x+1}}.$$

Condition on the first turn:
$$E\left(s^X\right) = E\left(s^X \mid A\right)p + E\left(s^X \mid B\right)q + E\left(s^X \mid C\right)r,$$
where
$$E\left(s^X \mid A\right) = 1, \qquad E\left(s^X \mid B\right) = E\left(s^X\right), \qquad E\left(s^X \mid C\right) = E\left(s^{X+1}\right) = sE\left(s^X\right).$$

Hence
$$g_X(s) = p + g_X(s)\,q + s\,g_X(s)\,r,$$
so
$$g_X(s) = \frac{p}{1 - q - rs} = \frac{p}{1-q}\left(1 - \frac{rs}{1-q}\right)^{-1} = \frac{p}{1-q}\sum_{k=0}^{\infty}\left(\frac{rs}{1-q}\right)^k.$$
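The closed form for $g_X(s)$ can be sanity-checked against its geometric-series expansion; this is an illustrative sketch (not part of the original notes) with arbitrarily chosen $p$, $q$, $r$ summing to 1:

```python
from fractions import Fraction

# check g_X(s) = p/(1 - q - r s) against the geometric-series form
# (p, q, r are arbitrary values with p + q + r = 1)
p, q, r = Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)
s = Fraction(3, 4)

closed = p / (1 - q - r * s)
series = (p / (1 - q)) * sum((r * s / (1 - q)) ** k for k in range(200))

print(closed, abs(closed - series) < Fraction(1, 10 ** 20))  # 12/13 True
```

The truncated series differs from the closed form only by the exact (astronomically small) geometric tail.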

Picking off the coefficient of $s^x$,
$$P(X = x) = \frac{p}{1-q}\left(\frac{r}{1-q}\right)^x, \qquad x = 0, 1, \ldots$$

The average score per game is
$$m = E(X) = g_X'(1) = \left.\frac{pr}{(1 - q - rs)^2}\right|_{s=1} = \frac{pr}{(1 - q - r)^2} = \frac{r}{p}.$$
[Note that $p + q + r = 1$, so $1 - q - r = p$.] Hence
$$P(X = x) = \left(\frac{p}{p+r}\right)\left(\frac{r}{p+r}\right)^x = \frac{1}{1+m}\left(\frac{m}{1+m}\right)^x = \frac{m^x}{(1+m)^{x+1}}.$$

Trick 3: Use the sum of a sequence of simpler independent random variables. If
$$X = \sum_{i=1}^{n} Y_i$$
with the $Y_i$ independent, then
$$g_X(s) = \prod_{i=1}^{n} g_{Y_i}(s),$$
which is $\left(g_Y(s)\right)^n$ if the $Y_i$ are identically distributed.

Example. Derive the p.g.f. of $X \sim B(n, p)$.

Let $Y \sim B(1, p)$, so that
$$g_Y(s) = qs^0 + ps^1 = q + ps.$$
Then $X = \sum_{i=1}^{n} Y_i$ and
$$g_X(s) = (q + ps)^n.$$

Example. Consider a sequence of success-failure trials and let the probability of a success for an individual trial be $p$. Find the probability mass function for the number of failures $X$ before the $n$th success occurs.

Let $Y$ be the number of failures which precede the 1st success. Then
$$P(Y = k) = q^k p = (1-q)q^k, \qquad k = 0, 1, \ldots,$$
and (using Trick 1 again, since $(1-qs)(qs)^k$ is a geometric mass function)
$$g_Y(s) = \sum_{k=0}^{\infty} (1-q)q^k s^k = \frac{1-q}{1-qs}\sum_{k=0}^{\infty} (1-qs)(qs)^k = \frac{1-q}{1-qs} = \frac{p}{1-qs}.$$

Now
$$X = \sum_{i=1}^{n} Y_i,$$

k! (n − 1)! k n µ k=0 k=0 Thus ¥ µ ¶ n+x−1 n x P (X = x) = p q .. .. 1! 2! ∞ ∞ X (n + k − 1)! X µn + k − 1¶ = pn q k sk = pn q k sk . . . 1.so ¶n p gX (s) = [gY (s)] = 1 − qs ¶ µ n (n + 1) 2 2 n n = p 1 + qs + q s + . x x = 0. 5 .