
Chapter 3: Discrete distribution

Shilpa G.

1 / 29
Definition
Let X be a discrete random variable. The function f given by

f (x) = P[X = x]

for x real is called the (probability) density function for X .

Remark: Some authors use the letter 'p' instead of 'f' to denote the PDF.


Also, some authors call it (probability) mass function.

Necessary and sufficient conditions for a function to be a
discrete density:

f(x) ≥ 0 for all x,

Σ_{all x} f(x) = 1.
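The two conditions above are easy to check numerically. A minimal sketch (the helper name `is_discrete_density` and the die example are my own illustration, not from the slides):

```python
from fractions import Fraction

def is_discrete_density(f, support):
    """Check f(x) >= 0 on the support and that the values sum to 1."""
    values = [f(x) for x in support]
    return all(v >= 0 for v in values) and sum(values) == 1

# Example: a fair six-sided die, f(x) = 1/6 for x = 1, ..., 6.
print(is_discrete_density(lambda x: Fraction(1, 6), range(1, 7)))  # True
```

Using `Fraction` keeps the sum exact, so the "sums to 1" condition can be tested with equality rather than a floating-point tolerance.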

2 / 29
Representing PDF

1. Mathematical expression

2. Tabular form

3. Diagrammatic representation

3 / 29
Example
1. Tossing a coin 3 times. X is the total number of heads that appear.
(Remember: when no information is provided, we use the classical
definition of probability.)

2. f(y) = (1/2)^y for y = 1, 2, 3, . . . ; f(y) = 0 otherwise.
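Both examples can be worked out by brute force. The sketch below tabulates the PDF of example 1 from the 2^3 equally likely outcomes (the classical definition), and checks that the partial sums of example 2 approach 1 (the variable names are mine):

```python
from itertools import product
from fractions import Fraction

# Example 1: classical definition -- all 2^3 outcomes are equally likely.
outcomes = list(product("HT", repeat=3))
f = {k: Fraction(sum(o.count("H") == k for o in outcomes), len(outcomes))
     for k in range(4)}
# f[0] = 1/8, f[1] = 3/8, f[2] = 3/8, f[3] = 1/8

# Example 2: partial sums of (1/2)^y tend to 1 as more terms are added.
partial = sum(Fraction(1, 2**y) for y in range(1, 21))
print(float(partial))  # approximately 0.999999
```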

Check whether each of the following is a PDF.


1. f(x) = (x − 2)/2 for x = 1, 2, 3, 4.

2. f(x) = (x − 3)²/5 for x = 3, 4, 5.

3. f(x) = x²/25 for x = 0, 1, 2, 3, 4.
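The three candidates can be checked against both conditions directly. A sketch (assuming the formulas as stated above; the dictionary layout is my own):

```python
from fractions import Fraction

candidates = {
    "f1": (lambda x: Fraction(x - 2, 2), [1, 2, 3, 4]),
    "f2": (lambda x: Fraction((x - 3) ** 2, 5), [3, 4, 5]),
    "f3": (lambda x: Fraction(x ** 2, 25), [0, 1, 2, 3, 4]),
}
for name, (f, support) in candidates.items():
    vals = [f(x) for x in support]
    print(name, all(v >= 0 for v in vals), sum(vals) == 1)
# f1 sums to 1 but f1(1) = -1/2 < 0  -> not a PDF
# f2 is nonnegative and sums to 1    -> a PDF
# f3 is nonnegative but sums to 6/5  -> not a PDF
```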

4 / 29
Describe RVs (use 'canonical' choices when possible) and
corresponding PDFs.

1. A letter of the alphabet is chosen at random.

2. 5 dice are tossed simultaneously and the numbers appearing on
top are noted down. This is repeated until we get 6 on top of
each die involved.

3. A biased coin with probability of head equal to 0.6 is tossed
10 times. We count how many trials resulted in a head.
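For the third RV the canonical choice is X = number of heads, with the binomial PDF f(k) = C(10, k)(0.6)^k(0.4)^(10−k). A quick tabulation (the variable names are mine):

```python
from math import comb

n, p = 10, 0.6
f = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}
print(round(f[6], 4))  # 0.2508 -- the most likely count is near n*p = 6
print(sum(f.values())) # approximately 1, up to floating-point error
```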

5 / 29
Definition
Let X be a discrete random variable with density f. The
cumulative distribution function for X, denoted by F, is defined
by
F(x) = P[X ≤ x] for x real.
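For a discrete RV, F is obtained from f by accumulating the density over the support. A sketch using the 3-coin-toss PDF from the earlier example (the variable names are mine):

```python
from fractions import Fraction
from itertools import accumulate

# PDF of the number of heads in 3 fair coin tosses.
support = [0, 1, 2, 3]
f = [Fraction(1, 8), Fraction(3, 8), Fraction(3, 8), Fraction(1, 8)]

# F(x) = sum of f(t) over all t <= x in the support.
F = dict(zip(support, accumulate(f)))
print(F[1])  # 1/2
```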

6 / 29
Definition
Let X be a discrete random variable with density f. Let H(X) be a
random variable. The expected value of H(X), denoted by
E[H(X)], is given by

E[H(X)] := Σ_{all x} H(x) f(x)

provided Σ_{all x} |H(x)| f(x) is finite.

E[X] is called the mean of the random variable X. It is denoted by µ
or µX. It is also called the expected value, (weighted) average value, or
mean value. It is also referred to as a 'location' parameter.
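The defining sum translates directly into code. A sketch computing E[X] and E[X²] for the 3-coin-toss PDF (the helper name `expectation` is mine):

```python
from fractions import Fraction

# PDF of the number of heads in 3 fair coin tosses.
f = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def expectation(H, f):
    """E[H(X)] = sum over the support of H(x) * f(x)."""
    return sum(H(x) * fx for x, fx in f.items())

print(expectation(lambda x: x, f))      # 3/2 -- the mean number of heads
print(expectation(lambda x: x**2, f))   # 3
```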

7 / 29
Theorem
Let X and Y be random variables and let c be any real number.
Then,
1. E [c] = c.
2. E [cX ] = cE [X ].
3. E [X + Y ] = E [X ] + E [Y ].
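Parts 1 and 2 can be verified on any small density; part 3 needs the joint distribution of X and Y, which is omitted here. A sketch with an illustrative density of my own choosing:

```python
from fractions import Fraction

f = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}
E = lambda g: sum(g(x) * fx for x, fx in f.items())

c = 5
print(E(lambda x: c) == c)                        # E[c] = c
print(E(lambda x: c * x) == c * E(lambda x: x))   # E[cX] = c E[X]
print(E(lambda x: x))                             # 5/3
```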

8 / 29
Definition
We say two random variables X and Y are independent iff

P[X = x, Y = y ] = P[X = x]P[Y = y ]

for all observed values x of X and y of Y .


Remark: We will see the precise definition of independence when we
study PDFs of two variables.
Fact
(i) Let X and Y be independent random variables. Then, f (X )
and g (Y ) are independent for any choice of continuous functions f
and g of X and Y respectively.
(ii) Let X and Y be independent random variables. Then,

E [XY ] = E [X ]E [Y ].
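Fact (ii) can be checked on a small example: under independence the joint density factors as P[X = x, Y = y] = fX(x) fY(y). A sketch with illustrative marginals of my own choosing:

```python
from fractions import Fraction
from itertools import product

fX = {1: Fraction(1, 2), 2: Fraction(1, 2)}
fY = {0: Fraction(1, 3), 3: Fraction(2, 3)}

# Independence: the joint density is the product of the marginals.
joint = {(x, y): fX[x] * fY[y] for x, y in product(fX, fY)}

EX = sum(x * p for x, p in fX.items())
EY = sum(y * p for y, p in fY.items())
EXY = sum(x * y * p for (x, y), p in joint.items())
print(EXY == EX * EY)  # True
```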

9 / 29
