"CH 3 1-RANDOM VARIABLES

In Chapter 2 we focused on experiments and whether or not certain events occurred/happened. In this chapter we don't focus on events or on individual outcomes, but rather on measuring some aspect of the experiment.

A random variable: formally, a function which assigns a real number to each sample point in the sample space for a given experiment.

It is easiest to think of a random variable instead as counting/measuring/indicating some aspect of the experiment numerically.
Ex. "Kevin flips a coin 11 times" → EXPERIMENT
  H = # of heads that were flipped → RANDOM VARIABLE

Ex. "Prof. O'Hara is late to class and has to run up the hill to MS5 from Western Road"
  T = time in seconds it takes him to run

Ex. "A student is selected at random from class"
  H: student's height
  Y: current year of program
For us in Math 1228 there are two types of random variables:

Discrete random variables: ones where the possible values the random variable can be are distinct and listable.
  Ex. Y: randomly chosen student's program year = 1, 2, 3, 4
  Ex. M = mark I will get on test 2 = 0, 0.5, 1, 1.5, ..., 25

Continuous random variables: ones whose values fall into a range of real numbers.
  Ex. H: randomly chosen person's height
    * heights are between 0 cm and 300 cm
  Ex. T: temp in the office
    * temp is probably between 18-25 °C

Discrete random variables tend to measure artificial data whereas continuous random variables tend to come from real-world measurement.


Typically, random variables are represented by capital letters.
Ex. Jeff has a jar with 2 black marbles and 3 pink marbles. He reaches in, removes a marble and sets it aside. If the marble is pink, he repeats, but if it is black he stops. Let P: the number of pink marbles he draws before stopping. Find the probability Pr(P = 1).
↳ Draw a probability tree:
  1st draw B (2/5) → stop, P = 0
  1st draw P (3/5), then 2nd draw B (2/4) → stop, P = 1
  Pr(P = 1) = 3/5 × 2/4 = 3/10
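A quick Python check of the tree calculation (my own sketch, not from the lecture): the function below just walks the same tree exactly, drawing without replacement and stopping at the first black marble.

```python
from fractions import Fraction

# Exact walk of the probability tree above: draw without replacement,
# stop at the first black marble, and add up the paths on which exactly
# `target` pink marbles were drawn before stopping.
def prob_pinks_before_stop(blacks, pinks, target):
    def walk(b, p, drawn, prob):
        total = b + p
        if total == 0:                       # only possible if there were no blacks
            return prob if drawn == target else 0
        acc = prob * Fraction(b, total) if drawn == target else 0   # draw black: stop here
        if p > 0:                                                   # draw pink: keep going
            acc += walk(b, p - 1, drawn + 1, prob * Fraction(p, total))
        return acc
    return walk(blacks, pinks, 0, Fraction(1))

print(prob_pinks_before_stop(2, 3, 1))   # 3/10, matching 3/5 x 2/4
```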
Ex. John and Oliver are playing with a coin that comes up heads 30% of the time. John flips this coin twice. For each head he flips, Oliver will pay him $5 and for each tail Oliver will pay him $1. Let X: the amount of money Oliver pays John. Find Pr(X = 6).
↳ Tree of the two flips:
  HH → X = 10
  HT, TH → X = 6
  TT → X = 2
  Pr(X = 6) = 0.3 × 0.7 + 0.7 × 0.3 = 0.21 + 0.21 = 0.42
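A quick enumeration check of Pr(X = 6) (my own sketch; it assumes the payouts as reconstructed above, $5 per head and $1 per tail, with Pr(H) = 0.3).

```python
from itertools import product

# Enumerate the four outcomes of two flips and add up the probabilities
# of the ones that pay out exactly $6.
p_heads = 0.3
prob_6 = 0.0
for flips in product("HT", repeat=2):
    prob = 1.0
    payout = 0
    for f in flips:
        prob *= p_heads if f == "H" else 1 - p_heads
        payout += 5 if f == "H" else 1      # assumed payouts: $5 per H, $1 per T
    if payout == 6:
        prob_6 += prob
print(prob_6)   # 0.42 (up to floating-point rounding)
```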
WARMUP: A chest holds 3 rubies and 2 emeralds. Jewels are removed one at a time and set aside. This is done twice. Let D = # of rubies removed − # of emeralds removed. Determine all possible values D can be.
↳ Let's keep track of the removed jewels with a tree:
  (2R):       D = 2 − 0 = 2
  (1R, 1E):   D = 1 − 1 = 0
  (2E):       D = 0 − 2 = −2
  D can only be 2, 0 or −2.
Binomial random variables: some random variables are more common/important than others.
· They are written B(n, p) and count the number of times the outcome with probability p occurs if you repeat an experiment with two outcomes n times. (See Ch 2.6.)
Ex. Sally rolls a six-sided die 200 times and wants to keep track of the number of '4s' she rolls.
↳ B(200, 1/6) could count the number of '4s'
  n = 200 rolls, p = prob of rolling a 4 = 1/6
Ex. Trevor is flipping a coin that comes up heads 60% of the time. If he flips the coin 36 times, B(36, 0.6) would represent the number of heads flipped.
↳ B(36, 0.4) would count the number of tails flipped.
Ex. Udit works at a local pet rescue. He knows 1/3 of visitors end up adopting a pet. What random variable could be used to count how many out of the next 80 visitors end up adopting a pet?
↳ B(n, p) counts the number of occurrences of the outcome with probability p in n trials
  → B(80, 1/3), with p = prob of a visitor adopting a pet = 1/3 and n = 80 visitors

B(n, p) is discrete; its possible values are 0, 1, 2, ..., n − 1, n.

Using our formula from 2.6, we can calculate the probability of B(n, p) being equal to specific values:
  Pr(B(n, p) = k) = C(n, k) · p^k · (1 − p)^(n−k)

Ex. Find the probability that exactly 71 people adopt a pet.
  Pr(B(80, 1/3) = 71) = C(80, 71) · (1/3)^71 · (2/3)^9
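The binomial formula as a one-line function (a sketch, not from the notes); the numbers n = 80, p = 1/3, k = 71 follow my reading of the pet-rescue example above.

```python
from math import comb

# Pr(B(n, p) = k) = C(n, k) * p^k * (1 - p)^(n - k)
def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binom_pmf(80, 1/3, 71))   # an extremely small probability
```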

"CH 3 1 CONTINUED
. CONSIDERING RANDOM VARIABLES AS A WHOLE OBJECT
"

We to broaden just
are
going
our view and
try to understand the whole random variable X
,
not a

value. We do this of two functions.


specific by relating X to one

· A probability mass function (p.m.f.), known in older texts as a probability distribution function (p.d.f.): a function f_X(x) = Pr(X = x)
· A cumulative distribution function (c.d.f.): a function F_X(x) = Pr(X ≤ x)

Often, for simpler discrete random variables, we represent them as tables of values.

Ex.
  ↳ p.m.f.                  ↳ c.d.f.
  x | Pr(X = x)             x | Pr(X ≤ x)
  1 | 0.15                  1 | 0.15
  2 | 0.3                   2 | 0.45
  3 | 0.5                   3 | 0.95
  4 | 0.05                  4 | 1
  (the left column lists the POSSIBLE VALUES X CAN BE)
  e.g. Pr(X = 2) = 0.3      e.g. Pr(X ≤ 3) = 0.95

Ex. Consider the given mass function for Y; find Pr(Y ≥ 5).
  y | Pr(Y = y)
  2 | 0.2
  4 | 0.4
  5 | 0.1
  8 | 0.3
  Pr(Y ≥ 5) = Pr(Y = 5) + Pr(Y = 8) = 0.1 + 0.3 = 0.4
WARMUP: John and Oliver are flipping a coin that comes up heads 40% of the time. For each head flipped, John pays Oliver $2 and for each tail flipped, John pays Oliver $7. If X: the money John pays after the coin is flipped twice, create a p.d.f. (p.m.f.) for X.

To make a p.d.f. (p.m.f.):
① Find all possible values of X
② For each value, work out the probability that X can be that value

↳ To understand X, let's make a tree:
  HH: 0.4 × 0.4 = 0.16 → X = 4, so Pr(X = 4) = 0.16
  HT, TH: 0.4 × 0.6 + 0.6 × 0.4 = 0.24 + 0.24 = 0.48 → X = 9, so Pr(X = 9) = 0.48
  TT: 0.6 × 0.6 = 0.36 → X = 14, so Pr(X = 14) = 0.36

  x  | Pr(X = x)
  4  | 0.16
  9  | 0.48
  14 | 0.36
Notes about p.m.f.'s (p.d.f.'s) and c.d.f.'s:
· Probabilities in the p.m.f. table always add to 1 (TOTAL PROBABILITY = 1)
· The last value in a c.d.f. is always 1
· To convert/create a c.d.f. from a p.m.f., copy the values of X from the left; then, to calculate the probabilities for the c.d.f., we just add up, starting from the top
· To find the probabilities in a p.m.f. (p.d.f.), take the difference in successive probabilities from the c.d.f.
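A small sketch of the two conversion rules above (not from the lecture); the table values follow my reconstruction of the earlier example (1 through 4 with probabilities 0.15, 0.3, 0.5, 0.05).

```python
from itertools import accumulate

values = [1, 2, 3, 4]
pmf = [0.15, 0.3, 0.5, 0.05]

cdf = list(accumulate(pmf))                                       # add up from the top
pmf_again = [cdf[0]] + [hi - lo for lo, hi in zip(cdf, cdf[1:])]  # successive differences

print(dict(zip(values, cdf)))        # {1: 0.15, 2: 0.45, 3: 0.95, 4: 1.0} up to rounding
print(dict(zip(values, pmf_again)))  # recovers the p.m.f.
```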

Ex. From the p.m.f., make a c.d.f. for X:
  x  | Pr(X = x)        x  | Pr(X ≤ x)
  4  | 0.16             4  | 0.16
  9  | 0.48             9  | 0.64
  14 | 0.36             14 | 1

Ex. From the following p.m.f. (p.d.f.) for a random variable Y, determine Pr(Y = 19). The table gives probabilities p, 1/10, 3/10 and 3p, with Pr(Y = 19) = 3p.
  TOTAL PROBABILITY = 1:
  1 = p + 1/10 + 3/10 + 3p
  1 = 4p + 4/10
  3/5 = 4p
  p = 3/20
  Pr(Y = 19) = 3p = 3(3/20) = 9/20
Ex. Suppose X is a discrete random variable which only takes on integer values. Suppose F is the c.d.f. for X, and we're told F(2) = 0.11, F(3) = 0.2, F(4) = 0.23, F(5) = 0.38. Find Pr(3 < X ≤ 5).

Recall F is a c.d.f., so F(x) means Pr(X ≤ x). So when we see F(4) = 0.23, that's really saying Pr(X ≤ 4) = 0.23.

What values satisfy 3 < X ≤ 5? We need values ≤ 5 but not ≤ 3.
· We reason 3 < X ≤ 5 is all X with X ≤ 5 but NOT X ≤ 3:
  Pr(3 < X ≤ 5) = Pr(X ≤ 5) − Pr(X ≤ 3)
                = F(5) − F(3)
                = 0.38 − 0.2
                = 0.18

In general, if F is a c.d.f., then:
  Pr(a < X ≤ b) = Pr(X ≤ b) − Pr(X ≤ a) = F(b) − F(a)

"CH 3 .
2 .
STATISTICS
P d fs and . .
c .
d fs . are
great tools to
study a whole random variable . However
they can be a bit

difficult to work with


If they're too
big/chaotic .

To make
things a bit easier
,
we tend to calculate
single values to interpret/represent the properties of

our random variables. These values are called statistics

THREE STATISTICS:
· E(X): expected value (or mean, or average) (also written μ_X or μ)
  · Has the same meaning and understanding as an average
· V(X): variance
· σ(X): standard deviation (also written σ_X, "sigma")
  · V(X) and σ(X) both talk about how spread out the values of X are from the average
CALCULATING STATISTICS:
How do we calculate E(X), the expected value or average of X? If we are given a p.d.f., we can find E(X) in a simple way:
① Create a new column consisting of x·Pr(X = x), the product of the first two columns
② Add up the values in the new column

Ex. Suppose X has a mass function of:
  x | Pr(X = x) | x·Pr(X = x)
  3 | 0.3 | 3 × 0.3 = 0.9
  4 | 0.5 | 4 × 0.5 = 2
  7 | 0.2 | 7 × 0.2 = 1.4
Find E(X).
  E(X) = 0.9 + 2 + 1.4 = 4.3
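As a quick check of the two-step recipe, here is the same calculation in a line of Python (my own sketch; the table values are as reconstructed above).

```python
# E(X) for the table above: multiply each value by its probability and add.
pmf = {3: 0.3, 4: 0.5, 7: 0.2}
print(sum(x * p for x, p in pmf.items()))   # 4.3 (up to floating-point rounding)
```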

Ex. Find E(W) given the table:
  w  | Pr(W = w) | w·Pr(W = w)
  −2 | 1/5  | −2/5
  −1 | 1/10 | −1/10
  0  | 2/5  | 0
  1  | 3/10 | 3/10
  E(W) = −2/5 − 1/10 + 0 + 3/10 = −1/5

HOW TO CALCULATE VARIANCE?
There are actually two approaches to calculate V(X), as it has two different formulas.
Approach 1 (the definition): V(X) = E((X − μ)²), where μ = E(X)
  ① Find μ = E(X)
  ② Create a p.d.f. (p.m.f.) for the new random variable (X − μ)²
  ③ Find the expected value of (X − μ)²
Ex. Find V(X) if X has p.d.f.:
  x  | Pr(X = x) | x·Pr(X = x)
  −1 | 1/6 | −1/6
  0  | 1/3 | 0
  1  | 1/2 | 1/2
Here we use the second formula: V(X) = E(X²) − (E(X))².

STEP 1 - Find E(X): E(X) = −1/6 + 0 + 1/2 = 1/3

STEP 2 - Create a p.d.f. (p.m.f.) for the new random variable X²:
↳ Start with the p.d.f. for X and add on a third column calculating x²
  NEW P.D.F.:
  x² | Pr(X² = x²)     | x²·Pr(X² = x²)
  0  | 1/3             | 0
  1  | 1/6 + 1/2 = 2/3 | 2/3

STEP 3 - Find E(X²) and then V(X):
  E(X²) = 0 + 2/3 = 2/3
  V(X) = E(X²) − (E(X))² = 2/3 − (1/3)² = 6/9 − 1/9 = 5/9

HOW TO CALCULATE THE STANDARD DEVIATION: σ(X) = √V(X)
Ex. For that last example, σ(X) = √(5/9) = √5/3
Ex. Greg has a bag with 3 blue marbles and 2 green marbles. He draws 2 marbles out of the bag one at a time, without replacement. If Greg earns $1 for each blue marble drawn and $4 for each green, and X is the amount earned after 2 draws, find σ(X).

Let's start with a tree and make a p.d.f.:
  BB: (3/5)(2/4) = 6/20 = 3/10 → X = 2
  BG, GB: (3/5)(2/4) + (2/5)(3/4) = 6/20 + 6/20 = 6/10 → X = 5
  GG: (2/5)(1/4) = 2/20 = 1/10 → X = 8

  x | Pr(X = x)
  2 | 3/10
  5 | 6/10
  8 | 1/10

Now that we have a p.d.f., we can calculate our statistics.
① Get E(X):
  x | Pr(X = x) | x·Pr(X = x)
  2 | 3/10 | 6/10
  5 | 6/10 | 3
  8 | 1/10 | 8/10
  E(X) = 6/10 + 3 + 8/10 = 22/5

② Now make a p.m.f. for X² and get E(X²):
↳ Find what values X² can be and calculate the probabilities
  x² | Pr(X² = x²) | x²·Pr(X² = x²)
  4  | 3/10 | 12/10
  25 | 6/10 | 15
  64 | 1/10 | 64/10
  E(X²) = 12/10 + 15 + 64/10 = 113/5

③ V(X) = E(X²) − (E(X))² = 113/5 − (22/5)² = 565/25 − 484/25 = 81/25
  σ(X) = √V(X) = √(81/25) = 9/5
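A rough Monte Carlo cross-check of E(X) = 22/5 = 4.4 and σ(X) = 9/5 = 1.8 (my own simulation sketch, mirroring the example above; it is not part of the notes).

```python
import random
from statistics import mean, pstdev

random.seed(1)
earnings = []
for _ in range(200_000):
    bag = ["blue"] * 3 + ["green"] * 2
    draws = random.sample(bag, 2)                       # two draws without replacement
    earnings.append(sum(1 if m == "blue" else 4 for m in draws))

print(round(mean(earnings), 2), round(pstdev(earnings), 2))   # close to 4.4 and 1.8
```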
FORMULAS FOR CALCULATING STATISTICS
Instead of always calculating E(X), V(X), σ(X) directly from a table (p.m.f.), it turns out there are a variety of formulas which can speed up our calculations in certain circumstances.
These formulas fall into one of four categories:

DEFINITIONAL
· V(X) = E(X²) − (E(X))²
· σ(X) = √V(X)  OR  V(X) = (σ(X))²
* TIP: these are used to relate different stats to each other

Ex. If Y is a r.v. and σ(Y) = 3, find V(Y).
  V(Y) = (σ(Y))² = 3² = 9

Ex. Suppose we're told V(W) = 2 and E(W) = −3; find E(W²).
  V(W) = E(W²) − (E(W))²
  2 = E(W²) − (−3)²
  2 = E(W²) − 9
  E(W²) = 11
BINOMIAL
· E(B(n, p)) = np
· V(B(n, p)) = np(1 − p)
· σ(B(n, p)) = √(np(1 − p))

Ex. Find E(B(35, 4/7)).
  E(B(n, p)) = n × p = 35 × 4/7 = 5 × 4 = 20

Ex. Find E((B(50, 2/5))²)  ← asking for E(X²) when X is binomial
  V(X) = E(X²) − (E(X))²
  V(B(50, 2/5)) = E((B(50, 2/5))²) − (E(B(50, 2/5)))²
  50(2/5)(1 − 2/5) = E((B(50, 2/5))²) − (50 × 2/5)²
  12 = E((B(50, 2/5))²) − 400
  E((B(50, 2/5))²) = 412

Ex. Jeff is flipping a fair coin 100 times. If he counts the number of heads flipped as a random variable, find the standard deviation of this random variable.
  σ(B(n, p)) = √(np(1 − p)) = √(100 × 1/2 × 1/2) = √25 = 5
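A numerical check (not from the notes) that the binomial formulas above match the direct table calculation, using Jeff's fair-coin example (n = 100, p = 1/2).

```python
from math import comb

n, p = 100, 0.5
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * pr for k, pr in pmf.items())          # should be np = 50
var  = sum(k**2 * pr for k, pr in pmf.items()) - mean**2   # should be np(1-p) = 25

print(mean, var, var**0.5)   # about 50.0, 25.0, 5.0 (up to rounding)
```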

ARITHMETIC - If a, b, c are real numbers:
· E(aX + b) = aE(X) + b
· E(aX + bY + c) = aE(X) + bE(Y) + c
· V(aX + b) = a²V(X)
· σ(aX + b) = |a|σ(X)

Ex. Suppose we know E(X) is 4; find E(−3X + 2).
  E(−3X + 2) = (−3)E(X) + 2 = (−3)(4) + 2 = −12 + 2 = −10

Ex. Suppose E(2W + 1) = 5; find E(−W + 7).
  We know 5 = E(2W + 1) = 2E(W) + 1
  4 = 2E(W), so E(W) = 2
  E(−W + 7) = −E(W) + 7 = −2 + 7 = 5

Ex. Suppose σ(Y) = 3; find σ(−4Y − 3).
  σ(−4Y − 3) = |−4|σ(Y) = (4)(3) = 12

Ex. Find V(X) if E(X²) = 18 and E(−X + 3) = 4.
  −E(X) + 3 = 4, so E(X) = −1
  V(X) = E(X²) − (E(X))² = 18 − (−1)² = 18 − 1 = 17

Ex. If E(Y) = 5, find E(2Y − B(10, 1/5)).
  E(2Y − B(10, 1/5)) = 2E(Y) − E(B(10, 1/5)) = 2(5) − 10(1/5) = 10 − 2 = 8
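A quick check of the arithmetic rules on a small made-up p.m.f. (the table here is my own illustration, not from the notes).

```python
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def E(g):
    """Expected value of g(X) over the p.m.f. above."""
    return sum(g(x) * p for x, p in pmf.items())

a, b = -3, 2
lhs = E(lambda x: a * x + b)            # E(aX + b) computed directly
rhs = a * E(lambda x: x) + b            # aE(X) + b
var        = E(lambda x: x**2) - E(lambda x: x)**2
var_scaled = E(lambda x: (a * x + b)**2) - E(lambda x: a * x + b)**2

print(lhs, rhs)                  # both approximately -1.3
print(var_scaled, a**2 * var)    # equal: V(aX + b) = a^2 V(X)
```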
WARMUP: Suppose for random variable Y we know V(−3Y) = 18. Find V(Y).
↳ Recall V(aX + b) = a²V(X)
  So 18 = V(−3Y) = (−3)²V(Y) = 9V(Y)
  V(Y) = 2

We have one final group of formulas, but they first require a definition. Recall, two events are independent if knowing about one (happening or not happening) doesn't tell you anything about the other:
  Pr(E) = Pr(E|F)  ← "E & F ARE INDEPENDENT": the likelihood of E is the same even if we assume F happened

Inspired by this idea, we say two random variables X and Y are independent if knowing the value X is at the end of the experiment doesn't make any value of Y more or less likely (knowing X's value doesn't tell you anything about Y's value):
  Pr(X = a) = Pr(X = a | Y = b) for all values a, b that X and Y can be

Future courses will show the massive importance of independent random variables. For Math 1228, we need to invoke independent random variables for our final formulas:

INDEPENDENCE
* These formulas ONLY WORK if X, Y are independent. If a, b are real numbers:
· E(XY) = E(X)E(Y)
· V(X + Y) = V(X) + V(Y)
· V(X − Y) = V(X) + V(Y)
· V(aX + bY) = a²V(X) + b²V(Y)
· V(aX − bY) = a²V(X) + b²V(Y)
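A brute-force check of these rules (my own illustration, not from the notes): build the joint distribution of two independent made-up random variables and compare both sides.

```python
pmf_x = {0: 0.5, 1: 0.5}
pmf_y = {1: 0.25, 3: 0.75}

# Independence means the joint probabilities are just the products.
joint = {(x, y): px * py for x, px in pmf_x.items() for y, py in pmf_y.items()}

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

def V(g):
    return E(lambda x, y: g(x, y) ** 2) - E(g) ** 2

print(E(lambda x, y: x * y), E(lambda x, y: x) * E(lambda x, y: y))   # E(XY) = E(X)E(Y)
print(V(lambda x, y: x + y), V(lambda x, y: x) + V(lambda x, y: y))   # V(X+Y) = V(X)+V(Y)
```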

Ex. Suppose X and Y are random variables for the same experiment with E(X) = −4 and E(Y) = 3. Find E(XY − 3).
↳ We aren't told X, Y are independent, so we can't use E(XY) = E(X)E(Y) → CANNOT BE DETERMINED

Ex. Suppose X and Y are independent random variables with E(X) = −4, E(Y) = 3. Find E(XY − 3).
  E(XY − 3) = E(XY) − 3 = E(X)E(Y) − 3 = (−4)(3) − 3 = −15

Ex. Suppose X and Y are independent with V(X) = 1, V(Y) = 40. Find σ(−3X + Y).
  σ(−3X + Y) = √V(−3X + Y) = √((−3)²V(X) + V(Y)) = √(9(1) + 40) = √49 = 7

CH 4.1 - CONTINUOUS RANDOM VARIABLES

Recall, discrete random variables have a listable set of possible values. They tend to appear easily in survey data and other human activities.

Continuous random variables have their values fall anywhere in a given interval/range of real numbers. Continuous random variables are more abstract and tend to come from real-world/natural measurements.

Discrete r.v.s are studied/summarized with our mass functions (p.m.f.s/p.d.f.s). Continuous r.v.s can't be summarized this way, as they have an infinite number of possible values.

For a continuous r.v. X, we create a function f, called a probability density function (p.d.f.), so that for any real numbers a and b, the probability Pr(a < X ≤ b) is equal to the area under y = f(x) over the interval from a to b.
  [sketch: shaded area under f between a and b = Pr(a < X ≤ b)]

This is a VERY big connection. From now on: AREA = PROBABILITY.

Because of this connection, when dealing with continuous r.v.s we only ever talk about probabilities with inequalities: if X is continuous, Pr(X = c) = 0.
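A sketch of the area-as-probability idea (my own illustration, assuming a uniform density f(x) = 1/10 on [0, 10], which is not from the notes): approximate Pr(2 < X ≤ 5) as the area under f between 2 and 5.

```python
# Midpoint-rule approximation of the area under the density between a and b.
def f(x):
    return 0.1 if 0 <= x <= 10 else 0.0

a, b, steps = 2, 5, 100_000
width = (b - a) / steps
area = sum(f(a + (i + 0.5) * width) * width for i in range(steps))

print(area)   # about 0.3, i.e. Pr(2 < X <= 5) = 3/10 for this density
```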
