
A quick review of calculus Econ 3334

 Use calculus to find Min or Max of functions


 Multivariable functions
z = −5x² + 10xy − 20x − 7y² + 240y − 5300
 Find the x & y that give the max z
∂z/∂x = 0 and ∂z/∂y = 0 are the first-order conditions (FOCs)
 Solve simultaneously for x & y
∂z/∂x = −10x + 10y − 20 = 0
∂z/∂y = 10x − 14y + 240 = 0
 Solve simultaneously: x* = 53, y* = 55
 Plug x* and y* into z to get z* = 770
 Is it a maximum or a minimum? That requires the second-order conditions (don't worry about them here). Here it is a maximum.
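As a quick numerical check (not part of the original slides), the FOC system can be solved symbolically; the sketch below assumes SymPy is installed:

```python
# Sketch: solve the first-order conditions for the example above (assumes SymPy).
import sympy as sp

x, y = sp.symbols('x y')
z = -5*x**2 + 10*x*y - 20*x - 7*y**2 + 240*y - 5300

# FOCs: set both partial derivatives to zero and solve the linear system.
crit = sp.solve([sp.diff(z, x), sp.diff(z, y)], [x, y])
print(crit)           # {x: 53, y: 55}
print(z.subs(crit))   # 770
```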
Summations Econ 3334

 We’ll use summation notation a lot


If x = 8, −3, 4, 2, −1, n = 5
then ∑ i=1
n
x i = 8 + −3 + 4 + 2 + −1 = 10
If y = 0, 0, 10, −1, −1, then ∑ y = 8
If a is a constant, then ∑_{i=1}^{n} a = a + a + ⋯ + a (n times) = n⋅a

e.g. ∑_{i=1}^{99} 10 = 10 + 10 + ⋯ + 10 (99 times) = 99 ⋅ 10 = 990
 Note: (∑ x_i) ⋅ (∑ y_i) = 10 ⋅ 8 = 80
≠ ∑ x_i⋅y_i = 8⋅0 + (−3)⋅0 + 4⋅10 + 2⋅(−1) + (−1)⋅(−1) = 39
Summations Econ 3334

 A few important properties

∑ a⋅x_i = a ⋅ ∑ x_i
∑ a = n ⋅ a
∑ x_i⋅y_i = ∑ y_i⋅x_i
∑ (a⋅x_i + b⋅y_i) = a ⋅ ∑ x_i + b ⋅ ∑ y_i
∑_{j=1}^{n} (∑_{i=1}^{n} x_i) = n ⋅ ∑ x_i
Econ 3334

Review of Probability Theory

1) Random variable and distribution Econ 3334
 Outcome: a potential result
 Sample space: the set of all possible outcomes
 Probability: the proportion of the time that an outcome occurs
 Event: a set of one or more outcomes (a subset of the sample space)
 e.g.
Grade   Value   Probability
A       4.0     .15
B       3.0     .40
C       2.0     .30
D       1.0     .10
F       0.0     .05
 A, B, C, D, F are each outcomes; together they form the sample space
 ∑ prob = 1 and each prob ∈ [0, 1]
1) Random variable and distribution Econ 3334
 Let V = the event that a random student gets an A or a B
In this example "Grade" is a discrete random variable
P(V) = P(A) + P(B) = .15 + .40 = .55

 Two types of random variables (RV):


• Discrete random variable: takes only a discrete set
of values, like 0,1,2…,
• Continuous random variable: takes on a continuum
of possible values.
1) Random variable and distribution Econ 3334

 Probability Distribution
For a discrete random variable, the probability distribution
lists all the probabilities (e.g. our grade example)

Outcome       x_1   x_2   ...   x_k
Probability   p_1   p_2   ...   p_k
1) Random variable and distribution Econ 3334
 For a continuous random variable, we have a function called the
probability density function (PDF) that plays a similar role: the
area under it over an interval gives the probability that the RV
takes on a value within that interval.

 E.g. the normal distribution:
φ(x; μ, σ) = (1 / (√(2π) ⋅ σ)) ⋅ exp(−(x − μ)² / (2σ²))
• When μ = 0 and σ = 1: φ(x) = (1 / √(2π)) ⋅ exp(−x² / 2)
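As an illustration (not from the slides), the density can be evaluated directly from this formula and compared with SciPy's implementation; the sketch assumes NumPy and SciPy are installed:

```python
# Sketch: evaluate the normal PDF by the formula and compare with scipy.stats.norm.
import numpy as np
from scipy.stats import norm

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) evaluated at x."""
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

x = 1.5
print(normal_pdf(x))                # formula above with mu = 0, sigma = 1
print(norm.pdf(x, loc=0, scale=1))  # same value from SciPy
```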
1) Random variable and distribution Econ 3334

 Normal probability density


1) Random variable and distribution Econ 3334

 CDF (cumulative distribution function)


 If X is a random variable, then its CDF is defined
for any real number x by
Fx = PrX ≤ x
 So, F−∞ = 0; F∞ = 1 , and F takes values between 0
and 1
 If a<b, then F(a) ≤ F(b), (that is, F is non-decreasing
in its argument)
 If a<b, then Pr(a<X<b)=F(b)-F(a)
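A small illustration (not in the slides), again assuming SciPy, checking Pr(a < X ≤ b) = F(b) − F(a) for a standard normal:

```python
# Sketch: Pr(a < X <= b) = F(b) - F(a), checked for a standard normal X.
from scipy.stats import norm

a, b = -1.0, 1.0
prob = norm.cdf(b) - norm.cdf(a)
print(prob)   # about 0.6827: the familiar "68% within one standard deviation"
```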
2) Expectation, Mean, and Variance Econ 3334

 Expected value of RV X is the average value of the RV


over many repeated trials.
EX = μ X = mean of X
 For discrete random variables:
EX = μ X = ∑ i x i ⋅ px i 
 Grade   Value   Probability
A       4.0     .15
B       3.0     .40
C       2.0     .30
D       1.0     .10
F       0.0     .05
E(X) = ∑ x_i ⋅ P(x_i) = 4.0 ⋅ .15 + 3.0 ⋅ .40 + ⋯ + 0 ⋅ .05 = 2.5
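A quick check of this calculation in plain Python (not part of the slides):

```python
# Sketch: expected value of the grade distribution.
values = [4.0, 3.0, 2.0, 1.0, 0.0]        # grade points for A, B, C, D, F
probs  = [0.15, 0.40, 0.30, 0.10, 0.05]

mean = sum(v * p for v, p in zip(values, probs))
print(mean)   # 2.5
```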
2) Expectation, Mean, and Variance Econ 3334

 For a continuous variable

EX  = μ X = ∫ x ⋅ fx dx,


where fx  is the probability density function (PDF).
2) Expectation, Mean, and Variance Econ 3334

 Variance & Standard deviation are measures of “spread”


varX = σ 2X = E X − μ X  2 = EX 2  − μ 2X
 The standard deviation is simply
σX = σ 2X
 For a discrete RV,
varX  = σ 2X = ∑x i − μX  2 ⋅ px i 
i
 For a continuous RV,

varX = ∫x − μ X  2 ⋅ fxdx


2) Expectation, Mean, and Variance Econ 3334

 Example:
Grade   Value   Probability
A       4.0     .15
B       3.0     .40
C       2.0     .30
D       1.0     .10
F       0.0     .05
σ_X² = (4 − 2.5)² ⋅ 0.15 + (3 − 2.5)² ⋅ 0.40 + ⋯ + (0 − 2.5)² ⋅ 0.05 = 1.05
σ_X = √1.05 ≈ 1.025
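And the corresponding variance check, a sketch using the same hypothetical lists as before:

```python
# Sketch: variance and standard deviation of the grade distribution.
import math

values = [4.0, 3.0, 2.0, 1.0, 0.0]
probs  = [0.15, 0.40, 0.30, 0.10, 0.05]

mean = sum(v * p for v, p in zip(values, probs))
var  = sum((v - mean)**2 * p for v, p in zip(values, probs))
print(var)             # 1.05
print(math.sqrt(var))  # about 1.025
```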
2) Expectation, Mean, and Variance Econ 3334

 Suppose
Y = a + bX
E(Y) = E(a + bX) = a + b⋅E(X)
Var(Y) = Var(a + bX) = 0 + b²⋅Var(X)

 E.g. consumption & income:
C = 1000 + 0.9⋅I,   E(I) = 4000,   Var(I) = 800
 Then
E(C) = 1000 + 0.9⋅E(I) = 1000 + 0.9 ⋅ 4000 = 4600
var(C) = 0.9² ⋅ var(I) = 0.81 ⋅ 800 = 648


2) Expectation, Mean, and Variance Econ 3334

 Measures of Shape of Dist


1)Mean (Central tendency)
2)Variance & Standard deviation (Dispersion)
3) skewness = measure of asymmetry of a distribution
E X−μ X  3
skewness =
σ 3X

 Skewness = 0: distribution is symmetric


 Skewness > (<) 0: distribution has long right (left) tail
2) Expectation, Mean, and Variance Econ 3334

 4) kurtosis = measure of mass in tails


= measure of probability of large values

E X−μ X  4
kurtosis =
σ 4X

 kurtosis = 3: normal distribution


 kurtosis > 3: heavy tails (“leptokurtic”)
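A sketch computing these shape measures for the grade distribution (illustration only, not from the slides):

```python
# Sketch: skewness and kurtosis of the discrete grade distribution.
values = [4.0, 3.0, 2.0, 1.0, 0.0]
probs  = [0.15, 0.40, 0.30, 0.10, 0.05]

mean = sum(v * p for v, p in zip(values, probs))
var  = sum((v - mean)**2 * p for v, p in zip(values, probs))
skew = sum((v - mean)**3 * p for v, p in zip(values, probs)) / var**1.5
kurt = sum((v - mean)**4 * p for v, p in zip(values, probs)) / var**2

print(skew)  # negative: the distribution has a longer left tail
print(kurt)
```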
3) Two Random Variables Econ 3334

 Consider 2 RVs X & Y


e.g. X is Income, Y is Gender
e.g. X is Parent's height, Y is Child's height
e.g. X is Price, Y is Quantity demanded

 Joint Prob Dist

 Marginal Dist

 Conditional Dist
3) Two Random Variables Econ 3334
 Joint distribution: random variables X and Y have a joint
distribution, which gives the probability that the random variables
simultaneously take on certain values, say x and y.
 The time to commute to school, and the weather
Time\weather 0 (raining) 1 (not raining)
40 min 0.08 0.15
50 min 0.20 0.25
60 min 0.28 0.04

 Joint = “and”:
P(X = x, Y = y)
e.g. P(X = 60, Y = 0) = 0.28
3) Two Random Variables Econ 3334
 Marginal probability distribution of a random variable Y is
just another name for its probability distribution.
 This term is used to distinguish the distribution of Y alone
from its joint distribution with another random variable, say X.
 P(Y = y) = ∑_i P(X = x_i, Y = y)
P(Y = 0) = ∑_i P(X = x_i, Y = 0) = 0.08 + 0.20 + 0.28 = 0.56

 Time\weather 0 (raining) 1 (not raining) Marginal of time


40 min 0.08 0.15 0.23
50 min 0.20 0.25 0.45
60 min 0.28 0.04 0.32
Marginal of weather 0.56 0.44 1
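A sketch (not from the slides) that stores the joint commute-time/weather table in a hypothetical dictionary and recovers the marginals:

```python
# Sketch: marginal distributions from the joint commute-time / weather table.
times = [40, 50, 60]                 # minutes
weather = [0, 1]                     # 0 = raining, 1 = not raining
joint = {(40, 0): 0.08, (40, 1): 0.15,
         (50, 0): 0.20, (50, 1): 0.25,
         (60, 0): 0.28, (60, 1): 0.04}

marg_time = {t: sum(joint[(t, w)] for w in weather) for t in times}
marg_weather = {w: sum(joint[(t, w)] for t in times) for w in weather}

print(marg_time)     # {40: 0.23, 50: 0.45, 60: 0.32}  (up to floating-point rounding)
print(marg_weather)  # {0: 0.56, 1: 0.44}
```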
3) Two Random Variables Econ 3334

 Conditional Distribution:
PX=x,Y=y
PX = x ∣ Y = y = PY=y
PX=x,Y=y
PY = y ∣ X = x = PX=x

• What is the probability of X = 60 given that Y = 1?
P(X = 60 ∣ Y = 1) = 0.04 / 0.44 ≈ 0.09
• What is the probability of Y = 1 given that X = 60?
P(Y = 1 ∣ X = 60) = 0.04 / 0.32 = 0.125
3) Two Random Variables Econ 3334

Time\weather 0 (raining) 1 (not raining) Marginal of time


40 min 0.08 0.15 0.23
50 min 0.20 0.25 0.45
60 min 0.28 0.04 0.32
Marginal of weather 0.56 0.44 1

Time      Conditional distribution      Conditional distribution
          given weather = 0             given weather = 1
40 min    0.08/0.56 = 0.143             0.15/0.44 = 0.341
50 min    0.20/0.56 = 0.357             0.25/0.44 = 0.568
60 min    0.28/0.56 = 0.500             0.04/0.44 = 0.091
Sum       1                             1
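The same conditional distributions, computed from the joint table (a sketch with the same hypothetical dictionary as above):

```python
# Sketch: conditional distribution of commute time given the weather.
times = [40, 50, 60]
joint = {(40, 0): 0.08, (40, 1): 0.15,
         (50, 0): 0.20, (50, 1): 0.25,
         (60, 0): 0.28, (60, 1): 0.04}

for w in (0, 1):
    p_w = sum(joint[(t, w)] for t in times)           # marginal P(Y = w)
    cond = {t: joint[(t, w)] / p_w for t in times}     # P(X = t | Y = w)
    print(w, {t: round(p, 3) for t, p in cond.items()})
# weather = 0: {40: 0.143, 50: 0.357, 60: 0.5}
# weather = 1: {40: 0.341, 50: 0.568, 60: 0.091}
```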
3) Two Random Variables Econ 3334

 X & Y are independent if


PX = x ∣ Y = y = PX = x
for all the possible x and y.
 i.e. “given” Y does not change the probability
 Since Px=x,Y=y
PX = x ∣ Y = y = PY=y

then X & Y are independent of

PX = x, Y = y = PX = x ⋅ PY = y


for all the possible x and y.
3) Two Random Variables Econ 3334
 Let’s check our time/weather example
Time\weather 0 (raining) 1 (not raining) Marginal of time
40 min 0.08 0.15 0.23
50 min 0.20 0.25 0.45
60 min 0.28 0.04 0.32
Marginal of weather 0.56 0.44 1

 PX = 60 ∣ Y = 1 = 0. 09
PX = 60 = 0. 32
0. 09 ≠ 0. 32
so clearly X & Y are NOT independent
3) Two Random Variables Econ 3334
 Conditional expectation is the mean of Y given X = x:
E(Y ∣ X = x) = ∑_i y_i ⋅ P(Y = y_i ∣ X = x)
 Conditional expectation is the mean of X given Y = y:
E(X ∣ Y = y) = ∑_i x_i ⋅ P(X = x_i ∣ Y = y)

 If X and Y are independent, then


EY ∣ X = EY and EX ∣ Y = EX
3) Two Random Variables Econ 3334

 Time/weather example
Time      Conditional distribution      Conditional distribution
          given weather = 0             given weather = 1
40 min    0.08/0.56 = 0.143             0.15/0.44 = 0.341
50 min    0.20/0.56 = 0.357             0.25/0.44 = 0.568
60 min    0.28/0.56 = 0.500             0.04/0.44 = 0.091
Sum       1                             1

E(X ∣ Y = 0) = 40 ⋅ (0.08/0.56) + 50 ⋅ (0.20/0.56) + 60 ⋅ (0.28/0.56) = 53.57
E(X ∣ Y = 1) = 40 ⋅ (0.15/0.44) + 50 ⋅ (0.25/0.44) + 60 ⋅ (0.04/0.44) = 47.5
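Computed from the joint table, as a sketch (same hypothetical dictionary as before):

```python
# Sketch: conditional expectation of commute time given the weather.
times = [40, 50, 60]
joint = {(40, 0): 0.08, (40, 1): 0.15,
         (50, 0): 0.20, (50, 1): 0.25,
         (60, 0): 0.28, (60, 1): 0.04}

for w in (0, 1):
    p_w = sum(joint[(t, w)] for t in times)
    cond_mean = sum(t * joint[(t, w)] / p_w for t in times)
    print(w, round(cond_mean, 2))   # weather 0 -> 53.57, weather 1 -> 47.5
```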
3) Two Random Variables Econ 3334

 Suppose that X is a random variable that takes the value
x_1 with probability p_1,
x_2 with probability p_2,
...
x_k with probability p_k.
 Suppose that Y is a random variable that takes the value
y_1 with probability q_1,
y_2 with probability q_2,
...
y_m with probability q_m.
3) Two Random Variables Econ 3334

 Define a random variable E(Y ∣ X) that takes the value
E(Y ∣ X = x_1) with probability p_1,
E(Y ∣ X = x_2) with probability p_2,
...
E(Y ∣ X = x_k) with probability p_k.
This is just how we define the random variable E(Y ∣ X).
 Define a random variable E(X ∣ Y) that takes the value
E(X ∣ Y = y_1) with probability q_1,
E(X ∣ Y = y_2) with probability q_2,
...
E(X ∣ Y = y_m) with probability q_m.
This is just how we define the random variable E(X ∣ Y).
3) Two Random Variables Econ 3334

 Law of Iterated Expectations (LIE):


EY = EEY ∣ X

E EY ∣ X
take the expectation w ith respect to Y given X

take the expectation w ith respect to X

 EX = EEX ∣ Y


 In the discrete case
EX = ∑EX ∣ Y = y i  ⋅ pY = y i 
EY = ∑EY ∣ X = x i  ⋅ pX = x i 
3) Two Random Variables Econ 3334
 Let’s check our time/weather example
Time\weather 0 (raining) 1 (not raining) Marginal of time
40 min 0.08 0.15 0.23
50 min 0.20 0.25 0.45
60 min 0.28 0.04 0.32
Marginal of weather 0.56 0.44 1

 EX = ∑ x ⋅ PX = x  = 40 ⋅ 0. 23 + 50 ⋅ 0. 45 + 60 ⋅ 0. 32 = 50. 9


i i

 We show: EX ∣ Y = 0 = 40 ⋅ 0.08 0.56


+ 50 ⋅ 0.20
0.56
+ 60 ⋅ 0.28
0.56
= 53. 57

EX ∣ Y = 1 = 40 ⋅ 0.15
0.44
+ 50 ⋅ 0.25
0.44
+ 60 ⋅ 0.04
0.44
= 47. 5

 By LIE, EX = ∑ EX ∣ Y = yi  ⋅ PY = yi 


= EX ∣ Y = 0 ⋅ pY = 0 + EX ∣ Y = 1 ⋅ pY = 1
= 53. 57 ⋅ 0. 56 + 47. 5 ⋅ 0. 44
= 50. 9
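A sketch verifying the law of iterated expectations numerically on the same hypothetical table:

```python
# Sketch: check E(X) = E[E(X | Y)] on the commute-time / weather table.
times = [40, 50, 60]
joint = {(40, 0): 0.08, (40, 1): 0.15,
         (50, 0): 0.20, (50, 1): 0.25,
         (60, 0): 0.28, (60, 1): 0.04}

# Direct mean from the marginal distribution of X.
mean_direct = sum(t * sum(joint[(t, w)] for w in (0, 1)) for t in times)

# Iterated mean: average the conditional means over the marginal of Y.
mean_lie = 0.0
for w in (0, 1):
    p_w = sum(joint[(t, w)] for t in times)
    cond_mean = sum(t * joint[(t, w)] / p_w for t in times)
    mean_lie += cond_mean * p_w

print(mean_direct, mean_lie)   # both 50.9 (up to rounding)
```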
3) Two Random Variables Econ 3334
 Properties of conditional expectation
 Ec|X = c, where c is a constant
 EaY + b|X = aEY|X + b
 EfX|X = fX
 Ef 1 X ⋅ f 2 Y|X = f 1 X ⋅ Ef 2 Y|X
 EY|X = EY if X and Y are independent
 EY = EEY|X (law of iterated expectation)
 Example EU|X = 0 implies that EU = 0 and EUX = 0
Proof: EU = EEU|X = E0 = 0 (law of iterated expectation)
EUX = EEUX|X (law of iterated expectation)
= EX ⋅ EU|X = EX ⋅ 0 = 0 ■
3) Two Random Variables Econ 3334
 Conditional variance:
Var(Y ∣ X = x) = E[(Y − E(Y ∣ X = x))² ∣ X = x]
Var(X ∣ Y = y) = E[(X − E(X ∣ Y = y))² ∣ Y = y]
 Discrete case:
var(Y ∣ X = x) = ∑_i (y_i − E(Y ∣ X = x))² ⋅ P(Y = y_i ∣ X = x)
var(X ∣ Y = y) = ∑_i (x_i − E(X ∣ Y = y))² ⋅ P(X = x_i ∣ Y = y)

 If X and Y are independent,
var(Y ∣ X = x) = var(Y)
var(X ∣ Y = y) = var(X)
3) Two Random Variables Econ 3334

 Conditional Variance:
Time      Conditional distribution given weather = 0
40 min    0.08/0.56 = 0.143
50 min    0.20/0.56 = 0.357
60 min    0.28/0.56 = 0.500
Sum       1

E(X ∣ Y = 0) = 40 ⋅ (0.08/0.56) + 50 ⋅ (0.20/0.56) + 60 ⋅ (0.28/0.56) = 53.57

var(X ∣ Y = 0)
= (40 − 53.57)² ⋅ (0.08/0.56) + (50 − 53.57)² ⋅ (0.20/0.56) + (60 − 53.57)² ⋅ (0.28/0.56)
= 51.53
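A sketch computing this conditional variance from the same hypothetical joint table:

```python
# Sketch: conditional variance of commute time given that it is raining (Y = 0).
times = [40, 50, 60]
joint = {(40, 0): 0.08, (50, 0): 0.20, (60, 0): 0.28,
         (40, 1): 0.15, (50, 1): 0.25, (60, 1): 0.04}

p_rain = sum(joint[(t, 0)] for t in times)                        # P(Y = 0) = 0.56
cond = {t: joint[(t, 0)] / p_rain for t in times}                 # P(X = t | Y = 0)
cond_mean = sum(t * p for t, p in cond.items())                   # 53.57
cond_var = sum((t - cond_mean)**2 * p for t, p in cond.items())   # about 51.53

print(round(cond_mean, 2), round(cond_var, 2))
```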
3) Two Random Variables Econ 3334

 Covariance: a measure of the (linear) statistical relationship
between two variables

cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X⋅μ_Y

 Correlation:
corr(X, Y) = ρ_XY = cov(X, Y) / √(var(X) ⋅ var(Y)) = σ_XY / (σ_X ⋅ σ_Y)

−1 ≤ corr(X, Y) ≤ 1

 If X and Y are independent, then corr(X,Y)=0
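As a numerical illustration (not in the slides), the covariance and correlation of commute time X and the weather indicator Y can be computed from the joint table used earlier; the dictionary below is the same hypothetical encoding:

```python
# Sketch: covariance and correlation of commute time (X) and weather (Y).
import math

joint = {(40, 0): 0.08, (40, 1): 0.15,
         (50, 0): 0.20, (50, 1): 0.25,
         (60, 0): 0.28, (60, 1): 0.04}

mean_x = sum(t * p for (t, w), p in joint.items())
mean_y = sum(w * p for (t, w), p in joint.items())
cov = sum((t - mean_x) * (w - mean_y) * p for (t, w), p in joint.items())
var_x = sum((t - mean_x)**2 * p for (t, w), p in joint.items())
var_y = sum((w - mean_y)**2 * p for (t, w), p in joint.items())
corr = cov / math.sqrt(var_x * var_y)

print(round(cov, 3), round(corr, 3))  # negative: longer commutes go with rain (Y = 0)
```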


3) Two Random Variables Econ 3334

 corr(X,Y) = 1 means perfect positive linear association

 corr(X,Y) = –1 means perfect negative linear association

 corr(X,Y) = 0 means no linear association


3) Two Random Variables Econ 3334
The correlation coefficient measures linear association
3) Two Random Variables Econ 3334

 Mean and variance of sums of RVs


EaX + bY = aEX + bEY

varaX + bY = a 2 varX + b 2 varY + 2ab ⋅ covX, Y


 If X and Y are independent
varaX + bY = a 2 varX + b 2 varY
