Introduction to Probability


Lecture 1

Henrik Vie Christensen

vie@control.auc.dk

Institute of Electronic Systems

Aalborg University

Denmark

Set Definitions I

A set is a collection of elements. Sets are denoted by

CAPITAL letters, and elements by small letters.

The symbol ∈ means “is an element of”, and ∉ means “is not an element of”.

The symbol ∅ denotes the empty set.

The entire space is denoted by S (Sample Space).

A set is countable if it is finite, or if its elements have a one-to-one correspondence with the integers.

A set A is contained in a set B , denoted A ⊂ B ( or

B ⊃ A), if every element of A is also in B .

The following three statements are always satisfied:

A ⊂ S , ∅ ⊂ A and A ⊂ A

Set Definitions II

A = B if and only if A ⊂ B and B ⊂ A.

The union of two sets A and B , denoted A ∪ B , is the set of all elements that belong to A or B or both.

The intersection of two sets A and B , denoted A ∩ B , is the set of elements that belong to both A and B .

Two sets A and B are mutually exclusive if

A ∩ B = AB = ∅.

The complement, Ā, of a set A relative to a set S consists of the elements of S which are not in A.

Set Definitions III

The Commutative Laws

A ∪ B = B ∪ A

A ∩ B = B ∩ A

The Associative Laws

(A ∪ B) ∪ C = A ∪ (B ∪ C) = A ∪ B ∪ C

(A ∩ B) ∩ C = A ∩ (B ∩ C) = A ∩ B ∩ C

The Distributive Laws

A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)

Set Definitions IV

DeMorgan’s Laws

(A ∪ B)‾ = Ā ∩ B̄

(A ∩ B)‾ = Ā ∪ B̄
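DeMorgan’s laws can be checked exhaustively on a small example space with Python’s built-in set type; a minimal sketch (the sets S, A, and B here are arbitrary illustrative choices):

```python
S = set(range(10))   # universe, playing the role of the sample space
A = {1, 2, 3}
B = {3, 4, 5}

# (A ∪ B)' = A' ∩ B'   (complement taken relative to S)
lhs1 = S - (A | B)
rhs1 = (S - A) & (S - B)

# (A ∩ B)' = A' ∪ B'
lhs2 = S - (A & B)
rhs2 = (S - A) | (S - B)

print(lhs1 == rhs1, lhs2 == rhs2)
```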

The Sample Space

The Sample Space for an experiment is the set of all

possible outcomes of the experiment. The Sample Space is

denoted S .

A class 𝒮 of events defined on the sample space S is completely additive if

1. S ∈ 𝒮

2. If Ak ∈ 𝒮 for k = 1, 2, 3, . . ., then A1 ∪ A2 ∪ · · · ∪ An ∈ 𝒮 for n = 1, 2, 3, . . ..

3. If A ∈ 𝒮 , then Ā ∈ 𝒮 .

Probabilities of Random Events

Definition: A probability measure is a set function whose domain is a completely additive class 𝒮 of events defined on the sample space S such that the measure satisfies the following conditions:

1. P (S) = 1

2. P (A) ≥ 0 for all A ∈ 𝒮

3. P (A1 ∪ A2 ∪ · · · ∪ AN ) = P (A1 ) + P (A2 ) + · · · + P (AN )
   if Ai ∩ Aj = ∅ for i ≠ j , and N may be infinite

A sample space, a probability measure, and a class of sets forming the domain of the probability measure: the combination of these three items is called a probabilistic model.

Introduction to Probability and Stochastic Processes - Part I – p. 7/50

Probabilities of Random Events

Relative Frequency Definition (probability based on experiment): The number of times the experiment is performed is n, and nA is the number of times where the outcome belongs to A ⊂ S :

P (A) = lim_{n→∞} nA /n

Classical Definition: N is the total number of equally likely outcomes, and NA is the number of outcomes that belong to A ⊂ S :

P (A) = NA /N

Ex. where the classical definition fails

Willard H. Longcor’s drilled-die experiment: a die with drilled pips was thrown over one million times, using a new die every 20,000 throws because the dice wore down.

Upface        1      2      3      4      5      6     Total
Rel. freq.  0.155  0.159  0.164  0.169  0.174  0.179    1.00
Classical    1/6    1/6    1/6    1/6    1/6    1/6     1.00

Probabilities of Random Events I

The following laws can be shown using the definition of a probability measure:

1. P (∅) = 0

2. P (A) ≤ 1

3. P (Ā) = 1 − P (A)

4. If A ⊂ B then P (A) ≤ P (B)

5. P (A ∪ B) = P (A) + P (B) − P (A ∩ B)

6. P (A ∪ B) ≤ P (A) + P (B)
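Several of these laws can be verified on a concrete case using the classical definition; a small sketch on the fair-die sample space (the particular events A and B are illustrative choices):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                   # fair-die sample space
P = lambda E: Fraction(len(E), len(S))   # classical probability: |E| / |S|

A = {2, 4, 6}   # even up face
B = {4, 5, 6}   # up face at least 4

# Law 5: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)
# Law 3: P(Ā) = 1 - P(A)
assert P(S - A) == 1 - P(A)
# Law 6 (union bound): P(A ∪ B) <= P(A) + P(B)
assert P(A | B) <= P(A) + P(B)
```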

Probabilities of Random Events II

7. If A1 ∪ A2 ∪ · · · ∪ An = S and Ai ∩ Aj = ∅ if i ≠ j , then

P (A) = P (A ∩ S) = P (A ∩ (A1 ∪ A2 ∪ · · · ∪ An ))

= P ((A ∩ A1 ) ∪ (A ∩ A2 ) ∪ · · · ∪ (A ∩ An ))

= P (A ∩ A1 ) + P (A ∩ A2 ) + · · · + P (A ∩ An )

8.

P (A1 ∪ A2 ∪ · · · ∪ An ) = P (A1 ) + P (Ā1 A2 ) + P (Ā1 Ā2 A3 ) + · · · + P (Ā1 Ā2 · · · Ān−1 An )

Joint and Marginal Probability I

Consider experiment E1 having sample space S1 consisting

of outcomes a1 , a2 , . . . , an1 and experiment E2 having

sample space S2 consisting of outcomes b1 , b2 , . . . , bn2 .

The joint sample space of the experiments is defined as

S = S1 × S2 = {(ai , bj ) : i = 1, 2, . . . , n1 , j = 1, 2, . . . , n2 }

The joint probability of events Ai ⊂ S1 and Bj ⊂ S2 is denoted P (Ai ∩ Bj ), often abbreviated P (Ai Bj ).

Joint and Marginal Probability II

If the events A1 , . . . , An of the experiment E1 are mutually

exclusive and exhaustive, then for the event Bj ⊂ S2 , and

S = S1 × S2 :

P (Bj ) = P (Bj ∩ S)
        = P (Bj ∩ (A1 ∪ A2 ∪ · · · ∪ An ))
        = Σᵢ₌₁ⁿ P (Ai Bj )

P (Bj ) is called a marginal probability.

Conditional Probability

Using the following definitions of probability

P (AB) = NAB /N ,   P (A) = NA /N

the probability that B will happen given that A has happened is

P (B|A) = NAB /NA = (NAB /N ) / (NA /N )

so

P (B|A) = P (AB) / P (A)

Joint, Marginal and Conditional Prob.

Relationships:

1. P (AB) = P (A|B)P (B) = P (B|A)P (A)

2. If AB = ∅, then P (A ∪ B|C) = P (A|C) + P (B|C)

3. P (ABC) = P (A)P (B|A)P (C|AB)

4. If B1 , B2 , . . . , Bm are mutually exclusive and exhaustive, then

P (A) = Σⱼ₌₁ᵐ P (A|Bj )P (Bj )

5. Bayes rule

P (Bj |A) = P (A|Bj )P (Bj ) / Σₖ₌₁ᵐ P (A|Bk )P (Bk ) = P (A|Bj )P (Bj ) / P (A)


Ex. on joint and marginal probabilities

A number of components from manufacturers Mi , i = 1, . . . , 4, is grouped in the following classes of defects Bj , j = 1, . . . , 5.

         B1    B2    B3    B4    B5   Totals
M1      124     6     3     1     6    140
M2      145     2     4     0     9    160
M3      115     1     2     1     1    120
M4      101     2     0     5     2    110
Totals  485    11     9     7    18    530

What is the probability that a component comes from manufacturer M2 and has the defect B1 ?

P (M2 B1 ) = 145/530 ≈ 27%

P (B2 ) = 11/530 ≈ 2%

P (M1 ) = 140/530 ≈ 26%

P (B2 |M2 ) = 2/160 ≈ 1.3%

or, via joint and marginal probabilities,

P (B2 |M2 ) = P (B2 M2 ) / P (M2 ) = (2/530) / (160/530) = 2/160

P (M1 |B2 ) = 6/11 ≈ 55%
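The table lookups above can be reproduced programmatically with exact fractions; a small sketch (the dictionary layout is just one way to hold the table from this example):

```python
from fractions import Fraction

# Defect counts per manufacturer (rows M1..M4, columns B1..B5), from the table above.
counts = {
    "M1": [124, 6, 3, 1, 6],
    "M2": [145, 2, 4, 0, 9],
    "M3": [115, 1, 2, 1, 1],
    "M4": [101, 2, 0, 5, 2],
}
N = sum(sum(row) for row in counts.values())   # grand total: 530

def P_joint(m, j):
    """Joint probability P(M_m ∩ B_j); j is 1-based."""
    return Fraction(counts[m][j - 1], N)

def P_M(m):   # marginal probability of manufacturer m (row total / N)
    return Fraction(sum(counts[m]), N)

def P_B(j):   # marginal probability of defect class j (column total / N)
    return Fraction(sum(row[j - 1] for row in counts.values()), N)

print(P_joint("M2", 1))               # P(M2 B1)
print(P_B(2))                         # P(B2)
print(P_joint("M2", 2) / P_M("M2"))   # P(B2|M2)
print(P_joint("M1", 2) / P_B(2))      # P(M1|B2)
```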

Binary Communication Channel

Ones and zeros are transmitted. Let

A = a one is transmitted
B = a one is received

P (A) = 0.6
P (B|A) = 0.90
P (B|Ā) = 0.05

Find the probability that a one is received:

P (B) = P (B|A)P (A) + P (B|Ā)P (Ā) = 0.90 · 0.6 + 0.05 · 0.4 = 0.56

What is the probability that a one was transmitted, given that a one was received?

P (A|B) = P (B|A)P (A) / P (B) = 0.90 · 0.6 / 0.56 = 27/28 ≈ 96%
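The two channel computations above (total probability, then Bayes’ rule) take only a few lines:

```python
P_A = 0.6             # P(one transmitted)
P_B_given_A = 0.90    # P(one received | one transmitted)
P_B_given_notA = 0.05 # P(one received | zero transmitted)

# Law of total probability
P_B = P_B_given_A * P_A + P_B_given_notA * (1 - P_A)
# Bayes' rule
P_A_given_B = P_B_given_A * P_A / P_B

print(P_B, P_A_given_B)   # ≈ 0.56 and ≈ 0.964 (= 27/28)
```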

Statistical Independence

Two events Ai and Bj are statistically independent if

P (Ai |Bj ) = P (Ai )

Note: statistical independence is not the same as mutually exclusive!!

Example: Toss a fair die and let A = {2, 4, 6} and B = {5, 6}. Then

P (AB) = 1/6 = P (A)P (B)  and  P (A|B) = 1/2 = P (A)

A and B are statistically independent but not mutually exclusive.
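A quick numeric check that A = {2, 4, 6} and B = {5, 6} from the die example are independent but not mutually exclusive:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))   # classical probability on the fair die

A = {2, 4, 6}
B = {5, 6}

independent = P(A & B) == P(A) * P(B)    # 1/6 == (1/2)(1/3)
mutually_exclusive = (A & B) == set()

print(independent, mutually_exclusive)   # True False
```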


Random Variables

Definition: A random variable X is a function X(λ) : S → R, such that

1. The set {λ : X(λ) ≤ x} is an event for every x ∈ R.

2. P (X = ∞) = P (X = −∞) = 0.

For every set A ⊂ S , the set X(A) = {X(λ) : λ ∈ A} is called the image of A. For every set T ⊂ R there exists a set X⁻¹(T ) ∈ 𝒮 , called the inverse image of T , which satisfies

X⁻¹(T ) = {λ ∈ S : X(λ) ∈ T }

From Sample Space to Random Variable

Tossing a die, the Sample Space is the set of six up faces, and X maps “the number of eyes on the up face side of the die” to R.

[Figure: the six die faces mapped by X onto the points 1, 2, . . . , 6 on the real line]

Distribution Function

Definition: The distribution function of the random variable X is given by FX (x) = P (X ≤ x).

1. FX (−∞) = 0

2. FX (∞) = 1

3. lim_{ε→0, ε>0} FX (x + ε) = FX (x)  (FX is continuous from the right)

4. FX (x1 ) ≤ FX (x2 ) for x1 ≤ x2

5. P (x1 < X ≤ x2 ) = FX (x2 ) − FX (x1 )

Distribution Function for a fair die

[Figure: staircase plot of FX (xi ), jumping by 1/6 at each of xi = 1, 2, . . . , 6, rising from 1/6 up to 1]

Joint Distribution Function

Definition: The joint distribution function for the two random variables X and Y is given by FX,Y (x, y) = P (X ≤ x, Y ≤ y). It satisfies

FX,Y (x, −∞) = 0 ,  FX,Y (∞, ∞) = 1 ,  FX,Y (x, ∞) = FX (x)

Discrete Random Variables

Definition: A discrete random variable only takes on a finite set of values. The probability P (X = xi ) for i = 1, 2, . . . , n is called the probability mass function.

1. P (X = xi ) > 0 for i = 1, 2, . . . , n

2. Σᵢ₌₁ⁿ P (X = xi ) = 1

3. P (X ≤ x) = FX (x) = Σ_{xi ≤x} P (X = xi )

Probability Mass function for a die

[Figure: PMF plots. For a fair die, P (X = xi ) = 1/6 at xi = 1, . . . , 6; for the drilled die, the probabilities deviate slightly from 1/6.]

Relationships for two Random Variables

1. P (X ≤ x, Y ≤ y) = Σ_{xi ≤x} Σ_{yj ≤y} P (X = xi , Y = yj )

2. P (X = xi ) = Σⱼ₌₁ᵐ P (X = xi , Y = yj ) = Σⱼ₌₁ᵐ P (X = xi |Y = yj )P (Y = yj )

3. P (X = xi |Y = yj ) = P (X = xi , Y = yj ) / P (Y = yj )
   = P (Y = yj |X = xi )P (X = xi ) / Σᵢ₌₁ⁿ P (Y = yj |X = xi )P (X = xi )

4. X and Y are statistically independent if

P (X = xi , Y = yj ) = P (X = xi )P (Y = yj )

Example: Joint distribution

Joint distribution for two fair dice:

X : # of eyes on the Up Face side of die 1.
Y : # of eyes on the Up Face side of die 2.

P (X = i, Y = j) = 1/36 for i, j ∈ {1, 2, . . . , 6}

and the joint distribution function

FX,Y (x, y) = Σᵢ₌₁ˣ Σⱼ₌₁ʸ 1/36 = xy/36 for x, y ∈ {1, 2, . . . , 6}
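The closed form FX,Y (x, y) = xy/36 can be checked by enumerating all 36 equally likely outcomes:

```python
from fractions import Fraction

# All 36 equally likely outcomes of two fair dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def F(x, y):
    """Empirical joint distribution function: count outcomes with i <= x and j <= y."""
    return Fraction(sum(1 for (i, j) in outcomes if i <= x and j <= y), 36)

# Verify F(x, y) = xy/36 at every grid point.
assert all(F(x, y) == Fraction(x * y, 36)
           for x in range(1, 7) for y in range(1, 7))
```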

Expected values, mean and variance

The average or expected value of a function g of a discrete random variable X is

E{g(X)} = Σᵢ₌₁ⁿ g(xi )P (X = xi )

The mean is

E{X} = µX = Σᵢ₌₁ⁿ xi P (X = xi )

and the variance is

E{(X − µX )²} = σX² = Σᵢ₌₁ⁿ (xi − µX )² P (X = xi )

Die example!!

X : # of eyes on the Up Face side of a fair die.

The mean value:

E{X} = µX = Σᵢ₌₁⁶ xi P (X = xi ) = (1 + 2 + 3 + 4 + 5 + 6) · 1/6 = 3.5

The variance:

E{(X − µX )²} = σX² = Σᵢ₌₁⁶ (xi − µX )² P (X = xi ) = 35/12 ≈ 2.9

The second moment:

E{X²} = Σᵢ₌₁⁶ xi² P (X = xi ) = 91/6 ≈ 15.2
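The same mean, variance, and second moment, computed with exact fractions:

```python
from fractions import Fraction

xs = range(1, 7)
p = Fraction(1, 6)   # fair-die PMF: P(X = x) = 1/6

mean = sum(x * p for x in xs)                   # µX = 7/2 = 3.5
var = sum((x - mean) ** 2 * p for x in xs)      # σX² = 35/12
second_moment = sum(x * x * p for x in xs)      # E{X²} = 91/6

print(mean, var, second_moment)
```

Note that the identity σX² = E{X²} − µX² ties the three numbers together: 91/6 − 49/4 = 35/12.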

Expected values, mean and variance I

If the probability mass function is not known, but the mean and the variance are known, Tchebycheff’s inequality can be used to bound the probability that a random variable deviates from its mean:

P (|X − µX | > k) ≤ σX² / k²

The expected value of a function of two random variables is defined as

E{g(X, Y )} = Σᵢ₌₁ⁿ Σⱼ₌₁ᵐ g(xi , yj )P (X = xi , Y = yj )
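Tchebycheff’s bound can be compared against the exact tail probability for the fair die (µX = 7/2, σX² = 35/12); the choice k = 2 here is illustrative:

```python
from fractions import Fraction

xs = range(1, 7)
p = Fraction(1, 6)
mu = Fraction(7, 2)
var = Fraction(35, 12)

k = 2
# Exact tail: |x - 3.5| > 2 only for x in {1, 6}, so P = 2/6 = 1/3.
exact = sum(p for x in xs if abs(x - mu) > k)
bound = var / k**2   # Tchebycheff bound: 35/48

assert exact <= bound   # the bound holds (it is loose: 1/3 vs 35/48)
```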

Expected values, mean and variance II

The correlation coefficient is defined as

ρXY = E{(X − µX )(Y − µY )} / (σX σY ) = σXY / (σX σY )

The correlation coefficient ρXY ∈ [−1, 1]:
if X and Y are independent, then ρXY = 0, and
if they are linearly dependent, then |ρXY | = 1.

Note: ρXY = 0 does NOT imply statistical independence!

Two random variables are called orthogonal if E{XY } = 0.

Expected values, mean and variance III

The conditional expected values are defined as

E{g(X, Y )|Y = yj } = Σᵢ₌₁ⁿ g(xi , yj )P (X = xi |Y = yj )

E{g(X, Y )|X = xi } = Σⱼ₌₁ᵐ g(xi , yj )P (Y = yj |X = xi )

E{X|Y = yj } = µX|Y=yj = Σᵢ xi P (X = xi |Y = yj )

The Uniform Probability Mass Function

The Uniform Probability Mass Function is given by

P (X = xi ) = 1/n , i = 1, 2, . . . , n

Example: Fair die

[Figure: PMF plot with P (X = xi ) = 1/6 at xi = 1, 2, . . . , 6]

The Binomial Probability Mass Function

The Binomial Probability Mass Function.

If P (A) = p, and the experiment is repeated n times, let X be a random variable representing the number of times A occurs; then

P (X = k) = C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ , k = 0, 1, 2, . . . , n

where C(n, k) = n!/(k!(n − k)!).

The mean and the variance of a binomially distributed random variable are

µX = np and σX² = np(1 − p)
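A sketch of the binomial PMF, with a numeric check of µX = np and σX² = np(1 − p) for n = 10, p = 0.5:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.5
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]

mean = sum(k * q for k, q in enumerate(pmf))
var = sum((k - mean) ** 2 * q for k, q in enumerate(pmf))

print(round(mean, 6), round(var, 6))   # np = 5.0, np(1-p) = 2.5
```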

Example: The Binomial Prob. Mass Fct

For n = 10 and p = 0.5

[Figure: binomial PMF, symmetric about k = 5 with peak ≈ 0.25]

Example: The Binomial Prob. Mass Fct

For n = 10 and p = 0.25

[Figure: binomial PMF, peak at k = 2 with value ≈ 0.28]

Poisson Probability Mass Function

Assume

1. The probability of an event occurring in a small time interval ∆t tends to λ′∆t as ∆t → 0.

2. The numbers of events occurring in non-overlapping time intervals are independent.

Then the number of events in a time interval T has a Poisson Probability Mass Function of the form

P (X = k) = λᵏ e^(−λ) / k! , k = 0, 1, 2, . . .

where λ = λ′T .

The mean and the variance are µX = σX² = λ.
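A sketch of the Poisson PMF for λ = 0.9, checking µX = σX² = λ (the infinite support is truncated at k = 50, where the remaining tail is negligible):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = λ^k e^(-λ) / k!."""
    return lam**k * exp(-lam) / factorial(k)

lam = 0.9
pmf = [poisson_pmf(k, lam) for k in range(51)]   # truncated support

mean = sum(k * q for k, q in enumerate(pmf))
var = sum((k - mean) ** 2 * q for k, q in enumerate(pmf))

print(round(mean, 6), round(var, 6))   # both ≈ λ = 0.9
```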

Example: Poisson Prob. Mass Fct

For λ = 0.9

[Figure: Poisson PMF with P (X = 0) ≈ 0.41, P (X = 1) ≈ 0.37, and a rapidly decreasing tail]

Binary communication system Ia

X : input, 0 or 1, with P (X = 0) = 3/4 and P (X = 1) = 1/4
Y : output; due to noise:

P (Y = 1|X = 1) = 3/4 and P (Y = 0|X = 0) = 7/8

Find P (Y = 1) and P (Y = 0):

P (Y = 1) = P (Y = 1|X = 0)P (X = 0) + P (Y = 1|X = 1)P (X = 1)
          = (1 − 7/8) · 3/4 + (3/4) · (1/4) = 9/32

P (Y = 0) = 1 − P (Y = 1) = 23/32

Binary communication system Ib

X : input, 0 or 1, with P (X = 0) = 3/4 and P (X = 1) = 1/4
Y : output; due to noise:

P (Y = 1|X = 1) = 3/4 and P (Y = 0|X = 0) = 7/8

Find P (X = 1|Y = 1):

P (X = 1|Y = 1) = P (Y = 1|X = 1)P (X = 1) / P (Y = 1) = (3/4 · 1/4) / (9/32) = 2/3
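Both channel computations carried out with exact fractions:

```python
from fractions import Fraction

P_X0, P_X1 = Fraction(3, 4), Fraction(1, 4)
P_Y1_given_X1 = Fraction(3, 4)
P_Y0_given_X0 = Fraction(7, 8)

# Law of total probability for P(Y = 1)
P_Y1 = (1 - P_Y0_given_X0) * P_X0 + P_Y1_given_X1 * P_X1
# Bayes' rule for P(X = 1 | Y = 1)
P_X1_given_Y1 = P_Y1_given_X1 * P_X1 / P_Y1

print(P_Y1, P_X1_given_Y1)   # 9/32 and 2/3
```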

Binary communication system IIa

Binary data are sent in blocks of 16 digits over a noisy communication channel. p = 0.1 is the probability that a digit is in error (independent of whether other digits are in error).

X : Number of errors per block. X has a binomial distribution:

P (X = k) = C(16, k) (0.1)ᵏ (0.9)¹⁶⁻ᵏ , k = 0, 1, 2, . . . , 16

Binary communication system IIb

The mean and variance are

µX = np = (16)(0.1) = 1.6 and σX² = np(1 − p) = (16)(0.1)(0.9) = 1.44

Find P (X ≥ 5):

P (X ≥ 5) = 1 − P (X ≤ 4) = 1 − Σₖ₌₀⁴ C(16, k) (0.1)ᵏ (0.9)¹⁶⁻ᵏ ≈ 0.017
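The tail probability P (X ≥ 5) can be evaluated directly:

```python
from math import comb

n, p = 16, 0.1
# P(X <= 4): sum the binomial PMF over k = 0..4.
P_le_4 = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(5))
P_ge_5 = 1 - P_le_4

print(round(P_ge_5, 3))   # ≈ 0.017
```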

Continuous Random Variables

A continuous random variable can take any value in an interval of the real line. For a continuous random variable the probability density function (pdf) is defined by

fX (x) = dFX (x)/dx

Properties:

1. fX (x) ≥ 0

2. ∫_{−∞}^{∞} fX (x) dx = 1

3. P (X ≤ a) = FX (a) = ∫_{−∞}^{a} fX (x) dx

4. P (a ≤ X ≤ b) = ∫_{a}^{b} fX (x) dx

5. P (X = a) = ∫_{a}^{a} fX (x) dx = lim_{∆x→0} fX (a)∆x = 0

Two Continuous Random Variables

The joint probability density function

fX,Y (x, y) = ∂²FX,Y (x, y) / ∂x∂y ≥ 0

FX,Y (x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} fX,Y (µ, η) dµ dη

∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y (µ, η) dµ dη = 1

Two Continuous Random Variables I

The marginal and conditional density functions are obtained as follows:

fX (x) = ∫_{−∞}^{∞} fX,Y (x, y) dy

fY (y) = ∫_{−∞}^{∞} fX,Y (x, y) dx

fX|Y (x|y) = fX,Y (x, y) / fY (y) , fY (y) > 0

fY|X (y|x) = fX,Y (x, y) / fX (x) = fX|Y (x|y)fY (y) / ∫_{−∞}^{∞} fX|Y (x|λ)fY (λ) dλ

Two Continuous Random Variables II

Two random variables are statistically independent if

fX,Y (x, y) = fX (x)fY (y)

Expected values I

E{g(X, Y )} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y (x, y) dx dy

µX = E{X} = ∫_{−∞}^{∞} x fX (x) dx

σX² = E{(X − µX )²} = ∫_{−∞}^{∞} (x − µX )² fX (x) dx

σXY = E{(X − µX )(Y − µY )} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − µX )(y − µY ) fX,Y (x, y) dx dy

ρXY = E{(X − µX )(Y − µY )} / (σX σY )

Expected values II

Conditional expected values are defined as

E{g(X, Y )|Y = y} = ∫_{−∞}^{∞} g(x, y) fX|Y (x|y) dx

If X and Y are statistically independent, then

E{g(X)h(Y )} = E{g(X)}E{h(Y )}

Example

The joint density function of X and Y is

fX,Y (x, y) = axy , 1 ≤ x ≤ 3, 2 ≤ y ≤ 4
fX,Y (x, y) = 0 elsewhere

Find a:

1 = ∫_{2}^{4} ∫_{1}^{3} axy dx dy = a ∫_{2}^{4} y [x²/2]₁³ dy = a ∫_{2}^{4} 4y dy = 24a

so a = 1/24.

Example (continued)

The marginal pdf of X :

fX (x) = (1/24) ∫_{2}^{4} xy dy = x/4 , 1 ≤ x ≤ 3
fX (x) = 0 elsewhere

The distribution function of Y :

FY (y) = 0 for y ≤ 2
FY (y) = 1 for y > 4
FY (y) = (1/24) ∫_{2}^{y} ∫_{1}^{3} xv dx dv = (1/6) ∫_{2}^{y} v dv = (y² − 4)/12 for 2 ≤ y ≤ 4
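The normalization constant can be cross-checked numerically with a midpoint-rule double integral (the grid size n = 400 is an arbitrary choice; the integrand xy is smooth, so the rule is very accurate here):

```python
def double_integral(f, ax, bx, ay, by, n=400):
    """Midpoint-rule approximation of the double integral of f over [ax,bx] x [ay,by]."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    return sum(f(ax + (i + 0.5) * hx, ay + (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

# Integral of xy over [1,3] x [2,4] is 24, so normalization forces a = 1/24.
I = double_integral(lambda x, y: x * y, 1, 3, 2, 4)
a = 1 / I

print(round(a, 6))   # ≈ 1/24 ≈ 0.041667
```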
