
Engineering Mathematics 2: MT233 (2019)

Section : Probability and Statistics

Lecturer : Prof. S. Mushayabasa

Department : Mathematics, University of Zimbabwe

Course Outline
1. Introduction

1.1. What is statistics?


1.2. Important definitions
1.3. Applications of statistics
1.4. Exploratory statistics

2. Continuous random variables

2.1. Continuous random variables in one dimension


2.2. Mean and variance for continuous random variables
2.3. Median and Mode for continuous random variables.
2.4. Continuous random variables in 2D.
2.5. Marginal probability distribution
2.6. Conditional probability for continuous random variables.
2.7. Independence in continuous random variables.
2.8. Properties of variance for continuous random variables.
2.9. The Normal distribution

3. Hypothesis testing

3.1. Testing concerning means and differences between means


3.2. Confidence intervals

4. Regression analysis

References

1. Statistics and Probability with Applications for Engineers and Scientists by Bhisham C. Gupta, Irwin Guttman

2. An Introduction to Probability and Statistics, 3rd ed., by Vijay K. Rohatgi, A. K. Md. Ehsanes Saleh

3. Design and Analysis of Experiments (2nd–8th ed.) by Douglas C. Montgomery

4. Statistics for Engineers and Scientists, 3rd ed., by William Navidi

1 Introduction
1.1 What is statistics?

• The word statistics has two meanings:

1. It refers to the sets of data relating to a wide range of topics such as the size of
populations, production activity, retail prices, incomes, rainfall, etc.
2. Statistics refers to the theory and methods used for collection, description, analysis
and interpretation of numerical data.

• Based on the above definitions one can say statistics comprises two branches:

– Descriptive statistics is concerned with summarizing and describing numerically a body of data.

– Inferential statistics, more importantly, is the process of reaching generalizations about the whole (called the population) by examining a portion or portions of it (called samples).

1.2 Statistics in engineering and scientific experimentation

Statistical methods are applied in an enormous diversity of problems in fields as:


• Agriculture (which varieties grow best?)

• Genetics, Biology (selecting new varieties, species)

• Economics (how are the living standards changing?)

• Market Research (comparison of advertising campaigns)

• Education (what is the best way to teach small children reading?)

• Environmental Studies (do strong electric or magnetic fields induce higher cancer rates?)

• Quality engineering

1.3 What does a statistician need to be able to do?

1. Formulate a real-world problem in statistical terms.

2. Give advice on efficient data collection.

3. Analyse data effectively and extract the maximum amount of information.

4. Interpret and report the results.

1.4 Definition of terms

Below are the definitions of some common terms in statistics:

• Population: the totality of all objects/items with which we are concerned.

• Sample: a part of the population with which we are concerned or under study.

1.4.1 Why do we sample?

There are various reasons why statisticians use samples and some are as follows:

• Cost-effectiveness: considering a sample is cost-effective with respect to time, money, and labour compared with considering the whole population.

• Accessibility: some members of the population may not be accessible, therefore it is only logical to consider a sample.

• Utility: in some experimental methods it would be futile to consider the whole population if the process involves destroying the objects/items/individuals.

• Precision: fewer errors due to human error arise in a sample than in a full survey of the population.

1.5 Descriptive statistics/Exploratory statistics

1.5.1 Measures of central tendency

• Mode: is the value or observation which occurs most frequently.

– Advantage: it is simple and straightforward to identify.

– Disadvantage: it ignores some of the collected data.

• Median: the sample median is obtained by first ordering the n observations from smallest to largest (with any repeated values included so that every sample observation appears in the ordered list). Then

    x̃ = the single middle value, i.e. the ((n + 1)/2)th ordered value, if n is odd;
    x̃ = the average of the two middle values, i.e. of the (n/2)th and (n/2 + 1)th ordered values, if n is even.

• Advantages of median:

– It is easy and straightforward to compute.

– It is hardly affected by extreme values in a data set.

• Disadvantage(s) of median:

– It does not make use of all the observations in the data set.

• Arithmetic mean: the sum of all n observations or values divided by the sample size n, that is

    x̄ = (Σ_{i=1}^n x_i)/n.    (1)

– Advantages: It considers all the values in the sample to find the outcome.
– Disadvantage: It is affected by extreme values in a data set.

• Variance is the sum of the squared deviations from the mean of n values divided by the degrees of freedom (n − 1), that is

    Var[x_i] = s² = Σ_{i=1}^n (x_i − x̄)² / (n − 1)
             = Σ_{i=1}^n (x_i² − 2x̄x_i + x̄²) / (n − 1)
             = [Σ_{i=1}^n x_i² − 2(Σ_{i=1}^n x_i)²/n + n(Σ_{i=1}^n x_i)²/n²] / (n − 1)
             = [Σ_{i=1}^n x_i² − (Σ_{i=1}^n x_i)²/n] / (n − 1).    (2)

Example 1 Determine the mode, median and mean from the following dataset: 4,5,1,4,12,10.

Solution

1. Mode: Since 4 is the most frequently occurring observation it is the mode.

2. Median: To find the median, first arrange the observations as to size, that is, in either
ascending or descending order, that is,

1, 4, 4, 5, 10, 12

Thus, the median = (4 + 5)/2 = 4.5.
3. Mean:

    x̄ = (4 + 5 + 1 + 4 + 12 + 10)/6 = 36/6 = 6.

4. Variance:

    Var[x_i] = [Σ_{i=1}^n x_i² − (Σ_{i=1}^n x_i)²/n] / (n − 1) = (302 − 36²/6)/5 = 17.2.
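These computations can be cross-checked with Python's standard `statistics` module; a minimal sketch using the dataset of Example 1 (the variable names are ours):

```python
from statistics import mode, median, mean, variance

# Dataset from Example 1
data = [4, 5, 1, 4, 12, 10]

m = mode(data)        # the most frequently occurring observation
med = median(data)    # average of the two middle ordered values (n is even)
xbar = mean(data)     # sum of the observations divided by n
s2 = variance(data)   # sample variance, with n - 1 in the denominator

print(m, med, xbar, s2)  # 4, 4.5, 6, 17.2
```

Note that `variance` uses the (n − 1) divisor of equation (2); `pvariance` would divide by n instead.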

1.6 Graphical presentation of data

Prepare your own notes on (focus on how to construct these graphs, look at advantages and
disadvantages)

• Histogram

• Stem and leaf plot

• Box-plot

• Frequency polygon

Home work

Explain clearly, using the following data, how the following are constructed:

23 29 40 28 15 22 46 39 22 17 26 33 35 49 20,
36 25 15 31 17 43 54 36 30 30 40 27 24 20 28 42
22 37 17 39 17 22 9 26 29

(a) Stem and leaf plot,

(b) Histogram,

(c) Box plot.

2 Random variables
2.1 Continuous random variables in one dimension

• A discrete random variable is a random variable with a finite (or countably infinite) set of real numbers for its range.

• Examples of discrete random variables: number of scratches on a surface, proportion of defective parts among 1000 tested, number of transmitted bits received in error.

• A continuous random variable is a random variable with an interval (either finite or infinite) of real numbers for its range.

• Examples of continuous random variables: electrical current, length, pressure, temperature, time, voltage, weight.

• A continuous random variable can also be defined as a random variable whose data can take infinitely many values.

• For a continuous random variable, the interval on the real line may be open or closed, bounded or unbounded.

• For instance, the interval could be [0, 1], (0, ∞) or (−∞, ∞).

• To define the probability of an event involving a continuous random variable, we cannot simply count the number of ways the event can occur (as we can with a discrete random variable).

• Rather, we introduce a function f called a probability density function.

• This function must be non-negative and have the property that the area of the region bounded by the graph of f and the x-axis, −∞ < x < +∞, is 1, that is,

    P(−∞ < x < +∞) = ∫_{−∞}^{∞} f(x) dx = 1.    (3)

• The probability that x lies in the interval [c, d] is given by

    P(c ≤ x ≤ d) = ∫_c^d f(x) dx.    (4)

• Another property of a probability density function is that

    f(x) ≥ 0,  −∞ < x < +∞.    (5)

• Another way to describe the probability distribution of a random variable is to define a function (of a real number x) that provides the probability that X is less than or equal to x, that is, F(x) = P(X ≤ x).

• F(x) is known as the cumulative distribution function (cdf) of a continuous random variable with probability density function f(x):

    F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(u) du,  for −∞ < x < +∞.    (6)

• Therefore

    dF(x)/dx = f(x).    (7)

Example 2 Suppose that the battery failure time, measured in hours, has a probability density function (p.d.f.)

    f(x) = 2/(x + 1)³,  x ≥ 0.

(i) Determine whether this is a valid p.d.f

(ii) Find the probability that a randomly selected battery from the warehouse will have a
lifetime less than 5 hours.

(iii) Determine the cumulative distribution function of this continuous random variable.

Solution

(i) To be a valid p.d.f. over the given interval, f must have two characteristics:

1. It must be non-negative over the entire interval, and

2. its definite integral over the interval must be precisely unity.

Since the given p.d.f. is non-negative over the defined interval, we now investigate the second characteristic, that is,

    P(0 ≤ x < ∞) = ∫_0^∞ 2/(x + 1)³ dx = lim_{b→∞} [−1/(x + 1)²]_0^b = 1.

(ii)

    P(0 ≤ x ≤ 5) = ∫_0^5 2/(x + 1)³ dx = [−1/(x + 1)²]_0^5 = 35/36.

(iii)

    F(x) = ∫_0^x 2/(u + 1)³ du = [−1/(u + 1)²]_0^x = 1 − 1/(x + 1)².
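Example 2 can be verified numerically. A minimal sketch with `scipy.integrate.quad` (assuming SciPy is available; the names are ours):

```python
from scipy.integrate import quad

# Battery failure-time density from Example 2: f(x) = 2/(x + 1)^3 for x >= 0
f = lambda x: 2 / (x + 1) ** 3

total, _ = quad(f, 0, float("inf"))  # should integrate to 1 for a valid pdf
p5, _ = quad(f, 0, 5)                # P(X < 5), which should equal 35/36
F5 = 1 - 1 / (5 + 1) ** 2            # the closed-form cdf F(5) for comparison

print(total, p5, F5)
```

`quad` returns the integral value together with an error estimate, which we discard here.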

2.2 Mean and Variance of continuous random variables

• Expected value or Mean: if f is a probability density function (pdf) for a continuous random variable x over the interval [a, b], then the expected value of x is given by

    µ = E(x) = ∫_a^b x f(x) dx.    (8)

• Variance and Standard Deviation: if f is a probability density function for a continuous random variable x over the interval [a, b], then the variance of x is given by

    σ² = V(x) = ∫_a^b (x − µ)² f(x) dx
              = ∫_a^b (x² − 2xµ + µ²) f(x) dx
              = ∫_a^b x² f(x) dx − 2µ ∫_a^b x f(x) dx + µ² ∫_a^b f(x) dx
              = ∫_a^b x² f(x) dx − 2µ² + µ²
              = ∫_a^b x² f(x) dx − µ².    (9)

From the last integral of (9) we can also write

    V(x) = ∫_a^b x² f(x) dx − µ² = E(X²) − µ².    (10)

Example 3 : Let the continuous random variable X denote the current measured in a thin
copper wire in milliamperes. Assume that the range of X is [0, 20mA], and assume that the
probability density function of X is

f (x) = 0.05, 0 ≤ x ≤ 20

(a) What is the probability that a current measurement is less than 10 milliamperes?

(b) Determine the expected value and the variance.

Solution

(a) P(X < 10) = ∫_0^{10} 0.05 dx = 0.5.

(b)(i) Expected value: E(x) = ∫_0^{20} x f(x) dx = 0.05 [x²/2]_0^{20} = 10.

(ii) Variance: V(x) = ∫_0^{20} (x − 10)² f(x) dx = 0.05 [(x − 10)³/3]_0^{20} = 33.33.

2.3 Median and Mode of continuous random variables

• Median: another useful measure of central tendency is the median. We define the median to be the number m such that precisely half of the x-values lie below m and the other half of the x-values lie above m, that is,

    P(a ≤ x ≤ m) = 0.5.

Example 4 Determine the median value for the following p.d.f.

    f(x) = (1/10) e^{−x/10},  x > 0.
Solution Using the definition of the median, we have

    P(0 ≤ x ≤ m) = ∫_0^m (1/10) e^{−x/10} dx
                 = [−e^{−x/10}]_0^m
                 = 1 − e^{−m/10}
                 = 0.5.

Simplifying gives

m = −10 ln 0.5 ≈ 6.93.
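The closed-form median of Example 4 is easy to confirm in a couple of lines (a quick sketch using only the standard library):

```python
import math

# Exponential density from Example 4: f(x) = (1/10) e^{-x/10} for x > 0.
# Setting the cdf equal to one half, 1 - e^{-m/10} = 0.5, gives the median.
m = -10 * math.log(0.5)
print(m)  # about 6.93

# Sanity check: exactly half of the probability mass lies below m
half = 1 - math.exp(-m / 10)
print(half)
```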

• Mode: The mode is the value that appears most often in a set of data. The mode of a
continuous probability distribution is the value x at which its probability density function
has its maximum value, so the mode is at the peak.

Tutorial #1

1. Show that the following functions are probability density functions for some k and de-
termine the value of k. Then determine the mean and variance of X.

(a) f (x) = kx2 for 0 < x < 4


(b) f (x) = k(1 + 2x) for 0 < x < 2
(c) f (x) = ke−x for x > 0.

2. Suppose that

    f(x) = e^{−(x−6)} for x > 6, and f(x) = 0 for x ≤ 6.

Determine the following probabilities

(a) P (X > 6) (b) P (X < 8) (c) P (6 ≤ X < 8).

3. Show that for a continuous random variable

    ∫_a^b (x − µ)² f(x) dx = ∫_a^b x² f(x) dx − µ²,  a ≤ x ≤ b.

4. The probability density function of the time to failure of an electronic component in a copier (in hours) is

    f(x) = (1/1000) e^{−0.001x},  x > 0.
(a) Determine the probability that
(i) a component lasts more than 3000 hours before failure.
(ii) a component fails in the interval from 1000 to 2000 hours.
(iii) a component fails before 1000 hours.
(b) Determine the number of hours at which 10% of all components have failed.

2.4 Continuous random variables in two dimensions

2.4.1 Joint probability distributions

• Analogous to the probability density function of a single continuous random variable, a joint probability density function can be defined over two-dimensional space.

• The double integral of fXY (x, y) over a region R provides the probability that (X, Y )
assumes a value in R.

• This integral can be interpreted as volume under the surface fXY (x, y) over the region
R.

Definition: A joint probability density function for the continuous random variable X and Y
denoted as fXY (x, y), satisfies the following properties

1.  f_{XY}(x, y) ≥ 0 for all x, y.    (11)

2.  ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{XY}(x, y) dx dy = 1.    (12)

3. For any region R of two-dimensional space,

    P([X, Y] ∈ R) = ∫∫_R f_{XY}(x, y) dx dy.    (13)

Example 5: A privately owned business operates both a drive-in facility and a walk-in facility. On a randomly selected day, let X and Y, respectively, be the proportions of the time that the drive-in and the walk-in facilities are in use, and suppose that the joint density function of these random variables is

    f(x, y) = (2/5)(2x + 3y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x, y) = 0 otherwise.

(a) Determine whether this is a valid p.d.f.

(b) Find P[(X, Y) ∈ A], where A = {(x, y) | 0 < x < 0.5, 0.25 < y < 0.5}.

Solution

(a) It can easily be verified that f_{XY}(x, y) ≥ 0 for all x, y. Further,

    ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{XY}(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
        = ∫_0^1 [2x²/5 + 6xy/5]_0^1 dy
        = ∫_0^1 (2/5 + 6y/5) dy
        = [2y/5 + 3y²/5]_0^1 = 1.

(b) To calculate the probability, we use

    P[(X, Y) ∈ A] = P(0 < X < 1/2, 1/4 < Y < 1/2)
                  = ∫_{0.25}^{0.5} ∫_0^{0.5} (2/5)(2x + 3y) dx dy
                  = 13/160.
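The double integrals of Example 5 can be checked with `scipy.integrate.dblquad` (a sketch, assuming SciPy; note `dblquad` takes the integrand as a function of (y, x)):

```python
from scipy.integrate import dblquad

# Joint density from Example 5: f(x, y) = (2/5)(2x + 3y) on the unit square.
f = lambda y, x: (2 / 5) * (2 * x + 3 * y)

total, _ = dblquad(f, 0, 1, 0, 1)       # x in [0, 1], y in [0, 1]
pA, _ = dblquad(f, 0, 0.5, 0.25, 0.5)   # 0 < x < 0.5, 0.25 < y < 0.5

print(total, pA)  # 1.0 and 13/160 = 0.08125
```

The first pair of limits is for the outer variable x, the second pair for y.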

2.5 Marginal and conditional probability distributions

In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset. It gives the probabilities of various values of the variables in the subset without reference to the values of the other variables. This contrasts with a conditional distribution, which gives the probabilities contingent upon the values of the other variables.

Definition: If the joint density function of continuous random variables X and Y is fXY (x, y),
the marginal probability density functions of X and Y are

    f_X(x) = ∫_{R_x} f_{XY}(x, y) dy,    (14)

    f_Y(y) = ∫_{R_y} f_{XY}(x, y) dx,    (15)

respectively, where Rx denotes the set of all points in the range of (X, Y ) for which X = x,
and Ry denotes the set of all points in the range of (X, Y ) for which Y = y.

Example 6 From Example 5, the marginal distributions are as follows:

    f_X(x) = ∫_0^1 (2/5)(2x + 3y) dy = [4xy/5 + 3y²/5]_0^1 = (4x + 3)/5,

    f_Y(y) = ∫_0^1 (2/5)(2x + 3y) dx = 2(1 + 3y)/5.

Definition 1: Given continuous random variables X and Y with joint probability density function f_{XY}(x, y), the conditional probability density function of Y given X = x is

    f_{Y|x}(y) = f(y|x) = f_{XY}(x, y)/f_X(x)  for f_X(x) > 0.    (16)

Similarly, the conditional probability density function of X given that Y = y is

    f_{X|y}(x) = f(x|y) = f_{XY}(x, y)/f_Y(y)  for f_Y(y) > 0.    (17)

2.6 Conditional mean and variance for continuous random variables

• The conditional mean of X given Y = y, denoted as E(X|y) or µ_{X|y}, is

    E(X|y) = ∫ x f_{X|y}(x) dx.    (18)

• The conditional mean of Y given X = x, denoted as E(Y|x) or µ_{Y|x}, is

    E(Y|x) = ∫ y f_{Y|x}(y) dy.    (19)

• The conditional variance of Y given X = x, denoted as V(Y|x) or σ²_{Y|x}, is

    V(Y|x) = ∫ (y − µ_{Y|x})² f_{Y|x}(y) dy = ∫ y² f_{Y|x}(y) dy − µ²_{Y|x}.    (20)

• The conditional variance of X given Y = y, denoted as V(X|y) or σ²_{X|y}, is

    V(X|y) = ∫ (x − µ_{X|y})² f_{X|y}(x) dx = ∫ x² f_{X|y}(x) dx − µ²_{X|y}.    (21)

In each case the integral is taken over the support of the conditional density.

Example 7 Consider the pdf fXY (x, y) = x + y, for 0 < x < 1 and 0 < y < 1.
Determine

1. f (Y |x)

2. the conditional mean of Y given that X = 0.5

3. P (0.25 < Y < 0.5|x = 0.5)

Solution

1. First find the marginal density of X:

    f_X(x) = ∫_0^1 (x + y) dy = [xy + y²/2]_0^1 = x + 0.5.

Thus

    f(Y|x) = f_{XY}(x, y)/f_X(x) = (x + y)/(x + 0.5).

2. With x = 0.5 we have f(y|x = 0.5) = (0.5 + y)/(0.5 + 0.5), so the conditional mean is

    E(Y|x = 0.5) = ∫_0^1 y (0.5 + y)/(0.5 + 0.5) dy = [y²/4 + y³/3]_0^1 = 7/12.

3.

    P(0.25 < Y < 0.5 | x = 0.5) = ∫_{0.25}^{0.5} (0.5 + y)/(0.5 + 0.5) dy = [(1 + y)y/2]_{0.25}^{0.5} = 7/32 = 0.21875.
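The conditional mean and probability in Example 7 can be verified numerically (a sketch with `scipy.integrate.quad`, SciPy assumed):

```python
from scipy.integrate import quad

# Conditional density from Example 7 at x = 0.5:
# f(y | x = 0.5) = (0.5 + y)/(0.5 + 0.5) for 0 < y < 1
f_cond = lambda y: (0.5 + y) / (0.5 + 0.5)

ey, _ = quad(lambda y: y * f_cond(y), 0, 1)  # conditional mean E(Y | x = 0.5)
p, _ = quad(f_cond, 0.25, 0.5)               # P(0.25 < Y < 0.5 | x = 0.5)

print(ey, p)  # 7/12 and 7/32
```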

2.7 Independence of continuous random variables

The definition of independence for continuous random variables is similar to that of discrete
random variables. For continuous random variables if fXY (x, y) = fX (x)fY (y) for all x and y
then the random variables X and Y are said to be independent.

Example 8 Demonstrate that for the pdf

fXY (x, y) = e−(x+y) , x > 0, y > 0,

the random variables X and Y are independent.

Solution

It can easily be verified that f_{XY}(x, y) is a valid pdf. The marginal densities are

    f_X(x) = ∫_0^∞ e^{−(x+y)} dy = e^{−x} ∫_0^∞ e^{−y} dy = e^{−x},

    f_Y(y) = ∫_0^∞ e^{−(x+y)} dx = e^{−y} ∫_0^∞ e^{−x} dx = e^{−y}.

Clearly

    f_{XY}(x, y) = f_X(x) f_Y(y),

which implies that the random variables X and Y are independent.

2.8 Properties of variance

1. If X is a random variable and a and c are constants, then

   • var[aX + c] = a² var[X].

2. If X and Y are random variables with a joint probability distribution, and a and b are constants, then

   • var[aX + bY] = a² var[X] + b² var[Y] + 2ab cov[X, Y].

3. If X and Y are independent random variables (cov(X, Y) = 0), and a and b are constants, then

   • var[aX ± bY] = a² var[X] + b² var[Y].

4. If X and Y are random variables with a joint probability distribution, and a and b are constants, then

   • var[aX − bY] = a² var[X] + b² var[Y] − 2ab cov[X, Y].

Example 9 If X and Y are random variable (r.v) with joint probability distribution, such that
var[X] = 2, var[Y ] = 4 and cov(X, Y ) = −2, find

(a) var[Z] where Z = 3X − 4Y + 8.

(b) cov(P, Q) where P = X − 2Y and Q = 3X + Y.

Solution

(a)

    var[Z] = 3² var[X] + 4² var[Y] − 2(3)(4) cov(X, Y)
           = 9(2) + 16(4) − 24(−2)
           = 130.

(b)

    cov(P, Q) = cov(X − 2Y, 3X + Y)
              = 3 cov(X, X) + cov(X, Y) − 6 cov(Y, X) − 2 cov(Y, Y)
              = 3 var[X] − 5 cov(X, Y) − 2 var[Y]
              = 3(2) − 5(−2) − 2(4)
              = 8.
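The arithmetic can be checked by applying the variance and covariance rules above directly (a small sketch; the variable names are ours):

```python
# Given quantities from Example 9
var_x, var_y, cov_xy = 2, 4, -2

# (a) var[3X - 4Y + 8]: the added constant 8 does not affect the variance
var_z = 3**2 * var_x + 4**2 * var_y - 2 * 3 * 4 * cov_xy
print(var_z)  # 130

# (b) cov(X - 2Y, 3X + Y), expanded term by term using bilinearity:
# 3 cov(X,X) + cov(X,Y) - 6 cov(Y,X) - 2 cov(Y,Y)
cov_pq = 3 * var_x + cov_xy - 6 * cov_xy - 2 * var_y
print(cov_pq)  # 8
```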

2.9 Normal distribution function

• One of the most important examples of a continuous probability distribution is the Nor-
mal distribution.

• The pdf of the Normal distribution (Gaussian distribution) is given by

    f(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)},  −∞ < x < ∞,    (22)

where µ and σ are the mean and standard deviation, respectively.

• If we let z be the standardised variable corresponding to x, that is, if we let

    Z = (x − µ)/σ,    (23)

then (22) becomes

    φ(z) = (1/√(2π)) e^{−z²/2}.    (24)

• Equation (24) is often termed the standard normal density function.

• Properties of the Normal curve:

1. The mode and mean occur at x = µ.

2. The curve is symmetric about a vertical axis through the mean.

3. The curve has its points of inflection at x = µ + σ and x = µ − σ.

4. The curve approaches the horizontal axis asymptotically.

5. The total area under the curve is equal to unity.

Example 10

1. Find the area under the standard normal curve

(a) between z = 0 and z = 1.2


(b) between z = −0.68 and z = 0

2. Suppose the current measurements in a strip of wire are assumed to follow a normal distribution with a mean of 10 milliamperes and a variance of 4 (milliamperes)². What is the probability that a measurement will exceed 13 milliamperes?
Solution

1. (a)

    P(0 ≤ Z ≤ 1.2) = P(Z ≤ 1.2) − P(Z ≤ 0) = 0.8849 − 0.5 = 0.3849.

(b)

    P(−0.68 ≤ Z ≤ 0) = P(Z ≤ 0) − P(Z ≤ −0.68) = 0.5 − 0.2483 = 0.2517.

2. Let X denote the current in milliamperes. The requested probability can be represented as P(X > 13). Let Z = (X − 10)/2. Therefore

    P(X > 13) = P((X − 10)/2 > (13 − 10)/2) = P(Z > 1.5) = 0.06681.
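The table lookups in Example 10 can be reproduced with `scipy.stats.norm` (a sketch, SciPy assumed):

```python
from scipy.stats import norm

# Areas under the standard normal curve (Example 10, part 1)
a1 = norm.cdf(1.2) - norm.cdf(0)     # between z = 0 and z = 1.2
a2 = norm.cdf(0) - norm.cdf(-0.68)   # between z = -0.68 and z = 0
print(a1, a2)

# Current measurement (part 2): X ~ N(10, sigma^2 = 4), so sigma = 2
p = norm.sf(1.5)                       # upper-tail area P(Z > 1.5)
q = 1 - norm.cdf(13, loc=10, scale=2)  # same probability without standardising
print(p, q)
```

`norm.sf` is the survival function 1 − Φ(z), which avoids the subtraction from 1 by hand.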

3 Hypothesis testing
3.1 Introduction

• Definition: A hypothesis is a statement about a population.

• Hypothesis testing is concerned with deciding between two hypotheses, H0 (the null hypothesis) and H1 (the alternative hypothesis).

• H0 is an assertion that a parameter in a statistical model takes a particular value.

• It is the hypothesis we usually set up with the expectation of rejecting it.

• H1 expresses the way in which the value of a particular parameter in a statistical model may deviate from that specified in H0.

Example 11 A machine that produces metal cylinders is set to make cylinders with a diameter
of 50 mm. Is it practical that all cylinders that this machine will produce will have a diameter
of exactly 50 mm?
Solution

(i) H0: µ = 50 (all cylinders produced by the machine have the set diameter, 50 mm)

(ii) H1: µ ≠ 50 (there is a possibility that the machine can produce cylinders whose diameter is not 50 mm)

Definitions

• Test statistic: H0 generally reflects a position of no change.

• We conduct a test not to prove H0 but to see if it should be rejected.

• Such tests are based on the value of sample statistics, such as x̄, or z or t scores, and these are called test statistics.

• Critical region: a subset of the values of a test statistic that might be observed in an experiment.

• The subset is chosen so that its total probability is low under H0 and is better explained by H1.

• Type I error: rejecting H0 when it is true.

• Type II error: failing to reject H0 when it is not true.

• Level of significance: the probability of committing a type I error.

Steps in hypothesis testing

1. State your H0 and H1

(a) If H1 is of the form µ 6= µ0 , then a two tail test is used.


(b) If H1 is of the form µ > µ0 , then a single tail or one sided test to right is used.
(c) If H1 is of the form µ < µ0 , then a single tail or one sided test to the left is required.

Figure 1: The distribution of Z0 when H0: µ = µ0 is true, with the critical region for (a) the two-sided alternative H1: µ ≠ µ0, (b) the one-sided alternative H1: µ > µ0, and (c) the one-sided alternative H1: µ < µ0.

2. Choose the level of significance.

3. Choose the appropriate test statistics and establish the critical region.

4. Compute the test statistic value based on the sample data.

5. Draw a conclusion, that is, reject or fail to reject H0.

Remark: We reject H0 when the computed value lies in the critical region.

Example 12: An electrical firm manufactures light bulbs whose lifetime is approximately normally distributed with a mean of 12 hours and variance 0.64 hours². A light bulb is selected at random and tested, and its lifetime is found to be 13.3 hours. Determine whether this bulb belongs to the manufacturer. Use the 5% level of significance.

Solution

1. • H0: µ = 12 (the bulb belongs to the manufacturer)

   • H1: µ ≠ 12 (the bulb does not belong to the manufacturer)

2. Level of significance: 5%

3. Test statistic: Normal distribution.
4. Computed Z-value:

    Z = (x − µ)/σ = (13.3 − 12)/√0.64 = 1.625.

5. Conclusion: Since the computed Z value (1.625) does not lie in the critical region (Z < −1.96 or Z > 1.96), we fail to reject H0 at the 5% level of significance and conclude that the bulb belongs to the manufacturer.

3.2 Comparing a single mean to a specified value, when the population


variance is known

• It is typical that we compare a single observation to a specified value.

• Usually we take a sample of size n, from which we compute the sample mean x̄, which we then compare with a specified value.

    var[x̄] = var[(Σ_{i=1}^n x_i)/n]
            = (1/n²) var[x₁ + x₂ + · · · + xₙ]
            = nσ²/n² = σ²/n.

• Therefore the standard error is S.E.[x̄] = √(σ²/n) = σ/√n.
n n
Example 13 A soft-drink bottler purchases 10 bottles from a glass company. The bottler wants to know if the mean breaking strength exceeds 200 psi; if so, she wants to accept the bottles. Past experience indicates that for 4 specimen bottles the variance of the breaking strength is 100 psi² and the mean is 214 psi. Investigate at the 5% level of significance whether the manufacturer should accept or reject the bottles.

Solution

1. • H0 : µ = 200
• H1 : µ > 200

2. Level of significance: 5%

3. Test statistic: Z (Normal distribution)

4. Computed Z-value:

    Z = (x̄ − µ)/S.E.[x̄] = (214 − 200)/(10/2) = 2.8.

5. Conclusion: Reject H0. The manufacturer should accept the lot since the mean breaking strength is greater than 200 psi.
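The one-sided z-test of Example 13 can be sketched in a few lines (SciPy assumed for the critical value; the names are ours):

```python
import math
from scipy.stats import norm

# One-sided z-test for Example 13: H0: mu = 200 against H1: mu > 200
xbar, mu0, sigma2, n = 214, 200, 100, 4

z = (xbar - mu0) / math.sqrt(sigma2 / n)  # (214 - 200)/5 = 2.8
z_crit = norm.ppf(0.95)                   # upper 5% point, about 1.645

print(z, z > z_crit)  # 2.8, True -> reject H0
```

For a one-tailed test at the 5% level the critical value is z_{0.05} ≈ 1.645, not the two-tailed 1.96.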

3.3 The difference between two mean when population variances σ12 and σ22
are known

• Since we have two populations under study, we must deduce the standard error for these populations.

• We know that var[x̄] = σ²/n.

• Then var[x̄₁ − x̄₂] = var[x̄₁] + var[x̄₂] = σ₁²/n₁ + σ₂²/n₂.

• Thus S.E.[x̄₁ − x̄₂] = √(σ₁²/n₁ + σ₂²/n₂).

• Hence Z = ((x̄₁ − x̄₂) − (µ₁ − µ₂))/S.E.[x̄₁ − x̄₂].
Example 14 A manufacturer claims that the average tensile strength of synthetic fibre A ex-
ceeds the average tensile strength of synthetic fibre B. To test his claim, 50 pieces of each type
of synthetic fibre are tested under similar conditions. Type A had an average tensile strength
of 43.7 psi and a variance of 11.8 psi2 , while type B had an average tensile strength of 41.5
psi and a variance of 46.3 psi2 . At 5% significance level, test the manufacturer’s claim.

Solution

The problem objective is to compare two populations, thus

1. • H0 : µA − µB = 0
• H1 : µA − µB > 0

2. Test statistic:

    Z = ((x̄_A − x̄_B) − (µ_A − µ_B)) / √(σ_A²/n_A + σ_B²/n_B)
      = ((43.7 − 41.5) − 0) / √(11.8/50 + 46.3/50)
      = 2.04.

3. Level of significance 5% (one tail test)

4. Conclusion: Reject H0 . There is sufficient evidence to allow us to conclude that µA > µB ,


hence the manufacturer’s claim is true.

3.4 Hypothesis testing when the population variance is not known and n < 30

Example 15 A manufacturer of television picture tubes has a production line that produces
an average of 100 tubes per day. Because of new government regulations, a new safety device
has been installed, which the manufacturer believes will reduce average daily output. A random
sample of 15 days’ output after the installation of the safety device is shown below.

93, 103, 95, 101, 91, 105, 96, 94, 101, 88, 98, 94, 101, 92, 95

At 5% significance level, is there sufficient evidence to conclude that the average daily output
has decreased following the installation of the safety device?

Solution

• Here the population mean and variance are unknown.

• Also observe that the sample size (n = 15) < 30, hence we use the t distribution.

1. • H0 : µ = 100 (Daily production has not changed)


• H1 : µ < 100 (Daily production has decreased)

2. Level of significance 5% (one tail test)

3. Test statistic:

    t = (x̄ − µ)/(s/√n) = (96.47 − 100)/(4.85/√15) = −2.82.

4. Conclusion: Reject H0, since the computed t value lies in the critical region (−2.82 < −1.761), and conclude that there is enough evidence to show that the average daily production has decreased.
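Example 15 can be reproduced with `scipy.stats.ttest_1samp` (a sketch, assuming SciPy ≥ 1.6 for the `alternative` keyword):

```python
from scipy import stats

# Daily output after installing the safety device (Example 15)
output = [93, 103, 95, 101, 91, 105, 96, 94, 101, 88, 98, 94, 101, 92, 95]

# One-sample, one-sided t-test of H0: mu = 100 against H1: mu < 100
t, p = stats.ttest_1samp(output, popmean=100, alternative="less")
print(t)         # about -2.82
print(p < 0.05)  # True -> reject H0 at the 5% level
```

The returned one-sided p-value is compared directly to the significance level, which is equivalent to comparing t with the critical value −t_{0.05,14} = −1.761.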

3.5 Difference between two population mean when the population variances
σ12 and σ22 are not known and (n1 − 1) + (n2 − 1) = n1 + n2 − 2 < 30.

• In order to use t distribution to make a valid test of hypothesis about µ1 −µ2 the following
conditions must be met.

1. The two population random variables (x₁ and x₂) are normally distributed.
2. The two samples must be independent.
3. The two population variances are equal, that is, σ₁² = σ₂².

• By condition 3, we have a common variance known as the pooled variance, given by

    s_p² = ((n₁ − 1)s₁² + (n₂ − 1)s₂²)/(n₁ + n₂ − 2).

• Since we are comparing two populations,

    t = ((x̄₁ − x̄₂) − (µ₁ − µ₂)) / √(s_p²/n₁ + s_p²/n₂) = ((x̄₁ − x̄₂) − (µ₁ − µ₂)) / √(s_p²(1/n₁ + 1/n₂)).

Example 16 The manager of a large production facility believes that worker productivity is a
function of, among other things, the design of the job, which refers to the sequence of move-
ments. Two designs are being considered for the production of new product. In an experiment,
six workers using design A had a mean assembly time of 7.60 minutes, with a standard devi-
ation of 2.36 minutes, for this product. (The six observation were 8.2, 5.3, 6.5, 5.1, 9.7, 10.8).
Eight workers using design B had a mean assembly time of 9.20 minutes, with a standard
deviation of 1.35 minutes. (The observations were 9.5, 8.3, 7.5, 10.9, 11.3, 9.3, 8.8, 8.0). Can
we conclude at the 5% level of significance that the average assembly times differ for the two
designs? Assume that the times are normally distributed.

Solution

• Here we are being asked to determine if µ₁ ≠ µ₂.

• Observe that (n₁ − 1) + (n₂ − 1) < 30 ⇒ t distribution.

1. • H0 : µ1 − µ2 = 0
• H1 : µ1 − µ2 6= 0

2. Test statistic:

    t = ((x̄₁ − x̄₂) − (µ₁ − µ₂)) / √(s_p²(1/n₁ + 1/n₂)) = ((7.60 − 9.20) − 0) / √(3.38(1/6 + 1/8)) = −1.61.

• Recall that

    s_p² = ((n₁ − 1)s₁² + (n₂ − 1)s₂²)/(n₁ + n₂ − 2)
         = ((6 − 1)2.36² + (8 − 1)1.35²)/(6 + 8 − 2)
         = 3.38.

3. Conclusion: We fail to reject H0, since the computed t value does not lie in the critical region. Therefore we conclude that there is not sufficient evidence to allow us to conclude that a difference in mean assembly times exists between designs A and B.
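The pooled two-sample test of Example 16 can be run on the raw observations with `scipy.stats.ttest_ind` (a sketch, SciPy assumed; `equal_var=True` gives the pooled-variance version):

```python
from scipy import stats

# Assembly times for the two job designs (Example 16)
design_a = [8.2, 5.3, 6.5, 5.1, 9.7, 10.8]
design_b = [9.5, 8.3, 7.5, 10.9, 11.3, 9.3, 8.8, 8.0]

# Pooled (equal-variance) two-sample t-test, two-sided
t, p = stats.ttest_ind(design_a, design_b, equal_var=True)
print(t)         # about -1.61
print(p > 0.05)  # True -> fail to reject H0 at the 5% level
```

Setting `equal_var=False` would instead give Welch's test, which does not assume σ₁² = σ₂².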

3.6 Paired Comparison

• Here, we consider two samples as in a two-sample t-test; the difference is that in this experimental design the samples are not independent.

• Observations occur in pairs such that the two observations in a pair are taken from the same experimental unit,

• or from two similar experimental units (similar with respect to certain attributes).

Example 17 Gasohol has received much attention in recent years as possible alternative to
gasoline as a fuel for auto-mobiles. To compare the mileages per-gallon that can be achieved
with the two fuels, the following test was performed. Eight cars were selected and their fuel
tanks completely cleaned. Each car was driven twice over a predetermined course-once using
gasohol and once using gasoline and the miles per gallon was recorded for each trip.
At 10% significance level, does the data support the hypothesis that the mean mileage per gallon
of gasohol is less than that of gasoline?

Solution

This is a paired comparison.

    Mileage with gasohol    30  36  34  22  12  32  15  31
    Mileage with gasoline   35  42  40  27  15  33  20  35
    Difference, D           −5  −6  −6  −5  −3  −1  −5  −4
    D²                      25  36  36  25   9   1  25  16

It follows that

    x̄_d = (Σ D)/n = −35/8 = −4.375,

    s_d = √((n Σ D² − (Σ D)²)/(n(n − 1))) = √((8(173) − (−35)²)/(8(7))) = 1.69.

1. • H0: µ_d = 0
   • H1: µ_d < 0

2. Test statistic:

    t = (x̄_d − µ_d)/(s_d/√n) = (−4.375 − 0)/(1.69/√8) ≈ −7.32.

3. Level of significance: 10% (one-tailed test; reject H0 if t < −t_{0.1,7} = −1.415).

4. Conclusion: Reject H0 and conclude that the mean mileage for gasohol is less than that
of gasoline at 10% level of significance.
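Example 17 can be reproduced with `scipy.stats.ttest_rel` (a sketch, assuming SciPy ≥ 1.6 for the `alternative` keyword):

```python
from scipy import stats

# Miles per gallon for each of the eight cars (Example 17)
gasohol = [30, 36, 34, 22, 12, 32, 15, 31]
gasoline = [35, 42, 40, 27, 15, 33, 20, 35]

# Paired t-test with H1: mean gasohol mileage is lower
t, p = stats.ttest_rel(gasohol, gasoline, alternative="less")
print(t)         # about -7.34 (hand computation with s_d rounded gives about -7.3)
print(p < 0.10)  # True -> reject H0 at the 10% level
```

Internally this is just a one-sample t-test on the differences D, as in the hand computation above.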

3.7 Confidence interval

• In general, a (1 − α)100% confidence interval (CI) for the population parameter θ is given by

    θ̂ − critical value · SE[θ̂] < θ < θ̂ + critical value · SE[θ̂].    (25)

3.8 CI for the true population mean µ, when the population variance σ² is known

Note that the critical value in a confidence interval is that of a two-tailed test.

Example 18 Revisit the soft-drink bottler problem (example 13).

• A (1 − α)100% CI for µ is given by

x̄ − z α2 · SE[x̄] < µ < x̄ + z α2 · SE[x̄]. (26)

• H0 : µ = 200, H1 : µ > 200

• Level of significance:5% ⇒ 95% CI for µ

• Recall that x̄ = 214, n = 4 and σ² = 100:

    214 − 1.96√(σ²/n) < µ < 214 + 1.96√(σ²/n)

    204.2 < µ < 223.8

• Conclusion: Since the stated value in the null hypothesis does not lie within the 95% CI
we reject H0 at 5% level of significance and conclude that the mean breaking strength is
greater than 200.
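The interval can be computed directly (a sketch, SciPy assumed for the critical value):

```python
import math
from scipy.stats import norm

# 95% CI for mu with known variance (the bottle data of Example 18)
xbar, sigma2, n = 214, 100, 4

z = norm.ppf(0.975)                     # two-tailed 5% critical value, about 1.96
half_width = z * math.sqrt(sigma2 / n)  # 1.96 * (10/2)
lo, hi = xbar - half_width, xbar + half_width

print(lo, hi)  # about (204.2, 223.8)
```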

3.9 CI for true population mean µ when the population variance σ 2 is un-
known and n < 30.

Example 19 Revisit example 15 (The television picture tubes problem).

• A (1 − α)100% CI for µ in this case is given by

    x̄ ± t_{(α/2, n−1)} · SE[x̄] = 96.47 ± 2.145 (4.85/√15)

    ⇒ 93.78 < µ < 99.16

• H0 : µ = 100, H1 : µ < 100.

• Conclusion: Reject H0 (the value stated in the null hypothesis is not in the CI). Conclude that the average daily production has decreased.

3.10 CI for the difference between two populations

CI for the difference µ₁ − µ₂ when σ₁² and σ₂² are known and n₁ + n₂ − 2 > 30.

Example 20 Revisit Example 14

• A (1 − α)100% CI for µ₁ − µ₂ is given by

    (x̄₁ − x̄₂) ± z_{α/2} √(s₁²/n₁ + s₂²/n₂)
    = (43.7 − 41.5) ± 1.96 √(11.8/50 + 46.3/50)

    ⇒ 0.087 < µ₁ − µ₂ < 4.31

• H0 : µ1 − µ2 = 0, H1 : µ1 − µ2 > 0

• Conclusion: Reject H0. Conclude that the mean tensile strengths of the two metals are different.
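A minimal Python sketch of this large-sample interval, using the sample figures from the example:

```python
import math

x1_bar, x2_bar = 43.7, 41.5         # sample means of the two metals
s1_sq, s2_sq, n1, n2 = 11.8, 46.3, 50, 50
z = 1.96                            # z_{alpha/2} for a 95% interval
se = math.sqrt(s1_sq / n1 + s2_sq / n2)
diff = x1_bar - x2_bar
lo, hi = diff - z * se, diff + z * se
print(round(lo, 3), round(hi, 2))   # → 0.087 4.31
```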

3.11 CI for the difference between two population means (small samples)

CI for µ1 − µ2 when σ1² and σ2² are unknown and n1 + n2 − 2 < 30.

Example 21 Revisit Example 16.

• A (1 − α)100% CI for µ1 − µ2 is given by


   (x̄1 − x̄2) ± t_{(α/2, n1+n2−2)} √[ s_p² (1/n1 + 1/n2) ]

   = (7.6 − 9.2) ± 2.179 √[ 3.38 (1/6 + 1/8) ]

   ⇒ −3.76 < µ1 − µ2 < 0.56

• H0 : µ1 − µ2 = 0, H1 : µ1 − µ2 ≠ 0

• Conclusion: Fail to reject H0. There is insufficient evidence to conclude that there is a significant difference in mean assembly times between designs A and B.
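The pooled-variance interval can be sketched in the same way (t_{0.025,12} = 2.179 from tables; s_p² = 3.38 as given in the example):

```python
import math

x1_bar, x2_bar, n1, n2 = 7.6, 9.2, 6, 8
sp_sq = 3.38                        # pooled sample variance
t_crit = 2.179                      # t_{alpha/2, n1+n2-2} = t_{0.025,12}
se = math.sqrt(sp_sq * (1 / n1 + 1 / n2))
diff = x1_bar - x2_bar
lo, hi = diff - t_crit * se, diff + t_crit * se
print(round(lo, 2), round(hi, 2))   # → -3.76 0.56
```

Since 0 lies inside the interval, H0 is not rejected, matching the conclusion above.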

4 Linear Regression and Correlation
4.1 Introduction

• In many real world problems there are two or more variables that are related, and it is
important to explore this relationship.

• E.g., in an industrial situation it is known that the tar content in the outlet stream of a chemical process is related to the inlet temperature.

• It may be of interest to develop a method of prediction, that is, a procedure for estimating the tar content at various levels of the inlet temperature.

• One such model is a regression model

• Applications of regression are numerous and occur in almost every field, including:

– engineering
– physical sciences
– economics and resource management
– life and biological sciences

• Simple linear regression is a model with a single regressor or independent variable, say x

• The relationship between the response y and the regressor, x, is a straight line.

• The simple linear regression model is

   y = β0 + β1 x + ε   (27)

• where β0 is the y-intercept, β1 is the slope, and ε is a random error component.

• Errors are assumed to be normally distributed with zero mean and an unknown but constant variance.

• Errors are also assumed to be uncorrelated, implying that the value of one error does not
depend on the value of any other error.

• Parameters β0 and β1 are called regression coefficients

• β1 gives the change in the mean of the distribution of y produced by a unit change in x

• If the range of data on x includes x = 0, then the y-intercept β0 is the mean of the distribution of the response y when x = 0

• However, if the range of values for x does not include zero, then β0 has no practical
interpretation.

4.2 Least squares estimation of β0 and β1

• Parameters β0 and β1 are unknown and must be estimated using sample data

• These data may come either from a controlled experiment designed specifically to collect them, or from existing records.

• The method of least squares is used to estimate β0 and β1 , that is,

• We need to estimate β0 and β1 so that the sum of squares of the differences between the observations yi and the straight line is a minimum.

• From (27) we may write

   yi = β0 + β1 xi + εi,   i = 1, 2, 3, · · · , n.   (28)

• Equation (27) can be viewed as a population regression model while equation (28) is a
simple regression model written in terms of the n pairs of data (yi , xi ), i = 1, 2, · · · , n.

• The fitted regression line is

ŷ = βˆ0 + βˆ1 x,
such that each pair of observations satisfies the relation
   yi = β̂0 + β̂1 xi + ei   (29)

where ei = yi − ŷi is called a residual and describes the error in the fit of the model at the ith data point.

• We need to find βˆ0 and βˆ1 so as to minimize the Residual Sum of Squares (RSS).
   RSS = Σ ei² = Σ (yi − β̂0 − β̂1 xi)²,   where the sums run over i = 1, . . . , n.   (30)

• Taking partial derivatives of (30) with respect to β̂0 and β̂1 gives

   ∂RSS/∂β̂0 = −2 Σ (yi − β̂0 − β̂1 xi)   (31)

   ∂RSS/∂β̂1 = −2 Σ (yi − β̂0 − β̂1 xi) xi   (32)

• Setting the partial derivatives to zero and rearranging the terms we obtain
   n β̂0 + β̂1 Σ xi = Σ yi   (33)

   β̂0 Σ xi + β̂1 Σ xi² = Σ xi yi   (34)

28
which yields (make β̂0 the subject of (33) and substitute into (34))

   β̂1 = [n Σ xi yi − (Σ xi)(Σ yi)] / [n Σ xi² − (Σ xi)²] = Sxy / Sxx   (35)

• where

   Sxx = Σ xi² − (Σ xi)²/n   (36)

   Syy = Σ yi² − (Σ yi)²/n   (37)

   Sxy = Σ xi yi − (Σ xi)(Σ yi)/n   (38)

• Then from (33) we have


   β̂0 = ȳ − β̂1 x̄
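The least-squares solution (35)–(38) can be packaged as a small Python function; as a check it is applied here to the converted-sugar data of Example 22 below, reproducing β̂0 ≈ 6.41 and β̂1 ≈ 1.81:

```python
def fit_simple_linear(x, y):
    """Least-squares estimates (b0, b1) for the line y = b0 + b1*x."""
    n = len(x)
    sxx = sum(xi * xi for xi in x) - sum(x) ** 2 / n          # equation (36)
    sxy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n  # (38)
    b1 = sxy / sxx                                            # slope, (35)
    b0 = sum(y) / n - b1 * sum(x) / n                         # b0 = ybar - b1*xbar
    return b0, b1

# Converted-sugar data from Example 22, used here as a check.
x = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9, 2.0]
y = [8.1, 7.8, 8.5, 9.8, 9.5, 8.9, 8.6, 10.2, 9.3, 9.2, 10.5]
b0, b1 = fit_simple_linear(x, y)
print(round(b0, 2), round(b1, 2))  # → 6.41 1.81
```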

4.3 Analysis of variance approach to regression analysis

• Often the problem of analysing the quality of the fitted regression line is handled through
an ANOVA approach.

• The analysis of variance approach in a simple regression model tests the hypotheses:

   – H0 : β1 = 0 (the slope is not significantly different from zero)

   – H1 : β1 ≠ 0 (the slope is significantly different from zero)

• One-way ANOVA Table

Source       df      SS          MS                  F
Regression   1       Regr SS     Regr SS             Regr SS / s²
Residual     n − 2   SSE         s² = SSE/(n − 2)
Total        n − 1   SS(Total)

4.4 The coefficient of determination

• The quantity r2 is called the coefficient of determination

• Mathematically it is given by
   r² = Regression SS / Syy = 1 − Residual SS / Syy   (39)

• Syy is a measure of the variability in the response y without considering the effect of the regressor x, while the residual SS is a measure of the variability in y remaining after x has been considered.

• Hence r² is the proportion of the variation in the response y accounted for by the regressor x.

• By formulation the coefficient of determination is in the range 0 ≤ r2 ≤ 1

• Values of r2 close to 1 imply that the model explains most of the variation in y.

4.5 Inferences concerning the regression coefficients

1. Confidence interval (CI) for β1

• A (1 − α)100% CI for the slope β1 is given by


   β̂1 − t_{(α/2, n−2)} √(S²/Sxx) < β1 < β̂1 + t_{(α/2, n−2)} √(S²/Sxx)   (40)

• where t_{(α/2, n−2)} is the value of the t-distribution with (n − 2) degrees of freedom, and S² is the residual mean square from the ANOVA table.

2. Confidence interval (CI) for β0

• A (1 − α)100% CI for the intercept β0 is given by

   β̂0 − t_{(α/2, n−2)} √[ S² (1/n + x̄²/Sxx) ] < β0 < β̂0 + t_{(α/2, n−2)} √[ S² (1/n + x̄²/Sxx) ]   (41)

3. Prediction of new observations

• The equation ŷ = β̂0 + β̂1 x may be used to predict the response at x = x0, where x0 is not necessarily one of the pre-chosen values.

4. Prediction interval for y0

   • A (1 − α)100% prediction interval for a new observation y0 at x = x0 is given by

   ŷ0 − t_{(α/2, n−2)} · SE[y0] < y0 < ŷ0 + t_{(α/2, n−2)} · SE[y0]

   where SE[y0] = √[ S² (1 + 1/n + (x0 − x̄)²/Sxx) ]

Example 22 An experiment on the amount of converted sugar in a certain biochemical process


at various temperatures was conducted, and the following results were obtained.

Temperature (x) 1.0 1.1 1.2 1.3 1.4 1.5 1.6 1.7 1.8 1.9 2.0
Converted sugar (y) 8.1 7.8 8.5 9.8 9.5 8.9 8.6 10.2 9.3 9.2 10.5

(a) Fit a simple linear regression model to the data.

(b) Carry out an analysis of variance (ANOVA) to test at the 5% level of significance whether the slope is significantly different from zero. From the ANOVA table, compute the coefficient of determination, r², and interpret it.

(c) Predict the amount of converted sugar when the coded temperature is 1.75. Find a 95%
prediction interval for this prediction.

Solution

(a) From the given data we have the following:

   x̄ = Σ xi / n = 16.5/11 = 1.5,   (42)

   ȳ = Σ yi / n = 100.4/11 = 9.13,   (43)

   Sxx = Σ xi² − (Σ xi)²/n = 1.1,   (44)

   Syy = Σ yi² − (Σ yi)²/n = 7.20,   (45)

   Sxy = Σ xi yi − (Σ xi)(Σ yi)/n = 1.99.

Using the above results we have


   β̂1 = Sxy/Sxx = 1.99/1.1 = 1.81   (46)

   β̂0 = ȳ − β̂1 x̄ = 9.13 − 1.81(1.5) = 6.41   (47)

Thus, the fitted regression equation is ŷ = 6.41 + 1.81x


Remark: To draw the fitted line on the scatter plot, we need to compute at least two fitted response values at two given values of the regressor x, e.g.

ŷ|x=1.0 = 6.41 + 1.81(1.0) = 8.22 (48)


ŷ|x=2.0 = 6.41 + 1.81(2.0) = 10.03. (49)

(b) For the ANOVA table we need to compute the following

Regression SS = βˆ1 Sxy = 1.81(1.99) = 3.60 (50)


Residual SS = Syy − βˆ1 Sxy = 7.20 − 3.60 = 3.60

The hypotheses are:

• H0 : β1 = 0
• H1 : β1 6= 0

One-way Anova Table

Source df SS MS F
Regression 1 3.60 3.60 9.0
Residual 9 3.60 0.40
Total 10 7.20

Critical value: F_{0.05}(1, 9) = 5.12.
Conclusion: Since the computed F −value (9.0) is greater than the critical value (5.12) at
5% level of significance, we reject H0 and conclude that the slope is significantly different
from zero.

   r² = Regression SS / Syy = 3.60/7.20 = 0.50

Comment: This implies that 50% of the variation in converted sugar (y) is explained by
the temperature (x) and the remainder is unaccounted for by our regression model.
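A Python sketch of the ANOVA quantities in part (b), starting from the sums already computed (the inputs are rounded as in the notes, so the results match the printed values to the stated precision):

```python
b1, sxy, syy, n = 1.81, 1.99, 7.20, 11   # from part (a)

regression_ss = b1 * sxy                 # equation (50): about 3.60
residual_ss = syy - regression_ss        # about 3.60
s_sq = residual_ss / (n - 2)             # residual mean square, df = n - 2
f_stat = regression_ss / s_sq            # compare with F_{0.05}(1, 9) = 5.12
r_sq = regression_ss / syy               # coefficient of determination
print(round(f_stat, 1), round(r_sq, 2))  # → 9.0 0.5
```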

(c) The predicted converted sugar at a coded temperature of 1.75 is

   ŷ0 = 6.41 + 1.81(1.75) = 9.58


   SE[y0] = √[ S² (1 + 1/n + (x0 − x̄)²/Sxx) ]   (51)

         = √[ 0.40 (1 + 1/11 + (1.75 − 1.5)²/1.1) ] = 0.6776   (52)

Therefore the 95% prediction interval for y0 is

   ŷ0 − t_{0.025,9} · SE[y0] < y0 < ŷ0 + t_{0.025,9} · SE[y0]

   9.58 − 2.26(0.6776) < y0 < 9.58 + 2.26(0.6776)

   8.05 < y0 < 11.11
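The prediction interval of part (c) can be sketched in Python (table value t_{0.025,9} = 2.26; the endpoints agree with the hand computation to two decimals):

```python
import math

b0, b1 = 6.41, 1.81                  # fitted coefficients from part (a)
s_sq, n, sxx, x_bar = 0.40, 11, 1.1, 1.5
x0, t_crit = 1.75, 2.26              # prediction point and t_{0.025,9}

y0_hat = b0 + b1 * x0                # predicted converted sugar, about 9.58
se = math.sqrt(s_sq * (1 + 1 / n + (x0 - x_bar) ** 2 / sxx))
lo, hi = y0_hat - t_crit * se, y0_hat + t_crit * se
print(round(lo, 2), round(hi, 2))    # → 8.05 11.11
```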

THE END OF THE NOTES !!!!!!!!!!!!!!!!!!

Engineering Mathematics 2: MT233 (2019): Tutorial

1. Verify that the function f is a probability density function (p.d.f.) over the given interval

   (a) f(x) = 1/8, x ∈ [0, 8]

   (b) f(x) = (1/6)e^{−x/6}, x ∈ [0, ∞)

   (c) f(x) = 12x²(1 − x), x ∈ [0, 1].

2. Find the constant k so that the function f is a probability density function over the given interval

   (a) f(x) = kx, x ∈ [1, 5]

   (b) f(x) = k/(b − a), x ∈ [a, b]

   (c) f(x) = k(4 − x²), x ∈ [−2, 2]

3. The distribution of petrol consumption at a garage is given by

   f(x) = k(x − 1)(3 − x), 1 < x < 3,
        = 0, otherwise.

   (a) Determine the value of k, so that f(x) becomes a valid p.d.f.

   (b) Find the mode, mean and variance of X.

   (c) Find the probability that X is greater than 2.5 litres.

4. Let X denote the reaction time, in seconds, to a certain stimulus and Y denote the temperature (°F) at which a certain reaction starts to take place. Suppose that the two random variables X and Y have the joint density

   f(x, y) = k(2x + y), 2 < x < 6, 0 < y < 5,
           = 0, otherwise.

   (a) Find k,

   (b) P(X > 3, Y > 2),

   (c) P(X + Y < 4).

5. Let X and Y denote the lengths of life, in years, of two components in an electronic system. If the joint density function of these variables is

   f(x, y) = e^{−(x+y)}, x > 0, y > 0,
           = 0, otherwise,

   find P(0 < X < 1 | Y = 2).

6. Given that var[X] = 2 and var[Y] = 3, compute

   (a) var[(1/2)X + 10],

   (b) var[4Y + 9],

   (c) var[(9/2)Y − 3].

7. If X and Y are random variables (r.v.) with a joint probability distribution, such that var[X] = 1.5, var[Y] = 2 and cov(X, Y) = −1, find

   (a) var[Z], where Z = X − 2Y + 1,

   (b) var[W], where W = X + Y + 1,

   (c) cov(P, Q), where P = X + Y, and Q = X − Y,

   (d) cov(P, Q), where P = X + Y, and Q = X − 3Y.

8. The compressive strength of samples of cement can be modeled by a normal distribution


with a mean of 6000 kilograms per square centimeter and a standard deviation of 100
kilograms per square centimeter.

(a) What is the probability that a sample's strength is less than 6250 Kg/cm²?
(b) What is the probability that a sample's strength is between 5800 and 5900 Kg/cm²?
(c) What strength is exceeded by 95% of the samples?

9. The diameter of holes for cable harness is known to have a standard deviation of 0.01
cm. A random sample of size 30 yields an average diameter of 1.5045 cm. Use α = 0.01,
to test the hypothesis that the true mean hole diameter is 1.50 cm.

10. A contractor makes a large purchase of cement. The bags of cement are supposed to weigh 94 kg. The contractor decided to test a sample of bags to see if he is getting the stipulated weight. A random sample of size 9 yielded the following weights (kg).

94.1 93.4 92.8 93.4 93.5 94.0 93.8 92.9 94.2

Make an appropriate test at the 0.05 level of significance.

11. Iron ore is extracted from rocks obtained from two different sites A and B. The observa-
tions are of percentage of iron ore per sample:

A: 47.9 51.3 42.4 54.9 61.8


B: 39.7 50.3 36.8 29.6 41.2 43.7

Assume normal populations, N(µA, σA²) and N(µB, σB²) respectively.

(a) Test the hypothesis H0 : µA = 50 against the alternative H1 : µA < 50.
(b) Test the hypothesis H0 : µA = µB against the alternative H1 : µA ≠ µB.
(c) Find 95% confidence intervals for µA and µB.
(d) Find a 90% confidence interval for µA − µB.

12. The average fuel consumption for 10 small cars before and after a certain additive sub-
stance was introduced into their fuel was observed and the data obtained were recorded
as follows:

After : 47 38 44 48 52 55 44 52 60 44
Before : 40 39 32 33 40 27 36 56 50 40

Suppose that the differences in fuel consumption are normally distributed with mean µD and variance σD².

(a) At 5% significance level, is there sufficient evidence to conclude that the additive
substance increases fuel consumption in each vehicle?
(b) Construct a 90% confidence interval for µD .

13. A market survey was conducted for the purpose of forming a demographic profile of people who would like to own an electronic engineering company. The data collected are presented below:

Response Men Women


Interested 32 20
Not interested 118 130

Is there sufficient evidence to conclude that the desire to own an electronic engineering
company is related to gender? (Test using α = 0.05).

14. Voltage output (y) and engine speed (x), in metres per second, for a turbine at a hydroelectric station were recorded as follows:

Engine speed (x) : 166 169 186 202 203


Voltage Output (y) : 1.6 1.3 1.9 1.6 2.2

(a) Plot the data on a graph paper.


(b) Fit a simple linear regression model to the data.
(c) Carry out an analysis of variance (ANOVA) to test at the 5% level of significance whether the slope is significantly different from zero. From the ANOVA table, compute the coefficient of determination, r², and interpret it.
(d) Predict the voltage output at a chosen engine speed x0 within the range of the data. Find a 95% prediction interval for this prediction.
(e) Find the standard errors of the estimated parameters β̂0 and β̂1 .

15. Explain clearly, using the following data, how the following are constructed:

23 29 40 28 15 22 46 39 22 17 26 33 35 49 20,
36 25 15 31 17 43 54 36 30 30 40 27 24 20 28 42
22 37 17 39 17 22 9 26 29

(a) Stem and leaf plot,


(b) Histogram,
(c) Box plot.

16. An experiment was conducted to compare the speeds of the word-processing packages of two brands of minicomputers A and B. Forty people with similar backgrounds were randomly selected and divided into two groups. One group was assigned minicomputer A and the other minicomputer B. Each person was asked to perform the same word-processing job, and the length of time it took each person to complete the job was recorded. Past experience has shown that the populations associated with minicomputers A and B are normally distributed. The times required by the group using minicomputer A had a mean of 14.8 minutes and a variance of 3.9 minutes². For the group using minicomputer B, the mean length of time to complete the task was 12.3 minutes and the variance was 4.3 minutes².
Is there sufficient evidence to conclude that the mean length of time required to complete
a word-processing task using minicomputer A is less than that of minicomputer B? (Use
α = 0.01).

Statistical Tables (Appendix)

The scanned tables appended to the notes are not legible in this copy; only their titles and notes are recoverable:

Table I: Areas under the normal curve.

Table II: Critical values of Student's t distribution. The entries are the critical values of Student's t for an area of α in the right-hand tail; critical values for the left-hand tail are found by symmetry. NOTE: For df ≥ 30 the critical value t(df, α) is approximated by z(α), given in the bottom row of the table.

Table III: Percentage points of the F distribution (degrees of freedom of the numerator across the top, degrees of freedom of the denominator down the side).