TEST FOR INDEPENDENCE
 
By: Ondoy, Rica Jan; Procianos, Enoch (Reporter); Sampaco, Nisa
 
Introduction
In the test for independence, the claim is that the row and column variables are independent of each other. This is the null hypothesis. The multiplication rule said that if two events were independent, then the probability of both occurring was the product of the probabilities of each occurring. This is the key to working the test for independence. If you end up rejecting the null hypothesis, then the assumption must have been wrong and the row and column variables are dependent. Remember, all hypothesis testing is done under the assumption that the null hypothesis is true.

The test statistic used is the same as in the chi-square goodness-of-fit test, and the principle behind the test for independence is the same as the principle behind the goodness-of-fit test. The test for independence is always a right-tail test.

In fact, you can think of the test for independence as a goodness-of-fit test where the data are arranged into table form. This table is called a contingency table.

The test statistic has a chi-square distribution when the following assumptions are met:
 
- The data are obtained from a random sample.
- The expected frequency of each category must be at least 5 (see the sketch below for one way to check this).
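The second of these assumptions can be checked directly once the expected counts are known. The sketch below is a minimal illustration in Python; the table values and variable names are hypothetical, not part of the original report, and the expected counts are formed with the row-total times column-total over grand-total rule described later on.

import numpy as np

# Hypothetical 2 x 3 contingency table of observed frequencies
# (rows = levels of Variable A, columns = levels of Variable B).
observed = np.array([[20, 30, 25],
                     [30, 20, 25]])

row_totals = observed.sum(axis=1)    # n_r for each row
col_totals = observed.sum(axis=0)    # n_c for each column
grand_total = observed.sum()         # n

# Expected frequency for each cell: E_r,c = (n_r * n_c) / n
expected = np.outer(row_totals, col_totals) / grand_total

# The chi-square approximation is only reliable when every
# expected frequency is at least 5.
print(expected)
print("All expected frequencies >= 5:", bool((expected >= 5).all()))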
The following are properties of the test for independence:
 
- The data are the observed frequencies.
- The data are arranged into a contingency table.
- The degrees of freedom are the degrees of freedom for the row variable times the degrees of freedom for the column variable. It is not one less than the sample size; it is the product of the two degrees of freedom.
- It is always a right-tail test.
- It has a chi-square distribution.
- The expected value for a cell is computed by taking the row total times the column total and dividing by the grand total.
- The value of the test statistic doesn't change if the orders of the rows and columns are interchanged (transpose of the matrix).

This approach consists of four steps: (1) state the hypotheses, (2) formulate an analysis plan, (3) analyze sample data, and (4) interpret results.

State the Hypotheses

Suppose that Variable A has r levels and Variable B has c levels. The null hypothesis states that knowing the level of Variable A does not help you predict the level of Variable B. That is, the variables are independent.
 
H0: Variable A and Variable B are independent.
Ha: Variable A and Variable B are not independent.

The alternative hypothesis is that knowing the level of Variable A can help you predict the level of Variable B.
Note: Support for the alternative hypothesis suggests that the variables are related; but the relationship is not necessarily causal, in the sense that one variable "causes" the other.
Formulate an Analysis Plan

The analysis plan describes how to use sample data to accept or reject the null hypothesis. The plan should specify the following elements.

- Significance level. Often, researchers choose significance levels equal to 0.01, 0.05, or 0.10; but any value between 0 and 1 can be used.
- Test method. Use the chi-square test for independence to determine whether there is a significant relationship between two categorical variables.

Analyze Sample Data

Using sample data, find the degrees of freedom, expected frequencies, test statistic, and the P-value associated with the test statistic.
 
- Degrees of freedom. The degrees of freedom (v) is equal to:

  v = (r - 1) * (c - 1)

  where r is the number of rows and c is the number of columns. For example, a table with 3 rows and 4 columns has v = (3 - 1) * (4 - 1) = 6 degrees of freedom.
 
- Expected frequencies. The expected frequency counts are computed separately for each level of one categorical variable at each level of the other categorical variable. Compute r * c expected frequencies, according to the following formula:

  E_r,c = (n_r * n_c) / n

  where E_r,c is the expected frequency count for level r of Variable A and level c of Variable B, n_r is the total number of sample observations at level r of Variable A, n_c is the total number of sample observations at level c of Variable B, and n is the total sample size.
 
- Test statistic. The test statistic is a chi-square random variable (Χ²) defined by the following equation:

  Χ² = Σ [ (O_r,c - E_r,c)² / E_r,c ]

  where O_r,c is the observed frequency count for level r of Variable A and level c of Variable B, and E_r,c is the corresponding expected frequency count. A worked sketch of this computation follows the list.
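To make the three quantities above concrete, here is a minimal worked example in Python. It is not part of the original report: the table values and variable names are hypothetical, and scipy.stats.chi2_contingency is used as one convenient way to obtain the statistic, P-value, degrees of freedom, and expected frequencies in a single call, so the results can be checked against the hand formulas above.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2 x 3 contingency table of observed frequencies
# (rows = levels of Variable A, columns = levels of Variable B).
observed = np.array([[20, 30, 25],
                     [30, 20, 25]])

# Returns the chi-square statistic, the P-value, the degrees of
# freedom v = (r - 1) * (c - 1), and the expected frequencies
# E_r,c = (n_r * n_c) / n.
chi2, p_value, dof, expected = chi2_contingency(observed)

print("Degrees of freedom:", dof)            # (2 - 1) * (3 - 1) = 2
print("Expected frequencies:\n", expected)
print("Chi-square statistic:", chi2)
print("P-value:", p_value)

# Right-tail decision rule at an (assumed) 0.05 significance level.
alpha = 0.05
if p_value < alpha:
    print("Reject H0: Variable A and Variable B appear to be dependent.")
else:
    print("Fail to reject H0: no evidence against independence.")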
