
Dependent Means

There are two possible cases when testing two population means, the dependent case and the independent
case. Most books treat the independent case first, but I'm putting the dependent case first because it follows
immediately from the test for a single population mean in the previous chapter.

The Mean of the Difference:

The idea with the dependent case is to create a new variable, D, which is the difference between the paired
values. You will then be testing the mean of this new variable.

Here are the steps for carrying out the hypothesis test:

1. Write down the original claim in simple terms. For example: After > Before.
2. Move everything to one side: After - Before > 0.
3. Call the difference you have on the left side D: D = After - Before > 0.
4. Convert to proper notation: the claim about D becomes a claim about its mean, μD > 0, so the hypotheses are H0: μD = 0 and H1: μD > 0.
5. Compute the new variable D and be sure to follow the order you have defined in step 3. Do not
simply take the smaller away from the larger. From this point, you can think of having a new set of
values. Technically, they are called D, but you can think of them as x. The original values from the
two samples can be discarded.
6. Find the mean and standard deviation of the variable D. Use these as the values in the t-test from
chapter 9.
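The steps above can be sketched in Python. The before/after values here are made up purely for illustration:

```python
import math

# Hypothetical paired data (assumed values): "before" and "after"
# scores for the same five subjects.
before = [72, 80, 65, 78, 70]
after = [75, 84, 66, 81, 74]

# Step 5: compute D = After - Before, keeping the order from step 3.
d = [a - b for a, b in zip(after, before)]

# Step 6: mean and sample standard deviation of D.
n = len(d)
d_bar = sum(d) / n
s_d = math.sqrt(sum((x - d_bar) ** 2 for x in d) / (n - 1))

# One-sample t statistic against mu_D = 0, with n - 1 degrees of freedom,
# exactly as in the single-mean t-test from chapter 9.
t = d_bar / (s_d / math.sqrt(n))
print(n - 1, d_bar, round(t, 3))  # → 4 3.0 5.477
```

Once D is computed, the original two columns play no further role; everything runs through the single variable D.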

Independent Means

Sums and Differences of Independent Variables

Independent variables can be combined to form new variables. The mean and variance of the combination
can be found from the means and the variances of the original variables.

Combination of Variables In English (Melodic Mathematics)


The mean of a sum is the sum of the means.

The mean of a difference is the difference of the means.

The variance of a sum is the sum of the variances.

The variance of a difference is the sum of the variances.
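A quick simulation can check all four rules at once. The two uniform variables below are assumed examples, not anything from the text:

```python
import random

random.seed(0)
N = 200_000

# Two independent hypothetical variables: X ~ Uniform(0, 1), Y ~ Uniform(0, 3).
x = [random.uniform(0, 1) for _ in range(N)]
y = [random.uniform(0, 3) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / (len(v) - 1)

s = [a + b for a, b in zip(x, y)]  # the sum X + Y
d = [a - b for a, b in zip(x, y)]  # the difference X - Y

# Mean of a sum/difference equals the sum/difference of the means (exactly).
print(round(mean(s), 3), round(mean(x) + mean(y), 3))
# Variance of the sum AND of the difference both approximate the
# sum of the variances (up to sampling noise, since X and Y are independent).
print(round(var(s), 3), round(var(d), 3), round(var(x) + var(y), 3))
```

The mean identities hold exactly for any two variables; the variance identities require independence, which the simulation supplies by drawing X and Y separately.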

The Difference of the Means:


Since we are combining two variables by subtraction, the important rules from the table above are that the
mean of the difference is the difference of the means and the variance of the difference is the sum of the
variances.

It is important to note that the variance of the difference is the sum of the variances; it is not true that the standard deviation of the difference is the sum of the standard deviations. When we go to find the standard error, we must combine variances to do so. You may also wonder why the variance of the difference is the sum of the variances rather than the difference of the variances. In the difference, the second variable carries a coefficient of -1, and variance involves squaring, so that -1 becomes (-1)^2 = +1 and its variance is added, not subtracted. Besides, a variance can never be negative, and a difference of variances could come out negative.
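A short sketch of the point, using made-up summary statistics for two independent samples:

```python
import math

# Hypothetical summary statistics (assumed values for illustration):
n1, s1 = 30, 4.0  # sample 1: size and standard deviation
n2, s2 = 25, 5.0  # sample 2: size and standard deviation

# Combine VARIANCES, not standard deviations: the variance of the
# difference of the sample means is s1^2/n1 + s2^2/n2, so the
# standard error is the square root of that sum.
se_correct = math.sqrt(s1**2 / n1 + s2**2 / n2)

# The tempting-but-wrong version adds the standard errors directly.
se_wrong = s1 / math.sqrt(n1) + s2 / math.sqrt(n2)

print(round(se_correct, 4), round(se_wrong, 4))  # → 1.2383 1.7303
```

Adding standard deviations always overstates the standard error, because sqrt(a + b) is less than sqrt(a) + sqrt(b) whenever both are positive.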
