https://www.scribd.com/doc/67641899/SummaryStatistics
04/22/2012
Sample: a subset of a population, containing the objects or outcomes that are actually observed.
Simple Random Sample: a sample of size n chosen by a method in which each collection of n population items is equally likely to comprise the sample (i.e., a lottery).
*Sample items are independent if knowing the values of some of the items does not help predict the values of the others.
Sample Mean (gives an indication of the center of the data): x̄ = (1/n) ∑ xᵢ
Sample Standard Deviation (gives an indication of spread): s = √[ (1/(n−1)) ∑ (xᵢ − x̄)² ] = √[ (1/(n−1)) (∑ xᵢ² − n x̄²) ]
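A minimal sketch of the sample mean and sample standard deviation formulas above, using only Python's standard library (the data values here are made up for illustration):

```python
import math

data = [4.2, 3.9, 5.1, 4.8, 4.4, 5.0]  # hypothetical sample values
n = len(data)

# Sample mean: x̄ = (1/n) ∑ xᵢ
mean = sum(data) / n

# Sample variance with the n−1 divisor: s² = ∑(xᵢ − x̄)² / (n − 1)
var = sum((x - mean) ** 2 for x in data) / (n - 1)

# Sample standard deviation: s = √s²
sd = math.sqrt(var)
print(mean, sd)
```

Note the n−1 divisor: this is the sample (not population) standard deviation, matching the formula in the notes.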
Histograms: skewed right if the tail is to the right (mean > median); skewed left if the tail is to the left (mean < median).
Anatomy of a box plot
1. Compute the median and the 1st quartile Q1 (median of the 1st half) and 3rd quartile Q3 (median of the 2nd half). Q3 − Q1 is the interquartile range (IQR).
2. Draw fences at Q1 − 1.5 IQR and Q3 + 1.5 IQR. Extend whiskers from the box to the fences.
3. Plot outliers individually (anything outside the upper and lower fences).
Probability
The set of all possible outcomes of an experiment is called the sample space for the experiment. A subset of a sample space is called an event.
Combining events
The union of two events A and B, denoted A ∪ B, is the set of outcomes that belong either to A, to B, or to both; A ∪ B means "A or B," the event occurring when either A or B (or both) occurs.
The intersection of two events A and B, denoted A ∩ B, is the set of outcomes that belong to both A and B; A ∩ B means "A and B," the event occurring when both A and B occur.
The complement of an event A, denoted Ac, is the set of outcomes that do not belong to A, meaning "not A," the event occurring when A does not occur.
*Two events A and B are mutually exclusive if they have no outcomes in common.
Axioms of Probability
1. Let S be a sample space. Then P(S) = 1.
2. For any event A, 0 ≤ P(A) ≤ 1.
3. If A and B are mutually exclusive events, then P(A ∪ B) = P(A) + P(B).
4. For any event A, P(Ac) = 1 − P(A).
5. Let Ø denote the empty set; then P(Ø) = 0.
6. P(A ∪ B) = P(A) + P(B) − P(A ∩ B). Note that if A and B are mutually exclusive, P(A ∩ B) = 0.
*A probability that is based on a part of a sample space is called a conditional probability. An unconditional probability is one based on the entire sample space. Let A and B be events with P(B) ≠ 0. The conditional probability of A given B is
P(A | B) = P(A ∩ B) / P(B)
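The conditional-probability definition and the union axiom can be checked by direct enumeration. A small sketch with two fair dice (the choice of events is a made-up example):

```python
from fractions import Fraction

# Sample space: all ordered outcomes of rolling two fair dice.
space = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {o for o in space if o[0] + o[1] == 8}  # event A: the sum is 8
B = {o for o in space if o[0] % 2 == 0}     # event B: first die is even

def P(event):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(space))

# Axiom 6: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)

# Definition of conditional probability: P(A ∩ B) / P(B)
cond = P(A & B) / P(B)
print(cond)  # → 1/6
```

Here A ∩ B = {(2,6), (4,4), (6,2)}, so the conditional probability is 3/18 = 1/6.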
Two events A and B are independent if the probability of each event remains the same whether or not the other occurs. If P(A) ≠ 0 and P(B) ≠ 0, then A and B are independent if P(B | A) = P(B) or, equivalently, if P(A | B) = P(A). If either P(A) = 0 or P(B) = 0, then A and B are independent.
The Multiplication Rule:
• If A and B are two events with P(B) ≠ 0, then P(A ∩ B) = P(B) P(A | B).
• If A and B are two events with P(A) ≠ 0, then P(A ∩ B) = P(A) P(B | A).
• When two events are independent, P(A | B) = P(A) and P(B | A) = P(B), so the above condense to P(A ∩ B) = P(A) P(B).
Law of Total Probability: If A1, …, An are mutually exclusive and exhaustive events (meaning their union covers the whole sample space), and B is any event, then
P(B) = P(A1 ∩ B) + … + P(An ∩ B).
Equivalently, if P(Ai) ≠ 0 for each Ai, then
P(B) = P(B | A1) P(A1) + … + P(B | An) P(An).
Bayes' Rule: Let A and B be events with P(A) ≠ 0, P(Ac) ≠ 0, and P(B) ≠ 0. Then
P(A | B) = P(B | A) P(A) / [ P(B | A) P(A) + P(B | Ac) P(Ac) ]
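The Law of Total Probability and Bayes' Rule can be sketched together on a diagnostic-test example. All of the numbers below are hypothetical, chosen only to exercise the formulas:

```python
# Hypothetical diagnostic-test numbers (made up for illustration).
p_D = 0.01                # P(D): prevalence of the condition
p_pos_given_D = 0.95      # P(+ | D): sensitivity
p_pos_given_not_D = 0.10  # P(+ | Dc): false-positive rate

# Law of Total Probability: P(+) = P(+|D)P(D) + P(+|Dc)P(Dc)
p_pos = p_pos_given_D * p_D + p_pos_given_not_D * (1 - p_D)

# Bayes' Rule: P(D | +) = P(+|D)P(D) / P(+)
p_D_given_pos = p_pos_given_D * p_D / p_pos
print(round(p_pos, 4), round(p_D_given_pos, 4))
```

Note how a rare condition keeps P(D | +) small even with a sensitive test: the denominator is dominated by the false positives from the much larger Dc group.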
Random Variables
An assignment of a numerical value to an outcome of an experiment is called a random variable. Random variables can be discrete (coming from a finite set with gaps between the ordered values) or continuous (contained on an interval).
Discrete Random Variables
The probability mass function of a discrete random variable X is the function p(x) = P(X = x). The PMF is sometimes called the probability distribution, and ∑ p(x) = 1, where the sum is over all possible values of x. A function called the cumulative distribution function specifies the probability that a random variable is less than or equal to a given value: F(x) = P(X ≤ x) = ∑_{t ≤ x} p(t).
Let X be a discrete random variable with probability mass function p(x) = P(X = x). The mean (or expected value E[X]) of X is given by
μX = E[X] = ∑ x p(x),
where the sum is over all possible values of x. The population mean of a discrete random variable can be thought of as the mean of a hypothetical sample that follows the probability distribution perfectly. The variance of X (V[X]) is given by
σX² = V[X] = ∑ (x − μX)² p(x) = ∑ x² p(x) − μX².
The population standard deviation is the square root of the variance: σX = √V[X].
Continuous Random Variables
A random variable is continuous if its probabilities are given by areas under a curve. The curve is called the probability density function for the random variable. The pdf is sometimes called the probability distribution. Note that ∫_{−∞}^{∞} f(x) dx = 1.
Let X be a CRV with probability density function f(x), and let a and b be any two numbers with a < b. Then
P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a < X < b) = ∫_a^b f(x) dx.
In addition, P(X ≤ b) = P(X < b) = ∫_{−∞}^b f(x) dx and P(X ≥ a) = P(X > a) = ∫_a^∞ f(x) dx.
The cumulative distribution function of X is F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt.
The mean of X (E[X], or center of mass) is given by μX = ∫_{−∞}^{∞} x f(x) dx.
The variance of X (V[X], or the moment of inertia) is given by σX² = ∫_{−∞}^{∞} (x − μX)² f(x) dx = ∫_{−∞}^{∞} x² f(x) dx − μX², and the standard deviation is σX = √V[X].
If p is a number between 0 and 100, the pth percentile of X is the point xp that solves the equation F(xp) = ∫_{−∞}^{xp} f(x) dx = p/100. The median of X is the 50th percentile.
Linear Functions of Random Variables
If X is a random variable and b is a constant, then μ_{X+b} = μX + b and σ_{X+b} = σX (the spread does not change).
If X is a random variable and a is a constant, then μ_{aX} = a μX and σ_{aX} = |a| σX.
If X and Y are random variables and a and b are constants, then μ_{aX+bY} = a μX + b μY.
If X and Y are independent random variables, then σ²_{aX+bY} = a² σX² + b² σY².
If X1, X2, X3, …, Xn are independent random variables, then μ_{X1+…+Xn} = μX1 + … + μXn and σ²_{X1+…+Xn} = σX1² + … + σXn².
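The discrete mean and variance formulas, and the linear-function rules μ_{aX+b} = aμX + b and σ²_{aX+b} = a²σX², can be sketched with a small PMF (the distribution below is made up for illustration):

```python
import math

# Hypothetical PMF of a discrete random variable X.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}
assert abs(sum(pmf.values()) - 1.0) < 1e-12  # a PMF must sum to 1

# Mean: μX = ∑ x p(x)
mu = sum(x * p for x, p in pmf.items())

# Variance (shortcut form): σX² = ∑ x² p(x) − μX²
var = sum(x * x * p for x, p in pmf.items()) - mu ** 2
sd = math.sqrt(var)

# Linear function Y = aX + b: its PMF just relabels the values of X.
a, b = -2.0, 5.0
pmf_Y = {a * x + b: p for x, p in pmf.items()}
mu_Y = sum(y * p for y, p in pmf_Y.items())
var_Y = sum(y * y * p for y, p in pmf_Y.items()) - mu_Y ** 2
print(mu, var, mu_Y, var_Y)
```

Computing μY and σY² directly from the relabeled PMF and comparing against aμX + b and a²σX² confirms the linear-function rules for this example.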