
Notation

n = sample size
x̄ = sample mean
s = sample standard deviation
Qj = jth quartile
N = population size
μ = population mean

Chapter 3

Descriptive Measures

Sample mean: x̄ = Σxi/n

Population mean (mean of a variable): μ = Σxi/N

Sample standard deviation:
s = √[Σ(xi - x̄)²/(n - 1)]  or  s = √[(Σxi² - (Σxi)²/n)/(n - 1)]

Population standard deviation (standard deviation of a variable):
σ = √[Σ(xi - μ)²/N]  or  σ = √[(Σxi²/N) - μ²]

Standardized variable: z = (x - μ)/σ
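As a quick check on the Chapter 3 formulas, here is a minimal Python sketch (the function names are my own) that computes the sample mean and the sample standard deviation by both the defining formula and the computing formula; the two forms are algebraically identical and should always agree:

```python
import math

def sample_mean(xs):
    # x̄ = Σxi / n
    return sum(xs) / len(xs)

def sample_stdev(xs):
    # defining formula: s = √[Σ(xi - x̄)² / (n - 1)]
    n = len(xs)
    xbar = sample_mean(xs)
    return math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))

def sample_stdev_computing(xs):
    # computing formula: s = √[(Σxi² - (Σxi)²/n) / (n - 1)]
    n = len(xs)
    return math.sqrt((sum(x * x for x in xs) - sum(xs) ** 2 / n) / (n - 1))

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(sample_mean(data))             # 5.0
print(sample_stdev(data))
print(sample_stdev_computing(data))  # agrees with the defining formula
```

The computing formula needs only the running totals Σxi and Σxi², which is why formula cards list it alongside the defining form.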

Chapter 4

Probability Concepts

Probability for equally likely outcomes:
P(E) = f/N
where f denotes the number of ways event E can occur and N denotes the total number of outcomes possible.

Special addition rule (A, B, C, ... mutually exclusive):
P(A or B or C or ...) = P(A) + P(B) + P(C) + ...

Complementation rule: P(E) = 1 - P(not E)

General addition rule: P(A or B) = P(A) + P(B) - P(A & B)

Conditional probability rule: P(B | A) = P(A & B)/P(A)

Special multiplication rule (A, B, C, ... independent):
P(A & B & C & ...) = P(A) · P(B) · P(C) · ...

Bayes's rule (A1, A2, ..., Ak mutually exclusive and exhaustive):
P(Ai | B) = P(Ai) · P(B | Ai) / Σ_{j=1}^{k} [P(Aj) · P(B | Aj)]

Factorial: k! = k(k - 1) ··· 2 · 1

Permutations rule: mPr = m!/(m - r)!

Combinations rule: mCr = m!/[r!(m - r)!]

Number of possible samples of size n from a population of size N: N!/[n!(N - n)!]

Chapter 5

Mean of a discrete random variable X: μ = Σ x P(X = x)

Standard deviation of a discrete random variable X:
σ = √[Σ(x - μ)² P(X = x)]  or  σ = √[Σ x² P(X = x) - μ²]

Binomial coefficient: (n choose x) = n!/[x!(n - x)!]

Binomial probability formula:
P(X = x) = (n choose x) · p^x · (1 - p)^(n - x)
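The binomial probability formula translates directly to Python using the standard library's `math.comb`; this sketch (names mine) evaluates one probability and checks that the pmf sums to 1 over x = 0, ..., n:

```python
from math import comb

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) · p^x · (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# P(X = 2) heads in n = 4 fair-coin tosses: C(4, 2)(0.5)^4 = 6/16 = 0.375
print(binomial_pmf(2, 4, 0.5))
# the probabilities over all possible x must total 1
print(sum(binomial_pmf(x, 4, 0.5) for x in range(5)))
```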

Mean of a binomial random variable: μ = np

Standard deviation of a binomial random variable: σ = √[np(1 - p)]

Poisson probability formula: P(X = x) = e^(-λ) · λ^x / x!

Mean of a Poisson random variable: μ = λ

Standard deviation of a Poisson random variable: σ = √λ

Chapter 6

Standardized variable: z = (x - μ)/σ

Additional notation:

σ = population standard deviation
p = population proportion
p̂ = sample proportion
d = paired difference
O = observed frequency
E = expected frequency

Chapter 7

Standardized version of x̄:
z = (x̄ - μ)/(σ/√n)

Chapter 8

z-interval for μ (σ known, normal population or large sample):
x̄ ± z_(α/2) · σ/√n

Sample size for estimating μ:
n = (z_(α/2) · σ/E)²
rounded up to the nearest whole number

Studentized version of x̄:
t = (x̄ - μ)/(s/√n)
with df = n - 1

t-interval for μ (σ unknown, normal population or large sample):
x̄ ± t_(α/2) · s/√n
with df = n - 1
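A minimal sketch of the t-interval, on hypothetical data; the critical value t_(α/2) is hard-coded here from a t table (α = 0.05, df = 9) rather than computed, since the standard library has no t quantile function:

```python
from math import sqrt
from statistics import mean, stdev

data = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 9.7, 10.4, 10.1]
n = len(data)
xbar, s = mean(data), stdev(data)

t_crit = 2.262  # t_(α/2) for α = 0.05, df = n - 1 = 9, from a t table
half_width = t_crit * s / sqrt(n)  # t_(α/2) · s/√n
print(f"{xbar - half_width:.3f} to {xbar + half_width:.3f}")
```

In practice the critical value would come from a table or a statistics library (e.g. `scipy.stats.t.ppf`).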

Chapter 9

z-test statistic for H0: μ = μ0 (σ known, normal population or large sample):
z = (x̄ - μ0)/(σ/√n)

t-test statistic for H0: μ = μ0 (σ unknown, normal population or large sample):
t = (x̄ - μ0)/(s/√n)
with df = n - 1

Wilcoxon signed-rank test statistic for H0: μ = μ0 (symmetric population):
W = sum of the positive ranks
W_(1-A) = n(n + 1)/2 - W_A
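The one-sample t-test statistic is a one-line computation; this sketch (names mine, data hypothetical) tests H0: μ = 10.0, with the result to be compared against a t table at df = n - 1:

```python
from math import sqrt
from statistics import mean, stdev

def t_statistic(data, mu0):
    # t = (x̄ - μ0)/(s/√n), df = n - 1
    n = len(data)
    return (mean(data) - mu0) / (stdev(data) / sqrt(n))

data = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 9.7, 10.4, 10.1]
print(t_statistic(data, 10.0))  # compare against t_(α/2) with df = 9
```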

Chapter 10

Pooled sample standard deviation:
s_p = √[((n1 - 1)s1² + (n2 - 1)s2²)/(n1 + n2 - 2)]

Pooled t-test statistic for H0: μ1 = μ2 (independent samples, normal populations or large samples, and equal population standard deviations):
t = (x̄1 - x̄2)/(s_p √[(1/n1) + (1/n2)])
with df = n1 + n2 - 2

Pooled t-interval for μ1 - μ2 (independent samples, normal populations or large samples, and equal population standard deviations):
(x̄1 - x̄2) ± t_(α/2) · s_p √[(1/n1) + (1/n2)]
with df = n1 + n2 - 2

Degrees of freedom for nonpooled t-procedures (independent samples, and normal populations or large samples):
Δ = [(s1²/n1) + (s2²/n2)]² / [(s1²/n1)²/(n1 - 1) + (s2²/n2)²/(n2 - 1)]
rounded down to the nearest whole number

Nonpooled t-test statistic for H0: μ1 = μ2 (independent samples, and normal populations or large samples):
t = (x̄1 - x̄2)/√[(s1²/n1) + (s2²/n2)]
with df = Δ

Nonpooled t-interval for μ1 - μ2 (independent samples, and normal populations or large samples):
(x̄1 - x̄2) ± t_(α/2) · √[(s1²/n1) + (s2²/n2)]
with df = Δ

Mann–Whitney test statistic for H0: μ1 = μ2 (independent samples and same-shape populations):
M = sum of the ranks for sample data from Population 1
M_(1-A) = n1(n1 + n2 + 1) - M_A

Paired t-test statistic for H0: μ1 = μ2 (paired sample, and normal differences or large sample):
t = d̄/(s_d/√n)
with df = n - 1

Paired t-interval for μ1 - μ2 (paired sample, and normal differences or large sample):
d̄ ± t_(α/2) · s_d/√n
with df = n - 1
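The nonpooled (Welch-type) statistic and its degrees-of-freedom formula Δ translate directly to code; the summary statistics below are hypothetical, and the function names are my own:

```python
from math import sqrt

def welch_df(s1, n1, s2, n2):
    # Δ = (s1²/n1 + s2²/n2)² / [(s1²/n1)²/(n1-1) + (s2²/n2)²/(n2-1)],
    # rounded down to the nearest whole number
    v1, v2 = s1**2 / n1, s2**2 / n2
    return int((v1 + v2) ** 2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1)))

def nonpooled_t(x1, s1, n1, x2, s2, n2):
    # t = (x̄1 - x̄2) / √(s1²/n1 + s2²/n2)
    return (x1 - x2) / sqrt(s1**2 / n1 + s2**2 / n2)

# hypothetical summary statistics for two independent samples
print(nonpooled_t(20.0, 3.0, 15, 18.0, 4.0, 12))
print(welch_df(3.0, 15, 4.0, 12))  # Δ is well below n1 + n2 - 2 = 25
```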

Paired Wilcoxon signed-rank test statistic for H0: μ1 = μ2 (paired sample and symmetric differences):
W = sum of the positive ranks

Chapter 11

χ²-test statistic for H0: σ = σ0 (normal population):
χ² = (n - 1)s²/σ0²
with df = n - 1

χ²-interval for σ (normal population):
s · √[(n - 1)/χ²_(α/2)]  to  s · √[(n - 1)/χ²_(1-α/2)]
with df = n - 1

F-test statistic for H0: σ1 = σ2 (independent samples and normal populations):
F = s1²/s2²

F-interval for σ1/σ2 (independent samples and normal populations):
(1/√F_(α/2)) · (s1/s2)  to  (1/√F_(1-α/2)) · (s1/s2)
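Both Chapter 11 test statistics are simple ratios; a sketch on hypothetical numbers (function names mine), with the results to be compared against χ² and F tables respectively:

```python
def chi2_stat(n, s, sigma0):
    # χ² = (n - 1)s²/σ0², df = n - 1
    return (n - 1) * s**2 / sigma0**2

def f_stat(s1, s2):
    # F = s1²/s2²
    return s1**2 / s2**2

print(chi2_stat(25, 2.5, 2.0))  # 37.5, compare to χ² with df = 24
print(f_stat(3.0, 2.0))         # 2.25
```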

Chapter 12

Sample proportion:
p̂ = x/n
where x denotes the number of members in the sample that have the specified attribute.

z-interval for p (x and n - x are both 5 or greater):
p̂ ± z_(α/2) · √[p̂(1 - p̂)/n]

Margin of error for the estimate of p:
E = z_(α/2) · √[p̂(1 - p̂)/n]

Sample size for estimating p:
n = 0.25 (z_(α/2)/E)²  or  n = p̂_g(1 - p̂_g)(z_(α/2)/E)²
rounded up to the nearest whole number (g = educated guess)

z-test statistic for H0: p = p0 (both np0 and n(1 - p0) are 5 or greater):
z = (p̂ - p0)/√[p0(1 - p0)/n]

Pooled sample proportion:
p̂_p = (x1 + x2)/(n1 + n2)

z-test statistic for H0: p1 = p2 (independent samples; x1, n1 - x1, x2, n2 - x2 are all 5 or greater):
z = (p̂1 - p̂2)/(√[p̂_p(1 - p̂_p)] · √[(1/n1) + (1/n2)])

z-interval for p1 - p2 (independent samples; x1, n1 - x1, x2, n2 - x2 are all 5 or greater):
(p̂1 - p̂2) ± z_(α/2) · √[p̂1(1 - p̂1)/n1 + p̂2(1 - p̂2)/n2]

Margin of error for the estimate of p1 - p2:
E = z_(α/2) · √[p̂1(1 - p̂1)/n1 + p̂2(1 - p̂2)/n2]

Sample size for estimating p1 - p2:
n1 = n2 = 0.5 (z_(α/2)/E)²  or  n1 = n2 = [p̂1g(1 - p̂1g) + p̂2g(1 - p̂2g)](z_(α/2)/E)²
rounded up to the nearest whole number (g = educated guess)

Chapter 13
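The sample-size formula for estimating p is a common planning calculation; this sketch (names mine) uses z ≈ 1.96 for 95% confidence and a 0.03 margin of error, with and without an educated guess for p:

```python
from math import ceil

def sample_size_for_p(E, z, p_guess=None):
    # n = 0.25 (z_(α/2)/E)² with no guess, else n = p̂g(1 - p̂g)(z_(α/2)/E)²,
    # rounded up to the nearest whole number
    factor = 0.25 if p_guess is None else p_guess * (1 - p_guess)
    return ceil(factor * (z / E) ** 2)

print(sample_size_for_p(0.03, 1.96))       # conservative: 1068
print(sample_size_for_p(0.03, 1.96, 0.3))  # smaller, given a guess of 0.3
```

The 0.25 factor is the maximum of p(1 - p), so the first form can never under-provision regardless of the true proportion.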

Chi-Square Procedures

Expected frequencies for a chi-square goodness-of-fit test: E = np

Test statistic for a chi-square goodness-of-fit test:
χ² = Σ(O - E)²/E
with df = c - 1, where c is the number of possible values for the variable under consideration.

Expected frequencies for a chi-square independence test or a chi-square homogeneity test:
E = R · C / n
where R = row total and C = column total.

Test statistic for a chi-square independence test:
χ² = Σ(O - E)²/E
with df = (r - 1)(c - 1), where r and c are the number of possible values for the two variables under consideration.

Test statistic for a chi-square homogeneity test:
χ² = Σ(O - E)²/E
with df = (r - 1)(c - 1), where r is the number of populations and c is the number of possible values for the variable under consideration.
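The goodness-of-fit statistic is a direct sum; this sketch (names mine) tests a hypothetical die rolled 60 times, where E = np gives 10 expected counts per face:

```python
def chi_square_gof(observed, expected):
    # χ² = Σ(O - E)²/E, df = c - 1
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# a die rolled 60 times; under fairness, E = np = 60 · (1/6) = 10 per face
observed = [8, 9, 12, 11, 10, 10]
expected = [10.0] * 6
print(chi_square_gof(observed, expected))  # 1.0, compare to χ² with df = 5
```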

Chapter 14

Sxx = Σ(xi - x̄)² = Σxi² - (Σxi)²/n
Sxy = Σ(xi - x̄)(yi - ȳ) = Σxiyi - (Σxi)(Σyi)/n
Syy = Σ(yi - ȳ)² = Σyi² - (Σyi)²/n

Regression equation: ŷ = b0 + b1x, where
b1 = Sxy/Sxx  and  b0 = (1/n)(Σyi - b1Σxi) = ȳ - b1x̄

Total sum of squares: SST = Σ(yi - ȳ)² = Syy

Regression sum of squares: SSR = Σ(ŷi - ȳ)² = Sxy²/Sxx

Error sum of squares: SSE = Σ(yi - ŷi)² = Syy - Sxy²/Sxx
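The least-squares coefficients b0 and b1 can be computed straight from the defining sums; this sketch (names mine) recovers the exact coefficients from points that lie on the line y = 2x + 1:

```python
def least_squares(xs, ys):
    # b1 = Sxy/Sxx,  b0 = ȳ - b1·x̄
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    return ybar - b1 * xbar, b1  # (b0, b1)

# points on the exact line y = 2x + 1 recover the coefficients
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]
print(least_squares(xs, ys))  # (1.0, 2.0)
```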

Coefficient of determination: r² = SSR/SST

Linear correlation coefficient:
r = [1/(n - 1)] Σ(xi - x̄)(yi - ȳ)/(sx · sy)  or  r = Sxy/√(Sxx · Syy)

Chapter 15

Standard error of the estimate: se = √[SSE/(n - 2)]

Test statistic for H0: β1 = 0:
t = b1/(se/√Sxx)
with df = n - 2

Confidence interval for β1:
b1 ± t_(α/2) · se/√Sxx
with df = n - 2

Confidence interval for the conditional mean of the response variable corresponding to xp:
ŷp ± t_(α/2) · se √[1/n + (xp - Σxi/n)²/Sxx]
with df = n - 2

Prediction interval for an observed value of the response variable corresponding to xp:
ŷp ± t_(α/2) · se √[1 + 1/n + (xp - Σxi/n)²/Sxx]
with df = n - 2
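The two forms of the linear correlation coefficient given above are equivalent; this sketch (names mine, data hypothetical) computes both and confirms they agree:

```python
from math import sqrt
from statistics import stdev

def corr(xs, ys):
    # r = Sxy / √(Sxx · Syy)
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    syy = sum((y - ybar) ** 2 for y in ys)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    return sxy / sqrt(sxx * syy)

def corr_defining(xs, ys):
    # r = [1/(n - 1)] Σ(xi - x̄)(yi - ȳ) / (sx · sy)
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    return sxy / ((n - 1) * stdev(xs) * stdev(ys))

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]
print(corr(xs, ys), corr_defining(xs, ys))  # the two forms agree
```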

Test statistic for H0: ρ = 0:
t = r/√[(1 - r²)/(n - 2)]
with df = n - 2

Test statistic for a correlation test for normality:
Rp = Σ xi wi / √(Sxx · Σwi²)
where x and w denote observations of the variable and the corresponding normal scores, respectively.

Chapter 16

Notation in one-way ANOVA:

k = number of populations
n = total number of observations
x̄ = mean of all n observations
nj = size of sample from Population j
x̄j = mean of sample from Population j
sj² = variance of sample from Population j
Tj = sum of sample data from Population j

Defining formulas for sums of squares in one-way ANOVA:
SST = Σ(xi - x̄)²
SSTR = Σ nj (x̄j - x̄)²
SSE = Σ(nj - 1)sj²

Computing formulas for sums of squares in one-way ANOVA:
SST = Σxi² - (Σxi)²/n
SSTR = Σ(Tj²/nj) - (Σxi)²/n
SSE = SST - SSTR

Mean squares in one-way ANOVA:
MSTR = SSTR/(k - 1)
MSE = SSE/(n - k)

Test statistic for one-way ANOVA (independent samples, normal populations, and equal population standard deviations):
F = MSTR/MSE
with df = (k - 1, n - k)

Confidence interval for μi - μj in the Tukey multiple-comparison method (independent samples, normal populations, and equal population standard deviations):
(x̄i - x̄j) ± (q_α/√2) · s√[(1/ni) + (1/nj)]
where s = √MSE and q_α is based on the parameters k and n - k.

Test statistic for a Kruskal–Wallis test (independent samples, same-shape populations, all sample sizes 5 or greater):
H = SSTR/[SST/(n - 1)]  or  H = [12/(n(n + 1))] Σ_{j=1}^{k} (Rj²/nj) - 3(n + 1)
where SSTR and SST are computed for the ranks of the data, and Rj denotes the sum of the ranks for the sample data from Population j. H has approximately a chi-square distribution with df = k - 1.
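The one-way ANOVA F statistic follows mechanically from the defining formulas for SSTR and SSE; this sketch (names mine, data hypothetical) computes it for three small groups:

```python
def one_way_anova_f(samples):
    # F = MSTR/MSE with df = (k - 1, n - k), built from the defining formulas
    k = len(samples)
    n = sum(len(s) for s in samples)
    grand = sum(sum(s) for s in samples) / n            # x̄, mean of all n observations
    sstr = sum(len(s) * (sum(s) / len(s) - grand) ** 2  # SSTR = Σ nj (x̄j - x̄)²
               for s in samples)
    sse = sum(sum((x - sum(s) / len(s)) ** 2 for x in s)  # SSE = Σ Σ (x - x̄j)²
              for s in samples)
    mstr = sstr / (k - 1)
    mse = sse / (n - k)
    return mstr / mse

samples = [[3.0, 4.0, 5.0], [6.0, 7.0, 8.0], [9.0, 10.0, 11.0]]
print(one_way_anova_f(samples))  # compare against F_α with df = (2, 6)
```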
