Principal Formulas in Part I

Notation (Chapter 2)

(Measured value of x) = $x_{\rm best} \pm \delta x$, (p. 13)
where $x_{\rm best}$ = best estimate for x, and $\delta x$ = uncertainty or error in the measurement.

Fractional uncertainty = $\delta x / |x_{\rm best}|$. (p. 28)

Propagation of Uncertainties (Chapter 3)

If various quantities x, ..., w are measured with small uncertainties $\delta x, \ldots, \delta w$, and the measured values are used to calculate some quantity q, then the uncertainties in x, ..., w cause an uncertainty in q as follows:

If q is the sum and difference, $q = x + \cdots + z - (u + \cdots + w)$, then
$\delta q = \sqrt{(\delta x)^2 + \cdots + (\delta z)^2 + (\delta u)^2 + \cdots + (\delta w)^2}$ for independent random errors;
$\delta q \le \delta x + \cdots + \delta z + \delta u + \cdots + \delta w$ always. (p. 60)

If q is the product and quotient, $q = \dfrac{x \times \cdots \times z}{u \times \cdots \times w}$, then
$\dfrac{\delta q}{|q|} = \sqrt{\left(\dfrac{\delta x}{x}\right)^2 + \cdots + \left(\dfrac{\delta z}{z}\right)^2 + \left(\dfrac{\delta u}{u}\right)^2 + \cdots + \left(\dfrac{\delta w}{w}\right)^2}$ for independent random errors;
$\dfrac{\delta q}{|q|} \le \dfrac{\delta x}{|x|} + \cdots + \dfrac{\delta z}{|z|} + \dfrac{\delta u}{|u|} + \cdots + \dfrac{\delta w}{|w|}$ always. (p. 61)

If $q = Bx$, where B is known exactly, then $\delta q = |B|\,\delta x$. (p. 54)

If q is a function of one variable, q(x), then $\delta q = \left|\dfrac{dq}{dx}\right| \delta x$. (p. 65)

If q is a power, $q = x^n$, then $\dfrac{\delta q}{|q|} = |n|\,\dfrac{\delta x}{|x|}$. (p. 66)

If q is any function of several variables x, ..., z, then
$\delta q = \sqrt{\left(\dfrac{\partial q}{\partial x}\,\delta x\right)^2 + \cdots + \left(\dfrac{\partial q}{\partial z}\,\delta z\right)^2}$ (p. 75)
(for independent random errors).

Statistical Definitions (Chapter 4)

If $x_1, \ldots, x_N$ denote N separate measurements of one quantity x, then we define:
$\bar{x} = \dfrac{1}{N}\sum x_i$ = mean; (p. 98)
$\sigma_x = \sqrt{\dfrac{1}{N-1}\sum (x_i - \bar{x})^2}$ = standard deviation, or SD; (p. 100)
$\sigma_{\bar{x}} = \sigma_x/\sqrt{N}$ = standard deviation of mean, or SDOM. (p. 102)

The Normal Distribution (Chapter 5)

For any limiting distribution f(x) for measurement of a continuous variable x:
$f(x)\,dx$ = probability that any one measurement will give an answer between x and x + dx; (p. 128)
$\int_a^b f(x)\,dx$ = probability that any one measurement will give an answer between x = a and x = b; (p. 128)
$\int_{-\infty}^{\infty} f(x)\,dx = 1$ is the normalization condition. (p. 128)

The Gauss or normal distribution is
$G_{X,\sigma}(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-X)^2/2\sigma^2}$, (p. 133)
where X = center of distribution = true value of x = mean after many measurements, and $\sigma$ = width of distribution = standard deviation after many measurements.

The probability of a measurement within t standard deviations of X is
Prob(within $t\sigma$) = $\dfrac{1}{\sqrt{2\pi}} \int_{-t}^{t} e^{-z^2/2}\,dz$ = normal error integral; (p. 136)
in particular, Prob(within $1\sigma$) = 68%.

Principal Formulas in Part II

Weighted Averages (Chapter 7)

If $x_1, \ldots, x_N$ are measurements of the same quantity x, with known uncertainties $\sigma_1, \ldots, \sigma_N$, then the best estimate for x is the weighted average
$x_{\rm wav} = \dfrac{\sum w_i x_i}{\sum w_i}$, (p. 175)
where the weight $w_i = 1/\sigma_i^2$.

Least-Squares Fit to a Straight Line (Chapter 8)

If $(x_1, y_1), \ldots, (x_N, y_N)$ are measured pairs of data, then the best straight line $y = A + Bx$ to fit these N points has
$A = \dfrac{(\sum x^2)(\sum y) - (\sum x)(\sum xy)}{\Delta}$,
$B = \dfrac{N(\sum xy) - (\sum x)(\sum y)}{\Delta}$,
where $\Delta = N(\sum x^2) - (\sum x)^2$. (p. 184)
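As a quick numerical sketch (not part of the original summary), the least-squares formulas for A, B, and Δ can be coded directly; the function name line_fit and the data values below are invented for illustration.

```python
from math import fsum

def line_fit(xs, ys):
    """Least-squares fit y = A + B*x using the Chapter 8 formulas."""
    n = len(xs)
    sx = fsum(xs)
    sy = fsum(ys)
    sxx = fsum(x * x for x in xs)
    sxy = fsum(x * y for x, y in zip(xs, ys))
    delta = n * sxx - sx ** 2            # Delta = N*sum(x^2) - (sum x)^2
    a = (sxx * sy - sx * sxy) / delta    # intercept A
    b = (n * sxy - sx * sy) / delta      # slope B
    return a, b

# Invented example data lying near y = 2 + 3x:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [5.1, 7.9, 11.2, 13.9, 17.1]
a, b = line_fit(xs, ys)
print(a, b)   # A ~ 2.04, B = 3.0
```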
Covariance and Correlation (Chapter 9)

The covariance $\sigma_{xy}$ of N pairs $(x_1, y_1), \ldots, (x_N, y_N)$ is
$\sigma_{xy} = \dfrac{1}{N}\sum (x_i - \bar{x})(y_i - \bar{y})$. (p. 212)
The coefficient of linear correlation is
$r = \dfrac{\sigma_{xy}}{\sigma_x \sigma_y}$. (p. 217)
Values of r near 1 or −1 indicate strong linear correlation; values near 0 indicate little or no correlation. (For a table of probabilities for r, see Appendix C.)

Binomial Distribution (Chapter 10)

If the probability of "success" in one trial is p, then the probability of $\nu$ successes in n trials is given by the binomial distribution
Prob($\nu$ successes in n trials) = $B_{n,p}(\nu) = \dfrac{n!}{\nu!\,(n-\nu)!}\, p^{\nu}(1-p)^{n-\nu}$. (p. 230)
After many sets of n trials, the mean number of successes is $\bar{\nu} = np$, and the standard deviation is
$\sigma_{\nu} = \sqrt{np(1-p)}$. (p. 232)

Poisson Distribution (Chapter 11)

In counting radioactive decays (and other similar random events), the probability of $\nu$ counts (in some definite time interval) is given by the Poisson distribution
Prob($\nu$ counts) = $P_{\mu}(\nu) = e^{-\mu}\,\dfrac{\mu^{\nu}}{\nu!}$, (p. 246)
where $\mu$ is the expected average count in the time interval concerned:
$\mu = \bar{\nu}$ (after many experiments). (p. 247)
The standard deviation is $\sigma_{\nu} = \sqrt{\mu}$. (p. 249)

Chi Squared (Chapter 12)

The results of any repeated measurement can be grouped in bins, $k = 1, \ldots, n$. Let $O_k$ denote the number of results observed in bin k, and $E_k$ the number expected in bin k, based on some assumed distribution (Gauss, binomial, Poisson, etc.). We define chi squared as
$\chi^2 = \sum_{k=1}^{n} (O_k - E_k)^2 / E_k$, (p. 266)
and the reduced chi squared as
$\tilde{\chi}^2 = \chi^2 / d$, (p. 271)
where d is the number of degrees of freedom. If $\tilde{\chi}^2 \gg 1$, the agreement between $O_k$ and $E_k$ is unacceptable, and we reject the assumed distribution. If $\tilde{\chi}^2 \lesssim 1$, the agreement is satisfactory, and the observed and expected distributions are compatible. (For a table of probabilities for $\chi^2$, see Appendix D.)
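As a minimal sketch (not from the text), the chi-squared comparison above can be applied to counts tested against an assumed Poisson distribution; the observed counts, the bin layout, and the assumed mean μ = 2 below are all invented for illustration.

```python
from math import exp, factorial

def poisson_prob(nu, mu):
    """P_mu(nu) = e**(-mu) * mu**nu / nu!  (Chapter 11 formula)."""
    return exp(-mu) * mu ** nu / factorial(nu)

# Invented observed counts O_k for the bins nu = 0, 1, 2, 3, and "4 or more":
observed = [11, 27, 28, 18, 16]
total = sum(observed)
mu = 2.0  # assumed (not fitted) Poisson mean

# Expected counts E_k from the assumed distribution; the last bin collects
# the remaining probability so the E_k sum to the total.
expected = [total * poisson_prob(nu, mu) for nu in range(4)]
expected.append(total - sum(expected))

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
d = len(observed) - 1        # one constraint (the total count), since mu was assumed
chi2_reduced = chi2 / d

print(chi2, chi2_reduced)    # roughly 0.7 and 0.2: satisfactory agreement
```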
