
KEY FORMULAS
Prem S. Mann • Introductory Statistics, Ninth Edition

Chapter 2 • Organizing and Graphing Data
• Relative frequency of a class = f∕∑f
• Percentage of a class = (Relative frequency) × 100%
• Class midpoint or mark = (Upper limit + Lower limit)∕2
• Class width = Upper boundary − Lower boundary
• Cumulative relative frequency = Cumulative frequency∕Total observations in the data set
• Cumulative percentage = (Cumulative relative frequency) × 100%

Chapter 3 • Numerical Descriptive Measures
• Mean for ungrouped data: μ = ∑x∕N and x̄ = ∑x∕n
• Mean for grouped data: μ = ∑mf∕N and x̄ = ∑mf∕n, where m is the midpoint and f is the frequency of a class
• Weighted mean for ungrouped data = ∑xw∕∑w
• k% trimmed mean = Mean of the values after dropping k% of the values from each end of the ranked data
• Median for ungrouped data = Value of the middle term in a ranked data set
• Range = Largest value − Smallest value
• Variance for ungrouped data:
  σ² = [∑x² − (∑x)²∕N]∕N and s² = [∑x² − (∑x)²∕n]∕(n − 1)
  where σ² is the population variance and s² is the sample variance
• Standard deviation for ungrouped data:
  σ = √{[∑x² − (∑x)²∕N]∕N} and s = √{[∑x² − (∑x)²∕n]∕(n − 1)}
  where σ and s are the population and sample standard deviations, respectively
• Coefficient of variation = (σ∕μ) × 100% or (s∕x̄) × 100%
• Variance for grouped data:
  σ² = [∑m²f − (∑mf)²∕N]∕N and s² = [∑m²f − (∑mf)²∕n]∕(n − 1)
• Standard deviation for grouped data:
  σ = √{[∑m²f − (∑mf)²∕N]∕N} and s = √{[∑m²f − (∑mf)²∕n]∕(n − 1)}
• Chebyshev’s theorem: For any number k greater than 1, at least (1 − 1∕k²) of the values for any distribution lie within k standard deviations of the mean.
• Empirical rule: For a specific bell-shaped distribution, about 68% of the observations fall in the interval (μ − σ) to (μ + σ), about 95% fall in the interval (μ − 2σ) to (μ + 2σ), and about 99.7% fall in the interval (μ − 3σ) to (μ + 3σ).
• Q1 = First quartile, given by the value of the middle term among the (ranked) observations that are less than the median
• Q2 = Second quartile, given by the value of the middle term in a ranked data set
• Q3 = Third quartile, given by the value of the middle term among the (ranked) observations that are greater than the median
• Interquartile range: IQR = Q3 − Q1
• The kth percentile: Pk = Value of the (kn∕100)th term in a ranked data set
• Percentile rank of xi = (Number of values less than xi∕Total number of values in the data set) × 100

Chapter 4 • Probability
• Classical probability rule for a simple event: P(Ei) = 1∕Total number of outcomes
• Classical probability rule for a compound event: P(A) = Number of outcomes in A∕Total number of outcomes
• Relative frequency as an approximation of probability: P(A) = f∕n
• Conditional probability of an event: P(A∣B) = P(A and B)∕P(B) and P(B∣A) = P(A and B)∕P(A)
• Condition for independence of events: P(A) = P(A∣B) and∕or P(B) = P(B∣A)
• For complementary events: P(A) + P(Ā) = 1
• Multiplication rule for dependent events: P(A and B) = P(A) P(B∣A)
• Multiplication rule for independent events: P(A and B) = P(A) P(B)
• Joint probability of two mutually exclusive events: P(A and B) = 0
• Addition rule for mutually nonexclusive events: P(A or B) = P(A) + P(B) − P(A and B)
• Addition rule for mutually exclusive events: P(A or B) = P(A) + P(B)
• n factorial: n! = n(n − 1)(n − 2) … 3 · 2 · 1
• Number of combinations of n items selected x at a time: nCx = n!∕[x!(n − x)!]
• Number of permutations of n items selected x at a time: nPx = n!∕(n − x)!
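
As a quick numerical check of the Chapter 3 ungrouped-data formulas, the short Python sketch below computes the sample mean, variance, standard deviation, and coefficient of variation directly from the shortcut formulas; the data values and variable names are illustrative only, not taken from the text.

```python
# Minimal sketch (illustrative data): mean, sample variance, standard
# deviation, and coefficient of variation from the shortcut formulas.
from math import sqrt

x = [7, 11, 8, 14, 10]                 # hypothetical sample values
n = len(x)
sum_x = sum(x)
sum_x2 = sum(v * v for v in x)

mean = sum_x / n                               # x̄ = ∑x / n
s2 = (sum_x2 - sum_x**2 / n) / (n - 1)         # s² = [∑x² − (∑x)²/n] / (n − 1)
s = sqrt(s2)                                   # sample standard deviation
cv = s / mean * 100                            # coefficient of variation (%)

print(mean, s2, round(s, 4), round(cv, 2))     # 10.0 7.5 2.7386 27.39
```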

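The Chapter 4 counting rules map directly onto the Python 3.8+ standard library; the sketch below, with illustrative values of n and x, uses math.factorial, math.comb, and math.perm.

```python
# Minimal sketch: factorial, combinations, and permutations from the
# counting rules above (requires Python 3.8+ for math.comb / math.perm).
import math

n, x = 8, 3
n_factorial = math.factorial(n)        # n!
combinations = math.comb(n, x)         # nCx = n! / [x!(n − x)!]
permutations = math.perm(n, x)         # nPx = n! / (n − x)!

print(n_factorial, combinations, permutations)   # 40320 56 336
```
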
Chapter 5 • Discrete Random Variables and Their Probability Distributions
• Mean of a discrete random variable x: μ = ∑xP(x)
• Standard deviation of a discrete random variable x: σ = √[∑x²P(x) − μ²]
• Binomial probability formula: P(x) = nCx p^x q^(n−x)
• Mean and standard deviation of the binomial distribution: μ = np and σ = √(npq)
• Hypergeometric probability formula: P(x) = (rCx)(N−rCn−x)∕NCn
• Poisson probability formula: P(x) = λ^x e^(−λ)∕x!
• Mean, variance, and standard deviation of the Poisson probability distribution: μ = λ, σ² = λ, and σ = √λ

Chapter 6 • Continuous Random Variables and the Normal Distribution
• z value for an x value: z = (x − μ)∕σ
• Value of x when μ, σ, and z are known: x = μ + zσ

Chapter 7 • Sampling Distributions
• Mean of x̄: μx̄ = μ
• Standard deviation of x̄ when n∕N ≤ .05: σx̄ = σ∕√n
• z value for x̄: z = (x̄ − μ)∕σx̄
• Population proportion: p = X∕N
• Sample proportion: p̂ = x∕n
• Mean of p̂: μp̂ = p
• Standard deviation of p̂ when n∕N ≤ .05: σp̂ = √(pq∕n)
• z value for p̂: z = (p̂ − p)∕σp̂

Chapter 8 • Estimation of the Mean and Proportion
• Point estimate of μ = x̄
• Confidence interval for μ using the normal distribution when σ is known: x̄ ± zσx̄, where σx̄ = σ∕√n
• Confidence interval for μ using the t distribution when σ is not known: x̄ ± tsx̄, where sx̄ = s∕√n
• Margin of error of the estimate for μ: E = zσx̄ or tsx̄
• Determining sample size for estimating μ: n = z²σ²∕E²
• Confidence interval for p for a large sample: p̂ ± zsp̂, where sp̂ = √(p̂q̂∕n)
• Margin of error of the estimate for p: E = zsp̂, where sp̂ = √(p̂q̂∕n)
• Determining sample size for estimating p: n = z²pq∕E²

Chapter 9 • Hypothesis Tests about the Mean and Proportion
• Test statistic for a test of hypothesis about μ using the normal distribution when σ is known: z = (x̄ − μ)∕σx̄, where σx̄ = σ∕√n
• Test statistic for a test of hypothesis about μ using the t distribution when σ is not known: t = (x̄ − μ)∕sx̄, where sx̄ = s∕√n
• Test statistic for a test of hypothesis about p for a large sample: z = (p̂ − p)∕σp̂, where σp̂ = √(pq∕n)
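
A minimal sketch of the Chapter 5 binomial and Poisson formulas, evaluated for illustrative parameter values with the standard-library math module:

```python
# Minimal sketch: binomial and Poisson probabilities computed directly
# from the formulas above (parameter values are illustrative).
import math

# Binomial: P(x) = nCx * p**x * q**(n − x)
n, p, x = 10, 0.3, 4
q = 1 - p
p_binom = math.comb(n, x) * p**x * q**(n - x)

# Poisson: P(x) = lam**x * e**(−lam) / x!
lam, k = 2.5, 3
p_poisson = lam**k * math.exp(-lam) / math.factorial(k)

print(round(p_binom, 4), round(p_poisson, 4))   # ≈ 0.2001 and ≈ 0.2138
```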

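A minimal sketch of the Chapter 8 and Chapter 9 procedures when σ is known: a 95% confidence interval and a z test statistic for μ, using hypothetical summary statistics and the usual 1.96 critical value.

```python
# Minimal sketch: confidence interval and z statistic for μ with σ known
# (summary numbers and the 1.96 critical value are illustrative).
from math import sqrt

x_bar, sigma, n = 24.5, 3.1, 45       # hypothetical sample mean, σ, and n
mu0 = 23.0                            # hypothetical null value of μ
z_crit = 1.96                         # z for a 95% confidence level

sigma_x_bar = sigma / sqrt(n)                     # σx̄ = σ / √n
ci = (x_bar - z_crit * sigma_x_bar,
      x_bar + z_crit * sigma_x_bar)               # x̄ ± zσx̄
z = (x_bar - mu0) / sigma_x_bar                   # test statistic for H0: μ = μ0

print(tuple(round(v, 3) for v in ci), round(z, 3))
```
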
Chapter 10 • Estimation and Hypothesis Testing: Two Populations
• Mean of the sampling distribution of x̄1 − x̄2: μx̄1−x̄2 = μ1 − μ2
• Confidence interval for μ1 − μ2 for two independent samples using the normal distribution when σ1 and σ2 are known:
  (x̄1 − x̄2) ± zσx̄1−x̄2, where σx̄1−x̄2 = √(σ1²∕n1 + σ2²∕n2)
• Test statistic for a test of hypothesis about μ1 − μ2 for two independent samples using the normal distribution when σ1 and σ2 are known:
  z = [(x̄1 − x̄2) − (μ1 − μ2)]∕σx̄1−x̄2
• For two independent samples taken from two populations with equal but unknown standard deviations:
  Pooled standard deviation: sp = √{[(n1 − 1)s1² + (n2 − 1)s2²]∕(n1 + n2 − 2)}
  Estimate of the standard deviation of x̄1 − x̄2: sx̄1−x̄2 = sp√(1∕n1 + 1∕n2)
  Confidence interval for μ1 − μ2 using the t distribution: (x̄1 − x̄2) ± tsx̄1−x̄2
  Test statistic using the t distribution: t = [(x̄1 − x̄2) − (μ1 − μ2)]∕sx̄1−x̄2
• For two independent samples selected from two populations with unequal and unknown standard deviations:
  Degrees of freedom: df = (s1²∕n1 + s2²∕n2)²∕[(s1²∕n1)²∕(n1 − 1) + (s2²∕n2)²∕(n2 − 1)]
  Estimate of the standard deviation of x̄1 − x̄2: sx̄1−x̄2 = √(s1²∕n1 + s2²∕n2)
  Confidence interval for μ1 − μ2 using the t distribution: (x̄1 − x̄2) ± tsx̄1−x̄2
  Test statistic using the t distribution: t = [(x̄1 − x̄2) − (μ1 − μ2)]∕sx̄1−x̄2
• For two paired or matched samples:
  Sample mean for paired differences: d̄ = ∑d∕n
  Sample standard deviation for paired differences: sd = √{[∑d² − (∑d)²∕n]∕(n − 1)}
  Mean and standard deviation of the sampling distribution of d̄: μd̄ = μd and sd̄ = sd∕√n
  Confidence interval for μd using the t distribution: d̄ ± tsd̄, where sd̄ = sd∕√n
  Test statistic for a test of hypothesis about μd using the t distribution: t = (d̄ − μd)∕sd̄
• For two large and independent samples, confidence interval for p1 − p2:
  (p̂1 − p̂2) ± zsp̂1−p̂2, where sp̂1−p̂2 = √(p̂1q̂1∕n1 + p̂2q̂2∕n2)
• For two large and independent samples, for a test of hypothesis about p1 − p2 with H0: p1 − p2 = 0:
  Pooled sample proportion: p̄ = (x1 + x2)∕(n1 + n2) or (n1p̂1 + n2p̂2)∕(n1 + n2)
  Estimate of the standard deviation of p̂1 − p̂2: sp̂1−p̂2 = √[p̄q̄(1∕n1 + 1∕n2)]
  Test statistic: z = [(p̂1 − p̂2) − (p1 − p2)]∕sp̂1−p̂2

Chapter 11 • Chi-Square Tests
• Expected frequency for a category for a goodness-of-fit test: E = np
• Degrees of freedom for a goodness-of-fit test: df = k − 1, where k is the number of categories
• Expected frequency for a cell for an independence or homogeneity test: E = (Row total)(Column total)∕Sample size
• Degrees of freedom for a test of independence or homogeneity: df = (R − 1)(C − 1), where R and C are the total number of rows and columns, respectively, in the contingency table
• Test statistic for a goodness-of-fit test and a test of independence or homogeneity: χ² = ∑(O − E)²∕E
• Confidence interval for the population variance σ²: (n − 1)s²∕χ²α∕2 to (n − 1)s²∕χ²1−α∕2
• Test statistic for a test of hypothesis about σ²: χ² = (n − 1)s²∕σ²
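
A minimal sketch of the Chapter 10 pooled-variance procedure for two independent samples with equal but unknown standard deviations; the sample summaries are hypothetical.

```python
# Minimal sketch: pooled standard deviation and t statistic for two
# independent samples (illustrative summary statistics).
from math import sqrt

n1, x1_bar, s1 = 12, 80.0, 6.0        # hypothetical sample 1 summary
n2, x2_bar, s2 = 15, 75.0, 5.0        # hypothetical sample 2 summary

sp = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
s_diff = sp * sqrt(1 / n1 + 1 / n2)               # s for x̄1 − x̄2
t = ((x1_bar - x2_bar) - 0) / s_diff              # H0: μ1 − μ2 = 0
df = n1 + n2 - 2

print(round(sp, 4), round(s_diff, 4), round(t, 4), df)
```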

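A minimal sketch of the Chapter 11 goodness-of-fit statistic χ² = ∑(O − E)²∕E, using hypothetical observed counts and equally likely categories.

```python
# Minimal sketch: chi-square goodness-of-fit statistic for hypothetical
# observed counts under equally likely categories (E = np with p = 1/k).
observed = [18, 25, 22, 15]                   # hypothetical observed frequencies
n = sum(observed)
k = len(observed)
expected = [n * (1 / k) for _ in observed]    # expected frequency per category

chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
df = k - 1

print(round(chi_sq, 4), df)   # compare with the χ² critical value for df = 3
```
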
Chapter 12 • Analysis of Variance
Let:
  k = the number of different samples (or treatments)
  ni = the size of sample i
  Ti = the sum of the values in sample i
  n = the number of values in all samples = n1 + n2 + n3 + …
  ∑x = the sum of the values in all samples = T1 + T2 + T3 + …
  ∑x² = the sum of the squares of values in all samples
• For the F distribution:
  Degrees of freedom for the numerator = k − 1
  Degrees of freedom for the denominator = n − k
• Between-samples sum of squares: SSB = (T1²∕n1 + T2²∕n2 + T3²∕n3 + …) − (∑x)²∕n
• Within-samples sum of squares: SSW = ∑x² − (T1²∕n1 + T2²∕n2 + T3²∕n3 + …)
• Total sum of squares: SST = SSB + SSW = ∑x² − (∑x)²∕n
• Variance between samples: MSB = SSB∕(k − 1)
• Variance within samples: MSW = SSW∕(n − k)
• Test statistic for a one-way ANOVA test: F = MSB∕MSW

Chapter 13 • Simple Linear Regression
• Simple linear regression model: y = A + Bx + ϵ
• Estimated simple linear regression model: ŷ = a + bx
• Sums of squares of xy, xx, and yy:
  SSxy = ∑xy − (∑x)(∑y)∕n, SSxx = ∑x² − (∑x)²∕n, and SSyy = ∑y² − (∑y)²∕n
• Least squares estimates of A and B: b = SSxy∕SSxx and a = ȳ − bx̄
• Standard deviation of the sample errors: se = √[(SSyy − bSSxy)∕(n − 2)]
• Error sum of squares: SSE = ∑e² = ∑(y − ŷ)²
• Total sum of squares: SST = ∑y² − (∑y)²∕n
• Regression sum of squares: SSR = SST − SSE
• Coefficient of determination: r² = bSSxy∕SSyy
• Confidence interval for B: b ± tsb, where sb = se∕√SSxx
• Test statistic for a test of hypothesis about B: t = (b − B)∕sb
• Linear correlation coefficient: r = SSxy∕√(SSxx SSyy)
• Test statistic for a test of hypothesis about ρ: t = r√[(n − 2)∕(1 − r²)]
• Confidence interval for μy∣x: ŷ ± tsŷm, where sŷm = se√[1∕n + (x0 − x̄)²∕SSxx]
• Prediction interval for yp: ŷ ± tsŷp, where sŷp = se√[1 + 1∕n + (x0 − x̄)²∕SSxx]

Chapter 14 • Multiple Regression
Formulas for Chapter 14, along with the chapter, are on the Web site for the text.

Chapter 15 • Nonparametric Methods
Formulas for Chapter 15, along with the chapter, are on the Web site for the text.
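
A minimal sketch of the Chapter 12 one-way ANOVA computations, built from the sums of squares defined above for three hypothetical samples.

```python
# Minimal sketch: one-way ANOVA sums of squares and F statistic
# (sample data are illustrative).
samples = [[15, 17, 14, 16], [12, 13, 11, 14], [18, 19, 17, 20]]

k = len(samples)                              # number of samples (treatments)
n = sum(len(s) for s in samples)              # total number of values
T = [sum(s) for s in samples]                 # Ti = sum of values in sample i
sum_x = sum(T)
sum_x2 = sum(v * v for s in samples for v in s)

between = sum(t * t / len(s) for t, s in zip(T, samples))
ssb = between - sum_x**2 / n                  # between-samples sum of squares
ssw = sum_x2 - between                        # within-samples sum of squares
msb = ssb / (k - 1)
msw = ssw / (n - k)
F = msb / msw

print(round(ssb, 3), round(ssw, 3), round(F, 3))   # 72.0 15.0 21.6
```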

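A minimal sketch of the Chapter 13 least squares computations for a small illustrative data set.

```python
# Minimal sketch: least squares estimates, correlation, and r² from the
# sums of squares defined above (small illustrative data set).
from math import sqrt

xs = [2, 4, 6, 8, 10]
ys = [3, 7, 8, 12, 15]                        # hypothetical (x, y) pairs
n = len(xs)

ss_xy = sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys) / n
ss_xx = sum(x * x for x in xs) - sum(xs) ** 2 / n
ss_yy = sum(y * y for y in ys) - sum(ys) ** 2 / n

b = ss_xy / ss_xx                             # slope estimate
a = sum(ys) / n - b * sum(xs) / n             # intercept: a = ȳ − b x̄
r = ss_xy / sqrt(ss_xx * ss_yy)               # linear correlation coefficient
r2 = b * ss_xy / ss_yy                        # coefficient of determination

print(round(b, 4), round(a, 4), round(r, 4), round(r2, 4))
```
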
Table IV Standard Normal Distribution Table
The entries in this table give the cumulative area under the standard normal curve to the left of z, for values of z equal to 0 or negative.
[Tabulated cumulative areas for negative z omitted.]

Table IV Standard Normal Distribution Table (continued)
The entries in this table give the cumulative area under the standard normal curve to the left of z, for values of z equal to 0 or positive.
[Tabulated cumulative areas for positive z omitted.]
This is Table IV of Appendix B.
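
The cumulative areas tabulated in Table IV can be reproduced from the error function in Python's standard library; a minimal sketch (the example z values are illustrative):

```python
# Minimal sketch: cumulative area to the left of z (the quantity
# tabulated in Table IV), via the standard-library error function.
from math import erf, sqrt

def normal_cdf(z):
    """Cumulative area under the standard normal curve to the left of z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

print(round(normal_cdf(-1.96), 4))   # ≈ 0.0250
print(round(normal_cdf(1.65), 4))    # ≈ 0.9505
```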

Table V The t Distribution Table
The entries in this table give the critical values of t for the specified number of degrees of freedom and areas in the right tail (.10, .05, .025, .01, .005, and .001).
[Tabulated critical values omitted.]

Table V The t Distribution Table (continued)
[Tabulated critical values omitted.]
This is Table V of Appendix B.
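
The critical values tabulated in Table V can be reproduced with the inverse CDF of the t distribution; a minimal sketch, assuming SciPy is installed (the choice of df = 15 is illustrative):

```python
# Minimal sketch: right-tail critical values of t (the quantity tabulated
# in Table V) via the inverse CDF. Requires SciPy.
from scipy.stats import t

df = 15
for area in (0.10, 0.05, 0.025, 0.01, 0.005, 0.001):
    # critical value with `area` in the right tail and df degrees of freedom,
    # e.g. 1.341, 1.753, 2.131, 2.602, 2.947, 3.733 for df = 15
    print(area, round(t.ppf(1 - area, df), 3))
```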