
7/19/2010

Session 13: Use of Statistics for Repeatability, Intermediate Precision and Reproducibility

Steven S. Kuwahara, Ph.D.

GXP BioTechnology, LLC
PMB 506, 1669-2 Hollenbeck Avenue
Sunnyvale, CA 94087-5042
Tel. & FAX: (408) 530-9338
e-Mail: stevekuwahara@yahoo.com
Website: www.gxpbiotech.org

VARIABILITY. I.
• Nothing is ever perfect. In fact, if it is, you should wonder why.
• It is almost impossible to control things so tightly that variation does not happen.
• Variation occurs because of human variability and machine wear or imperfections.
• Expect variation and be prepared to deal with it.
• Disasters, both small and large, occur because people assume perfection and plan on that basis.
• When variations occur, the plans fall apart.


VARIABILITY. II.

• Wise Men seek Perfection.
• Fools Expect It! (Anonymous)

• Any fool can design a perfect system.


• What takes talent is designing a system that does not
go to Hell in the face of surrounding imperfection.
(Coffee’s Law)
• Naïve people often believe that test results are
perfect and have no uncertainty in them.


CHARACTERISTICS FOR AN ASSAY VALIDATION: PRECISION
• PRECISION: . . . expresses the closeness of agreement
(degree of scatter) between a series of measurements
obtained from multiple sampling of the same
homogeneous, authentic sample. . .
– Precision is usually expressed as the variance, standard
deviation, or coefficient of variation (%RSD).
– Precision may be considered at three levels:
– REPEATABILITY: . . . under the same operating
conditions over a short period of time. (Intra-assay
precision). It should cover the range of the assay.
– Set up one assay with several replicates. Note that the number of replicates should be related to the number of replicates that will be used in the routine test.

REPEATABILITY. I.
• Also known as within-test precision, it is measured by taking one sample, testing replicates of it, and calculating the standard deviation of the replicates.
• The study should be performed at high, middle
and low concentrations of the analyte.
– For a product these should be 80%, 100%, and 120% of
a unit dose.
– For analytes whose concentrations are not known, these
should be at the LLOD, ULOD and middle.
– The middle should be the target concentration for the
test, where it should operate most efficiently.

REPEATABILITY. II.
VARIANCE OF A SAMPLE

$$S^2 = \frac{\sum X_i^2 - \dfrac{(\sum X_i)^2}{n}}{n-1}$$

$$S^2 = \frac{\sum (X_i - \bar{X})^2}{n-1}$$

$$S^2 = \frac{n\sum X_i^2 - (\sum X_i)^2}{n(n-1)}$$
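The three formulas above are algebraically equivalent. A minimal sketch of that equivalence, assuming Python with NumPy is available (the replicate values are made up for demonstration):

```python
import numpy as np

# Hypothetical replicate results from a single assay run
x = np.array([98.2, 101.5, 99.8, 100.3, 97.9])
n = len(x)

# Definitional form: sum of squared deviations from the mean
s2_dev = np.sum((x - x.mean()) ** 2) / (n - 1)

# Computational form: sums and sums of squares
s2_sum = (np.sum(x ** 2) - np.sum(x) ** 2 / n) / (n - 1)

# Single-fraction form
s2_alt = (n * np.sum(x ** 2) - np.sum(x) ** 2) / (n * (n - 1))

print(s2_dev, s2_sum, s2_alt)      # all three agree
print(np.var(x, ddof=1))           # NumPy's sample variance matches
```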


REPEATABILITY. III.
RANGE, S, AND C.V.
• The range can be related to the standard deviation
for n<16. SPC Tables have d2.
$$s = \frac{X_L - X_S}{d_2} \qquad (d_2 \text{ is a tabular value})$$

$$C.V. = \frac{S}{\bar{X}} \times 100 = \%RSD$$

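A short sketch of both calculations (the d2 value below is the standard SPC constant for n = 5, and the data are the same hypothetical replicates used above):

```python
import numpy as np

x = np.array([98.2, 101.5, 99.8, 100.3, 97.9])   # hypothetical replicates
d2 = 2.326                                        # SPC d2 constant for n = 5

s_from_range = (x.max() - x.min()) / d2           # range-based estimate of s
s = np.std(x, ddof=1)                             # usual sample standard deviation
rsd = s / x.mean() * 100                          # coefficient of variation, %RSD

print(round(s_from_range, 3), round(s, 3), round(rsd, 2))
```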

The Determination of n.
• The problem here is that you need to know how
many replicates are needed to obtain a good
estimate of the true standard deviation of the test
system.
– This number is not the same as the n for determining an
average and confidence interval.
– There are different n values depending on what you are
trying to do.
• Note that if a good n value is calculated, but you
decide to use a smaller n value for routine work,
the s that you obtain will not be as good as the one
obtained in the validation study.

Sample Sizes for Estimating Standard Deviations. I.

• The problem is to choose n so that the s calculated with n – 1 degrees of freedom will fall within a given ratio of σ (i.e., s/σ lies within stated limits).
• Examples are found in reproducibility,
repeatability, and intermediate precision
measurements.
• s = experimentally determined standard deviation. σ = population (true) standard deviation. s² and σ² are the corresponding variances.
• You will use n replicates to derive s.


Sample Sizes for Estimating Standard Deviations. χ²

• This is the asymmetric distribution for σ². The relevant relationships are:

$$\chi^2_{n-1} = \frac{(n-1)s^2}{\sigma^2} \qquad \frac{\chi^2_{n-1}}{(n-1)} = \frac{s^2}{\sigma^2} \qquad \left(\frac{s}{\sigma}\right)^2 = \frac{\chi^2_{n-1}}{(n-1)}$$

• Now as an example, assume n − 1 = 12. At 12 df, χ² will exceed 21.0261 5% of the time and it will exceed 5.2260 95% of the time. Therefore, 90% of the time χ² will lie between 5.2260 and 21.0261 for 12 df.
• Check your tables to confirm this.



Confidence interval for the standard deviation.

• Given the data in the previous slide, we know that (s²/σ²) will lie between (5.2260/12) and (21.0261/12), or between 0.4355 and 1.7522, 90% of the time.
• Thus the ratio s/σ will lie between the square roots of these numbers, i.e. 0.66 < s/σ < 1.32. This gives:
• s/1.32 < σ < s/0.66. If you know s, this gives you a 90% confidence interval for the standard deviation.
• Now let’s reverse the thinking.
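A brief sketch of the same chain of calculations, assuming SciPy is available (the value of s below is hypothetical):

```python
from scipy.stats import chi2

df = 12                                   # n - 1 replicates
lo, hi = chi2.ppf(0.05, df), chi2.ppf(0.95, df)
print(round(lo, 4), round(hi, 4))         # 5.2260 and 21.0261, as in the tables

# 90% confidence interval for sigma from an observed s (hypothetical value)
s = 1.8
ci = (s / (hi / df) ** 0.5, s / (lo / df) ** 0.5)
print(tuple(round(v, 2) for v in ci))     # equivalent to s/1.32 < sigma < s/0.66
```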

Sample Sizes for Estimating Standard Deviations. Continued. I.

• Instead of the confidence interval, suppose we say that we want to determine s to be within ± 20% of σ with 90% confidence. So:
• 1 – 0.2 < s/σ < 1 + 0.2, or 0.8 < s/σ < 1.2
• This is the same as: 0.64 < (s/σ)² < 1.44
• Since we want 90% confidence, we use levels of significance at 0.05 and 0.95.
• Now go down the χ² table under the 0.95 column until you reach a df at which χ²/df is no longer < 0.64.


Sample Sizes for Estimating Standard Deviations. Continued. II.

• Trial and error shows this number to be about 50.
• Next we go to the column under 0.05 and look for a df where the ratio χ²/df does not exceed 1.44, with df as small as possible.
• Trial and error will show this number to be
between 30 and 40.
• You must take the larger of the two numbers and
since df = n – 1, n = 51 replicates.

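To reproduce this trial-and-error search without a printed table, the χ²/df ratios can be generated directly. A minimal sketch, assuming SciPy; the exact df at which each condition is first met depends on how finely the table is read, so treat the output as a guide to the search rather than a replacement for the table work on the slide:

```python
from scipy.stats import chi2

def ratio_table(dfs, lower_q=0.05, upper_q=0.95):
    """Print chi-square/df ratios at the lower and upper tail quantiles."""
    for df in dfs:
        low = chi2.ppf(lower_q, df) / df    # value exceeded (1 - lower_q) of the time
        high = chi2.ppf(upper_q, df) / df   # value exceeded lower_q of the time
        print(f"df={df:3d}  low={low:5.3f}  high={high:5.3f}")

# For the +/- 20% at 90% confidence example, scan candidate df values and
# look for low >= 0.64 and high <= 1.44.
ratio_table(range(10, 61, 5))
```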

Do Not Panic. Consider This!

• Instead of ± 20%, suppose we say that we want to determine s to be within ± 50% of σ with 95% confidence. So:
• 1 – 0.5 < s/σ < 1 + 0.5, or 0.5 < s/σ < 1.5
• This is the same as: 0.25 < (s/σ)² < 2.25
• Since we want 95% confidence, we use levels of significance at 0.025 and 0.975.
• Now go down the χ² table under the 0.975 column until you reach a df at which χ²/df is no longer < 0.25.



Greater Confidence, But Lesser Certainty


• Trial and error shows this number to be 8.
• Next we go to the column under 0.025 and look for a df where the ratio χ²/df does not exceed 2.25, with df as small as possible.
• Trial and error will show this number to be 8, the same as the other df.
• You must take the larger of the two numbers, but in this case both searches give df = 8, so n = 9.
• You have traded a much wider interval around σ (± 50% instead of ± 20%) for a smaller n.

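This case is easy to check directly (a small sketch, again assuming SciPy):

```python
from scipy.stats import chi2

for df in (7, 8):
    low = chi2.ppf(0.025, df) / df    # must not be < 0.25
    high = chi2.ppf(0.975, df) / df   # must not exceed 2.25
    print(df, round(low, 3), round(high, 3))

# df = 7 misses both limits slightly; df = 8 satisfies them, so n = 9.
```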

Intermediate Precision. I.

• Also referred to as run-to-run, analyst-to-analyst, or day-to-day precision.
• The problem is that interactions (synergistic
effects) need to be considered, and there are many
factors that could interact with each other.
– For instance, analysts have different levels of stress depending on the day of the week, and workloads may also vary by day of the week, so an analyst-to-analyst variation may interact with a day-to-day variation.



Interaction

• We’re worried about interactions.


• Defined in the PAT guidance
• “Interactions essentially are the inability of one
factor to produce the same effect on the response at
different levels of another factor.”
• Interactions are the joint action of two or more
factors working together in a non-additive
manner..


INTERACTION

[Figure: interaction plot of the effect E versus factor β, with response lines for levels B1, B2, and B3.]

GENERAL ONE-WAY ANOVA DATA. 1.

• Data format (one column per variable/treatment level):

       A     B     C    ...   J
       X1A   X1B   X1C  ...   X1J
       X2A   X2B   X2C  ...   X2J
       X3A   X3B   X3C  ...   X3J
       XiA   XiB   XiC  ...   XiJ

• Column totals: TcA + TcB + TcC + ... + TcJ = T
• n = i (replicates per column), N = i × J

GENERAL ONE-WAY ANOVA TABLE. 2.

Source              SS                        df      MS            Expected MS
Between (columns)   SSc = ΣT²cJ/n − T²/N      J − 1   SSc/(J − 1)   σ² + nσ²c
Within (rows)       SSw = ΣΣX²iJ − ΣT²cJ/n    N − J   SSw/(N − J)   σ²
Total               SSt = ΣΣX²iJ − T²/N       N − 1

• T = grand total of all values. n = number of replicates in each column. N = total number of values.
• Remember: SSw = SSt − SSc



ONE-WAY ANALYSIS OF VARIANCE (ANOVA). 8.

• Multiple Averages. DATA: % purity

        Lab A   Lab B   Lab C   Lab D
        73      74      68      71
        75      74      69      72
        73      75      69      72
        75      74      70      71
        73      74      69      73
TcJ =   369     371     345     359
x̄   =   73.8    74.2    69.0    71.8

• T = 1,444   n = 5   N = 20

ONE-WAY ANALYSIS OF VARIANCE (ANOVA). 8.

• SSc = 369²/5 + 371²/5 + 345²/5 + 359²/5 − 1,444²/20
• SSc = 104,341.60 − 104,256.80 = 84.80
• SSt = 73² + 75² + 73² + . . . + 73² − 1,444²/20
• SSt = 104,352.0 − 104,256.80 = 95.20
• SSw = 95.20 − 84.80 = 10.40

Source          SS     df   MS      Expected MS
Between (SSc)   84.8    3   28.27   σ² + 5σ²c
Within (SSw)    10.4   16    0.65   σ²
Total (SSt)     95.2   19

• F = 28.27/0.65 = 43.49    F(0.05; 3, 16) = 3.24


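The same analysis can be reproduced in a few lines (a minimal sketch, assuming SciPy is available):

```python
from scipy.stats import f, f_oneway

lab_a = [73, 75, 73, 75, 73]
lab_b = [74, 74, 75, 74, 74]
lab_c = [68, 69, 69, 70, 69]
lab_d = [71, 72, 72, 71, 73]

result = f_oneway(lab_a, lab_b, lab_c, lab_d)
print(round(result.statistic, 2), result.pvalue)   # F is about 43.5, p far below 0.05

print(round(f.ppf(0.95, 3, 16), 2))                # critical value F(0.05; 3, 16) = 3.24
```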


TWO-WAY ANOVA, NO REPLICATES, DATA

• Data table (columns = Variable I, rows = Variable II):

Variable II   A     B     C     D    ...  J     Row sum
a             X11   X12   X13   X14  ...  X1J   T1r
b             X21   X22   X23   X24  ...  X2J   T2r
c             X31   X32   X33   X34  ...  X3J   T3r
i             Xi1   Xi2   Xi3   Xi4  ...  XiJ   Tir
Col. total    Tc1   Tc2   Tc3   Tc4  ...  TcJ   T

• r = number of rows. c = number of columns. TcJ = total of the values in column J. Tir = total of the values in row i. T = grand total of all values.


TWO-WAY ANOVA, NO REPLICATES, TABLE

Source            SS                       df             MS
Between columns   SSc = ΣT²cJ/r − T²/rc    c − 1          SSc/df
Between rows      SSr = ΣT²ir/c − T²/rc    r − 1          SSr/df
Residual          SS = SSt − SSc − SSr     (c − 1)(r − 1)  SS/df
Total             SSt = ΣΣX²iJ − T²/rc     rc − 1



TWO-WAY ANOVA, EXAMPLE. Random Effects Model 1.

• DATA:                Variable I: Analyst
Variable II: Lot    A     B     C     Row Sum
1                   20    27    25    72
2                    1     7    15    23
3                   16    28    30    74
4                   14    27    29    70
Column Total        51    89    99    239

• SSc = 51²/4 + 89²/4 + 99²/4 − 239²/12 = 320.6667
• SSr = 72²/3 + 23²/3 + 74²/3 + 70²/3 − 239²/12 = 602.9167
• SSt = 20² + 1² + 16² + . . . + 30² + 29² − 239²/12 = 974.9167


TWO-WAY ANOVA, EXAMPLE. Random Effects Model 2.

• TABLE:
Source             SS         df   MS         Expected MS
Btw. Anyl. (SSc)   320.6667    2   160.3334   σ² + 4σ²c
Btw. Lots (SSr)    602.9167    3   200.9722   σ² + 3σ²r
Residual (SS)       51.3333    6     8.5556   σ²
Total (SSt)        974.9167   11

• Lot effect:     Fr = 200.9722/8.5556 = 23.49    F(0.05; 3, 6) = 4.76
• Analyst effect: Fc = 160.3334/8.5556 = 18.74    F(0.05; 2, 6) = 5.14
• SS (residual) = 974.9167 − 602.9167 − 320.6667 = 51.3333

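A compact sketch of the same two-way (no-replicate) calculation, assuming NumPy is available:

```python
import numpy as np

# Rows = lots, columns = analysts
y = np.array([[20, 27, 25],
              [ 1,  7, 15],
              [16, 28, 30],
              [14, 27, 29]], dtype=float)
r, c = y.shape
T = y.sum()

ss_total = (y ** 2).sum() - T ** 2 / (r * c)
ss_col = (y.sum(axis=0) ** 2).sum() / r - T ** 2 / (r * c)   # between analysts
ss_row = (y.sum(axis=1) ** 2).sum() / c - T ** 2 / (r * c)   # between lots
ss_res = ss_total - ss_col - ss_row

ms_col = ss_col / (c - 1)
ms_row = ss_row / (r - 1)
ms_res = ss_res / ((c - 1) * (r - 1))
print(round(ms_row / ms_res, 2), round(ms_col / ms_res, 2))  # F for lots ~23.5, analysts ~18.7
```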


4 Factors, High (+) and Low (-): 16 Experiments (2⁴)
(Actually the levels could be yes/no, present/not present.)

A:  + + + + - + + + - - - + - - - -
B:  + + + - + + - - + + - - + - - -
C:  + + - + + - + - + - + - - + - -
D:  + - + + + - - + - + + - - - + -
Eff.


2 Factors, High (+), Middle (0), & Low (-)


32 Checks for Curvature.

1 2 3 4 5 6 7 8 9

A + + + 0 0 0 - - -

B + 0 - + 0 - + 0 -
Effect



MATHEMATICAL MODEL

• Suppose you have three variables (usually studied at high and low points, but they could be yes/no). So there are 8 measurements (2³):
Yijk = Ȳ + Ai + Bj + Ck + ABij + ACik + BCjk + ABCijk
– Yijk = single measured value; i, j, k = low & high levels.
– Ȳ = overall average of the 8 measurements.
– Ai, Bj, Ck = main effects (factors directly affecting Yijk).
– ABij, ACik, BCjk = two-factor interactions (not explained by the main effects).
– ABCijk = three-factor interaction (not explained by the two-factor interactions and main effects).
– There are 8 equations, one for each measurement. A sketch of how the effects are estimated from the 8 results follows below.
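A minimal sketch of that decomposition for a 2³ design, assuming NumPy; the eight response values are hypothetical, each effect is the usual "mean at (+) minus mean at (-)" contrast, and the model terms (Ai, ABij, ...) are half of those effects:

```python
import numpy as np
from itertools import product

# Full 2^3 design: columns of -1/+1 for factors A, B, C (standard order)
design = np.array(list(product([-1, 1], repeat=3)))   # 8 runs x 3 columns
A, B, C = design.T

# Hypothetical responses for the 8 runs
y = np.array([60., 72., 54., 68., 52., 83., 45., 80.])

def effect(sign_col):
    """Main-effect or interaction estimate: mean at (+) minus mean at (-)."""
    return y[sign_col == 1].mean() - y[sign_col == -1].mean()

effects = {
    "A": effect(A), "B": effect(B), "C": effect(C),
    "AB": effect(A * B), "AC": effect(A * C), "BC": effect(B * C),
    "ABC": effect(A * B * C),
}
print({k: round(v, 2) for k, v in effects.items()})

# Half of each effect, plus the overall average, reproduces all 8 measurements
# exactly -- the "8 equations" of the slide.
y_hat = y.mean() + 0.5 * (effects["A"] * A + effects["B"] * B + effects["C"] * C
                          + effects["AB"] * A * B + effects["AC"] * A * C
                          + effects["BC"] * B * C + effects["ABC"] * A * B * C)
print(np.allclose(y_hat, y))   # True
```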

SCREENING STUDIES

• You have a BIG problem!


– Most people bail out and say that it can’t be done or
their boss won’t allow them to do it.
• The solution is to reduce the number of studies
that will be required.
• We already know that DOE methods require the
minimum number of studies.
• Screening studies allow you to screen out the non-
critical parameters.



Popular Methods for Screening Studies

• The two most popular methods for screening studies are the Taguchi and Plackett-Burman methods.
• The choice of method appears to depend upon
personal preferences based on past training and
experience.
– There are some arguments that say that their use
should depend on the circumstances.
– Consult a real statistician to choose the method and
apply the method to your processes.
– Depending on the complexity of the process, application of these methods can save many thousands of dollars.

Plackett-Burman Designs

• Use N measurements to study N-1 main effects.


• Their use for checking Robustness or
Ruggedness in Assay validations has been
recommended by Torbeck (Pharm. Tech.,
3/1996, pp. 168), the ASTM (ASTM, E 1169-89),
and the NBS (J. Res. NBS 1986: 91, 3 & 9)
• In a Plackett-Burman design it is assumed that
the interactions are minimal and really have no
influence on the main effects.
• This is what you hope for, but can you prove it?


N = 8, 7-Factor Plackett-Burman Design (Yates-Youden)

FACTORS (overall average of the responses ≈ 2993)
Row   A  B  C  D  E  F  G   Eff.
1     -  -  -  -  -  -  -   2904
2     -  -  +  -  +  +  +   3015
3     -  +  -  +  -  +  +   3006
4     -  +  +  +  +  -  -   2964
5     +  -  -  +  +  -  +   2999
6     +  -  +  +  -  +  -   3055
7     +  +  -  -  +  +  -   3049
8     +  +  +  -  -  -  +   2949


CALCULATING EFFECTS
(Taken from NBS Publications)
• The effect is the average of the results at the (+) level minus the average of the results at the (-) level.
• Thus the effect of A is: (2999 + 3055 + 3049 + 2949)/4 – (2904 + 3015 + 3006 + 2964)/4 = 3013 – 2972 = +41
• A was the effect of temperature (30° vs. 25°) in milliunits of pH, so A = +0.041 pH units.
• B was: (3006 + 2964 + 3049 + 2949)/4 – (2904 + 3015 + 2999 + 3055)/4 = 2992 – 2993.25 = –1.25, or –0.00125 pH unit.
• B was the effect of stirring during the pH measurement, given as ON (+) or OFF (–).

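The same arithmetic for all seven columns can be done in a few lines (a minimal sketch, assuming NumPy; the design and responses are those of the N = 8 table above):

```python
import numpy as np

# N = 8 Plackett-Burman (Yates-Youden) design: rows are runs, columns A..G
signs = np.array([
    [-1, -1, -1, -1, -1, -1, -1],
    [-1, -1, +1, -1, +1, +1, +1],
    [-1, +1, -1, +1, -1, +1, +1],
    [-1, +1, +1, +1, +1, -1, -1],
    [+1, -1, -1, +1, +1, -1, +1],
    [+1, -1, +1, +1, -1, +1, -1],
    [+1, +1, -1, -1, +1, +1, -1],
    [+1, +1, +1, -1, -1, -1, +1],
])
y = np.array([2904, 3015, 3006, 2964, 2999, 3055, 3049, 2949], dtype=float)

# Effect of each factor = mean of the (+) runs minus mean of the (-) runs
effects = signs.T @ y / (len(y) / 2)
for name, eff in zip("ABCDEFG", effects):
    print(name, round(eff, 2))      # A is about +40.75 milli-pH, B about -1.25
```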


Plackett-Burman Considerations

• The initial assumption is that all of the main effects are "orthogonal," i.e., independent of each other.
• Each of the effects may be calculated by taking
the average of its (+) and subtracting the average
of its (-) results. Look at the calculations for A
and B on the preceding slide.
• If you have too many factors you can group them
by their expected influence on the effects.
• If you don’t have enough, you can repeat factors
to obtain more data.


Judging the Significance of the Effects.

• Scientific judgement: in the case of the B effect on the pH measurements, just looking at the result may tell you that it is not significant.
• Another way is given by Torbeck (See Pharm. Tech.
Reference), using a normal probability plot of the effects.
• Another method uses the t-test given in the papers from
the NBS. This will require knowledge of the standard
deviation of the test method or the process (You should
have these from either the process or the method
validation study.)
• Is the observed effect greater than what might be
expected from the known variance?
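One way to carry out a t-test of this kind is sketched below (assuming SciPy; the standard deviation s and its degrees of freedom are hypothetical values that, in practice, would come from the method or process validation). The standard error of a Plackett-Burman effect from N runs is 2s/√N, since each effect is a difference between two averages of N/2 results:

```python
from math import sqrt
from scipy.stats import t

N = 8                      # number of runs in the design
s, df_s = 12.0, 20         # hypothetical: SD (in milli-pH) and its df from validation

se_effect = 2 * s / sqrt(N)          # standard error of an effect
t_crit = t.ppf(0.975, df_s)          # two-sided 5% critical value

for name, eff in [("A", 40.75), ("B", -1.25)]:   # effects from the previous slide
    t_obs = eff / se_effect
    print(name, round(t_obs, 2),
          "significant" if abs(t_obs) > t_crit else "not significant")
```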


2- & 3-Factor Interactions and Confounding of


the N = 8, 7 Factor Design

A B C D E F G
-BD -AD -AE -AB -AC -AG -AF
-CE -CF -BF -CG -BG -BC -BE
-FG -EG -DG -EF -DF -DE -CD
BCG ACG ABG ACF ABF ABE ABC
BEF AEF ADF AEG ADG ACD ADE
CDF CDE BDE BCE BCD BDG BDF
DEG DFG EFG BFG CFG CEG CEF


How To Have Confounding

A   B   D   BD
-   -   -   +
-   -   -   +
-   +   +   +
-   +   +   +
+   -   +   -
+   -   +   -
+   +   -   -
+   +   -   -

• The BD column is generated via the multiplication rule for signs.
• Thus A = -BD.
• This creates "confounding": in this experiment A cannot be distinguished from -BD, -CE, -FG, BCG, BEF, CDF or DEG.

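The multiplication rule is easy to verify numerically (a small sketch, assuming NumPy, using the A, B and D columns of the N = 8 design):

```python
import numpy as np

A = np.array([-1, -1, -1, -1, +1, +1, +1, +1])
B = np.array([-1, -1, +1, +1, -1, -1, +1, +1])
D = np.array([-1, -1, +1, +1, +1, +1, -1, -1])

BD = B * D                       # element-wise multiplication rule for signs
print(np.array_equal(A, -BD))    # True: A is confounded with -BD
```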


Benefits of Confounding

• The multiple confounding with the interactions creates an interesting situation.
• In the N=8, 7-Factor study, each of the main
effects is confounded with 15 possible
interactions.
– There are 3 2-factor, 4 3-factor, 4 4-factor, 3 5-factor,
and 1 6-factor interactions.
– Usually the higher order interactions are assumed to
be small or weak (not always true!)
– B was confounded with –AD, -CF, -EG, ACG, AEF,
CDE & DFG. (The 2- & 3-factor interactions)
– So if B was small and negligible? . . . . ?

Now Look at the Interactions That Can Be Eliminated!
A B C D E F G
Temp Stir Dil.Y/N Depth +NaNO3 +/-KCl Equil.
-BD -AD -AE -AB -AC -AG -AF
-CE -CF -BF -CG -BG -BC -BE
-FG -EG -DG -EF -DF -DE -CD
BCG ACG ABG ACF ABF ABE ABC
BEF AEF ADF AEG ADG ACD ADE
CDF CDE BDE BCE BCD BDG BDF
DEG DFG EFG BFG CFG CEG CEF



Other Considerations

• To fully evaluate 7 main effects and all of the interactions would require a full factorial study of 2⁷ = 128 experiments. (This is still better than OFAT.)
• If you use the confounding to eliminate
interactions, you can reduce the number of
interactions that would need to be studied in
follow-up studies.
• If you know interactions exist do not use the
factors in combination in a Plackett-Burman.


More Considerations

• If you suspect interactions, call in a real statistician who knows something about processing.
• Use the confounding to remove possible
interactions, and reduce the size of your factorial
experiments.
• Each design has a different confounding pattern.
• People have used interactions to reduce costs in
manufacturing, by using non-linear effects.



REPRODUCIBILITY. I.
• REPRODUCIBILITY: . . . expresses the precision between laboratories (collaborative studies, usually applied to the standardization of methodology), quantitated as an exact value.
– One set of samples distributed to multiple sites.
– Usually not run if there is only one laboratory, but this
is unrealistic.
– There is always a second laboratory – AT THE FDA!
– In the event of a product liability suit, it will be
suspicious if the test only works well in the company’s
hands, and the results cannot be reproduced at an
independent test facility.

Multiple Averages (ANOVA)

N = 25; df = 4 for labs
Σyij = 376   Grand Ave = 15.04
H0: all labs are equal

Lab    Replicates                  Totals   Ave
       1    2    3    4    5       yi.      ȳi.
A      7    7   15   11    9        49       9.8
B     12   17   12   18   18        77      15.4
C     14   18   18   19   19        88      17.6
D     19   25   22   19   23       108      21.6
E      7   10   11   15   11        54      10.8
                              Σ =  376      15.04


ANOVA for Labs: Sum of Squares

$$SS_{Total} = \sum_{i=1}^{5}\sum_{j=1}^{5} y_{ij}^2 - \frac{(\sum y_{ij})^2}{N} = (7)^2 + (7)^2 + (15)^2 + \dots + (15)^2 + (11)^2 - \frac{(376)^2}{25} = 636.96$$

$$SS_{Treat} = \frac{1}{n}\sum_{i=1}^{5} (y_{i\cdot})^2 - \frac{(\sum y_{ij})^2}{N} = \frac{1}{5}\left[(49)^2 + \dots + (54)^2\right] - \frac{(376)^2}{25} = 475.76$$

$$SS_E = SS_{Total} - SS_{Treat} = 636.96 - 475.76 = 161.20$$


ANOVA Table: Labs

F0 = MSTreat/MSError
F0.05, 4, 20 = 2.87 from F tables
F0 = 14.76 by calculation
The probability (p) is < 1% that the differences among labs are due to replicate error alone.

Source of Variation   Sum of Sqrs   Deg. Freed.   Mean Sq.   F0
Treatments (Labs)     475.76          4           118.94     14.76
Error (Replicates)    161.20         20             8.06
Total                 636.96         24                      p < 0.01
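The table can be checked directly (a minimal sketch, assuming SciPy):

```python
from scipy.stats import f_oneway

labs = {
    "A": [7, 7, 15, 11, 9],
    "B": [12, 17, 12, 18, 18],
    "C": [14, 18, 18, 19, 19],
    "D": [19, 25, 22, 19, 23],
    "E": [7, 10, 11, 15, 11],
}
res = f_oneway(*labs.values())
print(round(res.statistic, 2), res.pvalue)   # F is about 14.76, p well below 0.01
```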


Duncan's Multiple Range Test

• ANOVA told us that the Lab averages were significantly different.
• Fortunately the number of replicates was equal in every lab.
• The standard error of each Lab average is:

$$S_{\bar{y}_i} = \sqrt{\frac{MS_{Error}}{n}} = \sqrt{\frac{8.06}{5}} = 1.27$$

n = 5 because each lab average is based on 5 replicates.
There are 20 error degrees of freedom.

Least Significant Ranges

• Ri = Sȳi [r0.05(p, df)]. The term r0.05(p, df) is taken from Duncan's table. [Duncan, D.B., "Multiple Range and Multiple F Tests." Biometrics 11, 1-42 (1955).]
• R2 = (2.95)(1.27) = 3.75    r0.05(2, 20) = 2.95
• R3 = (3.10)(1.27) = 3.94    r0.05(3, 20) = 3.10
• R4 = (3.18)(1.27) = 4.04    r0.05(4, 20) = 3.18
• R5 = (3.25)(1.27) = 4.13    r0.05(5, 20) = 3.25
• Rank the treatment (lab) averages in increasing order:
• ȳ1 = 9.8   ȳ5 = 10.8   ȳ2 = 15.4   ȳ3 = 17.6   ȳ4 = 21.6
• p = 2, . . ., 5



Lab Comparisons
• 4 vs 1: 21.6 – 9.8 = 11.8 > 4.13 (R5)
• 4 vs 5: 21.6 – 10.8 = 10.8 > 4.04 (R4)
• 4 vs 2: 21.6 – 15.4 = 6.2 > 3.94 (R3)
• 4 vs 3: 21.6 – 17.6 = 4.0 > 3.75 (R2)
• 3 vs 1: 17.6 – 9.8 = 7.8 > 4.04 (R4)
• 3 vs 5: 17.6 – 10.8 = 6.8 > 3.94 (R3)
• 3 vs 2: 17.6 – 15.4 = 2.2 < 3.75 (R2)*
• 2 vs 1: 15.4 – 9.8 = 5.6 > 3.94 (R3)
• 2 vs 5: 15.4 – 10.8 = 4.6 > 3.75 (R2)
• 5 vs 1: 10.8 – 9.8 = 1.0 < 3.75 (R2)*
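A sketch of the whole comparison procedure (assuming NumPy; the r0.05(p, 20) values are the tabulated Duncan constants quoted above):

```python
import numpy as np
from itertools import combinations

means = {"1": 9.8, "5": 10.8, "2": 15.4, "3": 17.6, "4": 21.6}   # lab averages
mse, n = 8.06, 5                                                  # from the ANOVA table
r_tab = {2: 2.95, 3: 3.10, 4: 3.18, 5: 3.25}                      # Duncan r0.05(p, 20)

se = np.sqrt(mse / n)                          # standard error of a lab average
R = {p: r * se for p, r in r_tab.items()}      # least significant ranges R2..R5

ordered = sorted(means, key=means.get)         # labs ranked by average
for i, j in combinations(range(len(ordered)), 2):
    a, b = ordered[i], ordered[j]
    span = j - i + 1                           # number of means covered by the comparison
    diff = means[b] - means[a]
    verdict = "different" if diff > R[span] else "not different"
    print(f"{b} vs {a}: {diff:.1f} vs R{span}={R[span]:.2f} -> {verdict}")
```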

Differences

Lab 1   Lab 5   Lab 2   Lab 3   Lab 4
9.8     10.8    15.4    17.6    21.6

At the level of a 5% probability of making a Type 1 error, labs 1 and 5 are not different and labs 2 and 3 are not different.
Can you guess the possible cause?



Modified Table of Significant Ranges
Copied from Montgomery, D.C., "Design and Analysis of Experiments." 5th Ed., John Wiley, 2001.

df        p = number of means in the range
Error     2      3      4      5      6      7      8      9
9         3.20   3.34   3.41   3.47   3.50   3.52   3.52   3.52
12        3.03   3.23   3.33   3.36   3.40   3.42   3.44   3.44
15        3.01   3.16   3.25   3.31   3.36   3.38   3.40   3.42
20        2.95   3.10   3.18   3.25   3.30   3.34   3.36   3.38
30        2.89   3.04   3.12   3.20   3.25   3.29   3.32   3.35
60        2.83   2.98   3.08   3.14   3.20   3.24   3.28   3.31
100       2.80   2.95   3.05   3.12   3.18   3.22   3.26   3.29


• BONUS MATERIAL

• CREATION OF LARGE PLACKETT-BURMAN DESIGNS



Plackett-Burman for N = 12, 11 Factors

Run   A  B  C  D  E  F  G  H  I  J  K
1     +  +  -  +  +  +  -  -  -  +  -
2     -  +  +  -  +  +  +  -  -  -  +
3     +  -  +  +  -  +  +  +  -  -  -
4     -  +  -  +  +  -  +  +  +  -  -
5     -  -  +  -  +  +  -  +  +  +  -
6     -  -  -  +  -  +  +  -  +  +  +
7     +  -  -  -  +  -  +  +  -  +  +
8     +  +  -  -  -  +  -  +  +  -  +
9     +  +  +  -  -  -  +  -  +  +  -
10    -  +  +  +  -  -  -  +  -  +  +
11    +  -  +  +  +  -  -  -  +  -  +
12    -  -  -  -  -  -  -  -  -  -  -

How to Make Other Plackett-Burmans (Larger ones are known)

• First lines (N is an integer multiple of 4):
• N = 4:  + + -
• N = 8:  + + + - + - -
• N = 12: + + - + + + - - - + -
• N = 16: + + + + - + - + + - - + - - -
• N = 20: + + - - + + + + - + - + - - - - + + -
– For the selected N, write down the first line. Shift the line one place to the right, placing the last sign of row 1 into the first place of row 2, and then fill in the rest of the line. Continue to do this, except that the last row is all minus signs. A small generator following this rule is sketched below.

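A minimal sketch of that construction (plain Python; the first-line strings are the ones listed above):

```python
FIRST_LINES = {
    4: "++-",
    8: "+++-+--",
    12: "++-+++---+-",
    16: "++++-+-++--+---",
    20: "++--++++-+-+----++-",
}

def plackett_burman(n):
    """Build the N-run design by cyclically shifting the first line; last row is all '-'."""
    row = list(FIRST_LINES[n])
    design = []
    for _ in range(n - 1):
        design.append(row[:])
        row = [row[-1]] + row[:-1]       # move the last sign to the front
    design.append(["-"] * (n - 1))       # final row of minus signs
    return design

for r in plackett_burman(8):
    print(" ".join(r))
```

Note that the rows generated this way may appear in a different run order than the N = 8 table shown earlier, but the same set of runs results.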
