State | Region | County | PWSID | PWS Name | Facility ID | Sample Point ID | Source | Population | Type | Size | Contaminant | Unit | Sample Month | MRL (ug/L) | Result

AK | | | AK2211423 | Elmendorf Air Force Base | | | SWP | 13100 | CWS | Large | No Data
AK | | | AK2212039 | Fort Richardson | | | SW | 11500 | CWS | Large | No Data
AK | | | AK2310918 | Fort Wainwright | | | GW | 12000 | CWS | Large | No Data
AL | 04 | dekalb | AL0000509 | Fort Payne Waterworks Board | 00001T | 0509001 | SW | 18735 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dekalb | AL0000509 | Fort Payne Waterworks Board | 00001T | 0509001 | SW | 18735 | CWS | Large | Perchlorate | ug/L | 2002_04Apr | 4 | <MRL
AL | 04 | dekalb | AL0000509 | Fort Payne Waterworks Board | 00001T | 0509001 | SW | 18735 | CWS | Large | Perchlorate | ug/L | 2002_10Oct | 4 | <MRL
AL | 04 | dekalb | AL0000509 | Fort Payne Waterworks Board | 00001T | 0509001 | SW | 18735 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_12Dec | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_12Dec | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2002_03Mar | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2002_03Mar | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_09Sep | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_06Jun | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_06Jun | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_09Sep | 4 | <MRL
AL | 04 | russell | AL0001137 | Fort Mitchell Water System | 00004T | 1137003 | GW | 4674 | CWS | M | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | russell | AL0001137 | Fort Mitchell Water System | 00004T | 1137003 | GW | 4674 | CWS | M | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
AL | 04 | russell | AL0001137 | Fort Mitchell Water System | 00003T | 1137002 | GW | 4674 | CWS | M | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | russell | AL0001137 | Fort Mitchell Water System | 00003T | 1137002 | GW | 4674 | CWS | M | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
AL | 04 | russell | AL0001137 | Fort Mitchell Water System | 00002T | 1137001 | GW | 4674 | CWS | M | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00005T | 1489004 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00003T | 1489002 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00004T | 1489003 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00007T | 1489006 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00008T | 1489007 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00006T | 1489005 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00003T | 1489002 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00004T | 1489003 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00005T | 1489004 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00007T | 1489006 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00008T | 1489007 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00006T | 1489005 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AR | | | AR0000507 | Fort Smith Waterworks | | | SW | 72798 | CWS | VL | No Data
AZ | | | AZ0402078 | Fort Huachuca | | | GW | 15603 | CWS | Large | No Data
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00011 | 08N/10W-02G01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00007 | 09N/10W-16P03 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00008 | 09N/10W-16R02 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00016 | 08N/10W-01C01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00012 | 09N/10W-24G02 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00014 | 09N/10W-36L01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00010 | 08N/10W-02M01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00013 | 09N/10W-24E03 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00015 | 09N/10W-36P03 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | | | CA1610700 | Lemoore Naval Air Station | | | SW | 10363 | CWS | Large | No Data
CA | | | CA2710705 | Camp Roberts - California National Guard | | | GW | 20370 | CWS | Large | No Data
CA | | | CA3410700 | McClellan Air Force Base - Main Base | | | GW | 17600 | NTNC | Large | No Data
CA | | | CA3610703 | USMC Twentynine Palms | | | GW | 17000 | CWS | Large | No Data
CA | | | CA3610705 | Fort Irwin | | | GW | 13092 | CWS | Large | No Data
CA | | | CA3710700 | Camp Pendleton (North) | | | GW | 11620 | CWS | Large | No Data
CA | | | CA3710702 | Camp Pendleton (South) | | | GW | 33000 | CWS | Large | No Data
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00016 | 08N/34W-16G05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00003 | 08N/34W-16G04 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00001 | 08N/34W-16C05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00001 | 08N/34W-16C05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00002 | 08N/34W-16J02 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00003 | 08N/34W-16G04 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00016 | 08N/34W-16G05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00016 | 08N/34W-16G05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00001 | 08N/34W-16C05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00002 | 08N/34W-16J02 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00003 | 08N/34W-16G04 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | solano | CA4810015 | Travis Air Force Base - Vallejo | 00003 | 4810015-003 | SW | 32000 | CWS | Large | Perchlorate | ug/L | 2001_01Jan | 4 | <MRL
CA | 09 | solano | CA4810015 | Travis Air Force Base - Vallejo | 00003 | 4810015-003 | SW | 32000 | CWS | Large | Perchlorate | ug/L | 2001_07Jul | 4 | <MRL
CA | 09 | solano | CA4810015 | Travis Air Force Base - Vallejo | 00003 | 4810015-003 | SW | 32000 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
CA | | | CA4810701 | Travis Air Force Base | | | GW | 13837 | CWS | Large | No Data
FL | | | FL1170814 | Corry Field - Naval Air Station | | | GW | 17192 | CWS | Large | No Data
FL | 04 | duval | FL2160734 | Mayport Naval Station (Mainside) | 08001 | 32228 | GW | 14642 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
FL | 04 | duval | FL2160734 | Mayport Naval Station (Mainside) | 08001 | 32228 | GW | 14642 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08001 | 32212 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08001 | 32212 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08002 | 83868 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08002 | 83868 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
FL | | | FL4560490 | Fort Pierce Utilities Authority | | | GW | 45100 | CWS | Large | No Data
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 01472 | 301 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03468 | 303 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 01472 | 301 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03687 | 304 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03795 | 305 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 02965 | 302 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03468 | 303 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03687 | 304 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 02965 | 302 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL
GA | | | GA2150002 | Fort Benning | | | SW | 44000 | CWS | Large | No Data
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_04Apr | 4 | <MRL
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_09Sep | 4 | <MRL
GU | | | GU0000010 | U.S. Navy Water System | | | SW | 14300 | CWS | Large | No Data
IA | 07 | lee | IA5625062 | Fort Madison Municipal Waterworks | 04840 | 02 | SW | 11618 | CWS | Large | Perchlorate | ug/L | 2001_03Mar | 4 | <MRL
IA | 07 | lee | IA5625062 | Fort Madison Municipal Waterworks | 04840 | 02 | SW | 11618 | CWS | Large | Perchlorate | ug/L | 2001_11Nov | 4 | <MRL
IA | 07 | lee | IA5625062 | Fort Madison Municipal Waterworks | 04840 | 02 | SW | 11618 | CWS | Large | Perchlorate | ug/L | 2001_12Dec | 4 | <MRL
IA | 07 | lee | IA5625062 | Fort Madison Municipal Waterworks | 04840 | 02 | SW | 11618 | CWS | Large | Perchlorate | ug/L | 2001_06Jun | 4 | <MRL
IA | 07 | webster | IA9433050 | Fort Dodge Water Supply | 06471 | 02 | GW | 25894 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
IA | 07 | webster | IA9433050 | Fort Dodge Water Supply | 06470 | 03 | GW | 25894 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
IA | 07 | webster | IA9433050 | Fort Dodge Water Supply | 06472 | 01 | GW | 25894 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
IA | 07 | webster | IA9433050 | Fort Dodge Water Supply | 06472 | 01 | GW | 25894 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
IA | 07 | webster | IA9433050 | Fort Dodge Water Supply | 06470 | 03 | GW | 25894 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
IA | 07 | webster | IA9433050 | Fort Dodge Water Supply | 06471 | 02 | GW | 25894 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_11Nov | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_08Aug | 4 | <MRL
IN | 05 | gibson | IN5226001 | Fort Branch Water Department | 00001T | 901 | GW | 3591 | CWS | M | Perchlorate | ug/L | 2002_11Nov | 4 | <MRL
IN | 05 | gibson | IN5226001 | Fort Branch Water Department | 00001T | 901 | GW | 3591 | CWS | M | Perchlorate | ug/L | 2002_05May | 4 | <MRL
KS | | | KS2006114 | FORT RILEY | | | GW | 18000 | CWS | Large | No Data
KY | | | KY0470990 | FORT KNOX / ENGINEERING & HOUSING | | | SW | 42400 | CWS | Large | No Data
LA | | | LA1115065 | South Fort Polk | | | GW | 21500 | CWS | Large | No Data
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_01Jan | 4 | <MRL
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2002_10Oct | 4 | <MRL
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_04Apr | 4 | <MRL
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
MD | | | MD0120002 | Aberdeen Proving Ground - Chapel Hill | | | SW | 12002 | CWS | Large | No Data
MD | | | MD0180022 | Patuxent Naval Air Station (NAWCAD) | | | GW | 11722 | CWS | Large | No Data
MO | | | MO3079500 | Fort Leonard Wood | | | SW | 24000 | CWS | Large | No Data
NC | 04 | cumberland | NC0326344 | FORT BRAGG | 00004 | EP1 | SW | 65000 | CWS | VL | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
NC | 04 | cumberland | NC0326344 | FORT BRAGG | 00004 | EP1 | SW | 65000 | CWS | VL | Perchlorate | ug/L | 2003_01Jan | 4 | <MRL
NC | 04 | cumberland | NC0326344 | FORT BRAGG | 00004 | EP1 | SW | 65000 | CWS | VL | Perchlorate | ug/L | 2002_10Oct | 4 | <MRL
NC | | | NC0467041 | USMC LEJEUNE-HADNOT POINT | | | GW | 35000 | CWS | Large | No Data
NC | | | NC0467042 | USMC Lejeune - New River Air Station | | | GW | 11500 | CWS | Large | No Data
NC | | | NC0467043 | USMC LEJEUNE-HOLCOMB BLVD. | | | GW | 17000 | CWS | Large | No Data
NJ | | | NJ0325001 | FORT DIX | | | SW | 14500 | CWS | Large | No Data
NM | | | NM3567701 | KIRTLAND AIR FORCE BASE | | | GW | 30000 | CWS | Large | No Data
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00013 | EP1 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00014 | EP2 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00013 | EP1 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00014 | EP2 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | 17.2
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00014 | EP3 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00013 | EP1 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00013 | EP1 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00016 | EP2 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00016 | EP2 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00014 | EP3 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00015 | EP4 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00015 | EP4 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
TN | | | TN0000820 | FORT CAMPBELL WATER SYSTEM | | | SW | 40000 | CWS | Large | No Data
TX | | | TX0140107 | South Fort Hood | | | SWP | 38139 | CWS | Large | No Data
TX | | | TX0150113 | KELLY AIR FORCE BASE | | | GW | 21500 | CWS | Large | No Data
TX | | | TX0150114 | LACKLAND AIR FORCE BASE | | | GW | 20640 | CWS | Large | No Data
TX | | | TX0150115 | RANDOLPH AIR FORCE BASE | | | GW | 11820 | CWS | Large | No Data
TX | | | TX0150116 | FORT SAM HOUSTON | | | GW | 17000 | CWS | Large | No Data
TX | | | TX0710020 | FORT BLISS MAIN BASE AREA | | | GW | 21800 | CWS | Large | No Data
TX | | | TX2430007 | SHEPPARD AIR FORCE BASE | | | SWP | 12810 | CWS | Large | No Data
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00009T | 0602409 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00002T | 0602402 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00008T | 0602408 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00002T | 0602402 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_05May | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00008T | 0602408 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_05May | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00009T | 0602409 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_05May | 4 | <MRL
VA | | | VA6153675 | QUANTICO MARINE BASE - MAINSIDE | | | SW | 13012 | CWS | Large | No Data
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70090 | 7009C | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70510 | 7051CL | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70510 | 7051CL | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70090 | 7009C | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
WA | | | WA5324350 | FAIRCHILD AIR FORCE BASE | | | GW | 11227 | CWS | Large | No Data
WA | | | WA5326050 | FORT LEWIS WATER - CANTONMENT | | | GW | 37577 | CWS | Large | No Data
WI | | | WI1280103 | Fort Atkinson Waterworks | | | GW | 10227 | CWS | Large | No Data

PWS count: 61
PWS w/o data count: 40
DoD UCMR DATA FROM DANIEL HAUTMAN, UCMR IMPLEMENTATION TEAM CO-LEADER, OFFICE OF GROUNDWATER AND DRINKING WATER, EPA

State | Region | County | PWSID | PWS Name | Facility ID | Sample Point ID | Source | Population | Type | Size | Contaminant | Unit | Sample Month | MRL (ug/L) | Result

Air Force UCMR Data From EPA UCMR POC
AK | | | AK2211423 | Elmendorf Air Force Base | | | SWP | 13100 | CWS | Large | No Data
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00011 | 08N/10W-02G01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00007 | 09N/10W-16P03 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00008 | 09N/10W-16R02 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00016 | 08N/10W-01C01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00012 | 09N/10W-24G02 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00014 | 09N/10W-36L01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00010 | 08N/10W-02M01 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00013 | 09N/10W-24E03 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | 09 | kern | CA1510701 | Edwards Air Force Base - Main Base | 00015 | 09N/10W-36P03 | SWP | 17486 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
CA | | | CA3410700 | McClellan Air Force Base - Main Base | | | GW | 17600 | NTNC | Large | No Data
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00016 | 08N/34W-16G05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00003 | 08N/34W-16G04 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00001 | 08N/34W-16C05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00001 | 08N/34W-16C05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00002 | 08N/34W-16J02 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00003 | 08N/34W-16G04 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00016 | 08N/34W-16G05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2002_12Dec | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00016 | 08N/34W-16G05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00001 | 08N/34W-16C05 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00002 | 08N/34W-16J02 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
CA | 09 | santa barbara | CA4210700 | Vandenberg Air Force Base | 00003 | 08N/34W-16G04 | SWP | 12000 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL

Page 1 of 8

CA | 09 | solano | CA4810015 | Travis Air Force Base - Vallejo | 00003 | 4810015-003 | SW | 32000 | CWS | Large | Perchlorate | ug/L | 2001_01Jan | 4 | <MRL
CA | 09 | solano | CA4810015 | Travis Air Force Base - Vallejo | 00003 | 4810015-003 | SW | 32000 | CWS | Large | Perchlorate | ug/L | 2001_07Jul | 4 | <MRL
CA | 09 | solano | CA4810015 | Travis Air Force Base - Vallejo | 00003 | 4810015-003 | SW | 32000 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
CA | | | CA4810701 | Travis Air Force Base | | | GW | 13837 | CWS | Large | No Data
NM | | | NM3567701 | KIRTLAND AIR FORCE BASE | | | GW | 30000 | CWS | Large | No Data
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00013 | EP1 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00014 | EP2 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00013 | EP1 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903312 | WRIGHT-PATTERSON AFB, 'B' | 00014 | EP2 | GW | 12045 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | 17.2
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00014 | EP3 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00013 | EP1 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00013 | EP1 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00016 | EP2 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00016 | EP2 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00014 | EP3 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00015 | EP4 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
OH | 05 | greene | OH2903412 | WRIGHT-PATTERSON AFB AREA A/C | 00015 | EP4 | GW | 15160 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
TX | | | TX0150113 | KELLY AIR FORCE BASE | | | GW | 21500 | CWS | Large | No Data
TX | | | TX0150114 | LACKLAND AIR FORCE BASE | | | GW | 20640 | CWS | Large | No Data

Page 2 of 8

TX | | | TX0150115 | RANDOLPH AIR FORCE BASE | | | GW | 11820 | CWS | Large | No Data
TX | | | TX2430007 | SHEPPARD AIR FORCE BASE | | | SWP | 12810 | CWS | Large | No Data
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00009T | 0602409 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00002T | 0602402 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00008T | 0602408 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_10Oct | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00002T | 0602402 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_05May | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00008T | 0602408 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_05May | 4 | <MRL
UT | 08 | davis | UT4900513 | HILL AIR FORCE BASE | 00009T | 0602409 | SWP | 22082 | CWS | Large | Perchlorate | ug/L | 2001_05May | 4 | <MRL
WA | | | WA5324350 | FAIRCHILD AIR FORCE BASE | | | GW | 11227 | CWS | Large | No Data

Page 3 of 8

Navy/Marine Corps UCMR Data From EPA UCMR POC
CA | | | CA1610700 | Lemoore Naval Air Station | | | SW | 10363 | CWS | Large | No Data
CA | | | CA3610703 | USMC Twentynine Palms | | | GW | 17000 | CWS | Large | No Data
CA | | | CA3710700 | Camp Pendleton (North) | | | GW | 11620 | CWS | Large | No Data
CA | | | CA3710702 | Camp Pendleton (South) | | | GW | 33000 | CWS | Large | No Data
FL | | | FL1170814 | Corry Field - Naval Air Station | | | GW | 17192 | CWS | Large | No Data
FL | 04 | duval | FL2160734 | Mayport Naval Station (Mainside) | 08001 | 32228 | GW | 14642 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
FL | 04 | duval | FL2160734 | Mayport Naval Station (Mainside) | 08001 | 32228 | GW | 14642 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08001 | 32212 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08001 | 32212 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08002 | 83868 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
FL | 04 | duval | FL2161212 | Jacksonville Naval Air Station | 08002 | 83868 | GW | 20000 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL
GU | | | GU0000010 | U.S. Navy Water System | | | SW | 14300 | CWS | Large | No Data
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_02Feb | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_11Nov | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
IL | 05 | lake | IL0975227 | GREAT LAKES NAVAL TRAINING STATION | 16173 | TAP_01 | SW | 31637 | CWS | Large | Perchlorate | ug/L | 2003_08Aug | 4 | <MRL
MD | | | MD0180022 | Patuxent Naval Air Station (NAWCAD) | | | GW | 11722 | CWS | Large | No Data
NC | | | NC0467041 | USMC LEJEUNE-HADNOT POINT | | | GW | 35000 | CWS | Large | No Data
NC | | | NC0467042 | USMC Lejeune - New River Air Station | | | GW | 11500 | CWS | Large | No Data
NC | | | NC0467043 | USMC LEJEUNE-HOLCOMB BLVD. | | | GW | 17000 | CWS | Large | No Data

Page 4 of 8

VA | | | VA6153675 | QUANTICO MARINE BASE - MAINSIDE | | | SW | 13012 | CWS | Large | No Data
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70090 | 7009C | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70510 | 7051CL | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_05May | 4 | <MRL
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70510 | 7051CL | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL
WA | 10 | kitsap | WA5302714 | SUBASE BANGOR | 70090 | 7009C | GW | 15843 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL

Page 5 of 8

Army UCMR Data From EPA UCMR POC
AK | | | AK2212039 | Fort Richardson | | | SW | 11500 | CWS | Large | No Data
AK | | | AK2310918 | Fort Wainwright | | | GW | 12000 | CWS | Large | No Data
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_12Dec | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_12Dec | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2002_03Mar | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2002_03Mar | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_09Sep | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_06Jun | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00001T | 0899001 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_06Jun | 4 | <MRL
AL | 04 | madison | AL0000899 | US Army Aviation & Missile Command | 00002T | 0899002 | SW | 21180 | CWS | Large | Perchlorate | ug/L | 2001_09Sep | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00005T | 1489004 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00003T | 1489002 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00004T | 1489003 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00007T | 1489006 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00008T | 1489007 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00006T | 1489005 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00003T | 1489002 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00004T | 1489003 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00005T | 1489004 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL
AL | 04 | dale | AL0001489 | Fort Rucker | 00007T | 1489006 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL

Page 6 of 8

AL | 04 | dale | AL0001489 | Fort Rucker | 00008T | 1489007 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL | 0 | 0
AL | 04 | dale | AL0001489 | Fort Rucker | 00006T | 1489005 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL | 0 | 0
AZ | | | AZ0402078 | Fort Huachuca | | | GW | 15603 | CWS | Large | No Data
CA | | | CA2710705 | Camp Roberts - California National Guard | | | GW | 20370 | CWS | Large | No Data
CA | | | CA3610705 | Fort Irwin | | | GW | 13092 | CWS | Large | No Data
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 01472 | 301 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03468 | 303 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 01472 | 301 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03687 | 304 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03795 | 305 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 02965 | 302 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03468 | 303 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_01Jan | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 03687 | 304 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL | 0 | 0
GA | 04 | liberty | GA1790024 | Fort Stewart - Main | 02965 | 302 | GW | 21000 | CWS | Large | Perchlorate | ug/L | 2002_06Jun | 4 | <MRL | 0 | 0
GA | | | GA2150002 | Fort Benning | | | SW | 44000 | CWS | Large | No Data
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL | 0 | 0
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_04Apr | 4 | <MRL | 0 | 0
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_12Dec | 4 | <MRL | 0 | 0
GA | 04 | richmond | GA2450028 | Fort Gordon | 02054 | 301 | SW | 24000 | CWS | Large | Perchlorate | ug/L | 2003_09Sep | 4 | <MRL | 0 | 0
KS | | | KS2006114 | Fort Riley | | | GW | 18000 | CWS | Large | No Data
KY | | | KY0470990 | FORT KNOX / ENGINEERING & HOUSING | | | SW | 42400 | CWS | Large | No Data
LA | | | LA1115065 | South Fort Polk | | | GW | 21500 | CWS | Large | No Data
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_01Jan | 4 | <MRL | 0 | 0
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2002_10Oct | 4 | <MRL | 0 | 0
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_04Apr | 4 | <MRL | 0 | 0
MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL | 1 | 0
MD | | | MD0120002 | Aberdeen Proving Ground - Chapel Hill | | | SW | 12002 | CWS | Large | No Data
MO | | | MO3079500 | Fort Leonard Wood | | | SW | 24000 | CWS | Large | No Data
NC | 04 | cumberland | NC0326344 | FORT BRAGG | 00004 | EP1 | SW | 65000 | CWS | VL | Perchlorate | ug/L | 2002_07Jul | 4 | <MRL | 0 | 0
NC | 04 | cumberland | NC0326344 | FORT BRAGG | 00004 | EP1 | SW | 65000 | CWS | VL | Perchlorate | ug/L | 2003_01Jan | 4 | <MRL | 1 | 0
NJ | | | NJ0325001 | FORT DIX | | | SW | 14500 | CWS | Large | No Data
TN | | | TN0000820 | FORT CAMPBELL WATER SYSTEM | | | SW | 40000 | CWS | Large | No Data
TX | | | TX0140107 | South Fort Hood | | | SWP | 38139 | CWS | Large | No Data
TX | | | TX0150116 | FORT SAM HOUSTON | | | GW | 17000 | CWS | Large | No Data
TX | | | TX0710020 | FORT BLISS MAIN BASE AREA | | | GW | 21800 | CWS | Large | No Data
WA | | | WA5326050 | FORT LEWIS WATER - CANTONMENT | | | GW | 37577 | CWS | Large | No Data
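A minimal sketch of how rows in the pipe-delimited table above can be parsed and tallied. The two sample rows are transcribed from the table; the lowercase field names (for example `detect`, `pws_wo_data`) are naming assumptions based on the table header, not identifiers from the original report.

```python
# Sketch: parse pipe-delimited UCMR rows and count results below the
# minimum reporting level (MRL). Sample rows only, not the full dataset.
from collections import Counter

rows = [
    "AL | 04 | dale | AL0001489 | Fort Rucker | 00005T | 1489004 | GW | 18000 | CWS | Large | Perchlorate | ug/L | 2002_02Feb | 4 | <MRL | 0 | 0",
    "MD | 03 | anne arundel | MD0020012 | FORT GEORGE G. MEADE | 00001 | 0100000 | SW | 50001 | CWS | VL | Perchlorate | ug/L | 2003_07Jul | 4 | <MRL | 1 | 0",
]

# Assumed field names, in the column order used by the table above.
FIELDS = ["state", "region", "county", "pwsid", "pws_name", "facility_id",
          "sample_point_id", "source_water", "population", "pws_type", "size",
          "contaminant", "unit", "sample_date", "mrl", "result",
          "detect", "pws_wo_data"]

def parse(row: str) -> dict:
    """Split one pipe-delimited row into a field-name -> value mapping."""
    return dict(zip(FIELDS, (cell.strip() for cell in row.split("|"))))

records = [parse(r) for r in rows]
below_mrl = Counter(rec["result"] for rec in records)["<MRL"]
print(below_mrl)  # 2: both sample rows are below the 4 ug/L reporting level
```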


SITES THAT APPEAR IN EPA's UCMR DATABASE
BUT NOT IN DoD's UCMR DATA

The following sites appear in EPA's UCMR database, but are NOT included in DoD's
consolidated UCMR spreadsheet. Most of these sites may not need to be included in
DoD's current UCMR data, since they report "No Data" in EPA's database under
"Result." However, four sites (shown below) report "<MRL" in EPA's database
under "Result," but do not appear in DoD's consolidated UCMR spreadsheet.

Result = <MRL

Great Lakes Naval Training Station
U.S. Army Aviation & Missile Command
Fort Rucker
Fort Bragg

Result = No Data

Elmendorf AFB
McClellan AFB - Main Base
Kirtland AFB
Kelly AFB
Randolph AFB
Sheppard AFB
Corry Field - Naval Air Station
U.S. Navy Water System (Guam)
Fort Richardson
Fort Wainwright
Camp Roberts - California National Guard
Fort Irwin
Fort Knox
South Fort Polk
Fort Campbell Water System
South Fort Hood
Fort Sam Houston
Fort Bliss Main Base Area

SITES THAT APPEAR IN DoD's UCMR DATA
BUT NOT IN EPA's UCMR DATABASE

The following sites are included in DoD's consolidated UCMR spreadsheet, but NOT
included in EPA's UCMR database.

Fort Monroe
Fort Drum
Commander U.S. Naval Forces, Marianas
Coronado Navbase
El Centro NAF
Marine Corps Logistics Base Barstow
NAVAIRWPNSTA CHINA LAKE CA
Navy Public Works Center (IL)
Navy Public Works Center, Pearl Harbor
Pensacola NAS
Barksdale AFB
Beale AFB
Cannon AFB
Davis-Monthan AFB
Shaw AFB
Hanscom AFB
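The two cross-checks above amount to set differences between the site names in EPA's UCMR database and those in DoD's consolidated spreadsheet. A minimal sketch, using a few illustrative site names rather than the full databases:

```python
# Sketch of the EPA-vs-DoD cross-check: which sites appear in one source
# but not the other. The two sets below are small illustrative excerpts.
epa_sites = {"Fort Rucker", "Fort Bragg", "Fort Irwin", "Fort Knox"}
dod_sites = {"Fort Rucker", "Fort Bragg", "Fort Monroe", "Fort Drum"}

in_epa_only = sorted(epa_sites - dod_sites)  # in EPA's database, missing from DoD's data
in_dod_only = sorted(dod_sites - epa_sites)  # in DoD's spreadsheet, missing from EPA's database

print(in_epa_only)  # ['Fort Irwin', 'Fort Knox']
print(in_dod_only)  # ['Fort Drum', 'Fort Monroe']
```

In practice the comparison would need the names normalized first (case, "AFB" vs "Air Force Base", and so on), since the two sources spell several installations differently.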

Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by the Environmental
Protection Agency

EPA/260R-02-008
October 2002

Guidelines for Ensuring and Maximizing
the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by
the Environmental Protection Agency

Prepared by:
U.S. Environmental Protection Agency
Office of Environmental Information (2810)
1200 Pennsylvania Avenue, NW
Washington, DC 20460

Also available via the Internet at:

http://www.epa.gov/oei/qualityguidelines/

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of
Information Disseminated by the Environmental Protection Agency

Table of Contents

1   Introduction
2   EPA Mission and Commitment to Quality
    2.1  EPA's Mission and Commitment to Public Access
    2.2  Information Management in EPA
    2.3  EPA's Relationship with State, Tribal, and Local Governments
3   OMB Guidelines
4   Existing Policies and Procedures that Ensure and Maximize Information Quality
    4.1  Quality System
    4.2  Peer Review Policy
    4.3  Action Development Process
    4.4  Integrated Error Correction Process
    4.5  Information Resources Management Manual
    4.6  Risk Characterization Policy and Handbook
    4.7  Program-Specific Policies
    4.8  EPA Commitment to Continuous Improvement
    4.9  Summary of New Activities and Initiatives
5   Guidelines Scope and Applicability
    5.1  What is "Quality" According to the Guidelines?
    5.2  What is the Purpose of these Guidelines?
    5.3  When Do these Guidelines Apply?
    5.4  What is Not Covered by these Guidelines?
    5.5  What Happens if Information is Initially Not Covered by these Guidelines, but
         EPA Subsequently Disseminates it to the Public?
    5.6  How does EPA Ensure the Objectivity, Utility, and Integrity of Information that is
         not Covered by these Guidelines?
6   Guidelines for Ensuring and Maximizing Information Quality
    6.1  How does EPA Ensure and Maximize the Quality of Disseminated Information?
    6.2  How Does EPA Define Influential Information for these Guidelines?
    6.3  How Does EPA Ensure and Maximize the Quality of "Influential" Information?
    6.4  How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk
         Assessment Information?
    6.5  How Does EPA Ensure and Maximize the Quality of Information from External
         Sources?
7   Administrative Mechanism for Pre-dissemination Review
    7.1  What are the Administrative Mechanisms for Pre-dissemination Reviews?
8   Administrative Mechanisms for Correction of Information
    8.1  What are EPA's Administrative Mechanisms for Affected Persons to Seek and
         Obtain Correction of Information?
    8.2  What Should be Included in a Request for Correction of Information?
    8.3  When Does EPA Intend to Consider a Request for Correction of Information?
    8.4  How Does EPA Intend to Respond to a Request for Correction of Information?
    8.5  How Does EPA Expect to Process Requests for Correction of Information on
         Which EPA has Sought Public Comment?
    8.6  What Should be Included in a Request Asking EPA to Reconsider its Decision on
         a Request for the Correction of Information?
    8.7  How Does EPA Intend to Process Requests for Reconsideration of EPA
         Decisions?
Appendix A: IQG Development Process and Discussion of Public Comments
    A.1  Introduction
    A.2  General Summary of Comments
    A.3  Response to Comments by Guidelines Topic Area
         A.3.1  Existing Policy
         A.3.2  Scope and Applicability
         A.3.3  Sources of Information
         A.3.4  Influential Information
         A.3.5  Reproducibility
         A.3.6  Influential Risk Assessment
         A.3.7  Complaint Resolution
    A.4  Next Steps

1   Introduction

The Environmental Protection Agency (EPA) is committed to providing public access to
environmental information. This commitment is integral to our mission to protect human health
and the environment. One of our goals is that all parts of society - including communities,
individuals, businesses, and State, Tribal, and local governments - have access to
accurate information sufficient to effectively participate in managing human health and
environmental risks. To fulfill this and other important goals, EPA must rely upon information
of appropriate quality for each decision we make.

Developed in response to guidelines issued by the Office of Management and Budget (OMB)1
under Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal
Year 2001 (Public Law 106-554; H.R. 5658), the Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by the Environmental
Protection Agency (the Guidelines) contain EPA's policy and procedural guidance for ensuring
and maximizing the quality of information we disseminate. The Guidelines also outline
administrative mechanisms for EPA pre-dissemination review of information products and
describe some new mechanisms to enable affected persons to seek and obtain corrections from
EPA regarding disseminated information that they believe does not comply with EPA or OMB
guidelines. Beyond policies and procedures, these Guidelines also incorporate the following
performance goals:

Disseminated information should adhere to a basic standard of quality, including
objectivity, utility, and integrity.

The principles of information quality should be integrated into each step of EPA's
development of information, including creation, collection, maintenance, and
dissemination.

Administrative mechanisms for correction should be flexible, appropriate to the
nature and timeliness of the disseminated information, and incorporated into
EPA's information resources management and administrative practices.

OMB encourages agencies to incorporate standards and procedures into existing information
resources management practices rather than create new, potentially duplicative processes. EPA
has taken this advice and relies on numerous existing quality-related policies in these Guidelines.
EPA will work to ensure seamless implementation into existing practices. It is expected that
EPA managers and staff will familiarize themselves with these Guidelines, and will carefully
review existing program policies and procedures in order to accommodate the principles outlined
in this document.

1 Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information
Disseminated by Federal Agencies, OMB, 2002 (67 FR 8452). Hereinafter "OMB guidelines".
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
Introduction

3


EPA's Guidelines are intended to carry out OMB's government-wide policy regarding
information we disseminate to the public. Our Guidelines reflect EPA's best effort to present our
goals and commitments for ensuring and maximizing the quality of information we disseminate.
As such, they are not a regulation and do not change or substitute for any legal requirements.
They provide non-binding policy and procedural guidance, and are therefore not intended to
create legal rights, impose legally binding requirements or obligations on EPA or the public
when applied in particular situations, or change or impact the status of information we
disseminate, nor to contravene any other legal requirements that may apply to particular agency
determinations or other actions. EPA's intention is to fully implement these Guidelines in order
to achieve the purposes of Section 515.
These Guidelines are the product of an open, collaborative process between EPA and numerous
EPA stakeholders. The Guidelines development process is described in the Appendix to this
document. EPA received many public comments and has addressed most comments in these
Guidelines. A discussion of public comments is also provided in the Appendix and is grouped by
overarching themes and comments by Guidelines topic areas. EPA views these Guidelines as a
living document, and anticipates their revision as we work to further ensure and maximize
information quality.


2   EPA Mission and Commitment to Quality

2.1  EPA's Mission and Commitment to Public Access

The mission of the EPA is to protect human health and safeguard the natural environment upon
which life depends. EPA is committed to making America's air cleaner, water purer, and land
better protected, and to working closely with its Federal, State, Tribal, and local government
partners; with citizens; and with the regulated community to accomplish its mission. In addition,
the United States plays a leadership role in working with other nations to protect the global
environment.
EPA's commitment to expanding and enhancing access to environmental information is
articulated in our Strategic Plan. EPA works every day to expand the public's right to know
about and understand their environment by providing and facilitating access to a wealth of
information about public health and local environmental issues and conditions. This enhances
citizen understanding and involvement and provides people with tools to protect their families
and their communities.
EPA statutory responsibilities to protect human health and safeguard the natural environment are
described in the statutes that mandate and govern our programs. EPA manages those programs in
concert with numerous other government and private sector partners. As Congress intended, each
statute provides regulatory expectations including information quality considerations and
principles. Some statutes are more specific than others, but overall, each directs EPA and other
agencies in how we regulate to protect human health and the environment. For example, the Safe
Drinking Water Act (SDWA) Amendments of 1996 set forth certain quality principles for how
EPA should conduct human health risk assessments and characterize the potential risks to
humans from drinking water contaminants. Information quality is a key component of every
statute that governs our mission.

2.2  Information Management in EPA

The collection, use, and dissemination of information of known and appropriate quality are
integral to ensuring that EPA achieves its mission. Information about human health and the
environment -- environmental characteristics; physical, chemical, and biological processes; and
chemical and other pollutants -- underlies all environmental management and health protection
decisions. The availability of, and access to, information and the analytical tools to understand it
are essential for assessing environmental and human health risks, designing appropriate and
cost-effective policies and response strategies, and measuring environmental improvements.
EPA works every day to ensure information quality, but we do not wait until the point of
dissemination to consider important quality principles. While the final review of a document
before it is published is very important to ensuring a product of high quality, we know that in
order to maximize quality, we must start much earlier. When you read an EPA report at your
local library or view EPA information on our web site, that information is the result of processes

undertaken by EPA and our partners that assured quality along each step of the way. To better
describe this interrelated information quality process, the following presents some of the major
roles that EPA plays in its effort to ensure and maximize the quality of the information:

EPA is a collector and generator of information: While most of our programs
rely on States, Tribes, or the private sector to collect and report information to
EPA, there ate some programs in which EPA collects its own information. One
example is the Agency's enforcement and compliance program, under which EPA
collects samples in the field or conducts onsite inspections. We also conduct
original, scientific research at headquarters, in Regional Offices, and at our
research laboratories to investigate and better understand how our environment
works, how humans react to chemical pollutants and other environmental
contaminants, and how to model our natural environment to assess the potential
impact of environmental management activities. Ensuring the quality of collected
information is central to our mission.

EPA is a recipient of information: EPA receives a large amount of information
that external parties volunteer or provide under statutory and other mandates.
Much of the environmental information submitted to EPA is processed and stored
in Agency information management systems. While we work to ensure and
maximize the integrity of that information through a variety of mechanisms and
policies, we have varying levels of quality controls over information developed or
collected by outside parties. This information generally falls into one of four
categories:

Information collected through contracts with EPA. Examples of this
information include studies and collection and analysis of data by parties
that are under a contractual obligation with EPA. Since EPA is responsible
for managing the work assigned to contractors, EPA has a relatively high
degree of control over the quality of this information.

Information collected through grants and cooperative agreements
with EPA. Examples of this information include scientific studies that are
performed under research grants and data collected by State agencies or
other grantees to assess regulatory compliance or environmental trends.
Although EPA has less control over grantees than contractors, EPA can
and does include conditions in grants and cooperative agreements
requiring recipients to meet certain criteria.

Information submitted to EPA as part of a requirement under a
statute, regulation, permit, order or other mandate. Examples of this
information include required test data for pesticides or chemicals, Toxics
Release Inventory (TRI) submissions and compliance information
submitted to EPA by States and the regulated community. EPA ensures

quality control of such information through regulatory requirements, such
as requiring samples to be analyzed by specific analytical procedures and
by certified laboratories. However, each EPA program has specific
statutory authorities which may affect its ability to impose certain quality
practices.

The final category of information that is not included in any of the above
three categories includes information that is either voluntarily
submitted to EPA in hopes of influencing a decision or that EPA
obtains for use in developing a policy, regulatory, or other decision.
Examples of this information include scientific studies published in
journal articles and test data obtained from other Federal agencies,
industry, and others. EPA may not have any financial ties or regulatory
requirements to control the quality of this type of information.

While the quality of information submitted to EPA is the responsibility of the
original collector of the information, we nevertheless maintain a robust quality
system that addresses information related to the first three bullets above by
including regulatory requirements for quality assurance for EPA contracts, grants,
and assistance agreements. For the fourth category, we intend to develop and
publish factors that EPA would use in the future to assess the quality of voluntary
submissions or information that the Agency gathers for its own use.

EPA is a user of information: Upon placement in our information management
systems, information becomes available for use by many people and systems.
EPA users may include Program managers, information product developers, or
automated financial tracking systems. Depending on the extent of public release,
users may also include city planners, homeowners, teachers, engineers, or
community activists, to name a few. To satisfy this broad spectrum of users, it is
critical that we present information in an unbiased context with thorough
documentation.
EPA is moving beyond routine administration of regulatory information and
working in concert with States and other stakeholders to provide new information
products that are responsive to identified users. Increasingly, information
products are derived from information originally collected to support State or
Federal regulatory programs or management activities. Assuring the suitability of
this information for new applications is of paramount importance.

EPA is a conduit for information: Another major role that EPA plays in the
management of information is as a provider of public access. Such access enables
public involvement in how EPA achieves its mission. We provide access to a
variety of information holdings. Some information distributed by EPA includes
information collected through contracts; information collected through grants and


cooperative agreements; information submitted to EPA as part of a requirement
under a statute, regulation, permit, order, or other mandate; and information that
is either voluntarily submitted to EPA in hopes of influencing a decision or that
EPA obtains for use in developing a policy, regulatory, or other decision. In some
cases, EPA serves as an important conduit for information generated by external
parties; however, the quality of that information is the responsibility of the
external information developer, unless EPA endorses or adopts it.

2.3  EPA's Relationship with State, Tribal, and Local Governments

As mentioned in the previous section, EPA works with a variety of partners to achieve its
mission. Our key government partners not only provide information, they also work with EPA to
manage and implement programs and communicate with the public about issues of concern. In
addition to implementing national programs through EPA Headquarters Program Offices, a vast
network of EPA Regions and other Federal, State, Tribal and local governments implement both
mandated and voluntary programs. This same network collects, uses, and distributes a wide
range of information. EPA plans to coordinate with these partners to ensure the Guidelines are
appropriate and effective.
One major mechanism to ensure and maximize information integrity is the National
Environmental Information Exchange Network (NEIEN, or Network). The result of an important
partnership between EPA, States and Tribal governments, the Network seeks to enhance the
Agency's information architecture to ensure timely and one-stop reporting from many of EPA's
information partners. Key components include the establishment of the Central Data Exchange
(CDX) portal and a System of Access for internal and external users. When fully implemented,
the Network and its many components will enhance EPA and the public's ability to access, use,
and integrate information and the ability of external providers to report to EPA.


3   OMB Guidelines

In Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year
2001 (Public Law 106-554; H.R. 5658), Congress directed OMB to issue government-wide
guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and
maximizing the quality, objectivity, utility, and integrity of information (including statistical
information) disseminated by Federal agencies...." The OMB guidelines direct agencies subject
to the Paperwork Reduction Act (44 U.S.C. 3502(1)) to:

Issue their own information quality guidelines to ensure and maximize the
quality, objectivity, utility, and integrity of information, including statistical
information, by no later than one year after the date of issuance of the OMB
guidelines;

Establish administrative mechanisms allowing affected persons to seek and obtain
correction of information maintained and disseminated by the agency that does
not comply with the OMB or agency guidelines; and

Report to the Director of OMB the number and nature of complaints received by
the agency regarding agency compliance with OMB guidelines concerning the
quality, objectivity, utility, and integrity of information and how such complaints
were resolved.

The OMB guidelines provide some basic principles for agencies to consider when developing
their own guidelines including:

Guidelines should be flexible enough to address all communication media and the
variety of scope and importance of information products.

Some agency information may need to meet higher or more specific expectations
for objectivity, utility, and integrity. Information of greater importance should be
held to a higher quality standard.

Ensuring and maximizing quality, objectivity, utility, and integrity comes at a
cost, so agencies should use an approach that weighs the costs and benefits of
higher information quality.

Agencies should adopt a common sense approach that builds on existing
processes and procedures. It is important that agency guidelines do not impose
unnecessary administrative burdens or inhibit agencies from disseminating
quality information to the public.

4   Existing Policies and Procedures that Ensure and Maximize Information Quality

EPA is dedicated to the collection, generation, and dissemination of high quality information.
We disseminate a wide variety of information products, ranging from comprehensive scientific
assessments of potential health risks,2 to web-based applications that provide compliance
information and map the location of regulated entities,3 to simple fact sheets for school children.4
As a result of this diversity of information-related products and practices, different EPA
programs have evolved specialized approaches to information quality assurance. The OMB
guidelines encourage agencies to avoid the creation of "new and potentially duplicative or
contradictory processes." Further, OMB stresses that its guidelines are not intended to "impose
unnecessary administrative burdens that would inhibit agencies from continuing to take
advantage of the Internet and other technologies to disseminate information that can be of great
benefit and value to the public." In this spirit, EPA seeks to foster the continuous improvement
of existing information quality activities and programs. In implementing these guidelines, we
note that ensuring the quality of information is a key objective alongside other EPA objectives,
such as ensuring the success of Agency missions, observing budget and resource priorities and
restraints, and providing useful information to the public. EPA intends to implement these
Guidelines in a way that will achieve all these objectives in a harmonious way in conjunction
with our existing guidelines and policies, some of which are outlined below. These examples
illustrate some of the numerous systems and practices in place that address the quality,
objectivity, utility, and integrity of information.

4.1  Quality System

The EPA Agency-wide Quality System helps ensure that EPA organizations maximize the
quality of environmental information, including information disseminated by the Agency. A
graded approach is used to establish quality criteria that are appropriate for the intended use of
the information and the resources available. The Quality System is documented in EPA Order
5360.1 A2, "Policy and Program Requirements for the Mandatory Agency-wide Quality
System" and the "EPA Quality Manual."5 To implement the Quality System, EPA organizations
(1) assign a quality assurance manager, or person assigned to an equivalent position, who has
sufficient technical and management expertise and authority to conduct independent oversight of
the implementation of the organization's quality system; (2) develop a Quality Management
Plan, which documents the organization's quality system; (3) conduct an annual assessment of
the organization's quality system; (4) use a systematic planning process to develop acceptance or
performance criteria prior to the initiation of all projects that involve environmental information

2 http://cfpub.epa.gov/ncea/cfm/partmatt.cfm

3 http://www.epa.gov/enviro/wme/

4 http://www.epa.gov/kids

5 EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf

Existing Policies and Procedures that Ensure and Maximize Information Quality


Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA

collection and/or use; (5) develop Quality Assurance Project Plan(s), or equivalent document(s),
for all applicable projects and tasks involving environmental data; (6) conduct an assessment of
existing data, when used to support Agency decisions or other secondary purposes, to verify that
they are of sufficient quantity and adequate quality for their intended use; (7) implement all
Agency-wide Quality System components in all applicable EPA-funded extramural agreements;
and (8) provide appropriate training for all levels of management and staff.
The EPA Quality System may also apply to non-EPA organizations, with key principles
incorporated in the applicable regulations governing contracts, grants, and cooperative
agreements. EPA Quality System provisions may also be invoked as part of negotiated
agreements such as memoranda of understanding. Non-EPA organizations that may be subject to
EPA Quality System requirements include (a) any organization or individual under direct
contract to EPA to furnish services or items or perform work (i.e., a contractor) under the
authority of 48 CFR part 46, (including applicable work assignments, delivery orders, and task
orders); and (b) other government agencies receiving assistance from EPA through interagency
agreements. Separate quality assurance requirements for assistance recipients are set forth in 40
CFR part 30 (governing assistance agreements with institutions of higher education, hospitals,
and other non-profit recipients of financial assistance) and 40 CFR parts 31 and 35 (government
assistance agreements with State, Tribal, and local governments).

4.2 Peer Review Policy

In addition to the Quality System, EPA's Peer Review Policy provides that major scientifically
and technically based work products (including scientific, engineering, economic, or statistical
documents) related to Agency decisions should be peer-reviewed. Agency managers within
Headquarters, Regions, laboratories, and field offices determine and are accountable for the
decision whether to employ peer review in particular instances and, if so, its character, scope,
and timing. These decisions are made consistent with program goals and priorities, resource
constraints, and statutory or court-ordered deadlines. For those work products that are intended
to support the most important decisions or that have special importance in their own right,
external peer review is the procedure of choice. For other work products, internal peer review is
an acceptable alternative to external peer review. Peer review is not restricted to the penultimate
version of work products; in fact, peer review at the planning stage can often be extremely
beneficial. The basis for EPA peer review policy is articulated in Peer Review and Peer
Involvement at the U.S. Environmental Protection Agency.6 The Peer Review Policy was first
issued in January 1993 and was updated in June 1994. In addition to the policy, EPA has
published a Peer Review Handbook,7 which provides detailed guidance for implementing the
policy. The handbook was last revised December 2000.

6 Peer Review and Peer Involvement at the U.S. EPA, June 7, 1994.
http://www.epa.gov/osp/spc/perevmem.htm
7 Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf


4.3 Action Development Process

The Agency's Action Development Process also serves to ensure and maximize the quality of
EPA disseminated information. Top Agency actions and Economically Significant actions as
designated under Executive Order 12866 are developed as part of the Agency's Action
Development Process. The Action Development Process ensures the early and timely
involvement of senior management at key decision milestones to facilitate the consideration of a
broad range of regulatory and non-regulatory options and analytic approaches. Of particular
importance to the Action Development Process is ensuring that our scientists, economists, and
others with technical expertise are appropriately involved in determining needed analyses and
research, identifying alternatives, and selecting options. Program Offices and Regional Offices
are invited to participate to provide their unique perspectives and expertise. Effective
consultation with policy advisors (e.g., Senior Policy Council, Science Policy Council),
co-regulators (e.g., States, Tribes, and local governments), and stakeholders is also part of the
process. Final Agency Review (FAR) generally takes place before the release of substantive
information associated with these actions. The FAR process ensures the consistency of any
policy determinations, as well as the quality of the information underlying each policy
determination and its presentation.

4.4 Integrated Error Correction Process

The Agency's Integrated Error Correction Process8 (IECP) is a process by which members of the
public can notify EPA of a potential data error in information EPA distributes or disseminates.
This process builds on existing data processes through which discrete, numerical errors in our
data systems are reported to EPA. The IECP has made these tools more prominent and easier to
use. Individuals who identify potential data errors on the EPA web site can contact us through
the IECP by using the "Report Error" button or error correction hypertext found on major data
bases throughout EPA's web site. EPA reviews the error notification and assists in bringing the
notification to resolution with those who are responsible for the data within or outside the
Agency, as appropriate. The IECP tracks this entire process from notification through final
resolution.

8 Integrated Error Correction Process for Environmental Data.
http://www.epa.gov/cdx/iecp.html
4.5 Information Resources Management Manual

The EPA Information Resources Management (IRM) Manual9 articulates and describes many of
our information development and management procedures and policies, including information
security, data standards, records management, information collection, and library services.
Especially important in the context of the Guidelines provided in this document, the IRM
Manual describes how we maintain and ensure information integrity. We believe that
maintaining information integrity refers to keeping information "unaltered, i.e., free from
unauthorized or accidental modification or destruction. These integrity principles apply to all
information. Inappropriately changed or modified data or software impacts information integrity
and compromises the value of the information system. Because of the importance of EPA's
information to the decisions made by the Agency, its partners, and the public, it is our
responsibility to ensure that the information is, and remains, accurate and credible."

Beyond addressing integrity concerns, the IRM Manual also includes Agency policy on public
access and records management. These are key chapters that enable EPA to ensure transparency
and the reproducibility of information.

4.6 Risk Characterization Policy and Handbook

The EPA Risk Characterization Policy and Handbook10 provide guidance for risk
characterization that is designed to ensure that critical information from each stage of a risk
assessment is used in forming conclusions about risk.
and products that are clear, consistent and reasonable. The Handbook is designed to provide risk
assessors, risk managers, and other decision-makers an understanding of the goals and principles
of risk characterization.

4.7 Program-Specific Policies

We mentioned just a few of the Agency's major policies that ensure and maximize the quality of
information we disseminate. In addition to these Agency-wide systems and procedures, Program
Offices and Regions implement many Office-level and program-specific procedures to ensure
and maximize information quality. The purpose of these Guidelines is to serve as a common
thread that ties all these policies together under the topics provided by OMB: objectivity,
integrity and utility. EPA's approach to ensuring and maximizing quality is necessarily
distributed across all levels of EPA's organizational hierarchy, including Offices, Regions,
divisions, projects, and even products. Oftentimes, there are different quality considerations for
different types of products. For example, the quality principles associated with a risk assessment

9 EPA Directive 2100 Information Resources Management Policy Manual.
http://www.epa.gov/irmpoli8/polman/
10 Risk Characterization Handbook, U.S. EPA, Science Policy Council, December 2000.
http://www.epa.gov/osp/spc/2riskchr.htm


differ from those associated with developing a new model. The Agency currently has a
comprehensive but distributed system of policies to address such unique quality considerations.
These Guidelines provide us with a mechanism to help coordinate and synthesize our quality
policies and procedures.

4.8 EPA Commitment to Continuous Improvement

As suggested above, we will continue to work to ensure that our many policies and procedures
are appropriately implemented, synthesized, and revised as needed. One way to build on
achievements and learn from mistakes is to document lessons learned about specific activities or
products. For example, the documents that present guidance and tools for implementing the
Quality System are routinely subjected to external peer review during their development;
comments from the reviewers are addressed and responses reviewed by management before the
document is issued. Each document is formally reviewed every five years and is either reissued,
revised as needed, or rescinded. If important new information or approaches evolve between
reviews, the document may be reviewed and revised more frequently.

4.9 Summary of New Activities and Initiatives

In response to OMB's guidelines, EPA recognizes that it will be incorporating new policies and
administrative mechanisms. As we reaffirm our commitment to our existing policies and
procedures that ensure and maximize quality, we also plan to address the following new areas of
focus and commitment:

Working with the public to develop assessment factors that we will use to assess
the quality of information developed by external parties, prior to EPA's use of
that information.

Affirming a new commitment to information quality, especially the transparency
of information products.

Establishing an Agency-wide correction process and request for reconsideration
panel to provide a centralized point of access for all affected parties to seek and
obtain the correction of disseminated information that they believe does not
conform to these Guidelines or the OMB guidelines.


5 Guidelines Scope and Applicability

5.1 What is "Quality" According to the Guidelines?

Consistent with the OMB guidelines, EPA is issuing these Guidelines to ensure and maximize
the quality, including objectivity, utility, and integrity, of disseminated information. Objectivity,
integrity, and utility are defined here, consistent with the OMB guidelines. "Objectivity" focuses
on whether the disseminated information is being presented in an accurate, clear, complete, and
unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. "Integrity"
refers to security, such as the protection of information from unauthorized access or revision, to
ensure that the information is not compromised through corruption or falsification. "Utility"
refers to the usefulness of the information to the intended users.

5.2 What is the Purpose of these Guidelines?

The collection, use, and dissemination of information of known and appropriate quality is
integral to ensuring that EPA achieves its mission. Information about the environment and
human health underlies all environmental management decisions. Information and the analytical
tools to understand it are essential for assessing environmental and human health risks, designing
appropriate and cost-effective policies and response strategies, and measuring environmental
improvements.
These Guidelines describe EPA's policy and procedures for reviewing and substantiating the
quality of information before EPA disseminates it. They describe our administrative mechanisms
for enabling affected persons to seek and obtain, where appropriate, correction of information
disseminated by EPA that they believe does not comply with EPA or OMB guidelines.

5.3 When Do these Guidelines Apply?

These Guidelines apply to "information" EPA disseminates to the public. "Information," for
purposes of these Guidelines, generally includes any communication or representation of
knowledge such as facts or data, in any medium or form. Preliminary information EPA
disseminates to the public is also considered "information" for the purposes of the Guidelines.
Information generally includes material that EPA disseminates from a web page. However, not
all web content is considered "information" under these Guidelines (e.g., certain information
from outside sources that is not adopted, endorsed, or used by EPA to support an Agency
decision or position).
For purposes of these Guidelines, EPA disseminates information to the public when EPA
initiates or sponsors the distribution of information to the public.

EPA initiates a distribution of information if EPA prepares the information and
distributes it to support or represent EPA's viewpoint, or to formulate or support a
regulation, guidance, or other Agency decision or position.


EPA initiates a distribution of information if EPA distributes information
prepared or submitted by an outside party in a manner that reasonably suggests
that EPA endorses or agrees with it; if EPA indicates in its distribution that the
information supports or represents EPA's viewpoint; or if EPA in its distribution
proposes to use or uses the information to formulate or support a regulation,
guidance, policy, or other Agency decision or position.

Agency-sponsored distribution includes instances where EPA reviews and
comments on information distributed by an outside party in a manner that
indicates EPA is endorsing it, directs the outside party to disseminate it on EPA's
behalf, or otherwise adopts or endorses it.

EPA intends to use notices to explain the status of information, so that users will be aware of
whether the information is being distributed to support or represent EPA's viewpoint.

5.4 What is Not Covered by these Guidelines?

If an item is not considered "information," these Guidelines do not apply. Examples of items that
are not considered information include Internet hyperlinks and other references to information
distributed by others, and opinions, where EPA's presentation makes it clear that what is being
offered is someone's opinion rather than fact or EPA's views.

"Dissemination" for the purposes of these Guidelines does not include distributions of
information that EPA does not initiate or sponsor. Below is a sample of various types of
information that would not generally be considered disseminated by EPA to the public:

Distribution of information intended only for government employees (including
intra- or interagency use or sharing) or recipients of government contracts, grants,
or cooperative agreements. Intra-agency use of information includes use of
information pertaining to basic agency operations, such as management,
personnel, and organizational information.

EPA's response to requests for agency records under the Freedom of Information
Act (FOIA), the Privacy Act, the Federal Advisory Committee Act (FACA), or
other similar laws.

Distribution of information in correspondence directed to individuals or persons
(i.e., any individual, group, or entity, including any government or political
subdivision thereof, or Federal governmental component/unit).

Information of an ephemeral nature, such as press releases, fact sheets, press
conferences, and similar communications, in any medium that advises the public
of an event or activity or announces information EPA has disseminated


elsewhere; interviews, speeches, and similar communications that EPA does not
disseminate to the public beyond their original context, such as by placing them
on the Internet. If a speech, press release, or other "ephemeral" communication is
about an information product disseminated elsewhere by EPA, the product itself
will be covered by these Guidelines.

Information presented to Congress as part of the legislative or oversight
processes, such as testimony of officials, information, or drafting assistance
provided to Congress in connection with pending or proposed legislation, unless
EPA simultaneously disseminates this information to the public.

Background information such as published articles distributed by libraries or by
other distribution methods that do not imply that EPA has adopted or endorsed
the materials. This includes outdated or superseded EPA information that is
provided as background information but no longer reflects EPA policy or
influences EPA decisions, where the outdated or superseded nature of such
material is reasonably apparent from its form of presentation or date of issuance,
or where EPA indicates that the materials are provided as background materials
and do not represent EPA's current view.

These Guidelines do not apply to information distributed by recipients of EPA
contracts, grants, or cooperative agreements, unless the information is
disseminated on EPA's behalf, as when EPA specifically directs or approves the
dissemination. These Guidelines do not apply to the distribution of any type of
research by Federal employees and recipients of EPA funds, where the researcher
(not EPA) decides whether and how to communicate and publish the research,
does so in the same manner as his or her academic colleagues, and distributes the
research in a manner that indicates it does not necessarily represent EPA's official
position (for example, by including an appropriate disclaimer). The Guidelines do
not apply even if EPA retains ownership or other intellectual property rights
because the Federal government paid for the research.
Distribution of information in public filings to EPA, including information
submitted to EPA by any individual or person (as discussed above), either
voluntarily or under mandates or requirements (such as filings required by
statutes, regulations, orders, permits, or licenses). The Guidelines do not apply
where EPA distributes this information simply to provide the public with quicker
and easier access to materials submitted to EPA that are publicly available. This
will generally be the case so long as EPA is not the author, and is not endorsing,
adopting, using, or proposing to use the information to support an Agency
decision or position.

Distribution of information in documents filed in or prepared specifically for a
judicial case or an administrative adjudication and intended to be limited to such


actions, including information developed during the conduct of any criminal or
civil action or administrative enforcement action, investigation, or audit involving
an agency against specific parties.
5.5 What Happens if Information is Initially Not Covered by these Guidelines, but EPA Subsequently Disseminates it to the Public?

If a particular distribution of information is not covered by these Guidelines, the Guidelines may
still apply to a subsequent dissemination of the information in which EPA adopts, endorses, or
uses the information to formulate or support a regulation, guidance, or other Agency decision or
position. For example, if EPA simply makes a public filing (such as facility data required by
regulation) available to the public, these Guidelines would not apply to that distribution of
information. However, if EPA later includes the information in a background document in
support of a rulemaking, these Guidelines would apply to that later dissemination of the
information in that document.

5.6 How does EPA Ensure the Objectivity, Utility, and Integrity of Information that is not covered by these Guidelines?

These Guidelines apply only to information EPA disseminates to the public, outlined in section
5.3, above. Other information distributed by EPA that is not covered by these Guidelines is still
subject to all applicable EPA policies, quality review processes, and correction procedures.
These include quality management plans for programs that collect, manage, and use
environmental information, peer review, and other procedures that are specific to individual
programs and, therefore, not described in these Guidelines. It is EPA's policy that all of the
information it distributes meets a basic standard of information quality, and that its utility,
objectivity, and integrity be scaled and appropriate to the nature and timeliness of the planned
and anticipated uses. Ensuring the quality of EPA information is not necessarily dependent on
any plans to disseminate the information. EPA continues to produce, collect, and use information
that is of the appropriate quality, irrespective of these Guidelines or the prospects for
dissemination of the information.


6 Guidelines for Ensuring and Maximizing Information Quality

6.1 How does EPA Ensure and Maximize the Quality of Disseminated Information?

EPA ensures and maximizes the quality of the information we disseminate by implementing well
established policies and procedures within the Agency as appropriate to the information product.
There are many tools that the Agency uses, such as the Quality System,11 review by senior
management, the peer review process,12 the communications product review process,13 the web
guide,14 and the error correction process.15 Beyond our internal quality management system, EPA
also ensures the quality of information we disseminate by seeking input from experts and the
general public. EPA consults with groups such as the Science Advisory Board and the Science
Advisory Panel, in addition to seeking public input through public comment periods and by
hosting public meetings.
For the purposes of the Guidelines, EPA recognizes that if data and analytic results are subjected
to formal, independent, external peer review, the information may generally be presumed to be
of acceptable objectivity. However, this presumption of objectivity is rebuttable. The Agency
uses a graded approach and uses these tools to establish the appropriate quality, objectivity,
utility, and integrity of information products based on the intended use of the information and
the resources available. As part of this graded approach, EPA recognizes that some of the
information it disseminates includes influential scientific, financial, or statistical information,
and that this category should meet a higher standard of quality.

6.2 How Does EPA Define Influential Information for these Guidelines?

"Influential," when used in the phrase "influential scientific, financial, or statistical
information," means that the Agency can reasonably determine that dissemination of the
information will have or does have a clear and substantial impact (i.e., potential change or effect)
on important public policies or private sector decisions.16 For the purposes of the EPA's
11 EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf

12 Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf

13 EPA's Print and Web Communications Product Review Guide. http://www.epa.gov/dced/pdf/review.pdf

14 Web Guide. U.S. EPA. http://www.epa.gov/webguide/resources/webserv.html

15 Integrated Error Correction Process. http://www.epa.gov/cdx/iecp.html

16 The term "clear and substantial impact" is used as part of a definition to distinguish different categories of
information for purposes of these Guidelines. EPA does not intend the classification of information under this
definition to change or impact the status of the information in any other setting, such as for purposes of determining
whether the dissemination of the information is a final Agency action.


Information Quality Guidelines, EPA will generally consider the following classes of
information to be influential, and, to the extent that they contain scientific, financial, or statistical
information, that information should adhere to a rigorous standard of quality:

Information disseminated in support of top Agency actions (i.e., rules, substantive
notices, policy documents, studies, guidance) that demand the ongoing
involvement of the Administrator's Office and extensive cross-Agency
involvement; issues that have the potential to result in major cross-Agency or
cross-media policies, are highly controversial, or provide a significant opportunity
to advance the Administrator's priorities. Top Agency actions usually have
potentially great or widespread impacts on the private sector, the public or state,
local or tribal governments. This category may also include precedent-setting or
controversial scientific or economic issues.

Information disseminated in support of Economically Significant actions as
defined in Executive Order 12866, entitled Regulatory Planning and Review (58
FR 51735, October 4, 1993), Agency actions that are likely to have an annual
effect on the economy of $100 million or more or adversely affect in a material
way the economy, a sector of the economy, productivity, competition, jobs, the
environment, public health or safety, or State, Tribal, or local governments or
communities.

Major work products undergoing peer review as called for under the Agency's
Peer Review Policy. Described in the Science Policy Council Peer Review
Handbook, the EPA Peer Review Policy regards major scientific and technical
work products as those that have a major impact, involve precedential, novel,
and/or controversial issues, or the Agency has a legal and/or statutory obligation
to conduct a peer review. These major work products are typically subjected to
external peer review. Some products that may not be considered "major" under
the EPA Peer Review Policy may be subjected to external peer review but EPA
does not consider such products influential for purposes of these Guidelines.

Case-by-case: The Agency may make determinations of what constitutes
"influential information" beyond those classes of information already identified
on a case-by-case basis for other types of disseminated information that may have
a clear and substantial impact on important public policies or private sector
decisions.

6.3 How Does EPA Ensure and Maximize the Quality of "Influential" Information?

EPA recognizes that influential scientific, financial, or statistical information should be subject
to a higher degree of quality (for example, transparency about data and methods) than
information that may not have a clear and substantial impact on important public policies or
private sector decisions. A higher degree of transparency about data and methods will facilitate

the reproducibility of such information by qualified third parties, to an acceptable degree of
imprecision. For disseminated influential original and supporting data, EPA intends to ensure
reproducibility according to commonly accepted scientific, financial, or statistical standards. It is
important that analytic results for influential information have a higher degree of transparency
regarding (1) the source of the data used, (2) the various assumptions employed, (3) the analytic
methods applied, and (4) the statistical procedures employed. It is also important that the degree
of rigor with which each of these factors is presented and discussed be scaled as appropriate, and
that all factors be presented and discussed. In addition, if access to data and methods cannot
occur due to compelling interests such as privacy, trade secrets, intellectual property, and other
confidentiality protections, EPA should, to the extent practicable, apply especially rigorous
robustness checks to analytic results and carefully document all checks that were undertaken.
Original and supporting data may not be subject to the high and specific degree of transparency
provided for analytic results; however, EPA should apply, to the extent practicable, relevant
Agency policies and procedures to achieve reproducibility, given ethical, feasibility, and
confidentiality constraints.
Several Agency-wide and Program- and Region-specific policies and processes that EPA uses to
ensure and maximize the quality of environmental data, including disseminated information
products, would also apply to information considered "influential" under these Guidelines.
Agency-wide processes of particular importance to ensure the quality, objectivity, and
transparency of "influential" information include the Agency's Quality System, Action
Development Process, Peer Review Policy, and related procedures. Many "influential"
information products may be subject to more than one of these processes.

6.4 How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk Assessment Information?

EPA conducts and disseminates a variety of risk assessments. When evaluating environmental
problems or establishing standards, EPA must comply with statutory requirements and mandates
set by Congress based on media (air, water, solid, and hazardous waste) or other environmental
interests (pesticides and chemicals). Consistent with EPA's current practices, application of these
principles involves a "weight-of-evidence" approach that considers all relevant information and
its quality, consistent with the level of effort and complexity of detail appropriate to a particular
risk assessment. In our dissemination of influential scientific information regarding human
health, safety17 or environmental18 risk assessments, EPA will ensure, to the extent practicable

17 "Safety risk assessment" describes a variety of analyses, investigations, or case studies conducted by EPA
to respond to environmental emergencies. For example, we work to ensure that the chemical industry and state and
local entities take action to prevent, plan and prepare for, and respond to chemical emergencies through the
development and sharing of information, tools, and guidance for hazards analyses and risk assessment.
18 Because the assessment of "environmental risk" is being distinguished from "human health risk," the term
"environmental risk" as used in these Guidelines does not directly involve human health concerns. In other words, an
"environmental risk assessment" is in this case the equivalent to what EPA commonly calls an "ecological risk
assessment."

Guidelines for Ensuring and Maximizing Information Quality


Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA

and consistent with Agency statutes and existing legislative regulations, the objectivity19 of such
information disseminated by the Agency by applying the following adaptation of the quality
principles found in the Safe Drinking Water Act20 (SDWA) Amendments of 1996:21
(A) The substance of the information is accurate, reliable and unbiased. This involves the use of:
(i) the best available science and supporting studies conducted in accordance with
sound and objective scientific practices, including, when available, peer reviewed
science and supporting studies; and
(ii) data collected by accepted methods or best available methods (if the reliability of
the method and the nature of the decision justifies the use of the data).

(B) The presentation of information on human health, safety, or environmental risks,
consistent with the purpose of the information, is comprehensive, informative, and
understandable. In a document made available to the public, EPA specifies:
(i) each population addressed by any estimate of applicable human health risk or
each risk assessment endpoint, including populations if applicable, addressed by
any estimate of applicable ecological risk22;
(ii) the expected risk or central estimate of human health risk for the specific
19 OMB stated in its guidelines that in disseminating information agencies shall develop a process for
reviewing the quality of the information. "Quality" includes objectivity, utility, and integrity. "Objectivity" involves
two distinct elements, presentation and substance. Guidelines for Ensuring and Maximizing the Quality, Objectivity,
Utility, and Integrity of Information Disseminated by Federal Agencies, OMB, 2002. (67 FR 8452)
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf

20 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B)
21 The exception is risk assessments conducted under SDWA, which will adhere to the SDWA principles as
amended in 1996.
22 Agency assessments of human health risks necessarily focus on populations. Agency assessments of
ecological risks address a variety of entities, some of which can be described as populations and others (such as
ecosystems) which cannot. The phrase "assessment endpoint" is intended to reflect the broader range of interests
inherent in ecological risk assessments. As discussed in the EPA Guidelines for Ecological Risk Assessment (found
at http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=12460), assessment endpoints are explicit expressions of the
actual environmental value that is to be protected, operationally defined by an ecological entity and its attributes.
Furthermore, those Guidelines explain that an ecological entity can be a species (e.g., eelgrass, piping plover), a
community (e.g., benthic invertebrates), an ecosystem (e.g., wetland), or other entity of concern. An attribute of an
assessment endpoint is the characteristic about the entity of concern that is important to protect and potentially at
risk. Examples of attributes include abundance (of a population), species richness (of a community), or function (of
an ecosystem). Assessment endpoints and ecological risk assessments are discussed more fully in those Guidelines
as well as other EPA sources such as Ecological Risk Assessment Guidance for Superfund: Process for Designing
and Conducting Ecological Risk Assessments - Interim Final, found at
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm



populations affected or the ecological assessment endpoints23, including
populations if applicable;
(iii) each appropriate upper-bound or lower-bound estimate of risk;
(iv) each significant uncertainty identified in the process of the assessment of risk and
studies that would assist in resolving the uncertainty; and
(v) peer-reviewed studies known to the Administrator that support, are directly
relevant to, or fail to support any estimate of risk and the methodology used to
reconcile inconsistencies in the scientific data.

In applying these principles, "best available" usually refers to the availability at the time an
assessment is made. However, EPA also recognizes that scientific knowledge about risk is
rapidly changing and that risk information may need to be updated over time. When deciding
which influential risk assessment should be updated and when to update it, the Agency will take
into account its statutes and the extent to which the updated risk assessment will have a clear and
substantial impact on important public policies or private sector decisions. In some situations,
the Agency may need to weigh the resources needed and the potential delay associated with
incorporating additional information in comparison to the value of the new information in terms
of its potential to improve the substance and presentation of the assessment.

Adaptation clarifications
In order to provide more clarity on how EPA adapted the SDWA principles in this guidance in
light of our numerous statutes, regulations, guidance and policies that address how to conduct a
risk assessment and characterize risk, we discuss four adaptations EPA has made to the SDWA
quality principles language.

EPA adapted the SDWA principles by adding the phrase "consistent with Agency statutes and
existing legislative regulations, the objectivity of such information disseminated by the Agency"
in the introductory paragraph, therefore applying to both paragraphs (A) and (B). This was done
to explain EPA's intent regarding these quality principles and their implementation consistent
with our statutes and existing legislative regulations. Also, as noted earlier, EPA intends to
implement these quality principles in conjunction with our guidelines and policies. The
procedures set forth in other EPA guidelines set out in more detail EPA's policies for conducting
risk assessments, including Agency-wide guidance on various types of risk assessments and
program-specific guidance. EPA recognizes that the wide array of programs within EPA have
resulted not only in Agency-wide guidance, but in specific protocols that reflect the
requirements, including limitations, that are mandated by the various statutes administered by
the Agency. For example, the Agency developed several pesticide science policy papers that
explained to the public in detail how EPA would implement specific statutory requirements in
the Food Quality Protection Act (FQPA) that addressed how we perform risk assessments. We
also recognize that emerging issues such as endocrine disruption, bioengineered organisms, and
genomics may involve some modifications to the existing paradigm for assessing human health



and ecological risks. This does not mean a radical departure from existing guidance or the
SDWA principles, but rather indicates that flexibility may be warranted as new information and
approaches develop.
EPA introduced the following two adaptations in order to accommodate the range of real-world
situations that we confront in the implementation of our diverse programs. EPA adapted the
SDWA quality principles by moving the phrase "to the extent practicable" from paragraph (B) to
the introductory paragraph in this Guidelines section to cover both parts (A) and (B) of the
SDWA adaptation.24 The phrase refers to situations under (A) where EPA may be called upon to
conduct "influential" scientific risk assessments based on limited information or in novel
situations, and under (B) in recognition that all such "presentation" information may not be
available in every instance. The level of effort and complexity of a risk assessment should also
balance the information needs for decision making with the effort needed to develop such
information. For example, under the Federal Insecticide, Fungicide and Rodenticide Act25
(FIFRA) and the Toxic Substances Control Act26 (TSCA), regulated entities are obligated to
provide information to EPA concerning incidents/test data that may reveal a problem with a
pesticide or chemical. We also receive such information voluntarily from other sources. EPA
carefully reviews incident reports and factors them as appropriate into risk assessments and
decision-making, even though these may not be considered information collected by accepted
methods or best available methods as stated in A(ii). Incident information played an important
role in the Agency's conclusion that use of chlordane/heptachlor termiticides could result in
exposures to persons living in treated homes, and that the registrations needed to be modified
accordingly. Similarly, incident reports concerning birdkills and fishkills were important
components of the risk assessments for the reregistration of the pesticides phorate and terbufos,
respectively. In addition, this adaptation recognizes that while many of the studies incorporated
into risk assessments have been peer reviewed, data from other sources may not be peer
reviewed. EPA takes many actions based on studies and supporting data provided by outside
sources, including confidential or proprietary information that has not been peer reviewed. For
example, industry can be required by regulation to submit data for pesticides under FIFRA or for
chemicals under TSCA. The data are developed using test guidelines and Good Laboratory
Practices (GLPs) in accordance with EPA regulations. While there is not a requirement to have
studies peer reviewed, such studies are reviewed by Agency scientists to ensure that they were
conducted according to the appropriate test guidelines and GLPs and that the data are valid.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
in many circumstances to conserve Agency resources and those of the regulated community who
otherwise might have to generate significant additional data. This flexibility is already provided
24 The discussion in this and following paragraphs gives some examples of the types of assessments that
may under some circumstances be considered influential. These examples are representative of assessments
performed under other EPA programs, such as CERCLA.
25 7 U.S.C. 136 et seq.
26 15 U.S.C. 2601 et seq.



for paragraph (B) in the SDWA quality principles. Pesticide and chemical risk assessments are
frequently performed iteratively, with the first iteration employing protective (conservative)
assumptions to identify possible risks. Only if potential risks are identified in a screening-level
assessment is it necessary to pursue a more refined, data-intensive risk assessment. This is
exhibited, for example, in guidance developed for use in CERCLA and RCRA on tiered
approaches. In other cases, reliance on "structure activity relationship" or "bridging data" allows
the Agency to rely on data from similar chemicals rather than require the generation of new,
chemical-specific data. While such assessments may or may not be considered influential under
the Guidelines, this adaptation of the SDWA principles reflects EPA's reliance on less-refined
risk assessments where further refinement could significantly increase the cost of the risk
assessment without significantly enhancing the assessment or changing the regulatory outcome.

In emergency and other time critical circumstances, risk assessments may have to rely on
information at hand or that can be made readily available rather than data such as described in
(A). One such scenario is risk assessments addressing Emergency Exemption requests submitted
under Section 18 of FIFRA,27 which, because of the emergency nature of the request, must be
completed within a short time frame. As an example, EPA granted an emergency exemption
under Section 18 to allow use of an unregistered pesticide to decontaminate anthrax in a Senate
office building. The scientific review and risk assessment to support this action were necessarily
constrained by the urgency of the action. Other time-sensitive actions include the reviews of new
chemicals under TSCA. Under Section 5 of TSCA,28 EPA must review a large number of
pre-manufacture notifications (more than 1,000) every year, not all of which necessarily include
"influential" risk assessments, and each review must be completed within a short time frame
(generally 90 days). The nature of the reviews and risk assessments associated with these
pre-manufacture notifications is affected by the limited time available and the large volume of
notifications submitted.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
to account for safety risk assessment practices. This flexibility is already provided for paragraph
(B) in the SDWA quality principles. We applied the same SDWA adaptation for use with human
health risk assessments to safety risk assessments with the needed flexibility to apply the
principles to the extent practicable. "Safety risk assessments" include a variety of analyses,
investigations, or case studies conducted by EPA concerning safety issues. EPA works to ensure
that the chemical industry and state and local entities take action to prevent, plan and prepare for,
and respond to environmental emergencies and site specific response actions through the
development and sharing of information, tools and guidance for hazard analyses and risk
assessment. For example, although the chemical industry shoulders most of the responsibility for
safety risk assessment and management, EPA may also conduct chemical hazard analyses,
investigate the root causes and mechanisms associated with accidental chemical releases, and
assess the probability and consequences of accidental releases in support of agency risk

27 Section 18 of FIFRA, 7 U.S.C. 136p
28 Section 5 of TSCA, 15 U.S.C. 2604



assessments. Although safety risk assessments can be different from traditional human health
risk assessments because they may combine a variety of available information and may use
expert judgement based on that information, these assessments provide useful information that is
sufficient for the intended purpose.
Next, EPA adapted the SDWA quality principles by adding the clause "including, when
available, peer reviewed science and supporting studies" to paragraph (A)(i). It now reads: "the
best available science and supporting studies conducted in accordance with sound and objective
scientific practices, including, when available, peer reviewed science and supporting studies." In
the Agency's development of "influential" scientific risk assessments, we intend to use all
relevant information, including peer reviewed studies, studies that have not been peer reviewed,
and incident information; evaluate that information based on sound scientific practices as
described in our risk assessment guidelines and policies; and reach a position based on careful
consideration of all such information (i.e., a process typically referred to as the "weight-of-
evidence" approach29). In this approach, a well-developed, peer-reviewed study would generally
be accorded greater weight than information from a less well-developed study that had not been
peer-reviewed, but both studies would be considered. Thus the Agency uses a "weight-of-
evidence" process when evaluating peer-reviewed studies along with all other information.
Oftentimes under various EPA-managed programs, EPA receives information that has not been
peer-reviewed and we have to make decisions based on the information available. While many
of the studies incorporated in risk assessments have been peer reviewed, data from other sources,
such as studies submitted to the Agency for pesticides under FIFRA30 and for chemicals under
TSCA, may not always be peer reviewed. Rather, such data, developed under approved
guidelines and the application of Good Laboratory Practices (GLPs), are routinely used in the
development of risk assessments. Risk assessments may also include more limited data sets such
as monitoring data used to support the exposure element of a risk assessment. In cases where
these data may not themselves have been peer reviewed their quality and appropriate use would
be addressed as part of the peer review of the overall risk assessment as called for under the
Agency's peer review guidelines.
Lastly, EPA adapted the SDWA principles for influential environmental ("ecological") risk
assessments that are disseminated in order to use terms that are most suited for such risk
assessments. Specifically, EPA assessments of ecological risks address a variety of entities,

29 The weight-of-evidence approach generally considers all relevant information in an integrative
assessment that takes into account the kinds of evidence available, the quality and quantity of the evidence, the
strengths and limitations associated with each type of evidence, and explains how the various types of evidence fit
together. See, e.g., EPA's Proposed Guidelines for Carcinogen Risk Assessment (Federal Register 61 (79):
17960-18011; April 23, 1996) and EPA's Guidelines for Carcinogen Risk Assessment (Federal Register 51 (185):
33992-34003; September 24, 1986), available from www.epa.gov/ncea/raf/, and EPA's Risk Characterization
Handbook (Science Policy Council Handbook: Risk Characterization, EPA 100-B-00-002, Washington, DC: U.S.
EPA, December 2000).

30 40 CFR part 158



some of which can be described as populations and others (such as ecosystems) which cannot.
Therefore, a specific modification was made to include "assessment endpoints, including
populations if applicable" in place of the term "population" for ecological risk assessments, and
EPA added a footnote directing the reader to various EPA risk policies for discussion of these
concepts in greater detail.



6.5 How Does EPA Ensure and Maximize the Quality of Information from External Sources?

Ensuring and maximizing the quality of information from States, other governments, and third
parties is a complex undertaking, involving thoughtful collaboration with States, Tribes, the
scientific and technical community, and other external information providers. EPA will continue
to take steps to ensure that the quality and transparency of information provided by external
sources are sufficient for the intended use. For instance, since 1998, the use of environmental
data collected by others or for other purposes, including literature, industry surveys,
compilations from computerized data bases and information systems, and results from
computerized or mathematical models of environmental processes and conditions has been
within the scope of the Agency's Quality System.31
For information that is either voluntarily submitted to EPA in hopes of influencing a decision or
that EPA obtains for use in developing a policy, regulatory, or other decision, EPA will continue
to work with States and other governments, the scientific and technical community, and other
interested information providers to develop and publish factors that EPA would use to assess the
quality of this type of information.
For all proposed collections of information that will be disseminated to the public, EPA intends
to demonstrate in our Paperwork Reduction Act32 clearance submissions that the proposed
collection of information will result in information that will be collected, maintained and used in
ways consistent with the OMB guidelines and these EPA Guidelines. These Guidelines apply to
all information EPA disseminates to the public; accordingly, if EPA later identifies a new use for
the information that was collected, such use would not be precluded and the Guidelines would
apply to the dissemination of the information to the public.

31 EPA Quality Manual for Environmental Programs 5360 A1, May 2000, Section 1.3.1.
http://www.epa.gov/quality/qs-docs/5360.pdf
32 44 U.S.C. 3501 et seq.


7 Administrative Mechanism for Pre-dissemination Review

7.1 What are the Administrative Mechanisms for Pre-dissemination Reviews?

Each EPA Program Office and Region will incorporate the information quality principles
outlined in section 6 of these Guidelines into their existing pre-dissemination review procedures
as appropriate. Offices and Regions may develop unique and new procedures, as needed, to
provide additional assurance that the information disseminated by or on behalf of their
organizations is consistent with these Guidelines. EPA intends to facilitate implementation of
consistent cross-Agency pre-dissemination reviews by establishing a model of minimum review
standards based on existing policies. Such a model for pre-dissemination review would still
provide that responsibility for the reviews remains in the appropriate EPA Office or Region.
For the purposes of the Guidelines, EPA recognizes that pre-dissemination review procedures
may include peer reviews and quality reviews that may occur at many steps in development of
information, not only at the point immediately prior to the dissemination of the information.


8 Administrative Mechanisms for Correction of Information

8.1 What are EPA's Administrative Mechanisms for Affected Persons to Seek and Obtain Correction of Information?

EPA's Office of Environmental Information (OEI) manages the administrative mechanisms that
enable affected persons to seek and obtain, where appropriate, correction of information
disseminated by the Agency that does not comply with EPA or OMB Information Quality
Guidelines. Working with the Program Offices, Regions, laboratories, and field offices, OEI will
receive complaints (or copies) and distribute them to the appropriate EPA information owners.
"Information owners" are the responsible persons designated by management in the applicable
EPA Program Office, or those who have responsibility for the quality, objectivity, utility, and
integrity of the information product or data disseminated by EPA. If a person believes that
information disseminated by EPA may not comply with the Guidelines, we encourage the person
to consult informally with the contact person listed in the information product before submitting
a request for correction of information. An informal contact can result in a quick and efficient
resolution of questions about information quality.
8.2 What Should be Included in a Request for Correction of Information?

Persons requesting a correction of information should include the following information in their
Request for Correction (RFC):

Name and contact information for the individual or organization submitting a
complaint; identification of an individual to serve as a contact.

A description of the information the person believes does not comply with EPA
or OMB guidelines, including specific citations to the information and to the EPA
or OMB guidelines, if applicable.

An explanation of how the information does not comply with EPA or OMB
guidelines and a recommendation of corrective action. EPA considers that the
complainant has the burden of demonstrating that the information does not
comply with EPA or OMB guidelines and that a particular corrective action
would be appropriate.

An explanation of how the alleged error affects or how a correction would benefit
the requestor.

An affected person may submit an RFC via any one of the methods listed here:

Internet at http://www.epa.gov/oei/qualityguidelines

E-mail at quality.guidelines@epa.gov

Fax at (202) 566-0255

Mail to Information Quality Guidelines Staff, Mail Code 2822IT, U.S.
EPA, 1200 Pennsylvania Ave., N.W., Washington, DC 20460

By courier or in person to Information Quality Guidelines Staff, OEI
Docket Center, Room B128, EPA West Building, 1301 Constitution
Ave., N.W., Washington, DC

8.3 When Does EPA Intend to Consider a Request for Correction of Information?

EPA seeks public and stakeholder input on a wide variety of issues, including the identification
and resolution of discrepancies in EPA data and information. EPA may decline to review an
RFC under these Guidelines and consider it for correction if:

The request does not address information disseminated to the public covered by
these Guidelines (see section 5.3 or OMB's guidelines). In many cases, EPA
provides other correction processes for information not covered by these
Guidelines.

The request omits one or more of the elements recommended in section 8.2 and
there is insufficient information for EPA to provide a satisfactory response.

The request itself is "frivolous," including those made in bad faith, made without
justification or trivial, and for which a response would be duplicative. More
information on this subject may be found in the OMB guidelines.

8.4 How Does EPA Intend to Respond to a Request for Correction of Information?

EPA intends to use the following process:

Each RFC will be tracked in an OEI system.

If an RFC is deemed appropriate for consideration, the information owner office
or region makes a decision on the request on the basis of the information in
question, including a request submitted under section 8.2. Rejections of a request
for correction should be decided at the highest level of the information owner
office or region. EPA's goal is to respond to requests within 90 days of receipt, either by
(1) providing a decision on the request, or (2) if the request requires more than
90 calendar days to resolve, informing the complainant that more time is
required and indicating the reason why and an estimated decision date.

If a request is approved, EPA determines what corrective action is appropriate.
Considerations relevant to the determination of appropriate corrective action
include the nature and timeliness of the information involved and such factors as
the significance of the error on the use of the information and the magnitude of



the error. For requests involving information from outside sources, considerations
may include coordinating with the source and other practical limitations on EPA's
ability to take corrective action.

Whether or not EPA determines that corrective action is appropriate, EPA
provides notice of its decision to the requester.

For approved requests, EPA assigns a steward for the correction who marks the
information as designated for corrections as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.

OEI will provide reports on behalf of EPA to OMB on an annual basis beginning January 1,
2004 regarding the number, nature, and resolution of complaints received by EPA.

8.5 How Does EPA Expect to Process Requests for Correction of Information on Which EPA has Sought Public Comment?

When EPA provides opportunities for public participation by seeking comments on information,
the public comment process should address concerns about EPA's information. For example,
when EPA issues a notice of proposed rulemaking supported by studies and other information
described in the proposal or included in the rulemaking docket, it disseminates this information
within the meaning of the Guidelines. The public may then raise issues in comments regarding
the information. If a group or an individual raises a question regarding information supporting a
proposed rule, EPA generally expects to treat it procedurally like a comment to the rulemaking,
addressing it in the response to comments rather than through a separate response mechanism.
This approach would also generally apply to other processes involving a structured opportunity
for public comment on a draft or proposed document before a final document is issued, such as a
draft report, risk assessment, or guidance document. EPA believes that the thorough
consideration provided by the public comment process serves the purposes of the Guidelines,
provides an opportunity for correction of any information that does not comply with the
Guidelines, and does not duplicate or interfere with the orderly conduct of the action. In cases
where the Agency disseminates a study, analysis, or other information prior to the final Agency
action or information product, it is EPA policy to consider requests for correction prior to the
final Agency action or information product in those cases where the Agency has determined that
an earlier response would not unduly delay issuance of the Agency action or information product
and the complainant has shown a reasonable likelihood of suffering actual harm from the
Agency's dissemination if the Agency does not resolve the complaint prior to the final Agency
action or information product. EPA does not expect this to be the norm in rulemakings that it
conducts, and thus will usually address information quality issues in connection with the final
Agency action or information product.

Administrative Mechanisms for Correction of Information


Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA

EPA generally would not consider a complaint that could have been submitted as a timely
comment in the rulemaking or other action but was submitted after the comment period. If EPA
cannot respond to a complaint in the response to comments for the action (for example, because
the complaint is submitted too late to be considered and could not have been timely submitted, or
because the complaint is not gennane to the action), EPA will consider whether a separate
response to the complaint is appropriate.

8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a Request for the Correction of Information?

If requesters are dissatisfied with an EPA decision, they may file a Request for Reconsideration (RFR). The RFR should contain the following information:

An indication that the person is seeking an appeal of an EPA decision on a
previously submitted request for a correction of information, including the date of
the original submission and the date of the EPA decision. A copy of EPA's original
decision would help expedite the process.

Name and contact information. Organizations submitting an RFR should identify
an individual as a contact.

An explanation of why the person disagrees with the EPA decision and a specific
recommendation for corrective action.

A copy of the original RFC of information.

An affected person may submit a Request for Reconsideration (RFR) via any one
of the methods listed here:

Internet at http://www.epa.gov/oei/qualityguidelines

E-mail at quality.guidelines@epa.gov

Fax at (202) 566-0255

Mail to Information Quality Guidelines Staff, Mail Code 2822IT, U.S.
EPA, 1200 Pennsylvania Ave., N.W., Washington, DC, 20460

By courier or in person to Information Quality Guidelines Staff, OEI
Docket Center, Room B128, EPA West Building, 1301 Constitution
Ave., N.W., Washington, DC

EPA recommends that requesters submit their RFR within 90 days of the EPA decision. If the
RFR is sent after that time, EPA recommends that the requester include an explanation of why
the request should be considered at this time.
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions?

EPA intends to consider RFRs using the following process:

Each RFR will be tracked in an OEI system.

OEI sends the RFR to the appropriate EPA Program Office or Region that has
responsibility for the information in question.


The Assistant Administrator (AA) or Regional Administrator (RA) information
owner presents to an executive panel. The executive panel would be comprised of
the Science Advisor/AA for the Office of Research and Development (ORD), the
Chief Information Officer/AA for OEI, and the Economics Advisor/AA for the
Office of Policy, Economics and Innovation (OPEI). The 3-member executive
panel would be chaired by the Chief Information Officer/AA for OEI. When the
subject of the RFR originated from a member office, that panel member would be
replaced by an alternate AA or RA. While the executive panel is considering an
RFR, the decision made on the initial complaint by the information owner office
or region remains in effect.
The executive panel makes the final decision on the RFR.
EPA's goal is to respond to each RFR within 90 days of receipt by either (1)
providing a decision on the request or (2) if the request requires more than 90
calendar days to resolve, informing the complainant that more time is required,
indicating the reason why and an estimated decision date.

If a request is approved, EPA determines what type of corrective action is
appropriate. Considerations relevant to the determination of appropriate
corrective action include the nature and timeliness of the information involved
and such factors as the significance of the error on the use of the information and
the magnitude of the error. For requests involving information from outside
sources, considerations may include coordinating with the source, and other
practical limitations on EPA's ability to take corrective action.

Whether or not EPA determines that corrective action is appropriate, EPA
provides notice of its decision to the requester.

For approved requests, EPA assigns a steward for the correction who marks the
information as designated for correction as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and
the requester.


Appendix A

IQG Development Process and Discussion of Public Comments

A.1 Introduction

EPA's Guidelines are a living document and may be revised as we learn more about how best to
address, ensure, and maximize information quality. In the process of developing these
Guidelines, we actively solicited public input at many stages. While the public was free to
comment on any aspect of the Guidelines, EPA explicitly requested input on key topics such as
influential information, reproducibility, influential risk assessment, information sources, and
error correction.
Public input was sought in the following ways:

An online Public Comment Session was held March 19-22, 2002, as the first draft of the
Guidelines was being developed. EPA received approximately 100 comments.

A Public Meeting was held on May 15, 2002, after the draft Guidelines were issued.
There were 99 participants, 13 of whom made presentations or commented on one or
more issues.
A 52-day Public Comment period lasted from May 1 to June 21, 2002, during which comments
could be mailed, faxed, or e-mailed to EPA. EPA received 55 comments during this
period.

A meeting with State representatives, sponsored and supported by the Environmental
Council of the States (ECOS), was held on May 29, 2002.

A conference call between EPA and Tribal representatives was held on June 27, 2002.

More detailed information on the public comments is available through an OEI web site, serving
as the home page for the EPA Information Quality Guidelines through the development and
implementation process. Please visit this site at http://www.epa.gov/oei/gualityguidelines.
We have established a public docket for the EPA Information Quality Guidelines under Docket
ID No. OEI-10014. The docket is the collection of materials available for public viewing at the
Information Quality Guidelines Staff, OEI Docket Center, Room B128, EPA West Building,
1301 Constitution Ave., N.W., Washington, DC, phone number 202-566-0284. This docket
consists of a copy of the Guidelines, public comments received, and other information related to
the Guidelines. The docket is open from 12:00 PM to 4:00 PM, Monday through Friday,
excluding legal holidays. An index of docket contents will be available at
http://www.epa.gov/oei/qualityguidelines.


A.2 General Summary of Comments

During the various public comment opportunities, EPA received input from a diverse set of
organizations and private citizens. Comments came from many of EPA's stakeholders - the
regulated community and many interest groups who we hear from frequently during the
management of EPA's Programs to protect the nation's land, air, water, and public health.
Government agencies at the Federal, State, Tribal, and local level also commented on the
Guidelines. OMB sent comments to every Federal agency and EPA received comments from two
members of Congress. Beyond our government colleagues, the private sector voiced many
concerns and helpful recommendations for these Guidelines. We would like to take this
opportunity to thank all commenters for providing their input on these Guidelines. Due to the
tight time frame for this project, this discussion of public comments generally describes the
major categories of comments and highlights some significant comments, but does not contain
an individual response to each public comment.
Comments received by EPA during the public comment period reflect a diversity of views
regarding EPA's approach to developing draft Guidelines as well as the general concept of
information quality. Some commenters included detailed review of all Guidelines sections, while
others chose to address only specific topics. In some cases, commenters provided examples to
demonstrate how current EPA procedures may not ensure adequate information quality for a
specific application. Commenters provided general observations such as stating that these
Guidelines did not sufficiently address EPA's information quality problems. Some commenters
offered that the Guidelines relied too much on existing policies. Interpretations of the intent of
the Data Quality Act were offered by some commenters. One comment noted that improvement
of data quality is not necessarily an end in and of itself. Another comment was that the goal of
Guidelines should be more to improve quality, not end uncertainty. Public interest and
environmental groups voiced concern over what they believed was an attempt by various groups
to undermine EPA's ability to act in a timely fashion to protect the environment and public
health. Some commenters stated that the directives of the Data Quality Act and OMB cannot
override EPA's mission to protect human health and the environment per the statutory mandates
under which it operates.

EPA was congratulated for the effort and, in some cases, encouraged to go even further in
addressing information quality. Some commenters encouraged EPA to provide additional
process details, provide more detailed definitions, augment existing policies that promote
transparency, and share more information about the limitations of EPA disseminated
information. In one case, EPA was encouraged to develop a rating scheme for its disseminated
information.
This section discusses public comments and our responses to many of the important questions
and issues raised in the comments. First, we provide responses to some overarching comments
we received from many commenters, then we provide a discussion of public comments that were
received on specific topics addressed in the draft Guidelines.


Tone: Commenters criticized the "defensive tone", "legalistic tone", and the lack
of detail afforded in the Guidelines. Some commenters said that it was not clear
what the Guidelines were explaining, or how they might apply to various types of
information. We understand and agree with many of these criticisms and have
made attempts to better communicate the purpose, applicability, and content of
these Guidelines.

Plan for implementation: Commenters suggested that the Guidelines should
describe EPA's plans for implementing the Guidelines. These Guidelines provide
policy guidance, and as such, do not outline EPA's plan for implementation. That
is, they do not describe in great detail how each Program and Regional Office will
implement these principles. We do not intend to imply that each Office will
implement them in conflict with one another, but rather assume that because each
Program implements a different statutory mandate or mandates, there will be
some inherent differences in approach. Beyond internal implementation, we agree
that there is more work and communication to be conducted with information
providers and users to optimize the provisions set forth in these Guidelines.
Commitment to public access: One commenter suggested that we "remove
outdated information" from our web site. Other commenters suggested that when
a complaint has been filed, the information should be removed from public
view while the complaint is being reviewed. This is generally unacceptable to EPA
in light of our commitment to providing the public with access to information;
however, in certain cases EPA may consider immediate removal of information
(for example, when it is clear to us that the information is grossly incorrect and
misleading and its status cannot be adequately clarified through a notice or other
explanation). With respect to outdated information, sometimes it serves a
historical purpose, and should continue to be disseminated for that purpose.

A.3 Response to Comments by Guidelines Topic Area

A.3.1 Existing Policy
Many commenters told us that we rely excessively on existing EPA information quality policies.
Commenters provided specific examples of areas they believed were demonstrative of our lack
of commitment to or uneven implementation of our existing policies. Some commenters also
pointed out that there are key areas in which we lack policies to address quality and, as a result,
the Guidelines should address such issues in more detail. Some commenters also noted that EPA
itself has highlighted lessons learned with existing approaches to information product
development.
Ongoing improvement in implementing existing processes is a key principle of quality
management. We view these Guidelines as an opportunity to enhance existing policies and
redouble our commitment to quality information.

The concept of peer review is considered in three Guidelines sections: (1) application of the
Agency's Peer Review Policy language for "major scientific and technical work products and
economic analysis used in decision making" as a class of information that can be considered
"influential" for purposes of the Guidelines; (2) use of "peer-reviewed science" as a component
of some risk assessments; and (3) use of the Agency's Peer Review Policy as one of the
Agency-wide processes to ensure the quality, objectivity, and transparency of "influential"
scientific, financial, and statistical information under the Guidelines.
Some commenters expressed concerns regarding application of peer review in EPA.
Commenters suggested that current peer reviews are not sufficiently standardized, independent, or
consistently implemented. Peer review is a cornerstone of EPA's credibility and we must ensure
that the process always works as designed. For this reason, we conduct routine assessments to
evaluate and improve the peer review process.
Commenters also questioned whether peer review is an adequate means to establish
"objectivity." We note that OMB guidelines specifically allow for the use of formal, external,
independent peer review to establish a presumption of objectivity. OMB guidelines also state
that the presumption of objectivity is rebuttable, although the burden of proof lies with the
complainant. Some commenters asked for additional definitions for peer review terms. Our
current peer review policy is articulated in Peer Review and Peer Involvement at the U.S.
Environmental Protection Agency.33 Additional discussion regarding the application of
peer-reviewed science is provided in the discussion of comments on risk assessment.

A.3.2 Scope and Applicability
We received a number of comments on section 1.1 (What is the Purpose of these Guidelines?) of
the draft Guidelines. Some commenters argued that the Guidelines should be binding on EPA,
that they are legislative rules rather than guidance, or that the Guidelines must be followed
unless we make a specific determination to the contrary. Others argued that the Guidelines
should not be binding or that we should include an explicit statement that the Guidelines do not
alter substantive agency mandates. Some suggested that our statements retaining discretion to
differ from the Guidelines sent a signal that EPA was not serious about information quality.
With respect to the nature of these Guidelines, Section 515 specifies that agencies are to issue
"guidelines." As directed by OMB's guidelines, we have issued our own guidelines containing
nonbinding policy and procedural guidance. We see no indication in either the language or
general structure of Section 515 that Congress intended EPA's guidelines to be binding rules.
We revised this section (now section 1 in this revised draft) by adding a fuller explanation of
how EPA intends to ensure the quality of information it disseminates. This section includes
language explaining the nature of our Guidelines as policy and procedural guidance. This
language is intended to give clear notice of the nonbinding legal effect of the Guidelines. It

33 http://epa.gov/osp/spc/perevmem.htm


notifies EPA staff and the public that the document is guidance rather than a substantive rule and
explains how such guidance should be implemented. Although we believe these Guidelines
would not be judicially reviewable, we agree that a statement to this effect is unnecessary and
have deleted it. In response to comments that EPA clarify that the Guidelines do not alter
existing legal requirements, we have made that change. In light of that change, we think it is
clear that decisions in particular cases will be made based on applicable statutes, regulations, and
requirements, and have deleted other text in the paragraph that essentially repeated that point.
Elsewhere in the document, EPA has made revisions to be consistent with its status as guidance.
Some commenters argued that all EPA disseminated information should be covered by the
Guidelines and that we lack authority to "exempt" information from the Guidelines. Others
thought that the coverage in EPA's draft was appropriate. EPA does not view its Guidelines as
establishing a fixed definition and then providing "exemptions." Rather, our Guidelines explain
when a distribution of information generally would or would not be considered disseminated to
the public for purposes of the Guidelines. As we respond to complaints and gain experience in
implementing these Guidelines, we may identify other instances where information is or is not
considered disseminated for the purposes of the Guidelines.
Some commenters cited the Paperwork Reduction Act (PRA), 44 U.S.C. 3501 et seq., to support
their argument that the Guidelines should cover all information EPA makes public. EPA's
Guidelines are issued under Section 515 of the Treasury and General Government
Appropriations Act for Fiscal Year 2001, which directs OMB to issue government-wide
guidelines providing policy and procedural guidance to Federal agencies. In turn, the OMB
guidelines provide direction and guidance to Federal agencies in issuing their own guidelines.
EPA's Guidelines are intended to carry out OMB's policy on information quality. One
commenter cited in particular the term "public information" used in the PRA as evidence of
Congress's intent under Section 515. In EPA's view, this does not show that Congress intended a
specific definition for the key terms, "information" and "disseminated," used in Section 515. In
the absence of evidence of Congressional intent regarding the meaning of the terms used in
Section 515, EPA does not believe the PRA requires a change in EPA's Guidelines.
We agree with commenters who noted that even if a particular distribution of information is not
covered by the Guidelines, the Guidelines would still apply to information disseminated in other
ways. As stated in section 1.4, if information is not initially covered by the Guidelines, a
subsequent distribution of that information will be subject to the Guidelines if EPA adopts,
endorses, or uses it.
Some commenters made specific recommendations about what should and should not be covered
by the Guidelines. In addition to the specific recommendations, some suggested that the "scope
and applicability" section was too long, while others thought it had an appropriate level of detail.
Based on other agencies' guidelines and public comments, EPA has removed much of the detail
from the discussion of Guidelines coverage. These revisions were intended to shorten and
simplify the discussion without changing the general scope of the Guidelines.


We revised our definition of "information" in section 5.3, in response to a comment requesting
that the Agency make clear that information from outside sources is covered by the Guidelines if
EPA adopts, endorses, or uses it to support an Agency decision or position. In section 5.4, we
modified several of the provisions. We added statements of "intent" or similar language to define
the scope of several of the provisions. Accordingly, dissemination would not include distribution
of information "intended" for government employees or recipients of contracts, grants, or
cooperative agreements. Nor would information in correspondence "directed to" individuals or
persons be covered. This recognizes that there may be instances where EPA may use a letter
written to an individual in a way that indicates it is directed beyond the correspondent and
represents a more generally applicable Agency policy. The Guidelines would apply in such a
case. EPA has created a category for information of an "ephemeral" nature, including press
releases, speeches, and the like. The intent was that the Guidelines should not cover
communications that merely serve as announcements, or for other reasons are intended to be
fleeting or of limited duration. Consistent with other agency guidelines, we have added language
indicating that the Guidelines do not cover information presented to Congress, unless EPA
simultaneously disseminates this information to the public.
Some commenters thought all information from outside sources should be covered by the
Guidelines, even if EPA does not use, rely on, or endorse it. Others wished to clarify the point at
which the Guidelines cover information from outside sources. As noted above, section 1.4 of the
Guidelines explains how subsequent distributions of information in public filings may become
subject to the Guidelines. We continue to think that EPA's own public filings before other
agencies should not generally be covered by the Guidelines as long as EPA does not
simultaneously disseminate them to the public, since use of this information would be subject to
the requirements and policies of the agency to which the information is submitted.
We received a number of comments, including from OMB, arguing that the provision regarding
information related to adjudicative processes was too broad, and that the Guidelines should
cover some or all information related to adjudicative processes, particularly administrative
adjudications. In addition to shortening this section, we have limited this provision to
information in documents prepared specifically for an administrative adjudication. This would
include decisions, orders, findings, and other documents prepared specifically for the
adjudication. As indicated in the Draft Guidelines, our view is that existing standards and
protections in administrative adjudications would generally be adequate to assure the quality of
information in administrative adjudications and to provide an adequate opportunity to contest
decisions on the quality of information. For example, in permitting proceedings, parties may
submit comments on the quality of information EPA prepares for the permit proceeding, and
judicial review is available based on existing statutes and regulations. Narrowing the provision
to information prepared specifically for the adjudication should make clear that the Guidelines
would not generally provide parties with additional avenues of challenge or appeal during
adjudications, but would still apply to a separate distribution of information where EPA adopts,
endorses, or uses the information, such as when EPA disseminates it on the Internet, or in a
rulemaking or guidance document. When we intend to adopt information such as models or risk
assessments for use in a class of cases or determinations (e.g., for use in all determinations under

a particular regulatory provision), EPA often disseminates this information separately and in
many instances requests public comment on it. Accordingly, it is not clear that there would be
many instances where persons who are concerned about information prepared specifically for an
adjudication would not have an opportunity to contest the quality of information.
We respectfully disagree with a commenter's recommendation that regulatory limits established
by EPA should be subject to the Guidelines. The Guidelines apply to information disseminated
by EPA, not to regulatory standards or other Agency decisions or policy choices. In response to
comments regarding information disseminated in rulemakings and other matters subject to public
comment, EPA considers that this information would be disseminated within the meaning of the
Guidelines, although we would generally treat complaints regarding that information
procedurally like other comments on the rulemaking or other matter.
A.3.3 Sources of Information
We received many comments on how the Guidelines apply to external parties, the shared quality
responsibilities between EPA and external parties, and specific EPA responsibilities when using
or relying on information collected or compiled by external parties.
EPA roles: Some commenters emphasized that ensuring quality of information at the point of
dissemination is no substitute for vigorous efforts by EPA to receive quality information in the
first place and therefore for information providers to produce quality information. One
commenter stated that EPA cannot be responsible for all aspects of the quality of the information
we disseminate. In response to this and other comments, we have provided additional language
in these Guidelines on the various roles that EPA assumes in either ensuring the quality of the
information we disseminate or ensuring the integrity of information EPA distributes. One
comment suggested that we mention the role of the National Environmental Information
Exchange Network in ensuring information integrity, which we have done in section 2.4 of the
Guidelines.
Assessment factors: Overall, public input was positive and welcoming of our proposal to
develop assessment factors to evaluate the quality of information generated by third parties. A
few commenters offered their involvement in the development of these factors, their advice on
how to develop such factors, and some examples of what assessment factors we should consider.
EPA staff have provided such comments to the EPA Science Policy Council workgroup that was
charged with developing the assessment factors. EPA welcomes stakeholder input in the
development of these factors and published draft assessment factors for public comment in
September 2002.
Coverage of State Information: Some commenters suggested that our Guidelines must apply to
all information disseminated by EPA, including information submitted to us by States. Whereas
some commenters stressed that the quality of information received by EPA is the responsibility
of the providers, others expressed concern about the potential impact that EPA's Guidelines
could have on States. We believe it is important to differentiate between information that we

generate and data or information generated by external parties, including States. State
information, when submitted to EPA, may not be covered by these Guidelines, but our
subsequent use of the information may in fact be covered. We note, however, that there may be
practical limitations on the type of corrective action that may be taken, since EPA does not
intend to alter information submitted by States. However, EPA does intend to work closely with
our State counterparts to ensure and maximize the quality of information that EPA disseminates.
Furthermore, one commenter stated that if regulatory information is submitted to an authorized
or delegated State program, then the State is the primary custodian of the information and the
Guidelines would not cover that information. We agree with that statement.
We also received comments regarding the use of labels, or disclaimers, to notify the public
whether information is generated by EPA or an external party. We agree that disclaimers and
other notifications should be used to explain the status of information wherever possible, and we
are developing appropriate language and format.
A statement regarding Paperwork Reduction Act clearance submissions has been added in
response to comment by OMB.
A.3.4 Influential Information
EPA received a range of comments on its definition of "influential." Below we provide a
summary of the comments raised and EPA's response.

Several commenters generally assert that the definition is too narrow. Other commenters
indicated that under EPA's draft definition, only Economically Significant actions, as defined in
Executive Order 12866, or only Economically Significant actions and information disseminated
in support of top Agency actions, are considered "influential." We disagree. To demonstrate the
broad range of activities covered by our adoption of OMB's definition, we reiterate the
definition below and include an example of each type of action, to illustrate the breadth of our
definition. "Influential," when used in the phrase "influential scientific, financial, or statistical
information," means that the Agency can reasonably determine that dissemination of the
information will have or does have a clear and substantial impact on important public policies or
important private sector decisions. We will generally consider the following classes of
information to be influential: information disseminated in support of top Agency actions;
information disseminated in support of "economically significant" actions; major work products
undergoing peer review; and other disseminated information that will have or does have a clear
and substantial impact (i.e., potential change or impact) on important public policies or
important private sector decisions as determined by EPA on a case-by-case basis. In general,
influential information would be the scientific, financial or statistical information that provides a
substantial basis for EPA's position on key issues in top Agency actions and Economically
Significant actions. If the information provides a substantial basis for EPA's position, EPA
believes it would generally have a clear and substantial impact.


Top Agency actions: An example of a top Agency action is the review of the National
Ambient Air Quality Standards (NAAQS) for Particulate Matter. Under the Clean Air
Act, EPA is to periodically review (1) the latest scientific knowledge about the effects on
public health and public welfare (e.g., the environment) associated with the presence of
such pollutants in the ambient air and (2) the standards, which are based on this science.
The Act further directs that the Administrator shall make any revisions to the standards
as may be appropriate, based on the latest science, that in her judgment are requisite to
protect the public health with an adequate margin of safety and to protect the public
welfare from any known or anticipated adverse effects. The standards establish allowable
levels of the pollutant in the ambient air across the United States, and States must
develop implementation plans to attain the standards. The PM NAAQS were last
revised in 1997, and the next periodic review is now being conducted.

"Economically significant" rules: An example of a rule found to be economically
significant is the Disposal of Polychlorinated Biphenyls (PCBs) Final Rule. In 1998, EPA
amended its rules under the Toxic Substances Control Act (TSCA), which addresses the
manufacture, processing, distribution in commerce, use, cleanup, storage and disposal of
PCBs. This rule provides flexibility in selecting disposal technologies for PCB wastes
and expands the list of available decontamination procedures; provides less burdensome
mechanisms for obtaining EPA approval for a variety of activities; clarifies and/or
modifies certain provisions where implementation questions have arisen; modifies the
requirements regarding the use and disposal of PCB equipment; and addresses
outstanding issues associated with the notification and manifesting of PCB wastes and
changes in the operation of commercial storage facilities. EPA would consider the
information that provides the principal basis for this rule to be influential information.

Peer reviewed work products: An example of a major work product undergoing peer
review is the IRIS Documentation: Reference Dose for Methylmercury. Methylmercury
contamination is the basis for fish advisories. It is necessary to determine an intake to
humans that is without appreciable risk in order to devise strategies for decreasing
mercury emissions into the environment. After EPA derived a reference dose (RfD) of
0.0001 mg/kg-day in 1995, industry argued that it was not based on sound science.
Congress ordered EPA to fund a National Research Council/National Academy of
Sciences panel to determine whether our RfD was scientifically justifiable. The panel
concluded that the 0.0001 mg/kg-day was an appropriate RfD, based on newer studies
than the 1995 RfD. The information in this document was evaluated, incorporated, and
subjected to comment by the Office of Water, where it contributed in large part to
Chapter 4 of Drinking Water Criteria for the Protection of Human Health:
Methylmercury (EPA/823/R-01/001), January 2001. The peer review mechanism was an
external peer review workshop and public comment session held on November 15, 2000,
accompanied by a public comment period from October 30 to November 29, 2000.

Case-by-case determination - PBT Chemicals Rule: An example of a case-by-case
determination is the Guidance Document for Reporting Releases and Other Waste
Management Activities of Toxic Chemicals: Dioxin and Dioxin-like Compounds
(December, 2000). In a final rule published October 29, 1999, EPA lowered the reporting
thresholds for certain persistent bioaccumulative toxic (PBT) chemicals that are subject to
reporting under Section 313 of the Emergency Planning and Community Right-to-Know Act of
1986 (EPCRA) and Section 6607 of the Pollution Prevention Act of 1990 (PPA). We also added
a category of dioxin and dioxin-like compounds to the EPCRA Section 313 list of toxic
chemicals and established a 0.1 gram reporting threshold for the category. In addition, EPA
added certain other PBT chemicals to the EPCRA Section 313 list of toxic chemicals and
established lower reporting thresholds for these chemicals. As a result of this rulemaking, we
developed a guidance document on the reporting requirements for the dioxin and dioxin-like
compounds category, as well as a number of other guidance documents. The dioxin guidance
document provides guidance on how to estimate annual releases and other waste management
quantities of dioxin and dioxin-like compounds to the environment from certain industries and
industrial activities. Due to the high interest level of stakeholders, we solicited public comments
on the draft guidance document and formed a workgroup of interested stakeholders. The
workgroup reviewed all public comments, provided their own comments, and then reviewed and
commented on the final draft.

Case-by-case determination - National Water Quality Inventory Report: A second
example of a case-by-case determination is the National Water Quality Inventory Report
to Congress. The National Water Quality Inventory Report to Congress is a biennial
report to Congress and the public about the quality of our nation's waters. It is prepared
under Section 305(b) of the Clean Water Act (CWA), which requires States and other
jurisdictions to assess the health of their waters and the extent to which water quality
supports State water quality standards and the basic goals of the CWA. States' Section
305(b) assessments are an important component of their water resource management
programs. These assessments help States: implement their water quality standards by
identifying healthy waters that need to be maintained and impaired waters that need to be
restored, prepare their Section 303(d) lists of impaired waters, develop restoration
strategies such as total maximum daily loads and source controls, and evaluate the
effectiveness of activities undertaken to restore impaired waters and protect healthy
waters.

A number of commenters said that EPA created a limited definition of what types of information
are to be considered "influential," and that we have no rational basis to do so. A number of
commenters also stated that "all Agency information should be considered influential"; that "all
data relied upon by the Agency should meet a high standard of quality regardless of the type"; or
that "'influential' information includes information used to support any EPA action, not just
'top' Agency actions." EPA followed OMB's guidelines in establishing a definition for
"influential" information that was not all-encompassing. OMB stated "the more important the
information, the higher the quality standards to which it should be held, for example, in those
situations involving 'influential scientific, financial or statistical information.'" OMB narrowed
the definition of "influential" in their final guidance as follows:


In this narrower definition, "influential", when used in the phrase "influential
scientific, financial, or statistical information", is amended to mean that "the
agency can reasonably determine that dissemination of the information will have
or does have a clear and substantial impact on important public policies or
important private sector decisions" (67 FR 8455).

OMB also amended their definition to say that "each agency is authorized to define "influential"
in ways appropriate for it given the nature and multiplicity of issues for which the agency is
responsible" (67 FR 8455). We adopted OMB's "influential" definition. Once the Agency
reviewed the wide range of information disseminated to the public, such as major rulemakings,
risk assessments, rule related guidance, health advisories, annual reports, fact sheets, and
coloring books, it became apparent that there were reasons to distinguish between "influential"
information and other information. EPA adopted OMB's definition for "influential" and used
types of information the Agency disseminates to further explain what information is included.
Another commenter suggested that EPA should not indicate whether disseminated information is
"influential" when it is first disseminated but should wait to designate information as
"influential" until either an information correction request is made or a final agency action is
taken. We intend to consider this point, as well as other comments made about when
disseminated information becomes influential, as the Agency implements the Guidelines.
One commenter suggests that the definition of the term "influential" should be more narrow.
Specifically, the commenter states the following:
Within the relatively narrow sphere of "disseminated" information, an agency
should reserve the designation of "influential" for information disseminated in
support of agency actions that are "major" regulations under Executive Order
12866, provide a "significant" opportunity to advance the agency's mandate by
other means, or involve precedent-setting or reasonably controverted issues. This
designation recognizes that procedures to promote the quality of information have
significant costs, and that the most significant (and therefore the most costly) of
such procedures should be reserved for information that is the most important in
terms of the agency's mission.
EPA agrees with the commenter that there are significant costs associated with ensuring that
information disseminated by the Agency is of high quality. Consequently, EPA chose a
definition of the term "influential" to cover information that, when disseminated, will result in a
clear and substantial impact on important public policies and private sector decisions. We
believe that this definition balances the costs associated with implementing the Guidelines, the
need to ensure high quality information, and the Agency's mission to protect human health and
safeguard the natural environment.
Several commenters indicated that it is inappropriate for EPA to base its definition of
"influential" on categories of actions. They suggest that the definition be based instead on the
content of the information. We consider our definition to be based on information content, given
that those categories of disseminated information we defined as influential are those that EPA
can reasonably determine will or do have a clear and substantial impact on important public
policies or private sector decisions. We note here that, in addition to the specific classes of
disseminated information we have defined as "influential," EPA has reiterated the
"case-by-case" portion of the OMB "influential" definition. This general provision is intended to capture
disseminated information, based on its content, that would not otherwise rise to the level of
"influential" under the other parts of our definition (i.e., top Agency actions, Economically
Significant actions, major peer reviewed products).
Several commenters assert that EPA should categorically state that certain specific types of
disseminated information products are influential, and that we should categorically state that
certain specific types of disseminated information products are not influential. Given the vast
array of information disseminated by the Agency, and given the fact that certain information
may have a clear and substantial impact on important public policies or private sector decisions
at one time, but not have such an impact later on (and vice versa), classifying types of
information as "influential" or otherwise upfront is difficult and could be misleading. We intend
to rely on our definition in determining whether specific types of disseminated information
products are to be considered "influential" for purposes of the Guidelines.
A.3.5 Reproducibility
Some commenters stated that there needs to be more clarity in the definition of "reproducibility"
and related concepts. We have tried to provide definitions that are consistent with OMB
guidelines. Also, our Guidelines now state that EPA intends to ensure reproducibility for
disseminated original and supporting data according to commonly accepted scientific, financial,
or statistical standards. Many commenters thought there should be some kind of method to
consider reproducibility when proprietary models, methods, designs, and data are used in a
dissemination. Some commenters discourage all use of proprietary models; others suggest
proprietary model use be minimized with application limited to situations in which it is
absolutely necessary. We understand this concern, but note that there are other factors that are
appropriately considered when deciding whether to use proprietary models, including feasibility
and cost considerations (e.g., it may be more cost-effective for the Agency to use a proprietary
model in some situations than to develop its own model). In cases where the Agency relies on
proprietary models, these model applications are still subject to our Peer Review Policy. Further,
as recently directed by the Administrator, the Agency's Council on Regulatory Environmental
Modeling is now revitalizing its development of principles for evaluating the use of
environmental models with regard to model validation and certification issues, building on
current good modeling practices. In addition, these Guidelines provide for the use of especially
rigorous "robustness checks" and documentation of what checks were undertaken. These steps,
along with transparency about the sources of data used, various assumptions employed, analytic
methods applied, and statistical procedures employed should assure that analytic results are
"capable of being substantially reproduced."


Regarding robustness checks, commenters were concerned that the EPA did not use the term
"especially rigorous robustness checks." We have modified our Guidelines to include this term.
Some commenters speculated on the ability of the Agency's Peer Review program to meet the
intent of the Guidelines and were concerned about the process to rebut a peer review used to
support the objectivity demonstration for disseminated information. Our Peer Review program
has been subject to external review and we routinely verify implementation of the program.
Affected persons wishing to rebut a formal peer review may do so using the complaint resolution
process in these Guidelines, provided that the information being questioned is considered to be
"disseminated" according to the Guidelines.
Regarding analytic results, some commenters indicated that the transparency factors identified
by EPA (section 6.3 of the Guidelines) are not a complete list of the items that would be needed
to demonstrate a higher degree of quality for influential information. EPA agreed with the list of
four items that was initially provided by the OMB and recognizes that, in some cases, additional
information regarding disseminated information would facilitate increased quality. However,
given the variety of information disseminated by the Agency, we cannot reasonably provide
additional details for such a demonstration at this time. Also, in regard to laboratory results,
which were mentioned by several commenters, these Guidelines are not the appropriate place to
set out for the science community EPA's view of what constitutes adequate demonstration of test
method validation or minimum quality assurance and quality control. Those technical
considerations should be addressed in the appropriate quality planning documentation or in
regulatory requirements.


EPA has developed general language addressing the concept of reproducibility and may provide
more detail after appropriate consultation with scientific and technical communities, as called for
by OMB in its guidelines. We have already begun to consult relevant scientific and technical
experts within the Agency, and also have planned an expedited consultation with EPA's Science
Advisory Board (SAB) on October 1, 2002. Based on these initial consultations, EPA may seek
additional input from the SAB in 2003. These consultations will allow EPA to constructively and
appropriately refine the application of existing policies and procedures, to further improve
reproducibility. In the interim, EPA intends to base the reproducibility of disseminated original
and supporting data on commonly accepted scientific, financial, or statistical standards.

A.3.6 Influential Risk Assessment
General Risk Assessment
Risk assessment is a process where information is analyzed to determine if an environmental
hazard might cause harm to exposed persons and ecosystems (paraphrased from Risk
Assessment in the Federal Government, National Research Council, 1983). That is:
Risk = hazard × exposure
For a chemical or other stressor to be "risky," it must have both an inherent adverse effect on an
organism, population, or other endpoint and it must be present in the environment at
concentrations and locations such that an organism, population, or other endpoint is exposed to
the stressor. Risk assessment is a tool to determine the likelihood of harm or loss to an organism,
population, or other endpoint because of exposure to a chemical or other stressor. To assist those
who must make risk management decisions, risk assessments include discussions of uncertainty,
variability, and the continuum between exposure and adverse effects.
Risk assessments may be performed iteratively, with the first iteration employing protective
(conservative) assumptions to identify possible risks. Only if potential risks are identified in a
screening level assessment is it necessary to pursue a more refined, data-intensive risk
assessment. The screening level assessments may not result in "central estimates" of risk or
upper and lower bounds of risks. Nevertheless, such assessments may be useful in making
regulatory decisions, as when the absence of concern from a screening level assessment is used
(along with other information) to approve the new use of a pesticide or chemical or to decide
whether to remediate very low levels of waste contamination.


OMB Guidelines
In its guidelines OMB stated that, with respect to influential information regarding health, safety,
or environmental risk assessments, agencies should either adopt or adapt the quality principles in
the Safe Drinking Water Act (SDWA) Amendments of 1996.34, 35 In the background section of
the OMB guidelines, OMB explains that "the word 'adapt' is intended to provide agencies
flexibility in applying these principles to various types of risk assessment."
Guidelines Development Consideration
EPA carefully and practically developed the adaptation of the SDWA quality principles using
our considerable experience conducting human health and ecological36 risk assessments as well
as using our existing policies and guidance.
EPA conducts many risk assessments every year. Some of these are screening level assessments
based on scientific experts' judgments using conservative assumptions and available data and can
involve human health, safety, or environmental risk assessments. Such screening assessments
provide useful information that is sufficient for regulatory purposes in instances where more
elaborate, quantitative assessments are unnecessary. For example, such assessments could
indicate, even with conservative assumptions, that the level of risk does not warrant further
investigation. Other risk assessments are more detailed and quantitative and are based on
research and supporting data that are generated outside EPA. For example, pesticide reviews are
based on scientific studies conducted by registrants in accordance with our regulations and
guidance documents. Our test guidelines and Good Laboratory Practices (GLPs)37 describe
sound scientific practices for conducting studies needed to assess human and environmental
hazards and exposures. Such studies are not required to be peer-reviewed. Risk assessments
based on these studies can include occupational, dietary, and environmental exposures.

34 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B).
35 In section III.3.ii.C of its guidelines, OMB states that: "With regard to analysis of risks to human health,
safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the
quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking
Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). Agencies responsible for dissemination of vital
health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate
to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the
public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent
threats to public health or homeland security) in accordance with the latitude specified in agency-specific
guidelines".
36 Because the assessment of "environmental risk" is being distinguished in OMB's adaptation of the
SDWA quality principles from "human health risk", the term "environmental risk" as used in these Guidelines does
not directly involve human health concerns. In other words, "environmental risk assessment" is, in this case, the
equivalent to what EPA commonly refers to as "ecological risk assessment".
37 40 CFR Part 160 for FIFRA and 40 CFR Part 792 for TSCA.


These risk assessments are conducted, and their results presented, to policy makers to inform
their risk management decisions. EPA currently has numerous policies that provide guidance to
internal risk assessors on how to conduct a risk assessment and characterize risk. The EPA Risk
Characterization Policy38 and associated guidelines are designed to ensure that critical
information from each stage of a risk assessment is used in forming conclusions about risk and
that this information is communicated from risk assessors to policy makers.

EPA Existing Policies and Guidance
Current EPA guidance and policies incorporate quality principles. These are designed to ensure
that critical information from each stage of a risk assessment is used in forming conclusions
about risk and that this information is communicated from risk assessors to policy makers. One
example is the EPA Risk Characterization Policy39, which provides a single, centralized body of
risk characterization implementation guidance to help EPA risk assessors and risk managers
make the risk characterization process transparent and risk characterization products clear,
consistent and reasonable (TCCR). These principles have been included in other Agency risk
assessment guidance, such as the Guidelines for Ecological Risk Assessment.40 Other examples
of major, overarching guidelines for risk assessments include: Guidelines For Exposure
Assessment,41 Guidelines For Neurotoxicity Risk Assessment,42 and Guidelines For Reproductive
Toxicity Risk Assessment.43 Each of these documents has undergone external scientific peer
review as well as public comment prior to publication. Additionally, individual EPA offices have
developed more specific risk assessment policies to meet the particular needs of the programs
and statutes under which they operate.44 EPA's commitment to sound science is evidenced by our
ongoing efforts to develop and continually improve Agency guidance for risk assessment.

38 http://www.epa.gov/OSP/spc/rcpolicy.htm
39 Ibid.
40 US EPA (1998). Guidelines for ecological risk assessment (Federal Register 63(93):26846-26924).
http://www.epa.gov/ncea/raf/.
41 US EPA (1992). Guidelines For Exposure Assessment. Federal Register 57(104):22888-22938.
http://www.epa.gov/ncea/raf/.
42 US EPA (1998). Guidelines For Neurotoxicity Risk Assessment. Federal Register 63(93):26926-26954.
http://www.epa.gov/ncea/raf/.
43 US EPA (1996). Guidelines For Reproductive Toxicity Risk Assessment. Federal Register 61(212):56274-56322.
http://www.epa.gov/ncea/raf/.
44 The Office of Solid Waste and Emergency Response has developed Tools for Ecological Risk
Assessment for Superfund Risk Assessment. One example is the Ecological Risk Assessment Guidance for
Superfund: Process for Designing and Conducting Ecological Risk Assessments - Interim Final.
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm
http://www.epa.gov/oerrpage/superfund/programs/risk/tooleco.htm


EPA's Experience Conducting Risk Assessments
The first EPA human health risk assessment guidelines45 were issued in 1986. In 1992, the
Agency produced a Framework for Ecological Risk Assessment46 which was replaced by the
1998 Ecological Risk Assessment Guidelines.47 As emphasized elsewhere in this document, the
statutes administered by EPA are diverse. Although the majority of risk assessments conducted
within the Agency are for chemical stressors, we also assess risks from biological and physical
stressors. In addition to risk assessment guidelines, both the EPA Science Policy Council and the
EPA Risk Assessment Forum have coordinated efforts to address the complex issues related to
data collection and analysis for hazard and exposure assessments. Thus, the Agency has
considerable experience in conducting both screening level and in-depth assessments for a wide
array of stressors.
Most environmental statutes obligate EPA to act to prevent adverse environmental and human
health impacts. For many of the risks that we must address, data are sparse and consensus about
assumptions is rare. In the context of data quality, we seek to strike a balance among fairness,
accuracy, and efficient implementation. Refusing to act until data quality improves can result in
substantial harm to human health, safety, and the environment.

Public Comments
We received a range of public and stakeholder comments on the adaptation of the SDWA
principles for "influential" human health, safety, and environmental risk assessments that are
disseminated by EPA. Some commenters stated that we should adopt the SDWA quality
principles for human health, safety, and environmental risk assessments. Many commenters
sought clarification on reasons for EPA's adaptation of the SDWA quality principles for human
health risk assessments and additional information on how we plan to address this process.
Others urged us to adapt the SDWA principles rather than adopt, because of certain elements in
the SDWA principles that may not be applicable to all risk assessments such as a "central
estimate of human risk for the specific populations affected." Others stated that we should
neither adapt nor adopt SDWA principles because the "Data Quality Act" does not authorize
importing decisional criteria into statutory provisions where they do not apply. The decisional
criteria set forth in SDWA are expressly limited to SDWA. We also received comments at a
level of detail that are more appropriate for implementation of the Guidelines than for the
formulation of the Guidelines. These include comments regarding the use of clinical human test
data, and comments regarding the use of particular types of assumptions in risk assessments. To
the extent that an affected person believes that our use of data or assumptions in a particular

45 51 FR 33992-34054, 24 September 1986.
46 Framework For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1992, EPA/630/R-92/001.
47 Guidelines For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1998, EPA/630/R-95/002F,
http://cfpub.epa.gov/ncea/cfm/ecorsk.cfm

dissemination of information is inconsistent with these Guidelines, the issue can be raised at that
time.
A few commenters raised a question regarding a conflict between EPA's existing policies and
the SDWA principles and asked us to identify the conflicting specific risk assessment standards
and make every effort to reconcile the conflicting standards with the SDWA principles. A few
commenters stated that EPA should not have two separate standards for risk assessments (i.e.,
one for influential and one for non-influential), but that all risk assessments should be considered
influential. Another stated that if there is a conflict between existing policies and the SDWA
principles, EPA should identify the conflicting specific risk assessment standards and make
every effort to reconcile the conflicting standards with the SDWA principles. Some commenters
have questioned why the "best available, peer reviewed science and supporting studies"
language of SDWA was conditioned by terms such as "to the extent practicable" or "as
appropriate."

Adaptation of SDWA Quality Principles
Public comments received by the Agency on the draft Guidelines were widely divergent. As no
obvious consensus could be drawn, we carefully considered comments and arguments on
adoption and adaptation. We also reviewed our experience with the SDWA principles, existing
policies, and the applicability and appropriateness of the SDWA language with regard to the
variety of risk assessments that we conduct and have determined that, to best meet the statutory
obligations of the many statutes EPA implements, it remains most appropriate to adapt the
SDWA principles to human health, safety, and environmental risk assessments.

In response to public comments we have removed "as appropriate" from these Guidelines in our
SDWA adaptation. EPA agrees that the phrase peer reviewed science "as appropriate" was
unclear. We revised this statement in part (A) to "including, when available, peer-reviewed
science and supporting studies." EPA introduced such adaptations in order to accommodate the
range of real-world situations we address in the implementation of our diverse programs.
Numerous commenters stated that EPA did not provide adequate clarification of how we
adapted the principles and what our thinking was on each adaptation. In these Guidelines we
have provided detailed clarifications regarding each adaptation made to the original SDWA
language and other remarks regarding our intent during the implementation of the SDWA
adaptation for influential disseminations by EPA. We direct readers to the Guidelines text for
such clarifications.

A.3.7 Complaint Resolution
A few commenters noted that EPA should outline how an affected person would rebut the
presumption of objectivity afforded by peer review. EPA believes this determination would be
made on a case-by-case basis considering the circumstances of a particular peer review and has


decided not to provide specific suggestions for affected persons on how to rebut the presumption
of objectivity afforded by a peer review.
OMB and other commenters noted that agencies' guidelines needed to make clear that a request
for correction can be filed if an affected person believes that information does not comply with
the EPA Guidelines and the OMB guidelines. EPA has added language in the EPA Guidelines to
make this clearer to readers.
EPA received numerous comments on the EPA definition of affected persons. In the draft
Guidelines, EPA had adopted OMB's definition. EPA agrees with comments suggesting that,
instead of elaborating on the definition of "affected person," a more open approach would be to
ask complainants to describe how they are an affected person with respect to the information that
is the subject of their complaint. EPA is asking that persons submitting requests for correction
provide, among other things, such an explanation. EPA has revised the Guidelines accordingly,
so that we may consider this information along with other information in the complaint in
deciding on how to respond.
Some commenters noted that the EPA Guidelines do not state how the process will work,
specifically, for States, municipalities, and EPA. They expressed concern about being "caught in the
middle," so to speak, on trying to get their own information corrected. EPA does not believe that
the Guidelines needed greater details on how States will work with EPA to address complaints,
but intends to work closely with States to better ensure timely correction. EPA does appreciate
the frustration of an information owner in seeing what they deem "incorrect" information in a
disseminated document or web site. However, EPA notes that this is a very complex issue that
cannot be addressed with general language in the Guidelines for all cases.
Several comments indicated that EPA appears to have given itself "carte blanche" authority to
"elect not to correct" information. The commenters stated that there was no valid reason why
EPA would opt out of correcting information and that all errors should be corrected. To the
contrary, EPA like every Federal agency wants to correct wrong information. The issue is not as
simple as the correction of an improper zip code or phone number on the EPA web site. Even
these simple errors may be very complex if they involve changing data in an EPA and/or
State database. Furthermore, EPA is not certain of the volume of complaints it will receive after
October 1 and therefore needed to provide a general provision in the Guidelines to recognize that
once EPA approves a request, the corrective action may vary depending on the circumstances.
On a case-by-case basis, EPA will determine the appropriate corrective action for each
complaint. EPA determined that this was the most reasonable approach. The revision also
recognizes practical limitations on corrective action for information from outside sources.
Several commenters noted that EPA needs to establish time frames for the complaint process.
Commenters stated that EPA should establish time frames for when affected persons can submit
a complaint on an information product, when EPA needs to respond to affected persons with a
decision on discrete, factual errors, when EPA would respond to affected persons with a decision
on more complex or broader interpretive issues, and when an affected person should submit a
request for reconsideration. One commenter suggested that EPA solicit all complaints at one
time during a 6-month window or another time frame. EPA notes that commenters provided
helpful examples and well thought out proposals for such a suite of time frames and appreciates
the public input.
EPA did not agree on the need to develop two separate time frames for complaints that are more
factual in nature versus those that are more complex. One commenter suggests a 15-day time
line for discrete factual errors and a 45-day time line for all other complaints. Another
commenter recommended 30 days for factual errors and 60 days for all other complaints.
Another commenter advised EPA to model this complaint process according to the FOIA
process. This commenter also suggested a 3-week time line for more numeric corrections and 60
days for "broader interpretive issues or technical questions." While EPA appreciates the value of
these approaches, they might be problematic to implement. However, as EPA learns more about
the nature of this complaint process following some period of implementation, these suggested
approaches could be revisited.
EPA also agreed with commenters that a window of opportunity for commenters to submit a
request for reconsideration made sense. EPA has advised affected persons in these Guidelines to
submit a request for reconsideration within 90 days of the initial complaint decision by EPA.
Some commenters asked that EPA establish time lines for when EPA would take corrective
action. EPA does not anticipate that there would be any value in applying a specific time frame
for this action and prefers to look at each complaint and appropriate corrective action on a
case-by-case basis, as discussed above.
Commenters suggested that 45 days was a reasonable time frame for EPA to get back to the
affected person with either a decision or a notice that EPA needs more time. One group noted
that HHS, SSA, and NRC adopted the 45-day window. EPA disagreed with this approach and
instead opted for a 90-day time frame similar to the DOT Guidelines.
EPA received many comments on how EPA should structure its internal processes for the
complaint resolution process. Several comments specifically discussed the role that OEI should
play in the initial complaint and the requests for reconsideration. EPA does not agree that OEI
should be the arbiter on all requests for reconsideration, but does view the role of OEI in the
process as an important one. Namely, OEI may work to help ensure consistent responses to
complaints and requests for reconsideration. Other comments recommending specific internal
implementation processes are being considered as EPA designs the correction and request for
reconsideration administrative processes in greater detail.
Many commenters argued that Assistant Administrators and Regional Administrators should not
decide requests for reconsideration because they would be biased or would have a conflict of
interest when deciding complaints regarding information disseminated by their own Offices or
programs, or if they had to reconsider decisions made by their own staffs. EPA does not agree.
This type of decision making is within the delegated decision making authority of EPA's
Appendix

55

GUidelines TOr Erlsunng and Maximizing the Quality. ObJectivity. Utility. and Integrity of Information Dlssemmated bv EP;'

officials, and these decisions should be presumed to be unbiased absent a specific showing that a
decision maker is not impartial in a particular case. EPA does agree with commenters who noted
that it is important to make consistent decisions on cross-cutting information quality issues. In
order to achieve appropriate consistency of response to affected persons on requests for
reconsideration and to ensure that cross-cutting information quality issues are considered across
the Agency at a senior level, EPA intends for an executive panel to make the final decisions on
all requests for reconsideration. Furthermore, we felt it important to add greater detail on the
time frame within which EPA would respond to a requestor on their request for reconsideration.
We have added that it is EPA's goal to respond to requesters regarding requests for
reconsideration within 90 days.
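The two 90-day windows described above (EPA's goal for responding to an initial request, and the window for an affected person to file a request for reconsideration) amount to a simple deadline calculation. The sketch below is illustrative only: the function names and constants are ours, not part of the Guidelines, and the Guidelines state goals rather than binding deadlines.

```python
from datetime import date, timedelta

# Illustrative constants for the two 90-day windows discussed above.
# These names are hypothetical; the Guidelines express goals, not statutory deadlines.
RESPONSE_GOAL_DAYS = 90         # EPA's goal for responding to a request for correction
RECONSIDERATION_WINDOW_DAYS = 90  # window for the requester to seek reconsideration

def response_goal(request_received: date) -> date:
    """Date by which EPA aims to respond, or to notify that more time is needed."""
    return request_received + timedelta(days=RESPONSE_GOAL_DAYS)

def reconsideration_deadline(decision_issued: date) -> date:
    """Last day an affected person should submit a request for reconsideration."""
    return decision_issued + timedelta(days=RECONSIDERATION_WINDOW_DAYS)
```

For example, a request received on October 1 would carry a response goal of December 30 of the same year under this sketch.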
EPA received many recommendations in public comments to include the public in the EPA
complaint process. Specifically, commenters requested that EPA notify the public about all
pending requests to modify information and one commenter stated that EPA should allow the
public to comment on information corrections requests for information that are considered
"central to a rulemaking or other Final Agency Action" before EPA accepts or rejects the request.
As a general matter, EPA does not intend to solicit public comment on how EPA should respond
to requests for correction or reconsideration. EPA also does not intend to post requests for
correction and requests for reconsideration on the EPA web site, but we plan to revisit this and
many other aspects of the Guidelines within one year of implementation.
EPA also received many comments on how information that is currently being reviewed by EPA
in response to a complaint appears to the public on the EPA web site or some other medium.
Some commenters recommended the use of flags for all information that has a complaint pending
with a note that where appropriate, challenged information will be pulled from dissemination and
removed from EPA's web site. Other commenters stated that the information in question should
be removed from public access until the resolution process has been completed. Still other
commenters requested that EPA not embark on self-censorship. As a general rule, EPA has
decided not to flag information that has a complaint pending. EPA believes that information that
is the subject of a pending complaint should not necessarily be removed from public access based
solely on the receipt of a request for correction.

A.4

Next Steps

EPA is actively developing new policies and procedures, as appropriate, to improve the quality of
information disseminated to the public. Some activities specifically support ensuring and
maximizing the quality, objectivity, utility, and integrity of information. For instance, we are
consulting with the scientific community on the subject of reproducibility. The EPA Science
Advisory Board (SAB) is performing an expedited consultation on the subject on October 1,
2002. Based on this initial consultation, EPA and the SAB may consider a full review of
reproducibility and related information quality concepts in 2003. Furthermore, as noted earlier,
the EPA Science Policy Council has commissioned a workgroup to develop assessment factors
for consideration in assessing information that EPA collects or is voluntarily submitted in support
of various Agency decisions.
As new processes, policies, and procedures are considered and adopted into Agency operations,
we will consider their relationship to the Guidelines and determine the extent to which the
Guidelines may need to change to accommodate new activity.

OFFICE OF
ENVIRONMENTAL
INFORMATION

www.epa.gov/oei

Friday,

February 22, 2002

Part IX

Office of
Management and
Budget
Guidelines for Ensuring and Maximizing

the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by
Federal Agencies; Notice; Republication

8452

Federal Register/Vol. 67, No. 36/Friday, February 22, 2002/Notices

OFFICE OF MANAGEMENT AND
BUDGET
Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by Federal Agencies;
Republication

Editorial Note: Due to numerous errors,
this document is being reprinted in its
entirety. It was originally printed in the
Federal Register on Thursday, January 3,
2002 at 67 FR 369-378 and was corrected on
Tuesday, February 5, 2002 at 67 FR 5365.
AGENCY: Office of Management and
Budget, Executive Office of the
President.
ACTION: Final guidelines.
SUMMARY: These final guidelines
implement section 515 of the Treasury
and General Government
Appropriations Act for Fiscal Year 2001
(Public Law 106-554; H.R. 5658).
Section 515 directs the Office of
Management and Budget (OMB) to issue
government-wide guidelines that
"provide policy and procedural
guidance to Federal agencies for
ensuring and maximizing the quality,
objectivity, utility, and integrity of
information (including statistical
information) disseminated by Fesleral
agencies." By October 1, 2002, agencies
must issue their own implementing
guidelines that include "administrative
mechanisms allowing affected persons
to seek and obtain correction of
information maintained and
disseminated by the agency" that does
not comply with the OMB guidelines.
These final guidelines also reflect the
changes OMB made to the guidelines
issued September 28, 2001, as a result
of receiving additional comment on the
"capable of being substantially
reproduced" standard (paragraphs
V.3.B, V.9, and V.I0), which OMB
previously issued on September 28,
2001, on an interim final basis.
DATES: Effective Date: January 3, 2002.

FOR FURTHER INFORMATION CONTACT:

Brooke J. Dickson, Office of Information
and Regulatory Affairs, Office of
Management and Budget, Washington,
DC 20503. Telephone (202) 395-3785 or
bye-mail to
informationquality@omb.eop.gov.
SUPPLEMENTARY INFORMATION: In section
515(a) of the Treasury and General
Government Appropriations Act for
Fiscal Year 2001 (Public Law 106-554;
H.R. 5658), Congress directed the Office
of Management and Budget (OMB) to
issue, by September 30, 2001,
government-wide guidelines that
"provide policy and procedural

guidance to Federal agencies for
ensuring and maximizing the quality,
objectivity, utility, and integrity of
information (including statistical
information) disseminated by Federal
agencies * * *" Section 515(b) goes on
to state that the OMB guidelines shall:
"(1) apply to the sharing by Federal
agencies of, and access to, information
disseminated by Federal agencies; and
"(2) require that each Federal agency
to which the guidelines apply—
"(A) issue guidelines ensuring and
maximizing the quality, objectivity,
utility, and integrity of information
(including statistical information)
disseminated by the agency, by not later
than 1 year after the date of issuance of
the guidelines under subsection (a);
"(B) establish administrative
mechanisms allowing affected persons
to seek and obtain correction of
information maintained and
disseminated by the agency that does
not comply with the guidelines issued
under subsection (a); and
"(C) report periodically to the
Director—
"(i) the number and nature of
complaints received by the agency
regarding the accuracy of information
disseminated by the agency; and
"(ii) how such complaints were
handled by the agency."
Proposed guidelines were published
in the Federal Register on June 28, 2001
(66 FR 34489). Final guidelines were
published in the Federal Register on
September 28, 2001 (66 FR 49718). The
Supplementary Information to the final
guidelines published in September 2001
provides background, the underlying
principles OMB followed in issuing the
final guidelines, and statements of
intent concerning detailed provisions in
the final guidelines.
In the final guidelines published in
September 2001, OMB also requested
additional comment on the "capable of
being substantially reproduced"
standard and the related definition of
"influential scientific or statistical
information" (paragraphs V.3.B, V.9,
and V.10), which were issued on an
interim final basis. The final guidelines
published today discuss the public
comments OMB received, the OMB
response, and amendments to the final
guidelines published in September
2001.
In developing agency-specific
guidelines, agencies should refer both to
the Supplementary Information to the
final guidelines published in the
Federal Register on September 28, 2001
(66 FR 49718), and also to the
Supplementary Information published
today. We stress that the three
"Underlying Principles" that OMB
followed in drafting the guidelines that
we published on September 28, 2001
(66 FR 49719), are also applicable to the
amended guidelines that we publish
today.
In accordance with section 515, OMB
has designed the guidelines to help
agencies ensure and maximize the
quality, utility, objectivity and integrity
of the information that they disseminate
(meaning to share with, or give access
to, the public). It is crucial that
information Federal agencies
disseminate meets these guidelines. In
this respect, the fact that the Internet
enables agencies to communicate
information quickly and easily to a wide
audience not only offers great benefits to
society, but also increases the potential
harm that can result from the
dissemination of information that does
not meet basic information quality
guidelines. Recognizing the wide variety
of information Federal agencies
disseminate and the wide variety of
dissemination practices that agencies
have, OMB developed the guidelines
with several principles in mind.
First, OMB designed the guidelines to
apply to a wide variety of government
information dissemination activities
that may range in importance and scope.
OMB also designed the guidelines to be
generic enough to fit all media, be they
printed, electronic, or in other form.
OMB sought to avoid the problems that
would be inherent in developing
detailed, prescriptive, "one-size-fits-all"
government-wide guidelines that would
artificially require different types of
dissemination activities to be treated in
the same manner. Through this
flexibility, each agency will be able to
incorporate the requirements of these
OMB guidelines into the agency's own
information resource management and
administrative practices.
Second, OMB designed the guidelines
so that agencies will meet basic
information quality standards. Given the
administrative mechanisms required by
section 515 as well as the standards set
forth in the Paperwork Reduction Act, it
is clear that agencies should not
disseminate substantive information
that does not meet a basic level of
quality. We recognize that some
government information may need to
meet higher or more specific
information quality standards than
those that would apply to other types of
government information. The more
important the information, the higher
the quality standards to which it should
be held, for example, in those situations
involving "influential scientific,
financial, or statistical information" (a
phrase defined in these guidelines). The
guidelines recognize, however, that
information quality comes at a cost.
Accordingly, the agencies should weigh
the costs (for example, including costs
attributable to agency processing effort,
respondent burden, maintenance of
needed privacy, and assurances of
suitable confidentiality) and the benefits
of higher information quality in the
development of information, and the
level of quality to which the information
disseminated will be held.
Third, OMB designed the guidelines
so that agencies can apply them in a
common-sense and workable manner. It
is important that these guidelines do not
impose unnecessary administrative
burdens that would inhibit agencies
from continuing to take advantage of the
Internet and other technologies to
disseminate information that can be of
great benefit and value to the public. In
this regard, OMB encourages agencies to
incorporate the standards and
procedures required by these guidelines
into their existing information resources
management and administrative
practices rather than create new and
potentially duplicative or contradictory
processes. The primary example of this
is that the guidelines recognize that, in
accordance with OMB Circular A-130,
agencies already have in place well­
established information quality
standards and administrative
mechanisms that allow persons to seek
and obtain correction of information
that is maintained and disseminated by
the agency. Under the OMB guidelines,
agencies need only ensure that their
own guidelines are consistent with
these OMB guidelines, and then ensure
that their administrative mechanisms
satisfy the standards and procedural
requirements in the new agency
guidelines. Similarly, agencies may rely
on their implementation of the Federal
Government's computer security laws
(formerly, the Computer Security Act,
and now the computer security
provisions of the Paperwork Reduction
Act) to establish appropriate security
safeguards for ensuring the "integrity"
of the information that the agencies
disseminate.
In addition, in response to concerns
expressed by some of the agencies, we
want to emphasize that OMB recognizes
that Federal agencies provide a wide
variety of data and information.
Accordingly, OMB understands that the
guidelines discussed below cannot be
implemented in the same way by each
agency. In some cases, for example, the
data disseminated by an agency are not
collected by that agency; rather, the
information the agency must provide in
a timely manner is compiled from a
variety of sources that are constantly
updated and revised and may be

confidential. In such cases, while
agencies' implementation of the
guidelines may differ, the essence of the
guidelines will apply. That is, these
agencies must make their methods
transparent by providing
documentation, ensure quality by
reviewing the underlying methods used
in developing the data and consulting
(as appropriate) with experts and users,
and keep users informed about
corrections and revisions.
Summary of OMB Guidelines
These guidelines apply to Federal
agencies subject to the Paperwork
Reduction Act (44 U.S.C. chapter 35).
Agencies are directed to develop
information resources management
procedures for reviewing and
substantiating (by documentation or
other means selected by the agency) the
quality (including the objectivity,
utility, and integrity) of information
before it is disseminated. In addition,
agencies are to establish administrative
mechanisms allowing affected persons
to seek and obtain, where appropriate,
correction of information disseminated
by the agency that does not comply with
the OMB or agency guidelines.
Consistent with the underlying
principles described above, these
guidelines stress the importance of
having agencies apply these standards
and develop their administrative
mechanisms so they can be
implemented in a common sense and
workable manner. Moreover, agencies
must apply these standards flexibly, and
in a manner appropriate to the nature
and timeliness of the information to be
disseminated, and incorporate them into
existing agency information resources
management and administrative
practices.
Section 515 denotes four substantive
terms regarding information
disseminated by Federal agencies:
quality, utility, objectivity, and
integrity. It is not always clear how each
substantive term relates-or how the
four terms in aggregate relate-to the
widely divergent types of information
that agencies disseminate. The
guidelines provide definitions that
attempt to establish a clear meaning so
that both the agency and the public can
readily judge whether a particular type
of information to be disseminated does
or does not meet these attributes.
In the guidelines, OMB defines
"quality" as the encompassing term, of
which "utility," "objectivity," and
"integrity" are the constituents.
"Utility" refers to the usefulness of the
information to the intended users.
"Objectivity" focuses on whether the
disseminated information is being
presented in an accurate, clear,
complete, and unbiased manner, and as
a matter of substance, is accurate,
reliable, and unbiased. "Integrity" refers
to security, the protection of
information from unauthorized access
or revision, to ensure that the
information is not compromised
through corruption or falsification. OMB
modeled the definitions of
"information," "government
information," "information
dissemination product," and
"dissemination" on the longstanding
definitions of those terms in OMB
Circular A-130, but tailored them to fit
into the context of these guidelines.
In addition, Section 515 imposes two
reporting requirements on the agencies.
The first report, to be promulgated no
later than October 1, 2002, must provide
the agency's information quality
guidelines that describe administrative
mechanisms allowing affected persons
to seek and obtain, where appropriate,
correction of disseminated information
that does not comply with the OMB and
agency guidelines. The second report is
an annual fiscal year report to OMB (to
be first submitted on January 1, 2004)
providing information (both quantitative
and qualitative, where appropriate) on
the number, nature, and resolution of
complaints received by the agency
regarding its perceived or confirmed
failure to comply with these OMB and
agency guidelines.
Public Comments and OMB Response
Applicability of Guidelines. Some
comments raised concerns about the
applicability of these guidelines,
particularly in the context of scientific
research conducted by Federally
employed scientists or Federal grantees
who publish and communicate their
research findings in the same manner as
their academic colleagues. OMB
believes that information generated and
disseminated in these contexts is not
covered by these guidelines unless the
agency represents the information as, or
uses the information in support of, an
official position of the agency.
As a general matter, these guidelines
apply to "information" that is
"disseminated" by agencies subject to
the Paperwork Reduction Act (44 U.S.C.
3502(1)). See paragraphs II, V.5 and V.8.
The definitions of "information" and
"dissemination" establish the scope of
the applicability of these guidelines.
"Information" means "any
communication or representation of
knowledge such as facts or data * * *"
This definition of information in
paragraph V.5 does "not include
opinions, where the agency's
presentation makes it clear that what is

8454

Federal Register/Vol. 67. No. 36/Friday, February 22, 2002/Notices

being offered is someone's opinion
rather than fact or the agency's views."
"Dissemination" is defined to mean
"agency initiated or sponsored
distribution of information to the
public." As used in paragraph V.8,
"agency INITIATED * * * distribution
of information to the public" refers to
information that the agency
disseminates, e.g., a risk assessment
prepared by the agency to inform the
agency's formulation of possible
regulatory or other action. In addition,
if an agency, as an institution,
disseminates information prepared by
an outside party in a manner that
reasonably suggests that the agency
agrees with the information, this
appearance of having the information
represent agency views makes agency
dissemination of the information subject
to these guidelines. By contrast, an
agency does not "initiate" the
dissemination of information when a
Federally employed scientist or Federal
grantee or contractor publishes and
communicates his or her research
findings in the same manner as his or
her academic colleagues, even if the
Federal agency retains ownership or
other intellectual property rights
because the Federal government paid for
the research. To avoid confusion
regarding whether the agency agrees
with the information (and is therefore
disseminating it through the employee
or grantee), the researcher should
include an appropriate disclaimer in the
publication or speech to the effect that
the "views are mine, and do not
necessarily reflect the view" of the
agency.
Similarly, as used in paragraph V.8,
"agency * * * SPONSORED
distribution of information to the
public" refers to situations where an
agency has directed a third-party to
disseminate information, or where the
agency has the authority to review and
approve the information before release.
Therefore, for example, if an agency
through a procurement contract or a
grant provides for a person to conduct
research, and then the agency directs
the person to disseminate the results (or
the agency reviews and approves the
results before they may be
disseminated), then the agency has
"sponsored" the dissemination of this
information. By contrast, if the agency
simply provides funding to support
research, and it is the researcher (not the
agency) who decides whether to
disseminate the results and-if the
results are to be released-who
determines the content and presentation
of the dissemination, then the agency
has not "sponsored" the dissemination
even though it has funded the research

and even if the Federal agency retains
ownership or other intellectual property
rights because the Federal government
paid for the research. To avoid
confusion regarding whether the agency
is sponsoring the dissemination, the
researcher should include an
appropriate disclaimer in the
publication or speech to the effect that
the "views are mine, and do not
necessarily reflect the view" of the
agency. On the other hand, subsequent
agency dissemination of such
information requires that the
information adhere to the agency's
information quality guidelines. In sum,
these guidelines govern an agency's
dissemination of information, but
generally do not govern a third-party's
dissemination of information (the
exception being where the agency is
essentially using the third-party to
disseminate information on the agency's
behalf). Agencies, particularly those that
fund scientific research, are encouraged
to clarify the applicability of these
guidelines to the various types of
information they and their employees
and grantees disseminate.
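The coverage test described above, under which a distribution counts as agency "dissemination" when it is agency-initiated (the agency distributes the information itself, or distributes third-party information in a way that reasonably suggests the agency agrees with it) or agency-sponsored (the agency directs the release, or reviews and approves it before release), can be sketched as a simple predicate. The function and parameter names are hypothetical, and this is one reading of the discussion, not the guidelines' own text.

```python
# Hypothetical sketch of the "initiated or sponsored" coverage test discussed above.
# Predicate names are illustrative, not drawn from the guidelines' text.
def dissemination_covered(agency_distributes: bool,
                          represents_agency_views: bool,
                          agency_directed_release: bool,
                          agency_approved_before_release: bool) -> bool:
    """Return True if a distribution is agency 'dissemination' under this reading."""
    # Agency-initiated: the agency distributes the information itself, or distributes
    # third-party information in a way that suggests the agency agrees with it.
    initiated = agency_distributes or represents_agency_views
    # Agency-sponsored: the agency directs a third party to release the results,
    # or reviews and approves the results before release.
    sponsored = agency_directed_release or agency_approved_before_release
    return initiated or sponsored
```

Under this sketch, a grantee who independently publishes findings with a personal-views disclaimer falls outside coverage (all four inputs false), while an agency-prepared risk assessment is covered as agency-initiated.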
Paragraph V.8 also states that the
definition of "dissemination" does not
include "* * * distribution limited to
correspondence with individuals or
persons, press releases, archival records,
public filings, subpoenas or adjudicative
processes." The exemption from the
definition of "dissemination" for
"adjudicative processes" is intended to
exclude, from the scope of these
guidelines, the findings and
determinations that an agency makes in
the course of adjudications involving
specific parties. There are well­
established procedural safeguards and
rights to address the quality of
adjudicatory decisions and to provide
persons with an opportunity to contest
decisions. These guidelines do not
impose any additional requirements on
agencies during adjudicative
proceedings and do not provide parties
to such adjudicative proceedings any
additional rights of challenge or appeal.
The Presumption Favoring Peer-
Reviewed Information. As a general
matter, in the scientific and research
context, we regard technical information
that has been subjected to formal,
independent, external peer review as
presumptively objective. As the
guidelines state in paragraph V.3.b.i: "If
data and analytic results have been
subjected to formal, independent,
external peer review, the information
may generally be presumed to be of
acceptable objectivity." An example of a
formal, independent, external peer
review is the review process used by
scientific journals.

Most comments approved of the
prominent role that peer review plays in
the OMB guidelines. Some comments
contended that peer review was not
accepted as a universal standard that
incorporates an established, practiced,
and sufficient level of objectivity. Other
comments stated that the guidelines
would be better clarified by making peer
review one of several factors that an
agency should consider in assessing the
objectivity (and quality in general) of
original research. In addition, several
comments noted that peer review does
not establish whether analytic results
are capable of being substantially
reproduced. In light of the comments,
the final guidelines in new paragraph
V.3.b.i qualify the presumption in favor
of peer-reviewed information as follows:
"However, this presumption is
rebuttable based on a persuasive
showing by the petitioner in a particular
instance."
We believe that transparency is
important for peer review, and these
guidelines set minimum standards for
the transparency of agency-sponsored
peer review. As we state in new
paragraph V.3.b.i: "If data and analytic
results have been subjected to formal,
independent, external peer review, the
information may generally be presumed
to be of acceptable objectivity. However,
this presumption is rebuttable based on
a persuasive showing by the petitioner
in a particular instance. If agency­
sponsored peer review is employed to
help satisfy the objectivity standard, the
review process employed shall meet the
general criteria for competent and
credible peer review recommended by
OMB-OIRA to the President's
Management Council (9/20/01) (http://
www.whitehouse.gov/omb/inforeg/
oira_review-process.html), namely, 'that
(a) peer reviewers be selected primarily
on the basis of necessary technical
expertise, (b) peer reviewers be expected
to disclose to agencies prior technical/
policy positions they may have taken on
the issues at hand, (c) peer reviewers be
expected to disclose to agencies their
sources of personal and institutional
funding (private or public sector), and
(d) peer reviews be conducted in an
open and rigorous manner.'"
The importance of these general
criteria for competent and credible peer
review has been supported by a number
of expert bodies. For example, "the
work of fully competent peer-review
panels can be undermined by
allegations of conflict of interest and
bias. Therefore, the best interests of the
Board are served by effective policies
and procedures regarding potential
conflicts of interest, impartiality, and
panel balance." (EPA's Science Advisory
Board Panels: Improved Policies and
Procedures Needed to Ensure
Independence and Balance, GAO-01-536,
General Accounting Office,
Washington, DC, June 2001, page 19.)

Federal Register/Vol. 67, No. 36/Friday, February 22, 2002/Notices

As another example, "risk analyses
should be peer-reviewed and
accessible, both physically and
intellectually, so that decision-makers
at all levels will be able to respond
critically to risk characterizations. The
intensity of the peer reviews should be
commensurate with the significance of
the risk or its management
implications." (Setting Priorities,
Getting Results: A New Direction for
EPA, Summary Report, National
Academy of Public Administration,
Washington, DC, April 1995, page 23.)
These criteria for peer reviewers are
generally consistent with the practices
now followed by the National Research
Council of the National Academy of
Sciences. In considering these criteria
for peer reviewers, we note that there
are many types of peer reviews and that
agency guidelines concerning the use of
peer review should tailor the rigor of
peer review to the importance of the
information involved. More generally,
agencies should define their peer-review
standards in appropriate ways, given the
nature and importance of the
information they disseminate.
Is Journal Peer Review Always
Sufficient? Some comments argued that
journal peer review should be adequate
to demonstrate quality, even for
influential information that can be
expected to have major effects on public
policy. OMB believes that this position
overstates the effectiveness of journal
peer review as a quality-control
mechanism.
Although journal peer review is
clearly valuable, there are cases where
flawed science has been published in
respected journals. For example, the
NIH Office of Research Integrity recently
reported the following case regarding
environmental health research:
"Based on the report of an investigation
conducted by [XX] University, dated July 16,
1999, and additional analysis conducted by
ORI in its oversight review, the US Public
Health Service found that Dr. [X] engaged in
scientific misconduct. Dr. [X] committed
scientific misconduct by intentionally
falsifying the research results published in
the journal SCIENCE and by providing
falsified and fabricated materials to
investigating officials at [XX] University in
response to a request for original data to
support the research results and conclusions
reported in the SCIENCE paper. In addition,
PHS finds that there is no original data or
other corroborating evidence to support the
research results and conclusions reported in
the SCIENCE paper as a whole." (66 FR
52137, October 12, 2001).

Although such cases of falsification
are presumably rare, there is a
significant scholarly literature
documenting quality problems with
articles published in peer-reviewed
journals. "In a [peer-reviewed]
meta-analysis that surprised many, and
some doubt, researchers found little
evidence that peer review actually
improves the quality of research
papers." (See, e.g., Science, Vol. 293,
page 2187 (September 21, 2001).) In
part for this reason, many
agencies have already adopted peer
review and science advisory practices
that go beyond journal peer review. See,
e.g., Sheila Jasanoff, The Fifth Branch:
Science Advisers as Policy Makers,
Cambridge, MA, Harvard University
Press, 1990; Mark R. Powell, Science at
EPA: Information in the Regulatory
Process, Resources for the Future,
Washington, DC, 1999, pages 138-139;
151-153; Implementation of the
Environmental Protection Agency's Peer
Review Program: An SAB Evaluation of
Three Reviews, EPA-SAB-RSAC-01-
009, A Review of the Research Strategies
Advisory Committee (RSAC) of the EPA
Science Advisory Board (SAB),
Washington, DC, September 26, 2001.
For information likely to have an
important public policy or private sector
impact, OMB believes that additional
quality checks beyond peer review are
appropriate.
Definition of "Influential". OMB
guidelines apply stricter quality
standards to the dissemination of
information that is considered
"influential." Comments noted that the
breadth of the definition of "influential"
in interim final paragraph V.9 requires
much speculation on the part of
agencies.
We believe that this criticism has
merit and have therefore narrowed the
definition. In this narrower definition,
"influential", when used in the phrase
"influential scientific, financial, or
statistical information", is amended to
mean that "the agency can reasonably
determine that dissemination of the
information will have or does have a
clear and substantial impact on
important public policies or important
private sector decisions." The intent of
the new phrase "clear and substantial"
is to reduce the need for speculation on
the part of agencies. We added the
present tense, "or does have", to this
narrower definition because on
occasion, an information dissemination
may occur simultaneously with a
particular policy change. In response to
a public comment, we added an explicit
reference to "financial" information as
consistent with our original intent.
Given the differences in the many
Federal agencies covered by these
guidelines, and the differences in the
nature of the information they
disseminate, we also believe it will be
helpful if agencies elaborate on this
definition of "influential" in the context
of their missions and duties, with due
consideration of the nature of the
information they disseminate. As we
state in amended paragraph V.9: "Each
agency is authorized to define
'influential' in ways appropriate for it
given the nature and multiplicity of
issues for which the agency is
responsible."
Reproducibility. As we state in new
paragraph V.3.b.ii: "If an agency is
responsible for disseminating influential
scientific, financial, or statistical
information, agency guidelines shall
include a high degree of transparency
about data and methods to facilitate the
reproducibility of such information by
qualified third parties." OMB believes
that a reproducibility standard is
practical and appropriate for
information that is considered
"influential", as defined in paragraph
V.9, that "will have or does have a
clear and substantial impact on
important public policies or important
private sector decisions." The
reproducibility standard applicable to
influential scientific, financial, or
statistical information is intended to
ensure that information disseminated by
agencies is sufficiently transparent in
terms of data and methods of analysis
that it would be feasible for a replication
to be conducted. The fact that the use
of original and supporting data and
analytic results have been deemed
"defensible" by peer-review procedures
does not necessarily imply that the
results are transparent and replicable.
Reproducibility of Original and
Supporting Data. Several of the
comments objected to the exclusion of
original and supporting data from the
reproducibility requirements.
Comments instead suggested that OMB
should apply the reproducibility
standard to original data, and that OMB
should provide flexibility to the
agencies in determining what
constitutes "original and supporting"
data. OMB agrees and asks that agencies
consider, in developing their own
guidelines, which categories of original
and supporting data should be subject to
the reproducibility standard and which
should not. To help in resolving this
issue, we also ask agencies to consult
directly with relevant scientific and
technical communities on the feasibility
of having the selected categories of
original and supporting data subject to
the reproducibility standard. Agencies
are encouraged to address ethical,
feasibility, and confidentiality issues
with care. As we state in new paragraph
V.3.b.ii.A, "Agencies may identify, in
consultation with the relevant scientific
and technical communities, those
particular types of data that can
practicably be subjected to a
reproducibility requirement, given
ethical, feasibility, or confidentiality
constraints." Further, as we state in our
expanded definition of
"reproducibility" in paragraph V.10, "If
agencies apply the reproducibility test
to specific types of original or
supporting data, the associated
guidelines shall provide relevant
definitions of reproducibility (e.g.,
standards for replication of laboratory
data)." OMB urges caution in the
treatment of original and supporting
data because it may often be impractical
or even impermissible or unethical to
apply the reproducibility standard to
such data. For example, it may not be
ethical to repeat a "negative"
(ineffective) clinical (therapeutic)
experiment, and it may not be feasible to
replicate the radiation exposures
studied after the Chernobyl accident.
When agencies submit their draft agency
guidelines for OMB review, agencies
should include a description of the
extent to which the reproducibility
standard is applicable and reflect
consultations with relevant scientific
and technical communities that were
used in developing guidelines related to
applicability of the reproducibility
standard to original and supporting
data.
It is also important to emphasize that
the reproducibility standard does not
apply to all original and supporting data
disseminated by agencies. As we state in
new paragraph V.3.b.ii.A, "With regard
to original and supporting data related
[to influential scientific, financial, or
statistical information], agency
guidelines shall not require that all
disseminated data be subjected to a
reproducibility requirement." In
addition, we encourage agencies to
address how greater transparency can be
achieved regarding original and
supporting data. As we also state in new
paragraph V.3.b.ii.A, "It is understood
that reproducibility of data is an
indication of transparency about
research design and methods and thus
a replication exercise (i.e., a new
experiment, test, or sample) shall not be
required prior to each dissemination."
Agency guidelines need to achieve a
high degree of transparency about data
even when reproducibility is not
required.
Reproducibility of Analytic Results.
Many public comments were critical of
the reproducibility standard and
expressed concern that agencies would

be required to reproduce each analytical
result before it is disseminated. While
several comments commended OMB for
establishing an appropriate balance in
the "capable of being substantially
reproduced" standard, others
considered this standard to be
inherently subjective. There were also
comments that suggested the standard
would cause more burden for agencies.
It is not OMB's intent that each
agency must reproduce each analytic
result before it is disseminated. The
purpose of the reproducibility standard
is to cultivate a consistent agency
commitment to transparency about how
analytic results are generated: the
specific data used, the various
assumptions employed, the specific
analytic methods applied, and the
statistical procedures employed. If
sufficient transparency is achieved on
each of these matters, then an analytic
result should meet the "capable of being
substantially reproduced" standard.
While there is much variation in types
of analytic results, OMB believes that
reproducibility is a practical standard to
apply to most types of analytic results.
As we state in new paragraph V.3.b.ii.B,
"With regard to analytic results related
[to influential scientific, financial, or
statistical information], agency
guidelines shall generally require
sufficient transparency about data and
methods that an independent reanalysis
could be undertaken by a qualified
member of the public. These
transparency standards apply to agency
analysis of data from a single study as
well as to analyses that combine
information from multiple studies." We
elaborate upon this principle in our
expanded definition of
"reproducibility" in paragraph V.10:
"With respect to analytic results,
'capable of being substantially
reproduced' means that independent
analysis of the original or supporting
data using identical methods would
generate similar analytic results, subject
to an acceptable degree of imprecision
or error."
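The tolerance language in this definition lends itself to a simple check. The sketch below is a minimal, hypothetical illustration rather than a procedure prescribed by these guidelines: the data set, the reported value, and the tolerance are invented, and the "identical method" is an ordinary arithmetic mean.

```python
import math

def substantially_reproduced(reported, reanalyzed, rel_tol):
    """True if an independent reanalysis, run with identical methods on
    the original data, matches the reported result within an acceptable
    degree of imprecision (rel_tol)."""
    return math.isclose(reported, reanalyzed, rel_tol=rel_tol)

# Hypothetical original data and the value stated in the dissemination.
original_data = [2.1, 2.4, 1.9, 2.6, 2.0, 2.3]
reported_mean = 2.22

# Independent reanalysis using the identical method: an arithmetic mean.
reanalyzed_mean = sum(original_data) / len(original_data)

# For more important information, the tolerated imprecision is reduced.
print(substantially_reproduced(reported_mean, reanalyzed_mean, rel_tol=0.01))
```

Here the reanalysis yields roughly 2.217, within one percent of the reported 2.22, so the result would count as capable of being substantially reproduced at that tolerance; tightening `rel_tol` mirrors the guidelines' rule that more important information tolerates less imprecision.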
Even in a situation where the original
and supporting data are protected by
confidentiality concerns, or the analytic
computer models or other research
methods may be kept confidential to
protect intellectual property, it may still
be feasible to have the analytic results
subject to the reproducibility standard.
For example, a qualified party,
operating under the same
confidentiality protections as the
original analysts, may be asked to use
the same data, computer model, or
statistical methods to replicate the
analytic results reported in the original
study. See, e.g., "Reanalysis of the
Harvard Six Cities Study and the
American Cancer Society Study of
Particulate Air Pollution and Mortality,"
A Special Report of the Health Effects
Institute's Particle Epidemiology
Reanalysis Project, Cambridge, MA,
2000.

The primary benefit of public
transparency is not necessarily that
errors in analytic results will be
detected, although error correction is
clearly valuable. The more important
benefit of transparency is that the public
will be able to assess how much an
agency's analytic result hinges on the
specific analytic choices made by the
agency. Concreteness about analytic
choices allows, for example, the
implications of alternative technical
choices to be readily assessed. This type
of sensitivity analysis is widely
regarded as an essential feature of
high-quality analysis, yet sensitivity
analysis cannot be undertaken by
outside parties unless a high degree of
transparency is achieved. The OMB
guidelines do not compel such
sensitivity analysis as a necessary
dimension of quality, but the
transparency achieved by
reproducibility will allow the public to
undertake sensitivity studies of interest.
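The kind of sensitivity study this transparency enables can be sketched briefly. Everything below is an illustrative assumption rather than agency practice: an invented data set containing one extreme observation, and three common estimators whose disagreement measures how much the headline statistic hinges on the analytic choice.

```python
import statistics

def trimmed_mean(values, trim=1):
    """Mean after dropping the `trim` smallest and largest observations."""
    ordered = sorted(values)[trim:len(values) - trim]
    return sum(ordered) / len(ordered)

# Hypothetical disseminated observations, including one outlier.
observations = [3.1, 2.9, 3.0, 3.2, 9.8]

# Recompute the summary statistic under alternative analytic choices.
estimates = {
    "mean": statistics.mean(observations),
    "median": statistics.median(observations),
    "trimmed mean": trimmed_mean(observations),
}

# The spread across methods is a crude measure of sensitivity.
spread = max(estimates.values()) - min(estimates.values())
for method, value in estimates.items():
    print(f"{method}: {value:.2f}")
print(f"spread across methods: {spread:.2f}")
```

Because the outlier pulls the mean to 4.40 while the median and trimmed mean stay near 3.10, the large spread signals that this statistic is sensitive to the choice of analytic method, precisely the situation in which the guidelines expect robustness checks to be documented.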
We acknowledge that confidentiality
concerns will sometimes preclude
public access as an approach to
reproducibility. In response to public
comment, we have clarified that such
concerns do include interests in
"intellectual property." To ensure that
the OMB guidelines have sufficient
flexibility with regard to analytic
transparency, OMB has, in new
paragraph V.3.b.ii.B.i, provided agencies
an alternative approach for classes or
types of analytic results that cannot
practically be subject to the
reproducibility standard. "[In those
situations involving influential
scientific, financial, or statistical
information * * *] making the data and
methods publicly available will assist in
determining whether analytic results are
reproducible. However, the objectivity
standard does not override other
compelling interests such as privacy,
trade secrets, intellectual property, and
other confidentiality protections."
Specifically, in cases where
reproducibility will not occur due to
other compelling interests, we expect
agencies (1) to perform robustness
checks appropriate to the importance of
the information involved, e.g.,
determining whether a specific statistic
is sensitive to the choice of analytic
method, and, accompanying the
information disseminated, to document
their efforts to assure the needed
robustness in information quality, and
(2) to address in their guidelines the
degree to which they anticipate the
opportunity for reproducibility to be
limited by the confidentiality of
underlying data. As we state in new
paragraph V.3.b.ii.B.ii, "In situations
where public access to data and
methods will not occur due to other
compelling interests, agencies shall
apply especially rigorous robustness
checks to analytic results and document
what checks were undertaken. Agency
guidelines shall, however, in all cases,
require a disclosure of the specific data
sources that have been used and the
specific quantitative methods and
assumptions that have been employed."
Given the differences in the many
Federal agencies covered by these
guidelines, and the differences in
robustness checks and the level of detail
for documentation thereof that might be
appropriate for different agencies, we
also believe it will be helpful if agencies
elaborate on these matters in the context
of their missions and duties, with due
consideration of the nature of the
information they disseminate. As we
state in new paragraph V.3.b.ii.B.ii:
"Each agency is authorized to define the
type of robustness checks, and the level
of detail for documentation thereof, in
ways appropriate for it given the nature
and multiplicity of issues for which the
agency is responsible."
We leave the determination of the
appropriate degree of rigor to the
discretion of agencies and the relevant
scientific and technical communities
that work with the agencies. We do,
however, establish a general standard
for the appropriate degree of rigor in our
expanded definition of
"reproducibility" in paragraph V.10:
" 'Reproducibility' means that the
information is capable of being
substantially reproduced, subject to an
acceptable degree of imprecision. For
information judged to have more (less)
important impacts, the degree of
imprecision that is tolerated is reduced
(increased)." OMB will review each
agency's treatment of this issue when
reviewing the agency guidelines as a
whole.
Comments also expressed concerns
regarding interim final paragraph
V.3.b.iii, "making the data and models
publicly available will assist in
determining whether analytic results are
capable of being substantially
reproduced," and whether it could be
interpreted to constitute public
dissemination of these materials,
rendering moot the reproducibility test.
(For the equivalent provision, see new
paragraph V.3.b.iLB.i.) The OMB
guidelines do not require agencies to
reproduce each disseminated analytic
result by independent reanalysis. Thus,
public dissemination of data and
models per se does not mean that the
analytic result has been reproduced. It
means only that the result should be
capable of being reproduced. The
transparency associated with this
capability of reproduction is what the
OMB guidelines are designed to
achieve.

We also want to build on a general
observation that we made in our final
guidelines published in September
2001. In those guidelines we stated: "...
in those situations involving influential
scientific[, financial,] or statistical
information, the substantial
reproducibility standard is added as a
quality standard above and beyond
some peer review quality standards" (66
FR 49722 (September 28, 2001)). A
hypothetical example may serve to
illustrate this point. Assume that two
Federal agencies initiated or sponsored
the dissemination of five scientific
studies after October 1, 2002 (see
paragraph III.4) that were, before
dissemination, subjected to formal,
independent, external peer review, i.e.,
that met the presumptive standard for
"objectivity" under paragraph V.3.b.i.
Further assume, at the time of
dissemination, that neither agency
reasonably expected that the
dissemination of any of these studies
would have "a clear and substantial
impact" on important public policies,
i.e., that these studies were not
considered "influential" under
paragraph V.9, and thus not subject to
the reproducibility standards in
paragraphs V.3.b.ii.A or B. Then
assume, two years later, in 2005, that
one of the agencies decides to issue an
important and far-reaching regulation
based clearly and substantially on the
agency's evaluation of the analytic
results set forth in these five studies and
that such agency reliance on these five
studies as published in the agency's
notice of proposed rulemaking would
constitute dissemination of these five
studies. These guidelines would require
the rulemaking agency, prior to
publishing the notice of proposed
rulemaking, to evaluate these five
studies to determine if the analytic
results stated therein would meet the
"capable of being substantially
reproduced" standards in paragraph
V.3.b.ii.B and, if necessary, related
standards governing original and
supporting data in paragraph V.3.b.ii.A.
If the agency were to decide that any of
the five studies would not meet the
reproducibility standard, the agency
may still rely on them but only if they
satisfy the transparency standard and,
as applicable, the disclosure of
robustness checks required by these
guidelines. Otherwise, the agency
should not disseminate any of the
studies that did not meet the applicable
standards in the guidelines at the time
it publishes the notice of proposed
rulemaking.

Some comments suggested that OMB
consider replacing the reproducibility
standard with a standard concerning
"confirmation" of results for influential
scientific and statistical information.
Although we encourage agencies to
consider "confirmation" as a relevant
standard, at least in some cases, for
assessing the objectivity of original and
supporting data, we believe that
"confirmation" is too stringent a
standard to apply to analytic results.
Often the regulatory impact analysis
prepared by an agency for a major rule,
for example, will be the only formal
analysis of an important subject. It
would be unlikely that the results of the
regulatory impact analysis had already
been confirmed by other analyses. The
"capable of being substantially
reproduced" standard is less stringent
than a "confirmation" standard because
it simply requires that an agency's
analysis be sufficiently transparent that
another qualified party could replicate it
through reanalysis.

Health, Safety, and Environmental
Information. We note, in the scientific
context, that in 1996 the Congress, for
health decisions under the Safe
Drinking Water Act, adopted a basic
standard of quality for the use of science
in agency decisionmaking. Under 42
U.S.C. 300g-1(b)(3)(A), an agency is
directed, "to the degree that an Agency
action is based on science," to use "(i)
the best available, peer-reviewed
science and supporting studies
conducted in accordance with sound
and objective scientific practices; and
(ii) data collected by accepted methods
or best available methods (if the
reliability of the method and the nature
of the decision justifies use of the
data)."

We further note that in the 1996
amendments to the Safe Drinking Water
Act, Congress adopted a basic quality
standard for the dissemination of public
information about risks of adverse
health effects. Under 42 U.S.C.
300g-1(b)(3)(B), the agency is directed,
"to ensure that the presentation of
information [risk] effects is
comprehensive, informative, and
understandable." The agency is further
directed, "in a document made available
to the public in support of a regulation
[to] specify, to the extent practicable:
(i) each population addressed by any
estimate [of applicable risk effects]; (ii)
the expected risk or central estimate of
risk for the specific populations
[affected]; (iii) each appropriate upper-
bound or lower-bound estimate of risk;
(iv) each significant uncertainty
identified in the process of the
assessment of [risk] effects and the
studies that would assist in resolving
the uncertainty; and (v) peer-reviewed
studies known to the [agency] that
support, are directly relevant to, or fail
to support any estimate of [risk] effects
and the methodology used to reconcile
inconsistencies in the scientific data."
As suggested in several comments, we
have included these congressional
standards directly in new paragraph
V.3.b.ii.C, and made them applicable to
the information disseminated by all the
agencies subject to these guidelines:
"With regard to analysis of risks to
human health, safety, and the
environment maintained or
disseminated by the agencies, agencies
shall either adopt or adapt the quality
principles applied by Congress to risk
information used and disseminated
pursuant to the Safe Drinking Water Act
Amendments of 1996 (42 U.S.C. 300g-
1(b)(3)(A) & (B))." The word "adapt" is
intended to provide agencies flexibility
in applying these principles to various
types of risk assessment.
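The five statutory disclosure elements quoted above can be read as a checklist. The sketch below models that checklist as a simple record; the class and field names are hypothetical illustrations for this discussion, not statutory terms or any agency's actual data format.

```python
from dataclasses import dataclass

@dataclass
class RiskCharacterization:
    """Illustrative container for the five elements that 42 U.S.C.
    300g-1(b)(3)(B) asks agencies to specify when presenting risk."""
    populations: list            # (i) each population addressed by an estimate
    central_estimates: dict      # (ii) expected or central risk per population
    bound_estimates: dict        # (iii) upper-bound and lower-bound estimates
    uncertainties: list          # (iv) significant uncertainties and the
                                 #      studies that would help resolve them
    peer_reviewed_studies: list  # (v) studies that support, are relevant to,
                                 #     or fail to support the estimates

    def is_complete(self):
        """True when every one of the five disclosure elements is present."""
        return all([self.populations, self.central_estimates,
                    self.bound_estimates, self.uncertainties,
                    self.peer_reviewed_studies])

# Hypothetical filing touching all five elements.
filing = RiskCharacterization(
    populations=["general adult population", "children under six"],
    central_estimates={"general adult population": 1e-6,
                       "children under six": 3e-6},
    bound_estimates={"general adult population": (5e-7, 2e-6),
                     "children under six": (1e-6, 6e-6)},
    uncertainties=["exposure-duration assumptions"],
    peer_reviewed_studies=["Study A (supports)", "Study B (fails to support)"],
)
print(filing.is_complete())
```

A pre-dissemination review could flag any draft for which `is_complete()` is false, to the extent practicable, before the supporting document is made public.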
Comments also argued that the
continued flow of vital information from
agencies responsible for disseminating
health and medical information to
medical providers, patients, and the
public may be disrupted due to these
peer review and reproducibility
standards. OMB responded by adding to
new paragraph V.3.b.ii.C: "Agencies
responsible for dissemination of vital
health and medical information shall
interpret the reproducibility and peer-
review standards in a manner
appropriate to assuring the timely flow
of vital information from agencies to
medical providers, patients, health
agencies, and the public. Information
quality standards may be waived
temporarily by agencies under urgent
situations (e.g., imminent threats to
public health or homeland security) in
accordance with the latitude specified
in agency-specific guidelines."
Administrative Correction
Mechanisms. In addition to commenting
on the substantive standards in these
guidelines, many of the comments noted
that the OMB guidelines on the
administrative correction of information
do not specify a time period in which
the agency investigation and response
must be made. OMB has added the
following new paragraph III.3.i to direct
agencies to specify appropriate time
periods in which the investigation and
response need to be made. "Agencies
shall specify appropriate time periods

for agency decisions on whether and
how to correct the information, and
agencies shall notify the affected
persons of the corrections made."
Several comments stated that the
OMB guidelines needed to direct
agencies to consider incorporating an
administrative appeal process into their
administrative mechanisms for the
correction of information. OMB agreed,
and added the following new paragraph
III.3.ii: "If the person who requested the
correction does not agree with the
agency's decision (including the
corrective action, if any), the person
may file for reconsideration within the
agency. The agency shall establish an
administrative appeal process to review
the agency's initial decision, and specify
appropriate time limits in which to
resolve such requests for
reconsideration." Recognizing that
many agencies already have a process in
place to respond to public concerns, it
is not necessarily OMB's intent to
require these agencies to establish a new
or different process. Rather, our intent is
to ensure that agency guidelines specify
an objective administrative appeal
process that, upon further complaint by
the affected person, reviews an agency's
decision to disagree with the correction
request. An objective process will
ensure that the office that originally
disseminates the information does not
have responsibility for both the initial
response and resolution of a
disagreement. In addition, the agency
guidelines should specify that if the
agency believes other agencies may have
an interest in the resolution of any
administrative appeal, the agency
should consult with those other
agencies about their possible interest.
Overall, OMB does not envision
administrative mechanisms that would
burden agencies with frivolous claims.
Instead, the correction process should
serve to address the genuine and valid
needs of the agency and its constituents
without disrupting agency processes.
Agencies, in making their determination
of whether or not to correct information,
may reject claims made in bad faith or
without justification, and are required to
undertake only the degree of correction
that they conclude is appropriate for the
nature and timeliness of the information
involved, and explain such practices in
their annual fiscal year reports to OMB.
OMB's issuance of these final
guidelines is the beginning of an
evolutionary process that will include
draft agency guidelines, public
comment, final agency guidelines,
development of experience with OMB
and agency guidelines, and continued
refinement of both OMB and agency
guidelines. Just as OMB requested

public comment before issuing these
final guidelines, OMB will refine these
guidelines as experience develops and
further public comment is obtained.
Dated: December 21, 2001.
John D. Graham,
Administrator, Office of Information and
Regulatory Affairs.

Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by Federal Agencies

I. OMB Responsibilities
Section 515 of the Treasury and
General Government Appropriations
Act for FY2001 (Public Law 106-554)
directs the Office of Management and
Budget to issue government-wide
guidelines that provide policy and
procedural guidance to Federal agencies
for ensuring and maximizing the
quality, objectivity, utility, and integrity
of information, including statistical
information, disseminated by Federal
agencies.
II. Agency Responsibilities
Section 515 directs agencies subject to
the Paperwork Reduction Act (44 U.S.C.
3502(1)) to:
1. Issue their own information quality
guidelines ensuring and maximizing the
quality, objectivity, utility, and integrity
of information, including statistical
information, disseminated by the agency
no later than one year after the date of
issuance of the OMB guidelines;
2. Establish administrative
mechanisms allowing affected persons
to seek and obtain correction of
information maintained and
disseminated by the agency that does
not comply with these OMB guidelines;
and
3. Report to the Director of OMB the
number and nature of complaints
received by the agency regarding agency
compliance with these OMB guidelines
concerning the quality, objectivity,
utility, and integrity of information and
how such complaints were resolved.
III. Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by Federal Agencies
1. Overall, agencies shall adopt a
basic standard of quality (including
objectivity, utility, and integrity) as a
performance goal and should take
appropriate steps to incorporate
information quality criteria into agency
information dissemination practices.
Quality is to be ensured and established
at levels appropriate to the nature and
timeliness of the information to be
disseminated. Agencies shall adopt
specific standards of quality that are
appropriate for the various categories of
information they disseminate.
2. As a matter of good and effective
agency information resources
management, agencies shall develop a
process for reviewing the quality
(including the objectivity, utility, and
integrity) of information before it is
disseminated. Agencies shall treat
information quality as integral to every
step of an agency's development of
information, including creation,
collection, maintenance, and
dissemination. This process shall enable
the agency to substantiate the quality of
the information it has disseminated
through documentation or other means
appropriate to the information.
3. To facilitate public review, agencies
shall establish administrative
mechanisms allowing affected persons
to seek and obtain, where appropriate,
timely correction of information
maintained and disseminated by the
agency that does not comply with OMB
or agency guidelines. These
administrative mechanisms shall be
flexible, appropriate to the nature and
timeliness of the disseminated
information, and incorporated into
agency information resources
management and administrative
practices.
i. Agencies shall specify appropriate
time periods for agency decisions on
whether and how to correct the
information, and agencies shall notify
the affected persons of the corrections
made.
ii. If the person who requested the
correction does not agree with the
agency's decision (including the
corrective action, if any), the person
may file for reconsideration within the
agency. The agency shall establish an
administrative appeal process to review
the agency's initial decision, and specify
appropriate time limits in which to
resolve such requests for
reconsideration.
4. The agency's pre-dissemination
review, under paragraph III.2, shall
apply to information that the agency
first disseminates on or after October 1,
2002. The agency's administrative
mechanisms, under paragraph III.3,
shall apply to information that the
agency disseminates on or after October
1, 2002, regardless of when the agency
first disseminated the information.

IV. Agency Reporting Requirements

1. Agencies must designate the Chief Information Officer or another official to be responsible for agency compliance with these guidelines.

2. The agency shall respond to complaints in a manner appropriate to the nature and extent of the complaint. Examples of appropriate responses include personal contacts via letter or telephone, form letters, press releases or mass mailings that correct a widely disseminated error or address a frequently raised complaint.

3. Each agency must prepare a draft report, no later than April 1, 2002, providing the agency's information quality guidelines and explaining how such guidelines will ensure and maximize the quality, objectivity, utility, and integrity of information, including statistical information, disseminated by the agency. This report must also detail the administrative mechanisms developed by that agency to allow affected persons to seek and obtain appropriate correction of information maintained and disseminated by the agency that does not comply with the OMB or the agency guidelines.

4. The agency must publish a notice of availability of this draft report in the Federal Register, and post this report on the agency's web site, to provide an opportunity for public comment.

5. Upon consideration of public comment and after appropriate revision, the agency must submit this draft report to OMB for review regarding consistency with these OMB guidelines no later than July 1, 2002. Upon completion of that OMB review and completion of this report, agencies must publish notice of the availability of this report in its final form in the Federal Register, and post this report on the agency's web site no later than October 1, 2002.

6. On an annual fiscal-year basis, each agency must submit a report to the Director of OMB providing information (both quantitative and qualitative, where appropriate) on the number and nature of complaints received by the agency regarding agency compliance with these OMB guidelines and how such complaints were resolved. Agencies must submit these reports no later than January 1 of each following year, with the first report due January 1, 2004.

V. Definitions

1. "Quality" is an encompassing term comprising utility, objectivity, and integrity. Therefore, the guidelines sometimes refer to these four statutory terms, collectively, as "quality."

2. "Utility" refers to the usefulness of the information to its intended users, including the public. In assessing the usefulness of information that the agency disseminates to the public, the agency needs to consider the uses of the information not only from the perspective of the agency but also from the perspective of the public. As a result, when transparency of information is relevant for assessing the information's usefulness from the public's perspective, the agency must take care to ensure that transparency has been addressed in its review of the information.

3. "Objectivity" involves two distinct elements, presentation and substance.

a. "Objectivity" includes whether disseminated information is being presented in an accurate, clear, complete, and unbiased manner. This involves whether the information is presented within a proper context. Sometimes, in disseminating certain types of information to the public, other information must also be disseminated in order to ensure an accurate, clear, complete, and unbiased presentation. Also, the agency needs to identify the sources of the disseminated information (to the extent possible, consistent with confidentiality protections) and, in a scientific, financial, or statistical context, the supporting data and models, so that the public can assess for itself whether there may be some reason to question the objectivity of the sources. Where appropriate, data should have full, accurate, transparent documentation, and error sources affecting data quality should be identified and disclosed to users.

b. In addition, "objectivity" involves a focus on ensuring accurate, reliable, and unbiased information. In a scientific, financial, or statistical context, the original and supporting data shall be generated, and the analytic results shall be developed, using sound statistical and research methods.

i. If data and analytic results have been subjected to formal, independent, external peer review, the information may generally be presumed to be of acceptable objectivity. However, this presumption is rebuttable based on a persuasive showing by the petitioner in a particular instance. If agency-sponsored peer review is employed to help satisfy the objectivity standard, the review process employed shall meet the general criteria for competent and credible peer review recommended by OMB-OIRA to the President's Management Council (9/20/01) (http://www.whitehouse.gov/omb/inforeg/oira_review-process.html), namely, "that (a) peer reviewers be selected primarily on the basis of necessary technical expertise, (b) peer reviewers be expected to disclose to agencies prior technical/policy positions they may have taken on the issues at hand, (c) peer reviewers be expected to disclose to agencies their sources of personal and institutional funding (private or public sector), and (d) peer reviews be conducted in an open and rigorous manner."
ii. If an agency is responsible for
disseminating influential scientific,
financial, or statistical information,
agency guidelines shall include a high
degree of transparency about data and
methods to facilitate the reproducibility
of such information by qualified third
parties.
A. With regard to original and
supporting data related thereto, agency
guidelines shall not require that all
disseminated data be subjected to a
reproducibility requirement. Agencies
may identify, in consultation with the
relevant scientific and technical
communities, those particular types of
data that can practicably be subjected to
a reproducibility requirement, given
ethical, feasibility, or confidentiality
constraints. It is understood that
reproducibility of data is an indication
of transparency about research design
and methods and thus a replication
exercise (i.e., a new experiment, test, or
sample) shall not be required prior to
each dissemination.
B. With regard to analytic results
related thereto, agency guidelines shall
generally require sufficient transparency
about data and methods that an
independent reanalysis could be
undertaken by a qualified member of the
public. These transparency standards
apply to agency analysis of data from a
single study as well as to analyses that
combine information from multiple
studies.
i. Making the data and methods
publicly available will assist in
determining whether analytic results are
reproducible. However, the objectivity
standard does not override other
compelling interests such as privacy,
trade secrets, intellectual property, and
other confidentiality protections.
ii. In situations where public access to
data and methods will not occur due to
other compelling interests, agencies
shall apply especially rigorous
robustness checks to analytic results
and document what checks were
undertaken. Agency guidelines shall,
however, in all cases, require a
disclosure of the specific data sources
that have been used and the specific
quantitative methods and assumptions
that have been employed. Each agency
is authorized to define the type of
robustness checks, and the level of

detail for documentation thereof, in
ways appropriate for it given the nature
and multiplicity of issues for which the
agency is responsible.
C. With regard to analysis of risks to
human health, safety and the
environment maintained or
disseminated by the agencies, agencies
shall either adopt or adapt the quality
principles applied by Congress to risk
information used and disseminated
pursuant to the Safe Drinking Water Act
Amendments of 1996 (42 U.S.C. 300g-
1(b)(3)(A) & (B)). Agencies responsible
for dissemination of vital health and
medical information shall interpret the
reproducibility and peer-review
standards in a manner appropriate to
assuring the timely flow of vital
information from agencies to medical
providers, patients, health agencies, and
the public. Information quality
standards may be waived temporarily by
agencies under urgent situations (e.g.,
imminent threats to public health or
homeland security) in accordance with
the latitude specified in agency-specific
guidelines.
4. "Integrity" refers to the security of
information: protection of the
information from unauthorized access
or revision, to ensure that the
information is not compromised
through corruption or falsification.
5. "Information" means any
communication or representation of
knowledge such as facts or data, in any
medium or form, including textual,
numerical, graphic, cartographic,
narrative, or audiovisual forms. This
definition includes information that an
agency disseminates from a web page,
but does not include the provision of
hyperlinks to information that others
disseminate. This definition does not
include opinions, where the agency's
presentation makes it clear that what is
being offered is someone's opinion
rather than fact or the agency's views.
6. "Government information" means
information created, collected,
processed, disseminated, or disposed of
by or for the Federal Government.
7. "Information dissemination
product" means any book, paper, map,
machine-readable material, audiovisual
production, or other documentary
material, regardless of physical form or
characteristic, an agency disseminates to
the public. This definition includes any
electronic document, CD-ROM, or web
page.
8. "Dissemination" means agency
initiated or sponsored distribution of

information to the public (see 5 CFR
1320.3(d) (definition of "Conduct or
Sponsor")). Dissemination does not
include distribution limited to
government employees or agency
contractors or grantees; intra- or inter­
agency use or sharing of government
information; and responses to requests
for agency records under the Freedom of
Information Act, the Privacy Act, the
Federal Advisory Committee Act or
other similar law. This definition also
does not include distribution limited to
correspondence with individuals or
persons, press releases, archival records,
public filings, subpoenas or adjudicative
processes.
9. "Influential", when used in the
phrase "influential scientific, financial,
or statistical information", means that
the agency can reasonably determine
that dissemination of the information
will have or does have a clear and
substantial impact on important public
policies or important private sector
decisions. Each agency is authorized to
define "influential" in ways appropriate
for it given the nature and multiplicity
of issues for which the agency is
responsible.
10. "Reproducibility" means that the
information is capable of being
substantially reproduced, subject to an
acceptable degree of imprecision. For
information judged to have more (less)
important impacts, the degree of
imprecision that is tolerated is reduced
(increased). If agencies apply the
reproducibility test to specific types of
original or supporting data, the
associated guidelines shall provide
relevant definitions of reproducibility
(e.g., standards for replication of
laboratory data). With respect to
analytic results, "capable of being
substantially reproduced" means that
independent analysis of the original or
supporting data using identical methods
would generate similar analytic results,
subject to an acceptable degree of
imprecision or error.
[FR Doc. 02-59 Filed 1-2-02; 1:36 pm]
BILLING CODE 3110-01-M

Editorial Note: Due to numerous errors,
this document is being reprinted in its
entirety. It was originally printed in the
Federal Register on Thursday, January 3,
2002 at 67 FR 369-378 and was corrected on
Tuesday, February 5, 2002 at 67 FR 5365.
[FR Doc. R2-59 Filed 2-21-02; 8:45 am]
BILLING CODE 1505-01-D

From: Cotter, Sandra, Ms, OSD-ATL
Sent: Monday, September 30, 2002 09:00
To: Ledbetter, George, COL, DoD OGC
Subject: FW: Rialto

Here is the version sent to Mr. Woodley.

-----Original Message-----
From: Kratz, Kurt, , OSD-ATL
Sent: Monday, September 09, 2002 8:45 AM
To: Woodley Jr., John, Mr, OSD-ATL
Cc: Armstrong, Brett, LTC, OSD-ATL; Ungaro, Ronald, CDR, OSD-ATL; Cotter, Sandra, Ms, OSD-ATL; Ledbetter, George, COL, DoD OGC
Subject: FW: Rialto

Sir,

Rick Newsome suspects that Mr. Barry Groveman, of the Inland Empire Perchlorate Task Force (IEPTF), may contact Mr. DuBois in a further attempt to gain a commitment from DoD to participate in Potentially Responsible Party (PRP) negotiations for cleanup of perchlorate contamination in drinking water in San Bernardino, California. The info paper enclosed is the latest update on the site, "Rialto."

Recommend that the MAs forward any calls to Col George Ledbetter of OGC. He is up to date on the legal issues involved.

Kurt

Rialto Ammunition Storage Point/Inland Empire Site

Issue: Mr. Barry Groveman, of the Inland Empire Perchlorate Task Force (IEPTF), may contact Mr. DuBois in a further attempt to gain a commitment from DoD to participate in Potentially Responsible Party (PRP) negotiations for cleanup of perchlorate contamination in drinking water in San Bernardino, California.

Background: As part of the Formerly Used Defense Sites (FUDS) program, the Corps of Engineers completed two eligibility assessments on the property, both of which suggested that perchlorate was not present during the time of DoD ownership and control of the former Rialto Ammunition Storage Point. The assessments are not 100% conclusive, and the Corps is considering expanding the eligibility assessment in FY 2003 specifically to address potential perchlorate contamination. Subsequent to DoD ownership, records indicate that several DoD contractors operated at the property, with contracts with both the Air Force and Navy, and may have been indemnified for their activities. Therefore, in regard to liability, the site appears to have little FUDS interest, although it could evolve into a complex contractual/PRP situation for DoD.

Current Status: In a conference call on August 15, 2002, representatives of IEPTF requested that the Army agree to participate in PRP negotiations allocating financial responsibility for cleanup of perchlorate contamination in the drinking water of four municipalities near San Bernardino, California. IEPTF alleged that the former Rialto Ammunition Storage Point contributed to the contamination at the site. In subsequent telephone calls, Mr. Ryan Heite and Mr. Barry Groveman, of IEPTF, requested that the Army designate an individual to participate in the PRP negotiations.

IEPTF is demanding that DoD agree to financially contribute to:
1) provision of an alternate water supply until perchlorate wellhead treatment can be made available,
2) payment of perchlorate wellhead treatment capital and O&M costs,
3) indemnification of IEPTF municipalities from toxic tort claims/suits through payment for insurance,
4) a long-term commitment to restoration of perchlorate-contaminated groundwater,
5) reimbursement for DoD's share of the IEPTF municipalities' past response costs, and
6) reimbursement of DoD's share of any litigation costs.

On September 4, 2002, the Army informed Mr. Groveman that, before it could respond to IEPTF on any of these issues, it must first establish whether or not DoD caused or controlled activities that could have potentially contributed to the perchlorate contamination at the site. The Army also informed Mr. Groveman that the Department of Justice (DoJ) would represent the Army in any discussion, or negotiation, of DoD liability at this property. Further, even if the Army and IEPTF were to come to agreement on the degree of contribution at the site, DoD could not commit to any direct payments to IEPTF municipalities.

(b)(5)

From: Cotter, Sandra, Ms, OSD-ATL
Sent: Wednesday, June 25, 2003 13:57
To: 'Bauermeister Fred'
Subject: FW: Perchlorates - OSD Human Pathway Question

-----Original Message-----
From: Newsome, Richard E Mr ASA-I&E [mailto:(b)(6)]
Sent: Wednesday, June 25, 2003 1:27 PM
To: Cotter, Sandra Ms OSD-ATL
Cc: Kratz, Kurt OSD-ATL; Garg, Malcolm J ACSIM/CH2M HILL; Van Brocklin, Connie H Ms ACSIM; 'Gentile, Laura'; Ganta, Krishna Mr ACSIM; Bell, David E Mr OGC; Fatz, Raymond J Mr ASA-I&E; Rogers, Brian LTC ASA-I&E; Lockhart, Vivian R Ms ASA-I&E
Subject: Perchlorates - OSD Human Pathway Question

Sandy,

I am told you requested information on Army installations/locations where there may be a completed pathway for human exposure to perchlorate. Ms. Van Brocklin et al. have prepared the information provided at the attachment. It is provided for your use as appropriate.

Rick
-----Original Message-----
From: Van Brocklin, Connie H Ms ACSIM
Sent: Wednesday, June 25, 2003 8:47 AM
To: Newsome, Richard E Mr ASA-I&E
Cc: Garg, Malcolm J ACSIM/CH2M HILL
Subject: Perchlorates Info for Ms. Cotter

Rick,

Sandy Cotter called last week and said that Ben Cohen also wanted to know if there is an exposure pathway for the sites where we have quantitative data. If you approve, can I send to Ms. Cotter?

Thanks,

Connie

«Is There an Exposure Pathway.doc»

Connie Van Brocklin
Army Environmental Programs
ATTN: DAIM-EDT
Room (b)(2)
600 Army Pentagon
Washington, DC 20310

DAIM-EDT
25 June 2003

Is There an Exposure Pathway?

Installations have been asked to address the question, "Is there a perchlorate exposure pathway that could threaten public health?" Determinations of pathways often require rigorous site investigation and interpretation of data. These are preliminary answers based on best available information.

Aberdeen Proving Ground - Yes. Perchlorate has been found in drinking supply wells (ND - 5 ppb) for the City of Aberdeen. Perchlorate has been detected in the aquifer that supplies water to the City of Aberdeen.

ARDEC (Picatinny) - No known or suspected pathway. One GW detection of 19.6 ppb in the Open Detonation area and a 605 ppb detection in what is thought to be an isolated, non-source aquifer. Perchlorate was also sampled at the southern boundary and none was found.

Iowa Ammunition Plant - No known exposure pathway; none suspected.

Lake City Army Ammunition Plant - No exposure pathway suspected. 2 of 30 wells sampled with results of 23 and 79 ppb; no drinking water sources are known to be threatened.

Lone Star Army Ammunition Plant - Unknown. 8 ppb found in GW by the burning grounds and 9 ppb found next to a production bldg.

Longhorn Army Ammunition Plant - No human health exposure determined. Perchlorate is not approaching drinking water sources. The installation has defined the nature and extent of perchlorate contamination in soil and groundwater at various locations on Longhorn property. Monitoring of surface water continues on a quarterly basis per an agreement with regulatory agencies. No human/public exposure pathway has been determined.

Massachusetts Military Reservation - Yes. Perchlorate has been detected in wells that supply the town of Bourne (< 1 ppb) and in a private-residence Bourne well (1.75 ppb). Perchlorate has been found in a sole-source aquifer located under MMR (up to 300 ppb).

Redstone Arsenal - Possible pathway exists; however, sampling indicates drinking water sources have not been impacted. Elevated levels of perchlorate have been detected in springs that empty into a creek that, in turn, empties into the river. This site is located one mile upstream of the arsenal's drinking water intake. The drinking water intake has been analyzed for perchlorate and the results are non-detect.

From: Kratz, Kurt, , OSD-ATL
Sent: Tuesday, March 23, 2004 09:44
To: Cotter, Sandra, Ms, OSD-ATL; Wennerberg, Linda Dr OSD-ATL; Willging, Joseph Mr DoD OGC
Subject: FW: Perchlorate White paper

fyi
-----Original Message-----
From: Bowling, Curtis, Mr, OSD-ATL
Sent: Tuesday, March 23, 2004 8:46 AM
To: Kratz, Kurt, , OSD-ATL; Larkin, Janice, Ms, OSD-ATL; Beard, Bruce, Mr, OSD-ATL
Subject: FW: Perchlorate White paper

fyi

-----Original Message-----
From: Bowling, Curtis, Mr, OSD-ATL
Sent: Thursday, March 18, 2004 9:32 AM
To: Beehler, Alex, Mr, OSD-ATL; Cohen, Ben, Mr, DoD OGC
Cc: Wright, William, CAPT, OSD-ATL; Kiser, Richard CAPT DDESB; Bowling, Curtis, Mr, OSD-ATL; Kaminski, Art, LtCol, OSD-ATL; Nicholls, William, Mr, OSD-ATL
Subject: Perchlorate White paper

«PerchlorateWhitePaper.doc»

I have attached the DDESB's draft of a Perchlorate White Paper on "Benefits of Using Perchlorate in Military Munitions." Most of the information was provided by the Chemical Propulsion Information Agency, Office of Munitions, and the Navy (China Lake). They have tried to quantify areas where we located data, but none of our POCs in the acquisition area had any firm numbers on such short notice. The DDESB and I would be glad to meet with you to discuss the paper.

Curtis


Document 890

Three page White Paper withheld

Title: Benefits of Using Perchlorates in Military Munitions

Exemption 5

From: Kratz, Kurt, , OSD-ATL
Sent: Thursday, October 09, 2003 13:49
To: Walton, Christina, LtCol, OSD-ATL
Cc: Cornell Jeff Lt. Col SAF/IE; Rogers Daniel Col AFLSA/JACE; McManus, Edward, Mr, OSD-ATL; Cotter, Sandra, Ms, OSD-ATL
Subject: Perchlorate Info

Christina,

As promised. Sorry it took a couple of days. Remember - close hold; background use for you. Certainly don't mind if you use these ideas to start doing investigation on your own.

Thx,

Kurt


Document 290

Five page draft memorandum with charts withheld

Title: Draft DoD Perchlorate Site Characterization and Treatment Cost Estimate

Exemption 5

Document 290

Ten page draft chart withheld

Title: Revising the Cost Estimate Based on Real Plume Volumes

Exemption 5

Document 290

Two page draft memorandum withheld

Title: Summary of Estimated Perchlorate Costs

Exemption 5

Issue Background

• Perchlorate case
- Establishment of standards by guidance rather than rulemaking

• EPA toxicity assessment
- Dispute over scientific basis for draft guidance
- Assessment drives operational risk and financial liability

• DoD role in definition of:
- Toxicology science

- Cleanup/drinking water treatment

- Development of new process for rulemaking

o OMB lead

PRIVILEGED//PREDECISIONAL//DELIBERATIVE//DO NOT DISCLOSE UNDER FOIA

Perchlorate Fundamentals

• Chemistry
- Stable salt in soil and groundwater

• Perchlorate may occur naturally
- South West and South Central Texas
o Salado formation (potash) identified by USGS in 2002
o 4 - 58 ppb in wells, 46 ppb in finished drinking water
- Chile
o Potash formation
o 110 ppb in drinking water in surrounding communities

• Industrial Uses:
- Energetic for solid rockets and munitions (~ 90 percent)
- Fireworks, airbag inflators, medicines (~ 10 percent)

• Contaminant in ground and surface water
- Found in groundwater under rocket manufacturing/testing facilities and live-fire impact areas
- Colorado River contamination from runoff at Kerr-McGee facility in Henderson, Nevada
o Possible effects on California and Arizona agricultural sector

• Health effects
- Iodide uptake inhibitor in thyroid (replaces iodide at receptors)
- EPA identified sensitive subpopulation as fetal
- Extremely high doses may cause developmental problems

Perchlorate Fundamentals

• Partnership with EPA, Industry, and States
- Initiated in 1996
- More proactive DoD position
- Determine extent of occurrence and health effects

• $29 million invested cumulatively through SERDP/ESTCP
- Toxicological -- $2.15M
- Ecological -- $1.89M
- Analytical -- $0.38M
- Remediation technologies -- $25M

• Surveys on DoD sites
- Mandatory
o Unregulated Contaminant Monitoring Rule
• Safe Drinking Water Act
o National Pollutant Discharge Elimination System permits
• Clean Water Act
- Voluntary
o 2001 survey

Current Regulatory Environment

• Evolution of EPA draft reference dose
- Used by EPA Regions and some States as de facto cleanup standard
o 1992 - draft reference dose at 4 ppb
o 1995 - draft reference dose at 18 ppb
o 1999 - EPA-sponsored Peer Review at 32 ppb
• Recommended more research to address uncertainty
o 2002 - draft final reference dose at 1 ppb

• State standards
- California: action level at 4 - 18 ppb
- Maryland: interim advisory level at 1 ppb
- Massachusetts: interim drinking water advice level at 1 ppb for sensitive populations
- Texas: interim action level
o Industrial at 7 - 10 ppb
o Residential at 4 ppb

• Mission effects
- Aberdeen
o Stopped training on selected ranges
- Massachusetts Military Reservation
o State asking for water supply replacement for contaminated wells
- Vandenberg
o TCE cleanup halted to preclude perchlorate re-injection into groundwater

National Academy of Science Review

• 2002 EPA Peer Review
- Draft EPA Exposure Level (RfD) of 1 ppb
- DoD, NASA disagree with EPA analysis of toxicity
o Elevated to Executive Office of the President

• Interagency oversight of NAS review of science underlying perchlorate toxicity
- Participants
o EOP leadership: OSTP, OMB, CEQ
o Sponsors: DoD, EPA, DoE, NASA
o Others: USDA, FDA, DoI

• Issue referred to National Academy of Sciences
- NAS to review underlying science of perchlorate toxicity
o Estimated completion - Nov 2004
- NAS could dispute EPA Peer Review document
o If NAS review questions science, EPA must address results

FY04 Congressional Action

• Defense Authorization Bill (Senate)
- Section 331 - Public Health Assessment of Exposure to Perchlorate
o SecDef required to provide an independent epidemiological study of exposure to perchlorate in drinking water
• Study conducted in cooperation with CDC, NIH, or another federal entity with experience in environmental toxicology
• Study to include available data on other substances that have endocrine effects similar to perchlorate to which the public is frequently exposed
• Study due 1 June 2005

• Military Construction Appropriation Bill (Senate)
- Report 108-82
o Report on activities completed by perchlorate interagency work group
o Identify sources of perchlorate at BRAC sites & develop a plan to remediate once state or federal perchlorate standards are set
o Report due 180 days after enactment

• Defense Appropriations Bill (House)
- Report 108-187
o DoD/EPA conduct joint study on perchlorate contamination in CA, AZ, NV
o DoD recommend national standard & outline steps to cleanup

Industry Activity

• American Chemistry Council
- Long-Range Research Initiative

• LRRI key focal points
- Improve science
o Identify knowledge and data gaps
o Fund and/or perform research to fill gaps
o Produce independent risk assessments suitable for informing decision-making
- Partner with private and public entities with similar regulatory concerns
- Engage regulatory agencies strategically

Current Industry Cleanup Activities

• Aerojet - Sacramento, California
- Perchlorate at ~2500 ppb
- NDMA at ~110 ppt
- TCE at ~1500 ppb

• Atlantic Research Corporation - Camden, Arkansas
- Perchlorate at ~100 ppb

• Boeing - Redlands, California
- Perchlorate at ~70 ppb

• Kerr-McGee - Las Vegas, Nevada
- Perchlorate at ~885 ppb

• Lockheed Propulsion - Redlands, California
- Perchlorate at ~70 ppb

DoD Additional Actions

• Sampling Policy
- November 2002
o DoD Components may assess perchlorate occurrence if:
• Reasonable basis for presence
• Pathway that impacts public health
o Allows Components to react to regulator requests

• California Outreach
- Conducted by ADUSD(E)
- Agreement to establish working group with California regulatory agencies
o Coordinated response to California request for all DoD installations to sample for perchlorate plus five other unregulated chemicals
- Agreement to work with Southern California water agencies
o Cooperation in testing, validation, and certification of drinking water treatment technologies

Proposed Revision to Perchlorate Sampling Policy


Emerging Regulatory Environment

• Trichloroethylene (TCE)
- NAS review of new draft cancer slope factor
- TERA review requested by Air Force

• Royal Dutch Explosive (RDX)
- Possible cancer risk identified by EPA
- Cancer slope factor and RfD published, no MCL

• N-Nitrosodimethylamine (NDMA)
- Component of rocket fuel
- Probable human carcinogen identified by EPA
- California action level is 10 ng/L

• 1,4-Dioxane
- Stabilizer for chlorinated solvents
- Probable human carcinogen (low hazard) identified by EPA
- California action level is 2 micrograms/L

• 1,2,3-Trichloropropane (TCP)
- Paint and varnish remover, cleaning, degreasing agent
- California action level is .005 micrograms/L

Additional subjects of Cal EPA survey request:

• Hexavalent Chromium
- Metal plating, corrosion inhibitor
- No state or federal regulation
- Regulation by California possible by Jan 2004 (Senate Bill 351)

• Polybrominated Diphenyl Ether (PBDE)
- Flame retardant
- No state or federal regulation

Potential Effects of Increased Regulation


Way Forward

• Emerging chemical regulatory environment
- Managed under Safe Drinking Water Act (SDWA)

• Air Force is Executive Agent for SDWA
- DoDI 4715.6, 1996

• SDWA Steering Committee
- OSD retains policy oversight function

From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:42
To: Cotter, Sandra, Ms, OSD-ATL
Subject: Perchlorate

fyi

Unknown
From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:42
To: 'Kowalczyk Daniel'
Subject: RE: FW: EPA-NAS contract

I think that is enough for now.
Kurt
-----Original Message-----
From: Kowalczyk Daniel [mail to: ,

Sent: Wednesday, August 06, 2003 9:12 AM

To: Kratz Kurt OSD-ATL

Subject: Re: FW: EPA-NAS contract

Sir,
This is what Lt Col Cornell transmitted earlier this week - it has
double pages of some and is incomplete - several pages appear to be
missing. I have already contacted Lisa Matthews and she is resubmitting
to me via fax. When I receive it, I will scan it in and send it out to
yourself, Lt Col Cornell, Col Rogers and Rick Belzer. Anyone else I
should include??

vir
Dan
"Kratz, Kurt, , OSO-ATL" wrote:
>
> Check this
> Kurt
>
> -----Original Message-----
> From: Rogers Daniel Col AFLSA/JACE
> Sent: Wednesday, August 06, 2003 8:53 AM
> To: Kratz, Kurt, , OSD-ATL
> Subject: FW: EPA-NAS contract
>
> here it is
>
> DANIEL E. ROGERS, Colonel, USAF
> Chief, Environmental Law and Litigation
> (b)(2)
>
>

>
>
>
> -----Original Message-----
> From: Cornell Jeff Lt. Col SAF/IE
> Sent: Monday, August 04, 2003 10:20 AM
> To: Kratz, Kurt, , OSD-ATL; Rogers Daniel Col AFLSA/JACE; Richard
> B. Belzer Ph.D. (E-mail); Kowalczyk Daniel (E-mail)
> Subject: EPA-NAS contract

>
> -----Original Message-----
> From: Cruz Angelyn Ctr. SAF/IEE
> Sent: Monday, August 04, 2003 10:13 AM
> To: Cornell Jeff Lt. Col SAF/IE
> Subject: FW: requested scanned documents
>

-----Original Message-----
From: Cruz Angelyn Ctr. SAF/IEE
Sent: Friday, July 11, 2003 8:33 AM
To: Cornell Jeff Lt. Col SAF/IE
Subject: requested scanned documents
«NAS Perchlorate Task Order.doc»

>

> Angelyn A. Cruz
> Office of the Deputy Assistant Secretary
> (Environment, Safety & Occupational Health)
>

>
>

Name: NAS Perchlorate Task Order.doc
Type: WINWORD File (application/msword)
Encoding: base64
Download Status: Not downloaded with message
Daniel Kowalczyk

Booz Allen Hamilton

8283 Greensboro Dr

McLean, VA 22102

(b)(2)


Unknown
From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:40
To: 'Kowalczyk Daniel'
Cc: Richard B. Belzer, Ph.D.
Subject: FW: PRIVATE: FW: Perchlorate SOW on NAS website different from our negotiated position

I need the side-by-side ASAP.
Kurt
-----Original Message-----
From: Koetz Maureen SES SAF/IE
Sent: Wednesday, August 06, 2003 9:33 AM
To: Cornell Jeff Lt. Col SAF/IE; Cohen, Ben, Mr, DoD OGC; Kratz, Kurt, ,
OSD-ATL; Meehan, Patrick, Mr, OSD-ATL; Rogers Daniel Col AFLSA/JACE
Subject: RE: PRIVATE: FW: Perchlorate SOW on NAS website different from
our negotiated position
Okay, but what exactly is it you are trying to tell me to do about this?
If you want me to go to Connaughton, I can, but I will need a copy of the
document we think we should have and the document NAS is using, and
advise that our version is the proper one. I don't mind engaging
principals on this if we should.
Thanks,
MK

-----Original Message-----
From: Cornell Jeff Lt. Col SAF/IE
Sent: Tuesday, August 05, 2003 9:37 PM
To: Koetz Maureen SES SAF/IE; Cohen, Ben, Mr, DoD OGC; Kratz, Kurt, ,
OSD-ATL; Meehan, Patrick, Mr, OSD-ATL; Rogers Daniel Col AFLSA/JACE
Subject: PRIVATE: FW: Perchlorate SOW on NAS website different from our
negotiated position
Importance: High

PREDECISIONAL DO NOT CITE OR QUOTE

vr,
jeff
-----Original Message-----
From: Cornell Jeff Lt. Col SAF/IE

To:
Sent: 8/5/2003 9:22 PM
Subject: FW: Perchlorate SOW on NAS website different from our
negotiated position
Importance: High
«Final Task Order SOW 5-13-03.doc»
Paul - as we discussed. I left another voicemail for Paul ... let's hope
this is easily resolved. I'll call you tomorrow.
Jeff
-----Original Message-----
From: Cornell Jeff Lt. Col SAF/IE
To: Panastas (E-mail)
Sent: 8/4/2003 9:18 AM
Subject: Perchlorate SOW on NAS website different from our negotiated
position
Importance: High

thanks,
jeff


Unknown
From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:38
To: Meehan, Patrick, Mr, OSD-ATL
Subject: FW: DOD/NASA/DOE Discussion

fyi
-----Original Message-----
Mr, DoD OGC; Kratz, Kurt, ,
Rusden
Lesly Civ SAF/IEE; Ashworth Richard Col SAF/IE
Subject: FW: DOD/NASA/DOE Discussion
jeff

Lesly - please contact me on the cell phone if/when a telecon is
scheduled ... thanks
-----Original Message-----
From: Bubar, Patricia
To:
Cc: Guevara, Karen; Rowley, Blaine
Sent: 8/5/2003 6:56 PM
Subject: DOD/NASA/DOE Discussion


Thanks
Patty Bubar



Unknown
From: Kratz, Kurt, , OSD-ATL
Sent: Wednesday, August 06, 2003 09:01
To: Meehan, Patrick, Mr, OSD-ATL
Subject: FW: Brief to Wynne

-----Original Message-----
From: Koetz Maureen SES SAF/IE
Sent: Wednesday, August 06, 2003 8:59 AM
To: Kratz, Kurt, , OSD-ATL
Subject: RE: Brief to Wynne
We shall stand by, thank you
-----Original Message-----
From: Kratz, Kurt, , OSD-ATL
Sent: Tuesday, August 05, 2003 5:41 PM
To: Koetz Maureen SES SAF/IE
Cc: Cornell Jeff Lt. Col SAF/IE
Subject: Brief to Wynne

Ma'am,

Mr. Grone postponed the meeting with Wynne. He wants to vet the brief with DUSD(IP) - Suzanne Patrick,
Director, Defense Systems - Glen Lamartin, and Director, Defense Procurement and Acquisition Policy,
Diedre Lee for their comments. The next time both Mr. Wynne and Mr. Grone are in the building at the same
time will be the first week in Sep. That is the proposed new brief date.

Kurt

From: Kowalczyk Daniel
Sent: Friday, June 13, 2003 08:56
To: Sandra Cotter
Subject: Last One

POTENTIAL Budgetary Impacts to RRPI Apr03.doc

Daniel Kowalczyk
Booz Allen Hamilton
8283 Greensboro Dr
McLean, VA 22102
(b)(2)

RRPI Legislation2.doc

TOC.doc

What DoD Wants Apr03.doc

Document 232

Two page memorandum withheld

Title: Potential Budgetary Impacts to DoD

Exemption 5

RRPI LEGISLATION AND

PERCHLORATE REMEDIATION

Summary of the Administration's Proposed Readiness and Range Preservation Initiative
(RRPI) Legislation

• The potential interactions between perchlorate remediation and RRPI occur in the legislation's
proposed changes to RCRA and CERCLA. The legislation also proposes some changes to
the Marine Mammal Protection Act, the Endangered Species Act, and the Clean Air Act.
• The changes to RCRA seek to codify EPA's 1997 Military Munitions Rule into statute. This
1997 rule drew the line between ranges and range activities that are exempt and those that are
subject to solid and hazardous waste regulation.
• The RRPI legislation would exclude from the definition of a "solid waste" under the Solid
Waste Disposal Act explosives, unexploded ordnance (UXO), munitions, or their
constituents when used for the following purposes:

> For training military personnel or emergency response specialists;
> Research and development, testing and evaluation of munitions and weapons systems;
> Where they are deposited on an operational range incident to their expected use or are
deposited off-range, but are promptly rendered safe and retrieved; or
> Where they are recovered and destroyed on-range during range clearance activities. This
exclusion does not include burial of unexploded ordnance when burial is not a result of
the firing of the munitions.

• Then, to be clear, it includes in the statutory definition of solid waste explosives, unexploded
ordnance, munitions, or their constituents which:
> Have been deposited on an operational range incident to use and:
> Are removed from an operational range for reclamation, treatment, storage, or disposal;
> Are disposed of by burial or landfilling;
> Migrate off-range and are not addressed by CERCLA; or
> Are deposited off-range incident to normal use and are not promptly rendered safe or
retrieved.

• The bill clarifies that such munitions or ordnance remaining on closed ranges remain subject
to existing legal requirements.
• The proposed changes to CERCLA are more complex. The major changes to DoD's
activities under CERCLA include:
> It would exempt DoD from CERCLA's section 103 requirement to report releases of
hazardous substances.
> It would eliminate the requirement under CERCLA section 120 that EPA assure that
DoD has conducted a preliminary assessment at all applicable Federal facilities.
> It retains EPA's authority under Section 106 of CERCLA to order cleanup - even on
active ranges exempt under the RCRA portions of RRPI and current regulations - if EPA
finds an imminent and substantial danger to public health.

How Does the Legislation Affect Current and Future Perchlorate Cleanups?

• The scope of activities covered by the proposed legislation is identical to, and no broader
than, that currently covered under the Clinton Administration's 1997 Military Munitions
Rule. Thus, nothing new would be exempted under RRPI that isn't already exempted by
Federal regulation.
• However, only about 30 States have adopted the Munitions Rule in their State regulations.
Since States can be more stringent than EPA's Munitions Rule, the RRPI bill would preclude
States from adopting more stringent regulations for active ranges.
• Elevating the RCRA exclusions into statute would also prevent States, EPA, and private
parties from filing suits to force cleanup under RCRA's corrective action program for exempt
activities at operational ranges.
• DoD argues that the legislation harmonizes the numerous provisions under RCRA and
CERCLA for Federal, State, and private party litigation. If RRPI became law, EPA would be
able to address problems under its CERCLA 104 and 106 authorities. If EPA and DoD failed
to devise a solution, private parties and States could then initiate action since the constituents
were "deposited off-range incident to normal use and are not promptly rendered safe or
retrieved." Thus, the legislation elevates potential conflicts between operational readiness
and the pace of environmental cleanup to the Federal level for resolution.
What the Legislation Does Not Do

• RRPI would not change the scope, pace, and authority of EPA, States, and private parties for
perchlorate cleanups of:
> Munitions or their constituents at Formerly Used Defense Sites (FUDS);
> Munitions in old landfills;
> Closed ranges;
> Releases at ranges that close in the future;
> Spills and releases at private ordnance manufacturing and storage facilities.
• The vast majority of sites with known perchlorate releases would not be affected by RRPI.
Since perchlorate has only been detected since 1997, current site investigations and cleanups
have been carried out under the exemptions of the Munitions Rule. Since current cleanup
sites have not received exemptions under the Munitions Rule, this activity is evidence that
their cleanups would be unaffected by RRPI.

Are Private Parties Exempted by RRPI?

• RRPI would codify the existing regulatory exemptions for private parties that engage in
exempt activities with munitions at test ranges they own. For example, if a private
manufacturer operated an active firing range to conduct research, development, and testing of
munitions, its range clearing activities meeting the regulatory definition would be exempt.
• Releases from munitions that migrate off any active range are - and would continue to be -
subject to Federal and State enforcement. Again, RRPI only codifies the existing provisions
of the Munitions Rule. EPA's authority to order cleanups or investigations currently
underway would not change.
• This narrow provision would appear not to exempt the most well-known perchlorate releases
from private facilities. Most perchlorate releases appear to be from closed manufacturing
operations, munitions disposal areas, and closed testing facilities.
Counters to CEPO's "Connect the Dots" Arguments
• An environmental group, the Center for Public Environmental Oversight (CEPO), and its
director Lenny Segal have circulated a "connect the dots" argument that purports to show how
RRPI exempts DoD from its perchlorate liability. This argument, and its numerous flaws,
are reviewed below.
> DoD contends that perchlorate cleanup is not required, except under statutory authority.
> The Bush Administration, and DoD in particular, is likely to delay promulgation of a
perchlorate standard.
> California is likely to adopt a stringent perchlorate standard in 2004. California's MCL
would be an applicable or relevant and appropriate requirement (ARAR) under CERCLA
and a cleanup standard under RCRA corrective action.


> However, DoD's proposed RRPI language would limit the applicability of California's
(and other States') standards under CERCLA and RCRA at operational ranges and possibly
other sites.
> Questions the validity of testimony given by EPA Assistant Administrator Suarez that the
Safe Drinking Water Act (SDWA) gives EPA the tools it needs to address groundwater
issues at sites that might be exempted from RCRA and CERCLA under the proposed
legislation. He also claims that the SDWA does not recognize State standards.
> Thus, DoD hopes to avoid full regulation of its perchlorate sites even if California and
other States issue protective standards.
(b)(5)

Perchlorate Executive Briefing Book
Table of Contents

Tab 1

What DoD Wants

Tab 2

Key Officials and Agency positions

Tab 3

Timeline of Dispute

Tab 4

Perchlorate's Importance to DoD (Operational Risk)

Tab 5

Budgetary Impacts to DoD

Tab 6

EPA's Risk Assessment Flaws

Tab 7

Peer Review Process Flaws

Tab 8

Scientific Support for DoD's Position

Tab 9

Congressional Interest Analysis

Tab 10

History of DoD/EPA Cooperation on Perchlorate

Tab 11

Perchlorate/RRPI Connection

Tab 12

Parallels with Emerging Issues

Tab 13

Media Stories

Tab 14

Official Correspondence

Document 232

Five page memorandum withheld

Title: What DoD Wants

Exemption 5

From: Green David (b)(6)
Sent: Friday, December 05, 2003 15:07
To: Cotter Sandra
Cc: Salomon Roy; Kowalczyk Daniel; McCarty Jean
Subject: DQA 2 pager

Use of the Data Quality Act in...

Attachment B.pdf Attachment A.pdf Attachment C.pdf

Sandy,

Attached is the 2-pager on using the Data Quality Act as a means to
challenge EPA's science and analysis on perchlorate. Also provided are
the attachments to the 2-pager.
There are two topics remaining:
1. Require Issuance of Statement of Limitations on Use of RfD and

Toxicity Data

2. Develop and Issue Additional Guidance on Application of Risk

Management Within the CERCLA process

Do you have a preference for which one gets finished next?
Please let me know.

David R. Green

Booz Allen Hamilton

8283 Greensboro Drive

Hamilton 4063

McLean, Virginia 22102

(v) (b)(2)
(f)

On the Web: www.boozallen.com



Use of the Data Quality Act in Challenges to Health Based Standards
Problem
The Department of Defense (DoD) has repeatedly expressed concerns regarding the data used by
the U.S. Environmental Protection Agency (EPA) in the January 2002 draft perchlorate risk
assessment. This draft risk assessment proposed a reference dose (RfD) drinking water equivalent
level (DWEL) of 1 ppb, based in part on an EPA decision to discard many of the results and
conclusions of both a 1998 risk characterization and the conclusions and recommendations from
the peer review of that risk assessment EPA conducted in late 1998-early 1999. Despite repeated
challenges to the science and health effects analyses underlying the decision to set the RfD for
perchlorate at 1 ppb, DoD has realized little success at changing EPA's latest proposal. This paper
explores the option of a formal challenge under EPA's Guidelines for Ensuring and Maximizing
the Quality, Objectivity, Utility, and Integrity of Information Disseminated by the
Environmental Protection Agency¹ (hereinafter the "EPA Guidelines") (provided at Attachment
A).

Discussion
Enacted as part of the Treasury and General Government Appropriations Act for Fiscal Year
2001 (Public Law 106-554, section 515), the Data Quality Act (DQA) amended the Paperwork
Reduction Act (PRA) (44 U.S.C. § 3501 et seq.) (see Attachment B). The DQA requires that
Federal agencies ensure the information they disseminate meets a standard of quality. The
Office of Management and Budget (OMB) issued Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies (67
F.R. 8452, February 22, 2002) (provided at Attachment C). The OMB guidance required all
Federal agencies to:
• Issue guidelines "...ensuring and maximizing the quality, objectivity, utility and integrity
of information (including statistical information) disseminated by the agency."
• Establish administrative mechanisms allowing affected persons to seek and obtain
correction of information maintained and disseminated by the agency that does not comply
with the data quality guidelines.
• Periodically report on the number and nature of complaints received by the agency and
how such complaints were handled by the agency.
The EPA Guidelines use a tiered approach to set the level of quality, objectivity, utility, and
integrity of information products that is based on the intended uses of those products. Under this
tiered system, the more important the use, the higher the quality standard. As a general matter,
EPA believes that the current statutes, regulations, and scientific practices (including both internal
quality management systems and external peer review) that EPA employs implement these
guidelines. EPA has also stated that information subjected to public review and comment will not
be considered under the EPA Guidelines, and has asserted the view that formal, independent,
external peer review, of and by itself, means information is objective. EPA does, however, allow
this presumption of objectivity to be rebutted. It is this provision allowing rebuttal of EPA's
assertions as to quality, objectivity, utility, and integrity which may offer a means to challenge
the scientific analysis underlying the 2002 perchlorate risk assessment.

¹ Issued October 15, 2002.

DRAFT
Revision 0-0
December 5, 2003

To make use of the mechanism for challenging the quality, objectivity, utility and integrity of
information used by EPA a petitioner must submit a Request for Correction (RFC) to EPA. As
discussed in the EPA Guidelines, an RFC must include:
(1) Name and contact information for the individual or organization submitting a complaint;
identification of an individual to serve as a contact.
(2) A description of the information the person believes does not comply with EPA or OMB
guidelines, including specific citations to the information and to the EPA or OMB
guidelines, if applicable.
(3) An explanation of how the information does not comply with EPA or OMB guidelines
and a recommendation of corrective action. EPA considers that the complainant has the
burden of demonstrating that the information does not comply with EPA or OMB
guidelines and that a particular, corrective action would be appropriate.
(4) An explanation of how the error affects or how a correction would benefit the requestor.
It is the responsibility of the petitioner to show how the information fails to comply with the
guidelines, and how any proposed actions to address that non-compliance resolve the data problem. Once the
petition is submitted to the EPA Office of Environmental Information (OEI), EPA begins the
administrative process for addressing the RFC by sending the RFC to the appropriate EPA office
(i.e., the office responsible for the quality of the information or document). An RFC can only be
rejected by the official at the highest organization level in the responsible EPA Office or Region.
If unsatisfied by the response a petitioner can appeal the decision by submitting a Request for
Reconsideration (RFR). As with the RFC, the RFR must include an explanation of the
disagreement with EPA's decision and a specific recommendation for addressing the conflict. The
RFR is reviewed by an executive-level panel comprised of EPA personnel. Beyond the
administrative review mechanism of the RFC and RFR, those with legal standing can seek judicial
review (judicial review is not an option available to DoD).
Recommendations
The first action that should be taken is an analysis of the DQA, OMB guidelines, and EPA
guidelines to determine if DoD is in any way prohibited from submitting a challenge under the
DQA and the EPA or OMB Guidelines. For example, under the construct of the unitary executive,
DoD may be prohibited from submitting an RFC challenging the analyses and data underlying the
draft risk assessments. As a matter requiring legal analysis, the Office of the Deputy Under
Secretary of Defense (Installations and Environment) should request a written legal opinion from
the Office of General Counsel. Since this opinion will answer a "go/no go" question, it must be
obtained as quickly as possible.²
Assuming that DoD is not enjoined from seeking a review of the EPA data and analysis
underlying the second draft risk assessment for perchlorate, DoD should begin development of the
RFC documentation. It is observed that this represents a significant level of effort, since the petition
must be specific with regard to both the problems with the data and the solutions proposed, and
requires examination of hundreds, if not thousands, of pages of documents. The Air Force, which
has led the perchlorate effort for DoD as part of its role as the Executive Agent for the Safe
Drinking Water Act, should be tasked with this responsibility.

² It would also be advisable to request that the Office of General Counsel provide guidance on how the DQA and OMB guidelines
affect data collected and used in the execution of the Defense Environmental Restoration Program.



The Data Quality Act
a. In General -- The Director of the Office of Management and Budget shall, by not later than
September 30, 2001, and with public and Federal agency involvement, issue guidelines under
sections 3504(d)(1) and 3516 of title 44, United States Code, that provide policy and procedural
guidance to Federal agencies for ensuring and maximizing the quality, objectivity, utility, and
integrity of information (including statistical information) disseminated by Federal agencies in
fulfillment of the purposes and provisions of chapter 35 of title 44, United States Code,
commonly referred to as the Paperwork Reduction Act.
b. Content of Guidelines -- The guidelines under subsection (a) shall:
1. Apply to the sharing by Federal agencies of, and access to, information disseminated by
Federal agencies; and
2. Require that each Federal agency to which the guidelines apply:
A. Issue guidelines ensuring and maximizing the quality, objectivity, utility and
integrity of information (including statistical information) disseminated by the
agency, by not later than 1 year after the date of issuance of the guidelines under
subsection (a);
B. Establish administrative mechanisms allowing affected persons to seek and
obtain correction of information maintained and disseminated by the agency that
does not comply with the guidelines issued under subsection (a); and
C. Report periodically to the Director --
i. The number and nature of complaints received by the agency regarding
the accuracy of information disseminated by the agency; and
ii. How such complaints were handled by the agency.

Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by the Environmental
Protection Agency

EPA/260R-02-008
October 2002

Guidelines for Ensuring and Maximizing
the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by
the Environmental Protection Agency

Prepared by:
U.S. Environmental Protection Agency
Office of Environmental Information (2810)
1200 Pennsylvania Avenue, NW
Washington, DC 20460

also available via the internet at:

http://www.epa.gov/oei/qualityguidelines/

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of
Information Disseminated by the Environmental Protection Agency

Table of Contents

1 Introduction
2 EPA Mission and Commitment to Quality
2.1 EPA's Mission and Commitment to Public Access
2.2 Information Management in EPA
2.3 EPA's Relationship with State, Tribal, and Local Governments
3 OMB Guidelines
4 Existing Policies and Procedures that Ensure and Maximize Information Quality
4.1 Quality System
4.2 Peer Review Policy
4.3 Action Development Process
4.4 Integrated Error Correction Process
4.5 Information Resources Management Manual
4.6 Risk Characterization Policy and Handbook
4.7 Program-Specific Policies
4.8 EPA Commitment to Continuous Improvement
4.9 Summary of New Activities and Initiatives
5 Guidelines Scope and Applicability
5.1 What is "Quality" According to the Guidelines?
5.2 What is the Purpose of these Guidelines?
5.3 When Do these Guidelines Apply?
5.4 What is Not Covered by these Guidelines?
5.5 What Happens if Information is Initially Not Covered by these Guidelines, but EPA Subsequently Disseminates it to the Public?
5.6 How does EPA Ensure the Objectivity, Utility, and Integrity of Information that is Not Covered by these Guidelines?
6 Guidelines for Ensuring and Maximizing Information Quality
6.1 How does EPA Ensure and Maximize the Quality of Disseminated Information?
6.2 How Does EPA Define Influential Information for these Guidelines?
6.3 How Does EPA Ensure and Maximize the Quality of "Influential" Information?
6.4 How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk Assessment Information?
6.5 Does EPA Ensure and Maximize the Quality of Information from External Sources?
7 Administrative Mechanism for Pre-dissemination Review
7.1 What are the Administrative Mechanisms for Pre-dissemination Reviews?
8 Administrative Mechanisms for Correction of Information
8.1 What are EPA's Administrative Mechanisms for Affected Persons to Seek and Obtain Correction of Information?
8.2 What Should be Included in a Request for Correction of Information?
8.3 When Does EPA Intend to Consider a Request for Correction of Information?
8.4 How Does EPA Intend to Respond to a Request for Correction of Information?
8.5 How Does EPA Expect to Process Requests for Correction of Information on Which EPA has Sought Public Comment?
8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a Request for the Correction of Information?
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions?
Appendix A: IQG Development Process and Discussion of Public Comments
A.1 Introduction
A.2 General Summary of Comments
A.3 Response to Comments by Guidelines Topic Area
A.3.1 Existing Policy
A.3.2 Scope and Applicability
A.3.3 Sources of Information
A.3.4 Influential Information
A.3.5 Reproducibility
A.3.6 Influential Risk Assessment
A.3.7 Complaint Resolution
A.4 Next Steps

1

Introduction

The Environmental Protection Agency (EPA) is committed to providing public access to
environmental information. This commitment is integral to our mission to protect human health
and the environment. One of our goals is that all parts of society - including communities,
individuals, businesses, State and local governments, Tribal governments - have access to
accurate information sufficient to effectively participate in managing human health and
environmental risks. To fulfill this and other important goals, EPA must rely upon information
of appropriate quality for each decision we make.
Developed in response to guidelines issued by the Office of Management and Budget (OMB)¹
under Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal
Year 2001 (Public Law 106-554; H.R. 5658), the Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by the Environmental
Protection Agency (the Guidelines) contain EPA's policy and procedural guidance for ensuring
and maximizing the quality of information we disseminate. The Guidelines also outline
administrative mechanisms for EPA pre-dissemination review of information products and
describe some new mechanisms to enable affected persons to seek and obtain corrections from
EPA regarding disseminated information that they believe does not comply with EPA or OMB
guidelines. Beyond policies and procedures, these Guidelines also incorporate the following
performance goals:

Disseminated information should adhere to a basic standard of quality, including
objectivity, utility, and integrity.

The principles of information quality should be integrated into each step of EPA's
development of information, including creation, collection, maintenance, and
dissemination.

Administrative mechanisms for correction should be flexible, appropriate to the
nature and timeliness of the disseminated information, and incorporated into
EPA's information resources management and administrative practices.

OMB encourages agencies to incorporate standards and procedures into existing information
resources management practices rather than create new, potentially duplicative processes. EPA
has taken this advice and relies on numerous existing quality-related policies in these Guidelines.
EPA will work to ensure seamless implementation into existing practices. It is expected that
EPA managers and staff will familiarize themselves with these Guidelines, and will carefully
review existing program policies and procedures in order to accommodate the principles outlined
in this document.

¹ Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information
Disseminated by Federal Agencies; OMB, 2002. (67 FR 8452) Hereinafter "OMB guidelines".
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf


EPA's Guidelines are intended to carry out OMB's government-wide policy regarding
information we disseminate to the public. Our Guidelines reflect EPA's best effort to present our
goals and commitments fOf ensuring and maximizing the quality of information we disseminate.
As such, they are not a regulation and do not change or substitute for any legal requirements.
They provide non-binding policy and procedural guidance, and are therefore not intended to
create legal rights, impose legally binding requirements or obligations on EPA or the public
when applied in particular situations, or change or impact the status of information we
disseminate, nor to contravene any other legal requirements that may apply to particular agency
determinations or other actions. EPA's intention is to fully implement these Guidelines in order
to achieve the purposes of Section 515.
These Guidelines are the product of an open, collaborative process between EPA and numerous
EPA stakeholders. The Guidelines development process is described in the Appendix to this
document. EPA received many public comments and has addressed most of them in these
Guidelines. A discussion of public comments is also provided in the Appendix, grouped by
overarching themes and by Guidelines topic area. EPA views these Guidelines as a
living document, and anticipates their revision as we work to further ensure and maximize
information quality.


2 EPA Mission and Commitment to Quality

2.1 EPA's Mission and Commitment to Public Access

The mission of the EPA is to protect human health and safeguard the natural environment upon
which life depends. EPA is committed to making America's air cleaner, water purer, and land
better protected, and to working closely with its Federal, State, Tribal, and local government
partners; with citizens; and with the regulated community to accomplish its mission. In addition,
the United States plays a leadership role in working with other nations to protect the global
environment.
EPA's commitment to expanding and enhancing access to environmental information is
articulated in our Strategic Plan. EPA works every day to expand the public's right to know
about and understand their environment by providing and facilitating access to a wealth of
information about public health and local environmental issues and conditions. This enhances
citizen understanding and involvement and provides people with tools to protect their families
and their communities.
EPA statutory responsibilities to protect human health and safeguard the natural environment are
described in the statutes that mandate and govern our programs. EPA manages those programs in
concert with numerous other government and private sector partners. As Congress intended, each
statute provides regulatory expectations including information quality considerations and
principles. Some statutes are more specific than others, but overall, each directs EPA and other
agencies in how we regulate to protect human health and the environment. For example, the Safe
Drinking Water Act (SDWA) Amendments of 1996 set forth certain quality principles for how
EPA should conduct human health risk assessments and characterize the potential risks to
humans from drinking water contaminants. Information quality is a key component of every
statute that governs our mission.

2.2 Information Management in EPA

The collection, use, and dissemination of information of known and appropriate quality are
integral to ensuring that EPA achieves its mission. Information about human health and the
environment -- environmental characteristics; physical, chemical, and biological processes; and
chemical and other pollutants -- underlies all environmental management and health protection
decisions. The availability of, and access to, information and the analytical tools to understand it
are essential for assessing environmental and human health risks, designing appropriate and
cost-effective policies and response strategies, and measuring environmental improvements.
EPA works every day to ensure information quality, but we do not wait until the point of
dissemination to consider important quality principles. While the final review of a document
before it is published is very important to ensuring a product of high quality, we know that in
order to maximize quality, we must start much earlier. When you read an EPA report at your
local library or view EPA information on our web site, that information is the result of processes
undertaken by EPA and our partners that assured quality along each step of the way. To better
describe this interrelated information quality process, the following presents some of the major
roles that EPA plays in its effort to ensure and maximize the quality of the information:

EPA is a collector and generator of information: While most of our programs
rely on States, Tribes, or the private sector to collect and report information to
EPA, there are some programs in which EPA collects its own information. One
example is the Agency's enforcement and compliance program, under which EPA
collects samples in the field or conducts onsite inspections. We also conduct
original, scientific research at headquarters, in Regional Offices, and at our
research laboratories to investigate and better understand how our environment
works, how humans react to chemical pollutants and other environmental
contaminants, and how to model our natural environment to assess the potential
impact of environmental management activities. Ensuring the quality of collected
information is central to our mission.

EPA is a recipient of information: EPA receives a large amount of information
that external parties volunteer or provide under statutory and other mandates.
Much of the environmental information submitted to EPA is processed and stored
in Agency information management systems. While we work to ensure and
maximize the integrity of that information through a variety of mechanisms and
policies, we have varying levels of quality control over information developed or
collected by outside parties. This information generally falls into one of four
categories:

Information collected through contracts with EPA. Examples of this
information include studies and collection and analysis of data by parties
that are under a contractual obligation with EPA. Since EPA is responsible
for managing the work assigned to contractors, EPA has a relatively high
degree of control over the quality of this information.

Information collected through grants and cooperative agreements
with EPA. Examples of this information include scientific studies that are
performed under research grants and data collected by State agencies or
other grantees to assess regulatory compliance or environmental trends.
Although EPA has less control over grantees than contractors, EPA can
and does include conditions in grants and cooperative agreements
requiring recipients to meet certain criteria.

Information submitted to EPA as part of a requirement under a
statute, regulation, permit, order or other mandate. Examples of this
information include required test data for pesticides or chemicals, Toxics
Release Inventory (TRI) submissions and compliance information
submitted to EPA by States and the regulated community. EPA ensures
quality control of such information through regulatory requirements, such
as requiring samples to be analyzed by specific analytical procedures and
by certified laboratories. However, each EPA program has specific
statutory authorities which may affect its ability to impose certain quality
practices.
The final category includes information not covered by the three
categories above: information that is either voluntarily
submitted to EPA in hopes of influencing a decision or that EPA
obtains for use in developing a policy, regulatory, or other decision.
Examples of this information include scientific studies published in
journal articles and test data obtained from other Federal agencies,
industry, and others. EPA may not have any financial ties or regulatory
requirements to control the quality of this type of information.
While the quality of information submitted to EPA is the responsibility of the
original collector of the information, we nevertheless maintain a robust quality
system that addresses information related to the first three categories above by
including regulatory requirements for quality assurance for EPA contracts, grants,
and assistance agreements. For the fourth category, we intend to develop and
publish factors that EPA would use in the future to assess the quality of voluntary
submissions or information that the Agency gathers for its own use.

EPA is a user of information: Upon placement in our information management
systems, information becomes available for use by many people and systems.
EPA users may include Program managers, information product developers, or
automated financial tracking systems. Depending on the extent of public release,
users may also include city planners, homeowners, teachers, engineers, or
community activists, to name a few. To satisfy this broad spectrum of users, it is
critical that we present information in an unbiased context with thorough
documentation.
EPA is moving beyond routine administration of regulatory information and
working in concert with States and other stakeholders to provide new information
products that are responsive to identified users. Increasingly, information
products are derived from information originally collected to support State or
Federal regulatory programs or management activities. Assuring the suitability of
this information for new applications is of paramount importance.

EPA is a conduit for information: Another major role that EPA plays in the
management of information is as a provider of public access. Such access enables
public involvement in how EPA achieves its mission. We provide access to a
variety of information holdings. Some information distributed by EPA includes
information collected through contracts; information collected through grants and

cooperative agreements; information submitted to EPA as part of a requirement
under a statute, regulation, permit, order, or other mandate; and information that
is either voluntarily submitted to EPA in hopes of influencing a decision or that
EPA obtains for use in developing a policy, regulatory, or other decision. In some
cases, EPA serves as an important conduit for information generated by external
parties; however, the quality of that information is the responsibility of the
external information developer, unless EPA endorses or adopts it.

2.3 EPA's Relationship with State, Tribal, and Local Governments

As mentioned in the previous section, EPA works with a variety of partners to achieve its
mission. Our key government partners not only provide information, they also work with EPA to
manage and implement programs and communicate with the public about issues of concern. In
addition to implementing national programs through EPA Headquarters Program Offices, a vast
network of EPA Regions and other Federal, State, Tribal and local governments implement both
mandated and voluntary programs. This same network collects, uses, and distributes a wide
range of information. EPA plans to coordinate with these partners to ensure the Guidelines are
appropriate and effective.
One major mechanism to ensure and maximize information integrity is the National
Environmental Information Exchange Network (NEIEN, or Network). The result of an important
partnership between EPA, States and Tribal governments, the Network seeks to enhance the
Agency's information architecture to ensure timely and one-stop reporting from many of EPA's
information partners. Key components include the establishment of the Central Data Exchange
(CDX) portal and a System of Access for internal and external users. When fully implemented,
the Network and its many components will enhance EPA and the public's ability to access, use,
and integrate information and the ability of external providers to report to EPA.


3 OMB Guidelines

In Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year
2001 (Public Law 106-554; H.R. 5658), Congress directed OMB to issue government-wide
guidelines that "provide policy and procedural guidance to Federal agencies for ensuring and
maximizing the quality, objectivity, utility, and integrity of information (including statistical
information) disseminated by Federal agencies...." The OMB guidelines direct agencies subject
to the Paperwork Reduction Act (44 U.S.C. 3502(1)) to:

Issue their own information quality guidelines to ensure and maximize the
quality, objectivity, utility, and integrity of information, including statistical
information, by no later than one year after the date of issuance of the OMB
guidelines;

Establish administrative mechanisms allowing affected persons to seek and obtain
correction of information maintained and disseminated by the agency that does
not comply with the OMB or agency guidelines; and

Report to the Director of OMB the number and nature of complaints received by
the agency regarding agency compliance with OMB guidelines concerning the
quality, objectivity, utility, and integrity of information and how such complaints
were resolved.

The OMB guidelines provide some basic principles for agencies to consider when developing
their own guidelines including:

Guidelines should be flexible enough to address all communication media and
the variety in scope and importance of information products.

Some agency information may need to meet higher or more specific expectations
for objectivity, utility, and integrity. Information of greater importance should be
held to a higher quality standard.

Ensuring and maximizing quality, objectivity, utility, and integrity comes at a
cost, so agencies should use an approach that weighs the costs and benefits of
higher information quality.

Agencies should adopt a common sense approach that builds on existing
processes and procedures. It is important that agency guidelines do not impose
unnecessary administrative burdens or inhibit agencies from disseminating
quality information to the public.

4 Existing Policies and Procedures that Ensure and Maximize Information Quality

EPA is dedicated to the collection, generation, and dissemination of high-quality information.
We disseminate a wide variety of information products, ranging from comprehensive scientific
assessments of potential health risks,2 to web-based applications that provide compliance
information and map the location of regulated entities,3 to simple fact sheets for school children.4
As a result of this diversity of information-related products and practices, different EPA
programs have evolved specialized approaches to information quality assurance. The OMB
guidelines encourage agencies to avoid the creation of "new and potentially duplicative or
contradictory processes." Further, OMB stresses that its guidelines are not intended to "impose
unnecessary administrative burdens that would inhibit agencies from continuing to take
advantage of the Internet and other technologies to disseminate information that can be of great
benefit and value to the public." In this spirit, EPA seeks to foster the continuous improvement
of existing information quality activities and programs. In implementing these guidelines, we
note that ensuring the quality of information is a key objective alongside other EPA objectives,
such as ensuring the success of Agency missions, observing budget and resource priorities and
restraints, and providing useful information to the public. EPA intends to implement these
Guidelines in a way that will achieve all of these objectives harmoniously, in conjunction
with our existing guidelines and policies, some of which are outlined below. These examples
illustrate some of the numerous systems and practices in place that address the quality,
objectivity, utility, and integrity of information.

4.1 Quality System

The EPA Agency-wide Quality System helps ensure that EPA organizations maximize the
quality of environmental information, including information disseminated by the Agency. A
graded approach is used to establish quality criteria that are appropriate for the intended use of
the information and the resources available. The Quality System is documented in EPA Order
5360.1 A2, "Policy and Program Requirements for the Mandatory Agency-wide Quality
System" and the "EPA Quality Manual."5 To implement the Quality System, EPA organizations
(1) assign a quality assurance manager, or person assigned to an equivalent position, who has
sufficient technical and management expertise and authority to conduct independent oversight of
the implementation of the organization's quality system; (2) develop a Quality Management
Plan, which documents the organization's quality system; (3) conduct an annual assessment of
the organization's quality system; (4) use a systematic planning process to develop acceptance or
performance criteria prior to the initiation of all projects that involve environmental information

2 http://cfpub.epa.gov/ncea/cfm/partmatt.cfm
3 http://www.epa.gov/enviro/wme/
4 http://www.epa.gov/kids
5 EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf


collection and/or use; (5) develop Quality Assurance Project Plan(s), or equivalent document(s)
for all applicable projects and tasks involving environmental data; (6) conduct an assessment of
existing data, when used to support Agency decisions or other secondary purposes, to verify that
they are of sufficient quantity and adequate quality for their intended use; (7) implement all
Agency-wide Quality System components in all applicable EPA-funded extramural agreements;
and (8) provide appropriate training for all levels of management and staff.
The EPA Quality System may also apply to non-EPA organizations, with key principles
incorporated in the applicable regulations governing contracts, grants, and cooperative
agreements. EPA Quality System provisions may also be invoked as part of negotiated
agreements such as memoranda of understanding. Non-EPA organizations that may be subject to
EPA Quality System requirements include (a) any organization or individual under direct
contract to EPA to furnish services or items or perform work (i.e., a contractor) under the
authority of 48 CFR part 46 (including applicable work assignments, delivery orders, and task
orders); and (b) other government agencies receiving assistance from EPA through interagency
agreements. Separate quality assurance requirements for assistance recipients are set forth in 40
CFR part 30 (governing assistance agreements with institutions of higher education, hospitals,
and other non-profit recipients of financial assistance) and 40 CFR parts 31 and 35 (governing
assistance agreements with State, Tribal, and local governments).

4.2 Peer Review Policy

In addition to the Quality System, EPA's Peer Review Policy provides that major scientifically
and technically based work products (including scientific, engineering, economic, or statistical
documents) related to Agency decisions should be peer-reviewed. Agency managers within
Headquarters, Regions, laboratories, and field offices determine and are accountable for the
decision whether to employ peer review in particular instances and, if so, its character, scope,
and timing. These decisions are made consistent with program goals and priorities, resource
constraints, and statutory or court-ordered deadlines. For those work products that are intended
to support the most important decisions or that have special importance in their own right,
external peer review is the procedure of choice. For other work products, internal peer review is
an acceptable alternative to external peer review. Peer review is not restricted to the penultimate
version of work products; in fact, peer review at the planning stage can often be extremely
beneficial. The basis for EPA peer review policy is articulated in Peer Review and Peer
Involvement at the U.S. Environmental Protection Agency.6 The Peer Review Policy was first
issued in January, 1993, and was updated in June, 1994. In addition to the policy, EPA has
published a Peer Review Handbook,7 which provides detailed guidance for implementing the
policy. The handbook was last revised December, 2000.

6Peer Review and Peer Involvement at the U.S. EPA. June 7, 1994.
http://www.epa.gov/osp/spc/perevmem.htm
7Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf

4.3 Action Development Process

The Agency's Action Development Process also serves to ensure and maximize the quality of
EPA disseminated information. Top Agency actions and Economically Significant actions as
designated under Executive Order 12866 are developed as part of the Agency's Action
Development Process. The Action Development Process ensures the early and timely
involvement of senior management at key decision milestones to facilitate the consideration of a
broad range of regulatory and non-regulatory options and analytic approaches. Of particular
importance to the Action Development Process is ensuring that our scientists, economists, and
others with technical expertise are appropriately involved in determining needed analyses and
research, identifying alternatives, and selecting options. Program Offices and Regional Offices
are invited to participate to provide their unique perspectives and expertise. Effective
consultation with policy advisors (e.g., Senior Policy Council, Science Policy Council),
co-regulators (e.g., States, Tribes, and local governments), and stakeholders is also part of the
process. Final Agency Review (FAR) generally takes place before the release of substantive
information associated with these actions. The FAR process ensures the consistency of any
policy determinations, as well as the quality of the information underlying each policy
determination and its presentation.

4.4 Integrated Error Correction Process

The Agency's Integrated Error Correction Process8 (IECP) is a process by which members of the
public can notify EPA of a potential data error in information EPA distributes or disseminates.
This process builds on existing data processes through which discrete, numerical errors in our
data systems are reported to EPA. The IECP has made these tools more prominent and easier to
use. Individuals who identify potential data errors on the EPA web site can contact us through
the IECP by using the "Report Error" button or error correction hypertext found on major data
bases throughout EPA's web site. EPA reviews the error notification and assists in bringing the
notification to resolution with those who are responsible for the data within or outside the
Agency, as appropriate. The IECP tracks this entire process from notification through final
resolution.

8Integrated Error Correction Process for Environmental Data.
http://www.epa.gov/cdx/iecp.html

4.5 Information Resources Management Manual

The EPA Information Resources Management (IRM) Manual9 articulates and describes many of
our information development and management procedures and policies, including information
security, data standards, records management, information collection, and library services.
Especially important in the context of the Guidelines provided in this document, the IRM
Manual describes how we maintain and ensure information integrity. We believe that
maintaining information integrity refers to keeping information "unaltered," i.e., free from
unauthorized or accidental modification or destruction. These integrity principles apply to all
information. Inappropriately changed or modified data or software impacts information integrity
and compromises the value of the information system. Because of the importance of EPA's
information to the decisions made by the Agency, its partners, and the public, it is our
responsibility to ensure that the information is, and remains, accurate and credible.
Beyond addressing integrity concerns, the IRM Manual also includes Agency policy on public
access and records management. These are key chapters that enable EPA to ensure transparency
and the reproducibility of information.

4.6 Risk Characterization Policy and Handbook

The EPA Risk Characterization Policy and Handbook10 provide guidance for risk
characterization that is designed to ensure that critical information from each stage of a risk
assessment is used in forming conclusions about risk. The Policy calls for a transparent process
and products that are clear, consistent and reasonable. The Handbook is designed to provide risk
assessors, risk managers, and other decision-makers an understanding of the goals and principles
of risk characterization.

4.7 Program-Specific Policies

We mentioned just a few of the Agency's major policies that ensure and maximize the quality of
information we disseminate. In addition to these Agency-wide systems and procedures, Program
Offices and Regions implement many Office-level and program-specific procedures to ensure
and maximize information quality. The purpose of these Guidelines is to serve as a common
thread that ties all these policies together under the topics provided by OMB: objectivity,
integrity and utility. EPA's approach to ensuring and maximizing quality is necessarily
distributed across all levels of EPA's organizational hierarchy, including Offices, Regions,
divisions, projects, and even products. Oftentimes, there are different quality considerations for
different types of products. For example, the quality principles associated with a risk assessment

9 EPA Directive 2100 Information Resources Management Policy Manual.
http://www.epa.gov/irmpoli8/polman/
10 Risk Characterization Handbook, U.S. EPA, Science Policy Council, December 2000.
http://www.epa.gov/osp/spc/2riskchr.htm


differ from those associated with developing a new model. The Agency currently has a
comprehensive but distributed system of policies to address such unique quality considerations.
These Guidelines provide us with a mechanism to help coordinate and synthesize our quality
policies and procedures.

4.8 EPA Commitment to Continuous Improvement

As suggested above, we will continue to work to ensure that our many policies and procedures
are appropriately implemented, synthesized, and revised as needed. One way to build on
achievements and learn from mistakes is to document lessons learned about specific activities or
products. For example, the documents that present guidance and tools for implementing the
Quality System are routinely subjected to external peer review during their development;
comments from the reviewers are addressed and responses reviewed by management before the
document is issued. Each document is formally reviewed every five years and is either reissued,
revised as needed, or rescinded. If important new information or approaches evolve between
reviews, the document may be reviewed and revised more frequently.

4.9 Summary of New Activities and Initiatives

In response to OMB's guidelines, EPA recognizes that it will be incorporating new policies and
administrative mechanisms. As we reaffirm our commitment to our existing policies and
procedures that ensure and maximize quality, we also plan to address the following new areas of
focus and commitment:

Working with the public to develop assessment factors that we will use to assess
the quality of information developed by external parties, prior to EPA's use of
that information.

Affirming a new commitment to information quality, especially the transparency
of information products.

Establishing an Agency-wide correction process and request for reconsideration
panel to provide a centralized point of access for all affected parties to seek and
obtain the correction of disseminated information that they believe does not
conform to these Guidelines or the OMB guidelines.

5 Guidelines Scope and Applicability

5.1 What is "Quality" According to the Guidelines?

Consistent with the OMB guidelines, EPA is issuing these Guidelines to ensure and maximize
the quality, including objectivity, utility and integrity, of disseminated information. Objectivity,
integrity, and utility are defined here, consistent with the OMB guidelines. "Objectivity" focuses
on whether the disseminated information is being presented in an accurate, clear, complete, and
unbiased manner, and as a matter of substance, is accurate, reliable, and unbiased. "Integrity"
refers to security, such as the protection of information from unauthorized access or revision, to
ensure that the information is not compromised through corruption or falsification. "Utility"
refers to the usefulness of the information to the intended users.

5.2 What is the Purpose of these Guidelines?

The collection, use, and dissemination of information of known and appropriate quality are
integral to ensuring that EPA achieves its mission. Information about the environment and
human health underlies all environmental management decisions. Information and the analytical
tools to understand it are essential for assessing environmental and human health risks, designing
appropriate and cost-effective policies and response strategies, and measuring environmental
improvements.
These Guidelines describe EPA's policy and procedures for reviewing and substantiating the
quality of information before EPA disseminates it. They describe our administrative mechanisms
for enabling affected persons to seek and obtain, where appropriate, correction of information
disseminated by EPA that they believe does not comply with EPA or OMB guidelines.

5.3 When Do these Guidelines Apply?

These Guidelines apply to "information" EPA disseminates to the public. "Information," for
purposes of these Guidelines, generally includes any communication or representation of
knowledge such as facts or data, in any medium or form. Preliminary information EPA
disseminates to the public is also considered "information" for the purposes of the Guidelines.
Information generally includes material that EPA disseminates from a web page. However, not
all web content is considered "information" under these Guidelines (e.g., certain information
from outside sources that is not adopted, endorsed, or used by EPA to support an Agency
decision or position).
For purposes of these Guidelines, EPA disseminates information to the public when EPA
initiates or sponsors the distribution of information to the public.

EPA initiates a distribution of information if EPA prepares the information and
distributes it to support or represent EPA's viewpoint, or to formulate or support a
regulation, guidance, or other Agency decision or position.

Guidelines Scope and Applicability

Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by EPA

EPA initiates a distribution of information if EPA distributes information
prepared or submitted by an outside party in a manner that reasonably suggests
that EPA endorses or agrees with it; if EPA indicates in its distribution that the
information supports or represents EPA's viewpoint; or if EPA in its distribution
proposes to use or uses the information to formulate or support a regulation,
guidance, policy, or other Agency decision or position.

Agency-sponsored distribution includes instances where EPA reviews and
comments on information distributed by an outside party in a manner that
indicates EPA is endorsing it, directs the outside party to disseminate it on EPA's
behalf, or otherwise adopts or endorses it.

EPA intends to use notices to explain the status of information, so that users will be aware of
whether the information is being distributed to support or represent EPA's viewpoint.

5.4

What is Not Covered by these Guidelines?

If an item is not considered "information," these Guidelines do not apply. Examples of items that
are not considered information include Internet hyperlinks and other references to information
distributed by others, and opinions, where EPA's presentation makes it clear that what is being
offered is someone's opinion rather than fact or EPA's views.

"Dissemination" for the purposes of these Guidelines does not include distributions of
information that EPA does not initiate or sponsor. Below is a sample of various types of
information that would not generally be considered disseminated by EPA to the public:

Distribution of information intended only for government employees (including
intra- or interagency use or sharing) or recipients of government contracts, grants,
or cooperative agreements. Intra-agency use of information includes use of
information pertaining to basic agency operations, such as management,
personnel, and organizational information.
EPA's response to requests for agency records under the Freedom of Information
Act (FOlA), the Privacy Act, the Federal Advisory Committee Act (FACA), or
other similar laws.

Distribution of information in correspondence directed to individuals or persons
(i.e., any individual, group, or entity, including any government or political
subdivision thereof, or Federal governmental component/unit).
Information of an ephemeral nature, such as press releases, fact sheets, press
conferences, and similar communications, in any medium that advises the public
of an event or activity or announces information EPA has disseminated
elsewhere: interviews, speeches, and similar communications that EPA does not
disseminate to the public beyond their original context, such as by placing them
on the Internet. If a speech, press release, or other "ephemeral" communication is
about an information product disseminated elsewhere by EPA, the product itself
will be covered by these Guidelines.

Information presented to Congress as part of the legislative or oversight
processes, such as testimony of officials, information, or drafting assistance
provided to Congress in connection with pending or proposed legislation, unless
EPA simultaneously disseminates this information to the public.

Background information such as published articles distributed by libraries or by
other distribution methods that do not imply that EPA has adopted or endorsed
the materials. This includes outdated or superseded EPA information that is
provided as background information but no longer reflects EPA policy or
influences EPA decisions, where the outdated or superseded nature of such
material is reasonably apparent from its form of presentation or date of issuance,
or where EPA indicates that the materials are provided as background materials
and do not represent EPA's current view.

These Guidelines do not apply to information distributed by recipients of EPA
contracts, grants, or cooperative agreements, unless the information is
disseminated on EPA's behalf, as when EPA specifically directs or approves the
dissemination. These Guidelines do not apply to the distribution of any type of
research by Federal employees and recipients of EPA funds, where the researcher
(not EPA) decides whether and how to communicate and publish the research,
does so in the same manner as his or her academic colleagues, and distributes the
research in a manner that indicates it does not necessarily represent EPA's official
position (for example, by including an appropriate disclaimer). The Guidelines do
not apply even if EPA retains ownership or other intellectual property rights
because the Federal government paid for the research.

Distribution of information in public filings to EPA, including information
submitted to EPA by any individual or person (as discussed above), either
voluntarily or under mandates or requirements (such as filings required by
statutes, regulations, orders, permits, or licenses). The Guidelines do not apply
where EPA distributes this information simply to provide the public with quicker
and easier access to materials submitted to EPA that are publicly available. This
will generally be the case so long as EPA is not the author, and is not endorsing,
adopting, using, or proposing to use the information to support an Agency
decision or position.

Distribution of information in documents filed in or prepared specifically for a
judicial case or an administrative adjudication and intended to be limited to such
actions, including information developed during the conduct of any criminal or
civil action or administrative enforcement action, investigation, or audit involving
an agency against specific parties.

5.5

What Happens if Information is Initially Not Covered by these Guidelines, but EPA
Subsequently Disseminates it to the Public?

If a particular distribution of information is not covered by these Guidelines, the Guidelines may
still apply to a subsequent dissemination of the information in which EPA adopts, endorses, or
position. For example, if EPA simply makes a public filing (such as facility data required by
regulation) available to the public, these Guidelines would not apply to that distribution of
information. However, if EPA later includes the information in a background document in
support of a rulemaking, these Guidelines would apply to that later dissemination of the
information in that document.

5.6

How does EPA Ensure the Objectivity, Utility, and Integrity of information that is
not covered by these Guidelines?

These Guidelines apply only to information EPA disseminates to the public, as outlined in section
5.3, above. Other information distributed by EPA that is not covered by these Guidelines is still
subject to all applicable EPA policies, quality review processes, and correction procedures.
These include quality management plans for programs that collect, manage, and use
environmental information, peer review, and other procedures that are specific to individual
programs and, therefore, not described in these Guidelines. It is EPA's policy that all of the
information it distributes meets a basic standard of information quality, and that its utility,
objectivity, and integrity be scaled and appropriate to the nature and timeliness of the planned
and anticipated uses. Ensuring the quality of EPA information is not necessarily dependent on
any plans to disseminate the information. EPA continues to produce, collect, and use information
that is of the appropriate quality, irrespective of these Guidelines or the prospects for
dissemination of the information.


6

Guidelines for Ensuring and Maximizing Information Quality

6.1

How does EPA Ensure and Maximize the Quality of Disseminated Information?

EPA ensures and maximizes the quality of the information we disseminate by implementing well-
established policies and procedures within the Agency as appropriate to the information product.
There are many tools that the Agency uses, such as the Quality System,11 review by senior
management, the peer review process,12 the communications product review process,13 the web
guide,14 and the error correction process.15 Beyond our internal quality management system, EPA also
ensures the quality of information we disseminate by seeking input from experts and the general
public. EPA consults with groups such as the Science Advisory Board and the Science Advisory
Panel, in addition to seeking public input through public comment periods and by hosting public
meetings.
For the purposes of the Guidelines, EPA recognizes that if data and analytic results are subjected
to formal, independent, external peer review, the information may generally be presumed to be
of acceptable objectivity. However, this presumption of objectivity is rebuttable. The Agency
uses a graded approach and uses these tools to establish the appropriate quality, objectivity,
utility, and integrity of information products based on the intended use of the information and
the resources available. As part of this graded approach, EPA recognizes that some of the
information it disseminates includes influential scientific, financial, or statistical information,
and that this category should meet a higher standard of quality.

6.2

How Does EPA Define Influential Information for these Guidelines?

"Influential," when used in the phrase "influential scientific, financial, or statistical
information," means that the Agency can reasonably detennine that dissemination of the
information will have or does have a clear and substantial impact (i.e., potential change or effect)
on important public policies or private sector decisions. 16 For the purposes of the EPA's
11 EPA Quality Manual for Environmental Programs 5360 A1, May 2000.
http://www.epa.gov/quality/qs-docs/5360.pdf

12 Peer Review Handbook, 2nd Edition, U.S. EPA, Science Policy Council, December 2000, EPA
100-B-00-001. http://www.epa.gov/osp/spc/prhandbk.pdf

13 EPA's Print and Web Communications Product Review Guide. http://www.epa.gov/dced/pdf/review.pdf

14 Web Guide, U.S. EPA. http://www.epa.gov/webguide/resources/webserv.html

15 Integrated Error Correction Process. http://www.epa.gov/cdx/iecp.html

16 The term "clear and substantial impact" is used as part of a definition to distinguish different categories of
information for purposes of these Guidelines. EPA does not intend the classification of information under this
definition to change or impact the status of the information in any other setting, such as for purposes of determining
whether the dissemination of the information is a final Agency action.

Guidelines for Ensuring and Maximizing Information Quality


Information Quality Guidelines, EPA will generally consider the following classes of
information to be influential, and, to the extent that they contain scientific, financial, or statistical
information, that information should adhere to a rigorous standard of quality:

Information disseminated in support of top Agency actions (i.e., rules, substantive
notices, policy documents, studies, guidance) that demand the ongoing
involvement of the Administrator's Office and extensive cross-Agency
involvement; issues that have the potential to result in major cross-Agency or
cross-media policies, are highly controversial, or provide a significant opportunity
to advance the Administrator's priorities. Top Agency actions usually have
potentially great or widespread impacts on the private sector, the public, or state,
local or tribal governments. This category may also include precedent-setting or
controversial scientific or economic issues.

Information disseminated in support of Economically Significant actions as
defined in Executive Order 12866, entitled Regulatory Planning and Review (58
FR 51735, October 4, 1993), Agency actions that are likely to have an annual
effect on the economy of $100 million or more or adversely affect in a material
way the economy, a sector of the economy, productivity, competition, jobs, the
environment, public health or safety, or State, Tribal, or local governments or
communities.

Major work products undergoing peer review as called for under the Agency's
Peer Review Policy. Described in the Science Policy Council Peer Review
Handbook, the EPA Peer Review Policy regards major scientific and technical
work products as those that have a major impact, involve precedential, novel,
and/or controversial issues, or the Agency has a legal and/or statutory obligation
to conduct a peer review. These major work products are typically subjected to
external peer review. Some products that may not be considered "major" under
the EPA Peer Review Policy may be subjected to external peer review, but EPA
does not consider such products influential for purposes of these Guidelines.

Case-by-case: The Agency may make determinations of what constitutes
"influential information" beyond those classes of information already identified
on a case-by-case basis for other types of disseminated information that may have
a clear and substantial impact on important public policies or private sector
decisions.

6.3

How Does EPA Ensure and Maximize the Quality of "Influential" Information?

EPA recognizes that influential scientific, financial, or statistical information should be subject
to a higher degree of quality (for example, transparency about data and methods) than
information that may not have a clear and substantial impact on important public policies or
private sector decisions. A higher degree of transparency about data and methods will facilitate
the reproducibility of such information by qualified third parties, to an acceptable degree of
imprecision. For disseminated influential original and supporting data, EPA intends to ensure
reproducibility according to commonly accepted scientific, financial, or statistical standards. It is
important that analytic results for influential information have a higher degree of transparency
regarding (1) the source of the data used, (2) the various assumptions employed, (3) the analytic
methods applied, and (4) the statistical procedures employed. It is also important that the degree
of rigor with which each of these factors is presented and discussed be scaled as appropriate, and
that all factors be presented and discussed. In addition, if access to data and methods cannot
occur due to compelling interests such as privacy, trade secrets, intellectual property, and other
confidentiality protections, EPA should, to the extent practicable, apply especially rigorous
robustness checks to analytic results and carefully document all checks that were undertaken.
Original and supporting data may not be subject to the high and specific degree of transparency
provided for analytic results; however, EPA should apply, to the extent practicable, relevant
Agency policies and procedures to achieve reproducibility, given ethical, feasibility, and
confidentiality constraints.
Several Agency-wide and Program- and Region-specific policies and processes that EPA uses to
ensure and maximize the quality of environmental data, including disseminated information
products, would also apply to information considered "influential" under these Guidelines.
Agency-wide processes of particular importance to ensure the quality, objectivity, and
transparency of "influential" information include the Agency's Quality System, Action
Development Process, Peer Review Policy, and related procedures. Many "influential"
information products may be subject to more than one of these processes.

6.4

How Does EPA Ensure and Maximize the Quality of "Influential" Scientific Risk
Assessment Information?

EPA conducts and disseminates a variety of risk assessments. When evaluating environmental
problems or establishing standards, EPA must comply with statutory requirements and mandates
set by Congress based on media (air, water, solid, and hazardous waste) or other environmental
interests (pesticides and chemicals). Consistent with EPA's current practices, application of these
principles involves a "weight-of-evidence" approach that considers all relevant information and
its quality, consistent with the level of effort and complexity of detail appropriate to a particular
risk assessment. In our dissemination of influential scientific information regarding human
health, safety,17 or environmental18 risk assessments, EPA will ensure, to the extent practicable

17"Safety risk assessment" describes a variety of analyses, investigations, or case studies conducted by EPA
to respond to environmental emergencies. For example, we work to ensure that the chemical industry and state and
local entities take action to prevent, plan and prepare for, and respond to chemical emergencies through the
development and sharing of information, tools, and guidance for hazards analyses and risk assessment.
18 Because the assessment of "environmental risk" is being distinguished from "human health risk," the term
"environmental risk" as used in these Guidelines does not directly involve human health concerns. In other words, an
"environmental risk assessment" is in this case the equivalent to what EPA commonly calls an "ecological risk
assessment."


and consistent with Agency statutes and existing legislative regulations, the objectivity19 of such
information disseminated by the Agency by applying the following adaptation of the quality
principles found in the Safe Drinking Water Act20 (SDWA) Amendments of 1996:21
(A) The substance of the information is accurate, reliable and unbiased. This involves the use
of:
(i) the best available science and supporting studies conducted in accordance with
sound and objective scientific practices, including, when available, peer reviewed
science and supporting studies; and
(ii) data collected by accepted methods or best available methods (if the reliability of
the method and the nature of the decision justifies the use of the data).

(B) The presentation of information on human health, safety, or environmental risks,
consistent with the purpose of the information, is comprehensive, informative, and
understandable. In a document made available to the public, EPA specifies:
(i) each population addressed by any estimate of applicable human health risk or
each risk assessment endpoint, including populations if applicable, addressed by
any estimate of applicable ecological risk;22
(ii) the expected risk or central estimate of human health risk for the specific
populations affected or the ecological assessment endpoints,23 including
populations if applicable;
(iii) each appropriate upper-bound or lower-bound estimate of risk;
(iv) each significant uncertainty identified in the process of the assessment of risk and
studies that would assist in resolving the uncertainty; and
(v) peer-reviewed studies known to the Administrator that support, are directly
relevant to, or fail to support any estimate of risk and the methodology used to
reconcile inconsistencies in the scientific data.

19 OMB stated in its guidelines that in disseminating information agencies shall develop a process for
reviewing the quality of the information. "Quality" includes objectivity, utility, and integrity. "Objectivity" involves
two distinct elements, presentation and substance. Guidelines for Ensuring and Maximizing the Quality, Objectivity,
Utility, and Integrity of Information Disseminated by Federal Agencies, OMB, 2002. (67 FR 8452)
http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf

20 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B)

21 The exception is risk assessments conducted under SDWA which will adhere to the SDWA principles as
amended in 1996.

22 Agency assessments of human health risks necessarily focus on populations. Agency assessments of
ecological risks address a variety of entities, some of which can be described as populations and others (such as
ecosystems) which cannot. The phrase "assessment endpoint" is intended to reflect the broader range of interests
inherent in ecological risk assessments. As discussed in the EPA Guidelines for Ecological Risk Assessment (found
at http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=12460), assessment endpoints are explicit expressions of the
actual environmental value that is to be protected, operationally defined by an ecological entity and its attributes.
Furthermore, those Guidelines explain that an ecological entity can be a species (e.g., eelgrass, piping plover), a
community (e.g., benthic invertebrates), an ecosystem (e.g., wetland), or other entity of concern. An attribute of an
assessment endpoint is the characteristic about the entity of concern that is important to protect and potentially at
risk. Examples of attributes include abundance (of a population), species richness (of a community), or function (of
an ecosystem). Assessment endpoints and ecological risk assessments are discussed more fully in those Guidelines
as well as other EPA sources such as Ecological Risk Assessment Guidance for Superfund: Process for Designing
and Conducting Ecological Risk Assessments - Interim Final, found at
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm

In applying these principles, "best available" usually refers to the availability at the time an
assessment is made. However, EPA also recognizes that scientific knowledge about risk is
rapidly changing and that risk information may need to be updated over time. When deciding
which influential risk assessment should be updated and when to update it, the Agency will take
into account its statutes and the extent to which the updated risk assessment will have a clear and
substantial impact on important public policies or private sector decisions. In some situations,
the Agency may need to weigh the resources needed and the potential delay associated with
incorporating additional information in comparison to the value of the new information in terms
of its potential to improve the substance and presentation of the assessment.

Adaptation clarifications
In order to provide more clarity on how EPA adapted the SDWA principles in this guidance, in
light of our numerous statutes, regulations, guidance, and policies that address how to conduct a
risk assessment and characterize risk, we discuss four adaptations EPA has made to the SDWA
quality principles language.

EPA adapted the SDWA principles by adding the phrase "consistent with Agency statutes and
existing legislative regulations, the objectivity of such information disseminated by the Agency"
in the introductory paragraph, therefore applying to both paragraphs (A) and (B). This was done
to explain EPA's intent regarding these quality principles and their implementation consistent
with our statutes and existing legislative regulations. Also, as noted earlier, EPA intends to
implement these quality principles in conjunction with our guidelines and policies. The
procedures set forth in other EPA guidelines set out in more detail EPA's policies for conducting
risk assessments, including Agency-wide guidance on various types of risk assessments and
program-specific guidance. EPA recognizes that the wide array of programs within EPA have
resulted not only in Agency-wide guidance, but in specific protocols that reflect the
requirements, including limitations, that are mandated by the various statutes administered by
the Agency. For example, the Agency developed several pesticide science policy papers that
explained to the public in detail how EPA would implement specific statutory requirements in
the Food Quality Protection Act (FQPA) that addressed how we perform risk assessments. We
also recognize that emerging issues such as endocrine disruption, bioengineered organisms, and
genomics may involve some modifications to the existing paradigm for assessing human health
and ecological risks. This does not mean a radical departure from existing guidance or the
SDWA principles, but rather indicates that flexibility may be warranted as new information and
approaches develop.
EPA introduced the following two adaptations in order to accommodate the range of real-world
situations that we confront in the implementation of our diverse programs. EPA adapted the
SDWA quality principles by moving the phrase "to the extent practicable" from paragraph (B) to
the introductory paragraph in this Guidelines section to cover both parts (A) and (B) of the
SDWA adaptation.24 The phrase refers to situations under (A) where EPA may be called upon to
conduct "influential" scientific risk assessments based on limited information or in novel
situations, and under (B) in recognition that all such "presentation" information may not be
available in every instance. The level of effort and complexity of a risk assessment should also
balance the information needs for decision making with the effort needed to develop such
information. For example, under the Federal Insecticide, Fungicide and Rodenticide Act 25
(FIFRA) and the Toxic Substances and Control Act 26 (TSCA), regulated entities are obligated to
provide information to EPA concerning incidents/test data that may reveal a problem with a
pesticide or chemical. We also receive such information voluntarily from other sources. EPA
carefully reviews incident reports and factors them as appropriate into risk assessments and
decision-making, even though these may not be considered information collected by accepted
methods or best available methods as stated in (A)(ii). Incident information played an important
role in the Agency's conclusion that use of chlordane/heptachlor termiticides could result in
exposures to persons living in treated homes, and that the registrations needed to be modified
accordingly. Similarly, incident reports concerning birdkills and fishkills were important
components of the risk assessments for the reregistration of the pesticides phorate and terbufos,
respectively. In addition, this adaptation recognizes that while many of the studies incorporated
into risk assessments have been peer reviewed, data from other sources may not be peer
reviewed. EPA takes many actions based on studies and supporting data provided by outside
sources, including confidential or proprietary information that has not been peer reviewed. For
example, industry can be required by regulation to submit data for pesticides under FIFRA or for
chemicals under TSCA. The data are developed using test guidelines and Good Laboratory
Practices (GLPs) in accordance with EPA regulations. While there is not a requirement to have
studies peer reviewed, such studies are reviewed by Agency scientists to ensure that they were
conducted according to the appropriate test guidelines and GLPs and that the data are valid.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
in many circumstances to conserve Agency resources and those of the regulated community who
otherwise might have to generate significant additional data. This flexibility is already provided
24 The discussion in this and following paragraphs gives some examples of the types of assessments that
may under some circumstances be considered influential. These examples are representative of assessments
performed under other EPA programs, such as CERCLA.

25 7 U.S.C. 136 et seq.

26 15 U.S.C. 2601 et seq.


for paragraph (B) in the SDWA quality principles. Pesticide and chemical risk assessments are
frequently performed iteratively, with the first iteration employing protective (conservative)
assumptions to identify possible risks. Only if potential risks are identified in a screening-level
assessment is it necessary to pursue a more refined, data-intensive risk assessment. This is
exhibited, for example, in guidance developed for use in CERCLA and RCRA on tiered
approaches. In other cases, reliance on "structure activity relationship" or "bridging data" allows
the Agency to rely on data from similar chemicals rather than require the generation of new,
chemical-specific data. While such assessments may or may not be considered influential under
the Guidelines, this adaptation of the SDWA principles reflects EPA's reliance on less-refined
risk assessments where further refinement could significantly increase the cost of the risk
assessment without significantly enhancing the assessment or changing the regulatory outcome.
In emergency and other time critical circumstances, risk assessments may have to rely on
information at hand or that can be made readily available rather than data such as described in
(A). One such scenario is risk assessments addressing Emergency Exemption requests submitted
under Section 18 of FIFRA,27 which, because of the emergency nature of the request, must be
completed within a short time frame. As an example, EPA granted an emergency exemption
under Section 18 to allow use of an unregistered pesticide to decontaminate anthrax in a Senate
office building. The scientific review and risk assessment to support this action were necessarily
constrained by the urgency of the action. Other time-sensitive actions include the reviews of new
chemicals under TSCA. Under Section 5 of TSCA,28 EPA must review a large number of
pre-manufacture notifications (more than 1,000) every year, not all of which necessarily include
"influential" risk assessments, and each review must be completed within a short time frame
(generally 90 days). The nature of the reviews and risk assessments associated with these
pre-manufacture notifications is affected by the limited time available and the large volume of
notifications submitted.
The flexibility provided by applying "to the extent practicable" to paragraph (A) is appropriate
to account for safety risk assessment practices. This flexibility is already provided for paragraph
(B) in the SDWA quality principles. We applied the same SDWA adaptation for use with human
health risk assessments to safety risk assessments with the needed flexibility to apply the
principles to the extent practicable. "Safety risk assessments" include a variety of analyses,
investigations, or case studies conducted by EPA concerning safety issues. EPA works to ensure
that the chemical industry and state and local entities take action to prevent, plan and prepare for,
and respond to environmental emergencies and site-specific response actions through the
development and sharing of information, tools, and guidance for hazard analyses and risk
assessment. For example, although the chemical industry shoulders most of the responsibility for
safety risk assessment and management, EPA may also conduct chemical hazard analyses,
investigate the root causes and mechanisms associated with accidental chemical releases, and
assess the probability and consequences of accidental releases in support of agency risk

27 Section 18 of FIFRA, 7 U.S.C. 136p

28 Section 5 of TSCA, 15 U.S.C. 2604

Guidelines for Ensuring and Maximizing Information Quality

assessments. Although safety risk assessments can be different from traditional human health
risk assessments because they may combine a variety of available information and may use
expert judgment based on that information, these assessments provide useful information that is
sufficient for the intended purpose.
Next, EPA adapted the SDWA quality principles by adding the clause "including, when
available, peer reviewed science and supporting studies" to paragraph (A)(i). It now reads: "the
best available science and supporting studies conducted in accordance with sound and objective
scientific practices, including, when available, peer reviewed science and supporting studies." In
the Agency's development of "influential" scientific risk assessments, we intend to use all
relevant information, including peer reviewed studies, studies that have not been peer reviewed,
and incident information; evaluate that information based on sound scientific practices as
described in our risk assessment guidelines and policies; and reach a position based on careful
consideration of all such information (i.e., a process typically referred to as the
"weight-of-evidence" approach29). In this approach, a well-developed, peer-reviewed study would generally
be accorded greater weight than information from a less well-developed study that had not been
peer-reviewed, but both studies would be considered. Thus the Agency uses a
"weight-of-evidence" process when evaluating peer-reviewed studies along with all other information.
Oftentimes under various EPA-managed programs, EPA receives information that has not been
peer-reviewed and we have to make decisions based on the information available. While many
of the studies incorporated in risk assessments have been peer reviewed, data from other sources,
such as studies submitted to the Agency for pesticides under FIFRA30 and for chemicals under
TSCA, may not always be peer reviewed. Rather, such data, developed under approved
guidelines and the application of Good Laboratory Practices (GLPs), are routinely used in the
development of risk assessments. Risk assessments may also include more limited data sets such
as monitoring data used to support the exposure element of a risk assessment. In cases where
these data may not themselves have been peer reviewed, their quality and appropriate use would
be addressed as part of the peer review of the overall risk assessment as called for under the
Agency's peer review guidelines.
Lastly, EPA adapted the SDWA principles for influential environmental ("ecological") risk
assessments that are disseminated in order to use terms that are most suited for such risk
assessments. Specifically, EPA assessments of ecological risks address a variety of entities,

29 The weight-of-evidence approach generally considers all relevant information in an integrative
assessment that takes into account the kinds of evidence available, the quality and quantity of the evidence, the
strengths and limitations associated with each type of evidence, and explains how the various types of evidence fit
together. See, e.g., EPA's Proposed Guidelines for Carcinogen Risk Assessment (Federal Register 61(79):
17960-18011; April 23, 1996) and EPA's Guidelines for Carcinogen Risk Assessment (Federal Register 51(185):
33992-34003; September 24, 1986), available from www.epa.gov/ncea/raf/, and EPA's Risk Characterization
Handbook (Science Policy Council Handbook: Risk Characterization, EPA 100-B-00-002, Washington, DC: U.S.
EPA, December 2000).

30 40 CFR part 158


some of which can be described as populations and others (such as ecosystems) which cannot.
Therefore, a specific modification was made to include "assessment endpoints, including
populations if applicable" in place of the term "population" for ecological risk assessments, and
EPA added a footnote directing the reader to various EPA risk policies for further discussion of
these concepts.

6.5 Does EPA Ensure and Maximize the Quality of Information from External Sources?

Ensuring and maximizing the quality of information from States, other governments, and third
parties is a complex undertaking, involving thoughtful collaboration with States, Tribes, the
scientific and technical community, and other external information providers. EPA will continue
to take steps to ensure that the quality and transparency of information provided by external
sources are sufficient for the intended use. For instance, since 1998, the use of environmental
data collected by others or for other purposes, including literature, industry surveys,
compilations from computerized databases and information systems, and results from
computerized or mathematical models of environmental processes and conditions has been
within the scope of the Agency's Quality System.31
For information that is either voluntarily submitted to EPA in hopes of influencing a decision or
that EPA obtains for use in developing a policy, regulatory, or other decision, EPA will continue
to work with States and other governments, the scientific and technical community, and other
interested information providers to develop and publish factors that EPA would use to assess the
quality of this type of information.
For all proposed collections of information that will be disseminated to the public, EPA intends
to demonstrate in our Paperwork Reduction Act32 clearance submissions that the proposed
collection of information will result in information that will be collected, maintained, and used in
ways consistent with the OMB guidelines and these EPA Guidelines. These Guidelines apply to
all information EPA disseminates to the public; accordingly, if EPA later identifies a new use for
the information that was collected, such use would not be precluded and the Guidelines would
apply to the dissemination of the information to the public.

31 EPA Quality Manual for Environmental Programs 5360 A1, May 2000, Section 1.3.1,
http://www.epa.gov/quality/qs-docs/5360.pdf

32 44 U.S.C. 3501 et seq.

7 Administrative Mechanism for Pre-dissemination Review

7.1 What are the Administrative Mechanisms for Pre-dissemination Reviews?

Each EPA Program Office and Region will incorporate the information quality principles
outlined in section 6 of these Guidelines into their existing pre-dissemination review procedures
as appropriate. Offices and Regions may develop unique and new procedures, as needed, to
provide additional assurance that the information disseminated by or on behalf of their
organizations is consistent with these Guidelines. EPA intends to facilitate implementation of
consistent cross-Agency pre-dissemination reviews by establishing a model of minimum review
standards based on existing policies. Such a model for pre-dissemination review would still
provide that responsibility for the reviews remains in the appropriate EPA Office or Region.
For the purposes of the Guidelines, EPA recognizes that pre-dissemination review procedures
may include peer reviews and quality reviews that may occur at many steps in development of
information, not only at the point immediately prior to the dissemination of the information.

8 Administrative Mechanisms for Correction of Information

8.1 What are EPA's Administrative Mechanisms for Affected Persons to Seek and Obtain Correction of Information?

EPA's Office of Environmental Information (OEI) manages the administrative mechanisms that
enable affected persons to seek and obtain, where appropriate, correction of information
disseminated by the Agency that does not comply with EPA or OMB Information Quality
Guidelines. Working with the Program Offices, Regions, laboratories, and field offices, OEI will
receive complaints (or copies) and distribute them to the appropriate EPA information owners.
"Information owners" are the responsible persons designated by management in the applicable
EPA Program Office, or those who have responsibility for the quality, objectivity, utility, and
integrity of the information product or data disseminated by EPA. If a person believes that
information disseminated by EPA may not comply with the Guidelines, we encourage the person
to consult informally with the contact person listed in the information product before submitting
a request for correction of information. An informal contact can result in a quick and efficient
resolution of questions about information quality.
8.2 What Should be Included in a Request for Correction of Information?

Persons requesting a correction of information should include the following information in their
Request for Correction (RFC):

Name and contact information for the individual or organization submitting a
complaint, and identification of an individual to serve as a contact.

A description of the information the person believes does not comply with EPA
or OMB guidelines, including specific citations to the information and to the EPA
or OMB guidelines, if applicable.

An explanation of how the information does not comply with EPA or OMB
guidelines and a recommendation of corrective action. EPA considers that the
complainant has the burden of demonstrating that the information does not
comply with EPA or OMB guidelines and that a particular corrective action
would be appropriate.

An explanation of how the alleged error affects or how a correction would benefit
the requestor.

An affected person may submit an RFC via any one of the methods listed here:

Internet at http://www.epa.gov/oei/qualityguidelines

E-mail at quality.guidelines@epa.gov
Fax at (202) 566-0255

Mail to Information Quality Guidelines Staff, Mail Code 28221T, U.S.
EPA, 1200 Pennsylvania Ave., N.W., Washington, DC, 20460

By courier or in person to Information Quality Guidelines Staff, OEI
Docket Center, Room B128, EPA West Building, 1301 Constitution
Ave., N.W., Washington, DC

8.3 When Does EPA Intend to Consider a Request for Correction of Information?

EPA seeks public and stakeholder input on a wide variety of issues, including the identification
and resolution of discrepancies in EPA data and information. EPA may decline to review an
RFC under these Guidelines and consider it for correction if:

The request does not address information disseminated to the public covered by
these Guidelines (see section 5.3 or OMB's guidelines). In many cases, EPA
provides other correction processes for information not covered by these
Guidelines.

The request omits one or more of the elements recommended in section 8.2 and
there is insufficient information for EPA to provide a satisfactory response.

The request itself is "frivolous," including requests made in bad faith, made
without justification or that are trivial, and those for which a response would be
duplicative. More information on this subject may be found in the OMB guidelines.

8.4 How Does EPA Intend to Respond to a Request for Correction of Information?

EPA intends to use the following process:

Each RFC will be tracked in an OEI system.

If an RFC is deemed appropriate for consideration, the information owner office
or region makes a decision on the request on the basis of the information in
question, including the request submitted under section 8.2. Rejections of a request
for correction should be decided at the highest level of the information owner
office or region. EPA's goal is to respond to requests within 90 days of receipt by
1) providing a decision on the request, or 2) if the request requires more
than 90 calendar days to resolve, informing the complainant that more time is
required and indicating the reason and an estimated decision date.

If a request is approved, EPA determines what corrective action is appropriate.
Considerations relevant to the determination of appropriate corrective action
include the nature and timeliness of the information involved and such factors as
the significance of the error on the use of the information and the magnitude of


the error. For requests involving information from outside sources, considerations
may include coordinating with the source and other practical limitations on EPA's
ability to take corrective action.

Whether or not EPA determines that corrective action is appropriate, EPA
provides notice of its decision to the requester.

For approved requests, EPA assigns a steward for the correction who marks the
information as designated for correction as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.

OEI will provide reports on behalf of EPA to OMB on an annual basis beginning January 1,
2004, regarding the number, nature, and resolution of complaints received by EPA.

8.5 How Does EPA Expect to Process Requests for Correction of Information on Which EPA has Sought Public Comment?

When EPA provides opportunities for public participation by seeking comments on information,
the public comment process should address concerns about EPA's information. For example,
when EPA issues a notice of proposed rulemaking supported by studies and other information
described in the proposal or included in the rulemaking docket, it disseminates this information
within the meaning of the Guidelines. The public may then raise issues in comments regarding
the information. If a group or an individual raises a question regarding information supporting a
proposed rule, EPA generally expects to treat it procedurally like a comment to the rulemaking,
addressing it in the response to comments rather than through a separate response mechanism.
This approach would also generally apply to other processes involving a structured opportunity
for public comment on a draft or proposed document before a final document is issued, such as a
draft report, risk assessment, or guidance document. EPA believes that the thorough
consideration provided by the public comment process serves the purposes of the Guidelines,
provides an opportunity for correction of any information that does not comply with the
Guidelines, and does not duplicate or interfere with the orderly conduct of the action. In cases
where the Agency disseminates a study, analysis, or other information prior to the final Agency
action or information product, it is EPA policy to consider requests for correction prior to the
final Agency action or information product in those cases where the Agency has determined that
an earlier response would not unduly delay issuance of the Agency action or information product
and the complainant has shown a reasonable likelihood of suffering actual harm from the
Agency's dissemination if the Agency does not resolve the complaint prior to the final Agency
action or information product. EPA does not expect this to be the norm in rulemakings that it
conducts, and thus will usually address information quality issues in connection with the final
Agency action or information product.


EPA generally would not consider a complaint that could have been submitted as a timely
comment in the rulemaking or other action but was submitted after the comment period. If EPA
cannot respond to a complaint in the response to comments for the action (for example, because
the complaint is submitted too late to be considered and could not have been timely submitted, or
because the complaint is not germane to the action), EPA will consider whether a separate
response to the complaint is appropriate.

8.6 What Should be Included in a Request Asking EPA to Reconsider its Decision on a Request for the Correction of Information?

If requesters are dissatisfied with an EPA decision, they may file a Request for Reconsideration
(RFR). The RFR should contain the following information:

An indication that the person is seeking an appeal of an EPA decision on a
previously submitted request for a correction of information, including the date of
the original submission and the date of the EPA decision. A copy of EPA's original
decision would help expedite the process.

Name and contact information. Organizations submitting an RFR should identify
an individual as a contact.

An explanation of why the person disagrees with the EPA decision and a specific
recommendation for corrective action.

A copy of the original RFC of information.

An affected person may submit a Request for Reconsideration (RFR) via any one
of the methods listed here:

Internet at http://www.epa.gov/oei/qualityguidelines

E-mail at quality.guidelines@epa.gov

Fax at (202) 566-0255

Mail to Information Quality Guidelines Staff, Mail Code 28221T, U.S.
EPA, 1200 Pennsylvania Ave., N.W., Washington, DC, 20460

By courier or in person to Information Quality Guidelines Staff, OEI
Docket Center, Room B128, EPA West Building, 1301 Constitution
Ave., N.W., Washington, DC

EPA recommends that requesters submit their RFR within 90 days of the EPA decision. If the
RFR is sent after that time, EPA recommends that the requester include an explanation of why
the request should be considered at this time.
8.7 How Does EPA Intend to Process Requests for Reconsideration of EPA Decisions?

EPA intends to consider RFRs using the following process:

Each RFR will be tracked in an OEI system.

OEI sends the RFR to the appropriate EPA Program Office or Region that has
responsibility for the information in question.


The Assistant Administrator (AA) or Regional Administrator (RA) information
owner presents to an executive panel. The executive panel would be comprised of
the Science Advisor/AA for the Office of Research and Development (ORD),
the Chief Information Officer/AA for OEI, and the Economics Advisor/AA for the
Office of Policy, Economics and Innovation (OPEI). The 3-member executive
panel would be chaired by the Chief Information Officer/AA for OEI. When the
subject of the RFR originated from a member office, that panel member would be
replaced by an alternate AA or RA. While the executive panel is considering an
RFR, the decision made on the initial complaint by the information owner office
or region remains in effect.

The executive panel makes the final decision on the RFR.

EPA's goal is to respond to each RFR within 90 days of receipt by 1) providing
a decision on the request or 2) if the request requires more than 90 calendar
days to resolve, informing the complainant that more time is required and indicating
the reason and an estimated decision date.

If a request is approved, EPA determines what type of corrective action is
appropriate. Considerations relevant to the determination of appropriate
corrective action include the nature and timeliness of the information involved
and such factors as the significance of the error on the use of the information and
the magnitude of the error. For requests involving information from outside
sources, considerations may include coordinating with the source, and other
practical limitations on EPA's ability to take corrective action.

Whether or not EPA determines that corrective action is appropriate, EPA
provides notice of its decision to the requester.

For approved requests, EPA assigns a steward for the correction who marks the
information as designated for correction as appropriate, establishes a schedule
for correction, and reports correction resolution to both the tracking system and to
the requestor.

Appendix A

IQG Development Process and Discussion of Public Comments

A.1 Introduction

EPA's Guidelines are a living document and may be revised as we learn more about how best to
address, ensure, and maximize information quality. In the process of developing these
Guidelines, we actively solicited public input at many stages. While the public was free to
comment on any aspect of the Guidelines, EPA expli,citly requested input on key topics such as
influential information, reproducibility, influential risk assessment, information sources, and
error correction.
,
Public input was sought in the following ways:

An online Public Comment Session was held March 19-22, 2002, as the first draft of the
Guidelines was being developed. EPA received approximately 100 comments.

A Public Meeting was held on May 15, 2002, after the draft Guidelines were issued.
There were 99 participants, 13 of whom made presentations or commented on one or
more issues.

A 52-day Public Comment period lasted from May 1 to June 21, 2002, during which comments
could be mailed, faxed, or e-mailed to EPA. EPA received 55 comments during this
period.

A meeting with State representatives, sponsored and supported by the Environmental
Council of the States (ECOS), was held on May 29, 2002.

A conference call between EPA and Tribal representatives was held on June 27, 2002.

More detailed information on the public comments is available through an OEI web site, serving
as the home page for the EPA Information Quality Guidelines through the development and
implementation process. Please visit this site at http://www.epa.gov/oei/gualityguidelines.
We have established a public docket for the EPA Information Quality Guidelines under Docket
ID No. OEI-10014. The docket is the collection of materials available for public viewing at the
Information Quality Guidelines Staff, OEI Docket Center, Room B128, EPA West Building,
1301 Constitution Ave., N.W., Washington, DC, phone number 202-566-0284. This docket
consists of a copy of the Guidelines, public comments received, and other information related to
the Guidelines. The docket is open from 12:00 PM to 4:00 PM, Monday through Friday,
excluding legal holidays. An index of docket contents will be available at
http://www.epa.gov/oei/qualityguidelines.

A.2 General Summary of Comments

During the various public comment opportunities, EPA received input from a diverse set of
organizations and private citizens. Comments came from many of EPA's stakeholders - the
regulated community and many interest groups who we hear from frequently during the
management of EPA's Programs to protect the nation's land, air, water, and public health.
Government agencies at the Federal, State, Tribal, and local level also commented on the
Guidelines. OMB sent comments to every Federal agency and EPA received comments from two
members of Congress. Beyond our government colleagues, the private sector voiced many
concerns and helpful recommendations for these Guidelines. We would like to take this
opportunity to thank all commenters for providing their input on these Guidelines. Due to the
tight time frame for this project, this discussion of public comments generally describes the
major categories of comments and highlights some significant comments, but does not contain
an individual response to each public comment.
Comments received by EPA during the public comment period reflect a diversity of views
regarding EPA's approach to developing draft Guidelines as well as the general concept of
information quality. Some commenters included detailed review of all Guidelines sections, while
others chose to address only specific topics. In some cases, commenters provided examples to
demonstrate how current EPA procedures may not ensure adequate information quality for a
specific application. Commenters provided general observations such as stating that these
Guidelines did not sufficiently address EPA's information quality problems. Some commenters
offered that the Guidelines relied too much on existing policies. Interpretations of the intent of
the Data Quality Act were offered by some commenters. One comment noted that improvement
of data quality is not necessarily an end in and of itself. Another comment was that the goal of
Guidelines should be more to improve quality, not end uncertainty. Public interest and
environmental groups voiced concern over what they believed was an attempt by various groups
to undermine EPA's ability to act in a timely fashion to protect the environment and public
health. Some commenters stated that the directives of the Data Quality Act and OMB cannot
override EPA's mission to protect human health and the environment per the statutory mandates
under which it operates.
EPA was congratulated for the effort and, in some cases, encouraged to go even further in
addressing information quality. Some commenters encouraged EPA to provide additional
process details, provide more detailed definitions, augment existing policies that promote
transparency, and share more information about the limitations of EPA disseminated
information. In one case, EPA was encouraged to develop a rating scheme for its disseminated
information.
This section discusses public comments and our responses to many of the important questions
and issues raised in the comments. First, we provide responses to some overarching comments
we received from many commenters, then we provide a discussion of public comments that were
received on specific topics addressed in the draft Guidelines.


Tone: Commenters criticized the "defensive tone", "legalistic tone", and the lack
of detail afforded in the Guidelines. Some commenters said that it was not clear
what the Guidelines were explaining, or how they might apply to various types of
information. We understand and agree with many of these criticisms and have
made attempts to better communicate the purpose, applicability, and content of
these Guidelines.

Plan for implementation: Commenters suggested that the Guidelines should
describe EPA's plans for implementing the Guidelines. These Guidelines provide
policy guidance, and as such, do not outline EPA's plan for implementation. That
is, they do not describe in great detail how each Program and Regional Office will
implement these principles. We do not intend to imply that each Office will
implement them in conflict with one another, but rather assume that because each
Program implements a different statutory mandate or mandates, there will be
some inherent differences in approach. Beyond internal implementation, we agree
that there is more work and communication to be conducted with information
providers and users to optimize the provisions set forth in these Guidelines.

Commitment to public access: One commenter suggested that we "remove
outdated information" from our web site. Other commenters suggested that when
a complaint has been filed that the information should be removed from public
view while a complaint is being reviewed. This is generally unacceptable to EPA
in light of our commitment to providing the public with access to information;
however, in certain cases EPA may consider immediate removal of information
(for example, when it is clear to us that the information is grossly incorrect and
misleading and its status cannot be adequately clarified through a notice or other
explanation). With respect to outdated information, sometimes it serves a
historical purpose, and should continue to be disseminated for that purpose.

A.3 Response to Comments by Guidelines Topic Area

A.3.1 Existing Policy
Many commenters told us that we rely excessively on existing EPA information quality policies.
Commenters provided specific examples of areas they believed were demonstrative of our lack
of commitment to or uneven implementation of our existing policies. Some commenters also
pointed out that there are key areas in which we lack policies to address quality and, as a result,
the Guidelines should address such issues in more detail. Some commenters also noted that EPA
itself has highlighted lessons learned with existing approaches to information product
development.
Ongoing improvement in implementing existing processes is a key principle of quality
management. We view these Guidelines as an opportunity to enhance existing policies and
redouble our commitment to quality information.

The concept of peer review is considered in three Guidelines sections: (1) application of the
Agency's Peer Review Policy language for "major scientific and technical work products and
economic analysis used in decision making" as a class of information that can be considered
"influential" for purposes of the Guidelines; (2) use of "peer-reviewed science" as a component
of some risk assessments; and (3) use of the Agency's Peer Review Policy as one of the
Agency-wide processes to ensure the quality, objectivity, and transparency of "influential"
scientific, financial, and statistical information under the Guidelines.
Some commenters expressed concerns regarding application of peer review in EPA.
Commenters suggested that current peer reviews are not sufficiently standardized, independent, or
consistently implemented. Peer review is a cornerstone to EPA's credibility and we must ensure
that the process always works as designed. For this reason, we conduct routine assessments to
evaluate and improve the peer review process.
Commenters also questioned whether peer review is an adequate means to establish
"objectivity." We note that OMB guidelines specifically allow for the use of formal, external,
independent peer review to establish a presumption of objectivity. OMB guidelines also state
that the presumption of objectivity is rebuttable, although the burden of proof lies with the
complainant. Some commenters asked for additional definitions for peer review terms. Our
current peer review policy is articulated in Peer Review and Peer Involvement at the U.S.
Environmental Protection Agency.33 Additional discussion regarding the application of
peer-reviewed science is provided in the discussion of comments on risk assessment.

A.3.2 Scope and Applicability
We received a number of comments on section 1.1 (What is the Purpose of these Guidelines?) of
the draft Guidelines. Some commenters argued that the Guidelines should be binding on EPA,
that they are legislative rules rather than guidance, or that the Guidelines must be followed
unless we make a specific determination to the contrary. Others argued that the Guidelines
should not be binding or that we should include an explicit statement that the Guidelines do not
alter substantive agency mandates. Some suggested that our statements retaining discretion to
differ from the Guidelines sent a signal that EPA was not serious about information quality.
With respect to the nature of these Guidelines, Section 515 specifies that agencies are to issue
"guidelines." As directed by OMB's guidelines, we have issued our own guidelines containing
nonbinding policy and procedural guidance. We see no indication in either the language or
general structure of Section 515 that Congress intended EPA's guidelines to be binding rules.
We revised this section (now section 1 in this revised draft) by adding a fuller explanation of
how EPA intends to ensure the quality of information it disseminates. This section includes
language explaining the nature of our Guidelines as policy and procedural guidance. This
language is intended to give clear notice of the nonbinding legal effect of the Guidelines. It
33 http://epa.gov/osp/spc/perevmem.htm

notifies EPA staff and the public that the document is guidance rather than a substantive rule and
explains how such guidance should be implemented. Although we believe these Guidelines
would not be judicially reviewable, we agree that a statement to this effect is unnecessary and
have deleted it. In response to comments that EPA clarify that the Guidelines do not alter
existing legal requirements, we have made that change. In light of that change, we think it is
clear that decisions in particular cases will be made based on applicable statutes, regulations, and
requirements, and have deleted other text in the paragraph that essentially repeated that point.
Elsewhere in the document, EPA has made revisions to be consistent with its status as guidance.
Some commenters argued that all EPA disseminated information should be covered by the
Guidelines and that we lack authority to "exempt" information from the Guidelines. Others
thought that the coverage in EPA's draft was appropriate. EPA does not view its Guidelines as
establishing a fixed definition and then providing "exemptions." Rather, our Guidelines explain
when a distribution of information generally would or would not be considered disseminated to
the public for purposes of the Guidelines. As we respond to complaints and gain experience in
implementing these Guidelines, we may identify other instances where information is or is not
considered disseminated for the purposes of the Guidelines.
Some commenters cited the Paperwork Reduction Act (PRA), 44 U.S.C. 3501 et seq., to support
their argument that the Guidelines should cover all information EPA makes public. EPA's
Guidelines are issued under Section 515 of the Treasury and General Government
Appropriations Act for Fiscal Year 2001, which directs OMB to issue government-wide
guidelines providing policy and procedural guidance to Federal agencies. In turn, the OMB
guidelines provide direction and guidance to Federal agencies in issuing their own guidelines.
EPA's Guidelines are intended to carry out OMB's policy on information quality. One
commenter cited in particular the term "public information" used in the PRA as evidence of
Congress's intent under Section 515. In EPA's view, this does not show that Congress intended a
specific definition for the key terms, "information" and "disseminated," used in Section 515. In
the absence of evidence of Congressional intent regarding the meaning of the terms used in
Section 515, EPA does not believe the PRA requires a change in EPA's Guidelines.
We agree with commenters who noted that even if a particular distribution of information is not
covered by the Guidelines, the Guidelines would still apply to information disseminated in other
ways. As stated in section 1.4, if information is not initially covered by the Guidelines, a
subsequent distribution of that information will be subject to the Guidelines if EPA adopts,
endorses, or uses it.
Some commenters made specific recommendations about what should and should not be covered
by the Guidelines. In addition to the specific recommendations, some suggested that the "scope
and applicability" section was too long, while others thought it had an appropriate level of detail.
Based on other agencies' guidelines and public comments, EPA has removed much of the detail
from the discussion of Guidelines coverage. These revisions were intended to shorten and
simplify the discussion without changing the general scope of the Guidelines.


We revised our definition of "information" in section 5.3, in response to a comment requesting
that the Agency make clear that information from outside sources is covered by the Guidelines if
EPA adopts, endorses, or uses it to support an Agency decision or position. In section 5.4, we
modified several of the provisions. We added statements of "intent" or similar language to define
the scope of several of the provisions. Accordingly, dissemination would not include distribution
of information "intended" for government employees or recipients of contracts, grants, or
cooperative agreements. Nor would information in correspondence "directed to" individuals or
persons be covered. This recognizes that there may be instances where EPA may use a letter
written to an individual in a way that indicates it is directed beyond the correspondent and
represents a more generally applicable Agency policy. The Guidelines would apply in such a
case. EPA has created a category for information of an "ephemeral" nature, including press
releases, speeches, and the like. The intent was that the Guidelines should not cover
communications that merely serve as announcements, or for other reasons are intended to be
fleeting or of limited duration. Consistent with other agency guidelines, we have added language
indicating that the Guidelines do not cover information presented to Congress, unless EPA
simultaneously disseminates this information to the public.
Some commenters thought all information from outside sources should be covered by the
Guidelines, even if EPA does not use, rely on, or endorse it. Others wished to clarify the point at
which the Guidelines cover information from outside sources. As noted above, section 1.4 of the
Guidelines explains how subsequent distributions of information in public filings may become
subject to the Guidelines. We continue to think that EPA's own public filings before other
agencies should not generally be covered by the Guidelines as long as EPA does not
simultaneously disseminate them to the public, since use of this information would be subject to
the requirements and policies of the agency to which the information is submitted.
We received a number of comments, including from OMB, arguing that the provision regarding
information related to adjudicative processes was too broad, and that the Guidelines should
cover some or all information related to adjudicative processes, particularly administrative
adjudications. In addition to shortening this section, we have limited this provision to
information in documents prepared specifically for an administrative adjudication. This would
include decisions, orders, findings, and other documents prepared specifically for the
adjudication. As indicated in the Draft Guidelines, our view is that existing standards and
protections in administrative adjudications would generally be adequate to assure the quality of
information in administrative adjudications and to provide an adequate opportunity to contest
decisions on the quality of information. For example, in permitting proceedings, parties may
submit comments on the quality of information EPA prepares for the permit proceeding, and
judicial review is available based on existing statutes and regulations. Narrowing the provision
to information prepared specifically for the adjudication should make clear that the Guidelines
would not generally provide parties with additional avenues of challenge or appeal during
adjudications, but would still apply to a separate distribution of information where EPA adopts,
endorses, or uses the information, such as when EPA disseminates it on the Internet, in a
rulemaking, or in a guidance document. When we intend to adopt information such as models or risk
assessments for use in a class of cases or determinations (e.g., for use in all determinations under
a particular regulatory provision), EPA often disseminates this information separately and in
many instances requests public comment on it. Accordingly, it is not clear that there would be
many instances where persons who are concerned about information prepared specifically for an
adjudication would not have an opportunity to contest the quality of information.
We respectfully disagree with a commenter's recommendation that regulatory limits established
by EPA should be subject to the Guidelines. The Guidelines apply to information disseminated
by EPA, not to regulatory standards or other Agency decisions or policy choices. In response to
comments regarding information disseminated in rulemakings and other matters subject to public
comment, EPA considers that this information would be disseminated within the meaning of the
Guidelines, although we would generally treat complaints regarding that information
procedurally like other comments on the rulemaking or other matter.

A.3.3 Sources of Information
We received many comments on how the Guidelines apply to external parties, the shared quality
responsibilities between EPA and external parties, and specific EPA responsibilities when using
or relying on information collected or compiled by external parties.

EPA roles: Some commenters emphasized that ensuring quality of information at the point of
dissemination is no substitute for vigorous efforts by EPA to receive quality information in the
first place, and therefore for information providers to produce quality information. One
commenter stated that EPA cannot be responsible for all aspects of the quality of the information
we disseminate. In response to this and other comments, we have provided additional language
in these Guidelines on the various roles that EPA assumes in either ensuring the quality of the
information we disseminate or ensuring the integrity of information EPA distributes. One
comment suggested that we mention the role of the National Environmental Information
Exchange Network in ensuring information integrity, which we have done in section 2.4 of the
Guidelines.

Assessment factors: Overall, public input was positive and welcoming of our proposal to
develop assessment factors to evaluate the quality of information generated by third parties. A
few commenters offered their involvement in the development of these factors, their advice on
how to develop such factors, and some examples of what assessment factors we should consider.
EPA staff have provided such comments to the EPA Science Policy Council workgroup that was
charged with developing the assessment factors. EPA welcomes stakeholder input in the
development of these factors and published draft assessment factors for public comment in
September 2002.
Coverage of State Information: Some commenters suggested that our Guidelines must apply to
all information disseminated by EPA, including information submitted to us by States. Whereas
some commenters stressed that the quality of information received by EPA is the responsibility
of the providers, others expressed concern about the potential impact that EPA's Guidelines
could have on States. We believe it is important to differentiate between information that we
generate and data or information generated by external parties, including States. State
information, when submitted to EPA, may not be covered by these Guidelines, but our
subsequent use of the information may in fact be covered. We note, however, that there may be
practical limitations on the type of corrective action that may be taken, since EPA does not
intend to alter information submitted by States. However, EPA does intend to work closely with
our State counterparts to ensure and maximize the quality of information that EPA disseminates.
Furthermore, one commenter stated that if regulatory information is submitted to an authorized
or delegated State program, then the State is the primary custodian of the information and the
Guidelines would not cover that information. We agree with that statement.
We also received comments regarding the use of labels, or disclaimers, to notify the public
whether information is generated by EPA or an external party. We agree that disclaimers and
other notifications should be used to explain the status of information wherever possible, and we
are developing appropriate language and format.
A statement regarding Paperwork Reduction Act clearance submissions has been added in
response to comment by OMB.
A.3.4 Influential Information
EPA received a range of comments on its definition of "influential." Below we provide a
summary of the comments raised and EPA's response.
Several commenters generally assert that the definition is too narrow. Other commenters
indicated that under EPA's draft definition, only Economically Significant actions, as defined in
Executive Order 12866, or only Economically Significant actions and information disseminated
in support of top Agency actions, are considered "influential." We disagree. To demonstrate the
broad range of activities covered by our adoption of OMB's definition, we reiterate the
definition below and include an example of each type of action, to illustrate the breadth of our
definition. "Influential," when used in the phrase "influential scientific, financial, or statistical
information," means that the Agency can reasonably determine that dissemination of the
information will have or does have a clear and substantial impact on important public policies or
important private sector decisions. We will generally consider the following classes of
information to be influential: information disseminated in support of top Agency actions;
information disseminated in support of "economically significant" actions; major work products
undergoing peer review; and other disseminated information that will have or does have a clear
and substantial impact (i.e., potential change or impact) on important public policies or
important private sector decisions as determined by EPA on a case-by-case basis. In general,
influential information would be the scientific, financial or statistical information that provides a
substantial basis for EPA's position on key issues in top Agency actions and Economically
Significant actions. If the information provides a substantial basis for EPA's position, EPA
believes it would generally have a clear and substantial impact.


Top Agency actions: An example of a top Agency action is the review of the National
Ambient Air Quality Standards (NAAQS) for Particulate Matter. Under the Clean Air
Act, EPA is to periodically review (1) the latest scientific knowledge about the effects on
public health and public welfare (e.g., the environment) associated with the presence of
such pollutants in the ambient air and (2) the standards, which are based on this science.
The Act further directs that the Administrator shall make any revisions to the standards
as may be appropriate, based on the latest science, that in her judgment are requisite to
protect the public health with an adequate margin of safety and to protect the public
welfare from any known or anticipated adverse effects. The standards establish allowable
levels of the pollutant in the ambient air across the United States, and States must
develop implementation plans to attain the standards. The PM NAAQS were last
revised in 1997, and the next periodic review is now being conducted.
"Economically significant" rules: An example of a rule found to be economically
significant is the Disposal of Polychlorinated Biphenyls (PCBs) Final Rule. In 1998, EPA
amended its rules under the Toxic Substances Control Act (TSCA), which addresses the
manufacture, processing, distribution in commerce, use, cleanup, storage and disposal of
PCBs. This rule provides flexibility in selecting disposal technologies for PCB wastes
and expands the list of available decontamination procedures; provides less burdensome
mechanisms for obtaining EPA approval for a variety of activities; clarifies and/or
modifies certain provisions where implementation questions have arisen; modifies the
requirements regarding the use and disposal of PCB equipment; and addresses
outstanding issues associated with the notification and manifesting of PCB wastes and
changes in the operation of commercial storage facilities. EPA would consider the
information that provides the principal basis for this rule to be influential information.
Peer reviewed work products: An example of a major work product undergoing peer
review is the IRIS Documentation: Reference Dose for Methylmercury. Methylmercury
contamination is the basis for fish advisories. It is necessary to determine an intake to
humans that is without appreciable risk in order to devise strategies for decreasing
mercury emissions into the environment. After EPA derived a reference dose (RfD) of
0.0001 mg/kg-day in 1995, industry argued that it was not based on sound science.
Congress ordered EPA to fund a National Research Council/National Academy of
Sciences panel to determine whether our RfD was scientifically justifiable. The panel
concluded that 0.0001 mg/kg-day was an appropriate RfD, based on newer studies
than the 1995 RfD. The information in this document was evaluated, incorporated, and
subjected to comment by the Office of Water, where it contributed in large part to
Chapter 4 of Drinking Water Criteria for the Protection of Human Health:
Methylmercury (EPA/823/R-01/001), January 2001. The peer review mechanism was an
external peer review workshop and public comment session held on November 15, 2000,
accompanied by a public comment period from October 30 to November 29, 2000.

Case-by-case determination - PBT Chemicals Rule: An example of a case-by-case
determination is the Guidance Document for Reporting Releases and Other Waste
Management Activities of Toxic Chemicals: Dioxin and Dioxin-like Compounds
(December 2000). In a final rule published October 29, 1999, EPA lowered the reporting
thresholds for certain persistent bioaccumulative toxic (PBT) chemicals that are subject to
reporting under Section 313 of the Emergency Planning and Community Right-to-Know Act of
1986 (EPCRA) and Section 6607 of the Pollution Prevention Act of 1990 (PPA). We also added
a category of dioxin and dioxin-like compounds to the EPCRA Section 313 list of toxic
chemicals and established a 0.1 gram reporting threshold for the category. In addition, EPA
added certain other PBT chemicals to the EPCRA Section 313 list of toxic chemicals and
established lower reporting thresholds for these chemicals. As a result of this rulemaking, we
developed a guidance document on the reporting requirements for the dioxin and dioxin-like
compounds category, as well as a number of other guidance documents. The dioxin guidance
document provides guidance on how to estimate annual releases and other waste management
quantities of dioxin and dioxin-like compounds to the environment from certain industries and
industrial activities. Due to the high interest level of stakeholders, we solicited public comments
on the draft guidance document and formed a workgroup of interested stakeholders. The
workgroup reviewed all public comments, provided their own comments, and then reviewed and
commented on the final draft.

Case-by-case determination - National Water Quality Inventory Report: A second
example of a case-by-case determination is the National Water Quality Inventory Report
to Congress. The National Water Quality Inventory Report to Congress is a biennial
report to Congress and the public about the quality of our nation's waters. It is prepared
under Section 305(b) of the Clean Water Act (CWA), which requires States and other
jurisdictions to assess the health of their waters and the extent to which water quality
supports State water quality standards and the basic goals of the CWA. States' Section
305(b) assessments are an important component of their water resource management
programs. These assessments help States: implement their water quality standards by
identifying healthy waters that need to be maintained and impaired waters that need to be
restored, prepare their Section 303(d) lists of impaired waters, develop restoration
strategies such as total maximum daily loads and source controls, and evaluate the
effectiveness of activities undertaken to restore impaired waters and protect healthy
waters.
A number of commenters said that EPA created a limited definition of what types of information
are to be considered "influential," and that we have no rational basis to do so. A number of
commenters also stated that "all Agency information should be considered influential"; that "all
data relied upon by the Agency should meet a high standard of quality regardless of the type"; or
that "'influential' information includes information used to support any EPA action, not just
'top' Agency actions." EPA followed OMB's guidelines in establishing a definition for
"influential" information that was not all-encompassing. OMB stated, "the more important the
information, the higher the quality standards to which it should be held, for example, in those
situations involving 'influential scientific, financial or statistical information'". OMB narrowed
the definition of "influential" in their final guidance as follows:


In this narrower definition, "influential", when used in the phrase "influential
scientific, financial, or statistical information", is amended to mean that "the
agency can reasonably determine that dissemination of the information will have
or does have a clear and substantial impact on important public policies or
important private sector decisions" (67 FR 8455).
OMB also amended their definition to say that "each agency is authorized to define "influential"
in ways appropriate for it given the nature and multiplicity of issues for which the agency is
responsible" (67 FR 8455). We adopted OMB's "influential" definition. Once the Agency
reviewed the wide range of information disseminated to the public, such as major rulemakings,
risk assessments, rule related guidance, health advisories, annual reports, fact sheets, and
coloring books, it became apparent that there were reasons to distinguish between "influential"
information and other information. EPA adopted OMB's definition for "influential" and used
types of information the Agency disseminates to further explain what information is included.
Another commenter suggested that EPA should not indicate whether disseminated information is
"influential" when it is first disseminated but should wait to designate information as
"influential" until either an information correction request is made or a final agency action is
taken. We intend to consider this point, as well as other comments made about when
disseminated information becomes influential, as the Agency implements the Guidelines.
One commenter suggests that the definition of the term "influential" should be more narrow.
Specifically, the commenter states the following:
Within the relatively narrow sphere of "disseminated" information, an agency
should reserve the designation of "influential" for information disseminated in
support of agency actions that are "major" regulations under Executive Order
12866, provide a "significant" opportunity to advance the agency's mandate by
other means, or involve precedent-setting or reasonably controverted issues. This
designation recognizes that procedures to promote the quality of information have
significant costs, and that the most significant (and therefore the most costly) of
such procedures should be reserved for information that is the most important in
terms of the agency's mission.
EPA agrees with the commenter that there are significant costs associated with ensuring that
information disseminated by the Agency is of high quality. Consequently, EPA chose a
definition of the term "influential" to cover information that, when disseminated, will result in a
clear and substantial impact on important public policies and private sector decisions. We
believe that this definition balances the costs associated with implementing the Guidelines, the
need to ensure high quality information, and the Agency's mission to protect human health and
safeguard the natural environment.
Several commenters indicated that it is inappropriate for EPA to base its definition of
"influential" on categories of actions. They suggest that the definition be based instead on the
content of the information. We consider our definition to be based on information content, given
that those categories of disseminated information we defined as influential are those that EPA
can reasonably determine will or do have a clear and substantial impact on important public
policies or private sector decisions. We note here that, in addition to the specific classes of
disseminated information we have defined as "influential," EPA has reiterated the "case-by-case"
portion of the OMB "influential" definition. This general provision is intended to capture
disseminated information, based on its content, that would not otherwise rise to the level of
"influential" under the other parts of our definition (i.e., top Agency actions, Economically
Significant actions, major peer reviewed products).
Several commenters assert that EPA should categorically state that certain specific types of
disseminated information products are influential, and that we should categorically state that
certain specific types of disseminated information products are not influential. Given the vast
array of information disseminated by the Agency, and given the fact that certain information
may have a clear and substantial impact on important public policies or private sector decisions
at one time, but not have such an impact later on (and vice versa), classifying types of
information as "influential" or otherwise upfront is difficult and could be misleading. We intend
to rely on our definition in determining whether specific types of disseminated information
products are to be considered "influential" for purposes of the Guidelines.
A.3.5 Reproducibility
Some commenters stated that there needs to be more clarity in the definition of "reproducibility"
and related concepts. We have tried to provide definitions that are consistent with OMB
guidelines. Also, our Guidelines now state that EPA intends to ensure reproducibility for
disseminated original and supporting data according to commonly accepted scientific, financial,
or statistical standards. Many commenters thought there should be some kind of method to
consider reproducibility when proprietary models, methods, designs, and data are used in a
dissemination. Some commenters discourage all use of proprietary models; others suggest
proprietary model use be minimized with application limited to situations in which it is
absolutely necessary. We understand this concern, but note that there are other factors that are
appropriately considered when deciding whether to use proprietary models, including feasibility
and cost considerations (e.g., it may be more cost-effective for the Agency to use a proprietary
model in some situations than to develop its own model). In cases where the Agency relies on
proprietary models, these model applications are still subject to our Peer Review Policy. Further,
as recently directed by the Administrator, the Agency's Council on Regulatory Environmental
Modeling is now revitalizing its development of principles for evaluating the use of
environmental models with regard to model validation and certification issues, building on
current good modeling practices. In addition, these Guidelines provide for the use of especially
rigorous "robustness checks" and documentation of what checks were undertaken. These steps,
along with transparency about the sources of data used, various assumptions employed, analytic
methods applied, and statistical procedures employed should assure that analytic results are
"capable of being substantially reproduced."

Regarding robustness checks, commenters were concerned that the EPA did not use the term
"especially rigorous robustness checks." We have modified our Guidelines to include this term.
Some commenters speculated on the ability of the Agency's Peer Review program to meet the
intent of the Guidelines and were concerned about the process to rebut a peer review used to
support the objectivity demonstration for disseminated information. Our Peer Review program
has been subject to external review and we routinely verify implementation of the program.
Affected persons wishing to rebut a formal peer review may do so using the complaint resolution
process in these Guidelines, provided that the information being questioned is considered to be
"disseminated" according to the Guidelines.
Regarding analytic results, some commenters indicated that the transparency factors identified
by EPA (section 6.3 of the Guidelines) are not a complete list of the items that would be needed
to demonstrate a higher degree of quality for influential information. EPA agreed with the list of
four items that was initially provided by the OMB and recognizes that, in some cases, additional
information regarding disseminated information would facilitate increased quality. However,
given the variety of information disseminated by the Agency, we cannot reasonably provide
additional details for such a demonstration at this time. Also, in regard to laboratory results,
which were mentioned by several commenters, these Guidelines are not the appropriate place to
set out for the science community EPA's view of what constitutes adequate demonstration of test
method validation or minimum quality assurance and quality control. Those technical
considerations should be addressed in the appropriate quality planning documentation or in
regulatory requirements.
EPA has developed general language addressing the concept of reproducibility and may provide
more detail after appropriate consultation with scientific and technical communities, as called for
by OMB in its guidelines. We have already begun to consult relevant scientific and technical
experts within the Agency, and also have planned an expedited consultation with EPA's Science
Advisory Board (SAB) on October 1, 2002. Based on these initial consultations, EPA may seek
additional input from the SAB in 2003. These consultations will allow EPA to constructively and
appropriately refine the application of existing policies and procedures, to further improve
reproducibility. In the interim, EPA intends to base the reproducibility of disseminated original
and supporting data on commonly accepted scientific, financial, or statistical standards.

A.3.6 Influential Risk Assessment
General Risk Assessment
Risk assessment is a process where information is analyzed to determine if an environmental
hazard might cause harm to exposed persons and ecosystems (paraphrased from Risk
Assessment in the Federal Government, National Research Council, 1983). That is:
Risk = hazard x exposure

For a chemical or other stressor to be "risky," it must have both an inherent adverse effect on an
organism, population, or other endpoint, and it must be present in the environment at
concentrations and locations such that an organism, population, or other endpoint is exposed to
the stressor. Risk assessment is a tool to determine the likelihood of harm or loss of an organism,
population, or other endpoint because of exposure to a chemical or other stressor. To assist those
who must make risk management decisions, risk assessments include discussions of uncertainty,
variability, and the continuum between exposure and adverse effects.
Risk assessments may be performed iteratively, with the first iteration employing protective
(conservative) assumptions to identify possible risks. Only if potential risks are identified in a
screening level assessment is it necessary to pursue a more refined, data-intensive risk
assessment. The screening level assessments may not result in "central estimates" of risk or
upper- and lower-bounds of risk. Nevertheless, such assessments may be useful in making
regulatory decisions, as when the absence of concern from a screening level assessment is used
(along with other information) to approve the new use of a pesticide or chemical or to decide
whether to remediate very low levels of waste contamination.


OMB Guidelines
In its guidelines OMB stated that, with respect to influential information regarding health, safety
or environmental risk assessments, agencies should either adopt or adapt the quality principles in
the Safe Drinking Water Act (SDWA) Amendments of 1996.34,35 In the background section of
the OMB guidelines, OMB explains that "the word 'adapt' is intended to provide agencies
flexibility in applying these principles to various types of risk assessment."
Guidelines Development Consideration
EPA carefully and practically developed the adaptation of the SDWA quality principles using
our considerable experience conducting human health and ecological36 risk assessments as well
as using our existing policies and guidance.
EPA conducts many risk assessments every year. Some of these are screening level assessments
based on scientific experts' judgments using conservative assumptions and available data and can
involve human health, safety, or environmental risk assessments. Such screening assessments
provide useful information that is sufficient for regulatory purposes in instances where more
elaborate, quantitative assessments are unnecessary. For example, such assessments could
indicate, even with conservative assumptions, that the level of risk does not warrant further
investigation. Other risk assessments are more detailed and quantitative and are based on
research and supporting data that are generated outside EPA. For example, pesticide reviews are
based on scientific studies conducted by registrants in accordance with our regulations and
guidance documents. Our test guidelines and Good Laboratory Practices (GLPs)37 describe
sound scientific practices for conducting studies needed to assess human and environmental
hazards and exposures. Such studies are not required to be peer-reviewed. Risk assessments
based on these studies can include occupational, dietary, and environmental exposures.

34 Safe Drinking Water Act Amendments of 1996, 42 U.S.C. 300g-1(b)(3)(A) & (B).
35 In section III.3.ii.C. of its guidelines, OMB states that: "With regard to analysis of risks to human health,
safety and the environment maintained or disseminated by the agencies, agencies shall either adopt or adapt the
quality principles applied by Congress to risk information used and disseminated pursuant to the Safe Drinking
Water Act Amendments of 1996 (42 U.S.C. 300g-1(b)(3)(A) & (B)). Agencies responsible for dissemination of vital
health and medical information shall interpret the reproducibility and peer-review standards in a manner appropriate
to assuring the timely flow of vital information from agencies to medical providers, patients, health agencies, and the
public. Information quality standards may be waived temporarily by agencies under urgent situations (e.g., imminent
threats to public health or homeland security) in accordance with the latitude specified in agency-specific
guidelines."
36 Because the assessment of "environmental risk" is being distinguished in OMB's adaptation of the
SDWA quality principles from "human health risk," the term "environmental risk" as used in these Guidelines does
not directly involve human health concerns. In other words, "environmental risk assessment" is, in this case,
equivalent to what EPA commonly refers to as "ecological risk assessment."
37 40 CFR part 160 for FIFRA and 40 CFR part 792 for TSCA.

The results of these risk assessments are presented to policy makers to inform their risk
management decisions. EPA currently has numerous policies that provide guidance to
internal risk assessors on how to conduct a risk assessment and characterize risk. The EPA Risk
Characterization Policy38 and associated guidelines are designed to ensure that critical
information from each stage of a risk assessment is used in forming conclusions about risk and
that this information is communicated from risk assessors to policy makers.
EPA Existing Policies and Guidance
Current EPA guidance and policies incorporate quality principles. These are designed to ensure
that critical information from each stage of a risk assessment is used in forming conclusions
about risk and that this information is communicated from risk assessors to policy makers. One
example is the EPA Risk Characterization Policy,39 which provides a single, centralized body of
risk characterization implementation guidance to help EPA risk assessors and risk managers
make the risk characterization process transparent and risk characterization products clear,
consistent and reasonable (TCCR). These principles have been included in other Agency risk
assessment guidance, such as the Guidelines for Ecological Risk Assessment.40 Other examples
of major, overarching guidelines for risk assessments include: Guidelines For Exposure
Assessment,41 Guidelines For Neurotoxicity Risk Assessment,42 and Guidelines For Reproductive
Toxicity Risk Assessment.43 Each of these documents has undergone external scientific peer
review as well as public comment prior to publication. Additionally, individual EPA offices have
developed more specific risk assessment policies to meet the particular needs of the programs
and statutes under which they operate. 44 EPA's commitment to sound science is evidenced by our
ongoing efforts to develop and continually improve Agency guidance for risk assessment.

38 http://www.epa.gov/OSP/spc/rcpolicy.htm
39 Ibid.
40 US EPA (1998). Guidelines for Ecological Risk Assessment (Federal Register 63(93):26846-26924).
http://www.epa.gov/ncea/raf/.
41 US EPA (1992). Guidelines For Exposure Assessment. Federal Register 57(104):22888-22938.
http://www.epa.gov/ncea/raf/.
42 US EPA (1998). Guidelines For Neurotoxicity Risk Assessment. Federal Register 63(93):26926-26954.
http://www.epa.gov/ncea/raf/.
43 US EPA (1996). Guidelines For Reproductive Toxicity Risk Assessment. Federal Register 61(212):56274-56322.
http://www.epa.gov/ncea/raf/.
44 The Office of Solid Waste and Emergency Response has developed Tools for Ecological Risk
Assessment for Superfund Risk Assessment. One example is the Ecological Risk Assessment Guidance for
Superfund: Process for Designing and Conducting Ecological Risk Assessments - Interim Final.
http://www.epa.gov/oerrpage/superfund/programs/risk/ecorisk/ecorisk.htm
http://www.epa.gov/oerrpage/superfund/programs/risk/tooleco.htm


EPA's Experience Conducting Risk Assessments
The first EPA human health risk assessment guidelines45 were issued in 1986. In 1992, the
Agency produced a Framework for Ecological Risk Assessment,46 which was replaced by the
1998 Ecological Risk Assessment Guidelines.47 As emphasized elsewhere in this document, the
statutes administered by EPA are diverse. Although the majority of risk assessments conducted
within the Agency are for chemical stressors, we also assess risks from biological and physical
stressors. In addition to risk assessment guidelines, both the EPA Science Policy Council and the
EPA Risk Assessment Forum have coordinated efforts to address the complex issues related to
data collection and analysis for hazard and exposure assessments. Thus, the Agency has
considerable experience in conducting both screening level and in-depth assessments for a wide
array of stressors.
Most environmental statutes obligate EPA to act to prevent adverse environmental and human
health impacts. For many of the risks that we must address, data are sparse and consensus about
assumptions is rare. In the context of data quality, we seek to strike a balance among fairness,
accuracy, and efficient implementation. Refusing to act until data quality improves can result in
substantial harm to human health, safety, and the environment.
Public Comments
We received a range of public and stakeholder comments on the adaptation of the SDWA
principles for "influential" human health, safety, and environmental risk assessments that are
disseminated by EPA. Some commenters stated that we should adopt the SDWA quality
principles for human health, safety, and environmental risk assessments. Many commenters
sought clarification on reasons for EPA's adaptation of the SDWA quality principles for human
health risk assessments and additional information on how we plan to address this process.
Others urged us to adapt the SDWA principles rather than adopt them, because certain elements
in the SDWA principles may not be applicable to all risk assessments, such as a "central
estimate of human risk for the specific populations affected." Others stated that we should
neither adapt nor adopt the SDWA principles because the "Data Quality Act" does not authorize
importing decisional criteria into statutory provisions where they do not apply; the decisional
criteria set forth in SDWA are expressly limited to SDWA. We also received comments at a
level of detail that is more appropriate for implementation of the Guidelines than for the
formulation of the Guidelines. These include comments regarding the use of clinical human test
data, and comments regarding the use of particular types of assumptions in risk assessments. To
the extent that an affected person believes that our use of data or assumptions in a particular

45 51 FR 33992-34054, 24 September 1986.
46 Framework For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1992, EPA/630/R-92/001.
47 Guidelines For Ecological Risk Assessment, U.S. EPA, Risk Assessment Forum, 1998, EPA/630/R-95/002F.
http://cfpub.epa.gov/ncea/cfm/ecorsk.cfm


dissemination of information is inconsistent with these Guidelines, the issue can be raised at that
time.
A few commenters raised a question regarding a conflict between EPA's existing policies and
the SDWA principles and asked us to identify the conflicting specific risk assessment standards
and make every effort to reconcile the conflicting standards with the SDWA principles. A few
commenters stated that EPA should not have two separate standards for risk assessments (i.e.,
one for influential and one for non-influential), but that all risk assessments should be considered
influential. Another stated that if there is a conflict between existing policies and the SDWA
principles, EPA should identify the conflicting specific risk assessment standards and make
every effort to reconcile the conflicting standards with the SDWA principles. Some commenters
have questioned why the "best available, peer reviewed science and supporting studies"
language of SDWA was conditioned by terms such as "to the extent practicable" or "as
appropriate."

Adaptation of SDWA Quality Principles
Public comments received by the Agency on the draft Guidelines were widely divergent. As no
obvious consensus could be drawn, we carefully considered comments and arguments on
adoption and adaptation. We also reviewed our experience with the SDWA principles, existing
policies, and the applicability and appropriateness of the SDWA language with regard to the
variety of risk assessments that we conduct, and have determined that, to best meet the statutory
obligations of the many statutes EPA implements, it remains most appropriate to adapt the
SDWA principles to human health, safety, and environmental risk assessments.
In response to public comments we have removed "as appropriate" from these Guidelines in our
SDWA adaptation. EPA agrees that the phrase peer reviewed science "as appropriate" was
unclear. We revised this statement in part (A) to "including, when available, peer-reviewed
science and supporting studies." EPA introduced such adaptations in order to accommodate the
range of real-world situations we address in the implementation of our diverse programs.
Numerous commenters expressed concern that EPA did not provide adequate clarification of how
we adapted the principles and what our thinking was on each adaptation. In these Guidelines we
have provided detailed clarifications regarding each adaptation made to the original SDWA
language, along with other remarks regarding our intent during the implementation of the SDWA
adaptation for influential disseminations by EPA. We direct readers to the Guidelines text for
such clarifications.

A.3.7 Complaint Resolution
A few commenters noted that EPA should outline how an affected person would rebut the
presumption of objectivity afforded by peer review. EPA believes this determination would be
made on a case-by-case basis considering the circumstances of a particular peer review and has

decided not to provide specific suggestions for affected persons on how to rebut the presumption
of objectivity afforded by a peer review.
OMB and other commenters noted that agencies' guidelines needed to make clear that a request
for correction can be filed if an affected person believes that information does not comply with
the EPA Guidelines and the OMB guidelines. EPA has added language in the EPA Guidelines to
make this clearer to readers.
EPA received numerous comments on the EPA definition of affected persons. In the draft
Guidelines, EPA had adopted OMB's definition. EPA agrees with comments suggesting that,
instead of elaborating on the definition of "affected person," a more open approach would be to
ask complainants to describe how they are an affected person with respect to the information that
is the subject of their complaint. EPA is asking that persons submitting requests for correction
provide, among other things, such an explanation. EPA has revised the Guidelines accordingly,
so that we may consider this information along with other information in the complaint in
deciding on how to respond.
Some commenters noted that the EPA Guidelines do not state how the process will work,
specifically, for States, municipalities, and EPA. They expressed concern about being "caught in
the middle," so to speak, when trying to get their own information corrected. EPA does not believe
that the Guidelines need greater detail on how States will work with EPA to address complaints,
but intends to work closely with States to better ensure timely correction. EPA does appreciate
the frustration of an information owner in seeing what they deem "incorrect" information in a
disseminated document or web site. However, EPA notes that this is a very complex issue that
cannot be addressed with general language in the Guidelines for all cases.
Several comments indicated that EPA appears to have given itself "carte blanche" authority to
"elect not to correct" information. The commenters stated that there was no valid reason why
EPA would opt out of correcting information and that all errors should be corrected. To the
contrary, EPA, like every Federal agency, wants to correct wrong information. The issue is not as
simple as the correction of an improper zip code or phone number on the EPA web site. Even
these simple errors may become complex if correction involves changing data in an EPA and/or
State database. Furthermore, EPA is not certain of the volume of complaints it will receive after
October 1 and therefore needed to provide a general provision in the Guidelines to recognize that
once EPA approves a request, the corrective action may vary depending on the circumstances.
On a case-by-case basis, EPA will determine the appropriate corrective action for each
complaint. EPA determined that this was the most reasonable approach. The revision also
recognizes practical limitations on corrective action for information from outside sources.
Several commenters noted that EPA needs to establish time frames for the complaint process.
Commenters stated that EPA should establish time frames for when affected persons can submit
a complaint on an information product, when EPA needs to respond to affected persons with a
decision on discrete, factual errors, when EPA would respond to affected persons with a decision
on more complex or broader interpretive issues, and when an affected person should submit a
request for reconsideration. One commenter suggested that EPA solicit all complaints at one
time during a 6-month window or another time frame. EPA notes that commenters provided
helpful examples and well thought out proposals for such a suite of time frames and appreciates
the public input.
EPA did not agree on the need to develop two separate time frames for complaints that are more
factual in nature versus those that are more complex. One commenter suggested a 15-day time
line for discrete factual errors and a 45-day time line for all other complaints. Another
commenter recommended 30 days for factual errors and 60 days for all other complaints.
Another commenter advised EPA to model this complaint process on the FOIA
process. This commenter also suggested a 3-week time line for more numeric corrections and 60
days for "broader interpretive issues or technical questions." While EPA appreciates the value of
these approaches, they might be problematic to implement. However, as EPA learns more about
the nature of this complaint process following some period of implementation, these suggested
approaches could be revisited.
EPA also agreed with commenters that a window of opportunity for commenters to submit a
request for reconsideration made sense. EPA has advised affected persons in these Guidelines to
submit a request for reconsideration within 90 days of the initial complaint decision by EPA.
Some commenters asked that EPA establish time lines for when EPA would take corrective
action. EPA does not anticipate that there would be any value in applying a specific time frame
for this action and prefers to look at each complaint and appropriate corrective action on a
case-by-case basis, as discussed above.
Commenters suggested that 45 days was a reasonable time frame for EPA to get back to the
affected person with either a decision or a notice that EPA needs more time. One group noted
that HHS, SSA, and NRC adopted the 45-day window. EPA disagreed with this approach and
instead opted for a 90-day time frame similar to the DOT Guidelines.
EPA received many comments on how EPA should structure its internal processes for the
complaint resolution process. Several comments specifically discussed the role that OEI should
play in the initial complaint and the requests for reconsideration. EPA does not agree that OEI
should be the arbiter on all requests for reconsideration, but does view the role of OEI in the
process as an important one. Namely, OEI may work to help ensure consistent responses to
complaints and requests for reconsideration. Other comments recommending specific internal
implementation processes are being considered as EPA designs the correction and request for
reconsideration administrative processes in greater detail.
Many commenters argued that Assistant Administrators and Regional Administrators should not
decide requests for reconsideration because they would be biased or would have a conflict of
interest when deciding complaints regarding information disseminated by their own Offices or
programs, or if they had to reconsider decisions made by their own staffs. EPA does not agree.
This type of decision making is within the delegated decision making authority of EPA's
officials, and these decisions should be presumed to be unbiased absent a specific showing that a
decision maker is not impartial in a particular case. EPA does agree with commenters who noted
that it is important to make consistent decisions on cross-cutting information quality issues. In
order to achieve appropriate consistency of response to affected persons on requests for
reconsideration and to ensure that cross-cutting information quality issues are considered across
the Agency at a senior level, EPA intends for an executive panel to make the final decisions on
all requests for reconsideration. Furthermore, we felt it important to add greater detail on the
time frame within which EPA would respond to a requestor on their request for reconsideration.
We have added that it is EPA's goal to respond to requesters regarding requests for
reconsideration within 90 days.
EPA received many recommendations in public comments to include the public in the EPA
complaint process. Specifically, commenters requested that EPA notify the public about all
pending requests to modify information and one commenter stated that EPA should allow the
public to comment on information corrections requests for information that are considered
"central to a rulemaking or other Final Agency Action" before EPA accepts or rejects the request.
As a general matter, EPA does not intend to solicit public comment on how EPA should respond
to requests for correction or reconsideration. EPA also does not intend to post requests for
correction and requests for reconsideration on the EPA web site, but we plan to revisit this and
many other aspects of the Guidelines within one year of implementation.
EPA also received many comments on how information that is currently being reviewed by EPA
in response to a complaint appears to the public on the EPA web site or some other medium.
Some commenters recommended the use of flags for all information that has a complaint pending,
with a note that, where appropriate, challenged information will be pulled from dissemination and
removed from EPA's web site. Other commenters stated that the information in question should
be removed from public access until the resolution process has been completed. Still other
commenters requested that EPA not embark on self-censorship. As a general rule, EPA has
decided not to flag information that has a complaint pending. EPA believes that information that
is the subject of a pending complaint should not necessarily be removed from public access based
solely on the receipt of a request for correction.

A.4

Next Steps

EPA is actively developing new policies and procedures, as appropriate, to improve the quality of
information disseminated to the public. Some activities specifically support ensuring and
maximizing the quality, objectivity, utility, and integrity of information. For instance, we are
consulting with the scientific community on the subject of reproducibility. The EPA Science
Advisory Board (SAB) is performing an expedited consultation on the subject on October 1,
2002. Based on this initial consultation, EPA and the SAB may consider a full review of
reproducibility and related information quality concepts in 2003. Furthermore, as noted earlier,
the EPA Science Policy Council has commissioned a workgroup to develop assessment factors
for consideration in assessing information that EPA collects or that is voluntarily submitted in support
of various Agency decisions.

As new processes, policies, and procedures are considered and adopted into Agency operations,
we will consider their relationship to the Guidelines and determine the extent to which the
Guidelines may need to change to accommodate new activity.


OFFICE OF
ENVIRONMENTAL
INFORMATION

www.epa.gov/oei

Friday,

February 22, 2002

Part IX

Office of
Management and
Budget
Guidelines for Ensuring and Maximizing
the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by
Federal Agencies; Notice; Republication

8452

Federal Register / Vol. 67, No. 36 / Friday, February 22, 2002 / Notices

OFFICE OF MANAGEMENT AND
BUDGET
Guidelines for Ensuring and
Maximizing the Quality, Objectivity,
Utility, and Integrity of Information
Disseminated by Federal Agencies;
Republication
Editorial Note: Due to numerous errors,
this document is being reprinted in its
entirety. It was originally printed in the
Federal Register on Thursday, January 3,
2002, at 67 FR 369-378 and was corrected on
Tuesday, February 5, 2002, at 67 FR 5365.
AGENCY: Office of Management and
Budget, Executive Office of the
President.
ACTION: Final guidelines.

SUMMARY: These final guidelines
implement section 515 of the Treasury
and General Government
Appropriations Act for Fiscal Year 2001
(Public Law 106-554; H.R. 5658).
Section 515 directs the Office of
Management and Budget (OMB) to issue
government-wide guidelines that
"provide policy and procedural
guidance to Federal agencies for
ensuring and maximizing the quality,
objectivity, utility, and integrity of
information (including statistical
information) disseminated by Federal
agencies." By October 1, 2002, agencies
must issue their own implementing
guidelines that include "administrative
mechanisms allowing affected persons
to seek and obtain correction of
information maintained and
disseminated by the agency" that does
not comply with the OMB guidelines.
These final guidelines also reflect the
changes OMB made to the guidelines
issued September 28, 2001, as a result
of receiving additional comment on the
"capable of being substantially
reproduced" standard (paragraphs
V.3.B, V.9, and V.10), which OMB
previously issued on September 28,
2001, on an interim final basis.
DATES: Effective Date: January 3, 2002.

FOR FURTHER INFORMATION CONTACT:

Brooke J. Dickson, Office of Information
and Regulatory Affairs. Office of
Management and Budget. Washington.
DC 20503. Telephone (202) 395-3785 or
by e-mail to
informationquality@omb.eop.gov.
SUPPLEMENTARY INFORMATION: In section
515(a) of the Treasury and General
Government Appropriations Act for
Fiscal Year 2001 (Public Law 106-554;
H.R. 5658), Congress directed the Office
of Management and Budget (OMB) to
issue, by September 30, 2001,
government-wide guidelines that
"provide policy and procedural

guidance to Federal agencies for
ensuring and maximizing the quality,
objectivity, utility, and integrity of
information (including statistical
information) disseminated by Federal
agencies * * *." Section 515(b) goes on
to state that the OMB guidelines shall:
"(1) apply to the sharing by Federal
agencies of, and access to, information
disseminated by Federal agencies; and
"(2) require that each Federal agency
to which the guidelines apply­
"(A) issue guidelines ensuring and
maximizing the quality, objectivity,
utility, and integrity of information
(including statistical information)
disseminated by the agency, by not later
than 1 year after the date of issuance of
the guidelines under subsection (a);
"(B) establish administrative
mechanisms allowing affected persons
to seek and obtain correction of
information maintained and
disseminated by the agency that does
not comply with the guidelines issued
under subsection (a); and
"(C) report periodically to the
Director­
"(i) the number and nature of
complaints received by the agency
regarding the accuracy of information
disseminated by the agency; and
"(ii) how such complaints were
handled by the agency."
Proposed guidelines were published
in the Federal Register on June 28, 2001
(66 FR 34489). Final guidelines were
published in the Federal Register on
September 28, 2001 (66 FR 49718). The
Supplementary Information to the final
guidelines published in September 2001
provides background, the underlying
principles OMB followed in issuing the
final guidelines, and statements of
intent concerning detailed provisions in
the final guidelines.
In the final guidelines published in
September 2001, OMB also requested
additional comment on the "capable of
being substantially reproduced"
standard and the related definition of
"influential scientific or statistical
information" (paragraphs V.3.B, V.9,
and V.10), which were issued on an
interim final basis. The final guidelines
published today discuss the public
comments OMB received. the OMB
response, and amendments to the final
guidelines published in September
2001.
In developing agency-specific
guidelines, agencies should refer both to
the Supplementary Information to the
final guidelines published in the
Federal Register on September 28, 2001
(66 FR 49718), and also to the
Supplementary Information published
today. We stress that the three
"Underlying Principles" that OMB

followed in drafting the guidelines that
we published on September 28, 2001
(66 FR 49719), are also applicable to the
amended guidelines that we publish
today.
In accordance with section 515, OMB
has designed the guidelines to help
agencies ensure and maximize the
quality, utility, objectivity and integrity
of the information that they disseminate
(meaning to share with, or give access
to, the public). It is crucial that
information Federal agencies
disseminate meets these guidelines. In
this respect. the fact that the Internet
enables agencies to communicate
information quickly and easily to a wide
audience not only offers great benefits to
society, but also increases the potential
harm that can result from the
dissemination of information that does
not meet basic information quality
guidelines. Recognizing the wide variety
of information Federal agencies
disseminate and the wide variety of
dissemination practices that agencies
have, OMB developed the guidelines
with several principles in mind.
First, OMB designed the guidelines to
apply to a wide variety of government
information dissemination activities
that may range in importance and scope.
OMB also designed the guidelines to be
generic enough to fit all media, be they
printed, electronic, or in other form.
OMB sought to avoid the problems that
would be inherent in developing
detailed, prescriptive, "one-size-fits-all"
government-wide guidelines that would
artificially require different types of
dissemination activities to be treated in
the same manner. Through this
flexibility, each agency will be able to
incorporate the requirements of these
OMB guidelines into the agency's own
information resource management and
administrative practices.
Second, OMB designed the guidelines
so that agencies will meet basic
information quality standards. Given the
administrative mechanisms required by
section 515 as well as the standards set
forth in the Paperwork Reduction Act, it
is clear that agencies should not
disseminate substantive information
that does not meet a basic level of
quality. We recognize that some
government information may need to
meet higher or more specific
information quality standards than
those that would apply to other types of
government information. The more
important the information, the higher
the quality standards to which it should
be held, for example, in those situations
involving "influential scientific.
financial, or statistical information" (a
phrase defined in these guidelines). The
guidelines recognize, however, that

Federal Register/Vol. 67, No. 36/Friday, February 22, 2002/Notices
information quality comes at a cost.
Accordingly, the agencies should weigh
the costs (for example, including costs
attributable to agency processing effort,
respondent burden, maintenance of
needed privacy, and assurances of
suitable confidentiality) and the benefits
of higher information quality in the
development of information, and the
level of quality to which the information
disseminated will be held.
Third, OMB designed the guidelines
so that agencies can apply them in a
common-sense and workable manner. It
is important that these guidelines do not
impose unnecessary administrative
burdens that would inhibit agencies
from continuing to take advantage of the
Internet and other technologies to
disseminate information that can be of
great benefit and value to the public. In
this regard, OMB encourages agencies to
incorporate the standards and
procedures required by these guidelines
into their existing information resources
management and administrative
practices rather than create new and
potentially duplicative or contradictory
processes. The primary example of this
is that the guidelines recognize that, in
accordance with OMB Circular A-130,
agencies already have in place
well-established information quality
standards and administrative
mechanisms that allow persons to seek
and obtain correction of information
that is maintained and disseminated by
the agency. Under the OMB guidelines,
agencies need only ensure that their
own guidelines are consistent with
these OMB guidelines, and then ensure
that their administrative mechanisms
satisfy the standards and procedural
requirements in the new agency
guidelines. Similarly, agencies may rely
on their implementation of the Federal
Government's computer security laws
(formerly, the Computer Security Act,
and now the computer security
provisions of the Paperwork Reduction
Act) to establish appropriate security
safeguards for ensuring the "integrity"
of the information that the agencies
disseminate.
In addition, in response to concerns
expressed by some of the agencies, we
want to emphasize that OMB recognizes
that Federal agencies provide a wide
variety of data and information.
Accordingly, OMB understands that the
guidelines discussed below cannot be
implemented in the same way by each
agency. In some cases, for example, the
data disseminated by an agency are not
collected by that agency; rather, the
information the agency must provide in
a timely manner is compiled from a
variety of sources that are constantly
updated and revised and may be

confidential. In such cases, while
agencies' implementation of the
guidelines may differ, the essence of the
guidelines will apply. That is, these
agencies must make their methods
transparent by providing
documentation, ensure quality by
reviewing the underlying methods used
in developing the data and consulting
(as appropriate) with experts and users,
and keep users informed about
corrections and revisions.
Summary of OMB Guidelines

These guidelines apply to Federal
agencies subject to the Paperwork
Reduction Act (44 U.S.C. chapter 35).
Agencies are directed to develop
information resources management
procedures for reviewing and
substantiating (by documentation or
other means selected by the agency) the
quality (including the objectivity,
utility, and integrity) of information
before it is disseminated. In addition,
agencies are to establish administrative
mechanisms allowing affected persons
to seek and obtain, where appropriate,
correction of information disseminated
by the agency that does not comply with
the OMB or agency guidelines.
Consistent with the underlying
principles described above, these
guidelines stress the importance of
having agencies apply these standards
and develop their administrative
mechanisms so they can be
implemented in a common sense and
workable manner. Moreover, agencies
must apply these standards flexibly, and
in a manner appropriate to the nature
and timeliness of the information to be
disseminated, and incorporate them into
existing agency information resources
management and administrative
practices.
Section 515 denotes four substantive
terms regarding information
disseminated by Federal agencies:
quality, utility, objectivity, and
integrity. It is not always clear how each
substantive term relates-or how the
four terms in aggregate relate-to the
widely divergent types of information
that agencies disseminate. The
guidelines provide definitions that
attempt to establish a clear meaning so
that both the agency and the public can
readily judge whether a particular type
of information to be disseminated does
or does not meet these attributes.
In the guidelines, OMB defines
"quality" as the encompassing term, of
which "utility," "objectivity," and
"integrity" are the constituents.
"Utility" refers to the usefulness of the
information to the intended users.
"Objectivity" focuses on whether the
disseminated information is being


presented in an accurate, clear,
complete, and unbiased manner, and as
a matter of substance, is accurate,
reliable, and unbiased. "Integrity" refers
to security: the protection of
information from unauthorized access
or revision, to ensure that the
information is not compromised
through corruption or falsification. OMB
modeled the definitions of
"information," "government
information," "information
dissemination product," and
"dissemination" on the longstanding
definitions of those terms in OMB
Circular A-130, but tailored them to fit
into the context of these guidelines.
In addition, Section 515 imposes two
reporting requirements on the agencies.
The first report, to be promulgated no
later than October 1, 2002, must provide
the agency's information quality
guidelines that describe administrative
mechanisms allowing affected persons
to seek and obtain, where appropriate,
correction of disseminated information
that does not comply with the OMB and
agency guidelines. The second report is
an annual fiscal year report to OMB (to
be first submitted on January 1, 2004)
providing information (both quantitative
and qualitative, where appropriate) on
the number, nature, and resolution of
complaints received by the agency
regarding its perceived or confirmed
failure to comply with these OMB and
agency guidelines.
Public Comments and OMB Response
Applicability of Guidelines. Some
comments raised concerns about the
applicability of these guidelines,
particularly in the context of scientific
research conducted by Federally
employed scientists or Federal grantees
who publish and communicate their
research findings in the same manner as
their academic colleagues. OMB
believes that information generated and
disseminated in these contexts is not
covered by these guidelines unless the
agency represents the information as, or
uses the information in support of, an
official position of the agency.
As a general matter, these guidelines
apply to "information" that is
"disseminated" by agencies subject to
the Paperwork Reduction Act (44 U.S.C.
3502(1)). See paragraphs II, V.5, and V.8.
The definitions of "information" and
"dissemination" establish the scope of
the applicability of these guidelines.
"Information" means "any
communicatiOll or representation of
knowledge such as facts or data' • ."
This definition of information in
paragraph V.5 does "not include
opinions, where the agency's
presentation makes it clear that what is

being offered is someone's opinion
rather than fact or the agency's views."
"Dissemination" is defined to mean
"agency initiated or sponsored
distribution of information to the
public." As used in paragraph V.8,
"agency INITIATED' * • distribution
of information to the public" refers to
information that the agency
disseminates, e.g., a risk assessment
prepared by the agency to inform the
agency's formulation of possible
regulatory or other action. In addition,
if an agency, as an institution,
disseminates information prepared by
an outside party in a manner that
reasonably suggests that the agency
agrees with the information, this
appearance of having the information
represent agency views makes agency
dissemination of the information subject
to these guidelines. By contrast, an
agency does not "initiate" the
dissemination of information when a
Federally employed scientist or Federal
grantee or contractor publishes and
communicates his or her research
findings in the same manner as his or
her academic colleagues, even if the
Federal agency retains ownership or
other intellectual property rights
because the Federal government paid for
the research. To avoid confusion
regarding whether the agency agrees
with the information (and is therefore
disseminating it through the employee
or grantee), the researcher should
include an appropriate disclaimer in the
publication or speech to the effect that
the "views are mine, and do not
necessarily reflect the view" of the
agency.
Similarly, as used in paragraph V.8,
"agency * * * SPONSORED
distribution of information to the
public" refers to situations where an
agency has directed a third-party to
disseminate information, or where the
agency has the authority to review and
approve the information before release.
Therefore, for example, if an agency
through a procurement contract or a
grant provides for a person to conduct
research, and then the agency directs
the person to disseminate the results (or
the agency reviews and approves the
results before they may be
disseminated), then the agency has
"sponsored" the dissemination of this
information. By contrast, if the agency
simply provides funding to support
research, and it is the researcher (not the
agency) who decides whether to
disseminate the results and-if the
results are to be released-who
determines the content and presentation
of the dissemination, then the agency
has not "sponsored" the dissemination
even though it has funded the research

and even if the Federal agency retains
ownership or other intellectual property
rights because the Federal government
paid for the research. To avoid
confusion regarding whether the agency
is sponsoring the dissemination, the
researcher should include an
appropriate disclaimer in the
publication or speech to the effect that
the "views are mine, and do not
necessarily reflect the view" of the
agency. On the other hand, subsequent
agency dissemination of such
information requires that the
information adhere to the agency's
information quality guidelines. In sum,
these guidelines govern an agency's
dissemination of information, but
generally do not govern a third-party's
dissemination of information (the
exception being where the agency is
essentially using the third-party to
disseminate information on the agency's
behalf). Agencies, particularly those that
fund scientific research, are encouraged
to clarify the applicability of these
guidelines to the various types of
information they and their employees
and grantees disseminate.
Paragraph V.8 also states that the
definition of "dissemination" does not
include "* * * distribution limited to
correspondence with individuals or
persons, press releases, archival records,
public filings, subpoenas or adjudicative
processes." The exemption from the
definition of "dissemination" for
"adjudicative processes" is intended to
exclude, from the scope of these
guidelines, the findings and
determinations that an agency makes in
the course of adjudications involving
specific parties. There are
well-established procedural safeguards and
rights to address the quality of
adjudicatory decisions and to provide
persons with an opportunity to contest
decisions. These guidelines do not
impose any additional requirements on
agencies during adjudicative
proceedings and do not provide parties
to such adjudicative proceedings any
additional rights of challenge or appeal.
The Presumption Favoring
Peer-Reviewed Information. As a general
matter, in the scientific and research
context, we regard technical information
that has been subjected to formal,
independent, external peer review as
presumptively objective. As the
guidelines state in paragraph V.3.b.i: "If
data and analytic results have been
subjected to formal, independent,
external peer review, the information
may generally be presumed to be of
acceptable objectivity." An example of a
formal, independent, external peer
review is the review process used by
scientific journals.

Most comments approved of the
prominent role that peer review plays in
the OMB guidelines. Some comments
contended that peer review was not
accepted as a universal standard that
incorporates an established, practiced,
and sufficient level of objectivity. Other
comments stated that the guidelines
would be better clarified by making peer
review one of several factors that an
agency should consider in assessing the
objectivity (and quality in general) of
original research. In addition, several
comments noted that peer review does
not establish whether analytic results
are capable of being substantially
reproduced. In light of the comments,
the final guidelines in new paragraph
V.3.b.i qualify the presumption in favor
of peer-reviewed information as follows:
"However, this presumption is
rebuttable based on a persuasive
showing by the petitioner in a particular
instance."
We believe that transparency is
important for peer review, and these
guidelines set minimum standards for
the transparency of agency-sponsored
peer review. As we state in new
paragraph V.3.b.i: "If data and analytic
results have been subjected to formal,
independent, external peer review. the
information may generally be presumed
to be of acceptable objectivity. However,
this presumption is rebuttable based on
a persuasive showing by the petitioner
in a particular instance. If
agency-sponsored peer review is employed to
help satisfy the objectivity standard, the
review process employed shall meet the
general criteria for competent and
credible peer review recommended by
OMB-OIRA to the President's
Management Council (9/20/01) (http://
www.whitehouse.gov/omb/inforeg/
oira_review-process.html), namely, 'that
(a) peer reviewers be selected primarily
on the basis of necessary technical
expertise, (b) peer reviewers be expected
to disclose to agencies prior technical/
policy positions they may have taken on
the issues at hand, (c) peer reviewers be
expected to disclose to agencies their
sources of personal and institutional
funding (private or public sector), and
(d) peer reviews be conducted in an
open and rigorous manner.' "
The importance of these general
criteria for competent and credible peer
review has been supported by a number
of expert bodies. For example, "the
work of fully competent peer-review
panels can be undermined by
allegations of conflict of interest and
bias. Therefore, the best interests of the
Board are served by effective policies
and procedures regarding potential
conflicts of interest, impartiality, and
panel balance." (EPA's Science Advisory

Board Panels: Improved Policies and
Procedures Needed to Ensure
Independence and Balance, GAO-01-
536, General Accounting Office,
Washington, DC, June 2001, page 19.)
As another example, "risk analyses
should be peer-reviewed and
accessible—both physically and
intellectually—so that decision-makers
at all levels will be able to respond
critically to risk characterizations. The
intensity of the peer reviews should be
commensurate with the significance of
the risk or its management
implications." (Setting Priorities,
Getting Results: A New Direction for
EPA, Summary Report, National
Academy of Public Administration,
Washington, DC, April 1995, page 23.)
These criteria for peer reviewers are
generally consistent with the practices
now followed by the National Research
Council of the National Academy of
Sciences. In considering these criteria
for peer reviewers, we note that there
are many types of peer reviews and that
agency guidelines concerning the use of
peer review should tailor the rigor of
peer review to the importance of the
information involved. More generally,
agencies should define their peer-review
standards in appropriate ways, given the
nature and importance of the
information they disseminate.
Is Journal Peer Review Always
Sufficient? Some comments argued that
journal peer review should be adequate
to demonstrate quality, even for
influential information that can be
expected to have major effects on public
policy. OMB believes that this position
overstates the effectiveness of journal
peer review as a quality-control
mechanism.
Although journal peer review is
clearly valuable, there are cases where
flawed science has been published in
respected journals. For example, the
NIH Office of Research Integrity recently
reported the following case regarding
environmental health research:
"Based on the report of an investigation
conducted by [XX] University, dated July 16,
1999, and additional analysis conducted by
ORI in its oversight review, the US Public
Health Service found that Dr. [X] engaged in
scientific misconduct. Dr. [X] committed
scientific misconduct by intentionally
falsifying the research results published in
the journal SCIENCE and by providing
falsified and fabricated materials to
investigating officials at [XX] University in
response to a request for original data to
support the research results and conclusions
reported in the SCIENCE paper. In addition,
PHS finds that there is no original data or
other corroborating evidence to support the
research results and conclusions reported in
the SCIENCE paper as a whole." (66 FR
52137, October 12, 2001).

Although such cases of falsification
are presumably rare, there is a
significant scholarly literature
documenting quality problems with
articles published in peer-reviewed
research. "In a [peer-reviewed] meta­
analysis that surprised many-and some
doubt-researchers found little evidence
that peer review actually improves the
quality ofresearch papers." (See. e.g.,
Science, Vol. 293. page 2187 (September
21. 2001.)) In part for this reason, many
agencies have already adopted peer
review and science advisory practices
that go beyond journal peer review. See,
e.g., Sheila Jasanoff, The Fifth Branch:
Science Advisers as Policy Makers,
Cambridge, MA: Harvard University
Press, 1990; Mark R. Powell, Science at
EPA: Information in the Regulatory
Process, Resources for the Future,
Washington, DC, 1999, pages 138-139,
151-153; Implementation of the
Environmental Protection Agency's Peer
Review Program: An SAB Evaluation of
Three Reviews, EPA-SAB-RSAC-01-
009, A Review of the Research Strategies
Advisory Committee (RSAC) of the EPA
Science Advisory Board (SAB),
Washington, DC, September 26, 2001.
For information likely to have an
important public policy or private sector
impact, OMB believes that additional
quality checks beyond peer review are
appropriate.
Definition of "Influential". OMB
guidelines apply stricter quality
standards to the dissemination of
information that is considered
"influential." Comments noted that the
breadth of the definition of "influential"
in interim final paragraph V.9 requires
much speculation on the part of
agencies.
We believe that this criticism has

merit and have therefore narrowed the
definition. In this narrower definition,
"influential", when used in the phrase
"influential scientific, financial, or
statistical information", is amended to
mean that "the agency can reasonably
determine that dissemination of the
information will have or does have a
clear and substantial impact on
important public policies or important
private sector decisions." The intent of
the new phrase "clear and substantial"
is to reduce the need for speculation on
the part of agencies. We added the
present tense, "or does have," to this
narrower definition because on
occasion, an information dissemination
may occur simultaneously with a
particular policy change. In response to
a public comment, we added an explicit
reference to "financial" information as
consistent with our original intent.
Given the differences in the many
Federal agencies covered by these


guidelines, and the differences in the
nature of the information they
disseminate, we also believe it will be
helpful if agencies elaborate on this
definition of" influential" in the context
of their missions and duties. with due
consideration of the nature of the
information thev disseminate. As we
state in amended paragraph V.9. "Each
agency is authorized to define
.influential' in ways appropriate for it
given the nature and multiplicity of
issues for which the agency is
responsible. "
Reproducibility. As we state in new
paragraph V.3.b.ii: "If an agency is
responsible for disseminating influential
scientific, financial, or statistical
information, agency guidelines shall
include a high degree of transparency
about data and methods to facilitate the
reproducibility of such information by
qualified third parties." OMB believes
that a reproducibility standard is
practical and appropriate for
information that is considered
"influential", as defined in paragraph
V.9—that "will have or does have a
clear and substantial impact on
important public policies or important
private sector decisions." The
reproducibility standard applicable to
influential scientific, financial, or
statistical information is intended to
ensure that information disseminated by
agencies is sufficiently transparent in
terms of data and methods of analysis
that it would be feasible for a replication
to be conducted. The fact that the use
of original and supporting data and
analytic results have been deemed
"defensible" by peer·review procedures
does not necessarily imply that the
results are transparent and replicable.
Reproducibility of Original and
Supporting Data. Several of the
comments objected to the exclusion of
original and supporting data from the
reproducibility requirements.
Comments instead suggested that OMB
should apply the reproducibility
standard to original data, and that OMB
should provide flexibility to the
agencies in determining what
constitutes "original and supporting"
data. OMB agrees and asks that agencies
consider, in developing their own
guidelines, which categories of original
and supporting data should be subject to
the reproducibility standard and which
should not. To help in resolving this
issue, we also ask agencies to consult
directly with relevant scientific and
technical communities on the feasibility
of having the selected categories of
original and supporting data subject to
the reproducibility standard. Agencies
are encouraged to address ethical,
feasibility, and confidentiality issues

with care. As we state in new paragraph
V.3.b.ii.A, "Agencies may identify, in
consultation with the relevant scientific
and technical communities, those
particular types of data that can
practicably be subjected to a
reproducibility requirement, given
ethical, feasibility, or confidentiality
constraints." Further. as we state in our
expanded definition of
"reproducibility" in paragraph V.IO, "If
agencies apply the reproducibility test
to specific types of original or
supporting data, the associated
guidelines shall provide relevant
definitions of reproducibility (e.g"
standards for replication of laboratory
data)," OMB urges caution in the
treatment of original and supporting
data because it may often be impractical
or even impermissible or unethical to
apply the reproducibility standard to
such data. For example, it may not be
ethical to repeat a "negative"
(ineffective) clinical (therapeutic)
experiment and it may not be feasible to
replicate the radiation exposures
studied after the Chernobyl accident.
When agencies submit their draft agency
guidelines for OMB review, agencies
should include a description of the
extent to which the reproducibility
standard is applicable and reflect
consultations with relevant scientific
and technical communities that were
used in developing guidelines related to
applicability of the reproducibility
standard to original and supporting
data.
It is also important to emphasize that
the reproducibility standard does not
apply to all original and supporting data
disseminated by agencies. As we state in
new paragraph V.3.b.ii.A, "With regard
to original and supporting data related
[to influential scientific, financial, or
statistical information], agency
guidelines shall not require that all
disseminated data be subjected to a
reproducibility requirement." In
addition, we encourage agencies to
address how greater transparency can be
achieved regarding original and
supporting data. As we also state in new
paragraph V.3.b.ii.A, "It is understood
that reproducibility of data is an
indication of transparency about
research design and methods and thus
a replication exercise (i.e., a new
experiment, test, or sample) shall not be
required prior to each dissemination."
Agency guidelines need to achieve a
high degree of transparency about data
even when reproducibility is not
required.
Reproducibility of Analytic Results.
Many public comments were critical of
the reproducibility standard and
expressed concern that agencies would

be required to reproduce each analytical
result before it is disseminated. While
several comments commended OMB for
establishing an appropriate balance in
the "capable of being substantially
reproduced" standard, others
considered this standard to be
inherently subjective. There were also
comments that suggested the standard
would cause more burden for agencies.
It is not OMB's intent that each
agency must reproduce each analytic
result before it is disseminated. The
purpose of the reproducibility standard
is to cultivate a consistent agency
commitment to transparency about how
analytic results are generated: the
specific data used, the various
assumptions employed, the specific
analytic methods applied, and the
statistical procedures employed. If
sufficient transparency is achieved on
each of these matters, then an analytic
result should meet the "capable of being
substantially reproduced" standard.
While there is much variation in types
of analytic results, OMB believes that
reproducibility is a practical standard to
apply to most types of analytic results.
As we state in new paragraph V.3.b.ii.B:
"With regard to analytic results related
[to influential scientific, financial, or
statistical information], agency
guidelines shall generally require
sufficient transparency about data and
methods that an independent reanalysis
could be undertaken by a qualified
member of the public. These
transparency standards apply to agency
analysis of data from a single study as
well as to analyses that combine
information from multiple studies." We
elaborate upon this principle in our
expanded definition of
"reprodUcibility" in paragraph V.10:
"With respect to analytic results,
'capable of being substantially
reproduced' means that independent
analysis of the original or supporting
data using identical methods would
generate similar analytic results, subject
to an acceptable degree of imprecision
or error."
Even in a situation where the original
and supporting data are protected by
confidentiality concerns, or the analytic
computer models or other research
methods may be kept confidential to
protect intellectual property, it may still
be feasible to have the analytic results
subject to the reproducibility standard.
For example, a qualified party,
operating under the same
confidentiality protections as the
original analysts, may be asked to use
the same data, computer model or
statistical methods to replicate the
analytic results reported in the original
study. See, e.g., "Reanalysis of the

Harvard Six Cities Study and the
American Cancer Society Study of
Particulate Air Pollution and Mortality,"
A Special Report of the Health Effects
Institute's Particle Epidemiology
Reanalysis Project, Cambridge, MA,
2000.

The primary benefit of public
transparency is not necessarily that
errors in analytic results will be
detected, although error correction is
clearly valuable. The more important
benefit of transparency is that the public
will be able to assess how much an
agency's analytic result hinges on the
specific analytic choices made by the
agency. Concreteness about analytic
choices allows, for example, the
implications of alternative technical
choices to be readily assessed. This type
of sensitivity analysis is widely
regarded as an essential feature of
high-quality analysis, yet sensitivity analysis
cannot be undertaken by outside parties
unless a high degree of transparency is
achieved. The OMB guidelines do not
compel such sensitivity analysis as a
necessary dimension of quality, but the
transparency achieved by
reproducibility will allow the public to
undertake sensitivity studies of interest.
We acknowledge that confidentiality
concerns will sometimes preclude
public access as an approach to
reproducibility. In response to public
comment, we have clarified that such
concerns do include interests in
"intellectual property." To ensure that
the OMB guidelines have sufficient
flexibility with regard to analytic
transparency, OMB has, in new
paragraph V.3.b.ii.B.i, provided agencies
an alternative approach for classes or
types of analytic results that cannot
practically be subject to the
reproducibility standard. "[In those
situations involving influential
scientific, financial, or statistical
information * * *] making the data and
methods publicly available will assist in
determining whether analytic results are
reproducible. However, the objectivity
standard does not override other
compelling interests such as privacy,
trade secrets, intellectual property, and
other confidentiality protections."
Specifically, in cases where
reproducibility will not occur due to
other compelling interests, we expect
agencies (1) to perform robustness
checks appropriate to the importance of
the information involved, e.g.,
determining whether a specific statistic
is sensitive to the choice of analytic
method, and, accompanying the
information disseminated, to document
their efforts to assure the needed
robustness in information quality, and
(2) address in their guidelines the
