
DEPARTMENT OF HEALTH

DIRECTORATE: RADIATION CONTROL


CODE OF PRACTICE FOR USERS OF MEDICAL X-RAY EQUIPMENT
(Code: Diagnostic Use)
WEB ADDRESS:
http://www.doh.gov.za/department/radiation/01.html
INDEX

1. INTRODUCTION
2. ABBREVIATIONS AND DEFINITIONS
3. LICENSING
Application for Licences
Installations
Acceptance Tests
Disposal / modification of x-ray equipment
New / modified premises
Responsible person
Change of responsible person
4. RESPONSIBILITIES OF LICENCE HOLDER / RESPONSIBLE PERSON
Keeping of patient records
Keeping of equipment records
Submission of acceptance test results
5. OPERATORS
Operators of mammography units
6. RADIATION WORKERS
Pregnant radiation workers
Appointment of a radiation worker
Medical examinations
Monitoring
Termination of employment
Appointment of radiation workers by a new employer
Radiation Occurrences
7. RADIATION PROTECTION
Basic protection principles
Protection of patients
Protection of pregnant patients
Protection of women of reproductive capacity
Protection of paediatric patients
Protection of non-radiation personnel and public
Protection of persons holding patients
8. PREMISES REQUIREMENTS
9. RADIATION WARNING SIGNS, NOTICES AND LIGHTS
10. ANNEXURE A
11. CONTACT DETAILS
12. REFERENCES
1. INTRODUCTION

This Code sets out requirements and recommendations for radiation safety associated with the use of medical diagnostic x-ray equipment.

The Hazardous Substances Act, 1973 (Act 15 of 1973) and its Regulations (No R1332 of 3 August 1973) govern the safe use of medical x-ray equipment in South Africa.

Requirements from the Act and Regulations are incorporated in this Code. Further requirements are taken from the source material listed in the References section.

Whenever compliance with a requirement in this Code is mandated, the word must / shall is used. The word should indicates a practice that is recommended but not mandatory at this stage.

Where a given technology or practice is not specifically covered by this Code, guidance in matters of radiation protection should be sought from the Department of Health: Radiation Control.

The Licensee shall be responsible for ensuring that corrective action takes place on items of non-compliance with this Code.

The Act does not allow any person to use radiation equipment unless he/she holds a licence under the Act for that purpose.

This Code does not cover the use of x-rays for dental and veterinary diagnosis.

This Code must be read in conjunction with the DOH Guideline documents listed in Annexure A of this Code. ♦

All forms and guidelines are available at:
http://www.doh.gov.za/department/radiation/01.html → Licensing.

♦ DOH Guideline documents: http://www.doh.gov.za/department/radiation/01.html → Codes of Practice → Electronic products → Ionising Radiation
2. ABBREVIATIONS
ACT Hazardous Substances Act, 1973 (Act 15 of 1973)
DOH Department of Health
HPCSA Health Professions Council of South Africa
IER Individual Equipment Record
mSv milliSievert
TLD Thermoluminescent Dosimeter
NTP Nuclear Technology Products
PRMD Personal Radiation Monitoring Device
SABS South African Bureau of Standards
SANAS South African National Accreditation System
SDR Supplementary Diagnostic Radiographer
QA Quality Assurance
QC Quality Control
DEFINITIONS
Actinic marking Permanent transfer of patient data / identification on to
the film prior to processing
ALARA As Low As Reasonably Achievable
Controlled area A controlled area is a limited access area in which the
occupational exposure of personnel to radiation is under
the supervision of an individual in charge of radiation
protection. This implies that access, occupancy and
working conditions are controlled for radiation protection
purposes
Department Department of Health
Diagnost QC Requirements for licence holders with respect to Quality
Control tests for diagnostic x-ray imaging systems
Inspection Bodies Organisations approved by the Department of Health
to perform acceptance and QC tests on diagnostic x-ray
equipment
Radiation worker Any person who is potentially exposed as a result of
his/her occupation to more than three tenths of the
occupational dose limit
Regulations Regulations relating to the Control of Electronic Products
(No R1332 of 3 August 1973)
X-ray unit An electronic product that is designed, manufactured or
assembled with the primary purpose of producing x-rays
3. LICENSING

3.1 The Regulations concerning the Control of Electronic Products require that a joint product and premises licence be obtained for x-ray equipment before it may be installed and commissioned.
(a) Licences are not transferable and are issued:
• to a specific person or institution;
• for specific equipment and its application, and
• for a specific premises.
(b) Licences are issued subject to the Regulations concerning the Control of Electronic Products and the application of specific conditions.
(c) Licence holders must verify the accuracy of the information displayed on the licence issued and communicate any inaccuracies to the DOH.

Application for licences (Form RC001)

3.2 It is the responsibility of the prospective user of an x-ray unit to apply for a licence by completing and submitting an application form. Accurately completed forms will ensure processing of the applications without undue delay.
The import / manufacture licence number and technical information required on the form must be obtained from the supplier / distributor of the x-ray unit.
Allow 30 days for processing of applications.
Installations

3.3 The installation of an x-ray unit may only commence after a licence to install the unit has been issued.

Acceptance tests
(Refer to the "Diagnost QC" document on the DOH website – see Annexure A.1)

3.4 Only Inspection Bodies approved by DOH may perform acceptance tests on x-ray equipment. A list of approved Inspection Bodies and the scope of their licences is available on the DOH website.

3.5 New units
When a new unit is installed, acceptance tests must be performed by the supplier of the x-ray unit and the results recorded on the prescribed form and filed in the IER of the unit.

3.6 Pre-owned units
The prospective user must ensure that acceptance tests are performed. Granting of a licence to use a unit is subject to submission of the results of the tests to DOH.
When an existing licensed unit is moved to a new premises (building) or room, acceptance tests must be performed on the unit prior to use and the results submitted to DOH.
Disposal / modification of x-ray equipment (Form RC002)

3.7 The licence holder must apply for and obtain permission from DOH, by submitting a completed form RC002, prior to cancellation, modification, disposal and/or sale of x-ray equipment.
• Particulars regarding the type of disposal, e.g. sale, dismantling, disappearance or storage of a unit, must be furnished to DOH before the cancellation of the licence will be effected.

New / modified premises (Form RC002)

3.8 The licence holder must apply for and obtain permission prior to:
(a) modification of any licensed premises or of the layout of equipment on such premises, and/or
(b) change of licensed premises (building) or equipment being moved to other rooms within the same building.

Responsible person

3.9 The licence holder must appoint a responsible person who has adequate knowledge of and experience in the field of radiation protection in general. The appointed person is responsible to the licence holder for the safe use of the x-ray equipment (see also par 4).

3.10 The person appointed must be qualified in one of the following categories and registered with the Health Professions Council of South Africa (HPCSA):
• Radiography;
• Radiology;
• Medical Physics, or
• Chiropractic.

3.11 The responsible person must be appointed in writing, indicating the scope of the actions delegated by the licence holder.

Change of responsible person (Form RC002)

3.12 The licence holder must notify DOH of a change in responsible person by submitting a completed form RC002.
4. RESPONSIBILITIES OF LICENCE HOLDERS / RESPONSIBLE PERSONS

4.1 The licence holder of a diagnostic x-ray facility is ultimately responsible for:
(a) The entire scope of radiation safety for the equipment and premises for which he/she holds a licence;
(b) fulfilment of all related statutory requirements, and
(c) compliance with the conditions specified in the licence.

4.2 The licence holder / responsible person must ensure that:
• The equipment and the facilities in which such equipment is installed and used meet all applicable radiation safety standards;
• The equipment is maintained and functions properly;
• The equipment is used and maintained only by competent and appropriately trained persons / personnel;
• Applicable Quality Control (QC) tests are performed at the prescribed frequencies as stipulated in the "Diagnost QC" document on the DOH website (see Annexure A.1);
• The required QC equipment is provided;
• Radiation surveys to monitor the safe performance of equipment and to monitor radiation levels in work areas are undertaken;
• Radiation workers (occupationally exposed persons) are identified and issued with personal radiation monitoring devices (PRMDs);
• The appropriate protective clothing, devices and equipment are provided to personnel and properly used;
• Radiation safety rules are communicated to and followed by all personnel;
• Operational procedures are established and maintained to ensure that the radiation exposure to workers, patients and the public is kept as low as reasonably achievable (ALARA) without compromising the diagnostic efficiency of the result, and
• Workers are educated in the hazards and risks of ionising radiation.
Keeping of patient records

4.3 Records must be kept and be available for inspection purposes by DOH.

4.4 A record / register must be kept of all patients undergoing x-ray examinations. The record / register must be preserved for 5 years and contain the following information:
• surname, name, date of birth or ID number / age and gender;
• date of examination;
• brief clinical indication of the examination;
• type of examination;
• number of exposures (repeat exposures included), and
• fluoroscopy time, dose results (if available) and the name of the person performing the fluoroscopy procedure.
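
For illustration only (the Code prescribes the information to be kept, not a storage format), the register entries required by par 4.4 map naturally onto a simple structured record. The minimal Python sketch below assumes illustrative field names and a basic retention check; none of these details are prescribed by the Code.

```python
# Minimal sketch of a patient register entry covering the fields listed in par 4.4.
# Field names and the retention helper are illustrative assumptions only.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class RegisterEntry:
    surname: str
    name: str
    dob_or_id: str                      # date of birth or ID number / age
    gender: str
    examination_date: date
    clinical_indication: str            # brief clinical indication of the examination
    examination_type: str
    number_of_exposures: int            # repeat exposures included
    fluoroscopy_time_s: Optional[float] = None
    dose_result: Optional[str] = None   # if available
    fluoroscopist: Optional[str] = None # person performing the fluoroscopy procedure

    def must_still_be_kept(self, today: date, retention_years: int = 5) -> bool:
        """Par 4.4: the record / register must be preserved for 5 years."""
        return today < self.examination_date + timedelta(days=365 * retention_years)
```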
Keeping of equipment records

4.5 An IER must be kept and contain all required information as stipulated in the "Diagnost QC" document on the DOH website (see Annexure A.1).

4.6 Radiation worker records (see par 6.3).

Submission of acceptance test results to DOH

4.7 Acceptance test results of pre-owned x-ray units must be submitted to DOH following installation (see par 3.6).
5. OPERATORS

5.1 Only the following persons, appropriately trained and registered with the HPCSA, may operate x-ray equipment and perform examinations within their appropriate scope of practice:
• Radiographer
• Supplementary Diagnostic Radiographer (SDR):
- may only work in a Government hospital or an institution operated or subsidised by government or a provincial authority or by the South African Chamber of Mines (refer to the Medical, Dental and Supplementary Health Services Professions Act, 1974 (Act No 56 of 1974), Annexure 7);
- supplementary diagnostic radiographers must be supervised, at least once a week, by a qualified registered radiographer
• Chiropractor
• Radiologist

Operators of mammography units

5.2 With effect from 1 July 2009, mammography examinations shall only be performed by qualified radiographers in possession of a recognised additional (postgraduate) qualification in mammography.
• Details of accredited courses can be obtained from the Professional Board for Radiography and Clinical Technology at the HPCSA.
6. RADIATION WORKERS

6.1 Dose limits for radiation workers and the public

Application                          Occupational                                Public
Effective dose                       20 mSv per annum, not more than             1 mSv per annum
                                     100 mSv over a period of 5 years
                                     (not more than 50 mSv in any one year)
Annual equivalent dose to:
  the lens of the eye                150 mSv                                     15 mSv
  skin                               500 mSv                                     50 mSv
  hands and feet                     500 mSv                                     --
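
For illustration only, the occupational effective-dose limits above (50 mSv in any single year and 100 mSv over any five consecutive years) can be checked automatically against a worker's dose history. The sketch below assumes the history is a simple list of annual totals in mSv; the record format is an assumption made for the example, not something prescribed by this Code.

```python
# Minimal sketch: check annual effective-dose totals (mSv) against the occupational
# limits in par 6.1 (50 mSv in any single year, 100 mSv over any five-year period).
# The list-of-annual-totals input format is an assumption for illustration.

def effective_dose_findings(annual_mSv):
    findings = []
    for year, dose in enumerate(annual_mSv, start=1):
        if dose > 50:
            findings.append(f"year {year}: {dose} mSv exceeds the 50 mSv single-year limit")
    # Check every rolling five-year window (shorter histories are checked as one window).
    for start in range(max(1, len(annual_mSv) - 4)):
        window = annual_mSv[start:start + 5]
        if sum(window) > 100:
            findings.append(
                f"years {start + 1}-{start + len(window)}: total {sum(window)} mSv "
                f"exceeds the 100 mSv five-year limit"
            )
    return findings

# Example: no single year exceeds 50 mSv, but the five-year total (110.5 mSv) is flagged.
print(effective_dose_findings([18.0, 22.5, 19.0, 30.0, 21.0]))
```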
6.2 A radiation worker must be older than 18 years. However, if a radiation worker in training is younger than 18 but older than 16, such worker must work under direct supervision.

6.3 The holder of the licence must keep records of the following for a period of 10 years for each radiation worker:
(a) the monthly dose reports furnished by the monitoring service provider(s) (SABS / NTP);
(b) the results of medical examinations, and
(c) RC008 forms (see par 6.5, 6.8 & 6.9).
Pregnant radiation workers

6.4 When pregnancy has been diagnosed, the woman shall not be allowed to work under working conditions where the maximum equivalent dose limit of 2 mSv to the woman's abdomen (lower trunk) for the remainder of the pregnancy could be exceeded. Pregnant radiographers shall continue to be monitored in the prescribed manner. Taking into account the specific working conditions, pregnant radiographers must be issued with a direct reading pocket alarm dosimeter, so as to prevent such women from being unwittingly exposed to radiation.
• The employer should provide continuing education on the risks to the foetus and the actual dose levels in the various working environments.
• Radiation workers, especially young females, must at all times, and not only when pregnant, be well versed in the use of ionising radiation.
(Refer to the Guideline document on the DOH website – Annexure A.2)
6.5 Appointment of radiation workers (Form RC008)
(a) A form RC008 (only parts A, C and D) must be completed for each radiation worker.
(b) The completed form RC008 must be kept in the licence holder's register.
Note: Licence holders are no longer required to submit form RC008 or to inform DOH of any change in the register as stipulated in Regulation III.4 (b) & (c).
6.6 Medical examinations of radiation workers
(a) Before any person is appointed as a radiation worker, he/she must undergo a medical examination.
(b) Medical examinations for radiation workers should follow general occupational medical practice for determining fitness for work.
(c) Each radiation worker will be required to undergo a medical examination in the event of the following:
• when a radiation occurrence / incident resulting in an abnormally high dose is suspected to have taken place or has been confirmed;
• when a medical practitioner deems it necessary;
• when such an examination is considered necessary either by the regulatory authority or by the holder of the licence, and
• when the radiation worker suspects that his/her health has been, or will be, adversely affected by occupational factors.
Note: Annual medical examinations are no longer required by DOH, but they remain the prerogative of the licence holder should he/she deem them necessary.
6.7 Monitoring of radiation workers
(a) The licence holder must ensure that all radiation workers are issued with a personal radiation monitoring device (PRMD).
• For the correct positioning of the PRMD, refer to the guideline document on the DOH website – see Annexure A.3.
(b) Application forms for a PRMD can be obtained directly from the following current monitoring service providers:

NTP Radioisotopes (Pty) Ltd            SABS Holdings (Pty) Ltd
Commercial Section                     Radiation Protection Services
012-3055129 (Tel)                      012-4286493 (Tel)
012-3055137 (Fax)                      012-4286685 (Fax)
ntp@necsa.co.za                        rps@sabs.co.za

(c) The service provider will forward the radiation dose records to the licence holder on a monthly basis or after a radiation occurrence. The dose records must be kept for 10 years.
(d) The licence holder must ensure that the service provider replaces PRMDs at regular intervals not exceeding 32 days.
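
For illustration only, the 32-day replacement interval in 6.7(d) is easy to audit from the PRMD issue dates. The sketch below assumes the licence holder keeps a chronological list of issue dates per worker; that record format is an assumption made for the example, not a requirement of this Code.

```python
# Minimal sketch: flag PRMD wear periods longer than the 32-day maximum in par 6.7(d).
# The input format (a chronological list of issue dates for one worker) is assumed.
from datetime import date

def overdue_prmd_periods(issue_dates, max_days=32):
    late = []
    for previous, current in zip(issue_dates, issue_dates[1:]):
        interval = (current - previous).days
        if interval > max_days:
            late.append((previous, current, interval))
    return late

# Example: the second wear period (35 days) would be flagged.
print(overdue_prmd_periods([date(2010, 1, 4), date(2010, 2, 1), date(2010, 3, 8)]))
```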
6.8 Termination of employment as a radiation worker
(a) When the employment of a radiation worker is terminated, the licence holder must ensure that the form RC008 (only parts A, B & D) is completed in duplicate.
(b) The form RC008 must be preserved in the licence holder's register and a copy given to the worker in question.
(c) The records must be preserved for a period of 10 years from the date of the last entry.
Note: Licence holders are no longer required to inform the DOH of any change in the register as stipulated in Regulation III.4 (c).

6.9 Appointment of radiation workers by a new employer
(a) The licence holder must obtain the form RC008, completed by the previous employer, from the radiation worker.
(b) The procedure outlined in paragraphs 6.5, 6.6 & 6.7 must then be followed.

6.10 Radiation occurrences (Form RC010)
(a) Details of any radiation occurrence or suspected radiation occurrence must immediately be reported to the Director: Radiation Control on form RC010.
7. RADIATION PROTECTION

Basic radiation protection principles are based on:
• The justification of the practice
No radiation examination shall be adopted unless its benefit outweighs the associated risk.
• The optimisation of protection
Radiation doses from medical exposures and those received by the public and occupationally exposed persons must be kept as low as reasonably achievable (ALARA), economic and social factors being taken into account.
• Limitation of individual dose and risk
All medical applications of ionising radiation must be managed in such a way that radiation doses to occupationally exposed persons and members of the public do not exceed the specified dose limits (see par 6.1).
7.1 Protection of patients
(a) All medical exposures should be subject to the principles of justification and optimisation.
(b) X-ray examinations shall not be performed unless there are valid clinical indications.
(c) Examinations on children shall require a higher justification, since such patients may be more sensitive to radiation.
(d) Obtain previous x-ray images to minimise the taking of repeat films.
(e) Screening programmes of asymptomatic persons shall not be instituted unless approved by DOH.
(f) Licence holders should be aware of the approximate patient radiation doses. Reference dose levels should be introduced for the diagnostic x-ray examinations performed in their facilities (see the sketch after this list).
(g) When appropriate, consider other modalities, such as MRI or ultrasound, which do not use ionising radiation.
(h) Examinations with potentially high patient doses, such as CT examinations, should only be carried out after a proper clinical justification by the radiologist.
(i) For each projection, select the highest kilovoltage (kV) and the fastest film-screen combination compatible with the image quality requirements of the examination.
(j) The primary beam shall be collimated at all times.
(k) Means to permanently transfer patient identification, prior to processing of the images, must be provided.
(l) Radiation examinations may only be requested by:
• a medical practitioner, or
• any appropriately trained and registered health professional.
(Refer to the Guideline document on the DOH website – Annexure A.4)
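
As flagged at item (f), one common way to use reference dose levels is to compare the facility's typical (for example, median) dose for a standard examination against the reference value and to review technique when it is exceeded. The sketch below is illustrative only; the example reference value and the data format are assumptions, not figures taken from this Code.

```python
# Minimal sketch: compare a facility's measured doses for a standard examination
# against a diagnostic reference level (DRL), as suggested by item (f) above.
# The example DRL value and the data format are illustrative assumptions.
from statistics import median

def exceeds_reference_level(doses_mGy, drl_mGy):
    """Facilities typically investigate their technique when the median dose
    for a standard examination exceeds the reference level."""
    return median(doses_mGy) > drl_mGy

chest_pa_entrance_doses = [0.25, 0.31, 0.28, 0.40, 0.22]              # mGy, illustrative values
print(exceeds_reference_level(chest_pa_entrance_doses, drl_mGy=0.4))  # False for this sample
```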
7.2 Protection of pregnant patients
(a) X-ray examinations must be justified and only essential views performed.
(b) Alternative imaging modalities, especially ultrasound for obstetric procedures,
shall be used where appropriate. An x-ray examination shall not be performed
to assess foetal development where ultrasound facilities are available.
(c) X-ray pelvimetry shall not be performed on a routine basis.
(d) For examinations where the primary beam unavoidably irradiates the foetus,
the methods of minimising dose shall be used as appropriate, and particular
attention shall be given to:
• minimising the number of views;
• strict beam collimation;
• using higher kVp settings;
• using fast image recording media;
• where practicable, using PA projections in preference to AP projections.
7.3 Protection of women of reproductive capacity
(a) X-ray examinations involving the exposure of the abdomen of women likely to
be pregnant shall be avoided unless there are strong clinical indications for the
examination.
(b) In order to minimise the possibility of unintentional exposure to the embryo /
foetus, notices must be posted at several places within the radiology facility.
The notices shall contain wording similar to or having the same meaning as
the following:
“If you might be pregnant notify the radiographer before your x-ray examination.”
7.4 Protection of paediatric patients
The longer life expectancy of children results in a greater potential for the manifestation of possible harmful effects of radiation.
In addition to the requirements in this Code for patients in general (see par 7.1), the following requirements for paediatric x-ray examinations shall be observed:
(a) For a given procedure, each view shall be examined, where practical, before deciding whether to take a further view;
(b) fluoroscopy shall in general be used only when radiography will not provide the information required, and
(c) there shall be strong justification for x-ray procedures involving high doses, such as CT (refer to the Guideline document on the DOH website – see Annexure A.5).
7.5 Protection of non-radiation personnel and members of the public
(a) Members of the public are not allowed to enter controlled areas unsupervised.
(b) Non-radiation personnel or members of the public shall not remain in the x-ray room during any x-ray procedure unless they are required to be in attendance.
(c) The occasional use of non-radiation personnel to give assistance, particularly in ward or theatre radiography, is acceptable but shall involve the full use of protective clothing, devices and techniques to minimise personnel dose. Care shall be taken to ensure that the same non-radiation personnel are not always involved. Women who are pregnant shall not be used in this role.

7.6 Protection of persons holding patients or image receptors
(a) No person shall hold a patient, x-ray film cassette or other imaging equipment, or the x-ray tube head, in position during exposures unless it is otherwise impossible to obtain a diagnostically useful image, and not merely as a matter of convenience.
(b) Holding of patients or x-ray film cassettes during exposure shall be done by persons accompanying the patient in preference to non-radiation personnel, and by non-radiation personnel in preference to radiation workers. Non-radiation personnel should be chosen on the basis of a roster, i.e. it shall not always be the same person who does the holding. No pregnant woman or young person (under the age of 18) shall do any holding.
(c) Any person holding a patient or film cassette in position during an x-ray examination shall wear a lead rubber apron and, wherever practicable, lead rubber gloves. No part of the holder's body shall be in the primary beam, even if covered with protective clothing.
8. PREMISES REQUIREMENTS

Refer to the guideline document on the DOH website:
• General Guidelines with regard to the design of x-ray rooms (see Annexure A.6)

9. RADIATION WARNING SIGNS, NOTICES AND LIGHTS AT ENTRANCES TO X-RAY ROOMS

9.1 Appropriate radiation warning signs and notices must be displayed, and the required warning lights must be in working order:
(a) Fixed units:
A radiation warning sign and warning notice, "X-RAYS - NO UNAUTHORISED ENTRY", must be displayed at all entrances leading to the rooms where x-ray units are installed.
(b) Mobile units:
A radiation warning sign and warning notice, "X-RAYS - NO UNAUTHORISED USE", must be displayed on the control panel of the x-ray units.
(c) Warning lights for CT and fluoroscopy units (excluding theatres):
A red warning light, which is only activated when the beam is on and when fluoroscopy is in progress, must be mounted in a conspicuous place outside the entrance to the x-ray rooms.
(Refer to the Guideline document on the DOH website – see Annexure A.7)
10. ANNEXURE A

1. Diagnost QC – Requirements for licence holders with respect to Quality Control tests for diagnostic x-ray imaging systems.
2. Management of pregnant radiographers and other staff members.
3. Personnel monitoring when a lead rubber apron is worn – medical and veterinary use of x-ray equipment.
4. Request for medical examinations.
5. FDA Public Health Notification: Reducing radiation risk from computed tomography for paediatric and small adult patients – 2 November 2001.
6. General guidelines with regard to the design of x-ray rooms.
7. Display and format of radiation warning signs at entrances to rooms containing x-ray units.
11. CONTACT DETAILS

Head Office, Bellville
Postal address: Private Bag X62, Bellville, 7535
Street address: c/o Kort & Vrede Str, 2nd Floor, Louwville Place, Bellville, 7530
Tel: 021-9486162  Fax: 021-9461589

Regional Office, Pretoria
Postal address: PO Box 977, Pretoria, 0001
Street address: 5th Floor, MBA Building, 527 Church Street, Pretoria, 0002
Tel: 012-3416322  Fax: 012-3411651

Regional Office, Durban
Postal address: PO Box 4301, Durban, 4000
Street address: 6th Floor, Room 604, 85 On Field Building, Field Street, Durban, 4001
Tel: 031-3072111  Fax: 031-3076099

WEB ADDRESS: http://www.doh.gov.za/department/radiation/01.html
12. REFERENCES

1. Australian Government, Australian Radiation Protection and Nuclear Safety Agency, 2008. Radiation Protection in Medical Applications of Ionizing Radiation. Publication No. 14. http://www.arpansa.gov.au
2. International Commission on Radiological Protection, 1991. 1990 Recommendations of the International Commission on Radiological Protection. ICRP Publication 60, Vol 21/1-3. Pergamon Press. http://www.icrp.org
3. International Commission on Radiological Protection, 2000. Pregnancy and Medical Radiation. ICRP Publication 84, Vol 30/1. Pergamon Press. http://www.icrp.org
4. New Zealand Ministry of Health, National Radiation Laboratory, 1994. Code of Safe Practice for the Use of X-rays in Medical Diagnosis. NRL C5. http://nrl.moh.govt.nz
5. South Africa, 1973. Hazardous Substances Act, 1973 (Act 15 of 1973). http://www.doh.gov.za/department/radiation/01.html → Act & Regulations
6. South Africa, 1973. Regulations Concerning the Control of Electronic Products. Regulation Gazette No 3991. http://www.doh.gov.za/department/radiation/01.html → Act & Regulations

International Journal for Quality in Health Care 2:213-218 (1990)
© 1990 International Society for Quality in Health Care

QUALITY ASSURANCE IN DIAGNOSTIC RADIOLOGY - FOR ITS OWN SAKE OR THAT OF THE PATIENT

E. T. Henshaw
Integrated Radiological Services Limited, RPS Centre, 42 Rodney Street, Liverpool L1 9AA, UK

X-Ray departments are expensive to equip and run. This paper illustrates how a quality assurance programme may help to limit the wastage of resources. The production of good quality medical X-ray images is extremely complex and can only be guaranteed by implementing some form of quality assurance programme. The exposure of patients to X-rays also entails a risk of radiation injury, and a quality assurance programme is necessary in order to limit this risk to a level as low as reasonably practicable. Because of this, in countries within the CEC, legislation now requires such a programme to be implemented. The aims of a QA programme are defined, and the implications arising from these aims are discussed. The role of international organisations in helping to achieve these aims is also discussed. The pitfalls of a QA programme in radiology are also identified, particularly: (1) the tendency to carry out a large programme and acquire a considerable amount of data so that the original aims are obscured; (2) the possibility of carrying out tests which are expensive to perform and are not cost effective, and (3) the failure to adapt constantly the content of the QA programme to the ever changing needs of the local department and the radiological community generally. The various components of a QA programme are presented together with illustrations of their possible impact on the standard of work of the X-ray department. These include: (1) resource management through film reject analysis; (2) patient dose measurements; (3) equipment inspection programme; (4) equipment maintenance programme; (5) training and education of staff. Indications are given of the potential savings derived from a QA programme together with approximate estimates of the cost of operating such a programme.

Keywords: Quality assurance, radiology, cost effectiveness, dosimetry, equipment maintenance, training, resource management

Accepted for publication June 16, 1990.

Source: http://intqhc.oxfordjournals.org/cgi/content/short/2/3-4/213
Omni Imaging: X Ray Department Quality Assurance Programs

Recent governmental regulations for required maintenance and repair of x-ray equipment have become much more stringent. For example, facilities in the State of Maryland are now required to have annual maintenance on their x-ray equipment performed by a service provider registered with the State of Maryland. Fines will be assessed on facilities failing to have this annual maintenance. In addition, the State of Maryland's new policy on periodic inspections is as follows: "If any violations are cited at a facility, even if they are corrected, the facility will receive a monetary penalty." Under prior Maryland regulations, a fine would not normally be imposed if required repairs were completed within a 10-day grace period. For more information on the new Maryland regulations, please read this article by Steve M. Deaver B.S., R.T.

Some service providers consider annual maintenance to consist of taking a few exposures, lubricating moving parts, checking connections, and other fairly simple tasks. Our annual maintenance includes a great deal more. When we are finished, you will receive a seven-page report of the work done, test results, recommendations and comments.

The following is a partial list of the work that Omni Imaging will perform, in addition to the routine work done by others:
Administrative/Facility Documentation

• Check for an adequate technique chart
• Check for technique factors indicated at the control panel
• Check log of human holders
• Verify personnel monitoring and records from past 3 months

Radiation Machine Data

• Collimator operating properly
• Beam Limitation
• Light Field accuracy
• SID (Source to Image Distance) indicator accuracy
Radiation Control Devices, Timers

• Timing Accuracy including Reproducibility
• Exposure Linearity
• kVp accuracy
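
The timer and generator checks listed above are usually quantified with two standard figures of merit: the coefficient of variation of repeated exposures (reproducibility) and the coefficient of linearity between adjacent mA or mAs stations. The sketch below computes both; the 0.05 and 0.10 acceptance limits shown are commonly used values that do not come from this page and should be confirmed against the applicable state regulation.

```python
# Sketch: reproducibility and linearity figures of merit for generator output
# readings (e.g. mR or mGy from a dosimeter). The 0.05 and 0.10 limits are
# commonly used acceptance values, shown here for illustration only.
from statistics import mean, stdev

def reproducibility_ok(readings, limit=0.05):
    """Coefficient of variation of repeated exposures at fixed technique factors."""
    return stdev(readings) / mean(readings) <= limit

def linearity_ok(output_per_mAs_a, output_per_mAs_b, limit=0.10):
    """Coefficient of linearity between two adjacent mA/mAs stations,
    using output normalised per mAs."""
    coefficient = abs(output_per_mAs_a - output_per_mAs_b) / (output_per_mAs_a + output_per_mAs_b)
    return coefficient <= limit

print(reproducibility_ok([4.02, 3.98, 4.05, 3.99]))  # True: spread well under 5 %
print(linearity_ok(0.051, 0.049))                    # True: adjacent stations agree closely
```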

Processing-Automatic Processors

• Processor transit time
• Processor temperatures
• Check for proper service log
• Fog test for the darkroom

These are just a few of the checks that we perform during our Quality Assurance check-up. At Omni Imaging we help our clients maintain their equipment to the high standards that your patients deserve. Our service engineers work in Maryland, Pennsylvania, Northern Virginia, Washington D.C., Delaware and parts of West Virginia.

Two important benefits of having Omni perform annual maintenance are:

1. Your equipment will be kept in top operating condition; this can provide the following results:
• Limit patient exposure
• Improve image quality
• Minimize premature wear on components
• Limit the number of retakes
• Lessen disruptions to your practice

2. You will be more prepared for your state-mandated inspection. In the event an inspector cites your facility, we will work with you and the inspector to resolve any issues.

Please contact us about an X-Ray Preventative Maintenance Program for your equipment. You can call us at 866-692-1033 or you can send an email to bwills@omniimaging.com

 Omni Imaging News and Updates

Sign up to receive breaking news


and other site updates!

Enter your em GO

Ryan Everhart

OmniImaging

OmniImaging 'iCRco Signs Distribution Agreement With GE Healthcare'


http://shar.es/mI32Y 6 days ago reply
OmniImaging @FUJIMEDUSA Looking forward to the New Fuji Flat Panel. Fuji
Quality Images in DR awesome!! 11 days ago reply

OmniImaging @MonahanChiro Digital X-Ray is a Great Investment for a Chiro


Practice. Our Hottest system is the http://www.omniimaging.c... 11 days ago
reply

Join the conversation

 Most Popular Pages


o Urgent Care Digital X Ray Systems
o Medical Digital X Ray
o Podiatry Digital X-Ray Systems
o Digital X-ray Equipment
o X-Ray Generating Equipment For Sale
o Fuji Computed Radiography System
o All Pro Scan X 12 CR X-Ray System
o Silver and X Ray Film Recycling
o Chiropractic and Physical Therapy Services
o Veterinary CR and DR Digital X-Ray Systems
o iDR Chiropractic X Ray System
o iCRco 3600 CR X Ray System
o Electrotherapy Lead Wire Replacement
o AGFA CR 30 X Digital X Ray System
o Fuji CR FCR XC-2
o Fuji FCR XL-2
o JPI DR4000 Digital Veterinary X Ray System
o 17×17 Omni Imaging DR Flat Panel Detector
 Contact Omni Imaging

* Email

* First Name

 Phone
 Business

  * = Required Field

  Submit

Recent Articles
o Caveat Emptor
o Free Fuji CR? Service 4 Life? What are you really getting?
o Digital Podiatry CR Machine for under $25,000
o Ryan Parker
o AGFA HealthCare DX-S Neonatal Imaging CR
 Radiation and Imaging News
o Breast Cancer: Debate Rages over Screening June 30, 2010

Advocates, critics of routine mammography cite studies to support their positions. […]

o Managing High Maintenance Employees June 30, 2010

[…]

o Winning the Ratings Game June 30, 2010

A good online performance score is vital.Learn how to survey patients so you're aware of
problems before they hit the Internet. […]

 Veterinary Digital X-Ray News


o Notes from VENDOR X - Hey Cheapo! June 30, 2010

Ever wonder what the vendors say about the vets when they are shattered at the hotel bar
or at the lunch counter on the show floor. Well, so did we until we got some insight into
the world of the vendors. it is always easy to bash vendors for trying to get us to part with
our hard earned cash. The reality is that most vendors and sales people are trying to d
[…]

o The 2010 digital radiography fire sale June 30, 2010

It is 2009 all over again. Last year,  there was a race to the bottom in digital radiography
sales as pressure was put on Eklin medical systems that eventually resulted in a purchase
of Eklin by Sound Technologies. One year later, it is fire-sale time all over again. This
time, the causes and consequences may be farther reaching than the relatively minor ind
[…]

 Top Products
Fuji Computed Radiography (CR) iDR Direct Digital Chiropractic X-Ray Systems (DR) Digital
X-Ray Package Summit Digital X Ray Machine Podiatry Digital X-Ray Packages


 Imaging Ecomonics
o Clinical Trial Image Management Made Efficient June 30, 2010
o Carestream Wins Frost & Sullivan Innovation Award June 30, 2010
o Rural Clinic First to Purchase Toshiba's Aplio Ultrasound June 30, 2010
o California Hospital Speeds Up IMRT with RapidArc June 30, 2010
o NIH Awards GE $4 Million to Develop Nerve Imaging Agent June 30, 2010

 Omni Imaging Blog and News


o Analog Radiography News
o Buyer Beware
o Chiropractic
o Chiropractic Table Sales
o Chiropractic Table Service
o CR vs. DR
o Digital Radiology
o Meet our Team
o Omni News
o State and Federal Quality Control information
o Veterinary
 X-Ray Industry Keywords

Affordable Digital X-Ray Animal Medical Treatment Chiropractic CR Systems


Chiropractic Digital X Ray Chiropractic Table Service Computed
Radiography CR Digital X Ray Systems CR Reader digital
imaging Digital Radiography Digital Radiology Digital X Ray
Direct Digital X Ray DR Digital X Ray Systems Electro Shock
Electrodes Electrotherapy electrodes Electrotherapy leads Electrotherapy Repair EMS Electrodes EMS Leads Fuji

CR Gel Electrodes Leadwires Muscle Stimulation Electrodes muscle stimulation machines Nervous System Imaging
Oncologic Imaging Orthopedic Imaging PACS Physical Therapy Equipment Repair Podiatry Digital X Ray
TENS Electrodes Vascular Imaging veterinarian x ray Veterinary CR Systems Veterinary Digital X Ray
Veterinary DR Digital X Ray Systems Veterinary Imaging Veterinary MRI
Veterinary Specialists Viewing software X-Ray Film X-Ray Film Processors X Ray laws X Ray
systems
Omni Imaging 3916 Vero Road., Suite D Baltimore, MD 21227 | phone: (866)692-1033 (443)524-1033
fax: (443)524-1034 · Copyright © 2010 · All Rights Reserved

Source: http://www.omniimaging.com/?page_id=630

Diagnostic X-Ray Imaging Quality Assurance: An Overview


PART II
Hospital Diagnostic Imaging
Quality Assurance Program Review
Survey Worksheets
Facility:
Address:
Radiology Manager:
QC Technologist: QA Co-ordinator:
Reviewer: Date:
Abbreviations:
(D) Daily (W) Weekly (SM) Semi-Monthly
(M) Monthly (Q) Quarterly (SA) Semi-Annually (A) Annually (N) Never
(H) High (M) Medium (L) Low (N) None
Contents

1. Hospital and Radiology Department Quality Assurance Committees
1.1 Hospital QA Committee
1.2 Radiology Department QA Committee
2. Quality Assurance Training
3. Equipment Specification Writing
4. Quality Control Test Equipment List
5. Equipment Acceptance Testing
6. Quality Control Testing
6.1 X-Ray Equipment QC
6.2 Photographic Equipment QC
7. Equipment Performance Records and Record Keeping
8. Equipment Appraisal and Replacement Policy
9. Standardization of Exposure
9.1 Radiographic Positioning
9.2 Loading Factors
9.3 Entrance-Skin-Exposure
10. Acceptance Criteria for Diagnostic Radiograms
11. Reject-Repeat Analysis Program
12. Summary of Quality Assurance and Quality Control Document Assessment
1. Hospital and Radiology Department QA Committees

1.1. Hospital Quality Assurance Committee (QAC)
1. Does the hospital have a QAC? ............................................................................................ Y/N
2. Does the hospital have a documented QA program? ............................................................ Y/N
3. Is a copy of the hospital organization available (showing level of
responsibility and reporting order)? ......................................................................................... Y/N
Comments:

1.2. Radiology Department Quality Assurance Committee
1. Does the radiology department have a QAC? ....................................................................... Y/N
2. Does the radiology department QAC have an overall strategy with clearly defined
work plans? ............................................................................................................................ Y/N
3. Does the radiology department have a documented QA program? ........................................ Y/N
If yes, is a copy of the QA manual available? ........................................................................... Y/N
4. Radiology QAC members:
Radiology administrator:
Medical physicist:
Chief x-ray technologist:
Quality control technologist:
Hospital service engineer:
Private consultants:
Others:
Comments:

5. Radiology department QA program review and reporting structure:
Who reviews the radiology QA program?
Review schedule: ................................................................................. (M) (Q) (SA) (A) (N)
Is a summary of the radiology QAC audit plan available? ................................................ Y/N
Describe the radiology QAC program reporting structure:
6. Is a copy of the radiology department's organization chart available (showing
the level of responsibility and reporting order)? ....................................................................... Y/N
7. Does the radiology QAC serve as an advisory committee to give direction,
training and/or advice on QA and QC protocols to other hospitals? .................. (M) (Q) (SA) (A) (N)
If yes, which hospitals?
8. Is a member of the department's QAC on the hospital QAC? ................................................ Y/N
Comments:
2. Quality Assurance Training
1. Is QA training available? ......................................................................................................... Y/N
2. Type of QA training:
In-house:
Other hospitals:
Outside agency:
Special courses:
Refresher courses:
Other:
3. What priority level is placed on QA training? .......................................................... (H) (M) (L) (N)
Comments:
3. Equipment Specification Writing
1. Is the QAC involved in equipment specification writing? ...................................................... Y/N
2. Does QC technologist participate in equipment specification writing? .................................. Y/N
3. Who does equipment specification writing? (QAC?, private consultants?, etc.)
4. Is a copy of documented equipment specification writing guidelines available? ..................... Y/N
5. Do equipment specifications include acceptance testing criteria? ........................................... Y/N
6. Is a copy of the equipment specification document sent out for tender
for the last x-ray unit purchased by the hospital available? ...................................................... Y/N
Comments:
4. Quality Control Test Equipment List
1. Is QC test equipment available? ............................................................................................ Y/N
2. List QC test equipment used: (including manufacturer, model and calibration date):
Processing test equipment: Manufacturer Model Calibration Date
sensitometer:
densitometer:
thermometer:
stop watch:
graduated transparent beaker:
darkroom fog test tool:
Radiographic test equipment: Manufacturer Model Calibration Date
exposure and exposure rate meter:
full range of ionization chambers:
electronic irradiation time measuring device:
electronic x-ray tube voltage measuring device:
collimator and beam alignment tool:
aluminum filters:
film screen contact wire mesh:
star focal spot patterns:
Tomography phantoms: Manufacturer Model Calibration Date
tomogram scale:
tomogram aperture plate:
full range body part phantom:
uniform density phantom:
resolution phantom:
step wedge:
Image Intensifier test tools: Manufacturer Model Calibration Date
full range of lead (resolution) test patterns:
low contrast resolution test tool:
high contrast resolution test tool:
Video test equipment: Manufacturer Model Calibration Date
oscilloscope:
scope camera:
video waveform monitor:
video signal generator:
photometer:
General purpose test equipment: Manufacturer Model Calibration Date
chart recorder:
other:
5. Equipment Acceptance Testing
1. Does the QAC have an equipment acceptance testing policy? ................................................ Y/N
2. Who does the equipment acceptance testing (manufacturer, in-house,
private consultants)?:
3. Equipment acceptance test results recorded? ......................................................................... Y/N
4. Equipment acceptance test results kept for QC base data? ..................................................... Y/N
5. Is a copy of equipment acceptance testing results available? ................................................. Y/N
Comments:
6. Quality Control Testing
The following are general questions regarding the QC testing program and the QC technologist's responsibilities. Further information about x-ray imaging equipment QC testing, i.e., specific tests, test devices and frequency of testing, is collected based on information from "Radiographic Quality Control, Minimum Standards" from the CAMRT, Appendix A of NCRP Report No. 99 and the "Diagnostic X-ray Equipment and Facility Survey" of Health Canada publication 94-EHD-184. Questions are listed in a separate survey form.
6.1 X-Ray Equipment Quality Control
1. QC responsibilities (persons in charge and reporting order):
Radiology department QC program:
QC testing:
QC record keeping:
QC data evaluation:
Equipment control parameter setting:
Equipment repair and services decisions:
2. Does the x-ray department have a documented equipment QC test protocol manual? ............... Y/N
If yes, is a copy of the equipment QC test protocol manual available? ...................................... Y/N
Does the manual include QC test protocol for the following equipment? :
General radiographic equipment? ................................................................................ Y/N
Fluoroscopic equipment? ............................................................................................ Y/N
Special procedures equipment? ................................................................................... Y/N
Mammographic equipment? ........................................................................................ Y/N
CT equipment? ........................................................................................................... Y/N
Mobile fluoroscopic equipment? ................................................................................. Y/N
Dedicated procedure equipment? ................................................................................ Y/N
Film processors? ........................................................................................................ Y/N
Other? :
3. Is the QC testing done by a private consulting agent? ............................................................ Y/N
If yes, who?
Reporting protocol:
Consultant objectives:
Radiation safety survey of equipment?
Equipment specification writing?
Acceptance testing?
QC testing of equipment?
Advisor on QA program?
Frequency of consultant contract: ............................................................... (M) (SA) (A) (N)
Is a copy of the consultant contract objectives available? ............................................... Y/N
4. QC technologist available? : ........................................ (Full-time), (Part-time), (Occasional)
To whom does the QC technologist report? :
5. Does the QC technologist have a specific QC test schedule? .................................................. Y/N
If yes, how strictly is it followed?
QC testing schedule priority level: ......................................................................... (H) (M) (L) (N)
Is a copy of the equipment QC test schedule available? ............................................................ Y/N
QC test schedule (time spent): h/d; d/w; w/m
Consequences of not meeting the QC schedule:
6. QC technologist responsibility
x-ray rooms darkrooms processors
radiographic tubes fluoroscopic tubes mobile units
mammography units CT units other
7. How much time is spent testing equipment (number of tubes, hours/unit)?
General radiography?
Fluoroscopy?
Special procedure equipment?
Mammography?
CT?
General film processors?
Dedicated film processors?
Other:
8. Does the QC technologist have adequate time to carry out the QC tests required? ................... Y/N
9. Does the QC technologist have adequate time to evaluate results of QC tests performed? ........ Y/N
10. Does the QC technologist have adequate time to update and maintain QC records? ................ Y/N
11. Are samples of QC test records (blanks) available? ................................................................ Y/N
12. QC test reporting:
To whom are QC test results reported? ..................................................................................... Y/N
What is the reporting structure?
Priority of QC reporting: ........................................................................................ (H) (M) (L) (N)
Consequences of late reporting:
13. QC testing review activity:
Is the equipment QC test program audited? ................................................. (W) (M) (Q) (SA) (A) (N)
Review method of audit:
Is a copy of the QC audit plan available? ................................................................................. Y/N
Consequences of bad reviews:
14. Is QC testing training available for the QC technologist? ....................................................... Y/N
If yes, where? when?
15. Is the QC technologist shared with other hospitals? ............................................................... Y/N
If yes, list hospital and days per week:
16. Is the hospital QC performance compared with other large city hospitals? ............................. Y/N
If yes, who and frequency: Hospital (M) (Q) (SA) (A) (N)
Comments:
6.2. Photographic Equipment Quality Control
The following are general questions regarding the photographic QC testing program and the QC technologist's responsibilities. Further information about photographic equipment QC testing, i.e., specific tests, test devices and frequency of testing, is collected based on information from "Radiographic Quality Control, Minimum Standards" from the CAMRT, Appendix A of NCRP Report No. 99 and the "Diagnostic X-ray Equipment and Facility Survey" of Health Canada publication 94-EHD-184. Questions are listed in a separate survey form.
1. Number of automatic processors:
2. Number of dedicated processors:
3. Processor sensitometric evaluation performed? ................................................... (D) (W) (SM) (N)
4. Is the developer temperature verified using a thermometer? ................................ (D) (W) (SM) (A)
5. Replenishment rates checked? ............................................................................ (D) (W) (SM) (N)
6. Transport time checked? .................................................................................... (D) (W) (SM) (N)
7. Is the manufacturer's time/temperature chart followed? ........................................................ Y/N
8. Are film processors cleaned regularly? ........................................................ (D) (W) (SM) (M) (N)
9. Preventive maintenance program for the processor? .............................................................. Y/N
10. Are the cassette screens cleaned regularly? ................................... (D) (W) (SM) (M) (SA) (A) (N)
11. Are screen contact tests done? ........................................................... (W) (SM) (M) (SA) (A) (N)
12. Safelight integrity verified? ......................................................................... (W) (M) (SA) (A) (N)
13. Darkroom fog test? ...................................................................................... (W) (M) (SA) (A) (N)
Comments:
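
Daily sensitometric results (question 3 above) are normally judged against a baseline established when the processor was set up. The sketch below flags readings outside typical action limits (±0.15 OD for the speed and contrast indices, +0.03 OD for base-plus-fog); both the limits and the record format are assumptions for illustration and should follow the facility's own QC manual.

```python
# Sketch: compare a day's sensitometric strip readings with the processor baseline.
# The action limits used (+/-0.15 OD for speed and contrast indices, +0.03 OD for
# base-plus-fog) are typical published values, used here only for illustration.

def sensitometry_findings(baseline, today):
    findings = []
    if abs(today["mid_density"] - baseline["mid_density"]) > 0.15:
        findings.append("speed index outside the +/-0.15 OD action limit")
    if abs(today["density_difference"] - baseline["density_difference"]) > 0.15:
        findings.append("contrast index outside the +/-0.15 OD action limit")
    if today["base_plus_fog"] - baseline["base_plus_fog"] > 0.03:
        findings.append("base-plus-fog more than 0.03 OD above baseline")
    return findings

baseline = {"mid_density": 1.20, "density_difference": 1.80, "base_plus_fog": 0.18}
today = {"mid_density": 1.38, "density_difference": 1.79, "base_plus_fog": 0.19}
print(sensitometry_findings(baseline, today))  # speed index flagged (0.18 OD drift)
```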
1. Does the radiology department have a silver recovery program? ............................................. Y/N
2. Who is in charge of the silver recovery program?
3. Is silver recovery done for all automatic processors? ............................................................. Y/N
4. Does the hospital have a policy on effluent disposal? ............................................................. Y/N
5. Are the developer and fixer treated before going to effluent? ................................................. Y/N
6. What happens to old or reject-repeat radiograms?
Comments:
7. Equipment Performance Records and Record Keeping
1. Are equipment performance records kept? ............................................................................. Y/N
2. Do the equipment performance records include acceptance testing results? ............................ Y/N
3. Are the initial and current radiation safety survey reports available? ...................................... Y/N
4. Are the current year QC tests and results recorded? ................................................................ Y/N
5. Are the past year QC tests and results recorded? .................................................................... Y/N
6. Are the equipment repairs and servicing recorded (frequency and costs)? ............................... Y/N
7. Is the equipment down time recorded? .................................................................................... Y/N
8. Is a copy of the equipment performance record available? ..................................................... Y/N
Comments:
8. Equipment Appraisal and Replacement Policy
1. Does the QAC have an equipment appraisal and replacement policy? ..................................... Y/N
2. Planned budget allocations for future purchases? .................................................................... Y/N
3. Describe the equipment appraisal and replacement policy budget strategy:
4. Is a copy of the equipment appraisal and replacement policy available? ................................. Y/N
9. Standardization of Exposure

9.1. Radiographic Positioning
1. Is a standard radiographic positioning manual available in each room? .................................. Y/N
If no, is it easily accessible? .................................................................................................... Y/N
Is a copy (sample) of radiographic positioning manual available? ............................................ Y/N
Comments:
2. Current condition of the radiographic positioning manual (indicate on a scale of 1 to 5):
1 2 3 4 5
Poor - - - Good
Disorganized - - - Tidy
Ambiguous - - - Clear
Vague - - - Precise
Incomplete - - - Comprehensive
Neglected - - - Updated
Comments:
3. Does the radiographic positioning manual provide instructions about:
body part to be x-rayed? .......................................................................................... Y/N
number of projections required? ............................................................................... Y/N
size of image receptor to use? .................................................................................. Y/N
part rotation? .......................................................................................................... Y/N
tube angle? .............................................................................................................. Y/N
central ray location? ................................................................................................ Y/N
source-to-image receptor distance? .......................................................................... Y/N
detail of structures to be shown? .............................................................................. Y/N
general instructions for positioning? ......................................................................... Y/N
illustrations? ........................................................................................................... Y/N
Comments:
4. Radiographic positioning manual update:
Is the radiographic positioning manual updated? ...................................................................... Y/N
Who authorizes changes?
Are changes reported through QAC reporting channels? ............................................................ Y/N
Are changes unreported and adopted? ...................................................................................... Y/N
Comments:
9.2. Loading Factors
1. Is there a loading factors chart (or manual) posted in each x-ray room? ................................... Y/N
Is a copy (sample) of loading factors manual available?........................................................... Y/N
2. Current condition of Loading Factor charts (indicate on a scale of 1 to 5):
1 2 3 4 5
Poor - - - Good
Disorganized - - - Tidy
Ambiguous - - - Clear
Careless - - - Precise
Incomplete - - - Comprehensive
Neglected - - - Updated
Comments:
3. Does the loading factors chart contain the following information:
patient thickness?............................................................................................... Y/N
child/adult technique?......................................................................................... Y/N
optimum kVp?.................................................................................................... Y/N
optimum time, mA, mAs or automatic exposure control? ..................................... Y/N
focal spot size? ................................................................................................... Y/N
grid/no grid? ....................................................................................................... Y/N
film-screen combination? .................................................................................... Y/N
Comments:
4. Is the loading factors chart strictly followed? .......................................................................... Y/N
If not, why?
5. Loading factors chart changes:
Is the loading factors chart updated or changed to compensate for equipment
or processor problems? ........................................................................................................... Y/N
Who sets the factors in the loading factors chart?
Who authorizes changes to the loading factors chart?
Are the loading factors chart changes reported to QC technologist?........................................... Y/N
Are changes unreported and adopted?...................................................................................... Y/N
Comments:
9.3. Entrance-Skin-Exposure (ESE)
1. Are the ESEs measured for:
each diagnostic procedure?........................................................................................ Y/N
each x-ray room? ...................................................................................................... Y/N
each fluoroscopic procedure?..................................................................................... Y/N
each fluoroscopic room?............................................................................................ Y/N
List the ESE procedures measured:
2. Is the ESE schedule reviewed? ............................................................................... (M) (SA) (A) (N)
3. Are the ESEs recorded in the QC log book? ............................................................................ Y/N
If yes, 1) is a copy (sample) of the radiographic ESE record for each room available?............. Y/N
2) is a copy (sample) of the fluoroscopic ESE record for each room available? ............. Y/N
4. Is there an ESE comparison with other major city hospitals?
If yes, who? How often? (M) (Q) (SA) (A) (N)
Comments:
10. Acceptance Criteria for Diagnostic Radiograms
1. Have acceptance criteria for diagnostic radiograms been established? ...................................... Y/N
2. Do the acceptance criteria cover the following points:
1) the visibility of predetermined landmarks clearly defined for each view? ............................. Y/N
2) an acceptable density range measured at predetermined anatomical landmarks? .................. Y/N
3) three clearly defined limits of acceptability, where:
a) the x-ray technologist forwards the radiogram to the radiologist for reporting? ............ Y/N
b) or the x-ray technologist consults with the radiologist? .............................................. Y/N
c) or the radiogram is rejected and a repeat is done? ...................................................... Y/N
3. Are the acceptance criteria followed by technologists? ............................................................ Y/N
4. Are the acceptance criteria reviewed? ..................................................................................... Y/N
Frequency of review: .................................................................................... (M) (Q) (SA) (A) (N)
5. Are acceptance criteria compared with those of other major city hospitals? .............................. Y/N
6. If yes, who? How often? ............................................................................. (M) (Q) (SA) (A) (N)
7. If QA criteria have not been established, against which standard are the radiograms checked
when the radiologist is not available (e.g., evenings or weekends)?
How does that affect the repeat rate when the radiologist does become available?
8. Is a copy of the acceptance criteria available? ......................................................................... Y/N
Comments:
11. Reject-Repeat Analysis Program (RRAP)
1. Does the radiology department have a comprehensive RRAP? .................................................. Y/N
2. Is a copy of the documented RRAP parameters available? ........................................................ Y/N
3. Who sets the RRAP parameters?
4. Reject-Repeat Analysis parameters:
patient positioning              patient motion
radiograms too dark              radiograms too light
artifacts                        tomographic scout radiograms
fog                              static
medical reasons                  processor malfunction
mechanical                       quality control films
clear film                       black film
good radiograms                  other
Total waste ___    Total rejects ___    Total repeats ___
Comments:
5. Do the RRAP results show how many rejects or repeats were acceptable and
should not have been repeated? ............................................................................................... Y/N
6. Are the RRAP results posted?................................................................................................... Y/N
7. Is the repeat percentage analysis evaluated:
per technologist? per room? ..................................................... Y/N
8. What is the current reject-repeat rate?
9. What is the reject-repeat rate for the last six months? :
10. What corrective action is used to reduce the reject-repeat rate?
11. The reject-repeat rate is based on what workload?
12. What is the radiology department's total workload?
13. Is the RRAP compared with other hospitals? .......................................................................... Y/N
If yes, who? How often?: .............................................................................. (M) (Q) (SA) (A) (N)
Note: The RRAP should look at three separate categories:
1) Total waste films: all films in the scrap bin? .............................................................. Y/N
2) Total rejects: all films except clear and QC films? ...................................................... Y/N
3) Total repeats: only those where an additional radiogram was made? ........................... Y/N
The RRAP should not include radiograms from special procedures areas (cardiovascular,
neurological), copy films, or subtraction films.
Comments:
12. QA/QC Document Assessment (Summary)
The following (current) documents should be collected as examples for assessing the Radiology
Department's QA/QC program.
Section Reference Documents
1.1.3. Hospital organization chart (with reporting order)
1.2.3. Radiology department QA manual
1.2.5. Summary of radiology department’s QAC audit plan
1.2.6. Radiology department’s organization chart (with reporting order)
3.4. Equipment specification writing guidelines
3.6. Equipment specification document (e.g., last purchase)
4.2. List of all QC test equipment
5.5. Equipment acceptance test results
6.1.2. Equipment QC test protocol manual
6.1.3. QC consultant contract objectives
6.1.5. Equipment QC test schedule
6.1.11. Sample QC test records (blanks)
6.1.13. QC audit plan
7.8. Equipment performance record
8.4. Equipment appraisal and replacement policy
9.1.1. Radiographic positioning manual (sample)
9.2.1. Loading factors chart (sample)
9.3.3. ESE record (sample list of ESEs recorded, with dates, in the QC log for radiographic and
fluoroscopic examinations for each room)
10.8. Acceptance criteria for diagnostic radiograms
11.2. Reject-Repeat Analysis Program parameters
http://www.hc-sc.gc.ca/ewh-semt/alt_formats/hecs-sesc/pdf/pubs/radiation/qa-x_ray_image-aq/qa-x_ray_image-aq-eng.pdf

NRL Report 1995/1
J L Poletti
GUIDELINES FOR QUALITY ASSURANCE IN
RADIATION PROTECTION
FOR DIAGNOSTIC X-RAY FACILITIES:
LARGE X-RAY FACILITIES
NATIONAL RADIATION LABORATORY
MINISTRY OF HEALTH
CHRISTCHURCH NEW ZEALAND
CONTENTS
Page
ABSTRACT
1 INTRODUCTION 1
2 IMPORTANT CONCEPTS FOR QA PROGRAMMES 3
3 SPECIFIC REQUIREMENTS FOR LARGE FACILITIES 5
3.1 Automatic film processors 6
3.2 X-ray generators 7
3.3 X-ray tubes 8
3.4 Automatic exposure controls (AEC) 9
3.5 Light beam diaphragms (LBD) 9
3.6 X-ray cassettes 9
3.7 X-ray image intensifier systems 9
3.7.1 Image intensifiers 11
3.7.2 Fluoroscopic television chains 12
3.8 Fast film changers 12
3.9 Cine systems and small format cameras 12
3.10 Digital subtraction imaging (DSI) systems 13
3.11 Computed tomography scanners 14
3.12 Mammography machines 15
3.13 Tomography machines 15
3.14 Mobile radiographic equipment 15
3.15 Mobile image intensifier equipment 16
3.16 Grids 16
3.17 Protective equipment 16
3.18 Darkrooms 16
3.19 Viewboxes 16
3.20 Technique charts 17
3.21 Dose measurements 17
3.22 Approval of the QA programme 17
4 AN OUTLINE QA PROGRAMME FOR LARGE FACILITIES 17
BIBLIOGRAPHY 21
ABSTRACT
The Code of safe practice for the use of x-rays in medical diagnosis (NRL C5) requires
that each x-ray facility has an appropriate quality assurance (QA) programme in radiation
protection. The objective of the quality assurance programme is to ensure accurate
diagnosis and to ensure that doses are kept as low as reasonably achievable. In addition
the quality assurance programme should ensure compliance with NRL C5 at all times.
This requires an in-house system of regular checks and procedures.
These guidelines are intended to assist x-ray facilities to comply with the NRL C5
requirement, by outlining the features considered to be appropriate for a QA programme.
The concepts involved in QA programmes are described, and details are given of the types
of tests to be performed. An indication of QA equipment required is given, and suggested
test frequencies are outlined.
1 INTRODUCTION
For the purposes of these guidelines, a large x-ray facility is defined as having at least one of
the following (in addition to general radiography rooms):
- More than one fluoroscopy room
- More than one film processor
- A mammography machine
- A digital fluoroscopy system
- A CT scanner
The use of ionizing radiation in New Zealand is controlled by the Radiation Protection Act
(1965). Licences under this Act may be granted for a number of purposes, including medical
diagnosis. All licences for the use of x-rays for medical diagnosis include a condition that the
requirements of the Code of safe practice for the use of x-rays in medical diagnosis (NRL
C5) are met. Among the requirements of NRL C5 is a quality assurance programme in
radiation protection.
The objective of the quality assurance programme is to ensure accurate diagnosis and to
ensure that doses are kept as low as reasonably achievable. In addition the quality assurance
programme should ensure compliance with NRL C5 at all times. This requires an in-house
system of regular checks and procedures as detailed in these guidelines.
In addition, and completely independent from the quality assurance programme, each facility
is required to have a complete radiation protection survey performed at least once every four
years. For those facilities with image intensifier systems, a CT scanner or mammography
machine, these must be done every two years. The radiation protection survey is intended to
focus on radiation safety and checks for compliance with the appropriate requirements of
NRL C5. As part of this, the radiation protection survey acts as an external independent audit
of the quality assurance programme. The tests and measurements made during a radiation
protection survey cannot be considered to be part of the quality assurance programme. The
radiation protection survey is currently performed free of charge to the facility by NRL staff.
A qualified health physicist is also permitted to do radiation protection surveys, provided that
the protocol and equipment used are acceptable to NRL.
A comprehensive radiation protection quality assurance programme requiring some test
equipment is appropriate for large facilities (as defined above). The general requirements for
a quality assurance programme in radiation protection, as given in NRL C5 and of relevance
to large x-ray facilities, are summarised as follows: (Note that should and shall have
specified meanings within NRL C5, but not elsewhere in these guidelines.)
1 The principal licensee for any facility that uses x-rays for medical diagnosis shall ensure
that a suitable programme of quality assurance (with respect to radiation protection), is
instituted and maintained.
2 The programme shall ensure, as a primary goal, accurate and timely diagnosis. As
secondary goals the programme shall ensure minimisation of radiation exposure and risk, and
of discomfort and cost to patient and community. These secondary goals shall always be
balanced against the primary goal.
3 The programme shall comprise such routine checks and procedures as are required to give
reasonable confidence in the continuing compliance with this Code of Practice. The
programme shall be approved by a qualified health physicist, to ensure that the quality
control procedures are sufficient to ensure compliance with this Code. The programme shall
include quality control of x-ray film processing facilities. Note: A programme is not to be
confused with a radiation protection survey.
4 There shall be a well-defined responsibility and reporting structure, appropriate to the size
and scope of the facility. Each staff member shall routinely review the results of checks for
which they are responsible and report summary results to their superior. Any anomalous
check shall be reported immediately. Each staff member shall be responsible for the
maintenance of the programme by any personnel under his/her control.
5 Procedures should be standardised and set down in protocols or local rules (a quality
assurance manual) wherever possible.
6 All equipment shall be checked at suitable regular intervals to ensure it is operating within
suitable tolerances of accuracy and consistency. The tests performed and their frequency
shall be approved by a qualified health physicist. All measurements and maintenance shall
be recorded in an equipment log. As well as routine tests any faults or breakdowns shall be
logged and reported to superiors.
7 Acceptance tests shall be performed on all new equipment to
(a) ensure that it meets the manufacturer's specifications;
(b) ensure that it complies with this Code;
(c) establish baseline data for subsequent quality assurance.
8 Control charts shall be established for all parameters measured. Control limits shall be
established for all parameters. If a measured value of any parameter exceeds a control limit,
action shall be taken to correct the parameter.
9 A retake analysis shall be performed at regular intervals to monitor the effectiveness of the
programme.
10 The frequency with which a particular parameter is tested should be determined by both
the likelihood and the consequences of an error beyond the acceptable tolerances.
11 The programme should conform to the procedures and tolerances given in NCRP report
99 (National Council on Radiation Protection and Measurements, 1988) 2 or Assurance of
quality in the diagnostic x-ray department 3.
12 The programme for CT facilities should include the recommendations given in IEC
1223-2-6.
It should be noted that a properly implemented QA programme will result in significant
benefits. These will include reduced costs due to reduced repeat rates, increased accuracy of
diagnosis, reduced equipment down-time and increased morale and job satisfaction for staff
5,6,7,8.
2 IMPORTANT CONCEPTS FOR QA PROGRAMMES
Acceptance testing
Whenever a new piece of equipment is installed, acceptance tests should be performed. This
may take a few days for complex equipment 9. The purposes of the acceptance tests are three-
fold.
- First it is important to check that the equipment meets the specifications set out in the
purchase contract with the supplier. (For this reason it is important that purchase contracts
clearly state the requirements for the equipment.)
- Second, the acceptance tests establish baseline values for the parameters that are to be
monitored during the QA programme.
- Third, the acceptance tests will show whether the machine complies with the requirements
of NRL C5, at installation.
Control charts
Once baseline values for the QA parameters have been established a system has to be set up
to ensure that these parameters are maintained to within acceptable tolerances. Control charts
are an essential part of the system7. A control chart is a plot of the measured parameter with
time. A typical control chart is shown in figure 1. The control chart has three horizontal lines,
giving the desired value of the parameter and the allowable limits. The desired value is
determined from the acceptance tests, or from tests made at the start of the QA programme
(first ensuring that the equipment is performing correctly). Allowable limits may come from
many sources, such as published protocols, regulatory requirements, or the effects on other
parameters that may be affected.
Figure 1. An example of a control chart
(Plot of the measured value against date, showing the control value and the lower and upper limit lines.)
Each time a measurement is made the result is plotted on the control chart. Whenever a
parameter goes outside the control limit, the measurement should be repeated. If no mistake
has been made and the parameter is still out of control, then immediate action must be taken
to correct the parameter. If this is not done, then the entire QA programme is a waste of time.
It is sometimes possible to observe trends in the data that suggest that a parameter will
become out of control in the near future (as in the example above). It is advisable to take
corrective action at this stage, rather than to wait until the parameter is out of control.
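By way of illustration only, the following Python sketch shows the logic of checking a control chart: each new measurement is compared against the control limits, and a simple trend check warns when recent points drift steadily in one direction. The parameter names and numerical values are hypothetical and are not taken from NRL C5.

```python
# Illustrative sketch only; the control values and limits below are hypothetical examples,
# not tolerances prescribed by NRL C5 or these guidelines.

def check_control_chart(values, lower_limit, upper_limit, trend_window=5):
    """Return warnings for a series of QA measurements plotted in date order."""
    warnings = []
    for i, value in enumerate(values):
        if value < lower_limit or value > upper_limit:
            warnings.append(f"point {i}: {value} is outside the control limits -- repeat the "
                            "measurement and take corrective action if confirmed")
    # Trend check: the most recent points all moving in the same direction.
    recent = values[-trend_window:]
    diffs = [b - a for a, b in zip(recent, recent[1:])]
    if len(diffs) >= 2 and (all(d > 0 for d in diffs) or all(d < 0 for d in diffs)):
        warnings.append("steady trend in recent points -- consider corrective action "
                        "before a control limit is exceeded")
    return warnings

# Example: a hypothetical speed index with control value 100 and limits of 90 and 110.
print(check_control_chart([99, 98, 96, 94, 92, 89], lower_limit=90, upper_limit=110))
```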
Reject/retake analysis
A reject is defined as any film rejected by the department as scrap for any reason 7,10,11. A
retake is a patient film that has to be retaken because of an error. Retakes are only a part of
all rejected films. Rejects may be due to three main causes.
1 Retakes
2 Films wasted due to other causes, such as fogging, equipment breakdowns, etc
3 Trial films to establish exposure settings that are not viewed by a radiologist as part of the
diagnosis. (For example, some films in a tomographic series.)
Analysis of the rejected films over a period of time will enable causes of rejects to be
determined and improvements to be made. The more detailed the analysis performed, the
more information will be obtained. The simplest approach is just to collect all rejects and
to express the rate as a percentage of all films used. This gives a baseline value for the reject
rate for future reference.
This may be done continually, or for shorter periods on a regular basis. The minimum period
should be at least six weeks. The first two weeks' results are generally discarded to eliminate
the "startup effect".
If the rejects are sorted by work area or room, then corrective action may be directed to
where it is most needed.
If they are categorised by reason for rejection, then effort may be directed at reducing the
most common errors. Further subdivision by anatomical region may help determine which
examinations are causing the most difficulty. Finally, analysis by staff member may be used
to direct training to staff with difficulties in particular areas.
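The bookkeeping described above can be kept very simple. The short Python sketch below, using entirely hypothetical figures, expresses the reject rate as a percentage of all films used and tallies rejects by reason and by room so that corrective action and training can be directed where they are most needed.

```python
from collections import Counter

# Hypothetical reject log for the analysis period: (reason, room) for each rejected film.
rejects = [
    ("positioning error", "Room 1"), ("too dark", "Room 2"), ("too dark", "Room 2"),
    ("patient motion", "Room 1"), ("fogging", "Room 3"), ("too light", "Room 2"),
]
total_films_used = 400  # all films consumed over the same period (hypothetical)

reject_rate = 100.0 * len(rejects) / total_films_used
print(f"Overall reject rate: {reject_rate:.1f}% of all films used")
print("Rejects by reason:", Counter(reason for reason, _ in rejects).most_common())
print("Rejects by room:  ", Counter(room for _, room in rejects).most_common())
```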
Equipment logs
Each x-ray room should have an equipment log 7. To be recorded in this log are the dates and
details of all QA corrective action, breakdowns and routine servicing. This should include, if
possible, an estimate of the downtime.
Responsibility for the QA programme
In every department, a particular person should be assigned the overall responsibility for the
QA programme7. Clearly, this person should be familiar with all the x-ray equipment and
with the principles of radiology quality assurance. Where appropriate, it is well worthwhile to
establish a committee to oversee the programme. This committee should include
representatives of all the areas involved, MRTs, radiologists, medical physicists, service
personnel and management. (Note that NRL C5 requires that the QA programme be approved
by a qualified health physicist or directly by NRL.)
The QA manual
The procedures involved for the entire QA programme must be recorded in a manual 12. This
is to ensure that all tests are carried out in a consistent and reproducible manner. If not, then
parameters may appear to be out of control, where in fact the change was due to differences
in the measurement technique. Clearly the manual should be readily available to all personnel
concerned.
Staff training
At the start of the QA programme, staff will need to be given sufficient training to carry out
the QA procedures for which they are responsible. Periodic refresher training will also be
required during the programme to keep staff up to date with developments in the programme
and to provide feedback on its effectiveness and acceptability to staff.
3 SPECIFIC REQUIREMENTS FOR LARGE FACILITIES
All equipment and accessories need to be included in the QA programme, although the
frequencies may be very different for each item. This chapter describes the parameters that
need to be tested for each item of equipment and gives the suggested frequency of tests.
Where appropriate, recommendations for
test equipment are given. Specific step-by-step descriptions and tolerance values of all the
tests are not given. Such detailed information may be found in many of the publications listed
in the bibliography. In many cases the tolerances required may be determined from the
requirements of NRL C5. In any case the qualified health physicist may choose appropriate
frequencies where specific guidance is not given elsewhere.
Note that QA measurement instruments or systems are available from a number of
manufacturers. These generally combine kVp, dose and time measurement capabilities, and
in some cases are computer interfaced and have QA reporting software. The cost of such
instrumentation is small compared to the cost of the x-ray equipment and is well justified.
It is essential that all test equipment be calibrated on a regular basis. NRL may be consulted
if necessary for details of equipment calibration.
3.1 Automatic film processors
Of all equipment in x-ray departments, film processors cause the greatest proportion of
rejects7,10,13,14. Consequently, QA of film processors will give the greatest improvement in
reject rate. Therefore, film processor QA should be the first priority of any QA programme
and will probably form a major share of the work of the programme. Processor QA must be
performed daily to be fully effective.
The principles of processor QA are simple and well described in many
documents2,3,15,16,17,18,19,20. The first step is to ensure that the processor is operating correctly
at the start of the programme, usually by cleaning, replacing chemicals and checking the
replenishment rates and developer temperature. A light sensitometer should then be used
daily to expose test films that are processed without delay 20,21. Measurements of the densities
with a densitometer are then made to give base+fog, mid density (speed) and contrast indexes
which are plotted on the control charts. Systems that automate much of this procedure are
commercially available. (Note that special measures are required for modern low-crossover
films, or films with different emulsion speeds on each side.)
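As a rough illustration of the daily calculation (the step numbers and target density below are facility- and protocol-dependent assumptions, not values taken from these guidelines), the base+fog, speed and contrast indexes could be derived from the densitometer readings as follows.

```python
# Sketch of daily processor QC indexes from a sensitometric strip.
# The choice of steps is an assumption for illustration; use the steps defined
# in the facility's approved QA protocol.

def processor_indexes(step_densities, speed_target=1.0, contrast_steps=(13, 9)):
    """step_densities: densitometer readings, index 0 being the unexposed (base+fog) step."""
    base_fog = step_densities[0]
    # Speed index: the step density closest to base+fog plus the target net density.
    speed_index = min(step_densities[1:], key=lambda d: abs(d - (base_fog + speed_target)))
    # Contrast index: density difference between two predetermined steps.
    high_step, low_step = contrast_steps
    contrast_index = step_densities[high_step] - step_densities[low_step]
    return base_fog, speed_index, contrast_index

densities = [0.18, 0.20, 0.22, 0.26, 0.32, 0.41, 0.55, 0.75, 1.00, 1.28, 1.58,
             1.88, 2.15, 2.38, 2.55, 2.68, 2.78, 2.85, 2.90, 2.93, 2.95]
print(processor_indexes(densities))  # values to be plotted on the control charts
```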
If any parameter is found to be out of control, corrective action must be taken. The
appropriate action may often be determined by consulting trouble-shooting charts. (Charts are
generally supplied by x-ray film or film processor manufacturers.) These show the action to
be taken, for example, if contrast and speed are down but fog is up. Control charts and test
films should be stored for future reference, and a log book should be kept for each processor.
Two further items are important for processor QA. The first is that the test films must always
come from the same box of film, to eliminate variations between film batches. Second, when
the box is almost empty, a new box should be started in parallel, to check for differences. It
may be necessary to adjust the control chart values if the new batch is slightly different or
even to reject the batch of film, if it is too far out of control. (It has happened!)
3.2 X-ray generators
Peak kilovoltage
The most important parameter to monitor is the peak kilovoltage, since small drifts in the
kVp can significantly alter the film density. Large x-ray facilities should have some form of
kVp instrument. Either an inferential digital kVp meter or an Ardran and Crooks type
penetrameter22,23 may be used. Alternatively, a qualified health physicist may be contracted
to make kVp and other measurements requiring expensive equipment.
Depending on the age and stability of the generator, kVp measurements should be made at
least annually, and in addition, after servicing, and if a drift in kVp is suspected for any
reason.
Linearity with mA/mAs
Good linearity will ensure that the same film blackening can be obtained for the same mAs,
regardless of the mA/time combination used. To assess linearity a dosemeter is required 7,
although results of reduced accuracy may be obtained using film 24. Depending on the age and
stability of the generator, linearity measurements should be made at least annually, after
servicing, and if unpredictable results are being achieved with changes in mA.
Reproducibility of mAs
It is clearly important that the film blackening should always be the same for a given machine
setting. Good reproducibility is therefore essential for consistent radiography.
Reproducibility may be assessed using a dosemeter 7, although results of reduced accuracy
may be obtained using film24. The standard deviation should not exceed 5% of the mean
dose1.
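As an illustration of how these two checks might be evaluated from dosemeter readings (all readings below are hypothetical), the reproducibility criterion quoted above and the constancy of output per mAs can be computed as follows.

```python
import statistics

def reproducibility_ok(doses_mGy, tolerance=0.05):
    """Criterion quoted above: the standard deviation should not exceed 5% of the mean dose."""
    return statistics.stdev(doses_mGy) <= tolerance * statistics.mean(doses_mGy)

def output_per_mAs(readings):
    """readings: (mAs, dose in mGy) pairs taken at the same kVp and distance.
    For good linearity the dose per mAs should be essentially constant."""
    return [dose / mAs for mAs, dose in readings]

# Repeated exposures at identical settings (hypothetical dosemeter readings).
print(reproducibility_ok([2.10, 2.08, 2.13, 2.11, 2.09]))
# Output per mAs across a range of mA/time combinations (hypothetical readings).
print(output_per_mAs([(10, 0.52), (20, 1.05), (40, 2.11), (80, 4.19)]))
```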
Exposure timer accuracy and reproducibility
These contribute to mAs linearity and reproducibility. Many instruments are available with
time measurement capabilities, while a spinning top may be used for one- and two-pulse
machines7,17.
Waveform monitoring
Increasing numbers of test devices are capable of displaying the x-ray output and/or the kVp
waveform. It is strongly recommended that the waveform be checked at a range of generator
settings whenever the full set of tests is done 25.
Regular x-ray generator tests
All of the above tests for kVp, linearity, reproducibility and exposure timer require the use of
test equipment and in many cases will require the services of a qualified health physicist 26.
Therefore, in order to provide a quick check on the entire radiographic system that is easily
performed by radiography staff, it is recommended that a stepwedge exposure test be
performed periodically. Although such a test is not likely to be able to provide a diagnosis for
any equipment problems, it will show whether there have been any changes since the
previous test. The stepwedge exposure frequency should be as determined by a qualified
health physicist, depending on the age and stability of the equipment.
Weekly tests may be appropriate for older machines, while monthly to quarterly may be more
appropriate for modern equipment.
The design of a suitable stepwedge is given in figure 2. This may easily be constructed from
layers of 2.5 mm aluminium. (NRL may be able to supply suitable wedges should there be
sufficient demand.) This should be radiographed using an mAs for which the image of the
thickest part of the wedge is just discernible above the base+fog density. For single phase
machines with 2.5 to 3.0 mm total filtration, 80 kVp should be used, while for three-phase
and medium frequency machines 70 kVp should be used. An 18 x 24 or 24 x 30 cm cassette
may be used. (If a higher kVp is typically used for a particular room, then a 1 mm Cu plate
added to the wedge may be appropriate.) The cassette, on each side of the wedge, must be
shielded with lead or lead rubber. The kVp, mAs and FFD required for this should be
recorded in the QA manual and should be used for all subsequent tests, unless there is some
change to the x-ray machine or film processing. In this case a new mAs should be determined
for subsequent tests. The first image should be kept in a safe place and used as a reference
image to compare with the subsequent images. While the images may be assessed by eye the
use of a densitometer is preferable.
Figure 2. Design for a stepwedge suitable for QA of x-ray machines (end, side and top views;
dimensions marked 50 mm, 220 mm and 3.0 mm; aluminium, with an optional 1 mm Cu plate).
In the event that the image differs markedly from the reference image, then the reason should
be sought. If there has been no change in processing conditions, as determined by the daily
processor QA, then there could be a fault in the x-ray machine.
3.3 X-ray tubes
Filtration
The filtration must comply with the NRL requirement for greater than 2.5 mm Al equivalent
in the primary beam. To measure the total filtration, a set of high purity aluminium filters and
a dosemeter are required. The half value layer should be measured at a known kVp and the
total filtration inferred from an appropriate
chart27. It may be possible to inspect the tube assembly to establish that filtration complies,
by adding up the equivalent filtration of each component, plus any added filtration. Filtration
needs to be measured at acceptance testing and then only after servicing or modification to
the tube assembly.
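A minimal sketch of the half value layer calculation is given below, assuming exponential attenuation between the two measurement points that bracket half of the open-beam reading; the readings are hypothetical. Converting the resulting HVL to total filtration still requires the published chart referred to above.

```python
import math

def half_value_layer(thicknesses_mm, readings):
    """Estimate the HVL (mm Al) by log-linear interpolation between the two added-filter
    thicknesses whose readings bracket half of the open-beam reading.
    thicknesses_mm[0] must be 0.0 (no added filtration)."""
    half = readings[0] / 2.0
    points = list(zip(thicknesses_mm, readings))
    for (t1, r1), (t2, r2) in zip(points, points[1:]):
        if r1 >= half >= r2:
            return t1 + (t2 - t1) * math.log(r1 / half) / math.log(r1 / r2)
    raise ValueError("half-value point not bracketed by the measurements")

# Hypothetical dosemeter readings (arbitrary units) at a known kVp with added Al filters.
print(half_value_layer([0.0, 1.0, 2.0, 3.0, 4.0], [100.0, 71.0, 55.0, 44.0, 36.0]))
```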
Labelling
NRL C5 requires that the focal spot position be marked, to enable radiation protection survey
measurements to be made accurately and consistently. A label is also required, giving the
specifications of the tube. This should be checked at acceptance testing and after major
servicing.
3.4 Automatic exposure controls (AEC)
Significant reductions in the retake rate may be achieved through consistent use of AEC for
all exposures28. However, this can only be achieved if the AEC device itself is correctly
adjusted and is included in the QA programme. A simple device for streamlining the AEC
QA was described recently29. QA involves taking films for various phantom thicknesses, kVp
and mA settings, and for each combination of AEC detectors. A measurement protocol
appropriate to the type of AEC system should be determined by a qualified health physicist.
3.5 Light beam diaphragms (LBD)
There are four aspects to LBD performance to be considered 7,17. These are accuracy,
delineation (centring), brightness and edge sharpness. Requirements for these are given in
NRL C5. LBDs need to be checked regularly as they are prone to being knocked out of
alignment. They should be checked at least twice yearly, and more often if found to be
necessary. The LBD may be tested using a commercial test tool or the "eight pennies
method"17.
It is important also to check the alignment of the x-ray beam with the grid and the image
receptor for wall and table buckys.
3.6 X-ray cassettes
X-ray cassettes must be kept in good condition. They should be cleaned and checked
regularly. All cassettes of the same type should be of the same speed. This should be checked
periodically7,17,24. Film/screen contact should be checked using a wire mesh or similar test
object. Light leaks should be checked for, by processing a film that has been kept in a
cassette exposed to light for an hour or more without x-ray exposure. Intensifying screens
must be replaced when they are worn out, or if they become damaged.
3.7 X-ray image intensifier systems
General considerations
Image intensifier systems are generally complex, with a number of separate components that
all have to be in good condition and correctly adjusted for image quality and patient dose to
be acceptable30. The image intensifier insert is one of the few components in diagnostic
radiology that has a clearly limited life, and
which deteriorates in a relatively predictable manner during its life. Because of this
deterioration, the dose rate to the patient required for satisfactory image quality will increase
during the life of the II, to two or even three times the value when new.
A properly budgeted replacement programme for II inserts is essential for all x-ray
departments with fluoroscopy equipment. A large department with ten image intensifiers
should budget to replace (for example) one or two inserts each year on average.
Regular maintenance and adjustment of all components are essential for all image intensifier
systems, while a full assessment by a qualified health physicist should be made at least
annually. A full radiation protection survey is required by NRL C5 at two year intervals.
Maximum entrance surface dose rate
The maximum dose rate permitted by NRL C5 at the position of the patient's skin is 50 mGy
per minute. To test this, a dosemeter is required. For systems with automatic brightness
control, sufficient lead (>2 mm) should be placed on the II face to drive the technique factors
to maximum. For systems with manual control, the maximum technique factors should be set.
This test should be performed at least annually and after servicing. Note that for modern
systems it may be more appropriate to use about 5.0 mm of copper to represent the largest
patient likely to be examined, to check the maximum dose rate likely to be reached in
practice.
Average patient dose rate
Because the performance of the II insert deteriorates with time, the dose to patients should be
regularly monitored30,31. At acceptance testing and radiation protection surveys the dose rate
for a patient equivalent phantom (see below) should be determined, and the technique factors
(kVp, mA, focus-II distance and TV monitor brightness and contrast settings) should be
recorded in the QA manual. For systems with automatic brightness control, the phantom
should be screened regularly and the technique factors checked. If these have changed
significantly, the reason should be sought, and the system serviced if necessary. For manual
systems, the technique factors recorded in the QA manual should be set, and the image of the
phantom should be checked. If the brightness seems to have changed then again the reason
should be sought, and the system serviced if necessary.
Patient equivalent phantom
Each facility must have a patient equivalent phantom to enable QA tests to be made. The
actual phantom used is not critical 31. However, the phantom must be reproducible. The
phantom should measure at least 300 x 300 mm. The thickness will depend on the material
used. Suitable phantoms could consist of 250 mm of water in a container, 200 mm of
perspex, 250 mm of oil tempered hardboard, 2.5 mm of copper or 45 mm of aluminium. Note
that “thick” phantoms that are approximately tissue equivalent are preferable in most
circumstances, as the exit beam quality and scatter levels are more typical of those that occur
in actual examinations.
Automatic brightness control (ABC)
The ABC system is responsible for maintaining constant brightness at the TV monitor,
regardless of patient attenuation. ABC systems operate in many different ways, including
control of some or all of the kVp, mA and video gain. It is crucial that the ABC system be
well maintained and adjusted at all times, as ABC system faults may cause excessive dose
rates and/or poor image quality. In the worst case, an ABC fault may cause the generator to
run at maximum kVp and mA. The parameter that the ABC system ultimately controls is the
II input dose rate. Therefore ABC system performance can be monitored by regular
measurement of the II input dose rate. This should be done with a range of phantom
thicknesses, at least twice per year7,17,30.
3.7.1 Image intensifiers
II input dose rate
The II input dose rate is a key parameter to monitor, as it gives a guide to the condition of the
image intensifier and it can indicate the presence of other problems with the system 30,32. NRL
C5 gives limits for the II input dose rate. In general most intensifier systems should give
satisfactory image quality at dose rates much lower than the limits. To measure the II input
dose rate a relatively sensitive electrometer is required, fitted with a pancake ionization
chamber of at least 50 cc volume. The grid (if fitted) should be removed for this test, or
alternatively the chamber should be positioned between the grid and the II face. If the grid
cannot be removed then an allowance must be made for the grid attenuation, typically 50%
for primary beam (no scatter).
If the II input dose rate has changed significantly since the previous tests then adjustments to
the system may be required. Any II that is found to require input dose rates greater than the
limits in NRL C5 should be replaced.
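Where the chamber has to be left in front of a grid that cannot be removed, the allowance for grid attenuation mentioned above amounts to a simple scaling of the reading, as sketched below; the 50% transmission is the typical primary-beam figure quoted above and the measured rate is hypothetical.

```python
# Correcting a dose-rate reading made in front of a grid that cannot be removed.
measured_rate_uGy_per_s = 1.6   # hypothetical reading with the chamber in front of the grid
grid_transmission = 0.5         # typical primary-beam transmission quoted above; substitute
                                # the actual transmission of the grid in use where it is known
ii_input_rate = measured_rate_uGy_per_s * grid_transmission
print(f"Estimated II input dose rate: {ii_input_rate:.2f} uGy/s")
```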
Image contrast and resolution
At acceptance testing, a set of measurements should be made using approved II test
objects32,33. These tests should be repeated at least annually.
Where approved test objects are not available for routine testing, an image quality QA
phantom should be devised31. This will set threshold values for contrast and resolution. This
should be used regularly, to check that no deterioration in the image quality has occurred.
Conversion factor
The best guide to the condition of an image intensifier is obtained by a measurement of its
conversion factor. This requires specialised equipment and generally can only be done by a
qualified health physicist or by the manufacturer. For some systems it is not practicable to do
it at all. Where the conversion factor can be measured, replacement of the II insert is
generally recommended when the conversion factor has reduced to a third of the value when
the II was new. The conversion factor should be measured annually where possible.
Where the conversion factor cannot be measured, indirect methods need to be applied to
assess the condition of the II insert, as described above 30.
Focus
The focus controls for the II should be regularly checked and adjusted if necessary, using a
line pair test plate or fine grid. The best compromise between central and peripheral
resolution should be achieved7,17.
3.7.2 Fluoroscopic television chains
Generally, television chains have a number of adjustable parameters that all have to be within
tolerance for optimum image quality. For example, these parameters may include peak white
and black level voltages, blanking circle diameter, II vignetting correction, etc. The
adjustments provided and the optimum voltage values depend on the individual
manufacturer. Therefore, the manufacturer's test protocols and recommended test frequencies
should be adhered to.
Assessment of most of the TV system parameters may be made using a test object placed on
the II face30, and a storage oscilloscope with single TV line facility. The measured
parameters should be compared to the manufacturer's specifications, although general
recommendations that apply to the majority of systems may also be applied 30.
Among the most important parameters is the noise produced by the TV system. The peak-to-
peak TV camera noise should be less than the quantum noise at normal II input dose rates
and should be less than 10% of the peak white voltage.
3.8 Fast film changers
Angiographic film changers require several tests to be made periodically to ensure that the
film transport is synchronised to the x-ray exposures, that the film/screen contact is
satisfactory and that the screens are clean and in good condition 34.
To properly check the film transport and synchronisation, a dual trace oscilloscope should be
used34.
Film/screen contact can be tested using a mesh as for conventional x-ray cassettes. The grid
should be removed from the film changer for this test.
The intensifying screens should be carefully cleaned and checked for damage.
3.9 Cine systems and small format cameras
The crucial parameter for both cine systems and small format cameras 35,36 is the dose per
frame at the II input face. Generally a dose of 0.1 to 0.2 μGy per frame is recommended for
cine depending on the II field size. For small format cameras, the dose for correct operation
should be determined at acceptance testing. The dose per frame should be checked using a
dosemeter at regular intervals, at least twice per year, and more often for intensively used
systems.
For cine systems, jitter of the camera and or projector can be a problem. The Society of
Motion Picture and Television Engineers (SMPTE) test films can be used to check for
projector jitter. If the projector is found to be OK, and jitter is observed in the clinical films,
then the cine camera is at fault3.
The cine camera, spot film camera and cine projector optics should be carefully cleaned
regularly, and inspected for deterioration. Any optical elements in poor condition must be
replaced.
3.10 Digital subtraction imaging (DSI) systems
Before any quality assurance measurements can be made on DSI systems, the x-ray and
image intensifier systems must be tested as described above.
Most measurements on DSI systems require specialised phantoms and test equipment, and
should be made by a qualified health physicist 37,38,39. The tests should be carried out every
three months. A typical series of tests would follow the protocol provided with the DSI test
objects supplied by Leeds University 37. The discussion below is based on the use of the
Leeds test objects.
II input dose per frame
Fundamental to the tests is a measurement of the dose per frame at the entrance surface
position of the patient and at the input face of the image intensifier. A sensitive electrometer
and a large (>50 cc) pancake chamber are required for the latter measurement. The dose per
frame at the patient entrance surface is recorded for comparison purposes. The II input dose
per frame is needed to determine that the system is correctly adjusted and that the image
intensifier is still in satisfactory condition for DSI requirements.
Operating headroom
It is important for digital subtraction systems, that the dose rate (dose per frame) at the image
intensifier input face be set sufficiently low to prevent saturation of the video signal in the
brightest parts of the image. If saturation should occur then clinical information may be lost
in the subtracted images. Saturation is avoided by setting the dose rate at a level that gives a
video signal that is less than the maximum. The percentage that the operating level is set
below the maximum is termed the headroom. However, if the headroom is too great, then the
actual useable part of the video signal (the dynamic range) will be too small. The automatic
dose rate/dose per frame is therefore a compromise between dynamic range and headroom.
The Leeds "jig test object" is designed to measure the headroom for peak sensing systems. It
gives a subtracted image of a stepwedge. The headroom is assessed by counting the number
of steps of this stepwedge that are visible. As each step corresponds to 4% above the
operating video level, the headroom is the number of steps visible, times 4%.
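A worked example of this calculation, with a hypothetical step count:

```python
# Headroom from the subtracted image of the jig test object (hypothetical count).
steps_visible = 6
headroom_percent = steps_visible * 4   # each visible step corresponds to 4%
print(f"Headroom: {headroom_percent}%")
```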
There is now available an analogous Leeds test object designed to be used with average
sensing systems.
The jig test object also contains a low contrast square. If this square is not visible, then the
electronic noise of the system is excessive.
Dynamic range
A further test of dynamic range is made using the Leeds "quadrant test object". This has four
sets of low contrast details. Each set is in a sector of differing x-ray intensity, obtained by
different thicknesses of copper filter. The relative x-ray intensities of the four sectors
(quadrants) are 100%, 33%, 10% and 3.3%. Clearly, as the x-ray intensity is reduced the
details will become more difficult to see. Detectability of the details in each quadrant is
therefore a test of the dynamic range.
Contrast-detail performance
The Leeds "contrast-detail test object" is very similar to the NRL CD test object used for
conventional IIs, as reported above. However, the contrasts for each detail size are lower,
reflecting the greater capability of DSA systems. A contrast-detail assessment should be
included in the quarterly tests.
Misregistration
The Leeds "Dalmatian" test object may be used to assess the accuracy of registration of the
mask and subtraction images. This has a large number of 11 mm details spaced in a uniform
matrix over the test object. The test object remains in place for both the mask and subtraction
images and therefore should disappear in the subtracted image. Any misregistration results in
halos in the subtracted images.
Monitors and hard copy devices
The viewing monitors and the hard copy film device for DSI need to be checked at least
weekly, as drifts in performance can significantly reduce image quality 40,41.
3.11 Computed tomography scanners
CT scanners are generally covered by a service contract that includes calibrations and some
form of regular preventative maintenance (often monthly). This is essential to ensure
satisfactory calibration of the machine at all times. Generally the sorts of tests made are
invasive and can only be made by the service agent. However, a maintenance contract does
not relieve the facility from the responsibility for regular quality assurance
measurements42,43.
At the very least the CT number of air and water and the noise level at the centre of a uniform
phantom should be assessed weekly. Regular checks of the modulation of a bar pattern
phantom should be done at least monthly (using the ROI standard deviation function) to
check the resolution43. The CTDI in air should be measured annually and after major
servicing (such as x-ray tube replacement), in order to monitor patient dose levels. The multi
format camera and the video display units must be checked regularly, at least weekly 40,41.
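Where the scanner's ROI function reports only individual pixel values, the weekly check amounts to the calculation sketched below (the pixel values are hypothetical); most consoles provide the mean and standard deviation directly.

```python
import statistics

def roi_stats(ct_numbers):
    """Mean CT number and noise (standard deviation) for the pixel values in a central ROI."""
    return statistics.mean(ct_numbers), statistics.stdev(ct_numbers)

# Hypothetical pixel values (HU) from a small ROI at the centre of the water phantom.
water_roi = [1.2, -0.8, 0.5, -1.9, 2.3, 0.1, -0.4, 1.1, -2.2, 0.7]
mean_hu, noise_hu = roi_stats(water_roi)
print(f"Water CT number: {mean_hu:.2f} HU, noise: {noise_hu:.2f} HU")
```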
3.12 Mammography machines
Mammography is the area of radiology requiring the greatest effort in quality assurance.
Mammography can only be effective with a comprehensive quality assurance programme 44.
A complete programme for mammography QA would require more extensive description
than is appropriate here. Only the types of tests to be performed are outlined here, while the
recommended frequencies are included in section 4. There are several excellent published
QA programmes45,46,47. It is strongly recommended that all mammography facilities obtain at
least one of these publications.
The greatest priority for mammography QA should be the film processor. Careful daily
quality control of film processing is a pre-requisite for satisfactory mammography. All of the
relevant items for general x-ray machines and automatic exposure controls should also be
monitored regularly. These generally need to be monitored more often than in general
radiology and to have tighter control limits. In addition, specialist features of mammography
should be checked. These include the focal spot size, beam quality, compression device,
regular assessments of an imaging phantom, daily exposure of a perspex phantom to check
the AEC device, and a regular check on the dose level.
3.13 Tomography machines
The x-ray generator and x-ray tube should be checked in accordance with the
recommendations above before tests on the tomographic system are applied. Items to check
for tomography are stability, resolution, layer selection and angle of swing 7,17. BIR3 give a
good explanation of the tests to be used.
Stability is assessed visually by watching the tomographic motion during an exposure from a
distance of about 2 m. This should be done without x-ray exposure if possible, or with the
LBD closed. (Otherwise the operator should wear a lead apron.) The movement should be
continuous, smooth and at a regular speed.
Smoothness of swing may also be tested using a pinhole in a lead sheet to produce an image
of the tube movement. The image should be regular and of even density.
Resolution is tested using a wire mesh test object, with a selection of meshes from 1 up to 2
holes mm⁻¹. Resolution should be at least 1.2 mm⁻¹.
Layer thickness and layer height should be assessed using an angulated scale.
These tests should be done at least annually, and more often if found to be necessary.
3.14 Mobile radiographic equipment
The same requirements apply to mobile equipment as to equivalent fixed machines.
However, because of the harsher treatment often afforded to mobile
machines, the quality assurance tests may need to be done more frequently 7. In particular the
LBD alignment should be checked often. Additional items to check include the mechanical
features, such as tube locks, wheel brakes, safety of electrical cables, etc.
3.15 Mobile image intensifier equipment
Mobile image intensifier systems should meet the same requirements as fixed II systems.
However, because of the harsher treatment often afforded to mobile machines, the quality
assurance tests may need to be done more frequently. In particular the alignment of the x-ray
tube with the II should be regularly checked to ensure that the primary x-ray beam is
completely intercepted by the II.
3.16 Grids
Grids should periodically be checked for warping and damage and should be radiographed to
check for uniformity17.
3.17 Protective equipment
Lead aprons and lead gloves should be thoroughly checked at least annually for any signs of
wear or damage. If they appear to be suspect, then they should be tested using x-ray film or a
fluoroscopy machine.
3.18 Darkrooms
A well laid out darkroom that is clean and free from light leaks is essential for satisfactory
radiography.
The entrance should be light tight, either a maze or well-sealed door. Safelights should be the
correct colour for the film type being used and should be no more than 25 watts. (Note that
"novelty lamps" are not suitable.) A check for light leaks should be made at least annually. A
film fog test should also be done at least annually 1,7,48.
3.19 Viewboxes
The viewbox(es) should be in an area shielded from direct sunlight or bright artificial light. It
should be possible to dim the lighting. The viewbox should be cleaned inside and out at least
annually. The fluorescent lamps should be replaced if they become too dim. The lamps
should all be the same colour and the same wattage 49. Inexpensive light meters can be
obtained, to simplify QA of viewboxes. NRL C5 gives brightness values for viewboxes.
Locally derived tolerances for uniformity should also be applied. Note that the brightness of
viewboxes tends to increase by 20 to 30% as they warm up and also note that the brightness
of the fluorescent tubes reduces with age.
There is evidence that the colour temperature of viewboxes is important also 49. This should
be checked periodically as appropriate.
3.20 Technique charts
It is essential that all radiographic technique factors are recorded and displayed on a
technique chart28,50. This should include the kVp, mA and exposure time to set (or AEC
detector, density and speed settings), the cassette type and size to use, the focus-to-film
distance, and any important centring points and angulations. These technique charts must be
kept up to date and must be clearly legible.
3.21 Dose measurements
As part of the QA programme, NRL C5 requires that all x-ray facilities periodically assess the
dose to patients for a number of common examinations (skull, chest, thoracic spine,
abdomen, lumbar spine and pelvis and the more complex procedures barium enema, barium
meal and IVU). Patient dose estimates generally require the assistance of a qualified health
physicist or NRL. Three methods may be used for these dose assessments. All three methods
rely on the use of organ dose data obtained from computer simulation of x-ray examinations
of an average patient51,52.
The simplest method is to calculate the doses from the technique factors used for average
patients, using x-ray output calibration data for each x-ray tube. The second method is to use
thermoluminescent dosimeters to measure the entrance surface doses for a number of average
patients. Finally a dose area product meter can be fitted to the x-ray machine for a number of
examinations. This is the most practical method for fluoroscopy examinations.
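As an illustration of the first (simplest) method, the sketch below estimates an entrance surface dose from the technique factors and the measured tube output. The form of the calculation (inverse-square correction plus a backscatter factor) is a commonly used approach rather than a prescription of NRL C5, and all numerical values are hypothetical.

```python
def entrance_surface_dose_mGy(output_mGy_per_mAs_at_1m, mAs, focus_skin_distance_m,
                              backscatter_factor=1.35):
    """Estimate the entrance surface dose from tube output calibration data.
    The backscatter factor converts free-in-air dose to dose at the skin surface."""
    inverse_square = (1.0 / focus_skin_distance_m) ** 2   # output calibrated at 1 m
    return output_mGy_per_mAs_at_1m * mAs * inverse_square * backscatter_factor

# Hypothetical example: 40 mAs, focus-to-skin distance 0.80 m,
# tube output 0.05 mGy/mAs at 1 m for the kVp used.
print(f"{entrance_surface_dose_mGy(0.05, 40, 0.80):.2f} mGy")
```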
3.22 Approval of the QA programme
NRL C5 requires that the QA programme be approved by a qualified health physicist. For
large x-ray facilities, a programme following the guidelines herein should be satisfactory.
However, NRL or a qualified health physicist must give overall approval to the programme
and should be consulted concerning any details of the programme that are in doubt. All
documentation for the QA programme should be kept in a safe place, so that the details may
be checked during NRL radiation protection surveys.
4 AN OUTLINE QA PROGRAMME FOR LARGE FACILITIES
Each x-ray facility is required by NRL C5 to institute a QA programme in radiation
protection that is appropriate to its size and scope. In essence the QA programme involves the
implementation of procedures to ensure that all of the items of equipment are tested at the
appropriate frequency, and that corrections are made when parameters are found to be outside
the permitted tolerance. Besides the actual QA measurements, essential components of the
QA programme are acceptance tests and reject/repeat analysis. The programme should be set
down in a QA manual. Responsibility for the QA programme should be assigned to one
person and where appropriate, should be overseen by a QA committee. These concepts are
discussed in section 2 above.
Notes
• It is intended in the programme below that for each category, the tests in the previous
category should be included. For example the weekly tests would include all of the daily
tests, plus those to be done weekly.
• The frequencies are those typically required for average equipment. The frequencies may be
modified in the light of experience, to be more frequent or less frequent as necessary.
• Clearly, it is assumed below that corrective action will be taken immediately, should any
tests reveal that QA parameters are out of control.
Equipment
To perform a satisfactory QA programme, each facility should have (or have access to) the
following test equipment.
• A light sensitometer, with single/double emulsion and blue/green capability.
• A densitometer.
• A basic QA dosemeter.
• A kVp meter, or kVp penetrameter.
• An inexpensive light meter (Luxmeter).
• A good quality thermometer.
• An aluminium stepwedge.
• A set of aluminium (type 1100) filters.
• Some form of resolution test object or set of meshes.
• Basic II image quality phantom.
• Patient equivalent phantom.
• Specialised phantoms for DSA, CT, mammography, etc as appropriate.
• Some equipment may require specialised test jigs, attachment or imaging objects.
(The total cost of this equipment would be in the range $10,000 to $15,000, not including
specialised DSA/CT/mammography phantoms.)
Daily tests
For each film processor in the department, a film exposed to a sensitometer should be
processed, the densities read and the results posted on the control chart. This test should be
done at the same time each day, after the processor has been in use for an hour or so.
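A hedged sketch of how the daily sensitometric readings might be compared against the control chart is given below; the baseline densities and the action limits are commonly used working values assumed for illustration, not limits set by this programme.

# Minimal sketch of a daily processor control chart check.
# Baseline densities and action limits are assumed illustrative values.

BASELINE = {"mid_density": 1.20, "density_difference": 1.60, "base_plus_fog": 0.18}
ACTION_LIMIT = {"mid_density": 0.15, "density_difference": 0.15, "base_plus_fog": 0.03}

def processor_check(todays_readings):
    """Return messages for any sensitometric index outside its action limit."""
    messages = []
    for index, baseline in BASELINE.items():
        deviation = todays_readings[index] - baseline
        if abs(deviation) > ACTION_LIMIT[index]:
            messages.append(f"{index} out of control by {deviation:+.2f}; take corrective action")
    return messages

today = {"mid_density": 1.42, "density_difference": 1.55, "base_plus_fog": 0.19}
for message in processor_check(today) or ["Processor within action limits"]:
    print(message)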
The mammography machine AEC should be checked with a standard (40 mm) perspex
phantom, and the mAs should be recorded.
Video displays and hard copy devices should be checked.
Weekly tests
A stepwedge should be radiographed for each x-ray tube in the department and the density
parameters should be plotted on a control chart. If any measured parameters are found to be
out of control, then corrective action must be taken. (This may be done monthly if weekly
tests show no variations.)
A mammography QA phantom should be imaged and the details detected should be recorded.
If not done daily, video displays and hard copy devices should be checked.
CT scanners should have the CT numbers of air and water and the noise at the centre of the
water phantom measured. The modulation of a bar pattern phantom could also be checked
using the ROI software.
For fluoroscopy systems, the patient equivalent phantom should be screened, using a
standardised machine setup, and the technique factors noted.
Monthly tests
The intensifying screens should be carefully inspected each month and cleaned if necessary.
A reduced cleaning frequency may be possible in the light of experience. They should be
cleaned at least every six months.
The mammography machine AEC should be thoroughly tested.
For mobile machines fitted with an LBD, the alignment should be checked. Wall and table
buckys should have the beam alignment checked.
A stepwedge radiograph should be produced for all of the x-ray tubes at the facility, if not
required weekly. These should be compared to the reference stepwedge for each tube. Any
differences should be investigated and corrective action taken if necessary.
Checks of patient dose rate and image contrast and resolution as described above should be
made at facilities with image intensifiers.
Quarterly tests
DSA machines should be tested quarterly as described above.
For mammography machines the mean glandular dose should be checked by comparing the mAs required to image a 40 mm perspex phantom.
For machines fitted with an LBD, the alignment should be checked.
Annual tests
Protective aprons and gloves should be given a careful visual inspection. If suspect, then they
should be referred for more careful checking.
The viewbox should be cleaned, including the internal reflectors, and the lamps replaced if
they have become dim.
Film/screen contact should be checked using a mesh. Cassettes should be checked for light
leaks and damage. The screens should be carefully inspected for scratches, blemishes or other
damage. Old or worn out screens should be replaced.
A light-leak and light-fog test should be done in the darkroom.
The technique chart should be checked to ensure that it is up to date.
The films in the reject/retake bin should be sorted by category [10]. The number in each
category and the total number should be counted. The reject/retake rate should then be
calculated as the percentage of all films used. The reject categories with the greatest numbers
of films should be investigated to determine whether any improvements can be made.
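As a worked illustration of this calculation, the sketch below uses hypothetical category counts and film usage; the categories follow the general idea of the retake analysis in reference 10 but are not taken from it.

# Hedged sketch of the annual reject/retake analysis; all counts are hypothetical.

rejects_by_category = {"positioning": 38, "too dark": 22, "too light": 17,
                       "patient motion": 9, "processing artefact": 6, "other": 8}
total_films_used = 2500  # all films consumed over the audit period

total_rejects = sum(rejects_by_category.values())
reject_rate = 100.0 * total_rejects / total_films_used
print(f"Reject/retake rate: {reject_rate:.1f}% ({total_rejects} of {total_films_used} films)")

# Investigate the largest categories first.
for category, count in sorted(rejects_by_category.items(), key=lambda item: item[1], reverse=True):
    print(f"{category}: {count} films ({100.0 * count / total_rejects:.0f}% of rejects)")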
It is strongly recommended that the peak kilovoltage, total filtration, linearity and
reproducibility of the x-ray machine(s) be checked at least annually.
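A hedged sketch of the linearity and reproducibility part of this check is shown below; the dose readings are hypothetical, and the 0.10 and 0.05 tolerances are commonly quoted working values assumed here rather than limits stated in this document.

# Minimal sketch of output reproducibility and linearity checks.
# Dose readings (mGy) and tolerances are assumed illustrative values.

from statistics import mean, pstdev

def reproducibility_cv(repeat_doses_mgy):
    """Coefficient of variation of repeated exposures at fixed technique factors."""
    return pstdev(repeat_doses_mgy) / mean(repeat_doses_mgy)

def linearity_coefficient(mas_1, dose_1_mgy, mas_2, dose_2_mgy):
    """Compare output per mAs (mGy/mAs) at two adjacent mAs stations."""
    x1, x2 = dose_1_mgy / mas_1, dose_2_mgy / mas_2
    return abs(x1 - x2) / ((x1 + x2) / 2.0)

repeat_doses = [2.10, 2.08, 2.12, 2.11]  # four exposures at 80 kVp, 20 mAs (hypothetical)
print(f"Reproducibility CV = {reproducibility_cv(repeat_doses):.3f} (assumed limit 0.05)")
print(f"Linearity coefficient = {linearity_coefficient(10, 1.04, 20, 2.10):.3f} (assumed limit 0.10)")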
For image intensifier systems the II input, patient and maximum dose rates should be
measured. Where possible, the II conversion factor should be measured. Tests with approved
II test objects should also be made.
The accuracy of any focus-to-film readout devices should be estimated.
For mammography machines, all parameters should be checked, including compression
device and paddles, generator kVp, linearity and reproducibility, focal spot size, AEC device,
x-ray cassettes, darkroom and viewbox. In addition the mean glandular dose for an average
breast should be checked. The mAs required for the 40 mm perspex phantom should be noted
for use as a quarterly test that the mean glandular dose has not changed.
Two-yearly tests
All of the annual tests should be done. In addition a full radiation protection survey should be
performed, either by NRL or by a qualified health physicist (approved by NRL). This will
include quantitative tests on the x-ray generator(s), image intensifier and any mobile x-ray
equipment and will also include assessment of doses to patients.
BIBLIOGRAPHY
1 National Radiation Laboratory. Code of safe practice for the use of x-rays in medical
diagnosis. Christchurch : National Radiation Laboratory, 1994. Code NRL C5.
2 National Council on Radiation Protection and Measurements. Quality assurance for
diagnostic imaging equipment. Bethesda, MD, : NCRP, 1988. NCRP report no. 99.
3 British Institute of Radiology. Assurance of quality in the diagnostic x-ray department.
London : British Institute of Radiology, 1988.
4 International Electrotechnical Commission. Evaluation and routine testing in medical
imaging departments. Part 2-6: Constancy tests – x-ray equipment for computed
tomography. Geneva : IEC., 1994. IEC 1223-2-6.
5 Watkinson S A. Economic aspects of quality assurance Radiography 51(597):133-140,
1985.
6 Henshaw E T. Quality assurance in practice – a critical appraisal of what is effective. In
Criteria and methods for quality assurance in medical x-ray diagnosis. London : British
Institute of Radiology, 1985. BJR supplement 18, p. 142-1440.
7 Gray J E et al. Quality control in diagnostic imaging : a quality control cookbook.
Gaithersburg, MD, : Aspen Publishers, 1983.
8 Technical and physical parameters for quality assurance in medical diagnostic radiology :
tolerances, limiting values and appropriate measuring methods. Eds B M Moore et al.
London : British Institute of Radiology 1989. BIR report 18.
9 Gray J E and Stears J. Quality control in diagnostic radiology at Mayo Clinic. Applied
radiology 13(4):89-92, 1984.
10 Goldman L W and Beech S. Analysis of retakes : understanding, managing and using an
analysis of retakes program for quality assurance. Rockville, MD, : Bureau of Radiological
Health, 1979. HEW publication FDA 79-8097.
11 Watkinson S, Moores B M and Hill S J. Reject analysis: its role in quality assurance.
Radiography 50(593):189-194, 1984.
12 Hendee W R and Rossi R P. Quality assurance for radiographic x-ray units and associated
equipment. Rockville, MD, : Bureau of Radiological Health, 1979. HEW Publication (FDA)
79-8094.
13 Goldman L et al. Automatic processing quality assurance programme: impact on a
radiology department. Radiology 125:591-595, 1977.
14 Gray J E. Mammography (and radiology?) is still plagued with poor quality in
photographic processing and darkroom fog. Radiology 191:318-319, 1994.
15 Gray J E. Photographic quality assurance in diagnostic radiology, nuclear medicine and
radiation therapy. Volume 1. The basic principles of daily photographic quality assurance.
Rockville, MD, : Bureau of Radiological Health, 1976. HEW publication (FDA):76-8043.
16 Frank E D, Gray J E and Wilken D A. Flood replenishment: a new method of processor
control. Radiologic technology 52(3):271-275, 1980.
17 McLemore J M. Quality assurance in diagnostic radiology. Chicago, Ill, : Year Book
Medical Publishers, 1981.
18 Watkinson S et al. Quality assurance: a practical programme. Radiography 49(578):27-32, 1983.
19 International Electrotechnical Commission. Evaluation and routine testing in medical
imaging departments, part 2-1: Constancy tests – film processors. Geneva : IEC, 1993. IEC
1223-2-1.
20 Groenendyk D J. Densitometers and sensitometers in QC. Radiologic technology
65(4):249-250, 1994.
21 Suleiman O H and Thomas A W. A comparison of freshly exposed and preexposed
control film in the evaluation of processing. Medical imaging and instrumentation '85:96-
102, 1985. SPIE 555.
22 Le Heron J C and Williamson B D P. The NRL kV-cassette – a penetrameter for the
estimation of peak kilovoltage on diagnostic x-ray machines. Christchurch : National
Radiation Laboratory, 1980. Report NRL 1980/5.
23 Netto T G and Cameron J R. An inexpensive kVp penetrameter. Medical physics
12(2):259-260, 1985.
24 Burt G. Quality control without a budget. The radiographer 40:12-15, 1993.
25 Stears J G et al. X-ray waveform monitoring for radiographic quality control. Radiologic
technology 57(1):9-15.
26 Pirtle O T. X-ray machine calibration: a study of failure rates. Radiologic technology
65(5):291-295, 1994.
27 Le Heron J C. Half value layer versus total filtration for general diagnostic x-ray beams.
Christchurch : National Radiation Laboratory, 1990. Report NRL 1990/5.
28 Eastman T. Technique charts improve x-ray quality. Radiologic technology 65(3):183-
186, 1994.
29 Hunt A J and Plain S G. Technical note: a simple solution to the problems of testing
automatic exposure control in diagnostic radiology. British journal of radiology 66:360-362,
1993.
30 Poletti J L and Le Heron J C. Subjective performance assessment of x-ray image
intensified television fluoroscopy systems. Christchurch : National Radiation Laboratory,
1987. Report NRL 1987/2.
31 Hayward G. A practical phantom for fluoroscopy. The radiographer 32(2):72-74, 1985.
32 Le Heron J C and Poletti J L. Imaging performance and limits of acceptability for x-ray
image intensifier systems. Christchurch : National Radiation Laboratory, 1990. Report NRL
1990/4.
33 Hay G A et al. A set of test objects for quality control in television fluoroscopy. British
journal of radiology 58(688):335-344, 1985.
34 American College of Radiology. Acceptance testing protocols: a systematic approach to
evaluating radiologic equipment. Chicago, Il, : ACR, 1983.
35 Renaud L and Morissette R. A quality control program for cine radiography. Journal of
the Canadian Association of Radiologists 35:380-382, 1984.
36 Rouse S and Cowen A R. Quality assurance of fluorographic camera systems.
Radiography 49(587):251-255, 1983.
37 Cowen A R et al. A set of x-ray test objects for image quality control in digital subtraction
fluorography. I Design considerations and II Application and interpretation of results.
British journal of radiology 60: 1001-1009 and 1011-1018, 1987.
38 McLean D and Collins L. Quality assurance protocol for digital subtraction angiographic
units. Australasian physical and engineering sciences in medicine 9(3):127-132, 1986.
39 American Association of Physicists in Medicine. Digital Radiography/ Fluorography Task
Group. Performance evaluation and quality assurance in digital subtraction angiography:
report. NY, : American Institute of Physics, 1985. AAPM report no. 15.
40 International Electrotechnical Commission. Evaluation and routine testing in medical
imaging departments. Part 2-4: Constancy tests – hard copy cameras. Geneva, : IEC., 1995.
(AS/NZS 4184.2.4 : 1995) IEC 1223-2-4.
41 International Electrotechnical Commission. Evaluation and routine testing in medical
imaging departments. Part 2-5: Constancy tests – image display devices. Geneva, : IEC.,
1995. (AS/NZS 4184.2.5 : 1995) IEC 1223-2-5.
42 Droege R T. A quality assurance protocol for CT scanners. Radiology 146:244-246, 1983.
43 Poletti J L. Performance assessment of CT x-ray scanners. Christchurch : National
Radiation Laboratory, 1985. Report NRL 1985/8.
44 Kirkpatrick A E. Quality control in mammography. In: Breast cancer screening in
Europe. Gad A and Rosselli Del Turco M Eds. Berlin : Springer-Verlag, 1993. p. 131-141.
45 Royal Australasian College of Radiologists. Mammography Quality Assurance
Programme Subcommittee. Mammography quality control. American College of Radiology,
1992.
46 Calverd A. Physical quality control for screening mammography. Radiography today
55(630):20-23, 1989.
47 Screen film mammography : imaging considerations and medical physics responsibilities.
Barnes G T and Frey G D eds. Madison, WI : Medical Physics Publishing, 1991.
48 Gray J E. Light fog on radiographic films: how to measure it properly. Radiology
115:225-227, 1975.
49 Haus A G, Gray J E and Daly T R. Assessment of mammographic viewbox luminance,
illuminance and color. Medical Physics 20(3):819-821, 1993.
50 Enright M. A simple graphic exposure system for mobile capacitor discharge units. The
radiographer 33(3):94-97, 1986.
51 Hart D, Jones D G and Wall B F. Estimation of effective dose in diagnostic radiology
from entrance surface dose and dose-area product measurements. Chilton, Oxon, : National
Radiological Protection Board, 1994. NRPB-R262.
52 Institute of Physical Sciences in Medicine. Dosimetry Working Party. National protocol
for patient dose measurements in diagnostic radiology. Chilton, Oxon, : National
Radiological Protection Board, 1992.
http://www.nrl.moh.govt.nz/publications/1995-1.pdf
Section 2: Introduction
Purpose
The purpose of this manual is to provide information on the requirements necessary to meet the standards of the HARP Act and its Regulation. It also explains the necessary components of proper Quality Assurance and Quality Control programs. In addition, it describes in detail all tests and procedures carried out by the Ministry of Health's X-ray Inspection Service.
Definition
A quality assurance program:
Is a management tool that includes policies and procedures ensuring overall safe practices are observed in an x-ray department as well as ensuring that the performance of the x-ray equipment is at its optimum, in keeping with minimum exposure to both patients and personnel.
It is an ongoing process in order to keep up with the changes in technology, hospital policies, etc.
Quality Control:
Regular monitoring and testing of equipment with proper evaluation and corrective actions taken when necessary.
Implementation
A successful quality assurance program depends on the understanding and support of all those involved in the operation of the facility. A program initiated solely to comply with the regulatory requirements of the Healing Arts Radiation Protection Act is not likely to provide the maximum possible benefit to patient care.
Responsibility for Quality Assurance Testing
Section 8 (2) of the HARP Regulation states:
"Every radiation protection officer shall establish and maintain procedures and tests for the x-ray machines and x-ray equipment in the facility for which he is a radiation protection officer to ensure compliance with this Regulation."
Quality assurance activities are the responsibility of all members of a facility, but the ultimate
accountability rests with the Radiation Protection Officer (RPO). The degree of involvement of
other members of the facility will vary depending upon the size and organization of the facility.
A Quality Assurance Program Must Include
1. X-ray Safety procedural/policy manual. Refer to the General Information - Section 13 of this manual for more information about this topic.
2. Plan Approval for all x-ray rooms. Refer to Plan Approval - Section 14 of this manual for
more information.
3. A Quality Control program in place, including but not limited to:
a. Developer sensitometry on a daily basis (including weekends if applicable).
b. Testing of x-ray equipment on a regular basis and analysing results (refer to the
Regulation for required frequency of testing).
c. Checking film quality.
d. Ensuring operator qualifications.
http://www.xrayfocus.info/qc/testhtml/mohguide/section02.html
8 QUALITY ASSURANCE AND QUALITY CONTROL
CO-CHAIRS, EDITORS AND EXPERTS
Co-Chairs of the Expert Meeting on Cross-sectoral Methodologies for
Uncertainty Estimation and Inventory Quality
Taka Hiraishi (Japan) and Buruhani Nyenzi (Tanzania)
REVIEW EDITORS
Carlos M Lòpez Cabrera (Cuba) and Leo A Meyer (Netherlands)
Expert Group: Quality Assurance and Quality Control (QA/QC)
CO-CHAIRS
Kay Abel (Australia) and Michael Gillenwater (USA)
AUTHOR OF BACKGROUND PAPER
Joe Mangino (USA)
CONTRIBUTORS
Sal Emmanuel (IPCC-NGGIP/TSU), Jean-Pierre Fontelle (France), Michael Gytarsky (Russia), Art Jaques
(Canada), Magezi-Akiiki (Uganda), and Joe Mangino (USA)
Contents
8 QUALITY ASSURANCE AND QUALITY CONTROL
8.1 INTRODUCTION
8.2 PRACTICAL CONSIDERATIONS IN DEVELOPING QA/QC SYSTEMS
8.3 ELEMENTS OF A QA/QC SYSTEM
8.4 INVENTORY AGENCY
8.5 QA/QC PLAN
8.6 GENERAL QC PROCEDURES (TIER 1)
8.7 SOURCE CATEGORY-SPECIFIC QC PROCEDURES (TIER 2)
8.7.1 Emissions data QC
8.7.2 Activity data QC
8.7.3 QC of uncertainty estimates
8.8 QA PROCEDURES
8.9 VERIFICATION OF EMISSIONS DATA
8.10 DOCUMENTATION, ARCHIVING AND REPORTING
8.10.1 Internal documentation and archiving
8.10.2 Reporting
REFERENCES
Table
Table 8.1 Tier 1 General Inventory Level QC Procedures
8.1 INTRODUCTION
An important goal of IPCC good practice guidance is to support the development of national greenhouse gas inventories that can be readily assessed in terms of quality and completeness. It is good practice to implement quality assurance and quality control (QA/QC) procedures in the development of national greenhouse gas inventories to accomplish this goal.
This guidance establishes good practice consistent with the Revised 1996 IPCC Guidelines for National Greenhouse Gas Inventories (IPCC Guidelines). The QA/QC good practice guidance outlined here reflects practicality, acceptability, cost-effectiveness, existing experience, and the potential for application on a worldwide basis. A QA/QC programme contributes to the objectives of good practice guidance, namely to improve transparency, consistency, comparability, completeness, and confidence in national inventories of emissions estimates.
The outcomes of the QA/QC process may result in a reassessment of inventory or source category uncertainty estimates. For example, if data quality is found to be lower than previously thought and this situation cannot be rectified in the timeframe of the current inventory, the uncertainty estimates ought to be re-evaluated.
The terms ‘quality control’ and ‘quality assurance’ are often used incorrectly. The definitions of QC and QA in Box 8.1 will be used for the purposes of good practice guidance.
BOX 8.1
DEFINITION OF QA/QC
Quality Control (QC) is a system of routine technical activities, to measure and control the quality
of the inventory as it is being developed. The QC system is designed to:
(i) Provide routine and consistent checks to ensure data integrity, correctness, and
completeness;
(ii) Identify and address errors and omissions;
(iii) Document and archive inventory material and record all QC activities.
QC activities include general methods such as accuracy checks on data acquisition and
calculations and the use of approved standardised procedures for emission calculations,
measurements, estimating uncertainties, archiving information and reporting. Higher tier QC
activities include technical reviews of source categories, activity and emission factor data, and
methods.
Quality Assurance (QA) activities include a planned system of review procedures conducted by
personnel not directly involved in the inventory compilation/development process. Reviews,
preferably by independent third parties, should be performed upon a finalised inventory following
the implementation of QC procedures. Reviews verify that data quality objectives were met,
ensure that the inventory represents the best possible estimates of emissions and sinks given the
current state of scientific knowledge and data available, and support the effectiveness of the QC
programme.
Before implementing QA/QC activities, it is necessary to determine which techniques should be used, and where and when they will be applied. There are technical and practical considerations in making these decisions. The technical considerations related to the various QA/QC techniques are discussed in general in this chapter, and specific applications to source categories are described in the source category-specific good practice guidance in Chapters 2 to 5. The practical considerations involve assessing national circumstances such as available resources and expertise and the particular characteristics of the inventory. The level of QA/QC activities should be compatible with the methods or tiers used to estimate emissions for particular source categories. In addition, resources should be focused on priority areas, such as the key source categories (as described in Chapter 7, Methodological Choice and Recalculation, 7.2, Determining National Key Source Categories) and source categories where changes have occurred in methods or data acquisition since the last inventory compilation.
8.2 PRACTICAL CONSIDERATIONS IN DEVELOPING QA/QC SYSTEMS
Implementing QA/QC procedures requires resources, expertise and time. In developing any QA/QC system, it is expected that judgements will need to be made on the following:
• Resources allocated to QC for different source categories and the compilation process;
• Time allocated to conduct the checks and reviews of emissions estimates;
• Availability and access to information on activity data and emission factors, including data quality;
• Procedures to ensure confidentiality of inventory and source category information, when required;
• Requirements for archiving information;
• Frequency of QA/QC checks on different parts of the inventory;
• The level of QC appropriate for each source category;
• Whether increased effort on QC will result in improved emissions estimates and reduced uncertainties;
• Whether sufficient expertise is available to conduct the checks and reviews.
In practice, the QA/QC system is only part of the inventory development process and inventory agencies do not have unlimited resources. Quality control requirements, improved accuracy and reduced uncertainty need to be balanced against requirements for timeliness and cost effectiveness. A good practice system seeks to achieve that balance and to enable continuous improvement of inventory estimates.
Within the QA/QC system, good practice provides for greater effort for key source categories and for those source categories where data and methodological changes have recently occurred, than for other source categories. It is unlikely that inventory agencies will have sufficient resources to conduct all the QA/QC procedures outlined in this chapter on all source categories. In addition, it is not necessary to conduct all of these procedures every year. For example, data collection processes conducted by national statistical agencies are not likely to change significantly from one year to the next. Once the inventory agency has identified what quality controls are in place, assessed the uncertainty of that data, and documented the details for future inventory reference, it is unnecessary to revisit this aspect of the QC procedure every year. However, it is good practice to check the validity of this information periodically as changes in sample size, methods of collection, or frequency of data collection may occur. The optimal frequency of such checks will depend on national circumstances.
While focusing QA/QC activities on key source categories will lead to the most significant improvements in the overall inventory estimates, it is good practice to plan to conduct at least the general procedures outlined in Section 8.6, General QC Procedures (Tier 1), on all parts of the inventory over a period of time. Some source categories may require more frequent QA/QC than others because of their significance to the total inventory estimates, contribution to trends in emissions over time or changes in data or characteristics of the source category, including the level of uncertainty. For example, if technological advancements occur in an industrial source category, it is good practice to conduct a thorough QC check of the data sources and the compilation process to ensure that the inventory methods remain appropriate.
It is recognised that resource requirements will be higher in the initial stages of implementing any QA/QC system than in later years. As capacity to conduct QA/QC procedures develops in the inventory agency and in other associated organisations, improvements in efficiency should be expected.
General QC procedures outlined in Table 8.1, Tier 1 General Inventory Level QC Procedures, and a peer review of the inventory estimates are considered minimal QA/QC activities for all inventory compilations. The general procedures require no additional expertise in addition to that needed to develop the estimates and compile the inventory and should be performed on estimates developed using Tier 1 or higher tier methods for source categories. A review of the final inventory report by a person not involved in the compilation is also good practice, even if the inventory were compiled using only Tier 1 methods. More extensive QC and more rigorous review processes are encouraged if higher tier methods have been used. Availability of appropriate expertise may limit the degree of independence of expert reviews in some cases. The QA/QC process is intended to ensure transparency and quality.
There may be some inventory items that involve confidential information, as discussed in Chapters 2 to 5. The inventory agency should have procedures in place during a review process to ensure that reviewers respect that confidentiality.
8.3 ELEMENTS OF A QA/QC SYSTEM
The following are the major elements to be considered in the development of a QA/QC system to be implemented in tracking inventory compilation:
• An inventory agency responsible for coordinating QA/QC activities;
• A QA/QC plan;
• General QC procedures (Tier 1);
• Source category-specific QC procedures (Tier 2);
• QA review procedures;
• Reporting, documentation, and archiving procedures.
For purposes of the QA/QC system, the Tier 2 QC approach includes all procedures in Tier 1 plus additional source category-specific activities.
8.4 INVENTORY AGENCY
The inventory agency is responsible for coordinating QA/QC activities for the national inventory. The inventory agency may designate responsibilities for implementing and documenting these QA/QC procedures to other agencies or organisations. The inventory agency should ensure that other organisations involved in the preparation of the inventory are following applicable QA/QC procedures.
The inventory agency is also responsible for ensuring that the QA/QC plan is developed and implemented. It is good practice for the inventory agency to designate a QA/QC coordinator, who would be responsible for ensuring that the objectives of the QA/QC programme are implemented.
8.5 QA/QC PLAN
A QA/QC plan is a fundamental element of a QA/QC system, and it is good practice to develop one. The plan should, in general, outline QA/QC activities that will be implemented, and include a scheduled time frame that follows inventory preparation from its initial development through to final reporting in any year. It should contain an outline of the processes and schedule to review all source categories.
The QA/QC plan is an internal document to organise, plan, and implement QA/QC activities. Once developed, it can be referenced and used in subsequent inventory preparation, or modified as appropriate (i.e. when changes in processes occur or on advice of independent reviewers). This plan should be available for external review.
In developing and implementing the QA/QC plan, it may be useful to refer to the standards and guidelines published by the International Organization for Standardization (ISO), including the ISO 9000 series (see Box 8.2). Although ISO 9000 standards are not specifically designed for emissions inventories, they have been applied by some countries to help organise QA/QC activities.
BOX 8.2
ISO AS A DATA QUALITY MANAGEMENT SYSTEM
The International Organization for Standardization (ISO) series programme provides standards for
data documentation and audits as part of a quality management system. Though the ISO series is
not designed explicitly for emissions data development, many of the principles may be applied to
ensure the production of a quality inventory. Inventory agencies may find these documents useful
source material for developing QA/QC plans for greenhouse gas inventories. Some countries (e.g.
the United Kingdom and the Netherlands) have already applied some elements of the ISO
standards for their inventory development process and data management.
The following standards and guidelines published under the ISO series may supplement source
category-specific QA/QC procedures for inventory development and provide practical guidance
for ensuring data quality and a transparent reporting system.
ISO 9004-1: General quality guidelines to implement a quality system.
ISO 9004-4: Guidelines for implementing continuous quality improvement within the
organisation, using tools and techniques based on data collection and analysis.
ISO 10005: Guidance on how to prepare quality plans for the control of specific projects.
ISO 10011-1: Guidelines for auditing a quality system.
ISO 10011-2: Guidance on the qualification criteria for quality systems auditors.
ISO 10011-3: Guidelines for managing quality system audit programmes.
ISO 10012: Guidelines on calibration systems and statistical controls to ensure that
measurements are made with the intended accuracy.
ISO 10013: Guidelines for developing quality manuals to meet specific needs.
Source: http://www.iso.ch/
8.6 GENERAL QC PROCEDURES (TIER 1)
The focus of general QC techniques is on the processing, handling, documenting, archiving and reporting procedures that are common to all the inventory source categories. Table 8.1, Tier 1 General Inventory Level QC Procedures, lists the general QC checks that the inventory agency should use routinely throughout the preparation of the annual inventory. Most of the checks shown in Table 8.1 could be performed by cross-checks, recalculation, or through visual inspections. The results of these QC activities and procedures should be documented as set out in Section 8.10.1, Internal Documentation and Archiving, below. If checks are performed electronically, these systems should be periodically reviewed to ensure the integrity of the checking function.
It will not be possible to check all aspects of inventory input data, parameters and calculations every year. Checks may be performed on selected sets of data and processes, such that identified key source categories are considered every year. Checks on other source categories may be conducted less frequently. However, a sample of data and calculations from every sector should be included in the QC process each year to ensure that all sectors are addressed on an ongoing basis. In establishing criteria and processes for selecting the sample data sets and processes, it is good practice for the inventory agency to plan to undertake QC checks on all parts of the inventory over an appropriate period of time.
TABLE 8.1: TIER 1 GENERAL INVENTORY LEVEL QC PROCEDURES

QC Activity: Check that assumptions and criteria for the selection of activity data and emission factors are documented.
Procedures:
• Cross-check descriptions of activity data and emission factors with information on source categories and ensure that these are properly recorded and archived.

QC Activity: Check for transcription errors in data input and reference.
Procedures:
• Confirm that bibliographical data references are properly cited in the internal documentation.
• Cross-check a sample of input data from each source category (either measurements or parameters used in calculations) for transcription errors.

QC Activity: Check that emissions are calculated correctly.
Procedures:
• Reproduce a representative sample of emissions calculations.
• Selectively mimic complex model calculations with abbreviated calculations to judge relative accuracy.

QC Activity: Check that parameter and emission units are correctly recorded and that appropriate conversion factors are used.
Procedures:
• Check that units are properly labelled in calculation sheets.
• Check that units are correctly carried through from beginning to end of calculations.
• Check that conversion factors are correct.
• Check that temporal and spatial adjustment factors are used correctly.

QC Activity: Check the integrity of database files.
Procedures:
• Confirm that the appropriate data processing steps are correctly represented in the database.
• Confirm that data relationships are correctly represented in the database.
• Ensure that data fields are properly labelled and have the correct design specifications.
• Ensure that adequate documentation of database and model structure and operation are archived.

QC Activity: Check for consistency in data between source categories.
Procedures:
• Identify parameters (e.g. activity data, constants) that are common to multiple source categories and confirm that there is consistency in the values used for these parameters in the emissions calculations.

QC Activity: Check that the movement of inventory data among processing steps is correct.
Procedures:
• Check that emissions data are correctly aggregated from lower reporting levels to higher reporting levels when preparing summaries.
• Check that emissions data are correctly transcribed between different intermediate products.

QC Activity: Check that uncertainties in emissions and removals are estimated or calculated correctly.
Procedures:
• Check that qualifications of individuals providing expert judgement for uncertainty estimates are appropriate.
• Check that qualifications, assumptions and expert judgements are recorded. Check that calculated uncertainties are complete and calculated correctly.
• If necessary, duplicate error calculations or a small sample of the probability distributions used by Monte Carlo analyses.

QC Activity: Undertake review of internal documentation.
Procedures:
• Check that there is detailed internal documentation to support the estimates and enable duplication of the emission and uncertainty estimates.
• Check that inventory data, supporting data, and inventory records are archived and stored to facilitate detailed review.
• Check integrity of any data archiving arrangements of outside organisations involved in inventory preparation.

QC Activity: Check methodological and data changes resulting in recalculations.
Procedures:
• Check for temporal consistency in time series input data for each source category.
• Check for consistency in the algorithm/method used for calculations throughout the time series.

QC Activity: Undertake completeness checks.
Procedures:
• Confirm that estimates are reported for all source categories and for all years from the appropriate base year to the period of the current inventory.
• Check that known data gaps that result in incomplete source category emissions estimates are documented.

QC Activity: Compare estimates to previous estimates.
Procedures:
• For each source category, current inventory estimates should be compared to previous estimates. If there are significant changes or departures from expected trends, re-check estimates and explain any difference.
The checks in Table 8.1 should be applied irrespective of the type of data used to develop the inventory estimates and are equally applicable to source categories where default values or national data are used as the basis for the estimates.
In some cases, emissions estimates are prepared for the inventory agency by outside consultants or agencies. The inventory agency should ensure that the QC checks listed in Table 8.1, Tier 1 General Inventory Level QC Procedures, are communicated to the consultants/agencies. This will assist in making sure that QC procedures are performed and recorded by the consultant or outside agency. The inventory agency should review these QA/QC activities. In cases where official national statistics are relied upon – primarily for activity data – QC procedures may already have been implemented on these national data. However, it is good practice for the inventory agency to confirm that national statistical agencies have implemented adequate QC procedures equivalent to those in Table 8.1.
Due to the quantity of data that needs to be checked for some source categories, automated checks are encouraged where possible. For example, one of the most common QC activities involves checking that data keyed into a computer database are correct. A QC procedure could be set up to use an automated range check (based on the range of expected values of the input data from the original reference) for the input values as recorded in the database. A combination of manual and automated checks may constitute the most effective procedures in checking large quantities of input data.
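As a hedged illustration of such an automated range check, the sketch below flags keyed-in values that fall outside the expected range taken from the original reference; the field names, expected ranges and records are hypothetical examples, not prescribed by this guidance.

# Minimal sketch of an automated range check on keyed-in inventory data.
# Field names, expected ranges and records are hypothetical examples.

EXPECTED_RANGES = {
    "fuel_consumption_TJ": (0.0, 50000.0),          # from the original statistical reference
    "emission_factor_kg_per_TJ": (50000.0, 80000.0),
}

def range_check(records):
    """Return (record_id, field, value) tuples for values outside the expected range."""
    flagged = []
    for record in records:
        for field, (low, high) in EXPECTED_RANGES.items():
            value = record.get(field)
            if value is not None and not (low <= value <= high):
                flagged.append((record["id"], field, value))
    return flagged

database_rows = [
    {"id": "plant-001", "fuel_consumption_TJ": 1234.0, "emission_factor_kg_per_TJ": 73300.0},
    {"id": "plant-002", "fuel_consumption_TJ": 123400.0, "emission_factor_kg_per_TJ": 73300.0},
]

for record_id, field, value in range_check(database_rows):
    print(f"Manual review needed: {record_id} {field}={value} is outside the expected range")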
8.7 SOURCE CATEGORY-SPECIFIC QC PROCEDURES (TIER 2)
In contrast to general inventory QC techniques, source category-specific QC procedures are directed at specific types of data used in the methods for individual source categories and require knowledge of the emission source category, the types of data available and the parameters associated with emissions.
It is important to note that Tier 2 source category-specific QC activities are in addition to the general QC conducted as part of Tier 1 (i.e. include QC checks listed in Table 8.1). The source category-specific measures are applied on a case-by-case basis focusing on key source categories (see Chapter 7, Methodological Choice and Recalculation) and on source categories where significant methodological and data revisions have taken place. It is good practice that inventory agencies applying higher tier methods in compiling national inventories utilise Tier 2 QC procedures. Specific applications of source category-specific Tier 2 QC procedures are provided in the energy, agriculture, industrial processes and waste chapters of this report (Chapters 2 to 5).
Source category-specific QC activities include the following:
• Emission data QC;
• Activity data QC;
• QC of uncertainty estimates.
The first two activities relate to the types of data used to prepare the emissions estimates for a given source category. QC of uncertainty estimates covers activities associated with determining uncertainties in emissions estimates (for more information on the determination of these uncertainties, see Chapter 6, Quantifying Uncertainties in Practice).
The actual QC procedures that need to be implemented by the inventory agency will depend on the method used to estimate the emissions for a given source category. If estimates are developed by outside agencies, the inventory agency may, upon review, reference the QC activities of the outside agency as part of the QA/QC plan. There is no need to duplicate QC activities if the inventory agency is satisfied that the QC activities performed by the outside agency meet the minimum requirements of the QA/QC plan.
8.7.1 Emissions data QC
The following sections describe QC checks on IPCC default factors, country-specific emission factors, and direct emission measurements from individual sites (used either as the basis for a site-specific emission factor or directly for an emissions estimate). Emission comparison procedures are described in Section 8.7.1.4, Emission Comparisons. Inventory agencies should take into account the practical considerations discussed in Section 8.2, Practical Considerations in Developing QA/QC Systems, when determining what level of QC activities to undertake.
8.7.1.1 IPCC DEFAULT EMISSION FACTORS
Where IPCC default emission factors are used, it is good practice for the inventory agency to assess the applicability of these factors to national circumstances. This assessment may include an evaluation of national conditions compared to the context of the studies upon which the IPCC default factors were based. If there is insufficient information on the context of the IPCC default factors, the inventory agency should take account of this in assessing the uncertainty of the national emissions estimates based on the IPCC default emission factors. For key source categories, inventory agencies should consider options for obtaining emission factors that are known to be representative of national circumstances. The results of this assessment should be documented.
If possible, IPCC default emission factor checks could be supplemented by comparisons with national site or plant-level factors to determine their representativeness relative to actual sources in the country. This supplementary check is good practice even if data are only available for a small percentage of sites or plants.
8.7.1.2 COUNTRY-SPECIFIC EMISSION FACTORS
Country-specific emission factors may be developed at a national or other aggregated level within the country based on prevailing technology, science, local characteristics and other criteria. These factors are not necessarily site-specific, but are used to represent a source category or sub-source category. Two steps are necessary to ensure good practice emission factor QC for country-specific factors.
The first is to perform QC checks on the data used to develop the emission factors. The adequacy of the emission factors and the QA/QC performed during their development should be assessed. If emission factors were developed based on site-specific or source-level testing, then the inventory agency should check if the measurement programme included appropriate QC procedures.
Frequently, country-specific emission factors will be based on secondary data sources, such as published studies or other literature.1 In these cases, the inventory agency could attempt to determine whether the QC activities conducted during the original preparation of the data are consistent with the applicable QC procedures outlined in Table 8.1 and whether any limitations of the secondary data have been identified and documented. The inventory agency could also attempt to establish whether the secondary data have undergone peer review and record the scope of such a review.
If it is determined that the QA/QC associated with the secondary data is adequate, then the inventory agency can simply reference the data source for QC documentation and document the applicability of the data for use in emissions estimates.
If it is determined that the QA/QC associated with the secondary data is inadequate, then the inventory agency should attempt to have QA/QC checks on the secondary data established. It should also reassess the uncertainty of any emissions estimates derived from the secondary data. The inventory agency may also reconsider how the data are used and whether any alternative data (including IPCC default values) may provide a better estimate of emissions from this source category.
Second, country-specific factors and circumstances should be compared with relevant IPCC default factors and the characteristics of the studies on which the default factors are based. The intent of this comparison is to determine whether country-specific factors are reasonable, given similarities or differences between the national source category and the ‘average’ source category represented by the defaults. Large differences between country-specific factors and default factors should be explained and documented.
A supplementary step is to compare the country-specific factors with site-specific or plant-level factors if these are available. For example, if there are emission factors available for a few plants (but not enough to support a bottom-up approach) these plant-specific factors could be compared with the aggregated factor used in the inventory. This type of comparison provides an indication of both the reasonableness of the country-specific factor and its representativeness.
8.7.1.3 DIRECT EMISSION MEASUREMENTS
Emissions from a source category may be estimated using direct measurements in the following ways:
• Sample emissions measurements from a facility may be used to develop a representative emission factor for that individual site, or for the entire category (i.e. for development of a national level emission factor);
• Continuous emissions monitoring (CEM) data may be used to compile an annual estimate of emissions for a particular process. In theory, CEM can provide a complete set of quantified emissions data across the inventory period for an individual facility process, and does not have to be correlated back to a process parameter or input variable like an emission factor.
Regardless of how direct measurement data are being used, the inventory agency should review the processes and check the measurements as part of the QC activities.
Use of standard measurement methods improves the consistency of resulting data and knowledge of the statistical properties of the data. If standard reference methods for measuring specific greenhouse gas emissions (and removals) are available, inventory agencies should encourage plants to use these. If specific standard methods are not available, the inventory agency should confirm whether nationally or internationally recognised standard methods such as ISO 10012 are used for measurements and whether the measurement equipment is calibrated and maintained properly.
For example, ISO has published standards that specify procedures to quantify some of the performance characteristics of all air quality measurement methods such as bias, calibration, instability, lower detection limits, sensitivity, and upper limits of measurement (ISO, 1994). While these standards are not associated with a reference method for a specific greenhouse gas source category, they have direct application to QC activities associated with estimations based on measured emission values.
1 Secondary data sources refer to reference sources for inventory data that are not designed for the express purpose of inventory development. Secondary data sources typically include national statistical databases, scientific literature, and other studies produced by agencies or organisations not associated with the inventory development.
Where direct measurement data from individual sites are in question, discussions with site managers can be useful to encourage improvement of the QA/QC practices at the sites. Also, supplementary QC activities are encouraged for bottom-up methods based on site-specific emission factors where significant uncertainty remains in the estimates. Site-specific factors can be compared between sites and also to IPCC or national level defaults. Significant differences between sites or between a particular site and the IPCC defaults should elicit further review and checks on calculations. Large differences should be explained and documented.
8.7.1.4 EMISSION COMPARISONS
It is standard QC practice to compare emissions from each source category with emissions previously provided from the same source category or against historical trends and reference calculations as described below. The objective of these comparisons (often referred to as ‘reality checks’) is to ensure that the emission values are not wildly improbable and that they fall within a range that is considered reasonable. If the estimates seem unreasonable, emission checks can lead to a re-evaluation of emission factors and activity data before the inventory process has advanced to its final stages.
The first step of an emissions comparison is a consistency and completeness check using available historical inventory data for multiple years. The emission levels of most source categories do not abruptly change from year to year, as changes in both activity data and emission factors are generally gradual. In most circumstances, the change in emissions will be less than 10% per year. Thus, significant changes in emissions from previous years may indicate possible input or calculation errors. After calculating differences, the larger percentage differences (in any direction) should be flagged, by visual inspection of the list, by visual inspection of the graphical presentation of differences (e.g. in a spreadsheet) or by using a dedicated software programme that puts flags and rankings in the list of differences.
It is good practice to also check the annual increase or decrease of changes in emissions levels in significant sub-source categories of some source categories. Sub-source categories may show greater percentage changes than the aggregated source categories. For example, total emissions from petrol cars are not likely to change substantially on an annual basis, but emissions from sub-source categories, such as catalyst-equipped petrol cars, may show substantial changes if the market share is not in equilibrium or if the technology is changing and rapidly being adopted in the marketplace.
It is good practice to check the emissions estimates for all source categories or sub-source categories that show greater than 10% change in a year compared to the previous year’s inventory. Source categories and sub-source categories should be ranked according to the percentage difference in emissions from the previous year. Supplementary emission comparisons can also be performed, if appropriate, including order-of-magnitude checks and reference calculations.
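As a hedged illustration of this year-on-year comparison, the sketch below computes percentage differences, flags categories that change by more than 10% and ranks them by the size of the change; the source category names and emission values are hypothetical examples, while the 10% threshold follows the text above.

# Hedged sketch of the year-on-year emission comparison; the category names
# and emission values (Gg CO2-equivalent) are hypothetical examples.

THRESHOLD_PERCENT = 10.0  # flag changes greater than 10% from the previous year

previous_year = {"Road transport": 42000.0, "Nitric acid production": 950.0, "Manure management": 3100.0}
current_year = {"Road transport": 43100.0, "Nitric acid production": 1240.0, "Manure management": 2600.0}

differences = []
for category, previous in previous_year.items():
    current = current_year[category]
    percent_change = 100.0 * (current - previous) / previous
    differences.append((category, percent_change))

# Rank by the magnitude of the percentage difference and flag large changes.
for category, percent_change in sorted(differences, key=lambda item: abs(item[1]), reverse=True):
    flag = "CHECK" if abs(percent_change) > THRESHOLD_PERCENT else "ok"
    print(f"{flag:5s} {category}: {percent_change:+.1f}%")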
ORDER-OF-MAGNITUDE CHECKS
Order-of-magnitude checks look for major calculation errors and exclusion of major source categories or sub-source categories. Method-based comparisons may be made depending on whether the emissions for the source category were determined using a top-down or bottom-up approach. For example, if N2O estimates for nitric acid production were determined using a bottom-up approach (i.e. emissions estimates were determined for each individual production plant based on plant-specific data), the emissions check would consist of comparing the sum of the individual plant-level emissions to a top-down emission estimate based on national nitric acid production figures and IPCC default Tier 1 factors. If significant differences are found in the comparison, further investigation using the source category-specific QC techniques described in Section 8.7, Source Category-Specific QC Procedures (Tier 2), would be necessary to answer the following questions:
• Are there inaccuracies associated with any of the individual plant estimates (e.g. an extreme outlier may be accounting for an unreasonable quantity of emissions)?
• Are the plant-specific emission factors significantly different from each other?
• Are the plant-specific production rates consistent with published national level production rates?
• Is there any other explanation for a significant difference, such as the effect of controls, the manner in which production is reported or possibly undocumented assumptions?
This is an example of how the result of a relatively simple emission check can lead to a more intensive investigation of the representativeness of the emissions data. Knowledge of the source category is required to isolate the parameter that is causing the difference in emissions estimates and to understand the reasons for the difference.
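As a hedged illustration of such an order-of-magnitude check for the nitric acid example, the sketch below compares the sum of plant-level estimates against a top-down figure built from national production and a default factor; the plant data, national production figure and the default factor value are hypothetical stand-ins, and the criterion used to flag a "significant difference" is an assumption, not a rule from this guidance.

# Hedged sketch of an order-of-magnitude (bottom-up vs top-down) check for N2O
# from nitric acid production. All figures below are hypothetical examples.

plants = {                       # plant: (production in kt HNO3, emission factor in kg N2O/t HNO3)
    "plant A": (220.0, 7.5),
    "plant B": (180.0, 6.8),
    "plant C": (300.0, 8.2),
}
national_production_kt = 720.0   # from national statistics (hypothetical)
default_factor_kg_per_t = 8.0    # stand-in for an IPCC default Tier 1 factor

# Bottom-up: sum of plant-level estimates; top-down: national production x default factor.
bottom_up_t = sum(production * 1000.0 * factor / 1000.0 for production, factor in plants.values())
top_down_t = national_production_kt * 1000.0 * default_factor_kg_per_t / 1000.0

ratio = bottom_up_t / top_down_t
print(f"Bottom-up: {bottom_up_t:.0f} t N2O, top-down: {top_down_t:.0f} t N2O, ratio {ratio:.2f}")
if not 0.5 <= ratio <= 2.0:      # assumed criterion for flagging a significant difference
    print("Significant difference: investigate plant estimates and production figures.")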
REFERENCE CALCULATIONS
Another emission comparison may be used for source categories that rely on empirical formulas for the calculation of emissions. Where such formulas are used, final calculated emission levels should follow stoichiometric ratios and conserve energy and mass. In a number of cases where emissions are calculated as the sum of sectoral activities based on the consumption of a specific commodity (e.g. fuels or products like HFCs, PFCs or SF6), the emissions could alternatively be estimated using apparent consumption figures: national total production + imports – exports ± stock changes. For CO2 from fossil fuel combustion, a reference calculation based on apparent fuel consumption per fuel type is mandatory according to the IPCC Guidelines. Another example is estimating emissions from manure management. The total quantity of methane produced should not exceed the quantity that could be expected based on the carbon content of the volatile solids in the manure. Discrepancies between inventory data and reference calculations do not necessarily imply that the inventory data are in error. It is important to consider that there may be large uncertainties associated with the reference calculations themselves when analysing discrepancies.
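As a hedged sketch of such a reference (apparent consumption) calculation for CO2 from a single fuel, the figures, carbon content and oxidation fraction below are hypothetical placeholders rather than IPCC reference values.

# Hedged sketch of a reference (apparent consumption) calculation for CO2 from
# one fuel type. All numeric values are hypothetical placeholders, not defaults.

def apparent_consumption(production, imports, exports, stock_increase):
    """Apparent consumption = production + imports - exports - stock increase (all in TJ)."""
    return production + imports - exports - stock_increase

def reference_co2_gg(consumption_tj, carbon_content_t_c_per_tj, fraction_oxidised=0.99):
    """Reference CO2 estimate (Gg) from apparent consumption of one fuel type."""
    carbon_gg = consumption_tj * carbon_content_t_c_per_tj / 1000.0   # t C -> Gg C
    return carbon_gg * fraction_oxidised * 44.0 / 12.0                # Gg C -> Gg CO2

consumption = apparent_consumption(production=900000.0, imports=250000.0,
                                   exports=150000.0, stock_increase=20000.0)
reference = reference_co2_gg(consumption, carbon_content_t_c_per_tj=20.0)
print(f"Apparent consumption: {consumption:.0f} TJ, reference CO2: {reference:.0f} Gg")
# Compare this figure with the sum of the sectoral (bottom-up) estimates; large
# discrepancies prompt further checks but do not by themselves prove an error.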
8.7.2 Activity data QC
The estimation methods for many source categories rely on the use of activity data and associated input variables that are not directly prepared by the inventory agency. Activity data is normally collated at a national level using secondary data sources or from site-specific data prepared by site or plant personnel from their own measurements. Inventory agencies should take into account the practical considerations discussed above when determining the level of QC activities to undertake.
8.7.2.1 NATIONAL LEVEL ACTIVITY DATA
Where national activity data from secondary data sources are used in the inventory, it is good practice for the inventory agency or its designees to evaluate and document the associated QA/QC activities. This is particularly important with regard to activity data, since most activity data are originally prepared for purposes other than as input to estimates of greenhouse gas emissions. Though not always readily available, many statistical organisations, for example, have their own procedures for assessing the quality of the data independently of what the end use of the data may be. If it is determined that these procedures satisfy minimum activities listed in the QA/QC plan, the inventory agency can simply reference the QA/QC activities conducted by the statistical organisation.
It is good practice for the inventory agency to determine if the level of QC associated with secondary
activity
data includes those QC procedures listed in Table 8.1. In addition, the inventory agency may establish
whether
the secondary data have been peer reviewed and record the scope of this review. If it is determined that the
QA/QC associated with the secondary data is adequate, then the inventory agency can simply reference the
data
source and document the applicability of the data for use in its emissions estimates.
If it is determined that the QC associated with the secondary data is inadequate, then the inventory agency
should
attempt to have QA/QC checks on the secondary data established. It should also reassess the uncertainty of
emissions estimates in light of the findings from its assessment of the QA/QC associated with secondary
data.
The inventory agency should also reconsider how the data are used and whether any alternative data,
including
IPCC default values and international data sets, may provide for a better estimate of emissions. If no
alternative
data sources are available, the inventory agency should document the inadequacies associated with the
secondary
data QC as part of its summary report on QA/QC (see Section 8.10.2, Reporting, for reporting guidance).
For example, in the transportation category, countries typically use either fuel usage or kilometre (km) statistics to develop emissions estimates. The national statistics on fuel usage and kilometres travelled by vehicles are usually prepared by an agency other than the inventory agency. However, it is the responsibility of the inventory agency to determine what QA/QC activities were implemented by the agency that prepared the original fuel usage and km statistics for vehicles. Questions that may be asked in this context are:
• Does the statistical agency have a QA/QC plan that covers the preparation of the data?
• What sampling protocol was used to estimate fuel usage or kilometres travelled?
• How recently was the sampling protocol reviewed?
• Has any potential bias in the data been identified by the statistical agency?
• Has the statistical agency identified and documented uncertainties in the data?
• Has the statistical agency identified and documented errors in the data?
National level activity data should be compared with previous years' data for the source category being evaluated. Activity data for most source categories tend to exhibit relatively consistent changes from year to year without sharp increases or decreases. If the national activity data for any year diverge greatly from the historical trend, the activity data should be checked for errors. If the general mathematical checks do not reveal errors, the characteristics of the source category could be investigated and any change identified and documented.
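A time-series check of this sort is easy to automate. The sketch below, with invented activity data and an arbitrary 15% threshold, flags years whose activity diverges sharply from the preceding year so that the divergence can be either corrected or documented.

```python
# Simple year-on-year activity data QC check (hypothetical data and threshold).

activity = {  # e.g. national livestock population, thousands of head (assumed)
    1995: 13_200, 1996: 13_450, 1997: 13_600, 1998: 13_550,
    1999: 13_700, 2000: 17_900,   # suspicious jump
}
THRESHOLD = 0.15  # flag changes larger than 15% year on year

years = sorted(activity)
for prev, curr in zip(years, years[1:]):
    change = (activity[curr] - activity[prev]) / activity[prev]
    if abs(change) > THRESHOLD:
        print(f"{curr}: {change:+.1%} vs {prev} - check for errors or a real, documentable change")
```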
Where possible, a comparison check of activity data from multiple reference sources should be undertaken. This is important for source categories that have a high level of uncertainty associated with their estimates. For example, many of the agricultural source categories rely on government statistics for activity data such as livestock populations, areas under cultivation, and the extent of prescribed burning. Similar statistics may be prepared by industry, universities, or other organisations and can be used to compare with standard reference sources. As part of the QC check, the inventory agency should ascertain whether independent data have been used to derive alternative activity data sets. In some cases, the same data are treated differently by different agencies to meet varying needs. Comparisons may need to be made at a regional level or with a subset of the national data, since many alternative references for such activity data have limited scope and do not cover the entire nation.
8.7.2.2 SITE-SPECIFIC ACTIVITY DATA
Some methods rely on the use of site-specific activity data in conjunction with IPCC default or country-specific emission factors. Site or plant personnel typically prepare these estimates of activity, often for purposes other than as inputs to emissions inventories. QC checks should focus on inconsistencies between sites, to establish whether these reflect errors, different measurement techniques, or real differences in emissions, operating conditions or technology.
A variety of QC checks can be used to identify errors in site-level activity data. The inventory agency should establish whether recognised national or international standards were used in measuring activity data at the individual sites. If measurements were made according to recognised national or international standards and a QA/QC process is in place, the inventory agency should satisfy itself that the QA/QC process at the site is acceptable under the inventory QA/QC plan and at least includes Tier 1 activities. Acceptable QC procedures in use at the site may be directly referenced. If the measurements were not made using standard methods and QA/QC is not of an acceptable standard, then the use of these activity data should be carefully evaluated, uncertainty estimates reconsidered, and qualifications documented.
Comparisons of activity data from different reference sources may also be used to expand the activity data QC. For example, in estimating PFC emissions from primary aluminium smelting, many inventory agencies use smelter-specific activity data to prepare the inventory estimates. A QC check of the aggregated activity data from all aluminium smelters can be made against national production statistics for the industry. Production data can also be compared across different sites, possibly with adjustments made for plant capacities, to evaluate the reasonableness of the production data. Similar comparisons of activity data can be made for other manufacturing-based source categories where there are published data on national production. If outliers are identified, they should be investigated to determine whether the difference can be explained by the unique characteristics of the site or whether there is an error in the reported activity.
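The aggregation and outlier screening described in this paragraph might look like the following sketch; the smelter production figures, the national statistic and the 50% screening rule are hypothetical.

```python
# Cross-check of site-specific production data against a national statistic
# (all figures invented for illustration).

smelter_production_t = {"Smelter 1": 210_000, "Smelter 2": 198_000, "Smelter 3": 95_000}
national_statistic_t = 515_000  # assumed published national production

total = sum(smelter_production_t.values())
gap = (total - national_statistic_t) / national_statistic_t
print(f"Sum of plants: {total:,} t vs national statistic {national_statistic_t:,} t ({gap:+.1%})")

# Crude outlier screen across sites; adjust for plant capacity where data allow.
mean = total / len(smelter_production_t)
for name, prod in smelter_production_t.items():
    if abs(prod - mean) / mean > 0.5:   # arbitrary screening rule, not from the Guidance
        print(f"{name} deviates strongly from the mean - explain (capacity?) or correct")
```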
Site-specific activity data checks may also be applied to methods based on product usage. For example, one method for estimating SF6 emissions from use in electrical equipment relies on an account balance of gas purchases, gas sales for recycling, the amount of gas stored on site (outside of equipment), handling losses, refills for maintenance, and the total holding capacity of the equipment system. This account balance system should be used at each facility where the equipment is in place. A QC check of overall national activity could be made by performing the same kind of account balancing procedure on a national basis. This national account balancing would consider national sales of SF6 for use in electrical equipment, the nation-wide increase in the total handling capacity of the equipment (which may be obtained from equipment manufacturers), and the quantity of SF6 destroyed in the country. The results of the bottom-up and top-down account balancing analyses should agree, or large differences should be explained. Similar accounting techniques can be used as QC checks on other categories based on gas usage (e.g. substitutes for ozone-depleting substances) to check consumption and emissions.
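A simplified illustration of the bottom-up versus top-down account balance check is sketched below. The facility records, national totals and the 10% agreement criterion are invented for the example, and the balance terms are reduced to a minimum.

```python
# Bottom-up vs top-down SF6 account balance check (illustrative figures in kg).

def facility_emissions(purchases, sales_for_recycling, storage_change, capacity_change):
    """Emissions = gas acquired minus gas returned, stored, or newly banked in equipment."""
    return purchases - sales_for_recycling - storage_change - capacity_change

facilities = [
    dict(purchases=1_500, sales_for_recycling=150, storage_change=50, capacity_change=900),
    dict(purchases=900,   sales_for_recycling=100, storage_change=-20, capacity_change=580),
]
bottom_up = sum(facility_emissions(**f) for f in facilities)

# Top-down national balance: sales to users, net growth of the installed bank,
# and gas destroyed (assumed national totals).
national_sales, national_capacity_growth, destroyed = 2_100, 1_400, 60
top_down = national_sales - national_capacity_growth - destroyed

gap = abs(bottom_up - top_down) / abs(top_down)
status = "agree within 10%" if gap <= 0.10 else "differ - the discrepancy should be explained"
print(f"Bottom-up: {bottom_up} kg, top-down: {top_down} kg ({gap:.0%}); results {status}")
```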
8.7.3 QC of uncertainty estimates
QC should also be undertaken on calculations or estimates of uncertainty associated with emissions estimates. Good practice for estimating inventory uncertainties is described in Chapter 6, Quantifying Uncertainties in Practice, and relies on calculations of uncertainty at the source category level that are then combined to summary levels for the entire inventory. Some of the methods rely on the use of measured data associated with the emission factors or activity data to develop probability density functions from which uncertainty estimates can be made. In the absence of measured data, many uncertainty estimates will rely on expert judgement.
It is good practice for QC procedures to be applied to the uncertainty estimations to confirm that calculations are correct and that there is sufficient documentation to duplicate them. The assumptions on which uncertainty estimations have been based should be documented for each source category. Calculations of source category-specific and aggregated uncertainty estimates should be checked and any errors addressed. For uncertainty estimates involving expert judgement, the qualifications of experts should also be checked and documented, as should the process of eliciting expert judgement, including information on the data considered, literature references, assumptions made and scenarios considered. Chapter 6 contains advice on how to document expert judgements on uncertainties.
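One routine QC step is to independently recompute an aggregated uncertainty and confirm that it matches the reported value. The sketch below combines assumed source-category uncertainties for quantities that are summed, along the lines of the Tier 1 error propagation approach described in Chapter 6; the emissions and percentage uncertainties are illustrative only.

```python
import math

# QC re-calculation of an aggregated uncertainty for summed quantities
# (invented emissions in Gg CO2-eq and fractional uncertainties).

source_categories = [
    ("Energy - fuel combustion", 350_000.0, 0.05),
    ("Enteric fermentation",      45_000.0, 0.30),
    ("Solid waste disposal",      20_000.0, 0.60),
]

total = sum(e for _, e, _ in source_categories)
combined_u = math.sqrt(sum((e * u) ** 2 for _, e, u in source_categories)) / total
print(f"Total: {total:,.0f} Gg CO2-eq, combined uncertainty ~{combined_u:.1%}")
# The QC step is to repeat this calculation independently, confirm it matches the
# value reported in the inventory, and check that the inputs are documented.
```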
8.8 QA PROCEDURES
Good practice for QA procedures requires an objective review to assess the quality of the inventory and to identify areas where improvements could be made. The inventory may be reviewed as a whole or in parts. QA procedures are utilised in addition to the Tier 1 and Tier 2 QC. The objective in QA implementation is to involve reviewers that can conduct an unbiased review of the inventory. It is good practice to use QA reviewers who have not been involved in preparing the inventory. Preferably these reviewers would be independent experts from other agencies, or a national or international expert or group not closely connected with national inventory compilation. Where third party reviewers outside the inventory agency are not available, staff from another part of the inventory agency not involved in the portion of the inventory being reviewed can also fulfil QA roles.
It is good practice for inventory agencies to conduct a basic expert peer review (Tier 1 QA) prior to inventory submission in order to identify potential problems and make corrections where possible. It is also good practice to apply this review to all source categories in the inventory; however, this will not always be practical due to timing and resource constraints. Key source categories should be given priority, as should source categories where significant changes in methods or data have been made. Inventory agencies may also choose to perform more extensive peer reviews, audits, or both as additional (Tier 2) QA procedures within the available resources.
More specific information on QA procedures related to individual source categories is provided in the source category-specific QA/QC sections in Chapters 2 to 5.
EXPERT PEER REVIEW
Expert peer review consists of a review of calculations or assumptions by experts in relevant technical fields. This procedure is generally accomplished by reviewing the documentation associated with the methods and results, but usually does not include the rigorous certification of data or references that might be undertaken in an audit. The objective of the expert peer review is to ensure that the inventory's results, assumptions, and methods are reasonable as judged by those knowledgeable in the specific field. Expert review processes may involve technical experts and, where a country has formal stakeholder and public review mechanisms in place, these reviews can supplement but not replace expert peer review.
There are no standard tools or mechanisms for expert peer review, and its use should be considered on a case-by-case basis. If there is a high level of uncertainty associated with an emission estimate for a source category, expert peer review may provide information to improve the estimate, or at least to better quantify the uncertainty.
Expert reviews may be conducted on all parts of a source category. For example, if the activity data estimates from oil and natural gas production are to be reviewed but not the emission factors, experts in the oil and gas industry could be involved in the review to provide industry expertise even if they do not have direct experience in greenhouse gas emissions estimation. Effective peer reviews often involve identifying and contacting key industrial trade organisations associated with specific source categories. It is preferable for this expert input to be sought early in the inventory development process so that the experts can participate from the start. It is good practice to involve relevant experts in the development and review of methods and data acquisition.
The results of expert peer review, and the response of the inventory agency to those findings, may be important to widespread acceptance of the final inventory. All expert peer reviews should be well documented, preferably in a report or checklist format that shows the findings and recommendations for improvement.
AUDITS
For the purpose of good practice in inventory preparation, audits may be used to evaluate how effectively the inventory agency complies with the minimum QC specifications outlined in the QC plan. It is important that the auditor be as independent of the inventory agency as possible, so as to provide an objective assessment of the processes and data evaluated. Audits may be conducted during the preparation of an inventory, following inventory preparation, or on a previous inventory. Audits are especially useful when new emission estimation methods are adopted, or when there are substantial changes to existing methods. It is desirable for the inventory agency to develop a schedule of audits at strategic points in the inventory development. For example, audits related to initial data collection, measurement work, transcription, calculation and documentation may be conducted. Audits can be used to verify that the QC steps identified in Table 8.1 have been implemented and that source category-specific QC procedures have been implemented according to the QC plan.
8.9 VERIFICATION OF EMISSIONS DATA
Options for inventory verification processes are described in Annex 2, Verification. Verification techniques can be applied during inventory development as well as after the inventory is compiled.
Comparisons with other independently compiled national emissions data (if available) are a quick option to evaluate completeness, approximate emission levels and correct source category allocations. These comparisons can be made for different greenhouse gases at national, sectoral, source category, and sub-source category levels, as far as differences in definitions allow.
Although the inventory agency is ultimately responsible for the compilation and submission of the national greenhouse gas inventory, other independent publications on this subject may be available (e.g. from scientific literature or other institutes or agencies). These documents may provide the means for comparisons with other national estimates.
The verification process can help evaluate the uncertainty in emissions estimates, taking into account the quality and context of both the original inventory data and the data used for verification purposes. Where verification techniques are used, they should be reflected in the QA/QC plan. Improvements resulting from verification should be documented, as should detailed results of the verification process.
8.10 DOCUMENTATION, ARCHIVING AND REPORTING
8.10.1 Internal documentation and archiving
As part of general QC procedures, it is good practice to document and archive all information required to produce the national emissions inventory estimates. This includes:
• Assumptions and criteria for selection of activity data and emission factors;
• Emission factors used, including references to the IPCC document for default factors or to published references or other documentation for emission factors used in higher tier methods;
• Activity data or sufficient information to enable activity data to be traced to the referenced source;
• Information on the uncertainty associated with activity data and emission factors;
• Rationale for choice of methods;
• Methods used, including those used to estimate uncertainty;
• Changes in data inputs or methods from previous years;
• Identification of individuals providing expert judgement for uncertainty estimates and their qualifications to do so;
• Details of electronic databases or software used in production of the inventory, including versions, operating manuals, hardware requirements and any other information required to enable their later use;
• Worksheets and interim calculations for source category estimates and aggregated estimates and any recalculations of previous estimates;
• Final inventory report and any analysis of trends from previous years;
• QA/QC plans and outcomes of QA/QC procedures.
It is good practice for inventory agencies to maintain this documentation for every annual inventory produced and to provide it for review. It is good practice to maintain and archive this documentation in such a way that every inventory estimate can be fully documented and reproduced if necessary. Inventory agencies should ensure that records are unambiguous; for example, a reference to 'IPCC default factor' is not sufficient. A full reference to the particular document (e.g. Revised 1996 IPCC Guidelines for National Greenhouse Gas Inventories) is necessary in order to identify the source of the emission factor, because there may have been several updates of default factors as new information has become available.
Records of QA/QC procedures are important information for enabling continuous improvement of inventory estimates. It is good practice for records of QA/QC activities to include the checks, audits and reviews that were performed, when they were performed, who performed them, and the corrections and modifications to the inventory resulting from the QA/QC activity.
8.10.2 Reporting
It is good practice to report a summary of implemented QA/QC activities and key findings as a supplement to each country's national inventory. However, it is not practical or necessary to report all the internal documentation that is retained by the inventory agency. The summary should describe which activities were performed internally and what external reviews were conducted for each source category and on the entire inventory, in accordance with the QA/QC plan. The key findings should describe major issues regarding the quality of input data, methods, processing, or archiving, and show how they were addressed or are planned to be addressed in the future.
REFERENCES
Intergovernmental Panel on Climate Change (IPCC) (1997). Revised 1996 IPCC Guidelines for National Greenhouse Gas Inventories: Volumes 1, 2 and 3. J.T. Houghton et al., IPCC/OECD/IEA, Paris, France.
International Organization for Standardization (ISO) (1994). Air Quality, Determination of Performance Characteristics of Measurement Methods. ISO 9196:1994. ISO, Geneva, Switzerland.
http://www.ipcc-nggip.iges.or.jp/public/gp/english/8_QA-QC.pdf
9 QUALITY ASSURANCE AND QUALITY CONTROL
9.1 Introduction
The goal of quality assurance and quality control (QA/QC) is to identify and implement sampling and analytical methodologies which limit the introduction of error into analytical data. For MARSSIM data collection and evaluation, a system is needed to ensure that radiation surveys produce results that are of the type and quality needed and expected for their intended use. A quality system is a management system that describes the elements necessary to plan, implement, and assess the effectiveness of QA/QC activities. This system establishes many functions, including: quality management policies and guidelines for the development of organization- and project-specific quality plans; criteria and guidelines for assessing data quality; assessments to ascertain effectiveness of QA/QC implementation; and training programs related to QA/QC implementation. A quality system ensures that MARSSIM decisions will be supported by sufficient data of adequate quality and usability for their intended purpose, and further ensures that such data are authentic, appropriately documented, and technically defensible.

Any organization collecting and evaluating data for a particular program must be concerned with the quality of results. The organization must have results that: meet a well-defined need, use, or purpose; comply with program requirements; and reflect consideration of cost and economics. To meet the objective, the organization should control the technical, administrative, and human factors affecting the quality of results. Control should be oriented toward the appraisal, reduction, elimination, and prevention of deficiencies that affect quality.
Quality systems already exist for many organizations involved in the use of radioactive materials. There are self-imposed internal quality management systems (e.g., DOE) or there are systems required by regulation by another entity (e.g., NRC) which require a quality system as a condition of the operating license. These systems are typically called Quality Assurance Programs.¹ An organization may also obtain services from another organization that already has a quality system in place. When developing an organization-specific quality system, there is no need to develop new quality management systems, to the extent that a facility's current Quality Assurance Program can be used. Standard ANSI/ASQC E4-1994 (ASQC 1995) provides national consensus quality standards for environmental programs. It addresses both quality systems and the collection and evaluation of environmental data. Annex B of ANSI/ASQC E4-1994 (ASQC 1995) and Appendix K of MARSSIM illustrate how existing quality system documents compare with organization- and project-specific environmental quality system documents.

¹ Numerous quality assurance and quality control (QA/QC) requirements and guidance documents have been applied to environmental programs. Until now, each Federal agency has developed or chosen QA/QC requirements to fit its particular mission and needs. Some of these requirements include DOE Order 5700.6c (DOE 1991c); EPA QA/R-2 (EPA 1994f); EPA QA/R-5 (EPA 1994c); 10 CFR 50, App. B; NUREG-1293, Rev. 1 (NRC 1991); Reg Guide 4.15 (NRC 1979); and MIL-Q-9858A (DOD 1963). In addition, there are several consensus standards for QA/QC, including ASME NQA-1 (ASME 1989) and the ISO 9000/ASQC Q9000 series (ISO 1987). ANSI/ASQC E4-1994 (ASQC 1995) is a consensus standard specifically for environmental data collection.
Table 9.1 illustrates elements of a quality system as they relate to the Data Life Cycle. Applying a quality system to a project is typically done in three phases as described in Section 2.3: 1) the planning phase, where the Data Quality Objectives (DQOs) are developed following the process described in Appendix D and documented in the Quality Assurance Project Plan (QAPP); 2) the implementation phase, involving the collection of environmental data in accordance with approved procedures and protocols; and 3) the assessment phase, including the verification and validation of survey results as discussed in Section 9.3 and the evaluation of the environmental data using Data Quality Assessment (DQA) as discussed in Section 8.2 and Appendix E. Detailed guidance on quality systems is not provided in MARSSIM because a quality system should be in place and functioning prior to beginning environmental data collection activities.

Table 9.1 The Elements of a Quality System Related to the Data Life Cycle

Data Life Cycle    Quality System Elements
Planning           Data Quality Objectives (DQOs); Quality Assurance Project Plans (QAPPs); Standard Operating Procedures (SOPs)
Implementation     QAPPs; SOPs; Data collection; Assessments and audits
Assessment         Data validation and verification; Data Quality Assessment (DQA)
http://www.epa.gov/rpdweb00/marssim/docs/revision1_August_2002corrections/chapter9.pdf
38 Quality Assurance and Reliability

Includes approaches to, and methods for, reliability analysis and control, quality control, inspection, maintainability, and standardization.

Definition

Quality Assurance – A system of activities whose purpose is to provide assurance and show evidence that the overall quality control task is in fact being done effectively. The system involves a continuing evaluation of the adequacy and effectiveness of the overall quality control program with a view to having corrective measures initiated where necessary. AGARD Multilingual Aeronautical Dictionary, 1980.

Reliability – Of a piece of equipment or a system, the probability of specified performance for a given period of time when used in the specified manner. NASA Thesaurus, Washington, DC: National Aeronautics and Space Administration. Dictionary of Technical Terms for Aerospace Use. Wm. H. Allen, ed., 1965. NASA SP-7.

NASA Interest

Exhaustive Interest: Quality control, quality assurance, and reliability theories, procedures, and practices specifically applicable to aircraft, space vehicles, launch vehicles, supporting facilities, other aerospace applications, and related equipment.

Selective Interest: Quality control, quality assurance, and reliability theories, procedures, and practices specifically concerned with developments and techniques for nonaerospace oriented activities that may be unusual or of use within the aerospace effort.

Input Subjects of Specific Interest
• accelerated life testing
• clean rooms (general)
• environmental test facilities
• environmental testing
• failure rates
• fault detection (quality control)
• inspection
• inspection methods
• life prediction
• life testing
• maintainability (procedures and theory)
• nondestructive testing
• quality assurance
• quality control
• radiography (quality control)
• redundancy systems
• reliability (procedures and theory)
• reliability criteria
• sampling techniques (quality control)
• service life
• shock testing (quality control)
• ultrasonic testing (quality control)
http://www.sti.nasa.gov/sscg/38.html
Diagnostic X-Ray Imaging Quality Assurance (QA) and Quality Control (QC)

Diagnostic X-Ray Imaging Quality Assurance (QA): a program used by the caregiver management to retain the best possible diagnostic image quality with the least risk and suffering to patients. Included under the program are quality control tests at regular intervals, measures for preventive maintenance, administrative procedures and training. It also includes continuous evaluation of the competence of the imaging service and the means to start remedial action. The main objective of a diagnostic quality assurance program is to guarantee the continual provision of quick and precise diagnosis of patients. This objective is effectively fulfilled by putting in place a program having three secondary goals, as follows: (i) maintenance of diagnostic image quality; (ii) minimization of radiation exposure to patients and staff; and (iii) cost effectiveness. (Health Canada, 2006)
Diagnostic X-Ray Imaging Quality Control (QC): Under this program, sequences of standardised tests are conducted to find out modifications or alterations in X-ray equipment functionality from its original level of performance. Such testing, when carried out on a routine basis, permits immediate corrective action to retain the quality of the X-ray image. However, it is important to bear in mind that the doctor in charge of the X-ray facility carries the ultimate responsibility for quality control and compliance with the regulatory body. (Health Canada, 2006) A total diagnostic X-ray imaging QC program consists of six different constituents. These comprise radiation exposure monitoring, radiographic unit monitoring, sensitometry and darkroom monitoring, the application of technique charts, the evaluation of repeat rates, and continuing education.

Radiation Exposure Monitoring: As a safety measure, all radiology departments must operate a system for monitoring the cumulative occupational exposure of staff working with ionizing radiation. Every employee must be provided with thermoluminescent dosimeters or film badges, and monthly dosages should be posted on a bulletin board. Apart from this, a lot of departments also scrutinize patient exposure to radiation by simple methods, as a patient advisement device as well as a legal precaution. For instance, the total fluoroscopy exposure accumulated during a fluoroscopic examination can be obtained from the fluoro times recorded in the patient record. The number of exposures obtained for each method can even be recorded in the patient database. (Carrol, n. d.)
Radiographic Unit Monitoring: The majority of the quality control and radiation measurement checks for radiographic apparatus can be done by a radiographer. The value of this type of evaluation is underlined by the reality that the radiation output per milliampere has been observed to differ by 50% from one unit to the next within a radiology department, and to the extent of 100% between units in different radiology departments. Besides, it is suggested that each piece of equipment in a department is meticulously inspected by a radiation physicist at regular intervals. (Carrol, n. d.)
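A simple consistency check of this kind could be recorded and evaluated as in the sketch below. The room names, measured outputs (µGy per mAs at a fixed technique and distance) and the 20% action level are hypothetical and are not drawn from the cited sources.

```python
# Illustrative radiographic output consistency check across units
# (hypothetical readings and action level).

output_uGy_per_mAs = {"Room 1": 48.0, "Room 2": 52.5, "Room 3": 71.0, "Mobile 1": 45.5}
ACTION_LEVEL = 0.20   # flag deviations greater than 20% from the department mean

mean_output = sum(output_uGy_per_mAs.values()) / len(output_uGy_per_mAs)
for unit, reading in output_uGy_per_mAs.items():
    deviation = (reading - mean_output) / mean_output
    note = "  <-- refer for physics inspection" if abs(deviation) > ACTION_LEVEL else ""
    print(f"{unit}: {reading:.1f} uGy/mAs ({deviation:+.0%}){note}")
```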
Sensitometry and Darkroom Monitoring: The majority of these functions are possible and must be carried out ...

http://www.essaytown.com/paper/diagnostic-x-ray-imaging-quality-assurance-qa-quality-control-qc-41311
QUALITY ASSURANCE (QA) PROGRAM PLAN
Quality assurance is a system of management activities to ensure that a process, item, or
service is of the type and quality needed by the user.

Each offeror, as a separate and identifiable part of its technical proposal, shall submit a
Quality Assurance (QA) program plan setting forth the offeror's capability for quality
assurance. The plan shall address the following:
(a) A statement of policy concerning the organization's commitment to implement a Quality Control/Quality Assurance program to assure generation of measurement data of adequate quality to meet the requirements of the Statement of Work.
(b) An organizational chart showing the position of a QA function or person within the
organization. It is highly desirable that the QA function or person be independent of the
functional groups which generate measurement data.
(c) A delineation of the authority and responsibilities of the QA function or person and
the related data quality responsibilities of other functional groups of the organization.
(d) The type and degree of experience in developing and applying Quality
Control/Quality Assurance procedures to the proposed sampling and measurement
methods needed for performance of the Statement of Work.
(e) The background and experience of the proposed personnel relevant to accomplish the
QA specifications in the Statement of Work.
(f) The offeror's general approach for accomplishing the QA specifications in the
Statement of Work.
A QA project plan is a specific delineation of an offeror's approach for accomplishing the QA specifications in a Statement of Work. When offerors are required to submit a project plan, a program plan may or may not be required. The project plan may be a part of an offeror's technical proposal, or a deliverable under the contract.
The offeror, as a separate and identifiable part of its technical proposal, shall submit a
Quality Assurance (QA) project plan which shall describe specific procedures and
responsibilities needed to accomplish the QA specifications in the Statement of Work.
The project plan shall consist of the following form and content:
(a) Title page, with provision for approval signatures.
(b) Table of contents.
(c) Project description.
(d) Project organization(s) and responsibilities.
(e) Quality Assurance objectives for measurement data, in terms of precision, accuracy,
completeness, representativeness and comparability.
(f) Sampling procedures.
(g) Sample custody.
(h) Calibration procedures, references, and frequency.
(i) Analytical procedures.
(j) Data reduction, validation, and reporting.
(k) Internal quality control checks and frequency.
(l) Quality assurance performance audits, system audits, and frequency.
(m) Quality Assurance reports to management.
(n) Preventive maintenance procedures and schedules.
(o) Specific procedures to be used in routinely assessing data precision and accuracy,
representativeness, comparability, and completeness of the specific measurement
parameters involved.
(p) Corrective action.
When a QA project plan was not a required part of the technical proposal, the project plan may be required as a deliverable under the contract by use of the following; however, the Statement of Work must contain a specification for the form and content of the project plan before this paragraph may be used.
QUALITY ASSURANCE (QA) PROJECT PLAN DOCUMENTATION
(a) The Contractor shall submit to the Project Officer 3 copies of a Draft Project Plan for Quality Assurance within 30 calendar days after the effective date of the contract.
(b) EMonument will review and return the Draft Project Plan indicating approval or
disapproval, and comments, if necessary, within 30 calendar days. In the event that
EMonument delays review and return of the Draft Project Plan beyond the period
specified, the Contractor shall immediately notify the Contracting Officer in writing. The
Contractor shall deliver the Final Project Plan within 45 calendar days after the effective
date of the contract.
(c) The Contracting Officer will incorporate the approved Quality Assurance Project Plan
into the contract.

http://www.environmonument.com/qa.htm