
The fundamental objective of the rollout of this QA Programme is to promote patient safety and to
enhance patient care through accurate, timely and complete radiology diagnoses.

I. Introduction
A. Description
This document describes a quality assurance protocol for diagnostic equipment at the radiologic
technologist level. A series of tests is described using equipment and test tools designed for that
specific purpose. All of the important parameters in diagnostic procedure quality assurance are
addressed.

Condition of the X-ray Facility

1. Mechanical integrity: A general observation of the diagnostic system should be made. Key items
to look for are the presence of loose or absent screws, bolts, or other structural elements that may
have been improperly installed or have worked loose due to use. The functioning of meters, dials, and
other indicators should be checked. The operation of pilot lights that are often necessary to observe
equipment functions in a darkened examination room should also be checked.

2. Mechanical stability: To obtain a diagnostic quality radiograph it is important to minimize patient
motion. Of key importance from the equipment side are the stability and stiffness of the x-ray tube
hanger and image receptor (i.e., table Bucky or wall-mounted cassette holder). The availability and
adequacy of patient support devices, such as the table or immobilizing devices, should also be
checked. In addition, it is important to check the reproducibility of positioning of the source and image
receptor, which may be indicated or controlled by physical marks or detents. The accuracy of the
angulation scale should also be checked. As part of the check of structural stability, an inspection of
electrical and/or mechanical locks on the machine should be carried out.

3. Electrical integrity: The external condition of the high voltage cables should be observed. Check
to make sure that the retaining rings at the termination points are tight and that there are no breaks in
the insulation. It is important to observe the "lay" of the cables. If they do not hang properly they can
interfere with positioning of the tube and may fail prematurely.

4. Electrical safety: The system should be checked by a safety engineer. This involves a physical
inspection of the electrical wiring. Key areas where problems often occur include the power cord to
light indicators in the beam limitation system, the wires to the exposure hand switch, and other similar
power hook ups. Verify that all elements are well grounded (to each other and to
the ground).

5. Alignment and SID: Source to image receptor distance (SID) indicators should be checked. The
consistency between multiple SID indicators (indicators on the tube support and the collimator)
should be verified. The accuracy of these indicators should also be verified with a tape measure.
Verification of proper grid installation should be made. This check should also include a verification of
the alignment of the x-ray source and the center of the grid. (More specific tests of grids are outlined
elsewhere in this report).
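The tape-measure verification of an SID indicator reduces to a simple percent-error calculation. A minimal sketch follows; the function names are illustrative, and the ±2% tolerance is a commonly used acceptance value assumed here, not a figure taken from this protocol:

```python
def sid_percent_error(indicated_cm: float, measured_cm: float) -> float:
    """Percent error of the SID indicator relative to the tape-measure value."""
    return abs(indicated_cm - measured_cm) / measured_cm * 100.0

def sid_within_tolerance(indicated_cm: float, measured_cm: float,
                         tolerance_pct: float = 2.0) -> bool:
    """True if the indicator agrees with the tape measure within tolerance.
    The 2% default is an assumed, commonly used acceptance limit."""
    return sid_percent_error(indicated_cm, measured_cm) <= tolerance_pct

# An indicated SID of 100 cm measured at 101 cm is roughly a 1% error.
print(round(sid_percent_error(100.0, 101.0), 2))  # 0.99
print(sid_within_tolerance(100.0, 101.0))         # True
```

The same comparison applies to cross-checking multiple SID indicators against each other.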

E. The Radiograph as a QA Tool

The goal of a diagnostic quality assurance program is to produce radiographs of consistent high
quality. Patient radiographs are in turn a quality control check and should be factored into any
departmental evaluation program. It must also be kept in mind that the diagnostic radiograph
should not be relied on as the only quality control check, since acceptable radiographs can be
obtained when individual elements in the system (generator, image receptor and processor) are
operated outside acceptable limits. For example, inadequate film development can be
compensated for by an unacceptable increase in exposure.

1. Rejected films: Unacceptable radiographs can result from a variety of factors that include patient
motion, positioning error, improper technique selection and equipment related problems. A review of
the rejected films should be made on a periodic basis to identify the magnitude of the problem and to
determine the cause.
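In practice, the periodic review amounts to tallying rejected films by cause and computing an overall reject rate. A minimal sketch, with illustrative function names and data:

```python
from collections import Counter

def reject_analysis(total_films: int, rejects: list[str]):
    """Overall reject rate (%) and a tally of rejected films by cause,
    most frequent cause first."""
    rate = len(rejects) / total_films * 100.0
    return rate, Counter(rejects).most_common()

rate, causes = reject_analysis(
    total_films=500,
    rejects=["positioning", "motion", "positioning", "technique", "positioning"],
)
print(f"reject rate: {rate:.1f}%")  # reject rate: 1.0%
print(causes[0])                    # ('positioning', 3)
```

Sorting causes by frequency makes the dominant problem (here, positioning error) immediately visible to the reviewer.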

2. Accepted films: Good practice should always question whether radiographs of less than optimal
quality are adequate for making a diagnosis. Repeating a procedure to obtain a film of optimal quality
is often unnecessary and should be weighed against the radiation exposure and cost of the retake.
Since one should expect to find films of less than optimal quality in a departmental
file, an analytic review of these films should be made on a regular basis. Some of the key indicators
that may be a signal for the need for some QA action include static marks, contrast changes and
streak marks. Many other items may also be indications of less than adequate system performance.

A few examples are given below.

a. The necessity to change technique factors may indicate changes occurring in the x-ray system or
in the film development stage. In facilities with more than one x-ray unit, the source of the problem can
often be isolated: if the change is required on all x-ray systems, the development process is in
question; if only a single unit is affected, the generator should be suspected first.

b. Changes in the appearance of bone, or in the image contrast of studies where iodine-based contrast
media are used, often indicate a shift in tube potential.

c. Excessive base fog in areas covered by film blockers may indicate improper film handling and/or storage.

d. Asymmetry of the borders of collimated images may indicate collimator misalignment.

Part I. Definition

It is necessary to define, at least briefly for the purposes of this article, the general concepts of quality
assurance and quality control, and to review the principal objective of a radiology quality assurance
programme.

Quality assurance (QA) is a program used by management to maintain optimal diagnostic image
quality with minimum hazard and distress to patients. The program includes periodic quality control
tests, preventive maintenance procedures, administrative methods and training. It also includes
continuous assessment of the efficacy of the imaging service and the means to initiate corrective action.

The primary goal of a radiology quality assurance program is to ensure the consistent provision of
prompt and accurate diagnosis of patients. This goal will be adequately met by a QA program having
the following three secondary objectives:

 to maintain the quality of diagnostic images;

 to minimize the radiation exposure to patient and staff; and
 to be cost effective.

Quality control (QC) consists of a series of standardized tests developed to detect changes in x-ray
equipment function from its original level of performance. Such tests, when carried out routinely, allow
prompt corrective action to be taken to maintain x-ray image quality. It is important to note
that the ultimate responsibility for quality control rests with the physician in charge of the x-ray facility,
not with the regulatory agency.

1. Introduction
The World Health Organization (WHO) defines a quality assurance (QA) programme in
diagnostic radiology as an organized effort by the staff operating a facility to ensure that the
diagnostic images produced are of sufficiently high quality so that they consistently provide
adequate diagnostic information at the lowest possible cost and with the least possible
exposure of the patient to radiation (World Health Organization [WHO], 1982). The nature
and extent of this programme will vary with the size and type of the facility, the type of
examinations conducted, and other factors. The determination of what constitutes high
quality in any QA programme will be made by the diagnostic radiology facility producing
the images.

Quality assurance actions include both quality control (QC) techniques and quality administration
procedures. QC is normally part of the QA programme and quality control techniques are those
techniques used in the monitoring (or testing) and maintenance of the technical elements or
components of an X-ray system. The quality control techniques thus are concerned directly with the
equipment that can affect the quality of the image, i.e., the part of the QA programme that deals with
instrumentation and equipment. An X-ray system refers to an assemblage of components for the
controlled production of diagnostic images with X-rays. It includes minimally an X-ray high voltage
generator, an X-ray control device, a tube-housing assembly, a beam-limiting device and the
necessary supporting structures. Other components that function with the system, such as image
receptors, image processors, automatic exposure control devices, view boxes and darkrooms, are
also parts of the system. The main goal of a QC programme is to ensure the accuracy of the
diagnosis or the intervention (optimising the outcome) while minimising the radiation dose to achieve
that objective.

Radiology Department QA Program

A documented QA program should be developed, under the guidance of the QAC, specifically to
address the needs of the radiology department. The QA program should include a written plan of
action outlining policies and procedures. It should clearly define the goals and objectives of the program.

The QA program should cover both QC testing techniques and administrative procedures. The latter
are to verify that QC testing is effective, i.e., the tests are performed regularly and correctly, the
results evaluated promptly and accurately, and the necessary action taken. They include
recommendations regarding the responsibility for quality assurance action, staff training, equipment
standards, and the selection of the appropriate equipment for each examination. The quality
assurance program should include the means to evaluate the effectiveness of the program itself, e.g.,
ongoing retake rate and causes, equipment repair and replacement costs and analysis of trends in
the equipment performance.

A QA manual should be developed in a format that allows easy reference by staff and permits future
revision by management and the QAC. The content of the manual should be determined by
management with the advice of the department's QAC but it should contain the following items
considered essential:

 a list of the radiology department QAC personnel and an outline of their duties, authority and responsibilities;
 a list of personnel involved in QC testing and an outline of their responsibilities;
 guidelines for equipment specification writing;
 a list of the equipment parameters to be measured and the frequency of monitoring for each x-
ray system and system component;
 a description of the performance standards, with specific tolerance limits established for each
QC test;
 a description of the method used to measure each parameter;
 sample (blank) forms, worksheets, charts and records used for QC testing;
 guidelines for equipment acceptance testing;
 a schedule of QC testing for each equipment (monitoring frequency);
 guidelines for photographic QC;
 guidelines for recording equipment performance;
 procedures to be followed when equipment failure occurs or when test results fall outside the
tolerance limits;
 guidelines for the reject-repeat analysis program;
 a list of the publications where detailed instructions for monitoring and maintenance
procedures for each equipment can be found (although separate from the QA manual, service
and technical operations manuals should be readily available);
 guidelines for equipment appraisal and replacement;
 guidelines for the standardization of patient exposure, e.g., patient positioning, loading factors
and measurement of patient exposure;
 guidelines for quality acceptance of diagnostic radiograms; and
 a schedule for management's review of QC reports and the QA program.

QC Technologist

All staff in the radiology department should be involved in quality control. However, specific tests are
usually performed more effectively by specially trained technologists. The amount of time spent on
QC should be adequate to perform the functions required for an effective quality control program.
Among the activities of the QC technologist(s) should be to:

 carry out the day-to-day QC tests on the department's photographic, radiographic and
fluoroscopic imaging equipment as prescribed by the QC test schedule;
 record and/or chart the QC test measurement data;
 evaluate the test results;
 report any deterioration or trends in equipment performance to the radiology manager and staff
using the equipment;
 initiate prompt corrective action and/or preventive measures when necessary;
 oversee the repair of defective equipment performed by the hospital biomedical or electronic
maintenance staff or by private service companies;
 perform the required tests to confirm that defective equipment was repaired and restored to the
original level of performance;
 maintain equipment performance records;
 provide monthly reports on QC activities to the radiology manager; and
 develop new QC monitoring and maintenance procedures as required.

The person in the position should report directly to the radiology manager.

QC Test Equipment

The adequacy of the department's QC test equipment and radiation measuring instrumentation
should be reviewed periodically by the QAC. Specifications for new test equipment intended for
purchase should include the following items:

 the equipment specifications (accuracy, precision, sensitivity, range, etc.);

 a calibration reference;
 compatibility with existing equipment;
 the expected useful life of the equipment;
 the availability of parts and service;
 an estimate of maintenance costs; and
 instructions and/or training in the operation of the equipment.

Equipment Acceptance Testing

The purpose of post-installation acceptance tests is to ensure that the x-ray equipment operates
correctly and meets the following criteria:

 the standards of design, construction and functioning for diagnostic x-ray equipment, and
applicable provincial regulations;
 the purchase contract specifications; and/or
 the original equipment manufacturer specifications.

QC Testing Program

The purpose of a QC testing program is to maintain the quality of diagnostic images. This is done with
routine monitoring of photographic and x-ray equipment parameters to detect deviations of equipment
performance and take prompt corrective action. Periodic monitoring should not be discontinued even if the
test results indicate relatively stable equipment performance. Small but progressive changes in image
quality, not readily detectable by eye, will be more easily noticed using standardized test
procedures and specific test equipment. The most important objectives of a routine QC testing
program are to:

 establish a baseline against which future measurements can be compared to maintain the
original level of performance;
 assist in detecting and diagnosing the cause of any deterioration in equipment performance;
 promptly correct any deterioration in equipment performance when the cause is known;
 detect defects on installation, or after major repairs, which may adversely affect image quality
or patient dose; and
 enable comparable loading factors to be used on similar x-ray machines, where appropriate.
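The first two objectives above reduce to comparing each new QC measurement against its stored baseline, using the tolerance limit established for that test. A minimal sketch; the function name is illustrative, and the ±5% kVp tolerance in the example is an assumed value, not one prescribed by this document:

```python
def check_against_baseline(baseline: float, measured: float,
                           tolerance_pct: float):
    """Percent deviation of a QC measurement from its baseline, and
    whether it falls within the tolerance limit set for that test."""
    deviation_pct = (measured - baseline) / baseline * 100.0
    return deviation_pct, abs(deviation_pct) <= tolerance_pct

# kVp accuracy: baseline 80 kVp, measured 84 kVp, assumed +/-5% tolerance.
deviation, within = check_against_baseline(80.0, 84.0, tolerance_pct=5.0)
print(f"{deviation:+.1f}%, within limits: {within}")
```

Recording the signed deviation, not just a pass/fail flag, is what makes gradual deterioration visible as a trend.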

The QC testing program should be divided into component parts to cover each area effectively. For
example:
 general radiography, i.e., photographic, radiographic and fluoroscopic equipment QC;

 mammography, i.e., photographic and radiographic equipment QC; and
 computed tomography, i.e., image quality and radiographic equipment QC.

Equipment Performance Records and Record Keeping

The method of record keeping should be determined by management with advice from the QAC. The
records may be maintained in a file or individual log books or in a computerized data base.
Commercial software is available and can be customized with relative ease to record the data
collected. In some cases the measured kilovoltage, exposure time and exposure are transmitted
directly from the radiation detection meter to the computer.

The records should be maintained in a ready-to-use form and the information should be readily available to
all staff. The information should be complete, up-to-date and presented in a form suitable for
departmental reviews and provincial radiation safety audits.
The equipment performance record should clearly identify the equipment and its location. It is
recommended to keep one log book for each x-ray room. The required equipment performance data
should be determined by the QAC but the following elements are considered essential:

 the equipment system identification, i.e., the name of manufacturer, the model designation and
serial number, the date and country of manufacture;
 the equipment location;
 a copy of the equipment acceptance test report;
 a copy of the current provincial radiation safety survey;
 a copy of the current federal, provincial or territorial registration certificate;
 the QC monitoring records (data, graphs, charts, etc.);
 the equipment service and repair record including service frequency and costs; and
 the equipment down-time record.

Loading Factors

The loading factors manual should be reviewed regularly (e.g., monthly) by the QC technologist and
each chart updated and maintained in clear printed form. Changes should not be maintained in hand-
written form. The contents of the loading factors manual should be determined by management with
participation from the QAC but the following information should be included:

 the body part and projection;

 the patient size or body part thickness;
 the optimum kVp;
 the optimum mA and time, or mAs, or AEC including the specific cell;
 the film/screen combination selection;
 the grid selection; and
 the tube focal spot size selection.

Any change made to loading factors, to compensate for decreased diagnostic image quality, should
be investigated to determine the cause. Poor film processing is the most frequent cause of a
decrease in image quality.

The method used to derive loading factors, i.e., fixed kVp or fixed mAs, should be determined by the
radiologist. Loading factors derived from the subjective evaluation of patient size are unreliable and
do not provide consistent radiographic images. All patients should be measured and the loading
factors adjusted accordingly. For example, to compensate for variations in patient sizes, using fixed
mAs techniques, the method of increasing or decreasing the kilovoltage by approximately 2 kVp per
centimetre of body part can be employed.
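The fixed-mAs compensation rule described above can be expressed directly; the chart values in the example are illustrative:

```python
def adjusted_kvp(chart_kvp: float, chart_thickness_cm: float,
                 patient_thickness_cm: float, kvp_per_cm: float = 2.0) -> float:
    """Fixed-mAs technique: raise or lower the kilovoltage by roughly
    2 kVp for each centimetre the measured body part differs from the
    chart's reference thickness."""
    return chart_kvp + kvp_per_cm * (patient_thickness_cm - chart_thickness_cm)

# Chart lists 80 kVp for a 23 cm abdomen; the patient measures 26 cm.
print(adjusted_kvp(80, 23, 26))  # 86.0
```

This is why measuring every patient matters: the adjustment is computed from the thickness difference, not estimated by eye.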


Entrance Skin Exposure

The entrance skin exposure (ESE) should be measured and recorded for standard radiographic
examinations or projections. The standard projections recommended for routine QC ESE
measurement should be based on those found in Tables 1 and 2 of Safety Code 20A and are
listed in Table 1.
Table 1. Recommended Upper Limits for Entrance Skin Exposure

                                   Recommended Upper Limit
Examination (Projection)           (mR)        (µC/kg)

Chest (P/A)                        20          5.2
Skull (Lateral)                    224         58.0
Abdomen (A/P)                      627         162.0
Cervical Spine (A/P)               137         35.0
Thoracic Spine (A/P)               380         98.0
Full Spine (A/P)                   263         68.0
Lumbo-Sacral Spine (A/P)           614         158.0
Retrograde Pyelogram (A/P)         539         139.0

The recommended upper limits for ESE given in the table above are based on:

 the third quartile levels from unpublished (Canadian) Nationwide Evaluation of X- Ray Trends
(NEXT) data; and
 a reference patient having the anthropometrical characteristics shown in Table 2.

Table 2. Anthropometrical Characteristics of the Reference Patient

Body Part           Reference Patient Thickness (cm)

Head (lateral)      15
Neck (A/P)          13
Chest (A/P)         23
Abdomen (A/P)       23

Note: In practice, it should be feasible to have actual ESE substantially lower than these limits.

This series of ESE measurements should be performed at least semi-annually for every radiographic
equipment system or each time the radiographic system is repaired or serviced. The ESE
measurement results should be used to update the information contained in the radiographic
technique chart.
Commercial software is available and may be used to calculate the ESE. After the basic information
is entered into the computer program, the computer can calculate the ESE for any given
examination, loading factors or patient size.
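A minimal version of such a check compares each measured ESE against the recommended upper limits of Table 1. The function names are illustrative; the unit conversion uses the exact definition 1 R = 2.58 × 10⁻⁴ C/kg, i.e. 1 mR = 0.258 µC/kg:

```python
# Recommended upper limits from Table 1, in mR.
ESE_UPPER_LIMIT_MR = {
    "Chest (P/A)": 20,
    "Skull (Lateral)": 224,
    "Abdomen (A/P)": 627,
    "Cervical Spine (A/P)": 137,
    "Thoracic Spine (A/P)": 380,
    "Full Spine (A/P)": 263,
    "Lumbo-Sacral Spine (A/P)": 614,
    "Retrograde Pyelogram (A/P)": 539,
}

def mr_to_uc_per_kg(mr: float) -> float:
    """Convert milliroentgen to microcoulomb per kilogram (1 mR = 0.258 uC/kg)."""
    return mr * 0.258

def ese_within_limit(projection: str, measured_mr: float) -> bool:
    """True if the measured entrance skin exposure is at or below the
    recommended upper limit for that projection."""
    return measured_mr <= ESE_UPPER_LIMIT_MR[projection]

print(ese_within_limit("Chest (P/A)", 12.0))   # True
print(ese_within_limit("Abdomen (A/P)", 700))  # False
print(round(mr_to_uc_per_kg(20), 1))           # 5.2
```

As the note to Table 2 indicates, measured values well below the limits should be the norm; a result merely under the limit may still warrant review.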

Radiation Safety/Quality Assurance Programs

1. Radiation Safety/Quality Assurance Responsibility

Responsibility for Computed Radiography Quality Assurance is a function of the size of the
facility. Large facilities can absorb the responsibilities and structure for CR quality assurance
into the existing committee responsible for overseeing diagnostic radiography. In smaller
facilities, it is the responsibility of the physician who serves as the Radiation Safety Officer.
The responsibility for CR equipment preventive maintenance/testing falls upon several
different groups. Medical physicists, radiologic technologists, in-house engineering, and
manufacturer service representatives are responsible for various aspects of equipment
evaluation or performance. The QA responsibilities of each party shall be defined in the QA manual.

2. Records
1. Quality Assurance Manual

Establish a manual that includes the following items:

a. a list of tests to be performed and the frequency of performance;

b. a list identifying which individual or group will be doing each test;
c. a written description of the procedure that will be used for each test;
d. a list of all the variables which comprise the operating conditions for each test;
e. the acceptability limits for each test;
f. a list of the equipment to be used for testing; and
g. sample records to be used for each test.

2. Equipment Records

Records will be maintained for each unit currently in operation and include:

a. the initial test results (acceptance testing and radiation safety survey as
appropriate); and
b. the CR exposure indicator numbers and image retake rates. These results
should be trended and kept for a period of three years.

3. QC Records for Test Equipment

Records will be maintained and available for review for QC test equipment that requires
calibration. QC records may be maintained in either a hardcopy or softcopy format, but
must be available for review during inspections and whenever else they are needed.

3. Equipment Monitoring

Each facility will make or have made quality control tests to monitor equipment performance
and maintain records of data collected. The tests performed will vary from manufacturer to
manufacturer but must include those quality control checks specified by the manufacturer and
be modeled after the program below. If, at the time of inspection, significant equipment
malfunctions are found, the facility may be required to perform more frequent testing to ensure
good diagnostic image quality.

Appropriate quality control testing must be conducted whenever major maintenance or a
change in equipment operation (e.g., a software change) occurs.

1. Test Frequency – Daily

The technologist needs to check each patient image, softcopy and hardcopy, for
artifacts. Artifacts may be caused by dust particles, scratches, lines, mechanical friction
marks or other factors. If an artifact is found to be common to a particular Imaging Plate
(IP), that plate shall be removed from service until the cause of the artifact is eliminated.

Softcopy images shall be evaluated on a radiologist's workstation and viewed over a
range of window-level settings consistent with routine clinical practice.

All images must be quality controlled, verified and properly stored.

Maintain a record of image retakes, including date and cause, for future reference and review.

2. Test Frequency - Quarterly

a. Evaluate a 10% sampling of a random assortment of plates for dark noise
following the manufacturer's instructions. Dark noise is the residual signal from
background radiation or other sources on an unexposed cassette. If widespread
failures are noted then the entire inventory will be evaluated. The exposure
indicator numbers (e.g., S value, exposure index, or IgM, etc.) are documented
and retained for future reference and review.
b. Make a QC phantom image using a phantom that is designed by the original
equipment manufacturer and implement the suggested testing protocol. If a
manufacturer's phantom is not available, use an appropriate phantom that
includes gray scale patterns and a spatial resolution test pattern. Perform
qualitative and quantitative analysis of the test image for:
1. resolution,
2. contrast/noise,
3. laser jitter, and
4. exposure indicator accuracy.

Document findings and store acquired data for future use and review.

c. Verify the calibration of the photostimulable storage phosphor (PSP)
workstations. If the system is capable, this can be accomplished by using a
SMPTE video test pattern, the methodology of AAPM TG-18 for monitor quality
control, or a comprehensive phantom image that includes gray scale patterns
and a spatial resolution pattern.
d. Visually inspect and clean all cassettes and imaging plates with methods that are
approved by the equipment manufacturer. All plates should be erased before
being placed back into service. The cleaning frequency is based upon the
particular environment that the plate is used in. Cleaning more or less frequently
may be necessary should environmental conditions warrant.
e. Review the image retake rate and determine the causes of unacceptable PSP
images. Document findings and maintain records of review.
f. Review the QC exposure indicator database. Determine the cause of
under/overexposures, implement corrective action and generate a quarterly
report that can be reviewed by the Radiation Safety Committee or RSO. The
incident exposures on the IPs should be tracked to analyze any trends and in
particular "dose or CR creep" as there is an optimal level of exposure for each
type of exam that is performed.
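The trend analysis in item f above can be sketched as a comparison of the mean exposure-indicator value for the period against the target for that exam type. The function name, the target value, and the 25% action threshold below are all illustrative assumptions, not figures from this program:

```python
from statistics import mean

def dose_creep_check(exposure_indices: list[float], target: float,
                     action_threshold_pct: float = 25.0):
    """Mean drift (%) of exposure-indicator values from the target for
    an exam type, and whether it exceeds the action threshold
    (a sign of "dose creep")."""
    drift_pct = (mean(exposure_indices) - target) / target * 100.0
    return drift_pct, abs(drift_pct) > action_threshold_pct

drift, flagged = dose_creep_check([210, 230, 250, 270, 290], target=200)
print(f"drift: {drift:+.1f}%, flagged: {flagged}")  # drift: +25.0%, flagged: False
```

Note that the example series is steadily rising even though its mean sits at the threshold; plotting the values over time, not just averaging them, is what exposes the trend.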
3. Test frequency - Annually
a. IP Dark Noise

Each IP must be erased with the full erasure cycle. The hard and soft images for
each IP should be clear, uniform and artifact free. Corrective action for any
artifacts, density shading or non-uniformities must be made prior to performing
other tests.

b. Imaging Plate Linearity and Sensitivity

Verify the response of selected IPs to several different levels of radiation
exposure (e.g., 0.1, 1, and 10 mR). Measure the optical density or pixel value to
determine linearity. At the same time, the manufacturer's method of establishing
an estimate of the incident exposure on the plate can be verified when the
incident exposure is measured with a calibrated radiation detector. If a calibrated
radiation detector is not available on-site, evaluate linearity and sensitivity by
using appropriate technique factors obtained from a medical physicist.

c. System Linearity and Sensitivity

This is accomplished by a uniform exposure of several plates of each size
dimension over a range of exposures, for example 0.1, 1.0, 10 mR (this must be
measured with a calibrated ionization chamber) using a standard kV (e.g., 80
kVp) and added filtration if recommended by the manufacturer (e.g., 1 mm Cu).
Use an extended SID of 180 cm to minimize the heel effect. Using a standard
display, the resultant images should have a constant image density across the
field of view. Hardcopy images should be within ±0.5 OD of the target OD. The
measured exposures to the IP as determined by the CR system exposure index,
S-number or DgN value, should agree to within 20% over the three orders of
magnitude of incident exposure.

d. Image Quality

Uniformity is evaluated by selecting several different sizes of IP and then
exposing them to the same quantity of radiation (approximately 1 mR) and
processing the plates with a standardized image acquisition and processing
routine. Compare the resultant images for any type of artifacts.

e. Uniformity and Reproducibility

Compare the OD or pixel value from similar images acquired in step 3d above for
uniformity of response and reproducibility.

f. Image Geometric Uniformity and Spatial Resolution

A test object such as a line-pair resolution phantom or other device is imaged in
each of the 4 quadrants of the IP with minimum magnification using a standard
acquisition protocol and a reasonable incident exposure (1 mR). Verification of
pixel calibration and pixel dimensions along the rows (x-dimension) and columns
(y-dimension) is performed for each quadrant. The results should be within 5% of
one another. A resolution test pattern appropriate for the system (e.g., 1-5 lp/mm)
is then placed centrally with a 45 degree orientation along the x-axis and a
second resolution test pattern is placed in one of the outer quadrants with
orientation along the laser scan lines. The IP is then exposed to a moderate
incident exposure (e.g., 50 kVp and 5 mAs at 180 cm with no filtration). Process
the IP and determine the limiting resolution in both the horizontal and vertical
directions. The measured resolution should be within 10% of the theoretical
resolution based upon the sampling frequency of the imaging plate as specified
by the manufacturer.

g. Artifacts

Each IP should be evaluated for artifacts using a uniform attenuator (such as
acrylic or copper). Artifacts should be evaluated using either softcopy or
hardcopy with sufficiently narrow windows to demonstrate clinically significant artifacts.
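The system linearity and sensitivity check in item 3c above reduces to verifying that the CR system's indicated exposure agrees with the ionization-chamber measurement to within 20% at each level. A minimal sketch; the function name and the measured/indicated values are illustrative:

```python
def exposure_agreement(measured_mr: list[float], indicated_mr: list[float],
                       tolerance_pct: float = 20.0):
    """For each exposure level, the percent error of the CR system's
    indicated exposure against the chamber measurement, and a pass flag."""
    results = []
    for m, i in zip(measured_mr, indicated_mr):
        err_pct = abs(i - m) / m * 100.0
        results.append((m, err_pct, err_pct <= tolerance_pct))
    return results

# Three levels spanning three orders of magnitude, as in the annual test.
for level, err, ok in exposure_agreement([0.1, 1.0, 10.0], [0.11, 0.95, 11.5]):
    print(f"{level} mR: error {err:.0f}%, pass={ok}")
```

Checking each level separately, rather than fitting a single line, catches a system that is accurate at mid-range exposures but drifts at the extremes.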
a. Monitor, evaluate, review:
Quality control results
What kinds of controls are used?
How are control ranges established?
What criteria are used to decide if a test run is acceptable?
What is to be done when controls are outside limits?
What systems are used to evaluate shifts and trends?
Who reviews QC data, how often, what’s done when problems are noted?
Proficiency testing results
Performed by all staff?
Handled like patient samples?
How are failures investigated, documented?
How do you evaluate your performance when your results are ungraded?
Who reviews PT results?
Are PT samples used to assess personnel competency?
Patient test results
Describe your reporting system
How are patient results reviewed for accuracy, clarity, transcription errors, improbable values?
Are results correlated with other findings?
Do you have a system for reporting critical values, corrected reports?
Biannual verification of accuracy
What tests are not covered by PT (non-regulated analytes)?
Define frequency - minimum 2 samples twice per year
How is biannual verification done?
Set criteria for acceptable agreement between split samples
Biannual evaluation of relationship of test results between methods
Describe instruments or methods
Describe how often - minimum 2 samples twice per year
How is biannual evaluation done?
Set criteria for acceptable agreement between instruments or methods
b. Identify and correct problems
Define systems available, who reviews, how often
QC results
PT results
Patient results
Troubleshooting, problem logs
Incident reports
Corrected reports
Patient redraws
c. Establish, maintain accurate, reliable, prompt reporting of test results
Who reviews reports for accuracy, clarity?
Do all results have units of measurement, normal ranges?
How are phoned reports handled, documented?
How are corrected reports handled, documented?
Are critical limits defined? How are they handled, documented?
What are your expected turnaround times - STATs, routines?
What is your system to track and report send-out test results?
d. Verify all tests conform to specified performance criteria in quality control
Procedures are available, are correct and staff adhere to them
Performance criteria (for QC, calibration, linear limits, etc.) are written and available to staff
QC, calibrations, linear limits, instrument performance checks are performed on
schedule, are acceptable or patient results are not reported
Trends are noted and corrected
e. Establish, maintain adequacy, competency of technical personnel
Write job descriptions, define duties and responsibilities
Develop orientation and training checklists
How is ongoing competency assessed?
(Direct observations, review of QC, PT, problems, reports, evaluations)
How is competency documented? (Semiannually for new employees, annually thereafter)
Continuing education documentation
a. Establish & apply criteria for specimen acceptance & rejection
How are samples labeled?
What do you do if specimens are collected in incorrect containers?
What do you do if there are time delays in delivery of specimens to the lab?
How do staff know what’s acceptable, unacceptable for each test performed?
b. Notify individuals as soon as possible of life-threatening results
Write a critical limits (panic values) policy
Define limits with your Director and medical staff
What’s done (repeat, confirm value)?
Who’s called, how do you document?
c. Assess problems identified during QA review & discuss with staff
What QA reviews are done?
Incident reports, complaints, corrected reports, problem logs, PT performance, personnel
Problems with specimen submission, clarity of orders, turnaround times
How are reviews shared with staff?
d. Evaluate reporting systems - accurate, reliable reporting, transmittal, storage and retrieval
of data
How are results reported, charted in your setting?
Describe computer reporting systems
Describe transmittal of results from other labs - onsite printers, faxed results, phoned reports
Describe archiving of results
Define record retention - Where, how long?
e. Document all actions taken to identify and correct problems, and verify that they are effective
in correcting the problem
f. Issue corrected reports
Write a corrected report policy describing how this is done in your system
g. Provide instructions for specimen collection, handling, preservation, transportation
These should be available to nurses, doctors, clients
Where are these for each test?
How are these updated when tests or methods change?
h. Provide clients updates of testing changes affecting test results or interpretation
For changes in methods, normal ranges, detection limits, interpretation of results,
specimen requirements
Describe how Director is made aware of changes
Describe how Director shares this with other medical staff or clients