PROCEEDINGS OF SPIE

Alan H. Rowberg, David R. Haynor, Michael Denny, Marv Kleven, David J.


Goodenough, "Quality assurance for CT and MRI scanners," Proc. SPIE
2432, Medical Imaging 1995: Physics of Medical Imaging, (8 May 1995); doi:
10.1117/12.208375

Event: Medical Imaging 1995, 1995, San Diego, CA, United States

Downloaded From: https://www.spiedigitallibrary.org/conference-proceedings-of-spie on 24 May 2022 Terms of Use: https://www.spiedigitallibrary.org/terms-of-use


Quality assurance for CT and MRI scanners
Alan H. Rowberg, David R. Haynor, Michael Denny,
Marv Kleven, David J. Goodenough
University of Washington, Seattle WA 98195
(DRH, MD, MK, Veterans Affairs Medical Center, Seattle, WA 98108)
(DJG, George Washington University, Washington, DC 20037)

ABSTRACT
While specialized phantoms for quality assurance have been provided with CT scanners since these devices
were first marketed to radiology departments, there has been little integrated software or procedure for
using these phantoms on an ongoing basis. Typically, the phantoms are used when the scanner is installed
and only very intermittently thereafter, usually by the vendors' service personnel. Although calibration
scans are performed routinely, they typically only establish the baseline for the accuracy and uniformity of
CT numbers, and do not actually measure the resolution the images are capable of achieving. Over the last
four years, a software package to automatically analyze images from CT scanners has been developed, and
in 1993 it was adapted for use with MRI scanners. An additional software package has been developed to
store the results of the individual quality assurance scans in a database, allowing easy analysis and
graphing of the results.

1. INTRODUCTION
Phantoms have been used for many years in other areas of diagnostic radiology, and it was natural to
develop new and specialized phantoms to assist in the design and testing of the first CT scanners, and later
in testing MR scanners [1-3]. While some physicists continue to monitor equipment performance periodically,
in other institutions this task has been delegated to vendor personnel. The computers that operated the
scanners were typically closed environments, and it was difficult to add software to the scanner, especially
quality assurance software. Therefore, the majority of tests were made on film, with observations of which
pins were visible at which contrast and dose levels. More recently, some of these QA tests have been
automated [4-6] and some attention has been paid to plotting trends [7].

2. METHODS AND MATERIALS


A quality assurance software package from IRIS, Inc. (Frederick, MD) was installed on a PC at the
Veterans Affairs Medical Center (VAMC) in Seattle, WA. It was connected to the Picker PQ-2000 CT
Scanner by Ethernet, and to a magnetic tape drive (Overland Data, Los Angeles, CA) that could read
magnetic tapes which came from the Philips Gyroscan MRI Scanner. The CT QA system was installed and
became operational in mid-1993, and IRIS developed the required software for the MRI scanner at the end
of 1993. The standard software writes its results to a disk file, which can then be printed and filed by the
physicist. The standard software treats each QA examination (a series of 3-4 images) as a complete test, and
produces a report giving the results of this test. It does not, however, consider the test results from previous
days in producing a report which has a time component.
We developed a computer program to scan the hard disk for QA result files, and extract the key results from
these files, storing them in a form compatible with commonly available database management systems,
such as Microsoft Access. This software creates a file which contains a historical summary of test results
for all tests currently on the PC, annotated with the test name and test date. This file is tab-delimited,
and can be directly imported into a spreadsheet, such as Microsoft Excel, where the values can be displayed
as tabular results or graphed. Figure 1 is a block diagram which shows the data flow through the system.
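The extraction step described above can be sketched as follows. This is a minimal illustration, not the actual IRIS software; the report layout (the `TEST :NOISE` block shown in Figure 2) is taken from the sample report, while the file naming convention and date encoding are our assumptions.

```python
import re
from pathlib import Path

def extract_snr_rows(report_text, test_date):
    """Pull (x, y, mean, sd, snr) rows from a QA report in the
    format of Figure 2 and return tab-delimited summary lines."""
    rows = []
    test_name = None
    for line in report_text.splitlines():
        m = re.match(r"TEST\s*:\s*(\w+)", line)
        if m:
            test_name = m.group(1)
            continue
        # data rows: five numeric fields (minus signs allowed on positions)
        m = re.match(r"\s*(-?[\d.]+)\s+(-?[\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)",
                     line)
        if m and test_name:
            rows.append("\t".join([test_date, test_name] + list(m.groups())))
    return rows

def summarize_directory(qa_dir):
    """Scan a directory of QA result files and build one historical,
    tab-delimited summary for import into Access or Excel."""
    lines = ["Date\tTest\tX(mm)\tY(mm)\tMean\tSD\tSNR"]
    for path in sorted(Path(qa_dir).glob("*.txt")):
        # hypothetical convention: the file name encodes the test date,
        # e.g. 19950113.txt
        lines += extract_snr_rows(path.read_text(), path.stem)
    return "\n".join(lines)
```

Because the output is plain tab-delimited text with a single header row, it imports directly into a spreadsheet or database table without further conversion.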

0-8194-1780-7/95/$6.00 SPIE Vol. 2432 / 539


Figure 1. Block diagram showing data flow through the system.

Figure 2 illustrates a sample page from the original report, which gives only the details of the QA
examination for one day. Even when the test results are reformatted into a table showing the chronology
over time (Fig. 3), it is difficult to extract trend information in a meaningful way. When displayed as a graph
of signal-to-noise ratio (SNR) plotted as a function of time (Fig. 4), it becomes extremely easy to spot
outliers and determine trends. Although printed here in black and white, many of our plots use color coding
for specific subvalues.

TEST: NOISE

X(MM)    Y(MM)    MEAN      SD      SNR
  0.0     50.0    3786.1    31.5    120.2
 50.0      0.0    3706.8    33.7    110.1
  0.0    -50.0    3571.3    33.6    106.1
-50.0      0.0    3589.7    34.6    103.7
  0.0      0.0    3643.3    35.0    104.0

Uniformity Index: 2.92%

Figure 2. Portion of Original QA Report Page
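The tabulated values are consistent with SNR computed as mean/SD for each region of interest, and with a uniformity index of (max - min)/(max + min) x 100% over the five ROI means. The report does not state these formulas, so the definitions below are our inference; a quick check against the reported numbers:

```python
# ROI means and standard deviations from the report in Figure 2
means = [3786.1, 3706.8, 3571.3, 3589.7, 3643.3]
sds   = [31.5, 33.7, 33.6, 34.6, 35.0]

# SNR per ROI, under the assumption SNR = mean / SD
snrs = [round(m / s, 1) for m, s in zip(means, sds)]

# Uniformity index, under the assumption (max - min) / (max + min) * 100%
uniformity = round(100 * (max(means) - min(means)) / (max(means) + min(means)), 2)

print(snrs[0])     # reproduces the reported 120.2
print(uniformity)  # reproduces the reported 2.92
```

The first SNR value and the uniformity index reproduce the report exactly; the remaining SNR values agree to within rounding of the printed inputs.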



Noise SNR
Date Top Right Left Down Center
08/08/94 108.3 112.8 111.2 98.1 107.7
08/26/94 107.1 124.2 102.6 102.9 99.4
08/29/94 107.1 108.2 113.0 110.4 105.1
09/19/94 109.5 122.0 109.7 109.9 106.3
09/26/94 105.2 117.6 104.4 109.1 104.0
10/17/94 117.7 114.6 106.6 104.6 111.3
10/19/94 123.6 115.2 102.1 101.2 102.1
10/24/94 123.8 120.3 108.5 105.8 113.5
11/07/94 122.3 120.4 107.4 106.8 112.0
11/14/94 117.8 126.5 101.5 115.6 109.1
11/21/94 102.9 113.3 112.4 112.2 113.9
12/05/94 99.6 119.5 116.8 100.5 105.8
12/09/94 101.6 107.9 104.8 107.4 97.6
12/12/94 101.1 115.4 111.8 100.3 110.5
12/16/94 103.2 114.1 108.7 94.7 100.6
12/22/94 99.7 109.1 105.6 102.2 113.4
12/29/94 107.3 114.1 109.0 102.6 106.1
01/13/95 120.2 110.1 106.1 103.7 104.0
Figure 3. Chronological List of Test Results

[Figure 4 appears here: a line graph titled "VAMC MRI" plotting the noise SNR values for the Top, Right, Left, Down, and Center ROIs against test date, with the SNR axis running from 80 to 140.]

Figure 4. Graph of Test Results

3. RESULTS
During the first three months that the system was used routinely, adjustments were made to the types of
tests which were extracted from the test results file, and how the results were displayed. We refined
concepts of what the acceptable error bands should be for specific values, so that judgments could be made
about whether the trend of a series of quality assurance examinations indicated a problem with the scanner,
or that the scanner was working properly.

We found that running the QA examination on a weekly basis gave adequate information about the
performance of the scanner; only very short-term changes in scanner characteristics would be missed within
this interval. The majority of detectable and useful events occurred over a long period of time, such as
gradual drifts in the density value of water, or gradual changes in the amount of noise present in a scan of a
homogeneous water phantom.
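One simple way to operationalize the acceptable error bands discussed above is a control-chart test: flag any measurement that falls outside the historical mean by more than two standard deviations. This is an illustrative sketch of the idea, not the procedure used in this work; the data are the Center-ROI noise SNR values from Figure 3.

```python
from statistics import mean, stdev

# Center-ROI noise SNR values from Figure 3 (08/08/94 through 01/13/95)
center_snr = [107.7, 99.4, 105.1, 106.3, 104.0, 111.3, 102.1, 113.5,
              112.0, 109.1, 113.9, 105.8, 97.6, 110.5, 100.6, 113.4,
              106.1, 104.0]

baseline = mean(center_snr)       # historical mean
band = 2 * stdev(center_snr)      # +/- 2 sample standard deviations

# flag any values outside the mean +/- 2 SD control band
outliers = [v for v in center_snr if abs(v - baseline) > band]
print(f"baseline {baseline:.1f}, band +/- {band:.1f}, outliers: {outliers}")
```

For this series every point stays inside the band, consistent with the observation that this scanner was behaving stably; a gradual drift would show up as successive points approaching and then crossing one band limit.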

4. CONCLUSIONS
Periodic monitoring of CT or MR scanner performance can be of significant assistance in determining that a
scanner is working properly, and in detecting when it is not working properly. This may serve to give an
early warning of an impending x-ray tube failure, or failure of some other component of the system. It will
not detect sudden failures, however, such as a semiconductor suddenly ceasing to operate.

5. REFERENCES
1. Goodenough DJ, Weaver KE. Phantoms for specifications and quality assurance of MR imaging scanners.
Comput Med Imaging Graph. 1988 Jul-Aug;12(4):193-209.
2. Coffey CW, Taylor R, Umstead G. A slice geometry phantom for cross sectional tomographic imagers.
Med Phys. 1989 Mar-Apr;16(2):273-8.
3. Price RR, Axel L, Morgan T, Newman R, Perman W, Schneiders N, Selikson M, Wood M, Thomas SR.
Quality assurance methods and phantoms for magnetic resonance imaging: report of AAPM nuclear
magnetic resonance Task Group No. 1. Med Phys. 1990 Mar-Apr;17(2):287-95.
4. Barker GJ, Tofts PS. Semiautomated quality assurance for quantitative magnetic resonance imaging.
Magn Reson Imaging. 1992;10(4):585-95.
5. Murphy BW, Carson PL, Ellis JH, Zhang YT, Hyde RJ, Chenevert TL. Signal-to-noise measures
for magnetic resonance imagers. Magn Reson Imaging. 1993;11(3):425-8.
6. Lerski RA, de Certaines JD. Performance assessment and quality control in MRI by Eurospin test
objects and protocols. Magn Reson Imaging. 1993;11(6):817-33.
7. Duina A, Mascaro L, Moretti R, Belletti S. [Results of the quality control of magnetic resonance images].
Radiol Med (Torino). 1992 Mar;83(3):276-81.
