

Algorithms for Reconstruction with Nondiffracting Sources

In this chapter we will deal with the mathematical basis of tomography with nondiffracting sources. We will show how one can go about recovering the image of the cross section of an object from the projection data. In ideal situations, projections are a set of measurements of the integrated values of some parameter of the object; the integrations are along straight lines through the object and are referred to as line integrals. We will show that the key to tomographic imaging is the Fourier Slice Theorem, which relates the measured projection data to the two-dimensional Fourier transform of the object cross section. This chapter will start with the definition of line integrals and how they are combined to form projections of an object. By finding the Fourier transform of a projection taken along parallel lines, we will then derive the Fourier Slice Theorem. The reconstruction algorithm used depends on the type of projection data measured; we will discuss algorithms based on parallel beam projection data and two types of fan beam data.

3.1 Line Integrals and Projections
A line integral, as the name implies, represents the integral of some parameter of the object along a line. In this chapter we will not concern ourselves with the physical phenomena that generate line integrals, but a typical example is the attenuation of x-rays as they propagate through biological tissue. In this case the object is modeled as a two-dimensional (or three-dimensional) distribution of the x-ray attenuation constant, and a line integral represents the total attenuation suffered by a beam of x-rays as it travels in a straight line through the object. More details of this process and other examples will be presented in Chapter 4. We will use the coordinate system defined in Fig. 3.1 to describe line integrals and projections. In this example the object is represented by a two-dimensional function f(x, y) and each line integral by the (θ, t) parameters. The equation of line AB in Fig. 3.1 is

x cos θ + y sin θ = t    (1)


Fig. 3.1: An object, f(x, y), and its projection, Pθ(t), are shown for an angle of θ. (From [Kak79].)

and we will use this relationship to define the line integral Pθ(t) as

Pθ(t) = ∫_(θ,t) line f(x, y) ds.    (2)

Using a delta function, this can be rewritten as

Pθ(t) = ∫∫ f(x, y) δ(x cos θ + y sin θ − t) dx dy.    (3)


The function Pθ(t) is known as the Radon transform of the function f(x, y).
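The delta-function form of (3) translates directly into a discrete sketch: accumulate each pixel's value into the bin whose t is nearest to x cos θ + y sin θ. The code below is only an illustrative approximation of a sampled Radon transform; the function name, bin count, and binning scheme are our own choices, not anything from the text.

```python
import numpy as np

def parallel_projection(image, theta, n_bins):
    """Approximate P_theta(t) by binning each pixel's value at
    t = x*cos(theta) + y*sin(theta), a discrete form of the
    delta-function definition of the Radon transform."""
    n = image.shape[0]
    # pixel-center coordinates with the origin at the image center
    coords = np.arange(n) - (n - 1) / 2.0
    x, y = np.meshgrid(coords, coords, indexing="ij")
    t = x * np.cos(theta) + y * np.sin(theta)
    # map t in [-n/sqrt(2), n/sqrt(2)] onto bin indices
    t_max = n / np.sqrt(2.0)
    bins = np.clip(((t + t_max) / (2 * t_max) * n_bins).astype(int), 0, n_bins - 1)
    proj = np.zeros(n_bins)
    np.add.at(proj, bins.ravel(), image.ravel())
    return proj

# every projection of the same object carries the same total "mass"
img = np.zeros((64, 64))
img[20:40, 25:35] = 1.0
p0 = parallel_projection(img, 0.0, 95)
p45 = parallel_projection(img, np.pi / 4, 95)
```

Because each pixel lands in exactly one bin, the sum of any projection equals the sum of the image, which is a handy sanity check for this kind of discretization.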

Fig. 3.2: Parallel projections are taken by measuring a set of parallel rays for a number of different angles. (From [Ros82].)

A projection is formed by combining a set of line integrals. The simplest projection is a collection of parallel ray integrals as is given by Pθ(t) for a constant θ. This is known as a parallel projection and is shown in Fig. 3.2. It could be measured, for example, by moving an x-ray source and detector along parallel lines on opposite sides of an object. Another type of projection is possible if a single source is placed in a fixed position relative to a line of detectors. This is shown in Fig. 3.3 and is known as a fan beam projection because the line integrals are measured along fans. Most of the computer simulation results in this chapter will be shown for the image in Fig. 3.4. This is the well-known Shepp and Logan [She74]


"head phantom," so called because of its use in testing the accuracy of reconstruction algorithms for their ability to reconstruct cross sections of the human head with x-ray tomography. (The human head is believed to place the greatest demands on the numerical accuracy and the freedom from artifacts of a reconstruction method.) The image in Fig. 3.4(a) is composed of 10 ellipses, as illustrated in Fig. 3.4(b). The parameters of these ellipses are given in Table 3.1.

A major advantage of using an image like that in Fig. 3.4(a) for computer simulation is that now one can write analytical expressions for the projections. Note that the projection of an image composed of a number of ellipses is simply the sum of the projections for each of the ellipses; this follows from the linearity of the Radon transform. We will now present expressions for the projections of a single ellipse.

Fig. 3.3: A fan beam projection is collected if all the rays meet in one location. (From [Ros82].)

52 COMPUTERIZED TOMOGRAPHIC IMAGING

Fig. 3.4: The Shepp and Logan "head phantom" is shown in (a). Most of the computer simulated results in this chapter were generated using this phantom. The phantom is a superposition of 10 ellipses, each with a size and magnitude as shown in (b). (From [Ros82].)

Let f(x, y) be as shown in Fig. 3.5(a), i.e.,

f(x, y) = { ρ   for x²/A² + y²/B² ≤ 1 (inside the ellipse)
          { 0   otherwise (outside the ellipse).    (4)

For computer simulations a projection can be generated by simply summing the projection of each individual ellipse.

Fig. 3.5: (a) An analytic expression is shown for the projection of an ellipse, with half-width a(θ) given by a²(θ) = A² cos²θ + B² sin²θ. (b) Shown here is an ellipse with its center located at (x1, y1) and its major axis rotated by α. (From [Ros82].)
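The centered-ellipse projection of eq. (5) is simple enough to evaluate directly. The sketch below is our own implementation of that expression; the check exploits the fact that the area under Pθ(t) must equal the ellipse "mass" πρAB at every angle, since every projection integrates the same object.

```python
import numpy as np

def ellipse_projection(t, theta, A, B, rho):
    """P_theta(t) of a centered ellipse with semi-axes A, B and density rho:
    2*rho*A*B*sqrt(a2 - t^2)/a2 for t^2 <= a2, where a2 = A^2 cos^2 + B^2 sin^2."""
    a2 = (A * np.cos(theta)) ** 2 + (B * np.sin(theta)) ** 2
    t = np.asarray(t, dtype=float)
    p = np.zeros_like(t)
    inside = t ** 2 <= a2
    p[inside] = 2 * rho * A * B * np.sqrt(a2 - t[inside] ** 2) / a2
    return p

# mass conservation: the integral of the projection is pi*rho*A*B for any angle
t = np.linspace(-3.0, 3.0, 200001)
for theta in (0.0, 0.4, np.pi / 3):
    area = np.trapz(ellipse_projection(t, theta, A=2.0, B=1.0, rho=3.0), t)
    assert abs(area - np.pi * 2.0 * 1.0 * 3.0) < 1e-3
```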

Fig. 3.5: Continued.

Table 3.1: Summary of parameters for tomography simulations.

Center Coordinate    Major Axis    Minor Axis    Rotation Angle    Refractive Index
(0, 0)               0.92          0.69          90                2.0
(0, −0.0184)         0.874         0.6624        90                −0.98
(0.22, 0)            0.31          0.11          72                −0.02
(−0.22, 0)           0.41          0.16          108               −0.02
(0, 0.35)            0.25          0.21          90                0.01
(0, 0.1)             0.046         0.046         0                 0.01
(0, −0.1)            0.046         0.046         0                 0.01
(−0.08, −0.605)      0.046         0.023         0                 0.01
(0, −0.605)          0.023         0.023         0                 0.01
(0.06, −0.605)       0.046         0.023         90                0.01
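Because the phantom is just a superposition of ellipses, eq. (4) plus the parameters of Table 3.1 is all that is needed to evaluate it at any point. The sketch below does exactly that; the helper name and the point-by-point (rather than vectorized) evaluation are our own choices.

```python
import numpy as np

# (center_x, center_y, major_axis, minor_axis, rotation_deg, refractive_index)
# transcribed from Table 3.1
ELLIPSES = [
    ( 0.0,    0.0,     0.92,  0.69,    90,  2.0),
    ( 0.0,   -0.0184,  0.874, 0.6624,  90, -0.98),
    ( 0.22,   0.0,     0.31,  0.11,    72, -0.02),
    (-0.22,   0.0,     0.41,  0.16,   108, -0.02),
    ( 0.0,    0.35,    0.25,  0.21,    90,  0.01),
    ( 0.0,    0.1,     0.046, 0.046,    0,  0.01),
    ( 0.0,   -0.1,     0.046, 0.046,    0,  0.01),
    (-0.08,  -0.605,   0.046, 0.023,    0,  0.01),
    ( 0.0,   -0.605,   0.023, 0.023,    0,  0.01),
    ( 0.06,  -0.605,   0.046, 0.023,   90,  0.01),
]

def phantom_value(x, y):
    """Evaluate the head phantom at (x, y) by summing the densities of all
    ellipses containing the point (eq. (4) plus superposition)."""
    total = 0.0
    for cx, cy, a, b, rot_deg, rho in ELLIPSES:
        alpha = np.deg2rad(rot_deg)
        # rotate the displacement into the ellipse's own frame
        dx, dy = x - cx, y - cy
        xp = dx * np.cos(alpha) + dy * np.sin(alpha)
        yp = -dx * np.sin(alpha) + dy * np.cos(alpha)
        if (xp / a) ** 2 + (yp / b) ** 2 <= 1.0:
            total += rho
    return total
```

At the center only the two largest ellipses overlap, so the value there is 2.0 − 0.98 = 1.02, and any point outside the outer ellipse evaluates to zero.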

It is easy to show that the projections of such a function are given by

Pθ(t) = { (2ρAB/a²(θ)) √(a²(θ) − t²)    for |t| ≤ a(θ)
        { 0                              for |t| > a(θ)    (5)

where a²(θ) = A² cos²θ + B² sin²θ. Note that a(θ) is equal to the projection half-width as shown in Fig. 3.5(a).

Now consider the ellipse described above centered at (x1, y1) and rotated by an angle α as shown in Fig. 3.5(b). Let P′θ(t) be the resulting projections. They are related to Pθ(t) in (5) by

P′θ(t) = Pθ−α(t − s cos(γ − θ))    (6)

where s = √(x1² + y1²) and γ = tan⁻¹(y1/x1).

3.2 The Fourier Slice Theorem

We derive the Fourier Slice Theorem by taking the one-dimensional Fourier transform of a parallel projection and noting that it is equal to a slice of the two-dimensional Fourier transform of the original object. It follows that given the projection data, it should then be possible to estimate the object by simply performing a two-dimensional inverse Fourier transform.

We start by defining the two-dimensional Fourier transform of the object function as

F(u, v) = ∫∫ f(x, y) e^(−j2π(ux+vy)) dx dy.    (7)

Likewise define a projection at an angle θ, Pθ(t), and its Fourier transform by

Sθ(w) = ∫ Pθ(t) e^(−j2πwt) dt.    (8)

The simplest example of the Fourier Slice Theorem is given for a projection at θ = 0. First, consider the Fourier transform of the object along the line in the frequency domain given by v = 0:

F(u, 0) = ∫∫ f(x, y) e^(−j2πux) dx dy    (9)

but because the phase factor is no longer dependent on y we can split the integral into two parts,

F(u, 0) = ∫ [∫ f(x, y) dy] e^(−j2πux) dx.    (10)

From the definition of a parallel projection, the reader will recognize the term

in brackets as the equation for a projection along lines of constant x, or

Pθ=0(x) = ∫ f(x, y) dy.    (11)

Substituting this in (10) we find

F(u, 0) = ∫ Pθ=0(x) e^(−j2πux) dx.    (12)

The right-hand side of this equation represents the one-dimensional Fourier transform of the projection Pθ=0; thus we have the following relationship between the vertical projection and the 2-D transform of the object function:

F(u, 0) = Sθ=0(u).    (13)

This is the simplest form of the Fourier Slice Theorem. Clearly this result is independent of the orientation between the object and the coordinate system. If, for example, as shown in Fig. 3.6, the (t, s) coordinate system is rotated by an angle θ, the Fourier transform of the projection defined in (11) is equal to the two-dimensional Fourier transform of the object along a line rotated by θ. This leads to the Fourier Slice Theorem, which is stated as [Kak85]:

The Fourier transform of a parallel projection of an image f(x, y) taken at angle θ gives a slice of the two-dimensional transform, F(u, v), subtending an angle θ with the u-axis. In other words, the Fourier transform of Pθ(t) gives the values of F(u, v) along line BB in Fig. 3.6.

Fig. 3.6: The Fourier Slice Theorem relates the Fourier transform of a projection to the Fourier transform of the object along a radial line. (From [Pan83].)
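On sampled data the θ = 0 case of the theorem can be checked directly with the DFT: summing an image over one axis and transforming the result must reproduce the corresponding line of the two-dimensional transform. A minimal sketch (the axis conventions are our own):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((32, 32))

# projection at theta = 0: integrate over y (axis 1 here)
proj = img.sum(axis=1)

slice_from_2d = np.fft.fft2(img)[:, 0]   # the v = 0 line of F(u, v)
slice_from_1d = np.fft.fft(proj)         # 1-D transform of the projection

# eq. (13): the two must agree to floating-point precision
assert np.allclose(slice_from_1d, slice_from_2d)
```

For the discrete transform this identity is exact, because the e^(−j2πk2·n2/N) factor reduces to 1 along the n2 sum when k2 = 0.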

0)=F(w cos 0. s) coordinate system to be a rotated version of the original (x. 58 COMPUTERIZED TOMOGRAPHIC IMAGING ( AA (20) . F(u. 02. y) coordinate system by using the relationships in (14). (8) Substituting the definition of a projection into the above equation we find Se(w) . v = w sin 0) or (18) This equation is the essence of straight ray tomography and proves the Fourier Slice Theorem. 3.²j2r w(x cos 0 +y sin 8) r S9(w)=F(w. for the purpose of computation (19) can be written as f(x. y) can be recovered by using the inverse Fourier transform: f(x. then F(u. y)= 00 c .-_ (15) Pe(t)e-rziwt dt. v).--.f (t. the object function f(x. s) ds and from (8)its Fourier transform is given by Se(w)=.r(ux-i-uY) du dv. oe. y) system as expressed by t [ x] = cos 0 sin 0 (14) s ² sin 0 cos 0 ] [y In the (t.0= A2 Ä. 0k and Fourier transforming each of these. the result being Se (W) =f (x. y) is bounded by ²A/2 < x < A/2 and ²A/2 < y < A/2.[ f (t. s) ds e-J2Tws dt. ‡ ‡ ‡ . we can determine the values of F(u. dx dy. w sin 0). . . Knowing F(u.6.The derivation of the Fourier Slice Theorem can be placed on a more solid foundation by considering the (t.06. O C (19) If the function f(x. (16) This result can be transformed into the (x.12. v) on radial lines as shown in Fig. v) would be known at all points in the uv-plane. s) coordinate system a projection along lines of constant t is written Po(t)= 1 0. (17) The right-hand side of this equation now represents the two-dimensional Fourier transform at a spatial frequency of (u = w cos 0. If an infinite number of projections are taken. The above result indicates that by taking the projections of an object function at angles 01.

Since in practice only a finite number of Fourier components will be known, we can write

f(x, y) ≈ (1/A²) Σ_{m=−N/2}^{N/2} Σ_{n=−N/2}^{N/2} F(m/A, n/A) e^(j2π((m/A)x + (n/A)y))    (22)

for −A/2 < x < A/2 and −A/2 < y < A/2    (23)

where we arbitrarily assume N to be an even integer. It is clear that the spatial resolution in the reconstructed picture is determined by N. Equation (22) can be rapidly implemented by using the fast Fourier transform (FFT) algorithm provided the N² Fourier coefficients F(m/A, n/A) are known.

Fig. 3.7: Collecting projections of the object at a number of angles gives estimates of the Fourier transform of the object along radial lines. Since an FFT algorithm is used for transforming the data, the dots represent the actual location of estimates of the object's Fourier transform. (From [Pan83].)

In practice only a finite number of projections of an object can be taken. In that case it is clear that the function F(u, v) is only known along a finite number of radial lines such as in Fig. 3.7. In order to be able to use (22) one must then interpolate from these radial points to the points on a square grid. Theoretically, one can exactly determine the N² coefficients required in (22) provided as many values of the function F(u, v) are known on some radial lines [Cro70]; however, this calculation involves solving a large set of simultaneous equations, often leading to unstable solutions. It is more common to determine the values on the square grid by some kind of nearest neighbor or linear interpolation from the radial points. Since the density of the radial points becomes sparser as one gets farther away from the center, the interpolation error also becomes larger. This implies that there is greater error in the

calculation of the high frequency components in an image than in the low frequency ones, which results in some image degradation.

3.3 Reconstruction Algorithms for Parallel Projections

The Fourier Slice Theorem relates the Fourier transform of a projection to the Fourier transform of the object along a single radial line. Thus given the Fourier transform of a projection at enough angles, the projections could be assembled into a complete estimate of the two-dimensional transform and then simply inverted to arrive at an estimate of the object. While this provides a simple conceptual model of tomography, practical implementations require a different approach.

The algorithm that is currently being used in almost all applications of straight ray tomography is the filtered backprojection algorithm. It has been shown to be extremely accurate and amenable to fast implementation and will be derived by using the Fourier Slice Theorem. This theorem is brought into play by rewriting the inverse Fourier transform in polar coordinates and rearranging the limits of the integration therein. The derivation of this algorithm is perhaps one of the most illustrative examples of how we can obtain a radically different computer implementation by simply rewriting the fundamental expressions for the underlying theory.

In this chapter, derivations and implementation details will be presented for the backprojection algorithms for three types of scanning geometries: parallel beam, equiangular fan beam, and equispaced fan beam. The computer implementation of these algorithms requires the projection data to be sampled and then filtered. Using FFT algorithms we will show how to implement them for fast computation. Before launching into the mathematical derivations of the algorithms, we will first provide a bit of intuitive rationale behind the filtered backprojection type of approach. If the reader finds this presentation excessively wordy, he or she may go directly to Section 3.3.2.

3.3.1 The Idea

The filtered backprojection algorithm can be given a rather straightforward intuitive rationale because each projection represents a nearly independent measurement of the object. This isn't obvious in the space domain, but if the Fourier transform is found of the projection at each angle then it follows easily by the Fourier Slice Theorem. We say that the projections are nearly independent (in a loose intuitive sense) because the only common information in the Fourier transforms of two projections at different angles is the dc term.

To develop the idea behind the filtered backprojection algorithm, we note that because of the Fourier Slice Theorem the act of measuring a projection can be seen as performing a two-dimensional filtering operation. Consider a single projection and its Fourier transform.

Fig. and multiply it by the width of the wedge at that frequency. (2irl w I /K)S0(w). this constitutes the backprojection process.8(a). As the name implies. Later when we derive the theory more rigorously. which is equivalent to finding the elemental reconstructions corresponding to each wedge filter mentioned above. 3. What we really want from a simple reconstruction procedure is the sum of projections of the object filtered by pie-shaped wedges as shown in Fig. and the backprojection part. What is actually measured is shown in (b). The final reconstruction is found by adding together the two-dimensional inverse Fourier transform of each weighted projection. w. (a) is the ideal situation. 3. Thus the weighted projections represent an approximation to the pie-shaped wedge but the error can be made as small as desired by using enough projections. As predicted by the Fourier Slice Theorem. there are two steps to the filtered backprojection algorithm: the filtering part. when the summation is carried out in the space domain. As will be seen later. we will see that this factor of w I represents the Jacobian for a change of variable between polar coordinates and the rectangular coordinates needed for the inverse Fourier transform. each wedge has a width of 2irl w I /K. which can be visualized as a simple weighting of each projection in the frequency domain. So (w). Because each (b) (c) ALGORITHMS FOR RECONSTRUCTION WITH NONDIFFRACTING SOURCES 61 . The filtered backprojection algorithm takes the data in (b) and applies a weighting in the frequency domain so that the data in (c) are an approximation to those in (a). this projection gives the values of the object's two-dimensional Fourier transform along a single line. 
If the values of the Fourier transform of this projection are inserted into their proper place in the object's two-dimensional Fourier domain then a simple (albeit very distorted) reconstruction can be formed by assuming the other projections to be zero and finding the twodimensional inverse Fourier transform. 3. It is important to remember that this summation can be done in either the Fourier domain or in the space domain because of the linearity of the Fourier transform. A reconstruction could be formed by simply summing the reconstruction from each angle until the entire frequency domain is filled. The effect of this weighting by 271w j /K is shown in Fig. a projection gives information about the Fourier transform of the object along a single line. the weighted projection. The point of this exercise is to show that the reconstruction so formed is equivalent to the original object's Fourier transform multiplied by the simple filter shown in Fig. Perhaps the simplest way to do this is to take the value of the Fourier transform of the projection. has the same "mass" as the pieshaped wedge. 3.8(b). Comparing this to that shown in (a) we see that at each spatial frequency. The first step mentioned above accomplishes the following: A simple weighting in the frequency domain is used to take each projection and estimate a pie-shaped wedge of the object's Fourier transform.8(c).8: This figure shows the frequency domain data available from one projection. Thus if there are K projections over 180° then at a given frequency w.

The complete filtered backprojection algorithm can therefore be written as:

For each of the K angles, θ, between 0 and 180°:
    Measure the projection, Pθ(t)
    Fourier transform it to find Sθ(w)
    Multiply it by the weighting function 2π|w|/K
    Sum over the image plane the inverse Fourier transforms of the filtered projections (the backprojection process).

There are two advantages to the filtered backprojection algorithm over a frequency domain interpolation scheme. Most importantly, the reconstruction procedure can be started as soon as the first projection has been measured. This can speed up the reconstruction procedure and reduce the amount of data that must be stored at any one time. To appreciate the second advantage, the reader must note (this will become clearer in the next subsection) that in the filtered backprojection algorithm, when we compute the contribution of each filtered projection to an image point, interpolation is often necessary; it turns out that it is usually more accurate to carry out interpolation in the space domain, as part of the backprojection or smearing process, than in the frequency domain. Simple linear interpolation is often adequate for the backprojection algorithm, while more complicated approaches are needed for direct Fourier domain interpolation [Sta81].

To perform a reconstruction it is necessary to filter the projection and then backproject the result as shown in Fig. 3.9. In Fig. 3.9(a) we show the projection of an ellipse as calculated by (5). The second step is commonly called a backprojection since, as we will show in the next section, it can be perceived as the smearing of each filtered projection over the image plane.

Fig. 3.9: A projection of an ellipse is shown in (a). (b) shows the projection after it has been filtered in preparation for backprojection.

The result due to backprojecting one projection is shown in Fig. 3.10. It takes many projections to accurately reconstruct an object; Fig. 3.10 shows the result of reconstructing an object with up to 512 projections.

Fig. 3.10: The result of backprojecting the projection in Fig. 3.9 is shown here. (a) shows the result of backprojecting for a single angle, (b) shows the effect of backprojecting over 4 angles, (c) shows 64 angles, and (d) shows 512 angles.

3.3.2 Theory

We will first present the backprojection algorithm for parallel beam projections. Recalling the formula for the inverse Fourier transform, the object function, f(x, y), can be expressed as

f(x, y) = ∫∫ F(u, v) e^(j2π(ux+vy)) du dv.    (24)

Exchanging the rectangular coordinate system in the frequency domain, (u, v), for a polar coordinate system, (w, θ), by making the substitutions

u = w cos θ    (25)

v = w sin θ    (26)

and then changing the differentials by using

du dv = w dw dθ    (27)

we can write the inverse Fourier transform of a polar function as

f(x, y) = ∫_0^{2π} ∫_0^∞ F(w, θ) e^(j2πw(x cos θ + y sin θ)) w dw dθ.    (28)

This integral can be split into two by considering θ from 0° to 180° and then from 180° to 360°,

f(x, y) = ∫_0^π ∫_0^∞ F(w, θ) e^(j2πw(x cos θ + y sin θ)) w dw dθ
        + ∫_0^π ∫_0^∞ F(w, θ + 180°) e^(j2πw[x cos(θ+180°) + y sin(θ+180°)]) w dw dθ    (29)

and then using the property

F(w, θ + 180°) = F(−w, θ)    (30)

the above expression for f(x, y) may be written as

f(x, y) = ∫_0^π [∫_{−∞}^{∞} F(w, θ) |w| e^(j2πwt) dw] dθ.    (31)

Here we have simplified the expression by setting

t = x cos θ + y sin θ.    (32)

If we substitute the Fourier transform of the projection at angle θ, Sθ(w), for the two-dimensional Fourier transform F(w, θ), we get

f(x, y) = ∫_0^π [∫_{−∞}^{∞} Sθ(w) |w| e^(j2πwt) dw] dθ.    (33)

This integral in (33) may be expressed as

f(x, y) = ∫_0^π Qθ(x cos θ + y sin θ) dθ    (34)

where

Qθ(t) = ∫_{−∞}^{∞} Sθ(w) |w| e^(j2πwt) dw.    (35)

This estimate of f(x, y), given the projection data transform Sθ(w), has a simple form. Equation (35) represents a filtering operation, where the frequency response of the filter is given by |w|; therefore Qθ(t) is called a "filtered projection." The resulting projections for different angles θ are then added to form the estimate of f(x, y).

Equation (34) calls for each filtered projection, Qθ, to be "backprojected." This can be explained as follows. To every point (x, y) in the image

plane there corresponds a value of t = x cos θ + y sin θ for a given value of θ, and the filtered projection Qθ contributes to the reconstruction its value at t (= x cos θ + y sin θ). This is further illustrated in Fig. 3.11. It is easily shown that for the indicated angle θ, the value of t is the same for all (x, y) on the line LM. Therefore, the filtered projection, Qθ, will make the same contribution to the reconstruction at all of these points. Therefore, one could say that in the reconstruction process each filtered projection, Qθ, is smeared back, or backprojected, over the image plane.

Fig. 3.11: Reconstructions are often done using a procedure known as backprojection. Here a filtered projection is smeared back over the reconstruction plane along lines of constant t. The filtered projection at a point t makes the same contribution to all pixels along the line LM in the x-y plane. (From [Ros82].)

The parameter w has the dimension of spatial frequency. The integration in (35) must, in principle, be carried out over all spatial frequencies. In practice the energy contained in the Fourier transform components above a certain frequency is negligible, so for all practical purposes the projections may be considered to be bandlimited. If W is a frequency higher than the highest frequency component in each projection, then by the sampling theorem the projections can be sampled at intervals of

T = 1/(2W)    (36)

without introducing any error. If we also assume that the projection data are equal to zero for large values of |t|, then a projection can be represented as

Pθ(mT),    m = −N/2, …, −1, 0, 1, …, N/2 − 1    (37)

for some (large) value of N. An FFT algorithm can then be used to

2 w)


approximate the Fourier transform So(w) of a projection by 1 N/2-1 S A W ) S ( I n - = po( k_ e-j2r(m 2 W k= -N/2 2W


k / N )


Given the samples of a projection, (38) gives the samples of its Fourier transform. The next step is to evaluate the "modified projection" Qθ(t) digitally. Since the Fourier transforms Sθ(w) have been assumed to be bandlimited, (35) can be approximated by

Qθ(t) = ∫_{−W}^{W} Sθ(w) |w| e^(j2πwt) dw    (39)

      ≈ (2W/N) Σ_{m=−N/2}^{N/2} Sθ(m · 2W/N) |m · 2W/N| e^(j2πm(2W/N)t)    (40)

provided N is large enough. Again, if we want to determine the projections Qθ(t) for only those t at which the projections Pθ(t) are sampled, we get

Qθ(k/2W) ≈ (2W/N) Σ_{m=−N/2}^{N/2} Sθ(m · 2W/N) |m · 2W/N| e^(j2π(mk/N)),    (41)

    k = −N/2, …, −1, 0, 1, …, N/2.    (42)
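Equations (39)-(41) amount to one FFT, a multiply, and an inverse FFT. The sketch below applies the |w| weighting to a single sampled projection; the frequency normalization follows `np.fft.fftfreq` rather than the 2W/N convention above, which only changes an overall scale. Note that the dc bin is zeroed by the |w| weight, so the filtered projection must sum to zero.

```python
import numpy as np

def filter_projection(p):
    """Discrete version of eq. (35): multiply the projection's DFT by |w|."""
    S = np.fft.fft(p)
    w = np.fft.fftfreq(p.size)     # frequency axis for the DFT bins
    return np.real(np.fft.ifft(S * np.abs(w)))

p = np.exp(-np.linspace(-3, 3, 128) ** 2)  # a smooth bump stands in for a projection
q = filter_projection(p)

# the |w| weight zeroes the dc bin, so the filtered projection has zero mean
assert abs(q.sum()) < 1e-9
```

The zero-mean property is a quick diagnostic: a ramp-filtered projection that does not sum to (nearly) zero indicates a mistake in the frequency-axis bookkeeping.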
By the above equation, the function Qθ(t) at the sampling points of the projection functions is given (approximately) by the inverse DFT of the product of Sθ(m · 2W/N) and |m · 2W/N|. From the standpoint of noise in the reconstructed image, superior results are usually obtained if one multiplies the filtered projection, Sθ(m · 2W/N) |m · 2W/N|, by a function such as a Hamming window [Ham77]:

Qθ(k/2W) ≈ (2W/N) Σ_{m=−N/2}^{N/2} Sθ(m · 2W/N) |m · 2W/N| H(m · 2W/N) e^(j2π(mk/N))    (43)

where H(m · 2W/N) represents the window function used. The purpose of the window function is to deemphasize high frequencies, which in many cases represent mostly observation noise. By the familiar convolution theorem for the case of discrete transforms, (43) can be written as

Qθ(k/2W) ≈ Pθ(k/2W) * φ(k/2W)    (44)

where * denotes circular (periodic) convolution and where φ(k/2W) is the inverse DFT of the discrete function |m · 2W/N| H(m · 2W/N), m = −N/2, …, −1, 0, 1, …, N/2.
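The windowed filter of (43)-(44) can be tabulated directly: build the |w| ramp on the DFT bins, taper it with a Hamming-type window H, and inverse-transform to get the convolution kernel φ. A sketch, with the window's exact parameterization over the band being our own choice:

```python
import numpy as np

N = 256
w = np.fft.fftfreq(N)                          # bin frequencies in cycles/sample
ramp = np.abs(w)                               # the |w| response of eq. (40)
window = 0.54 + 0.46 * np.cos(2 * np.pi * w)   # Hamming taper over (-1/2, 1/2)
response = ramp * window                       # |m(2W/N)| * H(m(2W/N)) of eq. (43)

phi = np.fft.ifft(response)                    # the kernel phi of eq. (44)
```

Because the tapered response is real and even across the DFT bins, its inverse DFT φ is (numerically) real, and the window only ever attenuates the ramp, never amplifies it.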


Clearly, at the sampling points of a projection, the function Qθ(t) may be obtained either in the Fourier domain by using (40), or in the space domain by using (44). The reconstructed picture f(x, y) may then be obtained by the discrete approximation to the integral in (34), i.e.,


f(x, y) = (π/K) Σ_{i=1}^{K} Qθi(x cos θi + y sin θi)    (45)


where the K angles θi are those for which the projections Pθ(t) are known. Note that the value of x cos θi + y sin θi in (45) may not correspond to one of the values of t for which Qθi is determined in (43) or in (44). However, Qθi for such t may be approximated by suitable interpolation; often, linear interpolation is adequate. Before concluding this subsection we would like to make two comments about the filtering operation in (35). First, note that (35) may be expressed in the t-domain as
Qθ(t) = ∫ Pθ(α) p(t − α) dα    (46)


where p(t) is nominally the inverse Fourier transform of the |w| function in the frequency domain. Since |w| is not a square integrable function, its inverse transform doesn't exist in an ordinary sense. However, one may examine the inverse Fourier transform of
|w| e^(−ε|w|)    (47)


as ε → 0. The inverse Fourier transform of this function, denoted by pε(t), is given by

pε(t) = (ε² − (2πt)²) / (ε² + (2πt)²)².    (48)

This function is sketched in Fig. 3.12. Note that for large t we get pε(t) ≈ −1/(2πt)². Now our second comment about the filtered projection in (35): this equation may also be written as


Qθ(t) = ∫ [j2πw Sθ(w)] [(−j/2π) sgn(w)] e^(j2πwt) dw    (49)


where

sgn(w) = { 1     for w > 0
        { −1    for w < 0.    (50)

By the standard convolution theorem, this equation may be expressed as

Qθ(t) = {IFT of j2πw Sθ(w)} * {IFT of (−j/2π) sgn(w)}    (51)


Fig. 3.12: An approximation to the impulse response of the ideal backprojection filter is shown here. (From [Ros82].)

where the symbol * denotes convolution and the abbreviation IFT stands for inverse Fourier transform. The IFT of j2πw Sθ(w) is (∂/∂t)Pθ(t), while the IFT of (−j/2π) sgn(w) is 1/(2π²t). Therefore, the above result may be written as

Qθ(t) = (1/(2π²)) (1/t) * ∂Pθ(t)/∂t    (52)

i.e., to within a constant, the Hilbert Transform of ∂Pθ(t)/∂t


where, expressed as a filtering operation, the Hilbert Transform is usually defined as the following frequency response:

H(w) = { −j    for w > 0
      { j     for w < 0.    (53)




3.3.3 Computer Implementation of the Algorithm

Let's assume that the projection data are sampled with a sampling interval of τ cm. If there is no aliasing, this implies that in the transform domain the projections don't contain any energy outside the frequency interval (−W, W), where

W = 1/(2τ) cycles/cm.    (55)

Let the sampled projections be represented by Pθ(kτ), where k takes integer values. The theory presented in the preceding subsection says that for each sampled projection Pθ(kτ) we must generate a filtered Qθ(kτ) by using the periodic (circular) convolution given by (40). Equation (40) is very attractive since it directly conforms to the definition of the DFT and, if N is decomposable, possesses a fast FFT implementation. However, note that (40) is only valid when the projections are of finite bandwidth and finite order. Since these two assumptions (taken together) are never strictly satisfied, computer processing based on (40) usually leads to interperiod interference artifacts created when an aperiodic convolution (required by (35)) is implemented as a periodic convolution. This is illustrated in Fig. 3.13. Fig. 3.13(a) shows a reconstruction of the Shepp and Logan head phantom from 110 projections and 127 rays in each projection using (40) and (45). Equation (40) was implemented with a base 2 FFT algorithm using 128 points. Fig. 3.13(b) shows the reconstructed values on the horizontal line for y = −0.605; for comparison we have also shown the values on this line in the original object function. The comparison illustrated in Fig. 3.13(b) shows that reconstruction based on (42) and (45) introduces a slight "dishing" and a dc shift in the image. These artifacts are partly caused by the periodic convolution implied by (40) and partly by the fact that the implementations in (40) "zero out" all the information in the continuous frequency domain in the cell represented by m = 0, whereas the theory (eq. (35)) calls for such "zeroing out" to occur at only one frequency, viz. w = 0.

The contribution to these artifacts by the interperiod interference can be eliminated by adequately zero-padding the projection data before using the implementations in (42) or (43). Zero-padding of the projections also reduces, but never completely eliminates, the contribution to the artifacts by the zeroing out of the information in the m = 0 cell in (40). This is because zero-padding in the space domain causes the cell size to get smaller in the frequency domain. (If N_FFT points are used for performing the discrete Fourier transform, the size of each sampling cell in the frequency domain is equal to 1/(N_FFT τ).) To illustrate the effect of zero-padding, the 127 rays in each projection in the preceding example were padded with 129 zeros to make the data string 256 elements long. These data were transformed by an FFT algorithm and filtered with a |w| function as before. The y = −0.605 line through the resulting reconstruction is shown in Fig. 3.13(c), demonstrating that the dishing distortion is now less severe.
For comparison we have also shown the values on this line in the original object function.3 Computer Implementation of the Algorithm Let's assume that the projection data are sampled with a sampling interval of T cm. The comparison illustrated in Fig. the contribution to the artifacts by the zeroing out of the information in the m = 0 cell in (40). possesses a fast FFT implementation. note that (40) is only valid when the projections are of finite bandwidth and finite order. 3. 3. This is because zero-padding in the space domain causes the cell size to get smaller in the frequency domain.605. Equation (40) was implemented with a base 2 FFT algorithm using 128 points.3. These data were transformed by an FFT algorithm and filtered with a 1w1 function as before. However. the 127 rays in each projection in the preceding example were padded with 129 zeros to make the data string 256 elements long.) To illustrate the effect of zero-padding. 3. but never completely eliminates. The y = ³ 0. 3. the size of each sampling cell in the frequency domain is equal to 1/NFFT r.

(b) A numerical comparison of the true and the reconstructed values on the y = −0.605 line. (For the location of this line see Fig. 3.4.) The "dishing" and the dc shift artifacts are quite evident in this comparison. (c) Shown here are the reconstructed values obtained on the y = −0.605 line if the 127 rays in each projection are zero-padded to 256 points before using the FFTs. (From [Ros82].)

The dishing caused by interperiod interference has disappeared; however, the dc shift still remains.
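The interperiod interference described above is just the wraparound of circular convolution, and padding past the full aperiodic length removes it. A self-contained numerical sketch, with random sequences standing in for a projection and a filter kernel (the lengths mirror the 127-ray example):

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.random(127)          # one projection, 127 rays
h = rng.random(127)          # a filter kernel of the same length

direct = np.convolve(p, h)   # true aperiodic convolution, length 253

# unpadded: length-128 FFTs give a *circular* convolution -> wraparound error
n_small = 128
circ = np.real(np.fft.ifft(np.fft.fft(p, n_small) * np.fft.fft(h, n_small)))

# padded to 256 >= 253: the circular convolution equals the aperiodic one
n_big = 256
padded = np.real(np.fft.ifft(np.fft.fft(p, n_big) * np.fft.fft(h, n_big)))

assert np.allclose(padded[:253], direct)
assert not np.allclose(circ, direct[:128])
```

The same mechanism explains why padding the 127 rays to 256 points eliminated the dishing: 256 exceeds the 253-sample support of the aperiodic result, so no tail wraps back onto the head of the filtered projection.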

We will now show that the artifacts mentioned above can be eliminated by the following alternative implementation of (35) which doesn't require the approximation used in the discrete representation of (40). When the highest frequency in the projections is finite (as given by (55)), (35) may be expressed as

    Q_θ(t) = ∫ S_θ(w) H(w) e^{j2πwt} dw    (56)

where

    H(w) = |w| b_W(w)    (57)

where, again,

    b_W(w) = { 1   |w| < W
             { 0   otherwise.    (58)

H(w), shown in Fig. 3.14, represents the transfer function of a filter with which the projections must be processed. The impulse response, h(t), of this filter is given by the inverse Fourier transform of H(w) and is

    h(t) = ∫ H(w) e^{j2πwt} dw    (59)

         = 1/(2τ²) · sin(2πt/2τ)/(2πt/2τ) − 1/(4τ²) · [sin(πt/2τ)/(πt/2τ)]²    (60)

ALGORITHMS FOR RECONSTRUCTION WITH NONDIFFRACTING SOURCES 71

where we have used (55). Since the projection data are measured with a sampling interval of τ, for digital processing the impulse response need only be known with the same sampling interval. The samples, h(nτ), of h(t) are given by

    h(nτ) = { 1/(4τ²)        n = 0
            { 0              n even
            { −1/(n²π²τ²)    n odd.    (61)

This function is shown in Fig. 3.15.

Fig. 3.14: The ideal filter response for the filtered backprojection algorithm is shown here. It has been bandlimited to 1/2τ. (From [Ros82].)

Since both P_θ(t) and h(t) are now bandlimited functions, they may be expressed as

    P_θ(t) = Σ_{k=−∞}^{∞} P_θ(kτ) · sin 2πW(t − kτ) / [2πW(t − kτ)]    (62)

and

    h(t) = Σ_{k=−∞}^{∞} h(kτ) · sin 2πW(t − kτ) / [2πW(t − kτ)].    (63)
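The samples in (61) are straightforward to tabulate; a minimal sketch (the function name `h_samples` is ours, and τ = 1 is assumed in the demonstration):

```python
import numpy as np

def h_samples(n, tau=1.0):
    """Tabulate the ideal-filter impulse response samples h(n*tau), per (61)."""
    n = np.asarray(n)
    h = np.zeros(n.shape, dtype=float)
    h[n == 0] = 1.0 / (4.0 * tau**2)
    odd = (n % 2) != 0
    h[odd] = -1.0 / (np.pi**2 * n[odd]**2 * tau**2)
    return h

n = np.arange(-8, 9)
h = h_samples(n)      # symmetric; zero at every nonzero even n
```

The sequence is even in n, peaks at n = 0, and decays as 1/n² along the odd samples, which is what makes truncation of the kernel (and hence zero-padding) a practical concern.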

By the convolution theorem the filtered projection (56) can be written as

    Q_θ(t) = ∫_{−∞}^{∞} P_θ(t′) h(t − t′) dt′.    (64)

Fig. 3.15: The impulse response of the filter shown in Fig. 3.14. (From [Ros82].)

Substituting (62) and (63) in (64) we get the following result for the values of the filtered projection at the sampling points:

    Q_θ(nτ) = τ Σ_{k=−∞}^{∞} h(nτ − kτ) P_θ(kτ).    (65)

In practice each projection is of only finite extent. Suppose that each P_θ(kτ) is zero outside the index range k = 0, 1, ···, N − 1. We may now write the following two equivalent forms of (65):

    Q_θ(nτ) = τ Σ_{k=0}^{N−1} h(nτ − kτ) P_θ(kτ),    n = 0, 1, 2, ···, N − 1    (66)

or

    Q_θ(nτ) = τ Σ_{k=−(N−1)}^{N−1} h(kτ) P_θ(nτ − kτ),    n = 0, 1, 2, ···, N − 1.    (67)
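The two implementations can be exercised in a few lines. The sketch below (names are ours; a cosine ramp stands in for a real projection) computes the filtered projection both by direct aperiodic convolution, as in (66), and by FFTs padded far enough that the circular convolution reproduces the aperiodic one:

```python
import numpy as np

def h_imp(n, tau=1.0):
    """Ideal-filter impulse response samples h(n*tau), per (61)."""
    n = np.asarray(n)
    h = np.zeros(n.shape, dtype=float)
    h[n == 0] = 1.0 / (4.0 * tau**2)
    odd = (n % 2) != 0
    h[odd] = -1.0 / (np.pi**2 * n[odd]**2 * tau**2)
    return h

def filter_direct(p, tau=1.0):
    """Aperiodic convolution of (66): Q(n) = tau * sum_k h(n-k) P(k)."""
    N = len(p)
    kernel = h_imp(np.arange(-(N - 1), N), tau)
    # full convolution has length 3N-2; keep the samples for n = 0..N-1
    return tau * np.convolve(p, kernel)[N - 1:2 * N - 1]

def filter_fft(p, tau=1.0):
    """The same filtering via zero-padded FFTs, in the spirit of (68)."""
    N = len(p)
    kernel = h_imp(np.arange(-(N - 1), N), tau)
    M = 1
    while M < 3 * N - 2:     # room for the entire aperiodic result
        M *= 2
    q = np.real(np.fft.ifft(np.fft.fft(p, M) * np.fft.fft(kernel, M)))
    return tau * q[N - 1:2 * N - 1]

p = np.cos(np.linspace(0, 3, 127))    # a stand-in projection
q1, q2 = filter_direct(p), filter_fft(p)
```

With adequate padding the two routes agree to rounding error, which is the point of the zero-padding discussion that follows.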

These equations imply that in order to determine Q_θ(nτ) the length of the sequence h(nτ) used should be from l = −(N − 1) to l = (N − 1). It is important to realize that the results obtained by using (66) or (67) aren't identical to those obtained by using (42). This is because the discrete Fourier transform of the sequence h(nτ) with n taking values in a finite range [such as when n ranges from −(N − 1) to (N − 1)] is not the sequence {|k · 2W/N|}. While the latter sequence is zero at k = 0, the DFT of h(nτ) with n ranging from −(N − 1) to (N − 1) is nonzero at this point. This difference is illustrated in Fig. 3.16.

Fig. 3.16: The DFT of the bandlimited filter (broken line) and that of the ideal filter (solid line) are shown here. Notice the primary difference is in the dc component. (From [Ros82].)

The discrete convolution in (66) or (67) may be implemented directly on a general purpose computer. However, it is much faster to implement it in the frequency domain using FFT algorithms. [By using specially designed hardware, direct implementation of (66) can be made as fast or faster than the frequency domain implementation.] For the frequency domain implementation one has to keep in mind the fact that one can now only perform periodic (or circular) convolutions, while the convolution required in (66) is aperiodic. To eliminate the interperiod interference artifacts inherent to periodic convolution, we pad the projection data with a sufficient number of zeros. It can easily be shown [Jak76] that if we pad P_θ(kτ) with zeros so that it is (2N − 1) elements long, we avoid interperiod interference over the N samples of Q_θ(kτ). Of course, if one wants to use the base 2 FFT algorithm, which is most often the case, the sequences P_θ(kτ) and h(kτ) have to be zero-padded so that each is (2N − 1)₂ elements long, where (2N − 1)₂ is the smallest integer that is a power of 2 and that is greater than 2N − 1. Therefore, the frequency domain implementation may be expressed as

    Q_θ(nτ) = τ × IFFT{ [FFT P_θ(nτ) with ZP] × [FFT h(nτ) with ZP] }    (68)

where FFT and IFFT denote, respectively, fast Fourier transform and inverse fast Fourier transform; ZP stands for zero-padding. One usually obtains superior reconstructions when some smoothing is also incorporated in (68). Smoothing may be implemented by multiplying the product of the two FFTs by a Hamming window. When such a window is incorporated, (68) may be rewritten as

    Q_θ(nτ) = τ × IFFT{ [FFT P_θ(nτ) with ZP] × [FFT h(nτ) with ZP] × smoothing-window }.    (69)

After the filtered projections Q_θ(nτ) are calculated with the alternative method presented here, the rest of the implementation for reconstructing the image is the same as in the preceding subsection. That is, we use (45) for backprojections and their summation. Again, for a given (x, y) and θ, the argument x cos θ + y sin θ may not correspond to one of the kτ at which Q_θ is known. This will call for interpolation, and often linear interpolation is adequate. Sometimes, in order to eliminate the computations required for interpolation, preinterpolation of the functions Q_θ(t) is used. In this technique, prior to backprojection, the function Q_θ(t) is preinterpolated onto 10 to 1000 times the number of points in the projection data. From this dense set of points one simply retains the nearest neighbor to obtain the value of Q_θ at x cos θ + y sin θ. A variety of techniques are available for preinterpolation [Sch73]. One method of preinterpolation, which can be combined with the computation in (69), consists of the following: in (69), prior to performing the IFFT, the frequency domain function is padded with a large number of zeros. The inverse transform of this sequence yields the preinterpolated Q_θ. It was recently shown [Kea78] that if the data sequence contains "fractional" frequencies this approach may lead to large errors, especially near the beginning and the end of the data sequence. Note that with preinterpolation and with appropriate programming, the backprojection for parallel projection data can be accomplished with virtually no multiplications.

Using the implementation in (68), the projection data were padded with zeros to make each projection 256 elements long. Fig. 3.17(a) shows the complete reconstruction. The number of rays used in each projection was 127 and the number of projections 100. Fig. 3.17(b) shows the reconstructed values on the line y = −0.605 for the Shepp and Logan head phantom. Comparing with Fig. 3.13(b), we see that the dc shift and the dishing have been eliminated.

3.4 Reconstruction from Fan Projections

The theory in the preceding subsections dealt with reconstructing images from their parallel projections such as those shown in Fig. 3.1. In generating these parallel data a source-detector combination has to linearly scan over the

Fig. 3.17: (a) Reconstruction obtained by using the filter shown in Fig. 3.16. The 127 rays in each projection were zero-padded so that each projection was 256 elements long. The unit sample response h(nτ) was used with n ranging from −128 to 127, yielding 256 points for this function. The number of projections was 100 and the display matrix size is 128 × 128. (b) A numerical comparison of the y = −0.605 line of the reconstruction in (a) with the true values. Note that the dishing and dc shift artifacts visible in Fig. 3.13 have disappeared. (From [Ros82].)

length of a projection, then rotate through a certain angular interval, then scan linearly over the length of the next projection, and so on. This usually results in times that are as long as a few minutes for collecting all the data. A much faster way to generate the line integrals is by using fan beams such as those shown in Fig. 3.3. One now uses a point source of radiation that emanates a fan-shaped beam. On the other side of the object a bank of detectors is used to make all the measurements in one fan simultaneously. The source and the entire bank of detectors are rotated to generate the desired number of fan projections. As might be expected, one has to pay a price for this simpler and

faster method of data collection. As we will see later, the simple backprojection of parallel beam tomography now becomes a weighted backprojection.

There are two types of fan projections depending upon whether a projection is sampled at equiangular or equispaced intervals. This difference is illustrated in Fig. 3.18. In (a) we have shown an equiangular set of rays. If the detectors for the measurement of line integrals are arranged on the straight line D1 D2, this implies unequal spacing between them. If, however, the detectors are arranged on the arc of a circle whose center is at S, they may now be positioned with equal spacing along this arc (Fig. 3.18(b)). The second type of fan projection is generated when the rays are arranged such that the detector spacing on a straight line is equal (Fig. 3.18(c)). The algorithms that reconstruct images from these two different types of fan projections are different and will be separately derived in the following subsections.

Fig. 3.18: Two different types of fan beams are shown here. In (a) the angle between rays is constant but the detector spacing is uneven. If the detectors are placed along a circle the spacing will then be equal, as shown in (b). As shown in (c), the detectors can be arranged with constant spacing along a line, but then the angle between rays is not constant. (From [Ros82].)

3.4.1 Equiangular Rays

Let R_β(γ) denote a fan projection as shown in Fig. 3.19. Here β is the angle that the source S makes with a reference axis, and the angle γ gives the location of a ray within a fan. Consider the ray SA. If the projection data were generated along a set of parallel rays, then the ray SA would belong to a parallel projection P_θ(t) for θ and t given by

    θ = β + γ   and   t = D sin γ    (70)

where D is the distance of the source S from the origin O. The relationships in (70) are derived by noting that all the rays in the parallel projection at angle θ are perpendicular to the line PQ and that along such a line the distance OB is equal to the value of t.

Now we know that from parallel projections P_θ(t) we may reconstruct f(x, y) by

    f(x, y) = ∫₀^π ∫_{−t_m}^{t_m} P_θ(t) h(x cos θ + y sin θ − t) dt dθ    (71)

where t_m is the value of t for which P_θ(t) = 0 with |t| > t_m in all projections. This equation only requires the parallel projections to be collected over 180°. However, if one would like to use the projections generated over 360°, this equation may be rewritten as

    f(x, y) = ½ ∫₀^{2π} ∫_{−t_m}^{t_m} P_θ(t) h(x cos θ + y sin θ − t) dt dθ.    (72)

Derivation of the algorithm becomes easier when the point (x, y) (marked C in Fig. 3.20) is expressed in polar coordinates (r, φ), that is,

    x = r cos φ,   y = r sin φ.    (73)

The expression in (72) can now be written as

    f(r, φ) = ½ ∫₀^{2π} ∫_{−t_m}^{t_m} P_θ(t) h(r cos(θ − φ) − t) dt dθ.    (74)

Using the relationships in (70), the double integration may be expressed in terms of γ and β:

    f(r, φ) = ½ ∫_{−γ}^{2π−γ} ∫_{−sin⁻¹(t_m/D)}^{sin⁻¹(t_m/D)} P_{β+γ}(D sin γ) h(r cos(β + γ − φ) − D sin γ) D cos γ dγ dβ    (75)

where we have used

    dt dθ = D cos γ dγ dβ.

A few observations about this expression are in order. The limits −γ to 2π − γ for β cover the entire range of 360°. Since all the functions of β are periodic (with period 2π) these limits

may be replaced by 0 and 2π, respectively. In (75), sin⁻¹(t_m/D) is equal to the value of γ for the extreme ray SE in Fig. 3.19. Therefore, the upper and lower limits for γ may be written as γ_m and −γ_m, respectively.

Fig. 3.19: An equiangular fan is shown here. Each ray is identified by its angle γ from the central ray. (From [Ros82].)

The expression P_{β+γ}(D sin γ) corresponds to the ray integral along SA in the parallel projection data P_θ(t). The identity of this ray integral in the fan projection data is simply R_β(γ). Introducing these changes in (75) we get

    f(r, φ) = ½ ∫₀^{2π} ∫_{−γ_m}^{γ_m} R_β(γ) h(r cos(β + γ − φ) − D sin γ) D cos γ dγ dβ.    (76)

In order to express the reconstruction formula given by (76) in a form that can be easily implemented on a computer we will first examine the argument

of the function h. The argument may be rewritten as

    r cos(β + γ − φ) − D sin γ = r cos(β − φ) cos γ − [r sin(β − φ) + D] sin γ.    (77)

Let L be the distance from the source S to a point (x, y) [or (r, φ) in polar coordinates] such as C in Fig. 3.20, and let γ′ be the angle of the ray that passes through this point. One can now easily show that

    L cos γ′ = D + r sin(β − φ)
    L sin γ′ = r cos(β − φ).    (78)

Clearly, L is a function of three variables, r, φ, and β:

    L(r, φ, β) = √{[D + r sin(β − φ)]² + [r cos(β − φ)]²}    (79)

and

    γ′ = tan⁻¹ { r cos(β − φ) / [D + r sin(β − φ)] }.    (80)

Fig. 3.20: This figure illustrates that L is the distance of the pixel at location (x, y) from the source S, and γ′ is the angle that the source-to-pixel line subtends with the central ray. (From [Ros82].)
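Equations (78) through (80) translate directly into code. In the sketch below (the function name is ours, and the source position S = (−D sin β, D cos β) is an assumed convention consistent with our reading of the figures), the result is cross-checked against the raw source-to-pixel distance:

```python
import numpy as np

def L_gamma_prime(r, phi, beta, D):
    """L(r, phi, beta) per (79) and gamma' per (80)."""
    a = D + r * np.sin(beta - phi)   # = L cos(gamma'), per (78)
    b = r * np.cos(beta - phi)       # = L sin(gamma'), per (78)
    return np.hypot(a, b), np.arctan2(b, a)

# cross-check against the raw geometry for one pixel
D, r, phi, beta = 5.0, 1.3, 0.7, 2.1
L, gp = L_gamma_prime(r, phi, beta, D)
S = np.array([-D * np.sin(beta), D * np.cos(beta)])   # assumed source position
P = np.array([r * np.cos(phi), r * np.sin(phi)])      # the pixel
L_geom = np.linalg.norm(S - P)
```

Because both L and γ′ depend only on (r, φ, β), they can be precomputed per pixel and per projection angle in an implementation of the weighted backprojection that follows.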

Using (78) in (77) we get for the argument of h

    r cos(β + γ − φ) − D sin γ = L sin(γ′ − γ)    (81)

and substituting this in (76) we get

    f(r, φ) = ½ ∫₀^{2π} ∫_{−γ_m}^{γ_m} R_β(γ) h(L sin(γ′ − γ)) D cos γ dγ dβ.    (82)

We will now express the function h(L sin(γ′ − γ)) in terms of h(t). Note that h(t) is the inverse Fourier transform of |w| in the frequency domain:

    h(t) = ∫_{−∞}^{∞} |w| e^{j2πwt} dw.    (83)

Therefore,

    h(L sin γ) = ∫_{−∞}^{∞} |w| e^{j2πwL sin γ} dw.    (84)

Using the transformation

    w′ = w (L sin γ)/γ    (85)

we can write

    h(L sin γ) = (γ/(L sin γ))² ∫_{−∞}^{∞} |w′| e^{j2πw′γ} dw′    (86)

               = (γ/(L sin γ))² h(γ).    (87)

Therefore, (82) may be written as

    f(r, φ) = ∫₀^{2π} (1/L²) ∫_{−γ_m}^{γ_m} R_β(γ) g(γ′ − γ) D cos γ dγ dβ    (88)

where

    g(γ) = ½ (γ/sin γ)² h(γ).    (89)

For the purpose of computer implementation, (88) may be interpreted as a weighted filtered backprojection algorithm. To show this we rewrite (88) as follows:

    f(r, φ) = ∫₀^{2π} (1/L²) Q_β(γ′) dβ    (90)

where

    Q_β(γ) = R′_β(γ) * g(γ)    (91)

and where

    R′_β(γ) = R_β(γ) · D cos γ.    (92)

This calls for reconstructing an image using the following three steps:

Step 1: Assume that each projection R_β(γ) is sampled with sampling interval α. The known data then are R_{βi}(nα), where n takes integer values and the βi are the angles at which projections are taken. Note that n = 0 corresponds to the ray passing through the center of the projection. The first step is to generate for each fan projection R_{βi}(nα) the corresponding modified projection R′_{βi}(nα) given by

    R′_{βi}(nα) = R_{βi}(nα) · D · cos nα.    (93)

Step 2: Convolve each modified projection R′_{βi}(nα) with g(nα) to generate the corresponding filtered projection:

    Q_{βi}(nα) = R′_{βi}(nα) * g(nα).    (94)

To perform this discrete convolution using an FFT program, the function R′_{βi}(nα) must be padded with a sufficient number of zeros to avoid interperiod interference artifacts. The sequence g(nα) is given by the samples of (89):

    g(nα) = ½ (nα/sin nα)² h(nα).    (95)

If we substitute in this the values of h(nα) from (61), we get for the discrete impulse response

    g(nα) = { 1/(8α²)               n = 0
            { 0                     n even
            { −½ [1/(π sin nα)]²    n odd.    (96)

Although, theoretically, no further filtering of the projection data than that called for by (94) is required, in practice superior reconstructions are obtained if a certain amount of smoothing is combined with the required filtering:

    Q_{βi}(nα) = R′_{βi}(nα) * g(nα) * k(nα).    (97)
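Steps 1 and 2 can be sketched as follows (the names are ours; α is also used as the integration step of the discrete convolution, and the demonstration checks the sampled kernel (96) against the defining relation (95)):

```python
import numpy as np

def h_samples(n, alpha):
    """Parallel-beam kernel samples per (61), with spacing alpha."""
    n = np.asarray(n)
    h = np.zeros(n.shape, dtype=float)
    h[n == 0] = 1.0 / (4.0 * alpha**2)
    odd = (n % 2) != 0
    h[odd] = -1.0 / (np.pi**2 * n[odd]**2 * alpha**2)
    return h

def g_samples(n, alpha):
    """Equiangular fan-beam kernel per (96)."""
    n = np.asarray(n)
    g = np.zeros(n.shape, dtype=float)
    g[n == 0] = 1.0 / (8.0 * alpha**2)
    odd = (n % 2) != 0
    g[odd] = -0.5 / (np.pi * np.sin(n[odd] * alpha))**2
    return g

def filter_fan_projection(R, alpha, D):
    """Steps 1 and 2: weight per (93), then convolve with g per (94)."""
    N = len(R)
    n = np.arange(N) - (N - 1) // 2          # n = 0 at the central ray
    Rp = R * D * np.cos(n * alpha)           # (93)
    k = np.arange(-(N - 1), N)
    return alpha * np.convolve(Rp, g_samples(k, alpha))[N - 1:2 * N - 1]

alpha = 0.01
g3 = g_samples(np.array([3]), alpha)[0]
ref = 0.5 * (3 * alpha / np.sin(3 * alpha))**2 * h_samples(np.array([3]), alpha)[0]
```

For small nα the fan-beam kernel nearly equals the parallel-beam kernel, since (nα/sin nα)² approaches 1; the difference matters only toward the edges of wide fans.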

Here k(nα) is the impulse response of the smoothing filter. In the frequency domain implementation this smoothing filter may be a simple cosine function or a Hamming window.

Step 3: Perform a weighted backprojection of each filtered projection along the fan. Since the backprojection here is very different from that for the parallel case, we will explain it in some detail. For the parallel case the filtered projection is backprojected along a set of parallel lines as shown in Fig. 3.21(a). For the fan beam case the backprojection is done along the fan (Fig. 3.21(b)). This is dictated by the structure of (90):

    f(x, y) ≈ Δβ Σ_{i=1}^{M} [1/L²(x, y, βi)] Q_{βi}(γ′)    (98)

where M is the number of projections, γ′ is the angle of the fan beam ray that passes through the point (x, y), and Δβ = 2π/M. For the βi chosen in Fig. 3.21(c), in order to find the contribution of Q_{βi}(γ′) to the point (x, y) shown there, one must first find the angle, γ′, of the ray SA that passes through that point (x, y); Q_{βi}(γ′) will then be contributed from the filtered projection at βi.

Fig. 3.21: While the filtered projections are backprojected along parallel lines for the parallel beam case (a), for the fan beam case the backprojection is performed along converging lines (b). (c) This figure illustrates the implementation step that in order to determine the backprojected value at pixel (x, y), one must first compute γ′ for that pixel. (From [Ros82].)

Of course, the computed value of γ′ may not correspond to one of the nα for which Q_{βi}(nα) is known. One must

then use interpolation. The contribution Q_{βi}(γ′) at the point (x, y) must then be divided by L², where L is the distance from the source S to the point (x, y). This concludes our presentation of the algorithm for reconstructing projection data measured with detectors spaced at equiangular increments.

3.4.2 Equally Spaced Collinear Detectors

Let R_β(s) denote a fan projection as shown in Fig. 3.22, where s is the distance along the straight line corresponding to the detector bank. The principal difference between the algorithm presented in the preceding subsection and the one presented here lies in the way a fan projection is represented, which then introduces differences in subsequent mathematical manipulations. Before, fan projections were sampled at equiangular intervals and we represented them by R_β(γ), where γ represented the angular location of a ray. Now we represent them by R_β(s).

Fig. 3.22: For the case of equispaced detectors on a straight line, each projection is denoted by the function R_β(s). (From [Ros82].)

Although the projections are measured on a line such as D1 D2 in Fig. 3.22,

for theoretical purposes it is more efficient to assume the existence of an imaginary detector line D1′ D2′ passing through the origin. Thus in Fig. 3.23 we will associate the fan projection R_β(s) with the imaginary detector line D1′ D2′. We now associate the ray integral along SB with point A on D1′ D2′, as opposed to point B on D1 D2; the value of s for this ray is the length of OA.

Now consider a ray SA in the figure. If parallel projection data were generated for the object under consideration, the ray SA would belong to a parallel projection P_θ(t) with θ and t as shown in the figure. The relationships between (β, s) and (θ, t) are given by

    θ = β + tan⁻¹(s/D),   t = sD/√(D² + s²)    (99)

where use has been made of the fact that the angle AOC is equal to the angle OSC, and where D is the distance of the source point S from the origin O.

Fig. 3.23: This figure illustrates several of the parameters used in the derivation of the reconstruction algorithm for equispaced detectors. (From [Ros82].)

In terms of the parallel projection data the reconstructed image is given by (74), which is repeated here for convenience:

    f(r, φ) = ½ ∫₀^{2π} ∫_{−t_m}^{t_m} P_θ(t) h(r cos(θ − φ) − t) dt dθ    (74)

where f(r, φ) is the reconstructed image in polar coordinates. Using the relationships in (99) the double integration may be expressed as

    f(r, φ) = ½ ∫_{−tan⁻¹(s_m/D)}^{2π − tan⁻¹(s_m/D)} ∫_{−s_m}^{s_m} P_{β+tan⁻¹(s/D)}(sD/√(D² + s²)) · h[r cos(β + tan⁻¹(s/D) − φ) − sD/√(D² + s²)] · D³/(D² + s²)^{3/2} ds dβ    (100)

where we have used

    dt dθ = D³/(D² + s²)^{3/2} ds dβ.    (101)

In (100), s_m is the largest value of s in each projection and corresponds to t_m for parallel projection data. The limits −tan⁻¹(s_m/D) and 2π − tan⁻¹(s_m/D) cover the angular interval of 360°. Since all functions of β in (100) are periodic with period 2π, these limits may be replaced by 0 and 2π, respectively. Also, the expression

    P_{β+tan⁻¹(s/D)}(sD/√(D² + s²))    (102)

corresponds to the ray integral along SA in the parallel projection data P_θ(t). The identity of this ray integral in the fan projection data is simply R_β(s). Introducing these changes in (100) we get

    f(r, φ) = ½ ∫₀^{2π} ∫_{−s_m}^{s_m} R_β(s) h[r cos(β + tan⁻¹(s/D) − φ) − sD/√(D² + s²)] · D³/(D² + s²)^{3/2} ds dβ.    (103)

In order to express this formula in a filtered backprojection form we will first examine the argument of h. The argument may be written as

    r cos(β + tan⁻¹(s/D) − φ) − sD/√(D² + s²) = r cos(β − φ) · D/√(D² + s²) − [D + r sin(β − φ)] · s/√(D² + s²).    (104)

We will now introduce two new variables that are easily calculated in a computer implementation. The first of these, denoted by U, is for each pixel (x, y) the ratio of SP (Fig. 3.24),

which is the projection of the source to pixel line on the central ray, to the source-to-origin distance. Thus

    U(r, φ, β) = (SO + OP)/D    (105)

               = [D + r sin(β − φ)]/D.    (106)

The other parameter we want to define is the value of s for the ray that passes through the pixel (r, φ) under consideration. Let s′ denote this value of s. Since s is measured along the imaginary detector line D1′ D2′, it is given by the distance OF. Since

    s′/D = EP/SP    (107)

we have

    s′ = D r cos(β − φ) / [D + r sin(β − φ)].    (108)

Fig. 3.24: For a pixel at the polar coordinates (r, φ) the variable U is the ratio of the distance SP, which is the projection of the source to pixel distance SE on the central ray, to the source-to-center distance D. (Adapted from [Ros82].)
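A small sketch of (106) and (108) (the names are ours; the source position S = (−D sin β, D cos β) and the detector-line direction are assumed conventions), cross-checked by intersecting the source-to-pixel line with the detector line through O:

```python
import numpy as np

def U_sprime(r, phi, beta, D):
    """U per (106) and s' per (108)."""
    U = (D + r * np.sin(beta - phi)) / D
    sp = D * r * np.cos(beta - phi) / (D + r * np.sin(beta - phi))
    return U, sp

D, r, phi, beta = 4.0, 1.1, 0.4, 1.9
U, sp = U_sprime(r, phi, beta, D)

S = np.array([-D * np.sin(beta), D * np.cos(beta)])   # assumed source position
P = np.array([r * np.cos(phi), r * np.sin(phi)])      # the pixel
e = np.array([np.cos(beta), np.sin(beta)])            # detector-line direction
# intersect S + t*(P - S) with the line {u * e}: solve the 2x2 system
A = np.column_stack((P - S, -e))
t, u = np.linalg.solve(A, -S)                         # u is the detector coordinate
```

The geometric intersection coordinate `u` reproduces s′, confirming that both U and s′ can be computed per pixel from r, φ, and β alone.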

Equations (106) and (108) can be utilized to express (104) in terms of U and s′:

    r cos(β + tan⁻¹(s/D) − φ) − sD/√(D² + s²) = (s′ − s) · UD/√(D² + s²).    (109)

Substituting (109) in (103), we get

    f(r, φ) = ½ ∫₀^{2π} ∫_{−s_m}^{s_m} R_β(s) h[(s′ − s) UD/√(D² + s²)] · D³/(D² + s²)^{3/2} ds dβ.    (110)

We will now express the convolving kernel h in this equation in a form closer to that given by (61). Note that, nominally, h(t) is the inverse Fourier transform of |w| in the frequency domain:

    h(t) = ∫_{−∞}^{∞} |w| e^{j2πwt} dw.    (111)

Therefore,

    h[(s′ − s) UD/√(D² + s²)] = ∫_{−∞}^{∞} |w| e^{j2πw(s′−s)UD/√(D²+s²)} dw.    (112)

Using the transformation

    w′ = w · UD/√(D² + s²)    (113)

we can rewrite (112) as follows:

    h[(s′ − s) UD/√(D² + s²)] = (D² + s²)/(U²D²) ∫_{−∞}^{∞} |w′| e^{j2πw′(s′−s)} dw′    (114)

                              = (D² + s²)/(U²D²) · h(s′ − s).    (115)

Substituting this in (110) we get

    f(r, φ) = ∫₀^{2π} (1/U²) ∫_{−s_m}^{s_m} R_β(s) g(s′ − s) · D/√(D² + s²) ds dβ    (116)

where

    g(s) = ½ h(s).    (117)

For the purpose of computer implementation,

(116) may be interpreted as a weighted filtered backprojection algorithm. To show this we rewrite (116) as

follows:

    f(r, φ) = ∫₀^{2π} (1/U²) Q_β(s′) dβ    (118)

where

    Q_β(s) = R′_β(s) * g(s)    (119)

and

    R′_β(s) = R_β(s) · D/√(D² + s²).    (120)

Equations (118) through (120) suggest the following steps for computer implementation:

Step 1: Assume that each projection R_β(s) is sampled with a sampling interval of a. The known data then are R_{βi}(na), where n takes integer values with n = 0 corresponding to the central ray passing through the origin, and the βi are the angles for which fan projections are known. The first step is to generate for each fan projection R_{βi}(na) the corresponding modified projection R′_{βi}(na) given by

    R′_{βi}(na) = R_{βi}(na) · D/√(D² + n²a²).    (121)

Step 2: Convolve each modified projection R′_{βi}(na) with g(na) to generate the corresponding filtered projection:

    Q_{βi}(na) = R′_{βi}(na) * g(na)    (122)

where the sequence g(na) is given by the samples of (117):

    g(na) = ½ h(na).    (123)

Substituting in this the values of h(na) given in (61), we get for the impulse response of the convolving filter:

    g(na) = { 1/(8a²)         n = 0
            { 0               n even
            { −1/(2n²π²a²)    n odd.    (124)
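The kernel (124) and the weighting of (121) are easy to tabulate; a minimal sketch (function names are ours):

```python
import numpy as np

def g_equispaced(n, a):
    """Convolving filter (124) for equispaced collinear detectors."""
    n = np.asarray(n)
    g = np.zeros(n.shape, dtype=float)
    g[n == 0] = 1.0 / (8.0 * a**2)
    odd = (n % 2) != 0
    g[odd] = -1.0 / (2.0 * np.pi**2 * n[odd]**2 * a**2)
    return g

def modify_projection(R, a, D):
    """Step 1, (121): weight sample n by D / sqrt(D^2 + (n a)^2)."""
    N = len(R)
    n = np.arange(N) - (N - 1) // 2      # n = 0 is the central ray
    return R * D / np.sqrt(D**2 + (n * a)**2)

g = g_equispaced(np.arange(-3, 4), 0.5)
w = modify_projection(np.ones(9), 1.0, 3.0)
```

Note that, per (123), this kernel is exactly half of the parallel-beam kernel (61); unlike the equiangular case, no trigonometric correction of the samples is needed.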

When the convolution of (122) is implemented in the frequency domain using an FFT algorithm, the projection data must be padded with a sufficient number of zeros to avoid distortion due to interperiod interference.

In practice superior reconstructions are obtained if a certain amount of smoothing is included with the convolution in (122). If k(na) is the impulse response of the smoothing filter, we can write

    Q_{βi}(na) = R′_{βi}(na) * g(na) * k(na).    (125)

In a frequency domain implementation this smoothing may be achieved by a simple multiplicative window such as a Hamming window.

Step 3: Perform a weighted backprojection of each filtered projection along the corresponding fan. The sum of all the backprojections is the reconstructed image

    f(x, y) ≈ Δβ Σ_{i=1}^{M} [1/U²(x, y, βi)] Q_{βi}(s′)    (126)

where U is computed using (106) and s′ identifies the ray that passes through (x, y) in the fan for the source located at angle βi. Of course, this value of s′ may not correspond to one of the values of na at which Q_{βi} is known; in that case interpolation is necessary.

3.4.3 A Re-sorting Algorithm

We will now describe an algorithm that rapidly re-sorts the fan beam projection data into equivalent parallel beam projection data. After re-sorting, one may use the filtered backprojection algorithm for parallel projection data to reconstruct the image. This fast re-sorting algorithm does place constraints on the angles at which the fan beam projections must be taken and also on the angles at which projection data must be sampled within each fan beam projection.

Referring to Fig. 3.19, the relationships between the independent variables of the fan beam projections and parallel projections are given by (70):

    t = D sin γ   and   θ = β + γ.    (127)

If, as before, R_β(γ) denotes a fan beam projection taken at angle β, and P_θ(t) a parallel projection taken at angle θ, using (127) we can write

    R_β(γ) = P_{β+γ}(D sin γ).    (128)

Let Δβ denote the angular increment between successive fan beam projections, and let Δγ denote the angular interval used for sampling the fan

beam projections. We will assume that the following condition is satisfied:

    Δβ = Δγ = α.    (129)

Clearly then β and γ in (128) are equal to mα and nα, respectively, for some integer values of the indices m and n. We may therefore write (128) as

    R_{mα}(nα) = P_{(m+n)α}(D sin nα).    (130)

This equation serves as the basis of a fast re-sorting algorithm. It expresses the fact that the nth ray in the mth fan projection is the nth ray in the (m + n)th parallel projection. Of course, because of the sin nα factor on the right-hand side of (130), the parallel projections obtained are not uniformly sampled. This can usually be rectified by interpolation.

3.5 Fan Beam Reconstruction from a Limited Number of Views

Simple geometrical arguments should convince the reader that parallel projections that are 180° apart, P_θ(t) and P_{θ+180°}(t), are mirror images of each other. That is,

    P_θ(t) = P_{θ+180°}(−t)    (131)

and thus it is only necessary to measure the projections of an object for angles from 0 to 180°. We can extend this result by noting that an object is completely specified if the ray integrals of the object are known for

    0 ≤ θ < 180°    (132)

and

    −t_max < t < t_max    (133)

where t_max is large enough so that each projection is at least as wide as the object at its widest. Each line integral is identified by its distance from the origin and its angle; thus each line integral can be thought of as a single point in the Radon transform of the object.

Fig. 3.25: As shown in this figure, each line integral is identified by its distance from the origin and its angle, and may be represented as a point in a polar coordinate system (t, θ).

If each ray integral is represented as a point in a polar coordinate system (t, θ) as shown in Fig. 3.25 then a complete set of ray

integrals will completely fill a disk of radius t_max. This is commonly known as the Radon transform or a sinogram, and is shown for the Shepp and Logan phantom both in polar and rectangular coordinates in Fig. 3.26. Recall that points in Radon space that are periodic with respect to the intervals shown in (132) and (133) represent the same ray integral.

Fig. 3.26: An object and its Radon transform are shown here. The object in (a) is used to illustrate the short scan algorithm developed by Parker [Par82a], [Par82b]. (b) shows the Radon transform in rectangular coordinates, while (c) represents the Radon transform in polar coordinates. (Reprinted with permission from [Par82a].)

These ideas can also be extended to the fan beam case. From Fig. 3.27 we see that two ray integrals represented by the fan beam angles (β1, γ1) and (β2, γ2) are identical provided

    β1 = β2 − 2γ1 + 180°    (134)

and

    γ1 = −γ2.    (135)

With fan beam data the coordinate transformation

    t = D sin γ,   θ = β + γ    (136)

maps the (β, γ) description of a ray in a fan into its Radon transform equivalent. This transformation can then be used to construct Fig. 3.28, which shows the data available in the Radon domain as the projection angle β varies between 0 and 180° with a fan angle of 40° (γ_m = 20°). Thus the data in Fig. 3.28 for angles θ > 180° and t > 0 are equal to the Radon data for θ < 180° and t < 0. On the other hand, the regions marked B in Fig. 3.28 are areas in the Radon space where there are no measurements of the object. To cover these areas it is necessary to measure projections over an additional 2γ_m degrees as shown in

Fig. 3.27: Rays in two fan beams will represent the same line integral if they satisfy the relationship β1 = β2 − 2γ1 + 180° and γ1 = −γ2.

Fig. 3.28: Collecting projections over 180° gives estimates of the Radon transform between the curved lines as shown on the left; the curved lines represent the most extreme projections for a fan angle of γ_m. On the right is shown the available data in the (β, γ) coordinate system used in describing fan beams. In both cases the region marked A represents the part of the Radon transform where two estimates are available. On the other hand, for 180° of projections there are no estimates of the Radon transform in the regions marked B.
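The redundancy expressed by (134) and (135) can be checked numerically through the transformation (136): both members of each pair map to the same line in Radon space. (A sketch with arbitrary D and fan angles; `to_radon` and `same_line` are our names, and angles are in radians.)

```python
import numpy as np

def to_radon(beta, gamma, D):
    """Map a fan-beam ray (beta, gamma) to Radon coordinates via (136)."""
    return beta + gamma, D * np.sin(gamma)

def same_line(th1, t1, th2, t2, tol=1e-9):
    """(theta, t) pairs name the same line x cos(theta) + y sin(theta) = t
    when they are equal, or when the angles differ by 180 deg and t1 = -t2."""
    d = (th1 - th2) % (2 * np.pi)
    if min(d, 2 * np.pi - d) < tol and abs(t1 - t2) < tol:
        return True
    return abs(d - np.pi) < tol and abs(t1 + t2) < tol

D = 3.0
rng = np.random.default_rng(2)
all_matched = True
for _ in range(200):
    b2 = rng.uniform(0, 2 * np.pi)
    g2 = rng.uniform(-0.3, 0.3)
    g1 = -g2                          # (135)
    b1 = b2 - 2 * g1 + np.pi          # (134), in radians
    all_matched &= same_line(*to_radon(b1, g1, D), *to_radon(b2, g2, D))
```

Every pair lands on the same Radon point up to the 180° mirror of (131), which is exactly the duplication the short scan weighting must account for.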

Fig. 3.29. Thus it should be possible to reconstruct an object using fan beam projections collected over 180° + 2γ_m degrees.

Fig. 3.29: If projections are gathered over an angle of 180° + 2γ_m degrees, then the data illustrated are available. Again on the left is shown the Radon transform while the right shows the available data in the (β, γ) coordinate system. The line integrals in the shaded regions represent duplicate data and these points must be gradually weighted to obtain good reconstructions.

Fig. 3.30 shows a "perfect" reconstruction of a phantom used by Parker [Par82a], [Par82b] to illustrate his algorithm for short scan or "180 degree plus" reconstructions. Projection data measured over a full 360° of β were used to generate the reconstruction.

It is more natural to discuss the projection data overlap in the (β, γ) coordinate system. We derive the overlap region in this space by using the relations in (134) and (135) and the limits

    0 ≤ β1 ≤ 180° + 2γ_m
    0 ≤ β2 ≤ 180° + 2γ_m.    (137)

Substituting (134) and (135) into the first inequality in (137) we find

    0 ≤ β2 − 2γ1 + 180° ≤ 180° + 2γ_m    (138)

and then by rearranging

    −180° + 2γ1 ≤ β2 ≤ 2γ_m + 2γ1.    (139)

Substituting the same two equations into the second inequality in (137) we find

    0 ≤ β1 + 2γ1 − 180° ≤ 180° + 2γ_m    (140)

Rearranging (141) in the same manner gives

180° + 2γ₁ ≤ β₁ ≤ 360° + 2γ_m + 2γ₁.    (142)

Since the fan beam angle, γ_m, is always less than 90°, the overlapping regions are given by

0 ≤ β₂ ≤ 2γ_m + 2γ₂  and  180° + 2γ₁ ≤ β₁ ≤ 180° + 2γ_m,    (143)

as is shown in Fig. 3.29. Thus it should be possible to reconstruct an object using fan beam projections collected over 180° + 2γ_m.

Fig. 3.29: If projections are gathered over an angle of 180° + 2γ_m, then the data illustrated are available. Again on the left is shown the Radon transform while the right shows the available data in the (β, γ) coordinate system. In both cases the region marked A represents the part of the Radon transform where two estimates are available. The line integrals in the shaded regions represent duplicate data, and these points must be gradually weighted to obtain good reconstructions.

Fig. 3.30 shows a "perfect" reconstruction of a phantom used by Parker [Par82a], [Par82b] to illustrate his algorithm for short scan or "180 degree plus" reconstructions. Projection data measured over a full 360° of β were used to generate the reconstruction.

Fig. 3.30: This figure shows a reconstruction using 360° of fan beam projections and a standard filtered backprojection algorithm. (Reprinted with permission from [Par82a].)

Fig. 3.31: This reconstruction was generated with a standard filtered backprojection algorithm using 220° of projections. The large artifacts are due to the lack of data in some regions of the Radon transform and duplicate data in others. (Reprinted with permission from [Par82b].)

If projections are gathered over an angle of 180° + 2γ_m and a

standard filtered backprojection algorithm is used, the result is the reconstruction shown in Fig. 3.31. In this case a fan angle of 40° (γ_m = 20°) was used. The severe artifacts in this reconstruction are caused by the double coverage of the points in region B of Fig. 3.28.

One might think that the reconstruction can be improved by setting the data to zero in one of the regions of overlap. This can be implemented by multiplying each projection at angle β by a one-zero window, w_β(γ), given by

w_β(γ) = 0,   0 ≤ β ≤ 2γ_m + 2γ
       = 1,   elsewhere.    (144)

As shown by Naparstek [Nap80], using this type of window gives only a small improvement, since streaks obscure the resulting image. The sharp cutoff of the one-zero window adds a large high frequency component to each projection, which is then enhanced by the |ω| filter that is used to filter each projection.

While the above filter function properly handles the extra data, better reconstructions can be obtained by using a window described in [Par82a]. More accurate reconstructions are obtained if a "smoother" window, w_β(γ), is used to filter the data. Mathematically, a "smooth" window is both continuous and has a continuous derivative. Formally, the window must satisfy the condition

w_β₁(γ₁) + w_β₂(γ₂) = 1    (145)

for (β₁, γ₁) and (β₂, γ₂) satisfying the relations in (134) and (135). As described above,

w₀(γ) = 0    (146)

and

w_{180°+2γ_m}(γ) = 0.    (147)

To keep the filter function continuous and "smooth" at the boundary between the single and double overlap regions, the following constraints are imposed on the derivative of w_β(γ):

∂w_β(γ)/∂β = 0  at  β = 2γ_m + 2γ    (148)

and

∂w_β(γ)/∂β = 0  at  β = 180° + 2γ.    (149)
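These conditions can be verified numerically. The sketch below implements a sin²-type short-scan weight of the kind described in [Par82a] (the precise piecewise form follows the window given in (150); function names and the fan-angle sign convention are assumptions) and checks the normalization (145) and the endpoint conditions (146)-(147) on a grid of rays.

```python
import math

def parker_weight(beta, gamma, gm):
    """Smooth short-scan weight for a projection angle beta (deg) and ray
    angle gamma (deg), with fan half-angle gm; conventions assumed."""
    if 0.0 <= beta <= 2 * gm + 2 * gamma:          # first overlap region
        x = 45.0 * beta / (gm + gamma)
    elif beta <= 180.0 + 2 * gamma:                # singly measured rays
        return 1.0
    else:                                          # second overlap region
        x = 45.0 * (180.0 + 2 * gm - beta) / (gm - gamma)
    return math.sin(math.radians(x)) ** 2

gm = 20.0
for gamma in (-12.0, -3.0, 0.0, 7.0, 15.0):
    # endpoint conditions (146)-(147)
    assert abs(parker_weight(0.0, gamma, gm)) < 1e-12
    assert abs(parker_weight(180.0 + 2 * gm, gamma, gm)) < 1e-12
    # duplicate rays: by (134)-(135) the partner of (beta, gamma) is
    # ((beta - 2*gamma + 180) mod 360, -gamma); the two weights sum to 1
    for beta in (0.5, 3.0, gm + gamma, 2 * gm + 2 * gamma - 0.5):
        partner = (beta - 2 * gamma + 180.0) % 360.0
        total = parker_weight(beta, gamma, gm) + parker_weight(partner, -gamma, gm)
        assert abs(total - 1.0) < 1e-12
```

The sum-to-one check works because the sin² ramp over one overlap region is the cos² complement of the ramp over the other, which is exactly the constraint stated in (145).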

One such window that satisfies all of these conditions is

w_β(γ) = sin²(45° β/(γ_m + γ)),                    0 ≤ β ≤ 2γ_m + 2γ
       = 1,                                        2γ_m + 2γ ≤ β ≤ 180° + 2γ
       = sin²(45° (180° + 2γ_m − β)/(γ_m − γ)),    180° + 2γ ≤ β ≤ 180° + 2γ_m.    (150)

A reconstruction using this weighting function is shown in Fig. 3.32. From this image we see that it is possible to eliminate the overlap without introducing errors by using a smooth window.

Fig. 3.32: Using a weighting function that minimizes the discontinuities in the projection data, this reconstruction is obtained from 220° of projection data. (Reprinted with permission from [Par82a], [Par82b].)

3.6 Three-Dimensional Reconstructions¹

The conventional way to image a three-dimensional object is to illuminate it with a narrow beam of x-rays and use a two-dimensional reconstruction algorithm. A three-dimensional reconstruction can then be formed by illuminating successive planes within the object and stacking the resulting reconstructions. This is shown in Fig. 3.33. A more efficient approach, to be considered in this section, is a generalization of the two-dimensional fan beam algorithms presented in Section 3.4.2. Now, instead of illuminating a slice of the object with a fan of x-rays, the entire object is illuminated with a point source and the x-ray flux is measured on a plane. This is called a cone beam reconstruction because the rays form a cone, as illustrated in Fig. 3.34.

¹We are grateful for the help of Barry Roberts in the preparation of this material.

Fig. 3.33: A three-dimensional reconstruction can be done by repetitively using two-dimensional reconstruction algorithms at different heights along the z-axis. (From [Kak86].)

The main advantage of cone beam algorithms is the reduction in data collection time. With a single source, ray integrals are measured through every point in the object in the time it takes to measure a single slice in a conventional two-dimensional scanner. Cone beam algorithms have been studied for use with Mayo Clinic's Digital Spatial Reconstructor (DSR) [Rob83] and Imatron's Fifth Generation Scanner [Boy83].

3.6.1 Three-Dimensional Projections

A ray in a three-dimensional projection is described by the intersection of two planes

t = x cos θ + y sin θ    (151)

r = −(−x sin θ + y cos θ) sin γ + z cos γ.    (152)

A new coordinate system (t, s, r) is obtained by two rotations of the (x, y, z)-axes, as shown in Fig. 3.35. The first rotation, as in the two-dimensional case, is by θ degrees around the z-axis to give the (t, s, z)-axes. Then a second rotation is done out of the (t, s)-plane around the t-axis by an angle of γ. In matrix form the required rotations are given by

[t]   [1    0       0   ] [ cos θ   sin θ   0] [x]
[s] = [0   cos γ   sin γ] [−sin θ   cos θ   0] [y].    (153)
[r]   [0  −sin γ   cos γ] [  0       0      1] [z]

A three-dimensional parallel projection of the object f is expressed by the following integral:

P_θ,γ(t, r) = ∫ f(t, s, r) ds.    (154)
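The rotations in (151)-(153) are easy to sanity-check numerically. The short sketch below (function and variable names are illustrative) builds the two rotation matrices of (153) and confirms that the resulting t and r coordinates agree with the plane equations (151) and (152):

```python
import numpy as np

def tsr_from_xyz(x, y, z, theta, gamma):
    """Rotate (x, y, z) into the (t, s, r) system: first by theta about
    the z-axis, then by gamma about the t-axis, as in (153)."""
    ct, st = np.cos(theta), np.sin(theta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    R_theta = np.array([[ct, st, 0.0], [-st, ct, 0.0], [0.0, 0.0, 1.0]])
    R_gamma = np.array([[1.0, 0.0, 0.0], [0.0, cg, sg], [0.0, -sg, cg]])
    return R_gamma @ R_theta @ np.array([x, y, z])

rng = np.random.default_rng(0)
for _ in range(100):
    x, y, z = rng.uniform(-1, 1, 3)
    theta, gamma = rng.uniform(0, 2 * np.pi, 2)
    t, s, r = tsr_from_xyz(x, y, z, theta, gamma)
    # (151) and (152) recover t and r directly from (x, y, z)
    assert np.isclose(t, x * np.cos(theta) + y * np.sin(theta))
    assert np.isclose(r, -(-x * np.sin(theta) + y * np.cos(theta)) * np.sin(gamma)
                         + z * np.cos(gamma))
```

Because the product of the two rotations is orthogonal, the inverse mapping back to (x, y, z) is simply its transpose, a fact used when evaluating the projection integral (154) along s.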

Fig. 3.34: In cone beam projections the detector measures the x-ray flux over a plane. (From [Kak86].)

Note that four variables are being used to specify the desired ray: (t, θ) specify the distance and angle in the x-y plane and (r, γ) in the s-z plane. In a cone beam system the source is rotated by β and ray integrals are measured on the detector plane, as described by R_β(p′, ζ′). By rotating the source and detector plane completely around the object, all the data necessary for a three-dimensional reconstruction can be gathered in the time a conventional fan beam system collects the data for its two-dimensional reconstruction. Here we have used D_SO to indicate the distance from the center of rotation to the source and D_DE to indicate the distance from the center of rotation to the detector. To find the equivalent parallel projection ray, first define

p = p′ D_SO/(D_SO + D_DE)  and  ζ = ζ′ D_SO/(D_SO + D_DE),    (155)

as was done in Section 3.4.2. For a given cone beam ray, R_β(p, ζ), the parallel projection ray is given by

t = p D_SO/√(D²_SO + p²)    (156)

θ = β + tan⁻¹(p/D_SO),    (157)

where t and θ locate a ray in a given tilted fan, and similarly

r = ζ D_SO/√(D²_SO + ζ²)    (158)

γ = tan⁻¹(ζ/D_SO),    (159)

where r and γ specify the location of the tilted fan itself.

Fig. 3.35: To simplify the discussion of the cone beam reconstruction, the coordinate system is rotated by the angle of the source to give the (t, s)-axes. The r-axis is not shown but is perpendicular to the t- and s-axes. (From [Kak86].)

The reconstructions shown in this section will use a three-dimensional version of the Shepp and Logan head phantom. The two-dimensional ellipses of Table 3.1 have been made ellipsoids and repositioned within an imaginary skull. Table 3.2 shows the position and size of each ellipsoid and Fig. 3.36 illustrates their positions. Because of the linearity of the Radon transform, a projection of an object consisting of ellipsoids is just the sum of the projections of each individual ellipsoid.

Table 3.2: Summary of parameters for three-dimensional tomography simulations. For each of the ten ellipsoids (a through j) the table gives the coordinates of its center (x, y, z), its axis lengths (A, B, C), its rotation angle (in degrees), and its gray level. As in Table 3.1, the outer ellipsoid (axis lengths 0.69, 0.92, 0.9) has gray level 2.0, the large inner ellipsoid (axis lengths 0.6624, 0.874, 0.88) has gray level −0.98, and the remaining small ellipsoids, with gray levels of ±0.01 and ±0.02, are rotated by angles of 0°, 72°, 90°, or 108°.
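The ray mapping of (156)-(159) is straightforward to implement. The sketch below is a minimal example (the numeric value of D_SO and the function name are assumptions) that converts a cone beam ray, given by its source angle and scaled detector coordinates, to the equivalent parallel-projection parameters:

```python
import math

D_SO = 100.0  # distance from the center of rotation to the source (assumed)

def parallel_ray(beta, p, zeta):
    """Map a cone beam ray at source angle beta (deg), with detector
    coordinates (p, zeta) already scaled to the center of rotation as in
    (155), to the parallel parameters (t, theta, r, gamma) of (156)-(159)."""
    t = p * D_SO / math.hypot(D_SO, p)                 # (156)
    theta = beta + math.degrees(math.atan(p / D_SO))   # (157)
    r = zeta * D_SO / math.hypot(D_SO, zeta)           # (158)
    gamma = math.degrees(math.atan(zeta / D_SO))       # (159)
    return t, theta, r, gamma

# The central detector sample is the unrotated, untilted central ray.
assert parallel_ray(30.0, 0.0, 0.0) == (0.0, 30.0, 0.0, 0.0)

# Off-center samples stay within |t| < D_SO and |r| < D_SO and acquire
# an in-plane rotation theta > beta and a fan tilt gamma > 0.
t, theta, r, gamma = parallel_ray(0.0, 25.0, 10.0)
assert abs(t) < D_SO and abs(r) < D_SO and theta > 0.0 and gamma > 0.0
```

As p and ζ grow, t and r saturate toward D_SO, reflecting the fact that every measured line must pass within the source circle.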

Fig. 3.36: A three-dimensional version of the Shepp and Logan head phantom is used to test the cone beam reconstruction algorithms in this section. (a) A vertical slice through the object illustrating the position of the two reconstructed planes. (b) An image at plane B (z = −0.25) and (c) an illustration of the gray level of each of the ellipses. (d) An image at plane A (z = 0.625) and (e) an illustration of the gray levels within the slice. (From [Kak86].)
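Because a projection of the phantom is just the sum of its ellipsoids' projections, each term can be computed in closed form; the expression appears as (161)-(162) in the next section. The sketch below (parameter values are arbitrary choices, and the check is a consistency test of that closed form, not a definitive implementation) compares the analytic projection of one ellipsoid with a brute-force line integral along s in the rotated (t, s, r) system of (153):

```python
import numpy as np

A, B, C, rho = 0.7, 0.9, 0.5, 2.0            # ellipsoid axes and gray level
theta, gamma = np.radians(30.0), np.radians(20.0)

def analytic(t, r):
    """Closed-form projection of the ellipsoid, per (161)-(162)."""
    a2 = (C**2 * (B**2 * np.sin(theta)**2 + A**2 * np.cos(theta)**2)
          * np.cos(gamma)**2 + A**2 * B**2 * np.sin(gamma)**2)
    br = (a2
          - t**2 * (C**2 * np.cos(gamma)**2
                    + (B**2 * np.cos(theta)**2 + A**2 * np.sin(theta)**2)
                    * np.sin(gamma)**2)
          - r**2 * (A**2 * np.cos(theta)**2 + B**2 * np.sin(theta)**2)
          - 2 * t * r * np.sin(gamma) * np.sin(theta) * np.cos(theta)
          * (B**2 - A**2))
    return 2 * rho * A * B * C / a2 * np.sqrt(br) if br > 0 else 0.0

def numeric(t, r, ds=1e-4):
    """Brute-force integral of the indicator function along s."""
    s = np.arange(-1.5, 1.5, ds)
    ct, st = np.cos(theta), np.sin(theta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    M = (np.array([[1, 0, 0], [0, cg, sg], [0, -sg, cg]])
         @ np.array([[ct, st, 0], [-st, ct, 0], [0, 0, 1]]))
    # invert the orthogonal rotation to recover (x, y, z) on the ray
    x, y, z = M.T @ np.vstack([np.full_like(s, t), s, np.full_like(s, r)])
    inside = (x / A)**2 + (y / B)**2 + (z / C)**2 <= 1.0
    return rho * inside.sum() * ds

for t, r in [(0.0, 0.0), (0.2, 0.1), (-0.3, 0.25)]:
    assert abs(analytic(t, r) - numeric(t, r)) < 1e-2
```

Summing `analytic` over the ten ellipsoids of Table 3.2 (with per-ellipsoid centers and rotations folded into t, r, and θ) gives exact projection data for testing a cone beam reconstruction.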

3.6.2 Three-Dimensional Filtered Backprojection

We will present a filtered backprojection algorithm based on analyses presented in [Fel84] and [Kak86]. The reconstruction is based on filtering and backprojecting a single plane within the cone. In other words, each elevation in the cone (described by z or ζ) is considered separately, and the final three-dimensional reconstruction is obtained by summing the contributions to the object from all the tilted fan beams.

If the ellipsoid is constant and described by

f(x, y, z) = ρ   for x²/A² + y²/B² + z²/C² ≤ 1
           = 0   otherwise,    (160)

then its projection on the detector plane is written

P_θ,γ(t, r) = (2ρABC/a²(θ, γ)) [a²(θ, γ) − t²(C² cos²γ + (B² cos²θ + A² sin²θ) sin²γ) − r²(A² cos²θ + B² sin²θ) − 2tr sin γ sin θ cos θ (B² − A²)]^{1/2},    (161)

where

a²(θ, γ) = C²(B² sin²θ + A² cos²θ) cos²γ + A²B² sin²γ.    (162)

If the tilt angle, γ, is zero, then (161) simplifies to (5).

The cone beam algorithm is best derived by starting with the filtered backprojection algorithm for equispatial rays. In a three-dimensional reconstruction each fan is angled out of the source-detector plane of rotation; this leads to a change of variables in the backprojection algorithm. First consider the two-dimensional fan beam reconstruction formula for the point (r, φ):

g(r, φ) = (1/2) ∫₀^{2π} (1/U²(r, φ, β)) ∫ R_β(p) h(p′ − p) (D_SO/√(D²_SO + p²)) dp dβ,    (163)

where p′ = D_SO r cos(β − φ)/(D_SO + r sin(β − φ)) locates the projection of the point (r, φ) on the detector line,

h(p) = ∫ |ω| e^{j2πωp} dω    (164)

and

U(r, φ, β) = (D_SO + r sin(β − φ))/D_SO.    (165)

Equation (163) is the same as (116), except that we have now used different names for some of the variables. To further simplify this expression we will replace the (r, φ) coordinate system by the rotated coordinates (t, s). Recall that (t, s) is the location of a point rotated by the angular displacement of the source-detector array. The expressions

t = x cos β + y sin β
s = −x sin β + y cos β    (166)

and

x = r cos φ
y = r sin φ    (167)

lead to

p′ = D_SO t/(D_SO − s)    (168)

and

U(x, y, β) = (D_SO − s)/D_SO.    (169)

The fan beam reconstruction algorithm is now written as

g(t, s) = (1/2) ∫₀^{2π} (D²_SO/(D_SO − s)²) ∫ R_β(p) h(D_SO t/(D_SO − s) − p) (D_SO/√(D²_SO + p²)) dp dβ.    (170)

In a cone beam reconstruction it is necessary to tilt the fan out of the plane of rotation; thus the size of the fan and the coordinate system of the reconstructed point change. As shown in Fig. 3.37, a new coordinate system (t̃, s̃) is defined that represents the location of the reconstructed point with respect to the tilted fan. Because of the changing fan size, both the source distance, D_SO, and the angular differential, β, change. The new source distance is given by

D̃²_SO = D²_SO + ζ²    (171)

Fig. 3.37: The (t̃, s̃) coordinate system represents a point in the object with respect to a tilted fan beam. (From [Kak86].)
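The change of variables in (166)-(168) is easy to confirm numerically: the fan beam detector argument written in polar form, D_SO r cos(β − φ)/(D_SO + r sin(β − φ)), must reduce to D_SO t/(D_SO − s) in the rotated coordinates. A minimal sketch (the value of D_SO is an arbitrary assumption):

```python
import numpy as np

D_SO = 100.0  # source-to-origin distance (assumed)

rng = np.random.default_rng(1)
for _ in range(100):
    r = rng.uniform(0.0, 50.0)           # keep points inside the source circle
    phi, beta = rng.uniform(0.0, 2 * np.pi, 2)
    x, y = r * np.cos(phi), r * np.sin(phi)          # (167)
    t = x * np.cos(beta) + y * np.sin(beta)          # (166)
    s = -x * np.sin(beta) + y * np.cos(beta)
    p_polar = D_SO * r * np.cos(beta - phi) / (D_SO + r * np.sin(beta - phi))
    p_rotated = D_SO * t / (D_SO - s)                # (168)
    assert np.isclose(p_polar, p_rotated)
```

The identity holds because r cos(β − φ) = t and r sin(β − φ) = −s, which is exactly the substitution that turns (163) into (170).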

where ζ is the height of the fan above the center of the plane of rotation. In addition, the increment of angular rotation dβ becomes

dβ̃ = (D_SO/D̃_SO) dβ.    (172)

Substituting these new variables, t̃, s̃, D̃_SO, and dβ̃, and writing the projection data as R_β(p, ζ), (170) becomes

g(t̃, s̃) = (1/2) ∫₀^{2π} (D̃²_SO/(D̃_SO − s̃)²) ∫ R_β(p, ζ) h(D̃_SO t̃/(D̃_SO − s̃) − p) (D̃_SO/√(D̃²_SO + p²)) dp dβ̃.    (173)

To return the reconstruction to the original (t, s, z) coordinate system we substitute t̃ = t, s̃ = s(D̃_SO/D_SO), and ζ = z D_SO/(D_SO − s), and use (171) and (172) to find

g(t, s, z) = (1/2) ∫₀^{2π} (D²_SO/(D_SO − s)²) ∫ R_β(p, ζ) h(D_SO t/(D_SO − s) − p) (D_SO/√(D²_SO + ζ² + p²)) dp dβ.    (174)

The cone beam reconstruction algorithm can be broken into the following three steps:

Step 1: Multiply the projection data, R_β(p, ζ), by the function D_SO/√(D²_SO + ζ² + p²) to find R̃_β(p, ζ):

R̃_β(p, ζ) = (D_SO/√(D²_SO + ζ² + p²)) R_β(p, ζ).    (175)

Step 2: Convolve the weighted projection R̃_β(p, ζ) with h(p)/2 by multiplying their Fourier transforms with respect to p. Note this convolution is done independently for each elevation, ζ. The result, Q_β(p, ζ), is written

Q_β(p, ζ) = R̃_β(p, ζ) ∗ (1/2)h(p).    (176)

Step 3: Finally, each weighted projection is backprojected over the three-dimensional reconstruction grid:

Figs. The reconstructed planes were at z = 0. s. Qp. 3. In agreement with [Smi85]. Only those points of the object that are illuminated from all directions can be properly reconstructed. Farther from the central plane each point in the reconstruction is irradiated from all directions but now at an oblique angle. On the plane of rotation (z = 0) the cone beam algorithm is identical to a equispatial fan beam algorithm and thus the results shown in Fig. The success of x-ray CT has naturally led to research aimed at extending this mode of image formation to ultrasound and microwave sources.38 are quite good.25 planes and are marked as Plane A and Plane B in Fig.36. 3. 3. the mismatch with the assumed theoretical models is caused primarily by the polychromaticity of the radiation used.) His invention also showed that it is possible to process a very large number of measurements (now approaching a million) with fairly complex mathematical operations.38 and 3. ( D s o ² s ) 2 / 3.7 Bibliographic Notes The current excitement in tomographic imaging originated with Hounsfield's invention [Hou72] of the computed tomography (CT) scanner in 1972.dimensional reconstruction grid: D2S0 Dso t DSOZ g(t. the quality of the reconstruction varies with the elevation of the plane. z)= ____________ Q0 _________________________ (177) 2r 0 DSO ³ S 'Dso³s The two arguments of the weighted projection. represent the transformation of a point in the object into the coordinate system of the tilted fan shown in Fig. In a cone beam system this region is a sphere of radius Ds0 sin (PÅ. is half the beamwidth angle of the cone.36. 3. The superiority of the filtered backprojection algorithm over ALGORITHMS FOR RECONSTRUCTION WITH NONDIFFRACTING SOURCES 111 . and still get an image that is incredibly accurate. This will be discussed in Chapter 4. 3.) where rÅ. 
The idea of filtered backprojection was first advanced by Bracewell and Riddle [Bra67] and later independently by Ramachandran and Lakshminarayanan [Ram71]. As shown in Fig. 3. (In x-ray tomography. Outside this region a point will not be included in some of the projections and thus will not be correctly reconstructed. In each case 100 projections of 127 x 127 elements were simulated and both a gray scale image of the entire plane and a line plot are shown.39 show reconstructions at two different levels of the object described in Fig. His invention showed that it is possible to get high-quality cross-sectional images with an accuracy now reaching one part in a thousand in spite of the fact that the projection data do not strictly satisfy theoretical models underlying the efficiently implementable reconstruction algorithms. which was indeed a major breakthrough.625 and z = ³ 0.37.39 there is a noticeable degradation in the reconstruction.

Fig. 3.38: (a) Cone beam algorithm reconstruction of plane B in Fig. 3.36. (b) Plot of the y = −0.605 line in the reconstruction compared to the original. (From [Kak86].)

Fig. 3.39: (a) Cone beam algorithm reconstruction of plane A in Fig. 3.36. (b) Plot of the y = −0.105 line in the reconstruction compared to the original. (From [Kak86].)

vol. by direct Fourier inversion." Proc. and later independently by DeRosier and Klug [DeR68] in electron microscopy and Rowley [Row69] in optical holography. 71. pp. [Ken79]. "Cardiac computed tomography. and Dines and Kak [Din76]. Peters. Full three-dimensional reconstructions have been discussed in [Chi79]. vol. Fast algorithms for ray sorting of fan beam data have been developed by Wang [Wan77]. Murata. 298-307. The reader is referred to [Nah81] for a filtered backprojection algorithm for reconstructions from data generated by using very narrow angle fan beams that rotate and traverse continuously around the object. [Smi85]. [Lew79] for ways to speed up the filtering of the projection data by using binary approximations and/or inserting zeros in the unit sample response of the filter function. In order to utilize two-dimensional FFT algorithms for image formation. Opt. [Boy83] D. and to [Bra67]. Bates and T. Dreike and Boyd [Dre77]. pp. P. 67. the direct Fourier approach calls for frequency domain interpolation from a polar grid to a rectangular grid. The fan beam algorithm derivation presented here was first developed by Scudder [Scu78]. J. [Han8 1 b]. [Opp75]. This was first shown by Bracewell [Bra56] for radioastronomy. [Lew78]. Amer. 14.the algebraic techniques was first demonstrated by Shepp and Logan [She74]. pp. [Kwo77]. T. [Tre69]. [Tan75] have proposed variations on the filter functions of the filtered backprojection algorithms discussed in this chapter.8 References [Bab77] N. [Tam81] for reconstructions from incomplete and limited projections. H. Images may also be reconstructed from fan beam data by first sorting them into parallel projection data. Bates and Peters [Bat71]. Many authors [Bab77]. Peters and Lewitt [Pet77]. Mar. although less accurately. [Chi80]. 1971. 883-896.. The reader is referred particularly to [Ken79]. IEEE. and Mersereau and Oppenheim [Mer74]. 1983. Tomographic imaging may also be accomplished. Baba and K. 
[Hor79] for algorithms for nonuniformly sampled projection data. [Bat71] R. The reader is also referred to [Hor78]." J. Recently Wernecke and D'Addario [Wer77] have proposed a maximum-entropy approach to direct Fourier inversion. Their procedure is especially applicable if for some reason the projection data are insufficient. Boyd and M. Sci. For some recent methods to minimize the resulting interpolation error. Several workers who applied this method to radiography include Tretiak et al. "Towards improvements in tomography. [Lew79]. 662-668. We have also not discussed the circular harmonic transform method of image reconstruction as proposed by Hansen [Han8 1 a]. [Sat80]. 1977. 114 COMPUTERIZED TOMOGRAPHIC IMAGING . Soc. "Filtering for image reconstruction from projections. instead of the filtered backprojection method presented in this chapter. vol. Its development for fan beam data was first made by Lakshminarayanan [Lak75] for the equispaced collinear detectors case and later extended by Herman and Naparstek [Her77] for the case of equiangular rays. Lipton. the reader is referred to [Sta81]. M.." New Zealand J. 3.

[Bra56] R. N. Bracewell, "Strip integration in radio astronomy," Aust. J. Phys., vol. 9, pp. 198-217, 1956.
[Bra67] R. N. Bracewell and A. C. Riddle, "Inversion of fan-beam scans in radio astronomy," Astrophys. J., vol. 150, pp. 427-434, 1967.
[Chi79] M. Y. Chiu, H. H. Barrett, R. G. Simpson, C. Chou, J. W. Arendt, and G. R. Gindi, "Three dimensional radiographic imaging with a restricted view angle," J. Opt. Soc. Amer., vol. 69, pp. 1323-1330, 1979.
[Chi80] M. Y. Chiu, H. H. Barrett, and R. G. Simpson, "Three dimensional reconstruction from planar projections," J. Opt. Soc. Amer., vol. 70, pp. 755-762, July 1980.
[Cro70] R. A. Crowther, D. J. DeRosier, and A. Klug, "The reconstruction of a three-dimensional structure from projections and its applications to electron microscopy," Proc. Roy. Soc. London, vol. A317, pp. 319-340, 1970.
[DeR68] D. J. DeRosier and A. Klug, "Reconstruction of three dimensional structures from electron micrographs," Nature, vol. 217, pp. 130-134, 1968.
[Din76] K. A. Dines and A. C. Kak, "Measurement and reconstruction of ultrasonic parameters for diagnostic imaging," Research Rep. TR-EE 77-4, School of Electrical Engineering, Purdue Univ., Lafayette, IN, Dec. 1976.
[Dre77] P. Dreike and D. P. Boyd, "Convolution reconstruction of fan-beam projections," Comput. Graph. Image Proc., vol. 5, pp. 459-469, 1977.
[Fel84] L. A. Feldkamp, L. C. Davis, and J. W. Kress, "Practical cone-beam algorithm," J. Opt. Soc. Amer., vol. 1, pp. 612-619, June 1984.
[Ham77] R. W. Hamming, Digital Filters. Englewood Cliffs, NJ: Prentice-Hall, 1977.
[Han81a] E. W. Hansen, "Theory of circular image reconstruction," J. Opt. Soc. Amer., vol. 71, pp. 304-308, Mar. 1981.
[Han81b] ____, "Circular harmonic image reconstruction: Experiments," Appl. Opt., vol. 20, pp. 2266-2274, July 1981.
[Her77] G. T. Herman and A. Naparstek, "Fast image reconstruction based on a Radon inversion formula appropriate for rapidly collected data," SIAM J. Appl. Math., vol. 33, pp. 511-533, 1977.
[Hor78] B. K. P. Horn, "Density reconstruction using arbitrary ray sampling schemes," Proc. IEEE, vol. 66, pp. 551-562, May 1978.
[Hor79] ____, "Fan-beam reconstruction methods," Proc. IEEE, vol. 67, pp. 1616-1623, 1979.
[Hou72] G. N. Hounsfield, "A method of and apparatus for examination of a body by radiation such as x-ray or gamma radiation," Patent Specification 1283915, The Patent Office, 1972.
[Jak76] C. V. Jakowatz, Jr. and A. C. Kak, "Computerized tomography using x-rays and ultrasound," Research Rep. TR-EE 76-26, School of Electrical Engineering, Purdue Univ., Lafayette, IN, 1976.
[Kak79] A. C. Kak, "Computerized tomography with x-ray emission and ultrasound sources," Proc. IEEE, vol. 67, pp. 1245-1272, 1979.
[Kak85] ____, "Tomographic imaging with diffracting and non-diffracting sources," in Array Signal Processing, S. Haykin, Ed. Englewood Cliffs, NJ: Prentice-Hall, 1985.
[Kak86] A. C. Kak and B. Roberts, "Image reconstruction from projections," in Handbook of Pattern Recognition and Image Processing, T. Y. Young and K. S. Fu, Eds. New York, NY: Academic Press, 1986.
[Kea78] P. N. Keating, "More accurate interpolation using discrete Fourier transforms," IEEE Trans. Acoust. Speech Signal Processing, vol. ASSP-26, pp. 368-369, 1978.
[Ken79] S. K. Kenue and J. F. Greenleaf, "Efficient convolution kernels for computerized tomography," Ultrason. Imaging, vol. 1, pp. 232-244, 1979.
[Kwo77] Y. S. Kwoh, I. S. Reed, and T. K. Truong, "A generalized |w|-filter for 3-D reconstruction," IEEE Trans. Nucl. Sci., vol. NS-24, pp. 1990-1998, 1977.
[Lak75] A. V. Lakshminarayanan, "Reconstruction from divergent ray data," Tech. Rep. 92, Dept. of Computer Science, State Univ. of New York at Buffalo, 1975.
[Lew78] R. M. Lewitt and R. H. T. Bates, "Image reconstruction from projections," Optik, vol. 50, pp. 19-33 (Part I), 85-109 (Part II), 189-204 (Part III), 269-278 (Part IV), 1978.

[Lew79] R. M. Lewitt, "Ultra-fast convolution approximation for computerized tomography," IEEE Trans. Nucl. Sci., vol. NS-26, pp. 2678-2681, 1979.
[Mer74] R. M. Mersereau and A. V. Oppenheim, "Digital reconstruction of multidimensional signals from their projections," Proc. IEEE, vol. 62, pp. 1319-1338, 1974.
[Nah81] D. Nahamoo, C. R. Crawford, and A. C. Kak, "Design constraints and reconstruction algorithms for transverse-continuous-rotate CT scanners," IEEE Trans. Biomed. Eng., vol. BME-28, pp. 79-97, 1981.
[Nap80] A. Naparstek, "Short-scan fan-beam algorithms for CT," IEEE Trans. Nucl. Sci., vol. NS-27, 1980.
[Opp75] B. E. Oppenheim, "Reconstruction tomography from incomplete projections," in Reconstruction Tomography in Diagnostic Radiology and Nuclear Medicine, M. M. Ter Pogossian et al., Eds. Baltimore, MD: University Park Press, 1975.
[Pan83] S. X. Pan and A. C. Kak, "A computational study of reconstruction algorithms for diffraction tomography: Interpolation vs. filtered-backpropagation," IEEE Trans. Acoust. Speech Signal Processing, vol. ASSP-31, pp. 1262-1275, Oct. 1983.
[Par82a] D. L. Parker, "Optimal short-scan convolution reconstruction for fanbeam CT," Med. Phys., vol. 9, pp. 254-257, Mar./Apr. 1982.
[Par82b] ____, "Optimization of short scan convolution reconstruction for fan-beam CT," in Proc. International Workshop on Physics and Engineering in Medical Imaging, Mar. 1982, pp. 199-202.
[Pet77] T. M. Peters and R. M. Lewitt, "Computed tomography with fan-beam geometry," J. Comput. Assist. Tomog., vol. 1, pp. 429-436, 1977.
[Ram71] G. N. Ramachandran and A. V. Lakshminarayanan, "Three dimensional reconstructions from radiographs and electron micrographs: Application of convolution instead of Fourier transforms," Proc. Nat. Acad. Sci., vol. 68, pp. 2236-2240, 1971.
[Rob83] R. A. Robb, E. A. Hoffman, L. J. Sinak, L. D. Harris, and E. L. Ritman, "High-speed three-dimensional x-ray computed tomography: The dynamic spatial reconstructor," Proc. IEEE, vol. 71, pp. 308-319, Mar. 1983.
[Ros82] A. Rosenfeld and A. C. Kak, Digital Picture Processing, 2nd ed. New York, NY: Academic Press, 1982.
[Row69] P. D. Rowley, "Quantitative interpretation of three dimensional weakly refractive phase objects using holographic interferometry," J. Opt. Soc. Amer., vol. 59, pp. 1496-1498, Nov. 1969.
[Sat80] T. Sato, S. J. Norton, M. Linzer, O. Ikeda, and M. Hirama, "Tomographic image reconstruction from limited projections using iterative revisions in image and transform spaces," Appl. Opt., vol. 20, pp. 395-399, 1981.
[Sch73] R. W. Schafer and L. R. Rabiner, "A digital signal processing approach to interpolation," Proc. IEEE, vol. 61, pp. 692-702, 1973.
[Scu78] H. J. Scudder, "Introduction to computer aided tomography," Proc. IEEE, vol. 66, pp. 628-637, June 1978.
[She74] L. A. Shepp and B. F. Logan, "The Fourier reconstruction of a head section," IEEE Trans. Nucl. Sci., vol. NS-21, pp. 21-43, 1974.
[Smi85] B. D. Smith, "Image reconstruction from cone-beam projections: Necessary and sufficient conditions and reconstruction methods," IEEE Trans. Med. Imaging, vol. MI-4, pp. 14-25, Mar. 1985.
[Sta81] H. Stark, J. W. Woods, I. Paul, and R. Hingorani, "Direct Fourier reconstruction in computer tomography," IEEE Trans. Acoust. Speech Signal Processing, vol. ASSP-29, pp. 237-244, 1981.
[Tam81] K. C. Tam and V. Perez-Mendez, "Tomographical imaging with limited angle input," J. Opt. Soc. Amer., vol. 71, pp. 582-592, May 1981.
[Tan75] E. Tanaka and T. A. Iinuma, "Correction functions for optimizing the reconstructed image in transverse section scan," Phys. Med. Biol., vol. 20, pp. 789-798, 1975.
[Tre69] O. J. Tretiak, M. Eden, and M. Simon, "Internal structures for three dimensional images," in Proc. 8th Int. Conf. on Med. Biol. Eng., Chicago, IL, 1969.
[Wan77] L. Wang, "Cross-section reconstruction with a fan-beam scanning geometry," IEEE Trans. Comput., vol. C-26, pp. 264-268, Mar. 1977.
[Wer77] S. J. Wernecke and L. R. D'Addario, "Maximum entropy image reconstruction," IEEE Trans. Comput., vol. C-26, pp. 351-364, 1977.