
2013 International Nuclear Atlantic Conference - INAC 2013

Recife, PE, Brazil, November 24-29, 2013


ASSOCIAÇÃO BRASILEIRA DE ENERGIA NUCLEAR - ABEN
ISBN: 978-85-99141-05-2

DEVELOPMENT OF A SIMULATOR FOR X-RAY FLUORESCENCE
SPECTROMETRY USING DISTRIBUTED PROCESSING

Marcio H. dos Santos 1,a, Robson C. de Castro 1,b, Marcelino J. dos Anjos 1,c,
Joaquim T. de Assis 2, Marcelo P. de Albuquerque 3 and Luís F. de Oliveira 1,d

1 Programa de Pós-Graduação em Física, Instituto de Física
Universidade do Estado do Rio de Janeiro
Rua São Francisco Xavier, 524
20550-013 Rio de Janeiro, RJ
a marciohsantos2010@gmail.com, b prof.robinho@gmail.com, c marcelin@uerj.br, d lfolive@uerj.br

2 Departamento de Engenharia Mecânica e Energia, Instituto Politécnico
Universidade do Estado do Rio de Janeiro
Rua Bonfim, 25, Vila Amélia
28625-570 Nova Friburgo, RJ
joaquim.iprj@gmail.com

3 Coordenação de Atividades Técnicas
Centro Brasileiro de Pesquisas Físicas
Rua Dr. Xavier Sigaud, 150, Urca
22290-180 Rio de Janeiro, RJ
mpalbuquer@gmail.com

ABSTRACT

In the study of energy transport by ionizing electromagnetic radiation, computer simulation is
naturally justified: first, because the simulation poses no risk to the analyst; second, because it is
cheaper than the real investment. However, simulation techniques based on the Monte Carlo method
present a well-known problem: the time required to yield a result, which grows as the complexity of
the model increases. X-ray fluorescence spectrometry is a very flexible technique that allows the
chemical composition of a sample to be studied. Before an X-ray fluorescence system is built, it is
advisable to study the feasibility of the setup for a specific spectrometric modality, and this step
can easily be performed through simulation. The probability of X-ray fluorescence production is very
low compared with other interaction mechanisms, so the time needed to reach a given level of statistics
is large. To overcome this problem, the simulation can be implemented to exploit modern processors
and graphics processing units (GPUs) and to use parallel programming languages. The goal of this
paper is to present the development of a simulator based on a parallel programming language that
exploits the available computing resources to address the efficiency problem of Monte Carlo simulation
in an ED-XRF context. An ED-XRF setup was simulated to analyse two samples built on an iron
support, one covered with nickel and the other with copper. To validate the simulator, the results
were compared with those of another simulator. The spectra of each sample generated by the
developed simulator and by the reference simulator are shown.

1. INTRODUCTION

Different needs can justify the decision to analyse a real system: a better understanding of
the system behaviour under a variety of conditions, the identification of potential modifications
to the system procedures to increase efficiency, or simply a better understanding of the current
operational conditions [1, 2]. The analysis process can require time and money when these goals
are pursued by manipulating the physical system. When such manipulation becomes economically
infeasible, computer simulation is a very good alternative.

In the study of energy transport by ionizing electromagnetic radiation, computer simulation
is naturally justified: first, because the simulation poses no risk to the analyst; second, because
it is cheaper than the real investment. The effort to develop computer simulators for energy
transport is not new; simulation has been used since World War II. However, simulation
techniques based on the Monte Carlo method present a well-known problem: the time required
to yield a result, which grows as the complexity of the model increases.

Modern personal computers and computer science itself have been advancing and yielding new
approaches to this problem. Nowadays, graphics boards based on GPUs, multi-core processors
and distributed programming languages are the instruments used to implement optimized codes.
One example is the CHARM++ language. Developed by the Parallel Programming Laboratory
of the University of Illinois, this language supports a great variety of processors and network
architectures, is easy to install and offers a great tool for analysing code performance.

Returning to the problem of energy transport, X-ray fluorescence spectrometry (XRF) is a
very flexible technique that allows the chemical composition of a sample to be studied. The
precision of the technique depends on the characteristics of the setup. X-ray fluorescence
spectrometry admits different experimental modalities, each favouring a different kind of
measurement, e.g., wavelength dispersive XRF (WD-XRF), energy dispersive XRF (ED-XRF),
total reflection XRF (TR-XRF) and micro-fluorescence (µ-XRF). Essentially, the elements of a
simple X-ray fluorescence system are the X-ray source, the sample and the detector. Depending
on the relative position of these elements, a different modality of spectrometry becomes
available [3].

Before an X-ray fluorescence system is built, it is advisable to study the feasibility of the
setup for a specific spectrometric modality, and this step can easily be performed through
simulation. The probability of X-ray fluorescence production is very low compared with other
interaction mechanisms, so the time needed to reach a given level of statistics is large. To
overcome this problem, the simulation can be implemented to exploit modern processors and
graphics processing units (GPUs) and to use parallel programming languages.

This is the goal of the paper: the development of a simulator based on a parallel programming
language that exploits the computing resources (CPU and network) to address the efficiency
problem of Monte Carlo simulation in an ED-XRF context. Distributed processes can be created
with other tools, such as the Message Passing Interface (MPI), but CHARM++ is natively a
parallel language.
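
As an illustration of what this choice looks like in practice, the sketch below shows a minimal
CHARM++ chare array that distributes photon histories over the available processing elements
and collects the partial results through a reduction. The module, class and method names
(xrfsim, Worker, traceOnePhoton and so on) are hypothetical and do not reproduce the code
actually developed in this work.

    // xrfsim.ci -- hypothetical interface file declaring the chares
    mainmodule xrfsim {
      readonly CProxy_Main mainProxy;
      mainchare Main {
        entry Main(CkArgMsg* m);
        entry [reductiontarget] void collect(long nDetected);
      };
      array [1D] Worker {
        entry Worker();
        entry void run(long nHistories);
      };
    };

    // xrfsim.cpp -- hypothetical implementation
    #include "xrfsim.decl.h"

    /* readonly */ CProxy_Main mainProxy;

    class Main : public CBase_Main {
     public:
      Main(CkArgMsg* m) {
        delete m;
        mainProxy = thisProxy;
        const long totalHistories = 1000000000L;   // e.g. 10^9 events
        const int  nWorkers       = CkNumPes();    // one worker chare per processing element
        CProxy_Worker workers = CProxy_Worker::ckNew(nWorkers);
        workers.run(totalHistories / nWorkers);    // broadcast: each worker receives its share
      }
      void collect(long nDetected) {               // reduction target: sum over all workers
        CkPrintf("photons recorded by the detector: %ld\n", nDetected);
        CkExit();
      }
    };

    class Worker : public CBase_Worker {
     public:
      Worker() {}
      void run(long nHistories) {
        long detected = 0;
        for (long i = 0; i < nHistories; ++i) {
          // A real worker would call something like traceOnePhoton() here: sample the
          // source, transport the photon through the sample and score the detector.
        }
        contribute(sizeof(long), &detected, CkReduction::sum_long,
                   CkCallback(CkReductionTarget(Main, collect), mainProxy));
      }
    };

    #include "xrfsim.def.h"

The interface file is translated by the CHARM++ tools into the xrfsim.decl.h and xrfsim.def.h
headers included in the implementation file.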

The ED-XRF setup was simulated to analyse two samples built on an iron support, one covered
with nickel and the other with copper. To validate the simulator, the results were compared
with another simulator (MCNP5). The spectra of each sample generated by the developed
simulator and by MCNP5 are shown.

2. MATERIALS AND METHODS

The process of creating a model intrinsically introduces a series of uncertainties, since the
model is, and will always be, an approximation of the system, far from the real one. It can
be understood as a representation of reality. It is important to keep in mind that the results
obtained by modelling are always affected by a variety of sources of uncertainty. These results
are not tied to the creation of an object but to a concept of the model that is as close as
possible to the goals of the study. Globally, everything in simulation depends on the research
context and on the kind of information one wants to obtain from the system (and from the
model).

As can be seen in figure 1, the modelling process is a sequence of steps starting with the
problem definition and ending with the analysis of results [4]:

• Problem definition: the problem is stated as a conception and the processes involved must
be identified;

• Goals definition: the analyst must define the goals of the study and of the simulation,
i.e., which parameters must be followed;

• System analysis: the system must be broken down and the relations among inputs, outputs
and processes identified;

• Descriptive model: the first modelling of the system; each part (subsystem) must be
presented;

• Parameters behaviour: the step before the formal modelling; the identification of the
influence of each parameter within the model;

• Formal model: the mathematical description of the processes;

• Model implementation: the step in which the algorithms and codes of the model are built;

• Model evaluation: the simulation as a whole is analysed in order to be validated;

• Results analysis: the last step, in which the results are compared with real data.

2.1. Modelling of the X-ray Spectrometry System

As mentioned in the previous section, the X-ray spectrometry process starts with the emission
of X-ray photons by a source, usually an X-ray tube. This radiation may be transmitted or may
interact with the sample through physical processes, namely elastic or inelastic scattering and
the photoelectric effect. The secondary radiation resulting from this interaction spreads in
several directions. A portion of this radiation (with energies lower than 1.02 MeV) can be
collected by a detector (or by a set of detectors) sensitive to the radiation. This device is
responsible for generating the X-ray spectra, which include the fluorescence photons produced
by the photoelectric effect and therefore carry the chemical signature of the sample. The
spectrometric technique in question is energy dispersive.

Figure 1: Steps of the modelling process.

The subsystems that make up the spectrometry system are the source, the sample and the
detector. The system parameters, in turn, are: the source position, its direction of emission,
the detector position, its orientation, the location of the sample and the coordinate system
that governs the whole system.

2.2. Modelling of the Source Subsystem

The X-ray tube is the source of electromagnetic radiation, responsible for the generation and
emission of photons. Seen as a subsystem, the tube may be modelled from its internal elements.
However, it can also be described as a black box characterized only by its ability to emit X-ray
photons. In this work, the second approach was chosen because it allows better control of the
modelling process: the black-box model simplifies the project without necessarily compromising
the capability of the process. Thus, the X-ray tube is characterized only by parameters related
to the generated radiation beam.



The parameters connected to the beam that identify the source subsystem are: the emission
diameter, the opening angle, the emission centre and the energy spectrum of the emitted
photons.
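
As an illustration only, these parameters could be grouped in a data structure such as the
C++ sketch below; the names, types and units are assumptions made here and not the data
structures actually used by the simulator.

    #include <array>
    #include <vector>

    // Hypothetical description of the source subsystem (black-box X-ray tube).
    struct SpectrumLine {
      double energy_keV;   // photon energy of the line or bin
      double weight;       // relative emission probability
    };

    struct SourceModel {
      std::array<double, 3> emissionCentre;   // centre of emission
      std::array<double, 3> beamDirection;    // direction of emission (unit vector)
      double emissionDiameter_cm;             // diameter of emission
      double openingAngle_deg;                // opening angle of the beam
      std::vector<SpectrumLine> spectrum;     // energy spectrum of the emitted photons
    };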

2.3. Modelling of the Detector Subsystem

The X-ray detector is the main component of the detection stage of the spectrometry system.
The system can have one or more detectors in order to record an even larger share of the
secondary radiation. This device, as already mentioned, is responsible for the generation of
the output X-ray spectra, thus establishing the basis of the spectrometry.

The geometry of the detector needs to be defined (as a logical principle), a parallelepiped
being one of the most practical choices. The detector geometry is therefore determined by
three parameters: height, width and depth. For the sake of modelling, the depth is not
considered a decisive parameter in the photon counting process. In this case, the detector
model is more a logical counting device than an experimental apparatus.

The set of variables that characterize the detector subsystem is: detector position, sensitive
detection surface, detection direction, height, width, depth and detection matrix.
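
Again as an illustration only, these variables could be collected in a structure such as the
following sketch; the names and the choice of a simple matrix of counters are assumptions of
this example, not the authors' actual implementation.

    #include <array>
    #include <vector>

    // Hypothetical description of the detector subsystem as a logical counting device.
    struct DetectorModel {
      std::array<double, 3> position;          // detector position
      std::array<double, 3> direction;         // detection direction (unit vector)
      double height_cm, width_cm, depth_cm;    // parallelepiped geometry (depth not decisive)
      // Detection matrix: each cell counts the photons that reach the corresponding
      // region of the sensitive surface (e.g. 1000 x 1000 in the setup of section 3).
      std::vector<std::vector<long>> counts;
    };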

2.4. Modelling of the Sample Subsystem

The sample is a set of media (specimens) on which the primary radiation is focused. After the
incidence, there is a probability of occurrence of the aforementioned processes (transmission,
photoelectric absorption, incoherent or coherent scattering). It can be considered the main
subsystem, since the nature of the interaction is critical to the quality of the spectrometric
analysis.

In experimental terms, the sample has a specific geometry and occupies a region within the
system itself. This region is the sample space and can be modelled as a volume (a
parallelepiped) into which the set of media is inserted. This volume contains any sample of
the spectrometry system and should preferably be as small as possible.

Thus, the sample can be identified by nine parameters: height, width and depth of the
sample space, geometry of the specimens, orientation of the sample space, point of inci-
dence, incidence matrix and composition table.
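
One possible way to organize these parameters is sketched below; as before, the structure is a
hypothetical illustration (in particular, the representation of the specimens as a stack of
layers is an assumption suggested by the samples of section 3).

    #include <array>
    #include <string>
    #include <vector>

    // Hypothetical description of the sample subsystem (sample space plus specimens).
    struct Layer {
      std::string material;     // e.g. "Cu", "Ni" or "Fe"
      double thickness_cm;      // thickness of the layer
    };

    struct SampleModel {
      double height_cm, width_cm, depth_cm;      // dimensions of the sample space
      std::array<double, 3> orientation;         // orientation of the sample space
      std::array<double, 3> incidencePoint;      // point of incidence of the primary beam
      std::vector<std::vector<long>> incidence;  // incidence matrix
      std::vector<Layer> composition;            // composition table (specimen geometry)
    };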

2.5. Modelling of the Database Component

This component is a feedback mechanism within the process and not a physical element of the
spectrometry system; in other words, it is not a subsystem in itself. The database is simply a
theoretical option of this modelling process and has no physical link with the spectrometry
system.



The database consists of a directory that stores 94 text files, each containing data and
information about one element of the periodic table (from hydrogen to plutonium, without
exclusion). The whole database exists to assist the computational modelling of the spectrometry
system, supporting the entire radiation interaction process [5].

Therefore, the parameters that characterize the database are: element symbol, atomic number,
atomic mass, atomic density, fluorescence production factor, mass cross-section coefficients,
binding energies and jump-ratio constants.
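
A minimal sketch of how one record of this database could be represented in memory is shown
below; the field names, units and container choices are assumptions of this example and do not
describe the actual file format of reference [5].

    #include <map>
    #include <string>
    #include <vector>

    // Hypothetical in-memory layout of one element record (one text file per element).
    struct ElementRecord {
      std::string symbol;                         // element symbol, e.g. "Fe"
      int atomicNumber;                           // Z
      double atomicMass;                          // atomic mass
      double density;                             // atomic density
      double fluorescenceYield;                   // fluorescence production factor
      std::map<double, double> massCrossSection;  // energy -> mass cross-section coefficient
      std::vector<double> bindingEnergies;        // binding (edge) energies
      std::vector<double> jumpRatios;             // jump-ratio constants
    };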

3. RESULTS

The simulator was implemented in the CHARM++ language and evaluated in a Linux environment,
which is more appropriate for running parallel and distributed applications. The CHARM++
language, in turn, is well suited to the implementation of various scientific applications,
that is, it is a “mature” language, and it is useful for performance optimization because of
the flexibility it combines with its programming logic [6].

Several tests of verification, comparison and performance were performed with respect to the
nature of the simulator. Among them, we highlight four comparisons between the spectrometric
results provided by the simulator in question and by MCNP. The simulated spectrometry system
comprises the elements listed below (figure 2); a code sketch restating this configuration is
given after the list:

Figure 2: Hypothetical illustration of system configuration.

• an X-ray point (“punctiform”) source emitting a monochromatic spectrum of 17 keV, placed
5 cm from the centre of the system; the beam direction makes an angle of 25 degrees with
the Z-axis;

• a detector positioned to record the scattering from the sample, 5 cm from the centre of
the system, on the side opposite to the source. Its detection direction makes 25 degrees
with the Z-axis and lies in the same vertical plane formed by the source and the centre of
the system. The detector has a sensitive area of 1 cm² and a matrix of 1000 × 1000. The
MCNP detectors, in their case, are hollow spheres of 1 cm in diameter with a carbon
surface;



• four different samples composed of thin layers of copper or nickel on an iron support: the
first sample has a 5 µm copper layer, the second a 10 µm copper layer, the third a 5 µm
nickel layer and the fourth a 10 µm nickel layer. The iron support is 1 cm thick in all
samples;

• 10⁹ events were generated in all simulations.
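
The sketch below restates these configuration values in code form, merely as a compact summary;
the structure, field names and units are illustrative assumptions and do not correspond to the
simulator's actual input format.

    // Hypothetical summary of the simulated configuration described above.
    struct SimulationConfig {
      // Source: point ("punctiform"), monochromatic, 5 cm from the system centre,
      // beam at 25 degrees with the Z-axis.
      double sourceEnergy_keV    = 17.0;
      double sourceDistance_cm   = 5.0;
      double sourceAngle_deg     = 25.0;
      // Detector: 5 cm from the centre, opposite side, 25 degrees with the Z-axis,
      // sensitive area of 1 cm^2 discretized as a 1000 x 1000 matrix.
      double detectorDistance_cm = 5.0;
      double detectorAngle_deg   = 25.0;
      double detectorArea_cm2    = 1.0;
      int    matrixSize          = 1000;
      // Sample: a Cu or Ni layer (5 or 10 micrometres) over a 1 cm iron support.
      const char* layerMaterial  = "Cu";           // or "Ni"
      double layerThickness_um   = 5.0;            // or 10.0
      double supportThickness_cm = 1.0;
      // Number of photon histories.
      long   nEvents             = 1000000000L;    // 10^9
    };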

Regarding the spectrometric results (figures 3–6), although the energy values are the same in
each pair of results, some discrepancy occurred between the photon counts. This was expected,
since the modelling adopted by this simulator and by MCNP certainly reflect different points
of view.

Figure 3: Spectrum generated by the simulator (left) and the MCNP (right),
for a copper layer of 5 µm.

Figure 4: Spectrum generated by the simulator (left) and the MCNP (right),
for a copper layer of 10 µm.

The implementation of the simulation by the Monte Carlo method was favoured by its inherent
randomness, which indicates that the problem of energy transport by photon emission can be
naturally distributed. Photons are independent of each other, and the history of each photon
is a sequence of events that does not interfere with any other photon history. This allowed
the use of distributed programming to optimize the whole process.
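
The independence of the histories is what makes the decomposition trivial: each worker can
trace its own share of photons with a private random number stream and a private histogram,
and the partial spectra are simply summed at the end. The self-contained C++ sketch below
illustrates the idea with plain std::thread instead of CHARM++ only to keep it short; every
name and number in it is illustrative, and the random bin stands in for a real transport
kernel.

    #include <cstdio>
    #include <random>
    #include <thread>
    #include <vector>

    int main() {
      const int  nWorkers   = 4;
      const long nHistories = 1000000L;        // histories per worker (illustrative)
      const int  nBins      = 1024;            // energy bins of the spectrum

      std::vector<std::vector<long>> partial(nWorkers, std::vector<long>(nBins, 0));
      std::vector<std::thread> pool;

      for (int w = 0; w < nWorkers; ++w) {
        pool.emplace_back([&, w]() {
          std::mt19937_64 rng(1234u + w);      // independent seed per worker
          std::uniform_int_distribution<int> bin(0, nBins - 1);
          for (long i = 0; i < nHistories; ++i) {
            // A real worker would transport one photon and, if it reached the detector,
            // increment the bin of its final energy; a random bin stands in for that here.
            partial[w][bin(rng)]++;
          }
        });
      }
      for (auto& t : pool) t.join();

      std::vector<long> spectrum(nBins, 0);    // merge the partial spectra
      for (int w = 0; w < nWorkers; ++w)
        for (int b = 0; b < nBins; ++b) spectrum[b] += partial[w][b];

      std::printf("total counts in the merged spectrum: %ld\n",
                  (long)nWorkers * nHistories);
      return 0;
    }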



Figure 5: Spectrum generated by the simulator (left) and the MCNP (right),
for a nickel layer of 5 µm.

Figure 6: Spectrum generated by the simulator (left) and the MCNP (right),
for a nickel layer of 10 µm.

The performance analysis of the simulator, in turn, was carried out for situations in which
the number of processors varied from 1 to 18. The reduction in the average processing time is
illustrated by the graph in figure 7. The average times were obtained from 10 rounds of
simulation of the 5 µm copper sample with 10⁸ events for each number of processors. The
uncertainties were calculated as the standard deviation of each set of measurements, and the
same uncertainties were used in the speed-up graph.

The speed-up as a function of the number of processors was calculated as the ratio between the
average simulation time for one processor and the average time for each of the other cases
(figure 8): the speed-up for two processors is the ratio between the average time for one
processor and the average time for two processors, and so forth.
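
For reference, this calculation, together with a simple propagation of the standard deviations
into the speed-up, is sketched below. The first-order propagation formula is an assumption of
this example, since the paper does not state how the error bars of figure 8 were obtained, and
the numerical values are purely illustrative.

    #include <cmath>
    #include <cstdio>

    // Speed-up S(p) = T(1) / T(p), with first-order propagation of the standard
    // deviations of the two average times.
    int main() {
      double t1 = 100.0, s1 = 2.0;   // average time and deviation for one processor
      double tp = 55.0,  sp = 1.5;   // average time and deviation for p processors

      double speedup = t1 / tp;
      double sigma   = speedup * std::sqrt((s1 / t1) * (s1 / t1) + (sp / tp) * (sp / tp));

      std::printf("speed-up = %.2f +/- %.2f\n", speedup, sigma);
      return 0;
    }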

The importance of each core within the ensemble should also be noted, since the relation
between the number of processors and the density of communication among them can become
unfavourable. This is why, beyond a certain point, the measured curve moves away from the
ideal one.



Figure 7: Average time graph for the simulation of the 5 µm copper sample with
10⁸ events.

Figure 8: Speed-up graph for the simulation of the 5 µm copper sample with 10⁸
events.

4. CONCLUSIONS

The aim of this paper was the development of a computational simulator for the analysis and
study of X-ray fluorescence systems. It is worth noting the value of the modelling process,
since this resource allows the problem to be studied from a chosen point of view without
turning into a rigid path that paralyses the modeller.

Regarding future prospects, the main focus is the continued development of the simulator in
order to test different ways of analysing a given system. The modelling of the detectors can be
improved, including the insertion of technical parameters. The introduction of other X-ray
interaction processes would also be worthwhile, such as X-ray diffraction or total reflection
spectrometry.

Another feasible extension is the interaction of electrons with the medium, which would pave
the way for modelling the X-ray tube from its internal elements. All these ideas are important
not only in terms of diversification, but also to demonstrate that there are no limits when
studying a system by building models.

ACKNOWLEDGMENTS

To FAPERJ, for financial support.

REFERENCES

1. SCHMIDT, J. W. Fundamentals of digital simulation modelling. In: WINTER SIMULATION
CONFERENCE, 13th. Proceedings. Atlanta: Vol. 1, 1981, p. 13–21.
2. SANCHEZ, P. J. Fundamentals of simulation modelling. In: WINTER SIMULATION
CONFERENCE, 39th. Proceedings. Piscataway: 2007, p. 54–62.
3. GRIEKEN, R.; MARKOWICZ, A. Handbook of X-ray Spectrometry. 2nd ed. New York:
Marcel Dekker, 2002. 1016 p.
4. SOKOLOWSKI, J.; BANKS, C. Principles of Modeling and Simulation: a Multidisciplinary
Approach. Hoboken: John Wiley & Sons, 2009. 259 p.
5. EBEL, H. et al. Numerical description of photoelectric absorption coefficients for
fundamental parameter programs. X-Ray Spectrometry, v. 32, n. 6, p. 442–451, Jun. 2003.
6. KALE, L. V.; KRISHNAN, S. CHARM++: a portable concurrent object oriented system
based on C++. SIGPLAN Notices, New York, v. 28, n. 10, p. 91–108, Oct. 1993.
