3, JUNE 2006
761
I. INTRODUCTION
BREAST cancer early detection is recognized as a worldwide priority, since it constitutes the most effective way
to deal with this illness. Nevertheless, the detection specificity
of present diagnosis systems is low [1]. Therefore, research on
new diagnosis processes and systems for this type of cancer
is actively pursued. Positron Emission Tomography (PET)
based technology is one of these promising research lines.
PET technology is used in the development of the Clear-PEM
scanner, a high-resolution Positron Emission Mammography
(PEM) system, capable of detecting tumors with diameters
Manuscript received June 19, 2005; revised March 30, 2006. This work was
supported in part by AdI (Innovation Agency) and POSI (Operational Program
for Information Society), Portugal. P. Rodrigues and A. Trindade were supported by the FCT under Grant SFRH/BD/10187/2002 and Grant SFRH/BD/
10198/2002.
C. Leong and P. Bento are with INESC-ID, Lisboa, Portugal.
P. Lous, J. Nobre, and J. Rego are with INOV, Lisboa, Portugal.
P. Rodrigues and A. Trindade are with the Laboratório de Instrumentação e Física de Partículas, Lisboa, Portugal.
J. C. Silva is with the Laboratório de Instrumentação e Física de Partículas, Lisboa, Portugal, and also with CERN, Geneva, Switzerland.
I. C. Teixeira and J. P. Teixeira are with INESC-ID, Lisboa, Portugal, and also with the Instituto Superior Técnico, Universidade Técnica de Lisboa, Portugal.
J. Varela is with the Laboratório de Instrumentação e Física de Partículas, Lisboa, Portugal, and also with CERN, Geneva, Switzerland, and the Instituto Superior Técnico, Universidade Técnica de Lisboa, Portugal.
Digital Object Identifier 10.1109/TNS.2006.874841
TABLE I
DATA AND CONTROL SIGNALS DESCRIPTION
In each DAQ FPGA, system functionality is partitioned into DAQ (synchronization and processing), Read-Out Controller (ROC), and Filter modules.
Each one of the four DAQ boards maps 48 crystal modules.
Each DAQ FPGA inside each DAQ board processes data corresponding to 24 crystal modules. Modularity and hierarchy are
also present in the design of each module that constitutes the
DAQ FPGA.
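As a concrete illustration of this partitioning, the mapping from a crystal module index to its DAQ board and FPGA can be sketched as follows. The indexing scheme and function name are assumptions for illustration only; the paper states 48 modules per board and 24 per FPGA, from which two FPGAs per board are inferred.

```python
# Hypothetical sketch: locate a crystal module within the 4-board DAQ,
# following the partitioning above (48 modules/board, 24 modules/FPGA).

def locate_crystal_module(module_id: int) -> tuple[int, int, int]:
    """Return (board, fpga, channel) for a crystal module index 0..191."""
    if not 0 <= module_id < 4 * 48:
        raise ValueError("module_id out of range")
    board = module_id // 48          # 4 DAQ boards, 48 modules each
    within_board = module_id % 48
    fpga = within_board // 24        # 2 DAQ FPGAs per board (inferred)
    channel = within_board % 24
    return board, fpga, channel

# e.g. module 100 maps to board 2, FPGA 0, channel 4
assert locate_crystal_module(100) == (2, 0, 4)
```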
The Trigger and Data Concentrator (TGR/DCC) board
houses the TGR/DCC FPGA, which implements the Trigger
and Data Concentration functionality. This FPGA is responsible
for the detection of coincidence occurrences. This functionality
is implemented in module TGR (Fig. 2). Whenever a coincidence is detected, a trigger signal is generated. The presence
of this signal indicates to the DAQ that the corresponding
In Fig. 2, LVDS stands for Low Voltage Differential Signaling, which is a low-noise, low-power, low-amplitude method
for high-speed (gigabits per second) data transmission over
copper wire.
Two proprietary buses, the Generic Bus and the Dedicated
Bus, are responsible for the fast information flow within the
DAE system.
B. Process Diagrams
In the proposed D&T methodology, system functionality is
partitioned into sub-functions, in a hierarchical way to satisfy
design, testability and diagnostic requirements.
The partitioning procedure is based on the characterization of
data and data streams, as well as on the processes that transform
data.
Process Diagrams (PD) (e.g., Fig. 3) are used to describe
data storage, processing, and data and control flow. Process
Diagrams ease problem characterization and modeling. This
procedure has been adapted from the software domain [7],
[8], where this kind of modeling is used as a thinking tool
to characterize the problem under analysis as completely as
possible, prior to initiating system design.
In Fig. 3, ellipses represent processes, rectangles represent
external objects that interact with the processes, and arrows represent information flow.
A process is defined as a set of functions that carry out a given
functionality. Each ellipse conveys the process name and the
number of instances of that process (e.g., x4 means that there
are 4 instances of this module in the architecture). Each process
can be instantiated more than once. For instance, DAQ Sync is
instantiated 4 times. By doing so, modularity, reuse and parallelism are highlighted.
Each arrow conveys data and control signal information. Different types of arrows represent different types of information.
In this particular case, a distinction is made between functional
operation mode (dotted lines) and test mode (dashed and continuous
lines).
In test mode, a further distinction is made between data originated by the test modules, that is, the test vectors (dashed lines),
and the modules' responses to those vectors, that is, the modules'
signatures (continuous lines).
In a good design, Process Diagrams should present low connectivity; that is, processes should be designed so that their associated functionality executes as independently from
the other processes as possible. This eases the implementation
of hierarchy and parallelism in the design structures.
Another aspect contemplated in the Process Diagrams
is the time variable. Although it does not appear explicitly
in the diagrams, it is conveyed in the control signals that, together
with data, define the flow of information between processes.
To guarantee, as much as possible, the completeness of the
functional description, the concept of operational scenario is introduced. In this context, a scenario is defined as the set of processes and corresponding data and control flows that represents
the complete execution of the functionality in a given operation
mode.
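As a rough illustration, an operational scenario as defined above can be modeled as a set of processes plus the data and control flows exercised in one operation mode. The sketch below uses hypothetical names and data structures (not from the paper) and includes a simple consistency check that every flow starts and ends at a process belonging to the scenario:

```python
# Illustrative model of an operational scenario: processes + typed flows,
# with a completeness/consistency check over the flow endpoints.
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    src: str   # producing process (or external object)
    dst: str   # consuming process
    kind: str  # "functional", "test_vector", or "signature"

def scenario_is_consistent(processes: set[str], flows: list[Flow]) -> bool:
    """Every flow in the scenario must start and end at a known process."""
    return all(f.src in processes and f.dst in processes for f in flows)

# hypothetical functional-mode scenario
scenario = {"DAQ Sync", "ROC", "Filter"}
flows = [Flow("DAQ Sync", "ROC", "functional"),
         Flow("ROC", "Filter", "functional")]
assert scenario_is_consistent(scenario, flows)
```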
synchronism is a critical issue in this system (de-synchronization may mainly be due to the long and diverse lengths of the
interconnection cables). In fact, if synchronism is lost, data become meaningless. To guarantee synchronism in key parts of
the circuit, where delays associated with previous processing or
data paths may vary, self-adjusted pipeline structures
are used. In the latter case, data come through an asynchronous bus and are therefore scrambled in the time domain;
they must be de-scrambled before processing. In the former case, the structures automatically adapt to the cable length (cable delay).
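The de-scrambling step mentioned above can be illustrated behaviorally. The sketch below is a software model, not the FPGA pipeline: it simply reorders frames by their time tag, whereas a real self-adjusted pipeline would do this incrementally within a bounded reordering window.

```python
# Behavioral sketch of time-domain de-scrambling: frames arriving out of
# order over an asynchronous bus are restored to time order by their tag.
import heapq

def descramble(frames):
    """Yield (time_tag, payload) frames in time-tag order."""
    heap = list(frames)
    heapq.heapify(heap)          # order by time tag (first tuple element)
    while heap:
        yield heapq.heappop(heap)

scrambled = [(3, "c"), (1, "a"), (2, "b")]
assert list(descramble(scrambled)) == [(1, "a"), (2, "b"), (3, "c")]
```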
Moreover, it is necessary to guarantee the working frequency
of 100 MHz. To achieve this, registers are inserted
between modules whenever required. The modular character of the design significantly simplifies this procedure. As
mentioned, identical modules are used in parallel processing
mode. Different modules can work at different frequencies.
Synchronous and/or asynchronous FIFOs are used to guarantee
the correct data transfer between modules. With this generic
approach, implementing functional BIST structures is equivalent to implementing any other functionality.
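The role of the FIFOs can be illustrated with a small discrete-time model. This is an illustrative software sketch under assumed timing, not the hardware design: a producer and a consumer running at different clock periods exchange words through a FIFO, which preserves ordering and absorbs the rate mismatch over a finite run.

```python
# Toy two-clock-domain model: modules at different rates coupled by a FIFO.
from collections import deque

def simulate(n_items: int, producer_period: int, consumer_period: int):
    """Run until n_items words have crossed the FIFO; return them in order."""
    fifo: deque[int] = deque()
    received: list[int] = []
    t, produced = 0, 0
    while len(received) < n_items:
        if produced < n_items and t % producer_period == 0:
            fifo.append(produced)            # producer clock edge: push
            produced += 1
        if fifo and t % consumer_period == 0:
            received.append(fifo.popleft())  # consumer clock edge: pop
        t += 1
    return received

# fast producer (every 2 ticks), slower consumer (every 3 ticks):
# all words still arrive, in order, with the FIFO buffering the backlog.
assert simulate(10, 2, 3) == list(range(10))
```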
E. Testability Issues
DAE testing [8] is carried out in order to ensure: 1) design and
prototype validation, diagnosis and debug, and 2) lifetime self-test. This may be carried out at component, board and system
level. As mentioned before, the complexity of the system would
make its functional test extremely complex if based on the use
of external equipment only. Therefore, a test resource partitioning strategy has been adopted. Almost all the DAE test procedures are embedded in the FPGA design with negligible overhead: otherwise unused silicon area and limited speed degradation. The
implemented functional BIST structures support both abovementioned objectives [9].
The functional built-in test modules in the different FPGAs
aim at: 1) the verification of the correctness of the DAE system
An example of a test structure is presented in Fig. 6 corresponding to processes 2 and 3 in Fig. 3. As shown, a set of test
benches, TB1, TB2, and Null TB is applied to the processes to
be tested. Comparators are used to validate the module outputs
by comparison with the expected signature. These test benches
and expected outputs are generated by the Geant4 Monte Carlo
simulation toolkit and DIGITsim DAQ Simulator [2] and stored
in ROM blocks within the FPGAs.
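The compare-with-stored-signature scheme can be sketched in software as follows. Function and variable names are hypothetical; in the real system the test benches and expected signatures come from the Geant4/DIGITsim simulations and reside in FPGA ROM blocks, and the comparators are hardware.

```python
# Sketch of the functional BIST comparator: apply stored test benches to a
# module under test and compare each output against its expected signature.

def run_functional_bist(module, test_benches, expected_signatures) -> bool:
    """Pass only if every test bench produces its pre-computed signature.

    `module` is any callable standing in for the process under test.
    """
    for vectors, expected in zip(test_benches, expected_signatures):
        if module(vectors) != expected:
            return False   # comparator flags a mismatch
    return True

# toy module under test: sums its input vector
rom_test_benches = [[1, 2, 3], [4, 5]]
rom_signatures = [6, 9]
assert run_functional_bist(sum, rom_test_benches, rom_signatures)
assert not run_functional_bist(max, rom_test_benches, rom_signatures)
```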
Testing is carried out in two steps, one non-deterministic and
one deterministic. The non-deterministic test will verify that all
duplicated modules and blocks have identical response for the
same input vectors, which include Monte Carlo digitized data
frames. The deterministic test will verify that the functionality,
namely the evaluation of the two key values (Delta/Time Tag
and Energy) and samples [4], is correct on at least one complete signal path. The deterministic test will also verify that the
[4] P. Bento, "Architecture and first prototype tests of the Clear-PEM electronics systems," in IEEE MIC, Rome, Italy, 2004.
[5] N. Matela, "System matrix for Clear-PEM using ART and linograms," in IEEE MIC, Rome, Italy, 2004.
[6] OMG Unified Modeling Language, v1.5, Rational, 2003.
[7] B. Selic and J. Rumbaugh, "Using UML for modeling complex real-time systems," ObjecTime/Rational white paper, 1998.
[8] G. Hetherington, T. Fryars, N. Tamarapalli, M. Kassab, A. Hassan, and J. Rajski, "Logic BIST for large industrial designs: Real issues and case studies," in Proc. IEEE Int. Test Conf., 1999, pp. 358–367.
[9] P. Bento, C. Leong, I. C. Teixeira, J. P. Teixeira, and J. Varela, "Testability and DfT/DfD Issues of the DAE System for PEM," Tech. Rep., Jan. 2005, version 3.1.
[10] S. Agostinelli et al., "GEANT4 - a simulation toolkit," Nucl. Instrum. Meth. A, vol. 506, pp. 250–303, 2003.
[11] P. Rodrigues et al., "Geant4 applications and developments for medical physics experiments," IEEE Trans. Nucl. Sci., vol. 51, no. 4, pp. 1412–1419, Aug. 2004.