

The Current State of Test Compression

Historical Perspective on Scan Compression

Rohit Kapur, Synopsys
Thomas W. Williams, Synopsys
Subhasish Mitra, Stanford University

IEEE Design & Test of Computers, March/April 2008

Editor's note:
At the beginning of this decade, compression suddenly became a hot topic in test. In reality, compression has had a long history. This historical survey ties together the classical techniques of reducing test pattern size with the newer methods, and shows the continuity of work in this area.
—Scott Davidson, Sun Microsystems

THE BEGINNINGS OF the modern-day IC test trace back to the introduction of such fundamental concepts as scan, stuck-at faults, and the D-algorithm. Since then, several subsequent technologies have made significant improvements to the state of the art. Today, IC test has evolved into a multifaceted industry that supports innovation. Researchers present new ideas in approximately 40 workshops every year. Innovative advances in test research and practice over four decades have culminated in a set of technologies that address the problem of the rising cost of test. Scan compression technology has proven to be a powerful antidote to this problem, as it has catalyzed reductions in test data volume and test application time of up to 100 times. As a result, the cost of test might well be contained for many years to come.

This article sketches a brief history of test technology research, tracking the evolution of compression technology that has led to the success of scan compression. It is not our intent to identify specific inventors on a fine-grained timeline. Instead, we present the important concepts at a high level, on a coarse timeline. Starting in 1998 and continuing to the present, numerous scan-compression-related inventions have had a major impact on the test landscape. However, this article also is not a survey of the various scan compression methods. Rather, we focus on the evolution of the types of constructs used to create breakthrough solutions.

Historical overview

During the past decade, IC test solutions mainly focused on cost containment (and low quality contributes to cost). The variance is where cost is encountered. In the beginning, cost was a result of the delay impact and the area overhead of adding the multiplexer to implement scan. Later, the cost equation shifted to test time, as measured by the time spent on expensive testers. Today, the cost equation is moving toward flows in which test forms a value link in achieving other tasks, such as yield improvement. As a result of this change, we can clearly demarcate four eras in the evolution of the technology trying to contain the cost of IC test:

- expensive multiplexer,
- gate-to-gate connection,
- low-cost ATE, and
- manufacturing yield.

Figure 1 depicts these eras on a rough timeline.

Figure 1. Timeline divided into eras showing test technologies for test cost reduction.

From the 1960s through the 1980s, gate delay and the area associated with gates played a significant role in test solutions. This era represents the longest span of change in test. It began with the invention of scan and ended when scan tests became the primary test solution used to manufacture digital ICs. To incorporate scan chains, the multiplexer was added in the data path of a flip-flop. Although most optimizations in this era, such as partial scan, did not add efficiency to the testing process, there were many other significant innovations that dealt with reducing the pattern count for the designs. The technologies developed primarily during this timeframe fall into the expensive multiplexer era.

Some of these technologies were improvements in modeling and algorithms, and others can be classified as DFT solutions. Most of the DFT solutions in this era were classified as high-overhead solutions because the need for compact tests was a secondary issue in the testing process.

As semiconductor technology moved toward smaller geometries in the 1990s, the shift in delay from the gate to the net had a significant impact on test. With the ability to manufacture a large number of transistors, and the diminishing impact of gate delay on the design, DFT solutions that were deemed impractical when first invented suddenly became useful for reducing the number of test patterns and the data associated with it. These technologies, which fall in the gate-to-gate connection era, are now recognized as a precursor to today's scan compression technology.

As test technologists continued to push the envelope on issues related to high quality, or low defects per million (DPM), the cost of test per transistor stayed relatively constant. The cost of manufacturing a transistor, meanwhile, kept decreasing. With the cost of test projected to equal the cost of manufacturing a transistor, the test industry moved into the low-cost ATE era. Demands on reducing the cost of ATE, the biggest ticket item, led to the creation of structural testers.

While the commotion of the low-cost ATE continued to be sorted out, the EDA industry responded with new DFT solutions that addressed the test data volume and test application time problem. Experience and techniques gained from the expensive multiplexer era (especially prior work on scan and logic BIST), together with significant new innovations, resulted in technology that was relatively quickly adopted and implemented to address the needs created by the new focus on scan compression.

Software solutions for test cost reduction

Scan made it practical to automatically generate test patterns. With the single stuck-at fault model as a target, the algorithms could structurally verify the correctness of transistor-transistor logic (TTL) with high confidence. Since then, the quest for efficient automation has continued. One noted area of efficiency was test pattern reduction—the effort to trim test data volume and test application time. Today, this is the primary goal of scan compression, and test automation software has used several methods to achieve it:

- fault dominance,
- reverse simulation of test patterns,
- static compaction, and
- dynamic compaction.

Single stuck-at faults ensure that the test set covers every net of the design. However, it is unnecessary to apply a unique test for each single stuck-at fault in the netlist. Generating tests for single stuck-at faults for inputs and the fan-out branches of the combinational netlist guarantees that the resulting test set will detect all the single stuck-at faults.
Targeting the right faults first has a significant impact on the number of test patterns created to fully cover the faults. At the heart of this is the concept of fault dominance. A fault is said to dominate another fault if the set of tests that render the fault detectable is a proper subset of the set of tests that detect the fault that gets dominated.

Figure 2 shows a conceptual view of all possible tests for three different faults. Overlapping portions of the Venn diagram regions depict tests that detect multiple faults. Fault F2 dominates F1 because every test for F2 is also a test for F1. By first targeting F2, we ensure that both F1 and F2 get detected. The reverse situation might not apply.

Figure 2. Venn diagram depicting all tests to detect faults F1, F2, and F3.

Using this principle, if we were to target faults with the more restricted test set, the selected test guarantees the detection of the faults with the larger test set. This is not true if the fault with the larger test set is targeted first. Fault dominance helps reduce the number of test patterns. This concept, along with other fault-ordering methods, plays a significant role in containing test pattern count.
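
To make the fault-ordering idea concrete, here is a minimal sketch in Python; the fault names and test IDs are hypothetical, and real ATPG tools derive this information from the netlist rather than from explicit test sets.

```python
# Illustrative dominance-based fault ordering (hypothetical faults and tests).
# Each fault maps to the set of tests that detect it, as in Figure 2.
tests_for = {
    "F1": {"t1", "t2", "t3", "t4"},   # large test set
    "F2": {"t2", "t3"},               # proper subset of F1's tests: F2 dominates F1
    "F3": {"t4", "t5"},
}

def order_faults(tests_for):
    """Target faults with the most restricted (smallest) test sets first."""
    return sorted(tests_for, key=lambda f: len(tests_for[f]))

selected_tests, covered = [], set()
for fault in order_faults(tests_for):
    if fault in covered:
        continue                                  # already detected by an earlier test
    test = next(iter(tests_for[fault]))           # pick any test for this fault
    selected_tests.append(test)
    # Every fault whose test set contains the chosen test is also detected.
    covered.update(f for f, ts in tests_for.items() if test in ts)

print(selected_tests)   # e.g., ['t2', 't4']: two patterns cover F1, F2, and F3
```
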
In the early days of ATPG algorithms, tests were generated by first applying some random patterns to the netlist and fault-grading the patterns against the target faults. Although the process of generating tests was efficient, the number of test patterns created in the random-generation phase led to significant test data volume. Solutions that reverse simulated patterns, or randomized and then simulated the patterns, were able to reduce the number of test patterns for the same detection credit of targeted faults.

Although ATPG is performed one fault at a time, fault simulation, depending on the platform's word length (32- and 64-bit machines), is performed with many patterns. The process of collecting ATPG results to accumulate the set of patterns that are simulated in parallel is called static compaction. After ATPG generates a test cube, the test pattern is stored in a temporary set of bins with a process that attempts to merge the test cubes that are created. Before starting a new bin, the test cube is checked for compatibility with bins that already have test cubes in them. If the new test is seen as compatible with the existing test cubes—that is, it has the same stimulus requirements for common stimulus points—then the test cube is merged with the test cube in the bin. With some exit criteria, the test cubes are completed with random values before fault simulation is performed.

After ATPG generates a test for a target fault (primary fault), the algorithm tries to target many more faults (secondary faults) before calling it quits and sending the test cube on to the next step. This process is called dynamic compaction.
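
A minimal sketch of the bin-merging step follows, assuming a simplified cube representation (a dictionary of care bits per scan cell); it illustrates the compatibility check described above, not any particular tool's implementation.

```python
# Simplified static compaction: merge compatible test cubes into bins.
# A test cube is a dict {scan_cell: value}; unspecified cells are don't-cares.

def compatible(cube_a, cube_b):
    """Two cubes are compatible if they agree on every commonly specified cell."""
    return all(cube_b.get(cell, val) == val for cell, val in cube_a.items())

def static_compact(cubes):
    bins = []
    for cube in cubes:
        for b in bins:
            if compatible(b, cube):
                b.update(cube)          # merge the cube into the existing bin
                break
        else:
            bins.append(dict(cube))     # no compatible bin, start a new one
    return bins

# Hypothetical ATPG cubes: three cubes collapse into two merged patterns.
cubes = [{"s0": 1, "s3": 0}, {"s1": 1, "s3": 0}, {"s0": 0, "s2": 1}]
patterns = static_compact(cubes)
print(len(patterns), patterns)   # 2 [{'s0': 1, 's3': 0, 's1': 1}, {'s0': 0, 's2': 1}]
# Remaining don't-care cells would then be filled with random values
# before fault simulation, as described above.
```
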
In this section, we left out technologies that contribute to test data volume reduction and test application time improvement but are implemented for other reasons. Examples of these technologies include scan with parallel scan chains limited by the device interface, random fill in ATPG, fault dropping in fault simulation, and many heuristics that identify redundant faults and guide the ATPG to better solutions. The primary goal of most of these technologies was to make ATPG and fault simulation runtime more efficient.

Area-insensitive DFT to contain test cost

During the expensive multiplexer era, some researchers ventured down the DFT path and spent precious silicon real estate on the then-secondary needs of test. Such technologies are represented by test points and logic BIST. Both of these technologies got a boost when shrinking technology changed the delay equation. When interconnect delay became a dominant factor in VLSI design, coupled with the ability to put a large number of gates on silicon, these area-insensitive technologies got a second wind and became the natural precursor to scan compression.

In just a few years, researchers developed several technologies to add logic between the scan inputs and outputs and the internal scan chains. These efforts leveraged the fact that a large portion of ATPG-generated tests had few stimulus and observation requirements for fault detection.[1] These technologies relied on decoupling the scan inputs and outputs from the internal chains such that a larger number of internal chains could be driven from a smaller interface.



Figure 3 shows the impact of decoupling the scan terminals from the internal scan chains. This figure also shows that a reduction in chain length linearly reduces the test data volume and test application time. A ratio of 4 times more internal chains than the scan interface translates to 4-times-shorter chains and a corresponding reduction in test data and test time (the quality-of-results metrics). This is the fundamental mechanism behind the numerous research papers and the commercially available scan compression technologies of today. A one-to-one relationship between the number of stimulus and observed values on the boundary and the number of shifts leads to equal gains in test data volume and test application time. A further decoupling of the number of times a stimulus or observation is performed on the interface from the number of shifts in the internal scan chains leads to solutions that have different gains for data and time.

Figure 3. Decoupling of the scan interface from the internal scan chains helps reduce the test data volume and test application time.
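
As a back-of-the-envelope illustration of that linear relationship, the numbers below are hypothetical: a design with 64,000 scan cells, 8 scan-in pins, and a ratio of 4 times more internal chains than pins.

```python
# Hypothetical example of the linear gain from shorter internal chains.
scan_cells = 64_000
scan_in_pins = 8
patterns = 10_000

def shift_cycles_per_pattern(num_chains):
    # Shift cycles per pattern are set by the longest chain.
    return -(-scan_cells // num_chains)            # ceiling division

conventional = shift_cycles_per_pattern(scan_in_pins)        # 8 chains, one per pin
compressed = shift_cycles_per_pattern(scan_in_pins * 4)      # 32 internal chains
print(conventional, compressed)                    # 8000 vs. 2000 shifts per pattern

# Tester storage scales the same way: bits stored per pattern are
# (shift cycles) x (scan-in pins), so 4 times more internal chains than pins
# gives roughly 4 times less data and 4 times less application time.
print(patterns * conventional * scan_in_pins)      # 640,000,000 stimulus bits
print(patterns * compressed * scan_in_pins)        # 160,000,000 stimulus bits
```
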
Logic added to interface the scan input with the internal scan chains is referred to as the decompressor because it takes only a few values to supply a much larger set of receiving flip-flops. Logic added to interface the internal scan chains to the scan outputs is referred to as the compressor because it takes many values from the flip-flops and funnels the data to a far smaller set of terminals. Logic X (unknown) values in the test response have a negative impact on the observability of good responses that come together in the compressor. To handle logic X values in the response, gating logic is added between the scan chains and the compressor. This is called the masking logic.

Test points

Scan itself is a test point added to the flip-flops in the design. This control-and-observe point on the flip-flops was critical to the ability to generate tests using ATPG algorithms. However, control-and-observe test points can be added anywhere in the combinational part of the circuit. Although the technology was invented to increase the fault coverage when targets were not achieved, adding test points has the effect of reducing test patterns when fault coverage is kept the same.[2] Thus, test points became the simplest DFT construct to reduce test data volume and test application time. This construct effectively increases the number of unspecified values in the test patterns, and thus improves the compaction algorithms supported in the ATPG tools.

Reseeding logic BIST

Logic BIST technology was invented at a time when area overhead for test was a sensitive issue, and it played an important role in a niche system test market.[3] This heavy-handed solution for test was used in production environments as a way to reuse existing system test infrastructures. This technology represented high test data compaction but not much, if any, test application time improvement. However, it was never recognized as a scan compression solution. In the 1990s, reseeding technology was used to reduce the test application time—at the expense of reduced test data volume gains. This made logic BIST the first real scan compression solution.[4]

Figure 4 shows the popular STUMPS structure for logic BIST. Seeds applied directly to the linear-feedback shift register (LFSR), or through another scan chain, allow for deterministic values to be supplied to the self-test engine. If the seeds represent encoded values that the ATPG needs to detect the fault, the self-test engine becomes a deterministic test machine.

Figure 4. Logic BIST structure showing reseeding to apply deterministic tests through the linear-feedback shift register (LFSR). (MISR: multiple-input signature register.)
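
The sketch below illustrates the reseeding idea with a toy 8-bit LFSR; the tap positions and care bits are made-up values, and the brute-force seed search stands in for the linear-equation solving that production tools actually perform.

```python
# Toy reseeding illustration: one 8-bit seed expands into a 40-bit scan load.
def lfsr_stream(seed, length, taps=(0, 4, 5, 6)):
    """Expand an 8-bit seed into `length` pseudorandom scan-fill bits (toy LFSR)."""
    state = [(seed >> i) & 1 for i in range(8)]
    out = []
    for _ in range(length):
        out.append(state[0])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = state[1:] + [fb]
    return out

# Hypothetical ATPG cube: only four of the 40 load positions are care bits.
care_bits = {3: 1, 11: 0, 17: 1, 29: 0}

# For illustration, search for a seed whose expansion matches the care bits;
# production tools instead solve the equivalent linear equations over GF(2).
seed = next((s for s in range(1, 256)
             if all(lfsr_stream(s, 40)[pos] == val for pos, val in care_bits.items())),
            None)
if seed is None:
    print("no matching 8-bit seed; a wider LFSR or different taps would be needed")
else:
    print(f"seed {seed} stores 8 bits on the tester in place of a 40-bit scan load")
```
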
Input scan compression

A large variety of solutions have been developed for the interfacing logic on the input side of Figure 3.[1] Touba discusses the details of the input decompressors as code-based, linear-decompressor-based, and broadcast-scan-based schemes.[1] We can categorize these solutions as either combinational or sequential.


Combinational solutions have been as simple as direct (but shared) connections of scan inputs to the internal scan chains,[5,6] and they have been as complex as decoding logic to unravel scan data to sequences of 1s and 0s. The more common solutions use XOR gates on the input,[7,8] or multiplexers to distribute values from the scan inputs to the receiving chains.[9]
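
To illustrate the broadcast-scan flavor of these combinational schemes, here is a small sketch with a hypothetical sharing map; it is not modeled on any specific commercial decompressor.

```python
# Broadcast-style combinational decompression: 2 scan-in pins drive 8 internal chains.
# The sharing map is a hypothetical example; real tools pick it to limit ATPG conflicts.
share_map = [0, 1, 0, 1, 0, 1, 0, 1]    # internal chain i is wired to pin share_map[i]

def decompress(pin_values):
    """One shift cycle: each internal chain receives the value of the pin it shares."""
    return [pin_values[pin] for pin in share_map]

print(decompress([1, 0]))   # [1, 0, 1, 0, 1, 0, 1, 0]
# Chains wired to the same pin must want compatible values in every shift cycle;
# that is the constraint the ATPG has to satisfy when it fills the test cube.
```
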
The sequential solutions branch off into two categories, of which some are efficient mutations of the logic BIST structure tailored for scan compression. With seeds streaming in at intervals, or on every shift, the stimulus requirements for fault detection and observation (masking) are encoded in the state of the LFSR to provide significant gains in test data volume and test application time.[10] The other branch of sequential solutions uses optional shift registers to temporarily store multiple values that get applied with various spreading logic.[8] Figure 5 shows examples of these techniques with EDA solutions.

Output scan compression

On the output side, XOR gates represent the most common interfacing logic.[7,8,10] With high observability provided by an XOR gate, other options have not been able to compete. Various flavors of compressors with XOR gates have developed around the basic concept of dealing with unknown values in the test response, also known as X-tolerant response compaction. X-compact represents XOR structures that guarantee error detection in the presence of a logic X.[11] Shift registers have been added to such XOR structures to provide temporal trade-offs in the observation of values such that fewer output pins can be used for compaction.[12]

Before the emergence of such X-tolerant solutions, traditional logic BIST solutions tried to completely eliminate X values from a design, thus practically achieving extremely high degrees of response compaction.[6] Unfortunately, for most designs today, it is impractical to eliminate all X sources, owing to timing constraints, area overhead, inefficiencies of the simulation engines (such as zero-delay simulation), and inaccuracies in modeling the behaviors of certain circuit blocks (for example, memory, custom-logic, and analog-circuit blocks). Hence, X-tolerant solutions are seeing wide adoption, as Figure 5 shows. In addition to detecting defective chips, such X-compactors are effective for diagnosis and yield-learning purposes as well.[13,14]
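
The following sketch shows why an unknown value matters to an XOR compressor and what the masking logic buys; the three-chain example and the mask encoding are illustrative assumptions, not the X-compact construction itself.

```python
# XOR space compaction of one shift cycle's responses from three scan chains.
# Values are 0, 1, or "X" (unknown).
def xor_compact(chain_bits, mask=(1, 1, 1)):
    """XOR together the chains whose mask bit is 1; any unmasked X makes the result X."""
    result = 0
    for bit, enable in zip(chain_bits, mask):
        if not enable:
            continue                 # masking logic gates this chain off
        if bit == "X":
            return "X"               # an unmasked X hides every other response bit
        result ^= bit
    return result

print(xor_compact([1, 0, 1]))                    # 0: all chains observed
print(xor_compact([1, "X", 1]))                  # X: the unknown blocks observation
print(xor_compact([1, "X", 1], mask=(1, 0, 1)))  # 0: masking restores observability
```
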
Scan compression automation

Along with scan compression research, the EDA industry has played a leading role in making scan compression the focus of IC test solutions. Mentor Graphics introduced the first commercially available product for scan compression, but virtually every EDA company has been quick to deliver similar solutions in a short timeframe.

Figure 5 represents conceptual pictures of scan compression technologies delivered by Mentor Graphics,[10] SynTest,[8] Cadence,[6] and Synopsys.[9,15] These technologies provide different trade-offs between the quality of results, as seen in data compression and test application time, with differences in the amount of area overhead spent to deliver the results. Although Figure 5 does not necessarily show the current state of the art for each tool, we expect the basic constructs to remain the same.

Figure 5. Four scan compression technologies, represented at a conceptual level, as adopted by the EDA companies Mentor Graphics (a), SynTest (b), Cadence (c), and Synopsys (d).

Because EDA tools need to guarantee fault coverage, the solutions we outlined here provide masking logic and also implement the traditional scan chains.

THE INNOVATIONS THAT LED to the current state of scan compression culminate from optimizations developed in their respective eras. The changing environment of IC manufacturing has made the impractical solutions of yesterday practical today. Scan compression, in its current state, represents a significant change to the way test was once done. This change appears to be permanent. The test industry is quickly gaining confidence in using this technology. Today, traditional scan chains are implemented along with scan compression to create a bypass configuration, which provides a safety net for test methodologies that use scan compression when the compression logic causes loss of fault coverage. The bypass configuration is also used for debugging situations and nonmainstream use of scan chains. The scan compression evolution will complete its current arc once it is implemented on designs without traditional scan.

Although scan compression has become part of the mainstream IC test industry, this technology has not had time to mature. In time, capabilities will be built on top of it that will improve the quality of the results (including high compression) and provide better analysis tools for architecting scan compression. As new research introduces improvements, IC test solutions will rely on scan compression to deliver the increased number of patterns created by small-delay testing. Because data volume gains can be achieved more easily than test application time gains, future research on this technology will likely focus on test application time reduction.

As we move forward, manufacturing yield appears to be the most-pressing industry problem. Experts anticipate that test, and hence scan compression, will need to undergo yet another revolution for breakthrough solutions in this area.

References

1. N.A. Touba, "Survey of Test Vector Compression Techniques," IEEE Design & Test, vol. 23, no. 4, July-Aug. 2006, pp. 294-303.
2. M.J. Geuzebroek et al., "Test Point Insertion for Compact Test Sets," Proc. Int'l Test Conf. (ITC 00), IEEE CS Press, 2000, pp. 292-301.
3. P.H. Bardell and W.H. McAnney, "Self-Testing of Multichip Logic Modules," Proc. Int'l Test Conf. (ITC 82), IEEE CS Press, 1982, pp. 200-204.
4. B. Koenemann, "LFSR-Coded Test Patterns for Scan Designs," Proc. European Test Conf., IEEE CS Press, 1991, pp. 581-590.
5. I. Hamzaoglu and J. Patel, "Reducing Test Application Time for Full Scan Embedded Cores," Proc. 29th Ann. Symp. Fault Tolerant Computing (FTCS 99), IEEE CS Press, 1999, pp. 260-267.
6. C. Barnhart et al., "OPMISR: The Foundation for Compressed ATPG Vectors," Proc. Int'l Test Conf. (ITC 01), IEEE CS Press, 2001, pp. 748-757.
7. S. Mitra and K.S. Kim, "XPAND: An Efficient Test Stimulus Compression Technique," IEEE Trans. Computers, vol. 55, no. 2, Feb. 2006, pp. 163-173.
8. L.-T. Wang et al., "VirtualScan: A New Compressed Scan Technology for Test Cost Reduction," Proc. Int'l Test Conf. (ITC 04), IEEE CS Press, 2004, pp. 916-925.
9. N. Sitchinava et al., "Changing the Scan Enable During Shift," Proc. VLSI Test Symp. (VTS 04), IEEE CS Press, 2004, pp. 73-78.
10. J. Rajski et al., "Embedded Deterministic Test," IEEE Trans. Computer-Aided Design of Integrated Circuits and Systems, vol. 23, no. 5, May 2004, pp. 776-792.
11. S. Mitra and K.S. Kim, "X-Compact: An Efficient Response Compaction Technique for Test Cost Reduction," Proc. Int'l Test Conf. (ITC 02), IEEE CS Press, 2002, pp. 311-320.
12. J. Rajski et al., "Convolutional Compaction of Test Responses," Proc. Int'l Test Conf. (ITC 03), IEEE CS Press, 2003, pp. 745-754.
13. Z. Stanojevic et al., "Enabling Yield Analysis with X-Compact," Proc. Int'l Test Conf. (ITC 05), IEEE CS Press, 2005, pp. 726-734.
14. A. Leininger et al., "Compression Mode Diagnosis Enables High Volume Monitoring Diagnosis Flow," Proc. Int'l Test Conf. (ITC 05), IEEE CS Press, 2005, pp. 156-165.
15. P. Wohl et al., "Minimizing the Impact of Scan Compression," Proc. VLSI Test Symp. (VTS 07), IEEE CS Press, 2007, pp. 67-74.

Rohit Kapur is a scientist working in the area of IC test at Synopsys. His research interests include IC test methods and their use in design flows. He has a BS in electronics engineering from Birla Institute of Technology, Mesra, India, and an MS and a PhD in computer engineering from the University of Texas at Austin. He is a fellow of the IEEE.

Subhasish Mitra is an assistant professor in the Departments of Electrical Engineering and Computer Science at Stanford University. His research interests include robust system design, VLSI design and test, and design for emerging nanotechnologies. He has a PhD in electrical engineering from Stanford University.

Thomas W. Williams is a fellow at Synopsys. His research interests include IC test methods and their use in design flows. He has a BS in electrical engineering from Clarkson University, an MA in pure mathematics from the State University of New York at Binghamton, and a PhD in electrical engineering from Colorado State University. He is a fellow of the IEEE.

Direct questions and comments about this article to Rohit Kapur, Synopsys, 700 East Middlefield Rd., Mountain View, CA 94043; rkapur@synopsys.com.

For further information about this or any other computing topic, please visit our Digital Library at http://www.computer.org/csdl.
