
ASSIGNMENT- 1

DFT Basics
Arpit Tiwari

PART-1

1. Why do we need DFT? What are the advantages of DFT?


A simple answer: DFT (Design for Testability) is a technique that makes a design
testable after production. It is the extra logic we add to the normal design,
during the design process, that enables post-production testing.
Post-production testing is necessary because the manufacturing process is not
100% error free. Defects in the silicon introduce errors into the physical
device, and a chip will of course not work to specification if errors are
introduced during production. The question is how to detect that. Running all
the functional tests on each of, say, a million manufactured physical devices
is very time consuming, so a method was needed that lets us conclude, without
running full exhaustive tests on the physical device, that the device has been
manufactured correctly. DFT is that method. It only detects whether a physical
device is faulty or not. After the post-production test, a device found faulty
is discarded rather than shipped to customers; a device found good is shipped.
Since a production fault is assumed to have no cure, the goal is detection
only, not even localization of the fault. That is the intended purpose of DFT.
For the end customer, the DFT logic present on the device is redundant logic.
Example
To further justify the need for DFT logic, consider a company that must
deliver 1 million chips to its customer. Without any DFT logic in the chip, if
testing a physical device takes, say, 10 seconds (a generous assumption; in
practice it can be much longer), it would take roughly three and a half months
just to test the devices before shipping. DFT is all about reducing those
three and a half months to perhaps three and a half days. In practice, many
testers are also employed to test the chips in parallel, which helps reduce
the test time further.
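The arithmetic behind this example can be checked with a quick sketch
(assuming 10 seconds per device and a single tester, as above):

```python
# Rough check of the 1-million-chip example: 10 s of test per device,
# one tester, no handling overhead.
chips = 1_000_000
seconds_per_chip = 10

total_seconds = chips * seconds_per_chip   # 10,000,000 s
days = total_seconds / 86_400              # seconds in a day
months = days / 30                         # rough 30-day months

print(f"{days:.0f} days, about {months:.1f} months")  # 116 days, about 3.9 months
```

This confirms that a single tester would indeed need months, which is why both
parallel testers and shorter, DFT-enabled structural tests matter.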
The benefits of testing are quality and economy. These two attributes are
not independent, and neither can be defined without the other. Quality means
satisfying the user's needs at a minimum cost. A good test process can weed
out all bad products before they reach the user. However, if too many bad
items are being produced, then the cost of those bad items will have to be
recovered from the price charged for the few good items that are produced. It
is impossible for an engineer to design a quality product without a
profound understanding of the physical principles underlying the processes of
manufacturing and test.

Advantages of DFT

1. Reduce test efforts


2. Reduce cost for test equipment
3. Shorten time-to-market
4. Increase product quality
5. Quality and economy are the two major benefits of testing.
6. Improve quality by detecting defects.
7. Make it easier to generate vectors
8. Reduce vector generation time

2. What is the difference between verification and test?

Ans. Verification is a functional check of the abstract model created in RTL.

Testing is an actual check of the silicon created from the abstract model.

Verification                                  | Testing
----------------------------------------------|----------------------------------------------
Verifies correctness of the design            | Tests correctness of the manufactured device
Done by simulation, formal methods, or        | Two-step process: test generation and
hardware emulation                            | test application
Performed only once                           | Applied to every manufactured device
Responsible for quality of design             | Responsible for quality of device
Functional vectors are more numerous          | Test vectors are fewer
Functional coverage is lower                  | Test coverage is higher

3. What is manufacturing test, and what are its goals?

Ans. Manufacturing Test: factory testing of all manufactured chips for

parametric faults and for random defects. The main goals of manufacturing test
are listed below:
1. Identify and separate defective chips from being used in actual
products.
2. Prevent catastrophic failures of critical electronic systems by providing
an early warning of failure.
3. Objectives of manufacturing test are to facilitate/simplify fail data
collection and diagnostics to an extent that can enable intelligent failure
analysis (FA) sample selection, to identify the source of failure.
4. Improve the cost, accuracy, speed, and throughput of testing,
diagnostics, and FA.
5. In addition to being useful for manufacturing "go/no go" testing, scan
chains can also be used to "debug" chip designs which have already been
mounted on a PCB

Q.4 What is a test plan? What are the requirements for a test plan?
Q.5 What are the different types of faults?

Fault: A fault is a logic-level abstraction of a physical defect.

It is used to describe the change in the logic function of a device caused by
the defect.
Fault abstractions reduce the number of conditions that must be
considered in deriving tests.

Types of Faults:

Stuck-at Faults:

When a node is permanently set to 0 or 1, independent of the logic value being
driven onto it, the fault is known as a stuck-at fault.

On every node there are two possible stuck-at faults:

1. Stuck-at-Zero (s-a-0): the node is permanently set to zero.

2. Stuck-at-One (s-a-1): the node is permanently set to one.

Stuck-at faults can be detected using the D-algorithm.
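The stuck-at idea can be illustrated with a small sketch: a brute-force
comparison of good and faulty responses for a single 2-input AND gate (this is
not the D-algorithm itself; the node names are made up for illustration).

```python
# Nodes: inputs a, b and output y -> 6 stuck-at faults (each node s-a-0, s-a-1).
def and_gate(a, b, fault=None):
    """Evaluate a 2-input AND gate, optionally forcing one node.
    fault is a (node, value) pair pinning that node to 0 or 1."""
    if fault == ("a", 0): a = 0
    if fault == ("a", 1): a = 1
    if fault == ("b", 0): b = 0
    if fault == ("b", 1): b = 1
    y = a & b
    if fault == ("y", 0): y = 0
    if fault == ("y", 1): y = 1
    return y

faults = [(node, val) for node in "aby" for val in (0, 1)]
vectors = [(0, 0), (0, 1), (1, 0), (1, 1)]

# A vector detects a fault when good and faulty outputs differ.
for f in faults:
    detecting = [v for v in vectors if and_gate(*v) != and_gate(*v, fault=f)]
    print(f, "detected by", detecting)
```

For example, a s-a-0 on input `a` is only detected by the vector (1, 1),
which is exactly the kind of reasoning the D-algorithm automates for large
netlists.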

Transition Faults:

The transition fault model is used to find delay faults: the extra time taken
by a gate or memory element to change its logic state.

For transition faults, the paths are selected by the tool, which detects the
fault sites.

On every node there are two possible transition faults:

1. Slow-to-rise: the node is slow in changing its logic state from 0 to 1.

2. Slow-to-fall: the node is slow in changing its logic state from 1 to 0.

Transition faults are detected using Launch-on-Capture (LOC) and
Launch-on-Shift (LOS) patterns.

Path Delay:

The path-delay fault model targets the accumulated delay along an entire path
through the logic, rather than the delay of a single gate or memory element.
In the path-delay fault model, the critical paths need to be selected
manually, and the tool detects the fault sites.

There are three main categories of defects, with three associated test types:
functional, IDDQ, and at-speed.

Functional Test
Functional test continues to be the most widely accepted test type. It
typically consists of user-generated test patterns, simulation patterns, and
ATPG patterns. Functional testing uses logic levels at the device input pins
to detect the most common manufacturing-process-caused problems: static
defects (for example open, short, stuck-on, and stuck-open conditions).
Functional testing applies a pattern of 1s and 0s to the input pins of a
circuit and then measures the logical results at the output pins, checking
each output pin for a "0" or "1" response.

IDDQ Test
IDDQ testing measures quiescent power-supply current rather than pin voltage,
detecting device failures not easily detected by functional testing, such as
CMOS transistor stuck-on faults or adjacent bridging faults. IDDQ test
equipment applies a set of patterns to the design, lets the current settle,
and then measures for excessive current. In other words, IDDQ testing measures
the current going through the circuit devices.

At-Speed Test
Timing failures can occur when a circuit operates correctly at a slow clock
rate and then fails when run at the normal system speed. Delay variations
exist in the chip due to statistical variations in the manufacturing process,
resulting in defects such as partially conducting transistors and resistive
bridges. At-speed testing checks the amount of time it takes for a device to
change logic states.
Q.6 What are the different types of DFT methods?
Electronic systems contain three types of components:
a) Digital logic
b) Memory blocks
c) Analog or mixed-signal circuits

There are specific DFT methods for each type of component. Built-in self-test
(BIST) is used for digital logic as well as for memory blocks. Special
techniques, known as boundary scan and the analog test bus, provide test
access to components embedded in a system.

Logic DFT takes one of two possible routes: ad-hoc and structured. Ad-hoc DFT
relies on "good" design practices learned from experience. Some of these are:
• Avoid asynchronous logic feedbacks. A feedback in the combinational
logic can give rise to oscillation for certain inputs. This makes the circuit
difficult to verify and impossible to generate tests for by automatic
programs. This is because test generation algorithms are only known for
acyclic combinational circuits.
• Make flip-flops initializable. This is easily done by supplying clear or
reset signals that are controllable from primary inputs.
• Avoid gates with a large number of fan-in signals. Large fan-in makes
the inputs of the gate difficult to observe and makes the gate output
difficult to control.
• Provide test control for difficult-to-control signals. Signals such as
those produced by long counters require many clock cycles to control and
hence increase the length of the test sequence. Long test sequences are
harder to generate.

There are difficulties with the use of ad-hoc DFT methods. First, circuits
are too large for manual inspection. Second, human testability experts are
often hard to find, while algorithmically generated testability measures are
approximate and do not always point to the source of the testability problem.

As the size and complexity of digital systems grew, an alternative form of
DFT, known as structured DFT, gained popularity.
In structured DFT, extra logic and signals are added to the circuit so as to
allow testing according to some predefined procedure.
Apart from the normal functional mode, such a design will have one or more
test modes. Commonly used structured methods are scan and built-in self-test
(BIST).

Q.7 What are the cons of DFT?

Ans:
Cons of DFT:
1) Extra test hardware cost
2) Extra tool and engineering cost
3) Area overhead
4) Test circuitry may disturb the functionality of the device
5) More care needed in static timing analysis
6) Increased design complexity
7) Large test data volume and long test time
8) Basically a slow-speed (DC) test
9) Speed/performance degradation
10) Excess power consumption (usually outside the circuit's specifications)
during the scan in/out operations and the capture of the test response in
the scan chain
PART-2
Q.1 Explain ASIC design flow?

Design Specification
↓
Behavioral Description
↓
RTL Description
↓
Functional Verification and Testing
↓
Logic Synthesis
↓
Gate-Level Netlist
↓
Logic Verification and Testing
↓
Floor Planning and Place & Route
↓
Physical Layout
↓
Layout Verification and Implementation
↓
FABRICATION
1. The ASIC design process begins with writing a functional description
containing detailed requirements for the chip. The design can start from a
functional description prepared by the customer; alternatively, the
functional description document can be created based on the customer's
requirements expressed in any form.
2. Based on these requirements, the team estimates the amount of resources
needed and produces a statement of work. After reaching an agreement, the
actual work starts.
3. The first step is similar to FPGA design. The following tasks are run in
parallel:
 Writing a synthesizable RTL (register-transfer level) description of the
device, in either Verilog or VHDL.
 Writing a behavioral model, which is used to verify that the design meets
its requirements.
 Writing a verification plan and a corresponding verification environment,
which describe and implement the method of proving the design's correctness.

4. The RTL description is verified against the behavioral model by a

dedicated validation and verification department. This approach reduces the
probability of a design error, since no RTL designer tests his own code.
5. Most modern ASIC designs are complex enough that it is impossible to tell
valid chips apart from faulty ones at the production stage without special
preparations during the design stages. These preparations are called DFT
(design for test). DFT techniques include:
 Scan path insertion – a methodology of linking all registers into one long
shift register (scan path). This can be used to check small parts of the
design instead of the whole design (the latter being almost always impossible).
 BIST (built-in self-test) – on-chip logic used to check RAMs. After being
triggered, it feeds specific test patterns to the RAM module, reads the
results back, and compares them.
 ATPG (automatic test pattern generation) – a method of creating test
vectors for scan paths automatically. Most modern EDA tool chains incorporate
such a feature.
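As a hedged illustration of the memory-BIST idea above, the following sketch
runs a simple march-style test (write 0s everywhere; ascending read-0/write-1;
descending read-1/write-0) against a toy RAM model with one injected
stuck-at-0 cell. All names here are illustrative; real MBIST engines implement
fuller march algorithms such as March C- in hardware.

```python
class FaultyRAM:
    """Toy RAM model; `stuck` is an optional (address, value) stuck-at cell."""
    def __init__(self, size, stuck=None):
        self.mem = [0] * size
        self.stuck = stuck
    def write(self, addr, val):
        self.mem[addr] = val
        if self.stuck and addr == self.stuck[0]:
            self.mem[addr] = self.stuck[1]   # defect overrides the write
    def read(self, addr):
        return self.mem[addr]

def march_test(ram, size):
    for a in range(size):             # M0: write 0 everywhere
        ram.write(a, 0)
    for a in range(size):             # M1: read 0, write 1 (ascending)
        if ram.read(a) != 0:
            return False
        ram.write(a, 1)
    for a in reversed(range(size)):   # M2: read 1, write 0 (descending)
        if ram.read(a) != 1:
            return False
        ram.write(a, 0)
    return True

print(march_test(FaultyRAM(8), 8))                # True  (good RAM passes)
print(march_test(FaultyRAM(8, stuck=(3, 0)), 8))  # False (s-a-0 cell caught)
```

The on-chip MBIST controller plays exactly the role of `march_test` here:
it generates the addresses and patterns itself and only reports pass/fail.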
6. The synthesizable and verified RTL undergoes logic synthesis. The
synthesizer reads the RTL input, user-specified constraints, and a cell
library from the foundry. The output of the synthesis process is a gate-level
netlist.
7. The netlist must undergo formal verification to prove that the RTL and the
netlist are equivalent.
8. Preliminary timing results after synthesis are analyzed; critical paths are
checked against the project performance requirements. If needed, the RTL
description, constraints or synthesis options are modified, and the synthesis is
repeated.
9. When timing constraints are finally met, the design proceeds to the layout,
which consists of floor planning, placement and routing. Some other important
tasks are performed at this step, including clock tree insertion.
10. Final (post-layout) timing results are again compared with the
performance requirements. If they are not met, the floor plan can be changed
or placement rerun with other parameters.
11. The last stage before tape-out includes the following checks:
 DRC (design rule check) verifies that the layout conforms to the
foundry-specific rules.
 LVS (layout versus schematic) is a formal equivalence check between the
post-synthesis netlist and the final layout.
12. Finally, the resulting layout in GDSII format is handed to the
semiconductor fabrication plant (foundry). This hand-off is called tape-out.

Q.2 Explain FPGA design flow?

FPGA Design Flow


Architecture design. This stage involves analysis of the project requirements,
problem decomposition and functional simulation (if applicable). The output of
this stage is a document which describes the future device architecture,
structural blocks, their functions and interfaces.

HDL design entry.

The device is described in a formal hardware description language (HDL). The
most common HDLs are VHDL and Verilog.

Test environment design.
This stage involves writing test environments and behavioral models (when
applicable). They are later used to ensure that the HDL description of the
device is correct.
Behavioral simulation.
This is an important stage that checks HDL correctness by comparing outputs of
the HDL model and the behavioral model (being put in the same conditions).
Synthesis.
This stage involves conversion of the HDL description to a so-called netlist,
which is basically a formally written digital circuit schematic. Synthesis is
performed by special software called a synthesizer. For HDL code that is
correctly written and simulated, synthesis should not be a problem. However,
synthesis can reveal problems and potential errors that cannot be found using
behavioral simulation, so an FPGA engineer should pay attention to warnings
produced by the synthesizer.

Implementation. A synthesizer-generated netlist is mapped onto a particular

device's internal structure. The main phase of the implementation stage is
place and route (layout), which allocates FPGA resources such as logic cells
and connection wires. The resulting configuration data are then written to a
special file by a program called a bitstream generator.
Timing analysis. During the timing analysis special software checks whether
the implemented design satisfies timing constraints (such as clock frequency)
specified by the user.
Q.3 What is difference between ASIC and FPGA?

ASIC: Permanent circuitry. Once the application-specific circuit is taped out
into silicon, it cannot be changed; the circuit will work the same for its
complete operating life.
FPGA: Reconfigurable circuit. FPGAs can be reconfigured with a different
design. They even have the capability to reconfigure a part of the chip while
the remaining areas of the chip are still working. This feature is widely
used in accelerated computing in data centres.

ASIC: Very high entry barrier in terms of cost, learning curve, liaising with
the semiconductor foundry, etc. Starting ASIC development from scratch can
cost well into millions of dollars.
FPGA: Easier entry barrier. One can get started with FPGA development at very
low cost.

ASIC: Suited for very high-volume mass production.
FPGA: Not suited for very high-volume mass production.

ASIC: Much more power efficient than FPGAs. The power consumption of ASICs
can be very finely controlled and optimized.
FPGA: Less energy efficient; requires more power for the same function, which
an ASIC can achieve at lower power.

ASIC: An ASIC fabricated in a given process node can run at much higher
frequency than an FPGA in the same node, since its circuit is optimized for
its specific function.
FPGA: Limited in operating frequency compared to an ASIC of a similar process
node. The routing and configurable logic eat up timing margin in an FPGA.

ASIC: ASICs can have complete analog circuitry, for example a Wi-Fi
transceiver, on the same die along with microprocessor cores. This is an
advantage that FPGAs lack.
FPGA: Analog designs are not possible with FPGAs. Although FPGAs may contain
specific analog hardware such as PLLs, ADCs, etc., they are not flexible
enough to create, for example, RF transceivers.

ASIC: ASICs are definitely not suited for application areas where the design
might need to be upgraded frequently or once in a while.
FPGA: FPGAs are highly suited for applications such as radars and cell-phone
base stations, where the current design might need to be upgraded to a better
algorithm or a better design. In these applications the high cost of FPGAs is
not the deciding factor; programmability is.

ASIC: It is not recommended to prototype a design using ASICs unless it has
been absolutely validated. Once the silicon has been taped out, almost
nothing can be done to fix a design bug (exceptions apply).
FPGA: Preferred for prototyping and validating a design or concept. Many
ASICs are prototyped using FPGAs; major processor manufacturers use FPGAs to
validate their systems-on-chip (SoCs). FPGA prototyping makes it easier to
ensure the design works correctly as intended.

ASIC: ASIC designers need to care about everything from RTL down to the reset
tree, clock tree, physical layout and routing, process node, manufacturing
constraints (DFM), testing constraints (DFT), etc. Generally, each of these
areas is handled by a different specialist.
FPGA: FPGA designers generally do not need to care about back-end design.
Everything is handled by the synthesis and routing tools, which make sure the
design works as described in the RTL code and meets timing, so designers can
focus on getting the RTL design done.

Q.4 In which scenarios is an ASIC preferred? In which scenarios is an FPGA
preferred?

An ASIC is preferred for:
 Mass production
 High-frequency operation
 Low-power operation
 Analog circuitry

An FPGA is preferred for:
 Small production volumes
 Applications such as radars and cell-phone base stations, where the
current design might need to be upgraded to a better algorithm or a better
design
 Designs where the team wants to focus on the front end, not the back end
 Applications where limited operating frequency is acceptable

Q.5 In which phases of the ASIC flow can DFT be done?

DFT can be inserted in all phases of the ASIC flow after RTL verification.

Q.6 What is the problem if we do not have DFT?

If we do not have DFT, we have no way to make the functional design testable:
we have no controllability or observability of internal nodes. Without DFT, a
high-quality design cannot be assured; a chip failure can occur at any time
and we do not know when.
Consider the example of a driverless automated vehicle. Without DFT, a chip
may work fine when first installed but later stop working, for example due to
a temperature rise; because the chip was never properly tested, the failure
could lead to an accident and loss of human life. Achieving a high-quality
chip without DFT is therefore impossible.
A short or open circuit caused by a defect on the die can damage the chip.
Some defects appear not as a stuck signal but as an unexpected increase or
decrease in propagation delay, or an unexpected shape in the arriving
waveform.

Q.7 ASIC consists of which all components? How all those can be tested
using DFT?
An ASIC consists of functional (digital) logic, memories, clock and reset
circuitry, and input and output signals.
DFT consists of two types of methods:
1) Ad-hoc
2) Structured

Ad-hoc:
This relies on the "good" design practices already listed under Q.6 of
Part 1: avoid asynchronous logic feedback, make flip-flops initializable
(with clear or reset signals controllable from primary inputs), avoid gates
with a large number of fan-in signals, and provide test control for
difficult-to-control signals such as long counters.

Structural approach:
The functional design can be tested using scan insertion, followed by ATPG
pattern generation and fault simulation.
Memories are tested with MBIST logic.
Clock and reset problems are taken care of by the test logic during scan
insertion.

Q.8 What is meaning of Fault and Defect?


Defect
A defect is the unintended difference between the implemented hardware and
its intended design. Defects occur either during manufacture or during the
use of devices.

Fault
A fault is a representation of a defect at the abstracted function level.
Q.9 What is functional code coverage versus Structural Test
coverage?
Functional code coverage
Functional code coverage provides a quantitative measurement of the testing
effort.
It can assist in directing the tester's future efforts.
It can demonstrate redundancy in test cases.
It can be used as entry or exit criteria between test phases.

Structural test coverage

Coverage is a measure of how much testing can be done given a set of test
cases.
Coverage is in most cases a term associated with white-box (structural)
testing.
Q.10 What is Yield? What is DPPM?
The fraction (or percentage) of good chips produced in a manufacturing
process is called the yield, denoted by the symbol Y.
DPPM stands for Defective Parts Per Million: the number of defective parts
among every million shipped.
The manufacturing yield depends on the technology used, the silicon area, and
the layout design. Early in a technology's development the yield is low (even
less than 10%), and it rises continuously (even above 95%) as the technology
matures.
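The two definitions can be made concrete with a small sketch. Note the
assumption made here: DPPM counts defective parts per million shipped units
(test escapes), which is not the same quantity as 1 - yield, since yield is
measured before test screening and DPPM after it.

```python
def yield_pct(good_die, total_die):
    # Yield: fraction of good chips out of all chips manufactured.
    return good_die / total_die * 100

def dppm(defective_shipped, total_shipped):
    # DPPM: defective parts per million *shipped* units (test escapes).
    return defective_shipped / total_shipped * 1_000_000

print(yield_pct(900_000, 1_000_000))  # 90.0  (% good die)
print(dppm(50, 1_000_000))            # 50.0  (escapes per million shipped)
```

A mature process with good tests might ship at tens of DPPM even when its
raw yield is well below 100%, because testing screens out most bad die.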

Q.11 DPPM should be lesser or more, which is better?

DPPM stands for Defective Parts Per Million. It should definitely be lower;
lower is better. PPM is typically used when the number of defective products
produced is small, so that a more accurate measure of the defect rate can be
obtained than with percent defective.

Q.12 How Test coverage plays role?

 Test coverage measures the amount of testing executed (100% coverage does
not mean 100% tested).
 It gives information about the covered and uncovered areas in testing.

Q.13 Why achieving good test coverage is important?

 To find the areas that are in the requirements document but not covered by
the test cases.
 It helps in making decisions about the quality of the product.
 It helps bridge the gap between requirements and test cases.

Coverage analysis is as easy to adopt and integrate as it is essential to


ensuring consistent, high-quality verification results. Code and finite-state-
machine (FSM) coverage analysis tools can be installed, integrated and
running in existing RTL flows in a matter of hours; new users can quickly
identify unverified areas of their design and focus new test development
there, speeding the verification process. And, as coverage analysis spreads
throughout an organization, additional benefits will emerge, among them
facilitating a uniformly qualified pool of in-house design intellectual
property and providing a common frame of reference for coverage closure.
Coverage % = (number of test areas covered / total number of test areas) × 100
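The formula can be applied directly; the test-area names below are made up
purely for illustration.

```python
# Toy coverage calculation: 4 of 5 test areas exercised by the test cases.
test_areas = {"alu", "decoder", "fifo", "uart", "dma"}
covered = {"alu", "decoder", "fifo", "uart"}

coverage = len(covered & test_areas) / len(test_areas) * 100
print(f"Coverage = {coverage:.0f}%")  # Coverage = 80%
```

The uncovered set (`test_areas - covered`) is exactly where new test
development should be focused, as the paragraph above describes.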

Q.14 Who are the major DFT tool vendors, and what are the names of their DFT
tools (vendor-wise)?

Tessent – Mentor Graphics

DFTAdvisor – Mentor Graphics

DC Compiler (Design Compiler, scan insertion) – Synopsys
TetraMAX – Synopsys

Modus – Cadence
Genus – Cadence
