

Proceedings of the International Conference on Computer, Communication, Control & Instrumentation – 3CI-2007 – November 21st to 23rd, 2007

The State of Verification Reuse in VLSI and Architecture for Reusable Test Bench

Usha S. Mehta
Assistant Professor (EC Dept.)
Institute of Technology,
Nirma University, Ahmedabad
ushasmehta2002@yahoo.co.in, usm_ec.it@nirmauni.ac.in

Abstract

It is well known that verification today constitutes about 70% to 80% of the total design effort, making it the most expensive component, in terms of cost and time, in the entire design flow of any ASIC. A number of academic and industrial research laboratories have been carrying out research on verification reuse and on static and dynamic methodologies for verification reuse of ASICs and SoCs based on different approaches. In this paper the main area covered is verification reuse, along with many of the issues related to the intrinsic limitations of the approaches taken. Today's market scenario has brought into focus the need to carry out design and verification concurrently. For the design and verification tasks to proceed concurrently, there is a need to understand the concepts of preparing for verification before the real Design Under Verification (DUV) is ready to be verified. As designs become more complex, verification will have to be carried out using a divide-and-conquer approach, and it may require a different verification approach for different blocks of the design. The aim of this paper is to provide information on the state of the art in the area of reusable verification components, i.e. the Reusable Test Bench (RTB), and on the architecture of a Reusable Test Bench. The author concludes the review by presenting issues which form the current focus for research. Most of the observations are the author's own, made during her M. Tech. dissertation on "Verification of Ethernet e Verification Components" at eInfochips, Ahmedabad, so the architecture is sometimes explained with the example of an Ethernet DUV.

Index Terms—Verification Reuse, Reusable Test Bench (RTB), System on Chip (SoC), Design Under Verification (DUV)

I. Introduction

While verification is a tough job for an ASIC, it is expected to get even worse for SoC designs, because System on Chip (SoC) designs inherit all the well-known verification and validation difficulties associated with complex ASIC designs, with more added by re-using Intellectual Property (IP) cores. In any ASIC or complex SoC design flow, verification is very important; any behavioral or functional bug escaping this phase will not be detected in the subsequent implementation phases and will surface only after the first silicon is integrated into the target system, resulting in costly design and silicon iterations.

Verification of a design itself suggests that the design should be well checked for all possible valid inputs, for its behavior under erroneous input or environmental conditions, for all possible implementations (device dependency) and for all possible future uses of the design.

For this purpose, the most natural method is to generate all possible inputs, apply them to the design under verification and check the response of the design to these inputs. Because input stimuli are required, this method is called the dynamic verification method; functional verification is a dynamic verification method. Dynamic verification deals with generating input stimuli, driving the stimuli to the design under verification, collecting the output from the design under verification and validating the output against the applied input.

Since any design is required to be verified before further processing, first stand-alone and later as a part of a bigger SoC, it is advisable to make the test bench reusable at various stages so that it can be used at various levels of the process flow. Reusability means that the same input stimuli, i.e. test cases, can be used for stand-alone verification as well as for SoC-level verification. The main requirement of such a Reusable Test Bench (RTB) is trust among its users. To create trust in a Reusable Test Bench and confidence in its results, it is a must that this reusable test bench contain all possible scenarios and corner cases, both for stand-alone verification and for future post-integration verification. To assure this, the verification component must itself be rigorously verified against the available standard or specification of the design, third-party models or a golden reference of the design for which it is built, and it must be checked that it has provision to detect all possible errors that can occur in the design.

II. Reusable Test Bench

Improving verification productivity is an economic necessity. When a design company tries to verify each of the design modules of its big SoC design itself, it has to create a test bench for each particular module, which is itself a very lengthy process. So, for standard modules such as processors, protocols or architectures that follow international standards such as those of the IEEE, if a ready-made test bench is available in the market, such design companies are greatly relieved. Such ready-made test benches, which can also be used at various steps of the verification process, are called Reusable Test Benches.

Verification reuse directly addresses higher productivity, increased chip quality and the overall verification investment. A reuse methodology is the breakthrough technology required to create reusable verification environments and to ensure that all verification components interoperate effectively. Today's complex chips commonly incorporate many different protocols, interfaces and processors. Assembling appropriate verification environments requires efficient integration of reusable, plug-and-play verification components. Achieving reusability requires that all components be built and packaged uniformly. Reusability becomes even more challenging when design teams all over the world create verification components that need to fit together seamlessly. Every aspect of a component, including basic naming conventions and coding styles, debug message conventions, user interfaces, and interactions between components, must be standardized in order to assure interoperability. World-leading companies in the development phase face many verification challenges, and verification reuse is essential for their productivity and for reducing their time-to-market. Here, verification reuse suggests a ready-made test bench for a particular application such as a protocol or a processor. Any such ready-made, plug-and-play test bench should have the generic features listed below:

• A reusable test bench should be a ready-to-use, out-of-the-box verification environment, typically focusing on a specific processor, protocol or architecture, such as Ethernet, PCI Express, AHB, PCI, or USB.

• It must consist of a complete set of elements for stimulating, checking, and collecting coverage information on the device under test (DUT).

• It should expedite the creation of a more efficient test bench for the DUT.

• It must be able to work with both Verilog and VHDL designs and, if possible, with other languages also.

• It must be able to work with all HDL simulators.

• The user should be able to use the test bench as a full verification environment or add it to an existing environment.

• The reusable test bench should be optimized for use in a coverage-driven verification process by providing:
  (1) Generation that randomizes over all possible behaviors of the environment.
  (2) Generation that can be constrained to a narrower set of behaviors or all the way to a directed test.
  (3) Self-checking.
  (4) Coverage measurement for stimulus conditions, checking triggers and device response.

• The behavior of the reusable test bench should comply with the IEEE standards or other specific standards that industries generally follow for that particular application. For example, a reusable test bench for the Ethernet protocol should comply with IEEE Std 802.3-2000, IEEE Draft P802.3ae/D4.0, the RMII Consortium specification, the SGMII and SMII specifications by Cisco Systems (1998), and so on.

• Normally an RTB contains agents for the different possible configurations or extensions. Each agent contains a signal map block, containing the RTB signals mapped to the DUT; a configuration block, which has the signals for configuring the agent; a Bus Function Model (BFM), to generate and drive the stimuli; a monitor, to check the DUT behavior, collect coverage information, collect the output data and emit an event for each output data item received; a collector, to check for protocol violations; a checker, to automate the output response checking process; and a scoreboard, to check data-item integrity.
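To make the agent structure concrete, the following sketch shows how a subset of these blocks might be skeletonized in the e language, in which the RTB discussed later in this paper was implemented. It is only an illustrative outline; the type, unit and field names (eth_agent_u, eth_bfm_u and so on) are assumed for the example and are not taken from any particular RTB.

<'
// Illustrative sketch only: a possible e-language skeleton of one RTB agent,
// following the block structure described above. All names are hypothetical.

type eth_agent_kind_t : [ACTIVE, PASSIVE];

unit eth_signal_map_u {
    tx_sig : string;            -- HDL path of the DUT TX data signal
    rx_sig : string;            -- HDL path of the DUT RX data signal
};

unit eth_bfm_u {
    -- drives generated stimuli onto the DUV input port (body omitted)
};

unit eth_monitor_u {
    event pkt_received;         -- emitted for each output data item collected
    -- predefined checks and coverage definitions would live here
};

unit eth_agent_u {
    kind    : eth_agent_kind_t; -- active agents drive, passive agents only observe
    smap    : eth_signal_map_u is instance;
    monitor : eth_monitor_u    is instance;

    when ACTIVE'kind eth_agent_u {
        bfm : eth_bfm_u is instance;   -- only active agents contain a BFM
    };
};
'>

Keeping the BFM inside an ACTIVE subtype is one way to let the same agent be instantiated as a purely passive observer when the corresponding interface is driven by the DUT itself.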
III. Architecture of Reusable Test Bench

The architecture of the Reusable Test Bench is shown in Fig. 1. It contains various components such as the active agent, the passive agent, the monitor, the checker, the scoreboard, the bus function model and so on. The function of each block is explained in detail below.


Fig. 1 Architecture of Reusable Test Bench (block diagram: the Ethernet RTB and the VE management layer, with sequence and management-sequence drivers, BFM and management BFM, active and passive agents, monitors, item and BFM checkers, error loggers, configuration and coverage blocks, surrounding the DUT)

Role of Active Agents

The active agents drive traffic to the DUT with the help of the sequence driver. The sequence driver generates various sequences, and these sequences produce the input stimuli for the DUV. The stimuli (such as Ethernet packets for an Ethernet DUV, or random lists of data) are injected into the DUT by the Bus Function Model (BFM). The BFM injects them on the input port of the DUV (for example, the TX lines for the MAC agent and the Rx lines for the PHY agent).

• The active agents generate the input stimulus packets depending on the constraints provided by the user on the various item fields. Active agents also contain a monitor to do the checking and to collect coverage.
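As an illustration of such a constrained stimulus item, the sketch below shows what an Ethernet-style packet item could look like in e; the field names, frame kinds and length ranges are assumptions made for this example, not the definition used by any specific RTB.

<'
// Illustrative stimulus item: the user constrains "item fields" like these,
// the sequence driver generates items, and the BFM drives them to the DUV.
// All names and ranges are hypothetical.

type eth_frame_kind_t : [NORMAL, RUNT, JUMBO];

struct eth_packet {
    kind      : eth_frame_kind_t;
    dest_addr : uint (bits: 48);
    src_addr  : uint (bits: 48);
    payload   : list of byte;

    keep soft kind == NORMAL;                            -- legal frames by default
    keep kind == NORMAL => payload.size() in [46..1500];
    keep kind == RUNT   => payload.size() in [1..45];
    keep kind == JUMBO  => payload.size() in [1501..2000];
};

// A user test can narrow the generation, all the way to a directed case:
extend eth_packet {
    keep soft dest_addr == 0xffffffffffff;               -- e.g. bias toward broadcast
};
'>

Adding further keep constraints in an extension is how the generation can be narrowed from fully random behavior all the way down to a directed test case.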
The management sequence driver generates various management sequences that are injected into the DUT by the management BFM. The management BFM injects the sequences on the MDIO (Management Data Input/Output) line. The management sequences relate to the loop-back, power-down, isolation, control and status operations.

Proceeding with the Ethernet example, the sequences from the sequence driver and the management sequence driver verify the DUT as follows. The sequences from the sequence driver verify the DUT for the following functions:

• MII: collision, false carrier indication, TX and Rx errors.
• GMII: carrier extension, bursting, collision, false carrier indication, TX and Rx errors.
• XGMII: starting error, data error, remote fault and local fault.
• RMII: Rx error, collision, and so on.
• SGMII, TBI: invalid code in the data, special and undefined categories.
• XAUI: deskew error.
• XSBI: block with invalid sync header and type field.

The sequences from the management sequence driver verify the DUT for the functions of the management interface, such as loop back, power down, isolation, speed selection, and so on.

Role of Passive Agents

The passive agents consist of:
* A monitor
* A scoreboard

The passive agent has both the TX and Rx collectors in its monitor by default. The TX collector senses the signals on the TX path and the Rx collector senses the signals on the Rx path. The monitor collects the packets and emits events on the status of the traffic to and from the DUT. The monitor contains predefined coverage definitions, and the user can create additional coverage definitions and protocol checks to meet the test bench requirements. The monitor also contains predefined checks that verify the DUV's adherence to the specifications: the monitor in the passive agent checks for protocol violations on both the TX line and the Rx line. The scoreboard unit verifies the data integrity by comparing the input and output data.

Flow of Data within the Agents

The BFM initiates new stimuli for transmission by calling the sequence driver when the transmission of the previous stimulus is over and the other required conditions match. If a stimulus needs to be transmitted, the sequence driver generates the required stimulus and passes it to the BFM. If a management packet needs to be transmitted, the management sequence driver generates the required management stimulus and passes it to the management BFM. During a reception cycle, both the active and the passive agents collect a list of bits, di-bits, nibbles or bytes, depending on the type of interface. The agents then re-group the data in the collector and check for the related errors.
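A minimal sketch of this pull-mode handshake in e is shown below. It assumes the hypothetical eth_packet item and eth_bfm_u unit from the earlier sketches and a sequence driver type named eth_driver (as would be produced by a sequence statement); get_next_item() and item_done are the standard e sequence-driver interface.

<'
// Illustrative sketch of the flow described above: the BFM asks the sequence
// driver for the next stimulus only after the previous one has been driven.
// Assumes: sequence eth_sequence using item=eth_packet, created_driver=eth_driver;

extend eth_bfm_u {
    !driver : eth_driver;       -- reference, set up by the enclosing agent/env
    event tx_clk;               -- sampling event for driving (source not shown)

    drive_packets() @tx_clk is {
        while TRUE {
            -- blocks until some sequence provides the next item
            var pkt : eth_packet = driver.get_next_item();
            drive_one_packet(pkt);
            emit driver.item_done;          -- previous transmission is over
        };
    };

    drive_one_packet(pkt : eth_packet) @tx_clk is {
        for each (b) in pkt.payload {
            -- actual driving of byte 'b' onto the TX lines omitted
            wait cycle;
        };
    };
};
'>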
Monitor and BFM Architecture

The monitors of the RTB are completely passive; the BFM drives and generates the data. The BFM can make use of the monitor or duplicate some of the monitor's logic. Most passive activity is done by the monitor, while all active interactions with the DUT are done by the BFM. For example, the monitor collects the packets and then emits an event for each data item received.

Monitor Architecture

1. TX Collector: the TX collector collects packets from the TX data path. It is disabled for ACTIVE TX agents and enabled for PASSIVE agents by default.
2. RX Collector: the RX collector collects packets from the RX data path. It is disabled for ACTIVE RX agents and enabled for PASSIVE agents by default.

The monitor has predefined checks to verify the protocol adherence of the DUT, and it has predefined coverage definitions too.
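The default enabling rule above could be captured with configuration fields and soft constraints, roughly as sketched below; the direction field and all names are assumptions made for this example.

<'
// Illustrative sketch: default enabling of the monitor's collectors, following
// the rules above (a collector is off on the direction an ACTIVE agent drives,
// and both are on for PASSIVE agents). Names are hypothetical.

type eth_direction_t : [TX, RX];

extend eth_monitor_u {
    tx_collector_enabled : bool;
    rx_collector_enabled : bool;
};

extend eth_agent_u {
    direction : eth_direction_t;     -- which side this agent drives when ACTIVE

    -- soft defaults, so a test can still override them explicitly
    keep soft monitor.tx_collector_enabled ==
        not (kind == ACTIVE and direction == TX);
    keep soft monitor.rx_collector_enabled ==
        not (kind == ACTIVE and direction == RX);
};
'>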
IV. Architecture for the Verification Environment of the Reusable Test Bench

For the verification of a protocol-based DUV, the given RTB architecture can be used. To obtain a qualitative measure of how well this RTB works, it is required to verify the RTB itself before use. It should be carefully checked whether it fulfills the requirements of DUV verification.


It should also be checked whether any bugs exist in it, whether its functionality can be extended further, and whether it fully checks all the concepts of the specification or standard for the given DUV. It is therefore required to build a mock verification environment (VE) and to configure the RTB for the different kinds of DUVs possible in the future. The goal here is to make the RTB fault (bug) free and to achieve 100% coverage with reference to the standards or specifications for the given design. The architecture of this VE is given in Fig. 2.

The first step in defining the verification environment is to decide its features. The features required for a good verification environment are:

• Item generation coverage and checking.
• Coverage of all kinds of sequences.
• Data integrity checks for the BFM.
• Sequences to generate various complex scenarios.

VE Architecture

Based on the above-mentioned features, we have prepared a VE architecture which follows the Verisity and Specman Elite guidelines. This section discusses the various modules of the VE.

Item Checker

This module checks whether the item generated by a sequence is free of contradictions. It also checks that the sequence driver generates all kinds of sequence items.

Item Coverage

This module covers the items generated by the sequences. Here the VE coverage definition is different from the RTB coverage: the RTB coverage defines all the rules and items the design protocol supports, whereas the VE coverage item list should cover all the checkers introduced in the RTB, the scoreboard items and the different possible RTB configurations, in addition to the items listed in the RTB coverage list.
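As a sketch of what such an item-coverage definition might look like in e (the same style serves for the RTB coverage definitions and for the additional VE item coverage), the block below covers the kind and length of collected packets; the event, ranges and names are assumptions made for this example.

<'
// Illustrative item coverage: sample the kind and payload length of each
// collected packet. Event, field and bucket names are hypothetical.

extend eth_monitor_u {
    !cur_pkt : eth_packet;      -- last packet observed by the monitor
    event pkt_done;             -- emitted once per collected packet

    cover pkt_done is {
        item kind : eth_frame_kind_t = cur_pkt.kind;
        item len  : uint = cur_pkt.payload.size() using
            ranges = {
                range([1..45],      "runt");
                range([46..1500],   "legal");
                range([1501..2000], "jumbo");
            };
        cross kind, len;
    };
};
'>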
While defining the coverage items, some questions related to the specifications must be answered. Below are some sample questions:

• Have all packet/transaction types been tried?
• Have all CPU opcodes and operand combinations been tested?
• Have all legal state transitions occurred?
• Have all instruction types been interrupted?
• Have all cases of resource contention been tested?
• Have all queue limits been stressed?

BFM Checker

This unit checks whether the item fields (virtual fields) constrained by the sequence driver are driven through the BFM as per the constraints. Verification target: verify that the virtual fields of the item are driven correctly from the BFM to the monitor.

VE Sequences

The VE sequences are built on the basic sequences provided by the RTB to generate complex scenarios. The VE should have its own sequence library to simulate various complex scenarios so that the checking of the checks can be done through them.

Agent Configuration Coverage

This module covers the various fields of the agent configuration struct.

Error Logger

This module is used to predict the expected errors based on the packet injected by the Ethernet BFM of an agent. When the DUT completes its packet collection, the errors that occurred are copied to this module. It compares the expected and occurred errors and reports an error on a mismatch.
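A minimal sketch of this expected-versus-occurred bookkeeping in e could look as follows; the unit, method and tag names are assumptions made for this example.

<'
// Illustrative error-logger bookkeeping: expected error tags are recorded when
// an erroneous packet is injected, occurred tags when checks actually fire,
// and the two lists are compared at the end of the sequence.

unit eth_error_logger_u {
    !expected_errors : list of string;
    !occurred_errors : list of string;

    expect_error(tag : string) is { expected_errors.add(tag); };
    log_error(tag : string)    is { occurred_errors.add(tag); };

    compare_errors() is {
        var missing : list of string = {};
        var extra   : list of string = {};
        for each (tag) in expected_errors {
            if not occurred_errors.has(it == tag) { missing.add(tag); };
        };
        for each (tag) in occurred_errors {
            if not expected_errors.has(it == tag) { extra.add(tag); };
        };
        if missing.is_empty() and extra.is_empty() {
            out("Expected and occurred errors are matched");
        } else {
            for each (tag) in missing { out("Errors expected but not occurred: ", tag); };
            for each (tag) in extra   { out("Errors occurred but not expected: ", tag); };
        };
    };
};
'>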
Management Error Logger

This module is used to predict the expected errors based on the management packet injected by the MGMT BFM of an agent. The rest of its functionality is the same as that of the Ethernet Error Logger module.

VE Monitor

This unit checks the validity of the checkers by comparing the expected errors with the errors that actually occurred in the error logger unit. Whether a check fires only when there is an error condition violating the protocol is verified by the VE monitor.

Data Collected Correctly

Verification target: verify that the monitor collects items correctly. It uses the scoreboard to verify data integrity.

Checker Is Correct

Verification target: verify that the checks fired are correct. Generate sequences that produce both erroneous and non-erroneous behavior and check the result against the expected behavior with a separate checker. Whenever a sequence is intended to fire checks from the RTB monitor, the expected errors list is updated with the corresponding expected error tag names.


The errors that occur during the execution of the sequence are logged into the occurred errors list. At the end of the sequence the two lists are compared and the result is indicated. If both lists match, a message saying "Expected and occurred errors are matched" is displayed; otherwise a message saying "Errors expected but not occurred" or "Errors occurred but not expected" is displayed. Please refer to figures 3 and 4.

For the checkers, coverage buckets are defined for each of them with their corresponding tag names. Whenever a dut_error occurs, the bucket for that particular check is filled.
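One way to obtain such per-check buckets in e is to record the tag of each reported check and cover it, roughly as sketched below; the tag type and all names are assumptions made for this example.

<'
// Illustrative per-check coverage buckets: every reported check records its
// tag and emits an event, and the tag is covered, so a filled bucket means
// that check fired at least once. Tag values and names are hypothetical.

type eth_check_tag_t : [ETH_CRC_ERR, ETH_LEN_ERR, ETH_FALSE_CARRIER];

extend eth_error_logger_u {
    !last_check : eth_check_tag_t;
    event check_fired;

    report_check(tag : eth_check_tag_t, msg : string) is {
        last_check = tag;
        emit check_fired;           -- fills the bucket for this check
        dut_error(msg);
        log_error(tag.as_a(string));
    };

    cover check_fired is {
        item last_check;            -- one bucket per check tag
    };
};
'>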
Coverage Is Correct

Verify that the coverage definitions are exercised at least once and that coverage is being collected correctly.

Configuration Check / Coverage

This module checks and covers the configuration of the RTB. It also checks that the RTB is configured as per the user configuration. It checks for the following:

• Topologies
• User configuration

Scoreboard

The scoreboard is used to check the data integrity between the BFM of one agent on one end and the monitor of the other agent on the other end. Following is the list of scoreboard instances and the corresponding input items which are to be compared.
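A sketch of such a scoreboard in e is given below; the in-order matching policy and all names are assumptions made for this example.

<'
// Illustrative scoreboard: packets sent by one agent's BFM are registered as
// expected items; packets collected by the other agent's monitor are matched
// against them in order. Names and matching policy are hypothetical.

unit eth_scoreboard_u {
    !expected : list of eth_packet;

    add_expected(pkt : eth_packet) is {
        expected.add(pkt);
    };

    match_actual(pkt : eth_packet) is {
        if expected.is_empty() {
            dut_error("Scoreboard: unexpected packet received");
            return;
        };
        var exp : eth_packet = expected.pop0();      -- oldest outstanding packet
        if deep_compare(exp, pkt, 1).size() != 0 {
            dut_error("Scoreboard: data integrity mismatch between input and output");
        };
    };
};
'>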
V. Implementation

The above architecture was successfully implemented for Ethernet verification using the e language and Specman Elite during the author's M. Tech. dissertation, "Verification of SoC using eVC".

VI. Automation

For all the above verification tasks, such as input generation, simulation, checking, coverage analysis and so on, automation plays a great role in saving time, effort and money. The design must be checked for the valid input combinations, including the specific critical inputs where errors are most likely. To check the behavior of the design under invalid input combinations, the design should also be checked with erroneous input stimuli. So automation techniques are employed to generate both types of stimuli. When there are hundreds of test cases to apply to the design in the proper sequence in the engineer's absence, reports to collect from the simulator in a prescribed format, reports to analyze for any error, failing test cases to recreate, coverage reports to generate and so on, there are a number of tasks which require automation. Figures 3 and 4 give an example of how this automation can be applied.

When normal input is applied to the design, the output should be error free. So an automated matching mechanism can be scripted which checks all the simulator reports for a string such as "ERROR". The script also stores the reports containing "ERROR" in a separate directory, with details such as the test case number, the date of execution, the number of errors in the file, the line numbers of "ERROR" and so on.

When erroneous input is applied to the design, a list of expected output errors must be created for each predefined input error. When the output is generated, a script can list all the errors in the output. This list must then be compared with the expected errors list, and the errors common to both can be neglected in the analysis. There will then be two types of errors, "expected but not occurred" and "occurred but not expected", and bug analysis is required for these two error types. Scripts can help in all of the above tasks, so nowadays, for a verification engineer, understanding the operating system well and automating the process efficiently is one key to success.

VII. Conclusion

This paper focused on why a reusable test bench is required and also on why it must be verified before putting it into the user's hands. The architectures of the RTB and the VE are explained for protocol checking by use of checkers, as well as for data integrity by use of the scoreboard. The coverage results can be used as a measure of the quality of the RTB. As, day by day, the world moves toward more and more functions on a smaller chip, the demand for RTBs will definitely be higher, and to generate revenue and trust from them, this kind of comprehensive verification of the RTB will be a must in the coming days.

VIII. Acknowledgements

The author is thankful to the CEO, eInfochips, Ahmedabad, for providing the facilities to work on such an emerging topic.

IX. References

[1] Janick Bergeron, Writing Testbenches: Functional Verification of HDL Models, ISBN 1-4020-7401-8.
[2] Guidelines for eVC and eRM from Verisity.
[3] Usha S. Mehta, "Verification of SoC using Ethernet eVC", M. Tech. Project, 2004.
[4] http://www.einfochips.com

Figure 3 Comparison of real output with expected output in case of normal input (the input is applied to the Design Under Verification; the matching mechanism compares the real output, with or without errors, against the expected error-free output; a match indicates normal behavior, a mismatch indicates a real bug)


Figure 4 Comparison of real output with expected output in case of input containing directed errors (the input with directed errors carries a list of expected errors; the output of the Design Under Verification yields a list of occurred errors; the matching mechanism compares the two lists, where a match indicates normal behavior and "expected but not occurred" or "occurred but not expected" errors indicate a bug)
