for multiple test-case scenarios using System Verilog. We implement functional coverage, scoreboarding and assertions to verify the Design Under Test (DUT) and make a decision as to when the device is fully verified. These tasks are typical requirements placed upon verification engineers today. The EDA developers have identified these tasks and empowered their tools to help the engineer develop solutions to their problems. Yet, different strategies are still required depending on the type of chip being developed, e.g. a network chip requiring a lot of block-intensive traffic to pass through it, or a SoC-level chip that requires a degree of connectivity testing as part of its verification process.

It is possible to build these differing types of verification solutions using System Verilog and VMM. In our case study, we use a block-intensive approach that can be extended to support the development of directed test case mechanisms, often used in providing a solution to a connectivity-focused test bench. We leverage our test bench solution with the Synopsys VCS simulation tool to provide a platform that an engineer can use to generate the verification result speedily.

The structure of this paper is as follows: Section II describes the rationale for developing a standardised test bench structure. Section III outlines the essential components of an advanced verification environment. Section IV describes the directed test case method and its implementation; the directed test case scenario is extrapolated using a combined data member class, and its use in maximising the efficiency of the test bench is outlined. Section V outlines the floating-point test bench case study, in which random and directed tests, code coverage, assertions and a scoreboard are integrated in one standard verification environment. Section VI concludes the paper. Results are highlighted and indicate the point in the verification effort where maximum efficiency is achieved, together with a productivity improvement over previous efforts to verify the same device.
II THE NEED FOR A STANDARDISED TEST BENCH

At present there is no standardised test bench architecture described within System Verilog or within the verification industry. The different verification methodologies developed by the major EDA companies provide very powerful verification techniques, but they do not describe how a targeted verification environment should be built.

In developing a chip, it is plausible that engineers within a single organisation, working in different teams, will develop varying strategies for verifying their IP blocks. This scenario leads to an organisation having to support and integrate multiple test benches within a single chip development. Ultimately, this will lead to problems at the system level, at the point where much energy is spent in getting the device through tape-out. There is a clear need for organisations (regardless of size) to develop and maintain a strategy that supports block-level, sub-system-level and system-level scenarios.

It is the responsibility of the verification manager and verification engineers to develop solutions that will support the development of a chip from start to finish. To achieve this, they need to:

1. Produce a test bench architecture that will support the different functional verification levels right through to gate-level simulation
2. Develop a structure that can be easily built upon, supported, documented and extended to yield a library of test cases
3. Provide their managers with detailed reporting mechanisms that can be extrapolated from the environment speedily

III THE ESSENTIAL TEST BENCH COMPONENTS

The key elements within a System Verilog and VMM style verification environment are:

• Classes
• Assertions
• Functional Coverage
• Scoreboards
• Random Pattern Generators
• Transaction Level Model
• Regression capability

The key features of these elements are briefly described here:

Classes are used to develop transactor components, scoreboards, cover points and data structures that manipulate the stimulus provided to them by pattern generators. Classes are written in System Verilog code and utilise the same software techniques as would be present in a C++ style environment.

Assertions are used to verify properties of a design that manifest themselves over time. An example of an assertion is shown in Figure 1.

//look for invalid operation
property p2;
  @(posedge clk) invalid_op_flag != 1;
endproperty

assert property (p2);

Figure 1: Example Assertion Code

An assertion is a property that should hold true at all times. Assertions are used to specify assumptions and help identify design faults on the
DUT entity [2]. Within the verification environment, all assertions are declared in the interface file description, the point where the pins on the DUT are specified. Assertions provide a powerful pinpointing mechanism for highlighting unexpected behaviour on an interface, and can be compiled into a detailed reporting structure in the Synopsys Unified Report Generation (URG) tool.

Functional Coverage is a means of measuring what has been verified in a design in order to check off items in the verification plan [2]. The implementation of functional coverage consists of a number of steps. Firstly, code is added in the form of cover groups in order to monitor the stimuli being put on the DUT. The reactions and responses to the stimuli are also monitored to determine what functionality has been exercised. Cover groups should be specified in the verification plan. Within a test case scenario, their usefulness is ascertained by analysing the RTL code and understanding the data they have recorded during the simulation. Cover points become more powerful within a simulation when they are crossed together to identify greater levels of abstraction within a design. Cover points provide a powerful mechanism in identifying [...]

[...] quickly and efficiently capture the expected function of the device at an approved degree of abstraction.

Regression capability is a means by which the verification environment can support and highlight certain information that can be extrapolated at a later point in time to determine different test case results. Regression runs are merged together to collate different code coverage results, indicating the overall degree of verification applied to the DUT.

IV DIRECTED TEST CASE MECHANISM

The verification engineer must design the architecture of the verification environment at the outset to achieve the ability to support both directed and constrained random test cases in an efficient manner [3].

In designing the floating-point test bench solution, our goal is to develop a robust and easy-to-use mechanism that facilitates the development of test cases with minimal impact on the test bench code. Figure 2 illustrates the key elements on which the architecture is based:
Figure 2: Key elements of the test bench architecture (TB_TEST_NO and TB_MODE inputs, directed test cases, functional coverage, minimal code modifications)
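As a rough sketch of how the TB_TEST_NO and TB_MODE controls could be consumed by such a test bench (the plusarg convention and all identifiers here are illustrative assumptions, not the paper's actual code):

```systemverilog
// Hypothetical sketch: select between constrained random and directed
// test cases at run time without modifying the test bench structure.
program automatic fp_test_select;
  int unsigned tb_test_no;   // which directed test case to run
  string       tb_mode;      // "random" or "directed"

  initial begin
    // Default to fully random stimulus if no mode is supplied.
    if (!$value$plusargs("TB_MODE=%s", tb_mode))
      tb_mode = "random";
    if (!$value$plusargs("TB_TEST_NO=%d", tb_test_no))
      tb_test_no = 0;

    case (tb_mode)
      "directed": $display("Running directed test case %0d", tb_test_no);
      default:    $display("Running constrained random test");
    endcase
  end
endprogram
```

A run would then look like `simv +TB_MODE=directed +TB_TEST_NO=7`; the point of such a scheme is that adding a new directed test touches only the test selection, not the environment code.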
FP_rand_1.sv
FP_crand_1.sv
tb_top.sv
FP_tb.sv
cl_fp_env.sv
cl_fp_scb.sv
cl_fp_if.sv
cl_fp_data.sv
cl_fp_bfm.sv
cl_fp_cdt.sv
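Following the cl_fp_ file naming in the listing above, a stimulus data class such as cl_fp_data.sv might look like the following minimal sketch (the field names, the single-precision format and the corner-case constraint are assumptions for illustration, not the paper's code):

```systemverilog
// Hypothetical floating-point stimulus class in the cl_fp_* style:
// rand members plus a constraint, as a pattern generator would
// randomise before handing the data to the BFM.
class cl_fp_data;
  rand bit        sign;
  rand bit [7:0]  exponent;   // assuming IEEE 754 single precision [8]
  rand bit [22:0] mantissa;
  rand bit [1:0]  operation;  // e.g. add/sub/mul/div encoding

  // Bias the stimulus towards the interesting corner encodings.
  constraint c_corners {
    exponent dist {0 := 10, 255 := 10, [1:254] := 80};
  }

  // Pack the fields into the 32-bit word driven onto the DUT.
  function bit [31:0] pack();
    return {sign, exponent, mantissa};
  endfunction
endclass
```

A generator would then loop on `data.randomize()` and pass `data.pack()` to the bus functional model.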
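The cover points and cross coverage described in Section III could be expressed along these lines (a sketch only: the module, signal, point and bin names are hypothetical):

```systemverilog
// Hypothetical monitor fragment: individual cover points grouped into
// a cover group and crossed, then reported through URG after merged
// regression runs.
module fp_cov_monitor(input logic       clk,
                      input logic [1:0] operation,
                      input logic [7:0] exponent);
  covergroup cg_fp_op @(posedge clk);
    cp_op : coverpoint operation {
      bins add = {0}; bins sub = {1}; bins mul = {2}; bins div = {3};
    }
    cp_exp : coverpoint exponent {
      bins zero_e = {0}; bins max_e = {255}; bins normal = {[1:254]};
    }
    // Crossing the points captures higher-level scenarios,
    // e.g. "multiply by a zero-exponent operand".
    cx_op_exp : cross cp_op, cp_exp;
  endgroup

  cg_fp_op cg = new();
endmodule
```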
40 million random test vectors. Increasing the number of random vectors above this figure had no impact on the coverage levels, and so CRV and directed tests were required to reach the goal of 100% functional coverage. This aligned to the flow illustrated in Figure 4.

Figure 7: Test Case Development

We found that using a single seed for the constraint solver in VCS with a large number of test vectors proved to be more productive in yielding a coverage result than applying several seeds with an equivalent number of test vectors.

Test scenario 7 in Figure 7 above (40 million vectors with a single seed) yielded the greatest degree of productivity from the CRV approach without running 100+ million vectors. This point was chosen as the starting point from which to develop directed test cases. Had there been no time invested in developing the test bench structure at the start of the process, it would have taken longer to close off the verification task at this stage.

We developed a suite of 36 test cases to achieve 100% coverage. This consisted of one completely random test, 27 constrained random tests and 8 directed tests. It took 1 month to verify the DUT using this method. Given that the same device was previously verified using a standard Verilog flow that took 2 months of verification effort, this results in a productivity gain of 1 month.

Figure 8: Coverage Process (functional coverage versus verification effort time, showing the CRV approach followed by the directed test approach)

Figure 8 illustrates a graph depicting the level of functional coverage achieved as per the test scenarios outlined in Figure 7. It highlights the usefulness of the CRV approach up to a certain point, after which the directed tests were required to close off the verification task.

Multiple cover points were identified inside the DUT and grouped together to yield cross coverage points that encapsulated the functionality of the DUT. Leveraging the solution with the Synopsys URG tool provided ample reporting mechanisms to indicate when the task was complete.

Figure 9 depicts a typical report derived from the Synopsys URG tool. It illustrates a sample functional and code coverage report from running a test scenario.

Figure 9: Sample Coverage Report Summary

VII ACKNOWLEDGEMENTS

We would like to recognise the support of Damien Melville in Chipright for valuable feedback in the early phases of the project.

VIII REFERENCES

[1] IEEE Standard for System Verilog, IEEE Std 1800-2005
[2] Verification Methodology Manual for SystemVerilog, Janick Bergeron, Eduard Cerny, Andrew Nightingale and Alan Hunter, Springer US, ISBN: 978-0-387-25538-5
[3] Richard McGee, Paul Furlong, Silicon and Software Systems, "VHDL to System Verilog, constrained random verification of USB 2.0 host controller sub-system", SNUG 2006, Europe
[4] SystemVerilog for Verification by Chris Spear, Springer US, ISBN: 978-0-387-27038-3
[5] Motorola Russia, "Using System Verilog for IC verification", SNUG 2005, Europe
[6] Effective Functional Verification, Springer US, ISBN: 978-0-387-28601-3
[7] Torstein Hernes Dybdahl, "How much System Verilog coverage do I need?", SNUG 2007, Europe
[8] ANSI/IEEE Std 754-1985, IEEE Standard for Binary Floating-Point Arithmetic, IEEE Computer Society Press, Los Alamitos, California, 1985