
VERIFICATION

Contents

1. ASIC Design
2. What is Verification and its Challenges
3. Types of Verification
4. Phases of Verification
5. Verification Plan
6. Verification Architecture
7. Testbench
8. Coverage-based Verification
9. Assertion-based Verification
ASIC Design Flow
Verification

"Verification is not a testbench, nor is it a series of testbenches. Verification is a process used to demonstrate that the intent of a design is preserved in its implementation."

- Paul Wilcox

Verification Challenges

 Every development team faces verification issues due to
a. the size or number of designs
b. the complexity of the design
c. the verification process being used

 Verification teams continually address these issues, only to find that new problems arise. These new problems can be more complex than the original ones.
Verification Challenges

 Almost every issue in verification today can be placed in one of three categories:

a. Missed bugs
b. Lack of time
c. Lack of resources
Missed Bugs

 The highest priority for verification teams has always been to find bugs.

 The farther down the supply chain a bug is found, the more costly it is.

 The most common sources of functional bugs found in silicon are
a. Design errors
b. Incorrect or incomplete specifications
c. Changing specifications
Cont...
Lack of Time
 Time-to-market pressure forces the completion of IC development in less time.

 The size and complexity of designs continue to increase.

 "When are we done?"

 Verifying large, complex designs is an exercise in risk management.

 But teams usually know when they are not done:

Finding 98% of the bugs is the normal process.

Finding the remaining 2% (the tough bugs) is

a. Partly an art
b. Partly luck
c. And a lot of hard work
Lack of resources

 Given enough time and resources, most verification teams can meet their goals.

 Resources:
 Experienced verification engineers are hard to find
 Specialists + inexperienced engineers = inefficiency
 Software licenses and verification tools are costly
 Tools often have a narrow focus
 Verification reuse: write once, use often
Basic Verification Design

 Verification should be a separate task.

 Verification requires a different mindset than implementation. Separation results in improved efficiency and quality of results (QoR).
Basic Verification Design and Test Process
Modified Verification Design and Test Process
What is being Verified

- Writing Testbenches using SystemVerilog
What is being verified??

 Formal Verification -- mathematical analysis (LEC)

 Property checking -- verifying the properties of the design using assertions

 Functional Verification -- verifying the functionality of the design

 Rule checkers -- linting
Functional Verification Approaches

 Functional verification can be accomplished using three complementary approaches:

a. Black Box
b. White Box
c. Grey Box
Black Box Verification

• Black box verification cannot look at or know about the inside of a design.

• All verification is accomplished using the available interfaces.

• This method suffers from a lack of visibility and controllability.

• It is difficult to locate the source of a problem.

• Black box testbenches can be used as golden testbenches.
White Box Verification
• White box verification has intimate knowledge and control of the internals of the design.

• It is tied to a specific implementation.

• It is used to verify the correctness of the functionality.

Grey Box Verification
 The grey box approach controls and observes a design entirely through its top-level interfaces.

 A typical grey box strategy is to include some non-functional modification to provide additional visibility and controllability.

 Adding observability or controllability features to the design is called Design for Verification.

 Grey and black box verification can be done in parallel with the design.
Phases of Verification

1. Verification Plan

2. Building Testbench

3. Writing Tests

4. Integrating Code Coverage

5. Analyze code coverage


Test Plan
 In the test plan, we prepare a road map for how to achieve the goal.

A test plan contains

a. Introduction
b. Assumptions
c. List of test cases
d. List of features to be verified
e. Approach
f. Deliverables
g. Resources
h. Risks and scheduling
i. Entry and exit criteria
Building Testbench

 In this phase, the verification environment is developed.

 Each component can be developed one by one.

 It is preferred to write the coverage module first, as it gives some idea of the verification scope.
Writing Tests
 After the testbench is built and integrated with the DUT, it is time to validate the DUT.

 Initially, in coverage-driven verification (CDV), tests are run randomly until about 70% coverage is reached, or until there is no further improvement.

 After analyzing the coverage reports, new tests are written to cover the holes.

 Randomization is directed to cover the holes.

 Corner cases have to be written in a directed-verification fashion.

Integrating Code Coverage

 After achieving a certain level of functional coverage, integrate code coverage.

 Code coverage tools have an option to switch it on.
Analyze Coverage

Finally, analyze both the functional coverage and code coverage reports, and take the necessary steps to achieve the coverage goals.
Verification Plan

- System-on-a-Chip Verification: Methodology and Techniques
Verification Plan
 For any SOC design, it is important to develop
and document a verification plan to serve as a
map for all of the verification activities to be
performed.

 The verification team must be prepared to modify the plan if unanticipated problems and limitations arise.

 If the plan is modified, the changes must still meet the overall goals of the plan.
Cont'd...
1. Project Functional Overview :
The project functional overview summarizes the
design functionality, defines all external interfaces, and lists
the major blocks to be used.

2. Verification Approach:
Which verification approach fits the design
requirements, for example
• Top-down design and verification
• Bottom-up verification
• Platform-based verification
• System interface-driven verification
Cont'd...

Platform Based Verification:

 Confined to a particular platform, such as Xilinx or Altera.

 We can verify multiple mutually interactive IPs at very low cost.

 It is a standard C-based test methodology.

System Interface-driven Verification
3. Abstraction Levels :

The abstraction levels to be verified include behavioural, functional, gate level, and switch level.
Verification Technologies

- Writing Testbenches using SystemVerilog
4. Verification Technologies :

• Linting
– Limitations of linting
• Simulation
– Event and cycle based simulation
– co-simulators

• Verification IP
• Code Coverage
• Functional Coverage
Linting
• Linting technology finds common programmer mistakes.

• Linting does not require stimulus.

• Linting can only identify a certain class of problems.

• Lint the code as it is being written.

• Linting can detect race conditions.
Example

module saturated_counter(output done,
                         input  rst,
                         input  clk);
  byte counter;
  always_ff @(posedge clk)
  begin
    if (rst) counter <= 0;
    else if (counter < 255) counter++;
  end
  assign done = (counter == 255);
endmodule

Note: byte is a signed 8-bit type, so counter can never reach 255 — (counter < 255) is always true and done is never asserted. This is exactly the class of mistake a linting tool can flag without any stimulus.
Simulation

• Stimulus and Response

• Event Driven and Cycle based simulation

• Co-simulators
– Most of the cycle based simulators are integrated with
the event driven simulators
Verification IP

• To verify your design, it is necessary to have models for all the parts included in a simulation.

• Models for RAMs, standard interfaces, and so on.

• It is cheaper to buy models than to write them yourself.
Code Coverage

if (parity == ODD || parity == EVEN) begin
  tx <= compute_parity(data, parity);
  #(tx_time);
end
tx <= 1'b0;
#(tx_time);
if (stop_bits == 2) begin
  tx <= 1'b0;
  #(tx_time);
end
Functional Coverage

enum {ADD, SUB, JMP, RTS, NOP} opcode;
...
case (opcode)
  ADD: ...
  SUB: ...
  JMP: ...
  default: ...
endcase

Note: code coverage might be 100% even though functional coverage is not — RTS and NOP both fall into the default branch, so code coverage cannot tell whether each was exercised.
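The gap above is what a functional coverage model captures. A minimal covergroup sketch (the module, port, and covergroup names here are illustrative assumptions, not from the original slides):

```systemverilog
// Illustrative functional coverage model for the opcode example.
// Module/port names and the sampling event are assumptions.
typedef enum {ADD, SUB, JMP, RTS, NOP} opcode_e;

module opcode_cov(input logic clk, input opcode_e opcode);
  covergroup opcode_cg @(posedge clk);
    // Automatic bins give one bin per enum value, so RTS and NOP are
    // tracked separately even though both hit the case default branch.
    cp_op: coverpoint opcode;
  endgroup

  opcode_cg cg = new();
endmodule
```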
Abstraction Level for Intent
Verification

- System-on-a-Chip Verification: Methodology and Techniques
5. Abstraction Level for Intent Verification :
The abstraction level that the design
functionality is verified against.

6. Test Application Approach:
The approach for applying functional tests to the design:

1. either pre-loading tests into on-chip memory and executing the test on the on-chip processor, or
2. applying the tests through the external interfaces of the device.
7. Results Checking :
How to verify the design's responses to functional tests:

A. Self-checking techniques,
B. Golden model (reference model) comparison, or
C. Comparing expected results files.
8. Test Definitions :

Defines the tests that are to be performed and the model abstraction levels to which the tests are to be applied.

In many instances, the same test will be applied to several model levels.
9. Models :
-- Which models to use for functional verification
-- Models include different abstraction levels

a. Model Source
b. Existing Models
- std lib, model from prev design
c. Derived Models
- Gate level model from RTL model using synthesis
d. Authored Models
- Models to be developed as part of verification.
10. Testbench Requirements :

a. The model types,
b. Abstraction levels,
c. Model sources, and
d. Testbench elements (checkers, stimulus, and so on) need to be considered.
e. For formal verification, define design properties and constraints.
11. Verification Metrics :
Two classes of metrics should be addressed in
the verification plan

1. Capacity metrics:
Identifies tool capacity assumptions (run times, memory size, disk size, and so on) and verifies that the assumptions made hold true during the execution of the plan.

2. Quality metrics:
Establishes when a verification task is
complete. Quality metrics include functional
coverage and code coverage.
12. Regression Testing :
a. The strategy for regression testing.
b. The test plan details when the regression tests
are to be run (overnight, continuously, triggered
by change levels, and so on) and
c. Specifies the resources needed for the
regression testing

13. Issue Tracking and Management:
Which tracking system to use to manage bugs and errors found in the design.
14. Resource Plan :
The resources required to execute the
verification plan, such as human resources,
machine resources, and software tool resources.

15. Project Schedule :


The tasks that must be performed to execute the
verification plan as well as key benchmarks and
completion dates. The plan should show the
interdependencies between tasks, resource
allocation, and task durations.
Architecting TestBenches

- Writing Testbenches Using SystemVerilog, Chapter 6
TestBench

• A testbench mimics the environment in which the design will reside. It checks whether the RTL implementation meets the design spec or not. This environment creates invalid and unexpected, as well as valid and expected, conditions to test the design.
Linear(Direct) TestBench
module top(); // Testbench code start
  reg  [15:0] a, b;
  wire [16:0] c;

  adder DUT(a, b, c);

  initial begin
    a = 16'h45;
    b = 16'h12;
    #10 $display("a=%0d, b=%0d, c=%0d", a, b, c);
  end
endmodule // Testbench code end
Random TestBench
module top(); // Testbench code start
  reg  [15:0] a, b;
  wire [16:0] c;

  adder DUT(a, b, c);

  initial begin
    repeat (100) begin
      a = $random;
      b = $random;
      #10 $display("a=%0d, b=%0d, c=%0d", a, b, c);
    end
  end
endmodule // Testbench code end
Self Checking TestBench

• Visually inspecting simulation results to determine functional correctness is not an acceptable long-term strategy.

• Whatever intellectual process you would go through to identify an error visually in the simulation results must be coded in your testbench.

• This technique lets the testbench detect errors and declare success or failure on its own.
Cont'd...

module top(); // Testbench code start
  ... // declarations

  adder DUT(a, b, c);

  initial begin
    repeat (100) begin
      a = $random;
      b = $random;
      #10 $display("a=%0d, b=%0d, c=%0d", a, b, c);
      if (a + b != c) $display("ERROR");
    end
  end
endmodule // Testbench code end

Other result-checking styles:
a. Using a reference model
b. Using transfer functions
c. Scoreboarding
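For the adder, the reference-model style (item a) can be sketched as follows; the module and function names and checking style are illustrative assumptions:

```systemverilog
// Sketch: self-checking with a reference model (names are illustrative).
module tb_ref_model;
  reg  [15:0] a, b;
  wire [16:0] c;

  adder DUT(a, b, c);

  // Reference model: the golden behaviour the DUT output must match.
  function automatic [16:0] ref_adder(input [15:0] x, input [15:0] y);
    return x + y;
  endfunction

  initial begin
    repeat (100) begin
      a = $random;
      b = $random;
      #10 if (c !== ref_adder(a, b))
        $display("ERROR: a=%0d b=%0d c=%0d", a, b, c);
    end
  end
endmodule
```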
Verification Environment
Architecture

- SystemVerilog for Verification
Layers in TestBench Architecture

• We can divide the architecture into different layers.

• Layers of Architecture:
a. Signal and Command Layer
b. Functional Layer
c. Scenario Layer
d. Test layer and Functional coverage
Signal and Command Layers
Functional Layer
Scenario Layer
Test layer and Functional coverage
COVERAGE-DRIVEN RANDOM-
BASED APPROACH

-- Writing Testbenches using SystemVerilog
COVERAGE-DRIVEN RANDOM-
BASED APPROACH
• Random verification does not mean that you
randomly apply zeroes and ones to every input
signal in the design.

• It is the sequence and timing of these operations, and the content of the data transferred, that is random.

• Through the addition of constraints, a random testbench can be steered toward exercising specific features.
Cont'd...
• Measure progress against functional coverage points that identify whether a feature has been exercised.

• The objective becomes filling a functional coverage model of your design rather than writing a series of testcases.

• You could fill this coverage model using a large number of directed testbenches, or you could let a random testbench create the testcases and exercise the features for you.
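Steering a random testbench with constraints can be sketched as a SystemVerilog class; the class, field, and range values below are assumptions for illustration:

```systemverilog
// Illustrative constrained-random stimulus (names/ranges are assumed).
class bus_txn;
  rand bit [15:0] addr;
  rand bit [15:0] data;

  // Constraint steers randomization toward a feature of interest:
  // word-aligned addresses within an assumed peripheral range.
  constraint c_addr {
    addr inside {[16'h4000 : 16'h7FFF]};
    addr[1:0] == 2'b00;
  }
endclass

// Usage in a test:
//   bus_txn t = new();
//   assert (t.randomize());
```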
Directed Verification vs. Constraint-driven Random Verification

Directed Verification:
• Test writer has to list out all features to be verified.
• Difficult to find corner-case bugs.
• Stimulus is generated in the test case.
• Has a limited amount of randomization.
• Becomes tedious when design complexity increases.
• Test case maintenance will be harder for large designs.

Constraint-driven Random Verification:
• Test writer has to specify the set of constraints.
• Easy to find corner-case bugs.
• Stimulus is generated in the testbench depending on the given constraints.
• Number of testcases can be reduced.
• Testcase maintenance is easy.
• Disadvantage: the testbench may generate similar scenarios many times.
TestBench and TestCases
module top();
  // DUT instance, clock generator, and TB components

  task write();
    ...
  endtask

  task read();
    ...
  endtask

  TEST tst(); // `include test_name.v
endmodule

module TEST();
  initial
    repeat (10)
      top.write();
endmodule

EXAMPLE: testcase_2.v
module TEST();
  initial
    repeat (10)
      top.read();
endmodule
Code Coverage
• Code coverage is a technology that can identify which code has been (and, more importantly, has not been) executed in the design under verification.

• All of your testbenches simulate successfully.

• But are there sections of the RTL code that were never exercised and therefore never triggered a functional error?
Code Coverage

- Writing Testbenches using SystemVerilog
Types of Code Coverages

• There are four popular coverage metrics:

a. Statement Coverage
b. Path Coverage
c. Expression Coverage
d. FSM Coverage
Statement Coverage

• Statement coverage can also be called block coverage.

if (dtack == 1'b1) begin: acked
  as     <= 1'b0;
  data   <= 16'hZZZZ;
  bus_rq <= 1'b0;
  state  <= IDLE;
end: acked

address <= 16'hFFED;
ale <= 1'b1;
rw  <= 1'b1;
wait (dtack == 1'b1);
read_data = data;
ale <= 1'b0;
• Once the uncovered conditions have been determined:

1. You must understand why they never occurred in the first place.
2. Is it a condition that can never occur?
3. Is it a condition that should have been verified by the existing verification suite? (or)
4. Is it a condition that was forgotten?

Path Coverage

• Path coverage measures all the possible ways you can execute a sequence of statements.

• The number of paths depends on the control-flow statements.
How many paths are there?
There are 4 possible paths.
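The four paths come from two independent control-flow splits in sequence. The slide's own code is not reproduced above, so the fragment below is an assumed reconstruction for illustration:

```systemverilog
// Two back-to-back if statements: each can be taken or not taken,
// giving 2 x 2 = 4 possible execution paths through the block.
always @(posedge clk) begin
  if (reset)          // split 1: taken / not taken
    count <= 0;
  if (enable)         // split 2: taken / not taken
    data_out <= count;
end
```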
• The number of paths in a sequence of statements grows exponentially with the number of control-flow statements.

• Code coverage tools give up measuring path coverage if the number of paths is too large in a given code sequence.

• To avoid this situation, keep all sequential code constructs (always and initial blocks, tasks, and functions) under 100 lines.
Expression Coverage
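The slide's example is not shown; as an illustrative sketch, expression coverage looks inside an expression rather than just at the statement:

```systemverilog
// Statement coverage only records that this line executed.
// Expression coverage records which combinations of the sub-terms
// (a, b, c) were exercised -- e.g. was s ever true because of b
// alone, or only because of c?
assign s = a & (b | c);
```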
FSM Coverage

• Statement coverage detects unvisited states.

• FSM coverage identifies state transitions.

• FSM coverage cannot identify unintended or missing transitions.

• Formal verification may be better suited for FSM verification.
Assertion Based Verification

- A Practical Guide for SystemVerilog Assertions
What is an Assertion??
• Property: specifies the correctness requirements of the design.

• An assertion is a description of a property of the design.

• If a property that is being checked does not behave the way we expect it to, the assertion fails.

• If a property that is forbidden from happening in a design happens, the assertion fails.
• Assertions can be continuously monitored during functional simulations.

• The same assertions can also be re-used for verifying the design using formal techniques.

• Assertions are also known as monitors or checkers.

• Assertions are added to the design only on a need basis. This can be accomplished using `ifdef.
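Guarding assertions with `ifdef so they are compiled only when needed can be sketched as follows; the macro and signal names are assumptions:

```systemverilog
// Compile the assertion only when ASSERTIONS_ON is defined.
`ifdef ASSERTIONS_ON
  a_req_gets_gnt: assert property (
    @(posedge clk) disable iff (rst)
    req |-> ##[1:3] gnt
  ) else $error("gnt did not follow req within 3 cycles");
`endif
```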
Why use SystemVerilog Assertions (SVA)?
• Disadvantages of Verilog:

A. Verilog does not have good control over time.
B. As the number of assertions increases, it becomes difficult to maintain the code.
C. The procedural nature of the language makes it difficult to verify parallel events in the same time period.
D. It might not capture all triggered events.
E. There is no built-in mechanism to provide functional coverage data.
Example:
After the posedge of (a), (b) should also rise in 1 to 3 clock cycles.
Verilog code to check the posedge of b
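For comparison, the same requirement is concise when written as a SystemVerilog assertion; a minimal sketch (the clock name is assumed):

```systemverilog
// After a rising edge of 'a', 'b' must also rise within 1 to 3 clocks.
property p_a_then_b;
  @(posedge clk) $rose(a) |-> ##[1:3] $rose(b);
endproperty

a_then_b: assert property (p_a_then_b)
  else $error("b did not rise within 1 to 3 cycles of a");
```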
