
Who takes the driver's seat for ISO 26262 and DO-254 verification?


Reconciling requirement-based verification with coverage-driven verification

Avidan Efody, Mentor Graphics Corp.


Agenda
• Intro
– CD/CR
• Context
• Origins
– RBV
• Context
• Origins
• Areas for consideration
– Directed testing drift
– Result stability
– Coverage tracking gap
• Summary
Safety Critical Plan, April 2015
CD/CR verification
• Context
– Blocks getting too big to be verified by directed tests
– Interactions between “independent” blocks not tested
– Addresses actual “RTL block verification” problems
• Added value
– Opens degrees of freedom in stimuli
– Allows unexpected bugs to be found
– Verification terminates based on what is done vs. what we think we did

RBV
• Context
– Safety critical designs in mil-aero/medical/automotive
– Need to make sure spec and docs 100% match deliverables
– Not directly related to “RTL block verification” problems
• Added value
– Traceability of requirements from spec to implementation and back
– Price for changing requirements can be accurately estimated
– Ownership, responsibility, liability clearly defined

RBV and CD/CR
• It's complicated…
– Management wants control
– Engineers are after efficiency
– Where are the trade-offs?

[Diagram: spectrum from directed testing to coverage-driven verification; RBV (management, overall process) vs. CD/CR (engineers, project)]
Agenda
• Intro
– CD/CR
• Context
• Origins
– RBV
• Context
• Origins
• Areas for consideration
– Directed testing drift
– Result stability
– Coverage tracking gap
• Summary
Directed testing drift
• ISO/DO put a very strong emphasis on requirement tracking
– From high-level specification to test results and even code
• Is that problematic from a CD/CR testbench perspective?

© Accellera Systems Initiative


Directed testing drift
RBV vs. CD/CR process

• CD/CR ideal process
– Analyze the spec holistically
– Find commonalities, abstract
– Implement
– Confidence derived from models overlapping the spec
• RBV ideal process
– Break the spec into requirements
– Implement
– Confidence derived from 100% of requirements mapped
Directed testing drift
Example (1)
• A DMA controller with a configurable AMBA interface
– e.g. AXI3 or AXI4
• Specification
– AXI3, AXI4 specifications
– DMA functionality specification
– Integration guide



Directed testing drift
Example (2)
• CD/CR
– Read the AXI3/AXI4 specs carefully
– Try to find all common points
– Implement a single stimulus generator
– Implement a single checker
– Driven by <10 tests
– Coverage implemented using crosses
• Very similar to the design itself



Directed testing drift
Example (3)
• Management asks difficult questions:
– Why does stimulus generation take so long?
• "Well, I'm reading the spec holistically…"
– How many of the requirements have you already done?
• "Well, can't say, my test generates all stimuli together…"
• "I need to write input coverage to answer that"
– A requirement has changed; what do we have to rerun?
• "mmm…"

Safety Critical Plan, April


11
2015
Directed testing drift
Solutions (1)
• Cave in to pressure
– Easy
– Implement a directed test per feature
– Implement two largely similar checkers
– Pay (a lot) later in maintenance, quality & management
• Find a good balance
– Keep coverage in 100% lockstep with generation/scoreboard
– Always link through coverage



Directed testing drift
Solutions (2)
• Advanced coverage databases allow linking to:
– Test plans in various formats
• Answers the question about the percentage of requirements covered
– Tests that covered a given bin
• Answers the question about which tests need to be rerun
– And many more features
• Regression efficiency, trends, etc.

[Figure: testplan items linked to coverage bins ("Covered by…")]



Result stability
• ISO/DO demand that results are repeatable and reproducible
– Allows easier auditing, version tracking, liability
• Is that problematic with a CR verification environment?



Result stability

CR result stability basics

• At the same code/seed/simulator, results are stable
– Except for simulator bugs
– Similar to non-random + seed
– Instead of a list of tests, a list of tests + seeds
• Code modification can change results
– Can be true for non-random as well
• But changes will be bigger with random
– Invalidates the list of tests/seeds to some extent
– How do we minimize the impact?



Result stability

Test/seed list extraction

[Flow: run a "blind" regression → get coverage results → extract a regression list; repeat on code modification]

• Not every modification requires a "blind" rerun
– How do we make sure "blind" reruns are avoided?
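The "extract regression list" step in the flow above can be sketched as a greedy selection: from the coverage each blind run produced, pick a small test+seed list that still reaches every bin. The run names, seeds, and bin names below are made up for illustration.

```python
# Sketch of regression-list extraction from blind-regression coverage.
# Data is hypothetical: each (test, seed) run maps to the bins it hit.

run_coverage = {
    ("random_bursts", 1): {"binA", "binB"},
    ("random_bursts", 2): {"binB", "binC"},
    ("smoke", 7): {"binA"},
    ("desc_stress", 3): {"binC", "binD"},
}

def extract_regression_list(runs):
    """Greedily pick runs until all bins any run hit are covered."""
    goal = set().union(*runs.values())
    remaining, picked = set(goal), []
    while remaining:
        # choose the run covering the most still-uncovered bins
        best = max(runs, key=lambda r: len(runs[r] & remaining))
        if not runs[best] & remaining:
            break  # nothing left can help
        picked.append(best)
        remaining -= runs[best]
    return picked

regression_list = extract_regression_list(run_coverage)
```

The extracted tests+seeds list replaces the blind regression for subsequent runs, as long as the environment stays random-stable.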
Result stability

What is random stability?

• In a random-stable environment, the impacts of code modifications are:
– Minimal
– Local
– Easily traced back to the code modification



Result stability

Solutions (1)
• Make your environment random stable
– Use UVM (version > 1.1d)
• Provides a robust infrastructure
– Follow coding guidelines
• Name transactions/sequences
• Don’t instantiate objects in utility packages
• More…



Result stability

Solutions (2)
• Monitor coverage continuously
– Minor changes can drop coverage numbers
– Make sure you catch those early
– Some coverage databases provide trend analysis

[Figure: coverage reports as HTML, graphs, and text]
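Catching those drops early is a simple comparison over the regression history, the kind a trend-analysis view automates. A minimal sketch, with illustrative numbers and a made-up tolerance parameter:

```python
# Sketch of continuous coverage monitoring: flag any regression whose
# total coverage dropped versus the previous one. Numbers are illustrative.

history = [  # (regression label, total coverage %)
    ("mon", 81.2), ("tue", 82.0), ("wed", 78.5), ("thu", 78.9),
]

def coverage_drops(hist, tolerance=0.5):
    """Return regressions whose coverage fell more than `tolerance` points."""
    drops = []
    for (_, prev), (label, cur) in zip(hist, hist[1:]):
        if prev - cur > tolerance:
            drops.append((label, round(prev - cur, 1)))
    return drops
```

Run nightly, this turns a silent coverage regression into an immediate, attributable event.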



Result stability

Solutions (3)
• Consider advanced stimulus methods
– Graph-based methods combine:
• The stability of directed testing
• The coverage space of random
• And make it all more efficient
– Standardization ongoing at the Accellera PSWG



Functional coverage tracking gap
• ISO/DO require a link to verification results
• What are "verification results"?
– They could be functional coverage, but…
– Auditing might require something more tangible
• Wave files
• Log files
• How can you go from functional coverage to raw results?



SV coverage types
• Cover groups
– Good for data and crosses
– But tied to a single event
• Assertions
– Good for event sequences
– But data is very simple
• Why can't we have both?
– Complex data & crosses
– Sequences of events
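The "both" the slide asks for, event sequences crossed with complex data, can be sketched as a post-processing collector over a transaction log: match each request to its completing response, then record a crossed bin from the combined data. The log format and field names here are hypothetical.

```python
# Sketch: coverage keyed on a two-event sequence (request then matching
# response) that still records crossed data. Log format is made up.

log = [
    ("req",  {"id": 1, "addr": 0x9000, "secure": True,  "type": "read"}),
    ("req",  {"id": 2, "addr": 0x1000, "secure": False, "type": "read"}),
    ("resp", {"id": 1, "resp": "OKAY"}),
    ("resp", {"id": 2, "resp": "SLVERR"}),
]

def sequence_cross_coverage(events):
    """Cross request data with the response that completes each sequence."""
    pending, bins = {}, set()
    for kind, data in events:
        if kind == "req":
            pending[data["id"]] = data
        elif kind == "resp" and data["id"] in pending:
            req = pending.pop(data["id"])
            bins.add((req["type"], req["secure"], data["resp"]))
    return bins

seq_bins = sequence_cross_coverage(log)
```

Because the bin is only recorded when the full sequence completes, it captures both the event ordering (assertion-like) and the data cross (covergroup-like).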



Example
• A read request to a secure zone gets an OK response with garbage data
– Two events: request & response
– Data: address, request type, response, data
• Cover group implementation
– Sampled at response time
– Request time not captured
– Response time captured?
• How do you go from a cover group to the wave?



Solutions (1)
• Dynamic coverage
– Post processing
– Multiple events
– Multiple data
• Requirements map to database queries
– Get all data and events
– No rerun at requirement change
• It might already be covered
• Just ask the database…



Solutions (2)
• Log out-of-band information
– Log the covergroup sampling time
– Link it to specific bins
• For the above example:
– The secure-read bin is linked to the response time
• Not supported by every coverage database!
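The out-of-band log amounts to recording the simulation time alongside every bin hit, so an auditor can jump from a functional-coverage bin straight to the wave. A minimal sketch; the function names and the bin name are made up, and the times would come from the simulator at sampling:

```python
# Sketch of out-of-band bin-to-time logging. Names are hypothetical.

from collections import defaultdict

bin_times = defaultdict(list)

def sample_bin(name, sim_time_ns):
    """Record that bin `name` was hit at `sim_time_ns`."""
    bin_times[name].append(sim_time_ns)

def wave_times_for(name):
    """Times to inspect in the wave viewer for this bin."""
    return bin_times.get(name, [])

# e.g. the secure-read bin above, sampled at its response times
sample_bin("secure_read_ok_garbage", 1250)
sample_bin("secure_read_ok_garbage", 4310)
```

This closes the tracking gap: a requirement links to a bin, and the bin links to concrete times in the wave and log files.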



Summary
• ISO/DO teams shouldn't stay away from advanced verification!
• RBV and CD/CR can work together
– But they require upfront planning
– And awareness
• With good planning
– Productivity can go well beyond directed tests
– And ISO/DO support can improve



