( ESNUG 562 Item 6 ) -------------------------------------------- [09/29/16]

Subject: CDNS Pete Hardee cites 18 mistakes in Jim Hogan's Formal Guide

    Anyway, since equivalence checking is already well understood by most
    chip designers, I'm focusing the rest of this report on F-ABV (ESNUG
    558 #2) and Formal Apps (ESNUG 558 #3) -- the two segments where I
    expect the most expansion for formal.
- Jim Hogan
Vista Ventures LLC, OneSpin BoD Los Gatos, CA
From: [ Pete Hardee of Cadence ]

Hi, John,

Plenty of good stuff in Jim Hogan's extensive report on formal verification
in ESNUG 558 -- but also plenty of stuff Jim got wrong -- some clear errors,
other "bendings of the truth", and some outright omissions...

Jim's first real inaccuracy is in ESNUG 558 #2:

    Other vendors sell tools that automatically generate assertions to
    test for specific common design issues (e.g. "array out of bounds"
    access). They extend automated testing beyond plain old HDL linting
    by checking the operation of your chip -- in addition to code syntax
    and semantics. These tools include OneSpin Inspect, Real Intent
    Ascent, and Mentor AutoCheck.
- http://www.deepchip.com/items/0558-02.html
Both our JasperGold and our Incisive Formal are ABV tools that have done the
exact automatic checks that Jim's described here for years:
- Structural Property Synthesis (SPS) app for Jasper, and,
- Superlint for Incisive Formal.
Jim might not know this, but a superset of these two tools is now nicely
integrated into our current rev of JasperGold. I count missing these two
tools, plus missing their integration into the current rev of JasperGold,
as 3 Hogan Omissions.
Hogan Error Count: 0 Total Hogan Error Count: 0
Hogan Omission Count: 3 Total Hogan Omission Count: 3
---- ---- ---- ---- ---- ---- ----
On to ESNUG 558 #3 to correct Jim's omission:
Some common public algorithms inside formal engines are:
- Binary Decision Diagrams or BDD
- "SATisfiability" Proofs or SAT
- Bounded Model Checking or BMC
- Craig Interpolation, usually shortened to "Craig"
- Property Directed Reachability or PDR
Often the more mature the tool, the more engines with different ABV
algorithms it will contain. For example, Cadence JasperGold and
OneSpin Verify both have ~15-20 formal engines/algorithms.
- http://www.deepchip.com/items/0558-03.html
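For readers new to these engines, the core BMC idea -- unroll the design
cycle by cycle and look for a property violation -- fits in a few lines.
This is a toy sketch that enumerates states explicitly; real engines encode
the unrolling as a SAT problem. All names here are illustrative, not any
vendor's API.

```python
def bmc(init_states, next_states, bad, max_depth):
    """Return a counterexample trace [s0..sk] with bad(sk), or None."""
    frontier = [[s] for s in init_states]       # all traces of length 1
    for depth in range(max_depth + 1):
        for trace in frontier:
            if bad(trace[-1]):                  # property violated here?
                return trace                    # shortest CEX at this depth
        # unroll one more cycle
        frontier = [t + [n] for t in frontier for n in next_states(t[-1])]
    return None                                 # no CEX within the bound

# Toy design: a 3-bit counter; assert it never reaches 6.
cex = bmc(init_states=[0],
          next_states=lambda s: [(s + 1) % 8],
          bad=lambda s: s == 6,
          max_depth=10)
print(cex)   # [0, 1, 2, 3, 4, 5, 6] -- a CEX 6 cycles deep
```

Note the depth-by-depth search is why BMC finds the *shortest* CEX but
never converges to a full proof on its own -- hence engines like PDR and
Craig interpolation for unbounded convergence.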
All formal tools use these basic algorithms; but most use variants on these
themes to boost either runtime or capacity. For example, Bounded Model
Checking (BMC) is great for bug-hunting -- finding CEXs deep in the design's
state-space (rather than achieving property convergence) -- but to go
*really* deep in the state-space you need Elastic BMC, which is:

   - automatic helper assertions,
   - guidepointing (using cover points to guide the bug search),
   - and searchpointing (random constraint solving).

Jim missed that our JasperGold does all three of these -- plus it has a set
of new ABV engines to hit "quality" bugs that no other commercial validation
techniques can find. (That's 4 more Hogan Omissions.)

An HP case study from JUG 2015 backs this:
User example #1: Jim Kasak of HP in his JUG 2015 best paper-winning
presentation, where he described finding a CEX as deep as 2942
cycles and finding a bunch of bugs other techniques missed ~100-400
cycles deep in the design.
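The guidepointing idea -- chaining shallow searches through cover points
instead of one deep search for the bug -- can be sketched as below. This is
a toy illustration under my own naming, not JasperGold's actual Elastic BMC
implementation.

```python
from collections import deque

def reach(start, step, goal, max_depth):
    """Shallow BFS from `start` to a state satisfying `goal`."""
    queue, seen = deque([[start]]), {start}
    while queue:
        trace = queue.popleft()
        if goal(trace[-1]):
            return trace
        if len(trace) <= max_depth:             # stay shallow
            for nxt in step(trace[-1]):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(trace + [nxt])
    return None

def guided_bug_hunt(init, step, waypoints, bug, hop_depth):
    """Chain shallow searches: init -> wp1 -> wp2 -> ... -> bug."""
    trace = [init]
    for goal in waypoints + [bug]:
        hop = reach(trace[-1], step, goal, hop_depth)
        if hop is None:
            return None                         # a waypoint was unreachable
        trace += hop[1:]                        # splice the hop onto the trace
    return trace

# Toy: the bug sits 12 cycles deep, but each guided hop is only 4 deep.
step = lambda s: [s + 1]
cex = guided_bug_hunt(0, step,
                      waypoints=[lambda s: s == 4, lambda s: s == 8],
                      bug=lambda s: s == 12, hop_depth=5)
print(len(cex) - 1)   # 12 -- cycles deep, reached via three 4-cycle hops
```

The payoff is that each individual search stays within the engine's sweet
spot while the composed trace goes far deeper than any single bounded run.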
Hogan Error Count: 0 Total Hogan Error Count: 0
Hogan Omission Count: 4 Total Hogan Omission Count: 7
---- ---- ---- ---- ---- ---- ----
Jim's second ESNUG 558 #3 omission:
    Many Formal ABV tools today offer some sort of GUI with features
    such as: waveform display, source code display with annotation,
    driver analysis, and code navigation. Examples are the Mentor Questa
    debugger, OneSpin Debugger, and the Synopsys Verdi environment.
- http://www.deepchip.com/items/0558-03.html
Jim made a heinous omission by not mentioning JasperGold Visualize here.
Our Visualize tool does some unique root-cause analysis powered by our
formal engines. It took a decade of iterations between user designs and the
Jasper/CDNS R&D folks to reach this level of maturity and usability.
From the Broadcom post on DeepChip in ESNUG 544 #2:
CADENCE IEV VS. JASPER JASPERGOLD
Then we made the switch from the Cadence IEV formal verification
tool to the Jasper JasperGold (this was before the acquisition)
as we saw some advantage in debug analysis with JasperGold
instead of IEV.
What I liked about the waveform debugger, Jasper Visualize, is
that you can manipulate the waveform to explore your design. It
can generate on-the-fly covers, change signal values on any clock,
and then generate a new trace. Once that trace is displayed, you
still have traditional waveform debug features -- like tracing
the source of a value change.
- Normando Montecillo of Broadcom
http://www.deepchip.com/items/0544-02.html
    Cone of Influence (COI) Coverage

    This is based on the activation design logic that drives a particular
    signal. COI coverage suggests that a particular code line is observed
    by the assertion, as well as all the lines of code that influence
    this observed line. This is a somewhat conservative metric, which is
    reasonably easy to compute; however, it suffers from the accuracy
    issue that some of the design logic may be optimized away in various
    formal engines. In addition, to suggest a line of code is fully
    verified because it influences another line can be questionable. For
    this reason the COI method has been refined by some vendors to focus
    on the proof itself. For example, the Cadence "Proof Core" coverage
    metric.
- http://www.deepchip.com/items/0558-03.html
The same mistake is also made by Raik Brinkmann (of OneSpin) in his much
longer reply in ESNUG 558 #6.
This is *NOT* what ProofCore does. Instead, what Jim Hogan described is
basic structural analysis of property coverage, which tends to be
over-optimistic. It's just a guideline. Its purpose is to highlight areas
you definitely haven't covered with properties. No more.

ProofCore does much, much more. It takes engine-level information about
the contents of the COI that were actually proven by each assertion.
ProofCore is much more accurate. Just ask Vigyan Singhal (Oski) and
Vikram Khosa (ARM); they know.

ProofCore provides the most accurate measure of formal coverage on both
bounded and unbounded proofs -- which is unique to CDNS Jasper. No one
else does this.
User example #3: Oski-ARM JUG2015 paper, slide 31. Note how the
authors cite how "almost exact" ProofCore is -- much better than
basic property coverage.
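For contrast, the structural COI metric Jim described is just a backward
walk over the signal fan-in graph. A rough sketch (toy netlist,
illustrative names only):

```python
def coi(deps, signal):
    """All signals that transitively drive `signal` (including itself)."""
    covered, stack = set(), [signal]
    while stack:
        s = stack.pop()
        if s not in covered:
            covered.add(s)
            stack.extend(deps.get(s, []))       # walk into the fan-in of s
    return covered

# Toy netlist: signal -> the signals that drive it.
deps = {"out": ["a", "b"], "a": ["clk", "rst"], "b": ["sel"], "dbg": ["sel"]}
print(sorted(coi(deps, "out")))   # ['a', 'b', 'clk', 'out', 'rst', 'sel']
# "dbg" is outside the COI of the assertion on "out", so it is flagged
# as definitely unverified; everything inside is only *maybe* verified.
```

The structural walk says nothing about whether the proof actually exercised
those signals -- which is exactly the gap proof-based coverage closes.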
I count this as 1 Hogan Error, even though his OneSpin buddy, Raik, is also
implicated in spreading this same FUD.
Hogan Error Count: 1 Total Hogan Error Count: 1
Hogan Omission Count: 0 Total Hogan Omission Count: 8
---- ---- ---- ---- ---- ---- ----
Jim's bending of the truth in ESNUG 558 #3:
    7. FORMAL-SIMULATION INTEROPERABILITY

    Because Formal ABV today is used alongside simulation, the
    interoperability between the two is important. A major issue is
    understanding which parts of your design are covered by the
    simulation or formal tools. This has been a problem because the
    coverage models between the two are different.

    A number of simulators like Synopsys VCS, Cadence Incisive, Mentor
    Questa, Aldec Riviera, Silvaco Silos, Fintronic FinSim, Synapticad
    VeriLogger, Winterlogic Z01X, Tachyon CVC and others all have links
    to formal -- mostly through their coverage mechanisms.

    Caveat: your formal tool along with your simulator will output a
    boatload of coverage metric data. This coverage information needs to
    be loaded into a common database and viewed with a verification
    planning tool to assess what overall progress you're making on your
    chip. It is too easy to lose data between the two different methods
    if you don't do this!

    Some vendors allow for their databases to be read and written in a
    limited fashion from your formal and simulation tools, including
    third parties; but you must ask each vendor exactly which tools it
    does or does not support.
- http://www.deepchip.com/items/0558-03.html
Jim's bending the truth by trying to have it both ways. If you are a
one-trick pony ABV tool, you can claim to be "vendor-neutral" and support
every Verilog/VHDL simulator under the sun -- but then your ABV tool loses
the advantages you get from tight integration with debuggers, emulators,
equivalence checkers, other functional verification languages, etc.

While at Cadence we support all reasonable standards to promote 3rd party
interoperability, we have deep integration between our formal and
simulation (and hardware) engines that creates a unified metric-driven
verification flow which our customers love.

Simply importing coverage data from a formal ABV tool is naive and only
meets part of the customer's requirements. What is needed is a sensible
reconciliation of all of the different coverage models *and* their results
-- which is what Cadence R&D has implemented.

A recent customer case for metric-driven verification that illustrates
these points:
User example #4: PMC-Sierra paper from JUG 2015. Note how the
integrated flow brings formal throughout the verification. It's
not a hand-off process!
Jim's bending the truth to make OneSpin look passable. If you're simulator
"vendor-neutral", Jim knows you lose the advantages of tight hooks with
debuggers, emulators, EC tools, etc. This is another 1 Hogan Error.
Hogan Error Count: 1 Total Hogan Error Count: 2
Hogan Omission Count: 0 Total Hogan Omission Count: 8
---- ---- ---- ---- ---- ---- ----
Something Jim got right in ESNUG 558 #3:
    9. CUSTOMER TECH SUPPORT

    Formal Verification has become much easier to use in the last few
    years. However, creating assertions and running the tools still
    requires a certain level of expertise. Depending on your internal
    expertise, you will need access to formal application engineering
    experts... Understanding the capability of the vendor's AE support
    is crucial. Caveat emptor.
- http://www.deepchip.com/items/0558-03.html
I couldn't agree with Jim more! However, this fact was later handled very
misleadingly by Jim in his comparison table (which I correct later in this
post.) Anyway, there's no question Cadence dwarfs all our rivals in terms
of our verification R&D team, Corporate AEs, and specialist FAEs.
---- ---- ---- ---- ---- ---- ----
What Jim messed up in his Formal Apps discussion in ESNUG 558 #4:
    4. ACTIVATION CHECKS & CODE UNREACHABILITY APP

    These formal apps check your RTL design source code for unreachable
    parts.
- http://www.deepchip.com/items/0558-04.html
What Jim has described here are actually capabilities found in two
JasperGold apps: SPS and COV. SPS just analyzes the design itself,
usually without constraints (except clock and reset) and can be the
first point at which a designer can identify dead or unreachable code.
Then COV considers the formal constraints, and can identify unreachable
code due to over-constraints. COI analysis of the applied constraints is
used in the COV app.
What is totally missing from Jim's app talk, though, is our:
17. Simulation Coverage Unreachability App (Cadence's UNR app)
What this JasperGold app does is first to analyze the simulation database
(unicov in Cadence's case) and, for every yet-to-be covered coverpoint, we
generate a formal cover property. Then the UNR app runs on the design to
explore the reachability of those points. If a point is reachable, we can
generate a witness trace which can be used to create a test to reach that
point. If it's unreachable, there are two possibilities. Either:
1. It's a bug - talk to the designer to see if this is essential
code or an unexpected result... OR ...
2. It's not a bug, the point can be waived and the simulation
coverage metrics improve.
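That triage loop can be sketched abstractly -- a toy reachability check
standing in for the formal engine, with illustrative names only:

```python
from collections import deque

def witness(init, step, hit, limit=64):
    """BFS for a trace from reset to a state where `hit` holds."""
    queue, seen = deque([[init]]), {init}
    while queue:
        trace = queue.popleft()
        if hit(trace[-1]):
            return trace                        # witness for a cover property
        if len(trace) < limit:
            for nxt in step(trace[-1]):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(trace + [nxt])
    return None                                 # provably unreachable

# Toy FSM: states 0-3, but nothing ever transitions into state 3.
step = lambda s: {0: [1], 1: [2], 2: [0]}[s]
for point in (2, 3):                            # coverpoints simulation missed
    w = witness(0, step, lambda s, p=point: s == p)
    if w:
        print(point, "reachable -- write a test following trace", w)
    else:
        print(point, "unreachable -- waive it (or ask the designer)")
```

The value is in the second branch: an unreachable verdict is a proof, so
the coverpoint can be waived with confidence instead of chased forever in
simulation.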
Automating this analysis typically saves 2 to 3 weeks of analysis per
major block or sub-system. It's one of the most automated, easiest-to-apply
formal apps, with the highest ROI.
We have many customer case studies for the UNR app. Here are 3 of them:
User example #5: Analog Devices paper at CDNLive India 2015.
Note the call out to the UnR app.
User example #6: Texas Instruments paper at CDNLive India 2015.
Note another call out to the UnR app.
User example #7: IBM paper at CDNLive India 2014.
It's an 8 point slide promoting the UnR app.
Since Jim missed the SPS and COV apps, plus missed our more powerful UNR
JasperGold app, I'm citing this as 3 Hogan Omissions.
Hogan Error Count: 0 Total Hogan Error Count: 2
Hogan Omission Count: 3 Total Hogan Omission Count: 11
---- ---- ---- ---- ---- ---- ----
And Jim errs about apps in ESNUG 558 #4:
    14. SEQUENTIAL VERILOG/VHDL RTL EQUIVALENCY CHECKING APP

    The only limitation is state-space equivalency is not infinite -- it
    only goes to a certain cycle depth, e.g. 30 clock cycles. This cycle
    depth is a differentiator between vendors' apps.
- http://www.deepchip.com/items/0558-04.html
If he's going to write about formal, Jim should know that cycle depth is
not a good metric for sequential equivalence. SEC verification spans many
design transformations like:
- clock gating,
- low power optimizations,
- design changes for PPA, etc.
SEC is a complete solution for RTL-to-RTL signoff -- which means bounded
proofs are not sufficient. The goal is to achieve full proofs. The
variable is what capacity the tool has to do that. Cycle depth is only
one dimension of the capacity measure.
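The bounded-versus-full distinction is easy to see in a toy product-machine
sketch: explore every reachable joint state of the two designs in lockstep
and require equal outputs everywhere. When the exploration reaches a
fixpoint, the result holds for all time -- no cycle bound is involved.
(Illustrative code, not the SEC app's algorithm.)

```python
def seq_equiv(init_a, step_a, out_a, init_b, step_b, out_b):
    """Full proof: explore every reachable joint state of the product
    machine; the designs are equivalent iff outputs match everywhere."""
    seen, stack = set(), [(init_a, init_b)]
    while stack:
        a, b = stack.pop()
        if (a, b) in seen:
            continue
        seen.add((a, b))
        if out_a(a) != out_b(b):
            return False                        # counterexample state
        stack.append((step_a(a), step_b(b)))    # lockstep transition
    return True                                 # fixpoint: proven for all time

# Spec: binary counter mod 4. Impl: gray-coded counter plus a decoder --
# a stand-in for a low-power or PPA re-encoding of the same behavior.
gray = [0, 1, 3, 2]
print(seq_equiv(0, lambda s: (s + 1) % 4, lambda s: s,
                0, lambda s: gray[(gray.index(s) + 1) % 4],
                lambda s: gray.index(s)))       # True
```

Capacity, in this picture, is how large a joint state-space the engine can
exhaust -- which is why cycle depth alone understates what SEC must do.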
Hence I adjusted Jim's comment in his apps comparison table. +1 Hogan Error
Hogan Error Count: 1 Total Hogan Error Count: 3
Hogan Omission Count: 0 Total Hogan Omission Count: 11
---- ---- ---- ---- ---- ---- ----
Then Jim completely skipped over Cadence ABVIP in ESNUG 558 #4:
    13. PROTOCOL COMPLIANCE APPS

    These apps confirm whether your chip design source code complies
    with common bus and other protocol standards -- such as AMBA, APB,
    and AXI.

    ...

    16. AND THE 3RD PARTY APP MARKETPLACE...

    This isn't an app per se, but more about a new business model which
    makes formal apps available to the common engineer.
- http://www.deepchip.com/items/0558-04.html
I'll address these two together. Why are they linked? The issue is really
not about a 3rd party apps marketplace, which is just a marketing gambit.
(Jim knows this.)

The real issue is allowing users to readily create their own "apps" -- and
one of the biggest needs driving this is custom in-house protocols, for
which customers might need their own assertion-based VIPs (previously known
as "IPKs" in the Jasper world, now "ABVIPs" in the Cadence world).
Cadence's library of ABVIPs provides a crazy long list of assertion-based
specification and verification IP for bus and memory protocols.
Additionally, JasperGold users can create ABVIPs for their proprietary
protocols if they want.
Jim missing CDNS ABVIPs is clearly 1 Hogan Omission.
Hogan Error Count: 0 Total Hogan Error Count: 3
Hogan Omission Count: 1 Total Hogan Omission Count: 12
---- ---- ---- ---- ---- ---- ----
This brings us to JasperGold app #18 that Jim missed:
18. Architectural Modeling Apps (ARCH)
This Jasper ARCH app verifies protocols at the early specification level,
not at the implementation level. Once that is done, ARCH helps create
the Protocol Compliance App -- which is the ABVIP for that particular
custom protocol.
Jasper R&D worked for years helping user companies responsible for some of
the common major public protocols -- and these customers used the ARCH app
to remove bugs from the specification -- before going to implementation.
ARCH is also commonly used for private in-house protocols.
Here's a Samsung example:
User example #8: Samsung paper "IPK Use, Reuse and New Development"
from JUG 2015
Yet another 1 Hogan Omission.
Hogan Error Count: 0 Total Hogan Error Count: 3
Hogan Omission Count: 1 Total Hogan Omission Count: 13
---- ---- ---- ---- ---- ---- ----
And finally the JasperGold apps #19 and #20 that Jim missed:
19. System-Level Deadlock Apps (DLD)
Deadlock checks are part of the "autochecks" noted in the other Jasper
apps, and are provided in the JasperGold SPS app. But with the DLD app
we're talking about a much bigger context of deadlock -- one that really
stresses the performance and capacity of the formal engines to prove.
Because of this, we treat the DLD app as an "expert-only" app. However,
we have had notable customer successes with it:
User example #9: Qualcomm paper "NoC functional and deadlock
verification using formal" from CDNLive India 2015 using DLD.
20. Post-Silicon Debug Apps (PSD)
Back in 2010, before the CDNS acquisition, Jasper R&D launched the
industry's first formal post-silicon debug app (PSD), which finds the root
cause of subtle RTL design bugs that escaped pre-silicon validation.
Naturally the PSD app runs inside JasperGold. Working with minimal trace
info on how the anomaly manifests itself, PSD uses Jasper Visualize and
deep bug-hunting engines (along with waypoints) to reproduce the anomaly's
likely causes from the Verilog/VHDL RTL.
User example #10: D.E. Shaw Research paper "Post-Silicon Debug Using
Formal Verification Waypoints" from DAC 2006. This was a post-silicon
debug of a 33 M gate chip. Using waypoints greatly reduced the formal
analysis region needed to track back the post-silicon anomaly from
its error signature.
So with Jim missing two more JasperGold apps, that's 2 more Hogan Omissions,
yielding a final tally of:
Total Hogan Error Count: 3
Total Hogan Omission Count: 15
With Jim being an often quoted industry luminary, I'd think he'd not make so
many (18 total!) mistakes. But then I have to remember that Jim Hogan's
roots are in analog, full custom, and mixed-signal design. Enough said.
    - Pete Hardee
      Cadence Design Systems                           San Jose, CA
P.S. Enclosed in ESNUG 562-07 is the corrected version of Jim Hogan's two
tables on formal tools and formal apps.
---- ---- ---- ---- ---- ---- ----
Related Articles:
9 major and 23 minor Formal ABV tool metrics - plus their gotchas
16 Formal Apps that make Formal easy for us non-Formal engineers
OneSpin CEO cites 8 "insufficiencies" in Jim Hogan's Formal Guide
---- ---- ---- ---- ---- ---- ----