( DAC 10 Item 1 ) ---------------------------------------------- [ 09/02/10 ]
Subject: NextOp BugScope, Zocalo Zazz, Breker Trek
NEXTOP KICKED ASS: In the eyes of the users, the one EDA tool that "won"
this year's DAC was NextOp's BugScope. Why? The majority (58%) of the
users who saw it at DAC said they wanted to evaluate it. Judging from
the user comments, its nearest rivals, Zocalo and Breker, so far don't
seem to have the same initial traction that NextOp has.
"What were the 3 or 4 most INTERESTING specific tools that
you saw at DAC this year? WHY were they interesting to you?"
---- ---- ---- ---- ---- ---- ----
The BugScope DAC demo was my first conversation with NextOp - though
another team in my company had done an initial evaluation of their
assertion synthesis tool.
I'm a verification manager, and what makes BugScope interesting to me
is that it can automatically generate assertions and functional coverage
without much input from engineers. We write our own today, so if the
tool can take over some of that work, it should shorten our verification
cycle and help our bug hunting.
BugScope takes our RTL design plus some of the stimulus from our
regression as input, and generates a list of properties that we can give
to our RTL designers to walk through and classify as either assertions
or functional coverage. From these properties, the tool can generate
assertions or functional coverage in our choice of language.
We can rerun our regression using the new properties and see if any of
the assertions fire and find bugs. Moreover, we hope to use the coverage
generated by the tool to confirm any coverage holes that we may have with
our regression suite. In this case, we can fine-tune our stimuli for
better coverage.
What I described above is what I saw in NextOp's DAC demo.
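The loop described above can be caricatured in a few lines. This is my
own illustration with invented names and toy data, not NextOp's actual
algorithm:

```python
# Sketch of the described flow: keep properties consistent with the
# regression, let the designer tag each one, then re-run new stimulus
# and see which assertions fire. All names and data here are invented.

def holds(prop, traces):
    """True iff the property holds on every cycle of every trace."""
    return all(prop(cycle) for trace in traces for cycle in trace)

# A regression "trace" is a list of per-cycle signal dictionaries.
regression = [
    [{"req": 1, "ack": 1}, {"req": 0, "ack": 0}],
    [{"req": 1, "ack": 1}],
]

# Candidate properties (hand-written here; the tool would mine them).
candidates = {
    "req_implies_ack": lambda c: not c["req"] or c["ack"],
    "ack_implies_req": lambda c: not c["ack"] or c["req"],
}

# Step 1: keep only the properties the regression never falsifies.
proposed = {n: p for n, p in candidates.items() if holds(p, regression)}

# Step 2: the designer walks the list and tags each property.
tagged = {n: "assertion" for n in proposed}

# Step 3: re-run new stimulus; a firing assertion is a potential bug.
new_run = [[{"req": 1, "ack": 0}]]
fired = [n for n, kind in tagged.items()
         if kind == "assertion" and not holds(proposed[n], new_run)]
```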
WARNING: I can't personally say how well it works until we do our own
eval. Our company's first eval went okay, but it was on a small design.
I have a larger design on which I want to do a more extensive
evaluation, to see if BugScope can generate more assertions and better
coverage, and to prove whether the tool can really add value to our
flow. I also want to see if the properties it generates can help us
find corner cases and things we've missed in our test suite. Finally,
I want to find out if the tool can actually help us find bugs.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
We're looking at evaluating NextOp's BugScope.
From the DAC demo, it looks like BugScope can validate properties and
produce assertions early in the development cycle. In addition, it
provides a way to find missing scenario coverage in the test bench
and tests.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
NextOp automatically generates high-quality assertions through a clever
marriage of simulation and formal.
I am definitely interested in doing an evaluation.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
NextOp BugScope
Step 1: Analyzes the input (RTL) design and stimulus, and
synthesizes properties
Step 2: Generates the properties report, which lets users
know the quality of verification and gives users
some direction for improving their stimulus
Step 3: User can insert the assertions into the design (Cool!)
It automatically generates assertions and functional coverage points
at the RTL level for improving the quality of verification stimulus.
BugScope only supports SVA & PSL assertions.
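Step 1 above could be caricatured in a Daikon-style way: propose simple
property templates over signal pairs and keep the ones the stimulus
never falsifies. An invented sketch; real assertion synthesis is far
more sophisticated:

```python
# Invented sketch of property synthesis from a simulation trace:
# enumerate "a==1 implies b==1" templates over all signal pairs and
# drop any template some cycle falsifies. Signal names are made up.
from itertools import permutations

def mine_implications(trace):
    """Return (a, b) pairs such that a==1 implies b==1 on every cycle."""
    signals = trace[0].keys()
    survivors = set(permutations(signals, 2))
    for cycle in trace:
        survivors = {(a, b) for (a, b) in survivors
                     if not cycle[a] or cycle[b]}
    return survivors

trace = [
    {"start": 1, "busy": 1, "done": 0},
    {"start": 0, "busy": 1, "done": 0},
    {"start": 0, "busy": 0, "done": 1},
]
props = mine_implications(trace)
```

Each surviving pair is then a candidate for the user to keep as an
assertion or flag as a coverage hole in the stimulus.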
- Jen-Chieh Yeh of ITRI Taiwan
---- ---- ---- ---- ---- ---- ----
NextOp looks interesting to us and we are hoping to evaluate it,
but we have not had any hands-on experience yet.
We are a bit concerned about its lofty price tag.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
I'm interested in tools to get coverage metrics, and possibly improve
the coverage in an automatic way (that is, without hand-coding
functional coverage).
NextOp's automatic assertion generation may be a solution. The goal is
to provide assertions to be run in simulation as well as formal
verification.
I found NextOp's presentation/demo useful and motivating enough to try an
evaluation. My main worry is, is the number of generated assertions
manageable on a real design? Also, are the assertions really meaningful?
Only an eval can answer these questions. So I've asked for an eval, and
am waiting for approval from my management.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
BugScope does assertion and coverage synthesis. We are considering
evaluating it.
- Mikael Andersson of Axis
---- ---- ---- ---- ---- ---- ----
We're already using NextOp after a successful evaluation, so I didn't
bother attending the booth.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
The primary functionality of interest to me in NextOp BugScope
Assertion Synthesis was using its assertions to find functional
coverage holes in our signoff test suites for the different IPs being
designed in our company.
NextOp's claims of scalability of their approach appealed to me as
0-in and Magellan do not appear to scale well.
We're very interested in evaluating BugScope to see if it really can
cover a major gap that we see in our IP verification flows: uncovering
functional coverage holes.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
NextOp BugScope has a very interesting and unique approach to assertions.
It basically generates assertions automatically for you. Our company
evaluated BugScope recently and has a positive impression of it.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
I'm impressed by BugScope's ability to produce assertions automatically.
We are struggling to define SystemVerilog Assertions (SVA) for each of
the modules in our designs, but it is such painful work for our
designers. I am hoping BugScope will be a big help for them.
We have asked NextOp's distributor, Verifore, to explain it to my
designers and to provide us with some information in Japanese.
- Toshihiko Himeno of Toshiba
---- ---- ---- ---- ---- ---- ----
I don't fully understand BugScope, but from what I saw in the demo, the
plain old assertion generation was the most interesting part for me.
We are looking into assertion-based verification to fill in major holes
in our methodology. Getting designers to sign up to writing assertions
instead of (or in addition to) module testbenches has proven to be
difficult, so BugScope could help out.
I am interested in an evaluation in a month or two, after I have written
up my DAC report and a proposal. The first thing I would like to do is
have a WebEx or GoToWebinar with the NextOp guys and some other folks in
my group.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
I think BugScope might be a useful tool to generate assertions and
properties automatically. We are interested in learning more
about it after DAC.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
NextOp technology: based on the design's internal activity during your
regression, it identifies design properties that the user can either
map as coverage holes (requiring additional tests) or keep as
assertions that can be reused later on. It seems easy to integrate into
a verification flow. We'll probably have a deeper look at it to assess
the added value of those generated properties...
Zocalo's tool, Zazz, also addresses the issue of property deployment,
through other means. They first try to figure out which RTL signals are
the best candidates for properties. Then they propose a graphical
solution to capture your properties and debug them. The mental
gymnastics required to figure out what the property should be are still
there, but the other painful tasks look simplified.
Last, we have been using Breker's Trek solution for a couple of years
now: their SoC verification strategy is solid, and we are happy with
the improvements it brings to our flow. I add it to the list because
Breker has made nice refinements to the methodology since last year: I
like how they leverage an already existing test suite to create more
stressful SoC tests.
- Gregory Faux of STMicroelectronics
---- ---- ---- ---- ---- ---- ----
Breker Systems' Trek tool has been refreshing: a new look at how to do
functional verification. As verification engineers, we all look at
outcomes to figure out our test plan. Breker takes that to a new level,
where we can create graphs based on outcomes, and even more power comes
when one can apply static analysis to see if all the outcomes can
really be reached by what was created.
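That static reachability check might look like this in miniature (a
hypothetical sketch; the graph and names are mine, not Breker's):

```python
# Invented sketch: given a scenario graph of outcomes, report which
# outcomes can never be reached from the start node. An unreachable
# outcome means no generated test can ever exercise it.
from collections import deque

def unreachable(graph, start):
    """BFS from start; return the set of nodes never visited."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return set(graph) - seen

# Edges point from a scenario step to its possible outcomes.
graph = {
    "reset": ["idle"],
    "idle":  ["read", "write"],
    "read":  ["done"],
    "write": ["done"],
    "done":  [],
    "error": ["done"],   # no edge leads here, so it is unreachable
}
holes = unreachable(graph, "reset")
```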
Nextop was interesting: formal analysis so one can classify the
assertions it finds as coverage or as failures. What was really
interesting was how far they look into the logic... just a few cycles
to come up with the assertions. It can really assist designers in
adding more assertions earlier in the design cycle.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
Nextop : valuable assertions at low designer cost
Duolog : TLM / RTL system and SoC assembly
Breker : graph-based test generation, increased test abstraction
Proximus : system modeling
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
Breker Verification sells a tool, Trek, that generates testbenches
similar to Mentor's inFact tool, although they say it works very
differently and they claim inFact is aimed primarily at Questa. The
input to Trek is BNF, C or C++ and the Trek output is a testbench in
Verilog, C, C++, Vera or SystemVerilog. Breker says they keep their
graphs smaller than Mentor's inFact tool because they can use C/C++,
and they claim better performance as a result of that and of the
user-defined constraints. Trek does coverage graphically, which they
say is easier for the user to understand. They use model-based
generation to help know when they are done, which is the age-old
question in verification.
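The model-based "when are we done" idea can be sketched as path
enumeration over a scenario graph: every root-to-leaf path is one test,
so full coverage is well defined. A toy illustration of mine, not
Breker's implementation:

```python
# Invented sketch of graph-based test generation: enumerate every
# root-to-leaf path through a scenario graph; each path is a concrete
# test, and "done" means every path has been exercised.
def all_paths(graph, node):
    """Enumerate every path from node to a leaf of the scenario graph."""
    if not graph.get(node):
        return [[node]]
    return [[node] + rest
            for nxt in graph[node]
            for rest in all_paths(graph, nxt)]

# Hypothetical SoC scenario graph.
graph = {
    "config":    ["dma_read", "dma_write"],
    "dma_read":  ["check"],
    "dma_write": ["check"],
    "check":     [],
}
tests = all_paths(graph, "config")
```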
Zocalo Tech sells a tool that manages System Verilog assertions. It
helps identify critical signals, allows graphical coding of constructs,
and helps debug assertions.
NextOp sells a tool, BugScope, that takes in RTL and a testbench and
produces assertions in SVA, PSL or Verilog. These can be thought of
either as assertions that additional testbenches should meet, or as
functional coverage holes (interesting how that works); the user gets
to pick which.
- John Weiland of Abraxas Corp.
---- ---- ---- ---- ---- ---- ----
The one that sticks out for me was Breker Trek. I've been keeping an
eye on their development of the tool for a while, and saw a demo. I
like that it translates from our feature-based verification planning
to what looks like straightforward coding of graphs to randomly verify.
I'll be interested to see if it reduces our time spent verifying.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
Zocalo's tools are new and interesting. Unique ABV GUI dev and debug.
NextOp BugScope tool generates assertions. I didn't know the tech
existed.
AMIQ's Eclipse DVT. I love Eclipse and this add-on is very nice.
- [ An Anon Engineer ]
---- ---- ---- ---- ---- ---- ----
The Zocalo-tech toolset for assertions was very, very interesting. I DO
find the set of tools extremely useful for
a) the identification of zones that need to be addressed by assertions;
b) the ease of creating and understanding assertions with Visual SVA's
graphical entry (visualization of threads and concurrency);
c) the witnessing of test cases for verifying and debugging assertions;
d) the managing of assertions and metrics in terms of progress and zones
that need to be addressed.
I also like the navigation tools (i.e., the GUI) for editing code,
viewing the hierarchy, viewing the assertion GUI and code, and viewing
the test cases.
- Ben Cohen, author of "Real Chip Design and Verification"