( ESNUG 568 Item 5 ) -------------------------------------------- [03/24/17]
Subject: Fishtail CEO balks at "messed up" Fishtail vs. Excellicon letter
For runtime, Excellicon was 3X faster here. On a 50M instances design,
Excellicon ConCert finished the equivalence checking in 16 minutes total,
versus Fishtail Confirm + Refocus took 46 minutes.
(Note: we did not do a thorough evaluation of the QoR of Fishtail's EC
results, so we cannot comment on it. We just ran Excellicon ConCert, and
since it correctly pinpointed the equivalences in a very short time, we did
not bother looking closely at the final Fishtail EC result.)
- user details 5 major differences between Fishtail and Excellicon
From: [ Ajay Daga of Fishtail ]
Hi, John,
The user who wrote this said: "On a 50M instances design, Excellicon ConCert
finished the equivalence checking in 16 minutes total, versus Fishtail
Confirm + Refocus took 46 minutes". Are you kidding me? Do you seriously
know of any EDA tool that could read in a 50M instance design and provide
you worthwhile results of any type in 46 minutes? -- let alone 16 minutes???
On a 50M instance design, I would expect our Fishtail runtime to be measured
in hours. I will gladly pay for this author to attend DAC'17 at FishTail's
expense -- flight, hotel, food, everything -- if he can demonstrate in our
booth, using our software, that he performed equivalence checking on a 50M
instance design in 46 minutes.
---- ---- ---- ---- ---- ---- ----
One of the biggest differences we saw between the two SDC tools is how the
SystemVerilog Assertions (SVAs) are generated by both vendors. ...
Here, Fishtail uses a very bad approach. The same equations that are used
to formally verify the exceptions in Fishtail are simply translated to SVA
format. Thus, if Fishtail's formal passes an exception, the corresponding
SVA will always pass. No checking is going on!
The contributor says that FishTail uses a "very bad approach to SVA" because
"the same equations that are used to formally verify the exceptions in
Fishtail are simply translated to SVA format". There are only two ways to
verify an MCP using simulation with SVAs:
1.) You either inject an "X" on the startpoint when it transitions
and check to see whether the "X" gets to the endpoint, OR ...
2.) you write up the condition that should not be true when the
startpoint transitions and then check that it isn't true.
Our Fishtail SVA solution allows users to pick either of these 2 approaches.
There are no other ways to solve this problem.
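As a rough sketch of what approach #2 looks like in SVA form (all signal
names here -- clk, start_q, capture_en -- are hypothetical placeholders, not
output from either vendor's tool):

```systemverilog
module mcp2_check (input logic clk, start_q, capture_en);

  // Approach #2 for a 2-cycle MCP: whenever the startpoint register
  // toggles, the endpoint's capture enable must be low on the very
  // next clock edge, giving the data the extra cycle to settle.
  property mcp2_no_early_capture;
    @(posedge clk) !$stable(start_q) |=> !capture_en;
  endproperty

  assert property (mcp2_no_early_capture);

  // Approach #1 (X-injection) is instead done with testbench code
  // that forces an X onto the startpoint when it transitions and
  // checks that the X never propagates to the endpoint's capture.
endmodule
```

Either style catches a capture one cycle too early; the difference is whether
you encode the "must not capture" condition explicitly (#2) or let X-propagation
find it for you (#1).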
---- ---- ---- ---- ---- ---- ----
On structural exceptions, Fishtail's formal engine report shows the failure
as equations, with variables corresponding to the nets as the terms of the
equation. Excellicon, on the other hand, shows the stimuli pointing to why,
for example, an MCP is 3 cycles instead of 2, or why a specific path is
false.
From a reporting perspective, we prefer Excellicon's approach, as it is
easier to understand the stimuli values affecting a path than to work
through huge equations. After all, the simulation that we do is all
stimuli-based anyway. Equations, although good for automated verification,
are not intuitive for humans to follow.
The contributor says that FishTail does not generate stimulus for formal
exception verification failures. Wow. The waveform I have here below is
for an MCP verification failure that was generated by FishTail. It must
not have been seen as "stimulus" by this author. Of course, we generate
the stimulus that shows why an exception fails verification! In fact, we
allow engineers to walk back in simulation from the failure state and
understand why signals take the values that they do -- so our users can
understand why the failure happens.
The author also says that Excellicon has a "Timing-Intent-based exception
qualification". This is bullshit. Designers don't trust any EDA tool to
understand the human engineer's motivation for an exception -- they just
expect a tool to do its job and flag real issues without swamping them
in noise.
    "In our customers' SDCs, true structural exceptions are rare -- 99%
     of their exceptions are the "timing intent" type -- but
it's still nice to see a comprehensive exception verification
that can be performed in one shot."
With FishTail, barely 1-3% of the timing exceptions specified by an engineer
fail verification once that engineer is done with formal and ABV. So I don't
get the author saying 99% of his timing exceptions do not lend themselves to
verification by Excellicon -- why would you invest in a tool that cannot
help you with 99% of your constraints?
---- ---- ---- ---- ---- ---- ----
Fishtail does not generate multi-mode constraints while Excellicon does.
Here is why:
Above, if there are two clocks merging at a MUX, Fishtail will identify both
clocks and propagate both of them by creating generated versions of the
primary clocks. ... Fishtail's method works, but it is only valid for
optimization engines such as RTL synthesis or P&R.
The author says that Excellicon has "Mode Synthesis", and FishTail does not.
We do. We call it "Report Modes". We automatically extract all of the
modes on the design. We report the case analysis required on input ports
or registers for a clock to propagate to its endpoints.
The author also states "Fishtail does not generate multi-mode constraints
while Excellicon does". This is also not true. Fishtail generates multi-
mode constraints -- one SDC file per mode with case analysis autogenerated
for each mode and a description of which clocks propagate in each mode.
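A per-mode SDC file of the kind described here might look roughly like the
following sketch. The port names, clock names, and periods are all
hypothetical -- this is not either tool's actual output, just an
illustration of pinning a MUX select with case analysis so only one clock
propagates in a given mode:

```tcl
# Two clocks that merge at a MUX controlled by test_mode.
create_clock -name clk_func -period 2.0  [get_ports clk_func]
create_clock -name clk_test -period 10.0 [get_ports clk_test]

# Functional mode: pin the MUX select so only clk_func propagates
# to the downstream registers.  A separate SDC file for test mode
# would instead set test_mode to 1, propagating clk_test.
set_case_analysis 0 [get_ports test_mode]
```

With the select pinned, timing analysis in this mode sees a single clock at
the MUX output instead of both, which is the point of generating one SDC
file per mode.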
This so-called Fishtail/Excellicon user eval was seriously messed up.
Whatever.
- Ajay Daga
FishTail DA, Lake Oswego, OR
---- ---- ---- ---- ---- ---- ----
Related Articles
User details 5 major differences between Fishtail and Excellicon
User benchmarks Fishtail, Ausdia, Atrenta, Excellicon, Blue Pearl
Atrenta frustrated by user's flawed eval of 7 constraints tools