( ESNUG 537 Item 1 ) -------------------------------------------- [02/07/14]

From: [ Around The World In 80 Days ]
Subject: User benchmarks Fishtail, Ausdia, Atrenta, Excellicon, Blue Pearl

Hi, John,

Please keep me anonymous.

Through a few company acquisitions plus some direct purchases we found
ourselves with 7 SDC constraints tools (yes, 7!) in our portfolio, and
despite their use we still had timing closure issues from unclean SDC's.

We had to repeatedly consult with our frontend designers (who sometimes
lived half a planet away) in order to fix our SDC's. 

Things got worse when we asked the designers to run the tools as part of
their task of delivering RTL with corresponding SDC.  The designers rejected
our available tools because the frontend folks did not understand the
backend folks' timing requirements for timing closure.  This back and forth
was terribly time consuming.

Our management decided we had to clean up and streamline our flow.  They
required us to benchmark our 7 SDC constraints tools in order to pare the
list down to 1 or possibly 2 tools.

After looking into them, we immediately dropped Blue Pearl, Cadence CCD,
and Synopsys GCA from the list:

    1. Blue Pearl was simply too primitive.

    2. Cadence themselves are not promoting their CCD product,
       and their support for it is lackluster.

    3. Synopsys GCA works only at gate-level with no exceptions
       verification capability, which failed our objective of a
       full RTL-to-GDS-II constraints methodology.

After those first cuts, we decided to proceed with benchmarking the tools
from Fishtail, Ausdia, Excellicon and Atrenta.  Our eval was carried out by
3 BU's on various types of designs, plus our own central CAD team on
multiple designs.  Since some of the EDA vendors provide generation and
verification of constraints in separate, distinct tools, while others lump
both into the same tool, we evaluated these capabilities independently to
focus on the features we were most interested in.
 
         ----    ----    ----    ----    ----    ----   ----

SDC VERIFICATION TOOLS:

Looking only at SDC verification, we found significant differences existed
between them.  We classified them as follows:
 
    1. SDC Linters - The SDC linters are based on generic checkers
       using older technology and are primarily rule-based, i.e.,
       rules are used to check the syntactical correctness of the
       SDC.  Atrenta Spyglass, Synopsys GCA, and Cadence CCD fall
       in this category.  Fishtail Confirm for the most part also
       falls in this category, although some parts of it overlap
       into the SDC Analyzer category.

    2. SDC Analyzers - These perform SDC linting, but their primary
       focus is to analyze the timing intent of the SDC on the
       design using semi-formal techniques.  For example, our SDC
       may be syntactically correct, yet its application on the
       design is still incorrect.  These types of problems are
       very hard to find and usually only show up during the
       gate-level sims.  Excellicon Concert and Ausdia TimeVision
       belong to this category.
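
A hypothetical example of the kind of problem an analyzer catches but a
linter misses (all pin, clock, and instance names below are invented for
illustration):

```tcl
# Syntactically legal SDC -- a rule-based linter passes it without
# complaint.
create_clock -name core_clk -period 2.0 [get_ports clk_in]

# Intent error: this "false path" actually covers a real functional
# transfer inside the same synchronous domain, so the path is silently
# left untimed until gate-level sims (or silicon) expose it.  An SDC
# analyzer flags it because the launch and capture flops are clocked
# by the same clock.
set_false_path -from [get_pins u_ctrl/state_reg*/CK] \
               -to   [get_pins u_dp/data_reg*/D]
```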

RESULTS:

After detailed testing by our engineers, for each criterion listed below we
assigned a number between 1 and 5 -- with 5 being the best.

    Criteria                     Confirm     Spyglass     TimeVision  Concert
                                 (Fishtail)  Constraints  (Ausdia)    (Excellicon)
                                             (Atrenta)
    -------------------------------------------------------------------------
    Performance                      3           4            5           4
    Noise                            3           1            5           5
    SDC Linting                      4           5            4           4
    Exceptions Verification          5           4            4           4
    Timing Intent Verification       1           0            4           5
    Multi-Mode                       2           2            4           5
    Debug                            3           4            5           5
    Ease of Use                      1           4            3           4
    Shell                            5           3            5           5
    Display                          4           4            1           5
    -------------------------------------------------------------------------
    Total                           31          31           40          46

NOTES:

   - Performance: Here Ausdia came out on top.  Their TimeVision
     was faster than the rest, closely followed by Excellicon and
     Atrenta.  The difference on large designs (>300 M gates) was
     less than a minute, while on small designs it was negligible.

   - Noise: Atrenta was the noisiest of all the tools.  This was
     expected, as they have over 1000 rules which, when triggered
     multiple times, create a lot of noise; par for the course for
     a generic checker.  Both Excellicon and Ausdia fared much
     better here.

   - SDC Linting: All tools performed excellently in this area.
     Excellicon came out slightly below the other tools.  The main
     reason is that Concert extrapolates the SDC intent and keeps
     silent when the outcome is unambiguous.  Linting happens
     automatically as opposed to running a bunch of rules.  The
     rules cannot be turned off, which is why we dinged Excellicon;
     note, though, that the rules are design-oriented and we could
     not find one that actually needed to be turned off.

   - Exceptions Verification: Fishtail was undoubtedly the best in
     this category.  The rest of the tools followed very closely.
     All tools verified the exceptions correctly; the difference
     was in run-time.  Fishtail outperformed the rest by more than
     a minute on our large designs.
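
For context, the exceptions being verified are the usual false-path and
multicycle-path directives; a sketch of the kind of claims a verifier must
prove against the netlist (instance and clock names are invented):

```tcl
# Claim: no functional path exists between these two clock domains.
set_false_path -from [get_clocks usb_clk] -to [get_clocks ddr_clk]

# Claim: this datapath has 2 cycles to settle (setup), with hold
# still checked against the original edge.
set_multicycle_path 2 -setup -from [get_pins u_mult/out_reg*/CK] \
                             -to   [get_pins u_acc/sum_reg*/D]
set_multicycle_path 1 -hold  -from [get_pins u_mult/out_reg*/CK] \
                             -to   [get_pins u_acc/sum_reg*/D]
```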

   - Timing Intent Verification: Atrenta does not do this kind of
     verification at all.  Fishtail has just started exploring this
     space when doing clock groups verification.  Both Excellicon
     and Ausdia came out on top, with Excellicon taking the lead.
     Both tools employ CDC checking as part of timing constraints
     verification.  However, Excellicon has an edge over Ausdia
     because the timing data out of PrimeTime is analyzed by the
     new Excellicon tool called "ConStar" that plugs into Concert.
     Through this package, the constraints and the resulting
     timing data are analyzed together.  This was a unique and
     powerful combination that the other tools did not have.

   - Multi-Mode: Excellicon took the lead with Ausdia coming in
     a close 2nd.  Here we were looking at the verification of
     multiple SDC's on the design so that only common issues such
     as unconstrained flops across all modes are reported.
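
To make the multi-mode scenario concrete: the same design carries one SDC
per mode, and only issues present in every mode should be reported.  A
minimal sketch, with file, port, and clock names invented:

```tcl
# func.sdc -- functional mode
create_clock -name func_clk -period 1.25 [get_ports sys_clk]
set_case_analysis 0 [get_ports scan_en]

# scan.sdc -- scan shift mode
create_clock -name scan_clk -period 10.0 [get_ports sys_clk]
set_case_analysis 1 [get_ports scan_en]

# A flop whose clock pin is reachable from neither clock definition is
# unconstrained in *all* modes -- that is the kind of issue multi-mode
# verification should report once, rather than once per mode.
```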

   - Debug: All tools provide debug capabilities.  TimeVision and
     Concert came out on top here.  Both tools provide debug
     through their shell interface, via their own commands or via
     regular Synopsys Tcl commands.  However, Concert provides an
     additional method of debugging through what is called the
     "TimingMap" -- which is covered in the Display section -- so
     we gave equal marks to both leading tools in this section.

   - Ease of Use: We found Fishtail to be very hard to use; a steep
     learning curve is required.  Additionally it's not suited for
     frontend designers who may not be familiar with the ins and
     outs of timing closure.  Without such knowledge the results
     produced by Fishtail have little meaning for frontend
     engineers, and as a result they hated running the Fishtail
     tool.  Ausdia TimeVision fared slightly better, but it was
     not intuitive, what with the lots and lots of Tasks you have
     to run.  Atrenta and Excellicon matched scores here.  We gave
     both of them 4's because we felt there is still some room for
     improvement.

   - Shell: All tools except Spyglass had a nice Tcl shell and
     catered to common Synopsys Tcl commands like all_fanout,
     get_cells, get_attribute etc.  Some of the more exotic ones
     like foreach_in_collection are also supported.  We had a
     hard time getting Spyglass to understand some of the exotic
     commands which is why we dinged them.
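
For reference, the "exotic" commands are Synopsys-style collection
iterators.  A typical snippet our scripts rely on (filter and attribute
names are illustrative of Synopsys-style shells; treat the details as
assumptions):

```tcl
# Walk every sequential cell in the design and dump its library ref.
foreach_in_collection cell \
        [get_cells -hierarchical -filter "is_sequential == true"] {
    puts [format "%s  ref=%s" \
              [get_object_name $cell] \
              [get_attribute $cell ref_name]]
}
```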

   - Display: Spyglass has a nice generic GUI, but it's not geared
     towards constraints.  Fishtail uses a mix of HTML and has
     hooks to Verdi.  TimeVision's so-called GUI is a joke --
     completely useless, so we gave it a 1.  Concert displays its
     own "TimingMap", which is essentially a graphical method of
     SDC visualization overlaid on top of your design structure.
     This was intuitive and useful, and it also hooks into Verdi.


FINDINGS:

The scores clearly illustrate the distinction between SDC linters and SDC
analyzer tools.  SDC linting, although useful, is overshadowed by the more
sophisticated and intelligent SDC analyzer tools.  Both Ausdia TimeVision
and Excellicon Concert outperformed the SDC linter tools; however, we gave
higher marks to Concert for its stable code -- Ausdia crashed on us a few
times -- plus its Multi-Mode ability, its Timing Intent Verification, and
finally its superior debug GUI.

         ----    ----    ----    ----    ----    ----   ----

SDC GENERATION:

On the SDC Generation side almost all tools claim to have this capability;
however, we found some differences between Excellicon ConMan and the rest
of the vendors' tools (TimeVision, Focus and Spyglass).
 
The biggest difference was the way the ConMan database works.  It holds the
multi-mode timing data in a single database, so SDC generation happens by
simply changing to the desired hierarchical instance and mode; all timing
data (in context) is immediately available.  In contrast, the other tools
I mentioned relied on specific commands to build and generate the final
data.
 
Many other differences exist and it is impossible to cover all of them here.
However, since our flow is not yet 100% generation-based, we evaluated the
common features amongst all tools in the area of SDC generation, knowing
full well that we were comparing apples to oranges.

RESULTS:

After detailed testing by our engineers, for each criterion listed below we
assigned a number between 1 and 5 -- with 5 being the best.

    Criteria                  Focus/Refocus  Spyglass     TimeVision  ConMan
                              (Fishtail)     Constraints  (Ausdia)    (Excellicon)
                                             (Atrenta)
    -------------------------------------------------------------------------
    Constraints Promotion           2            1             0           5
    Constraints Demotion            2            1             1           5
    Clock Detection                 3            3             5           5
    Mode Creation                   0            0             0           5
    FP/MCP Generation               5            0             3           4
    SDC Quality                     3            3             5           5
    Merged Mode SDC                 5            3             0           4
    Unknown IP SDC                  4            5             5           5
    Gates to RTL SDC                5            0             0           4
    Management                      1            1             1           5
    GUI                             4            3             1           5
    Shell                           5            4             5           5
    -------------------------------------------------------------------------
    Total                          39           24            26          57

NOTES:

   - Promotion: Constraints promotion was the most important feature
     for us because we always have silicon-verified constraints for
     previously taped-out IP's or other blocks.  However, when they
     are incorporated in a new design, we need to develop
     constraints for the top level and other newly designed logic.
     Without promotion we always run the risk of developing
     constraints that are out of context with the block-level
     constraints.  Thus promotion is the key feature for us.  All
     tools claim to have a constraints promotion capability;
     however, some tools do a better job than others.  For example,
     Fishtail provides two methods of constraints promotion: push
     constraints to the chip boundary, or add hierarchy to existing
     constraints.  Atrenta follows the latter approach.  Ausdia
     does not even have this feature.  The deficiency with all but
     Excellicon was that the process had to be broken up, meaning
     we had to run through many steps before the constraints were
     promoted.  With ConMan it's all done automatically.  All you
     have to do is change to any block or any mode and the
     constraints are available right there.  You can even import
     multiple block-level SDC's and promote them to the top in
     one shot.
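
To make the "add hierarchy" flavor of promotion concrete, here is a
minimal sketch (instance and pin names invented): a block-level exception
gets its object names re-rooted at the block's instance path in the chip.

```tcl
# Block-level SDC, written against the block's own objects:
set_false_path -from [get_pins ctrl/mode_reg*/CK] \
               -to   [get_pins sync/ff0_reg/D]

# The same exception promoted to the top level, where the block is
# instantiated as u_blk: every object name gains the instance prefix.
set_false_path -from [get_pins u_blk/ctrl/mode_reg*/CK] \
               -to   [get_pins u_blk/sync/ff0_reg/D]
```

Doing this by hand across hundreds of exceptions, clocks, and IO delays
is exactly the multi-step process the tools automate to varying degrees.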

   - Demotion: As expected, the scores here were identical to the
     promotion.  Same reasons as above.  With ConMan the demotion
     of constraints was incredibly easy.  As a matter of fact it
     is already done when the design is loaded.  The constraints
     are propagated automatically across the design and you can
     get the info you need at any time for any mode, any hierarchy.

   - Clock Detection: Focus and Spyglass did not do well here; their
     technology is based on simple tracing.  On one of our complex
     clocking designs, these two tools failed to identify some
     generated clocks.  TimeVision and ConMan both came out on top
     here.  TimeVision assigns percentages to each pin for possible
     clocks: the higher the percentage, the more likely the pin is
     a clock source.  This is great if you have some idea about
     the design, but for completely unknown IP, the clock
     determination will require more debug by other means (say,
     Verdi).  In contrast, ConMan does formal analysis and assigns
     labels to each clock, uniquely identifying clocks versus
     control pins.  It also does a "what-if" analysis where you can
     probe the behavior of each signal and its unateness (clock
     sense).  ConMan automatically extracts the clocks and tells
     you how many clock domains the design has, the clock gating
     structures, clock propagation paths, clock domain crossings,
     fanout loads, etc.  This allowed us to use the information to
     re-partition the clock gating structures in order to simplify
     clock tree synthesis.  We gave high marks to both Ausdia
     and Excellicon.
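
The generated clocks that trip up tracing-based tools are typically
divider outputs sitting behind gating or muxing.  A hypothetical example
of what correct extraction should produce (all names invented):

```tcl
# A divide-by-2 clock produced inside the design.  Netlist tracing can
# miss this when the divider hides behind clock gating or a mux; formal
# extraction derives it from the logic itself.
create_clock -name ref_clk -period 4.0 [get_ports refclk_pad]
create_generated_clock -name div2_clk \
    -source [get_ports refclk_pad] -divide_by 2 \
    [get_pins u_clkgen/div_reg/Q]
```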

   - Mode Creation: ConMan outshone everyone else in this area.  No
     other tool does automatic mode creation.  ConMan synthesizes
     the case analysis values based on the clock propagation paths.
     When we time the chip in several modes, creating those modes
     means manually identifying the pins which influence the clock
     propagation path and then setting appropriate values.  This
     is a painful and iterative process: run pre-layout STA to
     qualify that the clocks are indeed propagating through the
     correct paths, add more case_analysis if they are not, and
     repeat.  Very time consuming and error prone.  Also, there is
     always a danger of setting a pin value which conflicts with
     another value set downstream on another pin.  Through ConMan's
     Mode Synthesis this step is done automatically.  We found it
     to be extremely useful.  We were able to define 8 different
     modes on our complex SoC within an hour, as compared to weeks
     of manual work.
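
The case-analysis values being synthesized are plain set_case_analysis
settings on mode-control pins.  The manual version of what gets automated
looks roughly like this (port and instance names invented):

```tcl
# Manually steering the clock path into "functional" mode:
set_case_analysis 0 [get_ports test_mode]
set_case_analysis 1 [get_pins u_clkmux/sel]

# The iterative pain: run pre-layout STA, discover a clock still
# propagating down the test path, add another setting, repeat -- and
# hope no value conflicts with one set downstream on another pin.
```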

   - FP/MCP Generation: Fishtail came out on top here, with
     Excellicon following closely behind.  The difference was seen
     in run-time and in the quality of reports.  On our medium-size
     design (100 M gates), Fishtail generated FP and MCP exceptions
     in 2.34 hours and Excellicon in 2.37 hours.

   - SDC Quality: Again Excellicon's ConMan and Ausdia's TimeVision
     took the lead here.  However, this area cannot be judged
     independently; it ties in very closely with clock and mode
     detection.  The quality of ConMan's and TimeVision's generated
     SDC's was far superior to that of Fishtail's and Atrenta's.
     The biggest differences we found were in the IO delays and
     the clock groups.  Fishtail picked the wrong clock as the
     related clock in one of our test cases.  Atrenta did a poor
     job in clock group identification.  ConMan extracted the
     exact constraint syntax that we could use for synthesis, STA,
     and P&R.  Initially we were very skeptical about the quality
     of the constraints, but after a couple of trials and
     validation through PrimeTime we were convinced of the accuracy
     of the resulting constraints.  Both Ausdia and Excellicon get
     high marks here.

   - Merged Mode SDC: Here Fishtail has the lead; Excellicon came
     out second.  There are two approaches to merging: one where
     you put in an inter-clock FP, and one where you introduce
     additional clocks at the output of the combo logic where the
     clocks merge and use set_clock_groups with the correct
     exclusivities.  Fishtail uses the first approach, ConMan the
     second.  We would have given ConMan higher marks because with
     set_clock_groups you get more accurate SI computation;
     however, we did not because some of our designers use Cadence
     RTL Compiler for synthesis, set_clock_groups is not fully
     supported by RC, and ConMan's generated SDC therefore errored
     out when used with RC.
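
Sketched out, the two merging approaches look roughly like this (clock
and pin names invented); the second relies on the set_clock_groups
exclusivity semantics that RTL Compiler choked on:

```tcl
# Approach 1 (inter-clock FP): declare the transfer false both ways.
set_false_path -from [get_clocks clk_a] -to [get_clocks clk_b]
set_false_path -from [get_clocks clk_b] -to [get_clocks clk_a]

# Approach 2 (clocks at the merge point): define a clock per source at
# the mux output, then declare them mutually exclusive.  The clocks
# still exist for crosstalk windows, so SI analysis stays accurate,
# unlike with a blanket false path.
create_generated_clock -name clk_a_mux -source [get_pins u_mux/A] \
    -divide_by 1 [get_pins u_mux/Z]
create_generated_clock -name clk_b_mux -source [get_pins u_mux/B] \
    -divide_by 1 -add [get_pins u_mux/Z]
set_clock_groups -physically_exclusive \
    -group {clk_a_mux} -group {clk_b_mux}
```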

   - Unknown IP SDC: Pretty much solid marks for all tools.  We gave
     Fishtail slightly lower marks here because of the difficulty
     of setting up the flow, but otherwise it performed as well as
     the others.  We wanted to give ConMan higher marks here
     because of its ClockMap feature, but that is covered later.

   - Management: No tool comes close to ConMan in this area.  Its
     single database contains all timing constraints information
     from which any type of SDC for any mode or any block can be
     generated.  This is a huge plus for us.

   - GUI: All tools have GUI's and a Tcl shell, with TimeVision
     having the worst GUI among all the tools.  The TimeVision GUI
     is clearly an afterthought to get a check mark -- more like a
     poor man's attempt to put together something with no thought
     or direction.  It's an absolute waste of time.  Fishtail uses
     HTML as a GUI and also has links to Verdi, which is nice.
     Atrenta and ConMan have their own GUI's.  The look and feel
     of ConMan's GUI is very different from the other tools: more
     user/designer oriented, with features that are useful for,
     say, understanding the full-chip clocking structure or for
     architecting the clock tree -- the way we do it in the real
     world.  It also links nicely to Verdi, adding even more debug
     capability.  The GUI can be closed at any time and re-invoked
     from the shell.  Their ClockMap, along with ClockSim and
     IOSim, are effective displays that can be used for debug,
     analysis and design reviews.  I was able to quickly get info
     from the tool -- such as clock gating, clock path propagation
     for different modes of operation, and the number of flops per
     clock domain -- which is generally hard to track manually.
     An unintended consequence of the ConMan reports was the
     formalization of our design review process, revision control
     of our constraints, and documentation of all related
     information in our design review docs.

   - Shell Interface: Solid 5's across all tools except Spyglass.
     All tools provide comprehensive support for the Synopsys Tcl
     environment, but we had a hard time with Atrenta Spyglass in
     reading in one of the obscure Tcl commands.


FINDINGS:

As I mentioned before, Excellicon ConMan stands out on its own.  There were
unique features in ConMan that the other tools do not possess, so a
comparison did not make sense -- which is why we did not put them on the
criteria list (for example, writing out SDC with some blocks modeled as
ETMs, SDC simulation, links to the simulation world, etc).  Even the
criteria we did compare were handled so differently in ConMan that a direct
comparison against the other tools was difficult.

         ----    ----    ----    ----    ----    ----   ----

FINAL TALLY:

                             Excellicon   Fishtail   Ausdia   Atrenta
    Verification                 46          31         40       31
    Generation/Management        57          39         26       24
    -----------------------------------------------------------------
    Total                       103          70         66       55

We dumped all the other tools and standardized on the Excellicon ones.

Our initial objective of pushing the constraints generation to the frontend
is now possible since Excellicon tools don't require our frontend designers
to learn and manage the ins and outs of SDC -- which they despised before.
In our new flow, we now mandate the use of Concert for making sure our
existing constraints are good; and use ConMan to make sure the new design
constraints are good, too.

We have also been able to cut down on CDC setup time and deepen our analysis
capability, since the setup through ConMan also provides the initial setup
for CDC for any layer of hierarchy and any mode.  We used to do CDC analysis
only for functional mode and often did not fully complete the process
because of the noise and the time it took.  In general, setup with CDC tools
is a nightmare and very time consuming; ConMan makes it easier, and today we
can perform CDC analysis for various modes and any hierarchy, as opposed to
performing CDC on a subset of design blocks.  The tool also solved our setup
inconsistencies between the various layers of hierarchy.

Today, our backend team does not accept any constraints which have not been
certified by Concert or produced by ConMan.  SDC management has become very
simple since we now have a single database which we exchange between teams,
and it's much easier to track in our revision control process.

    - [ Around The World In 80 Days ]


