What’s New in Functional Verification Debug
by Daniel Payne on 06-28-2015 at 7:00 am

We often think of EDA vendors as competing with each other, using proprietary data formats to make it difficult for users to mix and match tools or build efficient tool flows. At the recent DAC event in San Francisco I was pleasantly surprised to hear that two EDA vendors had decided to cooperate, rather than create incompatible formats, in the area of functional verification debug.

Related – Are There Trojans in Your Silicon? You Don’t Know

VCD
The Value Change Dump file format has been an industry standard since 1995, as part of IEEE 1364-1995, and it works well enough for smaller designs. As design size grows, however, a VCD file can reach many gigabytes, which slows EDA tools considerably during loading, parsing, and operation. EDA vendors responded with proprietary VCD extensions and binary formats, but no universal replacement has been widely adopted.
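To see why VCD files balloon, it helps to remember that the format is plain text: every value change on every signal becomes its own line. Here is a minimal sketch (my own illustration, not taken from the article) of a tiny VCD writer:

```python
# Illustrative sketch: a minimal VCD emitter showing the text format.
# Every (time, signal, value) event becomes a text line, which is why
# VCD files for large designs grow to many gigabytes.
def write_vcd(changes):
    """changes: list of (time, short_id, value) tuples, sorted by time."""
    lines = [
        "$timescale 1ns $end",
        "$scope module top $end",
        "$var wire 1 ! clk $end",   # '!' is the short identifier for clk
        '$var wire 1 " rst $end',   # '"' is the short identifier for rst
        "$upscope $end",
        "$enddefinitions $end",
    ]
    last_time = None
    for t, sig, val in changes:
        if t != last_time:
            lines.append(f"#{t}")    # timestamp marker
            last_time = t
        lines.append(f"{val}{sig}")  # e.g. '1!' means clk changes to 1
    return "\n".join(lines)

vcd = write_vcd([(0, "!", 0), (0, '"', 1), (5, "!", 1), (10, "!", 0)])
print(vcd)
```

Multiply the handful of lines above by billions of events across millions of signals and the scaling problem is obvious.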

Cooperation
So the technologists at Cadence and Mentor Graphics decided to cooperate and create a successor to VCD, so that modern SoCs with billions of transistors and their massive waveform databases can be functionally verified efficiently, saving users time. Ellie Burns from Mentor and Adam Sherer from Cadence presented at the Verification Academy booth at DAC. I first met Ellie at Viewlogic back in the 1990s and have kept in touch over the years; she also lives nearby in beautiful Oregon.

What these companies are proposing is a Debug Data API (DDA) to allow any EDA tool to create or view debug waveform data. Dennis Brophy of Mentor Graphics also wrote an informative blog about DDA earlier this month. Here’s how the DDA works:
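In spirit, the idea is a common interface that each vendor implements against its own native waveform database, so any viewer can read any simulator's data. The sketch below is purely hypothetical: the names and method signatures are my own illustration, not the actual DDA interface, which is defined by the Apache-licensed code base mentioned later.

```python
# Hypothetical sketch of a plugin-style debug-data interface.
# Names here (WaveformReader, signals, value_changes) are illustrative
# assumptions, NOT the real DDA API.
from abc import ABC, abstractmethod

class WaveformReader(ABC):
    """Each vendor implements this against its native waveform format."""
    @abstractmethod
    def signals(self):
        """Return the signal names available in the database."""
    @abstractmethod
    def value_changes(self, signal):
        """Yield (time, value) pairs for one signal."""

class ToyReader(WaveformReader):
    """In-memory stand-in for a vendor's native-format reader."""
    def __init__(self, data):
        self._data = data  # {signal: [(time, value), ...]}
    def signals(self):
        return sorted(self._data)
    def value_changes(self, signal):
        yield from self._data[signal]

# Any viewer can consume any reader through the common interface:
reader = ToyReader({"clk": [(0, 0), (5, 1)], "rst": [(0, 1)]})
for sig in reader.signals():
    print(sig, list(reader.value_changes(sig)))
```

The key design point is that the viewer never needs to know the on-disk format; it only talks to the interface, which is what lets Questa data open in SimVision and vice versa.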

Cadence has validated this new DDA with their SST2 waveform format, and Mentor with their Visualizer. Some of the benefits of the DDA are:

  • VCD interoperability
  • Data portability
  • Openness

Adam Sherer blogged about how the DDA uses an open, Apache-licensed source code base so that each EDA vendor can optimize the interface implementation for their own tools.

Related – Getting the Best Dynamic Power Analysis Numbers

A demonstration showed simulation data created in the Mentor Questa simulator, then viewed with the Cadence SimVision tool. Speaking of Questa, I just learned that it has been updated to run regression tests up to 4X faster, the new Visualizer Debug Environment is 2-5X faster while using less memory, verification management coverage data collection is now up to 10X faster, and the formal apps can run up to 8X quicker.

Cadence has committed to using this DDA approach with their newly announced Indago tool.

Related – SoC Debugging Just got a Speed Boost

Next Steps
If you’d like to get involved with the definition and use of DDA, then consider joining this group as they meet in the Valley on July 14th to review the specification.
