Importance of Data Management in SoC Verification
by Pawan Fangaria on 04-22-2014 at 6:00 am

In an era of SoCs with millions of gates, hundreds of IPs, and designs verified through several stages of transformation at different levels of hierarchy, it is increasingly difficult to handle such large volumes of data consistently and efficiently. The hardware, the software, and their interactions must be kept consistent through appropriate files and interfaces. The SoC has to be verified thoroughly, with all corner cases covered, to capture a short window of market opportunity while keeping the NRE (non-recurring engineering) cost within limits. An SoC that was working with a particular design database, set of scripts, and collection of files may stop working because of a version mismatch in a single file. Sorting out such an issue places a significant burden on the design and verification teams, wastes time and effort, and ultimately hurts designer productivity, cost, and turnaround time.
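
To make the single-file-mismatch problem concrete, here is a minimal sketch in Python (not any specific vendor's tool) that records a manifest of content hashes for a known-good design configuration and flags any file in the workspace that has since drifted. The workspace path and manifest file name are hypothetical.

```python
import hashlib
import json
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def write_manifest(workspace: Path, manifest: Path) -> None:
    """Record the digest of every file in a known-good workspace."""
    digests = {str(p.relative_to(workspace)): file_digest(p)
               for p in sorted(workspace.rglob("*")) if p.is_file()}
    manifest.write_text(json.dumps(digests, indent=2))

def check_manifest(workspace: Path, manifest: Path) -> list[str]:
    """Return files that are missing or differ from the recorded digests."""
    recorded = json.loads(manifest.read_text())
    return [rel for rel, digest in recorded.items()
            if not (workspace / rel).is_file()
            or file_digest(workspace / rel) != digest]

# Hypothetical usage:
# write_manifest(Path("soc_ws"), Path("golden_manifest.json"))  # once, when green
# stale = check_manifest(Path("soc_ws"), Path("golden_manifest.json"))
# if stale: print("version mismatch in:", stale)
```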

While going through a research thesis from the University of Michigan on the design and verification of digital systems, I noted the flow of the semiconductor design cycle from specification to the manufacturing and packaging of chips. The actual flow is generally more complex, including virtual prototyping above RTL and several sub-steps and iterations at every stage. Verification at each stage can cover several aspects: function, timing, power, reliability (electromigration, EMI, thermal, etc.), physical layout, area, and so on. These are checked through several means, such as logic and timing simulation (static and dynamic), SPICE-level simulation, formal verification, equivalence checking, power verification and optimization, DRC, ERC, and emulation for firmware. Several trial-and-error iterations are made throughout the flow, and ECO (Engineering Change Order) loops are introduced at the final layout stage, where the design is vulnerable to inconsistencies between the layout and the RTL or any intermediate stage. It is essential to keep checking the consistency of the design between different stages of the flow: for example, equivalence between the RTL and the gate-level netlist, or layout versus schematic and netlist. Test vectors and testbenches also have to be generated and maintained.
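
One lightweight way to track this kind of stage-to-stage consistency is to stamp each derived artifact with a digest of the source it was generated from, then verify the whole chain before sign-off. The sketch below, with hypothetical file names, only illustrates the bookkeeping; the semantic comparison itself is done by equivalence-checking and LVS tools.

```python
import hashlib
import json
from pathlib import Path

def digest(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def prov_path(artifact: Path) -> Path:
    # Sidecar file that remembers which source the artifact came from.
    return artifact.with_name(artifact.name + ".prov")

def stamp(artifact: Path, source: Path) -> None:
    """Record, next to the artifact, the digest of its source file."""
    meta = {"source": source.name, "source_digest": digest(source)}
    prov_path(artifact).write_text(json.dumps(meta))

def verify_chain(stages: list[Path]) -> bool:
    """Check each stage's stamp against the live version of its predecessor."""
    for src, art in zip(stages, stages[1:]):
        meta = json.loads(prov_path(art).read_text())
        if meta["source_digest"] != digest(src):
            print(f"stale: {art.name} was derived from an older {src.name}")
            return False
    return True

# Hypothetical chain: RTL -> gate-level netlist -> layout
# verify_chain([Path("cpu.sv"), Path("cpu_gate.v"), Path("cpu.def")])
```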

In such a scenario, where test data is as significant as the design data, if not more so, proper data management and configuration control are a must. What if the final SoC, integrating hundreds of IPs, starts failing at a particular output point? Checking through every IP would be prohibitively expensive. With data management and configuration control in place, the verification engineer can safely extract the older versions of the required files and database and debug the isolated case with minimum effort. It is worth pointing out that most hidden bugs appear at the final top level, where they can be very frustrating and expensive to solve without a design data management system in place; you may be left with no choice but to compromise on the overall quality of the chip.
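
As a rough illustration of "extracting the older versions of the required files", the sketch below assumes snapshots have been archived as plain directories under snapshots/<tag>/ (ClioSoft SOS has its own repository and commands; this is only a generic stand-in). Restoring a tagged tree into a scratch workspace lets the engineer rerun the failing case against known-good data.

```python
import shutil
from pathlib import Path

SNAPSHOT_ROOT = Path("snapshots")   # hypothetical archive location

def extract_snapshot(tag: str, dest: Path) -> Path:
    """Materialize an archived snapshot into a clean debug workspace."""
    src = SNAPSHOT_ROOT / tag
    if not src.is_dir():
        raise FileNotFoundError(f"no snapshot tagged {tag!r}")
    if dest.exists():
        shutil.rmtree(dest)         # always debug from a clean tree
    shutil.copytree(src, dest)
    return dest

# Hypothetical usage: rerun the failing IP's testbench against last
# week's known-good files without disturbing the live workspace.
# ws = extract_snapshot("nightly_2014-04-15", Path("debug_ws"))
```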

During the design cycle, different levels of hierarchy are created and collapsed as the design evolves and is optimized through various partitioning and merging schemes. This calls for robust data management that can record the various hierarchical configurations and keep data consistent across levels of hierarchy. With several versions of behavioral models, testbenches, simulation vectors, RTL and gate-level netlists, and so on, verification engineers have started versioning snapshots of a working verification setup. Given the short time frame engineers have for validating the design, the snapshots form a safety net to fall back on if something goes wrong in the verification setup. Taking snapshots of a working setup also helps engineers debug issues in their current setup and narrow the changes down to specific files, as the sketch below suggests.
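
Snapshots make narrowing down a breakage almost mechanical: with an ordered history of snapshot tags and a pass/fail regression check, a binary search finds the first failing snapshot in O(log n) runs, localizing the problem to the files that changed between two adjacent snapshots. The run_regression hook below is hypothetical.

```python
from typing import Callable, Optional, Sequence

def first_failing(tags: Sequence[str],
                  passes: Callable[[str], bool]) -> Optional[str]:
    """Binary-search an ordered snapshot history for the first failure.

    Assumes tags run oldest-to-newest and that once a snapshot fails,
    every later one fails too (a single, persistent breakage).
    """
    lo, hi = 0, len(tags) - 1
    culprit = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if passes(tags[mid]):
            lo = mid + 1            # still good: breakage is later
        else:
            culprit = tags[mid]     # failing: look for an earlier failure
            hi = mid - 1
    return culprit

# Hypothetical hook: restore the tagged snapshot, rerun the regression.
# def run_regression(tag: str) -> bool: ...
# bad_tag = first_failing(nightly_tags, run_regression)
# The diff between bad_tag and its predecessor pinpoints the changed files.
```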

As the design size increases, the design and verification teams grow as well. A team may be spread across the globe, requiring the data management system to control access to data across multiple teams, whether at the same site or at sites worldwide, in order to maintain data integrity.

Design data management is extremely important not only for SoCs but also for IPs and IP subsystems. The designers developing an SoC or IP are typically spread over different design sites across geographical boundaries and have to collaborate closely to meet ever-shrinking time-to-market windows. Here, a design data management system is extremely useful for versioning, release, and derivative management. The core design and verification flow is more or less the same for IPs as for SoCs, and the quality of the IPs plays a significant role in the overall quality of the SoC that contains them. Proper data management of IPs facilitates more efficient reuse across SoCs.

As I read the paper, I couldn't help but think of companies like ClioSoft, which gives prime importance to these aspects of the overall semiconductor design flow in its design data management solution. The SOS platform provides flexible administration of data through easy-to-use GUIs; fast, worldwide, real-time access to data; protected sharing of design data, libraries, design kits, and IPs; release and derivative management; and, of course, revision control. ClioSoft also provides an innovative tool, VDD (Visual Design Diff), which highlights the differences between two schematics or layouts, helping designers track mistakes propagated through them.

Also Read

The CAD Team – Unsung heroes in a successful tapeout

Cliosoft Grows Again!

High Quality PHY IPs Require Careful Management of Design Data and Processes
