Mentor Functional Verification Study 2016
by Bernard Murphy on 09-14-2016 at 7:00 am

Periodically, Mentor commissions a user/usage survey on Functional Verification, conducted by the Wilson Research Group, and then publishes the results to all of us, an act of industry good citizenship for which I think we owe them a round of thanks. Harry Foster at Mentor is breaking down the report into a series of 15 blogs. He will also host a webinar on September 20th at 8am Pacific to talk through the results.

REGISTER FOR THE WEBINAR

One especially interesting conclusion to come out of this survey is the rapid growth in verification investment, and in sophistication of methodology, for FPGA-based designs. For those of us who still struggle to understand why anyone even bothers with functional verification for FPGA, this may come as a surprise. One view might be that you design, do a bit of sanity-checking, program the device and test at speed in-system, fix any problems you find, then repeat, an approach generally known as “burn and churn”.

For simple designs this may still be common practice, but it no longer represents a practical methodology for the bulk of FPGA design. To understand why, look first at who is using FPGAs today. About half the dollar volume (per Gartner) is in communications. Industrial and mil/aero together take about half of what’s left and the rest is divided up between consumer, automotive and data-processing. In many of these markets, FPGA-based design dominates either because device volumes can’t justify ASIC NREs or because designs must be built to be adaptable to rapidly evolving standards.

When FPGAs carry the bulk of functionality, those designs become significantly more complex. 59% have at least one embedded processor and 32% contain 2 or more embedded processors. These are programmable SoCs, not general-purpose programmable logic. A Zynq-7000 SoC (not the most complex SoC offered by Xilinx) provides a dual-core ARM Cortex-A9 MPCore, multiple DDR interfaces, USB, Gigabit Ethernet and SD/SDIO interfaces, and a full range of security features. Verifying a design built around all of this, together with the software running on those cores, is every bit as complex as verifying a full-ASIC SoC.

At this complexity, it’s really irrelevant that burn and churn saves you fab costs and fab cycle-time. There is no possible way you could ever converge on a working design through trial and error – you have to follow the same disciplined verification methodologies used for ASIC designs. This is partly a function of the intrinsic complexity of the verification task – interoperating CPUs, memory, peripherals and security, plus your own programmed logic – and partly a result of extremely limited controllability and observability in the programmed device. The debug options you have are limited to external pins and debugger access to memory and state registers, which may be OK for software debug but is definitely not OK for hardware debug.

This means that FPGA verification teams find they must debug designs as comprehensively as possible before committing to burn. The survey shows there is growing use of coverage metrics and assertions, and constrained-random simulation to help get to higher levels of coverage. And, hold onto your hats, 15-20% of projects are using formal methods, spread among property-checking and automated formal checks. Take a moment to let that sink in – a non-trivial percentage of FPGA design teams find it essential to do some level of formal-proving before they burn a design. How times have changed – for FPGA design and for formal verification.
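To make the constrained-random, coverage-driven flow mentioned above concrete, here is a minimal conceptual sketch in plain Python. All of the names, constraints and coverage bins are hypothetical illustrations, not the API of any real verification tool: stimulus is randomized within legal constraints, functional coverage bins record what has been exercised, and the test runs until coverage closes.

```python
import random

# Conceptual constrained-random stimulus with functional coverage.
# Everything here (opcodes, burst lengths, address window) is an
# illustrative assumption, not taken from any specific tool or design.

OPCODES = ["READ", "WRITE"]
BURSTS = [1, 2, 4, 8]

def random_txn():
    """One constrained-random transaction: legal opcode and burst,
    word-aligned address inside a 64 KiB window."""
    return {
        "op": random.choice(OPCODES),
        "burst": random.choice(BURSTS),
        "addr": random.randrange(0, 1 << 16) & ~0x3,  # alignment constraint
    }

def run_until_covered(max_txns=10000):
    """Drive random transactions until every (op, burst) coverage bin
    has been hit at least once; return how many transactions it took."""
    goal = {(op, b) for op in OPCODES for b in BURSTS}  # 8 bins
    hit = set()
    for n in range(1, max_txns + 1):
        txn = random_txn()
        hit.add((txn["op"], txn["burst"]))
        if hit == goal:
            return n
    return None  # coverage did not close within the budget

print("coverage closed after", run_until_covered(), "transactions")
```

The point of the sketch is the shift in mindset the survey reports: rather than hand-writing directed tests, the team constrains randomness and lets coverage metrics decide when verification is done.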

In fact, the evidence shows that FPGA verification investment (in engineers and in advanced verification methodologies) is maturing quite rapidly, much more so than in ASIC/IC design, where maturity seems to have flattened out. This isn’t to say that ASIC/IC teams are laggards; per Harry, they just got to the (current) peak faster and are now having to throw more bodies at the problem as the verification task grows.

Looking at design overall, the survey measured, among other factors, demand for design engineers and for verification engineers. The Wilson study shows compound annual growth-rate (CAGR) for design engineers more or less steady at 3.6%, which Harry attributes to improvements in automation and IP reuse. But the CAGR for verification engineers is 10.4%, a much, much faster rate of growth reflecting (my conclusion) less rapid progress in automation and reuse in verification.
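Those two growth rates compound into a striking divergence. Taking the survey’s 3.6% and 10.4% CAGR figures and an arbitrary starting headcount of 100 for each role (the starting number is my illustration, not from the study):

```python
# Compound the survey's CAGR figures over a decade to see how fast
# verification headcount outpaces design headcount. The 3.6% and 10.4%
# rates are from the Wilson study; the base of 100 is illustrative.

def grow(start, cagr, years):
    """Headcount after `years` of compound annual growth at rate `cagr`."""
    return start * (1 + cagr) ** years

design = grow(100, 0.036, 10)   # ~142 after 10 years
verif = grow(100, 0.104, 10)    # ~269 after 10 years

print(f"design: {design:.0f}, verification: {verif:.0f}")
print(f"ratio after 10 years: {verif / design:.2f}x")
```

In other words, at these rates a verification team that starts at parity with its design team is nearly twice its size ten years later.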

In the webinar, Harry will provide detailed stats for FPGA and ASIC/IC design. This should be a very useful benchmark for verification teams who want to understand how they stack up against industry norms. It’s certainly an eye-opener on how much verification methods have evolved for FPGA-based design.

REGISTER FOR THE WEBINAR

You can access Harry’s blogs HERE. As of this post he has published 7 of the series of 15. Links to additional blogs should appear under this link as they are posted.
