Real FPGAs don’t eat fake test vectors
by Don Dingee on 06-26-2014 at 8:00 am

Vector blasting hardware is as old as digital test methodology itself. In the days of relatively simple combinational and finite state machine logic, a set of vectors aimed broadside at inputs could shake loose most faults with observable outputs. With FPGAs, creating an effective set of artificial test vectors has become a lot less certain and a lot more time consuming.
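
To make the classic approach concrete, here is a minimal sketch in Python (standing in for an HDL testbench; the full adder and its arithmetic oracle are illustrative stand-ins, not anything from the article) of blasting an exhaustive vector set at a small combinational block:

def full_adder(a, b, cin):
    """Gate-style model of the device under test: returns (sum, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def oracle(a, b, cin):
    """Arithmetic reference the logic is judged against."""
    total = a + b + cin
    return total & 1, total >> 1

# Three inputs mean 2**3 = 8 vectors give exhaustive coverage here.
for v in range(8):
    a, b, cin = (v >> 2) & 1, (v >> 1) & 1, v & 1
    assert full_adder(a, b, cin) == oracle(a, b, cin), f"fault at vector {v:03b}"
print("all 8 vectors pass")

Eight vectors fully cover a full adder. The same brute force goes nowhere once state machines, pipelines, and wide buses enter the picture.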

Worse yet, one of the darlings of vector test, the random pattern set, is fairly easy to generate automatically and pretty good at shaking out manufacturing defects, yet all but useless in functional verification of FPGA designs. Creating patterns for requirements-based testing of FPGAs calls for much more intimate knowledge of IP block constructs, pipelining, clock and power domains, fabric interconnect, and other specifics.
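
A toy sketch shows the asymmetry; every name and number in it is an illustrative assumption, not anything from the article. Random 32-bit vectors catch a stuck-at-zero output almost immediately, yet the odds of hitting a functional bug that fires on one specific input value are roughly 10,000 in 2^32:

import random

MAGIC = 0xDEADBEEF  # functional corner: one specific value out of 2**32

def golden(x):
    return x  # intended behavior: pass the input through

def defective(x):
    return 0  # manufacturing-style fault: output stuck at zero

def functional_bug(x):
    return ~x & 0xFFFFFFFF if x == MAGIC else x  # wrong only at the corner

random.seed(0)
caught_defect = caught_bug = False
for _ in range(10_000):
    x = random.getrandbits(32)
    if defective(x) != golden(x):
        caught_defect = True   # any nonzero vector exposes the stuck output
    if functional_bug(x) != golden(x):
        caught_bug = True      # requires x == MAGIC: odds ~10,000 in 2**32

print("stuck-at defect caught:", caught_defect)    # True almost immediately
print("functional corner caught:", caught_bug)     # almost certainly False

Directed, requirements-driven sequences aim straight at corners like that one; random blasting wanders past them.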

When you absolutely, positively have to know an FPGA design works, there is no better place to start than the simulation testbench. It knows where the IP bodies are buried, so to speak, and it knows what patterns in what sequence are needed to exhume them. Discarding all that a priori knowledge in creating a set of artificial test vectors would be a shame, yet that is exactly what we tend to do when we toss a design over the wall for objective testing. Go out and make your own patterns, folks … but be quick about it, and don’t miss anything.
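
One way to keep that knowledge, sketched below with assumed names and a made-up CSV format rather than any particular tool's interface, is to capture the testbench's directed stimulus and golden responses into a vector file a hardware test station can replay:

import csv

def simulate(stimulus):
    """Stand-in for the golden simulation model (here, a trivial doubler)."""
    return (stimulus * 2) & 0xFF

# The testbench "knows" the meaningful sequence; capture it verbatim.
sequence = [0x01, 0x7F, 0x80, 0xFF]  # directed corner cases, not random

with open("vectors.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["cycle", "stimulus", "expected"])
    for cycle, stim in enumerate(sequence):
        writer.writerow([cycle, f"{stim:02X}", f"{simulate(stim):02X}"])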

No pressure there. Possible, but risky, both in terms of time and coverage. That risk is not lost on the safety-critical types, so they created the rigorous DO-254 certification process: prove to us, beyond a shadow of a doubt so we don’t have to worry, that this FPGA does exactly what it says it does, and does nothing else unexpected.

In that context, a recent quote from Elbit Systems about a successful DO-254 audit crossed my desk. On the surface, it sounds rather preposterous – how can something this seemingly routine be so difficult? Yet, when you understand the time-value-risk proposition involved, taking a moment for celebration is justified:

This is the first time in Elbit’s history that we have been able to bring more than 5 FPGA devices to the [European Aviation Safety Agency] audit.

As with any audit, the proof is in the proof. By using Aldec's DO-254/CTS, a combined hardware and software solution, Elbit was able not only to survive the EASA Level A audit, but to pass more designs faster and with higher confidence. More than just a simulator, the DO-254/CTS relies on a custom daughterboard containing the actual target FPGA. This allows not only at-speed testing, but also full visibility into I/O and the ability to test parameters like voltage and clock variation using scripts.
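
To illustrate only the idea of scripted margin testing (run_on_hardware below is an assumed stub, not Aldec's actual API), a sweep over hypothetical voltage and clock corners might look like this:

import itertools

def run_on_hardware(voltage, clock_mhz):
    """Stub standing in for the board interface: a real flow would program
    the supplies and clocks here, replay the captured vectors at speed, and
    return pass/fail from the comparison."""
    # Pretend the design margins out at low voltage and high clock.
    return not (voltage < 1.17 and clock_mhz > 102.0)

VOLTAGES = [1.14, 1.20, 1.26]    # hypothetical 1.2 V core rail, +/-5%
CLOCKS   = [95.0, 100.0, 105.0]  # hypothetical 100 MHz clock, +/-5%

for v, f in itertools.product(VOLTAGES, CLOCKS):
    result = "PASS" if run_on_hardware(v, f) else "FAIL"
    print(f"{v:.2f} V @ {f:5.1f} MHz: {result}")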

The parallelism of design, verification, and test in any FPGA project has a large payoff. If you can freeze a design, test it, and pass, that's wonderful, but unlikely in the real world for designs of anything but trivial complexity. A key benefit Elbit cites from their experience with the Aldec DO-254/CTS is the ability to quickly retest the FPGA after any change, automatically running tests directly from the simulation testbench, comparing results, and staying in sync with traceability to the requirements.
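
A minimal sketch of that retest loop, with made-up test names and requirement IDs, compares the hardware capture against the simulation golden and reports status per requirement:

golden = {                 # expected responses from the simulation testbench
    "tc_reset":    [0x00, 0x00],
    "tc_counter":  [0x01, 0x02],
    "tc_overflow": [0xFF, 0x00],
}
captured = {               # responses replayed and captured on the target FPGA
    "tc_reset":    [0x00, 0x00],
    "tc_counter":  [0x01, 0x02],
    "tc_overflow": [0xFF, 0x01],   # mismatch introduced by the latest change
}
trace = {                  # test-to-requirement traceability
    "tc_reset": "REQ-001",
    "tc_counter": "REQ-002",
    "tc_overflow": "REQ-003",
}

for test, expected in golden.items():
    status = "PASS" if captured.get(test) == expected else "FAIL"
    print(f"{trace[test]:8} {test:12} {status}")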

Much of the lore of testing is built on vector generation and the objectivity of test independent of design, but these results are compelling. Integrating test and traceability with design and simulation tools makes for better FPGA-based products, faster, in a DO-254 setting or otherwise.
