Cadence’s System-to-Silicon Verification Summit
by Randy Smith on 10-06-2013 at 6:00 pm

At this year's DAC, I spoke with several friends at Cadence, and I got the distinct impression that something at Cadence had changed. There was a sense of pride and accomplishment that, it seemed to me, had drifted away over the years. Employees were now speaking with true conviction about the accomplishments of the product development teams and the results of the company's renewed focus on R&D over the past couple of years. So, it was with high expectations that I attended the System-to-Silicon Verification Summit, held in the auditorium of Cadence Building 10 in San Jose on September 26, 2013. I was not disappointed.

The event was led by Brian Fuller, Cadence's new editor-in-chief. Opening remarks were provided by Charlie Huang, who recently became responsible for the System & Verification Group at Cadence in addition to his role leading Worldwide Field Operations. These were followed by keynote presentations, first by Jim Hogan and then by Brian Bailey, who gave their views on the scale and importance of verification in today's complex system designs. Gary Smith also participated in a panel in the afternoon, and there were presentations by representatives of NVIDIA, Broadcom, Zenverge, and Ambarella, as well as, of course, Cadence.

One trend in the discussions and presentations was the expanding scope of verification to include software. This is not new, but it is certainly becoming increasingly prominent. When I was VP of Sales & Marketing at TriMedia (a Philips Semiconductors spin-off), we made it a requirement to emulate the hardware (on Cadence Palladium) to prove we could boot the operating system. We had the further challenge of verifying the device drivers for much of the IP connected to the system bus. The diagram below, from Jim Hogan's presentation, illustrates the many layers of testing and their relationship to software in the system verification process.
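To give a feel for what that driver-level verification looks like, here is a minimal sketch of a register-level driver smoke test run against an emulated device model. This is my own illustration, not TriMedia's or Cadence's flow; the UART register map, names, and behavior are all hypothetical.

```python
# Minimal sketch of a device-driver smoke test against an emulated
# memory-mapped model. Register map and behavior are hypothetical.

class EmulatedUart:
    """Stand-in for a memory-mapped UART block inside an emulator."""
    CTRL, STATUS, TXDATA = 0x00, 0x04, 0x08
    STATUS_TX_READY = 0x1

    def __init__(self):
        self.regs = {self.CTRL: 0, self.STATUS: self.STATUS_TX_READY,
                     self.TXDATA: 0}
        self.tx_log = []  # bytes the "device" has transmitted

    def read(self, addr):
        return self.regs[addr]

    def write(self, addr, value):
        self.regs[addr] = value
        if addr == self.TXDATA:
            self.tx_log.append(value & 0xFF)  # device transmits the byte


def driver_send(dev, payload: bytes):
    """The driver under test: poll for TX-ready, then write each byte."""
    dev.write(dev.CTRL, 1)  # enable the block
    for b in payload:
        while not dev.read(dev.STATUS) & dev.STATUS_TX_READY:
            pass  # spin until the device can accept a byte
        dev.write(dev.TXDATA, b)


if __name__ == "__main__":
    uart = EmulatedUart()
    driver_send(uart, b"boot ok")
    assert bytes(uart.tx_log) == b"boot ok", "driver/device mismatch"
    print("driver smoke test passed")
```

In a real flow the `EmulatedUart` stand-in is replaced by the RTL running on the emulator, but the driver code stays the same, which is exactly what makes booting the OS on Palladium such a strong proof point.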

Related to this topic, on one side of the stage stood a new Cadence Palladium XP II. This system is a remarkable engineering achievement in itself. For the recent product announcement, click here. Cadence claims that the new Palladium XP II platform delivers a 2X increase in verification productivity, resulting in up to four months faster time to market, and that the enhanced System Development Suite delivers up to a 60X speed-up for embedded OS verification and a 10X performance increase in hardware/software verification. I wish we had had a box like that in our hands at TriMedia.

The panel discussion was quite lively. I think the primary takeaway was the increasing reliance on 'use cases' or 'scenarios'. It was suggested that Apple uses approximately 50,000 scenarios in the development of its iPhones. The point is that it is impossible to test all possibilities when you consider the large amount of software content and the interactions between different modes and applications. It is important, however, to test the application scenarios that are most likely to occur. This is much different from the old fault-coverage paradigm. Verifying without knowledge of the software applications to be run on the system is simply not adequate, whether you are verifying functionality, performance, or power. This seems to be where the older verification strategies employed in more constrained industrial designs diverge from the strategy needed in the high-software-content consumer space, a divergence that will only grow with the coming complexities of the Internet of Things (IoT).
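To make the contrast with fault coverage concrete, here is a minimal sketch of scenario-based checking: instead of enumerating every possible fault, the system model is driven through the use cases most likely to occur, and each is checked against functional and power expectations. This is my own illustration of the idea discussed on the panel; the scenario names, applications, and power budgets are all hypothetical.

```python
# Minimal sketch of scenario-based verification: run the system model
# through likely use cases and check behavior per scenario.
# Scenario names, concurrent apps, and budgets are hypothetical.

from dataclasses import dataclass, field


@dataclass
class SystemModel:
    """Toy stand-in for a phone SoC model: tracks active apps and power."""
    active: set = field(default_factory=set)
    POWER_MW = {"camera": 900, "video_call": 1200, "music": 150, "gps": 250}

    def start(self, app):
        self.active.add(app)

    def power_mw(self):
        return sum(self.POWER_MW[a] for a in self.active)


SCENARIOS = {
    # scenario name -> (apps running concurrently, power budget in mW)
    "navigate_with_music": ({"gps", "music"}, 500),
    "video_call": ({"video_call"}, 1500),
    "camera_while_navigating": ({"camera", "gps"}, 1300),
}


def run_scenarios():
    for name, (apps, budget_mw) in SCENARIOS.items():
        sys_model = SystemModel()
        for app in apps:
            sys_model.start(app)
        drawn = sys_model.power_mw()
        verdict = "PASS" if drawn <= budget_mw else "FAIL"
        print(f"{name}: {drawn} mW (budget {budget_mw} mW) -> {verdict}")


if __name__ == "__main__":
    run_scenarios()
```

A production flow would have tens of thousands of such scenarios (the 50,000 figure quoted for Apple) exercising real hardware/software models rather than a toy power table, but the structure is the same: a curated library of likely use cases, each with explicit pass/fail criteria for functionality, performance, and power.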

It is not possible to repeat in this short article all of the valuable information presented at this event. It also looks like Cadence will be holding these summits on a somewhat regular basis. Those with an interest in mixed-signal design should register for the upcoming Mixed-Signal Technology Summit, also to be held on Cadence's San Jose campus, this coming Thursday, October 10.

