
  • Design Automation Conference Workshop on Suite of Embedded Applications and Kernels

    In June, the first Suite of Embedded Applications and Kernels, or SEAK, workshop at the 2014 Design Automation Conference in San Francisco introduced a new Defense Advanced Research Projects Agency program in the area of embedded system benchmarking for U.S. Department of Defense-related applications. SEAK aims to define a new, open benchmark suite, with a novel methodology for evaluating the performance and power of end-to-end embedded systems in DoD application areas.

    The workshop opened with a keynote address from Joseph Cross, the DARPA program manager for SEAK, who highlighted the objectives and expectations for the program. Adolfy Hoisie and Antonino Tumeo, both from Pacific Northwest National Laboratory, then introduced the program’s strategy and its implementation plan. They described a workflow divided into three parts: 1) identification of challenges, which includes the definition of the benchmark suite through input/output functions, the definition of metrics and the development of implementations; 2) the evaluation process, in which PNNL will characterize vendor-provided embedded systems against the identified metrics; and 3) ranking, which will lead to a SEAK list updated twice annually. The SEAK list will report the tradeoffs of the embedded systems under various metrics, identifying Pareto frontiers for various domains.

    The first invited speaker session focused on benchmarking experiences and methodology. The speakers included: Shay Gal-On (Embedded Microprocessor Benchmark Consortium), Richard Lethin (Reservoir Labs Inc.) and Jeff Bier (Berkeley Design Technology, Inc.).

    This session was followed by rapid-fire talks, in which five vendors presented, in short 10-minute discussions, interesting and often undervalued aspects to consider when benchmarking an embedded system. The speakers were: Grant Martin (Tensilica Inc.), Srinivasa Addepalli (Freescale Semiconductor Inc.), Saurabh Sinha (ARM Ltd.), Yatin Trivedi (Synopsys Inc.) and Dilma Da Silva (Qualcomm Inc.).

    The third session involved three invited speakers who also discussed benchmarking: Albert Reuther (Massachusetts Institute of Technology Lincoln Laboratory), Sek Chai (SRI International) and Jeffrey Smith (BAE Systems).

    A poster session featured 12 posters addressing power and performance tradeoffs of embedded systems, design space exploration, prototyping, benchmarking techniques and analysis tools. The posters represented several institutions, including: University of California, Irvine; University of California, Riverside; University of Southern California; Columbia University; University of Delaware; University of Maryland, College Park; PNNL; École Polytechnique Fédérale de Lausanne; Pennsylvania State University; University of Central Florida; and University of Michigan. The workshop closed with a panel, moderated by Cross, that involved all of the invited speakers.

    The SEAK workshop was organized by Cross (DARPA), Hoisie (PNNL), Darren Kerbyson (PNNL) and Joseph Manzano (PNNL) and included significant contributions from PNNL’s High Performance Computing group. More information about SEAK and the workshop, as well as the featured presentations and poster files, can be found here.

    “This workshop served as an important first event of the SEAK program, fostering discussions and interactions that highlighted important aspects to consider when evaluating embedded systems,” said Hoisie, who also serves as PNNL’s principal investigator for the SEAK program. “It provided direction and significant feedback to define the challenge areas and challenging problems and gave us fodder to refine and enhance SEAK’s existing methodologies.”

    SEAK is also seeking information, interaction and feedback from academia and vendors to help identify the challenges. A request for information has been issued and can be found here. The response date for the RFI is Friday, Dec. 5, 2014.

    Written by Antonino Tumeo