
  • Challenges in IP Qualification with Rising Physical Data

    With every new technology node, new physical effects must be taken into account, and every new effect brings several new formats to model it. A format is often accompanied by derivatives, sometimes a standard reincarnation of a proprietary format, further evolved by a standards body. For example, we have SPF from Cadence, and then SPEF, first proposed by OVI (Open Verilog International) and later standardized by the IEEE. We also have RSPF (Reduced Standard Parasitic Format), DSPF (Detailed Standard Parasitic Format), and SBPF (Synopsys Binary Parasitic Format).
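    To make the format zoo concrete, below is a minimal, hand-written fragment in the style of SPEF (heavily abbreviated; a real file carries more header fields, a name map, and one section per net). The net and pin names are invented for illustration:

```
*SPEF "IEEE 1481-1998"
*DESIGN "example_block"
*T_UNIT 1 NS
*C_UNIT 1 FF
*R_UNIT 1 KOHM

*D_NET n1 2.4        // net name and total capacitance
*CONN
*I u1:Z O            // driver pin
*I u2:A I            // load pin
*CAP
1 u1:Z 1.1
2 u2:A 1.3
*RES
1 u1:Z u2:A 0.35
*END
```

    Even this toy net needs a dozen lines; multiply by millions of nets and multiple corners and the data volume problem becomes obvious.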

    Why so many different formats for a single physical representation? It has to do with accuracy, modeling method, data size and efficiency, optimization, and so on. Each format serves a particular trade-off, e.g. modeling preference, tool affiliation, or data-size optimization. One thing is common: the volume of data needed to represent an electronic circuit on a piece of silicon and characterize it under all physical conditions is increasing exponentially with every emerging technology node.

    The situation has become more complicated at smaller nanometer nodes, where manufacturing variation becomes prominent. The variation can be significant relative to what you design, so you have to estimate it before manufacturing and make appropriate provision for it in your design.

    [Image: SEM image of contact holes showing photon shot noise]

    Above is an SEM image of contact holes that illustrates photon shot noise, a consequence of quantum effects at nanometer dimensions. As contact-hole dimensions shrink, fewer photons are needed to trigger the photo-reactive compound in the resist on the wafer, but the statistical fluctuation in photon arrival does not shrink proportionally. As a result, the relative difference in the number of photons seen by each contact hole (i.e. photon shot noise) makes a visible impact. There are specific formats to model manufacturing variability as well.
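    The shot-noise effect follows directly from Poisson statistics: the absolute spread in photon count grows as the square root of the mean, so the relative spread grows as the hole shrinks. A small illustrative Python sketch (the photon counts are made-up round numbers, not measured values):

```python
import math

def shot_noise_sigma(mean_photons: float) -> float:
    """Photon arrival is Poisson-distributed: sigma = sqrt(N)."""
    return math.sqrt(mean_photons)

def relative_variation(mean_photons: float) -> float:
    """Relative spread sigma/N = 1/sqrt(N) grows as N shrinks."""
    return shot_noise_sigma(mean_photons) / mean_photons

# Shrinking a contact hole reduces the photons it collects:
for n in (10_000, 1_000, 100):
    print(f"{n:>6} photons -> {relative_variation(n):.1%} relative variation")
# 10,000 photons -> 1.0%; 100 photons -> 10.0%
```

    This is why the same lithography process that was benign at older nodes produces visibly different contact holes at nanometer dimensions.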

    It’s a chaotic situation: learning, understanding the pros and cons of, and making use of the various formats when designing, verifying, and testing semiconductor IP and SoCs. An IP must be fully qualified, with all the data it carries, before integration into an SoC. The volume of data in silicon IP has grown many-fold.

    [Chart: characterization data volume per IP at different process nodes]

    The chart above shows the typical amount of characterization data needed to describe the silicon IP used to design and verify an SoC at different process nodes: about 1 TB at 14nm, expected to grow to 4 TB at 10nm. Today, all factors, including timing, power, noise, reliability, and variability, have to be taken into account.

    At 14nm with FinFETs, power characterization requires a format like Liberty CCSP (Composite Current Source Power), in which the current with which an output drives the connected RC network is accurately modeled in the characterization file. It accounts for leakage as well as dynamic current, and advanced modeling covers gate leakage, asynchronous operation, and voltage and temperature scaling.
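    The core idea of a current-source power model can be sketched without the Liberty syntax: the library stores the output current waveform for a switching event, and the energy of one transition is the integral of i(t)·Vdd over that waveform. A minimal illustrative Python sketch (the waveform points and Vdd are invented numbers, not library data):

```python
def transition_energy(times_ns, currents_ma, vdd=0.8):
    """Trapezoidal integral of i(t) * Vdd over one switching event.
    Units work out neatly: ns * mA * V = pJ."""
    energy = 0.0
    for k in range(1, len(times_ns)):
        dt = times_ns[k] - times_ns[k - 1]
        i_avg = 0.5 * (currents_ma[k] + currents_ma[k - 1])
        energy += i_avg * dt * vdd
    return energy

# Triangular current pulse: 0 -> 2 mA -> 0 over 0.2 ns
print(transition_energy([0.0, 0.1, 0.2], [0.0, 2.0, 0.0]))  # ~0.16 pJ
```

    A real CCSP library stores many such waveforms, indexed by input slew, output load, voltage, and temperature, which is exactly where the data volume comes from.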

    Because the physical effects modeled in CCSP are highly dependent on process corner, there may be different CCSP models for different states, adding up to hundreds of CCSP files per process corner for a full characterization.
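    The file-count explosion is simple combinatorics. A hypothetical characterization matrix (all names and values below are invented for illustration) shows how quickly PVT corners multiplied by power states reach hundreds of characterizations:

```python
from itertools import product

processes    = ["ss", "tt", "ff", "sf", "fs"]    # process corners
voltages     = [0.72, 0.80, 0.88]                # supply rails (V)
temperatures = [-40, 25, 125]                    # junction temps (C)
states       = [f"state{i}" for i in range(8)]   # per-cell power states

pvt_corners = list(product(processes, voltages, temperatures))
print(len(pvt_corners))                # 45 PVT corners
print(len(pvt_corners) * len(states))  # 360 state-specific characterizations
```

    Add EM and OCV extensions on top of this and the count per library grows again.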

    Interestingly, adding further to the data, extensions to CCSP have already started appearing for electro-migration (EM) and on-chip variability (OCV) effects. At 7nm and below, the characterization data is bound to grow further.

    This exponential growth in the volume and complexity of data per IP makes it impossible for either the IP provider or the SoC integrator to keep checking IP with the same home-grown scripts. Even simple checks become difficult and time-consuming when applied to huge datasets. What is needed are smart automated tools that do much more than sanity checks, for example trend checks on a particular parameter, feedback and correction tips, waiver reports, and so on.
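    As a sketch of what such a check might look like (this is illustrative Python, not Crossfire's actual interface; the cell name and delays are invented), here is a minimal trend check with waiver handling: a cell's delay should not decrease when moving to a slower corner:

```python
def trend_check(name, values_by_corner, waivers=()):
    """Flag corners where delay decreases as the corner gets slower.
    Corners are assumed ordered fast -> slow. Waived names are
    reported separately instead of failing the run."""
    corners = list(values_by_corner)
    violations = []
    for fast, slow in zip(corners, corners[1:]):
        if values_by_corner[slow] < values_by_corner[fast]:
            violations.append((name, fast, slow))
    failed = [v for v in violations if name not in waivers]
    waived = [v for v in violations if name in waivers]
    return failed, waived

# ss delay is suspiciously lower than tt -- likely a characterization bug
delays = {"ff": 42.0, "tt": 55.0, "ss": 51.0}
failed, waived = trend_check("AND2_X1/A->Z", delays)
print(failed)  # the tt -> ss pair is flagged
```

    A production qualification tool runs thousands of such rules, across all formats, in parallel, which is precisely the gap home-grown scripts fail to fill.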

    Fractal Technologies’ Crossfire is the right tool to provide an efficient and productive solution for quick IP qualification before integration into an SoC. Crossfire provides detailed graphical and textual reports on completeness of data, presence of all parts of a component, failed components and constructs, waived violations along with their justifications, and more. It can check large datasets quickly by running separate processes on dedicated machines, parallelizing the various tasks.

    Covering most design, verification, and test formats, design databases, and documentation formats, Crossfire is a tool of choice for IP providers to check the compliance of their offerings and for SoC vendors to qualify IP for acceptance before using it in an SoC. Crossfire keeps adding support for new formats as well as popular vendor-specific models.

    Read Fractal’s new whitepaper HERE.


    Pawan Kumar Fangaria
    Founder & President at www.fangarias.com