
  • Does Managing Tools as if they are IP Make Sense?

    Years ago I thought that chip design companies would embrace the latest technology and be eager to adopt new tools. What I learned was that the people implementing and managing design projects were already taking risks with almost every aspect of their projects. What they most wanted was to minimize risk from the design process – especially from design tool changes.

    The reluctance to change goes much deeper. In the middle of a project, a design team would never willingly change tools, or even tool versions. Even minor updates from vendors can have subtle algorithmic changes that affect results. Beyond the obvious possibility of an outright bug, there can be variations in results that affect every downstream step. This is true for both implementation and sign-off tools.

    Chip companies spend significant resources on correlation and validation of tools. In some cases, known bugs in software are compensated for, and if a tool vendor were to suddenly fix the bug it could break the flow. Pretty much the only reason a design team will change any tool or tool version is to fix a show-stopper issue.

    Now, think about how many tools there are in the typical design flow. Each one of these tools has configuration files, rule decks, libraries or stack-up information, and command scripts that drive the tool. If anything changes, it can ripple downstream.

    Broadening our scope, the same reasoning applies to all the PDK data. PDKs contain thousands of files. Stability of the PDK through a project is essential. Nevertheless, some projects cannot avoid PDK changes because the foundry is refining the process, and those changes need to be adopted across the entire project prior to tape-out.

    Presently, large-team projects usually already use data management for the design data, and perhaps even for rule files. As the discussion above shows, the same kind of management used for design data could be just as beneficial when applied to the tools in the flow.
    Methodics Inc., a data management company for EDA, has just written about how they support complete management of the design environment using their software. They point out that large teams spread out in locations around the world need to have consistent and well managed tool environments. Treating the design environment as if it were IP allows a systematic way of managing all the tools in the flow.

    Using a variety of techniques, it is possible to make setting up the user environment efficient and fast. One frequent concern is whether multiple copies of all the tool installations are necessary. Methodics gives customers the choice of instantiating the files or using soft links to save disk space and copying time. Their solution also handles user-specific customization while ensuring, before the final project release, that known versions of all the tools are used in the final tape-out flow. It is also possible to switch tool release versions and keep old tool environments available in case a tool update needs to be rolled back.

    The Methodics white paper goes into more detail describing the different ways their solution can be deployed. But there is no question that managing the software used for a design project is just as important as managing the design data itself.