Usually with Gaussian distributions, when we talk about sigma, we can draw a direct equivalence between a sigma level and the number of standard deviations from the mean needed to reach it. Sigma is really shorthand for the percentage of cases under the curve: 3 sigma is 99.865%. This works because with a true Gaussian distribution, 3 standard deviations from the mean covers exactly that percentage of the cases.
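The sigma-to-percentage mapping is just the standard normal cumulative distribution function, which can be checked in a few lines using Python's `statistics.NormalDist` (a minimal sketch, using the one-sided coverage convention the percentages above imply):

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal: mean 0, standard deviation 1
for s in (3, 4, 5, 6):
    # one-sided coverage: fraction of cases falling below s standard deviations
    print(f"{s} sigma -> {nd.cdf(s) * 100:.10f}% of cases")
```

Running this confirms 3 sigma at 99.865% and 6 sigma at 99.9999999013%.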
With variation analysis, designers seek to put to rest concerns that outliers could cause an unacceptable percentage of chips to fail due to process variation, voltage fluctuation, high temperature, missed timing margins, and the like. Designers can also use statistical design analysis methods to determine the design parameters that yield the highest performance. However, both of these methodologies rely on being able to simulate the design cases farthest from the mean.
With a Gaussian distribution, getting to 6 sigma, or 99.9999999013% of the cases, would require analyzing out to 6 standard deviations from the mean. With a brute-force simulation approach, this would mean running billions of samples. To make things worse, circuit behavior does not follow Gaussian distributions. Instead, the distributions have long tails, drastically increasing the number of standard deviations needed to reach a given sigma.
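To put "billions of samples" in concrete terms: the expected number of random draws needed to observe even one event beyond 6 sigma is the reciprocal of the tail probability. A quick back-of-the-envelope check (one-sided tail, standard normal):

```python
from statistics import NormalDist

p_tail = 1 - NormalDist().cdf(6)   # probability of a sample beyond 6 sigma
samples_per_failure = 1 / p_tail   # expected draws per observed failure
print(f"tail probability:    {p_tail:.2e}")        # ~9.87e-10
print(f"samples per failure: {samples_per_failure:.2e}")  # roughly 1e9
```

About a billion simulations per observed failure, and that is the optimistic Gaussian case; long-tailed real distributions only make it worse.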
This point was driven home by Jacob Ou from TSMC and Kris Breen from Solido during their webinar late last September. They point out that a single standard cell, sense amp, or bit cell/slice is really only one small part of a larger design that depends on all of its elements working. A failure rate that is tolerable in a single cell or circuit becomes intolerable when the likelihood of chip failure depends on thousands, or even millions, of instances of that same cell on a larger chip.
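The compounding they describe follows directly from independent-failure arithmetic: if each of n instances fails with probability p, the chip works only if all n instances work. A hypothetical illustration (the 10-million instance count is my assumption for the example, not a figure from the webinar):

```python
from statistics import NormalDist

p_cell = 1 - NormalDist().cdf(6)   # per-instance failure probability at 6 sigma
n = 10_000_000                     # hypothetical count of identical instances
p_chip = 1 - (1 - p_cell) ** n     # chip fails if any one instance fails
print(f"per-cell: {p_cell:.2e}, chip-level: {p_chip:.2%}")
```

Even a cell verified to 6 sigma produces roughly a 1% chip-level failure rate at this instance count, which is why per-cell sigma targets must be so aggressive.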
Jacob from TSMC talks about how the non-planar nature of FinFET devices brings new parasitic elements into play that can no longer be considered negligible. These include additional capacitive couplings within the FinFET device. The result is a long tail on the performance histogram; in some cases, the distribution curves can even become bi-modal. In the webinar, TSMC discusses their use of Solido tools to tackle tough issues in standard cell library development. In many cases they got results more quickly, and in some cases they could perform analyses that would otherwise have been impossible.
Solido’s High Sigma Monte Carlo uses a self-validating approach to quickly find the cases that lie beyond the desired sigma and simulate them. An ordering of samples around the sigma threshold is generated and simulated, which also provides algorithmic feedback on the effectiveness of the sample-selection process. Because SPICE is used for the simulation, the final results carry full simulator accuracy.
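The general shape of such a rank-then-verify flow can be sketched in a few lines. This is a toy illustration of the idea, not Solido's actual (proprietary) algorithm; the surrogate model, sample counts, and noise term are all invented for the example:

```python
import random

random.seed(0)

def surrogate(x):
    # cheap analytical stand-in for the circuit metric (assumed model)
    return 2.0 * x + 0.1 * x * x

def spice_sim(x):
    # stand-in for an expensive SPICE run; adds small "simulator" noise
    return surrogate(x) + 0.01 * random.gauss(0.0, 1.0)

# 1. draw a large pool of process-variation samples (cheap to generate)
pool = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# 2. order the samples by the surrogate and keep those nearest the tail
worst = sorted(pool, key=surrogate, reverse=True)[:50]

# 3. spend the expensive simulations only on those candidates
verified = [(x, spice_sim(x)) for x in worst]

# 4. feedback: if the simulated ordering tracks the surrogate ordering,
#    the sample-selection model is doing its job
resim_order = sorted(verified, key=lambda t: t[1], reverse=True)
print(f"most extreme verified sample: {resim_order[0][0]:.2f} sigma")
```

The point of the structure is that only 50 of the 100,000 samples ever reach the expensive simulator, while the final numbers still come from the trusted tool.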
TSMC usually keeps their cards close to their chest, but in this webinar they go into details about the results achieved when Solido tools are used in their internal flow. The good news is that the webinar is available for replay in case you missed the live session.
More articles by Tom Simon