Semiconductor Design Verification Technical Plan
- Goals: Project goals are to: 1) expand measurement and test solutions for the semiconductor industry, and 2) demonstrate the value of these methods and encourage EDA companies to integrate them into their products. Methods and tools developed in this project can be applied at both the design and post-silicon phases to reduce verification cost and improve defect detection.
- Current practice: Industry studies have found that no tools are currently available to systematically search for and detect outlier bugs, which can be triggered only by rare and precise input sequences [2]. It is infeasible for verification teams to anticipate every possible corner case and to verify each with hand-written tests.
Functional verification of chip designs today relies primarily on variations of random testing, sometimes constrained, supplemented with test libraries designed to detect particular types of faults. While reasonably effective, these methods can fail to detect very rare faults: randomly generated input sequences will not, in practice, contain all important combinations of values, so rare and difficult-to-trigger faults may never be induced. Supplemental tests based on expert judgment provide some improvement, but they depend on experience with previous designs, and thus may miss rare conditions that produce faults in new designs. Moreover, the resulting test sets are much larger than combinatorial test sets of equivalent coverage would be.
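The gap between random testing and full t-way coverage can be illustrated with a minimal simulation. The parameter count and test budget below are hypothetical, chosen only to keep the example small; the point is that even a fixed random test set over a handful of Boolean inputs typically leaves some 3-way value combinations unexercised.

```python
import itertools
import random

def tway_coverage(tests, k, t=3):
    """Count the t-way value combinations exercised by a test set
    over k Boolean parameters, versus the total that exist."""
    covered = set()
    for test in tests:
        for idx in itertools.combinations(range(k), t):
            covered.add((idx, tuple(test[i] for i in idx)))
    total = len(list(itertools.combinations(range(k), t))) * 2 ** t
    return len(covered), total

random.seed(0)                   # fixed seed for reproducibility
k = 10                           # hypothetical component with 10 Boolean inputs
tests = [[random.randint(0, 1) for _ in range(k)] for _ in range(20)]
covered, total = tway_coverage(tests, k)
print(f"{covered} of {total} 3-way combinations covered")
```

With 20 random tests over 10 Boolean inputs there are 960 3-way value combinations, and a run of this sketch typically covers most but not all of them, leaving exactly the kind of residual gap the text describes.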
- New approach: For devices and systems with more than a few test inputs, exhaustive testing is practically infeasible because the number of test cases grows exponentially with the number of test variables (the combinatorial explosion problem). Semiconductor devices and systems are particularly subject to these limitations.
- The combinatorial explosion problem also affects nearly all software systems, and NIST-developed tools for an approach called combinatorial testing have been used for many years to mitigate it in critical systems. Combinatorial testing generates tests from covering arrays, which systematically enumerate and exercise all multi-variable (t-way) input interactions up to a specified level of t (typically 4 to 6). That is, the method compresses all t-way combinations of parameter values into very small test arrays. For example, there are 1,329,336,000 3-way combinations of 1,000 Boolean variables, yet all of these combinations can be covered by a test set of only 71 tests [https://csrc.nist.gov/acts]. Combinatorial testing has demonstrated significant cost reductions and more effective testing for complex systems and software compared to other approaches such as randomized test case generation.
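The combination count cited above can be checked directly: the number of t-way value combinations over n parameters with v values each is C(n, t) · v^t. (The 71-test covering array itself comes from the ACTS tools and is not recomputed here.)

```python
from math import comb

n, t, v = 1000, 3, 2            # 1,000 Boolean variables, 3-way interactions
n_combos = comb(n, t) * v ** t  # C(1000, 3) * 2^3
print(f"{n_combos:,}")          # 1,329,336,000
```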
Extending combinatorial testing to semiconductor design problems has the potential to greatly reduce cost and significantly enhance defect and vulnerability detection.
The project will develop and evaluate tools for applying combinatorial methods to designs expressed in hardware description languages. The combinatorial test approach will be supplemented with fault injection techniques (see Appendix) to determine response to faulty components or data. The approach will include:
- Identification of variables and values to be included in the input parameter model. This process may include evaluation of both HDL and formal models (where available).
- Generation of covering arrays for components.
- Evaluation of significant reachable states and depth within a graph model of the component, considering HDL and/or formal models. Formal models to be evaluated include temporal logic and Petri nets (see Appendix).
- Execution of tests derived from covering arrays, measuring statement, branch, condition, toggle, and functional coverage. Fault injection methods will evaluate and record internal states affected by induced faults in components or data.
- Determination of defect detection rate.
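The covering-array generation step above can be sketched with a simple one-test-at-a-time greedy construction in the spirit of AETG-style algorithms; production tools such as NIST's ACTS use considerably more sophisticated methods. The component inputs and their values here are hypothetical placeholders for an actual input parameter model.

```python
import itertools

def greedy_covering_array(domains, t=2):
    """Greedily build a t-way covering array: repeatedly add the
    candidate test that covers the most still-uncovered t-way
    value combinations. Exhaustive candidate search, so suitable
    only for small parameter models (a sketch, not a product)."""
    k = len(domains)
    uncovered = set()
    for idx in itertools.combinations(range(k), t):
        for vals in itertools.product(*(domains[i] for i in idx)):
            uncovered.add((idx, vals))
    tests = []
    while uncovered:
        best, best_gain = None, -1
        for cand in itertools.product(*domains):
            gain = sum(1 for idx, vals in uncovered
                       if tuple(cand[i] for i in idx) == vals)
            if gain > best_gain:
                best, best_gain = cand, gain
        tests.append(best)
        uncovered = {(idx, vals) for idx, vals in uncovered
                     if tuple(best[i] for i in idx) != vals}
    return tests

# Hypothetical component inputs: bus mode, clock divider, reset polarity, parity
domains = [["read", "write"], [1, 2, 4], [0, 1], ["even", "odd"]]
tests = greedy_covering_array(domains, t=2)
print(len(tests), "tests cover all pairs (exhaustive would need",
      2 * 3 * 2 * 2, ")")
```

Even at this toy scale the pairwise array is markedly smaller than the 24-test exhaustive set, and the compression grows rapidly with the number of parameters.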