Combinatorial Methods for Trust and Assurance

Semiconductor Functional Verification

Combinatorial test and measurement methods have demonstrated cost reductions of 20% to 30% and more effective testing for complex software.  Detection of security vulnerabilities and ultra-rare defects is significantly better than with conventional test methods. Combinatorial testing compresses all t-way combinations of parameter values into very small test arrays, making it in many ways comparable to exhaustive testing. Because functional verification accounts for over half the cost of a new chip design [1], extending combinatorial testing to semiconductor design problems has the potential to reduce the total cost of a new design by 10% to 20%.


  • As of 2022, typically 50% to 60% of total IC/ASIC project time goes to functional verification, with some new design IP requiring 70% or more [1].
  • With current functional verification methods, 84% of FPGA designs include non-trivial bugs that are missed in verification and escape into production [1].  The largest categories of these bugs are logic and function flaws.
  • Because of the combinatorial explosion problem, the most common method for functional verification is random test generation, sometimes supplemented with constraints or machine learning to enhance detection of very rare errors.
  • According to industry experts, there are currently no tools available to systematically search for and detect outlier bugs, which can be triggered only by rare and precise sequences [2]. It is infeasible for verification teams to anticipate every possible corner-case situation and verify it with hand-written tests.
  • The problem is even greater for security-critical bugs in chip designs.  A large study involving 54 teams of experts found only 61% of security vulnerabilities using conventional test and inspection techniques, and only 48% using formal verification [3].
    [1] 2022 Wilson Research Group Functional Verification Study.
    [3] Dessouky, G., Gens, D., Haney, P., Persyn, G., Kanuparthi, A., Khattri, H., ... & Rajendran, J. (2019). HardFails: Insights into Software-Exploitable Hardware Bugs. In 28th USENIX Security Symposium (USENIX Security 19) (pp. 213-230).


  • Combinatorial testing of software has been shown to improve software fault detection effectiveness by a factor of 10X or more, with significant reductions in overall cost.  Coverage is also significantly greater and is produced more efficiently.  Combinatorial testing has also been shown to provide much stronger testing with far fewer test cases than random testing.  The improvements in cost reduction and test effectiveness are especially strong for complex embedded systems, such as avionics.  []
  • Combinatorial testing excels at rare flaws as it systematically enumerates and exercises all possible t-way input interactions leading to rare sequences.
  • The project is likely to provide advanced capability to industry within 1-3 years.  Tools developed by NIST for advanced software testing have been widely adopted by industry.  These tools can be adapted and extended to apply to the hardware description languages (HDLs) used in all new semiconductor designs.  The modifications for chip design languages will focus on the greater parallelism of HDLs, and on the capability for input sequences to ensure reachability of even extremely rare states within a design.  New theorems published by NIST demonstrate how this may be done, and these results can be implemented in new tools for verification of advanced models.
  • The combinatorial test methods that are the focus of this work have been demonstrated to reduce testing cost for complex software by 20% to 30% or more.  If similar results can be shown for hardware description languages, it may be possible to reduce new chip design costs by 10% to 20%, as testing costs average more than half the total cost of a new chip design, and even higher for new designs using less existing IP. 

Combinatorial test methods can potentially identify a much larger fraction of security vulnerabilities than is generally achievable through traditional testing methods (found to be 61% in [3]). This would substantially reduce the risk of hardware vulnerabilities escaping into production.


The output of the project will be a set of combinatorial testing tools for semiconductor design verification and testing.  The tools will be an extended and enhanced adaptation of the NIST combinatorial testing tool, which has been distributed to more than 4,500 industry and academic users and used by some of the world's largest organizations.

Semiconductor Design Verification Technical Plan

  • Goals:  Project goals are to:  1) expand measurement and test solutions for the semiconductor industry, and 2) demonstrate the value of these methods and have EDA companies integrate them.  Methods and tools developed in this project can be applied at both the design and post-silicon phases of manufacturing to reduce verification cost and improve defect detection.
  • Current practice:  Industry studies have found that no tools are currently available to systematically search for and detect outlier bugs, which can be triggered only by rare and precise sequences [2]. It is infeasible for verification teams to anticipate every possible corner-case situation and verify it with hand-written tests.

Functional verification of chip designs today primarily uses variations of random testing, sometimes including constraints, supplemented with test libraries designed to detect particular types of faults. While reasonably effective, these methods can fail to detect very rare faults.  Such rare and difficult-to-trigger faults may not be induced by randomly generated input sequences, because random tests invariably fail to contain all important combinations of values.  Supplemental tests based on expert judgment provide some improvement, but they depend on experience with previous designs, and thus may miss some rare conditions that produce faults in new designs.  Moreover, the test sets used are much larger than combinatorial test sets of equivalent coverage would be.
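The limitation of random testing can be illustrated with a back-of-the-envelope calculation. The sketch below (our own illustration; the function name is hypothetical) assumes each Boolean input is set independently and uniformly at random, and computes the probability that a random test suite exercises one specific t-way value combination at least once. A t-way covering array covers that combination by construction.

```python
# Probability that a suite of n_tests random Boolean tests exercises one
# specific t-way value combination at least once.  Assumption: each test
# assigns every input independently and uniformly at random.
def hit_probability(n_tests: int, t: int) -> float:
    p_single = 1 / 2**t  # chance one random test matches the combination
    return 1 - (1 - p_single) ** n_tests

# A specific 6-way combination is hit by a 100-test random suite only
# about 79% of the time; a 6-way covering array always covers it.
print(round(hit_probability(100, 6), 2))  # → 0.79
```

As t grows, the miss probability of a fixed-size random suite rises quickly, which is one way to see why rare multi-variable interaction faults escape random verification.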

  • New approach:  For devices and systems that have more than a few test inputs, exhaustive testing is practically infeasible due to the exponential increase in test cases needed as a function of the number of test variables (the combinatorial explosion problem). Semiconductor devices and systems particularly suffer these testing limitations. 
  • The combinatorial explosion problem also exists for nearly all software systems, and NIST-developed tools for an approach called combinatorial testing have been used for many years to mitigate it in critical systems. Combinatorial testing works by generating tests from covering arrays, which systematically enumerate and exercise all multi-variable (t-way) input interactions, up to a specified level of t (typically 4 to 6).  That is, the method compresses all t-way combinations of parameter values into very small test arrays. For example, there are 1,329,336,000 3-way combinations of 1,000 Boolean variables. These can be compressed into a test set of 71 tests that covers all 1,329,336,000 combinations [].  Combinatorial testing has demonstrated significant cost reductions and more effective testing for complex systems and software, as compared to other forms of testing such as randomized test case generation.
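The combination count quoted above can be checked directly: choose t of the n variables, then multiply by the number of value settings for those t variables. A minimal sketch (the function name is ours):

```python
from math import comb

# Number of t-way value combinations among n variables, each taking
# v values: C(n, t) choices of variables times v**t value settings.
def t_way_combinations(n: int, t: int, v: int = 2) -> int:
    return comb(n, t) * v**t

# All 3-way combinations of 1,000 Boolean variables, as in the text.
print(t_way_combinations(1000, 3))  # → 1329336000
```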

Extending combinatorial testing to semiconductor design problems has the potential for greatly reduced cost and significantly enhanced defect and vulnerability detection.

The project will develop and evaluate tools for applying combinatorial methods to designs expressed in hardware description languages.  The combinatorial test approach will be supplemented with fault injection techniques (see Appendix) to determine response to faulty components or data.  The approach will include:

  1. Identification of variables and values to be included in input parameter model.  This process may include evaluation of both HDL and formal models (where available).
  2. Generation of covering arrays for components.
  3. Evaluation of significant reachable states and depth within a graph model of the component, considering HDL and/or formal models.  Formal models to be evaluated include temporal logic and Petri nets (see Appendix). 
  4. Execution of tests derived from covering arrays, measuring statement, branch, condition, toggle, and functional coverage. Fault injection methods will evaluate and record internal states affected by induced faults in components or data.
  5. Determination of defect detection rate.    
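As a simple illustration of the coverage measurement in step 4, the sketch below computes the fraction of t-way value combinations covered by a test set of Boolean inputs. This is a hypothetical helper for illustration, not the NIST tool itself:

```python
from itertools import combinations
from math import comb

# Illustrative helper (not the NIST tool): fraction of all t-way value
# combinations of Boolean inputs covered by a given test set.
def t_way_coverage(tests, t):
    n = len(tests[0])
    covered = {(pos, tuple(test[i] for i in pos))
               for test in tests
               for pos in combinations(range(n), t)}
    return len(covered) / (comb(n, t) * 2**t)

# Four tests forming a 2-way covering array for 3 Boolean variables:
# every pair of variables takes on all four value combinations.
tests = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(t_way_coverage(tests, 2))  # → 1.0
```

Note that 4 tests suffice here, versus 8 for exhaustive testing of 3 Boolean variables; the compression becomes dramatic as the number of variables grows.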


The tools developed by NIST have been shown to reduce the time required for software verification and testing, while improving test coverage.  For example, a study of their application to industrial control software [6] showed 3X greater fault detection in 1/4 of the time used for conventional test methods, or roughly a 12X improvement in efficiency.  Similar results have been shown in other studies.  Comparable improvements in semiconductor design verification could result in significant reductions in engineering time and thus cost.  In addition to the cost for FPGA and IC/ASIC verification engineers, design engineers also spend 40% to 50% of their time on verification [1], increasing engineering headcount per project.  Improving the efficiency of semiconductor functional verification will allow precious engineering hours to be spent on a larger number of projects.

In addition to reducing cost for individual designs, this improvement could ease industry access to scarce semiconductor engineering and technical talent.  The US semiconductor industry is projected to face a shortfall of 70,000 to 90,000 workers over the next six years [7].  Demand for IC/ASIC design engineers increased at a 2.7% CAGR from 2007 to 2022, and demand for verification engineers increased even faster, at 6.2% per year over the same period [1].  This increase in demand is expected to continue.


[6] Li, X., Gao, R., Wong, W.E., Yang, C. and Li, D. (2016). Applying combinatorial testing in industrial settings. In 2016 IEEE International Conference on Software Quality, Reliability and Security (QRS) (pp. 53-60).



Rick Kuhn

Raghu Kacker

M S Raunak



Created May 24, 2016, Updated June 13, 2024