Post-Quantum crypto standardization

Frequently Asked Questions

  1. The call for proposals briefly mentions hybrid modes that combine quantum-resistant cryptographic algorithms with existing cryptographic algorithms (which may not be quantum-resistant). Can these hybrid modes be FIPS-validated?
  2. What are NIST’s plans regarding stateful hash-based signatures?
  3. Does the requirement for ANSI C source code preclude the use of assembly language optimizations?
  4. Will NIST consider platforms other than the “NIST PQC Reference Platform” when evaluating submissions?
  5. In Sections 4.A.2 and 4.A.4, NIST’s CFP sets the number of decryption (resp. signature) queries that an attacker against a proposed encryption (resp. signature) scheme can make to at most 2^64. What is the rationale for not letting the adversary make essentially as many queries as the target security strength?
  6. What does NIST consider to be an acceptable rate of decryption/decapsulation failure?
  7. Is the NIST PQC Standardization Process a competition?
  8. Why does NIST’s CFP ask submitters to provide a classical security analysis, when the intent is to plan for a world with quantum computers?
  9. In section 4.A.5, it is stated that NIST will assume that its 5 security categories are correctly ordered (i.e., that a brute force collision attack on SHA256 (resp. SHA384) will be harder to perform than a brute force key search attack on AES192 (resp. AES256)). How realistic is this assumption?
  10. How can submitters who aren’t experts in quantum cryptanalysis set their parameters?
  11. What will happen to a submitted algorithm if some or all of the provided parameters fail to meet their claimed security strength categories?
  12. Which security strength categories will NIST consider for standardization?
  13. What are the “standard conversion techniques” NIST will use to convert between public-key encryption schemes and KEMs?
  14. NIST provided APIs and security definitions for Public Key encryption, KEM, and digital signature. Why are other functionalities not included?
  15. How does a submission obtain secure randomness?

     

A1: Assuming one of the components of the hybrid mode in question is a NIST-approved cryptographic primitive, such hybrid modes can be approved for use for key establishment or digital signatures. In particular, a hybrid mode for signatures consists of two signatures. The mode is valid if and only if both signatures are valid.  FIPS 140 validation can only validate the part of the hybrid signature which is currently approved by NIST. Similarly, a hybrid key establishment scheme derives keying material from two or more secret values established by different key establishment primitives. Only the NIST approved key establishment primitive can be validated according to FIPS 140.  In any case, such validation is only certifying that the NIST-approved portion is correctly implemented and used, and it says nothing about the security of the quantum-resistant portion of the hybrid mode. Hybrid modes may be an initial step for the migration to post-quantum primitives. However, NIST continues to believe that the long term solution to the threat of quantum computers is to provide standards for post-quantum public key cryptography, through the process outlined in our call for algorithms.
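
To illustrate the dual-signature rule described above, the following is a minimal C sketch of hybrid signature verification. The component verifiers verify_classical() and verify_pq() are hypothetical placeholders, not a NIST-defined API; the point is only that the hybrid signature is accepted if and only if both component signatures verify.

#include <stdbool.h>
#include <stddef.h>

/* Hypothetical component verifiers (placeholders, not a NIST-defined API). */
bool verify_classical(const unsigned char *msg, size_t msg_len,
                      const unsigned char *sig, size_t sig_len);
bool verify_pq(const unsigned char *msg, size_t msg_len,
               const unsigned char *sig, size_t sig_len);

/* A hybrid signature is valid if and only if BOTH component signatures are valid.
   FIPS 140 validation would cover only the NIST-approved component. */
bool verify_hybrid(const unsigned char *msg, size_t msg_len,
                   const unsigned char *sig_classical, size_t sig_classical_len,
                   const unsigned char *sig_pq, size_t sig_pq_len)
{
    return verify_classical(msg, msg_len, sig_classical, sig_classical_len)
        && verify_pq(msg, msg_len, sig_pq, sig_pq_len);
}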

A2: NIST plans to coordinate with other standards organizations, such as the IETF, to develop standards for stateful hash-based signatures. As stateful hash-based signatures do not meet the API requested for signatures, this standardization effort will be a separate process from the one outlined in the call for proposals. It is expected that NIST will only approve a stateful hash-based signature standard for use in a limited range of signature applications, such as code signing, where most implementations will be able to securely deal with the requirement to keep state.

A3: The optimized code required as part of the submission package should be ANSI C with no assembly (this includes inline assembly). This code is meant to be portable. If significant optimizations can be made with assembly, then it can be included as an additional implementation and discussed in the performance analysis.

A4: The reference platform was defined in order to provide a common and ubiquitous platform to verify the execution of the code provided in the submissions.  NIST will include performance metrics from a variety of platforms in our evaluation, including: 64-bit “desktop/server class,” 32-bit “mobile class,” microcontrollers (32-, 16-, and where possible, 8-bit), as well as hardware platforms (e.g., FPGA). Submitters are encouraged to provide additional implementations for these platforms if possible.

The reference platform should be treated as a single core machine. If an algorithm can make particular use of multiple cores or vector instructions, submitters are encouraged to provide additional implementations for these platforms.

A5: Our reason for primarily considering attacks involving fewer than 2^64 decryption/signature queries is that the number of queries is controlled by the amount of work the honest party is willing to do, which one would expect to be significantly less than the amount of work an attacker is willing to do. Any attack involving more queries than this looks more like a denial of service attack than an impersonation or key recovery attack. Furthermore, effectively protecting against online attacks requiring more than 2^64 queries using NIST standards would require additional protections which are outside the scope of the present post-quantum standardization effort, most notably the development of a block cipher with a block size larger than 128 bits. This may be something NIST pursues in the future, but we do not feel it is necessary for addressing the imminent threat of quantum computers. That said, as noted in the proposed call for algorithms, NIST is open to considering attacks involving more queries, and would certainly prefer algorithms that did not fail catastrophically if the attacker exceeds 2^64 queries.

A6: NIST did not provide an explicit limit on the rate of decryption/decapsulation failure. In cases where a scheme is targeting chosen ciphertext security, decryption/decapsulation failures may pose a security threat. A failure rate sufficiently high to violate the claimed security of a scheme is, of course, unacceptable. If, on the other hand, there is a strong argument that decryption/decapsulation failures do not pose a security threat, then the decryption/decapsulation failure rate becomes simply one among many performance considerations. NIST does not wish, at this time, to prejudge what performance considerations are important, and will therefore leave it up to submitters to provide performance characteristics that they feel will be most useful for the applications they think best fit their schemes.

A7: This process shares many features with NIST competitions, and is modeled after the successes we have had with competitions in the past.  There are, however, some important requirements, demanded by the current research climate, that constitute significant distinctions between this process and a competition.

First, our handling of the applicants does not coincide with a competition as specified in NISTIR 7977, nor does this process correspond to multiple parallel competitions.  There will not be an appropriate or directly analogous concept of “winners” and “losers.”  Our intention is to select a couple of options for more immediate standardization, as well as to eliminate some submissions as unsuitable. There will likely be some submissions that we do not select for standardization, but that we also do not eliminate; these may be excellent options for a specific application that we are not yet ready, or do not currently have the resources, to standardize.  In such a circumstance, we would communicate with the submitters to allow these to remain under a public license for study and practice and to remain under consideration for future standardization.  There is no specification for the handling of such an applicant in a competition.
           
Second, the state of the science in the competitions of the past, i.e. for the AES and SHA-3 competitions, was far more developed than for post-quantum cryptography.  Though differences of opinion are inevitable, the selection of the past winners should not have been too surprising.  The situation in post-quantum cryptography is less clear and opinions of required properties are less unanimous.  In addition, some of NIST’s selection criteria, particularly regarding post-quantum security, may need further refinement in response to ongoing research.
In many respects, the PQC standardization process is less like a competition, and more like an “analysis of alternatives.” The goal of the process is not primarily to pick a winner, but to document the strengths and weaknesses of the different options, and to analyze the possible tradeoffs among them. In the end, even if there is not a final consensus on what constitutes the best option, NIST expects that it will be able to make some selections that most experts will agree are satisfactory.

A8: Classical cryptanalysis is still valuable for a number of reasons. First, classical computers are not going away. For algorithms not subject to dramatic quantum attacks, such as those involving Shor’s algorithm, NIST believes that classical measures of security will continue to be highly relevant. Currently envisioned quantum computing technologies would be orders of magnitude slower and more energy intensive than today’s classical computing technology, when performing the same sorts of operations. In addition, practical attacks typically must be run in parallel on large clusters of machines, which diminishes the speedup that can be achieved using Grover’s algorithm. When all of these considerations are taken into account, it becomes quite likely that variants of Grover’s algorithm will provide no advantage to an adversary wishing to perform a cryptanalytic attack that can be completed in a matter of years, or even decades. As most quantum attacks on proposed post-quantum cryptosystems have involved some variant of Grover’s algorithm, it may be the case that the best attack in practice will simply be the classical attack.

Also, the science involved in assessing classical security is better developed than that for assessing post-quantum security, and there is a larger community of researchers who can contribute to these investigations, increasing our confidence in the security of the proposed cryptosystems. Finally, classical cryptanalysis can improve our understanding of the mathematical structures underlying these cryptosystems, which is also the basis for quantum cryptanalysis.

A9: Even assuming no disparity in the cost of quantum and classical gates, NIST estimates that the assumption holds as long as the adversary is depth limited to fewer than about 2^87 logical quantum gates. This is quite near the limit of what NIST considers to be a plausible technology for the foreseeable future.

A10: Security strengths 1, 3, and 5 are defined in such a way that they are likely to be met by any scheme that:

      • Provides classical security strength of 128, 192, and 256 bits, respectively, AND
      • Is not subject to quantum attacks, other than classical attacks sped up by generic techniques (Grover’s algorithm, quantum walks, amplitude amplification, etc.)

Security strengths 1, 3, and 5 are unlikely to be met by any scheme with less than 128, 192, or 256 bits of classical security, respectively. This is not, however, an explicit requirement: At least for categories 3 and 5, NIST is open to classifying parameters with less classical security in these categories, given a sufficiently compelling argument demonstrating that:

      • With any plausible state of future technology where AES192 (resp. AES256) is near the limit of what can be broken by brute force, the most practical brute force attack against AES192 (resp. AES256) is not going to be the classical attack.
      • Given any such plausible state of future technology, the most practical attack against the parameters provided will be less practical than the most practical brute force attack against AES192 (resp. AES256).

Security strengths 2 and 4 are defined in such a way that they offer the maximum possible quantum security strength that can be offered by a scheme that only has a classical security strength of 128 or 192 bits, respectively. They will generally be easier to meet with parameter sets offering more classical security. A detailed quantum security analysis will be required to determine whether a parameter set meets these security strengths (unless the parameter set also meets the criteria for the next higher security strength). 

A11:  NIST will not remove a scheme from consideration just because it was submitted with incorrectly analyzed parameters. Depending on how far off the estimate was, and how unanticipated the attack, NIST may take it as a sign the algorithm isn’t mature enough, which could lead NIST to remove the scheme from consideration. However, assessments of an algorithm’s maturity will not be primarily based on security strength categories. Rather, the point of the categories is to compare like with like when doing performance comparisons and to make it easier to plan crypto transitions in the future. NIST will respond to attacks that contradict the claimed security strength category, but do not bring the maturity of the scheme into question, by bumping the parameter set down to a lower category, and potentially encouraging the submitter to provide a higher security parameter set.

A12: For any scheme selected for standardization, NIST hopes to select parameters sets from those offered by the submitter. If the submitted parameter sets fail to meet NIST’s needs, for whatever reason, NIST hopes to work with the submitter to provide parameter sets that do meet NIST’s needs. NIST may also choose not to standardize some of the submitted parameter sets. NIST’s reasons for doing this could include insufficient security, unacceptable performance, and too many parameter sets.

NIST has numerous reasons for specifying a categorical post-quantum security hierarchy in the Call for Proposals for post-quantum standards.  The primary purpose is to facilitate the comparison of submissions achieving specific benchmark security levels so that an honest assessment can be made.  Because the science in this area is not yet fully developed, it is possible and appropriate for these benchmarks to be refined in response to future advances in theory.  It is not NIST’s intent to unfairly judge submissions based on an analysis of parameter sets that later proves to have no impact.

It is, however, NIST’s present belief that all five of the security strength categories provide sufficient security to allow for standardization. More precisely, NIST would describe security strengths 4 and 5 as “likely excessive,” 2 and 3 as “probably secure for the foreseeable future,” and security strength 1 as “likely secure for the foreseeable future, unless quantum computers improve faster than is anticipated.” The only security considerations which are likely to lead NIST to decline to standardize a parameter set for a scheme NIST has selected are

    1. NIST may assess the parameters as having insufficient security strength for any of the five categories.
    2. NIST may assess the parameters as having too little security margin to compensate for the expected uncertainty in attack complexity.
    3. NIST may decide, based on technological developments during the evaluation process, that one or more of the security strength categories provides insufficient security. As each security category is defined to be at least as secure as an already-standardized, reference primitive, NIST would signal its uncertainty regarding the security of the category by announcing plans to deprecate or withdraw the reference primitive. For example, if NIST were to signal that parameters in category 1 may provide insufficient security, it would do so by announcing plans to deprecate or withdraw AES128. NIST has not done this, and does not expect to do so during the evaluation process.

NIST may also decline to standardize parameters which have unacceptable performance. If NIST feels the higher security strength categories cannot be met with acceptable performance, NIST may encourage the submitter to provide parameters with intermediate security between security strengths 2 and 3, or between 3 and 4.

Finally, NIST may pare down the range of options offered by the submitter, regarding how to select parameters. Flexibility is generally a good thing, but it may be weighed against the complexity of implementing and testing for all available options.

A13: To convert a public key encryption function to a KEM, NIST will construct the encapsulate function by generating a random key and encrypting it. The key generation and decapsulation functions of the KEM will be the same as the key generation and decryption functions of the original public key encryption scheme. To convert a KEM to a public key encryption scheme, NIST will construct the encryption function by appending to the KEM ciphertext an AES-GCM ciphertext of the plaintext message, with a randomly generated IV. The AES key will be the symmetric key output by the encapsulate function. (The key generation function will be identical to that for the original KEM, and the decryption function will be constructed by decapsulation followed by AES decryption.)
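
The following C sketch illustrates both directions of these standard conversions. The function names, buffer sizes, and the aes_gcm_encrypt() helper are hypothetical and for illustration only; they are not the official API from the Call for Proposals.

#include <stddef.h>
#include <string.h>

#define KEM_KEYBYTES 32   /* illustrative length of the encapsulated symmetric key */
#define GCM_IVBYTES  12   /* illustrative AES-GCM IV length */

void randombytes(unsigned char *x, unsigned long long xlen);

/* Assumed interface of the underlying public key encryption scheme. */
int pke_encrypt(unsigned char *ct, size_t *ct_len,
                const unsigned char *msg, size_t msg_len,
                const unsigned char *pk);

/* Hypothetical one-shot AES-GCM encryption (e.g., supplied by a crypto library). */
int aes_gcm_encrypt(unsigned char *ct, size_t *ct_len,
                    const unsigned char *msg, size_t msg_len,
                    const unsigned char *key, const unsigned char *iv);

/* PKE -> KEM: encapsulation generates a random symmetric key and encrypts it.
   Key generation and decapsulation are simply the PKE's key generation and decryption. */
int kem_encapsulate(unsigned char *ct, size_t *ct_len,
                    unsigned char *key, const unsigned char *pk)
{
    randombytes(key, KEM_KEYBYTES);
    return pke_encrypt(ct, ct_len, key, KEM_KEYBYTES, pk);
}

/* KEM -> PKE: the ciphertext is the KEM ciphertext followed by a random IV and
   an AES-GCM ciphertext of the message under the encapsulated key. */
int pke_from_kem_encrypt(unsigned char *out, size_t *out_len,
                         const unsigned char *msg, size_t msg_len,
                         const unsigned char *pk)
{
    unsigned char key[KEM_KEYBYTES], iv[GCM_IVBYTES];
    size_t kem_ct_len, sym_ct_len;
    int rc;

    rc = kem_encapsulate(out, &kem_ct_len, key, pk);
    if (rc != 0) return rc;

    randombytes(iv, GCM_IVBYTES);
    memcpy(out + kem_ct_len, iv, GCM_IVBYTES);

    rc = aes_gcm_encrypt(out + kem_ct_len + GCM_IVBYTES, &sym_ct_len,
                         msg, msg_len, key, iv);
    if (rc != 0) return rc;

    *out_len = kem_ct_len + GCM_IVBYTES + sym_ct_len;
    return 0;
}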

A14: NIST is looking primarily to replace quantum-vulnerable schemes with functionalities that are widely used, have widely agreed upon security and correctness definitions in academic literature, and for which there appear to be a range of promising approaches for designing a post-quantum replacement. NIST considered a number of other functionalities, but did not provide explicit support for them, since it did not feel they met the above criteria as well as encryption, KEM, and signature. In many cases, NIST expects that schemes providing some of these functionalities may be submitted as a special case or an extension of one of the functionalities we explicitly asked for. In such a case, any additional functionality would be considered an advantage, as noted in section 4.C.1 of our Call for Proposals. Two particular functionalities NIST considered were authenticated key exchange (AKE) and a drop-in replacement for Diffie-Hellman.

Diffie-Hellman is an extremely widely used primitive, and has a number of potentially useful special features, such as asynchronous key exchange, and secure key use profiles ranging from static-static to ephemeral-ephemeral. However, NIST believes that in its most widely used applications, such as those requiring forward secrecy, Diffie-Hellman can be replaced by any secure KEM with an efficient key generation algorithm. The additional features of Diffie-Hellman may be useful in some applications, but there is no widely accepted security definition of which NIST is aware that captures everything one might want from a Diffie-Hellman replacement. Additionally, some plausibly important security properties of Diffie-Hellman, such as a secure, static-static key exchange, appear difficult to meet in the post-quantum setting. NIST therefore recommends that schemes sharing some or all of the desirable features of Diffie-Hellman be submitted as KEMs, while documenting any additional functionality.
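
As an informal illustration of the point above, an ephemeral forward-secret exchange can be built from any KEM with an efficient key generation algorithm. The interface and sizes below are hypothetical placeholders, not the API from the Call for Proposals.

/* Hypothetical KEM interface and sizes, for illustration only. */
#define KEM_PUBLICKEYBYTES  800
#define KEM_SECRETKEYBYTES  1632
#define KEM_CIPHERTEXTBYTES 768
#define KEM_SSBYTES         32

int kem_keypair(unsigned char *pk, unsigned char *sk);
int kem_enc(unsigned char *ct, unsigned char *ss, const unsigned char *pk);
int kem_dec(unsigned char *ss, const unsigned char *ct, const unsigned char *sk);

/* Both the key pair and the encapsulation are generated fresh per session, so the
   exchange plays the role of ephemeral-ephemeral Diffie-Hellman (forward secrecy). */
void ephemeral_exchange(void)
{
    unsigned char pk[KEM_PUBLICKEYBYTES], sk[KEM_SECRETKEYBYTES];
    unsigned char ct[KEM_CIPHERTEXTBYTES];
    unsigned char ss_initiator[KEM_SSBYTES], ss_responder[KEM_SSBYTES];

    kem_keypair(pk, sk);            /* initiator: fresh ephemeral key pair; sends pk */
    kem_enc(ct, ss_responder, pk);  /* responder: encapsulates to pk; sends ct back  */
    kem_dec(ss_initiator, ct, sk);  /* initiator: recovers the same shared secret    */

    /* ss_initiator and ss_responder now hold the same value; a KDF would derive
       session keys from it, just as with a Diffie-Hellman shared secret. */
}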

AKE is also a widely used functionality. However, NIST would consider it a protocol rather than a scheme. This is an important distinction, because most widely used AKE protocols are constructed by combining simpler primitives, like digital signature, public key encryption, and KEM schemes. NIST wants to leave open the possibility that standards for these schemes may come from different submitters. Additionally, the security definitions for AKE are significantly more complicated and contentious than those for the functionalities NIST is explicitly asking for in its call for proposals. NIST recognizes that there are some AKE functionalities, in particular implicitly authenticated key exchange (IAKE), that cannot easily be constructed from simpler components. While it is less natural to treat IAKE schemes as an extension of the KEM framework than it is for Diffie-Hellman-like primitives, NIST does believe that it can be done in most cases. For example, a significant part of the functionality of a 2-message IAKE protocol could be demonstrated by treating the initiator’s public authentication key as part of a KEM public key, and the responder’s public authentication key as part of the KEM ciphertext.

A15: The function randombytes() will be available to the submitters. This is a function from the SUPERCOP test environment and should be used to generate seed values for an algorithm.

For functional and timing tests, a deterministic generator is used inside randombytes() to produce the seed values. If security testing is being done, simply substitute calls to a true hardware RBG inside randombytes().

The function prototype for randombytes() is:

// The xlen parameter is in bytes
void randombytes(unsigned char *x,unsigned long long xlen)

The following demonstrates the use of the KAT and non-KAT versions of the functions to generate a key pair for encryption:

int crypto_encrypt_keypair_KAT(
              unsigned char *pk,
              unsigned char *sk,
              const unsigned char *randomness
         );

int crypto_encrypt_keypair(unsigned char *pk, unsigned char *sk)
{
      unsigned char seed[CRYPTO_RANDOMBYTES];

      /* Obtain a fresh seed from randombytes(), then derive the key pair
         deterministically from that seed via the KAT version of the function. */
      randombytes(seed, CRYPTO_RANDOMBYTES);
      return crypto_encrypt_keypair_KAT(pk, sk, seed);
}

