Cryptography—the technological foundation for most cybersecurity functions—is constantly under attack by a multiplying array of adversaries that range from individual criminals seeking financial gains to terrorist groups and nation states. If the cryptographic protection for an organization's information technology is defeated or bypassed, the organization—and, potentially, our nation’s entire infrastructure system—may be wide open to malicious attack. NIST is responsible for developing U.S. federal cryptographic standards as well as the technologies and programs used to determine and validate correct implementation of those standards. This has been a mainstay of NIST's computer security work for nearly 50 years.
In the early 1970s, the National Bureau of Standards (NBS) initiated the development of a symmetric key cryptographic algorithm standard that could be used by federal agencies and commercial organizations to protect sensitive, unclassified data from unauthorized disclosure and modification during transmission and in storage. Before that, cryptography had been largely the concern and responsibility of military and intelligence organizations. This was the first Federal Government project to develop a publicly available cryptographic standard to satisfy a broad range of requirements.
NBS invited individuals and organizations to submit candidate algorithms, and in August 1974 received a suitable algorithm from IBM, which had recently developed a family of cryptographic algorithms primarily for financial transaction protection (e.g., ATMs). After reviewing the IBM algorithm’s capabilities, NBS chose the algorithm as the basis for its proposed Data Encryption Standard (DES), and in 1976 held public workshops to analyze the algorithm’s utility and security. NBS, the National Security Agency (NSA), IBM, and academic and industry colleagues worked together to ensure that the proposed standard met the technical criteria and to verify that it, its implementations, and the protocols using it would be useful in many commercial and government applications. NBS issued DES as Federal Information Processing Standard (FIPS) 46 on November 23, 1977.
Establishment of DES was not without controversy. It was known that NSA had worked with NBS throughout the DES development, evaluated the proposed algorithm, and recommended several changes; IBM modified the algorithm based on these recommendations. Some critics suspected that NSA had deliberately weakened the algorithm or perhaps even introduced a "trap door" into the specification that would enable the intelligence community to decrypt messages. Another controversy involved the key length. A commonly accepted requirement is that no attack for obtaining the plaintext should be more efficient than trying all possible keys. Critics argued that the effective DES key length of 56 bits (a 64-bit key minus 8 parity bits) was too short for long-term security, and that expected increases in computing power would soon make a 56-bit key vulnerable to attack by key exhaustion. One prominent critique was "Exhaustive Cryptanalysis of the NBS Data Encryption Standard" by Whitfield Diffie and Martin E. Hellman, published in Computer magazine in 1977. NBS responded that the standard was adequate against any practical attack for the anticipated life of the standard (15 years) and would be reviewed for adequacy every five years.
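The key-exhaustion argument is ultimately arithmetic. The sketch below illustrates the scale of a 56-bit keyspace against some hypothetical search rates; the rates are illustrative assumptions, not the figures Diffie and Hellman used.

```python
# Illustrative arithmetic only: the size of the DES keyspace and how long
# a brute-force (key-exhaustion) search would take at assumed trial rates.
# The rates below are hypothetical, chosen only to show the scaling.

KEYSPACE = 2 ** 56  # effective DES key length: 56 bits

def years_to_exhaust(keys_per_second: float) -> float:
    """Worst-case time to try every key, in years."""
    seconds = KEYSPACE / keys_per_second
    return seconds / (365.25 * 24 * 3600)

print(f"{KEYSPACE:,} possible keys")
for rate in (1e6, 1e9, 1e12):  # a million, a billion, a trillion trials/sec
    print(f"{rate:.0e} keys/s -> {years_to_exhaust(rate):,.1f} years")
```

The point of the critique was that dedicated hardware could reach the higher end of such rates, collapsing the search time from millennia to days.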
DES was used to protect sensitive Federal Government data for decades—far longer than originally anticipated—until 2005 for legacy applications. It was also accepted as an American National Standards Institute (ANSI) standard (X3.92), Data Encryption Algorithm (DEA) for the protection of financial information. In retail banking applications, DES protected the personal identification numbers (PINs) used in bankcard transactions. DES became widely used in many different sectors in the U.S. Both U.S. and international cryptographic products were typically built to include DES. In 1986, the International Organization for Standardization (ISO) voted to approve DES as an international standard called DEA-1. DES generated a revolution in private sector information exchange by enabling the use of cryptography to protect the security of private sector information. Cryptography had emerged from under the cloak of government control.
Note: Portions of this section were derived from Data Encryption Standard (William Burr), The Data Encryption Standard: Past and Future (Miles Smid and Dennis Branstad), and Development of the Advanced Encryption Standard (Miles Smid).
President Reagan issued National Security Decision Directive 145 (NSDD 145) in 1984, which gave the NSA security authority over many Federal Government computers. This caused concern that NSA had too much authority. Congress passed the Computer Security Act of 1987 to transfer some computer security responsibilities from NSA, an intelligence agency, to NBS, a civilian agency. In March 1989, the directors of NIST and the NSA formally established the NIST-NSA Technical Working Group (TWG). This action was controversial because some thought it violated the intent of the Computer Security Act, but it was not rescinded, and three decades later the TWG still fosters NIST-NSA technical collaboration.
One of the first topics the TWG addressed was choosing a public key algorithm for signatures to be used in the forthcoming Digital Signature Standard (DSS). At that time, the most popular industry public key algorithm was RSA. The TWG discussed evaluations of RSA and several other algorithms, including the Digital Signature Algorithm (DSA) designed by NSA. Both DSA and RSA were strong algorithms, but with a significant difference: RSA provided both key management and digital signatures in one algorithm, whereas DSA provided only digital signature capability. Even though RSA was clearly industry’s choice, after extensive discussions NIST proposed DSA for use in the DSS in 1991. This was perceived by many as NIST favoring NSA’s own interests over commercial ones, since it was expected to necessitate having separate DSA and RSA versions of each product for Federal Government use.
The DSS included another algorithm designed by NSA, the Secure Hash Algorithm (SHA), now known as SHA-0. SHA-0 and other cryptographic hash algorithms convert data strings into generally shorter fixed-length numeric strings. They can be used for digital signatures, message authentication codes, key derivation functions, pseudo-random functions, and other security applications. SHA-0 was intended to take the place of an earlier MD5 algorithm because of weaknesses discovered in MD5. SHA-0 was finalized in May 1993 in the original Secure Hash Standard (FIPS 180) and the DSS was finalized one year later in the original Digital Signature Standard (DSS) (FIPS 186).
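The fixed-length digest property described above is easy to demonstrate. SHA-0 itself is not available in standard libraries, so this sketch uses SHA-1 (its close successor) and SHA-256 to illustrate the behavior.

```python
import hashlib

# Hash functions map arbitrary-length input to a fixed-length digest.
# SHA-0 is not in Python's hashlib, so SHA-1 and SHA-256 stand in here.
for name in ("sha1", "sha256"):
    h = hashlib.new(name, b"abc")
    print(name, h.digest_size, "bytes:", h.hexdigest())

# A small change in the input produces a completely different digest,
# the property that digital signatures and MACs rely on.
print(hashlib.sha256(b"abd").hexdigest())
```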
NIST’s participation in the NIST-NSA TWG drew NIST into a highly controversial topic: enabling Federal Government decryption of DES-encrypted telecommunications by holding all keys in government escrow. The key escrow controversy stemmed from a conflict between the need to protect information exchanges from disclosure or interference, and law enforcement’s need to be able to gain authorized access to information being exchanged. It generated a change in how cryptographic standards were developed, leading to a larger private sector role.
Multiple federal agencies—NIST not among them—had begun to push for a key escrow policy because DES-protected telephones were about to become publicly available. Without key escrow, these phones would make it more difficult or perhaps impossible for intelligence and law enforcement agencies to recover information from encrypted telecommunications. The policy would permit agencies to decrypt telecommunications only if specifically authorized to do so. President Clinton initiated this key escrow program in 1993.
NSA designed the key escrow solution, which used DSA and the Secure Hash Algorithm (SHA-0) as well as an NSA-designed encryption algorithm called Skipjack. Skipjack was classified as secret, so the public cryptographic community could not evaluate it for weaknesses. The Federal Government tried to address this concern by selecting a panel of experts and asking them to reach their own conclusions regarding Skipjack security. The panel’s report concluded, “There is no significant risk that SKIPJACK can be broken through a shortcut method of attack.” Many organizations and individuals remained concerned that the Federal Government would abuse its authority.
NSA’s key escrow solution, which was publicly announced in April 1993, was built into NSA-approved, tamper-resistant electronic devices called Clipper chips. Each chip implemented the cryptographic components and the escrow protocol to be specified by an Escrowed Encryption Standard (EES). Each chip also had a Law Enforcement Access Field (LEAF) burned into the chip during manufacture. Whenever someone used a device to encrypt telecommunications, the escrow protocol would run. Periodically the device would broadcast its LEAF so that agencies authorized to access the key escrow apparatus could decrypt the traffic key, which was encrypted in the LEAF. Those agencies could use that decrypted key to recover plaintext. Originally, three federal agencies were to share the decryption ability, with any two of the three being required to work together to decrypt communications on behalf of another agency with the appropriate authorization.
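The idea of requiring multiple agencies to cooperate rests on splitting a key into components that are useless in isolation. The sketch below shows the simplest such scheme, XOR secret-splitting between two holders; it is illustrative only and is not the actual EES/Clipper escrow protocol, which involved the LEAF construction, family keys, and checksums.

```python
import secrets

# A minimal sketch of splitting a key into two escrow components by XOR.
# Illustrative only -- the real Clipper/EES escrow machinery was far more
# involved than this.

def split_key(key: bytes) -> tuple[bytes, bytes]:
    share1 = secrets.token_bytes(len(key))              # random component
    share2 = bytes(a ^ b for a, b in zip(key, share1))  # key XOR share1
    return share1, share2

def recover_key(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

unit_key = secrets.token_bytes(10)  # Skipjack used 80-bit (10-byte) keys
s1, s2 = split_key(unit_key)
assert recover_key(s1, s2) == unit_key
# Either share alone is statistically independent of the key.
```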
To support the NSA’s key escrow solution, NIST was tasked with leading the development of the EES. NIST established a secure facility for writing the EES with input from other agencies. As the Executive Order-specified deadline for finalizing the EES standard approached, it became clear that the public was strongly opposed to key escrow and Clipper chips. When NIST staff submitted the EES to the Department of Commerce, the consensus was that the public didn’t want it, the President did want it, and the leadership at the Department of Commerce could either approve the standard or resign. EES was signed on the last permissible day, and NIST published the EES in FIPS 185 in early 1994.
Shortly after its release, NSA discovered that a mistake had been made in the design of the Secure Hash Algorithm (SHA-0). A few months later, NIST received a replacement for SHA-0 from NSA. The new algorithm, SHA-1, differed from SHA-0 by only a small change. On July 11, 1994, a Federal Register Notice announced a proposed revision to the SHA, indicating that “The proposed revision corrects a technical flaw that made the standard less secure than had been thought. The algorithm is still reliable as a security mechanism, but the correction returns the SHS to the original level of security.” NSA would not explain to NIST why the changes were needed because that information would reveal the nature of the weakness and how it could be attacked. This lack of transparency resulted in increased public criticism of Clipper chips and the EES. NIST published SHA-1 in FIPS 180-1 in 1995.
With the standards in place, the Clinton administration declared encryption to be a munition, making it subject to export controls. Even though DES was a public algorithm, the export of DES products from the U.S. was strictly controlled. American companies that wanted to export technology products that used encryption had two options: use weakened (40-bit) DES encryption that the Federal Government could easily break or use the Clipper chip. The export requirements were widely unpopular with American companies, who did not think customers outside the U.S. would buy export-controlled products from them. Vendors either had to have different versions of their software for domestic and foreign markets, or they had to use weaker cryptography for everyone. NIST, as part of the Department of Commerce, normally supported the interests of U.S. companies, but the export requirements put NIST in a difficult position.
Ultimately, the Clipper chip and EES were hardly adopted at all, even by federal agencies. Matt Blaze (of AT&T Bell Laboratories) and other researchers found significant flaws in the EES and other forms of key escrow.
Once it was clear that industry was not adopting Clipper chips and the EES, a Technical Advisory Committee to Develop a Federal Information Processing Standard for the Federal Key Management Infrastructure (TACDFIPSFKMI) was formed in 1996. TACDFIPSFKMI consisted of 24 private-sector and government agency individuals. NIST ran the committee, which was tasked with creating a technical standard for implementing key escrow for federal agency telecommunications. The motivation for industry to work on this was to have a stronger encryption standard that was exportable by certain American companies, with approval from the State Department.
The NIST Director wanted to minimize NIST’s association with the committee. The Director asked the NIST staff to keep the committee’s work away from NIST, and the NIST staff took that literally, holding its meetings in other parts of the country. The committee spent nearly two years working on the standard but didn’t finish it because of a loss of interest by the private sector.
At the same time the TACDFIPSFKMI was working on its escrow standard, NIST participated in a Key Recovery Demonstration Project (KRDP) that focused on the ability to recover stored encrypted data in response to Federal Government business needs. The KRDP pilot efforts included 13 projects performed by 11 agencies using a variety of different commercial key recovery mechanisms.
The conflict of interest that spawned the key escrow activity remains unresolved. Enabling lawful access to information without making transactions vulnerable to terrorists, thieves, and other criminals has proved a very difficult problem with respect to both policy and technology.
Note: Portions of this section were derived from Development of the Advanced Encryption Standard (Miles Smid).
In the mid-1990s, DES was approaching the end of its expected security life. The EES was the subject of public controversy, and the Skipjack algorithm it specified remained classified. NIST needed a new symmetric block cipher. One option was to upgrade the standard to Triple DES, although it was unclear how the cryptographic community and users would view an algorithm whose basic engine, single DES, could already be defeated by exhaustive key search. Users wanted a more efficient algorithm, and they wanted to plan for computers that could take advantage of block sizes larger than 64 bits.
The alternative was to select a different algorithm as the NIST standard. NIST decided to lead the development of a new algorithm. It was clear to NIST that the Federal Government would have to allow major input from non-government organizations for a new encryption standard to successfully replace DES. NIST decided to partner with the international cryptographic community in the development of the new standard, hoping that such a partnership would lead to acceptance of the selected algorithm. The cryptographic expertise of the academic community had grown tremendously since the early days of DES, and many vendors were producing commercial cryptographic products worldwide. Surely the government could use this talent and these capabilities to produce a better standard.
NIST announced its intention to start developing a publicly disclosed Advanced Encryption Standard (AES) in early 1997. NIST’s initial goals for the AES included:
In the fall of 1997, NIST called for AES candidate algorithm nominations. The Federal Register announcement included detailed requirements for the submission packages, such as three required key sizes (128, 192, and 256 bits) and a required block size (128 bits). The submissions were to be evaluated on security, cost, and implementation characteristics. Although NIST intended to perform its own analysis, it strongly encouraged public evaluation, making all evaluation results publicly available. The submission deadline was June 15, 1998.
Hardware Performance Simulations of Round 2 Advanced Encryption Standard Algorithms (NSA, 2000)
NIST held the First AES Candidate Conference in August 1998. Round 1 of evaluation included 15 candidate algorithms from 10 countries. During the conference, the submitters of each algorithm briefed the attendees on their submissions. The cryptographers in the audience asked questions and proposed attacks on some algorithms. NIST also solicited public comments on the candidate algorithms.
In March 1999, NIST held a Second AES Candidate Conference to discuss the results of the analysis conducted by the global cryptographic community on the candidate algorithms. The three main evaluation factors were security, efficiency, and flexibility, but intellectual property was also a concern. Attacks had been claimed against several of the algorithms. Based on all comments and analyses received, NIST narrowed the pool of candidates to five algorithms: MARS, RC6, Rijndael, Serpent, and Twofish. By restricting the field to five finalists, NIST hoped that the evaluations of the finalists could be more focused, the security analysis could be more intensive, and the efficiency of the finalists in hardware could be examined.
As part of the second round of evaluations, NIST hosted the Third AES Candidate Conference in April 2000, inviting finalists to attend and discuss their algorithms. Researchers from 12 countries had worked on developing advanced encryption methods during the global competition. NIST invited the worldwide cryptographic community to “attack” the encryption methods in an effort to break them.
The finalists were evaluated, but selecting a winner proved challenging. Some wondered if multiple winners should be declared to provide more flexibility, but implementers were concerned they would have to support all the winners for interoperability reasons, which would add to the cost and complexity of their implementations. After much discussion, it appeared that a single AES winner was the desired choice of most participants.
On October 2, 2000, Secretary of Commerce Mineta announced that NIST had selected Rijndael as the proposed AES. A significant factor in the selection of Rijndael, developed by Joan Daemen and Vincent Rijmen, was its consistently good performance over software, hardware, firmware, and smart card implementations. NIST then formally specified the AES in FIPS 197, published on December 4, 2001.
The AES has enjoyed even more success than DES did at its pinnacle. In the first 20 years after the AES was published, the NIST Cryptographic Algorithm Validation Program validated more than 6000 AES implementations, compared to fewer than 500 DES validations performed over roughly 30 years. AES is used by many protocols—such as IEEE 802.11 for securing Wi-Fi networks, the Transport Layer Security (TLS) protocol, and Internet Protocol Security (IPsec)—and by a vast range of operating systems, applications, and devices—including smartphones, internet routers, and point-of-sale (POS) terminals. Globally, each day AES implementations protect countless online transactions and private data transmissions of individuals, businesses, and governments. The NSA found AES to be strong enough to protect certain types of classified information.
“…the adoption of Rijndael as the AES is a major milestone in the history of cryptography.”
The success of AES’s development demonstrated that by maximizing inclusiveness and transparency, NIST could develop better cryptographic standards with fewer issues and less resistance than if it tried to develop standards in a vacuum. Cooperation between the government and the cryptographic community increased trust and thereby led to more significant analysis and, ultimately, worldwide adoption of AES. AES has become the most attacked cryptographic algorithm ever, with more published cryptanalysis of AES than of all other algorithms put together because AES is so widely used.
The SHA-1 served well in the 1990s and early 2000s, but by 2005, researchers in China had found a theoretical attack on it. Fortunately, NIST had anticipated both 1) the need for bigger hash functions with strength equivalent to that of the AES competition winner, and 2) the likelihood of SHA-1 vulnerabilities. So, in 2002, NIST specified the SHA-2 hash family in FIPS 180-2. However, because of similarities between the SHA-1 and SHA-2 functions, NIST felt it was prudent to develop another standard specifying a novel hash function.
FIPS 180, Secure Hash Standard (1993)
FIPS 180-2, Secure Hash Standard (2002)
On November 2, 2007, NIST announced the beginning of a competition for a new cryptographic hash algorithm, to be called SHA-3. The process to develop and select SHA-3 was based on that used for AES. NIST worked with the cryptographic community to select the winner, Keccak, on October 2, 2012. During the selection process, new attacks on and new desirable properties for secure hash functions were discovered. In August 2015, NIST formally defined SHA-3 in FIPS 202.
When the selection of Keccak was announced, NIST indicated it was chosen because it “has a large security margin, good general performance, excellent efficiency in hardware implementations, and a flexible design.” It also uses a new construction “that can be readily adjusted to trade generic security strength for throughput, and can generate larger or smaller hash outputs, as required” (NIST Interagency or Internal Report [NISTIR] 7896, p. 59). Authenticated encryption is a security function that can be provided by this algorithm.
As of 2021, SHA-3 had not yet been widely adopted. The SHA-2 algorithms have held up well over time, and they have been incorporated into hardware. Unless a major weakness is discovered in SHA-2, SHA-3 is essentially in reserve, ready to be implemented quickly if needed. SHA-3 works completely differently than SHA-2 and other previous hash algorithms, so any systemic problems, if discovered in SHA-2, would be highly unlikely to affect SHA-3.
FIPS 186, Digital Signature Standard (DSS) (1994)
Cryptography and security applications make extensive use of random numbers and random bits, including in the generation of cryptographic keys and challenge-response protocols. The first NIST-approved random number generators (RNGs) were specified in FIPS 186 in 1994 for generating keys and other random values needed for digital signatures.
In the late 1990s, NIST began working with Accredited Standards Committee (ASC) X9, an ANSI committee for the financial services industry, to develop RNGs for cryptographic applications; this work resulted in the ANSI X9.82 series of publications. NIST also began the development of a suite of tests to determine the randomness of RNGs. As a result of this collaboration of NIST's Computer Security Division and Statistical Engineering Division, NIST Special Publication (SP) 800-22 was published with 16 statistical tests to be run on the output of an RNG.
In 2011, NIST began the development of a Randomness Beacon, a joint effort by the Computer Security Division and the Physical Measurement Laboratory, which was to develop a quantum source of randomness. The Beacon was first made available in 2012 using two commercially available sources of randomness. The Beacon posts bit-strings in blocks of 512 bits every 60 seconds. Each such value is timestamped and signed by NIST and includes the hash of the previous value to chain the sequence of values together. This prevents anyone, even the Beacon itself, from retroactively changing an output packet without being detected. The Beacon keeps all output packets and makes them available online. As the bits posted by the Beacon are public, they are not to be used as secret values, such as cryptographic keys or seeds for random number generators used in the construction of cryptographic keys.
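The tamper-evidence described above comes from hash chaining: each record commits to its predecessor's hash, so altering any past value breaks every later link. The sketch below illustrates the idea; the field names and layout are illustrative, not the Beacon's actual record format (which is also digitally signed by NIST).

```python
import hashlib

# A minimal sketch of hash-chaining timestamped random values, the idea
# behind the Beacon's tamper-evidence. Illustrative only -- not the
# Beacon's real record format or signing scheme.

def make_record(value: bytes, prev_hash: bytes, timestamp: int) -> dict:
    body = timestamp.to_bytes(8, "big") + prev_hash + value
    return {"timestamp": timestamp, "prev": prev_hash, "value": value,
            "hash": hashlib.sha512(body).digest()}

def verify_chain(records: list) -> bool:
    prev = b"\x00" * 64  # assumed all-zero hash for the first record
    for r in records:
        body = r["timestamp"].to_bytes(8, "big") + r["prev"] + r["value"]
        if r["prev"] != prev or hashlib.sha512(body).digest() != r["hash"]:
            return False
        prev = r["hash"]
    return True

chain, prev = [], b"\x00" * 64
for i in range(3):
    rec = make_record(bytes([i]) * 64, prev, 1000 + 60 * i)  # 512-bit values
    chain.append(rec)
    prev = rec["hash"]
assert verify_chain(chain)
chain[1]["value"] = b"\xff" * 64   # a retroactive change...
assert not verify_chain(chain)     # ...is detected
```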
The Beacon can be used whenever there is a need to demonstrate that a choice was made randomly. Some examples—including real problems they mitigate—include:
In 2006, NIST published the first of the SP 800-90 series of publications, based on the work by NIST and the NSA for the ANSI X9.82 publications. NIST specified four deterministic random bit generators (DRBGs): one based on the use of an approved hash function (Hash_DRBG), one based on the use of Hash-Based Message Authentication Code (HMAC) with an approved hash function (HMAC_DRBG), a third based on the use of a block cipher (CTR_DRBG), and a fourth based on the elliptic curve discrete logarithm problem (Dual_EC_DRBG).
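Of the four, HMAC_DRBG is compact enough to sketch. The following is a simplified rendering of the SP 800-90A construction using SHA-256; it omits the reseed counters, prediction-resistance handling, and security-strength checks the actual recommendation requires, so treat it as a study aid rather than a conforming implementation.

```python
import hmac, hashlib

# A simplified sketch of HMAC_DRBG (SP 800-90A) with SHA-256. Omitted
# for brevity: reseed counters, prediction resistance, and the input
# length/strength checks required by the actual recommendation.

class HmacDrbg:
    def __init__(self, seed: bytes):
        self.K = b"\x00" * 32   # initial key per the spec
        self.V = b"\x01" * 32   # initial value per the spec
        self._update(seed)

    def _update(self, data: bytes = b""):
        self.K = hmac.new(self.K, self.V + b"\x00" + data, hashlib.sha256).digest()
        self.V = hmac.new(self.K, self.V, hashlib.sha256).digest()
        if data:
            self.K = hmac.new(self.K, self.V + b"\x01" + data, hashlib.sha256).digest()
            self.V = hmac.new(self.K, self.V, hashlib.sha256).digest()

    def generate(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            self.V = hmac.new(self.K, self.V, hashlib.sha256).digest()
            out += self.V
        self._update()
        return out[:n]

drbg = HmacDrbg(b"entropy-input || nonce || personalization")
print(drbg.generate(32).hex())  # deterministic given the same seed
```

The "deterministic" in DRBG is visible here: the same seed material always produces the same output stream, which is why the quality of the entropy input is so critical.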
SP 800-90B, Recommendation for the Entropy Sources Used for Random Bit Generation (2018)
SP 800-90C (Draft), Recommendation for Random Bit Generator (RBG) Constructions (2016)
In 2013, articles from major news organizations based on classified documents leaked by Edward Snowden, an NSA contractor at the time, raised public concern that one of the DRBGs NIST specified—the Dual_EC_DRBG—might contain a backdoor created by the NSA. NIST immediately published a supplemental Information Technology Laboratory (ITL) Bulletin that provided a high-level discussion of the issues and announced NIST was reopening the entire SP 800-90 series of publications for public comment. NIST also recommended that Dual_EC_DRBG not be used, pending the resolution of the concerns.
This incident resulted in several actions. Based on further review of Dual_EC_DRBG, NIST officially removed approval for its use. NIST also initiated a formal review of its cryptographic standards development efforts. NIST documented its goals and objectives, principles of operation, processes for identifying cryptographic algorithms for standardization, methods for reviewing and resolving public comments, and other important procedures necessary for a rigorous process in draft NISTIR 7977.
At the same time, NIST’s Visiting Committee on Advanced Technology (VCAT) conducted a review of NIST's cryptographic standards and guidelines development process at the request of the NIST director. The VCAT convened a blue-ribbon panel of experts called the Committee of Visitors (COV) and asked each expert to review the process and provide individual reports of their conclusions and recommendations. The VCAT issued their report in July 2014, and their recommendations were incorporated in the process and procedures in NISTIR 7977, finalized in March 2016.
In hindsight, there were several indications of potential problems with Dual_EC_DRBG. Although there were numerous reasons why Dual_EC_DRBG was published with those problems unresolved, the core issue was trusting other organizations and individuals without verifying their work. To prevent such issues from happening again, NISTIR 7977 defined a set of principles for NIST’s cryptographic standards and guidelines development processes, including transparency, openness, balance, integrity, and technical merit.
These principles and the other information from NISTIR 7977 have been adopted for all current and future NIST cryptographic standards and guidelines development work. In the context of the RBG work, NIST has developed guidance on the design principles and requirements for the entropy sources used by RBGs, and the tests for the validation of entropy sources, SP 800-90B (2018). NIST also provided guidance on RBG construction, including methods for using an entropy source and for combining a non-deterministic RBG and DRBG in draft SP 800-90C (2016).
Note: Portions of this section were derived from Independent Committee Begins Review of NIST's Cryptographic Material and Development Process (Jennifer Huergo).
NIST’s involvement in public key infrastructure (PKI) work began in the mid-1990s. PKI standards work was already in the process of shifting from the International Organization for Standardization (ISO)/International Telecommunication Union (ITU), where the X.509 standard for public key certificate formats had been developed in the late 1980s, to the Internet Engineering Task Force (IETF) and IEEE, so NIST began supporting the IETF and IEEE PKI standards work. NIST and the IETF started a PKI X.509 working group called the PKIX in 1995. The X.509 standard had been designed to work only with certain ISO protocols, so the IETF work helped to enable X.509 to work with the Internet Protocol (IP) and common IP-based higher-layer protocols, such as the Domain Name System (DNS).
NIST also began helping with the development of protocols to support communications between PKI users, registration authorities (RAs), and certificate authorities (CAs). At that time, there were no open standards for conducting these communications. Some transactions were primitive: the way to see someone else’s certificate was to ask them to provide a printout of the certificate or to fax the printout. RSA Security Inc. had been developing and releasing a series of Public Key Cryptography Standards (PKCS), and they began introducing these into the IETF.
NIST collaborated with ten industry partners to develop Minimum Interoperability Specification for PKI Components in 1998 through Cooperative Research and Development Agreements (CRADAs). The goal of this work was to achieve interoperability among CAs and RAs. Unfortunately, the work ultimately failed—not due to technology but to business processes. CAs and RAs were not trying to interoperate because each company wanted to provide both CA and RA services.
NIST was much more successful with its international standards work on PKI. NIST staff held key roles in the IETF’s PKIX working group for several years and were heavily involved in protocol development work. However, PKI proved to be too complicated. The original specification for X.509 was only eight pages long. With every revision, more features were added until the IETF version of X.509 was about 100 pages long. The processes defined in the specification also became so complex that they wouldn’t scale well.
For the Federal Government and some federal contractors, the specification was reasonably implementable, but for most other organizations it was too complex. The Federal Government had the Federal Bridge to map PKI policies between its agencies. But once the Federal Government released Personal Identity Verification (PIV) cards in the mid-2000s, PIV forced the Federal Government agencies and the contractors supporting them to use the same policy.
NIST began its involvement in key management in the mid-1980s while participating in the development of American National Standards (ANS) X9.17, Financial Institution Key Management (Wholesale), a standard developed by the financial services industry in ANSI’s X9 committee. ANS X9.17 defined protocols to be used to transfer cryptographic keys using symmetric-key techniques. NIST subsequently developed tests to validate implementations of these protocols, and in 1992 released guidance which recommended the options within the ANS X9.17 standard to be used by the Federal Government.
In the early 2000s, NIST began a concerted effort to develop key management guidance in coordination with NSA. The first document to be published was SP 800-57 Part 1, which provided basic guidance for managing cryptographic keying material. Part 2 provided a basis for satisfying key management aspects of statutory and policy security planning requirements for federal agencies. Part 3 addressed key management issues associated with available cryptographic infrastructures, protocols, and applications. In the initial publication, topics included PKI, IPsec, TLS, Secure/Multipurpose Internet Mail Extensions (S/MIME), Kerberos, Over-the-Air Rekeying of Digital Radios (OTAR), Domain Name System Security Extensions (DNSSEC), and Encrypted File Systems (EFS). A 2015 revision of Part 3 added discussions on the use of the Secure Shell (SSH).
NIST also provided recommendations for establishing cryptographic keys using automated protocols, specifying key-establishment schemes using discrete logarithm cryptography based on ANS X9.42 and ANS X9.63. Key-establishment schemes for both finite field cryptography and elliptic curve cryptography were specified for the Diffie-Hellman and Menezes-Qu-Vanstone (MQV) algorithms. NIST specified key-establishment schemes for integer factorization cryptography (i.e., RSA) based on ANS X9.44 and included guidance on the use of key-derivation functions (KDFs), a required step for some of the schemes.
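The discrete-logarithm key-agreement idea underlying the Diffie-Hellman family of schemes can be sketched in a few lines of Python. The prime and generator below are toy values chosen for readability, not the approved parameter sets from NIST's recommendations; a real implementation would use standardized domain parameters and key sizes.

```python
import secrets

# Toy finite-field Diffie-Hellman sketch. The modulus is a small Mersenne
# prime chosen only for illustration -- far too small for actual security.
p = 2**61 - 1   # a known prime (illustrative only)
g = 3           # generator (illustrative only)

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's private key
b = secrets.randbelow(p - 2) + 1   # Bob's private key
A = pow(g, a, p)                   # Alice's public value
B = pow(g, b, p)                   # Bob's public value

# Both sides compute the same shared secret Z = g^(a*b) mod p.
z_alice = pow(B, a, p)
z_bob = pow(A, b, p)
assert z_alice == z_bob
```

In the standardized schemes, the shared secret Z is never used directly as a key; it is first run through an approved key-derivation function.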
Some of the key-establishment schemes used in communication protocols did not completely conform to the initial guidance, particularly in the key-derivation functions used. In 2010, NIST published guidance providing security requirements for using these KDFs with several commonly used protocols and with ANS X9.42 and ANS X9.63. KDFs are also used to generate keying material from a secret key.
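One widely used extract-then-expand construction of this kind is HKDF (RFC 5869), which turns a raw shared secret into as much keying material as an application needs. The sketch below implements it from Python's standard hmac and hashlib modules; it illustrates the general KDF pattern, not the exact functions mandated by any particular NIST publication.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # Extract step: condense the input keying material (e.g., a DH shared
    # secret) into a fixed-length pseudorandom key.
    return hmac.new(salt or b"\x00" * 32, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    # Expand step: stretch the pseudorandom key into `length` output bytes,
    # binding the result to an application-specific `info` string.
    okm, t, counter = b"", b"", 1
    while len(okm) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        okm += t
        counter += 1
    return okm[:length]

shared_secret = b"\x0b" * 22            # stand-in for a key-establishment output
prk = hkdf_extract(b"some-salt", shared_secret)
key = hkdf_expand(prk, b"application context", 32)   # a 256-bit derived key
```

Binding the derived key to an `info` string means the same shared secret can safely yield independent keys for different purposes (e.g., one for encryption, one for authentication).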
In some applications, such as the protection of electronically stored data, passwords are the only input required from the authorized users. Due to the low entropy and possibly poor randomness of those passwords, they are not suitable for direct use as cryptographic keys. NIST developed guidance in 2010 specifying a family of password-based KDFs for deriving cryptographic keys from passwords or passphrases for the protection of electronically stored data or data protection keys. The keying material derived from a password is called a master key, which may be used to protect the stored data directly or to derive the data protection keys.
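Python's standard library exposes one such password-based KDF, PBKDF2, directly. The sketch below derives a 256-bit master key from a password; the salt and iteration count are illustrative choices, and real deployments should follow current guidance on both.

```python
import hashlib
import secrets

password = b"correct horse battery staple"
salt = secrets.token_bytes(16)   # random per-password salt, stored with the data
iterations = 600_000             # illustrative count; higher slows offline guessing

# Derive a 256-bit master key. The iteration count deliberately makes each
# guess expensive, compensating for the password's low entropy.
master_key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)
```

Because the salt is random, the same password yields a different master key for every protected data set, which defeats precomputed-dictionary attacks.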
NIST had predicted that, in time, some of the approved cryptographic algorithms or key lengths would become obsolete. In 2011, NIST published more specific guidance on transitioning to the use of stronger algorithms or longer key lengths, classifying each usage as acceptable, deprecated (the use of the algorithm and key length could be risky), restricted (additional restrictions are imposed for safe use), or legacy-use (the use of the algorithm or key length is only allowed for processing already-protected information, e.g., for decryption but not encryption). For each approved algorithm and key length, the guidance gave the dates on which a transition from one category to another was scheduled.
Most cryptographic algorithms require the use of properly generated keys. In 2012, NIST provided approved methods for generating keys for each algorithm. In some cases, the entire key generation process is discussed, including obtaining and using random values from a random bit generator; in other cases, a reference is provided to the appropriate publication for that algorithm.
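For symmetric algorithms, the "entire key generation process" can be as simple as drawing the required number of bits from an approved random bit generator. The sketch below shows the idea using Python's secrets module, which wraps the operating system's cryptographically strong generator; this stands in for, and is not itself, a NIST-approved RBG.

```python
import secrets

# A 256-bit AES key is simply 256 random bits from a strong generator.
aes_key = secrets.token_bytes(32)    # 32 bytes = 256 bits

# Keys for different purposes must be generated (or derived) independently.
hmac_key = secrets.token_bytes(32)

assert len(aes_key) == 32 and len(hmac_key) == 32
```

Asymmetric key generation is algorithm-specific (e.g., RSA requires generating large primes), which is why the guidance refers readers to each algorithm's own publication for those cases.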
In 2009, NIST was tasked with identifying technologies whose development would allow organizations to "leap ahead" of the normal development lifecycles to vastly improve the security of future sensitive and valuable computer applications. Key management was identified as one of the areas that needed to be addressed. In 2013, NIST identified topics that needed to be addressed when designing and developing a Cryptographic Key Management System (CKMS). It also provided the basis for the Federal CKMS (FCKMS) profile that was subsequently specified in 2015. This guidance was developed to assist CKMS designers and implementers in selecting the features to be provided in their “products,” and to assist federal organizations and their contractors when procuring, installing, configuring, operating, and using an FCKMS.
As early as 1977, when DES was specified in FIPS 46, NBS began validating hardware implementations of its cryptographic specifications in commercial products. NISTIR 4820 (1992) is the earliest NIST publication documenting successful validations of DES (FIPS 46-3), Message Authentication Code (MAC) (FIPS 113), and key management using ANSI X9.17 (FIPS 171). Originally, NBS personnel used a testbed—a device now on exhibit in NIST’s Building 101—to test DES implementations, following procedures documented in NBS SP 500-20 (1977). In 1980 those tests were updated to include several maintenance tests, and in 1988 the MAC Validation System was documented. DES testing expanded in the late 1980s to include firmware implementations (FIPS 46-1).
A seismic shift in NIST’s cryptographic validation testing occurred in the early 1990s. In 1988, NIST had redesignated the General Services Administration’s Federal Standard 1027, General Security Requirements for Equipment Using the Data Encryption Standard, as FIPS 140. Recognizing the importance of securing more than just cryptographic algorithms, in 1994 NIST specified security requirements for cryptographic modules (FIPS 140-1) that could be implemented in hardware, firmware, software, or a combination thereof. Requirements spanned 11 security areas—including at least one FIPS-approved algorithm—with four different security levels. This enabled vendors to develop security products that could meet federal agencies’ wide variety of security needs. FIPS 140-1 required all federal agencies—including defense agencies—to use products that included FIPS 140-1 validated modules when cryptographic protection of sensitive, unclassified information was deemed necessary. The law did allow for some exceptions.
The development and implementation of FIPS 140-1 required a significant increase in validation testing capabilities, and led in 1995 to NIST and the Government of Canada’s Communications Security Establishment (CSE) announcing the Cryptographic Module Validation Program (CMVP), which they had developed in tandem with FIPS 140-1. Independent, commercial testing laboratories were accredited by the National Voluntary Laboratory Accreditation Program (NVLAP) to perform validation testing of cryptographic module and algorithm implementations. Vendors contracted with the labs for testing, and the validation authorities—NIST and CSE—reviewed validation reports and issued validation certificates while updating online validation lists. The first cryptographic module validation was issued in October 1995, when the CMVP had only three accredited testing laboratories—two in the U.S. and one in Canada. By the end of 2021, accredited labs had expanded to twenty-one in eight countries, and over 4100 validation certificates had been issued.
Around 2004/2005, as the number of algorithms, modes of operation, and testing complexity increased, algorithm testing was formally separated from the CMVP, becoming known as the Cryptographic Algorithm Validation Program (CAVP). As of 2020, more than 150 different cryptographic algorithms, modes, and components were being tested through CAVP, and the program began implementing an Automated Cryptographic Validation Testing System (ACVTS). NIST made on-demand testing available by posting open source testing code and using a cluster system that constantly computes test cases and stores more than 160 GB of material for future use. This greatly improved the efficiency of testing, reporting, and validating results.
IPsec is a framework of open standards for ensuring private communications over IP networks. It can provide several types of data protection: confidentiality, integrity, data origin authentication, prevention of packet replay and traffic analysis, and access protection. IPsec has several uses, with the most common being a virtual private network (VPN).
To expedite the development of IPsec, NIST staff designed and developed Cerberus, a reference implementation of the IPsec specifications, and PlutoPlus, a reference implementation of the key negotiation and management specifications used for IPsec. Numerous organizations from all segments of the internet industry acquired these implementations as a platform for their ongoing research on advanced issues in IPsec technology. NIST developed the IPsec Web-based Interoperability Tester (IPsec-WIT), which was built around the Cerberus and PlutoPlus prototype implementations. IPsec-WIT also served as an experiment in test system architectures and technologies. The novel use of web technology allowed IPsec-WIT to provide interoperability testing services anytime and anywhere without requiring any distribution of test system software or relocation of the systems under test.
NIST’s early work in IPsec involved cryptography, with NIST personnel authoring RFC 3602, The AES-CBC Cipher Algorithm and Its Use with IPsec, and RFC 3566, The AES-XCBC-MAC-96 Algorithm and Its Use with IPsec.
The next major effort from NIST on IPsec was the development of SP 800-77, Guide to IPsec VPNs. Published in 2005, this document described the three primary models for IPsec virtual private network architectures: gateway-to-gateway, host-to-gateway, and host-to-host. These models can be used, respectively, to connect two secured networks (such as a branch office and headquarters) over the Internet, to protect communications for hosts on unsecured networks (such as traveling employees), or to secure direct communications between two computers that require extra protection. The guide described the components of IPsec and presented a phased approach to IPsec planning and implementation.
Within a few years, many organizations planning VPN deployments had a choice between an IPsec-based VPN and a Secure Sockets Layer (SSL)-based VPN. An SSL VPN consists of one or more VPN devices to which users connect using their web browsers. The traffic between the web browser and the SSL VPN device is encrypted with the SSL protocol (which has since been replaced by the TLS protocol, although the “SSL VPN” name persists). SSL VPNs can provide remote users with access to web applications and client/server applications, as well as connectivity to internal networks. They offer versatility and ease of use because they use a protocol already included with standard web browsers.
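The client side of such a connection can be sketched with Python's standard ssl module, whose defaults negotiate TLS (the successor to SSL) and verify the server's certificate, much as a browser connecting to an SSL VPN gateway would. The hostname below is a placeholder, not a real gateway.

```python
import ssl

# Build a client-side TLS context the way a browser would: server certificate
# verification on, hostname checking on, legacy SSL/early-TLS versions refused.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# A real deployment would then wrap a TCP socket to the gateway, e.g.:
# import socket
# with socket.create_connection(("vpn.example.com", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="vpn.example.com") as tls:
#         ...  # all traffic on `tls` is now encrypted end-to-end
```

This is the "ease of use" advantage the text describes: the entire protection mechanism rides on a protocol stack the client already ships with.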
To help organizations decide whether IPsec VPNs or SSL VPNs would better address their needs, in 2008 NIST published guidance aiming to assist organizations in understanding SSL VPN technologies. The publication made recommendations for designing, implementing, configuring, securing, monitoring, and maintaining SSL VPN solutions. It also compared SSL VPNs with technologies such as IPsec VPNs and other VPN solutions.
NIST-approved cryptographic standards were designed to perform well on general-purpose computers. In recent years, there has been increased deployment of small computing devices that have limited resources with which to implement cryptography. When current NIST-approved algorithms can be engineered to fit into the limited resources of constrained environments, their performance may not be acceptable. For these reasons, NIST started a lightweight cryptography project in 2013 to learn more about the issues. NIST held two Lightweight Cryptography Workshops in Gaithersburg, Maryland in 2015 and 2016 to solicit public feedback on the constraints and limitations of the target devices and on the requirements and characteristics of real-world applications of lightweight cryptography.
In 2017, NIST released a report summarizing the findings of the project and outlining NIST’s plans for standardization of lightweight algorithms. In the report, NIST explained that the landscape for lightweight cryptography was moving so quickly that a standard produced using the competition model would likely be outdated prior to standardization. NIST’s approach for lightweight cryptography is to develop new recommendations using an open call for proposals to standardize currently available algorithms. NIST is planning to develop and maintain a portfolio of lightweight algorithms and modes that are approved for limited use. The initial focus of the project is block ciphers, authenticated encryption schemes, hash functions, message authentication codes, cryptographic permutations, and stream ciphers.
In February 2019, 57 candidate algorithms were submitted to NIST for consideration. Among these, 56 were accepted as first-round candidates in April 2019, marking the beginning of the first round of the NIST Lightweight Cryptography Standardization Process. In August 2019, NIST selected 32 candidates to advance to a second round of analysis, and in March 2021, ten algorithms advanced to the final round.
If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. Some engineers predict this will be possible by 2040. Historically, it has taken almost two decades to deploy our modern public key cryptography infrastructure. Therefore, NIST has already begun preparing information security systems to resist quantum computing.
The goal of post-quantum cryptography (also called quantum-resistant cryptography) is to develop cryptographic systems that are secure against both quantum and classical computers and can interoperate with existing communications protocols and networks. Public key cryptographic algorithms specified in current NIST publications are vulnerable to attacks from large-scale quantum computers, as documented in NISTIR 8105 in mid-2016. In late 2016, NIST initiated a process to develop and standardize one or more quantum-resistant public key cryptography algorithms. The new standards will specify one or more publicly disclosed digital signature, public key encryption, and key establishment algorithms that are available worldwide and can protect sensitive information well into the foreseeable future.
In November 2017, 82 candidate algorithms were submitted to NIST for consideration. Among these, 69 were accepted as first-round candidates, marking the beginning of the First Round of the NIST Post-Quantum Cryptography Standardization Process. As of March 2022, there were still 15 algorithms being evaluated in the third round. NIST expected the first standard with quantum-resistant algorithms to be published around 2024.
Note: This section was derived from NISTIR 8240, Status Report on the First Round of the NIST Post-Quantum Cryptography Standardization Process, the Post-Quantum Cryptography project page, and the Post-Quantum Cryptography Standardization project page.