
Security Automation and Vulnerability Management

As the technology and systems that depend on cybersecurity increase in complexity, it becomes ever more important for cybersecurity and privacy to be developed, implemented, tested, and assessed in ways that can be highly automated. NIST has been a leader in this area, with its collaborative development of the Security Content Automation Protocol (SCAP), the National Checklist Program (NCP), the United States Government Configuration Baseline (USGCB), and the critical National Vulnerability Database (NVD).

Early Work on Standardizing Practices and Configurations

Federal Agency Security Practices (FASP)

The Federal Agency Security Practices (FASP) effort was initiated as a result of the success of the Federal Chief Information Officers (CIO) Council’s Federal Best Security Practices (BSP) pilot effort to identify, evaluate, and disseminate best practices for critical infrastructure protection (CIP) and security. NIST was asked to undertake the transition of this pilot effort to an operational program. As a result, NIST encouraged agencies to submit their IT security information and practices for posting on its FASP website to share with others.

By 2003, the FASP site contained over 100 documents and presentations, including agency policies, procedures, and practices, and IT product-specific checklists for settings and configurations. These checklists were recommendations, not mandatory requirements, and were not intended to be seen as endorsements by NIST for any products, but as potential aids for securing certain products.

Windows 2000 Professional Configuration Template and Security Checklists

It is complicated, arduous, and time-consuming for system administrators to determine a reasonable set of security settings for a complex operating system. To make this task simpler, easier, and more secure, NIST partnered with major segments of the security community to develop, review, and test the Windows 2000 Professional consensus baseline settings in order to improve the security posture of Windows 2000 Professional systems.

Using guidance developed by the National Security Agency (NSA) and the broader IT security community, NIST published guidance in 2002 intended to assist the users and system administrators of Windows 2000 Professional systems in configuring their hosts by providing configuration templates and security checklists. NIST researchers analyzed and tested the NSA settings, and substantially extended and refined them. Detailed explanatory material for the template settings, Windows 2000 Professional security configuration, and application-specific security configuration was then developed. Subsequently, NIST led the development of a consensus baseline of Windows 2000 security settings in collaboration with NSA, the Defense Information Systems Agency (DISA), the Center for Internet Security (CIS), and the SANS Institute; Microsoft also provided valuable technical commentary and advice.

NIST Security Configuration Checklists Program for IT Products

The Cyber Security Research and Development Act of 2002 (Public Law 107-305) tasked NIST to “develop, and revise as necessary, a checklist setting forth settings and option selections that minimize the security risks associated with each computer hardware or software system that is, or is likely to become, widely used within the Federal Government.” In 2003, the Common Configuration Working Group Report of the Technical Standards and Common Criteria Task Force recommended government promotion of the use of a NIST central repository for IT security configuration checklists.

In response, NIST, with sponsorship from the Department of Homeland Security (DHS), created the Security Configuration Checklists Program for IT Products. The goals of this program were:

  • To facilitate the development and sharing of security configuration checklists by providing a framework for developers to submit checklists to NIST
  • To assist developers in making checklists that conform to common baseline levels of security
  • To assist developers and users by providing guidelines for making checklists better documented and more usable
  • To provide a managed process for the review, update, and maintenance of checklists
  • To provide an easy-to-use repository of checklists
  • To assist vendors in the process of making their checklists available to users out-of-the-box.

To meet the challenging requirement to produce checklists for the vast spectrum of IT products used in the government, NIST proposed that IT vendors, consortia, industry, other government organizations, and others in the public and private sectors provide additional checklists and associated guidance material to NIST to be made available for display and download from the NIST CSRC website. Based on input from stakeholders, NIST developed SP 800-70, Security Configuration Checklists Program for IT Products: Guidance for Checklists Users and Developers in May 2005. At the same time, the NIST Beta Checklists repository was made available to the public. It contained checklists and metadata for over 100 checklists covering a variety of desktop and server operating systems and applications, as well as security and networking appliances and multi-function peripherals.

SP 800-70 defined several broad and specialized operational environments developers could use to better target their checklists so users could easily find ones that suited their needs. It also defined steps for checklist developers to follow, including how developers should document and submit checklist packages to NIST, and it explained NIST’s checklist approval and publishing process. The publication also specified the responsibilities of checklist developers to maintain and update checklists over time as needed. SP 800-70 also defined the steps checklist users should take, from gathering local requirements and acquiring the appropriate checklists to modifying, testing, and applying checklists to systems.

Windows XP Security Templates and Checklists

In 2005, NIST created SP 800-68 to assist personnel responsible for the administration and security of Windows XP systems. This guide contained information that could be used to secure Windows XP desktops and laptops more effectively in small office/home office (SOHO) and managed enterprise environments.

This guide included security templates to enable system administrators to apply the recommended security-relevant settings rapidly. The security templates modified several key policy areas of a Windows XP system, including password policy, account lockout policy, auditing policy, user rights assignment, system security options, event log policy, system service settings, and file permissions. The templates were based on security templates previously developed by the NSA, DISA, and Microsoft.

The NIST templates and additional settings described in SP 800-68 had been applied to test systems and tested according to detailed functional and security test plans. The functionality of common office productivity tools, web browsers, email clients, personal firewalls, antivirus software, and spyware detection and removal utilities was also tested against the NIST templates and additional settings to identify potential conflicts. By implementing the recommendations described throughout this publication, in addition to applying the NIST Windows XP security templates themselves and the general prescriptive recommendations, organizations would be able to meet a secure common configuration baseline for operating a Windows XP system.

ICAT Metabase and the National Vulnerability Database (NVD)

In 1999, NIST developed the Internet Categorization of Attacks Toolkit (ICAT). The searchable database was modified in 2000 to identify vulnerabilities instead of attacks, and was provided online as the ICAT Metabase. Upon its release, ICAT identified 644 vulnerabilities. ICAT provided users with links to various publicly available vulnerability databases and patch sites, enabling people to find and fix the vulnerabilities in their systems. ICAT allowed fine-grained searches, a feature unavailable at that time from most vulnerability databases, by characterizing each vulnerability using more than 20 attributes. ICAT also indexed information from public advisories, bulletins, and mailing lists. It complemented publicly available vulnerability databases as a search engine with pointers for users to other sites. By 2005, the ICAT Metabase was highly utilized, with approximately 2 million visits per year.

In 2005, NIST launched the National Vulnerability Database (NVD) based on, and replacing, ICAT. NVD is sponsored by DHS and complements DHS’s vulnerability management notes and alerts. NVD also became the Federal Government’s repository of comprehensive, publicly available standards-based vulnerability management data, which together represent the Security Content Automation Protocol (SCAP). Adopting the Common Vulnerabilities and Exposures (CVE) standard for defining unique exploits, it provided CVE with a fine-grained search engine and database. In the 17 years since ICAT became NVD, it has grown to include data visualizations, vendor comments, product dictionaries, and severity metrics.
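As an illustration of the kind of standards-based data NVD serves, the sketch below parses a simplified record loosely shaped like the output of NVD's JSON API; the field names and values are illustrative, not an exact rendering of the API schema.

```python
import json

# A simplified, illustrative record loosely shaped like an entry from the
# NVD JSON API; real responses carry many more fields. The CVE id,
# description, and score here are invented for demonstration.
sample = json.loads("""
{
  "cve": {
    "id": "CVE-2021-0000",
    "descriptions": [{"lang": "en", "value": "Example buffer overflow."}],
    "metrics": {
      "cvssMetricV31": [{"cvssData": {"baseScore": 7.5, "baseSeverity": "HIGH"}}]
    }
  }
}
""")

def summarize(record):
    """Pull the CVE id, English description, and CVSS v3.1 base score."""
    cve = record["cve"]
    desc = next(d["value"] for d in cve["descriptions"] if d["lang"] == "en")
    score = cve["metrics"]["cvssMetricV31"][0]["cvssData"]["baseScore"]
    return cve["id"], desc, score

print(summarize(sample))
```

Because every tool consuming such a record refers to the same CVE identifier and the same scoring standard, results from different scanners and databases can be correlated automatically.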

Today, NVD is the de facto waterfall of information that sustains the worldwide vulnerability management ecosystem, and during periods of increased cyber stress, the NVD overtakes time.gov as NIST’s most visited resource. It is likely the only free-to-access, transparent, vendor-agnostic vulnerability database, and it includes products from vendors around the globe. NVD is used by organizations representing state, local, tribal, and territorial (SLTT) governments, as well as all sectors of critical infrastructure. A partnership with INCIBE (the Spanish National Cybersecurity Institute) also provides Spanish-language translations of vulnerabilities in the NVD.

The NVD maintains the authoritative dictionary for Common Platform Enumerations (CPEs). CPE fulfills a function similar to CVE by providing a standard format for identifying distinct IT products and platforms. NVD analysts use the reference information provided with the CVE and any publicly available information at the time of analysis to associate Reference Tags, Common Vulnerability Scoring System (CVSS) v2.0 and v3.1 vector strings and scores, Common Weakness Enumerations (CWEs), and CPE Applicability statements.
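A CPE 2.3 formatted-string name can be unpacked into its standard attributes. The sketch below is a minimal parser that assumes no escaped colons appear in the name; the full CPE naming specification defines additional escaping rules that this sketch ignores.

```python
# CPE 2.3 "formatted string" names have 13 colon-separated components:
# cpe:2.3:part:vendor:product:version:update:edition:language:sw_edition:
# target_sw:target_hw:other. This minimal parser ignores the escaping
# rules for embedded colons defined in the full CPE specification.
CPE_FIELDS = ["part", "vendor", "product", "version", "update", "edition",
              "language", "sw_edition", "target_sw", "target_hw", "other"]

def parse_cpe23(name: str) -> dict:
    parts = name.split(":")
    if parts[:2] != ["cpe", "2.3"] or len(parts) != 13:
        raise ValueError("not a CPE 2.3 formatted string: " + name)
    return dict(zip(CPE_FIELDS, parts[2:]))

attrs = parse_cpe23("cpe:2.3:a:microsoft:internet_explorer:8.0:*:*:*:*:*:*:*")
print(attrs["vendor"], attrs["product"], attrs["version"])
```

The `part` attribute distinguishes applications (`a`), operating systems (`o`), and hardware (`h`); the `*` wildcard means "any value" for that attribute.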

CVSS is an industry standard first developed by a White House advisory committee, the National Infrastructure Advisory Council (NIAC), and subsequently revised by the security community under the stewardship of the Forum of Incident Response and Security Teams (FIRST). CVSS enabled the security community to calculate the severity of software flaw vulnerabilities through sets of security metrics and equations, helping organizations prioritize the remediation of each vulnerability. Currently, NIST participates in the FIRST Special Interest Group (SIG) that maintains and updates CVSS.
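As a worked example of how CVSS turns metrics into a severity score, the sketch below implements the CVSS v3.1 base-score equations for the common Scope: Unchanged case, using the metric weights and the "Roundup" function from the specification; the Changed-scope variants of the equations are omitted for brevity.

```python
# Metric weights from the CVSS v3.1 specification (base metrics only,
# Scope: Unchanged). AV = Attack Vector, AC = Attack Complexity,
# PR = Privileges Required, UI = User Interaction, CIA = impact weights.
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}
AC = {"L": 0.77, "H": 0.44}
PR = {"N": 0.85, "L": 0.62, "H": 0.27}   # weights for Scope: Unchanged
UI = {"N": 0.85, "R": 0.62}
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}

def roundup(x):
    # "Roundup" per CVSS v3.1 Appendix A: the smallest number, to one
    # decimal place, that is >= x (with a float-safety integer step).
    i = int(round(x * 100000))
    return i / 100000.0 if i % 10000 == 0 else (i // 10000 + 1) / 10.0

def base_score(av, ac, pr, ui, c, i, a):
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    return 0.0 if impact <= 0 else roundup(min(impact + exploitability, 10))

# Vector AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # → 9.8
```

The separation of the impact and exploitability sub-scores is what lets organizations reason about "how bad is it if exploited" and "how easy is it to exploit" independently before combining them.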

At the start of 2022, NVD’s website and API endpoints saw more than 10,000,000 requests a day. Its database contained more than 180,000 unique vulnerability records for over 800,000 products.

In 2021, NVD was selected as the winner of the 2021 Security Innovation Leader Award. AFCEA Bethesda’s 2021 InnovateIT Awards program recognizes individuals or teams who exhibit qualities of successful leadership in security, automation, and modernization.

Security Automation

The NIST Checklist program was officially integrated into NIST’s Federal Information Security Management Act (FISMA) Implementation Project by SP 800-68, which included mappings to the FISMA technical controls. This gave rise to the notion that NIST should continue to provide mappings from lower-level security recommendations to higher-level documents (NIST SP 800-53, DISA Security Technical Implementation Guides (STIGs), NSA guides, etc.) to address the fact that security and compliance were interconnected at the lowest level. NIST, DISA, and NSA soon began a joint security automation and vulnerability management effort to provide more mappings and to create standardized mechanisms for doing this more broadly.

Recognizing that NIST had the responsibility to produce security configuration guidance for the Federal Government, and that NSA and DISA provided the same service to DoD, the agencies decided to consolidate their security data sources and provide the data in a standardized XML format. The data could also be used by commercial-off-the-shelf (COTS) and government-off-the-shelf (GOTS) software products and initiatives to help automate the identification and remediation of vulnerabilities, measure their potential impact, and conduct compliance reporting.

In 2006, NIST, DISA, and NSA held the first Security Automation Conference, with over 300 attendees. Soon the work was moving forward in earnest, with DHS sponsoring what was initially called the Information Security Automation Program (ISAP). The goal of ISAP, which began with interagency participation from NIST, NSA, DISA, and DoD, was to develop the Security Content Automation Protocol (SCAP). Objectives for this work included the following:

  • Document security-related data in a standardized way, such as having a unique identifier for each vulnerability and a common way of expressing the required values for security configuration settings.
  • Share security-related data in a standardized, automated way.
  • Use standard metrics to weigh and aggregate potential vulnerability impact.
  • Assess information systems to identify their vulnerabilities (security-related software flaws and misconfigurations) and report their compliance status.
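The objectives above can be illustrated with a small sketch. The `Finding` type and the aggregation policy here are invented for illustration and are not an SCAP data structure: each result carries a unique standard identifier (a CVE for a flaw, a CCE for a misconfiguration), a standard severity metric, and a compliance status.

```python
from dataclasses import dataclass

# A conceptual sketch (not an SCAP schema) of the kind of standardized,
# machine-readable finding the ISAP objectives describe.
@dataclass
class Finding:
    identifier: str      # e.g. "CVE-2006-0001" or "CCE-2549-9" (illustrative ids)
    kind: str            # "flaw" or "misconfiguration"
    cvss_base: float     # standard severity metric, enabling aggregation
    compliant: bool

def aggregate_risk(findings):
    """Weigh and aggregate potential impact: here, the max and mean CVSS
    score of non-compliant findings (one of many possible policies)."""
    open_scores = [f.cvss_base for f in findings if not f.compliant]
    if not open_scores:
        return 0.0, 0.0
    return max(open_scores), sum(open_scores) / len(open_scores)

report = [
    Finding("CVE-2006-0001", "flaw", 7.5, False),
    Finding("CCE-2549-9", "misconfiguration", 4.0, True),
    Finding("CCE-3171-1", "misconfiguration", 5.0, False),
]
print(aggregate_risk(report))  # highest and average open severity
```

Because the identifiers and metric are standardized, findings produced by different tools can be merged into one report without manual reconciliation, which is precisely what the ISAP objectives sought.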

This is how NIST’s security checklist and template projects, vulnerability metrics work, and security automation efforts became fully intertwined. Checklists could be produced, submitted, distributed, and maintained in standard XML formats. Government, vendors, and private industry could partner to produce original checklists in SCAP format and to translate existing prose checklists into SCAP, while also adding standardized compliance mappings to each checklist. At that time, organizations were concerned about ensuring that operationally deployed products (at least hundreds, if not thousands) were updated with security patches and had secure configurations. There was a great need to automate this laborious, costly, and resource-consuming process.


The NIST Checklist program, in conjunction with SCAP, could reduce the level of effort required to perform vulnerability identification, remediation, and compliance reporting, and allow organizations to refocus valuable personnel resources on other problems. Standards-based security tools could automatically perform configuration checking using these checklists. Supporting this necessitated some changes to NIST’s checklist program, which was renamed the National Checklist Program (NCP). The NCP retained all the goals of the original Security Configuration Checklists Program for IT Products. To those, it added guidance on providing checklist content in a standardized format and on encouraging the use of automation technologies, such as SCAP, for checklist application. It also took into account the Federal Acquisition Regulation (FAR), which by then required federal agencies to use security configuration checklists from the NCP.

"In acquiring information technology, agencies shall include the appropriate IT security policies and requirements, including use of common security configurations available from the NIST website at http://checklists.nist.gov. Agency contracting officers should consult with the requiring official to ensure the appropriate standards are incorporated."

- Part 39 of the Federal Acquisition Regulation (FAR)

Another major change at this time was the push to have federal agencies use standardized security configurations for their Windows desktops and laptops. One of the first such efforts was from OMB Memorandum M-07-11 in March 2007.

In August 2008, OMB mandated the use of SCAP-validated tools to monitor Windows XP and Windows Vista system compliance with the release of the Federal Desktop Core Configuration (FDCC) Major Version 1.0. During 2010, the FDCC work evolved into the United States Government Configuration Baseline (USGCB) initiative, which was a broader effort to create security configuration baselines for IT products widely deployed across federal agencies. NIST provided support for the USGCB automation content, including creating patch updates, and assisting USGCB users in continuously monitoring and assessing security compliance of information systems. Initial work under USGCB involved baselines for Windows 7 and Internet Explorer 8. Subsequent work added baselines for Red Hat Enterprise Linux Desktop.

SCAP Version 1.0

SCAP version 1.0 was a suite of open standards—developed primarily by NSA, the MITRE Corporation, and NIST—that provided technical specifications for expressing and exchanging security-related data. These interoperable standards could be used to identify, enumerate, assign, and facilitate the measurement and sharing of information security-relevant data. SCAP version 1.0, which was formally defined in NIST SP 800-126, The Technical Specification for the Security Content Automation Protocol (SCAP): SCAP Version 1.0, comprised the following standards:

  • Extensible Configuration Checklist Description Format (XCCDF)
  • Open Vulnerability and Assessment Language (OVAL)
  • Common Configuration Enumeration (CCE)
  • Common Platform Enumeration (CPE)
  • Common Vulnerabilities and Exposures (CVE)
  • Common Vulnerability Scoring System (CVSS)

SCAP 1.0 enabled standards-based security tools to automatically perform configuration checking using NCP checklists. Security products and checklist authors assembled content from SCAP data repositories to create SCAP-expressed security guidance. For example, an SCAP-expressed checklist would use XCCDF to describe the checklist, CCE to identify security configuration settings to be addressed or assessed, and CPE to identify platforms for which the checklist is valid. Users of the NCP could browse the checklists based on the checklist tier, IT product, IT product category, or authority, and also search checklist names and summaries for user-specified terms. SCAP 1.0 included software flaw and security configuration standard reference data, which was hosted by the NVD and the MITRE Corporation.
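A heavily simplified sketch of how these standards interlock in an SCAP-expressed checklist follows: the element names loosely follow XCCDF 1.1, the CPE names the target platform, and the rule's setting is labeled with a CCE identifier. The CCE id is a placeholder, and a real benchmark requires far more structure than shown.

```python
import xml.etree.ElementTree as ET

# Illustrative only: a toy XCCDF-style benchmark tying together a CPE
# (which platform the checklist applies to) and a CCE (which security
# configuration setting a rule addresses).
NS = "http://checklists.nist.gov/xccdf/1.1"
ET.register_namespace("", NS)

benchmark = ET.Element(f"{{{NS}}}Benchmark", {"id": "example-benchmark"})
ET.SubElement(benchmark, f"{{{NS}}}platform",
              {"idref": "cpe:/o:microsoft:windows_xp"})
rule = ET.SubElement(benchmark, f"{{{NS}}}Rule", {"id": "rule-min-pw-len"})
ET.SubElement(rule, f"{{{NS}}}title").text = "Minimum password length"
ident = ET.SubElement(rule, f"{{{NS}}}ident",
                      {"system": "http://cce.mitre.org"})
ident.text = "CCE-0000-0"  # placeholder CCE id, not a real assignment

xml_text = ET.tostring(benchmark, encoding="unicode")
print(xml_text)
```

Because every identifier in the document comes from a shared enumeration, any SCAP-validated tool can resolve the same platform and setting without vendor-specific mappings.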

The Evolution of SCAP

The suite of standards within the Security Content Automation Protocol (SCAP) was intended to be expanded over time. SCAP version 1.1, finalized in February 2011, retained the six original standards, updated OVAL to version 5.8, and added the Open Checklist Interactive Language (OCIL) 2.0, for representing checks that collect information from people (such as via questionnaires) or from existing data stores.


In September 2011, NIST finalized SCAP version 1.2, which made major changes to its component standards:

  • Updated XCCDF to version 1.2, OVAL to version 5.10, and CPE to version 2.3
  • Added Asset Identification 1.1 and the Asset Reporting Format (ARF) 1.1 for identifying assets and reporting assessment results about them
  • Added the Common Configuration Scoring System (CCSS) 1.0 for scoring the severity of security misconfigurations
  • Added the Trust Model for Security Automation Data (TMSAD) 1.0 to support digital signatures on SCAP content

SP 800-126 Revision 3 and SP 800-126A, released in February 2018, collectively documented the requirements for SCAP 1.3. SP 800-126A was a new publication that allowed SCAP 1.3 to take advantage of selected minor version updates of SCAP component specifications, as well as designated OVAL platform schema revisions. The SCAP 1.3 revision updated OVAL to version 5.11.1 and adopted CVSS version 3.0.

SCAP Validation Program

The SCAP Validation Program was created at the request of OMB to support conformance testing for the FDCC, ensuring that tools used for the FDCC correctly implemented SCAP as defined in SP 800-126, but it also offered testing for many other SCAP capabilities. Conformance testing was necessary because a single error in product implementation could result in undetected vulnerabilities or policy non-compliance within agency and industry networks.

The program worked with the NIST National Voluntary Laboratory Accreditation Program (NVLAP) to set up independent conformance testing laboratories that conducted the testing based on the corresponding version of NISTIR 7511, Security Content Automation Protocol (SCAP) Version 1.0 Validation Program Test Requirements. When testing was completed, the laboratory submitted a test report to NIST for review and approval. Product validations were active for a set period, at which time vendors could choose to renew their validation by submitting the product for testing. SCAP validation testing was designed to be inexpensive, yet effective. The SCAP conformance tests were either easily human verifiable or automated through NIST-provided reference tools.

The SCAP Validation Program resources web page provides the public with information on SCAP validation and a way to find products that have been awarded validations. The validation records posted on the SCAP Validated Products and Modules page identify the product versions that were tested in the laboratory, along with details such as the tested platforms, SCAP capabilities, the validation test suite version, and the lab that tested the product.

Other Noteworthy Projects

Security Automation and Continuous Monitoring (SACM) Working Group

NIST, in collaboration with industry partners in the Internet Engineering Task Force (IETF), established the Security Automation and Continuous Monitoring (SACM) working group in July 2013. This working group provides a venue for advancing appropriate SCAP specifications into international standards and addressing identified gap areas. The scope of work for SACM includes identifying and/or defining the transport protocols and data formats needed to support the collection and evaluation of device state against expected values and standards for interacting with repositories of security automation content. The initial focus of the SACM working group was on identifying use cases, requirements, and architectural models to inform decisions about existing specifications and standards that can be referenced, required modifications or extensions to existing specifications and standards, and any gaps that need to be addressed.

Software Identification Tag (SWID) Data Model

NIST collaborated with industry partners to revise ISO/IEC 19770-2:2009, a standard specifying how to tag software to support identification and management; the revised standard, ISO/IEC 19770-2:2015, was published in October 2015. This software identification (SWID) data model defined a mechanism for software publishers to provide authoritative identification, categorization, software relationship (e.g., dependency, bundling, and patch), executable and library footprint details, and other metadata for the software they publish. This information could enhance SCAP by providing authoritative information for creating CPE names, targeting checklists, and associating software flaws with products based on a defect in a software library or executable.

To supplement the requirements in the SWID standard, NIST worked with DHS and NSA on the development of NISTIR 8060, Guidelines for the Creation of Interoperable Software Identification (SWID) Tags. This guidance provided an overview of the capabilities and usage of SWID tags as part of a comprehensive software lifecycle. The report introduced SWID tags in an operational context, provided guidelines for the creation of interoperable SWID tags, and highlighted key usage scenarios for which SWID tags are applicable.
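A minimal illustration of the SWID tag shape defined by ISO/IEC 19770-2:2015 follows: a SoftwareIdentity element identifying the product, plus an Entity naming who created the software and the tag. The product name, version, tagId, and regid below are invented for illustration, and a production tag would carry additional metadata.

```python
import xml.etree.ElementTree as ET

# Illustrative minimal SWID tag in the ISO/IEC 19770-2:2015 shape; all
# names and identifiers here are invented for demonstration.
NS = "http://standards.iso.org/iso/19770/-2/2015/schema.xsd"
ET.register_namespace("", NS)

tag = ET.Element(f"{{{NS}}}SoftwareIdentity", {
    "name": "ExampleApp",
    "version": "1.2.3",
    "tagId": "example.com-ExampleApp-1.2.3",  # hypothetical tag id
})
ET.SubElement(tag, f"{{{NS}}}Entity", {
    "name": "Example Corp",
    "regid": "example.com",     # the publisher's registration id (domain)
    "role": "softwareCreator tagCreator",
})

swid_xml = ET.tostring(tag, encoding="unicode")
print(swid_xml)
```

Because the publisher itself emits the tag at install time, inventory tools can read authoritative product identification directly from the endpoint instead of inferring it from file heuristics.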

Trusted Network Connect (TNC) Protocol Specifications

NIST worked with government and industry partners in the Trusted Computing Group (TCG) to define a number of specifications related to the Trusted Network Connect (TNC) protocols. The first such publication was the TNC SCAP Messages for IF-M specification, which supported carrying SCAP content and results over the TNC protocols. The second was the TNC Enterprise Compliance Profile (ECP), which supported the exchange of SWID data over the TNC protocols. The ECP enabled collection of SWID data from a device for use by external tools to provide software inventory information. SCAP and SWID data collected using these mechanisms could be used for network access control decision making, allowing device state to be evaluated when devices connect and on an ongoing basis thereafter.

Created May 27, 2020, Updated March 24, 2022