The automotive industry faces significant challenges from increased cybersecurity risk and the adoption of AI, as well as opportunities from rapid technological innovation. NIST is setting up this community of interest (COI) so that industry, academia, and government can discuss, comment on, and provide input on potential NIST work that will affect the automotive industry. Topics of interest include, but are not limited to: cryptography, cryptographic agility, migration to...
*NEW* Short course from the Defense and Aerospace Test and Analysis Workshop 2025 (Dataworks 2025); the complete course presentation is available here. The goal of this project is to provide practitioners and researchers with a foundational understanding of combinatorial testing techniques and their application to testing AI-enabled software systems (AIES). Resources are being developed in these areas: combinatorial testing (CT), applying CT to test traditional software systems, including real-world...
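As a rough illustration of the combinatorial testing idea, the sketch below greedily builds a 2-way (pairwise) covering set over a small, hypothetical configuration space for an AI-enabled component. The parameter names and values are invented for this example, and dedicated tools (such as NIST's ACTS) use far more efficient covering-array algorithms; this is only a minimal sketch of the concept.

from itertools import combinations, product

# Hypothetical configuration space for an AI-enabled component (illustrative only).
parameters = {
    "model":     ["small", "large"],
    "precision": ["fp16", "fp32"],
    "backend":   ["cpu", "gpu"],
}
names = list(parameters)

# Every (parameter, value) pairing that must co-occur in at least one
# test case for full 2-way coverage.
uncovered = {
    ((p1, v1), (p2, v2))
    for p1, p2 in combinations(names, 2)
    for v1 in parameters[p1]
    for v2 in parameters[p2]
}

def pairs_of(test):
    # All 2-way interactions exercised by one complete test case.
    return {((p1, test[p1]), (p2, test[p2])) for p1, p2 in combinations(names, 2)}

tests = []
while uncovered:
    # Greedily pick the full assignment that covers the most remaining pairs.
    candidates = (dict(zip(names, vals)) for vals in product(*parameters.values()))
    best = max(candidates, key=lambda t: len(pairs_of(t) & uncovered))
    tests.append(best)
    uncovered -= pairs_of(best)

for i, t in enumerate(tests, 1):
    print(i, t)

Running this prints a handful of configurations that together exercise every pairwise interaction, compared with the eight configurations exhaustive testing of this space would require; the same greedy idea extends to 3-way and higher-strength coverage.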
NIST has finalized SP 800-218A, Secure Software Development Practices for Generative AI and Dual-Use Foundation Models: An SSDF Community Profile. This publication augments SP 800-218 by adding practices, tasks, recommendations, considerations, notes, and informative references that are specific to AI model development throughout the software development life cycle. NIST has recently added a Community Profiles section to this page. It will contain links to SSDF Community Profiles developed by...
Recent Updates
August 14, 2025: The NIST SP 800-53 Control Overlays for Securing AI Systems Concept Paper is available for comment. We welcome stakeholders to join the NIST Overlays Securing AI Systems Slack Collaboration to engage in facilitated discussions with the NIST principal investigators and other subgroup members, share ideas, provide real-time feedback, and contribute to overlay development. Feedback about the concept paper and questions about the development of the...