
NIST Risk Management Framework (RMF)

Assessment Cases Overview

The Assessment Cases available for download correspond with NIST Special Publication 800-53, Revision 3. The assessment cases were developed by an interagency working group that has since disbanded. Assessment cases consistent with SP 800-53A Revision 4 or newer will not be developed, but the existing assessment cases may continue to be applied and may also serve as a model for extrapolating assessment cases for controls added or changed in NIST SP 800-53 Revision 4 or newer.

Cautionary Note: The assessment cases developed for this project are not the only acceptable assessment cases; rather, the cases represent one possible set of assessor actions for organizations (and assessors supporting those organizations) to use in helping to determine the effectiveness of the security controls employed within the information systems undergoing assessments.

Download Page for Assessment Cases


BACKGROUND

The initial Assessment Case Project provided assessment cases consistent with NIST SP 800-53A and represented the efforts of an interagency workgroup led by the Department of Justice (DOJ) with representatives from the National Institute of Standards and Technology (NIST), the Department of Energy (DOE), the Department of Transportation (DOT), and the Office of the Director of National Intelligence CIO (ODNI-CIO). NIST updated the initial set of assessment cases from the interagency workgroup to produce the current set, which is consistent with SP 800-53A Revision 1. The purpose of this activity was to recommend the specific actions an assessor might perform in order to obtain the evidence necessary for making the determinations identified in the assessment procedures in NIST Special Publication 800-53A Revision 1. The assessment procedures assist organizations in determining the effectiveness of the security controls defined in NIST Special Publication 800-53 Revision 3.

The purpose of the project is fourfold:

  1. to actively engage experienced assessors from multiple organizations in the development of a representative set of assessment cases corresponding to the assessment procedures in Special Publication 800-53A Revision 1;
  2. to provide organizations and the assessors supporting those organizations with an exemplary set of assessment cases for each assessment procedure in the catalog of procedures in this publication;
  3. to provide a vehicle for ongoing community-wide review of the assessment cases to promote continuous improvement in the assessment process for more consistent, cost-effective security assessments of federal information systems; and
  4. to serve as a basis for reciprocity among various communities of interest.

ASSESSMENT CASE OVERVIEW

An assessment case represents a worked example of an assessment procedure that provides specific actions that an assessor might carry out during the assessment of a security control or control enhancement in an information system. The assessment cases are intended to represent a starting point for expanding upon the SP 800-53A assessment procedures.

The assessment case supplements the information from SP 800-53A by adding two sections, “Potential Assessment Sequencing” and “Potential Assessor Evidence Gathering Actions”; these sections are described below. In addition, “Notes to the Assessor” are sometimes included to help the assessor better understand the intent of the control or to assess the control more efficiently or effectively.

Potential Assessment Sequencing

The purpose of this section is to facilitate more efficient, and hence more cost-effective, assessment by identifying other control assessments whose sequencing should be considered relative to the assessment of this control. Specifically, this section provides guidance on the following:

Precursor Controls:

Controls that should be assessed prior to assessing this control; that is, controls whose assessment will likely produce information that is either required or helpful in making the determinations of this assessment.

Concurrent Controls:

Controls whose assessments involve applying the same method to the same object (or objects) as this assessment, offering the potential for cost savings by accomplishing the information gathering for multiple assessment cases in one application of the assessment method to a set of objects.

Successor Controls:

Controls that should be assessed after assessing this control; that is, controls whose assessment will likely require, or benefit from, the information produced by this assessment.
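The precursor/successor relationships above define a dependency ordering over control assessments. As an illustrative sketch (the control identifiers and dependencies below are hypothetical examples, not taken from any assessment case), a topological sort can schedule assessments so that every precursor control is assessed first:

```python
from graphlib import TopologicalSorter

# Hypothetical precursor relationships: each control maps to the set of
# controls whose assessment results it depends on (its precursor controls).
precursors = {
    "AC-1": set(),
    "AC-2": {"AC-1"},          # account management builds on the access policy
    "AC-6": {"AC-2"},          # least privilege uses account information
    "AU-2": set(),
    "AU-3": set(),
    "AU-6": {"AU-2", "AU-3"},  # audit review depends on audited events/content
}

# static_order() yields controls so that every control appears only after
# all of its precursor controls.
order = list(TopologicalSorter(precursors).static_order())
print(order)
```

Concurrent controls would not appear in this graph at all; they are instead candidates for batching into a single application of an assessment method.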

Potential Assessor Evidence Gathering Actions

The purpose of this section is to provide the set of assessment methods (examine, interview, or test) and associated objects that will cost-effectively enable making the required determinations. A series of ‘Assessor Action Steps’ is identified for each determination to be made. Each action step entry consists of an action step identifier/number and an associated evidence gathering statement expressing the assessment action to be performed. Each action step is the application of an identified assessment method to an identified set of objects, and includes both an indication of the depth (i.e., intended rigor and level of detail) and coverage (i.e., intended scope or breadth) to be applied, and the specific information to be obtained from that action step. The action step evidence gathering statement draws upon the guidelines defined within the assessment method descriptions in SP 800-53A, Appendix D for defining assessor actions.

Each assessment case specification includes one or more potential action steps. A minimum of one action step is provided for each determination statement (or subpart) in an assessment objective from SP 800-53A. The action statement is written so that an assessment action can be adapted for assessing security controls at varying levels of depth and coverage, enabling assessors to satisfy the organization-defined confidence and assurance desired for the specific assessment. To provide this flexibility, assessor-defined parameters are identified within the action statement for selecting an appropriate depth (i.e., rigor and level of detail) and coverage (i.e., scope or breadth) of application of the assessment method for assessing the security control. Depth and coverage parameters are bounded in the action statements by brackets (e.g., “[….]”) and contain the depth and coverage attribute value, which is noted in italics within the parameter brackets.

The action statements provided in the assessment cases are written using the basic (i.e., foundation) level of assessment depth and coverage attribute values. Increased rigor and/or scope can be expressed by replacing the basic-level attribute value with one of the other defined values.
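Because the attribute values sit inside brackets, raising the rigor of an action statement is a mechanical substitution. The sketch below illustrates this with a hypothetical action statement (the statement text and the `retarget` helper are illustrative, not from the assessment cases); the attribute value names themselves come from SP 800-53A:

```python
import re

# Hypothetical action statement written at the basic level, with the depth
# and coverage attribute values enclosed in brackets as in the assessment cases.
action = ("Examine: [review] the access control policy and a "
          "[basic sample] of account records for required content.")

# Replacements that raise the rigor from basic to focused
# (review -> study, basic sample -> focused sample).
substitutions = {"review": "study", "basic sample": "focused sample"}

def retarget(statement: str, subs: dict) -> str:
    # Replace each bracketed attribute value with its higher-rigor counterpart,
    # leaving any value not listed in `subs` unchanged.
    return re.sub(r"\[([^\]]+)\]",
                  lambda m: f"[{subs.get(m.group(1), m.group(1))}]",
                  statement)

print(retarget(action, substitutions))
# -> Examine: [study] the access control policy and a
#    [focused sample] of account records for required content.
```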

The potential attribute values for varying depth and coverage for the assessment methods (Examine, Interview and Test) in the action statements are as follows:

Examine, Interview and Test Coverage Attribute Values:

Basic Sample attribute value is used to indicate a ‘basic’ level of scope or breadth of coverage; that is, a representative sample of assessment objects (by type and number within type) to provide a level of coverage necessary for determining if the control meets the ‘basic’ criteria defined in SP 800-53A Appendix D.

Focused Sample attribute value is available for use to indicate a ‘focused’ level of scope or breadth of coverage; that is, an extended basic sample that includes other specific assessment objects important to achieving the assessment objective, providing a level of coverage necessary for determining if the control meets the ‘focused’ coverage criteria defined in SP 800-53A Appendix D.

Sufficiently Large Sample attribute value is available for use to indicate a ‘comprehensive’ level of scope or breadth of coverage; that is, an extended focused sample that includes more assessment objects, providing a level of coverage necessary for determining if the control meets the ‘comprehensive’ coverage criteria defined in SP 800-53A Appendix D.

Examine, Interview and Test Depth Attribute Values:

Specific action verbs identified in SP 800-53A, Appendix D, in the definition of the examine method are employed in the application of the Action Steps of the Assessment Cases to indicate level of rigor for examining the different types of assessment objects (i.e., documentation, activities and mechanisms) as follows:

Examine documentation rigor – ‘reading’:

Review attribute value for reading documentation is used for the ‘basic’ level of rigor and level of detail; that is, a high-level examination of documentation looking for required content and for any obvious errors, omissions, or inconsistencies.

Study attribute value for reading documentation is available for use for the ‘focused’ level of rigor and level of detail; that is, an examination of documentation that includes the intent of ‘review’ and adds a more in-depth examination for greater evidence to support a determination of whether the document has the required content and is free of obvious errors, omissions, and inconsistencies.

Analyze attribute value for reading documentation is available for use for the ‘comprehensive’ level of rigor and level of detail; that is, an examination of documentation that includes the intent of both ‘review’ and ‘study’; adding a thorough and detailed analysis for significant grounds for confidence in the determination of whether the required content is present and the document is correct, complete, and consistent.

Examine activities and mechanisms rigor – ‘watching’:

Observe attribute value for watching activities and mechanisms is used for the ‘basic’ level of rigor and level of detail; that is, watching the execution of an activity or process or looking directly at a mechanism (as opposed to reading documentation produced by someone other than the assessor about that mechanism) for the purpose of seeing whether the activity or mechanism appears to operate as intended (or in the case of a mechanism, perhaps is configured as intended) and whether there are any obvious errors, omissions, or inconsistencies in the operation or configuration.

Inspect attribute value for watching activities and mechanisms is available for use for the ‘focused’ level of rigor and level of detail; that is, adding to the watching associated with ‘observe’ an active investigation to gain further grounds for confidence in the determination of whether the activity or mechanism is operating as intended and is free of errors, omissions, or inconsistencies in the operation or configuration.

Analyze attribute value for watching activities and mechanisms is available for use for the ‘comprehensive’ level of rigor and level of detail; that is, adding to the watching and investigation of ‘observe’ and ‘inspect’ a thorough and detailed analysis of the information to develop significant grounds for confidence in the determination as to whether the activity or mechanism is operating as intended and is free of errors, omissions, or inconsistencies in the operation or configuration. Analysis achieves this by both leading to further observations and inspections and by a greater understanding of the information obtained from the examination.

Interview individual or group rigor:

Basic attribute value for interviewing individuals and groups is used for the ‘basic’ level of rigor and level of detail; that is, a high-level interview looking for evidence to support a determination of whether the control meets the ‘basic’ interview criteria defined in SP 800-53A Appendix D.

Focused attribute value for interviewing individuals and groups is available for use for the ‘focused’ level of rigor and level of detail; that is, an interview that includes the intent of ‘basic’ and adds a more in-depth interview for greater evidence to support a determination of whether the control meets the ‘focused’ interview criteria defined in SP 800-53A Appendix D.

Comprehensive attribute value for interviewing individuals and groups is available for use for the ‘comprehensive’ level of rigor and level of detail; that is, an interview that includes the intent of both ‘basic’ and ‘focused’; adding a thorough and detailed analysis for significant grounds for confidence in the determination of whether the control meets the ‘comprehensive’ interview criteria defined in SP 800-53A Appendix D.

Test Mechanisms and Activities rigor:

Basic attribute value for mechanisms and activities is used for the ‘basic’ level of rigor and level of detail; that is, a basic level of testing looking for evidence to support a determination of whether the control meets the ‘basic’ test criteria defined in SP 800-53A Appendix D.

Focused attribute value for mechanisms and activities is available for use for the ‘focused’ level of rigor and level of detail; that is, a focused level of testing that includes the intent of ‘basic’ and adds a more in-depth testing for greater evidence to support a determination of whether the control meets the ‘focused’ test criteria defined in SP 800-53A Appendix D.

Comprehensive attribute value for mechanisms and activities is available for use for the ‘comprehensive’ level of rigor and level of detail; that is, a comprehensive level of testing that includes the intent of both ‘basic’ and ‘focused’; adding a thorough and detailed analysis for significant grounds for confidence in the determination of whether the control meets the ‘comprehensive’ test criteria defined in SP 800-53A Appendix D.

The depth and coverage attributes do not alter the logical sequencing, totality, or selection of evidence gathering actions; rather, these attributes provide varying degrees of assessment rigor.
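The attribute values enumerated above can be summarized as a lookup keyed by assessment method, object type, and rigor level. The sketch below is one possible encoding (the dictionary keys are illustrative; the attribute value names are taken directly from the text):

```python
# Depth and coverage attribute values for the assessment methods, as defined
# in the preceding sections (per SP 800-53A Appendix D terminology).
ATTRIBUTE_VALUES = {
    # Coverage values are shared by the Examine, Interview, and Test methods.
    "coverage": {
        "basic": "basic sample",
        "focused": "focused sample",
        "comprehensive": "sufficiently large sample",
    },
    # Depth values vary by method and by the type of assessment object.
    "depth": {
        "examine_documentation": {           # 'reading'
            "basic": "review",
            "focused": "study",
            "comprehensive": "analyze",
        },
        "examine_activities_mechanisms": {   # 'watching'
            "basic": "observe",
            "focused": "inspect",
            "comprehensive": "analyze",
        },
        "interview": {
            "basic": "basic",
            "focused": "focused",
            "comprehensive": "comprehensive",
        },
        "test": {
            "basic": "basic",
            "focused": "focused",
            "comprehensive": "comprehensive",
        },
    },
}

# Every method defines exactly one attribute value per rigor level.
for method, levels in ATTRIBUTE_VALUES["depth"].items():
    assert set(levels) == {"basic", "focused", "comprehensive"}
```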

ASSESSMENT CASE TAILORING AND SUPPLEMENTING

Special Publication 800-53A allows organizations to tailor and supplement the basic assessment procedures provided. Tailoring and supplementing are expected to be applied to the assessment cases in a manner similar to that described in Special Publication 800-53A. Tailoring involves scoping the assessment procedures or assessment cases to more closely match the characteristics of the information system and its environment of operation. For example, detailed test scripts may need to be developed for the specific operating system, network component, middleware, or application employed within the information system to adequately assess certain characteristics of a particular security control. Such test scripts are at a lower level of detail than provided by the assessment cases.

Supplementing involves adding assessment procedures or assessment cases to adequately meet the risk management needs of the organization. Supplementation decisions are left to the discretion of the organization in order to maximize flexibility in developing security assessment plans when applying the results of risk assessments in determining the extent, rigor, and level of intensity of the assessments.

While flexibility continues to be an important factor in developing security assessment plans, consistency of assessments is also an important consideration. A major design objective is to provide an assessment framework and initial starting point for assessment that are essential for achieving such consistency.

Organizations are not expected to employ all of the assessment methods and assessment objects contained within the assessment procedures identified for the associated security controls deployed within or inherited by organizational information systems. Rather, organizations have the inherent flexibility to determine the level of effort needed for a particular assessment (e.g., which assessment methods and assessment objects are deemed to be the most useful in obtaining the desired results). This determination is made on the basis of what will accomplish the assessment objectives in the most cost-effective manner and with sufficient confidence to support the subsequent determination of the resulting mission or business risk.



Created November 30, 2016, Updated April 10, 2024