Computer Security Resource Center



Identifying Uniformity with Entropy and Divergence

Date Published: February 2017
Comments Due: March 9, 2017 (public comment period is CLOSED)

Planning Note (11/8/2017):

This Draft NISTIR will be superseded by a NIST Journal of Research article that is currently in development.


Dmitry Cousin (NIST)


NIST invites comments on Draft NISTIR 8139, Identifying Uniformity with Entropy and Divergence

Entropy models are frequently used in tests that assess either the randomness or the uniformity of formal and/or observed probability distributions. NIST Special Publications SP 800-22 and SP 800-90 (A, B, and C) discuss tests and methods that leverage both Shannon entropy and min-entropy. Shannon entropy and min-entropy are two particular cases of Rényi entropy, a more general one-parameter entropy model. Rényi entropy unifies the Hartley, Shannon, collision, and min-entropies, and belongs to the class of one-parameter entropy models that also includes the entropies named after Havrda-Charvát-Daróczy, Tsallis, Abe, and Kaniadakis. Rényi entropy, along with the other members of the one-parameter class, can in turn be viewed as a special case of Sharma-Mittal entropy, a bi-parametric generalized entropy model. This NISTIR focuses on using Rényi and Tsallis entropy and divergence models to analyze similarities and differences between probability distributions of interest. The report introduces extensions of the uniformity identification and measurement techniques proposed in NIST SP 800-22 and SP 800-90 (A, B, and C).
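To make the relationships above concrete, here is a minimal Python sketch (not taken from the report) of the two one-parameter entropy families it discusses. The Rényi entropy of order α is (1/(1−α))·log₂ Σᵢ pᵢ^α, recovering Hartley entropy at α=0, Shannon entropy in the limit α→1, collision entropy at α=2, and min-entropy as α→∞; the Tsallis entropy of order q is (1 − Σᵢ pᵢ^q)/(q−1). Function names and test distributions are illustrative choices, not the report's notation:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits.

    alpha = 0   -> Hartley entropy (log2 of the support size)
    alpha -> 1  -> Shannon entropy (handled as the limiting case)
    alpha = 2   -> collision entropy
    alpha = inf -> min-entropy, -log2(max p)
    """
    support = [x for x in p if x > 0]
    if alpha == 1.0:                      # Shannon limit
        return -sum(x * math.log2(x) for x in support)
    if math.isinf(alpha):                 # min-entropy limit
        return -math.log2(max(support))
    return math.log2(sum(x ** alpha for x in support)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q; q -> 1 recovers Shannon entropy (in nats)."""
    support = [x for x in p if x > 0]
    if q == 1.0:
        return -sum(x * math.log(x) for x in support)
    return (1.0 - sum(x ** q for x in support)) / (q - 1.0)

uniform = [0.25] * 4          # uniform distribution on 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1]

# For the uniform distribution every Rényi order gives log2(4) = 2 bits,
# which is why comparing orders can help identify (non-)uniformity:
print(renyi_entropy(uniform, 0), renyi_entropy(uniform, 2),
      renyi_entropy(uniform, math.inf))
# For a skewed distribution the entropy decreases as the order grows:
print(renyi_entropy(skewed, 1.0), renyi_entropy(skewed, 2),
      renyi_entropy(skewed, math.inf))
```

For a uniform distribution the Rényi entropy is the same for every order, while for a non-uniform distribution it strictly decreases as the order increases; that gap is one simple signal the uniformity-identification techniques in the report build upon.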



Keywords

Rényi entropy; Tsallis entropy; Sharma-Mittal entropy; randomness; uniformity identification
Control Families

Planning; Risk Assessment