The min-entropy (in bits) of a random variable X is the largest value m having the property that each observation of X provides at least m bits of information (i.e., the min-entropy of X is the greatest lower bound for the information content of potential observations of X). The min-entropy of a random variable is a lower bound on its entropy. The precise formulation for min-entropy is −(log2 max pi) for a discrete distribution having n possible outputs with probabilities p1, …, pn. Min-entropy is often used as a worst-case measure of the unpredictability of a random variable. Also see [NIST SP 800-90B].
Source(s):
NIST SP 800-90A Rev. 1
under Min-entropy
The min-entropy (in bits) of a random variable X is the largest value m having the property that each observation of X provides at least m bits of information (i.e., the min-entropy of X is the greatest lower bound for the information content of potential observations of X). The min-entropy of a random variable is a lower bound on its entropy. The precise formulation for min-entropy is −(log2 max pi) for a discrete distribution having probabilities p1, …, pk. Min-entropy is often used as a worst-case measure of the unpredictability of a random variable.
Source(s):
NIST SP 800-90B
under Min-entropy
The min-entropy (in bits) of a random variable X is the largest value m having the property that each observation of X provides at least m bits of information (i.e., the min-entropy of X is the greatest lower bound for the information content of potential observations of X). The min-entropy of a random variable is a lower bound on its entropy. The precise formulation for min-entropy is - log2 (max pi) for a discrete distribution having event probabilities p1, ..., pk. Min-entropy is often used as a worst-case measure of the unpredictability of a random variable.
Source(s):
NIST SP 800-133 Rev. 2
under Min-entropy
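The formula shared by the definitions above, −log2(max pi), can be sketched as a short Python function. This is an illustrative sketch, not code from any of the cited publications; the function name and input validation are assumptions.

```python
import math

def min_entropy(probs):
    """Min-entropy, in bits, of a discrete distribution: -log2(max p_i).

    It depends only on the most likely outcome, which is why it is a
    worst-case measure and a lower bound on the Shannon entropy.
    """
    # Illustrative validation: require a proper probability distribution.
    if not probs or min(probs) < 0 or abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probs must be a valid probability distribution")
    return -math.log2(max(probs))

# A uniform distribution over 4 outcomes has min-entropy log2(4) = 2 bits.
print(min_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A biased distribution: min-entropy is 1 bit, while its Shannon
# entropy (1.5 bits) is strictly larger, illustrating the lower bound.
print(min_entropy([0.5, 0.25, 0.25]))  # 1.0
```

For a uniform distribution over k outcomes, min-entropy and Shannon entropy coincide at log2(k); any bias pulls the min-entropy below that value.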