The min-entropy (in bits) of a random variable X is the largest value m having the property that each observation of X provides at least m bits of information (i.e., the min-entropy of X is the greatest lower bound for the information content of potential observations of X). The min-entropy of a random variable is a lower bound on its entropy. The precise formulation for min-entropy is −(log2 max pi) for a discrete distribution having n possible outputs with probabilities p1,…, pn. Min-entropy is often used as a worst-case measure of the unpredictability of a random variable. Also see [NIST SP 800-90B].
Sources:
NIST SP 800-90A Rev. 1
under Min-entropy
The min-entropy (in bits) of a random variable X is the largest value m having the property that each observation of X provides at least m bits of information (i.e., the min-entropy of X is the greatest lower bound for the information content of potential observations of X). The min-entropy of a random variable is a lower bound on its entropy. The precise formulation for min-entropy is −(log2 max pi) for a discrete distribution having probabilities p1, ..., pk. Min-entropy is often used as a worst-case measure of the unpredictability of a random variable.
Sources:
NIST SP 800-90B
under Min-entropy
The min-entropy (in bits) of a random variable X is the largest value m having the property that each observation of X provides at least m bits of information (i.e., the min-entropy of X is the greatest lower bound for the information content of potential observations of X). The min-entropy of a random variable is a lower bound on its entropy. The precise formulation for min-entropy is −log2(max pi) for a discrete distribution having event probabilities p1, ..., pk. Min-entropy is often used as a worst-case measure of the unpredictability of a random variable.
Sources:
NIST SP 800-133 Rev. 2
under Min-entropy
A lower bound on the entropy of a random variable. The precise formulation for min-entropy is \((-\log_{2} \max p_{i})\) for a discrete distribution having probabilities \(p_{1},...,p_{k}\). Min-entropy is often used as a measure of the unpredictability of a random variable.
Sources:
NIST IR 8427
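The formula shared by the definitions above, −log2(max pi), can be sketched in a few lines of plain Python (standard library only; the function name is illustrative, not from any NIST source):

```python
import math

def min_entropy(probs):
    """Min-entropy in bits: -log2 of the probability of the most likely outcome."""
    if not probs or min(probs) < 0 or abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probs must form a valid probability distribution")
    return -math.log2(max(probs))

# Uniform distribution over 8 outcomes: min-entropy = -log2(1/8) = 3 bits,
# which here equals the Shannon entropy (the uniform case is the maximum).
print(min_entropy([1/8] * 8))

# Biased coin with p(heads) = 0.9: min-entropy = -log2(0.9), about 0.152 bits,
# well below the Shannon entropy of about 0.469 bits -- illustrating why
# min-entropy is the worst-case (lower-bound) measure of unpredictability.
print(min_entropy([0.9, 0.1]))
```

The second example shows the "lower bound on entropy" property: min-entropy depends only on the single most probable outcome, so a heavily biased source scores low even when its average-case (Shannon) entropy is higher.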