In 1994, Jim Massey proposed the guessing entropy as a measure of the difficulty an attacker faces in guessing a secret used in a cryptographic system, and established a well-known inequality between entropy and guessing entropy. More than fifteen years earlier, in an unpublished work, he had also established a well-known inequality for the entropy of an integer-valued random variable of given variance. In this paper, we establish a link between these two works of Massey in the more general framework of the relationship between discrete (absolute) entropy and continuous (differential) entropy. Two approaches are given in which the discrete entropy (or Rényi entropy) of an integer-valued variable is upper bounded using the differential (Rényi) entropy of a suitably chosen continuous random variable. As an application, lower bounds on guessing entropy and guessing moments are derived in terms of entropy or Rényi entropy (without side information) and of conditional entropy or Arimoto conditional entropy (when side information is available).
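For context, the two inequalities alluded to in the abstract are not stated there; they are recalled below as commonly cited in the literature, so the precise constants should be checked against the paper itself. Massey's 1994 lower bound on the guessing entropy $G(X)$ (the expected number of guesses of an optimal guesser) in terms of the Shannon entropy $H(X)$ in bits reads
\[
G(X) \;\ge\; \tfrac{1}{4}\,2^{H(X)} + 1 \qquad \text{whenever } H(X)\ge 2 \text{ bits},
\]
while his earlier, unpublished bound on the entropy of an integer-valued random variable $X$ with variance $\sigma^2$ is
\[
H(X) \;\le\; \tfrac{1}{2}\log_2\!\bigl(2\pi e\,(\sigma^2 + \tfrac{1}{12})\bigr).
\]
The second bound already illustrates the discrete-continuous link studied in the paper: if $U$ is uniform on $(-\tfrac12,\tfrac12)$ and independent of $X$, then the differential entropy satisfies $h(X+U)=H(X)$ and $\operatorname{Var}(X+U)=\sigma^2+\tfrac{1}{12}$, so the bound follows from the Gaussian maximum-entropy inequality applied to the continuous surrogate $X+U$.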
               