We consider the problem of quantifying the information shared by a pair of random variables X_1, X_2 about another variable S. We propose a new measure of shared information, called extractable shared information, that is left monotonic; that is, the information shared about S is bounded from below by the information shared about f(S) for any function f. We show that our measure leads to a new nonnegative decomposition of the mutual information I(S; X_1 X_2) into shared, complementary, and unique components. We study properties of this decomposition and show that a left monotonic shared information is not compatible with a Blackwell interpretation of unique information. We also discuss whether it is possible to have a decomposition in which both shared and unique information are left monotonic.
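Left monotonicity can be illustrated with plain mutual information, which satisfies the analogous inequality I(f(S); X) ≤ I(S; X) by the data processing inequality: coarse-graining S can only reduce the information any variable carries about it. The sketch below is illustrative only and does not implement the extractable shared information measure; it assumes discrete joint distributions represented as dictionaries, with all function names chosen here for the example.

```python
from collections import defaultdict
from math import log2

def mutual_information(joint):
    """I(S;X) in bits from a dict {(s, x): probability}."""
    ps = defaultdict(float)
    px = defaultdict(float)
    for (s, x), p in joint.items():
        ps[s] += p
        px[x] += p
    return sum(p * log2(p / (ps[s] * px[x]))
               for (s, x), p in joint.items() if p > 0)

def coarse_grain(joint, f):
    """Joint distribution of (f(S), X) obtained by applying f to S."""
    out = defaultdict(float)
    for (s, x), p in joint.items():
        out[(f(s), x)] += p
    return dict(out)

# Example: S uniform on {0,1,2,3}, and X reveals S exactly.
joint = {(s, s): 0.25 for s in range(4)}
f = lambda s: s % 2  # a coarse-graining of S

hi = mutual_information(joint)                    # I(S;X)    = 2 bits
lo = mutual_information(coarse_grain(joint, f))   # I(f(S);X) = 1 bit
assert lo <= hi  # left monotonicity of mutual information
```

A left monotonic shared information measure imposes this same ordering on the shared component of the decomposition, which the abstract shows is in tension with a Blackwell-based notion of unique information.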