
Generalized Submodular Information Measures: Theoretical Properties, Examples, Optimization Algorithms, and Applications


Information-theoretic quantities like entropy and mutual information have found numerous uses in machine learning. It is well known that there is a strong connection between these entropic quantities and submodularity, since entropy over a set of random variables is submodular. In this paper, we study combinatorial information measures that generalize independence, (conditional) entropy, (conditional) mutual information, and total correlation defined over sets of (not necessarily random) variables. These measures strictly generalize the corresponding entropic measures, since they are all parameterized via submodular functions that themselves strictly generalize entropy. Critically, we show that, unlike entropic mutual information in general, the submodular mutual information is actually submodular in one argument, holding the other fixed, for a large class of submodular functions whose third-order partial derivatives satisfy a non-negativity property. This class turns out to include a number of practically useful cases, such as the facility location and set-cover functions. We study specific instantiations of the submodular information measures on these functions, as well as on the probabilistic coverage, graph-cut, log-determinant, and saturated coverage functions, and see that they all have mathematically intuitive and practically useful expressions. Finally, we also study generalized independence between subsets of datapoints (random variables in the entropic case), and connect the independence characterizations to independence in log-submodular distributions. Regarding applications, we connect the maximization of submodular (conditional) mutual information to problems such as mutual-information-based, query-based, and privacy-preserving summarization, and we connect optimizing the multi-set submodular mutual information to clustering and robust partitioning. We perform real-world as well as synthetic experiments on various data summarization tasks.
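The central definition is compact enough to sketch in code. The sketch below is a minimal illustration, not the authors' implementation: it evaluates the submodular mutual information I_f(A; B) = f(A) + f(B) - f(A ∪ B) for a facility-location function and checks it against the closed form Σ_i min(max_{j∈A} s_ij, max_{j∈B} s_ij) that the paper derives for this case. The RBF similarity kernel, the variable names, and the greedy_summary helper for query-based summarization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
X = rng.normal(size=(n, 3))
# Assumption: an RBF similarity kernel; any nonnegative similarity matrix works.
S = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))

def f(A):
    """Facility location: f(A) = sum_{i in V} max_{j in A} S[i, j], with f(empty) = 0."""
    return S[:, sorted(A)].max(axis=1).sum() if A else 0.0

def smi(A, B):
    """Submodular mutual information: I_f(A; B) = f(A) + f(B) - f(A union B)."""
    return f(A) + f(B) - f(A | B)

A, B = {0, 1, 2}, {3, 4}
# Facility-location closed form from the paper:
# I_f(A; B) = sum_i min(max_{j in A} S[i, j], max_{j in B} S[i, j]).
closed_form = np.minimum(S[:, sorted(A)].max(axis=1),
                         S[:, sorted(B)].max(axis=1)).sum()
assert abs(smi(A, B) - closed_form) < 1e-9

def greedy_summary(Q, k):
    """Hypothetical query-based summarization: greedily maximize I_f(A; Q), |A| <= k."""
    A, candidates = set(), set(range(n)) - Q
    for _ in range(k):
        # Marginal SMI gain of each remaining candidate.
        gains = {v: smi(A | {v}, Q) - smi(A, Q) for v in candidates - A}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        A.add(best)
    return A

print(greedy_summary(Q={3, 4}, k=2))
```

The greedy step is justified here because, by the paper's result, I_f(·; Q) is submodular (and monotone) in its first argument for facility location, so greedy selection carries the usual 1 - 1/e approximation guarantee for monotone submodular maximization.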

Keywords: information; mutual information; submodular information; information measures; random variables; independence

Journal Title: IEEE Transactions on Information Theory
Year Published: 2022
