We prove stability estimates for the Shannon–Stam inequality (also known as the entropy-power inequality) for log-concave random vectors in terms of entropy and transportation distance. In particular, we give the first stability estimate for general log-concave random vectors in the following form: for log-concave random vectors $$X, Y \in {\mathbb {R}}^d$$, the deficit in the Shannon–Stam inequality is bounded from below by $$C \left( \mathrm {D}\left( X||G\right) + \mathrm {D}\left( Y||G\right) \right),$$ where $$\mathrm {D}\left( \cdot \,||G\right)$$ denotes the relative entropy with respect to the standard Gaussian and the constant $$C$$ depends only on the covariance structures and the spectral gaps of $$X$$ and $$Y$$. In the case of uniformly log-concave vectors, our analysis gives dimension-free bounds. Our proofs are based on a new approach which uses an entropy-minimizing process from stochastic control theory.
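For context, here is a minimal sketch of the quantities involved, assuming the standard normalization of the entropy power and a symmetric form of the deficit; the paper's exact normalization may differ. Writing $$N(X) = \frac{1}{2\pi e}\exp\left(\frac{2h(X)}{d}\right)$$ for the entropy power of $$X$$ (with $$h$$ the differential entropy), the Shannon–Stam inequality states that $$N(X+Y) \ge N(X) + N(Y)$$ for independent $$X$$ and $$Y$$. In its equivalent entropic form, one common choice of deficit is $$\begin{aligned} \delta(X,Y) = h\left(\tfrac{X+Y}{\sqrt{2}}\right) - \tfrac{1}{2}h(X) - \tfrac{1}{2}h(Y) \ge 0, \end{aligned}$$ and a stability estimate of the type stated above reads $$\delta(X,Y) \ge C \left( \mathrm {D}\left( X||G\right) + \mathrm {D}\left( Y||G\right) \right).$$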
               