Published in "Entropy" in 2018
DOI: 10.3390/e20060436
Abstract: In this paper, we present a review of recent developments in κ-deformed statistical mechanics within the framework of information geometry. Three different geometric structures are introduced in the κ-formalism, which are obtained starting…
Keywords: geometry exponential; information geometry; dually flat; geometry; …
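For readers new to the κ-formalism, the deformed exponential and logarithm at its core follow Kaniadakis's standard definitions; the statement below is taken from the general literature, not from the truncated abstract:

```latex
% Kaniadakis kappa-exponential and kappa-logarithm, 0 < |kappa| < 1;
% both reduce to the ordinary exp and log in the limit kappa -> 0.
\exp_\kappa(x) = \left( \kappa x + \sqrt{1 + \kappa^2 x^2} \right)^{1/\kappa},
\qquad
\ln_\kappa(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa}.
```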
Published in "Entropy" in 2021
DOI: 10.3390/e23060726
Abstract: Within exponential families, which may consist of multi-parameter and multivariate distributions, a variety of divergence measures, such as the Kullback–Leibler divergence, the Cressie–Read divergence, the Rényi divergence, and the Hellinger metric, can be explicitly expressed…
Keywords: representations divergence; measures related; divergence; related quantities; …
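As a concrete instance of the kind of explicit expression meant here, a minimal Python sketch of the Kullback–Leibler divergence within the univariate Gaussian family (the numeric parameters are illustrative, not from the paper):

```python
import numpy as np

def kl_gaussian(mu1, s1, mu2, s2):
    """Closed-form KL(N(mu1, s1^2) || N(mu2, s2^2)).

    Gaussians form an exponential family, so the KL divergence is
    available as an explicit formula rather than an intractable integral.
    """
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5

# Example with arbitrary illustrative parameters.
print(kl_gaussian(0.0, 1.0, 2.0, 1.5))  # ~1.02
```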
Published in "Entropy" in 2022
DOI: 10.3390/e24030421
Abstract: By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence. Inspired by this formula, we define…
Keywords: duo; truncated exponential; duo Bregman; divergence; …
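For reference, the ordinary Fenchel–Young divergence that the paper generalizes can be stated from standard convex duality (my paraphrase of textbook material, not the paper's own notation):

```latex
% For a convex generator F with convex conjugate
%   F^*(\eta) = \sup_{\theta} \{ \langle \theta, \eta \rangle - F(\theta) \},
% the Fenchel--Young divergence mixes a primal and a dual argument:
Y_F(\theta, \eta') = F(\theta) + F^*(\eta') - \langle \theta, \eta' \rangle \ \geq\ 0,
% and recovers the Bregman divergence when \eta' = \nabla F(\theta'):
Y_F\big(\theta, \nabla F(\theta')\big)
  = F(\theta) - F(\theta') - \langle \theta - \theta', \nabla F(\theta') \rangle
  = B_F(\theta : \theta').
```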
Published in "Entropy" in 2022
DOI: 10.3390/e24050698
Abstract: In this paper, we introduce a class of statistical models consisting of exponential families that depend on additional parameters, called external parameters. The main source of these statistical models is the Maximum Entropy framework, where…
Keywords: external parameters; geometry; exponential families; parameters exponential; …
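As background for the Maximum Entropy connection, the classical result reads as follows (a generic statement; how the external parameters enter is the paper's addition and is not recoverable from the truncated abstract):

```latex
% Maximizing the entropy H(p) = -\int p \log p \, dx subject to the
% moment constraints E_p[T_i(X)] = t_i, i = 1, \dots, n, yields the
% exponential family
p_\theta(x) = \exp\!\Big( \textstyle\sum_{i=1}^{n} \theta_i T_i(x) - F(\theta) \Big),
\qquad
F(\theta) = \log \int \exp\!\Big( \textstyle\sum_{i=1}^{n} \theta_i T_i(x) \Big) dx,
% with the natural parameters \theta_i arising as Lagrange multipliers.
```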
Published in "Entropy" in 2022
DOI: 10.3390/e24101400
Abstract: The Chernoff information between two probability measures is a statistical divergence that measures their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical…
Keywords: information; likelihood ratio; exponential families; Chernoff information; …
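The definition in the abstract can be made concrete with a short numerical sketch, assuming two illustrative univariate Gaussians (parameters chosen for illustration, not taken from the paper): the Chernoff information is the α-skewed Bhattacharyya distance maximized over the skew α ∈ (0, 1).

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Two illustrative Gaussians (hypothetical parameters, not from the paper).
p = norm(loc=0.0, scale=1.0)
q = norm(loc=2.0, scale=1.5)

def skewed_bhattacharyya(alpha):
    """alpha-skewed Bhattacharyya distance: -log of the integral of
    p(x)^alpha * q(x)^(1 - alpha) over the real line."""
    integrand = lambda x: p.pdf(x) ** alpha * q.pdf(x) ** (1.0 - alpha)
    coeff, _ = quad(integrand, -np.inf, np.inf)
    return -np.log(coeff)

# Chernoff information = maximum of the skewed distance over alpha in (0, 1).
res = minimize_scalar(lambda a: -skewed_bhattacharyya(a),
                      bounds=(1e-3, 1.0 - 1e-3), method="bounded")
print(f"optimal skew alpha* ~ {res.x:.3f}")
print(f"Chernoff information ~ {skewed_bhattacharyya(res.x):.4f}")
```

For two members of a common exponential family, the integral itself admits a closed form in terms of the cumulant function, which is what makes exact characterizations of the optimal skew possible.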