Bayesian inference on the covariance matrix is usually performed after placing an inverse-Wishart or a multivariate Jeffreys prior density, but both, for different reasons, present drawbacks. As an alternative, the covariance matrix can be modelled by separating out the standard deviations and the correlations. This separation strategy takes advantage of the fact that it is usually more straightforward and flexible to set priors on the standard deviations and the correlations than on the covariance matrix itself. On the other hand, the priors must preserve the positive definiteness of the correlation matrix. This can be achieved by considering the Cholesky decomposition of the correlation matrix, whose entries are reparameterized using trigonometric functions. The efficiency of the trigonometric separation strategy (TSS) is shown through an application to hidden Markov models (HMMs) with multivariate normal conditional distributions. In the case of an unknown number of hidden states, estimation is conducted using a reversible jump Markov chain Monte Carlo algorithm based on split-and-combine and birth-and-death moves, whose design is straightforward thanks to the use of the TSS. Finally, an example in remote sensing is described, where an HMM containing the TSS is used for the segmentation of a multi-colour satellite image.
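As an illustration of the separation strategy described above, the sketch below constructs a covariance matrix as Sigma = D R D, where D is the diagonal matrix of standard deviations and R is a correlation matrix built from a trigonometrically reparameterized Cholesky factor. The function names and the 3 x 3 example values are illustrative assumptions, not taken from the paper; the construction follows the standard spherical-coordinate parameterization, in which each row of the Cholesky factor is a unit vector, so that restricting the angles to (0, pi) guarantees positive definiteness by construction.

```python
import numpy as np

def cholesky_from_angles(theta):
    """Build the lower-triangular Cholesky factor L of a d x d
    correlation matrix from angles theta[i-1][k] in (0, pi).
    Each row of L is a unit vector in spherical coordinates, so
    R = L @ L.T has unit diagonal and is positive definite."""
    d = len(theta) + 1              # theta[i-1] holds the i angles for row i
    L = np.zeros((d, d))
    L[0, 0] = 1.0
    for i in range(1, d):
        ang = theta[i - 1]          # angles theta_{i1}, ..., theta_{ii}
        sin_prod = 1.0
        for j in range(i):
            L[i, j] = np.cos(ang[j]) * sin_prod
            sin_prod *= np.sin(ang[j])
        L[i, i] = sin_prod          # product of all sines: positive diagonal
    return L

def covariance_from_tss(sigma, theta):
    """Separation strategy: Sigma = D R D with D = diag(sigma) and
    R the correlation matrix implied by the trigonometric angles."""
    L = cholesky_from_angles(theta)
    R = L @ L.T
    D = np.diag(sigma)
    return D @ R @ D

# Hypothetical 3 x 3 example: 3 standard deviations, 1 + 2 angles.
sigma = np.array([1.5, 0.8, 2.0])
theta = [np.array([1.1]), np.array([0.7, 2.0])]
Sigma = covariance_from_tss(sigma, theta)
print(np.all(np.linalg.eigvalsh(Sigma) > 0))   # True: positive definite
```

Because any combination of angles in (0, pi) and positive standard deviations yields a valid covariance matrix, priors can be placed directly on these unconstrained parameters, which is what makes moves such as split-and-combine easy to design in the reversible jump sampler.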