
Bayesian estimation of finite mixtures of Gaussian mixtures

BILANCIA, Massimo; POLLICE, Alessio
1999-01-01

Abstract

A finite-mixture distribution model is introduced for Bayesian classification in the presence of asymmetry or shape effects due to higher-order moments of the parent populations. The mixed normal distribution, $\pi N_d(\mu_1, R_1) + (1 - \pi) N_d(\mu_2, R_2)$, is shown to be a flexible distributional form for the mixture components. As a result of the estimation procedure, component distributions are estimated in a semiparametric way. Parameter estimation via Gibbs sampling requires putting the model in hierarchical form and deriving the full conditional distributions of its parameters. One of the main issues involved in Bayesian estimation of finite mixtures concerns the specification of prior information: highly informative priors protect against the intrinsic unidentifiability of the model and often make posterior simulations easier to interpret, though they require careful and possibly empirically based tuning of the hyperparameters. Closed-form expressions for some posterior predictive densities are also derived; Monte Carlo approximations are given when these are not available. The simultaneous use of weakly informative priors and identifiability constraints leads to equivalent results. A case study concerning a data set with highly asymmetric components is reported.
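To make the component form concrete, the following is a minimal sketch (not taken from the paper) of the two-component mixed normal density $\pi N_d(\mu_1, R_1) + (1 - \pi) N_d(\mu_2, R_2)$ used as a flexible, possibly skewed component distribution. The Python/SciPy implementation, the function name, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of the mixed normal density described in the abstract:
# pi * N_d(mu1, R1) + (1 - pi) * N_d(mu2, R2).
# All names and parameter values below are hypothetical, for illustration only.
import numpy as np
from scipy.stats import multivariate_normal

def mixed_normal_pdf(x, pi_, mu1, R1, mu2, R2):
    """Density of a two-component mixed normal distribution at point x."""
    return (pi_ * multivariate_normal.pdf(x, mean=mu1, cov=R1)
            + (1.0 - pi_) * multivariate_normal.pdf(x, mean=mu2, cov=R2))

# Example: an asymmetric bivariate component obtained by mixing two
# overlapping Gaussians with different means and covariance matrices.
x = np.array([0.5, -0.2])
density = mixed_normal_pdf(
    x, pi_=0.7,
    mu1=np.zeros(2), R1=np.eye(2),
    mu2=np.array([1.5, 0.5]), R2=0.5 * np.eye(2),
)
print(density)
```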
Files associated with this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/133267
Warning: the data displayed have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 3
  • ISI: not available