
Penalising model complexity / Battagliese, Diego. - (2020 Feb 28).

Penalising model complexity

BATTAGLIESE, Diego
2020-02-28

Abstract

In this thesis, we describe and explore a new method of constructing prior distributions on the additional model components that turn a base model, which does not include those components, into a more flexible one. First, we briefly sketch the idea of the PC (Penalised Complexity) prior. Suppose we have a model that could be made richer and more flexible by introducing an extra component. As a toy example, suppose one wants to make the Gaussian distribution more robust by adding a parameter that controls kurtosis; this yields a Student-t distribution, so a prior for the degrees of freedom needs to be specified. The PC prior is built on the distance between the simpler model and the more flexible one. It relies on the Kullback-Leibler divergence (KLD), which penalises deviations from the base model. The KLD is then transformed onto a more interpretable distance scale, and this distance is assumed to be exponentially distributed. The rate parameter of the exponential distribution is chosen according to the user's beliefs and, finally, a change of variable from the distance scale to the parameter of interest yields the PC prior.
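The construction outlined in the abstract (compute the KLD, map it to a distance scale, place an exponential prior on the distance, and change variables back to the parameter) can be sketched numerically. The following is an illustrative sketch only, not code from the thesis: the function name `pc_prior`, the forward-difference Jacobian, and the toy KLD in the usage example are all assumptions made here for demonstration.

```python
import numpy as np

def pc_prior(xi, kld, lam=1.0, eps=1e-6):
    """Numerically evaluate a PC prior density at parameter value xi.

    xi  : parameter value of the flexible model (base model at xi = 0)
    kld : function mapping xi -> KLD(flexible || base)
    lam : rate of the exponential prior on the distance scale
    """
    # Step 1: map the KLD to the distance scale, d(xi) = sqrt(2 * KLD(xi))
    d = np.sqrt(2.0 * kld(xi))
    # Step 2: forward-difference approximation of |d'(xi)|,
    # the Jacobian needed for the change of variable
    d_plus = np.sqrt(2.0 * kld(xi + eps))
    jac = np.abs(d_plus - d) / eps
    # Step 3: exponential density on d, transformed back to the xi scale
    return lam * np.exp(-lam * d) * jac
```

As a toy check, take the flexible model N(xi, 1) against the base model N(0, 1): the KLD is xi^2 / 2, so the distance is |xi| and the resulting prior density is proportional to exp(-lam * |xi|), a well-known closed-form case.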
PC prior, copulas, skew symmetric models, multivariate extensions of the PC prior
Files for this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/519861

Warning! The displayed data have not been validated by the university.
