
How to deal with hyperparameter optimization in matrix decompositions: new existence theorems and novel stopping criteria

Laura Selicato, Caterina Sportelli, Flavia Esposito
2021-01-01

Abstract

The problem of tuning hyperparameters in learning approaches is an open and important issue that, by its very nature, can affect real data analysis. Unfortunately, these hyperparameters are currently chosen via Grid Search or Cross-Validation, and an automatic tuning procedure is still lacking, especially in unsupervised contexts such as Dimensionality Reduction (DR). In this paper we address the Hyperparameter Optimization (HPO) problem applied to Nonnegative Matrix Factorization (NMF), a particular linear DR technique. Starting from the results of one of our previous works, we aim to strengthen the algorithmic strategy with a theoretical foundation. Firstly, motivated by a bilevel formulation of NMF, we state suitable conditions under which the existence of a minimizer in an infinite-dimensional Hilbert space follows from a more general existence result. Secondly, some theoretical results are provided for all those situations in which it is not possible (or it is unnecessary) to obtain an exact minimizer. To meet these needs, we derive a stopping criterion for an "approximate" minimizer via Ekeland's Variational Principle. These abstract results are applied to our algorithm, and some suggestions for other models are proposed as well.
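As a rough illustration only (not the authors' exact formulation: the outer objective F, the penalty R, and the hyperparameter space Lambda below are assumptions), the HPO problem for a penalized NMF can be written in bilevel form, and Ekeland's Variational Principle then motivates stopping once an ε-approximate minimizer of the regularized objective is reached:

% Bilevel sketch of HPO for penalized NMF (amsmath assumed); the concrete
% penalty R, hyperparameter space Lambda, and outer objective F are placeholders.
\[
\begin{aligned}
  &\min_{\lambda \in \Lambda} \; F\bigl(W(\lambda), H(\lambda)\bigr)
    &&\text{(outer problem: hyperparameter selection)}\\
  &\;\text{s.t.}\;\; \bigl(W(\lambda), H(\lambda)\bigr) \in
    \operatorname*{arg\,min}_{W \ge 0,\; H \ge 0}\;
    \tfrac{1}{2}\,\|X - WH\|_F^2 + \lambda\,\mathcal{R}(W,H)
    &&\text{(inner problem: penalized NMF)}
\end{aligned}
\]
% Ekeland's Variational Principle (standard form): if f is proper, lower
% semicontinuous and bounded below on a complete metric space (M, d), and an
% iterate u satisfies f(u) <= inf_M f + epsilon, then there exists v with
\[
  f(v) \le f(u), \qquad d(u,v) \le \sqrt{\varepsilon}, \qquad
  f(w) > f(v) - \sqrt{\varepsilon}\, d(v,w) \;\; \text{for all } w \ne v,
\]
% so reaching an epsilon-approximate minimizer already guarantees a quantitative
% near-optimality property, which is the kind of condition a stopping rule can test.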

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/380295