
Toward a New Approach for Tuning Regularization Hyperparameter in NMF

Del Buono N.; Esposito F.; Selicato L.
2022-01-01

Abstract

Linear Dimensionality Reduction (LDR) methods have gained much attention in recent decades and have been used in data mining applications to reconstruct a given data matrix. The effectiveness of low-rank models in data science rests on the assumption that each row or column of the data matrix is associated with a bounded latent variable, and that the entries of the matrix are generated by applying a piecewise analytic function to these latent variables. Formally, LDR can be cast as an optimization problem to which regularization terms are often added to enforce particular constraints emphasizing useful properties of the data. From this point of view, tuning the regularization hyperparameters (HPs), which control the weight of the additional constraints, is a problem worth solving automatically rather than by trial and error. In this work, we focus on the role the regularization HPs play in the Nonnegative Matrix Factorization (NMF) context and on how their proper choice affects downstream results, providing a complete overview and directions for a novel approach. Moreover, a novel bilevel formulation of regularization HP selection is proposed, which incorporates the HP choice directly into the unsupervised algorithm as part of the updating process.
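To make the abstract's setting concrete, the sketch below shows NMF with Tikhonov (Frobenius-norm) regularization on both factors, solved by multiplicative updates. This is a generic illustration of the role of the regularization HPs `lam_w` and `lam_h`, not the bilevel selection method proposed in the paper; the function name, the penalty choice, and the default HP values are illustrative assumptions.

```python
import numpy as np

def regularized_nmf(X, r, lam_w=0.1, lam_h=0.1, n_iter=200, seed=0):
    """Tikhonov-regularized NMF via multiplicative updates.

    Minimizes ||X - W H||_F^2 + lam_w ||W||_F^2 + lam_h ||H||_F^2
    subject to W >= 0, H >= 0. lam_w and lam_h are the regularization
    HPs whose tuning the paper discusses (values here are illustrative).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    eps = 1e-10  # avoid division by zero in the updates
    for _ in range(n_iter):
        # Multiplicative updates; the HPs enter the denominators,
        # shrinking the factors toward zero as lam_w, lam_h grow.
        H *= (W.T @ X) / (W.T @ W @ H + lam_h * H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + lam_w * W + eps)
    return W, H
```

In this classical scheme the HPs are fixed before the iterations start, which is precisely the trial-and-error loop the paper's bilevel formulation aims to replace by updating the HPs inside the algorithm itself.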
2022
ISBN: 978-3-030-95466-6, 978-3-030-95467-3

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/411577

Citations
  • PMC: n/a
  • Scopus: 1
  • Web of Science: 0