Word Embedding techniques for Content-based Recommender Systems: An empirical evaluation
Cataldo Musto, Giovanni Semeraro, Marco de Gemmis, Pasquale Lops
2015-01-01
Abstract
This work presents an empirical comparison of three widespread word embedding techniques: Latent Semantic Indexing, Random Indexing, and the more recent Word2Vec. Specifically, we employed these techniques to learn a low-dimensional vector-space word representation, and we exploited it to represent both items and user profiles in a content-based recommendation scenario. The performance of the techniques was evaluated on two state-of-the-art datasets, and the experimental results provide good insights that pave the way to several future research directions.
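To illustrate the general idea described in the abstract, the following is a minimal sketch (not the authors' actual pipeline) of how learned word embeddings can represent items and user profiles for content-based recommendation. The embedding values, item descriptions, and the choice of a simple centroid aggregation with cosine ranking are illustrative assumptions; in the paper the vectors would be learned by LSI, Random Indexing, or Word2Vec.

```python
import numpy as np

# Toy 2-dimensional word embeddings (illustrative values only; the paper
# learns these with LSI, Random Indexing, or Word2Vec).
word_vecs = {
    "sci":     np.array([1.0, 0.0]),
    "fi":      np.array([0.9, 0.1]),
    "romance": np.array([0.0, 1.0]),
    "drama":   np.array([0.1, 0.9]),
}

def item_vector(description):
    """Represent an item as the centroid of its words' embeddings."""
    return np.mean([word_vecs[w] for w in description.split()], axis=0)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

items = {
    "A": item_vector("sci fi"),
    "B": item_vector("romance drama"),
    "C": item_vector("sci drama"),
}

# User profile: centroid of the items the user liked (here, only item A),
# so profile and items live in the same embedding space.
profile = np.mean([items["A"]], axis=0)

# Rank unseen items by cosine similarity to the profile.
ranking = sorted(["B", "C"], key=lambda i: cosine(profile, items[i]),
                 reverse=True)
print(ranking)  # item C (shares "sci" with the liked item) outranks item B
```

The key design point this sketch shows is that, once words, items, and profiles share one low-dimensional space, recommendation reduces to a nearest-neighbor query, regardless of which of the three embedding techniques produced the space.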