
Hybrid Hyper-Parameter Optimization for Collaborative Filtering

EasyChair Preprint no. 4117

8 pages · Date: August 31, 2020

Abstract

Collaborative filtering (CF) has become a prevalent technique for filtering objects a user might like, based on the reactions of other users. Neural network-based solutions for CF rely on hyper-parameters to control the learning process. This paper documents a solution for hyper-parameter optimization (HPO). We empirically show that optimizing the hyper-parameters leads to a significant performance gain. Moreover, we present a method to streamline HPO while substantially reducing computation time. Our solution relies on separating the hyper-parameters into two groups: predetermined parameters and automatically optimizable parameters. By minimizing the latter, we can significantly reduce the overall time needed for HPO. In an extensive experimental analysis on a real-world dataset, the method produced significantly better results than manual HPO.
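
To illustrate the idea of splitting hyper-parameters into a predetermined group and an automatically optimizable group, a minimal sketch follows. This is not the authors' implementation; the parameter names, value ranges, the dummy objective, and the random-search strategy are illustrative assumptions only.

```python
import random

# Assumed split of hyper-parameters into two groups:
# values fixed in advance vs. values left to automatic search.
PREDETERMINED = {            # chosen once, never searched (assumed values)
    "optimizer": "adam",
    "batch_size": 256,
    "epochs": 30,
}

SEARCH_SPACE = {             # automatically optimizable group (assumed ranges)
    "learning_rate": (1e-4, 1e-2),
    "embedding_dim": (16, 128),
    "dropout": (0.0, 0.5),
}


def sample(space):
    """Draw one random configuration from the optimizable group."""
    cfg = {}
    for name, (low, high) in space.items():
        value = random.uniform(low, high)
        # Keep integer-valued ranges integral (e.g. embedding_dim).
        cfg[name] = int(value) if isinstance(low, int) and isinstance(high, int) else value
    return cfg


def evaluate(config):
    """Placeholder objective: in practice this would train the neural CF model
    with the given configuration and return a validation metric (e.g. HR@10)."""
    return -abs(config["learning_rate"] - 1e-3)  # dummy score for illustration


def random_search(trials=20):
    """Search only the optimizable group; the predetermined group stays fixed."""
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = {**PREDETERMINED, **sample(SEARCH_SPACE)}
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score


if __name__ == "__main__":
    cfg, score = random_search()
    print("best configuration:", cfg)
```

Because only the optimizable group enters the search loop, the dimensionality of the search space shrinks, which is the mechanism by which such a split can reduce overall HPO time.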

Keyphrases: Adam, Artificial Intelligence, collaborative filtering, hyper-parameter, hyper-parameter optimization, machine learning, recommender

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following is a workaround that produces the correct reference:
@Booklet{EasyChair:4117,
  author = {Péter Szabó and Béla Genge},
  title = {Hybrid Hyper-Parameter Optimization for Collaborative Filtering},
  howpublished = {EasyChair Preprint no. 4117},
  year = {EasyChair, 2020}}