Title: Infinite Recommendation Networks: a Data-Centric Approach

Conference: DataPerf 2022

Tags: Data-centric AI, Dataset Distillation, NTK, Recommender Systems

Abstract: We leverage the Neural Tangent Kernel (NTK) and its equivalence to training infinitely-wide neural networks to devise ∞-AE: an autoencoder with infinitely-wide bottleneck layers. The outcome is a highly expressive yet simple recommendation model with a single hyper-parameter and a closed-form solution. Leveraging ∞-AE's simplicity, we also develop Distill-CF for synthesizing tiny, high-fidelity data summaries which distill the most important knowledge from the extremely large and sparse user-item interaction matrix for efficient and accurate subsequent data usage such as model training, inference, and architecture search. This takes a data-centric approach to recommendation, where we aim to improve the quality of logged user-feedback data for subsequent modeling, independent of the learning algorithm. Both of our proposed approaches significantly outperform their respective state-of-the-art, and when used together, we observe 96-110% of ∞-AE's performance on the full dataset with as little as 0.1% of the original dataset size, leading us to explore the counter-intuitive question: Is more data what you need for better recommendation?
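Since the abstract summarizes rather than specifies the method, the following is a minimal sketch of the kind of closed-form computation an infinitely-wide autoencoder reduces to: kernel regression over the user-item interaction matrix using the network's NTK, with a single regularization hyper-parameter λ. The layer stack, widths, toy data, and variable names are illustrative assumptions, not the authors' exact configuration; the NTK here is computed with the neural-tangents library.

```python
# Minimal sketch (assumption: architecture, widths, and lambda are illustrative,
# not the paper's exact setup). Requires jax and neural-tangents.
import jax.numpy as jnp
from neural_tangents import stax

# Toy binary user-item interaction matrix X (users x items).
X = jnp.array([[1., 0., 1., 0.],
               [0., 1., 1., 0.],
               [1., 1., 0., 1.]])

# An autoencoder-style stack; at infinite width, training it is equivalent
# to kernel regression with the network's Neural Tangent Kernel (NTK).
_, _, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(X.shape[1]),
)

K = kernel_fn(X, X, 'ntk')   # (users x users) NTK Gram matrix
lam = 1.0                    # the single regularization hyper-parameter

# Closed-form reconstruction: X_hat = K (K + lam * I)^{-1} X.
alpha = jnp.linalg.solve(K + lam * jnp.eye(K.shape[0]), X)
X_hat = K @ alpha            # dense scores over all items for each user
```

In a recommendation setting, one would rank each user's unseen items by their scores in X_hat and tune λ on held-out validation interactions.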