NASRec: Weight Sharing Neural Architecture Search for Recommender Systems
The rise of deep neural networks provides an important driver in optimizing
recommender systems. However, the success of a recommender system hinges on
delicate architecture design, which calls for Neural Architecture Search (NAS)
to further improve its modeling. We propose NASRec, a paradigm
that trains a single supernet and efficiently produces abundant
models/sub-architectures by weight sharing. To overcome the data multi-modality
and architecture heterogeneity challenges in the recommendation domain, NASRec
establishes a large supernet (i.e., search space) to search over full
architectures; the supernet incorporates versatile operator choices and
dense connectivity while minimizing human priors for flexibility. The scale and
heterogeneity of NASRec pose search challenges such as training
inefficiency, operator imbalance, and degraded rank correlation. We tackle
these challenges by proposing single-operator any-connection sampling,
operator-balancing interaction modules, and post-training fine-tuning. Our
results on three Click-Through Rate (CTR) prediction benchmarks show that
NASRec outperforms both manually designed models and existing NAS methods,
achieving state-of-the-art performance.
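To make the sampling idea concrete, the sketch below illustrates single-operator any-connection sampling from a densely connected supernet: each block draws exactly one operator per step (keeping operator training balanced) and any subset of connections to earlier blocks. The operator names and function signatures are illustrative assumptions, not taken from the NASRec implementation.

```python
import random

# Hypothetical operator pool for a recommender supernet (assumed names,
# not from the NASRec code).
OPERATORS = ["mlp", "dot_product", "fm", "attention", "elementwise_sum"]

def sample_subnet(num_blocks, seed=None):
    """Sample one sub-architecture from the supernet.

    Each block gets exactly one operator (single-operator sampling) and
    an arbitrary subset of earlier blocks as inputs (any-connection
    sampling over the dense connectivity of the supernet).
    """
    rng = random.Random(seed)
    subnet = []
    for block in range(num_blocks):
        op = rng.choice(OPERATORS)  # one operator per block per step
        if block == 0:
            inputs = []  # the first block consumes the raw features
        else:
            # Each earlier block independently may or may not feed this one;
            # fall back to the previous block so no block is disconnected.
            inputs = [i for i in range(block) if rng.random() < 0.5]
            if not inputs:
                inputs = [block - 1]
        subnet.append({"op": op, "inputs": inputs})
    return subnet

# Example: sample a 5-block sub-architecture; in supernet training, one
# such subnet would be sampled per step and its shared weights updated.
subnet = sample_subnet(5, seed=0)
```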