In a Conversational Recommendation System (CRS), an agent is asked to recommend
a set of items to users within natural language conversations. To address the
need for both conversational capability and personalized recommendations, prior
works have utilized separate recommendation and dialogue modules. However, such
an approach inevitably results in a discrepancy between recommendation results
and generated responses. To bridge the gap, we propose a multi-task learning
framework for a
unified CRS, where a single model jointly learns both tasks via Contextualized
Knowledge Distillation (ConKD). We introduce two versions of ConKD: hard gate
and soft gate. The former selectively gates between two task-specific teachers,
while the latter integrates knowledge from both teachers. Our gates are
computed on-the-fly in a context-specific manner, facilitating flexible
integration of relevant knowledge. Extensive experiments demonstrate that our
single model significantly improves recommendation performance while enhancing
fluency, and achieves comparable results in terms of diversity.
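
The abstract does not fix an implementation, but the gating idea can be sketched. The following Python snippet is a minimal illustration, not the paper's code: the function name, the temperature, the thresholding rule for the hard gate, and the way `gate` is produced are all assumptions; in ConKD the gates are computed on-the-fly from the dialogue context.

```python
import torch
import torch.nn.functional as F

def conkd_loss(student_logits, rec_logits, dial_logits, gate,
               hard=True, temperature=2.0):
    """Distill one student from two task-specific teachers.

    gate: per-token weights in [0, 1] favoring the recommendation
    teacher. How the paper derives them from context is not shown
    here, so they are treated as an input. All names and defaults
    are illustrative, not the paper's.
    """
    T = temperature
    student_logp = F.log_softmax(student_logits / T, dim=-1)
    rec_probs = F.softmax(rec_logits / T, dim=-1)
    dial_probs = F.softmax(dial_logits / T, dim=-1)

    if hard:
        # Hard gate: commit to exactly one teacher per token
        # (thresholding is a stand-in for the paper's selection rule).
        gate = (gate > 0.5).float()

    # Soft gate: convex combination of the two teacher distributions.
    g = gate.unsqueeze(-1)  # broadcast over the vocabulary dimension
    mixed = g * rec_probs + (1.0 - g) * dial_probs

    # KL(mixed teacher || student), scaled by T^2 as in standard KD.
    return F.kl_div(student_logp, mixed, reduction="batchmean") * T * T

# Toy usage: batch of 2 sequences, length 5, vocabulary of 10 tokens.
B, S, V = 2, 5, 10
student = torch.randn(B, S, V)
rec_t, dial_t = torch.randn(B, S, V), torch.randn(B, S, V)
gate = torch.rand(B, S)  # in ConKD this would depend on the context
loss = conkd_loss(student, rec_t, dial_t, gate, hard=False)
```

With `hard=True` each position is supervised by a single teacher, while `hard=False` blends both, matching the hard/soft distinction the abstract draws.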