PAI-BPR: Personalized Outfit Recommendation Scheme with Attribute-wise Interpretability
Fashion is an important part of human experience. Events such as interviews,
meetings, and weddings often call for particular clothing styles. The rise of
the fashion industry and its role in social influencing have made outfit
compatibility a necessity, which in turn calls for a compatibility model to
aid people in clothing recommendation. However, due to the highly subjective
nature of compatibility, it is necessary to account for personalization. Our
paper devises an attribute-wise interpretable compatibility scheme with
personal preference modelling which captures user-item interaction along with
general item-item interaction. Our work solves the problem of interpretability
in clothing matching by locating the discordant and harmonious attributes
between fashion items. Extensive experiment results on IQON3000, a publicly
available real-world dataset, verify the effectiveness of the proposed model.
Comment: 10 pages, 5 figures, to be published in IEEE BigMM, 202
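The personalized scheme described in the abstract combines general item-item compatibility with user-item preference, typically trained with a BPR-style pairwise objective (as the model's name suggests). The sketch below is a minimal illustration of that idea only; the embedding size, the additive score combination, and all variable names are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative embedding size (an assumption, not from the paper).
D = 16
user = rng.normal(size=D)        # user preference embedding
top = rng.normal(size=D)         # top (query) item embedding
bottom_pos = rng.normal(size=D)  # a compatible bottom
bottom_neg = rng.normal(size=D)  # an incompatible bottom

def score(user, top, bottom):
    """Personalized compatibility: general item-item term plus user-item term."""
    item_item = top @ bottom   # general item-item interaction
    user_item = user @ bottom  # personal preference for this bottom
    return item_item + user_item

def bpr_loss(s_pos, s_neg):
    """Bayesian Personalized Ranking: the positive pair should outscore the negative."""
    return -np.log(1.0 / (1.0 + np.exp(-(s_pos - s_neg))))

s_pos = score(user, top, bottom_pos)
s_neg = score(user, top, bottom_neg)
loss = bpr_loss(s_pos, s_neg)
print(loss)
```

Minimizing the BPR loss over many (user, top, positive bottom, negative bottom) quadruples pushes personally compatible pairs above incompatible ones, which is how the pairwise objective captures both kinds of interaction at once.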
Attribute-aware Explainable Complementary Clothing Recommendation
Modelling mix-and-match relationships among fashion items has become
increasingly demanding yet challenging for modern E-commerce recommender
systems. When performing clothes matching, most existing approaches leverage
the latent visual features extracted from fashion item images for compatibility
modelling, which lacks explainability of generated matching results and can
hardly convince users of the recommendations. Though recent methods start to
incorporate pre-defined attribute information (e.g., colour, style, length,
etc.) for learning item representations and improving the model
interpretability, their utilisation of attribute information is still mainly
reserved for enhancing the learned item representations and generating
explanations via post-processing. As a result, this creates a severe bottleneck
when trying to advance recommendation accuracy and generate fine-grained
explanations, since the explicit attributes have only loose connections to the
actual recommendation process. This work aims to tackle the
explainability challenge in fashion recommendation tasks by proposing a novel
Attribute-aware Fashion Recommender (AFRec). Specifically, AFRec
assesses the outfit compatibility by explicitly leveraging the extracted
attribute-level representations from each item's visual feature. The attributes
serve as the bridge between two fashion items, where we quantify the affinity
of a pair of items through the learned compatibility between their attributes.
Extensive experiments have demonstrated that, by making full use of the
explicit attributes in the recommendation process, AFRec is able to achieve
state-of-the-art recommendation accuracy and generate intuitive explanations at
the same time.
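The attribute-as-bridge idea above can be sketched minimally: each item yields per-attribute representations, a learned function scores the compatibility of every attribute pair across two items, and the item-pair affinity is the aggregate, with the strongest and weakest attribute pairs serving as the explanation. The attribute set, bilinear scoring form, and all names here are illustrative assumptions, not AFRec's actual design.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical attribute set and embedding size (assumptions for illustration).
ATTRS = ["colour", "style", "length"]
D = 8

# Per-attribute representations, as if extracted from each item's visual features.
item_a = {a: rng.normal(size=D) for a in ATTRS}
item_b = {a: rng.normal(size=D) for a in ATTRS}

# A learned attribute-to-attribute compatibility function (here, random bilinear forms).
W = {(p, q): rng.normal(size=(D, D)) for p in ATTRS for q in ATTRS}

def attribute_affinities(item_a, item_b):
    """Compatibility score for every attribute pair across the two items."""
    return {(p, q): float(item_a[p] @ W[(p, q)] @ item_b[q])
            for p in ATTRS for q in ATTRS}

def item_affinity(item_a, item_b):
    """Overall item-pair affinity: aggregate of attribute-level compatibilities."""
    return sum(attribute_affinities(item_a, item_b).values())

aff = attribute_affinities(item_a, item_b)
total = item_affinity(item_a, item_b)
# The most harmonious and most discordant attribute pairs act as the explanation.
best = max(aff, key=aff.get)
worst = min(aff, key=aff.get)
print(total, best, worst)
```

Because the attribute-level scores participate directly in the item-pair affinity rather than being attached via post-processing, the same quantities that drive the recommendation double as its explanation.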