Learning Fashion Compatibility with Bidirectional LSTMs
The ubiquity of online fashion shopping demands effective recommendation
services for customers. In this paper, we study two types of fashion
recommendation: (i) suggesting an item that matches existing components in a
set to form a stylish outfit (a collection of fashion items), and (ii)
generating an outfit with multimodal (images/text) specifications from a user.
To this end, we propose to jointly learn a visual-semantic embedding and the
compatibility relationships among fashion items in an end-to-end fashion. More
specifically, we consider a fashion outfit to be a sequence (usually from top
to bottom and then accessories) and each item in the outfit as a time step.
Given the fashion items in an outfit, we train a bidirectional LSTM (Bi-LSTM)
model to sequentially predict the next item conditioned on previous ones to
learn their compatibility relationships. Further, we learn a visual-semantic
space by regressing image features to their semantic representations aiming to
inject attribute and category information as a regularization for training the
LSTM. The trained network can not only perform the aforementioned
recommendations effectively but also predict the compatibility of a given
outfit. We conduct extensive experiments on our newly collected Polyvore
dataset, and the results provide strong qualitative and quantitative evidence
that our framework outperforms alternative methods.
Comment: ACM MM 1
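The sequential modeling described in this abstract (predicting the next item conditioned on previous ones, and scoring an outfit by how predictable its items are) can be illustrated with a toy NumPy sketch. This is not the paper's model: a running mean of previous embeddings stands in for the Bi-LSTM hidden state, and all embeddings are random placeholders.

```python
import numpy as np

# Toy illustration of sequential outfit scoring: treat an outfit as a sequence
# of item embeddings and sum log P(next item | previous items) over the
# sequence. A mean of previous embeddings stands in for the LSTM state; the
# candidate pool and dimensions are made up for illustration.

rng = np.random.default_rng(0)
D = 8                                   # embedding dimension (arbitrary)
candidates = rng.normal(size=(20, D))   # hypothetical item-embedding pool

def outfit_log_likelihood(outfit):
    """Sum of log-softmax scores of each item given the items before it."""
    total = 0.0
    for t in range(1, len(outfit)):
        context = outfit[:t].mean(axis=0)     # stand-in for the LSTM state
        logits = candidates @ context         # score every candidate item
        target = outfit[t] @ context          # score of the true next item
        log_z = np.log(np.exp(logits - logits.max()).sum()) + logits.max()
        total += target - log_z               # log-softmax of the true item
    return total

outfit = candidates[[3, 7, 11]]  # a 3-item "outfit" drawn from the pool
print(outfit_log_likelihood(outfit))
```

A higher (less negative) total indicates a more "predictable", hence more compatible, sequence; the paper additionally regularizes the LSTM with a visual-semantic embedding, which this sketch omits.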
Dressing as a Whole: Outfit Compatibility Learning Based on Node-wise Graph Neural Networks
With the rapid development of the fashion market, customers' demands for
fashion recommendation are rising. In this paper, we aim to
investigate a practical problem of fashion recommendation by answering the
question "which item should we select to match with the given fashion items and
form a compatible outfit". The key to this problem is to estimate the outfit
compatibility. Previous works which focus on the compatibility of two items or
represent an outfit as a sequence fail to make full use of the complex
relations among items in an outfit. To remedy this, we propose to represent an
outfit as a graph. In particular, we construct a Fashion Graph, where each node
represents a category and each edge represents interaction between two
categories. Accordingly, each outfit can be represented as a subgraph by
putting items into their corresponding category nodes. To infer the outfit
compatibility from such a graph, we propose Node-wise Graph Neural Networks
(NGNN) which can better model node interactions and learn better node
representations. In NGNN, the interaction on each edge is modeled with its own
parameters, determined by the two connected nodes. An attention
mechanism is utilized to calculate the outfit compatibility score with learned
node representations. NGNN can not only be used to model outfit compatibility
from visual or textual modality but also from multiple modalities. We conduct
experiments on two tasks: (1) Fill-in-the-blank: suggesting an item that
matches with existing components of outfit; (2) Compatibility prediction:
predicting the compatibility scores of given outfits. Experimental results
demonstrate that our proposed method significantly outperforms the alternatives.
Comment: 11 pages, accepted by the 2019 World Wide Web Conference (WWW-2019)
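The final scoring step described in this abstract (attention over learned node representations, pooled into one compatibility score) can be sketched as a minimal NumPy toy. The edge-specific message passing that produces the node representations is omitted, and all weight vectors below are random placeholders, not the paper's learned parameters.

```python
import numpy as np

# Minimal sketch of attention-based outfit scoring: given node representations
# for the occupied category nodes of an outfit, weight each node's scalar
# score by a softmax attention weight and squash the pooled result to (0, 1).
# All parameters are random stand-ins for illustration only.

rng = np.random.default_rng(1)
D = 8
nodes = rng.normal(size=(4, D))   # representations of 4 occupied category nodes
w_att = rng.normal(size=D)        # hypothetical attention parameter vector
w_out = rng.normal(size=D)        # hypothetical per-node scoring vector

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def compatibility(nodes):
    att = softmax(nodes @ w_att)                     # attention weight per node
    per_node = nodes @ w_out                         # scalar score per node
    return 1.0 / (1.0 + np.exp(-(att @ per_node)))   # sigmoid -> (0, 1)

score = compatibility(nodes)
print(score)
```

The score lands in (0, 1), so it can be read as a compatibility probability; the fill-in-the-blank task can then be served by scoring each candidate completion and picking the highest.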