In this paper, we propose a novel object-centric representation, called
Block-Slot Representation. Unlike the conventional slot representation, the
Block-Slot Representation provides concept-level disentanglement within a slot.
A block-slot is constructed by composing a set of modular concept
representations, called blocks, generated from a learned memory of abstract
concept prototypes. We call this block-slot construction process Block-Slot
Attention. Block-Slot Attention facilitates the emergence of abstract concept
blocks within a slot, such as color, position, and texture, without any
supervision. This brings the benefits of disentanglement to slots, making the
representation more interpretable. Like Slot Attention, this mechanism can be
used as a drop-in module in arbitrary neural architectures.
In experiments, we show that our model disentangles object properties
significantly better than previous methods, including on complex textured
scenes. We also demonstrate the ability to compose novel scenes by recombining
slots at the block level.
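
The block-binding idea above can be sketched in a few lines: each slot is split into blocks, and each block is rebuilt as an attention-weighted mixture of learned prototypes from that block's memory. This is a minimal illustrative sketch with dot-product attention; the shapes, function names, and retrieval rule here are assumptions for exposition, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bind_blocks(slots, memory):
    """Rebuild each block of each slot as a convex combination of
    prototypes from that block's memory (illustrative sketch).

    slots  : (num_slots, num_blocks, block_dim)
    memory : (num_blocks, num_prototypes, block_dim)
    """
    _, _, block_dim = slots.shape
    # Dot-product scores of every block against its own prototype memory:
    # scores has shape (num_slots, num_blocks, num_prototypes).
    scores = np.einsum('smd,mpd->smp', slots, memory) / np.sqrt(block_dim)
    attn = softmax(scores, axis=-1)
    # Each bound block is a prototype mixture, so concept information
    # is factored block-by-block within the slot.
    bound = np.einsum('smp,mpd->smd', attn, memory)
    return bound, attn

# Toy sizes: 4 slots, 8 blocks per slot, 16 prototypes per block memory.
num_slots, num_blocks, num_protos, block_dim = 4, 8, 16, 32
slots = rng.normal(size=(num_slots, num_blocks, block_dim))
memory = rng.normal(size=(num_blocks, num_protos, block_dim))
bound, attn = bind_blocks(slots, memory)
```

Because each bound block lies in the span of one block-specific prototype memory, swapping a single block between two slots exchanges only that concept (e.g. color) while leaving the others untouched, which is what enables block-level scene composition.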