
Beyond Vectors: Subspace Representations for Set Operations of Embeddings

In natural language processing (NLP), embeddings play a crucial role in representing linguistic semantics. Despite the prevalence of vector representations for embedding sets, they exhibit limitations in expressiveness and lack comprehensive set operations. To address this, we attempt to formulate sets and their operations within pre-trained embedding spaces. Inspired by quantum logic, we propose going beyond the conventional vector set representation with a novel subspace-based approach. This methodology constructs subspaces from pre-trained embedding sets, effectively preserving semantic nuances that were previously overlooked, and consistently improving performance in downstream tasks.
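
The abstract only outlines the approach, and the quantum-logic formulation itself is not reproduced here. As a rough illustration of the underlying idea, a word set can be represented by the linear span of its embeddings, with soft set membership scored by orthogonal projection onto that subspace. The sketch below makes this concrete in NumPy; all function names, thresholds, and the toy data are hypothetical, not the authors' exact method:

```python
import numpy as np

def subspace_basis(embeddings, sv_threshold=1e-10):
    """Orthonormal basis (dim x r) for the span of a set of embeddings.

    embeddings: (n, dim) array, one pre-trained embedding per row.
    """
    # The right singular vectors of the stacked set span the same subspace.
    _, s, vt = np.linalg.svd(embeddings, full_matrices=False)
    r = int((s > sv_threshold).sum())  # numerical rank of the set
    return vt[:r].T

def membership(query, basis):
    """Soft membership of a query vector in the subspace: cosine of the
    angle between the query and its orthogonal projection (1.0 = inside)."""
    proj = basis @ (basis.T @ query)   # orthogonal projection onto the span
    return float(np.linalg.norm(proj) / np.linalg.norm(query))

# Toy usage with random stand-ins for pre-trained embeddings.
rng = np.random.default_rng(0)
fruit_set = rng.normal(size=(5, 300))        # e.g. {apple, pear, ...}
B = subspace_basis(fruit_set)
print(membership(fruit_set[0], B))           # ~1.0: a member of the set
print(membership(rng.normal(size=300), B))   # much lower: outside the span
```

Unlike a single averaged vector, a subspace retains the directions of variation within the set, which is one way the "semantic nuances" mentioned in the abstract can be preserved.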