
Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing

Abstract

Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, with competitive downstream performance and non-trivial induced trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.

Comment: ACL 2018 workshop on Relevance of Linguistic Structure in Neural Architectures for NLP
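To make the shift-reduce composition idea concrete, here is a minimal sketch of how a sentence vector can be built by following SHIFT/REDUCE actions over word embeddings. It is illustrative only: the `compose` function below is a simple tanh layer, whereas the paper's models use learned recurrent composition cells, and in the latent-tree setting the action sequence would be produced by a learned policy rather than given. All names (`compose`, `shift_reduce_compose`, `W`) are hypothetical.

import numpy as np

def compose(left, right, W):
    # Hypothetical composition: one tanh layer over the concatenated
    # child vectors (a stand-in for a learned composition cell).
    return np.tanh(W @ np.concatenate([left, right]))

def shift_reduce_compose(word_vecs, actions, W):
    """Compose a sentence vector by following SHIFT/REDUCE actions.

    word_vecs: list of d-dimensional word embeddings (leaf nodes).
    actions:   sequence of "SHIFT" / "REDUCE" defining a binary tree;
               in latent tree learning these come from a learned
               parser, not from an external treebank.
    """
    buffer = list(word_vecs)
    stack = []
    for action in actions:
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        else:  # REDUCE: merge the top two stack items into one node
            right, left = stack.pop(), stack.pop()
            stack.append(compose(left, right, W))
    assert len(stack) == 1 and not buffer
    return stack[0]  # the root vector represents the whole sentence

# Toy usage: 3 words composed as the tree (w0 (w1 w2))
d = 4
rng = np.random.default_rng(0)
words = [rng.standard_normal(d) for _ in range(3)]
W = rng.standard_normal((d, 2 * d))
root = shift_reduce_compose(
    words, ["SHIFT", "SHIFT", "SHIFT", "REDUCE", "REDUCE"], W
)
print(root.shape)  # (4,)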
