Text-to-SQL aims to generate an executable SQL query given a user
utterance and the corresponding database schema. To ensure the well-formedness
of the output SQL, one prominent line of work adopts a grammar-based recurrent decoder
that produces the equivalent SQL abstract syntax tree (AST). However, previous
methods mainly rely on RNN-based decoders, which 1) are slow and
inefficient and 2) incorporate few structural priors. In this work, we
propose an AST structure-aware Transformer decoder (ASTormer) to replace
traditional RNN cells. Structural knowledge, such as node types and
positions in the tree, is seamlessly incorporated into the decoder via both
absolute and relative position embeddings. Moreover, the proposed framework is
compatible with different tree traversal orders and even supports adaptive node
selection. Extensive experiments on five text-to-SQL benchmarks demonstrate the
effectiveness and efficiency of our structured decoder compared to competitive
baselines.
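
For intuition, the sketch below shows one way tree-structural information could be injected into a Transformer decoder layer: absolute embeddings for node types and depths added to the node representations, plus learned per-head relative biases over pairwise tree relations added to the attention scores. This is a minimal, hypothetical illustration in PyTorch; the class name, dimensions, and relation vocabulary are assumptions for exposition, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class TreeStructureAwareAttention(nn.Module):
    """Illustrative decoder self-attention with AST-structure injection
    (hypothetical sketch, not the authors' code)."""

    def __init__(self, d_model=256, n_heads=8, n_node_types=64,
                 max_depth=32, n_relations=16):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Absolute structural embeddings added to each node representation.
        self.type_emb = nn.Embedding(n_node_types, d_model)
        self.depth_emb = nn.Embedding(max_depth, d_model)
        # One learned scalar bias per (pairwise tree relation, head),
        # e.g. parent/child, sibling, ancestor relations between nodes.
        self.rel_bias = nn.Embedding(n_relations, n_heads)
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x, node_types, node_depths, relations):
        # x: (B, T, d); node_types, node_depths: (B, T); relations: (B, T, T)
        B, T, _ = x.shape
        x = x + self.type_emb(node_types) + self.depth_emb(node_depths)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        scores = (q @ k.transpose(-1, -2)) / self.d_head ** 0.5  # (B, H, T, T)
        # Relative structural bias between every pair of generated nodes.
        scores = scores + self.rel_bias(relations).permute(0, 3, 1, 2)
        # Causal mask: each node attends only to already-expanded nodes.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device),
                          diagonal=1)
        attn = scores.masked_fill(mask, float("-inf")).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return self.out(out)
```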