Improving Portuguese Semantic Role Labeling with Transformers and Transfer Learning
The Natural Language Processing task of determining "Who did what to whom" is
called Semantic Role Labeling. For English, recent methods based on Transformer
models have allowed for major improvements in this task over the previous state
of the art. However, for low-resource languages such as Portuguese, currently
available semantic role labeling models are hindered by scarce training data.
In this paper, we explore a model architecture with only a pre-trained
Transformer-based model, a linear layer, softmax and Viterbi decoding. We
substantially improve the state-of-the-art performance in Portuguese by more
than 15 F1 points. Additionally, we improve semantic role labeling results in Portuguese
corpora by exploiting cross-lingual transfer learning using multilingual
pre-trained models, and transfer learning from dependency parsing in
Portuguese, evaluating the various proposed approaches empirically.

Comment: 30 pages, 3 figures; fixed broken links in References
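The final component of the architecture described above, Viterbi decoding over the per-token label scores produced by the linear layer and softmax, can be sketched as follows. This is an illustrative NumPy implementation under assumed shapes (a `(seq_len, num_labels)` emission matrix and a `(num_labels, num_labels)` transition matrix), not the authors' actual code.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring label sequence as a list of label indices.

    emissions:   (seq_len, num_labels) per-token label scores,
                 e.g. log-softmax outputs of the linear layer.
    transitions: (num_labels, num_labels) score of moving from label i
                 to label j between adjacent tokens.
    """
    seq_len, num_labels = emissions.shape
    # score[t, j] = best score of any path ending in label j at step t
    score = np.full((seq_len, num_labels), -np.inf)
    backptr = np.zeros((seq_len, num_labels), dtype=int)
    score[0] = emissions[0]
    for t in range(1, seq_len):
        # candidate[i, j] = score of ending at i at t-1, then moving to j
        candidate = score[t - 1][:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score[t] = candidate.max(axis=0)
    # Trace the best path backwards through the stored back-pointers
    best = [int(score[-1].argmax())]
    for t in range(seq_len - 1, 0, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]
```

With a zero transition matrix this reduces to per-token argmax; a transition matrix that penalizes label changes makes the decoder prefer consistent label spans, which is the usual motivation for Viterbi decoding in sequence labeling.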