Exploiting BERT for End-to-End Aspect-based Sentiment Analysis
In this paper, we investigate the modeling power of contextualized embeddings
from pre-trained language models, e.g., BERT, on the E2E-ABSA task.
Specifically, we build a series of simple yet insightful neural baselines for
E2E-ABSA. The experimental results show that, even with a simple linear
classification layer, our BERT-based architecture can outperform
state-of-the-art methods. In addition, we standardize the comparative study by
consistently using a hold-out validation set for model selection, a practice
largely ignored in previous work. Our work can therefore serve as a BERT-based
benchmark for E2E-ABSA.

Comment: NUT workshop@EMNLP-IJCNLP-2019
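The architecture the abstract describes, a linear classification layer applied to each contextualized token embedding, can be sketched as follows. This is a minimal PyTorch sketch, not the paper's exact configuration: the tag set, tensor shapes, and the random stand-in for BERT's hidden states are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Assumed unified tag set in the B/I-{POS,NEG,NEU} style often used for
# E2E-ABSA; the paper's exact label inventory is not given in this abstract.
TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

class LinearTagger(nn.Module):
    """Linear classification head over contextualized token embeddings.

    In the paper's setup, `hidden` would be the output of a pre-trained
    BERT encoder; here random tensors stand in for those hidden states.
    """
    def __init__(self, hidden_size: int, num_tags: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_tags)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size) -> (batch, seq_len, num_tags)
        return self.classifier(hidden)

# Stand-in for BERT output: 2 sentences, 8 tokens each, 768-dim states.
hidden = torch.randn(2, 8, 768)
tagger = LinearTagger(hidden_size=768, num_tags=len(TAGS))
logits = tagger(hidden)
pred = logits.argmax(dim=-1)  # per-token tag indices, shape (2, 8)
print(logits.shape)  # torch.Size([2, 8, 7])
```

Each token thus receives a tag jointly encoding aspect-term boundaries and sentiment polarity, which is what makes the task end-to-end rather than a pipeline of extraction followed by classification.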