BERT Models for Arabic Text Classification: A Systematic Review
Bidirectional Encoder Representations from Transformers (BERT) has gained increasing attention from researchers and practitioners, as it has proven to be an invaluable technique in natural language processing. This is mainly due to its unique features, including its ability to predict words conditioned on both the left and the right context, and its ability to be pretrained on the plain-text corpora that are abundantly available on the web. As BERT gained more interest, more BERT models were introduced to support other languages, including Arabic. The current state of knowledge and practice in applying BERT models to Arabic text classification is limited. In an attempt to begin remedying this gap, this review synthesizes the different Arabic BERT models that have been applied to text classification. It investigates the differences between them and compares their performance. It also examines how effective they are compared to the original English BERT models. It concludes by offering insight into aspects that need further improvement and directions for future work.
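As a concrete illustration of how a pretrained Arabic BERT model is typically applied to text classification (a minimal sketch, not a procedure taken from the review itself), the following Python snippet loads a checkpoint with the Hugging Face Transformers library and runs it as a sequence classifier. The model ID "aubmindlab/bert-base-arabertv2" and the two-label setup are illustrative assumptions.

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Assumed Arabic BERT checkpoint; any comparable Arabic model could be substituted.
model_name = "aubmindlab/bert-base-arabertv2"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# num_labels=2 assumes a binary task such as sentiment polarity; the
# classification head is randomly initialized here and would need fine-tuning
# on labeled Arabic data before its predictions are meaningful.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

text = "هذا المنتج رائع"  # "This product is great"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index

Since the classification head is untrained at this point, the snippet only demonstrates the inference API; an actual experiment of the kind surveyed in the review would first fine-tune the model on a labeled Arabic corpus.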