Machine Learning Applications in Estimating Transformer Loss of Life
Transformer life assessment and failure diagnostics have always been
important problems for electric utility companies. Ambient temperature and load
profile are the main factors that affect the aging of transformer insulation
and, consequently, the transformer's lifetime. IEEE Std C57.91-1995 provides a
model for calculating transformer loss of life based on ambient temperature
and the transformer's loading. In this paper, this standard is used to develop a
data-driven static model for hourly estimation of the transformer loss of life.
Among various machine learning methods for developing this static model, the
Adaptive Network-Based Fuzzy Inference System (ANFIS) is selected. Numerical
simulations demonstrate the effectiveness and the accuracy of the proposed
ANFIS method compared with other relevant machine-learning-based methods for
this problem.
Comment: IEEE Power and Energy Society General Meeting, 201
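For context, the loss-of-life calculation in IEEE Std C57.91-1995 reduces to an aging acceleration factor driven by the winding hot-spot temperature, accumulated over time against the normal insulation life. The Python sketch below illustrates that calculation under the standard's 110 °C reference; the hot-spot temperature series and the 180,000-hour normal insulation life used here are illustrative assumptions, not values taken from the paper.

```python
import math

# IEEE Std C57.91-1995: aging acceleration factor for thermally upgraded paper,
# referenced to a hot-spot temperature of 110 degC (383 K).
def aging_acceleration_factor(hot_spot_c: float) -> float:
    return math.exp(15000.0 / 383.0 - 15000.0 / (hot_spot_c + 273.0))

def percent_loss_of_life(hot_spot_series_c, dt_hours=1.0,
                         normal_insulation_life_h=180_000.0) -> float:
    """Percent loss of life accumulated over a hot-spot temperature series.

    The 180,000 h default is one commonly cited normal-insulation-life figure;
    the standard tabulates several alternatives.
    """
    aged_hours = sum(aging_acceleration_factor(t) * dt_hours
                     for t in hot_spot_series_c)
    return 100.0 * aged_hours / normal_insulation_life_h

# Illustrative 4-hour hot-spot profile (degC); in practice these values come
# from the standard's thermal model driven by ambient temperature and loading.
print(percent_loss_of_life([98.0, 105.0, 112.0, 110.0]))
```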
Segatron: Segment-Aware Transformer for Language Modeling and Understanding
Transformers are powerful for sequence modeling. Nearly all state-of-the-art
language models and pre-trained language models are based on the Transformer
architecture. However, it distinguishes sequential tokens only with the token
position index. We hypothesize that better contextual representations can be
generated from the Transformer with richer positional information. To verify
this, we propose a segment-aware Transformer (Segatron) that replaces the
original token position encoding with a combined position encoding of
paragraph, sentence, and token. We first introduce the segment-aware mechanism
to Transformer-XL, which is a popular Transformer-based language model with
memory extension and relative position encoding. We find that our method can
further improve the Transformer-XL base model and large model, achieving 17.1
perplexity on the WikiText-103 dataset. We further investigate the pre-training
masked language modeling task with Segatron. Experimental results show that
BERT pre-trained with Segatron (SegaBERT) can outperform BERT with vanilla
Transformer on various NLP tasks, and outperforms RoBERTa on zero-shot sentence
representation learning.
Comment: Accepted by AAAI 202
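Since Segatron's change is confined to the position encoding, the core idea can be sketched in a few lines of PyTorch: rather than a single token-position embedding, the paragraph, sentence, and token indices each get their own embedding table, and the three embeddings are summed. The module below is an illustrative sketch under that reading of the abstract, with made-up dimensions; it is not the authors' released implementation.

```python
import torch
import torch.nn as nn

class SegmentAwarePositionEncoding(nn.Module):
    """Combined position encoding: paragraph + sentence + token indices.

    Replaces a single token-position embedding with the sum of three
    embeddings, in the spirit of Segatron / SegaBERT.
    """
    def __init__(self, d_model: int, max_paragraphs: int = 64,
                 max_sentences: int = 128, max_tokens: int = 512):
        super().__init__()
        self.paragraph_emb = nn.Embedding(max_paragraphs, d_model)
        self.sentence_emb = nn.Embedding(max_sentences, d_model)
        self.token_emb = nn.Embedding(max_tokens, d_model)

    def forward(self, paragraph_ids, sentence_ids, token_ids):
        # Each *_ids tensor has shape (batch, seq_len); the three embeddings
        # are summed into one (batch, seq_len, d_model) tensor that the caller
        # adds to the word embeddings.
        return (self.paragraph_emb(paragraph_ids)
                + self.sentence_emb(sentence_ids)
                + self.token_emb(token_ids))

# Toy usage: a 6-token sequence spanning 2 sentences in 1 paragraph.
enc = SegmentAwarePositionEncoding(d_model=16)
p = torch.tensor([[0, 0, 0, 0, 0, 0]])
s = torch.tensor([[0, 0, 0, 1, 1, 1]])
t = torch.tensor([[0, 1, 2, 3, 4, 5]])
print(enc(p, s, t).shape)  # torch.Size([1, 6, 16])
```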
