The impact of Data Augmentation and Regularization on the Transformer Architecture in the scope of NILM

Abstract

Non-Intrusive Load Monitoring (NILM) has been reaping the benefits of Deep Learning for at least a decade. Among the most recent approaches, the rise of Transformers has brought promising results. To deepen the understanding of potential enhancements and limitations, this work compares two benchmark architectures with a BERT Transformer with respect to kernel regularization and data augmentation. This study does not seek better performance, but rather aims to establish a comparable baseline for evaluating the extent to which each studied method improves or degrades performance.
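The two studied methods can be made concrete with a minimal sketch. Assuming a typical NILM setup (the abstract gives no implementation detail, so the function names, noise level, and penalty coefficient below are illustrative assumptions, not the paper's method), data augmentation might jitter and rescale an aggregate-power window, while kernel regularization adds an L2 penalty on the model's weight matrices to the training loss:

```python
import numpy as np

def augment_window(window, rng, noise_std=5.0, scale_range=(0.9, 1.1)):
    """Hypothetical NILM data augmentation: random amplitude scaling
    plus Gaussian jitter applied to an aggregate-power window (watts)."""
    scale = rng.uniform(*scale_range)
    noise = rng.normal(0.0, noise_std, size=window.shape)
    return window * scale + noise

def l2_kernel_penalty(kernels, lam=1e-4):
    """Kernel (weight) regularization term added to the training loss:
    lam times the sum of squared entries over all kernel matrices."""
    return lam * sum(float(np.sum(w ** 2)) for w in kernels)

rng = np.random.default_rng(0)
window = np.full(128, 100.0)                  # flat 100 W aggregate window
augmented = augment_window(window, rng)       # perturbed copy, same shape
penalty = l2_kernel_penalty([np.ones((4, 4))], lam=1e-4)  # 16 unit weights
```

The augmentation enlarges the effective training set with plausible signal variants, while the penalty discourages large weights; the study's question is how much each of these actually helps or hurts the compared architectures.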

NTNU Open (Norwegian University of Science and Technology)

Last time updated on 07/06/2025
