Abstract
Non-Intrusive Load Monitoring (NILM) has been benefiting from Deep Learning for at least a decade. Among the most recent approaches, Transformer-based architectures have produced promising results. To deepen the understanding of their potential enhancements and limitations, this work compares two benchmark architectures with a BERT transformer with respect to kernel regularization and data augmentation. The study does not seek better performance, but rather aims to establish a comparable baseline for evaluating to what extent each studied method improves or degrades performance.
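To make the idea of kernel regularization in a NILM network concrete, the following is a minimal sketch (not the paper's code) of applying L2 kernel regularization to a generic seq2point-style CNN in Keras. The window length, layer sizes, and regularization strength are illustrative assumptions, not values reported in this study.

# Minimal sketch, assuming a Keras seq2point-style NILM baseline.
# All hyperparameters below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, regularizers

WINDOW = 599       # assumed input window length (samples of aggregate power)
L2_FACTOR = 1e-4   # assumed kernel-regularization strength

def build_seq2point(window=WINDOW, l2=L2_FACTOR):
    reg = regularizers.l2(l2)
    inputs = tf.keras.Input(shape=(window, 1))
    x = inputs
    # Convolutional feature extractor with kernel regularization on every layer
    for filters, kernel in [(30, 10), (30, 8), (40, 6), (50, 5), (50, 5)]:
        x = layers.Conv1D(filters, kernel, activation="relu",
                          padding="same", kernel_regularizer=reg)(x)
    x = layers.Flatten()(x)
    x = layers.Dense(1024, activation="relu", kernel_regularizer=reg)(x)
    # Predicted appliance power at the window midpoint
    outputs = layers.Dense(1)(x)
    return tf.keras.Model(inputs, outputs)

model = build_seq2point()
model.compile(optimizer="adam", loss="mse")
model.summary()

The same kernel_regularizer argument can be attached to the dense projections of a Transformer encoder, which is one way such a comparison across architectures could be set up.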