2,896 research outputs found
Optimising Parameters in Recurrence Quantification Analysis of Smart Energy Systems
Recurrence Quantification Analysis (RQA) can help detect significant events and phase transitions in a dynamical system, but choosing a suitable set of parameters is crucial for success. Different RQA variables can be obtained and analysed from recurrence plots. Currently, most methods for RQA radius optimisation focus on a single RQA variable. In this work we propose two new methods for radius optimisation that look for an optimum in the higher-dimensional space of the RQA variables, thereby optimising across several variables synchronously. We illustrate our approach with two case studies: the well-known Lorenz dynamical system, and a time series obtained from monitoring the energy consumption of a small enterprise. Both case studies show that the proposed methods yield plausible radius values and can be used to analyse energy data.
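To make the radius-selection problem concrete, here is a minimal sketch of the kind of computation involved: building a recurrence matrix for a time series and scanning radii for one that yields a target recurrence rate (a common single-variable heuristic the abstract contrasts with). This is an illustrative toy, not the paper's multi-variable method; the series and the 5% target are assumptions.

```python
import numpy as np

def recurrence_matrix(x, radius):
    """Binary recurrence matrix: R[i, j] = 1 iff |x[i] - x[j]| <= radius."""
    d = np.abs(x[:, None] - x[None, :])
    return (d <= radius).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points -- one of the basic RQA variables."""
    return R.mean()

# Toy series; in practice this would be an embedded phase-space trajectory.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)

# Scan candidate radii and keep the one whose recurrence rate is closest
# to a 5% target -- a single-variable heuristic for radius optimisation.
radii = np.linspace(0.01, 1.0, 50)
rates = [recurrence_rate(recurrence_matrix(x, r)) for r in radii]
best = radii[int(np.argmin([abs(rr - 0.05) for rr in rates]))]
print(f"radius with recurrence rate closest to 5%: {best:.3f}")
```

The multi-variable approach described above would instead score each candidate radius on several RQA variables at once and optimise in that joint space.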
Scaling Laws for Hyperparameter Optimization
Hyperparameter optimization is an important subfield of machine learning that focuses on tuning the hyperparameters of a chosen algorithm to achieve peak performance. Recently, there has been a stream of methods tackling hyperparameter optimization; however, most of them do not exploit the dominant power-law nature of learning curves for Bayesian optimization. In this work, we propose Deep Power Laws (DPL), an ensemble of neural network models conditioned to yield predictions that follow a power-law scaling pattern. Our method dynamically decides which configurations to pause and which to train incrementally by making use of gray-box evaluations. We compare our method against 7 state-of-the-art competitors on 3 benchmarks spanning tabular, image, and NLP datasets, covering 59 diverse tasks. Our method achieves the best results across all benchmarks, obtaining the best any-time performance compared to all competitors. Comment: Accepted at NeurIPS 202
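The power-law assumption behind such methods can be illustrated with a simple fit: if a learning curve's loss follows b * t^(-c), a linear regression in log-log space recovers the parameters and lets one extrapolate to a larger budget, which is the basis for pause/continue decisions in gray-box HPO. This sketch uses a synthetic curve and plain least squares, not DPL's neural-network ensemble.

```python
import numpy as np

# Synthetic learning curve: loss follows a power law b * t^(-c) plus
# small multiplicative noise (b_true = 2.0, c_true = 0.5 are assumptions).
rng = np.random.default_rng(1)
t = np.arange(1, 51)
b_true, c_true = 2.0, 0.5
loss = b_true * t**-c_true * np.exp(0.01 * rng.standard_normal(t.size))

# Fit the power law by linear regression in log-log space:
#   log(loss) = log(b) - c * log(t)
slope, intercept = np.polyfit(np.log(t), np.log(loss), 1)
b_hat, c_hat = np.exp(intercept), -slope

# Extrapolate to a larger training budget, as gray-box HPO methods do
# when deciding whether to pause or continue a configuration.
loss_at_200 = b_hat * 200**-c_hat
print(f"fitted b={b_hat:.2f}, c={c_hat:.2f}, "
      f"predicted loss at t=200: {loss_at_200:.3f}")
```

A configuration whose extrapolated loss is dominated by another's can then be paused early rather than trained to completion.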