16 research outputs found

    On the Connection between $L_p$- and Risk Consistency and its Implications on Regularized Kernel Methods

    As a predictor's quality is often assessed by means of its risk, it is natural to regard risk consistency as a desirable property of learning methods, and many such methods have indeed been shown to be risk consistent. The first aim of this paper is to establish the close connection between risk consistency and $L_p$-consistency for a considerably wider class of loss functions than has been done before. The attempt to transfer this connection to shifted loss functions surprisingly reveals that this shift does not reduce the assumptions needed on the underlying probability measure to the same extent as it does for many other results. The results are applied to regularized kernel methods such as support vector machines. Comment: 33 pages, 1 figure
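    For orientation, the two consistency notions can be stated as follows. These are the standard definitions; the paper's precise conditions on the loss $L$ and the distribution $P$ are not reproduced here. A learning method producing predictors $f_n$ is risk consistent if

    $$\mathcal{R}_{L,P}(f_n) \;\xrightarrow{n\to\infty}\; \mathcal{R}^{*}_{L,P} := \inf_{f} \mathcal{R}_{L,P}(f) \quad \text{in probability},$$

    and $L_p$-consistent if

    $$\lVert f_n - f^{*}_{L,P} \rVert_{L_p(P_X)} \;\xrightarrow{n\to\infty}\; 0 \quad \text{in probability},$$

    where $f^{*}_{L,P}$ denotes a Bayes predictor, i.e. a minimizer of the risk $\mathcal{R}_{L,P}$.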

    On consistency and robustness properties of support vector machines for heavy-tailed distributions


    Deep Distributional Time Series Models and the Probabilistic Forecasting of Intraday Electricity Prices

    Recurrent neural networks (RNNs) with rich feature vectors of past values can provide accurate point forecasts for series that exhibit complex serial dependence. We propose two approaches to constructing deep time series probabilistic models based on a variant of RNN called an echo state network (ESN). The first is where the output layer of the ESN has stochastic disturbances and a shrinkage prior for additional regularization. The second approach employs the implicit copula of an ESN with Gaussian disturbances, which is a deep copula process on the feature space. Combining this copula with a non-parametrically estimated marginal distribution produces a deep distributional time series model. The resulting probabilistic forecasts are deep functions of the feature vector and also marginally calibrated. In both approaches, Bayesian Markov chain Monte Carlo methods are used to estimate the models and compute forecasts. The proposed deep time series models are suitable for the complex task of forecasting intraday electricity prices. Using data from the Australian National Electricity Market, we show that our models provide accurate probabilistic price forecasts. Moreover, the models provide a flexible framework for incorporating probabilistic forecasts of electricity demand as additional features. We demonstrate that doing so, in the deep distributional time series model in particular, substantially increases price forecast accuracy.
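    The ESN backbone the paper builds on can be sketched as follows. This is a minimal point-forecast version only: the stochastic output layer, shrinkage prior, implicit copula and MCMC estimation are not shown, and all parameter values (reservoir size, spectral radius, ridge penalty) are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: a noisy sine wave standing in for a price series
T = 500
y = np.sin(np.linspace(0, 20 * np.pi, T)) + 0.1 * rng.standard_normal(T)

# Echo state network: fixed random reservoir, only the readout is trained
n_res = 100                # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 (echo state property)

# Drive the reservoir with the observed series
x = np.zeros(n_res)
states = np.zeros((T - 1, n_res))
for t in range(T - 1):
    x = np.tanh(W @ x + W_in * y[t])
    states[t] = x

# Ridge-regression readout: predict y[t+1] from the reservoir state at time t
washout = 50               # discard initial transient states
X, target = states[washout:], y[washout + 1:]
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ target)

pred = X @ w_out
rmse = np.sqrt(np.mean((pred - target) ** 2))
```

    The key design point is that the recurrent weights are never trained; the paper's two approaches differ in how the (trained) output layer is made probabilistic.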

    Communication-constrained distributed quantile regression with optimal statistical guarantees

    We address the problem of how to achieve optimal inference in distributed quantile regression without stringent scaling conditions. This is challenging due to the non-smooth nature of the quantile regression (QR) loss function, which invalidates the use of existing methodology. The difficulties are resolved through a double-smoothing approach that is applied to the local (at each data source) and global objective functions. Despite the reliance on a delicate combination of local and global smoothing parameters, the quantile regression model is fully parametric, thereby facilitating interpretation. In the low-dimensional regime, we establish a finite-sample theoretical framework for the sequentially defined distributed QR estimators. This reveals a trade-off between the communication cost and statistical error. We further discuss and compare several alternative confidence set constructions, based on inversion of Wald and score-type tests and resampling techniques, detailing an improvement that is effective for more extreme quantile coefficients. In high dimensions, a sparse framework is adopted, where the proposed doubly-smoothed objective function is complemented with an $\ell_1$-penalty. We show that the corresponding distributed penalized QR estimator achieves the global convergence rate after a near-constant number of communication rounds. A thorough simulation study further elucidates our findings.
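    The basic ingredient, a convolution-smoothed check loss, can be illustrated on a single machine. This is a generic sketch, not the paper's double-smoothing construction (which combines local and global smoothing parameters across data sources); the Gaussian kernel, bandwidth and step size below are illustrative assumptions.

```python
import numpy as np
from math import erf

# Standard normal CDF, vectorised via math.erf (avoids a SciPy dependency)
norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / np.sqrt(2.0))))

def smoothed_check_grad(u, tau, h):
    """Derivative of the Gaussian-convolution-smoothed check loss at residual u.

    Convolving rho_tau(u) = u * (tau - 1{u < 0}) with a N(0, h^2) kernel
    yields a smooth loss whose derivative is tau - Phi(-u / h).
    """
    return tau - norm_cdf(-u / h)

def fit_smoothed_qr(X, y, tau, h=0.5, lr=0.5, n_iter=800):
    """Gradient descent on the smoothed quantile loss (single-machine sketch)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        u = y - X @ beta
        beta += lr * (X.T @ smoothed_check_grad(u, tau, h)) / n
    return beta

rng = np.random.default_rng(0)
y = 1.0 + 2.0 * rng.standard_normal(2000)   # N(1, 4) sample
X = np.ones((2000, 1))                      # intercept-only model
tau = 0.9
beta_hat = fit_smoothed_qr(X, y, tau)       # close to the 0.9-quantile of y
```

    Because the smoothed loss is differentiable, each machine can transmit a gradient (or Hessian) of fixed size per round, which is what makes the communication-cost/statistical-error trade-off in the paper tractable.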

    Data-driven modelling, forecasting and uncertainty analysis of disaggregated demands and wind farm power outputs

    Correct analysis of modern power supply systems requires evaluating much wider ranges of uncertainties introduced by the implementation of new technologies on both supply and demand sides. On the supply side, these uncertainties are due to the increased contributions of renewable generation sources (e.g., wind and PV), whose stochastic output variations are difficult to predict and control, as well as due to the significant changes in system operating conditions, coming from the implementation of various control and balancing actions, increased automation and switching functionalities, and frequent network reconfiguration. On the demand side, these uncertainties are due to the installation of new types of loads, featuring strong spatio-temporal variations of demands (e.g., EV charging), as well as due to the deployment of different demand-side management schemes. Modern power supply systems are also characterised by much higher availability of measurements and recordings, coming from a number of recently deployed advanced monitoring, data acquisition and control systems, and providing valuable information on system operating and loading conditions, state and status of network components and details on various system events, transients and disturbances. Although the processing of large amounts of measured data brings its own challenges (e.g., data quality, performance, and incorporation of domain knowledge), these data open new opportunities for a more accurate and comprehensive evaluation of the overall system performance, which, however, require new data-driven analytical approaches and modelling tools. This PhD research is aimed at developing and evaluating novel and improved data-driven methodologies for modelling renewable generation and demand, in general, and for assessing the corresponding uncertainties and forecasting, in particular.
The research and methods developed in this thesis use actual field measurements of several onshore and offshore wind farms, as well as measured active and reactive power demands at several low voltage (LV) individual household levels, up to the demands at medium voltage (MV) substation level. The models are specifically built to be implemented for power system analysis and are actually used by a number of researchers and PhD students in Edinburgh and elsewhere (e.g., collaborations with colleagues from Italy and Croatia), which is discussed and illustrated in the thesis through selected study cases taken from these joint research efforts. After a literature review and discussion of basic concepts and definitions, the first part of the thesis presents data-driven analysis, modelling, uncertainty evaluation and forecasting of (predominantly residential) demands and load profiles at LV and MV levels. The analysis includes both aggregation and disaggregation of measured demands, where the latter is considered in the context of identifying demand-manageable loads (e.g., heating). For that purpose, periodical changes in demands, e.g., half-daily, daily, weekly, seasonal and annual, are represented with Fourier/frequency components and correlated with the corresponding exploratory meteorological variables (e.g., temperature, solar irradiance), allowing the combination of components that maximises the positive or negative correlations to be selected as an additional predictor variable. A convolutional neural network (CNN) and a bidirectional long short-term memory (BiLSTM) network are then used to represent dependencies among multiple dimensions and to output the estimated disaggregated time series of specific load types (with Bayesian optimisation applied to select appropriate CNN-BiLSTM hyperparameters).
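The Fourier-component/correlation idea can be sketched on synthetic data. The series, the meteorological dependence and the band selection below are all hypothetical illustrations; the thesis's CNN-BiLSTM disaggregation itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly demand over 60 days: a daily cycle plus a
# heating-type dependence on a slowly varying temperature
hours = 24 * 60
t = np.arange(hours)
temperature = 10 + 8 * np.sin(2 * np.pi * t / hours)       # slow seasonal drift
demand = (2.0 + 0.5 * np.cos(2 * np.pi * t / 24)            # daily Fourier component
          - 0.05 * temperature                              # temperature-driven load
          + 0.1 * rng.standard_normal(hours))               # measurement noise

# Isolate the daily Fourier component by zeroing all other frequency bins
spec = np.fft.rfft(demand)
freqs = np.fft.rfftfreq(hours, d=1.0)                       # cycles per hour
daily_band = np.abs(freqs - 1.0 / 24) < 1e-4                # the 1/24 h^-1 line
daily_only = np.fft.irfft(np.where(daily_band, spec, 0), n=hours)

# Correlate the remainder with the meteorological variable: the strong
# negative correlation flags a temperature-driven (heating-like) component
residual = demand - daily_only
corr = np.corrcoef(residual, temperature)[0, 1]
```

In the thesis's setting, components (or combinations of components) with the strongest positive or negative correlation against a meteorological variable become additional predictor inputs for the disaggregation network.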
In terms of load forecasting, both tree-based and neural network-based models are analysed and compared for the day-ahead and week-ahead forecasting of demands at MV substation level, which are also correlated with meteorological data. Importantly, the presented load forecasting methodologies allow, for the first time, forecasting of both total/aggregate demands and the corresponding disaggregated demands of specific load types. In terms of the supply side analysis, the thesis presents data-driven analysis, modelling, uncertainty evaluation and forecasting of wind-based electricity generation systems. The available measurements from both the individual wind turbines (WTs) and the whole wind farms (WFs) are used to formulate simple yet accurate operational models of WTs and WFs. First, the available measurements are preprocessed to remove outliers, as otherwise the obtained WT/WF models may be biased, or even inaccurate. A novel simulation-based approach that builds on a procedure recommended in a standard is presented for processing all outliers due to the applied averaging window (typically 10 minutes) and WT hysteresis effects (around the cut-in and cut-out wind speeds). Afterwards, the importance of distinguishing between WT-level and WF-level analysis is discussed, and a new six-parameter power curve model is introduced for accurate modelling of both cut-in and cut-out regions and for taking into account the operating regimes of a WF (WTs in normal/curtailed operation, or outage/fault). The modelling framework in the thesis starts with deterministic models (e.g., CNN-BiLSTM and power curve models) and is then extended to include probabilistic models, building on Bayesian inference and copula theory. In that context, the thesis presents a set of innovative data-driven WT and WF probabilistic models, which can accurately model cross-correlations between the WT/WF power output (Pout), wind speed (WS), air density (AD) and wind direction (WD).
A vine copula and a Gaussian mixture copula model (GMCM) are combined, for the first time, to evaluate the uncertainty of Pout values, conditioned on other explanatory variables (which may be either deterministic, or also uncertain). In terms of probabilistic wind energy forecasting, a Bayesian CNN-BiLSTM model is used to analyse and efficiently handle the high dimensionality of both the input meteorological variables (WS, AD and WD) and the additional uncertainties due to WF operating regimes. The presented results demonstrate that the developed Vine-GMCM and operational WF model can accurately integrate and effectively correlate all propagated uncertainties, ultimately resulting in much higher confidence levels of the forecasted WF power outputs than in the existing literature.
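The deterministic starting point, a power curve with explicit cut-in and cut-out handling, can be sketched as below. This is a generic piecewise logistic curve for illustration only; the thesis's six-parameter model and its WF operating-regime extensions are not reproduced, and all parameter values are hypothetical.

```python
import numpy as np

def power_curve(v, p_rated=2.0, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0, k=0.6):
    """Piecewise logistic wind-turbine power curve (MW), illustrative only.

    Output is zero below cut-in and above cut-out, follows a logistic
    ramp between cut-in and rated speed, and is capped at rated power.
    """
    v = np.asarray(v, dtype=float)
    mid = (v_cut_in + v_rated) / 2.0                    # ramp midpoint
    p = p_rated / (1.0 + np.exp(-k * (v - mid)))        # logistic ramp
    p = np.where((v < v_cut_in) | (v > v_cut_out), 0.0, p)   # cut-in / cut-out
    return np.minimum(p, p_rated)                       # cap at rated power

wind_speed = np.array([2.0, 6.0, 12.0, 18.0, 26.0])     # m/s
p = power_curve(wind_speed)   # zero at 2 and 26 m/s, monotone in between
```

The probabilistic layer in the thesis then replaces this deterministic mapping with conditional distributions of Pout given WS, AD and WD, estimated via the vine/GMCM construction.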

    Backtesting Systemic Risk Forecasts using Multi-Objective Elicitability

    Backtesting risk measure forecasts requires identifiability (for model validation) and elicitability (for model comparison). The systemic risk measures CoVaR (conditional value-at-risk), CoES (conditional expected shortfall) and MES (marginal expected shortfall), measuring the risk of a position $Y$ given that a reference position $X$ is in distress, fail to be identifiable and elicitable. We establish the joint identifiability of CoVaR, MES and (CoVaR, CoES) together with the value-at-risk (VaR) of the reference position $X$, but show that an analogous result for elicitability fails. The novel notion of multi-objective elicitability, however, which relies on multivariate scores equipped with an order, leads to a positive result when using the lexicographic order on $\mathbb{R}^2$. We establish comparative backtests of Diebold--Mariano type for superior systemic risk forecasts and comparable VaR forecasts, accompanied by a traffic-light approach. We demonstrate the viability of these backtesting approaches in simulations and in an empirical application to DAX 30 and S&P 500 returns. Comment: 43 pages, 8 figures
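    The measures themselves are easy to estimate empirically, which clarifies what the backtests target. The sketch below uses simulated correlated Gaussian losses (not the paper's DAX 30 / S&P 500 data) and takes distress to mean $X$ at or beyond its VaR; sign conventions for losses vs. returns vary across the literature.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated daily losses of a reference position X and a position Y,
# made dependent through a common Gaussian factor (illustrative data)
n = 100_000
z = rng.standard_normal(n)
x = 0.8 * z + 0.6 * rng.standard_normal(n)
y = 0.8 * z + 0.6 * rng.standard_normal(n)

alpha = beta = 0.95

# VaR of the reference position: upper alpha-quantile of its losses
var_x = np.quantile(x, alpha)

# Distress event: X at or beyond its VaR
distress = x >= var_x

# CoVaR: beta-quantile of Y's loss, conditional on X being in distress
covar = np.quantile(y[distress], beta)

# MES: expected loss of Y, conditional on X being in distress
mes = y[distress].mean()
```

    Both conditional quantities exceed their unconditional counterparts here, which is the systemic-risk effect; the paper's point is that CoVaR/CoES/MES can only be validated and compared jointly with the VaR of $X$, since the distress event itself depends on that forecast.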