190 research outputs found

    H∞ Optimality Criteria for LMS and Backpropagation

    We have recently shown that the widely known LMS algorithm is an H∞ optimal estimator. The H∞ criterion was introduced, initially in the control theory literature, as a means of ensuring robust performance in the face of model uncertainties and a lack of statistical information on the exogenous signals. We extend our analysis here to the nonlinear setting often encountered in neural networks, and show that the backpropagation algorithm is locally H∞ optimal. This fact provides a theoretical justification for the widely observed excellent robustness properties of the LMS and backpropagation algorithms. We further discuss some implications of these results.
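    A minimal sketch of the criterion these results refer to, in assumed notation (measurements d_i = x_i^T w + v_i, step size μ, initial estimate ŵ_{-1}, a-priori prediction errors e_i = x_i^T(w - ŵ_{i-1})): an algorithm is H∞ optimal if it attains the smallest worst-case energy gain from the disturbances (initial weight uncertainty and noise) to the estimation errors. This is a sketch of the standard H∞ estimation setup, not a formula quoted from the paper:

    % Assumed notation; an algorithm achieves H-infinity level gamma if
    \[
      \sup_{w,\; v \in \ell_2}
        \frac{\sum_{i} |e_i|^2}
             {\mu^{-1}\,\lVert w - \hat{w}_{-1}\rVert^2 + \sum_{i} |v_i|^2}
        \;\le\; \gamma^2 ,
    \]
    % and an H-infinity optimal algorithm attains the smallest such gamma.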

    A Stochastic Interpretation of Stochastic Mirror Descent: Risk-Sensitive Optimality

    Stochastic mirror descent (SMD) is a fairly new family of algorithms that has recently found a wide range of applications in optimization, machine learning, and control. It can be considered a generalization of the classical stochastic gradient descent (SGD) algorithm, where instead of updating the weight vector along the negative direction of the stochastic gradient, the update is performed in a "mirror domain" defined by the gradient of a (strictly convex) potential function. This potential function, and the mirror domain it yields, provides considerable flexibility in the algorithm compared to SGD. While many properties of SMD have already been obtained in the literature, in this paper we exhibit a new interpretation of SMD, namely that it is a risk-sensitive optimal estimator when the unknown weight vector and additive noise are non-Gaussian and belong to the exponential family of distributions. The analysis also suggests a modified version of SMD, which we refer to as symmetric SMD (SSMD). The proofs rely on some simple properties of the Bregman divergence, which allow us to extend results from quadratics and Gaussians to certain convex functions and exponential families in a rather seamless way.
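    A minimal sketch of the update rule described above, using the negative-entropy potential as one concrete mirror map (the potential choice, toy data, and learning rate are illustrative assumptions, not taken from the paper):

    import numpy as np

    def sgd_step(w, grad, lr):
        # Standard SGD: move against the stochastic gradient in the primal domain.
        return w - lr * grad

    def smd_step_entropy(w, grad, lr):
        # SMD with the (unnormalized) negative-entropy potential
        # psi(w) = sum_i (w_i log w_i - w_i), whose mirror map is grad psi(w) = log w.
        # The update log(w_new) = log(w) - lr * grad gives a multiplicative
        # (exponentiated-gradient) rule; the weights must stay positive.
        return w * np.exp(-lr * grad)

    # Toy least-squares comparison on synthetic data (assumed for illustration).
    rng = np.random.default_rng(0)
    w_true = np.array([0.7, 0.2, 0.1])
    w_sgd = np.full(3, 1.0 / 3)
    w_smd = np.full(3, 1.0 / 3)
    lr = 0.05
    for _ in range(2000):
        x = rng.normal(size=3)
        d = x @ w_true + 0.01 * rng.normal()
        g_sgd = (x @ w_sgd - d) * x      # gradient of 0.5 * (x.w - d)^2
        g_smd = (x @ w_smd - d) * x
        w_sgd = sgd_step(w_sgd, g_sgd, lr)
        w_smd = smd_step_entropy(w_smd, g_smd, lr)
    print("SGD estimate:", w_sgd)
    print("SMD estimate:", w_smd)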

    H∞ Optimal Training Algorithms and their Relation to Backpropagation

    We derive global H∞ optimal training algorithms for neural networks. These algorithms guarantee the smallest possible prediction error energy over all possible disturbances of fixed energy, and are therefore robust with respect to model uncertainties and lack of statistical information on the exogenous signals. The ensuing estimators are infinite-dimensional, in the sense that updating the weight vector estimate requires knowledge of all previous weight estimates. A certain finite-dimensional approximation to these estimators is the backpropagation algorithm. This explains the local H∞ optimality of backpropagation that has been previously demonstrated.
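    A minimal sketch of the finite-dimensional, instantaneous-gradient recursion that the abstract relates backpropagation to, for a one-hidden-layer network (the network shape, step size, and synthetic data are illustrative assumptions):

    import numpy as np

    def backprop_step(W1, w2, x, d, mu):
        # One instantaneous-gradient (backpropagation) update for the model
        # y = w2 . tanh(W1 x): a single step of the finite-dimensional
        # recursion, which uses only the current weight estimates.
        h = np.tanh(W1 @ x)
        e = d - w2 @ h                               # a-priori prediction error
        grad_w2 = -e * h                             # d(0.5*e^2)/d w2
        grad_W1 = -e * np.outer(w2 * (1 - h**2), x)  # d(0.5*e^2)/d W1
        return W1 - mu * grad_W1, w2 - mu * grad_w2

    # Hypothetical usage on synthetic data.
    rng = np.random.default_rng(1)
    W1 = 0.1 * rng.normal(size=(4, 3))
    w2 = 0.1 * rng.normal(size=4)
    for _ in range(5000):
        x = rng.normal(size=3)
        d = np.sin(x[0]) + 0.5 * x[1]                # assumed target map
        W1, w2 = backprop_step(W1, w2, x, d, mu=0.02)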

    The design of an indirect method for the human presence monitoring in the intelligent building

    This article describes the design and verification of an indirect method for predicting the course of CO2 concentration (ppm) from the measured indoor temperature T_indoor (°C), indoor relative humidity rH_indoor (%), and outdoor temperature T_outdoor (°C), using an Artificial Neural Network (ANN) trained with the Bayesian Regularization Method (BRM), for monitoring the presence of people in individual premises of an Intelligent Administrative Building (IAB) with the PI System SW Tool (PI - Plant Information enterprise information system). The CA (Correlation Analysis), RMSE (Root Mean Squared Error), and DTW (Dynamic Time Warping) criteria were used to verify and classify the obtained results. Within the proposed method, the LMS adaptive filter algorithm was used to remove noise from the resulting predicted course. To verify the method, long-term experiments were performed, specifically from February 1 to February 28, 2015, from June 1 to June 28, 2015, and from February 8 to February 14, 2015. For the best results of the ANN trained with BRM for CO2 prediction, the correlation coefficient R of the proposed method reached up to 92%. The verification confirmed that the proposed method can be used for monitoring the presence of people in the monitored IAB premises. The designed indirect method of CO2 prediction has the potential to reduce the investment and operating costs of the IAB by reducing the number of sensors implemented in the IAB within the process of managing its operational and technical functions. The article also describes the design and implementation of the FEIVISUAL visualization application for mobile devices, which monitors the technological processes in the IAB. The application is optimized for Android devices and is platform independent. It requires an application server that communicates with the data server and the developed application; the application's data is obtained from the data storage of the PI System via a PI Web REST API (Application Programming Interface) client.
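    The abstract notes that an LMS adaptive filter was used to remove noise from the predicted CO2 course. A minimal sketch of one common configuration for this (an adaptive predictor over delayed samples of the noisy series; the filter order, step size, and synthetic data are illustrative assumptions, not taken from the article):

    import numpy as np

    def lms_denoise(noisy, order=8, mu=0.01):
        # Adaptive-predictor (adaptive line enhancer) form of LMS denoising:
        # each sample is predicted from the previous `order` samples, so the
        # filter tracks the correlated (slowly varying) part of the series and
        # leaves the uncorrelated noise in the residual.
        w = np.zeros(order)
        out = np.zeros_like(noisy)
        for n in range(order, len(noisy)):
            x = noisy[n - order:n][::-1]   # most recent samples first
            y = w @ x                      # predicted (denoised) sample
            e = noisy[n] - y               # prediction error
            w += mu * e * x                # LMS weight update
            out[n] = y
        return out

    # Hypothetical usage on a synthetic CO2-like course (ppm).
    t = np.linspace(0.0, 1.0, 2000)
    clean = 600 + 200 * np.sin(2 * np.pi * 3 * t)
    noisy = clean + 20 * np.random.default_rng(0).normal(size=t.size)
    z = (noisy - noisy.mean()) / noisy.std()         # standardize for stability
    denoised = lms_denoise(z) * noisy.std() + noisy.mean()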

    Underdetermined-order recursive least-squares adaptive filtering: The concept and algorithms


    H∞ optimality of the LMS algorithm

    We show that the celebrated least-mean-squares (LMS) adaptive algorithm is H∞ optimal. The LMS algorithm has long been regarded as an approximate solution to either a stochastic or a deterministic least-squares problem, and it essentially amounts to updating the weight vector estimates along the direction of the instantaneous gradient of a quadratic cost function. We show that LMS can be regarded as the exact solution to a minimization problem in its own right. Namely, we establish that it is a minimax filter: it minimizes the maximum energy gain from the disturbances to the predicted errors, whereas the closely related so-called normalized LMS algorithm minimizes the maximum energy gain from the disturbances to the filtered errors. Moreover, since these algorithms are central H∞ filters, they minimize a certain exponential cost function and are thus also risk-sensitive optimal. We discuss the various implications of these results and show how they provide theoretical justification for the widely observed excellent robustness properties of the LMS filter.
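    A minimal sketch of the two recursions contrasted above (the notation and step-size choices are assumptions for illustration): LMS moves along the instantaneous gradient of the squared a-priori (predicted) error, while normalized LMS scales the same direction by the instantaneous input energy.

    import numpy as np

    def lms_step(w, x, d, mu):
        # LMS: update along the instantaneous gradient of 0.5 * e^2,
        # where e = d - x.w is the a-priori (predicted) error.
        e = d - x @ w
        return w + mu * e * x, e

    def nlms_step(w, x, d, mu, eps=1e-8):
        # Normalized LMS: same update direction, with the step size
        # normalized by the instantaneous input energy x.x.
        e = d - x @ w
        return w + (mu / (eps + x @ x)) * e * x, e

    # Hypothetical usage: identify a fixed 4-tap filter from noisy data.
    rng = np.random.default_rng(2)
    w_true = np.array([1.0, -0.5, 0.25, 0.1])
    w = np.zeros(4)
    for _ in range(1000):
        x = rng.normal(size=4)
        d = x @ w_true + 0.01 * rng.normal()
        w, _ = nlms_step(w, x, d, mu=0.5)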

    Active Control of Sound based on Diagonal Recurrent Neural Network
