
    Hierarchical Decomposition of Nonlinear Dynamics and Control for System Identification and Policy Distillation

    The control of nonlinear dynamical systems remains a major challenge for autonomous agents. Current trends in reinforcement learning (RL) focus on complex representations of dynamics and policies, which have yielded impressive results in solving a variety of hard control tasks. However, this new sophistication and the use of extremely over-parameterized models have come at the cost of an overall reduction in our ability to interpret the resulting policies. In this paper, we take inspiration from the control community and apply the principles of hybrid switching systems to break down complex dynamics into simpler components. We exploit the rich representational power of probabilistic graphical models and derive an expectation-maximization (EM) algorithm for learning a sequence model that captures the temporal structure of the data and automatically decomposes nonlinear dynamics into stochastic switching linear dynamical systems. Moreover, we show how this framework of switching models enables extracting hierarchies of Markovian and auto-regressive locally linear controllers from nonlinear experts in an imitation learning scenario. Comment: 2nd Annual Conference on Learning for Dynamics and Control.
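
    As a point of reference for the decomposition described above, the sketch below simulates the generative structure of a two-mode switching linear dynamical system in Python. The helper name, mode transition matrix, per-mode dynamics, and noise scale are illustrative placeholders, not parameters learned by the paper's EM algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-mode switching linear dynamical system (SLDS): a discrete Markov chain
# z_t selects which set of linear dynamics drives the continuous state x_t.
# All numbers below are illustrative placeholders, not learned parameters.
P = np.array([[0.95, 0.05],                    # mode transition probabilities
              [0.10, 0.90]])
A = [np.array([[0.99, 0.10], [-0.10, 0.99]]),  # mode 0: slow rotation
     np.array([[0.90, 0.00], [0.00, 0.70]])]   # mode 1: contraction
Q = 0.01 * np.eye(2)                           # process noise covariance

def simulate_slds(T=200, x0=(1.0, 0.0)):
    z, x = 0, np.array(x0)
    modes, states = [], []
    for _ in range(T):
        modes.append(z)
        states.append(x)
        x = A[z] @ x + rng.multivariate_normal(np.zeros(2), Q)
        z = rng.choice(2, p=P[z])              # sample the next discrete mode
    return np.array(modes), np.array(states)

modes, states = simulate_slds()
print(states.shape, np.bincount(modes))        # (200, 2) and time spent per mode
```

    In the paper, the per-mode matrices and the transition model are learned jointly from data rather than fixed by hand; the snippet only illustrates the generative structure being recovered.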

    Analyzing Daily Behavioral Data for Personalized Health Management

    Emerging wearable and environmental sensor technologies provide health professionals with an unprecedented capacity to continuously collect human behavior data for health monitoring and management. This enables new solutions to mitigate globally emerging health problems such as obesity. With such an outburst of dynamic sensor data, it is critical that appropriate mathematical models and computational analytic methods are developed to translate the collected data into an accurate characterization of the underlying health dynamics, enabling more reliable personalized monitoring, prediction, and intervention of health status changes. However, several challenges arise in translating these data effectively into personalized activity plans. Besides common analytic challenges arising from the missing values and outliers often seen in sensor behavior data, modeling the complex health dynamics, with potential influence from human daily behaviors, also poses significant challenges. We address these challenges as follows. We first explore existing missing value imputation and outlier detection preprocessing methods. We compare these methods with a recently developed dynamic system learning method, SSMO, which learns a personalized behavior model from real-world sensor data while simultaneously estimating missing values and detecting outliers. We then focus on modeling heterogeneous dynamics to better capture health status changes under different conditions, which may lead to more effective state-dependent intervention strategies. We implement switching-state dynamic models with different complexity levels on real-world daily behavior data. Finally, we conduct evaluation experiments with these models to demonstrate the importance of modeling the dynamic heterogeneity, as well as of simultaneously conducting missing value imputation and outlier detection, in achieving better prediction of health status changes.
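
    To make the idea of simultaneous imputation and outlier detection concrete, here is a minimal sketch built around a scalar random-walk Kalman filter: missing samples are filled with the one-step-ahead prediction, and observations whose innovation exceeds a threshold are flagged and excluded from the update. The function name, noise variances, and threshold are illustrative; this is not the SSMO method or the switching-state models evaluated in the work.

```python
import numpy as np

def filter_impute_flag(y, q=0.05, r=0.5, k_sigma=3.0):
    """Scalar random-walk Kalman filter that imputes NaNs with one-step-ahead
    predictions and flags observations with large innovations as outliers.
    q, r, and k_sigma are illustrative noise variances and threshold."""
    x = y[np.isfinite(y)][0]           # crude initialisation from first valid sample
    P = 1.0
    filled = np.empty(len(y))
    outlier = np.zeros(len(y), dtype=bool)
    for t in range(len(y)):
        P = P + q                      # predict step (random-walk state model)
        s = P + r                      # predictive variance of the observation
        if np.isnan(y[t]) or abs(y[t] - x) > k_sigma * np.sqrt(s):
            outlier[t] = not np.isnan(y[t])   # flag only actual (non-missing) outliers
            filled[t] = x              # fall back on the prediction
            continue
        K = P / s                      # Kalman gain
        x = x + K * (y[t] - x)
        P = (1 - K) * P
        filled[t] = x
    return filled, outlier

y = np.array([1.0, 1.1, np.nan, 1.2, 9.0, 1.3, 1.25])
filled, outlier = filter_impute_flag(y)
print(np.round(filled, 2), outlier)    # 9.0 flagged, NaN replaced by a prediction
```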

    A NOVEL FORWARD BACKWARD LINEAR PREDICTION ALGORITHM FOR SHORT TERM POWER LOAD FORECAST

    Electrical load forecasting is an important part of the power system energy management system. A reliable load forecasting technique helps the electric utility make unit commitment decisions, reduce spinning reserve capacity, and schedule device maintenance properly. Thus, besides being a key element in reducing the generation cost, power load forecasting is an essential procedure in enhancing the reliability of power systems. Generally speaking, power systems worldwide use load forecasting as an essential part of off-line network analysis, in order to determine the status of the system and the need for corrective actions such as load shedding, power purchases, or the use of peaking units. Short-term load forecasting (STLF), in terms of one-hour-ahead, 24-hour-ahead, and 168-hour-ahead forecasts, is a necessary daily task for power dispatch. Its accuracy significantly affects the cost of generation and the reliability of the system. The majority of single-variable techniques use the autoregressive moving average (ARMA) model to solve the STLF problem. In this thesis, a new AR algorithm, designed especially for long data records, is proposed as a solution to the STLF problem. The proposed AR-based algorithm divides a long data record into short segments and searches for the AR coefficients that simultaneously model all segments with the least mean squared error. In order to verify the proposed algorithm as a solution to the STLF problem, its performance is compared with that of other AR-based algorithms, such as Burg's method and the seasonal Box-Jenkins ARIMA (SARIMA). In addition to the parametric algorithms, the comparison is extended to artificial neural networks (ANNs). Three years of power demand records collected by NEMMCO in four Australian states (NSW, QLD, SA, and VIC) between the beginning of 2005 and the end of 2007 are used for the comparison. The results show the potential of the proposed algorithm as a reliable solution to STLF.
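
    A rough sketch of the segment-and-fit idea, in Python: the record is split into short segments and a single set of AR coefficients is found by jointly minimising the forward and backward one-step prediction errors over all segments with ordinary least squares. The helper names, segment length, model order, and toy series are illustrative choices, not the thesis's algorithm or data.

```python
import numpy as np

def fit_ar_segmented(x, order, seg_len):
    """Fit one set of AR coefficients over all short segments of a long record,
    minimising the summed forward and backward squared prediction errors."""
    rows, targets = [], []
    for start in range(0, len(x) - seg_len + 1, seg_len):
        seg = x[start:start + seg_len]
        for t in range(order, len(seg)):
            rows.append(seg[t - order:t][::-1])      # forward: predict seg[t]
            targets.append(seg[t])
            rows.append(seg[t - order + 1:t + 1])    # backward: predict seg[t-order]
            targets.append(seg[t - order])
    coeffs, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(targets), rcond=None)
    return coeffs

def forecast_one_step(history, coeffs):
    return float(np.dot(coeffs, history[-len(coeffs):][::-1]))

# Toy usage on a synthetic series with a daily-looking cycle
rng = np.random.default_rng(1)
t = np.arange(24 * 60)
load = 100 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)
w = fit_ar_segmented(load, order=24, seg_len=168)    # 168 = one week of hourly data
print(forecast_one_step(load, w))                    # one-hour-ahead forecast
```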

    Automatic outlier detection in automated water quality measurement stations

    Water quality monitoring stations are used to measure water quality at high frequency. For effective data management, the quality of the data must be evaluated. In a previously developed univariate method, both outliers and faults were detected in the data measured by these stations by using exponential smoothing models that give one-step-ahead forecasts and their confidence intervals. In the present study, the outlier detection step of the univariate method is improved by identifying an auto-regressive moving average model on a moving window of data and forecasting one step ahead. The turbidity data measured at the inlet of a municipal treatment plant in Denmark are used as a case study to compare the performance of the two models. The results show that the forecasts made by the new model are more accurate. Moreover, inclusion of the new forecasting model in the univariate method shows satisfactory performance for detecting outliers and faults in the case study data.
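
    As an illustration of the moving-window approach, the sketch below fits an ARMA model on each window with statsmodels, forecasts one step ahead with a confidence interval, and flags the next observation as an outlier when it falls outside that interval. The function name, window length, ARMA order, confidence level, and toy series are placeholder choices, not those used in the study.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA   # assumes statsmodels is available

def flag_outliers(series, window=100, order=(2, 0, 1), alpha=0.05):
    """Flag series[t] as an outlier when it lies outside the (1 - alpha)
    one-step-ahead forecast interval of an ARMA model fitted on the
    previous `window` observations."""
    flags = np.zeros(len(series), dtype=bool)
    for t in range(window, len(series)):
        res = ARIMA(series[t - window:t], order=order).fit()
        forecast = res.get_forecast(steps=1)
        lo, hi = forecast.conf_int(alpha=alpha)[0]
        flags[t] = not (lo <= series[t] <= hi)
    return flags

# Toy usage: an AR(1)-like signal with one injected spike
rng = np.random.default_rng(2)
x = np.zeros(150)
for t in range(1, len(x)):
    x[t] = 0.8 * x[t - 1] + rng.normal(0, 0.1)
x[130] += 2.0
print(np.where(flag_outliers(x))[0])            # the spike at index 130 should appear
```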

    Dual estimation: Constructing building energy models from data sampled at low rate

    Estimation of energy models from data is an important part of advanced fault detection and diagnosis tools for smart energy purposes. Estimated energy models can be used for a large variety of management and control tasks, spanning from model predictive building control to estimation of energy consumption and user behavior. In practical implementations, the problems to be considered are the fact that some measurements of relevance are missing and must be estimated, and the fact that other measurements, collected at a low sampling rate to save memory, make discretization of physics-based models critical. These problems make classical estimation tools inadequate and call for appropriate dual estimation schemes in which the states and parameters of a system are estimated simultaneously. In this work we develop dual estimation schemes based on Extended Kalman Filtering (EKF) and Unscented Kalman Filtering (UKF) for constructing building energy models from data: in order to cope with the low sampling rate of the data (a sampling time of 15 min), an implicit discretization (the Euler backward method) is adopted to discretize the continuous-time heat transfer dynamics. It is shown that explicit discretization methods such as the Euler forward method, combined with a 15-min sampling time, are ineffective for building reliable energy models (the discrete-time dynamics do not match the continuous-time ones): even explicit methods of higher order, such as the Runge–Kutta method, fail to provide a good approximation of the continuous-time dynamics with such a large sampling time. Either smaller time steps or alternative discretization methods are required. We verify that the implicit Euler backward method provides a good approximation of the continuous-time dynamics and can easily be implemented for our dual estimation purposes. The applicability of the proposed method in terms of estimation of both states and parameters is demonstrated via simulations and using historical data from a real-life building.
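
    To see why the explicit discretization fails at this sampling rate while the implicit one does not, consider a single-node RC model of heat transfer. The parameter values and function names below are illustrative placeholders chosen to expose the effect, not identified building parameters.

```python
import numpy as np

# Illustrative single-node RC thermal model: C*dT/dt = (T_out - T)/R + Q_heat.
R, C = 0.01, 30_000.0        # K/W and J/K  ->  time constant R*C = 300 s (5 min)
dt = 15 * 60                 # 15-minute sampling time [s]
T_out, Q_heat = 5.0, 500.0   # outdoor temperature [degC] and heater power [W]

def euler_forward(T0, steps):
    T = T0
    for _ in range(steps):
        T = T + dt / C * ((T_out - T) / R + Q_heat)      # explicit update
    return T

def euler_backward(T0, steps):
    a = dt / (R * C)
    T = T0
    for _ in range(steps):
        T = (T + a * T_out + dt / C * Q_heat) / (1 + a)  # implicit update
    return T

T_exact = T_out + R * Q_heat                             # continuous-time steady state
print("exact steady state:", T_exact)
print("forward Euler after 20 steps:", euler_forward(20.0, 20))   # diverges
print("backward Euler after 20 steps:", euler_backward(20.0, 20)) # converges to T_exact
```

    With these numbers the sampling time is three times the thermal time constant, so the forward Euler update amplifies errors at each step while the backward Euler update damps them, which is the mismatch the abstract describes.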

    Replicating financial market dynamics with a simple self-organized critical lattice model

    We explore a simple lattice field model intended to describe the statistical properties of high-frequency financial markets. The model is relevant in the cross-disciplinary area of econophysics. Its signature feature is the emergence of a self-organized critical state. This implies scale invariance of the model, without tuning parameters. Prominent results of our simulation are time series of gains, prices, and volatility, as well as gain frequency distributions, all of which compare favorably to features of historical market data. Applying a standard GARCH(1,1) fit to the lattice model gives results that are almost indistinguishable from those for historical NASDAQ data. Comment: 20 pages, 33 figures.
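
    For reference, a minimal simulation of the standard GARCH(1,1) benchmark mentioned above; the function name and parameter values are typical illustrative choices, not values fitted to the lattice model or to NASDAQ data.

```python
import numpy as np

def simulate_garch11(n, omega=1e-6, alpha=0.08, beta=0.90, seed=0):
    """Simulate returns from a standard GARCH(1,1) volatility process:
        sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
        r_t = sigma_t * eps_t,  eps_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.full(n, omega / (1 - alpha - beta))     # unconditional variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, np.sqrt(sigma2)

returns, vol = simulate_garch11(5000)
print(returns.std(), vol.mean())        # returns exhibit volatility clustering
```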