
    Linear theory for filtering nonlinear multiscale systems with model error

    We study filtering of multiscale dynamical systems with model error arising from unresolved smaller-scale processes. The analysis assumes continuous-time noisy observations of all components of the slow variables alone. For a linear model with Gaussian noise, we prove existence of a unique choice of parameters in a linear reduced model for the slow variables. The linear theory extends to a non-Gaussian, nonlinear test problem, where we assume we know the optimal stochastic parameterization and the correct observation model. We show that when the parameterization is inappropriate, parameters chosen for good filter performance may give poor equilibrium statistical estimates, and vice versa. Given the correct parameterization, it is imperative to estimate the parameters simultaneously and to account for the nonlinear feedback of the stochastic parameters into the reduced filter estimates. In numerical experiments on the two-layer Lorenz-96 model, we find that parameters estimated online, as part of the filtering procedure, produce accurate filtering and equilibrium statistical prediction. In contrast, a linear-regression-based offline method, which fits the parameters to a given training data set independently of the filter, yields filter estimates that are worse than the observations or even divergent when the slow variables are not fully observed.
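    The online parameter estimation the abstract advocates can be illustrated with a minimal sketch: a hypothetical scalar linear test model (not the paper's two-layer Lorenz-96 setup), where an extended Kalman filter estimates a drift parameter jointly with the state by augmenting the state vector. All model choices below (dynamics, noise levels, initial guesses) are illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical linear test model (illustrative only):
    #   x_{k+1} = x_k - a*x_k*dt + sqrt(dt)*sigma*xi_k,
    # with noisy observations y_k = x_k + eps_k.
    dt, sigma, R = 0.01, 0.5, 0.1**2
    a_true = 1.5
    n_steps = 20000

    # Generate a synthetic truth trajectory and observations.
    x, ys = 0.0, []
    for _ in range(n_steps):
        x = x - a_true * x * dt + np.sqrt(dt) * sigma * rng.standard_normal()
        ys.append(x + np.sqrt(R) * rng.standard_normal())

    # Augmented-state EKF: estimate z = [x, a] online, treating the
    # parameter a as a constant state with a tiny artificial random walk.
    z = np.array([0.0, 0.5])              # initial guesses for x and a
    P = np.diag([1.0, 1.0])               # initial covariance
    Q = np.diag([sigma**2 * dt, 1e-6])    # process noise (tiny walk on a)
    H = np.array([[1.0, 0.0]])            # only x is observed

    for y in ys:
        # Forecast step: x <- x - a*x*dt, a <- a; F is the Jacobian.
        F = np.array([[1.0 - z[1] * dt, -z[0] * dt],
                      [0.0, 1.0]])
        z = np.array([z[0] - z[1] * z[0] * dt, z[1]])
        P = F @ P @ F.T + Q
        # Analysis step: standard Kalman update with a scalar observation.
        S = H @ P @ H.T + R
        K = P @ H.T / S
        z = z + (K * (y - z[0])).ravel()
        P = (np.eye(2) - K @ H) @ P

    print(f"estimated a = {z[1]:.2f} (true {a_true})")
    ```

    The parameter estimate is driven by the innovations through the cross-covariance between x and a, which is the "nonlinear feedback of the stochastic parameters into the reduced filter estimates" that the abstract stresses.
    
    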

    Parameter estimation of ODE's via nonparametric estimators

    Ordinary differential equations (ODE's) are widespread models in physics, chemistry, and biology. In particular, this mathematical formalism is used to describe the evolution of complex systems and may consist of high-dimensional sets of coupled nonlinear differential equations. In this setting, we propose a general method for estimating the parameters indexing ODE's from time series. Our method alleviates the computational difficulties encountered by classical parametric methods, which stem from the implicit definition of the model. We propose the use of a nonparametric estimator of regression functions as a first step in the construction of an M-estimator, and we show the consistency of the derived estimator under general conditions. In the case of spline estimators, we prove asymptotic normality and that the rate of convergence is the usual $\sqrt{n}$-rate for parametric estimators. Some perspectives on refinements of this new family of parametric estimators are given. Comment: Published at http://dx.doi.org/10.1214/07-EJS132 in the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics (http://www.imstat.org).
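    The two-step idea can be sketched on a toy problem. Below, a hypothetical logistic ODE stands in for the general model, and a polynomial regression fit stands in for the spline smoother of the paper (the structure is the same: smooth the trajectory, differentiate the smoother, then minimize a least-squares criterion in the parameter). Everything here is an illustrative assumption, not the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical example: logistic ODE x'(t) = theta * x * (1 - x),
    # with theta unknown; the exact solution generates the data.
    theta_true, x0 = 2.0, 0.1
    t = np.linspace(0.0, 3.0, 200)
    x_exact = x0 * np.exp(theta_true * t) / (1 - x0 + x0 * np.exp(theta_true * t))
    y = x_exact + 0.01 * rng.standard_normal(t.size)  # noisy observations

    # Step 1: nonparametric first step -- smooth the trajectory
    # (a degree-7 polynomial regression stands in for a spline here)
    # and differentiate the fitted curve instead of the raw data.
    p = np.polynomial.Polynomial.fit(t, y, deg=7)
    x_hat = p(t)
    dx_hat = p.deriv()(t)

    # Step 2: M-estimation. The model is linear in theta, so the
    # least-squares criterion
    #   theta_hat = argmin_theta sum_i (dx_hat_i - theta * g_i)^2,
    # with g_i = x_hat_i * (1 - x_hat_i), has a closed form.
    g = x_hat * (1 - x_hat)
    theta_hat = np.sum(dx_hat * g) / np.sum(g**2)
    print(f"theta_hat = {theta_hat:.3f} (true {theta_true})")
    ```

    The point of the first step is exactly the computational relief the abstract describes: no numerical ODE solver is ever called inside the estimation criterion.
    
    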

    Identification and estimation of continuous time dynamic systems with exogenous variables using panel data

    This paper deals with the identification and maximum likelihood estimation of the parameters of a stochastic differential equation from discrete-time sampling. The score function and maximum likelihood equations are derived explicitly. The stochastic differential equation system is extended to allow for random effects and the analysis of panel data. In addition, we investigate the identifiability of the continuous-time parameters, in particular the impact of the inclusion of exogenous variables.
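    The core step — recovering continuous-time parameters from discretely sampled data via the exact transition density — can be sketched on a hypothetical univariate Ornstein-Uhlenbeck process (the paper itself treats systems with exogenous variables and panel data; none of that is reproduced here).

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical univariate illustration: dx = -a*x dt + sigma dW,
    # sampled at interval Delta. The exact discrete-time transition is
    #   x_{k+1} = phi * x_k + e_k,
    # with phi = exp(-a*Delta) and Var(e_k) = sigma^2 * (1 - phi^2) / (2a).
    a_true, sigma_true, Delta, n = 1.0, 0.8, 0.1, 50000
    phi = np.exp(-a_true * Delta)
    v = sigma_true**2 * (1 - phi**2) / (2 * a_true)

    x = np.empty(n)
    x[0] = 0.0
    for k in range(n - 1):
        x[k + 1] = phi * x[k] + np.sqrt(v) * rng.standard_normal()

    # ML estimation from the discrete sample: the likelihood is that of an
    # AR(1), so phi and the innovation variance have closed-form estimates,
    # and the continuous-time parameters are recovered by inverting the
    # discretization map.
    phi_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1]**2)
    resid = x[1:] - phi_hat * x[:-1]
    v_hat = np.mean(resid**2)
    a_hat = -np.log(phi_hat) / Delta
    sigma_hat = np.sqrt(2 * a_hat * v_hat / (1 - phi_hat**2))
    print(f"a_hat = {a_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
    ```

    The inversion from discrete-time to continuous-time parameters is where the identifiability questions the abstract raises arise: distinct continuous-time systems can map to the same discrete-time transition (aliasing), and exogenous variables change which parameters remain identifiable.
    
    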

    Learning Model-Based Sparsity via Projected Gradient Descent

    Several convex formulation methods have been proposed for statistical estimation with structured sparsity as the prior. These methods often require a carefully tuned regularization parameter, a cumbersome or heuristic exercise. Furthermore, the estimate they produce might not belong to the desired sparsity model, albeit accurately approximating the true parameter. Greedy-type algorithms can therefore be more desirable for estimating structured-sparse parameters. So far, these greedy methods have mostly focused on linear statistical models. In this paper we study projected gradient descent with a non-convex structured-sparse parameter model as the constraint set. Provided the cost function has a Stable Model-Restricted Hessian, the algorithm produces an approximation to the desired minimizer. As an example, we elaborate on the application of the main results to estimation in Generalized Linear Models.
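    A minimal sketch of the algorithm family: here the structured-sparsity model is taken to be plain k-sparsity, so the (non-convex) projection just keeps the k largest-magnitude coordinates, and the loss is least squares. For a Generalized Linear Model only the gradient of the loss below would change; the instance, sizes, and step-size rule are all illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical sparse regression instance.
    n, p, k = 200, 50, 5
    X = rng.standard_normal((n, p))
    w_true = np.zeros(p)
    support = rng.choice(p, size=k, replace=False)
    w_true[support] = rng.choice([-1.0, 1.0], size=k)
    y = X @ w_true + 0.01 * rng.standard_normal(n)

    def project_k_sparse(w, k):
        """Euclidean projection onto the (non-convex) set of k-sparse vectors."""
        out = np.zeros_like(w)
        idx = np.argsort(np.abs(w))[-k:]
        out[idx] = w[idx]
        return out

    # Projected gradient descent on f(w) = ||X w - y||^2 / (2n),
    # with step 1/L, L = lambda_max(X^T X / n). No regularization
    # parameter is tuned; the model set itself is the constraint.
    L = np.linalg.eigvalsh(X.T @ X / n)[-1]
    w = np.zeros(p)
    for _ in range(300):
        grad = X.T @ (X @ w - y) / n
        w = project_k_sparse(w - grad / L, k)

    print("recovered support:", sorted(np.flatnonzero(w)))
    ```

    Note that every iterate lies in the sparsity model by construction, which is exactly the property the abstract contrasts with convex relaxations, whose solutions may only approximate membership in the model.
    
    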