    Adaptive Sparse-grid Gauss-Hermite Filter

    In this paper, a new nonlinear filter based on the sparse-grid quadrature method is proposed. The proposed filter is named the adaptive sparse-grid Gauss–Hermite filter (ASGHF). The ordinary sparse-grid technique treats all dimensions equally, whereas the ASGHF assigns fewer points to dimensions with lower nonlinearity. It uses an adaptive tensor product to construct multidimensional points until a predefined error tolerance is reached. The performance of the proposed filter is illustrated with two nonlinear filtering problems. Simulation results demonstrate that the new algorithm achieves accuracy similar to the sparse-grid Gauss–Hermite filter (SGHF) and the Gauss–Hermite filter (GHF) with a considerable reduction in computational load. Further, in the conventional GHF and SGHF, any increase in the accuracy level may result in an unacceptably large increase in the computational burden. In the ASGHF, however, a modest increase in estimation accuracy is possible with a limited increase in computational burden by varying the error tolerance and the error weighting parameter. This enables the online estimator to operate near full efficiency within a predefined computational budget.
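
    The dimension-adaptive idea can be illustrated with ordinary Gauss–Hermite quadrature. The sketch below is illustrative only: the functions `gh_expectation` and `gh_expectation_2d` and the toy integrands are our own, not from the paper. A 2-D tensor rule spends fewer points on a nearly linear dimension, which is the saving the ASGHF automates via its error tolerance.

```python
import numpy as np

def gh_expectation(f, n):
    """Approximate E[f(X)], X ~ N(0,1), with an n-point Gauss-Hermite rule."""
    x, w = np.polynomial.hermite_e.hermegauss(n)  # probabilists' variant
    return np.dot(w, f(x)) / np.sqrt(2.0 * np.pi)

def gh_expectation_2d(f, n1, n2):
    """Tensor-product rule with n1 points on dim 0 and n2 points on dim 1."""
    x1, w1 = np.polynomial.hermite_e.hermegauss(n1)
    x2, w2 = np.polynomial.hermite_e.hermegauss(n2)
    X1, X2 = np.meshgrid(x1, x2, indexing="ij")
    W = np.outer(w1, w2) / (2.0 * np.pi)  # weights normalised to sum to 1
    return np.sum(W * f(X1, X2))

# A 3-point rule integrates polynomials up to degree 5 exactly: E[X^2] = 1.
m2 = gh_expectation(lambda x: x**2, 3)

# Quartic in dim 0, linear in dim 1: 5 x 2 = 10 points suffice here,
# versus 25 for a full 5 x 5 grid. E[A^4 + B] = 3 for A, B ~ N(0,1).
val = gh_expectation_2d(lambda a, b: a**4 + b, 5, 2)
```

    In the ASGHF, the per-dimension accuracy levels are chosen automatically by the adaptive tensor-product construction rather than fixed by hand as here.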

    Deterministic Mean-field Ensemble Kalman Filtering

    The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland et al. (2011) is extended to non-Gaussian state space models. A density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to the standard EnKF when the dimension d < 2κ. The fidelity of the approximation to the true distribution is also established using an extension of the total variation metric to random measures. This is limited by a Gaussian bias term arising from the non-linearity/non-Gaussianity of the model, which exists for both the DMFEnKF and the standard EnKF. Numerical results support and extend the theory.
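
    As background for the comparison, a minimal stochastic EnKF analysis step (perturbed observations) can be sketched as follows. The function `enkf_update` and the scalar linear-Gaussian example are hypothetical illustrations, not the paper's DMFEnKF solver.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ens, y, H, R):
    """Stochastic EnKF analysis step with perturbed observations.

    ens : (N, d) forecast ensemble; y : (m,) observation;
    H : (m, d) observation operator; R : (m, m) observation covariance.
    """
    N = ens.shape[0]
    A = ens - ens.mean(axis=0)                    # ensemble anomalies
    P = A.T @ A / (N - 1)                         # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    obs_pert = rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    return ens + (y + obs_pert - ens @ H.T) @ K.T

# Scalar example: prior N(0, 1), observation y = 1 with noise variance 1.
# The exact Bayesian posterior is N(0.5, 0.5); a large ensemble gets close.
ens = rng.normal(0.0, 1.0, size=(5000, 1))
post = enkf_update(ens, np.array([1.0]), np.eye(1), np.eye(1))
```

    The DMFEnKF replaces this Monte Carlo sampling with a deterministic density evolution, which is the source of its dimension-dependent accuracy advantage.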

    Advances in System Reliability-Based Design and Prognostics and Health Management (PHM) for System Resilience Analysis and Design

    Failures of engineered systems can lead to significant economic and societal losses. Despite tremendous efforts (e.g., $200 billion annually) devoted to reliability and maintenance, unexpected catastrophic failures still occur. To minimize the losses, the reliability of engineered systems must be ensured throughout their life-cycle amidst uncertain operational conditions and manufacturing variability. In most engineered systems, the required system reliability level under adverse events is achieved by adding system redundancies and/or conducting system reliability-based design optimization (RBDO). However, a high level of system redundancy increases a system's life-cycle cost (LCC), and system RBDO cannot ensure system reliability when unexpected loading/environmental conditions are applied and unexpected system failures develop. In contrast, a new design paradigm, referred to as resilience-driven system design, can ensure highly reliable system designs under any loading/environmental conditions and system failures while considerably reducing systems' LCC. To facilitate the development of formal methodologies for this design paradigm, this research aims at advancing two essential and closely related research areas: Research Thrust 1 - system RBDO, and Research Thrust 2 - system prognostics and health management (PHM). In Research Thrust 1, reliability analyses under uncertainty will be carried out at both the component and system levels against critical failure mechanisms. In Research Thrust 2, highly accurate and robust PHM systems will be designed for engineered systems with single or multiple time scales. To demonstrate the effectiveness of the proposed system RBDO and PHM techniques, multiple engineering case studies will be presented and discussed.
    Following the development of Research Thrusts 1 and 2, Research Thrust 3 - resilience-driven system design - will establish a theoretical basis and design framework for engineering resilience in a mathematical and statistical context, where engineering resilience will be formulated in terms of system reliability and restoration, and the proposed design framework will be demonstrated with a simplified aircraft control actuator design problem.
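
    The component-level reliability analyses of Research Thrust 1 typically reduce to estimating a small failure probability under parameter uncertainty. A minimal Monte Carlo sketch with a hypothetical load/capacity limit state (not a case study from the thesis):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical component limit state: failure occurs when load L exceeds
# capacity C. Both are uncertain; the failure probability P(C - L < 0) is
# estimated by Monte Carlo sampling, a basic building block of reliability
# analysis (RBDO then constrains this probability during design).
N = 200_000
C = rng.normal(10.0, 1.0, N)   # capacity ~ N(10, 1)
L = rng.normal(7.0, 1.5, N)    # load     ~ N(7, 1.5^2)
pf = np.mean(C - L < 0.0)

# For independent normals the margin C - L ~ N(3, 1 + 2.25), so the exact
# failure probability is Phi(-3 / sqrt(3.25)), roughly 4.8e-2.
```

    In practice the limit-state function is an expensive simulation, which is why the thesis pairs such analyses with surrogate and PHM techniques.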

    Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario

    A variety of methods is available to quantify uncertainties arising within the modeling of flow and transport in carbon dioxide storage, but there is a lack of thorough comparisons. Usually, raw data from such storage sites can hardly be described by theoretical statistical distributions, since only very limited data is available. Hence, exact information on distribution shapes for all uncertain parameters is very rare in realistic applications. We discuss and compare four different methods tested for data-driven uncertainty quantification based on a benchmark scenario of carbon dioxide storage. In the benchmark, for which we provide data and code, carbon dioxide is injected into a saline aquifer modeled by the nonlinear capillarity-free fractional flow formulation for two incompressible fluid phases, namely carbon dioxide and brine. To cover different aspects of uncertainty quantification, we incorporate various sources of uncertainty such as uncertainty of boundary conditions, of conceptual model definitions and of material properties. We consider recent versions of the following non-intrusive and intrusive uncertainty quantification methods: arbitrary polynomial chaos, spatially adaptive sparse grids, kernel-based greedy interpolation and hybrid stochastic Galerkin. The performance of each approach is demonstrated by assessing the expectation value and standard deviation of the carbon dioxide saturation against a reference statistic based on Monte Carlo sampling. We compare the convergence of all methods, reporting accuracy with respect to the number of model runs and resolution. Finally, we offer suggestions about the methods' advantages and disadvantages that can guide the modeler for uncertainty quantification in carbon dioxide storage and beyond.
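
    The evaluation protocol, comparing a cheap surrogate's mean and standard deviation against a Monte Carlo reference, can be sketched on a toy response. The `model` function below is a hypothetical stand-in, not the benchmark's two-phase flow simulator.

```python
import numpy as np

# Toy stand-in for the saturation model: a nonlinear scalar response of one
# uncertain standard-normal parameter.
def model(z):
    return np.exp(0.3 * z) + 0.1 * z**2   # hypothetical response

# Reference statistics from plain Monte Carlo (many model runs).
rng = np.random.default_rng(1)
out = model(rng.normal(size=500_000))
ref_mean, ref_std = out.mean(), out.std()

# Non-intrusive quadrature estimate using far fewer model runs (7 here),
# in the spirit of the polynomial-chaos and sparse-grid methods compared.
x, w = np.polynomial.hermite_e.hermegauss(7)
w = w / np.sqrt(2.0 * np.pi)              # weights now sum to 1
q_mean = np.dot(w, model(x))
q_std = np.sqrt(np.dot(w, model(x) ** 2) - q_mean**2)
```

    The benchmark performs this comparison for several uncertain parameters at once and tracks how the error decays with the number of model runs.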

    MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions. It uses adaptive, fast harmonic analysis methods with guaranteed precision, based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

    Solving, Estimating and Selecting Nonlinear Dynamic Models without the Curse of Dimensionality

    We present a comprehensive framework for Bayesian estimation of structural nonlinear dynamic economic models on sparse grids. The Smolyak operator underlying the sparse-grids approach frees global approximation from the curse of dimensionality, and we apply it to a Chebyshev approximation of the model solution. The operator also eliminates the curse from Gaussian quadrature, and we use it for the integrals arising from rational expectations and in three new nonlinear state space filters. The filters substantially decrease the computational burden compared to the sequential importance resampling particle filter. The posterior of the structural parameters is estimated by a new Metropolis-Hastings algorithm with mixing parallel sequences. The parallel extension improves the global maximization property of the algorithm, simplifies the choice of the innovation variances, and allows for unbiased convergence diagnostics and for a simple implementation of the estimation on parallel computers. Finally, we provide all algorithms in the open-source software JBendge for the solution and estimation of a general class of models.
    Keywords: Dynamic Stochastic General Equilibrium (DSGE) Models, Bayesian Time Series Econometrics, Curse of Dimensionality
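
    One building block of the Smolyak approach is a 1-D Chebyshev approximation whose error decays geometrically in the degree; the Smolyak operator combines such rules so that the multidimensional point count grows polynomially rather than exponentially in the dimension. A minimal sketch with a hypothetical policy function (not the paper's model):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Hypothetical smooth policy function of the capital stock, standing in for
# the model solution that the paper approximates.
def policy(k):
    return np.log(1.0 + k) + 0.1 * k**2

# Chebyshev interpolation at degree 20 on the state domain [0.1, 5.0].
# For analytic functions the max error decays geometrically with the degree.
approx = Chebyshev.interpolate(policy, deg=20, domain=[0.1, 5.0])

k = np.linspace(0.1, 5.0, 200)
err = float(np.max(np.abs(approx(k) - policy(k))))  # small for smooth policy
```

    The Smolyak construction then tensorizes nested subsets of such 1-D grids, which is what keeps the quadrature and interpolation costs manageable in the filters described above.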

    Advanced DSP Techniques for High-Capacity and Energy-Efficient Optical Fiber Communications

    The rapid proliferation of the Internet has been driving communication networks closer and closer to their limits, while available bandwidth is consumed by an ever-increasing network load. Over the past decade, optical fiber communication technology has increased per-fiber data rates from 10 Tb/s to more than 10 Pb/s. The major leap came with the maturing of coherent detection and advanced digital signal processing (DSP). DSP has played a critical role in mitigating channel impairments, enabling advanced modulation formats for spectrally efficient transmission, and realizing flexible bandwidth. This book explores novel, advanced DSP techniques to enable multi-Tb/s-per-channel optical transmission, addressing pressing bandwidth and power-efficiency demands. It also provides state-of-the-art advances and future perspectives on DSP.