
    Trustworthy Analysis of Recent Debris Cloud Conjunction Events Using an Adaptive Monte Carlo Forecasting Platform

    Every untracked, inactive, or unfamiliar object in Earth's orbit poses a risk to satellites and rockets that wish to safely navigate through space. Objects of this nature, known as "space debris," will remain in orbit without deliberate intervention. The purpose of this project is to perform a highly accurate retrospective analysis of a notable close-approach event (also known as a conjunction event) that occurred in the geostationary belt. It is expected that the successful completion of this work will result in a trustworthy prognostics tool that can help minimize, or even eliminate, such risk in the future. Events related to candidate resident space objects were considered, and the 2016 Briz-M rocket body explosion was chosen as the particular event of interest. By appropriately modeling the motion of such candidates through astrodynamics analysis and adjusting the initial conditions to reflect sensor precision, a recently developed adaptive Monte Carlo method, implemented as a MATLAB-based forecasting platform, can be employed to propagate a particle cloud representing the object's orbit over time. The completion of this project will validate the methods used, while simultaneously reducing the risks of collision and damage in similar events in the future.
    Ohio Space Grant Consortium Scholarship ('21-'22). No embargo. Academic Major: Aerospace Engineering
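
    The particle-cloud idea described above can be illustrated with a minimal sketch: sample initial states around a nominal orbit with spreads standing in for sensor precision, then propagate each sample forward in time. The Python/NumPy version below uses a plain two-body propagator and is not the project's MATLAB platform; the uncertainty levels and the roughly geostationary example state are assumptions for illustration only.

        import numpy as np
        from scipy.integrate import solve_ivp

        MU_EARTH = 398600.4418  # km^3/s^2, Earth's gravitational parameter

        def two_body(t, state):
            """Two-body point-mass dynamics; state = [x, y, z, vx, vy, vz] in km, km/s."""
            r = state[:3]
            a = -MU_EARTH * r / np.linalg.norm(r) ** 3
            return np.concatenate([state[3:], a])

        def propagate_cloud(nominal_state, pos_sigma_km, vel_sigma_kms, t_span,
                            n_particles=500, seed=0):
            """Sample initial states around the nominal orbit (spread reflects assumed
            sensor precision) and propagate each sample, returning the final cloud."""
            rng = np.random.default_rng(seed)
            sigma = np.concatenate([np.full(3, pos_sigma_km), np.full(3, vel_sigma_kms)])
            samples = nominal_state + rng.normal(scale=sigma, size=(n_particles, 6))
            finals = np.empty_like(samples)
            for i, s0 in enumerate(samples):
                sol = solve_ivp(two_body, t_span, s0, rtol=1e-9, atol=1e-12)
                finals[i] = sol.y[:, -1]
            return finals

        # Example: a roughly geostationary nominal state propagated for one day.
        r_geo = 42164.0  # km
        nominal = np.array([r_geo, 0.0, 0.0, 0.0, np.sqrt(MU_EARTH / r_geo), 0.0])
        cloud = propagate_cloud(nominal, pos_sigma_km=1.0, vel_sigma_kms=1e-4,
                                t_span=(0.0, 86400.0))
        print(cloud.mean(axis=0), cloud.std(axis=0))

    The spread of the final cloud is what a conjunction analysis would then compare against the orbit of the other object in the close-approach event.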

    Beyond Gaussian Statistical Modeling in Geophysical Data Assimilation

    This review discusses recent advances in geophysical data assimilation beyond Gaussian statistical modeling, in the fields of meteorology, oceanography, and atmospheric chemistry. The non-Gaussian features are stressed rather than the nonlinearity of the dynamical models, although both aspects are entangled. Ideas recently proposed to deal with these non-Gaussian issues, in order to improve the state or parameter estimation, are emphasized. The general Bayesian solution to the estimation problem and the techniques to solve it are first presented, as well as the obstacles that hinder their use in high-dimensional and complex systems. Approximations to the Bayesian solution relying on Gaussian or second-order moment closure have been widely adopted in geophysical data assimilation (e.g., Kalman filters and quadratic variational solutions). Yet, nonlinear and non-Gaussian effects remain. They essentially originate in the nonlinear models and in the non-Gaussian priors. How these effects are handled within algorithms based on Gaussian assumptions is then described. Statistical tools that can diagnose them and measure deviations from Gaussianity are recalled. The following advanced techniques that seek to handle the estimation problem beyond Gaussianity are reviewed: the maximum entropy filter, Gaussian anamorphosis, non-Gaussian priors, the particle filter with an ensemble Kalman filter as a proposal distribution, maximum entropy on the mean, and strictly Bayesian inference for large linear models. Several ideas are illustrated with recent or original examples that possess some features of high-dimensional systems. Many of the new approaches are well understood only in special cases and have difficulties that remain to be circumvented. Some of the suggested approaches are quite promising, and sometimes already successful, for moderately large though specific geophysical applications. Hints are given as to where progress might come from.
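
    One of the techniques listed above, Gaussian anamorphosis, transforms a non-Gaussian variable through its empirical distribution so that a Gaussian-based update can be applied in the transformed space and then mapped back. The sketch below is a minimal rank-based version for a single scalar ensemble; it illustrates the general idea only and is not the review's formulation.

        import numpy as np
        from scipy.stats import norm

        def anamorphosis(ensemble):
            """Empirical Gaussian anamorphosis: map each ensemble member to a standard
            normal quantile via its rank. Returns the transformed ensemble plus the
            sorted pairs needed to invert the transform by interpolation."""
            n = ensemble.size
            order = np.argsort(ensemble)
            ranks = np.empty(n)
            ranks[order] = np.arange(1, n + 1)
            z = norm.ppf(ranks / (n + 1))          # approximately Gaussian values
            return z, (np.sort(z), np.sort(ensemble))

        def inverse_anamorphosis(z_new, transform):
            """Map updated Gaussian-space values back to physical space by interpolation."""
            sorted_z, sorted_x = transform
            return np.interp(z_new, sorted_z, sorted_x)

        # A skewed (log-normal) prior ensemble becomes roughly Gaussian after the transform,
        # and the inverse map recovers the original values.
        rng = np.random.default_rng(1)
        prior = rng.lognormal(mean=0.0, sigma=0.8, size=200)
        z, transform = anamorphosis(prior)
        back = inverse_anamorphosis(z, transform)
        print(np.allclose(np.sort(back), np.sort(prior)))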

    An Ensemble Score Filter for Tracking High-Dimensional Nonlinear Dynamical Systems

    We propose an ensemble score filter (EnSF) for solving high-dimensional nonlinear filtering problems with superior accuracy. A major drawback of existing filtering methods, e.g., particle filters or ensemble Kalman filters, is their low accuracy on high-dimensional and highly nonlinear problems. EnSF attacks this challenge by exploiting a score-based diffusion model, defined in a pseudo-temporal domain, to characterize the evolution of the filtering density. EnSF stores the information of the recursively updated filtering density function in the score function, instead of storing it in a finite set of Monte Carlo samples (as in particle filters and ensemble Kalman filters). Unlike existing diffusion models that train neural networks to approximate the score function, we develop a training-free score estimation that uses a mini-batch-based Monte Carlo estimator to directly approximate the score function at any pseudo-spatial-temporal location, which provides sufficient accuracy for solving high-dimensional nonlinear problems and saves the tremendous amount of time otherwise spent training neural networks. Another essential aspect of EnSF is its analytical update step, which gradually incorporates data information into the score function and is crucial in mitigating the degeneracy issue faced when dealing with very high-dimensional nonlinear filtering problems. High-dimensional Lorenz systems are used to demonstrate the performance of our method. EnSF provides surprisingly impressive performance in reliably tracking extremely high-dimensional Lorenz systems (up to 1,000,000 dimensions) with highly nonlinear observation processes, which is a well-known challenge for existing filtering methods.
    Comment: arXiv admin note: text overlap with arXiv:2306.0928
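
    The training-free score estimation can be sketched as a Monte Carlo estimate of the score of the Gaussian mixture obtained by diffusing a mini-batch of ensemble members along the pseudo-time axis. The schedule functions, shapes, and batch size below are illustrative assumptions and not the EnSF paper's exact construction.

        import numpy as np

        def mc_score(x, t, samples, alpha=lambda t: 1.0 - t, beta2=lambda t: t):
            """Training-free Monte Carlo score estimate at pseudo-time t.

            The diffused density at t is treated as a mixture of Gaussians centred at
            alpha(t) * x0 for each sample x0, with variance beta2(t). Its score is a
            responsibility-weighted average of the individual Gaussian scores. The
            alpha/beta2 schedules here are assumed, simple choices."""
            a, b2 = alpha(t), beta2(t)
            diffs = x[None, :] - a * samples                  # (N, d)
            log_w = -0.5 * np.sum(diffs ** 2, axis=1) / b2    # unnormalised log weights
            log_w -= log_w.max()                              # numerical stabilisation
            w = np.exp(log_w)
            w /= w.sum()
            return -(w[:, None] * diffs).sum(axis=0) / b2     # weighted mixture score

        # Usage on a mini-batch drawn from the current ensemble (10-dimensional state).
        rng = np.random.default_rng(0)
        ensemble = rng.normal(size=(500, 10))                       # filtering-density samples
        batch = ensemble[rng.choice(500, size=64, replace=False)]   # mini-batch estimator
        print(mc_score(np.zeros(10), t=0.5, samples=batch))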

    Reduced order modeling of subsurface multiphase flow models using deep residual recurrent neural networks

    We present a reduced order modeling (ROM) technique for subsurface multiphase flow problems building on the recently introduced deep residual recurrent neural network (DR-RNN) [1]. DR-RNN is a physics-aware recurrent neural network for modeling the evolution of dynamical systems. The DR-RNN architecture is inspired by the iterative update techniques of line-search methods, where a fixed number of layers are stacked together to minimize the residual (or reduced residual) of the physical model under consideration. In this manuscript, we combine DR-RNN with proper orthogonal decomposition (POD) and the discrete empirical interpolation method (DEIM) to reduce the computational complexity associated with high-fidelity numerical simulations. In the presented formulation, POD is used to construct an optimal set of reduced basis functions and DEIM is employed to evaluate the nonlinear terms independently of the full-order model size. We demonstrate the proposed reduced model on two uncertainty quantification test cases using Monte Carlo simulation of subsurface flow with random permeability fields. The obtained results demonstrate that DR-RNN combined with POD-DEIM provides an accurate and stable reduced model at a fixed computational budget that is much lower than the computational cost of a standard POD-Galerkin reduced model combined with DEIM for nonlinear dynamical systems.
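
    The POD-DEIM ingredients themselves can be sketched in a few lines: POD takes the leading left singular vectors of a snapshot matrix as the reduced basis, and DEIM greedily selects the interpolation points at which the nonlinear term must be evaluated. The grid, snapshot function, and rank below are illustrative assumptions; DR-RNN itself is not reproduced here.

        import numpy as np

        def pod_basis(snapshots, r):
            """POD: the leading r left singular vectors of the snapshot matrix give an
            optimal (least-squares sense) reduced basis."""
            U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
            return U[:, :r]

        def deim_indices(U):
            """Greedy DEIM point selection: add, one basis vector at a time, the row
            where the interpolation residual of the new vector is largest."""
            n, m = U.shape
            idx = [int(np.argmax(np.abs(U[:, 0])))]
            for j in range(1, m):
                P = np.array(idx)
                c = np.linalg.solve(U[P][:, :j], U[P, j])   # interpolation coefficients
                r = U[:, j] - U[:, :j] @ c                  # residual of the new vector
                idx.append(int(np.argmax(np.abs(r))))
            return np.array(idx)

        # Example: snapshots of a nonlinear field on a 1-D grid, reduced to 5 modes.
        x = np.linspace(0, 1, 200)
        snaps = np.array([np.exp(-((x - mu) ** 2) / 0.01)
                          for mu in np.linspace(0.2, 0.8, 40)]).T
        U = pod_basis(snaps, r=5)
        print(deim_indices(U))   # the few grid points where the nonlinear term is evaluated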