    Reduced order modeling of fluid flows: Machine learning, Kolmogorov barrier, closure modeling, and partitioning

    In this paper, we put forth a long short-term memory (LSTM) nudging framework for the enhancement of reduced order models (ROMs) of fluid flows utilizing noisy measurements. We build on the fact that in a realistic application, there are uncertainties in initial conditions, boundary conditions, model parameters, and/or field measurements. Moreover, conventional nonlinear ROMs based on Galerkin projection (GROMs) suffer from imperfection and solution instabilities due to the modal truncation, especially for advection-dominated flows with slow decay in the Kolmogorov width. In the presented LSTM-Nudge approach, we fuse forecasts from an imperfect GROM and uncertain state estimates with sparse Eulerian sensor measurements to provide more reliable predictions in a dynamical data assimilation framework. We illustrate the idea with the viscous Burgers problem, as a benchmark test bed with quadratic nonlinearity and Laplacian dissipation. We investigate the effects of measurement noise and state estimate uncertainty on the behavior of the LSTM-Nudge approach. We also demonstrate that it can handle different levels of temporal and spatial measurement sparsity. This first step in our assessment of the proposed model shows that LSTM nudging could represent a viable real-time predictive tool in emerging digital twin systems.
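The core nudging idea in the abstract — relax an imperfect model forecast toward sparse, noisy sensor measurements — can be sketched in a few lines. This is a minimal sketch with a fixed scalar gain and a toy linear surrogate model; in the paper's LSTM-Nudge framework the correction term is learned, and the forecast comes from a Galerkin ROM of the viscous Burgers equation, neither of which is reproduced here.

```python
import numpy as np

def nudge_step(model_step, x, y_obs, H, gain):
    """One forecast-plus-nudge cycle: advance the (imperfect) model,
    then relax the state toward sparse noisy observations.
    A fixed scalar gain stands in for the learned LSTM correction."""
    x_f = model_step(x)                    # imperfect model forecast
    innovation = y_obs - H @ x_f           # mismatch at sensor locations
    return x_f + gain * H.T @ innovation   # nudge toward measurements

# Toy setup: stable linear "dynamics" and two Eulerian sensors
# on a 4-point grid (all names and values are illustrative).
rng = np.random.default_rng(0)
A = 0.95 * np.eye(4)                       # surrogate model operator
H = np.zeros((2, 4))
H[0, 1] = H[1, 3] = 1.0                    # observe grid points 1 and 3
x_true = np.ones(4)
y = H @ x_true + 0.01 * rng.standard_normal(2)   # noisy measurements
x = nudge_step(lambda v: A @ v, np.zeros(4), y, H, gain=0.5)
```

Starting from a zero state estimate, only the observed components are pulled toward the measurements; unobserved components are left to the model dynamics, which is what distinguishes nudging from a full-state overwrite.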

    Data and Predictive Analytics Use for Logistics and Supply Chain Management

    Purpose The purpose of this paper is to explore the social process of Big Data and predictive analytics (BDPA) use for logistics and supply chain management (LSCM), focusing on interactions among technology, human behavior and organizational context that occur at the technology’s post-adoption phases in retail supply chain (RSC) organizations. Design/methodology/approach The authors follow a grounded theory approach for theory building based on interviews with senior managers of 15 organizations positioned across multiple echelons in the RSC. Findings Findings reveal how user involvement shapes BDPA to fit organizational structures and how changes made to the technology retroactively affect its design and institutional properties. Findings also reveal previously unreported aspects of BDPA use for LSCM, including the presence of temporal and spatial discontinuities in technology use across RSC organizations. Practical implications This study shows that it is impossible to design a BDPA technology ready for immediate use. The emergent process framework shows that institutional and social factors make BDPA use specific to the organization, as the technology comes to reflect the properties of the organization and the wider social environment, rather than only what its designers originally intended. BDPA is, thus, not easily transferrable among collaborating RSC organizations and requires managerial attention to the institutional context within which its usage takes place. Originality/value The literature describes why organizations use BDPA but fails to provide adequate insight into how BDPA use occurs. The authors address the “how” and bring a social perspective into a technology-centric area.

    Data Assimilation for a Geological Process Model Using the Ensemble Kalman Filter

    We consider the problem of conditioning a geological process-based computer simulation, which produces basin models by simulating transport and deposition of sediments, to data. Emphasising uncertainty quantification, we frame this as a Bayesian inverse problem, and propose to characterize the posterior probability distribution of the geological quantities of interest by using a variant of the ensemble Kalman filter, an estimation method which linearly and sequentially conditions realisations of the system state to data. A test case involving synthetic data is used to assess the performance of the proposed estimation method, and to compare it with similar approaches. We further apply the method to a more realistic test case, involving real well data from the Colville foreland basin, North Slope, Alaska.

    Comment: 34 pages, 10 figures, 4 tables
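The "linearly and sequentially conditions realisations of the system state to data" step is the EnKF analysis update. The sketch below shows a stochastic-EnKF analysis on a toy state; it is a schematic of the generic filter, not the paper's variant, and all dimensions, values, and the observation-error level are illustrative.

```python
import numpy as np

def enkf_analysis(X, y, H, r):
    """Stochastic EnKF analysis step.
    X: (n_state, n_ens) ensemble of realisations; y: observations;
    H: linear observation operator; r: observation-error variance."""
    n = X.shape[1]
    Xm = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
    P = Xm @ Xm.T / (n - 1)                          # sample covariance
    R = r * np.eye(len(y))
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    rng = np.random.default_rng(1)
    Y = y[:, None] + np.sqrt(r) * rng.standard_normal((len(y), n))
    return X + K @ (Y - H @ X)                       # conditioned ensemble

# Tiny example: 3 state variables, the first one observed, 50 members.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 50))    # prior ensemble, mean near zero
H = np.array([[1.0, 0.0, 0.0]])
y = np.array([2.0])
Xa = enkf_analysis(X, y, H, r=0.1)
```

After the update, the ensemble mean of the observed variable is pulled toward the datum while unobserved variables are adjusted only through the sample cross-covariances, which is how the filter propagates well data into unobserved geological quantities.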

    Active Learning of Gaussian Processes for Spatial Functions in Mobile Sensor Networks

    This paper proposes a spatial function modeling approach using mobile sensor networks, which can potentially be used for environmental surveillance applications. The mobile sensor nodes are able to sample point observations of a 2D spatial function. On the one hand, they use the observations to generate a predictive model of the spatial function. On the other hand, they make collective motion decisions to move into the regions where the predictive model is most uncertain. In the end, an accurate predictive model is obtained in the sensor network and all the mobile sensor nodes are distributed in the environment in an optimized pattern. Gaussian process regression is selected as the modeling technique in the proposed approach. The hyperparameters of the Gaussian process model are learned online to improve the accuracy of the predictive model. The collective motion control of the mobile sensor nodes is based on a locational optimization algorithm, which uses the information entropy of the Gaussian process predictions to explore the environment and reduce the uncertainty of the predictive model. Simulation results are provided to show the performance of the proposed approach. © 2011 IFAC
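The uncertainty-seeking loop described above — fit a GP to the point observations, then move toward the location of highest predictive uncertainty — can be sketched in 1D. This is a minimal illustration assuming a squared-exponential kernel with hand-picked hyperparameters (`ell`, `sn`), whereas the paper learns the hyperparameters online and drives multiple nodes with an entropy-based locational optimization.

```python
import numpy as np

def gp_posterior(X, y, Xs, ell=0.2, sn=0.05):
    """GP regression with a squared-exponential kernel; returns
    predictive mean and variance at test points Xs.
    Hyperparameters ell (length scale) and sn (noise std) are
    illustrative, not learned online as in the paper."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)
    K = k(X, X) + sn**2 * np.eye(len(X))     # noisy training covariance
    Ks = k(Xs, X)                            # test/train covariance
    mu = Ks @ np.linalg.solve(K, y)          # predictive mean
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, var

# Current sample locations leave a gap between 0.2 and 0.9, so the
# next sensing location falls inside that gap.
X = np.array([0.1, 0.2, 0.9])            # sampled positions
y = np.sin(2 * np.pi * X)                # point observations
grid = np.linspace(0.0, 1.0, 101)
mu, var = gp_posterior(X, y, grid)
next_loc = grid[np.argmax(var)]          # largest predictive uncertainty
```

For a GP with Gaussian predictive distribution, maximizing predictive variance and maximizing the differential entropy 0.5 log(2*pi*e*var) pick the same point, so this variance-greedy rule is a single-node stand-in for the entropy criterion.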

    Variational Downscaling, Fusion and Assimilation of Hydrometeorological States via Regularized Estimation

    Improved estimation of hydrometeorological states from down-sampled observations and background model forecasts in a noisy environment has been a subject of growing research in the past decades. Here, we introduce a unified framework that ties together the problems of downscaling, data fusion and data assimilation as ill-posed inverse problems. This framework seeks solutions beyond the classic least squares estimation paradigms by imposing proper regularization, that is, constraints consistent with the degree of smoothness and probabilistic structure of the underlying state. We review relevant regularization methods in derivative space and extend classic formulations of the aforementioned problems with particular emphasis on hydrologic and atmospheric applications. Informed by the statistical characteristics of the state variable of interest, the central results of the paper suggest that proper regularization can lead to a more accurate and stable recovery of the true state and hence more skillful forecasts. In particular, using Tikhonov and Huber regularization in the derivative space, the promise of the proposed framework is demonstrated in static downscaling and fusion of synthetic multi-sensor precipitation data, while a data assimilation numerical experiment is presented using the heat equation in a variational setting.
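The Tikhonov branch of this framework amounts to a penalized least squares problem, min_x ||Hx - y||^2 + lam ||Dx||^2, where H maps the fine-scale state to coarse observations and D penalizes roughness in derivative space. The sketch below solves this for a toy 2:1 averaging operator on a 1D state; the choice of H, the value of lam, and the grid are illustrative assumptions, and the paper's multi-sensor operators and Huber variant are richer than this.

```python
import numpy as np

def tikhonov_downscale(y, Hmat, lam):
    """Solve min ||H x - y||^2 + lam ||D x||^2 via the normal
    equations, with D a first-difference operator so smoothness is
    enforced in derivative space (Tikhonov regularization)."""
    n = Hmat.shape[1]
    D = np.diff(np.eye(n), axis=0)          # (n-1, n) first differences
    A = Hmat.T @ Hmat + lam * D.T @ D       # regularized normal matrix
    return np.linalg.solve(A, Hmat.T @ y)

# Toy down-sampling: each coarse cell observes the average of two
# fine cells, so H alone is non-invertible (an ill-posed problem).
n = 8
H = np.kron(np.eye(n // 2), [[0.5, 0.5]])   # 2:1 averaging operator
x_true = np.linspace(0.0, 1.0, n)           # smooth fine-scale state
y = H @ x_true                              # coarse observations
x_hat = tikhonov_downscale(y, H, lam=1e-2)  # regularized recovery
```

Without the `lam * D.T @ D` term the normal matrix is singular, since any fine-scale pattern that averages to zero within each coarse cell is invisible to H; the derivative-space penalty selects the smoothest state consistent with the data, which is the sense in which regularization stabilizes the recovery.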