Scalable Bayesian optimization with high-dimensional outputs using randomized prior networks
Several fundamental problems in science and engineering consist of global
optimization tasks involving unknown high-dimensional (black-box) functions
that map a set of controllable variables to the outcomes of an expensive
experiment. Bayesian Optimization (BO) techniques are known to be effective in
tackling global optimization problems using a relatively small number of objective
function evaluations, but their performance suffers when dealing with
high-dimensional outputs. To overcome the major challenge of dimensionality,
here we propose a deep learning framework for BO and sequential decision making
based on bootstrapped ensembles of neural architectures with randomized priors.
Using appropriate architecture choices, we show that the proposed framework can
approximate functional relationships between design variables and quantities of
interest, even in cases where the latter take values in high-dimensional vector
spaces or even infinite-dimensional function spaces. In the context of BO, we
augment the proposed probabilistic surrogates with re-parameterized Monte
Carlo approximations of multiple-point (parallel) acquisition functions, as
well as methodological extensions for accommodating black-box constraints and
multi-fidelity information sources. We test the proposed framework against
state-of-the-art methods for BO and demonstrate superior performance across
several challenging tasks with high-dimensional outputs, including a
constrained optimization task involving shape optimization of rotor blades in
turbo-machinery.
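As a concrete illustration of the surrogate described above, the sketch below builds one ensemble member as a trainable network added to a frozen, randomly initialized prior network. This is a minimal sketch assuming a PyTorch setup; the layer widths, the prior_scale weight, and the input/output dimensions are illustrative placeholders, not the paper's actual architecture.

```python
import torch.nn as nn

def mlp(in_dim, out_dim, width=64):
    # Small fully connected network; sizes are illustrative only.
    return nn.Sequential(
        nn.Linear(in_dim, width), nn.Tanh(),
        nn.Linear(width, width), nn.Tanh(),
        nn.Linear(width, out_dim),
    )

class RandomizedPriorNet(nn.Module):
    """One ensemble member: a trainable network plus a frozen random prior."""
    def __init__(self, in_dim, out_dim, prior_scale=1.0):
        super().__init__()
        self.trainable = mlp(in_dim, out_dim)
        self.prior = mlp(in_dim, out_dim)
        for p in self.prior.parameters():   # the prior is never trained
            p.requires_grad_(False)
        self.prior_scale = prior_scale

    def forward(self, x):
        return self.trainable(x) + self.prior_scale * self.prior(x)

# Bootstrapped ensemble: each member is trained on its own resample of the data;
# the spread of the members' predictions provides the epistemic uncertainty that
# drives the (parallel) acquisition function.
ensemble = [RandomizedPriorNet(in_dim=4, out_dim=128) for _ in range(8)]
```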
ClimSim: A large multi-scale dataset for hybrid physics-ML climate emulation
Modern climate projections lack adequate spatial and temporal resolution due to computational constraints. A consequence is inaccurate and imprecise predictions of critical processes such as storms. Hybrid methods that combine physics with machine learning (ML) have introduced a new generation of higher fidelity climate simulators that can sidestep Moore's Law by outsourcing compute-hungry, short, high-resolution simulations to ML emulators. However, this hybrid ML-physics simulation approach requires domain-specific treatment and has been inaccessible to ML experts because of a lack of training data and relevant, easy-to-use workflows. We present ClimSim, the largest-ever dataset designed for hybrid ML-physics research. It comprises multi-scale climate simulations, developed by a consortium of climate scientists and ML researchers. It consists of 5.7 billion pairs of multivariate input and output vectors that isolate the influence of locally-nested, high-resolution, high-fidelity physics on a host climate simulator's macro-scale physical state. The dataset is global in coverage, spans multiple years at high sampling frequency, and is designed such that resulting emulators are compatible with downstream coupling into operational climate simulators. We implement a range of deterministic and stochastic regression baselines to highlight the ML challenges and their scoring. The data (https://huggingface.co/datasets/LEAP/ClimSim_high-res) and code (https://leap-stc.github.io/ClimSim) are released openly to support the development of hybrid ML-physics and high-fidelity climate simulations for the benefit of science and society.
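For readers who want to inspect the released data, a minimal sketch of pulling part of the Hugging Face dataset repository is shown below. The allow_patterns filter is an illustrative assumption to avoid a very large download; the actual file layout and the recommended input/output variable lists are documented in the ClimSim repository.

```python
from huggingface_hub import snapshot_download

# Download a subset of the high-resolution ClimSim dataset repository.
# The pattern below is a placeholder; consult the ClimSim docs for the
# actual file layout and variable lists before training an emulator.
local_dir = snapshot_download(
    repo_id="LEAP/ClimSim_high-res",
    repo_type="dataset",
    allow_patterns=["*.json", "*.md"],  # metadata only, not the full data
)
print("Dataset files downloaded to:", local_dir)
```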
Curved beam based model for piston-ring designs in internal combustion engines
Thesis: S.M., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2017. Cataloged from PDF version of thesis. Includes bibliographical references (pages 169-173).
Characterizing piston ring behavior is inherently associated with the oil consumption, friction, wear, and blow-by in internal combustion engines. This behavior varies along the ring's circumference, and determining these variations is of utmost importance for developing ring-packs that achieve the desired sealing and conformability performance. A previous study based on a straight beam model had already been developed, but it does not consider the lubrication sub-models, the tip gap effects, or the characterization of the ring free shape based on an arbitrary final closed shape. In this work, three numerical curved-beam-based models were developed to study the performance of the piston ring-pack. The conformability model was developed to characterize the behavior of the ring within the engine. In this model, the curved beam formulation is adopted while considering ring-bore and ring-groove interactions. These interactions include asperity and lubrication forces. In addition, gas forces are included in the model along with the inertia and the initial ring tangential load. We also allow for thermal distortion of the bore and of the groove's upper and lower flanks, and we take into account the thermal expansion of the ring and the effects of the temperature gradient from the inner diameter (ID) to the outer diameter (OD). The piston secondary motion, the variation of oil viscosity on the liner with its temperature, the presence of fuel, and the different hydrodynamic cases (partially and fully flooded) are considered as well. This model reveals the ring position relative to the groove depending on the friction, inertia, and gas pressures. It also characterizes the effect of a non-uniform oil distribution on the liner and groove flanks. Finally, the ring gap position within a distorted bore also reveals the sealing performance of the ring. Using the curved beam model, we also developed a module that performs the twist calculation under a fixed ID or OD constraint. The static twist is an experimental characterization of the ring during which the user taps on the ring until there is a minimum clearance between the ring's lowest point and the lower plate all over the ring's circumference, but without any contact force. Our last model includes four sub-models that relate the ring free shape, its final shape when subjected to a constant radial pressure (this final shape is called the ovality), and the force distribution in a circular bore. Knowing one of these distributions, the model determines the other two. This tool is useful in the sense that the characterization of the ring is carried out by measuring its ovality, which is more accurate than measuring its free shape or its force distribution in a circular bore. Thus, a model that takes the ovality as an input is more convenient and useful for the experiments carried out to characterize the ring.
by Mohamed Aziz Bhouri. S.M.
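The curved-beam models above build on the classical thin-ring relation in which the local bending moment is proportional to the change of curvature between the free shape and the deformed (installed) shape. The sketch below evaluates that textbook relation numerically; it is a minimal illustration under standard Euler-Bernoulli thin-beam assumptions, not the thesis's full conformability model, and the material and geometry values are placeholders.

```python
import numpy as np

def bending_moment(kappa_deformed, kappa_free, E, I):
    """Thin curved-beam relation: M(theta) = E*I*(kappa_deformed - kappa_free).

    kappa_* are arrays of curvature (1/m) sampled along the ring circumference;
    E is Young's modulus (Pa) and I the cross-section second moment of area (m^4).
    """
    return E * I * (np.asarray(kappa_deformed) - np.asarray(kappa_free))

# Placeholder example: a ring pressed into a circular bore of 45 mm radius
# from a slightly ovalized free shape (all values are illustrative only).
theta = np.linspace(0.0, 2.0 * np.pi, 361)
kappa_free = 1.0 / (0.045 + 0.0005 * np.cos(2.0 * theta))  # free-shape curvature
kappa_bore = np.full_like(theta, 1.0 / 0.045)              # circular-bore curvature
M = bending_moment(kappa_bore, kappa_free, E=2.0e11, I=8.0e-12)
```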
A two-level parameterized Model-Order Reduction approach for time-domain elastodynamics
We present a two-level parameterized Model Order Reduction (pMOR) technique for the linear hyperbolic Partial Differential Equation (PDE) of time-domain elastodynamics. In order to approximate the frequency-domain PDE, we take advantage of the Port-Reduced Reduced-Basis Component (PR-RBC) method to develop (in the offline stage) reduced bases for subdomains; the latter are then assembled (in the online stage) to form the global domains of interest. The PR-RBC approach reduces the effective dimensionality of the parameter space and also provides flexibility in topology and geometry. In the online stage, for each query, we consider a given parameter value and associated global domain. In the first level of reduction, the PR-RBC reduced bases are used to approximate the frequency-domain solution at selected frequencies. In the second level of reduction, these instantiated PR-RBC approximations are used as surrogate truth solutions in a Strong Greedy approach to identify a reduced basis space; the PDE of time-domain elastodynamics is then projected on this reduced space. We provide a numerical example to demonstrate the computational capability and assess the performance of the proposed two-level approach.
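The second level of reduction described above amounts to a strong greedy selection over the instantiated PR-RBC surrogate-truth snapshots, followed by projection of the time-domain operators onto the selected basis. The sketch below shows only the greedy basis-construction step in a generic form: the snapshot matrix, tolerance, and (Euclidean) inner product are placeholder assumptions rather than the paper's actual configuration.

```python
import numpy as np

def strong_greedy_basis(snapshots, tol=1e-6, max_basis=50):
    """Greedy selection of an orthonormal reduced basis from snapshot columns.

    At each step, pick the snapshot with the largest projection error onto the
    current basis and append its normalized residual as a new basis vector.
    """
    basis = np.zeros((snapshots.shape[0], 0))
    for _ in range(min(max_basis, snapshots.shape[1])):
        residuals = snapshots - basis @ (basis.T @ snapshots)
        errors = np.linalg.norm(residuals, axis=0)
        worst = int(np.argmax(errors))
        if errors[worst] < tol:
            break
        basis = np.column_stack([basis, residuals[:, worst] / errors[worst]])
    return basis

# Placeholder usage: columns of `snapshots` would be PR-RBC frequency-domain
# approximations at selected frequencies; here they are random stand-ins.
snapshots = np.random.rand(200, 30)
V = strong_greedy_basis(snapshots)
# The time-domain operators (e.g., mass and stiffness matrices A) would then be
# projected onto the reduced space as V.T @ A @ V.
```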
ClimSim: An open large-scale dataset for training high-resolution physics emulators in hybrid multi-scale climate simulators
Modern climate projections lack adequate spatial and temporal resolution due to computational constraints. A consequence is inaccurate and imprecise predictions of critical processes such as storms. Hybrid methods that combine physics with machine learning (ML) have introduced a new generation of higher fidelity climate simulators that can sidestep Moore's Law by outsourcing compute-hungry, short, high-resolution simulations to ML emulators. However, this hybrid ML-physics simulation approach requires domain-specific treatment and has been inaccessible to ML experts because of a lack of training data and relevant, easy-to-use workflows. We present ClimSim, the largest-ever dataset designed for hybrid ML-physics research. It comprises multi-scale climate simulations, developed by a consortium of climate scientists and ML researchers. It consists of 5.7 billion pairs of multivariate input and output vectors that isolate the influence of locally-nested, high-resolution, high-fidelity physics on a host climate simulator's macro-scale physical state. The dataset is global in coverage, spans multiple years at high sampling frequency, and is designed such that resulting emulators are compatible with downstream coupling into operational climate simulators. We implement a range of deterministic and stochastic regression baselines to highlight the ML challenges and their scoring. The data (https://huggingface.co/datasets/LEAP/ClimSim_high-res, https://huggingface.co/datasets/LEAP/ClimSim_low-res, and https://huggingface.co/datasets/LEAP/ClimSim_low-res_aqua-planet) and code (https://leap-stc.github.io/ClimSim) are released openly to support the development of hybrid ML-physics and high-fidelity climate simulations for the benefit of science and society.
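As a complement to the deterministic baselines mentioned above, the sketch below shows one common form of stochastic regression baseline: a network that predicts a per-channel mean and log-variance and is trained with a Gaussian negative log-likelihood. This is a generic, hedged illustration; the dimensions and architecture are placeholders, not the ClimSim baseline configurations.

```python
import torch
import torch.nn as nn

class HeteroscedasticMLP(nn.Module):
    """Predicts a per-channel mean and log-variance for the target vector."""
    def __init__(self, in_dim, out_dim, width=256):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.mean_head = nn.Linear(width, out_dim)
        self.logvar_head = nn.Linear(width, out_dim)

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mean, logvar, target):
    # Negative log-likelihood of a diagonal Gaussian, averaged over the batch.
    return 0.5 * (logvar + (target - mean) ** 2 / logvar.exp()).mean()

# Placeholder dimensions; the actual ClimSim input/output sizes come from the
# variable lists documented in the project repository.
model = HeteroscedasticMLP(in_dim=124, out_dim=128)
x, y = torch.randn(32, 124), torch.randn(32, 128)
loss = gaussian_nll(*model(x), y)
```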