
    An efficient surrogate model for emulation and physics extraction of large eddy simulations

    In the quest for advanced propulsion and power-generation systems, high-fidelity simulations are too computationally expensive to survey the desired design space, and a new design methodology is needed that combines engineering physics, computer simulations and statistical modeling. In this paper, we propose a new surrogate model that provides efficient prediction and uncertainty quantification of turbulent flows in swirl injectors with varying geometries, devices commonly used in many engineering applications. The novelty of the proposed method lies in the incorporation of known physical properties of the fluid flow as simplifying assumptions for the statistical model. In view of the massive simulation data at hand, which is on the order of hundreds of gigabytes, these assumptions allow for accurate flow predictions in around an hour of computation time. In contrast, existing flow emulators that forgo such simplifications may require more computation time for training and prediction than is needed for conducting the simulation itself. Moreover, by accounting for coupling mechanisms between flow variables, the proposed model can jointly reduce prediction uncertainty and extract useful flow physics, which can then be used to guide further investigations. Comment: Submitted to JASA A&C
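
    The abstract does not reproduce the model itself, but geometry-to-flow surrogates of the kind it describes are often built on Gaussian-process regression. The sketch below is a minimal, purely illustrative Python version: the geometry parameters, training response, and kernel are assumptions for demonstration, and the paper's actual model additionally encodes flow physics and couples multiple flow variables.

        # Minimal Gaussian-process surrogate sketch (illustrative assumptions only).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel as C

        rng = np.random.default_rng(0)

        # Hypothetical design matrix: each row is one injector geometry
        # (e.g., nozzle length, inlet diameter, swirl-slot angle), rescaled to [0, 1].
        X_train = rng.uniform(size=(30, 3))
        # Hypothetical scalar response extracted from each simulation run,
        # standing in for a flow quantity of interest at a monitoring location.
        y_train = np.sin(2 * np.pi * X_train[:, 0]) + 0.5 * X_train[:, 1] ** 2

        kernel = C(1.0) * RBF(length_scale=[0.2, 0.2, 0.2])
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(X_train, y_train)

        # Predict at a new, unsimulated geometry with an uncertainty estimate.
        x_new = np.array([[0.4, 0.7, 0.1]])
        mean, std = gp.predict(x_new, return_std=True)
        print(f"predicted response: {mean[0]:.3f} +/- {std[0]:.3f}")

    Training such an emulator on a few dozen simulations and querying it in seconds is what makes surveying a design space feasible; the paper's contribution lies in making this tractable and accurate for very large, coupled flow data.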

    Bose-Einstein Correlations of Neutral and Charged Pions in Hadronic Z Decays

    Bose-Einstein correlations of both neutral and like-sign charged pion pairs are measured in a sample of 2 million hadronic Z decays collected with the L3 detector at LEP. The analysis is performed in the four-momentum difference range 300 MeV < Q < 2 GeV. The radius of the neutral pion source is found to be smaller than that of charged pions. This result is in qualitative agreement with the string fragmentation model.
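
    The abstract does not quote the fit function, but two-pion Bose-Einstein analyses of this type conventionally parametrize the correlation function in the four-momentum difference Q with a Gaussian (Goldhaber-type) form; the expression below is that standard convention, not necessarily the exact parametrization used by L3. Here R is the source radius, lambda the chaoticity parameter, and N an overall normalization.

        C(Q) \;=\; \frac{N_{\mathrm{pairs}}^{\mathrm{same}}(Q)}{N_{\mathrm{pairs}}^{\mathrm{ref}}(Q)}
        \;\simeq\; N \left( 1 + \lambda\, e^{-Q^{2} R^{2}} \right),
        \qquad
        Q^{2} \;=\; -\,(p_{1} - p_{2})^{2} \;=\; M_{\pi\pi}^{2} - 4 m_{\pi}^{2} .

    A smaller fitted R for neutral-pion pairs than for like-sign charged pairs is then precisely the statement that the neutral pion source radius is smaller, as reported in the abstract.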

    Models and Simulations in Material Science: Two Cases Without Error Bars

    We discuss two research projects in material science in which the results cannot be stated with an estimation of the error: a spectroscopic ellipsometry study aimed at determining the orientation of DNA molecules on diamond and a scanning tunneling microscopy study of platinum-induced nanowires on germanium. To investigate the reliability of the results, we apply ideas from the philosophy of models in science. Even if the studies had reported an error value, the trustworthiness of the result would not depend on that value alone. Comment: 20 pages, 2 figures

    A study of the material in the ATLAS inner detector using secondary hadronic interactions

    The ATLAS inner detector is used to reconstruct secondary vertices due to hadronic interactions of primary collision products, thereby probing the location and amount of material in the inner region of ATLAS. Data collected in 7 TeV pp collisions at the LHC, with a minimum bias trigger, are used for comparisons with simulated events. The reconstructed secondary vertices have spatial resolutions ranging from ~200 µm to 1 mm. The overall material description in the simulation is validated to within an experimental uncertainty of about 7%. This will lead to a better understanding of the reconstruction of various objects such as tracks, leptons, jets, and missing transverse momentum.
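
    As a purely schematic illustration of the kind of data-to-simulation comparison such a material validation rests on, the sketch below compares secondary-vertex yields in radial bins and quotes the data/MC ratio with a statistical uncertainty. All bin edges and yields are made-up numbers, not ATLAS results.

        # Schematic data/MC yield comparison in radial bins (invented numbers).
        import numpy as np

        r_edges_mm = np.array([20, 40, 60, 80, 120])        # hypothetical radial bins
        n_data     = np.array([5200, 8100, 3900, 2600])     # vertices found in data
        n_mc_raw   = np.array([5050, 8400, 3750, 2700])     # vertices found in simulation

        # Normalize the simulation to the total data yield before comparing shapes.
        n_mc = n_mc_raw * n_data.sum() / n_mc_raw.sum()

        ratio = n_data / n_mc
        stat  = ratio * np.sqrt(1.0 / n_data + 1.0 / n_mc_raw)  # approximate statistical error
        for lo, hi, r, dr in zip(r_edges_mm[:-1], r_edges_mm[1:], ratio, stat):
            print(f"{lo:3d}-{hi:3d} mm: data/MC = {r:.3f} +/- {dr:.3f}")

    The ~7% quoted in the abstract is the full experimental uncertainty of such a comparison, which in practice involves far more than the simple statistical error shown here.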

    Influence of Resonances on the Noise Performance of SQUID Susceptometers

    Scanning Superconducting Quantum Interference Device (SQUID) susceptometry simultaneously images the local magnetic fields and susceptibilities above a sample with sub-micron spatial resolution. Further development of this technique requires a thorough understanding of the current, voltage, and flux (IVΦ) characteristics of scanning SQUID susceptometers. These sensors often have striking anomalies in their current–voltage characteristics, which we believe to be due to electromagnetic resonances. The effect of these resonances on the performance of these SQUIDs is unknown. To explore the origin and impact of the resonances, we develop a model that qualitatively reproduces the experimentally determined IVΦ characteristics of our scanning SQUID susceptometers. We use this model to calculate the noise characteristics of SQUIDs of different designs. We find that the calculated ultimate flux noise is better in susceptometers with damping resistors that diminish the resonances than in susceptometers without damping resistors. Such calculations will enable the optimization of the signal-to-noise characteristics of scanning SQUID susceptometers.
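
    The abstract does not spell the model out, but SQUID circuit simulations are commonly built on the resistively and capacitively shunted junction (RCSJ) description of the two junctions, coupled through the loop inductance. The equations below are that generic textbook framework (up to sign conventions), not the authors' specific susceptometer model, which additionally includes the resonant circuit elements responsible for the anomalies.

        \frac{\hbar C}{2e}\,\ddot{\delta}_i \;+\; \frac{\hbar}{2eR}\,\dot{\delta}_i \;+\; I_0 \sin\delta_i \;=\; I_i ,
        \qquad i = 1, 2, \qquad I_1 + I_2 = I_b ,

        \delta_1 - \delta_2 \;=\; 2\pi \frac{\Phi}{\Phi_0} \ (\mathrm{mod}\ 2\pi),
        \qquad \Phi \;=\; \Phi_{\mathrm{ext}} + L\,\frac{I_1 - I_2}{2},
        \qquad V \;=\; \frac{\hbar}{2e}\,\frac{\dot{\delta}_1 + \dot{\delta}_2}{2} .

    Sweeping the bias current I_b and applied flux Φ_ext and time-averaging V yields simulated IVΦ surfaces against which measured characteristics and noise can be compared; damping resistors enter as additional dissipative elements in such a circuit model.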

    Validating Predictions of Unobserved Quantities

    The ultimate purpose of most computational models is to make predictions, commonly in support of some decision-making process (e.g., for design or operation of some system). The quantities that need to be predicted (the quantities of interest or QoIs) are generally not experimentally observable before the prediction, since otherwise no prediction would be needed. Assessing the validity of such extrapolative predictions, which is critical to informed decision-making, is challenging. In classical approaches to validation, model outputs for observed quantities are compared to observations to determine if they are consistent. By itself, this consistency only ensures that the model can predict the observed quantities under the conditions of the observations. This limitation dramatically reduces the utility of the validation effort for decision making because it implies nothing about predictions of unobserved QoIs or for scenarios outside of the range of observations. However, there is no agreement in the scientific community today regarding best practices for validation of extrapolative predictions made using computational models. The purpose of this paper is to propose and explore a validation and predictive assessment process that supports extrapolative predictions for models with known sources of error. The process includes stochastic modeling, calibration, validation, and predictive assessment phases where representations of known sources of uncertainty and error are built, informed, and tested. The proposed methodology is applied to an illustrative extrapolation problem involving a misspecified nonlinear oscillator.
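
    As a toy illustration of the calibrate-then-extrapolate structure described above (not the paper's actual formulation), the sketch below generates data from a nonlinear "reality", calibrates a deliberately misspecified linear oscillator model on a short observation window, and then uses it to predict an extrapolative quantity of interest. All parameter values and the choice of QoI are assumptions for demonstration.

        # Toy calibrate-then-extrapolate sketch (illustrative assumptions only).
        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import least_squares

        def truth_rhs(t, y):
            # "Reality": a damped oscillator with a cubic (Duffing-like) stiffness term.
            x, v = y
            return [v, -0.2 * v - x - 0.8 * x**3]

        def model_rhs(t, y, c, k):
            # Deliberately misspecified model: purely linear oscillator.
            x, v = y
            return [v, -c * v - k * x]

        t_obs = np.linspace(0.0, 5.0, 25)   # calibration (observation) window
        t_qoi = 12.0                        # extrapolative QoI: displacement at a later time
        y0 = [1.5, 0.0]

        rng = np.random.default_rng(1)
        truth = solve_ivp(truth_rhs, (0.0, t_qoi), y0,
                          t_eval=np.append(t_obs, t_qoi), rtol=1e-8)
        obs = truth.y[0, :-1] + rng.normal(0.0, 0.02, size=t_obs.size)  # noisy observations
        qoi_true = truth.y[0, -1]

        def residuals(theta):
            c, k = theta
            sol = solve_ivp(model_rhs, (0.0, t_obs[-1]), y0,
                            t_eval=t_obs, args=(c, k), rtol=1e-8)
            return sol.y[0] - obs

        fit = least_squares(residuals, x0=[0.1, 1.0])
        c_hat, k_hat = fit.x

        # Extrapolative prediction with the calibrated but misspecified model.
        pred = solve_ivp(model_rhs, (0.0, t_qoi), y0,
                         t_eval=[t_qoi], args=(c_hat, k_hat), rtol=1e-8)
        print(f"calibrated (c, k) = ({c_hat:.3f}, {k_hat:.3f})")
        print(f"predicted QoI = {pred.y[0, -1]:.3f}, true QoI = {qoi_true:.3f}")

    The calibrated model typically matches the observation window well yet can miss the extrapolative QoI; the stochastic-modeling, validation, and predictive-assessment phases proposed in the paper are about representing and testing that kind of model error before the prediction is trusted.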