
    Reliable estimation of prediction uncertainty for physico-chemical property models

    The predictions of parametric property models and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method akin to Bayesian inference that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the ⁵⁷Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with twelve density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm/s and 0.04-0.05 mm/s, respectively, the latter being close to the average experimental uncertainty of 0.02 mm/s. Furthermore, we show that both model parameters and prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the coefficient of correlation, r², or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical Mössbauer spectroscopy, which is of general applicability for physico-chemical property models and not restricted to isomer-shift predictions. We provide the statistically meaningful reference data set MIS39 and a new calibration of the isomer shift based on the PBE0 functional. Comment: 49 pages, 9 figures, 7 tables
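    A minimal sketch of the bootstrap calibration idea described in this abstract, under simplifying assumptions: the contact-density and isomer-shift values below are invented for illustration (they are not the MIS39 set), and the reported uncertainty is just the spread of refit regression lines, without the residual-noise contribution a full treatment would include.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative reference data (NOT the MIS39 set): contact electron
# densities rho(0) (a.u., relative to an offset) and measured isomer
# shifts delta (mm/s) for a handful of hypothetical iron compounds.
rho = np.array([1.2, 3.4, 5.1, 6.8, 8.0, 9.7, 11.3, 12.9])
delta = np.array([0.95, 0.71, 0.55, 0.38, 0.30, 0.12, -0.02, -0.18])

n_boot = 2000
coefs = np.empty((n_boot, 2))               # (slope, intercept) per resample
rho_grid = np.linspace(rho.min(), rho.max(), 50)
pred = np.empty((n_boot, rho_grid.size))

for b in range(n_boot):
    idx = rng.integers(0, len(rho), len(rho))      # resample with replacement
    slope, intercept = np.polyfit(rho[idx], delta[idx], 1)
    coefs[b] = slope, intercept
    pred[b] = slope * rho_grid + intercept

# Bootstrap estimates: parameter spread and a locally resolved prediction
# uncertainty (standard deviation of the refit lines at each density).
slope_mean, intercept_mean = coefs.mean(axis=0)
slope_std, intercept_std = coefs.std(axis=0, ddof=1)
local_uncertainty = pred.std(axis=0, ddof=1)

print(f"slope     = {slope_mean:+.4f} +/- {slope_std:.4f} mm/s per a.u.")
print(f"intercept = {intercept_mean:+.3f} +/- {intercept_std:.3f} mm/s")
print("max local prediction uncertainty:", local_uncertainty.max().round(4), "mm/s")
```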

    How accurate is density functional theory at predicting dipole moments? An assessment using a new database of 200 benchmark values

    Dipole moments are a simple, global measure of the accuracy of the electron density of a polar molecule. Dipole moments also affect the interactions of a molecule with other molecules as well as with electric fields. To directly assess the accuracy of modern density functionals for calculating dipole moments, we have developed a database of 200 benchmark dipole moments, using coupled cluster theory through triple excitations, extrapolated to the complete basis set limit. This new database is used to assess the performance of 88 popular or recently developed density functionals. The results suggest that double hybrid functionals perform the best, yielding dipole moments within about 3.6-4.5% regularized RMS error versus the reference values, which is not very different from the 4% regularized RMS error produced by coupled cluster singles and doubles. Many hybrid functionals also perform quite well, generating regularized RMS errors in the 5-6% range. Some functionals, however, exhibit large outliers, and local functionals generally perform less well than hybrids or double hybrids. Comment: Added several double hybrid functionals, most of which turned out to be better than any functional from Rungs 1-4 of Jacob's ladder and are actually competitive with CCSD
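    The headline metric above, the regularized RMS error, can be sketched as follows. The abstract does not spell out the regularization, so the floor value used here is an assumption made only to illustrate the idea of keeping near-zero reference dipoles from dominating a relative-error statistic; the paper's exact definition may differ.

```python
import numpy as np

def regularized_rms_error(mu_dft, mu_ref, floor=1.0):
    """Regularized RMS relative error (%) of predicted dipole moments.

    The denominator max(|mu_ref|, floor) is an assumed regularization
    (floor in debye) that keeps tiny reference dipoles from inflating
    the relative error; the paper's exact definition may differ.
    """
    mu_dft = np.asarray(mu_dft, dtype=float)
    mu_ref = np.asarray(mu_ref, dtype=float)
    rel = (mu_dft - mu_ref) / np.maximum(np.abs(mu_ref), floor)
    return 100.0 * np.sqrt(np.mean(rel**2))

# Illustrative values (debye), not taken from the benchmark database.
mu_ref = [0.05, 1.85, 2.90, 4.10]
mu_dft = [0.08, 1.79, 3.05, 4.30]
print(f"regularized RMS error: {regularized_rms_error(mu_dft, mu_ref):.1f}%")
```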

    Radiative Transfer and Inversion codes for characterizing planetary atmospheres: an overview

    The study of planetary atmospheres is crucial for understanding the origin, evolution, and processes that shape celestial bodies such as planets, moons, and comets. The interpretation of planetary spectra requires a detailed understanding of radiative transfer (RT) and its application through computational codes. With the advancement of observations, atmospheric modelling, and inference techniques, diverse RT and retrieval codes have proliferated in planetary science. However, selecting the most suitable code for a given problem can be challenging. To address this issue, we present a mini-overview of the different RT and retrieval codes currently developed or available in the field of planetary atmospheres. This study serves as a valuable resource for the planetary science community by providing a clear and accessible list of codes, and offers a useful reference for researchers and practitioners in their selection and application of RT and retrieval codes for planetary atmospheric studies. Comment: 10 pages, 1 figure, published in Frontiers in Astronomy and Space Sciences. https://www.frontiersin.org/articles/10.3389/fspas.2023.117674

    On the usefulness of imprecise Bayesianism in chemical kinetics

    Bayesian methods are growing ever more popular in chemical kinetics. The reasons for this, and the general challenges related to kinetic parameter estimation, are briefly reviewed. Most authors content themselves with a single (usually uniform) prior distribution. The goal of this paper is to examine some serious issues this raises. The problems of confusing knowledge with ignorance and of reparametrisation are examined. The legitimacy of a probabilistic Ockham's razor is called into question. A synthetic example involving two reaction models was used to illustrate how merging the parameter-space volume and the model accuracy into a single number might be unwise. Robust Bayesian analysis appears to be a simple and straightforward way to avoid the problems mentioned throughout this article.
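    A minimal sketch of the robust (imprecise) Bayesian strategy recommended above, using a made-up first-order rate-constant problem rather than the paper's synthetic two-model example: the same data are analysed under a set of priors, and a conclusion is trusted only if it holds across all of them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up data: first-order decay c(t) = exp(-k t) with noise, true k = 0.5 /s.
t = np.linspace(0.0, 8.0, 9)
c_obs = np.exp(-0.5 * t) + rng.normal(0.0, 0.02, t.size)
sigma = 0.02

k_grid = np.linspace(0.01, 2.0, 4000)

def log_likelihood(k):
    resid = c_obs - np.exp(-k * t)
    return -0.5 * np.sum((resid / sigma) ** 2)

loglike = np.array([log_likelihood(k) for k in k_grid])

# A *set* of priors instead of a single one (robust Bayesian analysis):
priors = {
    "uniform in k":      np.ones_like(k_grid),
    "uniform in log k":  1.0 / k_grid,          # sensitive to reparametrisation
    "peaked at k = 1.0": np.exp(-0.5 * ((k_grid - 1.0) / 0.2) ** 2),
}

for name, prior in priors.items():
    post = prior * np.exp(loglike - loglike.max())
    post /= np.trapz(post, k_grid)
    mean = np.trapz(k_grid * post, k_grid)
    cdf = np.cumsum(post) * (k_grid[1] - k_grid[0])
    lo, hi = np.interp([0.025, 0.975], cdf, k_grid)
    print(f"{name:18s}: posterior mean k = {mean:.3f} /s, 95% interval [{lo:.3f}, {hi:.3f}]")
```

    If the intervals reported for the different priors disagree, the data alone do not settle the question, which is exactly the diagnostic the robust approach provides.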

    Atmospheric Retrieval: Bayesian Methods, Machine Learning, and Application to Exoplanets

    Atmospheric retrieval is the inverse modeling method in which atmospheric properties are constrained from measured spectra. Due to the low signal-to-noise ratios of exoplanet observations, exoplanetary retrieval codes pair a radiative transfer (RT) simulator with a Bayesian statistical framework in order to characterize the distribution of atmospheric parameters that could explain the observations (the posterior distribution). This requires on the order of 10⁶ RT model evaluations, which can take hours to days of compute time depending on model complexity. In this work, I investigate atmospheric retrieval methods and apply them to observations of hot Jupiters. Chapter 2 presents a set of RT and retrieval tests to validate the Bayesian Atmospheric Radiative Transfer (BART) retrieval code and applies BART to the emission spectrum of HD 189733 b. Chapter 3 investigates the dayside atmosphere of WASP-12b and resolves a tension in the literature over its composition. Chapter 4 introduces a machine-learning direct retrieval framework which spawns virtual machines, generates spectra, trains neural networks, and performs atmospheric retrievals using the trained neural networks. Chapter 5 builds on this and presents a machine-learning indirect retrieval method, where the retrieval is performed using a neural network surrogate model for RT within a Bayesian framework, and compares it with BART. Chapter 6 applies the neural network surrogate modeling approach to thermochemical equilibrium chemistry models and compares it with other equilibrium estimation methods. Appendices address retrieval errors induced by the choice of wavenumber gridding for opacity-sampling RT schemes, neural network model selection, the effects of data set size on neural network training, and the accuracy of Bayesian frameworks used for atmospheric retrieval.
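    A minimal sketch of the surrogate-within-a-Bayesian-framework idea of Chapter 5, with everything invented for illustration (the toy forward model, the two parameters, and the polynomial emulator are not BART's): an expensive forward model is queried a few hundred times to train a cheap regressor, which then stands in for RT during the many likelihood evaluations of an MCMC retrieval.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for an expensive RT forward model: maps (T, log10 H2O) to a
# tiny "spectrum" of 5 channels. Entirely made up for illustration.
def forward_model(theta):
    T, logx = theta
    wl = np.linspace(1.1, 1.7, 5)                  # micron
    return 1.46e-2 + 3e-4 * np.sin(8 * wl) * (logx + 5) + 2e-6 * (T - 1200)

# 1) Train a cheap surrogate on a modest number of expensive evaluations.
train_theta = np.column_stack([rng.uniform(800, 2000, 400),
                               rng.uniform(-8, -2, 400)])
train_spec = np.array([forward_model(th) for th in train_theta])

def design(theta):                                 # quadratic polynomial features
    T, logx = np.atleast_2d(theta).T
    return np.column_stack([np.ones_like(T), T, logx, T * logx, T**2, logx**2])

coef, *_ = np.linalg.lstsq(design(train_theta), train_spec, rcond=None)
surrogate = lambda theta: design(theta) @ coef     # cheap replacement for RT

# 2) Metropolis-Hastings retrieval using the surrogate inside the likelihood.
obs = forward_model([1450.0, -3.5]) + rng.normal(0, 2e-5, 5)
sigma = 2e-5

def log_post(theta):
    if not (800 < theta[0] < 2000 and -8 < theta[1] < -2):   # uniform priors
        return -np.inf
    resid = obs - surrogate(theta).ravel()
    return -0.5 * np.sum((resid / sigma) ** 2)

theta, lp = np.array([1200.0, -5.0]), -np.inf
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [20.0, 0.1])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])
print("posterior T    = %.0f +/- %.0f K" % (chain[:, 0].mean(), chain[:, 0].std()))
print("posterior logX = %.2f +/- %.2f" % (chain[:, 1].mean(), chain[:, 1].std()))
```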

    Development and testing of new exchange correlation functionals


    Machine learning and uncertainty quantification framework for predictive ab initio Hypersonics

    Hypersonics represents one of the most challenging applications for predictive science. Owing to their multi-scale and multi-physics character, high-Mach phenomena are generally complex from both the computational and the experimental perspectives. Nevertheless, the related simulations typically require high accuracy, as their outcomes inform design and decision-making processes in safety-critical applications. Ab initio approaches aim to improve predictive accuracy by making the calculations free from empiricism. To achieve this goal, these methodologies move the computational resolution down to the interatomic level by relying on first-principles quantum physics. As side effects, the increase in model complexity also results in: i) more physics that could potentially be misrepresented and ii) a dramatic inflation of the computational cost. This thesis leverages machine learning (ML), uncertainty quantification (UQ), data science, and reduced order models (ROMs) to tackle these downsides and improve the predictive capabilities of ab initio Hypersonics.

    The first part of the manuscript focuses on formulating and testing a systematic approach to the reliability assessment of ML-based models based on their non-deterministic extensions. In particular, it introduces a novel methodology for the quantification of uncertainties associated with potential energy surfaces (PESs) computed from first-principles quantum mechanical calculations. The methodology relies on Bayesian inference and ML techniques to construct a stochastic PES and to express the inadequacies associated with the ab initio data points and their fit. The resulting stochastic surface is efficiently forward-propagated via quasi-classical trajectory (QCT) and master equation calculations by combining high-fidelity calculations and reduced order modeling. In this way, the PES contribution to the uncertainty on predefined quantities of interest (QoIs) is explicitly determined. This study is done at both microscopic (e.g., rovibrational-specific rate coefficients) and macroscopic (e.g., thermal and chemical relaxation properties) levels. A correlation analysis is finally applied to identify the PES regions that require further refinement, based on their effects on QoI reliability. The methodology is applied to the study of the singlet (1¹A') and quintet (2⁵A') PESs describing the interaction between O₂ molecules and O atoms in their ground electronic state. The investigation of the singlet surface reveals negligible uncertainty in the kinetic properties and relaxation times, which are found to be in excellent agreement with the ones previously published in the literature. On the other hand, the methodology reveals significant uncertainty in the quintet surface due to inaccuracies in the description of the exchange barrier and the repulsive wall. When forward-propagated, this uncertainty is responsible for variability of one order of magnitude in the vibrational relaxation time and of a factor of four in the exchange reaction rate coefficient, both at 2,500 K.
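    A schematic sketch of the first part's forward-propagation step, with a made-up stand-in for the stochastic PES and for the QCT/master-equation machinery: uncertain surface features are sampled, pushed through a simple Arrhenius-like rate expression, and a correlation analysis indicates which feature drives the spread in the quantity of interest. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
kB = 8.617333e-5             # eV/K

# Made-up surrogate for a stochastic PES: two uncertain features of the
# surface, an exchange-barrier height and a repulsive-wall stiffness,
# each represented by a mean value and a (fit-inadequacy) std. dev.
n_samples = 5000
barrier = rng.normal(0.30, 0.05, n_samples)         # eV
wall    = rng.normal(2.00, 0.20, n_samples)         # arbitrary stiffness units

# Made-up QoI model standing in for QCT + master-equation propagation:
# an Arrhenius-like exchange rate whose prefactor softens with the wall.
T = 2500.0                                           # K
rate = 1.0e-10 * np.exp(-barrier / (kB * T)) / np.sqrt(wall)   # cm^3/s

spread = rate.max() / rate.min()
print(f"rate at {T:.0f} K: median {np.median(rate):.2e} cm^3/s, spread x{spread:.1f}")

# Correlation analysis: which uncertain PES feature drives the QoI spread?
for name, x in (("barrier", barrier), ("repulsive wall", wall)):
    r = np.corrcoef(x, np.log(rate))[0, 1]
    print(f"corr(log rate, {name:14s}) = {r:+.2f}")
```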
    The second part of this thesis presents a data-informed and physics-driven coarse-graining strategy aimed at reducing the computational cost of ab initio simulations. First, an in-depth discussion of the physics governing the non-equilibrium dissociation of O₂ molecules colliding with O atoms is presented. A rovibrationally resolved database for all of the elementary collisional processes is constructed by including all nine adiabatic electronic states of O₃ in the QCT calculations. A detailed analysis of the ab initio data set reveals that, for a rovibrational level, the probability of dissociating is mostly dictated by its deficit in internal energy with respect to the centrifugal barrier. Owing to the assumption of rotational equilibrium, conventional vibrational-specific calculations fail to characterize such a dependence, and the new ROM strategy is proposed based on this observation. By relying on a hybrid technique made of rovibrationally resolved excitation coupled to coarse-grained dissociation, the novel approach is compared to the vibrational-specific model and to the direct solution of the rovibrational state-to-state master equation. Simulations are performed in a zero-dimensional isothermal and isochoric chemical reactor for a wide range of temperatures (1,500-20,000 K). The study shows that the main contribution to the model inadequacy of vibrational-specific approaches originates from their inability to characterize dissociation, rather than the energy transfers. Even when constructed with only twenty groups, at only 20% of the original computational cost, the new reduced order model outperforms the vibrational-specific one in predicting all of the QoIs related to dissociation kinetics. At the highest temperature, the accuracy in the mole fraction is improved by 2,000%.
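    A schematic sketch of the grouping idea behind the second part's reduced order model, with fabricated level data: rovibrational levels are binned by their internal-energy deficit relative to the centrifugal barrier rather than by vibrational quantum number, and group-averaged dissociation rates are formed with Boltzmann weights inside each bin. Energies, barriers, and rates below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
kB = 8.617333e-5             # eV/K

# Fabricated rovibrational levels of a diatomic: internal energy,
# centrifugal barrier, and a state-specific dissociation rate that (as
# observed in the ab initio data set) depends mainly on the deficit
# between the level's energy and its centrifugal barrier.
n_levels = 6000
e_int   = rng.uniform(0.0, 5.2, n_levels)             # eV
barrier = 5.2 + 0.4 * rng.random(n_levels)            # eV (includes rotation)
deficit = barrier - e_int
k_diss  = 1.0e-10 * np.exp(-deficit / 0.35)           # cm^3/s, toy model

# Coarse-graining: bin levels by energy deficit (not by vibrational
# quantum number) into a small number of groups.
n_groups = 20
edges = np.linspace(deficit.min(), deficit.max(), n_groups + 1)
group = np.clip(np.digitize(deficit, edges) - 1, 0, n_groups - 1)

# Group-averaged dissociation rates with Boltzmann weights inside each bin.
T = 10000.0
w = np.exp(-e_int / (kB * T))
k_group = np.array([np.average(k_diss[group == g], weights=w[group == g])
                    for g in range(n_groups)])

print("group-averaged dissociation rates (cm^3/s):")
print(np.array2string(k_group, precision=2))
```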

    Strict Upper Limits on the Carbon-to-Oxygen Ratios of Eight Hot Jupiters from Self-Consistent Atmospheric Retrieval

    The elemental compositions of hot Jupiters are informative relics of planet formation that can help us answer long-standing questions regarding the origin and formation of giant planets. Here, I present the main conclusions from a comprehensive atmospheric retrieval survey of eight hot Jupiters with detectable molecular absorption in their near-infrared transmission spectra. I analyze the eight transmission spectra using the newly developed, self-consistent atmospheric retrieval framework SCARLET. Unlike previous methods, SCARLET combines the physical and chemical consistency of complex atmospheric models with the statistical treatment of observational uncertainties known from atmospheric retrieval techniques. I find that all eight hot Jupiters consistently require carbon-to-oxygen ratios (C/O) below 0.9. The finding of C/O < 0.9 is highly robust for HD209458b, WASP-12b, WASP-19b, HAT-P-1b, and XO-1b. For HD189733b, WASP-17b, and WASP-43b, I find that the published WFC3 transmission spectra favor C/O < 0.9 at greater than 95% confidence. I further show that the water abundances on all eight hot Jupiters are consistent with solar composition. The relatively small depth of the detected water absorption features is due to the presence of clouds, not due to a low water abundance as previously suggested for HD209458b. The presence of a thick cloud deck is inferred for HD209458b and WASP-12b. HD189733b may host a similar cloud deck, rather than the previously suggested Rayleigh hazes, if star spots affect the observed spectrum. The approach taken in SCARLET can be regarded as a new pathway to interpreting spectral observations of planetary atmospheres. In this work, including our prior knowledge of H-C-N-O chemistry enables me to constrain the C/O ratio without detecting a single carbon-bearing molecule. Comment: under review at ApJ; updated to account for recently announced observations of WASP-12b and HD 209458
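    The closing claim, constraining C/O without detecting a carbon-bearing species, can be illustrated with a deliberately simplified equilibrium argument that is not SCARLET's chemistry model: in a hot, hydrogen-dominated atmosphere nearly all carbon is sequestered in CO, so the oxygen left over for H2O scales roughly as 1 - C/O, and a measured water abundance therefore bounds C/O. All abundances below are illustrative.

```python
import numpy as np

# Toy H-C-N-O equilibrium argument for a hot, H2-dominated atmosphere:
# nearly all C sits in CO, and the oxygen left over goes to H2O, so
#   n(H2O) ~ n(O) - n(C) = n(O) * (1 - C/O)        for C/O < 1
o_abundance = 1.0e-3             # illustrative O abundance relative to H2

def h2o_mixing_ratio(c_to_o, o_to_h2=o_abundance):
    """Very rough H2O/H2 ratio under the CO-dominated carbon assumption."""
    return o_to_h2 * np.clip(1.0 - c_to_o, 0.0, None)

# A hypothetical retrieved water abundance with its 1-sigma range
# immediately translates into limits on C/O:
x_h2o_lo, x_h2o_hi = 2.0e-4, 8.0e-4
c_to_o = np.linspace(0.0, 1.2, 1201)
x = h2o_mixing_ratio(c_to_o)
allowed = (x >= x_h2o_lo) & (x <= x_h2o_hi)
print(f"C/O consistent with the water measurement: "
      f"{c_to_o[allowed].min():.2f} - {c_to_o[allowed].max():.2f}")
```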