
    Statistical Methodologies to Integrate Experimental and Computational Research

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
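
    A minimal sketch of the response-surface idea discussed above: fit a second-order polynomial surface to responses measured on a two-factor designed experiment. The design, factors, and response values below are hypothetical placeholders, not data from the coaxial free jet experiment.

        # Response surface methodology sketch (assumed two-factor example).
        import numpy as np

        # 3x3 full-factorial design in coded units (-1, 0, +1) for two
        # hypothetical factors, e.g. pressure ratio and temperature ratio.
        levels = [-1.0, 0.0, 1.0]
        X = np.array([(a, b) for a in levels for b in levels])

        # Hypothetical measured responses at the nine design points.
        y = np.array([4.1, 4.8, 5.9, 4.5, 5.6, 6.8, 5.2, 6.5, 8.1])

        # Fit the quadratic model
        #   y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        # by ordinary least squares.
        A = np.column_stack([
            np.ones(len(X)), X[:, 0], X[:, 1],
            X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2,
        ])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("fitted surface coefficients:", coef)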

    Monte Carlo Methods for Estimating Interfacial Free Energies and Line Tensions

    Excess contributions to the free energy due to interfaces occur for many problems encountered in the statistical physics of condensed matter when coexistence between different phases is possible (e.g. wetting phenomena, nucleation, crystal growth, etc.). This article reviews two methods to estimate both interfacial free energies and line tensions by Monte Carlo simulations of simple models (e.g. the Ising model, a symmetrical binary Lennard-Jones fluid exhibiting a miscibility gap, and a simple Lennard-Jones fluid). One method is based on thermodynamic integration. This method is useful for studying flat and inclined interfaces on Ising lattices, and also allows the estimation of line tensions of three-phase contact lines where the interfaces meet walls (on which "surface fields" may act). A generalization to off-lattice systems is described as well. The second method is based on sampling the order parameter distribution of the system throughout the two-phase coexistence region of the model. Both the interfacial free energies of flat interfaces and those of (spherical or cylindrical) droplets (or bubbles) can be estimated, including systems with walls, where sphere-cap-shaped wall-attached droplets occur. The curvature dependence of the interfacial free energy is discussed, and estimates for the line tensions are compared to results from the thermodynamic integration method. Basic limitations of all these methods are critically discussed, and an outlook on other approaches is given.
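
    As a toy illustration of the second (order-parameter sampling) route, the sketch below extracts an interfacial tension from a synthetic double-peaked order-parameter distribution; the lattice size, temperature, and distribution are assumed stand-ins for real sampling output, not results from the review.

        # Interfacial tension from an order-parameter distribution (toy data).
        import numpy as np

        L = 32          # linear lattice size (assumed)
        beta = 0.5      # inverse temperature 1/(kB*T), reduced units (assumed)

        # Synthetic double-peaked distribution P(m) standing in for histogram
        # output of e.g. multicanonical or successive-umbrella sampling.
        m = np.linspace(-1.0, 1.0, 401)
        P = np.exp(-beta * 2 * L * 0.1 * (1.0 - m**2))   # peaks at m = +/-1

        # In a periodic box the slab configurations near m = 0 contain two
        # interfaces of length L, so ln(P_max / P_min) = beta * 2 * sigma * L.
        sigma = np.log(P.max() / P[np.abs(m) < 0.05].min()) / (beta * 2 * L)
        print(f"estimated interfacial tension (reduced units): {sigma:.4f}")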

    Quantitative Integration of Multiple Near-Surface Geophysical Techniques for Improved Subsurface Imaging and Reducing Uncertainty in Discrete Anomaly Detection

    Currently there is no systematic quantitative methodology in place for the integration of two or more coincident data sets collected using near-surface geophysical techniques. As the need for this type of methodology increases (particularly in archaeological prospecting, UXO detection, landmine detection, environmental site characterization/remediation monitoring, and forensics), a detailed and refined approach is necessary. The objective of this dissertation is to investigate quantitative techniques for integrating multi-tool near-surface geophysical data to improve subsurface imaging and reduce uncertainty in discrete anomaly detection. This objective is fulfilled by: (1) correlating multi-tool geophysical data with existing well-characterized "targets"; (2) developing methods for quantitatively merging different geophysical data sets; (3) implementing statistical tools within Statistical Analysis System (SAS) to evaluate the multiple integration methodologies; and (4) testing these new methods at several well-characterized sites with varied targets (i.e., case studies). The three geophysical techniques utilized in this research are: ground penetrating radar (GPR), electromagnetic (ground conductivity) methods (EM), and magnetic gradiometry. Computer simulations are developed to generate synthetic data with expected parameters such as heterogeneity of the subsurface, type of target, and spatial sampling. The synthetic data sets are integrated using the same methodologies employed on the case-study sites to (a) further develop the necessary quantitative assessment scheme, and (b) determine if these merged data sets do in fact yield improved results. A controlled setting within The University of Tennessee Geophysical Research Station permits the data (and associated anomalous bodies) to be spatially correlated with the locations of known targets. Error analysis is then conducted to guide any modifications to the data integration methodologies before transitioning to study sites of unknown subsurface features. Statistical analysis utilizing SAS is conducted to quantitatively evaluate the effectiveness of the data integration methodologies and determine if there are significant improvements in subsurface imaging, thus resulting in a reduction in the uncertainty of discrete anomaly detection.
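
    One simple example of the kind of quantitative merging investigated here is z-score normalisation of coincident grids followed by a weighted sum. The sketch below applies that recipe to synthetic GPR, EM, and magnetic grids; the weights and detection threshold are illustrative assumptions, not the dissertation's evaluated methodologies.

        # Fuse three coincident geophysical grids into one anomaly map.
        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic grids standing in for GPR amplitude, EM apparent
        # conductivity, and magnetic gradient over the same survey area.
        shape = (50, 50)
        gpr, em, mag = (rng.normal(size=shape) for _ in range(3))

        # Bury a common target so all three methods "see" the same anomaly.
        for grid in (gpr, em, mag):
            grid[20:25, 30:35] += 3.0

        def zscore(g):
            """Normalise a grid to zero mean, unit variance (comparable units)."""
            return (g - g.mean()) / g.std()

        # Weighted composite anomaly map; equal weights assumed here.
        weights = (1/3, 1/3, 1/3)
        composite = sum(w * zscore(g) for w, g in zip(weights, (gpr, em, mag)))

        # Flag cells exceeding two standard deviations of the composite.
        detections = composite > 2 * composite.std()
        print("flagged cells:", int(detections.sum()))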

    Spectral Network (SpecNet)—What is it and why do we need it?

    Effective integration of optical remote sensing with flux measurements across multiple scales is essential for understanding global patterns of surface–atmosphere fluxes of carbon and water vapor. SpecNet (Spectral Network) is an international network of cooperating investigators and sites linking optical measurements with flux sampling for the purpose of improving our understanding of the controls on these fluxes. An additional goal is to characterize disturbance impacts on surface–atmosphere fluxes. To reach these goals, key SpecNet objectives include the exploration of scaling issues, development of novel sampling tools, standardization and intercomparison of sampling methods, development of models and statistical methods that relate optical sampling to fluxes, exploration of component fluxes, validation of satellite products, and development of an informatics approach that integrates disparate data sources across scales. Examples of these themes are summarized in this review.
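
    As a toy example of a statistical model relating optical sampling to fluxes, the sketch below regresses a synthetic carbon flux on a vegetation index; the variable names and numbers are hypothetical placeholders, not SpecNet data.

        # Relate an optical index to a flux via simple linear regression.
        import numpy as np

        rng = np.random.default_rng(1)

        ndvi = rng.uniform(0.2, 0.9, size=60)              # optical samples
        gpp = 12.0 * ndvi + rng.normal(0, 0.8, size=60)    # synthetic flux

        # Ordinary least squares fit: gpp ~ slope * ndvi + intercept.
        slope, intercept = np.polyfit(ndvi, gpp, deg=1)
        print(f"GPP ~ {slope:.2f} * NDVI + {intercept:.2f}")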

    Principal component and Voronoi skeleton alternatives for curve reconstruction from noisy point sets

    Surface reconstruction from noisy point samples must take into consideration the stochastic nature of the sample. In other words, geometric algorithms reconstructing the surface or curve should not insist on following each sampled point literally. Instead, they must interpret the sample as a "point cloud" and try to build the surface as passing through the best possible (in the statistical sense) geometric locus that represents the sample. This work presents two new methods to find a piecewise-linear approximation from a Nyquist-compliant stochastic sampling of a quasi-planar C^1 curve C(u): R → R^3 whose velocity vector never vanishes. One of the methods articulates, in an entirely new way, Principal Component Analysis (statistical) and Voronoi-Delaunay (deterministic) approaches. It uses these two methods to calculate the best possible tape-shaped polygon covering the planarised point set, and then approximates the manifold by the medial axis of such a polygon. The other method applies Principal Component Analysis to find a direct piecewise-linear approximation of C(u). A complexity comparison of these two methods is presented, along with a qualitative comparison with previously developed ones. It turns out that the method solely based on Principal Component Analysis is simpler and more robust for non-self-intersecting curves. For self-intersecting curves the Voronoi-Delaunay-based medial axis approach is more robust, at the price of higher computational complexity. An application is presented in the integration of meshes originating in range images of an art piece; such an application reaches the point of complete reconstruction of a unified mesh.
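
    The core of the purely PCA-based route can be sketched as follows: project the point cloud onto its first principal direction, bin the points along that direction, and join the bin centroids into a piecewise-linear approximant. This is a simplified reading of the idea, run on a synthetic noisy arc; the paper's actual algorithm is more elaborate.

        # PCA-based piecewise-linear curve fit (simplified, assumed example).
        import numpy as np

        rng = np.random.default_rng(2)

        # Noisy sample of a gentle arc, a stand-in for a quasi-planar curve
        # that does not self-intersect.
        t = np.sort(rng.uniform(0, np.pi, 400))
        pts = np.column_stack([np.cos(t), 0.3 * np.sin(t)])
        pts += rng.normal(scale=0.01, size=pts.shape)

        # First principal direction of the cloud via SVD of centred points.
        centred = pts - pts.mean(axis=0)
        _, _, vt = np.linalg.svd(centred, full_matrices=False)
        axis = vt[0]

        # Parametrise each point by its projection onto the principal axis,
        # split the ordered points into consecutive bins, and join the bin
        # centroids: the resulting polyline is the PL approximant.
        proj = centred @ axis
        order = np.argsort(proj)
        bins = np.array_split(order, 20)
        polyline = np.array([pts[b].mean(axis=0) for b in bins])
        print("PL approximation with", len(polyline), "vertices")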

    Ab initio statistical mechanics of surface adsorption and desorption: II. Nuclear quantum effects

    We show how the path-integral formulation of quantum statistical mechanics can be used to construct practical ab initio techniques for computing the chemical potential of molecules adsorbed on surfaces, with full inclusion of quantum nuclear effects. The techniques we describe are based on the computation of the potential of mean force on a chosen molecule, and generalise the techniques developed recently for classical nuclei. We present practical calculations based on density functional theory with a generalised-gradient exchange-correlation functional for the case of H2O on the MgO (001) surface at low coverage. We note that the very high vibrational frequencies of the H2O molecule would normally require very large numbers of time slices (beads) in path-integral calculations, but we show that this requirement can be dramatically reduced by employing the idea of thermodynamic integration with respect to the number of beads. The validity and correctness of our path-integral calculations on the H2O/MgO (001) system are demonstrated by supporting calculations on a set of simple model systems for which quantum contributions to the free energy are known exactly from analytic arguments.
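
    The bead-number trick relies on the free energy being a smooth, rapidly converging function of the number of beads P. A minimal check on the exactly solvable path-integral harmonic oscillator (an assumed stand-in for the paper's DFT system, in reduced units) shows the characteristic 1/P^2 convergence that makes varying the bead number a well-behaved thermodynamic-integration coordinate.

        # Free energy of the P-bead harmonic oscillator vs. bead number.
        # Reduced units: kB = hbar = m = 1 (assumed model parameters).
        import numpy as np

        beta, omega = 8.0, 3.0      # low temperature, stiff mode (assumed)

        def free_energy(P):
            """Exact free energy of the P-bead path-integral oscillator."""
            theta = np.arcsinh(beta * omega / (2 * P))
            return np.log(2 * np.sinh(P * theta)) / beta

        # P -> infinity recovers the exact quantum result.
        exact = np.log(2 * np.sinh(beta * omega / 2)) / beta
        for P in (1, 2, 4, 8, 16, 32, 64):
            err = free_energy(P) - exact     # shrinks roughly as 1/P^2
            print(f"P={P:3d}  F_P={free_energy(P):.6f}  error={err:+.2e}")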

    Hybrid methods in planetesimal dynamics: Formation of protoplanetary systems and the mill condition

    The formation and evolution of protoplanetary discs remains a challenge from both a theoretical and numerical standpoint. In this work we first perform a series of tests of our new hybrid algorithm presented in Glaschke, Amaro-Seoane and Spurzem 2011 (henceforth Paper I) that combines the advantages of the high accuracy of direct-summation N-body methods with a statistical description of the planetesimal disc based on Fokker-Planck techniques. We then address the formation of planets, with a focus on the formation of protoplanets out of planetesimals. We find that the evolution of the system is driven by encounters as well as direct collisions and requires a careful modelling of the evolution of the velocity dispersion and the size distribution over a large range of sizes. The simulations show no termination of the protoplanetary accretion due to gap formation, since the distribution of the planetesimals is subject only to small fluctuations. We also show that these features are weakly correlated with the positions of the protoplanets. The exploration of different impact strengths indicates that fragmentation mainly controls the overall mass loss, which is less pronounced during the early runaway growth. We prove that fragmentation, in combination with the effective removal of collisional fragments by gas drag, sets a universal upper limit on the protoplanetary mass as a function of the distance to the host star, which we refer to as the mill condition.

    Ab initio statistical mechanics of surface adsorption and desorption: I. H2O on MgO (001) at low coverage

    We present a general computational scheme based on molecular dynamics (m.d.) simulation for calculating the chemical potential of adsorbed molecules in thermal equilibrium on the surface of a material. The scheme is based on the calculation of the mean force in m.d. simulations in which the height of a chosen molecule above the surface is constrained, and subsequent integration of the mean force to obtain the potential of mean force and hence the chemical potential. The scheme is valid at any coverage and temperature, so that in principle it allows the calculation of the chemical potential as a function of coverage and temperature. It avoids all statistical mechanical approximations, except for the use of classical statistical mechanics for the nuclei, and assumes nothing in advance about the adsorption sites. From the chemical potential, the absolute desorption rate of the molecules can be computed, provided the equilibration rate on the surface is faster than the desorption rate. We apply the theory by ab initio m.d. simulation to the case of H2O on MgO (001) in the low-coverage limit, using the Perdew-Burke-Ernzerhof (PBE) form of exchange-correlation. The calculations yield an ab initio value of the Polanyi-Wigner frequency prefactor, which is more than two orders of magnitude greater than the value of 10^13 s^-1 often assumed in the past. Provisional comparison with experiment suggests that the PBE adsorption energy may be too low, but the extension of the calculations to higher coverages is needed before firm conclusions can be drawn. The possibility of including quantum nuclear effects by using path-integral simulations is noted.
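
    The scheme's central step, integrating the constrained mean force over height to obtain the potential of mean force, can be sketched with synthetic data; the Morse-like force profile below is a placeholder, not output from the paper's ab initio simulations.

        # Potential of mean force by integrating constrained mean forces.
        import numpy as np

        # Constrained heights above the surface (Angstrom, assumed) and the
        # mean force at each height, generated here from a toy Morse well.
        z = np.linspace(1.5, 8.0, 40)
        D, a, z0 = 0.6, 1.2, 2.2      # toy well depth/width/position (assumed)
        potential = D * (1 - np.exp(-a * (z - z0)))**2 - D
        mean_force = -np.gradient(potential, z)   # stand-in for m.d. output

        # PMF(z) = -integral of the mean force; trapezoid rule, with the
        # gas-phase plateau (largest z) taken as the zero of the PMF.
        pmf = -np.concatenate(([0.0], np.cumsum(
            0.5 * (mean_force[1:] + mean_force[:-1]) * np.diff(z))))
        pmf -= pmf[-1]

        print(f"adsorption well depth ~ {pmf.min():.3f} eV (toy input: {-D} eV)")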