
    Density of States for a Specified Correlation Function and the Energy Landscape

    The degeneracy of two-phase disordered microstructures consistent with a specified correlation function is analyzed by mapping it to a ground-state degeneracy. We determine for the first time the associated density of states via a Monte Carlo algorithm. Our results are described in terms of the roughness of the energy landscape, defined on a hypercubic configuration space. The use of a Hamming distance in this space enables us to define a roughness metric, which is calculated from the correlation function alone and related quantitatively to the structural degeneracy. This relation is validated for a wide variety of disordered systems. Comment: Accepted for publication in Physical Review Letters
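
    As a hedged, self-contained illustration of the central objects (binary two-phase configurations, a shared two-point correlation function, and Hamming distances between degenerate configurations), the Python sketch below enumerates small 1D microstructures. The grouping key, grid size, and spread measure are illustrative choices of ours, not the paper's Monte Carlo algorithm or roughness metric:

        import numpy as np
        from itertools import combinations

        def two_point_correlation(config):
            # S2(r) for a periodic 1D two-phase (0/1) configuration:
            # probability that two sites separated by r are both phase 1
            n = len(config)
            return np.array([np.mean(config * np.roll(config, r)) for r in range(n)])

        def hamming(a, b):
            # Hamming distance on the hypercubic configuration space
            return int(np.count_nonzero(a != b))

        n = 10
        configs = [np.array(bits) for bits in np.ndindex(*([2] * n))]

        # Group configurations by their correlation function; the size of each
        # group is the structural degeneracy of that correlation function.
        groups = {}
        for c in configs:
            key = tuple(np.round(two_point_correlation(c), 10))
            groups.setdefault(key, []).append(c)

        # Crude spread measure for the most degenerate groups: the mean pairwise
        # Hamming distance between configurations sharing the same S2
        for key, members in sorted(groups.items(), key=lambda kv: -len(kv[1]))[:3]:
            dists = [hamming(a, b) for a, b in combinations(members, 2)] or [0]
            print(len(members), float(np.mean(dists)))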

    Pointwise consistency of the kriging predictor with known mean and covariance functions

    This paper deals with several issues related to the pointwise consistency of the kriging predictor when the mean and the covariance functions are known. These questions are of general importance in the context of computer experiments. The analysis is based on the properties of approximations in reproducing kernel Hilbert spaces. In particular, we correct an erroneous claim of Yakowitz and Szidarovszky (J. Multivariate Analysis, 1985) that, under some assumptions, the kriging predictor is pointwise consistent for all continuous sample paths. Comment: Submitted to mODa9 (the Model-Oriented Data Analysis and Optimum Design Conference), 14th-19th June 2010, Bertinoro, Italy
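
    For readers unfamiliar with the object under study, here is a minimal sketch of the kriging predictor with known zero mean and known covariance function; the Matérn-3/2 kernel and the test data are arbitrary stand-ins, not choices made in the paper:

        import numpy as np

        def matern32(x, y, sigma2=1.0, rho=0.3):
            # Matern-3/2 covariance, one common choice of known covariance function
            h = np.abs(x[:, None] - y[None, :]) / rho
            return sigma2 * (1.0 + np.sqrt(3.0) * h) * np.exp(-np.sqrt(3.0) * h)

        def simple_kriging(x_obs, y_obs, x_new, cov=matern32):
            # Simple kriging with known zero mean: prediction is k(x_new, X) K^{-1} y
            K = cov(x_obs, x_obs)
            k = cov(x_new, x_obs)
            return k @ np.linalg.solve(K, y_obs)

        x_obs = np.linspace(0.0, 1.0, 8)
        y_obs = np.sin(2.0 * np.pi * x_obs)
        print(simple_kriging(x_obs, y_obs, np.array([0.33, 0.71])))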

    Means and covariance functions for geostatistical compositional data: an axiomatic approach

    This work focuses on the characterization of the central tendency of a sample of compositional data. It provides new results about theoretical properties of means and covariance functions for compositional data, from an axiomatic perspective. Original results that shed new light on the geostatistical modeling of compositional data are presented. As a first result, it is shown that the weighted arithmetic mean is the only central tendency characteristic satisfying a small set of axioms, namely continuity, reflexivity and marginal stability. Moreover, this set of axioms also implies that the weights must be identical for all parts of the composition. This result has deep consequences for the spatial multivariate covariance modeling of compositional data. In a geostatistical setting, it is shown as a second result that the proportional model of covariance functions (i.e., the product of a covariance matrix and a single correlation function) is the only model that provides identical kriging weights for all components of the compositional data. As a consequence of these two results, the proportional model of covariance functions is the only covariance model compatible with reflexivity and marginal stability.
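
    Written out for clarity, the proportional model referenced above (often called the intrinsic correlation model) takes the form below; the notation is ours, not the paper's:

        \mathbf{C}(h) = \rho(h)\,\boldsymbol{\Sigma},

    where $\rho(h)$ is a single spatial correlation function shared by all parts and $\boldsymbol{\Sigma}$ is a fixed covariance matrix between the parts of the composition. Because every component carries the same correlation structure, cokriging yields the same weights for each part, which is exactly the property characterized in the abstract.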

    Sequential design of computer experiments for the estimation of a probability of failure

    This paper deals with the problem of estimating the volume of the excursion set of a function $f:\mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of $f$ and aim at performing evaluations of $f$ as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure. Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
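
    As a hedged sketch of the quantity being estimated (not of the SUR strategies themselves): given a fitted Gaussian process posterior with mean m and standard deviation s, the posterior expectation of the excursion volume averages the pointwise excursion probability over the input distribution. The posterior functions and threshold below are made-up stand-ins:

        import numpy as np
        from scipy.stats import norm

        # Made-up stand-ins for a fitted GP posterior over f (mean and std at x)
        def post_mean(x):
            return np.sin(2.0 * np.pi * x)

        def post_std(x):
            return 0.1 + 0.05 * np.cos(2.0 * np.pi * x) ** 2

        u = 0.8                                   # failure threshold
        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 1.0, 100_000)        # draws from the known input measure

        # Posterior expected probability of failure P(f(X) > u):
        # average the excursion probability Phi((m(x) - u) / s(x)) over the draws
        alpha_hat = np.mean(norm.cdf((post_mean(x) - u) / post_std(x)))
        print(alpha_hat)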

    Constraining the initial state granularity with bulk observables in Au+Au collisions at $\sqrt{s_{\rm NN}}=200$ GeV

    In this paper we conduct a systematic study of the granularity of the initial state of hot and dense QCD matter produced in ultra-relativistic heavy-ion collisions and its influence on bulk observables like particle yields, $m_T$ spectra and elliptic flow. For our investigation we use a hybrid transport model, based on (3+1)d hydrodynamics and a microscopic Boltzmann transport approach. The initial conditions are generated by a non-equilibrium hadronic transport approach and the size of their fluctuations can be adjusted by defining a Gaussian smoothing parameter $\sigma$. The dependence of the hydrodynamic evolution on the choices of $\sigma$ and $t_{\rm start}$ is explored by means of a Gaussian emulator. To generate particle yields and elliptic flow that are compatible with experimental data the initial state parameters are constrained to be $\sigma=1$ fm and $t_{\rm start}=0.5$ fm. In addition, the influence of changes in the equation of state is studied and the results of our event-by-event calculations are compared to a calculation with averaged initial conditions. We conclude that even though the initial state parameters can be constrained by yields and elliptic flow, the granularity needs to be constrained by other correlation and fluctuation observables. Comment: 14 pages, 8 figures, updated references, version to appear in J. Phys.
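
    To make the role of the Gaussian smoothing parameter concrete, here is a hedged Python sketch of the generic idea: point-like sources are smeared into a smooth transverse profile with a Gaussian of width $\sigma$. The source positions, grid, and normalization are illustrative, not the hybrid model's actual initialization:

        import numpy as np

        sigma = 1.0                                    # smoothing width (fm)
        rng = np.random.default_rng(1)
        sources = rng.normal(0.0, 3.0, size=(50, 2))   # transverse positions (fm)

        grid = np.linspace(-10.0, 10.0, 200)
        X, Y = np.meshgrid(grid, grid)

        # Deposit a normalized 2D Gaussian of width sigma at each source: a small
        # sigma keeps the event-by-event granularity, a large sigma washes it out.
        density = np.zeros_like(X)
        for x0, y0 in sources:
            density += np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2.0 * sigma ** 2))
        density /= 2.0 * np.pi * sigma ** 2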

    Generalized Whittle-Matérn random field as a model of correlated fluctuations

    This paper considers a generalization of the Gaussian random field with covariance function of the Whittle-Matérn family. Such a random field can be obtained as the solution to the fractional stochastic differential equation with two fractional orders. Asymptotic properties of the covariance functions belonging to this generalized Whittle-Matérn family are studied, which are used to deduce the sample path properties of the random field. The Whittle-Matérn field has been widely used in modeling geostatistical data such as sea beam data, wind speed, field temperature and soil data. In this article we show that the generalized Whittle-Matérn field provides a more flexible model for wind speed data. Comment: 22 pages, 10 figures, accepted by Journal of Physics
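
    For reference, the standard Whittle-Matérn covariance that this family generalizes can be written as follows (our notation; the paper's two-fractional-order generalization modifies the corresponding spectral density):

        C(r) = \sigma^2 \, \frac{2^{1-\nu}}{\Gamma(\nu)} \left( \frac{r}{\ell} \right)^{\nu} K_{\nu}\!\left( \frac{r}{\ell} \right),

    where $\nu > 0$ controls the smoothness of the sample paths, $\ell > 0$ is a length scale, and $K_\nu$ is the modified Bessel function of the second kind.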

    Global sensitivity analysis of stochastic computer models with joint metamodels

    The global sensitivity analysis method used to quantify the influence of uncertain input variables on the variability in numerical model responses has already been applied to deterministic computer codes; deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, for which the result of each code run is itself random. The framework of the joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models are discussed and a new Gaussian process-based joint model is proposed. The relevance of these models is analyzed based upon two case studies. Results show that the joint modeling approach yields accurate sensitivity index estimators even when heteroscedasticity is strong.
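
    A minimal sketch of the joint-modeling idea in Python, assuming scikit-learn is available: one Gaussian process metamodel for the mean response and one for the log-dispersion, fitted to replicate statistics of a made-up heteroscedastic simulator. This illustrates the general flavour, not the paper's specific joint model:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        rng = np.random.default_rng(2)

        def simulator(x, n_rep=20):
            # Made-up stochastic code: both the mean and the spread depend on x
            return np.sin(3.0 * x) + (0.1 + 0.3 * x) * rng.standard_normal(n_rep)

        X = np.linspace(0.0, 1.0, 15)
        reps = np.array([simulator(x) for x in X])

        # Joint metamodel: a GP for the mean, a GP for the log of the dispersion
        gp_mean = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6)
        gp_disp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6)
        gp_mean.fit(X[:, None], reps.mean(axis=1))
        gp_disp.fit(X[:, None], np.log(reps.var(axis=1, ddof=1)))

        x_new = np.array([[0.42]])
        print(gp_mean.predict(x_new), np.exp(gp_disp.predict(x_new)))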

    Non-stationary covariance function modelling in 2D least-squares collocation

    Standard least-squares collocation (LSC) assumes 2D stationarity and 3D isotropy, and relies on a covariance function to account for spatial dependence in the observed data. However, the assumption that the spatial dependence is constant throughout the region of interest may sometimes be violated. Assuming a stationary covariance structure can result in over-smoothing of, e.g., the gravity field in mountains and under-smoothing in great plains. We introduce the kernel convolution method from spatial statistics for non-stationary covariance structures, and demonstrate its advantage for dealing with non-stationarity in geodetic data. We then compare stationary and non-stationary covariance functions in 2D LSC on the empirical example of gravity anomaly interpolation near the Darling Fault, Western Australia, where the field is anisotropic and non-stationary. The results with non-stationary covariance functions are better than standard LSC in terms of formal errors and cross-validation against data not used in the interpolation, demonstrating that the use of non-stationary covariance functions can improve upon standard (stationary) LSC.
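
    The kernel convolution construction referenced above is commonly written as follows (our notation): attaching a spatially varying kernel $k_{\mathbf{x}}$ to each location yields the non-stationary covariance

        C(\mathbf{x}, \mathbf{x}') = \int_{\mathbb{R}^2} k_{\mathbf{x}}(\mathbf{u}) \, k_{\mathbf{x}'}(\mathbf{u}) \, d\mathbf{u},

    so letting the kernel's shape (e.g., the covariance matrix of a Gaussian kernel) change across the region produces a covariance structure that adapts to local anisotropy, such as that near the Darling Fault.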