
    Density of States for a Specified Correlation Function and the Energy Landscape

    The degeneracy of two-phase disordered microstructures consistent with a specified correlation function is analyzed by mapping it to a ground-state degeneracy. We determine for the first time the associated density of states via a Monte Carlo algorithm. Our results are described in terms of the roughness of the energy landscape, defined on a hypercubic configuration space. The use of a Hamming distance in this space enables us to define a roughness metric, which is calculated from the correlation function alone and related quantitatively to the structural degeneracy. This relation is validated for a wide variety of disordered systems. Comment: Accepted for publication in Physical Review Letters.
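
    The mapping from a prescribed correlation function to a ground-state problem can be illustrated with a minimal sketch in the spirit of Yeong-Torquato reconstruction: a configuration's "energy" is the squared mismatch between its two-point correlation and the target, and distances in the hypercubic configuration space are Hamming distances. The 1-D setup, the estimator of S2, and all function names below are illustrative assumptions; the paper's Monte Carlo density-of-states computation is not reproduced here.

```python
import numpy as np

def s2_1d(config):
    """Two-point correlation S2(r) of a 1-D binary configuration with periodic boundaries."""
    n = config.size
    return np.array([np.mean(config * np.roll(config, r)) for r in range(n // 2)])

def energy(config, s2_target):
    """'Energy' of a configuration: squared mismatch between its S2 and the target S2."""
    return float(np.sum((s2_1d(config) - s2_target) ** 2))

def hamming(a, b):
    """Hamming distance between two configurations on the hypercubic configuration space."""
    return int(np.sum(a != b))

rng = np.random.default_rng(0)
reference = rng.integers(0, 2, 64)      # configuration whose correlation function we target
target = s2_1d(reference)               # reference has energy 0 by construction
trial = rng.integers(0, 2, 64)
print(energy(trial, target), hamming(trial, reference))
```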

    Pointwise consistency of the kriging predictor with known mean and covariance functions

    This paper deals with several issues related to the pointwise consistency of the kriging predictor when the mean and the covariance functions are known. These questions are of general importance in the context of computer experiments. The analysis is based on the properties of approximations in reproducing kernel Hilbert spaces. We correct an erroneous claim of Yakowitz and Szidarovszky (J. Multivariate Analysis, 1985) that the kriging predictor is pointwise consistent for all continuous sample paths under some assumptions. Comment: Submitted to mODa9 (the Model-Oriented Data Analysis and Optimum Design Conference), 14th-19th June 2010, Bertinoro, Italy.
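
    For reference, the predictor the abstract refers to (kriging with known mean and covariance) can be sketched as a simple-kriging interpolator: the posterior mean is a covariance-weighted combination of the observations. The squared-exponential covariance, the test function, and the design below are illustrative assumptions; the paper's RKHS-based consistency analysis is not reproduced.

```python
import numpy as np

def simple_kriging(x_obs, y_obs, x_new, cov, mean=0.0):
    """Simple kriging predictor when the (constant) mean and the covariance are known."""
    K = cov(x_obs, x_obs)          # covariance matrix of the observations
    k = cov(x_obs, x_new)          # covariances between observations and prediction points
    w = np.linalg.solve(K, k)      # kriging weights
    return mean + w.T @ (y_obs - mean)

def sqexp(a, b, ell=0.2):
    """Squared-exponential covariance (an arbitrary choice for illustration)."""
    return np.exp(-0.5 * ((np.atleast_1d(a)[:, None] - np.atleast_1d(b)[None, :]) / ell) ** 2)

x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
print(simple_kriging(x, y, np.array([0.37, 0.9]), sqexp))
```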

    Numerical solution of random differential models

    This paper deals with the construction of a numerical solution of random initial value problems by means of a random improved Euler method. Conditions for the mean square convergence of the proposed method are established. Finally, an illustrative example is included in which the main statistical properties, such as the mean and the variance of the stochastic approximation solution process, are given. © 2011 Elsevier Ltd. This work has been partially supported by the Spanish M.C.Y.T. grants MTM2009-08587 and DPI2010-20891-C02-01, Universidad Politecnica de Valencia grant PAID06-09-2588, and the Mexican Conacyt. Cortés López, JC.; Jódar Sánchez, LA.; Villafuerte Altuzar, L.; Company Rossi, R. (2011). Numerical solution of random differential models. Mathematical and Computer Modelling, 54(7):1846-1851. https://doi.org/10.1016/j.mcm.2010.12.037
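
    A sampling-based illustration of the scheme named in the abstract: the improved (Heun) Euler update is applied path-by-path to a toy random initial value problem, and the mean and variance of the approximate solution are estimated from the samples. The linear test equation, the uniform law for the random coefficient, and the Monte Carlo replication are illustrative assumptions; the paper itself works in the mean square sense rather than by sampling.

```python
import numpy as np

def improved_euler(f, y0, t):
    """Improved (Heun) Euler scheme on a time grid t for y' = f(t, y)."""
    y = np.empty_like(t)
    y[0] = y0
    for k in range(len(t) - 1):
        h = t[k + 1] - t[k]
        pred = y[k] + h * f(t[k], y[k])                                   # Euler predictor
        y[k + 1] = y[k] + 0.5 * h * (f(t[k], y[k]) + f(t[k + 1], pred))   # trapezoidal corrector
    return y

# Toy random IVP: y' = -A y, y(0) = 1, with random coefficient A ~ Uniform(0.5, 1.5).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 51)
samples = rng.uniform(0.5, 1.5, 2000)
paths = np.array([improved_euler(lambda s, y, a=a: -a * y, 1.0, t) for a in samples])
print(paths.mean(axis=0)[-1], paths.var(axis=0)[-1])   # sample mean and variance at t = 1
```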

    Means and covariance functions for geostatistical compositional data: an axiomatic approach

    This work focuses on the characterization of the central tendency of a sample of compositional data. It provides new results about the theoretical properties of means and covariance functions for compositional data, from an axiomatic perspective. Original results that shed new light on the geostatistical modeling of compositional data are presented. As a first result, it is shown that the weighted arithmetic mean is the only central tendency characteristic satisfying a small set of axioms, namely continuity, reflexivity and marginal stability. Moreover, this set of axioms also implies that the weights must be identical for all parts of the composition. This result has deep consequences for the spatial multivariate covariance modeling of compositional data. In a geostatistical setting, it is shown as a second result that the proportional model of covariance functions (i.e., the product of a covariance matrix and a single correlation function) is the only model that provides identical kriging weights for all components of the compositional data. As a consequence of these two results, the proportional model of covariance functions is the only covariance model compatible with reflexivity and marginal stability.
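
    For reference, the proportional (intrinsic correlation) model mentioned as the second result can be written as a single spatial correlation shared by all parts of the composition; the notation below is a standard way of stating it and is not taken verbatim from the paper.

```latex
% Proportional model of covariance functions: one correlation function \rho
% shared by every component, scaled by a fixed positive semi-definite matrix T.
\mathbf{C}(h) = \mathbf{T}\,\rho(h), \qquad \mathbf{T} \succeq 0, \quad \rho(0) = 1 .
% Under this model, cokriging of every component uses identical kriging weights.
```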

    Sequential design of computer experiments for the estimation of a probability of failure

    This paper deals with the problem of estimating the volume of the excursion set of a function $f:\mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of $f$ and aim at performing evaluations of $f$ as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure. Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
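
    A minimal sketch of how a Gaussian process posterior yields both an estimate of the probability of failure (the average excursion probability under the input measure) and a criterion for choosing the next evaluation point. The kernel, the toy simulator, the threshold, and the greedy misclassification-uncertainty criterion below are illustrative assumptions; they are not the SUR strategies derived in the paper.

```python
import numpy as np
from scipy.stats import norm

def gp_posterior(x_obs, y_obs, x_new, ell=0.3, sf=1.0, nugget=1e-8):
    """Posterior mean and std of a zero-mean GP with a squared-exponential kernel (toy choice)."""
    k = lambda a, b: sf**2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    K = k(x_obs, x_obs) + nugget * np.eye(len(x_obs))
    Ks = k(x_obs, x_new)
    m = Ks.T @ np.linalg.solve(K, y_obs)
    v = sf**2 - np.einsum("ij,ij->j", Ks, np.linalg.solve(K, Ks))
    return m, np.sqrt(np.maximum(v, 0.0))

f = lambda x: np.sin(3.0 * x) + x          # stand-in for the expensive simulator
threshold = 0.8                            # failure threshold
cand = np.linspace(0.0, 1.0, 201)          # candidate points / integration grid (uniform measure)
X, y = np.linspace(0.0, 1.0, 4), f(np.linspace(0.0, 1.0, 4))    # initial design

for _ in range(6):                         # greedy sequential design
    m, s = gp_posterior(X, y, cand)
    p = norm.cdf((m - threshold) / np.maximum(s, 1e-12))        # pointwise excursion probability
    x_next = cand[np.argmax(p * (1.0 - p))]                     # largest classification uncertainty
    X, y = np.append(X, x_next), np.append(y, f(x_next))

m, s = gp_posterior(X, y, cand)
p = norm.cdf((m - threshold) / np.maximum(s, 1e-12))
print("estimated probability of failure:", p.mean())
```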

    Constraining the initial state granularity with bulk observables in Au+Au collisions at $\sqrt{s_{\rm NN}}=200$ GeV

    In this paper we conduct a systematic study of the granularity of the initial state of hot and dense QCD matter produced in ultra-relativistic heavy-ion collisions and its influence on bulk observables like particle yields, $m_T$ spectra and elliptic flow. For our investigation we use a hybrid transport model, based on (3+1)d hydrodynamics and a microscopic Boltzmann transport approach. The initial conditions are generated by a non-equilibrium hadronic transport approach and the size of their fluctuations can be adjusted by defining a Gaussian smoothing parameter $\sigma$. The dependence of the hydrodynamic evolution on the choices of $\sigma$ and $t_{\rm start}$ is explored by means of a Gaussian emulator. To generate particle yields and elliptic flow that are compatible with experimental data the initial state parameters are constrained to be $\sigma=1$ fm and $t_{\rm start}=0.5$ fm. In addition, the influence of changes in the equation of state is studied and the results of our event-by-event calculations are compared to a calculation with averaged initial conditions. We conclude that even though the initial state parameters can be constrained by yields and elliptic flow, the granularity needs to be constrained by other correlation and fluctuation observables. Comment: 14 pages, 8 figures, updated references, version to appear in J. Phys.
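
    The role of the Gaussian smoothing parameter can be illustrated with a minimal sketch: point-like sources from a transport-model initial state are smeared onto a transverse grid with width $\sigma$, so a small $\sigma$ preserves event-by-event hot spots while a large $\sigma$ produces smooth, averaged-looking profiles. The source positions, grid, and normalization below are illustrative assumptions, not the hybrid model's actual initialization.

```python
import numpy as np

def smeared_density(positions, grid, sigma=1.0):
    """Deposit point-like sources on a transverse grid with Gaussian smearing of width sigma (fm).

    A larger sigma washes out hot spots, i.e. it reduces the granularity of the
    initial profile handed to the hydrodynamic stage.
    """
    gx, gy = np.meshgrid(grid, grid, indexing="ij")
    density = np.zeros_like(gx)
    for x0, y0 in positions:
        density += np.exp(-((gx - x0) ** 2 + (gy - y0) ** 2) / (2.0 * sigma**2))
    return density / (2.0 * np.pi * sigma**2)

rng = np.random.default_rng(2)
sources = rng.normal(0.0, 3.0, size=(50, 2))      # toy transverse positions of the sources (fm)
grid = np.linspace(-10.0, 10.0, 101)
for s in (0.5, 1.0, 2.0):
    print(s, smeared_density(sources, grid, sigma=s).max())   # peak density drops as sigma grows
```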

    Lithofacies uncertainty modeling in a siliciclastic reservoir setting by incorporating geological contacts and seismic information

    Deterministic modeling provides only a single boundary layout, which depends on the geological interpretation or on interpolation from the available hard data; changing the interpreter's assumptions or the interpolation parameters displaces the location of these boundaries. In contrast, probabilistic modeling of geological domains such as lithofacies is critical for providing the information needed to make sound decisions when evaluating oil reservoir parameters, because it allows the uncertainty along the boundaries to be quantified. Stochastic modeling becomes especially important in such situations. Conventional approaches to probabilistic modeling (object- and pixel-based) mostly fail to account for contact relationships among the simulated domains. The plurigaussian simulation algorithm, in contrast, reproduces the complex transitions among lithofacies domains and has found wide acceptance for modeling petroleum reservoirs. The stationarity assumption in this framework implies a homogeneous characterization of the lithofacies: the proportions are assumed constant and the covariance function, as the usual descriptor of spatial continuity, depends only on the Euclidean distance between two points. Whenever the region is heterogeneous, however, this assumption prevents the model from generating the desired variability of the underlying facies proportions over the domain. Geophysical attributes, used as a secondary variable, then play an important role in generating realistic contact relationships between the simulated categories. In this paper, a hierarchical plurigaussian simulation approach is used to construct multiple realizations of lithofacies by incorporating acoustic impedance as soft data for an oil reservoir in Iran. This research was funded by the National Elites Foundation of Iran in collaboration with the Research Institute of Petroleum Industry in Iran under project number 9265005.
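
    A minimal 1-D sketch of the truncation idea behind plurigaussian simulation: two Gaussian fields are simulated and a truncation rule converts them into lithofacies codes, so facies contacts are controlled by the rule and by the fields' covariances. The covariance model, thresholds, and facies coding below are illustrative assumptions; the paper's hierarchical scheme, in which acoustic impedance drives spatially varying proportions and thresholds, is not reproduced.

```python
import numpy as np

def gaussian_field(x, ell, rng):
    """One realization of a zero-mean Gaussian field with a squared-exponential covariance (toy choice)."""
    cov = np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2) + 1e-8 * np.eye(len(x))
    return np.linalg.cholesky(cov) @ rng.standard_normal(len(x))

def truncate(z1, z2, t1=0.0, t2=0.5):
    """Truncation rule mapping two Gaussian fields to three lithofacies codes."""
    return np.where(z1 < t1, 0, np.where(z2 < t2, 1, 2))

rng = np.random.default_rng(3)
x = np.linspace(0.0, 100.0, 200)
realizations = [truncate(gaussian_field(x, 15.0, rng), gaussian_field(x, 5.0, rng))
                for _ in range(20)]
print(np.mean([np.mean(r == 1) for r in realizations]))   # average simulated proportion of facies 1
```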

    Global sensitivity analysis of stochastic computer models with joint metamodels

    The global sensitivity analysis method used to quantify the influence of uncertain input variables on the variability in numerical model responses has already been applied to deterministic computer codes; deterministic means here that the same set of input variables always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, for which the result of each code run is itself random. The framework of the joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models are discussed and a new Gaussian process-based joint model is proposed. The relevance of these models is analyzed on two case studies. Results show that the joint modeling approach yields accurate sensitivity index estimators even when heteroscedasticity is strong.
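
    The motivation for joint mean/dispersion modeling can be illustrated with the law of total variance for a stochastic simulator: part of the output variance is explained by the inputs through the conditional mean, and the rest is intrinsic dispersion. The toy simulator and the brute-force replication below are illustrative assumptions; the paper replaces the replication with Gaussian-process-based joint metamodels of the mean and the dispersion.

```python
import numpy as np

rng = np.random.default_rng(4)

def stochastic_code(x, n_rep=200):
    """Toy stochastic simulator: repeated runs at the same input x give different outputs."""
    return np.sin(np.pi * x) + (0.1 + 0.4 * x) * rng.standard_normal(n_rep)

X = rng.uniform(0.0, 1.0, 300)                      # design of experiments
runs = np.array([stochastic_code(x) for x in X])    # n_rep replicated runs per design point
cond_mean = runs.mean(axis=1)                       # m(x): target of the mean metamodel
cond_disp = runs.var(axis=1)                        # d(x): target of the dispersion metamodel

# Law of total variance: Var(Y) = Var[m(X)] + E[d(X)].
# Only the first term is attributable to the inputs; joint metamodels emulate m and d
# so this split can be estimated without brute-force replication.
total = runs.var()
print(cond_mean.var() / total, cond_disp.mean() / total)
```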

    Generalized Whittle-Matérn random field as a model of correlated fluctuations

    This paper considers a generalization of the Gaussian random field with covariance function of the Whittle-Matérn family. Such a random field can be obtained as the solution to a fractional stochastic differential equation with two fractional orders. Asymptotic properties of the covariance functions belonging to this generalized Whittle-Matérn family are studied and used to deduce the sample path properties of the random field. The Whittle-Matérn field has been widely used in modeling geostatistical data such as sea beam data, wind speed, field temperature and soil data. In this article we show that the generalized Whittle-Matérn field provides a more flexible model for wind speed data. Comment: 22 pages, 10 figures, accepted by Journal of Physics
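
    For reference, the standard Whittle-Matérn covariance that this family generalizes can be sketched as below. The parameter names and the $\sqrt{2\nu}$ scaling convention are assumptions of this sketch, and the paper's generalization with two fractional orders (entering through the spectral density of the defining fractional SDE) is not implemented here.

```python
import numpy as np
from scipy.special import kv, gamma

def matern_cov(r, nu=1.5, ell=1.0, sigma2=1.0):
    """Standard Whittle-Matérn covariance with smoothness nu, range ell, and variance sigma2."""
    r = np.asarray(r, dtype=float)
    out = np.full(r.shape, sigma2)                  # C(0) = sigma^2
    nz = r > 0
    s = np.sqrt(2.0 * nu) * r[nz] / ell
    out[nz] = sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * s**nu * kv(nu, s)
    return out

r = np.array([0.0, 0.5, 1.0, 2.0])
print(matern_cov(r, nu=0.5))    # nu = 1/2 reduces to the exponential covariance exp(-r/ell)
```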