
    Pointwise consistency of the kriging predictor with known mean and covariance functions

    This paper deals with several issues related to the pointwise consistency of the kriging predictor when the mean and the covariance functions are known. These questions are of general importance in the context of computer experiments. The analysis is based on the properties of approximations in reproducing kernel Hilbert spaces. We correct an erroneous claim of Yakowitz and Szidarovszky (J. Multivariate Analysis, 1985) that the kriging predictor is pointwise consistent for all continuous sample paths under some assumptions. Comment: Submitted to mODa9 (the Model-Oriented Data Analysis and Optimum Design Conference), 14th-19th June 2010, Bertinoro, Italy
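As a concrete illustration of the setting, a simple-kriging predictor with a known zero mean and a known covariance can be sketched in a few lines. The squared-exponential kernel, length scale, jitter, and test function below are illustrative assumptions, not choices made in the paper:

```python
import numpy as np

def se_kernel(a, b, length=0.5):
    """Squared-exponential covariance, assumed known here."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def kriging_predict(x_new, x_obs, y_obs, length=0.5, jitter=1e-8):
    """Simple kriging (known zero mean): E[f(x_new) | f(x_obs) = y_obs]."""
    K = se_kernel(x_obs, x_obs, length) + jitter * np.eye(len(x_obs))
    k_star = se_kernel(x_new, x_obs, length)
    return k_star @ np.linalg.solve(K, y_obs)

# Denser designs shrink the pointwise prediction error for a smooth path,
# which is the kind of consistency behaviour the paper analyses.
errs = {}
for n in (5, 20):
    x_obs = np.linspace(0.0, np.pi, n)
    pred = kriging_predict(np.array([1.0]), x_obs, np.sin(x_obs))
    errs[n] = abs(pred[0] - np.sin(1.0))
print(errs)
```

The paper's point is precisely about when this pointwise error is guaranteed to vanish as the design fills in, which depends on the interplay between the kernel and the sample path.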

    Cosmic Calibration: Constraints from the Matter Power Spectrum and the Cosmic Microwave Background

    Several cosmological measurements have attained significant levels of maturity and accuracy over the last decade. Continuing this trend, future observations promise measurements of the statistics of the cosmic mass distribution at an accuracy level of one percent out to spatial scales with k ~ 10 h/Mpc and even smaller, entering highly nonlinear regimes of gravitational instability. In order to interpret these observations and extract useful cosmological information from them, such as the equation of state of dark energy, very costly high-precision, multi-physics simulations must be performed. We have recently implemented a new statistical framework with the aim of obtaining accurate parameter constraints from combining observations with a limited number of simulations. The key idea is the replacement of the full simulator by a fast emulator with controlled error bounds. In this paper, we provide a detailed description of the methodology and extend the framework to include joint analysis of cosmic microwave background and large-scale structure measurements. Our framework is especially well-suited for upcoming large-scale structure probes of dark energy such as baryon acoustic oscillations and, especially, weak lensing, where percent-level accuracy on nonlinear scales is needed. Comment: 15 pages, 14 figures
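The emulator idea can be sketched in miniature: run the expensive simulator at a small design of parameter values, fit a cheap surrogate, then do inference with the surrogate alone. Here a hypothetical one-parameter toy model and a polynomial fit stand in for the paper's multi-physics simulations and Gaussian-process emulator:

```python
import numpy as np

def expensive_simulator(omega):
    """Stand-in for a costly multi-physics run (hypothetical toy model)."""
    return omega ** 2 + 0.1 * np.sin(5.0 * omega)

# Step 1: a limited number of design runs of the full simulator.
design = np.linspace(0.1, 0.5, 9)
runs = expensive_simulator(design)

# Step 2: fit a fast surrogate to the design runs (a cubic polynomial here;
# the paper builds a Gaussian-process emulator with controlled error bounds).
emulator = np.poly1d(np.polyfit(design, runs, deg=3))

# Step 3: constrain the parameter against a mock observation using only
# cheap emulator evaluations.
observed = expensive_simulator(0.27)
grid = np.linspace(0.1, 0.5, 4001)
best = grid[np.argmin((emulator(grid) - observed) ** 2)]
print(best)
```

The recovered parameter is close to the truth because the surrogate tracks the simulator well across the design range; controlling that surrogate error is exactly what the framework formalizes.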

    The Obliteration of Truth by Management: Badiou, St. Paul and the Question of Economic Managerialism in Education

    This paper considers the questions that Badiou’s theory of the subject poses to cultures of economic managerialism within education. His argument that radical change is possible, for people and the situations they inhabit, provides a stark challenge to the stifling nature of much of the current educational climate. In 'Saint Paul: The Foundation of Universalism', Badiou describes the current universalism of capitalism, monetary homogeneity and the rule of the count. Badiou argues that the politics of identity are all too easily subsumed by the prerogatives of the marketplace and are therefore unable to present a critique of the status quo. These processes are, he argues, without the potential for truth. What are the implications of Badiou’s claim that education is the arranging of ‘the forms of knowledge in such a way that truth may come to pierce a hole in them’ (Badiou, 2005, p. 9)? In this paper, I argue that Badiou’s theory opens up space for a kind of thinking about education that resists its colonisation by cultures of management and marketisation, and leads educationalists to consider the emancipatory potential of education in a new light.

    A bounded confidence approach to understanding user participation in peer production systems

    Commons-based peer production seems to rest upon a paradox: although users produce all content, participation is voluntary and largely incentivized by the achievement of the project's goals. This means that users have to coordinate their actions and goals in order to keep themselves from leaving. While this situation is easily explainable for small groups of highly committed, like-minded individuals, little is known about large-scale, heterogeneous projects such as Wikipedia. In this contribution we present a model of peer production in a large online community. The model features a dynamic population of bounded-confidence users and an endogenous process of user departure. Using global sensitivity analysis, we identify the most important parameters affecting the lifespan of user participation. We find that the model presents two distinct regimes, and that the shift between them is governed by the bounded-confidence parameter. For low values of this parameter, users depart almost immediately. For high values, however, the model produces a bimodal distribution of user lifespans. These results suggest that user participation in online communities could be explained in terms of group consensus, and they provide a novel connection between models of opinion dynamics and commons-based peer production. Comment: 17 pages, 5 figures, accepted to SocInfo201
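A minimal sketch of this kind of model, assuming Deffuant-style bounded-confidence updates plus a simple frustration-based departure rule (the departure rule, parameter values, and replacement mechanism are illustrative, not the paper's exact specification):

```python
import random

def departures(eps, steps=20000, n_users=50, mu=0.5, patience=10, seed=1):
    """Count user departures under bounded-confidence opinion dynamics:
    a user who fails too many interactions in a row leaves and is
    replaced by a newcomer with a random opinion."""
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_users)]
    frustration = [0] * n_users
    count = 0
    for _ in range(steps):
        i, j = rng.sample(range(n_users), 2)
        if abs(opinions[i] - opinions[j]) < eps:
            # Within the confidence bound: opinions converge, frustration resets.
            shift = mu * (opinions[j] - opinions[i])
            opinions[i] += shift
            opinions[j] -= shift
            frustration[i] = frustration[j] = 0
        else:
            frustration[i] += 1
            frustration[j] += 1
        for k in (i, j):
            if frustration[k] > patience:
                count += 1                   # the user departs...
                opinions[k] = rng.random()   # ...and a newcomer takes the slot
                frustration[k] = 0
    return count

low_eps, high_eps = departures(eps=0.05), departures(eps=0.5)
print(low_eps, high_eps)
```

Consistent with the two regimes described above, a small confidence bound produces rapid turnover (short lifespans), while a large bound lets most users reach consensus and stay.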

    Open TURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance for complex systems and to comply with tighter regulatory processes (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open-source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties, which are transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is an open-source software package under the LGPL license that presents itself as a C++ library and a Python TUI, and which works under Linux and Windows environments. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism used to link OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke that protects industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
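The flavour of the river-versus-dyke example can be sketched with plain Monte Carlo uncertainty propagation. The code below deliberately does not use the OpenTURNS API; the input distributions, numerical values, and the Manning-Strickler-type height relation are illustrative assumptions, not the values documented with the software:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical input distributions (illustrative values only).
Q  = np.clip(rng.gumbel(1013.0, 558.0, n), 0.0, None)   # flow rate (m^3/s)
Ks = np.clip(rng.normal(30.0, 7.5, n), 1.0, None)       # Strickler coefficient
Zv = rng.triangular(49.0, 50.0, 51.0, n)                # downstream bed level (m)
Zm = rng.triangular(54.0, 55.0, 56.0, n)                # upstream bed level (m)

B, L, Zd = 300.0, 5000.0, 54.0   # width (m), reach length (m), dyke crest (m)

# Propagate the input uncertainty through the height model, then estimate
# the probability that the water level exceeds the dyke crest.
H = (Q / (Ks * B * np.sqrt((Zm - Zv) / L))) ** 0.6
p_overflow = float(np.mean(Zv + H > Zd))
print(p_overflow)
```

OpenTURNS packages this whole workflow (distribution modelling, propagation, sensitivity analysis, metamodeling) behind its Python TUI rather than requiring hand-written sampling loops like this one.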

    Constraining the initial state granularity with bulk observables in Au+Au collisions at $\sqrt{s_{\rm NN}}=200$ GeV

    In this paper we conduct a systematic study of the granularity of the initial state of hot and dense QCD matter produced in ultra-relativistic heavy-ion collisions and its influence on bulk observables like particle yields, $m_T$ spectra and elliptic flow. For our investigation we use a hybrid transport model, based on (3+1)d hydrodynamics and a microscopic Boltzmann transport approach. The initial conditions are generated by a non-equilibrium hadronic transport approach, and the size of their fluctuations can be adjusted by defining a Gaussian smoothing parameter $\sigma$. The dependence of the hydrodynamic evolution on the choices of $\sigma$ and $t_{\rm start}$ is explored by means of a Gaussian emulator. To generate particle yields and elliptic flow that are compatible with experimental data, the initial state parameters are constrained to be $\sigma=1$ fm and $t_{\rm start}=0.5$ fm. In addition, the influence of changes in the equation of state is studied, and the results of our event-by-event calculations are compared to a calculation with averaged initial conditions. We conclude that even though the initial state parameters can be constrained by yields and elliptic flow, the granularity needs to be constrained by other correlation and fluctuation observables. Comment: 14 pages, 8 figures, updated references, version to appear in J. Phys.
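The role of the smoothing parameter $\sigma$ can be illustrated schematically: deposit a Gaussian of width $\sigma$ at each mock source position and compare the resulting energy-density profiles. The grid, source distribution, and normalisation below are illustrative assumptions, not the hybrid model's actual initial-condition procedure:

```python
import numpy as np

def energy_density(points, grid, sigma):
    """Sum of unit-energy 2D Gaussians of width sigma placed at each
    source point (schematic stand-in for smearing initial conditions)."""
    X, Y = np.meshgrid(grid, grid, indexing="ij")
    rho = np.zeros_like(X)
    norm = 1.0 / (2.0 * np.pi * sigma ** 2)
    for x0, y0 in points:
        rho += norm * np.exp(-((X - x0) ** 2 + (Y - y0) ** 2)
                             / (2.0 * sigma ** 2))
    return rho

rng = np.random.default_rng(42)
points = rng.normal(0.0, 3.0, size=(100, 2))  # mock source positions (fm)
grid = np.linspace(-10.0, 10.0, 101)

# A larger sigma washes out event-by-event hot spots: the peak density
# drops while the overall profile becomes smoother.
rho_fine = energy_density(points, grid, sigma=0.4)
rho_smooth = energy_density(points, grid, sigma=1.0)
print(rho_fine.max(), rho_smooth.max())
```

This is why $\sigma$ acts as a granularity dial: small values preserve hot spots that feed fluctuation observables, while large values approach the averaged, smooth initial condition.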