
    The Obliteration of Truth by Management: Badiou, St. Paul and the Question of Economic Managerialism in Education

    This paper considers the questions that Badiou’s theory of the subject poses to cultures of economic managerialism within education. His argument that radical change is possible, for people and the situations they inhabit, provides a stark challenge to the stifling nature of much of the current educational climate. In 'Saint Paul: The Foundation of Universalism', Badiou describes the current universalism of capitalism, monetary homogeneity and the rule of the count. Badiou argues that the politics of identity are all too easily subsumed by the prerogatives of the marketplace and are therefore unable to present a critique of the status quo. These processes are, he argues, without the potential for truth. What are the implications of Badiou’s claim that education is the arranging of ‘the forms of knowledge in such a way that truth may come to pierce a hole in them’ (Badiou, 2005, p. 9)? In this paper, I argue that Badiou’s theory opens up space for a kind of thinking about education that resists its colonisation by cultures of management and marketisation, and leads educationalists to consider the emancipatory potential of education in a new light.

    Cosmic Calibration: Constraints from the Matter Power Spectrum and the Cosmic Microwave Background

    Several cosmological measurements have attained significant levels of maturity and accuracy over the last decade. Continuing this trend, future observations promise measurements of the statistics of the cosmic mass distribution at an accuracy level of one percent out to spatial scales with k~10 h/Mpc and even smaller, entering highly nonlinear regimes of gravitational instability. In order to interpret these observations and extract useful cosmological information from them, such as the equation of state of dark energy, very costly, high-precision, multi-physics simulations must be performed. We have recently implemented a new statistical framework with the aim of obtaining accurate parameter constraints from combining observations with a limited number of simulations. The key idea is the replacement of the full simulator by a fast emulator with controlled error bounds. In this paper, we provide a detailed description of the methodology and extend the framework to include joint analysis of cosmic microwave background and large scale structure measurements. Our framework is especially well-suited for upcoming large scale structure probes of dark energy such as baryon acoustic oscillations and, especially, weak lensing, where percent-level accuracy on nonlinear scales is needed. Comment: 15 pages, 14 figures
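    The emulator idea at the core of this framework can be illustrated with a small, hedged sketch: a Gaussian process regression surrogate is trained on the outputs of a handful of expensive simulation runs over the cosmological parameter space and then queried cheaply, with a predictive uncertainty that plays the role of the controlled error bound. The toy "simulator", the parameter names, and the use of scikit-learn below are assumptions for illustration, not the paper's actual pipeline.

```python
# Minimal sketch of simulator emulation with Gaussian process regression.
# The toy "simulator" and parameter ranges are illustrative assumptions only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def toy_simulator(theta):
    """Stand-in for an expensive simulation; theta = (omega_m, sigma_8)."""
    omega_m, sigma_8 = theta
    return sigma_8**2 * np.exp(-3.0 * (omega_m - 0.3) ** 2)

# A small design of "simulation runs" over the parameter space.
rng = np.random.default_rng(0)
design = rng.uniform([0.2, 0.6], [0.4, 1.0], size=(40, 2))
outputs = np.array([toy_simulator(t) for t in design])

# Train the emulator; its predictive std gives an error estimate.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.1, 0.1])
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(design, outputs)

# Fast predictions (with uncertainty) replace full simulations in the analysis.
test = np.array([[0.31, 0.82]])
mean, std = emulator.predict(test, return_std=True)
print(f"emulated output = {mean[0]:.4f} +/- {std[0]:.4f}")
```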

    Open TURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance for complex systems and to comply with tighter regulatory processes (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: to take uncertainties into account when dealing with complex numerical simulation frameworks. Therefore, a generic methodology has emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, for the development of an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties, which are transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open source software under the LGPL license, presented as a C++ library with a Python TUI, and it runs under Linux and Windows environments. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism used to link OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke that protects industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
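    As a rough illustration of the kind of uncertainty propagation described here, the sketch below runs a plain Monte Carlo study on a simplified river-height model through a few basic OpenTURNS Python calls and compares the result to a dyke level. The model formula, the distribution choices, the numerical values, and the dyke height are assumptions invented for this sketch, not the calibrated educational case used in the paper.

```python
# Hedged sketch of Monte Carlo uncertainty propagation with the OpenTURNS
# Python interface. The flood model and all numbers are illustrative only.
import openturns as ot

def river_height(x):
    """Simplified water-height model: flow rate Q, Strickler coefficient Ks,
    downstream bed level Zv, upstream bed level Zm (all values hypothetical)."""
    Q, Ks, Zv, Zm = x
    H = (Q / (Ks * 300.0 * ((Zm - Zv) / 5000.0) ** 0.5)) ** 0.6
    return [Zv + H]

g = ot.PythonFunction(4, 1, river_height)

# Assumed input uncertainties (not the paper's calibrated distributions).
inputs = ot.ComposedDistribution([
    ot.Uniform(500.0, 3000.0),  # Q: flow rate
    ot.Uniform(20.0, 40.0),     # Ks: Strickler coefficient
    ot.Uniform(49.0, 51.0),     # Zv: downstream river bed level
    ot.Uniform(54.0, 56.0),     # Zm: upstream river bed level
])

# Propagate uncertainty by simple Monte Carlo sampling.
Y = ot.CompositeRandomVector(g, ot.RandomVector(inputs))
sample = Y.getSample(10000)

dyke_level = 55.5  # hypothetical dyke crest height
p_overflow = sum(1 for y in sample if y[0] > dyke_level) / sample.getSize()
print("mean height:", sample.computeMean()[0], "P(overflow) ~", p_overflow)
```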

    Multivariate analysis using high definition flow cytometry reveals distinct T cell repertoires between the fetal–maternal interface and the peripheral blood

    The human T cell compartment is a complex system and, while some information is known on repertoire composition and dynamics in the peripheral blood, little is known about repertoire composition at different anatomical sites. Here, we determine the T cell receptor beta variable (TRBV) repertoire in the decidua and compare it with the peripheral blood during normal pregnancy and pre-eclampsia. We found total T cell subset disparity of up to 58% between sites, including large signature TRBV expansions unique to the fetal–maternal interface. Defining the functional nature and specificity of compartment-specific T cells will be necessary if we are to understand localized immunity, tolerance, and pathogenesis.

    Bayesian optimization for materials design

    We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case where materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian process regression, which allows predicting the performance of a new design based on previously tested designs. After providing a detailed introduction to Gaussian process regression, we introduce two Bayesian optimization methods: expected improvement, for design problems with noise-free evaluations; and the knowledge-gradient method, which generalizes expected improvement and may be used in design problems with noisy evaluations. Both methods are derived using a value-of-information analysis and enjoy one-step Bayes-optimality.
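    To make the expected-improvement idea concrete, here is a small hedged sketch: a Gaussian process is fit to a few evaluated designs, and the next design to test is chosen by maximizing expected improvement over a candidate grid. The toy objective, the one-dimensional design variable, the candidate-grid search, and the use of scikit-learn are assumptions for illustration; the paper's own derivations (including the knowledge-gradient method for noisy evaluations) are more general.

```python
# Minimal sketch of Bayesian optimization with expected improvement (EI),
# assuming noise-free evaluations and a 1-D design variable for simplicity.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):
    """Toy stand-in for an expensive material-performance evaluation."""
    return -(x - 0.6) ** 2 + 0.1 * np.sin(20 * x)

# A few previously tested designs.
X = np.array([[0.1], [0.4], [0.9]])
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True).fit(X, y)

def expected_improvement(candidates, gp, y_best):
    """EI for maximization: E[max(f(x) - y_best, 0)] under the GP posterior."""
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-12)  # avoid division by zero
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

candidates = np.linspace(0, 1, 201).reshape(-1, 1)
ei = expected_improvement(candidates, gp, y.max())
x_next = candidates[np.argmax(ei)]
print("next design to evaluate:", x_next)
```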

    A bounded confidence approach to understanding user participation in peer production systems

    Commons-based peer production seems to rest upon a paradox: although users produce all content, participation is voluntary and largely incentivized by the achievement of the project's goals. This means that users have to coordinate their actions and goals in order to keep themselves from leaving. While this situation is easily explainable for small groups of highly committed, like-minded individuals, little is known about large-scale, heterogeneous projects such as Wikipedia. In this contribution we present a model of peer production in a large online community. The model features a dynamic population of bounded-confidence users and an endogenous process of user departure. Using global sensitivity analysis, we identify the most important parameters affecting the lifespan of user participation. We find that the model presents two distinct regimes, and that the shift between them is governed by the bounded confidence parameter. For low values of this parameter, users depart almost immediately. For high values, however, the model produces a bimodal distribution of user lifespan. These results suggest that user participation in online communities could be explained in terms of group consensus, and they provide a novel connection between models of opinion dynamics and commons-based peer production. Comment: 17 pages, 5 figures, accepted to SocInfo201
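    A stripped-down sketch of this kind of dynamics is given below: agents hold opinions in [0, 1], interact only when their opinions differ by less than a bounded-confidence threshold (Deffuant-style averaging), and depart after too many consecutive failed interactions. The departure rule, the parameter values, and the implementation are illustrative assumptions, not the paper's model specification.

```python
# Schematic bounded-confidence model with endogenous user departure.
# The departure rule and all parameter values are illustrative assumptions.
import random

EPSILON = 0.25      # bounded confidence threshold
MU = 0.5            # convergence rate of opinion averaging
MAX_FAILURES = 20   # consecutive failed interactions before a user departs

def simulate(n_users=200, steps=50_000, seed=0):
    random.seed(seed)
    opinions = {i: random.random() for i in range(n_users)}
    failures = {i: 0 for i in range(n_users)}
    lifespans = {}
    for t in range(steps):
        if len(opinions) < 2:
            break
        i, j = random.sample(list(opinions), 2)
        if abs(opinions[i] - opinions[j]) < EPSILON:
            # Successful interaction: opinions move toward each other.
            delta = opinions[j] - opinions[i]
            opinions[i] += MU * delta
            opinions[j] -= MU * delta
            failures[i] = failures[j] = 0
        else:
            # Failed interaction: frustration accumulates, users may depart.
            for u in (i, j):
                failures[u] += 1
                if failures[u] >= MAX_FAILURES:
                    lifespans[u] = t
                    del opinions[u]
    return lifespans

print(sorted(simulate().values())[:10])  # ten earliest departure times
```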

    Sequential design of computer experiments for the estimation of a probability of failure

    This paper deals with the problem of estimating the volume of the excursion set of a function $f: \mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of $f$ and aim at performing evaluations of $f$ as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure. Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
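    The sketch below is a hedged illustration of the general setting rather than of the SUR criteria derived in the paper: a Gaussian process surrogate of an expensive function is used to estimate an exceedance probability by Monte Carlo, and each new evaluation is placed where the surrogate is least sure about which side of the threshold the function falls. The toy function, the enrichment rule, and the scikit-learn GP are assumptions made for illustration.

```python
# Hedged sketch: estimating P(f(X) > threshold) with a Gaussian process
# surrogate and a simple (non-SUR) sequential enrichment rule.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

threshold = 1.5
rng = np.random.default_rng(1)

def expensive_f(x):
    """Toy stand-in for an expensive-to-simulate performance function."""
    return x[:, 0] ** 2 + np.sin(3 * x[:, 1])

# Small initial design in [0, 2]^2; the input measure is taken as uniform here.
X = rng.uniform(0.0, 2.0, size=(10, 2))
y = expensive_f(X)
mc_points = rng.uniform(0.0, 2.0, size=(20_000, 2))  # Monte Carlo sample of the input law

for iteration in range(15):  # limited simulation budget
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(mc_points, return_std=True)
    # Current estimate: expected probability of exceeding the threshold.
    p_fail = norm.cdf((mu - threshold) / np.maximum(sigma, 1e-12)).mean()
    # Enrichment: evaluate f where misclassification is most likely
    # (posterior mean closest to the threshold relative to the uncertainty).
    score = np.abs(mu - threshold) / np.maximum(sigma, 1e-12)
    x_new = mc_points[np.argmin(score)].reshape(1, -1)
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_f(x_new))

print(f"estimated probability of failure ~ {p_fail:.4f}")
```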

    Design of Experiments for Screening

    The aim of this paper is to review methods of designing screening experiments, ranging from designs originally developed for physical experiments to those especially tailored to experiments on numerical models. The strengths and weaknesses of the various designs for screening variables in numerical models are discussed. First, classes of factorial designs for experiments to estimate main effects and interactions through a linear statistical model are described, specifically regular and nonregular fractional factorial designs, supersaturated designs and systematic fractional replicate designs. Generic issues of aliasing, bias and cancellation of factorial effects are discussed. Second, group screening experiments are considered, including factorial group screening and sequential bifurcation. Third, random sampling plans are discussed, including Latin hypercube sampling and sampling plans to estimate elementary effects. Fourth, a variety of modelling methods commonly employed with screening designs are briefly described. Finally, a novel study demonstrates six screening methods on two frequently used exemplars, and their performances are compared.
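    As a concrete, hedged illustration of two of the sampling-based ideas reviewed here, the sketch below draws a space-filling Latin hypercube design with SciPy and computes crude one-at-a-time elementary effects for a toy model. The toy model, the step size, and the simplified effect calculation are assumptions for illustration and do not reproduce any specific design or method comparison from the paper.

```python
# Sketch: Latin hypercube screening design plus crude elementary effects
# for a toy numerical model with three inputs. Illustrative assumptions only.
import numpy as np
from scipy.stats import qmc

def model(x):
    """Toy model: x2 matters a lot, x1 a little, x3 not at all."""
    return 5.0 * x[..., 1] ** 2 + 0.5 * x[..., 0] + 0.0 * x[..., 2]

d = 3
delta = 0.1
sampler = qmc.LatinHypercube(d=d, seed=0)
base = sampler.random(n=20) * (1.0 - delta)  # leave room for the +delta step in [0, 1]^d

effects = np.zeros(d)
for i in range(d):
    step = np.zeros(d)
    step[i] = delta
    perturbed = base + step
    # Mean absolute one-at-a-time effect of input i across the design.
    effects[i] = np.mean(np.abs(model(perturbed) - model(base)) / delta)

for i, e in enumerate(effects):
    print(f"input x{i + 1}: mean |elementary effect| = {e:.3f}")
```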

    Nontransgenic models of breast cancer

    Numerous models have been developed to address key elements in the biology of breast cancer development and progression. No model is ideal, but the most useful are those that reflect the natural history and histopathology of human disease and allow for basic investigations into the underlying cellular and molecular mechanisms. We describe two types of models: those that are directed toward early events in breast cancer development (the hyperplastic alveolar nodules [HAN] murine model and the MCF10AT human xenograft model); and those that seek to reflect the spectrum of metastatic disease (the murine sister cell lines 67, 168, 4T07 and 4T1). Collectively, these models provide cell lines that represent all of the sequential stages of progression in breast disease, which can be modified to test the effect of genetic changes.