3,241 research outputs found

    Quantifying simulator discrepancy in discrete-time dynamical simulators

    When making predictions with complex simulators, it can be important to quantify the various sources of uncertainty. Errors in the structural specification of the simulator, for example due to missing processes or incorrect mathematical specification, can be a major source of uncertainty, but are often ignored. We introduce a methodology for inferring the discrepancy between the simulator and the system in discrete-time dynamical simulators. We assume a structural form for the discrepancy function, and show how to infer the maximum likelihood parameter estimates using a particle filter embedded within a Monte Carlo expectation maximization (MCEM) algorithm. We illustrate the method on a conceptual rainfall-runoff simulator (logSPM) used to model the Abercrombie catchment in Australia. We assess the simulator and discrepancy model on the basis of their predictive performance using proper scoring rules.
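
    The sketch below illustrates, in Python, the general shape of the approach described above: a bootstrap particle filter used as the E-step of a Monte Carlo EM loop, with a closed-form M-step for an assumed linear discrepancy term added to a toy one-dimensional simulator. The simulator, the linear form of the discrepancy, the noise variances, and all constants are illustrative assumptions, not the specification used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulator_step(x):
        """Stand-in deterministic simulator transition (hypothetical)."""
        return 0.8 * x + 0.5 * np.sin(x)

    def delta(x, theta):
        """Assumed structural form of the discrepancy: linear in the state."""
        return theta[0] + theta[1] * x

    def particle_filter_paths(y, theta, n_part=500, q=0.1, r=0.2):
        """Bootstrap particle filter; returns ancestral trajectories (a crude smoother)."""
        x = rng.normal(0.0, 1.0, n_part)
        paths = np.zeros((n_part, len(y)))
        for t, obs in enumerate(y):
            # propagate through simulator + discrepancy + process noise
            x = simulator_step(x) + delta(x, theta) + rng.normal(0.0, np.sqrt(q), n_part)
            logw = -0.5 * (obs - x) ** 2 / r
            w = np.exp(logw - logw.max())
            w /= w.sum()
            idx = rng.choice(n_part, n_part, p=w)          # multinomial resampling
            paths, x = paths[idx], x[idx]
            paths[:, t] = x
        return paths

    def m_step(paths):
        """Closed-form M-step: least squares for the linear discrepancy parameters."""
        prev = np.concatenate([np.zeros((paths.shape[0], 1)), paths[:, :-1]], axis=1)
        resid = paths - simulator_step(prev)               # what the discrepancy must explain
        design = np.stack([np.ones(prev.size), prev.ravel()], axis=1)
        theta, *_ = np.linalg.lstsq(design, resid.ravel(), rcond=None)
        return theta

    # Synthetic data from the "true" system (simulator + discrepancy + noise).
    true_theta = np.array([0.3, -0.1])
    x, ys = 0.0, []
    for _ in range(100):
        x = simulator_step(x) + delta(x, true_theta) + rng.normal(0.0, np.sqrt(0.1))
        ys.append(x + rng.normal(0.0, np.sqrt(0.2)))
    ys = np.array(ys)

    # MCEM iterations: sample latent paths (E-step), then maximize over theta (M-step).
    theta = np.zeros(2)
    for _ in range(20):
        theta = m_step(particle_filter_paths(ys, theta))
    print("estimated discrepancy parameters:", theta)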

    Azaphilones inhibit tau aggregation and dissolve tau aggregates in vitro

    The aggregation of the microtubule-associated protein tau is a seminal event in many neurodegenerative diseases, including Alzheimer’s disease. The inhibition or reversal of tau aggregation is therefore a potential therapeutic strategy for these diseases. Fungal natural products have proven to be a rich source of useful compounds with a wide variety of biological activities. We have previously screened Aspergillus nidulans secondary metabolites for their ability to inhibit tau aggregation in vitro using an arachidonic acid polymerization protocol. One aggregation inhibitor identified was asperbenzaldehyde, an intermediate in azaphilone biosynthesis. We therefore tested 11 azaphilone derivatives to determine their tau assembly inhibition properties in vitro. All compounds tested inhibited tau filament assembly to some extent, while four of the 11 compounds had the advantageous property of disassembling preformed tau aggregates in a dose-dependent fashion. The addition of these compounds to the tau aggregates reduced both the total length and number of tau polymers. The most potent compounds were tested in in vitro reactions to determine whether they interfere with tau’s normal function of stabilizing microtubules (MTs). We found that they did not completely inhibit MT assembly in the presence of tau. These derivatives are very promising lead compounds for tau aggregation inhibitors and, more excitingly, for compounds that can disassemble pre-existing tau filaments. They also represent a new class of anti-tau aggregation compounds with a novel structural scaffold.

    Calculating partial expected value of perfect information via Monte Carlo sampling algorithms

    Partial expected value of perfect information (EVPI) calculations can quantify the value of learning about particular subsets of uncertain parameters in decision models. Published case studies have used different computational approaches. This article examines the computation of partial EVPI estimates via Monte Carlo sampling algorithms. The mathematical definition shows 2 nested expectations, which must be evaluated separately because of the need to compute a maximum between them. A generalized Monte Carlo sampling algorithm uses nested simulation with an outer loop to sample parameters of interest and, conditional upon these, an inner loop to sample remaining uncertain parameters. Alternative computation methods and shortcut algorithms are discussed and mathematical conditions for their use considered. Maxima of Monte Carlo estimates of expectations are biased upward, and the authors show that the use of small samples results in biased EVPI estimates. Three case studies illustrate 1) the bias due to maximization and also the inaccuracy of shortcut algorithms, 2) the case when correlated variables are present, and 3) the case when there is nonlinearity in net benefit functions. If even relatively small correlation or nonlinearity is present, then the shortcut algorithm can be substantially inaccurate. Empirical investigation of the numbers of Monte Carlo samples suggests that fewer samples on the outer level and more on the inner level could be efficient and that relatively small numbers of samples can sometimes be used. Several remaining areas for methodological development are set out. A wider application of partial EVPI is recommended both for greater understanding of decision uncertainty and for analyzing research priorities.
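
    As a rough illustration of the nested algorithm described above, the following Python sketch estimates partial EVPI with an outer loop over the parameters of interest and an inner loop over the remaining uncertain parameters. The toy net-benefit function, the (independent) parameter distributions, and the sample sizes are hypothetical placeholders; as the abstract notes, using too few inner samples biases the maximization step, and hence the estimate, upward.

    import numpy as np

    rng = np.random.default_rng(1)

    def net_benefit(d, phi, psi):
        """Toy net-benefit for two decision options d in {0, 1} (illustrative only)."""
        if d == 0:
            return 10_000.0 * phi - 2_000.0 * psi
        return 9_000.0 * phi + 1_500.0 * psi - 500.0

    def sample_phi(n):   # parameters of interest
        return rng.normal(1.0, 0.3, n)

    def sample_psi(n):   # remaining uncertain parameters (independent of phi here)
        return rng.normal(0.5, 0.4, n)

    def partial_evpi(n_outer=1_000, n_inner=2_000, decisions=(0, 1)):
        # Baseline term: max over decisions of the overall expected net benefit.
        phi_all, psi_all = sample_phi(4 * n_outer), sample_psi(4 * n_outer)
        baseline = max(np.mean(net_benefit(d, phi_all, psi_all)) for d in decisions)

        # Outer loop over phi; inner loop averages over psi given phi, then maximizes.
        outer_vals = np.empty(n_outer)
        for i, phi in enumerate(sample_phi(n_outer)):
            psi = sample_psi(n_inner)
            outer_vals[i] = max(np.mean(net_benefit(d, phi, psi)) for d in decisions)
        return outer_vals.mean() - baseline

    print("partial EVPI estimate:", partial_evpi())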

    The Autism Spectrum Disorder Evaluative Education Model: a school-based method of assessing and selecting interventions for classroom use

    Evaluating educational programs and interventions is generally considered a normal part of curriculum development and improvement, and published findings are readily accessible through peer-reviewed journals. Recently, however, researchers and practicing educators have identified a lack of evaluative research regarding Autism Spectrum Disorder (ASD) educational practices in the peer-reviewed literature. Autism Spectrum Australia (Aspect) has an established evidence-informed ASD curriculum that is constantly reviewed and updated to meet the needs of the students in Aspect schools and classes. Through a methodical evaluative process, all educational interventions and support processes and devices undergo a series of Evidence-Based Research Trials and evaluations before they are implemented in classes. This article demonstrates how a workflow model can deliver a systematic method for identifying, evaluating, implementing, and disseminating the research findings of a program or support intervention. The Autism Spectrum Disorder Evaluative Education (ASDEE) model is discussed.

    Slip Inversion Along Inner Fore-Arc Faults, Eastern Tohoku, Japan

    The kinematics of deformation in the overriding plate of convergent margins may vary across timescales ranging from a single seismic cycle to many millions of years. In Northeast Japan, a network of active faults has accommodated contraction across the arc since the Pliocene, but several faults located along the inner fore arc experienced extensional aftershocks following the 2011 Tohoku-oki earthquake, opposite that predicted from the geologic record. This observation suggests that fore-arc faults may be favorable for stress triggering and slip inversion, but the geometry and deformation history of these fault systems are poorly constrained. Here we document the Neogene kinematics and subsurface geometry of three prominent fore-arc faults in Tohoku, Japan. Geologic mapping and dating of growth strata provide evidence for a 5.6–2.2 Ma initiation of Plio-Quaternary contraction along the Oritsume, Noheji, and Futaba Faults and an earlier phase of Miocene extension from 25 to 15 Ma along the Oritsume and Futaba Faults associated with the opening of the Sea of Japan. Kinematic modeling indicates that these faults have listric geometries, with ramps that dip ~40–65°W and sole into subhorizontal detachments at 6–10 km depth. These fault systems can experience both normal and thrust sense slip if they are mechanically weak relative to the surrounding crust. We suggest that the inversion history of Northeast Japan primed the fore arc with a network of weak faults mechanically and geometrically favorable for slip inversion over geologic timescales and in response to secular variations in stress state associated with the megathrust seismic cycle.

    Basins of attraction on random topography

    We investigate how fluid flow over a continuous surface shapes the geometric and statistical distribution of the flow. We find that the ability of a surface to collect water by its mere geometrical shape is proportional to the curvature of the contour line divided by the local slope. Consequently, rivers tend to lie in locations of high curvature and flat slopes. Gaussian surfaces are introduced as a model of random topography. For Gaussian surfaces the relation between convergence and slope is obtained analytically. The convergence of flow lines correlates positively with drainage area, so that lower slopes are associated with larger basins. As a consequence, we explain geometrically the observed relation between the local slope of a landscape and the area of the drainage basin. To some extent, the slope-area relation comes about not because of fluvial erosion of the landscape, but because of the way rivers choose their path. Our results are supported by numerically generated surfaces as well as by real landscapes.
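
    The following short Python sketch makes the central quantity concrete on a synthetic Gaussian surface: flow convergence is taken as the curvature of the contour line divided by the local slope, with the contour (plan) curvature computed from the standard finite-difference expression. The spectral filter used to generate the surface and the grid size are arbitrary choices, and the curvature formula is the usual geomorphometric one rather than anything taken from the paper.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 256

    # Gaussian random topography: spectrally filtered white noise.
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    spectrum = np.exp(-(k / 0.05) ** 2)                # smooth, large-scale relief
    h = np.real(np.fft.ifft2(np.fft.fft2(rng.normal(size=(n, n))) * spectrum))

    # First and second derivatives by finite differences.
    hx, hy = np.gradient(h)
    hxx = np.gradient(hx, axis=0)
    hxy = np.gradient(hx, axis=1)
    hyy = np.gradient(hy, axis=1)

    slope = np.sqrt(hx**2 + hy**2) + 1e-12

    # Contour-line (plan) curvature: divergence of the normalized gradient.
    contour_curvature = (hxx * hy**2 - 2.0 * hxy * hx * hy + hyy * hx**2) / slope**3

    # Convergence of flow lines: contour curvature divided by the local slope,
    # so gently sloping, strongly curved contours collect the most water.
    convergence = contour_curvature / slope
    print("median |convergence|:", np.median(np.abs(convergence)))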

    No measure for culture? Value in the new economy

    This paper explores articulations of the value of investment in culture and the arts through a critical discourse analysis of policy documents, reports and academic commentary since 1997. It argues that in this period, discourses around the value of culture have moved from a focus on the direct economic contributions of the culture industries to their indirect economic benefits. These indirect benefits are discussed here under three main headings: creativity and innovation, employability, and social inclusion. These are in turn analysed in terms of three forms of capital: human, social and cultural. The paper concludes with an analysis of this discursive shift through the lens of autonomist Marxist concerns with the labour of social reproduction. It is our argument that, in contemporary policy discourses on culture and the arts, the government in the UK is increasingly concerned with the use of culture to form the social in the image of capital. As such, we must turn our attention beyond the walls of the factory in order to understand the contemporary capitalist production of value and resistance to it.

    Copyright and cultural work: an exploration

    This article first discusses the contemporary debate on cultural “creativity” and the economy. Second, it considers the current state of UK copyright law and how it relates to cultural work. Third, based on empirical research on British dancers and musicians, an analysis of precarious cultural work is presented. A major focus is how those who follow their art by way of “portfolio” work handle their rights in ways that diverge significantly from the current simplistic assumptions of law and cultural policy. Our conclusions underline the distance between present top-down conceptions of what drives production in the cultural field and the actual practice of dancers and musicians.

    Vibration Isolation Design for the Micro-X Rocket Payload

    Micro-X is a NASA-funded, sounding rocket-borne X-ray imaging spectrometer that will allow high precision measurements of velocity structure, ionization state and elemental composition of extended astrophysical systems. One of the biggest challenges in payload design is to maintain the temperature of the detectors during launch. There are several vibration damping stages to prevent energy transmission from the rocket skin to the detector stage, which would otherwise cause heating during launch. Each stage should be more rigid than the outer stages to achieve vibrational isolation. We describe a major design effort to tune the resonance frequencies of these vibration isolation stages to reduce heating problems prior to the projected launch in the summer of 2014.
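
    As a back-of-the-envelope illustration of the design rule mentioned above, the Python snippet below models each isolation stage as an independent single-degree-of-freedom spring-mass system and checks that the natural frequency f_n = sqrt(k/m)/(2*pi) increases from the rocket skin toward the detector stage. The stage names, stiffnesses, and masses are hypothetical placeholders, not Micro-X design values.

    import math

    def natural_frequency_hz(stiffness_n_per_m, mass_kg):
        """f_n = sqrt(k / m) / (2*pi) for a single spring-mass stage."""
        return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

    # (label, stiffness [N/m], suspended mass [kg]), outermost stage first.
    stages = [
        ("skin-to-frame mounts", 2.0e5, 40.0),         # hypothetical values
        ("frame-to-cryostat mounts", 8.0e5, 15.0),
        ("cryostat-to-detector stage", 3.0e6, 2.0),
    ]

    freqs = [natural_frequency_hz(k, m) for _, k, m in stages]
    for (label, _, _), f in zip(stages, freqs):
        print(f"{label}: f_n ~ {f:.1f} Hz")

    # Design check: each stage should be stiffer (higher f_n) than the one outside it.
    assert all(outer < inner for outer, inner in zip(freqs, freqs[1:])), \
        "inner stages must have higher natural frequencies than outer stages"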

    Sequential design of computer experiments for the estimation of a probability of failure

    This paper deals with the problem of estimating the volume of the excursion set of a function $f:\mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian decision-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of $f$ and aim at performing evaluations of $f$ as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure. Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
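
    The Python sketch below shows the flavor of such a sequential, Gaussian-process-based design: starting from a small initial design, each iteration adds the candidate input whose GP prediction is most ambiguous about exceeding the failure threshold, and the probability of failure is then estimated from the fitted model over a Monte Carlo sample of inputs. This simple acquisition rule is a stand-in for the SUR criteria derived in the paper, and the test function, threshold, input distribution, and use of scikit-learn are arbitrary assumptions.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(3)

    def f(x):                                       # expensive simulator stand-in
        return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

    u = 1.2                                         # failure threshold
    x_mc = rng.normal(0.0, 1.0, size=(20_000, 2))   # Monte Carlo sample of inputs

    X = rng.normal(0.0, 1.0, size=(10, 2))          # small initial design
    y = f(X)
    for _ in range(30):                             # sequential enrichment
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                      normalize_y=True).fit(X, y)
        mu, sd = gp.predict(x_mc, return_std=True)
        p_exceed = norm.sf((u - mu) / np.maximum(sd, 1e-9))
        # Add the point whose classification (failure vs. safe) is most uncertain.
        x_next = x_mc[np.argmax(np.minimum(p_exceed, 1.0 - p_exceed))][None, :]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))

    # Plug-in estimate of the probability of failure from the final GP model.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True).fit(X, y)
    mu, sd = gp.predict(x_mc, return_std=True)
    prob_failure = float(np.mean(norm.sf((u - mu) / np.maximum(sd, 1e-9))))
    print("estimated probability of failure:", prob_failure)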