
    Model based safety analysis for an Unmanned Aerial System

    This paper describes safety architectures of autonomous systems using the Event-B formal method. Autonomous systems combine various activities which can be organised in layers, and the Event-B formalism supports the rigorous design of this kind of system well. Its refinement mechanism allows progressive modelling, checking the correctness and relevance of the models by discharging proof obligations. Applying the Event-B method within the framework of layered architecture specification enables desired global properties to emerge from layer interactions. The safety objectives are derived in each layer and involve static and dynamic properties such as an independence property, a redundancy property, or a sequencing property. The originality of our approach is to consider a refinement process between two layers in which the abstract model is the model of the lower layer. In our modelling, we distinguish nominal behaviour from abnormal behaviour in order to establish failure propagation through the architecture.
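    As a loose illustration of this style (not the paper's actual Event-B model), the Python sketch below mimics guarded events over a system state, with a safety invariant checked after each event playing the role of a proof obligation; all names, events, and the invariant itself are invented for the example.

```python
# All names, events, and the invariant are invented for this sketch.
state = {"mode": "nominal", "backup_active": False}

def invariant(s):
    # safety obligation: in abnormal mode the redundant backup must be active
    return s["mode"] == "nominal" or s["backup_active"]

def sensor_failure(s):             # abnormal-behaviour event
    if s["mode"] == "nominal":     # guard
        s["mode"] = "abnormal"     # action: the failure propagates to this layer
        s["backup_active"] = True  # redundancy property: activate the backup

def climb(s):                      # nominal-behaviour event
    if s["mode"] == "nominal":     # guard: only available in nominal mode
        pass                       # action omitted in this sketch

for event in (climb, sensor_failure):
    event(state)
    assert invariant(state), "safety invariant violated"

print("all events preserved the invariant:", state)
```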

    Crossover Scaling of Wavelength Selection in Directional Solidification of Binary Alloys

    We simulate dendritic growth in directional solidification of dilute binary alloys using a phase-field model solved with adaptive mesh refinement. The spacing of primary branches is examined for a range of thermal gradients and alloy compositions and is found to exhibit a maximum as a function of pulling velocity, in agreement with experimental observations. We demonstrate that wavelength selection is unambiguously described by a non-trivial crossover scaling function from the emergence of cellular growth to the onset of dendritic fingers, a result validated using published experimental data.

    Modelling binary alloy solidification with adaptive mesh refinement

    The solidification of a binary alloy results in the formation of a porous mushy layer, within which spontaneous localisation of fluid flow can lead to the emergence of features over a range of spatial scales. We describe a finite volume method for simulating binary alloy solidification in two dimensions with local mesh refinement in space and time. The coupled heat, solute, and mass transport is described using an enthalpy method, with flow described by a Darcy-Brinkman equation across porous and liquid regions. The resulting equations are solved on a hierarchy of block-structured adaptive grids. A projection method is used to compute the fluid velocity, whilst the viscous and nonlinear diffusive terms are calculated using a semi-implicit scheme. A series of synchronization steps ensures that the scheme is flux-conservative and corrects for errors that arise at the boundaries between different levels of refinement. We also develop a corresponding method using Darcy's law for flow in a porous medium/narrow Hele-Shaw cell. We demonstrate the accuracy and efficiency of our method using established benchmarks for solidification without flow and convection in a fixed porous medium, along with convergence tests for the fully coupled code. Finally, we demonstrate the ability of our method to simulate transient mushy layer growth with narrow liquid channels which evolve over time.
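    For readers unfamiliar with the enthalpy method mentioned above, the following minimal Python sketch solves a 1D Stefan (melting) problem without flow; the scheme, grid, and parameter values are illustrative and far simpler than the paper's flux-conservative adaptive method.

```python
import numpy as np

# 1D explicit enthalpy method for a Stefan (melting) problem; all
# parameter values are illustrative (melting temperature T_m = 0).
nx = 100
dx = 1.0 / (nx - 1)
k, c, Lf = 1.0, 1.0, 1.0          # conductivity, heat capacity, latent heat
dt = 0.4 * dx**2 * c / k          # explicit diffusion stability limit

T = np.full(nx, -0.5)             # initially solid, below the melting point
T[0] = 1.0                        # hot boundary drives a melting front
H = np.where(T < 0.0, c * T, c * T + Lf)   # enthalpy consistent with T

for step in range(5000):
    # conservative update of enthalpy: dH/dt = k * d2T/dx2 (interior points)
    H[1:-1] += dt * k * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    # recover temperature from enthalpy: latent heat absorbed at T = 0
    T = np.where(H < 0.0, H / c,
                 np.where(H > Lf, (H - Lf) / c, 0.0))
    T[0], T[-1] = 1.0, -0.5       # fixed boundary temperatures

front = np.argmax(H < Lf)         # first cell that is not fully liquid
print(f"melting front near x = {front * dx:.3f}")
```

    Tracking enthalpy rather than temperature lets the phase boundary emerge from the update itself, with no explicit front tracking; that is the property the paper's method extends to coupled heat, solute, and mass transport with flow.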

    Standardization and Control for Confounding in Observational Studies: A Historical Perspective

    Control for confounders in observational studies was generally handled through stratification and standardization until the 1960s. Standardization typically reweights the stratum-specific rates so that exposure categories become comparable. With the development first of loglinear models, and soon also of nonlinear regression techniques (logistic regression, failure-time regression) that the emerging computers could handle, regression modelling became the preferred approach, just as was already the case with multiple regression analysis for continuous outcomes. Since the mid-1990s it has become increasingly obvious that weighting methods are still often useful, and sometimes even necessary. Against this background we describe the emergence of the modelling approach and the refinement of the weighting approach for confounder control. (Published in Statistical Science by the Institute of Mathematical Statistics, http://dx.doi.org/10.1214/13-STS453.)
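    As a minimal illustration of direct standardization (with invented numbers, not data from the paper), the sketch below reweights stratum-specific rates to a shared standard population so that two exposure groups become comparable:

```python
# Direct standardization: stratum-specific rates are reweighted to a
# shared standard population. All numbers are made up for illustration.
strata = ["young", "old"]
standard_weights = {"young": 0.6, "old": 0.4}   # shared reference population

# stratum-specific event rates within each exposure group
rates = {
    "exposed":   {"young": 0.02, "old": 0.10},
    "unexposed": {"young": 0.01, "old": 0.08},
}

for group, by_stratum in rates.items():
    standardized = sum(standard_weights[s] * by_stratum[s] for s in strata)
    print(f"{group}: standardized rate = {standardized:.3f}")
```

    Because both groups are weighted to the same age structure, the difference between the two standardized rates is no longer confounded by age.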

    Mixed precision GMRES-based iterative refinement with recycling

    With the emergence of mixed precision hardware, mixed precision GMRES-based iterative refinement schemes for solving linear systems Ax=b have recently been developed. However, in certain settings, GMRES may require too many iterations per refinement step, making it potentially more expensive than the alternative of recomputing the LU factors in a higher precision. In this work, we incorporate Krylov subspace recycling, a well-known technique for reusing information across sequential invocations of a Krylov subspace method, into a mixed precision GMRES-based iterative refinement solver. The insight is that each refinement step calls preconditioned GMRES on a linear system with the same coefficient matrix A. In this way, the GMRES solves in subsequent refinement steps can be accelerated by recycling information obtained from previous steps. We perform numerical experiments on various random dense problems, Toeplitz problems, and problems from real applications, which confirm the benefits of the recycling approach.
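    The sketch below illustrates the basic mixed precision iterative refinement loop in Python/SciPy, with a single-precision LU factorization used both for the initial solve and as the GMRES preconditioner. The recycling component itself is only indicated in a comment, since stock SciPy has no recycling-enabled GMRES, and the test matrix is invented for illustration.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve
from scipy.sparse.linalg import LinearOperator, gmres

# Well-conditioned random test system, invented for illustration.
rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)

# Factorize once in low (single) precision.
lu, piv = lu_factor(A.astype(np.float32))

def low_prec_solve(v):
    # One single-precision LU solve, promoted back to double precision.
    return lu_solve((lu, piv), v.astype(np.float32)).astype(np.float64)

M = LinearOperator((n, n), matvec=low_prec_solve)  # GMRES preconditioner

x = low_prec_solve(b)                 # initial low-precision solve
for step in range(5):
    r = b - A @ x                     # residual in working (double) precision
    # Solve the correction equation A d = r with preconditioned GMRES.
    # The paper's recycling idea would reuse Krylov subspace information
    # here across steps, since every solve involves the same matrix A.
    d, info = gmres(A, r, M=M)
    x = x + d
    print(f"step {step}: residual norm = {np.linalg.norm(b - A @ x):.2e}")
```

    Because every refinement step solves a system with the same A, any Krylov information retained from one call to GMRES remains relevant to the next, which is what makes recycling attractive in this setting.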

    What Is a Macrostate? Subjective Observations and Objective Dynamics

    We consider the question of whether thermodynamic macrostates are objective consequences of dynamics, or subjective reflections of our ignorance of a physical system. We argue that they are both; more specifically, that the set of macrostates forms the unique maximal partition of phase space which 1) is consistent with our observations (a subjective fact about our ability to observe the system) and 2) obeys a Markov process (an objective fact about the system's dynamics). We review the ideas of computational mechanics, an information-theoretic method for finding optimal causal models of stochastic processes, and argue that macrostates coincide with the "causal states" of computational mechanics. Defining a set of macrostates thus consists of an inductive process where we start with a given set of observables and then refine our partition of phase space until we reach a set of states which predict their own future, i.e. which are Markovian. Macrostates arrived at in this way are provably optimal statistical predictors of the future values of our observables.
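    The refinement procedure described above can be sketched for a finite Markov chain: starting from the partition induced by an observable, split any block whose states disagree on their block-to-block transition probabilities, and repeat until the lumped process is Markovian. The chain and the initial partition below are invented for illustration.

```python
import numpy as np

# Toy microstate dynamics and an initial "observable" partition, both
# invented for illustration (the observable only detects microstate 3).
P = np.array([[0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
partition = [{0, 1, 2}, {3}]

def signature(state, partition):
    # probability of jumping from `state` into each block of the partition
    return tuple(round(P[state, list(block)].sum(), 12) for block in partition)

changed = True
while changed:   # refine until every block is internally consistent (Markovian)
    changed = False
    refined = []
    for block in partition:
        groups = {}
        for s in sorted(block):
            groups.setdefault(signature(s, partition), set()).add(s)
        refined.extend(groups.values())
        changed = changed or len(groups) > 1
    partition = refined

print("macrostates (candidate causal states):", partition)
# -> [{0, 1}, {2}, {3}]: microstates 0 and 1 are lumped into one macrostate.
```

    Microstates 0 and 1 end up in the same block because they have identical block-level futures, which is exactly the "states which predict their own future" criterion of the abstract.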

    A synthesis of logic and biology in the design of dependable systems

    The technologies of model-based design and dependability analysis in the design of dependable systems, including software-intensive systems, have advanced in recent years. Much of this development can be attributed to advances in formal logic and their application to fault forecasting and verification of systems. In parallel, work on bio-inspired technologies has shown potential for the evolutionary design of engineering systems via automated exploration of potentially large design spaces. We have not yet seen the emergence of a design paradigm that effectively combines, throughout the design lifecycle, these two techniques, which rest on the two pillars of formal logic and biology. Such a design paradigm would apply these techniques synergistically and systematically from the early stages of design to enable optimal refinement of new designs, driven effectively by dependability requirements. The paper sketches such a model-centric paradigm for the design of dependable systems that brings these technologies together to realise their combined potential benefits.
    • ā€¦
    corecore