    Prediction intervals for reliability growth models with small sample sizes

    Engineers and practitioners contribute to society through their ability to apply basic scientific principles to real problems in an effective and efficient manner. They must collect data to test their products every day as part of the design and testing process, and also after the product or process has been rolled out, to monitor its effectiveness. Model building, data collection, data analysis and data interpretation form the core of sound engineering practice. After the data have been gathered, the engineer must be able to sift and interpret them correctly so that meaning can be extracted from a mass of undifferentiated numbers or facts. To do this he or she must be familiar with the fundamental concepts of correlation, uncertainty and variability, and of risk in the face of uncertainty. In today's global and highly competitive environment, continuous improvement in the processes and products of any field of engineering is essential for survival. Many organisations have shown that the first step to continuous improvement is to integrate the widespread use of statistics and basic data analysis into the manufacturing development process, as well as into the day-to-day business decisions taken in regard to engineering processes. The Springer Handbook of Engineering Statistics gathers together the full range of statistical techniques required by engineers from all fields to gain sensible statistical feedback on how their processes or products are functioning and to give them realistic predictions of how these could be improved.

    Data-driven and Model-based Verification: a Bayesian Identification Approach

    This work develops a measurement-driven and model-based formal verification approach, applicable to systems with partly unknown dynamics. We provide a principled method, grounded on reachability analysis and on Bayesian inference, to compute the confidence that a physical system driven by external inputs and accessed under noisy measurements satisfies a temporal logic property. A case study is discussed, where we investigate the bounded- and unbounded-time safety of a partly unknown linear time-invariant system.
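
    A minimal sketch of the general idea, not the paper's algorithm: infer a posterior over an unknown dynamics parameter from noisy observations of the state evolution, then take the confidence in a bounded-time safety property to be the posterior mass of those dynamics for which the property holds. The scalar system, flat prior, noise level, safe bound and horizon below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" scalar system x[k+1] = a*x[k] + noise, observed for 20 steps.
true_a, noise_std = 0.8, 0.05
x_obs = [1.0]
for _ in range(20):
    x_obs.append(true_a * x_obs[-1] + rng.normal(0.0, noise_std))
x_obs = np.array(x_obs)

# Posterior over the unknown gain `a` on a grid (flat prior, Gaussian likelihood).
a_grid = np.linspace(0.0, 1.2, 601)
log_lik = np.array([
    -0.5 * np.sum((x_obs[1:] - a * x_obs[:-1]) ** 2) / noise_std ** 2
    for a in a_grid
])
post = np.exp(log_lik - log_lik.max())
post /= post.sum()

# Bounded-time safety property: |x[k]| <= 1.5 for all k <= 30, starting from x[0] = 1.
def is_safe(a, x0=1.0, horizon=30, bound=1.5):
    x = x0
    for _ in range(horizon):
        x = a * x
        if abs(x) > bound:
            return False
    return True

# Confidence = posterior probability mass of the dynamics satisfying the property.
confidence = post[[is_safe(a) for a in a_grid]].sum()
print(f"Posterior confidence that the safety property holds: {confidence:.3f}")
```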

    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
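
    As a small, hedged illustration of the kind of computation a probabilistic model checker automates, the sketch below evaluates the report's example property "the probability of both sensors failing simultaneously is less than 0.001" on a tiny discrete-time Markov chain. The chain, its transition probabilities and the 100-step horizon are invented for illustration and are not taken from the report.

```python
import numpy as np

# States: 0 = both sensors working, 1 = one sensor failed, 2 = both failed.
# Transition probabilities per time step (invented; each row sums to 1).
P = np.array([
    [0.9990, 0.0010, 0.0000],
    [0.0200, 0.9798, 0.0002],   # a failed sensor may also be repaired
    [0.0000, 0.0000, 1.0000],   # "both failed" is absorbing
])

# Property: "the probability that both sensors have failed within 100 steps
# is less than 0.001", checked by propagating the initial state distribution.
dist = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    dist = dist @ P

p_fail = dist[2]
print(f"P(both sensors failed within 100 steps) = {p_fail:.6f}")
print("property holds" if p_fail < 0.001 else "property violated")
```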

    Speaker segmentation and clustering

    This survey focuses on two challenging speech processing topics, namely: speaker segmentation and speaker clustering. Speaker segmentation aims at finding speaker change points in an audio stream, whereas speaker clustering aims at grouping speech segments based on speaker characteristics. Model-based, metric-based, and hybrid speaker segmentation algorithms are reviewed. Concerning speaker clustering, deterministic and probabilistic algorithms are examined. A comparative assessment of the reviewed algorithms is undertaken, the algorithm advantages and disadvantages are indicated, insight into the algorithms is offered, and deductions as well as recommendations are given. Rich transcription and movie analysis are candidate applications that benefit from combined speaker segmentation and clustering. © 2007 Elsevier B.V. All rights reserved.
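
    The sketch below illustrates one classic metric-based segmentation criterion of the kind reviewed in such surveys: a Bayesian Information Criterion (BIC) test for a speaker change point between two adjacent feature windows. The synthetic features, window sizes and penalty weight are illustrative assumptions, not values taken from the survey.

```python
import numpy as np

def delta_bic(X, Y, lam=1.0):
    """BIC change test: a positive value suggests a speaker change between X and Y."""
    Z = np.vstack([X, Y])
    n_x, n_y, n_z, d = len(X), len(Y), len(Z), Z.shape[1]

    def logdet_cov(A):
        # Regularised covariance log-determinant of a block of feature vectors.
        cov = np.cov(A, rowvar=False) + 1e-6 * np.eye(A.shape[1])
        return np.linalg.slogdet(cov)[1]

    # Log-likelihood gain of modelling the two blocks with two Gaussians rather
    # than one, minus a BIC penalty for the extra mean and covariance parameters.
    gain = 0.5 * (n_z * logdet_cov(Z) - n_x * logdet_cov(X) - n_y * logdet_cov(Y))
    penalty = 0.5 * lam * (d + 0.5 * d * (d + 1)) * np.log(n_z)
    return gain - penalty

# Toy usage: two windows of 12-dimensional features from different distributions.
rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(200, 12))   # "speaker A" features
Y = rng.normal(1.5, 1.2, size=(200, 12))   # "speaker B" features
print("same speaker  -> change detected:", delta_bic(X, X[::-1]) > 0)  # expect False
print("two speakers  -> change detected:", delta_bic(X, Y) > 0)        # expect True
```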

    A proposed framework for characterising uncertainty and variability in rock mechanics and rock engineering

    This thesis develops a novel understanding of the fundamental issues in characterising and propagating unpredictability in rock engineering design. This unpredictability stems from the inherent complexity and heterogeneity of fractured rock masses as engineering media. It establishes the importance of: a) recognising that unpredictability results from epistemic uncertainty (i.e. resulting from a lack of knowledge) and aleatory variability (i.e. due to inherent randomness), and b) the means by which uncertainty and variability associated with the parameters that characterise fractured rock masses are propagated through the modelling and design process. Through a critical review of the literature, this thesis shows that in geotechnical engineering – rock mechanics and rock engineering in particular – there is a lack of recognition of the existence of epistemic uncertainty and aleatory variability, and hence inappropriate design methods are often used. To overcome this, a novel taxonomy is developed and presented that facilitates characterisation of epistemic uncertainty and aleatory variability in the context of rock mechanics and rock engineering. Using this taxonomy, a new framework is developed that gives a protocol for correctly propagating uncertainty and variability through engineering calculations. The effectiveness of the taxonomy and the framework is demonstrated through their application to simple challenge problems commonly found in rock engineering. This new taxonomy and framework will provide engineers engaged in preparing rock engineering designs with an objective means of characterising unpredictability in parameters commonly used to define properties of fractured rock masses. These new tools will also provide engineers with a means of clearly understanding the true nature of unpredictability inherent in rock mechanics and rock engineering, and thus direct selection of an appropriate unpredictability model to propagate unpredictability faithfully through engineering calculations. Thus, the taxonomy and framework developed in this thesis provide practical tools to improve the safety of rock engineering designs through an improved understanding of the unpredictability concepts.
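
    As a generic, simplified illustration (not the thesis's taxonomy or framework), the sketch below keeps aleatory variability and epistemic uncertainty separate while propagating them through a planar-slide factor-of-safety calculation: the friction angle is treated as aleatory and sampled from a distribution, while cohesion is treated as epistemic and swept over an interval, yielding bounds on the failure probability rather than a single number. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical planar slide: FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)).
A, W, psi = 50.0, 1200.0, np.radians(35.0)   # plane area, block weight, plane dip

def prob_failure(c, n=20_000):
    """Inner loop: aleatory variability in friction angle via Monte Carlo sampling."""
    phi = rng.normal(np.radians(30.0), np.radians(3.0), size=n)
    fs = (c * A + W * np.cos(psi) * np.tan(phi)) / (W * np.sin(psi))
    return np.mean(fs < 1.0)

# Outer loop: epistemic uncertainty in cohesion handled as an interval sweep,
# giving bounds on the failure probability instead of a single value.
pf = [prob_failure(c) for c in np.linspace(2.0, 6.0, 9)]
print(f"Failure probability bounds over the cohesion interval: [{min(pf):.3f}, {max(pf):.3f}]")
```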

    Bayesian updating of soil-water character curve parameters based on the monitor data of a large-scale landslide model experiment

    It is important to determine the soil-water characteristic curve (SWCC) for analyzing landslide seepage under varying hydrodynamic conditions. However, the SWCC exhibits high uncertainty due to the variability inherent in soil. To this end, a Bayesian updating framework based on the experimental data was developed to investigate the uncertainty of the SWCC parameters in this study. The objectives of this research were to quantify the uncertainty embedded within the SWCC and determine the critical factors affecting an unsaturated soil landslide under hydrodynamic conditions. For this purpose, a large-scale landslide experiment was conducted, and the monitored water content data were collected. Steady-state seepage analysis was carried out using the finite element method (FEM) to simulate the slope behavior during water level change. In the proposed framework, the parameters of the SWCC model were treated as random variables and parameter uncertainties were evaluated using the Bayesian approach based on the Markov chain Monte Carlo (MCMC) method. Observed data from the large-scale landslide experiment were used to calculate the posterior information of the SWCC parameters. Then, 95% confidence intervals for the model parameters of the SWCC were derived. The results show that the Bayesian updating method is feasible for use with the monitoring data of large-scale landslide model experiments. The establishment of an artificial neural network (ANN) surrogate model in the Bayesian updating process can greatly improve the efficiency of Bayesian model updating.
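
    A minimal sketch of the general approach, not the paper's implementation (which uses real monitoring data and an ANN surrogate): random-walk Metropolis-Hastings updating of van Genuchten SWCC parameters from synthetic water content observations, followed by 95% credible intervals. The priors, noise level and "observed" data below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

theta_r, theta_s = 0.05, 0.40                    # residual / saturated water content

def swcc(suction, alpha, n):
    """van Genuchten soil-water characteristic curve."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * suction) ** n) ** m

# Synthetic "monitoring" observations at several suction levels (kPa).
suction = np.array([1.0, 5.0, 10.0, 30.0, 60.0, 100.0])
obs = swcc(suction, alpha=0.08, n=1.6) + rng.normal(0.0, 0.01, suction.size)

def log_post(alpha, n, sigma=0.01):
    # Uniform priors on physically plausible ranges, Gaussian measurement error.
    if not (0.001 < alpha < 1.0 and 1.05 < n < 4.0):
        return -np.inf
    resid = obs - swcc(suction, alpha, n)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis-Hastings over (alpha, n).
samples, cur, cur_lp = [], np.array([0.05, 1.5]), log_post(0.05, 1.5)
for _ in range(20_000):
    prop = cur + rng.normal(0.0, [0.01, 0.05])
    lp = log_post(*prop)
    if np.log(rng.random()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    samples.append(cur)

post = np.array(samples[5000:])                  # discard burn-in
for name, col in zip(["alpha", "n"], post.T):
    lo, hi = np.percentile(col, [2.5, 97.5])
    print(f"95% credible interval for {name}: [{lo:.3f}, {hi:.3f}]")
```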