
    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
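
    Properties of the kind quoted above are typically checked by computing reachability probabilities on a probabilistic model such as a discrete-time Markov chain. The Python sketch below illustrates the core recurrence for bounded reachability ("probability of failing within k steps") on a small three-state failure model; the states and transition probabilities are invented for illustration, not taken from the report.

        # A minimal sketch of the core computation behind probabilistic
        # model checking: the probability of reaching a failure state
        # within a bounded number of steps of a discrete-time Markov
        # chain. States and numbers are illustrative assumptions only.
        import numpy as np

        # States: 0 = ok, 1 = degraded, 2 = failed (absorbing).
        P = np.array([
            [0.95, 0.04, 0.01],
            [0.00, 0.90, 0.10],
            [0.00, 0.00, 1.00],
        ])

        def prob_fail_within(P, fail_state, steps):
            """Pr(reach fail_state within `steps` transitions), per start state."""
            x = np.zeros(P.shape[0])
            x[fail_state] = 1.0        # already failed => probability 1
            for _ in range(steps):
                x = P @ x              # one backward step of the recurrence
                x[fail_state] = 1.0    # failure is absorbing
            return x

        # Probability of failure within 20 steps from each start state.
        print(prob_fail_within(P, fail_state=2, steps=20))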

    Modelling of the drying section of a continuous paper machine : a thesis presented in partial fulfilment of the requirement for the degree of Master in Production Technology at Massey University

    The invention of paper in 105 A.D. was a milestone in the history of civilization and demand for paper has been increasing steadily ever since. Although it has become more and more popular to store, process and transfer information in electronic forms, paper is to date still the most common means of recording information. According to Storat (1993), production in the last twenty years has increased by more than 60 percent, while capital expenditures in the industry have grown to almost 12 percent of sales, or double the average expenditures of other manufacturing industries. This capital investment has gone towards capacity expansion and extensive rebuilds of existing mills - almost 60 percent of the existing capacity comes from modern facilities containing machines either newly installed or rebuilt in the past ten years. As a result, fossil fuel and energy consumption in this industry fell by 46 percent in the last two decades. [FROM INTRODUCTION]

    Mathematical models of avascular cancer

    This review will outline a number of illustrative mathematical models describing the growth of avascular tumours. The aim of the review is to provide a relatively comprehensive list of existing models in this area and discuss several representative models in greater detail. In the latter part of the review, some possible future avenues of mathematical modelling of avascular tumour development are outlined together with a list of key questions.
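
    One of the simplest growth laws appearing in this literature is the Gompertz model, in which a tumour's growth rate decays as its volume approaches a limiting size. The sketch below integrates it numerically; the parameter values are illustrative assumptions, not figures from the review.

        # A minimal sketch of Gompertzian growth, a classic avascular
        # tumour growth law: dV/dt = a * V * ln(K / V), where K is the
        # limiting volume. Parameter values are illustrative only.
        import math

        def gompertz_volume(v0, a, K, t_end, dt=0.01):
            """Integrate dV/dt = a*V*ln(K/V) with the explicit Euler method."""
            v, t = v0, 0.0
            while t < t_end:
                v += dt * a * v * math.log(K / v)
                t += dt
            return v

        # A tumour starting at 1 mm^3 approaches its limiting size K over time.
        for t in (0, 10, 50, 100):
            print(t, gompertz_volume(v0=1.0, a=0.1, K=1000.0, t_end=t))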

    Several types of types in programming languages

    Types are an important part of any modern programming language, but we often forget that the concept of type we understand nowadays is not the same as it was perceived in the sixties. Moreover, we conflate the concept of "type" in programming languages with the concept of the same name in mathematical logic, an identification that is only the result of the convergence of two different paths, which started apart with different aims. The paper will present several remarks (some historical, some of a more conceptual character) on the subject, as a basis for further investigation. The thesis we will argue is that there are three different characters at play in programming languages, all of them now called types: the technical concept used in language design to guide implementation; the general abstraction mechanism used as a modelling tool; the classifying tool inherited from mathematical logic. We will suggest three possible dates ad quem for their presence in the programming language literature, suggesting that the emergence of the concept of type in computer science is relatively independent from the logical tradition, until the Curry-Howard isomorphism made an explicit bridge between them.
    Comment: History and Philosophy of Computing, HAPOC 2015. To appear in LNC
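
    The Curry-Howard correspondence mentioned at the end can be made concrete in a proof assistant, where a single term is read both as a program and as a proof. Below is a minimal illustration in Lean of the correspondence itself, not an example from the paper.

        -- Under Curry-Howard, the type A ∧ B → A is read as the
        -- proposition "A and B implies A", and the term below is
        -- simultaneously a program of that type and a proof of it.
        example (A B : Prop) : A ∧ B → A :=
          fun h => h.left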

    Mathematical models in physiology

    Computational modelling of biological processes and systems has witnessed a remarkable development in recent years. The search term (modelling OR modeling) yields over 58000 entries in PubMed, with more than 34000 since the year 2000: thus, almost two-thirds of papers appeared in the last 5–6 years, compared to only about one-third in the preceding 5–6 decades.

    The development is fuelled both by the continuously improving tools and techniques available for bio-mathematical modelling and by the increasing demand for quantitative assessment of element inter-relations in complex biological systems. This has given rise to a worldwide public domain effort to build a computational framework that provides a comprehensive theoretical representation of integrated biological function: the Physiome.

    The current and next issues of this journal are devoted to a small sub-set of this initiative and address biocomputation and modelling in physiology, illustrating the breadth and depth of experimental data-based model development in biological research, from sub-cellular events to whole organ simulations.

    Formal analysis techniques for gossiping protocols

    We give a survey of formal verification techniques that can be used to corroborate existing experimental results for gossiping protocols in a rigorous manner. We present properties of interest for gossiping protocols and discuss how various formal evaluation techniques can be employed to predict them.
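
    For readers unfamiliar with this protocol class: in push gossip, every informed node forwards the message each round to a peer chosen at random, and a typical property of interest is how quickly the information spreads. The sketch below is a Monte Carlo illustration of that dynamic for intuition only; it is not one of the formal techniques surveyed.

        # A minimal sketch of push gossip: each round, every informed
        # node pushes the message to one peer chosen uniformly at
        # random. We estimate the expected number of rounds until all
        # n nodes are informed. Simulation for intuition only.
        import random

        def rounds_to_full_spread(n, seed=None):
            rng = random.Random(seed)
            informed = {0}                         # node 0 starts with the message
            rounds = 0
            while len(informed) < n:
                for node in list(informed):        # snapshot of this round's senders
                    informed.add(rng.randrange(n)) # push to a random peer
                rounds += 1
            return rounds

        trials = [rounds_to_full_spread(100, seed=s) for s in range(200)]
        print(sum(trials) / len(trials))           # roughly O(log n) rounds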

    Two-Dimensional Mathematical Modelling of a Dam-Break Wave in a Narrow Steep Stream

    The paper deals with hydraulic aspects of a wave emerging as a result of a potential dam break of the upper storage reservoir of the pumped-storage hydropower plant Kolarjev vrh. A two-dimensional depth-averaged mathematical approach was used. The upper storage reservoir and its dam failure were modelled with the mathematical model PCFLOW2D, which is based on a Cartesian coordinate numerical mesh. The results of PCFLOW2D were used as the upper boundary condition for the mathematical model PCFLOW2D-ORTHOCURVE, based on an orthogonal curvilinear numerical mesh. The model PCFLOW2D-ORTHOCURVE provided a tool for the analysis of flood wave flow in a steep, narrow and geometrically diversified stream channel. The classic Manning's equation fails to give good results for streams with steep bed slopes, and therefore a different equation should be used. Rickenmann's equation was chosen, presented in a form similar to Manning's equation; for the purpose of the example given here, the equation was somewhat simplified and adapted to the data available. The roughness coefficient used at each calculation cell depended on the slope of that cell. The results of numerical calculations were compared to measurements carried out on a physical model at a scale of 1:200. Given the complexity of the flow phenomenon, a rather good agreement of maximum depths was established: only at one gauge was the difference in water depth up to 27%, while at the other four it was 7% of water depth on average.
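
    For context, Manning's equation relates depth-averaged velocity to the hydraulic radius R, bed slope S and a roughness coefficient n. The sketch below computes it and shows the paper's idea of letting n vary with the local slope of each cell; the particular n(S) relation used here is an invented placeholder, since the abstract does not give the simplified Rickenmann form actually fitted in the paper.

        # Manning's equation (SI units): v = (1/n) * R**(2/3) * S**(1/2).
        # The paper makes the roughness n a function of the local bed
        # slope S; the n(S) relation below is an invented placeholder,
        # not the simplified Rickenmann form used in the paper.

        def manning_velocity(n, R, S):
            """Depth-averaged velocity [m/s] for hydraulic radius R [m], slope S [-]."""
            return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

        def slope_dependent_n(S, n0=0.03, k=0.15):
            """Placeholder roughness coefficient that grows with bed slope."""
            return n0 + k * S

        for S in (0.01, 0.05, 0.20):   # mild to steep calculation cells
            n = slope_dependent_n(S)
            print(S, n, manning_velocity(n, R=1.0, S=S))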

    Advances in computational modelling for personalised medicine after myocardial infarction

    Myocardial infarction (MI) is a leading cause of premature morbidity and mortality worldwide. Determining which patients will experience heart failure and sudden cardiac death after an acute MI is notoriously difficult for clinicians. The extent of heart damage after an acute MI is informed by cardiac imaging, typically using echocardiography or, sometimes, cardiac magnetic resonance (CMR). These scans provide complex data sets that are only partially exploited by clinicians in daily practice, implying potential for improved risk assessment. Computational modelling of left ventricular (LV) function can bridge the gap towards personalised medicine using cardiac imaging in post-MI patients. Several novel biomechanical parameters have theoretical prognostic value and may be useful to reflect the biomechanical effects of novel preventive therapy for adverse remodelling post-MI. These parameters include myocardial contractility (regional and global), stiffness and stress. Further, the parameters can be delineated spatially to correspond with infarct pathology and the remote zone. While these parameters hold promise, there are challenges for translating MI modelling into clinical practice, including model uncertainty, validation and verification, as well as time-efficient processing. More research is needed to (1) simplify imaging with CMR in post-MI patients, while preserving diagnostic accuracy and patient tolerance, and (2) assess and validate novel biomechanical parameters against established prognostic biomarkers, such as LV ejection fraction and infarct size. Accessible software packages with minimal user interaction are also needed. Translating benefits to patients will be achieved through a multidisciplinary approach including clinicians, mathematicians, statisticians and industry partners.
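
    One of the simplest biomechanical indices of the kind mentioned is LV wall stress, often estimated from cavity pressure and geometry with the textbook Laplace relation for a thin-walled sphere. The sketch below uses that relation as a generic illustration; it is not one of the paper's personalised model-derived parameters, and the numbers are illustrative assumptions.

        # LV wall stress estimated with the thin-walled-sphere Laplace
        # relation: sigma = P * r / (2 * h), where P is cavity pressure,
        # r the cavity radius and h the wall thickness. A generic
        # textbook illustration, not the paper's personalised model.

        def laplace_wall_stress(p_kpa, radius_mm, thickness_mm):
            """Mean wall stress [kPa] for a thin-walled spherical ventricle."""
            return p_kpa * radius_mm / (2.0 * thickness_mm)

        # Illustrative numbers only: systolic pressure ~16 kPa (120 mmHg),
        # cavity radius 25 mm, wall thickness 10 mm.
        print(laplace_wall_stress(p_kpa=16.0, radius_mm=25.0, thickness_mm=10.0))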

    Simulation modelling: Educational development roles for learning technologists

    Simulation modelling was in the mainstream of CAL development in the 1980s, when the late David Squires introduced this author to the Dynamic Modelling System. Since those early days, it seems that simulation modelling has drifted into a learning technology backwater to become a member of Laurillard's underutilized, 'adaptive and productive' media. Referring to her Conversational Framework, Laurillard constructs a pedagogic case for modelling as a productive student activity but provides few references to current practice and available resources. This paper seeks to complement her account by highlighting the pioneering initiatives of the Computers in the Curriculum Project and more recent developments in systems modelling within geographic and business education. The latter include improvements to system dynamics modelling programs such as STELLA®, the publication of introductory textbooks, and the emergence of online resources. The paper indicates several ways in which modelling activities may be approached and identifies some educational development roles for learning technologists. The paper concludes by advocating simulation modelling as an exemplary use of learning technologies: one that realizes their creative-transformative potential.
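
    For readers unfamiliar with STELLA-style system dynamics: a model is a set of stocks updated over time by flows. The sketch below implements a minimal stock-and-flow model (a population stock with birth and death flows) in Python; it is a generic illustration of the modelling style, not an example from the paper.

        # A minimal stock-and-flow (system dynamics) model of the kind
        # built in STELLA: one stock (population) updated each time step
        # by two flows (births and deaths). Generic illustration only.

        def simulate_population(p0=1000.0, birth_rate=0.03, death_rate=0.01,
                                years=50, dt=1.0):
            population = p0                       # the stock
            history = [population]
            for _ in range(int(years / dt)):
                births = birth_rate * population  # inflow
                deaths = death_rate * population  # outflow
                population += dt * (births - deaths)
                history.append(population)
            return history

        print(simulate_population()[-1])          # stock after 50 years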