
    Mathematical models of avascular cancer

    This review will outline a number of illustrative mathematical models describing the growth of avascular tumours. The aim of the review is to provide a relatively comprehensive list of existing models in this area and discuss several representative models in greater detail. In the latter part of the review, some possible future avenues of mathematical modelling of avascular tumour development are outlined together with a list of key questions
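A common starting point in this literature is Gompertz growth, in which an avascular tumour's volume saturates at a diffusion-limited plateau. The sketch below illustrates the closed-form law with hypothetical parameter values (not taken from the review):

```python
import math

def gompertz_volume(v0, k, b, t):
    # Gompertz law: V(t) = K * (V0/K)^exp(-b*t).
    # Growth decelerates as the tumour nears its
    # diffusion-limited plateau volume K.
    return k * (v0 / k) ** math.exp(-b * t)

# Hypothetical parameters: initial volume 1 mm^3,
# plateau 1000 mm^3, rate b = 0.1 per day.
for t in (0, 10, 30, 100):
    print(f"day {t:3d}: {gompertz_volume(1.0, 1000.0, 0.1, t):8.2f} mm^3")
```

The exponent decays with time, so early growth is near-exponential while late growth flattens towards K, the qualitative behaviour most avascular models aim to capture.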

    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling
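As an illustration of the kind of probabilistic property mentioned above, the toy script below hand-rolls transient analysis of a small discrete-time Markov chain modelling two redundant sensors. The per-step failure probability is hypothetical, and a real analysis would use a probabilistic model checker rather than this sketch:

```python
# Toy discrete-time Markov chain with states:
# 0 = both sensors OK, 1 = one failed, 2 = both failed (absorbing).
p = 0.01  # hypothetical per-step failure probability of one sensor

# Transition matrix (independent failures, no repair):
P = [
    [(1 - p) ** 2, 2 * p * (1 - p), p * p],  # from "both OK"
    [0.0, 1 - p, p],                         # from "one failed"
    [0.0, 0.0, 1.0],                         # "both failed" absorbs
]

def prob_both_failed_within(n_steps):
    # Transient analysis: propagate the initial distribution
    # (all mass on state 0) through n_steps transitions.
    dist = [1.0, 0.0, 0.0]
    for _ in range(n_steps):
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return dist[2]

print(prob_both_failed_within(10))
```

Checking a property such as "the probability of both sensors failing within 10 steps is below 0.001" then reduces to comparing this computed value against the bound.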

    Modelling of the drying section of a continuous paper machine : a thesis presented in partial fulfilment of the requirement for the degree of Master in Production Technology at Massey University

    The invention of paper in 105 A.D. was a milestone in the history of civilization and demand for paper has been increasing steadily ever since. Although it has become more and more popular to store, process and transfer information in electronic forms, paper is to date still the most common means for recording information. According to Storat (1993), production in the last twenty years has increased by more than 60 percent, while capital expenditures in the industry have grown to almost 12 percent of sales, or double the average expenditures of other manufacturing industries. This capital investment has gone towards capacity expansion and extensive rebuilds of existing mills - almost 60 percent of the existing capacity comes from modern facilities containing machines either newly installed or rebuilt in the past ten years. As a result, fossil fuel and energy consumption in this industry fell by 46 percent in the last two decades. [From Introduction]

    Formal analysis techniques for gossiping protocols

    We give a survey of formal verification techniques that can be used to corroborate existing experimental results for gossiping protocols in a rigorous manner. We present properties of interest for gossiping protocols and discuss how various formal evaluation techniques can be employed to predict them
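The experimental results that such formal techniques corroborate typically come from simulation. A minimal Monte Carlo sketch of a push-gossip dissemination probability (hypothetical `gossip_round_sim` helper, arbitrary parameters) might look like:

```python
import random

def gossip_round_sim(n_nodes, fanout, rounds, trials=2000, seed=1):
    # Monte Carlo estimate of the probability that push gossip
    # informs every node within the given number of rounds.
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        informed = {0}  # node 0 starts with the rumour
        for _ in range(rounds):
            targets = set()
            for _node in informed:
                # each informed node pushes to `fanout` random peers
                for _ in range(fanout):
                    targets.add(rng.randrange(n_nodes))
            informed |= targets
        successes += len(informed) == n_nodes
    return successes / trials

print(gossip_round_sim(20, 2, 10))
```

Formal techniques aim to replace such sampled estimates with exact probabilities and guarantees over all behaviours, which is precisely the gap the survey discusses.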

    Advances in computational modelling for personalised medicine after myocardial infarction

    Myocardial infarction (MI) is a leading cause of premature morbidity and mortality worldwide. Determining which patients will experience heart failure and sudden cardiac death after an acute MI is notoriously difficult for clinicians. The extent of heart damage after an acute MI is informed by cardiac imaging, typically using echocardiography or, sometimes, cardiac magnetic resonance (CMR). These scans provide complex data sets that are only partially exploited by clinicians in daily practice, implying potential for improved risk assessment. Computational modelling of left ventricular (LV) function can bridge the gap towards personalised medicine using cardiac imaging in post-MI patients. Several novel biomechanical parameters have theoretical prognostic value and may be useful to reflect the biomechanical effects of novel preventive therapy for adverse remodelling post-MI. These parameters include myocardial contractility (regional and global), stiffness and stress. Further, the parameters can be delineated spatially to correspond with infarct pathology and the remote zone. While these parameters hold promise, there are challenges for translating MI modelling into clinical practice, including model uncertainty, validation and verification, as well as time-efficient processing. More research is needed to (1) simplify imaging with CMR in post-MI patients, while preserving diagnostic accuracy and patient tolerance, and (2) assess and validate novel biomechanical parameters against established prognostic biomarkers, such as LV ejection fraction and infarct size. Accessible software packages with minimal user interaction are also needed. Translating benefits to patients will be achieved through a multidisciplinary approach including clinicians, mathematicians, statisticians and industry partners
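Full biomechanical models of the LV use finite-element methods, but the notion of wall stress mentioned above can be illustrated with the classical thin-walled Laplace approximation. This is a deliberate simplification for intuition, not the modelling approach the review describes, and the values are hypothetical:

```python
def lv_wall_stress(pressure_kpa, radius_mm, thickness_mm):
    # Laplace's law for a thin-walled sphere: sigma = P * r / (2 * h).
    # A crude approximation to LV wall stress; real personalised
    # models resolve stress regionally with finite elements.
    return pressure_kpa * radius_mm / (2.0 * thickness_mm)

# Hypothetical systolic values: 16 kPa (~120 mmHg) cavity pressure,
# 25 mm cavity radius, 10 mm wall thickness.
print(lv_wall_stress(16.0, 25.0, 10.0))  # 20.0 kPa
```

The formula already shows why post-MI remodelling matters prognostically: wall thinning (smaller h) and cavity dilatation (larger r) both raise stress.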

    Simulation modelling: Educational development roles for learning technologists

    Simulation modelling was in the mainstream of CAL development in the 1980s when the late David Squires introduced this author to the Dynamic Modelling System. Since those early days, it seems that simulation modelling has drifted into a learning technology backwater to become a member of Laurillard's underutilized, ‘adaptive and productive’ media. Referring to her Conversational Framework, Laurillard constructs a pedagogic case for modelling as a productive student activity but provides few references to current practice and available resources. This paper seeks to complement her account by highlighting the pioneering initiatives of the Computers in the Curriculum Project and more recent developments in systems modelling within geographic and business education. The latter include improvements to system dynamics modelling programs such as STELLA®, the publication of introductory textbooks, and the emergence of online resources. The paper indicates several ways in which modelling activities may be approached and identifies some educational development roles for learning technologists. The paper concludes by advocating simulation modelling as an exemplary use of learning technologies ‐ one that realizes their creative‐transformative potential
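System dynamics tools such as STELLA® integrate stock-and-flow models numerically. A minimal Euler-integration sketch of a single stock (hypothetical helper name and parameters, not tied to any particular package) is:

```python
def simulate_stock(initial, inflow, outflow_frac, dt, steps):
    # Euler integration of a single stock-and-flow model:
    #   d(stock)/dt = inflow - outflow_frac * stock
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += (inflow - outflow_frac * stock) * dt
        history.append(stock)
    return history

# Hypothetical "bathtub" model: constant inflow of 10 units per
# time unit, outflow proportional to the level; the stock settles
# at the equilibrium inflow / outflow_frac = 10 / 0.5 = 20.
levels = simulate_stock(0.0, 10.0, 0.5, 0.1, 500)
print(round(levels[-1], 3))
```

Students building such models productively explore exactly this kind of feedback behaviour, which is the pedagogic case Laurillard makes for modelling as a student activity.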

    A review of wildland fire spread modelling, 1990-present 3: Mathematical analogues and simulation models

    In recent years, advances in computational power and spatial data analysis (GIS, remote sensing, etc.) have led to an increase in attempts to model the spread and behaviour of wildland fires across the landscape. This series of review papers endeavours to critically and comprehensively review all types of surface fire spread models developed since 1990. This paper reviews models of a simulation or mathematical analogue nature. Most simulation models are implementations of existing empirical or quasi-empirical models and their primary function is to convert these generally one dimensional models to two dimensions and then propagate a fire perimeter across a modelled landscape. Mathematical analogue models are those that are based on some mathematical conceit (rather than a physical representation of fire spread) that coincidentally simulates the spread of fire. Other papers in the series review models of a physical or quasi-physical nature and empirical or quasi-empirical nature. Many models are extensions or refinements of models developed before 1990. Where this is the case, these models are also discussed but much less comprehensively.
    Comment: 20 pages + 9 pages references + 1 page figures. Submitted to the International Journal of Wildland Fire
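Perimeter-propagation simulators of the kind reviewed here commonly apply a Huygens-style wavelet expansion. A minimal sketch, assuming circular wavelets (uniform rate of spread, no wind or slope) and a hypothetical `propagate_perimeter` helper:

```python
import math

def propagate_perimeter(points, ros, dt):
    # One Huygens-style step: move each vertex of a closed,
    # counter-clockwise fire perimeter outward along its local
    # normal by ros * dt (circular wavelets: uniform spread).
    n = len(points)
    new_points = []
    for i in range(n):
        x, y = points[i]
        xp, yp = points[i - 1]        # previous vertex (wraps around)
        xn, yn = points[(i + 1) % n]  # next vertex
        tx, ty = xn - xp, yn - yp     # local tangent direction
        length = math.hypot(tx, ty) or 1.0
        nx, ny = ty / length, -tx / length  # outward normal (CCW polygon)
        new_points.append((x + nx * ros * dt, y + ny * ros * dt))
    return new_points

# A circular ignition of radius 1 grows to radius 1.5 after one
# step with rate of spread 1.0 and dt = 0.5.
circle = [(math.cos(2 * math.pi * i / 16), math.sin(2 * math.pi * i / 16))
          for i in range(16)]
grown = propagate_perimeter(circle, 1.0, 0.5)
```

Real simulators replace the uniform circular wavelet with wind- and slope-distorted ellipses and sample the rate of spread from an underlying one-dimensional empirical model at each vertex.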