
    Dynamic Partitioning in Linear Relation Analysis. Application to the Verification of Synchronous Programs

    We apply linear relation analysis [CH78, HPR97] to the verification of declarative synchronous programs [Hal98]. In this approach, state partitioning plays an important role: on the one hand, the precision of the results depends strongly on the fineness of the partitioning; on the other hand, too detailed a partitioning may result in an exponential explosion of the analysis. In this paper we propose to consider very general partitions of the state space and to dynamically select a suitable partitioning according to the property to be proved. The presented approach is quite general and can be applied to other abstract interpretations. Keywords and Phrases: Abstract Interpretation, Partitioning, Linear Relation Analysis, Reactive Systems, Program Verification.
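    The refinement idea can be illustrated with a toy sketch (hypothetical names throughout; the paper's analysis operates on convex polyhedra over program variables, for which a plain partition of a finite concrete state space stands in here): start from the coarsest partition and split a block only when the property cannot be proved with it.

        # Toy sketch of property-guided dynamic partitioning (illustrative;
        # not the paper's polyhedral analysis).

        def abstract_reach(blocks, init, step):
            """Over-approximate reachability: any block touched is added whole."""
            reached = set()
            frontier = {b for b in blocks if b & init}
            while frontier:
                b = frontier.pop()
                reached.add(b)
                succ = {step(x) for x in b}
                frontier |= {c for c in blocks if c & succ} - reached
            return reached

        def verify(states, init, step, bad):
            """Refine the partition along the property until proved or atomic."""
            blocks = {frozenset(states)}
            while True:
                hit = [b for b in abstract_reach(blocks, init, step) if b & bad]
                if not hit:
                    return True, blocks          # bad states never reached: proved
                b = hit[0]
                if not (b - bad):
                    return False, blocks         # block is all-bad: cannot refine
                blocks = (blocks - {b}) | {frozenset(b & bad), frozenset(b - bad)}

        # x := x + 1 while x < 5; the 'error' value 7 is in fact unreachable
        step = lambda x: x + 1 if x < 5 else x
        ok, parts = verify(range(10), {0}, step, {7})
        print(ok, sorted(sorted(b) for b in parts))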

    Test-Driven, Model-Based Systems Engineering.


    Quantitative risk assessment using Monte Carlo and dynamic process simulation

    Currently, concern about industrial risk is a key issue when implementing any process technology or improving industrial competitiveness. In this sense, the risk concept may be considered the main tool for anticipating behaviors that can lead to future problems. In the process industry, different risk analysis techniques are employed to identify hazardous events, to estimate their frequencies and severities, and to characterize the risk, these being the principal tools for improving industrial safety. With this in mind, the present thesis discusses these risk topics and proposes four main contributions: (i) a new procedure to identify hazardous events; (ii) new procedures to quantify frequency; (iii) a new risk definition and representation; and (iv) a method to integrate the proposed procedures into a complete quantitative risk assessment. The idea behind the contributions is to use computational tools in new techniques that yield more accurate results about operational risk, helping both its quantification and its understanding. Thus, based on a new risk definition that better relates the analyses developed, process simulations are employed to identify hazardous events, and Monte Carlo simulations are employed to estimate frequency and to generate a new risk representation characterized by a severity x time x frequency surface. Although each contribution has its own particularities and importance for the development of risk analysis techniques, as a final contribution the thesis applies all of the developed techniques in a case study, proposing an innovative risk assessment procedure.
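    As a rough illustration of the Monte Carlo frequency-estimation step, the following sketch samples component-failure and operator-response times and counts the simulated years in which a hazardous event occurs (all distributions, rates and thresholds are illustrative assumptions, not values from the thesis, which couples such sampling with dynamic process simulation):

        # Minimal Monte Carlo frequency estimate for a hazardous event
        # (illustrative parameters only).
        import random

        def trial(rng, mission_h=8760.0):
            """One simulated year: pump fails AND the operator responds too late."""
            pump_fail = rng.expovariate(1 / 4000.0)   # time to failure, MTTF ~ 4000 h
            if pump_fail > mission_h:
                return False                          # no failure this year
            response = rng.gauss(30.0, 10.0)          # operator response time, minutes
            return response > 45.0                    # too slow: hazardous event

        def estimate(n=100_000, seed=1):
            rng = random.Random(seed)
            hits = sum(trial(rng) for _ in range(n))
            p = hits / n                              # fraction of years with an event
            se = (p * (1 - p) / n) ** 0.5             # standard error of the estimate
            return p, se

        p, se = estimate()
        print(f"frequency ~ {p:.4f} /yr (+/- {1.96 * se:.4f}, 95% CI)")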

    Developing a distributed electronic health-record store for India

    The DIGHT project is addressing the problem of building a scalable and highly available information store for the Electronic Health Records (EHRs) of the over one billion citizens of India.

    A comparative assessment of collaborative business process verification approaches.

    Industry 4.0 is a key strategic trend of the economy. Virtual factories (vFs) are key building blocks for Industry 4.0, integrating product design processes, manufacturing processes and general collaborative business processes across factories and enterprises. In the context of the EU H2020 FIRST (vF Interoperation suppoRting buSiness innovaTion) project, end users of vFs are not experts in business process modelling and cannot themselves guarantee that collaborative business processes are correct for execution. To enable automatic execution of business processes, verification is therefore an important step at the business process design stage to avoid errors at runtime. Research in business process model verification has yielded a plethora of approaches, in the form of languages and tools, based on the Petri net family and temporal logic. However, no report specifically targets and presents a comparative assessment of these approaches based on criteria such as those we propose. In this paper we present an assessment of the most common verification approaches based on their expressiveness, flexibility, suitability and complexity. We also look at how big data impacts business process verification in a data-rich world.
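    To make the Petri-net style of verification concrete, here is a toy sketch of an explicit-state check on a small process net, flagging markings that are stuck before completion (the net, place names and completion criterion are illustrative assumptions; production tools in this family are far richer):

        # Toy Petri-net reachability and deadlock check (illustrative net).

        def enabled(marking, pre):
            return all(marking.get(p, 0) >= n for p, n in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return m

        def reachable(m0, transitions):
            """Explicit-state search; collects deadlocks reached before 'done'."""
            seen, stack, deadlocks = set(), [m0], []
            while stack:
                m = stack.pop()
                key = frozenset((p, n) for p, n in m.items() if n)
                if key in seen:
                    continue
                seen.add(key)
                succs = [fire(m, pre, post)
                         for pre, post in transitions if enabled(m, pre)]
                if not succs and m.get("done", 0) == 0:
                    deadlocks.append(m)               # stuck before completion
                stack.extend(succs)
            return seen, deadlocks

        # order handling: receive -> (credit check || packing) -> ship
        net = [({"start": 1}, {"credit": 1, "pack": 1}),
               ({"credit": 1}, {"ok": 1}),
               ({"pack": 1}, {"ready": 1}),
               ({"ok": 1, "ready": 1}, {"done": 1})]
        markings, dead = reachable({"start": 1}, net)
        print(len(markings), "markings,", len(dead), "deadlocks")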

    Automatic techniques for detecting and exploiting symmetry in model checking

    The application of model checking is limited due to the state-space explosion problem: as the number of components represented by a model increases, the worst-case size of the associated state space grows exponentially. Current techniques can handle limited kinds of symmetry, e.g. full symmetry between identical components in a concurrent system. They avoid the problem of automatic symmetry detection by requiring the user to specify the presence of symmetry in a model (explicitly, or by annotating the associated specification using additional language keywords), or by restricting the input language of a model checker so that only symmetric systems can be specified. Additionally, computing unique representatives for each symmetric equivalence class is easy for these limited kinds of symmetry. We present a theoretical framework for symmetry reduction which can be applied to explicit-state model checking. The framework includes techniques for automatic symmetry detection using computational group theory, which can be applied with no additional user input. These techniques detect structural symmetries induced by the topology of a concurrent system, so our framework includes exact and approximate techniques to efficiently exploit arbitrary symmetry groups which may arise in this way. These techniques are also based on computational group-theoretic methods. We prove that our framework is logically sound, and demonstrate its general applicability to explicit-state model checking. By providing a new symmetry reduction package for the SPIN model checker, we show that our framework can be feasibly implemented as part of a system which is widely used in both industry and academia. Through a study of SPIN users, we assess the usability of our automatic symmetry detection techniques in practice.
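    The core reduction step can be sketched as follows: before storing a state, replace it with a canonical representative of its orbit under permutations of the identical processes (a minimal illustration of full symmetry, with a hypothetical state encoding; the SPIN package described in the thesis is far more sophisticated):

        # Minimal sketch of symmetry reduction via canonical representatives.
        from itertools import permutations

        def canonical(state):
            """Lexicographically least permutation of the process-local states."""
            n = len(state)
            return min(tuple(state[i] for i in perm)
                       for perm in permutations(range(n)))

        def explore(init, successors):
            """Explicit-state search over canonical representatives only."""
            seen, stack = set(), [canonical(init)]
            while stack:
                s = stack.pop()
                if s in seen:
                    continue
                seen.add(s)
                stack.extend(canonical(t) for t in successors(s))
            return seen

        # two identical processes: idle(0) -> trying(1) -> critical(2) -> idle
        def successors(s):
            for i, x in enumerate(s):
                nxt = list(s)
                nxt[i] = (x + 1) % 3
                if nxt.count(2) <= 1:                 # mutual exclusion
                    yield tuple(nxt)

        print(len(explore((0, 0), successors)),
              "representatives")                      # 5, vs 8 unreduced states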

    Towards the Correctness of Software Behavior in UML: A Model Checking Approach Based on Slicing

    Embedded systems are systems which have ongoing interactions with their environments, accepting requests and producing responses. Such systems are increasingly used in applications where failure is unacceptable: traffic control systems, avionics, automobiles, etc. Correct and highly dependable construction of such systems is particularly important and challenging. A very promising and increasingly attractive method for achieving this goal is formal verification. A formal verification method consists of three major components: a model for describing the behavior of the system, a specification language to embody correctness requirements, and an analysis method to verify the behavior against the correctness requirements. This Ph.D. thesis addresses the correctness of the behavioral design of embedded systems, using model checking as the verification technology. More precisely, we present a UML-based verification method that checks whether the conditions on the evolution of the embedded system are met by the model. Unfortunately, model checking is limited to medium-sized systems because of its high space requirements. To overcome this problem, this thesis suggests integrating the slicing (reduction) technique.
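    The reduction idea behind slicing can be illustrated with a cone-of-influence computation: keep only the variables the checked property transitively depends on (the dependency map and variable names below are hypothetical, not the thesis's UML slicer):

        # Sketch of cone-of-influence slicing before model checking.

        def slice_vars(deps, prop_vars):
            """Transitive closure of dependencies, starting from the property."""
            keep, frontier = set(), set(prop_vars)
            while frontier:
                v = frontier.pop()
                if v not in keep:
                    keep.add(v)
                    frontier |= deps.get(v, set())
            return keep

        # next-state value of each variable depends on the listed variables
        deps = {"alarm":   {"sensor", "mode"},
                "sensor":  {"sensor"},
                "mode":    {"mode", "button"},
                "button":  set(),
                "display": {"alarm", "theme"},        # irrelevant to the property
                "theme":   {"theme"}}

        print(sorted(slice_vars(deps, {"alarm"})))
        # ['alarm', 'button', 'mode', 'sensor'] -- display/theme sliced away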

    From white elephant to Nobel Prize: Dennis Gabor’s wavefront reconstruction

    Dennis Gabor devised a new concept for optical imaging in 1947 that went by a variety of names over the following decade: holoscopy, wavefront reconstruction, interference microscopy, diffraction microscopy and Gaboroscopy. A well-connected and creative research engineer, Gabor worked actively to publicize and exploit his concept, but the scheme failed to capture the interest of many researchers. Gabor’s theory was repeatedly deemed unintuitive and baffling; the technique was appraised by his contemporaries to be of dubious practicality and, at best, constrained to a narrow branch of science. By the late 1950s, Gabor’s subject had been assessed by its handful of practitioners to be a white elephant. Nevertheless, the concept was later rehabilitated by the research of Emmett Leith and Juris Upatnieks at the University of Michigan, and of Yury Denisyuk at the Vavilov Institute in Leningrad. What had been judged a failure was recast as a success: evaluations of Gabor’s work were transformed during the 1960s, when it was represented as the foundation on which to construct the new and distinctly different subject of holography, a re-evaluation that gained Gabor alone the Nobel Prize in Physics in 1971. This paper focuses on the difficulties experienced in constructing a meaningful subject, a practical application and a viable technical community from Gabor’s ideas during the decade 1947-1957.