
    LCF-style Platform based on Multiway Decision Graphs

    The combination of the state exploration approach (mainly model checking) and the deductive reasoning approach (theorem proving) promises to overcome the limitations and to enhance the capabilities of each. In this paper, we are interested in defining a platform for Multiway Decision Graphs (MDGs) in an LCF-style theorem prover. We define a platform that represents the MDG operations conjunction, disjunction, relational product and prune-by-subsumption as a set of inference rules. Based on this platform, reachability analysis is implemented as a conversion that uses the MDG theory within the HOL theorem prover. Finally, we present experimental results showing the performance of the MDG operations of our platform.
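    The central idea, keeping MDGs behind a small trusted interface of inference rules and building reachability analysis purely from those rules, can be conveyed with a minimal OCaml sketch. The names (conj, disj, relprod, prune, fixpoint) are hypothetical and the MDG type is left abstract; this illustrates the LCF-style architecture only, not the platform's actual interface.

```ocaml
(* Minimal sketch of the LCF-style architecture: the MDG type is abstract,
   so reachable-state sets can only be produced by composing the trusted
   primitive operations (conjunction, disjunction, relational product,
   prune-by-subsumption).  All names here are hypothetical. *)
module type MDG_KERNEL = sig
  type mdg                                        (* abstract: cannot be forged *)
  val conj    : mdg -> mdg -> mdg                 (* conjunction rule *)
  val disj    : mdg -> mdg -> mdg                 (* disjunction rule *)
  val relprod : trans:mdg -> states:mdg -> mdg    (* relational product (image) *)
  val prune   : mdg -> by:mdg -> mdg option       (* prune-by-subsumption; None if subsumed *)
end

(* Reachability analysis as a derived procedure over the kernel: iterate the
   image computation until prune-by-subsumption reports no new states. *)
module Reachability (K : MDG_KERNEL) = struct
  let fixpoint ~trans ~init =
    let rec loop reached frontier =
      let image = K.relprod ~trans ~states:frontier in
      match K.prune image ~by:reached with
      | None       -> reached                     (* nothing new: fixpoint reached *)
      | Some fresh -> loop (K.disj reached fresh) fresh
    in
    loop init init
end
```

    A client would instantiate Reachability with the rule set exported by the platform; because the mdg type is abstract, any result it returns can only have been derived through the inference rules.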

    Providing Automated Verification in HOL Using MDGs

    While model checking suffers from the state space explosion problem, theorem proving is quite tedious and impractical for verifying complex designs. In this work, we present a verification framework in which we attempt to strike a balance between the expressiveness of theorem proving and the efficiency and automation of state exploration techniques. To this end, we propose to integrate a layer of checking algorithms based on Multiway Decision Graphs (MDGs) into the HOL theorem prover. We deeply embed the underlying MDG logic in HOL and implement a platform that provides a set of algorithms allowing users to develop their own state-exploration-based applications inside HOL. While the verification problem is specified in HOL, the proof is derived by tightly combining MDG-based computations with the theorem prover's facilities. We have been able to implement and experiment with different state exploration techniques within HOL, such as MDG reachability analysis, equivalence checking and model checking.
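    The phrase "deeply embedded" above means that the syntax of the MDG logic is itself represented as data inside the prover, with a separate semantic function giving it meaning. A toy version of such an embedding, with an invented grammar far smaller than the actual Directed Formulae and with integers standing in for the abstract sorts, might look as follows.

```ocaml
(* Toy deep embedding: Directed-Formula-like syntax as a datatype, plus a
   semantic function.  The real grammar and semantics are richer; variable
   values are drawn here from int purely for illustration. *)
type term =
  | Var of string                      (* variable of an abstract sort *)
  | App of string * term list          (* uninterpreted function symbol *)

type df =
  | Eq  of term * term                 (* equation  t1 = t2 *)
  | And of df * df
  | Or  of df * df
  | True

(* An interpretation fixes the variables and the uninterpreted functions. *)
type interp = {
  var : string -> int;
  fn  : string -> int list -> int;
}

let rec eval_term (i : interp) = function
  | Var v         -> i.var v
  | App (f, args) -> i.fn f (List.map (eval_term i) args)

let rec sem (i : interp) = function
  | Eq (t1, t2)  -> eval_term i t1 = eval_term i t2
  | And (d1, d2) -> sem i d1 && sem i d2
  | Or  (d1, d2) -> sem i d1 || sem i d2
  | True         -> true
```

    Operations such as conjunction then work on the df syntax, and their correctness is a provable statement relating the semantics of the output to that of the inputs, which is what allows the proof to be derived inside HOL rather than trusting an external computation.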

    The verification of MDG algorithms in the HOL theorem prover

    Formal verification of digital systems is achieved, today, using one of two main approaches: state exploration (mainly model checking and equivalence checking) or deductive reasoning (theorem proving). The combination of the two approaches, state exploration and deductive reasoning, promises to overcome the limitations and enhance the capabilities of each, and our research is motivated by this goal. In this thesis, we provide the entire necessary infrastructure (data structures and algorithms) to define high-level state exploration in the HOL theorem prover, named the MDG-HOL platform. While related work has tackled the same problem by representing primitive Binary Decision Diagram (BDD) operations as inference rules added to the core of the theorem prover, we base our approach on Multiway Decision Graphs (MDGs). MDGs generalize ROBDDs to represent and manipulate a subset of first-order logic formulae. With MDGs, a data value is represented by a single variable of an abstract type, and operations on data are represented in terms of uninterpreted functions. Considering MDGs instead of BDDs raises the abstraction level of what can be verified using state exploration within a theorem prover. The MDG embedding is based on the logical formulation of an MDG as a Directed Formula (DF). The DF syntax is defined using HOL built-in data types. We formalize the basic MDG operations using this syntax within HOL, following a deep embedding approach; such an approach ensures the consistency of our embedding. We then derive the correctness proof for each basic MDG operator. Based on this platform, MDG reachability analysis is defined in HOL as a conversion that uses the MDG theory within HOL. We demonstrate the effectiveness of our platform on four case studies. The results obtained show that this verification framework offers a considerable gain in terms of automation without sacrificing CPU time and memory usage compared to automatic model checking tools.

    Finally, we propose a reduction technique to improve MDG model checking based on the MDG-HOL platform. The idea is to prune the transition relation of the circuits using pre-proved theorems and lemmas from the specification given at the system level. We also use the consistency of the specifications to verify whether the reduced model is faithful to the original one. We provide two case studies: the first is the SAT-MDG reduction of an Island Tunnel Controller, and the second is the MDG-HOL assume-guarantee reduction of the Look-Aside Interface. The results obtained show that our approach offers a considerable gain in terms of the correctness of heuristics and reduction techniques compared to commercial model checkers; however, a small penalty is paid in terms of CPU time and memory usage.
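    The correctness proofs mentioned for the basic operators are, in essence, theorems relating the syntactic result of each operation to the conjunction, disjunction or relational product of the interpretations of its arguments. Their rough shape, in ad hoc notation where the double brackets stand for the first-order interpretation of a Directed Formula, would be as below; the precise statements, well-formedness side conditions and quantified variable sets are those of the thesis.

```latex
% Rough shape of the per-operator correctness theorems (ad hoc notation).
\begin{align*}
  \vdash\; \llbracket \mathrm{conj}(D_1, D_2) \rrbracket
      &\;\Leftrightarrow\; \llbracket D_1 \rrbracket \wedge \llbracket D_2 \rrbracket \\
  \vdash\; \llbracket \mathrm{disj}(D_1, D_2) \rrbracket
      &\;\Leftrightarrow\; \llbracket D_1 \rrbracket \vee \llbracket D_2 \rrbracket \\
  \vdash\; \llbracket \mathrm{relprod}(T, S) \rrbracket
      &\;\Leftrightarrow\; \exists\, \bar{x}.\; \llbracket T \rrbracket \wedge \llbracket S \rrbracket
\end{align*}
```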

    Emerging trends proceedings of the 17th International Conference on Theorem Proving in Higher Order Logics: TPHOLs 2004

    This volume constitutes the proceedings of the Emerging Trends track of the 17th International Conference on Theorem Proving in Higher Order Logics (TPHOLs 2004), held September 14-17, 2004 in Park City, Utah, USA. The TPHOLs conference covers all aspects of theorem proving in higher order logics as well as related topics in theorem proving and verification. There were 42 papers submitted to TPHOLs 2004 in the full research category, each of which was refereed by at least 3 reviewers selected by the program committee. Of these submissions, 21 were accepted for presentation at the conference and publication in volume 3223 of Springer's Lecture Notes in Computer Science series. In keeping with longstanding tradition, TPHOLs 2004 also offered a venue for the presentation of work in progress, where researchers invite discussion by means of a brief introductory talk and then discuss their work at a poster session. The work-in-progress papers are contained in this volume, which is published as a 2004 technical report of the School of Computing at the University of Utah.

    Information-Theoretic Analysis using Theorem Proving

    Information theory is widely used for analyzing a broad range of scientific and engineering problems, including cryptography, neurobiology, quantum computing, plagiarism detection and other forms of data analysis. Despite the safety-critical nature of some of these applications, most information-theoretic analysis is done using informal techniques, mainly computer simulation and paper-and-pencil analysis, and thus cannot be completely relied upon. The unreliable nature of the produced results poses a serious problem in safety-critical applications and may result in heavy financial losses or even the loss of human life. In order to overcome the inaccuracy limitations of these techniques, this thesis proposes to conduct the analysis within the trusted kernel of a higher-order-logic (HOL) theorem prover. For this purpose, we provide HOL formalizations of the fundamental theories of measure, Lebesgue integration and probability and use them to formalize some of the most widely used information-theoretic principles. We use the Kullback-Leibler divergence as a unified measure of information, which is in turn used to define the main measures of information such as the Shannon entropy, mutual information and conditional mutual information. Furthermore, we introduce two new measures of information leakage, namely the information leakage degree and the conditional information leakage degree, and compare them with existing measures. We illustrate the usefulness of the proposed framework by tackling various applications, including the performance analysis of a communication encoder used in the proof of the Shannon source coding theorem, and the quantitative analysis, using information-theoretic measures, of the privacy properties of a digital communications mixer and of the one-time pad encryption system.
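    The role of the Kullback-Leibler divergence as the unified measure is easiest to see in the discrete case, where entropy, mutual information and conditional mutual information can all be written as divergences. The formalization in the thesis is measure-theoretic, but the underlying identities are the standard ones:

```latex
% Standard identities expressing the main information measures through the
% Kullback-Leibler divergence (discrete case, finite alphabet \mathcal{X}).
\begin{align*}
  D(p \,\|\, q)   &= \sum_{x} p(x)\,\log\frac{p(x)}{q(x)} \\
  I(X;Y)          &= D\bigl(p_{X,Y} \,\|\, p_X\, p_Y\bigr) \\
  H(X)            &= \log\lvert\mathcal{X}\rvert
                     - D\bigl(p_X \,\|\, u_{\mathcal{X}}\bigr),
                     \qquad u_{\mathcal{X}} \text{ uniform on } \mathcal{X} \\
  I(X;Y \mid Z)   &= D\bigl(p_{X,Y,Z} \,\|\, p_{X\mid Z}\, p_{Y\mid Z}\, p_Z\bigr)
\end{align*}
```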

    27th Annual European Symposium on Algorithms: ESA 2019, September 9-11, 2019, Munich/Garching, Germany


    Towards Proactive Context-aware Computing and Systems

    A primary goal of context-aware systems is delivering the right information at the right place and right time to users in order to enable them to make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the users' context (location, preferences, behavioral history, etc.), and delivering it to them in a timely manner without an explicit request from them. These requirements create a paradigm that we term "Proactive Context-aware Computing". Most existing context-aware systems fulfill only a subset of these requirements. Many of these systems focus only on personalization of the requested information based on users' current context, and they are often designed for specific domains. In addition, most existing systems are reactive: users request some information and the system delivers it to them. These systems are not proactive, i.e. they cannot anticipate users' intent and behavior and act without an explicit request from them. In order to overcome these limitations, we need to conduct a deeper analysis and enhance our understanding of context-aware systems that are generic, universal, proactive and applicable to a wide variety of domains.

    To support this dissertation, we explore several directions. Clearly the most significant sources of information about users today are smartphones: a large amount of users' context can be acquired through them, and they can be used as an effective means to deliver information to users. In addition, social media such as Facebook, Flickr and Foursquare provide a rich and powerful platform to mine users' interests, preferences and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems. We have implemented and evaluated several approaches, including some as part of the Rover framework, to achieve the paradigm of Proactive Context-aware Computing. Rover is a context-aware research platform that has been evolving for the last six years. Since location is one of the most important elements of users' context, we have developed 'Locus', an indoor localization, tracking and navigation system for multi-story buildings. Other important dimensions of users' context include the activities that they are engaged in. To this end, we have developed 'SenseMe', a system that leverages the smartphone and its multiple sensors in order to perform multidimensional context and activity recognition for users. As part of the 'SenseMe' project, we also conducted an exploratory study of privacy, trust, risks and other concerns of users with smartphone-based personal sensing systems and applications. To determine what information would be relevant to users' situations, we have developed 'TellMe', a system that employs a new, flexible and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. In order to personalize the relevant information, we have also developed an algorithm and system for mining a broad range of users' preferences from their social network profiles and activities.
    For recommending new information to users based on their past behavior and context history (such as visited locations, activities and time), we have developed a recommender system and an approach for performing multi-dimensional collaborative recommendations using tensor factorization. For timely delivery of personalized and relevant information, it is essential to anticipate and predict users' behavior. To this end, we have developed a unified infrastructure within the Rover framework and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques for building diverse behavioral models of users. Examples of generated models include classifying users' semantic places and mobility states, predicting their availability for accepting calls on smartphones, and inferring their device charging behavior. Finally, to enable proactivity in context-aware systems, we have also developed a planning framework based on HTN planning. Together, these works provide a major push in the direction of proactive context-aware computing.
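    For the tensor-factorization recommender mentioned above, a common formulation (shown purely as an illustration; the dissertation's exact model may differ) scores a (user, item, context) triple through a CP-style decomposition and fits the factors against the observed entries:

```latex
% Illustrative CP-style tensor factorization for context-aware recommendation;
% A, B, G hold K-dimensional factors for users, items and context values,
% and \Omega is the set of observed (user, item, context) entries.
\hat{r}_{u,i,c} = \sum_{k=1}^{K} a_{uk}\, b_{ik}\, g_{ck},
\qquad
\min_{A,B,G}\;
\sum_{(u,i,c)\in\Omega} \bigl(r_{u,i,c} - \hat{r}_{u,i,c}\bigr)^{2}
  + \lambda\bigl(\lVert A\rVert_F^{2} + \lVert B\rVert_F^{2} + \lVert G\rVert_F^{2}\bigr)
```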

    Computer Aided Verification

    The open access two-volume set LNCS 12224 and 12225 constitutes the refereed proceedings of the 32nd International Conference on Computer Aided Verification, CAV 2020, held in Los Angeles, CA, USA, in July 2020.* The 43 full papers, presented together with 18 tool papers and 4 case studies, were carefully reviewed and selected from 240 submissions. The papers are organized in the following topical sections: Part I: AI verification; blockchain and security; concurrency; hardware verification and decision procedures; and hybrid and dynamic systems. Part II: model checking; software verification; stochastic systems; and synthesis. *The conference was held virtually due to the COVID-19 pandemic.

    Programming Languages and Systems

    This open access book constitutes the proceedings of the 31st European Symposium on Programming, ESOP 2022, which was held during April 5-7, 2022, in Munich, Germany, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2022. The 21 regular papers presented in this volume were carefully reviewed and selected from 64 submissions. They deal with fundamental issues in the specification, design, analysis, and implementation of programming languages and systems.
