
    Slot Games for Detecting Timing Leaks of Programs

    In this paper we describe a method for verifying secure information flow of programs, where, apart from direct and indirect flows, secret information can be leaked through covert timing channels. That is, no two computations of a program that differ only on high-security inputs can be distinguished by low-security outputs and timing differences. We attack this problem by using slot-game semantics for a quantitative analysis of programs. We show how the slot-game model can be used to perform a precise security analysis of programs that takes into account both extensional and intensional properties of programs. The practicality of this approach for automated verification is also shown. Comment: In Proceedings GandALF 2013, arXiv:1307.416
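    As a concrete illustration of the property the abstract targets, the following sketch checks a toy program for timing-sensitive noninterference by comparing (low output, abstract time cost) pairs across runs that differ only in the high input. The program, cost counter, and input sets are hypothetical and are not the paper's slot-game semantics.

    ```python
    def toy_program(high: int, low: int):
        """Hypothetical program: returns (low-security output, abstract time cost)."""
        cost = 0
        acc = low
        # Branching on the secret: the two arms consume different numbers of
        # time "slots", so the secret can leak through timing alone.
        if high > 0:
            for _ in range(3):
                acc += 1
                cost += 1
        else:
            acc += 3
            cost += 1
        return acc, cost

    def timing_noninterference(program, high_inputs, low_inputs) -> bool:
        """Runs that differ only in the high input must agree on output and cost."""
        for low in low_inputs:
            observations = {program(high, low) for high in high_inputs}
            if len(observations) > 1:
                return False
        return True

    print(timing_noninterference(toy_program, high_inputs=[-1, 1], low_inputs=[0, 5]))
    # False: the outputs agree, but the costs differ, so timing leaks the secret.
    ```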

    Sciduction: Combining Induction, Deduction, and Structure for Verification and Synthesis

    Even with impressive advances in automated formal methods, certain problems in system verification and synthesis remain challenging. Examples include the verification of quantitative properties of software involving constraints on timing and energy consumption, and the automatic synthesis of systems from specifications. The major challenges include environment modeling, incompleteness in specifications, and the complexity of underlying decision problems. This position paper proposes sciduction, an approach to tackle these challenges by integrating inductive inference, deductive reasoning, and structure hypotheses. Deductive reasoning, which leads from general rules or concepts to conclusions about specific problem instances, includes techniques such as logical inference and constraint solving. Inductive inference, which generalizes from specific instances to yield a concept, includes algorithmic learning from examples. Structure hypotheses are used to define the class of artifacts, such as invariants or program fragments, generated during verification or synthesis. Sciduction constrains inductive and deductive reasoning using structure hypotheses, and actively combines inductive and deductive reasoning: for instance, deductive techniques generate examples for learning, and inductive reasoning is used to guide the deductive engines. We illustrate this approach with three applications: (i) timing analysis of software; (ii) synthesis of loop-free programs; and (iii) controller synthesis for hybrid systems. Some future applications are also discussed.
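    The interplay the abstract describes, with deduction producing examples and induction generalizing from them, can be sketched as a counterexample-guided loop. The candidate grammar, specification, and exhaustive checker below are toy stand-ins chosen for illustration, not the paper's algorithms.

    ```python
    from itertools import product

    # Specification (deductive oracle): the function we want to synthesize.
    def spec(x: int) -> int:
        return 2 * x + 3

    # Structure hypothesis: loop-free candidates of the form a*x + b, small constants.
    CANDIDATES = [(a, b) for a, b in product(range(-5, 6), repeat=2)]

    def inductive_guess(examples):
        """Inductive step: return a candidate consistent with all examples so far."""
        for a, b in CANDIDATES:
            if all(a * x + b == y for x, y in examples):
                return a, b
        return None

    def deductive_check(a, b, domain=range(-100, 101)):
        """Deductive step: exhaustively verify the candidate or return a counterexample."""
        for x in domain:
            if a * x + b != spec(x):
                return x
        return None

    def synthesize():
        examples = [(0, spec(0))]              # seed example
        while True:
            cand = inductive_guess(examples)
            if cand is None:
                return None                    # structure hypothesis too weak
            cex = deductive_check(*cand)
            if cex is None:
                return cand                    # candidate verified
            examples.append((cex, spec(cex)))  # counterexample guides the learner

    print(synthesize())  # -> (2, 3) under these toy assumptions
    ```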

    SHARPE: Variation-Aware Formal Statistical Timing Analysis in RTL

    Variations in timing can occur due to multiple sources on a chip. Many circuit-level statistical techniques are used to analyze timing in the presence of these sources of variation. At the system (higher) level of design, however, timing estimation/verification is not performed. The design at the Register Transfer Level (RTL) is unaware of the underlying statistics and timing variations. It is desirable to have "variation awareness" at the higher level and to estimate block-level delay distributions early in the design cycle, in order to evaluate design choices quickly and minimize post-synthesis simulation costs. In this paper, we introduce SHARPE, a rigorous, systematic timing analysis/verification methodology and tool flow to find statistical delay invariants in RTL. We treat the RTL source code as a program and use static program analysis techniques to compute probabilities. We model the probabilistic RTL modules as Discrete Time Markov Chains (DTMCs) that are then checked formally for probabilistic invariants using PRISM, a probabilistic model checker. Our technique is illustrated on the RTL description of the datapath of OR1200, an open-source embedded processor.
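    A minimal sketch of the underlying idea, assuming hypothetical per-stage delay distributions: discrete delay distributions are propagated through RTL stages in series and checked against a timing bound. This convolution-based toy stands in for the DTMC construction and PRISM queries used by SHARPE.

    ```python
    from collections import defaultdict

    # Hypothetical per-stage delay distributions (delay in ps -> probability),
    # e.g. obtained from variation-aware characterization of RTL blocks.
    stage_a = {100: 0.6, 120: 0.4}
    stage_b = {80: 0.5, 90: 0.3, 110: 0.2}

    def convolve(d1, d2):
        """Delay distribution of two stages in series (independent variation assumed)."""
        out = defaultdict(float)
        for t1, p1 in d1.items():
            for t2, p2 in d2.items():
                out[t1 + t2] += p1 * p2
        return dict(out)

    def prob_exceeds(dist, bound):
        """Probability that the combined delay violates a timing bound."""
        return sum(p for t, p in dist.items() if t > bound)

    path = convolve(stage_a, stage_b)
    print(sorted(path.items()))
    print("P(delay > 200 ps) =", prob_exceeds(path, 200))  # 0.32 for these toy numbers
    ```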

    Design and implementation of WCET analyses : including a case study on multi-core processors with shared buses

    For safety-critical real-time embedded systems, worst-case execution time (WCET) analysis, which determines an upper bound on the possible execution times of a program, is an important part of system verification. Multi-core processors share resources (e.g. buses and caches) between multiple processor cores and thus complicate WCET analysis, because the execution time of a program running on one processor core depends significantly on the programs executed in parallel on the other cores. We refer to this phenomenon as shared-resource interference. This thesis proposes a novel way of modeling shared-resource interference during WCET analysis. It enables an efficient analysis, since it only considers one processor core at a time, and it is sound for hardware platforms exhibiting timing anomalies. Moreover, this thesis demonstrates how to realize a timing-compositional verification on top of the proposed modeling scheme. In this way, the thesis closes the gap between modern hardware platforms, which exhibit timing anomalies, and existing schedulability analyses, which rely on timing compositionality. In addition, this thesis proposes a novel method for calculating an upper bound on the amount of interference that a given processor core can generate in any time interval of at most a given length. Our experiments demonstrate that the novel method is more precise than existing methods. Funding: Deutsche Forschungsgemeinschaft (DFG) as part of the Transregional Collaborative Research Centre SFB/TR 14 (AVACS).
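    A toy sketch of the kind of per-core bound discussed above: each basic block's isolated worst case is inflated by an interference budget per shared-bus access. The block profile and the per-access penalty are hypothetical and do not reflect the thesis's analysis or its interference-bound calculation.

    ```python
    # Hypothetical basic-block profile for the analyzed core:
    # (name, isolated worst-case cycles, worst-case shared-bus accesses, loop bound)
    blocks = [
        ("init",   40, 2, 1),
        ("loop",   25, 1, 64),
        ("finish", 30, 3, 1),
    ]

    # Assumed worst-case extra delay a single bus access can suffer when the
    # concurrent cores interfere (e.g. bounded by the arbitration policy).
    PENALTY_PER_ACCESS = 8

    def wcet_bound(blocks, penalty):
        """Sum isolated block WCETs plus an interference budget per bus access."""
        total = 0
        for name, cycles, accesses, bound in blocks:
            total += bound * (cycles + accesses * penalty)
        return total

    print("WCET bound (cycles):", wcet_bound(blocks, PENALTY_PER_ACCESS))
    # -> 56 + 64*33 + 54 = 2222 under these toy assumptions
    ```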

    A Model-Derivation Framework for Software Analysis

    Model-based verification makes it possible to express behavioral correctness conditions, such as the validity of execution states, bounds on variables, or timing, at a high level of abstraction and to affirm that they are satisfied by a software system. However, this requires expressive models, which are difficult and cumbersome to create and maintain by hand. This paper presents a framework that automatically derives behavioral models from real-sized Java programs. Our framework builds on the EMF/ECore technology and provides a tool that creates an initial model from Java bytecode, as well as a series of transformations that simplify the model and eventually output a timed-automata model that can be processed by a model checker such as UPPAAL. The framework has the following properties: (1) consistency of models with software, (2) extensibility of the model derivation process, (3) scalability, and (4) expressiveness of models. We report several case studies to validate how our framework satisfies these properties. Comment: In Proceedings MARS 2017, arXiv:1703.0581
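    A hedged sketch of the derivation step the abstract outlines: a linear instruction stream is turned into a timed-automaton-like model with one location per program point and per-instruction timing bounds. The instruction set, timing table, and textual output are invented for illustration and do not reflect the framework's EMF/ECore pipeline or UPPAAL's input format.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Transition:
        src: str
        dst: str
        guard: str      # clock constraint, UPPAAL-style text
        reset: bool     # reset clock x on this edge?

    # Hypothetical instruction stream with (min, max) execution-time bounds per opcode.
    TIMING = {"iload": (1, 2), "iadd": (1, 1), "istore": (1, 3)}
    program = ["iload", "iload", "iadd", "istore"]

    def derive_model(instrs):
        """Create one location per program point and a guarded edge per instruction."""
        transitions = []
        for i, op in enumerate(instrs):
            lo, hi = TIMING[op]
            transitions.append(Transition(
                src=f"L{i}", dst=f"L{i + 1}",
                guard=f"x >= {lo} && x <= {hi}",
                reset=True,
            ))
        return transitions

    for t in derive_model(program):
        print(f"{t.src} -> {t.dst}  [{t.guard}]  {'x := 0' if t.reset else ''}")
    ```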

    Analyzing an Embedded Sensor with Timed Automata in Uppaal

    An infrared sensor is modeled and analyzed in Uppaal. The sensor typifies the sort of component that engineers regularly integrate into larger systems by writing interface hardware and software. In all, three main models are developed. For the first, the timing diagram of the sensor is interpreted and modeled as a timed safety automaton. This model serves as a specification for the complete system. A second model that emphasizes the separate roles of driver and sensor is then developed. It is validated against the timing diagram model using an existing construction that permits the verification of timed trace inclusion, for certain models, by reachability analysis (i.e., model checking). A transmission correctness property is also stated by means of an auxiliary automaton and shown to be satisfied by the model. A third model is created from an assembly language driver program, using a direct translation from the instruction set of a processor with simple timing behavior. This model is validated against the driver component of the second timing diagram model using the timed trace inclusion validation technique. While no pretense is made of providing a general means to verify systems, the approach and its limitations offer insight into the nature and challenges of programming in real time.
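    As a rough illustration of validating behavior against a timing-diagram specification, the sketch below checks a concrete (location, dwell time) trace against per-location timing bounds. The locations, bounds, and traces are invented; the paper performs the actual validation by reachability analysis in Uppaal, not by trace replay.

    ```python
    # Hypothetical timing-diagram automaton for the sensor: each location constrains
    # how long the signal may dwell there before the next edge (times in microseconds).
    AUTOMATON = {
        "idle":        {"next": "start_pulse", "min": 0,  "max": float("inf")},
        "start_pulse": {"next": "data",        "min": 10, "max": 20},
        "data":        {"next": "idle",        "min": 50, "max": 60},
    }

    def trace_admitted(trace):
        """Check a (location, dwell_time) trace against the per-location bounds."""
        loc = "idle"
        for observed_loc, dwell in trace:
            if observed_loc != loc:
                return False                      # trace leaves the expected path
            spec = AUTOMATON[loc]
            if not (spec["min"] <= dwell <= spec["max"]):
                return False                      # timing bound violated
            loc = spec["next"]
        return True

    print(trace_admitted([("idle", 100), ("start_pulse", 15), ("data", 55)]))  # True
    print(trace_admitted([("idle", 100), ("start_pulse", 35), ("data", 55)]))  # False
    ```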
