8 research outputs found

    Advances in Usability of Formal Methods for Code Verification with Frama-C

    Industrial usage of code analysis tools based on semantic analysis, such as the Frama-C platform, poses several challenges, from the setup of analyses to the exploitation of their results. In this paper, we discuss two of these challenges. First, such analyses require detailed information about the code structure and the build process, which is often undocumented, being part of the implicit build chain used by the developers. Unlike heuristics-based tools, which can deal with incomplete information, semantics-based tools require stubs or specifications for external library functions, compiler builtins, non-standard extensions, etc. Setting up a new analysis has a high cost, which deters industrial users from trying such tools, since the return on investment is not clear in advance: the analysis may turn out to be of little use relative to the invested time. Improving the usability of this first step is essential for the widespread adoption of formal methods in software development. A second aspect essential to successful analyses is understanding the data and navigating it. Visualizing data and rendering it interactively allows users to considerably speed up the refinement of analysis results. We present approaches to both of these issues, derived from experience with code bases provided by industrial partners.

    Concrete Categorical Model of a Quantum Circuit Description Language with Measurement

    In this paper, we introduce dynamic lifting to a quantum circuit-description language, following the Proto-Quipper language approach. Dynamic lifting allows programs to transfer the result of measuring quantum data (qubits) into classical data (booleans). We propose a type system and an operational semantics for the language, and we state safety properties. Next, we introduce a concrete categorical semantics for the proposed language, basing our approach on a recent model from Rios & Selinger for Proto-Quipper-M. Our approach is to construct, on top of a concrete category of circuits with measurements, a Kleisli category capturing, as a side effect, the action of retrieving classical content out of a quantum memory. We then show a soundness result for this semantics.
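    The core idea of dynamic lifting, turning a measurement outcome into a classical boolean that subsequent circuit construction can branch on, can be pictured in plain Python. This is an illustrative toy under our own simplifying assumptions (single qubit, computational-basis measurement); all names are hypothetical and nothing here reflects the Proto-Quipper syntax or semantics:

    ```python
    import random

    # A qubit as a pair of complex amplitudes: alpha|0> + beta|1>.
    def measure(qubit, rng=random.Random(0)):
        """Measure in the computational basis; return (outcome, collapsed state).
        The boolean outcome is the 'lifted' classical datum."""
        alpha, beta = qubit
        p1 = abs(beta) ** 2
        outcome = rng.random() < p1          # True means we observed |1>
        collapsed = (0.0, 1.0) if outcome else (1.0, 0.0)
        return outcome, collapsed

    # Dynamic lifting: classical control flow steers further circuit generation.
    def conditional_circuit(qubit):
        outcome, _ = measure(qubit)
        if outcome:                          # branch on the lifted boolean
            return ["X"]                     # emit an X gate only if we saw |1>
        return []

    # Measuring a basis state is deterministic: |1> always lifts to True.
    print(conditional_circuit((0.0, 1.0)))   # ['X']
    print(conditional_circuit((1.0, 0.0)))   # []
    ```

    The branch on `outcome` is exactly what a purely circuit-level language cannot express without lifting: the shape of the generated circuit depends on a run-time measurement.
    
    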

    A Deductive Verification Framework for Circuit-building Quantum Programs

    While recent progress in quantum hardware opens the door to significant speedups in certain key areas, quantum algorithms are still hard to implement correctly, and validating such quantum programs is a challenge. Early attempts either suffer from a lack of automation or of parametrized reasoning, or target high-level abstract algorithm-description languages far from the current de facto consensus of circuit-building quantum programming languages. As a consequence, no significant quantum algorithm implementation has so far been verified in a scale-invariant manner. We propose Qbricks, the first formal verification environment for circuit-building quantum programs, featuring a clear separation between code and proof, parametric specifications and proofs, a high degree of proof automation, and the ability to encode quantum programs in a natural way, i.e., close to textbook style. Qbricks builds on best practices of formal verification for the classical case and tailors them to the quantum case: we introduce a new domain-specific circuit-building language for quantum programs, Qbricks-DSL, together with a new logical specification language, Qbricks-Spec, and a dedicated Hoare-style deductive verification rule named Hybrid Quantum Hoare Logic. In particular, we introduce and build intensively upon HOPS, a higher-order extension of the recent path-sum symbolic representation, used for both specification and automation. To illustrate what Qbricks makes possible, we provide the first verified parametric implementations of several famous and non-trivial quantum algorithms, including the quantum part of Shor's integer factoring (Order Finding, Shor-OF), quantum phase estimation (QPE), a basic building block of many quantum algorithms, and Grover search. These results were greatly facilitated by the specification and automated deduction principles introduced within Qbricks.
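    The path-sum representation that HOPS extends can be illustrated with a small numeric toy: an operator's action on a basis state is written as a sum over boolean path variables of phase-weighted basis states, and composing operators means summing over the intermediate path variables. The sketch below (our own illustrative encoding, not the Qbricks/HOPS syntax) uses the standard path-sum form of the Hadamard, H|x> = (1/sqrt(2)) * sum_y (-1)^(x.y) |y>, and checks that H;H collapses to the identity:

    ```python
    from itertools import product

    def hadamard_paths(n, x):
        """Path-sum view of n Hadamards: map basis state x (tuple of bits)
        to {basis state y: amplitude}, amplitude = (-1)^(x.y) / sqrt(2)^n."""
        norm = 2 ** (n / 2)
        return {y: (-1) ** sum(a * b for a, b in zip(x, y)) / norm
                for y in product((0, 1), repeat=n)}

    def compose(n, x, op1, op2):
        """(op2 . op1)|x>: sum over the intermediate path variable y."""
        out = {}
        for y, a in op1(n, x).items():
            for z, b in op2(n, y).items():
                out[z] = out.get(z, 0.0) + a * b
        return out

    # H;H on |0>: after summing over paths, amplitude ~1 on (0,), ~0 on (1,).
    amps = compose(1, (0,), hadamard_paths, hadamard_paths)
    print(amps)
    ```

    Path variables make this kind of interference cancellation a symbolic simplification rather than a matrix computation, which is what enables proof automation over circuits of parametric size.
    
    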

    Hard Real Time and Mixed Time Criticality on Off-The-Shelf Embedded Multi-Cores

    The paper describes a pragmatic solution to the parallel execution of hard real-time tasks on off-the-shelf embedded multiprocessors. We propose a simple timing-isolation protocol allowing computational tasks to communicate with hard real-time ones. Excellent parallel resource utilization can be achieved while preserving timing compositionality. An extension to a synchronous language enables correct-by-construction compilation to efficient parallel code. We do not explicitly address certification issues at this stage, yet our approach is designed to enable full system certification at the highest safety standards, such as SIL 4 in IEC 61508 or DAL A in DO-178B.
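    One way to picture such a timing-isolation protocol (a sketch under our own simplifying assumptions, not the paper's actual protocol) is a double-buffered channel: a best-effort computational task may write at any time, while the hard real-time task only ever reads the buffer frozen at the last window boundary, so its timing and its inputs never depend on when the writer runs:

    ```python
    class IsolatedChannel:
        """Double-buffered one-way channel: writers never block or delay readers."""
        def __init__(self, initial):
            self._pending = initial   # written by the computational task
            self._stable = initial    # read by the hard real-time task

        def write(self, value):       # computational task: any time, any duration
            self._pending = value

        def window_boundary(self):    # invoked by the time-triggered scheduler
            self._stable = self._pending

        def read(self):               # hard real-time task: constant-time read
            return self._stable

    ch = IsolatedChannel(initial=0)
    ch.write(41)          # not yet visible: the window has not closed
    before = ch.read()    # still 0
    ch.window_boundary()  # scheduler freezes the pending value
    after = ch.read()     # now 41
    print(before, after)  # 0 41
    ```

    Because the real-time task always observes a value committed at a known boundary, its worst-case timing is independent of the computational task, which is the compositionality property the abstract emphasizes.
    
    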

    Analyse statique de programmes manipulant des tableaux

    Static analysis is a key area in compilation, optimization, and software validation. Complex data structures (arrays, dynamic lists, graphs...) are ubiquitous in programs and can be challenging to analyze, because they can be large or of unbounded size and because accesses into them are computed (through indexing or indirection). Whereas verifying the validity of array accesses was one of the initial motivations of abstract interpretation, discovering properties about array contents was addressed only recently. Most analyses of array contents are based on a partitioning of the arrays; they then try to discover properties about each fragment of this partition. Choosing this partition is a difficult problem, and each existing method has its flaws. Moreover, classical representations of array partitions induce an exponential complexity for these analyses. In this thesis, we generalize the concept of array partitioning into a concept of "fragmentation" which allows overlapping fragments, handles potentially empty fragments, and selects specialized relations. We further propose an abstraction of these fragmentations in terms of graphs called "slice diagrams", together with the operations to manipulate them, ensuring polynomial complexity. Finally, we propose a new criterion to compute a semantic fragmentation, inspired by existing ones, which attempts to correct their flaws. These methods have been implemented in a static analyzer. Experiments show that the analyzer can efficiently and precisely handle some challenging examples in the field of static analysis of programs manipulating arrays.
Static program analysis is a crucial area in compilation, optimization, and software validation. Complex data structures (arrays, lists, graphs...), ubiquitous in programs, pose difficult problems, because they represent data sets of large or unknown size and because addressing within these sets is computed (indexing, indirection). Most work on the analysis of data structures concerns verifying the correctness of data accesses (checking that array indices are within bounds, that pointers are not null, "shape analysis"). The analysis of the contents of data structures is still little explored. At Verimag, this area was tackled recently and has yielded first results on the analysis of one-dimensional arrays. An analysis method for simple programs was proposed [1], which discovers properties of array contents, for example that the result of a sorting program is indeed a sorted array. Another kind of property, called "non-positional", was also considered [2], concerning the global content of an array independently of its ordering: for example, one shows that the result of a sort is a permutation of the initial array. These first results are very encouraging, but still embryonic. The goal of the proposed thesis work is to extend them in several directions. Our analysis of positional properties can discover point-to-point relations between array "slices" (sets of consecutive cells). The envisioned extensions concern multidimensional arrays, sets of not-necessarily-consecutive cells, and more general data structures. Concerning non-positional properties, the first results are limited to equalities of array contents; they must be extended to more complex relations (inclusions, disjoint sums...) and to other data structures. This work takes place in the ASOPT project ("Static analysis and optimization"), accepted in the Arpège programme of the ANR in 2008. References: [1] N. Halbwachs, M. Péron. Discovering properties about arrays in simple programs. ACM Conference on Programming Language Design and Implementation, PLDI 2008. Tucson (Az.), June 2008. [2] V. Perrelle. Analyse statique du contenu de tableaux, propriétés non positionnelles. M2R report, Master Parisien de Recherche en Informatique, September 2008.
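    The flavor of slice-based reasoning about array contents can be sketched as tracking one property per fragment during an initialization loop: after i iterations, the fragment [0, i) is "initialized" and [i, n) is still "unknown". This is a toy concretization of the fragmentation idea under our own naming, not the thesis's actual abstract domain:

    ```python
    def analyze_init_loop(n):
        """Abstractly 'execute' `for i in range(n): a[i] = 0`, tracking fragments.
        A fragment is ((lo, hi), prop); fragments may be empty (lo == hi)."""
        states = []
        for i in range(n + 1):
            frags = [((0, i), "initialized"), ((i, n), "unknown")]
            # drop empty fragments, as a fragmentation domain is allowed to
            frags = [f for f in frags if f[0][0] < f[0][1]]
            states.append(frags)
        return states

    states = analyze_init_loop(3)
    print(states[0])    # [((0, 3), 'unknown')]
    print(states[1])    # [((0, 1), 'initialized'), ((1, 3), 'unknown')]
    print(states[-1])   # [((0, 3), 'initialized')]
    ```

    A real analysis would infer the fragment bounds symbolically and join such states at loop heads; the point here is only that per-fragment properties let the final state record that the whole array is initialized.
    
    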

    Reusing Caches and Invariants for Incremental Static Analysis

    Static program analysis can nowadays handle large programs with very good precision while remaining reasonably fast. Nevertheless, analysis times are still measured in minutes, or even tens of minutes, which makes integrating analyzers into development processes difficult: modifications to a program are very frequent there and thus require the analyzer's results to be obtained quickly. Yet these modifications are often minor, on the order of a few lines of code at most. Incremental static analysis exploits this characteristic to let a static analyzer merely update the results of a previous analysis instead of recomputing everything, which yields significant time savings. This article presents two new approaches to incremental static analysis, one reusing function caches and the other loop invariants. We implemented them in Eva, the abstract-interpretation-based value analyzer of Frama-C, using a new feature of this platform that makes it possible to compare two programs. Our work was evaluated on a set of commits from real programs.
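    The function-cache idea can be sketched as keying analysis results on a hash of each function's body, so a commit that touches one function invalidates only that entry. This is an illustrative toy under our own naming, not Eva's actual mechanism:

    ```python
    import hashlib

    cache = {}
    analyses_run = 0

    def analyze_function(name, body):
        """Stand-in for an expensive per-function analysis, memoized on the
        hash of the function's body."""
        global analyses_run
        key = (name, hashlib.sha256(body.encode()).hexdigest())
        if key not in cache:
            analyses_run += 1                      # pay the cost only on a miss
            cache[key] = f"summary of {name}"      # stand-in for a real summary
        return cache[key]

    program_v1 = {"f": "return x + 1", "g": "return y * 2"}
    for name, body in program_v1.items():
        analyze_function(name, body)

    program_v2 = dict(program_v1, g="return y * 3")   # the commit touches only g
    for name, body in program_v2.items():
        analyze_function(name, body)

    print(analyses_run)   # 3: f analyzed once, g twice (its body changed)
    ```

    A real incremental analyzer must also invalidate callers whose summaries depend on the changed function; the article's program-comparison feature is what identifies which parts of the previous results remain valid.
    
    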

    Mixed-criticality in Railway Systems: A Case Study on Signalling Application

    We presented the work conducted in the FSF project to handle mixed criticality. We used a synchronous design framework to implement a simplified signaling application and to deploy it on a partitioned OS. We are continuously working towards a better integration of the tools composing the framework. In the passenger-exchange use case, mixed criticality resides at the application level, or even at the function level, rather than at the system level. On the other hand, the approach proposed in IMA and ARINC meets the needs of a system integrator. The main constraint highlighted by this case study is that there may be, even within a single system function, many communications between its vital and non-vital subcomponents. When generalized to the whole set of system functions, this pattern induces a large number of communications between the vital and non-vital parts. Furthermore, if we want to preserve the synchronous semantics (e.g., no additional delay), the number of windows may explode. The overall cost of communications and context switches becomes prohibitive for the system's global performance. Executing mixed-critical signaling applications on the same platform remains a challenging problem given the state of the art in real-time operating systems. Finally, the vital/non-vital dichotomy traditionally used in signaling applications proved insufficient with respect to the operational availability of the system. It would be more appropriate to consider at least three levels, safety-critical, mission-critical, and non-critical, and to exploit this information in partitioning and scheduling.
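    The window-explosion effect can be made concrete with a back-of-the-envelope count (our own illustrative model with made-up numbers, not the paper's measurements): if each system function exchanges data both ways between its vital and non-vital subcomponents every frame, and each cross-criticality exchange needs its own time window, the partition-switch overhead grows linearly with the number of functions:

    ```python
    def frame_overhead(functions, crossings_per_function, switch_cost_us, frame_us):
        """Windows per frame and the fraction of the frame spent on
        context switches, assuming one window per cross-criticality exchange
        and two partition switches (enter + leave) per window."""
        windows = functions * crossings_per_function
        overhead_us = windows * 2 * switch_cost_us
        return windows, overhead_us / frame_us

    for f in (1, 10, 50):
        # two directions (vital -> non-vital and back) per system function
        w, frac = frame_overhead(f, crossings_per_function=2,
                                 switch_cost_us=10, frame_us=10_000)
        print(f, "functions ->", w, "windows/frame,", f"{frac:.0%} overhead")
    ```

    With these (assumed) costs, 50 functions already burn a fifth of each frame on partition switches alone, which is the prohibitive overhead the case study reports.
    
    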