
    Development and Verification of a Flight Stack for a High-Altitude Glider in Ada/SPARK 2014

    SPARK 2014 is a modern programming language and a state-of-the-art tool set for the development and verification of high-integrity software. In this paper, we explore the capabilities and limitations of its latest version in the context of building a flight stack for a high-altitude unmanned glider. To that end, we deliberately applied static analysis early and continuously during implementation, giving verification the opportunity to steer the software design. In this process we identified several limitations and pitfalls of software design and verification in SPARK, for which we give workarounds and protective actions. Finally, we give design recommendations that have proven effective for verification, and summarize our experiences with this new language.
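
    The flight stack itself is written in SPARK 2014; as a rough illustration of the contract-based style it relies on, the hypothetical C sketch below writes out a function's pre- and postconditions as assertions that a static analyzer can try to discharge (in SPARK these would be Pre/Post aspects proved by GNATprove). All names and bounds are invented for illustration, not taken from the paper.

```c
#include <assert.h>
#include <stdint.h>

#define ALT_MAX 30000  /* metres; illustrative bound, not from the paper */

/* Saturating altitude update with its contract written out as
 * assertions, mimicking a SPARK Pre/Post pair. */
int32_t update_altitude(int32_t alt, int32_t delta)
{
    assert(alt >= 0 && alt <= ALT_MAX);     /* precondition */
    assert(delta >= -100 && delta <= 100);  /* precondition */

    int64_t next = (int64_t)alt + delta;    /* widen: no signed overflow */
    if (next < 0)       next = 0;
    if (next > ALT_MAX) next = ALT_MAX;

    assert(next >= 0 && next <= ALT_MAX);   /* postcondition */
    return (int32_t)next;
}
```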

    Proving Memory Safety of the ANI Windows Image Parser Using Compositional Exhaustive Testing

    We report in this paper how we proved memory safety of a complex Windows image parser written in low-level C in only three months of work, using only three core techniques, namely (1) symbolic execution at the x86 binary level, (2) exhaustive program path enumeration and testing, and (3) user-guided program decomposition and summarization. We also used a new tool, named MicroX, for executing code fragments in isolation using a custom virtual machine designed for testing purposes. As a result of this work, we are able to prove, for the first time, that a Windows image parser is memory safe, i.e., free of any buffer-overflow security vulnerabilities, modulo the soundness of our tools and several additional assumptions regarding bounding input-dependent loops, fixing a few buffer-overflow bugs, and excluding some code parts that are not memory safe by design. In the process, we also discovered and fixed several limitations in our tools, and narrowed the gap between systematic testing and verification.
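
    The three techniques combine roughly as follows: a fragment is carved out of the parser, run in isolation (as MicroX does), and tested over its entire bounded input space, so the property established becomes a reusable summary for the enclosing code. The C sketch below is a toy illustration of that idea; the fragment, names, and bounds are invented, not taken from the ANI parser.

```c
#include <assert.h>
#include <stdint.h>

/* A toy "code fragment" in the spirit of micro-execution: a header
 * check carved out of a larger parser and tested in isolation. */
static int frame_count(const uint8_t hdr[2])
{
    int n = hdr[0] | (hdr[1] << 8);
    return (n > 64) ? 64 : n;   /* clamp: the bound we want to trust */
}

int main(void)
{
    /* Exhaustive enumeration of the fragment's entire 16-bit input
     * space: every path is exercised, so the bound asserted below is
     * established by testing rather than merely sampled. */
    for (uint32_t v = 0; v <= 0xFFFF; v++) {
        uint8_t hdr[2] = { (uint8_t)(v & 0xFF), (uint8_t)(v >> 8) };
        int n = frame_count(hdr);
        assert(n >= 0 && n <= 64);  /* memory-safety-relevant summary */
    }
    return 0;
}
```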

    The development of an incremental debugging system : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science at Massey University

    Debugging is a major area of software development that has received little attention. This thesis starts by looking at work done in the areas of bug prevention, bug detection, bug location and bug correction. A debugging system, BIAS, is proposed to help in detecting, locating and correcting bugs. Three major design goals are established. Firstly, the system should be simple and easy to understand, as this will encourage use. Secondly, the system should be general so that it will be available to a large number of users. Finally, it should be incremental, as this will save users' time. An incremental language, STILL, is designed to show how BIAS applies to structured languages. The construction of the system is shown. Each data structure, and how it is used, is described. BIAS uses an interpretive system and runs threaded code on a pseudo-machine. How the threads are interpreted and how they are set up is shown next. The use of BIAS is shown by following through an example session with the system. This consists of entering a program, editing it, and running it. As bugs show themselves, various debugging commands are used to locate them. The program is then edited, and the corrections linked into the code so that it will run correctly. This cycle is repeated until no bugs remain, without at any time recompiling the whole program. It turns out that the best way of achieving the design goals is to extend an incremental compiler host to include debugging commands. This gives a clear emphasis to the power of incremental compilers.
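
    As a minimal sketch of what "threaded code on a pseudo-machine" means here, the C fragment below runs a program stored as an array of routine addresses; because editing a program amounts to re-linking entries in the thread, a change can take effect without recompiling the whole program. The opcodes and layout are invented for illustration and do not reflect BIAS's actual design.

```c
#include <stdio.h>

typedef void (*op_fn)(void);

static int acc;
static void op_load1(void) { acc = 1; }
static void op_add2(void)  { acc += 2; }
static void op_print(void) { printf("%d\n", acc); }

int main(void)
{
    /* The "thread": a NULL-terminated sequence of routine addresses. */
    op_fn thread[] = { op_load1, op_add2, op_print, NULL };
    for (op_fn *ip = thread; *ip; ip++)   /* the inner interpreter */
        (*ip)();

    /* An incremental edit: re-link one slot and rerun, no recompile. */
    thread[1] = op_load1;
    for (op_fn *ip = thread; *ip; ip++)
        (*ip)();
    return 0;
}
```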

    Proving memory safety of floating-point computations by combining static and dynamic program analysis

    Whitebox fuzzing is a novel form of security testing based on runtime symbolic execution and constraint solving. Over the last couple of years, whitebox fuzzers have found dozens of new security vulnerabilities (buffer overflows) in Windows and Linux applications, including codecs, image viewers and media players. Those types of applications tend to use floating-point instructions available on modern processors, yet existing whitebox fuzzers and SMT constraint solvers do not handle floating-point arithmetic. Are there new security vulnerabilities lurking in floating-point code? A naive solution would be to extend symbolic execution to floating-point (FP) instructions (months of work), extend SMT solvers to reason about FP constraints (months of work), and then face more complex constraints and an even worse path explosion problem. Instead, we propose an alternative approach, based on the rough intuition that FP code should only perform memory-safe data-processing of the “payload” of an image or video file, while the non-FP part of the application should deal with buffer allocations and memory address computations, with only the latter being prone to buffer overflows and other security-critical bugs. Our approach combines (1) a lightweight local path-insensitive “may” static analysis of FP instructions with (2) a high-precision whole-program path-sensitive “must” dynamic analysis of non-FP instructions. The aim of this combination is to prove memory safety of the FP part and a form of non-interference between the FP part and the non-FP part with respect to memory address computations. We have implemented our approach using two existing tools for, respectively, static and dynamic x86 binary analysis. We present preliminary results of experiments with standard JPEG, GIF and ANI Windows parsers. For a given test suite of diverse input files, our mixed static/dynamic analysis is able to prove memory safety of FP code in those parsers for a small upfront static analysis cost and a marginal runtime expense compared to regular runtime symbolic execution.
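
    The hypothetical C fragment below illustrates the intuition: floating-point instructions transform payload bytes into pixel values, while every size and index is computed by integer (non-FP) code, and no FP result ever flows into an address. That separation is the non-interference property the combined analysis targets; the code is illustrative, not taken from the JPEG, GIF, or ANI parsers.

```c
#include <stdlib.h>

/* Non-FP code owns allocation and all address arithmetic. */
float *decode(const unsigned char *payload, int w, int h)
{
    float *out = malloc((size_t)w * h * sizeof *out);
    if (!out) return NULL;
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int i = y * w + x;               /* integer index math only */
            float v = payload[i] / 255.0f;   /* FP: pure data processing */
            out[i] = v * v;                  /* FP result never reaches an index */
        }
    }
    return out;
}
```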

    On the Use of Underspecified Data-Type Semantics for Type Safety in Low-Level Code

    In recent projects on operating-system verification, C and C++ data types are often formalized using a semantics that does not fully specify the precise byte encoding of objects. It is well-known that such an underspecified data-type semantics can be used to detect certain kinds of type errors. In general, however, underspecified data-type semantics are unsound: they assign well-defined meaning to programs that have undefined behavior according to the C and C++ language standards. A precise characterization of the type-correctness properties that can be enforced with underspecified data-type semantics is still missing. In this paper, we identify strengths and weaknesses of underspecified data-type semantics for ensuring type safety of low-level systems code. We prove sufficient conditions to detect certain classes of type errors and, finally, identify a trade-off between the complexity of underspecified data-type semantics and their type-checking capabilities.
    Comment: In Proceedings SSV 2012, arXiv:1211.587
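
    Two hypothetical C fragments make the trade-off concrete. Because an underspecified semantics leaves the byte encoding of objects open, it cannot evaluate code whose result depends on that encoding, which is exactly how it detects reinterpretation-style type errors; conversely, it may still assign meaning to code the standards leave undefined, which is the unsoundness noted above.

```c
#include <stdint.h>
#include <string.h>

/* Detectable: the result depends on the (unspecified) byte encoding
 * of 'float', so an underspecified semantics gets stuck here. */
uint32_t bits_of(float f)
{
    uint32_t u;
    memcpy(&u, &f, sizeof u);
    return u;
}

/* Not detectable: reading through a pointer whose target's lifetime
 * has ended is undefined behavior per the C standard, yet a byte-level
 * underspecified semantics may still compute a value for it. */
int stale_read(void)
{
    int *p;
    { int x = 1; p = &x; }  /* x's lifetime ends at this brace */
    return *p;              /* undefined behavior */
}
```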

    A Static Analyzer for Large Safety-Critical Software

    We show that abstract-interpretation-based static program analysis can be made efficient and precise enough to formally verify a class of properties for a family of large programs with few or no false alarms. This is achieved by refinement of a general-purpose static analyzer and later adaptation to particular programs of the family by the end-user through parametrization. This is applied to the proof of soundness of data manipulation operations at the machine level for periodic synchronous safety-critical embedded software. The main novelties are the design principle of static analyzers by refinement and adaptation through parametrization, the symbolic manipulation of expressions to improve the precision of abstract transfer functions, the octagon, ellipsoid, and decision tree abstract domains, all with sound handling of rounding errors in floating-point computations, widening strategies (with thresholds, delayed), and the automatic determination of the parameters (parametrized packing).
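
    As a toy illustration (not from the analyzed software) of why relational domains matter here: in the C loop below the guard bounds i but not j, so a plain interval analysis cannot prove the store in-bounds, while the octagon domain, which tracks constraints of the form ±x ± y ≤ c, keeps the invariant i - j = 0 and closes the proof.

```c
#define N 100
int buf[N];

void fill(void)
{
    int i = 0, j = 0;
    while (i < N) {
        buf[j] = i;  /* safe: octagons know i - j == 0, hence j < N */
        i++;
        j++;
    }
}
```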

    The SeaHorn Verification Framework

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state of the art in software model checking and abstract interpretation for verification, and (d) uses Horn clauses as an intermediate language to represent verification conditions, which simplifies interfacing with multiple verification tools based on Horn clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
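
    To make the Horn-clause intermediate representation concrete, here is a toy safe program together with a hand-written sketch of the constrained Horn clauses such a framework could derive from it; SeaHorn's actual encoding (over LLVM IR) differs in detail.

```c
/* SV-COMP-style error sink: reachable iff the program is unsafe. */
extern void __VERIFIER_error(void);

int main(void)
{
    int x = 0;
    while (x < 10)
        x = x + 2;
    if (x != 10)
        __VERIFIER_error();
    return 0;
}

/* Horn-clause sketch, with Inv an unknown loop invariant:
 *   x = 0                          -> Inv(x)     (initiation)
 *   Inv(x) /\ x < 10 /\ x' = x+2   -> Inv(x')    (consecution)
 *   Inv(x) /\ x >= 10 /\ x != 10   -> false      (safety)
 * A Horn solver such as Spacer discharges these with
 * Inv(x) := x is even /\ x <= 10, proving the error unreachable. */
```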

    A Comparative Analysis Of Conventional Software Development Approaches Vs. Formal Methods In Call Distribution Systems

    Formal methods are grounded in mathematics: formalization uses mathematical techniques to specify the properties of hardware and software systems. Mathematical modeling thus provides a framework in which large and complex systems can be specified, analyzed, designed, and verified systematically, rather than by conventional approaches. Formal verification applies the fundamentals of theoretical computer science to hard problems in large, complex software and hardware systems, to ensure the systems will not fail with run-time errors. Conventional approaches to software verification in call distribution systems rely on quality assurance to check system behavior and robustness, but software testing can only show the presence of errors in a system, never their absence. [1] In contrast, mathematically based verification techniques use formal methods to prove specific software attributes, for example that software does or does not contain run-time errors such as overflows, divide-by-zero, access violations, invalid memory accesses, and stack/heap corruption. [1] Later in this paper we present a comparative analysis of formal methods versus conventional software development approaches in call distribution systems, and use the comparison to identify which methodologies and approaches are better suited to the SDLC for call distribution systems.
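
    A hypothetical fragment of call-distribution code illustrates the difference in guarantees: a test suite only witnesses the inputs it happens to contain, whereas a sound static verifier flags the division below for every execution in which active can be zero. All names here are invented for illustration.

```c
/* Average wait time across queued calls, as a routing metric. */
int average_wait(const int *wait_ms, int n_calls, int active)
{
    int total = 0;
    for (int i = 0; i < n_calls; i++)
        total += wait_ms[i];   /* possible signed overflow, also flagged */
    return total / active;     /* divide-by-zero when active == 0 */
}
```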
