
    Generating program analyzers

    This work presents the automatic generation of program analyzers from concise specifications. It focuses on provably correct and complex interprocedural analyses for imperative programs of real-world size. This requires a powerful and flexible specification mechanism, enabling both correctness proofs and efficient implementations. The generation process relies on the theory of data flow analysis and on abstract interpretation. The theory of data flow analysis provides methods to implement analyses efficiently. Abstract interpretation provides the relation to the semantics of the programming language, which allows the systematic derivation of efficient, provably correct, and terminating analyses. The approach has been implemented in the program analyzer generator PAG. It addresses analyses ranging from "simple" intraprocedural bit vector frameworks to complex interprocedural alias analyses. A high-level specialized functional language serves as the specification mechanism, enabling elegant and concise specifications even for complex analyses. Additionally, it allows the automatic selection of efficient implementations for the underlying abstract datatypes, such as balanced binary trees, binary decision diagrams, bit vectors, and arrays. For the interprocedural analysis, the functional approach, the call string approach, and a novel approach specifically targeting the precise analysis of loops can be chosen. This work presents the implementation of PAG as well as a large number of its applications.
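
    The abstract sketches the machinery behind generated analyzers: a lattice of abstract values, transfer functions per program point, and a fixpoint computation. As a rough illustration of that fixpoint machinery only (not PAG's specification language), here is a minimal worklist solver for a forward may-analysis in Python; the control flow graph, the reaching-definitions transfer functions, and all names are hypothetical.

```python
# Minimal sketch of the worklist fixpoint iteration underlying data flow
# analyses of the kind PAG generates.  The lattice here is a powerset with
# set union as join; the CFG and the reaching-definitions transfer
# functions below are illustrative assumptions, not PAG's input language.

def solve(cfg, entry, transfer, init):
    """Forward may-analysis.  cfg maps a node to its successors,
    transfer maps a node to a monotone function on sets,
    init is the abstract value entering the program."""
    preds = {n: [] for n in cfg}
    for n, succs in cfg.items():
        for s in succs:
            preds[s].append(n)

    out = {n: set() for n in cfg}            # start at the bottom element
    worklist = list(cfg)
    while worklist:
        n = worklist.pop()
        # IN[n] = join over predecessors (plus init at the entry node)
        inval = set(init) if n == entry else set()
        for p in preds[n]:
            inval |= out[p]
        new_out = transfer[n](inval)
        if new_out != out[n]:                # value grew: revisit successors
            out[n] = new_out
            worklist.extend(cfg[n])
    return out

# Hypothetical three-definition program: d1: x = ...; d2: y = ...;
# d3 (in a loop): x = ...   For reaching definitions, d3 kills d1 and
# vice versa, so only {d2, d3} reach the exit.
cfg  = {"d1": ["d2"], "d2": ["d3"], "d3": ["d3", "exit"], "exit": []}
gen  = {"d1": {"d1"}, "d2": {"d2"}, "d3": {"d3"}, "exit": set()}
kill = {"d1": {"d3"}, "d2": set(), "d3": {"d1"}, "exit": set()}
transfer = {n: (lambda inval, n=n: (inval - kill[n]) | gen[n]) for n in cfg}
print(solve(cfg, "d1", transfer, init=set()))
```

    Termination in this sketch follows from the finite powerset lattice and the monotone transfer functions, which mirrors the termination argument the abstract attributes to abstract interpretation.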

    Differentially Testing Soundness and Precision of Program Analyzers

    In the last decades, numerous program analyzers have been developed by both academia and industry. Despite their abundance, however, there is currently no systematic way of comparing the effectiveness of different analyzers on arbitrary code. In this paper, we present the first automated technique for differentially testing soundness and precision of program analyzers. We used our technique to compare six mature, state-of-the-art analyzers on tens of thousands of automatically generated benchmarks. Our technique detected soundness and precision issues in most analyzers, and we evaluated the implications of these issues for both designers and users of program analyzers.
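
    As a rough sketch of the differential-testing idea described above (not the paper's actual tool or analyzers), the loop below runs several analyzers on a generated benchmark whose ground truth is known by construction and flags disagreements with that ground truth; the analyzer commands and output parsing are hypothetical placeholders.

```python
import subprocess

# Hypothetical analyzer commands; each is assumed to print "safe" or
# "unsafe" for the assertion in the benchmark program.  These names are
# placeholders, not the six analyzers compared in the paper.
ANALYZERS = {
    "analyzer_a": ["analyzer-a", "--check-assertions"],
    "analyzer_b": ["analyzer-b", "--assertions"],
}

def verdict(cmd, program_path):
    """Run one analyzer and normalize its output to safe/unsafe/unknown."""
    out = subprocess.run(cmd + [program_path], capture_output=True,
                         text=True, timeout=60).stdout.lower()
    if "unsafe" in out:          # check "unsafe" first: "safe" is a substring
        return "unsafe"
    if "safe" in out:
        return "safe"
    return "unknown"

def differential_test(program_path, assertion_truly_holds):
    """Compare verdicts against the benchmark's known ground truth.
    - 'safe' on a failing assertion points to a soundness issue
    - 'unsafe' on a holding assertion points to a precision issue
    """
    issues = []
    for name, cmd in ANALYZERS.items():
        v = verdict(cmd, program_path)
        if assertion_truly_holds and v == "unsafe":
            issues.append((name, "precision"))
        elif not assertion_truly_holds and v == "safe":
            issues.append((name, "soundness"))
    return issues
```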

    Learning a Static Analyzer from Data

    To be practically useful, modern static analyzers must precisely model the effects of both the statements of the programming language and the frameworks used by the program under analysis. While important, manually addressing these challenges is difficult for at least two reasons: (i) the effects on the overall analysis can be non-trivial, and (ii) as the size and complexity of modern libraries increase, so does the number of cases the analysis must handle. In this paper we present a new, automated approach for creating static analyzers: instead of manually providing the various inference rules of the analyzer, the key idea is to learn these rules from a dataset of programs. Our method consists of two ingredients: (i) a synthesis algorithm capable of learning a candidate analyzer from a given dataset, and (ii) a counter-example guided learning procedure which generates new programs beyond those in the initial dataset, critical for discovering corner cases and ensuring that the learned analysis generalizes to unseen programs. We implemented and instantiated our approach for the task of learning JavaScript static analysis rules for a subset of points-to analysis and for allocation-site analysis. These are challenging yet important problems that have received significant research attention. We show that our approach is effective: our system automatically discovered practical and useful inference rules for many cases that are tricky to identify manually and are missed by state-of-the-art, manually tuned analyzers.
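
    The two ingredients in the abstract fit a standard counter-example guided loop. The sketch below is a schematic rendering of that loop, not the paper's algorithm; `synthesize`, `generate_programs`, and `oracle` are hypothetical placeholders for the rule synthesizer, the program generator, and a trusted reference for the expected analysis facts.

```python
# Schematic counter-example guided learning loop in the spirit of the
# abstract.  All callables passed in are assumed, hypothetical components.

def learn_analyzer(initial_dataset, synthesize, generate_programs, oracle,
                   max_rounds=20):
    dataset = list(initial_dataset)           # pairs (program, expected facts)
    for _ in range(max_rounds):
        candidate = synthesize(dataset)       # fit analysis rules to the data
        # Probe programs beyond the dataset; disagreements with the oracle
        # expose over-generalized or missing rules (corner cases).
        counterexamples = [
            (p, oracle(p))
            for p in generate_programs(candidate, dataset)
            if candidate(p) != oracle(p)
        ]
        if not counterexamples:
            return candidate                  # no disagreement found: accept
        dataset.extend(counterexamples)       # refine on the corner cases
    return candidate                          # best candidate after the budget
```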

    Efficient symbolic computation of approximated small-signal characteristics of analog integrated circuits

    A symbolic analysis tool is presented that generates simplified symbolic expressions for the small-signal characteristics of large analog integrated circuits. The expressions are approximated while they are computed, so that only those terms are generated that remain in the final expression. This principle yields drastic savings in CPU time and memory compared with previous symbolic analysis tools and thereby greatly increases the maximum size of circuits that can be analyzed. By taking into account a range for the value of a circuit parameter rather than a single number, the generated expressions are also more generally valid. Mismatch handling is explicitly taken into account in the algorithm. The capabilities of the new tool are illustrated with several experimental results.
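
    For illustration only: the snippet below generates a symbolic small-signal transfer function for a hypothetical common-source stage with SymPy and then prunes terms that are negligible at nominal parameter values. Unlike the tool described above, which approximates while the expression is being computed (the key to its CPU and memory savings), this sketch prunes a fully generated expression, so it only shows the kind of simplified output such tools aim for.

```python
import sympy as sp

# Hypothetical example circuit: common-source stage with transconductance
# g_m, output resistance r_o, load R_L || C_L, and feedback capacitance C_gd.
s, gm, ro, RL, CL, Cgd = sp.symbols('s g_m r_o R_L C_L C_gd', positive=True)

Gout = 1/ro + 1/RL + s*CL
H = sp.simplify((s*Cgd - gm) / (Gout + s*Cgd))   # exact small-signal gain
num, den = sp.fraction(sp.cancel(H))

# Assumed nominal operating point used only to rank term magnitudes.
nominal = {gm: 1e-3, ro: 100e3, RL: 10e3, CL: 1e-12, Cgd: 0.1e-12}

def prune(poly_expr, rel_tol=0.1, freq=1e6):
    """Drop additive terms contributing less than rel_tol of the total
    magnitude at the nominal point and the given frequency."""
    vals = {**nominal, s: sp.I * 2 * sp.pi * freq}
    expanded = sp.expand(poly_expr)
    total = abs(complex(expanded.subs(vals)))
    kept = [t for t in sp.Add.make_args(expanded)
            if abs(complex(t.subs(vals))) >= rel_tol * total]
    return sp.Add(*kept)

H_approx = prune(num) / prune(den)
print(sp.simplify(H_approx))   # dominant low-frequency gain term(s)
```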

    Comparison of matroid intersection algorithms for large circuit analysis

    This paper presents two approaches to the symbolic analysis of large analog integrated circuits via simplification during the generation of the symbolic expressions. Both techniques are examined from the point of view of matroid theory. Finally, a new approach that combines the positive features of both is introduced.

    Symbolic analysis tools-the state of the art

    This paper reviews the main last-generation symbolic analyzers, comparing them in terms of functionality and pointing out their shortcomings. The state of the art in this field is also examined, and directions for future research are identified.

    Probing quantum-classical boundary with compression software

    We experimentally demonstrate that it is impossible to simulate quantum bipartite correlations with a deterministic universal Turing machine. Our approach is based on the Normalized Information Distance (NID), which allows the comparison of two pieces of data without detailed knowledge about their origin. Using NID, we derive an inequality for the output of two local deterministic universal Turing machines with correlated inputs. This inequality is violated by correlations generated by a maximally entangled polarization state of two photons. The violation is shown using a freely available lossless compression program. The presented technique may make it possible to complement the common statistical interpretation of quantum physics with an algorithmic one.
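
    The compression-based quantity in the abstract is, in practice, the Normalized Compression Distance, the computable approximation of the NID. A minimal sketch of how it is evaluated with an off-the-shelf lossless compressor follows; the bit strings here are synthetic stand-ins, not the photon measurement records from the experiment.

```python
import lzma
import random

def C(data: bytes) -> int:
    """Compressed length: a computable stand-in for Kolmogorov complexity."""
    return len(lzma.compress(data))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance, the practical approximation of the
    Normalized Information Distance:
        NCD(x, y) = (C(x + y) - min(C(x), C(y))) / max(C(x), C(y))
    Values near 0: the strings share most of their information.
    Values near 1: the strings are (algorithmically) unrelated."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Synthetic stand-ins for correlated measurement records: b is a noisy
# copy of a, c is an independent random string of the same length.
random.seed(0)
a = bytes(random.getrandbits(1) for _ in range(8000))
b = bytes(bit ^ (random.random() < 0.02) for bit in a)
c = bytes(random.getrandbits(1) for _ in range(8000))

print(f"NCD(a, b) (correlated)  = {ncd(a, b):.3f}")
print(f"NCD(a, c) (independent) = {ncd(a, c):.3f}")
```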

    Static Code Analysis

    A lot of the defects present in a program are not visible to the compiler. Static code analysis is a way to find bugs and reduce the number of defects in a software application. This paper gives an overview of static code analysis, well-known tools, and the benefits of this practice.
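
    As a toy illustration of the kind of defect that is invisible to the compiler yet easy to flag statically (not one of the industrial tools the paper discusses), the snippet below walks a Python AST and reports a bare `except:` handler and an `== None` comparison; the sample source is made up.

```python
import ast

# Toy static checker: reports two classic Python defects that run fine
# but hide bugs at runtime.

SOURCE = """
def load(path):
    try:
        data = open(path).read()
    except:                 # swallows every error, including typos
        data = None
    if data == None:        # should be: data is None
        return ""
    return data
"""

def check(source: str):
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append((node.lineno, "bare 'except:' hides unexpected errors"))
        if isinstance(node, ast.Compare) and any(
            isinstance(op, (ast.Eq, ast.NotEq)) for op in node.ops
        ) and any(
            isinstance(c, ast.Constant) and c.value is None
            for c in [node.left, *node.comparators]
        ):
            findings.append((node.lineno, "comparison to None with ==/!=, use 'is'"))
    return findings

for line, message in check(SOURCE):
    print(f"line {line}: {message}")
```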