
    Combining Static and Dynamic Program Analysis Techniques for Checking Relational Properties

    This dissertation is situated in the field of formal software verification. It deals with checking relational properties of computer programs, i.e., properties that consider two or more program executions. The dissertation concentrates on two specific relational properties: (1) noninterference and (2) whether one program is a slice of another program. The noninterference property states that executing a program with the same public inputs produces the same public outputs, independently of the secret inputs (e.g., a password). This means that the secret inputs do not influence the public outputs. Program slicing is a technique for reducing a program by removing statements such that a specified part of the program's behavior is preserved, e.g., the value of a variable at a given instruction in the program. The dissertation provides frameworks that enable the user to analyze these two properties for a given program. It extends the state of the art in the verification of relational properties by providing new approaches on the one hand and by combining already existing approaches on the other. The dissertation contains one part for each of the two relational properties considered.
    Noninterference. The framework for checking noninterference provides new approaches for automatic test generation and for debugging the program, and combines them with approaches based on deductive verification and on program dependence graphs. The first new approach enables the automatic generation of noninterference tests. It allows the user to search for violations of the noninterference property in the program and, in addition, provides a coverage criterion for the generated test suites that is suited to relational properties. The second new approach is a relational debugger for analyzing noninterference counterexamples. It uses well-known concepts of program debugging and extends them for the analysis of relational properties. To support the user in proving the noninterference property, the framework combines an approach based on program dependence graphs with a logic-based approach that employs a theorem prover. Approaches based on program dependence graphs compute the dependences between the different parts of the program and check whether the public output depends on the secret input. Compared to logic-based approaches, dependence-graph-based approaches scale better. However, they can report false alarms because they over-approximate the program dependences. Hence, two further contributions of the framework are combinations of dependence-graph-based and logic-based approaches: (1) the dependence-graph-based approach simplifies the program, which is then checked by the logic-based approach, and (2) the logic-based approach proves that some of the dependences computed by the dependence-graph-based approach are over-approximations and can be removed from the analysis.
    Program slicing. The second part of the dissertation deals with a framework for automatic program slicing. While most state-of-the-art slicing approaches perform only a syntactic program analysis, this framework also considers the program semantics and can thereby remove more statements. The first contribution of the framework is an approach to relational verification that has been extended to prove the correctness of a program slice, i.e., that it preserves the specified behavior of the original program. The advantage of using relational verification is that it runs automatically on two similar programs, which is the case for a slice candidate and the original program. Thus, unlike the few state-of-the-art approaches that consider the program semantics, this approach is automatic. The second contribution of the framework is a new strategy for generating slice candidates by refining dynamic slices (slices that are valid for one input) with the help of the counterexamples delivered by the relational verification.
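
    To make the noninterference property concrete, here is a minimal C sketch (the function names and values are illustrative and not taken from the dissertation): the first variant leaks because a branch on the secret influences the public result, the second does not, and the check in main pairs two runs with equal public inputs and different secrets, which is the shape of the noninterference tests the framework generates.

```c
#include <assert.h>
#include <stdio.h>

/* Insecure: the public result depends on the secret (a noninterference violation). */
int pay_fee_leaky(int secret_balance, int public_fee) {
    if (secret_balance > 1000)      /* branch on the secret ...            */
        return public_fee + 1;      /* ... influences the public output    */
    return public_fee;
}

/* Secure: the public result is computed from public data only. */
int pay_fee_secure(int secret_balance, int public_fee) {
    (void)secret_balance;           /* the secret is ignored               */
    return public_fee;
}

/* A noninterference test runs the program twice with the same public input
 * but different secrets and checks that the public outputs agree.         */
int main(void) {
    int fee = 10;
    /* holds for the secure version ... */
    assert(pay_fee_secure(0, fee) == pay_fee_secure(2000, fee));
    /* ... and fails (i.e., yields a counterexample) for the leaky one: */
    if (pay_fee_leaky(0, fee) != pay_fee_leaky(2000, fee))
        printf("noninterference violated: outputs differ for equal public inputs\n");
    return 0;
}
```

    A failing pair of runs like the one reported above is exactly the kind of counterexample that the relational debugger described in the abstract is meant to analyze.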

    Using Relational Verification for Program Slicing

    Program slicing is the process of removing statements from a program such that defined aspects of its behavior are retained. For producing precise slices, i.e., slices that are minimal in size, the program's semantics must be considered. Existing approaches that go beyond a syntactical analysis and do take the semantics into account are not fully automatic and require auxiliary specifications from the user. In this paper, we adapt relational verification to check whether a slice candidate obtained by removing some instructions from a program is indeed a valid slice. Based on this, we propose a framework for precise and automatic program slicing. As part of this framework, we present three strategies for the generation of slice candidates, and we show how dynamic slicing approaches - that interweave generating and checking slice candidates - can be used for this purpose. The framework can easily be extended with other strategies for generating slice candidates. We discuss the strengths and weaknesses of slicing approaches that use our framework.
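
    As a rough illustration of what checking a slice candidate means, the following C sketch (the programs and the slicing criterion are invented for this example, not taken from the paper) shows an original program, a candidate obtained by deleting statements that the criterion does not depend on, and a spot check; the relational verifier would instead prove agreement on the criterion for all inputs.

```c
#include <assert.h>

/* Original program: the slicing criterion is the final value of `sum`. */
int original(int n) {
    int sum = 0, prod = 1;
    for (int i = 1; i <= n; i++) {
        sum += i;
        prod *= i;            /* irrelevant to the criterion */
    }
    return sum;
}

/* Slice candidate: the statements involving `prod` have been removed. */
int candidate(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum += i;
    return sum;
}

/* Relational verification would prove that both programs agree on the
 * criterion for all inputs; this test only spot-checks a few small ones
 * (kept small so that `prod` does not overflow).                        */
int main(void) {
    for (int n = 0; n <= 10; n++)
        assert(original(n) == candidate(n));
    return 0;
}
```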

    Extending Traditional Static Analysis Techniques to Support Development, Testing and Maintenance of Component-Based Solutions

    Traditional static code analysis encompasses a mature set of techniques for helping understand and optimize programs, such as dead code elimination, program slicing, and partial evaluation (code specialization). It is well understood that compared to other program analysis techniques (e.g., dynamic analysis), static analysis techniques do a reasonable job for the cost associated with implementing them. Industry and government are moving away from more ‘traditional’ development approaches towards component-based approaches as ‘the norm.’ Component-based applications most often comprise a collection of distributed object-oriented components such as forms, code snippets, reports, modules, databases, objects, containers, and the like. These components are glued together by code typically written in a visual language. Some industrial experience shows that component-based development and the subsequent use of visual development environments, while reducing an application's total development time, actually increase certain maintenance problems. This provides a motivation for using automated analysis techniques on such systems. The results of this research show that traditional static analysis techniques may not be sufficient for analyzing component-based systems. We examine closely the characteristics of a component-based system and document many of the issues that we feel make the development, analysis, testing and maintenance of such systems more difficult. By analyzing additional summary information for the components as well as any available source code for an application, we show ways in which traditional static analysis techniques may be augmented, thereby increasing the accuracy of static analysis results and ultimately making the maintenance of component-based systems a manageable task. We develop a technique to use semantic information about component properties obtained from type library and interface definition language files, and demonstrate this technique by extending a traditional unreachable code algorithm. To support more complex analysis, we then develop a technique for component developers to provide summary information about a component. This information can be integrated with several traditional static analysis techniques to analyze component-based systems more precisely. We then demonstrate the effectiveness of these techniques on several real Department of Defense (DoD) COTS component-based systems.
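
    The following C sketch is only a schematic of the idea and not the thesis's actual representation (the summary format, the handler names, and the may-call relation are all invented here): summary information about which callbacks a closed-source component can invoke lets a conventional unreachable-code pass flag glue-code handlers that no component will ever reach.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical summary information for a binary component, e.g. distilled
 * from its type library / IDL file: the callbacks it can actually invoke. */
static const char *component_summary[] = { "OnClick", "OnLoad" };

/* Callback handlers defined in the application's glue code. */
static const char *app_handlers[] = { "OnClick", "OnLoad", "OnLegacyPrint" };

static int component_may_call(const char *handler) {
    for (size_t i = 0; i < sizeof component_summary / sizeof *component_summary; i++)
        if (strcmp(component_summary[i], handler) == 0)
            return 1;
    return 0;
}

/* An unreachable-code pass augmented with the summary: handlers the
 * component can never invoke are reported as dead glue code.          */
int main(void) {
    for (size_t i = 0; i < sizeof app_handlers / sizeof *app_handlers; i++)
        if (!component_may_call(app_handlers[i]))
            printf("handler %s appears unreachable\n", app_handlers[i]);
    return 0;
}
```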

    Support for automatic refactoring of business logic

    Software’s structure profoundly affects its development and maintenance costs. A poor structure may lead to well-known design flaws, such as large modules or long methods. A possible approach to reduce a module’s complexity is the Extract Method refactoring technique. This technique allows the decomposition of a large and complex method into smaller and simpler ones, while reducing the original method’s size and improving its readability and comprehension. The OutSystems platform is a low-code platform that allows the development of web and mobile applications that rely on a set of visual Domain-Specific Languages (DSLs). Even low-code languages, when improperly used, can lead to software that has maintenance issues like long methods. Thus, the purpose of this paper is to present the research and development done to provide the OutSystems platform with a tool that automatically suggests Extract Method refactoring opportunities. The research combines program slicing techniques with code complexity metrics to calculate the best refactoring opportunities that preserve programs’ functionality. The proposed approach was tested on typical OutSystems apps and was shown to be able to reduce the overall applications’ complexity.
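
    The paper targets the OutSystems visual DSL, so the following C sketch is only an analogy (all names and values are invented): it shows the shape of an Extract Method refactoring in which a self-contained block, the kind of fragment a program slice can identify, is moved into its own function, reducing the size and branch count of the original method.

```c
#include <stdio.h>

/* After refactoring: the validation block, which forms a self-contained
 * slice of the computation on the order data, has been extracted into a
 * helper. A slicing-based tool could propose exactly this block as an
 * Extract Method candidate and rank it by how much it reduces the
 * original method's complexity (e.g., its statement or branch count).   */
static int is_valid_order(int quantity, double price) {
    if (quantity <= 0) return 0;
    if (price <= 0.0) return 0;
    if (quantity > 1000) return 0;
    return 1;
}

static void process_order(int quantity, double price) {
    if (!is_valid_order(quantity, price)) {   /* extracted call */
        printf("rejected\n");
        return;
    }
    printf("total: %.2f\n", quantity * price);
}

int main(void) {
    process_order(3, 9.99);
    process_order(-1, 9.99);
    return 0;
}
```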

    Moduli Stabilization and the Holographic RG for AdS and dS

    We relate moduli stabilization (V'=0) in the bulk of AdS_D or dS_D to basic properties of the Wilsonian effective action in the holographic dual theory on dS_{D-1}: the single-trace terms in the action have vanishing beta functions, and higher-trace couplings are determined purely from lower-trace ones. In the de Sitter case, this encodes the maximal symmetry of the bulk spacetime in a quantity which is accessible within an observer patch. Along the way, we clarify the role of counterterms, constraints, and operator redundancy in the Wilsonian holographic RG prescription, reproducing the expected behavior of the trace of the stress-energy tensor in the dual for both AdS_D and dS_D. We further show that metastability of the gravity-side potential energy corresponds to a nonperturbatively small imaginary contribution to the Wilsonian action of pure de Sitter, a result consistent with the need for additional degrees of freedom in the holographic description of its ultimate decay. (Comment: 28 pages; v2: minor modifications, published version in JHEP)

    Numerical Relativity As A Tool For Computational Astrophysics

    The astrophysics of compact objects, which requires Einstein's theory of general relativity for understanding phenomena such as black holes and neutron stars, is attracting increasing attention. In general relativity, gravity is governed by an extremely complex set of coupled, nonlinear, hyperbolic-elliptic partial differential equations. The largest parallel supercomputers are finally approaching the speed and memory required to solve the complete set of Einstein's equations for the first time since they were written over 80 years ago, allowing one to attempt full 3D simulations of such exciting events as colliding black holes and neutron stars. In this paper we review the computational effort in this direction, and discuss a new 3D multi-purpose parallel code called "Cactus" for general relativistic astrophysics. Directions for further work are indicated where appropriate. (Comment: Review for JCA)

    Identifying reusable functions in code using specification driven techniques

    The work described in this thesis addresses the field of software reuse. Software reuse is widely considered as a way to increase the productivity and improve the quality and reliability of new software systems. Identifying, extracting and reengineering software components which implement abstractions within existing systems is a promising cost-effective way to create reusable assets. Such a process is referred to as reuse reengineering. A reference paradigm has been defined within the RE^2 project which decomposes a reuse reengineering process in five sequential phases. In particular, the first phase of the reference paradigm, called the Candidature phase, is concerned with the analysis of source code for the identification of software components implementing abstractions and which are therefore candidates to be reused. Different candidature criteria exist for the identification of reuse-candidate software components. They can be classified in structural methods (based on structural properties of the software) and specification driven methods (that search for software components implementing a given specification). In this thesis a new specification driven candidature criterion for the identification and the extraction of code fragments implementing functional abstractions is presented. The method is driven by a formal specification of the function to be isolated (given in terms of a precondition and a postcondition) and is based on the theoretical frameworks of program slicing and symbolic execution. Symbolic execution and theorem proving techniques are used to map the specification of the functional abstractions onto a slicing criterion. Once the slicing criterion has been identified, the slice is isolated using algorithms based on dependence graphs. The method has been specialised for programs written in the C language. Both symbolic execution and program slicing are performed by exploiting the Combined C Graph (CCG), a fine-grained dependence-based program representation that can be used for several software maintenance tasks.
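
    As a simplified illustration of the candidature idea (the code, the specification, and the function names are invented for this sketch; the thesis itself relies on symbolic execution and the Combined C Graph rather than a manual rewrite): a precondition/postcondition pair describing the desired functional abstraction is mapped onto a slicing criterion, and the statements in the resulting slice form the reusable candidate function.

```c
#include <assert.h>

/* Existing code: computes several things at once. */
int stats(const int a[], int n, int *sum_out) {
    int max = a[0], sum = 0;
    for (int i = 0; i < n; i++) {
        if (a[i] > max) max = a[i];   /* in the slice for `max`          */
        sum += a[i];                  /* not in that slice               */
    }
    *sum_out = sum;
    return max;
}

/* Candidate reusable function isolated for the specification
 *   pre:  n > 0
 *   post: result == maximum of a[0..n-1]
 * The specification is mapped onto a slicing criterion (the value of
 * `max` at the return), and the slice w.r.t. that criterion is kept.  */
int maximum(const int a[], int n) {
    int max = a[0];
    for (int i = 0; i < n; i++)
        if (a[i] > max) max = a[i];
    return max;
}

int main(void) {
    int v[] = { 3, 7, 2 }, sum;
    assert(stats(v, 3, &sum) == maximum(v, 3));
    return 0;
}
```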

    Nonperturbative Quantum Gravity

    Asymptotic safety describes a scenario in which general relativity can be quantized as a conventional field theory, despite being nonrenormalizable when expanding it around a fixed background geometry. It is formulated in the framework of the Wilsonian renormalization group and relies crucially on the existence of an ultraviolet fixed point, for which evidence has been found using renormalization group equations in the continuum. "Causal Dynamical Triangulations" (CDT) is a concrete research program to obtain a nonperturbative quantum field theory of gravity via a lattice regularization, and represented as a sum over spacetime histories. In the Wilsonian spirit one can use this formulation to try to locate fixed points of the lattice theory and thereby provide independent, nonperturbative evidence for the existence of a UV fixed point. We describe the formalism of CDT, its phase diagram, possible fixed points and the "quantum geometries" which emerge in the different phases. We also argue that the formalism may be able to describe a more general class of Hořava-Lifshitz gravitational models. (Comment: Review, 146 pages, many figures)