647 research outputs found

    Automated Fixing of Programs with Contracts

    This paper describes AutoFix, an automatic debugging technique that can fix faults in general-purpose software. To provide high-quality fix suggestions and to enable automation of the whole debugging process, AutoFix relies on the presence of simple specification elements in the form of contracts (such as pre- and postconditions). Using contracts enhances the precision of dynamic analysis techniques for fault detection and localization, and for validating fixes. The only required user input to the AutoFix supporting tool is then a faulty program annotated with contracts; the tool produces a collection of validated fixes for the fault, ranked according to an estimate of their suitability. In an extensive experimental evaluation, we applied AutoFix to over 200 faults in four code bases of different maturity and quality (of implementation and of contracts). AutoFix successfully fixed 42% of the faults, producing, in the majority of cases, corrections of quality comparable to those competent programmers would write; the computational resources used were modest, with an average time per fix below 20 minutes on commodity hardware. These figures compare favorably to the state of the art in automated program fixing, and demonstrate that the AutoFix approach can successfully be applied to reduce the debugging burden in real-world scenarios.
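
    AutoFix itself targets Eiffel programs whose contracts are part of the language; purely to illustrate the workflow the abstract describes, the following Python sketch (the routine, the bug, and the candidate fix are all invented for this example) shows how pre- and postconditions expressed as assertions expose a fault, and how a candidate fix is accepted only if it passes the same contract-checked tests.

```python
# Minimal sketch, not AutoFix itself: contracts as assertions used to detect a
# fault and to validate a candidate fix, in the spirit of the approach above.

def withdraw(balance, amount):
    """Faulty routine: forgets to reject overdrafts."""
    assert amount >= 0, "precondition: amount must be non-negative"
    new_balance = balance - amount          # bug: no overdraft check
    assert new_balance >= 0, "postcondition: balance stays non-negative"
    return new_balance

def withdraw_fixed(balance, amount):
    """Candidate fix: guard the overdraft case before updating the balance."""
    assert amount >= 0, "precondition: amount must be non-negative"
    if amount > balance:                    # suggested repair
        raise ValueError("insufficient funds")
    new_balance = balance - amount
    assert new_balance >= 0, "postcondition: balance stays non-negative"
    return new_balance

def validate(candidate, test_inputs):
    """Keep a candidate fix only if no contract is violated on any test."""
    for balance, amount in test_inputs:
        try:
            candidate(balance, amount)
        except AssertionError:
            return False                    # contract violation: reject candidate
        except ValueError:
            pass                            # defensive rejection is acceptable
    return True

tests = [(100, 30), (10, 50), (0, 0)]
print(validate(withdraw, tests))        # False: postcondition fails for (10, 50)
print(validate(withdraw_fixed, tests))  # True: all contract checks pass
```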

    A quality model for the Ada standard container library

    The existence of a standard container library has been widely recognized as a key feature for improving the quality and effectiveness of Ada programming. In this paper, we aim to provide a quality model that makes explicit the quality features (those concerning functionality, suitability, etc.) that determine the form such a library might take. Quality features are arranged hierarchically according to the ISO/IEC quality standard. We tailor this standard to the specific context of container libraries by identifying their observable attributes and establishing some tradeoffs among them. Afterwards, we apply the resulting model to a pair of existing container libraries. The main contribution of our proposal is that the resulting quality model provides a structured framework for (1) discussing and evaluating the capabilities that the prospective Ada Standard Container Library might offer, and (2) analyzing the consequences of the decisions taken during its design.
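
    The paper's actual model is only summarized in the abstract; as a rough illustration of the kind of hierarchy it describes (all feature names below are hypothetical placeholders, not the paper's model), the following Python sketch arranges quality characteristics, subcharacteristics, and observable attributes in an ISO/IEC-style tree.

```python
# Rough sketch (hypothetical feature names): a hierarchical quality model with
# measurable attributes at the leaves, in the style the abstract describes.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QualityNode:
    name: str
    children: List["QualityNode"] = field(default_factory=list)
    measure: Optional[str] = None        # measurement procedure, leaves only

model = QualityNode("Container library quality", [
    QualityNode("Functionality", [
        QualityNode("Suitability", [
            QualityNode("Operation coverage",
                        measure="operations offered per container kind"),
        ]),
    ]),
    QualityNode("Efficiency", [
        QualityNode("Time behaviour", [
            QualityNode("Asymptotic cost",
                        measure="documented big-O bound per operation"),
        ]),
    ]),
])

def leaves(node):
    """Walk the hierarchy and yield the observable attributes at the leaves."""
    if not node.children:
        yield node
    for child in node.children:
        yield from leaves(child)

for leaf in leaves(model):
    print(f"{leaf.name}: {leaf.measure}")
```

    Making the observable attributes explicit in this way is what allows the tradeoffs the abstract mentions (for instance, richer interfaces versus predictable time behaviour) to be discussed against concrete measurements.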

    An Investigation Into the Generality of a Graphical Representation of Program Code for Source to Source Translation

    This thesis addresses the problem of defining a source-to-source translation system for reusable software components. It describes the development of an interoperable language for writing software components, and presents a system to translate components written in the interoperable language to a set of compatible target languages. The common features in a set of popular programming languages are analyzed to inform the design of the interoperable language. An evaluation is performed by using the source-to-source translator to convert two well-known open source Java libraries to C++ and Python, and the accuracy and performance of the resulting translations are assessed.
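
    The thesis's interoperable language and translator are not shown in the abstract; as a toy illustration of the general idea (the AST node and emitters below are invented, not the thesis's system), the following Python sketch renders one tiny intermediate representation into two target languages.

```python
# Toy sketch (hypothetical AST and emitters): one intermediate representation of
# a component rendered to two targets, in the spirit of source-to-source translation.
from dataclasses import dataclass

@dataclass
class Function:
    name: str
    params: list          # [(name, abstract_type)]
    body_expr: str        # a single return expression, for brevity

TYPE_MAP = {"int": {"cpp": "int", "py": "int"}}

def emit_cpp(fn: Function) -> str:
    params = ", ".join(f"{TYPE_MAP[t]['cpp']} {n}" for n, t in fn.params)
    return f"int {fn.name}({params}) {{ return {fn.body_expr}; }}"

def emit_python(fn: Function) -> str:
    params = ", ".join(n for n, _ in fn.params)
    return f"def {fn.name}({params}):\n    return {fn.body_expr}"

add = Function("add", [("a", "int"), ("b", "int")], "a + b")
print(emit_cpp(add))     # int add(int a, int b) { return a + b; }
print(emit_python(add))  # def add(a, b): return a + b
```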

    Notes on object-orientation


    What good are strong specifications?

    Experience with lightweight formal methods suggests that programmers are willing to write specifications if they bring tangible benefits to their usual development activities. This paper considers stronger specifications and studies whether they can be deployed as an incremental practice that brings additional benefits without being unacceptably expensive. We introduce a methodology that extends Design by Contract to write strong specifications of functional properties in the form of preconditions, postconditions, and invariants. The methodology aims at being palatable to developers who are not fluent in formal techniques but are comfortable with writing simple specifications. We evaluate the cost and the benefits of using strong specifications by applying the methodology to testing data structure implementations written in Eiffel and C#. In our extensive experiments, testing against strong specifications detects twice as many bugs as standard contracts, with a reasonable overhead in terms of annotation burden and runtime performance while testing. In the wide spectrum of formal techniques for software quality, testing against strong specifications lies in a "sweet spot" with a favorable benefit-to-effort ratio.
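
    The paper's methodology targets Eiffel and C# with model-based contracts; as a rough Python analogue (the class and checks below are hypothetical, not the paper's artifacts), this sketch contrasts a "standard" contract that only constrains the element count with a "strong" postcondition that pins down the full functional effect against an abstract model.

```python
# Rough analogue (hypothetical names): standard contract vs. strong specification.
# The abstract model is a plain list snapshot taken before the operation.

class Stack:
    def __init__(self):
        self._items = []

    def push(self, x):
        old_model = list(self._items)          # abstract model before the call
        self._items.append(x)
        # Standard contract: only the count is constrained.
        assert len(self._items) == len(old_model) + 1
        # Strong specification: the whole new state is pinned down.
        assert self._items == old_model + [x]

    def pop(self):
        assert self._items, "precondition: stack must not be empty"
        old_model = list(self._items)
        top = self._items.pop()
        # Strong postcondition: result and remaining state are both specified.
        assert top == old_model[-1]
        assert self._items == old_model[:-1]
        return top

s = Stack()
s.push(1); s.push(2)
assert s.pop() == 2
```

    A bug that preserves the count but corrupts the order would pass the standard check and be caught only by the strong postcondition; this extra detection power is the kind of effect the paper's experiments quantify.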

    Tool Support for Design by Contract


    Generating a Catalog of Unanticipated Schemas in Class Hierarchies using Formal Concept Analysis

    Context: Inheritance is the cornerstone of object-oriented development, supporting conceptual modeling, subtype polymorphism and software reuse. But inheritance can be used in subtle ways that make complex systems hard to understand and extend, due to the presence of implicit dependencies in the inheritance hierarchy. Objective: Although these dependencies often specify well-known schemas (i.e., recurrent design or coding patterns, such as hook and template methods), new unanticipated dependency schemas arise in practice, and can consequently be hard to recognize and detect. Thus, a developer making changes or extensions to an object-oriented system needs to understand these implicit contracts defined by the dependencies between a class and its subclasses, or risk that seemingly innocuous changes break them. Method: To tackle this problem, we have developed an approach based on Formal Concept Analysis. Our FoCARE methodology (Formal Concept Analysis based Reverse Engineering) identifies undocumented hierarchical dependencies in a hierarchy by taking into account the existing structure and behavior of classes and subclasses. Results: We validate our approach by applying it to a large and non-trivial case study, yielding a catalog of Hierarchy Schemas, each one composed of a set of dependencies over methods and attributes in a class hierarchy. We show how the discovered dependency schemas can be used not only to identify good design practices, but also to expose bad smells in design, thereby helping developers in initial reengineering phases to develop a first mental model of a system. Although some of the identified schemas are already documented in existing literature, with our approach based on Formal Concept Analysis (FCA) we are also able to identify previously unidentified schemas.
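
    FoCARE's actual contexts relate classes and subclasses through structural and behavioral properties; as a rough, self-contained illustration of the underlying FCA step (the toy class-by-property table below is invented, not from the paper), the following Python sketch enumerates the formal concepts of a small context.

```python
# Minimal sketch of Formal Concept Analysis on a toy context: objects are classes,
# attributes are simple properties such as "defines add". A formal concept is a
# pair (extent, intent) closed under the two derivation operators.
from itertools import combinations

context = {
    "Collection":   {"defines add", "calls add"},
    "Stack":        {"defines add", "defines pop"},
    "BoundedStack": {"defines add", "defines pop", "defines is_full"},
}

objects = list(context)
all_attrs = set().union(*context.values())

def common_attrs(objs):
    """Attributes shared by every object in objs (derivation of an extent)."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(all_attrs)

def objects_with(attrs):
    """Objects having every attribute in attrs (derivation of an intent)."""
    return {o for o in objects if attrs <= context[o]}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        intent = frozenset(common_attrs(set(objs)))
        extent = frozenset(objects_with(intent))
        concepts.add((extent, intent))   # closure makes this a valid concept

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "share", sorted(intent))
```

    Each printed concept groups classes that share exactly the same set of properties; in FoCARE these groupings are the raw material from which recurring hierarchy schemas are identified.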

    Integrating measurement techniques in an Object-Oriented systems design process.

    The theme of this thesis is the assessment of quality in class hierarchies. In particular, the notion of inheritance and the mechanism of redefinition are reviewed from a modelling perspective. It is shown that, in Object-Oriented languages, controversial uses of inheritance can be implemented and are a subject of debate, as they contradict the essence of inheritance. The discovery of an unexpected use of the method redefinition mechanism confirmed that potential design inconsistencies occur more often than expected in class hierarchies. To address such problems, design heuristics and measurement techniques are investigated as the main instruments for evaluating "goodness" or "badness" in class hierarchies. Their benefits are demonstrated within the design process.

    After the identification of an obscure use of the method redefinition mechanism, referred to as the multiple descendant redefinition (MDR) problem, a set of metrics based on the GQM/MEDEA [Bri&al94] model is proposed. To enable a measurement programme to take place within a design process, the necessary design considerations are detailed and the technical issues involved in the measurement process are presented. Both aspects form a methodological approach for class hierarchy assessment and especially concentrate on the use of the redefinition mechanism.

    As one of the main criticisms of measurement science is the lack of good design feedback, the analysis and interpretation phase of the metrics results is seen as a crucial phase for inferring meaningful conclusions. A novel data interpretation framework is proposed which includes the use of various graphical data representations and detection techniques. Also, the notion of redefinition profiles suggested a more generic approach whereby a pattern profile can be found for a metric. The benefits of the data interpretation method for the extraction of meaningful design feedback from the metrics results are discussed.

    The implementation of a metrics collector tool enabled a set of experiments to be carried out on the Smalltalk class hierarchy. Surprisingly, the analysis of metrics results showed that method redefinition is heavily used compared to method extension. This suggested the existence of potential design inconsistencies in the class hierarchy and permitted the discovery of the MDR problem on many occasions. In addition, a set of experiments demonstrates the benefits of example graphical representations together with detection techniques such as alarmers. In the light of facilitating the interpretation phase, the need for additional supporting tools is highlighted.

    This thesis illustrates the potential benefits of integrating measurement techniques within an Object-Oriented design process. Given the identification of the MDR problem, it is believed that the redefinition metrics are strong and simple candidates for detecting complex design problems occurring within a class hierarchy. An integrated design assessment model is proposed which logically fits into an incremental design development process. Benefits and disadvantages of the approach are discussed together with future work.
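
    The redefinition metrics themselves are only named in the abstract; as a rough, self-contained illustration of the kind of measurement involved (the classes and the super()-based heuristic below are invented, not the thesis's instrument), this Python sketch classifies overriding methods into outright redefinitions versus extensions that still call the parent implementation.

```python
# Rough sketch (hypothetical classes and heuristic): count, per subclass, how many
# inherited methods are redefined outright versus extended via a call to super().
import inspect

class Shape:
    def area(self): return 0
    def describe(self): return "shape"

class Circle(Shape):
    def __init__(self, r): self.r = r
    def area(self):                      # redefinition: parent version not reused
        return 3.14159 * self.r ** 2
    def describe(self):                  # extension: delegates to the parent
        return "circle, a " + super().describe()

def redefinition_profile(cls):
    parent = cls.__mro__[1]
    redefined, extended = [], []
    for name, member in vars(cls).items():
        if name.startswith("__") or not callable(member) or not hasattr(parent, name):
            continue                     # only methods overriding an inherited one
        source = inspect.getsource(member)
        (extended if "super()" in source else redefined).append(name)
    return {"redefined": redefined, "extended": extended}

print(redefinition_profile(Circle))
# {'redefined': ['area'], 'extended': ['describe']}
```

    A high ratio of redefinitions to extensions across a hierarchy is, in the spirit of the thesis, a simple signal that implicit design contracts between classes and subclasses may be broken.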