
    Convex polyhedral abstractions, specialisation and property-based predicate splitting in Horn clause verification

    We present an approach to constrained Horn clause (CHC) verification combining three techniques: abstract interpretation over a domain of convex polyhedra, specialisation of the constraints in CHCs using abstract interpretation of query-answer transformed clauses, and refinement by splitting predicates. The purpose of the work is to investigate how analysis and transformation tools developed for constraint logic programs (CLP) can be applied to the Horn clause verification problem. Abstract interpretation over convex polyhedra is capable of deriving sophisticated invariants and when used in conjunction with specialisation for propagating constraints it can frequently solve challenging verification problems. This is a contribution in itself, but refinement is needed when it fails, and the question of how to refine convex polyhedral analyses has not been studied much. We present a refinement technique based on interpolants derived from a counterexample trace; these are used to drive a property-based specialisation that splits predicates, leading in turn to more precise convex polyhedral analyses. The process of specialisation, analysis and splitting can be repeated, in a manner similar to the CEGAR and iterative specialisation approaches. Comment: In Proceedings HCVS 2014, arXiv:1412.082
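
    A minimal Python sketch of the analyse-and-refine loop this abstract describes, for illustration only. The helper functions (specialise_query_answer, polyhedral_analysis, proves_safety, counterexample_trace, is_feasible, derive_interpolants, split_predicates) are hypothetical placeholders standing in for the paper's components, not its actual tool.

    ```python
    def verify(chcs, max_iterations=10):
        """Try to prove safety of a set of constrained Horn clauses (sketch)."""
        for _ in range(max_iterations):
            # 1. Specialise the clauses by propagating constraints through a
            #    query-answer transformation (hypothetical placeholder).
            specialised = specialise_query_answer(chcs)

            # 2. Abstract interpretation over convex polyhedra produces an
            #    over-approximating invariant for every predicate.
            invariants = polyhedral_analysis(specialised)
            if proves_safety(invariants, specialised):
                return "safe", invariants

            # 3. Otherwise extract a counterexample trace; if it is feasible
            #    the clauses really are unsafe.
            trace = counterexample_trace(invariants, specialised)
            if is_feasible(trace):
                return "unsafe", trace

            # 4. Derive interpolants from the spurious trace and use them to
            #    split predicates, refining the next polyhedral analysis
            #    (a CEGAR-like iteration).
            chcs = split_predicates(specialised, derive_interpolants(trace))

        return "unknown", None
    ```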

    Poly-controlled partial evaluation and its application to resource-aware program specialization

    Partial evaluation is an automatic program optimisation technique. Its main goal is to specialise a program with respect to part of its input data, known as the static data. The quality of the code generated by partial evaluation of logic programs depends to a great extent on the control strategy employed. Unfortunately, we are still far from having a control strategy sophisticated enough to behave optimally for every program. The main contribution of this thesis is the development of poly-controlled partial evaluation, a novel framework for the partial evaluation of logic programs which is poly-controlled in the sense that it can take into account sets of global and local control rules, instead of a single predetermined combination (as in traditional partial evaluation). This framework is more flexible than existing approaches, since it allows assigning different local and global control rules to different call patterns. In this way, it is possible to obtain specialised programs that cannot be generated by traditional partial evaluation. As a consequence, the poly-controlled partial evaluation framework can produce sets of specialised programs rather than a single program. Through self-tuning techniques, the approach can be made fully automatic; these techniques make it possible to measure the quality of the different specialised programs obtained. The framework is resource aware, in the sense that each of the solutions obtained through poly-controlled partial evaluation is assessed using fitness functions, which can take into account factors such as the size of the specialised programs or the amount of memory they consume, in addition to the speed of the specialised program, the factor usually considered in other partial evaluation frameworks. This poly-controlled partial evaluation framework has been implemented in the CiaoPP system and evaluated on numerous benchmark programs. The experimental results show that our proposal often obtains better specialisations than those generated by traditional partial evaluation, especially when the specialisation is resource aware. Another main contribution of this thesis is a unified view of the problem of eliminating superfluous polyvariance in partial evaluation and in abstract multiple specialisation, through a minimisation step that groups equivalent versions of predicates. This step can be applied when specialising any Prolog program, including programs containing calls to builtins or external predicates. In addition, we offer the possibility of grouping versions that are not strictly equivalent, in order to obtain smaller programs.
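
    The search that poly-controlled partial evaluation performs can be illustrated with a small Python sketch. It is a simplification under assumed interfaces: the real framework can assign different control rules to different call patterns, whereas this sketch only scores whole-program rule combinations, and `specialise` and `fitness` stand in for any partial evaluator and any resource-aware fitness function (this is not CiaoPP's API).

    ```python
    from itertools import product

    def poly_controlled_pe(program, entry_goals, local_rules, global_rules,
                           specialise, fitness):
        """Pick the best specialisation over all control-rule combinations.

        `specialise(program, goals, local, global_)` is any conventional
        partial evaluator; `fitness` scores a residual program (higher is
        better) and may weigh speed, code size and memory use.
        """
        best, best_score = program, fitness(program)
        for local, global_ in product(local_rules, global_rules):
            # Traditional PE fixes one (local, global) rule pair for the
            # whole run; here every combination yields its own candidate.
            candidate = specialise(program, entry_goals, local, global_)
            score = fitness(candidate)
            if score > best_score:
                best, best_score = candidate, score
        return best
    ```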

    Doctor of Philosophy in Computer Science

    Control-flow analysis of higher-order languages is a difficult problem, yet an important one. It aids in enabling optimizations, improved reliability, and improved security of programs written in these languages. This dissertation explores three techniques to improve the precision and speed of a small-step abstract interpreter: using a priority work list, environment unrolling, and strong function call. In an abstract interpreter, execution is no longer deterministic; choices can be made in how the abstract state space is explored, and these choices involve trade-offs. A priority queue is one option. There are also many ways to abstract the concrete interpreter. Environment unrolling takes a slightly different approach than usual, holding off abstraction in order to gain precision, which can also lead to a faster analysis. Strong function call is an approach to cleaning up some of the imprecision introduced at function calls when abstractly interpreting a program. An alternative approach to building an abstract interpreter for static analysis is constraint solving, for which techniques have been developed over the last several decades. This dissertation maps these constraints to three different problems, allowing control-flow analysis of higher-order languages to be solved with tools that are already mature and well developed: pointer analysis of first-order languages, SAT, and linear-algebra operations. These mappings allow for fast and parallel implementations of control-flow analysis of higher-order languages. A recent development in the field of static analysis has been pushdown control-flow analysis, which is able to precisely match calls and returns, a weakness of the existing techniques. This dissertation also provides an encoding of pushdown control-flow analysis as linear-algebra operations. In the process, it demonstrates that, under certain conditions (monovariance and flow insensitivity), a pushdown control-flow analysis is, in terms of precision, equivalent to a direct-style constraint-based formulation.
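
    The priority work list mentioned above can be illustrated with a short Python sketch of a worklist-driven abstract interpreter. The state representation, the abstract `step` function, and the `priority` heuristic are assumptions supplied by the caller, and abstract states are assumed hashable; this is not the dissertation's implementation.

    ```python
    import heapq

    def analyse(initial_state, step, priority):
        """Explore the abstract state space, lowest priority value first."""
        seen = set()
        counter = 0  # tie-breaker so heapq never has to compare states
        worklist = [(priority(initial_state), counter, initial_state)]
        while worklist:
            _, _, state = heapq.heappop(worklist)
            if state in seen:
                continue
            seen.add(state)
            # `step` returns the abstract successors of a state; the abstract
            # interpreter is non-deterministic, so there may be several.
            for succ in step(state):
                if succ not in seen:
                    counter += 1
                    heapq.heappush(worklist, (priority(succ), counter, succ))
        return seen  # the set of reachable abstract states
    ```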

    Building a Typed Scripting Language

    Since the 1990s, scripting languages (e.g. Python, Ruby, JavaScript, and many others) have gained widespread popularity. Features such as ad-hoc data manipulation, dynamic structural typing, and terse syntax permit rapid engineering and improve developer productivity. Unfortunately, programs written in scripting languages execute slower and are less scalable than those written in traditional languages (such as C or Java) due to the challenge of statically analyzing scripting languages' semantics. Although various research projects have made progress on this front, corner cases in the semantics of existing scripting languages continue to defy static analysis, and software engineers must generally still choose between program performance and programmer performance when selecting a language. We address that dichotomy in this dissertation by designing a scripting language with the intent of statically analyzing it. We select a set of core primitives in which common language features such as object-orientation and case analysis can be encoded, and give a sound and decidable type inference system for it. Our type theory is based on subtype constraint systems but is also closely related to abstract interpretation; we use this connection to guide development of the type system and to employ a novel type soundness proof strategy based on simulation. At the heart of our approach is a type-indexed record we call the onion, which supports asymmetric concatenation and dispatch; we use onions to formally encode a variety of features, including records, operator overloading, objects, and mixins. An optimistic call-site polymorphism model defined herein captures the ad-hoc, case-analysis-based reasoning often used in scripting languages. Although the language in this dissertation uses a particular set of core primitives, the strategy we use to design it is general: we demonstrate a simple, formulaic process for adding features such as integers and state.
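
    A toy Python sketch of the onion's observable behaviour, assuming one plausible reading of asymmetric concatenation (here the right operand wins on overlapping labels; the bias chosen in the dissertation may differ). It only illustrates the data structure, not the dissertation's type theory or formal semantics.

    ```python
    class Onion:
        """A label-indexed record with asymmetric concatenation and dispatch."""

        def __init__(self, **components):
            self._components = dict(components)

        def __and__(self, other):
            # Asymmetric concatenation: on overlapping labels the right
            # operand shadows the left (an assumption for this sketch).
            return Onion(**{**self._components, **other._components})

        def dispatch(self, label, *args):
            # Select the component bound to `label` and apply it.
            return self._components[label](*args)

    # Mixin-style composition: the second onion overrides `greet`.
    base = Onion(greet=lambda name: "hello " + name)
    loud = Onion(greet=lambda name: "HELLO " + name.upper() + "!")
    print((base & loud).dispatch("greet", "world"))  # HELLO WORLD!
    ```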

    Problem space of modern society: philosophical-communicative and pedagogical interpretations. Part II

    This collective monograph describes the philosophical bases for defining communicative competence and the pedagogical conditions for forming communication skills. The authors of the individual chapters chose the perspective on the topic that they considered most important and most specific to their field of study, using the methods of logical and semantic analysis of concepts, reflection, textual reconstruction, and comparative analysis. The theoretical and applied problems of modern society are investigated in the context of philosophical, communicative, and pedagogical interpretations.

    Programming Languages and Systems

    This open access book constitutes the proceedings of the 29th European Symposium on Programming, ESOP 2020, which was planned to take place in Dublin, Ireland, in April 2020, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2020. The actual ETAPS 2020 meeting was postponed due to the coronavirus pandemic. The papers deal with fundamental issues in the specification, design, analysis, and implementation of programming languages and systems.

    Proceedings of Monterey Workshop 2001: Engineering Automation for Software Intensive System Integration

    The 2001 Monterey Workshop on Engineering Automation for Software Intensive System Integration was sponsored by the Office of Naval Research, the Air Force Office of Scientific Research, the Army Research Office, and the Defense Advanced Research Projects Agency. It is our pleasure to thank the workshop advisory board and sponsors for their vision of a principled engineering solution for software and for their tireless, multi-year effort in supporting the series of workshops that brings everyone together. This workshop was the 8th in a series of international workshops and was held at the Monterey Beach Hotel, Monterey, California, during June 18-22, 2001. The general theme of the workshop series has been to present and discuss research that aims at increasing the practical impact of formal methods for software and systems engineering. The particular focus of this workshop was "Engineering Automation for Software Intensive System Integration". Previous workshops focused on issues including "Real-time & Concurrent Systems", "Software Merging and Slicing", "Software Evolution", "Software Architecture", "Requirements Targeting Software", and "Modeling Software System Structures in a fastly moving scenario". Approved for public release; distribution unlimited.

    Forecast for the domestic economy and national competitiveness

    The globalization of markets, the fourth industrial revolution (Industry 4.0), and the post-pandemic business paradigm have affected not only domestic enterprises but enterprises around the world. Forecasts regarding GDP growth, unemployment rates, and competitiveness are provided and discussed by various organizations (the World Bank, the European Central Bank, the World Economic Forum, etc.). Additionally, a large number of quarterly reports on various metrics are published for almost every country. In this paper, the existing data and reports published by some of these organizations are analyzed. The goal is to provide a concise and informative overview of potential domestic macroeconomic trends, national competitiveness, and the competitiveness of domestic enterprises. The overview is structured around a discussion of existing and potential economic forecasts based on these reports. The paper provides a solid basis for future research in the domain of the domestic economy and national competitiveness. It also provides suggestions and guidelines for improving the competitiveness of domestic enterprises.