13 research outputs found

    Foundations of Software Science and Computation Structures

    This open access book constitutes the proceedings of the 24th International Conference on Foundations of Software Science and Computation Structures, FOSSACS 2021, which was held from March 27 to April 1, 2021, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2021. The conference was planned to take place in Luxembourg but changed to an online format due to the COVID-19 pandemic. The 28 regular papers presented in this volume were carefully reviewed and selected from 88 submissions. They deal with research on theories and methods to support the analysis, integration, synthesis, transformation, and verification of programs and software systems.

    Proof-theoretic Semantics for Intuitionistic Multiplicative Linear Logic

    This work is the first exploration of proof-theoretic semantics for a substructural logic. It focuses on the base-extension semantics (B-eS) for intuitionistic multiplicative linear logic (IMLL). The starting point is a review of Sandqvist’s B-eS for intuitionistic propositional logic (IPL), for which we propose an alternative treatment of conjunction that takes the form of the generalized elimination rule for the connective. The resulting semantics is shown to be sound and complete. This motivates our main contribution, a B-eS for IMLL, in which the definitions of the logical constants all take the form of their elimination rule and for which soundness and completeness are established.
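
    To give a sense of the shape of such a semantics, the following clauses are a schematic sketch in the style of Sandqvist’s B-eS for IPL (our own rendering, not taken verbatim from the paper); here B and C range over bases (sets of atomic rules), p over atoms, and the conjunction clause is shown in the generalized-elimination form discussed above.

        % Schematic base-extension-semantics clauses (illustrative sketch only, not the paper's definitions)
        \begin{align*}
          \Vdash_{B} p
            &\iff p \text{ is derivable in the base } B && (p \text{ atomic})\\
          \Vdash_{B} \varphi \supset \psi
            &\iff \varphi \Vdash_{B} \psi\\
          \Vdash_{B} \varphi \wedge \psi
            &\iff \text{for all } C \supseteq B \text{ and atoms } p:\;
                  \varphi, \psi \Vdash_{C} p \text{ implies } \Vdash_{C} p\\
          \Gamma \Vdash_{B} \varphi
            &\iff \text{for all } C \supseteq B:\;
                  \Vdash_{C} \gamma \text{ for all } \gamma \in \Gamma
                  \text{ implies } \Vdash_{C} \varphi
        \end{align*}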

    Pseudo-contractions as Gentle Repairs

    Updating a knowledge base to remove an unwanted consequence is a challenging task. Some of the original sentences must be either deleted or weakened in such a way that the sentence to be removed is no longer entailed by the resulting set. On the other hand, it is desirable that the existing knowledge be preserved as much as possible, minimising the loss of information. Several approaches to this problem can be found in the literature. In particular, when the knowledge is represented by an ontology, two different families of frameworks have been developed over the past decades with numerous ideas in common but with little interaction between the communities: applications of AGM-like Belief Change and justification-based Ontology Repair. In this paper, we investigate the relationship between pseudo-contraction operations and gentle repairs. Both aim to avoid the complete deletion of sentences when replacing them with weaker versions is enough to prevent the entailment of the unwanted formula. We show the correspondence between concepts on both sides and investigate under which conditions they are equivalent. Furthermore, we propose a unified notation for the two approaches, which might contribute to the integration of the two areas.
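
    As a purely illustrative toy (the representation and names below are ours, not the paper’s), the following Python sketch shows the basic move: instead of deleting a sentence outright, a repair weakens it to a logically weaker one that still blocks the unwanted consequence, checked here by brute-force truth-table entailment.

        from itertools import product

        ATOMS = ["a", "b", "c"]

        def entails(kb, goal):
            """Brute-force propositional entailment: kb |= goal iff goal holds
            in every truth assignment that satisfies all sentences in kb."""
            for values in product([False, True], repeat=len(ATOMS)):
                v = dict(zip(ATOMS, values))
                if all(s(v) for s in kb.values()) and not goal(v):
                    return False
            return True

        # Toy knowledge base {a, a -> b}; the unwanted consequence is b.
        kb = {
            "a":    lambda v: v["a"],
            "a->b": lambda v: (not v["a"]) or v["b"],
        }
        unwanted = lambda v: v["b"]
        assert entails(kb, unwanted)

        # A classical contraction would delete "a" or "a->b" entirely.
        # A gentler repair weakens "a->b" to "(a and c) -> b": b is no longer
        # entailed, yet more of the original information is preserved.
        kb["a->b"] = lambda v: (not (v["a"] and v["c"])) or v["b"]
        assert not entails(kb, unwanted)       # b no longer follows
        assert entails(kb, lambda v: v["a"])   # a is still entailed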

    Proceedings of the 26th International Symposium on Theoretical Aspects of Computer Science (STACS'09)

    The Symposium on Theoretical Aspects of Computer Science (STACS) is held alternately in France and in Germany. The conference of February 26-28, 2009, held in Freiburg, is the 26th in this series. Previous meetings took place in Paris (1984), Saarbrücken (1985), Orsay (1986), Passau (1987), Bordeaux (1988), Paderborn (1989), Rouen (1990), Hamburg (1991), Cachan (1992), Würzburg (1993), Caen (1994), München (1995), Grenoble (1996), Lübeck (1997), Paris (1998), Trier (1999), Lille (2000), Dresden (2001), Antibes (2002), Berlin (2003), Montpellier (2004), Stuttgart (2005), Marseille (2006), Aachen (2007), and Bordeaux (2008). …

    Dichotomies in Constraint Satisfaction: Canonical Functions and Numeric CSPs

    Constraint satisfaction problems (CSPs) form a large class of decision problems that contains numerous classical problems like the satisfiability problem for propositional formulas and the graph colourability problem. Feder and Vardi [52] gave the following logical formalization of the class of CSPs: every finite relational structure A, the template, gives rise to the decision problem of determining whether there exists a homomorphism from a finite input structure B to A. In their seminal paper, Feder and Vardi recognised that CSPs had a particular status in the landscape of computational complexity: despite the generality of these problems, it seemed impossible to construct NP-intermediate problems à la Ladner [72] within this class. The authors thus conjectured that the class of CSPs satisfies a complexity dichotomy, i.e., that every CSP is solvable in polynomial time or is NP-complete. The Feder-Vardi dichotomy conjecture was the motivation of an intensive line of research over the last two decades. Some of the landmarks of this research are the confirmation of the conjecture for special classes of templates, e.g., for the class of undirected graphs [55], for the class of smooth digraphs [5], and for templates with at most three elements [43, 84]. Finally, after being open for 25 years, Bulatov [44] and Zhuk [87] independently proved that the conjecture of Feder and Vardi indeed holds. The success of the research program on the Feder-Vardi conjecture is based on the connection between constraint satisfaction problems and universal algebra. In their seminal paper, Feder and Vardi described polynomial-time algorithms for CSPs whose template satisfies some closure properties. These closure properties are properties of the polymorphism clone of the template, and similar properties were later used to provide tractability or hardness criteria [61, 62]. Shortly thereafter, Bulatov, Jeavons, and Krokhin [46] proved that the complexity of the CSP depends only on the equational properties of the polymorphism clone of the template. They proved that trivial equational properties imply hardness of the CSP, and conjectured that the CSP is solvable in polynomial time if the polymorphism clone of the template satisfies some nontrivial equation. It is this conjecture that Bulatov and Zhuk finally proved, relying on recent developments in universal algebra. As a by-product of the fact that the delineation between polynomial-time tractability and NP-hardness can be stated algebraically, we also obtain that the meta-problem for finite-domain CSPs is decidable. That is, there exists an algorithm that, given a finite relational structure A as input, decides the complexity of the CSP of A.
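
    To make the homomorphism formulation concrete, here is a small illustrative sketch (ours, not the thesis’s): a brute-force test of whether a homomorphism exists from a finite input digraph B to a fixed template digraph A. With A the complete graph K3 (both edge directions), this decision problem is exactly graph 3-colourability.

        from itertools import product

        def has_homomorphism(B, A):
            """Brute force: is there a map h from B's vertices to A's vertices
            such that every edge (u, v) of B is sent to an edge of A?"""
            b_vertices = sorted({u for e in B for u in e})
            a_vertices = sorted({u for e in A for u in e})
            edges_A = set(A)
            for image in product(a_vertices, repeat=len(b_vertices)):
                h = dict(zip(b_vertices, image))
                if all((h[u], h[v]) in edges_A for (u, v) in B):
                    return True
            return False

        # Template A = K3 with both edge directions: CSP(K3) is 3-colourability.
        K3 = [(x, y) for x in range(3) for y in range(3) if x != y]

        square = [(0, 1), (1, 2), (2, 3), (3, 0),
                  (1, 0), (2, 1), (3, 2), (0, 3)]
        print(has_homomorphism(square, K3))  # True: a 4-cycle is 3-colourable

        K4 = [(x, y) for x in range(4) for y in range(4) if x != y]
        print(has_homomorphism(K4, K3))      # False: K4 is not 3-colourable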

    Proceedings of the 1st International Conference on Algebras, Graphs and Ordered Sets (ALGOS 2020)

    Originating in arithmetic and logic, the theory of ordered sets is now a field of combinatorics that is intimately linked to graph theory, universal algebra and multiple-valued logic, and that has a wide range of classical applications such as formal calculus, classification, decision aid and social choice. This international conference, “Algebras, Graphs and Ordered Sets” (ALGOS), brings together specialists in the theory of graphs, relational structures and ordered sets, topics that are omnipresent in artificial intelligence and in knowledge discovery, and that have concrete applications in biomedical sciences, security, social networks and e-learning systems. One of the goals of this event is to provide a common ground for mathematicians and computer scientists to meet, to present their latest results, and to discuss original applications in related scientific fields. On this basis, we hope for fruitful exchanges that can motivate multidisciplinary projects. The first edition of ALgebras, Graphs and Ordered Sets (ALGOS 2020) has a particular motivation, namely the opportunity to honour Maurice Pouzet on his 75th birthday! For this reason, we have particularly welcomed submissions in areas related to Maurice’s many scientific interests:
    • Lattices and ordered sets
    • Combinatorics and graph theory
    • Set theory and theory of relations
    • Universal algebra and multiple-valued logic
    • Applications: formal calculus, knowledge discovery, biomedical sciences, decision aid and social choice, security, social networks, web semantics…

    Understanding Optimisation Processes with Biologically-Inspired Visualisations

    Evolutionary algorithms (EAs) constitute a branch of artificial intelligence used to evolve solutions to the optimisation problems that abound in industry and research. EAs often generate many solutions, and visualisation has been a primary strategy for displaying them, since visualisation is a well-evaluated medium across many domains for comprehending extensive data. Visualising solutions comes with inherent challenges resulting from high-dimensional phenomena and the large number of solutions to display. Recently, scholars have produced methods that mitigate some of these known issues when illustrating solutions. However, one key consideration is that displaying only the final subset of solutions (rather than the whole population) discards most of the information in the search, giving inadequate insight into the black-box EA. There is a clear knowledge gap and a need for methods that can visualise the whole population of solutions from an optimiser and overcome the high-dimensionality and scaling issues, in order to make the EA search process interpretable. Furthermore, the evolutionary computing community has called for explainability in evolutionary computing, which could take the form of visualisations that support EA comprehension, much as explainable artificial intelligence has supported artificial intelligence. In this thesis, we report novel visualisation methods that can be used to visualise large and high-dimensional optimiser populations with the aim of creating greater interpretability during a search. We consider the nascent intersection of visualisation and explainability in evolutionary computing. The high informativeness of a visualisation method from an early chapter of this work forms an effective platform for developing an explainability visualisation method, the population dynamics plot, which attempts to inject explainability into the inner workings of the search process. We further support the visualisation of populations by using machine learning to construct models that capture the characteristics of an EA search, and we develop intelligent visualisations that use artificial intelligence to enhance and support visualisation for a more informative view of the search process. The methods developed in this thesis are evaluated both quantitatively and qualitatively. We use multi-feature benchmark problems to show the methods’ ability to reveal specific problem characteristics such as disconnected fronts, local optima and bias, as well as to create a better understanding of the problem landscape and of the optimiser’s search when evaluating and comparing algorithm performance (we show the visualisation method to be more insightful than conventional metrics such as hypervolume alone). One of the most insightful methods developed in this thesis can produce a visualisation using less than 1% of the time and memory that existing methods need to visualise the same objective-space solutions. This allows for greater scalability and use in applications with short computation times, such as online visualisation.
    Building on an existing visualisation method from this thesis, we then develop an explainability method, apply it to a real-world problem, and evaluate it, showing the method to be highly effective at explaining the search via solutions in the objective space, solution lineage and solution variation operators, and at compactly comprehending, evaluating and communicating an optimiser’s search; we note, however, that the explainability properties are evaluated only against the author’s own ability and could be evaluated further in future work with a usability study. The work is then supported by the development of intelligent visualisation models that may allow one to predict solutions in optima (importantly, local optima) in unseen problems using a machine learning model. The results are effective, with some models able to predict and visualise solution optima with a balanced F1 score of 96%. The results of this thesis provide a suite of visualisations that aims to offer greater informativeness about the search, and greater scalability, than the existing literature. The work develops one of the first explainability methods aiming to create greater insight into the search space, solution lineage and reproductive operators. The work applies machine learning to potentially enhance EA understanding via visualisation. These models could also be used for a number of applications outside visualisation. Ultimately, the work provides novel methods for all EA stakeholders that aim to support the understanding, evaluation and communication of EA processes with visualisation.
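
    As a minimal, hypothetical sketch of the kind of data such population-level visualisations consume (the toy problem and function names below are ours, not methods from the thesis), the following Python snippet runs a small evolutionary algorithm and records every individual from every generation, so the full search history, rather than only the final front, is available for plotting or for training a surrogate model.

        import random

        def evaluate(x):
            """Toy two-objective problem (illustrative only): both objectives are minimised."""
            f1 = sum(xi ** 2 for xi in x)
            f2 = sum((xi - 1.0) ** 2 for xi in x)
            return f1, f2

        def toy_ea(pop_size=30, generations=50, dim=5, sigma=0.1, seed=0):
            """Minimal (mu + lambda)-style EA that records every individual it evaluates."""
            rng = random.Random(seed)
            pop = [[rng.uniform(-1.0, 2.0) for _ in range(dim)] for _ in range(pop_size)]
            history = []  # (generation, decision vector, objective values) per individual
            for gen in range(generations):
                scored = [(ind, evaluate(ind)) for ind in pop]
                history.extend((gen, ind, objs) for ind, objs in scored)
                # Keep the better half under the crude scalarisation f1 + f2, then
                # refill the population with Gaussian-mutated copies of random survivors.
                scored.sort(key=lambda t: t[1][0] + t[1][1])
                parents = [ind for ind, _ in scored[: pop_size // 2]]
                children = [[xi + rng.gauss(0.0, sigma) for xi in rng.choice(parents)]
                            for _ in range(pop_size - len(parents))]
                pop = parents + children
            return history

        history = toy_ea()
        # 'history' holds every individual ever evaluated, not just the final population;
        # population-level visualisations (or the machine-learning models discussed above)
        # would be built from exactly this kind of record.
        print(len(history), "individuals recorded across the whole search")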

    The French-Anglophone divide in lithic research: A plea for pluralism in Palaeolithic Archaeology

    In this provocative study, Shumon T. Hussain engages with the long-standing issue of French-Anglophone research conflicts in Palaeolithic archaeology. By examining a range of well-selected case studies and discursive contexts, the author shows that French and Anglophone approaches in lithic analysis are anchored in opposing cognitive frameworks. He argues that the mainstays of this division can be elucidated by calling upon the marginalised work of American philosopher Stephen C. Pepper, who captured the totality of credible Western thought in terms of four equitable world hypotheses. Based upon his insights, the dissertation demonstrates that French lithic research gravitates towards ‘contextualistic’ and ‘organicistic’ modes of inquiry, while Anglophone approaches tend to rely on ‘formistic’ and ‘mechanistic’ styles of reasoning. Hussain carefully lays out the implications of this condition for mutual understanding and critical practice. He contends that the French-Anglophone divide can only be overcome if scholars endorse scientific pluralism and begin to seriously take into consideration both the strengths and shortcomings of different cognitive frameworks, including their own.