
    Non-reformist reform for Haskell Modularity

    Module systems like Haskell's permit only a weak form of modularity, in which module implementations depend directly on other implementations and must be processed in that order of dependency. Module systems like ML's, on the other hand, permit a strong form of modularity, in which explicit interfaces express assumptions about dependencies and every module can be typechecked and reasoned about independently. In this thesis, I present Backpack, a new language for building separately-typecheckable packages on top of a weak module system like Haskell's. The design of Backpack is the first to bring the rich world of type systems to the practical world of packages via mixin modules. It is inspired by the MixML module calculus of Rossberg and Dreyer, but, by choosing practicality over expressivity, Backpack both simplifies that semantics and supports a flexible notion of applicative instantiation. Moreover, this design is motivated less by foundational concerns and more by the practical concern of integration into Haskell. The result is a new approach to writing modular software at the scale of packages. Backpack's semantics is defined by elaboration into sets of Haskell modules and binary interface files, showing how Backpack maintains interoperability with Haskell while retrofitting it with interfaces. In my formalization of Backpack, I present a novel type system for Haskell modules and verify a key correctness theorem to validate Backpack's semantics.
    Max Planck Institute for Software Systems (MPI-SWS)
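    As a rough, hedged illustration of the idea (not an example taken from the thesis): in the Backpack that later shipped with GHC and Cabal, a package can be written against an abstract signature and typechecked before any implementation is chosen, with a concrete module mixed in afterwards. The names Str, len, Client, and StrString below are invented for this sketch.

        -- Str.hsig: a signature (interface only), against which client code
        -- can be separately typechecked before an implementation is chosen.
        signature Str where

        data Str
        empty  :: Str
        append :: Str -> Str -> Str
        len    :: Str -> Int

        -- Client.hs: depends only on the Str signature above.
        module Client (doubledLen) where

        import Str

        doubledLen :: Str -> Int
        doubledLen s = len (append s s)

        -- StrString.hs: one concrete implementation; mixing it in (for example
        -- via Cabal's mixins/renaming) instantiates the indefinite package.
        module StrString (Str, empty, append, len) where

        newtype Str = Str String

        empty :: Str
        empty = Str ""

        append :: Str -> Str -> Str
        append (Str a) (Str b) = Str (a ++ b)

        len :: Str -> Int
        len (Str s) = length s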

    An interactive semantics of logic programming

    We apply to logic programming some recently emerging ideas from the field of reduction-based communicating systems, with the aim of giving evidence of the hidden interactions and the coordination mechanisms that rule the operational machinery of such a programming paradigm. The semantic framework we have chosen for presenting our results is tile logic, which has the advantage of allowing a uniform treatment of goals and observations and of applying abstract categorical tools for proving the results. As main contributions, we mention the finitary presentation of abstract unification, and a concurrent and coordinated abstract semantics consistent with the most common semantics of logic programming. Moreover, the compositionality of the tile semantics is guaranteed by standard results, as it reduces to checking that the tile systems associated with logic programs enjoy the tile decomposition property. An extension of the approach for handling constraint systems is also discussed.
    Comment: 42 pages, 24 figures, 3 tables; to appear in the CUP journal Theory and Practice of Logic Programming
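    For readers who want a concrete anchor for "abstract unification": the tile-logic presentation in the paper generalises ordinary syntactic unification of first-order terms, a plain version of which is sketched below. This code is not from the paper; Term, Subst, and unify are illustrative names only.

        import qualified Data.Map as M

        -- First-order terms: variables and function symbols applied to arguments.
        data Term = Var String | Fun String [Term] deriving (Eq, Show)

        -- A substitution maps variable names to terms (triangular: bindings may
        -- mention other bound variables, which `apply` chases recursively).
        type Subst = M.Map String Term

        apply :: Subst -> Term -> Term
        apply s (Var x)    = maybe (Var x) (apply s) (M.lookup x s)
        apply s (Fun f ts) = Fun f (map (apply s) ts)

        -- Most general unifier of two terms, if one exists.
        unify :: Term -> Term -> Maybe Subst
        unify a b = go [(a, b)] M.empty
          where
            go [] s = Just s
            go ((t1, t2) : rest) s =
              case (apply s t1, apply s t2) of
                (Var x, Var y) | x == y       -> go rest s
                (Var x, t) | not (occurs x t) -> go rest (M.insert x t s)
                (t, Var x) | not (occurs x t) -> go rest (M.insert x t s)
                (Fun f as, Fun g bs)
                  | f == g && length as == length bs -> go (zip as bs ++ rest) s
                _ -> Nothing
            occurs x (Var y)    = x == y
            occurs x (Fun _ ts) = any (occurs x) ts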

    Knowledge-Based Techniques for Scholarly Data Access: Towards Automatic Curation

    Accessing up-to-date, high-quality scientific literature is a critical preliminary step in any research activity. Identifying the scholarly literature relevant to a given task or application is, however, a complex and time-consuming activity. Despite the large number of tools developed over the years to support scholars in surveying the literature, such as Google Scholar, Microsoft Academic Search, and others, the best way to access quality papers remains asking a domain expert who is actively involved in the field and knows its research trends and directions. State-of-the-art systems, in fact, either do not allow exploratory search, such as identifying the active research directions within a given topic, or do not offer proactive features, such as content recommendation, both of which are critical to researchers. To overcome these limitations, we strongly advocate a paradigm shift in the development of scholarly data access tools: moving from traditional information retrieval and filtering tools towards automated agents able to make sense of the textual content of published papers and thereby monitor the state of the art. Building such a system is, however, a complex task that entails tackling non-trivial problems in the fields of Natural Language Processing, Big Data Analysis, User Modelling, and Information Filtering. In this work, we introduce the concept of an Automatic Curator System and present its fundamental components.
    Dottorato di ricerca in Informatica (PhD programme in Computer Science). De Nart, Dari
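    As a deliberately simplified sketch of the "content recommendation" component mentioned above (invented for this summary, not taken from the thesis), one baseline is to rank candidate papers by the textual similarity of their abstracts to a profile built from papers the researcher already follows.

        import qualified Data.Map.Strict as M
        import Data.Char (isAlpha, toLower)
        import Data.List (sortOn)
        import Data.Ord (Down (..))

        type Bag = M.Map String Double

        -- Bag-of-words term frequencies for a piece of text.
        bag :: String -> Bag
        bag = M.fromListWith (+) . map (\w -> (w, 1)) . words . map normalize
          where normalize c = if isAlpha c then toLower c else ' '

        -- Cosine similarity between two bags of words.
        cosine :: Bag -> Bag -> Double
        cosine a b
          | na == 0 || nb == 0 = 0
          | otherwise          = dot / (na * nb)
          where
            dot = sum (M.elems (M.intersectionWith (*) a b))
            na  = sqrt (sum (map (^ 2) (M.elems a)))
            nb  = sqrt (sum (map (^ 2) (M.elems b)))

        -- Rank candidate (title, abstract) pairs against a profile built
        -- from the abstracts the researcher has already read.
        recommend :: [String] -> [(String, String)] -> [(String, Double)]
        recommend readAbstracts candidates =
          sortOn (Down . snd)
            [ (title, cosine profile (bag abstract)) | (title, abstract) <- candidates ]
          where profile = M.unionsWith (+) (map bag readAbstracts)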

    A Process Model of Non-Relativistic Quantum Mechanics

    A process model of quantum mechanics utilizes a combinatorial game to generate a discrete and finite causal space upon which a self-consistent quantum mechanics can be defined. An emergent space-time and a continuous wave function arise through a uniform interpolation process. Standard non-relativistic quantum mechanics (at least for integer-spin particles) emerges in the limit of infinite information (the causal space grows to infinity) and infinitesimal scale (the separation between points goes to zero). This model is quasi-local, discontinuous, and quasi-non-contextual. The bridge between process and wave function is the process covering map, which reveals that the standard wave-function formalism lacks important dynamical information related to the generation of the causal space. Reformulating several classical conundrums, such as wave-particle duality, Schrödinger's cat, and hidden-variable results, the model offers potential resolutions to all, while retaining a high degree of locality and contextuality at the local level, yet nonlocality and contextuality at the emergent level. The model remains computationally powerful.
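    As a generic, hedged illustration of what "a continuous wave function arising through uniform interpolation" can mean (a schematic form only, not the paper's actual construction): given amplitudes $\psi_i$ attached to the discrete sites $x_i$ of the causal space, one can form

        \Psi(x) \;=\; \sum_{i} \psi_i \, K(x - x_i)

    for some fixed interpolation kernel $K$, with standard quantum mechanics expected only in the limit where the number of sites grows without bound and their separation goes to zero.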

    Linear superposition as a core theorem of quantum empiricism

    Clarifying the nature of the quantum state $|\Psi\rangle$ is at the root of the problems with insight into (counterintuitive) quantum postulates. We provide a direct (and math-axiom free) empirical derivation of this object as an element of a vector space. Establishing the linearity of this structure (quantum superposition) is based on a set-theoretic creation of ensemble formations and invokes the following three principia: (I) quantum statics, (II) the doctrine of a number in the physical theory, and (III) the mathematization of matching the two observations with each other (quantum invariance). All of the constructs rest upon a formalization of the minimal experimental entity: the observed micro-event, the detector click. This is sufficient for producing the $\mathbb{C}$-numbers, the axioms of a linear vector space (the superposition principle), statistical mixtures of states, eigenstates and their spectra, and the non-commutativity of observables. No use is required of the concept of time. As a result, the foundations of the theory are liberated to a significant extent from the issues associated with physical interpretations, philosophical exegeses, and the mathematical reconstruction of the entire quantum edifice.
    Comment: No figures. 64 pages; 68 pages (+4), overall substantial improvements; 70 pages (+2), further improvements
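    For orientation, the target of the derivation is the familiar superposition principle, stated here in standard textbook form rather than in the paper's notation: the admissible states form a vector space over $\mathbb{C}$, so that

        |\Psi\rangle \;=\; c_1 |\psi_1\rangle + c_2 |\psi_2\rangle, \qquad c_1, c_2 \in \mathbb{C},

    with statistical mixtures, eigenstates, spectra, and non-commuting observables then recovered on top of this linear structure.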

    Human decision-making in computer security incident response

    Background: Cybersecurity has risen to international importance. Almost every organization will fall victim to a successful cyberattack. Yet guidance for computer security incident response analysts is inadequate. Research Questions: What heuristics should an incident analyst use to construct general knowledge and analyse attacks? Can we construct formal tools that enable automated decision support for the analyst based on such heuristics and knowledge? Method: We take an interdisciplinary approach. To answer the first question, we draw on the research tradition of philosophy of science, specifically the study of mechanisms. To answer the question on formal tools, we draw on the research tradition of program verification and logic, specifically Separation Logic. Results: We identify several heuristics from the biological sciences that cybersecurity researchers have re-invented to varying degrees. We consolidate the new mechanisms literature to yield heuristics related to the fact that knowledge consists of clusters of multi-field mechanism schemas along four dimensions. General knowledge structures such as the intrusion kill chain provide context and supply hypotheses for filling in details. The philosophical analysis answers the first research question and also provides constraints on building the logic. Finally, we succeed in defining an incident analysis logic resembling Separation Logic and translating the kill chain into it as a proof of concept. Conclusion: These results benefit incident analysis, enabling it to expand from a tradecraft or art to one that also integrates science. Future research might realize our logic in automated decision support. Additionally, we have opened the field of cybersecurity to collaboration with philosophers of science and logicians.
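    As a toy illustration only (the thesis defines its own logic; the types below are invented for this sketch): the flavour of a Separation-Logic-style incident analysis language is that the intrusion kill chain supplies the phases, and a separating conjunction combines claims supported by disjoint bodies of evidence.

        -- Phases of the (Lockheed Martin) intrusion kill chain.
        data Phase
          = Reconnaissance | Weaponization | Delivery | Exploitation
          | Installation | CommandAndControl | ActionsOnObjectives
          deriving (Eq, Ord, Show, Enum, Bounded)

        -- A Separation-Logic flavoured assertion language over evidence of type ev.
        -- `Star` reads like separating conjunction: both claims hold, and they are
        -- supported by disjoint pieces of evidence.
        data Assertion ev
          = Emp                                   -- no evidence claimed
          | Observed Phase ev                     -- evidence attributed to a phase
          | Star (Assertion ev) (Assertion ev)    -- claims over disjoint evidence
          | Implies (Assertion ev) (Assertion ev) -- hypothesised consequence
          deriving (Show)

        -- Example: delivery and exploitation observed on disjoint evidence
        -- suggests hunting for the Installation phase next.
        nextHypothesis :: Assertion String
        nextHypothesis =
          Star (Observed Delivery "phishing email")
               (Observed Exploitation "macro execution")
          `Implies` Observed Installation "persistence artefact (to be found)"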