120 research outputs found

    A Dualities-Consolidating Framework to Support Systematic Programming Language Design

    In the theory of programming languages, duality is increasingly recognized as important for improving economy, since it offers the theoretical development of one of two dual concepts "for free". Two prevalent dualities are the extensibility duality, related to the Expression Problem, and the De Morgan duality, related to evaluation strategies and control flow. For instance, a language that is symmetric with respect to the extensibility duality has both a facility which allows for easy extension with new variants, similar to how classes implement an interface in certain object-oriented languages, and a dual facility which allows for easy extension with new operations, as in functional programming with algebraic data types. However, this theoretical knowledge has arguably yet to be made more accessible to practitioners; in particular, the design of programming languages does not yet benefit from it in a systematic way. As a step towards improving this situation, and building on these prior results, the present work presents a prototype of a foundational system, rather economical in the conceptual sense, in which the extensibility duality and the De Morgan duality are consolidated. In particular, the system is inherently highly symmetric with respect to both dualities, and their consolidation quite naturally allows the essence of the extensibility duality to be carved out, thereby further improving the meta-level economy. As will be demonstrated, this system can serve as a framework in which various language features known from practical programming languages can be recovered (by local syntactic abstractions, a.k.a. macros) and systematically compared, including algebraic data types and function types as known from functional programming, classes and objects, and exception handling, in combination with the evaluation strategies employed by the respective languages. This is intended to facilitate a systematic analysis of programming language concepts which may aid in the design of parsimonious languages that are symmetric with respect to one or both of the mentioned dualities. In the shorter term, the system may also serve as a cornerstone for the systematic development of tools which automatically compare programs in different languages semantically (and convert between them) by analyzing the results of embedding them into the framework.
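
    As a concrete illustration of the extensibility duality referred to above (an illustrative sketch in Scala, not taken from the thesis' own formal system): a data-type decomposition makes new operations cheap but new variants expensive, while an interface-based decomposition makes new variants cheap but new operations expensive.

        // Illustrative sketch only; names are chosen for exposition.
        object ExpressionProblem {
          // Functional decomposition: a closed set of variants. Adding a new
          // operation (e.g. a pretty-printer) is a local change; adding a new
          // variant requires touching every existing pattern match.
          sealed trait Expr
          final case class Lit(n: Int) extends Expr
          final case class Add(l: Expr, r: Expr) extends Expr

          def eval(e: Expr): Int = e match {
            case Lit(n)    => n
            case Add(l, r) => eval(l) + eval(r)
          }

          // Object-oriented decomposition: a closed set of operations in the
          // interface. Adding a new variant (class) is a local change; adding
          // a new operation requires touching every existing class.
          trait OExpr { def eval: Int }
          final class OLit(n: Int) extends OExpr { def eval: Int = n }
          final class OAdd(l: OExpr, r: OExpr) extends OExpr { def eval: Int = l.eval + r.eval }
        }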

    Programming Languages and Systems

    This open access book constitutes the proceedings of the 28th European Symposium on Programming, ESOP 2019, which took place in Prague, Czech Republic, in April 2019, held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2019

    Making predictions using χEFT

    In this thesis we explore the merits of chiral effective field theory (χEFT) as a model for low-energy nuclear physics. χEFT is an effective field theory based on quantum chromodynamics (QCD), describing low-energy interactions of nucleons and pions. We estimate the inherent uncertainties of χEFT and of the accompanying methods used to compute observables in order to test the predictive power of the model. We use experimental pion-nucleon, nucleon-nucleon and few-nucleon data to perform a simultaneous fit of the low-energy constants in χEFT. This results in small statistical uncertainties in the model. The results show a clear order-by-order improvement of χEFT, with the systematic model error dominating the total error budget
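
    A minimal sketch of the kind of error budget discussed above (the symbols are assumptions for illustration, not the thesis' notation): for an observable O computed at chiral order k, the total uncertainty combines the statistical uncertainty from the fit of the low-energy constants with a systematic model (truncation) error, the latter commonly estimated from the order-by-order convergence pattern,

        \sigma_{\mathrm{tot}}^2 \;\approx\; \sigma_{\mathrm{statistical}}^2 + \sigma_{\mathrm{model}}^2,
        \qquad
        \sigma_{\mathrm{model}} \;\sim\; \bigl|\,O^{(k)} - O^{(k-1)}\,\bigr|,

    so that the dominance of the systematic model error, as stated in the abstract, can be read off by comparing the two contributions.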

    Components as coalgebras

    In the tradition of mathematical modelling in physics and chemistry, constructive formal specification methods are based on the notion of a software model, understood as a state-based abstract machine which persists and evolves in time, according to a behavioural model capturing, for example, partiality or (different degrees of) nondeterminism. This can be identified with the more prosaic notion of a software component advocated by the software industry as a ‘building block’ of large, often distributed, systems. Such a component typically encapsulates a number of services through a public interface which provides limited access to a private state space, paying tribute to the now widespread object-oriented programming principles. The tradition of formal design of communicating systems, by contrast, has developed the notion of a process as an abstraction of the behavioural patterns of a computing system, deliberately ignoring the data and state aspects of software systems. Both processes and components belong to the broad group of computing phenomena which are hardly definable (or simply not definable) algebraically, i.e., in terms of a complete set of constructors. Their semantics is essentially observational, in the sense that all that can be traced of their evolution is their interaction with the environment. Therefore coalgebras, whose theory has recently witnessed remarkable developments, appear as a suitable modelling tool. The basic observation of category theory that universal constructions always come in pairs has motivated research on the duality between algebras and coalgebras, which provides a bridge between models of static (constructive, data-oriented) and dynamical (observational, behaviour-oriented) systems. At the programming level, the intuitive symmetry between data and behaviour provides evidence of such a duality, in its canonical initial-final specialisation. This line of thought entails both definitional and proof principles, i.e., a basis for the development of program calculi directly based on (actually driven by) type specifications. Moreover, such properties can be expressed in terms of generic programming combinators which are used not only to calculate programs, but also to program with. Framed in this context, this thesis addresses the following main themes:
    - The investigation of a semantic model for (state-based) software components. These are regarded as concrete coalgebras for some Set endofunctors, with specified initial conditions, and organise themselves in a bicategorical setting. The model is able to capture both behavioural issues, which are usually left implicit in state-based specification methods, and interaction through structured data, which is usually a minor concern in process calculi. Two basic cases are considered, entailing, respectively, a ‘functional’ and an ‘object-oriented’ shape for components. Both cases are parametrized by a model of behaviour, introduced as a strong (usually commutative) monad.
    - The development of corresponding component calculi, also parametric on the behaviour model, which adds to the genericity of the approach.
    - The study of processes and the ‘reconstruction’ of classical (CCS-like) process calculi on top of their representation as inhabitants of (the carriers of) final coalgebras, in an essentially pointfree, calculational style.
    - An overall concern for genericity, in the sense that models and calculi for both components and processes are parametric on the behaviour model and the interaction discipline, respectively.
    - The animation of both processes and components in CHARITY, a functional programming language entirely based on inductive and coinductive categorical data types. In particular, this leads to the development of a process calculi interpreter parametric on the interaction discipline.
    PRAXIS XXI - Projecto LOGCAMP; POO11/IC-PME/II/S - Projecto KARMA; Fundação para a Ciência e Tecnologia; ALGORITMI Research Center
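
    A minimal sketch of the component-as-coalgebra idea described above, written here in Scala for concreteness (the thesis itself works in a categorical setting and uses CHARITY; the names Component, init and step are illustrative assumptions): a state-based component is a state space with an initial state, together with a step function whose result lives in a behaviour monad, here Option to model partiality.

        // Illustrative sketch: a component as a coalgebra  S => I => B[(O, S)]
        // for a behaviour monad B, packaged with an initial state.
        object CoalgebraSketch {
          final case class Component[S, I, O](init: S, step: (S, I) => Option[(O, S)])

          // Example: a bounded counter; None models failed (partial) behaviour.
          val counter: Component[Int, String, Int] =
            Component(0, (s: Int, in: String) =>
              in match {
                case "inc" if s < 10 => Some((s + 1, s + 1)) // output and next state
                case "read"          => Some((s, s))
                case _               => None // undefined on other inputs: partiality
              })
        }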

    Carbonate based ionic liquid synthesis: development, supported by quantum chemical computation and technical application

    The Carbonate Based Ionic Liquid Synthesis (CBILS®) is a green manufacturing process based on the use of alkyl carbonates as alkylating agents. In this work, the development of CBILS® up to a multi-100 kg scale is presented. Critical side products could be suppressed or successfully removed. A large number of diverse starting materials were screened experimentally, and quantum-chemical calculation methods were established in order to model thermodynamic functions for the development and application of the CBILS® process
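
    As a schematic illustration of the carbonate route named above (a generic sketch, not the specific reactions studied in the work; a tertiary amine and dimethyl carbonate are assumed as one representative case): the alkyl carbonate quaternizes the substrate to give a methylcarbonate salt, which can then be converted to the target ionic liquid with a Brønsted acid HX, releasing only methanol and CO2,

        R_3N + (CH_3O)_2CO \;\longrightarrow\; [R_3NCH_3]^{+}\,[CH_3OCO_2]^{-}
        [R_3NCH_3]^{+}\,[CH_3OCO_2]^{-} + HX \;\longrightarrow\; [R_3NCH_3]^{+}\,X^{-} + CH_3OH + CO_2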

    The possibility of the existence of a nitrogen cycle within Enceladus

    This thesis first introduces Enceladus as a satellite of Saturn and presents a variety of its planetological properties. Estimates of its mass, volume, and mean density are made, and a two-shell model of its interior structure is computed. Furthermore, the current knowledge of the moon's atmosphere, magnetic field, and surface is summarized, with particular attention to the south polar region and its "tiger stripes". A further topic is the estimation of Enceladus' heat budget, which presumably rests mainly on tidal heating and the radioactive decay of long-lived isotopes. To this end, the author computes the equilibrium heating rate, and the mean radiogenic heat production is estimated using variable concentrations and heat release rates of these isotopes. The next chapter first addresses how the plume feeds the E ring and then discusses its composition and possible sources. The following section treats a possible subsurface aquifer in more detail with respect to its composition and structure. This is followed by a detailed account of Earth's nitrogen cycle, presenting both the chemical reactions and the microorganisms involved. The final chapter discusses the possibility of life on Enceladus. First, terrestrial ecosystems and life forms are presented that, from today's perspective, would also be possible on Enceladus. It is then examined whether a nitrogen cycle as we know it from Earth would also be possible on this icy moon, and it is argued why the nitrogen compounds are probably of abiotic origin, so that such a nitrogen cycle appears unlikely
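
    A minimal sketch of the radiogenic part of the heat-budget estimate described above (the symbols are assumptions for illustration, not the author's notation): the mean radiogenic heat production follows from the rock mass and the concentrations and specific heat release rates of the long-lived isotopes, and the equilibrium state balances internal heating against the emitted power,

        P_{\mathrm{rad}} \;=\; M_{\mathrm{rock}} \sum_i c_i\, h_i,
        \qquad
        P_{\mathrm{tidal}} + P_{\mathrm{rad}} \;=\; P_{\mathrm{emitted}} \quad \text{(equilibrium)},

    where c_i is the mass concentration of isotope i and h_i its heat release per unit mass and time.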

    Multi-Photon Entanglement

    Major efforts in quantum information science are devoted to the development of methods that are superior to classical information processing, for example the quantum computer or quantum simulations. For these purposes, superposition and entangled states are considered a decisive resource. Furthermore, since the early days of quantum mechanics, entanglement has revealed the discrepancy between the quantum mechanical and the everyday-life perception of the physical world. This combination of fundamental science and application-oriented research makes the realization, characterization, and application of entanglement a challenge pursued by many researchers. In this work, the observation of entangled states of polarization-encoded photonic qubits is pushed forward in two directions: flexibility in state observation and an increase in photon rate. To achieve flexibility, two different schemes are developed: setup-based and entanglement-based observation of inequivalent multi-photon states. The setup-based scheme relies on multi-photon interference at a polarizing beam splitter with prior polarization manipulations. It allows the observation of a family of important four-photon entangled states. The entanglement-based scheme exploits the rich properties of Dicke states under particle projections or loss in order to obtain inequivalent multi-photon entangled states. The observed states are characterized using the fidelity and entanglement witnesses. An increase in photon rate is crucial to achieve entanglement of higher photon numbers, especially when photon sources are utilized that emit photons spontaneously. To this end, a new photon source based on a femtosecond ultraviolet enhancement cavity is presented and applied to the observation of the six-photon Dicke state with three excitations. The implemented schemes allow not only the observation of inequivalent types of entanglement, but also the realization of various quantum information tasks. In this work, the four-photon GHZ state has been used in a quantum simulation of a minimal instance of the toric code. This code enables the demonstration of basic properties of anyons, quasiparticles distinct from bosons and fermions. Further, the six-photon Dicke state has been applied to quantum metrology: it allows one to estimate a phase shift with higher precision than by using only classical resources. Altogether, a whole series of experiments for observing inequivalent multi-photon entangled states can now be replaced by a single experimental setup based on the designs developed in this work. In addition to this new approach to photon processing, a novel photon source has been implemented, paving the way to realizations of applications requiring higher photon numbers
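
    For reference, the two states named above have the following standard forms in the horizontal/vertical (H/V) polarization basis (textbook definitions, not reproduced from the thesis):

        |\mathrm{GHZ}_4\rangle = \tfrac{1}{\sqrt{2}}\bigl(|HHHH\rangle + |VVVV\rangle\bigr),
        \qquad
        |D_6^{(3)}\rangle = \tfrac{1}{\sqrt{20}} \sum_{\mathrm{perm}} |HHHVVV\rangle,

    where the sum runs over the 20 distinct permutations of three H-polarized and three V-polarized photons. In phase estimation, such entangled states allow the classical shot-noise limit, with phase uncertainty scaling as 1/sqrt(N), to be surpassed, approaching the Heisenberg scaling 1/N.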

    A Model of Composite Dark Matter in Light-Front Holographic QCD

    Even after 50 years, there is still no standard, analytically tractable way to treat Quantum Chromodynamics (QCD) non-numerically besides perturbation theory. In the high-energy regime, perturbation theory agrees with experimental results to great accuracy. At low energies, however, the theory becomes strongly coupled and therefore not computable by perturbative methods. Non-perturbative methods are thus needed, and one of the candidates is light-front holography. In this thesis, the basics of light-front quantisation and holography are discussed. Light-front quantisation takes light-cone coordinates and the Hamiltonian quantisation scheme as its basis, and the resulting field theory has many beneficial properties such as frame independence. Still, to extract meaningful results from light-front QCD, one needs to apply bottom-up holographic methods. The last part of this work focuses on the applicability of light-front holographic QCD to dark matter. We find that one can build a secluded SU(3) sector consisting of a doublet of elementary particles, analogous to quarks and gluons. Due to a global symmetry, the lightest stable particle is analogous to the ordinary neutron. It meets the basic requirements for being a WIMP candidate when its mass is higher than 5 TeV
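
    A brief sketch of the standard conventions behind the light-front approach mentioned above (textbook definitions, not specific results of the thesis): ordinary time is traded for light-front time via light-cone coordinates, and hadron masses arise as eigenvalues of the light-front Hamiltonian,

        x^{\pm} = x^{0} \pm x^{3},
        \qquad
        H_{LF}\,|\psi\rangle = M^{2}\,|\psi\rangle,

    where x^+ plays the role of light-front time and M is the invariant mass of the bound state; the frame independence of the resulting wave functions is one of the beneficial properties referred to in the abstract.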
