
    Algebras for parameterised monads

    Parameterised monads have the same relationship to adjunctions with parameters as monads do to adjunctions. In this paper, we investigate algebras for parameterised monads. We identify the Eilenberg-Moore category of algebras for parameterised monads and prove a generalisation of Beck’s theorem characterising this category. We demonstrate an application of this theory to the semantics of type and effect systems.
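
    A minimal Haskell sketch, not taken from the paper, of what a parameterised (indexed) monad looks like to a programmer: return and bind track a pair of indices, and the IxState example changes the type of the state between them, which is the kind of structure that type-and-effect semantics exploits. All names here are illustrative.

        {-# LANGUAGE KindSignatures #-}

        -- A parameterised (indexed) monad: computations carry a "before" index i
        -- and an "after" index j, and bind composes them like pre/post-conditions.
        class IxMonad (m :: * -> * -> * -> *) where
          ireturn :: a -> m i i a
          ibind   :: m i j a -> (a -> m j k b) -> m i k b

        -- Example instance: a state monad whose state *type* may change
        -- from i to j over the course of the computation.
        newtype IxState i j a = IxState { runIxState :: i -> (a, j) }

        instance IxMonad IxState where
          ireturn a             = IxState (\s -> (a, s))
          ibind (IxState f) k   = IxState (\i -> let (a, j) = f i
                                                 in runIxState (k a) j)

        -- get leaves the state type unchanged; put replaces it, recording the
        -- type of the newly stored value in the second index.
        iget :: IxState s s s
        iget = IxState (\s -> (s, s))

        iput :: t -> IxState s t ()
        iput t = IxState (\_ -> ((), t))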

    Layer by layer - Combining Monads

    We develop a method to incrementally construct programming languages. Our approach is categorical: each layer of the language is described as a monad. Our method either (i) concretely builds a distributive law between two monads, i.e. layers of the language, which then provides a monad structure to the composition of layers, or (ii) identifies precisely the algebraic obstacles to the existence of a distributive law and gives a best approximant language. The running example will involve three layers: a basic imperative language enriched first by adding non-determinism and then probabilistic choice. The first extension works seamlessly, but the second encounters an obstacle, which results in a best approximant language structurally very similar to the probabilistic network specification language ProbNetKAT.
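
    As a concrete, much simpler instance of method (i) — not the layers studied in the paper — the exception layer Maybe distributes over an arbitrary monad m, and the distributive law equips the composite functor m (Maybe a) with a monad structure (it is the familiar MaybeT transformer). A Haskell sketch, assuming nothing beyond the standard Prelude:

        -- The distributive law: swap Maybe past an arbitrary monad m.
        dist :: Monad m => Maybe (m a) -> m (Maybe a)
        dist Nothing   = return Nothing
        dist (Just ma) = fmap Just ma

        -- The composite of the two layers, m applied after Maybe.
        newtype Composed m a = Composed { runComposed :: m (Maybe a) }

        instance Monad m => Functor (Composed m) where
          fmap f (Composed x) = Composed (fmap (fmap f) x)

        instance Monad m => Applicative (Composed m) where
          pure = Composed . return . Just
          Composed mf <*> Composed mx =
            Composed (do f <- mf; x <- mx; return (f <*> x))

        -- Bind is the composite multiplication induced by the distributive law:
        -- map, distribute, then join in both layers (unfolded via a case split).
        instance Monad m => Monad (Composed m) where
          Composed x >>= k = Composed $ do
            mb <- x
            case mb of
              Nothing -> return Nothing
              Just a  -> runComposed (k a)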

    Hilbert-Post completeness for the state and the exception effects

    In this paper, we present a novel framework for studying the syntactic completeness of computational effects and we apply it to the exception effect. When applied to the state effect, our framework can be seen as a generalization of Pretnar's work on this subject. We first introduce a relative notion of Hilbert-Post completeness, well-suited to the composition of effects. Then we prove that the exception effect is relatively Hilbert-Post complete, as well as the "core" language which may be used for implementing it; these proofs have been formalized and checked with the proof assistant Coq. Comment: Siegfried Rump (Hamburg University of Technology), Chee Yap (Courant Institute, NYU). Sixth International Conference on Mathematical Aspects of Computer and Information Sciences, Nov 2015, Berlin, Germany. 2015, LNC
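
    For orientation only: the exception effect in its usual Either presentation, with two of the equations any complete axiomatisation is expected to validate. This sketch is not the paper's "core" language, nor its Coq formalisation; it merely fixes the operations being discussed.

        type Exn = String

        throw :: Exn -> Either Exn a
        throw = Left

        catch :: Either Exn a -> (Exn -> Either Exn a) -> Either Exn a
        catch (Left e)  h = h e        -- a raised exception is handled
        catch (Right v) _ = Right v    -- an ordinary value passes through

        -- Two expected equations:  catch (throw e) h == h e
        --                          catch (pure v)  h == pure v
        check :: Bool
        check = catch (throw "err") (\_ -> Right 0) == (Right 0 :: Either Exn Int)
             && catch (Right 1)     (\_ -> Right 0) == (Right 1 :: Either Exn Int)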

    Generic Fibrational Induction

    This paper provides an induction rule that can be used to prove properties of data structures whose types are inductive, i.e., are carriers of initial algebras of functors. Our results are semantic in nature and are inspired by Hermida and Jacobs' elegant algebraic formulation of induction for polynomial data types. Our contribution is to derive, under slightly different assumptions, a sound induction rule that is generic over all inductive types, polynomial or not. Our induction rule is generic over the kinds of properties to be proved as well: like Hermida and Jacobs, we work in a general fibrational setting and so can accommodate very general notions of properties on inductive types rather than just those of a particular syntactic form. We establish the soundness of our generic induction rule by reducing induction to iteration. We then show how our generic induction rule can be instantiated to give induction rules for the data types of rose trees, finite hereditary sets, and hyperfunctions. The first of these lies outside the scope of Hermida and Jacobs' work because it is not polynomial, and as far as we are aware, no induction rules have been known to exist for the second and third in a general fibrational framework. Our instantiation for hyperfunctions underscores the value of working in the general fibrational setting since this data type cannot be interpreted as a set. Comment: For Special Issue from CSL 201
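
    For orientation, a Haskell rendering of the rose-tree example and of "induction reduced to iteration" at the level of programs; this is an illustration only, since the paper's development is semantic and fibrational rather than syntactic.

        -- Rose trees: the initial algebra of F X = A x [X].  The functor is not
        -- polynomial because of the inner list, which is why this example falls
        -- outside Hermida and Jacobs' original setting.
        data Rose a = Node a [Rose a]

        -- Iteration: the fold given by initiality.
        foldRose :: (a -> [b] -> b) -> Rose a -> b
        foldRose alg (Node a ts) = alg a (map (foldRose alg) ts)

        -- Induction as iteration, informally: to prove a property of all rose
        -- trees, give an algebra that carries the evidence.  For example, every
        -- rose tree has size >= 1: each recursive result is >= 1 by the
        -- induction hypothesis, so 1 + sum ns >= 1 at every node.
        size :: Rose a -> Int
        size = foldRose (\_ ns -> 1 + sum ns)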

    Effect of hard processes on momentum correlations in pp and p-pbar collisions

    The HBT radii extracted in p-pbar and pp collisions at SPS and Tevatron show a clear correlation with the charged particle rapidity density. We propose to explain the correlation using a simple model where the distance from the initial hard parton-parton scattering to the hadronization point depends on the energy of the partons emitted. Since the particle multiplicity is correlated with the mean energy of the partons produced, we can explain the experimental observations without invoking scenarios that assume a thermal fireball. The model has been applied with success to the existing experimental data, both in the magnitude and the intensity of the correlation. The model has also been extended to pp collisions at the LHC energy of 14 TeV. The possibilities of a better insight into the string spatial development using 3D HBT analysis are discussed. Comment: 12 pages, 6 figures

    A semantical approach to equilibria and rationality

    Game theoretic equilibria are mathematical expressions of rationality. Rational agents are used to model not only humans and their software representatives, but also organisms, populations, species and genes, interacting with each other and with the environment. Rational behaviors are achieved not only through conscious reasoning, but also through spontaneous stabilization at equilibrium points. Formal theories of rationality are usually guided by informal intuitions, which are acquired by observing some concrete economic, biological, or network processes. Treating such processes as instances of computation, we reconstruct and refine some basic notions of equilibrium and rationality from some basic structures of computation. It is, of course, well known that equilibria arise as fixed points; the point is that semantics of computation of fixed points seems to be providing novel methods, algebraic and coalgebraic, for reasoning about them. Comment: 18 pages; Proceedings of CALCO 200
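
    A toy sketch of "equilibria arise as fixed points", not drawn from the paper: iterating a best-response map of a small two-player game until it stabilises computes a Nash equilibrium as a fixed point of that map. The game and all names are illustrative.

        import Data.List (maximumBy)
        import Data.Ord  (comparing)

        type Profile = (Int, Int)   -- one action (0 or 1) per player

        payoff1, payoff2 :: Profile -> Int
        payoff1 (a, _) = a                         -- player 1 always prefers action 1
        payoff2 (a, b) = if b == a then 1 else 0   -- player 2 wants to match player 1

        -- Map a strategy profile to the profile of best replies.
        bestResponse :: Profile -> Profile
        bestResponse (a, b) =
          ( maximumBy (comparing (\a' -> payoff1 (a', b))) [0, 1]
          , maximumBy (comparing (\b' -> payoff2 (a, b'))) [0, 1] )

        -- Generic fixed-point iteration: stop when the map no longer changes
        -- its argument.  An equilibrium is exactly such a fixed point.
        fixpoint :: Eq a => (a -> a) -> a -> a
        fixpoint f x = let x' = f x in if x' == x then x else fixpoint f x'

        equilibrium :: Profile
        equilibrium = fixpoint bestResponse (0, 0)   -- converges to (1, 1)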

    Dual-readout Calorimetry

    The RD52 Project at CERN is a pure instrumentation experiment whose goal is to understand the fundamental limitations to hadronic energy resolution, and other aspects of energy measurement, in high energy calorimeters. We have found that dual-readout calorimetry provides heretofore unprecedented information event-by-event for energy resolution, linearity of response, ease and robustness of calibration, fidelity of data, and particle identification, including energy lost to binding energy in nuclear break-up. We believe that hadronic energy resolutions of σ/E ≈ 1-2% are within reach for dual-readout calorimeters, enabling for the first time comparable measurement precisions on electrons, photons, muons, and quarks (jets). We briefly describe our current progress and near-term future plans. Complete information on all aspects of our work is available at the RD52 website http://highenergy.phys.ttu.edu/dream/. Comment: 10 pages, 10 figures, Snowmass White paper
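
    For orientation (not reproduced in the abstract): the standard dual-readout combination in the literature recovers the shower energy E from the scintillation and Cherenkov signals S and C, which respond differently to the electromagnetic shower fraction f_em. With (h/e)_S and (h/e)_C the hadron-to-electron response ratios of the two channels,

        S = E\,[f_{\mathrm{em}} + (h/e)_S\,(1 - f_{\mathrm{em}})], \qquad
        C = E\,[f_{\mathrm{em}} + (h/e)_C\,(1 - f_{\mathrm{em}})],

        E = \frac{S - \chi C}{1 - \chi}, \qquad
        \chi = \frac{1 - (h/e)_S}{1 - (h/e)_C}.

    Because the combination eliminates the event-by-event fluctuations of f_em, it restores linearity and improves hadronic resolution, which is the event-by-event information the abstract refers to.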

    Charged-Particle Multiplicity in Proton-Proton Collisions

    This article summarizes and critically reviews measurements of charged-particle multiplicity distributions and pseudorapidity densities in p+p(pbar) collisions between sqrt(s) = 23.6 GeV and sqrt(s) = 1.8 TeV. Related theoretical concepts are briefly introduced. Moments of multiplicity distributions are presented as a function of sqrt(s). Feynman scaling, KNO scaling, as well as the description of multiplicity distributions with a single negative binomial distribution and with combinations of two or more negative binomial distributions are discussed. Moreover, similarities between the energy dependence of charged-particle multiplicities in p+p(pbar) and e+e- collisions are studied. Finally, various predictions for pseudorapidity densities, average multiplicities in full phase space, and multiplicity distributions of charged particles in p+p(pbar) collisions at the LHC energies of sqrt(s) = 7 TeV, 10 TeV, and 14 TeV are summarized and compared. Comment: Invited review for Journal of Physics G -- version 2: version after referee's comments
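
    For reference, the two parametrisations named in the abstract take the following standard forms (supplied here for orientation, not quoted from the article). The negative binomial distribution, with mean multiplicity <n> and shape parameter k, and the KNO-scaling hypothesis, which states that at high energy the multiplicity distribution collapses onto an energy-independent function Psi of n/<n>:

        P(n) \;=\; \frac{\Gamma(n + k)}{\Gamma(n + 1)\,\Gamma(k)}\;
                   \frac{(\langle n \rangle / k)^{\,n}}{(1 + \langle n \rangle / k)^{\,n + k}},

        \langle n \rangle\, P(n) \;\longrightarrow\; \Psi\!\left(\frac{n}{\langle n \rangle}\right)
        \quad \text{as } \sqrt{s} \to \infty \quad \text{(KNO scaling)}.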

    Extended Call-by-Push-Value: Reasoning About Effectful Programs and Evaluation Order

    Traditionally, reasoning about programs under varying evaluation regimes (call-by-value, call-by-name, etc.) was done at the meta-level, treating them as term rewriting systems. Levy’s call-by-push-value (CBPV) calculus provides a more powerful approach for reasoning, by treating CBPV terms as a common intermediate language which captures both call-by-value and call-by-name, and by allowing equational reasoning about changes to evaluation order between or within programs. We extend CBPV to additionally deal with call-by-need, which is non-trivial because of shared reductions. This allows the equational reasoning to also support call-by-need. As an example, we then prove that call-by-need and call-by-name are equivalent if nontermination is the only side-effect in the source language. We then show how to incorporate an effect system. This enables us to exploit static knowledge of the potential effects of a given expression to augment equational reasoning; thus a program fragment might be invariant under change of evaluation regime only because of knowledge of its effects.
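
    A small Haskell illustration, not the paper's CBPV calculus, of the underlying point that evaluation order is observable only through effects: with a Writer effect, a call-by-value-style binding evaluates the argument once, a call-by-name-style one re-runs it at every use, and the two coincide exactly when the effect is absent, which is the kind of fact effect-augmented equational reasoning is designed to establish.

        import Control.Monad.Writer (Writer, runWriter, tell)

        -- An "argument" whose evaluation is observable via a logging effect.
        arg :: Writer [String] Int
        arg = tell ["argument evaluated"] >> pure 21

        -- Call-by-value style: bind the argument once and reuse the value.
        cbvStyle :: Writer [String] Int
        cbvStyle = do x <- arg
                      pure (x + x)

        -- Call-by-name style: the computation is re-run at every use.
        cbnStyle :: Writer [String] Int
        cbnStyle = do x <- arg
                      y <- arg
                      pure (x + y)

        main :: IO ()
        main = do
          print (runWriter cbvStyle)  -- (42, ["argument evaluated"])
          print (runWriter cbnStyle)  -- (42, ["argument evaluated","argument evaluated"])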