2,426 research outputs found

    Foundational Extensible Corecursion

    Full text link
    This paper presents a formalized framework for defining corecursive functions safely in a total setting, based on corecursion up-to and relational parametricity. The end product is a general corecursor that allows corecursive (and even recursive) calls under well-behaved operations, including constructors. Corecursive functions that are themselves well behaved can be registered as such, thereby increasing the corecursor's expressiveness. The metatheory is formalized in the Isabelle proof assistant and forms the core of a prototype tool. The corecursor is derived from first principles, without requiring new axioms or extensions of the logic.
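    The shape of definition the paper's corecursor admits can be loosely illustrated with lazy streams. The sketch below is a Python analogue under stated assumptions, not the paper's Isabelle/HOL construction: `Stream`, `smap`, and `nats` are illustrative names, and `smap` stands in for a "well-behaved" (friendly) operation under which corecursive calls remain productive.

    ```python
    # A lazy stream: a rough analogue of a codatatype, with the tail
    # suspended in a thunk so self-referential definitions stay productive.
    class Stream:
        def __init__(self, head, tail_thunk):
            self.head = head
            self._tail_thunk = tail_thunk
            self._tail = None

        @property
        def tail(self):
            if self._tail is None:
                self._tail = self._tail_thunk()
            return self._tail

    def smap(f, s):
        # A "well-behaved" operation: consumes one constructor, emits one.
        return Stream(f(s.head), lambda: smap(f, s.tail))

    # The corecursive call appears under the constructor AND under smap --
    # the kind of definition a corecursor "up-to" well-behaved operations
    # accepts, whereas plain guarded corecursion only allows constructors.
    nats = Stream(0, lambda: smap(lambda n: n + 1, nats))

    def take(n, s):
        out = []
        for _ in range(n):
            out.append(s.head)
            s = s.tail
        return out
    ```

    Evaluating a finite prefix forces only finitely many thunks, which is the operational counterpart of the definition being productive.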

    Keynesian Theory and the AD-AS Framework: A Reconsideration

    Get PDF
    Contrary to what has been argued by a number of critics, the AD-AS framework is both internally consistent and in conformity with Keynes’s own analysis. Moreover, the eclectic approach to behavioral foundations allows models in this tradition to take into account aggregation problems as well as evidence from behavioral economics. Unencumbered by the straitjacket of optimizing microfoundations, the approach can provide a useful starting point for the analysis of dynamic macroeconomic interactions. In developing this analysis, the AD-AS approach can draw on insights from the Post Keynesian, neo-Marxian and structuralist traditions, as well as from the burgeoning literature on behavioral economics. JEL Categories: E12, O11, B22, B41, B50. Keywords: AS-AD, Keynes, New Keynesian theory, microeconomic foundations.

    Unified classical logic completeness: a coinductive pearl

    Get PDF
    Codatatypes are absent from many programming languages and proof assistants. We make a case for their importance by revisiting a classic result: the completeness theorem for first-order logic established through a Gentzen system. The core of the proof establishes an abstract property of possibly infinite derivation trees, independently of the concrete syntax or inference rules. This separation of concerns simplifies the presentation. The abstract proof can be instantiated for a wide range of Gentzen and tableau systems as well as various flavors of first-order logic. The corresponding Isabelle/HOL formalization demonstrates the recently introduced support for codatatypes and the Haskell code generator.
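    The "possibly infinite derivation trees" at the heart of the proof can be sketched as nodes whose premises are computed lazily. This is a hedged Python analogue of a codatatype, with illustrative names (`Deriv`, `loop`, `judgments_to_depth`), not the Isabelle/HOL development itself:

    ```python
    # A node of a possibly infinite derivation tree: a judgment plus a
    # thunk producing its premises (subderivations), forced on demand.
    class Deriv:
        def __init__(self, judgment, premises_thunk):
            self.judgment = judgment
            self._premises_thunk = premises_thunk
            self._premises = None

        @property
        def premises(self):
            if self._premises is None:
                self._premises = self._premises_thunk()
            return self._premises

    # An infinite derivation: a rule that keeps re-deriving the same
    # judgment forever -- exactly the non-well-founded tree that an
    # inductive datatype of derivations cannot represent.
    def loop(judgment):
        return Deriv(judgment, lambda: [loop(judgment)])

    def judgments_to_depth(tree, depth):
        """Collect judgments on a finite prefix of the (possibly infinite) tree."""
        if depth == 0:
            return []
        out = [tree.judgment]
        for p in tree.premises:
            out += judgments_to_depth(p, depth - 1)
        return out
    ```

    In the completeness argument, such non-well-founded trees represent failed proof attempts, from which countermodels are read off; only the coinductive reading makes them first-class objects.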

    Attesting Digital Discrimination Using Norms

    Get PDF
    More and more decisions have recently been delegated to machine learning (ML) and automatic decision systems. Despite initial misconceptions that these systems are unbiased and fair, recent cases, such as racist algorithms being used to inform parole decisions in the US, low-income neighborhoods targeted with high-interest loans and low credit scores, and women being undervalued by online marketing, have fueled public distrust in machine learning. This poses a significant challenge to the adoption of ML by companies and public-sector organisations, even though ML has the potential to yield significant cost reductions and more efficient decisions, and it is motivating research in the area of algorithmic fairness and fair ML. Much of that research provides detailed statistics, metrics and algorithms that are difficult to interpret and use for someone without technical skills. This paper tries to bridge the gap between lay users and fairness metrics by using simpler notions and concepts to represent and reason about digital discrimination. In particular, we use norms as an abstraction to communicate situations that may lead to algorithms committing discrimination: we formalise non-discrimination norms in the context of ML systems and propose an algorithm to attest whether ML systems violate these norms.
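    One way to picture a non-discrimination norm as a checkable predicate over a system's decisions is sketched below. This is a minimal illustration in the spirit of norm-based attestation, not the authors' formalisation: the demographic-parity norm, the `tolerance` threshold, and the function names are all assumptions made for the example.

    ```python
    # Hypothetical norm check: approval rates across protected-attribute
    # groups may differ by at most `tolerance` (a demographic-parity norm).
    def positive_rate(decisions, group, value):
        # Fraction of approved decisions among records where the
        # protected attribute `group` takes the value `value`.
        members = [d for d in decisions if d[group] == value]
        return sum(d["approved"] for d in members) / len(members)

    def violates_parity_norm(decisions, group, values, tolerance=0.1):
        """Attest whether the decisions violate the (illustrative) parity norm."""
        rates = [positive_rate(decisions, group, v) for v in values]
        return max(rates) - min(rates) > tolerance
    ```

    The point of the norm abstraction is that a verdict like "the parity norm is violated for attribute `sex`" is communicable to a lay stakeholder, whereas the underlying rate statistics alone are not.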

    Certified Impossibility Results for Byzantine-Tolerant Mobile Robots

    Get PDF
    We propose a framework to build formal developments for robot networks using the COQ proof assistant, to state and to prove various properties formally. We focus in this paper on impossibility proofs, as it is natural to take advantage of the COQ higher-order calculus to reason about algorithms as abstract objects. We present in particular formal proofs of two impossibility results for convergence of oblivious mobile robots when, respectively, more than one half and more than one third of the robots exhibit Byzantine failures, starting from the original theorems by Bouzid et al. Thanks to our formalization, the corresponding COQ developments are quite compact. To our knowledge, these are the first certified (in the sense of formally proved) impossibility results for robot networks.

    Non-empirical problems in fair machine learning

    Get PDF
    The problem of fair machine learning has drawn much attention over the last few years and the bulk of offered solutions are, in principle, empirical. However, algorithmic fairness also raises important conceptual issues that would fail to be addressed if one relies entirely on empirical considerations. Herein, I will argue that the current debate has developed an empirical framework that has brought important contributions to the development of algorithmic decision-making, such as new techniques to discover and prevent discrimination, additional assessment criteria, and analyses of the interaction between fairness and predictive accuracy. However, the same framework has also suggested higher-order issues regarding the translation of fairness into metrics and quantifiable trade-offs. Although the (empirical) tools which have been developed so far are essential to address discrimination encoded in data and algorithms, their integration into society elicits key (conceptual) questions such as: What kind of assumptions and decisions underlie the empirical framework? How do the results of the empirical approach penetrate public debate? What kind of reflection and deliberation should stakeholders have over available fairness metrics? I will outline the empirical approach to fair machine learning, i.e. how the problem is framed and addressed, and suggest that there are important non-empirical issues that should be tackled. While this work will focus on the problem of algorithmic fairness, the lesson can extend to other conceptual problems in the analysis of algorithmic decision-making, such as privacy and explainability.

    Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems

    Get PDF
    This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life-cycle application of formal methods.

    GROWTH CYCLES IN MATURE AND DUAL ECONOMIES

    Get PDF
    Mature economies may experience fluctuations, but the average medium and long run growth rate matches the natural rate. Like Kaldor's neo-Keynesian models, the Marx-Goodwin tradition explains this outcome by endogenizing the distribution of income and assuming that the accumulation of capital is increasing as a function of the profit share. The application of Goodwin cycles to developing economies may be hard to justify, however. The modified Goodwin models in this paper include relative-wage norms as a central element of wage formation. Norms change endogenously, leading to path dependence (hysteresis) in the stationary solution for the employment share of the modern sector. The effects of shocks – the sensitivity of the long-run outcome to initial conditions – may be amplified by non-linearities in the adjustment of wages to deviations of actual wages from the norm.
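    For orientation, the baseline system in the Marx-Goodwin tradition (Goodwin's standard growth-cycle model, not this paper's modified version with wage norms) couples the wage share $\omega$ and the employment rate $v$ in a Lotka-Volterra form:

    ```latex
    \frac{\dot{\omega}}{\omega} = \varphi(v) - \alpha, \qquad
    \frac{\dot{v}}{v} = \frac{1-\omega}{\sigma} - (\alpha + \beta)
    ```

    Here $\varphi(v)$ is a real-wage Phillips curve, $\alpha$ the growth rate of labor productivity, $\beta$ the growth rate of the labor force, and $\sigma$ the capital-output ratio, so accumulation out of the profit share $1-\omega$ drives employment while tight labor markets drive the wage share. The paper's modification adds endogenously changing relative-wage norms to the wage-formation equation, which is what produces hysteresis in the stationary solution.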