
    Unexpected Power of Random Strings


    Computability and Algorithmic Complexity in Economics

    This is an outline of the origins and development of the way computability theory and algorithmic complexity theory were incorporated into economic and finance theories. We try to place, in the context of the development of computable economics, some of the classics of the subject as well as those that have, from time to time, been credited with having contributed to the advancement of the field. Speculative thoughts on where the frontiers of computable economics are, and how to move towards them, conclude the paper. In a precise sense - both historically and analytically - it would not be an exaggeration to claim that both the origins of computable economics and its frontiers are defined by two classics by Banach and Mazur: the one-page masterpiece ([5]), built on the foundations of Turing's own classic, and the unpublished Mazur conjecture of 1928, together with its unpublished proof by Banach ([38], ch. 6 & [68], ch. 1, #6). For the undisputed original classic of computable economics is Rabin's effectivization of the Gale-Stewart game ([42]; [16]); the frontiers, as I see them, are defined by recursive analysis and constructive mathematics, underpinning computability over the computable and constructive reals and providing computable foundations for the economist's Marshallian penchant for curve-sketching ([9]; [19]; and, in general, the contents of Theoretical Computer Science, Vol. 219, Issue 1-2). The former work has its roots in the Banach-Mazur game (cf. [38], especially p. 30), at least in one reading of it; the latter in ([5]), as well as other, earlier, contributions, not least by Brouwer.

    Efficient universal pushdown cellular automata and their application to complexity

    In order to obtain universal classical cellular automata, an infinite space of cells is required. Therefore, the number of required processors depends on the length of the input data and, additionally, may increase during the computation. On the other hand, Turing machines are universal devices with a single processor and an additional infinite storage tape. Here a model that is, in some sense, intermediate between the two is studied. Pushdown cellular automata are a stack-augmented generalization of classical cellular automata. They form a massively parallel universal model in which the number of processors is bounded by the length of the input data. Efficient universal pushdown cellular automata and their efficiently verifiable encodings are proposed. They are applied to computational complexity, and tight time and stack-space hierarchies are shown. CR Subject Classification (1998): F.1, F.4.3, B.6.1, E.
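    The sketch below is one way to picture such a device, under assumptions of my own: a one-dimensional array whose length equals the input length, where each cell carries a state and a private stack and is updated synchronously from its neighbours' states and its own top-of-stack symbol. The cell alphabet, the toy rule and all identifiers are illustrative; the paper's efficient universal construction and encodings are not reproduced here.

```python
# A minimal sketch, under assumptions of my own, of a one-dimensional
# pushdown cellular automaton: each cell keeps a state plus its own stack,
# the number of cells equals the length of the input, and every cell is
# updated synchronously from its own state, its neighbours' states and the
# symbol on top of its own stack.  The rule below is a toy example, not the
# paper's universal construction.

BORDER = "#"   # permanent boundary symbol outside the array
BOTTOM = "Z"   # stack-bottom symbol that is never popped

def pca_step(states, stacks, rule):
    """One synchronous step. `rule` maps (left, self, right, top_of_stack)
    to (new_state, stack_op), where stack_op is ('push', sym), ('pop',) or ('keep',)."""
    n = len(states)
    new_states, new_stacks = [], []
    for i in range(n):
        left = states[i - 1] if i > 0 else BORDER
        right = states[i + 1] if i < n - 1 else BORDER
        top = stacks[i][-1]
        new_state, op = rule(left, states[i], right, top)
        stack = list(stacks[i])
        if op[0] == "push":
            stack.append(op[1])
        elif op[0] == "pop" and len(stack) > 1:   # never pop the bottom marker
            stack.pop()
        new_states.append(new_state)
        new_stacks.append(stack)
    return new_states, new_stacks

# Toy rule: each cell copies its right neighbour's state (shifting the word
# left) and pushes its previous state, so the stacks record local history.
def toy_rule(left, me, right, top):
    return (right if right != BORDER else me), ("push", me)

word = list("abba")
stacks = [[BOTTOM] for _ in word]      # one private stack per cell
for _ in range(3):
    word, stacks = pca_step(word, stacks, toy_rule)
print(word, stacks)
```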

    Uncomputability and Undecidability in Economic Theory

    Economic theory, game theory and mathematical statistics have all increasingly become algorithmic sciences. Computable Economics, Algorithmic Game Theory ([28]) and Algorithmic Statistics ([13]) are frontier research subjects. All of them, each in its own way, are underpinned by (classical) recursion theory - and its applied branches, such as computational complexity theory and algorithmic information theory - and, occasionally, proof theory. These research paradigms have posed new mathematical and metamathematical questions and, inadvertently, undermined the traditional mathematical foundations of economic theory. A concise, but partial, pathway into these new frontiers is the subject matter of this paper. Interpreting the core of mathematical economic theory to be defined by General Equilibrium Theory and Game Theory, a general - but concise - analysis of the computable and decidable content of the implications of these two areas is presented. Issues at the frontiers of macroeconomics, now dominated by Recursive Macroeconomic Theory, are also tackled, albeit very briefly. The point of view adopted is that of classical recursion theory and varieties of constructive mathematics.
    Keywords: General Equilibrium Theory, Game Theory, Recursive Macroeconomics, (Un)computability, (Un)decidability, Constructivity

    The Computability-Theoretic Content of Emergence

    In dealing with emergent phenomena, a common task is to identify useful descriptions of them in terms of the underlying atomic processes, and to extract enough computational content from these descriptions to enable predictions to be made. Generally, the underlying atomic processes are quite well understood, and (with important exceptions) captured by mathematics from which it is relatively easy to extract algorithmic content. A widespread view is that the difficulty in describing transitions from algorithmic activity to the emergence associated with chaotic situations is a simple case of complexity outstripping computational resources and human ingenuity. Or, on the other hand, that phenomena transcending the standard Turing model of computation, if they exist, must necessarily lie outside the domain of classical computability theory. In this article we suggest that much of the current confusion arises from conceptual gaps and the lack of a suitably fundamental model within which to situate emergence. We examine the potential for placing emergent relations in a familiar context based on Turing's 1939 model for interactive computation over structures described in terms of reals. The explanatory power of this model is explored, formalising informal descriptions in terms of mathematical definability and invariance, and relating a range of basic scientific puzzles to results and intractable problems in computability theory.
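    As a rough orientation to the kind of model invoked here, the sketch below illustrates relative (oracle) computation in the spirit of Turing's 1939 construction: an otherwise effective procedure that may query an externally given real, which itself need not be computable. The representation of the oracle and the function names are assumptions made for illustration, not the formal definability framework developed in the article.

```python
# A minimal sketch, under assumptions of my own, of relative (oracle)
# computation: an ordinary effective procedure that is allowed to query an
# externally supplied real, here represented as a function from bit
# positions to binary digits.  The oracle itself need not be computable.

from typing import Callable

Oracle = Callable[[int], int]   # n -> n-th binary digit of some real in [0, 1]

def exceeds_half(oracle: Oracle, precision: int) -> bool:
    """Compare a finite-precision approximation of the oracle real with 1/2.
    The comparison is effective *given* the oracle, even when the real it
    encodes has no computable description."""
    approx = sum(oracle(i) * 2.0 ** -i for i in range(1, precision + 1))
    return approx > 0.5

# Example oracle: the binary expansion of 1/3 = 0.010101..._2
one_third: Oracle = lambda n: 1 if n % 2 == 0 else 0

print(exceeds_half(one_third, 20))   # False, since 1/3 < 1/2
```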

    Verifiable Network-Performance Measurements

    In the current Internet, there is no clean way for affected parties to react to poor forwarding performance: when a domain violates its Service Level Agreement (SLA) with a contractual partner, the partner must resort to ad-hoc probing-based monitoring to determine the existence and extent of the violation. Instead, we propose a new, systematic approach to the problem of forwarding-performance verification. Our mechanism relies on voluntary reporting, allowing each domain to disclose its loss and delay performance to its neighbors; it does not disclose any information regarding the participating domains' topology or routing policies beyond what is already publicly available. Most importantly, it enables verifiable performance measurements, i.e., domains cannot abuse it to significantly exaggerate their performance. Finally, our mechanism is tunable, allowing each participating domain to determine how many resources to devote to it independently (i.e., without any inter-domain coordination), exposing a controllable trade-off between performance-verification quality and resource consumption. Our mechanism comes at the cost of deploying modest functionality at the participating domains' border routers; we show that it requires reasonable processing and memory resources, within modern network capabilities.
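    As a purely illustrative sketch (not the mechanism proposed in the paper), the snippet below shows one generic way a neighbor could cross-check a voluntarily reported loss rate: both sides sample the same packets deterministically with a shared key, and a report is rejected if it claims substantially better performance than the loss observed on the sample. All names, parameters and the tolerance are assumptions.

```python
# A generic, hypothetical sanity check on a reported loss rate; it is not
# the protocol from the paper, only an illustration of how voluntary
# reports can be made hard to exaggerate via deterministic sampling.

import hashlib

def sampled(packet_id: bytes, key: bytes, rate: float) -> bool:
    """Deterministic sampling: the reporting domain and its neighbor pick
    the same packets without coordination, using a shared key."""
    digest = hashlib.sha256(key + packet_id).digest()
    return int.from_bytes(digest[:4], "big") / 2**32 < rate

def verify_loss_report(sent, delivered, key, sample_rate, reported_loss, tolerance=0.02):
    """Accept the reported loss rate only if it is consistent with the loss
    observed on the sampled packets (reporting *worse* performance than
    observed is fine; exaggerating performance is rejected)."""
    sample = [p for p in sent if sampled(p, key, sample_rate)]
    if not sample:
        return True  # nothing sampled; the report cannot be refuted
    observed_loss = 1 - sum(p in delivered for p in sample) / len(sample)
    return reported_loss >= observed_loss - tolerance

# Example: the domain claims 1% loss, but half of the sampled packets vanished.
sent = [f"pkt{i}".encode() for i in range(1000)]
delivered = set(sent[:500])
print(verify_loss_report(sent, delivered, b"shared-key", 0.1, reported_loss=0.01))  # False
```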

    Resource Bounded Immunity and Simplicity

    Revisiting the thirty-year-old notions of resource-bounded immunity and simplicity, we investigate the structural characteristics of various immunity notions: strong immunity, almost immunity, and hyperimmunity, as well as their corresponding simplicity notions. We also study limited forms of immunity and simplicity, called k-immunity and feasible k-immunity, together with their corresponding simplicity notions. Finally, we propose the k-immune hypothesis as a working hypothesis that guarantees the existence of simple sets in NP.
    Comment: This is a complete version of the conference paper that appeared in the Proceedings of the 3rd IFIP International Conference on Theoretical Computer Science, Kluwer Academic Publishers, pp. 81-95, Toulouse, France, August 23-26, 200
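    For orientation, the base notions behind these refinements can be recalled as follows; these are the standard definitions, not the paper's refined variants.

```latex
% Standard (resource-bounded) immunity and simplicity, recalled for
% orientation; the paper's refined variants (strong, almost, hyper-,
% k-immunity and their simplicity counterparts) build on these notions.
\begin{itemize}
  \item A set $A \subseteq \{0,1\}^{*}$ is \emph{$\mathcal{C}$-immune}, for a
        complexity class $\mathcal{C}$ (e.g.\ $\mathcal{C}=\mathrm{P}$), if $A$
        is infinite and no infinite subset of $A$ belongs to $\mathcal{C}$.
  \item A set $A$ is \emph{$\mathcal{C}$-simple} if $A \in \mathcal{C}$ and its
        complement $\overline{A}$ is $\mathcal{C}$-immune.
\end{itemize}
```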