
    On Quasi-Newton Forward--Backward Splitting: Proximal Calculus and Convergence

    We introduce a framework for quasi-Newton forward--backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal ± rank-r symmetric positive definite matrices. This special type of metric allows for a highly efficient evaluation of the proximal mapping. The key to this efficiency is a general proximal calculus in the new metric. Using duality, we derive formulas that relate the proximal mapping in a rank-r modified metric to the proximal mapping in the original metric. We also describe efficient implementations of the proximity calculation for a large class of functions; the implementations exploit the piecewise-linear nature of the dual problem. We then apply these results to accelerate composite convex minimization problems, which leads to elegant quasi-Newton methods for which we prove convergence. The algorithm is tested on several numerical examples and compared to a comprehensive list of alternatives from the literature. Our quasi-Newton splitting algorithm with the prescribed metric compares favorably against the state of the art. The algorithm has extensive applications, including signal processing, sparse recovery, machine learning and classification, to name a few.
    Comment: arXiv admin note: text overlap with arXiv:1206.115
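    As a minimal sketch of the variable-metric forward--backward idea (not the authors' implementation), the code below solves a lasso problem with a diagonal metric D = diag(d): the prox of lam*||.||_1 in the D-norm is coordinate-wise soft-thresholding with per-coordinate thresholds lam/d_i, so the backward step stays as cheap as in the Euclidean case. The diagonal majorizer is a crude stand-in for the paper's diagonal ± rank-r quasi-Newton metric, and all names are illustrative.

```python
import numpy as np

def prox_l1_diag(x, lam, d):
    """Prox of lam*||.||_1 in the metric ||u||_D^2 = sum_i d_i*u_i^2:
    argmin_u lam*||u||_1 + 0.5*||u - x||_D^2, solved coordinate-wise by
    soft-thresholding with per-coordinate threshold lam/d_i."""
    t = lam / d
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fb_splitting_diag(A, b, lam, n_iter=500):
    """Forward-backward splitting for min 0.5*||Ax - b||^2 + lam*||x||_1,
    with a diagonal metric D that majorizes A^T A (Gershgorin row sums),
    so each iteration is a gradient step followed by a prox step in D."""
    n = A.shape[1]
    d = np.abs(A.T @ A).sum(axis=1)          # d_i >= lambda_max along row i
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # forward (gradient) step
        x = prox_l1_diag(x - grad / d, lam, d)  # backward (prox) step in D
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = 1.0
    b = A @ x_true
    x_hat = fb_splitting_diag(A, b, lam=0.1)
    print("recovered support:", np.flatnonzero(np.round(x_hat, 2)))
```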

    Computation in Economics

    This is an attempt at a succinct survey, from methodological and epistemological perspectives, of the burgeoning, apparently unstructured, field of what is often – misleadingly – referred to as computational economics. We identify and characterise four frontier research fields, encompassing both micro and macro aspects of economic theory, where machine computation plays a crucial role in formal modelling exercises: algorithmic behavioural economics, computable general equilibrium theory, agent-based computational economics and computable economics. In some senses these four research frontiers raise, without resolving, many interesting methodological and epistemological issues in economic theorising in (alternative) mathematical modes.
    Keywords: Classical Behavioural Economics, Computable General Equilibrium Theory, Agent Based Economics, Computable Economics, Computability, Constructivity, Numerical Analysis

    Calibrating Generative Models: The Probabilistic Chomsky-Schützenberger Hierarchy

    A probabilistic Chomsky–Schützenberger hierarchy of grammars is introduced and studied, with the aim of understanding the expressive power of generative models. We offer characterizations of the distributions definable at each level of the hierarchy, including probabilistic regular, context-free, (linear) indexed, context-sensitive, and unrestricted grammars, each corresponding to a familiar probabilistic machine class. Special attention is given to distributions on (unary notations for) positive integers. Unlike in the classical case, where the "semi-linear" languages all collapse into the regular languages, we use analytic tools adapted from the classical setting to show that there is no collapse in the probabilistic hierarchy: more distributions become definable at each level. We also address related issues such as closure under probabilistic conditioning.
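    As a minimal illustration (not from the paper) of how a grammar at one level of the hierarchy defines a distribution over strings, the sketch below samples from a toy probabilistic context-free grammar whose support, {a^n b^n : n >= 1}, is context-free but not regular; the rule probabilities give P(a^n b^n) = 0.4^(n-1) * 0.6. The grammar and function names are illustrative.

```python
import random

# Toy PCFG over {a, b}: S -> a S b (prob 0.4) | a b (prob 0.6).
# A derivation's probability is the product of its rule probabilities.
RULES = {
    "S": [(0.4, ["a", "S", "b"]), (0.6, ["a", "b"])],
}

def sample(symbol="S", rng=random):
    """Sample one string from the distribution the grammar defines."""
    if symbol not in RULES:                # terminal symbol
        return symbol
    r, acc = rng.random(), 0.0
    for prob, rhs in RULES[symbol]:        # pick a production by its mass
        acc += prob
        if r <= acc:
            return "".join(sample(s, rng) for s in rhs)
    return ""  # unreachable when each symbol's probabilities sum to 1

if __name__ == "__main__":
    from collections import Counter
    counts = Counter(sample() for _ in range(10000))
    for word, c in counts.most_common(4):
        print(word, c / 10000)  # empirical mass, e.g. "ab" near 0.6
```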

    Nonstandard analysis, deformation quantization and some logical aspects of (non)commutative algebraic geometry

    This paper surveys results related to well-known works of B. Plotkin and V. Remeslennikov at the interface of algebra, logic and geometry. We start with a brief review of the papers and their motivations. The first sections deal with model theory. In Section 2.1 we describe geometric equivalence, elementary equivalence, and isotypicity of algebras. We look at these notions from the standpoint of universal algebraic geometry and place emphasis on the cases of first-order rigidity. In this setting Plotkin's problem on the structure of automorphisms of (auto)endomorphisms of free objects, and the auto-equivalence of categories, is quite natural and important. Section 2.2 is dedicated to particular cases of Plotkin's problem. Section 2.3 is devoted to Plotkin's problem for automorphisms of the group of polynomial symplectomorphisms. This setting has applications to mathematical physics through the use of model theory (non-standard analysis) in the study of homomorphisms between groups of symplectomorphisms and automorphisms of the Weyl algebra. The last two sections deal with algorithmic problems for noncommutative and commutative algebraic geometry. Section 3.1 is devoted to Gröbner bases in the non-commutative situation. Despite the existence of an algorithm for checking equalities, the zero-divisor and nilpotency problems are algorithmically unsolvable. Section 3.2 is concerned with the problem of embedding of algebraic varieties; a sketch of the proof of its algorithmic undecidability over a field of characteristic zero is given.
    Comment: In this review we partially used results of arXiv:1512.06533, arXiv:math/0512273, arXiv:1812.01883 and arXiv:1606.01566 and put them in a new context
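    Section 3.1 notes that, in the commutative case, Gröbner bases make equality modulo an ideal decidable, whereas the corresponding non-commutative problems (zero divisors, nilpotency) are unsolvable. The following is a minimal SymPy sketch of the decidable commutative case (illustrative, not taken from the paper): two polynomials are equal in the quotient ring exactly when their difference reduces to zero against a Gröbner basis of the ideal.

```python
from sympy import groebner, symbols

x, y = symbols("x y")

# Ideal I = <x^2 + y^2 - 1, x - y>: the unit circle meets the line x = y.
G = groebner([x**2 + y**2 - 1, x - y], x, y, order="lex")

# Equality modulo I is decidable: p = q in the quotient ring iff the
# normal form of p - q with respect to the Groebner basis is zero.
p, q = 2 * x**2, 1  # modulo I we have x = y and x^2 + y^2 = 1, so 2x^2 = 1
_, remainder = G.reduce(p - q)
print("p == q mod I:", remainder == 0)  # expected: True
```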

    Three Dogmas of First-Order Logic and some Evidence-based Consequences for Constructive Mathematics of differentiating between Hilbertian Theism, Brouwerian Atheism and Finitary Agnosticism

    We show how removing faith-based beliefs from current philosophies of classical and constructive mathematics admits formal, evidence-based definitions of constructive mathematics; of a constructively well-defined logic of a formal mathematical language; and of a constructively well-defined model of such a language. We argue that, from an evidence-based perspective, classical approaches which follow Hilbert's formal definitions of quantification can be labelled `theistic', whilst constructive approaches based on Brouwer's philosophy of Intuitionism can be labelled `atheistic'. We then adopt what may be labelled a finitary, evidence-based, `agnostic' perspective and argue that Brouwerian atheism is merely a restricted perspective within the finitary agnostic perspective, whilst Hilbertian theism contradicts the finitary agnostic perspective. We then consider the argument that Tarski's classic definitions permit an intelligence---whether human or mechanistic---to admit finitary, evidence-based definitions of the satisfaction and truth of the atomic formulas of the first-order Peano Arithmetic PA over the domain N of the natural numbers in two, hitherto unsuspected and essentially different, ways. We show that the two definitions correspond to two distinctly different---not necessarily evidence-based but complementary---assignments of satisfaction and truth to the compound formulas of PA over N. We further show that the PA axioms are true over N, and that the PA rules of inference preserve truth over N, under both of the complementary interpretations; and we conclude by drawing some unsuspected constructive consequences of such complementarity for the foundations of mathematics, logic, philosophy, and the physical sciences.

    Algorithmic Verification of Continuous and Hybrid Systems

    We provide a tutorial introduction to reachability computation, a class of computational techniques that exports verification technology toward continuous and hybrid systems. For open, under-determined systems, this technique can sometimes replace an infinite number of simulations.
    Comment: In Proceedings INFINITY 2013, arXiv:1402.661
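    The simplest instance of reachability computation is set propagation for a linear system. The sketch below (illustrative names, and a deliberately crude method) propagates an axis-aligned box through the exact one-step map e^(A*dt); it bounds the reachable states at the sampling instants only and ignores inter-sample behavior, which a real verification tool must also enclose.

```python
import numpy as np
from scipy.linalg import expm

def step_box(M, lo, hi):
    """Tightest axis-aligned box enclosing {M @ x : lo <= x <= hi}.
    Splitting the box into center c and radius r, the image box is
    M @ c +/- |M| @ r (exact for a linear map applied to a box)."""
    c, r = (lo + hi) / 2.0, (hi - lo) / 2.0
    mc, mr = M @ c, np.abs(M) @ r
    return mc - mr, mc + mr

def reach_boxes(A, lo, hi, dt=0.1, steps=20):
    """Over-approximate the reachable sets of x' = A x at times k*dt,
    starting from the box [lo, hi], by iterating the one-step map."""
    M = expm(A * dt)          # exact discrete-time map for linear dynamics
    boxes = [(lo, hi)]
    for _ in range(steps):
        lo, hi = step_box(M, lo, hi)
        boxes.append((lo, hi))
    return boxes

if __name__ == "__main__":
    A = np.array([[0.0, 1.0], [-1.0, -0.2]])   # damped oscillator
    boxes = reach_boxes(A, np.array([0.9, -0.1]), np.array([1.1, 0.1]))
    print("box at t = 2.0:", boxes[-1])
```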

    The Significance of Evidence-based Reasoning for Mathematics, Mathematics Education, Philosophy and the Natural Sciences

    In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis:
    * Hilbert's epsilon-calculus
    * Gödel's omega-consistency
    * The Law of the Excluded Middle
    * Hilbert's omega-Rule
    * An Algorithmic omega-Rule
    * Gentzen's Rule of Infinite Induction
    * Rosser's Rule C
    * Markov's Principle
    * The Church-Turing Thesis
    * Aristotle's particularisation
    * Wittgenstein's perspective of constructive mathematics
    * An evidence-based perspective of quantification
    By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences, for mathematics, mathematics education, philosophy, and the natural sciences, of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.