
    Refactoring pattern matching

    Defining functions by pattern matching over the arguments is advantageous for understanding and reasoning, but it tends to expose the implementation of a datatype. Significant effort has been invested in tackling this loss of modularity; however, decoupling patterns from concrete representations while maintaining soundness of reasoning has been a challenge. Inspired by the development of invertible programming, we propose an approach to program refactoring based on a right-invertible language, rinv, in which every function has a right (or pre-) inverse. We show how this new design permits a smooth, incremental transition from programs with algebraic datatypes and pattern matching to ones with proper encapsulation, while maintaining simple and sound reasoning.
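
    The paper's rinv language is not shown here, but the underlying idea can be sketched in plain Python: pair a pattern-exposing view with a right inverse so client code keeps pattern matching while the concrete representation stays hidden. All names below (Seq, view, unview, total) are illustrative, not from the paper.

```python
# Minimal sketch, assuming a hidden tuple representation behind an abstract view.
# unview is a right inverse of view, i.e. view(unview(v)) == v, which is what
# keeps equational reasoning about the view sound after refactoring.
from dataclasses import dataclass

@dataclass(frozen=True)
class Seq:
    """Encapsulated sequence; the tuple representation could change freely."""
    _items: tuple = ()

@dataclass(frozen=True)
class Nil:                        # abstract view of the empty sequence
    pass

@dataclass(frozen=True)
class Cons:                       # abstract view: head plus remaining sequence
    head: int
    tail: Seq

def view(s: Seq):
    """Destructor: expose the abstract view of a sequence."""
    if not s._items:
        return Nil()
    return Cons(s._items[0], Seq(s._items[1:]))

def unview(v) -> Seq:
    """Constructor, a right inverse of view: view(unview(v)) == v."""
    if isinstance(v, Nil):
        return Seq(())
    return Seq((v.head,) + v.tail._items)

def total(s: Seq) -> int:
    # Clients pattern match on the view, never on the hidden representation.
    match view(s):
        case Nil():
            return 0
        case Cons(head=h, tail=t):
            return h + total(t)

assert view(unview(Cons(1, Seq((2, 3))))) == Cons(1, Seq((2, 3)))
assert total(Seq((1, 2, 3))) == 6
```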

    Problem Theory

    The Turing machine, as Turing himself presented it, models the calculations done by a person. This means that we can compute whatever any Turing machine can compute, and therefore we are Turing complete. The question addressed here is: why are we Turing complete? Being Turing complete also means that our brain somehow implements the function that a universal Turing machine implements. The point is that evolution achieved Turing completeness, so the explanation should be evolutionary, yet our explanation is mathematical. The trick is to introduce a mathematical theory of problems, under the basic assumption that solving more problems provides more survival opportunities. We therefore build a problem theory by fusing set theory and computing theory. We then construct a series of resolvers, each defined by its computing capacity, with the following property: all problems solved by a resolver are also solved by the next resolver in the series, provided a certain condition is satisfied. The last of these conditions is being Turing complete. This series defines a hierarchy of resolvers that can be seen as a framework for the evolution of cognition. The answer to our question would then be: to solve most problems. Along the way, the problem theory defines adaptation, perception, and learning, and it shows that there are just three ways to resolve any problem: routine, trial, and analogy. Most importantly, the theory demonstrates how problems can be used to found mathematics and computing on biology.
    Comment: 43 pages
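
    The abstract's claim that any problem is resolved by routine, trial, or analogy can be made concrete with a small sketch. The function and parameters below are hypothetical illustrations, not the paper's formal definitions.

```python
# Illustrative resolver trying the three resolution modes named in the abstract,
# in order: routine (reuse a stored solution), analogy (adapt the solution of a
# similar problem), and trial (search over candidate answers).
def resolve(problem, known_solutions, similar, candidates, check):
    # routine: the problem was solved before, so reuse the stored answer
    if problem in known_solutions:
        return known_solutions[problem]
    # analogy: adapt the answer of a similar, already-solved problem
    for other, adapt in similar.get(problem, []):
        if other in known_solutions:
            answer = adapt(known_solutions[other])
            if check(problem, answer):
                return answer
    # trial: enumerate candidate answers until one checks out
    for answer in candidates(problem):
        if check(problem, answer):
            return answer
    return None  # outside this resolver's computing capacity

# toy usage: "double 21" is solved by trial over a small candidate range
print(resolve(("double", 21), {}, {},
              lambda p: range(100),
              lambda p, a: a == 2 * p[1]))   # -> 42
```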

    Quantum advantage by relational queries about physically realizable equivalence classes

    Relational quantum queries are sometimes capable of effectively deciding between collections of mutually exclusive elementary cases without completely resolving and determining those individual instances. The set of mutually exclusive elementary cases is thereby effectively partitioned into equivalence classes pertinent to the respective query. In the second part of the paper, we review recent progress in theoretical certifications (relative to the assumptions made) of quantum value indeterminacy as a means to build quantum oracles for randomness.
    Comment: 8 pages, one figure; invited contribution to TopHPC2019, Tehran, Iran, April 22-25, 2019
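
    A textbook example of such a relational query is the Deutsch query, which decides the equivalence class of a one-bit function (constant vs. balanced, i.e. the value of f(0) XOR f(1)) with a single oracle call, without determining f(0) or f(1) individually. The NumPy state-vector sketch below illustrates that partition; it is a standard example, not the paper's construction.

```python
# Deutsch query on a 2-qubit state vector, basis ordering |x y> -> index 2x + y.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def deutsch(f):
    # oracle U_f |x, y> = |x, y XOR f(x)>, built as a permutation matrix
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    state = np.kron([1, 0], [0, 1]).astype(float)   # |0>|1>
    state = np.kron(H, H) @ state                   # superpose both inputs
    state = U @ state                               # single relational query
    state = np.kron(H, I) @ state                   # interfere
    p_first_is_1 = state[2] ** 2 + state[3] ** 2    # probability first qubit = 1
    return "balanced" if p_first_is_1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: x))      # balanced
print(deutsch(lambda x: 1 - x))  # balanced
```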

    A Numerical Approach to Virasoro Blocks and the Information Paradox

    We chart the breakdown of semiclassical gravity by analyzing the Virasoro conformal blocks to high numerical precision, focusing on the heavy-light limit corresponding to a light probe propagating in a BTZ black hole background. In the Lorentzian regime, we find empirically that the initial exponential time dependence of the blocks transitions to a universal $t^{-\frac{3}{2}}$ power-law decay. For the vacuum block the transition occurs at $t \approx \frac{\pi c}{6 h_L}$, confirming analytic predictions. In the Euclidean regime, due to Stokes phenomena, the naive semiclassical approximation fails completely in a finite region enclosing the 'forbidden singularities'. We emphasize that limitations on the reconstruction of a local bulk should ultimately stem from distinctions between semiclassical and exact correlators.
    Comment: 45 pages, 23 figures
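
    As a rough illustration of the claimed behaviour only, the toy script below builds a synthetic signal with an early exponential decay matched onto a $t^{-\frac{3}{2}}$ tail at the predicted time $t_\ast \approx \frac{\pi c}{6 h_L}$, and locates the crossover from the local log-log slope. The parameter values are arbitrary and the signal is synthetic, not an actual Virasoro block computation.

```python
# Toy crossover: exponential decay joined continuously to a t^(-3/2) tail at t*,
# with the transition detected from the slope d(log V)/d(log t) reaching -3/2.
import numpy as np

c, h_L = 30.0, 0.55                 # illustrative values, not from the paper
t_star = np.pi * c / (6.0 * h_L)    # predicted transition time, about 28.6

t = np.linspace(1.0, 10 * t_star, 4000)
early = np.exp(-t / t_star)                         # exponential regime
late = (t / t_star) ** -1.5 * np.exp(-1.0)          # matched power-law tail
signal = np.where(t < t_star, early, late)

slope = np.gradient(np.log(signal), np.log(t))      # local log-log slope
crossover = t[np.argmin(np.abs(slope + 1.5))]       # first point with slope -3/2
print(f"predicted t* = {t_star:.1f}, detected crossover ~ {crossover:.1f}")
```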

    Machine Learning for Fluid Mechanics

    The field of fluid mechanics is rapidly advancing, driven by unprecedented volumes of data from field measurements, experiments, and large-scale simulations at multiple spatiotemporal scales. Machine learning offers a wealth of techniques for extracting information from data that can be translated into knowledge about the underlying fluid mechanics. Moreover, machine learning algorithms can augment domain knowledge and automate tasks related to flow control and optimization. This article presents an overview of the history, current developments, and emerging opportunities of machine learning for fluid mechanics. It outlines fundamental machine learning methodologies and discusses their uses for understanding, modeling, optimizing, and controlling fluid flows. The strengths and limitations of these methods are addressed from the perspective of scientific inquiry that considers data as an inherent part of modeling, experimentation, and simulation. Machine learning provides a powerful information-processing framework that can enrich, and possibly even transform, current lines of fluid mechanics research and industrial applications.
    Comment: To appear in the Annual Review of Fluid Mechanics, 2020
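
    As a generic example of the data-driven workflow this review surveys (not code from the article), the snippet below applies proper orthogonal decomposition via the SVD to synthetic flow snapshots and reports the energy captured by the leading modes.

```python
# POD of synthetic snapshot data: subtract the mean flow, take the SVD, and
# inspect how much fluctuation energy the leading modes capture.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)          # spatial grid
t = np.linspace(0.0, 10.0, 80)                  # snapshot times

# synthetic velocity field: two dominant modes plus small measurement noise
snapshots = (np.outer(np.sin(x), np.cos(2.0 * t))
             + 0.5 * np.outer(np.sin(3.0 * x), np.sin(5.0 * t))
             + 0.01 * rng.standard_normal((x.size, t.size)))

mean_flow = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_flow, full_matrices=False)

energy = s**2 / np.sum(s**2)
print("energy captured by first two POD modes:", energy[:2].sum())  # close to 1
rank2 = U[:, :2] * s[:2] @ Vt[:2] + mean_flow    # low-rank reconstruction
```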