
    Cartographic Algorithms: Problems of Implementation and Evaluation and the Impact of Digitising Errors

    Cartographic generalisation remains one of the outstanding challenges in digital cartography and Geographical Information Systems (GIS). It is generally assumed that computerisation will lead to the removal of spurious variability introduced by the subjective decisions of individual cartographers. This paper demonstrates through an in‐depth study of a line simplification algorithm that computerisation introduces its own sources of variability. The algorithm, referred to as the Douglas‐Peucker algorithm in cartographic literature, has been widely used in image processing, pattern recognition and GIS for some 20 years. An analysis of this algorithm and study of some implementations in wide use identify the presence of variability resulting from the subjective decisions of software implementors. Spurious variability in software complicates the processes of evaluation and comparison of alternative algorithms for cartographic tasks. No doubt, variability in implementation could be removed by rigorous study and specification of algorithms. Such future work must address the presence of digitising error in cartographic data. Our analysis suggests that it would be difficult to adapt the Douglas‐Peucker algorithm to cope with digitising error without altering the method. Copyright © 1991, Wiley Blackwell. All rights reserved.
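    As background to the discussion, the core of the Douglas‐Peucker method is a short recursive procedure: keep the endpoints of a line, find the intermediate vertex farthest from the segment joining them, and recurse on both halves whenever that distance exceeds a tolerance. The sketch below is a minimal textbook rendering in Python (mine, not one of the implementations studied in the paper); the function names and the tuple representation of points are illustrative assumptions.

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    seg_len = math.hypot(dx, dy)
    if seg_len == 0.0:  # degenerate segment: fall back to point distance
        return math.hypot(x - x1, y - y1)
    # Twice the triangle area divided by the base length.
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / seg_len

def douglas_peucker(points, epsilon):
    """Recursively simplify a polyline, keeping vertices that deviate
    from the current anchor-floater segment by more than epsilon."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the segment joining the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # Split at the farthest vertex and simplify each half.
        left = douglas_peucker(points[: index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right  # avoid duplicating the split vertex
    return [points[0], points[-1]]
```

    Even this small sketch contains the kind of implementor decisions the paper identifies: whether the tolerance test is strict (`dmax > epsilon`) or not, how ties for the farthest vertex are broken, and how a degenerate segment with coincident endpoints is handled.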

    Layers of generality and types of generalization in pattern activities

    Pattern generalization is considered one of the prominent routes for introducing students to algebra. However, not all generalizations are algebraic. In the use of pattern generalization as a route to algebra, we, teachers and educators, thus have to remain vigilant in order not to confound algebraic generalizations with other forms of dealing with the general. But how can we distinguish between algebraic and non-algebraic generalizations? On epistemological and semiotic grounds, in this article I suggest a characterization of algebraic generalizations. This characterization helps to bring about a typology of algebraic and arithmetic generalizations. The typology is illustrated with classroom examples.
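    As a small illustration in the spirit of this distinction (my example; the article's typology is finer-grained), consider the pattern 3, 5, 7, 9, ...: one kind of rule stays at the level of term-to-term computation, while an algebraic generalization grasps a commonality in the figures and expresses an arbitrary term directly.

```latex
% Pattern of terms: 3, 5, 7, 9, ...
% Term-to-term (arithmetic-style) rule: relate each term to its neighbour.
t_{1} = 3, \qquad t_{n+1} = t_{n} + 2.
% Algebraic generalization: a direct rule for an arbitrary term,
% e.g. "each figure has two rows of n sticks plus one more":
t_{n} = 2n + 1.
```

    Roughly, only the second rule names the indeterminate n and operates on it analytically, which is what makes it algebraic on such accounts.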

    Manufacturing a mathematical group: a study in heuristics

    I examine the way a relevant conceptual novelty in mathematics, that is, the notion of group, has been constructed in order to show the kinds of heuristic reasoning that enabled its manufacturing. To this end, I examine salient aspects of the works of Lagrange, Cauchy, Galois and Cayley (Sect. 2). In more detail, I examine the seminal idea resulting from Lagrange’s heuristics and how Cauchy, Galois and Cayley develop it. This analysis shows us how new mathematical entities are generated, and also how what counts as a solution to a problem is shaped and changed. Finally, I argue that this case study shows us that we have to study inferential micro-structures (Sect. 3), that is, the ways similarities and regularities are sought, in order to understand how theoretical novelty is constructed and heuristic reasoning is put forward.
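    For orientation, the endpoint of the construction traced here, from Lagrange's permuted functions through Cauchy's substitutions and Galois's groups to Cayley's abstraction, is the familiar axiomatic notion, stated below in modern form (the formulation is mine, not quoted from the paper).

```latex
% A group is a set G with a binary operation satisfying:
(a \cdot b) \cdot c = a \cdot (b \cdot c) \quad \text{for all } a, b, c \in G, \\
\exists\, e \in G \ \text{such that } e \cdot a = a \cdot e = a \ \text{for all } a \in G, \\
\forall\, a \in G \ \exists\, a^{-1} \in G \ \text{such that } a \cdot a^{-1} = a^{-1} \cdot a = e.
```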

    The Manin conjecture in dimension 2

    These lecture notes describe the current state of affairs for Manin's conjecture in the context of del Pezzo surfaces. (Comment: 57 pages. These are a preliminary version of lecture notes for the "School and conference on analytic number theory", ICTP, Trieste, 23/04/07-11/05/07.)
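    For readers outside the area, the conjecture in question predicts an asymptotic count of rational points of bounded height. In its standard shape (standard notation, not quoted from the notes): for a suitable del Pezzo surface S over Q with anticanonical height H, and U the open subset obtained by deleting the exceptional curves,

```latex
N_{U,H}(B) \;=\; \#\{\, x \in U(\mathbf{Q}) : H(x) \le B \,\}
\;\sim\; c_{S,H}\, B (\log B)^{\rho(S)-1}
\qquad (B \to \infty),
```

    where \(\rho(S)\) is the rank of the Picard group of S and \(c_{S,H}\) is the constant predicted by Peyre.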

    Mathematics and language

    This essay considers the special character of mathematical reasoning, and draws on observations from interactive theorem proving and the history of mathematics to clarify the nature of formal and informal mathematical language. It proposes that we view mathematics as a system of conventions and norms that is designed to help us make sense of the world and reason efficiently. Like any designed system, it can perform well or poorly, and the philosophy of mathematics has a role to play in helping us understand the general principles by which it serves its purposes well.

    CryptOpt: Verified Compilation with Random Program Search for Cryptographic Primitives

    Most software domains rely on compilers to translate high-level code to multiple different machine languages, with performance not too much worse than what developers would have the patience to write directly in assembly language. However, cryptography has been an exception, where many performance-critical routines have been written directly in assembly (sometimes through metaprogramming layers). Some past work has shown how to do formal verification of that assembly, and other work has shown how to generate C code automatically along with formal proof, but with consequent performance penalties vs. the best-known assembly. We present CryptOpt, the first compilation pipeline that specializes high-level cryptographic functional programs into assembly code significantly faster than what GCC or Clang produce, with mechanized proof (in Coq) whose final theorem statement mentions little beyond the input functional program and the operational semantics of x86-64 assembly. On the optimization side, we apply randomized search through the space of assembly programs, with repeated automatic benchmarking on target CPUs. On the formal-verification side, we connect to the Fiat Cryptography framework (which translates functional programs into C-like IR code) and extend it with a new formally verified program-equivalence checker, incorporating a modest subset of known features of SMT solvers and symbolic-execution engines. The overall prototype is quite practical, e.g., producing new fastest-known implementations for the relatively new Intel i9 (12th generation) of finite-field arithmetic for both Curve25519 (part of the TLS standard) and the Bitcoin elliptic curve secp256k1.
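    The optimization loop is, at heart, randomized local search with the CPU itself as the cost oracle. The Python sketch below is a toy model of that control flow only (my illustration; CryptOpt actually mutates x86-64 assembly, and its formally verified equivalence checker replaces the input spot-check used here): candidates are mutated, semantics-changing mutations are rejected, and survivors are kept only if they benchmark faster.

```python
import random
import time

def run(program, x):
    """Execute a candidate 'program': here, just a list of unary callables."""
    for step in program:
        x = step(x)
    return x

def equivalent(candidate, reference, inputs):
    """Stand-in for a verified equivalence checker; this toy merely
    spot-checks agreement on sample inputs."""
    return all(run(candidate, x) == run(reference, x) for x in inputs)

def benchmark(program, inputs, repeats=1000):
    """Cost oracle: wall-clock time on the machine doing the search,
    mirroring CryptOpt's repeated benchmarking on target CPUs."""
    start = time.perf_counter()
    for _ in range(repeats):
        for x in inputs:
            run(program, x)
    return time.perf_counter() - start

def mutate(program):
    """One random rewrite. A real system would swap instructions,
    reassign registers, reschedule, etc.; here we swap adjacent steps."""
    mutated = list(program)
    if len(mutated) > 1:
        i = random.randrange(len(mutated) - 1)
        mutated[i], mutated[i + 1] = mutated[i + 1], mutated[i]
    return mutated

def random_search(reference, inputs, iterations=500):
    """Keep the fastest candidate found that is equivalent to the reference."""
    best, best_cost = reference, benchmark(reference, inputs)
    for _ in range(iterations):
        candidate = mutate(best)
        if not equivalent(candidate, reference, inputs):
            continue  # reject semantics-changing rewrites
        cost = benchmark(candidate, inputs)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best
```

    The design point worth noting is the division of labour: the search itself needs no correctness guarantees, because every accepted candidate must separately pass the equivalence check.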