
    Map Calculus in GIS: a proposal and demonstration

    This paper provides a new representation for fields (continuous surfaces) in Geographical Information Systems (GIS), based on the notion of spatial functions and their combinations. Following Tomlin's (1990) Map Algebra, the term 'Map Calculus' is used for this new representation. In Map Calculus, GIS layers are stored as functions, and new layers can be created as combinations of other functions. This paper explains the principles of Map Calculus and demonstrates the creation of function-based layers and their supporting management mechanism. The proposal is based on Church's (1941) Lambda Calculus and elements of functional computer languages (such as Lisp or Scheme).
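    To illustrate the layers-as-functions idea, here is a minimal Haskell sketch (not the paper's implementation; the layer names and formulas are invented for illustration):

        -- A layer is a function from coordinates to values; new layers are
        -- pointwise combinations of existing ones, evaluated only on demand.
        type Coord = (Double, Double)
        type Layer = Coord -> Double

        elevation, rainfall :: Layer
        elevation (x, y) = 100 + 0.5 * x - 0.2 * y  -- stand-in formula
        rainfall  (x, y) = 30 + 0.1 * y             -- stand-in formula

        combine :: (Double -> Double -> Double) -> Layer -> Layer -> Layer
        combine f a b p = f (a p) (b p)

        -- A derived layer: no grid is materialized until a point is queried.
        wetness :: Layer
        wetness = combine (/) rainfall elevation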

    Fourier Series Formalization in ACL2(r)

    We formalize some basic properties of Fourier series in the logic of ACL2(r), which is a variant of ACL2 that supports reasoning about the real and complex numbers by way of non-standard analysis. More specifically, we extend a framework for formally evaluating definite integrals of real-valued, continuous functions using the Second Fundamental Theorem of Calculus. Our extended framework is also applied to functions containing free arguments. Using this framework, we are able to prove the orthogonality relationships between trigonometric functions, which are the essential properties in Fourier series analysis. The sum rule for definite integrals of indexed sums is also formalized by applying the extended framework along with the First Fundamental Theorem of Calculus and the sum rule for differentiation. The Fourier coefficient formulas of periodic functions are then formalized from the orthogonality relations and the sum rule for integration. Consequently, the uniqueness of Fourier sums is a straightforward corollary. We also present our formalization of the sum rule for definite integrals of infinite series in ACL2(r). Part of this task is to prove the Dini Uniform Convergence Theorem and the continuity of a limit function under certain conditions. A key technique in our proofs of these theorems is to apply the overspill principle from non-standard analysis. Comment: In Proceedings ACL2 2015, arXiv:1509.0552
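    For reference, the orthogonality relations the abstract refers to are the standard ones (stated here in ordinary notation for period 2π, not in ACL2(r) syntax): for integers m, n ≥ 1,

        \int_{-\pi}^{\pi} \sin(mx)\cos(nx)\,dx = 0, \qquad
        \int_{-\pi}^{\pi} \cos(mx)\cos(nx)\,dx
          = \int_{-\pi}^{\pi} \sin(mx)\sin(nx)\,dx = \pi\,\delta_{mn},

    from which the Fourier coefficient formulas follow:

        a_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\cos(nx)\,dx, \qquad
        b_n = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\sin(nx)\,dx.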

    Linear lambda terms as invariants of rooted trivalent maps

    The main aim of the article is to give a simple and conceptual account of the correspondence (originally described by Bodini, Gardy, and Jacquot) between α-equivalence classes of closed linear lambda terms and isomorphism classes of rooted trivalent maps on compact oriented surfaces without boundary, as an instance of a more general correspondence between linear lambda terms with a context of free variables and rooted trivalent maps with a boundary of free edges. We begin by recalling a familiar diagrammatic representation for linear lambda terms, while at the same time explaining how such diagrams may be read formally as a notation for endomorphisms of a reflexive object in a symmetric monoidal closed (bi)category. From there, the "easy" direction of the correspondence is a simple forgetful operation which erases annotations on the diagram of a linear lambda term to produce a rooted trivalent map. The other direction views linear lambda terms as complete invariants of their underlying rooted trivalent maps, reconstructing the missing information through a Tutte-style topological recurrence on maps with free edges. As an application in combinatorics, we use this analysis to enumerate bridgeless rooted trivalent maps as linear lambda terms containing no closed proper subterms, and conclude by giving a natural reformulation of the Four Color Theorem as a statement about typing in lambda calculus. Comment: accepted author manuscript, posted six months after publication
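    As background for the combinatorial statements above: a lambda term is linear when every bound variable is used exactly once. A minimal Haskell sketch of that check (illustrative only, not the paper's machinery):

        data Term = Var String | Lam String Term | App Term Term

        -- Count free occurrences of a variable in a term.
        occurs :: String -> Term -> Int
        occurs x (Var y)   = if x == y then 1 else 0
        occurs x (Lam y t) = if x == y then 0 else occurs x t
        occurs x (App t u) = occurs x t + occurs x u

        -- Linear: each lambda binds a variable occurring exactly once.
        linear :: Term -> Bool
        linear (Var _)   = True
        linear (Lam x t) = occurs x t == 1 && linear t
        linear (App t u) = linear t && linear u

    For example, \x.x and \x.\y.x y are linear, while \x.x x is not.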

    Variable types for meaning assembly: a logical syntax for generic noun phrases introduced by most

    This paper proposes a way to compute the meanings associated with sentences containing generic noun phrases corresponding to the generalized quantifier most. We call these generics 'specimens'; they resemble stereotypes or prototypes in lexical semantics. The meanings are viewed as logical formulae that can thereafter be interpreted in your favourite models. To do so, we depart significantly from the dominant Fregean view with a single untyped universe. Indeed, our proposal adopts type theory with some hints from Hilbert's ε-calculus (Hilbert, 1922; Avigad and Zach, 2008) and from medieval philosophy, see e.g. de Libera (1993, 1996). Our type-theoretic analysis bears some resemblance to ongoing work in lexical semantics (Asher 2011; Bassac et al. 2010; Moot, Prévot and Retoré 2011). Our model also applies to classical examples involving a class, or a generic element of this class, which is not uttered but provided by the context. An outcome of this study is that, in the minimalism-contextualism debate, see Conrad (2011), if one adopts a type-theoretical view, terms encode the purely semantic meaning component while their typing is pragmatically determined.
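    For contrast, the classical untyped generalized-quantifier reading of most, which the paper departs from, is the standard cardinality condition

        \mathrm{most}(A, B) \iff |A \cap B| > |A \setminus B|,

    so "most birds fly" is true when the flying birds outnumber the non-flying ones; the paper instead works with typed specimen terms standing for the generic individual.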

    An Embedding of the BSS Model of Computation in Light Affine Lambda-Calculus

    This paper brings together two lines of research: implicit characterization of complexity classes by Linear Logic (LL) on the one hand, and computation over an arbitrary ring in the Blum-Shub-Smale (BSS) model on the other. Given a fixed ring structure K, we define an extension of Terui's light affine lambda-calculus typed in LAL (Light Affine Logic) with a basic type for K. We show that this calculus captures the polynomial-time function class FP(K): every typed term can be evaluated in polynomial time and, conversely, every polynomial-time BSS machine over K can be simulated in this calculus. Comment: 11 pages. A preliminary version appeared as Research Report IAC CNR Roma, N.57 (11/2004), November 2004
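    The BSS idea of computing over an arbitrary ring K can be sketched in a few lines of Haskell (an illustration of ring-generic polynomial evaluation only, not of the LAL-typed calculus itself; Num stands in for the ring structure):

        newtype Poly k = Poly [k]   -- coefficients, constant term first

        -- One arithmetic "machine step" over K: Horner evaluation.
        evalPoly :: Num k => Poly k -> k -> k
        evalPoly (Poly cs) x = foldr (\c acc -> c + x * acc) 0 cs

        -- The same term computes over any K: here the reals and the integers.
        example :: (Double, Integer)
        example = (evalPoly (Poly [1, 0, 3]) 2.0, evalPoly (Poly [1, 0, 3]) 2)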

    A topological approach to non-Archimedean Mathematics

    Non-Archimedean mathematics (in particular, nonstandard analysis) makes it possible to construct useful models for studying certain phenomena arising in PDEs; for example, it allows one to construct generalized solutions of differential equations and variational problems that have no classical solution. In this paper we introduce certain notions of non-Archimedean mathematics (in particular, of nonstandard analysis) by means of an elementary topological approach; in particular, we construct non-Archimedean extensions of the reals as appropriate topological completions of ℝ. Our approach is based on the notion of Λ-limit for real functions, and it is called Λ-theory. It can be seen as a topological generalization of the α-theory presented in [BDN2003], and as an alternative topological presentation of the ultrapower construction of nonstandard extensions (in the sense of [Keisler]). To motivate the use of Λ-theory for applications we show how to use it to solve a minimization problem of the calculus of variations (one that has no classical solutions) by means of a particular family of generalized functions, called ultrafunctions. Comment: 22 pages
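    In standard terms (independent of the paper's Λ-theory notation), an extension field K ⊇ ℝ is non-Archimedean when it contains a positive infinitesimal:

        \exists\, \varepsilon \in K : \quad 0 < \varepsilon < \frac{1}{n} \quad \text{for every } n \in \mathbb{N}^{+},

    equivalently, K contains the infinite element ω = 1/ε; generalized functions such as ultrafunctions live in structures of this kind.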

    The exp-log normal form of types

    Lambda calculi with algebraic data types lie at the core of functional programming languages and proof assistants, but they conceal at least two fundamental theoretical problems already in the presence of the simplest non-trivial data type, the sum type. First, we do not know of an explicit and implemented algorithm for deciding beta-eta-equality of terms, even though the first decidability results were proven two decades ago. Second, it is not clear how to decide when two types are essentially the same, i.e. isomorphic, in spite of the meta-theoretic results on decidability of the isomorphism. In this paper, we present the exp-log normal form of types, derived from the representation of exponential polynomials via the unary exponential and logarithmic functions, to which any type built from arrows, products, and sums can be isomorphically mapped. The type normal form can be used as a simple heuristic for deciding type isomorphism, thanks to the fact that it is a systematic application of the high-school identities. We then show that the type normal form allows us to reduce the standard beta-eta equational theory of the lambda calculus to a specialized version of itself, while preserving the completeness of equality on terms. We end by describing an alternative representation of normal terms of the lambda calculus with sums, together with a Coq-implemented converter into/from our new term calculus. The difference from the only other previously implemented heuristic for deciding interesting instances of eta-equality, due to Balat, Di Cosmo, and Fiore, is that we exploit the type information of terms substantially, and this often allows us to obtain a canonical representation of terms without performing sophisticated term analyses.
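    The "high-school identities" behind the exp-log normal form are the familiar type isomorphisms. A minimal Haskell sketch of two of them (the function pairs below witness the isomorphisms; this is not the paper's normalization algorithm):

        -- c^(a*b) = (c^b)^a : currying.
        curryIso :: ((a, b) -> c) -> (a -> b -> c)
        curryIso f a b = f (a, b)

        uncurryIso :: (a -> b -> c) -> ((a, b) -> c)
        uncurryIso f (a, b) = f a b

        -- c^(a+b) = c^a * c^b : case analysis on sums.
        distIso :: (Either a b -> c) -> (a -> c, b -> c)
        distIso f = (f . Left, f . Right)

        undistIso :: (a -> c, b -> c) -> Either a b -> c
        undistIso (g, h) = either g h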

    Analyzing Individual Proofs as the Basis of Interoperability between Proof Systems

    We describe the first results of a project of analyzing in which theories formal proofs can be expressed. We use this analysis as the basis of interoperability between proof systems. Comment: In Proceedings PxTP 2017, arXiv:1712.0089

    ASMs and Operational Algorithmic Completeness of Lambda Calculus

    We show that lambda calculus is a computation model which can simulate, step by step, any sequential deterministic algorithm for any computable function over integers, words, or any other datatype. More formally, given an algorithm over a family of computable functions (taken as primitive tools, i.e., a kind of oracle for the algorithm), for every large enough constant K, each computation step of the algorithm can be simulated by exactly K successive reductions in a natural extension of lambda calculus with constants for the functions in the considered family. The proof is based on a fixed-point technique in lambda calculus and on Gurevich's sequential Thesis, which allows sequential deterministic algorithms to be identified with Abstract State Machines. This extends to algorithms for partial computable functions in such a way that finite computations ending with exceptions are associated with finite reductions leading to terms with a particular, very simple form. Comment: 37 pages
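    The step-by-step simulation has the rough shape of fixed-point iteration of a transition function. A minimal Haskell sketch (the machine below is hypothetical, and this illustrates only the fixed-point technique, not the K-reductions-per-step accounting):

        import Data.Function (fix)

        type State = Int

        -- Hypothetical one-step transition: Left = halt, Right = continue.
        step :: State -> Either State State
        step s = if s >= 10 then Left s else Right (s + 1)

        -- The run function as a fixed point of "do one step, then recurse".
        run :: State -> State
        run = fix (\self s -> either id self (step s))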