
    Introduction to Projective Arithmetics

    Science and mathematics help people better understand the world, eliminating many inconsistencies, fallacies, and misconceptions. One such misconception concerns the arithmetic of natural numbers, which is extremely important both for science and for everyday life. People believe that their counting is governed by the rules of the conventional arithmetic, and that other arithmetics of natural numbers therefore do not and cannot exist. However, this popular picture of the natural numbers is wrong. In many situations, people have to use, and do implicitly use, rules of counting and operating that differ from the rules and operations of the conventional arithmetic. This is a consequence of the existing diversity in nature and society; to represent this diversity correctly, people have to employ different arithmetics explicitly. To mark the distinction, we call the conventional arithmetic the Diophantine arithmetic, while the other arithmetics are called non-Diophantine. There are two big families of non-Diophantine arithmetics: projective arithmetics and dual arithmetics (Burgin, 1997). In this work, we give an exposition of projective arithmetics, presenting their properties and also considering a more general mathematical structure called a projective prearithmetic. The Diophantine arithmetic is a member of this parametric family: its parameter is the identity function f(x) = x. In conclusion, it is demonstrated how non-Diophantine arithmetics may be utilized beyond mathematics and how they allow one to eliminate inconsistencies and contradictions encountered by other researchers.
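
    The abstract's parametric family can be sketched concretely: in a projective arithmetic, operations on numbers are induced by a function f and its inverse, and f(x) = x recovers the conventional arithmetic. The sketch below is a minimal illustration under that assumption; the function names are ours, not the paper's.

```python
# A minimal sketch of a projective arithmetic: operations induced by a
# "projection" function f with inverse f_inv,
#   a (+) b = f_inv(f(a) + f(b)),   a (x) b = f_inv(f(a) * f(b)).

def projective_ops(f, f_inv):
    """Return (add, mul) induced by f and its inverse f_inv."""
    def add(a, b):
        return f_inv(f(a) + f(b))
    def mul(a, b):
        return f_inv(f(a) * f(b))
    return add, mul

# With the identity function we recover the conventional (Diophantine) arithmetic.
add_id, mul_id = projective_ops(lambda x: x, lambda x: x)

# With f(x) = x**2 on non-negative reals, "addition" becomes the Euclidean norm.
add_sq, _ = projective_ops(lambda x: x * x, lambda x: x ** 0.5)

print(add_id(3, 4))   # 7
print(add_sq(3, 4))   # 5.0
```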

    Interval Superposition Arithmetic for Guaranteed Parameter Estimation

    The problem of guaranteed parameter estimation (GPE) consists in enclosing the set of all possible parameter values such that the model predictions match the corresponding measurements within prescribed error bounds. One of the bottlenecks in GPE algorithms is the construction of enclosures for the image sets of factorable functions. In this paper, we introduce a novel set-based computing method called interval superposition arithmetic (ISA) for the construction of enclosures of such image sets, and we describe its use in GPE algorithms. The main benefits of using ISA in the context of GPE lie in the improved accuracy of the enclosures and in the implied reduction of the number of set-membership tests in the set-inversion algorithm.
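
    For context, the baseline that ISA refines is ordinary interval arithmetic, which encloses the image set of a factorable function but can be loose due to the dependency problem. The sketch below shows that baseline, not the superposition arithmetic of the paper; the class and example function are ours.

```python
# Plain interval arithmetic: a guaranteed but possibly loose enclosure of the
# image set of a factorable function, evaluated operation by operation.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Enclose the image of f(x) = x * (x - 1) over x in [0, 1].
x = Interval(0.0, 1.0)
enclosure = x * (x - Interval(1.0, 1.0))
print(enclosure)  # [-1.0, 0.0] -- valid but loose; the true range is [-0.25, 0.0]
```

    The overestimation comes from treating the two occurrences of x as independent; tighter arithmetics such as ISA aim to reduce exactly this kind of slack.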

    Automated construction of U(1)-invariant matrix-product operators from graph representations

    We present an algorithmic construction scheme for matrix-product-operator (MPO) representations of arbitrary U(1)-invariant operators whenever the local structure can be expressed in terms of a finite-state machine (FSM). Given a set of local operators as building blocks, the method automates two major steps in constructing a U(1)-invariant MPO representation: (i) the bookkeeping of auxiliary bond-index shifts arising from the application of operators that change the local quantum numbers, and (ii) the appearance of phase factors due to particular commutation rules. The automation is achieved by post-processing the operator strings generated by the FSM. Consequently, MPO representations of various types of U(1)-invariant operators can be constructed generically in MPS algorithms, reducing the need for expensive MPO arithmetics. This is demonstrated by generating arbitrary products of operators in terms of FSMs, from which we obtain exact MPO representations for the variance of the Hamiltonian of an S=1 Heisenberg chain.
    Comment: resubmitted version with minor corrections
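
    The FSM picture behind MPO construction can be illustrated on a toy scale: the edges of the machine carry local operators, and the operator strings of a translation-invariant sum are exactly the accepted paths. The states, labels, and example Hamiltonian term below are our illustrative choices, not taken from the paper; for H = Σ_i Sz_i Sz_{i+1} the machine is 0 --I--> 0, 0 --Sz--> 1, 1 --Sz--> 2, 2 --I--> 2.

```python
# Enumerate the operator strings of a toy FSM: each length-N path from the
# start state to the accept state is one term of the operator sum.

def operator_strings(edges, start, accept, n_sites):
    """Enumerate length-n_sites paths start -> accept; each path is one string."""
    def walk(state, string):
        if len(string) == n_sites:
            if state == accept:
                yield tuple(string)
            return
        for (s, t), op in edges.items():
            if s == state:
                yield from walk(t, string + [op])
    return list(walk(start, []))

# FSM for sum_i Sz_i Sz_{i+1}: identities pad the left and right of the pair.
edges = {(0, 0): "I", (0, 1): "Sz", (1, 2): "Sz", (2, 2): "I"}
terms = operator_strings(edges, start=0, accept=2, n_sites=3)
print(terms)  # [('I', 'Sz', 'Sz'), ('Sz', 'Sz', 'I')]
```

    The paper's contribution then lies in post-processing such strings to track U(1) bond-index shifts and commutation phases, which this sketch does not attempt.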

    Model Theory of Ultrafinitism I: Fuzzy Initial Segments of Arithmetics

    This article is the first of an intended series of works on the model theory of Ultrafinitism. It is roughly divided into two parts. The first addresses some of the issues related to ultrafinitistic programs, as well as some of the core ideas proposed thus far. The second part presents a model of ultrafinitistic arithmetics based on the notion of fuzzy initial segments of the standard series of natural numbers. We also introduce a proof theory and a semantics for ultrafinitism through which feasibly consistent theories can be treated on the same footing as their classically consistent counterparts. We conclude with a brief sketch of a foundational program that aims at reproducing the transfinite within the finite realm.
    Comment: 31 pages, Tennenbaum Memorial invited talk

    Pushing the Limits of Encrypted Databases with Secure Hardware

    Encrypted databases have been studied for more than 10 years and are quickly emerging as a critical technology for the cloud. The current state of the art is to use property-preserving encryption techniques (e.g., deterministic encryption) to protect the confidentiality of the data and support query processing at the same time. Unfortunately, these techniques have many limitations. Recently, trusted computing platforms (e.g., Intel SGX) have emerged as an alternative for implementing encrypted databases. This paper demonstrates some vulnerabilities and limitations of this technology, but it also shows how to make the best use of it in order to improve confidentiality, functionality, and performance.
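
    The property-preserving idea the abstract mentions can be sketched in a few lines: with a deterministic scheme, equal plaintexts produce equal ciphertexts, so a server can answer equality queries without the key. The sketch below uses HMAC-SHA256 as a stand-in deterministic keyed function (a real deterministic encryption scheme would additionally be decryptable); the key and data are illustrative.

```python
# Deterministic tokens enable equality matching over "encrypted" data:
# the server compares opaque tokens and never sees plaintext values.

import hashlib
import hmac

KEY = b"client-secret-key"  # held by the client only (illustrative)

def det_token(value: str) -> str:
    """Deterministic keyed token; equal inputs give equal outputs."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# Client tokenizes a column before upload.
encrypted_rows = [det_token(city) for city in ["Berlin", "Paris", "Berlin"]]

# Server answers "WHERE city = ?" given only the token of the query constant.
query_token = det_token("Berlin")
matches = [i for i, tok in enumerate(encrypted_rows) if tok == query_token]
print(matches)  # [0, 2]
```

    The same example also shows the limitation the paper alludes to: equality patterns leak to the server, which is one motivation for hardware-based alternatives such as SGX.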

    About the Chasm Separating the Goals of Hilbert's Consistency Program from the Second Incompleteness Theorem

    We have published several articles about generalizations of, and boundary-case exceptions to, the Second Incompleteness Theorem during the last 25 years. The current paper will review some of our prior results and also introduce an `enriched' refinement of semantic tableaux deduction. While the Second Incompleteness Theorem is a strong result, we will emphasize that its boundary-case exceptions are germane to Global Warming's threat, because our systems can possess simultaneous knowledge of their own consistency together with an understanding of the \Pi_1 implications of Peano Arithmetic.
    Comment: The bibliography section of this article contains citations to all of Willard's major papers prior to 2018 about logic

    Non-redundant random generation from weighted context-free languages

    We address the non-redundant random generation of k words of length n from a context-free language, while additionally avoiding a predefined set of words. We study the limits of a rejection-based approach, whose time complexity is shown to grow exponentially in k in some cases. We propose an alternative recursive algorithm whose careful implementation allows for the non-redundant generation of k words of length n in O(kn log n) arithmetic operations after a precomputation of O(n) numbers. The overall complexity is therefore dominated by the generation of the k words, and the non-redundancy comes at a negligible cost.
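
    The rejection-based approach whose limits the paper studies can be sketched directly: draw words uniformly and reject any word already produced or forbidden. The toy language below (all binary words of a fixed length) is our illustrative stand-in for a context-free language; as k approaches the number of available words, rejections blow up, which is the degenerate behaviour the abstract mentions.

```python
# Rejection-based non-redundant sampling: redraw until k distinct,
# non-forbidden words have been produced.

import random

def rejection_sample(k, n, forbidden=frozenset(), seed=0):
    """Return k distinct binary words of length n, plus the rejection count."""
    rng = random.Random(seed)
    seen, out, rejections = set(forbidden), [], 0
    while len(out) < k:
        w = "".join(rng.choice("01") for _ in range(n))
        if w in seen:
            rejections += 1   # duplicate or forbidden: redraw
        else:
            seen.add(w)
            out.append(w)
    return out, rejections

# Asking for 14 of the 16 binary words of length 4 already forces many redraws.
words, rejections = rejection_sample(k=14, n=4)
print(len(words), len(set(words)))  # 14 14
```

    The recursive algorithm proposed in the paper avoids this blow-up by never generating a word twice in the first place.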

    RCF1: Theories of PR Maps and Partial PR Maps

    We give the categorical theory PR of Primitive Recursion a logically simple, algebraic presentation, via equations between maps, plus one genuine Horner type schema, namely Freyd's uniqueness of the initialised iterated. Free variables are introduced - formally - as other names for projections. Predicates \chi: A -> 2 admit interpretation as (formal) objects {A|\chi} of a surrounding theory PR_A = PR + (abstr): schema (abstr) formalises this predicate abstraction into additional objects. The categorical theory P\hat{R}_A \sqsupset PR_A \sqsupset PR then is the theory of formally partial PR maps, with the theory PR_A embedded. This theory P\hat{R}_A bears the structure of a (still) diagonal monoidal category. It is equivalent to "the" categorical theory of \mu-recursion (and of while loops), viewed as partial PR maps. The present approach to partial maps thus sheds new light on Church's Thesis, "embedded" into a free-variables, formally variable-free (categorical) framework.
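
    The iteration schema at the heart of this presentation can be sketched operationally: the initialised iterated of a map g with initial value a is the unique h with h(0) = a and h(n+1) = g(h(n)), and from the successor map alone this yields the usual primitive-recursive arithmetic. The function names below are our illustration, not the paper's notation.

```python
# Freyd-style initialised iteration: h(0) = a, h(n + 1) = g(h(n)).

def iterated(g, a):
    """Return the initialised iterated h of g with h(0) = a."""
    def h(n):
        x = a
        for _ in range(n):
            x = g(x)
        return x
    return h

succ = lambda x: x + 1

# Addition and multiplication built from iteration alone.
add = lambda m, n: iterated(succ, m)(n)                 # add(m, n) = succ^n(m)
mul = lambda m, n: iterated(lambda x: add(x, m), 0)(n)  # iterated addition

print(add(3, 4), mul(3, 4))  # 7 12
```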

    In whose mind is Mathematics an "a priori cognition"?

    According to the philosopher Kant, Mathematics is an "a priori cognition". Kant's assumption, together with the unsolvability of Hilbert's 10th problem, implies an astonishing result.
    Comment: Philosophy of Mathematics, 11 pages

    A recurrence scheme for least-square optimized polynomials

    A recurrence scheme is defined for the numerical determination of high-degree polynomial approximations to functions such as, for instance, inverse powers near zero. As an example, polynomials needed in the two-step multi-boson (TSMB) algorithm for fermion simulations are considered. For the polynomials needed in TSMB, a code in C is provided which is easily applicable to polynomial degrees of several thousands.
    Comment: 13 pages, 1 figure
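
    The kind of target function involved can be illustrated on a small scale: a least-squares polynomial fit to an inverse power on an interval bounded away from zero. The sketch below solves the normal equations directly in pure Python rather than using the paper's recurrence scheme (which exists precisely because the direct approach becomes numerically unstable at high degree); the grid, interval, and degree are our illustrative choices.

```python
# Least-squares polynomial approximation of f(x) = 1/x on [0.5, 1] via the
# normal equations, solved by Gaussian elimination with partial pivoting.

def lsq_poly(f, a, b, degree, m=200):
    """Coefficients c[0..degree] minimising sum_j (p(x_j) - f(x_j))^2 on a grid."""
    xs = [a + (b - a) * j / (m - 1) for j in range(m)]
    d = degree + 1
    # Normal equations A c = r with A[i][k] = sum x^(i+k), r[i] = sum f(x) x^i.
    A = [[sum(x ** (i + k) for x in xs) for k in range(d)] for i in range(d)]
    r = [sum(f(x) * x ** i for x in xs) for i in range(d)]
    for col in range(d):                       # forward elimination
        piv = max(range(col, d), key=lambda i: abs(A[i][col]))
        A[col], A[piv] = A[piv], A[col]
        r[col], r[piv] = r[piv], r[col]
        for i in range(col + 1, d):
            t = A[i][col] / A[col][col]
            for k in range(col, d):
                A[i][k] -= t * A[col][k]
            r[i] -= t * r[col]
    c = [0.0] * d                              # back substitution
    for i in reversed(range(d)):
        c[i] = (r[i] - sum(A[i][k] * c[k] for k in range(i + 1, d))) / A[i][i]
    return c

c = lsq_poly(lambda x: 1.0 / x, 0.5, 1.0, degree=3)
p = lambda x: sum(ck * x ** k for k, ck in enumerate(c))
err = max(abs(p(x) - 1.0 / x) for x in [0.5 + 0.005 * j for j in range(101)])
print(err < 0.01)  # the degree-3 fit tracks 1/x closely on [0.5, 1]
```

    At degrees in the thousands, the monomial normal equations above are hopelessly ill-conditioned, which is why the paper works with a recurrence for orthogonal polynomials instead.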