
    ENIGMA: Efficient Learning-based Inference Guiding Machine

    ENIGMA is a learning-based method for guiding given-clause selection in saturation-based theorem provers. Clauses from many proof searches are classified as positive or negative based on their participation in the proofs. An efficient classification model is trained on this data, using a fast feature-based characterization of the clauses. The learned model is then tightly linked with the core prover and used as the basis of a new parameterized evaluation heuristic that provides fast ranking of all generated clauses. The approach is evaluated on the E prover and the CASC 2016 AIM benchmark, showing a large increase in E's performance.
    Comment: Submitted to LPAR 201
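As a sketch of the pipeline this abstract describes (featurize clauses, train on positive/negative proof participation, rank newly generated clauses), here is a minimal stand-in. It uses invented clauses, symbol-count features, and an online perceptron rather than the richer term-walk features and off-the-shelf learners of the actual system:

```python
from collections import Counter

def clause_features(clause: str) -> Counter:
    # Hypothetical simplification of feature-based characterization:
    # count the function/predicate/variable symbols occurring in the clause.
    tokens = clause.replace("(", " ( ").split()
    symbols = [tok.strip("(),~") for tok in tokens]
    return Counter(s for s in symbols if s and s not in ("(", "|", "="))

def train_perceptron(examples, epochs=10):
    # examples: list of (clause, label), label +1 = in a proof, -1 = not.
    w = Counter()
    for _ in range(epochs):
        for clause, label in examples:
            feats = clause_features(clause)
            score = sum(w[f] * v for f, v in feats.items())
            if score * label <= 0:           # misclassified: perceptron update
                for f, v in feats.items():
                    w[f] += label * v
    return w

def rank(clauses, w):
    # Higher score = more promising clause for given-clause selection.
    return sorted(
        clauses,
        key=lambda c: -sum(w[f] * v for f, v in clause_features(c).items()),
    )

# Invented training data: positive clauses mention p/f, negatives mention r.
examples = [("p(f(X)) | q(X)", 1), ("r(a) | r(b)", -1),
            ("p(f(a))", 1), ("r(X)", -1)]
w = train_perceptron(examples)
ranked = rank(["r(c)", "p(f(c))"], w)
```

The learned weights favor symbols that co-occurred with proof participation, so `p(f(c))` is ranked ahead of `r(c)`.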

    On the relationship between plane and solid geometry

    Traditional geometry concerns itself with planimetric and stereometric considerations, which are at the root of the division between plane and solid geometry. To raise the issue of the relation between these two areas brings with it a host of different problems that pertain to mathematical practice, epistemology, semantics, ontology, methodology, and logic. In addition, issues of psychology and pedagogy are also important here. To our knowledge there is no single contribution that studies in detail even one of the aforementioned area

    The Singularity Problem for Space-Times with Torsion

    The problem of a rigorous theory of singularities in space-times with torsion is addressed. We define geodesics as curves whose tangent vector moves by parallel transport. This is different from what other authors have done, because their definition of geodesics only involves the Christoffel connection, though studying theories with torsion. We propose a preliminary definition of singularities which is based on timelike or null geodesic incompleteness, even though for theories with torsion the paths of particles are not geodesics. The study of the geodesic equation for cosmological models with torsion shows that the definition has physical relevance. It can also be motivated, as done in the literature, by remarking that the causal structure of a space-time with torsion does not get changed with respect to general relativity. We then show how to extend Hawking's singularity theorem without causality assumptions to the space-time of the ECSK theory. This is achieved by studying the generalized Raychaudhuri equation in the ECSK theory, the conditions for the existence of conjugate points, and properties of maximal timelike geodesics. Hawking's theorem can be generalized, provided the torsion tensor obeys some conditions. Thus our result can also be interpreted as a no-singularity theorem if these additional conditions are not satisfied. In other words, it turns out that the occurrence of singularities in closed cosmological models based on the ECSK theory is less generic than in general relativity. Our work is to be compared with previous papers in the literature. There are some relevant differences, because we rely on a different definition of geodesics, we keep the field equations of the ECSK theory in their original form rather than casting them in a form similar to general relativity with a modified energy-momentum tensor,
    Comment: 17 pages, plain-tex, published in Nuovo Cimento B, volume 105, pages 75-90, year 199
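The definitional point at issue can be made explicit. Writing the full affine connection as the Christoffel symbols plus the contorsion tensor (sign conventions for the contorsion vary across the literature), the curves used here are the autoparallels of the full connection:

```latex
% Full connection = Christoffel symbols + contorsion tensor K
% (K is built from the torsion tensor; sign conventions vary):
\Gamma^{\lambda}{}_{\mu\nu} = \tilde{\Gamma}^{\lambda}{}_{\mu\nu} + K^{\lambda}{}_{\mu\nu}

% Geodesics as defined in the abstract: the tangent u^{\mu} = dx^{\mu}/ds
% is parallel-transported with the FULL connection,
\frac{du^{\lambda}}{ds} + \Gamma^{\lambda}{}_{\mu\nu}\, u^{\mu} u^{\nu} = 0

% Since u^{\mu} u^{\nu} is symmetric in \mu\nu, only the part of K symmetric
% in its lower indices contributes; the alternative definition criticized
% above keeps only the Christoffel term and drops K entirely:
\frac{du^{\lambda}}{ds}
  + \tilde{\Gamma}^{\lambda}{}_{\mu\nu}\, u^{\mu} u^{\nu}
  + K^{\lambda}{}_{(\mu\nu)}\, u^{\mu} u^{\nu} = 0
```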

    A Labelled Analytic Theorem Proving Environment for Categorial Grammar

    We present a system for the investigation of computational properties of categorial grammar parsing based on a labelled analytic tableaux theorem prover. This proof method allows us to take a modular approach, in which the basic grammar can be kept constant, while a range of categorial calculi can be captured by assigning different properties to the labelling algebra. The theorem proving strategy is particularly well suited to the treatment of categorial grammar, because it allows us to distribute the computational cost between the algorithm which deals with the grammatical types and the algebraic checker which constrains the derivation.Comment: 11 pages, LaTeX2e, uses examples.sty and a4wide.st
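The modularity described above (one proof engine, interchangeable labelling algebras) can be illustrated in drastically simplified form. The toy below is a plain forward-application parser with string labels, not the paper's labelled tableaux method; the grammar, the labels, and the bounded algebra are invented:

```python
# Toy illustration of a pluggable labelling algebra (NOT the paper's
# tableaux system): forward application  X/Y : a  +  Y : b  =>  X : algebra(a, b),
# where the algebra may veto a combination by returning None.

def reduce_once(seq, algebra):
    for i in range(len(seq) - 1):
        (t1, a), (t2, b) = seq[i], seq[i + 1]
        if t1.endswith("/" + t2):                  # type X/Y followed by Y
            label = algebra(a, b)
            if label is not None:                  # algebraic checker approves
                return seq[:i] + [(t1[:-len(t2) - 1], label)] + seq[i + 2:]
    return None

def derives(seq, goal, algebra):
    # Repeatedly reduce until one constituent remains (or we get stuck).
    while len(seq) > 1:
        seq = reduce_once(seq, algebra)
        if seq is None:
            return False
    return seq[0][0] == goal

free = lambda a, b: a + " " + b                    # unconstrained concatenation
bounded = lambda a, b: a + " " + b if len(a + b) <= 8 else None  # restrictive

sentence = [("s/np", "barks"), ("np/n", "the"), ("n", "dog")]
ok = derives(sentence, "s", free)      # both reductions allowed
bad = derives(sentence, "s", bounded)  # second combination vetoed by the algebra
```

The grammatical types and the reduction rule stay fixed; swapping the algebra alone changes which derivations succeed, which is the modularity the abstract attributes to the labelling algebra.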

    Foley's Thesis, Negishi's Method, Existence Proofs and Computation

    Duncan Foley's many-faceted and outstanding contributions to macroeconomics, microeconomics, general equilibrium theory, the theory of taxation, the history of economic thought, the magnificent dynamics of classical economics, classical value theory, Bayesian statistics, formal dynamics and, most recently, fascinating forays into an interpretation of economic evolution from a variety of complexity-theoretic viewpoints have all left - and continue to leave - significant marks on the development and structure of economic theory. He belongs to the grand tradition of visionaries who theorise with imaginative audacity on the dynamics, evolution and contradictions of capitalist economies - a tradition that, perhaps, begins with Marx and Mill, continues with Keynes and Schumpeter, reaching new heights with the iconoclastic brilliancies of a Tsuru and a Goodwin, a Chakravarty and a Nelson, and to which Duncan Foley adds a lustre of much value. In this contribution I return to mathematical themes broached in Foley's brilliant and pioneering Yale doctoral dissertation (Foley, 1967) and attempt to view them as a Computable Economist would. The intention is to suggest that algorithmic indeterminacies are intrinsic to the foundations of economic theory in the mathematical mode
    Keywords: Equilibrium existence theorems, Welfare theorems, Constructive proofs, Computability

    Learning-assisted Theorem Proving with Millions of Lemmas

    Large formal mathematical libraries consist of millions of atomic inference steps that give rise to a corresponding number of proved statements (lemmas). Analogously to informal mathematical practice, only a tiny fraction of such statements is named and re-used in later proofs by formal mathematicians. In this work, we suggest and implement criteria defining the estimated usefulness of the HOL Light lemmas for proving further theorems. We use these criteria to mine the large inference graph of the lemmas in the HOL Light and Flyspeck libraries, adding up to millions of the best lemmas to the pool of statements that can be re-used in later proofs. We show that in combination with learning-based relevance filtering, such methods significantly strengthen automated theorem proving of new conjectures over large formal mathematical libraries such as Flyspeck.
    Comment: journal version of arXiv:1310.2797 (which was submitted to the LPAR conference)
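One way to make "estimated usefulness" concrete is a scoring rule over the inference graph (a hypothetical rule for illustration, not the paper's actual criteria): score each lemma by how many later statements transitively depend on it, discounted by its own proof size:

```python
# Hypothetical lemma-usefulness score over an inference graph.
# deps maps each statement to the lemmas used directly in its proof;
# the graph is assumed acyclic, as an inference graph is.

def transitive_deps(deps):
    memo = {}
    def closure(l):
        if l in memo:
            return memo[l]
        acc = set()
        for d in deps.get(l, ()):
            acc.add(d)
            acc |= closure(d)
        memo[l] = acc
        return acc
    return {l: closure(l) for l in deps}

def usefulness(deps, size):
    # Reward wide (transitive) reuse, penalize expensive proofs.
    closures = transitive_deps(deps)
    uses = {l: 0 for l in deps}
    for cl in closures.values():
        for d in cl:
            uses[d] = uses.get(d, 0) + 1
    return {l: uses[l] / (1 + size.get(l, 0)) for l in deps}

# Invented miniature library: lemA underlies almost everything.
deps = {
    "lemA": set(), "lemB": {"lemA"},
    "thm1": {"lemA"}, "thm2": {"lemA", "lemB"}, "thm3": {"lemB"},
}
size = {"lemA": 1, "lemB": 4}
scores = usefulness(deps, size)
```

Here `lemA` is used (transitively) by four statements with a cheap proof, so it outscores `lemB`; top-scoring lemmas would be the ones added to the re-usable pool.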

    HOL(y)Hammer: Online ATP Service for HOL Light

    HOL(y)Hammer is an online AI/ATP service for formal (computer-understandable) mathematics encoded in the HOL Light system. The service allows its users to upload and automatically process an arbitrary formal development (project) based on HOL Light, and to attack arbitrary conjectures that use the concepts defined in some of the uploaded projects. For that, the service uses several automated reasoning systems combined with several premise selection methods trained on all the project proofs. The projects that are readily available on the server for such query answering include the recent versions of the Flyspeck, Multivariate Analysis and Complex Analysis libraries. The service runs on a 48-CPU server, currently employing in parallel for each task 7 AI/ATP combinations and 4 decision procedures that contribute to its overall performance. The system is also available for local installation by interested users, who can customize it for their own proof development. An Emacs interface allowing parallel asynchronous queries to the service is also provided. The overall structure of the service is outlined, problems that arise and their solutions are discussed, and an initial account of using the system is given.
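Premise selection of the kind such services train on prior proofs can be sketched with a small k-nearest-neighbours selector. This is illustrative only: the symbols, theorem names, and weighting are invented, and the deployed methods are more elaborate:

```python
# Minimal k-NN premise selection sketch: rank known premises for a new
# conjecture by similarity-weighted votes from the most similar past proofs.

def jaccard(a, b):
    # Similarity of two symbol sets.
    return len(a & b) / len(a | b) if a | b else 0.0

def select_premises(conjecture_syms, proofs, k=2, n=3):
    # proofs: list of (theorem_symbols, premises_used_in_its_proof)
    neighbours = sorted(
        proofs, key=lambda p: -jaccard(conjecture_syms, p[0])
    )[:k]
    votes = {}
    for syms, premises in neighbours:
        w = jaccard(conjecture_syms, syms)   # closer neighbours vote harder
        for prem in premises:
            votes[prem] = votes.get(prem, 0.0) + w
    return [p for p, _ in sorted(votes.items(), key=lambda kv: -kv[1])][:n]

# Invented proof library: symbol sets and the premise names their proofs used.
proofs = [
    ({"sin", "cos", "pi"}, ["SIN_BOUND", "COS_BOUND"]),
    ({"exp", "ln"}, ["EXP_POS"]),
    ({"sin", "pi"}, ["SIN_PI"]),
]
suggested = select_premises({"sin", "pi", "abs"}, proofs)
```

The selected premises would then be handed, together with the conjecture, to the automated reasoning backends.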