12 research outputs found

    An Elementary Formal Proof of the Group Law on Weierstrass Elliptic Curves in Any Characteristic

    Elliptic curves are fundamental objects in number theory and algebraic geometry, whose points over a field form an abelian group under a geometric addition law. Any elliptic curve over a field admits a Weierstrass model, but prior formal proofs that the addition law is associative in this model involve either advanced algebraic geometry or tedious computation, especially in characteristic two. We formalise, in the Lean theorem prover, the type of nonsingular points of a Weierstrass curve over a field of any characteristic, together with a purely algebraic proof that it forms an abelian group.
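
    As a rough illustration of the objects involved, here is a minimal Lean 4 sketch of a Weierstrass curve and its affine equation, assuming Mathlib's `Field` class. The names and definitions are illustrative only and do not reproduce the paper's actual development.

```lean
import Mathlib.Algebra.Field.Defs

-- Illustrative sketch only; names do not match the paper's formalisation.
-- A Weierstrass curve over a field K, given by its five coefficients.
structure WeierstrassCurve (K : Type) [Field K] where
  a₁ : K
  a₂ : K
  a₃ : K
  a₄ : K
  a₆ : K

-- The affine Weierstrass equation
--   y² + a₁xy + a₃y = x³ + a₂x² + a₄x + a₆
def WeierstrassCurve.IsOnCurve {K : Type} [Field K]
    (W : WeierstrassCurve K) (x y : K) : Prop :=
  y ^ 2 + W.a₁ * x * y + W.a₃ * y =
    x ^ 3 + W.a₂ * x ^ 2 + W.a₄ * x + W.a₆

-- Points of the curve: the point at infinity (the group identity),
-- or an affine point satisfying the curve equation.
inductive WeierstrassCurve.Point {K : Type} [Field K]
    (W : WeierstrassCurve K) where
  | zero
  | affine (x y : K) (h : W.IsOnCurve x y)
```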

    Probabilistic Arguments in Mathematics

    This thesis addresses a question that emerges naturally from some observations about contemporary mathematical practice. Firstly, mathematicians always demand proof for the acceptance of new results. Secondly, the ability of mathematicians to tell whether a discourse gives expression to a proof is less than perfect, and the computers they use are subject to a variety of hardware and software failures. So false results are sometimes accepted, despite the insistence on proof. Thirdly, over the past few decades, researchers have developed a variety of methods that are probabilistic in nature. Even if carried out perfectly, these procedures only yield a conclusion that is very likely to be true. In some cases, the chances of error are precisely specifiable and can be made as small as desired. The likelihood of an error arising from the inherently uncertain nature of these probabilistic algorithms can therefore be made vanishingly small in comparison to the chances of an error arising when implementing an equivalent deductive algorithm. Moreover, the structure of probabilistic algorithms tends to minimise these implementation errors too. So overall, probabilistic methods are sometimes more reliable than deductive ones. This invites the question: ‘Are mathematicians rational in continuing to reject these probabilistic methods as a means of establishing mathematical claims?’
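
    A canonical instance of the probabilistic methods at issue is the Miller-Rabin primality test: each round errs on a composite input with probability at most 1/4, so running more rounds drives the error probability below any desired threshold. The Python sketch below illustrates the general technique; it is not drawn from the thesis itself.

```python
import random

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin primality test, a canonical probabilistic method.

    Each round is fooled by a composite n with probability at most 1/4,
    so the overall error probability is at most 4**(-rounds) and can be
    made as small as desired by increasing `rounds`.
    """
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True  # n is prime with overwhelming probability
```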

    From universal morphisms to megabytes: A Baayen space odyssey


    Proof and Proving in Mathematics Education


    Computer Simulations in Science and Engineering. Concept, Practices, Perspectives

    This book addresses key conceptual issues relating to the modern scientific and engineering use of computer simulations. It analyses a broad set of questions, from the nature of computer simulations to their epistemological power, including the many scientific, social, and ethical implications of using computer simulations. The book is written in an easily accessible narrative, one that weaves together philosophical questions and scientific technicalities. It will thus appeal equally to academic scientists, engineers, and researchers in industry who are interested in questions related to the general practice of computer simulations.

    Probing the nature of dark energy with 21-cm intensity mapping.

    Doctoral Degree. University of KwaZulu-Natal, Durban.
    Two approaches to measuring baryon acoustic oscillations (BAOs) with optical and radio telescopes have been introduced and discussed in the literature: galaxy redshift surveys and intensity mapping (IM) surveys. Of the two, galaxy redshift surveys have been used to great effect; they are based on detecting and surveying millions of individual galaxies and measuring their redshifts by comparing templates of the spectral energy distributions of the light emitted from the galaxies with optical lines. IM is a novel but robust approach that surveys extremely large volumes of galaxies without resolving each galaxy individually, and it can efficiently probe scales over redshift ranges inaccessible to current galaxy redshift surveys. The IM survey has promisingly been shown to have better overall sensitivity to the BAOs than the galaxy redshift survey, but it has a number of serious issues still to be quantified. The most obvious of these is the presence of foreground contaminants from the Milky Way and extragalactic point sources, which strongly dominate the neutral hydrogen (HI) signal of interest. In this study, we aim to realize the IM approach, pave the pathway, and optimize the scientific output of future radio experiments. We therefore carry out simulations and present forecasts of the cosmological constraints obtainable with the HI IM technique for three near-term radio telescopes, assuming one year of observation time: the Five-hundred-meter Aperture Spherical radio Telescope (FAST), BAOs In Neutral Gas Observations (BINGO), and the Square Kilometre Array Phase I (SKA-I) single-dish experiments. We further forecast the combined constraints of the three radio telescopes with Planck measurements. To tackle the foreground challenge, we develop strategies to model the various sky components and clean the foregrounds from the Milky Way and extragalactic point sources for a typical single-dish radio telescope. In particular, the Principal Component Analysis (PCA) foreground separation approach considered here can indeed recover the cosmological HI signal to high precision. We show that, although the approach may face some challenges, it can be fully realized on the selected range of angular scales.
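
    In outline, the PCA cleaning step mentioned above exploits the fact that foregrounds are spectrally smooth and far brighter than the HI signal, so they occupy the few largest eigenmodes of the frequency-frequency covariance of the maps. The Python sketch below illustrates this general technique under simplified assumptions (the array shape and mode count are hypothetical); it is not the thesis's actual pipeline.

```python
import numpy as np

def pca_foreground_clean(maps: np.ndarray, n_modes: int = 3) -> np.ndarray:
    """Remove the dominant spectral modes from a stack of sky maps.

    maps: array of shape (n_freq, n_pix), one map per frequency channel.
    Foregrounds are spectrally smooth and orders of magnitude brighter
    than the HI signal, so they dominate the largest eigenmodes of the
    frequency-frequency covariance; subtracting their projection is the
    PCA clean.
    """
    mean = maps.mean(axis=1, keepdims=True)
    centered = maps - mean
    cov = centered @ centered.T / maps.shape[1]      # (n_freq, n_freq)
    _, eigvecs = np.linalg.eigh(cov)                 # ascending eigenvalues
    fg_modes = eigvecs[:, -n_modes:]                 # largest n_modes modes
    foreground = fg_modes @ (fg_modes.T @ centered)  # project onto them
    return centered - foreground                     # residual ~ HI signal
```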