101 research outputs found

    Fermat, Leibniz, Euler, and the gang: The true history of the concepts of limit and shadow

    Fermat, Leibniz, Euler, and Cauchy all used one or another form of approximate equality, or the idea of discarding "negligible" terms, so as to obtain a correct analytic answer. Their inferential moves find suitable proxies in the context of modern theories of infinitesimals, and specifically the concept of shadow. We give an application to decreasing rearrangements of real functions. Comment: 35 pages, 2 figures, to appear in Notices of the American Mathematical Society 61 (2014), no.
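    To make the idea concrete, here is a minimal modern rendering (our notation, not the paper's): writing st(·) for the standard-part ("shadow") function on the hyperreals, Fermat-style discarding of negligible terms becomes, for infinitesimal ε ≠ 0,

        \frac{(x+\varepsilon)^2 - x^2}{\varepsilon} = 2x + \varepsilon,
        \qquad
        (x^2)' = \operatorname{st}(2x + \varepsilon) = 2x .

    The shadow st(2x + ε) is the unique real number infinitely close to 2x + ε, so the "negligible" ε is discarded exactly rather than approximately.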

    Field theoretic formulation and empirical tracking of spatial processes

    Spatial processes are attacked on two fronts. On the one hand, tools from theoretical and statistical physics can be used to understand behaviour in complex, spatially-extended multi-body systems. On the other hand, computer vision and statistical analysis can be used to study 4D microscopy data to observe and understand real spatial processes in vivo.

    On the first of these fronts, analytical models are developed for abstract processes, which can be simulated on graphs and lattices before considering real-world applications in fields such as biology, epidemiology or ecology. In the field theoretic formulation of spatial processes, techniques originating in quantum field theory, such as canonical quantisation and the renormalization group, are applied to reaction-diffusion processes by analogy. These techniques are combined in the study of critical phenomena or critical dynamics. At this level, one is often interested in the scaling behaviour: how the correlation functions scale for different dimensions in geometric space. This can lead to a better understanding of how macroscopic patterns relate to microscopic interactions. In this vein, the trace of a branching random walk on various graphs is studied. In the thesis, a distinctly abstract approach is emphasised in order to support an algorithmic approach to parts of the formalism. A model of self-organised criticality, the Abelian sandpile model, is also considered. By exploiting a bijection between recurrent configurations and spanning trees, an efficient Monte Carlo algorithm is developed to simulate sandpile processes on large lattices (the basic toppling dynamics is sketched below).

    On the second front, two case studies are considered: migratory patterns of leukaemia cells and mitotic events in Arabidopsis roots. In the first case, tools from statistical physics are used to study the spatial dynamics of different leukaemia cell lineages before and after a treatment. One key result is that we can discriminate between migratory patterns in response to treatment, classifying cell motility in terms of sub/super/diffusive regimes. For the second case study, a novel algorithm is developed to process a 4D light-sheet microscopy dataset. The combination of transient fluorescent markers and a poorly localised specimen in the field of view leads to a challenging tracking problem. A fuzzy registration-tracking algorithm is developed to track mitotic events so as to understand their spatiotemporal dynamics under normal conditions and after tissue damage.
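    As an illustration of the sandpile dynamics mentioned above, here is a minimal sketch of the standard Abelian sandpile toppling rule on an open square lattice; it is not the thesis's spanning-tree Monte Carlo algorithm, and all names and parameters are illustrative:

        import random

        def relax(grid, L, site):
            """Topple until stable, starting from the site that just received
            a grain; grains toppled over the edge leave the system.
            Returns the avalanche size (total number of topplings)."""
            stack, topplings = [site], 0
            while stack:
                i, j = stack.pop()
                while grid[i][j] >= 4:
                    grid[i][j] -= 4
                    topplings += 1
                    for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                        if 0 <= ni < L and 0 <= nj < L:
                            grid[ni][nj] += 1
                            if grid[ni][nj] == 4:
                                stack.append((ni, nj))
            return topplings

        def drive(L=50, drops=10_000, seed=1):
            """Drop grains at random sites; record each avalanche size."""
            random.seed(seed)
            grid = [[0] * L for _ in range(L)]
            sizes = []
            for _ in range(drops):
                i, j = random.randrange(L), random.randrange(L)
                grid[i][j] += 1
                sizes.append(relax(grid, L, (i, j)))
            return sizes

    Avalanche sizes recorded this way are the quantity whose power-law statistics signal self-organised criticality; the bijection with spanning trees exploited in the thesis samples recurrent configurations far more efficiently than this direct driving.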

    Dynamic block encryption with self-authenticating key exchange

    One of the greatest challenges facing cryptographers is the mechanism used for key exchange. When secret data is transmitted, the chances are that an attacker will try to intercept and decrypt the message. Having done so, he/she might simply exploit the information obtained, or attempt to tamper with the message and thus mislead the recipient. Both cases are equally serious and may cause great harm as a consequence.

    In cryptography, there are two commonly used methods of exchanging secret keys between parties. In the first method, symmetric cryptography, the key is sent in advance over some secure channel which only the intended recipient can read. The second method of key sharing is a public key exchange method, where each party has a private and a public key; the public key is shared and the private key is kept locally. In both cases, keys are exchanged between two parties.

    In this thesis, we propose a method whereby the risk of exchanging keys is minimised. The key is embedded in the encrypted text using a process that we call `chirp coding', and recovered by the recipient using a process that is based on correlation. The `chirp coding parameters' are exchanged between users by employing a USB flash memory retained by each user. If the keys are compromised, they are still not usable, because an attacker can only gain access to part of the key. Alternatively, the software can be configured to operate in a one-time parameter mode, in which the parameters are agreed upon in advance; there is then no parameter exchange during file transmission except, of course, the key embedded in the ciphertext.

    The thesis also introduces a method of encryption which utilises dynamic blocks, where the block size is different for each block. Prime numbers are used to drive two random number generators: a Linear Congruential Generator (LCG), which takes in the seed and initialises the system, and a Blum-Blum-Shub (BBS) generator, which is used to generate random streams to encrypt messages, images or video clips, for example. In each case, the key created is text dependent and therefore changes with each message sent.

    The scheme presented in this research is composed of five basic modules. The first module is the key generation module, where the key to be generated is message dependent. The second module, the encryption module, performs data encryption. The third module, the key exchange module, embeds the key into the encrypted text. Once this is done, the message is transmitted, and the recipient uses the key extraction module to retrieve the key; finally, the decryption module is executed to decrypt the message and authenticate it. In addition, the message may be compressed before encryption and decompressed by the recipient after decryption using standard compression tools.
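    As a rough illustration of the two generators named above, here is a minimal sketch with toy parameters chosen for readability; the thesis's actual parameter choices, dynamic block sizes, and chirp-coding key embedding are not reproduced here:

        def lcg(seed, a=1103515245, c=12345, m=2**31):
            """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
            Used here only to expand the seed that initialises the system."""
            x = seed
            while True:
                x = (a * x + c) % m
                yield x

        def bbs_bits(seed, p=499, q=547):
            """Blum-Blum-Shub: x_{n+1} = x_n^2 mod (p*q); emit the low bit.
            p and q are toy Blum primes (both congruent to 3 mod 4); real use
            needs large primes and a seed coprime to p*q."""
            n = p * q
            x = seed % n
            while True:
                x = pow(x, 2, n)
                yield x & 1

        def encrypt(plaintext: bytes, seed: int) -> bytes:
            """XOR the plaintext with a BBS keystream; the BBS state is
            initialised from the LCG, mirroring the two-generator design."""
            bits = bbs_bits(next(lcg(seed)))
            out = bytearray()
            for byte in plaintext:
                k = 0
                for _ in range(8):
                    k = (k << 1) | next(bits)
                out.append(byte ^ k)
            return bytes(out)

    Since XOR is its own inverse, decryption is the same function run with the same seed; the message-dependent keys of the actual scheme are beyond this sketch.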

    Association of Christians in the Mathematical Sciences Proceedings 2019

    The conference proceedings of the Association of Christians in the Mathematical Sciences biennial conference, May 29-June 1, 2019 at Indiana Wesleyan University.

    Annales Mathematicae et Informaticae (46.)


    The Mathematics of Collision and the Collision of Mathematics in the 17th Century

    Thesis (Ph.D.) - Indiana University, History and Philosophy of Science, 2015.
    This dissertation charts the development of the quantitative rules of collision in the 17th century. These were central to the mathematization of nature, offering natural philosophy a framework to explain all the changes of nature in terms of the size and speed of bodies in motion. The mathematization of nature is a classic thesis in the history of early modern science. However, the significance of the dynamism within mathematics should not be neglected. One important change was the emergence of a new language of nature, an algebraic physico-mathematics, whose development was intertwined with the rules of collision. The symbolic equations provided a unified system to express previously diverse kinds of collision, with a new representation of speed with direction, while at the same time collision provided a practical justification of the otherwise "impossible" negative numbers. In private manuscripts, Huygens criticized Descartes's rules of collision with heuristic use of Cartesian symbolic algebra. After he successfully predicted the outcomes of experiments using algebraic calculations at an early meeting of the Royal Society, Wallis and Wren extended the algebraic investigations in their published works. In addition to the impact of the changes in mathematics itself, the rules of collision were shaped by the inventive use of principles formulated by 'thinking with objects', such as the balance and the pendulum. The former provided an initial framework to relate the speeds and sizes of bodies, while the latter was key both to the development of novel conservation principles and to making experimental investigations of collision possible. This dissertation documents the formation of concepts central to modern physical science and re-evaluates the mathematics of collision, with implications for our understanding of major figures in early modern science, such as Descartes and Huygens, and repercussions for the mathematization of nature.
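    In modern notation (a deliberate anachronism; the 17th-century texts are rhetorical or geometric), the elastic-collision rules associated with Huygens and Wren amount to conservation of the quantity of motion together with reversal of the relative speed, for bodies of sizes m_1, m_2 with signed speeds u_1, u_2 before and v_1, v_2 after impact:

        m_1 u_1 + m_2 u_2 = m_1 v_1 + m_2 v_2,
        \qquad
        v_1 - v_2 = -(u_1 - u_2),

    whence, for example,

        v_1 = \frac{(m_1 - m_2)\,u_1 + 2 m_2 u_2}{m_1 + m_2} .

    A v_1 of opposite sign to u_1 is a rebound, which is one way collision lent the "impossible" negative numbers a physical reading.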

    On the Cryptanalysis of Public-Key Cryptography

    Nowadays, the most popular public-key cryptosystems are based on either the integer factorization or the discrete logarithm problem. The feasibility of solving these mathematical problems in practice is studied, and techniques are presented to speed up the underlying arithmetic on parallel architectures.

    The fastest known approach to solving the discrete logarithm problem in groups of elliptic curves over finite fields is the Pollard rho method. The negation map can be used to speed up this calculation by a factor of √2. It is well known that the random walks used by Pollard rho, when combined with the negation map, get trapped in fruitless cycles. We show that previously published approaches to deal with this problem are plagued by recurring cycles, and we propose effective alternative countermeasures. Furthermore, fast modular arithmetic is introduced which can take advantage of prime moduli of a special form using efficient "sloppy reduction." The effectiveness of these techniques is demonstrated by solving a 112-bit elliptic curve discrete logarithm problem using a cluster of PlayStation 3 game consoles: breaking a public-key standard and setting a new world record.

    The elliptic curve method (ECM) for integer factorization is the asymptotically fastest method to find relatively small factors of large integers. From a cryptanalytic point of view, the performance of ECM gives information about secure parameter choices of some cryptographic protocols. We optimize ECM by proposing carry-free arithmetic modulo Mersenne numbers (numbers of the form 2^M − 1) especially suitable for parallel architectures (a toy version of the underlying reduction is sketched below). Our implementation of these techniques on a cluster of PlayStation 3 game consoles set a new record by finding a 241-bit prime factor of 2^1181 − 1.

    A normal form for elliptic curves introduced by Edwards results in the fastest elliptic curve arithmetic in practice. Techniques to reduce the temporary storage and enhance the performance even further in the setting of ECM are presented. Our results enable one to run ECM efficiently on resource-constrained platforms such as graphics processing units.
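    As an illustration of why Mersenne moduli are convenient, here is a toy version of the folding reduction (the thesis's optimized carry-free and "sloppy" variants for parallel hardware are more refined): since 2^M ≡ 1 (mod 2^M − 1), the high bits of a product can simply be added back onto the low bits, with no division.

        def mersenne_reduce(x: int, M: int) -> int:
            """Reduce x modulo n = 2^M - 1 without division: write
            x = hi*2^M + lo; since 2^M = 1 (mod n), x = hi + lo (mod n)."""
            n = (1 << M) - 1
            while x > n:
                x = (x & n) + (x >> M)
            return 0 if x == n else x

        def mul_mod_mersenne(a: int, b: int, M: int) -> int:
            """Modular multiplication with the fold in place of a division,
            the operation ECM spends most of its time on."""
            return mersenne_reduce(a * b, M)

    For example, mul_mod_mersenne(a, b, 1181) works modulo the number 2^1181 − 1 whose 241-bit factor is reported above.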

    Stochastic-like behavior in arithmetic dynamical systems : an investigation of collatz map hailstone sequences

    Advisor: Prof. Dr. Marcos Gomes E. da Luz. Co-advisor: Prof. Dr. Madras Viswanathan Gandhi. Dissertation (Master's) - Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Física. Defense: Curitiba, 28/02/2023. Includes references: p. 87-96.
    Abstract: Arithmetic Dynamics is an emergent research area that studies the behavior of systems in discrete spaces and times. The present dissertation deals with an arithmetic dynamical system called the Collatz Map, a rule for positive integers n stating that n -> n/2 for n even, and n -> 3n + 1 for n odd. A renowned and unsolved mathematical problem conjectures that for any positive integer n, finitely many iterations of the Collatz Map eventually reach 1. The sequence of integers produced by iterating the Collatz Map from an initial condition n0 until it reaches 1 is often called a hailstone sequence. The Collatz Map, despite providing very rich dynamics for the natural numbers, has only recently been explored in the context of physical models and phenomena. This work describes investigations attempting to characterize whether hailstone sequences can be regarded as a deterministic system exhibiting stochastic-like behavior, aiming to illuminate the path from number theory to statistical mechanics through the area of stochastic arithmetic dynamical systems. To do so, appropriate statistical analyses of various sequences were performed, using a set of very large initial conditions (up to the order of n0 ~ 2^10000). The sampling of initial conditions was conducted using a new representation for positive integers with a direct connection to the 2-adics, called m-vectors.
    By employing time series analysis methods such as the power spectrum and detrended fluctuation analysis, the stochastic-like behavior is confirmed, reinforcing the literature on hailstone sequences performing Geometric Brownian Motion (GBM). Autocorrelation function and von Neumann entropy analyses show deviations from GBM for special initial conditions, indicating sources of determinism and predictability inside the general stochastic trend. These deviations appear in the form of short- to mid-range autocorrelations and smaller entropy values for the orbits from these specific initial conditions. The von Neumann entropy also allows the characterization of the internal structure of a sequence through analysis of the m-vector components, indicating that the hailstone sequence is a process following a central limit theorem. Finally, it is possible to conceive of the Collatz Map as a structure destroyer, which creates and reproduces random patterns while slowly destroying any previously imposed ordering.
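    For reference, here is a minimal sketch of the map itself and of sampling an initial condition of the order of magnitude studied here; the m-vector representation and the time series analyses are beyond this sketch:

        import random

        def hailstone(n0: int):
            """Yield the hailstone sequence of the Collatz map from n0;
            termination at 1 is the (empirically supported) conjecture."""
            n = n0
            yield n
            while n != 1:
                n = n // 2 if n % 2 == 0 else 3 * n + 1
                yield n

        # A random odd initial condition with 10000 bits, i.e. n0 ~ 2^10000,
        # matching the order of magnitude used in the dissertation; Python's
        # arbitrary-precision integers handle this natively.
        n0 = (1 << 9999) | random.getrandbits(9999) | 1
        total_stopping_time = sum(1 for _ in hailstone(n0)) - 1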