
    The importance of the Selberg integral

    It has been remarked that a fair measure of the impact of Atle Selberg's work is the number of mathematical terms which bear his name. One of these is the Selberg integral, an n-dimensional generalization of the Euler beta integral. We trace its sudden rise to prominence, initiated by a question to Selberg from Enrico Bombieri, more than thirty years after its publication. In quick succession the Selberg integral was used to prove an outstanding conjecture in random matrix theory and cases of the Macdonald conjectures. It further initiated the study of q-analogues, which in turn enriched the Macdonald conjectures. We review these developments and proceed to exhibit the sustained prominence of the Selberg integral, evidenced by its central role in random matrix theory, Calogero-Sutherland quantum many-body systems, Knizhnik-Zamolodchikov equations, and multivariable orthogonal polynomial theory. Comment: 43 pages
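
    For reference (the abstract only names it), the integral in question in its standard form, together with Selberg's product evaluation, is:

        % The Selberg integral in its standard form; n = 1 recovers the
        % Euler beta integral B(alpha, beta).
        \[
          S_n(\alpha,\beta,\gamma)
            = \int_0^1 \!\cdots\! \int_0^1
              \prod_{i=1}^{n} t_i^{\alpha-1} (1-t_i)^{\beta-1}
              \prod_{1 \le i < j \le n} |t_i - t_j|^{2\gamma}
              \, dt_1 \cdots dt_n
            = \prod_{j=0}^{n-1}
              \frac{\Gamma(\alpha+j\gamma)\,\Gamma(\beta+j\gamma)\,
                    \Gamma(1+(j+1)\gamma)}
                   {\Gamma(\alpha+\beta+(n+j-1)\gamma)\,\Gamma(1+\gamma)}
        \]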

    The evolution of radiation towards thermal equilibrium: A soluble model which illustrates the foundations of Statistical Mechanics

    In 1916 Einstein introduced the first rules for a quantum theory of electromagnetic radiation, and he applied them to a model of matter in thermal equilibrium with radiation to derive Planck's black-body formula. Einstein's treatment is extended here to time-dependent stochastic variables, which leads to a master equation for the probability distribution that describes the irreversible approach of Einstein's model towards thermal equilibrium, and elucidates aspects of the foundations of statistical mechanics. An analytic solution of this equation is obtained in the Fokker-Planck approximation, in excellent agreement with numerical results. At equilibrium, it is shown that the probability distribution is proportional to the total number of microstates for a given configuration, in accordance with Boltzmann's fundamental postulate of equal a priori probabilities for these states. While the counting of these configurations depends on the particle statistics (Boltzmann, Bose-Einstein, or Fermi-Dirac), the corresponding probability is determined here by the dynamics, which are embodied in Einstein's quantum transition probabilities for the emission and absorption of radiation. In a special limit, it is shown that the photons in Einstein's model can act as a thermal bath for the evolution of the atoms towards the canonical equilibrium distribution of Gibbs. In this limit, the present model is mathematically equivalent to an extended version of the Ehrenfests' "dog-flea" model, which has been discussed recently by Ambegaokar and Clerk.
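
    As a reminder of the starting point (standard textbook material recalled for context, not a result specific to this paper), Einstein's emission and absorption rates together with detailed balance already fix the black-body spectrum:

        % Einstein's 1916 rates for a two-level atom (populations N_1, N_2,
        % transition frequency nu) in a radiation field of spectral energy
        % density rho(nu):
        %   upward   transitions per unit time:  B_{12} rho(nu) N_1
        %   downward transitions per unit time:  [A_{21} + B_{21} rho(nu)] N_2
        % Balancing the two in equilibrium, with N_2 / N_1 = e^{-h nu / k T}
        % and B_{12} = B_{21} for nondegenerate levels, gives
        \[
          \rho(\nu) \;=\; \frac{A_{21}/B_{21}}{e^{h\nu/kT} - 1},
          \qquad
          \frac{A_{21}}{B_{21}} \;=\; \frac{8\pi h \nu^{3}}{c^{3}},
        \]
        % where the ratio A/B is fixed by the Rayleigh-Jeans limit, yielding
        % Planck's black-body formula.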

    Specific "scientific" data structures, and their processing

    Programming physicists, like all programmers, use arrays, lists, tuples, records, etc., but converting their formulae into code requires a shift in their thought patterns, since the "data structures" actually operated upon while elaborating a theory and its consequences are rather: power series and Padé approximants, differential forms and other instances of differential algebras, functionals (for the variational calculus), trajectories (solutions of differential equations), Young diagrams and Feynman graphs, etc. Such data are often used in a [semi-]numerical setting, not necessarily the "symbolic" one appropriate for computer algebra packages. Modules adapted to such data may be "just libraries", but often they become specific, embedded sub-languages, typically mapped into object-oriented frameworks with overloaded mathematical operations. Here we present a functional approach to this philosophy. We show how the use of Haskell datatypes and, fundamental for our tutorial, the application of lazy evaluation make it possible to operate upon such data (in particular, the "infinite" sequences) in a natural and comfortable manner. Comment: In Proceedings DSL 2011, arXiv:1109.032
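
    To make the last point concrete, here is a minimal, self-contained sketch of our own (not code from the paper; the names Series, addS, mulS, expS are illustrative) of the kind of lazy, "infinite" data the abstract refers to: formal power series represented as infinite coefficient lists.

        -- Formal power series as lazy coefficient lists:
        -- [a0, a1, a2, ...] stands for a0 + a1*x + a2*x^2 + ...
        type Series = [Rational]

        -- Coefficient-wise addition; both operands may be infinite.
        addS :: Series -> Series -> Series
        addS (a:as) (b:bs) = (a + b) : addS as bs
        addS as     []     = as
        addS []     bs     = bs

        -- Cauchy product, using (a0 + x*A) * B = a0*B + x*(A*B),
        -- where A is the tail of the first series.
        mulS :: Series -> Series -> Series
        mulS []     _          = []
        mulS _      []         = []
        mulS (a:as) bbs@(_:bs) = a * b0 : addS (map (a *) bs) (mulS as bbs)
          where b0 = head bbs

        -- The exponential series via the self-referential recursion
        -- exp' = exp with exp(0) = 1; laziness makes this well-defined.
        expS :: Series
        expS = 1 : zipWith (/) expS [1 ..]

        -- ghci> take 5 expS
        -- [1 % 1, 1 % 1, 1 % 2, 1 % 6, 1 % 24]
        -- Only the coefficients actually demanded are ever computed.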

    Thirty-two Goldbach Variations

    We give thirty-two diverse proofs of a small mathematical gem: the fundamental Euler sum identity zeta(2,1) = zeta(3) = 8 zeta(\bar 2,1). We also discuss various generalizations for multiple harmonic (Euler) sums and some of their many connections, thereby illustrating both the wide variety of techniques fruitfully used to study such sums and the attraction of their study. Comment: v1: 34 pages AMSLaTeX. v2: 41 pages AMSLaTeX. New introductory material added, as well as material on inequalities, the Hilbert matrix, and Witten zeta functions. Errors in the second section on Complex Line Integrals are corrected. To appear in International Journal of Number Theory. Title change
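
    For readers unfamiliar with the notation, the sums involved are, in one common convention (the paper itself fixes the precise one), as follows; the bar marks an alternating argument.

        \[
          \zeta(3) = \sum_{n \ge 1} \frac{1}{n^{3}}, \qquad
          \zeta(2,1) = \sum_{n > k \ge 1} \frac{1}{n^{2}\,k}, \qquad
          \zeta(\bar{2},1) = \sum_{n > k \ge 1} \frac{(-1)^{n}}{n^{2}\,k},
        \]
        % so that the identity of the abstract reads
        \[
          \zeta(2,1) \;=\; \zeta(3) \;=\; 8\,\zeta(\bar{2},1),
        \]
        % the first equality being a classical result of Euler.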

    The role of data in model building and prediction: a survey through examples

    The goal of Science is to understand phenomena and systems in order to predict their development and gain control over them. In the scientific process of knowledge elaboration, a crucial role is played by models, which, in the language of quantitative sciences, means abstract mathematical or algorithmic representations. This short review discusses a few key examples from Physics, taken from dynamical systems theory, biophysics, and statistical mechanics, representing three paradigmatic procedures for building models and predictions from available data. In the case of dynamical systems we show how predictions can be obtained in a virtually model-free framework using the method of analogues, and we briefly discuss other approaches based on machine learning. In cases where the complexity of the system is the main challenge, as in biophysics, we stress the necessity of including part of the empirical knowledge in the models in order to attain a minimal degree of realism. Finally, we consider many-body systems in which many (temporal or spatial) scales are at play, and we show how to derive from data a dimensional reduction, in the form of a Langevin dynamics for the slow components.
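
    As a concrete illustration of the first of these procedures, here is a minimal sketch of our own (not the survey's code; predictAnalogue and its helpers are illustrative names) of the method of analogues for a scalar time series: embed the series in delay vectors, find the past vector nearest to the present one, and reuse its successor as the prediction.

        import Data.List (minimumBy)
        import Data.Ord  (comparing)

        type DelayVec = [Double]

        -- All (length-m delay vector, next value) pairs found in the history.
        delayPairs :: Int -> [Double] -> [(DelayVec, Double)]
        delayPairs m xs =
          [ (take m (drop i xs), xs !! (i + m))
          | i <- [0 .. length xs - m - 1] ]

        -- Squared Euclidean distance between two delay vectors.
        dist2 :: DelayVec -> DelayVec -> Double
        dist2 u v = sum [ (a - b) * (a - b) | (a, b) <- zip u v ]

        -- Predict the next value of the series: take the last m observations
        -- as the current state, find its nearest past analogue, and reuse
        -- that analogue's successor as the forecast.
        predictAnalogue :: Int -> [Double] -> Double
        predictAnalogue m xs =
          snd (minimumBy (comparing (dist2 current . fst)) (delayPairs m xs))
          where current = drop (length xs - m) xs

        -- ghci> predictAnalogue 4 (map sin [0, 0.1 .. 20])
        -- approximates sin 20.1 without any model of the dynamics.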

    Efficient simulation of strong system-environment interactions

    Multi-component quantum systems in strong interaction with their environment are receiving increasing attention due to their importance in a variety of contexts, ranging from solid-state quantum information processing to the quantum dynamics of bio-molecular aggregates. Unfortunately, such systems are difficult to simulate, as the system-bath interactions cannot be treated perturbatively and standard approaches are invalid or inefficient. Here we combine time-dependent density matrix renormalization group methods with techniques from the theory of orthogonal polynomials to provide an efficient method for simulating open quantum systems, including spin-boson models and their generalisations to multi-component systems.
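
    Schematically, and in our own notation rather than the paper's (the standard assumptions for this kind of orthogonal-polynomial chain mapping), the construction is:

        % A system operator A coupled linearly to a bosonic continuum with
        % spectral density J(omega) = pi * h(omega)^2 (spin-boson form):
        \[
          H = H_{\mathrm{sys}}
              + \int \! d\omega\, \omega\, a_{\omega}^{\dagger} a_{\omega}
              + A \int \! d\omega\, h(\omega)\,
                (a_{\omega} + a_{\omega}^{\dagger}).
        \]
        % Introducing modes b_n = \int d\omega\, h(\omega) p_n(\omega) a_\omega,
        % with {p_n} the orthonormal polynomials for the measure
        % h(\omega)^2 d\omega, turns H into a semi-infinite nearest-neighbour
        % chain, the natural geometry for time-dependent DMRG:
        \[
          H = H_{\mathrm{sys}} + c_0\, A\,(b_0 + b_0^{\dagger})
              + \sum_{n \ge 0} \Big[ \omega_n\, b_n^{\dagger} b_n
              + t_n \big( b_{n+1}^{\dagger} b_n + b_n^{\dagger} b_{n+1} \big)
                \Big],
        \]
        % where \omega_n and t_n are the three-term recurrence coefficients of
        % the p_n and c_0^2 = \int d\omega\, h(\omega)^2.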