
    Catastrophic Particle Production under Periodic Perturbation

    We develop a formalism to investigate the behavior of a quantum field and its ground state when the field is coupled to a periodically oscillating perturbation. Working in the Schrödinger picture of quantum field theory, we confirm that the phenomenon of parametric resonance in the classical theory implies an instability of the quantum vacuum and, correspondingly, gives rise to catastrophic particle production if the oscillation lasts indefinitely: the number of produced particles increases exponentially without bound as time proceeds. The density matrix describing the limiting stage of the quantum state is determined by a small set of parameters. Moreover, the energy spectrum and the intensity of produced particles are worked out in detail in the limit of weak coupling or small-amplitude perturbation. In the case of strong coupling or large-amplitude perturbation the leading adiabatic formula is derived. Application to the cosmological fate of weakly interacting spinless fields (WISF), such as the invisible axion, the Polonyi field, and the modular fields, is discussed. Although very little effect is expected on the invisible axion, a Polonyi-type field may catastrophically decay at an early epoch without much production of entropy, provided that an intrinsic coupling is large enough. Comment: 33 pages
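
    The mechanism can be summarised in the standard textbook form of parametric resonance; the mode equation and growth exponent below are illustrative and are not copied from the paper's Schrödinger-picture formalism.

```latex
% Mathieu-type mode equation for a field coupled to a periodic perturbation
% (illustrative notation, not the paper's formalism):
\[
  \ddot{\chi}_k + \bigl[\, k^2 + m^2 + g\,\Phi_0 \cos(\omega t) \,\bigr]\chi_k = 0 .
\]
% Inside a resonance band the mode grows with a positive Floquet exponent,
\[
  \chi_k(t) \propto e^{\mu_k t}, \qquad n_k(t) \sim e^{2\mu_k t},
\]
% so the occupation number of produced quanta increases exponentially
% without bound, which is the "catastrophic" production described above.
```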

    Polynomial tuning of multiparametric combinatorial samplers

    Boltzmann samplers and the recursive method are prominent algorithmic frameworks for the approximate-size and exact-size random generation of large combinatorial structures, such as maps, tilings, RNA sequences or various tree-like structures. In their multiparametric variants, these samplers allow one to control the profile of expected values corresponding to multiple combinatorial parameters. One can control, for instance, the number of leaves or the profile of node degrees in trees, or the number of certain subpatterns in strings. However, such flexible control requires an additional non-trivial tuning procedure. In this paper, we propose an efficient tuning algorithm, polynomial-time with respect to the number of tuned parameters, based on convex optimisation techniques. Finally, we illustrate the efficiency of our approach with several applications to rational, algebraic and Pólya structures, including polyomino tilings with prescribed tile frequencies, planar trees with a given node degree distribution, and weighted partitions. Comment: Extended abstract, accepted to ANALCO2018. 20 pages, 6 figures, colours. Implementation and examples are available at [1] https://github.com/maciej-bendkowski/boltzmann-brain [2] https://github.com/maciej-bendkowski/multiparametric-combinatorial-sampler
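
    To make the idea concrete, here is a minimal Python sketch of a tuned Boltzmann sampler for a toy case: Motzkin (unary-binary) trees in which an extra weight u marks leaves. For this single extra parameter the two tuning equations (target expected size N and target expected leaf fraction f) happen to admit a closed-form solution, so no optimiser is needed; the general multiparametric case, solved by convex optimisation, is the paper's contribution. The specification, names and target values below are illustrative, not taken from the paper or its implementation.

```python
import random

# Toy specification: Motzkin trees, M(z, u) = z*u + z*M + z*M^2,
# where z is the Boltzmann weight per node and u additionally marks leaves.
# Targets: expected size N and expected leaf fraction f (ratio of means).
# For this one-extra-parameter toy case the moment equations solve in closed
# form; assumes 1/N < f < (1 + 1/N)/2, the range reachable by Motzkin trees.
def tune(N, f):
    z = 1.0 - 2.0 * f + 1.0 / N            # from E[size] = N
    M = (f - 1.0 / N) / z                   # value of the generating function
    u = f * (f - 1.0 / N) / (z * z)         # from E[leaves] / E[size] = f
    p_leaf, p_unary = z * u / M, z          # branching probabilities at (z, u)
    return p_leaf, p_unary                  # binary: remaining probability

def sample_size_and_leaves(p_leaf, p_unary, rng):
    """Draw one Boltzmann-distributed tree; return (size, number of leaves)."""
    pending, size, leaves = 1, 0, 0         # pending = subtrees still to draw
    while pending:
        pending -= 1
        size += 1
        r = rng.random()
        if r < p_leaf:
            leaves += 1                     # leaf node
        elif r < p_leaf + p_unary:
            pending += 1                    # unary node
        else:
            pending += 2                    # binary node
    return size, leaves

if __name__ == "__main__":
    N_target, f_target = 50, 0.30           # illustrative targets
    p_leaf, p_unary = tune(N_target, f_target)
    rng = random.Random(1)
    samples = [sample_size_and_leaves(p_leaf, p_unary, rng)
               for _ in range(20000)]
    avg_size = sum(s for s, _ in samples) / len(samples)
    leaf_frac = sum(l for _, l in samples) / sum(s for s, _ in samples)
    print(f"empirical size {avg_size:.1f} (target {N_target}), "
          f"leaf fraction {leaf_frac:.3f} (target {f_target})")
```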

    An overview of the proper generalized decomposition with applications in computational rheology

    We review the foundations and applications of the proper generalized decomposition (PGD), a powerful model reduction technique that computes a priori, by means of successive enrichment, a separated representation of the unknown field. The computational complexity of the PGD scales linearly with the dimension of the space wherein the model is defined, in marked contrast with the exponential scaling of standard grid-based methods. First introduced in the context of computational rheology by Ammar et al. [3] and [4], the PGD has since been further developed and applied in a variety of applications ranging from the solution of the Schrödinger equation of quantum mechanics to the analysis of laminate composites. In this paper, we illustrate the use of the PGD in four problem categories related to computational rheology: (i) the direct solution of the Fokker-Planck equation for complex fluids in configuration spaces of high dimension, (ii) the development of very efficient non-incremental algorithms for transient problems, (iii) the fully three-dimensional solution of problems defined in degenerate plate or shell-like domains often encountered in polymer processing or composites manufacturing, and finally (iv) the solution of multidimensional parametric models obtained by introducing various sources of problem variability as additional coordinates.
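
    The separated representation at the heart of the PGD, and the source of its linear scaling with dimension, can be written schematically as follows (generic notation, not tied to any of the four applications above).

```latex
% Separated representation sought by the PGD for a field defined over
% d coordinates (each coordinate may be space, time, or a model parameter):
\[
  u(x_1,\dots,x_d) \;\approx\; u_N(x_1,\dots,x_d)
    \;=\; \sum_{i=1}^{N} \prod_{j=1}^{d} F_j^{i}(x_j).
\]
% The terms are computed one at a time ("successive enrichment"): with
% u_{N-1} known, the one-dimensional functions F_1^N,\dots,F_d^N of the next
% rank-one term are obtained by an alternating-directions fixed point.
% Storing n degrees of freedom per coordinate costs O(N d n) instead of the
% O(n^d) of a full tensor-product grid.
```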

    Bayesian optimisation for likelihood-free cosmological inference

    Many cosmological models have only a finite number of parameters of interest, but a very expensive data-generating process and an intractable likelihood function. We address the problem of performing likelihood-free Bayesian inference from such black-box simulation-based models, under the constraint of a very limited simulation budget (typically a few thousand). To do so, we adopt an approach based on the likelihood of an alternative parametric model. Conventional approaches to approximate Bayesian computation such as likelihood-free rejection sampling are impractical for the considered problem, due to the lack of knowledge about how the parameters affect the discrepancy between observed and simulated data. As a response, we make use of a strategy previously developed in the machine learning literature (Bayesian optimisation for likelihood-free inference, BOLFI), which combines Gaussian process regression of the discrepancy to build a surrogate surface with Bayesian optimisation to actively acquire training data. We extend the method by deriving an acquisition function tailored for the purpose of minimising the expected uncertainty in the approximate posterior density, in the parametric approach. The resulting algorithm is applied to the problems of summarising Gaussian signals and inferring cosmological parameters from the Joint Lightcurve Analysis supernovae data. We show that the number of required simulations is reduced by several orders of magnitude, and that the proposed acquisition function produces more accurate posterior approximations, as compared to common strategies. Comment: 16+9 pages, 12 figures. Matches PRD published version after minor modification
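
    A minimal sketch of a BOLFI-style loop on a toy problem (inferring a Gaussian mean): a Gaussian-process surrogate is fitted to simulated discrepancies, and new simulations are acquired where the surrogate is low or uncertain. The lower-confidence-bound rule used here is a common stand-in, not the acquisition function derived in the paper, and the simulator, discrepancy and all parameter values are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(0)
theta_true, n_data = 1.7, 50
observed = rng.normal(theta_true, 1.0, size=n_data)

def simulate(theta):
    """Black-box simulator: synthetic data set for parameter theta."""
    return rng.normal(theta, 1.0, size=n_data)

def discrepancy(theta):
    """Distance between observed and simulated summaries (here: the mean)."""
    return (simulate(theta).mean() - observed.mean()) ** 2

# Small random initial design over the prior range [-4, 6].
grid = np.linspace(-4.0, 6.0, 400).reshape(-1, 1)
thetas = list(rng.uniform(-4.0, 6.0, size=5))
deltas = [discrepancy(t) for t in thetas]

gp = GaussianProcessRegressor(
    kernel=Matern(nu=2.5) + WhiteKernel(noise_level=0.1),
    normalize_y=True,
)

# Active acquisition: refit the GP surrogate, then simulate where the lower
# confidence bound of the predicted discrepancy is smallest (exploit a low
# mean, explore high uncertainty).  Total budget: 5 + 30 = 35 simulations.
kappa = 2.0
for _ in range(30):
    gp.fit(np.asarray(thetas).reshape(-1, 1), np.asarray(deltas))
    mean, std = gp.predict(grid, return_std=True)
    theta_next = float(grid[np.argmin(mean - kappa * std), 0])
    thetas.append(theta_next)
    deltas.append(discrepancy(theta_next))

# The paper then turns the surrogate into an approximate posterior via its
# parametric construction; here we only report the discrepancy-minimising
# parameter as a point estimate.
mean, _ = gp.predict(grid, return_std=True)
print("discrepancy-minimising theta:", float(grid[np.argmin(mean), 0]),
      "| true value:", theta_true)
```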

    Quantum System under Periodic Perturbation: Effect of Environment

    In many physical situations the behavior of a quantum system is affected by interaction with a larger environment. Using the method of the influence functional, we show how to deduce the density matrix of the quantum system incorporating the effect of the environment. After introducing a characterization of the environment by a spectral weight, we first devise schemes to approximate the spectral weight, and then a perturbation method in field theory models, in order to approximately describe the environment. All of these approximate models may be classified as extended Ohmic models of dissipation, whose differences lie in the high-frequency part. The quantum system we deal with in the present work is a general class of harmonic oscillators with arbitrary time-dependent frequency. The late-time behavior of the system is well described by an approximation that employs a localized friction in the dissipative part of the correlation function appearing in the influence functional. The density matrix of the quantum system is then determined in terms of a single classical solution obtained with the time-dependent frequency. With this one can compute the entropy, the energy distribution function, and other physical quantities of the system in closed form. A specific application is made to the case of periodically varying frequency. This dynamical system has a remarkable property when the environmental interaction is switched off: the parametric resonance gives rise to an exponential growth of the population of higher excitation levels, or particle production in field theory models. The effect of the environment is investigated for this dynamical system, and it is demonstrated that there exists... Comment: 55 pages, LaTeX file plus 13 PS figures. A few calculational mistakes and the corresponding figure 1 in the field theory model corrected, and some changes made for publication in Phys. Rev. D (in press).
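
    The setting can be pictured with the standard system-plus-bath model; the Hamiltonian, spectral weight and periodic frequency below are written in generic notation with convention-dependent factors, not copied from the paper.

```latex
% Harmonic oscillator with time-dependent frequency, linearly coupled to a
% bath of oscillators; the environment enters only through its spectral
% weight J(\omega):
\[
  H(t) = \frac{p^2}{2} + \frac{1}{2}\,\omega^2(t)\,q^2
       + \sum_a \Bigl[ \frac{p_a^2}{2} + \frac{1}{2}\,\omega_a^2 q_a^2 \Bigr]
       + q \sum_a c_a q_a ,
  \qquad
  J(\omega) \propto \sum_a \frac{c_a^2}{\omega_a}\,\delta(\omega - \omega_a).
\]
% Ohmic dissipation corresponds to J(\omega) \simeq \eta\,\omega at low
% frequencies; the extended Ohmic models mentioned above differ only in the
% high-frequency cutoff.  A periodic perturbation means, for instance,
% \omega^2(t) = \omega_0^2\,[\,1 + \epsilon \cos(\Omega t)\,].
```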

    Semiparametric Bayesian inference in multiple equation models

    This paper outlines an approach to Bayesian semiparametric regression in multiple equation models which can be used to carry out inference in seemingly unrelated regressions or simultaneous equations models with nonparametric components. The approach treats the points on each nonparametric regression line as unknown parameters and uses a prior on the degree of smoothness of each line to ensure valid posterior inference despite the fact that the number of parameters is greater than the number of observations. We develop an empirical Bayes approach that allows us to estimate the prior smoothing hyperparameters from the data. An advantage of our semiparametric model is that it is written as a seemingly unrelated regressions model with an independent normal-Wishart prior. Since this model is a common one, textbook results for posterior inference, model comparison, prediction and posterior computation are immediately available. We use this model in an application involving a two-equation structural model drawn from the labour and returns-to-schooling literatures.
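
    One common way to formalise the construction described above, in illustrative notation (the paper's exact prior may differ): the fitted values of each nonparametric line are treated as parameters and penalised through a prior on their differences.

```latex
% Single equation with ordered covariate values x_1 < \dots < x_n:
\[
  y_i = f(x_i) + \varepsilon_i, \qquad
  \gamma = \bigl(f(x_1),\dots,f(x_n)\bigr)' \ \text{treated as unknown parameters},
\]
% with a smoothness prior on second differences of the fitted line,
\[
  f(x_i) - 2 f(x_{i-1}) + f(x_{i-2}) \;\sim\; N(0,\eta), \qquad i = 3,\dots,n,
\]
% so that a small hyperparameter \eta shrinks the line towards linearity.
% Stacking the equations gives a seemingly unrelated regressions model to
% which the usual normal-Wishart conjugate results apply, and \eta can be
% estimated from the data in the empirical Bayes step.
```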

    On the Number of Zeros of Abelian Integrals: A Constructive Solution of the Infinitesimal Hilbert Sixteenth Problem

    We prove that the number of limit cycles generated by a small non-conservative perturbation of a Hamiltonian polynomial vector field on the plane is bounded by a double exponential of the degree of the fields. This solves the long-standing tangential Hilbert 16th problem. The proof uses only the fact that Abelian integrals of a given degree are horizontal sections of a regular flat meromorphic connection (the Gauss-Manin connection) with a quasi-unipotent monodromy group. Comment: Final revision
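
    For orientation, the objects behind the statement in standard (illustrative) notation: the perturbed Hamiltonian system and the Abelian integral whose zeros control the limit cycles born from the ovals of the Hamiltonian.

```latex
% Small polynomial perturbation of a Hamiltonian vector field on the plane:
\[
  \dot{x} = \frac{\partial H}{\partial y} + \varepsilon\, f(x,y), \qquad
  \dot{y} = -\frac{\partial H}{\partial x} + \varepsilon\, g(x,y),
\]
% and the Abelian integral over the closed level ovals \{H = t\},
\[
  I(t) = \oint_{H = t} \bigl( g\,dx - f\,dy \bigr)
\]
% (sign and orientation conventions vary).  By the Poincare--Pontryagin
% criterion, limit cycles born from the ovals as \varepsilon \to 0 correspond
% to zeros of I(t); the theorem bounds the number of such zeros, and hence of
% these limit cycles, by a double exponential in the common degree of H, f, g.
```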