
    Observational Constraints on the Averaged Universe

    Averaging in general relativity is a complicated operation, due to the general covariance of the theory and the non-linearity of Einstein's equations. The latter ensures that smoothing spacetime over cosmological scales does not yield the same result as solving Einstein's equations with a smooth matter distribution, and that the smooth models we fit to observations need not be simply related to the actual geometry of spacetime. One specific consequence of this is a decoupling of the geometrical spatial curvature term in the metric from the dynamical spatial curvature in the Friedmann equation. Here we investigate the consequences of this decoupling by fitting to a combination of HST, CMB, SNIa and BAO data sets. We find that only the geometrical spatial curvature is tightly constrained, and that our ability to constrain dark energy dynamics will be severely impaired until we gain a thorough understanding of the averaging problem in cosmology.
    Comment: 6 pages, 4 figures

    Randomized Extended Kaczmarz for Solving Least-Squares

    We present a randomized iterative algorithm that converges exponentially in expectation to the minimum Euclidean norm least-squares solution of a given linear system of equations. The expected number of arithmetic operations required to obtain an estimate of given accuracy is proportional to the squared condition number of the system multiplied by the number of non-zero entries of the input matrix. The proposed algorithm is an extension of the randomized Kaczmarz method that was analyzed by Strohmer and Vershynin.
    Comment: 19 pages, 5 figures; code is available at https://github.com/zouzias/RE
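    The randomized extended Kaczmarz iteration described in this abstract can be sketched as follows. This is a minimal illustration of the published algorithm, not the authors' reference code: it interleaves a column step, which removes from a vector z the component of b outside the range of A, with a standard row (Kaczmarz) step against the corrected right-hand side. All variable names and the toy system are assumptions for the sketch.

```python
import numpy as np

def randomized_extended_kaczmarz(A, b, iters=20_000, seed=0):
    """Sketch of randomized extended Kaczmarz (REK): converges in
    expectation to the minimum-norm least-squares solution of A x ~ b."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.einsum("ij,ij->i", A, A)   # squared row norms
    col_norms = np.einsum("ij,ij->j", A, A)   # squared column norms
    p_row = row_norms / row_norms.sum()       # row sampling probabilities
    p_col = col_norms / col_norms.sum()       # column sampling probabilities
    x = np.zeros(n)
    z = b.astype(float).copy()  # converges to the part of b outside range(A)
    for _ in range(iters):
        # Column step: project z orthogonally to the sampled column.
        j = rng.choice(n, p=p_col)
        z -= (A[:, j] @ z) / col_norms[j] * A[:, j]
        # Row step: Kaczmarz update against the corrected system A x = b - z.
        i = rng.choice(m, p=p_row)
        x += (b[i] - z[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Usage: a small overdetermined, inconsistent system.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])
x_rek = randomized_extended_kaczmarz(A, b)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)  # reference least-squares solution
```

    For an inconsistent system, the plain randomized Kaczmarz iteration does not converge to the least-squares solution; the extra column step is what makes the extension above handle the noisy right-hand side.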

    Solving the riddle of codon usage preferences: a test for translational selection

    Translational selection is responsible for the unequal usage of synonymous codons in protein-coding genes in a wide variety of organisms. It is one of the most subtle and pervasive forces of molecular evolution, yet establishing the underlying causes of its idiosyncratic behaviour across living kingdoms has proven elusive to researchers over the past 20 years. In this study, a statistical model for measuring translational selection in any given genome is developed, and the test is applied to 126 fully sequenced genomes, ranging from archaea to eukaryotes. It is shown that tRNA gene redundancy and genome size are interacting forces that ultimately determine the action of translational selection, and that an optimal genome size exists for which this kind of selection is maximal. Accordingly, genome size also presents upper and lower boundaries beyond which selection on codon usage is not possible. We propose a model in which the coevolution of genome size and tRNA genes explains the observed patterns of translational selection in all living organisms. This model finally unifies our understanding of codon usage across prokaryotes and eukaryotes. Helicobacter pylori, Saccharomyces cerevisiae and Homo sapiens are codon usage paradigms that can be better understood under the proposed model.
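    The unequal usage of synonymous codons that this abstract refers to is commonly quantified by relative synonymous codon usage (RSCU): the observed count of a codon divided by the count expected if all synonyms in its family were used equally. The sketch below computes RSCU for a toy sequence; the truncated family table and the example sequence are assumptions for illustration, not part of the study's model.

```python
from collections import Counter

# Illustrative subset of synonymous codon families (the full genetic
# code has many more); family names here are assumptions for the sketch.
FAMILIES = {
    "Phe": ["TTT", "TTC"],
    "Gly": ["GGT", "GGC", "GGA", "GGG"],
}

def rscu(seq):
    """Relative synonymous codon usage: observed codon count divided by
    the expected count under equal usage within each synonym family."""
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    counts = Counter(codons)
    values = {}
    for family in FAMILIES.values():
        total = sum(counts[c] for c in family)
        expected = total / len(family) if total else 0.0
        for c in family:
            values[c] = counts[c] / expected if expected else 0.0
    return values

# 7 codons: TTT x2, TTC x1, GGT x4 (a strongly biased toy example).
usage = rscu("TTTTTTTTCGGTGGTGGTGGT")
# usage["GGT"] == 4.0: GGT is used four times more than expected.
```

    An RSCU of 1.0 means a codon is used exactly as often as expected under no bias; translational selection shows up as values consistently far from 1.0 for codons matching abundant tRNAs.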

    The application of reliability methods in the design of stiffened FRP composite panels for marine vessels

    The use of composite laminate materials has increased rapidly in recent years due to their excellent strength-to-weight ratio and resistance to corrosion. In the construction of marine vessels, stiffened plates are the most commonly used structural elements, forming the deck, bottom hull, side shells and bulkheads. This paper presents the use of a stochastic approach to the design of stiffened marine composite panels, as part of a current research programme into developing stochastic methods for composite ship structures, accounting for variations in material properties, geometric indices and processing techniques, from the component level to the full system level. An analytical model for the solution of a stiffened isotropic plate using a grillage analogy is extended by the use of equivalent elastic properties for composite modelling. This methodology is applied in a reliability analysis of an isotropic (steel) stiffened plate before the final application in a reliability analysis of an FRP composite stiffened plate.
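    The stochastic design approach described above rests on estimating a probability of failure for a limit state such as g = R − S (resistance minus load effect) when the inputs are random variables. A minimal Monte Carlo sketch of that step is shown below; the normal distributions and their means and standard deviations are illustrative assumptions, not values from the paper.

```python
import random
import statistics

def failure_probability(n_samples=200_000, seed=1):
    """Monte Carlo estimate of P(g < 0) for the limit state g = R - S.
    The distributions below are assumed for illustration only."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        R = rng.gauss(300.0, 30.0)  # panel strength, MPa (assumed)
        S = rng.gauss(180.0, 25.0)  # applied stress, MPa (assumed)
        if R - S < 0:
            failures += 1
    return failures / n_samples

pf = failure_probability()
# The reliability index beta maps the failure probability back through
# the standard normal distribution: beta = -Phi^{-1}(P_f).
beta = -statistics.NormalDist().inv_cdf(pf)
```

    For this linear limit state with independent normal inputs, beta can also be computed in closed form as (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), which is a useful check on the sampled estimate.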

    Encoded

    ENCODED is an immersive aerial dance performance and installation that uses the latest interactive technologies to build a projected digital environment that responds to the movements of the performers.

    Rational solutions of the discrete time Toda lattice and the alternate discrete Painleve II equation

    The Yablonskii-Vorob'ev polynomials $y_{n}(t)$, which are defined by a second order bilinear differential-difference equation, provide rational solutions of the Toda lattice. They are also polynomial tau-functions for the rational solutions of the second Painlev\'{e} equation ($P_{II}$). Here we define two-variable polynomials $Y_{n}(t,h)$ on a lattice with spacing $h$, by considering rational solutions of the discrete time Toda lattice as introduced by Suris. These polynomials are shown to have many properties that are analogous to those of the Yablonskii-Vorob'ev polynomials, to which they reduce when $h=0$. They also provide rational solutions for a particular discretisation of $P_{II}$, namely the so-called {\it alternate discrete} $P_{II}$, and this connection leads to an expression in terms of the Umemura polynomials for the third Painlev\'{e} equation ($P_{III}$). It is shown that the B\"{a}cklund transformation for the alternate discrete Painlev\'{e} equation is a symplectic map, and that the shift in time is also symplectic. Finally, we present a Lax pair for the alternate discrete $P_{II}$, which recovers Jimbo and Miwa's Lax pair for $P_{II}$ in the continuum limit $h \to 0$.
    Comment: 23 pages, IOP style. Title changed, and connection with Umemura polynomials added
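    The continuum ($h = 0$) objects in this abstract can be generated explicitly. The standard bilinear differential-difference recurrence for the Yablonskii-Vorob'ev polynomials is $y_{n+1}\, y_{n-1} = t\, y_n^2 - 4\,(y_n y_n'' - (y_n')^2)$ with $y_0 = 1$, $y_1 = t$; the sketch below (using sympy, an assumed dependency) computes the first few members. This is the classical recurrence, not the paper's two-variable $Y_n(t,h)$ construction.

```python
import sympy as sp

t = sp.symbols("t")

def yablonskii_vorobev(n_max):
    """First few Yablonskii-Vorob'ev polynomials from the recurrence
    y_{n+1} y_{n-1} = t*y_n**2 - 4*(y_n*y_n'' - (y_n')**2),
    with y_0 = 1, y_1 = t (the h = 0 limit of the paper's Y_n(t, h))."""
    ys = [sp.Integer(1), t]
    for _ in range(1, n_max):
        yn, ym = ys[-1], ys[-2]
        num = t * yn**2 - 4 * (yn * sp.diff(yn, t, 2) - sp.diff(yn, t) ** 2)
        # The division by y_{n-1} is exact: the quotient is a polynomial.
        ys.append(sp.expand(sp.cancel(num / ym)))
    return ys

polys = yablonskii_vorobev(3)
# polys[2] == t**3 + 4 and polys[3] == t**6 + 20*t**3 - 80
```

    The rational solutions of $P_{II}$ for parameter $\alpha = n$ are then recovered as logarithmic-derivative ratios of consecutive polynomials.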

    Synergistic Gravity and the Role of Resonances in GRS-Inspired Braneworlds

    We consider 5D braneworld models of quasi-localized gravity in which 4D gravity is reproduced at intermediate scales while the extra dimension opens up at both very short and very long distances, where the geometry is flat. Our main interest is the interplay between the zero mode of these models, whenever a normalizable zero mode exists, and the effects of zero-energy graviton resonant modes coming from the contributions of massive KK modes. We first consider a compactified version of the GRS model and find that quasi-localized gravity is characterized by a scale at which both the resonance and the zero mode contribute significantly to 4D gravity. Above this scale, gravity is primarily mediated by the zero mode, while the resonance gives only minor corrections. Next, we consider an asymmetric version of the standard non-compact GRS model, characterized by different cosmological constants on each AdS side. We show that a resonance is present, but that the asymmetry, through the form of the localizing potential, can weaken it, resulting in a shorter lifetime and, thus, a shorter distance scale for 4D gravity. As a third model exhibiting quasi-localization, we consider a version of the GRS model in which the central positive-tension brane has been replaced by a configuration of a scalar field propagating in the bulk.
    Comment: 18 pages, 3 figures, added 1 figure, revised version as published in Class. Quant. Grav.

    Trust in Crowds: probabilistic behaviour in anonymity protocols

    The existing analysis of the Crowds anonymity protocol assumes that a participating member is either 'honest' or 'corrupted'. This paper generalises this analysis so that each member is assumed to maliciously disclose the identity of other nodes with a probability determined by her vulnerability to corruption. Within this model, the trust in a principal is defined to be the probability that she behaves honestly. We investigate the effect of such probabilistic behaviour on the anonymity of the principals participating in the protocol, and formulate the necessary conditions to achieve 'probable innocence'. Using these conditions, we propose a generalised Crowds-Trust protocol which uses trust information to achieve 'probable innocence' for principals exhibiting probabilistic behaviour.
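    The 'probable innocence' property that this abstract generalises has a simple closed form in the original uniform-trust analysis of Crowds (Reiter and Rubin): with n members, c corrupt members and forwarding probability pf > 1/2, the initiator precedes the first corrupt node on the path with probability 1 − pf(n − c − 1)/n, and probable innocence (that probability at most 1/2) holds iff n ≥ pf/(pf − 1/2) · (c + 1). The sketch below encodes that classical condition, not the paper's probabilistic-trust model.

```python
def detection_probability(n, c, pf):
    """Probability, in the classical Crowds analysis, that the true
    initiator immediately precedes the first corrupt node on the path."""
    return 1.0 - pf * (n - c - 1) / n

def probable_innocence(n, c, pf):
    """Probable innocence: detection probability at most 1/2,
    equivalently n >= pf/(pf - 1/2) * (c + 1); requires pf > 1/2."""
    return n >= pf / (pf - 0.5) * (c + 1)

# With pf = 0.75 and c = 2 corrupt members, the crowd must have n >= 9:
ok_9 = probable_innocence(9, 2, 0.75)   # True: boundary case
ok_8 = probable_innocence(8, 2, 0.75)   # False: crowd too small
p_9 = detection_probability(9, 2, 0.75)  # exactly 0.5 at the boundary
```

    The paper's contribution replaces the single corruption count c with a per-member disclosure probability, so this uniform bound becomes a special case of its trust-weighted condition.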