
    The solar siblings in the Gaia era

    We perform realistic simulations of the Sun's birth cluster in order to predict the current distribution of solar siblings in the Galaxy. We study the possibility of finding the solar siblings in the Gaia catalogue by using only positional and kinematic information. We find that the number of solar siblings predicted to be observed by Gaia will be around 100 in the most optimistic case, and that a phase-space-only search in the Gaia catalogue will be extremely difficult. It is therefore mandatory to combine the chemical tagging technique with phase-space selection criteria in order to have any hope of finding the solar siblings.

    Comment: To be published in the proceedings of the GREAT-ITN conference "The Milky Way Unravelled by Gaia: GREAT Science from the Gaia Data Releases", 1-5 December 2014, University of Barcelona, Spain, EAS Publications Series, eds Nicholas Walton, Francesca Figueras, and Caroline Soubira
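A phase-space selection of the kind the abstract describes can be sketched as a simple kinematic cut. This is a minimal illustration, not the authors' pipeline: the catalogue is synthetic, the (U, V, W) solar-motion values and the 5 km/s threshold are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical catalogue: rows of (U, V, W) heliocentric velocities in km/s,
# drawn from a broad Gaussian as a stand-in for a Gaia-like sample.
rng = np.random.default_rng(0)
catalogue = rng.normal(0.0, 40.0, size=(100_000, 3))

# Assumed solar peculiar velocity in km/s (illustrative values only).
v_sun = np.array([11.1, 12.24, 7.25])

# Velocity-space cut keeping stars within 5 km/s of the solar motion.
# A real sibling search would combine this with positions, ages, and,
# as the abstract stresses, chemical tagging.
dist = np.linalg.norm(catalogue - v_sun, axis=1)
candidates = np.flatnonzero(dist < 5.0)
print(len(candidates))
```

The point of the sketch is the abstract's conclusion: a cut this loose still leaves far too many interlopers for kinematics alone to isolate true siblings.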

    Time-Reversal Symmetry Breaking and Decoherence in Chaotic Dirac Billiards

    In this work, we perform a statistical study of Dirac billiards in the extreme quantum limit (a single open channel on the leads). Our numerical analysis uses a large ensemble of random matrices and demonstrates the preponderant role of dephasing mechanisms in such chaotic billiards. Physical implementations of these billiards range from graphene quantum dots to topological insulator structures. We show, in particular, that finite crossover fields between the universal symmetry classes quickly drive the conductance to the asymptotic limit of the unitary ensemble. Furthermore, we show that dephasing mechanisms strikingly drive Dirac billiards from the extreme quantum regime to the semiclassical Gaussian regime.
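The two universal limits mentioned here can be illustrated with the standard single-channel random-matrix result (not the paper's own ensemble): for one open channel per lead, the transmission T equals the dimensionless conductance, with density P(T) = 1/(2√T) in the time-reversal-symmetric (orthogonal) class and P(T) = 1 (uniform) in the unitary class reached once a crossover field breaks time reversal.

```python
import numpy as np

# Sample single-channel transmissions for the two universal symmetry classes.
rng = np.random.default_rng(1)
n = 1_000_000

# Orthogonal class: P(T) = 1/(2*sqrt(T)) on [0, 1]; inverse-CDF sampling
# gives T = U^2 for U uniform.
T_orthogonal = rng.random(n) ** 2

# Unitary class (broken time-reversal symmetry): T uniform on [0, 1].
T_unitary = rng.random(n)

# Mean conductance shifts from 1/3 toward 1/2 across the crossover.
print(T_orthogonal.mean())  # ~ 1/3
print(T_unitary.mean())     # ~ 1/2
```

The shift of the mean from 1/3 to 1/2 is the standard signature of the orthogonal-to-unitary crossover that the abstract refers to.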

    Modeling the input history of programs for improved instruction-memory performance

    When a program is loaded into memory for execution, the relative position of its basic blocks is crucial, since loading basic blocks that are unlikely to be executed first places them high in the instruction-memory hierarchy only to be dislodged as the execution goes on. In this paper we study the use of Bayesian networks as models of the input history of a program. The main point is the creation of a probabilistic model that persists as the program is run on different inputs and at each new input refines its own parameters in order to reflect the program's input history more accurately. As the model is thus tuned, it causes basic blocks to be reordered so that, upon arrival of the next input for execution, loading the basic blocks into memory automatically takes into account the input history of the program. We report on extensive experiments, whose results demonstrate the efficacy of the overall approach in progressively lowering the execution times of a program on identical inputs placed randomly in a sequence of varied inputs. We provide results on selected SPEC CINT2000 programs and also evaluate our approach as compared to the gcc level-3 optimization and to Pettis-Hansen reordering.