
    Evolution of large-scale perturbations in quintessence models

    We carry out a comprehensive study of the dynamics of large-scale perturbations in quintessence scenarios. We model the contents of the Universe by a perfect fluid with equation of state w_f and a scalar field Q with potential V(Q). We are able to reduce the perturbation equations to a system of four first-order equations. During each of the five main regimes of quintessence field behaviour, these equations have constant coefficients, enabling analytic solution of the perturbation evolution by eigenvector decomposition. We determine these solutions and discuss their main properties. Comment: 5 pages, RevTeX4 file with two figures incorporated.
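
    As a toy illustration of that last step, the Python sketch below solves a generic constant-coefficient first-order system x'(t) = A x(t) by eigenvector decomposition. The 4x4 matrix A is an arbitrary placeholder, not the paper's actual perturbation-equation coefficients.

        import numpy as np

        # Toy constant-coefficient system x'(t) = A x(t); this matrix is a
        # placeholder, NOT the paper's perturbation-equation coefficients.
        A = np.array([[ 0.0,  1.0,  0.0,  0.0],
                      [-2.0, -1.0,  0.5,  0.0],
                      [ 0.0,  0.0,  0.0,  1.0],
                      [ 0.3,  0.0, -1.5, -0.5]])

        eigvals, eigvecs = np.linalg.eig(A)   # A = P diag(lambda) P^{-1}

        def solve(x0, t):
            # General solution x(t) = P exp(diag(lambda) t) P^{-1} x0.
            c = np.linalg.solve(eigvecs, x0)  # coefficients in the eigenbasis
            return (eigvecs @ (c * np.exp(eigvals * t))).real

        print(solve(np.array([1.0, 0.0, 0.0, 0.0]), t=2.0))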

    The ALHAMBRA Project: A large area multi medium-band optical and NIR photometric survey

    (ABRIDGED) We describe the first results of the ALHAMBRA survey, which provides cosmic tomography of the evolution of the contents of the Universe over most of cosmic history. Our approach employs 20 contiguous, equal-width, medium-band filters covering 3500 to 9700 A, plus the JHKs bands, to observe an area of 4 sq. deg. on the sky. The optical photometric system has been designed to maximize the number of objects with accurate classification by SED and redshift, and to be sensitive to relatively faint emission lines. The observations are being carried out with the Calar Alto 3.5m telescope using the cameras LAICA and O-2000. The first data confirm that we are reaching the expected magnitude limits of AB <~ 25 mag in the optical filters from the blue to 8300 A, and from AB = 24.7 to 23.4 for the redder ones. The limits in the NIR are (Vega) K_s ~ 20, H ~ 21, J ~ 22. We expect to obtain accurate redshifts, Delta z/(1+z) <~ 0.03, for about 5x10^5 galaxies with I <~ 25 (60% complete) and z_med = 0.74. This accuracy, together with the homogeneity of the selection function, will allow for the study of the redshift evolution of large-scale structure, the galaxy population and its evolution with redshift, the identification of clusters of galaxies, and many other studies, without the need for any further follow-up. It will also provide targets for detailed studies with 10m-class telescopes. Given its area, spectral coverage, and depth, apart from those main goals, the ALHAMBRA survey will also produce valuable data for galactic studies. Comment: Accepted to the Astronomical Journal. 43 pages, 18 figures. The images have been reduced in resolution to adapt to standard file sizes. Readers can find the full-resolution version of the paper at the ALHAMBRA web site (http://www.iaa.es/alhambra) under the "Publications" link.
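
    For concreteness, the quoted accuracy Delta z/(1+z) <~ 0.03 is the scatter of the normalized residual between photometric and reference (e.g. spectroscopic) redshifts. The Python sketch below computes the standard robust (NMAD) version of that statistic on made-up toy values; it is not ALHAMBRA pipeline code.

        import numpy as np

        # z_phot and z_spec are made-up toy values, not survey data.
        z_phot = np.array([0.71, 0.75, 0.52, 1.10, 0.33])
        z_spec = np.array([0.70, 0.77, 0.50, 1.05, 0.35])

        dz = (z_phot - z_spec) / (1.0 + z_spec)  # normalized redshift residual
        # Robust scatter via the normalized median absolute deviation (NMAD).
        sigma_nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
        print(f"sigma_NMAD = {sigma_nmad:.3f}")  # compare against the 0.03 target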

    Top Quark Physics at the LHC: A Review of the First Two Years

    This review summarizes the highlights in the area of top quark physics obtained with the two general-purpose detectors ATLAS and CMS during the first two years of operation of the Large Hadron Collider (LHC). It covers the 2010 and 2011 data-taking periods, during which the LHC provided pp collisions at a center-of-mass energy of sqrt(s) = 7 TeV. Measurements are presented of the total and differential top quark pair production cross section in many different channels, the top quark mass, and various other properties of the top quark and its interactions, for instance the charge asymmetry. Measurements of single top quark production and various searches for new physics involving top quarks are also discussed. The already very precise experimental data are in good agreement with the standard model. Comment: 107 pages, invited review for Int. J. Mod. Phys. A; v2 is identical to v1 except for the addition of the table of contents.
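
    The counting-experiment relation behind such cross section measurements is sigma = (N_obs - N_bkg) / (efficiency x integrated luminosity). The Python sketch below evaluates it with invented placeholder numbers; these are not ATLAS or CMS results.

        # All numbers are invented placeholders, not ATLAS/CMS measurements.
        N_obs = 12500.0   # selected events in data
        N_bkg = 2500.0    # estimated background events
        eff   = 0.04      # signal acceptance times efficiency
        L_int = 1.6e3     # integrated luminosity in pb^-1

        sigma = (N_obs - N_bkg) / (eff * L_int)  # cross section in pb
        print(f"sigma ~ {sigma:.1f} pb")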

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has great potential for gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. These can be imagined as laboratories devoted to the gathering and processing of enormous volumes of data on both natural systems, such as the Earth and its ecosystem, and human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise from individually customized services, which, however, should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a worldwide scale, it is in the public interest that scientists explore what can be done with the huge amounts of data available. Big data have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. Self-interests of individuals, companies, or institutions have limits where the public interest is affected, and public interest is not a sufficient justification to violate the human rights of individuals. Privacy is a high good, as confidentiality is, and damaging it would have serious side effects for society. Comment: 65 pages, 1 figure, Visioneer White Paper; see http://www.visioneer.ethz.ch

    Reverse engineering to achieve maintainable WWW sites

    The growth of the World Wide Web and the accelerated development of web sites and associated web technologies have resulted in a variety of maintenance problems. The maintenance problems associated with web sites and the WWW are examined. It is argued that web sites and the WWW currently lack both the data abstractions and the structures that could facilitate maintenance. A system to analyse existing web sites and extract duplicated content and style is described here. In designing the system, existing reverse engineering techniques have been applied, and a case is made for further application of these techniques in order to prepare sites for their inevitable evolution in the future.
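
    One minimal way to realize the duplicated-content extraction the abstract describes is to hash each page's text blocks and group identical hashes across the site. The Python sketch below does this naively, splitting on blank lines rather than parsing HTML; the site directory is a hypothetical placeholder, and a real tool would normalise markup first.

        import hashlib
        from collections import defaultdict
        from pathlib import Path

        def find_duplicate_blocks(site_dir: str):
            # Map block hash -> list of (file name, block index) occurrences.
            seen = defaultdict(list)
            for page in Path(site_dir).glob("**/*.html"):
                text = page.read_text(errors="ignore")
                for i, block in enumerate(text.split("\n\n")):
                    block = block.strip()
                    if block:
                        digest = hashlib.sha1(block.encode()).hexdigest()
                        seen[digest].append((page.name, i))
            # Blocks appearing in more than one place are duplication candidates.
            return {h: locs for h, locs in seen.items() if len(locs) > 1}

        print(find_duplicate_blocks("site/"))  # "site/" is a placeholder path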

    Spying the World from your Laptop -- Identifying and Profiling Content Providers and Big Downloaders in BitTorrent

    This paper presents a set of exploits an adversary can use to continuously spy on most BitTorrent users of the Internet from a single machine and for a long period of time. Using these exploits for a period of 103 days, we collected 148 million IPs downloading 2 billion copies of contents. We identify the IP address of the content providers for 70% of the BitTorrent contents we spied on. We show that a few content providers inject most contents into BitTorrent and that those content providers are located in foreign data centers. We also show that an adversary can compromise the privacy of any peer in BitTorrent and identify the big downloaders, which we define as the peers who subscribe to a large number of contents. This infringement of users' privacy poses a significant impediment to the legal adoption of BitTorrent.
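
    The peer-harvesting primitive underlying such monitoring is the tracker announce, which returns a compact peer list of 6 bytes per peer (a 4-byte IPv4 address plus a 2-byte big-endian port). The Python sketch below decodes that standard wire format from a made-up byte string; the paper's actual exploits go well beyond a single announce.

        import socket
        import struct

        def decode_compact_peers(blob: bytes):
            # Each peer entry is 4 bytes of IPv4 address + 2 bytes of port.
            peers = []
            for off in range(0, len(blob) - len(blob) % 6, 6):
                ip = socket.inet_ntoa(blob[off:off + 4])
                (port,) = struct.unpack("!H", blob[off + 4:off + 6])
                peers.append((ip, port))
            return peers

        # Made-up example: one peer at a documentation-range address.
        sample = socket.inet_aton("203.0.113.7") + struct.pack("!H", 51413)
        print(decode_compact_peers(sample))  # [('203.0.113.7', 51413)]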