
    Advanced localization of massive black hole coalescences with LISA

    The coalescence of massive black holes is one of the primary sources of gravitational waves (GWs) for LISA. Measurements of the GWs can localize the source on the sky to an ellipse with a major axis of a few tens of arcminutes to a few degrees, depending on source redshift, and a minor axis which is 2--4 times smaller. The distance (and thus an approximate redshift) can be determined to better than a per cent for the closest sources we consider, although weak lensing degrades this performance. It will be of great interest to search this three-dimensional `pixel' for an electromagnetic counterpart to the GW event. The presence of a counterpart allows unique studies which combine electromagnetic and GW information, especially if the counterpart is found prior to the final merger of the holes. To understand the feasibility of early counterpart detection, we calculate the evolution of the GW pixel with time. We find that the greatest improvement in pixel size occurs in the final day before merger, when spin precession effects are maximal. The source can be localized to within 10 square degrees as early as a month before merger at z = 1; for higher redshifts, this accuracy is only possible in the last few days.
    Comment: 11 pages, 4 figures; version published in Classical and Quantum Gravity (special issue for the proceedings of the 7th International LISA Symposium)
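The quoted localization figures can be turned into a sky area directly: the error region is an ellipse whose minor axis is 2--4 times smaller than its major axis. A minimal sketch of that arithmetic (the function name and example numbers are illustrative, not from the paper):

```python
import math

def ellipse_area_deg2(major_axis_deg, axis_ratio):
    """Area of an error ellipse given its full major axis (in degrees)
    and the major-to-minor axis ratio (2-4 per the abstract).
    Semi-axes are half the full axes, so area = pi * (a/2) * (b/2)."""
    minor_axis_deg = major_axis_deg / axis_ratio
    return math.pi * (major_axis_deg / 2) * (minor_axis_deg / 2)

# e.g. a 2-degree major axis with a 3:1 axis ratio gives ~1 deg^2,
# well inside the 10 deg^2 threshold quoted for a month before merger
area = ellipse_area_deg2(2.0, 3.0)
```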

    Berkeley Supernova Ia Program I: Observations, Data Reduction, and Spectroscopic Sample of 582 Low-Redshift Type Ia Supernovae

    In this first paper in a series we present 1298 low-redshift (z ≤ 0.2) optical spectra of 582 Type Ia supernovae (SNe Ia) observed from 1989 through 2008 as part of the Berkeley SN Ia Program (BSNIP). 584 spectra of 199 SNe Ia have well-calibrated light curves with measured distance moduli, and many of the spectra have been corrected for host-galaxy contamination. Most of the data were obtained using the Kast double spectrograph mounted on the Shane 3 m telescope at Lick Observatory and have a typical wavelength range of 3300-10,400 Ang., roughly twice as wide as spectra from most previously published datasets. We present our observing and reduction procedures, and we describe the resulting SN Database (SNDB), which will be an online, public, searchable database containing all of our fully reduced spectra and companion photometry. In addition, we discuss our spectral classification scheme (using the SuperNova IDentification code, SNID; Blondin & Tonry 2007), utilising our newly constructed set of SNID spectral templates. These templates allow us to accurately classify our entire dataset, and by doing so we are able to reclassify a handful of objects as bona fide SNe Ia and a few other objects as members of some of the peculiar SN Ia subtypes. In fact, our dataset includes spectra of nearly 90 spectroscopically peculiar SNe Ia. We also present spectroscopic host-galaxy redshifts of some SNe Ia where these values were previously unknown. [Abridged]
    Comment: 34 pages, 11 figures, 11 tables, revised version, re-submitted to MNRAS. Spectra will be released in January 2013. The SN Database homepage (http://hercules.berkeley.edu/database/index_public.html) contains the full tables, plots of all spectra, and our new SNID templates.
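Template-based classification of the kind SNID performs can be sketched as a cross-correlation against a library of labelled spectra. The toy below is an assumption-laden stand-in, not SNID itself: it assumes all spectra share one wavelength grid and uses only the zero-lag correlation, whereas SNID also fits the redshift by scanning correlation lags in log-wavelength:

```python
import numpy as np

def best_template(spectrum, templates):
    """Return the template name most correlated with an observed spectrum.
    Toy stand-in for SNID-style classification: each spectrum is
    mean-subtracted and variance-normalized, then scored by its
    Pearson correlation with every labelled template."""
    s = (spectrum - spectrum.mean()) / spectrum.std()
    scores = {}
    for name, t in templates.items():
        tn = (t - t.mean()) / t.std()
        scores[name] = float(np.dot(s, tn) / len(s))
    return max(scores, key=scores.get), scores
```

In practice the template set would carry subtype labels (normal Ia, peculiar subtypes, non-Ia types), so the best match both classifies the object and flags the peculiar cases mentioned above.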

    Managing marine disease emergencies in an era of rapid change

    Infectious marine diseases can decimate populations and are increasing among some taxa due to global change and our increasing reliance on marine environments. Marine diseases become emergencies when significant ecological, economic or social impacts occur. We can prepare for and manage these emergencies through improved surveillance, and the development and iterative refinement of approaches to mitigate disease and its impacts. Improving surveillance requires fast, accurate diagnoses, forecasting disease risk and real-time monitoring of disease-promoting environmental conditions. Diversifying impact mitigation involves increasing host resilience to disease, reducing pathogen abundance and managing environmental factors that facilitate disease. Disease surveillance and mitigation can be adaptive if informed by research advances and catalysed by communication among observers, researchers and decision-makers using information-sharing platforms. Recent increases in the awareness of the threats posed by marine diseases may lead to policy frameworks that facilitate the responses and management that marine disease emergencies require.

    Quantum Computing

    Quantum mechanics---the theory describing the fundamental workings of nature---is famously counterintuitive: it predicts that a particle can be in two places at the same time, and that two remote particles can be inextricably and instantaneously linked. These predictions have been the topic of intense metaphysical debate ever since the theory's inception early last century. However, supreme predictive power combined with direct experimental observation of some of these unusual phenomena leave little doubt as to its fundamental correctness. In fact, without quantum mechanics we could not explain the workings of a laser, nor indeed how a fridge magnet operates. Over the last several decades quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit these unique quantum properties? Today it is understood that the answer is yes. Many research groups around the world are working towards one of the most ambitious goals humankind has ever embarked upon: a quantum computer that promises to exponentially improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for this task---ranging from single particles of light to superconducting circuits---and it is not yet clear which, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain what the major challenges are for the future.
    Comment: 26 pages, 7 figures, 291 references. Early draft of Nature 464, 45-53 (4 March 2010). Published version is more up-to-date and has several corrections, but is half the length with far fewer references.
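The two "counterintuitive" predictions the abstract opens with, superposition and entanglement, are straightforward to demonstrate on a classical statevector simulator. A minimal sketch (standard textbook gates, not from the paper): a Hadamard gate puts one qubit into an equal superposition, and a CNOT then entangles it with a second qubit into a Bell state whose measurement outcomes are perfectly correlated.

```python
import numpy as np

# Single-qubit |0> state and the Hadamard gate (creates superposition)
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Two-qubit CNOT gate: flips the second qubit iff the first is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# H|0> = (|0> + |1>)/sqrt(2): "a particle in two places at once"
plus = H @ ket0

# CNOT(|+> x |0>) = (|00> + |11>)/sqrt(2): the Bell state
bell = CNOT @ np.kron(plus, ket0)

# Born-rule probabilities: only the correlated outcomes 00 and 11 occur
probs = bell ** 2  # [0.5, 0, 0, 0.5]
```

Simulating n qubits this way needs a 2^n-element statevector, which is exactly the exponential cost a quantum computer is hoped to sidestep for particular tasks.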

    The Star Formation and Nuclear Accretion Histories of Normal Galaxies in the AGES Survey

    We combine IR, optical and X-ray data from the overlapping, 9.3 square degree NOAO Deep Wide-Field Survey (NDWFS), AGN and Galaxy Evolution Survey (AGES), and XBootes Survey to measure the X-ray evolution of 6146 normal galaxies as a function of absolute optical luminosity, redshift, and spectral type over the largely unexplored redshift range 0.1 < z < 0.5. Because only the closest or brightest of the galaxies are individually detected in X-rays, we use a stacking analysis to determine the mean properties of the sample. Our results suggest that X-ray emission from spectroscopically late-type galaxies is dominated by star formation, while that from early-type galaxies is dominated by a combination of hot gas and AGN emission. We find that the mean star formation and supermassive black hole accretion rate densities evolve like (1+z)^3, in agreement with the trends found for samples of bright, individually detectable starburst galaxies and AGN. Our work also corroborates the results of many previous stacking analyses of faint source populations, with improved statistics.
    Comment: 19 pages, 15 figures, 3 tables, accepted for publication in Ap
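The core of a stacking analysis is simple: sources too faint to detect individually still contribute signal, so co-adding cutouts around many known positions recovers their mean flux. The sketch below is a toy illustration of the idea, with invented function and variable names (it ignores exposure maps, background subtraction, and the masking of detected sources that a real analysis requires):

```python
import numpy as np

def stack_mean_counts(xray_image, positions, half_size=5):
    """Mean of (2*half_size+1)-pixel cutouts centred on each source.
    Toy stacking analysis: individually undetected sources add
    coherently at the stack centre while noise averages down."""
    size = 2 * half_size + 1
    stack = np.zeros((size, size))
    n = 0
    for y, x in positions:
        cut = xray_image[y - half_size:y + half_size + 1,
                         x - half_size:x + half_size + 1]
        if cut.shape == stack.shape:  # skip sources too close to the edge
            stack += cut
            n += 1
    return stack / max(n, 1)
```

Averaging N cutouts suppresses the per-pixel noise by roughly sqrt(N), which is why a 6146-galaxy sample yields a significant mean signal from galaxies that are invisible one at a time.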