
    Quasar broad absorption line variability measurements using reconstructions of un-absorbed spectra

    We present a two-epoch Sloan Digital Sky Survey and Gemini/GMOS+William Herschel Telescope/ISIS variability study of 50 broad absorption line quasars in the redshift range 1.9 < z < 4.2, containing 38 Si IV and 59 C IV BALs and spanning rest-frame time intervals of approximately 10 months to 3.7 years. We find that 35/50 quasars exhibit one or more variable BALs, with 58% of Si IV and 46% of C IV BALs showing variability across the entire sample. On average, Si IV BALs show a larger fractional change in BAL pseudo equivalent width than C IV BALs, as referenced to an unabsorbed continuum+emission-line spectrum constructed using non-negative matrix factorisation. No correlation is found between BAL variability and quasar luminosity, suggesting that ionizing continuum changes do not play a significant role in BAL variability (assuming the gas is in photoionization equilibrium with the ionizing continuum). A subset of 14 quasars have one variable BAL from each of Si IV and C IV with significant overlap in velocity space; their variations are in the same sense (strengthening or weakening) and appear to be correlated (98% confidence). We find examples of both appearing and disappearing BALs in weaker/shallower lines, with disappearance rates of 2.3% for C IV and 5.3% for Si IV, suggesting average lifetimes of 142 and 43 years respectively. We identify 5 objects in which the BAL is coincident with the broad emission line but appears to cover only the continuum source. Assuming a clumpy, inhomogeneous absorber, a typical size for the continuum source, and Eddington-limited accretion, we infer a maximum cloud radius of 10^13 to 10^14 cm. Comment: Accepted for publication in MNRAS. 22 pages, 12 figures, 7 tables
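
    The central measurement here is the pseudo equivalent width of each trough, referenced to an NMF-reconstructed unabsorbed spectrum. The sketch below illustrates the general idea using scikit-learn's NMF and hypothetical array names; the paper's actual reconstruction, masking, and normalisation conventions may differ.

        import numpy as np
        from sklearn.decomposition import NMF

        # Hypothetical inputs: a rest-frame wavelength grid and a training set of
        # unabsorbed quasar spectra (n_spectra x n_pixels), all fluxes non-negative.
        def fit_continuum_basis(training_spectra, n_components=10):
            """Learn a non-negative spectral basis from unabsorbed quasar spectra."""
            model = NMF(n_components=n_components, init="nndsvda", max_iter=500)
            model.fit(training_spectra)
            return model

        def reconstruct_continuum(model, flux, absorbed):
            """Project one observed spectrum onto the basis, first patching the
            absorbed pixels by interpolation so they do not bias the fit."""
            patched = flux.copy()
            patched[absorbed] = np.interp(np.flatnonzero(absorbed),
                                          np.flatnonzero(~absorbed),
                                          flux[~absorbed])
            weights = model.transform(patched[None, :])
            return (weights @ model.components_)[0]

        def pseudo_ew(wave, flux, continuum, trough):
            """Pseudo equivalent width integrated over a BAL trough (boolean mask)."""
            depth = 1.0 - flux[trough] / continuum[trough]
            return np.trapz(depth, wave[trough])

        # One common convention (an assumption here) for the fractional change
        # between two epochs: d_ew = (ew2 - ew1) / (0.5 * (ew1 + ew2))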

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo
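
    As a rough worked example of how the single-visit and final depths relate, the 5-sigma point-source depth of a coadd of N comparable visits improves by roughly 1.25 log10(N) magnitudes over a single visit. The per-band visit count and single-visit depth used below are illustrative assumptions, not values quoted in the abstract.

        import math

        def coadded_depth(single_visit_depth, n_visits):
            """Approximate 5-sigma coadded point-source depth, assuming the
            depth improves as sqrt(N) in flux, i.e. 1.25*log10(N) in mag."""
            return single_visit_depth + 1.25 * math.log10(n_visits)

        # Assumed illustrative numbers: ~184 r-band visits at a single-visit
        # depth of r ~ 24.5 give roughly r ~ 27.3, consistent with the quoted
        # final coadded depth of r ~ 27.5.
        print(coadded_depth(24.5, 184))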

    Wide-Field Infrared Survey Telescope (WFIRST) Interim Report

    The New Worlds, New Horizons (NWNH) in Astronomy and Astrophysics 2010 Decadal Survey prioritized the community consensus for ground-based and space-based observatories. Recognizing that many of the community's key questions could be answered with a wide-field infrared survey telescope in space, and that the decade would be one of budget austerity, WFIRST was top ranked in the large space mission category. In addition to the powerful new science that could be accomplished with a wide-field infrared telescope, the WFIRST mission was determined to be both technologically ready and only a small fraction of the cost of previous flagship missions such as HST or JWST. In response to the top ranking by the community, NASA formed the WFIRST Science Definition Team (SDT) and Project Office. The SDT was charged with fleshing out the NWNH scientific requirements to a greater level of detail. NWNH evaluated the risk and cost of the JDEM-Omega mission design, as submitted by NASA, and stated that it should serve as the basis for the WFIRST mission. The SDT and Project Office were charged with developing a mission optimized for achieving the science goals laid out by the NWNH report, and opted to use the JDEM-Omega hardware configuration as an initial starting point for the hardware implementation. JDEM-Omega and WFIRST both have an infrared imager with a filter wheel, as well as counter-dispersed moderate-resolution spectrometers. The primary advantage of space observations is being above the Earth's atmosphere, which absorbs, scatters, warps, and emits light. Observing from above the atmosphere enables WFIRST to obtain precision infrared measurements of the shapes of galaxies for weak lensing, infrared light curves of supernovae and exoplanet microlensing events with low systematic errors, and infrared measurements of the Hα hydrogen line, cleanly detected in the 1<z<2 redshift range important for baryon acoustic oscillation (BAO) dark energy measurements. The Infrared Astronomical Satellite (IRAS), the Cosmic Background Explorer (COBE), Herschel, Spitzer, and the Wide-field Infrared Survey Explorer (WISE) are all space missions that have produced stunning new scientific advances by going to space to observe in the infrared. This interim report describes progress as of June 2011 on developing a requirements flowdown and an evaluation of scientific performance. An Interim Design Reference Mission (IDRM) configuration is presented that is based on the specifications of NWNH, with some refinements to optimize the design in accordance with the new scientific requirements. Analysis of this WFIRST IDRM concept is in progress to ensure that the capability of the observatory is compatible with the science requirements. The SDT and Project Office will continue to refine the mission concept over the coming year as design, analysis, and simulation work is completed, resulting in the SDT's WFIRST Design Reference Mission (DRM) by the end of 2012.
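
    Assuming the hydrogen line referred to is H-alpha, with a rest wavelength of 0.6563 microns, a quick calculation shows why these measurements fall in the near-infrared and therefore benefit from a space telescope:

        # Rest-frame H-alpha wavelength in microns (assumed identification).
        H_ALPHA_UM = 0.6563

        def observed_halpha(z):
            """Observed wavelength of H-alpha at redshift z, in microns."""
            return H_ALPHA_UM * (1.0 + z)

        # At z = 1 and z = 2 the line falls at ~1.31 and ~1.97 microns,
        # squarely in the near-infrared, where the atmosphere absorbs and
        # emits strongly; hence the need for a space-based infrared survey
        # for the 1 < z < 2 BAO emission-line sample.
        print(observed_halpha(1.0), observed_halpha(2.0))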

    Dynamic Fine-Grained Scheduling for Energy-Efficient Main-Memory Queries

    Power and cooling costs are among the highest costs in data centers today, which makes improvements in energy efficiency crucial. Energy efficiency is also a major design point for the chips that power whole ranges of computing devices. One important goal in this area is energy proportionality: the system's power consumption should be proportional to its performance. Currently, a major trend among server processors, stemming from the design of chips for mobile devices, is the inclusion of advanced power management techniques such as dynamic voltage-frequency scaling, clock gating, and turbo modes. Much recent work on the energy efficiency of database management systems focuses on coarse-grained power management at the granularity of multiple machines and whole queries. These techniques, however, cannot efficiently adapt to the frequently fluctuating behavior of contemporary workloads. In this paper, we argue that databases should employ a fine-grained approach by dynamically scheduling tasks using precise hardware models. These models can be produced by calibrating operators under different combinations of scheduling policies, parallelism, and memory access strategies, and can then be used at run time for dynamic scheduling and power management to improve overall energy efficiency. We experimentally show that energy efficiency can be improved by up to 4x for fundamental memory-intensive database operations, such as scans.
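
    As an illustration of this model-driven, fine-grained scheduling idea (a sketch, not the paper's actual model or interface), a scheduler could consult a calibration table of runtime and power per hardware configuration and pick the setting that minimizes estimated energy for the next operator, optionally under a latency constraint:

        from dataclasses import dataclass

        @dataclass
        class Config:
            frequency_ghz: float
            threads: int
            est_runtime_s: float   # calibrated runtime of a reference scan
            est_power_w: float     # calibrated package power at this setting

        def pick_config(configs, deadline_s=None):
            """Choose the calibrated configuration with the lowest estimated
            energy (power x runtime), optionally subject to a runtime deadline."""
            feasible = [c for c in configs
                        if deadline_s is None or c.est_runtime_s <= deadline_s]
            return min(feasible, key=lambda c: c.est_power_w * c.est_runtime_s)

        # Hypothetical calibration results for a memory-bound scan operator:
        # raising the clock barely helps a scan, so the low-frequency setting
        # wins on energy while still meeting the deadline.
        table = [
            Config(1.2, 8, 2.0, 35.0),
            Config(2.6, 8, 1.8, 70.0),
            Config(2.6, 16, 1.0, 95.0),
        ]
        print(pick_config(table, deadline_s=2.0))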

    The LSST Era of Supermassive Black Hole Accretion Disk Reverberation Mapping

    The Vera C. Rubin Observatory's Legacy Survey of Space and Time (LSST) will detect an unprecedentedly large sample of actively accreting supermassive black holes with typical accretion disk (AD) sizes of a few light days. This brings new challenges for the reverberation mapping (RM) measurement of AD sizes in active galactic nuclei using interband continuum delays. We examine the effect of LSST cadence strategies on AD RM using our metric AGN_TimeLagMetric, which accounts for redshift, cadence, the magnitude limit, and magnitude corrections for dust extinction. Running our metric on different LSST cadence strategies, we produce an atlas of performance estimates for LSST photometric RM measurements. We provide an upper limit on the estimated number of quasars for which the AD time lag can be computed: up to ~1000 sources in each deep drilling field (DDF; ~10 deg^2) in any filter, with the redshift distribution of these sources peaking at z ≈ 1. We find that LSST observing strategies with a good cadence (≲5 days) and a long cumulative season (~9 yr), as proposed for the LSST DDFs, are favored for the AD size measurement. We create synthetic LSST light curves for the most suitable DDF cadences and determine RM time lags to demonstrate the impact of the best cadences based on the proposed metric.
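
    A minimal sketch of the kind of bookkeeping such a cadence assessment involves is shown below: it scores a list of observation epochs by mean intra-season sampling interval and total baseline. This is only an illustration of the idea, not the actual AGN_TimeLagMetric implementation.

        import numpy as np

        def cadence_score(mjd, max_gap_days=90.0):
            """Toy figure of merit for photometric reverberation mapping:
            mean intra-season sampling interval and total baseline in years.
            Gaps longer than max_gap_days are treated as season breaks."""
            mjd = np.sort(np.asarray(mjd, dtype=float))
            gaps = np.diff(mjd)
            seasons = np.split(mjd, np.flatnonzero(gaps > max_gap_days) + 1)
            intra = np.concatenate([np.diff(s) for s in seasons if len(s) > 1])
            mean_cadence = intra.mean()                # favorable if <~ 5 days
            baseline_yr = (mjd[-1] - mjd[0]) / 365.25  # favorable if ~ 9 yr
            return mean_cadence, baseline_yr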

    Anatomical and Functional Lung Imaging with Volumetric Computed Tomography in Non-Small Cell Lung Cancer

    Non-small cell lung cancer (NSCLC) is one of the most commonly diagnosed cancers in Canada and the leading cause of cancer deaths. A significant challenge in treating NSCLC is balancing aggressive treatment against potentially severe side effects. In radiation therapy, the management of respiratory motion and the risk of radiation-induced lung injury (RILI) pose significant challenges. Four-dimensional computed tomography (4D-CT) is an important part of motion management, but the images often suffer from motion-induced artifacts. Volumetric CT scanners with a wide axial field-of-view (aFOV) may reduce these artifacts and present an opportunity to advance CT-based functional lung imaging. Chapter 2 presents a phantom imaging study investigating the suitability of a 256-slice volumetric CT (vCT) scanner for radiotherapy treatment planning. The density of the highest-density materials was underestimated by the scanner, which can be addressed with an appropriate relative electron density (RED) curve; an average RED curve for all aFOV settings may be used. Chapter 3 presents a study of phantom and NSCLC patient 4D-CT images acquired on a clinical scanner and a vCT scanner. The v4D-CT images were re-sampled to simulate a conventional acquisition using a narrow-aFOV clinical scanner. The phantom images demonstrated that target contouring variability decreased in v4D-CT imaging compared to clinical 4D-CT. In the patient images, the mean Hausdorff distance between organ-at-risk (OAR) contours was significantly correlated with respiratory phase, indicating that motion artifacts contribute to this variability. Chapter 4 presents a novel acquisition and analysis pipeline to image lung ventilation (V), perfusion (Q), and the V/Q ratio in a single volumetric CT scan. In a porcine study, these V and Q images were significantly correlated with standard Xe-enhanced ventilation and PET perfusion images in a voxel-wise analysis. In an NSCLC patient study, the images were sensitive to changes in V and Q between baseline imaging and follow-up 6 weeks after radiotherapy. In this thesis, I demonstrate that volumetric CT scanners are suitable for use in radiation therapy simulation and treatment planning, and detail two scanning protocols that may reduce the challenges posed by respiratory motion and RILI risk in NSCLC.
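
    As an illustration of the contour-comparison step, a mean (bidirectional average) Hausdorff-type distance between two point-sampled contours can be computed as sketched below; the exact distance definition and tooling used in the thesis may differ.

        import numpy as np
        from scipy.spatial import cKDTree

        def mean_hausdorff(contour_a, contour_b):
            """Average bidirectional nearest-neighbor distance between two
            contours, each an (N, 3) array of points in patient coordinates (mm)."""
            d_a_to_b, _ = cKDTree(contour_b).query(contour_a)
            d_b_to_a, _ = cKDTree(contour_a).query(contour_b)
            return 0.5 * (d_a_to_b.mean() + d_b_to_a.mean())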

    Precise measurements of time delays in gravitationally lensed quasars for competitive and independent determination of the Hubble constant

    During the last decades, by virtue of observations, the Standard Cosmological Model has emerged, providing a description of the Universe's evolution using a minimal set of independent constraints - the cosmological parameters. Among them is the expansion rate of the Universe, the so-called Hubble constant or H0, first measured by Lemaître in 1927. The century that followed this cornerstone measurement saw numerous attempts to refine the initial value, and for good reason: a precise and independent measurement of H0 brings strong constraints on cosmological models. It could notably help astronomers to better understand the nature of dark energy, making it one of the most sought-after prizes in modern cosmology. My work at the Laboratory of Astrophysics of EPFL is embedded in this context. I am part of the COSMOGRAIL and H0LiCOW collaborations, which aim to measure the Hubble constant with the highest level of precision using time-delay cosmography, a method based on the theory of strong gravitational lensing. This effect occurs when an observer looks at a light source located behind a massive foreground galaxy. The mass of the galaxy acts similarly to an optical lens and focuses the light rays emitted by the source. As a consequence, multiple lensed images of the source appear around the lens galaxy. If the luminosity of the source changes over time, the variations are seen in all the lensed images but with a temporal delay due to the different travel paths of the light rays. By carefully monitoring the luminosity variations of each lensed image, one can precisely measure the time delays between them. Combined with high-resolution observations of the foreground galaxy and its surroundings, this makes it possible to measure the Hubble constant directly, upon the sole assumption that General Relativity is correct. For more than 13 years, COSMOGRAIL has monitored dozens of lensed quasars to produce high-quality light curves and time-delay measurements. During the last four years, I took care of the monitoring schedule, continuous data reduction, and time-delay measurements through the development of curve-shifting techniques. I produced light curves and measured time delays for a variety of lenses. After more than a decade of endeavours, COSMOGRAIL and H0LiCOW finally revealed their measurement of the expansion rate of the Universe from a blind analysis of three lensed sources. I had the privilege of being the lead author of the publication presenting our measurement of the Hubble constant, H0 = 71.9 +2.4/-3.0 km/s/Mpc, a 3.8% precision in the Standard Cosmological Model. Such precision allows a direct comparison with the results of the distance-ladder technique in the local Universe and the Planck satellite's Cosmic Microwave Background observations in the distant Universe, both of which are currently in significant tension of unknown origin.
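
    A minimal sketch of the curve-shifting idea is shown below: shift one lensed image's light curve by trial delays and keep the shift that best matches the other image. The actual COSMOGRAIL techniques also model microlensing and irregular sampling far more carefully; this is only an illustration, with hypothetical variable names.

        import numpy as np

        def estimate_delay(t_a, f_a, t_b, f_b, delays):
            """Toy time-delay estimator: shift light curve B by each trial delay,
            interpolate it onto the epochs of curve A, and return the delay that
            minimizes the summed squared flux difference."""
            chi2 = []
            for tau in delays:
                f_b_shifted = np.interp(t_a, t_b + tau, f_b)
                chi2.append(np.sum((f_a - f_b_shifted) ** 2))
            return delays[int(np.argmin(chi2))]

        # Example usage with a hypothetical grid of trial delays (in days):
        # tau_hat = estimate_delay(t_a, f_a, t_b, f_b, np.arange(-100, 100, 0.5))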