194 research outputs found

    Optical Light Curve of the Type Ia Supernova 1998bu in M96 and the Supernova Calibration of the Hubble Constant

    We present the UBVRI light curves of the Type Ia supernova SN 1998bu, which appeared in the nearby galaxy M96 (NGC 3368). M96 is a spiral galaxy in the Leo I group which has a Cepheid-based distance. Our photometry allows us to calculate the absolute magnitude and reddening of this supernova. These data, when combined with measurements of the four other well-observed supernovae with Cepheid-based distances, allow us to calculate the Hubble constant with respect to the Hubble flow defined by the distant Calan/Tololo Type Ia sample. We find a Hubble constant of 64.0 +/- 2.2 (internal) +/- 3.5 (external) km/s/Mpc, consistent with most previous estimates based on Type Ia supernovae. We note that the two well-observed Type Ia supernovae in Fornax, if placed at the Cepheid distance to the possible Fornax spiral NGC 1365, are apparently too faint with respect to the Calan/Tololo sample calibrated with the five Type Ia supernovae with Cepheid distances to their host galaxies. Comment: AAS LaTeX, 20 pages, 4 figures, 6 tables, accepted for publication in the Astronomical Journal. Figure 1 (finding chart) not included.
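
    A minimal sketch of the calibration step described above: a Cepheid-calibrated peak absolute magnitude converts the apparent magnitude and redshift of a Hubble-flow supernova into a value of the Hubble constant via the distance modulus. The magnitudes and redshift below are illustrative placeholders, not values from the paper.

        # Minimal sketch (not the paper's pipeline): turn a Cepheid-calibrated peak
        # absolute magnitude plus one Hubble-flow supernova's apparent magnitude and
        # recession velocity into H0. All numbers are illustrative placeholders.
        def h0_from_sn(m_peak, M_peak, cz):
            """Hubble constant [km/s/Mpc] from one Hubble-flow SN Ia.

            m_peak : corrected apparent peak magnitude
            M_peak : absolute peak magnitude calibrated from Cepheid-hosted SNe Ia
            cz     : recession velocity [km/s]
            """
            mu = m_peak - M_peak                # distance modulus
            d_mpc = 10 ** (mu / 5.0 - 5.0)      # luminosity distance [Mpc]
            return cz / d_mpc

        # e.g. M_peak ~ -19.4 and a supernova at cz = 15000 km/s with m_peak = 17.45
        print(h0_from_sn(m_peak=17.45, M_peak=-19.4, cz=15000.0))   # ~64 km/s/Mpc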

    Uncertainty in United States coastal wetland greenhouse gas inventorying

    © The Author(s), 2018. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Environmental Research Letters 13 (2018): 115005, doi:10.1088/1748-9326/aae157. Coastal wetlands store carbon dioxide (CO2) and emit CO2 and methane (CH4), making them an important part of greenhouse gas (GHG) inventorying. In the contiguous United States (CONUS), a coastal wetland inventory was recently calculated by combining maps of wetland type and change with soil, biomass, and CH4 flux data from a literature review. We assess uncertainty in this developing carbon monitoring system to quantify confidence in the inventory process itself and to prioritize future research. We provide a value-added analysis by defining types and scales of uncertainty for assumptions, burial and emissions datasets, and wetland maps, simulating 10 000 iterations of a simplified version of the inventory, and performing a sensitivity analysis. Coastal wetlands were likely a source of net-CO2-equivalent (CO2e) emissions during 2006–2011. Although stable estuarine wetlands were likely a CO2e sink, this effect was counteracted by catastrophic soil losses along the Gulf Coast and by CH4 emissions from tidal freshwater wetlands. The direction and magnitude of total CONUS CO2e flux were most sensitive to uncertainty in emissions and burial data, and assumptions about how to calculate the inventory. Critical data uncertainties included CH4 emissions for stable freshwater wetlands and carbon burial rates for all coastal wetlands. Critical assumptions included the average depth of soil affected by erosion events, the method used to convert CH4 fluxes to CO2e, and the fraction of carbon lost to the atmosphere following an erosion event. The inventory was relatively insensitive to mapping uncertainties. Future versions could be improved by collecting additional data, especially the depth affected by loss events, and by better mapping salinity and inundation gradients relevant to key GHG fluxes. Social Media Abstract: US coastal wetlands were a recent and uncertain source of greenhouse gases because of CH4 and erosion. Financial support was provided primarily by NASA Carbon Monitoring Systems (NNH14AY67I) and the USGS Land Carbon Program, with additional support from The Smithsonian Institution, The Coastal Carbon Research Coordination Network (DEB-1655622), and NOAA Grant: NA16NMF4630103.
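
    A minimal sketch of the Monte Carlo propagation described above, assuming made-up burial, CH4, and erosion flux distributions and a GWP-100 factor of 28 for CH4 (the CH4-to-CO2e conversion method is itself one of the assumptions the study examines); it is not the study's data or model.

        # Minimal Monte Carlo sketch of inventory-style uncertainty propagation: sample
        # uncertain burial and emission terms, convert CH4 to CO2e, and inspect the
        # spread of the net flux. All numbers are illustrative, not the study's data.
        import numpy as np

        rng = np.random.default_rng(0)
        N = 10_000                       # iterations, as in the abstract
        GWP_CH4 = 28.0                   # one common 100-yr GWP choice (an assumption here)

        area_ha = 2.5e6                                            # illustrative wetland area [ha]
        burial = rng.normal(-1.0, 0.4, N)                          # t CO2e / ha / yr (negative = sink)
        ch4 = rng.lognormal(mean=np.log(0.02), sigma=0.8, size=N)  # t CH4 / ha / yr
        erosion = rng.normal(0.6, 0.5, N)                          # t CO2e / ha / yr from soil loss

        net_co2e = (burial + ch4 * GWP_CH4 + erosion) * area_ha    # t CO2e / yr, whole domain

        lo, med, hi = np.percentile(net_co2e, [2.5, 50, 97.5])
        print(f"net flux: {med:.2e} t CO2e/yr (95% interval {lo:.2e} to {hi:.2e})")
        print("P(net source) =", (net_co2e > 0).mean())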

    Benchtop flow-NMR for rapid online monitoring of RAFT and free radical polymerisation in batch and continuous reactors

    A “benchtop” NMR spectrometer is used for detailed monitoring of controlled and free radical polymerisations performed in batch and continuous reactors, both offline and in real time. This enables kinetic analysis with unprecedented temporal resolution for reactions that reach near-completion in under five minutes.
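
    A minimal sketch of the kinetic analysis such online monitoring enables, assuming hypothetical monomer and polymer peak integrals (the actual peak regions depend on the monomer and instrument): conversion is computed from the integrals and an apparent first-order rate constant is fitted from ln([M]0/[M]) versus time.

        # Minimal sketch: convert monomer/polymer peak integrals from successive spectra
        # into fractional conversion, then fit the apparent pseudo-first-order rate
        # constant from ln([M]0/[M]) vs time. Integral values are hypothetical.
        import numpy as np

        t_min = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])         # time after initiation [min]
        vinyl = np.array([1.00, 0.72, 0.52, 0.37, 0.19, 0.10, 0.05])  # normalised monomer vinyl integral
        backbone = 1.0 - vinyl                                        # polymer signal grows as monomer is consumed

        conversion = backbone / (vinyl + backbone)                    # fractional conversion
        ln_m0_over_m = -np.log(1.0 - conversion)                      # ln([M]0/[M])

        k_app, intercept = np.polyfit(t_min, ln_m0_over_m, 1)         # slope = apparent rate constant [1/min]
        print("conversion at 5 min:", conversion[-1])
        print("k_app =", round(k_app, 3), "min^-1")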

    CMB-S4 Science Book, First Edition

    This book lays out the scientific goals to be addressed by the next-generation ground-based cosmic microwave background experiment, CMB-S4, envisioned to consist of dedicated telescopes at the South Pole, the high Chilean Atacama plateau, and possibly a northern hemisphere site, all equipped with new superconducting cameras. CMB-S4 will dramatically advance cosmological studies by crossing critical thresholds in the search for the B-mode polarization signature of primordial gravitational waves, in the determination of the number and masses of the neutrinos, in the search for evidence of new light relics, in constraining the nature of dark energy, and in testing general relativity on large scales.

    Why MSM in rural South African communities should be an HIV prevention research priority.

    Research into HIV and the health of men who have sex with men (MSM) in South Africa has been largely confined to the metropolitan centres. Only two studies were located that refer to MSM in rural contexts or to same-sex behaviours among men in such settings. There is growing recognition in South Africa that MSM are not only disproportionately affected by HIV and have been underserved by the country's national response, but that they contribute significantly to sustaining the high number of new infections recorded each year. We argue that, to meet the objectives of the country's national strategic plan for HIV, STIs and TB, it is important to know how these behaviours may be contributing to the sustained rural HIV epidemic in the youngest age groups and to determine what constitutes an appropriate and feasible programmatic response that can be implemented in the country's public-sector health services.

    Climate Change Meets the Law of the Horse

    The climate change policy debate has only recently turned its full attention to adaptation - how to address the impacts of climate change we have already begun to experience and that will likely increase over time. Legal scholars have in turn begun to explore how the many different fields of law will and should respond. During this nascent period, one overarching question has gone unexamined: how will the legal system as a whole organize around climate change adaptation? Will a new distinct field of climate change adaptation law and policy emerge, or will legal institutions simply work away at the problem through unrelated, duly self-contained fields, as in the famous Law of the Horse? This Article is the first to examine that question comprehensively, to move beyond thinking about the law and climate change adaptation to consider the law of climate change adaptation. Part I of the Article lays out our methodological premises and approach. Using a model we call Stationarity Assessment, Part I explores how legal fields are structured and sustained based on assumptions about the variability of natural, social, and economic conditions, and how disruptions to that regime of variability can lead to the emergence of new fields of law and policy. Case studies of environmental law and environmental justice demonstrate the model’s predictive power for the formation of new distinct legal regimes. Part II applies the Stationarity Assessment model to the topic of climate change adaptation, using a case study of a hypothetical coastal region and the potential for climate change impacts to disrupt relevant legal doctrines and institutions. We find that most fields of law appear capable of adapting effectively to climate change. In other words, without some active intervention, we expect the law and policy of climate change adaptation to follow the path of the Law of the Horse - a collection of fields independently adapting to climate change - rather than organically coalescing into a new distinct field. Part III explores why, notwithstanding this conclusion, it may still be desirable to seek a different trajectory. Focusing on the likelihood of systemic adaptation decisions with perverse, harmful results, we identify the potential benefits offered by intervening to shape a new and distinct field of climate change adaptation law and policy. Part IV then identifies the contours of such a field, exploring the distinct purposes of reducing vulnerability, ensuring resiliency, and safeguarding equity. These features provide the normative policy components for a law of climate change adaptation that would be more than just a Law of the Horse. This new field would not replace or supplant any existing field, however, as environmental law did with regard to nuisance law, and it would not be dominated by substantive doctrine. Rather, like the field of environmental justice, this new legal regime would serve as a holistic overlay across other fields to ensure more efficient, effective, and just climate change adaptation solutions.

    Cosmological parameters from SDSS and WMAP

    We measure cosmological parameters using the three-dimensional power spectrum P(k) from over 200,000 galaxies in the Sloan Digital Sky Survey (SDSS) in combination with WMAP and other data. Our results are consistent with a "vanilla" flat adiabatic Lambda-CDM model without tilt (n=1), running tilt, tensor modes or massive neutrinos. Adding SDSS information more than halves the WMAP-only error bars on some parameters, tightening 1 sigma constraints on the Hubble parameter from h~0.74+0.18-0.07 to h~0.70+0.04-0.03, on the matter density from Omega_m~0.25+/-0.10 to Omega_m~0.30+/-0.04 (1 sigma) and on neutrino masses from <11 eV to <0.6 eV (95%). SDSS helps even more when dropping prior assumptions about curvature, neutrinos, tensor modes and the equation of state. Our results are in substantial agreement with the joint analysis of WMAP and the 2dF Galaxy Redshift Survey, which is an impressive consistency check with independent redshift survey data and analysis techniques. In this paper, we place particular emphasis on clarifying the physical origin of the constraints, i.e., what we do and do not know when using different data sets and prior assumptions. For instance, dropping the assumption that space is perfectly flat, the WMAP-only constraint on the measured age of the Universe tightens from t0~16.3+2.3-1.8 Gyr to t0~14.1+1.0-0.9 Gyr by adding SDSS and SN Ia data. Including tensors, running tilt, neutrino mass and equation of state in the list of free parameters, many constraints are still quite weak, but future cosmological measurements from SDSS and other sources should allow these to be substantially tightened. Comment: Minor revisions to match accepted PRD version. SDSS data and ppt figures available at http://www.hep.upenn.edu/~max/sdsspars.htm
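
    A minimal sketch of why adding an independent data set shrinks the error bars quoted above: two independent Gaussian constraints on the same parameter combine by inverse-variance weighting. The symmetrised numbers below are illustrative; the actual analysis is a joint multi-parameter likelihood.

        # Two independent Gaussian constraints on one parameter combine by
        # inverse-variance weighting; the combined error is smaller than either
        # input. Numbers below are symmetrised and illustrative only.
        def combine(mean1, sigma1, mean2, sigma2):
            w1, w2 = 1.0 / sigma1**2, 1.0 / sigma2**2
            mean = (w1 * mean1 + w2 * mean2) / (w1 + w2)
            sigma = (w1 + w2) ** -0.5
            return mean, sigma

        # e.g. a loose CMB-only constraint on h combined with a tighter galaxy-survey one
        print(combine(0.74, 0.125, 0.69, 0.04))   # -> roughly (0.69, 0.04)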

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo
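
    A back-of-the-envelope reading of the cadence quoted above, assuming (as a simplification) an even split of the 2000 fifteen-second visits across the six bands:

        # Total open-shutter time per pointing and rough visits per band implied by
        # the numbers in the abstract; the even split across bands is an assumption.
        visits_total = 2000
        exposure_s = 15.0
        bands = 6

        open_shutter_hours = visits_total * exposure_s / 3600.0
        print(f"~{open_shutter_hours:.1f} h of open-shutter time per pointing")         # ~8.3 h
        print(f"~{visits_total / bands:.0f} visits per band over the ten-year survey")  # ~333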

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg^2 with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r~27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
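
    A rough consistency check on the depths quoted above: stacking N background-limited visits deepens the 5σ point-source limit by about 1.25 log10(N) magnitudes. Splitting the ~800 visits evenly across the six bands is an assumption; the planned cadence actually favours r and i.

        # Coadded depth implied by the single-visit depth and visit count above,
        # assuming an even (and therefore approximate) split of visits across bands.
        import math

        single_visit_depth_r = 24.5      # 5-sigma, AB, single visit
        visits_r = 800 / 6               # crude even split of visits across ugrizy

        coadd_gain = 1.25 * math.log10(visits_r)
        print(f"coadded r-band depth ~ {single_visit_depth_r + coadd_gain:.1f}")   # ~27.2, near the quoted r~27.5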