
    Closing yield gaps: perils and possibilities for biodiversity conservation.

    Increasing agricultural productivity to 'close yield gaps' creates both perils and possibilities for biodiversity conservation. Yield increases often have negative impacts on species within farmland, but at the same time could potentially make it more feasible to minimize further cropland expansion into natural habitats. We combine global data on yield gaps, projected future production of maize, rice and wheat, the distributions of birds and their estimated sensitivity to changes in crop yields to map where it might be most beneficial for bird conservation to close yield gaps as part of a land-sparing strategy, and where doing so might be most damaging. Closing yield gaps to attainable levels to meet projected demand in 2050 could potentially help spare an area equivalent to that of the Indian subcontinent. Increasing yields this much on existing farmland would inevitably reduce its biodiversity, and therefore we advocate efforts both to constrain further increases in global food demand, and to identify the least harmful ways of increasing yields. The land-sparing potential of closing yield gaps will not be realized without specific mechanisms to link yield increases to habitat protection (and restoration), and therefore we suggest that conservationists, farmers, crop scientists and policy-makers collaborate to explore promising mechanisms. BP was funded by the Zukerman Research Fellowship in Global Food Security at King’s College, Cambridge. This is the accepted manuscript of a paper published in the Philosophical Transactions of the Royal Society B: Biological Sciences (B Phalan, R Green, A Balmford, Phil. Trans. R. Soc. B 2014, 369, 20120285).

    Density estimation with Gaussian processes for gravitational-wave posteriors

    The properties of black-hole and neutron-star binaries are extracted from gravitational-wave signals using Bayesian inference. This involves evaluating a multi-dimensional posterior probability function with stochastic sampling. The marginal probability density distributions from which the samples are drawn are usually interpolated with kernel density estimators. Since most post-processing analysis within the field is based on these parameter estimation products, interpolation accuracy of the marginals is essential. In this work, we propose a new method combining histograms and Gaussian processes as an alternative technique to fit arbitrary combinations of samples from the source parameters. This method comes with several advantages, such as flexible interpolation of non-Gaussian correlations, a Bayesian estimate of uncertainty, and efficient re-sampling with Hamiltonian Monte Carlo.
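    As a rough illustration of the idea described above (not the paper's actual pipeline), the sketch below bins one-dimensional posterior samples into a histogram and fits a Gaussian-process regressor to the bin heights, giving a smooth density estimate with a pointwise uncertainty band. The bin count, kernel choice and normalisation handling are assumptions made for the example; the Hamiltonian Monte Carlo re-sampling step mentioned in the abstract is not shown.

    # Illustrative sketch only: bin posterior samples and fit a GP to the histogram.
    # Bin count, kernel and smoothing choices are assumptions, not the paper's method.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    samples = rng.normal(loc=30.0, scale=2.0, size=5000)  # stand-in 1-D posterior samples

    # Histogram the samples; bin centres become GP inputs, normalised counts the targets.
    counts, edges = np.histogram(samples, bins=40, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])

    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(centres.reshape(-1, 1), counts)

    # Smooth, continuous density estimate with a pointwise uncertainty band.
    grid = np.linspace(edges[0], edges[-1], 500).reshape(-1, 1)
    density, std = gp.predict(grid, return_std=True)

    A real implementation would also need to enforce non-negativity and overall normalisation of the fitted density; the point here is only the general histogram-plus-GP construction.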

    Introduction

    No abstract

    Land sparing to make space for species dependent on natural habitats and high nature value farmland.

    Empirical evidence from four continents indicates that human food demand may be best reconciled with biodiversity conservation through sparing natural habitats by boosting agricultural yields. This runs counter to the conservation paradigm of wildlife-friendly farming, which is influential in Europe, where many species are dependent on low-yielding high nature value farmland threatened by both intensification and abandonment. In the first multi-taxon population-level test of land-sparing theory in Europe, we quantified how population densities of 175 bird and sedge species varied with farm yield across 26 squares (each with an area of 1 km²) in eastern Poland. We discovered that, as in previous studies elsewhere, simple land sparing, with only natural habitats on spared land, markedly outperformed land sharing in its effect on region-wide projected population sizes. However, a novel 'three-compartment' land-sparing approach, in which about one-third of spared land is assigned to very low-yield agriculture and the remainder to natural habitats, resulted in least-reduced projected future populations for more species. Implementing the three-compartment model would require significant reorganization of current subsidy regimes, but would mean high-yield farming could release sufficient land for species dependent on both natural and high nature value farmland to persist. Supported by a NERC CASE studentship to C.F.

    There’s more than one way to ride the wave: A multi-disciplinary approach to gravitational wave data analysis

    Since the first detection in 2015, gravitational-wave astronomy has progressed hugely. Several observing runs have been completed, resulting in many more confirmed detections of compact binary coalescences. As the number of detections grows, the potential for exciting science also increases; however, this is not without challenges. Specifically, efficiently analyzing this growing volume of data will present many computational problems going forward. In order to properly interpret and understand these data, we must develop new ways to approach these computational problems. When seeking to tackle a difficult problem, there are broadly two approaches. One can attack the problem using physical or mathematical insight; this understanding can then be translated into a simpler formulation or a good approximation that makes the problem tractable. This has been the standard way to tackle problems since the beginning of science. Recently, however, data-driven methods have become hugely popular. These data-driven methods, such as machine learning, generally do not use physical insight but instead exploit large amounts of data to produce solutions to otherwise intractable problems. This thesis draws on both of these approaches and presents several new methods to analyze gravitational-wave data. In chapters 2-3 we derive a harmonic decomposition of a precessing waveform, in which each harmonic is a simple non-precessing waveform. With this formulation, we are able to obtain a simple picture of precession as the beating of two waveforms. We then use this understanding to answer questions such as: when will we observe precessing waveforms, and where in parameter space will we be able to observe them? The remaining chapters look at data-driven approaches, using machine learning techniques to improve different aspects of gravitational-wave data analysis. Chapter 5 uses Gaussian processes to interpolate posterior samples; this gives us a smooth, continuous representation of the posterior, as opposed to histograms, for example. Chapter 6 uses advances in waveform modeling and GPUs to make parameter estimation potentially more efficient. In chapters 7 and 8 we look at how reliable machine learning techniques are, showing that they often do not incorporate uncertainty properly into their predictions. We then present a simple algorithm for both classification and regression pipelines that can be used with any machine learning model to address this. Finally, in the conclusions, we review the work presented as a whole and discuss ways in which these two approaches can be combined to get the best of both. We suggest that using our physical insights to guide and constrain our data-driven methods will eventually provide the best path forward for gravitational-wave data analysis.

    When will we observe binary black holes precessing?

    After eleven gravitational-wave detections from compact-binary mergers, we are yet to observe the striking general-relativistic phenomenon of orbital precession. Measurements of precession would provide valuable insights into the distribution of black-hole spins, and therefore into astrophysical binary formation mechanisms. Using our recent two-harmonic approximation of precessing-binary signals (Fairhurst et al. 2019), we introduce the 'precession signal-to-noise ratio', ρ_p. We demonstrate that this can be used to clearly identify whether precession was measured in an observation (by comparison with both current detections and simulated signals), and can immediately quantify the measurability of precession in a given signal, which currently requires computationally expensive parameter-estimation studies. ρ_p has numerous potential applications to signal searches, source-property measurements, and population studies. We give one example: assuming one possible astrophysical spin distribution, we predict that precession has roughly a one in 25 chance of being observed in any detection. Comment: 5 pages, 2 figures; resubmission following reviewers' comments.
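    For orientation, the two-harmonic picture referenced above can be sketched as the beating of two non-precessing-like harmonics; the notation below, and the identification of ρ_p with the signal-to-noise ratio carried by the weaker harmonic, are a paraphrase for illustration rather than the paper's exact definitions.

    % Schematic only; notation is a paraphrase, not the paper's exact definitions.
    h(t) \approx \mathcal{A}_0\, h_0(t) + \mathcal{A}_1\, h_1(t),
    \qquad
    \rho_p \approx \min(\rho_0, \rho_1)

    On this reading, a signal with negligible power in the weaker harmonic looks indistinguishable from a non-precessing one, which is why ρ_p can serve as a quick measurability criterion for precession.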

    Gravitational-wave surrogate models powered by artificial neural networks

    Inferring the properties of black holes and neutron stars is a key science goal of gravitational-wave (GW) astronomy. To extract as much information as possible from GW observations, we must develop methods to reduce the cost of Bayesian inference. In this paper, we use artificial neural networks (ANNs) and the parallelization power of graphics processing units (GPUs) to improve the surrogate modeling method, which can produce accelerated versions of existing models. As a first application of our method, the artificial neural network surrogate model (ANN-Sur), we build a time-domain surrogate model of the spin-aligned binary black hole (BBH) waveform model SEOBNRv4. We achieve median mismatches of approximately 2e-5 and mismatches no worse than approximately 2e-3. For a typical BBH waveform generated from 12 Hz with a total mass of 60 M⊙, the original SEOBNRv4 model takes 1794 ms. Existing custom-made code optimizations (SEOBNRv4opt) reduced this to 83.7 ms, and the interpolation-based, frequency-domain surrogate SEOBNRv4ROM can generate this waveform in 3.5 ms. Our ANN-Sur model takes 1.2 ms when run on a CPU and just 0.5 ms when run on a GPU. ANN-Sur can also generate large batches of waveforms simultaneously. We find that batches of up to 10³ waveforms can be evaluated on a GPU in just 1.57 ms, corresponding to a time per waveform of 0.0016 ms. This method is a promising way to utilize the parallelization power of GPUs to drastically increase the computational efficiency of Bayesian parameter estimation.
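    As a rough sketch of what such a surrogate can look like (the layer sizes, basis dimension and training details below are placeholders, not the ANN-Sur design), a small multilayer perceptron can map the intrinsic parameters of an aligned-spin binary to coefficients of a waveform basis, with batched GPU evaluation providing the speed-up described above.

    # Illustrative sketch only: a generic neural-network waveform surrogate.
    # Architecture, basis size and training loop are assumptions, not ANN-Sur itself.
    import torch
    import torch.nn as nn

    N_BASIS = 64  # assumed number of reduced-basis coefficients per waveform

    # Map intrinsic parameters (mass ratio, chi_1z, chi_2z) to basis coefficients.
    surrogate = nn.Sequential(
        nn.Linear(3, 128),
        nn.ReLU(),
        nn.Linear(128, 128),
        nn.ReLU(),
        nn.Linear(128, N_BASIS),
    )

    def train_step(params, target_coeffs, optimizer, loss_fn=nn.MSELoss()):
        """One optimisation step against coefficients computed from a fiducial model
        (e.g. projections of SEOBNRv4 waveforms onto a pre-built basis)."""
        optimizer.zero_grad()
        loss = loss_fn(surrogate(params), target_coeffs)
        loss.backward()
        optimizer.step()
        return loss.item()

    # Batched evaluation is where the GPU speed-up comes from: a single forward
    # pass generates coefficients for many waveforms at once.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    surrogate.to(device)
    batch = torch.rand(1000, 3, device=device)   # 10^3 parameter sets
    with torch.no_grad():
        coeffs = surrogate(batch)                # shape (1000, N_BASIS)

    Reconstructing the time-domain waveform from the predicted coefficients would require the (omitted) basis vectors; the sketch only shows the parameter-to-coefficient mapping and the batched evaluation pattern.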