12 research outputs found

    Risk measures and the distribution of damage curves for 600 European coastal cities

    Get PDF
    A good understanding of climate change damages is vital to design effective adaptation policies and measures. Using a dataset of probabilistic sea-level rise and another of flood damages and protection cost curves for the 600 largest European coastal cities, we generate stochastic damage curves and their distributions with and without adaptation. We apply the Generalized Extreme Value distribution to characterize the distributions and calculate two risk measures: the Value at Risk and the Expected Shortfall, which contribute to understanding the magnitude and probability of high-end sea-level rise represented by the upper tail of the distribution. This allows the costs of sea-level rise to be estimated (that is, in addition to other costs related to coastal extreme events) and supports decision-makers in accounting for the high uncertainty of future projections. This knowledge is necessary for risk management that does not underestimate risk, and it allows city planners to tailor their risk tolerance. A great number of cities in Europe are currently developing adaptation plans or have already done so; making these findings available should therefore be of great value to inform these processes. © 2019 The Author(s). Published by IOP Publishing Ltd. This research is supported by the Basque Government through the BERC 2018–2021 program and by the Spanish Ministry of Science, Innovation and Universities (MICINN) through BC3's María de Maeztu excellence accreditation MDM-2017-0714
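
    As a worked illustration of the two risk measures named above (a sketch under assumed inputs, not the authors' code), the snippet below fits a Generalized Extreme Value distribution to hypothetical damage draws with scipy, then computes the Value at Risk as a quantile and the Expected Shortfall as the mean damage beyond it:

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        # Hypothetical annual damage draws for one city (million EUR); the paper
        # derives such draws from probabilistic sea-level rise and damage curves.
        damages = rng.gumbel(loc=50.0, scale=15.0, size=10_000)

        # Fit a Generalized Extreme Value distribution to the draws.
        shape, loc, scale = genextreme.fit(damages)

        alpha = 0.99  # risk tolerance, e.g. the 99th percentile
        var_99 = genextreme.ppf(alpha, shape, loc=loc, scale=scale)  # Value at Risk

        # Expected Shortfall: mean damage conditional on exceeding the VaR,
        # estimated by Monte Carlo from the fitted distribution.
        samples = genextreme.rvs(shape, loc=loc, scale=scale, size=200_000,
                                 random_state=rng)
        es_99 = samples[samples > var_99].mean()
        print(f"VaR(99%) = {var_99:.1f}, ES(99%) = {es_99:.1f} million EUR")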

    Identifying gravitationally lensed supernovae within the Zwicky Transient Facility public survey

    Full text link
    Strong gravitational lensing of supernovae is exceedingly rare; to date, only a handful of lensed supernovae are known. Despite their rarity, lensed supernovae have emerged as one of the most promising methods for measuring the current expansion rate of the Universe and breaking the Hubble tension. We present an extensive search for gravitationally lensed supernovae within the Zwicky Transient Facility (ZTF) public survey, covering 12,524 transients with good light curves discovered during four years of observations. We crossmatch a catalogue of known and candidate lens galaxies with our transient sample and find only one coincident source, which was due to chance alignment. To search for supernovae magnified by unknown lens galaxies, we test multiple methods that have been suggested in the literature, for the first time on real data. These include selecting objects with extremely red colours and objects that appear inconsistent with the host galaxy redshift. In both cases, we find a few hundred candidates, most of which are due to contamination from active galactic nuclei, bogus detections, or unlensed supernovae. The false-positive rate of these methods presents significant challenges for future surveys. In total, 65 unique transients identified across all of our selection methods required detailed manual rejection, which would be infeasible for larger samples. Overall, we do not find any compelling candidates for lensed supernovae, which is broadly consistent with previous estimates of the rate of lensed supernovae in the ZTF public survey and the number expected to pass the selection cuts we apply.
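
    The catalogue crossmatch step can be sketched as follows (our illustration with made-up coordinates, not the paper's code), using astropy's nearest-neighbour matching and a small separation tolerance:

        import astropy.units as u
        from astropy.coordinates import SkyCoord

        # Hypothetical positions; in practice these would come from the ZTF
        # alert stream and a catalogue of known and candidate lens galaxies.
        transients = SkyCoord(ra=[150.1, 210.5] * u.deg, dec=[2.2, -1.3] * u.deg)
        lenses = SkyCoord(ra=[150.1002, 33.0] * u.deg, dec=[2.2001, 5.0] * u.deg)

        # Nearest lens for each transient; keep pairs closer than the tolerance.
        idx, sep2d, _ = transients.match_to_catalog_sky(lenses)
        matched = sep2d < 1.5 * u.arcsec
        for coord, i, sep in zip(transients[matched], idx[matched], sep2d[matched]):
            print(f"transient {coord.to_string('decimal')} near lens {i}: "
                  f"{sep.to(u.arcsec):.2f}")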

    Scaling up genetic circuit design for cellular computing: advances and prospects

    Get PDF

    Towards successful adaptation: a checklist for the development of climate change adaptation plans

    No full text
    The earliest climate change adaptation plans emerged about ten years ago and are an increasingly important component of the international policy agenda. Because these plans by nature involve long-term objectives, some of the main questions raised in current adaptation-tracking research are whether and how they will be implemented and what is required for these plans to successfully achieve their objectives. There is no consensus on how to define "successful adaptation", and there are multiple, sometimes competing, interpretations of success. In this working paper, we define three areas on which climate change adaptation plans should focus to successfully achieve their goals: policy and economy; science and learning; and legitimacy. Based on these three areas and related indicators, we develop a checklist that identifies the aspects required for successful adaptation and long-term sustainability. We suggest that plans follow this checklist as a guideline for plan development and long-term institutional capacity building. Finally, we discuss the adequacy of these metrics for assessing the credibility of developed climate adaptation policies

    Constructing synthetic biology workflows in the cloud

    No full text
    The synthetic biology design process has traditionally been heavily dependent upon manual searching, acquisition and integration of existing biological data. A large amount of such data is already available from Internet-based resources, but data exchange between these resources is often undertaken manually. Automating the communication between different resources can be done by the generation of computational workflows to achieve complex tasks that cannot be carried out easily or efficiently by a single resource. Computational workflows involve the passage of data from one resource, or process, to another in a distributed computing environment. In a typical bioinformatics workflow, processes are invoked synchronously in a predefined order that is described in a workflow definition document. However, in synthetic biology the diversity of resources and manufacturing tasks required favours a more flexible model of process execution. Here, the authors present the Protocol for Linking External Nodes (POLEN), a Cloud-based system that facilitates synthetic biology design workflows that operate asynchronously. Messages are used to notify POLEN resources of events in real time, and to log historical events such as the availability of new data, enabling networks of cooperation. POLEN can be used to coordinate the integration of different synthetic biology resources, to ensure consistency of information across distributed repositories through added support for data standards, and ultimately to facilitate the synthetic biology life cycle for designing and implementing biological systems
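
    The asynchronous, message-driven pattern described above can be sketched with a toy in-process broker (our illustration; POLEN's actual messaging interface is not shown here): resources subscribe to event topics and react when new data is announced, instead of being invoked in a fixed synchronous order:

        from collections import defaultdict
        from typing import Callable

        class Broker:
            """Toy stand-in for a cloud message service."""
            def __init__(self) -> None:
                self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

            def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
                self._subs[topic].append(handler)

            def publish(self, topic: str, message: dict) -> None:
                for handler in self._subs[topic]:
                    handler(message)

        broker = Broker()
        # A downstream design tool reacts whenever a repository announces new data.
        broker.subscribe("new_part",
                         lambda msg: print(f"design tool fetching {msg['uri']}"))
        # The repository publishes an event rather than waiting to be polled.
        broker.publish("new_part", {"uri": "https://example.org/parts/promoter_42"})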

    Constraining Type Ia supernova explosions and early flux excesses with the Zwicky Transient Facility

    No full text
    In the new era of time-domain surveys, Type Ia supernovae are being caught sooner after explosion, which has exposed significant variation in their early light curves. Two driving factors for early-time evolution are the distribution of ⁵⁶Ni in the ejecta and the presence of flux excesses of various causes. We perform an analysis of the largest young SN Ia sample to date. We compare 115 SN Ia light curves from the Zwicky Transient Facility to the turtls model grid containing light curves of Chandrasekhar-mass explosions with a range of ⁵⁶Ni masses, ⁵⁶Ni distributions, and explosion energies. We find that the majority of our observed light curves are well reproduced by Chandrasekhar-mass explosion models, with a preference for highly extended ⁵⁶Ni distributions. We identify six SNe Ia with an early-time flux excess in our gr-band data (four 'blue' and two 'red' flux excesses). We find an intrinsic rate of 18 ± 11 per cent of early flux excesses in SNe Ia at z < 0.07, based on three detected flux excesses out of 30 (10 per cent) observed SNe Ia with a simulated detection efficiency of 57 per cent. This is comparable to rates of flux excesses in the literature but also accounts for detection efficiencies. Two of these events are mostly consistent with circumstellar material interaction, while the other four have longer lifetimes, in agreement with companion-interaction and ⁵⁶Ni-clump models. We find a higher frequency of flux excesses in 91T/99aa-like events (44 ± 13 per cent)
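
    The quoted intrinsic rate can be checked with simple arithmetic (our back-of-envelope calculation, using only the numbers given in the abstract):

        # 3 detected excesses out of 30 observed SNe Ia, corrected for a
        # simulated detection efficiency of 57 per cent.
        detected, observed, efficiency = 3, 30, 0.57
        intrinsic_rate = detected / (observed * efficiency)
        print(f"{intrinsic_rate:.0%}")  # ~18%, matching the reported 18 ± 11 per cent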