
    A weekly, continually updated dataset of the probability of large wildfires across western US forests and woodlands

    There is broad consensus that wildfire activity is likely to increase in western US forests and woodlands over the next century. Therefore, spatial predictions of the potential for large wildfires have immediate and growing relevance to near- and long-term research, planning, and management objectives. Fuels, climate, weather, and the landscape all exert controls on wildfire occurrence and spread, but the dynamics of these controls vary from daily to decadal timescales. Accurate spatial predictions of large wildfires should therefore strive to integrate across these variables and timescales. Here, we describe a high spatial resolution dataset (250 m pixel) of the probability of large wildfires (>405 ha) across forests and woodlands in the contiguous western US, from 2005 to the present. The dataset is automatically updated on a weekly basis using Google Earth Engine and a continuous integration pipeline. Each image in the dataset is the output of a random forest machine-learning algorithm, trained on random samples of historic small and large wildfires, and represents the predicted conditional probability of an individual pixel burning in a large fire, given an ignition or fire spread to that pixel. This novel workflow is able to integrate the near-term dynamics of fuels and weather into weekly predictions while also integrating longer-term dynamics of fuels, the climate, and the landscape. As a continually updated product, the dataset can provide operational fire managers with contemporary, on-the-ground information to closely monitor the changing potential for large wildfire occurrence and spread. It can also serve as a foundational dataset for longer-term planning and research, such as the strategic targeting of fuels management, fire-smart development at the wildland–urban interface, and the analysis of trends in wildfire potential over time.
Weekly large fire probability GeoTiff products from 2005 to 2017 are archived on the Figshare online digital repository with the DOI https://doi.org/10.6084/m9.figshare.5765967 (available at https://doi.org/10.6084/m9.figshare.5765967.v1). Weekly GeoTiff products and the entire dataset from 2005 onwards are also continually uploaded to a Google Cloud Storage bucket at https://console.cloud.google.com/storage/wffr-preds/V1 (last access: 14 September 2018) and are available free of charge with a Google account. Continually updated products and the long-term archive are also available to registered Google Earth Engine (GEE) users as public GEE assets and can be accessed with the image collection ID users/mgray/wffr-preds within GEE.
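The modelling step described above — a random forest trained on labelled samples of historic small and large fires, whose per-pixel class probability is read as the conditional probability of a large fire — can be sketched as follows. This is a minimal illustration assuming scikit-learn; the feature names and synthetic training data are placeholders, not the paper's actual predictors or samples.

```python
# Sketch of the random forest probability step described above.
# The three predictors (fuel dryness, wind speed, slope) and the synthetic
# labels are illustrative assumptions, not the dataset's real inputs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 1000

# Hypothetical per-pixel predictors, scaled to [0, 1).
X = rng.random((n, 3))

# Synthetic label: 1 = historic fire grew large (>405 ha), 0 = stayed small.
# Large fires are made more likely here when fuels are dry and wind is high.
y = ((X[:, 0] + X[:, 1] + 0.3 * rng.standard_normal(n)) > 1.2).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Column 1 of predict_proba is the conditional probability of a large fire
# at each pixel, the quantity mapped in the weekly product.
probs = model.predict_proba(X)[:, 1]
print(probs[:5])
```

In the actual workflow these probabilities would be computed for every 250 m pixel each week and written out as a GeoTiff, rather than for a flat sample array as in this sketch.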

    A New, Simple Method for Determining the Composition of Binary Liquid Mixtures

    A new method for determining the composition of binary liquid mixtures using solvatochromic dyes is described. The analysis is carried out by a simple UV/VIS absorption measurement and, using a two-parameter equation, constitutes an accurate rapid test.

    Where is the EU headed given its current climate policy? A stakeholder-driven model inter-comparison.

    Recent calls to do climate policy research with, rather than for, stakeholders have been answered in non-modelling science. Notwithstanding progress in the modelling literature, however, very little of the scenario space traces back to what stakeholders are ultimately concerned about. With a suite of eleven integrated assessment, energy system, and sectoral models, we carry out a model inter-comparison for the EU, the scenario logic and research questions of which have been formulated based on stakeholders' concerns. The output of this process is a scenario framework exploring where the region is headed rather than how to achieve its goals, extrapolating its current policy efforts into the future. We find that Europe is currently on track to overperform its pre-2020 40% target yet falls far short of its newest ambition of 55% emissions cuts by 2030, and is looking at a 1.0–2.35 GtCO2 emissions range in 2050. Aside from the importance of transport electrification, deployment levels of carbon capture and storage are found to be intertwined with deeper emissions cuts and with hydrogen diffusion, with most hydrogen produced post-2040 being blue. Finally, the multi-model exercise has highlighted benefits from deeper decarbonisation in terms of energy security and jobs, and moderate to high renewables-dominated investment needs.

    The Economics of 1.5°C Climate Change

    The economic case for limiting warming to 1.5°C is unclear, due to manifold uncertainties. However, it cannot be ruled out that the 1.5°C target passes a cost-benefit test. Costs are almost certainly high: the median global carbon price in 1.5°C scenarios implemented by various energy models is more than US$100 per metric ton of CO2 in 2020, for example. Benefit estimates range from much lower than this to much higher. Some of these uncertainties may be reduced in the future, raising the question of how to hedge in the near term. Maintaining the option of limiting warming to 1.5°C means targeting it now. Setting off with higher emissions will quickly make 1.5°C unattainable without recourse to expensive large-scale carbon dioxide removal (CDR) or solar radiation management (SRM), which can be cheap but poses ambiguous risks society seems unwilling to take. Carbon pricing could reduce mitigation costs substantially compared with ramping up the current patchwork of regulatory instruments. Nonetheless, a mix of policies is justified, and technology-specific approaches may be required. It is particularly important to step up mitigation finance to developing countries, where emissions abatement is relatively cheap.