
    Best practices for post-processing ensemble climate forecasts, part I: selecting appropriate recalibration methods

    This study describes a systematic approach to selecting optimal statistical recalibration methods and hindcast designs for producing reliable probability forecasts on seasonal-to-decadal time scales. A new recalibration method is introduced that includes adjustments for both unconditional and conditional biases in the mean and variance of the forecast distribution, and linear time-dependent bias in the mean. The complexity of the recalibration can be systematically varied by restricting the parameters. Simple recalibration methods may outperform more complex ones given limited training data. A new cross-validation methodology is proposed that allows the comparison of multiple recalibration methods and varying training periods using limited data. Part I considers the effect on forecast skill of varying the recalibration complexity and training period length. The interaction between these factors is analysed for grid box forecasts of annual mean near-surface temperature from the CanCM4 model. Recalibration methods that include conditional adjustment of the ensemble mean outperform simple bias correction by issuing climatological forecasts where the model has limited skill. Trend-adjusted forecasts outperform forecasts without trend adjustment at almost 75% of grid boxes. The optimal training period is around 30 years for trend-adjusted forecasts, and around 15 years otherwise. The optimal training period is strongly related to the length of the optimal climatology. Longer training periods may increase overall performance, but at the expense of very poor forecasts where skill is limited.
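The mean-and-variance adjustment described in this abstract can be sketched roughly as follows. This is a minimal illustration only: the function name, the least-squares estimators, and the omission of the linear time-dependent trend term are all assumptions, not the paper's exact recalibration.

```python
import numpy as np

def recalibrate_forecast(ens_train, obs_train, ens_new):
    """Illustrative mean/variance recalibration of an ensemble forecast.

    ens_train: (years, members) hindcast ensemble; obs_train: (years,)
    verifying observations; ens_new: (members,) forecast to recalibrate.
    """
    m_train = ens_train.mean(axis=1)                                 # hindcast ensemble means
    b = np.cov(obs_train, m_train)[0, 1] / np.var(m_train, ddof=1)   # conditional mean adjustment
    a = obs_train.mean() - b * m_train.mean()                        # unconditional mean bias
    resid = obs_train - (a + b * m_train)                            # errors after mean adjustment
    s = resid.std() / ens_train.std(axis=1).mean()                   # spread (variance) adjustment
    m_new = ens_new.mean()
    return a + b * m_new + s * (ens_new - m_new)                     # recalibrated members
```

Restricting the parameters (e.g. fixing b = 1 or s = 1) recovers progressively simpler recalibrations, down to plain bias correction, which is the kind of complexity hierarchy the abstract describes.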

    An assessment of road-verge grass as a feedstock for farm-fed anaerobic digestion plants

    Cuttings from road-verge grass could provide biomass for energy generation, but currently this potential is not exploited. This research assessed the technical, practical and financial feasibility of using grass harvested from road verges as a feedstock in farm-fed anaerobic digestion (AD) plants. The methane potential (191 mL CH4 gDM−1) and digestion characteristics of verge grass were similar to those of current farm feedstocks, indicating suitability for AD. Ensiling had no significant impact on the biomethane generated. Testing co-digestions of verge grass with current farm feedstocks showed enhanced methane yields, suggesting that verge grass could be a valuable addition to AD feedstock mixes. In a case study of the UK county of Lincolnshire, potential volumes and locations of verge grass biomass were estimated, together with the capacities and locations of existing AD plants, to assess the potential to supply practical grass volumes. Grass harvesting costs were modelled and compared with other feedstock costs. Finally, the attitudes of AD operators to using verge grass were investigated to understand whether a market for verge grass exists. In a small survey, all operators were willing to use it as a feedstock and most were prepared to pay over the estimated harvesting cost. If verge grass were legally recognised as a waste product, it could be attractive to AD operators, especially where financial incentives to use waste feedstocks are in place. In rural areas, verge grass could be harvested and co-digested by existing farm-fed AD plants, potentially reducing the cost of road verge maintenance and increasing biodiversity.

    Damage function for historic paper. Part I: Fitness for use

    Background: In heritage science literature and in preventive conservation practice, damage functions are used to model material behaviour, and specifically damage (unacceptable change), resulting from the presence of a stressor over time. For such functions to be of use in the context of collection management, it is important to define a range of parameters, such as who the stakeholders are (e.g. the public, curators, researchers), the mode of use (e.g. display, storage, manual handling), the long-term planning horizon (i.e. when in the future it is deemed acceptable for an item to become damaged or unfit for use), and the threshold of damage, i.e. the extent of physical change assessed as damage. Results: In this paper, we explore the threshold of fitness for use for archival and library paper documents used for display or reading in the context of access in reading rooms by the general public. Change is considered in the context of discolouration and mechanical deterioration such as tears and missing pieces: forms of physical deterioration that accumulate with time in libraries and archives. We also explore whether the threshold of fitness for use is defined differently for objects perceived to be of different value, and for different modes of use. The data were collected in a series of fitness-for-use workshops carried out with readers/visitors in heritage institutions using principles of Design of Experiments. Conclusions: The results show that when no particular value is pre-assigned to an archival or library document, missing pieces influenced readers'/visitors' subjective judgements of fitness for use to a greater extent than did discolouration and tears (which had little or no influence). This finding was most apparent in the display context in comparison to the reading room context. The finding also applied most strongly when readers/visitors were not given a value scenario (in comparison to when they were asked to think about the document having personal or historic value). It can be estimated that, in general, items become unfit when text is evidently missing. However, if the visitor/reader is prompted to think of a document in terms of its historic value, then change in a document has little impact on fitness for use.

    Cooperation and Contagion in Web-Based, Networked Public Goods Experiments

    A longstanding idea in the literature on human cooperation is that cooperation should be reinforced when conditional cooperators are more likely to interact. In the context of social networks, this idea implies that cooperation should fare better in highly clustered networks such as cliques than in networks with low clustering such as random networks. To test this hypothesis, we conducted a series of web-based experiments, in which 24 individuals played a local public goods game arranged on one of five network topologies that varied between disconnected cliques and a random regular graph. In contrast with previous theoretical work, we found that network topology had no significant effect on average contributions. This result implies either that individuals are not conditional cooperators, or else that cooperation does not benefit from positive reinforcement between connected neighbors. We then tested both of these possibilities in two subsequent series of experiments in which artificial seed players were introduced, making either full or zero contributions. First, we found that although players did generally behave like conditional cooperators, they were as likely to decrease their contributions in response to low-contributing neighbors as they were to increase their contributions in response to high-contributing neighbors. Second, we found that positive effects of cooperation were contagious only to direct neighbors in the network. In total, we report on 113 human-subjects experiments, highlighting the speed, flexibility, and cost-effectiveness of web-based experiments over those conducted in physical labs.
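A minimal payoff rule for a local public goods game of the kind described above might look like the following sketch. The endowment, multiplier, and all names are illustrative assumptions, not the experiment's actual parameters.

```python
def payoffs(contributions, adjacency, endowment=10.0, multiplier=0.4):
    """Per-round payoffs in a local public goods game (illustrative).

    Each player keeps whatever they did not contribute, plus a share
    of the pool formed by their own contribution and those of their
    direct network neighbours.
    """
    n = len(contributions)
    result = []
    for i in range(n):
        kept = endowment - contributions[i]                      # uncontributed endowment
        group = [i] + [j for j in range(n) if adjacency[i][j]]   # local neighbourhood incl. self
        pool = multiplier * sum(contributions[j] for j in group) # shared return from local pool
        result.append(kept + pool)
    return result
```

Under such a rule, a player's return depends only on their direct neighbours' contributions, which is what makes the network topology (cliques versus random graphs) a natural treatment variable.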

    How many crowdsourced workers should a requester hire?

    Recent years have seen an increased interest in crowdsourcing as a way of obtaining information from a potentially large group of workers at a reduced cost. The crowdsourcing process, as we consider it in this paper, is as follows: a requester hires a number of workers to work on a set of similar tasks. After completing the tasks, each worker reports back outputs. The requester then aggregates the reported outputs to obtain aggregate outputs. A crucial question that arises during this process is: how many crowd workers should a requester hire? In this paper, we investigate from an empirical perspective the optimal number of workers a requester should hire when crowdsourcing tasks, with a particular focus on the crowdsourcing platform Amazon Mechanical Turk. Specifically, we report the results of three studies involving different tasks and payment schemes. We find that both the expected error in the aggregate outputs and the risk of a poor combination of workers decrease as the number of workers increases. Surprisingly, we find that the optimal number of workers a requester should hire for each task is around 10 to 11, regardless of the underlying task and payment scheme. To derive such a result, we employ a principled analysis based on bootstrapping and segmented linear regression. Besides the above result, we also find that, overall, top-performing workers are more consistent across multiple tasks than other workers. Our results thus contribute to a better understanding of, and provide new insights into, how to design more effective crowdsourcing processes.
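The bootstrapping step of such an analysis can be sketched roughly as follows. Aggregation by the mean, and all names and defaults, are assumptions for illustration, not the authors' exact procedure.

```python
import random
import statistics

def expected_error(worker_outputs, truth, k, n_boot=2000, seed=0):
    """Bootstrap estimate of the expected aggregate error with k workers.

    Repeatedly resample k reported outputs with replacement, aggregate
    them by the mean, and average the absolute error against a known
    ground truth.
    """
    rng = random.Random(seed)
    errors = []
    for _ in range(n_boot):
        sample = [rng.choice(worker_outputs) for _ in range(k)]  # hire k workers at random
        errors.append(abs(statistics.mean(sample) - truth))      # error of the aggregate
    return statistics.mean(errors)
```

Evaluating this curve for increasing k, and fitting a segmented linear regression to locate the point where extra workers stop paying off, is the kind of analysis the abstract describes.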

    Transport Through Andreev Bound States in a Graphene Quantum Dot

    Andreev reflection, in which an electron in a normal metal backscatters off a superconductor into a hole, forms the basis of low-energy transport through superconducting junctions. Andreev reflection in confined regions gives rise to discrete Andreev bound states (ABS), which can carry a supercurrent and have recently been proposed as the basis of qubits [1-3]. Although signatures of Andreev reflection and bound states in conductance have been widely reported [4], it has been difficult to directly probe individual ABS. Here, we report transport measurements of sharp, gate-tunable ABS formed in a superconductor-quantum dot (QD)-normal system, which incorporates graphene. The QD exists in the graphene under the superconducting contact, due to a work-function mismatch [5, 6]. The ABS form when the discrete QD levels are proximity coupled to the superconducting contact. Due to the low density of states of graphene and the sensitivity of the QD levels to an applied gate voltage, the ABS spectra are narrow, can be tuned to zero energy via gate voltage, and show a striking pattern in transport measurements.

    Collapse of superconductivity in a hybrid tin-graphene Josephson junction array

    When a Josephson junction array is built with hybrid superconductor/metal/superconductor junctions, a quantum phase transition from a superconducting to a two-dimensional (2D) metallic ground state is predicted to happen upon increasing the junction normal-state resistance. Owing to its surface-exposed 2D electron gas and its gate-tunable charge carrier density, graphene coupled to superconductors is the ideal platform to study the above-mentioned transition between ground states. Here we show that decorating graphene with a sparse and regular array of superconducting nanodisks makes it possible to continuously gate-tune the quantum superconductor-to-metal transition of the Josephson junction array into a zero-temperature metallic state. The suppression of proximity-induced superconductivity is a direct consequence of the emergence of quantum fluctuations of the superconducting phase of the disks. Under perpendicular magnetic field, the competition between quantum fluctuations and disorder is responsible for the resilience at the lowest temperatures of a superconducting glassy state that persists above the upper critical field. Our results provide the entire phase diagram of the disorder- and magnetic field-tuned transition and unveil the fundamental impact of quantum phase fluctuations in 2D superconducting systems.

    Upper- and mid-mantle interaction between the Samoan plume and the Tonga-Kermadec slabs

    Mantle plumes are thought to play a key role in transferring heat from the core–mantle boundary to the lithosphere, where it can significantly influence plate tectonics. On impinging on the lithosphere at spreading ridges or in intra-plate settings, mantle plumes may generate hotspots, large igneous provinces and hence considerable dynamic topography. However, the active role of mantle plumes on subducting slabs remains poorly understood. Here we show that the stagnation at 660 km and the fast trench retreat of the Tonga slab in the southwestern Pacific are consistent with an interaction with the Samoan plume and the Hikurangi plateau. Our findings are based on comparisons between 3D anisotropic tomography images and 3D petrological-thermo-mechanical models, which self-consistently explain several unique features of the Fiji–Tonga region. We identify four possible slip systems of bridgmanite in the lower mantle that reconcile the observed seismic anisotropy beneath the Tonga slab (VSH > VSV) with thermo-mechanical calculations.