
    Grassland biodiversity restoration increases resistance of carbon fluxes to drought

    Evidence suggests that the restoration of plant diversity in grasslands brings benefits not only for biodiversity conservation, but also for the delivery of ecosystem services. While biodiversity-function experiments show that greater plant diversity increases the resistance of plant productivity to climate extremes, it is not known whether real-world management options for grassland restoration likewise stabilize ecosystem responses to extreme climate events. We used a long-term (23-year) field experiment in northern England to test the hypothesis that management aimed at biodiversity restoration increases the resistance and recovery of ecosystem carbon (C) fluxes to short-term summer drought. This was tested by measuring plant, soil and microbial responses to a simulated drought in experimental grassland plots where fertilizer application and seed addition have been managed to enhance plant species diversity. The cessation of fertilizer application brought about small increases in plant species richness; it also reduced overall plant productivity and promoted hemi-parasitic plants at the expense of grasses and forbs. Resistance of CO2 fluxes to drought, measured as ecosystem respiration, was greater in non-fertilized plots because lower plant biomass reduced water demand, an effect likely aided by proportionally more hemi-parasitic plants further reducing plant biomass. Additionally, legumes increased in biomass under drought, thereby contributing to the overall resistance of plant productivity. Recovery of soil microbial C and nitrogen after rewetting was more rapid than that of soil microbial community composition, irrespective of restoration treatment, suggesting high resilience of soil microbial communities to drought. Synthesis and applications: This study shows that while grassland diversity restoration management increases the resistance of carbon fluxes to drought, it also reduces agricultural yields, revealing a trade-off for land managers. Furthermore, legumes promoted through long-term restoration treatments can help to maintain plant community productivity under drought by increasing their biomass. As such, grassland management strategies have consequences not only for ecosystem processes, but also for the capacity to withstand extreme weather events.

    Porting Decision Tree Algorithms to Multicore using FastFlow

    The whole computer hardware industry has embraced multicores. On these machines, extreme optimisation of sequential algorithms is no longer sufficient to exploit the full power of the hardware, which can only be reached via thread-level parallelism. Decision tree algorithms exhibit natural concurrency that makes them well suited to parallelisation. This paper presents an approach for easy-yet-efficient porting of an implementation of the C4.5 algorithm to multicores. The parallel port requires minimal changes to the original sequential code and achieves up to a 7x speedup on an Intel dual quad-core machine.
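    The paper's implementation is in C++ on FastFlow; the sketch below is not that code, but a minimal Python illustration of the natural concurrency the abstract refers to: in C4.5, the information gain of each candidate split attribute can be evaluated independently, so the per-attribute search can be farmed out to parallel workers. All function and variable names here are hypothetical.

```python
# Hypothetical sketch (not the paper's FastFlow/C++ code): C4.5's per-node
# work is dominated by evaluating one candidate split per attribute, and
# those evaluations are independent, so they can run in parallel workers.
from concurrent.futures import ProcessPoolExecutor
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def gain_for_attribute(args):
    """Information gain of splitting `rows` on categorical column `attr`."""
    rows, labels, attr = args
    parts = {}
    for row, y in zip(rows, labels):
        parts.setdefault(row[attr], []).append(y)
    remainder = sum(len(p) / len(labels) * entropy(p) for p in parts.values())
    return attr, entropy(labels) - remainder

def best_split(rows, labels, attrs, workers=8):
    """Evaluate all candidate attributes in parallel; keep the best gain."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(gain_for_attribute, [(rows, labels, a) for a in attrs])
    return max(results, key=lambda t: t[1])  # (attribute, gain)

if __name__ == "__main__":
    rows = [("sunny", "hot"), ("rain", "mild"), ("sunny", "mild"), ("rain", "hot")]
    labels = ["no", "yes", "yes", "no"]
    print(best_split(rows, labels, attrs=[0, 1], workers=2))  # -> (1, 1.0)
```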

    When to Censor?

    Loss to follow-up is an endemic feature of time-to-event analyses that precludes observation of the event of interest. To our knowledge, in typical cohort studies with encounters occurring at regular or irregular intervals, there is no consensus on how to handle person-time between a participant's last study encounter and the point at which they meet the definition of loss to follow-up. We demonstrate, using simulation and an example, that when the event of interest is captured outside of a study encounter (e.g., in a registry), person-time should be censored when the study-defined criterion for loss to follow-up is met (e.g., 1 year after the last encounter), rather than at the last study encounter. Conversely, when the event of interest must be measured within the context of a study encounter (e.g., a biomarker value), person-time should be censored at the last study encounter. An inappropriate censoring scheme has the potential to introduce substantial bias that may not be easily corrected.
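    A minimal sketch of the two censoring schemes described above, with hypothetical field names; `grace` stands in for the study-defined loss-to-follow-up window (e.g., 1 year after the last encounter).

```python
# Minimal sketch of the two censoring schemes; all names are hypothetical.

def censor_time(last_encounter, event_time, event_from_registry, grace=1.0):
    """Return (time, event_indicator) under the appropriate scheme.

    If the event is captured outside study encounters (e.g., in a registry),
    person-time runs until last_encounter + grace; otherwise observation
    ends at the last study encounter itself.
    """
    admin_end = last_encounter + grace if event_from_registry else last_encounter
    if event_time is not None and event_time <= admin_end:
        return event_time, 1   # event observed before censoring
    return admin_end, 0        # censored at the scheme-specific time

# A registry-captured death 0.6 years after the last visit still counts:
print(censor_time(last_encounter=3.0, event_time=3.6, event_from_registry=True))
# The same death is (correctly) excluded for an encounter-measured outcome:
print(censor_time(last_encounter=3.0, event_time=3.6, event_from_registry=False))
```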

    Revisiting the Cache Miss Analysis of Multithreaded Algorithms


    A framework for digital sunken relief generation based on 3D geometric models

    Sunken relief is a special art form of sculpture in which the depicted shapes are sunk into a given surface. It is traditionally created by laboriously carving materials such as stone. Sunken reliefs often use engraved lines or strokes to strengthen the impression of a 3D presence and to highlight features that would otherwise remain hidden. In other types of relief, smooth surfaces and their shadows convey such information in a coherent manner. Existing methods for relief generation focus on forming a smooth surface with a shallow depth that conveys the presence of 3D figures. Such methods unfortunately do not serve the art form of sunken relief, as they omit the feature lines. We propose a framework to produce sunken reliefs from a known 3D geometry, which transforms the 3D objects into three layers of input so as to incorporate the contour lines seamlessly with the smooth surfaces. The three input layers exploit both the geometric information and the visual cues to assist the relief generation. The framework adapts existing techniques in line drawing and relief generation, and combines them organically for this particular purpose.
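    The abstract does not specify the three layers' contents, so the sketch below is only an illustration of the general idea it gestures at: compress a depth map into a shallow base relief, extract feature lines from depth discontinuities, and engrave grooves along those lines. All parameters and thresholds are illustrative, not from the paper.

```python
# Illustrative sketch only, not the paper's three-layer framework.
import numpy as np

def sunken_relief(depth, relief_depth=0.1, groove=0.05):
    """Compress a depth map into a shallow relief and engrave feature lines.

    depth: 2D array of scene depths (larger = farther).
    """
    # Layer 1: smooth base relief via linear depth compression.
    d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-12)
    base = -relief_depth * (1.0 - d)  # sunken: nearer points are cut deeper

    # Layer 2: feature lines from depth discontinuities (gradient magnitude).
    gy, gx = np.gradient(depth)
    mag = np.hypot(gx, gy)
    edges = mag > 0.5 * mag.std()

    # Layer 3: engrave grooves along feature lines to strengthen the 3D cue.
    relief = base.copy()
    relief[edges] -= groove
    return relief

depth = np.fromfunction(lambda i, j: np.hypot(i - 32, j - 32), (64, 64))
print(sunken_relief(depth).shape)  # (64, 64) height field
```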

    Sensitivity analyses for misclassification of cause of death in the parametric G-formula

    Cause-specific mortality is an important outcome in studies of interventions to improve survival, yet causes of death can be misclassified. Here, we present an approach to performing sensitivity analyses for misclassification of cause of death in the parametric g-formula. The g-formula is a useful method to estimate effects of interventions in epidemiologic research because it appropriately accounts for time-varying confounding affected by prior treatment and can estimate risk under dynamic treatment plans. We illustrate our approach using an example comparing acquired immune deficiency syndrome (AIDS)-related mortality under immediate and delayed treatment strategies in a cohort of therapy-naive adults entering care for human immunodeficiency virus infection in the United States. In the standard g-formula approach, the 10-year risk of AIDS-related mortality under delayed treatment was 1.73 (95% CI: 1.17, 2.54) times the risk under immediate treatment. In a sensitivity analysis assuming that AIDS-related death was measured with a sensitivity of 95% and a specificity of 90%, the 10-year risk ratio comparing AIDS-related mortality between treatment plans was 1.89 (95% CI: 1.13, 3.14). When sensitivity and specificity are unknown, this approach can be used to estimate the effects of dynamic treatment plans under a range of plausible values of the sensitivity and specificity of the recorded event type.
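    A minimal sketch of the kind of correction involved, not the authors' full g-formula implementation: a Rogan-Gladen-style inversion that maps an observed cause-specific risk to a corrected risk under assumed sensitivity and specificity of the recorded event type, swept over a grid of plausible values as the abstract suggests. Numbers are illustrative.

```python
# Minimal sketch, not the authors' full parametric g-formula: invert the
# misclassification model observed = se*true + (1 - sp)*(1 - true).

def corrected_risk(observed_risk, se=0.95, sp=0.90):
    """Rogan-Gladen-style correction for an observed cause-specific risk."""
    true = (observed_risk - (1 - sp)) / (se - (1 - sp))
    return min(max(true, 0.0), 1.0)  # clamp to [0, 1]

# Sweep plausible (se, sp) values when they are unknown:
for se in (0.90, 0.95, 1.00):
    for sp in (0.85, 0.90, 0.95):
        print(f"se={se:.2f} sp={sp:.2f} -> corrected risk "
              f"{corrected_risk(0.12, se, sp):.3f}")
```

    Embedding such a correction in the g-formula itself is more involved (the event type is re-evaluated at each time step of the simulation), but the inversion above is the basic ingredient.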

    Sensitivity analyses for effect modifiers not observed in the target population when generalizing treatment effects from a randomized controlled trial: Assumptions, models, effect scales, data scenarios, and implementation details

    Background: Results from randomized controlled trials are used to inform policy and practice for broad populations. The average treatment effect (ATE) for a target population, however, may differ from the ATE observed in a trial if there are effect modifiers whose distribution in the target population differs from that in the trial. Methods exist to use trial data to estimate the target population ATE, provided the distributions of treatment effect modifiers are observed in both the trial and the target population, an assumption that may not hold in practice. Methods: The proposed sensitivity analyses address the situation where a treatment effect modifier is observed in the trial but not in the target population. These methods are based on an outcome model, or on the combination of such a model and a weighting adjustment for observed differences between the trial sample and the target population. They accommodate several types of outcome models: linear models (including a single-time outcome, and pre- and post-treatment outcomes) for additive effects, and models with a log or logit link for multiplicative effects. We clarify the methods' assumptions and provide detailed implementation instructions. Illustration: We illustrate the methods using an example generalizing the effects of an HIV treatment regimen from a randomized trial to a relevant target population. Conclusion: These methods allow researchers and decision-makers to have more appropriate confidence when drawing conclusions about target population effects.
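    A hedged sketch of the core idea for the additive-effect case: when the modifier's distribution is unobserved in the target population, treat it as a sensitivity parameter and standardize the stratum-specific trial effects to each assumed distribution. The numbers below are illustrative, not from the paper.

```python
# Hedged sketch of the additive-effect case: the unobserved modifier's
# target-population prevalence is the sensitivity parameter. All numbers
# are illustrative.

# Stratum-specific effects by modifier level, estimated in the trial:
trial_effect = {"modifier=0": 2.0, "modifier=1": 5.0}
trial_prevalence = 0.30  # share with modifier=1 among trial participants

def target_ate(assumed_prevalence):
    """Target-population ATE under an assumed modifier prevalence."""
    return ((1 - assumed_prevalence) * trial_effect["modifier=0"]
            + assumed_prevalence * trial_effect["modifier=1"])

print(f"trial ATE: {target_ate(trial_prevalence):.2f}")
for p in (0.1, 0.3, 0.5, 0.7):  # sensitivity analysis over plausible values
    print(f"assumed prevalence {p:.1f} -> target ATE {target_ate(p):.2f}")
```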

    Systematic model behavior of adsorption on flat surfaces

    A low-density film on a flat surface is described by an expansion involving the first four virial coefficients. The first coefficient (alone) yields the Henry's law regime, while the next three correct for the effects of interactions. The results permit exploration of the idea of universal adsorption behavior, which is compared with experimental data for a number of systems.
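    In standard two-dimensional virial notation (assumed here, as the abstract gives no formulas), the expansion described reads:

```latex
% Spreading pressure \phi of a 2D film of N atoms on area A, expanded in the
% areal density n = N/A through the fourth virial coefficient, where
% B_2..B_4 are the temperature-dependent 2D virial coefficients:
\[
  \frac{\phi}{n k_B T} \;=\; 1 + B_2(T)\,n + B_3(T)\,n^2 + B_4(T)\,n^3 .
\]
% Keeping only the leading (ideal 2D gas) term gives the Henry's law
% regime, in which coverage is proportional to the ambient vapor pressure:
\[
  n \;\propto\; P \qquad (n \to 0).
\]
```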

    Exploring passenger rail markets using new station catchment size and shape metrics

    This paper presents a novel spatial market segmentation method to determine the key user groups of a train station (such as by gender, age and access mode), based on the size and shape of each group's station catchment area. Two new indices, the area ratio and the composite ratio, are developed to quantify the importance of user groups for a train station. The method is applied to identify key user groups at seven train stations in Perth, Western Australia. The study offers a new way to explore the travel behaviour of train users and provides insights for rail transport planning and marketing.
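    The abstract names the two indices without defining them, so the sketch below shows only one natural reading of the area ratio: a group's catchment area relative to the station's overall catchment, with catchments approximated as convex hulls of users' home locations. Coordinates are illustrative; requires the shapely package.

```python
# Plausible sketch only; the paper's exact index definitions are not given
# in the abstract. Catchments are approximated as convex hulls here.
from shapely.geometry import MultiPoint

def catchment_area(points):
    """Catchment approximated as the convex hull of users' home locations."""
    return MultiPoint(points).convex_hull.area

all_users = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 5)]
female_users = [(0, 0), (4, 0), (2, 5)]  # one hypothetical user group

area_ratio = catchment_area(female_users) / catchment_area(all_users)
print(f"area ratio (female vs. all users): {area_ratio:.2f}")
```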

    Virologic suppression and CD4+ cell count recovery after initiation of raltegravir or efavirenz-containing HIV treatment regimens

    Objective: To explore the effectiveness of raltegravir-based antiretroviral therapy (ART) on treatment response among ART-naive patients seeking routine clinical care. Design: Cohort study of adults enrolled in HIV care in the United States. Methods: We compared virologic suppression and CD4+ cell count recovery over a 2.5-year period after initiation of an ART regimen containing raltegravir or efavirenz, using observational data from a US clinical cohort, generalized to the US population of people with diagnosed HIV. We accounted for nonrandom treatment assignment, informative censoring, and nonrandom selection from the US target population using inverse probability weights. Results: Of the 2843 patients included in the study, 2476 initiated the efavirenz-containing regimen and 367 initiated the raltegravir-containing regimen. In the weighted intent-to-treat analysis, patients spent an average of 74 (95% confidence interval: 41, 106) additional days alive with a suppressed viral load on the raltegravir regimen than on the efavirenz regimen over the 2.5-year study period. CD4+ cell count recovery was also superior under the raltegravir regimen. Conclusion: Patients receiving raltegravir spent more time alive and suppressed than patients receiving efavirenz, but the probability of viral suppression by 2.5 years after treatment initiation was similar between groups. Optimizing the amount of time spent in a state of viral suppression is important for improving survival among people living with HIV and for reducing onward transmission.
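    A hedged sketch of one ingredient named above, inverse probability of treatment weighting; the study additionally weights for censoring and selection into the target population. The data and covariates below are simulated, and the variable names are hypothetical.

```python
# Hedged sketch of inverse probability of treatment weighting only; the
# study also uses censoring and selection weights. Data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2843
X = rng.normal(size=(n, 3))                   # baseline covariates
p_treat = 1 / (1 + np.exp(-(X[:, 0] - 1.8)))  # raltegravir less common
treated = rng.binomial(1, p_treat)

# Propensity score from a logistic model of treatment on covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Stabilized weights: marginal treatment probability over the propensity.
weights = np.where(treated == 1,
                   treated.mean() / ps,
                   (1 - treated.mean()) / (1 - ps))
print(f"treated: {treated.sum()}, mean weight: {weights.mean():.2f}")
```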