
    Recent advances in understanding and measurement of mercury in the environment: Terrestrial Hg cycling

    This review documents recent advances in terrestrial mercury (Hg) cycling. Terrestrial Hg research has matured in some areas and is developing rapidly in others. We summarize the state of the science circa 2010 as a starting point, and then present the advances of the last decade in three areas: land use, sulfate deposition, and climate change. The advances are presented in the framework of three Hg "gateways" to the terrestrial environment: inputs from the atmosphere, uptake in food, and runoff with surface water. Among the most notable advances:
    - The Arctic has emerged as a hotbed of Hg cycling, with high stream fluxes and large stores of Hg poised for release from permafrost with rapid high-latitude warming.
    - The bi-directional exchange of Hg between the atmosphere and terrestrial surfaces is better understood, thanks largely to interpretation from Hg isotopes; the latest estimates place land-surface Hg re-emission lower than previously thought.
    - Artisanal gold mining is now thought responsible for over half the global stream flux of Hg.
    - There is evidence that decreasing inputs of Hg to ecosystems may bring recovery sooner than expected, despite large ecosystem stores of legacy Hg.
    - Freshly deposited Hg is more likely than stored Hg to methylate and be incorporated in rice.
    - Topography and hydrological connectivity have emerged as master variables for explaining the disparate responses of THg and MeHg to forest harvest and other land disturbance.
    These and other advances reported here are of value in evaluating the effectiveness of the Minamata Convention in reducing environmental Hg exposure of humans and wildlife.

    Intercomparison of snow depth retrievals over Arctic sea ice from radar data acquired by Operation IceBridge

    Since 2009, the ultra-wideband snow radar on Operation IceBridge (OIB; a NASA airborne mission to survey the polar ice covers) has acquired data in annual campaigns conducted during the Arctic and Antarctic springs. Progressive improvements in radar hardware and data processing methodologies have led to improved data quality for subsequent retrieval of snow depth. Existing retrieval algorithms differ in the way the air–snow (a–s) and snow–ice (s–i) interfaces are detected and localized in the radar returns, and in how the system limitations (e.g., noise, resolution) are addressed. In 2014, the Snow Thickness On Sea Ice Working Group (STOSIWG) was formed and tasked with investigating how radar data quality affects snow depth retrievals and how retrievals from the various algorithms differ. The goal is to understand the limitations of the estimates and to produce a well-documented, long-term record that can be used for understanding broader changes in the Arctic climate system. Here, we assess five retrieval algorithms by comparison with field measurements from two ground-based campaigns (the BRomine, Ozone, and Mercury EXperiment (BROMEX) at Barrow, Alaska, and a field program by Environment and Climate Change Canada at Eureka, Nunavut), as well as against available climatology and snowfall from ERA-Interim reanalysis. The aim is to examine available algorithms and to use the assessment results to inform the development of future approaches. We present results from these assessments and highlight key considerations for the production of a long-term, calibrated geophysical record of springtime snow thickness over Arctic sea ice.
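    The algorithms compared here differ chiefly in how the a–s and s–i interfaces are picked from each radar return. As a rough illustration of the idea (not any specific STOSIWG algorithm), the minimal sketch below takes the a–s interface as the first threshold crossing above an estimated noise floor and the s–i interface as the strongest later return, scaling the bin separation by an assumed snow refractive index; the waveform layout, noise window, and all parameter values are assumptions.

```python
# Minimal sketch of a two-interface snow-depth pick from one radar return.
# All parameters (noise window, threshold factor, refractive index) are
# illustrative assumptions, not values from any OIB retrieval algorithm.
import numpy as np

def pick_snow_depth(waveform, bin_spacing_m, n_noise_bins=100, k_sigma=5.0):
    """Return (a-s bin, s-i bin, snow depth in metres), or None if no pick."""
    noise = waveform[:n_noise_bins]                   # assume leading bins are noise only
    threshold = noise.mean() + k_sigma * noise.std()  # detection level above the noise floor
    above = np.flatnonzero(waveform > threshold)
    if above.size == 0:
        return None                                   # no return above the noise floor
    i_as = int(above[0])                              # first crossing: air-snow interface
    i_si = i_as + int(np.argmax(waveform[i_as:]))     # strongest later return: snow-ice
    if i_si <= i_as:
        return None                                   # interfaces not separable
    # Range bins assume free-space propagation; waves travel slower in snow,
    # so divide by an assumed snow refractive index (~1.28 for ~300 kg/m^3 snow).
    depth_m = (i_si - i_as) * bin_spacing_m / 1.28
    return i_as, i_si, depth_m
```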

    An early developmental vertebrate model for nanomaterial safety: Bridging cell-based and mammalian toxicity assessment

    Background. With the rise in production of nanoparticles for an ever-increasing number of applications, there is an urgent need to efficiently assess their potential toxicity. We propose a nanoparticle hazard assessment protocol that combines mammalian cytotoxicity data with embryonic vertebrate abnormality scoring to determine an overall toxicity index. Results. We observed that, after exposure to a range of nanoparticles, Xenopus phenotypic scoring showed a strong correlation with cell-based in vitro assays. Magnetite-cored nanoparticles, negative for toxicity both in vitro and in Xenopus, were further confirmed as non-toxic in mice. Conclusion. The results highlight the potential of Xenopus embryo analysis as a fast screening approach for toxicity assessment of nanoparticles, which could be introduced for the routine testing of nanomaterials.
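    The abstract describes combining mammalian cytotoxicity data with Xenopus abnormality scoring into an overall toxicity index but does not give the formula; the sketch below is therefore a purely hypothetical illustration of such a combination, with assumed 0-1 scales and equal weighting.

```python
# Hypothetical sketch of folding two assay readouts into one toxicity index.
# The 0-1 scales and equal weighting are assumptions for illustration only;
# the study's actual scoring scheme is defined in its methods.
def toxicity_index(cell_viability_frac, abnormality_score, max_score=5):
    """cell_viability_frac: surviving cell fraction in vitro (1.0 = no effect).
    abnormality_score: Xenopus phenotypic severity (0 = normal .. max_score)."""
    cytotoxicity = 1.0 - cell_viability_frac       # 0 = non-toxic, 1 = fully cytotoxic
    phenotype = abnormality_score / max_score      # normalise to 0-1
    return 0.5 * cytotoxicity + 0.5 * phenotype    # equal weights: assumed

# A particle that is benign in both assays scores near zero:
print(toxicity_index(cell_viability_frac=0.95, abnormality_score=0))  # -> 0.025
```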

    Improving the normalization of complex interventions: measure development based on normalization process theory (NoMAD): study protocol

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions. Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users’ manual for the measures. Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures, and test the utility of the measures in six healthcare intervention settings. Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

    Why is it difficult to implement e-health initiatives? A qualitative study

    Background: The use of information and communication technologies in healthcare is seen as essential for high quality and cost-effective healthcare. However, implementation of e-health initiatives has often been problematic, with many failing to demonstrate predicted benefits. This study aimed to explore and understand the experiences of implementers - the senior managers and other staff charged with implementing e-health initiatives - and their assessment of factors which promote or inhibit the successful implementation, embedding, and integration of e-health initiatives. Methods: We used a case study methodology, using semi-structured interviews with implementers for data collection. Case studies were selected to provide a range of healthcare contexts (primary, secondary, community care), e-health initiatives, and degrees of normalization. The initiatives studied were Picture Archiving and Communication System (PACS) in secondary care, a Community Nurse Information System (CNIS) in community care, and Choose and Book (C&B) across the primary-secondary care interface. Implementers were selected to provide a range of seniority, including chief executive officers, middle managers, and staff with 'on the ground' experience. Interview data were analyzed using a framework derived from Normalization Process Theory (NPT). Results: Twenty-three interviews were completed across the three case studies. There were wide differences in experiences of implementation and embedding across these case studies; these differences were well explained by the collective action components of NPT. New technology was most likely to 'normalize' where implementers perceived that it had a positive impact on interactions between professionals and patients and between different professional groups, and fit well with the organisational goals and skill sets of existing staff. However, where implementers perceived problems in one or more of these areas, they also perceived a lower level of normalization. Conclusions: Implementers had rich understandings of barriers and facilitators to successful implementation of e-health initiatives, and their views should continue to be sought in future research. NPT can be used to explain observed variations in implementation processes, and may be useful in drawing planners' attention to potential problems with a view to addressing them during implementation planning.

    Parent-of-origin-specific allelic associations among 106 genomic loci for age at menarche.

    Age at menarche is a marker of timing of puberty in females. It varies widely between individuals, is a heritable trait and is associated with risks for obesity, type 2 diabetes, cardiovascular disease, breast cancer and all-cause mortality. Studies of rare human disorders of puberty and animal models point to a complex hypothalamic-pituitary-hormonal regulation, but the mechanisms that determine pubertal timing and underlie its links to disease risk remain unclear. Here, using genome-wide and custom-genotyping arrays in up to 182,416 women of European descent from 57 studies, we found robust evidence (P < 5 × 10⁻⁸) for 123 signals at 106 genomic loci associated with age at menarche. Many loci were associated with other pubertal traits in both sexes, and there was substantial overlap with genes implicated in body mass index and various diseases, including rare disorders of puberty. Menarche signals were enriched in imprinted regions, with three loci (DLK1-WDR25, MKRN3-MAGEL2 and KCNK9) demonstrating parent-of-origin-specific associations concordant with known parental expression patterns. Pathway analyses implicated nuclear hormone receptors, particularly retinoic acid and γ-aminobutyric acid-B2 receptor signalling, among novel mechanisms that regulate pubertal timing in humans. Our findings suggest a genetic architecture involving at least hundreds of common variants in the coordinated timing of the pubertal transition.
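    The "robust evidence (P < 5 × 10⁻⁸)" criterion is the conventional genome-wide significance threshold, approximately a Bonferroni correction of α = 0.05 for about one million independent common-variant tests. A minimal sketch of applying it follows; the (variant_id, p_value) input format and example values are assumptions for illustration.

```python
# Minimal sketch of filtering association results at genome-wide significance.
# 5e-8 ~ 0.05 / 1e6 independent tests; the (variant_id, p_value) input format
# and the example values below are assumptions for illustration.
GENOME_WIDE_ALPHA = 0.05 / 1_000_000   # = 5e-8

def genome_wide_hits(results):
    """Keep variants whose p-value passes the conventional threshold."""
    return [(snp, p) for snp, p in results if p < GENOME_WIDE_ALPHA]

example = [("snp_a", 2.1e-15), ("snp_b", 3.0e-6)]
print(genome_wide_hits(example))   # -> [('snp_a', 2.1e-15)]
```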

    Can artificial intelligence accelerate the diagnosis of inherited retinal diseases? Protocol for a data-only retrospective cohort study (Eye2Gene)

    INTRODUCTION: Inherited retinal diseases (IRDs) are a leading cause of visual impairment and blindness in the working-age population. Mutations in over 300 genes have been found to be associated with IRDs, and identifying the affected gene in patients by molecular genetic testing is the first step towards effective care and patient management. However, genetic diagnosis is currently slow, expensive and not widely accessible. The aim of the current project is to address the evidence gap in IRD diagnosis with an AI algorithm, Eye2Gene, to accelerate and democratise the IRD diagnosis service. METHODS AND ANALYSIS: The data-only retrospective cohort study involves a target sample size of 10 000 participants, which has been derived based on the number of participants with IRD at three leading UK eye hospitals: Moorfields Eye Hospital (MEH), Oxford University Hospital (OUH) and Liverpool University Hospital (LUH), as well as a Japanese hospital, the Tokyo Medical Centre (TMC). Eye2Gene aims to predict causative genes from retinal images of patients with a diagnosis of IRD. For this purpose, the 36 most common causative IRD genes have been selected for the training dataset, so that the software has enough examples for training and validation of the detection of each gene. The Eye2Gene algorithm is composed of multiple deep convolutional neural networks, which will be trained on MEH IRD datasets and externally validated on data from OUH, LUH and TMC. ETHICS AND DISSEMINATION: This research was approved by the IRB and the UK Health Research Authority (Research Ethics Committee reference 22/WA/0049, 'Eye2Gene: accelerating the diagnosis of IRDs', Integrated Research Application System (IRAS) project ID 242050). All research adhered to the tenets of the Declaration of Helsinki. Findings will be reported in an open-access format.
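    The abstract describes Eye2Gene as multiple deep convolutional neural networks predicting one of 36 causative genes from retinal images, without giving architectural details. The PyTorch sketch below is a generic stand-in for that pattern (small ResNet backbones with softmax averaging across an ensemble), not the actual Eye2Gene architecture.

```python
# Generic sketch of a multi-CNN ensemble for 36-way gene prediction from images.
# ResNet-18 backbones, ensemble size 3, and softmax averaging are assumptions
# for illustration; the real Eye2Gene architecture is not given in the abstract.
import torch
import torch.nn as nn
from torchvision.models import resnet18

NUM_GENES = 36  # the 36 most common causative IRD genes selected for training

def make_model():
    model = resnet18(weights=None)                        # untrained backbone
    model.fc = nn.Linear(model.fc.in_features, NUM_GENES) # 36-class head
    return model

ensemble = [make_model().eval() for _ in range(3)]        # "multiple deep CNNs"

@torch.no_grad()
def predict_gene(image_batch):
    """Average softmax probabilities across ensemble members, return argmax."""
    probs = torch.stack([m(image_batch).softmax(dim=1) for m in ensemble]).mean(0)
    return probs.argmax(dim=1)                            # index into the 36-gene list

dummy = torch.randn(2, 3, 224, 224)                       # batch of 2 RGB retinal images
print(predict_gene(dummy).shape)                          # -> torch.Size([2])
```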