
    Gene content evolution in the arthropods

    Arthropods comprise the largest and most diverse phylum on Earth and play vital roles in nearly every ecosystem. Their diversity stems in part from variations on a conserved body plan, resulting from, and recorded in, adaptive changes in the genome. Dissecting this genomic record of sequence change makes it possible to address broad questions about genome evolution, even across hyper-diverse taxa within arthropods. Using 76 whole genome sequences representing 21 orders spanning more than 500 million years of arthropod evolution, we document changes in gene and protein domain content and provide temporal and phylogenetic context for interpreting these innovations. We identify many novel gene families that arose early in the evolution of arthropods and during the diversification of insects into modern orders. We reveal unexpected variation in patterns of DNA methylation across arthropods and examples of gene family and protein domain evolution coincident with the appearance of notable phenotypic and physiological adaptations such as flight, metamorphosis, sociality, and chemoperception. These analyses demonstrate how large-scale comparative genomics can provide broad new insights into the genotype-to-phenotype map and generate testable hypotheses about the evolution of animal diversity.
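
    The gene-family analyses summarised above rest on presence/absence patterns mapped onto a species tree. As a minimal sketch of the idea (not the authors' actual pipeline; the tree and family names are toy examples), one can place a family's origin at the most recent common ancestor of the species that carry it:

```python
# Illustrative sketch: infer where a gene family arose by placing it at the
# most recent common ancestor (MRCA) of the species that contain it.
# The tree, species, and families below are toy examples.

# Toy rooted tree as child -> parent links.
PARENT = {
    "Drosophila": "Insecta", "Apis": "Insecta",
    "Insecta": "Arthropoda", "Daphnia": "Arthropoda",
}

def ancestors(node):
    """Path from a node up to the root, inclusive."""
    path = [node]
    while node in PARENT:
        node = PARENT[node]
        path.append(node)
    return path

def mrca(species):
    """MRCA of a set of tips: first shared node walking root-ward."""
    paths = [ancestors(s) for s in species]
    common = set(paths[0]).intersection(*map(set, paths[1:]))
    return next(n for n in paths[0] if n in common)

# Presence/absence of two hypothetical gene families across tips.
families = {
    "famA": {"Drosophila", "Apis"},     # insect-specific -> Insecta
    "famB": {"Drosophila", "Daphnia"},  # spans the root -> Arthropoda
}
for fam, tips in families.items():
    print(fam, "inferred origin:", mrca(tips))
```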

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with the shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, compared with the standard frequency groups, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, with all listed symptoms more frequent among men), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each). Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
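
    As a quick arithmetic check of the paired unit/mL figures quoted above, assuming one UK whole-blood unit is roughly 470 mL (an assumption, though it is the factor the abstract's own numbers imply):

```python
# Verify the approximate mL equivalents reported alongside the unit gains.
ML_PER_UNIT = 470  # assumed mL per UK whole-blood unit

extra_units = {
    "men, 8-week vs 12-week":    1.69,
    "men, 10-week vs 12-week":   0.79,
    "women, 12-week vs 16-week": 0.84,
    "women, 14-week vs 16-week": 0.46,
}
for group, units in extra_units.items():
    print(f"{group}: {units:.2f} units ~ {units * ML_PER_UNIT:.0f} mL")
# Prints ~794, 371, 395, and 216 mL, matching the ~795/370/395/215 mL quoted.
```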

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [−0·59 to −0·31] in women), and lower ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [−6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals), other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
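
    To make the per-week effect sizes above concrete, the sketch below scales them to a 4-week shortening of the interval (e.g., men moving from the 12-week to the 8-week arm). The 4-week choice is ours, and the calculation assumes odds ratios compound multiplicatively and haemoglobin differences add linearly across weeks, as a log-linear model would imply:

```python
# Scale the reported per-week effects (men) to a 4-week shorter interval.
weeks_shorter = 4
or_per_week = 1.19   # low-haemoglobin deferral odds ratio per week shorter
hb_per_week = -0.84  # g/L haemoglobin difference per week shorter

print(f"Deferral odds ratio over {weeks_shorter} weeks: "
      f"{or_per_week ** weeks_shorter:.2f}")                    # ~2.00
print(f"Haemoglobin difference: {hb_per_week * weeks_shorter:.2f} g/L")  # -3.36
```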

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how the severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults aged over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (n=483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium [OR 0.7 (0.3–1.9) in CFS 4 compared with 0.2 (0.1–0.7) in CFS 8]. Both associations were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
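
    The sketch below converts the odds ratios above into illustrative delirium probabilities. The 10% baseline prevalence assumed for CFS 1–3 is invented purely for illustration; the abstract reports only the overall 16.3% figure:

```python
# Convert an odds ratio to a probability via the odds scale.
def prob_from_or(baseline_prob, odds_ratio):
    """Apply an odds ratio to a baseline probability on the odds scale."""
    odds = baseline_prob / (1 - baseline_prob) * odds_ratio
    return odds / (1 + odds)

baseline = 0.10  # assumed delirium prevalence in CFS 1-3 (illustrative)
for cfs, or_ in [("CFS 4", 2.9), ("CFS 8", 12.4)]:
    print(f"{cfs}: delirium probability ~ {prob_from_or(baseline, or_):.0%}")
# CFS 4 -> ~24%, CFS 8 -> ~58% under the assumed 10% baseline.
```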

    Sex Identification of Mytilus edulis following 26-day Exposure to EE2 and KZ

    Blue mussels (M. edulis) were exposed for 26 days to 50 ng/L 17α-ethinylestradiol (EE2) or 30 µg/L ketoconazole (KZ) in four replicate tanks. Prior to exposure, mussel sex was identified using the VERL-VCL RT-qPCR assay on hemolymph collected from each mussel. Following exposure, mantle tissue was collected from each treated mussel and the VERL-VCL RT-qPCR assay was used again to identify the sex of the mussels.
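
    As a hypothetical illustration of how a two-marker RT-qPCR assay like VERL-VCL can yield a sex call, one can compare cycle-threshold (Ct) values for the female-associated (VERL) and male-associated (VCL) markers. The delta-Ct cutoff and example values below are invented; the actual assay's decision rule is defined in the source study:

```python
# Hypothetical sex call from two qPCR cycle thresholds (Ct).
def call_sex(ct_verl, ct_vcl, cutoff=3.0):
    """Lower Ct means more transcript; call the sex whose marker dominates."""
    delta = ct_vcl - ct_verl  # positive -> VERL more abundant -> female
    if delta >= cutoff:
        return "female"
    if delta <= -cutoff:
        return "male"
    return "indeterminate"

print(call_sex(ct_verl=22.1, ct_vcl=33.8))  # female
print(call_sex(ct_verl=34.0, ct_vcl=23.5))  # male
```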

    Toxicogenomic responses of low level anticancer drug exposures in Daphnia magna

    The use of anticancer drugs in chemotherapy is increasing, leading to growing environmental concentrations of imatinib mesylate (IMA), cisplatin (CDDP), and etoposide (ETP) in aquatic systems. Previous studies have shown that these anticancer drugs cause DNA damage in the crustacean Daphnia magna at low, environmentally relevant concentrations. To explore the mechanism of action of these compounds and the downstream effects of DNA damage on D. magna growth and development at a sensitive life stage, we exposed neonates to low-level concentrations equivalent to those that elicit DNA damage (IMA: 2000 ng/L, ETP: 300 ng/L, CDDP: 10 ng/L) and performed transcriptomic analysis using an RNA-seq approach. RNA sequencing generated 14 million reads per sample, which were aligned to the D. magna genome and assembled, producing approximately 23,000 transcripts per sample. Over 90% of the transcripts showed homology to proteins in GenBank, indicating a high-quality transcriptome assembly, although the rate of functional annotation was much lower. RT-qPCR was used to identify robust biomarkers and confirmed the downregulation of an angiotensin-converting-enzyme-like gene (once) involved in neuropeptide regulation across all three anticancer drugs, and the downregulation of DNA topoisomerase II by ETP. RNA-seq analysis also allowed an in-depth exploration of differential splicing of transcripts, revealing that regulation of different gene isoforms predicts potential impacts on translation and protein expression, providing a more meaningful assessment of transcriptomic data. Enrichment analysis and investigation of affected biological processes suggested that the DNA damage caused by ETP and IMA influences cell cycle regulation and GPCR signaling. This dysregulation is likely responsible for effects on neurological system processes and development, and on overall growth and development. Our transcriptomic approach provided insight into the mechanisms that respond to DNA damage caused by anticancer drug exposure and generated novel hypotheses on how these chemicals may impact the growth and survival of this ecologically important zooplankton species.
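
    As a generic illustration of the core differential-expression step in an RNA-seq comparison (not the authors' actual pipeline), the sketch below computes log2 fold changes from normalised counts and flags transcripts passing an effect-size cutoff. The counts and transcript names are fabricated; real analyses add replicate-aware statistics and multiple-testing correction:

```python
import math

# transcript: (mean normalised count in control, in exposed) -- hypothetical
norm_counts = {
    "DNA-topoisomerase-II": (480.0, 95.0),
    "ACE-like":             (210.0, 60.0),
    "actin":                (1500.0, 1430.0),
}
for tx, (ctrl, expo) in norm_counts.items():
    log2fc = math.log2((expo + 1) / (ctrl + 1))  # +1 pseudocount
    flag = "DE candidate" if abs(log2fc) >= 1 else "-"
    print(f"{tx}: log2FC = {log2fc:+.2f}  {flag}")
```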

    EFFECTS FROM FILTRATION, CAPPING AGENTS, AND PRESENCE/ABSENCE OF FOOD ON THE TOXICITY OF SILVER NANOPARTICLES TO DAPHNIA MAGNA

    Relatively little is known about the behavior and toxicity of nanoparticles in the environment. Objectives of the work presented here include establishing the toxicity of a variety of silver nanoparticles (AgNPs) to Daphnia magna neonates, assessing the applicability of a commonly used bioassay for testing AgNPs, and determining the advantages and disadvantages of multiple characterization techniques for AgNPs in simple aquatic systems. Daphnia magna were exposed to a silver nitrate solution and to AgNP suspensions, including commercially available AgNPs (uncoated and coated) and laboratory-synthesized AgNPs (coated with coffee or citrate). The nanoparticle suspensions were analyzed for silver concentration (microwave acid digestion), size (dynamic light scattering and electron microscopy), shape (electron microscopy), surface charge (zeta potentiometry), and chemical speciation (X-ray absorption spectroscopy, X-ray diffraction). Toxicities of filtered (100 nm) versus unfiltered suspensions were compared, and effects from the addition of food were examined. Stock suspensions were prepared by adding AgNPs to moderately hard reconstituted water; these were then diluted and used straight or after filtration with 100-nm filters. All nanoparticle exposure suspensions, at every time interval, were digested via microwave digester and analyzed by inductively coupled argon plasma–optical emission spectroscopy or graphite furnace–atomic absorption spectroscopy. Dose–response curves were generated and median lethal concentration (LC50) values calculated. The LC50 values for the unfiltered particles were (in μg/L): AgNO3, 1.1±0.1; coffee-coated, 1.0±0.1; citrate-coated, 1.1±0.2; Sigma-Aldrich (SA) uncoated, 16.7±2.4; SA coated, 31.5±8.1. LC50 values for the filtered particles were (in μg/L): AgNO3, 0.7±0.1; SA uncoated, 1.4±0.1; SA coated, 4.4±1.4. With food added, the LC50 for the SA coated particles was 176.4±25.5 μg/L. Recommendations presented in this study cover AgNP handling methods, effects from sample preparation, and advantages/disadvantages of different nanoparticle characterization techniques.
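
    As a sketch of how an LC50 like those above can be estimated, the example below fits a two-parameter log-logistic dose-response model. The doses and mortality fractions are fabricated for illustration; the study's LC50 values are the ones quoted above:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, lc50, slope):
    """Fraction dead as a function of dose (two-parameter log-logistic)."""
    return 1.0 / (1.0 + (lc50 / dose) ** slope)

doses = np.array([0.1, 0.3, 1.0, 3.0, 10.0])          # hypothetical, ug/L
mortality = np.array([0.02, 0.15, 0.50, 0.88, 0.99])  # fraction dead

(lc50, slope), _ = curve_fit(log_logistic, doses, mortality, p0=[1.0, 2.0])
print(f"LC50 ~ {lc50:.2f} ug/L, slope ~ {slope:.2f}")
```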

    Gene Expression Profiling in Daphnia magna
