
    A new monotypic family for the enigmatic crustose red alga Plagiospora gracilis.

    Plagiospora gracilis, a mucilaginous crustose red alga growing on subtidal pebbles on both coasts of the North Atlantic Ocean, forms distinctive tetrasporangia (red algal meiotic structures that release haploid tetraspores), but gametophytes have never been reported. In the absence of gametangia, the taxonomic position of this monotypic genus has always been uncertain; it is currently placed provisionally in the Gloiosiphoniaceae (Gigartinales) by comparison with sporophytes of Gloiosiphonia obtained in culture. Dioecious gametophytic crusts of P. gracilis are now reported for the first time, forming gametangia in inconspicuous superficial sori. There is no evidence that fertilization ever occurs in the field, although fertile males and females were collected together. In culture, tetraspores grew into tetrasporophytes for three successive generations, by presumed apomictic sporophyte recycling. The life history of P. gracilis may represent a late stage in the loss of sexual reproduction leading to tetraspore-to-tetrasporophyte life histories such as that in Hildenbrandia. Phylogenetic analysis of sequences of the rbcL, LSU (28S) rDNA and coxI (COI-5P) genes for P. gracilis with other Gigartinales resolved P. gracilis as a distinct lineage in a well-supported clade of the families Sphaerococcaceae, Gloiosiphoniaceae, Endocladiaceae, Nizymeniaceae and Phacelocarpaceae. We here propose the monotypic Plagiosporaceae fam. nov. to accommodate P. gracilis.

    Evaluating parents’ decisions about next-generation sequencing for their child in the NC NEXUS (North Carolina Newborn Exome Sequencing for Universal Screening) study: a randomized controlled trial protocol

    Background: Using next-generation sequencing (NGS) in newborn screening (NBS) could expand the number of genetic conditions detected pre-symptomatically, simultaneously challenging current precedents, raising ethical concerns, and extending the role of parental decision-making in NBS. The NC NEXUS (Newborn Exome Sequencing for Universal Screening) study seeks to assess the technical possibilities and limitations of NGS-NBS, devise and evaluate a framework to convey various types of genetic information, and develop best practices for incorporating NGS-NBS into clinical care. The study is enrolling both a healthy cohort and a cohort diagnosed with known disorders identified through recent routine NBS. It uses a novel age-based metric to categorize a priori the large amount of data generated by NGS-NBS, and interactive online decision aids to guide parental decision-making. Primary outcomes include: (1) assessment of NGS-NBS sensitivity, (2) decision regret, and (3) parental decision-making about NGS-NBS and, for parents randomized to have the option of requesting them, additional findings (diagnosed and healthy cohorts). Secondary outcomes assess parents’ reactions to the study and to decision-making. Methods/design: Participants are parents and children in a well-child cohort recruited from a prenatal clinic and a diagnosed cohort recruited from pediatric clinics that treat children with disorders diagnosed through traditional NBS (goal of 200 children in each cohort). In phase 1, all parent participants use an online decision aid to decide whether to accept NGS-NBS for their child and provide consent for NGS-NBS. In phase 2, parents who consent to NGS-NBS are randomized to a decision arm or control arm (2:1 allocation) and learn their child’s NGS-NBS results, which include conditions from standard (non-NGS) NBS plus other highly actionable childhood-onset conditions.
Parents in the decision arm use a second decision aid to make decisions about additional results from their child’s sequencing. In phase 3, decision arm participants learn the additional results they have requested. Online questionnaires are administered at up to five time points. Discussion: NC NEXUS will use a rigorous interdisciplinary approach designed to collect rich data to inform policy, practice, and future research. Trial registration: clinicaltrials.gov, NCT02826694. Registered on 11 July 2016.

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). 
In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
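The 1:1:1 allocations described in this abstract are commonly implemented with permuted-block randomisation; the sketch below is purely illustrative (the trial's actual computer-based algorithm is not described here), and it also checks the unit-to-volume arithmetic under the assumption that a UK whole-blood donation is roughly 470 mL.

```python
import random
from collections import Counter

def block_randomize(participants, arms=("12-week", "10-week", "8-week"), seed=0):
    """Illustrative 1:1:1 permuted-block allocation (hypothetical, not the
    trial's actual algorithm): refill and shuffle a block of arms, then
    assign one arm per participant until the block is exhausted."""
    rng = random.Random(seed)
    assignments = {}
    block = []
    for p in participants:
        if not block:
            block = list(arms)
            rng.shuffle(block)
        assignments[p] = block.pop()
    return assignments

# Permuted blocks keep arm sizes balanced: 9 participants -> 3 per arm.
counts = Counter(block_randomize([f"donor{i}" for i in range(9)]).values())

# Volume check, assuming ~470 mL per UK whole-blood unit:
# 1.69 extra units over 2 years is about 795 mL, as the abstract reports.
ML_PER_UNIT = 470
extra_ml = 1.69 * ML_PER_UNIT
```

With any whole multiple of three participants, each arm receives exactly a third of them, which is why blocked designs are preferred over simple coin-flip allocation when group balance matters.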

    Local host-dependent persistence of the entomopathogenic nematode Steinernema carpocapsae used to control the large pine weevil Hylobius abietis

    Entomopathogenic nematodes (EPN) applied inundatively to suppress insect pests are more likely to persist and establish in stable agroecosystems than in annual crops. We investigated a system of intermediate stability: tree stumps harbouring the large pine weevil (Hylobius abietis L.; Coleoptera: Curculionidae), a major European forestry pest. We tested whether persistence of the EPN Steinernema carpocapsae Weiser (Rhabditida: Steinernematidae) applied around stumps is maintained by recycling of EPN through pine weevils developing within stumps. Steinernema carpocapsae was detected in soil around and under the bark of treated tree stumps for up to two years, but not 4–5 years, after application. Differences in nematode presence between sites were better explained by tree species (pine or spruce) than by soil type (mineral or peat). Presence of S. carpocapsae in soil was positively correlated with the number of H. abietis emerging from untreated stumps the previous year, which was greater for pine stumps than for spruce stumps.

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [52%] men, 9914 [48%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women.
During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), and lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [–0·59 to −0·31] in women) and ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [–6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals) other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.

    Comprehensive Pan-Genomic Characterization of Adrenocortical Carcinoma

    Summary: We describe a comprehensive genomic characterization of adrenocortical carcinoma (ACC). Using this dataset, we expand the catalogue of known ACC driver genes to include PRKAR1A, RPL22, TERF2, CCNE1, and NF1. Genome-wide DNA copy-number analysis revealed frequent occurrence of massive DNA loss followed by whole-genome doubling (WGD), which was associated with an aggressive clinical course, suggesting that WGD is a hallmark of disease progression. Corroborating this hypothesis were increased TERT expression, decreased telomere length, and activation of cell-cycle programs. Integrated subtype analysis identified three ACC subtypes with distinct clinical outcomes and molecular alterations that could be captured by a 68-CpG probe DNA-methylation signature, suggesting a strategy for clinical stratification of patients based on molecular markers.

    TRY plant trait database – enhanced coverage and open access

    Plant traits - the morphological, anatomical, physiological, biochemical and phenological characteristics of plants - determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research, spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait‐based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits - almost complete coverage for ‘plant growth form’. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait–environment relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.