83 research outputs found

    Early Detection of t(8;21) Chromosomal Translocations During Treatment of PML-RARA Positive Acute Promyelocytic Leukemia: A Case Study

    Here we describe a female patient who developed acute promyelocytic leukemia (APL) characterized by the t(15;17) translocation at diagnosis. The patient began treatment with all-trans retinoic acid (ATRA) plus chemotherapy. During follow-up, the patient was found to be negative for the t(15;17) transcript after 3 months of therapy, and it remained undetectable thereafter. However, the emergence of a small clone with a t(8;21) abnormality was observed in bone marrow and peripheral blood (PB) cells between 3 and 18 months following treatment initiation. The abnormal translocation in PB cells obtained at 3 months was detected after the second cycle of consolidation therapy and reappeared at 15 months during maintenance treatment, a period without ATRA. Although based on a single case, we conclude that genetic screening for multiple translocations in AML patients should be requested to allow early identification of other emerging clones during therapy that may manifest clinically following treatment.

    A cross-institutional analysis of the effects of broadening trainee professional development on research productivity

    PhD-trained scientists are essential contributors to the workforce in diverse employment sectors that include academia, industry, government, and nonprofit organizations. Hence, best practices for training the future biomedical workforce are of national concern. Complementing coursework and laboratory research training, many institutions now offer professional training that enables career exploration and develops a broad set of skills critical to various career paths. The National Institutes of Health (NIH) funded academic institutions to design innovative programming to enable this professional development through a mechanism known as Broadening Experiences in Scientific Training (BEST). Programming at the NIH BEST awardee institutions included career panels, skill-building workshops, job search workshops, site visits, and internships. Because doctoral training is lengthy and requires focused attention on dissertation research, an initial concern was that students participating in additional complementary training activities might exhibit an increased time to degree or diminished research productivity. To address this concern, metrics from 10 NIH BEST awardee institutions were analyzed, using time to degree and publication records as measures of efficiency and productivity. When doctoral students who participated were compared with those who did not, results revealed no differences in time to degree or manuscript output across these diverse academic institutions. Our findings support the policy that doctoral students should participate in career and professional development opportunities intended to prepare them for a variety of important careers in the workforce.

    Mapping geographical inequalities in childhood diarrhoeal morbidity and mortality in low-income and middle-income countries, 2000–17 : analysis for the Global Burden of Disease Study 2017

    Background Across low-income and middle-income countries (LMICs), one in ten deaths in children younger than 5 years is attributable to diarrhoea. The substantial between-country variation in both diarrhoea incidence and mortality is attributable to interventions that protect children, prevent infection, and treat disease. Identifying subnational regions with the highest burden and mapping associated risk factors can aid in reducing preventable childhood diarrhoea. Methods We used Bayesian model-based geostatistics and a geolocated dataset comprising 15 072 746 children younger than 5 years from 466 surveys in 94 LMICs, in combination with findings of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017, to estimate posterior distributions of diarrhoea prevalence, incidence, and mortality from 2000 to 2017. From these data, we estimated the burden of diarrhoea at varying subnational levels (termed units) by spatially aggregating draws, and we investigated the drivers of subnational patterns by creating aggregated risk factor estimates. Findings The greatest declines in diarrhoeal mortality were seen in south and southeast Asia and South America, where 54·0% (95% uncertainty interval [UI] 38·1–65·8), 17·4% (7·7–28·4), and 59·5% (34·2–86·9) of units, respectively, recorded decreases in deaths from diarrhoea greater than 10%. Although children in much of Africa remain at high risk of death due to diarrhoea, regions with the most deaths were outside Africa, with the highest mortality units located in Pakistan. Indonesia showed the greatest within-country geographical inequality; some regions had mortality rates nearly four times the average country rate. Reductions in mortality were correlated with improvements in water, sanitation, and hygiene (WASH) or reductions in child growth failure (CGF). Similarly, most high-risk areas had poor WASH, high CGF, or low oral rehydration therapy coverage. Interpretation By co-analysing geospatial trends in diarrhoeal burden and its key risk factors, we could assess candidate drivers of subnational death reduction. Further, by doing a counterfactual analysis of the remaining disease burden using key risk factors, we identified potential intervention strategies for vulnerable populations. In view of the demands for limited resources in LMICs, accurately quantifying the burden of diarrhoea and its drivers is important for precision public health.
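    A rough sketch of the "spatially aggregating draws" step mentioned in the Methods may make it concrete. This is not the GBD code: the grid-cell prevalence draws, cell populations, and cell-to-unit mapping below are simulated placeholders, and unit-level estimates are simply population-weighted means of cell-level posterior draws, with uncertainty intervals read off the aggregated draws.

```python
import numpy as np

# Hypothetical inputs (not the GBD pipeline): posterior prevalence draws per grid cell,
# under-5 population per cell, and the subnational unit each cell belongs to.
rng = np.random.default_rng(0)
n_cells, n_draws, n_units = 1000, 250, 8
cell_draws = rng.beta(2, 30, size=(n_cells, n_draws))   # prevalence draws per cell
cell_pop = rng.integers(100, 5000, size=n_cells)        # children <5 years per cell
cell_unit = rng.integers(0, n_units, size=n_cells)      # unit ID per cell

def aggregate_to_units(cell_draws, cell_pop, cell_unit, n_units):
    """Population-weighted aggregation of grid-cell draws to subnational units,
    done draw-by-draw so posterior uncertainty is preserved."""
    unit_draws = np.zeros((n_units, cell_draws.shape[1]))
    for u in range(n_units):
        mask = cell_unit == u
        weights = cell_pop[mask] / cell_pop[mask].sum()
        unit_draws[u] = weights @ cell_draws[mask]
    return unit_draws

unit_draws = aggregate_to_units(cell_draws, cell_pop, cell_unit, n_units)
means = unit_draws.mean(axis=1)
lo, hi = np.percentile(unit_draws, [2.5, 97.5], axis=1)  # 95% uncertainty interval
for u in range(n_units):
    print(f"unit {u}: prevalence {means[u]:.3f} (95% UI {lo[u]:.3f}-{hi[u]:.3f})")
```

    Aggregating each posterior draw separately, rather than aggregating point estimates, is what allows subnational uncertainty intervals like those quoted in the Findings to be propagated from the grid-cell level.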

    Risk profiles and one-year outcomes of patients with newly diagnosed atrial fibrillation in India: Insights from the GARFIELD-AF Registry.

    BACKGROUND: The Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF) is an ongoing prospective noninterventional registry, which is providing important information on the baseline characteristics, treatment patterns, and 1-year outcomes in patients with newly diagnosed non-valvular atrial fibrillation (NVAF). This report describes data from Indian patients recruited in this registry. METHODS AND RESULTS: A total of 52,014 patients with newly diagnosed AF were enrolled globally; of these, 1388 patients were recruited from 26 sites within India (2012-2016). In India, the mean age was 65.8 years at diagnosis of NVAF. Hypertension was the most prevalent risk factor for AF, present in 68.5% of patients from India and in 76.3% of patients globally (P < 0.001). Diabetes and coronary artery disease (CAD) were prevalent in 36.2% and 28.1% of patients as compared with global prevalence of 22.2% and 21.6%, respectively (P < 0.001 for both). Antiplatelet therapy was the most common antithrombotic treatment in India. With increasing stroke risk, however, patients were more likely to receive oral anticoagulant therapy [mainly vitamin K antagonist (VKA)], but average international normalized ratio (INR) was lower among Indian patients [median INR value 1.6 (interquartile range {IQR}: 1.3-2.3) versus 2.3 (IQR 1.8-2.8) (P < 0.001)]. Compared with other countries, patients from India had markedly higher rates of all-cause mortality [7.68 per 100 person-years (95% confidence interval 6.32-9.35) vs 4.34 (4.16-4.53), P < 0.0001], while rates of stroke/systemic embolism and major bleeding were lower after 1 year of follow-up. CONCLUSION: Compared to previously published registries from India, the GARFIELD-AF registry describes clinical profiles and outcomes in Indian patients with AF of a different etiology. The registry data show that compared to the rest of the world, Indian AF patients are younger in age and have more diabetes and CAD. Patients with a higher stroke risk are more likely to receive anticoagulation therapy with VKA but are underdosed compared with the global average in the GARFIELD-AF. CLINICAL TRIAL REGISTRATION-URL: http://www.clinicaltrials.gov. Unique identifier: NCT01090362

    Public health utility of cause of death data: applying empirical algorithms to improve data quality

    Background: Accurate, comprehensive, cause-specific mortality estimates are crucial for informing public health decision making worldwide. Incorrectly or vaguely assigned deaths, defined as garbage-coded deaths, mask the true cause distribution. The Global Burden of Disease (GBD) study has developed methods to create comparable, timely, cause-specific mortality estimates; an impactful data processing method is the reallocation of garbage-coded deaths to a plausible underlying cause of death. We identify the pattern of garbage-coded deaths across the world and present the methods used to determine their redistribution to generate more plausible cause of death assignments. Methods: We describe the methods developed for the GBD 2019 study and subsequent iterations to redistribute garbage-coded deaths in vital registration data to plausible underlying causes. These methods include analysis of multiple cause data, negative correlation, impairment, and proportional redistribution. We classify garbage codes into classes according to the level of specificity of the reported cause of death (CoD) and capture trends in the global pattern of the proportion of garbage-coded deaths, disaggregated by these classes, and the relationship between this proportion and the Socio-Demographic Index. We examine the relative importance of the top four garbage codes by age and sex and demonstrate the impact of redistribution on the annual GBD CoD rankings. Results: The proportion of least-specific (class 1 and 2) garbage-coded deaths ranged from 3.7% to 67.3% of all vital registration deaths in 2015, and the age-standardized proportion had an overall negative association with the Socio-Demographic Index. When broken down by age and sex, the category for unspecified lower respiratory infections was responsible for nearly 30% of garbage-coded deaths in those under 1 year of age for both sexes, representing the largest proportion of garbage codes for that age group. We show how the cause distribution by number of deaths changes before and after redistribution for four countries: Brazil, the United States, Japan, and France, highlighting the necessity of accounting for garbage-coded deaths in the GBD. Conclusions: We provide a detailed description of redistribution methods developed for CoD data in the GBD; these methods represent an overall improvement in empiricism compared to past reliance on a priori knowledge.
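    To illustrate the proportional redistribution approach named in the Methods (only one of the several approaches listed, and a simplified sketch rather than the GBD implementation), the snippet below reallocates deaths assigned to a garbage code across a set of plausible target causes in proportion to the deaths already coded to those causes within a single age-sex-location-year stratum. The cause names and counts are hypothetical.

```python
def redistribute_proportionally(cause_deaths, garbage_deaths, target_causes):
    """Reallocate garbage-coded deaths across target causes in proportion to the
    deaths already coded to those causes within one age-sex-location-year stratum."""
    total = sum(cause_deaths[c] for c in target_causes)
    redistributed = dict(cause_deaths)
    for c in target_causes:
        redistributed[c] += garbage_deaths * cause_deaths[c] / total
    return redistributed

# Hypothetical stratum: deaths by specific cause, plus 50 deaths assigned to an
# unspecified ("garbage") cardiovascular code that must be redistributed.
cause_deaths = {
    "ischaemic heart disease": 300,
    "stroke": 150,
    "hypertensive heart disease": 50,
}
result = redistribute_proportionally(cause_deaths, garbage_deaths=50,
                                     target_causes=list(cause_deaths))
print(result)
# {'ischaemic heart disease': 330.0, 'stroke': 165.0, 'hypertensive heart disease': 55.0}
```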

    Uncovering the heterogeneity and temporal complexity of neurodegenerative diseases with Subtype and Stage Inference

    The heterogeneity of neurodegenerative diseases is a key confound to disease understanding and treatment development, as study cohorts typically include multiple phenotypes on distinct disease trajectories. Here we introduce a machine-learning technique, Subtype and Stage Inference (SuStaIn), able to uncover data-driven disease phenotypes with distinct temporal progression patterns from widely available cross-sectional patient studies. Results from imaging studies in two neurodegenerative diseases reveal subgroups and their distinct trajectories of regional neurodegeneration. In genetic frontotemporal dementia, SuStaIn identifies genotypes from imaging alone, validating its ability to identify subtypes; further, the technique reveals within-genotype heterogeneity. In Alzheimer's disease, SuStaIn uncovers three subtypes, uniquely characterising their temporal complexity. SuStaIn provides fine-grained patient stratification, which substantially enhances the ability to predict conversion between diagnostic categories over standard models that ignore subtype (p = 7.18 × 10⁻⁴) or temporal stage (p = 3.96 × 10⁻⁵). SuStaIn offers new promise for enabling disease subtype discovery and precision medicine.
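    As a loose illustration of the subtype-and-stage idea (not the SuStaIn fitting algorithm itself, which learns the progression patterns from data), the sketch below assigns a subject to the (subtype, stage) pair that best explains their cross-sectional biomarker z-scores, given hypothetical per-subtype stage templates and an assumed Gaussian noise model.

```python
import numpy as np

# Hypothetical trajectories: expected biomarker z-scores for each subtype at each stage.
# Shape: (n_subtypes, n_stages, n_biomarkers). In SuStaIn these would be inferred from
# the cohort; here they are made up purely for illustration.
trajectories = np.array([
    [[0, 0, 0], [1, 0, 0], [2, 1, 0], [2, 2, 1]],   # subtype A: biomarker 1 changes first
    [[0, 0, 0], [0, 0, 1], [0, 1, 2], [1, 2, 2]],   # subtype B: biomarker 3 changes first
], dtype=float)

def assign_subtype_stage(z_scores, trajectories, noise_sd=0.5):
    """Return the (subtype, stage) pair maximising a Gaussian likelihood
    of the observed biomarker z-scores."""
    sq_err = ((trajectories - z_scores) ** 2).sum(axis=2)   # error to every template
    log_lik = -0.5 * sq_err / noise_sd**2                   # up to an additive constant
    subtype, stage = np.unravel_index(np.argmax(log_lik), log_lik.shape)
    return int(subtype), int(stage)

subject = np.array([0.1, 0.9, 2.1])   # hypothetical cross-sectional measurement
print(assign_subtype_stage(subject, trajectories))  # -> (1, 2): subtype B, stage 2
```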

    Long COVID and cardiovascular disease: a prospective cohort study

    Background Pre-existing cardiovascular disease (CVD) or cardiovascular risk factors have been associated with an increased risk of complications following hospitalisation with COVID-19, but their impact on the rate of recovery following discharge is not known. Objectives To determine whether the rate of patient-perceived recovery following hospitalisation with COVID-19 was affected by the presence of CVD or cardiovascular risk factors. Methods In a multicentre prospective cohort study, patients were recruited following discharge from hospital with COVID-19 and underwent two comprehensive assessments at 5 months and 12 months. Patients were stratified by the presence of either CVD or cardiovascular risk factors prior to hospitalisation with COVID-19 and compared with controls with neither. Full recovery was determined by the response to a patient-perceived evaluation of full recovery from COVID-19 in the context of physical, physiological and cognitive determinants of health. Results From a total population of 2545 patients (38.8% women), 472 (18.5%) and 1355 (53.2%) had CVD or cardiovascular risk factors, respectively. Compared with controls (n=718), patients with CVD and cardiovascular risk factors were older and more likely to have had severe COVID-19. The likelihood of full recovery at 12 months was significantly lower in patients with CVD (adjusted OR (aOR) 0.62, 95% CI 0.43 to 0.89) and cardiovascular risk factors (aOR 0.66, 95% CI 0.50 to 0.86). Conclusion Patients with CVD or cardiovascular risk factors had a delayed recovery at 12 months following hospitalisation with COVID-19. Targeted interventions to reduce the impact of COVID-19 in patients with cardiovascular disease remain an unmet need.

    Large-scale phenotyping of patients with long COVID post-hospitalization reveals mechanistic subtypes of disease

    One in ten severe acute respiratory syndrome coronavirus 2 infections results in prolonged symptoms termed long coronavirus disease (COVID), yet disease phenotypes and mechanisms are poorly understood [1]. Here we profiled 368 plasma proteins in 657 participants ≥3 months following hospitalization. Of these, 426 had at least one long COVID symptom and 233 had fully recovered. Elevated markers of myeloid inflammation and complement activation were associated with long COVID. IL-1R2, MATN2 and COLEC12 were associated with cardiorespiratory symptoms, fatigue and anxiety/depression; MATN2, CSF3 and C1QA were elevated in gastrointestinal symptoms, and C1QA was elevated in cognitive impairment. Additional markers of alterations in nerve tissue repair (SPON-1 and NFASC) were elevated in those with cognitive impairment, and SCG3, suggestive of brain–gut axis disturbance, was elevated in gastrointestinal symptoms. Severe acute respiratory syndrome coronavirus 2-specific immunoglobulin G (IgG) was persistently elevated in some individuals with long COVID, but virus was not detected in sputum. Analysis of inflammatory markers in nasal fluids showed no association with symptoms. Our study aimed to understand inflammatory processes that underlie long COVID and was not designed for biomarker discovery. Our findings suggest that specific inflammatory pathways related to tissue damage are implicated in subtypes of long COVID, which might be targeted in future therapeutic trials.

    SARS-CoV-2-specific nasal IgA wanes 9 months after hospitalisation with COVID-19 and is not induced by subsequent vaccination

    BACKGROUND: Most studies of immunity to SARS-CoV-2 focus on circulating antibody, giving limited insights into the mucosal defences that prevent viral replication and onward transmission. We studied nasal and plasma antibody responses one year after hospitalisation for COVID-19, including a period when SARS-CoV-2 vaccination was introduced. METHODS: In this follow-up study, plasma and nasosorption samples were prospectively collected from 446 adults hospitalised for COVID-19 between February 2020 and March 2021 via the ISARIC4C and PHOSP-COVID consortia. IgA and IgG responses to the nucleocapsid (NP) and spike (S) proteins of ancestral SARS-CoV-2 and the Delta and Omicron (BA.1) variants were measured by electrochemiluminescence and compared with plasma neutralisation data. FINDINGS: Strong and consistent nasal anti-NP and anti-S IgA responses were demonstrated, which remained elevated for nine months (p < 0.0001). Nasal and plasma anti-S IgG remained elevated for at least 12 months (p < 0.0001), with plasma neutralising titres that were raised against all variants compared to controls (p < 0.0001). Of 323 participants with complete data, 307 were vaccinated between 6 and 12 months, coinciding with rises in nasal and plasma IgA and IgG anti-S titres for all SARS-CoV-2 variants, although the change in nasal IgA was minimal (1.46-fold change after 10 months, p = 0.011) and the median remained below the positive threshold determined by pre-pandemic controls. Samples taken 12 months after admission showed no association between nasal IgA and plasma IgG anti-S responses (R = 0.05, p = 0.18), indicating that nasal IgA responses are distinct from those in plasma and minimally boosted by vaccination. INTERPRETATION: The decline in nasal IgA responses 9 months after infection and the minimal impact of subsequent vaccination may explain the lack of long-lasting nasal defence against reinfection and the limited effects of vaccination on transmission. These findings highlight the need to develop vaccines that enhance nasal immunity. FUNDING: This study has been supported by the ISARIC4C and PHOSP-COVID consortia. ISARIC4C is supported by grants from the National Institute for Health and Care Research and the Medical Research Council. The Liverpool Experimental Cancer Medicine Centre provided infrastructure support for this research. The PHOSP-COVID study is jointly funded by UK Research and Innovation and the National Institute for Health and Care Research. The funders were not involved in the study design, interpretation of data or the writing of this manuscript.

    Cognitive and psychiatric symptom trajectories 2–3 years after hospital admission for COVID-19: a longitudinal, prospective cohort study in the UK

    Background COVID-19 is known to be associated with increased risks of cognitive and psychiatric outcomes after the acute phase of disease. We aimed to assess whether these symptoms can emerge or persist more than 1 year after hospitalisation for COVID-19, to identify which early aspects of COVID-19 illness predict longer-term symptoms, and to establish how these symptoms relate to occupational functioning. Methods The Post-hospitalisation COVID-19 study (PHOSP-COVID) is a prospective, longitudinal cohort study of adults (aged ≥18 years) who were hospitalised with a clinical diagnosis of COVID-19 at participating National Health Service hospitals across the UK. In the C-Fog study, a subset of PHOSP-COVID participants who consented to be recontacted for other research were invited to complete a computerised cognitive assessment and clinical scales between 2 years and 3 years after hospital admission. Participants completed eight cognitive tasks, covering eight cognitive domains, from the Cognitron battery, in addition to the 9-item Patient Health Questionnaire for depression, the Generalised Anxiety Disorder 7-item scale, the Functional Assessment of Chronic Illness Therapy Fatigue Scale, and the 20-item Cognitive Change Index (CCI-20) questionnaire to assess subjective cognitive decline. We evaluated how the absolute risks of symptoms evolved between follow-ups at 6 months, 12 months, and 2–3 years, and whether symptoms at 2–3 years were predicted by earlier aspects of COVID-19 illness. Participants completed an occupation change questionnaire to establish whether their occupation or working status had changed and, if so, why. We assessed which symptoms at 2–3 years were associated with occupation change. People with lived experience were involved in the study. Findings 2469 PHOSP-COVID participants were invited to participate in the C-Fog study, and 475 participants (191 [40·2%] females and 284 [59·8%] males; mean age 58·26 [SD 11·13] years) who were discharged from one of 83 hospitals provided data at the 2–3-year follow-up. Participants had worse cognitive scores than would be expected on the basis of their sociodemographic characteristics across all cognitive domains tested (average score 0·71 SD below the mean [IQR 0·16–1·04]; p<0·0001). Most participants reported at least mild depression (263 [74·5%] of 353), anxiety (189 [53·5%] of 353), fatigue (220 [62·3%] of 353), or subjective cognitive decline (184 [52·1%] of 353), and more than a fifth reported severe depression (79 [22·4%] of 353), fatigue (87 [24·6%] of 353), or subjective cognitive decline (88 [24·9%] of 353). Depression, anxiety, and fatigue were worse at 2–3 years than at 6 months or 12 months, with evidence of both worsening of existing symptoms and emergence of new symptoms. Symptoms at 2–3 years were not predicted by the severity of acute COVID-19 illness, but were strongly predicted by the degree of recovery at 6 months (explaining 35·0–48·8% of the variance in anxiety, depression, fatigue, and subjective cognitive decline); by a biocognitive profile linking acutely raised D-dimer relative to C-reactive protein with subjective cognitive deficits at 6 months (explaining 7·0–17·2% of the variance in anxiety, depression, fatigue, and subjective cognitive decline); and by anxiety, depression, fatigue, and subjective cognitive deficit at 6 months. Objective cognitive deficits at 2–3 years were not predicted by any of the factors tested, except for cognitive deficits at 6 months, explaining 10·6% of their variance. 
95 of 353 participants (26·9% [95% CI 22·6–31·8]) reported occupational change, with poor health being the most common reason for this change. Occupation change was strongly and specifically associated with objective cognitive deficits (odds ratio [OR] 1·51 [95% CI 1·04–2·22] for every SD decrease in overall cognitive score) and subjective cognitive decline (OR 1·54 [1·21–1·98] for every point increase in CCI-20). Interpretation Psychiatric and cognitive symptoms appear to increase over the first 2–3 years post-hospitalisation due to both worsening of symptoms already present at 6 months and emergence of new symptoms. New symptoms occur mostly in people with other symptoms already present at 6 months. Early identification and management of symptoms might therefore be an effective strategy to prevent later onset of a complex syndrome. Occupation change is common and associated mainly with objective and subjective cognitive deficits. Interventions to promote cognitive recovery or to prevent cognitive decline are therefore needed to limit the functional and economic impacts of COVID-19. Funding National Institute for Health and Care Research Oxford Health Biomedical Research Centre, Wolfson Foundation, MQ Mental Health Research, MRC-UK Research and Innovation, and National Institute for Health and Care Research