
    Beyond Volume: The Impact of Complex Healthcare Data on the Machine Learning Pipeline

    From medical charts to the national census, healthcare has traditionally operated under a paper-based paradigm. However, the past decade has marked a long and arduous transformation bringing healthcare into the digital age. Ranging from electronic health records, to digitized imaging and laboratory reports, to public health datasets, healthcare now generates an enormous amount of digital information. Such a wealth of data presents an exciting opportunity for integrated machine learning solutions to address problems across multiple facets of healthcare practice and administration. Unfortunately, deriving accurate and informative insights requires more than the ability to execute machine learning models; a deeper understanding of the data on which the models are run is imperative for their success. While significant effort has been devoted to developing models able to process the volume of data obtained from millions of digitized patient records, volume represents only one aspect of the data. Drawing on an increasingly diverse set of sources, healthcare data presents an incredibly complex set of attributes that must be accounted for throughout the machine learning pipeline. This chapter highlights these challenges and is broken down into three distinct components, each representing a phase of the pipeline. We begin with attributes of the data accounted for during preprocessing, then move to considerations during model building, and end with challenges to the interpretation of model output. For each component, we discuss the data as it relates to the healthcare domain and offer insight into the challenges each may impose on the efficiency of machine learning techniques. Comment: Healthcare Informatics, Machine Learning, Knowledge Discovery; 20 pages, 1 figure

    Ct threshold values, a proxy for viral load in community SARS-CoV-2 cases, demonstrate wide variation across populations and over time

    Background: Information on SARS-CoV-2 in representative community surveillance is limited, particularly cycle threshold (Ct) values (a proxy for viral load). Methods: We included all positive nose and throat swabs 26 April 2020 to 13 March 2021 from the UK’s national COVID-19 Infection Survey, tested by RT-PCR for the N, S, and ORF1ab genes. We investigated predictors of median Ct value using quantile regression. Results: Of 3,312,159 nose and throat swabs, 27,902 (0.83%) were RT-PCR-positive, 10,317 (37%), 11,012 (40%), and 6550 (23%) for 3, 2, or 1 of the N, S, and ORF1ab genes, respectively, with median Ct = 29.2 (~215 copies/ml; IQR Ct = 21.9–32.8, 14–56,400 copies/ml). Independent predictors of lower Cts (i.e. higher viral load) included self-reported symptoms and more genes detected, with at most small effects of sex, ethnicity, and age. Single-gene positives almost invariably had Ct > 30, but Cts varied widely in triple-gene positives, including without symptoms. Population-level Cts changed over time, with declining Ct preceding increasing SARS-CoV-2 positivity. Of 6189 participants with IgG S-antibody tests post-first RT-PCR-positive, 4808 (78%) were ever antibody-positive; Cts were significantly higher in those remaining antibody-negative. Conclusions: Marked variation in community SARS-CoV-2 Ct values suggests that they could be a useful epidemiological early-warning indicator. Funding: Department of Health and Social Care, National Institute for Health Research, Huo Family Foundation, Medical Research Council UK; Wellcome Trust
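The survey's median-Ct modelling rests on quantile regression, whose defining ingredient is the pinball loss: at quantile q = 0.5 the constant prediction minimising it is the sample median. A minimal pure-Python sketch of that fact, using invented Ct values (not study data) for a hypothetical symptomatic and asymptomatic group:

```python
def pinball_loss(predicted, observed, q):
    """Average pinball (quantile) loss of one constant prediction."""
    total = 0.0
    for y in observed:
        diff = y - predicted
        total += q * diff if diff >= 0 else (q - 1) * diff
    return total / len(observed)

# Hypothetical Ct values for two groups (illustration only).
symptomatic = [18.5, 21.0, 22.4, 25.1, 27.9, 29.2, 30.0]
asymptomatic = [27.0, 29.5, 31.2, 32.8, 33.5, 34.1, 35.6]

def best_constant(observed, q, grid):
    """Grid-search the constant prediction minimising the pinball loss."""
    return min(grid, key=lambda c: pinball_loss(c, observed, q))

grid = [x / 10 for x in range(150, 400)]   # candidate Ct values 15.0–39.9
median_sym = best_constant(symptomatic, 0.5, grid)    # recovers 25.1
median_asym = best_constant(asymptomatic, 0.5, grid)  # recovers 32.8
```

In practice a library routine such as statsmodels' `QuantReg` fits covariates (symptoms, number of genes detected, calendar time) against this same loss; the grid search above only shows why minimising it at q = 0.5 recovers the median Ct.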

    Effect of Delta variant on viral burden and vaccine effectiveness against new SARS-CoV-2 infections in the UK

    The effectiveness of the BNT162b2 and ChAdOx1 vaccines against new severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections requires continuous re-evaluation, given the increasingly dominant B.1.617.2 (Delta) variant. In this study, we investigated the effectiveness of these vaccines in a large, community-based survey of randomly selected households across the United Kingdom. We found that the effectiveness of BNT162b2 and ChAdOx1 against infections (new polymerase chain reaction (PCR)-positive cases) with symptoms or high viral burden is reduced with the B.1.617.2 variant (absolute difference of 10–13% for BNT162b2 and 16% for ChAdOx1) compared to the B.1.1.7 (Alpha) variant. The effectiveness of two doses remains at least as great as protection afforded by prior natural infection. The dynamics of immunity after second doses differed significantly between BNT162b2 and ChAdOx1, with greater initial effectiveness against new PCR-positive cases but faster declines in protection against high viral burden and symptomatic infection with BNT162b2. There was no evidence that effectiveness varied by dosing interval, but protection was higher in vaccinated individuals after a prior infection and in younger adults. With B.1.617.2, infections occurring after two vaccinations had similar peak viral burden as those in unvaccinated individuals. SARS-CoV-2 vaccination still reduces new infections, but effectiveness and attenuation of peak viral burden are reduced with B.1.617.2

    Marine Dynamics and Productivity in the Bay of Bengal

    The Bay of Bengal provides important ecosystem services to the Bangladesh delta. It is also subject to the consequences of climate change as monsoon atmospheric circulation and fresh water input from the major rivers are the dominating influences. Changes in marine circulation will affect patterns of biological production through alterations in the supply of nutrients to photosynthesising plankton. Productivity in the northern Bay will also be sensitive to changes in riverborne nutrients. In turn, these changes could influence potential fish catch. The Bay also affects the physical environment of Bangladesh: relative sea-level rise is expected to be in the range of 0.5–1.7 m by 2100, and changing climate could affect the development of tropical cyclones over the Bay

    Increased urine IgM excretion predicts cardiovascular events in patients with type 1 diabetes nephropathy

    Background: Diabetic nephropathy, a major complication of diabetes, is characterized by progressive renal injury and increased cardiovascular mortality. Increased urinary albumin excretion due to dysfunction of the glomerular barrier is an early sign of diabetic nephropathy, while increased urinary excretion of higher-molecular-weight proteins such as IgM appears as glomerular injury progresses. We aimed to study the prognostic significance of urine IgM excretion in patients with type 1 diabetes mellitus. Methods: This is an observational study of 139 patients with type 1 diabetes mellitus (79 males and 60 females) under routine care at the diabetic outpatient clinic of Lund University Hospital. The median follow-up time was 18 (range 1 to 22) years. Urine albumin and urine IgM concentrations were measured at recruitment. Results: Overall, 32 patients (14 male and 18 female) died of a cardiovascular event and 20 (11 male and 9 female) reached end-stage renal disease. Univariate analysis indicated that patient survival and renal survival were inversely associated with urine albumin excretion (RR = 2.9 and 5.8, respectively) and urine IgM excretion (RR = 4.6 and 5.7, respectively). Stratified analysis demonstrated that, across degrees of albuminuria, the cardiovascular mortality rate and the incidence of end-stage renal disease were approximately three times higher in patients with increased urine IgM excretion. Conclusion: An increase in urinary IgM excretion in patients with type 1 diabetes is associated with an increased risk of cardiovascular mortality and renal failure, regardless of the degree of albuminuria

    Cognitive and environmental predictors of early literacy skills

    Not all young children benefit from book exposure at preschool age. It has been claimed that the ability to hold information in mind (short-term memory), to ignore distraction (inhibition), and to focus attention and stay focused (sustained attention) may moderate children’s reactions to the home literacy environment (HLE). In a group of 228 junior kindergarten children with a native Dutch background and a mean age of 54.29 months (SD = 2.12 months), we therefore explored the relationship between book exposure, cognitive control and early literacy skills. Parents filled in an HLE questionnaire (book-sharing frequency and an author recognition checklist as an indicator of parental leisure reading habits), and children completed several tests in individual sessions with the researcher (a book-cover recognition test, the PPVT, a letter knowledge test, the categories and patterns subtests of the SON, and cognitive control measures, namely the digit span of the KABC, a peg-tapping task and the sustained attention task of the ANT). The main findings were: (1) children’s storybook knowledge mediated the relationship between the home literacy environment and literacy skills; (2) both vocabulary and letter knowledge were predicted by book exposure; (3) short-term memory predicted vocabulary over and above book exposure; (4) none of the cognitive control mechanisms moderated the beneficial effects of book exposure

    EhMAPK, the Mitogen-Activated Protein Kinase from Entamoeba histolytica Is Associated with Cell Survival

    Mitogen-activated protein kinases (MAPKs) are a class of serine/threonine kinases that regulate a number of different cellular activities, including cell proliferation, differentiation, survival and even death. The pathogen Entamoeba histolytica possesses a single homologue of a typical MAPK gene (EhMAPK), whose identification we previously reported but whose functional implications remained unexplored. EhMAPK, the only mitogen-activated protein kinase from the parasitic protist Entamoeba histolytica with a Threonine-X-Tyrosine (TXY) phosphorylation motif, was cloned, expressed in E. coli and functionally characterized under different stress conditions. The expression profile of EhMAPK at the protein and mRNA levels remained similar among untreated, heat-shocked and hydrogen peroxide-treated samples at all doses and times, but a significant difference was observed in the phosphorylation status of the protein in response to different stresses. Heat shock at 43°C or 0.5 mM H2O2 treatment enhanced the phosphorylation status of EhMAPK and augmented its kinase activity, whereas 2.0 mM H2O2 treatment induced dephosphorylation of EhMAPK and loss of kinase activity. 2.0 mM H2O2 treatment reduced parasite viability significantly, but heat shock and 0.5 mM H2O2 treatment did not adversely affect E. histolytica viability. Therefore, activation of EhMAPK appears to be associated with stress survival in E. histolytica. Our study also gives a glimpse of the regulatory mechanism of the protein under in vivo conditions. Since the parasite genome lacks a typical homologue of mammalian MEK, the dual-specificity kinase that is the upstream activator of MAPK, some alternative regulatory mechanism of EhMAPK activity must exist. This may include the autophosphorylation activity of the protein itself in combination with upstream phosphatases that are not yet identified

    Overcoming language barriers with foreign-language speaking patients: a survey to investigate intra-hospital variation in attitudes and practices

    Background Use of available interpreter services by hospital clinical staff is often suboptimal, despite evidence that trained interpreters contribute to quality of care and patient safety. Examination of intra-hospital variation in attitudes and practices regarding interpreter use can help identify factors that facilitate good practice. The purpose of this study was to describe attitudes, practices and preferences regarding communication with limited French proficiency (LFP) patients, examine how these vary across professions and departments within the hospital, and identify factors associated with good practices. Methods A self-administered questionnaire was mailed to random samples of 700 doctors, 700 nurses and 93 social workers at the Geneva University Hospitals, Switzerland. Results Seventy percent of respondents encountered LFP patients at least once a month, but this varied by department. 66% of respondents said they preferred working with ad hoc interpreters (patient's family and bilingual staff), mainly because these were easier to access. During the 6 months preceding the study, ad hoc interpreters were used at least once by 71% of respondents, and professional interpreters by 51%. Overall, only 9% of respondents had received any training in how and why to work with a trained interpreter. Only 23.2% of respondents said the clinical service in which they currently worked encouraged them to use professional interpreters. Respondents working in services where use of professional interpreters was encouraged were more likely to be of the opinion that the hospital should systematically provide a professional interpreter to LFP patients (40.3%) than those working in a department that discouraged use of professional interpreters (15.5%), and they used professional interpreters more often during the previous 6 months.
Conclusion Attitudes and practices regarding communication with LFP patients vary across professions and hospital departments. Fostering an institution-wide culture conducive to adequate communication with LFP patients will require both a hospital-wide policy and service-level activities aimed at reinforcing this policy and putting it into practice

    Anemia status, hemoglobin concentration and outcome after acute stroke: a cohort study

    Background: In the setting of acute stroke, anemia has the potential to worsen brain ischemia; however, the relationship between the entire range of hemoglobin concentrations and long-term outcome is not well understood. Methods: We examined the association between World Health Organization-defined admission anemia status (hemoglobin < 13 g/dl in men, < 12 g/dl in women) and hemoglobin concentration and 1-year outcome among 859 consecutive patients with acute stroke (ischemic or intracerebral hemorrhage). Results: The mean baseline hemoglobin concentration was 13.8 ± 1.7 g/dl (range 8.1–18.7). WHO-defined anemia was present in 19% of patients among both women and men. After adjustment for differences in baseline characteristics, patients with admission anemia had an adjusted OR for all-cause death of 1.90 (95% CI, 1.05 to 3.43) at 1 month and 1.72 (95% CI, 1.00 to 2.93) at 1 year, and for the combined end-point of disability, nursing facility care or death of 2.09 (95% CI, 1.13 to 3.84) and 1.83 (95% CI, 1.02 to 3.27), respectively. The relationship between hemoglobin quartiles and all-cause death revealed a non-linear association, with increased risk at extremes of both low and high concentrations. In logistic regression models developed to estimate the linear and quadratic relation between hemoglobin and outcomes of interest, each unit increment in hemoglobin squared was associated with increased adjusted odds of all-cause death [1.06 (1.01 to 1.12; p = 0.03) at 1 month; 1.09 (1.04 to 1.15; p < 0.01) at 1 year], confirming that extremes of both low and high hemoglobin levels were associated with increased mortality. Conclusions: WHO-defined anemia was common in both men and women among patients with acute stroke and predicted poor outcome. Moreover, the association between admission hemoglobin and mortality was not linear; the risk of death increased at both extremes of hemoglobin
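The quadratic hemoglobin term in the Results can be sketched with a toy logistic regression fitted by gradient descent. The cohort below is simulated (the coefficients and the 14 g/dl risk nadir are illustrative assumptions, not the study's estimates):

```python
import math
import random

random.seed(0)

def simulate(n=800):
    """Synthetic cohort: true mortality risk is U-shaped, lowest near 14 g/dl."""
    rows = []
    for _ in range(n):
        hgb = random.uniform(8.0, 18.0)
        logit = -2.0 + 0.15 * (hgb - 14.0) ** 2   # assumed true log-odds
        p = 1.0 / (1.0 + math.exp(-logit))
        rows.append((hgb, 1 if random.random() < p else 0))
    return rows

def fit_quadratic_logit(rows, lr=0.01, epochs=1500):
    """Gradient descent on the log-loss with centred linear and squared terms."""
    b0 = b1 = b2 = 0.0
    n = len(rows)
    for _ in range(epochs):
        g0 = g1 = g2 = 0.0
        for hgb, died in rows:
            x1 = hgb - 14.0            # centring reduces collinearity
            x2 = x1 * x1
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x1 + b2 * x2)))
            err = p - died
            g0 += err
            g1 += err * x1
            g2 += err * x2
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
        b2 -= lr * g2 / n
    return b0, b1, b2

b0, b1, b2 = fit_quadratic_logit(simulate())
```

A positive coefficient on the squared term (`b2`) reproduces the reported U-shape: the fitted odds of death rise as hemoglobin moves away from the middle of its range in either direction.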