12 research outputs found

    Developing Electron Microscopy Tools for Profiling Plasma Lipoproteins Using Methyl Cellulose Embedment, Machine Learning and Immunodetection of Apolipoprotein B and Apolipoprotein(a)

    Get PDF
    Plasma lipoproteins are important carriers of cholesterol and have been linked strongly to cardiovascular disease (CVD). Our study aimed to achieve fine-grained measurements of lipoprotein subpopulations such as low-density lipoprotein (LDL), lipoprotein(a) (Lp(a)) or remnant lipoproteins (RLP) using electron microscopy combined with machine learning tools, starting from microliter samples of human plasma. In the reported method, lipoproteins were adsorbed onto electron microscopy (EM) support films from diluted plasma and embedded in thin films of methyl cellulose (MC) containing mixed metal stains, providing intense edge contrast. The results show that lipoproteins have a continuous frequency distribution of sizes, extending from LDL (> 15 nm) to intermediate-density lipoprotein (IDL) and very low-density lipoprotein (VLDL). Furthermore, mixed metal staining produces striking “positive” contrast of specific antibodies attached to lipoproteins, providing quantitative data on apolipoprotein(a)-positive Lp(a) or apolipoprotein B (ApoB)-positive particles. To enable automatic particle characterization, we also demonstrated efficient segmentation of lipoprotein particles using deep learning software based on a Mask Region-based Convolutional Neural Network (Mask R-CNN) architecture with transfer learning. In future, EM and machine learning could be combined with microarray deposition and automated imaging for higher-throughput quantitation of lipoproteins associated with CVD risk.
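
    The abstract does not name the deep-learning framework, so the sketch below is only illustrative of the named technique: a COCO-pretrained Mask R-CNN from PyTorch/torchvision with its box and mask heads swapped for a two-class problem (background plus lipoprotein particle), ready for fine-tuning on annotated EM micrographs. It is an assumed implementation, not the authors' software.

        import torch
        import torchvision
        from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
        from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

        def build_particle_segmenter(num_classes=2):
            # Transfer learning: start from a Mask R-CNN pre-trained on COCO (torchvision >= 0.13).
            model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
            # Replace the box-classification head for background + "lipoprotein particle".
            in_features = model.roi_heads.box_predictor.cls_score.in_features
            model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
            # Replace the mask-prediction head to match the new number of classes.
            in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
            model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, num_classes)
            return model

        model = build_particle_segmenter()
        model.eval()
        with torch.no_grad():
            # Dummy 512 x 512 image standing in for a grayscale EM field replicated to 3 channels.
            prediction = model([torch.rand(3, 512, 512)])[0]
        print(prediction["masks"].shape, prediction["scores"][:5])

    Fine-tuning such a model on a small set of hand-annotated micrographs, then thresholding the predicted masks, would yield per-particle outlines from which size distributions can be measured.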

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    Get PDF
    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate on a regular basis updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways including apoptosis, not all of them can be used as a specific marker for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Get PDF
    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (n = 483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) for CFS 4 vs 1–3; OR 12.4 (6.2–24.5) for CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR 0.7 (0.3–1.9) for CFS 4 compared with 0.2 (0.1–0.7) for CFS 8). Both associations were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are the least likely to have their delirium diagnosed, and there remains a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
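
    The odds ratios above are adjusted for age and dementia, which points to a multivariable logistic regression with frailty entered as a categorical band. As a minimal sketch of that kind of analysis (synthetic data and hypothetical column names, not the audit dataset), exponentiated coefficients give the band-specific adjusted odds ratios:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Synthetic, randomly generated records purely for illustration.
        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "delirium": rng.integers(0, 2, n),      # 1 = delirium during admission
            "cfs": rng.integers(1, 9, n),           # Clinical Frailty Scale score 1-8
            "age": rng.integers(65, 100, n),
            "dementia": rng.integers(0, 2, n),
        })
        df["cfs_band"] = pd.cut(df["cfs"], bins=[0, 3, 4, 5, 6, 7, 8],
                                labels=["1-3", "4", "5", "6", "7", "8"])

        # Logistic regression of delirium on frailty band, adjusted for age and dementia;
        # exp(coefficient) is the odds ratio relative to the CFS 1-3 reference band.
        fit = smf.logit("delirium ~ C(cfs_band) + age + dementia", data=df).fit(disp=0)
        print(np.exp(fit.params))       # adjusted odds ratios
        print(np.exp(fit.conf_int()))   # 95% confidence intervals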

    Plasma Membrane-Located Purine Nucleotide Transport Proteins Are Key Components for Host Exploitation by Microsporidian Intracellular Parasites

    Get PDF
    EH and TAW acknowledge support from the Marie Curie Fellowship Programme (HTTP://cordis.europa.eu/fp7/home_en.html). ERSK, JML and TME acknowledge support from the Wellcome Trust (www.wellcome.ac.uk/). ERSK acknowledges support from the Medical Research Council (www.mrc.ac.uk). TME acknowledges support from the European Research Council Advanced Investigator Programme (http://erc.europa.eu/advanced-grants). Microsporidia are obligate intracellular parasites of most animal groups including humans, but despite their significant economic and medical importance there are major gaps in our understanding of how they exploit infected host cells. We have investigated the evolution, cellular locations and substrate specificities of a family of nucleotide transport (NTT) proteins from Trachipleistophora hominis, a microsporidian isolated from an HIV/AIDS patient. Transport proteins are critical to microsporidian success because they compensate for the dramatic loss of metabolic pathways that is a hallmark of the group. Our data demonstrate that the use of plasma membrane-located NTT proteins is a key strategy adopted by microsporidians to exploit host cells. Acquisition of an ancestral transporter gene at the base of the microsporidian radiation was followed by lineage-specific events of gene duplication, which in the case of T. hominis have generated four paralogous NTT transporters. All four T. hominis NTT proteins are localised predominantly to the plasma membrane of replicating intracellular cells, where they can mediate transport at the host-parasite interface. In contrast to published data for Encephalitozoon cuniculi, we found no evidence that any of the T. hominis NTT transporters localise to its minimal mitochondria (mitosomes), consistent with lineage-specific differences in transporter and mitosome evolution. All of the T. hominis NTTs transported radiolabelled purine nucleotides (ATP, ADP, GTP and GDP) when expressed in Escherichia coli, but did not transport radiolabelled pyrimidine nucleotides. Genome analysis suggests that imported purine nucleotides could be used by T. hominis to make all of the critical purine-based building-blocks for DNA and RNA biosynthesis during parasite intracellular replication, as well as providing essential energy for parasite cellular metabolism and protein synthesis.

    Retrospective delirium ascertainment from case notes: a retrospective cohort study

    No full text
    Objectives: This study set out to ascertain whether recognition of delirium affects patient outcomes. Design: Retrospective cohort study. Setting: Unscheduled admissions to acute care trust/secondary care UK hospitals. Participants: Six hundred and fifty-six older adults aged ≥65 years admitted on 14 September 2018. Measurements: Delirium was ascertained retrospectively from the medical case notes. Delirium documented by the clinical team was classified as recognised delirium; delirium identified only on retrospective review was classified as unrecognised delirium. Primary and secondary outcome measures: The primary outcome measure was inpatient mortality; secondary outcome measures were length of stay and discharge destination. Results: Delirium was present in 21.1% (132/626) of patients at any point during admission. The presence of delirium was associated with increased mortality (HR 2.65, CI 1.40 to 5.01). Recognition of delirium did not significantly affect outcomes. Conclusions: Delirium is associated with adverse outcomes in hospitalised older adults, but there is insufficient evidence that recognition of delirium affects those outcomes. Nevertheless, delirium recognition presents an opportunity to discuss a person’s overall prognosis with the patient and their family. Further research into the underlying pathophysiology of delirium is needed to enable the development of targeted interventions that improve outcomes in patients with delirium.
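
    The mortality estimate above is reported as a hazard ratio, which implies a time-to-event model such as Cox proportional hazards regression, although the abstract does not state the method. A minimal sketch on synthetic admission-level data (hypothetical variable names, not the study dataset) would look like this:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        # Synthetic, randomly generated records purely for illustration.
        rng = np.random.default_rng(1)
        n = 656
        df = pd.DataFrame({
            "los_days": rng.exponential(8, n).round() + 1,  # follow-up time in days
            "died": rng.integers(0, 2, n),                  # 1 = inpatient death, 0 = censored
            "delirium": rng.integers(0, 2, n),              # 1 = delirium at any point
            "age": rng.integers(65, 100, n),
        })

        # exp(coef) for "delirium" is the hazard ratio for inpatient mortality
        # associated with delirium, adjusted for the other covariates (here, age).
        cph = CoxPHFitter()
        cph.fit(df, duration_col="los_days", event_col="died")
        print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])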

    The Subcellular Distribution of Small Molecules: From Pharmacokinetics to Synthetic Biology

    No full text

    Erratum to: Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition) (Autophagy, 12, 1, 1-222, 10.1080/15548627.2015.1100356)

    No full text