
    Prognosis, characteristics, and provision of care for patients with the unspecified heart failure electronic health record phenotype: a population-based linked cohort study of 95,262 individuals

    Background: Whether the accuracy of the phenotype ascribed to patients in electronic health records (EHRs) is associated with variation in prognosis and care provision is unknown. We investigated this for heart failure (HF, characterised as HF with preserved ejection fraction [HFpEF], HF with reduced ejection fraction [HFrEF] and unspecified HF). / Methods: We included individuals aged 16 years and older with a new diagnosis of HF between January 2, 1998 and February 28, 2022 from linked primary and secondary care records in the Clinical Practice Research Datalink in England. We investigated the provision of guideline-recommended diagnostic investigations and pharmacological treatments. The primary outcome was a composite of HF hospitalisation or all-cause death, and secondary outcomes were time to HF hospitalisation, all-cause death and death from cardiovascular causes. We used Kaplan–Meier curves and log rank tests to compare survival across HF phenotypes and adjusted for potential confounders in Cox proportional hazards regression analyses. / Findings: Of a cohort of 95,262 individuals, 1,271 (1.3%) were recorded as having HFpEF, 10,793 (11.3%) as HFrEF and 83,198 (87.3%) as unspecified HF. Individuals recorded as having unspecified HF were older, with a higher prevalence of dementia. Patients with unspecified HF, compared to those with a recorded HF phenotype, were less likely to receive specialist assessment, echocardiography or natriuretic peptide testing in the peri-diagnostic period, or to receive angiotensin-converting enzyme inhibitors, beta blockers or mineralocorticoid receptor antagonists up to 12 months after diagnosis (risk ratios compared to HFrEF: 0.64, 95% CI 0.63–0.64; 0.59, 0.58–0.60; 0.57, 0.55–0.59; respectively), and had significantly worse outcomes (adjusted hazard ratios compared to HFrEF: HF hospitalisation and death 1.66, 95% CI 1.59–1.74; all-cause mortality 2.00, 1.90–2.10; cardiovascular death 1.77, 1.65–1.90). / Interpretation: Our findings suggest that absence of a specified HF phenotype in routine EHRs is associated with fewer clinical investigations, less treatment and worse survival, representing an actionable target to mitigate prognostic and health resource burden. / Funding: Japan Research Foundation for Healthy Aging and British Heart Foundation
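The adjusted hazard ratios quoted above come from Cox proportional hazards models, where the ratio and its 95% confidence interval are recovered from a model coefficient as HR = exp(β) and exp(β ± 1.96·SE). A minimal sketch of that back-transformation (the coefficient and standard error below are illustrative values chosen to roughly reproduce the reported all-cause mortality interval, not the study's actual model output):

```python
import math

def hr_with_ci(beta: float, se: float):
    """Hazard ratio and 95% CI back-transformed from a Cox model
    coefficient (log hazard ratio) and its standard error."""
    hr = math.exp(beta)
    lo = math.exp(beta - 1.96 * se)
    hi = math.exp(beta + 1.96 * se)
    return hr, lo, hi

# Illustrative only: beta = ln(2.0) with SE 0.0255 gives roughly the
# all-cause mortality figure reported above, HR 2.00 (95% CI 1.90-2.10).
hr, lo, hi = hr_with_ci(math.log(2.0), 0.0255)
```

The same arithmetic applies to any of the hazard ratio intervals reported above.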

    Impact of an interatrial shunt device on survival and heart failure hospitalization in patients with preserved ejection fraction

    Aims: Impaired left ventricular diastolic function leading to elevated left atrial pressures, particularly during exertion, is a key driver of symptoms and outcomes in heart failure with preserved ejection fraction (HFpEF). Insertion of an interatrial shunt device (IASD) to reduce left atrial pressure in HFpEF has been shown to be associated with short-term haemodynamic and symptomatic benefit. We aimed to investigate the potential effects of IASD placement on HFpEF survival and heart failure hospitalization (HFH). Methods and results: HFpEF patients participating in the Reduce Elevated Left Atrial Pressure in Patients with Heart Failure study (Corvia Medical) of an IASD were followed for a median duration of 739 days. The theoretical impact of IASD implantation on HFpEF mortality was investigated by comparing the observed survival of the study cohort with the survival predicted from baseline data using the Meta-analysis Global Group in Chronic Heart Failure (MAGGIC) heart failure risk survival score. Baseline and post-IASD implant parameters associated with HFH were also investigated. Based upon the individual baseline demographic and cardiovascular profile of the study cohort, the MAGGIC score-predicted mortality was 10.2/100 patient-years. The observed mortality rate of the IASD-treated cohort was 3.4/100 patient-years, a rate approximately one-third (33%) of that predicted (P = 0.02). By Kaplan–Meier analysis, the observed survival in IASD patients was greater than predicted (P = 0.014). Baseline parameters were not predictive of future HFH events; however, poorer exercise tolerance and a higher workload-corrected exercise pulmonary capillary wedge pressure at the 6-month post-IASD assessment were associated with HFH. Conclusions: The current study suggests IASD implantation may be associated with a reduction in mortality in HFpEF. Ongoing large-scale randomized studies are required to confirm the potential benefit of this therapy
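Mortality here is expressed as events per 100 patient-years, which follows directly from an event count and the pooled follow-up time. A minimal sketch with hypothetical numbers (not the trial's data):

```python
def rate_per_100_pt_years(events: int, total_follow_up_days: float) -> float:
    """Incidence rate as events per 100 patient-years of follow-up."""
    patient_years = total_follow_up_days / 365.25
    return 100.0 * events / patient_years

# Hypothetical example: 3 deaths over a pooled 32,000 days of follow-up
# works out to roughly 3.4 events per 100 patient-years.
rate = rate_per_100_pt_years(3, 32_000)
```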

    Ferumoxytol-enhanced MRI in patients with prior cardiac transplantation.

    Objectives: Ultra-small superparamagnetic particles of iron oxide (USPIO)-enhanced MRI can detect cellular inflammation within tissues and may help non-invasively identify cardiac transplant rejection. Here, we aimed to determine the normal reference values for USPIO-enhanced MRI in patients with a prior cardiac transplant and examine whether USPIO-enhanced MRI could detect myocardial inflammation in patients with transplant rejection. Methods: Ten volunteers and 11 patients with cardiac transplant underwent T2, T2* and late gadolinium enhancement 1.5T MRI, with further T2* imaging at 24 hours after USPIO (ferumoxytol, 4 mg/kg) infusion, at baseline and 3 months. Results: Ten patients with clinically stable cardiac transplantation were retained for analysis. Myocardial T2 values were higher in patients with cardiac transplant versus healthy volunteers (53.8±5.2 vs 48.6±1.9 ms, respectively; p=0.003). There were no differences in the magnitude of USPIO-induced change in R2* in patients with transplantation (change in R2*, 26.6±7.3 vs 22.0±10.4 s-1 in healthy volunteers; p=0.28). After 3 months, patients with transplantation (n=5) had unaltered T2 values (52.7±2.8 vs 52.1±3.4 ms; p=0.80) and changes in R2* following USPIO (29.4±8.1 vs 25.8±7.8 s-1; p=0.43). Conclusion: Stable patients with cardiac transplantation have increased myocardial T2 values, consistent with resting myocardial oedema or fibrosis. In contrast, USPIO-enhanced MRI is normal and stable over time, suggesting the absence of chronic macrophage-driven cellular inflammation. It remains to be determined whether USPIO-enhanced MRI may be able to identify acute cardiac transplant rejection. Trial registration number: NCT02319278 (https://clinicaltrials.gov/ct2/show/NCT02319278), registered 03.12.2014; EudraCT 2013-002336-24

    Living biointerfaces based on non-pathogenic bacteria to direct cell differentiation

    Genetically modified Lactococcus lactis, a non-pathogenic bacterium expressing the FNIII7-10 fibronectin fragment as a membrane protein, has been used to create a living biointerface between synthetic materials and mammalian cells. This FNIII7-10 fragment comprises the RGD and PHSRN sequences of fibronectin, which bind α5β1 integrins and trigger signalling for cell adhesion, spreading and differentiation. We used this L. lactis strain to colonize material surfaces and produce stable biofilms presenting the FNIII7-10 fragment readily available to cells. Biofilm density is easily tunable and remains stable for several days. Murine C2C12 myoblasts seeded over mature biofilms undergo bipolar alignment and form differentiated myotubes, a process triggered by the FNIII7-10 fragment. This biointerface based on living bacteria can be further modified to express any desired biochemical signal, establishing a new paradigm in biomaterial surface functionalisation for biomedical applications

    Association of season and herd size with somatic cell count for cows in Irish, English, and Welsh dairy herds

    The aims of this study were to describe associations of time of year and herd size with cow somatic cell count (SCC) for Irish, English, and Welsh dairy herds. Random samples of 497 and 493 Irish herds, and two samples of 200 English and Welsh (UK) herds, were selected. Random effects models for the natural logarithm of individual cow test day SCC were developed using data from herds in one sub-dataset from each country. Data from the second sub-datasets were used for cross validation. Baseline model results showed that geometric mean cow SCC (GSCC) in Irish herds was highest from February to August, and ranged from 111,000 cells/mL in May to 61,000 cells/mL in October. For cows in UK herds, GSCC ranged from 84,000 cells/mL in February and June, to 66,000 cells/mL in October. The results highlight the importance of monitoring cow SCC during spring and summer despite low bulk milk SCC at this time for Irish herds. GSCC was lowest in Irish herds of up to 130 cows (63,000 cells/mL), and increased for larger herds, reaching 68,000 cells/mL in herds of up to 300 cows. GSCC in UK herds was lowest for herds of 130–180 cows (60,000 cells/mL) and increased to 63,000 cells/mL in herds of 30 cows, and 68,000 cells/mL in herds of 300 cows. Importantly, these results suggest expansion may be associated with increased cow SCC, highlighting the importance of appropriate management if potential economies of scale are to be realised without compromising udder health
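The models above are fitted to the natural logarithm of SCC, so the geometric mean cow SCC (GSCC) values reported are back-transformed means on the log scale. A minimal sketch with hypothetical test-day counts:

```python
import math

def geometric_mean(values):
    """Geometric mean via the mean of natural logs, the back-transform
    used when models are fitted to ln(SCC)."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical test-day SCCs (cells/mL): right-skewed, as SCC data
# typically are. The geometric mean sits well below the arithmetic
# mean of 170,000, which is why SCC is modelled on the log scale.
sccs = [40_000, 60_000, 80_000, 500_000]
gscc = geometric_mean(sccs)
```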

    Which patients with heart failure should receive specialist palliative care?

    Full text link
    AIMS: We investigated which patients with heart failure (HF) should receive specialist palliative care (SPC) by first creating a definition of need for SPC in patients hospitalised with HF using patient-reported outcome measures (PROMs) and then testing this definition using the outcome of days alive and out of hospital (DAOH). We also evaluated which baseline variables predicted need for SPC and whether those with this need received SPC. METHODS AND RESULTS: PROMs assessing quality of life (QoL), symptoms, and mood were administered at baseline and every 4 months. SPC need was defined as persistently severe impairment of any PROM without improvement (or severe impairment immediately preceding death). We then tested whether need for SPC, so defined, was reflected in DAOH, a measure which combines length of stay, days of hospital re-admission, and days lost due to death. Of 272 patients recruited, 74 (27%) met the definition of SPC needs. These patients lived one third fewer DAOH than those without SPC need (and less than a quarter of QoL-adjusted DAOH). A Kansas City Cardiomyopathy Questionnaire (KCCQ) summary score of <29 identified patients who subsequently had SPC needs (area under receiver operating characteristic curve 0.78). Twenty-four per cent of patients with SPC needs actually received SPC (n = 18). CONCLUSIONS: A quarter of patients hospitalised with HF had a need for SPC and were identified by a low KCCQ score on admission. Those with SPC need spent many fewer DAOH and their DAOH were of significantly worse quality. Very few patients with SPC needs accessed SPC services
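DAOH, as used above, nets hospital days and days lost to death out of a fixed follow-up window. A minimal sketch of that calculation (the inputs are hypothetical, and the study's exact windowing rules may differ):

```python
def days_alive_and_out_of_hospital(follow_up_days: int,
                                   hospital_days: int,
                                   days_lost_to_death: int) -> int:
    """DAOH = follow-up window minus days admitted minus days of the
    window lost because the patient had died."""
    return follow_up_days - hospital_days - days_lost_to_death

# Hypothetical patient: 365-day window, 20 days admitted, died on
# day 300, so 65 days of the window are lost to death.
daoh = days_alive_and_out_of_hospital(365, 20, 65)  # 280
```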

    Alterations in Mesenteric Lymph Node T Cell Phenotype and Cytokine Secretion are Associated with Changes in Thymocyte Phenotype after LP-BM5 Retrovirus Infection

    In this study, mouse mesenteric lymph node (MLN) cells and thymocytes from advanced stages of LP-BM5 retrovirus infection were studied. A decrease in the percentage of IL-7+ cells and an increase in the percentage of IL-16+ cells in the MLN indicated that secretion of these cytokines was also altered after LP-BM5 infection. The percentage of MLN T cells expressing IL-7 receptors was significantly reduced, while the percentage of MLN T cells expressing TNFR-p75 and of B cells expressing TNFR-p55 increased. Simultaneous analysis of surface markers and cytokine secretion was done in an attempt to understand whether the deregulation of IFN-γ secretion could be ascribed to a defined cell phenotype, concluding that all T cell subsets studied increased IFN-γ secretion after retrovirus infection. Finally, thymocyte phenotype was further analyzed to correlate changes in thymocyte phenotype with MLN cell phenotype. The results indicated that the increase in single positive CD4+CD8− or CD4−CD8+ cells was due to accumulation of both immature (CD3−) and mature (CD3+) single positive thymocytes. Moreover, single positive mature thymocytes presented a phenotype similar to that previously seen on MLN T cells. In summary, we can conclude that LP-BM5 uses the immune system to reach the thymus, where it interferes with the generation of functionally mature T cells, favoring the development of T cells with an abnormal phenotype. These new T cells are activated to secrete several cytokines that in turn favor retrovirus replication and inhibit any attempt of the immune system to control infection

    High-dose intravenous iron reduces myocardial infarction in patients on haemodialysis

    AIMS: To investigate the effect of high-dose vs. low-dose intravenous (IV) iron on myocardial infarction (MI) in patients on maintenance haemodialysis. METHODS AND RESULTS: This was a pre-specified analysis of secondary endpoints of the Proactive IV Iron Therapy in Hemodialysis Patients (PIVOTAL) randomized controlled trial. Adults who had started haemodialysis within the previous year, who had a ferritin concentration <400 μg per litre and a transferrin saturation <30%, were randomized to high-dose or low-dose IV iron. The main outcome measure for this analysis was fatal or non-fatal MI. Over a median of 2.1 years of follow-up, 8.4% of patients experienced an MI. Rates of type 1 MIs (3.2/100 patient-years) were 2.5 times higher than type 2 MIs (1.3/100 patient-years). Non-ST-elevation MIs (3.3/100 patient-years) were 6 times more common than ST-elevation MIs (0.5/100 patient-years). Mortality was high after non-fatal MI (1- and 2-year mortality of 40% and 60%, respectively). In time-to-first-event analyses, proactive high-dose IV iron reduced the composite endpoint of non-fatal and fatal MI [hazard ratio (HR) 0.69, 95% confidence interval (CI) 0.52-0.93, P = 0.01] and non-fatal MI (HR 0.69, 95% CI 0.51-0.93; P = 0.01) when compared with reactive low-dose IV iron. The effect of high-dose IV iron was smaller for recurrent MI events than in the time-to-first-event analysis. CONCLUSION: In total, 8.4% of patients on maintenance haemodialysis had an MI over 2 years. High-dose compared to low-dose IV iron reduced MI in patients receiving haemodialysis. EUDRACT REGISTRATION NUMBER: 2013-002267-25

    Heart Failure Hospitalization in Adults Receiving Hemodialysis and the Effect of Intravenous Iron Therapy

    OBJECTIVES: This study sought to examine the effect of intravenous iron on heart failure events in hemodialysis patients. BACKGROUND: Heart failure is a common and deadly complication in patients receiving hemodialysis and is difficult to diagnose and treat. METHODS: The study analyzed heart failure events in the PIVOTAL (Proactive IV Iron Therapy in Hemodialysis Patients) trial, which compared intravenous iron administered proactively in a high-dose regimen with a low-dose regimen administered reactively. Heart failure hospitalization was an adjudicated outcome, a component of the primary composite outcome, and a prespecified secondary endpoint in the trial. RESULTS: Overall, 2,141 participants were followed for a median of 2.1 years. A first fatal or nonfatal heart failure event occurred in 51 (4.7%) of 1,093 patients in the high-dose iron group and in 70 (6.7%) of 1,048 patients in the low-dose group (HR: 0.66; 95% CI: 0.46-0.94; P = 0.023). There was a total of 63 heart failure events (including first and recurrent events) in the high-dose iron group and 98 in the low-dose group, giving a rate ratio of 0.59 (95% CI: 0.40-0.87; P = 0.0084). Most patients presented with pulmonary edema and were mainly treated by mechanical removal of fluid. History of heart failure and diabetes were independent predictors of a heart failure event. CONCLUSIONS: Compared with a lower-dose regimen, high-dose intravenous iron decreased the occurrence of first and recurrent heart failure events in patients undergoing hemodialysis, with large relative and absolute risk reductions. (UK Multicentre Open-label Randomised Controlled Trial Of IV Iron Therapy In Incident Haemodialysis Patients; 2013-002267-25)
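The rate ratio of 0.59 reported above comes from the trial's recurrent-event analysis; a crude event-rate ratio can be sketched from the published counts, assuming, purely for illustration, equal median follow-up in both arms (the per-arm person-time is not given here, so the crude value differs from the modelled one):

```python
def crude_rate_ratio(events_a: int, pt_a: float,
                     events_b: int, pt_b: float) -> float:
    """Crude incidence rate ratio between two groups:
    (events_a / person-time_a) / (events_b / person-time_b)."""
    return (events_a / pt_a) / (events_b / pt_b)

# Counts from the abstract (63 vs 98 total heart failure events);
# person-years per arm are ASSUMED as patients x median follow-up.
rr = crude_rate_ratio(63, 1093 * 2.1, 98, 1048 * 2.1)
# The crude value (~0.62) is close to, but not the same as, the
# reported 0.59 from the recurrent-event model.
```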

    Coralline algal Barium as indicator for 20th century northwestern North Atlantic surface ocean freshwater variability

    During the past decades, climate and freshwater dynamics in the northwestern North Atlantic have undergone major changes. Large-scale freshening episodes, related to polar freshwater pulses, have had a strong influence on ocean variability in this climatically important region. However, little is known about variability before 1950, mainly due to the lack of long-term high-resolution marine proxy archives. Here we present the first multidecadal-length records of annually resolved Ba/Ca variations from Northwest Atlantic coralline algae. We observe positive relationships between algal Ba/Ca ratios from two Newfoundland sites and salinity observations back to 1950. Both records capture episodic multi-year freshening events during the 20th century. Variability in algal Ba/Ca is sensitive to freshwater-induced changes in upper ocean stratification, which affect the transport of cold, Ba-enriched deep waters onto the shelf (stronger stratification corresponds to lower algal Ba/Ca). Algal Ba/Ca ratios therefore may serve as a new resource for reconstructing past surface ocean freshwater changes