
    Analysis of C-reactive protein from finger stick dried blood spot to predict high risk of cardiovascular disease

    C-reactive protein (CRP) is an acute-phase protein involved in inflammation. CRP is also an important diagnostic biomarker used to predict risk of cardiovascular disease (CVD) and to monitor bacterial and viral infections. Measuring plasma CRP still requires venipuncture performed by trained phlebotomists. As a solution, dried blood spots (DBS) allow minimally invasive at-home blood sampling and can be sent to diagnostic laboratories by regular mail. In this study, we included 53 patients who presented to the outpatient clinic of the University Medical Center Utrecht. Blood obtained by capillary finger stick was spotted on a filter paper card and allowed to dry. After extraction of the DBS, CRP was analyzed on an automated high-throughput chemistry analyzer. Additional validation steps regarding stability, effect of hematocrit, precision, and limits of blank and quantitation were conducted according to the corresponding Clinical and Laboratory Standards Institute standards. Regression analysis showed excellent agreement, with R2 (95% confidence interval) = 0.986 (0.982–0.989). This enabled correct classification of all 25 high-CVD-risk cases, with sensitivity (95% CI) of 1.00 (1.00–1.00) and specificity (95% CI) of 0.96 (0.89–1.03), and correct diagnosis of inflammation in 12 of 13 cases, with sensitivity (95% CI) of 0.92 (0.77–1.07) and specificity (95% CI) of 1.00 (1.00–1.00). Furthermore, CRP was found to be stable for 31 days, and the observed hematocrit variation among patients was clinically acceptable. CRP from DBS can be accurately measured on an automated high-throughput chemistry analyzer and used to diagnose inflammation and classify high CVD risk. This method enables individuals to engage in at-home blood sampling on DBS for (tele)diagnostics, screening programs, patient follow-up, and medication management.
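
    As an aside on how such interval estimates behave, below is a minimal sketch of sensitivity and specificity with Wald 95% confidence intervals. The 2x2 counts are hypothetical (chosen to yield intervals like those reported above), and the Wald interval is one plausible method; it also shows how a specificity CI can nominally exceed 1.00.

    ```python
    import math

    def sens_spec_wald(tp, fn, tn, fp, z=1.96):
        """Sensitivity and specificity with Wald 95% CIs.

        The Wald interval is symmetric and unbounded above, which is why
        an interval such as 0.96 (0.89-1.03) can nominally exceed 1.0.
        """
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        se_sens = math.sqrt(sens * (1 - sens) / (tp + fn))
        se_spec = math.sqrt(spec * (1 - spec) / (tn + fp))
        return ((sens, sens - z * se_sens, sens + z * se_sens),
                (spec, spec - z * se_spec, spec + z * se_spec))

    # Hypothetical counts: 25 true high-CVD-risk cases all detected,
    # 26 true negatives, 1 false positive.
    (sens, s_lo, s_hi), (spec, p_lo, p_hi) = sens_spec_wald(tp=25, fn=0, tn=26, fp=1)
    print(f"sensitivity {sens:.2f} ({s_lo:.2f}-{s_hi:.2f})")  # 1.00 (1.00-1.00)
    print(f"specificity {spec:.2f} ({p_lo:.2f}-{p_hi:.2f})")  # 0.96 (0.89-1.03)
    ```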

    Design, validation and implementation of an automated e-alert for acute kidney injury: 6-month pilot study shows increased awareness

    BACKGROUND: Acute kidney injury (AKI) is defined as a sudden episode of kidney failure and is known to be under-recognized by healthcare professionals. The Kidney Disease: Improving Global Outcomes (KDIGO) guidelines have formulated criteria that facilitate AKI diagnosis by comparing changes in plasma creatinine (PCr) measurements. To improve AKI awareness, we implemented these criteria as an electronic alert (e-alert) in our electronic health record (EHR) system. METHODS: For every new PCr measurement at the University Medical Center Utrecht that triggered the e-alert, we provided the physician with actionable insights, in the form of a memo, to improve or stabilize kidney function. Since e-alerts qualify as software as a medical device (SaMD), we designed, implemented, and validated the e-alert according to the European Union In Vitro Diagnostic Regulation (IVDR). RESULTS: We evaluated the impact of the e-alert using pilot data from the six months before and after implementation. In the six months before implementation, 2,053 e-alerts were triggered for 866 patients; in the six months after, 1,970 e-alerts were triggered for 853 patients. We found improvements in AKI awareness as measured by (1) PCr follow-up within 2 days (56.6% to 65.8%, p = 0.003) and (2) discontinuation of nephrotoxic medication within 7 days of the e-alert (59.2% to 63.2%, p = 0.002). CONCLUSION: Here, we describe the design and implementation of the e-alert in line with the IVDR, leveraging a multidisciplinary team of physicians, clinical chemists, data managers, and data scientists, and share our first results, which indicate improved awareness among treating physicians.
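
    The abstract does not spell out the trigger logic itself, so the sketch below is only an illustration of the KDIGO plasma creatinine criteria such an e-alert could implement; the thresholds follow the KDIGO guideline, while the function name and data layout are invented.

    ```python
    from datetime import datetime, timedelta

    def kdigo_aki_alert(measurements):
        """Return True if the newest PCr value meets a KDIGO creatinine
        criterion for AKI: a rise of >= 26.5 umol/L (0.3 mg/dL) within
        48 hours, or >= 1.5x an earlier value within the preceding 7 days.

        `measurements` is a time-sorted list of (timestamp, pcr_umol_l);
        the last entry is the newly reported result.
        """
        t_new, pcr_new = measurements[-1]
        for t_prev, pcr_prev in measurements[:-1]:
            age = t_new - t_prev
            if age <= timedelta(hours=48) and pcr_new - pcr_prev >= 26.5:
                return True
            if age <= timedelta(days=7) and pcr_new >= 1.5 * pcr_prev:
                return True
        return False

    # Example: baseline 80 umol/L, 130 umol/L one day later -> alert fires
    history = [(datetime(2023, 1, 1), 80.0), (datetime(2023, 1, 2), 130.0)]
    print(kdigo_aki_alert(history))  # True
    ```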

    Transportability and Implementation Challenges of Early Warning Scores for Septic Shock in the ICU: A Perspective on the TREWScore

    The increased use of electronic health records (EHRs) has improved the availability of routine care data for medical research. Combined with machine learning techniques, this has spurred the development of early warning scores (EWSs) in hospitals worldwide. EWSs are commonly used in the hospital where they were developed, yet few have been transported to external settings and/or internationally. In this perspective, we describe our experiences in implementing the TREWScore, a septic shock EWS, and the transportability challenges we faced regarding domain, predictors, and clinical outcome. We used data from 53,330 ICU stays in the Medical Information Mart for Intensive Care-III (MIMIC-III) and 18,013 ICU stays from the University Medical Center (UMC) Utrecht, including 17,023 (31.9%) and 2,557 (14.2%) cases of sepsis, respectively. The MIMIC-III and UMC populations differed significantly in length of stay (6.9 vs. 9.0 days) and hospital mortality (11.6% vs. 13.6%). We mapped all 54 TREWScore predictors to the UMC database: 31 were readily available, seven required unit conversion, 14 had to be engineered, one required text mining, and one could not be mapped. Lastly, we classified sepsis cases for septic shock using the Sepsis-2 criteria. Septic shock populations (UMC 31.3% and MIMIC-III 23.3%) and time to shock events showed significant differences between the two cohorts. In conclusion, we identified challenges to transportability and implementation regarding domain, predictors, and clinical outcome when transporting an EWS between hospitals across two continents. These challenges need to be systematically addressed to improve model transportability between centers and unlock the potential clinical utility of EWSs.
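
    To make the predictor-mapping step concrete, here is a hedged sketch of a mapping table with unit conversions; the source concept names and the specific conversions are illustrative inventions, not the actual UMC Utrecht mapping.

    ```python
    # Target TREWScore-style features mapped to hypothetical source concepts,
    # with the factor needed to convert source units to target units.
    PREDICTOR_MAP = {
        "creatinine_mg_dl": ("kreatinine_umol_l", 1 / 88.42),  # umol/L -> mg/dL
        "glucose_mg_dl":    ("glucose_mmol_l",    18.016),     # mmol/L -> mg/dL
        "temperature_f":    ("temperatuur_c",     None),       # needs a formula
        "heart_rate_bpm":   ("hartfrequentie",    1.0),        # readily available
    }

    def map_predictor(feature, record):
        source, factor = PREDICTOR_MAP[feature]
        value = record[source]
        if factor is None:  # engineered conversion: Celsius -> Fahrenheit
            return value * 9 / 5 + 32
        return value * factor

    record = {"kreatinine_umol_l": 88.42, "glucose_mmol_l": 5.5,
              "temperatuur_c": 37.0, "hartfrequentie": 72}
    print(map_predictor("creatinine_mg_dl", record))  # ~1.0 mg/dL
    print(map_predictor("temperature_f", record))     # 98.6 F
    ```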

    Relations between lipoprotein(a) concentrations, LPA genetic variants, and the risk of mortality in patients with established coronary heart disease: a molecular and genetic association study

    Background: Lipoprotein(a) concentrations in plasma are associated with cardiovascular risk in the general population. Whether lipoprotein(a) concentrations or LPA genetic variants predict long-term mortality in patients with established coronary heart disease remains less clear. Methods: We obtained data from 3313 patients with established coronary heart disease in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study. We tested associations of tertiles of lipoprotein(a) concentration in plasma and two LPA single-nucleotide polymorphisms ([SNPs] rs10455872 and rs3798220) with all-cause mortality and cardiovascular mortality by Cox regression analysis, and with severity of disease by generalised linear modelling, with and without adjustment for age, sex, diabetes diagnosis, systolic blood pressure, BMI, smoking status, estimated glomerular filtration rate, LDL-cholesterol concentration, and use of lipid-lowering therapy. Results for plasma lipoprotein(a) concentrations were validated in five independent studies involving 10 195 patients with established coronary heart disease. Results for genetic associations were replicated through large-scale collaborative analysis in the GENIUS-CHD consortium, comprising 106 353 patients with established coronary heart disease and 19 332 deaths in 22 studies or cohorts. Findings: The median follow-up was 9·9 years. Increased severity of coronary heart disease was associated with lipoprotein(a) concentrations in plasma in the highest tertile (adjusted hazard ratio [HR] 1·44, 95% CI 1·14–1·83) and the presence of either LPA SNP (1·88, 1·40–2·53). No associations were found in LURIC with all-cause mortality (highest tertile of lipoprotein(a) concentration in plasma 0·95, 0·81–1·11 and either LPA SNP 1·10, 0·92–1·31) or cardiovascular mortality (0·99, 0·81–1·2 and 1·13, 0·90–1·40, respectively), or in the validation studies. Interpretation: In patients with prevalent coronary heart disease, lipoprotein(a) concentrations and genetic variants showed no associations with mortality. We conclude that these variables are not useful risk factors for predicting progression to death once coronary heart disease is established. Funding: Seventh Framework Programme for Research and Technical Development (AtheroRemo and RiskyCAD), INTERREG IV Oberrhein Programme, Deutsche Nierenstiftung, Else-Kroener Fresenius Foundation, Deutsche Stiftung für Herzforschung, Deutsche Forschungsgemeinschaft, Saarland University, German Federal Ministry of Education and Research, Willy Robert Pitzer Foundation, and Waldburg-Zeil Clinics Isny.
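
    For readers who want the shape of such an analysis, below is a minimal sketch of a covariate-adjusted Cox model over Lp(a) tertiles using the lifelines library. The file name and column names are hypothetical, covariates are assumed to be numerically encoded, and this is not the LURIC analysis code.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical analysis frame with one row per patient.
    df = pd.read_csv("followup_data.csv")

    # Highest tertile of plasma Lp(a) vs. the rest, as in the abstract.
    df["lpa_tertile"] = pd.qcut(df["lpa_mg_dl"], 3, labels=[1, 2, 3])
    df["lpa_top_tertile"] = (df["lpa_tertile"] == 3).astype(int)

    covariates = ["lpa_top_tertile", "age", "sex", "diabetes", "sbp", "bmi",
                  "smoking", "egfr", "ldl_c", "lipid_lowering_therapy"]

    cph = CoxPHFitter()
    cph.fit(df[covariates + ["followup_years", "died"]],
            duration_col="followup_years", event_col="died")
    cph.print_summary()  # adjusted hazard ratios with 95% CIs
    ```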

    Root Suberin Forms an Extracellular Barrier That Affects Water Relations and Mineral Nutrition in Arabidopsis

    Though central to our understanding of how roots perform their vital function of scavenging water and solutes from the soil, no direct genetic evidence has existed to support the foundational model that suberin forms a chemical barrier limiting the extracellular, or apoplastic, transport of water and solutes in plant roots. Using the newly characterized enhanced suberin1 (esb1) mutant, we established a connection in Arabidopsis thaliana between suberin in the root and both water movement through the plant and solute accumulation in the shoot. The esb1 mutants, characterized by increased root suberin, were found to have reduced daytime transpiration rates and increased water-use efficiency during their vegetative growth period. Furthermore, these changes in suberin and water transport were associated with decreased accumulation of Ca, Mn, and Zn and increased accumulation of Na, S, K, As, Se, and Mo in the shoot. Here, we present direct genetic evidence establishing that suberin in the roots plays a critical role in controlling both water and mineral ion uptake and transport to the leaves. The changes observed in elemental accumulation in leaves are also interpreted as evidence that a significant component of the radial root transport of Ca, Mn, and Zn occurs in the apoplast.

    Unexpected decline in tuberculosis cases coincident with economic recession -- United States, 2009

    BACKGROUND: Since 1953, through the cooperation of state and local health departments, the U.S. Centers for Disease Control and Prevention (CDC) has collected information on incident cases of tuberculosis (TB) disease in the United States. In 2009, TB case rates declined 11.4%, compared to an average annual decline of 3.8% since 2000. The unexpectedly large decline raised concerns that TB cases may have gone unreported. To address the unexpected decline, we examined trends from multiple sources on TB treatment initiation, medication sales, and laboratory and genotyping data on culture-positive TB. METHODS: We analyzed 142,174 incident TB cases reported to the U.S. National Tuberculosis Surveillance System (NTSS) during January 1, 2000 to December 31, 2009; TB control program data from 59 public health reporting areas; self-reported data from 50 CDC-funded public health laboratories; monthly electronic prescription claims for new TB therapy prescriptions; and complete genotyping results available for NTSS cases. Accounting for prior trends using regression and time-series analyses, we calculated the deviation between observed and expected TB cases in 2009 according to patient and clinical characteristics, and assessed at what point in time the deviation occurred. RESULTS: The overall deviation in TB cases in 2009 was -7.9%, with 994 fewer cases reported than expected (P < .001). We ruled out surveillance underreporting, since declines were seen both in states that used new software for case reporting in 2009 and in states that did not, and we found no cases unreported to CDC in our examination of over 5,400 individual line-listed reports in 11 areas. TB cases decreased substantially among both foreign-born and U.S.-born persons. The unexpected decline began in late 2008 or early 2009 and may have begun to reverse in late 2009. The decline was greater in terms of case counts among foreign-born than U.S.-born persons; among the foreign-born, the percentage deviation from expected was greatest among persons who had been in the United States less than 2 years. Among U.S.-born persons, the percentage deviation from expected was greatest among homeless persons and substance users. Independent information systems (NTSS, TB prescription claims, and public health laboratories) showed similar patterns of decline. Genotyping data did not suggest sudden decreases in recent transmission. CONCLUSIONS: Our assessments show that the decline in reported TB was not an artifact of changes in surveillance methods; rather, similar declines were found across multiple data sources. While the steady decline of TB cases before 2009 suggests ongoing improvement in TB control, we could not identify any substantial change in TB control activities or TB transmission that would account for the abrupt decline in 2009. It is possible that multiple causes coincident with the economic recession in the United States, including decreased immigration and delayed access to medical care, are related to the TB decline. Our findings underscore important needs in addressing health disparities as we move toward TB elimination in the United States.
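
    To make the observed-versus-expected calculation concrete, here is a small sketch of a log-linear trend extrapolation. The annual counts are invented to match the reported ~3.8% average decline and 994-case shortfall; the paper's actual regression and time-series models were more involved.

    ```python
    import numpy as np

    # Hypothetical annual TB case counts for 2000-2008, declining 3.8%/year.
    years = np.arange(2000, 2009)
    cases = 17800 * 0.962 ** (years - 2000)

    # Fit a log-linear trend to the pre-2009 years and extrapolate to 2009.
    slope, intercept = np.polyfit(years, np.log(cases), 1)
    expected_2009 = np.exp(intercept + slope * 2009)

    observed_2009 = expected_2009 - 994  # 994 fewer cases than expected
    deviation = (observed_2009 - expected_2009) / expected_2009
    print(f"deviation in 2009: {deviation:.1%}")  # about -7.9% with these inputs
    ```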

    Postsynaptic nigrostriatal dopamine receptors and their role in movement regulation

    This article presents the hypothesis that nigrostriatal dopamine may regulate movement by modulating tone and contraction in skeletal muscles through a concentration-dependent influence on postsynaptic D1 and D2 receptors, in the following manner: nigrostriatal axons innervate both receptor types within the striatal locus somatotopically responsible for motor control of the agonist/antagonist muscle pair around a given joint. D1 receptors respond to lower, and D2 receptors to higher, dopamine concentrations. Synaptic dopamine concentration increases immediately before movement starts. We hypothesize that rising dopamine concentrations first stimulate the D1 receptors, reducing muscle tone in the antagonist muscle, and then stimulate the D2 receptors, inducing contraction in the agonist muscle. The preceding reduction of tone in the antagonist muscle facilitates efficient contraction of the agonist. Our hypothesis is applicable to explaining physiological movement regulation, different forms of movement pathology, and therapeutic drug effects. Further, it provides a theoretical basis for experimental investigation of dopaminergic motor control and for the development of new strategies for treating movement disorders.

    Association of Factor V Leiden with Subsequent Atherothrombotic Events: A GENIUS-CHD Study of Individual Participant Data

    BACKGROUND: Studies examining the role of factor V Leiden among patients at higher risk of atherothrombotic events, such as those with established coronary heart disease (CHD), are lacking. Given that coagulation is involved in the thrombus formation stage upon atherosclerotic plaque rupture, we hypothesized that factor V Leiden may be a stronger risk factor for atherothrombotic events in patients with established CHD. METHODS: We performed an individual-level meta-analysis including 25 prospective studies (18 cohorts, 3 case-cohorts, 4 randomized trials) from the GENIUS-CHD (Genetics of Subsequent Coronary Heart Disease) consortium involving patients with established CHD at baseline. Participating studies genotyped factor V Leiden status and shared risk estimates for the outcomes of interest using a centrally developed statistical code with harmonized definitions across studies. Cox proportional hazards regression models were used to obtain age- and sex-adjusted estimates. The obtained estimates were pooled using fixed-effect meta-analysis. The primary outcome was a composite of myocardial infarction and CHD death. Secondary outcomes included any stroke, ischemic stroke, coronary revascularization, cardiovascular mortality, and all-cause mortality. RESULTS: The studies included 69 681 individuals, of whom 3190 (4.6%) were either heterozygous or homozygous (n=47) carriers of factor V Leiden. Median follow-up per study ranged from 1.0 to 10.6 years. A total of 20 studies with 61 147 participants and 6849 events contributed to analyses of the primary outcome. Factor V Leiden was not associated with the combined outcome of myocardial infarction and CHD death (hazard ratio, 1.03 [95% CI, 0.92-1.16]; I2=28%; P-heterogeneity=0.12). Subgroup analyses according to baseline characteristics or strata of traditional cardiovascular risk factors did not show relevant differences. Similarly, risk estimates for the secondary outcomes, including stroke, coronary revascularization, cardiovascular mortality, and all-cause mortality, were also close to unity. CONCLUSIONS: Factor V Leiden was not associated with increased risk of subsequent atherothrombotic events and mortality in high-risk participants with established and treated CHD. Routine assessment of factor V Leiden status is unlikely to improve atherothrombotic event risk stratification in this population.
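
    As a sketch of the pooling step named above, here is inverse-variance fixed-effect meta-analysis of log hazard ratios; the per-study estimates below are hypothetical placeholders, not GENIUS-CHD data.

    ```python
    import numpy as np

    def fixed_effect_pool(hrs, ci_los, ci_his):
        """Inverse-variance fixed-effect pooling of study hazard ratios.

        Standard errors are recovered from the 95% CIs on the log scale:
        se = (ln(hi) - ln(lo)) / (2 * 1.96).
        """
        log_hr = np.log(hrs)
        se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)
        w = 1 / se ** 2
        pooled = np.sum(w * log_hr) / np.sum(w)
        pooled_se = 1 / np.sqrt(np.sum(w))
        return (np.exp(pooled),
                np.exp(pooled - 1.96 * pooled_se),
                np.exp(pooled + 1.96 * pooled_se))

    # Hypothetical per-study age- and sex-adjusted HRs for the primary outcome.
    hr, lo, hi = fixed_effect_pool(hrs=[1.10, 0.95, 1.05],
                                   ci_los=[0.85, 0.70, 0.90],
                                   ci_his=[1.42, 1.29, 1.22])
    print(f"pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```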