43 research outputs found

    Multicenter Evaluation of a 0-Hour/1-Hour Algorithm in the Diagnosis of Myocardial Infarction With High-Sensitivity Cardiac Troponin T

    Get PDF
    Study objective: We aimed to prospectively validate the diagnostic accuracy of the recently developed 0-hour/1-hour algorithm, using high-sensitivity cardiac troponin T (hs-cTnT), for the early rule-out and rule-in of acute myocardial infarction.

    Methods: We enrolled patients presenting to the emergency department with suspected acute myocardial infarction and recent (<6 hours) onset of symptoms in a global multicenter diagnostic study. Hs-cTnT (Roche Diagnostics) and sensitive cardiac troponin I (Siemens Healthcare) were measured at presentation and after 1 hour, 2 hours, and 4 to 14 hours in a central laboratory. Patient triage according to the predefined hs-cTnT 0-hour/1-hour algorithm (hs-cTnT below 12 ng/L and Δ1 hour below 3 ng/L to rule out; hs-cTnT at least 52 ng/L or Δ1 hour at least 5 ng/L to rule in; remaining patients to the "observational zone") was compared against a final diagnosis centrally adjudicated by 2 independent cardiologists (reference standard). The final diagnosis was based on all available information, including coronary angiography and echocardiography results, follow-up data, and serial measurements of sensitive cardiac troponin I; the adjudicators remained blinded to hs-cTnT.

    Results: Among 1,282 patients enrolled, acute myocardial infarction was the final diagnosis for 213 (16.6%) patients. Applying the hs-cTnT 0-hour/1-hour algorithm, 813 (63.4%) patients were classified as rule out, 184 (14.4%) as rule in, and 285 (22.2%) were triaged to the observational zone. This resulted in a negative predictive value and sensitivity for acute myocardial infarction of 99.1% (95% confidence interval [CI] 98.2% to 99.7%) and 96.7% (95% CI 93.4% to 98.7%) in the rule-out zone (7 patients with false-negative results), a positive predictive value and specificity of 77.2% (95% CI 70.4% to 83.0%) and 96.1% (95% CI 94.7% to 97.2%) in the rule-in zone, and a prevalence of acute myocardial infarction of 22.5% in the observational zone.

    Conclusion: The hs-cTnT 0-hour/1-hour algorithm performs well for the early rule-out and rule-in of acute myocardial infarction.
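The triage rule above is a simple threshold classifier, so it can be sketched in a few lines. This is a minimal illustration with thresholds taken from the abstract; the function name and interface are invented for the example, and Δ is treated as the absolute 0-to-1-hour change:

```python
def triage_hs_ctnt(hs_ctnt_0h, hs_ctnt_1h):
    """Triage a patient per the hs-cTnT 0-hour/1-hour algorithm
    described above. Concentrations are in ng/L.

    Illustrative sketch only; not clinical software.
    """
    delta = abs(hs_ctnt_1h - hs_ctnt_0h)
    # Rule out: low baseline value and a stable 1-hour change.
    if hs_ctnt_0h < 12 and delta < 3:
        return "rule-out"
    # Rule in: high baseline value or a large 1-hour change.
    if hs_ctnt_0h >= 52 or delta >= 5:
        return "rule-in"
    # Everyone else needs further observation.
    return "observational zone"
```

As a consistency check on the reported figures: 7 false negatives among 813 rule-outs gives a negative predictive value of 806/813 ≈ 99.1%, and 206 of the 213 infarctions falling outside the rule-out zone gives a sensitivity of 206/213 ≈ 96.7%, matching the abstract.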

    Prevalence of Coxiella burnetii in clinically healthy German sheep flocks

    Get PDF
    Background: Current epidemiological data on the situation of Coxiella (C.) burnetii infections in sheep are missing, making risk assessment and the implementation of counteractive measures difficult. Using the German state of Thuringia as a model example, the estimated sero- and antigen prevalence of C. burnetii (10% and 25%, respectively) was assessed at flock level in 39 of 252 randomly selected, clinically healthy sheep flocks with more than 100 ewes and an unknown abortion rate.

    Results: The CHECKITℱ Q-fever Test Kit identified 11 (28%) antibody-positive flocks, whereas real-time PCR revealed the presence of C. burnetii DNA in 2 (5%) of the flocks. Multiple-locus variable number of tandem repeats analysis of 9 isolates obtained from one flock revealed identical profiles. All isolates contained the plasmid QpH1.

    Conclusions: The results demonstrate that C. burnetii is present in clinically inconspicuous sheep flocks, and sporadic flare-ups do occur, as notifications to the German animal disease reporting system show. Although C. burnetii infections are not a primary veterinary concern owing to the lack of significant clinical impact on animal health (with the exception of goats), the eminent zoonotic risk for humans should not be underestimated. Therefore, strategies combining the interests of public and veterinary public health should include the monitoring of flocks, the identification and culling of shedders, and the administration of protective vaccines.

    Dietary fruits and vegetables and cardiovascular diseases risk

    Get PDF
    Diet is likely to be an important determinant of cardiovascular disease (CVD) risk. In this article, we review the evidence linking the consumption of fruit and vegetables and CVD risk. The initial evidence that fruit and vegetable consumption has a protective effect against CVD came from observational studies. However, uncertainty remains about the magnitude of the benefit of fruit and vegetable intake on the occurrence of CVD, and about whether the optimal intake is five portions or greater. Results from randomized controlled trials do not show conclusively that fruit and vegetable intake protects against CVD, in part because the dietary interventions have been of insufficient intensity to allow an optimal analysis of their putative effects. The protective mechanisms of fruit and vegetables may include not only known bioactive nutrient effects dependent on their antioxidant, anti-inflammatory, and electrolyte properties, but also functional properties such as low glycemic load and energy density. Taken together, the totality of the evidence accumulated so far does appear to support the notion that increased intake of fruits and vegetables may reduce cardiovascular risk. It is clear that fruit and vegetables should be eaten as part of a balanced diet, as a source of vitamins, fiber, minerals, and phytochemicals. The evidence now suggests that a complicated set of several nutrients may interact with genetic factors to influence CVD risk. Therefore, it may be more important to focus on whole foods and dietary patterns rather than individual nutrients to successfully reduce CVD risk. A clearer understanding of the relationship between fruit and vegetable intake and cardiovascular risk would provide health professionals with significant information in terms of public health and clinical practice.

    The GALAH Survey: Non-LTE departure coefficients for large spectroscopic surveys

    Get PDF
    19 pages, 25 figures, 2 tables; abstract abridged; accepted for publication in A&A.
    Massive sets of stellar spectroscopic observations are rapidly becoming available and these can be used to determine the chemical composition and evolution of the Galaxy with unprecedented precision. One of the major challenges in this endeavour involves constructing realistic models of stellar spectra with which to reliably determine stellar abundances. At present, large stellar surveys commonly use simplified models that assume that the stellar atmospheres are approximately in local thermodynamic equilibrium (LTE). To test and ultimately relax this assumption, we have performed non-LTE calculations for 13 different elements (H, Li, C, N, O, Na, Mg, Al, Si, K, Ca, Mn, and Ba), using recent model atoms that have physically-motivated descriptions for the inelastic collisions with neutral hydrogen, across a grid of 3756 1D MARCS model atmospheres that spans 3000 ≀ Teff/K ≀ 8000, −0.5 ≀ log g (in cm s⁻ÂČ) ≀ 5.5, and −5 ≀ [Fe/H] ≀ 1. We present the grids of departure coefficients that have been implemented into the GALAH DR3 analysis pipeline in order to complement the extant non-LTE grid for iron. We also present a detailed line-by-line re-analysis of 50,126 stars from GALAH DR3. We found that relaxing LTE can change the abundances by between −0.7 dex and +0.2 dex for different lines and stars. Taking departures from LTE into account can reduce the dispersion in the [A/Fe] versus [Fe/H] plane by up to 0.1 dex, and it can remove spurious differences between the dwarfs and giants by up to 0.2 dex. The resulting abundance slopes can thus be qualitatively different in non-LTE, possibly with important implications for the chemical evolution of our Galaxy. Peer reviewed.
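For readers unfamiliar with the term, the departure coefficient of an atomic level i is conventionally defined as the ratio of its actual (non-LTE) population to the population the LTE (Saha–Boltzmann) relations would predict. This definition is standard background from the literature, not taken from the abstract:

```latex
b_i = \frac{n_i^{\mathrm{NLTE}}}{n_i^{\mathrm{LTE}}}
```

When b_i = 1 for all levels, LTE is recovered; tabulated grids of these coefficients let a survey pipeline correct LTE line formation without repeating the full non-LTE calculation for every star.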

    The Water-Energy Nexus of Hydraulic Fracturing: A Global Hydrologic Analysis for Shale Oil and Gas Extraction

    Get PDF
    Shale deposits are globally abundant and widespread. Extraction of shale oil and shale gas is generally performed through water-intensive hydraulic fracturing. Despite recent work on its environmental impacts, it remains unclear where and to what extent shale resource extraction could compete with other water needs. Here we consider the global distribution of known shale deposits suitable for oil and gas extraction and develop a water balance model to quantify their impacts on local water availability for other human uses and ecosystem functions. We find that 31–44% of the world's shale deposits are located in areas where water stress would either emerge or be exacerbated as a result of shale oil or gas extraction; 20% of shale deposits are in areas affected by groundwater depletion and 30% in irrigated land. In these regions shale oil and shale gas production would likely compete for local water resources with agriculture, environmental flows, and other water needs. By adopting a hydrologic perspective that considers water availability and demand together, decision makers and local communities can better understand the water and food security implications of shale resource development.

    The fate of mercury in Arctic terrestrial and aquatic ecosystems, a review

    Full text link

    Effect of remote ischaemic conditioning on clinical outcomes in patients with acute myocardial infarction (CONDI-2/ERIC-PPCI): a single-blind randomised controlled trial.

    Get PDF
    BACKGROUND: Remote ischaemic conditioning with transient ischaemia and reperfusion applied to the arm has been shown to reduce myocardial infarct size in patients with ST-elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (PPCI). We investigated whether remote ischaemic conditioning could reduce the incidence of cardiac death and hospitalisation for heart failure at 12 months. METHODS: We did an international investigator-initiated, prospective, single-blind, randomised controlled trial (CONDI-2/ERIC-PPCI) at 33 centres across the UK, Denmark, Spain, and Serbia. Patients (age >18 years) with suspected STEMI and who were eligible for PPCI were randomly allocated (1:1, stratified by centre with a permuted block method) to receive standard treatment (including a sham simulated remote ischaemic conditioning intervention at UK sites only) or remote ischaemic conditioning treatment (intermittent ischaemia and reperfusion applied to the arm through four cycles of 5-min inflation and 5-min deflation of an automated cuff device) before PPCI. Investigators responsible for data collection and outcome assessment were masked to treatment allocation. The primary combined endpoint was cardiac death or hospitalisation for heart failure at 12 months in the intention-to-treat population. This trial is registered with ClinicalTrials.gov (NCT02342522) and is completed. FINDINGS: Between Nov 6, 2013, and March 31, 2018, 5401 patients were randomly allocated to either the control group (n=2701) or the remote ischaemic conditioning group (n=2700). After exclusion of patients upon hospital arrival or loss to follow-up, 2569 patients in the control group and 2546 in the intervention group were included in the intention-to-treat analysis. 
At 12 months post-PPCI, the Kaplan-Meier-estimated frequencies of cardiac death or hospitalisation for heart failure (the primary endpoint) were 220 (8·6%) patients in the control group and 239 (9·4%) in the remote ischaemic conditioning group (hazard ratio 1·10 [95% CI 0·91-1·32], p=0·32 for intervention versus control). No important unexpected adverse events or side effects of remote ischaemic conditioning were observed. INTERPRETATION: Remote ischaemic conditioning does not improve clinical outcomes (cardiac death or hospitalisation for heart failure) at 12 months in patients with STEMI undergoing PPCI. FUNDING: British Heart Foundation, University College London Hospitals/University College London Biomedical Research Centre, Danish Innovation Foundation, Novo Nordisk Foundation, TrygFonden.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    Get PDF
    IMPORTANCE: Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.

    OBJECTIVE: To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.

    DESIGN, SETTING, AND PARTICIPANTS: In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).

    INTERVENTIONS: Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.

    MAIN OUTCOMES AND MEASURES: The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis used a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.

    RESULTS: On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients were 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).

    CONCLUSIONS AND RELEVANCE: In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.

    TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT0273570
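As a quick arithmetic check, the crude hospital-survival percentages reported above follow directly from the stated counts. A minimal sketch (group labels are illustrative; counts are from the abstract):

```python
# Survivors and group sizes as reported in the abstract.
groups = {
    "ACE inhibitor": (166, 231),
    "ARB": (152, 217),
    "control": (182, 231),
}
# Crude survival percentage per group, rounded to one decimal place;
# these reproduce the 71.9%, 70.0%, and 78.8% quoted in the text.
rates = {name: round(100 * survived / n, 1)
         for name, (survived, n) in groups.items()}
```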