The REDS score: a new scoring system to risk-stratify emergency department suspected sepsis: a derivation and validation study.
OBJECTIVE: To derive and validate a new clinical prediction rule to risk-stratify emergency department (ED) patients admitted with suspected sepsis. DESIGN: Retrospective prognostic study of prospectively collected data. SETTING: ED. PARTICIPANTS: Patients aged ≥18 years who met two Systemic Inflammatory Response Syndrome criteria or one Red Flag sepsis criterion on arrival, received intravenous antibiotics for a suspected infection and were admitted. PRIMARY OUTCOME MEASURE: In-hospital all-cause mortality. METHOD: The data were divided into derivation and validation cohorts. The simplified Mortality in Severe Sepsis in the ED score, the quick-SOFA score, refractory hypotension and lactate were each termed a 'component score'; their sum was termed the 'Risk-stratification of ED suspected Sepsis (REDS) score'. Each patient in the derivation cohort received a score (0-3) for each component score, so the REDS score ranged from 0 to 12. The component scores were subject to univariate and multivariate logistic regression analyses. Receiver operator characteristic (ROC) curves for the REDS and component scores were constructed and their cut-off points identified; scores above the cut-off points were deemed high-risk. The area under the ROC (AUROC) curves and the sensitivity for mortality of the high-risk categories of the REDS score and component scores were compared. The REDS score was internally validated. RESULTS: 2115 patients were studied, of whom 282 (13.3%) died in hospital. The derivation cohort comprised 1078 patients with 140 deaths (13%). The AUROC curve (95% CI), cut-off point and sensitivity for mortality (95% CI) of the high-risk category of the REDS score were: derivation, 0.78 (0.75 to 0.80), ≥3, 85.0 (78 to 90.5); validation, 0.74 (0.71 to 0.76), ≥3, 84.5 (77.5 to 90.0). The AUROC curve and the sensitivity for mortality of the REDS score were better than those of the component scores.
Specificity and mortality rates for REDS scores of ≥3, ≥5 and ≥7 were 54.8%, 88.8% and 96.9%, and 21.8%, 36.0% and 49.1%, respectively. CONCLUSION: The REDS score is a simple and objective score to risk-stratify ED patients with suspected sepsis.
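The scoring arithmetic described above (four components, each worth 0-3 points, summed to a 0-12 total with a ≥3 high-risk cut-off) can be sketched briefly. This is a minimal illustration of the summation and cut-off logic only, assuming pre-scored inputs: the clinical rules that assign each component its 0-3 points are not given in the abstract.

```python
# Sketch of the REDS score arithmetic: four component scores (each 0-3)
# summed to a 0-12 total; totals at or above the derived cut-off (>=3)
# are flagged high-risk. Inputs are assumed to be pre-scored components.

def reds_score(component_points):
    """Sum four pre-scored components (each 0-3) into a 0-12 REDS score."""
    if len(component_points) != 4:
        raise ValueError("expected four component scores")
    if any(not 0 <= p <= 3 for p in component_points):
        raise ValueError("each component score must be 0-3")
    return sum(component_points)

def is_high_risk(score, cutoff=3):
    """Scores at or above the cut-off point are deemed high-risk."""
    return score >= cutoff

total = reds_score([1, 0, 2, 0])  # -> 3, high-risk at the >=3 cut-off
```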
Treatment variables associated with outcome in emergency department patients with suspected sepsis.
BACKGROUND: Early treatment is advocated in the management of patients with suspected sepsis in the emergency department (ED). We sought to understand the association between ED treatments and outcome in patients admitted with suspected sepsis. The treatments studied were: (i) the time to antibiotics, where time zero is the time the patient was booked in, which is also the triage time; (ii) the volume of intravenous fluid (IVF); (iii) the mean arterial pressure (MAP) after 2000 ml of IVF; and (iv) the final MAP in the ED. METHODS: We performed a retrospective analysis of the ED database of patients aged ≥18 years who met two SIRS criteria or one red flag sepsis criterion on arrival, received intravenous antibiotics for a suspected infection and were admitted between 8th February 2016 and 31st August 2017. The primary outcome measure was all-cause in-hospital mortality. The four treatments stated above were controlled for severity of illness and subject to multivariate logistic regression and Cox proportional-hazards regression to identify independent predictors of mortality. RESULTS: Of the 2,066 patients studied, 272 (13.2%) died in hospital. The median time to antibiotics was 48 (interquartile range 30-82) minutes. The time to antibiotics was an independent predictor of mortality only in those who developed refractory hypotension (RH); antibiotics administered more than 55 min after arrival were associated with an odds ratio (OR) for mortality of 2.75 [95% confidence interval (CI) 1.22-6.14]; the number-needed-to-treat was 4. IVF > 2000 ml (95% CI > 500- > 2100), except in RH, and a MAP ≤ 66 mmHg after 2000 ml of IVF were also independent predictors of mortality. The OR for mortality of IVF > 2,000 ml in non-RH was 1.80 (95% CI 1.15-2.82); the number-needed-to-harm was 14. The OR for mortality of a MAP ≤ 66 mmHg after 2000 ml of IVF was 3.42 (95% CI 2.10-5.57).
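The odds ratios and number-needed-to-treat/harm figures reported above are standard quantities derived from a 2x2 exposure/outcome table. A minimal sketch, using invented counts rather than the study's data:

```python
# Effect measures from a 2x2 table: the odds ratio, and the number-
# needed-to-treat (or -harm) as the reciprocal of the absolute risk
# difference. The counts below are hypothetical, for illustration only.

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: a/b = deaths/survivors exposed, c/d = unexposed."""
    return (a / b) / (c / d)

def number_needed(a, b, c, d):
    """Reciprocal of the absolute risk difference between the two groups."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    return 1 / abs(risk_exposed - risk_unexposed)

# e.g. 30 deaths / 70 survivors exposed vs 10 deaths / 90 survivors not:
print(round(odds_ratio(30, 70, 10, 90), 2))   # 3.86
print(round(number_needed(30, 70, 10, 90)))   # 5
```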
Prognostic performance of the REDS score, SOFA score, NEWS2 score, and the red-flag, NICE high-risk, and SIRS criteria to predict survival at 180 days, in emergency department patients admitted with suspected sepsis - An observational cohort study.
BACKGROUND: Patients admitted to hospital with sepsis are at persistent risk of poor outcome after discharge. Many tools are available to risk-stratify sepsis patients for in-hospital mortality. This study aimed to identify the best risk-stratification tool to prognosticate outcome 180 days after admission via the emergency department (ED) with suspected sepsis. METHODS: A retrospective observational cohort study was performed of adult ED patients who were admitted after receiving intravenous antibiotics for the treatment of suspected sepsis between 1st March and 31st August 2019. The Risk-stratification of ED suspected Sepsis (REDS) score, SOFA score, Red-flag sepsis criteria met, NICE high-risk criteria met, NEWS2 score and SIRS criteria were calculated for each patient. Death and survival at 180 days were noted. Patients were stratified into high- and low-risk groups as per accepted criteria for each risk-stratification tool. Kaplan-Meier curves were plotted for each tool and the log-rank test performed. The tools were compared using Cox proportional-hazards regression (CPHR). The tools were studied further in those without the following specified co-morbidities: dementia, malignancy, a Rockwood Frailty score of 6 or more, long-term oxygen therapy and previous do-not-resuscitate orders. RESULTS: Of the 1,057 patients studied, 146 (13.8%) had died by hospital discharge and 284 were known to have died within 180 days. The overall survival proportion was 74.4% at 180 days, and 8.6% of the population was censored before 180 days. Only the REDS and SOFA scores identified less than 50% of the population as high-risk. All tools except the SIRS criteria prognosticated for outcome at 180 days; log-rank tests between high- and low-risk groups were: REDS score p < 0.0001, SOFA score p < 0.0001, Red-flag criteria p = 0.001, NICE high-risk criteria p = 0.0001, NEWS2 score p = 0.003 and SIRS criteria p = 0.98.
On CPHR, the REDS [hazard ratio (HR) 2.54 (1.92-3.35)] and SOFA [HR 1.58 (1.24-2.03)] scores outperformed the other risk-stratification tools. In patients without the specified co-morbidities, only the REDS and SOFA scores risk-stratified for outcome at 180 days. CONCLUSION: In this study, all the risk-stratification tools studied, except the SIRS criteria, were found to prognosticate for outcome at 180 days. The REDS and SOFA scores outperformed the other tools.
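The Kaplan-Meier curves compared above can be computed with the standard product-limit method, which accounts for the right-censoring noted in the results (8.6% of patients censored before 180 days). A minimal pure-Python sketch, with invented follow-up times rather than the study's data:

```python
# Product-limit (Kaplan-Meier) survival estimate with right-censoring:
# at each death time, survival is multiplied by the fraction of the
# at-risk set surviving that time; censored patients leave the at-risk
# set without reducing the survival estimate.

def kaplan_meier(times, events):
    """Return (time, survival) points; events[i] is True for a death,
    False for a censored observation."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e)
        n_at_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            survival *= (at_risk - deaths) / at_risk
            curve.append((t, survival))
        at_risk -= n_at_t
        i += n_at_t
    return curve

curve = kaplan_meier([5, 10, 10, 20, 30], [True, True, False, False, True])
# survival steps down only at death times; the censored times shrink the
# at-risk set without producing a step
```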
Above and beyond state-of-the-art approaches to investigate sequence data: summary of methods and results from the population-based association group at the Genetic Analysis Workshop 19
Ischemia reperfusion dysfunction changes model-estimated kinetics of myofilament interaction due to inotropic drugs in isolated hearts
BACKGROUND: The phase-space relationship between simultaneously measured myoplasmic [Ca(2+)] and isovolumetric left ventricular pressure (LVP) in guinea pig intact hearts is altered by ischemic and inotropic interventions. Our objective was to mathematically model this phase-space relationship between [Ca(2+)] and LVP with a focus on the changes in cross-bridge kinetics and myofilament Ca(2+) sensitivity responsible for alterations in Ca(2+)-contraction coupling due to inotropic drugs in the presence and absence of ischemia reperfusion (IR) injury. METHODS: We used a four-state computational model to predict LVP using experimentally measured, averaged myoplasmic [Ca(2+)] transients from unpaced, isolated guinea pig hearts as the model input. Values of model parameters were estimated by minimizing the error between experimentally measured LVP and model-predicted LVP. RESULTS: We found that IR injury resulted in reduced myofilament Ca(2+) sensitivity, and decreased cross-bridge association and dissociation rates. Dopamine (8 μM) reduced myofilament Ca(2+) sensitivity before, but enhanced it after ischemia while improving cross-bridge kinetics before and after IR injury. Dobutamine (4 μM) reduced myofilament Ca(2+) sensitivity while improving cross-bridge kinetics before and after ischemia. Digoxin (1 μM) increased myofilament Ca(2+) sensitivity and cross-bridge kinetics after but not before ischemia. Levosimendan (1 μM) enhanced myofilament Ca(2+) affinity and cross-bridge kinetics only after ischemia. CONCLUSION: Estimated model parameters reveal mechanistic changes in Ca(2+)-contraction coupling due to IR injury, specifically the inefficient utilization of Ca(2+) for contractile function with diastolic contracture (increase in resting diastolic LVP). The model parameters also reveal drug-induced improvements in Ca(2+)-contraction coupling before and after IR injury.
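The parameter-estimation step described above, choosing model parameters that minimize the error between measured and model-predicted LVP, can be sketched schematically. The four-state cross-bridge model itself is not reproduced here; a toy one-parameter gain model stands in for it, and both traces are synthetic:

```python
# Schematic parameter estimation by error minimization: a grid search
# over candidate parameter values, scored by the sum of squared errors
# between the "measured" pressure trace and the model prediction.
# The model here (LVP_pred = k * [Ca2+]) is a deliberately trivial
# stand-in for the paper's four-state model.

def sum_squared_error(predicted, measured):
    return sum((p - m) ** 2 for p, m in zip(predicted, measured))

def fit_gain(ca_transient, measured_lvp, candidates):
    """Grid-search the gain k minimizing the error of k * [Ca2+]."""
    best_k, best_err = None, float("inf")
    for k in candidates:
        err = sum_squared_error([k * c for c in ca_transient], measured_lvp)
        if err < best_err:
            best_k, best_err = k, err
    return best_k

ca = [0.1, 0.5, 1.0, 0.6, 0.2]        # synthetic Ca2+ transient
lvp = [2.0, 10.0, 20.0, 12.0, 4.0]    # synthetic pressure trace
k = fit_gain(ca, lvp, [c / 10 for c in range(1, 301)])  # recovers k = 20
```

In the study the same idea applies with a far richer parameter set (cross-bridge rates, Ca(2+) sensitivity) and a proper optimizer in place of the grid search.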
Optically stimulated luminescence ages for human occupation during the penultimate glaciation in the western Loess Plateau of China
The chronology of only a few Palaeolithic sites in the Loess Plateau of China has been well investigated for the Middle Pleistocene. In this study, the Yangshang Palaeolithic site, with significant archaeological remains and located in Gansu Province in the western Chinese Loess Plateau, was dated using blue-stimulated optically stimulated luminescence (OSL) and infrared stimulated luminescence (IRSL) techniques on eight medium-grained (45-63 µm) quartz and polymineral samples through the Palaeolithic sequence. Except for one sample from the upper cultural layer (55 ± 3 ka), age estimates obtained by quartz OSL dating exceeded its upper datable limit, so an elevated-temperature post-IR IRSL SAR protocol was used on 45-63 μm polymineral grains to derive a numerical chronology for the site. The luminescence ages of seven samples from six cultural layers lie between 104 ± 5 and 220 ± 11 ka; for the three main cultural layers the results varied from 149 ± 9 to 186 ± 10 ka, consistent with the presently observed stratigraphy. The dating results show that early hominins may have occupied this region of the western Loess Plateau of China during Marine Isotope Stage (MIS) 7, early glacial MIS 6 and early interglacial MIS 5, reflecting their ability to adapt to variable environments.
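The ages above follow the standard luminescence relation age (ka) = equivalent dose (Gy) / dose rate (Gy/ka). A minimal sketch with quadrature error propagation; the values are invented for illustration, not the Yangshang sample measurements:

```python
# Luminescence age from equivalent dose and environmental dose rate,
# with the relative uncertainties combined in quadrature. All numbers
# below are hypothetical.
import math

def osl_age(de, de_err, dose_rate, dr_err):
    """Return (age, uncertainty) in ka; de in Gy, dose_rate in Gy/ka."""
    age = de / dose_rate
    rel = math.sqrt((de_err / de) ** 2 + (dr_err / dose_rate) ** 2)
    return age, age * rel

age, err = osl_age(450.0, 20.0, 3.0, 0.1)  # ~150 +/- 8 ka
```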
Conserved presence of G-quadruplex forming sequences in the Long Terminal Repeat Promoter of Lentiviruses
G-quadruplexes (G4s) are secondary structures of nucleic acids that epigenetically regulate cellular processes. In human immunodeficiency virus 1 (HIV-1), dynamic G4s are located in the unique viral LTR promoter. Folding of HIV-1 LTR G4s inhibits viral transcription; stabilization by G4 ligands intensifies this effect. Cellular proteins modulate viral transcription by inducing/unfolding LTR G4s. Here we expanded our investigation of the presence of LTR G4s to all lentiviruses. G4s in the 5'-LTR U3 region were completely conserved in primate lentiviruses. A G4 was also present in a cattle-infecting lentivirus. All other non-primate lentiviruses displayed hints of less stable G4s. In primate lentiviruses, the ability to fold into G4s was highly conserved among strains. LTR G4 sequences were very similar among phylogenetically related primate viruses, while they increasingly differed in viruses that diverged early from a common ancestor. A strong correlation between primate lentivirus LTR G4s and Sp1/NF-κB binding sites was found. All LTR G4s folded: their complexity was assessed by polymerase stop assay. Our data support a role of the lentivirus 5'-LTR G4 region as a control centre of viral transcription, where folding/unfolding of G4s and multiple recruitment of factors based on both sequence and structure may take place.
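Putative G4-forming regions such as the LTR G4s discussed above are commonly located with the canonical four-tract motif: four runs of three or more guanines separated by loops of 1-7 bases. This generic motif scan is a widely used bioinformatic heuristic, not necessarily the method used in the study, and the sequence below is a made-up example:

```python
# Scan a DNA sequence for the canonical putative-G4 motif:
# G{3,} followed by three repeats of (loop of 1-7 bases + G{3,}).
import re

G4_MOTIF = re.compile(r"G{3,}(?:[ACGT]{1,7}G{3,}){3}")

def find_g4s(seq):
    """Return (start, matched_sequence) for each putative G4 motif."""
    return [(m.start(), m.group()) for m in G4_MOTIF.finditer(seq.upper())]

hits = find_g4s("ttGGGaGGGttGGGcaGGGtt")  # one putative G4 at index 2
```

Real G4 predictors score candidate regions (e.g. by run length and loop composition) rather than using a bare regular expression, but the motif is the usual starting point.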
Tumor innate immunity primed by specific interferon-stimulated endogenous retroviruses.
Mesenchymal tumor subpopulations secrete pro-tumorigenic cytokines and promote treatment resistance [1-4]. This phenomenon has been implicated in chemorefractory small cell lung cancer and resistance to targeted therapies [5-8], but remains incompletely defined. Here, we identify a subclass of endogenous retroviruses (ERVs) that engages innate immune signaling in these cells. Stimulated 3 prime antisense retroviral coding sequences (SPARCS) are oriented inversely in 3' untranslated regions of specific genes enriched for regulation by STAT1 and EZH2. Derepression of these loci results in double-stranded RNA generation following IFN-γ exposure, owing to bi-directional transcription from the STAT1-activated gene promoter and the 5' long terminal repeat of the antisense ERV. Engagement of MAVS and STING activates downstream TBK1, IRF3 and STAT1 signaling, sustaining a positive feedback loop. SPARCS induction in human tumors is tightly associated with major histocompatibility complex class I expression, mesenchymal markers, and downregulation of chromatin-modifying enzymes, including EZH2. Analysis of cell lines with high inducible SPARCS expression reveals a strong association with an AXL/MET-positive mesenchymal cell state. While SPARCS-high tumors are immune infiltrated, they also exhibit multiple features of an immune-suppressed microenvironment. Together, these data unveil a subclass of ERVs whose derepression triggers pathologic innate immune signaling in cancer, with important implications for cancer immunotherapy.
Conodonts in Ordovician biostratigraphy
The long time interval after Pander's (1856) original conodont study can, in terms of Ordovician conodont biostratigraphical research, be subdivided into three periods, namely the Pioneer Period (1856-1955), the Transition Period (1955-1971) and the Modern Period (1971-Recent). During the pre-1920s, the few published conodont investigations were restricted to Europe and North America and were not concerned about the potential use of conodonts as guide fossils. Although primarily of taxonomic nature, the pioneer studies by Branson & Mehl, Stauffer and Furnish during the 1930s represent the beginning of the use of conodonts in Ordovician biostratigraphy. However, no formal zones were introduced until Lindström (1955) proposed four conodont zones in the Lower Ordovician of Sweden, which marks the end of the Pioneer Period. Because Lindström's zone classification was not followed by similar work outside Baltoscandia, the time interval up to the late 1960s can be regarded as a Transition Period. A milestone symposium volume, entitled 'Symposium on Conodont Biostratigraphy' and published in 1971, summarized much new information on Ordovician conodont biostratigraphy and is taken as the beginning of the Modern Period of Ordovician conodont biostratigraphy. In this volume, the Baltoscandic Ordovician was subdivided into named conodont zones, whereas the North American Ordovician succession was classified into a series of lettered or numbered faunas. Although most of the latter did not receive zone names until 1984, this classification has been used widely in North America. The Middle and Upper Ordovician Baltoscandic zone classification, which was largely based on evolutionary species changes in lineages and hence includes phylozones, has subsequently undergone only minor changes and has been used slightly modified also in some other regions, such as New Zealand, China and eastern North America.
The great importance of conodonts in Ordovician biostratigraphy is shown by the fact that conodonts are used for the definition of two of the seven global stages, and seven of the 20 stage slices, now recognized within this system.