
    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors for the model. Internal validation was carried out by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with the odds of death increasing with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery, and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
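
    The abstract does not include the modelling code, but the approach it describes (a logistic prognostic model on six preoperative variables, internally validated by bootstrapping the c-statistic) can be sketched roughly as follows. The variable names, synthetic data and scikit-learn pipeline below are illustrative assumptions, not the study's actual dataset or workflow.

        # Hypothetical sketch: fit a logistic prognostic model on the six reported
        # predictors and estimate an optimism-corrected c-statistic (AUROC) by
        # bootstrap validation. Column names and data are illustrative placeholders.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 4544  # cohort size reported in the abstract

        # Synthetic stand-in data for the six selected predictors.
        X = pd.DataFrame({
            "age": rng.normal(65, 12, n),
            "male_sex": rng.integers(0, 2, n),
            "asa_grade": rng.integers(1, 5, n),
            "preop_egfr": rng.normal(75, 20, n),
            "planned_open_surgery": rng.integers(0, 2, n),
            "acei_or_arb": rng.integers(0, 2, n),
        })
        # Placeholder outcome at roughly the reported 14 per cent AKI rate.
        y = (rng.random(n) < 0.14).astype(int)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

        # Bootstrap optimism correction (simplified Harrell-style procedure).
        optimism = []
        for _ in range(200):
            idx = rng.integers(0, n, n)                      # bootstrap resample
            boot = LogisticRegression(max_iter=1000).fit(X.iloc[idx], y[idx])
            auc_boot = roc_auc_score(y[idx], boot.predict_proba(X.iloc[idx])[:, 1])
            auc_orig = roc_auc_score(y, boot.predict_proba(X)[:, 1])
            optimism.append(auc_boot - auc_orig)
        print("optimism-corrected c-statistic:", apparent - np.mean(optimism))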

    Beyond equilibrium climate sensitivity


    Correlation methods in fingerprint detection studies

    This investigation addresses two general issues regarding the role of pattern similarity statistics in greenhouse warming detection studies: normalization, and the relative merits of centered versus uncentered statistics. A pattern correlation statistic is used to search for the greenhouse warming signals predicted by five different models in the observed records of land and ocean surface temperature changes. Two forms of this statistic were computed: R(t), which makes use of non-normalized data, and R̂(t), which employs point-wise normalized data in order to focus the search on regions where the signal-to-noise ratio is large. While there are no trends in the R(t) time series, the time series of R̂(t) show large positive trends. However, it is not possible to infer from the R̂(t) results that the observed pattern of temperature change is, in fact, becoming increasingly similar to the model-predicted signal. This is because point-wise normalization of the observed and simulated mean change fields by a single common field introduces a "common factor" effect, which means that the quantities being compared should show some similarity a priori. This does not necessarily make normalization inapplicable, because the detection test involves seeking a trend in the similarity statistic. We show, however, that trends in R̂(t) must arise almost completely from the observed data, and cannot be an indicator of increasing observed-data/signal similarity. We also compare the information provided by centered statistics such as R(t) and the uncentered C(t) statistic introduced by Barnett. We show that C(t) may be expressed as the weighted sum of two terms, one proportional to R(t) and the other proportional to the observed spatial mean. For near-surface temperatures, the spatial-mean term dominates over the R(t) term. In this case the use of C(t) is equivalent to the use of spatial-mean temperature. We conclude that at present, the most informative pattern correlation statistic for detection purposes is R(t), the standard product-moment correlation coefficient between the observed and model fields. Our failure to find meaningful trends in R(t) may be due to the signal being obscured by the background noise of natural variability, and/or to incorrect model signals or sensitivities.
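
    As a concrete illustration of the centered-versus-uncentered distinction discussed above, the short sketch below computes a centered product-moment statistic R(t) and an uncentered analogue C(t) on synthetic fields in which the observed spatial mean grows over time while the spatial pattern stays uncorrelated with the model signal. The definitions used are simplified, assumed forms (the exact normalization of Barnett's C(t) in the paper may differ), and all data are synthetic.

        # Simplified, assumed forms of the centered R(t) and uncentered C(t) statistics,
        # applied to synthetic fields whose spatial mean grows with time but whose
        # spatial pattern remains uncorrelated with the model signal.
        import numpy as np

        rng = np.random.default_rng(1)
        n_points = 500                                  # grid points (illustrative)
        signal = 1.0 + rng.normal(0, 1, n_points)       # model signal with a nonzero spatial mean

        def centered_R(obs, sig):
            # Standard product-moment correlation (spatial means removed).
            return np.corrcoef(obs, sig)[0, 1]

        def uncentered_C(obs, sig):
            # Uncentered analogue: raw cross-product over RMS amplitudes (assumed form).
            return np.sum(obs * sig) / np.sqrt(np.sum(obs**2) * np.sum(sig**2))

        years = np.arange(40)
        R_t, C_t = [], []
        for a in np.linspace(0.0, 1.0, len(years)):
            obs = 0.5 * a + rng.normal(0, 1, n_points)  # growing spatial mean, no pattern match
            R_t.append(centered_R(obs, signal))
            C_t.append(uncentered_C(obs, signal))

        # C(t) trends upward purely because of the growing observed spatial mean,
        # while the centered R(t) stays near zero.
        print("least-squares trend in R(t):", np.polyfit(years, R_t, 1)[0])
        print("least-squares trend in C(t):", np.polyfit(years, C_t, 1)[0])

    In this toy setup the uncentered statistic trends even though pattern similarity never increases, which mirrors the abstract's point that for near-surface temperature C(t) can be dominated by the spatial-mean term rather than by genuine pattern agreement.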

    A probabilistic quantification of the anthropogenic component of twentieth century global warming

    This paper examines in detail the statement in the 2007 IPCC Fourth Assessment Report that “Most of the observed increase in global average temperatures since the mid-twentieth century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations”. We use a quantitative probabilistic analysis to evaluate this IPCC statement, and discuss the value of the statement in the policy context. For forcing by greenhouse gases (GHGs) only, we show that there is a greater than 90 % probability that the expected warming over 1950–2005 is larger than the total amount (not just “most”) of the observed warming. This is because, following current best estimates, negative aerosol forcing has substantially offset the GHG-induced warming. We also consider the expected warming from all anthropogenic forcings using the same probabilistic framework. This requires a re-assessment of the range of possible values for aerosol forcing. We provide evidence that the IPCC estimate for the upper bound of indirect aerosol forcing is almost certainly too high. Our results show that the expected warming due to all human influences since 1950 (including aerosol effects) is very similar to the observed warming. Including the effects of natural external forcing factors has a relatively small impact on our 1950–2005 results, but improves the correspondence between model and observations over 1900–2005. Over the longer period, however, externally forced changes are insufficient to explain the early twentieth century warming. We suggest that changes in the formation rate of North Atlantic Deep Water may have been a significant contributing factor.
    T. M. L. Wigley, B. D. Santer
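
    The abstract does not reproduce the underlying calculation, but the kind of probabilistic comparison it describes can be illustrated with a small Monte Carlo sketch: sample an assumed distribution for the expected GHG-only warming over 1950–2005, compare it with an assumed observed warming, and then add an assumed (negative) aerosol contribution. All distributions and numbers below are placeholders for illustration, not the paper's estimates.

        # Illustrative Monte Carlo comparison of expected and observed 1950-2005 warming.
        # All numbers and distributions are placeholder assumptions, not the paper's values.
        import numpy as np

        rng = np.random.default_rng(2)
        n_samples = 100_000

        observed_warming = 0.6                                    # assumed observed warming (deg C)
        ghg_only = rng.lognormal(np.log(0.9), 0.25, n_samples)    # assumed expected GHG-only warming (deg C)
        aerosol = -np.abs(rng.normal(0.3, 0.15, n_samples))       # assumed negative aerosol contribution (deg C)

        # Probability that the GHG-only expected warming exceeds the total observed warming.
        p_exceed = np.mean(ghg_only > observed_warming)
        print(f"P(expected GHG-only warming > observed) = {p_exceed:.2f}")

        # Adding the (negative) aerosol term pulls the expected all-anthropogenic
        # warming back toward the observed value.
        all_anthropogenic = ghg_only + aerosol
        print(f"median expected all-anthropogenic warming: {np.median(all_anthropogenic):.2f} deg C")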