
    Risk prediction to inform surveillance of chronic kidney disease in the US Healthcare Safety Net: a cohort study.

    Background The capacity of electronic health record (EHR) data to guide targeted surveillance in chronic kidney disease (CKD) is unclear. We sought to leverage EHR data to predict the risk of progressing from CKD to end-stage renal disease (ESRD) and thereby help inform surveillance of CKD among vulnerable patients in the healthcare safety net. Methods We conducted a retrospective cohort study of adults (n = 28,779) with CKD who received care within 2 regional safety-net health systems in the Western United States during 1996-2009. The primary outcomes were progression to ESRD and death, ascertained by linkage with the United States Renal Data System and the Social Security Administration Death Master File, respectively, through September 29, 2011. We evaluated the performance of 3 models that included demographic, comorbidity and laboratory data to predict progression of CKD to ESRD in conditions commonly targeted for disease management (hypertension, diabetes, chronic viral diseases and severe CKD), using traditional discriminatory criteria (AUC) and recent criteria intended to guide population health management strategies. Results Overall, 1730 persons progressed to ESRD and 7628 died during a median follow-up of 6.6 years. Performance of risk models incorporating common EHR variables was highest in hypertension, intermediate in diabetes and chronic viral diseases, and lowest in severe CKD. Surveillance of persons in the highest quintile of ESRD risk yielded 83-94%, 74-95%, and 75-82% of cases that progressed to ESRD among patients with hypertension, diabetes and chronic viral diseases, respectively. Similar surveillance yielded 42-71% of ESRD cases among those with severe CKD. Discrimination in all conditions was universally high (AUC ≥0.80) when evaluated using traditional criteria. Conclusions Recently proposed discriminatory criteria account for varying risk distributions and, when applied to common clinical conditions, may help inform surveillance of CKD in diverse populations.
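    As a rough illustration (not the study's actual models), the surveillance "yield" reported above can be read as the share of eventual ESRD cases captured when follow-up is limited to the highest quintile of predicted risk, reported alongside the traditional AUC criterion. The minimal Python sketch below shows both calculations on simulated data; all names and values are hypothetical.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        risk_score = rng.uniform(size=1000)                      # hypothetical predicted ESRD risk
        progressed = rng.uniform(size=1000) < 0.2 * risk_score   # simulated ESRD outcomes

        # Traditional discriminatory criterion: area under the ROC curve (AUC).
        auc = roc_auc_score(progressed, risk_score)

        # Surveillance yield: fraction of ESRD cases falling in the top risk quintile.
        cutoff = np.quantile(risk_score, 0.80)
        top_quintile = risk_score >= cutoff
        yield_top = progressed[top_quintile].sum() / progressed.sum()

        print(f"AUC = {auc:.2f}; top-quintile yield = {yield_top:.0%}")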

    Contemplative Science: An Insider's Prospectus

    This chapter describes the potential far‐reaching consequences of contemplative higher education for the fields of science and medicine

    Child-stripping in the Victorian City

    During the nineteenth century, police, magistrates, reformers and the press noticed a rising tide of juvenile crime. Child-stripping, the crime of stealing young children's clothes by force or deception, was an activity of this type which caused alarm among contemporaries. As the century progressed, improved policing, urbanization and Irish migration, allied to growing social concern, caused more cases of child-stripping to be noticed. Accounts by Dickens, Mayhew and others characterized child-stripping as an activity indulged in by old women who were able to make money by victimizing the weakest strata of society. However, research in the British Library's digitized newspaper collections as well as in parliamentary papers conclusively demonstrates that child-stripping, far from being the domain of Dickensian crones, was actually perpetrated by older children, notably girls, against children even younger than themselves. Despite widespread revulsion, which at times approached a ‘moral panic’ prompted by the nature of the crime, progressive attitudes largely prevailed with most child-stripping children being sent to reformatories or industrial schools in the hope of reforming their behaviour. This article thus conforms with Foucauldian notions of the switch from physical to mental punishments and aligns with the Victorians’ invention of children as a category of humanity that could be saved

    Participant retention practices in longitudinal clinical research studies with high retention rates

    Background There is a need for improving cohort retention in longitudinal studies. Our objective was to identify cohort retention strategies and implementation approaches used in studies with high retention rates. Methods Longitudinal studies with ≥200 participants and ≥80% retention rates over ≥1 year of follow-up were queried from an Institutional Review Board database at a large research-intensive U.S. university; additional studies were identified through networking. Nineteen (86%) of 22 eligible studies agreed to participate. Through in-depth semi-structured interviews, participants provided retention strategies based on themes identified from previous literature reviews. Synthesis of data was completed by a multidisciplinary team. Results The most commonly used retention strategies were study reminders, study visit characteristics, emphasizing study benefits, and contact/scheduling strategies. The research teams were well-functioning, organized, and persistent. Additionally, teams tailored their strategies to their participants, often adapting and innovating their approaches. Conclusions These studies included specialized and persistent teams and utilized tailored strategies specific to their cohort and individual participants. Studies’ written protocols and published manuscripts often did not reflect the varied strategies employed and adapted over the duration of the study. Appropriate retention strategy use requires cultural sensitivity, and more research is needed to identify how strategy use varies globally.

    The Limits of Anthropocene Narratives

    The rapidly growing transdisciplinary enthusiasm about developing new kinds of Anthropocene stories is based on the shared assumption that the Anthropocene predicament is best made sense of by narrative means. Against this assumption, this article argues that the challenge we are facing today does not merely lie in telling either scientific, socio-political, or entangled Anthropocene narratives to come to terms with our current condition. Instead, the challenge lies in coming to grips with how the stories we can tell in the Anthropocene relate to the radical novelty of the Anthropocene condition about which no stories can be told. What we need to find are meaningful ways to reconcile an inherited commitment to narrativization and the collapse of storytelling as a vehicle of understanding the Anthropocene as our current predicament

    Fluid accumulation, recognition and staging of acute kidney injury in critically-ill patients

    Introduction Serum creatinine concentration (sCr) is the marker used for diagnosing and staging acute kidney injury (AKI) in the RIFLE and AKIN classification systems, but it is influenced by several factors, including its volume of distribution. We evaluated the effect of fluid accumulation on the ability of sCr to estimate the severity of AKI. Methods In 253 patients recruited from a prospective observational study of critically-ill patients with AKI, we calculated cumulative fluid balance and computed a fluid-adjusted sCr concentration reflecting the effect of volume of distribution during the development phase of AKI. We compared the time to reach a relative 50% increase from the reference sCr using the crude versus the adjusted sCr. We defined late recognition of AKI severity as present when this time to reach a 50% relative increase differed by more than 24 hours between the crude and adjusted sCr. Results The median cumulative fluid balance increased from 2.7 liters on day 2 to 6.5 liters on day 7. The difference between adjusted and crude sCr was significantly higher at each time point and progressively increased from a median of 0.09 mg/dL to 0.65 mg/dL after six days. Sixty-four patients (25%) met the criteria for late recognition of AKI severity progression. This group of patients had lower urine output and higher daily and cumulative fluid balance during the development phase of AKI. They were more likely to need dialysis but showed no difference in mortality compared with patients who did not meet the criteria for late recognition of severity progression. Conclusions In critically-ill patients, the dilution of sCr by fluid accumulation may lead to underestimation of the severity of AKI and increases the time required to identify a 50% relative increase in sCr. A simple formula to correct sCr for fluid balance can improve staging of AKI and provide a better parameter for earlier recognition of severity progression.
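    The abstract refers to "a simple formula to correct sCr for fluid balance" without spelling it out. Below is a minimal sketch assuming a commonly cited adjustment in which measured sCr is scaled by the relative expansion of total body water (approximated as 0.6 × admission weight in kg) from cumulative fluid balance; the coefficient, variable names, and exact form are illustrative assumptions and may differ from the study's own formula.

        def adjusted_creatinine(scr_mg_dl: float,
                                cumulative_fluid_balance_l: float,
                                admission_weight_kg: float) -> float:
            """Correct serum creatinine for dilution by accumulated fluid (illustrative only)."""
            total_body_water_l = 0.6 * admission_weight_kg  # assumed estimate of baseline total body water
            correction = (total_body_water_l + cumulative_fluid_balance_l) / total_body_water_l
            return scr_mg_dl * correction

        # Example: sCr 1.2 mg/dL with +6.5 L cumulative balance in a 70 kg patient
        print(round(adjusted_creatinine(1.2, 6.5, 70.0), 2))  # ~1.39 mg/dL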

    Total and corrected antioxidant capacity in hemodialyzed patients

    BACKGROUND: Oxidative stress may play a critical role in the vascular disease of end-stage renal failure and hemodialysis patients. Studies analyzing either discrete antioxidant analytes and substances or the integrated total antioxidant activity of human plasma during hemodialysis give contradictory results. METHODS: Recently, we introduced a new automated method for the determination of the Total Antioxidant Capacity (TAC) of human plasma. We serially measured TAC and corrected TAC (cTAC: after subtraction of the interactions due to endogenous uric acid, bilirubin and albumin) in 10 patients before the onset of the dialysis session; at 10 min, 30 min, 1 h, 2 h and 3 h into the procedure; and after completion of the session. RESULTS: Our results indicate that TAC decreases, reaching minimum levels at 2 h, whereas corrected TAC increases with a half-time (t1/2) of about 30 min. We then repeated the measurements in 65 patients undergoing dialysis with different filters (36 patients with an ethylene vinyl alcohol copolymer resin filter [Eval], 23 patients with two polysulfone filters [10 with F6 and 13 with PSN140], and 6 patients with hemophan filters). Three specimens were collected (0, 30, 240 min). The results in this second group confirm our initial results, and no significant difference was observed between filter types. CONCLUSIONS: Our results are discussed from the point of view of possible mechanisms of modification of endogenous antioxidants and the interaction of lipid- and water-soluble antioxidants.

    Sepsis as a cause and consequence of acute kidney injury: Program to Improve Care in Acute Renal Disease

    Sepsis commonly contributes to acute kidney injury (AKI); however, the frequency with which sepsis develops as a complication of AKI and the clinical consequences of this sepsis are unknown. This study examined the incidence of, and outcomes associated with, sepsis developing after AKI. We analyzed data from 618 critically ill patients enrolled in a multicenter observational study of AKI (PICARD). Patients were stratified according to their sepsis status and timing of incident sepsis relative to AKI diagnosis. We determined the associations among sepsis, clinical characteristics, provision of dialysis, in-hospital mortality, and length of stay (LOS), comparing outcomes among patients according to their sepsis status. Among the 611 patients with data on sepsis status, 174 (28%) had sepsis before AKI, 194 (32%) remained sepsis-free, and 243 (40%) developed sepsis a median of 5 days after AKI. Mortality rates for patients with sepsis developing after AKI were higher than in sepsis-free patients (44 vs. 21%; p < 0.0001) and similar to patients with sepsis preceding AKI (48 vs. 44%; p = 0.41). Compared with sepsis-free patients, those with sepsis developing after AKI were also more likely to be dialyzed (70 vs. 50%; p < 0.001) and had longer LOS (37 vs. 27 days; p < 0.001). Oliguria, higher fluid accumulation and severity of illness scores, non-surgical procedures after AKI, and provision of dialysis were predictors of sepsis after AKI. Sepsis frequently develops after AKI and portends a poor prognosis, with high mortality rates and relatively long LOS. Future studies should evaluate techniques to monitor for and manage this complication to improve overall prognosis