
    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background: A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
    Methods: Patient- and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
    Results: A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6 to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
    Conclusion: We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be used in future research to compare outcomes reliably according to case mix and intra-operative difficulty.
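
    The analysis above correlates an ordinal difficulty grade with dichotomous outcomes (Kendall's tau) and quantifies discrimination with ROC curves. The minimal sketch below illustrates those two steps on synthetic data; the variable names (grade, converted) and the simulated relationship are illustrative assumptions, not the study's data or code.

```python
# Sketch: associate an ordinal difficulty grade with a binary outcome and
# quantify discrimination. Synthetic data, illustrative only.
import numpy as np
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
grade = rng.integers(1, 6, size=n)                 # assumed Nassar grade 1-5 (synthetic)
# Synthetic dichotomous outcome whose risk rises with grade (e.g. conversion to open)
p_event = 1 / (1 + np.exp(-(grade - 4)))
converted = rng.binomial(1, p_event)

tau, p_value = kendalltau(grade, converted)        # association with a dichotomous outcome
auroc = roc_auc_score(converted, grade)            # grade used as a predictor of the outcome
print(f"Kendall's tau = {tau:.3f} (p = {p_value:.4f}), AUROC = {auroc:.3f}")
```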

    Basis for treatment of tuberculosis among HIV-infected patients in Tanzania: the role of chest x-ray and sputum culture

    BACKGROUND: Active tuberculosis (TB) is common among HIV-infected persons living in TB-endemic countries, and routine screening for TB is recommended. We sought to determine the role of chest x-ray and sputum culture in the decision to treat for presumptive TB using active case finding in a large cohort of HIV-infected patients.
    METHODS: Ambulatory HIV-positive subjects with CD4 counts ≥ 200/mm3 entering a Phase III TB vaccine study in Tanzania were screened for TB with a physical examination, standard interview, CD4 count, chest x-ray (CXR), blood culture for TB, and three sputum samples for acid-fast bacillus (AFB) smear and culture.
    RESULTS: Among 1176 subjects, 136 (12%) were treated for presumptive TB. These patients were more frequently male than those not treated (34% vs. 25%; p = 0.049) and had lower median CD4 counts (319/μL vs. 425/μL; p < 0.0001). Among the 136 patients treated for TB, 38 (28%) had microbiologic confirmation, including 13 (10%) who had a normal CXR and no symptoms. In 58 (43%) of the treated patients, the only positive finding was an abnormal CXR. Blood cultures were negative in all patients.
    CONCLUSION: Many ambulatory HIV-infected patients with CD4 counts ≥ 200/mm3 are treated for presumptive TB. Our data suggest that optimal detection requires comprehensive evaluation, including CXR and sputum culture, in both symptomatic and asymptomatic subjects.
    Funding: National Institutes of Health (A1 45407); Fogarty International Center (D43-TW006807).
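
    The group comparisons reported above (sex distribution and CD4 counts in treated versus untreated patients) are of the kind sketched below. The counts and CD4 values are synthetic stand-ins chosen to roughly match the reported percentages; the tests shown (chi-square, Mann-Whitney) are common choices for such comparisons, not necessarily those used in the study.

```python
# Sketch: compare treated vs. untreated groups on a categorical variable (sex)
# and a skewed continuous variable (CD4 count). Made-up numbers, illustrative only.
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu

# 2x2 table of sex by treatment status (illustrative counts: 136 treated, 1040 not)
table = np.array([[46, 260],    # male:   treated, not treated
                  [90, 780]])   # female: treated, not treated
chi2, p_sex, _, _ = chi2_contingency(table)

rng = np.random.default_rng(1)
cd4_treated = rng.normal(320, 80, 136)      # synthetic CD4 counts, treated group
cd4_untreated = rng.normal(425, 90, 1040)   # synthetic CD4 counts, untreated group
u_stat, p_cd4 = mannwhitneyu(cd4_treated, cd4_untreated)
print(f"sex: p = {p_sex:.3f}; CD4: p = {p_cd4:.2g}")
```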

    Environmental effects and individual body condition drive seasonal fecundity of rabbits: identifying acute and lagged processes

    The reproduction of many species is determined by seasonally driven resource supply, but it is difficult to quantify whether fecundity is sensitive to short- or long-term exposure to environmental conditions, such as rainfall, that drive resource supply. Using 25 years of data on the individual fecundity of female European rabbits, Oryctolagus cuniculus, from semiarid Australia, we investigate the roles of individual body condition, rainfall and temperature as drivers of seasonal, long-term and population-level changes in fecundity (breeding probability, ovulation rate, embryo survival). We built distributed lag models in a hierarchical Bayesian framework to account for both immediate and time-lagged effects of climate and other environmental drivers, and for possible shifts in reproduction over consecutive seasons. We show that rainfall during summer, when rabbits typically breed only rarely, increased breeding probability immediately and with time lags of up to 10 weeks. However, an earlier onset of the yearly breeding period did not result in greater overall reproductive output. Better body condition was associated with an earlier onset of breeding and higher embryo survival. Breeding probability in the main breeding season declined with increased breeding activity in the preceding season, and only individuals in good body condition were able to breed late in the season. Higher temperatures reduced breeding success across seasons. We conclude that a better understanding of seasonal dynamics and plasticity (and their interplay) in reproduction will provide crucial insights into how lagomorphs are likely to respond, and potentially adapt, to future climate and other environmental change.
    Konstans Wells, Robert B. O’Hara, Brian D. Cooke, Greg J. Mutze, Thomas A.A. Prowse, Damien A. Fordham
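
    A distributed lag model relates the outcome at time t to a covariate measured at several preceding time steps. The sketch below is a simplified, non-Bayesian stand-in for the hierarchical Bayesian distributed lag models described above: it builds weekly rainfall lags of 0 to 10 weeks and fits a logistic regression for breeding probability on synthetic data. Column names and the lag length are assumptions for illustration.

```python
# Sketch: unconstrained distributed-lag logistic regression of breeding
# probability on current and lagged weekly rainfall. Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
weeks = 25 * 52
rain = rng.gamma(shape=1.2, scale=8.0, size=weeks)   # synthetic weekly rainfall (mm)

max_lag = 10
lagged = pd.DataFrame(
    {f"rain_lag{l}": pd.Series(rain).shift(l) for l in range(max_lag + 1)}
)
# Synthetic breeding outcome whose probability rises with recent and lagged rainfall
lin_pred = -2.0 + 0.02 * lagged.fillna(0).sum(axis=1)
bred = rng.binomial(1, 1 / (1 + np.exp(-lin_pred)))

data = lagged.dropna()                     # drop the first max_lag weeks (incomplete lags)
y = bred[data.index]
X = sm.add_constant(data)
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()  # one coefficient per lag
print(fit.params)                          # lag-specific rainfall effects on the logit scale
```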

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background: The aim was to describe the management of benign gallbladder disease and to identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort.
    Methods: Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling, using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2).
    Results: Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions or complications.
    Conclusion: Readmissions and complications following cholecystectomy are common and are associated with patient and disease characteristics.
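
    The two-level model described above can be sketched as a logistic regression with a random intercept for each hospital. The example below uses statsmodels' Bayesian mixed GLM on synthetic data; the data frame and column names (readmitted, asa_grade, op_duration, prior_admissions, hospital) are assumptions for illustration, not the study's variables or code.

```python
# Sketch: two-level logistic model with patients (level 1) nested within
# hospitals (level 2) via a hospital random intercept. Synthetic data only.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(3)
n, n_hosp = 2000, 40
df = pd.DataFrame({
    "hospital": rng.integers(0, n_hosp, n),
    "asa_grade": rng.integers(1, 5, n),
    "op_duration": rng.normal(60, 20, n),
    "prior_admissions": rng.poisson(0.5, n),
})
# Synthetic outcome with a small hospital-level effect on top of patient covariates
hosp_effect = rng.normal(0, 0.3, n_hosp)[df["hospital"]]
logit = (-3 + 0.4 * df["asa_grade"] + 0.01 * df["op_duration"]
         + 0.3 * df["prior_admissions"] + hosp_effect)
df["readmitted"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = BinomialBayesMixedGLM.from_formula(
    "readmitted ~ asa_grade + op_duration + prior_admissions",
    {"hospital": "0 + C(hospital)"},   # random intercept for each hospital (level 2)
    df,
)
result = model.fit_vb()                # variational Bayes fit
print(result.summary())
```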