13 research outputs found

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise the description of findings and the reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results A higher operative difficulty grade was consistently associated with worse patient outcomes in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be used in future research to compare outcomes reliably according to case mix and intra-operative difficulty.
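    The AUROC statistic used in this abstract has a simple rank-based interpretation: it is the probability that a randomly chosen case with the outcome (e.g. conversion to open surgery) received a higher difficulty grade than a randomly chosen case without it, with ties counting one half. A minimal sketch on invented toy data (the grades and outcomes below are illustrative, not taken from the study):

```python
# AUROC of an ordinal score against a binary outcome, via the
# Mann-Whitney formulation: fraction of (positive, negative) pairs
# in which the positive case has the higher score (ties score 0.5).

def auroc(grades, outcomes):
    """Area under the ROC curve for an ordinal predictor."""
    pos = [g for g, y in zip(grades, outcomes) if y == 1]
    neg = [g for g, y in zip(grades, outcomes) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy cohort: conversion to open (1) becomes more frequent at higher grades.
grades   = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
outcomes = [0, 0, 0, 0, 0, 1, 0, 1, 1, 1]
print(round(auroc(grades, outcomes), 3))  # -> 0.917
```

An AUROC of 0.5 would mean the grade carries no discriminative information; values approaching 0.9, as reported for conversion to open surgery, indicate strong discrimination.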

    Environmentally Realistic Exposure to the Herbicide Atrazine Alters Some Sexually Selected Traits in Male Guppies

    Male mating signals, including ornaments and courtship displays, and other sexually selected traits, like male-male aggression, are largely controlled by sex hormones. Environmental pollutants, notably endocrine disrupting compounds, can interfere with the proper functioning of hormones, thereby impacting the expression of hormonally regulated traits. Atrazine, one of the most widely used herbicides, can alter sex hormone levels in exposed animals. I tested the effects of environmentally relevant atrazine exposures on mating signals and behaviors in male guppies, a sexually dimorphic freshwater fish. Prolonged atrazine exposure reduced the expression of two honest signals: the area of orange spots (ornaments) and the number of courtship displays performed. Atrazine exposure also reduced aggression towards competing males in the context of mate competition. In the wild, exposure levels vary among individuals because of differential distribution of the pollutants across habitats; hence, differently impacted males often compete for the same mates. Disrupted mating signals can reduce reproductive success as females avoid mating with perceptibly suboptimal males. Less aggressive males are at a competitive disadvantage and lose access to females. This study highlights the effects of atrazine on ecologically relevant mating signals and behaviors in exposed wildlife. Altered reproductive traits have important implications for population dynamics, evolutionary patterns, and conservation of wildlife species.

    The controversy of patellar resurfacing in total knee arthroplasty: Ibisne in medio tutissimus?

    Early arthroplasty designs were associated with a high level of anterior knee pain because they failed to cater for the patello-femoral joint. Patellar resurfacing was heralded as the saviour safeguarding patient satisfaction and success, but opinion on its necessity has since deeply divided the scientific community and has become synonymous with topics of religion or politics. Opponents of resurfacing contend that the native patella provides better patellar tracking and improved clinical function, and avoids implant-related complications, whilst proponents argue that patients have less pain, are overall more satisfied, and avert the need for secondary resurfacing. The question remains whether the complications associated with patellar resurfacing, including those arising from future component revision, outweigh the somewhat increased incidence of anterior knee pain recorded in unresurfaced patients. The current scientific literature, which is often affected by methodological limitations and observer bias, remains confusing as it provides evidence in support of both sides of the argument, whilst blinded satisfaction studies comparing resurfaced and non-resurfaced knees generally reveal equivalent results. Even national arthroplasty register data show wide variations in the proportion of patellar resurfacing between countries that cannot be explained by cultural differences alone. Advocates who always resurface or never resurface indiscriminately expose the patella to a random choice. Selective resurfacing offers a compromise by providing a decision algorithm based on a propensity for improved clinical success, whilst avoiding the potential complications associated with unnecessary resurfacing. Evidence regarding the validity of the selection criteria, however, is missing, and the decision when to resurface is often based on intuitive reasoning. 
Our lack of understanding of why, irrespective of pre-operative symptoms and patellar resurfacing, some patients suffer pain following TKA and others do not has so far stifled our efforts to make the strategy of selective resurfacing succeed. We should hence devote our efforts to defining predictive criteria and indicators that will enable us to reliably identify those individuals who might benefit from a resurfacing procedure. Level of evidence V.

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort. Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2). Results Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions and complications. Conclusion Readmissions and complications following cholecystectomy are common and associated with patient and disease characteristics.
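    The study fits a two-level random-intercept logistic model (patients nested within hospitals). As a rough illustration of the idea, a minimal sketch follows in which the hospital effect is approximated by a fixed per-hospital dummy variable and the model is fitted by plain gradient ascent on the log-likelihood. All data and coefficient values below are invented for illustration; real analyses would use a proper mixed-effects package rather than this hand-rolled fit.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression via batch gradient ascent (toy-scale only)."""
    w = [0.0] * len(X[0])
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
            for j, xj in enumerate(xi):
                grad[j] += (yi - p) * xj
        w = [wj + lr * g / len(y) for wj, g in zip(w, grad)]
    return w

# Hypothetical cohort: readmission risk rises with ASA grade, with a
# small hospital-level shift (two hospitals, one dummy column).
random.seed(0)
X, y = [], []
for hosp in (0, 1):
    for _ in range(100):
        asa = random.choice([1, 2, 3])
        p_true = sigmoid(-3.0 + 0.8 * asa + 0.5 * hosp)
        X.append([1.0, float(asa), float(hosp)])  # intercept, ASA, hospital
        y.append(1 if random.random() < p_true else 0)

w = fit_logistic(X, y)
print("estimated ASA effect positive:", w[1] > 0)
```

A genuine multilevel analysis would instead treat the hospital intercepts as draws from a shared distribution, which is what lets the study conclude that no hospital-level characteristics explained the outcome variation.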

    Evaluating the Significance of Paleophylogeographic Species Distribution Models in Reconstructing Quaternary Range-Shifts of Nearctic Chelonians

    The climatic cycles of the Quaternary, during which global mean annual temperatures have regularly changed by 5–10°C, provide a special opportunity for studying the rate, magnitude, and effects of geographic responses to changing climates. During the Quaternary, high- and mid-latitude species were extirpated from regions that were covered by ice or otherwise became unsuitable, persisting in refugial retreats where the environment was compatible with their tolerances. In this study we combine modern geographic range data, phylogeny, Pleistocene paleoclimatic models, and isotopic records of changes in global mean annual temperature, to produce a temporally continuous model of geographic changes in potential habitat for 59 species of North American turtles over the past 320 Ka (three full glacial-interglacial cycles). These paleophylogeographic models indicate the areas where past climates were compatible with the modern ranges of the species and serve as hypotheses for how their geographic ranges would have changed in response to Quaternary climate cycles. We test these hypotheses against physiological, genetic, taxonomic and fossil evidence, and we then use them to measure the effects of Quaternary climate cycles on species distributions. Patterns of range expansion, contraction, and fragmentation in the models are strongly congruent with (i) phylogeographic differentiation; (ii) morphological variation; (iii) physiological tolerances; and (iv) intraspecific genetic variability. Modern species with significant interspecific differentiation have geographic ranges that strongly fluctuated and repeatedly fragmented throughout the Quaternary. Modern species with low genetic diversity have geographic distributions that were highly variable and at times exceedingly small in the past. Our results reveal the potential for paleophylogeographic models to (i) reconstruct past geographic range modifications; (ii) identify geographic processes that result in genetic bottlenecks; and (iii) predict threats due to anthropogenic climate change in the future.
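    The core idea behind projecting a modern range onto past climates can be sketched as a simple climate-envelope model: calibrate the range of climate values a species tolerates from its modern occurrences, then ask which cells of a past-climate grid fall inside that envelope. The data below are hypothetical, and real species distribution models (bioclim-style envelopes, Maxent, etc.) are substantially richer than this two-variable min/max box.

```python
# Climate-envelope sketch: a past-climate grid cell counts as potential
# habitat if every climate variable lies within the range observed at
# the species' modern occurrence sites.

def fit_envelope(occurrences):
    """Per-variable (min, max) bounds across modern occurrence climates."""
    lo = [min(vals) for vals in zip(*occurrences)]
    hi = [max(vals) for vals in zip(*occurrences)]
    return lo, hi

def suitable(cell, envelope):
    lo, hi = envelope
    return all(l <= c <= h for c, l, h in zip(cell, lo, hi))

# Toy climates: (mean annual temperature degC, annual precipitation mm).
modern_occurrences = [(12.0, 800.0), (15.0, 950.0), (18.0, 700.0)]
env = fit_envelope(modern_occurrences)

# Hypothetical glacial-maximum grid, cooler overall; the surviving cells
# play the role of refugia in the abstract's sense.
past_grid = [(7.0, 900.0), (13.0, 850.0), (16.0, 750.0), (20.0, 600.0)]
refugia = [cell for cell in past_grid if suitable(cell, env)]
print(refugia)  # -> [(13.0, 850.0), (16.0, 750.0)]
```

Repeating this projection across a continuous sequence of paleoclimate reconstructions is what yields the temporally continuous habitat models the abstract describes.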

    Heart transplantation with donation after circulatory determination of death.

    The constant shortage of available organs is a major obstacle and limiting factor in heart transplantation; the discrepancy between the number of donors and potential recipients leads to waiting-list mortality of 10-12% per year in Europe and the USA. If adopted for heart transplantation, donation after circulatory determination of death (DCDD) would be expected to improve the availability of organs substantially for both adults and children. With DCDD, however, hearts to be transplanted undergo a period of warm ischaemia before procurement, which is of particular concern because tissue damage occurs rapidly and might be sufficient to preclude transplantation. Nonetheless, the heart is able to withstand limited periods of warm ischaemia, which could provide a window of opportunity for DCDD. Development of clinical approaches specifically for DCDD is critical for the exploitation of these organs, because current practices for donor heart procurement, evaluation, and storage have been optimized for conventional donation after brain death, without consideration of warm ischaemia before organ procurement. Establishment of clinical protocols and ethical and legal frameworks for DCDD of other organs is underway. This Review provides a timely evaluation of the potential for DCDD in heart transplantation.
