
    Comparison of inpatient vs. outpatient anterior cervical discectomy and fusion: a retrospective case series

    Background: Spinal surgery is increasingly being done in the outpatient setting. We reviewed our experience with inpatient and outpatient single-level anterior cervical discectomy and fusion with plating (ACDF+P). Methods: All patients undergoing single-level ACDF+P between August 2005 and May 2007 by two surgeons (RPB or JAF) were retrospectively reviewed. All patients underwent anterior cervical microdiscectomy, arthrodesis using structural allograft, and titanium plating. A planned change from performing ACDF+P on an inpatient basis to an outpatient basis was instituted at the midpoint of the study, with no other changes in technique, patient selection, instrumentation, facility, or other factors. All procedures were done in full-service hospitals accommodating both outpatient and inpatient care. Results: 64 patients underwent ACDF+P as inpatients and 45 as outpatients. When outpatient surgery was planned, 17 patients were nonetheless treated as inpatients due to medical comorbidities (14), older age (1), or patient preference (2). At a mean follow-up of 62.4 days, 90 patients had an excellent outcome, 19 had a good outcome, and none had a fair or poor outcome; there was no significant difference in outcome between inpatients and outpatients. There were 4 complications, all occurring in inpatients: a hematoma one week post-operatively requiring drainage, a cerebrospinal fluid leak treated with lumbar drainage, syncope of unknown etiology, and moderate dysphagia. Conclusion: In this series, outpatient ACDF+P was safe and was not associated with a significant difference in outcome compared with inpatient ACDF+P.

    The effect of arm training on thermoregulatory responses and calf volume during upper body exercise

    The final publication is available at Springer via https://doi.org/10.1007/s00421-014-2842-9. PURPOSE: The smaller muscle mass of the upper body compared to the lower body may elicit a smaller thermoregulatory stimulus during exercise and thus produce novel training-induced thermoregulatory adaptations. The principal aim of this study was therefore to examine the effect of arm training on thermoregulatory responses during submaximal exercise. METHODS: Thirteen healthy male participants (mean ± SD age 27.8 ± 5.0 years, body mass 74.8 ± 9.5 kg) took part in 8 weeks of arm crank ergometry training. Thermoregulatory and calf blood flow responses were measured during 30 min of arm cranking at 60% peak power (Wpeak) pre-training, post-training, and post-training at the same absolute intensity as pre-training. Core and skin temperatures were measured, along with heat flow at the calf, thigh, upper arm and chest. Calf blood flow was measured by venous occlusion plethysmography pre- and post-exercise, and calf volume was determined during exercise. RESULTS: The upper body training reduced aural temperature (0.1 ± 0.3 °C) and heat storage (0.3 ± 0.2 J g(-1)) at a given power output as a result of increased whole-body sweating and heat flow. Arm crank training produced a smaller change in calf volume post-training at the same absolute exercise intensity (-1.2 ± 0.8% compared to -2.2 ± 0.9% pre-training; P < 0.05), suggesting reduced leg vasoconstriction. CONCLUSION: Training improved the main markers of aerobic fitness. However, the results of this study suggest arm crank training additionally elicits physiological responses specific to the lower body which may aid thermoregulation. Peer reviewed. Final Accepted Version.
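The heat-storage figure above is reported per gram of tissue. A common way to estimate it (not necessarily the authors' exact method) is to multiply the change in weighted mean body temperature by the specific heat of body tissue, about 3.47 J g⁻¹ °C⁻¹; the 0.8/0.2 core/skin weighting used below is a standard textbook assumption:

```python
# Illustrative sketch only: heat storage per gram of body tissue,
# estimated from the change in mean body temperature. The specific
# heat (~3.47 J/g/degC) and the 0.8/0.2 core/skin weighting are
# standard textbook values, assumed here rather than taken from the paper.

C_BODY = 3.47  # specific heat of body tissue, J g^-1 degC^-1

def mean_body_temp(t_core, t_skin, w_core=0.8):
    """Weighted mean body temperature (core/skin weighting)."""
    return w_core * t_core + (1.0 - w_core) * t_skin

def heat_storage_per_gram(t_core0, t_skin0, t_core1, t_skin1):
    """Heat stored per gram of tissue (J/g) between two time points."""
    d_temp = mean_body_temp(t_core1, t_skin1) - mean_body_temp(t_core0, t_skin0)
    return C_BODY * d_temp

# e.g. a 0.1 degC rise in both core and skin temperature stores
# roughly 3.47 * 0.1 ≈ 0.35 J/g, the same order as the 0.3 J/g change
# reported in the abstract.
```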

    In Vivo, In Vitro, and In Silico Characterization of Peptoids as Antimicrobial Agents

    Bacterial resistance to conventional antibiotics is a global threat that has spurred the development of antimicrobial peptides (AMPs) and their mimetics as novel anti-infective agents. While the bioavailability of AMPs is often reduced by protease activity, the non-natural structure of AMP mimetics renders them robust to proteolytic degradation, offering a distinct advantage for clinical application. We explore the therapeutic potential of N-substituted glycines, or peptoids, as AMP mimics using a multi-faceted approach that includes in silico, in vitro, and in vivo techniques. We report a new QSAR model, developed from 27 diverse peptoid sequences, that accurately correlates antimicrobial peptoid structure with antimicrobial activity. We have identified a number of peptoids with potent, broad-spectrum in vitro activity against multi-drug resistant bacterial strains. Lastly, using a murine model of invasive S. aureus infection, we demonstrate that one of the best candidate peptoids, at 4 mg/kg, significantly reduces bacterial counts by two log orders compared with saline-treated controls. Taken together, our results demonstrate the promising therapeutic potential of peptoids as antimicrobial agents.

    Affinity Inequality among Serum Antibodies That Originate in Lymphoid Germinal Centers

    Upon natural infection with pathogens or vaccination, antibodies are produced by a process called affinity maturation. As affinity maturation ensues, average affinity values between an antibody and ligand increase with time. Purified antibodies isolated from serum are invariably heterogeneous with respect to their affinity for the ligands they bind, whether macromolecular antigens or haptens (low molecular weight approximations of epitopes on antigens). However, less is known about how the extent of this heterogeneity evolves with time during affinity maturation. To shed light on this issue, we have taken advantage of previously published data from Eisen and Siskind (1964). Using the ratio of the strongest to the weakest binding subsets as a metric of heterogeneity (or affinity inequality), we analyzed antibodies isolated from individual serum samples. The ratios were initially as high as 50-fold and, after a single injection of small antigen doses, decreased to around unity over a few weeks. This decrease in the effective heterogeneity of antibody affinities with time is consistent with Darwinian evolution in the strong selection limit. By contrast, neither the average affinity nor the heterogeneity evolves much with time for high doses of antigen, as competition between clones of the same affinity is minimal. Funding: Ragon Institute of MGH, MIT and Harvard; Samsung Scholarship Foundation; National Science Foundation (U.S.) Graduate Research Fellowship (Grant 1122374).
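The heterogeneity metric described here is simply the ratio of the strongest to the weakest binding affinity within a serum sample. A minimal sketch, with made-up affinity values chosen only to mirror the ~50-fold-to-unity trend the abstract reports:

```python
def affinity_inequality(affinities):
    """Ratio of strongest to weakest binding affinity in a sample.

    A ratio near 1 means the antibody population is effectively
    homogeneous; the abstract reports early ratios as high as ~50-fold.
    """
    if not affinities:
        raise ValueError("need at least one affinity value")
    return max(affinities) / min(affinities)

# Hypothetical association constants (arbitrary units), not real data:
early_serum = [5e4, 2e5, 1e6, 2.5e6]   # soon after immunization
late_serum  = [9e5, 1e6, 1.1e6]        # after affinity maturation

# affinity_inequality(early_serum) -> 50.0
# affinity_inequality(late_serum)  -> ~1.2
```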

    Triclosan Disrupts SKN-1/Nrf2-Mediated Oxidative Stress Response in C. elegans and Human Mesenchymal Stem Cells

    Triclosan (TCS), an antimicrobial chemical with potential endocrine-disrupting properties, may pose a risk to early embryonic development and cellular homeostasis during adulthood. Here, we show that TCS induces toxicity in both the nematode C. elegans and human mesenchymal stem cells (hMSCs) by disrupting the SKN-1/Nrf2-mediated oxidative stress response. Specifically, TCS exposure affected C. elegans survival and hMSC proliferation in a dose-dependent manner. Cellular analysis showed that TCS inhibited the nuclear localization of SKN-1/Nrf2 and the expression of its target genes, which were associated with the oxidative stress response. Notably, TCS-induced toxicity was significantly reduced by either antioxidant treatment or constitutive SKN-1/Nrf2 activation. As Nrf2 is strongly associated with aging and chemoresistance, these findings provide a novel approach to the identification of therapeutic targets and disease treatment.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries. Peer reviewed.
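The odds ratios in this abstract come from multivariable models adjusted for patient and disease factors, so they cannot be reproduced exactly from the abstract alone; still, the crude (unadjusted) odds ratios can be recovered from the reported counts, and they land close to the adjusted values. A minimal sketch:

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Unadjusted odds ratio of group A relative to group B,
    from event counts and group totals (a 2x2 table)."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Checklist use before emergency laparotomy, from the counts reported
# in the abstract (high-HDI group as the reference):
or_middle = odds_ratio(753, 1242, 2455, 2741)   # middle- vs high-HDI
or_low    = odds_ratio(363, 860, 2455, 2741)    # low- vs high-HDI

# These crude ratios (~0.18 and ~0.09) sit close to the adjusted
# ORs of 0.17 and 0.08 from the paper's multivariable model.
```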

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.