
    Jejunal microvilli atrophy and reduced nutrient transport in rats with advanced liver cirrhosis: improvement by Insulin-like Growth Factor I

    BACKGROUND: Previous results have shown that rats with non-ascitic cirrhosis display altered transport of sugars and amino acids associated with elongated microvilli, and that these alterations return to normal with the administration of Insulin-Like Growth Factor-I (IGF-I). The aims of this study were to explore the evolution of these alterations and to analyse the effect of IGF-I in rats with advanced cirrhosis and ascites. Thus, jejunal structure and nutrient transport (D-galactose, L-leucine, L-proline, L-glutamic acid and L-cystine) were studied in rats with ascitic cirrhosis. METHODS: Advanced cirrhosis was induced by CCl4 inhalation and phenobarbital administration for 30 weeks. Cirrhotic animals were divided into two groups which received IGF-I or saline for two weeks; a control group was studied in parallel. Jejunal microvilli were studied by electron microscopy. Nutrient transport was assessed in brush border membrane vesicles using 14C- or 35S-labelled substrates in the three experimental groups. RESULTS: Intestinal active Na+-dependent transport was significantly reduced in untreated cirrhotic rats. Kinetic studies showed a decreased Vmax and a reduced affinity (expressed as an increased Kt) of the sugar and amino acid transporters in brush border membrane vesicles from untreated cirrhotic rats compared with controls. Both parameters were normalised in the IGF-I-treated cirrhotic group. Electron microscopy showed elongation and fusion of microvilli with degenerative membrane lesions and/or notable atrophy. CONCLUSIONS: The initial microvilli elongation reported in non-ascitic cirrhosis develops into atrophy in rats with advanced cirrhosis, and nutrient transport (of monosaccharides and amino acids) is progressively reduced. Both morphological and functional alterations improved significantly with low doses of IGF-I.
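
    For context on the kinetic parameters mentioned above (a standard textbook relation, not taken from the study itself): carrier-mediated, Na+-dependent uptake into brush border membrane vesicles is conventionally described by a Michaelis-Menten-type equation, so a fall in Vmax reflects a loss of functional transporters while a rise in the apparent affinity constant Kt reflects a reduced affinity of the transporter for its substrate.

    % Michaelis-Menten-type relation for carrier-mediated uptake
    % J    : initial rate of Na+-dependent substrate uptake
    % Vmax : maximal transport rate (reflects transporter number/turnover)
    % Kt   : apparent affinity constant ([S] at which J = Vmax/2)
    \[ J = \frac{V_{\max}\,[S]}{K_t + [S]} \]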

    Rapid and Sensitive Detection of an Intracellular Pathogen in Human Peripheral Leukocytes with Hybridizing Magnetic Relaxation Nanosensors

    Bacterial infections remain a major global healthcare problem. Quick and sensitive detection of the pathogens responsible for these infections would facilitate correct diagnosis of the disease and expedite treatment. Of major importance are intracellular slow-growing pathogens that reside within peripheral leukocytes, evading recognition by the immune system and detection by traditional culture methods. Herein, we report the use of hybridizing magnetic relaxation nanosensors (hMRS) for the detection of an intracellular pathogen, Mycobacterium avium subsp. paratuberculosis (MAP). The hMRS are designed to bind to a unique genomic sequence found in the MAP genome, causing significant changes in the sample's magnetic resonance signal. Clinically relevant samples, including tissue and blood, were screened with hMRS and the results were compared with traditional PCR analysis. In less than an hour, the hMRS identified MAP-positive samples in a library of laboratory cultures, clinical isolates, blood and homogenized tissues. Comparison of hMRS with culture methods in terms of prediction of disease state revealed that hMRS outperformed established culture methods while being significantly faster (1 hour vs. 12 weeks). Additionally, using a single instrument and one nanoparticle preparation we were able to detect the intracellular bacterial target in clinical samples at both the genomic and epitope levels. Overall, since the nanoparticles are robust in diverse environmental settings and substantially more affordable than PCR enzymes, we foresee the clinical and field-based use of hMRS for the multiplexed identification of microbial pathogens and other disease-related biomarkers via a single, deployable instrument in clinical and complex environmental samples.

    Identification of Serotype in Culture Negative Pneumococcal Meningitis Using Sequential Multiplex PCR: Implication for Surveillance and Vaccine Design

    BACKGROUND: PCR-based serotyping of Streptococcus pneumoniae has been proposed as a simpler approach than conventional methods, but it has not been applied to strains in Asia, where serotypes are diverse and differ from those in other parts of the world. Furthermore, PCR has not been used to determine serotype distribution in culture-negative meningitis cases. METHODOLOGY: Thirty-six serotype-specific primers, 7 newly designed and 29 previously published, were arranged in 7 multiplex PCR sets, each in new hierarchies designed for the overall serotype distribution in Bangladesh, and specifically for meningitis and non-meningitis isolates. Culture-negative CSF specimens were then tested directly for serotype-specific sequences using the meningitis-specific set of primers. PCR-based serotyping of 367 strains of 56 known serotypes showed 100% concordance with the Quellung reaction. The first 7 multiplex reactions revealed the serotype of 40% of all isolates, and of 31% and 48% of non-meningitis and meningitis isolates, respectively. By redesigning the multiplex scheme specifically for non-meningitis or meningitis isolates, the serotype of 43% and 48% of the respective isolates could be identified. Direct examination of 127 culture-negative CSF specimens, using the meningitis-specific set of primers, yielded a serotype for 51 additional cases. CONCLUSIONS: This PCR approach could improve the ascertainment of pneumococcal serotype distributions, especially for meningitis in settings with high prior use of antibiotics.
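
    To illustrate the design principle behind the sequential multiplex hierarchy described above (an illustrative sketch only; the primer groupings and prevalence figures are hypothetical, not the authors' actual scheme), ordering the multiplex sets so that locally prevalent serotypes are resolved by the earliest reactions minimises the expected number of PCR rounds per isolate:

    # Hypothetical groupings of serotype-specific primers into multiplex sets.
    multiplex_sets = {
        "set1": ["1", "5", "14"],
        "set2": ["6A/6B", "19F", "23F"],
        "set3": ["2", "45", "12A"],
    }

    # Hypothetical local serotype distribution (e.g. among meningitis isolates).
    prevalence = {"1": 0.20, "5": 0.15, "14": 0.10,
                  "6A/6B": 0.12, "19F": 0.08, "23F": 0.05,
                  "2": 0.04, "45": 0.02, "12A": 0.01}

    def expected_reactions(order):
        """Expected number of sequential multiplex reactions needed per isolate."""
        expected, unresolved = 0.0, 1.0
        for i, set_name in enumerate(order, start=1):
            p_here = sum(prevalence.get(s, 0.0) for s in multiplex_sets[set_name])
            expected += i * p_here
            unresolved -= p_here
        # Isolates not covered by any set still consume every reaction.
        return expected + len(order) * unresolved

    print(expected_reactions(["set1", "set2", "set3"]))  # prevalence-ranked hierarchy: ~1.85
    print(expected_reactions(["set3", "set2", "set1"]))  # reversed hierarchy: ~2.61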

    Empathy among undergraduate medical students: A multi-centre cross-sectional comparison of students beginning and approaching the end of their course

    BACKGROUND: Although empathy is a core element of patient care, its trajectory during undergraduate medical education remains unclear. Empathy is generally regarded as comprising an affective capacity (the ability to be sensitive to, and concerned for, another person) and a cognitive capacity (the ability to understand and appreciate the other person's perspective). The authors investigated whether final-year undergraduate students recorded lower levels of empathy than their first-year counterparts, and whether male and female students differed in this respect. METHODS: Between September 2013 and June 2014 an online questionnaire survey was administered to 15 UK and 2 international medical schools. Participating schools provided both 5-6 year standard courses and 4-year accelerated graduate-entry courses. The survey incorporated the Jefferson Scale of Empathy-Student Version (JSE-S) and Davis's Interpersonal Reactivity Index (IRI), both widely used to measure medical student empathy. Participation was voluntary. Chi-squared tests were used to test for differences in the biographical characteristics of student groups. Multiple linear regression analyses, in which the predictor variables were year of course (first/final), sex, type of course and broad socio-economic group, were used to compare empathy scores. RESULTS: Five medical schools (4 in the UK, 1 in New Zealand) achieved average response rates of 55% (n = 652) among students starting their course and 48% (n = 487) among final-year students. These schools formed the High Response Rate Group. The remaining 12 medical schools recorded lower response rates of 24.0% and 15.2% among first- and final-year students respectively; these schools formed the Lower Response Rate Group. For both male and female students in both groups of schools, no significant differences in empathy scores were found between students starting and those approaching the end of their course. Gender significantly predicted empathy scores, with females scoring higher than males. CONCLUSIONS: Participating male and female medical students approaching the end of their undergraduate education did not record lower levels of empathy compared with those at the beginning of their course. Questions remain concerning the trajectory of empathy after qualification and how best to support it through the pressures of starting out in medical practice.

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices and outcomes are poorly described. The aim was to identify variation in the surgical management and outcomes of appendicitis across low-, middle- and high-Human Development Index (HDI) countries worldwide. METHODS: This was a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted, with 30 days of follow-up. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 in high-, 1540 in middle- and 507 in low-HDI groups). After adjustment, surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case mix, laparoscopy remained associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and fewer SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score-matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.
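
    As background on the propensity-score matching mentioned in the results (a minimal sketch on simulated data; the covariates, model and matching strategy are assumptions, not the study's analysis code), the probability of receiving laparoscopy is modelled from baseline covariates and each laparoscopic case is then matched to the open case with the closest score before outcomes are compared:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    X = rng.normal(size=(n, 3))                        # hypothetical case-mix covariates
    p_treat = 1 / (1 + np.exp(-(-1.0 + 0.8 * X[:, 0])))
    treated = rng.binomial(1, p_treat)                 # 1 = laparoscopic, 0 = open appendectomy

    # Step 1: propensity score = modelled probability of laparoscopy given covariates.
    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

    # Step 2: greedy 1:1 nearest-neighbour matching on the propensity score.
    treated_idx = np.flatnonzero(treated == 1)
    available = set(np.flatnonzero(treated == 0))
    pairs = []
    for t in treated_idx:
        best = min(available, key=lambda c: abs(ps[c] - ps[t]))
        pairs.append((t, best))
        available.remove(best)

    print(f"matched {len(pairs)} laparoscopic cases to open controls")
    # Outcomes (e.g. SSI rates) would then be compared within the matched pairs.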

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
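
    As an illustration of how an adjusted odds ratio such as 'OR 0.60, 0.50 to 0.73' is derived from a multivariable logistic regression (a minimal sketch on simulated data with hypothetical variables; not the study's analysis code, which also used bootstrapped simulation):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 5000
    checklist = rng.binomial(1, 0.6, size=n)              # exposure: WHO checklist used
    age = rng.normal(60, 15, size=n)                      # hypothetical confounder
    logit = -3.0 - 0.5 * checklist + 0.03 * (age - 60)    # true adjusted OR = exp(-0.5) ~ 0.61
    died = rng.binomial(1, 1 / (1 + np.exp(-logit)))      # simulated 30-day mortality

    X = sm.add_constant(np.column_stack([checklist, age]))
    fit = sm.Logit(died, X).fit(disp=False)

    or_checklist = np.exp(fit.params[1])                  # coefficient -> odds ratio
    ci_low, ci_high = np.exp(fit.conf_int()[1])           # 95% CI on the odds-ratio scale
    print(f"adjusted OR for checklist use: {or_checklist:.2f} ({ci_low:.2f} to {ci_high:.2f})")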

    How to discuss gene therapy for haemophilia? A patient and physician perspective

    Gene therapy has the potential to revolutionise treatment for patients with haemophilia and is close to entering clinical practice. While factor concentrates have improved outcomes, individuals still face a lifetime of injections, pain, progressive joint damage, the potential for inhibitor development and impaired quality of life. Recently published studies of adeno-associated viral (AAV) vector-mediated gene therapy have demonstrated improvement in endogenous factor levels over sustained periods, significant reductions in annualised bleed rates, lower exogenous factor usage and, thus far, a positive safety profile. In making the shared decision to proceed with gene therapy for haemophilia, physicians should make it clear that research is ongoing and that there are remaining evidence gaps, such as long-term safety profiles and the duration of treatment effect. The eligibility criteria for gene therapy trials mean that key patient groups may be excluded, e.g. children/adolescents, those with liver or kidney dysfunction and those with a prior history of factor inhibitors or pre-existing neutralising AAV antibodies. Gene therapy offers a life-changing opportunity for patients to reduce their bleeding risk while also reducing or abrogating the need for exogenous factor administration. Given the expanding evidence base, both physicians and patients will need sources of clear and reliable information to be able to discuss and judge the risks and benefits of treatment.

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation of at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on country income, and these differences went beyond case mix alone.