53 research outputs found

    Uptake of the Necrotic Serpin in Drosophila melanogaster via the Lipophorin Receptor-1

    The humoral response to fungal and Gram-positive infections is regulated by the serpin-family inhibitor, Necrotic. Following immune challenge, a proteolytic cascade is activated which signals through the Toll receptor. Toll activation results in a range of antibiotic peptides being synthesised in the fat body and exported to the haemolymph. As with mammalian serpins, Necrotic turnover in Drosophila is rapid. This serpin is synthesised in the fat body, but its site of degradation has been unclear. By “freezing” endocytosis with a temperature-sensitive Dynamin mutation, we demonstrate that Necrotic is removed from the haemolymph by two groups of giant cells: the garland and pericardial athrocytes. Necrotic uptake responds rapidly to infection, being visibly increased after 30 minutes and peaking at 6–8 hours. Co-localisation of anti-Nec with anti-AP50, Rab5, and Rab7 antibodies establishes that the serpin is processed through multi-vesicular bodies and delivered to the lysosome, where it co-localises with the ubiquitin-binding protein HRS. Nec does not co-localise with Rab11, indicating that the serpin is not re-exported from athrocytes. Instead, mutations which block late endosome/lysosome fusion (dor, hk, and car) cause accumulation of Necrotic-positive endosomes, even in the absence of infection. Knockdown of the six Drosophila orthologues of the mammalian LDL receptor family with dsRNA identifies LpR1 as an enhancer of the immune response. Uptake of Necrotic from the haemolymph is blocked by a chromosomal deletion of LpR1. In conclusion, we identify the cells and the receptor molecule responsible for the uptake and degradation of the Necrotic serpin in Drosophila melanogaster. The scavenging of serpin/proteinase complexes may be a critical step in the regulation of proteolytic cascades.

    Revisiting the association between candidal infection and carcinoma, particularly oral squamous cell carcinoma

    Background: Tobacco and alcohol are risk factors associated with cancer of the upper aerodigestive tract, but increasingly the role of infection and chronic inflammation is recognized as being significant in cancer development. Bacteria, particularly Helicobacter pylori, and viruses such as members of the human papillomavirus family and the hepatitis B and C viruses are strongly implicated as etiological factors in certain cancers. There is less evidence for an association between fungi and cancer, although it has been recognized for many years that white patches on the oral mucosa that are infected with Candida have a greater likelihood of undergoing malignant transformation than those that are not infected. Objective: This article reviews the association between chronic candidal infection and the development of oral squamous cell carcinoma in potentially malignant oral lesions, and describes mechanisms that may be involved in Candida-associated malignant transformation.

    Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.

    BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication, and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-, 7 middle- and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complications and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were admitted to a critical care unit as routine immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income countries compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700.
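    The failure-to-rescue metric defined in the abstract above (death following a complication) can be checked directly against its reported counts. A minimal sketch in Python, using only the figures given in the abstract:

    ```python
    # Headline rates from the abstract's own counts: 44 814 patients,
    # 7508 with one or more complications, 207 in-hospital deaths.
    patients = 44_814
    complications = 7_508
    deaths = 207

    complication_rate = complications / patients  # reported as 16.8%
    mortality = deaths / patients                 # reported as 0.5%
    failure_to_rescue = deaths / complications    # reported as 2.8%

    print(f"{complication_rate:.1%} {mortality:.1%} {failure_to_rescue:.1%}")
    # → 16.8% 0.5% 2.8%
    ```

    Note that failure to rescue is conditioned on having a complication, which is why 2.8% is much higher than the overall 0.5% mortality.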

    Left ventricular blood flow kinetic energy after myocardial infarction - insights from 4D flow cardiovascular magnetic resonance

    Background: Myocardial infarction (MI) leads to complex changes in left ventricular (LV) haemodynamics that are linked to clinical outcomes. We hypothesized that LV blood flow kinetic energy (KE) is altered in MI and is associated with LV function and infarct characteristics. This study aimed to investigate intra-cavity LV blood flow KE in controls and MI patients using cardiovascular magnetic resonance (CMR) four-dimensional (4D) flow assessment. Methods: Forty-eight patients with MI (acute, 22; chronic, 26) and 20 age- and gender-matched healthy controls underwent CMR, which included cines and whole-heart 4D flow. Patients also received late gadolinium enhancement imaging for infarct assessment. LV blood flow KE parameters were indexed to LV end-diastolic volume and included: averaged LV, minimal, systolic, diastolic, peak E-wave and peak A-wave KEiEDV. In addition, we investigated the in-plane proportion of LV KE (%) and computed the time difference (TD) to peak E-wave KE propagation from base to mid-ventricle. Associations of LV blood flow KE parameters with LV function and infarct size were investigated in all groups. Results: LV KEiEDV was higher in controls than in MI patients (8.5 ± 3 μJ/ml versus 6.5 ± 3 μJ/ml, P = 0.02). Additionally, systolic, minimal and diastolic peak E-wave KEiEDV were lower in MI (P < 0.05). In logistic-regression analysis, systolic KEiEDV (beta = −0.24, P < 0.01) demonstrated the strongest association with the presence of MI. In multiple-regression analysis, infarct size was most strongly associated with in-plane KE (r = 0.5, beta = 1.1, P < 0.01). In patients with preserved LV ejection fraction (EF), minimal and in-plane KEiEDV were reduced (P < 0.05) and the time difference to peak E-wave KE propagation during diastole was increased (P < 0.05) compared with controls with normal EF. Conclusions: Reduction in LV systolic function results in reduction in systolic flow KEiEDV. Infarct size is independently associated with the proportion of in-plane LV KE. Degree of LV impairment is associated with TD of peak E-wave KE. In patients with preserved EF post-MI, LV blood flow KE mapping demonstrated significant changes in the in-plane KE, the minimal KEiEDV and the TD. These three blood flow KE parameters may offer novel methods to identify and describe this patient population.
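    The indexing described above (blood flow kinetic energy normalised to end-diastolic volume, yielding μJ/ml) can be sketched from first principles. The voxel-level formula, blood density, voxel size and all input values below are illustrative assumptions, not the study's actual 4D flow processing pipeline:

    ```python
    import numpy as np

    RHO_BLOOD = 1060.0  # assumed blood density, kg/m^3

    def ke_iedv(velocities_m_s, voxel_volume_m3, edv_ml):
        """KE indexed to EDV (uJ/ml): sum of 0.5*rho*V*|v|^2 over LV voxels,
        divided by end-diastolic volume. A sketch, not the study's method."""
        speeds_sq = np.sum(np.asarray(velocities_m_s) ** 2, axis=-1)  # |v|^2 per voxel
        ke_joules = 0.5 * RHO_BLOOD * voxel_volume_m3 * np.sum(speeds_sq)
        return ke_joules * 1e6 / edv_ml  # convert J to uJ, normalise to ml of EDV

    # Toy example: 1000 voxels of 8 mm^3 with random velocity components up to 0.5 m/s,
    # EDV of 120 ml; values chosen only to land in a physiologically plausible range.
    rng = np.random.default_rng(0)
    v = rng.uniform(-0.5, 0.5, size=(1000, 3))
    print(round(ke_iedv(v, 8e-9, 120.0), 2))
    ```

    Normalising to EDV is what makes the quoted 8.5 versus 6.5 μJ/ml comparison meaningful across hearts of different sizes.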

    Evaluation of appendicitis risk prediction models in adults with suspected appendicitis

    Background: Appendicitis is the most common general surgical emergency worldwide, but its diagnosis remains challenging. The aim of this study was to determine whether existing risk prediction models can reliably identify patients presenting to hospital in the UK with acute right iliac fossa (RIF) pain who are at low risk of appendicitis. Methods: A systematic search was completed to identify all existing appendicitis risk prediction models. Models were validated using UK data from an international prospective cohort study that captured consecutive patients aged 16–45 years presenting to hospital with acute RIF pain from March to June 2017. The main outcome was best achievable model specificity (proportion of patients who did not have appendicitis correctly classified as low risk) whilst maintaining a failure rate below 5 per cent (proportion of patients identified as low risk who actually had appendicitis). Results: Some 5345 patients across 154 UK hospitals were identified, of whom two-thirds (3613 of 5345, 67.6 per cent) were women. Women were more than twice as likely as men to undergo surgery with removal of a histologically normal appendix (272 of 964, 28.2 per cent versus 120 of 993, 12.1 per cent; relative risk 2.33, 95 per cent c.i. 1.92 to 2.84; P < 0.001). Of 15 validated risk prediction models, the Adult Appendicitis Score performed best (cut-off score 8 or less, specificity 63.1 per cent, failure rate 3.7 per cent). The Appendicitis Inflammatory Response Score performed best for men (cut-off score 2 or less, specificity 24.7 per cent, failure rate 2.4 per cent). Conclusion: Women in the UK had a disproportionate risk of admission without surgical intervention and high rates of normal appendicectomy. Risk prediction models were identified that could support shared decision-making by identifying adults in the UK at low risk of appendicitis.
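    The two validation metrics defined in the abstract above, specificity and failure rate for a given low-risk cut-off, can be sketched in a few lines. The scores, disease labels and the cut-off of 8 below are illustrative toy data, not the study's cohort:

    ```python
    # Sketch of cut-off validation: specificity = non-appendicitis patients
    # correctly classed as low risk; failure rate = low-risk patients who
    # actually had appendicitis. Toy data only.
    def validate_cutoff(scores, has_appendicitis, cutoff):
        low_risk = [s <= cutoff for s in scores]
        negatives = sum(not dz for dz in has_appendicitis)
        true_neg = sum(lr and not dz for lr, dz in zip(low_risk, has_appendicitis))
        missed = sum(lr and dz for lr, dz in zip(low_risk, has_appendicitis))
        n_low = sum(low_risk)
        specificity = true_neg / negatives if negatives else 0.0
        failure_rate = missed / n_low if n_low else 0.0
        return specificity, failure_rate

    # Six hypothetical patients scored with an Adult Appendicitis Score-style
    # cut-off of 8 or less marked as low risk.
    scores = [3, 5, 9, 12, 7, 15]
    disease = [False, False, True, True, True, True]
    spec, fail = validate_cutoff(scores, disease, cutoff=8)
    print(spec, fail)  # specificity 1.0, failure rate 1/3 on this toy data
    ```

    In the study's framing, the cut-off is tuned for the highest specificity achievable while keeping the failure rate below 5 per cent, since a missed appendicitis is far more costly than an unnecessary admission.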

    Prevalence of peste des petits ruminants (PPR) and helminthiasis in sheep and goats in Bauchi, Nigeria

    No abstract. Bull. Anim. Hlth. Prod. Afr. (2005) 53(2), 131-13

    An assessment of integrated Striga hermonthica control and early adoption by farmers in Northern Nigeria

    Two sets of on-farm trials, each covering two years, were conducted in the northern Guinea savannah of Nigeria over the period 1999–2001, with the objective of comparing integrated Striga hermonthica control measures (soybean or cowpea trap crops followed by Striga-resistant maize) with farmers' traditional cereal-based cropping systems. In both sets of trials, the integrated measures proved highly effective in increasing productivity over the two-year period, especially where soybean was used as a trap crop. Resistant maize after a trap crop increased the net benefit over the two cropping seasons in both trials by over 100% relative to farmer practice. However, in the second set of trials there was no significant difference in productivity between a trap crop followed by Striga-resistant maize and a trap crop followed by local maize, especially where legume intercropping and fertilizer had been applied in the farmer practice. There was also no difference in productivity between two years' traditional cereal cropping and one year's local maize followed by Striga-resistant maize. This indicates the importance of a legume trap crop in the first year in ensuring high productivity in the second year, regardless of maize variety. Up to 20% of farmers obtained higher productivity from their own practices, notably intercropping of cereals with legumes and use of inorganic fertilizers. Leguminous trap crops and Striga-resistant maize, together with two key management practices (increased soybean planting density and hand-roguing), were seen to be spreading both within and beyond the research villages, indicating that farmers see the economic benefits of controlling Striga. Survey findings show that explaining why control practices work can greatly increase their adoption. Wider adoption of Striga control will therefore require an extension approach that provides this training as well as encouraging farmers to experiment with and adapt Striga control options for their local farming systems.