
    Intravenous Immunoglobulins after Liver Transplantation: new insights in mechanisms of action

    The principal concept of organ transplantation is the replacement of a diseased organ with a healthy one from another individual. In recent decades, transplantation has saved the lives of thousands of people who would otherwise have died of their life-threatening diseases. Currently, liver transplantation is the treatment of choice for both acute and chronic liver failure. The first successful human liver transplantation was performed by Thomas Starzl in 1967 [1]. In 1983, the National Institutes of Health (NIH) declared liver transplantation an accepted therapy for end-stage liver disease [2]. Over the last two decades, the most important indications for liver transplantation in Europe were cirrhosis (58%), cancer (13%), cholestatic diseases (11%) and acute hepatic failure (9%). Survival is excellent in both the short and the long term, with patient survival rates of approximately 81% one year after surgery and 69% five years after transplantation (source: www.eltr.org).

    Optimization of Immunoglobulin Substitution Therapy by a Stochastic Immune Response Model

    Background: The immune system is a complex adaptive system of cells and molecules that are interwoven in a highly organized communication network. Primary immune deficiencies are disorders in which essential parts of the immune system are absent or do not function properly. X-linked agammaglobulinemia is a B-lymphocyte maturation disorder in which the production of immunoglobulin is prevented by a genetic defect. Patients require life-long immunoglobulin substitution therapy in order to prevent recurrent and persistent opportunistic infections. Methodology: We formulate an immune response model in terms of stochastic differential equations and perform a systematic analysis of empirical therapy protocols that differ in the treatment frequency. The model accounts for the immunoglobulin reduction by natural degradation and by antigenic consumption, as well as for the periodic immunoglobulin replenishment that gives rise to an inhomogeneous distribution of immunoglobulin specificities in the shape space. Results are obtained from computer simulations and from analytical calculations within the framework of the Fokker-Planck formalism, which enables us to derive closed expressions for undetermined model parameters such as the infection clearance rate. Conclusions: We find that the critical value of the clearance rate, below which a chronic infection develops, depends strongly on the strength of fluctuations in the administered immunoglobulin dose per treatment and is an increasing function of the treatment frequency. The comparative analysis of therapy protocols with regard to the treatment frequency yields quantitative predictions of therapeutic relevance, where the choice of the optimal treatment frequency reveals a conflict of competing interests: in order to diminish immunomodulatory effects and to make good economic sense, therapeutic immunoglobulin levels should be kept close to physiological levels, implying high treatment frequencies. However, clearing infections without additional medication is more reliably achieved by substitution therapies with low treatment frequencies. Our immune response model predicts that the compromise solution of immunoglobulin substitution therapy has a treatment frequency in the range from one infusion per week to one infusion per two weeks.
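    As a rough, non-authoritative illustration of the kind of dynamics the model describes, the sketch below simulates a simplified one-compartment immunoglobulin level under periodic infusions, with natural degradation, antigenic consumption and a fluctuating dose per treatment. All parameter values, the function name simulate and the one-compartment simplification are assumptions made for illustration; they are not the authors' model or code.

import numpy as np

# Minimal one-compartment sketch (assumed, illustrative only): the IgG level g(t)
# decays through natural degradation (k_deg) and antigenic consumption (k_cons)
# and is replenished by periodic infusions whose dose fluctuates around a mean.
rng = np.random.default_rng(0)

k_deg = 0.033          # natural degradation rate per day (~21-day IgG half-life)
k_cons = 0.01          # antigenic consumption rate per day (assumed value)
dt = 0.1               # simulation time step in days
t_end = 120.0          # simulated horizon in days

def simulate(interval_days, mean_dose, dose_sd):
    """Forward-Euler simulation of the IgG level with stochastic infusion doses."""
    n_steps = int(t_end / dt)
    g = np.empty(n_steps)
    g[0] = mean_dose                       # start at one dose equivalent
    next_infusion = interval_days
    for i in range(1, n_steps):
        t = i * dt
        # deterministic loss by degradation and consumption
        g[i] = max(g[i - 1] - (k_deg + k_cons) * g[i - 1] * dt, 0.0)
        # periodic replenishment with a normally fluctuating dose
        if t >= next_infusion:
            g[i] += max(rng.normal(mean_dose, dose_sd), 0.0)
            next_infusion += interval_days
    return g

# Same total amount of immunoglobulin, different treatment frequencies.
weekly = simulate(interval_days=7.0, mean_dose=1.0, dose_sd=0.1)
biweekly = simulate(interval_days=14.0, mean_dose=2.0, dose_sd=0.2)
print(f"weekly:    trough={weekly.min():.2f}  mean={weekly.mean():.2f}")
print(f"bi-weekly: trough={biweekly.min():.2f}  mean={biweekly.mean():.2f}")

    Comparing a weekly protocol with a bi-weekly protocol delivering the same total amount shows how the trough level and its fluctuations depend on the treatment frequency, which is the trade-off the paper quantifies.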

    Analysis and Functional Consequences of Increased Fab-Sialylation of Intravenous Immunoglobulin (IVIG) after Lectin Fractionation

    It has been proposed that the anti-inflammatory effects of intravenous immunoglobulin (IVIG) might be due to the small fraction of Fc-sialylated IgG. In this study we biochemically and functionally characterized sialic acid-enriched IgG obtained by Sambucus nigra agglutinin (SNA) lectin fractionation. Two main IgG fractions isolated by elution with lactose (E1) or acidified lactose (E2) were analyzed for total IgG, F(ab’)2- and Fc-specific sialic acid content, their pattern of specific antibodies and their anti-inflammatory potential in a human in vitro inflammation system based on LPS- or PHA-stimulated whole blood. HPLC and LC-MS analyses revealed an increase of sialylated IgG in E1 and, more substantially, in the E2 fraction. Notably, the increased amount of sialic acid residues was found primarily in the Fab region, whereas only a minor increase was observed in the Fc region. This indicates preferential binding of Fab sialic acid to SNA. ELISA analyses of a representative range of pathogen antigens and autoantigens indicated a skewed antibody pattern of the sialylated IVIG fractions. Finally, the E2 fraction exerted a more profound anti-inflammatory effect compared with E1 or IVIG, evidenced by reduced CD54 expression on monocytes and reduced secretion of MCP-1 (CCL2); again, these effects were Fab- but not Fc-dependent. Our results show that SNA fractionation of IVIG yields a minor fraction (approximately 10%) of highly sialylated IgG, wherein the sialic acid is found mainly in the Fab region. The tested anti-inflammatory activity was associated with Fab, not Fc, sialylation.

    Enrichment of Sialylated IgG by Lectin Fractionation Does Not Enhance the Efficacy of Immunoglobulin G in a Murine Model of Immune Thrombocytopenia

    Intravenous immunoglobulin G (IVIg) is widely used to treat a range of clinical conditions. For its use in immune-modulating therapies, such as the treatment of immune thrombocytopenic purpura, high doses of IVIg are required. It has been suggested that only a fraction of IVIg mediates this immune-modulating effect. Recent studies indicated that this fraction is the Fc-sialylated IgG fraction. The aim of our study was to determine the efficacy of IVIg enriched for sialylated IgG (IVIg-SA(+)) in a murine model of passive immune thrombocytopenia (PIT). We enriched IVIg for sialylated IgG by Sambucus nigra agglutinin (SNA) lectin fractionation and determined the degree of sialylation. Analysis of IVIg-SA(+) using a lectin-based ELISA revealed that we had enriched predominantly for Fab-sialylated IgG, whereas we did not find an increase in Fc-sialylated IgG. Mass spectrometric analysis confirmed that Fc sialylation did not change after SNA lectin fractionation. The efficacy of sialylated IgG was measured by administering IVIg or IVIg-SA(+) 24 hours prior to an injection of a rat anti-mouse platelet mAb. We found an 85% decrease in platelet count after injection of the anti-platelet mAb, which was reduced to a 70% decrease by injecting IVIg (p < 0.01). In contrast, IVIg-SA(+) had no effect on the platelet count. Serum levels of IVIg and IVIg-SA(+) were similar, ruling out enhanced IgG clearance as a possible explanation. Our results indicate that SNA lectin fractionation is not a suitable method to enrich IVIg for Fc-sialylated IgG. The use of IVIg enriched for Fab-sialylated IgG abolishes the efficacy of IVIg in the murine PIT model.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
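    As a hedged sketch of the type of analysis named in the Methods (an adjusted logistic regression for mortality with a bootstrapped confidence interval for the checklist effect), the following Python example fits such a model on synthetic data. The covariates, effect sizes and the statsmodels workflow are assumptions made for illustration; they are not the study's dataset, variables or actual code.

import numpy as np
import statsmodels.api as sm

# Synthetic data (assumed): mortality depends on checklist use plus two
# adjustment variables standing in for patient and disease factors.
rng = np.random.default_rng(42)
n = 5000
checklist = rng.integers(0, 2, n)      # 1 = checklist reportedly used
age = rng.normal(60.0, 15.0, n)        # illustrative patient factor
emergency = rng.integers(0, 2, n)      # illustrative disease/operative factor
logit_p = -3.0 - 0.5 * checklist + 0.02 * (age - 60.0) + 1.0 * emergency
mortality = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Multivariable logistic regression; the checklist coefficient is params[1].
X = sm.add_constant(np.column_stack([checklist, age, emergency]))
fit = sm.Logit(mortality, X).fit(disp=0)
print("adjusted OR for checklist use:", np.exp(fit.params[1]))

# Percentile bootstrap of the adjusted odds ratio.
boot_ors = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    boot_fit = sm.Logit(mortality[idx], X[idx]).fit(disp=0)
    boot_ors.append(np.exp(boot_fit.params[1]))
print("95% bootstrap c.i.:", np.percentile(boot_ors, [2.5, 97.5]))

    The percentile bootstrap shown here is only one simple stand-in for the "bootstrapped simulation" mentioned in the abstract; the study's exact resampling scheme is not reproduced.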

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
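    As a non-authoritative sketch of the multilevel, multivariable logistic regression described in the Methods, the example below fits a random-intercept (per hospital) logistic model on synthetic data using statsmodels' variational Bayes mixed GLM. The variable names, effect sizes and the choice of estimator are assumptions for illustration, not the study's code or modelling choices.

import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Synthetic data (assumed): colostomy use depends on setting- and patient-level
# factors plus a hospital-level random intercept.
rng = np.random.default_rng(1)
n_hospitals, per_hospital = 40, 40
hospital = np.repeat(np.arange(n_hospitals), per_hospital)
hospital_effect = rng.normal(0.0, 0.5, n_hospitals)[hospital]

low_hdi = rng.integers(0, 2, hospital.size)      # illustrative setting factor
emergency = rng.integers(0, 2, hospital.size)    # illustrative patient factor
perforation = rng.integers(0, 2, hospital.size)
logit_p = (-1.5 + 1.2 * low_hdi + 1.4 * emergency
           + 1.4 * perforation + hospital_effect)
colostomy = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

data = pd.DataFrame({"colostomy": colostomy, "low_hdi": low_hdi,
                     "emergency": emergency, "perforation": perforation,
                     "hospital": hospital})

# Fixed effects for the adjustment variables, random intercept per hospital.
model = BinomialBayesMixedGLM.from_formula(
    "colostomy ~ low_hdi + emergency + perforation",
    {"hospital": "0 + C(hospital)"}, data)
result = model.fit_vb()
print(result.summary())

    A frequentist mixed-effects fit (for example with lme4::glmer in R) would be the more conventional choice for this kind of analysis; the variational Bayes estimator is used here only to keep the sketch self-contained in Python.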