Cardiosphere-derived cells suppress allogeneic lymphocytes by production of PGE2 acting via the EP4 receptor
Cardiosphere-derived cells (CDCs) are a cardiac progenitor cell population which has been shown to possess cardiac regenerative properties and to improve heart function in a variety of cardiac diseases. Studies in large animal models have predominantly focussed on using autologous cells for safety; however, allogeneic cell banks would allow practical, cost-effective and efficient use in a clinical setting. The aim of this work was to determine the immunomodulatory status of these cells using CDCs and lymphocytes from five dogs. CDCs expressed MHC I but not MHC II molecules, and mixed lymphocyte reactions demonstrated a lack of lymphocyte proliferation in response to MHC-mismatched CDCs. Furthermore, MHC-mismatched CDCs suppressed lymphocyte proliferation and activation in response to Concanavalin A. Transwell experiments demonstrated that this was predominantly due to direct cell-cell contact in addition to soluble mediators, whereby CDCs produced high levels of PGE2 under inflammatory conditions. This led to down-regulation of CD25 expression on lymphocytes via the EP4 receptor. Blocking prostaglandin synthesis restored both proliferation and activation (measured via CD25 expression) of stimulated lymphocytes. We demonstrate, for the first time in a large animal model, that CDCs inhibit proliferation of allo-reactive lymphocytes and have potent immunosuppressive activity mediated via PGE2.
Data Stream Clustering for Real-Time Anomaly Detection: An Application to Insider Threats
Insider threat detection is an emergent concern for academia, industries, and governments due to the growing number of insider incidents in recent years. The continuous streaming of unbounded data coming from various sources in an organisation, typically at high velocity, leads to a typical Big Data computational problem. The malicious insider threat refers to anomalous behaviour(s) (outliers) that deviate from the normal baseline of a data stream. The absence of previously logged activities executed by users shapes the insider threat detection mechanism into an unsupervised anomaly detection approach over a data stream. A common shortcoming of existing data mining approaches to detecting insider threats is the high number of false alarms/positives (FPs). To handle the Big Data issue and to address this shortcoming, we propose a streaming anomaly detection approach, namely Ensemble of Random subspace Anomaly detectors In Data Streams (E-RAIDS), for insider threat detection. E-RAIDS learns an ensemble of p established outlier detection techniques [Micro-cluster-based Continuous Outlier Detection (MCOD) or Anytime Outlier Detection (AnyOut)], which employ clustering over continuous data streams. Each of the p models learns from a random feature subspace to detect local outliers, which might not be detected over the whole feature space. E-RAIDS introduces an aggregate component that combines the results from the p feature subspaces in order to decide whether to generate an alarm at each window iteration. The merit of E-RAIDS is that it defines a survival factor and a vote factor to address the shortcoming of a high number of FPs. Experiments on E-RAIDS-MCOD and E-RAIDS-AnyOut are carried out on synthetic data sets, including malicious insider threat scenarios generated at Carnegie Mellon University, to test the effectiveness of voting over feature subspaces and the capability to detect (more than one)-behaviour-all-threat in real time.
The results show that E-RAIDS-MCOD achieves the highest F1 measure and fewer false alarms (as low as 0) compared with E-RAIDS-AnyOut, and that it detects approximately all the insider threats in real time.
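The ensemble mechanism described above can be illustrated with a minimal sketch: draw p random feature subspaces, test each incoming point for being a local outlier within a sliding window in each subspace, and raise an alarm only when enough subspaces vote. All names, the simple distance-based outlier test, and every parameter below are illustrative assumptions, not the authors' MCOD/AnyOut implementations, and the survival factor is omitted.

```python
import random

def neighbor_count(window, point, dims, eps):
    """Count window points within eps of `point`, measured only on dimensions `dims`."""
    def dist2(a, b):
        return sum((a[d] - b[d]) ** 2 for d in dims)
    return sum(1 for q in window if dist2(point, q) <= eps * eps)

def e_raids_sketch(stream, n_features, p=4, k_dims=2, eps=1.0, min_pts=3,
                   window_size=50, vote_threshold=2, seed=0):
    """Flag stream index i as an alarm when at least `vote_threshold` of the p
    random feature subspaces see point i as a local outlier (too few neighbours
    within eps inside the current sliding window)."""
    rng = random.Random(seed)
    subspaces = [rng.sample(range(n_features), k_dims) for _ in range(p)]
    window, alarms = [], []
    for i, x in enumerate(stream):
        window.append(x)
        if len(window) > window_size:
            window.pop(0)          # slide the window
        if len(window) <= min_pts:
            continue               # warm-up: too few points to judge outlierness
        votes = sum(
            1 for dims in subspaces
            if neighbor_count(window, x, dims, eps) < min_pts  # x counts itself
        )
        if votes >= vote_threshold:
            alarms.append(i)
    return alarms
```

On a toy stream of identical 4-dimensional points with one distant point injected, only the injected point is flagged, since it is isolated in every subspace while the rest have abundant neighbours.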
Glycaemic control in people with type 2 diabetes mellitus during and after cancer treatment: A systematic review and meta-analysis
Background
Cancer and Diabetes Mellitus (DM) are leading causes of death worldwide and the prevalence of both is escalating. People with co-morbid cancer and DM have increased morbidity and premature mortality compared with cancer patients with no DM. The reasons for this are likely to be multifaceted but will include the impact of hypo/hyperglycaemia and diabetes therapies on cancer treatment and disease progression. A useful step toward addressing this disparity in treatment outcomes is to establish the impact of cancer treatment on diabetes control.
Aim
The aim of this review is to identify and analyse current evidence reporting glycaemic control (HbA1c) during and after cancer treatment.
Methods
Systematic searches of published quantitative research relating to comorbid cancer and type 2 diabetes mellitus were conducted using databases including Medline, Embase, PsycINFO, CINAHL and Web of Science (February 2017). Full-text publications were eligible for inclusion if they: were quantitative, were published in English, investigated the effects of cancer treatment on glycaemic control, reported HbA1c (%/mmol/mol) and included adult populations with diabetes. Means, standard deviations and sample sizes were extracted from each paper; missing standard deviations were imputed. The completed datasets were analysed using a random-effects model. A mixed-effects analysis was undertaken to calculate mean HbA1c (%/mmol/mol) change over three time periods compared to baseline.
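The pooling step described in the Methods can be sketched with a standard DerSimonian-Laird random-effects estimator: per-study mean HbA1c changes are combined with inverse-variance weights inflated by an estimated between-study variance. This is a common estimator for such models, but the review does not state which estimator it used, and the numbers in the usage example below are made up for illustration, not the review's data.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect estimates (e.g. mean HbA1c change from baseline)
    under a DerSimonian-Laird random-effects model.
    Returns (pooled_effect, standard_error, tau_squared)."""
    w = [1.0 / v for v in variances]                  # fixed-effect weights
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical studies: mean HbA1c change (%) at 12 months, with variances.
pooled, se, tau2 = dersimonian_laird([0.4, 0.6, 0.2], [0.01, 0.02, 0.015])
```

With these illustrative inputs, the pooled change sits between the most and least precise study estimates, and the positive tau-squared reflects the heterogeneity among them.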
Results
The available literature exploring glycaemic control post-diagnosis was mixed. When studies of surgical treatment for gastric cancer were excluded, there was an increased risk of poor glycaemic control during this time, with significant differences between baseline and 12 months (p < 0.001) and between baseline and 24 months (p = 0.002).
Conclusion
We found some evidence to support the contention that glycaemic control worsens during and/or after non-surgical cancer treatment; the reasons are not well defined in individual studies. Future studies should examine why this is the case.
Approaches in biotechnological applications of natural polymers
Natural polymers, such as gums and mucilages, are biocompatible, cheap, easily available and non-toxic materials of native origin. These polymers are increasingly preferred over synthetic materials for industrial applications due to their intrinsic properties, and they are considered alternative sources of raw materials since they offer sustainability, biodegradability and biosafety. By definition, gums and mucilages are polysaccharides or complex carbohydrates consisting of one or more monosaccharides or their derivatives linked in a bewildering variety of linkages and structures. Natural gums occur naturally in a variety of plant seeds, tree or shrub exudates, seaweed extracts, fungi, bacteria, and animal sources. Water-soluble gums, also known as hydrocolloids, are exudates and pathological products, and therefore do not form part of the cell wall; mucilages, on the other hand, are physiological products and part of the cell. It is important to highlight that gums represent the largest amount of polymer material derived from plants. Gums have enormously large and broad applications in both food and non-food industries, being commonly used as thickening, binding, emulsifying, suspending and stabilizing agents, and as matrices for drug release in the pharmaceutical and cosmetic industries. In the food industry, their gelling properties and their ability to form edible films and coatings are extensively studied. The use of gums depends on the intrinsic properties that they provide, often at costs below those of synthetic polymers. To upgrade their value, gums are being processed into various forms, including, most recently, nanomaterials, for various biotechnological applications.
Thus, the main natural polymers, including galactomannans, cellulose, chitin, agar, carrageenan, alginate, cashew gum, pectin and starch, together with current research on them, are reviewed in this article. Acknowledgements: to the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for fellowships (LCBBC and MGCC) and to the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) (PBSA). This study was supported by the Portuguese Foundation for Science and Technology (FCT) under the scope of the strategic funding of the UID/BIO/04469/2013 unit, the project RECI/BBB-EBI/0179/2012 (FCOMP-01-0124-FEDER-027462) and COMPETE 2020 (POCI-01-0145-FEDER-006684) (JAT).
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
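The headline incidences and the crude contrast between HDI groups can be checked directly from the reported counts. The sketch below reproduces only the unadjusted arithmetic; the reported odds ratio of 1·60 comes from the study's Bayesian multilevel model with risk-factor adjustment, which this does not attempt, so the crude ratio is expectedly larger.

```python
def crude_or(events_a, n_a, events_b, n_b):
    """Unadjusted odds ratio of group A versus group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# 30-day SSI events / patients per HDI group, as reported in the Findings.
incidence = {"high": (691, 7339), "middle": (549, 3918), "low": (298, 1282)}
for group, (e, n) in incidence.items():
    print(f"{group}: {100 * e / n:.1f}%")   # prints 9.4%, 14.0%, 23.2%

print(round(crude_or(298, 1282, 691, 7339), 2))  # crude low-vs-high OR: 2.91
```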
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication