
    RelA regulates CXCL1/CXCR2-dependent oncogene-induced senescence in murine Kras-driven pancreatic carcinogenesis

    Tumor suppression that is mediated by oncogene-induced senescence (OIS) is considered to function as a safeguard during development of pancreatic ductal adenocarcinoma (PDAC). However, the mechanisms that regulate OIS in PDAC are poorly understood. Here, we have determined that nuclear RelA reinforces OIS to inhibit carcinogenesis in the Kras mouse model of PDAC. Inactivation of RelA accelerated pancreatic lesion formation in Kras mice by abrogating the senescence-associated secretory phenotype (SASP) gene transcription signature. Using genetic and pharmacological tools, we determined that RelA activation promotes OIS via elevation of the SASP factor CXCL1 (also known as KC), which activates CXCR2, during pancreatic carcinogenesis. In Kras mice, pancreas-specific inactivation of CXCR2 prevented OIS and was correlated with increased tumor proliferation and decreased survival. Moreover, reductions in CXCR2 levels were associated with advanced neoplastic lesions in tissue from human pancreatic specimens. Genetically disabling OIS in Kras mice caused RelA to promote tumor proliferation, suggesting a dual role for RelA signaling in pancreatic carcinogenesis. Taken together, our data suggest a pivotal role for RelA in regulating OIS in preneoplastic lesions and implicate the RelA/CXCL1/CXCR2 axis as an essential mechanism of tumor surveillance in PDAC.

    Early prediction of acute necrotizing pancreatitis by artificial intelligence: a prospective cohort analysis of 2387 cases

    Pancreatic necrosis is a consistent prognostic factor in acute pancreatitis (AP). However, the clinical scores currently in use are either too complicated, require data that are unavailable on admission, or lack sufficient predictive value. We therefore aimed to develop a tool to aid in necrosis prediction. The XGBoost machine learning algorithm processed data from 2387 patients with AP. The confidence of the model was estimated by a bootstrapping method and interpreted via the 10th and 90th percentiles of the prediction scores. Shapley Additive exPlanations (SHAP) values were calculated to quantify the contribution of each variable provided. Finally, the model was implemented as an online application using the Streamlit Python-based framework. The XGBoost classifier achieved an AUC of 0.757. Based on the SHAP values, glucose, C-reactive protein, alkaline phosphatase, gender, and total white blood cell count had the greatest impact on prediction. The relationship between training-dataset size and model performance indicates that predictive performance can be improved further as more data become available. This study combines necrosis prediction with artificial intelligence. The predictive potential of the model is comparable to the current clinical scoring systems, with several advantages over them.
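
    The following is a minimal Python sketch of the pipeline the abstract describes: an XGBoost classifier, bootstrap resampling for prediction confidence, and SHAP values for interpretation. The cohort data are not public, so a synthetic dataset stands in for the 2387 patients, and the hyperparameters and bootstrap count are illustrative assumptions rather than the study's actual settings.

    import numpy as np
    import shap
    from xgboost import XGBClassifier
    from sklearn.datasets import make_classification
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for the admission parameters of 2387 patients.
    X, y = make_classification(n_samples=2387, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
    model.fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))

    # Bootstrap refits estimate per-patient uncertainty: each prediction is
    # interpreted via the 10th and 90th percentiles of the resampled scores.
    scores = []
    for seed in range(100):
        idx = np.random.default_rng(seed).integers(0, len(X_tr), len(X_tr))
        m = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
        m.fit(X_tr[idx], y_tr[idx])
        scores.append(m.predict_proba(X_te)[:, 1])
    p10, p90 = np.percentile(scores, [10, 90], axis=0)

    # SHAP values quantify how much each variable contributed to a score.
    shap_values = shap.TreeExplainer(model).shap_values(X_te)

    Serving such a model as an online application then takes only a thin Streamlit layer around model.predict_proba.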

    Combined inhibition of BET family proteins and histone deacetylases as a potential epigenetics-based therapy for pancreatic ductal adenocarcinoma

    Pancreatic ductal adenocarcinoma (PDAC) is one of the most lethal human cancers and has shown resistance to every therapeutic strategy used to date. Here we tested small-molecule inhibitors targeting chromatin regulators as possible therapeutic agents in PDAC. We show that JQ1, an inhibitor of the bromodomain and extraterminal (BET) family of proteins, suppresses PDAC development in mice by inhibiting both MYC activity and inflammatory signals. The histone deacetylase (HDAC) inhibitor SAHA synergizes with JQ1 to augment cell death and more potently suppress advanced PDAC. Finally, using a CRISPR-Cas9–based method for gene editing directly in the adult mouse pancreas, we show that de-repression of p57 (also known as KIP2 or CDKN1C) upon combined BET and HDAC inhibition is required for the induction of combination therapy–induced cell death in PDAC. SAHA is approved for human use, and molecules similar to JQ1 are being tested in clinical trials. Thus, these studies identify a promising epigenetics-based therapeutic strategy that may be rapidly implemented in fatal human tumors.

    The FASER Detector

    FASER, the ForwArd Search ExpeRiment, is an experiment dedicated to searching for light, extremely weakly-interacting particles at CERN's Large Hadron Collider (LHC). Such particles may be produced in the very forward direction of the LHC's high-energy collisions and then decay to visible particles inside the FASER detector, which is placed 480 m downstream of the ATLAS interaction point, aligned with the beam collision axis. FASER also includes a sub-detector, FASERν, designed to detect neutrinos produced in the LHC collisions and to study their properties. In this paper, each component of the FASER detector is described in detail, as well as the installation of the experiment and its commissioning using cosmic rays collected in September 2021 and during the LHC pilot beam test carried out in October 2021. FASER will start taking LHC collision data in 2022 and will run throughout LHC Run 3.

    DUNE DAQ R&D integration in ProtoDUNE Single-Phase at CERN

    The DAQ system of ProtoDUNE-SP successfully proved its design principles and met the requirements of the 2018 beam run. The technical design of the DAQ system for the DUNE experiment differs substantially from the prototype, owing to different requirements placed on the detector as well as a radically different location of operation. The single-phase prototype at CERN is a major integration facility for R&D aspects of the DUNE DAQ system. The facility allows for the exploration of additional data-processing capabilities and optimization of the FELIX system, the chosen TPC readout solution for the DUNE single-phase detectors. One of the fundamental differences from the prototype is that the DUNE DAQ relies on self-triggering. Real-time processing of the data stream for hit and trigger-primitive finding is therefore essential to meet the requirement of continuous readout; a simplified sketch of this idea follows below. The supernova burst trigger requires large and fast buffering, for which 3D XPoint persistent-memory solutions are evaluated and integrated. In order to maximize resource utilization of the FELIX hosting servers, it is desirable to eliminate the 100 Gb network communication stack. This implies the design and development of a single-host application layer, which is a fundamental element of the self-triggering chain. This paper discusses the evaluation and integration of these developments for the DUNE DAQ in the ProtoDUNE environment.
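
    In production, hit finding runs as optimized real-time code on the FELIX hosting servers; the Python sketch below only illustrates the basic threshold-crossing idea behind trigger-primitive generation. The pedestal, threshold, and waveform values are illustrative assumptions, not DUNE parameters.

    import numpy as np

    def find_trigger_primitives(adc, pedestal, threshold):
        """Emit one primitive per contiguous above-threshold region:
        (start tick, end tick, time over threshold, peak ADC, summed ADC)."""
        sig = adc.astype(int) - pedestal
        above = sig > threshold
        # Pad with zeros so diff() also catches regions at the waveform edges.
        edges = np.diff(np.concatenate(([0], above.astype(int), [0])))
        starts = np.flatnonzero(edges == 1)
        ends = np.flatnonzero(edges == -1)  # exclusive end tick
        return [(s, e, e - s, int(sig[s:e].max()), int(sig[s:e].sum()))
                for s, e in zip(starts, ends)]

    # Example: a noisy baseline around 500 ADC with one injected pulse.
    rng = np.random.default_rng(0)
    wf = rng.normal(500, 2, 6000)
    wf[1200:1210] += [5, 20, 60, 90, 70, 40, 20, 10, 5, 2]
    print(find_trigger_primitives(wf, pedestal=500, threshold=15))

    Running a primitive finder of this kind continuously over the incoming data stream is what makes self-triggering, and hence the large supernova-burst buffer, feasible.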

    Functional tests of a prototype for the CMS-ATLAS common non-event data handling framework

    Since 2014, the ATLAS and CMS experiments have shared a common vision for the conditions database infrastructure required for the forthcoming LHC runs. The large commonality in the use cases to be satisfied allowed the two experiments to agree on an overall design that meets the requirements of both. A first prototype implementing this design was completed in 2015 and made available to both experiments. The prototype is based on a web service implementing a REST API with a set of functions for the management of conditions data. The objects that constitute the elements of the data model are exposed as resources on which CRUD operations can be performed via standard HTTP methods. Placing a REST API in the architecture has two main advantages: (1) conditions data are exchanged in a neutral format (JSON or XML), so they can be processed by different technologies in different frameworks; (2) the client is agnostic with respect to the underlying persistency technology, allowing both RDBMS and NoSQL back ends. The implementation uses standard Java technologies for server-based applications, which eases integration with the existing Java-based applications in use in both experiments, notably the FronTier service in the distributed computing environment. In this contribution, we describe tests of this prototype performed within the CMS computing infrastructure, with the aim of validating the main use cases and suggesting future improvements. Since the data model reflected in this prototype is very close to the layout of the current CMS conditions database, the tests could be run on existing conditions data without additional manipulation. The integration strategy consisted of replacing the inner conditions-database software layer with a plugin that accesses the web service and decodes the retrieved data into the appropriate object structures used in the CMS offline software. This strategy was applied to run a test suite on the specific physics data samples used by the experiment for software release validation.
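
    As an illustration of the resource-oriented access pattern described above, the short Python sketch below exercises hypothetical CRUD endpoints with the standard requests library. The service URL, resource paths, and payload fields are invented for the example and are not the prototype's actual API.

    import requests

    BASE = "http://conddb.example.cern.ch/api"  # hypothetical service URL

    # Create: POST a new conditions entry (fields are illustrative).
    payload = {"tag": "BeamSpot_v1", "since": 305112, "hash": "ab12cd"}
    requests.post(f"{BASE}/iovs", json=payload).raise_for_status()

    # Read: GET returns a neutral format (JSON here), so any client
    # technology in any framework can decode it.
    iov = requests.get(f"{BASE}/iovs", params={"tag": "BeamSpot_v1"}).json()

    # Update and delete map onto PUT and DELETE on the same resources.
    requests.put(f"{BASE}/tags/BeamSpot_v1", json={"description": "validated"})
    requests.delete(f"{BASE}/iovs/305112")

    Because every operation is a plain HTTP verb on a resource, the same service can be driven from Java, Python, or any other client stack, which is precisely the neutrality the prototype's design aims for.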