
    Induction of T-cell responses against mutation-specific peptides from malignant pediatric brain tumor samples

    Medulloblastoma is the most common malignant brain tumor in childhood and adolescence and an important cause of cancer-related death in pediatric patients. Although standard therapy comprising surgery, chemotherapy and radiation can cure up to 80% of average-risk patients, it entails severe long-term cognitive adverse effects and is unsatisfactory in advanced tumors. Therefore, alternative treatment strategies need to be established. Immunotherapeutic approaches such as peptide vaccination and adoptive T-cell transfer (ATT) aim to enhance self-protection through detection and elimination of malignant cells. Tumor-specific neoepitopes are promising targets for ATT, as they are expressed exclusively by cancer tissue. Moreover, administration of mutation-derived peptide vaccines can augment the endogenous immune response through abundant presentation of tumor antigen. In this proof-of-concept study we demonstrate a highly individualized approach in which patient-specific neoepitopes are determined and tested for immunogenicity. Primary tumor samples from two pediatric medulloblastoma patients were analyzed in this project. Tumor-specific mutations were identified by next-generation sequencing of tumor tissue and whole blood, and variants were confirmed by deep sequencing. To identify neoepitope peptides presented by the patients’ human leucocyte antigen (HLA) molecules, HLA binding affinity was predicted in silico with the netMHC tool. The respective peptides were synthesized, and blood cells from healthy donors matching the patients’ HLA types provided T lymphocytes and dendritic cells for antigen presentation. After seven restimulations in vitro, CD8+ cytotoxic T-cell reactivity against neoepitopes was assessed via flow-cytometric analysis of interferon gamma and tumor necrosis factor alpha release. A successful de novo T-cell response was induced for 9 of the 19 tested peptides.
In this proof-of-principle study we show that induction of a T-cell response against medulloblastoma-derived neoantigens is feasible despite low mutational burden and low immunogenicity. In the future, this strategy could be used to synthesize individualized peptide cocktails for peptide vaccination or to identify medulloblastoma-specific T-cell receptors for ATT. The long-term aims of this study are the characterization of medulloblastoma/T-cell interactions and the improvement of current treatment options for pediatric patients with advanced medulloblastoma.
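The in silico step described above boils down to ranking candidate peptides by predicted HLA binding affinity. A minimal sketch of such post-processing follows; the peptide sequences, IC50 values, and function name are invented for illustration, and the 500 nM cutoff is a commonly used convention for calling a peptide an HLA binder, not a value taken from this study.

```python
# Hypothetical post-processing of netMHC-style binding predictions.
# All sequences and IC50 values below are illustrative, not study data.
BINDING_THRESHOLD_NM = 500.0  # common cutoff for calling a peptide an HLA binder

predicted_affinities = {
    "KLNEPVLLL": 32.5,    # strong predicted binder (IC50 in nM)
    "SLYNTVATL": 210.0,   # weaker predicted binder
    "GILGFVFTL": 1800.0,  # predicted non-binder
}

def select_candidate_neoepitopes(affinities, threshold=BINDING_THRESHOLD_NM):
    """Keep peptides whose predicted IC50 is below the binding threshold,
    ranked from strongest (lowest IC50) to weakest."""
    binders = {p: ic50 for p, ic50 in affinities.items() if ic50 < threshold}
    return sorted(binders, key=binders.get)

candidates = select_candidate_neoepitopes(predicted_affinities)
print(candidates)  # ['KLNEPVLLL', 'SLYNTVATL']
```

Only the peptides passing this kind of filter would then be synthesized and taken forward into the T-cell stimulation experiments.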

    Main memory in HPC: do we need more, or could we live with less?

    An important aspect of High-Performance Computing (HPC) system design is the choice of main memory capacity. This choice becomes increasingly important now that 3D-stacked memories are entering the market. Compared with conventional Dual In-line Memory Modules (DIMMs), 3D memory chiplets provide better performance and energy efficiency but lower memory capacities. Therefore, the adoption of 3D-stacked memories in the HPC domain depends on whether we can find use cases that require much less memory than is available now. This study analyzes the memory capacity requirements of important HPC benchmarks and applications. We find that the High-Performance Conjugate Gradients (HPCG) benchmark could be an important success story for 3D-stacked memories in HPC, but High-Performance Linpack (HPL) is likely to be constrained by 3D memory capacity. The study also emphasizes that the analysis of memory footprints of production HPC applications is complex and that it requires an understanding of application scalability and target category, i.e., whether the users target capability or capacity computing. The results show that most of the HPC applications under study have per-core memory footprints in the range of hundreds of megabytes, but we also detect applications and use cases that require gigabytes per core. Overall, the study identifies the HPC applications and use cases with memory footprints that could be provided by 3D-stacked memory chiplets, making a first step toward adoption of this novel technology in the HPC domain.
This work was supported by the Collaboration Agreement between Samsung Electronics Co., Ltd. and BSC, by the Spanish Government through the Severo Ochoa programme (SEV-2015-0493), by the Spanish Ministry of Science and Technology through the TIN2015-65316-P project, and by the Generalitat de Catalunya (contracts 2014-SGR-1051 and 2014-SGR-1272). This work has also received funding from the European Union’s Horizon 2020 research and innovation programme under the ExaNoDe project (grant agreement No 671578). Darko Zivanovic holds the Severo Ochoa grant (SVP-2014-068501) of the Ministry of Economy and Competitiveness of Spain. The authors thank Harald Servat from BSC and Vladimir Marjanović from the High Performance Computing Center Stuttgart for their technical support. Postprint (published version)
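The capacity question the abstract raises can be framed as simple arithmetic: does the aggregate per-core footprint of a node's cores fit the capacity a 3D-stacked part could offer? A back-of-the-envelope sketch, with all numbers (core count, node capacity) chosen for illustration rather than taken from the paper:

```python
# Illustrative capacity check: footprints and capacities are assumptions,
# not figures from the study.

def fits_in_3d_memory(footprint_mb_per_core, cores_per_node, node_capacity_gb):
    """Return (fits, total_gb): whether the aggregate footprint of all cores
    on a node fits the node's 3D-stacked memory capacity."""
    total_gb = footprint_mb_per_core * cores_per_node / 1024
    return total_gb <= node_capacity_gb, total_gb

# A few hundred MB per core (the common case the study reports) on a
# hypothetical 64-core node with 16 GB of stacked memory:
print(fits_in_3d_memory(200, 64, 16))   # (True, 12.5)
# A gigabytes-per-core use case blows the same budget:
print(fits_in_3d_memory(2048, 64, 16))  # (False, 128.0)
```

This mirrors the paper's conclusion in miniature: hundreds-of-megabytes footprints are plausible 3D-memory use cases, while gigabytes-per-core workloads are capacity-constrained.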

    PROFET: modeling system performance and energy without simulating the CPU

    The approaching end of DRAM scaling and the expansion of emerging memory technologies are motivating a lot of research in future memory systems. Novel memory systems are typically explored by hardware simulators that are slow and often have a simplified or obsolete abstraction of the CPU. This study presents PROFET, an analytical model that predicts how an application's performance and energy consumption change when it is executed on different memory systems. The model is based on instrumentation of an application execution on actual hardware, so it already takes into account CPU microarchitectural details such as the data prefetcher and out-of-order engine. PROFET is evaluated on two real platforms: Sandy Bridge-EP E5-2670 and Knights Landing Xeon Phi platforms with various memory configurations. The evaluation results show that PROFET's predictions are accurate, typically with only 2% difference from the values measured on actual hardware. We release the PROFET source code and all input data required for memory system and application profiling. The released package can be seamlessly installed and used on high-end Intel platforms. Peer reviewed. Postprint (author's final draft)
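To convey the flavor of an analytical (simulation-free) performance model, here is a toy sketch that is much simpler than, and not equivalent to, the actual PROFET model: measured runtime is split into a compute part and a memory-stall part, and only the stall part is rescaled by the latency ratio of the target memory. The function name, parameters, and numbers are all illustrative assumptions.

```python
# Toy analytical model in the spirit of, but far simpler than, PROFET.
# All names and numbers here are illustrative assumptions.

def predict_runtime(measured_time_s, stall_fraction, lat_baseline_ns, lat_target_ns):
    """Predict runtime on a memory system with a different access latency.
    stall_fraction: fraction of measured time the CPU spent stalled on memory."""
    compute = measured_time_s * (1.0 - stall_fraction)
    stalls = measured_time_s * stall_fraction
    # Only memory-stall time scales with the latency of the new memory.
    return compute + stalls * (lat_target_ns / lat_baseline_ns)

# An app spending 40% of 10 s stalled on 90 ns DRAM, moved to 60 ns memory:
print(round(predict_runtime(10.0, 0.4, 90.0, 60.0), 2))  # 8.67
```

The appeal of this style of model is exactly what the abstract claims: the measured baseline already embeds prefetching and out-of-order effects, so no CPU simulation is needed.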

    Maximally Divergent Intervals for Anomaly Detection

    We present new methods for batch anomaly detection in multivariate time series. Our methods are based on maximizing the Kullback-Leibler divergence between the data distribution within and outside an interval of the time series. An empirical analysis shows the benefits of our algorithms compared to methods that treat each time step independently of one another, without optimizing over all possible intervals. Comment: ICML Workshop on Anomaly Detection

    Dramatic Changes in the Electronic Structure Upon Transition to the Collapsed Tetragonal Phase in CaFe2As2

    We use angle-resolved photoemission spectroscopy (ARPES) and density functional theory (DFT) calculations to study the electronic structure of CaFe2As2 in the previously unexplored collapsed tetragonal (CT) phase. This unusual phase of the iron arsenic high temperature superconductors was hard to measure as it exists only under pressure. By inducing internal strain via post-growth thermal treatment of the single crystals, we were able to stabilize the CT phase at ambient pressure. We find significant differences in the Fermi surface topology and band dispersion data from the more common orthorhombic-antiferromagnetic or tetragonal-paramagnetic phases, consistent with electronic structure calculations. The top of the hole bands sinks below the Fermi level, which destroys the nesting present in the parent phases. The absence of nesting in this phase, along with the apparent loss of the Fe magnetic moment, is now clearly experimentally correlated with the lack of superconductivity in this phase. Comment: 5 pages, 4 figures, accepted in PRB (RC)

    Rates of erosion and landscape change along the Blue Ridge escarpment, southern Appalachian Mountains, estimated from in situ cosmogenic 10Be

    The Blue Ridge escarpment, located within the southern Appalachian Mountains of Virginia and North Carolina, forms a distinct, steep boundary between the lower-elevation Piedmont and higher-elevation Blue Ridge physiographic provinces. To better understand the rate at which this landform and the adjacent landscape are changing, we measured cosmogenic 10Be in quartz separated from sediment samples (n = 50) collected in thirty-two streams and from three exposed bedrock outcrops along four transects normal to the escarpment, allowing us to calculate erosion rates integrated over 10⁴–10⁵ years. These basin-averaged erosion rates (5.4–49 m My⁻¹) are consistent with those measured elsewhere in the southern Appalachians and show a positive relationship between erosion rate and average basin slope. Erosion rates show no relationship with basin size or relative position of the Brevard fault zone, a fundamental structural element of the region. The cosmogenic isotopic data, when considered along with the distribution of average basin slopes in each physiographic province, suggest that the escarpment is eroding on average more rapidly than the Blue Ridge uplands, which are eroding more rapidly than the Piedmont lowlands. This difference in erosion rates by geomorphic setting suggests that the elevation difference between the uplands and lowlands adjacent to the escarpment is being reduced, but at extremely slow rates.
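The conversion from a measured 10Be concentration to an erosion rate uses the standard steady-state relation N = P / (λ + ρε/Λ), rearranged for ε. A minimal sketch follows; the decay constant and attenuation length are standard published values, while the sample concentration, production rate, and density are illustrative assumptions, not data from this study.

```python
import math

# Steady-state erosion rate from an in situ cosmogenic 10Be concentration:
#   N = P / (lambda + rho * eps / Lambda)   =>   eps = (Lambda / rho) * (P / N - lambda)
# Sample values below are illustrative, not data from this study.

HALF_LIFE_10BE_YR = 1.387e6                       # 10Be half-life (yr)
DECAY_CONST = math.log(2) / HALF_LIFE_10BE_YR     # lambda (1/yr)

def erosion_rate_m_per_myr(N, P, rho=2.7, attenuation=160.0):
    """N: 10Be concentration (atoms/g quartz); P: local production rate
    (atoms/g/yr); rho: rock density (g/cm^3); attenuation: attenuation
    length Lambda (g/cm^2). Returns the erosion rate in m/Myr."""
    eps_cm_per_yr = (attenuation / rho) * (P / N - DECAY_CONST)
    return eps_cm_per_yr * 1e4  # cm/yr -> m/Myr

# e.g. N = 2e5 atoms/g at a production rate of 5 atoms/g/yr:
print(round(erosion_rate_m_per_myr(2e5, 5.0), 1))  # 14.5
```

Note that lower measured concentrations imply faster erosion: quartz that spends less time near the surface accumulates fewer 10Be atoms before being stripped away.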

    Authentication and authorisation in entrusted unions

    This paper reports on the status of a project whose aim is to implement and demonstrate, in a real-life environment, an integrated eAuthentication and eAuthorisation framework to enable trusted collaborations and delivery of services across different organisational/governmental jurisdictions. This aim will be achieved by designing a framework with assurance of claims, trust indicators, policy enforcement mechanisms and processing under encryption to address the security and confidentiality requirements of large distributed infrastructures. The framework supports collaborative secure distributed storage, secure data processing and management in both cloud and offline scenarios, and is intended to be deployed and tested in two pilot studies in two different domains, viz. bio-security incident management and Ambient Assisted Living (eHealth). Interim results in terms of security requirements, privacy-preserving authentication, and authorisation are reported.

    Education for sustainable construction

    The COST Action C25 "Sustainability of Constructions - Integrated Approach to Life-time Structural Engineering" is a network of scientists and researchers from 28 European countries and the EU Joint Research Centre in Ispra. It was established to promote science-based developments in sustainable construction in Europe through research on life-time structural engineering. The COST Action has been active since 2006. COST, European Science Foundation