60 research outputs found
Ontology Based Integration of Distributed and Heterogeneous Data Sources in ACGT.
In this work, we describe the set of tools comprising the Data Access Infrastructure of Advancing Clinico-Genomic Trials on Cancer (ACGT), an R&D project funded in part by the European Commission. This infrastructure aims at improving post-genomic clinical trials by providing seamless access to integrated clinical, genetic, and image databases. A data access layer, based on OGSA-DAI, has been developed in order to cope with syntactic heterogeneities in the databases. The semantic heterogeneity of these diverse data sources is tackled by two core tools, namely the Semantic Mediator and the Master Ontology on Cancer. The ontology is used as a common framework for semantics, modeling the domain and supporting homogenization. SPARQL has been selected as the query language for both the Data Access Services and the Mediator. Two experiments, integrating clinical and DICOM image databases, have been carried out in order to test the suitability of the selected approach.
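As a rough illustration of the kind of query such a mediator accepts, the sketch below issues a SPARQL query for patients with linked DICOM imaging studies. The endpoint URL, namespace, and class and property names are hypothetical placeholders, not the actual Master Ontology on Cancer vocabulary or ACGT service addresses.

```python
# Hypothetical endpoint and vocabulary; the real Master Ontology on
# Cancer defines its own classes and properties.
from SPARQLWrapper import SPARQLWrapper, JSON

query = """
PREFIX mo: <http://example.org/acgt/master-ontology#>
SELECT ?patient ?study WHERE {
  ?patient a mo:Patient ;
           mo:hasImagingStudy ?study .
  ?study   a mo:DICOMStudy .
}
"""
endpoint = SPARQLWrapper("http://example.org/acgt/mediator/sparql")
endpoint.setQuery(query)
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["patient"]["value"], row["study"]["value"])
```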
The relentless variability of Mrk 421 from the TeV to the radio
The origin of the gamma-ray emission of the blazar Mrk 421 is still a matter of debate. We used 5.5 years of unbiased observing campaign data, obtained with the FACT telescope and the Fermi LAT detector at TeV and GeV energies, the longest and densest such data set so far, together with contemporaneous multi-wavelength observations, to characterise the variability of Mrk 421 and to constrain the underlying physical mechanisms. We studied and correlated light curves obtained by ten different instruments and found two significant results. The TeV and X-ray light curves are very well correlated, with a lag of <0.6 days. The GeV and radio (15 GHz band) light curves are widely and strongly correlated, and variations of the GeV light curve lead those in the radio. Lepto-hadronic and purely hadronic models in the frame of shock acceleration predict proton acceleration or cooling timescales that are ruled out by the short variability timescales and delays observed in Mrk 421. Instead, the observations match the predictions of leptonic models.
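The published analysis works with unevenly sampled multi-instrument data and uses more robust techniques such as the discrete correlation function; the minimal sketch below only illustrates the basic idea of a lag estimate, assuming two light curves already resampled onto a common regular grid (the function name and interface are my own).

```python
import numpy as np

def peak_lag(t, x, y, max_lag):
    """Estimate the lag at which light curve y best matches x,
    assuming both were resampled onto the same regular grid t.
    A positive result means features in x lead those in y."""
    dt = t[1] - t[0]
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n_max = int(max_lag / dt)

    def corr(k):  # correlation of x[i] with y[i + k]
        if k >= 0:
            return np.mean(x[:len(x) - k] * y[k:])
        return np.mean(x[-k:] * y[:len(y) + k])

    lags = np.arange(-n_max, n_max + 1)
    cc = np.array([corr(k) for k in lags])
    return lags[np.argmax(cc)] * dt
```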
Single-cell transcriptomic atlas-guided development of CAR-T cells for the treatment of acute myeloid leukemia
A single-cell screening approach identifies targets for CAR-T cells in acute myeloid leukemia. Chimeric antigen receptor T cells (CAR-T cells) have emerged as a powerful treatment option for individuals with B cell malignancies but have yet to achieve success in treating acute myeloid leukemia (AML), due to a lack of safe targets. Here we leveraged an atlas of publicly available RNA-sequencing data of over 500,000 single cells from 15 individuals with AML and tissue from 9 healthy individuals to predict target antigens that are expressed on malignant cells but absent from healthy cells, including T cells. Aided by this high-resolution, single-cell expression approach, we computationally identify colony-stimulating factor 1 receptor and cluster of differentiation 86 as targets for CAR-T cell therapy in AML. Functional validation of these established CAR-T cells shows robust in vitro and in vivo efficacy in cell line- and human-derived AML models, with minimal off-target toxicity toward relevant healthy human tissues. This provides a strong rationale for further clinical development.
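A hedged sketch of the screening logic described above: keep genes detected in most malignant cells but in almost no cells of any healthy population, including T cells. The data layout, labels, and thresholds are illustrative assumptions, not the authors' actual pipeline.

```python
import pandas as pd

def candidate_targets(expr, labels, on_frac=0.5, off_frac=0.05):
    """Keep genes detected in at least `on_frac` of malignant cells
    but in at most `off_frac` of the cells of every healthy
    population (including T cells).

    expr   -- cells x genes DataFrame of counts (0 = not detected)
    labels -- per-cell population label, with "AML" marking
              malignant cells; labels and thresholds here are
              illustrative, not the authors' pipeline
    """
    frac = (expr > 0).groupby(labels).mean()   # populations x genes
    on = frac.loc["AML"] >= on_frac
    off = (frac.drop(index="AML") <= off_frac).all(axis=0)
    return frac.columns[(on & off).values].tolist()
```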
Current Wildland Fire Patterns and Challenges in Europe: A Synthesis of National Perspectives
Changes in climate, land use, and land management impact the occurrence and severity of wildland fires in many parts of the world. This is particularly evident in Europe, where ongoing changes in land use have strongly modified fire patterns over the last decades. Although satellite data from the European Forest Fire Information System provide large-scale wildland fire statistics across European countries, there is still a crucial need to collect and summarize in-depth local analysis and understanding of the wildland fire condition and associated challenges across Europe. This article aims to provide a general overview of the current wildland fire patterns and challenges as perceived by national representatives, supplemented by national fire statistics (2009-2018) across Europe. For each of the 31 countries included, we present a perspective authored by scientists or practitioners from each respective country, representing a wide range of disciplines and cultural backgrounds. The authors were selected from members of the COST Action "Fire and the Earth System: Science & Society", funded by the European Commission with the aim of sharing knowledge and improving communication about wildland fire. Where relevant, a brief overview of key studies, particular wildland fire challenges a country is facing, and an overview of notable recent fire events are also presented. Key perceived challenges included (1) the lack of consistent and detailed records for wildland fire events, within and across countries, (2) an increase in wildland fires that pose a risk to properties and human life due to high population densities and sprawl into forested regions, and (3) the view that, irrespective of changes in management, climate change is likely to increase the frequency and impact of wildland fires in the coming decades. Addressing challenge (1) will not only be valuable in advancing national and pan-European wildland fire management strategies, but also in evaluating perceptions (2) and (3) against more robust quantitative evidence.
Architecture and performance of the KM3NeT front-end firmware
The KM3NeT infrastructure consists of two deep-sea neutrino telescopes being deployed in the Mediterranean Sea. The telescopes will detect extraterrestrial and atmospheric neutrinos by means of the Cherenkov photons induced by the passage through the seawater of relativistic charged particles produced in neutrino interactions. The telescopes are configured in a three-dimensional grid of digital optical modules, each hosting 31 photomultipliers. The photomultiplier signals produced by the incident Cherenkov photons are converted into digital information consisting of the time at which the pulse surpasses a chosen threshold and the duration of the pulse above it. The digitization is done by means of time to digital converters (TDCs) embedded in the field programmable gate array of the central logic board. Subsequently, a state machine formats the acquired data for transmission to shore. We present the architecture and performance of the front-end firmware consisting of the TDCs and the state machine.
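The toy sketch below mimics in software what the TDCs measure in hardware: for each pulse in a digitised photomultiplier trace it returns the threshold-crossing time and the time over threshold. The real measurement runs in FPGA logic on a comparator output rather than on a sampled waveform, so this only illustrates the quantities involved, not the firmware itself.

```python
import numpy as np

def hits_from_trace(t, v, threshold):
    """Toy TDC: for each pulse in a sampled PMT trace (times t,
    voltages v), return (threshold-crossing time, time over
    threshold). Assumes the trace starts and ends below threshold."""
    above = v > threshold
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1   # first sample above
    ends = np.where(edges == -1)[0] + 1    # first sample back below
    return [(t[s], t[e] - t[s]) for s, e in zip(starts, ends)]
```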
GA4GH: International policies and standards for data sharing across genomic research and healthcare.
The Global Alliance for Genomics and Health (GA4GH) aims to accelerate biomedical advances by enabling the responsible sharing of clinical and genomic data through both harmonized data aggregation and federated approaches. The decreasing cost of genomic sequencing (along with other genome-wide molecular assays) and increasing evidence of its clinical utility will soon drive the generation of sequence data from tens of millions of humans, with increasing levels of diversity. In this perspective, we present the GA4GH strategies for addressing the major challenges of this data revolution. We describe the GA4GH organization, which is fueled by the development efforts of eight Work Streams and informed by the needs of 24 Driver Projects and other key stakeholders. We present the GA4GH suite of secure, interoperable technical standards and policy frameworks and review the current status of standards, their relevance to key domains of research and clinical care, and future plans of GA4GH. Broad international participation in building, adopting, and deploying GA4GH standards and frameworks will catalyze an unprecedented effort in data sharing that will be critical to advancing genomic medicine and ensuring that all populations can access its benefits.
Fractional variability—a tool to study blazar variability
Active Galactic Nuclei emit radiation over the whole electromagnetic spectrum, up to TeV energies. Blazars are a subtype whose jets point towards the observer. One of their typical features is extreme variability on timescales from minutes to years. The fractional variability is an often-used parameter for quantifying the degree of variability of a light curve. Different detection methods and sensitivities of the instruments result in differently binned data and in light curves with gaps. Because these differences can influence the physics interpretation of the broadband variability, their effects on the fractional variability need to be studied. In this paper, we study the systematic effects of completeness in time coverage and of the sampling rate. Using public data from instruments monitoring blazars in various energy ranges, we study the variability of the bright TeV blazars Mrk 421 and Mrk 501 across the electromagnetic spectrum, taking these systematic effects into account, and compare our findings with previous results. Especially in the TeV range, the fractional variability is higher than in previous studies, which can be explained by our much longer (seven years compared to a few weeks) and more complete data sample.
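For reference, the fractional variability is commonly defined (e.g. following Vaughan et al. 2003) as the measurement-noise-subtracted excess standard deviation normalised by the mean flux, F_var = sqrt((S^2 - <sigma_err^2>) / <x>^2). A minimal implementation of that standard definition is sketched below; the paper's exact conventions may differ.

```python
import numpy as np

def fractional_variability(flux, flux_err):
    """Fractional variability of a light curve: the measurement-
    noise-subtracted sample variance, normalised by the mean flux
    (standard definition, e.g. Vaughan et al. 2003)."""
    mean = np.mean(flux)
    excess = np.var(flux, ddof=1) - np.mean(np.square(flux_err))
    return np.sqrt(excess) / mean if excess > 0 else 0.0
```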
The online multi-agency support barometer
This report presents the findings of a small feasibility study which sought to investigate the use of an online risk-management Barometer. The Barometer was developed with a view to helping multiple agencies communicate about vulnerable and 'at risk' patients within mental health settings. It is an online tool which allows staff from multiple agencies to access and share information with other staff when a patient is at risk. The key aims and objectives of this small research project were i) to evaluate how professionals felt about their current risk assessment tools, ii) to assess the ease of use and the relevance of the questions within the Barometer tool, and iii) to discuss potential modifications and problem areas in incorporating the Barometer tool within mental health services and across a multi-disciplinary perspective. The research was conducted in three mental health services within the South Essex Partnership Trust (2 x CAMHS and 2 x Adult Services) using a mixed-methods approach.
Using the Empirical Attainment Function for Analyzing Single-objective Black-box Optimization Algorithms
A widely accepted way to assess the performance of iterative black-box optimizers is to analyze their empirical cumulative distribution function (ECDF) of pre-defined quality targets achieved not later than a given runtime. In this work, we consider an alternative approach based on the empirical attainment function (EAF), and we show that the target-based ECDF is an approximation of the EAF. We argue that the EAF has several advantages over the target-based ECDF. In particular, it does not require defining a priori quality targets per function, it captures performance differences more precisely, and it enables the use of additional summary statistics that enrich the analysis. We also show that the average area over the convergence curves is a simpler-to-calculate, but equivalent, measure of anytime performance. To facilitate the accessibility of the EAF, we integrate a module to compute it into the IOHanalyzer platform. Finally, we illustrate the use of the EAF via synthetic examples and via the data available for the BBOB suite.
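A minimal sketch of the target-based ECDF for a single minimising run is given below: the fraction of pre-defined targets reached by the best-so-far value at each step is the quantity the paper identifies as an approximation of the EAF. The function name and the fixed target list are illustrative assumptions.

```python
import numpy as np

def target_ecdf(best_so_far, targets):
    """Fraction of pre-defined quality targets reached at each step
    by the best-so-far value of a single minimising run; this is the
    target-based ECDF that the EAF generalises, removing the need to
    fix `targets` a priori."""
    best = np.minimum.accumulate(np.asarray(best_so_far, dtype=float))
    targets = np.asarray(targets, dtype=float)
    return (best[:, None] <= targets[None, :]).mean(axis=1)

# e.g. target_ecdf([10, 4, 4, 1], [8, 2, 0.5]) -> [0, 1/3, 1/3, 2/3]
```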