
    Rosetta: A container-centric science platform for resource-intensive, interactive data analysis

    Rosetta is a science platform for resource-intensive, interactive data analysis which runs user tasks as software containers. It is built on top of a novel architecture that frames user tasks as microservices (independent and self-contained units), which allows it to fully support custom, user-defined software packages, libraries and environments. These include complete remote desktop and GUI applications, in addition to common analysis environments such as Jupyter Notebooks. Rosetta relies on Open Container Initiative containers, which allow for safe, effective and reproducible code execution; can use a number of container engines and runtimes; and seamlessly supports several workload management systems, thus enabling containerized workloads on a wide range of computing resources. Although developed in the astronomy and astrophysics space, Rosetta can support virtually any science and technology domain where resource-intensive, interactive data analysis is required.

    Cloud access to interoperable IVOA-compliant VOSpace storage

    Handling, processing and archiving the huge amounts of data produced by the new generation of experiments and instruments in astronomy and astrophysics are among the most exciting challenges to address in designing future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available worldwide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions, and we describe the resulting hybrid worldwide cloud infrastructure, its benefits and limitations.

    Drop-out rate among patients treated with omalizumab for severe asthma: Literature review and real-life experience

    BACKGROUND: In patients with asthma, particularly severe asthma, poor adherence to inhaled drugs negatively affects the achievement of disease control. A better adherence rate is expected in the case of injected drugs, such as omalizumab, as they are administered only in a hospital setting. However, adherence to omalizumab has never been systematically investigated. The aim of this study was to review the omalizumab drop-out rate in randomized controlled trials (RCTs) and real-life studies. A comparative analysis was performed between published data and the Italian North East Omalizumab Network (NEONet) database. RESULTS: In RCTs the drop-out rate ranged from 7.1% to 19.4%. Although the reasons for withdrawal were only occasionally reported, patient decision and adverse events were the most frequently reported causes. In real-life studies the drop-out rate ranged from 0% to 45.5%. In most cases lack of efficacy was responsible for treatment discontinuation. According to NEONet data, 32% of treated patients dropped out, with an increasing number of drop-outs observed over time. Patient decision and lack of efficacy accounted for most treatment withdrawals. CONCLUSIONS: Treatment adherence is particularly crucial in patients with severe asthma, considering the clinical impact of the disease and the cost of non-adherence. The risk of treatment discontinuation has to be carefully considered in both experimental and real-life settings. Increased knowledge regarding the main reasons for patient withdrawal is important to improve adherence in clinical practice.

    BeyondPlanck II. CMB map-making through Gibbs sampling

    We present a Gibbs sampling solution to the map-making problem for CMB measurements, building on existing destriping methodology. Gibbs sampling breaks the computationally heavy destriping problem into two separate steps: noise filtering and map binning. Taken as two separate steps, both are computationally much cheaper than solving the combined problem. This provides a large performance benefit compared to traditional methods, and allows us for the first time to bring the destriping baseline length down to a single sample. We apply the Gibbs procedure to simulated Planck 30 GHz data. We find that gaps in the time-ordered data are handled efficiently by filling them with simulated noise as part of the Gibbs process. The Gibbs procedure yields a chain of map samples, from which we may compute the posterior mean as a best-estimate map. The variation in the chain provides information on the correlated residual noise, without the need to construct a full noise covariance matrix. However, if only a single maximum-likelihood frequency map estimate is required, we find that traditional conjugate gradient solvers converge much faster than a Gibbs sampler in terms of the total number of iterations. The conceptual advantages of the Gibbs sampling approach lie in statistically well-defined error propagation and systematic error correction, and this methodology forms the conceptual basis for the map-making algorithm employed in the BeyondPlanck framework, which implements the first end-to-end Bayesian analysis pipeline for CMB observations. Comment: 11 pages, 10 figures. All BeyondPlanck products and software will be released publicly at http://beyondplanck.science.
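The two-step scheme described in the abstract (alternately sampling noise baselines given the map, then the map given the baselines) can be illustrated with a toy Gibbs destriper. This is a minimal sketch under assumed simplifications (one detector, white noise, random pointing, flat priors), not the BeyondPlanck implementation; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data model: d = m[pix] + baseline + white noise, where `pix`
# maps each time-ordered sample to a sky pixel.
npix, baseline_len, nbase = 16, 64, 64
nsamp = nbase * baseline_len
pix = rng.integers(0, npix, nsamp)                 # pointing
m_true = rng.normal(0.0, 1.0, npix)                # input sky map
base_true = np.repeat(rng.normal(0.0, 0.5, nbase), baseline_len)
sigma = 0.1                                        # white-noise level
d = m_true[pix] + base_true + rng.normal(0.0, sigma, nsamp)

hits = np.bincount(pix, minlength=npix)            # samples per pixel

m = np.zeros(npix)
m_sum = np.zeros(npix)
nburn, niter = 100, 200
for it in range(niter):
    # Step 1, "noise filtering": sample baseline offsets given the map.
    # Conditional mean is the per-baseline residual average, with
    # standard deviation sigma / sqrt(baseline_len).
    resid = (d - m[pix]).reshape(nbase, baseline_len)
    base = resid.mean(axis=1) + rng.normal(0.0, sigma / np.sqrt(baseline_len), nbase)
    base = np.repeat(base, baseline_len)
    # Step 2, "map binning": sample the map given the baselines.
    # Conditional mean is the binned average of the cleaned data,
    # with per-pixel standard deviation sigma / sqrt(hits).
    m = np.bincount(pix, weights=d - base, minlength=npix) / hits
    m += rng.normal(0.0, sigma / np.sqrt(hits))
    if it >= nburn:
        m_sum += m

m_mean = m_sum / (niter - nburn)                   # posterior-mean map
# The absolute offset is degenerate between map and baselines,
# so compare maps with their means removed.
err = (m_mean - m_mean.mean()) - (m_true - m_true.mean())
```

The chain of `m` samples plays the role of the map samples described above: their spread encodes the correlated residual noise without ever forming a full noise covariance matrix.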

    BeyondPlanck VII. Bayesian estimation of gain and absolute calibration for CMB experiments

    We present a Bayesian calibration algorithm for CMB observations as implemented within the global end-to-end BeyondPlanck (BP) framework, and apply it to the Planck Low Frequency Instrument (LFI) data. Following the most recent Planck analysis, we decompose the full time-dependent gain into a sum of three orthogonal components: one absolute calibration term, common to all detectors; one time-independent term that can vary between detectors; and one time-dependent component that is allowed to vary between one-hour pointing periods. Each term is then sampled conditionally on all other parameters in the global signal model through Gibbs sampling. The absolute calibration is sampled using only the orbital dipole as a reference source, while the two relative gain components are sampled using the full sky signal, including the orbital and solar CMB dipoles, CMB fluctuations, and foreground contributions. We discuss various aspects of the data that influence gain estimation, including the dipole/polarization quadrupole degeneracy and anomalous jumps in the instrumental gain. Comparing our solution to previous pipelines, we find good agreement in general, with relative deviations of -0.84% (-0.67%) for 30 GHz, -0.14% (0.02%) for 44 GHz and -0.69% (-0.08%) for 70 GHz, compared to Planck 2018 (NPIPE). The deviations we find are within expected error bounds, and we attribute them to differences in data usage and general approach between the pipelines. In particular, the BP calibration is performed globally, resulting in better inter-frequency consistency. Additionally, WMAP observations are used actively in the BP analysis, which breaks degeneracies in the Planck data set and results in better agreement with WMAP. Although our presentation and algorithm are currently oriented toward LFI processing, the procedure is fully generalizable to other experiments. Comment: 18 pages, 15 figures. All BeyondPlanck products and software will be released publicly at http://beyondplanck.science.
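The absolute-calibration step described above, sampling a gain conditionally with the orbital dipole as the only reference, reduces in its simplest form to a Gaussian draw for a linear amplitude. Below is a toy single-detector sketch under assumed simplifications (white noise, a stand-in sinusoid for the orbital-dipole template, flat prior); it is not the BP pipeline, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data model: d = g * s_orb + n, with the orbital dipole
# template s_orb as the calibration reference.
nsamp = 10_000
g_true, sigma = 1.07, 0.5
t = np.linspace(0.0, 20.0 * np.pi, nsamp)
s_orb = np.sin(t)                       # stand-in orbital-dipole template
d = g_true * s_orb + rng.normal(0.0, sigma, nsamp)

# With a flat prior and white noise, the conditional posterior of the
# gain g is Gaussian:
#   mean = (s . d) / (s . s),  std = sigma / sqrt(s . s)
ss = s_orb @ s_orb
g_mean = (s_orb @ d) / ss
g_draw = g_mean + rng.normal(0.0, sigma / np.sqrt(ss))
```

In a full Gibbs scheme this draw would be one conditional step, alternated with draws of the relative gain components and the sky model; here it simply illustrates how the orbital dipole alone pins down the absolute scale.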