Rosetta: A container-centric science platform for resource-intensive, interactive data analysis
Rosetta is a science platform for resource-intensive, interactive data analysis which runs user tasks as software containers. It is built on a novel architecture that frames user tasks as microservices – independent and self-contained units – which allows it to fully support custom, user-defined software packages, libraries and environments. These include complete remote desktop and GUI applications, in addition to common analysis environments such as Jupyter Notebooks. Rosetta relies on Open Container Initiative containers, which allow for safe, effective and reproducible code execution; it can use a number of container engines and runtimes, and it seamlessly supports several workload management systems, thus enabling containerized workloads on a wide range of computing resources. Although developed in the astronomy and astrophysics domain, Rosetta can support virtually any science and technology domain where resource-intensive, interactive data analysis is required.
Drop-out rate among patients treated with omalizumab for severe asthma: Literature review and real-life experience
In patients with asthma, particularly severe asthma, poor adherence to inhaled drugs negatively affects the achievement of disease control. A better adherence rate is expected in the case of injected drugs, such as omalizumab, as they are administered only in a hospital setting. However, adherence to omalizumab has never been systematically investigated. The aim of this study was to review the omalizumab drop-out rate in randomized controlled trials (RCTs) and real-life studies. A comparative analysis was performed between published data and the Italian North East Omalizumab Network (NEONet) database
BeyondPlanck II. CMB map-making through Gibbs sampling
We present a Gibbs sampling solution to the map-making problem for CMB
measurements, building on existing destriping methodology. Gibbs sampling
breaks the computationally heavy destriping problem into two separate steps:
noise filtering and map binning. Considered separately, both are
computationally much cheaper than solving the combined problem. This provides a
huge performance benefit as compared to traditional methods, and allows us for
the first time to bring the destriping baseline length to a single sample. We
apply the Gibbs procedure to simulated Planck 30 GHz data. We find that gaps in
the time-ordered data are handled efficiently by filling them with simulated
noise as part of the Gibbs process. The Gibbs procedure yields a chain of map
samples, from which we may compute the posterior mean as a best-estimate map.
The variation in the chain provides information on the correlated residual
noise, without need to construct a full noise covariance matrix. However, if
only a single maximum-likelihood frequency map estimate is required, we find
that traditional conjugate gradient solvers converge much faster than a Gibbs
sampler in terms of the total number of iterations. The conceptual advantages of
the Gibbs sampling approach lie in statistically well-defined error
propagation and systematic error correction, and this methodology forms the
conceptual basis for the map-making algorithm employed in the BeyondPlanck
framework, which implements the first end-to-end Bayesian analysis pipeline for
CMB observations.
Comment: 11 pages, 10 figures. All BeyondPlanck products and software will be
released publicly at http://beyondplanck.science during the online release
conference (November 18-20, 2020). Connection details will be made available
at the same website. Registration is mandatory for the online tutorial, but
optional for the conference.
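The two-step Gibbs scheme described in the abstract above can be illustrated with a toy numerical sketch: alternately draw a binned map conditional on the noise baselines, then draw the baselines conditional on the map. This is a minimal sketch under simplifying assumptions (white noise, one constant offset per baseline segment, flat priors); all names and sizes are illustrative and this is not the BeyondPlanck code.

```python
# Toy Gibbs-sampling map-maker: alternate between (1) map binning given
# baseline offsets and (2) noise filtering (baseline fitting) given the map.
import numpy as np

rng = np.random.default_rng(0)

npix, nbase, samp_per_base = 8, 50, 20     # illustrative sizes
ntod = nbase * samp_per_base

true_map = rng.normal(0.0, 1.0, npix)
pointing = rng.integers(0, npix, ntod)      # pixel seen by each TOD sample
true_base = rng.normal(0.0, 0.5, nbase)     # correlated-noise baselines
sigma = 0.1                                 # white-noise level

tod = (true_map[pointing] + np.repeat(true_base, samp_per_base)
       + rng.normal(0.0, sigma, ntod))

hits = np.bincount(pointing, minlength=npix)

def bin_map(resid):
    """Map binning: average of the residual TOD in each pixel."""
    return np.bincount(pointing, weights=resid, minlength=npix) / hits

def fit_baselines(resid):
    """Noise filtering: mean of the residual TOD in each baseline segment."""
    return resid.reshape(nbase, samp_per_base).mean(axis=1)

base = np.zeros(nbase)
chain = []
for _ in range(200):
    # Step 1: draw the map conditional on the current baselines
    m = bin_map(tod - np.repeat(base, samp_per_base))
    m = m + rng.normal(0.0, sigma / np.sqrt(hits))
    # Step 2: draw the baselines conditional on the current map
    base = fit_baselines(tod - m[pointing])
    base = base + rng.normal(0.0, sigma / np.sqrt(samp_per_base), nbase)
    chain.append(m)

chain = np.array(chain)
posterior_mean = chain[50:].mean(axis=0)    # best-estimate map
posterior_std = chain[50:].std(axis=0)      # correlated-noise uncertainty
```

Note that each conditional draw is cheap (a binning or a segment average plus a Gaussian fluctuation term), which is the performance point made in the abstract; the chain scatter gives per-pixel uncertainties without forming a noise covariance matrix. The map monopole and the baseline mean are degenerate, so agreement with the input map is up to a constant offset.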
BeyondPlanck X. Planck LFI frequency maps with sample-based error propagation
We present Planck LFI frequency sky maps derived within the BeyondPlanck
framework. This framework draws samples from a global posterior distribution
that includes instrumental, astrophysical and cosmological parameters, and the
main product is an entire ensemble of frequency sky map samples. This ensemble
allows for computationally convenient end-to-end propagation of low-level
instrumental uncertainties into higher-level science products. We show that the
two dominant sources of LFI instrumental systematic uncertainties are
correlated noise and gain fluctuations, and the products presented here support
- for the first time - full Bayesian error propagation for these effects at
full angular resolution. We compare our posterior mean maps with traditional
frequency maps delivered by the Planck collaboration, and find generally good
agreement. The most important quality improvement is due to significantly lower
calibration uncertainties in the new processing, as we find a fractional
absolute calibration uncertainty at 70 GHz of , which is nominally 40 times smaller than that reported by Planck
2018. However, the original Planck 2018 estimate has a non-trivial statistical
interpretation, and this further illustrates the advantage of the new framework
in terms of producing self-consistent and well-defined error estimates of all
involved quantities without the need of ad hoc uncertainty contributions. We
describe how low-resolution data products, including dense pixel-pixel
covariance matrices, may be produced directly from the posterior samples
without the need for computationally expensive analytic calculations or
simulations. We conclude that posterior-based frequency map sampling provides
unique capabilities in terms of low-level systematics modelling and error
propagation, and may play an important role for future CMB B-mode experiments.
(Abridged.)
Comment: 32 pages, 23 figures, data available from https://www.cosmoglobe.uio.no
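The sample-based error propagation described above rests on a simple idea: once an ensemble of frequency-map samples is in hand, the posterior mean map and a dense pixel-pixel covariance matrix follow directly from the sample moments, with no analytic noise modelling or end-to-end simulations. A minimal sketch, with a stand-in Gaussian ensemble in place of real posterior samples and deliberately tiny map sizes:

```python
# Sample-based products from an ensemble of map samples:
# posterior mean map and dense pixel-pixel covariance.
import numpy as np

rng = np.random.default_rng(1)
nsamp, npix = 500, 12                      # illustrative; real maps are far larger

# Stand-in for map samples drawn from the global posterior: correlated
# residual noise with variance 0.15 per pixel and 0.05 between pixels.
true_cov = 0.1 * np.eye(npix) + 0.05
samples = rng.multivariate_normal(np.zeros(npix), true_cov, size=nsamp)

mean_map = samples.mean(axis=0)            # posterior mean map
pix_cov = np.cov(samples, rowvar=False)    # dense pixel-pixel covariance
```

The covariance estimate converges at the Monte Carlo rate, so its accuracy is set by the number of posterior samples rather than by any approximation in the noise model, which is why low-resolution products can be built directly from the chain.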
BeyondPlanck XI. Bayesian CMB analysis with sample-based end-to-end error propagation
We present posterior sample-based cosmic microwave background (CMB)
constraints from Planck LFI and WMAP observations derived through global
end-to-end Bayesian processing. We use these samples to study correlations
between CMB, foreground, and instrumental parameters, and we identify a
particularly strong degeneracy between CMB temperature fluctuations and
free-free emission on intermediate angular scales, which is mitigated through
model reduction, masking, and resampling. We compare our posterior-based CMB
results with previous Planck products, and find generally good agreement, but
with higher noise due to exclusion of HFI data. We find a best-fit CMB dipole
amplitude of , in excellent agreement with previous Planck
results. The quoted uncertainty is derived directly from the sampled posterior
distribution, and does not involve any ad hoc contribution for systematic
effects. Similarly, we find a temperature quadrupole amplitude of
, in good agreement with previous results in
terms of the amplitude, but the uncertainty is an order of magnitude larger
than the diagonal Fisher uncertainty. Relatedly, we find lower evidence for a
possible alignment between and than previously reported
due to a much larger scatter in the individual quadrupole coefficients, caused
both by marginalizing over a more complete set of systematic effects, and by
our more conservative analysis mask. For higher multipoles, we find that the
angular temperature power spectrum is generally in good agreement with both
Planck and WMAP. This is the first time the sample-based asymptotically exact
Blackwell-Rao estimator has been successfully established for multipoles up to
, and it now accounts for the majority of the cosmologically
important information. Cosmological parameter constraints are presented in a
companion paper. (Abridged)
Comment: 26 pages, 24 figures. Submitted to A&A. Part of the BeyondPlanck paper suite.
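The Blackwell-Rao estimator mentioned in the abstract above averages the analytically known conditional power-spectrum posterior over the chain of CMB sky samples. A toy sketch for a single multipole, using synthetic chi-squared draws as stand-ins for Gibbs samples of the observed power (all values illustrative; not the BeyondPlanck implementation):

```python
# Blackwell-Rao estimate of p(C_ell | data) for one multipole:
# average the inverse-gamma conditional p(C_ell | sigma_ell) over samples.
import numpy as np

rng = np.random.default_rng(2)
ell, nsamp = 10, 200
n = 2 * ell + 1
true_C = 1.0

# Stand-in Gibbs samples of sigma_ell = (2l+1)^{-1} sum_m |a_lm|^2:
# scaled chi-squared draws with 2l+1 degrees of freedom.
sigmas = true_C * rng.chisquare(n, nsamp) / n

C_grid = np.linspace(0.3, 3.0, 500)
dC = C_grid[1] - C_grid[0]

post = np.zeros_like(C_grid)
for s in sigmas:
    # Conditional p(C_ell | sigma_ell) ~ C^{-(2l+1)/2} exp(-(2l+1) s / (2 C))
    ln = -0.5 * n * (s / C_grid + np.log(C_grid))
    p = np.exp(ln - ln.max())
    post += p / (p.sum() * dC)             # normalize each conditional on the grid
post /= nsamp                              # Monte Carlo average over the chain

C_peak = C_grid[np.argmax(post)]
```

Because each conditional density is exact, the estimator is asymptotically exact in the number of samples; the practical challenge noted in the literature is that the number of samples needed grows with multipole, which is why establishing it at high multipoles is highlighted as new.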