
    The foreground transfer function for HI intensity mapping signal reconstruction: MeerKLASS and precision cosmology applications

    Blind cleaning methods are currently the preferred strategy for handling foreground contamination in single-dish HI intensity mapping surveys. Despite the increasing sophistication of blind techniques, some signal loss is inevitable across all scales. Constructing a corrective transfer function by injecting mock signal into the contaminated data is an established practice in HI intensity mapping experiments. However, whether this approach remains viable for future intensity mapping surveys aiming at precision cosmology has been unexplored. In this work, using simulations, we validate for the first time the use of a foreground transfer function to reconstruct power spectra of foreground-cleaned low-redshift intensity maps, and we look to expose any limitations. We reveal that even when aggressive foreground cleaning is required, causing a >50% negative bias on the largest scales, the power spectrum can be reconstructed using a transfer function to within sub-percent accuracy. We specifically outline the recipe for constructing an unbiased transfer function, highlighting the pitfalls if one deviates from this recipe, and correctly identify how a transfer function should be applied in an auto-correlation power spectrum. We validate a method that utilises the transfer function variance for error estimation in foreground-cleaned power spectra. Finally, we demonstrate that incorrect fiducial parameter assumptions (up to ±100% bias) in the generation of mocks used to construct the transfer function do not significantly bias signal reconstruction or parameter inference (inducing <5% bias in recovered values).
    Comment: 25 pages, 20 figures. See Figure 4 for the main demonstration of the transfer function's performance for reconstructing signal loss from foreground cleaning. Submitted to MNRAS for publication.
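    The injection recipe described above can be sketched end-to-end in a few lines. This is an illustrative toy, not the paper's pipeline: the array sizes, the white-noise mocks, and the rank-8 PCA clean below are all my own assumptions. The idea is to inject a mock into the contaminated data, re-run the clean, and ratio the surviving cross-power against the input mock power.

```python
import numpy as np

rng = np.random.default_rng(42)
N_FREQ, N_PIX, N_FG = 32, 256, 8  # channels, pixels, removed PCA modes (assumed)

def pca_clean(maps, n_fg=N_FG):
    """Toy blind clean: subtract the n_fg leading principal components
    of the frequency-frequency covariance (a stand-in PCA clean)."""
    X = maps - maps.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    A = U[:, :n_fg]                      # dominant spectral modes
    return X - A @ (A.T @ X)             # project them out

def band_power(a, b=None):
    """Cross (or auto) power along the pixel axis, channel-averaged."""
    fa = np.fft.rfft(a, axis=1)
    fb = fa if b is None else np.fft.rfft(b, axis=1)
    return (fa * fb.conj()).real.mean(axis=0)

# Contaminated data: spectrally smooth foregrounds + weak HI-like signal.
fg = np.outer((np.arange(N_FREQ) + 10.0) ** -2.7, rng.normal(size=N_PIX)) * 1e4
hi = rng.normal(size=(N_FREQ, N_PIX))
data = fg + hi

def transfer_function(data, n_mocks=50):
    """T(k) = < P(clean(data + mock) - clean(data), mock) / P(mock) >_mocks."""
    cleaned = pca_clean(data)
    T = []
    for _ in range(n_mocks):
        m = rng.normal(size=data.shape)          # mock HI realisation
        resid = pca_clean(data + m) - cleaned    # injected mock after cleaning
        T.append(band_power(resid, m) / band_power(m))
    T = np.asarray(T)
    return T.mean(axis=0), T.std(axis=0)         # mean and spread over mocks

T_mean, T_err = transfer_function(data)
P_rec = band_power(pca_clean(data)) / T_mean     # loss-corrected auto power
```

Dividing the cleaned auto power by T(k), as in the last line, is roughly the reconstruction step the paper validates, and the spread of T over mocks is the ingredient of the error-estimation method it describes.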

    SKAO HI intensity mapping: blind foreground subtraction challenge

    Neutral Hydrogen Intensity Mapping (H I IM) surveys will be a powerful new probe of cosmology. However, strong astrophysical foregrounds contaminate the signal, and their coupling with instrumental systematics further increases the data-cleaning complexity. In this work, we simulate a realistic single-dish H I IM survey of a 5000 deg² patch in the 950–1400 MHz range, with both the MID telescope of the SKA Observatory (SKAO) and MeerKAT, its precursor. We include a state-of-the-art H I simulation and explore different foreground models and instrumental effects such as non-homogeneous thermal noise and beam side lobes. We perform the first Blind Foreground Subtraction Challenge for H I IM on these synthetic data cubes, aiming to characterize the performance of available foreground cleaning methods with no prior knowledge of the sky components and noise level. Nine foreground cleaning pipelines joined the challenge, based on statistical source separation algorithms, blind polynomial fitting, and an astrophysically informed parametric fit to foregrounds. We devise metrics to compare the pipeline performances quantitatively. In general, they can recover the input maps' two-point statistics within 20 per cent in the range of scales least affected by the telescope beam. However, spurious artefacts appear in the cleaned maps due to interactions between the foreground structure and the beam side lobes. We conclude that it is fundamental to develop accurate beam deconvolution algorithms and test data post-processing steps carefully before cleaning. This study was performed as part of SKAO preparatory work by the H I IM Focus Group of the SKA Cosmology Science Working Group.
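    Of the pipeline families named above, blind polynomial fitting is the simplest to sketch. The toy below is my own illustration, not challenge data: the frequency grid, power-law foreground model, and all amplitudes are assumptions. It fits a low-order polynomial in log(frequency)-log(temperature) to every line of sight and subtracts it, exploiting the spectral smoothness of the foregrounds.

```python
import numpy as np

rng = np.random.default_rng(1)
freqs = np.linspace(950.0, 1400.0, 64)          # MHz, matching the survey band
n_pix = 512

# Synthetic sky: power-law foregrounds (smooth in log-log) + weak HI signal.
amp = np.exp(rng.normal(3.0, 0.2, n_pix))       # per-pixel foreground amplitude
beta = rng.normal(-2.7, 0.1, n_pix)             # per-pixel spectral index
fg = amp * (freqs[:, None] / 1000.0) ** beta    # shape (n_freq, n_pix)
hi = 1e-4 * rng.normal(size=fg.shape)           # spectrally unsmooth signal
maps = fg + hi

def polyfit_clean(maps, freqs, order=3):
    """Fit a polynomial in log-log space to each line of sight and subtract
    it; the smooth foreground is absorbed, the HI fluctuations survive."""
    x = np.log(freqs)
    coeffs = np.polyfit(x, np.log(maps), order)      # (order+1, n_pix)
    fg_model = np.exp(np.vander(x, order + 1) @ coeffs)
    return maps - fg_model

residual = polyfit_clean(maps, freqs)
```

Because a pure power law is linear in log-log space, the fit removes the foreground almost exactly here; real data are harder, which is what the challenge quantifies.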

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

    Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association between hospital infrastructure, resource availability, and processes and early outcomes after cancer surgery worldwide.
    Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection of those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country income group, with population stratification by hospital.
    Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality than those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with the excess mortality predominantly explained by a limited capacity to rescue patients after major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer.
    Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.

    Recovery of 21 cm intensity maps with sparse component separation

    21-cm intensity mapping has emerged as a promising technique to map the large-scale structure of the Universe. However, the presence of foregrounds with amplitudes orders of magnitude larger than the cosmological signal constitutes a critical challenge. Here, we test the sparsity-based algorithm generalized morphological component analysis (GMCA) as a blind component separation technique for this class of experiments. We test the GMCA performance against realistic full-sky mock temperature maps that include, besides astrophysical foregrounds, a fraction of the polarized part of the signal leaked into the unpolarized one: a very troublesome foreground to subtract, usually referred to as polarization leakage. To our knowledge, this is the first time the removal of such a component has been performed with no prior assumption. We assess the success of the cleaning by comparing the true and recovered power spectra in the angular and radial directions. In the best scenario considered, GMCA is able to recover the input angular (radial) power spectrum with an average bias of ∼5 per cent for ℓ > 25 (20–30 per cent for k_∥ ≳ 0.02 h Mpc⁻¹), in the presence of polarization leakage. Our results are robust even when up to 40 per cent of channels are missing, mimicking radio-frequency interference (RFI) flagging of the data. Having quantified the notable effect of polarization leakage on our results, in perspective we advocate the use of more realistic simulations when testing 21-cm intensity mapping capabilities.
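    The core GMCA iteration, alternating sparse source estimation and mixing-matrix updates under a decreasing threshold, can be sketched as follows. This is a minimal illustration of the principle only: the real algorithm works in a sparsifying (wavelet) domain and includes refinements not shown here, and the sizes, seed, and thresholding schedule below are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def soft(x, lam):
    """Soft-thresholding: promotes sparsity in the source estimates."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def gmca_sketch(X, n_src, n_iter=50):
    """GMCA-style separation of X ~ A @ S with sparse rows of S.
    Alternates sparse source updates and mixing-matrix updates while
    lowering the threshold, as in the morphological diversity principle."""
    A = rng.normal(size=(X.shape[0], n_src))
    for it in range(n_iter):
        S = np.linalg.pinv(A) @ X
        lam = np.percentile(np.abs(S), 90) * (1 - it / n_iter)  # decreasing
        S = soft(S, lam)
        A = X @ np.linalg.pinv(S)
        A /= np.linalg.norm(A, axis=0, keepdims=True) + 1e-12   # fix scale
    S = np.linalg.pinv(A) @ X
    return A, S

# Two sparse sources observed through 6 frequency channels, no noise.
S_true = rng.normal(size=(2, 2000)) * (rng.random((2, 2000)) < 0.05)
A_true = rng.normal(size=(6, 2))
X = A_true @ S_true
A_est, S_est = gmca_sketch(X, n_src=2)
rel_err = np.linalg.norm(X - A_est @ S_est) / np.linalg.norm(X)
```

The decreasing threshold lets the strongest (sparsest) features anchor the mixing matrix first, which is the mechanism that makes GMCA robust as a blind method.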

    Determining thermal dust emission from Planck HFI data using a sparse, parametric technique

    Context. The Planck data releases have provided the community with submillimetre and radio observations of the full sky at unprecedented resolutions. We make use of the Planck 353, 545, and 857 GHz maps alongside the IRAS 3000 GHz map. These maps contain information on the cosmic microwave background (CMB), cosmic infrared background (CIB), extragalactic point sources, and diffuse thermal dust emission.
    Aims. We aim to determine the modified black-body (MBB) model parameters of thermal dust emission in total intensity and produce all-sky maps of pure thermal dust, having separated this Galactic component from the CMB and CIB.
    Methods. This separation is achieved with a new sparsity-based parametric method, Parameter Recovery Exploiting Model Informed Sparse Estimates (premise). The method comprises three main stages: 1) filtering the raw data to reduce the effect of the CIB on the MBB fit; 2) fitting an MBB model to the filtered data across super-pixels of various sizes, determined by the algorithm itself; and 3) refining these super-pixel estimates into full-resolution maps of the MBB parameters.
    Results. We present our maps of MBB temperature, spectral index, and optical depth at 5 arcmin resolution and compare our estimates to those of GNILC and to the two-step MBB fit presented by the Planck Collaboration in 2013.
    Conclusions. By exploiting sparsity we avoid the need for smoothing, enabling us to produce the first full-resolution MBB parameter maps from intensity measurements of thermal dust emission. We consider the premise parameter estimates to be competitive with the existing state-of-the-art solutions, outperforming these methods in low signal-to-noise regions, as we account for the CIB without removing thermal dust emission through oversmoothing.
    Key words: cosmic background radiation / dust, extinction / methods: data analysis
    ⋆ Parameter maps are only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/623/A2
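    The MBB model at the heart of stage 2 above can be sketched as a single-pixel least-squares fit across the four bands. This is not the premise algorithm itself, which fits over super-pixels with sparsity constraints; the dust parameters, reference frequency, and starting guesses below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

H, KB, C = 6.62607015e-34, 1.380649e-23, 2.99792458e8   # SI constants
NU0 = 353e9                                             # reference frequency (Hz)

def mbb(nu, tau, beta, T):
    """Modified black body: tau * (nu/nu0)^beta * B_nu(T), in MJy/sr."""
    planck = 2 * H * nu**3 / C**2 / np.expm1(H * nu / (KB * T))
    return tau * (nu / NU0) ** beta * planck * 1e20  # W m^-2 Hz^-1 sr^-1 -> MJy/sr

# Planck HFI + IRAS bands used in the paper (GHz expressed in Hz).
nu = np.array([353e9, 545e9, 857e9, 3000e9])

# One synthetic, noiseless line of sight with known (assumed) dust parameters.
tau_in, beta_in, T_in = 1e-4, 1.6, 20.0
data = mbb(nu, tau_in, beta_in, T_in)

popt, _ = curve_fit(mbb, nu, data, p0=(1e-5, 1.5, 18.0))
tau_fit, beta_fit, T_fit = popt
```

With only four bands and three parameters, per-pixel fits like this are noisy in practice, which is why premise pools information over super-pixels before refining to full resolution.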

    21-cm foregrounds and polarization leakage: cleaning and mitigation strategies

    The success of H i intensity mapping is largely dependent on how well 21-cm foreground contamination can be controlled. In order to progress our understanding further, we present a range of simulated foreground data from two different ∼3000 deg² sky regions, with varying effects from polarization leakage. Combining these with cosmological H i simulations creates a range of intensity mapping test cases that require different foreground treatments. This allows us to conduct the most generalized study to date into 21-cm foregrounds and their cleaning techniques for the post-reionization era. We first provide a pedagogical review of the most commonly used blind foreground removal techniques: principal component analysis (PCA)/singular value decomposition (SVD), fast independent component analysis (FASTICA), and generalized morphological component analysis (GMCA). We also trial a non-blind parametric fitting technique and discuss potential hybridization of methods. We highlight the similarities and differences in these techniques, finding that the blind methods produce near-equivalent results, and we explain the fundamental reasons for this. Our results demonstrate that polarized foreground residuals should generally be subdominant to H i on small scales (k ≳ 0.1 h Mpc⁻¹). However, on larger scales, results are more case dependent. In some cases, aggressive cleans severely damp H i power but still leave dominant foreground residuals. We find that a changing polarization fraction has little impact on results within a realistic range (0.5–2 per cent); however, a higher level of Faraday rotation does require more aggressive cleaning. We also demonstrate the gain from cross-correlations with optical galaxy surveys, where extreme levels of residual foregrounds can be circumvented. However, these residuals still contribute to errors, and we discuss the optimal balance between overcleaning and undercleaning.
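    The gain from cross-correlation noted above has a simple statistical core: residual foregrounds in the cleaned 21-cm map are uncorrelated with the galaxy field, so they bias the auto power but average out of the cross power (while still inflating its error bars). A toy zero-lag version, with all amplitudes as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

hi_true = rng.normal(size=n)                   # shared cosmological signal
fg_resid = 3.0 * rng.normal(size=n)            # residual foreground, 21-cm map only
cleaned_map = hi_true + fg_resid
galaxies = hi_true + 0.7 * rng.normal(size=n)  # galaxy survey: same signal, own noise

# Zero-lag "power" estimates (variances/covariances stand in for spectra).
auto = np.mean(cleaned_map**2)                 # auto: biased high by residuals
cross = np.mean(cleaned_map * galaxies)        # cross: residual bias averages away
truth = np.mean(hi_true**2)
```

The cross estimate is unbiased but its scatter still grows with the residual amplitude, which is the overcleaning-versus-undercleaning trade-off the abstract closes on.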