
    UNIONS: The impact of systematic errors on weak-lensing peak counts

    Full text link
    UNIONS is an ongoing deep photometric multi-band survey of the Northern sky. As part of UNIONS, CFIS provides r-band data which we use to study weak-lensing peak counts for cosmological inference. We assess systematic effects for weak-lensing peak counts and their impact on cosmological parameters for the UNIONS survey. In particular, we present results on local calibration, metacalibration shear bias, baryonic feedback, the source galaxy redshift estimate, intrinsic alignment, and cluster member dilution. For each uncertainty and systematic effect, we describe our mitigation scheme and the impact on cosmological parameter constraints. We obtain constraints on cosmological parameters from MCMC using CFIS data and MassiveNuS N-body simulations as a model for the peak count statistics. Depending on the calibration (local versus global, and the inclusion of the residual multiplicative shear bias), the mean matter density parameter $\Omega_m$ can shift by up to $-0.024$ ($-0.5\sigma$). We also see that including baryonic corrections can shift $\Omega_m$ by $+0.027$ ($+0.5\sigma$) with respect to the DM-only simulations. Reducing the impact of intrinsic alignment and cluster member dilution through signal-to-noise cuts can lead to a shift in $\Omega_m$ of $+0.027$ ($+0.5\sigma$). Finally, with a mean redshift uncertainty of $\Delta \bar{z} = 0.03$, the shift of $\Omega_m$ ($+0.001$, which corresponds to $+0.02\sigma$) is not significant. This paper is the first to investigate the impact of systematic effects on peak counts with UNIONS weak-lensing data. The value of $\Omega_m$ is the most affected and can shift by up to $\sim 0.03$, which corresponds to $0.5\sigma$, depending on the choices made for each systematic. We expect constraints to become more reliable with future (larger) data catalogues, for which the current pipeline will provide a starting point. Comment: 17 pages, 17 figures
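
    A minimal sketch, assuming two hypothetical MCMC chains of $\Omega_m$ samples (for example a dark-matter-only run and one with baryonic corrections), of how such parameter shifts can be expressed in units of the baseline posterior width; the numbers below are synthetic stand-ins, not the paper's chains.

    import numpy as np

    def omega_m_shift(chain_baseline, chain_variant):
        """Shift of the posterior mean, also expressed in units of the baseline 1-sigma."""
        delta = np.mean(chain_variant) - np.mean(chain_baseline)
        return delta, delta / np.std(chain_baseline)

    # Synthetic chains standing in for two peak-count analyses:
    rng = np.random.default_rng(0)
    chain_dm_only = rng.normal(0.290, 0.054, size=50_000)
    chain_baryons = rng.normal(0.317, 0.054, size=50_000)
    print(omega_m_shift(chain_dm_only, chain_baryons))   # ~(+0.027, +0.5 sigma)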

    Impact of Point Spread Function Higher Moments Error on Weak Gravitational Lensing II: A Comprehensive Study

    Full text link
    Weak gravitational lensing, or weak lensing, is one of the most powerful probes for dark matter and dark energy science, although it faces increasing challenges in controlling systematic uncertainties as the statistical errors become smaller. The Point Spread Function (PSF) needs to be precisely modeled to avoid systematic error on the weak lensing measurements. The weak lensing biases induced by errors in the PSF model second moments, i.e., its size and shape, are well-studied. However, Zhang et al. (2021) showed that errors in the higher moments of the PSF may also be a significant source of systematics for upcoming weak lensing surveys. Therefore, the goal of this work is to comprehensively investigate the modeling quality of PSF moments from the 3rd to the 6th order, and estimate their impact on cosmological parameter inference. We propagate the PSFEx higher moments modeling error in the HSC survey dataset to the weak lensing shear-shear correlation functions and their cosmological analyses. We find that the overall multiplicative shear bias associated with errors in PSF higher moments can cause a $\sim 0.1\sigma$ shift on the cosmological parameters for LSST Y10. PSF higher moment errors also cause additive biases in the weak lensing shear, which, if not accounted for in the cosmological parameter analysis, can induce cosmological parameter biases comparable to their $1\sigma$ uncertainties for LSST Y10. We compare the PSFEx model with PSF in Full FOV (Piff), and find similar performance in modeling the PSF higher moments. We conclude that PSF higher moment errors of future PSF models should be reduced from those in current methods to avoid a need to explicitly model these effects in the weak lensing analysis. Comment: 24 pages, 17 figures, 3 tables; Submitted to MNRAS; Comments welcome
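
    A simplified, unweighted illustration of the quantities discussed above: standardized image moments of a PSF stamp, where order 2 fixes the size and shape and orders 3 to 6 are the higher moments (the papers' definition additionally uses an adaptive Gaussian weight, omitted here for brevity).

    import numpy as np

    def standardized_moments(img, max_order=6):
        """Standardized moments M_pq with p+q >= 3 of a (background-subtracted) PSF stamp."""
        img = np.asarray(img, dtype=float)
        img = img / img.sum()                              # normalize the flux
        y, x = np.indices(img.shape)
        xc, yc = (img * x).sum(), (img * y).sum()          # centroid
        sx = np.sqrt((img * (x - xc) ** 2).sum())          # second-moment widths
        sy = np.sqrt((img * (y - yc) ** 2).sum())
        u, v = (x - xc) / sx, (y - yc) / sy                # standardized coordinates
        return {(p, q): (img * u**p * v**q).sum()
                for p in range(max_order + 1)
                for q in range(max_order + 1 - p) if p + q >= 3}

    # A round Gaussian PSF: odd moments vanish and the (4, 0) moment is close to 3.
    yy, xx = np.mgrid[:31, :31]
    psf = np.exp(-((xx - 15.0) ** 2 + (yy - 15.0) ** 2) / (2 * 3.0 ** 2))
    print(standardized_moments(psf, max_order=4)[(4, 0)])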

    The shape of dark matter haloes: results from weak lensing in the Ultraviolet Near-Infrared Optical Northern Survey (UNIONS)

    Full text link
    Cold dark matter haloes are expected to be triaxial, and so appear elliptical in projection. We use weak gravitational lensing from the Canada-France Imaging Survey (CFIS) component of the Ultraviolet Near Infrared Optical Northern Survey (UNIONS) to measure the ellipticity of the dark matter haloes around Luminous Red Galaxies (LRGs) from the Sloan Digital Sky Survey Data Release 7 (DR7) and from the CMASS and LOWZ samples of the Baryon Oscillation Spectroscopic Survey (BOSS), assuming their major axes are aligned with the stellar light. We find that DR7 LRGs with masses $M \sim 2.5\times10^{13}\,\mathrm{M}_{\odot}/h$ have halo ellipticities $e = 0.35 \pm 0.09$. Expressed as a fraction of the galaxy ellipticity, we find $f_h = 1.4 \pm 0.4$. For BOSS LRGs, the detection is of marginal significance: $e = 0.17 \pm 0.10$ and $f_h = 0.1 \pm 0.4$. These results are in agreement with other measurements of halo ellipticity from weak lensing and, taken together with previous results, suggest an increase of halo ellipticity of $0.10 \pm 0.05$ per decade in halo mass. This trend agrees with the predictions from hydrodynamical simulations, which find that at higher halo masses, not only do dark matter haloes become more elliptical, but the misalignment between the major axis of the stellar light in the central galaxy and that of the dark matter also decreases.
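
    For illustration only, the reported mass trend can be written as a one-line extrapolation anchored to the DR7 LRG point ($e = 0.35$ at $M \sim 2.5\times10^{13}\,\mathrm{M}_{\odot}/h$) with the quoted slope of 0.10 per decade in mass; this restates the abstract's numbers rather than performing a new fit.

    import numpy as np

    def halo_ellipticity(mass_msun_h, e_anchor=0.35, m_anchor=2.5e13, slope_per_decade=0.10):
        """Log-linear trend of projected halo ellipticity with halo mass."""
        return e_anchor + slope_per_decade * np.log10(mass_msun_h / m_anchor)

    print(halo_ellipticity(2.5e14))   # one decade higher in mass -> e ~ 0.45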

    Weak gravitational lensing analysis with the Canada-France Imaging Survey: from pixels to cosmology, preparation of the Euclid mission

    No full text
    Among the big questions cosmology faces today, the nature of dark matter and dark energy is at the center of upcoming surveys. The future stage IV missions Euclid and LSST will cover a surface on the sky never reached before to unveil structures at very large scales and different epochs. Weak gravitational lensing will be one of the cosmological probes used to trace dark matter. Gravitational lensing is a physical phenomenon which uses the distortion of light to trace the presence of mass in the Universe. The interesting point of weak lensing is its sensitivity to the total mass, i.e. baryonic and non-baryonic. Due to gravitational lensing, distant galaxies appear distorted on the observed images. The measurement of the distortions induced by gravitational shear requires a very accurate estimation of the shapes of galaxies. This thesis presents the data reduction pipeline built for weak lensing studies, from the telescope to cosmological parameter inference. The work focuses on the analysis of the Canada-France Imaging Survey (CFIS), a u- and r-band survey covering 5,000 deg² in the Northern hemisphere. The high resolution and depth of these data make it one of the best survey candidates for weak lensing science to date. Among other things, accurate measurement of the shapes of galaxies requires a very good knowledge of the PSF, for which a suite of validation tests has been developed. Due to noise and the approximations used in the shape measurement, the results can be biased. The residual multiplicative and additive biases have been reduced to m < 0.1% and c < 0.001% respectively by using state-of-the-art techniques such as metacalibration. This thesis also presents the work required for the development of a weak lensing pipeline, such as the elaboration of highly accurate and data-representative image simulations. We show the validation tests performed to ensure systematics-free measurements. Finally, preliminary science results are presented, demonstrating the viability of the pipeline. We have constructed maps of dark matter over a surface of 2,000 deg². We have measured the tangential shear around about 50 clusters and compared it to theoretical predictions. To conclude, we present a first 3x2-point analysis combining the weak lensing study performed on CFIS and the redshift measurements from eBOSS observations on the 50 deg² chosen for science verification purposes.
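
    As a minimal sketch of how the residual biases m and c quoted above are typically estimated on image simulations (assuming hypothetical arrays of matched true and measured shears, not the thesis pipeline itself), one fits the standard linear model g_meas = (1 + m) g_true + c per shear component:

    import numpy as np

    def shear_bias(g_true, g_meas):
        """Least-squares estimate of the multiplicative (m) and additive (c) bias."""
        slope, intercept = np.polyfit(g_true, g_meas, deg=1)
        return slope - 1.0, intercept

    # Synthetic example with m = 1e-3 and c = 1e-5 plus shape noise:
    rng = np.random.default_rng(1)
    g_true = rng.uniform(-0.05, 0.05, size=10_000)
    g_meas = 1.001 * g_true + 1e-5 + rng.normal(0.0, 0.003, size=g_true.size)
    print(shear_bias(g_true, g_meas))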

    Dynamics and contribution of karst groundwater to surface flow during Mediterranean flood

    No full text

    Shear measurement bias: II. A fast machine-learning calibration method

    No full text
    We present a new shear calibration method based on machine learning. The method estimates the individual shear responses of objects from a combination of several properties measured on the images, using supervised learning. The supervised learning uses the true individual shear responses obtained from copies of the image simulations with different shear values. On simulated GREAT3 data, we obtain a residual bias after calibration that is compatible with zero and beyond Euclid requirements for a signal-to-noise ratio > 20, within ∼15 CPU hours of training and using only ∼10^5 objects. This efficient machine-learning approach can use a smaller data set because the method avoids the contribution from shape noise. The low dimensionality of the input data also leads to simple neural network architectures. We compare it to the recently described metacalibration method, which shows similar performance. Because the two approaches rely on different assumptions and are subject to different systematics, they are highly complementary. Our method can therefore be applied without much effort to any survey, such as Euclid or the Vera C. Rubin Observatory, with fewer than a million images to simulate in order to learn the calibration function.
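
    A minimal sketch of the approach described above, assuming hypothetical inputs (this is not the authors' code): a small neural network learns a per-object shear response from measured image properties, with training targets given by finite-difference responses computed from sheared copies of the same simulated objects.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def finite_difference_response(e_plus, e_minus, delta_gamma=0.01):
        """Per-object true shear response, from two copies of the same simulated
        galaxy rendered with shears +delta_gamma and -delta_gamma."""
        return (e_plus - e_minus) / (2.0 * delta_gamma)

    def train_response_model(features, responses):
        """features: (n_objects, n_properties) measured properties (e.g. S/N, size);
        responses: true responses computed from the sheared simulation copies."""
        model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        model.fit(features, responses)
        return model

    # Calibration then divides each measured ellipticity by the predicted response:
    #   g_calibrated = e_measured / model.predict(properties_of_real_objects)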

    Bayesian multi-band fitting of alerts for kilonovae detection

    No full text
    In the era of multi-messenger astronomy, early classification of photometric alerts from wide-field, high-cadence surveys is a necessity to trigger spectroscopic follow-ups. These classifications are expected to play a key role in identifying potential candidates that might have a corresponding gravitational wave (GW) signature. Machine learning classifiers using features from parametric fits of light curves are widely deployed by broker software to analyze millions of alerts, but most of these algorithms require at least as many points per filter as fit parameters to produce a fit, which increases the chances of missing a short transient. Moreover, these classifiers cannot account for the uncertainty in the fits when producing the final score. In this context, we present a novel classification strategy that incorporates data-driven priors to extract a joint posterior distribution of the fit parameters, and hence a distribution of classification scores. We train and test a classifier to identify kilonova events, which originate from binary neutron star mergers or neutron star-black hole mergers, among simulations of Zwicky Transient Facility observations containing 19 other non-kilonova event types. We demonstrate that our method can estimate the uncertainty of misclassification, and that using the mean of the distribution of classification scores as a point estimate yields an AUC score of 0.96 on simulated data. We further show that with this method we can process the entire alert stream in real time and bring the sample of probable events down to a scale where they can be analyzed by domain experts.
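
    A short sketch of the score-distribution idea under stated assumptions: posterior_samples stands in for draws from the joint posterior of the light-curve fit parameters, and clf for any fitted classifier exposing scikit-learn's predict_proba; neither name comes from the paper's actual pipeline.

    import numpy as np

    def score_distribution(posterior_samples, clf):
        """Score every posterior draw instead of only the best-fit point.
        posterior_samples: (n_draws, n_fit_params) array; clf: fitted classifier."""
        probs = clf.predict_proba(posterior_samples)[:, 1]   # P(kilonova) per draw
        return probs.mean(), probs.std()                      # point estimate and its spread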
