
    Auto-identification of unphysical source reconstructions in strong gravitational lens modelling

    With the advent of next-generation surveys and the expectation of discovering huge numbers of strong gravitational lens systems, much effort is being invested in developing automated procedures for handling the data. The several orders of magnitude increase in the number of strong galaxy–galaxy lens systems is an insurmountable challenge for traditional modelling techniques. Whilst machine learning techniques have dramatically improved the efficiency of lens modelling, parametric modelling of the lens mass profile remains an important tool for dealing with complex lensing systems. In particular, source reconstruction methods are necessary to cope with the irregular structure of high-redshift sources. In this paper, we consider a convolutional neural network (CNN) that analyses the outputs of semi-analytic methods that parametrically model the lens mass and linearly reconstruct the source surface brightness distribution. We show that the unphysical source reconstructions that arise as a result of incorrectly initialized lens models can be effectively caught by our CNN. Furthermore, the CNN predictions can be used to automatically reinitialize the parametric lens model, avoiding unphysical source reconstructions. The CNN, trained on reconstructions of lensed Sérsic sources, accurately classifies source reconstructions of the same type with a precision P > 0.99 and recall R > 0.99. The same CNN, without retraining, achieves P = 0.89 and R = 0.89 when classifying source reconstructions of more complex lensed Hubble Ultra-Deep Field (HUDF) sources. Using the CNN predictions to reinitialize the lens modelling procedure, we achieve a 69 per cent decrease in the occurrence of unphysical source reconstructions. This combined CNN and parametric modelling approach can greatly improve the automation of lens modelling.
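
    The paper's own network and training pipeline are not reproduced here; the following is a minimal sketch, assuming PyTorch, of the kind of binary CNN classifier described above, scoring a source-plane reconstruction image as physical or unphysical. The architecture, layer sizes, and 0.5 decision threshold are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (illustrative, not the authors' architecture): a small CNN
# that scores a 64x64 source-plane reconstruction as physical or unphysical.
import torch
import torch.nn as nn

class ReconstructionClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # single logit; sigmoid gives P(unphysical)
        )

    def forward(self, x):
        return self.net(x)

# Example usage: a high score would trigger reinitialization of the parametric
# lens model before the source is reconstructed again.
model = ReconstructionClassifier()
scores = torch.sigmoid(model(torch.randn(8, 1, 64, 64)))  # batch of reconstructions
needs_reinit = scores.squeeze(1) > 0.5                    # assumed threshold
```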

    Ada style guide (version 1.1)

    Ada is a programming language of considerable expressive power. The Ada Language Reference Manual provides a thorough definition of the language. However, it does not offer sufficient guidance on the appropriate use of Ada's powerful features. For this reason, the Goddard Space Flight Center Ada User's Group has produced this style guide, which addresses such program style issues. The guide covers three areas of Ada program style: the structural decomposition of a program; the coding and the use of specific Ada features; and the textual formatting of a program.

    Strong lens modelling: comparing and combining Bayesian neural networks and parametric profile fitting

    The vast quantity of strong galaxy–galaxy gravitational lenses expected by future large-scale surveys necessitates the development of automated methods to efficiently model their mass profiles. For this purpose, we train an approximate Bayesian convolutional neural network (CNN) to predict mass profile parameters and associated uncertainties, and compare its accuracy to that of conventional parametric modelling for a range of increasingly complex lensing systems. These include standard smooth parametric density profiles, hydrodynamical EAGLE galaxies, and the inclusion of foreground mass structures, combined with parametric sources and sources extracted from the Hubble Ultra Deep Field. In addition, we present a method for combining the CNN with traditional parametric density profile fitting in an automated fashion, where the CNN provides initial priors on the latter’s parameters. On average, the CNN achieved errors 19 ± 22 per cent lower than the traditional method’s blind modelling. The combination method instead achieved errors 27 ± 11 per cent lower than the blind modelling, reduced further to 37 ± 11 per cent when the priors also incorporated the CNN-predicted uncertainties, with errors also 17 ± 21 per cent lower than the CNN by itself. While the CNN is undoubtedly the fastest modelling method, the combination of the two increases the speed of conventional fitting alone by factors of 1.73 and 1.19 with and without CNN-predicted uncertainties, respectively. This, combined with greatly improved accuracy, highlights the benefits one can obtain through combining neural networks with conventional techniques in order to achieve an efficient automated modelling approach.
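
    As a rough illustration of the combination method, the sketch below (plain NumPy, not the paper's code) shows how CNN-predicted parameter values and uncertainties could seed Gaussian priors and a starting point for a conventional parametric fit; the parameter names and numbers are hypothetical.

```python
# Sketch of the combination idea: CNN-predicted values and 1-sigma
# uncertainties become Gaussian priors for a conventional parametric fit.
import numpy as np

# Hypothetical CNN outputs for a simple lens mass model: (value, uncertainty).
cnn_prediction = {
    "einstein_radius": (1.20, 0.05),
    "axis_ratio":      (0.80, 0.06),
    "position_angle":  (45.0, 8.0),
}

def log_prior(params):
    """Gaussian log-prior centred on the CNN prediction, with widths taken
    from its predicted uncertainties; added to the usual log-likelihood."""
    lp = 0.0
    for name, value in params.items():
        mu, sigma = cnn_prediction[name]
        lp += -0.5 * ((value - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
    return lp

# The fitter would also start its optimizer or walkers near the CNN means.
start = {name: mu for name, (mu, _) in cnn_prediction.items()}
```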

    Modelling high-resolution ALMA observations of strongly lensed dusty star-forming galaxies detected by Herschel

    We present modelling of ∼0.1 arcsec resolution Atacama Large Millimetre/submillimetre Array imaging of seven strong gravitationally lensed galaxies detected by the Herschel Space Observatory. Four of these systems are galaxy–galaxy strong lenses, with the remaining three being group-scale lenses. Through careful modelling of the visibilities, we infer the mass profiles of the lensing galaxies and, by determining the magnification factors, investigate the intrinsic properties and morphologies of the lensed submillimetre sources. We find that these submillimetre sources all have ratios of star formation rate to dust mass that are consistent with, or in excess of, the mean ratio for high-redshift submillimetre galaxies and low-redshift ultra-luminous infrared galaxies. Reconstructions of the background sources reveal that the majority of our sample display disturbed morphologies. The majority of our lens models have mass density slopes close to isothermal, but some systems show significant differences.
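
    For context, recovering an intrinsic quantity simply divides the observed one by the inferred magnification; the toy numbers in the sketch below are placeholders, not values from the paper.

```python
# Illustrative only: delensing an observed quantity with the inferred
# magnification mu (intrinsic = observed / mu). All numbers are made up.
mu = 8.0                      # hypothetical lensing magnification
observed_sfr = 4000.0         # Msun/yr, apparent (lensed) star formation rate
observed_dust_mass = 8.0e9    # Msun, apparent (lensed) dust mass

intrinsic_sfr = observed_sfr / mu
intrinsic_dust_mass = observed_dust_mass / mu

# The star formation rate to dust mass ratio discussed in the text is
# magnification independent, since both quantities scale with mu identically.
sfr_to_dust = intrinsic_sfr / intrinsic_dust_mass
```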

    The molecular-gas properties in the gravitationally lensed merger HATLAS J142935.3-002836

    Follow-up observations of (sub-)mm-selected gravitationally-lensed systems have allowed a more detailed study of the dust-enshrouded phase of star-formation up to very early cosmic times. Here, the case of the gravitationally lensed merger in HATLAS J142935.3-002836 (also known as H1429-0028; z_lens=0.218, z_bkg=1.027) is revisited following recent developments in the literature and new APEX observations targeting two carbon monoxide (CO) rotational transitions J_up=3 and 6. We show that the line-profiles comprise three distinct velocity components, where the fainter high-velocity one is less magnified and more compact. The modelling of the observed spectral line energy distribution of CO J_up=2 to 6 and [CI]3P_1-3P_0 assumes a large velocity gradient scenario, where the analysis is based on four statistical approaches. Since the detected gas and dust emission comes exclusively from only one of the two merging components (the one oriented North-South, NS), we are only able to determine upper-limits for the companion. The molecular gas in the NS component in H1429-0028 is found to have a temperature of ~70K, a volume density of log(n[/cm3])~3.7, to be expanding at ~10km/s/pc, and amounts to M_H2=4(-2,+3)*1e9 Msun. The CO to H2 conversion factor is estimated to be alpha_CO=0.4(-0.2,+0.3) Msun/(K.km/s.pc2). The NS galaxy is expected to have a factor of >10x more gas than its companion (M_H
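
    A minimal sketch of the standard CO-to-H2 conversion used in studies like this one (textbook relations, not the authors' fitting code): the line flux, luminosity distance, and magnification below are placeholders, while alpha_CO = 0.4 follows the value quoted above.

```python
# Standard CO line luminosity and molecular gas mass relations; inputs are
# placeholders chosen only to be roughly plausible for a z ~ 1 lensed source.
def co_line_luminosity(flux_jy_kms, nu_obs_ghz, d_l_mpc, z):
    """L'_CO in K km/s pc^2 (Solomon & Vanden Bout 2005):
    L'_CO = 3.25e7 * S_CO dv * nu_obs^-2 * D_L^2 * (1+z)^-3."""
    return 3.25e7 * flux_jy_kms * nu_obs_ghz**-2 * d_l_mpc**2 * (1.0 + z)**-3

z, mu = 1.027, 8.0                    # source redshift; magnification is a placeholder
nu_rest_co10 = 115.271                # GHz, CO(1-0) rest frequency
l_co = co_line_luminosity(flux_jy_kms=2.0,                 # placeholder line flux
                          nu_obs_ghz=nu_rest_co10 / (1 + z),
                          d_l_mpc=7000.0,                   # placeholder distance
                          z=z) / mu                         # correct for magnification

alpha_co = 0.4                        # Msun / (K km/s pc^2), as quoted in the abstract
m_h2 = alpha_co * l_co                # intrinsic molecular gas mass in Msun
```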

    The impact of human expert visual inspection on the discovery of strong gravitational lenses

    We investigate the ability of human ‘expert’ classifiers to identify strong gravitational lens candidates in Dark Energy Survey-like imaging. We recruited a total of 55 people who completed more than 25 per cent of the project. During the classification task, we present 1489 images to the participants. The sample contains a variety of data, including lens simulations, real lenses, non-lens examples, and unlabelled data. We find that experts are extremely good at finding bright, well-resolved Einstein rings, whilst arcs with g-band signal-to-noise less than ∼25 or Einstein radii less than ∼1.2 times the seeing are rarely recovered. Very few non-lenses are scored highly. There is substantial variation in the performance of individual classifiers, but it does not appear to depend on the classifier’s experience, confidence, or academic position. This variation can be mitigated with a team of 6 or more independent classifiers. Our results give confidence that humans are a reliable pruning step for lens candidates, providing pure and quantifiably complete samples for follow-up studies.
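
    The benefit of pooling several independent classifiers can be seen with a toy calculation (not the study's analysis code): averaging N independent scores shrinks the individual scatter by roughly 1/sqrt(N). The score and noise values below are arbitrary.

```python
# Toy illustration: individual classifier noise is damped by team averaging.
import numpy as np

rng = np.random.default_rng(0)
true_score = 0.7                       # hypothetical "ideal" score for a candidate
n_classifiers = 6
# Each expert's score = truth + individual noise (spread chosen arbitrarily).
individual = true_score + rng.normal(0.0, 0.2, size=(10000, n_classifiers))
team_mean = individual.mean(axis=1)

print(individual.std())                # ~0.2 scatter for a single expert
print(team_mean.std())                 # ~0.2 / sqrt(6) ~ 0.08 for a team of six
```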

    Developments towards Bragg edge imaging on the IMAT beamline at the ISIS Pulsed Neutron and Muon Source: BEAn software

    The BEAn (Bragg Edge Analysis) software has been developed as a toolkit for analysis of Bragg edges and strain maps from data obtained at the time-of-flight imaging instrument IMAT or other compatible instruments. The code is built primarily using Python 3 and the Qt framework, and includes tools useful for neutron imaging such as principal component analysis. This paper introduces BEAn and its features, briefly discusses the scientific concepts behind them, and concludes with planned future work on the code.
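
    As a hedged illustration of the principal component analysis mentioned above (not BEAn's own implementation), the sketch below applies an SVD-based PCA to a synthetic time-of-flight imaging stack; the array shapes and number of components are arbitrary.

```python
# Illustrative PCA of a time-of-flight imaging stack via SVD.
import numpy as np

# Hypothetical stack: n_tof time-of-flight channels of ny x nx images.
n_tof, ny, nx = 200, 64, 64
stack = np.random.rand(n_tof, ny, nx)

# Flatten each image to a row, remove the mean, and take the leading components.
X = stack.reshape(n_tof, -1)
X = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)

n_components = 5
scores = U[:, :n_components] * S[:n_components]               # per-channel weights
components = Vt[:n_components].reshape(n_components, ny, nx)  # spatial eigen-images
```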