
    Observing the earliest moments of supernovae using strong gravitational lenses

    Get PDF
    We determine the viability of exploiting lensing time delays to observe strongly gravitationally lensed supernovae (gLSNe) from first light. Assuming a plausible discovery strategy, the Legacy Survey of Space and Time (LSST) and the Zwicky Transient Facility (ZTF) will discover ∼110 and ∼1 systems per year before the supernova (SN) explosion in the final image, respectively. Systems will be identified 11.7^(+29.8)_(−9.3) d before the final explosion. We then explore the possibility of performing early-time observations for Type IIP and Type Ia SNe in LSST-discovered systems. Using a simulated Type IIP explosion, we predict that the shock breakout in one trailing image per year will peak at ≲24.1 mag (≲23.3) in the B-band (F218W), though evolving over a time-scale of ∼30 min. Using an analytic model of Type Ia companion interaction, we find that in the B-band we should observe at least one shock cooling emission event per year that peaks at ≲26.3 mag (≲29.6), assuming all Type Ia gLSNe have a 1 M_⊙ red giant (main sequence) companion. We perform Bayesian analysis to investigate how well deep observations with 1 h exposures on the European Extremely Large Telescope would discriminate between Type Ia progenitor populations. We find that if all Type Ia SNe evolved from the double-degenerate channel, then observations of the lack of early blue flux in 10 (50) trailing images would rule out more than 27 per cent (19 per cent) of the population having 1 M_⊙ main sequence companions at 95 per cent confidence. Comment: 17 pages, 15 figures (including appendices). Accepted by MNRAS 3rd May 202
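    A minimal sketch (not the authors' discovery pipeline) of how lensing time delays translate into advance warning: once the leading image is discovered and its explosion epoch estimated, the known relative delays predict when each trailing image will explode. All delay and epoch values below are illustrative assumptions.

```python
# Illustrative only: predicting the explosion epoch of the trailing image of a
# lensed SN from relative time delays. Delays and epochs are made-up numbers,
# not values from the paper.

delays_days = {"A": 0.0, "B": 8.5, "C": 21.0, "D": 34.2}  # delay of each image relative to A

def trailing_image_forecast(discovery_mjd, explosion_mjd_leading, delays):
    """Return (image, predicted explosion MJD, lead time in days) for the last
    image to explode, given the estimated explosion epoch of the leading image."""
    last_image = max(delays, key=delays.get)
    predicted_mjd = explosion_mjd_leading + delays[last_image]
    return last_image, predicted_mjd, predicted_mjd - discovery_mjd

image, mjd, lead = trailing_image_forecast(
    discovery_mjd=60000.0, explosion_mjd_leading=59995.0, delays=delays_days
)
print(f"image {image} expected to explode at MJD {mjd:.1f}, {lead:.1f} d after discovery")
```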

    The Impact of Microlensing on the Standardisation of Strongly Lensed Type Ia Supernovae

    Get PDF
    We investigate the effect of microlensing on the standardisation of strongly lensed Type Ia supernovae (GLSNe Ia). We present predictions for the amount of scatter induced by microlensing across a range of plausible strong lens macromodels. We find that lensed images in regions of low convergence, shear and stellar density are standardisable, where the microlensing scatter is < 0.15 magnitudes, comparable to the intrinsic dispersion of a typical SN Ia. These standardisable configurations correspond to asymmetric lenses with an image located far outside the Einstein radius of the lens. Symmetric and small Einstein radius lenses (< 0.5 arcsec) are not standardisable. We apply our model to the recently discovered GLSN Ia iPTF16geu and find that the large discrepancy between the observed flux and the macromodel predictions from More et al. (2017) cannot be explained by microlensing alone. Using the mock GLSNe Ia catalogue of Goldstein et al. (2017), we predict that ~22% of GLSNe Ia discovered by LSST will be standardisable, with a median Einstein radius of 0.9 arcsec and a median time delay of 41 days. By breaking the mass-sheet degeneracy, the full LSST GLSNe Ia sample will be able to detect systematics in H0 at the 0.5% level. Comment: 11 pages, 8 figures. Accepted by MNRAS May 17 201
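    A toy illustration of the standardisability criterion described above (not the paper's ray-tracing calculation): draw microlensing magnifications about the macromodel value, convert to magnitudes, and test whether the induced scatter stays below the ~0.15 mag intrinsic dispersion. The log-normal magnification distribution and its width are assumptions standing in for a real microlensing simulation.

```python
import numpy as np

# Toy standardisability check: an image counts as standardisable if the
# magnitude scatter induced by microlensing is below ~0.15 mag. The log-normal
# magnification distribution below is a stand-in for a real ray-shooting
# microlensing simulation, not the model used in the paper.
rng = np.random.default_rng(42)

def microlensing_scatter(macro_magnification, sigma_lognormal, n_draws=100_000):
    """Scatter (in mag) about the macromodel magnification for one image."""
    micro = macro_magnification * rng.lognormal(mean=0.0, sigma=sigma_lognormal, size=n_draws)
    delta_mag = -2.5 * np.log10(micro / macro_magnification)
    return delta_mag.std()

scatter = microlensing_scatter(macro_magnification=5.0, sigma_lognormal=0.05)
print(f"microlensing scatter = {scatter:.3f} mag ->",
      "standardisable" if scatter < 0.15 else "not standardisable")
```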

    Generalised deep learning model for semi-automated length measurement of fish in stereo-BRUVS

    Get PDF
    Assessing the health of fish populations relies on determining the length of fish in sample species subsets, in conjunction with other key ecosystem markers, to infer the overall health of communities. Despite attempts to use artificial intelligence (AI) to measure fish, most measurement remains a manual process, often necessitating that fish be removed from the water. Overcoming this limitation, and the potentially harmful intervention it entails, by measuring fish without disturbance in their natural habitat would greatly enhance and expedite the process. Stereo baited remote underwater video systems (stereo-BRUVS) are widely used as a non-invasive, stress-free method for manually counting and measuring fish in aquaculture, fisheries and conservation management. However, the application of deep learning (DL) to stereo-BRUVS image processing is showing encouraging progress towards replacing the manual and labour-intensive task of precisely locating the heads and tails of fish with computer-vision-based algorithms. Here, we present a generalised, semi-automated method for measuring the length of fish using DL with near-human accuracy for numerous species of fish. Additionally, we combine the DL method with a highly precise stereo-BRUVS calibration method, which uses calibration cubes to ensure precision within a few millimetres in calculated lengths. In a human-versus-DL comparison of accuracy, we show that, although DL commonly slightly over-estimates or under-estimates length, with enough repeated measurements the two values converge to the same length, demonstrated by a Pearson correlation coefficient (r) of 0.99 for n = 3954 measurements in ‘out-of-sample’ test data. We demonstrate the accuracy of this approach through visual examples of stereo-BRUVS scenes. The head-to-tail measurement method presented here builds on, and advances, previously published object detection for stereo-BRUVS. Furthermore, by replacing the manual process of four careful mouse clicks to precisely locate the head and tail of a fish in two images with two fast clicks anywhere on that fish in those two images, a significant reduction in image processing and analysis time is expected. By reducing analysis times, more images can be processed, thereby increasing the amount of data available for environmental reporting and decision making.
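    As a rough illustration of the stereo geometry behind the two-click length measurement (not the paper's calibration-cube pipeline), a rectified stereo pair lets each head and tail click pair be triangulated to a 3D point, and fish length is the distance between the two points. The focal length, baseline and pixel coordinates below are placeholder assumptions.

```python
import numpy as np

# Simplified rectified-stereo triangulation: Z = f * B / disparity, then
# X = (u - cx) * Z / f and Y = (v - cy) * Z / f. Calibration values are
# placeholders, not the calibration-cube values used in the paper.
FOCAL_PX = 1400.0        # focal length in pixels (assumed)
BASELINE_M = 0.7         # camera separation in metres (assumed)
CX, CY = 960.0, 540.0    # principal point in pixels (assumed)

def triangulate(u_left, v_left, u_right):
    """3D point (metres) from matching pixel coordinates in a rectified pair."""
    disparity = u_left - u_right
    z = FOCAL_PX * BASELINE_M / disparity
    x = (u_left - CX) * z / FOCAL_PX
    y = (v_left - CY) * z / FOCAL_PX
    return np.array([x, y, z])

def fish_length(head_left, head_right_u, tail_left, tail_right_u):
    """head_left/tail_left are (u, v) in the left image; *_right_u are the
    matching horizontal coordinates in the right image."""
    head = triangulate(*head_left, head_right_u)
    tail = triangulate(*tail_left, tail_right_u)
    return float(np.linalg.norm(head - tail))

length_m = fish_length(head_left=(1100, 600), head_right_u=610,
                       tail_left=(820, 640), tail_right_u=335)
print(f"estimated fish length: {length_m:.2f} m")   # ~0.41 m with these inputs
```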

    Between a reef and a hard place: capacity to map the next coral reef catastrophe

    Get PDF
    Increasing sea surface temperature and extreme heat events pose the greatest threat to coral reefs globally, with trends exceeding previous norms. The resultant mass bleaching events, such as those evidenced on the Great Barrier Reef in 2016, 2017, and 2020, have substantial ecological costs in addition to economic and social costs. Advancing remote (nanosatellites, rapid-revisit traditional satellites) and in-field (drones) technological capabilities, cloud data processing and analysis, coupled with existing infrastructure and in-field monitoring programs, have the potential to provide cost-effective and timely information to managers, allowing them to better understand changes on reefs and apply effective remediation. Within a risk management framework for monitoring coral bleaching, we present an overview of how remote sensing can be used throughout the whole risk management cycle and highlight the role technological advancement has in Earth observations of coral reefs for bleaching events.

    Digitise This! A Quick and Easy Remote Sensing Method to Monitor the Daily Extent of Dredge Plumes

    Get PDF
    Technological advancements in remote sensing and GIS have improved natural resource managers’ abilities to monitor large-scale disturbances. At a time when many processes are heading towards automation, this study returns to simple techniques to bridge a gap left by that technological advancement. Near-daily monitoring of dredge plume extent is common practice, using Moderate Resolution Imaging Spectroradiometer (MODIS) imagery and associated algorithms to predict the total suspended solids (TSS) concentration in surface waters originating from floods and dredge plumes. Unfortunately, these methods cannot distinguish between dredge plume and benthic features in shallow, clear water. This case study at Barrow Island, Western Australia, uses hand digitising to demonstrate the ability of human interpretation to make this distinction with a level of confidence, and compares the method to contemporary TSS methods. Hand digitising was quick, cheap and required very little staff training. ANOSIM R statistics show that remote-sensing-derived TSS provided similar spatial results when thresholded to at least 3 mg L⁻¹. However, remote-sensing-derived TSS consistently returned false positives, flagging shallow benthic features as plume, at thresholds up to 6 mg L⁻¹, and began returning false negatives (excluding actual plume) at thresholds as low as 4 mg L⁻¹. Semi-automated processes that estimate plume concentration and distinguish between plumes and shallow benthic features without the subjectivity of human interpretation would be preferred as a plume monitoring method. At this stage, however, the hand digitising method is very useful, is more accurate at determining plume boundaries over shallow benthic features, and is accessible to all levels of management with basic training.
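    A small sketch of the threshold comparison described above: a satellite-derived TSS raster is thresholded into a plume mask and compared against a hand-digitised plume mask, counting shallow benthic pixels wrongly flagged as plume and plume pixels missed. The arrays below are synthetic placeholders, not MODIS data from the study.

```python
import numpy as np

# Synthetic comparison of remote-sensing-derived TSS plume masks against a
# hand-digitised plume mask. All grids and values are made up for illustration;
# in practice the TSS raster comes from MODIS and the plume polygon from an analyst.
rng = np.random.default_rng(0)
tss_mg_per_l = rng.uniform(0.0, 10.0, size=(50, 50))      # TSS estimates (mg/L)
digitised_plume = np.zeros((50, 50), dtype=bool)
digitised_plume[10:30, 10:30] = True                       # analyst-drawn plume extent
shallow_benthic = np.zeros((50, 50), dtype=bool)
shallow_benthic[35:45, 35:45] = True                        # reef / seagrass area

for threshold in (3.0, 4.0, 6.0):
    plume_mask = tss_mg_per_l >= threshold
    false_pos = np.count_nonzero(plume_mask & shallow_benthic)   # benthic flagged as plume
    false_neg = np.count_nonzero(digitised_plume & ~plume_mask)  # plume pixels missed
    print(f"threshold {threshold:.0f} mg/L: {false_pos} benthic px as plume, "
          f"{false_neg} plume px missed")
```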

    Spectral Analysis of Estuarine Water for Characterisation of Inherent Optical Properties and Phytoplankton Concentration

    Get PDF
    In this study, a hyperspectral radiometer was used to measure the remote sensing reflectance continuously along the Swan-Canning Cleanup Program (SCCP) water sampling sites. The results of this thesis show that the model and method used in this study are capable of accurately estimating the phytoplankton concentration in the Swan River for a continuous transect encompassing and connecting half of the SCCP sample locations up to, and including, Nile Street.
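    The abstract does not specify the reflectance model used, so the following is only a generic band-ratio sketch of how phytoplankton (chlorophyll-a) concentration can be estimated from remote sensing reflectance; the coefficients and reflectance values are placeholder assumptions, not the thesis model.

```python
import numpy as np

# Generic blue/green band-ratio estimate of chlorophyll-a from remote sensing
# reflectance Rrs. Coefficients are placeholders illustrating the common form
# log10(chl) = a0 + a1*X + a2*X**2 with X = log10(Rrs_blue / Rrs_green); they
# are NOT the calibrated model used in the thesis.
A0, A1, A2 = 0.3, -2.5, 1.0   # assumed polynomial coefficients

def chlorophyll_estimate(rrs_blue, rrs_green):
    """Chlorophyll-a (mg m^-3, illustrative) from a blue/green reflectance ratio."""
    x = np.log10(rrs_blue / rrs_green)
    return 10 ** (A0 + A1 * x + A2 * x ** 2)

# Synthetic along-transect reflectances (sr^-1)
rrs_blue = np.array([0.0040, 0.0030, 0.0025])
rrs_green = np.array([0.0050, 0.0050, 0.0060])
print(np.round(chlorophyll_estimate(rrs_blue, rrs_green), 2))
```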

    Blockchain-enabled peer-to-peer energy trading

    No full text
    The increasing amount of distributed power generation from rooftop solar panels allows new electricity markets to emerge in which prosumers and consumers can trade locally produced energy. Blockchain technology has increasingly been adopted in energy markets and shows great potential to facilitate Peer-to-Peer energy trading. However, blockchain technology is still in its infancy, meaning it is not yet being used to its full potential. In this paper, blockchain technology for Peer-to-Peer energy trading and its implications are explored, especially in view of the ‘trilemma’ of scalability, security, and decentralisation. Peer-to-Peer energy trading is the focus of this paper, which ultimately proposes a blockchain scalability solution. This solution is empirically modelled using data collected in a trial case study. The proposed solution increases scalability without compromising security and decentralisation when compared to base-layer models.
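    The abstract does not detail the proposed scalability solution, so the sketch below only illustrates one common pattern in this space: batching many off-chain Peer-to-Peer trades and committing a single Merkle root on-chain, so on-chain load grows per batch rather than per trade. All trade records are invented.

```python
import hashlib
import json

# Sketch of off-chain trade batching: hash each Peer-to-Peer energy trade,
# build a Merkle root over the batch, and commit only the root on-chain.
# This is a generic layer-2-style pattern, not necessarily the specific
# solution proposed in the paper; trade records below are invented.

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Merkle root of a list of leaf hashes (duplicates the last leaf on odd levels)."""
    if not leaves:
        return sha256(b"")
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

trades = [
    {"seller": "prosumer_1", "buyer": "consumer_7", "kwh": 2.4, "price_c_per_kwh": 18},
    {"seller": "prosumer_3", "buyer": "consumer_2", "kwh": 1.1, "price_c_per_kwh": 17},
    {"seller": "prosumer_1", "buyer": "consumer_4", "kwh": 0.8, "price_c_per_kwh": 19},
]
leaves = [sha256(json.dumps(t, sort_keys=True).encode()) for t in trades]
print("batch commitment to submit on-chain:", merkle_root(leaves).hex())
```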

    Accelerating Species Recognition and Labelling of Fish From Underwater Video With Machine-Assisted Deep Learning

    No full text
    Machine-assisted object detection and classification of fish species from Baited Remote Underwater Video Station (BRUVS) surveys using deep learning algorithms presents an opportunity for optimising analysis time and enabling rapid reporting of marine ecosystem status. Training object detection algorithms for BRUVS analysis presents significant challenges: the model requires training datasets with bounding boxes already applied to identify the location of every fish in a scene, and it requires training datasets labelled by species. In both cases, substantial volumes of data are required, and producing them is currently a manual, labour-intensive process, resulting in a paucity of the labelled data needed to train object detection models for species detection. Here, we present a “machine-assisted” approach for i) a generalised model to automate the application of bounding boxes to any underwater environment containing fish and ii) fish detection and classification to species identification level for up to 12 target species. A catch-all “fish” classification is applied to individuals that remain unidentified due to a lack of available training and validation data. Machine-assisted bounding box annotation detected and labelled fish on out-of-sample datasets with a recall between 0.70 and 0.89, and automated labelling of the 12 targeted species achieved an F1 score of 0.79. On average, 12% of fish were given a bounding box with a species label, while the remaining 88% were located, given the generic “fish” label, and flagged for manual labelling. This combined, machine-assisted approach is a significant advancement towards the applied use of deep learning for fish species detection in analysis workflows and has potential for uptake by fish ecologists if integrated into video analysis software. Manual labelling and classification effort is still required, and a community effort to address the severe paucity of training data would improve automation accuracy and encourage increased uptake.
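    For reference, a minimal example of how recall, precision and an F1 score like those quoted above are computed from matched detections; the counts are placeholders, not the paper's confusion matrix, and matching predicted boxes to ground-truth fish (e.g. by IoU) is assumed to have been done already.

```python
# Illustrative recall / precision / F1 computation from detection counts.
# Counts are placeholders, not results from the paper; a predicted box is
# assumed to have already been matched to a ground-truth fish (e.g. by IoU).

def detection_scores(true_positives: int, false_positives: int, false_negatives: int):
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

p, r, f1 = detection_scores(true_positives=790, false_positives=210, false_negatives=200)
print(f"precision={p:.2f}  recall={r:.2f}  F1={f1:.2f}")
```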
