Geodetic monitoring of complex shaped infrastructures using Ground-Based InSAR
In the context of climate change, alternatives to fossil fuels must be used wherever possible to produce electricity. Hydroelectric power generation with dams is one of the most effective options. To ensure their safe operation, various monitoring sensors can be installed, with differing characteristics in terms of spatial resolution, temporal resolution and accuracy. Among the available techniques, ground-based synthetic aperture radar (GB-SAR) has not yet been widely adopted for this purpose. Despite its good balance between the aforementioned attributes, its sensitivity to atmospheric disturbances, its specific acquisition geometry and the need for phase unwrapping have limited its use. Several processing strategies are developed in this thesis to exploit the full potential of GB-SAR systems, such as continuous, flexible and autonomous observation combined with high resolution and accuracy.
The first challenge is to accurately localise the GB-SAR and estimate its azimuth in order to improve the geocoding of the image in the subsequent step. A ray-tracing algorithm and tomographic techniques are used to recover these external parameters of the sensor. Validation with corner reflectors confirms a significant error reduction. For the subsequent geocoding, however, challenges persist for vertical structures because of foreshortening and layover, which notably degrade the geocoding quality of the observed points. These issues arise when multiple points at different elevations fall within a single resolution cell, making it difficult to pinpoint the precise location of the scattering point responsible for the signal return. To overcome these problems, a Bayesian approach based on intensity models is formulated, providing a tool to improve the accuracy of the geocoding process. The approach is validated on a dam with a very particular structure in the Black Forest in Germany.
The second part of this thesis focuses on the feasibility of using GB-SAR systems for long-term geodetic monitoring of large structures. A first assessment tests large temporal baselines between acquisitions for epoch-wise monitoring. Because of large displacements, phase unwrapping cannot recover all the information; an improvement is achieved by adapting the geometry of the signal processing using principal component analysis. The main case study consists of several campaigns from different stations at Enguri Dam in Georgia. The consistency of the estimated displacement map is assessed by comparison with a numerical model calibrated on plumb-line data. The two results agree strongly, supporting the use of GB-SAR for epoch-wise monitoring, as it can measure several thousand points on the dam. The comparison also demonstrates the possibility of detecting local anomalies in the numerical model. Finally, the instrument has been installed for continuous monitoring at Enguri Dam for over two years. A dedicated processing flowchart is developed to eliminate the drift that occurs with classical interferometric algorithms and thereby achieve the accuracy required for geodetic monitoring. Analysis of the obtained time series yields results that are highly plausible under classical parametric models of dam deformation. Moreover, the results of this processing strategy are also compared with the numerical model and show high consistency. A final confirmation comes from comparing the GB-SAR time series with the output of four GNSS stations installed on the dam crest.
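The limit on epoch-to-epoch displacement mentioned above comes from the ambiguity of wrapped interferometric phase. A minimal one-dimensional sketch (with an assumed Ku-band wavelength; not the thesis pipeline) illustrates how unwrapping succeeds for well-sampled motion and aliases once displacements between acquisitions grow too large:

```python
import numpy as np

# True line-of-sight displacement (mm) converted to interferometric phase.
# A Ku-band GB-SAR wavelength of ~17.4 mm is a typical value (assumption).
wavelength_mm = 17.4
true_disp_mm = np.linspace(0.0, 30.0, 16)      # slow, well-sampled motion
true_phase = 4 * np.pi * true_disp_mm / wavelength_mm

# The sensor only measures phase wrapped into (-pi, pi].
wrapped = np.angle(np.exp(1j * true_phase))

# Standard 1-D unwrapping recovers the motion when successive phase steps
# stay below pi (i.e. less than a quarter wavelength of motion per epoch).
recovered = np.unwrap(wrapped)
assert np.allclose(recovered, true_phase)

# With a large temporal baseline the same motion is undersampled and
# unwrapping aliases: the recovered phase history is wrong.
sparse_phase = true_phase[::8]                 # big jumps between epochs
aliased = np.unwrap(np.angle(np.exp(1j * sparse_phase)))
print(np.allclose(aliased, sparse_phase))      # → False: information lost
```

The quarter-wavelength-per-epoch bound is why short revisit times, or the geometry adaptations described above, are needed for large displacements.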
The developed algorithms and methods extend the capabilities of GB-SAR for dam monitoring in different configurations. It can be a valuable supplement to other classical sensors for long-term geodetic observation as well as for short-term monitoring during particular dam operations.
Flood dynamics derived from video remote sensing
Flooding is by far the most pervasive natural hazard, with the human impacts of floods expected to worsen in the coming decades due to climate change. Hydraulic models are a key tool for understanding flood dynamics and play a pivotal role in unravelling the processes that occur during a flood event, including inundation flow patterns and velocities. In the realm of river basin dynamics, video remote sensing is emerging as a transformative tool that can offer insights into flow dynamics and thus, together with other remotely sensed data, has the potential to be deployed to estimate discharge. Moreover, the integration of video remote sensing data with hydraulic models offers a pivotal opportunity to enhance the predictive capacity of these models.
Hydraulic models are traditionally built with accurate terrain, flow and bathymetric data and are often calibrated and validated against observed data to obtain meaningful and actionable model predictions. Data for accurately calibrating and validating hydraulic models are not always available, leaving the predictive capabilities of some models deployed in flood risk management in question. Recent advances in remote sensing have heralded the availability of vast, high-resolution video datasets. The parallel evolution of computing capabilities, coupled with advancements in artificial intelligence, is enabling the processing of data at unprecedented scales and complexities, allowing us to glean meaningful insights from datasets that can be integrated with hydraulic models. The aims of the research presented in this thesis were twofold. The first aim was to evaluate and explore the potential applications of video from air- and space-borne platforms to comprehensively calibrate and validate two-dimensional hydraulic models. The second aim was to estimate river discharge using satellite video combined with high-resolution topographic data. In the first of three empirical chapters, non-intrusive image velocimetry techniques were employed to estimate river surface velocities in a rural catchment. For the first time, a 2D hydraulic model was fully calibrated and validated using velocities derived from Unpiloted Aerial Vehicle (UAV) image velocimetry approaches. This highlighted the value of these data in mitigating the limitations associated with traditional data sources used in parameterizing two-dimensional hydraulic models. This finding inspired the subsequent chapter, where river surface velocities, derived using Large Scale Particle Image Velocimetry (LSPIV), and flood extents, derived using deep neural network-based segmentation, were extracted from satellite video and used to rigorously assess the skill of a two-dimensional hydraulic model.
Harnessing the ability of deep neural networks to learn complex features and deliver accurate and contextually informed flood segmentation, the potential value of satellite video for validating two-dimensional hydraulic model simulations is demonstrated. In the final empirical chapter, the convergence of satellite video imagery and high-resolution topographical data bridges the gap between visual observations and quantitative measurements by enabling the direct extraction of velocities from video imagery, which is used to estimate river discharge. Overall, this thesis demonstrates the significant potential of emerging video-based remote sensing datasets and offers approaches for integrating these data into hydraulic modelling and discharge estimation practice. The incorporation of LSPIV techniques into flood modelling workflows signifies a methodological progression, especially in areas lacking robust data collection infrastructure. Satellite video remote sensing heralds a major step forward in our ability to observe river dynamics in real time, with potentially significant implications for flood modelling science.
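The core of an LSPIV-style velocity estimate is a cross-correlation between two frames of an image sequence: the correlation peak gives the displacement of the surface texture, which a ground sampling distance and frame interval convert to a velocity. A self-contained toy sketch (synthetic texture; the GSD and frame gap are assumptions, and this is not the workflow used in the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic frames: a textured water-surface patch that advects 3 px right
# and 1 px down between frames (illustrative values).
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, (1, 3), axis=(0, 1))

# FFT-based cross-correlation; the peak location gives the most likely
# integer-pixel displacement of the interrogation window.
f = np.fft.fft2(frame_a - frame_a.mean())
g = np.fft.fft2(frame_b - frame_b.mean())
corr = np.fft.ifft2(f.conj() * g).real
dy, dx = np.unravel_index(corr.argmax(), corr.shape)
# Map FFT indices to signed shifts.
dy = dy - corr.shape[0] if dy > corr.shape[0] // 2 else dy
dx = dx - corr.shape[1] if dx > corr.shape[1] // 2 else dx

# Convert pixel displacement to surface velocity (assumed GSD and frame gap).
gsd_m = 0.05   # metres per pixel (assumption)
dt_s = 0.5     # seconds between frames (assumption)
velocity = np.hypot(dx, dy) * gsd_m / dt_s
print(dy, dx, round(velocity, 3))   # → 1 3 0.316
```

Real LSPIV adds sub-pixel peak fitting, window overlap and outlier filtering, but the correlation step above is the heart of the method.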
Machine learning applications in search algorithms for gravitational waves from compact binary mergers
Gravitational waves from compact binary mergers are now routinely observed by Earth-bound detectors. These observations enable exciting new science, as they have opened a new window to the Universe.
However, extracting gravitational-wave signals from the noisy detector data is a challenging problem. The most sensitive search algorithms for compact binary mergers use matched filtering, an algorithm that compares the data with a set of expected template signals. As detectors are upgraded and more sophisticated signal models become available, the number of required templates will increase, which can make some sources computationally prohibitive to search for. The computational cost is of particular concern when low-latency alerts must be issued to maximize the time for electromagnetic follow-up observations. One potential solution to reduce computational requirements, explored over the last decade, is machine learning. However, different proposed deep learning searches target varying parameter spaces and use metrics that are not always comparable to the existing literature. Consequently, a clear picture of the capabilities of machine learning searches has been missing.
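The matched-filter idea above can be sketched in a few lines: against white, unit-variance noise, filtering reduces to sliding a unit-norm template along the data, and the output is directly in signal-to-noise units. The chirp-like template and injection parameters below are toy assumptions, not a production pipeline (which whitens by the detector's measured noise spectrum and works in the frequency domain):

```python
import numpy as np

rng = np.random.default_rng(1)

# Template: a short chirp standing in for a compact-binary waveform (toy model).
t = np.linspace(0.0, 1.0, 4096)
template = np.sin(2 * np.pi * (30 * t + 40 * t**2)) * np.exp(-((t - 1.0) ** 2) / 0.1)
template /= np.linalg.norm(template)       # unit-norm template

# Data: unit-variance white noise with the signal injected at a known offset.
n = 16384
data = rng.normal(0.0, 1.0, n)
offset, amplitude = 9000, 8.0              # injection parameters (assumptions)
data[offset:offset + template.size] += amplitude * template

# Against white noise, matched filtering reduces to a sliding correlation;
# with a unit-norm template and unit-variance noise the output is already
# in signal-to-noise units.
snr = np.correlate(data, template, mode="valid")
peak = int(np.argmax(np.abs(snr)))
peak_snr = float(abs(snr[peak]))
print(peak, round(peak_snr, 1))            # peak near 9000, S/N near 8
```

The template-bank cost discussed above comes from repeating this correlation for every template in the bank, at every time.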
In this thesis, we closely examine the sensitivity of various deep learning gravitational-wave search algorithms and introduce new methods to detect signals from binary black hole and binary neutron star mergers at previously untested statistical confidence levels. By using the sensitive distance as our core metric, we allow for a direct comparison of our algorithms to state-of-the-art search pipelines. As part of this thesis, we organized a global mock data challenge to create a benchmark for machine learning search algorithms targeting compact binaries. This way, the tools developed in this thesis are made available to the greater community by publishing them as open source software.
Our studies show that, depending on the parameter space, deep learning gravitational-wave search algorithms are already competitive with current production search pipelines. We also find that strategies developed for traditional searches can be effectively adapted to their machine learning counterparts. However, in regions where matched filtering becomes computationally expensive, available deep learning algorithms are also limited in their capability: we find reduced sensitivity to long-duration signals compared with the excellent results for short-duration binary black hole signals.
Satellite remote sensing of surface winds, waves, and currents: Where are we now?
This review paper reports on the state of the art concerning observations of surface winds, waves, and currents from space and their use for scientific research and subsequent applications. The development of observations of sea-state parameters from space dates back to the 1970s, with a significant increase in the number and diversity of space missions since the 1990s. Sensors used to monitor sea-state parameters from space are mainly based on microwave techniques. They are either specifically designed to monitor surface parameters or are used for their ability to provide opportunistic measurements complementary to their primary purpose. The principles on which the estimation of the sea-surface parameters is based are first described, including the performance and limitations of each method. Numerous examples and references on the use of these observations for scientific and operational applications are then given. The richness and diversity of these applications are linked to the importance of knowledge of the sea state in many fields. Firstly, surface winds, waves, and currents are significant factors influencing exchanges at the air/sea interface, impacting oceanic and atmospheric boundary layers, contributing to sea level rise at the coasts, and interacting with sea-ice formation or destruction in the polar zones. Secondly, ocean surface currents combined with wind- and wave-induced drift contribute to the transport of heat, salt, and pollutants. Waves and surface currents also impact sediment transport and erosion in coastal areas. For operational applications, observations of surface parameters are necessary on the one hand to constrain the numerical solutions of predictive models (numerical wave, oceanic, or atmospheric models), and on the other hand to validate their results.
In turn, these predictive models are used to guarantee safe, efficient, and successful offshore operations, including commercial shipping and the energy sector, as well as tourism and coastal activities. Long time series of global sea-state observations are also becoming increasingly important for analyzing the impact of climate change on our environment. All these aspects are recalled in the article, relating to both historical and contemporary activities in these fields.
Exploring the calibration of cosmological probes used in gravitational-wave and multi-messenger astronomy
The field of gravitational wave astronomy has grown remarkably since the first direct detection of gravitational waves on 14th September 2015. The signal, originating from the merger of two black holes, was detected by the two US-based Advanced LIGO interferometers in Hanford (Washington State) and Livingston (Louisiana). The second observing run of the Advanced LIGO and Virgo detectors marked the first detection of a binary neutron star merger, along with its electromagnetic counterparts. The optical follow-up of the merger led to the first confirmed observations of a kilonova, an electromagnetic counterpart to binary neutron star and neutron star-black hole mergers whose existence was first predicted in the 1970s. Following the multimessenger observations of the binary neutron star merger GW170817, constraints were placed on the rate of expansion of the Universe using both gravitational wave and electromagnetic data. These measurements could help us understand the current tension between early-Universe and late-Universe measurements of the Hubble constant H0. The use of gravitational wave signals for measuring the rate of expansion of the Universe was proposed by Schutz in 1986. Compact binary coalescences can be used as distance markers, a gravitational wave analogue to standard candles: "standard sirens". Measurements of the Hubble constant from standard sirens are independent of previous methods of constraining H0. Bright sirens are gravitational wave signals that are detected in coincidence with electromagnetic signatures. These "bright" gravitational wave sirens are powerful cosmological probes, allowing us to extract information on both the distance and the redshift of the source. It is therefore important to maximise these coincident detections, and to carefully calibrate the data extracted from any standard siren.
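In the low-redshift limit, the bright-siren measurement reduces to Hubble's law: the waveform gives the luminosity distance, the electromagnetic counterpart identifies the host galaxy and hence the redshift, and H0 ≈ cz/d_L. The GW170817-like numbers below are illustrative assumptions; real analyses marginalise over inclination, peculiar velocity and measurement uncertainty:

```python
# Low-redshift standard-siren estimate: H0 ≈ c * z / d_L.
# Numbers are illustrative, GW170817-like values (assumptions):
# a luminosity distance of ~43 Mpc from the gravitational waveform and a
# Hubble-flow redshift of ~0.0100 for the host galaxy.
c_km_s = 299792.458        # speed of light, km/s
z = 0.0100                 # Hubble-flow redshift of the host (assumption)
d_L_mpc = 42.9             # luminosity distance from the waveform (assumption)

H0 = c_km_s * z / d_L_mpc  # km/s/Mpc
print(round(H0, 1))        # → 69.9, between early- and late-Universe values
```

The degeneracy between distance and inclination in the waveform is what dominates the error budget of a single bright siren.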
The work presented in this thesis can be divided into three main topics, all under the umbrella of maximising the scientific return from observations of compact binary coalescences. These three topics are: kilonova parameter estimation, cosmology with gravitational waves, and calibration of advanced gravitational wave detectors. We present work on inferring parameters from kilonova light curves. Ejecta parameters and information about the merging time of the progenitor are extracted from simulated kilonova light curves. We explore the consequences of neglecting some aspects of the microphysics on the resulting parameter estimation. We also present new results on the inference of the Hubble constant through the application of a robust test of galaxy catalogue completeness to the current gravitational wave cosmology pipeline. We explore the impact of adopting a robust estimate of the apparent magnitude threshold m_thr for the galaxy catalogues used in gravitational wave cosmology on the final inference of the Hubble constant H0 from standard sirens, and compare the results to those obtained when adopting a conservative estimate for m_thr. Finally, we present the first results from the prototype of a Newtonian Calibrator at the LIGO Hanford detector. Calibrating the LIGO detectors is crucial to the extraction of the gravitational wave source parameters that are used in cosmology with standard sirens.
The BINGO Project IX: Search for Fast Radio Bursts -- A Forecast for the BINGO Interferometry System
The Baryon Acoustic Oscillations (BAO) from Integrated Neutral Gas Observations (BINGO) radio telescope will use the neutral hydrogen emission line to map the Universe in the redshift range , with the main goal of probing BAO. In addition, the instrument's optical design and hardware configuration support the search for Fast Radio Bursts (FRBs). In this work, we propose the use of a BINGO Interferometry System (BIS) including new auxiliary, smaller radio telescopes (hereafter "outriggers"). The interferometric approach makes it possible to pinpoint the FRB sources in the sky. We present here the results of several BIS configurations combining BINGO horns with and without mirrors ( m, m, and m) and 5, 7, 9, or 10 single horns. We developed a new Python package, FRBlip, which generates synthetic FRB mock catalogs and computes, based on a telescope model, the observed signal-to-noise ratio (S/N), which we used to compute numerically the detection rates of the telescopes and how many interferometric pairs of telescopes (baselines) can observe an FRB. FRBs observed by more than one baseline are the ones whose location can be determined. We thus evaluate the performance of the BIS regarding FRB localization. We found that the BIS will be able to localize 23 FRBs yearly with single-horn outriggers in the best configuration (using 10 outriggers with 6 m mirrors), with redshift ; the full localization capability depends on the number and type of the outriggers. Wider beams are best for pinpointing FRB sources because potential candidates will be observed by more baselines, while narrow beams look deeper in redshift. The BIS can be a powerful extension of the regular BINGO telescope, dedicated to observing hundreds of FRBs during Phase 1. Many of them will be well localized with a single horn + 6 m dish as outriggers. (Abridged)

Comment: 12 pages, 9 figures, 5 tables, submitted to A&
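The localization criterion described in the abstract (an FRB can be localized only if it is seen by more than one baseline, and a baseline sees the burst only if both of its telescopes detect it) can be illustrated with a toy count; the telescope names, S/N values and threshold below are purely hypothetical, not BIS parameters:

```python
from itertools import combinations

# Toy per-telescope S/N for one FRB: a main station plus hypothetical
# outrigger stations (names and values are illustrative only).
snr = {"BINGO": 14.0, "out1": 6.2, "out2": 4.1, "out3": 7.5, "out4": 2.8}
threshold = 5.0  # per-telescope detection threshold (assumption)

# A telescope detects the burst when its S/N clears the threshold.
detected = [name for name, s in snr.items() if s >= threshold]

# A baseline observes the burst only if both of its telescopes detect it;
# localization requires more than one such baseline.
baselines = list(combinations(detected, 2))
print(len(baselines), len(baselines) > 1)   # → 3 True: this burst is localizable
```

This is also why the abstract notes that wider beams help localization: more telescopes see each burst, so more baselines clear the two-detection requirement.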
Development of a SQUID magnetometry system for a cryogenic neutron electric dipole moment experiment
A measurement of the neutron electric dipole moment (nEDM) could hold the key to understanding why the visible universe is the way it is: why matter should predominate over antimatter. As a charge-parity violating (CPV) quantity, an nEDM could provide an insight into new mechanisms that address this baryon asymmetry. The motivation for an improved sensitivity to an nEDM is to find it to be non-zero at a level consistent with certain beyond the Standard Model theories that predict new sources of CPV, or to establish a new limit that constrains them.
CryoEDM is an experiment that sought to better the current limit of cm by an order of magnitude. It is designed to measure the nEDM via the Ramsey Method of Separated Oscillatory Fields, in which it is critical that the magnetic field remains stable throughout. A way of accurately tracking the magnetic fields, moreover at a temperature K, is crucial for CryoEDM, and for future cryogenic projects.
This thesis presents work focussing on the development of a 12-SQUID magnetometry system for CryoEDM that enables the magnetic field to be monitored to a precision of pT. A major component of its infrastructure is the superconducting capillary shields, which screen the input lines of the SQUIDs from the pickup of spurious magnetic fields that would perturb a SQUID's measurement. These are shown to have a transverse shielding factor of , which is a few orders of magnitude greater than the calculated requirement.
Efforts to characterise the shielding of the SQUID chips themselves are also discussed. The use of Cryoperm for shields reveals a tension between improved SQUID noise and worse neutron statistics. Investigations show that without it, SQUIDs have an elevated noise when cooled in a substantial magnetic field; with it, magnetostatic simulations suggest that it is detrimental to the polarisation of neutrons in transport. The findings suggest that with proper consideration, it is possible to reach a compromise between the two behaviours.
Computational work to develop a simulation of SQUID data is detailed, which is based on the Laplace equation for the magnetic scalar potential. These data are ultimately used in the development of a linear regression technique to determine the volume-averaged magnetic field in the neutron cells. This proves highly effective in determining the fields to within the pT requirement under certain conditions.
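The regression step can be sketched with a simplified stand-in: model the field along the cell axis as a uniform term plus a linear gradient, fit the coefficients to the SQUID readings by least squares, and evaluate the average over the cell. The sensor positions, field values and noise level below are invented for illustration; the thesis derives its basis functions from Laplace-equation simulations rather than this polynomial toy:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical SQUID pickup positions along the cell axis (metres) and a
# "true" field made of a uniform part plus a linear gradient (illustrative).
z = np.array([-0.20, -0.10, 0.00, 0.10, 0.20, 0.30])
true_b0, true_grad = 5.0, 0.8               # pT and pT/m (assumed scale)
readings = true_b0 + true_grad * z + rng.normal(0.0, 0.01, z.size)

# Linear regression: design matrix [1, z] -> coefficients (B0, dB/dz).
A = np.column_stack([np.ones_like(z), z])
coeffs, *_ = np.linalg.lstsq(A, readings, rcond=None)

# For a linear field, the volume average over a cell spanning z in [0, 0.1] m
# is just the field at the cell midpoint.
cell_avg = coeffs[0] + coeffs[1] * 0.05
print(round(float(cell_avg), 2))            # close to the true 5.04 pT
```

The same least-squares machinery generalises directly to richer harmonic bases: only the columns of the design matrix change.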
Microwave-based quantum control and coherence protection of tin-vacancy spin qubits in a strain-tuned diamond membrane heterostructure
Robust spin-photon interfaces in solids are essential components in quantum networking and sensing technologies. Ideally, these interfaces combine a long-lived spin memory, coherent optical transitions, fast and high-fidelity spin manipulation, and straightforward device integration and scaling. The tin-vacancy center (SnV) in diamond is a promising spin-photon interface with desirable optical and spin properties at 1.7 K. However, the SnV spin lacks efficient microwave control, and its spin coherence degrades at higher temperatures. In this work, we introduce a new platform that overcomes these challenges: SnV centers in uniformly strained thin diamond membranes. The controlled generation of crystal strain introduces orbital mixing that allows microwave control of the spin state with 99.36(9)% gate fidelity and spin-coherence protection beyond a millisecond. Moreover, the presence of crystal strain suppresses temperature-dependent dephasing processes, leading to a considerable improvement of the coherence time, up to 223(10) s at 4 K, a widely accessible temperature in common cryogenic systems. Critically, the coherence of the optical transitions is unaffected by the elevated temperature, exhibiting nearly lifetime-limited optical linewidths. Combined with the compatibility of diamond membranes with device integration, the demonstrated platform is an ideal spin-photon interface for future quantum technologies.