922 research outputs found

    Geodetic monitoring of complex shaped infrastructures using Ground-Based InSAR

    In the context of climate change, alternatives to fossil energies need to be used as much as possible to produce electricity. Hydroelectric power generation through the use of dams stands out as one of the most effective methods in this endeavour. Various monitoring sensors with different characteristics with respect to spatial resolution, temporal resolution and accuracy can be installed to assess their safe operation. Among the array of techniques available, ground-based synthetic aperture radar (GB-SAR) has not yet been widely adopted for this purpose. Despite its remarkable balance between the aforementioned attributes, its sensitivity to atmospheric disturbances, its specific acquisition geometry, and the need for phase unwrapping collectively constrain its usage. Several processing strategies are developed in this thesis to capitalise on all the opportunities offered by GB-SAR systems, such as continuous, flexible and autonomous observation combined with high resolution and accuracy. The first challenge to solve is accurately localising the GB-SAR and estimating its azimuth in order to improve the geocoding of the image in the subsequent step. A ray tracing algorithm and tomographic techniques are used to recover these external parameters of the sensor. The introduction of corner reflectors for validation purposes confirms a significant error reduction. However, for the subsequent geocoding, challenges persist in scenarios involving vertical structures due to foreshortening and layover, which notably compromise the geocoding quality of the observed points. These issues arise when multiple points at varying elevations are encapsulated within a single resolution cell, making it difficult to pinpoint the precise location of the scattering point responsible for the signal return.
To surmount these hurdles, a Bayesian approach grounded in intensity models is formulated, offering a tool to enhance the accuracy of the geocoding process. The validation is carried out on a dam in the Black Forest in Germany, characterised by a very specific structure. The second part of this thesis focuses on the feasibility of using GB-SAR systems for long-term geodetic monitoring of large structures. A first assessment is made by testing large temporal baselines between acquisitions for epoch-wise monitoring. Due to large displacements, phase unwrapping cannot recover all the information. An improvement is made by adapting the geometry of the signal processing using principal component analysis. The main case study consists of several campaigns from different stations at Enguri Dam in Georgia. The consistency of the estimated displacement map is assessed by comparing it to a numerical model calibrated on plumb-line data. The comparison exhibits strong agreement between the two results and supports the use of GB-SAR for epoch-wise monitoring, as it can measure several thousand points on the dam. It also demonstrates the possibility of detecting local anomalies in the numerical model. Finally, the instrument has been installed for continuous monitoring for over two years at Enguri Dam. A dedicated processing flowchart is developed to eliminate the drift occurring with classical interferometric algorithms and achieve the accuracy required for geodetic monitoring. The analysis of the obtained time series yields results consistent with classical parametric models of dam deformations. Moreover, the results of this processing strategy are also compared with the numerical model and demonstrate high consistency. A final confirming result is the comparison of the GB-SAR time series with the output of four GNSS stations installed on the dam crest.
The developed algorithms and methods increase the capabilities of GB-SAR for dam monitoring in different configurations. It can be a valuable supplement to other classical sensors for long-term geodetic observation as well as short-term monitoring during particular dam operations.
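The phase-unwrapping constraint mentioned in the abstract can be made concrete with the standard GB-SAR interferometric relation between phase and line-of-sight displacement. The sketch below is illustrative only, not code from the thesis; the Ku-band wavelength is an assumed typical value for such instruments:

```python
import numpy as np

# Standard GB-SAR interferometric relation (not specific to this thesis):
# a line-of-sight displacement d produces an interferometric phase change
# delta_phi = -4 * pi * d / wavelength, observed modulo 2*pi.

WAVELENGTH_M = 0.0176  # Ku-band (~17.2 GHz), an assumed typical GB-SAR wavelength

def phase_to_displacement(delta_phi_rad):
    """Convert an (unwrapped) interferometric phase change to
    line-of-sight displacement in metres."""
    return -WAVELENGTH_M * delta_phi_rad / (4 * np.pi)

def wrap_phase(phi):
    """Wrap a phase value into (-pi, pi], as the sensor observes it."""
    return np.angle(np.exp(1j * phi))

# A 10 mm displacement exceeds the lambda/4 unambiguous range...
true_d = 0.010  # metres
true_phi = -4 * np.pi * true_d / WAVELENGTH_M
observed = wrap_phase(true_phi)
# ...so the wrapped observation alone is ambiguous: recovering true_phi
# requires phase unwrapping, the step the abstract identifies as a
# limiting factor for large displacements between epochs.
naive_d = phase_to_displacement(observed)
print(f"true: {true_d*1000:.1f} mm, naive (wrapped): {naive_d*1000:.1f} mm")
```

Any displacement larger than a quarter wavelength in line of sight wraps the phase, which is why large inter-epoch movements defeat naive interferometric processing.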

    Flood dynamics derived from video remote sensing

    Flooding is by far the most pervasive natural hazard, with the human impacts of floods expected to worsen in the coming decades due to climate change. Hydraulic models are a key tool for understanding flood dynamics and play a pivotal role in unravelling the processes that occur during a flood event, including inundation flow patterns and velocities. In the realm of river basin dynamics, video remote sensing is emerging as a transformative tool that can offer insights into flow dynamics and thus, together with other remotely sensed data, has the potential to be deployed to estimate discharge. Moreover, the integration of video remote sensing data with hydraulic models offers a pivotal opportunity to enhance the predictive capacity of these models. Hydraulic models are traditionally built with accurate terrain, flow and bathymetric data and are often calibrated and validated using observed data to obtain meaningful and actionable model predictions. Data for accurately calibrating and validating hydraulic models are not always available, leaving the assessment of the predictive capabilities of some models deployed in flood risk management in question. Recent advances in remote sensing have heralded the availability of vast video datasets of high resolution. The parallel evolution of computing capabilities, coupled with advancements in artificial intelligence, is enabling the processing of data at unprecedented scales and complexities, allowing us to glean meaningful insights into datasets that can be integrated with hydraulic models. The aims of the research presented in this thesis were twofold. The first aim was to evaluate and explore the potential applications of video from air- and space-borne platforms to comprehensively calibrate and validate two-dimensional hydraulic models. The second aim was to estimate river discharge using satellite video combined with high resolution topographic data.
In the first of three empirical chapters, non-intrusive image velocimetry techniques were employed to estimate river surface velocities in a rural catchment. For the first time, a 2D hydraulic model was fully calibrated and validated using velocities derived from Unpiloted Aerial Vehicle (UAV) image velocimetry approaches. This highlighted the value of these data in mitigating the limitations associated with traditional data sources used in parameterizing two-dimensional hydraulic models. This finding inspired the subsequent chapter, where river surface velocities, derived using Large Scale Particle Image Velocimetry (LSPIV), and flood extents, derived using deep neural network-based segmentation, were extracted from satellite video and used to rigorously assess the skill of a two-dimensional hydraulic model. Harnessing the ability of deep neural networks to learn complex features and deliver accurate and contextually informed flood segmentation, the potential value of satellite video for validating two-dimensional hydraulic model simulations is exhibited. In the final empirical chapter, the convergence of satellite video imagery and high-resolution topographical data bridges the gap between visual observations and quantitative measurements by enabling the direct extraction of velocities from video imagery, which is used to estimate river discharge. Overall, this thesis demonstrates the significant potential of emerging video-based remote sensing datasets and offers approaches for integrating these data into hydraulic modelling and discharge estimation practice. The incorporation of LSPIV techniques into flood modelling workflows signifies a methodological progression, especially in areas lacking robust data collection infrastructure. Satellite video remote sensing heralds a major step forward in our ability to observe river dynamics in real time, with potentially significant implications in the domain of flood modelling science.
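The block-matching idea underlying (LS)PIV can be sketched in a few lines. The following is a minimal, illustrative example, not the processing chain used in the thesis; the window size, synthetic tracer field and brute-force search are all assumptions for demonstration:

```python
import numpy as np

def piv_displacement(frame_a, frame_b, search=5):
    """Estimate the integer-pixel displacement of the pattern in frame_a
    within frame_b by brute-force normalised cross-correlation.
    A minimal sketch of the block-matching idea behind (LS)PIV; real
    pipelines add sub-pixel fitting, interrogation windows and filtering."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(frame_b, -dy, axis=0), -dx, axis=1)
            a = frame_a - frame_a.mean()
            b = shifted - shifted.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            score = (a * b).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best  # (rows, cols) the tracer pattern moved between frames

# Synthetic example: a random tracer field advected 2 px down, 3 px right.
rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))
frame_b = np.roll(np.roll(frame_a, 2, axis=0), 3, axis=1)
dy, dx = piv_displacement(frame_a, frame_b)
# Dividing the pixel shift by the inter-frame time and multiplying by the
# ground sampling distance would turn it into a surface velocity.
print(dy, dx)
```

Scaling the recovered pixel shift by frame rate and ground sampling distance is what links this image-space estimate to the river surface velocities used to calibrate the hydraulic models.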

    Systemic Circular Economy Solutions for Fiber Reinforced Composites

    This open access book provides an overview of the work undertaken within the FiberEUse project, which developed solutions enhancing the profitability of composite recycling and reuse in value-added products, with a cross-sectorial approach. Glass and carbon fiber reinforced polymers, or composites, are increasingly used as structural materials in many manufacturing sectors such as transport, construction and energy, thanks to their lower weight and better corrosion resistance compared to metals. However, composite recycling is still a challenge, since no significant added value in the recycling and reprocessing of composites has yet been demonstrated. FiberEUse developed innovative solutions and business models towards sustainable Circular Economy solutions for post-use composite-made products. Three strategies are presented, namely mechanical recycling of short fibers, thermal recycling of long fibers, and modular car part design for sustainable disassembly and remanufacturing. The validation of the FiberEUse approach within eight industrial demonstrators shows the potential for new Circular Economy value chains for composite materials.

    Geosciences and the Energy Transition

    A substantial and rapid decarbonisation of the global economy is required to limit anthropogenic climate change to well below 2°C average global heating by 2050. Yet, emissions from fossil fuel energy generation—which dominate global greenhouse gas emissions—are at an all-time high. Progress and action for an energy transition to net zero carbon is critical, and one in which geoscience sectors and geoscientists will play multiple roles. Here, we outline the landscape of the geosciences and the energy transition in the context of the climate crisis, and intergovernmental policies on climate and social justice. We show how geoscience sectors, skills, knowledge, data, and infrastructure, both directly and indirectly, will play a key role in the energy transition. This may be in the responsible sourcing of raw materials for low carbon energy technologies; in the decarbonisation of heating; and in the near-permanent geological capture and storage of carbon through novel technology development. A new and unprecedented challenge is to reach Geological Net Zero, where zero carbon emissions from geological resource production and consumption are achieved via permanent geological storage. We identify overarching and cross-cutting issues for a sustainable and fair net zero carbon energy transition, and the associated geoscience challenges and opportunities. Finally, we call for geoscience professionals to recognise and take responsibility for their role in ensuring a fair and sustainable energy transition at the pace and scale required.

    The European Experience: A Multi-Perspective History of Modern Europe, 1500–2000

    The European Experience brings together the expertise of nearly a hundred historians from eight European universities to internationalise and diversify the study of modern European history, exploring a grand sweep of time from 1500 to 2000. Offering a valuable corrective to the Anglocentric narratives of previous English-language textbooks, scholars from all over Europe have pooled their knowledge on comparative themes such as identities, cultural encounters, power and citizenship, and economic development to reflect the complexity and heterogeneous nature of the European experience. Rather than another grand narrative, the international author teams offer a multifaceted and rich perspective on the history of the continent over the past 500 years. Each major theme is dissected through three chronological sub-chapters, revealing how major social, political and historical trends manifested themselves in different European settings during the early modern (1500–1800), modern (1800–1900) and contemporary period (1900–2000). This resource is of utmost relevance to today’s history students in the light of ongoing internationalisation strategies for higher education curricula, as it delivers one of the first multi-perspective and truly ‘European’ analyses of the continent’s past. Beyond the provision of historical content, this textbook equips students with the intellectual tools to interrogate prevailing accounts of European history, and enables them to seek out additional perspectives in a bid to further enrich the discipline.

    Computational development of models and tools for the kinetic study of astrochemical gas-phase reactions

    This PhD thesis focuses on the application and development of computational tools and methodologies for modeling the kinetics of gas-phase reactions of astrophysical interest in the interstellar medium (ISM). The complexity of investigating chemical reactivity in space is mostly due to the extreme physical conditions of temperature, pressure and exposure to high-energy radiation, which in turn also lead to the formation of exotic species, such as radicals and ions. Nevertheless, there is still much to be understood about the formation of molecules, the major issue being the lack of sufficient laboratory (experimental and computational) studies. A more detailed and accurate study of all the chemical processes occurring in the ISM will provide the data necessary to simulate the chemical evolution of an interstellar cloud over time using kinetic models that include thousands of reactions involving hundreds of species. The collection of the kinetic parameters required for the relevant reactions has led to the growth of several astrochemical databases, such as KIDA and UMIST. However, the data gathered in these catalogues are incomplete and rely extensively on crude estimations and extrapolations. These rates are of paramount importance for a better comprehension of the relative abundances of the chemical compounds that astronomers infer from spectral data recorded by radio telescopes and in-orbit devices such as satellites. Accurate state-of-the-art computational approaches play a fundamental role in analyzing feasible reaction mechanisms and in accurately predicting the associated kinetics. Such approaches usually rely on chemical intuition, where a by-hand search of the most likely pathways is performed. Unfortunately, this procedure can lead to overlooking significant mechanisms, especially when large molecular systems are investigated.
Increasing the size of a molecule can also increase the number of its possible conformers, which can show different chemical reactivity towards the same chemical partner. This leads to very complex chemical reaction networks in which hundreds of chemical species are involved and thousands of chemical reactions can occur. During the last decades, much effort has been devoted to developing computational techniques able to perform extensive and thorough investigations of complex reaction mechanisms. Such approaches rely on automated computational protocols which drastically decrease the risk of missing significant reaction pathways. Furthermore, the accurate characterization of the critical points of the potential energy surfaces (PESs), such as the reactants, intermediates, transition states and products involved in the reaction mechanism, is crucial in order to carry out a reliable kinetic investigation. The kinetic analysis of an erroneous potential energy surface would lead to gross errors in the estimation of the rate constants of the chemical species involved in the reaction. In order to avoid such errors, the combination of high-level electronic structure calculations via composite schemes can help obtain a more precise estimation of the energy barriers involved in the reaction mechanism. It has been proven that "cheap"[1] composite schemes can achieve sub-chemical accuracy without any empirical parameters and with convenient computation times, making them well suited to the purpose of this thesis. In recent decades, many efforts have been made to develop theoretical and computational methodologies to perform accurate numerical simulations of the kinetics of such complex reaction mechanisms over a wide range of thermodynamic conditions that mimic extreme reaction environments such as combustion systems, the atmosphere and the ISM.
Such methodologies are based on the ab initio transition-state-theory-based master equation approach, which allows the determination of rate coefficients and branching ratios of the chemical species involved in complex chemical reactions. This methodology allows accurate predictions of the relative abundances of the reaction products of complex reactions even under conditions of temperature and pressure that are not experimentally accessible, such as those that characterize the ISM. Based on these premises, this dissertation has focused on the application of a computational protocol for the ab initio-based computational modeling and kinetic investigation of gas-phase reactions which can occur in the ISM. This protocol is based on the application of validated methodologies for the automated discovery of complex reaction mechanisms by means of the AutoMeKin[2] program, the accurate calculation of the energetics of the potential energy surfaces (PESs) through the junChS and junChS-F12a "cheap" composite schemes, and the kinetic investigation using the StarRate computer program, specifically designed to study gas-phase reactions of astrochemical interest, in conjunction with the MESS program. Furthermore, this dissertation has also focused on the development and implementation of StarRate, a computer program for the accurate calculation of the kinetics of multi-step chemical reactions through a chemical master equation approach. StarRate is an object-based program written in the so-called F language. It is structured in three main modules, namely molecules, steps and reactions, which extract the properties needed to calculate the kinetics of the single-step reactions participating in the overall reaction. Another module, in_out, handles the program’s input and output operations.
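The canonical rate coefficients discussed above have a textbook closed form. The sketch below is an illustration of the Eyring transition-state-theory expression, not StarRate's or MESS's implementation (those operate on microcanonical analogues of this formula); the 20 kJ/mol barrier is an arbitrary assumed value:

```python
import math

# Physical constants (SI)
KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(delta_g_dagger_kj_mol, temperature_k):
    """Canonical transition-state-theory rate constant (Eyring equation)
    for a unimolecular step: k(T) = (kB*T/h) * exp(-DeltaG^ / (R*T)).
    This is only the textbook canonical form, not a master-equation
    treatment with tunneling and pressure dependence."""
    return (KB * temperature_k / H) * math.exp(
        -delta_g_dagger_kj_mol * 1e3 / (R * temperature_k))

# A barrier that is easily crossed at room temperature becomes
# prohibitive at the ~10 K temperatures of cold interstellar clouds,
# which is why accurate barrier heights matter so much for ISM kinetics.
for t in (300.0, 100.0, 10.0):
    print(f"T = {t:5.0f} K  k = {eyring_rate(20.0, t):.3e} s^-1")
```

The steep temperature dependence shown here is why sub-chemical accuracy in the computed energy barriers, as targeted by the "cheap" composite schemes, is essential for reliable ISM abundances.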
The main program, starrate, controls the sequence of calls to the procedures contained in each of the three main modules. Through this modular structure, StarRate[3] can compute canonical and microcanonical rate coefficients, accounting for tunneling effects and for the energy-dependent and time-dependent evolution of the concentrations of the species involved in the reaction mechanism. This protocol has been applied to investigate the formation mechanisms of some complex interstellar polyatomic molecules, named interstellar complex organic molecules (iCOMs). More specifically, the formation of prebiotic iCOMs in space has raised considerable interest in the scientific community, because they are considered precursors of more complex biological systems involved in the origin of life in the Universe. Debate on the origins of these biomolecular building blocks has been further stimulated by the discovery of nucleobases and amino acids in meteorites and other extraterrestrial sources. However, little is known about the chemistry that leads to the formation of such compounds.
References:
[1] Jacopo Lupi, Silvia Alessandrini, Cristina Puzzarini, and Vincenzo Barone. junChS and junChS-F12 models: Parameter-free efficient yet accurate composite schemes for energies and structures of noncovalent complexes. Journal of Chemical Theory and Computation, 17(11):6974–6992, 2021. PMID: 34677974.
[2] Emilio Martínez-Núñez, George L. Barnes, David R. Glowacki, Sabine Kopec, Daniel Peláez, Aurelio Rodríguez, Roberto Rodríguez-Fernández, Robin J. Shannon, James J. P. Stewart, Pablo G. Tahoces, and Saulo A. Vazquez. AutoMeKin2021: An open-source program for automated reaction discovery. Journal of Computational Chemistry, 42(28):2036–2048, 2021.
[3] Surajit Nandi, Bernardo Ballotta, Sergio Rampino, and Vincenzo Barone. A general user-friendly tool for kinetic calculations of multi-step reactions within the virtual multifrequency spectrometer project. Applied Sciences, 10(5), 2020.

    GAC-MAC-SGA 2023 Sudbury Meeting: Abstracts, Volume 46


    Detection of Trend Deviations and Predictive Analysis for Real-Time Event Stream Processing

    Information systems produce various types of event logs. The historical data contained in these logs can reveal important information about the execution of a business process. To be useful, the growing volume of collected data must be processed in order to extract relevant information. In many situations, it can be desirable to look for trends in these logs. In particular, trends computed by processing and analysing the sequence of events generated by multiple instances of the same process serve as a basis for producing forecasts about current executions of the process. The objective of this thesis is to propose a generic framework for trend analysis over these event streams, in real time. First, we show how trends of various types can be computed over event logs in real time, using a generic framework called the trend distance workflow. Many common computations on event streams turn out to be special cases of this workflow, depending on how its parameters are defined. The natural continuation of static trend analysis is the use of learning algorithms. We therefore combine the concepts of event stream processing and machine learning to create a framework that enables the computation of various types of predictions over event logs. The proposed framework is generic: by providing different definitions for a handful of event functions, several different types of predictions can be computed using the same basic workflow. Both approaches were implemented and evaluated experimentally by extending an existing event stream processing engine called BeepBeep.
Experimental results show that deviations from a reference trend can be detected in real time for streams producing up to thousands of events per second.
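The trend distance idea can be sketched compactly. The example below is not BeepBeep code; the concrete trend (a symbol frequency distribution), distance (total variation) and threshold are illustrative assumptions standing in for the workflow's pluggable parameters:

```python
from collections import Counter, deque

def trend_deviations(events, reference, window=100, threshold=0.25):
    """Minimal sketch of a 'trend distance' pattern: maintain a trend
    (here, the frequency distribution of event symbols) over a sliding
    window, compare it to a reference trend with a distance (here,
    total variation), and report the indices where the distance exceeds
    a threshold. Trend, distance and threshold are all parameters of
    the generic workflow; these concrete choices are illustrative."""
    buf = deque(maxlen=window)
    alarms = []
    for i, ev in enumerate(events):
        buf.append(ev)
        n = len(buf)
        counts = Counter(buf)
        freq = {k: counts.get(k, 0) / n for k in set(reference) | set(counts)}
        dist = 0.5 * sum(abs(freq.get(k, 0) - reference.get(k, 0))
                         for k in freq)
        if n == window and dist > threshold:
            alarms.append(i)
    return alarms

# A process normally emits 'a' and 'b' evenly; a burst of 'c' events
# is a deviation from that reference trend.
ref = {"a": 0.5, "b": 0.5}
stream = ["a", "b"] * 100 + ["c"] * 60
alarms = trend_deviations(stream, ref, window=100, threshold=0.25)
print(alarms[0] if alarms else "no deviation")
```

Because the window, trend and distance are incremental, each event is processed in bounded time, which is what makes this kind of monitoring feasible at thousands of events per second.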

    Security Technologies and Methods for Advanced Cyber Threat Intelligence, Detection and Mitigation

    The rapid growth of Internet interconnectivity and the complexity of communication systems have led to significant global growth in cyberattacks, often with severe and disastrous consequences. The swift development of more innovative and effective (cyber)security solutions and approaches that can detect, mitigate and prevent these serious consequences is vital. Cybersecurity is gaining momentum and is scaling up in very many areas. This book builds on the experience of the Cyber-Trust EU project’s methods, use cases, technology development, testing and validation, and extends into broader science, the IT industry market, and applied research with practical cases. It offers new perspectives on advanced (cyber)security innovation (eco)systems covering key different perspectives. The book provides insights into new security technologies and methods for advanced cyber threat intelligence, detection and mitigation. We cover topics such as cybersecurity and AI, cyber-threat intelligence, digital forensics, moving target defense, intrusion detection systems, post-quantum security, privacy and data protection, security visualization, smart contracts security, software security, blockchain, security architectures, system and data integrity, trust management systems, distributed systems security, dynamic risk management, and privacy and ethics.