
    Bridging the capability gap in environmental gamma-ray spectrometry

    Environmental gamma-ray spectroscopy provides a powerful tool for environmental monitoring: it offers a compromise between measurement time and accuracy, allowing large areas to be surveyed quickly and relatively inexpensively. Depending on monitoring objectives, spectral information can be analysed in real time or post-survey to characterise contamination and identify potential anomalies. Smaller-volume detectors are of particular worth to environmental surveys as they can be operated in the most demanding environments. However, difficulties are encountered in selecting a detector that is robust enough for environmental surveying yet still provides a high-quality signal. Furthermore, shortcomings remain in the methods employed for robust spectral processing, since a number of complexities need to be overcome, including the non-linearity of detector response with source burial depth, large counting uncertainties, heterogeneity in the natural background and unreliable methods for detector calibration. This thesis aimed to investigate the application of machine learning algorithms to environmental gamma-ray spectroscopy data, identifying changes in spectral shape within large Monte Carlo calibration libraries to estimate source characteristics for unseen field results. Additionally, a 71 × 71 mm lanthanum bromide detector was tested alongside a conventional 71 × 71 mm sodium iodide detector to assess whether its higher energy efficiency and resolution could make it more reliable in handheld surveys. The research presented in this thesis demonstrates that machine learning algorithms can be successfully applied to noisy spectra to produce valuable source estimates. Of note were the novel characterisation estimates made on borehole and handheld detector measurements taken from land historically contaminated with 226Ra.
    Through a novel combination of noise suppression and neural networks, the burial depth, activity and source extent of contamination were estimated and mapped. Furthermore, it was demonstrated that machine learning techniques could be operated in real time to identify hazardous 226Ra-containing hot particles with much greater confidence than current deterministic approaches such as the gross counting algorithm. It was concluded that remediation of 226Ra-contaminated legacy sites could be greatly improved using the methods described in this thesis. Finally, neural networks were also applied to estimate the activity distribution of 137Cs, derived from the nuclear industry, in an estuarine environment. Findings demonstrated the method to be theoretically sound but practically inconclusive, given that much of the contamination at the site was buried beyond the detection limits of the method. It was generally concluded that the noise posed by intrinsic counts in the 71 × 71 mm lanthanum bromide detector was too substantial for any significant improvement over a comparable sodium iodide detector in contamination characterisation using 1-second counts.
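    The core regression task described above — estimating source characteristics from spectral shape learned on a simulated calibration library — can be sketched as follows. Everything here is hypothetical: a toy attenuation model stands in for the Monte Carlo library, and a minimal one-hidden-layer network stands in for the thesis's neural networks.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for a Monte Carlo calibration library: each "spectrum" is a
    # 32-channel count histogram whose shape hardens with burial depth, mimicking
    # attenuation of low-energy photons by overlying soil. Purely illustrative.
    def simulate_spectrum(depth_m, n_counts=2000):
        energies = np.linspace(0.05, 2.0, 32)                   # MeV channel centres
        shape = np.exp(-energies) * np.exp(-depth_m / energies) # toy attenuation
        counts = rng.multinomial(n_counts, shape / shape.sum()) # counting noise
        return counts / n_counts                                # shape only

    depths = rng.uniform(0.0, 0.5, 500)                         # burial depth, m
    X = np.array([simulate_spectrum(d) for d in depths])
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)           # standardise channels
    y = depths

    # Minimal one-hidden-layer network trained by full-batch gradient descent.
    W1 = rng.normal(0, 0.1, (32, 16)); b1 = np.zeros(16)
    W2 = rng.normal(0, 0.1, (16, 1));  b2 = np.zeros(1)
    lr, n = 0.1, len(y)
    for _ in range(3000):
        H = np.tanh(X @ W1 + b1)
        err = (H @ W2 + b2).ravel() - y                         # prediction error
        dH = (err[:, None] @ W2.T) * (1 - H**2)                 # backprop to hidden
        W2 -= lr * (H.T @ err[:, None]) / n; b2 -= lr * err.mean(keepdims=True)
        W1 -= lr * (X.T @ dH) / n;           b1 -= lr * dH.mean(axis=0)

    rmse = np.sqrt(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2).ravel() - y) ** 2))
    print(f"training RMSE on burial depth: {rmse:.3f} m")
    ```

    The point of the sketch is only the pipeline shape — simulated spectra in, a continuous source parameter out — not the detector physics, which the toy model grossly simplifies.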

    Accurate prediction of X-ray pulse properties from a free-electron laser using machine learning

    Free-electron lasers providing ultra-short, high-brightness pulses of X-ray radiation have great potential for a wide impact on science, and are a critical element for unravelling the structural dynamics of matter. To fully harness this potential, we must accurately know the X-ray properties: intensity, spectrum and temporal profile. Owing to the inherent fluctuations in free-electron lasers, this mandates a full characterization of these properties for each and every pulse. While diagnostics of these properties exist, they are often invasive and many cannot operate at a high repetition rate. Here, we present a technique for circumventing this limitation. Employing a machine learning strategy, we can accurately predict X-ray properties for every shot using only parameters that are easily recorded at high repetition rate, by training a model on a small set of fully diagnosed pulses. This opens the door to fully realizing the promise of next-generation high-repetition-rate X-ray lasers.
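    The training strategy described — fit a model on a small, fully diagnosed subset of shots, then predict every shot from cheaply logged parameters — can be sketched with synthetic data. The machine parameters, the true mapping and the plain ridge-regression model below are all invented for illustration; the study's actual features and models differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical setup: each shot has a few easily logged machine parameters
    # (e.g. bunch charge, energy, compression settings) and one pulse property
    # to predict (here a scalar "intensity"). Mapping and noise are invented.
    n_shots, n_params = 5000, 6
    Xall = rng.normal(size=(n_shots, n_params))              # machine parameters
    w_true = rng.normal(size=n_params)
    y_all = Xall @ w_true + 0.3 * np.sin(Xall[:, 0]) + 0.05 * rng.normal(size=n_shots)

    # Only a small subset of shots is fully diagnosed (invasive measurement)...
    diag = rng.choice(n_shots, size=200, replace=False)
    Xd, yd = Xall[diag], y_all[diag]

    # ...and a ridge regression trained on that subset then predicts the pulse
    # property for every shot from the cheap parameters alone.
    lam = 1e-2
    Xb = np.c_[Xd, np.ones(len(Xd))]                         # add bias column
    w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(n_params + 1), Xb.T @ yd)

    pred = np.c_[Xall, np.ones(n_shots)] @ w
    r2 = 1 - np.mean((pred - y_all) ** 2) / np.var(y_all)
    print(f"R^2 over all shots: {r2:.3f}")
    ```

    In the real setting the "ground truth" for the diagnosed subset would come from the invasive diagnostics, and the model choice would depend on how nonlinear the parameter-to-property mapping is.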

    The model of an anomaly detector for HiLumi LHC magnets based on Recurrent Neural Networks and adaptive quantization

    This paper examines the applicability of Recurrent Neural Network models for detecting anomalous behavior of the CERN superconducting magnets. To conduct the experiments, the authors designed and implemented an adaptive signal quantization algorithm and a custom GRU-based detector, and developed a method for selecting the detector parameters. Three different datasets were used for testing the detector. Two artificially generated datasets were used to assess the raw performance of the system, whereas a 231 MB dataset composed of signals acquired from HiLumi magnets was intended for real-life experiments and model training. Several different setups of the developed anomaly detection system were evaluated and compared with a state-of-the-art OC-SVM reference model operating on the same data. The OC-SVM model was equipped with a rich set of feature extractors accounting for a range of input signal properties. The experiments determined that the detector, along with its supporting design methodology, reaches an F1 score equal or very close to 1 for almost all test sets. Due to the profile of the data, the best_length setup of the detector turned out to perform best among all five tested configuration schemes of the detection system. The quantization parameters have the biggest impact on the overall performance of the detector, with the best input/output grid sizes equal to 16 and 8, respectively. The proposed detection solution significantly outperformed the OC-SVM-based detector in most cases, with much more stable performance across all the datasets. (Comment: related to arXiv:1702.0083)
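    One plausible reading of the adaptive quantization step — placing bin edges at empirical quantiles of a reference signal so that each symbol of the input grid is used roughly equally often — can be sketched as follows. The paper's exact algorithm may differ, and the signal here is synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical adaptive quantizer: edges at empirical quantiles of a
    # training signal, so the discrete symbols are used roughly uniformly.
    def fit_quantizer(signal, n_levels):
        qs = np.linspace(0, 1, n_levels + 1)[1:-1]   # interior quantiles
        return np.quantile(signal, qs)               # n_levels - 1 bin edges

    def quantize(signal, edges):
        return np.digitize(signal, edges)            # symbols 0 .. n_levels-1

    # Synthetic voltage-like reference signal and a 16-level input grid
    # (the best input grid size reported in the abstract).
    train = np.sin(np.linspace(0, 20, 5000)) + 0.1 * rng.normal(size=5000)
    edges = fit_quantizer(train, 16)
    symbols = quantize(train, edges)

    # Roughly uniform symbol usage is what makes the grid "adaptive".
    counts = np.bincount(symbols, minlength=16)
    print("symbol usage spread:", counts.min(), "-", counts.max())
    ```

    A GRU would then be trained to predict the next symbol in the resulting sequence, with an anomaly flagged whenever the observed symbol receives a low predicted probability; that half of the pipeline is omitted here.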

    2022 Review of Data-Driven Plasma Science

    Data-driven science and technology offer transformative tools and methods to science. This review article highlights the latest developments and progress in the interdisciplinary field of data-driven plasma science (DDPS), i.e., plasma science whose progress is driven strongly by data and data analyses. Plasma is considered to be the most ubiquitous form of observable matter in the universe. Data associated with plasmas can, therefore, cover extremely large spatial and temporal scales, and often provide essential information for other scientific disciplines. Thanks to the latest technological developments, plasma experiments, observations, and computation now produce a large amount of data that can no longer be analyzed or interpreted manually. This trend necessitates a highly sophisticated use of high-performance computers for data analyses, making artificial intelligence and machine learning vital components of DDPS. This article contains seven primary sections, in addition to the introduction and summary. Following an overview of fundamental data-driven science, five further sections cover widely studied topics of plasma science and technology: basic plasma physics and laboratory experiments, magnetic confinement fusion, inertial confinement fusion and high-energy-density physics, space and astronomical plasmas, and plasma technologies for industrial and other applications. The final section before the summary discusses plasma-related databases that could significantly contribute to DDPS. Each primary section starts with a brief introduction to the topic, discusses the state-of-the-art developments in the use of data and/or data-scientific approaches, and presents a summary and outlook. Despite recent impressive progress, DDPS is still in its infancy. This article attempts to offer a broad perspective on the development of this field and identify where further innovations are required.

    Towards a cyber physical system for personalised and automatic OSA treatment

    Obstructive sleep apnea (OSA) is a breathing disorder that occurs during sleep and is produced by a complete or partial obstruction of the upper airway, manifesting itself as frequent stops and starts in breathing. Evaluating in real time whether or not a patient is undergoing an OSA episode is a very important task in many medical scenarios, for example when making the instantaneous pressure adjustments required by Automatic Positive Airway Pressure (APAP) devices used in the treatment of OSA. In this paper the design of a possible Cyber Physical System (CPS) suited to real-time monitoring of OSA is described, and its software architecture and possible hardware sensing components are detailed. It should be emphasized that this paper does not deal with a full CPS, but rather with a software part of it under a set of assumptions on the environment. The paper also reports some preliminary experiments on the cognitive and learning capabilities of the designed CPS, involving its use on a publicly available sleep apnea database.
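    As a toy illustration of the kind of real-time rule such a monitoring system might start from (not the paper's actual design), the sketch below flags an apnea episode when the moving amplitude of a synthetic airflow signal stays below 30% of a baseline for at least 10 seconds. All signal parameters and thresholds are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    fs = 10                                            # samples per second
    t = np.arange(0, 120, 1 / fs)                      # two minutes of signal
    airflow = np.sin(2 * np.pi * 0.25 * t)             # ~15 breaths/min
    airflow[600:800] *= 0.05                           # simulated apnea, t = 60-80 s
    airflow += 0.02 * rng.normal(size=t.size)          # sensor noise

    win = 5 * fs                                       # 5 s amplitude window
    amp = np.array([np.ptp(airflow[max(0, i - win):i + 1]) for i in range(t.size)])
    baseline = np.median(amp)                          # global baseline (simplified)
    low = amp < 0.3 * baseline                         # candidate apnea samples

    # Require the amplitude drop to persist for >= 10 s before declaring an episode.
    min_len = 10 * fs
    episodes, start = [], None
    for i, flag in enumerate(np.append(low, False)):   # trailing False closes runs
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                episodes.append((start / fs, i / fs))
            start = None
    print("detected episodes (s):", episodes)
    ```

    A real CPS would replace the global median with a rolling baseline, fuse several sensors, and learn patient-specific thresholds, but the detect-then-confirm-persistence structure is the common starting point for real-time episode detection.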