61 research outputs found

    Resolution capacity of geophysical monitoring regarding permafrost degradation induced by hydrological processes

    Geophysical methods are often used to characterize and monitor the subsurface composition of permafrost. The resolution capacity of standard methods, i.e. electrical resistivity tomography and refraction seismic tomography, depends not only on static parameters such as measurement geometry, but also on the temporal variability in the contrast of the geophysical target variables (electrical resistivity and P-wave velocity). Our study analyses the resolution capacity of electrical resistivity tomography and refraction seismic tomography for typical processes in the context of permafrost degradation, using synthetic and field data sets of mountain permafrost terrain. In addition, we tested the resolution capacity of a petrophysically based quantitative combination of both methods, the so-called 4-phase model, and used it to analyse the expected changes in water and ice content upon permafrost thaw. The results from the synthetic data experiments suggest a higher sensitivity to an increase in water content than to a decrease in ice content. A potentially larger uncertainty originates from the individual geophysical methods than from the combined evaluation with the 4-phase model. In the latter, a loss of ground ice can be detected quite reliably, whereas artefacts occur in the case of increased horizontal or vertical water flow. Analysis of field data from a well-investigated rock glacier in the Swiss Alps successfully visualized the seasonal ice loss in summer, as well as the complex, spatially variable ice, water and air content changes in an interannual comparison.
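    The 4-phase model mentioned above combines Archie's law (for electrical resistivity) with a time-averaged slowness equation (for P-wave velocity) to partition a subsurface cell into rock, water, ice and air fractions. The sketch below illustrates that inversion for a single cell under an assumed rock fraction; all parameter values (Archie exponents, pore-water resistivity, phase velocities) are illustrative assumptions, not values from the study.

```python
# Single-cell sketch of a 4-phase-style inversion: given measured resistivity
# rho [ohm m] and P-wave velocity v [m/s], plus an assumed rock fraction,
# recover the water, ice and air fractions. All parameters are illustrative.

def four_phase_model(rho, v, f_rock=0.5,
                     rho_w=100.0,          # pore-water resistivity [ohm m] (assumed)
                     a=1.0, m=2.0, n=2.0,  # Archie parameters (assumed)
                     v_w=1500.0, v_i=3500.0, v_r=6000.0, v_a=330.0):
    phi = 1.0 - f_rock                      # porosity
    # Archie's law: rho = a * rho_w * phi**-m * S_w**-n  ->  water saturation
    s_w = (a * rho_w * phi**-m / rho) ** (1.0 / n)
    f_w = phi * s_w
    # Time-average equation: 1/v = f_w/v_w + f_rock/v_r + f_i/v_i + f_a/v_a,
    # plus the closure f_i + f_a = 1 - f_rock - f_w, gives a linear system.
    s = 1.0 - f_rock - f_w                  # f_i + f_a
    t = 1.0 / v - f_w / v_w - f_rock / v_r  # f_i/v_i + f_a/v_a
    f_i = (t - s / v_a) / (1.0 / v_i - 1.0 / v_a)
    f_a = s - f_i
    return f_w, f_i, f_a

f_w, f_i, f_a = four_phase_model(rho=20000.0, v=2500.0)
```

    With these toy inputs the cell comes out ice-rich, which matches the intuition that high resistivity together with moderate velocity indicates frozen ground.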

    Dissipative Dynamics of an Open Bose Einstein Condensate

    As an atomic Bose–Einstein condensate (BEC) is coupled to a source of uncondensed atoms at the same temperature and to a sink (extraction towards an atom laser), the idealized description in terms of a Gross–Pitaevskii equation (GP) no longer holds. Under suitable physical assumptions we show that the dissipative BEC obeys a complex Ginzburg–Landau equation (CGL) and for some parameter range it undergoes space-time patterning. As a consequence, the density of BEC atoms within the trap displays non-trivial space-time correlations, which can be detected by monitoring the density profile of the outgoing atom laser. The patterning condition requires a negative scattering length, as e.g. in ⁷Li. In such a case we expect a many-domain collapsed regime, rather than a single one as reported for a closed BEC. Comment: 13 pages, 5 figures, submitted to Optics Communications, 18 Aug. 1999 (special issue, Scully Festschrift).
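    For context, the complex Ginzburg–Landau equation referred to in the abstract has the standard dimensionless cubic form (the coefficients b and c below are generic real placeholders, not the values derived in the paper):

```latex
% Standard dimensionless cubic CGL equation for the order parameter psi;
% b and c are generic real coefficients (placeholders, not the paper's values).
\partial_t \psi = \psi + (1 + i b)\,\nabla^2 \psi - (1 + i c)\,|\psi|^2\,\psi
```

    For the standard cubic CGL, spatio-temporal patterning of the kind described sets in when the uniform oscillating solution is modulationally (Benjamin–Feir) unstable, i.e. when 1 + bc < 0.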

    Software for the frontiers of quantum chemistry: An overview of developments in the Q-Chem 5 package

    This article summarizes technical advances contained in the fifth major release of the Q-Chem quantum chemistry program package, covering developments since 2015. A comprehensive library of exchange–correlation functionals, along with a suite of correlated many-body methods, continues to be a hallmark of the Q-Chem software. The many-body methods include novel variants of both coupled-cluster and configuration-interaction approaches along with methods based on the algebraic diagrammatic construction and variational reduced density-matrix methods. Methods highlighted in Q-Chem 5 include a suite of tools for modeling core-level spectroscopy, methods for describing metastable resonances, methods for computing vibronic spectra, the nuclear–electronic orbital method, and several different energy decomposition analysis techniques. High-performance capabilities including multithreaded parallelism and support for calculations on graphics processing units are described. Q-Chem boasts a community of well over 100 active academic developers, and the continuing evolution of the software is supported by an "open teamware" model and an increasingly modular design.

    Identification and reconstruction of low-energy electrons in the ProtoDUNE-SP detector

    Measurements of electrons from ν_e interactions are crucial for the Deep Underground Neutrino Experiment (DUNE) neutrino oscillation program, as well as for searches for physics beyond the standard model, supernova neutrino detection, and solar neutrino measurements. This article describes the selection and reconstruction of low-energy (Michel) electrons in the ProtoDUNE-SP detector. ProtoDUNE-SP is one of the prototypes for the DUNE far detector, built and operated at CERN as a charged-particle test-beam experiment. A sample of low-energy electrons produced by the decay of cosmic muons is selected with a purity of 95%. This sample is used to calibrate the low-energy electron energy scale with two techniques. An electron energy calibration based on a cosmic-ray muon sample uses calibration constants derived from measured and simulated cosmic-ray muon events. Another calibration technique makes use of the theoretically well-understood Michel electron energy spectrum to convert reconstructed charge to electron energy. In addition, the effects of the detector response on the low-energy electron energy scale and its resolution, including readout-electronics threshold effects, are quantified. Finally, the relation between the theoretical and reconstructed low-energy electron energy spectra is derived and the energy resolution is characterized. The low-energy electron selection presented here accounts for about 75% of the total electron deposited energy. After the addition of lost energy using a Monte Carlo simulation, the energy resolution improves from about 40% to 25% at 50 MeV. These results are used to validate the expected capabilities of the DUNE far detector to reconstruct low-energy electrons. Comment: 19 pages, 10 figures.
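    The charge-to-energy conversion described above can be sketched in a few lines: the collected ionization charge is corrected for recombination and multiplied by the argon ionization work function. The W-value of about 23.6 eV per ion pair is a standard liquid-argon number; the recombination factor and readout gain below are illustrative assumptions, not the calibration constants measured in the paper.

```python
# Sketch of converting reconstructed charge to electron energy in a LArTPC.
# W_ion is the standard liquid-argon work function; the recombination factor
# and ADC-to-electrons gain are illustrative assumptions, not measured values.

W_ION_MEV = 23.6e-6          # MeV per ionization electron in liquid argon
RECOMBINATION = 0.70         # assumed average electron survival fraction
ADC_TO_ELECTRONS = 200.0     # hypothetical readout calibration [electrons/ADC]

def charge_to_energy_mev(adc_sum):
    """Convert a summed ADC charge to deposited energy in MeV."""
    n_collected = adc_sum * ADC_TO_ELECTRONS   # collected ionization electrons
    n_produced = n_collected / RECOMBINATION   # correct for recombination losses
    return n_produced * W_ION_MEV

e = charge_to_energy_mev(1.0e6)
```

    In practice the calibration constants are fitted so that the reconstructed Michel spectrum matches the theoretical one, which is the second technique the abstract describes.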

    Impact of cross-section uncertainties on supernova neutrino spectral parameter fitting in the Deep Underground Neutrino Experiment

    A primary goal of the upcoming Deep Underground Neutrino Experiment (DUNE) is to measure the O(10) MeV neutrinos produced by a Galactic core-collapse supernova, should one occur during the lifetime of the experiment. The liquid-argon-based detectors planned for DUNE are expected to be uniquely sensitive to the ν_e component of the supernova flux, enabling a wide variety of physics and astrophysics measurements. A key requirement for a correct interpretation of these measurements is a good understanding of the energy-dependent total cross section σ(E_ν) for charged-current ν_e absorption on argon. In the context of a simulated extraction of supernova ν_e spectral parameters from a toy analysis, we investigate the impact of σ(E_ν) modeling uncertainties on DUNE's supernova neutrino physics sensitivity for the first time. We find that the currently large theoretical uncertainties on σ(E_ν) must be substantially reduced before the ν_e flux parameters can be extracted reliably: in the absence of external constraints, a measurement of the integrated neutrino luminosity with less than 10% bias with DUNE requires σ(E_ν) to be known to about 5%. The neutrino spectral shape parameters can be known to better than 10% for a 20% uncertainty on the cross-section scale, although they will be sensitive to uncertainties on the shape of σ(E_ν). A direct measurement of low-energy ν_e–argon scattering would be invaluable for improving the theoretical precision to the needed level. Comment: 25 pages, 21 figures.
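    The spectral parameters in such analyses are conventionally those of the pinched thermal ("alpha") parameterization of the supernova neutrino flux, φ(E) ∝ (E/⟨E⟩)^α · exp(−(α+1)E/⟨E⟩). The sketch below shows that spectral form and how an overall cross-section scale error propagates directly into the inferred event rate; the mean energy, pinching value and the E² placeholder cross section are illustrative assumptions, not DUNE inputs.

```python
import math

# Pinched thermal ("alpha") parameterization commonly used for supernova
# neutrino spectra. Parameter values are illustrative, not DUNE results.

def pinched_spectrum(e, e_mean=10.0, alpha=2.5):
    """Unnormalized nu_e spectrum phi(E) for mean energy <E> and pinching alpha."""
    x = e / e_mean
    return x**alpha * math.exp(-(alpha + 1.0) * x)

def toy_event_rate(sigma_scale=1.0, e_mean=10.0, alpha=2.5, de=0.1):
    """Toy expected event count: sum over energy of flux * cross section.
    The cross section is a hypothetical E**2 placeholder times sigma_scale."""
    total = 0.0
    for i in range(1, 1000):            # integrate from ~0 to 100 MeV
        e = i * de
        total += pinched_spectrum(e, e_mean, alpha) * sigma_scale * e**2 * de
    return total

# An overall 20% cross-section scale error biases the inferred rate by 20%,
# illustrating why the luminosity measurement is so sensitive to sigma(E_nu).
ratio = toy_event_rate(sigma_scale=1.2) / toy_event_rate(sigma_scale=1.0)
```

    A shape distortion of σ(E_ν), unlike a pure scale error, would reweight different parts of the spectrum and therefore bias ⟨E⟩ and α as well, which is the shape sensitivity the abstract notes.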

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
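    The structure that makes this simulation GPU-friendly can be seen in a pure-Python sketch: each pixel's induced signal depends only on the shared list of drifting charges and never on other pixels, so the outer loop maps onto one GPU thread per pixel (in the actual software such loops become Numba CUDA kernels). The Gaussian charge-spreading response below is a hypothetical stand-in for the detector's real induction model.

```python
import math

# Sketch of the embarrassingly parallel structure of a pixel-readout
# simulation: every pixel reads the same charge list and writes only its own
# output, so each iteration of simulate() could be one GPU thread. The
# Gaussian response is an illustrative placeholder, not the real model.

def induced_charge(pixel_xy, charges, sigma=0.5):
    """Signal on one pixel from all drifting charges (toy Gaussian response)."""
    px, py = pixel_xy
    total = 0.0
    for cx, cy, q in charges:
        r2 = (px - cx) ** 2 + (py - cy) ** 2
        total += q * math.exp(-r2 / (2.0 * sigma**2))
    return total

def simulate(pixels, charges):
    # No iteration writes to shared state, so there are no data races to
    # manage when this loop is flattened onto GPU threads.
    return [induced_charge(p, charges) for p in pixels]

pixels = [(x, y) for x in range(4) for y in range(4)]   # a toy 4x4 pixel plane
charges = [(1.2, 1.1, 1000.0), (2.8, 2.9, 500.0)]       # (x, y, charge) clusters
signals = simulate(pixels, charges)
```

    The largest signal lands on the pixel nearest the biggest charge cluster, as expected; on a GPU the same computation is simply dispatched with one thread per pixel.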

    Application of machine learning enhanced agent-based techniques in hydrology and water resource management

    Hydrological and water resource management problems require new analysis techniques to answer increasingly complex research questions. In this thesis, machine learning and agent-based modelling approaches are combined to describe the dynamic interactions in environmental systems. Four different applications demonstrate the merits of these approaches and show how they outperform traditional ones. The applications comprise the description of soil-water movement in the unsaturated zone, the identification of irrigated agriculture from remote-sensing images, the separation of flood events from continuous time series of runoff and, finally, the modelling of water distribution in medieval Bali. The outcome of this thesis is the starting point for a wide field of further applications of machine learning, artificial intelligence and agent-based modelling in hydrology and water resource management.

    On the Automation of Flood Event Separation From Continuous Time Series

    Funded by the publication fund of the Universität Kassel.

    Zum Einsatz automatisierter Pipelines zur Datenqualifizierung im Talsperrenmanagement

    This article examines the automated data qualification of structural data in the dam industry, in particular the connection between the SICA algorithm from Okeanos and the Aquarius measurement-data management system from Aquatic Information. The reliable collection and evaluation of these data are crucial for the safety and efficiency of dams. The focus is on the latest developments in automated data qualification, including artificial intelligence and machine learning for error detection and correction. The SICA algorithm is characterized by its use of unsupervised learning, which enables a flexible and adaptive solution for structural analysis. A practical example illustrates the application of the SICA algorithm in conjunction with Aquarius and provides insights into the further development of automated data qualification in dam management.
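    The SICA algorithm itself is not public, but the kind of unsupervised plausibility check described above can be illustrated with a generic rolling-median/MAD outlier flagger for a sensor time series. This sketch is purely an illustration of the approach, not the SICA algorithm, and the threshold and window values are arbitrary assumptions.

```python
import statistics

# Generic unsupervised plausibility check for a monitoring time series, in the
# spirit of automated data qualification. NOT the (non-public) SICA algorithm:
# a value is flagged when it deviates from the local median by more than k
# times the local median absolute deviation (MAD).

def flag_outliers(series, window=5, k=5.0):
    """Return one boolean per value: True where the value is implausible."""
    flags = []
    for i, value in enumerate(series):
        lo = max(0, i - window)
        hi = min(len(series), i + window + 1)
        neighbourhood = series[lo:i] + series[i + 1:hi]  # exclude the value itself
        med = statistics.median(neighbourhood)
        mad = statistics.median(abs(x - med) for x in neighbourhood)
        flags.append(abs(value - med) > k * max(mad, 1e-9))
    return flags

# Toy reservoir-level series with one implausible spike at index 4.
levels = [10.0, 10.1, 10.0, 9.9, 55.0, 10.1, 10.0, 10.2, 10.1, 10.0]
flags = flag_outliers(levels)
```

    Because the reference statistics come from the data itself, the check needs no labelled training set, which is the practical appeal of unsupervised methods for structural monitoring data.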

    IPA (v1)

    In the last decade, agent-based modelling (ABM) has become a popular modelling technique in the social sciences, medicine, biology, and ecology. ABM was designed to simulate systems that are highly dynamic and sensitive to small variations in their composition and state. As hydrological systems, and natural systems in general, often show dynamic and non-linear behaviour, ABM can be an appropriate way to model them. Nevertheless, only a few studies have utilized the ABM method for process-based modelling in hydrology. The percolation of water through unsaturated soil is highly responsive to the current state of the soil system; small variations in composition lead to major changes in the transport system. Hence, we present a new approach for modelling the movement of water through a soil column: autonomous water agents that transport water through the soil while interacting with their environment, as well as with other agents, under physical laws.
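    A minimal sketch of the agent idea described above: autonomous water parcels percolate through a layered soil column, and an agent moves down only when the layer below still has free storage capacity, so agents interact indirectly through the shared state of the column. The layer capacities, agent count and update rule are illustrative assumptions, not the authors' model.

```python
import random

# Minimal sketch of water agents percolating through a 1-D soil column.
# Each agent moves down one layer per step if the layer below has spare
# capacity; the bottom layer acts as a sink. All numbers are illustrative.

random.seed(42)

N_LAYERS = 5
CAPACITY = [4, 4, 3, 3, 10]   # max agents per layer; large bottom value = sink

class WaterAgent:
    def __init__(self):
        self.layer = 0  # every agent starts at the surface

    def step(self, occupancy):
        below = self.layer + 1
        if below < N_LAYERS and occupancy[below] < CAPACITY[below]:
            occupancy[self.layer] -= 1   # leave the current layer
            occupancy[below] += 1        # occupy the layer below
            self.layer = below

agents = [WaterAgent() for _ in range(6)]
occupancy = [0] * N_LAYERS
occupancy[0] = len(agents)

for _ in range(20):                                   # simulate 20 time steps
    for agent in random.sample(agents, len(agents)):  # random update order
        agent.step(occupancy)
```

    After enough steps all agents reach the bottom sink layer; intermediate capacity limits temporarily hold water back, which is the kind of state-dependent transport behaviour the abstract describes.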
