156 research outputs found

    Helmholtz Portfolio Theme Large-Scale Data Management and Analysis (LSDMA)

    The Helmholtz Association funded the "Large-Scale Data Management and Analysis" portfolio theme from 2012 to 2016. Four Helmholtz centres, six universities, and another research institution in Germany joined forces to enable data-intensive science by optimising data life cycles in selected scientific communities. In our Data Life Cycle Labs, data experts performed joint R&D together with scientific communities. The Data Services Integration Team focused on generic solutions applied by several communities.

    2022 Review of Data-Driven Plasma Science

    Data-driven science and technology offer transformative tools and methods to science. This review article highlights the latest developments and progress in the interdisciplinary field of data-driven plasma science (DDPS), i.e., plasma science whose progress is driven strongly by data and data analyses. Plasma is considered to be the most ubiquitous form of observable matter in the universe. Data associated with plasmas can, therefore, cover extremely large spatial and temporal scales, and often provide essential information for other scientific disciplines. Thanks to the latest technological developments, plasma experiments, observations, and computation now produce a large amount of data that can no longer be analyzed or interpreted manually. This trend necessitates a highly sophisticated use of high-performance computers for data analyses, making artificial intelligence and machine learning vital components of DDPS. This article contains seven primary sections, in addition to the introduction and summary. Following an overview of fundamental data-driven science, five other sections cover widely studied topics of plasma science and technologies, i.e., basic plasma physics and laboratory experiments, magnetic confinement fusion, inertial confinement fusion and high-energy-density physics, space and astronomical plasmas, and plasma technologies for industrial and other applications. The final section before the summary discusses plasma-related databases that could significantly contribute to DDPS. Each primary section starts with a brief introduction to the topic, discusses the state-of-the-art developments in the use of data and/or data-scientific approaches, and presents a summary and outlook. Despite the recent impressive progress, DDPS is still in its infancy. This article attempts to offer a broad perspective on the development of this field and identify where further innovations are required.

    Accessible software frameworks for reproducible image analysis of host-pathogen interactions

    To understand the mechanisms behind life-threatening diseases, the underlying interactions between host cells and pathogenic microorganisms must be known. Continuous improvements in imaging and computing technologies enable the application of methods from image-based systems biology, which uses modern computer algorithms to precisely measure the behaviour of cells, tissues, or whole organs. To meet the standards of digital research data management, algorithms must comply with the FAIR principles (Findability, Accessibility, Interoperability, and Reusability) and contribute to their dissemination in the scientific community. This is particularly important for interdisciplinary teams of experimentalists and computer scientists, in which computer programs can improve communication and accelerate the adoption of new technologies. In this work, software frameworks were therefore developed that help spread the FAIR principles by providing standardised, reproducible, high-performance, and easily accessible software packages for quantifying interactions in biological systems. In summary, this work shows how software frameworks can contribute to the characterisation of interactions between host cells and pathogens by simplifying the design and application of quantitative, FAIR-compliant image-analysis programs. These improvements will facilitate future collaborations with life scientists and clinicians, which, following the principle of image-based systems biology, will lead to the development of new experiments, imaging procedures, algorithms, and computational models.

    Tracing back the source of contamination

    From the time a contaminant is detected in an observation well, the question of where and when the contaminant was introduced into the aquifer needs an answer. Many techniques have been proposed to answer this question, but virtually all of them assume that the aquifer and its dynamics are perfectly known. This work discusses a new approach for the simultaneous identification of the contaminant source location and the spatial variability of hydraulic conductivity in an aquifer, which has been validated in synthetic and laboratory experiments and which is in the process of being validated on a real aquifer.
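    The inverse problem described above can be illustrated with a deliberately simplified sketch (not the approach of the work itself, which also identifies hydraulic conductivity): assuming a known 1-D advection-dispersion model with fixed velocity and dispersion coefficient, a grid search recovers the release location and time that best reproduce concentrations observed in monitoring wells. All names and parameter values here are hypothetical.

    ```python
    import numpy as np

    def concentration(x, t, x0, M=1.0, v=0.5, D=0.1):
        """1-D advection-dispersion solution for an instantaneous point
        release of mass M at location x0, velocity v, dispersion D."""
        return M / np.sqrt(4 * np.pi * D * t) * np.exp(-(x - x0 - v * t) ** 2 / (4 * D * t))

    def locate_source(x_obs, c_obs, T, x0_grid, t0_grid):
        """Grid search for the release location x0 and release time t0 that
        minimise the squared misfit to concentrations c_obs measured at
        well positions x_obs at observation time T."""
        best = (None, None, np.inf)
        for x0 in x0_grid:
            for t0 in t0_grid:
                c_model = concentration(x_obs, T - t0, x0)
                misfit = np.sum((c_model - c_obs) ** 2)
                if misfit < best[2]:
                    best = (x0, t0, misfit)
        return best

    # Synthetic experiment: true source at x0 = 2.0 released at t0 = 1.0,
    # concentrations observed in four wells at time T = 10.
    x_obs = np.array([5.0, 6.0, 7.0, 8.0])
    c_obs = concentration(x_obs, 10.0 - 1.0, 2.0)
    x0_hat, t0_hat, _ = locate_source(x_obs, c_obs, 10.0,
                                      np.linspace(0.0, 4.0, 81),
                                      np.linspace(0.0, 5.0, 101))
    ```

    With noiseless synthetic data the search recovers the true source; real applications replace the exhaustive search with stochastic or ensemble-based inversion and treat the conductivity field as an additional unknown.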

    Atomistic Simulation and Virtual Diffraction Characterization of Alumina Interfaces: Evaluating Structure and Stability for Predictive Physical Vapor Deposition Models

    The objectives of this work are to investigate the structure and energetic stability of different alumina (Al2O3) phases using atomistic simulation and virtual diffraction characterization. To meet these objectives, this research performs molecular statics and molecular dynamics simulations employing the reactive force-field (ReaxFF) potential to model bulk, interface, and surface structures in the θ-, γ-, κ-, and α-Al2O3 systems. Simulations throughout this study are characterized using a new virtual diffraction algorithm, developed and implemented for this work, that creates both selected area electron diffraction (SAED) patterns and X-ray diffraction (XRD) line profiles without assuming prior knowledge of the crystal system. First, the transferability of the ReaxFF potential is evaluated by modelling different alumina bulk systems. ReaxFF is shown to correctly predict the energetic stability of α-Al2O3 among the crystalline alumina phases, but incorrectly predicts an even lower-energy amorphous phase. Virtual XRD patterns uniquely identify each phase and validate the minimum-energy bulk structures through experimental comparison. Second, stable and metastable alumina surfaces are studied at 0, 300, 500, and 700 K. ReaxFF predicts minimum-energy surface structures and energies in good agreement with prior studies at 0 K; however, select surface models at 500 and 700 K undergo significant reconstructions caused by the unnatural bias toward a lower-energy amorphous phase. Virtual SAED analysis performed on alumina surfaces allows advanced characterization and direct experimental validation of select models. Third, ReaxFF is used to model homophase and heterophase alumina interfaces at 0 K. Predicted minimum-energy structures of α-Al2O3 interfaces show good agreement with prior work, which provides the foundation for the first atomistic study of metastable alumina grain boundaries and heterophase alumina interfaces. Virtual SAED patterns characterize select alumina interfaces and help guide the construction of low-energy heterophase alumina interfaces by providing insight into crystallographic compatibilities. Combined, the energetic data extracted from bulk, surface, and interface simulations, as well as insights gained through virtual diffraction, will aid the development of mesoscale predictive models of polycrystalline alumina formation during physical vapor deposition.
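    A standard way to compute a powder diffraction line profile directly from atomic coordinates, with no assumed crystal system, is the Debye scattering equation; the sketch below (not the dissertation's algorithm, and with atomic scattering factors simplified to 1) illustrates the idea on a small cluster.

    ```python
    import numpy as np

    def virtual_xrd(positions, q_values):
        """Debye scattering equation: orientation-averaged intensity
        I(Q) = sum_ij sin(Q * r_ij) / (Q * r_ij), computed directly from
        atomic coordinates. Atomic scattering factors are set to 1."""
        # Pairwise interatomic distances r_ij via broadcasting
        diff = positions[:, None, :] - positions[None, :, :]
        r = np.linalg.norm(diff, axis=-1)
        intensity = np.empty_like(q_values)
        for k, q in enumerate(q_values):
            qr = q * r
            # sin(qr)/(qr), with the self terms (r = 0) contributing 1 each
            terms = np.where(qr > 0, np.sin(qr) / np.where(qr > 0, qr, 1.0), 1.0)
            intensity[k] = terms.sum()
        return intensity

    # Tiny example: a 3x3x3 simple-cubic cluster with lattice constant a = 1,
    # giving broad peaks near Q = 2*pi*sqrt(h^2 + k^2 + l^2) / a.
    a = 1.0
    grid = np.arange(3) * a
    positions = np.array([[x, y, z] for x in grid for y in grid for z in grid])
    q = np.linspace(0.5, 10.0, 500)
    profile = virtual_xrd(positions, q)
    ```

    Because the sum runs over interatomic distances rather than reciprocal-lattice vectors, the same routine handles crystalline, defective, and amorphous configurations alike, which is the property the abstract's "no prior knowledge of the crystal system" claim refers to; production codes add per-element scattering factors and distance-histogram optimisations.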

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to afford a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Applications Development for the Computational Grid
