
    Insights From New Age Constraints and Sediment Volumes From the Austrian Northern Alpine Foreland Basin

    Detailed characterization of variations in sediment architecture, flux, and transport processes in peri-orogenic basins offers insights into external climatic or tectonic forcings. We tested how four well-known tectonic/erosional events in the Oligocene/Miocene Alpine source area are recorded in the sediment-accumulation rates (SARs) of the deep-marine sink in the Northern Alpine Foreland Basin (NAFB): exhumation of the Lepontine Dome (starting at 30 Ma) and the Tauern Window (23–21 Ma), erosion of the Augenstein Formation (~21 Ma), and the visco-elastic relaxation of the European Plate. The Upper Austrian NAFB offers a unique opportunity to investigate external forcings on sedimentary infill due to the large amount of data on the Alpine hinterland and foreland. Deep-marine sedimentation, forming the Puchkirchen Group and the basal Hall Formation, was controlled by a basin-axial submarine channel (3–5 km wide, >100 km long). Two basin-wide unconformities were recognized in seismic-reflection data: the Northern Slope Unconformity (NSU) and the Base Hall Unconformity (BHU). We combine biostratigraphic and chemostratigraphic analyses of 316 drill-cutting samples from three wells with a large 3D seismic-reflection data set (3300 km², >5 km depth) to determine the age and duration of the unconformities and to calculate spatially averaged SARs separately for the submarine channel and its overbanks. Deepening of the basin, recorded by the NSU, occurred between 28.1 and 26.9 Ma. The Puchkirchen Group (26.9–19.6 Ma) is characterized by constant SARs (within standard deviation) in the channel (432–623 t/m²/Ma) and on the overbanks (240–340 t/m²/Ma). The visco-elastic relaxation of the European Plate resulted in low SARs on the overbanks (186 t/m²/Ma), a decrease in sediment grain size in channel deposits, and a fall in sea level at the BHU (19.6–19.0 Ma).
In the upper Hall Formation (19.0–18.1 Ma), clinoforms prograding from the south filled the basin (1497 t/m²/Ma) within 1 Myr. We conclude that only two of the tectonic signals are recorded in this part of the deep-marine sink, erosion of the Augenstein Formation and visco-elastic relaxation of the European Plate; the exhumation of the Tauern Window and the Lepontine Dome remain unrecorded.
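As a rough illustration of the unit used above, a spatially averaged SAR in t/m²/Ma follows from dividing deposited sediment mass by depositional area and by duration. The sketch below uses entirely hypothetical volume, density, and area values, not figures from the study:

```python
# Illustrative back-of-the-envelope calculation of a spatially averaged
# sediment-accumulation rate (SAR) in t/m^2/Ma. All inputs are hypothetical.

def sar_t_per_m2_per_ma(volume_km3, bulk_density_t_per_m3, area_km2, duration_ma):
    """Mass deposited per unit area per million years."""
    volume_m3 = volume_km3 * 1e9   # km^3 -> m^3
    area_m2 = area_km2 * 1e6       # km^2 -> m^2
    mass_t = volume_m3 * bulk_density_t_per_m3
    return mass_t / area_m2 / duration_ma

# Hypothetical inputs: 1500 km^3 of sediment with a bulk density of
# 2.2 t/m^3, spread over a 700 km^2 channel belt during 7.3 Myr.
rate = sar_t_per_m2_per_ma(1500, 2.2, 700, 7.3)
```

With these made-up inputs the rate lands in the several-hundred t/m²/Ma range, the same order of magnitude as the channel and overbank values quoted above.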

    Testing Apps With Real-World Inputs

    To test mobile apps, one requires realistic and coherent test inputs. The Link approach for web testing has shown that knowledge bases such as DBpedia can be a reliable source of semantically coherent inputs. In this paper, we adapt and extend the Link approach toward test generation for mobile applications: (1) we identify and match descriptive labels with input fields, based on the Gestalt principles of human perception; (2) we then use natural-language-processing techniques to extract the concept associated with the label; (3) we use this concept to query a knowledge base for candidate input values; (4) we cluster the UI elements by functionality into inputs and actions, filling the input elements first and then interacting with the actions. Our evaluation shows that leveraging knowledge bases for testing mobile apps with realistic inputs is effective. On average, our approach covered 9% more statements than randomly generated text inputs.
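The four steps above can be sketched as follows. The in-memory dictionary stands in for a knowledge base such as DBpedia, and the label matching and concept extraction are naive stubs for the Gestalt-based and NLP techniques the paper actually uses; all names below are invented for illustration:

```python
# Minimal sketch of the described pipeline. KNOWLEDGE_BASE is a stand-in
# for a real knowledge base; extract_concept is a stub for NLP-based
# concept extraction; label-to-field matching is assumed already done.

KNOWLEDGE_BASE = {                     # hypothetical candidate values
    "city": ["Berlin", "Lyon", "Osaka"],
    "name": ["Ada Lovelace", "Alan Turing"],
}

def extract_concept(label):
    """Stub for concept extraction: take the last word of the label."""
    return label.strip().rstrip(":").split()[-1].lower()

def fill_inputs(ui_elements):
    """Fill input elements first, then return the actions to exercise."""
    inputs = [e for e in ui_elements if e["kind"] == "input"]
    actions = [e for e in ui_elements if e["kind"] == "action"]
    for element in inputs:
        concept = extract_concept(element["label"])
        candidates = KNOWLEDGE_BASE.get(concept, ["test"])  # fallback value
        element["value"] = candidates[0]
    return actions

ui = [
    {"kind": "input", "label": "Departure city:", "value": None},
    {"kind": "action", "label": "Search"},
]
actions = fill_inputs(ui)
```

Filling all inputs before firing any action mirrors the clustering step: semantically coherent values go in first, so the subsequent action exercises deeper app logic instead of failing input validation.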

    "FAIR-by-Design" Artifacts: Enriching Publications and Software with FAIR Scientific Information at the Time of Creation

    Presentation on the idea of "FAIR-by-Design" artifacts at the NFDI4Ing Conference 2023. Abstract: In several research disciplines, the use and development of software have become an integral part of the research process, with researchers reporting in publications the results obtained with software and the concepts implemented in software. Consequently, publications and software have become two core artifacts in academia, with increasing importance for measuring research impact and reputation. The research community has made great efforts to improve digital access to publications and software. However, even now that these artifacts are available in digital form, researchers still encapsulate the scientific information in static and relatively unstructured documents unsuitable for communication. The next step in the digital transformation of scholarly communication requires a more flexible, fine-grained, context-sensitive, and semantic representation of scientific information that is understandable, processable, and usable by humans and machines. Researchers need support in the form of infrastructures, services, and tools to organize FAIR scientific information from publications and software. Several research disciplines work on initiatives to organize scientific information, e.g., machine learning with “Papers-with-Code”, invasion biology with “Hi-Knowledge”, and biodiversity with “OpenBiodiv”. However, these initiatives are often technically diverse and limited to their respective application domains. For this reason, we in the task area Ellen of NFDI4Ing (in collaboration with NFDI4DataScience and NFDI4Energy) decided to use the Open Research Knowledge Graph (ORKG), an innovative infrastructure for organizing scientific information from publications and software. The ORKG is a cross-discipline research knowledge graph that offers all research communities an easy-to-use and sustainably governed infrastructure.
This infrastructure implements best practices, such as FAIR principles and versioning, with services combining manual crowd-sourcing and (semi-)automated approaches to support researchers in producing, curating, processing, and (re-)using FAIR scientific information from publications and software. As a result, organized scientific information is openly available in the long term and can be understood, processed, and used by humans and machines. Thus, research communities can constantly build, publish, maintain, (re-)use, update, and expand organized scientific information in a long-term and collaborative manner. While the ORKG currently focuses on organizing scientific information from published publications and software, we aim to help researchers create “FAIR-by-Design” artifacts to improve their storage, access, and (re-)use, using the ORKG as exemplary infrastructure. The idea of “FAIR-by-Design” artifacts is that the creators of an artifact describe it with extensive and FAIR information once and in parallel to the time of creation. This FAIR information is embedded directly into the artifact to be available to anyone at any time. Specifically, we developed two tools (SciKGTeX for publications and DataDesc for software) that support researchers in the role of author and developer to enrich their publications and software at the time of writing and development with FAIR scientific information embedded into the respective artifact. SciKGTeX is a LaTeX package to annotate research contributions directly in LaTeX source code. Authors can enrich their publications with structured, machine-actionable, and FAIR scientific information about their research contributions. SciKGTeX embeds the annotated contribution data into the PDF’s XMP metadata so that the FAIR scientific information persists for the lifetime of the artifact. DataDesc is a toolkit that combines different tools to describe software with machine-actionable metadata. 
Developers can describe Python software and its interfaces with extensive metadata by annotating individual classes and functions directly within the source code. DataDesc converts all metadata into an OpenAPI-compliant YAML file, which various tools can render and process. Regarding the research data management (RDM) lifecycle, both tools target the production phase to support researchers in creating “FAIR-by-Design” artifacts. Creating “FAIR-by-Design” artifacts helps to improve their storage, leading to better access to artifacts and thus laying the foundation for their effective (re-)use. Using the ORKG as an exemplary infrastructure, we demonstrate with two proofs of concept how infrastructure providers can use the artifacts from SciKGTeX and DataDesc to store the FAIR scientific information in their systems. In the case of SciKGTeX, the ORKG recently added a new upload feature for SciKGTeX-annotated PDFs to allow researchers to add the FAIR scientific information of publications quickly and easily. In addition, the ing.grid journal provides a version of its LaTeX template that integrates SciKGTeX. For DataDesc, we plan such an upload feature and similar use by the community in future work. Researchers only need to create a “FAIR-by-Design” artifact once and can reuse it on multiple infrastructures to improve its dissemination and discoverability. With improved storage, researchers can more easily discover and access publications and software to determine whether an artifact fulfills their information needs. However, researchers do not have to rely on such infrastructures to find, access, and assess publications or software. When they encounter a “FAIR-by-Design” artifact, it embeds the additional information itself so that they can review the artifact themselves with the same information base. Improved discoverability and accessibility lay the foundation for effective (re-)use, as researchers can better understand an artifact.
In the case of the ORKG, we can even (re-)use the information from SciKGTeX and DataDesc stored in the ORKG interchangeably: a publication annotated with SciKGTeX can reference a software artifact annotated with DataDesc stored in the ORKG, and vice versa. Overall, enabling researchers to create “FAIR-by-Design” artifacts is a promising approach to support the downstream phases of storage, access, and (re-)use in the RDM lifecycle. In our presentation, we want to explain the idea of “FAIR-by-Design” artifacts in more detail using concrete examples based on the two tools and in combination with the ORKG. We believe that the idea of “FAIR-by-Design” artifacts is of interest to the research community. The two tools can inspire other researchers to extend our original approaches and develop new ones to create more “FAIR-by-Design” artifacts by enriching artifacts with FAIR scientific knowledge at the time of creation. Furthermore, we hope to encourage and motivate researchers to use our tools more intensively and thus establish them. In particular, the existing and planned future integration with the ORKG and the existing collaboration with the ing.grid journal are motivating incentives for researchers to use SciKGTeX and DataDesc actively. The authors thank the Federal Government, the Heads of Government of the Länder, as well as the Joint Science Conference (GWK), for their funding and support within the NFDI4Ing and NFDI4DataScience consortia. This work was funded by the German Research Foundation (DFG, project numbers 442146713 and 460234259), by the European Research Council for the project ScienceGRAPH (grant agreement ID: 819536), and by TIB – Leibniz Information Centre for Science and Technology.
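To illustrate the kind of machine-actionable interface metadata such annotation tooling can collect, the sketch below gathers a Python function's signature and docstring using only the standard library. It is an illustration of the general idea, not DataDesc's actual API; `describe_function` and the example function `shear_rate` are invented here:

```python
# Collect machine-actionable metadata about a Python function from its
# signature and docstring. This is a generic illustration of the idea,
# not DataDesc's real interface.
import inspect

def describe_function(func):
    """Collect name, parameters, type annotations, and docstring."""
    sig = inspect.signature(func)
    return {
        "name": func.__name__,
        "summary": (func.__doc__ or "").strip(),
        "parameters": [
            {"name": p.name,
             "type": getattr(p.annotation, "__name__", str(p.annotation))}
            for p in sig.parameters.values()
        ],
        "returns": getattr(sig.return_annotation, "__name__",
                           str(sig.return_annotation)),
    }

def shear_rate(velocity: float, gap: float) -> float:
    """Compute a simple shear rate from velocity and gap width."""
    return velocity / gap

metadata = describe_function(shear_rate)
# A tool could now serialize `metadata` to an OpenAPI-compliant YAML file.
```

The resulting dictionary is the kind of structure that can be serialized to YAML and rendered by OpenAPI tooling, which is the workflow the abstract describes for DataDesc.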

    Structure and kinetics in the freezing of nearly hard spheres

    We consider homogeneous crystallisation rates in confocal microscopy experiments on colloidal nearly hard spheres at the single-particle level. These we compare with Brownian dynamics simulations by carefully modelling the softness in the interactions with a Yukawa potential, which takes account of the electrostatic charges present in the experimental system. Both structure and dynamics of the colloidal fluid are very well matched between experiment and simulation, so we have confidence that the simulated system is close to that in the experiment. In the regimes we can access, we find reasonable agreement in crystallisation rates between experiment and simulations, noting that the larger system size in experiments enables the formation of critical nuclei and hence crystallisation at lower supersaturations than in the simulations. We further examine the structure of the metastable fluid with a novel structural analysis, the topological cluster classification. We find that at densities where the hard-sphere fluid becomes metastable, the dominant structure is a cluster of m = 10 particles with five-fold symmetry. At the particle level, we find three regimes for the crystallisation process: metastable fluid (dominated by m = 10 clusters), crystal, and a transition region of frequent hopping between crystal-like environments and other (m ≠ 10) structures.

    Timing and Pacing of Indonesian Throughflow Restriction and Its Connection to Late Pliocene Climate Shifts

    drier conditions. This shift fundamentally reorganized Earth's climate from the Miocene state toward conditions similar to the present. During the Pliocene, the progressive restriction of the Indonesian Throughflow (ITF) is suggested to have enhanced this shift toward stronger meridional thermal gradients. Reduced ITF, caused by the northward movement of Australia and the uplift of Indonesia, impeded global thermohaline circulation, also contributing to late Pliocene Northern Hemisphere cooling via atmospheric and oceanographic teleconnections. Here we present an orbitally tuned, high-resolution sediment geochemistry, calcareous nannofossil, and X-ray fluorescence record between 3.65 and 2.97 Ma from the northwest shelf of Australia within the Leeuwin Current. International Ocean Discovery Program Site U1463 provides a record of local surface-water conditions and Australian climate in relation to changing ITF connectivity. Modern-analogue-based interpretations of nannofossil assemblages indicate that the ITF reconfiguration culminated ~3.54 Ma. A decrease in warm, oligotrophic taxa such as Umbilicosphaera sibogae, with a shift from Gephyrocapsa sp. to Reticulofenestra sp., and an increase in mesotrophic taxa (e.g., Umbilicosphaera jafari and Helicosphaera spp.) suggest that tropical Pacific ITF sources were replaced by cooler, fresher, northern Pacific waters. This initial tectonic reorganization enhanced the Indian Ocean's sensitivity to orbitally forced cooling in the southern high latitudes, culminating in the M2 glacial event (~3.3 Ma). After 3.3 Ma, the restructured ITF established the boundary conditions for the inception of the Sahul-Indian Ocean Bjerknes mechanism and an increased response to glacio-eustatic variability.

    Can Consistent Benchmarking within a Standardized Pain Management Concept Decrease Postoperative Pain after Total Hip Arthroplasty? A Prospective Cohort Study including 367 Patients

    Background: The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. Methods: All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the Germany-wide project "Quality Improvement in Postoperative Pain Management" (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of all results and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained on each ward. Results: From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (± 3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (± 1.2). Over time, the maximum pain score decreased (mean 3.0, ± 2.0), whereas patient satisfaction significantly increased (mean 9.8, ± 0.4; p < 0.05). Among 49 anonymized hospitals, our clinic remained in first place for lowest maximum pain and highest patient satisfaction over the whole period. Conclusion: Results were already acceptable when benchmarking of the standardized pain management concept began, but regular benchmarking, implementation of feedback mechanisms, and staff education made the concept even more successful. Multidisciplinary teamwork and flexibility in adapting processes seem to be highly important for successful pain management.

    On measuring colloidal volume fractions

    Hard-sphere colloids are popular as models for testing fundamental theories in condensed matter and statistical physics, from crystal nucleation to the glass transition. A single parameter, the volume fraction (phi), characterizes an ideal, monodisperse hard-sphere suspension. In comparing experiments with theories and simulation, researchers to date have paid little attention to likely uncertainties in experimentally quoted phi values. We critically review the experimental measurement of phi in hard-sphere colloids and show that while statistical uncertainties in comparing relative values of phi can be as low as 0.0001, systematic errors of 3–6% are probably unavoidable. The consequences of this are illustrated by way of a case study comparing literature data sets on hard-sphere viscosity and diffusion.
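The 3–6% figure is consistent with first-order error propagation: phi scales with the cube of the particle radius, so a relative radius error of dr/r contributes roughly 3·dr/r to phi. A minimal sketch with hypothetical particle numbers:

```python
# First-order propagation of a radius uncertainty into the volume
# fraction phi. Since phi ~ r^3, d(phi)/phi = 3 * d(r)/r, so a 1-2%
# radius error alone produces a 3-6% error in phi.
# The particle count and box volume below are hypothetical.
import math

def volume_fraction(n_particles, radius, box_volume):
    """phi = N * (4/3) * pi * r^3 / V, for monodisperse spheres."""
    return n_particles * (4.0 / 3.0) * math.pi * radius**3 / box_volume

def relative_phi_error(relative_radius_error):
    """First-order propagation: d(phi)/phi = 3 * d(r)/r."""
    return 3.0 * relative_radius_error

phi = volume_fraction(10_000, 1.0, 8.5e4)   # hypothetical suspension
err = relative_phi_error(0.02)              # a 2% radius uncertainty
```

A 2% radius uncertainty thus maps to a ~6% systematic uncertainty in phi, the upper end of the range quoted in the abstract.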

    Regulatory strategies for selected Member States (Denmark, Germany, Netherlands, Spain, the UK): IMPROGRES project

    Research project supported by the European Commission, Directorate-General for Energy and Transport, under the Intelligent Energy Europe (EIE) programme. This Work Package 6 report of the IMPROGRES project provides an overview of regulatory strategies and incentives conducive to (i) the network integration of increasing levels of distributed generation, notably intermittent renewable technologies such as wind power and solar photovoltaics (PV), and (ii) options for reducing surging network integration costs. Like the IMPROGRES project in general, this report focuses on European distribution networks. It includes specific country studies of Denmark, Germany, the Netherlands, Spain, and the UK. This summary presents the main findings of the report.
