
    Mud-clast armoring and its implications for turbidite systems

    Seafloor sediment density flows are the primary mechanism for transporting sediment to the deep sea. These flows are important because they pose a hazard to seafloor infrastructure and deposit the largest sediment accumulations on Earth. The cohesive sediment content of a flow (i.e., clay) is an important control on its rheological state (e.g., turbulent or laminar); however, how clay becomes incorporated into a flow is poorly understood. One mechanism is the abrasion of (clay-rich) mud clasts. Such clasts are common in deep-water deposits and are often thought to have traveled over large (more than tens of kilometers) distances. These long travel distances are at odds with previous experimental work, which suggests that mud clasts should disintegrate rapidly through abrasion. To address this apparent contradiction, we conduct laboratory experiments using a counter-rotating annular flume to simulate clast transport in sediment density flows. We find that as clay clasts roll along a sandy floor, surficial armoring develops, reducing clast abrasion and thus enhancing travel distance. For the first time, we show armoring to be a process of renewal and replenishment, rather than a permanent layer. Because armoring reduces the rate of clast abrasion, it delays the release of clay into the parent flow, which can in turn delay flow transformation from turbidity current to debris flow. We conclude that armored mud clasts can form only within a sandy turbidity current; hence, where armored clasts are found in debrite deposits, the parent flow must have undergone flow transformation farther upslope.
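
The armoring effect described above can be illustrated with a toy first-order abrasion model in which clast mass decays exponentially with travel distance; the rate constants, function name, and the 10% disintegration threshold below are illustrative assumptions, not values measured in the flume experiments.

```python
# Hypothetical first-order abrasion model: clast mass decays as dm/dx = -k*m,
# where x is travel distance and k is an abrasion rate per unit distance.
# The rate constants are illustrative, not measured values from the flume.
import math

def distance_to_fraction(k_per_km, fraction=0.1):
    """Distance (km) for a clast to abrade down to the given mass fraction."""
    return -math.log(fraction) / k_per_km

k_bare = 1.0      # assumed abrasion rate for an unarmored clast (1/km)
k_armored = 0.05  # assumed lower rate once a sandy armor layer develops (1/km)

print(distance_to_fraction(k_bare))     # a few km before near-disintegration
print(distance_to_fraction(k_armored))  # tens of km, consistent with long travel
```

Under these assumed rates, armoring stretches the survival distance by a factor of twenty, which is the qualitative point of the experiments: a renewable armor layer reconciles fragile clay clasts with tens-of-kilometers transport.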

    A century of social wasp occupancy trends from natural history collections: spatiotemporal resolutions have little effect on model performance

    1. The current dearth of long‐term insect population trends is a major obstacle to conservation. Occupancy models have been proposed as a solution, but it remains unclear whether they can yield long‐term trends from natural history collections, since specimen records are normally very sparse. A common approach for sparse data is to coarsen its spatial and/or temporal resolution, although coarsening risks violating model assumptions.
    2. We (i) test whether occupancy trends of three social wasp (Hymenoptera: Vespidae: Vespinae) species – the common wasp (Vespula vulgaris), the German wasp (Vespula germanica) and the European hornet (Vespa crabro) – have changed in England between 1900 and 2016, and (ii) test the effect of spatiotemporal resolution on the performance of occupancy models using very sparse data. All models are based on an integrated dataset of occurrence records and natural history collection specimen records.
    3. We show that occupancy models can yield long‐term species‐specific trends from very sparse natural history collection specimens. We present the first quantitative trends for three Vespinae species in England over 116 years. Vespula vulgaris and V. germanica show stable trends over the time series, whilst V. crabro's occupancy decreased from 1950 to 1970 and has increased since 1970. Moreover, we show that spatiotemporal resolution has little effect on model performance, although coarsening the spatial grain is an appropriate method for achieving enough records to estimate long‐term changes.
    4. With the increasing availability of biological records, the model formulation used here has the potential to provide novel insights by making use of natural history collections' unique specimen assemblages.
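
The spatiotemporal coarsening step discussed above can be sketched as binning point records into grid-cell and time-window aggregates; the field names, grid size, and window length here are illustrative assumptions, not the resolutions used in the study.

```python
# A minimal sketch of spatiotemporal coarsening for sparse occurrence records:
# each point record is assigned to a (grid cell, time window, species) bin.
# Coordinates, grid size, and window length are illustrative assumptions.
from collections import defaultdict

records = [
    # (easting_m, northing_m, year, species) — synthetic example records
    (451_230, 387_910, 1904, "Vespula vulgaris"),
    (452_800, 388_100, 1905, "Vespula vulgaris"),
    (512_400, 401_950, 1904, "Vespa crabro"),
]

def coarsen(records, grid_m=10_000, window_yr=5):
    """Aggregate point records into grid-cell x time-window detection counts."""
    cells = defaultdict(int)
    for easting, northing, year, species in records:
        key = (easting // grid_m, northing // grid_m, year // window_yr, species)
        cells[key] += 1
    return dict(cells)

print(coarsen(records))
```

Coarsening the grid from 1 km to 10 km pools nearby sparse records into the same cell, which is how enough repeat visits per site are accumulated for an occupancy model to estimate detection probability.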

    Salivary Metabolomics of Well and Poorly Controlled Type 1 and Type 2 Diabetes

    Objective. The concentrations of endogenous metabolites in saliva can be altered by the systemic condition of the host and may, in theory, reflect systemic disease progression. Hemoglobin A1C is used clinically to measure long-term average glycemic control. The aim of the study was to determine whether there were differences in the salivary metabolic profiles between well and poorly controlled subjects with type 1 and type 2 diabetes. Subjects and Methods. Subjects with type 1 and type 2 diabetes were enrolled (n = 40). The subjects were assigned to phenotypic groups based on their current level of A1C: <7 = well controlled, >7 = poorly controlled. Demographic data (age, gender, and ethnicity) were used to match the two phenotypic groups. Whole saliva samples were collected and immediately stored at -80°C. Samples were spiked with an isotopically labeled internal standard and analyzed by UPLC-TOF-MS using a Waters SYNAPT G2-Si mass spectrometer. Results. Unsupervised principal components analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were used to define unique metabolomic profiles associated with well and poorly controlled diabetes based on A1C levels. Conclusion. OPLS-DA demonstrates good separation of well and poorly controlled subjects in both type 1 and type 2 diabetes, providing evidence for developing saliva-based monitoring tools for diabetes.
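
The unsupervised PCA step mentioned above can be sketched on a tiny synthetic intensity matrix; the matrix values and group labels are fabricated for illustration only and are not the study's data.

```python
# A minimal, unsupervised PCA sketch on a metabolite intensity matrix
# (rows = saliva samples, columns = metabolite features), computed via SVD.
# The tiny matrix is synthetic and only illustrates the mechanics.
import numpy as np

X = np.array([
    [1.0, 2.0, 0.5],   # e.g. two "well controlled" samples
    [1.1, 2.1, 0.4],
    [3.0, 0.5, 2.5],   # e.g. two "poorly controlled" samples
    [3.2, 0.4, 2.6],
])

Xc = X - X.mean(axis=0)            # mean-center each metabolite feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                 # sample coordinates on principal components
explained = s**2 / np.sum(s**2)    # fraction of variance per component

print(scores[:, 0])   # PC1 separates the two synthetic groups
print(explained[0])   # most of the variance lies on PC1
```

PCA is unsupervised: the group labels are never used, so any separation along PC1 reflects structure in the metabolite intensities themselves, which is why it serves as an unbiased first look before the supervised OPLS-DA.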

    Author Correction: Rapidly-migrating and internally-generated knickpoints can control submarine channel evolution (Nature Communications, (2020), 11, 1, (3129), 10.1038/s41467-020-16861-x)

    © 2020, The Author(s). The original version of this Article contained an error in the labelling of the cross-section in Fig. 2g and the vertical axis in Fig. 2b. This has been corrected in both the PDF and HTML versions of the Article.

    Detector Description and Performance for the First Coincidence Observations between LIGO and GEO

    For 17 days in August and September 2002, the LIGO and GEO interferometer gravitational wave detectors were operated in coincidence to produce their first data for scientific analysis. Although the detectors were still far from their design sensitivity levels, the data can be used to place better upper limits on the flux of gravitational waves incident on the Earth than previous direct measurements. This paper describes the instruments and the data in some detail, as a companion to analysis papers based on the first data.

    Global monitoring data shows grain size controls turbidity current structure

    The first detailed measurements from active turbidity currents have been made in the last few years, at multiple sites worldwide. These data allow us to investigate the factors that control the structure of these flows. By analyzing the temporal evolution of the maximum velocity of turbidity currents at different sites, we aim to understand whether there are distinct types of flow, or whether a continuum exists between end-members, and to investigate the physical controls on the different types of observed flow. Our results show that the evolution of the maximum velocity of turbidity currents falls between two end-members: either the events show a rapid peak in velocity followed by an exponential decay, or flows continue at a plateau-like, near-constant velocity. Our analysis suggests that, rather than triggers or system input type, flow structure is primarily governed by the grain size of the sediment available for incorporation into the flow.
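
The two end-member velocity structures described above can be distinguished with a toy classifier based on how much of the peak velocity survives into the second half of the record; the 0.5 threshold and the synthetic time series are illustrative assumptions, not the paper's method.

```python
# A toy classifier for the two end-member velocity structures:
# a rapid peak followed by exponential decay vs a sustained plateau.
# The 0.5 tail-to-peak threshold and the series below are assumptions.
import math

def classify(velocities):
    """Label a maximum-velocity time series by its late-stage behaviour."""
    peak = max(velocities)
    tail = velocities[len(velocities) // 2:]   # second half of the record
    tail_mean = sum(tail) / len(tail)
    return "plateau" if tail_mean / peak > 0.5 else "peak-and-decay"

decay = [5.0 * math.exp(-0.5 * t) for t in range(10)]            # fast decay
plateau = [4.8, 5.0, 4.9, 5.0, 4.9, 5.0, 4.8, 4.9, 5.0, 4.9]     # sustained

print(classify(decay))     # peak-and-decay
print(classify(plateau))   # plateau
```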

    The design, implementation, and performance of the LZ calibration systems

    LUX-ZEPLIN (LZ) is a tonne-scale experiment searching for direct dark matter interactions and other rare events. It is located at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, USA. The core of the LZ detector is a dual-phase xenon time projection chamber (TPC), designed with the primary goal of detecting Weakly Interacting Massive Particles (WIMPs) via their induced low-energy nuclear recoils. Surrounding the TPC, two veto detectors immersed in an ultra-pure water tank reduce background events and enhance the discovery potential. Intricate calibration systems are purposely designed to precisely understand the responses of these three detector volumes to various types of particle interactions and to demonstrate LZ's ability to discriminate between signals and backgrounds. In this paper, we present a comprehensive discussion of the key features, requirements, and performance of the LZ calibration systems, which play a crucial role in enabling LZ's WIMP search and its broad science program. The thorough description of these calibration systems, with an emphasis on their novel aspects, is valuable for future calibration efforts in direct dark matter and other rare-event search experiments.

    New constraints on ultraheavy dark matter from the LZ experiment

    Searches for dark matter with liquid xenon time projection chamber experiments have traditionally focused on the region of the parameter space that is characteristic of weakly interacting massive particles, ranging from a few GeV/c² to a few TeV/c². Models of dark matter with a mass much heavier than this are well motivated by early production mechanisms different from the standard thermal freeze-out, but they have generally been less explored experimentally. In this work, we present a reanalysis of the first science run of the LZ experiment, with an exposure of 0.9 tonne × yr, to search for ultraheavy particle dark matter. The signal topology consists of multiple energy deposits in the active region of the detector forming a straight line, from which the velocity of the incoming particle can be reconstructed on an event-by-event basis. Zero events with this topology were observed after applying the data selection calibrated on a simulated sample of signal-like events. New experimental constraints are derived, which rule out previously unexplored regions of the dark matter parameter space of spin-independent interactions beyond a mass of 10¹⁷ GeV/c². Published by the American Physical Society 2024.
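
The event-by-event velocity reconstruction described above can be sketched as a least-squares fit of deposit distance against deposit time along the track; the deposit values are synthetic and the geometry is simplified to one dimension, so this is only an illustration of the idea, not the analysis code.

```python
# A sketch of event-by-event track-speed reconstruction: multiple energy
# deposits along a straight line, each with a timestamp and a position along
# the track; speed is the least-squares slope of distance vs time.
# The deposit values are synthetic and the geometry is simplified to 1D.
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])      # deposit times (microseconds)
d = np.array([0.0, 30.0, 60.0, 90.0])   # positions along the track (cm)

v_cm_per_us, intercept = np.polyfit(t, d, 1)   # slope of the fit = speed
print(v_cm_per_us)
```

A straight, uniformly paced chain of deposits is what distinguishes a heavy, slow-moving track-like candidate from ordinary scattered backgrounds, and the fitted slope gives the per-event velocity used to characterize it.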

    Search for excited taus from Z0 decays

