
    Multi-platform Approach for Microbial Biomarker Identification Using Borrelia burgdorferi as a Model

    The identification of microbial biomarkers is critical for diagnosing disease early in infection. However, reliable biomarker identification is often hampered by low concentrations of microbes or biomarkers within host fluids or tissues. We have outlined a multi-platform strategy to assess microbial biomarkers that can be consistently detected in host samples, using Borrelia burgdorferi, the causative agent of Lyme disease, as an example. Key aspects of the strategy include the selection of a macaque model of human disease, in vivo Microbial Antigen Discovery (InMAD), and proteomic methods that include microbial biomarker enrichment within samples to identify secreted proteins circulating during infection. Using the described strategy, we identified six biomarkers from multiple samples. In addition, the temporal antibody response to select bacterial antigens was mapped. By integrating biomarkers identified from early infection with temporal patterns of expression, the described platform allows for the data-driven selection of diagnostic targets.

    Precision Measurement of the Beam-Normal Single-Spin Asymmetry in Forward-Angle Elastic Electron-Proton Scattering

    A beam-normal single-spin asymmetry generated in the scattering of transversely polarized electrons from unpolarized nucleons is an observable related to the imaginary part of the two-photon exchange process. We report a 2% precision measurement of the beam-normal single-spin asymmetry in elastic electron-proton scattering at a mean scattering angle of θ_lab = 7.9° and a mean beam energy of 1.149 GeV. The asymmetry result is B_n = −5.194 ± 0.067 (stat) ± 0.082 (syst) ppm. This is the most precise measurement of this quantity available to date and therefore provides a stringent test of two-photon exchange models at far-forward scattering angles (θ_lab → 0), where they should be most reliable.
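As a quick consistency check (not from the paper itself), combining the quoted statistical and systematic uncertainties in quadrature reproduces the stated 2% precision:

```python
import math

# Quadrature combination of the quoted uncertainties; checks that the
# combined uncertainty matches the stated ~2% precision on B_n.
B_n = -5.194               # ppm
stat, syst = 0.067, 0.082  # ppm
total = math.hypot(stat, syst)
print(f"total = {total:.3f} ppm ({100 * total / abs(B_n):.1f}% of |B_n|)")
```

This prints a combined uncertainty of about 0.106 ppm, i.e. roughly 2.0% of |B_n|, consistent with the quoted precision.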

    Reimagining large river management using the Resist–Accept–Direct (RAD) framework in the Upper Mississippi River

    Background: Large-river decision-makers are charged with maintaining diverse ecosystem services through unprecedented social-ecological transformations as climate change and other global stressors intensify. The interconnected, dendritic habitats of rivers, which often demarcate jurisdictional boundaries, generate complex management challenges. Here, we explore how the Resist–Accept–Direct (RAD) framework may enhance large-river management by promoting coordinated and deliberate responses to social-ecological trajectories of change. The RAD framework identifies the full decision space of potential management approaches, wherein managers may resist change to maintain historical conditions, accept change toward different conditions, or direct change to a specified future with novel conditions. In the Upper Mississippi River System, managers are facing social-ecological transformations from more frequent and extreme high-water events. We illustrate how RAD-informed basin-, reach-, and site-scale decisions could: (1) provide cross-spatial scale framing; (2) open the entire decision space of potential management approaches; and (3) enhance coordinated inter-jurisdictional management in response to the trajectory of the Upper Mississippi River hydrograph. Results: The RAD framework helps identify plausible long-term trajectories in different reaches (or subbasins) of the river and how the associated social-ecological transformations could be managed by altering site-scale conditions. Strategic reach-scale objectives may reprioritize how, where, and when site conditions could be altered to contribute to the basin goal, given the basin’s plausible trajectories of change (e.g., by coordinating action across sites to alter habitat connectivity, diversity, and redundancy in the river mosaic). 
Conclusions: When faced with long-term systemic transformations (e.g., > 50 years), the RAD framework helps managers explicitly consider whether or when the basin vision or goals may no longer be achievable, and directing change may open as-yet-unconsidered potential for the basin. Embedding the RAD framework in hierarchical decision-making clarifies that the selection of actions in space and time should be derived from basin-wide goals and reach-scale objectives to ensure that site-scale actions contribute effectively to the larger river habitat mosaic. Embedding the RAD framework in large-river decisions can provide the necessary conduit to link flexibility and innovation at the site scale with stability at larger scales for adaptive governance of changing social-ecological systems.

    The molecular epidemiology of multiple zoonotic origins of SARS-CoV-2

    Understanding the circumstances that lead to pandemics is important for their prevention. Here, we analyze the genomic diversity of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) early in the coronavirus disease 2019 (COVID-19) pandemic. We show that SARS-CoV-2 genomic diversity before February 2020 likely comprised only two distinct viral lineages, denoted A and B. Phylodynamic rooting methods, coupled with epidemic simulations, reveal that these lineages were the result of at least two separate cross-species transmission events into humans. The first zoonotic transmission likely involved lineage B viruses around 18 November 2019 (23 October–8 December), while the separate introduction of lineage A likely occurred within weeks of this event. These findings indicate that it is unlikely that SARS-CoV-2 circulated widely in humans prior to November 2019 and define the narrow window between when SARS-CoV-2 first jumped into humans and when the first cases of COVID-19 were reported. As with other coronaviruses, SARS-CoV-2 emergence likely resulted from multiple zoonotic events.

    Stochastically Gating Ion Channels Enable Patterned Spike Firing through Activity-Dependent Modulation of Spike Probability

    The transformation of synaptic input into patterns of spike output is a fundamental operation that is determined by the particular complement of ion channels that a neuron expresses. Although it is well established that individual ion channel proteins make stochastic transitions between conducting and non-conducting states, most models of synaptic integration are deterministic, and relatively little is known about the functional consequences of interactions between stochastically gating ion channels. Here, we show that a model of stellate neurons from layer II of the medial entorhinal cortex implemented with either stochastic or deterministically gating ion channels can reproduce the resting membrane properties of stellate neurons, but only the stochastic version of the model can fully account for perithreshold membrane potential fluctuations and clustered patterns of spike output that are recorded from stellate neurons during depolarized states. We demonstrate that the stochastic model implements an example of a general mechanism for patterning of neuronal output through activity-dependent changes in the probability of spike firing. Unlike deterministic mechanisms that generate spike patterns through slow changes in the state of model parameters, this general stochastic mechanism does not require retention of information beyond the duration of a single spike and its associated afterhyperpolarization. Instead, clustered patterns of spikes emerge in the stochastic model of stellate neurons as a result of a transient increase in firing probability driven by activation of HCN channels during recovery from the spike afterhyperpolarization. Using this model, we infer conditions in which stochastic ion channel gating may influence firing patterns in vivo and predict consequences of modifications of HCN channel function for in vivo firing patterns.
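The deterministic/stochastic distinction drawn here can be illustrated with a minimal two-state channel population — a generic sketch, not the authors' stellate-cell model; the channel counts and rates below are arbitrary assumptions:

```python
import numpy as np

def simulate_channels(n_channels=100, n_steps=10000, dt=0.01,
                      k_open=0.5, k_close=1.0, seed=0):
    """Simulate independent two-state (closed <-> open) ion channels
    with stochastic per-timestep transitions.

    Rates are in 1/ms and dt in ms, so per-step transition
    probabilities are rate * dt. Returns the open-channel count
    at every step.
    """
    rng = np.random.default_rng(seed)
    open_state = np.zeros(n_channels, dtype=bool)
    n_open = np.empty(n_steps, dtype=int)
    p_open = k_open * dt    # P(closed -> open) per step
    p_close = k_close * dt  # P(open -> closed) per step
    for t in range(n_steps):
        u = rng.random(n_channels)
        # Open channels close with probability p_close;
        # closed channels open with probability p_open.
        open_state = np.where(open_state, u >= p_close, u < p_open)
        n_open[t] = open_state.sum()
    return n_open

counts = simulate_channels()
# The open fraction fluctuates around the deterministic equilibrium
# k_open / (k_open + k_close) = 1/3; a deterministic rate model would
# converge to that value with no fluctuations at all.
print(counts.mean() / 100, counts.std())
```

With small channel numbers the fluctuations are large relative to the mean, which is the regime in which stochastic gating can shape spike timing; as `n_channels` grows the trace approaches the deterministic limit.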

    Safety, immunogenicity, and reactogenicity of BNT162b2 and mRNA-1273 COVID-19 vaccines given as fourth-dose boosters following two doses of ChAdOx1 nCoV-19 or BNT162b2 and a third dose of BNT162b2 (COV-BOOST): a multicentre, blinded, phase 2, randomised trial


    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
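The pipeline described — a logistic model reduced to an integer point score, then validated by ROC analysis on an external cohort — can be sketched as follows. The predictors, weights and simulated data here are hypothetical illustrations, not the CholeS coefficients:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(scores, kind="mergesort")
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(42)
n = 2405  # size of the external validation cohort in the abstract
# Hypothetical predictors loosely mirroring those named (ASA, age, BMI);
# the coefficients below are invented for illustration only.
asa = rng.integers(1, 4, n)
age = rng.normal(55, 15, n)
bmi = rng.normal(28, 5, n)
logit = -7.0 + 0.7 * asa + 0.03 * age + 0.06 * bmi
long_op = rng.random(n) < 1 / (1 + np.exp(-logit))  # operation > 90 min?
# Integer point score, as is typical when converting a regression model
# into a bedside scoring tool.
score = 2 * asa + np.round(age / 10) + np.round(bmi / 10)
print("AUC of integer score:", auc(score, long_op))
```

The rounding step trades a little discrimination for a score clinicians can tally by hand, which is why such tools are validated afresh on an external cohort.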

    The impact of human expert visual inspection on the discovery of strong gravitational lenses

    We investigate the ability of human ‘expert’ classifiers to identify strong gravitational lens candidates in Dark Energy Survey-like imaging. We recruited a total of 55 people who completed more than 25% of the project. During the classification task, we presented 1489 images to the participants. The sample contained a variety of data, including lens simulations, real lenses, non-lens examples, and unlabelled data. We find that experts are extremely good at finding bright, well-resolved Einstein rings, whilst arcs with g-band signal-to-noise less than ∼25 or Einstein radii less than ∼1.2 times the seeing are rarely recovered. Very few non-lenses are scored highly. There is substantial variation in the performance of individual classifiers, but it does not appear to depend on the classifier's experience, confidence or academic position. These variations can be mitigated with a team of 6 or more independent classifiers. Our results give confidence that humans are a reliable pruning step for lens candidates, providing pure and quantifiably complete samples for follow-up studies.
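The claim that a team of independent classifiers mitigates individual variation is the standard variance-reduction argument: averaging k independent judgements shrinks the noise roughly as 1/√k. A toy sketch with a hypothetical noise model (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(7)
n_images = 1489  # matches the number of images shown in the study
truth = rng.random(n_images)  # hypothetical "true" lens-likeness per image

def panel_score(n_classifiers):
    """Average the grades of n independent, noisy classifiers.

    Each classifier sees the true signal plus its own judgement noise
    (sigma = 0.3, an arbitrary assumption).
    """
    noise = rng.normal(0.0, 0.3, size=(n_classifiers, n_images))
    return (truth + noise).mean(axis=0)

for k in (1, 6):
    err = np.abs(panel_score(k) - truth).mean()
    print(f"{k} classifier(s): mean abs error {err:.3f}")
```

Under these assumptions the mean error drops from about 0.24 for a lone classifier to about 0.10 for a panel of 6, in line with the 1/√k scaling.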

    Development of a Unifying Target and Consensus Indicators for Global Surgical Systems Strengthening: Proposed by the Global Alliance for Surgery, Obstetric, Trauma, and Anaesthesia Care (The G4 Alliance)


    Search for gravitational-lensing signatures in the full third observing run of the LIGO-Virgo network

    Gravitational lensing by massive objects along the line of sight to the source causes distortions of gravitational-wave signals; such distortions may reveal information about fundamental physics, cosmology and astrophysics. In this work, we have extended the search for lensing signatures to all binary black hole events from the third observing run of the LIGO-Virgo network. We search for repeated signals from strong lensing by 1) performing targeted searches for subthreshold signals, 2) calculating the degree of overlap amongst the intrinsic parameters and sky locations of pairs of signals, 3) comparing the similarities of the spectrograms amongst pairs of signals, and 4) performing dual-signal Bayesian analysis that takes into account selection effects and astrophysical knowledge. We also search for distortions to the gravitational waveform caused by 1) frequency-independent phase shifts in strongly lensed images, and 2) frequency-dependent modulation of the amplitude and phase due to point masses. None of these searches yields significant evidence for lensing. Finally, we use the non-detection of gravitational-wave lensing to constrain the lensing rate based on the latest merger-rate estimates and the fraction of dark matter composed of compact objects.