
    The extraordinary evolutionary history of the reticuloendotheliosis viruses

    The reticuloendotheliosis viruses (REVs) comprise several closely related amphotropic retroviruses isolated from birds. These viruses exhibit several highly unusual characteristics that have not so far been adequately explained, including their extremely close relationship to mammalian retroviruses, and their presence as endogenous sequences within the genomes of certain large DNA viruses. We present evidence for an iatrogenic origin of REVs that accounts for these phenomena. Firstly, we identify endogenous retroviral fossils in mammalian genomes that share a unique recombinant structure with REVs—unequivocally demonstrating that REVs derive directly from mammalian retroviruses. Secondly, through sequencing of archived REV isolates, we confirm that contaminated Plasmodium lophurae stocks have been the source of multiple REV outbreaks in experimentally infected birds. Finally, we show that both phylogenetic and historical evidence support a scenario wherein REVs originated as mammalian retroviruses that were accidentally introduced into avian hosts in the late 1930s, during experimental studies of P. lophurae, and subsequently integrated into the fowlpox virus (FWPV) and gallid herpesvirus type 2 (GHV-2) genomes, generating recombinant DNA viruses that now circulate in wild birds and poultry. Our findings provide a novel perspective on the origin and evolution of REV, and indicate that horizontal gene transfer between virus families can expand the impact of iatrogenic transmission events.

    Reaction rates and transport in neutron stars

    Understanding signals from neutron stars requires knowledge about the transport inside the star. We review the transport properties and the underlying reaction rates of dense hadronic and quark matter in the crust and the core of neutron stars and point out open problems and future directions. Comment: 74 pages; commissioned for the book "Physics and Astrophysics of Neutron Stars", NewCompStar COST Action MP1304; version 3: minor changes, references updated, overview graphic added in the introduction, improvements in Sec. IV.A.

    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one single functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
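The two prerequisites above can be illustrated with a small toy model that is not from the paper itself: a system of N functions served by 2N agents, built either redundantly (two dedicated copies of each specialist) or degenerately (each agent able to perform two overlapping functions). Coverage after agent loss is checked with a bipartite matching; all structural choices here are illustrative assumptions.

```python
# Toy model (not from the paper): N functions must all still be performed
# after any two agents fail. "Redundant" keeps two dedicated copies of each
# specialist; "degenerate" uses the same number of agents, each able to
# perform two overlapping functions (the ring overlap is an assumed layout).
from itertools import combinations

N = 6  # number of distinct functions

# Redundant: agents 2i and 2i+1 can each perform only function i.
redundant = [{i // 2} for i in range(2 * N)]
# Degenerate: agent j can perform functions j mod N and (j+1) mod N.
degenerate = [{j % N, (j + 1) % N} for j in range(2 * N)]

def covers(agents, alive):
    """True if the surviving agents can be assigned, one function each,
    so that every function is still performed (Kuhn's matching)."""
    match = {}  # agent -> function currently assigned

    def assign(func, seen):
        for a in alive:
            if func in agents[a] and a not in seen:
                seen.add(a)
                if a not in match or assign(match[a], seen):
                    match[a] = func
                    return True
        return False

    return all(assign(f, set()) for f in range(N))

def failure_count(agents):
    """Number of two-agent losses after which coverage breaks."""
    ids = set(range(len(agents)))
    return sum(not covers(agents, ids - {a, b})
               for a, b in combinations(ids, 2))

# The degenerate system absorbs every two-agent loss via distributed
# reassignment; the redundant one fails whenever both copies of the same
# specialist are lost.
print(failure_count(redundant), failure_count(degenerate))  # 6 0
```

With identical agent counts, the overlapping repertoires let perturbations be absorbed by a cascade of local reassignments, which is the distributed buffering effect the abstract describes.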

    High cable forces deteriorate pinch force control in voluntary-closing body-powered prostheses

    It is generally asserted that reliable and intuitive control of upper-limb prostheses requires adequate feedback of prosthetic finger positions and pinch forces applied to objects. Body-powered prostheses (BPPs) provide the user with direct proprioceptive feedback. Currently available BPPs often require high cable operation forces, which complicates control of the forces at the terminal device. The aim of this study is to quantify the influence of high cable forces on object manipulation with voluntary-closing prostheses. Able-bodied male subjects were fitted with a bypass-prosthesis with low and high cable force settings for the prehensor. Subjects were requested to grasp and transfer a collapsible object as fast as they could without dropping or breaking it. The object had a low and a high breaking force setting. Subjects conducted significantly more successful manipulations with the low cable force setting, both for the low (33% more) and the high (50% more) breaking force of the object. The time to complete the task did not differ between settings during successful manipulation trials. In conclusion, high cable forces lead to reduced pinch force control during object manipulation. This implies that low cable operation forces should be a key design requirement for voluntary-closing BPPs.

    Measurement of the B0 anti-B0 oscillation frequency using l- D*+ pairs and lepton flavor tags

    The oscillation frequency Delta-md of B0 anti-B0 mixing is measured using the partially reconstructed semileptonic decay anti-B0 -> l- nubar D*+ X. The data sample was collected with the CDF detector at the Fermilab Tevatron collider during 1992 - 1995 by triggering on the existence of two lepton candidates in an event, and corresponds to about 110 pb-1 of pbar p collisions at sqrt(s) = 1.8 TeV. We estimate the proper decay time of the anti-B0 meson from the measured decay length and reconstructed momentum of the l- D*+ system. The charge of the lepton in the final state identifies the flavor of the anti-B0 meson at its decay. The second lepton in the event is used to infer the flavor of the anti-B0 meson at production. We measure the oscillation frequency to be Delta-md = 0.516 +/- 0.099 +0.029 -0.035 ps-1, where the first uncertainty is statistical and the second is systematic. Comment: 30 pages, 7 figures. Submitted to Physical Review.
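The proper-time estimate described above amounts to t = L·m/(p·c). A minimal numerical sketch follows, with illustrative input values; the real analysis additionally corrects the l- D*+ momentum for the unreconstructed neutrino and accounts for resolution and mistag dilution, none of which is modeled here.

```python
import math

C_MM_PER_PS = 0.299792458  # speed of light in mm/ps
M_B0 = 5.2797              # B0 meson mass in GeV (PDG value)

def proper_time_ps(decay_length_mm, momentum_gev):
    """Proper decay time t = L * m / (p * c) from the measured decay
    length and reconstructed momentum of the l- D*+ system. (The real
    analysis also corrects for the unreconstructed neutrino momentum.)"""
    return decay_length_mm * M_B0 / (momentum_gev * C_MM_PER_PS)

def mixing_asymmetry(t_ps, delta_md=0.516):
    """Idealized time-dependent mixing asymmetry A(t) = cos(Delta-md * t)
    for the measured Delta-md = 0.516 ps^-1 (detector effects ignored)."""
    return math.cos(delta_md * t_ps)

# Example: a 1 mm decay length at 20 GeV reconstructed momentum.
t = proper_time_ps(decay_length_mm=1.0, momentum_gev=20.0)
print(round(t, 3), round(mixing_asymmetry(t), 3))
```

The oscillation frequency is then extracted by fitting the observed flavor asymmetry as a function of this reconstructed proper time.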

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √ s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb−1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between ETmiss > 150 GeV and ETmiss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
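The translation of event counts into exclusion limits can be sketched with a single-bin Poisson counting experiment. This is a deliberate simplification: the ATLAS analysis uses the fuller CLs prescription with systematic uncertainties, while the sketch below shows only the idealized classical core.

```python
import math

def poisson_cdf(n_obs, mean):
    """P(N <= n_obs) for a Poisson-distributed count with the given mean."""
    return sum(math.exp(-mean) * mean ** k / math.factorial(k)
               for k in range(n_obs + 1))

def upper_limit(n_obs, background, cl=0.95):
    """Classical upper limit on the signal yield s in a single counting
    experiment: the s at which P(N <= n_obs | b + s) falls to 1 - cl,
    found by bisection. (The ATLAS result uses the CLs method instead;
    this is only an illustrative simplification.)"""
    lo, hi = 0.0, 10.0 * (n_obs + 10)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, background + mid) > 1.0 - cl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With zero events observed and no background, the 95% CL limit is the
# familiar ln(20) ≈ 3.0 signal events.
print(round(upper_limit(n_obs=0, background=0.0), 2))
```

A limit on signal events in each signal region is then converted into limits on model parameters (extra dimensions, dark matter couplings, gravitino mass) through the predicted signal acceptance times cross section.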

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
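The dependence on distributional assumptions noted above can be made concrete with a small Monte Carlo sketch. All numerical choices here are illustrative assumptions, not values from the article: two sub-factors are each calibrated so that their own 95th percentile equals the conventional value of 10, yet the 95th percentile of their product differs markedly between a lognormal and a uniform shape, and both fall well short of the deterministic product of 100.

```python
import math
import random

random.seed(1)
N_DRAWS = 100_000
P95 = 10.0   # each sub-factor is calibrated so its own 95th percentile
             # equals the conventional default sub-factor value of 10
Z95 = 1.645  # standard-normal 95th percentile

# Shape 1 (assumed): lognormal sub-factor with median sqrt(10) and sigma
# chosen so that the 95th percentile is exactly 10.
SIGMA = math.log(math.sqrt(10.0)) / Z95
def lognormal_factor():
    return math.sqrt(10.0) * math.exp(random.gauss(0.0, SIGMA))

# Shape 2 (assumed): uniform sub-factor on [1, u], with u chosen so the
# 95th percentile is also 10:  1 + 0.95 * (u - 1) = 10.
U_MAX = 1.0 + (P95 - 1.0) / 0.95
def uniform_factor():
    return random.uniform(1.0, U_MAX)

def product_p95(draw):
    """Monte Carlo 95th percentile of the product of two sub-factors."""
    products = sorted(draw() * draw() for _ in range(N_DRAWS))
    return products[int(0.95 * N_DRAWS)]

# Same per-factor conservatism, different product percentiles -- and both
# are well below the deterministic product 10 x 10 = 100.
print(round(product_p95(lognormal_factor), 1),
      round(product_p95(uniform_factor), 1))
```

Under the lognormal assumption the product's 95th percentile comes out near 51, under the uniform assumption near 80: the protection actually delivered by "10 × 10" depends entirely on which distributions the sub-factors are presumed to follow.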

    Search for supersymmetry in events with four or more leptons in √s =13 TeV pp collisions with ATLAS

    Results from a search for supersymmetry in events with four or more charged leptons (electrons, muons and taus) are presented. The analysis uses a data sample corresponding to 36.1 fb−1 of proton-proton collisions delivered by the Large Hadron Collider at √s = 13 TeV and recorded by the ATLAS detector. Four-lepton signal regions with up to two hadronically decaying taus are designed to target a range of supersymmetric scenarios that can be either enriched in or depleted of events involving the production and decay of a Z boson. Data yields are consistent with Standard Model expectations and results are used to set upper limits on the event yields from processes beyond the Standard Model. Exclusion limits are set at the 95% confidence level in simplified models of General Gauge Mediated supersymmetry, where higgsino masses are excluded up to 295 GeV. In R-parity-violating simplified models with decays of the lightest supersymmetric particle to charged leptons, lower limits of 1.46 TeV, 1.06 TeV, and 2.25 TeV are placed on wino, slepton and gluino masses, respectively.

    Measurements of integrated and differential cross sections for isolated photon pair production in pp collisions at √s=8 TeV with the ATLAS detector

    A measurement of the production cross section for two isolated photons in proton-proton collisions at a center-of-mass energy of √s = 8 TeV is presented. The results are based on an integrated luminosity of 20.2 fb−1 recorded by the ATLAS detector at the Large Hadron Collider. The measurement considers photons with pseudorapidities satisfying |ηγ| < 2.37 and transverse energies EγT,1 > 40 GeV and EγT,2 > 30 GeV for the two leading photons, ordered in transverse energy, produced in the interaction. The background due to hadronic jets and electrons is subtracted using data-driven techniques. The fiducial cross sections are corrected for detector effects and measured differentially as a function of six kinematic observables. The measured cross section integrated within the fiducial volume is 16.8 ± 0.8 pb. The data are compared to fixed-order QCD calculations at next-to-leading-order and next-to-next-to-leading-order accuracy as well as next-to-leading-order computations including resummation of initial-state gluon radiation at next-to-next-to-leading logarithm or matched to a parton shower, with relative uncertainties varying from 5% to 20%.