165 research outputs found

    A probabilistic model for gene content evolution with duplication, loss, and horizontal transfer

    We introduce a Markov model for the evolution of a gene family along a phylogeny. The model includes parameters for the rates of horizontal gene transfer, gene duplication, and gene loss, in addition to branch lengths in the phylogeny. The likelihood for the changes in the size of a gene family across different organisms can be calculated in O(N+hM^2) time and O(N+M^2) space, where N is the number of organisms, h is the height of the phylogeny, and M is the sum of family sizes. We apply the model to the evolution of gene content in Proteobacteria using the gene families in the COG (Clusters of Orthologous Groups) database.
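
    The likelihood computation described above can be pictured with a much simpler (and slower) sketch: a gene-count birth-death-immigration chain, truncated at a maximum family size, combined with Felsenstein-style pruning over a toy tree. This is not the paper's O(N+hM^2) algorithm; the rates, the truncation bound M_MAX, the rate_matrix and prune helpers, and the example tree are all illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): likelihood of leaf gene-family
# counts under a truncated birth-death-immigration chain, where duplication
# acts as birth, loss as death, and horizontal transfer as immigration.
import numpy as np
from scipy.linalg import expm

M_MAX = 20                                # truncation of family size

def rate_matrix(lam, mu, kap):
    Q = np.zeros((M_MAX + 1, M_MAX + 1))
    for n in range(M_MAX + 1):
        if n < M_MAX:
            Q[n, n + 1] = n * lam + kap   # duplication + transfer gain
        if n > 0:
            Q[n, n - 1] = n * mu          # gene loss
        Q[n, n] = -Q[n].sum()
    return Q

def prune(node, Q):
    """Felsenstein-style pruning: P(data below node | family size at node)."""
    if "count" in node:                   # leaf: observed family size
        L = np.zeros(M_MAX + 1)
        L[node["count"]] = 1.0
        return L
    L = np.ones(M_MAX + 1)
    for child, t in node["children"]:
        P = expm(Q * t)                   # transition probabilities over branch t
        L *= P @ prune(child, Q)
    return L

# Toy three-leaf phylogeny with observed family sizes 3, 1, and 2.
tree = {"children": [({"count": 3}, 0.4),
                     ({"children": [({"count": 1}, 0.2),
                                    ({"count": 2}, 0.2)]}, 0.3)]}
Q = rate_matrix(lam=0.5, mu=0.6, kap=0.2)
root_prior = np.full(M_MAX + 1, 1.0 / (M_MAX + 1))   # flat prior at the root
print("log-likelihood:", np.log(root_prior @ prune(tree, Q)))
```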

    High Performance dynamic voltage/frequency scaling algorithm for real-time dynamic load management and code mobility

    Modern cyber-physical systems assume a complex and dynamic interaction between the real world and the computing system in real time. In this context, changes in the physical environment trigger changes in the computational load to be executed. Task migration services offered by networked control systems also require management of dynamic real-time computing load in nodes. In such systems it would be difficult, if not impossible, to analyse off-line all possible combinations of processor loads. For this reason, it is worthwhile to define new flexible architectures that enable computing systems to adapt to potential changes in the environment. We assume a system composed of three main components: the first is responsible for managing the requests that arise when new tasks require execution. This management component asks the second component about the resources available to accept the new tasks. The second component performs a feasibility analysis to determine whether the new tasks can be accepted while meeting their real-time constraints; a new processor speed is also computed. A third component monitors the execution of tasks, applying a fixed-priority scheduling policy and additionally controlling the frequency of the processor. This paper focuses on the second component, providing a "correct" (a task is never accepted if it is not schedulable) and "near-exact" (a task is rarely rejected if it is schedulable) algorithm that is applicable in practice because of its low-to-medium and predictable computational cost. The algorithm analyses task admission in terms of processor frequency scaling. The paper presents the details of a novel algorithm to analyse task admission and processor frequency assignment. Additionally, we perform several simulations to evaluate the comparative performance of the proposed approach in terms of energy consumption, task rejection ratios, and real computing costs. The simulation results show that, from the cost, execution predictability, and task acceptance points of view, the proposed algorithm mostly outperforms other constant voltage scaling algorithms. © 2011 Elsevier Inc. All rights reserved. This work has been supported by the Spanish Government as part of the SIDIRELI project (DPI2008-06737-C02-02) and the COBAMI project (DPI2011-28507-C02-02), and by the Generalitat Valenciana (project ACOMP-2010-038). Coronel Parada, JO.; Simó Ten, JE. (2012). High Performance dynamic voltage/frequency scaling algorithm for real-time dynamic load management and code mobility. Journal of Systems and Software. 85(4):906-919. https://doi.org/10.1016/j.jss.2011.11.284
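
    As a rough illustration of what the admission component in such an architecture has to decide, the sketch below implements a simple utilization-based admission test under frequency scaling, assuming worst-case execution times shrink linearly with processor speed and an EDF-style bound U <= f. It is not the paper's fixed-priority, near-exact algorithm; the Task fields and the admit helper are assumptions made for the example.

```python
# Hedged sketch, not the paper's algorithm: a simple admission test in which
# worst-case execution times scale inversely with the normalized processor
# frequency f in (0, 1]. Task attributes (wcet at f = 1, period) are assumptions.
from dataclasses import dataclass

@dataclass
class Task:
    wcet: float    # worst-case execution time at full speed (f = 1.0)
    period: float  # period, taken equal to the relative deadline

def admit(accepted: list[Task], new: Task) -> tuple[bool, float]:
    """Return (accepted?, frequency) using the EDF utilization bound U <= f."""
    u = sum(t.wcet / t.period for t in accepted + [new])
    if u <= 1.0:
        return True, max(u, 0.1)   # run just fast enough, with a speed floor
    return False, 1.0              # reject: infeasible even at full speed

tasks = [Task(1.0, 10.0), Task(2.0, 20.0)]
ok, f = admit(tasks, Task(3.0, 15.0))
print(ok, round(f, 3))   # accepted at roughly 40% of full speed
```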

    Detection of acute infections during HIV testing in North Carolina

    BACKGROUND: North Carolina has added nucleic acid amplification testing for the human immunodeficiency virus (HIV) to standard HIV antibody tests to detect persons with acute HIV infection who are viremic but antibody-negative. METHODS: To determine the effect of nucleic acid amplification testing on the yield and accuracy of HIV detection in public health practice, we conducted a 12-month observational study of methods for state-funded HIV testing. We compared the diagnostic performance of standard HIV antibody tests (i.e., enzyme immunoassay and Western blot analysis) with an algorithm whereby serum samples that yielded negative results on standard antibody tests were tested again with the use of nucleic acid amplification. A surveillance algorithm with repeated sensitive/less-sensitive enzyme immunoassay tests was also evaluated. HIV infection was defined as a confirmed positive result on a nucleic acid amplification test or as HIV antibody seroconversion. RESULTS: Between November 1, 2002, and October 31, 2003, 109,250 persons at risk for HIV infection who had consented to HIV testing presented at state-funded sites. There were 606 HIV-positive results. Established infection, as identified by standard enzyme immunoassay or Western blot analysis, appeared in 583 participants; of these, 107 were identified, with the use of sensitive/less-sensitive enzyme immunoassay tests, as recent infections. A total of 23 acutely infected persons were identified only with the use of the nucleic acid amplification algorithm. With all detectable infections taken into account, the sensitivity of standard antibody testing was 0.962 (95 percent confidence interval, 0.944 to 0.976). There were two false positive results on nucleic acid amplification tests. The specificity and positive predictive value of the algorithm that included nucleic acid amplification testing were greater than 0.999 (95 percent confidence interval, 0.999 to >0.999) and 0.997 (95 percent confidence interval, 0.988 to >0.999), respectively. Of the 23 acute HIV infections, 16 were detected at sexually transmitted disease clinics. Emergency measures for HIV prevention protected 48 sex partners and one fetus from high-risk exposure to HIV. CONCLUSIONS: The addition of nucleic acid amplification testing to an HIV testing algorithm significantly increases the identification of cases of infection without impairing the performance of diagnostic testing. The detection of highly contagious, acutely infected persons creates new opportunities for HIV surveillance and prevention.
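
    The headline point estimates can be checked directly from the counts quoted above. The sketch below redoes that arithmetic, assuming the algorithm's positive predictive value is computed as confirmed positives over all positive results; the confidence intervals reported in the study are not reproduced here.

```python
# Worked check of the point estimates quoted in the abstract.
total_detected    = 606   # all confirmed HIV infections found by the full algorithm
antibody_positive = 583   # detected by standard enzyme immunoassay / Western blot
false_pos_naat    = 2     # false positive nucleic acid amplification results

sensitivity_antibody = antibody_positive / total_detected
ppv_full_algorithm   = total_detected / (total_detected + false_pos_naat)
print(round(sensitivity_antibody, 3))   # ~0.962, as reported
print(round(ppv_full_algorithm, 3))     # ~0.997, as reported
```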

    Lambda Hyperons in 2 A*GeV Ni + Cu Collisions

    A sample of Lambdas produced in 2 A*GeV Ni + Cu collisions has been obtained with the EOS Time Projection Chamber at the Bevalac. Low background in the invariant mass distribution allows for the unambiguous demonstration of Lambda directed flow. The transverse mass spectrum at mid-rapidity has the characteristic shoulder-arm shape of particles undergoing radial transverse expansion. A linear dependence of Lambda multiplicity on impact parameter is observed, from which a total Lambda + Sigma^0 production cross section of 112 +/- 24 mb is deduced. Detailed comparisons with the ARC and RVUU models are made. Comment: Revised version accepted for publication in Phys. Lett.
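
    The invariant-mass reconstruction behind such a Lambda sample follows from the standard relativistic relation M^2 = E^2 - |p|^2 applied to the proton and pi- daughters. The sketch below is illustrative only, not the EOS analysis code; the example momenta are chosen to correspond to a Lambda decaying at rest.

```python
# Hedged sketch: reconstructing a Lambda candidate's invariant mass from
# assumed proton and pi- daughter three-momenta, in GeV/c.
import math

M_PROTON, M_PION = 0.938272, 0.139570   # GeV/c^2

def invariant_mass(p_proton, p_pion):
    """p_* are (px, py, pz) three-momenta in GeV/c; returns mass in GeV/c^2."""
    def energy(p, m):
        return math.sqrt(sum(c * c for c in p) + m * m)
    E = energy(p_proton, M_PROTON) + energy(p_pion, M_PION)
    psum = [a + b for a, b in zip(p_proton, p_pion)]
    return math.sqrt(E * E - sum(c * c for c in psum))

# Back-to-back daughters with |p| ~ 0.1006 GeV/c (a Lambda decaying at rest);
# prints ~1.1157, the Lambda mass.
print(round(invariant_mass((0.0, 0.0, 0.1006), (0.0, 0.0, -0.1006)), 4))
```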

    Strategies and performance of the CMS silicon tracker alignment during LHC Run 2

    The strategies for and the performance of the CMS silicon tracking system alignment during the 2015–2018 data-taking period of the LHC are described. The alignment procedures during and after data taking are explained. Alignment scenarios are also derived for use in the simulation of the detector response. Systematic effects, related to intrinsic symmetries of the alignment task or to external constraints, are discussed and illustrated for different scenarios.
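
    At its core, track-based alignment minimizes a chi^2 built from track-hit residuals over the alignment parameters. The toy example below estimates a single rigid shift of one module from simulated residuals; the real procedure fits many thousands of correlated parameters simultaneously, and the shift, hit resolution, and sample size used here are assumptions for illustration only.

```python
# Hedged illustration (not the CMS alignment software): estimating one module's
# rigid shift as the chi^2-minimizing offset of its track-hit residuals.
import numpy as np

rng = np.random.default_rng(0)
true_shift_um = 35.0                         # assumed misalignment, micrometers
hit_sigma_um = 20.0                          # assumed hit resolution
residuals = true_shift_um + rng.normal(0.0, hit_sigma_um, size=500)

# For a single rigid degree of freedom, minimizing
#   chi^2(d) = sum_i ((r_i - d) / sigma)^2
# gives d = mean(r_i), with uncertainty sigma / sqrt(N).
d_hat = residuals.mean()
d_err = hit_sigma_um / np.sqrt(len(residuals))
print(f"estimated shift: {d_hat:.1f} +/- {d_err:.1f} um")
```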

    Clusters of galaxies: setting the stage

    Clusters of galaxies are self-gravitating systems of mass ~10^14-10^15 Msun. They consist of dark matter (~80 %), hot diffuse intracluster plasma (< 20 %) and a small fraction of stars, dust, and cold gas, mostly locked in galaxies. In most clusters, scaling relations between their properties indicate that the cluster components are in approximate dynamical equilibrium within the cluster gravitational potential well. However, spatially inhomogeneous thermal and non-thermal emission of the intracluster medium (ICM), observed in some clusters in the X-ray and radio bands, and the kinematic and morphological segregation of galaxies are a signature of non-gravitational processes, ongoing cluster merging and interactions. In the current bottom-up scenario for the formation of cosmic structure, clusters are the most massive nodes of the filamentary large-scale structure of the cosmic web and form by anisotropic and episodic accretion of mass. In this model of the universe dominated by cold dark matter, at the present time most baryons are expected to be in a diffuse component rather than in stars and galaxies; moreover, ~50 % of this diffuse component has temperature ~0.01-1 keV and permeates the filamentary distribution of the dark matter. The temperature of this Warm-Hot Intergalactic Medium (WHIM) increases with the local density, and the search for it in the outer regions of clusters and in lower-density regions has been the focus of much recent observational effort. Over the last thirty years, an impressive coherent picture of the formation and evolution of cosmic structures has emerged from the intense interplay between observations, theory and numerical experiments. Future efforts will continue to test whether this picture remains valid, needs corrections or suffers dramatic failures in its predictive power. Comment: 20 pages, 8 figures, accepted for publication in Space Science Reviews, special issue "Clusters of galaxies: beyond the thermal view", Editor J.S. Kaastra, Chapter 2; work done by an international team at the International Space Science Institute (ISSI), Bern, organised by J.S. Kaastra, A.M. Bykov, S. Schindler & J.A.M. Bleeker

    Search for jet extinction in the inclusive jet-pT spectrum from proton-proton collisions at sqrt(s) = 8 TeV

    Published by the American Physical Society under the terms of the Creative Commons Attribution 3.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI. The first search at the LHC for the extinction of QCD jet production is presented, using data collected with the CMS detector corresponding to an integrated luminosity of 10.7 fb^-1 of proton-proton collisions at a center-of-mass energy of 8 TeV. The extinction model studied in this analysis is motivated by the search for signatures of strong gravity at the TeV scale (terascale gravity) and assumes the existence of string couplings in the strong-coupling limit. In this limit, the string model predicts the suppression of all high-transverse-momentum standard model processes, including jet production, beyond a certain energy scale. To test this prediction, the measured transverse-momentum spectrum is compared to the theoretical prediction of the standard model. No significant deficit of events is found at high transverse momentum. A 95% confidence level lower limit of 3.3 TeV is set on the extinction mass scale.

    Measurement of the inclusive production cross sections for forward jets and for dijet events with one forward and one central jet in pp collisions at sqrt(s) = 7 TeV

    The inclusive production cross sections for forward jets, as well as for jets in dijet events with one jet emitted at central and the other at forward pseudorapidities, are measured in the range of transverse momenta pt = 35-150 GeV/c in proton-proton collisions at sqrt(s) = 7 TeV by the CMS experiment at the LHC. Forward jets are measured within pseudorapidities 3.2<|eta|<4.7, and central jets within the |eta|<2.8 range. The double differential cross sections with respect to pt and eta are compared to predictions from three approaches in perturbative quantum chromodynamics: (i) next-to-leading-order calculations obtained with and without matching to parton-shower Monte Carlo simulations, (ii) PYTHIA and HERWIG parton-shower event generators with different tunes of parameters, and (iii) CASCADE and HEJ models, including different non-collinear corrections to standard single-parton radiation. The single-jet inclusive forward jet spectrum is well described by all models, but not all predictions are consistent with the spectra observed for the forward-central dijet events. Comment: Submitted to the Journal of High Energy Physics
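
    Schematically, a double-differential cross section in such a measurement is obtained by dividing efficiency-corrected jet counts in a (pt, eta) bin by the integrated luminosity and the bin widths. The sketch below shows only that bookkeeping with placeholder numbers; it is not the CMS analysis chain, and every value in it is an illustrative assumption.

```python
# Hedged sketch, not the CMS analysis: turning jet counts in a (pt, |eta|) bin
# into a double-differential cross section. All numbers below are placeholders.
n_observed = 1.2e5      # jets counted in the bin (illustrative)
efficiency = 0.92       # assumed selection efficiency
luminosity = 3.14e3     # integrated luminosity in nb^-1 (illustrative)
delta_pt   = 10.0       # pt bin width in GeV
delta_eta  = 2 * 1.5    # |eta| bin width, counting both eta signs

xsec = n_observed / (efficiency * luminosity * delta_pt * delta_eta)
print(f"d2sigma/dpt d|eta| ~ {xsec:.3g} nb / GeV")
```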