Tuberculosis and Human Immunodeficiency Virus Infection
Progressive human immunodeficiency virus infection eventually leads to activation and dissemination of a wide variety of microorganisms normally held in check by the cellular immune system. Mycobacterium tuberculosis is one of these pathogens, and the disease caused by it has become a common presenting infection in the patient with AIDS. Dr. Richard E. Chaisson and Dr. Gary Slutkin have studied tuberculosis in the United States and worldwide, respectively. In this AIDS Commentary they address the unique nature of this infection, its diagnosis, and its treatment in the patient with AIDS.
A Model of Habitability Within the Milky Way Galaxy
We present a model of the Galactic Habitable Zone (GHZ), described in terms
of the spatial and temporal dimensions of the Galaxy that may favour the
development of complex life. The Milky Way galaxy is modelled using a
computational approach by populating stars and their planetary systems on an
individual basis using Monte-Carlo methods. We begin with well-established
properties of the disk of the Milky Way, such as the stellar number density
distribution, the initial mass function, the star formation history, and the
metallicity gradient as a function of radial position and time. We vary some of
these properties, creating four models to test the sensitivity of our
assumptions. To assess habitability on the Galactic scale, we model supernova
rates, planet formation, and the time required for complex life to evolve. Our
study improves on other literature on the GHZ by populating stars on an
individual basis and by modelling SNII and SNIa sterilizations by selecting
their progenitors from within this preexisting stellar population. Furthermore,
we consider habitability on tidally locked and non-tidally locked planets
separately, and study habitability as a function of height above and below the
Galactic midplane. In the model that most accurately reproduces the properties
of the Galaxy, the results indicate that an individual SNIa is ~5.6 \times more
lethal than an individual SNII on average. In addition, we predict that ~1.2%
of all stars host a planet that may have been capable of supporting complex
life at some point in the history of the Galaxy. Of those stars with a
habitable planet, ~75% of planets are predicted to be in a tidally locked
configuration with their host star. The majority of these planets that may
support complex life are found towards the inner Galaxy, distributed within,
and significantly above and below, the Galactic midplane.Comment: Accepted for publication in Astrobiology. 40 pages, 12 figures, 3
table
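The abstract above describes populating the Galactic disk star by star with Monte-Carlo methods, drawing from an initial mass function (IMF) and then identifying supernova progenitors within that synthetic population. A minimal sketch of that idea, assuming a simple Salpeter power-law IMF and an 8-solar-mass SNII progenitor threshold (illustrative choices, not the paper's actual model parameters):

```python
import random

def sample_imf_mass(m_min=0.1, m_max=100.0, alpha=2.35):
    """Draw one stellar mass (in solar masses) from a Salpeter-like
    power-law IMF, p(m) ~ m**-alpha, by inverse-transform sampling."""
    u = random.random()
    a = 1.0 - alpha
    return (m_min**a + u * (m_max**a - m_min**a)) ** (1.0 / a)

# Populate a toy stellar sample on an individual basis, then select
# SNII progenitors (masses above ~8 Msun) from that same population,
# mirroring the paper's approach of picking progenitors from within
# the pre-existing synthetic stars.
random.seed(42)
stars = [sample_imf_mass() for _ in range(100_000)]
snii_progenitors = sum(1 for m in stars if m > 8.0)
```

Because the IMF is steep, only a small fraction of the sampled stars (roughly a few per thousand) end up above the SNII threshold, which is why individual-star sampling matters for counting sterilization events.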
Tuberculosis preventive therapy: An underutilised strategy to reduce individual risk of TB and contribute to TB control
Tuberculosis (TB) remains a global health problem, and South Africa (SA) has one of the world’s worst TB epidemics. The World Health Organization (WHO) estimated in 1999 that one-third of the world’s population was latently infected with TB. In SA up to 88% of HIV-uninfected young adults (31 - 35 years) are latently infected with TB. In the most recent meta-analysis, 6 - 12 months of isoniazid preventive therapy (IPT) was associated with a lower incidence of active TB than placebo (relative risk (RR) 0.68; 95% confidence interval (CI) 0.54 - 0.85), with the greatest benefit among individuals with a positive tuberculin skin test (TST) (RR 0.38; 95% CI 0.25 - 0.57). A clinical trial of IPT given with antiretroviral therapy (ART) for 12 months reduced TB incidence by 37% compared with ART alone (hazard ratio (HR) 0.63; 95% CI 0.41 - 0.94). The durability of IPT’s protective effect is limited in high-burden countries. IPT for 36 months v. 6 months reduced TB incidence among HIV-positive, TST-positive participants by 74% (HR 0.26; 95% CI 0.09 - 0.80). A study of more than 24 000 goldminers confirmed that IPT is safe, with only 0.5% experiencing adverse events. A meta-analysis of studies of IPT since 1951 did not show an increased risk of developing isoniazid resistance. Alternative TB preventive therapy regimens, including high-dose isoniazid and rifapentine given weekly for 3 months, have been shown to have efficacy similar to that of IPT. Mathematical modelling suggests that scaling up continuous IPT targeted to HIV-positive persons, when used in combination with other treatment and prevention strategies, may substantially improve TB control.
Reply to “At the crossroads between early or delayed antiretroviral therapy initiation during TB/HIV coinfection”
The digitisation of texts, which began in the 1970s, has produced a diversity of systems and products that can be very useful in literary research. One of the best known, hypertext, is a good example of the possibilities of the non-sequential reading that characterises reference works or certain lines of research in the field of philology, such as textual editing. Digitisation highlights both the hypertextual and the intertextual characteristics of literature and thereby helps to explain some of its constitutive features. Moreover, the online publication of texts that are otherwise very difficult to access, undertaken by scholarly communities with a strong presence on the Internet, is an offer full of possibilities and suggests a path forward for communities that are still little established online, such as Catalan philology.
Purine-Rich Foods Intake and Recurrent Gout Attacks
OBJECTIVE: To examine and quantify the relation between purine intake and the risk of recurrent gout attacks among gout patients. METHODS: The authors conducted a case-crossover study to examine associations of a set of putative risk factors with recurrent gout attacks. Individuals with gout were prospectively recruited and followed online for 1 year. Participants were asked about the following information when experiencing a gout attack: the onset date of the gout attack, clinical symptoms and signs, medications (including antigout medications), and presence of potential risk factors (including daily intake of various purine-containing food items) during the 2-day period prior to the gout attack. The same exposure information was also assessed over 2-day control periods. RESULTS: This study included 633 participants with gout. Compared with the lowest quintile of total purine intake over a 2-day period, the odds ratios (ORs) of recurrent gout attacks were 1.17, 1.38, 2.21 and 4.76, respectively, with each increasing quintile (p for trend <0.001). The corresponding ORs were 1.42, 1.34, 1.77 and 2.41 for increasing quintiles of purine intake from animal sources (p for trend <0.001), and 1.12, 0.99, 1.32 and 1.39 from plant sources (p=0.04), respectively. The effect of purine intake persisted across subgroups by sex and by use of alcohol, diuretics, allopurinol, NSAIDs and colchicine. CONCLUSIONS: The study findings suggest that acute purine intake increases the risk of recurrent gout attacks by almost fivefold among gout patients. Avoiding or reducing the intake of purine-rich foods, especially those of animal origin, may help reduce the risk of gout attacks.
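The case-crossover design above compares exposure during hazard periods (the 2 days before an attack) with exposure during control periods, summarised as an odds ratio per quintile. A minimal sketch of the crude odds-ratio calculation from a 2x2 table; the counts below are purely illustrative, not data from the study:

```python
def odds_ratio(exposed_case, unexposed_case, exposed_control, unexposed_control):
    """Crude odds ratio from a 2x2 table: exposure counts in hazard
    (case) periods versus control periods, as in a case-crossover design."""
    return (exposed_case * unexposed_control) / (unexposed_case * exposed_control)

# Hypothetical counts for top-quintile purine intake vs the reference
# quintile (illustrative numbers only).
or_top = odds_ratio(120, 80, 40, 127)
```

In the actual study, conditional logistic regression matching each participant's hazard periods to their own control periods would be used rather than a pooled 2x2 table, which is what controls for stable within-person confounders.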
Reconstructing complex regions of genomes using long-read sequencing technology
Obtaining high-quality sequence continuity of complex regions of recent segmental duplication remains one of the major challenges of finishing genome assemblies. In the human and mouse genomes, this was achieved by targeting large-insert clones using costly and laborious capillary-based sequencing approaches. Sanger shotgun sequencing of clone inserts, however, has now been largely abandoned, leaving most of these regions unresolved in newer genome assemblies generated primarily by next-generation sequencing hybrid approaches. Here we show that it is possible to resolve regions that are complex in a genome-wide context but simple in isolation for a fraction of the time and cost of traditional methods using long-read single-molecule, real-time (SMRT) sequencing and assembly technology from Pacific Biosciences (PacBio). We sequenced and assembled BAC clones corresponding to a 1.3-Mbp complex region of chromosome 17q21.31, demonstrating 99.994% identity to Sanger assemblies of the same clones. We targeted 44 differences using Illumina sequencing and found that PacBio and Sanger assemblies share a comparable number of validated variants, albeit with different sequence context biases. Finally, we targeted a poorly assembled 766-kbp duplicated region of the chimpanzee genome and resolved its structure and organization for a fraction of the cost and time of traditional finishing approaches. Our data suggest a straightforward path for upgrading genomes to a higher-quality finished state.
Discovery and genotyping of structural variation from long-read haploid genome sequence data
In an effort to more fully understand the spectrum of human genetic variation, we generated deep single-molecule, real-time (SMRT) sequencing data from two haploid human genomes. By using an assembly-based approach (SMRT-SV), we systematically assessed each genome independently for structural variants (SVs) and indels, resolving the sequence structure of 461,553 genetic variants from 2 bp to 28 kbp in length. We find that >89% of these variants were missed in the analysis of the 1000 Genomes Project, even after adjusting for more common variants (MAF > 1%). We estimate that this theoretical human diploid differs by as much as ∼16 Mbp with respect to the human reference, with long-read sequencing data providing a fivefold increase in sensitivity for genetic variants ranging in size from 7 bp to 1 kbp compared with short-read sequence data. Although a large fraction of genetic variants were not detected by short-read approaches, once the alternate allele is sequence-resolved, we show that 61% of SVs can be genotyped in short-read sequence data sets with high accuracy. Uncoupling discovery from genotyping thus allows for the majority of this missed common variation to be genotyped in the human population. Interestingly, when we repeat SV detection on a pseudodiploid genome constructed in silico by merging the two haploids, we find that ∼59% of the heterozygous SVs are no longer detected by SMRT-SV. These results indicate that haploid resolution of long-read sequencing data will significantly increase the sensitivity of SV detection.
Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics
In this philosophical paper, we explore computational and biological
analogies to address the fine-tuning problem in cosmology. We first clarify
what it means for physical constants or initial conditions to be fine-tuned. We
review important distinctions such as the dimensionless and dimensional
physical constants, and the classification of constants proposed by
Levy-Leblond. Then we explore how two great analogies, computational and
biological, can give new insights into our problem. This paper includes a
preliminary study to examine the two analogies. Importantly, analogies are both
useful and fundamental cognitive tools, but can also be misused or
misinterpreted. The idea that our universe might be modelled as a computational
entity is analysed, and we discuss the distinction between physical laws and
initial conditions using algorithmic information theory. Smolin introduced the
theory of "Cosmological Natural Selection" with a biological analogy in mind.
We examine an extension of this analogy involving intelligent life. We discuss
if and how this extension could be legitimated.
Keywords: origin of the universe, fine-tuning, physical constants, initial
conditions, computational universe, biological universe, role of intelligent
life, cosmological natural selection, cosmological artificial selection,
artificial cosmogenesis. Comment: 25 pages, Foundations of Science, in press.