    A Case-Control Study to Identify Community Venues Associated with Genetically-clustered, Multidrug-resistant Tuberculosis Disease in Lima, Peru

    Background: The majority of tuberculosis transmission occurs in community settings. Our primary aim in this study was to assess the association between exposure to community venues and multidrug-resistant (MDR) tuberculosis. Our secondary aim was to describe the social networks of MDR tuberculosis cases and controls.

    Methods: We recruited laboratory-confirmed MDR tuberculosis cases and community controls, matched on age and sex. Whole-genome sequencing was used to identify genetically clustered cases. Venue-tracing interviews (nonblinded) were conducted to enumerate the community venues frequented by participants. Logistic regression was used to assess the association between MDR tuberculosis and person-time spent in community venues. A location-based social network was constructed, with respondents connected if they reported frequenting the same venue, and an exponential random graph model (ERGM) was fitted to the network.

    Results: We enrolled 59 cases and 65 controls. Participants reported 729 unique venues. The mean number of venues reported was similar in both groups (P = .92). Person-time in healthcare venues (adjusted odds ratio [aOR] = 1.67, P = .01), schools (aOR = 1.53, P < .01), and transportation venues (aOR = 1.25, P = .03) was associated with MDR tuberculosis. Healthcare venues, markets, cinemas, and transportation venues were commonly shared among clustered cases. The ERGM indicated significant community segregation between cases and controls. Case networks were more densely connected.

    Conclusions: Exposure to healthcare venues, schools, and transportation venues was associated with MDR tuberculosis. Interventions across the segregated network of case venues may be necessary to effectively stem transmission.
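
    The location-based network in the Methods is the one-mode projection of a bipartite person-venue graph: respondents become linked when they report at least one venue in common. The sketch below builds such a projection in Python with networkx, assuming a hypothetical list of (participant, venue) interview records; the authors' actual data handling, and the ERGM fit itself (commonly done with R's ergm/statnet tools), are not shown.

        import networkx as nx
        from networkx.algorithms import bipartite

        # Hypothetical venue-tracing records: (participant, venue) pairs.
        records = [("case_01", "market_A"), ("case_02", "market_A"),
                   ("case_02", "school_B"), ("ctrl_01", "school_B")]

        # Bipartite person-venue graph.
        B = nx.Graph()
        people = {p for p, _ in records}
        venues = {v for _, v in records}
        B.add_nodes_from(people, bipartite=0)
        B.add_nodes_from(venues, bipartite=1)
        B.add_edges_from(records)

        # One-mode projection onto people: an edge joins two respondents
        # who reported frequenting the same venue.
        G = bipartite.projected_graph(B, people)
        print(sorted(tuple(sorted(e)) for e in G.edges()))
        # [('case_01', 'case_02'), ('case_02', 'ctrl_01')]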

    Simulating Reionization: Character and Observability

    In recent years there has been considerable progress in our understanding of the nature and properties of the reionization process. In particular, numerical simulations of this epoch have made a qualitative leap forward, reaching sufficiently large scales to capture the characteristic scales of the reionization process and thus allowing realistic observational predictions. Our group has recently performed the first such large-scale radiative transfer simulations of reionization, run on top of state-of-the-art simulations of early structure formation. This allowed us to make the first realistic observational predictions about the Epoch of Reionization based on detailed radiative transfer and structure formation simulations. We discuss the basic features of reionization derived from our simulations and some recent results on the observational implications for high-redshift Ly-alpha sources.

    A Fluid EOQ Model of Perishable Items with Intermittent High and Low Demand Rates

    Self-Supervised Discovery of Anatomical Shape Landmarks

    Statistical shape analysis is a very useful tool in a wide range of medical and biological applications. However, it typically relies on the ability to produce a relatively small number of features that capture the relevant variability in a population. State-of-the-art methods for obtaining such anatomical features rely on extensive preprocessing or segmentation, and/or on significant tuning and post-processing. These shortcomings limit the widespread use of shape statistics. We propose that effective shape representations should provide sufficient information to align/register images. Using this assumption, we propose a self-supervised neural-network approach for automatically positioning and detecting landmarks in images that can be used for subsequent analysis. The network discovers landmarks corresponding to anatomical shape features that promote good image registration in the context of a particular class of transformations. In addition, we propose a regularization for the network that encourages a uniform spatial distribution of the discovered landmarks. We present a complete framework which takes only a set of input images and produces landmarks that are immediately usable for statistical shape analysis. We evaluate performance on a phantom dataset as well as on 2D and 3D images.
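
    The abstract does not specify the form of the uniformity regularization; one common way to encourage spread-out landmarks is a pairwise repulsion penalty on the predicted coordinates. The PyTorch sketch below is a hypothetical illustration of that idea, not the authors' formulation.

        import torch

        def repulsion_loss(landmarks, eps=1e-6):
            """Penalty that grows as landmarks crowd together.

            landmarks: (K, D) tensor of K landmark coordinates in D dimensions.
            Returns the mean inverse pairwise distance, which is minimized
            when the landmarks spread out over the image domain.
            """
            d = torch.cdist(landmarks, landmarks)        # (K, K) pairwise distances
            off_diag = ~torch.eye(landmarks.shape[0], dtype=torch.bool)
            return (1.0 / (d[off_diag] + eps)).mean()    # ignore zero self-distances

        # Example: 8 random 2D landmarks in the unit square.
        pts = torch.rand(8, 2, requires_grad=True)
        loss = repulsion_loss(pts)
        loss.backward()  # in training, this gradient pushes landmarks apart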

    The effect of beta-alanine supplementation on neuromuscular fatigue in elderly (55–92 years): a double-blind randomized study

    Background: Ageing is associated with a significant reduction in skeletal muscle carnosine, which has been linked to reduced muscle buffering capacity and, in theory, may increase the rate of fatigue during exercise. Beta-alanine supplementation has been shown to significantly increase skeletal muscle carnosine. The purpose of this study, therefore, was to examine the effects of ninety days of beta-alanine supplementation on the physical working capacity at the fatigue threshold (PWC_FT) in elderly men and women.

    Methods: Using a double-blind, placebo-controlled design, twenty-six men (n = 9) and women (n = 17) (age ± SD = 72.8 ± 11.1 yrs) were randomly assigned to either a beta-alanine (BA: 800 mg × 3 per day; n = 12; CarnoSyn™) or placebo (PL; n = 14) group. Before (pre) and after (post) the supplementation period, participants performed a discontinuous cycle ergometry test to determine the PWC_FT.

    Results: A significant increase in PWC_FT (28.6%) from pre- to post-supplementation was found for the BA group (p < 0.05), but no change was observed with PL treatment. These findings suggest that ninety days of BA supplementation may increase physical working capacity by delaying the onset of neuromuscular fatigue in elderly men and women.

    Conclusion: We suggest that BA supplementation, by improving intracellular pH control, improves muscle endurance in the elderly. This, we believe, could be important in the prevention of falls and the maintenance of health and independent living in elderly men and women.

    One-sided versus two-sided stochastic descriptions

    It is well known that discrete-time finite-state Markov chains, described by one-sided conditional probabilities in which dependence on the past reduces to dependence on the present state, can also be described as one-dimensional Markov fields, that is, nearest-neighbour Gibbs measures for finite-spin models, which are specified by two-sided conditional probabilities. In such Markov fields the temporal interpretation of past and future is replaced by the spatial interpretation of an interior volume surrounded by an exterior to the left and to the right. If we relax the Markov requirement to weak dependence, that is, continuous dependence, either on the past (generalising the Markov-chain description) or on the external configuration (generalising the Markov-field description), this equivalence breaks down, and neither class contains the other. In one direction this result has been known for some years; in the opposite direction a counterexample was found only recently. Our counterexample is based on the phenomenon of entropic repulsion in long-range Ising (or "Dyson") models.
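
    Written out in standard notation (ours, not quoted from the paper), the two kinds of conditional specification being contrasted are:

        % One-sided (Markov-chain) description: the law of the present
        % given the entire past; the Markov property keeps only x_{-1}.
        P\bigl(x_0 \mid x_{-1}, x_{-2}, \ldots\bigr) = P\bigl(x_0 \mid x_{-1}\bigr)

        % Two-sided (Markov-field) description: the law of the present
        % given everything else; nearest-neighbour keeps only x_{-1}, x_{+1}.
        P\bigl(x_0 \mid x_m,\ m \neq 0\bigr) = P\bigl(x_0 \mid x_{-1}, x_{+1}\bigr)

    Relaxing the Markov property to mere continuity of these conditional probabilities gives the g-measures on the one-sided side and the Gibbs measures on the two-sided side; the result discussed here is that neither class contains the other.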

    No imminent quantum supremacy by boson sampling

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate this quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that the problem of sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently formidable challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far involve 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), solved the boson sampling problem for 30 photons on standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology.
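
    The permanents that govern photon statistics can still be evaluated exactly for moderate matrix sizes; the standard exact method is Ryser's formula, far cheaper than the naive n!-term expansion though still exponential. A minimal sketch (a textbook formula, not the authors' MIS sampler):

        from itertools import combinations
        import numpy as np

        def permanent_ryser(A):
            """Exact permanent of an n x n matrix via Ryser's formula:

            perm(A) = (-1)^n * sum over nonempty column subsets S of
                      (-1)^{|S|} * prod_i sum_{j in S} A[i, j].

            This plain version costs O(n^2 * 2^n); Gray-code variants
            reach O(n * 2^n). Either way the cost is exponential in n.
            """
            n = A.shape[0]
            total = 0.0
            for r in range(1, n + 1):
                for cols in combinations(range(n), r):
                    row_sums = A[:, list(cols)].sum(axis=1)
                    total += (-1) ** r * np.prod(row_sums)
            return (-1) ** n * total

        print(permanent_ryser(np.ones((4, 4))))  # 4! = 24.0 for the all-ones matrix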

    Using ESTs to improve the accuracy of de novo gene prediction

    BACKGROUND: ESTs are a tremendous resource for determining the exon-intron structures of genes, but even extensive EST sequencing tends to leave many exons and genes untouched. Gene prediction systems based exclusively on EST alignments miss these exons and genes, leading to poor sensitivity. De novo gene prediction systems, which ignore ESTs in favor of genomic sequence, can predict such "untouched" exons, but they are less accurate when predicting exons to which ESTs align. TWINSCAN is the most accurate de novo gene finder available for nematodes, and N-SCAN is the most accurate for mammals, as measured by exact CDS gene prediction and exact exon prediction. RESULTS: TWINSCAN_EST is a new system that successfully combines EST alignments with TWINSCAN. On the whole C. elegans genome, TWINSCAN_EST shows a 14% improvement in sensitivity and a 13% improvement in specificity in predicting exact gene structures compared to TWINSCAN without EST alignments. Not only are the structures revealed by EST alignments predicted correctly, but they also constrain the predictions in regions without alignments, improving their accuracy. For the human genome, we used the same approach with N-SCAN, creating N-SCAN_EST. On the whole genome, N-SCAN_EST produced a 6% improvement in sensitivity and a 1% improvement in specificity of exact gene structure predictions compared to N-SCAN. CONCLUSION: TWINSCAN_EST and N-SCAN_EST are more accurate than TWINSCAN and N-SCAN, while retaining their ability to discover novel genes to which no ESTs align. Thus, we recommend using the EST versions of these programs to annotate any genome for which EST information is available. TWINSCAN_EST and N-SCAN_EST are part of the TWINSCAN open-source software package.
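
    In the gene-prediction literature (following the Burset and Guigo convention), sensitivity and specificity at the exact-gene level are TP/(TP+FN) and TP/(TP+FP) respectively, so "specificity" here is what other fields call precision. A small illustration with hypothetical gene structures, not the actual TWINSCAN/N-SCAN evaluation code:

        def exact_gene_accuracy(predicted: set, annotated: set):
            """Exact-gene sensitivity and specificity (Burset/Guigo sense).

            A prediction counts as a true positive only if its entire
            structure matches an annotated gene exactly.
            """
            tp = len(predicted & annotated)      # exactly correct predictions
            sensitivity = tp / len(annotated)    # fraction of real genes found
            specificity = tp / len(predicted)    # fraction of predictions correct
            return sensitivity, specificity

        # Each gene = a tuple of (start, end) exon coordinates (hypothetical).
        pred = {((10, 50), (80, 120)), ((200, 260),)}
        anno = {((10, 50), (80, 120)), ((300, 380),)}
        print(exact_gene_accuracy(pred, anno))   # (0.5, 0.5)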