
    Feasibility, acceptability, and cost of tuberculosis testing by whole-blood interferon-gamma assay

    BACKGROUND: The whole-blood interferon-gamma release assay (IGRA) is recommended in some settings as an alternative to the tuberculin skin test (TST). Outcomes from field implementation of the IGRA for routine tuberculosis (TB) testing have not been reported. We evaluated feasibility, acceptability, and costs after 1.5 years of IGRA use in San Francisco under routine program conditions. METHODS: Patients seen at six community clinics serving homeless, immigrant, or injection-drug user (IDU) populations were routinely offered IGRA (Quantiferon-TB). Per guidelines, we excluded patients who were <17 years old, HIV-infected, immunocompromised, or pregnant. We reviewed medical records for IGRA results and completion of medical evaluation for TB, and at two clinics reviewed TB screening logs for instances of IGRA refusal or phlebotomy failure. RESULTS: Between November 1, 2003 and February 28, 2005, 4143 persons were evaluated by IGRA. 225 (5%) specimens were not tested, and 89 (2%) were IGRA-indeterminate. Positive or negative IGRA results were available for 3829 (92%). Of 819 patients with positive IGRA results, 524 (64%) completed diagnostic evaluation within 30 days of their IGRA test date. Among 503 patients eligible for IGRA testing at two clinics, phlebotomy was refused by 33 (7%) and failed in 40 (8%). Including phlebotomy, laboratory, and personnel costs, IGRA use cost $33.67 per patient tested. CONCLUSION: IGRA implementation in a routine TB control program setting was feasible and acceptable among homeless, IDU, and immigrant patients in San Francisco, with results available more frequently than historically described for the TST. Laboratory-based diagnosis and surveillance for M. tuberculosis infection is now possible.

    Charged Particle Production in Proton-, Deuteron-, Oxygen- and Sulphur-Nucleus Collisions at 200 GeV per Nucleon

    The transverse momentum and rapidity distributions of net protons and negatively charged hadrons have been measured for minimum bias proton-nucleus and deuteron-gold interactions, as well as central oxygen-gold and sulphur-nucleus collisions at 200 GeV per nucleon. The rapidity density of net protons at midrapidity in central nucleus-nucleus collisions increases both with target mass for sulphur projectiles and with the projectile mass for a gold target. The shape of the rapidity distributions of net protons forward of midrapidity is similar for d+Au and central S+Au collisions. The average rapidity loss is larger than 2 units of rapidity for reactions with the gold target. The transverse momentum spectra of net protons for all reactions can be described by a thermal distribution with 'temperatures' between 145 +- 11 MeV (p+S interactions) and 244 +- 43 MeV (central S+Au collisions). The multiplicity of negatively charged hadrons increases with the mass of the colliding system. The shape of the transverse momentum spectra of negatively charged hadrons changes from minimum bias p+p and p+S interactions to p+Au and central nucleus-nucleus collisions. The mean transverse momentum is almost constant in the vicinity of midrapidity and shows little variation with the target and projectile masses. The average number of produced negatively charged hadrons per participant baryon increases slightly from p+p and p+A to central S+S and S+Ag collisions. Comment: 47 pages, submitted to Z. Phys.
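    The 'temperatures' quoted above come from exponential fits to transverse-mass spectra. As a rough illustration of how such a fit works (not the paper's analysis code; the Boltzmann-like form, the synthetic data, and all parameter values below are assumptions), one can fit dN/dm_T ∝ m_T exp(-m_T/T) to a measured spectrum:

```python
# Illustrative sketch only: fitting a thermal "temperature" T to a
# transverse-mass spectrum of the assumed form dN/dm_T ~ m_T * exp(-m_T / T).
import numpy as np
from scipy.optimize import curve_fit

def thermal_spectrum(m_t, norm, T):
    """Boltzmann-like transverse-mass spectrum; T in GeV."""
    return norm * m_t * np.exp(-m_t / T)

# Synthetic example data (GeV and arbitrary yield units), standing in for a
# measured net-proton spectrum.
m_t = np.linspace(1.0, 2.5, 15)
rng = np.random.default_rng(0)
truth = thermal_spectrum(m_t, 1000.0, 0.2)
yields = rng.poisson(truth).astype(float)
errors = np.sqrt(np.maximum(yields, 1.0))

popt, pcov = curve_fit(thermal_spectrum, m_t, yields,
                       p0=(1000.0, 0.15), sigma=errors, absolute_sigma=True)
T_fit, T_err = popt[1], np.sqrt(pcov[1, 1])
print(f"fitted temperature T = {1000*T_fit:.0f} +- {1000*T_err:.0f} MeV")
```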

    Linear, Deterministic, and Order-Invariant Initialization Methods for the K-Means Clustering Algorithm

    Over the past five decades, k-means has become the clustering algorithm of choice in many application domains, primarily due to its simplicity, time/space efficiency, and invariance to the ordering of the data points. Unfortunately, the algorithm's sensitivity to the initial selection of the cluster centers remains its most serious drawback. Numerous initialization methods have been proposed to address this drawback. Many of these methods, however, have time complexity superlinear in the number of data points, which makes them impractical for large data sets. On the other hand, linear methods are often random and/or sensitive to the ordering of the data points. These methods are generally unreliable in that the quality of their results is unpredictable. Therefore, it is common practice to perform multiple runs of such methods and take the output of the run that produces the best results. Such a practice, however, greatly increases the computational requirements of the otherwise highly efficient k-means algorithm. In this chapter, we investigate the empirical performance of six linear, deterministic (non-random), and order-invariant k-means initialization methods on a large and diverse collection of data sets from the UCI Machine Learning Repository. The results demonstrate that two relatively unknown hierarchical initialization methods due to Su and Dy outperform the remaining four methods with respect to two objective effectiveness criteria. In addition, a recent method due to Erisoglu et al. performs surprisingly poorly. Comment: 21 pages, 2 figures, 5 tables, Partitional Clustering Algorithms (Springer, 2014). arXiv admin note: substantial text overlap with arXiv:1304.7465, arXiv:1209.196
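    The multiple-restart practice mentioned above is easy to illustrate. The sketch below (a hedged example using scikit-learn on synthetic data, not the chapter's benchmark of the six deterministic initializers) runs k-means once with a random initialization and then with ten restarts, keeping the run with the lowest within-cluster sum of squares:

```python
# Sketch of the multiple-restart practice: with a random initializer the result
# depends on the seed, so k-means is typically restarted several times and the
# run with the lowest inertia (within-cluster sum of squares) is kept.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=1000, centers=5, random_state=0)

# Single random initialization: quality varies from seed to seed.
single = KMeans(n_clusters=5, init="random", n_init=1, random_state=1).fit(X)

# Ten random restarts: the best inertia is kept, at roughly ten times the cost.
multi = KMeans(n_clusters=5, init="random", n_init=10, random_state=1).fit(X)

print("inertia, single run :", single.inertia_)
print("inertia, best of 10 :", multi.inertia_)
```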

    Addiction to the nicotine gum in never smokers

    Background: Addiction to nicotine gum has never been described in never smokers or in never users of tobacco. Methods: Internet questionnaire in 2004–2006 in a self-selected sample of 434 daily users of nicotine gum. To assess dependence on nicotine gum, we used modified versions of the Nicotine Dependence Syndrome Scale (NDSS), the Fagerström Test for Nicotine Dependence and the Cigarette Dependence Scale. Results: Five never smokers used the nicotine gum daily. They had been using the nicotine gum for longer than the 429 ever smokers (median = 6 years vs 0.8 years, p = 0.004), and they had higher NDSS-gum Tolerance scores (median = 0.73 vs -1.0, p = 0.03), a difference of 1.5 standard deviation units. Two never smokers had never used smokeless tobacco; both answered "extremely true" to: "I use nicotine gums because I am addicted to them", both "fully agreed" with: "after a few hours without chewing a nicotine gum, I feel an irresistible urge to chew one" and: "I am a prisoner of nicotine gum". Conclusion: This is to our knowledge the first report of addiction to nicotine gum in never users of tobacco. However, this phenomenon is rare, and although the long-term effect of nicotine gum is unknown, this product is significantly less harmful than tobacco.

    Contrast-enhanced CMR in patients after percutaneous closure of the left atrial appendage: A pilot study

    Background: To evaluate the feasibility and value of first-pass contrast-enhanced dynamic and post-contrast 3D CMR in patients after transcatheter occlusion of the left atrial appendage (LAA) to identify incorrect placement and persistent leaks. Methods: 7 patients with different occluder systems (n = 4 PLAATO; n = 2 Watchman; n = 1 ACP) underwent 2 contrast-enhanced (Gd-DOTA) CMR sequences (2D TrueFISP first-pass perfusion and 3D-TurboFLASH) to assess localization, artifact size and potential leaks of the devices. Perfusion CMR was analyzed visually and semi-quantitatively to identify potential leaks. Results: All occluders were positioned within the LAA. The ACP occluder presented the most extensive artifact size. Visual assessment revealed residual perfusion of the LAA apex in 4 cases using first-pass perfusion and 3D-TurboFLASH, indicating a suboptimal LAA occlusion. By assessing signal-to-time curves, the cases with a visually detected leak showed a 9-fold higher signal peak in the LAA apex (567 ± 120% increase from baseline signal) than those without a leak (61 ± 22%; p < 0.03). In contrast, the signal increase in the LAA proximal to the occluder showed no difference (leak 481 ± 201% vs. no leak 478 ± 125%; p = 0.48). Conclusion: This CMR pilot study provides valuable non-invasive information in patients after transcatheter occlusion of the LAA to identify correct placement and potential leaks. We recommend incorporating CMR in future clinical studies to evaluate new device types.
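    The semi-quantitative leak criterion above rests on the percent signal increase over baseline in the LAA apex. A minimal sketch of that computation is given below; the function, the number of baseline frames and the signal-time curves are hypothetical stand-ins, not the study's analysis pipeline:

```python
# Minimal sketch: peak signal increase in a region of interest, expressed as a
# percentage of the pre-contrast baseline signal of a first-pass perfusion curve.
import numpy as np

def percent_signal_increase(signal_time_curve, n_baseline=3):
    """Peak enhancement relative to the mean of the first n_baseline frames."""
    curve = np.asarray(signal_time_curve, dtype=float)
    baseline = curve[:n_baseline].mean()
    return 100.0 * (curve.max() - baseline) / baseline

# Hypothetical first-pass curves (arbitrary signal units per dynamic frame).
apex_with_leak = [100, 102, 98, 240, 520, 610, 480, 300]
apex_no_leak   = [100, 101, 99, 110, 140, 160, 150, 130]

print(f"leak case   : {percent_signal_increase(apex_with_leak):.0f}% increase")
print(f"no-leak case: {percent_signal_increase(apex_no_leak):.0f}% increase")
```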

    Mechanics of fragmentation of crocodile skin and other thin films

    Fragmentation of thin layers of materials is mediated by a network of cracks on the surface. It is commonly seen in dehydrated paintings or asphalt pavements and even in graphene or other two-dimensional materials, but is also observed in the characteristic polygonal pattern on a crocodile's head. Here, we build a simple mechanical model of a thin film and investigate the generation and development of fragmentation patterns as the material is exposed to various modes of deformation. We find that the characteristic size of fragmentation, defined by the mean diameter of the polygons, is strictly governed by the mechanical properties of the film material. Our result demonstrates that skin fragmentation on the head of crocodiles is dominated by the small ratio between the skin's fracture energy and Young's modulus, and the patterns agree well with experimental observations. Understanding this mechanics-driven process could be applied to improve the lifetime and reliability of thin film coatings by mimicking crocodile skin.
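    A back-of-the-envelope energy balance (an illustrative scaling argument consistent with the abstract's claim, not the paper's model; h, ε, Γ and E below are assumed symbols for film thickness, strain, fracture energy per unit area and Young's modulus) shows why the ratio of fracture energy to modulus sets a fragment length scale:

```latex
% Illustrative scaling only. The elastic energy stored in a fragment of size d
% and thickness h under strain eps is balanced against the fracture energy of
% the cracks bounding it (perimeter of order d):
\[
  \underbrace{\tfrac{1}{2}\, E\, \varepsilon^{2}\, h\, d^{2}}_{\text{stored elastic energy}}
  \;\sim\;
  \underbrace{\Gamma\, h\, d}_{\text{cost of bounding cracks}}
  \quad\Longrightarrow\quad
  d \;\sim\; \frac{2\,\Gamma}{E\,\varepsilon^{2}} ,
\]
% so a small Gamma/E, as argued above for crocodile head skin, yields small polygons.
```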

    Impacts of climate change on plant diseases – opinions and trends

    There has been a remarkable scientific output on the topic of how climate change is likely to affect plant diseases in the coming decades. This review addresses the need to synthesize this burgeoning literature by summarizing the opinions of previous reviews and the trends in recent studies on the impacts of climate change on plant health. Sudden Oak Death is used as an introductory case study: Californian forests could become even more susceptible to this emerging plant disease if spring precipitation is accompanied by warmer temperatures, although climate shifts may also affect the current synchronicity between host cambium activity and pathogen colonization rate. A summary of observed and predicted climate changes, as well as of the direct effects of climate change on pathosystems, is provided. Prediction and management of climate change effects on plant health are complicated by indirect effects and by interactions with other global change drivers. Uncertainty in models of plant disease development under climate change calls for a diversity of management strategies, from more participatory approaches to interdisciplinary science. Involvement of stakeholders and of scientists from outside plant pathology highlights the importance of trade-offs, for example in the land-sharing vs. land-sparing debate. Further research is needed on climate change and plant health in mountain, boreal, Mediterranean and tropical regions, with multiple climate change factors and scenarios (including our responses to climate change, e.g. the assisted migration of plants), in relation to endophytes, viruses and mycorrhizae, using long-term and large-scale datasets, and considering various plant disease control methods.

    Linking the Epigenome to the Genome: Correlation of Different Features to DNA Methylation of CpG Islands

    DNA methylation of CpG islands plays a crucial role in the regulation of gene expression. More than half of all human promoters contain CpG islands with a tissue-specific methylation pattern in differentiated cells. The process by which DNA methyltransferases determine which regions should be methylated is still not completely understood, and many hypotheses about which genomic features are correlated with the epigenome have not yet been evaluated. Furthermore, many explorative approaches to measuring DNA methylation are limited to a subset of the genome and thus cannot be employed, e.g., for genome-wide biomarker prediction methods. In this study, we evaluated the correlation of genetic, epigenetic and hypothesis-driven features with DNA methylation of CpG islands. To this end, various binary classifiers were trained and evaluated by cross-validation on a dataset comprising DNA methylation data for 190 CpG islands in HEPG2, HEK293, fibroblasts and leukocytes. We achieved an accuracy of up to 91% with an MCC of 0.8 using ten-fold cross-validation and ten repetitions. With these models, we extended the existing dataset to the whole genome and thus predicted the methylation landscape for the given cell types. The method used for these predictions was additionally validated on an external genome-wide dataset from the ENCODE consortium. Our results reveal features correlated with DNA methylation and confirm or refute various hypotheses about DNA methylation-related features; in particular, this study confirms correlations between DNA methylation and histone modifications, DNA structure, DNA sequence, genomic attributes and CpG island properties. The developed software, the predicted datasets and a web service to compare methylation states of CpG islands are available at http://www.cogsys.cs.uni-tuebingen.de/software/dna-methylation/
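    The evaluation protocol described above (repeated ten-fold cross-validation scored with the Matthews correlation coefficient) can be sketched as follows; the classifier, the synthetic features and all parameters are placeholders, not the study's actual pipeline or data:

```python
# Illustrative sketch of repeated ten-fold cross-validation scored with MCC,
# on synthetic stand-in data for 190 CpG islands with binary methylation labels.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import make_scorer, matthews_corrcoef
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

# Stand-in feature matrix: 190 "CpG islands" x 30 genetic/epigenetic features.
X, y = make_classification(n_samples=190, n_features=30, n_informative=10,
                           random_state=0)

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                         scoring=make_scorer(matthews_corrcoef), cv=cv)
print(f"MCC over 10x10 cross-validation: {scores.mean():.2f} +- {scores.std():.2f}")
```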

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z to bbbar events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z to bbbar decays were tagged using displaced secondary vertices, and high momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 +- 0.0011 +- 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z to ccbar events in hadronic Z decays, is not included in the errors. The dependence on Rc is Delta(Rb)/Rb = -0.056*Delta(Rc)/Rc, where Delta(Rc) is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 +- 0.0003 predicted by the Standard Model. Comment: 42 pages, LaTeX, 14 eps figures included, submitted to European Physical Journal
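    In its simplest form (neglecting charm and light-quark backgrounds and hemisphere correlations; an illustration of the idea rather than the full OPAL fit), the double tagging technique determines both the b-tagging efficiency and Rb from the data alone:

```latex
% Simplified double-tagging relations. With f_1 the fraction of hemispheres
% that are b-tagged and f_2 the fraction of events with both hemispheres
% tagged (backgrounds and correlations neglected):
\[
  f_1 = \epsilon_b R_b, \qquad
  f_2 = \epsilon_b^{2} R_b
  \quad\Longrightarrow\quad
  \epsilon_b = \frac{f_2}{f_1}, \qquad
  R_b = \frac{f_1^{2}}{f_2},
\]
% so the efficiency is measured from data rather than simulation, which is why
% the systematic uncertainties are reduced, as stated in the abstract.
```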