
    A Bayesian method for calculating real-time quantitative PCR calibration curves using absolute plasmid DNA standards

    Abstract

    Background: In real-time quantitative PCR studies using absolute plasmid DNA standards, a calibration curve is developed to estimate an unknown DNA concentration. However, potential differences in the amplification performance of plasmid DNA compared to genomic DNA standards are often ignored in calibration calculations, and in some cases they are impossible to characterize. A flexible statistical method is needed that can account for uncertainty between plasmid and genomic DNA targets, replicate testing, and experiment-to-experiment variability when estimating calibration curve parameters such as intercept and slope. Here we report the use of a Bayesian approach to generate calibration curves for the enumeration of target DNA from genomic DNA samples using absolute plasmid DNA standards.

    Results: Instead of the two traditional methods (classical and inverse), Markov chain Monte Carlo (MCMC) estimation was used to generate single, master, and modified calibration curves. The mean and the percentiles of the posterior distribution were used as point and interval estimates of unknown parameters such as intercepts, slopes and DNA concentrations. The software WinBUGS was used to perform all simulations and to generate the posterior distributions of all unknown parameters of interest.

    Conclusion: The Bayesian approach defined in this study allowed DNA concentrations in environmental samples to be estimated from absolute standard curves generated by real-time qPCR. The approach accounted for uncertainty from multiple sources, such as experiment-to-experiment variation, variability between replicate measurements, and the uncertainty introduced when employing calibration curves generated from absolute plasmid DNA standards.
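The estimation scheme this abstract describes can be illustrated with a minimal sketch: a random-walk Metropolis sampler (standing in for the paper's WinBUGS/MCMC machinery) fitting the standard-curve model Ct = intercept + slope × log10(concentration) to synthetic plasmid-dilution data. All numeric values, the flat priors, and the proposal widths are illustrative assumptions, not the paper's settings.

```python
import math
import random

random.seed(0)

# Synthetic standard-curve data: serial dilutions of a plasmid standard.
true_a, true_b, sigma = 38.0, -3.3, 0.2      # intercept, slope, replicate noise (assumed)
log10_conc = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]  # log10 copies per reaction
ct = [true_a + true_b * x + random.gauss(0.0, sigma) for x in log10_conc]

def log_posterior(a, b):
    # Flat priors on intercept and slope; Gaussian likelihood with known sigma.
    return sum(-0.5 * ((y - (a + b * x)) / sigma) ** 2
               for x, y in zip(log10_conc, ct))

# Random-walk Metropolis over (a, b).
a, b = 30.0, -2.0
lp = log_posterior(a, b)
samples = []
for step in range(20000):
    a_prop, b_prop = a + random.gauss(0.0, 0.1), b + random.gauss(0.0, 0.03)
    lp_prop = log_posterior(a_prop, b_prop)
    if math.log(random.random()) < lp_prop - lp:   # Metropolis accept/reject
        a, b, lp = a_prop, b_prop, lp_prop
    if step >= 5000:                               # discard burn-in
        samples.append((a, b))

post_a = sum(s[0] for s in samples) / len(samples)
post_b = sum(s[1] for s in samples) / len(samples)

# The posterior means invert an unknown sample's Ct into a concentration estimate.
unknown_ct = 26.0
est_log10_conc = (unknown_ct - post_a) / post_b
```

Posterior percentiles of the retained samples would give the interval estimates the abstract mentions.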

    Improving Phase Change Memory Performance with Data Content Aware Access

    A prominent characteristic of write operation in Phase-Change Memory (PCM) is that its latency and energy are sensitive to the data to be written as well as the content that is overwritten. We observe that overwriting unknown memory content can incur significantly higher latency and energy compared to overwriting known all-zeros or all-ones content. This is because all-zeros or all-ones content is overwritten by programming the PCM cells only in one direction, i.e., using either SET or RESET operations, not both. In this paper, we propose data content aware PCM writes (DATACON), a new mechanism that reduces the latency and energy of PCM writes by redirecting these requests to overwrite memory locations containing all-zeros or all-ones. DATACON operates in three steps. First, it estimates how much a PCM write access would benefit from overwriting known content (e.g., all-zeros or all-ones) by comprehensively considering the number of set bits in the data to be written and the energy-latency trade-offs for SET and RESET operations in PCM. Second, it translates the write address to a physical address within memory that contains the best type of content to overwrite, and records this translation in a table for future accesses. We exploit data access locality in workloads to minimize the address translation overhead. Third, it re-initializes unused memory locations with known all-zeros or all-ones content in a manner that does not interfere with regular read and write accesses. DATACON overwrites unknown content only when it is absolutely necessary to do so. We evaluate DATACON with workloads from state-of-the-art machine learning applications, SPEC CPU2017, and NAS Parallel Benchmarks. 
Results demonstrate that DATACON significantly improves system performance and reduces memory system energy consumption compared to the best of the performance-oriented state-of-the-art techniques.

Comment: 18 pages, 21 figures, accepted at the ACM SIGPLAN International Symposium on Memory Management (ISMM)
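The first DATACON step, deciding which known content is cheapest to overwrite, can be sketched as follows. The latency constants and the single-pass cost model (one SET pass plus one RESET pass when both directions are needed) are illustrative assumptions; the paper derives its estimates from detailed PCM device characterization.

```python
# Illustrative latency model (assumed values, not measured PCM parameters):
SET_LATENCY_NS = 150.0    # SET (program a 1) is the slow operation
RESET_LATENCY_NS = 50.0   # RESET (program a 0) is fast

def write_latency(data_bits, old_bits=None):
    """Latency of writing `data_bits` over `old_bits` (None = unknown content).

    Over known all-zeros content only the 1-bits need programming (SET only);
    over known all-ones only the 0-bits need programming (RESET only); over
    unknown content every bit may need programming in either direction, so the
    write pays for both a SET pass and a RESET pass.
    """
    if old_bits is None:                       # unknown content: both passes
        ones = sum(data_bits)
        zeros = len(data_bits) - ones
        return (SET_LATENCY_NS if ones else 0.0) + \
               (RESET_LATENCY_NS if zeros else 0.0)
    to_one = any(d == 1 and o == 0 for d, o in zip(data_bits, old_bits))
    to_zero = any(d == 0 and o == 1 for d, o in zip(data_bits, old_bits))
    return (SET_LATENCY_NS if to_one else 0.0) + \
           (RESET_LATENCY_NS if to_zero else 0.0)

def best_known_content(data_bits):
    """DATACON step 1: pick the cheaper known content to redirect the write to."""
    zeros_cost = write_latency(data_bits, [0] * len(data_bits))  # SET-only write
    ones_cost = write_latency(data_bits, [1] * len(data_bits))   # RESET-only write
    if zeros_cost <= ones_cost:
        return "all-zeros", zeros_cost
    return "all-ones", ones_cost

data = [1, 0, 0, 0, 1, 0, 0, 0]
choice, lat = best_known_content(data)
```

Note that under this model a mostly-zero word is cheaper to write over all-ones (a RESET-only write) than over all-zeros, which is exactly the kind of trade-off the mechanism weighs.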

    Evolutionary Game Theory and Social Learning Can Determine How Vaccine Scares Unfold

    Immunization programs have often been impeded by vaccine scares, as evidenced by the measles-mumps-rubella (MMR) autism vaccine scare in Britain. A “free rider” effect may be partly responsible: vaccine-generated herd immunity can reduce disease incidence to such low levels that real or imagined vaccine risks appear large in comparison, causing individuals to cease vaccinating. This implies a feedback loop between disease prevalence and strategic individual vaccinating behavior. Here, we analyze a model based on evolutionary game theory that captures this feedback in the context of vaccine scares, and that also includes social learning. Vaccine risk perception evolves over time according to an exogenously imposed curve. We test the model against vaccine coverage data and disease incidence data from two vaccine scares in England & Wales: the whole-cell pertussis vaccine scare and the MMR vaccine scare. The model fits vaccine coverage data from both vaccine scares relatively well. Moreover, the model can explain the vaccine coverage data more parsimoniously than most competing models without social learning and/or feedback (hence, adding social learning and feedback to a vaccine scare model improves model fit with little or no parsimony penalty). Under some circumstances, the model can predict future vaccine coverage and disease incidence—up to 10 years in advance in the case of pertussis—including specific qualitative features of the dynamics, such as future incidence peaks and undulations in vaccine coverage due to the population's response to changing disease incidence. Vaccine scares could become more common as eradication goals are approached for more vaccine-preventable diseases. Such models could help us predict how vaccine scares might unfold and assist mitigation efforts.
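The prevalence-behavior feedback loop described above can be sketched with a minimal imitation-dynamics (social learning) model coupled to SIR-style transmission: vaccinating becomes more attractive when infection risk exceeds the perceived vaccine risk, and less attractive when a scare inflates that perceived risk. This is not the paper's fitted model; every parameter value below is an illustrative assumption.

```python
def simulate(days=2000, dt=0.1, perceived_risk=0.002):
    # SIR-with-vaccination compartments as population fractions;
    # all parameter values are illustrative assumptions.
    S, I = 0.10, 1e-4          # susceptible, infected (remainder immune/recovered)
    x = 0.95                   # fraction of parents currently choosing to vaccinate
    beta, gamma = 0.5, 0.2     # transmission and recovery rates (per day)
    mu = 1.0 / (50 * 365)      # birth/death rate (per day)
    kappa = 500.0              # social-learning (imitation) rate
    history = []
    for _ in range(int(days / dt)):
        dS = mu * (1 - x) - beta * S * I - mu * S   # unvaccinated newborns enter S
        dI = beta * S * I - gamma * I - mu * I
        # Imitation dynamics: switching toward vaccination is proportional to the
        # payoff gap between infection risk (~ prevalence I) and perceived vaccine risk.
        dx = kappa * x * (1 - x) * (I - perceived_risk)
        S += dS * dt
        I += dI * dt
        x = min(max(x + dx * dt, 0.0), 1.0)
        history.append((I, x))
    return history

calm = simulate(perceived_risk=0.0)     # no scare: coverage drifts upward
scare = simulate(perceived_risk=0.01)   # scare: perceived risk dwarfs incidence
```

Running both scenarios shows the qualitative behavior the abstract describes: a scare (high perceived risk at low incidence) collapses coverage, while without a scare coverage is maintained.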

    A new real-time PCR method to overcome significant quantitative inaccuracy due to slight amplification inhibition

    Abstract

    Background: Real-time PCR analysis is a sensitive DNA quantification technique that has recently gained considerable attention in biotechnology, microbiology and molecular diagnostics. Although the cycle-threshold (Ct) method is the present "gold standard", it is far from being a standard assay. Uniform reaction efficiency among samples is the most important assumption of this method. Nevertheless, some authors have reported that it may not be correct, and that a slight PCR efficiency decrease of about 4% can result in an error of up to 400% using the Ct method. This efficiency decrease may be caused by inhibiting agents used during nucleic acid extraction or co-purified from the biological sample. We propose a new method (Cy0) that does not require the assumption of equal reaction efficiency between unknowns and the standard curve.

    Results: The Cy0 method is based on fitting Richards' equation to real-time PCR data by nonlinear regression in order to obtain best-fit estimates of the reaction parameters. These parameters are then used to calculate the Cy0 value, which minimizes the dependence of the result on PCR kinetics. The Ct, second-derivative (Cp), sigmoidal curve-fitting (SCF) and Cy0 methods were compared using two criteria: precision and accuracy. Our results demonstrate that, under optimal amplification conditions, these four methods are equally precise and accurate. However, when PCR efficiency was slightly decreased, by diluting the amplification mix or adding a biological inhibitor such as IgG, the SCF, Ct and Cp methods were markedly impaired, while the Cy0 method gave significantly more accurate and precise results.

    Conclusion: Our results demonstrate that Cy0 represents a significant improvement over the standard methods, providing reliable and precise nucleic acid quantification even under sub-optimal amplification conditions and overcoming the underestimation caused by the presence of some PCR inhibitors.
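The Cy0 computation described here, fitting Richards' equation by nonlinear regression and then intersecting the tangent at the curve's inflection point with the cycle axis, can be sketched as follows. The exact Richards parameterization (baseline fluorescence omitted) and all numeric values are assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def richards(x, fmax, b, c, d):
    # Richards growth curve; the baseline fluorescence term is omitted
    # for simplicity (an assumption, not the paper's exact parameterization).
    return fmax / (1.0 + np.exp(-(x - c) / b)) ** d

# Synthetic 40-cycle amplification curve with small fluorescence noise.
cycles = np.arange(1.0, 41.0)
rng = np.random.default_rng(0)
fluor = richards(cycles, 10.0, 1.5, 22.0, 1.0) + rng.normal(0.0, 0.02, cycles.size)

# Nonlinear regression for the best-fit reaction parameters.
popt, _ = curve_fit(richards, cycles, fluor, p0=[8.0, 2.0, 20.0, 1.0],
                    bounds=([1.0, 0.5, 1.0, 0.1], [100.0, 10.0, 40.0, 10.0]))
fmax, b, c, d = popt

# Cy0: where the tangent at the curve's inflection point crosses the cycle axis.
# The expressions below follow from differentiating richards() twice.
x_inf = c + b * np.log(d)                           # inflection cycle
f_inf = fmax * (1.0 + 1.0 / d) ** (-d)              # fluorescence at inflection
slope = (fmax / b) * (1.0 + 1.0 / d) ** (-(d + 1))  # tangent slope at inflection
cy0 = x_inf - f_inf / slope                         # simplifies to x_inf - b * (1 + 1/d)
```

Because Cy0 depends on the tangent geometry rather than a fixed fluorescence threshold, it is less sensitive to amplification-efficiency changes than a Ct read off at a threshold line, which is the property the comparison above exploits.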

    The spine in Paget’s disease

    Paget’s disease (PD) is a chronic metabolically active bone disease, characterized by a disturbance in bone modelling and remodelling due to an increase in osteoblastic and osteoclastic activity. The vertebra is the second most commonly affected site. This article reviews the various spinal pathomechanisms and osseous dynamics involved in producing the varied imaging appearances and their clinical relevance. Advanced imaging of osseous, articular and bone marrow manifestations of PD in all the vertebral components is presented. Pagetic changes often result in clinical symptoms including back pain, spinal stenosis and neural dysfunction, which arise from various pathological complications of PD involvement. Recognition of the imaging manifestations of spinal PD and of the potential complications that cause these symptoms enables accurate assessment of patients prior to appropriate management.