    Spatial normalization improves the quality of genotype calling for Affymetrix SNP 6.0 arrays

    Background: Microarray measurements are susceptible to a variety of experimental artifacts, some of which give rise to systematic biases that are spatially dependent in a unique way on each chip. It is likely that such artifacts affect many SNP arrays, but the normalization methods used in currently available genotyping algorithms make no attempt at spatial bias correction. Here, we propose an effective single-chip spatial bias removal procedure for Affymetrix 6.0 SNP arrays or platforms with similar design features. This procedure deals with both extreme and subtle biases and is intended to be applied before standard genotype calling algorithms.
    Results: Application of the spatial bias adjustments on HapMap samples resulted in higher genotype call rates with equal or even better accuracy for thousands of SNPs. Consequently the normalization procedure is expected to lead to more meaningful biological inferences and could be valuable for genome-wide SNP analysis.
    Conclusions: Spatial normalization can potentially rescue thousands of SNPs in a genetic study at the small cost of computational time. The approach is implemented in R and available from the authors upon request.
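    The spatial normalization is described only at a high level here; a minimal R sketch of the general idea, assuming probe intensities are arranged as a matrix by physical (row, column) position on the chip, could estimate the bias as a running-median surface (function and variable names are illustrative, not the authors' code):

        # Illustrative single-chip spatial bias removal -- not the authors'
        # implementation. 'intensity' is a numeric matrix of log2 probe
        # intensities laid out by physical position on the array.
        spatial_normalize <- function(intensity, k = 25) {
          # Smooth bias surface: k-point running medians over rows, then columns.
          bias <- t(apply(intensity, 1, stats::runmed, k = k))
          bias <- apply(bias, 2, stats::runmed, k = k)
          # Remove the local trend, keeping the chip-wide median level.
          intensity - bias + stats::median(intensity)
        }

        # Usage: corrected <- spatial_normalize(log2(raw)), applied before
        # genotype calling.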

    Analysis of time-to-event for observational studies: Guidance to the use of intensity models

    This paper provides guidance for researchers with some mathematical background on the conduct of time-to-event analysis in observational studies based on intensity (hazard) models. Basic concepts such as the time axis, event definition, and censoring are discussed. Hazard models are introduced, with special emphasis on the Cox proportional hazards regression model. We provide checklists that may be useful both when fitting the model and assessing its goodness of fit and when interpreting the results. Special attention is paid to how to avoid problems with immortal time bias by introducing time-dependent covariates. We discuss prediction based on hazard models and the difficulties of drawing proper causal conclusions from such models. Finally, we present a series of examples where the methods and checklists are exemplified. Computational details and implementation using the freely available R software are documented in the Supplementary Material. The paper was prepared as part of the STRATOS initiative. (28 pages, 12 figures; associated Supplementary material at http://publicifsv.sund.ku.dk/~pka/STRATOSTG8)
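    As a hedged illustration of the paper's advice on immortal time bias, a time-dependent exposure can be coded in R's survival package with (start, stop] intervals so that no pre-treatment person-time is counted as exposed (data set and variable names below are hypothetical, not the paper's examples):

        library(survival)

        # Hypothetical long-format data 'd': one row per (tstart, tstop]
        # interval per subject, with 'treated' switching from 0 to 1 at the
        # observed treatment time rather than at baseline.
        fit <- coxph(Surv(tstart, tstop, event) ~ treated + age + sex, data = d)
        summary(fit)

        # Goodness-of-fit check in the spirit of the paper's checklists:
        # scaled Schoenfeld residuals for proportional hazards.
        cox.zph(fit)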

    Adjusted Survival Curves

    Relation of vertebral deformities to bone density, structure, and strength.

    Because they are not reliably discriminated by areal bone mineral density (aBMD) measurements, it is unclear whether minimal vertebral deformities represent early osteoporotic fractures. To address this, we compared 90 postmenopausal women with no deformity (controls) with 142 women with one or more semiquantitative grade 1 (mild) deformities and 51 women with any grade 2-3 (moderate/severe) deformities. aBMD was measured by dual-energy X-ray absorptiometry (DXA), lumbar spine volumetric bone mineral density (vBMD) and geometry by quantitative computed tomography (QCT), bone microstructure by high-resolution peripheral QCT (HRpQCT) at the radius, and vertebral compressive strength and load-to-strength ratio by finite-element analysis (FEA) of lumbar spine QCT images. Compared with controls, women with grade 1 deformities had significantly worse values for many bone density, structure, and strength parameters, although the deficits were all much worse for the women with grade 2-3 deformities. Likewise, these skeletal parameters were more strongly associated with moderate to severe than with mild deformities by age-adjusted logistic regression. Nonetheless, grade 1 vertebral deformities were significantly associated with four of the five main variable categories assessed: bone density (lumbar spine vBMD), bone geometry (vertebral apparent cortical thickness), bone strength (overall vertebral compressive strength by FEA), and load-to-strength ratio (45-degree forward bending ÷ vertebral compressive strength). Thus, significantly impaired bone density, structure, and strength compared with controls indicate that many grade 1 deformities do represent early osteoporotic fractures, with corresponding implications for clinical decision making.
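    A minimal sketch of the kind of age-adjusted logistic regression described above (data frame and variable names are assumptions, not the study's data):

        # Hypothetical data 'df': deformity (0/1), a skeletal parameter such
        # as lumbar spine vBMD, and age. Odds ratio per SD with Wald 95% CI.
        fit <- glm(deformity ~ scale(vbmd) + age, family = binomial, data = df)
        exp(cbind(OR = coef(fit), confint.default(fit)))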

    Quality assessment metrics for whole genome gene expression profiling of paraffin embedded samples

    BACKGROUND: Formalin-fixed, paraffin-embedded tissues are most commonly used for routine pathology analysis and for long-term tissue preservation in the clinical setting. Many institutions have large archives of formalin-fixed, paraffin-embedded tissues that provide a unique opportunity for understanding genomic signatures of disease. However, genome-wide expression profiling of formalin-fixed, paraffin-embedded samples has been challenging due to RNA degradation. Because of the significant heterogeneity in tissue quality, normalization and analysis of these data present particular challenges. The distributions of intensity values from archival tissues are inherently noisy and skewed due to differential sample degradation, raising two primary concerns: whether a highly skewed array will unduly influence initial normalization of the data, and whether outlier arrays can be reliably identified. FINDINGS: Two simple extensions of common regression diagnostic measures are introduced that measure the stress an array undergoes during normalization and how much a given array deviates from the remaining arrays post-normalization. These metrics are applied to a study involving 1618 formalin-fixed, paraffin-embedded HER2-positive breast cancer samples from the N9831 adjuvant trial processed with Illumina's cDNA-mediated Annealing, Selection, Extension, and Ligation (DASL) assay. CONCLUSION: Proper assessment of array quality within a research study is crucial for controlling unwanted variability in the data. The metrics proposed in this paper have direct biological interpretations and can be used to identify arrays that should either be removed from analysis altogether or down-weighted to reduce their influence in downstream analyses.
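    The paper's exact metric definitions are not reproduced here; a rough R analogue of the two ideas, how far normalization moves an array ("stress") and how far an array sits from the rest afterwards ("deviation"), might look like this (all names are illustrative):

        # 'expr' is a hypothetical genes x arrays matrix of log intensities.
        quantile_normalize <- function(m) {
          target <- rowMeans(apply(m, 2, sort))            # mean quantile profile
          ranks  <- apply(m, 2, rank, ties.method = "first")
          apply(ranks, 2, function(r) target[r])
        }
        norm      <- quantile_normalize(expr)
        stress    <- colMeans(abs(norm - expr))            # movement per array
        reference <- apply(norm, 1, median)                # pseudo-reference array
        deviation <- apply(norm, 2, function(a) mean(abs(a - reference)))
        # Arrays with extreme 'stress' or 'deviation' are candidates for
        # removal or down-weighting in downstream analyses.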

    Factors associated with hospital readmission in sickle cell disease

    Background: Sickle cell disease is the most frequent hereditary disease in Brazil, and people with the disease may be hospitalised several times in the course of their lives. The purpose of this study was to estimate the hazard ratios of factors associated with the time between hospital admissions.
    Methods: The study sample comprised all patients admitted, from 2000 to 2004, to a university hospital in Rio de Janeiro State, south-east Brazil, as a result of acute complications of sickle cell disease (SCD). Given the statistical problem of studying individuals with multiple events over time, the following extensions of the Cox proportional hazards model were compared: the independent-increment marginal model (Andersen-Gill) and the random effects (frailty) model.
    Results: The study considered 71 patients, who were admitted 223 times for acute events related to SCD. The hazard ratios for hospital readmission were statistically significant for the prior occurrence of vaso-occlusive crisis and for the development of renal failure. However, analysis of residuals of the marginal model revealed evidence of non-proportionality for some covariates.
    Conclusion: The results from applying the two models were generally similar, indicating that the findings are not highly sensitive to the choice of approach. The better fit of the frailty model suggests that there are unmeasured individual factors with an impact on hospital readmission.
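    Both model extensions are available in R's survival package; a minimal sketch under hypothetical admission-level data (one row per hospitalisation interval, 'id' identifying the patient; variable names are assumptions) might be:

        library(survival)

        # Andersen-Gill marginal model with robust (sandwich) standard errors:
        ag <- coxph(Surv(tstart, tstop, event) ~ voc + renal + cluster(id),
                    data = adm)
        # Shared gamma frailty (random effects) model:
        fr <- coxph(Surv(tstart, tstop, event) ~ voc + renal + frailty(id),
                    data = adm)
        # Residual-based check of proportionality for the marginal model:
        cox.zph(ag)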

    3' tag digital gene expression profiling of human brain and universal reference RNA using Illumina Genome Analyzer

    Background: Massively parallel sequencing has the potential to replace microarrays as the method for transcriptome profiling. Currently there are two protocols: full-length RNA sequencing (RNA-seq) and 3'-tag digital gene expression (DGE). In this preliminary effort, we evaluated the 3' DGE approach using two reference RNA samples from the MicroArray Quality Control Consortium (MAQC).
    Results: Using the brain RNA sample across multiple runs, we demonstrated that transcript profiles from 3' DGE were highly reproducible between technical and biological replicates, between libraries constructed by the same lab and by different labs, and between two generations of Illumina's Genome Analyzers. Approximately 65% of all sequence reads mapped to mitochondrial genes, ribosomal RNAs, and canonical transcripts. Comparison of the expression profiles of brain RNA and universal human reference RNA demonstrated that DGE was also highly quantitative, with excellent correlation of differential expression with quantitative real-time PCR. Furthermore, one lane of 3' DGE sequencing, using the current sequencing chemistry and image processing software, had a wider dynamic range for transcriptome profiling and was able to detect genes expressed at levels normally below the detection threshold of microarrays.
    Conclusion: 3'-tag DGE profiling with massively parallel sequencing achieved high sensitivity and reproducibility for transcriptome profiling. Although, unlike RNA-seq, it lacks the ability to detect alternative splicing events, it is much more affordable and clearly outperformed microarrays (Affymetrix) in detecting low-abundance transcripts.
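    As a hedged sketch of the qPCR comparison step, assuming a tag-count matrix with gene-name rownames and a qPCR table of log2 fold changes (all object names are hypothetical):

        # 'counts' is genes x libraries; 'brain' and 'uhr' index the two
        # sample groups; 'qpcr' has columns 'gene' and 'log2fc'.
        cpm <- t(t(counts) / colSums(counts)) * 1e6      # tags per million
        lfc <- log2((rowMeans(cpm[, brain]) + 0.5) /
                    (rowMeans(cpm[, uhr])   + 0.5))      # DGE fold changes
        # Agreement with qPCR for the genes measured on both platforms:
        cor(lfc[qpcr$gene], qpcr$log2fc, use = "complete.obs")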