32,080 research outputs found

    Coincidence between transcriptome analyses on different microarray platforms using a parametric framework

    A parametric framework for the analysis of transcriptome data is demonstrated to yield coincident results when applied to data acquired on two different microarray platforms. Discrepancies among transcriptome studies are frequently reported, casting doubt on the reliability of the collected data. Much of this inconsistency can be attributed to differences among the analytical frameworks employed for data analysis. Existing frameworks normalize data against a standard determined from the data being analyzed. In the present study, a parametric framework based on a strict model for normalization is applied to data acquired using an in-house printed chip and GeneChip. The framework is based on a common statistical characteristic of microarray data, and each data set is normalized on the basis of a linear relationship with this model. Under the proposed framework, the observed expression changes and the selected genes coincide between platforms, achieving greater universality of the data than other methods.
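
    As a rough illustration of the kind of normalization this abstract describes, the Python sketch below fits each array's ordered log intensities linearly against the quantiles of an assumed reference model (a standard normal here) and maps the array onto the model's scale. The model choice, the trimming fraction, and all function names are illustrative assumptions, not the paper's actual procedure.

    import numpy as np
    from scipy import stats

    def normalize_array(intensities, trim=0.05):
        """Map one array's intensities onto the assumed reference-model scale."""
        x = np.sort(np.log(np.asarray(intensities, dtype=float)))
        n = x.size
        # theoretical quantiles of the assumed model (standard normal)
        q = stats.norm.ppf((np.arange(1, n + 1) - 0.5) / n)
        # robust linear fit using only the central part of the distribution
        lo, hi = int(trim * n), int((1 - trim) * n)
        slope, intercept, *_ = stats.linregress(q[lo:hi], x[lo:hi])
        # invert the fitted linear relationship so every array shares the model's scale
        return (np.log(np.asarray(intensities, dtype=float)) - intercept) / slope

    # after normalization, arrays from different platforms are directly comparable, e.g.
    # z_inhouse = normalize_array(raw_inhouse); z_genechip = normalize_array(raw_genechip)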

    Multiple tests of association with biological annotation metadata

    We propose a general and formal statistical framework for multiple tests of association between known fixed features of a genome and unknown parameters of the distribution of variable features of this genome in a population of interest. The known gene-annotation profiles, corresponding to the fixed features of the genome, may concern Gene Ontology (GO) annotation, pathway membership, regulation by particular transcription factors, nucleotide sequences, or protein sequences. The unknown gene-parameter profiles, corresponding to the variable features of the genome, may be, for example, regression coefficients relating possibly censored biological and clinical outcomes to genome-wide transcript levels, DNA copy numbers, and other covariates. A generic question of great interest in current genomic research regards the detection of associations between biological annotation metadata and genome-wide expression measures. This biological question may be translated as the test of multiple hypotheses concerning association measures between gene-annotation profiles and gene-parameter profiles. A general and rigorous formulation of the statistical inference question allows us to apply the multiple hypothesis testing methodology developed in [Multiple Testing Procedures with Applications to Genomics (2008) Springer, New York] and related articles, to control a broad class of Type I error rates, defined as generalized tail probabilities and expected values for arbitrary functions of the numbers of Type I errors and rejected hypotheses. The resampling-based single-step and stepwise multiple testing procedures of [Multiple Testing Procedures with Applications to Genomics (2008) Springer, New York] take into account the joint distribution of the test statistics and provide Type I error control in testing problems involving general data generating distributions (with arbitrary dependence structures among variables), null hypotheses, and test statistics. Comment: Published at http://dx.doi.org/10.1214/193940307000000446 in the IMS Collections (http://www.imstat.org/publications/imscollections.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
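
    The Python sketch below illustrates one resampling-based, single-step procedure in this spirit: a correlation-type statistic per annotation and a permutation null distribution of the maximum statistic for family-wise error control. The inputs A (genes x annotations) and theta (an estimated gene-parameter profile), and all names, are illustrative assumptions; the procedures described in the abstract are considerably more general.

    import numpy as np

    def single_step_maxT(A, theta, n_perm=1000, alpha=0.05, seed=0):
        """Permutation-based single-step adjustment for association statistics."""
        rng = np.random.default_rng(seed)
        A = np.asarray(A, dtype=float)
        theta = np.asarray(theta, dtype=float)

        def assoc_stats(t):
            # absolute Pearson correlation between each annotation column and t
            Ac = A - A.mean(axis=0)
            tc = t - t.mean()
            return np.abs(Ac.T @ tc) / (A.std(axis=0) * t.std() * len(t) + 1e-12)

        observed = assoc_stats(theta)
        # permutation null: break the gene-to-parameter link, keep the joint
        # dependence among annotation columns intact
        max_null = np.array([assoc_stats(rng.permutation(theta)).max()
                             for _ in range(n_perm)])
        adj_p = (1 + np.sum(max_null[None, :] >= observed[:, None], axis=1)) / (1 + n_perm)
        return observed, adj_p, adj_p <= alpha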

    Prostate biopsies guided by three-dimensional real-time (4-D) transrectal ultrasonography on a phantom: comparative study versus two-dimensional transrectal ultrasound-guided biopsies

    OBJECTIVE: This study evaluated the accuracy in localisation and distribution of real-time three-dimensional (4-D) ultrasound-guided biopsies on a prostate phantom. METHODS: A prostate phantom was created. A three-dimensional real-time ultrasound system with a 5.9 MHz probe was used, making it possible to see several reconstructed orthogonal viewing planes in real time. Fourteen operators performed biopsies first under 2-D and then under 4-D transrectal ultrasound (TRUS) guidance (336 biopsies). The biopsy path was modelled using segmentation in a 3-D ultrasonographic volume. Special software was used to visualise the biopsy paths in a reference prostate and assess the sampled area. A comparative study was performed to examine the accuracy of the needle's entry and target points. Distribution was assessed by measuring the volume sampled and a redundancy ratio of the sampled prostate. RESULTS: A significant increase in accuracy in hitting the target zone was identified using 4-D ultrasonography as compared to 2-D. There was no increase in the sampled volume or improvement in the biopsy distribution with 4-D ultrasonography as compared to 2-D. CONCLUSION: On a synthetic model, 4-D TRUS guidance appears to improve location accuracy and the ability to reproduce a protocol. The biopsy distribution does not appear to be improved.
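
    An accuracy comparison of this kind could be set up along the lines of the Python sketch below, which is illustrative only and not the study's software: the error of each biopsy is taken as the distance from the planned target to the segmented needle path, and per-operator 2-D and 4-D errors are compared with a paired Wilcoxon signed-rank test.

    import numpy as np
    from scipy.stats import wilcoxon

    def targeting_error(target, entry, tip):
        """Distance (mm) from the planned target to the segmented needle segment."""
        target, entry, tip = map(np.asarray, (target, entry, tip))
        d = tip - entry
        # closest point on the needle segment to the target
        t = np.clip(np.dot(target - entry, d) / np.dot(d, d), 0.0, 1.0)
        return np.linalg.norm(target - (entry + t * d))

    # errors_2d and errors_4d would hold one mean error per operator (14 values each);
    # a paired test then checks whether guidance mode changes targeting accuracy:
    # stat, p = wilcoxon(errors_2d, errors_4d)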

    Bayesian changepoint analysis for atomic force microscopy and soft material indentation

    Material indentation studies, in which a probe is brought into controlled physical contact with an experimental sample, have long been a primary means by which scientists characterize the mechanical properties of materials. More recently, the advent of atomic force microscopy, which operates on the same fundamental principle, has in turn revolutionized the nanoscale analysis of soft biomaterials such as cells and tissues. This paper addresses the inferential problems associated with material indentation and atomic force microscopy, through a framework for the changepoint analysis of pre- and post-contact data that is applicable to experiments across a variety of physical scales. A hierarchical Bayesian model is proposed to account for experimentally observed changepoint smoothness constraints and measurement error variability, with efficient Monte Carlo methods developed and employed to realize inference via posterior sampling for parameters such as Young's modulus, a key quantifier of material stiffness. These results are the first to provide the materials science community with rigorous inference procedures and uncertainty quantification, via optimized and fully automated high-throughput algorithms, implemented as the publicly available software package BayesCP. To demonstrate the consistent accuracy and wide applicability of this approach, results are shown for a variety of data sets from both macro- and micro-materials experiments--including silicone, neurons, and red blood cells--conducted by the authors and others. Comment: 20 pages, 6 figures; submitted for publication.
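
    A toy version of such a changepoint analysis is sketched below in Python; it is not the BayesCP package. Under assumed simplifications, force is modelled as baseline noise before contact and a Hertz-like 3/2-power law in indentation depth after contact, and a discrete posterior over the contact point is computed with a profile-likelihood approximation under Gaussian noise of assumed scale sigma.

    import numpy as np

    def changepoint_posterior(depth, force, sigma=0.05):
        """Posterior over the contact-point index for one indentation curve."""
        depth, force = np.asarray(depth, float), np.asarray(force, float)
        n = len(force)
        log_post = np.full(n, -np.inf)
        for c in range(5, n - 5):                   # keep a few points on each side
            pre = force[:c] - force[:c].mean()      # flat baseline before contact
            x = (depth[c:] - depth[c]) ** 1.5       # Hertzian indentation term
            k = (x @ force[c:]) / (x @ x)           # least-squares stiffness (proportional
                                                    # to Young's modulus in the Hertz model)
            post = force[c:] - k * x
            resid = np.concatenate([pre, post])
            # profile log-likelihood: baseline and stiffness plugged in at their
            # least-squares values, uniform prior over candidate changepoints
            log_post[c] = -0.5 * np.sum(resid ** 2) / sigma ** 2
        log_post -= log_post.max()
        p = np.exp(log_post)
        return p / p.sum()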

    Towards Structural Testing of Superconductor Electronics

    Many semiconductor technologies are already facing limitations as new-generation data and telecommunication systems are implemented. Although still in its infancy, superconductor electronics (SCE) is capable of handling some of these high-end tasks. We have started developing a defect-oriented test methodology for SCE so that reliable systems can be implemented in this technology. In this paper, the details of a study of the Rapid Single-Flux Quantum (RSFQ) process are presented. We describe common defects in SCE processes and the corresponding test methodologies to detect them. The measurement results show that possible random defects can be detected for statistical purposes in yield analysis. This paper also presents possible test methodologies for RSFQ circuits based on defect-oriented testing (DOT).

    Product assurance technology for custom LSI/VLSI electronics

    The technology for obtaining custom integrated circuits from CMOS-bulk silicon foundries using a universal set of layout rules is presented. The technical efforts were guided by the requirement to develop a 3 micron CMOS test chip for the Combined Release and Radiation Effects Satellite (CRRES). This chip contains both analog and digital circuits. The development employed all the elements required to obtain custom circuits from silicon foundries, including circuit design, foundry interfacing, circuit test, and circuit qualification.

    Current advances in systems and integrative biology

    Systems biology has gained a tremendous amount of interest in the last few years. This is partly due to the realization that traditional approaches focusing only on a few molecules at a time cannot describe the impact of aberrant or modulated molecular environments across a whole system. Furthermore, a hypothesis-driven study aims to prove or disprove its postulations, whereas a hypothesis-free systems approach can yield an unbiased and novel testable hypothesis as an end-result. This latter approach foregoes assumptions which predict how a biological system should react to an altered microenvironment within a cellular context, across a tissue or impacting on distant organs. Additionally, re-use of existing data by systematic data mining and re-stratification, one of the cornerstones of integrative systems biology, is also gaining attention. While tremendous efforts using a systems methodology have already yielded excellent results, it is apparent that a lack of suitable analytic tools and purpose-built databases poses a major bottleneck in applying a systematic workflow. This review addresses the current approaches used in systems analysis and the obstacles often encountered in large-scale data analysis and integration, which tend to go unnoticed but have a direct impact on the final outcome of a systems approach. Its wide applicability, ranging from basic research and disease descriptors through pharmacological studies to personalized medicine, makes this emerging approach well suited to addressing biological and medical questions where conventional methods are not ideal.