
    Studies on the relationships between oligonucleotide probe properties and hybridization signal intensities

    Microarray technology is a commonly used tool in biomedical research for assessing global gene expression, surveying DNA sequence variations, and studying alternative gene splicing. Given the wide range of applications of this technology, a comprehensive understanding of its underlying mechanisms is important. The focus of this work is on the contributions of microarray probe properties (probe secondary structure, ΔGss; probe-target binding energy, ΔG; probe-target mismatch) to signal intensity. The benefits of incorporating or ignoring these properties in microarray probe design and selection, as well as in microarray data preprocessing and analysis, are reported. Four related studies are described in this thesis. In the first, probe secondary structure was found to account for up to 3% of all variation on Affymetrix microarrays. In the second, a dinucleotide affinity model was developed and found to enhance the detection of differentially expressed genes when implemented as a background correction procedure in GeneChip preprocessing algorithms. This model is consistent with physical models of the binding affinity of the probe-target pair, which depends on nearest-neighbor stacking interactions in addition to base-pairing. The remaining studies demonstrate the importance of incorporating biophysical factors in both the design and the analysis of microarrays: ‘percent bound’, predicted by equilibrium models of hybridization, is a useful factor in predicting and assessing the behavior of long oligonucleotide probes. However, a universal, probe-property-independent three-parameter Langmuir model has also been tested, and this simple model has been shown to be as effective as, or more effective than, the complex, computationally expensive models developed for microarray target concentration estimation. The simple, platform-independent model can equal or even outperform models that explicitly incorporate probe properties, such as the model incorporating probe percent bound developed in Chapter Three. This suggests that with a “spiked-in” concentration series targeting as few as 5-10 genes, reliable estimation of target concentration can be achieved for the entire microarray.
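A three-parameter Langmuir response can be inverted in closed form to estimate target concentration from observed intensity. The sketch below is illustrative only: the functional form I = a + b·x/(x + K) (background a, dynamic range b, half-saturation constant K) is a standard Langmuir parameterization, and the specific parameter names and values are assumptions, not taken from the thesis.

```python
# Minimal sketch of a three-parameter Langmuir intensity model (assumed form:
# I = a + b * x / (x + K)), with a closed-form inversion for concentration.

def langmuir_intensity(x, a, b, K):
    """Predicted signal intensity at target concentration x."""
    return a + b * x / (x + K)

def invert_langmuir(I, a, b, K):
    """Estimate target concentration from observed intensity I.

    Solving I = a + b*x/(x+K) for x gives x = K*(I - a)/(a + b - I),
    valid below saturation, i.e. for a < I < a + b.
    """
    return K * (I - a) / (a + b - I)

# Round trip at a known concentration (illustrative constants)
a, b, K = 50.0, 1000.0, 20.0
x_true = 5.0
I_obs = langmuir_intensity(x_true, a, b, K)
x_est = invert_langmuir(I_obs, a, b, K)
```

In practice the three parameters would be fitted per array from a spiked-in concentration series, after which the inversion recovers concentrations for all remaining probes.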

    A flexible shrinkage operator for fussy grouped variable selection

    Existing grouped variable selection methods rely heavily on prior group information, so they may be unreliable when an incorrect group assignment is used. In this paper, we propose a family of shrinkage variable selection operators that control the k-th largest norm (KAN). The proposed KAN method naturally exhibits flexible group-wise variable selection even when no correct prior group information is available. We also construct a group KAN shrinkage operator using a composite of KAN constraints. Neither ignoring nor relying completely on prior group information, the group KAN method has the flexibility to control within-group strength and can therefore reduce the effect of incorrect group information. Finally, we investigate an unbiased estimator of the degrees of freedom for (group) KAN estimates in the framework of Stein’s unbiased risk estimation. Extensive simulation studies and real data analysis are performed to demonstrate the advantage of KAN and group KAN over the LASSO and group LASSO, respectively.
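One plausible reading of a "k-th largest norm" is the sum of the k largest absolute entries of a coefficient vector, which is a valid vector norm interpolating between the max norm (k = 1) and the L1 norm (k = p). The sketch below computes that quantity; the definition and data are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: one candidate definition of a k-largest norm, namely the sum
# of the k largest absolute entries of beta. Interpolates between the max norm
# (k = 1) and the L1 norm (k = len(beta)).

def k_largest_norm(beta, k):
    """Sum of the k largest absolute entries of beta."""
    mags = sorted((abs(b) for b in beta), reverse=True)
    return sum(mags[:k])

beta = [3.0, -1.0, 0.5, -4.0, 2.0]
# k = 1 recovers max(|beta_i|) = 4.0; k = 5 recovers the L1 norm = 10.5.
```

Penalizing such a norm shrinks only the largest coefficients directly, which is one way a shrinkage operator can induce group-like behavior without an explicit group assignment.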

    Atomic Parity Nonconservation: Electroweak Parameters and Nuclear Structure

    Full text link
    There have been suggestions to measure atomic parity nonconservation (PNC) along an isotopic chain, by taking ratios of observables in order to cancel complicated atomic structure effects. Precise atomic PNC measurements could make a significant contribution to tests of the Standard Model at the level of one-loop radiative corrections. However, the results also depend upon certain features of nuclear structure, such as the spatial distribution of neutrons in the nucleus. To examine the sensitivity to nuclear structure, we consider the case of Pb isotopes using various recent relativistic and non-relativistic nuclear model calculations. Contributions from nucleon internal weak structure are included, but found to be fairly negligible. The spread among present models in predicted sizes of nuclear structure effects may preclude using Pb isotope ratios to test the Standard Model at better than a one percent level, unless there are adequate independent tests of the nuclear models by various alternative strong and electroweak nuclear probes. On the other hand, sufficiently accurate atomic PNC experiments would provide a unique method to measure neutron distributions in heavy nuclei.
    Comment: 44 pages, INT Preprint DOE/ER/40561-050-INT92-00-1

    Identifying Complexity by Means of Matrices

    Full text link
    Complexity is an interdisciplinary concept which, first of all, addresses the question of how order emerges out of randomness. For many reasons, matrices provide a very practical and powerful tool for approaching and quantifying the related characteristics. Based on several natural complex dynamical systems, such as strongly interacting quantum many-body systems, the human brain, and the financial markets, by relating empirical observations to random matrix theory and quantifying deviations in terms of a reduced dimensionality, we present arguments in favour of the statement that complexity is a phenomenon at the edge between collectivity and chaos.
    Comment: Talk given by S. Drozdz at "Horizons in Complex Systems", Messina, December 5-8, 200
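A standard random-matrix benchmark for such comparisons is the Marchenko-Pastur law, which gives the eigenvalue support of the correlation matrix of purely random data (N variables, T observations, q = N/T). Empirical eigenvalues falling outside this support indicate collective, non-random structure. The sketch below computes only the bounds; the sample sizes are illustrative assumptions.

```python
import math

# Minimal sketch: Marchenko-Pastur eigenvalue bounds for the correlation
# matrix of N uncorrelated time series of length T (q = N/T < 1). Empirical
# eigenvalues above lam_plus signal collective (non-random) modes.

def marchenko_pastur_bounds(N, T):
    q = N / T
    lam_minus = (1 - math.sqrt(q)) ** 2
    lam_plus = (1 + math.sqrt(q)) ** 2
    return lam_minus, lam_plus

lo, hi = marchenko_pastur_bounds(100, 400)  # q = 0.25 -> support [0.25, 2.25]
```

In financial-market applications, for example, the largest correlation-matrix eigenvalue typically lies far above the upper bound, reflecting a market-wide collective mode.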

    Quantum Topology Change in (2 + 1)d

    The topology of orientable (2 + 1)d spacetimes can be captured by certain lumps of non-trivial topology called topological geons. They are the topological analogues of conventional solitons. We give a description of topological geons where the degrees of freedom related to topology are separated from the complete theory that contains metric (dynamical) degrees of freedom. The formalism also allows us to investigate processes of quantum topology change. They correspond to creation and annihilation of quantum geons. Selection rules for such processes are derived.
    Comment: LaTeX file, 33 pages, 10 postscript figures, some typos corrected, references updated, and other minor change

    Highly Scalable Algorithms for Robust String Barcoding

    Full text link
    String barcoding is a recently introduced technique for genomic-based identification of microorganisms. In this paper we describe the engineering of highly scalable algorithms for robust string barcoding. Our methods enable distinguisher selection based on whole genomic sequences of hundreds of microorganisms of up to bacterial size on a well-equipped workstation, and can be easily parallelized to further extend the applicability range to thousands of bacterial-size genomes. Experimental results on both randomly generated and NCBI genomic data show that whole-genome-based selection results in a number of distinguishers nearly matching the information-theoretic lower bounds for the problem.
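Distinguisher selection can be viewed as a set-cover-like problem: choose substrings so that every pair of genomes differs on the presence/absence of at least one chosen substring. The greedy sketch below illustrates this formulation on toy data; the greedy rule, data, and names are assumptions for illustration, and the paper's engineered algorithms are far more scalable than this.

```python
from itertools import combinations

# Hedged sketch of greedy distinguisher selection for string barcoding:
# repeatedly pick the candidate substring that separates the most
# still-undistinguished pairs of genomes (presence in one, absence in
# the other), until every pair is distinguished.

def greedy_barcode(genomes, candidates):
    pairs = set(combinations(range(len(genomes)), 2))
    chosen = []
    while pairs:
        best, best_sep = None, set()
        for c in candidates:
            # Pairs this candidate separates among those still unresolved.
            sep = {(i, j) for (i, j) in pairs
                   if (c in genomes[i]) != (c in genomes[j])}
            if len(sep) > len(best_sep):
                best, best_sep = c, sep
        if best is None:
            raise ValueError("remaining pairs cannot be distinguished")
        chosen.append(best)
        pairs -= best_sep
    return chosen

genomes = ["ACGTAC", "ACGGGT", "TTGTAC"]
barcode = greedy_barcode(genomes, ["GTA", "GGG", "TTG", "ACG"])  # -> ["GTA", "TTG"]
```

Each genome is then identified by its binary presence/absence fingerprint over the chosen distinguishers, and the greedy heuristic's set-cover pedigree is what keeps the number of distinguishers close to the information-theoretic lower bound.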