
    Statistical Model of Shape Moments with Active Contour Evolution for Shape Detection and Segmentation

    This paper describes a novel method for shape representation and robust image segmentation. The proposed method combines two well-known methodologies, namely statistical shape models and active contours implemented in a level set framework. Shape detection is achieved by maximizing a posterior function that consists of a prior shape probability model and an image likelihood function conditioned on shapes. The statistical shape model is built through a learning process based on nonparametric probability estimation in a PCA-reduced feature space formed by the Legendre moments of training silhouette images. A greedy strategy is applied to optimize the proposed cost function by iteratively evolving an implicit active contour in the image space and subsequently performing constrained optimization of the evolved shape in the reduced shape feature space. Experimental results presented in the paper demonstrate that the proposed method, contrary to many other active contour segmentation methods, is highly resilient to severe random and structural noise that may be present in the data.
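
    The shape prior described in this abstract can be prototyped compactly. The following is a minimal sketch, not the authors' implementation: it computes Legendre moments of binary training silhouettes, projects them into a PCA-reduced shape space, and fits a nonparametric (kernel density) prior there. The moment order, number of PCA components, and KDE bandwidth are illustrative guesses.

```python
# Minimal sketch (not the paper's code): Legendre-moment shape features,
# PCA reduction, and a nonparametric (KDE) shape prior.
import numpy as np
from numpy.polynomial.legendre import legval
from sklearn.decomposition import PCA
from sklearn.neighbors import KernelDensity

def legendre_moments(silhouette, order=10):
    """Legendre moments lambda_pq of a binary image mapped onto [-1, 1]^2."""
    h, w = silhouette.shape
    x = np.linspace(-1.0, 1.0, w)
    y = np.linspace(-1.0, 1.0, h)
    # Row p of Px / Py holds the order-p Legendre polynomial sampled on the grid
    Px = np.stack([legval(x, np.eye(order + 1)[p]) for p in range(order + 1)])
    Py = np.stack([legval(y, np.eye(order + 1)[q]) for q in range(order + 1)])
    dx, dy = x[1] - x[0], y[1] - y[0]
    moments = np.empty((order + 1, order + 1))
    for p in range(order + 1):
        for q in range(order + 1):
            norm = (2 * p + 1) * (2 * q + 1) / 4.0
            moments[p, q] = norm * dx * dy * np.sum(
                Py[q][:, None] * Px[p][None, :] * silhouette)
    return moments.ravel()

def build_shape_prior(train_silhouettes, order=10, n_components=8):
    """PCA-reduced moment space plus a nonparametric shape prior."""
    feats = np.stack([legendre_moments(s, order) for s in train_silhouettes])
    pca = PCA(n_components=n_components).fit(feats)
    prior = KernelDensity(bandwidth=1.0).fit(pca.transform(feats))
    return pca, prior
```

    In the full method, an implicit active contour would be evolved against the image likelihood, the moments of its silhouette projected into this reduced space, and the nonparametric prior used to score and constrain the evolved shape.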

    A multi-species functional embedding integrating sequence and network structure

    A key challenge in transferring knowledge between species is that different species have fundamentally different genetic architectures. Initial computational approaches to transferring knowledge across species have relied on measures of heredity such as genetic homology, but these approaches suffer from limitations. First, only a small subset of genes have homologs, limiting the amount of knowledge that can be transferred; second, genes change or repurpose functions, complicating the transfer of knowledge. Many approaches address this problem by expanding the notion of homology through high-throughput genomic and proteomic measurements, for example via network alignment. In this work, we take a new approach to transferring knowledge across species by expanding the notion of homology through explicit measures of functional similarity between proteins in different species. Specifically, our kernel-based method, HANDL (Homology Assessment across Networks using Diffusion and Landmarks), integrates sequence and network structure to create a functional embedding in which proteins from different species are embedded in the same vector space. We show that inner products in this space, and the vectors themselves, capture functional similarity across species and are useful for a variety of functional tasks. We present the first whole-genome method for predicting phenologs, recovering many that were previously identified and predicting new phenologs supported by the biological literature. We also demonstrate that the HANDL embedding captures pairwise gene function, in that gene pairs with synthetic lethal interactions are significantly separated in HANDL space, and that the direction of separation is conserved across species. Software for the HANDL algorithm is available at http://bit.ly/lrgr-handl.
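
    The following is a hedged sketch in the spirit of the construction described (a regularized-Laplacian diffusion kernel combined with homolog landmarks), not the released HANDL code: the kernel choice, regularization value, and embedding dimension are assumptions. Networks are assumed to be networkx graphs, and `landmarks` a list of (species-A protein, species-B protein) homolog pairs.

```python
# Hedged sketch of a landmark-based cross-species diffusion embedding
# (in the spirit of HANDL, not the published implementation).
import numpy as np
import networkx as nx

def diffusion_kernel(G, lam=0.05):
    """Regularized Laplacian kernel K = (I + lam * L)^-1, one common choice."""
    nodes = list(G.nodes())
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    K = np.linalg.inv(np.eye(len(nodes)) + lam * L)
    return nodes, K

def cross_species_embedding(G_a, G_b, landmarks, lam=0.05, dim=20):
    nodes_a, K_a = diffusion_kernel(G_a, lam)
    nodes_b, K_b = diffusion_kernel(G_b, lam)
    ia = [nodes_a.index(a) for a, _ in landmarks]   # landmark indices, species A
    ib = [nodes_b.index(b) for _, b in landmarks]   # paired indices, species B
    # Factor the landmark block of species A's kernel: K_a[ia, ia] ~ X X^T
    U, s, _ = np.linalg.svd(K_a[np.ix_(ia, ia)])
    X = U[:, :dim] * np.sqrt(s[:dim])               # landmark coordinates
    Xp_T = np.linalg.pinv(X).T
    # Map each protein's diffusion scores to the landmarks through the factor
    emb_a = K_a[:, ia] @ Xp_T
    emb_b = K_b[:, ib] @ Xp_T
    return dict(zip(nodes_a, emb_a)), dict(zip(nodes_b, emb_b))
```

    Inner products (or cosine similarities) between rows of the two returned embeddings can then serve as cross-species functional similarity scores.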

    Studies on dimension reduction and feature spaces:

    Today's world produces and stores huge amounts of data, which calls for methods that can tackle both the growing sizes and the growing dimensionalities of data sets. Dimension reduction aims to answer the challenges posed by the latter. Many dimension reduction methods consist of a metric transformation part followed by optimization of a cost function. Several classes of cost functions have been developed and studied, while metrics have received less attention. We promote the view that metrics should be lifted to a more independent role in dimension reduction research. The subject of this work is the interaction of metrics with dimension reduction. The work is built on a series of studies on current topics in dimension reduction and neural network research. Neural networks are used both as a tool and as a target for dimension reduction. When the results of modeling or clustering are represented as a metric, they can be studied using dimension reduction, or they can be used to introduce new properties into a dimension reduction method. We give two examples of such use: visualizing the results of hierarchical clustering, and creating supervised variants of existing dimension reduction methods by using a metric built on the feature space of a neural network. Combining clustering with dimension reduction yields a novel way of creating space-efficient visualizations that convey both the hierarchical structure and the distances between clusters. We study the feature spaces used in a recently developed neural network architecture called the extreme learning machine. We give a novel interpretation of such networks and recognize the need to parameterize extreme learning machines with the variance of the network weights; this has practical implications, since current practice emphasizes the role of the hidden units and ignores the variance. A current trend in deep neural network research is to use cost functions from dimension reduction methods to train the network for supervised dimension reduction. We show that equally good results can be obtained by training a bottlenecked neural network for classification or regression, which is faster than using a dimension reduction cost. We demonstrate that, contrary to current belief, using sparse distance matrices to create fast dimension reduction methods is feasible if a proper balance between short-distance and long-distance entries in the sparse matrix is maintained. This observation opens up a promising research direction, with the possibility of applying modern dimension reduction methods to much larger data sets than are manageable today.
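
    One concrete point in this abstract, the variance parameterization of extreme learning machines, is easy to illustrate. The following is a minimal sketch, not the thesis code: an ELM whose random hidden weights are explicitly parameterized by their standard deviation `sigma`, with a ridge-regression readout; the architecture and default values are assumptions.

```python
# Minimal sketch of an extreme learning machine with an explicit weight-scale
# parameter sigma (an illustration of the abstract's point, not the thesis code).
import numpy as np

class ELM:
    def __init__(self, n_hidden=200, sigma=1.0, ridge=1e-3, seed=0):
        self.n_hidden, self.sigma, self.ridge = n_hidden, sigma, ridge
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        # Hidden weights are random and never trained; only their scale matters.
        self.W = self.rng.normal(0.0, self.sigma, (n_features, self.n_hidden))
        self.b = self.rng.normal(0.0, self.sigma, self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        # Ridge-regression readout: solve (H^T H + ridge I) beta = H^T y
        A = H.T @ H + self.ridge * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta
```

    Because the hidden weights are never trained, `sigma` directly controls how saturated the tanh units are, which is why tuning only the number of hidden units while ignoring the weight variance can be misleading.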

    Characterization and Reduction of Noise in Manifold Representations of Hyperspectral Imagery

    A new workflow to produce dimensionality-reduced manifold coordinates, based on improvements to landmark Isometric Mapping (ISOMAP) algorithms using local spectral models, is proposed. The manifold space obtained from nonlinear dimensionality reduction better addresses the nonlinearity of hyperspectral data and often performs better than linear methods such as Minimum Noise Fraction (MNF). The dissertation focuses on using adaptive local spectral models to further improve the performance of ISOMAP algorithms by addressing local noise issues and by performing guided landmark selection and nearest-neighborhood construction in local spectral subsets. This work can benefit common hyperspectral image analysis tasks, such as classification and target detection, while keeping the computational burden low. It builds on and improves the previous ENH-ISOMAP algorithm in several ways. The workflow is based on a unified local spectral subsetting framework. Embedding spaces in local spectral subsets are first proposed as local noise models and used to perform noise estimation, MNF regression, and guided landmark selection in a local sense. Passive and active methods are proposed and verified for deliberately selecting landmarks that cover the local geometric structure and avoid local noise. A novel locally spectral-adaptive method is then used to construct the k-nearest-neighbor graph. Finally, a global MNF transformation in the manifold space is introduced to further compress the signal dimensions. The workflow is implemented in C++ with multiple implementation optimizations, including the use of heterogeneous computing platforms available in personal computers. The results are evaluated with the Jeffries-Matusita separability metric as well as the classification accuracy of supervised classifiers. The proposed workflow shows significant and stable improvements in dimensionality reduction performance over traditional MNF and ENH-ISOMAP on various hyperspectral datasets, and the computational speed of the proposed implementation is also improved.
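
    As a point of reference for the workflow described above, the following is a hedged sketch of a plain landmark ISOMAP embedding, the baseline that ENH-ISOMAP and this dissertation build on, not the proposed local-spectral workflow. `X` is assumed to be an (n_pixels, n_bands) array of spectra and `landmarks` an array of row indices; the neighborhood size and output dimension are illustrative.

```python
# Hedged sketch of plain landmark ISOMAP (baseline only): kNN graph ->
# geodesic distances to landmarks -> classical MDS on the landmarks ->
# distance-based extension of the embedding to all points.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def landmark_isomap(X, landmarks, n_neighbors=10, dim=3):
    # k-nearest-neighbor graph with Euclidean edge weights
    G = kneighbors_graph(X, n_neighbors, mode='distance')
    # Geodesic (shortest-path) distances from each landmark to every point
    D = shortest_path(G, method='D', directed=False, indices=landmarks)
    D2 = D ** 2
    # Classical MDS on the landmark-to-landmark squared geodesic distances
    DL = D2[:, landmarks]
    m = len(landmarks)
    J = np.eye(m) - np.ones((m, m)) / m
    B = -0.5 * J @ DL @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]
    Lk = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))  # landmark coords
    # Landmark-MDS extension: place every point from its landmark distances
    mu = DL.mean(axis=1)[:, None]
    Y = -0.5 * np.linalg.pinv(Lk) @ (D2 - mu)
    return Y.T                                               # (n_points, dim)
```

    The dissertation's contributions would replace the global kNN construction and the landmark choice in this baseline with neighborhood construction and guided landmark selection performed inside local spectral subsets, with local noise models steering both steps.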