    Approximate least trimmed sum of squares fitting and applications in image analysis

    The least trimmed sum of squares (LTS) regression criterion is a robust statistical method for model fitting in the presence of outliers. Compared with the classical least squares estimator, which uses the entire data set and is consequently sensitive to outliers, LTS identifies the outliers and fits the remaining data points for improved accuracy. Exactly solving an LTS problem is NP-hard, but as we show here, LTS can be formulated as a concave minimization problem. Since a convex minimization or concave maximization problem can usually be solved globally in polynomial time, inspired by [1] we instead solve the approximate complementary problem of LTS, which is a convex minimization. We show that this complementary problem can be solved efficiently as a second-order cone program, and we thus propose an iterative procedure to approximately solve the original LTS problem. Our extensive experiments demonstrate that the proposed method is robust, efficient, and scalable on problems where the data are contaminated with outliers. We show several applications of our method in image analysis.
    Fumin Shen, Chunhua Shen, Anton van den Hengel and Zhenmin Tang
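    The paper's own route to the NP-hard LTS objective is a convex complementary problem solved as a second-order cone program. As a hedged illustration of the LTS criterion itself, the sketch below instead uses the classic alternating "concentration step" heuristic (random subsets plus refitting, in the spirit of Rousseeuw and Van Driessen's FAST-LTS), not the authors' SOCP formulation; the function name and parameters are illustrative.

```python
import numpy as np

def lts_fit(X, y, h, n_starts=20, n_iter=50, seed=None):
    """Approximate least trimmed squares via alternating concentration steps.

    Minimizes the sum of the h smallest squared residuals by repeatedly
    (1) fitting ordinary least squares on the current h-subset and
    (2) re-selecting the h points with the smallest residuals.
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    best_beta, best_cost = None, np.inf
    for _ in range(n_starts):                     # random restarts
        subset = rng.choice(n, size=h, replace=False)
        for _ in range(n_iter):
            beta, *_ = np.linalg.lstsq(X[subset], y[subset], rcond=None)
            r2 = (y - X @ beta) ** 2              # residuals on all points
            new_subset = np.argsort(r2)[:h]       # keep the h best-fit points
            if set(new_subset) == set(subset):    # converged
                break
            subset = new_subset
        cost = np.sort(r2)[:h].sum()              # trimmed objective
        if cost < best_cost:
            best_beta, best_cost = beta, cost
    return best_beta, best_cost
```

    For example, on data where roughly 30% of the responses are gross outliers, calling lts_fit(X, y, h=int(0.7 * len(y))) will typically recover coefficients close to the inlier model where ordinary least squares does not.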

    Robust techniques and applications in fuzzy clustering

    This dissertation addresses issues central to fuzzy classification. The issue of sensitivity to noise and outliers in least-squares-minimization-based clustering techniques, such as Fuzzy c-Means (FCM) and its variants, is addressed. Two novel and robust clustering schemes are presented and analyzed in detail; they approach the problem of robustness from different perspectives. The first scheme scales down the FCM memberships of data points based on the distance of the points from the cluster centers. Scaling applied to outliers reduces their membership in true clusters. This scheme, known as mega-clustering, defines a conceptual mega-cluster, which is a collective cluster of all data points but views outliers and good points differently (as opposed to the concept of Dave's noise cluster). The scheme is validated with experiments, and its similarities with Noise Clustering (NC) are also presented. The other scheme is based on the feasible solution algorithm that implements the Least Trimmed Squares (LTS) estimator. The LTS estimator is known to be resistant to noise and has a high breakdown point. The feasible solution approach also guarantees convergence of the solution set to a global optimum. Experiments show the practicality of the proposed schemes in terms of computational requirements and the attractiveness of their simple frameworks. The issue of validating clustering results has often received less attention than clustering itself. Fuzzy and non-fuzzy cluster validation schemes are reviewed, and a novel methodology for cluster validity using a test of the random position hypothesis is developed. The random position hypothesis is tested against an alternative clustered hypothesis on every cluster produced by the partitioning algorithm. The Hopkins statistic, known to be a fair estimator of randomness in a data set, is used as the basis to accept or reject the random position hypothesis, which is also the null hypothesis in this case. The concept is borrowed from the clustering-tendency domain, and its applicability to validating clusters is shown here. A unique feature selection procedure for use with large, high-dimensional molecular conformational datasets is also developed. The intelligent feature extraction scheme not only reduces the dimensionality of the feature space but also eliminates contentious issues such as those associated with labeling symmetric atoms in the molecule. The feature vector is converted to a proximity matrix and used as input to the fuzzy relational clustering (FRC) algorithm, with very promising results. Results are also validated using several cluster validity measures from the literature. Another application of fuzzy clustering considered here is image segmentation. Image analysis on extremely noisy images is carried out as a precursor to the development of an automated real-time condition-state monitoring system for underground pipelines. A two-stage FCM with intelligent feature selection is implemented as the segmentation procedure, and results on a test image are presented. A conceptual framework for automated condition state assessment is also developed.
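    The first scheme above scales FCM memberships down for points far from every cluster center. A minimal sketch of that idea follows; the specific scaling rule (a cut-off at twice the median nearest-center distance) is an illustrative assumption, not the dissertation's mega-clustering formula.

```python
import numpy as np

def robust_fcm(X, c, m=2.0, n_iter=100, seed=None):
    """Fuzzy c-means with a simple outlier down-weighting step.

    After each standard membership update, memberships of points that
    are far from every center are scaled down, loosely in the spirit
    of membership-scaling schemes; the cut-off below is a heuristic.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=c, replace=False)]
    for _ in range(n_iter):
        # distances of every point to every center, shape (n, c)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # standard FCM membership update: u_ik proportional to d_ik^(-2/(m-1))
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
        # down-weight points far from all centers (likely outliers)
        dmin = d.min(axis=1)
        thresh = 2.0 * np.median(dmin)
        u *= np.minimum(1.0, thresh / dmin)[:, None]
        # center update with fuzzified memberships
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
    return centers, u
```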

    LIMO EEG: A Toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data

    Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels, averaged across trials. More recently, tools have been developed to investigate single-trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a MATLAB toolbox (EEGLAB-compatible) for analysing evoked responses over all space and time dimensions, while accounting for single-trial variability using simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, offering a new and complementary tool for the analysis of neural evoked responses.
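    As a language-agnostic illustration of the two-level scheme (LIMO EEG itself is a MATLAB toolbox; this is not its API), the sketch below fits a per-subject GLM to single-trial data and then tests the resulting beta estimates across subjects. A plain one-sample t-test stands in for LIMO's robust statistics, and all names are ours.

```python
import numpy as np
from scipy import stats

def hierarchical_glm(trials, designs):
    """Two-level hierarchical GLM sketch.

    trials  : list of per-subject arrays, (n_trials, n_channels, n_times)
    designs : list of per-subject design matrices, (n_trials, n_regressors)
    """
    betas = []
    for Y, X in zip(trials, designs):
        n_trials = X.shape[0]
        Yf = Y.reshape(n_trials, -1)                # flatten channels x time
        B, *_ = np.linalg.lstsq(X, Yf, rcond=None)  # level 1: per-subject GLM
        betas.append(B)
    betas = np.stack(betas)         # (n_subjects, n_regressors, ch*time)
    # level 2: test each beta against zero across subjects
    t, p = stats.ttest_1samp(betas, 0.0, axis=0)
    return betas, t, p
```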

    Volumetric Untrimming: Precise decomposition of trimmed trivariates into tensor products

    3D objects modeled using Computer Aided Geometric Design tools are traditionally represented by a boundary representation (B-rep), typically using spline functions to parameterize the boundary surfaces. However, recent developments in physical analysis, in isogeometric analysis (IGA) in particular, necessitate a volumetric parametrization of the interior of the object: IGA is performed by integrating directly over the spline spaces of the volumetric spline representation of the object. Typically, tensor-product B-spline trivariates are used to parameterize the volumetric domain. A general 3D object that can be modeled in contemporary B-rep CAD tools is typically represented using trimmed B-spline surfaces. In order to capture the generality of the contemporary B-rep modeling space while supporting the needs of IGA, Massarwi and Elber (2016) proposed the use of trimmed trivariate volumetric elements. However, trimmed geometry makes the integration process more difficult, since integration over trimmed B-spline basis functions is highly challenging. In this work, we propose an algorithm that precisely decomposes a trimmed B-spline trivariate into a set of tensor-product B-spline trivariates (singular only on the boundary) that can be used to simplify the integration process in IGA. The trimmed B-spline trivariate is first subdivided into a set of trimmed Bézier trivariates at all its internal knots. Each trimmed Bézier trivariate is then decomposed into a set of mutually exclusive tensor-product B-spline trivariates that precisely cover the entire trimmed domain. This process, denoted untrimming, can be performed in either the Euclidean space or the parametric space of the trivariate. We present examples of complex geometry based on trimmed trivariates, and we demonstrate the effectiveness of the method by applying IGA to the (untrimmed) results. (Comment: 18 pages, 32 figures. Accepted at the International Conference on Geometric Modeling and Processing, GMP 2019.)
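    The first stage of the pipeline subdivides the trivariate into Bézier pieces at its internal knots. The sketch below shows the univariate analogue of that stage, splitting a B-spline curve into Bézier segments by knot insertion; it is a simplified illustration, not the paper's trivariate algorithm, and the helper name is ours.

```python
import numpy as np
from scipy.interpolate import insert

def bezier_segments(t, c, k):
    """Split a univariate B-spline (knot vector t, coefficients c,
    degree k) into Bezier segments by raising every interior knot
    to multiplicity k via knot insertion."""
    for x in np.unique(t[k + 1:-(k + 1)]):
        mult = int(np.sum(t == x))
        if mult < k:
            t, c, k = insert(x, (t, c, k), m=k - mult)
    # with every interior knot at multiplicity k, each run of k+1
    # consecutive coefficients is one Bezier control polygon
    breaks = np.unique(t)
    return breaks, [c[i * k:i * k + k + 1] for i in range(len(breaks) - 1)]

# e.g., a quadratic B-spline with one interior knot splits into two
# Bezier segments over [0, 1] and [1, 2]:
t = np.array([0., 0., 0., 1., 2., 2., 2.])
c = np.array([0., 1., 2., 0.])
breaks, segs = bezier_segments(t, c, 2)
```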

    On Feature-Based SAR Image Registration: Appropriate Feature and Retrieval Algorithm

    An investigation of appropriate features and parameter retrieval algorithms is conducted for feature-based registration of synthetic aperture radar (SAR) images. Commonly used features such as tie points, Harris corners, SIFT, and SURF are comprehensively evaluated. SURF is shown to outperform the others on criteria such as the geometric invariance of the feature and descriptor, extraction and matching speed, localization accuracy, and robustness to decorrelation and speckle. The results reveal that SURF copes well with SAR speckle, owing to a potential relationship between the Fast-Hessian detector and the refined Lee filter. Moreover, applying the Fast-Hessian detector to oversampled images with an unaltered sampling step improves the registration accuracy to the subpixel level (i.e., <1 pixel). As for parameter retrieval, the widely used random sample consensus (RANSAC) is inappropriate because it may become trapped by local occlusions and produce uncertain estimates. An extended fast least trimmed squares (EF-LTS) method is proposed, which behaves stably and performs better than RANSAC on average. Fitting SURF features with EF-LTS is hence suggested for SAR image registration. The strong performance of this scheme is validated on both InSAR and MiniSAR image pairs.
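    The abstract does not spell out the EF-LTS steps, so the sketch below shows a generic LTS-style trimming loop for retrieving a 2D affine transform from matched feature points (e.g., SURF matches); it is a stand-in for EF-LTS, not the authors' exact procedure, and the function name and h_frac default are assumptions.

```python
import numpy as np

def trimmed_affine(src, dst, h_frac=0.6, n_iter=30):
    """Estimate a 2D affine transform from matched points (src -> dst)
    by iteratively refitting on the h matches with smallest residuals."""
    n = len(src)
    h = max(3, int(h_frac * n))                  # matches kept per iteration
    A = np.hstack([src, np.ones((n, 1))])        # rows are [x, y, 1]
    idx = np.arange(n)
    for _ in range(n_iter):
        M, *_ = np.linalg.lstsq(A[idx], dst[idx], rcond=None)  # 3x2 params
        r = np.linalg.norm(A @ M - dst, axis=1)  # residual of every match
        new_idx = np.argsort(r)[:h]              # keep the h best matches
        if set(new_idx) == set(idx):             # converged
            break
        idx = new_idx
    return M, idx
```

    Unlike RANSAC's random minimal samples, each iteration here deterministically refits on the h best-supported matches, a refinement in the spirit of trimmed squares.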

    Plane-extraction from depth-data using a Gaussian mixture regression model

    We propose a novel algorithm for unsupervised extraction of piecewise planar models from depth data. Among other applications, such models are a good way of enabling autonomous agents (robots, cars, drones, etc.) to effectively perceive their surroundings and navigate in three dimensions. We propose to fit the data with a piecewise-linear Gaussian mixture regression model whose components are skewed over planes, making them flat rather than ellipsoidal in appearance; to embed an outlier-trimming process that is formally incorporated into the proposed expectation-maximization algorithm; and to selectively fuse contiguous, coplanar components. Part of our motivation is to obtain more accurate plane extraction by allowing each model component to make use of all available data through probabilistic clustering. The algorithm is thoroughly evaluated against a standard benchmark and is shown to rank among the best of the existing state-of-the-art methods. (Comment: 11 pages, 2 figures, 1 table.)
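    As a much-simplified sketch of trimming outliers inside EM-style plane fitting (the paper's actual model is a skewed Gaussian mixture regression with soft assignments, which this hard-assignment toy does not implement), consider:

```python
import numpy as np

def plane_mixture_em(P, k, trim_frac=0.1, n_iter=50, seed=None):
    """Fit k planes to 3D points P (n, 3) with a trimmed hard-EM loop.
    Degenerate components (fewer than 3 points) are simply skipped."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(k, size=len(P))        # random initial assignment
    planes = np.zeros((k, 4))                    # plane j: n . p = d
    for _ in range(n_iter):
        for j in range(k):                       # M-step: plane fit by SVD
            Q = P[labels == j]
            if len(Q) < 3:
                continue
            centroid = Q.mean(axis=0)
            normal = np.linalg.svd(Q - centroid)[2][-1]
            planes[j] = [*normal, normal @ centroid]
        # E-step: distance of every point to every plane, shape (n, k)
        r = np.abs(P @ planes[:, :3].T - planes[:, 3])
        labels = r.argmin(axis=1)
        # trimming: drop the worst trim_frac of points as outliers
        cutoff = np.quantile(r.min(axis=1), 1.0 - trim_frac)
        labels[r.min(axis=1) > cutoff] = -1
    return planes, labels
```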