    Inductive Geometric Matrix Midranges

    Covariance data represented by symmetric positive definite (SPD) matrices are ubiquitous throughout technical study as efficient descriptors of interdependent systems. Euclidean analysis of SPD matrices, while computationally fast, can lead to skewed and even unphysical interpretations of the data. Riemannian methods preserve the geometric structure of SPD data at the cost of expensive eigenvalue computations. In this paper, we propose a geometric method for unsupervised clustering of SPD data based on the Thompson metric. This technique relies upon a novel "inductive midrange" centroid computation for SPD data, whose properties are examined and numerically confirmed. We demonstrate the incorporation of the Thompson metric and inductive midrange into X-means and K-means++ clustering algorithms.
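    The clustering rests on measuring dissimilarity between SPD matrices with the Thompson metric. A minimal sketch of that distance is shown below, assuming the standard form d_T(A, B) = max_i |log λ_i(B⁻¹A)|; the paper's inductive midrange centroid itself is not reproduced here, and the function and variable names are illustrative only.

```python
import numpy as np
from scipy.linalg import eigh

def thompson_distance(A, B):
    """Thompson (part) metric between SPD matrices A and B:
    d_T(A, B) = max_i |log lambda_i|, where lambda_i are the
    generalized eigenvalues of the pencil (A, B)."""
    # eigh solves the symmetric-definite generalized eigenproblem A x = lambda B x,
    # whose eigenvalues coincide with those of B^{-1} A.
    lam = eigh(A, B, eigvals_only=True)
    return float(np.max(np.abs(np.log(lam))))

# Example: distance between two random SPD matrices.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
Y = rng.standard_normal((5, 5))
A = X @ X.T + 5 * np.eye(5)   # shift to ensure positive definiteness
B = Y @ Y.T + 5 * np.eye(5)
print(thompson_distance(A, B))
```

    Unlike affine-invariant Riemannian distances, this computation needs only one generalized eigenvalue decomposition per pair, which is part of the metric's appeal for clustering.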

    Robust Optimization using a new Volume-Based Clustering approach

    We propose a new data-driven technique for constructing uncertainty sets for robust optimization problems. The technique captures the underlying structure of sparse data through volume-based clustering, resulting in less conservative solutions than most commonly used robust optimization approaches. This can aid management in making informed decisions under uncertainty, allowing a better understanding of the potential outcomes and risks associated with possible decisions. The paper demonstrates how clustering can be performed using any desired geometry and provides a mathematical optimization formulation for generating clusters and constructing the uncertainty set. Since the method may be computationally expensive, we explore different approaches to finding an efficient solution to the problem. This contribution provides a novel data-driven approach to uncertainty set construction for robust optimization that can be applied to real-world scenarios.
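    To make the idea of cluster-based uncertainty sets concrete, the sketch below clusters historical uncertainty samples and takes the union of per-cluster axis-aligned boxes as the uncertainty set. This is a generic stand-in, not the paper's volume-based clustering formulation; the function name, margin parameter, and use of K-means are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def box_uncertainty_sets(samples, n_clusters=3, margin=0.0):
    """Cluster historical samples and return one axis-aligned box
    (lower, upper bounds) per cluster; their union acts as a
    data-driven uncertainty set.  Generic illustration only."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(samples)
    boxes = []
    for k in range(n_clusters):
        pts = samples[labels == k]
        lo = pts.min(axis=0) - margin
        hi = pts.max(axis=0) + margin
        boxes.append((lo, hi))
    return boxes

# Example: sparse 2-D uncertainty data with two distinct regimes.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal([0, 0], 0.3, (40, 2)),
                  rng.normal([3, 2], 0.5, (40, 2))])
for lo, hi in box_uncertainty_sets(data, n_clusters=2):
    print("box:", lo.round(2), "to", hi.round(2))
```

    Covering the data with several small regions rather than one large set is what yields the less conservative robust solutions described in the abstract; the paper's formulation additionally optimizes the cluster geometry and volume directly.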

    A time series distance measure for efficient clustering of input output signals by their underlying dynamics

    Starting from a dataset with input/output time series generated by multiple deterministic linear dynamical systems, this paper tackles the problem of automatically clustering these time series. We propose an extension of the so-called Martin cepstral distance that allows these time series to be clustered efficiently, and apply it to simulated electrical circuit data. Traditionally, two classes of methods are used. The first employs a distance measure on time series (e.g. Euclidean, Dynamic Time Warping) together with a clustering technique (e.g. k-means, k-medoids, hierarchical clustering) to find natural groups in the dataset. It is, however, often unclear whether these distance measures effectively account for the specific temporal correlations in the time series. The second class of methods uses the input/output data to identify a dynamic system via an identification scheme and then applies a model norm-based distance (e.g. H2, H-infinity) to determine which systems are similar. This, however, can be very time consuming for large amounts of long time series data. We show that the new distance measure presented in this paper performs as well as modelling every input/output pair explicitly, while remaining computationally much less complex. The complexity of calculating this distance between two time series of length N is O(N log N). Comment: 6 pages, 4 figures, sent in for review to IEEE L-CSS (CDC 2017 option).
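    The paper extends the Martin distance to input/output pairs; the sketch below shows only the classical output-only Martin cepstral distance on scalar signals, with cepstral coefficients estimated from a simple periodogram. The FFT-based estimation is what keeps the cost at O(N log N); the helper names and the AR(1) test signals are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def cepstral_coeffs(y, n_coeffs=50):
    """Power cepstrum of a scalar time series: inverse FFT of the
    log power spectrum (a raw periodogram is used for simplicity)."""
    spectrum = np.abs(np.fft.rfft(y)) ** 2 / len(y)
    log_spec = np.log(spectrum + 1e-12)        # avoid log(0)
    ceps = np.fft.irfft(log_spec)
    return ceps[1:n_coeffs + 1]                # drop c_0 (gain term)

def martin_distance(y1, y2, n_coeffs=50):
    """Classical output-only Martin cepstral distance:
    d^2 = sum_n n * (c_n(y1) - c_n(y2))^2."""
    c1 = cepstral_coeffs(y1, n_coeffs)
    c2 = cepstral_coeffs(y2, n_coeffs)
    n = np.arange(1, n_coeffs + 1)
    return float(np.sqrt(np.sum(n * (c1 - c2) ** 2)))

# Example: outputs of two first-order AR systems driven by white noise.
rng = np.random.default_rng(2)
def ar1(a, N=1024):
    y = np.zeros(N)
    e = rng.standard_normal(N)
    for t in range(1, N):
        y[t] = a * y[t - 1] + e[t]
    return y

print(martin_distance(ar1(0.9), ar1(0.9)))    # similar dynamics -> small distance
print(martin_distance(ar1(0.9), ar1(-0.5)))   # different dynamics -> larger distance
```

    Because the distance is computed directly from the signals, no explicit system identification step is needed, which is the source of the computational savings claimed in the abstract.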