
    Visualizing data as objects by DC (difference of convex) optimization

    In this paper we address the problem of visualizing, in a bounded region, a set of individuals as convex objects, where the individuals have an attached dissimilarity measure and a statistical value. This problem, which extends standard Multidimensional Scaling Analysis, is written as a global optimization problem whose objective is the difference of two convex functions (DC). Suitable DC decompositions allow us to use the Difference of Convex Algorithm (DCA) very efficiently. Our algorithmic approach is used to visualize two real-world datasets. Funding: Ministerio de Economía y Competitividad; Junta de Andalucía; Unión Europea; Universidad de Sevilla.
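    The DCA scheme named in the abstract — decompose the objective as g − h with both parts convex, linearize h at the current iterate, and minimize the convex remainder — can be sketched on a toy one-dimensional DC function. This is a hypothetical illustration, not code or data from the paper:

```python
import numpy as np

def dca_scalar(x0, iters=50):
    """DCA on the toy DC function f(x) = x^4 - 8x^2 (illustrative only).

    Decompose f = g - h with g(x) = x^4 (convex) and h(x) = 8x^2 (convex).
    Each DCA step replaces h by its linearization at x_k, h'(x_k)*x with
    h'(x_k) = 16*x_k, and minimizes the convex surrogate g(x) - 16*x_k*x.
    Setting the derivative 4x^3 - 16*x_k to zero gives the closed-form
    update x_{k+1} = cbrt(4*x_k)."""
    x = x0
    for _ in range(iters):
        x = np.cbrt(4.0 * x)
    return x
```

From any nonzero start the iteration converges to one of the two local minimizers x = ±2 of f (where f'(x) = 4x^3 − 16x = 0), illustrating how DCA finds stationary points of a nonconvex objective through a sequence of convex subproblems.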

    Mathematical optimization for the visualization of complex datasets

    This PhD dissertation focuses on developing new Mathematical Optimization models and solution approaches that help to gain insight into the complex data structures arising in Information Visualization. The approaches developed in this thesis merge concepts from Multivariate Data Analysis and Mathematical Optimization, bridging theoretical mathematics with real-life problems. The usefulness of Information Visualization lies in its power to improve interpretability and decision making from the unknown phenomena described by raw data, as fully discussed in Chapter 1. In particular, this thesis studies datasets involving frequency distributions and proximity relations, both of which may vary over time. Frameworks to visualize such enclosed information, making use of Mixed Integer (Non)linear Programming and Difference of Convex tools, are formally proposed. Algorithmic approaches such as Large Neighborhood Search and the Difference of Convex Algorithm enable us to develop matheuristics to handle these models. More specifically, Chapter 2 addresses the problem of visualizing a frequency distribution and an adjacency relation attached to a set of individuals. This information is represented using a rectangular map, i.e., a subdivision of a rectangle into rectangular portions so that their areas reflect the frequencies and the adjacencies between portions represent the adjacencies between the individuals. The visualization problem is formulated as a Mixed Integer Linear Programming model, and a matheuristic that has this model at its heart is proposed. Chapter 3 generalizes the model presented in the previous chapter by developing a visualization framework which simultaneously handles the representation of a frequency distribution and a dissimilarity relation.
This framework consists of a partition of a given rectangle into piecewise rectangular portions so that the areas of the regions represent the frequencies and the distances between them represent the dissimilarities. This visualization problem is formally stated as a Mixed Integer Nonlinear Programming model, which is solved by means of a matheuristic based on Large Neighborhood Search. Contrary to the previous chapters, in which a partition of the visualization region is sought, Chapter 4 addresses the problem of visualizing a set of individuals, which has an attached dissimilarity measure and a frequency distribution, without necessarily covering the visualization region. In this visualization problem individuals are depicted as convex bodies whose areas are proportional to the given frequencies. The aim is to determine the location of the convex bodies in the visualization region. In order to solve this problem, which generalizes standard Multidimensional Scaling, Difference of Convex tools are used. In Chapter 5, the model stated in the previous chapter is extended to the dynamic case, namely considering that frequencies and dissimilarities are observed along a set of time periods. The solution approach combines Difference of Convex techniques with Nonconvex Quadratic Binary Optimization. All the approaches presented are tested on real datasets. Finally, Chapter 6 closes the thesis with general conclusions and future lines of research.
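The standard Multidimensional Scaling problem that Chapter 4 generalizes can be read as raw-stress minimization: place points so that pairwise Euclidean distances match given dissimilarities. The following plain-NumPy gradient descent is an illustrative sketch of that classical baseline, not the thesis code:

```python
import numpy as np

def mds_gradient(D, dim=2, iters=2000, lr=0.01, X0=None, seed=0):
    """Raw-stress MDS by gradient descent (classical baseline, sketch only).

    Minimizes S(X) = sum_{i<j} (||x_i - x_j|| - D_ij)^2 for a symmetric
    dissimilarity matrix D, starting from X0 or a random configuration."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.standard_normal((n, dim)) if X0 is None else X0.astype(float).copy()
    for _ in range(iters):
        diff = X[:, None, :] - X[None, :, :]      # pairwise differences x_i - x_j
        dist = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(dist, 1.0)               # avoid division by zero
        coef = (dist - D) / dist                  # residual weights
        np.fill_diagonal(coef, 0.0)
        grad = 2.0 * (coef[:, :, None] * diff).sum(axis=1)
        X -= lr * grad
    return X
```

The chapter's problem replaces points by convex bodies with prescribed areas, which is where the Difference of Convex machinery comes in; this sketch only shows the point-based special case.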

    Solution Path Clustering with Adaptive Concave Penalty

    Fast accumulation of large amounts of complex data has created a need for more sophisticated statistical methodologies to discover interesting patterns and better extract information from these data. The large scale of the data often results in challenging high-dimensional estimation problems where only a minority of the data shows specific grouping patterns. To address these emerging challenges, we develop a new clustering methodology that introduces the idea of a regularization path into unsupervised learning. A regularization path for a clustering problem is created by varying the degree of the sparsity constraint imposed on the differences between objects via the minimax concave penalty with adaptive tuning parameters. Instead of providing a single solution represented by a cluster assignment for each object, the method produces a short sequence of solutions that determines not only the cluster assignment but also a corresponding number of clusters for each solution. The optimization of the penalized loss function is carried out through an MM algorithm with block coordinate descent. The advantages of this clustering algorithm compared to other existing methods are as follows: it does not require the number of clusters as input; it is capable of simultaneously separating out irrelevant or noisy observations that show no grouping pattern, which can greatly improve data interpretation; and it is a general methodology that can be applied to many clustering problems. We test this method on various simulated datasets and on gene expression data, where it shows performance that is better than or competitive with several existing clustering methods.
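    The minimax concave penalty (MCP) named in the abstract has a standard closed form, and its scalar thresholding operator shows why it avoids the lasso's bias: large differences pass through unshrunk. This is the textbook MCP, without the paper's adaptive tuning parameters:

```python
def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty: quadratically tapered L1 up to |t| = gamma*lam,
    then constant, so large values incur no extra penalty."""
    a = abs(t)
    if a <= gamma * lam:
        return lam * a - t * t / (2.0 * gamma)
    return 0.5 * gamma * lam * lam

def mcp_threshold(z, lam, gamma):
    """Scalar MCP thresholding operator (requires gamma > 1): zero below lam,
    rescaled soft-thresholding on [lam, gamma*lam], identity above, so large
    between-object differences are left unshrunk, unlike soft-thresholding."""
    a = abs(z)
    if a <= lam:
        return 0.0
    if a <= gamma * lam:
        sign = 1.0 if z > 0 else -1.0
        return sign * (a - lam) / (1.0 - 1.0 / gamma)
    return z
```

Sweeping lam from large to small fuses fewer and fewer pairwise differences to zero, which is exactly the "solution path" idea: each lam value yields a cluster assignment and an implied number of clusters.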

    TVL1 Planarity Regularization for 3D Shape Approximation

    The modern emergence of automation in many industries has given impetus to extensive research into mobile robotics. Novel perception technologies now enable cars to drive autonomously, tractors to till a field automatically and underwater robots to construct pipelines. An essential requirement for both perception and autonomous navigation is the analysis of the 3D environment using sensors such as laser scanners or stereo cameras. 3D sensors generate a very large number of 3D data points when sampling object shapes within an environment, but crucially provide no intrinsic information about the environment within which the robots operate. This work focuses on the fundamental task of 3D shape reconstruction and modelling from 3D point clouds. The novelty lies in the representation of surfaces by algebraic functions with limited support, which enables the extraction of smooth, consistent implicit shapes from noisy samples with heterogeneous density. The minimization of second-order total variation makes it possible to enforce the planar surfaces which often occur in man-made environments. Applying the new technique means that less accurate, low-cost 3D sensors can be employed without sacrificing 3D shape reconstruction accuracy.
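    A one-dimensional analogue conveys the idea of second-order total-variation regularization: penalizing the L1 norm of second differences favours piecewise-linear fits, the 1D counterpart of the piecewise-planar surfaces described above. The sketch below uses a lagged-diffusivity (IRLS) iteration with a smoothed absolute value; it is an illustration of the principle, not the paper's 3D method:

```python
import numpy as np

def tv2_denoise(y, lam=1.0, eps=1e-3, iters=30):
    """1D second-order TV denoising sketch (lagged diffusivity / IRLS).

    Approximately minimizes 0.5*||x - y||^2 + lam * sum_i |(D2 x)_i|,
    where D2 takes second differences x[i] - 2x[i+1] + x[i+2]. The
    absolute value is smoothed as sqrt(t^2 + eps^2); each iteration
    solves the resulting weighted linear system exactly."""
    n = len(y)
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    x = y.astype(float).copy()
    for _ in range(iters):
        w = np.sqrt((D2 @ x) ** 2 + eps * eps)       # smoothed |D2 x|
        A = np.eye(n) + lam * D2.T @ (D2 / w[:, None])
        x = np.linalg.solve(A, y)
    return x
```

Where the signal is truly linear, the weights w shrink toward eps, the effective quadratic penalty grows, and the reconstruction is flattened onto a line; in the 3D setting the same mechanism drives noisy samples onto planar patches.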