
    Representation, Recognition and Collaboration with Digital Ink

    Pen input for computing devices is now widespread, providing a promising interaction mechanism for many purposes. Nevertheless, the diverse nature of digital ink and varied application domains still present many challenges. First, the sampling rate and resolution of pen-based devices keep improving, making input data more costly to process and store. At the same time, existing applications typically record digital ink either in proprietary formats, which are restricted to single platforms and consequently lack portability, or simply as images, which lose important information. Moreover, in certain domains such as mathematics, while current systems now achieve good recognition rates on individual symbols, recognition of complete expressions remains a problem due to the absence of an effective method that can reliably identify the spatial relationships among symbols. Last, but not least, existing digital ink collaboration tools are platform-dependent and typically allow only one input method to be used at a time. Together with the absence of recognition, this has placed significant limitations on what can be done. In this thesis, we investigate these issues and make contributions to each. We first present an algorithm that can accurately approximate a digital ink curve by selecting a subset of points from the original trace. This allows a compact representation of digital ink for efficient processing and storage. We then describe an algorithm that can automatically identify certain important features in handwritten symbols. Identifying these features can help solve a number of problems, such as improving two-dimensional mathematical recognition. Last, we present a framework for multi-user online collaboration in a pen-based and graphical environment. This framework is portable across multiple platforms and allows multimodal interactions in collaborative sessions. To demonstrate our ideas, we present InkChat, a whiteboard application, which can be used to conduct collaborative sessions on a shared canvas. It allows participants to use voice and digital ink independently and simultaneously, which has been found useful in remote collaboration.
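
    The abstract does not spell out the point-selection algorithm itself. As a purely illustrative sketch of the general idea, approximating an ink trace by a subset of its own sample points, here is a minimal Ramer-Douglas-Peucker simplification in Python; the trace data and tolerance are assumptions, and this is not necessarily the thesis's method.

```python
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0.0:
        return math.hypot(px - ax, py - ay)
    # Parallelogram area divided by base length gives the height.
    return abs(dx * (ay - py) - dy * (ax - px)) / length

def rdp(points, tolerance):
    """Return a subset of the original trace points within `tolerance`."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord joining the endpoints.
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tolerance:
        return [points[0], points[-1]]
    # Recurse on both halves and merge, dropping the duplicated split point.
    left = rdp(points[: idx + 1], tolerance)
    right = rdp(points[idx:], tolerance)
    return left[:-1] + right

trace = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(rdp(trace, tolerance=1.0))
```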

    Non-acyclicity of coset lattices and generation of finite groups


    Handbook of Computer Vision Algorithms in Image Algebra


    Skeletal representations of orthogonal shapes

    Skeletal representations are important shape descriptors which encode topological and geometrical properties of shapes and reduce their dimension. Skeletons are used in several fields of science and attract the attention of many researchers. In the biocad field, the analysis of structural properties such as the porosity of biomaterials requires the prior computation of a skeleton. As the size of three-dimensional images becomes larger, efficient and robust algorithms that extract simple skeletal structures are required. The most popular and prominent skeletal representation is the medial axis, defined as the set of shape points which have at least two closest points on the shape boundary. Unfortunately, the medial axis is highly sensitive to noise and perturbations of the shape boundary. That is, a small change of the shape boundary may involve a considerable change of its medial axis. Moreover, the exact computation of the medial axis is only possible for a few classes of shapes. For example, the medial axis of a polyhedron is composed of non-planar surfaces, and its accurate and robust computation is difficult. These problems led to the emergence of approximate medial axis representations. There exist two main approximation methods: the shape is approximated with another shape class, or the Euclidean metric is approximated with another metric. The main contribution of this thesis is the combination of a specific shape and metric simplification. The input shape is approximated with an orthogonal shape, that is, a polygon or polyhedron bounded by axis-aligned edges or faces, respectively. In the same vein, the Euclidean metric is replaced by the L-infinity or Chebyshev metric. Despite the simpler structure of orthogonal shapes, there are few works on skeletal representations applied to them; much of the effort has been devoted to binary images and volumes, which are a subset of orthogonal shapes. Two new skeletal representations based on this paradigm are introduced: the cube skeleton and the scale cube skeleton. The cube skeleton is shown to be composed of straight line segments or planar faces and to be homotopy equivalent to the input shape. The scale cube skeleton builds upon the cube skeleton and introduces a family of skeletons that are more stable under shape noise and perturbations. In addition, the necessary algorithms to compute the cube skeleton of polygons and polyhedra and the scale cube skeleton of polygons are presented. Several experimental results confirm the efficiency, robustness and practical use of all the presented methods.
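
    The cube skeleton and scale cube skeleton are specific to this thesis, but the flavor of working under the Chebyshev (L-infinity) metric on orthogonal data can be illustrated with a chessboard distance transform of a binary image and its local maxima. A minimal sketch, assuming a NumPy/SciPy environment; the local-maximum step is a crude stand-in, not the thesis's construction.

```python
import numpy as np
from scipy import ndimage

# A small binary image standing in for an orthogonal shape (1 = inside).
shape = np.zeros((9, 15), dtype=np.uint8)
shape[2:7, 2:13] = 1

# Chebyshev (chessboard) distance to the background for every interior pixel.
dist = ndimage.distance_transform_cdt(shape, metric='chessboard')

# Crude skeleton proxy: interior pixels whose distance is a local maximum
# within their 3x3 neighbourhood.
local_max = ndimage.maximum_filter(dist, size=3)
skeleton = (dist > 0) & (dist == local_max)

print(dist)
print(skeleton.astype(int))
```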

    Efficient computation of discrete Voronoi diagram and homotopy-preserving simplified medial axis of a 3d polyhedron

    The Voronoi diagram is a fundamental geometric data structure and has been well studied in computational geometry and related areas. A Voronoi diagram defined using the Euclidean distance metric is also closely related to the Blum medial axis, a well-known skeletal representation. Voronoi diagrams and medial axes have been shown useful for many 3D computations and operations, including proximity queries, motion planning, mesh generation, finite element analysis, and shape analysis. However, their application to complex 3D polyhedral and deformable models has been limited, due to the difficulty of computing exact Voronoi diagrams in an efficient and reliable manner. In this dissertation, we bridge this gap by presenting efficient algorithms to compute discrete Voronoi diagrams and simplified medial axes of 3D polyhedral models with geometric and topological guarantees. We apply these algorithms to complex 3D models and use them to perform interactive proximity queries, motion planning and skeletal computations. We present three new results. First, we describe an algorithm to compute 3D distance fields of geometric models by using a linear factorization of Euclidean distance vectors. This formulation maps directly to the linearly interpolating graphics rasterization hardware and enables us to compute distance fields of complex 3D models at interactive rates. We also use clamping and culling algorithms based on properties of Voronoi diagrams to accelerate this computation. We introduce surface distance maps, a compact distance vector field representation based on a mesh parameterization of triangulated two-manifolds, and use them to perform proximity computations. Our second main result is an adaptive sampling algorithm to compute an approximate Voronoi diagram that is homotopy equivalent to the exact Voronoi diagram and preserves topological features. We use this algorithm to compute a homotopy-preserving simplified medial axis of complex 3D models. Our third result is a unified approach to performing different proximity queries among multiple deformable models using second-order discrete Voronoi diagrams. We introduce a new query called the N-body distance query and show that different proximity queries, including collision detection, separation distance and penetration depth, can be performed based on the N-body distance query. We compute the second-order discrete Voronoi diagram using graphics hardware and use distance bounds to overcome sampling errors and perform conservative computations. We have applied these queries to various deformable simulations and observed up to an order of magnitude improvement over prior algorithms.
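
    As a point of reference for what a discrete Voronoi diagram and distance field are, here is a brute-force CPU computation for point sites on a 2D grid; the GPU rasterization, clamping, and culling described in the dissertation are not reproduced here, and the grid size and site positions are assumptions.

```python
import numpy as np

def discrete_voronoi(grid_shape, sites):
    """Brute-force per-cell nearest-site labels and Euclidean distances."""
    h, w = grid_shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Squared distance from every grid cell to every site: shape (n_sites, h, w).
    d2 = np.stack([(ys - sy) ** 2 + (xs - sx) ** 2 for sy, sx in sites])
    labels = np.argmin(d2, axis=0)          # discrete Voronoi diagram
    dist = np.sqrt(np.min(d2, axis=0))      # discrete distance field
    return labels, dist

labels, dist = discrete_voronoi((8, 8), sites=[(1, 1), (6, 2), (3, 6)])
print(labels)
print(np.round(dist, 2))
```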

    Computational Topology Methods for Shape Modelling Applications

    This thesis deals with computational topology, a recent branch of research that involves both mathematics and computer science, and tackles the problem of discretizing Morse theory for functions defined on a triangle mesh. The application context of Morse theory in general, and Reeb graphs in particular, is the analysis of geometric shapes and the extraction of skeletal structures that synthetically represent a shape, preserving its topological properties and main morphological characteristics. In Computer Graphics, shapes, that is, one-, two- or higher-dimensional connected, compact spaces having a visual appearance, are typically approximated by digital models. Since topology focuses on the qualitative properties of spaces, such as connectedness and the number and type of holes, it is the best tool to describe the shape of a mathematical model at a high level of abstraction. Geometry, conversely, is mainly related to the quantitative characteristics of a shape. Thus, the combination of topology and geometry creates a new generation of tools that provide a computational description of the most representative features of the shape along with their relationships. Extracting qualitative information, that is, information related to the semantics of the shape and its morphological structure, from discrete models is a central goal in shape modeling. In this thesis a conceptual model is proposed which represents a given surface based on a topological coding that defines a sketch of the surface, discarding irrelevant details and classifying its topological type. The approach is based on Morse theory and Reeb graphs, which provide a very useful shape abstraction method for the analysis and structuring of the information contained in the geometry of the discrete shape model. To fully develop the method, both theoretical and computational aspects have been considered, related to the definition and extension of the Reeb graph to the discrete domain. For the definition and automatic construction of the conceptual model, a new method has been developed that analyzes and characterizes a triangle mesh with respect to the behavior of a real-valued, at least continuous, function defined on the mesh. The proposed solution also handles degenerate critical points, such as non-isolated critical points. To do so, the surface model is characterized using a contour-based strategy, recognizing critical areas instead of critical points and coding the evolution of the contour levels in a graph-like structure, named the Extended Reeb Graph (ERG), which is a high-level abstract model suitable for representing and manipulating piecewise-linear surfaces. The descriptive power of the ERG has also been augmented by introducing geometric information alongside the topological information, and the relation between the extracted topological and morphological features and the real characteristics of the surface has been studied, giving an evaluation of the size of the discarded details. Finally, the effectiveness of our description framework has been evaluated in several application contexts.
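
    As background for the contour-based characterization, the classical discrete test (in the spirit of Banchoff) classifies a vertex of a piecewise-linear function by counting sign changes of the function around its one-ring. A minimal sketch, assuming the one-ring neighbours are given in cyclic order and no neighbour shares the vertex's value; the thesis's Extended Reeb Graph works on critical areas and contour levels rather than isolated vertices.

```python
def classify_vertex(f_v, f_ring):
    """Classify a mesh vertex from its value f_v and the values f_ring of its
    one-ring neighbours, listed in cyclic order around the vertex."""
    signs = [1 if f > f_v else -1 for f in f_ring]      # assume no ties
    # Count sign changes around the closed ring.
    changes = sum(1 for a, b in zip(signs, signs[1:] + signs[:1]) if a != b)
    if all(s > 0 for s in signs):
        return "minimum"
    if all(s < 0 for s in signs):
        return "maximum"
    if changes >= 4:
        return "saddle"   # 4 changes: simple saddle; more: degenerate saddle
    return "regular"

print(classify_vertex(0.0, [1, 2, 1.5, 0.8]))   # minimum
print(classify_vertex(0.0, [1, -1, 2, -2]))     # saddle
print(classify_vertex(0.0, [1, 2, -1, 3]))      # regular
```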

    Development of a CAD Model Simplification Framework for Finite Element Analysis

    Analyzing complex 3D models with finite element analysis software requires suppressing features/parts that are not likely to influence the analysis results; doing so can significantly improve computational performance in terms of both mesh size and mesh quality. The suppression step often depends on the context and application. Currently, most analysts perform this step manually. It can take a long time on a complex model and is tedious in nature. The goal of this thesis was to develop a simplification framework for both part and assembly CAD models for finite element analysis model preparation. At the part level, a rule-based approach for suppressing holes, rounds, and chamfers is presented. At the assembly level, a tool for suppressing multiple specified part models at once is described. After discussing the frameworks, the tools are demonstrated on several different models to show the complete approach and the computational performance. The work presented in this thesis is expected to significantly reduce the manual, time-consuming activities within the model simplification stage. This is accomplished through multiple feature/part suppression, compared to the industry standard of suppressing one feature/part at a time. A simplified model speeds up the overall analysis, reducing the meshing time and the calculation of the analysis values, while maintaining, and on occasion improving, the quality of the analysis.
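
    The abstract does not state the suppression rules themselves, so the following is a hypothetical illustration of what a rule-based feature filter might look like; the feature classes, the size-ratio threshold, and all names are assumptions, not the thesis's rules.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    kind: str      # e.g. "hole", "round", "chamfer", "boss"
    size: float    # characteristic size: hole diameter, round radius, ...

def features_to_suppress(features, part_size, ratio=0.05):
    """Hypothetical rule: suppress holes/rounds/chamfers whose characteristic
    size is below `ratio` of the overall part size."""
    small_kinds = {"hole", "round", "chamfer"}
    return [f for f in features
            if f.kind in small_kinds and f.size < ratio * part_size]

part = [Feature("H1", "hole", 2.0), Feature("R3", "round", 0.5),
        Feature("B1", "boss", 40.0)]
print([f.name for f in features_to_suppress(part, part_size=100.0)])
```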

    Automated evaluation of radiodensities in a digitized mammogram database using local contrast estimation

    Mammographic radiodensity is one of the strongest risk factors for developing breast cancer, and there is an urgent need to develop automated methods for predicting this marker. Previous attempts to automatically identify and quantify radiodense tissue in digitized mammograms have fallen short of the ideal. Many algorithms require significant heuristic parameters to be evaluated and set for predicting radiodensity. Many others have not demonstrated the efficacy of their techniques on a sufficiently large and diverse patient database. This thesis attempts to address both of these drawbacks of previous work. Novel automated digital image processing algorithms are proposed that have demonstrated the ability to rapidly sift through digitized mammogram databases while accurately estimating radiodensity. A judicious combination of point-processing, statistical, neural and contrast enhancement techniques has been employed to address this formidable problem. The algorithms have been developed and exercised using over 700 mammograms obtained from multiple age and ethnic groups and digitized using more than one type of X-ray digitizer. The automated algorithms developed in this thesis have been validated by comparing the estimation results for 40 of these mammograms with those predicted by a previously established manual segmentation technique. The automated algorithms developed in this thesis show considerable promise of being useful in epidemiological studies that correlate other behavioral and genetic risk factors with mammographic radiodensity.
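
    The actual pipeline combines point-processing, statistical, neural and contrast-enhancement steps that the abstract does not detail. As a hypothetical illustration of local contrast estimation only, here is a sketch that counts pixels brighter than their local mean by a margin; the window size, margin, and synthetic data are assumptions.

```python
import numpy as np
from scipy import ndimage

def radiodense_fraction(image, breast_mask, window=31, margin=0.05):
    """Fraction of breast pixels whose intensity exceeds the local mean by
    `margin` (a crude stand-in for local-contrast-based segmentation)."""
    local_mean = ndimage.uniform_filter(image.astype(float), size=window)
    dense = (image > local_mean + margin) & breast_mask
    return dense.sum() / breast_mask.sum()

rng = np.random.default_rng(0)
img = rng.random((128, 128))              # synthetic stand-in for a mammogram
mask = np.ones_like(img, dtype=bool)      # whole image treated as breast region
print(round(radiodense_fraction(img, mask), 3))
```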

    Locally Persistent Categories And Metric Properties Of Interleaving Distances

    This thesis presents a uniform treatment of different distances used in the applied topology literature. We introduce the notion of a locally persistent category, which is a category with a notion of approximate morphism that lets one define an interleaving distance on its collection of objects. The framework is based on a combination of enriched category theory and homotopy theory, and encompasses many well-known examples of interleaving distances, as well as weaker notions of distance, such as the homotopy interleaving distance and the Gromov–Hausdorff distance. We show that the approach is not only an organizational tool but also a useful theoretical tool that allows one to formulate simple conditions under which a certain construction is stable, or under which an interleaving distance is, e.g., complete and geodesic. Being based on the well-developed theory of enriched categories, constructions in the theory of interleavings can be conveniently cast as enriched universal constructions. We give several applications. We generalize Blumberg and Lesnick's homotopy interleaving distance to categories of persistent objects of a model category and prove that this distance is intrinsic and complete. We identify a universal property for the Gromov–Hausdorff distance that gives simple conditions under which an invariant of metric spaces is stable. We define a distance for persistent metric spaces, a generalization of filtered metric spaces, that specializes to known distances on filtered metric spaces and dynamic metric spaces, and use it to lift stability results for invariants of metric spaces to invariants of persistent metric spaces. We present a new stable invariant of metric measure spaces, the kernel density filtration, that encodes the information of a kernel density estimate for all choices of bandwidth. We study the interleaving distance in the category of persistent sets and show that, when restricted to a well-behaved subcategory that in particular contains all dendrograms and merge trees, one gets a complete and geodesic distance. We relate our approach to previous categorical approaches by showing that categories of generalized persistence modules and categories with a flow give rise to locally persistent categories in a way that preserves both metric and categorical structure.
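
    For readers unfamiliar with interleavings, the standard definition for persistence modules indexed by the real line, which the locally persistent category framework generalizes, can be stated briefly as follows (the notation is the usual one from the persistence literature, not taken from the thesis):

```latex
% Epsilon-interleaving of persistence modules F, G : (R, <=) -> Vect.
% F_{a <= b} denotes the structure map F(a) -> F(b).
\begin{align*}
  &\varphi_a : F(a) \to G(a+\varepsilon), \qquad
   \psi_a : G(a) \to F(a+\varepsilon) \quad \text{(natural in } a\text{)},\\
  &\psi_{a+\varepsilon} \circ \varphi_a = F_{a \le a+2\varepsilon}, \qquad
   \varphi_{a+\varepsilon} \circ \psi_a = G_{a \le a+2\varepsilon},\\
  &d_I(F,G) = \inf\{\varepsilon \ge 0 : F \text{ and } G \text{ are } \varepsilon\text{-interleaved}\}.
\end{align*}
```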