
    The Convex Hull Problem in Practice: Improving the Running Time of the Double Description Method

    The double description method is a simple but widely used algorithm for computing the extreme points of polyhedral sets. A key implementation question is how to test extreme points for adjacency efficiently. This dissertation presents two significant contributions related to adjacency testing. First, the currently used data structures are revisited and various optimizations are proposed, with empirical evidence demonstrating their competitiveness. Second, a new adjacency test is introduced: a refinement of the well-known algebraic test that incorporates a technique for avoiding redundant computations. Its correctness is formally proven, and experimental results demonstrate its superiority in multiple degenerate scenarios. Parallel computation is one further aspect of the double description method covered in this work: a recently introduced divide-and-conquer technique is revisited and its considerable practical limitations are demonstrated.
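
    For context, the classical combinatorial adjacency test (the standard alternative to the algebraic test that this dissertation refines) declares two extreme rays adjacent iff no third ray is tight on every constraint shared by the pair. Below is a minimal Python sketch of that test using bitmasks of tight constraints; the rays and constraint counts are hypothetical.

```python
def combinatorial_adjacency(zero_sets, i, j):
    """Classical combinatorial adjacency test for the double description method.

    zero_sets[k] is a bitmask whose bits mark the inequalities that are
    tight (satisfied with equality) at extreme ray k.  Rays i and j are
    adjacent iff no third ray's tight set contains the intersection of
    their tight sets.
    """
    common = zero_sets[i] & zero_sets[j]
    for k, z in enumerate(zero_sets):
        if k in (i, j):
            continue
        # If another ray is tight on every inequality in `common`,
        # then i and j are not adjacent.
        if common & z == common:
            return False
    return True

# Hypothetical example: four rays, five inequalities (bits 0..4).
zero_sets = [0b00111, 0b01101, 0b11001, 0b10011]
print(combinatorial_adjacency(zero_sets, 0, 1))  # -> True
```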

    Computational topology with Regina: Algorithms, heuristics and implementations

    Regina is a software package for studying 3-manifold triangulations and normal surfaces. It includes a graphical user interface and Python bindings, and also supports angle structures, census enumeration, combinatorial recognition of triangulations, and high-level functions such as 3-sphere recognition, unknot recognition and connected sum decomposition. This paper brings 3-manifold topologists up to date with Regina as it appears today, and documents for the first time in the literature some of the key algorithms, heuristics and implementations that are central to Regina's performance. These include the all-important simplification heuristics, key choices of data structures and algorithms to alleviate bottlenecks in normal surface enumeration, modern implementations of 3-sphere recognition and connected sum decomposition, and more. We also give some historical background for the project, including the key role played by Rubinstein in its genesis 15 years ago, and discuss current directions for future development.
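
    As a taste of the Python bindings mentioned above, here is a hedged sketch of a short Regina session. The method names follow Regina's documented API but vary between releases (for example, the simplification routine has been exposed as intelligentSimplify() and later as simplify()), so treat the exact spellings as assumptions and check help(regina) in your installation.

```python
import regina

# "cPcbbbiht" is the standard isomorphism signature of a 2-tetrahedron
# triangulation of the figure-eight knot complement.
tri = regina.Triangulation3.fromIsoSig("cPcbbbiht")
print(tri.size(), "tetrahedra")

# The simplification heuristics discussed in the paper are exposed as a
# single high-level call (name is version-dependent; see the note above).
tri.intelligentSimplify()
print(tri.size(), "tetrahedra after simplification")
print("isomorphism signature:", tri.isoSig())
```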

    Enumerating fundamental normal surfaces: Algorithms, experiments and invariants

    Computational knot theory and 3-manifold topology have seen significant breakthroughs in recent years, despite the fact that many key algorithms have complexity bounds that are exponential or greater. In this setting, experimentation is essential for understanding the limits of practicality, as well as for gauging the relative merits of competing algorithms. In this paper we focus on normal surface theory, a key tool that appears throughout low-dimensional topology. Stepping beyond the well-studied problem of computing vertex normal surfaces (essentially extreme rays of a polyhedral cone), we turn our attention to the more complex task of computing fundamental normal surfaces (essentially an integral basis for such a cone). We develop, implement and experimentally compare a primal and a dual algorithm, both of which combine domain-specific techniques with classical Hilbert basis algorithms. Our experiments indicate that we can solve extremely large problems that were once thought intractable. As a practical application of our techniques, we fill gaps in the KnotInfo database by computing 398 previously unknown crosscap numbers of knots.
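
    To make the vertex/fundamental distinction concrete: vertex normal surfaces correspond to the extreme rays of a rational cone, while fundamental normal surfaces correspond to its Hilbert basis, the minimal set of integer points from which every integer point of the cone arises as a nonnegative integer combination. The brute-force sketch below (deliberately naive, and not the paper's primal or dual algorithm) computes such a basis for a tiny hypothetical cone by bounded enumeration.

```python
import itertools

def hilbert_basis_bruteforce(A, bound):
    """Brute-force Hilbert basis of the cone {x >= 0 : A x = 0}, restricted
    to integer points with coordinates in [0, bound].  A nonzero cone point
    is fundamental iff it is not the sum of two nonzero cone points.
    Exponential in the dimension -- for illustration only.
    """
    dim = len(A[0])

    def in_cone(p):
        return any(p) and all(sum(a[i] * p[i] for i in range(dim)) == 0
                              for a in A)

    points = [p for p in itertools.product(range(bound + 1), repeat=dim)
              if in_cone(p)]
    point_set = set(points)
    basis = []
    for p in points:
        # p is reducible iff p = q + (p - q) for some smaller cone point q.
        reducible = any(q != p
                        and all(qi <= pi for qi, pi in zip(q, p))
                        and tuple(pi - qi for pi, qi in zip(p, q)) in point_set
                        for q in points)
        if not reducible:
            basis.append(p)
    return basis

# Hypothetical cone in three variables: x0 + x1 = 2*x2, all coordinates >= 0.
print(hilbert_basis_bruteforce([[1, 1, -2]], bound=4))
# -> [(0, 2, 1), (1, 1, 1), (2, 0, 1)]
```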

    Optimizing the double description method for normal surface enumeration

    Many key algorithms in 3-manifold topology involve the enumeration of normal surfaces, which is based upon the double description method for finding the vertices of a convex polytope. Typically we are only interested in a small subset of these vertices, thus opening the way for substantial optimization. Here we give an account of the vertex enumeration problem as it applies to normal surfaces, and present new optimizations that yield strong improvements in both running time and memory consumption. The resulting algorithms are tested using the freely available software package Regina.
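
    For readers unfamiliar with the underlying machinery, the sketch below shows one textbook incremental step of the double description method (not Regina's optimized variant): the cone's extreme rays are intersected with one new half-space by keeping rays on the nonnegative side and combining each adjacent positive/negative pair into a new ray on the hyperplane. Exact arithmetic is assumed (integers here); all data are hypothetical.

```python
def dd_step(rays, constraints, new_c):
    """One incremental step of the double description method.

    rays:        extreme rays of the current cone (tuples of integers)
    constraints: the half-spaces a.x >= 0 processed so far
    new_c:       the next half-space to intersect with
    """
    def dot(a, x):
        return sum(ai * xi for ai, xi in zip(a, x))

    def zero_set(r):
        # Indices of the constraints that are tight at ray r.
        return frozenset(i for i, a in enumerate(constraints) if dot(a, r) == 0)

    def adjacent(r, s):
        # Combinatorial adjacency test: no third ray may be tight on
        # every constraint that is tight at both r and s.
        common = zero_set(r) & zero_set(s)
        return not any(common <= zero_set(t)
                       for t in rays if t is not r and t is not s)

    pos = [r for r in rays if dot(new_c, r) > 0]
    neg = [r for r in rays if dot(new_c, r) < 0]
    zero = [r for r in rays if dot(new_c, r) == 0]

    new_rays = pos + zero
    for p in pos:
        for n in neg:
            if adjacent(p, n):
                lp, ln = dot(new_c, p), dot(new_c, n)
                # Positive combination lying exactly on the new hyperplane.
                new_rays.append(tuple(lp * ni - ln * pi
                                      for pi, ni in zip(p, n)))
    return new_rays

# Hypothetical example: the positive orthant cut by x + y - z >= 0.
rays = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
constraints = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(dd_step(rays, constraints, (1, 1, -1)))
# -> [(1, 0, 0), (0, 1, 0), (1, 0, 1), (0, 1, 1)]
```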

    Evolution of Ada technology in the flight dynamics area: Implementation/testing phase analysis

    An analysis is presented of the software engineering issues related to the use of Ada for the implementation and system testing phases of four Ada projects developed in the flight dynamics area. These projects reflect an evolving understanding of how to use Ada features more effectively. In addition, the testing methodology used on these projects changed substantially from that used on previous FORTRAN projects.

    Computation of elementary modes: a unifying framework and the new binary approach

    BACKGROUND: Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven valuable for many applications. However, computing elementary modes is a hard computational task, and recent years have seen a proliferation of algorithms dedicated to it, calling for a unifying point of view and continued improvement of the current methods. RESULTS: We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods for computing elementary modes and allows a clear comparison between them. Building on this benchmark, we introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be recovered in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, a software package that is free for academic use. The binary approach decreases the memory demand by up to 96% without loss of speed, making it the most efficient method available to date for computing elementary modes. CONCLUSIONS: The equivalence between elementary modes and extreme-ray computations offers opportunities for employing tools from polyhedral computation in metabolic pathway analysis. The new binary approach introduced here was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks.
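
    The post-processing step described above, recovering stoichiometric coefficients from a binary pattern of participating reactions, amounts to a nullspace computation: restrict the stoichiometric matrix to the participating reactions and take the (one-dimensional) nullspace of the submatrix. The sketch below is a generic illustration of that idea, not FluxAnalyzer's implementation; the toy network and the function name coefficients_from_pattern are hypothetical.

```python
import numpy as np

def coefficients_from_pattern(S, pattern, tol=1e-10):
    """Recover the flux coefficients of an elementary mode from its binary
    pattern of participating reactions.

    S:       stoichiometric matrix (metabolites x reactions)
    pattern: boolean mask of the reactions carrying nonzero flux

    For an elementary mode, the submatrix of S restricted to the
    participating reactions has a one-dimensional nullspace; any nonzero
    vector in it gives the coefficients up to scaling.
    """
    S = np.asarray(S, dtype=float)
    idx = np.flatnonzero(pattern)
    # Rows of Vt with (numerically) zero singular values span the
    # nullspace of the restricted matrix.
    _, sigma, Vt = np.linalg.svd(S[:, idx])
    rank = int(np.sum(sigma > tol))
    if len(idx) - rank != 1:
        raise ValueError("pattern does not define an elementary mode")
    v = Vt[-1]
    mode = np.zeros(S.shape[1])
    mode[idx] = v / v[np.argmax(np.abs(v))]  # scale so the largest entry is 1
    return mode

# Hypothetical toy network: r1: ->A, r2: A->B, r3: B->C, r4: C->
S = [[ 1, -1,  0,  0],   # metabolite A
     [ 0,  1, -1,  0],   # metabolite B
     [ 0,  0,  1, -1]]   # metabolite C
print(coefficients_from_pattern(S, [True, True, True, True]))
# -> [1. 1. 1. 1.]
```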

    Polyhedral representation conversion up to symmetries


    Proposing a CNN method for primary and permanent tooth detection and enumeration on pediatric dental radiographs

    OBJECTIVE: In this paper, we aimed to evaluate the performance of a deep learning system for automated tooth detection and numbering on pediatric panoramic radiographs. STUDY DESIGN: YOLOv4, a convolutional neural network (CNN) based object detection model, was used for automated tooth detection and numbering. 4545 pediatric panoramic X-ray images, annotated in labelImg, were used to train and test the YOLO model. RESULTS AND CONCLUSIONS: The model successfully detected and numbered both primary and permanent teeth on pediatric panoramic radiographs, with a mean average precision (mAP) of 92.22%, a mean average recall (mAR) of 94.44% and a weighted F1 score of 0.91. The proposed CNN method yielded fast, accurate performance for automated tooth detection and numbering on pediatric panoramic radiographs. Automatic tooth detection could help dental practitioners save time and could also serve as a pre-processing tool for the detection of dental pathologies.
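
    To make the reported metrics concrete, the sketch below shows the standard way detections are matched to ground-truth boxes when computing precision/recall figures such as mAP: a prediction counts as a true positive if its intersection-over-union (IoU) with a still-unmatched ground-truth box exceeds a threshold. This is a generic evaluation sketch, not the paper's pipeline; the boxes are hypothetical.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def precision_recall(predictions, truths, iou_thr=0.5):
    """Greedily match predictions (highest confidence first) to ground truth."""
    predictions = sorted(predictions, key=lambda p: -p[1])  # (box, score)
    matched = set()
    tp = 0
    for box, _score in predictions:
        best, best_iou = None, iou_thr
        for i, t in enumerate(truths):
            if i not in matched and iou(box, t) >= best_iou:
                best, best_iou = i, iou(box, t)
        if best is not None:
            matched.add(best)
            tp += 1
    fp = len(predictions) - tp
    fn = len(truths) - tp
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical boxes for one radiograph (pixel coordinates).
truths = [(10, 10, 50, 50), (60, 10, 100, 50)]
preds = [((12, 11, 49, 52), 0.9), ((58, 12, 99, 48), 0.8), ((0, 60, 30, 90), 0.6)]
print(precision_recall(preds, truths))  # -> (0.666..., 1.0)
```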