
    Reverse engineering of CAD models via clustering and approximate implicitization

    In applications like computer-aided design, geometric models are often represented numerically as polynomial splines or NURBS, even when they originate from primitive geometry. For purposes such as redesign and isogeometric analysis, it is of interest to extract information about the underlying geometry through reverse engineering. In this work we develop a novel method to determine these primitive shapes by combining clustering analysis with approximate implicitization. The proposed method is automatic and can recover algebraic hypersurfaces of any degree in any dimension. In exact arithmetic, the algorithm returns exact results. All the required parameters, such as the implicit degree of the patches and the number of clusters of the model, are inferred using numerical approaches in order to obtain an algorithm that requires as little manual input as possible. The effectiveness, efficiency and robustness of the method are shown both in a theoretical analysis and in numerical examples implemented in Python.
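    As a rough illustration of the implicitization half of this idea only (the clustering step and the paper's actual pipeline are not reproduced; the function names and the degree-2 circle example below are ours), an implicit polynomial can be fitted to point samples by taking the right singular vector associated with the smallest singular value of a monomial collocation matrix:

# Illustrative sketch only, not the paper's method.
import numpy as np

def monomial_matrix(points, degree):
    # Rows = sample points, columns = monomials x^i * y^j with i + j <= degree.
    x, y = points[:, 0], points[:, 1]
    cols = []
    for i in range(degree + 1):
        for j in range(degree + 1 - i):
            cols.append(x**i * y**j)
    return np.column_stack(cols)

def approximate_implicitization(points, degree):
    # The smallest singular value measures the algebraic fitting error;
    # its right singular vector holds the implicit coefficients.
    D = monomial_matrix(points, degree)
    _, s, Vt = np.linalg.svd(D, full_matrices=False)
    return Vt[-1], s[-1]

# Example: noisy samples of the unit circle x^2 + y^2 - 1 = 0.
t = np.linspace(0.0, 2.0 * np.pi, 200)
pts = np.column_stack([np.cos(t), np.sin(t)]) + 1e-3 * np.random.randn(200, 2)
coeffs, err = approximate_implicitization(pts, degree=2)
print(coeffs, err)   # coeffs is close to a multiple of [-1, 0, 1, 0, 0, 1]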

    Data-driven quasi-interpolant spline surfaces for point cloud approximation

    In this paper we investigate a local surface approximation, the Weighted Quasi Interpolant Spline Approximation (wQISA), specifically designed for large and noisy point clouds. We briefly describe the properties of the wQISA representation and introduce a novel data-driven implementation, which combines prediction capability with computational efficiency. We provide an extended comparative analysis with other continuous approximations on real data covering different types of surfaces and levels of noise, including 3D models, terrain data and digital environmental data.
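    As a loose 1D illustration of the weighted quasi-interpolant idea (this sketch with Gaussian weights is ours and does not reproduce the wQISA scheme or its data-driven weight selection), each spline coefficient can be taken as a weighted mean of the noisy samples near the corresponding Greville abscissa:

# Illustrative sketch only, not the wQISA method of the paper.
import numpy as np
from scipy.interpolate import BSpline

def weighted_quasi_interpolant(x, y, knots, degree=3, bandwidth=0.1):
    # Greville abscissae of the B-spline basis defined by the knot vector.
    greville = np.array([knots[i + 1:i + degree + 1].mean()
                         for i in range(len(knots) - degree - 1)])
    coeffs = np.empty_like(greville)
    for i, g in enumerate(greville):
        # Each coefficient is a Gaussian-weighted mean of nearby samples.
        w = np.exp(-0.5 * ((x - g) / bandwidth) ** 2)
        coeffs[i] = np.sum(w * y) / np.sum(w)
    return BSpline(knots, coeffs, degree)

# Noisy samples of sin(2*pi*x) on [0, 1].
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 2000)
y = np.sin(2.0 * np.pi * x) + 0.05 * rng.standard_normal(x.size)
interior = np.linspace(0.0, 1.0, 11)
knots = np.concatenate([[0.0] * 3, interior, [1.0] * 3])   # clamped cubic knot vector
spline = weighted_quasi_interpolant(x, y, knots)
print(spline(0.25))   # close to sin(pi/2) = 1, up to smoothing bias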

    ANALYSIS OF THE INFLUENCE OF THE OGIVE RADIUS OF A 7.62X39 AMMUNITION BULLET ON THE CAVITATION CAVITY

    The aim of the current study is to examine the impact of the ogive radius of a 7.62×39 ammunition projectile on the positioning of a radially-slotted channel that helps increase the angle of the cavitation cavity. The studies were conducted using CFD (Computational Fluid Dynamics) analyses in a SolidWorks environment, simulating the projectile's movement in a water environment. The research findings indicate that lower values of the ogive radius result in higher values of the cavitation cavity angle, which suggests that smaller ogive radii are more favorable for creating bullets with slotted channels; this enhances linear-progressive movement in a water environment and thus increases the chance of hitting targets below sea level.

    Non-acyclicity of coset lattices and generation of finite groups


    High-dimensional polytopes defined by oracles: algorithms, computations and applications

    The processing and analysis of high-dimensional geometric data plays a fundamental role in many disciplines of science and engineering. Over the last decades, many successful geometric algorithms have been developed in 2 and 3 dimensions; however, in most cases their performance in higher dimensions is poor. This behavior is commonly called the curse of dimensionality. Two solution frameworks adopted to overcome this difficulty are the exploitation of special structure in the data, such as sparsity or low intrinsic dimension, and the design of approximation algorithms. The main research area of this thesis is discrete and computational geometry and its connections to branches of computer science and applied mathematics, such as polytope theory, algorithm implementations, probabilistic geometric algorithms, computational algebraic geometry and optimization. The fundamental geometric objects of study are polytopes, and their basic properties are convexity and the fact that they are defined by an oracle in a high-dimensional space. The contribution of this thesis is threefold. First, the design and analysis of geometric algorithms for problems concerning high-dimensional convex polytopes, such as convex hull and volume computation, and their applications to computational algebraic geometry and optimization. Second, the establishment of combinatorial characterization results for essential polytope families. Third, the implementation and experimental analysis of the proposed algorithms and methods. The developed software is open-source, publicly available, and builds on and extends state-of-the-art geometric and algebraic software libraries such as CGAL and polymake.
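    To make the notion of a polytope "defined by an oracle" concrete, the toy sketch below (ours, not an algorithm from the thesis, which develops far more efficient randomized and exact methods) estimates the volume of a polytope that is accessible only through a membership test, by rejection sampling from an enclosing box:

# Illustrative sketch only.
import numpy as np

def membership_oracle(A, b):
    # Return a function deciding whether a point lies in {x : A x <= b}.
    return lambda x: np.all(A @ x <= b + 1e-12)

def volume_by_rejection(oracle, lower, upper, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    pts = rng.uniform(lower, upper, size=(n_samples, len(lower)))
    inside = sum(oracle(p) for p in pts)
    box_volume = np.prod(np.asarray(upper) - np.asarray(lower))
    return box_volume * inside / n_samples

# Example: the 3D cross-polytope {x : |x1| + |x2| + |x3| <= 1}, exact volume 4/3.
signs = np.array([[sx, sy, sz] for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)])
oracle = membership_oracle(signs, np.ones(8))
print(volume_by_rejection(oracle, [-1, -1, -1], [1, 1, 1]))   # approx 1.333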

    Surface Remeshing and Applications

    Due to the focus of popular graphics accelerators, triangle meshes remain the primary representation for 3D surfaces. They are the simplest form of interpolation between surface samples, which may have been acquired with a laser scanner, computed from a 3D scalar field resolved on a regular grid, or identified on slices of medical data. Typical methods for generating triangle meshes from raw data attempt to lose as little information as possible, so that the resulting surface models can be used in the widest range of scenarios. When such a general-purpose model has to be used in a particular application context, however, pre-processing is often worth considering. In some cases, it is convenient to slightly modify the geometry and/or the connectivity of the mesh so that further processing can take place more easily. Other applications may require the mesh to have a pre-defined structure, which is often different from that of the original general-purpose mesh. The central focus of this thesis is the automatic remeshing of highly detailed surface triangulations. Besides a thorough discussion of state-of-the-art applications such as real-time rendering and simulation, new approaches are proposed which use remeshing for topological analysis, flexible mesh generation and 3D compression. Furthermore, innovative methods are introduced to post-process polygonal models in order to recover information which was lost, or hidden, by a prior remeshing process. Besides the technical contributions, this thesis aims at showing that surface remeshing is much more useful than it may seem at first sight, as it is a key step in making several applications feasible in practice.
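    As a small illustration of one ingredient commonly used in remeshing pipelines (this sketch and its names are ours; a full remesher also splits, collapses and flips edges), the following relaxation pass moves each vertex toward the centroid of its one-ring neighbours:

# Illustrative sketch only: uniform Laplacian relaxation on a triangle mesh.
import numpy as np

def one_ring_neighbours(faces, n_vertices):
    neighbours = [set() for _ in range(n_vertices)]
    for a, b, c in faces:
        neighbours[a].update((b, c))
        neighbours[b].update((a, c))
        neighbours[c].update((a, b))
    return neighbours

def laplacian_relax(vertices, faces, step=0.5, iterations=10):
    V = np.array(vertices, dtype=float)
    nbrs = one_ring_neighbours(faces, len(V))
    for _ in range(iterations):
        # Move each vertex part of the way toward its one-ring centroid.
        centroids = np.array([V[list(n)].mean(axis=0) if n else V[i]
                              for i, n in enumerate(nbrs)])
        V += step * (centroids - V)
    return V

# Tiny example: a fan of triangles around a displaced centre vertex.
faces = [(0, 1, 2), (0, 2, 3), (0, 3, 4), (0, 4, 1)]
vertices = [(0.0, 0.0, 0.8),            # centre vertex pushed off the plane
            (1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
print(laplacian_relax(vertices, faces)[0])   # centre is pulled toward the plane z = 0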

    Q(sqrt(-3))-Integral Points on a Mordell Curve

    We use an extension of quadratic Chabauty to number fields, recently developed by the author with Balakrishnan, Besser and Müller, combined with a sieving technique, to determine the integral points over Q(√−3) on the Mordell curve y^2 = x^3 - 4.
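    The quadratic Chabauty computation itself is well beyond a few lines of code, but the rational-integer points on this curve can at least be cross-checked naively (our own sanity check, not the method of the paper) by testing whether x^3 - 4 is a perfect square for small x:

# Naive cross-check over the rational integers only; the paper's result
# concerns the harder problem of integral points over Q(sqrt(-3)).
from math import isqrt

def small_integral_points(bound=10_000):
    points = []
    for x in range(2, bound + 1):          # x^3 - 4 < 0 for x <= 1, so start at 2
        n = x**3 - 4
        r = isqrt(n)
        if r * r == n:
            points.extend([(x, r), (x, -r)])
    return points

print(small_integral_points())   # [(2, 2), (2, -2), (5, 11), (5, -11)]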

    Algorithms for Geometric Optimization and Enrichment in Industrialized Building Construction

    The burgeoning use of industrialized building construction, coupled with advances in digital technologies, is unlocking new opportunities to improve on the status quo of construction projects running over budget, falling behind schedule and suffering from poor quality. Yet several barriers still need to be overcome in order to realize the full potential of these innovations. Analysis of the literature and examples from industry reveals the following notable barriers: (1) geometric optimization methods need to be developed for the stricter dimensional requirements in industrialized construction, (2) methods are needed to preserve model semantics during the process of generating an updated as-built model, (3) semantic enrichment methods are required for the end-of-life stage of industrialized buildings, and (4) pragmatic approaches are needed to ensure that algorithms achieve the required computational efficiency. The common thread across these examples is the need to develop algorithms that optimize and enrich geometric models. To date, a comprehensive approach paired with pragmatic solutions remains elusive. This research fills this gap by presenting a new approach for algorithm development along with pragmatic implementations for the industrialized building construction sector. Computational algorithms are effective for driving the design, analysis, and optimization of geometric models. As such, this thesis develops new computational algorithms for the design, fabrication and assembly, onsite construction, and end-of-life stages of industrialized buildings. A common theme throughout this work is the development and comparison of varied algorithmic approaches (i.e., exact vs. approximate solutions) to see which is optimal for a given process. This is implemented in the following ways. First, a probabilistic method is used to simulate the accumulation of dimensional tolerances in order to optimize geometric models during design. Second, a series of exact and approximate algorithms are used to optimize the topology of 2D panelized assemblies to minimize material use during fabrication and assembly. Third, a new approach to automatically updating geometric models is developed whereby initial model semantics are preserved during the process of generating an as-built model. Finally, a series of algorithms is developed to semantically enrich geometric models so that industrialized buildings can be disassembled and reused. The developments made in this research form a rational and pragmatic approach to addressing the existing challenges faced in industrialized building construction. Such developments are shown not only to be effective in improving the status quo in the industry (i.e., reducing cost, shortening project duration, and improving quality), but also to facilitate continuous innovation in construction.
    By way of assessing the potential impact of this work, the proposed algorithms can reduce rework risk during fabrication and assembly (65% rework reduction in the case study for the new tolerance simulation algorithm), reduce waste during manufacturing (11% waste reduction in the case study for the new panel unfolding and nesting algorithms), improve the accuracy and automation of as-built model generation (model error reduction from 50.4 mm to 5.7 mm in the case study for the new parametric BIM updating algorithms), reduce lifecycle cost for adapting industrialized buildings (15% reduction in capital costs in the computational building configurator) and reduce lifecycle impacts for reusing structural systems from industrialized buildings (between 54% and 95% reduction in average lifecycle impacts for the approach illustrated in Appendix B). From a computational standpoint, the novelty of the algorithms developed in this research can be described as follows. Complex geometric processes can be codified solely in terms of the innate properties of geometry: by parameterizing geometry and using methods such as combinatorial optimization, topology can be optimized and semantics can be automatically enriched for building assemblies. Functional discretization (whereby continuous variable domains are converted into discrete variable domains) is shown to be highly effective for complex geometric optimization. Finally, the algorithms encapsulate and balance the benefits of both parametric and non-parametric schemas, achieving both high representational accuracy and semantically rich information (which had previously not been achieved or demonstrated). In summary, this thesis makes several key improvements to industrialized building construction. One of the key findings is that rather than pre-emptively determining the best-suited algorithm for a given process or problem, it is often more pragmatic to derive both an exact and an approximate solution and then decide which is optimal for a given process. Generally, most tasks related to optimizing or enriching geometric models are best solved using approximate methods. To this end, this research presents a series of key techniques that can be followed to improve the runtime performance of algorithms. The new approach for developing computational algorithms and the pragmatic demonstrations of geometric optimization and enrichment are expected to move the industry forward and resolve many of the barriers it currently faces.
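    Since the thesis highlights a probabilistic simulation of dimensional tolerance accumulation, the toy Monte Carlo sketch below (its numbers and names are ours, not the thesis algorithm) conveys the general idea: draw component dimensions around their nominal values and estimate how often the assembly exceeds its allowable overall size:

# Illustrative toy model of tolerance accumulation, not the thesis algorithm.
import numpy as np

def rework_probability(nominal_widths_mm, sigma_mm, allowable_mm,
                       n_trials=100_000, seed=0):
    rng = np.random.default_rng(seed)
    # Draw each component width around its nominal value.
    widths = rng.normal(loc=nominal_widths_mm, scale=sigma_mm,
                        size=(n_trials, len(nominal_widths_mm)))
    total = widths.sum(axis=1)
    # Fraction of trials in which the assembled length exceeds the allowance.
    return np.mean(total > allowable_mm)

# Five nominal 1200 mm panels with 1.5 mm scatter each, assembled into an
# opening that tolerates up to 6003 mm overall.
print(rework_probability([1200] * 5, 1.5, 6003.0))   # roughly 0.19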