
    Combined 3D thinning and greedy algorithm to approximate realistic particles with corrected mechanical properties

    Full text link
    The shape of irregular particles has a significant influence on the micro- and macroscopic behavior of granular systems. This paper presents a combined 3D thinning and greedy set-covering algorithm to approximate realistic particles with a clump of overlapping spheres for discrete element method (DEM) simulations. First, the particle medial surface (or surface skeleton), from which all candidate (maximal inscribed) spheres can be generated, is computed by topological 3D thinning. Then, the clump generation procedure is cast as a greedy set-covering problem (SCP). To correct the mass distribution distorted by highly overlapping spheres inside the clump, linear programming (LP) is used to adjust the density of each component sphere, such that the aggregate mass, center of mass, and inertia tensor are identical, or sufficiently close, to those of the prototypical particle. To find the optimal approximation accuracy (volume coverage: the ratio of the clump's volume to the original particle's volume), rotating-drum flow simulations of three different particle shapes are conducted. The dynamic angle of repose is observed to converge for all particle shapes at 85% volume coverage (fewer than 30 spheres per clump), which suggests a possible optimal resolution for capturing the mechanical behavior of the system. Comment: 34 pages, 13 figures
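
    The clump-generation step above combines a greedy set cover with a linear-programming density correction. The sketch below illustrates only the greedy selection, under stated assumptions: candidate spheres are represented by the sets of particle voxels they cover, and spheres are picked one at a time until a target volume coverage (e.g. 85%) is reached. Names and data layout are illustrative, not the authors' implementation; the LP density adjustment is not shown.

        # Hypothetical greedy set-covering sketch for clump generation (illustrative only).
        def greedy_clump(candidate_spheres, particle_voxels, target_coverage=0.85):
            """candidate_spheres: dict sphere_id -> set of voxel indices covered.
            particle_voxels: set of all voxel indices of the original particle."""
            uncovered = set(particle_voxels)
            total = len(particle_voxels)
            clump = []
            while uncovered and candidate_spheres and (1 - len(uncovered) / total) < target_coverage:
                # pick the sphere covering the most not-yet-covered voxels
                best_id, best_cover = max(
                    ((sid, voxels & uncovered) for sid, voxels in candidate_spheres.items()),
                    key=lambda item: len(item[1]),
                )
                if not best_cover:  # no remaining sphere adds coverage; stop early
                    break
                clump.append(best_id)
                uncovered -= best_cover
            return clump, 1 - len(uncovered) / total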

    Powerful Parallel Symmetric 3D Thinning Schemes Based on Critical Kernels

    Get PDF
    The main contribution of the present article consists of new 3D parallel and symmetric thinning schemes which have the following qualities:
    - They are effective and sound, in the sense that they are guaranteed to preserve topology. This guarantee is obtained thanks to a theorem on critical kernels;
    - They are powerful, in the sense that they remove more points, in one iteration, than any other symmetric parallel thinning scheme;
    - They are versatile, as conditions for the preservation of geometrical features (e.g., curve extremities or surface borders) are independent of those accounting for topology preservation;
    - They are efficient: we provide in this article a small set of masks, acting in the grid Z^3, that is sufficient, in addition to the classical simple point test, to straightforwardly implement them.
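
    As context for the abstract above, a symmetric parallel thinning scheme evaluates all deletion candidates against the current image before removing them simultaneously. The following sketch shows only that iteration structure; the predicates is_deletable (standing in for the critical-kernels masks plus the simple point test) and is_feature (e.g. curve extremities or surface borders) are placeholders supplied by the caller, and topology preservation depends entirely on them.

        # Structural sketch of a symmetric parallel thinning loop (predicates are placeholders).
        import numpy as np

        def parallel_thin(volume, is_deletable, is_feature, max_iters=1000):
            """volume: 3D boolean numpy array; predicates take (volume, point)."""
            vol = volume.copy()
            for _ in range(max_iters):
                candidates = [
                    p for p in zip(*np.nonzero(vol))
                    if not is_feature(vol, p) and is_deletable(vol, p)
                ]
                if not candidates:  # fixed point: nothing more can be removed
                    break
                for p in candidates:  # symmetric: the whole batch is removed at once
                    vol[p] = False
            return vol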

    A skeletonization algorithm for gradient-based optimization

    Full text link
    The skeleton of a digital image is a compact representation of its topology, geometry, and scale. It has utility in many computer vision applications, such as image description, segmentation, and registration. However, skeletonization has only seen limited use in contemporary deep learning solutions. Most existing skeletonization algorithms are not differentiable, making it impossible to integrate them with gradient-based optimization. Compatible algorithms based on morphological operations and neural networks have been proposed, but their results often deviate from the geometry and topology of the true medial axis. This work introduces the first three-dimensional skeletonization algorithm that is both compatible with gradient-based optimization and preserves an object's topology. Our method is exclusively based on matrix additions and multiplications, convolutional operations, basic non-linear functions, and sampling from a uniform probability distribution, allowing it to be easily implemented in any major deep learning library. In benchmarking experiments, we prove the advantages of our skeletonization algorithm compared to non-differentiable, morphological, and neural-network-based baselines. Finally, we demonstrate the utility of our algorithm by integrating it with two medical image processing applications that use gradient-based optimization: deep-learning-based blood vessel segmentation, and multimodal registration of the mandible in computed tomography and magnetic resonance images. Comment: Accepted at ICCV 202
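
    The abstract above lists the building blocks of the method (convolutions, basic non-linearities, elementwise arithmetic, uniform sampling) without giving the algorithm itself. As a loose illustration of how such blocks yield differentiable morphology, and not as the paper's method, the sketch below implements a "soft erosion" of a soft 3D mask with a 3x3x3 averaging convolution followed by a steep sigmoid, so gradients flow through the operation.

        # Illustrative differentiable "soft erosion" (not the paper's algorithm).
        import torch
        import torch.nn.functional as F

        def soft_erode(mask, steepness=20.0):
            """mask: (B, 1, D, H, W) tensor with values in [0, 1]."""
            kernel = torch.ones(1, 1, 3, 3, 3, device=mask.device) / 27.0
            neighborhood_mean = F.conv3d(mask, kernel, padding=1)
            # close to 1 only where (almost) the whole 3x3x3 neighborhood is foreground
            return torch.sigmoid(steepness * (neighborhood_mean - 0.95))

        # usage: m = torch.rand(1, 1, 32, 32, 32, requires_grad=True)
        #        soft_erode(m).sum().backward()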

    Nonlinear rheology of colloidal dispersions

    Get PDF
    Colloidal dispersions are commonly encountered in everyday life and represent an important class of complex fluids. Of particular significance for many commercial products and industrial processes is the ability to control and manipulate the macroscopic flow response of a dispersion by tuning the microscopic interactions between the constituents. An important step towards attaining this goal is the development of robust theoretical methods for predicting from first principles the rheology and nonequilibrium microstructure of well-defined model systems subject to external flow. In this review we give an overview of some promising theoretical approaches and the phenomena they seek to describe, focusing, for simplicity, on systems for which the colloidal particles interact via strongly repulsive, spherically symmetric interactions. In presenting the various theories, we will consider first low volume fraction systems, for which a number of exact results may be derived, before moving on to consider the intermediate and high volume fraction states, which present both the most interesting physics and the most demanding technical challenges. In the high volume fraction regime, particular emphasis will be given to the rheology of dynamically arrested states. Comment: Review article
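
    As a concrete example of the exact low-volume-fraction results mentioned above (a standard textbook relation, quoted for orientation rather than taken from the review itself), Einstein's dilute limit for the shear viscosity of a hard-sphere dispersion reads

        \eta = \eta_s \left( 1 + \tfrac{5}{2}\,\varphi + \mathcal{O}(\varphi^2) \right),

    where \eta_s is the solvent viscosity and \varphi the colloid volume fraction; the challenge the review addresses is what replaces such expansions at intermediate and high \varphi.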

    AI-based design methodologies for hot form quench (HFQ®)

    Get PDF
    This thesis aims to develop advanced design methodologies that fully exploit the capabilities of the Hot Form Quench (HFQ®) stamping process for forming complex geometric features in high-strength aluminium alloy structural components. While previous research has focused on material models for FE simulations, these simulations are not suitable for early-phase design due to their high computational cost and expertise requirements. This project has two main objectives: first, to develop design guidelines for the early-stage design phase; and second, to create a machine learning-based platform that can optimise 3D geometries under hot stamping constraints, for both early and late-stage design. With these methodologies, the aim is to facilitate the incorporation of HFQ capabilities into component geometry design, enabling the full realisation of its benefits. To achieve the objectives of this project, two main efforts were undertaken. Firstly, the analysis of aluminium alloys for stamping deep corners was simplified by identifying the effects of corner geometry and material characteristics on the post-form thinning distribution. New equation sets were proposed to model the trends, and design maps were created to guide component design at early stages. Secondly, a platform was developed to optimise 3D geometries for stamping, using deep learning technologies to incorporate manufacturing capabilities. This platform combined two neural networks: a geometry generator based on Signed Distance Functions (SDFs), and an image-based manufacturability surrogate model. The platform used gradient-based techniques to update the inputs to the geometry generator based on the surrogate model's manufacturability information. The effectiveness of the platform was demonstrated on two geometry classes, Corners and Bulkheads, with five case studies conducted to optimise under post-stamped thinning constraints. Results showed that the platform allowed for free morphing of complex geometries, leading to significant improvements in component quality. The research outcomes represent a significant contribution to the field of technologically advanced manufacturing methods and offer promising avenues for future research. The developed methodologies provide practical solutions for designers to identify optimal component geometries, ensuring manufacturing feasibility and reducing design development time and costs. The potential applications of these methodologies extend to real-world industrial settings and can significantly contribute to the continued advancement of the manufacturing sector. Open Access
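
    The optimisation platform described above updates the inputs of a frozen SDF geometry generator by back-propagating through an image-based manufacturability surrogate. The loop below is a hypothetical sketch of that idea under stated assumptions: both networks, the constraint handling, and all names are placeholders for illustration, not the thesis implementation.

        # Hypothetical gradient-based latent optimisation loop (illustrative only).
        import torch

        def optimise_latent(generator, surrogate, z_init, thinning_limit, steps=200, lr=1e-2):
            z = z_init.clone().requires_grad_(True)
            opt = torch.optim.Adam([z], lr=lr)
            for _ in range(steps):
                opt.zero_grad()
                sdf = generator(z)          # implicit geometry from the latent code
                thinning = surrogate(sdf)   # predicted post-form thinning map
                # penalise only the amount by which thinning exceeds the allowed limit
                violation = torch.relu(thinning - thinning_limit).mean()
                violation.backward()
                opt.step()
            return z.detach()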

    SAR Image Edge Detection: Review and Benchmark Experiments

    Get PDF
    Edges are distinct geometric features crucial to higher-level object detection and recognition in remote-sensing processing, which is key for surveillance and for gathering up-to-date geospatial intelligence. Synthetic aperture radar (SAR) is a powerful form of remote sensing. However, edge detectors designed for optical images tend to perform poorly on SAR images because strong speckle noise causes false positives (type I errors). Therefore, many researchers have proposed edge detectors tailored specifically to the characteristics of SAR images. Although these edge detectors may achieve good results in their own evaluations, the comparisons tend to include a very limited number of (simulated) SAR images. As a result, the generalized performance of the proposed methods is not truly reflected, as real-world patterns are much more complex and diverse. From this emerges another problem: a quantitative benchmark is missing in the field. Hence, it is not currently possible to fairly evaluate any edge detection method for SAR images. In this paper, we aim to close these gaps by providing an extensive experimental evaluation of edge detection on SAR images. To that end, we propose the first benchmark of SAR image edge detection methods, established by evaluating various freely available methods, including those considered to be the state of the art.
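
    A recurring design point in SAR edge detection, implicit in the abstract above, is that speckle is multiplicative, so ratio-based operators are preferred over the difference-based operators common for optical images. The sketch below computes a textbook-style ratio-of-averages edge strength along one axis; it is a generic illustration of that idea, not one of the benchmarked detectors.

        # Generic ratio-of-averages edge strength for SAR intensities (illustrative only).
        import numpy as np

        def ratio_edge_strength(image, half_window=3, eps=1e-8):
            """image: 2D array of SAR intensities; returns horizontal edge strength in [0, 1]."""
            img = np.asarray(image, dtype=float)
            rows, cols = img.shape
            strength = np.zeros_like(img)
            for j in range(half_window, cols - half_window):
                left = img[:, j - half_window:j].mean(axis=1)
                right = img[:, j + 1:j + 1 + half_window].mean(axis=1)
                r = (left + eps) / (right + eps)
                # min(r, 1/r) stays near 1 in homogeneous areas and drops across an edge
                strength[:, j] = 1.0 - np.minimum(r, 1.0 / r)
            return strength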

    Software for Exascale Computing - SPPEXA 2016-2019

    Get PDF
    This open access book summarizes the research done and results obtained in the second funding phase of the Priority Program 1648 "Software for Exascale Computing" (SPPEXA) of the German Research Foundation (DFG), presented at the SPPEXA Symposium in Dresden during October 21-23, 2019. In that respect, it both represents a continuation of Vol. 113 in Springer’s series Lecture Notes in Computational Science and Engineering, the corresponding report of SPPEXA’s first funding phase, and provides an overview of SPPEXA’s contributions towards exascale computing in today's supercomputer technology. The individual chapters address one or more of the research directions: (1) computational algorithms, (2) system software, (3) application software, (4) data management and exploration, (5) programming, and (6) software tools. The book has an interdisciplinary appeal: scholars from computational sub-fields in computer science, mathematics, physics, or engineering will find it of particular interest.

    The 2nd Conference of PhD Students in Computer Science

    Get PDF

    A non-rigid registration approach for quantifying myocardial contraction in tagged MRI using generalized information measures.

    Get PDF
    We address the problem of quantitatively assessing myocardial function from tagged MRI sequences. We develop a two-step method comprising (i) a motion estimation step using a novel variational non-rigid registration technique based on generalized information measures, and (ii) a measurement step, yielding local and segmental deformation parameters over the whole myocardium. Experiments on healthy and pathological data demonstrate that this method delivers, within a reasonable computation time and in a fully unsupervised way, reliable measurements for normal subjects and quantitative pathology-specific information. Beyond cardiac MRI, this work redefines the foundations of variational non-rigid registration for information-theoretic similarity criteria, with potential interest in multimodal medical imaging.
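
    The similarity criteria referred to above are information-theoretic; as a point of reference, the sketch below computes plain mutual information between a reference frame and a warped frame from their joint intensity histogram. The generalized information measures of the paper extend this family; the code is the classical special case, not the authors' criterion.

        # Classical mutual information from a joint histogram (reference point, not the paper's measure).
        import numpy as np

        def mutual_information(fixed, moving, bins=64):
            """fixed, moving: arrays of the same shape (e.g. a tagged MR frame and its warped counterpart)."""
            joint, _, _ = np.histogram2d(fixed.ravel(), moving.ravel(), bins=bins)
            p_xy = joint / joint.sum()
            p_x = p_xy.sum(axis=1, keepdims=True)   # marginal over the fixed image
            p_y = p_xy.sum(axis=0, keepdims=True)   # marginal over the moving image
            nz = p_xy > 0                           # avoid log(0)
            return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))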