133 research outputs found

    Cutting in deformable objects

    Get PDF
    Virtual reality simulations of surgical procedures allow such procedures to be practiced on computers instead of on patients and test animals. The core of such a system is a soft-tissue simulation that has to react very quickly while remaining realistic. This thesis discusses how deformable models can be simulated in this context using an existing mathematical technique, the Finite Element Method. This method represents the object with a mesh: the material is subdivided into geometric primitives, such as triangles. Both the number of primitives and their shape influence the speed of the simulation. Hence, when the mesh changes, e.g. when simulating a procedure, this has to be done with care. This thesis shows how the interaction of meshing and simulation can be handled in software.
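
    To make the role of the triangle mesh concrete, the sketch below assembles the stiffness matrix of a single linear (constant-strain) triangle for 2D plane-strain elasticity. It is a minimal, generic illustration: the element type, the plane-strain assumption, and the soft-tissue-like material parameters are my own choices, not specified by the thesis. Nearly degenerate triangles drive the element area toward zero and make the assembled system ill-conditioned, which is one reason mesh changes during cutting must be handled with care.

```python
import numpy as np

def triangle_stiffness(p1, p2, p3, E=3.0e3, nu=0.45, thickness=1.0):
    """Stiffness matrix of a linear (constant-strain) triangle, 2D plane strain.

    Illustrative only: E (Pa) and nu are assumed soft-tissue-like values.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Signed doubled area; near-zero values indicate a degenerate (ill-shaped) triangle.
    twoA = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    A = 0.5 * abs(twoA)
    # Shape-function gradient coefficients.
    b = np.array([y2 - y3, y3 - y1, y1 - y2])
    c = np.array([x3 - x2, x1 - x3, x2 - x1])
    # Strain-displacement matrix B (3 strain components x 6 nodal displacements).
    B = np.zeros((3, 6))
    B[0, 0::2] = b
    B[1, 1::2] = c
    B[2, 0::2] = c
    B[2, 1::2] = b
    B /= twoA
    # Plane-strain constitutive matrix D.
    f = E / ((1.0 + nu) * (1.0 - 2.0 * nu))
    D = f * np.array([[1.0 - nu, nu, 0.0],
                      [nu, 1.0 - nu, 0.0],
                      [0.0, 0.0, (1.0 - 2.0 * nu) / 2.0]])
    return thickness * A * B.T @ D @ B   # 6x6 element stiffness

K_e = triangle_stiffness((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
print(K_e.shape)  # (6, 6)
```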

    Accurate Real-Time Framework for Complex Pre-defined Cuts in Finite Element Modeling

    Get PDF
    PhD thesis. Achieving detailed pre-defined cuts on deformable materials is pivotal for many commercial applications, such as cutting scenes in games and vandalism effects in virtual movies. In these types of applications, the majority of resources are allocated to achieving high-fidelity representations of the materials and the virtual environment, and with the limited computing resources that remain it is challenging to achieve a convincing cutting effect. A considerable amount of research has been carried out at the expense of either realism or computational cost, but a solution that satisfies both has not yet been identified. This doctoral dissertation is dedicated to developing a unified framework for representing pre-defined cuts of deformable surface models that achieves real-time, detailed cutting while maintaining realistic physical behaviour. To achieve this goal, we explore the problem from both geometric and numerical perspectives. From the geometric perspective, we propose a robust subdivision mechanism that allows users to make arbitrary predetermined cuts on elastic surface models based on the finite element method (FEM). Specifically, after the user separates the elements along an arbitrary (linear or non-linear) trajectory on the topological mesh, we optimise the resulting mesh by regenerating the triangulation within each affected element according to the Delaunay criterion. This optimisation of the regenerated triangles, a process of refining ill-shaped elements with poor Aspect Ratios, greatly improves the realism of the physical behaviour and keeps the refinement balanced against real-time requirements. The subdivision mechanism improves the visual quality of cutting, but it neglects the fact that elements cannot be cut perfectly along arbitrary pre-defined trajectories. The number of ill-shaped elements generated has a significant impact on the optimisation time: a large number of ill-shaped elements makes the cutting slow or can even cause it to fail. Our idea is based on the core observation that the generation of ill-shaped elements is closely linked to the condition number of the global stiffness matrix. In practice, a large condition number means that the stiffness matrix is nearly singular, so computing its inverse or solving the associated linear system is prone to large numerical errors and is time-consuming. This motivates us to alleviate the impact of the condition number of the global stiffness matrix from the numerical side. Specifically, we address the issue by converting the global stiffness matrix into the form of a covariance matrix, whose condition number can be reduced by applying an appropriate normalisation to its eigenvalues. Furthermore, we investigated the efficiency of two different strategies: an exact square-root normalisation and its approximation based on the Newton-Schulz iteration. Experimental tests of the proposed framework demonstrate that it reproduces visuals of detailed pre-defined cuts that are competitive with the state-of-the-art method (Manteaux et al. 2015) while obtaining a significant improvement in frame rate of up to 46.49 FPS during the cuts and 21.93 FPS after them.
In addition, the new refinement method stably maintains the average Aspect Ratio of the model mesh after the cuts below 3 and the average Area Ratio at around 3%. Moreover, the two proposed matrix normalisation strategies, ES-CGM and AS-CGM, show superior time efficiency compared with the baseline method (Xin et al. 2018); specifically, ES-CGM and AS-CGM achieve 5 FPS and 10 FPS more than the baseline, respectively. These experimental results support our conclusion that the new framework provides significant benefits for achieving detailed pre-defined cuts at real-time rates.
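
    The abstract contrasts an exact square-root normalisation with a Newton-Schulz approximation. The sketch below shows the standard coupled Newton-Schulz iteration for the square root and inverse square root of a symmetric positive-definite matrix; it is a generic illustration of that iteration, not the thesis's ES-CGM or AS-CGM implementation, and the pre-scaling by the Frobenius norm (needed for convergence) is an assumption on my part.

```python
import numpy as np

def newton_schulz_sqrt(A, iters=15):
    """Coupled Newton-Schulz iteration: Y_k -> A^{1/2}, Z_k -> A^{-1/2}.

    A must be symmetric positive definite. The iteration converges when
    ||A/s - I|| < 1, so A is first rescaled by its Frobenius norm s.
    """
    n = A.shape[0]
    s = np.linalg.norm(A, 'fro')
    Y = A / s
    Z = np.eye(n)
    I = np.eye(n)
    for _ in range(iters):
        T = 0.5 * (3.0 * I - Z @ Y)
        Y = Y @ T           # converges to (A/s)^{1/2}
        Z = T @ Z           # converges to (A/s)^{-1/2}
    return np.sqrt(s) * Y, Z / np.sqrt(s)

# Tiny demo on a well-conditioned SPD matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5.0 * np.eye(5)
sqrtA, inv_sqrtA = newton_schulz_sqrt(A)
print(np.allclose(sqrtA @ sqrtA, A, atol=1e-6))
print(np.allclose(inv_sqrtA @ A @ inv_sqrtA, np.eye(5), atol=1e-6))
```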

    Frictional Contact in Interactive Deformable Environments

    Get PDF
    The use of simulations provides great advantages in terms of economy, realism, and adaptability to user requirements in many research and technological fields. For this reason simulations are currently exploited, for example, in the prototyping of machinery parts and in the planning of and training for assembly and disassembly procedures; recently, simulations have also enabled many useful and promising tools for assisting and training surgeons, in particular for laparoscopic surgery. Laparoscopy, in fact, represents the gold standard for many surgical procedures. The principal difference from traditional surgery is the reduction of the surgeon's ability to perceive the surgical scenario, both visually and tactilely. This is a severe limitation for surgeons, who undergo long training before being able to perform laparoscopic interventions with proficiency. On the other hand, these limitations make laparoscopy an excellent candidate for introducing simulation into training. Some commercial training software for laparoscopy is already available, but it is usually based on rigid-body models, or on models that otherwise lack the necessary physical realism. Introducing deformable models would greatly increase the realism and accuracy of such simulations and, in the case of a laparoscopy trainer, would allow the user to acquire not only basic motor skills but also higher-level capabilities and knowledge. Rigid bodies, in fact, are a good approximation of reality only in particular situations and within very restricted ranges of loading. When non-engineering materials are involved, as happens in surgical simulations, deformations cannot be neglected without irreparably compromising the realism of the results. The use of deformable models, however, introduces considerable computational cost for computing the physics that governs the deformations and strongly limits the use of precomputed data, which are often exploited to speed up collision detection between bodies.
This represents a very limiting factor in interactive environments where, to allow the user to interactively control the virtual bodies, the simulation must run in real time. In this thesis we address the simulation of interactive environments populated with deformable models that interact through frictional contacts. This includes the analysis and development of the different techniques that implement the various parts of the simulation: mainly the methods for simulating deformable models and the collision detection and collision response techniques, but also the modelling and integration of suitable friction models into the simulation. In particular, we evaluated the principal methods that represent the state of the art in soft tissue modelling. Our analysis is based on the physical background of each method, and thus on its realism in terms of the deformations it can mimic and on its ease of use (i.e. how easy the method is to understand, calibrate, and adapt to different scenarios); we also compared the computational complexity of the different models, as it is an extremely important factor in the choice and use of deformable models in simulations. The comparison of the features of the analysed methods motivated the development of an innovative method that wraps different soft tissue simulation methodologies in a common representation framework. This framework has the advantage of providing a unified interface for all the deformable models, so the deformable model used for the simulation can be switched while keeping all other data structures and methods of the simulation unchanged. This unified interface allows a single method to perform the collision detection phase for all the analysed deformable models, which greatly helped in identifying the requirements and features of that software module. Collision detection for rigid bodies usually takes advantage of precomputation to subdivide body shapes into convex elements or to construct partitions of the space in which the body is defined, in order to speed up the computation. For deformable models this is not possible, because the bodies' shapes change continuously. The collision detection method used in this work takes this problem into account and regularly adapts its data structures to the body configuration. After collisions have been detected and contact points have been identified on the colliding bodies, the collisions must be resolved in a physics-based way. To this end we have to ensure that objects never interpenetrate during the simulation and that, when resolving collisions, all the physical phenomena involved in the contact of real bodies are taken into account: these include the elastic response of the bodies during contact and the frictional forces exerted between each pair of colliding bodies. The innovative collision-resolution method that we describe in this thesis ensures the realism of the simulation and its seamless interaction with the common framework used to integrate the deformable models. One important feature of biological tissues is their anisotropic behaviour, which usually comes from their fibrous structure. In this thesis we propose a new method to introduce anisotropy into mass-spring models. The method has the advantage of preserving the speed and ease of implementation of the classical mass-spring model, and it effectively differentiates the model's response to loading along the chosen directions.
The described techniques have been integrated in two applications that provide the physical simulation of environments populated with deformable models. The first application implements all the described methods for simulating deformable models and performs precise collision detection and response, with the possibility of choosing the most suitable friction model for the simulation; it demonstrates the effectiveness of the proposed framework. The main limitation of this simulator, its high computation time per simulation step, is tackled in a second application that exploits the intrinsic parallelism of physical simulations to optimise the implementation and to exploit the computational power of parallel hardware architectures. To obtain the performance required for an interactive environment with force feedback, this simulation is based on a simplified collision detection algorithm, but it features all the other techniques described in this thesis. The parallel implementation exploits the graphics card processor, a highly parallel architecture, and updates the scene every millisecond. This allows smooth haptic feedback to be rendered to the user, removes discontinuities in the graphical update of the scene, and ensures the realism of the underlying physics simulation. The implemented applications prove the feasibility of physically realistic simulation of complex interactions between deformable models. In addition, the parallel implementation of the simulator represents a promising starting point for the development of interactive simulations that can be used in different fields of research, such as surgeon training or rapid prototyping.
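
    As a rough illustration of the kind of anisotropy described above for mass-spring models (the abstract does not give the exact formulation), the sketch below scales each spring's stiffness by how well its current direction aligns with a prescribed fibre direction. The alignment-based scaling law and all parameter values are assumptions of mine, not the thesis's method.

```python
import numpy as np

def spring_force(xi, xj, rest_len, k_base, fibre, alpha=2.0):
    """Force on particle i from spring (i, j) with direction-dependent stiffness.

    The stiffness is scaled by the squared cosine between the spring direction
    and the tissue fibre direction: springs aligned with the fibre are
    (1 + alpha) times stiffer than springs orthogonal to it. This scaling law
    is only an illustrative choice.
    """
    d = xj - xi
    length = np.linalg.norm(d)
    u = d / length
    f = fibre / np.linalg.norm(fibre)
    k = k_base * (1.0 + alpha * np.dot(u, f) ** 2)   # anisotropic stiffness
    return k * (length - rest_len) * u               # Hooke's law along the spring

# Two springs with identical geometry but different orientation w.r.t. the fibre.
fibre = np.array([1.0, 0.0])
f_aligned = spring_force(np.array([0.0, 0.0]), np.array([1.2, 0.0]), 1.0, 10.0, fibre)
f_orthogonal = spring_force(np.array([0.0, 0.0]), np.array([0.0, 1.2]), 1.0, 10.0, fibre)
print(f_aligned, f_orthogonal)  # the aligned spring pulls back (1 + alpha) times harder
```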

    Adaptive Physically Based Models in Computer Graphics

    Get PDF
    One of the major challenges in physically-based modeling is making simulations efficient. Adaptive models provide an essential solution to these efficiency goals. These models are able to self-adapt in space and time, attempting to provide the best possible compromise between accuracy and speed. This survey reviews the adaptive solutions proposed so far in computer graphics. Models are classified according to the strategy they use for adaptation, from time-stepping and freezing techniques to geometric adaptivity in the form of structured grids, meshes, and particles. Applications range from fluids, through deformable bodies, to articulated solids.
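
    As a concrete instance of the time-stepping adaptivity category discussed in the survey, the sketch below uses step doubling with an explicit Euler integrator to estimate the local error and shrink or grow the step size. The integrator, tolerance, and growth factors are my own illustrative choices, not taken from the survey.

```python
import numpy as np

def adaptive_euler(f, y0, t0, t_end, dt=0.1, tol=1e-4):
    """Explicit Euler with step doubling: compare one full step against two half
    steps, accept the step if the difference is below tol, otherwise halve dt."""
    t, y, dt_hist = t0, np.asarray(y0, dtype=float), []
    while t < t_end:
        dt = min(dt, t_end - t)
        y_full = y + dt * f(t, y)                            # one step of size dt
        y_half = y + 0.5 * dt * f(t, y)
        y_two = y_half + 0.5 * dt * f(t + 0.5 * dt, y_half)  # two steps of dt/2
        err = np.linalg.norm(y_two - y_full)
        if err <= tol:
            t, y = t + dt, y_two                             # accept the more accurate value
            dt_hist.append(dt)
            dt *= 1.5                                        # grow the step where the motion is smooth
        else:
            dt *= 0.5                                        # refine where the dynamics are fast
    return y, dt_hist

# Damped oscillator: the step size automatically adapts to the solution's speed.
y_final, steps = adaptive_euler(lambda t, y: np.array([y[1], -50.0 * y[0] - y[1]]),
                                [1.0, 0.0], 0.0, 2.0)
print(len(steps), min(steps), max(steps))
```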

    Virtual Reality Simulator for Training in Myringotomy with Tube Placement

    Get PDF
    Myringotomy refers to a surgical incision in the eardrum, and it is often followed by ventilation tube placement to treat middle-ear infections. The procedure is difficult to learn; hence, the objectives of this work were to develop a virtual-reality training simulator, assess its face and content validity, and implement quantitative performance metrics and assess construct validity. A commercial digital gaming engine (Unity3D) was used to implement the simulator, with support for 3D visualization of digital ear models and for the major surgical tasks. A haptic arm co-located with the stereo scene was used to manipulate virtual surgical tools and to provide force feedback. A questionnaire was developed with 14 face validity questions focusing on realism and 6 content validity questions focusing on training potential. Twelve participants from the Department of Otolaryngology were recruited for the study. Responses to 12 of the 14 face validity questions were positive. One concern was with the contact modeling related to tube insertion into the eardrum, and the other was with the movement of the blade and forceps. The former could be resolved by using a higher-resolution digital model of the eardrum to improve contact localization; the latter could be resolved by using a higher-fidelity haptic device. With regard to content validity, 64% of the responses were positive, 21% were neutral, and 15% were negative. In the final phase of this work, automated performance metrics were programmed and a construct validity study was conducted with 11 participants: 4 senior Otolaryngology consultants and 7 junior Otolaryngology residents. Each participant performed 10 procedures on the simulator, and metrics were collected automatically. Senior Otolaryngologists took significantly less time to completion than junior residents, and junior residents made 2.8 times as many errors as the experienced surgeons. The senior surgeons also had significantly longer incision lengths, more accurate incision angles, and lower magnification while keeping both the umbo and annulus in view. All metrics discriminated senior Otolaryngologists from junior residents with a significance of p < 0.002. The simulator has sufficient realism, training potential and performance discrimination ability to warrant a more resource-intensive skills transference study.
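
    The construct-validity analysis compares metric distributions between senior consultants and junior residents; the abstract reports significance levels but does not name the statistical test. The sketch below shows one common choice for such small independent groups, the Mann-Whitney U test from SciPy, applied to hypothetical placeholder timings. Both the data values and the choice of test are assumptions of mine, not the thesis's.

```python
from scipy.stats import mannwhitneyu

# Hypothetical time-to-completion values in seconds (placeholder data only).
senior_times = [41.0, 38.5, 45.2, 39.9]                     # 4 senior consultants
junior_times = [72.3, 65.1, 80.4, 69.8, 75.0, 61.7, 78.2]   # 7 junior residents

# Two-sided test of whether the two groups come from the same distribution.
stat, p_value = mannwhitneyu(senior_times, junior_times, alternative='two-sided')
print(f"U = {stat}, p = {p_value:.4f}")
```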

    Workshop on the Integration of Finite Element Modeling with Geometric Modeling

    Get PDF
    The Workshop on the Integration of Finite Element Modeling with Geometric Modeling was held on 12 May 1987 to discuss the geometric modeling requirements of the finite element modeling process and to better understand the technical aspects of integrating these two areas. The 11 papers are presented, except for one for which only the abstract is given.

    Doctor of Philosophy

    Get PDF
    Dissertation. Shape analysis is a well-established tool for processing surfaces. It is often a first step in performing tasks such as segmentation, symmetry detection, and finding correspondences between shapes. Shape analysis is traditionally employed on well-sampled surfaces where the geometry and topology are precisely known. When the surface takes the form of a point cloud containing nonuniform sampling, noise, and incomplete measurements, traditional shape analysis methods perform poorly. Although one may first perform reconstruction on such a point cloud prior to shape analysis, if the reconstructed geometry and topology are far from the true surface, this can have an adverse impact on the subsequent analysis. Furthermore, for triangulated surfaces containing noise, thin sheets, and poorly shaped triangles, existing shape analysis methods can be highly unstable. This thesis explores methods of shape analysis applied directly to such defect-laden shapes. We first study the problem of surface reconstruction in order to better understand the types of point clouds for which reconstruction methods have difficulties. To this end, we have devised a benchmark for surface reconstruction, establishing a standard for measuring reconstruction error. We then develop a new method for consistently orienting the normals of such challenging point clouds by using a collection of harmonic functions intrinsically defined on the point cloud. Next, we develop a new shape analysis tool that is tolerant to imperfections by constructing distances directly on the point cloud, defined as the likelihood of two points belonging to a mutually common medial ball, and we apply it to segmentation and reconstruction. We extend this distance measure to define a diffusion process on the point cloud, tolerant to missing data, which is used for matching incomplete shapes undergoing a nonrigid deformation. Lastly, we have developed an intrinsic method for multiresolution remeshing of a poor-quality triangulated surface via spectral bisection.
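
    As a generic illustration of diffusion on a point cloud, the sketch below performs one implicit heat-diffusion step of an indicator function over a k-nearest-neighbour graph Laplacian. The dissertation builds its diffusion from medial-ball likelihoods, which the abstract does not detail, so the k-NN construction, Gaussian weights, and bandwidth choice here are assumptions of mine rather than the author's operator.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix, identity, diags
from scipy.sparse.linalg import spsolve

def diffuse(points, u0, k=8, t=0.05):
    """One implicit heat step (I + t L) u = u0 on a k-NN graph Laplacian.

    Gaussian edge weights with a per-cloud bandwidth; a generic construction,
    not the medial-ball-based operator from the dissertation."""
    n = len(points)
    dists, idx = cKDTree(points).query(points, k=k + 1)   # first neighbour is the point itself
    sigma = np.mean(dists[:, 1:])                         # bandwidth from average neighbour distance
    rows = np.repeat(np.arange(n), k)
    cols = idx[:, 1:].ravel()
    w = np.exp(-(dists[:, 1:].ravel() / sigma) ** 2)
    W = coo_matrix((w, (rows, cols)), shape=(n, n)).tocsr()
    W = 0.5 * (W + W.T)                                   # symmetrise the graph
    L = diags(np.asarray(W.sum(axis=1)).ravel()) - W      # combinatorial Laplacian
    A = (identity(n, format='csr') + t * L).tocsr()
    return spsolve(A, u0)

# The indicator of one point spreads to its geometric neighbours.
rng = np.random.default_rng(1)
pts = rng.random((200, 3))
u0 = np.zeros(200)
u0[0] = 1.0
u = diffuse(pts, u0)
print(u[:5])
```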

    Geometry–aware finite element framework for multi–physics simulations: an algorithmic and software-centric perspective

    Get PDF
    In finite element simulations, the handling of geometrical objects and their discrete representation is a critical aspect in both serial and parallel scientific software environments. The development of codes targeting such environments demands great effort and many invested man-hours. In this thesis we approach these issues on three fronts. First, stable and efficient techniques for the transfer of discrete fields between non-matching volume or surface meshes are an essential ingredient for the discretization and numerical solution of coupled multi-physics and multi-scale problems. In particular, L2-projections allow the transfer of discrete fields between unstructured meshes, both in the volume and on the surface. We present an algorithm for parallelizing the assembly of the L2-transfer operator for unstructured meshes that are arbitrarily distributed among different processes. The algorithm requires no a priori information on the geometrical relationship between the different meshes. Second, the geometric representation is often a limiting factor that imposes a trade-off between how accurately the shape is described and which methods can be employed for solving the system of differential equations. Parametric finite elements and bijective mappings between polygons or polyhedra allow us to flexibly construct finite element discretizations with arbitrary resolutions without sacrificing the accuracy of the shape description. Such flexibility allows state-of-the-art techniques, such as geometric multigrid methods, to be employed on meshes of almost any shape. Last, the way numerical techniques are represented in software libraries, and approached from a development perspective, affects both the usability and the maintainability of such libraries. Completely separating the intent of high-level routines from the actual implementation and technologies allows for portable and maintainable performance. We provide an overview of current trends in the development of scientific software and showcase our open-source library utopia.
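
    To make the role of L2-projection concrete, the sketch below projects a piecewise-linear field from a source 1D mesh onto a non-matching target 1D mesh by assembling the target mass matrix and a quadrature-based right-hand side. It is a minimal serial illustration under my own simplifications (1D, P1 elements, two-point Gauss quadrature), not the parallel volume/surface assembly algorithm of the thesis or the utopia implementation.

```python
import numpy as np

def l2_project_1d(src_nodes, src_vals, tgt_nodes):
    """L2-projection of a P1 field from a source mesh onto a non-matching target mesh.

    Solves M q = f with M the target P1 mass matrix and f_i the integral of
    phi_i(x) * u_src(x), approximated by 2-point Gauss quadrature."""
    n = len(tgt_nodes)
    M = np.zeros((n, n))
    f = np.zeros(n)
    gauss_pts = np.array([-1.0, 1.0]) / np.sqrt(3.0)         # reference points on [-1, 1]
    for e in range(n - 1):
        a, b = tgt_nodes[e], tgt_nodes[e + 1]
        h = b - a
        # Exact P1 element mass matrix.
        M[e:e + 2, e:e + 2] += h / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
        # Quadrature for the coupling with the (non-matching) source field.
        for xi in gauss_pts:
            x = 0.5 * (a + b) + 0.5 * h * xi
            w = 0.5 * h                                      # both Gauss weights equal 1
            u_src = np.interp(x, src_nodes, src_vals)        # evaluate source P1 field
            phi = np.array([(b - x) / h, (x - a) / h])       # target hat functions on the element
            f[e:e + 2] += w * u_src * phi
    return np.linalg.solve(M, f)

# Non-matching meshes on [0, 1]: a linear field is reproduced exactly.
src = np.linspace(0.0, 1.0, 7)
tgt = np.linspace(0.0, 1.0, 11)
q = l2_project_1d(src, 2.0 * src + 1.0, tgt)
print(np.allclose(q, 2.0 * tgt + 1.0))
```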

    Curve Skeleton and Moments of Area Supported Beam Parametrization in Multi-Objective Compliance Structural Optimization

    Get PDF
    This work addresses the end-to-end virtual automation of structural optimization, up to the derivation of a parametric geometry model that can be used for application areas such as additive manufacturing or the verification of the structural optimization result with the finite element method. Whether a holistic design in structural optimization can be achieved with the weighted sum method, which can be automatically parameterized with curve skeletonization and cross-section regression to virtually verify the result and control the local size for additive manufacturing, is investigated in general. In this work, a holistic design is understood as a design that considers various compliances as objective functions. The parameterization uses the automated determination of beam parameters by so-called curve skeletonization with subsequent cross-section shape parameter estimation based on moments of area, especially for multi-objective optimized shapes. An essential contribution is the linking of the parameterization with the results of the structural optimization, e.g., to include properties such as boundary conditions, load conditions, sensitivities or even density variables in the curve-skeleton parameterization. The parameterization focuses on guiding the skeletonization based on the information provided by the optimization and the finite element model. In addition, the cross-section detection considers circular, elliptical, and tensor-product spline cross-sections that can be applied to various shape descriptors such as convolutional surfaces, subdivision surfaces, or constructive solid geometry. The shape parameters of these cross-sections are estimated using stiffness distributions, moments of area of 2D images, and convolutional neural networks with a loss function tailored to moments of area. Each final geometry is designed by extruding the cross-section along the appropriate curve segment of the beam and joining it to other beams using only unification operations. The focus of the multi-objective structural optimization considering 1D, 2D and 3D elements is on cases that can be modeled by the Poisson equation and linear elasticity. This enables the development of designs in application areas such as thermal conduction, electrostatics, magnetostatics, potential flow, linear elasticity and diffusion, which can be optimized in combination or individually. Due to the simplicity of the cases defined by the Poisson equation, no experts are required, so many conceptual designs can be generated and reconstructed by ordinary users with little effort. Specifically for 1D elements, element stiffness matrices for tensor-product spline cross-sections are derived, which can be used to optimize a variety of lattice structures and automatically convert them into free-form surfaces. For 2D elements, non-local trigonometric interpolation functions are used, which should significantly increase the interpretability of the density distribution. To further improve the optimization, a parameter-free mesh deformation is embedded so that the compliances can be further reduced by locally shifting the node positions. Finally, the proposed end-to-end optimization and parameterization is applied to verify a linear elasto-static optimization result and to satisfy the local size constraint for the manufacturing, by selective laser melting, of a heat-transfer-optimized heat sink for a CPU.
For the elasto-static case, the parameterization is adjusted until a certain criterion (displacement) is satisfied, while for the heat-transfer case, the manufacturing constraints are satisfied by automatically changing the local size with the proposed parameterization. This heat sink is then manufactured without manual adjustment and experimentally validated to limit the temperature of a CPU to a certain level.

Table of contents:
1. Introduction: 1.1 Research design and motivation; 1.2 Research theses and chapter overview
2. Preliminaries of topology optimization: 2.1 Material interpolation; 2.2 Topology optimization with parameter-free shape optimization; 2.3 Multi-objective topology optimization with the weighted sum method
3. Simultaneous size, topology and parameter-free shape optimization of wireframes with B-spline cross-sections: 3.1 Fundamentals in wireframe optimization; 3.2 Size and topology optimization with periodic B-spline cross-sections; 3.3 Parameter-free shape optimization embedded in size optimization; 3.4 Weighted sum size and topology optimization; 3.5 Cross-section comparison
4. Non-local trigonometric interpolation in topology optimization: 4.1 Fundamentals in material interpolations; 4.2 Non-local trigonometric shape functions; 4.3 Non-local parameter-free shape optimization with trigonometric shape functions; 4.4 Non-local and parameter-free multi-objective topology optimization
5. Fundamentals in skeleton-guided shape parametrization in topology optimization: 5.1 Skeletonization in topology optimization; 5.2 Cross-section recognition for images; 5.3 Subdivision surfaces; 5.4 Convolutional surfaces with meta ball kernel; 5.5 Constructive solid geometry
6. Curve skeleton guided beam parametrization of topology optimization results: 6.1 Fundamentals in skeleton-supported reconstruction; 6.2 Subdivision surface parametrization with periodic B-spline cross-sections; 6.3 Curve skeletonization tailored to topology optimization with pre-processing; 6.4 Surface reconstruction using local stiffness distribution
7. Cross-section shape parametrization for periodic B-splines: 7.1 Preliminaries in B-spline control grid estimation; 7.2 Cross-section extraction of 2D images; 7.3 Tensor spline parametrization with moments of area; 7.4 B-spline parametrization with moments-of-area-guided convolutional neural network
8. Fully automated compliance optimization and curve-skeleton parametrization for a CPU heat sink with size control for SLM: 8.1 Automated 1D thermal compliance minimization, constrained surface reconstruction and additive manufacturing; 8.2 Automated 2D thermal compliance minimization, constrained surface reconstruction and additive manufacturing; 8.3 Using the heat sink prototypes to cool a CPU
9. Conclusion
10. Outlook
Literature; Appendix: A Previous studies; B Cross-section properties; C Case studies for the cross-section parametrization; D Experimental setup
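
    The cross-section shape parameters described above start from moments of area of 2D images. The sketch below computes the zeroth, first, and centred second moments of a binary cross-section image on a pixel grid and derives the principal moments and equivalent-ellipse semi-axes from them. It is a generic illustration under my own conventions (unit pixel size, binary mask input), not the exact pipeline or the tailored CNN loss of the paper.

```python
import numpy as np

def moments_of_area(mask, pixel_size=1.0):
    """Area, centroid, centred second moments, and equivalent-ellipse semi-axes
    of a binary cross-section image (True/1 = material)."""
    ys, xs = np.nonzero(mask)
    x = (xs + 0.5) * pixel_size                  # pixel-centre coordinates
    y = (ys + 0.5) * pixel_size
    dA = pixel_size ** 2
    A = len(xs) * dA                             # zeroth moment: area
    cx, cy = x.mean(), y.mean()                  # first moments / A: centroid
    Ixx = np.sum((y - cy) ** 2) * dA             # second moments about the centroid
    Iyy = np.sum((x - cx) ** 2) * dA
    Ixy = np.sum((x - cx) * (y - cy)) * dA       # product of inertia
    # Principal second moments via the 2x2 inertia tensor.
    I1, I2 = np.linalg.eigvalsh(np.array([[Ixx, -Ixy], [-Ixy, Iyy]]))
    # Semi-axes of the ellipse with the same area and principal-moment ratio.
    ratio = (I2 / I1) ** 0.25
    a = np.sqrt(A / np.pi) * ratio
    b = np.sqrt(A / np.pi) / ratio
    return A, (cx, cy), (Ixx, Iyy, Ixy), (a, b)

# Solid circular cross-section of radius ~20 pixels.
yy, xx = np.mgrid[0:64, 0:64]
disk = (xx - 32) ** 2 + (yy - 32) ** 2 <= 20 ** 2
A, c, I, axes = moments_of_area(disk)
print(A, c, axes)   # area ~ pi*20^2, centroid ~ (32.5, 32.5), nearly equal semi-axes
```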