54 research outputs found
Cuckoo Search Algorithm with Lévy Flights for Global-Support Parametric Surface Approximation in Reverse Engineering
This paper concerns several important topics of the Symmetry journal, namely computer-aided design, computational geometry, computer graphics, visualization, and pattern recognition. We also take advantage of the symmetric structure of tensor-product surfaces, where the parametric variables u and v play a symmetric role in shape reconstruction. In this paper we address the general problem of global-support parametric surface approximation from clouds of data points for reverse engineering applications. Given a set of measured data points, the approximation is formulated as a nonlinear continuous least-squares optimization problem. Then, a recent metaheuristic called the Cuckoo Search Algorithm (CSA) is applied to compute all relevant free variables of this minimization problem (namely, the data parameters and the surface poles). The method includes the iterative generation of new solutions using Lévy flights to promote diversity of solutions and prevent stagnation. A critical advantage of this method is its simplicity: the CSA requires only two parameters, far fewer than most other metaheuristic approaches, so parameter tuning becomes a very easy task. The method is also simple to understand and easy to implement. Our approach has been applied to a benchmark of three illustrative sets of noisy data points corresponding to
surfaces exhibiting several challenging features. Our experimental results show that the method performs very well even for noisy and unorganized data points; therefore, it can be used directly for real-world reverse engineering applications without further pre/post-processing. Comparative work with the most classical mathematical techniques for this problem, as well as a recent modification of the CSA called Improved CSA (ICSA), is also reported. Two nonparametric statistical tests show that our method outperforms the classical mathematical techniques and provides results equivalent to ICSA for all instances in our benchmark.
This research work has received funding from the project PDE-GIR (Partial Differential Equations for Geometric modelling, Image processing, and shape Reconstruction) of the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie Grant agreement No. 778035, the Spanish Ministry of Economy and Competitiveness (Computer Science National Program) under Grant #TIN2017-89275-R of the Agencia Estatal de Investigación and European Funds FEDER (AEI/FEDER, UE), and the project #JU12, jointly supported by the public body SODERCAN of the Regional Government of Cantabria and European Funds FEDER (SODERCAN/FEDER UE). We also thank Toho University, Nihon University, and the University of Cantabria for their support in conducting this research work.
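As a rough illustration of the approach summarized above, a minimal Cuckoo Search loop with Lévy flights might look like the following Python sketch. This is not the authors' implementation: the Mantegna-style step generator, the 0.01 step scale, the colony size, and the search bounds are illustrative assumptions, and in the paper the objective would be the least-squares surface error over the data parameters and surface poles rather than a generic function.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5, rng=None):
    """Draw a Lévy-distributed step using Mantegna's algorithm."""
    if rng is None:
        rng = np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(objective, dim, n_nests=15, pa=0.25, iters=200, seed=0):
    """Minimise `objective`; the colony size and the abandonment
    fraction `pa` are the only CSA-specific parameters."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(-1.0, 1.0, (n_nests, dim))
    fitness = np.array([objective(x) for x in nests])
    best = nests[fitness.argmin()].copy()
    for _ in range(iters):
        # New candidate solutions via Lévy flights around the current best.
        for i in range(n_nests):
            trial = nests[i] + 0.01 * levy_step(dim, rng=rng) * (nests[i] - best)
            f_trial = objective(trial)
            j = rng.integers(n_nests)  # random nest to compare against
            if f_trial < fitness[j]:
                nests[j], fitness[j] = trial, f_trial
        # Abandon a fraction `pa` of the worst nests to keep diversity.
        n_bad = int(pa * n_nests)
        if n_bad:
            worst = fitness.argsort()[-n_bad:]
            nests[worst] = rng.uniform(-1.0, 1.0, (n_bad, dim))
            fitness[worst] = [objective(x) for x in nests[worst]]
        best = nests[fitness.argmin()].copy()
    return best, float(fitness.min())
```

The two CSA parameters mentioned in the abstract correspond here to the colony size `n_nests` and the abandonment fraction `pa`; everything else is fixed by the algorithm.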
Multiple 2D self organising map network for surface reconstruction of 3D unstructured data
Surface reconstruction is a challenging task in reverse engineering because the reconstructed surface must closely represent the original object based on the data obtained. The data obtained are mostly unstructured, lacking sufficient information, so an incorrect surface may be produced. Therefore, the data should be reorganised by finding the correct topology with minimum surface error. Previous studies showed that the Self Organising Map (SOM) model, the conventional surface approximation approach with Non Uniform Rational B-Splines (NURBS) surfaces, and optimisation methods such as the Genetic Algorithm (GA), Differential Evolution (DE) and Particle Swarm Optimisation (PSO) are widely implemented in solving surface reconstruction. However, these models, approaches and optimisation methods still suffer from unstructured-data and accuracy problems. Therefore, the aims of this research are to propose a Cube SOM (CSOM) model with multiple 2D SOM networks for organising unstructured surface data, and to propose an optimised surface approximation approach for generating the NURBS surfaces. GA, DE and PSO are implemented to minimise the surface error by adjusting the NURBS control points. To test and validate the proposed model and approach, four primitive object datasets and one medical image dataset are used. To evaluate their performance, three measurements are used: Average Quantisation Error (AQE) and Number Of Vertices (NOV) for the CSOM model, and surface error for the proposed optimised surface approximation approach. The AQE of the CSOM model improved by 64% and 66% compared to 2D and 3D SOM respectively. The NOV for the CSOM model was reduced from 8000 to 2168 compared to 3D SOM. The surface error of the optimised surface approximation approach improved by 7% compared to the conventional approach.
The proposed CSOM model and optimised surface approximation approach successfully reconstructed the surfaces of all five datasets with better performance on the three measurements used in the evaluation.
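To make the SOM-based reorganisation step concrete, here is a minimal, hypothetical sketch of a single 2D SOM trained on 3D points, together with the Average Quantisation Error (AQE) measure named above. The grid size, learning-rate and radius schedules are illustrative assumptions; the proposed CSOM model combines several such 2D networks on a cube topology, which is not reproduced here.

```python
import numpy as np

def train_som_2d(data, grid_h=8, grid_w=8, iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Fit a (grid_h x grid_w) self-organising map to the rows of `data`."""
    rng = np.random.default_rng(seed)
    weights = rng.uniform(data.min(0), data.max(0), (grid_h, grid_w, data.shape[1]))
    gy, gx = np.mgrid[0:grid_h, 0:grid_w]
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: node whose weight vector is closest to the sample.
        dist = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(dist.argmin(), dist.shape)
        # Linearly decaying learning rate and neighbourhood radius.
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 1e-3
        h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2.0 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)
    return weights

def quantisation_error(data, weights):
    """Average distance from each data point to its best-matching node (AQE)."""
    nodes = weights.reshape(-1, weights.shape[-1])
    dist = np.linalg.norm(data[:, None, :] - nodes[None, :, :], axis=2)
    return float(dist.min(axis=1).mean())
```

After training, the node grid provides the structured topology over which a NURBS surface can then be fitted.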
Curve Reconstruction By Metaheuristics Algorithms On Cubic Rational Bézier Function
Curve reconstruction is regularly used in reverse engineering. Curve fitting, one of the main components of curve reconstruction, is usually represented by mathematical functions that are most suitable for representing a set of data points and may need to meet some constraints. Many curve fitting studies have been conducted, specifically using optimisation techniques. Optimisation techniques comprise exact algorithms and approximate algorithms. Approximate algorithms are worth highlighting since they offer a feasible way to develop an easier, more convenient curve fitting method that saves substantial computation, scales to large problems and produces better-quality results. Metaheuristics have strong and intelligent mechanisms to avoid being trapped in local minima.
Representation Of Rational Bézier Quadratics Using Genetic Algorithm, Differential Evolution And Particle Swarm Optimization
Data representation is a challenging problem in areas such as font reconstruction, medical imaging and scanned images. Direct mathematical techniques usually give the smallest errors but sometimes take much longer to compute. Alternatively, artificial intelligence techniques are widely used for optimization problems with shorter computation times, and their use for data representation has become popular lately. Thus, this thesis is dedicated to the representation of curves and surfaces. Three soft computing techniques, namely the Genetic Algorithm (GA), Differential Evolution (DE) and Particle Swarm Optimization (PSO), are utilized for the desired manipulation of curves and surfaces. These techniques are used to optimize the control points and weights in the description of the spline functions used. Preprocessing components such as corner detection and chord length parameterization are also explained in this thesis. For each proposed soft computing technique, parameter tuning is performed as an essential study. The sum of squared errors (SSE) is used as the objective function; this is therefore a minimization problem, where the best values for the control points and weights are found when the SSE value is minimized. Rational Bézier quadratics are utilized for the representation of curves. Reconstruction of surfaces is achieved by extending the rational Bézier quadratics to their rational Bézier bi-quadratic counterparts. The proposed curve and surface methods, with additional help from the soft computing techniques, are used to vectorize 2D and 3D shapes and objects.
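As a small illustration of the curve representation used in this thesis, the following Python sketch evaluates a rational quadratic Bézier curve and the SSE objective that the soft computing techniques would minimise over control points and weights. The vectorised NumPy formulation is my own assumption; only the standard Bernstein-basis definition is taken from the text.

```python
import numpy as np

def rational_bezier_quadratic(t, P, w):
    """Evaluate a rational quadratic Bézier curve at parameters t.

    P : (3, d) array of control points, w : (3,) positive weights."""
    t = np.asarray(t, dtype=float)[:, None]
    B = np.hstack([(1 - t) ** 2, 2 * t * (1 - t), t ** 2])  # Bernstein basis
    num = (B * w) @ np.asarray(P, dtype=float)
    den = (B * w).sum(axis=1, keepdims=True)
    return num / den

def sse(points, t, P, w):
    """Sum of squared errors between data points and curve samples:
    the objective minimised over control points and weights."""
    return float(((rational_bezier_quadratic(t, P, w) - points) ** 2).sum())
```

With all weights equal to one the curve reduces to an ordinary (polynomial) quadratic Bézier curve; adjusting the middle weight pulls the curve toward or away from the middle control point, which is exactly the extra freedom the optimisers exploit.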
Two simulated annealing optimization schemas for rational Bézier curve fitting in the presence of noise
Fitting curves to noisy data points is a difficult problem arising in many scientific and industrial domains. Although polynomial functions are usually applied to this task, many shapes cannot be properly fitted using this approach. In this paper, we tackle this issue by using rational Bézier curves. This is a very difficult problem that requires computing four different sets of unknowns (data parameters, poles, weights, and the curve degree), strongly related to each other in a highly nonlinear way, leading to a difficult continuous nonlinear optimization problem. We propose two simulated annealing schemas (the all-in-one schema and the sequential schema) to determine the data parameterization and the weights of the poles of the fitting curve. These schemas are combined with least-squares minimization and the Bayesian Information Criterion to calculate the poles and the optimal degree of the best-fitting rational Bézier curve, respectively. We apply our methods to a benchmark of three carefully chosen examples of 2D and 3D noisy data points. Our experimental results show that this methodology (particularly, the sequential schema) outperforms previous polynomial-based approaches for our data fitting problem, even in the presence of noise of low-to-medium intensity.
This research has been kindly supported by the Computer Science National Program of the Spanish Ministry of Economy and Competitiveness, Project Ref. #TIN2012-30768, Toho University (Funabashi, Japan), and the University of Cantabria (Santander, Spain).
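The accept/reject loop shared by both schemas can be sketched as a generic continuous simulated annealing routine. This is a hedged illustration, not either of the paper's schemas: the Gaussian proposal, geometric cooling schedule and parameter values are assumptions, and in the paper the decision variables would be the data parameterization and pole weights, with least squares and BIC handling the poles and the degree.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.1, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Generic continuous simulated annealing: always accept downhill moves,
    accept uphill moves with probability exp(-delta / T)."""
    rng = random.Random(seed)
    x = list(x0)
    fx = objective(x)
    best, best_f = list(x), fx
    T = t0
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = list(x), fx
        T *= cooling  # geometric cooling schedule
    return best, best_f
```

The occasional uphill acceptance early on (when T is large) is what lets the method escape the local minima that plague purely greedy fitting.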
Evolutionary algorithms for ship hull skinning approximation
Traditionally, the design process of a hull involves simulation using clay models. This must be done cautiously, accurately and efficiently in order to sustain the performance of the ship. Presently, the technologies of Computer Aided Design, Manufacturing and Engineering and of Computational Fluid Dynamics have enabled the 3D design and simulation of a hull to be done at a lower cost and within a shorter period of time. Besides that, automated design tools allow the transformation of offset data in designing the hull to be done automatically. One of the most common methods of constructing a hull from offset data is the skinning method. Generally, the skinning method comprises skinning interpolation and skinning approximation. Skinning interpolation constructs the surface exactly, but improper selection of parameterization methods may cause bumps, wiggles or uneven areas on the generated surface. On the other hand, using skinning surface approximation means that the surface can only be constructed close to the data points; thus, the error between the generated surface and the data points must be minimized to increase accuracy. Therefore, this study aims to solve the error minimization problem in order to produce a smoother and fairer surface by proposing Non Uniform Rational B-Spline surfaces optimised with various evolutionary algorithms, namely the Gravitational Search Algorithm, Particle Swarm Optimization and the Genetic Algorithm. The proposed methods involve four procedures: extraction of offset data from the line drawing plan; generation of control points; optimization of the surface; and validation of the hull surfaces. Validation is done by analyzing the surface curvature and the errors between the generated surface and the given data points. The experiments were conducted on both ship hull and free-form models, and the findings are compared with the interpolated skinning surface and the conventional skinning surface approximation.
The results show that the optimized skinning surfaces obtained using the proposed methods yield smaller errors, fewer control points and feasible surfaces while maintaining the shape of the hull.
Geometric Data Analysis: Advancements of the Statistical Methodology and Applications
Data analysis has become fundamental to our society and comes in multiple facets and approaches. Nevertheless, in research and applications, the focus has primarily been on data from Euclidean vector spaces. Consequently, the majority of methods that are applied today are not suited to more general data types. Driven by needs from fields like image processing, (medical) shape analysis, and network analysis, more and more attention has recently been given to data from non-Euclidean spaces, particularly (curved) manifolds. This has led to the field of geometric data analysis, whose methods explicitly take the structure (for example, the topology and geometry) of the underlying space into account.
This thesis contributes to the methodology of geometric data analysis by generalizing several fundamental notions from multivariate statistics to manifolds. We thereby focus on two different viewpoints.
First, we use Riemannian structures to derive a novel regression scheme for general manifolds that relies on splines of generalized Bézier curves. It can accurately model non-geodesic relationships, for example, time-dependent trends with saturation effects or cyclic trends. Since Bézier curves can be evaluated with the constructive de Casteljau algorithm, working with data from manifolds of high dimensions (for example, a hundred thousand or more) is feasible. Relying on the regression, we further develop
a hierarchical statistical model for an adequate analysis of longitudinal data in manifolds, and a method to control for confounding variables.
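The constructive de Casteljau algorithm mentioned above reduces, in the Euclidean case, to repeated linear interpolation; on a manifold each interpolation is replaced by a point along the geodesic between the two arguments. A minimal Euclidean sketch (an illustration, not the thesis' manifold implementation):

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bézier curve at t by repeated linear interpolation.

    On a Riemannian manifold, the weighted average below is replaced by the
    point at parameter t along the geodesic between consecutive points."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]
```

Because each step only needs pairwise interpolation, the cost grows with the number of control points, not the ambient dimension, which is what makes very high-dimensional data feasible.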
We secondly focus on data that is not only manifold- but even Lie group-valued, which is frequently the case in applications. We can only achieve this by endowing the group with an affine connection structure that is generally not Riemannian. Utilizing it, we derive generalizations of several well-known dissimilarity measures between data distributions that can be used for various tasks, including hypothesis testing. Invariance under data translations is proven, and a connection to continuous distributions is given for one measure.
A further central contribution of this thesis is that it shows use cases for all notions in real-world applications, particularly in problems from shape analysis in medical imaging and archaeology. We can replicate or further quantify several known findings for shape changes of the femur and the right hippocampus under osteoarthritis and Alzheimer's disease, respectively. Furthermore, in an archaeological application, we obtain new insights into the construction principles of ancient sundials. Last but not least, we use the geometric structure underlying human brain connectomes to predict cognitive scores. Utilizing a sample selection procedure, we obtain state-of-the-art results.
Development of methodologies for parameter identification and shape optimization in numerical simulations of plastic forming processes
PhD Thesis in Mechanical Engineering.
The interest of the stamping industry in the numerical simulation of sheet metal
forming, including inverse engineering approaches, is increasing. This occurs mainly because the trial-and-error design procedures commonly used in the past are no longer economically competitive. The use of simulation codes is currently common practice in the industrial forming environment, as the results typically obtained by means of the Finite Element Method (FEM) are well accepted by both the industrial and scientific communities.
In order to obtain accurate stress and strain fields, an effective FEM analysis requires reliable input data such as geometry, mesh, non-linear material behaviour laws, loading cases, friction laws, etc. A possible approach to overcome these difficulties is based on inverse problems. In this work, the following inverse problems in computational mechanics are presented and analysed: (i) parameter identification problems, which refer to the determination of input parameters to be used in constitutive models for numerical simulations, based on experimental data, and (ii) initial blank and tool design problems, where the aim is to estimate the initial shape of a blank or a tool in order to achieve the desired geometry after the forming process.
New optimization strategies for parameter identification problems that lead more efficiently to accurate material parameters are introduced and implemented. The aim of these strategies is to take advantage of the strengths of each selected algorithm and to improve the overall robustness and efficiency of classical single-stage optimization methodologies. Deterministic algorithms, evolutionary-inspired algorithms, or a combination of the two are used in the proposed strategies. Cascade, parallel and hybrid approaches are analysed in detail; in hybrid strategies, cascade and parallel approaches are integrated.
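A cascade strategy of this kind can be sketched as optimisers chained so that each stage starts from its predecessor's result. The two toy stages below (a hill-climbing random search followed by coordinate descent) are illustrative stand-ins, not the deterministic and evolutionary algorithms used in the thesis:

```python
import random

def random_search(objective, x0, samples=300, spread=1.0, seed=0):
    """Coarse global stage: hill-climbing random search around the best point."""
    rng = random.Random(seed)
    best, best_f = list(x0), objective(x0)
    for _ in range(samples):
        cand = [xi + rng.uniform(-spread, spread) for xi in best]
        f = objective(cand)
        if f < best_f:
            best, best_f = cand, f
    return best

def coordinate_descent(objective, x0, step=0.1, iters=100):
    """Fine local stage: axis-aligned moves with a shrinking step."""
    x = list(x0)
    for _ in range(iters):
        for i in range(len(x)):
            for delta in (step, -step):
                cand = list(x)
                cand[i] += delta
                if objective(cand) < objective(x):
                    x = cand
        step *= 0.9
    return x

def cascade(objective, x0, stages):
    """Cascade strategy: each optimiser starts from its predecessor's result."""
    x = list(x0)
    for stage in stages:
        x = stage(objective, x)
    return x
```

A parallel strategy would instead run several optimisers from the same start and keep the best result; the hybrid strategies combine both ideas.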
Two different approaches are presented and analysed for the evaluation of the objective function in parameter identification processes: single-point and finite element (FE) analyses. The single-point evaluation characterizes an infinitesimal amount of material subjected to a given deformation history. In an FE analysis, on the other hand, the constitutive model is implemented and accounted for at each element integration point.
Inverse problems, such as blank and tool design, are presented and described.
In the case of the initial blank optimization process, the definition of the initial blank shape for forming a crankcase component is presented. Also in this context, a study of the influence of the initial geometry definition on the optimization process is conducted. This study uses a NURBS formulation to model the upper surface of the blank, which is changed during the optimization process.
In the case of the tool design problem, a two-stage forging process is presented. In order to achieve a straight cylinder after forging, two different approaches are analysed: in the first, the initial geometry of the cylinder is optimized; in the other, the shape of the first-stage tool is optimized. Different methods are presented to parameterize the free surface of the cylinder, and different parameterizations are likewise used to define the tool.
The optimisation strategies proposed in this work efficiently solve optimisation problems for the metal forming industry.
Utilizing Delaunay triangulation in infrastructure design software
In Finland, irregular triangulation has traditionally been used in infrastructural design software, such as road, railroad, bridge, tunnel and environmental design software, to model ground surfaces. Elsewhere, methods such as regular square and triangle grids, approximation of the surface without an explicit surface representation, and algebraic surfaces have been used for the same task.
Approximating the ground surface is necessary for tasks such as determining the height of a point on the ground, interpolating 2D polylines onto the ground, calculating height lines, calculating volumes and visualization.
In most of these cases, a continuous surface representation, a digital terrain model, is needed. Delaunay triangulation is a way of forming an irregular triangulation from a 2D point set such that the triangles are well-formed. Well-formed triangles are essential for the accuracy of the surface representation.
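Two of the building blocks mentioned here can be sketched compactly: the empty-circumcircle test that edge-flipping Delaunay algorithms apply locally, and the barycentric height interpolation with which a terrain model answers point-height queries. These are illustrative fragments, not the thesis' implementation; the determinant test assumes counter-clockwise triangle orientation.

```python
import numpy as np

def in_circumcircle(a, b, c, p):
    """True if p lies strictly inside the circumcircle of triangle (a, b, c).

    (a, b, c) must be in counter-clockwise order; this determinant test is the
    local criterion checked by edge-flipping Delaunay algorithms."""
    M = np.array([
        [a[0] - p[0], a[1] - p[1], (a[0] - p[0]) ** 2 + (a[1] - p[1]) ** 2],
        [b[0] - p[0], b[1] - p[1], (b[0] - p[0]) ** 2 + (b[1] - p[1]) ** 2],
        [c[0] - p[0], c[1] - p[1], (c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2],
    ])
    return np.linalg.det(M) > 0

def interpolate_height(tri_xy, tri_z, p):
    """Height of point p inside a triangle via barycentric coordinates:
    the basic query a digital terrain model answers."""
    a, b, c = tri_xy
    det = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    w0 = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / det
    w1 = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / det
    w2 = 1.0 - w0 - w1
    return w0 * tri_z[0] + w1 * tri_z[1] + w2 * tri_z[2]
```

A triangulation is Delaunay exactly when no vertex lies inside the circumcircle of any triangle, which is what makes the triangles well-formed and the interpolated heights accurate.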
This Master's Thesis studies how much time and memory it takes to form a Delaunay triangulation for large point sets, and how Delaunay triangulation compares to other methods of forming a surface representation. In addition, the run-time and accuracy of the resulting surface representations are studied in different interpolation and volume calculation tasks.
- …