Contour matching using ant colony optimization and curve evolution
Shape retrieval is an important topic in computer vision. Image retrieval consists of selecting images that fulfil specific criteria from a collection of images. This thesis concentrates on contour-based image retrieval, in which only the information located on the shape contour is explored. There are many kinds of shape retrieval methods. Most research in this field has so far concentrated on matching methods and on how to achieve a meaningful correspondence. The matching process consists of finding correspondences between the points located on the compared contours. However, the large number of points incorporated in the correspondence makes the matching process complex. Furthermore, the correspondence cannot be computed intuitively without considering the effects of noise and distortion. Hence, heuristic methods are invoked to find an acceptable solution. Moreover, some research focuses on improving polygonal modelling methods so that the resulting contour is a good approximation of the original one, which can be used to reduce the number of points incorporated in the matching. In this thesis, a novel Ant Colony Optimization (ACO) contour-matching approach that finds an acceptable matching between contour shapes is developed. A previously proposed polygonal evolution method is selected to simplify the extracted contour; the main reason for selecting this method is its use of a predetermined stopping criterion. The matching process is formulated as a Quadratic Assignment Problem (QAP) and solved using ACO. An approximate similarity is computed using the original shape context descriptor and the Euclidean metric. The experimental results show that the proposed approach is invariant to noise and distortion, and that it is more robust to them than the previously introduced Dominant Point (DP) approach. This work serves as a fundamental study for assessing the Bender Test to diagnose dyslexic and non-dyslexic symptoms in children
On the Detection of Visual Features from Digital Curves using a Metaheuristic Approach
In computational shape analysis a crucial step consists in extracting meaningful features from digital curves. Dominant points are the points with curvature extrema on the curve that can suitably describe the curve both for visual perception and for recognition. Many approaches have been developed for detecting dominant points. In this paper we present a novel method that combines dominant point detection and ant colony optimization search. The method is inspired by the ant colony search (ACS) suggested by Yin in [1], but it results in a much more efficient and effective approximation algorithm. The excellent results have been compared both to works using an optimal search approach and to works based on exact approximation strategies
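A naive baseline for dominant point candidates — not the paper's ACS method — is to threshold the turn angle at each vertex of a closed digital curve; a sketch, with a hypothetical threshold:

```python
import numpy as np

def dominant_point_candidates(contour, angle_thresh=0.3):
    """Indices of vertices whose absolute turn angle (radians) exceeds a
    threshold. A simple curvature-extremum heuristic, not the ACS search."""
    pts = np.asarray(contour, dtype=float)
    n = len(pts)
    keep = []
    for i in range(n):
        a, b, c = pts[i - 1], pts[i], pts[(i + 1) % n]   # closed contour
        v1, v2 = b - a, c - b
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        turn = np.arctan2(cross, np.dot(v1, v2))         # signed turn angle
        if abs(turn) > angle_thresh:
            keep.append(i)
    return keep
```

On a square sampled with edge midpoints, only the four corners exceed the threshold, matching the intuition that dominant points sit at curvature extrema.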
Geometric-based Optimization Algorithms for Cable Routing and Branching in Cluttered Environments
The need for designing lighter and more compact systems often leaves limited space for planning routes for the connectors that enable interactions among the system's components. Finding optimal routes for these connectors in the densely populated environment that remains at the detail design stage has been a challenging problem for decades.
A variety of deterministic as well as heuristic methods have been developed to address different instances of this problem. While the focus of deterministic methods is primarily on the optimality of the final solution, heuristics offer acceptable solutions in a reasonable amount of time without guaranteeing optimality. This study is an attempt to further the efforts in deterministic optimization methods to tackle the routing problem in two and three dimensions by focusing on the optimality of final solutions.
The objective of this research is twofold. First, a mathematical framework is proposed for the optimization of the layout of wiring connectors in planar cluttered environments. The problem is to find the optimal tree network that spans multiple components to be connected, with the aim of minimizing the overall length of the connectors while maximizing their common length (for maintainability and traceability of connectors). The optimization problem is formulated as a bi-objective problem and two solution methods are proposed: (1) to solve for the optimal locations of a known number of breakouts (where the connectors branch out) using mixed-binary optimization and the visibility notion and (2) to find the minimum-length tree that spans multiple components of the system and generates the optimal layout using the previously developed convex hull based routing. The computational performance of these methods in solving a variety of problems is further evaluated.
Second, the problem of finding the shortest route connecting two given nodes in a 3D cluttered environment is considered and addressed through deterministically generating a graphical representation of the collision-free space and searching for the shortest path on the resulting graph. The method is tested on sample workspaces with scattered convex polyhedra and its computational performance is evaluated. The work demonstrates the NP-hardness of the problem, which quickly becomes intractable as components are added or the number of facets increases
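Once a graph of the collision-free space is available, the shortest-route query in the second part reduces to a standard graph search. A minimal Dijkstra sketch follows; the adjacency map here is a stand-in, not the paper's visibility-based construction:

```python
import heapq

def shortest_path(adj, src, dst):
    """Dijkstra on an adjacency map {node: [(neighbor, weight), ...]}.
    Returns (length, path), or (inf, []) if dst is unreachable."""
    dist = {v: float("inf") for v in adj}
    prev = {}
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist[u]:
            continue                      # stale queue entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist[v]:
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dist[dst] == float("inf"):
        return float("inf"), []
    path, node = [dst], dst
    while node != src:                    # walk predecessors back to src
        node = prev[node]
        path.append(node)
    return dist[dst], path[::-1]
```

In the paper's setting the nodes would be samples of the collision-free space and the weights Euclidean lengths of collision-free segments; the search itself is unchanged.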
Efficient Decimation of Polygonal Models Using Normal Field Deviation
A simple and robust greedy algorithm is proposed for efficient, quality decimation of polygonal models. The performance of a simplification algorithm depends on how the local geometric deviation caused by a local decimation operation is measured. As the normal field of a surface plays a key role in its visual appearance, a new measure of geometric fidelity is introduced that exploits the local normal field deviation in a novel way. This measure has the potential to identify and preserve the salient features of a surface model automatically. The resulting algorithm is simple to implement, produces approximations of better quality, and is efficient in running time. Subjective and objective comparisons validate these assertions. It is suitable for applications where the focus is a better speed-quality trade-off and simplification is used as a processing step in other algorithms
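The idea of costing a decimation operation by how much the incident face normals rotate can be illustrated in miniature. The sketch below uses the maximum angular change of corresponding normals; the paper's actual weighting may differ:

```python
import numpy as np

def face_normal(a, b, c):
    """Unit normal of triangle (a, b, c)."""
    n = np.cross(b - a, c - a)
    length = np.linalg.norm(n)
    return n / length if length > 0 else n

def normal_deviation(faces_before, faces_after):
    """Maximum angular change (radians) of corresponding face normals --
    a simplified stand-in for the paper's normal-field-deviation cost."""
    worst = 0.0
    for f0, f1 in zip(faces_before, faces_after):
        n0 = face_normal(*(np.asarray(p, float) for p in f0))
        n1 = face_normal(*(np.asarray(p, float) for p in f1))
        worst = max(worst, float(np.arccos(np.clip(np.dot(n0, n1), -1.0, 1.0))))
    return worst
```

A greedy simplifier would keep candidate operations in a priority queue keyed by such a cost, so low-deviation collapses (flat regions) are performed first and salient features survive longest.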
An Efficient Paradigm for Feasibility Guarantees in Legged Locomotion
Developing feasible body trajectories for legged systems on arbitrary
terrains is a challenging task. Given some contact points, the trajectories for
the Center of Mass (CoM) and body orientation, designed to move the robot, must
satisfy crucial constraints to maintain balance, and to avoid violating
physical actuation and kinematic limits. In this paper, we present a paradigm
for designing feasible trajectories in an efficient manner. Continuing
our previous work, we extend the notion of the 2D feasible
region, where static balance and the satisfaction of actuation limits were
guaranteed, whenever the projection of the CoM lies inside the proposed
admissible region. We here develop a general formulation of the improved
feasible region to guarantee dynamic balance alongside the satisfaction of both
actuation and kinematic limits for arbitrary terrains in an efficient manner.
To incorporate the feasibility of the kinematic limits, we introduce an
algorithm that computes the reachable region of the CoM. Furthermore, we
propose an efficient planning strategy that utilizes the improved feasible
region to design feasible CoM and body orientation trajectories. Finally, we
validate the capabilities of the improved feasible region and the effectiveness
of the proposed planning strategy, using simulations and experiments on the HyQ
robot and comparing them to a previously developed heuristic approach. Various
scenarios and terrains that mimic confined and challenging environments are
used for the validation.
Comment: 17 pages, 13 figures, submitted to Transactions on Robotics
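The simplest instance of the balance condition described above — static balance on flat ground — checks whether the CoM projection lies inside the support polygon spanned by the contact points. A toy sketch; the paper's feasible region additionally folds in actuation and kinematic limits, which this check ignores:

```python
def inside_convex_polygon(point, vertices):
    """True if a 2D point lies inside (or on the boundary of) a convex
    polygon given in counter-clockwise order. As a toy static-balance check,
    `vertices` are the contact points and `point` the CoM projection."""
    px, py = point
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # cross product of the edge direction with the offset to the point;
        # negative means the point is to the right of a CCW edge, i.e. outside
        if (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1) < 0:
            return False
    return True
```

The feasible regions of the paper generalize exactly this membership test: dynamic effects and limits shrink and reshape the admissible region the CoM projection must stay within.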
Analysis of Farthest Point Sampling for Approximating Geodesics in a Graph
A standard way to approximate the distance between any two vertices u and v
on a mesh is to compute, in the associated graph, a shortest path from u
to v that goes through one of k sources, which are well-chosen vertices.
Precomputing the distance from each of the k sources to all vertices of
the graph yields an efficient computation of approximate distances between any
two vertices. One standard method for choosing the sources, which has been used
extensively and successfully for isometry-invariant surface processing, is the
so-called Farthest Point Sampling (FPS), which starts with a random vertex as
the first source and iteratively selects the farthest vertex from the already
selected sources.
In this paper, we analyze the stretch factor of
approximate geodesics computed using FPS, which is the maximum, over all pairs
of distinct vertices, of their approximated distance over their geodesic
distance in the graph. We show that this stretch factor can be bounded in terms
of the minimal value of the stretch factor obtained using an
optimal placement of k sources, with a bound that depends on the ratio of the lengths of
the longest and the shortest edges of the graph. This provides some evidence
explaining why farthest point sampling has been used successfully for
isometry-invariant shape processing. Furthermore, we show that it is
NP-complete to find k sources that minimize the stretch factor.
Comment: 13 pages, 4 figures
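The FPS procedure described above is short to state in code. A sketch on a weighted graph follows; a fixed start vertex replaces the random one, for reproducibility:

```python
import heapq

def dijkstra(adj, src):
    """Single-source shortest-path distances on {node: [(nbr, w), ...]}."""
    dist = {v: float("inf") for v in adj}
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def farthest_point_sampling(adj, k, start):
    """Pick k sources: start vertex, then repeatedly the vertex farthest
    (in graph distance) from all sources selected so far."""
    sources = [start]
    mind = dijkstra(adj, start)           # distance to nearest source
    while len(sources) < k:
        nxt = max(mind, key=mind.get)     # farthest vertex from the sources
        sources.append(nxt)
        for v, d in dijkstra(adj, nxt).items():
            mind[v] = min(mind[v], d)
    return sources
```

On a unit-weight path graph 0-1-2-3-4 started at vertex 0, FPS next picks the opposite end 4 and then the midpoint 2, which matches the intuition of spreading sources as far apart as possible.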
Geometric data for testing implementations of point reduction algorithms : case study using Mapshaper v 0.2.28 and previous versions
There are several open source and commercial implementations of the Visvalingam algorithm for line generalisation. The algorithm provides scope for implementation-specific interpretations, with different outcomes. This is inevitable, sometimes necessary, and does not imply that an implementation is flawed. The only restriction is that the output must not be so inconsistent with the intent of the algorithm that it becomes inappropriate. The aim of this paper is to place the algorithm within the literature and to demonstrate the value of the teragon test for evaluating the appropriateness of implementations; Mapshaper v 0.2.28 and earlier versions are used for illustrative purposes. Data pertaining to natural features, such as coastlines, are insufficient for establishing whether deviations in output are significant. The teragon test produced an unexpected loss of symmetry from both the Visvalingam and Douglas-Peucker options, making the tested versions unsuitable for some applications outside of cartography. This paper describes the causes and discusses their implications. Mapshaper 0.3.17 passes the teragon test. Other developers and users should check their implementations using contrived geometric data, such as the teragon data provided in this paper, especially when the source code is not available. The teragon test is also useful for evaluating other point reduction algorithms
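For reference, the core of the Visvalingam algorithm — repeatedly deleting the interior point whose triangle with its two neighbours has the smallest "effective area" — can be sketched as below. This is a quadratic-time illustration; production implementations such as Mapshaper use a priority queue and make interpretation-specific choices (tie-breaking, area recomputation), which is exactly where outputs can diverge:

```python
def tri_area(a, b, c):
    """Area of the triangle a-b-c (the point's effective area)."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def visvalingam(points, n_keep):
    """Reduce a polyline to n_keep points by repeatedly dropping the
    interior point with the smallest effective area. Endpoints are kept."""
    pts = list(points)
    while len(pts) > n_keep:
        areas = [tri_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        i_min = min(range(len(areas)), key=areas.__getitem__) + 1
        del pts[i_min]
    return pts
```

Feeding such an implementation contrived symmetric data, in the spirit of the teragon test, quickly reveals whether tie-breaking destroys symmetry.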
The implementation of a disambiguation marching cubes algorithm
This thesis first systematically analyzes a classic surface generation algorithm in computer volume visualization, the marching cubes algorithm, with emphasis on the mathematical background and the ambiguity problem of the algorithm. A simple and elegant disambiguation algorithm is then described and implemented. Finally, data generated from mathematical functions and real-world data from scientific experiments are used to test the original marching cubes algorithm and the disambiguation algorithm
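Two small pieces of the algorithm illustrate where the ambiguity problem arises: the 8-bit case index computed from a cube's corner values, and the test for an ambiguous face, on which diagonally opposite corners lie on opposite sides of the isosurface. A sketch; corner ordering conventions vary between implementations:

```python
def cube_case_index(corner_values, iso):
    """8-bit index into the 256-entry marching cubes case table:
    bit i is set when corner i lies below the isovalue."""
    index = 0
    for bit, value in enumerate(corner_values):
        if value < iso:
            index |= 1 << bit
    return index

def face_is_ambiguous(v00, v10, v11, v01, iso):
    """A cube face is ambiguous when its two diagonals separate the
    inside/outside corners -- the configuration a disambiguation rule
    (e.g. a face-by-face decider) must resolve to keep surfaces consistent."""
    inside = [v < iso for v in (v00, v10, v11, v01)]
    return inside[0] == inside[2] and inside[1] == inside[3] and inside[0] != inside[1]
```

Two adjacent cubes triangulating an ambiguous shared face differently is what produces the holes that disambiguation algorithms eliminate.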