9,210 research outputs found
Collision detection or nearest-neighbor search? On the computational bottleneck in sampling-based motion planning
The complexity of nearest-neighbor search dominates the asymptotic running
time of many sampling-based motion-planning algorithms. However, collision
detection is often considered to be the computational bottleneck in practice.
Examining various asymptotically optimal planning algorithms, we characterize
settings, which we call NN-sensitive, in which the practical computational role
of nearest-neighbor search is far from negligible, i.e., the portion of
running time taken up by nearest-neighbor search is comparable to, and sometimes
even greater than, the portion taken up by collision detection. This
reinforces and substantiates the claim that motion-planning algorithms could
significantly benefit from efficient and possibly specifically-tailored
nearest-neighbor data structures. The asymptotic (near-)optimality of these
algorithms relies on a prescribed connection radius, which defines a ball around
each configuration such that the configuration must be connected to all other
configurations in that ball. To facilitate our study, we show how to adapt this
radius to non-Euclidean spaces, which are prevalent in motion planning. This
technical result is of independent interest, as it makes it possible to compare
the radial-connection approach with the common alternative, namely, connecting
each configuration to its k nearest neighbors (k-NN). Indeed, as we demonstrate,
there are scenarios where the radial connection scheme produces a solution path
of a given cost ten-fold (and more) faster than k-NN.
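To make the two connection rules concrete, here is a minimal brute-force sketch in Python (function names are illustrative; real planners would use a k-d tree or similar structure, and adapting the radius to non-Euclidean spaces is the technical point of the paper, not shown here):

```python
import math
import random

def radial_neighbors(points, q, r):
    """Radial rule: connect q to every configuration within radius r."""
    return [i for i, p in enumerate(points) if math.dist(p, q) <= r]

def knn_neighbors(points, q, k):
    """k-NN rule: connect q to its k nearest configurations."""
    order = sorted(range(len(points)), key=lambda i: math.dist(points[i], q))
    return order[:k]

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(200)]
q = (0.5, 0.5)
ball = radial_neighbors(pts, q, 0.1)   # size depends on local sampling density
knn = knn_neighbors(pts, q, 10)        # always exactly k candidates
```

Note the structural difference the abstract exploits: the radial rule's candidate set grows with local density, while k-NN fixes the candidate count regardless of geometry.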
Property-Testing in Sparse Directed Graphs: 3-Star-Freeness and Connectivity
We study property testing in directed graphs in the bounded degree model,
where we assume that an algorithm may only query the outgoing edges of a
vertex, a model proposed by Bender and Ron in 2002. As our first main result,
we present a property testing algorithm for strong connectivity in this model;
it is based on a reduction to estimating the vertex indegree distribution. For
subgraph-freeness we give a property testing algorithm whose query complexity
depends on the number of connected components in the queried subgraph that have
no incoming edge. We furthermore consider the problem of testing whether a
weakly connected graph contains vertices with a degree of at least 3, which can
be viewed as testing for freeness of all orientations of 3-stars; as our second
main result, we show that this property can be tested with a lower query
complexity than one would expect. Comment: Results partly published at ESA 201
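The query model can be sketched as follows; the oracle interface and query counter below are hypothetical scaffolding (the paper's actual testers and their complexity bounds are not reproduced here), but they illustrate how an algorithm restricted to out-edge queries gathers the raw material for estimating the indegree distribution:

```python
import random

class OutQueryOracle:
    """Digraph oracle that answers only out-edge queries (Bender-Ron model);
    every query is counted toward the complexity budget."""
    def __init__(self, out_adj):
        self.out_adj = out_adj      # vertex -> list of out-neighbors
        self.queries = 0

    def out_degree(self, v):
        self.queries += 1
        return len(self.out_adj[v])

    def out_neighbor(self, v, i):
        self.queries += 1
        return self.out_adj[v][i]

def sample_out_edges(oracle, n, samples, rng):
    """Collect the out-neighbors of randomly sampled vertices; each collected
    endpoint is one observed incoming edge at that endpoint, which is the raw
    material for estimating the indegree distribution."""
    hits = []
    for _ in range(samples):
        v = rng.randrange(n)
        for i in range(oracle.out_degree(v)):
            hits.append(oracle.out_neighbor(v, i))
    return hits

# Example: a directed 4-cycle, where every vertex has indegree exactly 1.
oracle = OutQueryOracle({0: [1], 1: [2], 2: [3], 3: [0]})
hits = sample_out_edges(oracle, 4, 8, random.Random(1))
```

The counter makes explicit what "query complexity" charges for: every call to the oracle, whether it asks for a degree or a neighbor.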
Stochastic Modeling of Distance to Collision for Robot Manipulators
Evaluating distance to collision for robot manipulators is useful for
assessing the feasibility of a robot configuration or for defining safe robot
motion in unpredictable environments. However, distance estimation is a
time-consuming operation, and the sensors involved in measuring the distance are
inherently noisy. A challenge thus exists in evaluating the expected distance to
collision for safer robot control and planning. In this work, we propose the
use of Gaussian process (GP) regression and the forward kinematics (FK) kernel
(a similarity function for robot manipulators) to efficiently and accurately
estimate distance to collision. We show that the GP model with the FK kernel
achieves 70 times faster distance evaluations compared to a standard geometric
technique, and up to 13 times more accurate evaluations compared to other
regression models, even when the GP is trained on noisy distance measurements.
We employ this technique in trajectory optimization tasks and observe 9 times
faster optimization than with the noise-free geometric approach, while
obtaining similar optimized motion plans. We also propose a confidence-based
hybrid model that uses model-based predictions in regions of high confidence
and switches to a more expensive sensor-based approach in other areas, and we
demonstrate the usefulness of this hybrid model in an application involving
reaching into a narrow passage.
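The GP machinery involved can be sketched in a few lines; the snippet below is a generic, dependency-free illustration that uses an RBF kernel as a stand-in for the paper's forward-kinematics (FK) kernel, and 1-D inputs instead of joint configurations:

```python
import math

def rbf(x, y, ell=0.5):
    """Stand-in kernel; the paper uses an FK (forward-kinematics) kernel."""
    return math.exp(-(x - y) ** 2 / (2 * ell ** 2))

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_mean(train_x, train_y, query_x, noise=1e-2):
    """GP posterior mean: k(q, X) @ (K + noise * I)^-1 @ y."""
    n = len(train_x)
    K = [[rbf(train_x[i], train_x[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, train_y)
    return sum(rbf(query_x, train_x[i]) * alpha[i] for i in range(n))
```

Training on noisy samples of a distance function and querying the posterior mean is then one call to gp_mean; the noise term on the diagonal is what lets the model absorb sensor noise rather than interpolate it.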
Proximity in the Age of Distraction: Robust Approximate Nearest Neighbor Search
We introduce a new variant of the nearest neighbor search problem, which
allows for some coordinates of the dataset to be arbitrarily corrupted or
unknown. Formally, given a dataset of points in high dimensions and a parameter
k, the goal is to preprocess the dataset such that, given a query point, one can
quickly compute a point in the dataset whose distance to the query is minimized
when the k "optimal" coordinates are ignored. Note that the coordinates being
ignored are a function of both the query point and the point returned.
We present a general reduction from this problem to answering ANN queries,
which is similar in spirit to LSH (locality sensitive hashing) [IM98].
Specifically, we give a sampling technique which achieves a bi-criterion
approximation for this problem: if the distance to the nearest neighbor after
ignoring k coordinates is some value r, the data structure returns a point
whose distance is not much larger than r after ignoring somewhat more than k
coordinates. We also present other applications and further extensions and
refinements of the above result. The new data structures are simple and
(arguably) elegant, and should be practical; specifically, all bounds are
polynomial in all relevant parameters, including the dimension of the space
and the robustness parameter k.
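The flavor of such a sampling-based reduction can be conveyed with a toy brute-force version (an illustration only, not the paper's actual reduction or guarantees; the names and trial count are made up): repeatedly project onto a random subset of coordinates, search in the projection, and score candidates by the exact "ignore the k worst coordinates" distance, which indeed depends on both the query and the candidate.

```python
import random

def dist_ignoring_k(p, q, k):
    """Exact objective: squared distance after dropping the k coordinates
    with the largest per-coordinate error (these depend on both p and q)."""
    errs = sorted(((a - b) ** 2 for a, b in zip(p, q)), reverse=True)
    return sum(errs[k:])

def robust_nn(data, q, k, trials, rng):
    """Toy sampling scheme: keep each coordinate with probability 1 - k/d,
    take the nearest point under that projection, and retain the best
    candidate across trials under the exact objective."""
    d = len(q)
    best = None
    for _ in range(trials):
        kept = [i for i in range(d) if rng.random() < 1 - k / d]
        cand = min(data, key=lambda p: sum((p[i] - q[i]) ** 2 for i in kept))
        if best is None or dist_ignoring_k(cand, q, k) < dist_ignoring_k(best, q, k):
            best = cand
    return best
```

A point that matches the query except in k corrupted coordinates is found whenever some trial happens to drop the corrupted coordinates; the real data structure amplifies exactly this event, with provable bounds.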
PolyDepth: Real-time Penetration Depth Computation using Iterative Contact-Space Projection
We present a real-time algorithm that finds the Penetration Depth (PD)
between general polygonal models based on iterative and local optimization
techniques. Given an in-collision configuration of an object in configuration
space, we find an initial collision-free configuration using several methods
such as centroid difference, maximally clear configuration, motion coherence,
random configuration, and sampling-based search. We project this configuration
onto a local contact space using a variant of a continuous collision detection
algorithm and construct a linear convex cone around the projected
configuration. We then formulate a new projection of the in-collision
configuration onto the convex cone as a Linear Complementarity Problem (LCP),
which we solve using a Gauss-Seidel-type iterative algorithm. We repeat this
procedure until a locally optimal PD is obtained. Our algorithm can process
complicated models consisting of tens of thousands of triangles at interactive
rates. Comment: Presented at ACM SIGGRAPH 2012. 15 pages, 23 figures.
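A projected Gauss-Seidel iteration of the kind referred to can be sketched generically (this solves a small dense LCP and is not the paper's implementation; real contact problems add structure and scale):

```python
def pgs_lcp(M, q, iters=100):
    """Projected Gauss-Seidel for the LCP: find z >= 0 such that
    w = M z + q >= 0 and z . w = 0; converges for the symmetric
    positive-definite matrices typical of contact problems."""
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Residual excluding the diagonal term, then clamp at zero.
            r = q[i] + sum(M[i][j] * z[j] for j in range(n) if j != i)
            z[i] = max(0.0, -r / M[i][i])
    return z
```

For M = [[2, 1], [1, 2]] and q = [-1, -1] the iteration converges to z = (1/3, 1/3), an interior solution with w = 0; flipping q[0] to +1 deactivates the first constraint and yields z = (0, 1/2).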
Peg-in-Hole Revisited: A Generic Force Model for Haptic Assembly
The development of a generic and effective force model for semi-automatic or
manual virtual assembly with haptic support is not a trivial task, especially
when the assembly constraints involve complex features of arbitrary shape. The
primary challenge lies in a proper formulation of the guidance forces and
torques that effectively assist the user in the exploration of the virtual
environment (VE), from repulsing collisions to attracting proper contact. The
secondary difficulty is that of efficient implementation that maintains the
standard 1 kHz haptic refresh rate. We propose a purely geometric model for an
artificial energy field that favors spatial relations leading to proper
assembly, differentiated to obtain forces and torques for general motions. The
energy function is expressed in terms of a cross-correlation of shape-dependent
affinity fields, precomputed offline separately for each object. We test the
effectiveness of the method using familiar peg-in-hole examples. We show that
the proposed technique unifies the two phases of free motion and precise
insertion into a single interaction mode and provides a generic model to
replace the ad hoc mating constraints or virtual fixtures, with no restrictive
assumption on the types of the involved assembly features. Comment: A shorter
version was presented at the ASME Computers and Information in Engineering
Conference (CIE'2014) (Best Paper Award).
Efficient Penetration Depth Computation between Rigid Models using Contact Space Propagation Sampling
We present a novel method to compute the approximate global penetration depth
(PD) between two non-convex geometric models. Our approach consists of two
phases: offline precomputation and run-time queries. In the first phase, our
formulation uses a novel sampling algorithm to precompute an approximation of
the high-dimensional contact space between the pair of models. As compared with
prior random sampling algorithms for contact space approximation, our
propagation sampling considerably speeds up the precomputation and yields a
high quality approximation. At run-time, we perform a nearest-neighbor query
and local projection to efficiently compute the translational or generalized
PD. We demonstrate the performance of our approach on complex 3D benchmarks
with tens or hundreds of thousands of triangles, and we observe significant
improvement over previous methods in terms of accuracy, with a modest
improvement in the run-time performance. Comment: 10 pages. Added the
acknowledgements.
LSwarm: Efficient Collision Avoidance for Large Swarms with Coverage Constraints in Complex Urban Scenes
In this paper, we address the problem of collision avoidance for a swarm of
UAVs used for continuous surveillance of an urban environment. Our method,
LSwarm, efficiently avoids collisions with static obstacles, dynamic obstacles
and other agents in 3-D urban environments while considering coverage
constraints. LSwarm computes collision avoiding velocities that (i) maximize
the conformity of an agent to an optimal path given by a global coverage
strategy and (ii) ensure sufficient resolution of the coverage data collected
by each agent. Our algorithm is formulated based on ORCA (Optimal Reciprocal
Collision Avoidance) and is scalable with respect to the size of the swarm. We
evaluate the coverage performance of LSwarm in realistic simulations of a swarm
of quadrotors in complex urban models. In practice, our approach can compute
collision avoiding velocities for a swarm composed of tens to hundreds of
agents in a few milliseconds on dense urban scenes consisting of tens of
buildings. Comment: 11 pages.
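The velocity-selection step can be illustrated with a brute-force stand-in for ORCA (ORCA itself intersects half-plane constraints analytically; the sketch below merely samples candidate velocities and keeps the collision-free one closest to the preferred velocity supplied by the coverage path; all parameters are illustrative):

```python
import math
import random

def pick_velocity(pos, pref_vel, obstacles, vmax, horizon, rng, samples=200):
    """Sample candidate velocities and return the one closest to the
    preferred velocity among those predicted to stay clear of every
    (moving) obstacle over the horizon.
    Obstacles are (position, velocity, combined_radius) triples."""
    def clear(v):
        for (op, ov, rad) in obstacles:
            for t in (horizon * s / 10 for s in range(11)):
                px, py = pos[0] + v[0] * t, pos[1] + v[1] * t
                ox, oy = op[0] + ov[0] * t, op[1] + ov[1] * t
                if math.hypot(px - ox, py - oy) < rad:
                    return False
        return True

    best, best_cost = (0.0, 0.0), float("inf")
    for _ in range(samples):
        ang = rng.uniform(0, 2 * math.pi)
        spd = rng.uniform(0, vmax)
        v = (spd * math.cos(ang), spd * math.sin(ang))
        if clear(v):
            cost = math.hypot(v[0] - pref_vel[0], v[1] - pref_vel[1])
            if cost < best_cost:
                best, best_cost = v, cost
    return best

# With all obstacles far away, the chosen velocity tracks the preferred one.
v = pick_velocity((0.0, 0.0), (1.0, 0.0),
                  [((100.0, 100.0), (0.0, 0.0), 1.0)],
                  1.5, 2.0, random.Random(0), samples=500)
```

Minimizing the deviation from the preferred velocity is the same objective ORCA optimizes, which is what criterion (i) in the abstract, conformity to the global coverage path, corresponds to.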
Haptic Assembly Using Skeletal Densities and Fourier Transforms
Haptic-assisted virtual assembly and prototyping has seen significant
attention over the past two decades. However, in spite of the appealing
prospects, its adoption has been slower than expected. We identify the main
roadblocks as the inherent geometric complexities faced when assembling objects
of arbitrary shape, and the computation time limitation imposed by the
notorious 1 kHz haptic refresh rate. We addressed the first problem in a recent
work by introducing a generic energy model for geometric guidance and
constraints between features of arbitrary shape. In the present work, we
address the second challenge by leveraging Fourier transforms to compute the
constraint forces and torques. Our new concept of 'geometric energy' field is
computed automatically from a cross-correlation of 'skeletal densities' in the
frequency domain, and serves as a generalization of the manually specified
virtual fixtures or heuristically identified mating constraints proposed in the
literature. The formulation of the energy field as a convolution enables
efficient computation using fast Fourier transforms (FFT) on the graphics
processing unit (GPU). We show that our method is effective for low-clearance
assembly of objects of arbitrary geometric and syntactic complexity. Comment:
A shorter version was presented at the ASME Computers and Information in
Engineering Conference (CIE'2015) (Best Paper Award).
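The computational core is the convolution theorem: a cross-correlation in space becomes a pointwise product in the frequency domain. A dependency-free 1-D sketch follows (the paper works with 3-D skeletal densities on the GPU; the radix-2 FFT below requires power-of-two lengths):

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two.
    The inverse variant omits the final 1/N scaling (done by the caller)."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def circular_cross_correlation(a, b):
    """c[k] = sum_n a[n] * b[(n + k) mod N], computed in the frequency
    domain as IFFT(conj(FFT(a)) * FFT(b))."""
    n = len(a)
    fa = fft([complex(x) for x in a])
    fb = fft([complex(x) for x in b])
    prod = [fa[i].conjugate() * fb[i] for i in range(n)]
    return [v.real / n for v in fft(prod, invert=True)]
```

Evaluating all N shifts this way costs O(N log N) instead of O(N^2), which is what makes the 1 kHz haptic refresh rate reachable for dense density fields.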
Robotic bees: Algorithms for collision detection and prevention
In the following paper we discuss data structures suited to distance-threshold
queries, with real-life applications such as collision detection on robotic
bees in mind. We focus on spatial hashes designed to store 3D points and to
quickly determine which of them lie farther than a specified threshold from
any other point. We discuss related literature, explain in depth the chosen
data structure and its design criteria and operations, and analyze its speed
and memory efficiency.
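A minimal version of such a spatial hash might look as follows (an illustrative sketch, not the paper's design: the cell size equals the threshold r, so any point within r of p is guaranteed to fall in one of the 27 cells surrounding p's cell):

```python
import math
from collections import defaultdict

class SpatialHash:
    """Uniform-grid spatial hash for 3D points with threshold queries."""
    def __init__(self, r):
        self.r = r                       # query threshold == cell size
        self.cells = defaultdict(list)   # integer cell key -> points

    def _key(self, p):
        return tuple(math.floor(c / self.r) for c in p)

    def insert(self, p):
        self.cells[self._key(p)].append(p)

    def has_neighbor_within(self, p):
        """True iff some other stored point lies within distance r of p."""
        kx, ky, kz = self._key(p)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for q in self.cells.get((kx + dx, ky + dy, kz + dz), []):
                        if q == p:
                            continue
                        if math.dist(p, q) <= self.r:
                            return True
        return False
```

Insertion and each query touch a constant number of cells, so for roughly uniform point sets the expected query cost is independent of the total number of stored points, which is what makes the structure attractive for many small agents.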