2-Player Nash and Nonsymmetric Bargaining Games: Algorithms and Structural Properties
The solution to a Nash or a nonsymmetric bargaining game is obtained by
maximizing a concave function over a convex set, i.e., it is the solution to a
convex program. We show that each 2-player game whose convex program has linear
constraints admits a rational solution, and such a solution can be found in
polynomial time using only an LP solver. If, in addition, the game is succinct,
i.e., the coefficients in its convex program are "small", then its solution
can be found in strongly polynomial time. We also give a non-succinct linear
game whose solution can be found in strongly polynomial time.
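As a toy illustration of the convex program behind the Nash solution: for two players with disagreement point (d1, d2) and a single linear constraint u1 + u2 ≤ total, the Nash product (u1 − d1)(u2 − d2) is maximized on the Pareto frontier u2 = total − u1, where its logarithm is concave in u1 alone. The sketch below is our own illustration (function names and the ternary search are hypothetical, not the paper's LP-based method):

```python
import math

def nash_solution_2player(d1=0.0, d2=0.0, total=1.0, iters=200):
    """Nash bargaining over the linear feasible set u1 + u2 <= total, u_i >= d_i.

    Maximizes the Nash product (u1 - d1) * (u2 - d2). Along the Pareto
    frontier u2 = total - u1, the log of the product is concave in u1,
    so a ternary search suffices for this toy instance.
    """
    lo, hi = d1, total - d2

    def objective(u1):
        # log of the Nash product on the frontier; concave in u1
        return math.log(u1 - d1) + math.log(total - u1 - d2)

    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if objective(m1) < objective(m2):
            lo = m1
        else:
            hi = m2
    u1 = (lo + hi) / 2
    return u1, total - u1
```

With a symmetric disagreement point (0, 0) this recovers the familiar equal split; shifting d1 shifts the solution in player 1's favor.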
NR-SLAM: Non-Rigid Monocular SLAM
In this paper we present NR-SLAM, a novel non-rigid monocular SLAM system
founded on the combination of a Dynamic Deformation Graph with a Visco-Elastic
deformation model. The former enables our system to represent the dynamics of
the deforming environment as the camera explores, while the latter allows us to
model general deformations in a simple way. The presented system is able to
automatically initialize and extend a map, modeled by a sparse point cloud, in
deforming environments, which is refined with a sliding-window Deformable Bundle
Adjustment. This map serves as the basis for the estimation of the camera motion
and deformation and enables us to represent arbitrary surface topologies,
overcoming the limitations of previous methods. To assess the performance of
our system in challenging deforming scenarios, we evaluate it in several
representative medical datasets. In our experiments, NR-SLAM outperforms
previous deformable SLAM systems, achieving millimeter reconstruction accuracy
and bringing automated medical intervention closer. For the benefit of the
community, we make the source code public.
Comment: 12 pages, 7 figures, submitted to the IEEE Transactions on Robotics
(T-RO).
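A Kelvin–Voigt element is the simplest visco-elastic model: the restoring stress combines an elastic term proportional to deformation with a viscous term proportional to the rate of deformation. The toy sketch below is our own illustration (parameter values are hypothetical; this is not NR-SLAM's actual deformation energy) and simply relaxes one deformation-graph edge back toward its rest length:

```python
def viscoelastic_residual(stretch, stretch_rate, k_elastic=1.0, k_viscous=2.0):
    """Kelvin-Voigt model: stress = E * strain + eta * strain_rate.

    The elastic term resists deformation; the viscous term resists the
    *rate* of deformation, damping oscillations.
    """
    return k_elastic * stretch + k_viscous * stretch_rate

def relax_edge(initial_stretch=1.0, dt=0.01, steps=2000):
    """Overdamped relaxation of one edge toward its rest length (unit mass),
    integrated with semi-implicit Euler."""
    s, v = initial_stretch, 0.0
    for _ in range(steps):
        force = -viscoelastic_residual(s, v)
        v += force * dt
        s += v * dt
    return s
```

With the (hypothetical) critical-damping choice above, the stretch decays to rest without oscillating; a purely elastic term would ring indefinitely.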
Photometric single-view dense 3D reconstruction in endoscopy
Visual SLAM inside the human body will open the way to computer-assisted navigation in endoscopy. However, due to space limitations, medical endoscopes only provide monocular images, leading to systems lacking true scale. In this paper, we exploit the controlled lighting in colonoscopy to achieve the first in-vivo 3D reconstruction of the human colon using photometric stereo on a calibrated monocular endoscope. Our method works in a real medical environment, providing both a suitable in-place calibration procedure and a depth estimation technique adapted to the colon's tubular geometry. We validate our method on simulated colonoscopies, obtaining a mean error of 7% on depth estimation, which is below 3 mm on average. Our qualitative results on the EndoMapper dataset show that the method is able to correctly estimate the colon shape in real human colonoscopies, paving the way for true-scale monocular SLAM in endoscopy.
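To see why controlled lighting recovers absolute scale: with a calibrated point light co-located with the camera, a Lambertian surface point at distance r returns intensity I = g · ρ · cos(θ) / r², so measured brightness pins down metric depth. The toy inversion below is our own illustration (function name and values are hypothetical; the paper's method additionally handles in-place calibration and the colon's tubular geometry):

```python
import math

def depth_from_intensity(intensity, light_gain=1.0, albedo=0.8, cos_angle=1.0):
    """Invert the near-light Lambertian model I = g * rho * cos(theta) / r**2,
    giving r = sqrt(g * rho * cos(theta) / I).

    Assumes a calibrated point light at the camera center and known albedo;
    a purely illustrative single-pixel version of the idea.
    """
    return math.sqrt(light_gain * albedo * cos_angle / intensity)
```

The inverse-square falloff is what a scale-free monocular system cannot exploit: halving the distance quadruples the brightness, so intensity carries metric information.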
Maximum Edge-Disjoint Paths in k-sums of Graphs
We consider the approximability of the maximum edge-disjoint paths problem
(MEDP) in undirected graphs, and in particular, the integrality gap of the
natural multicommodity flow based relaxation for it. The integrality gap is
known to be Ω(√n) even for planar graphs due to a simple
topological obstruction, and a major focus, following earlier work, has been
understanding the gap if some constant congestion is allowed.
In this context, it is natural to ask for which classes of graphs does a
constant-factor constant-congestion property hold. It is easy to deduce that
for given constant bounds on the approximation and congestion, the class of
"nice" graphs is minor-closed. Is the converse true? Does every proper
minor-closed family of graphs exhibit a constant factor, constant congestion
bound relative to the LP relaxation? We conjecture that the answer is yes.
One stumbling block has been that such bounds were not known for bounded
treewidth graphs (or even treewidth 3). In this paper we give a polytime
algorithm which takes a fractional routing solution in a graph of bounded
treewidth and is able to integrally route a constant fraction of the LP
solution's value. Note that we do not incur any edge congestion. Previously
this was not known even for series parallel graphs which have treewidth 2. The
algorithm is based on a more general argument that applies to k-sums of
graphs in some graph family, as long as the graph family has a constant factor,
constant congestion bound. We then use this to show that such bounds hold for
the class of k-sums of bounded genus graphs.
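For intuition on integral routing without congestion, consider the easiest case, treewidth 1: in a tree every request has a unique path, and a single greedy pass routes a subset of requests edge-disjointly. This sketch is our own illustration (hypothetical names; the paper's algorithm for bounded treewidth is far more involved and comes with an approximation guarantee the greedy pass lacks):

```python
def tree_path(parent, u, v):
    """Unique u-v path in a rooted tree, returned as a set of (child, parent)
    edges. Climb both endpoints to the root; edges above the lowest common
    ancestor appear on both root-paths and cancel in the symmetric difference."""
    def to_root(x):
        edges = []
        while parent[x] is not None:
            edges.append((x, parent[x]))
            x = parent[x]
        return edges
    return set(to_root(u)) ^ set(to_root(v))

def greedy_edp_on_tree(parent, requests):
    """Route each request along its unique tree path, accepting it only if
    the path is edge-disjoint from all previously routed requests."""
    used, routed = set(), []
    for u, v in requests:
        path = tree_path(parent, u, v)
        if not path & used:
            used |= path
            routed.append((u, v))
    return routed
```

On a star with center 0, the requests (1, 2) and (3, 4) are accepted but (1, 3) is rejected, since every path passes through the center's edges.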
Smoothed Analysis of the Minimum-Mean Cycle Canceling Algorithm and the Network Simplex Algorithm
The minimum-cost flow (MCF) problem is a fundamental optimization problem
with many applications and seems to be well understood. Over the last half
century many algorithms have been developed to solve the MCF problem and these
algorithms have varying worst-case bounds on their running time. However, these
worst-case bounds are not always a good indication of the algorithms'
performance in practice. The Network Simplex (NS) algorithm needs an
exponential number of iterations for some instances, but it is considered the
best algorithm in practice and performs best in experimental studies. On the
other hand, the Minimum-Mean Cycle Canceling (MMCC) algorithm is strongly
polynomial, but performs badly in experimental studies.
To explain these differences in performance in practice, we apply the
framework of smoothed analysis. We show an upper bound of O(mn² log(n) log(φ))
for the number of iterations of the MMCC algorithm.
Here n is the number of nodes, m is the number of edges, and φ is a
parameter limiting the degree to which the edge costs are perturbed. We also
show a lower bound of Ω(m log(φ)) for the number of iterations of the
MMCC algorithm, which can be strengthened to Ω(mn) when
φ = Θ(n²). For the number of iterations of the NS algorithm we show a
smoothed lower bound of Ω(m · min(n, φ) · φ).
Comment: Extended abstract to appear in the proceedings of COCOON 2015.
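The core subroutine of MMCC is finding a minimum-mean cycle in the residual network, which Karp's dynamic program does in O(nm) time. A minimal sketch of that building block (our own illustration, not tied to the smoothed-analysis bounds above):

```python
def min_mean_cycle(n, edges):
    """Karp's algorithm for the minimum mean weight of a directed cycle.

    edges: list of (u, v, w) with nodes 0..n-1. Returns None if acyclic.
    d[k][v] is the minimum weight of a walk with exactly k edges ending at v,
    starting anywhere (all-zero initialization at level 0); the answer is
    min over v of max over k < n of (d[n][v] - d[k][v]) / (n - k).
    """
    INF = float("inf")
    d = [[INF] * n for _ in range(n + 1)]
    for v in range(n):
        d[0][v] = 0.0
    for k in range(1, n + 1):
        for u, v, w in edges:
            if d[k - 1][u] + w < d[k][v]:
                d[k][v] = d[k - 1][u] + w
    best = None
    for v in range(n):
        if d[n][v] == INF:
            continue  # no length-n walk ends here; v lies on no cycle
        worst = max((d[n][v] - d[k][v]) / (n - k)
                    for k in range(n) if d[k][v] < INF)
        best = worst if best is None else min(best, worst)
    return best
```

MMCC repeatedly cancels the cycle achieving this minimum in the residual network; the smoothed bounds above count how many such cancellations occur.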
Recognizing hyperelliptic graphs in polynomial time
Recently, a new set of multigraph parameters was defined, called
"gonalities". Gonality bears some similarity to treewidth, and is a relevant
graph parameter for problems in number theory and multigraph algorithms.
Multigraphs of gonality 1 are trees. We consider so-called "hyperelliptic
graphs" (multigraphs of gonality 2) and provide a safe and complete set of
reduction rules for such multigraphs, showing that for three of the flavors of
gonality, we can recognize hyperelliptic graphs in O(n log n + m) time, where n
is the number of vertices and m the number of edges of the multigraph.
Comment: 33 pages, 8 figures.
Rheology of moist food powders as affected by moisture content
Dynamic testing to determine the rheological characteristics of moist food powders (semolina, coarse wheat flour, potato starch) was carried out using a powder rheometer of a new construction. The unique feature of the rheometer is that the scale of shearing was confined to the thickness of the shearing band of the powder bed only. It was found that the flow pattern of moistened samples was noticeably and diversely affected by both moisture content (varying in the range of 0–15% w/w) and shear rate. The observed changes were statistically significant (p < 0.01) in all trials carried out. What is noteworthy about the conducted research is that at some shear rate values, the shear stress of the bed reached a maximum at specific moisture content levels, irrespective of the particle size of the bed. Such behavior may indicate complex interference of different powder shearing mechanisms in the presence of moisture. For beds consisting of larger particles, shear stress values decreased considerably with increasing moisture content. To explain this, modeling of the shearing process with the Discrete Element Method (DEM) was performed. The results obtained supported the idea that the friction coefficients of the particulate material were significantly reduced at higher moisture content of the powder bed over the whole range of shear rates applied.
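The DEM finding can be summarized with a Coulomb-type yield relation, τ = c + μ·σn: if moisture lowers the effective inter-particle friction coefficient μ, the shear stress of the bed drops at a fixed normal stress. A minimal numeric sketch (our own illustration; the coefficient values are hypothetical, not from the study, and the study's DEM model is far richer):

```python
def shear_stress(normal_stress, friction_coeff, cohesion=0.0):
    """Coulomb-type yield model for a powder bed: tau = c + mu * sigma_n.

    A lower effective friction coefficient mu (e.g. from moisture acting
    as a lubricant between particles) gives a lower shear stress at the
    same normal stress.
    """
    return cohesion + friction_coeff * normal_stress

# Hypothetical values: dry bed mu = 0.6, moist bed mu = 0.4, sigma_n = 10 kPa
tau_dry = shear_stress(10.0, 0.6)
tau_moist = shear_stress(10.0, 0.4)
```

This captures only the friction-reduction effect; the non-monotone maxima observed at specific moisture levels require the competing mechanisms (e.g. liquid-bridge cohesion) that the full DEM study addresses.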