Grafting Hypersequents onto Nested Sequents
We introduce a new Gentzen-style framework of grafted hypersequents that
combines the formalism of nested sequents with that of hypersequents. To
illustrate the potential of the framework, we present novel calculi for the
modal logics and , as well as for extensions of the
modal logics and with the axiom for shift
reflexivity. The latter of these extensions is also known as
in the context of deontic logic. All our calculi enjoy syntactic cut
elimination and can be used in backwards proof search procedures of optimal
complexity. The tableaufication of the calculi for and
yields simplified prefixed tableau calculi for these logics,
reminiscent of the simplified tableau system for , which might be
of independent interest.
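One way to picture the combination of the two formalisms is as a data structure: a nested sequent is a tree of ordinary sequents, and a grafted hypersequent pairs such a tree with a list of hypersequent components. This is only an illustrative sketch; the class names and the trunk/crown split are our rendering, not the paper's formal definition.

```python
from dataclasses import dataclass, field

@dataclass
class Sequent:
    # An ordinary two-sided sequent Gamma => Delta.
    left: list   # antecedent formulas
    right: list  # succedent formulas

@dataclass
class Nested:
    # A nested sequent: a sequent together with its bracketed
    # successor components [Gamma' => Delta'], themselves nested.
    sequent: Sequent
    children: list = field(default_factory=list)

@dataclass
class Grafted:
    # Illustrative grafted hypersequent: a nested-sequent part ("trunk")
    # combined with a list of hypersequent components ("crown",
    # written Gamma_1 => Delta_1 | ... | Gamma_n => Delta_n).
    trunk: Nested
    crown: list = field(default_factory=list)

# Example: a trunk with one nested child and a two-component crown.
root = Sequent(left=["A"], right=["B"])
child = Nested(sequent=Sequent(left=[], right=["C"]))
g = Grafted(trunk=Nested(sequent=root, children=[child]),
            crown=[Sequent(left=["D"], right=[]), Sequent(left=[], right=["E"])])
```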
Continuous Multiclass Labeling Approaches and Algorithms
We study convex relaxations of the image labeling problem on a continuous
domain with regularizers based on metric interaction potentials. The generic
framework ensures existence of minimizers and covers a wide range of
relaxations of the originally combinatorial problem. We focus on two specific
relaxations that differ in flexibility and simplicity -- one can be used to
tightly relax any metric interaction potential, while the other one only covers
Euclidean metrics but requires less computational effort. For solving the
nonsmooth discretized problem, we propose a globally convergent
Douglas-Rachford scheme, and show that a sequence of dual iterates can be
recovered in order to provide a posteriori optimality bounds. In a quantitative
comparison to two other first-order methods, the approach shows competitive
performance on synthetic and real-world images. By combining the method with
an improved binarization technique for nonstandard potentials, we were able to
routinely recover discrete solutions within 1%--5% of the global optimum for
the combinatorial image labeling problem.
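The Douglas-Rachford scheme mentioned above alternates proximal steps on the two parts of a split objective. As a minimal illustration (on a scalar toy problem, not the paper's labeling relaxation), the sketch below minimizes |x| + 0.5*(x - b)^2, whose optimum is the soft-thresholding of b:

```python
def prox_abs(z, t):
    # Proximal map of t*|x|: soft-thresholding.
    return max(abs(z) - t, 0.0) * (1.0 if z > 0 else -1.0)

def prox_quad(z, t, b):
    # Proximal map of t*0.5*(x - b)^2, available in closed form.
    return (z + t * b) / (1.0 + t)

def douglas_rachford(b, t=1.0, iters=500):
    z = 0.0  # driver sequence of the splitting
    for _ in range(iters):
        x = prox_abs(z, t)              # prox step on the nonsmooth part
        y = prox_quad(2 * x - z, t, b)  # prox step on the reflection
        z = z + y - x                   # Douglas-Rachford update
    return prox_abs(z, t)               # primal iterate recovered from z

x_star = douglas_rachford(3.0)  # analytic optimum: soft-threshold of 3 at 1, i.e. 2
```

In the paper's setting the two prox steps act on the (discretized) data and regularizer terms, and a dual sequence recovered along the way yields the a posteriori optimality bounds.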
Sublabel-Accurate Relaxation of Nonconvex Energies
We propose a novel spatially continuous framework for convex relaxations
based on functional lifting. Our method can be interpreted as a
sublabel-accurate solution to multilabel problems. We show that previously
proposed functional lifting methods optimize an energy which is linear between
two labels and hence require (often infinitely) many labels for a faithful
approximation. In contrast, the proposed formulation is based on a piecewise
convex approximation and therefore needs far fewer labels. In comparison to
recent MRF-based approaches, our method is formulated in a spatially continuous
setting and shows less grid bias. Moreover, in a local sense, our formulation
is the tightest possible convex relaxation. It is easy to implement and allows
an efficient primal-dual optimization on GPUs. We show the effectiveness of our
approach on several computer vision problems.
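The kind of first-order primal-dual optimization referred to above can be sketched on a much simpler model problem. The following is a generic Chambolle-Pock-style iteration for 1-D TV-regularized denoising, min_u 0.5*||u - f||^2 + lam*||Du||_1; it is an illustrative stand-in, not the paper's sublabel-accurate lifted formulation:

```python
import numpy as np

def tv_denoise_1d(f, lam, iters=3000, tau=0.25, sigma=0.25):
    f = np.asarray(f, dtype=float)
    D = lambda u: np.diff(u)  # forward differences
    # Adjoint of D (negative divergence with Neumann boundary):
    Dt = lambda p: np.concatenate(([-p[0]], p[:-1] - p[1:], [p[-1]]))
    u = f.copy()
    u_bar = u.copy()
    p = np.zeros(len(f) - 1)  # dual variable, one per edge
    for _ in range(iters):
        # Dual ascent step, then projection onto the box |p_i| <= lam.
        p = np.clip(p + sigma * D(u_bar), -lam, lam)
        # Primal descent step: closed-form prox of the quadratic data term.
        u_new = (u - tau * Dt(p) + tau * f) / (1.0 + tau)
        u_bar = 2 * u_new - u  # extrapolation
        u = u_new
    return u

def energy(u, f, lam):
    return 0.5 * np.sum((u - f) ** 2) + lam * np.sum(np.abs(np.diff(u)))
```

For a strong regularizer the minimizer is constant at the mean of f, which gives a quick sanity check of the iteration.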
Duality-based Higher-order Non-smooth Optimization on Manifolds
We propose a method for solving non-smooth optimization problems on
manifolds. In order to obtain superlinear convergence, we apply a Riemannian
Semi-smooth Newton method to a non-smooth non-linear primal-dual optimality
system based on a recent extension of Fenchel duality theory to Riemannian
manifolds. We also propose an inexact version of the Riemannian Semi-smooth
Newton method and prove conditions for local linear and superlinear
convergence. Numerical experiments on l2-TV-like problems confirm superlinear
convergence on manifolds with positive and negative curvature.
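The core mechanism of a semi-smooth Newton method is easy to see in one Euclidean dimension (the Riemannian machinery of the paper is omitted here): replace the classical derivative by an element of Clarke's generalized Jacobian. The sketch below solves the piecewise-linear, non-differentiable equation F(x) = max(x, 0) + x - 1 = 0, whose root is x* = 0.5:

```python
def F(x):
    # Non-smooth residual: kink at x = 0, root at x = 0.5.
    return max(x, 0.0) + x - 1.0

def generalized_derivative(x):
    # An element of Clarke's generalized Jacobian of F at x.
    return 2.0 if x > 0 else 1.0

def semismooth_newton(x0, tol=1e-12, max_iter=50):
    x = x0
    for k in range(max_iter):
        fx = F(x)
        if abs(fx) < tol:
            return x, k
        # Newton step using the generalized slope instead of F'(x).
        x = x - fx / generalized_derivative(x)
    return x, max_iter

root, steps = semismooth_newton(-1.0)
```

On piecewise-linear problems the iteration terminates finitely once the correct smooth piece is identified; in general one gets the local superlinear convergence the abstract refers to, under semi-smoothness assumptions.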