Feedback Vertex Set Inspired Kernel for Chordal Vertex Deletion
Given a graph G and a parameter k, the Chordal Vertex Deletion (CVD)
problem asks whether there exists a vertex subset of size at most k
that hits all induced cycles of size at least 4. The existence of a
polynomial kernel for CVD was a well-known open problem in the field of
Parameterized Complexity. Recently, Jansen and Pilipczuk resolved this question
affirmatively by designing a polynomial kernel for CVD, and asked whether a
kernel of smaller size can be designed. While we do not completely resolve this
question, we design a significantly smaller kernel, inspired by the
quadratic kernel for Feedback Vertex Set. Furthermore, we introduce the
notion of the independence degree of a vertex, which is our main conceptual
contribution.
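A graph is chordal exactly when it has no induced cycle of length at least 4 (a "hole"), so a CVD solution is a vertex set whose deletion leaves a chordal graph. The following is a minimal brute-force sketch of that solution check, not the kernelization from the paper; the function names and the adjacency-dict representation are ours.

```python
from itertools import combinations

def induces_cycle(adj, verts):
    """True if the subgraph induced by `verts` is a cycle:
    every vertex has induced degree exactly 2 and the subgraph is connected."""
    vs = set(verts)
    if any(sum(1 for u in adj[v] if u in vs) != 2 for v in verts):
        return False
    seen, stack = {verts[0]}, [verts[0]]
    while stack:  # DFS for connectivity
        v = stack.pop()
        for u in adj[v]:
            if u in vs and u not in seen:
                seen.add(u)
                stack.append(u)
    return len(seen) == len(verts)

def hits_all_holes(adj, S):
    """True if S hits every induced cycle of length >= 4,
    i.e. deleting S leaves a chordal graph (exponential-time check)."""
    rest = [v for v in adj if v not in S]
    for size in range(4, len(rest) + 1):
        for verts in combinations(rest, size):
            if induces_cycle(adj, list(verts)):
                return False
    return True
```

For example, a 4-cycle is itself a hole, so the empty set is not a solution, while deleting any single vertex of it is.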
An Approximate Kernel for Connected Feedback Vertex Set
The Feedback Vertex Set problem is a fundamental computational problem which has been the subject of intensive study in various domains of algorithmics. In this problem, one is given an undirected graph G and an integer k as input. The objective is to determine whether at most k vertices can be deleted from G such that the resulting graph is acyclic. The study of preprocessing algorithms for this problem has a long and rich history, culminating in the quadratic kernelization of Thomassé [SODA 2010].
However, it is known that when the solution is required to induce a connected subgraph (such a set is called a connected feedback vertex set), a polynomial kernelization is unlikely to exist and the problem is NP-hard to approximate below a factor of 2 (assuming the Unique Games Conjecture).
In this paper, we show that if one is interested in only preserving approximate solutions (even of quality arbitrarily close to the optimum), then there is a drastic improvement in our ability to preprocess this problem. Specifically, we prove that for every fixed 0<epsilon<1, graph G, and k in N, the following holds:
There is a polynomial time computable graph G' of size k^O(1) such that for every c >= 1, any c-approximate connected feedback vertex set of G' of size at most k is a c * (1+epsilon)-approximate connected feedback vertex set of G.
Our result adds to the set of approximate kernelization algorithms introduced by Lokshtanov et al. [STOC 2017]. As a consequence of our main result, we show that Connected Feedback Vertex Set can be approximated within a factor min{OPT^O(1), n^(1-delta)} in polynomial time for some delta > 0.
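The decision problem defined above can be verified by exhaustive search on small instances. This is a minimal brute-force sketch (not a kernelization or the paper's algorithm); the function names and adjacency-dict representation are ours.

```python
from itertools import combinations

def is_forest(adj, verts):
    """An undirected graph is acyclic (a forest) iff
    #edges == #vertices - #connected components."""
    vs = set(verts)
    edges = sum(1 for v in vs for u in adj[v] if u in vs) // 2
    comps, seen = 0, set()
    for v in vs:
        if v in seen:
            continue
        comps += 1
        seen.add(v)
        stack = [v]
        while stack:  # DFS over one component
            w = stack.pop()
            for u in adj[w]:
                if u in vs and u not in seen:
                    seen.add(u)
                    stack.append(u)
    return edges == len(vs) - comps

def has_fvs_of_size(adj, k):
    """True if deleting at most k vertices can make the graph acyclic
    (tries every subset of size <= k; exponential time)."""
    verts = list(adj)
    for size in range(k + 1):
        for S in combinations(verts, size):
            if is_forest(adj, [v for v in verts if v not in S]):
                return True
    return False
```

A triangle needs one deletion: `has_fvs_of_size` returns False for k = 0 and True for k = 1.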
Data Reductions and Combinatorial Bounds for Improved Approximation Algorithms
Kernelization algorithms in the context of Parameterized Complexity are often
based on a combination of reduction rules and combinatorial insights. In this
paper, we present a similar strategy for obtaining polynomial-time
approximation algorithms. Our method features the use of
approximation-preserving reductions, akin to the notion of parameterized
reductions. We exemplify this method by obtaining the currently best
approximation algorithms for Harmless Set, Differential and
Multiple Nonblocker, all of which can be considered in the context of
securing networks or information propagation.
A Linear Kernel for Planar Total Dominating Set
A total dominating set of a graph G is a vertex subset S such
that every vertex in G is adjacent to some vertex in S. Finding a total
dominating set of minimum size is NP-hard on planar graphs and W[2]-complete on
general graphs when parameterized by the solution size. By the meta-theorem of
Bodlaender et al. [J. ACM, 2016], there exists a linear kernel for Total
Dominating Set on graphs of bounded genus. Nevertheless, it is not clear how
such a kernel can be effectively constructed, nor how to obtain explicit
reduction rules with reasonably small constants. Following the approach of
Alber et al. [J. ACM, 2004], we provide an explicit kernel for Total Dominating
Set on planar graphs with O(k) vertices, where k is the size of the
solution. This result complements several known constructive linear kernels on
planar graphs for other domination problems such as Dominating Set, Edge
Dominating Set, Efficient Dominating Set, Connected Dominating Set, and Red-Blue
Dominating Set.

Comment: 33 pages, 13 figures
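The defining difference from ordinary domination is that vertices inside the set must themselves have a neighbor in the set. A minimal sketch of that check (our own function name and adjacency-dict representation, unrelated to the kernelization itself):

```python
def is_total_dominating(adj, S):
    """True if every vertex of the graph, including those in S,
    has at least one neighbor in S."""
    return all(any(u in S for u in adj[v]) for v in adj)
```

On a star with center 0 and leaves 1, 2, 3, the set {0} is a dominating set but not a total dominating set, because the center has no neighbor in {0}; {0, 1} is total dominating.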
Scalable Nonlinear Embeddings for Semantic Category-based Image Retrieval
We propose a novel algorithm for the task of supervised discriminative
distance learning by nonlinearly embedding vectors into a low dimensional
Euclidean space. We work in the challenging setting where supervision during
training is given as constraints on similar and dissimilar pairs. The proposed method
is derived by an approximate kernelization of a linear Mahalanobis-like
distance metric learning algorithm and can also be seen as a kernel neural
network. The number of model parameters and test time evaluation complexity of
the proposed method are O(dD) where D is the dimensionality of the input
features and d is the dimension of the projection space; this is in contrast
to the usual kernelization methods as, unlike them, the complexity does not
scale linearly with the number of training examples. We propose a stochastic
gradient based learning algorithm which makes the method scalable (w.r.t. the
number of training examples), while being nonlinear. We train the method with
up to half a million training pairs of 4096 dimensional CNN features. We give
empirical comparisons with relevant baselines on seven challenging datasets for
the task of low dimensional semantic category-based image retrieval.

Comment: ICCV 2015 preprint
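The parameter-count claim can be made concrete with a toy sketch: a learned d-by-D matrix followed by a fixed nonlinearity, trained by stochastic gradient steps on pairwise constraints. This is our own illustrative construction under stated assumptions (tanh as a stand-in nonlinearity, a contrastive loss), not the paper's kernel-approximation-based method.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 16, 4  # input and embedding dimensionality (toy sizes)
W = rng.normal(scale=0.1, size=(d, D))  # O(dD) parameters, independent of #training pairs

def embed(W, x):
    # nonlinear embedding into R^d; tanh is an illustrative choice here
    return np.tanh(W @ x)

def sgd_step(W, x1, x2, similar, lr=0.01, margin=1.0):
    """One stochastic-gradient step on a pairwise contrastive loss:
    pull similar pairs together, push dissimilar pairs beyond `margin`."""
    z1, z2 = np.tanh(W @ x1), np.tanh(W @ x2)
    diff = z1 - z2
    dist2 = diff @ diff
    if not similar and dist2 >= margin:
        return W  # dissimilar pair already far enough: zero gradient
    sign = 1.0 if similar else -1.0  # descend dist2, or ascend it for close dissimilar pairs
    grad = 2 * sign * (np.outer(diff * (1 - z1**2), x1)
                       - np.outer(diff * (1 - z2**2), x2))
    return W - lr * grad
```

Each step touches only the O(dD) entries of W, which is why the cost does not grow with the number of training examples, unlike classical kernelized metric learning.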