
    Upper and Lower Bounds for Weak Backdoor Set Detection

    We obtain upper and lower bounds on the running times of exponential-time algorithms for the detection of weak backdoor sets of 3CNF formulas, considering various base classes. These results include (omitting polynomial factors): (i) a 4.54^k algorithm to detect whether there is a weak backdoor set of at most k variables into the class of Horn formulas; (ii) a 2.27^k algorithm to detect whether there is a weak backdoor set of at most k variables into the class of Krom formulas. Both bounds improve the previously known bound of 6^k. We also prove a 2^k lower bound for these problems, subject to the Strong Exponential Time Hypothesis. Comment: A short version will appear in the proceedings of the 16th International Conference on Theory and Applications of Satisfiability Testing.
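    To make the parameter concrete, the sketch below spells out the weak-backdoor definition as a brute-force check in Python: try every variable set of size at most k and every assignment to it, and test whether the reduced formula is Horn and satisfiable. It is only a reference check of the definition (exponential in the number of variables), not the 4.54^k branching algorithm above; the clause encoding and helper names are illustrative assumptions.

```python
from itertools import combinations, product

# Assumed encoding: a clause is a frozenset of non-zero ints,
# v for a positive literal and -v for a negative one.

def reduce_formula(clauses, assignment):
    """Apply a partial assignment: drop satisfied clauses, delete falsified
    literals; return None if some clause is falsified outright."""
    out = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                                   # clause satisfied
        rest = frozenset(l for l in clause if abs(l) not in assignment)
        if not rest:
            return None                                # clause falsified
        out.append(rest)
    return out

def is_horn(clauses):
    """Horn formula: at most one positive literal per clause."""
    return all(sum(1 for l in clause if l > 0) <= 1 for clause in clauses)

def horn_sat(clauses):
    """Satisfiability of a Horn formula by forward chaining from all-False."""
    true_vars, changed = set(), True
    while changed:
        changed = False
        for clause in clauses:
            if any(l < 0 and -l not in true_vars for l in clause):
                continue                               # some body variable is still False
            heads = [l for l in clause if l > 0]
            if not heads:
                return False                           # violated all-negative clause
            if heads[0] not in true_vars:
                true_vars.add(heads[0])
                changed = True
    return True

def has_weak_horn_backdoor(clauses, k):
    """Naive test for a weak backdoor set of at most k variables into Horn."""
    variables = sorted({abs(l) for clause in clauses for l in clause})
    for size in range(k + 1):
        for backdoor in combinations(variables, size):
            for values in product([False, True], repeat=size):
                reduced = reduce_formula(clauses, dict(zip(backdoor, values)))
                if reduced is not None and is_horn(reduced) and horn_sat(reduced):
                    return True
    return False
```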

    Horn Renamability and Hypergraphs

    Satisfiability testing in the context of directed hypergraphs is discussed. A characterization of Horn-renamable formulae is given, and a subclass of SAT that belongs to $\mathcal{P}$ is described. An algorithm for Horn renaming with linear time complexity is presented.
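    The hypergraph characterization itself is not reproduced here, but the classical route to testing Horn renamability, Lewis's reduction to 2-SAT, makes the linear-time claim concrete for formulas of bounded clause width (e.g. 3CNF): a formula is Horn-renamable exactly when the 2-SAT instance formed by every pair of literals occurring together in a clause is satisfiable, reading the 2-SAT variable v as "rename variable v". The Python sketch below is that standard reduction plus a textbook SCC-based 2-SAT test; it is an illustrative alternative to the paper's directed-hypergraph algorithm, and all names are assumptions.

```python
from itertools import combinations

def horn_renamable(clauses):
    """Lewis-style test: one 2-clause per pair of literals that occur together
    in a clause; the 2-SAT variable v means 'rename (flip) variable v'."""
    pairs = [(a, b) for clause in clauses for a, b in combinations(clause, 2)]
    return two_sat_satisfiable(pairs)

def two_sat_satisfiable(clauses):
    """Aspvall-Plass-Tarjan: satisfiable iff no variable lies in the same
    strongly connected component of the implication graph as its negation."""
    variables = sorted({abs(l) for a, b in clauses for l in (a, b)})
    index = {}
    for v in variables:
        index[v], index[-v] = len(index), len(index) + 1
    n = len(index)
    graph, rgraph = [[] for _ in range(n)], [[] for _ in range(n)]
    for a, b in clauses:                     # (a or b) gives ~a -> b and ~b -> a
        graph[index[-a]].append(index[b]); rgraph[index[b]].append(index[-a])
        graph[index[-b]].append(index[a]); rgraph[index[a]].append(index[-b])
    order, seen = [], [False] * n            # Kosaraju pass 1: finishing order
    for start in range(n):
        if seen[start]:
            continue
        seen[start] = True
        stack = [(start, 0)]
        while stack:
            node, i = stack.pop()
            if i < len(graph[node]):
                stack.append((node, i + 1))
                nxt = graph[node][i]
                if not seen[nxt]:
                    seen[nxt] = True
                    stack.append((nxt, 0))
            else:
                order.append(node)
    comp, label = [-1] * n, 0                # Kosaraju pass 2: SCCs on the reverse graph
    for start in reversed(order):
        if comp[start] != -1:
            continue
        comp[start] = label
        stack = [start]
        while stack:
            node = stack.pop()
            for nxt in rgraph[node]:
                if comp[nxt] == -1:
                    comp[nxt] = label
                    stack.append(nxt)
        label += 1
    return all(comp[index[v]] != comp[index[-v]] for v in variables)

# (x1 v x2 v ~x3) and (~x1 v x2) become Horn after renaming x2:
print(horn_renamable([(1, 2, -3), (-1, 2)]))   # True
```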

    Linear Time Parameterized Algorithms via Skew-Symmetric Multicuts

    A skew-symmetric graph $(D=(V,A),\sigma)$ is a directed graph $D$ with an involution $\sigma$ on the set of vertices and arcs. In this paper, we introduce a separation problem, $d$-Skew-Symmetric Multicut, where we are given a skew-symmetric graph $D$, a family $\mathcal{T}$ of $d$-sized subsets of vertices, and an integer $k$. The objective is to decide whether there is a set $X\subseteq A$ of $k$ arcs such that every set $J$ in the family has a vertex $v$ such that $v$ and $\sigma(v)$ are in different connected components of $D'=(V,A\setminus (X\cup \sigma(X)))$. We give an algorithm for this problem which runs in time $O((4d)^{k}(m+n+\ell))$, where $m$ is the number of arcs in the graph, $n$ the number of vertices, and $\ell$ the length of the family given in the input. Using our algorithm, we show that Almost 2-SAT has an algorithm with running time $O(4^{k}k^{4}\ell)$, and we obtain algorithms for Odd Cycle Transversal and Edge Bipartization which run in time $O(4^{k}k^{4}(m+n))$ and $O(4^{k}k^{5}(m+n))$ respectively. This resolves an open problem posed by Reed, Smith and Vetta [Operations Research Letters, 2003] and improves upon the earlier almost-linear-time algorithm of Kawarabayashi and Reed [SODA, 2010]. We also show that Deletion q-Horn Backdoor Set Detection is a special case of 3-Skew-Symmetric Multicut, giving us an algorithm for Deletion q-Horn Backdoor Set Detection which runs in time $O(12^{k}k^{5}\ell)$. This gives the first fixed-parameter tractable algorithm for this problem, answering a question posed in a paper by a superset of the authors [STACS, 2013]. Using this result, we get an algorithm for Satisfiability which runs in time $O(12^{k}k^{5}\ell)$, where $k$ is the size of the smallest q-Horn deletion backdoor set and $\ell$ is the length of the input formula.
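    Because the problem statement stacks several quantifiers, a small verifier of the definition may help: given a candidate arc set X, delete X together with \sigma(X) and check that every set J in the family contains a vertex separated from its image under \sigma. The Python sketch below reads "connected components" as weakly connected components of the arc-deleted graph and uses the usual convention \sigma(uv) = \sigma(v)\sigma(u) for arcs; it only checks candidate solutions, is unrelated to the $O((4d)^{k}(m+n+\ell))$ branching algorithm above, and its container conventions are assumptions.

```python
def weakly_connected_components(vertices, arcs):
    """Union-find (with path halving) over the underlying undirected graph."""
    parent = {v: v for v in vertices}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in arcs:
        parent[find(u)] = find(v)
    return {v: find(v) for v in vertices}

def is_skew_symmetric_multicut(vertices, arcs, sigma, family, X):
    """Verify a candidate solution X: after deleting X and sigma(X), every set
    J in the family must contain some v with v and sigma(v) in different
    (weakly) connected components."""
    sigma_arc = lambda a: (sigma[a[1]], sigma[a[0]])   # arc involution sigma(uv) = sigma(v)sigma(u)
    removed = set(X) | {sigma_arc(a) for a in X}
    kept = [a for a in arcs if a not in removed]
    comp = weakly_connected_components(vertices, kept)
    return all(any(comp[v] != comp[sigma[v]] for v in J) for J in family)
```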

    Guarantees and Limits of Preprocessing in Constraint Satisfaction and Reasoning

    We present a first theoretical analysis of the power of polynomial-time preprocessing for important combinatorial problems from various areas in AI. We consider problems from Constraint Satisfaction, Global Constraints, Satisfiability, Nonmonotonic and Bayesian Reasoning under structural restrictions. All these problems involve two tasks: (i) identifying the structure in the input as required by the restriction, and (ii) using the identified structure to solve the reasoning task efficiently. We show that for most of the considered problems, task (i) admits polynomial-time preprocessing to a problem kernel whose size is polynomial in a structural problem parameter of the input, in contrast to task (ii), which does not admit such a reduction to a problem kernel of polynomial size, subject to a complexity-theoretic assumption. As a notable exception, we show that the consistency problem for the AtMost-NValue constraint admits a polynomial kernel consisting of a quadratic number of variables and domain values. Our results provide firm worst-case guarantees and theoretical boundaries for the performance of polynomial-time preprocessing algorithms for the considered problems. Comment: arXiv admin note: substantial text overlap with arXiv:1104.2541, arXiv:1104.556

    Deformable kernels for early vision

    Early vision algorithms often have a first stage of linear filtering that 'extracts' information from the image at multiple scales of resolution and multiple orientations. A common difficulty in the design and implementation of such schemes is that one feels compelled to discretize the space of scales and orientations coarsely in order to reduce computation and storage costs. A technique is presented that allows: 1) computing the best approximation of a given family using linear combinations of a small number of 'basis' functions; and 2) describing all finite-dimensional families, i.e., the families of filters for which a finite-dimensional representation is possible with no error. The technique is based on singular value decomposition and may be applied to generating filters in arbitrary dimensions and subject to arbitrary deformations. The relevant functional analysis results are reviewed and precise conditions for the decomposition to be feasible are stated. Experimental results are presented that demonstrate the applicability of the technique to generating multi-orientation, multi-scale 2D edge-detection kernels. Implementation issues are also discussed.
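    A minimal numpy sketch of the construction described above: sample a deformable family densely (here, an assumed first-derivative-of-Gaussian kernel steered over orientation), stack the sampled kernels as rows of a matrix, and take its SVD; the leading right singular vectors serve as the basis filters and the singular values indicate how many are needed for a given approximation error. The specific family, sizes, and rank are illustrative, not the paper's choices.

```python
import numpy as np

def oriented_kernel(theta, size=21, sigma=3.0):
    """First derivative of a Gaussian steered to orientation theta
    (one convenient example of a deformable filter family)."""
    r = np.arange(size) - size // 2
    x, y = np.meshgrid(r, r)
    u = x * np.cos(theta) + y * np.sin(theta)     # coordinate along theta
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return -u / sigma**2 * g

# Sample the family densely in orientation; each kernel becomes one matrix row.
thetas = np.linspace(0.0, np.pi, 64, endpoint=False)
family = np.stack([oriented_kernel(t).ravel() for t in thetas])

# SVD of the sampled family: right singular vectors are the basis kernels,
# singular values show how fast the approximation error decays with rank.
U, s, Vt = np.linalg.svd(family, full_matrices=False)
rank = 2
basis = Vt[:rank]                                 # basis filters (flattened)
coeffs = U[:, :rank] * s[:rank]                   # orientation-dependent weights

approx = coeffs @ basis
rel_err = np.linalg.norm(family - approx) / np.linalg.norm(family)
print(f"rank-{rank} relative approximation error: {rel_err:.2e}")
```

    For this particular family the error is already negligible at rank 2, reflecting the fact that a Gaussian first derivative is exactly steerable with two basis filters; less structured families need more components, which is what the singular value spectrum quantifies.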