    Heuristic Backtracking Algorithms for SAT

    In recent years, backtrack search SAT solvers have been the subject of dramatic improvements. These improvements have allowed SAT solvers to successfully replace BDDs in many areas of formal verification, and have also motivated the development of many new challenging problem instances, many of which are too hard for the current generation of SAT solvers. As a result, further improvements to SAT technology are expected to have key consequences in formal verification. The objective of this paper is to propose heuristic approaches to the backtrack step of backtrack search SAT solvers, with the goal of increasing the ability of the SAT solver to search different parts of the search space. The proposed backtrack heuristics are inspired by the heuristics proposed in recent years for the branching step of SAT solvers, namely VSIDS and some of its improvements. The preliminary experimental results are promising and motivate the integration of heuristic backtracking in state-of-the-art SAT solvers.
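
    As an illustration of the idea (a hypothetical sketch, not the paper's implementation), the VSIDS principle of bumping and decaying variable activity scores can be transferred from branching to backtracking: on a conflict, rather than returning to the most recent decision level, the solver jumps to the decision level of the most active decision variable. All names below are assumed for illustration.

    ```python
    # Hypothetical sketch of VSIDS-style heuristic backtracking: jump to the
    # decision level whose decision variable currently has the highest
    # activity, instead of the most recent level (chronological backtracking).
    DECAY = 0.95   # periodic decay factor, as in VSIDS
    BUMP = 1.0     # score added to variables involved in a conflict

    class HeuristicBacktracker:
        def __init__(self, num_vars):
            self.activity = [0.0] * (num_vars + 1)  # variables are 1-indexed
            self.trail = []  # (decision_level, decision_var) pairs, in order

        def on_conflict(self, conflict_vars):
            """Bump the activity of every variable in the conflict clause."""
            for v in conflict_vars:
                self.activity[v] += BUMP

        def decay_activities(self):
            """Periodically scale all activities down (VSIDS-style decay)."""
            self.activity = [a * DECAY for a in self.activity]

        def choose_backtrack_level(self):
            """Return the decision level of the most active decision variable."""
            if not self.trail:
                return 0
            level, _ = max(self.trail, key=lambda lv: self.activity[lv[1]])
            return level
    ```

    Unrestricted heuristic backjumping of this kind can interact with completeness, so a full solver would pair such a policy with clause learning or other safeguards.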

    SlowFuzz: Automated Domain-Independent Detection of Algorithmic Complexity Vulnerabilities

    Algorithmic complexity vulnerabilities occur when the worst-case time or space complexity of an application is significantly higher than the respective average case for particular user-controlled inputs. When such conditions are met, an attacker can launch Denial-of-Service attacks against a vulnerable application by providing inputs that trigger the worst-case behavior. Such attacks have been known to have serious effects on production systems, take down entire websites, or lead to bypasses of Web Application Firewalls. Unfortunately, existing detection mechanisms for algorithmic complexity vulnerabilities are domain-specific and often require significant manual effort. In this paper, we design, implement, and evaluate SlowFuzz, a domain-independent framework for automatically finding algorithmic complexity vulnerabilities. SlowFuzz automatically finds inputs that trigger worst-case algorithmic behavior in the tested binary, using resource-usage-guided evolutionary search techniques to find inputs that maximize computational resource utilization for a given application.
    Comment: ACM CCS '17, October 30-November 3, 2017, Dallas, TX, US
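
    In the spirit of the described approach (a toy sketch under stated assumptions, not SlowFuzz itself, which instruments compiled binaries), an evolutionary loop can treat observed resource usage as the fitness signal; the target here is assumed to be a Python function that reports its own step count.

    ```python
    # Toy resource-usage-guided evolutionary search: keep the mutant input
    # that maximizes the target's reported step count.
    import random

    def mutate(data: bytes) -> bytes:
        """Apply one random byte-level mutation (overwrite or insert)."""
        buf = bytearray(data)
        if buf and random.random() < 0.5:
            buf[random.randrange(len(buf))] = random.randrange(256)
        else:
            buf.insert(random.randrange(len(buf) + 1), random.randrange(256))
        return bytes(buf)

    def slow_search(target, seed: bytes, generations: int = 2000) -> bytes:
        """Evolve an input that maximizes the target's resource usage."""
        best, best_cost = seed, target(seed)
        for _ in range(generations):
            cand = mutate(best)
            cost = target(cand)
            if cost > best_cost:        # fitness = observed resource usage
                best, best_cost = cand, cost
        return best

    def insertion_sort_steps(data: bytes) -> int:
        """Example target: insertion sort, returning its swap count."""
        arr, steps = list(data), 0
        for i in range(1, len(arr)):
            j = i
            while j > 0 and arr[j - 1] > arr[j]:
                arr[j - 1], arr[j] = arr[j], arr[j - 1]
                j -= 1
                steps += 1
        return steps
    ```

    Running slow_search(insertion_sort_steps, bytes(16)) tends to drift toward descending byte sequences, insertion sort's quadratic worst case.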

    On Spectral Graph Embedding: A Non-Backtracking Perspective and Graph Approximation

    Graph embedding has been proven to be efficient and effective in facilitating graph analysis. In this paper, we present a novel spectral framework called NOn-Backtracking Embedding (NOBE), which offers a new perspective that organizes graph data at a deep level by tracking the flow traversing the edges with backtracking prohibited. Further, by analyzing the non-backtracking process, a technique called graph approximation is devised, which provides a channel to transform the spectral decomposition on an edge-to-edge matrix to that on a node-to-node matrix. Theoretical guarantees are provided by bounding the difference between the corresponding eigenvalues of the original graph and its graph approximation. Extensive experiments conducted on various real-world networks demonstrate the efficacy of our methods on both macroscopic and microscopic levels, including clustering and structural hole spanner detection.
    Comment: SDM 2018 (Full version including all proofs)
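
    For concreteness (a sketch of the standard non-backtracking, or Hashimoto, edge-to-edge matrix; the paper's exact operator may differ), each undirected edge is split into two directed edges, and a walk may continue through a node to any neighbor except the one it just came from.

    ```python
    # Build the non-backtracking matrix B of an undirected graph:
    # B[(u,v),(v,w)] = 1 iff w != u (no immediate U-turns).
    import numpy as np

    def non_backtracking_matrix(edges):
        """edges: iterable of undirected pairs (u, v), each listed once."""
        directed = []
        for u, v in edges:
            directed += [(u, v), (v, u)]  # two orientations per edge
        m = len(directed)
        B = np.zeros((m, m))
        for i, (u, v) in enumerate(directed):
            for j, (x, w) in enumerate(directed):
                if x == v and w != u:     # continue through v, no U-turn
                    B[i, j] = 1.0
        return B, directed

    # Example: triangle with a pendant edge.
    B, directed_edges = non_backtracking_matrix([(0, 1), (1, 2), (2, 0), (2, 3)])
    spectrum = np.linalg.eigvals(B)       # spectrum of the edge-to-edge operator
    ```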

    Automatic frequency assignment for cellular telephones using constraint satisfaction techniques

    We study the problem of automatic frequency assignment for cellular telephone systems. The frequency assignment problem is viewed as the problem of minimizing the number of unsatisfied soft constraints in a constraint satisfaction problem (CSP) over a finite domain of frequencies, involving co-channel, adjacent-channel, and co-site constraints. The soft constraints are automatically derived from signal strength prediction data. The CSP is solved using a generalized graph coloring algorithm. Graph-theoretical results play a crucial role in making the problem tractable. Performance results from a real-world frequency assignment problem are presented. We develop the generalized graph coloring algorithm by stepwise refinement, starting from DSATUR and augmenting it with local propagation, constraint lifting, intelligent backtracking, redundancy avoidance, and iterative deepening.
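
    For reference (a minimal sketch of the DSATUR starting point named in the abstract; the solver's refinements such as local propagation and intelligent backtracking are not shown), DSATUR repeatedly colors the uncolored vertex with the largest saturation degree, i.e. the most distinct colors among its neighbors. In the frequency assignment setting, colors play the role of frequencies and edges encode interference constraints.

    ```python
    # DSATUR greedy graph coloring: pick the uncolored vertex with the most
    # distinctly colored neighbors (ties broken by degree), then give it the
    # smallest color unused by its neighbors.
    def dsatur(adj):
        """adj: dict mapping vertex -> set of neighboring vertices."""
        color = {}
        while len(color) < len(adj):
            def saturation(v):
                return len({color[n] for n in adj[v] if n in color})
            v = max((u for u in adj if u not in color),
                    key=lambda u: (saturation(u), len(adj[u])))
            used = {color[n] for n in adj[v] if n in color}
            c = 0
            while c in used:              # smallest color not used nearby
                c += 1
            color[v] = c
        return color

    # Example: an odd cycle needs three colors.
    cycle5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
    print(dsatur(cycle5))
    ```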

    A New Look at the Easy-Hard-Easy Pattern of Combinatorial Search Difficulty

    The easy-hard-easy pattern in the difficulty of combinatorial search problems as constraints are added has been explained as due to a competition between the decrease in the number of solutions and increased pruning. We test the generality of this explanation by examining one of its predictions: if the number of solutions is held fixed by the choice of problems, then increased pruning should lead to a monotonic decrease in search cost. Instead, for some search methods, we find the easy-hard-easy pattern in median search cost even when the number of solutions is held constant. This generalizes previous observations of this pattern and shows that the existing theory does not explain the full range of the peak in search cost. In these cases the pattern appears to be due to changes in the size of the minimal unsolvable subproblems, rather than to changing numbers of solutions.
    Comment: See http://www.jair.org/ for any accompanying file
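
    To make "search cost" concrete (an illustrative harness, not the paper's experimental setup), studies of this kind typically instrument a backtracking solver with a node counter and track the median count over a problem ensemble as constraints are added.

    ```python
    # Chronological backtracking for a binary CSP, instrumented to count
    # expanded nodes; conflicts is a set of forbidden ((var, val), (var, val))
    # pairs.
    def backtrack_cost(variables, domains, conflicts):
        """Return (solution_found, nodes_expanded)."""
        nodes = 0

        def consistent(assign, var, val):
            return all(((var, val), (w, u)) not in conflicts and
                       ((w, u), (var, val)) not in conflicts
                       for w, u in assign.items())

        def search(assign):
            nonlocal nodes
            if len(assign) == len(variables):
                return True
            var = variables[len(assign)]
            for val in domains[var]:
                nodes += 1                      # one node per value tried
                if consistent(assign, var, val):
                    assign[var] = val
                    if search(assign):
                        return True
                    del assign[var]
            return False

        return search({}), nodes

    # Two variables that must not both take value 0:
    print(backtrack_cost(['x', 'y'],
                         {'x': [0, 1], 'y': [0, 1]},
                         {(('x', 0), ('y', 0))}))   # (True, 3)
    ```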