    The Complexity of Finding Effectors

    The NP-hard EFFECTORS problem on directed graphs is motivated by applications in network mining, particularly the analysis of probabilistic information-propagation processes in social networks. In the corresponding model, the arcs carry probabilities, and a probabilistic diffusion process activates nodes via already activated neighbors with the probabilities specified by the arcs. The goal is to explain a given network activation state as well as possible using a minimum number of "effector nodes", which are selected before the activation process starts. We correct, complement, and extend previous work from the data mining community with a more thorough computational complexity analysis of EFFECTORS, identifying both tractable and intractable cases. To this end, we also exploit a parameterization measuring the "degree of randomness" (the number of "really" probabilistic arcs), which might prove useful for analyzing other probabilistic network diffusion problems as well.
    Comment: 28 pages
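    A minimal sketch of the kind of diffusion model the abstract describes, assuming an independent-cascade-style process; the data layout (a dict mapping arcs to probabilities), the function names, and the averaged symmetric-difference cost are illustrative assumptions, not taken from the paper.

        import random

        def simulate_diffusion(arcs, effectors, seed=None):
            # One probabilistic run: 'arcs' maps (u, v) to the probability that
            # an activated u activates v; 'effectors' are active before the start.
            rng = random.Random(seed)
            active = set(effectors)
            frontier = set(effectors)
            while frontier:
                newly_active = set()
                for (u, v), p in arcs.items():
                    if u in frontier and v not in active and rng.random() < p:
                        newly_active.add(v)
                active |= newly_active
                frontier = newly_active
            return active

        def explanation_cost(arcs, effectors, observed, runs=1000):
            # One plausible way to score how well an effector set explains an
            # observed activation state: average symmetric difference over runs.
            total = 0
            for i in range(runs):
                total += len(simulate_diffusion(arcs, effectors, seed=i) ^ observed)
            return total / runs

        # Tiny example: one certain arc and one "really" probabilistic arc
        # (the parameterization mentioned above counts arcs with 0 < p < 1).
        arcs = {("a", "b"): 1.0, ("b", "c"): 0.3}
        print(explanation_cost(arcs, effectors={"a"}, observed={"a", "b", "c"}))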

    Limits of Preprocessing

    We present a first theoretical analysis of the power of polynomial-time preprocessing for important combinatorial problems from various areas of AI. We consider problems from Constraint Satisfaction, Global Constraints, Satisfiability, and Nonmonotonic and Bayesian Reasoning. We show that, subject to a complexity-theoretic assumption, none of the considered problems can be reduced by polynomial-time preprocessing to a problem kernel whose size is polynomial in a structural problem parameter of the input, such as induced width or backdoor size. Our results provide a firm theoretical boundary for the performance of polynomial-time preprocessing algorithms for the considered problems.
    Comment: This is a slightly longer version of a paper that appeared in the proceedings of AAAI 2011
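    For reference, the notion of a problem kernel used here is the standard one from parameterized complexity; stated in our own notation (not taken from the paper), it reads:

        A kernelization for a parameterized problem $L \subseteq \Sigma^* \times \mathbb{N}$
        is a polynomial-time algorithm mapping each instance $(x, k)$ to an instance
        $(x', k')$ such that
        \[
            (x, k) \in L \iff (x', k') \in L
            \quad\text{and}\quad
            |x'| + k' \le g(k)
        \]
        for some computable function $g$.

    The kernel is a polynomial kernel if $g$ is a polynomial; the lower bounds described above rule out polynomial kernels with respect to structural parameters such as induced width or backdoor size, under the stated complexity-theoretic assumption.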

    Different Approaches to Proof Systems

    The classical approach to proof complexity perceives proof systems as deterministic, uniform, surjective, polynomial-time computable functions that map strings to (propositional) tautologies. This approach has been studied intensively since the late 1970s, and a lot of progress has been made. In recent years, research has begun to investigate alternative notions of proof systems. Interesting results stem from dropping the uniformity requirement, allowing oracle access, using quantum computations, or employing probabilism. These lead to different notions of proof systems, for which we survey recent results in this paper.
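    The classical notion referenced in the first sentence is the Cook–Reckhow definition of a propositional proof system; in our own notation (not quoted from the survey) it can be stated as:

        A function $f \colon \Sigma^* \to \mathrm{TAUT}$ is a proof system if $f$ is
        surjective and computable in polynomial time; a string $w$ with $f(w) = \varphi$
        is an $f$-proof of the tautology $\varphi$. The system $f$ is polynomially bounded
        if there is a polynomial $p$ such that every $\varphi \in \mathrm{TAUT}$ has an
        $f$-proof $w$ with $|w| \le p(|\varphi|)$.

    The alternative notions surveyed in the paper relax or modify parts of this definition, for example by dropping uniformity or allowing oracle, quantum, or probabilistic computation.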