
    Solution of Optimal Power Flow Problems using Moment Relaxations Augmented with Objective Function Penalization

    The optimal power flow (OPF) problem minimizes the operating cost of an electric power system. Applications of convex relaxation techniques to the non-convex OPF problem have been of recent interest, including work using the Lasserre hierarchy of "moment" relaxations to globally solve many OPF problems. By preprocessing the network model to eliminate low-impedance lines, this paper demonstrates the capability of the moment relaxations to globally solve large OPF problems that minimize active power losses for portions of several European power systems. Large problems with more general objective functions have thus far been computationally intractable for current formulations of the moment relaxations. To overcome this limitation, this paper proposes combining an objective function penalization with the moment relaxations. This combination yields feasible points with objective function values that are close to the global optimum of several large OPF problems. Compared to an existing penalization method, the combination of penalization and the moment relaxations eliminates the need to specify one of the penalty parameters and solves a broader class of problems. Comment: 8 pages, 1 figure, to appear in IEEE 54th Annual Conference on Decision and Control (CDC), 15-18 December 2015.
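
    The penalization idea above can be sketched on a generic lifted problem. The toy below forms the first-order (Shor) moment relaxation of a small quadratic program in cvxpy and adds a trace penalty to the objective, a standard low-rank-inducing surrogate. The matrices C and A, the constraint, and the penalty weight epsilon are illustrative placeholders, not the paper's OPF formulation.

```python
# A minimal sketch, not the paper's method: Shor (first-order moment)
# relaxation of a toy quadratic program with an objective penalty.
import cvxpy as cp
import numpy as np

n = 3
rng = np.random.default_rng(0)
C = rng.standard_normal((n, n)); C = C @ C.T   # illustrative cost matrix
A = rng.standard_normal((n, n)); A = A @ A.T   # illustrative constraint matrix

# Lift x to W ~ x x^T; requiring only W >= 0 (PSD) relaxes the rank-1
# condition that would make the problem equivalent to the original QCQP.
W = cp.Variable((n, n), PSD=True)

epsilon = 0.1  # penalty weight (hypothetical value)
# The trace penalty pushes the solver toward low-rank W, mimicking the role
# of objective-function penalization in recovering near-feasible points.
objective = cp.Minimize(cp.trace(C @ W) + epsilon * cp.trace(W))
constraints = [cp.trace(A @ W) <= 1.0, W[0, 0] == 1.0]

prob = cp.Problem(objective, constraints)
prob.solve()  # cvxpy's default SDP-capable solver
print("penalized relaxation value:", prob.value)
```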

    Limits of Preprocessing

    We present a first theoretical analysis of the power of polynomial-time preprocessing for important combinatorial problems from various areas of AI. We consider problems from Constraint Satisfaction, Global Constraints, Satisfiability, Nonmonotonic and Bayesian Reasoning. We show that, subject to a complexity-theoretic assumption, none of the considered problems can be reduced by polynomial-time preprocessing to a problem kernel whose size is polynomial in a structural problem parameter of the input, such as induced width or backdoor size. Our results provide a firm theoretical boundary for the performance of polynomial-time preprocessing algorithms for the considered problems. Comment: This is a slightly longer version of a paper that appeared in the proceedings of AAAI 2011.

    Guarantees and Limits of Preprocessing in Constraint Satisfaction and Reasoning

    We present a first theoretical analysis of the power of polynomial-time preprocessing for important combinatorial problems from various areas in AI. We consider problems from Constraint Satisfaction, Global Constraints, Satisfiability, Nonmonotonic and Bayesian Reasoning under structural restrictions. All these problems involve two tasks: (i) identifying the structure in the input as required by the restriction, and (ii) using the identified structure to solve the reasoning task efficiently. We show that for most of the considered problems, task (i) admits a polynomial-time preprocessing to a problem kernel whose size is polynomial in a structural problem parameter of the input, in contrast to task (ii), which does not admit such a reduction to a problem kernel of polynomial size, subject to a complexity-theoretic assumption. As a notable exception, we show that the consistency problem for the AtMost-NValue constraint admits a polynomial kernel consisting of a quadratic number of variables and domain values. Our results provide firm worst-case guarantees and theoretical boundaries for the performance of polynomial-time preprocessing algorithms for the considered problems. Comment: arXiv admin note: substantial text overlap with arXiv:1104.2541, arXiv:1104.556
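
    To make the notion of a problem kernel concrete, here is a minimal sketch of a classic kernelization (Buss's rule for Vertex Cover parameterized by the cover size k), chosen purely as an illustration of polynomial kernels; it is not one of the kernels studied in the paper.

```python
# Illustration only: Buss's polynomial kernel for Vertex Cover.
def vertex_cover_kernel(edges, k):
    """Shrink (edges, k) to an equivalent instance with at most k^2 edges."""
    edges = {frozenset(e) for e in edges}
    forced = set()          # vertices that must be in every size-<=k cover
    changed = True
    while changed and k > 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:       # v covers > k edges, so omitting v would
                forced.add(v)   # require > k other vertices in the cover
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if len(edges) > k * k:  # every remaining vertex covers <= k edges, so
        return None         # a size-<=k cover covers <= k^2 edges: answer NO
    return edges, k, forced  # kernel size is polynomial in the parameter k

# Example: a star with five leaves and budget k = 2 forces the center vertex.
print(vertex_cover_kernel([(0, i) for i in range(1, 6)], 2))
```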

    Tractable Pathfinding for the Stochastic On-Time Arrival Problem

    We present a new and more efficient technique for computing the route that maximizes the probability of on-time arrival in stochastic networks, also known as the path-based stochastic on-time arrival (SOTA) problem. Our primary contribution is a pathfinding algorithm that uses the solution to the policy-based SOTA problem (which has pseudo-polynomial time complexity in the time budget of the journey) as a search heuristic for the optimal path. In particular, we show that this heuristic can be exceptionally efficient in practice, effectively making it possible to solve the path-based SOTA problem as quickly as the policy-based SOTA problem. Our secondary contribution is the extension of policy-based preprocessing to path-based preprocessing for the SOTA problem. In the process, we also introduce Arc-Potentials, a more efficient generalization of Stochastic Arc-Flags that can be used for both policy- and path-based SOTA. After developing the pathfinding and preprocessing algorithms, we evaluate their performance on two different real-world networks. To the best of our knowledge, these techniques provide the most efficient computation strategy for the path-based SOTA problem for general probability distributions, both with and without preprocessing. Comment: Submission accepted by the International Symposium on Experimental Algorithms 2016 and published by Springer in the Lecture Notes in Computer Science series on June 1, 2016. Includes typographical corrections and modifications to preprocessing made after the initial submission to SODA'15 (July 7, 2014).
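
    For context, the policy-based SOTA problem that serves as the heuristic admits a simple dynamic program over discretized time budgets, pseudo-polynomial in the budget as noted above. The sketch below is illustrative only: the network, the travel-time distributions, and the discretization are made up, and the paper's formulation and optimizations are not reproduced.

```python
# A minimal discretized sketch of the policy-based SOTA recursion.
# u[i][t] = max probability of reaching the destination from node i
# within a budget of t time steps. Travel-time pmfs assume pmf[0] = 0
# (no zero-time traversal), so the DP over increasing t is well-founded.
import numpy as np

T = 30  # time budget, in discrete steps (illustrative)
# edges[i] = list of (j, pmf) where pmf[s] = P(travel time of (i,j) == s)
edges = {
    0: [(1, np.array([0.0, 0.5, 0.5])),        # 0 -> 1 takes 1 or 2 steps
        (2, np.array([0.0, 0.0, 0.0, 1.0]))],  # 0 -> 2 always takes 3 steps
    1: [(2, np.array([0.0, 0.2, 0.8]))],
    2: [],  # destination
}
dest = 2

u = {i: np.zeros(T + 1) for i in edges}
u[dest][:] = 1.0  # already at the destination: on time with probability 1

# Dynamic program over increasing budgets (pseudo-polynomial in T).
for t in range(1, T + 1):
    for i in edges:
        if i == dest:
            continue
        best = 0.0
        for j, pmf in edges[i]:
            # Convolution: spend s steps reaching j, then succeed from j
            # with the remaining budget t - s.
            s_max = min(len(pmf) - 1, t)
            val = sum(pmf[s] * u[j][t - s] for s in range(1, s_max + 1))
            best = max(best, val)
        u[i][t] = best

print("P(on-time from node 0, budget 5):", u[0][5])
```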

    The size of BDDs and other data structures in temporal logics model checking

    Temporal Logic Model Checking is a verification method in which we describe a system, the model, and then verify whether important properties, expressed as temporal logic formulae, hold in the system. Many Model Checking tools employ BDDs or some other data structure to represent sets of states. It has been empirically observed that the BDDs used in these algorithms may grow exponentially as the model and formula increase in size. We formally prove that no data structure of polynomial size can represent the set of valid initial states for all models and all formulae. This result holds for all data structures in which a state can be checked in polynomial time. Therefore, it holds not only for all types of BDDs regardless of variable ordering, but also for more powerful data structures, such as RBCs, MTBDDs, ADDs and SDDs. Thus, the size explosion of BDDs is not a limitation of these specific data representation structures but is unavoidable: every formalism used in the same way would lead to an exponential size blow-up.
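
    To make the setting concrete: model checkers compute with sets of states. The toy below computes a backward-reachability set explicitly with a Python set; BDDs and the other structures named above play the role of this explicit set, and the paper's lower bound says that any replacement supporting polynomial-time membership checks must grow exponentially on some model/formula pairs. The example model is made up.

```python
# Illustrative only: explicit-state backward reachability, the set-of-states
# computation that symbolic data structures such as BDDs are meant to compress.
def backward_reachable(transitions, goal):
    """States from which some path reaches a goal state."""
    reached = set(goal)
    frontier = set(goal)
    while frontier:
        # Predecessors of the frontier that have not been seen yet.
        new = {s for (s, t) in transitions if t in frontier and s not in reached}
        reached |= new
        frontier = new
    return reached

# Toy model: states 0..3 with transitions forming a chain 0 -> 1 -> 2 -> 3.
transitions = {(0, 1), (1, 2), (2, 3)}
print(backward_reachable(transitions, {3}))  # {0, 1, 2, 3}
```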

    Stochastic Divergence Minimization for Biterm Topic Model

    With the emergence and thriving development of social networks, huge numbers of short texts are accumulated and need to be processed. Inferring the latent topics of collected short texts is useful for understanding their hidden structure and predicting new content. Unlike conventional topic models such as latent Dirichlet allocation (LDA), the biterm topic model (BTM) was recently proposed for short texts to overcome the sparseness of document-level word co-occurrences by directly modeling the generation process of word pairs. Stochastic inference algorithms based on collapsed Gibbs sampling (CGS) and collapsed variational inference have been proposed for BTM. However, they either incur high computational complexity or rely on very crude estimates. In this work, we develop a stochastic divergence minimization inference algorithm for BTM to estimate latent topics more accurately and in a scalable way. Experiments demonstrate the superiority of our proposed algorithm over existing inference algorithms. Comment: 19 pages, 4 figures.
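
    For reference, below is a minimal sketch of BTM with the collapsed Gibbs sampler mentioned above, i.e., the CGS baseline rather than the proposed stochastic divergence minimization. The tiny corpus and the hyperparameters K, alpha, and beta are illustrative placeholders.

```python
# A minimal BTM-with-CGS sketch (baseline, not the paper's algorithm).
import itertools, random
import numpy as np

docs = [["apple", "fruit", "pie"], ["stock", "market", "fruit"]]
vocab = sorted({w for d in docs for w in d})
w2i = {w: i for i, w in enumerate(vocab)}
V, K, alpha, beta = len(vocab), 2, 1.0, 0.1  # illustrative hyperparameters

# BTM models unordered word pairs ("biterms") within each short text.
biterms = [(w2i[a], w2i[b]) for d in docs
           for a, b in itertools.combinations(d, 2)]

rng = random.Random(0)
z = [rng.randrange(K) for _ in biterms]          # topic of each biterm
n_k = np.zeros(K)                                # biterms per topic
n_kw = np.zeros((K, V))                          # word counts per topic
for (w1, w2), k in zip(biterms, z):
    n_k[k] += 1; n_kw[k, w1] += 1; n_kw[k, w2] += 1

for _ in range(50):                              # Gibbs sweeps
    for b, (w1, w2) in enumerate(biterms):
        k = z[b]                                 # remove current assignment
        n_k[k] -= 1; n_kw[k, w1] -= 1; n_kw[k, w2] -= 1
        # Standard BTM conditional: topic popularity times the probability
        # of drawing both words of the biterm from that topic.
        p = (n_k + alpha) * (n_kw[:, w1] + beta) * (n_kw[:, w2] + beta) \
            / ((2 * n_k + V * beta) * (2 * n_k + V * beta + 1))
        k = rng.choices(range(K), weights=p)[0]
        z[b] = k
        n_k[k] += 1; n_kw[k, w1] += 1; n_kw[k, w2] += 1

phi = (n_kw + beta) / (n_kw.sum(axis=1, keepdims=True) + V * beta)
print("topic-word distributions:\n", phi)
```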

    On the Complexity of Case-Based Planning

    We analyze the computational complexity of problems related to case-based planning: planning when a plan for a similar instance is known, and planning from a library of plans. We prove that planning from a single case has the same complexity as generative planning (i.e., planning "from scratch"); using an extended definition of cases, the complexity is reduced if the domain stored in the case is similar to the one in which plans are sought. Planning from a library of cases is shown to have the same complexity. In both settings, the complexity of planning remains, in the worst case, PSPACE-complete.

    Making Queries Tractable on Big Data with Preprocessing

    A query class is traditionally considered tractable if there exists a polynomial-time (PTIME) algorithm to answer its queries. When it comes to big data, however, PTIME algorithms often become infeasible in practice. A traditional and effective approach to coping with this is to preprocess data off-line, so that queries in the class can subsequently be evaluated on the data efficiently. This paper aims to provide a formal foundation for this approach in terms of computational complexity. (1) We propose a set of Π-tractable queries, denoted by ΠT0Q, to characterize classes of queries that can be answered in parallel poly-logarithmic time (NC) after PTIME preprocessing. (2) We show that several natural query classes are Π-tractable and are feasible on big data. (3) We also study a set ΠTQ of query classes that can be effectively converted to Π-tractable queries by re-factorizing their data and queries for preprocessing. We introduce a form of NC reductions to characterize such conversions. (4) We show that a natural query class is complete for ΠTQ. (5) We also show that ΠT0Q ⊂ P unless P = NC, i.e., the set ΠT0Q of all Π-tractable queries is properly contained in the set P of all PTIME queries. Nonetheless, ΠTQ = P, i.e., all PTIME query classes can be made Π-tractable via proper re-factorizations. This work is a step towards understanding the tractability of queries in the context of big data.
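
    The preprocess-then-query paradigm that the Π-tractability classes formalize can be sketched in a few lines: spend polynomial time once, offline, so that each subsequent query runs in logarithmic time. The example below (sorting, then binary search) illustrates only the paradigm, not the paper's query classes or NC reductions.

```python
# Illustration of the preprocess-then-query paradigm.
import bisect

def preprocess(records):
    """Offline PTIME step: sort once, O(n log n)."""
    return sorted(records)

def query(index, key):
    """Online step: O(log n) membership test on the preprocessed data."""
    i = bisect.bisect_left(index, key)
    return i < len(index) and index[i] == key

index = preprocess([42, 7, 19, 3, 88])    # done once, offline
print(query(index, 19), query(index, 5))  # True False
```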
