
    A hybrid constraint programming and semidefinite programming approach for the stable set problem

    This work presents a hybrid approach to solving the maximum stable set problem using constraint and semidefinite programming. The approach consists of two steps: subproblem generation and subproblem solution. First, we rank the variable domain values based on the solution of a semidefinite relaxation. Using this ranking, we generate the most promising subproblems first by exploring a search tree with a limited discrepancy strategy. The subproblems are then solved with a constraint programming solver. To strengthen the semidefinite relaxation, we propose to infer additional constraints from the discrepancy structure. Computational results show that the semidefinite relaxation is very informative: solutions of good quality are found in the first subproblems, or optimality is proven immediately. (Comment: 14 pages)
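    For intuition, the following is a minimal sketch of a limited discrepancy enumeration over binary branching decisions; it is not the paper's exact subproblem generator, and the function name and interface are ours. Decision 0 follows the value ranked best by the semidefinite relaxation and decision 1 deviates from it, so probes with fewer deviations (the most promising subproblems) are produced first.

```python
from itertools import combinations

def lds_probes(n_vars, max_discrepancy):
    """Enumerate 0/1 branching vectors in order of increasing discrepancy.

    A 0 in position i means "follow the SDP-preferred value for variable i",
    a 1 means "deviate from it". Vectors with fewer deviations are yielded
    first, so the most promising subproblems reach the CP solver first.
    """
    for d in range(max_discrepancy + 1):
        for deviations in combinations(range(n_vars), d):
            probe = [0] * n_vars
            for i in deviations:
                probe[i] = 1
            yield probe

# e.g. lds_probes(3, 1) yields [0,0,0], [1,0,0], [0,1,0], [0,0,1]
```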

    Physical accessible transformations on a finite number of quantum states

    We treat the usual probabilistic cloning, state separation, unambiguous state discrimination, etc., in a uniform framework. All these transformations can be regarded as special examples of generalized completely positive trace-non-increasing maps on a finite number of input states. From the system-ancilla model we construct the corresponding unitary implementation of pure → pure, pure → mixed, mixed → pure, and mixed → mixed state transformations in the whole system, and we obtain the necessary and sufficient conditions for the existence of the desired maps. We expect this work to be helpful in exploring what can be done with a finite set of input states. (Comment: 7 pages)
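    For the simplest case, a standard system-ancilla sketch of a probabilistic pure → pure transformation looks as follows; the notation is ours, not taken from the abstract. Measuring the ancilla flag in |a_0⟩ heralds success, and unitarity of U then forces a positivity condition relating the Gram matrices of the input and output states, which is the kind of necessary and sufficient condition the abstract refers to.

```latex
% Unitary implementation of a probabilistic pure -> pure transformation (sketch).
% |\Sigma\rangle: ancilla blank state, \{|a_l\rangle\}: orthonormal flag states,
% \gamma_i: success probability of mapping |\psi_i\rangle to |\phi_i\rangle.
U\bigl(|\psi_i\rangle \otimes |\Sigma\rangle\bigr)
  = \sqrt{\gamma_i}\,|\phi_i\rangle \otimes |a_0\rangle
  + \sum_{l \ge 1} c_{il}\,|\Phi_{il}\rangle \otimes |a_l\rangle .
```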

    On the Maximum Crossing Number

    Research about crossings is typically about minimization. In this paper, we consider maximizing the number of crossings over all possible ways to draw a given graph in the plane. Alpert et al. [Electron. J. Combin., 2009] conjectured that any graph has a convex straight-line drawing, i.e., a drawing with vertices in convex position, that maximizes the number of edge crossings. We disprove this conjecture by constructing a planar graph on twelve vertices that admits a non-convex drawing with more crossings than any convex one. Bald et al. [Proc. COCOON, 2016] showed that it is NP-hard to compute the maximum number of crossings of a geometric graph and that the weighted geometric case is NP-hard to approximate. We strengthen these results by showing hardness of approximation even for the unweighted geometric case, and we prove that the unweighted topological case is NP-hard. (Comment: 16 pages, 5 figures)
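    As a point of reference, counting the crossings of a fixed convex straight-line drawing is straightforward, since two vertex-disjoint edges cross exactly when their endpoints alternate around the hull. The sketch below is our own illustrative code, not from the paper; maximizing over convex drawings then amounts to maximizing this count over circular orders of the vertices.

```python
from itertools import combinations

def convex_crossings(edges, order):
    """Count edge crossings in a convex straight-line drawing.

    `order[v]` is the position of vertex v around the convex hull.
    Two vertex-disjoint edges cross iff their endpoints alternate around
    the hull, i.e. exactly one endpoint of one edge lies on the open arc
    between the endpoints of the other.
    """
    def crosses(e, f):
        a, b = sorted((order[e[0]], order[e[1]]))
        c, d = order[f[0]], order[f[1]]
        if len({a, b, c, d}) < 4:          # edges sharing a vertex never cross
            return False
        return (a < c < b) != (a < d < b)  # exactly one endpoint inside the arc
    return sum(crosses(e, f) for e, f in combinations(edges, 2))

# K4 drawn in convex position has exactly one crossing:
# convex_crossings([(0,1),(1,2),(2,3),(3,0),(0,2),(1,3)], {i: i for i in range(4)}) == 1
```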

    Incremental Medians via Online Bidding

    In the k-median problem we are given sets of facilities and customers, and distances between them. For a given set F of facilities, the cost of serving a customer u is the minimum distance between u and a facility in F. The goal is to find a set F of k facilities that minimizes the sum, over all customers, of their service costs. Following Mettu and Plaxton, we study the incremental medians problem, where k is not known in advance and the algorithm produces a nested sequence of facility sets in which the kth set has size k. The algorithm is c-cost-competitive if the cost of each set is at most c times the cost of the optimum set of the same size. We give improved incremental algorithms for the metric version: an 8-cost-competitive deterministic algorithm, a (2e ≈ 5.44)-cost-competitive randomized algorithm, a (24+ε)-cost-competitive, poly-time deterministic algorithm, and a (6e+ε ≈ 16.31)-cost-competitive, poly-time randomized algorithm. The algorithm is s-size-competitive if the cost of the kth set is at most the minimum cost of any set of size k, and the kth set has size at most s·k. The optimal size-competitive ratios for this problem are 4 (deterministic) and e (randomized). We present the first poly-time O(log m)-size-approximation algorithm for the offline problem and the first poly-time O(log m)-size-competitive algorithm for the incremental problem. Our proofs reduce incremental medians to the following online bidding problem: faced with an unknown threshold T, an algorithm submits "bids" until it submits a bid that is at least the threshold; it pays the sum of all its bids. We prove that folklore algorithms for online bidding are optimally competitive. (Comment: the conference version appeared in LATIN 2006 as "Oblivious Medians via Online Bidding".)
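    To make the online bidding problem concrete, below is a minimal sketch (our own, with a hypothetical function name) of the folklore deterministic strategy: keep doubling the bid until it reaches the unknown threshold T. For T at least the first bid, the total payment is less than 4T, i.e. the strategy is 4-competitive, matching the deterministic ratio of 4 quoted above.

```python
def doubling_bids(threshold, first_bid=1.0):
    """Folklore doubling strategy for online bidding.

    Keep doubling the bid until it reaches the unknown threshold T.
    The total paid is the sum of all bids submitted; for T >= first_bid
    this sum is less than 4*T, so the strategy is 4-competitive.
    """
    paid, bid = 0.0, first_bid
    while bid < threshold:
        paid += bid
        bid *= 2.0
    return paid + bid  # the final, winning bid is also paid

# doubling_bids(10.0) submits 1, 2, 4, 8, 16 and pays 31 < 4 * 10
```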

    Spotting Trees with Few Leaves

    We show two results related to the Hamiltonicity and k-Path algorithms in undirected graphs by Björklund [FOCS'10] and Björklund et al. [arXiv'10]. First, we demonstrate that the technique used can be generalized to finding some k-vertex tree with l leaves in an n-vertex undirected graph in O*(1.657^k 2^{l/2}) time. It can be applied as a subroutine to solve the k-Internal Spanning Tree (k-IST) problem in O*(min(3.455^k, 1.946^n)) time using polynomial space, improving upon previous algorithms for this problem. In particular, for the first time we break the natural barrier of O*(2^n). Second, we show that the iterated random bipartition employed by the algorithm can be improved whenever the host graph admits a vertex coloring with few colors; it can be an ordinary proper vertex coloring, a fractional vertex coloring, or a vector coloring. In effect, we show improved bounds for k-Path and Hamiltonicity in any graph of maximum degree Δ = 4, …, 12 or with vector chromatic number at most 8.

    Finite quantum tomography via semidefinite programming

    Using the convex semidefinite programming method and the superoperator formalism, we obtain finite quantum tomography for several classes of mixed quantum states, namely qudit tomography, N-qubit tomography, phase tomography, and coherent spin state tomography; the results obtained are in agreement with those of References \cite{schack,Pegg,Barnett,Buzek,Weigert}. (Comment: 25 pages)
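    As an illustration of the general idea (not the paper's superoperator construction), state reconstruction from measured expectation values can be posed directly as a semidefinite program. The sketch below uses the cvxpy modelling library; the function name and interface are assumptions.

```python
import numpy as np
import cvxpy as cp

def reconstruct_state(observables, expectations, dim):
    """Least-squares quantum state reconstruction as a semidefinite program.

    `observables` is a list of Hermitian matrices E_k and `expectations`
    the corresponding measured averages tr(rho E_k). The density matrix
    rho is constrained to be positive semidefinite with unit trace.
    """
    rho = cp.Variable((dim, dim), hermitian=True)
    residuals = [cp.real(cp.trace(rho @ E)) - m
                 for E, m in zip(observables, expectations)]
    problem = cp.Problem(cp.Minimize(cp.sum_squares(cp.hstack(residuals))),
                         [rho >> 0, cp.trace(rho) == 1])
    problem.solve()
    return rho.value

# Single-qubit example: Pauli-Z and Pauli-X expectation values of the |0> state
pauli_z = np.array([[1, 0], [0, -1]], dtype=complex)
pauli_x = np.array([[0, 1], [1, 0]], dtype=complex)
rho_hat = reconstruct_state([pauli_z, pauli_x], [1.0, 0.0], 2)
```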

    Uplifting manhood to wonderful heights? News coverage of the human costs of military conflict from World War I to Gulf War Two

    Domestic political support is an important factor constraining the use of American military power around the world. Although the dynamics of war support are thought to reflect a cost-benefit calculus, with costs represented by numbers of friendly war deaths, no previous study has examined how information about friendly, enemy, and civilian casualties is routinely presented to domestic audiences. This paper establishes a baseline measure of historical casualty reporting by examining New York Times coverage of five major wars that occurred over the past century. Despite important between-war differences in the scale of casualties, the use of conscription, the type of warfare, and the use of censorship, the frequency of casualty reporting and the framing of casualty reports have remained fairly consistent over the past 100 years. Casualties are rarely mentioned in American war coverage. When casualties are reported, it is often in ways that minimize or downplay the human costs of war.