
    Optimal Quantum Sample Complexity of Learning Algorithms

    In learning theory, the VC dimension of a concept class $C$ is the most common way to measure its "richness." In the PAC model, $\Theta\big(\frac{d}{\varepsilon} + \frac{\log(1/\delta)}{\varepsilon}\big)$ examples are necessary and sufficient for a learner to output, with probability $1-\delta$, a hypothesis $h$ that is $\varepsilon$-close to the target concept $c$, where $d$ is the VC dimension of $C$. In the related agnostic model, where the samples need not come from a $c \in C$, we know that $\Theta\big(\frac{d}{\varepsilon^2} + \frac{\log(1/\delta)}{\varepsilon^2}\big)$ examples are necessary and sufficient to output a hypothesis $h \in C$ whose error is at most $\varepsilon$ worse than that of the best concept in $C$. Here we analyze quantum sample complexity, where each example is a coherent quantum state. This model was introduced by Bshouty and Jackson, who showed that quantum examples are more powerful than classical examples in some fixed-distribution settings. However, Atici and Servedio, later improved by Zhang, showed that in the PAC setting quantum examples cannot be much more powerful: the required number of quantum examples is $\Omega\big(\frac{d^{1-\eta}}{\varepsilon} + d + \frac{\log(1/\delta)}{\varepsilon}\big)$ for all $\eta > 0$. Our main result is that quantum and classical sample complexity are in fact equal up to constant factors in both the PAC and agnostic models. We give two approaches. The first is a fairly simple information-theoretic argument that yields the above two classical bounds and the same bounds for quantum sample complexity up to a $\log(d/\varepsilon)$ factor. The second approach avoids this log-factor loss: by analyzing the behavior of the "Pretty Good Measurement" on the quantum state identification problems that correspond to learning, it shows that classical and quantum sample complexity are equal up to constant factors.

    Comment: 31 pages LaTeX. arXiv abstract shortened to fit the 1920-character limit. Version 3: many small changes, no change in result.
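
    To make the shape of these bounds concrete, here is a small sketch that evaluates the two Theta-expressions for sample parameters; the constant factor c is a stand-in, since the theorems fix the bounds only up to constant factors.

    import math

    def pac_sample_bound(d, eps, delta, c=1.0):
        # Evaluates c * (d/eps + log(1/delta)/eps); c is a placeholder
        # constant, since the Theta-bound is tight only up to constants.
        return c * (d / eps + math.log(1 / delta) / eps)

    def agnostic_sample_bound(d, eps, delta, c=1.0):
        # Same shape with 1/eps^2 dependence:
        # c * (d/eps^2 + log(1/delta)/eps^2).
        return c * (d / eps ** 2 + math.log(1 / delta) / eps ** 2)

    # Example: VC dimension 10, accuracy 0.1, confidence 0.95 (delta = 0.05).
    print(pac_sample_bound(10, 0.1, 0.05))       # ~130 examples, up to constants
    print(agnostic_sample_bound(10, 0.1, 0.05))  # ~1300 examples, up to constants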

    Complexity of and Algorithms for Borda Manipulation

    We prove that it is NP-hard for a coalition of two manipulators to compute how to manipulate the Borda voting rule. This resolves one of the last open problems in the computational complexity of manipulating common voting rules. Because of this NP-hardness, we treat computing a manipulation as an approximation problem where we try to minimize the number of manipulators. Based on ideas from bin packing and multiprocessor scheduling, we propose two new approximation methods to compute manipulations of the Borda rule. Experiments show that these methods significantly outperform the previous best known approximation method. We are able to find optimal manipulations in almost all the randomly generated elections tested. Our results suggest that, whilst computing a manipulation of the Borda rule by a coalition is NP-hard, computational complexity may provide only a weak barrier against manipulation in practice.
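
    As a rough illustration of the scheduling view the abstract alludes to, the sketch below greedily builds manipulator ballots for the Borda rule: every manipulator ranks the preferred candidate p first, and the remaining scores are handed out largest-first to the candidate with the most remaining slack, much like least-loaded-machine scheduling. The function and its greedy policy are illustrative assumptions, not the paper's two approximation methods.

    def greedy_borda_manipulation(base_scores, p, max_manipulators=20):
        # base_scores: candidate -> Borda score from the sincere voters.
        # Each of the k manipulator ballots gives p the top score m-1 and
        # places the remaining scores m-2, ..., 0 largest-first on the
        # candidate whose running total is currently lowest (most slack).
        m = len(base_scores)
        others = [c for c in base_scores if c != p]
        for k in range(1, max_manipulators + 1):
            p_total = base_scores[p] + k * (m - 1)
            totals = {c: base_scores[c] for c in others}
            ballots = []
            for _ in range(k):
                ballot, free = {p: m - 1}, set(others)
                for s in range(m - 2, -1, -1):              # largest score first
                    c = min(free, key=lambda x: totals[x])  # most slack left
                    ballot[c] = s
                    totals[c] += s
                    free.remove(c)
                ballots.append(ballot)
            if all(totals[c] < p_total for c in others):    # p wins outright
                return ballots
        return None

    # Example: p trails by a little; two ballots suffice here.
    print(greedy_borda_manipulation({"p": 4, "a": 6, "b": 5}, "p"))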

    Simplex minimisation for multiple-reference motion estimation

    This paper investigates the properties of the multiple-reference block motion field. Guided by the results of this investigation, the paper proposes three fast multiple-reference block matching motion estimation algorithms. The proposed algorithms are extensions of the single-reference simplex minimisation search (SMS) algorithm. The algorithms provide different degrees of compromise between prediction quality and computational complexity. Simulation results using a multi-frame memory of 50 frames indicate that the proposed multiple-reference SMS algorithms have a computational complexity comparable to that of single-reference full-search while still maintaining the prediction gain of multiple-reference motion estimation.
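
    The following is a minimal single-block sketch of simplex-based motion search over several reference frames, using SciPy's Nelder-Mead as the downhill-simplex engine and SAD as the matching cost; it only illustrates the idea and is not the paper's tuned multiple-reference SMS algorithm.

    import numpy as np
    from scipy.optimize import minimize

    def sad(cur_block, ref_frame, top, left, dx, dy):
        # Sum of absolute differences between the current block and the
        # displaced candidate block in one reference frame (clipped to bounds).
        h, w = cur_block.shape
        y = int(np.clip(round(top + dy), 0, ref_frame.shape[0] - h))
        x = int(np.clip(round(left + dx), 0, ref_frame.shape[1] - w))
        return float(np.abs(cur_block.astype(int)
                            - ref_frame[y:y + h, x:x + w].astype(int)).sum())

    def simplex_motion_estimate(cur_block, ref_frames, top, left):
        # Run a downhill-simplex (Nelder-Mead) search for (dx, dy) in each
        # reference frame and keep the overall best match.
        best = None
        for r, ref in enumerate(ref_frames):
            res = minimize(lambda v: sad(cur_block, ref, top, left, v[0], v[1]),
                           x0=[0.0, 0.0], method="Nelder-Mead")
            if best is None or res.fun < best[0]:
                best = (res.fun, r, tuple(int(round(x)) for x in res.x))
        return best  # (SAD cost, reference-frame index, (dx, dy))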

    Logical Algorithms meets CHR: A meta-complexity result for Constraint Handling Rules with rule priorities

    Full text link
    This paper investigates the relationship between the Logical Algorithms language (LA) of Ganzinger and McAllester and Constraint Handling Rules (CHR). We present a translation schema from LA to CHR-rp (CHR with rule priorities) and show that the meta-complexity theorem for LA can be applied to a subset of CHR-rp via inverse translation. Inspired by the high-level implementation proposal for Logical Algorithms by Ganzinger and McAllester, and based on a new scheduling algorithm, we propose an alternative implementation for CHR-rp that gives strong complexity guarantees and results in a new and accurate meta-complexity theorem for CHR-rp. It is furthermore shown that the translation from Logical Algorithms to CHR-rp, combined with the new CHR-rp implementation, satisfies the complexity requirements for the Logical Algorithms meta-complexity result to hold.

    Comment: To appear in Theory and Practice of Logic Programming (TPLP).
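
    To illustrate just the rule-priority discipline (not the paper's scheduling algorithm or its complexity bookkeeping), here is a toy agenda in which pending rule firings execute strictly in priority order, FIFO within a level; names such as PriorityAgenda are invented for the example.

    import heapq
    from itertools import count

    class PriorityAgenda:
        # Pending rule instances fire strictly in priority order
        # (lower number = higher priority), FIFO within a priority level.
        def __init__(self):
            self._heap, self._seq = [], count()

        def schedule(self, priority, rule_instance):
            heapq.heappush(self._heap, (priority, next(self._seq), rule_instance))

        def run(self):
            while self._heap:
                _, _, fire = heapq.heappop(self._heap)
                fire()  # firing a rule may schedule further instances

    agenda = PriorityAgenda()
    agenda.schedule(2, lambda: print("priority-2 rule fires second"))
    agenda.schedule(1, lambda: print("priority-1 rule fires first"))
    agenda.run()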