429 research outputs found

    Games for the Strategic Influence of Expectations

    We introduce a new class of games where each player's aim is to randomise her strategic choices in order to affect the other players' expectations aside from her own. The way each player intends to exert this influence is expressed through a Boolean combination of polynomial equalities and inequalities with rational coefficients. We offer a logical representation of these games as well as a computational study of the existence of equilibria. Comment: In Proceedings SR 2014, arXiv:1404.041
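    To make the setting concrete, here is a hypothetical two-player illustration (not taken from the paper) of a goal expressed as a Boolean combination of polynomial inequalities over the players' mixed strategies; the variable names x_a and y_b are assumed, standing for the probabilities the two players assign to actions a and b.

```latex
% Hypothetical goal for player 1 over her own mixed strategy x and player 2's
% mixed strategy y: "make the joint event (a,b) likely while keeping her own
% action a somewhat unpredictable".
\[
  \gamma_1 \;=\; \bigl( x_a \, y_b \geq \tfrac{1}{2} \bigr) \;\wedge\; \bigl( x_a \leq \tfrac{3}{4} \bigr)
\]
% Both atoms are polynomial (in)equalities with rational coefficients in the
% probability variables, and the goal is a Boolean combination of such atoms.
```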

    The quantum measurement problem and physical reality: a computation theoretic perspective

    Is the universe computable? If so, is it computationally a polynomial place, that is, one limited to polynomially bounded resources? In standard quantum mechanics, which permits infinite parallelism and the infinitely precise specification of states, a negative answer to both questions is not ruled out. On the other hand, empirical evidence suggests that NP-complete problems are intractable in the physical world. Likewise, computational problems known to be algorithmically uncomputable do not seem to be computable by any physical means. We suggest that this close correspondence between the efficiency and power of abstract algorithms on the one hand, and of physical computers on the other, finds a natural explanation if the universe is assumed to be algorithmic; that is, if physical reality is the product of discrete sub-physical information processing equivalent to the actions of a probabilistic Turing machine. This assumption can be reconciled with the observed exponentiality of quantum systems at microscopic scales, and the consequent possibility of implementing Shor's quantum polynomial-time algorithm at that scale, provided the degree of superposition is intrinsically and finitely upper-bounded. If this bound is associated with the quantum-classical divide (the Heisenberg cut), a natural resolution to the quantum measurement problem arises. From this viewpoint, macroscopic classicality is evidence that the universe is in BPP, and both questions raised above receive affirmative answers. A recently proposed computational model of quantum measurement, which relates the Heisenberg cut to the discreteness of Hilbert space, is briefly discussed. A connection to quantum gravity is noted. Our results are compatible with the philosophy that mathematical truths are independent of the laws of physics. Comment: Talk presented at "Quantum Computing: Back Action 2006", IIT Kanpur, India, March 2006

    Quantum Proofs

    Quantum information and computation provide a fascinating twist on the notion of proofs in computational complexity theory. For instance, one may consider a quantum computational analogue of the complexity class NP, known as QMA, in which a quantum state plays the role of a proof (also called a certificate or witness), and is checked by a polynomial-time quantum computation. For some problems, the fact that a quantum proof state could be a superposition over exponentially many classical states appears to offer computational advantages over classical proof strings. In the interactive proof system setting, one may consider a verifier and one or more provers that exchange and process quantum information rather than classical information during an interaction for a given input string, giving rise to quantum complexity classes such as QIP, QSZK, and QMIP* that represent natural quantum analogues of IP, SZK, and MIP. While quantum interactive proof systems inherit some properties from their classical counterparts, they also possess distinct and uniquely quantum features that lead to an interesting landscape of complexity classes based on variants of this model. In this survey we provide an overview of many of the known results concerning quantum proofs, computational models based on this concept, and properties of the complexity classes they define. In particular, we discuss non-interactive proofs and the complexity class QMA, single-prover quantum interactive proof systems and the complexity class QIP, statistical zero-knowledge quantum interactive proof systems and the complexity class QSZK, and multiprover interactive proof systems and the complexity classes QMIP, QMIP*, and MIP*. Comment: Survey published by NOW Publishers
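    For reference, the standard textbook-style definition of QMA that the survey builds on can be recalled as follows; the thresholds 2/3 and 1/3 are the conventional choices, and any constants bounded away from 1/2 give the same class.

```latex
% A promise problem A = (A_yes, A_no) is in QMA if there is a polynomial p and a
% polynomial-time uniform family of quantum verifier circuits {V_n} such that:
\begin{align*}
  x \in A_{\mathrm{yes}} &\;\Rightarrow\; \exists\, |\psi\rangle \text{ on } p(|x|) \text{ qubits:}\;
      \Pr[\,V \text{ accepts } (x, |\psi\rangle)\,] \geq \tfrac{2}{3},\\
  x \in A_{\mathrm{no}}  &\;\Rightarrow\; \forall\, |\psi\rangle:\;
      \Pr[\,V \text{ accepts } (x, |\psi\rangle)\,] \leq \tfrac{1}{3}.
\end{align*}
```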

    Absztrakt automaták és formális nyelvek = Abstract Automata and Formal Languages

    We summarized the fundamental results of the theory of finite automata networks in a monograph. We gave a new characterization of automata networks without the Letichevsky criterion. We introduced multi-tape automata equipped with certain symbol classes. We presented and investigated a novel cryptosystem based on automata theory that was discovered during our research. We gave new proofs of the Lyndon-Schützenberger theorem and the Shyr-Yu theorem. We generalized the notions of primitivity and periodicity of words, and determined which Marcus grammars are able to generate languages consisting of words of the given type. We found an iteration lemma for non-linear context-free languages. As a generalization of contextual string languages, we introduced and investigated hypergraph contextual grammars and languages. We characterized the union-free languages by means of automata. We described the connections between derivations given by various logical calculi and a given derivation system, as well as the normalization properties of the various calculi. In our research on new computing principles we described interval-valued computations as a new computational model. In our digital geometry research we investigated line segments, circles, hyperbolas and parabolas defined by digital distances.
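    For orientation, two of the classical notions at the centre of several of these results can be recalled as follows; these are standard statements from combinatorics of words, not results specific to this project.

```latex
% A word w is primitive if it is not a proper power of another word:
\[
  w \text{ is primitive} \iff \bigl( w = u^{n} \Rightarrow n = 1 \bigr).
\]
% Lyndon-Schützenberger theorem: for nonempty words u, v, w and exponents
% m, n, k >= 2, the equation u^m v^n = w^k forces u, v and w to be powers
% of a common word.
```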

    Algebrai logika; relativitáselmélet logikai struktúrájának vizsgálata = Algebraic logic; investigating the logical structure of relativity theory

    The reported project intends to continue the traditions of Gödel, Einstein and Tarski, deepening the results of the Gödel-Einstein collaboration and pursuing Tarski's programme for unifying science. Modern logic and metamathematics were created, in essence, by Gödel and Tarski. It is less well known that, beginning in 1948, Gödel worked closely with Einstein on relativity theory almost until the end of his life. He remained a logician in spirit, yet obtained fundamental breakthroughs in relativity as striking as his breakthroughs in logic and foundations. The theory of general relativistic spacetimes not admitting a global time was initiated by Gödel, and came to full blossom during the renaissance of black hole physics of the last 25 years. The present project was originally started in personal cooperation with Tarski and his collaborators (e.g. a joint book). The idea is to study logic, algebra, geometry, spacetime theory and relativity in a strong unity. A sample result of ours: we proved that, for big, slowly rotating black holes, the usual explanation in the literature of why such black holes contain closed timelike curves (CTCs) is flawed. It is not the gravitational frame-dragging effect that creates the CTCs; instead, a completely different kind of effect is at work: the light cones open up in the direction opposite to the rotation of the source, and this goes to such an extreme extent that CTCs are created. Our paper on this appears in the journal General Relativity and Gravitation.

    Rational proofs

    We study a new type of proof system, where an unbounded prover and a polynomial time verifier interact, on inputs a string x and a function f, so that the Verifier may learn f(x). The novelty of our setting is that there are no longer "good" or "malicious" provers, but only rational ones. In essence, the Verifier has a budget c and gives the Prover a reward r ∈ [0,c] determined by the transcript of their interaction; the prover wishes to maximize his expected reward; and his reward is maximized only if the verifier correctly learns f(x). Rational proof systems are as powerful as their classical counterparts for polynomially many rounds of interaction, but are much more powerful when we only allow a constant number of rounds. Indeed, we prove that if f ∈ #P, then f is computable by a one-round rational Merlin-Arthur game, where, on input x, Merlin's single message actually consists of sending just the value f(x). Further, we prove that CH, the counting hierarchy, coincides with the class of languages computable by a constant-round rational Merlin-Arthur game. Our results rely on a basic and crucial connection between rational proof systems and proper scoring rules, a tool developed to elicit truthful information from experts. United States. Office of Naval Research (Award number N00014-09-1-0597)
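    Since the connection to proper scoring rules is the crux, here is a minimal sketch, in Python, of the quadratic (Brier) scoring rule for a binary event. It only illustrates the general elicitation property that the expected score is maximized by reporting the true distribution; it makes no claim about the exact reward function used in the paper.

```python
# Minimal sketch: the quadratic (Brier) scoring rule is "proper", i.e. a
# rational agent maximizes its expected score by reporting the true probability.
# This truthful-elicitation property is what rational proof systems build on.

def brier_score(report: float, outcome: int) -> float:
    """Score for reporting probability `report` of outcome 1 when `outcome` occurred."""
    p = (1.0 - report, report)          # reported distribution over {0, 1}
    return 2.0 * p[outcome] - (p[0] ** 2 + p[1] ** 2)

def expected_score(report: float, true_p: float) -> float:
    """Expected score when outcome 1 actually occurs with probability `true_p`."""
    return true_p * brier_score(report, 1) + (1.0 - true_p) * brier_score(report, 0)

if __name__ == "__main__":
    true_p = 0.3
    reports = [i / 100.0 for i in range(101)]
    best = max(reports, key=lambda r: expected_score(r, true_p))
    print(f"true probability = {true_p}, best report on the grid = {best}")  # prints 0.3
```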

    Analysis in weak systems

    The authors survey and comment on their work on weak analysis. They describe the basic set-up of analysis in a feasible second-order theory and consider the impact of adding to it various forms of weak König's lemma. A brief discussion of the Baire category theorem follows. A strengthening of the feasible base theory, obtained (fundamentally) by the addition of a counting axiom, is then considered, and it is shown how Riemann integration can be developed in the stronger system. The paper finishes with three questions in weak analysis.
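    For readers unfamiliar with the terminology, the combinatorial principle referred to above can be stated as follows; this is the standard formulation of weak König's lemma, not the authors' specific axiomatization.

```latex
% Weak König's lemma (WKL):
\[
  \text{every infinite subtree } T \subseteq \{0,1\}^{<\omega}
  \text{ has an infinite path } f \in \{0,1\}^{\omega}.
\]
```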

    A single-shot measurement of the energy of product states in a translation invariant spin chain can replace any quantum computation

    In measurement-based quantum computation, quantum algorithms are implemented via sequences of measurements. We describe a translationally invariant finite-range interaction on a one-dimensional qudit chain and prove that a single-shot measurement of the energy of an appropriate computational basis state with respect to this Hamiltonian provides the output of any quantum circuit. The required measurement accuracy scales inverse polynomially with the size of the simulated quantum circuit. This shows that the implementation of energy measurements on generic qudit chains is as hard as the realization of quantum computation. Here a "measurement" is any procedure that samples from the spectral measure induced by the observable and the state under consideration. As opposed to measurement-based quantum computation, the post-measurement state is irrelevant. Comment: 19 pages, transition rules for the CA corrected
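    The notion of "sampling from the spectral measure" used here has a precise standard meaning, recalled below; this is the textbook quantum-mechanical definition rather than anything specific to this paper.

```latex
% For an observable H with spectral decomposition H = \sum_j E_j P_j and a state
% |\psi\rangle, a measurement is any procedure that outputs E_j with probability
\[
  \Pr[E_j] \;=\; \langle \psi \,|\, P_j \,|\, \psi \rangle ,
\]
% regardless of what state (if any) is left behind after the measurement.
```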

    Computational complexity of the landscape I

    We study the computational complexity of the physical problem of finding vacua of string theory which agree with data, such as the cosmological constant, and show that such problems are typically NP-hard. In particular, we prove that in the Bousso-Polchinski model, the problem is NP-complete. We discuss the issues this raises and the possibility that, even if we were to find compelling evidence that some vacuum of string theory describes our universe, we might never be able to find that vacuum explicitly. In a companion paper, we apply this point of view to the question of how early cosmology might select a vacuum. Comment: JHEP3 Latex, 53 pp, 2 .eps figures
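    To see why the flux scan resembles a subset-sum-style search, here is a toy brute-force sketch, assuming the standard Bousso-Polchinski form Lambda = -Lambda0 + (1/2) * sum_i n_i^2 q_i^2 with integer fluxes n_i; the charges, window and flux bound below are made-up illustrative numbers, not values from the paper.

```python
from itertools import product

# Toy Bousso-Polchinski scan: find integer flux vectors n whose cosmological
# constant Lambda = -Lambda0 + 0.5 * sum(n_i^2 * q_i^2) lands in a small window.
# Exhaustive search is exponential in the number of fluxes, which is the
# intuition behind the hardness result.

LAMBDA0 = 100.0                       # made-up bare (negative) contribution
CHARGES = [1.3, 2.7, 0.9, 3.1, 1.1]   # made-up flux charges q_i
WINDOW = (0.0, 1.0)                   # made-up target range for Lambda
MAX_FLUX = 6                          # search |n_i| <= MAX_FLUX

def cosmological_constant(n, q=CHARGES, lam0=LAMBDA0):
    return -lam0 + 0.5 * sum((ni * qi) ** 2 for ni, qi in zip(n, q))

hits = [
    n
    for n in product(range(-MAX_FLUX, MAX_FLUX + 1), repeat=len(CHARGES))
    if WINDOW[0] <= cosmological_constant(n) <= WINDOW[1]
]
print(f"{len(hits)} flux vectors land in the window; first few: {hits[:3]}")
```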

    Invariant Generation through Strategy Iteration in Succinctly Represented Control Flow Graphs

    We consider the problem of computing numerical invariants of programs, for instance bounds on the values of numerical program variables. More specifically, we study the problem of performing static analysis by abstract interpretation using template linear constraint domains. Such invariants can be obtained by Kleene iterations that are, in order to guarantee termination, accelerated by widening operators. In many cases, however, applying this form of extrapolation leads to invariants that are weaker than the strongest inductive invariant that can be expressed within the abstract domain in use. Another well-known source of imprecision of traditional abstract interpretation techniques stems from their use of join operators at merge nodes in the control flow graph. The mentioned weaknesses may prevent these methods from proving safety properties. The technique we develop in this article addresses both of these issues: contrary to Kleene iterations accelerated by widening operators, it is guaranteed to yield the strongest inductive invariant that can be expressed within the template linear constraint domain in use. It also eschews join operators by distinguishing all paths of loop-free code segments. Formally speaking, our technique computes the least fixpoint within a given template linear constraint domain of a transition relation that is succinctly expressed as an existentially quantified linear real arithmetic formula. In contrast to previously published techniques that rely on quantifier elimination, our algorithm is proved to have optimal complexity: we prove that the decision problem associated with our fixpoint problem is in the second level of the polynomial-time hierarchy. Comment: 35 pages, conference version published at ESOP 2011; this version is a CoRR version of our submission to Logical Methods in Computer Science
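    As a rough formalisation of the fixpoint this technique targets: fix template rows t_1, ..., t_k (for example the rows of an interval or octagon template). An abstract element is a vector of bounds d, and the invariant sought is the least d that is inductive for the initial states and the transition relation. The notation below is a generic reconstruction of the template-domain setting, not the authors' exact formulation.

```latex
% Template linear constraint domain: abstract elements are bound vectors
% d \in (\mathbb{R} \cup \{\pm\infty\})^k, concretized as
%   \gamma(d) = \{ x \in \mathbb{R}^n : t_i \cdot x \le d_i \text{ for all } i \}.
% The strongest inductive template invariant is the least d such that
\[
  \mathrm{Init} \subseteq \gamma(d)
  \quad\text{and}\quad
  \forall (x, x') \in R :\; x \in \gamma(d) \Rightarrow x' \in \gamma(d),
\]
% and strategy iteration computes this least fixpoint exactly, without widening.
```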