
    A generalized characterization of algorithmic probability

    An a priori semimeasure (also known as "algorithmic probability" or "the Solomonoff prior" in the context of inductive inference) is defined as the transformation, by a given universal monotone Turing machine, of the uniform measure on infinite strings. This paper shows that the class of a priori semimeasures can equivalently be defined as the class of transformations, by all compatible universal monotone Turing machines, of any continuous computable measure in place of the uniform measure. Some consideration is given to possible implications for the prevalent association of algorithmic probability with certain foundational statistical principles.
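
    For orientation, a minimal sketch of the standard construction the abstract refers to (notation assumed here, not taken from the paper): for a universal monotone Turing machine U and the uniform measure \lambda on infinite binary input sequences,

        \[ M_U(x) \;=\; \lambda\{\omega : U(\omega) \succeq x\} \;=\; \sum_{p\ \mathrm{minimal}\,:\,U(p) \succeq x} 2^{-|p|}, \]

    where U(\omega) \succeq x means that the output of U on input \omega extends the finite string x. In this notation, the result described above is that replacing \lambda by any continuous computable measure \mu, while ranging over all compatible universal monotone machines, yields the same class of semimeasures.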

    (Non-)Equivalence of Universal Priors

    Ray Solomonoff invented the notion of universal induction featuring an aptly termed "universal" prior probability function over all possible computable environments. The essential property of this prior was its ability to dominate all other such priors. Later, Levin introduced another construction: a mixture of all possible priors, or "universal mixture". These priors are well known to be equivalent up to multiplicative constants. Here, we seek to clarify further the relationships between these three characterisations of a universal prior (Solomonoff's, universal mixtures, and universally dominant priors). We see that the constructions of Solomonoff and Levin define an identical class of priors, while the class of universally dominant priors is strictly larger. We provide some characterisation of the discrepancy. Comment: 10 LaTeX pages, 1 figure
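
    To fix notation for the three notions being compared (a standard sketch, not quoted from the paper): Solomonoff's prior and Levin's universal mixture are commonly written as

        \[ M(x) \;=\; \sum_{p\ \mathrm{minimal}\,:\,U(p) \succeq x} 2^{-|p|}, \qquad \xi(x) \;=\; \sum_{\nu \in \mathcal{M}} w_\nu\, \nu(x), \quad w_\nu > 0,\ \sum_\nu w_\nu \le 1, \]

    where U is a universal monotone machine and \mathcal{M} enumerates the lower-semicomputable semimeasures. A prior \rho is universally dominant if for every \nu \in \mathcal{M} there is a constant c_\nu > 0 with \rho(x) \ge c_\nu\, \nu(x) for all x; "equivalent up to multiplicative constants" means domination in both directions between M and \xi.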

    Sequential Predictions based on Algorithmic Complexity

    This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = -log m, i.e. based on universal deterministic/one-part MDL. m is extremely close to Solomonoff's universal prior M, the latter being an excellent predictor in deterministic as well as probabilistic environments, where performance is measured in terms of convergence of posteriors or losses. Despite this closeness to M, it is difficult to assess the prediction quality of m, since little is known about the closeness of their posteriors, which are the important quantities for prediction. We show that for deterministic computable environments, the "posterior" and losses of m converge, but rapid convergence could only be shown on-sequence; the off-sequence convergence can be slow. In probabilistic environments, neither the posterior nor the losses converge, in general. Comment: 26 pages, LaTeX
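
    For reference, a sketch of the quantities involved (standard definitions, assumed rather than quoted from the paper): with U a universal monotone Turing machine,

        \[ Km(x) \;=\; \min\{|p| : U(p) \succeq x\}, \qquad m(x) \;=\; 2^{-Km(x)}, \qquad M(x) \;=\; \sum_{p\ \mathrm{minimal}\,:\,U(p) \succeq x} 2^{-|p|}, \]

    and prediction of the next symbol b after seeing x uses the "posterior" ratios m(b \mid x) = m(xb)/m(x) and M(b \mid x) = M(xb)/M(x). The convergence results summarized above concern these ratios and the losses of predictors based on them.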

    Mind before matter: reversing the arrow of fundamentality

    In this contribution to FQXi's essay contest 2018, I suggest that it is sometimes a step forward to reverse our intuition on "what is fundamental", a move that is somewhat reminiscent of the idea of noncommutative geometry. I argue that some foundational conceptual problems in physics and related fields motivate us to attempt such a reversal of perspective, and to take seriously the idea that an information-theoretic notion of observer ("mind") could in some sense be more fundamental than our intuitive idea of a physical world ("matter"). I sketch what such an approach could look like, and why it would complement but not contradict the view that the material world is the cause of our experience. Comment: Contribution to the 2018 FQXi essay contest "What is fundamental?"

    Solomonoff Prediction and Occam's Razor

    Algorithmic information theory gives an idealized notion of compressibility that is often presented as an objective measure of simplicity. It is suggested at times that Solomonoff prediction, or algorithmic information theory in a predictive setting, can deliver an argument to justify Occam's razor. This paper explicates the relevant argument and, by converting it into a Bayesian framework, reveals why it has no such justificatory force. The supposed simplicity concept is better perceived as a specific inductive assumption, the assumption of effectiveness. It is this assumption that is the characterizing element of Solomonoff prediction, and wherein its philosophical interest lies.
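
    The Bayesian reading referred to above can be made concrete with the mixture representation commonly used for the Solomonoff prior (a sketch in standard notation, with equality holding only up to multiplicative constants; not taken from the paper):

        \[ \xi(x) \;=\; \sum_{\nu} 2^{-K(\nu)}\, \nu(x), \]

    where the sum ranges over an effective enumeration of hypotheses \nu and K(\nu) is the prefix complexity of (an index for) \nu. Read as a Bayesian prior, the weight 2^{-K(\nu)} favours hypotheses with short descriptions; the point at issue is whether this weighting expresses an objective preference for simplicity or merely the inductive assumption of effectiveness.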

    From quantum foundations to quantum information protocols and back

    Physics has two main ambitions: to predict and to understand. Indeed, physics aims for the prediction of all natural phenomena. Prediction entails modeling the correlation between an action, the input, and what is subsequently observed, the output. Understanding, on the other hand, involves developing insightful principles and models that can explain the widest possible variety of correlations present in nature. Remarkably, advances in both prediction and understanding foster our physical intuition and, as a consequence, novel and powerful applications are discovered. Quantum mechanics is a very successful physical theory, both in terms of its predictive power and in its wide applicability. Nonetheless, and despite many decades of development, we do not yet have a proper physical intuition of quantum phenomena. I believe that improvements in our understanding of quantum theory will yield better, and more innovative, protocols, and vice versa. This dissertation aims at advancing our understanding and developing novel protocols. This is done through four approaches. The first one is to study quantum theory within a broad family of theories. In particular, we study quantum theory within the family of locally quantum theories. We find that the principle that singles out quantum theory from this family, thus connecting quantum local and nonlocal structure, is dynamical reversibility. This implies that the viability of large-scale quantum computing can be based on concrete physical principles that can be experimentally tested at a local level, without needing to test millions of qubits simultaneously. The second approach is to study quantum correlations from a black-box perspective, thus making as few assumptions as possible. The strategy is to study the completeness of quantum predictions by benchmarking them against alternative models. Three main results and applications come out of our study. Firstly, we prove that performing complete amplification of randomness starting from a source of arbitrarily weak randomness, a task that is impossible with classical resources, is indeed possible via nonlocality. This establishes, in our opinion, the strongest evidence for a truly random event in nature so far. Secondly, we prove that there exist finite events where quantum theory gives predictions as complete as any no-signaling theory can give, showing that the completeness of quantum theory is not an asymptotic property. Finally, we prove that maximally nonlocal theories can never be maximally random while quantum theory can, showing a trade-off between the nonlocality of a theory and its randomness capabilities. We also prove that quantum theory is not unique in this respect. The third approach we follow is to study quantum correlations in scenarios where some parties have a restriction on the available quantum degrees of freedom. The future progress of semi-device-independent quantum information depends crucially on our ability to bound the strength of these correlations. Here we provide a full characterization via a complete hierarchy of sets that approximate the target set from the outside. Each set can in turn be characterized using standard numerical techniques. One application of our work is certifying multidimensional entanglement device-independently. The fourth approach is to confront quantum theory with computer science principles. In particular, we establish two interesting implications for quantum theory that result from raising the Church-Turing thesis to the level of a postulate. Firstly, we show how different preparations of the same mixed state, indistinguishable according to the quantum postulates, become distinguishable when prepared computably. Secondly, we identify a new loophole for Bell-like experiments: if some parties in a Bell-like experiment use private pseudorandomness to choose their measurement inputs, the computational resources of an eavesdropper have to be limited in order to observe a proper violation of nonlocality.
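
    The nonlocality results summarized above are usually made quantitative through Bell inequalities such as CHSH. As a purely illustrative sketch (not code from the thesis; state, observables, and names are assumptions chosen for the example), the following computes the CHSH value of the standard two-qubit quantum strategy, which exceeds the classical bound of 2 and reaches Tsirelson's bound 2\sqrt{2}.

        # Illustrative only: CHSH value for the standard quantum strategy on the
        # maximally entangled state |phi+> = (|00> + |11>)/sqrt(2).
        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
        Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z
        phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

        # Alice's observables A0, A1 and Bob's observables B0, B1.
        A = [Z, X]
        B = [(Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)]

        def correlator(a_obs, b_obs, state):
            """Expectation value <state| a_obs (x) b_obs |state>."""
            return np.real(state.conj() @ np.kron(a_obs, b_obs) @ state)

        # CHSH = <A0 B0> + <A0 B1> + <A1 B0> - <A1 B1>
        chsh = (correlator(A[0], B[0], phi_plus) + correlator(A[0], B[1], phi_plus)
                + correlator(A[1], B[0], phi_plus) - correlator(A[1], B[1], phi_plus))
        print(chsh)  # ~2.828, i.e. 2*sqrt(2), above the classical limit of 2

    Local (classical) strategies cannot exceed a CHSH value of 2, so observing a value above 2 is the basic certificate of nonlocality that the randomness-amplification and pseudorandomness-loophole discussions above build on.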