3,971 research outputs found

    Cloud Computing and Cloud Automata as A New Paradigm for Computation

    Cloud computing addresses how to make the right resources available to the right computation so as to improve the scaling, resiliency and efficiency of that computation. We argue that cloud computing is indeed a new paradigm for computation with a higher order of artificial intelligence (AI), and put forward cloud automata as a new model of computation. A high-level AI requires infusing features that mimic human functioning into AI systems. One central feature is that humans learn all the time, and that learning is incremental. Consequently, for AI we need computational models that reflect incremental learning without stopping (sentience). These features are inherent in reflexive, inductive and limit Turing machines. To construct cloud automata, we use the mathematical theory of Oracles, which includes Turing-machine Oracles as a special case. We develop a hierarchical approach based on Oracles of different ranks that includes Oracle AI as a special case. Using a named-set approach, we describe an implementation of a high-performance edge cloud that combines hierarchical name-oriented networking with Oracle AI-based orchestration. We demonstrate how cloud automata with a control overlay allow microservice networks to be provisioned, monitored and reconfigured in response to non-deterministic fluctuations in their behavior, without interrupting the overall evolution of the computation.
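    The control-overlay idea in this abstract (monitoring and reconfiguring microservices while the computation keeps running) can be illustrated with a minimal, hypothetical sketch. The service names, latency SLO and scaling rule below are illustrative assumptions, not the paper's design or implementation.

```python
"""Hypothetical sketch: a control overlay that observes a small microservice
network and reconfigures it when noisy latency fluctuations appear, without
pausing the services themselves. All names and thresholds are assumptions."""

import random
from dataclasses import dataclass, field


@dataclass
class Microservice:
    name: str
    replicas: int = 1

    def observed_latency_ms(self) -> float:
        # Simulated, non-deterministic latency: more replicas -> lower expected latency.
        base = 120.0 / self.replicas
        return base + random.uniform(-10.0, 40.0)


@dataclass
class ControlOverlay:
    """Stands in for the Oracle-like supervisor of the abstract: it observes the
    services and issues reconfiguration commands as an overlay, while the
    services continue to run."""
    services: list = field(default_factory=list)
    latency_slo_ms: float = 80.0

    def monitor_and_reconfigure(self) -> None:
        for svc in self.services:
            latency = svc.observed_latency_ms()
            if latency > self.latency_slo_ms:
                svc.replicas += 1       # scale out on SLO violation
            elif latency < self.latency_slo_ms / 2 and svc.replicas > 1:
                svc.replicas -= 1       # scale in when comfortably under the SLO
            print(f"{svc.name}: latency={latency:.1f} ms, replicas={svc.replicas}")


if __name__ == "__main__":
    overlay = ControlOverlay(services=[Microservice("frontend"), Microservice("ledger")])
    for _ in range(5):                  # five monitoring epochs
        overlay.monitor_and_reconfigure()
```

    The point of the sketch is only the separation of concerns: the overlay holds the reconfiguration policy, and the services never stop while it acts on them.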

    Quantum Probability as an Application of Data Compression Principles

    Realist, no-collapse interpretations of quantum mechanics, such as Everett's, face the probability problem: how to justify the norm-squared (Born) rule from the wavefunction alone. While any basis-independent measure can only be norm-squared (by the Gleason-Busch theorem), various popular, non-wavefunction-based phenomenological measures - such as observer, outcome or world counting - are frequently demanded of Everettians. These alternatives conflict, however, with the wavefunction realism upon which Everett's approach rests, which seems to call for an objective, basis-independent measure based only on wavefunction amplitudes. The ability of quantum probabilities to destructively interfere with each other, however, makes it difficult to see how probabilities can be derived solely from amplitudes in an intuitively appealing way. I argue that the use of algorithmic probability can solve this problem, since the objective, single-case probability measure that wavefunction realism demands is exactly what algorithmic information theory was designed to provide. The result is an intuitive account of complex-valued amplitudes, as coefficients in an optimal lossy data compression, such that changes in algorithmic information content (entropy deltas) are associated with phenomenal transitions.
    Comment: In Proceedings PC 2016, arXiv:1606.0651
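    For orientation, the standard quantities this abstract connects can be stated in general form; this is background only, not the paper's derivation. The Born rule assigns outcome probabilities from amplitudes, algorithmic (Solomonoff) probability assigns a universal prior from program lengths, and Shannon coding ties a probability to an optimal code length.

```latex
% Born rule for a state |psi> = sum_i c_i |i> expanded in an orthonormal basis {|i>}
P(i) = \lvert \langle i \mid \psi \rangle \rvert^2 = \lvert c_i \rvert^2

% Algorithmic (Solomonoff) probability of a string x under a universal prefix machine U
m(x) = \sum_{p \,:\, U(p) = x} 2^{-\lvert p \rvert}

% Shannon code length: an optimal code assigns outcome i roughly
\ell(i) \approx -\log_2 P(i) = -\log_2 \lvert c_i \rvert^2
```

    On the abstract's reading, the squared amplitude plays the role of the probability whose negative logarithm gives a code length in an optimal compression of the wavefunction; how that identification is carried through for complex-valued, interfering amplitudes is the paper's contribution and is not reproduced here.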