118 research outputs found

    Martingales, Efficient Market Hypothesis and Kolmogorov’s Complexity Theory

    Get PDF
    Efficient market theory states that financial markets process information instantly. Empirical observations have challenged the stricter form of the efficient market hypothesis (EMH). These empirical observations and theoretical considerations show that price changes are difficult to predict if one starts from the time series of past price changes alone. This paper provides an explanation, in terms of Kolmogorov's theory of algorithmic complexity, that makes a clearer connection between the efficient market hypothesis and the unpredictable character of stock returns.
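    The link the paper draws between compressibility and predictability can be illustrated with a toy experiment that is not the paper's method: use a lossless compressor as a crude, computable stand-in for the Kolmogorov complexity of a sequence of price-change signs. A sequence whose compressed size is close to its raw size carries no regularity a betting strategy could exploit, which is the intuition connecting algorithmic incompressibility to the EMH. The helper names below (pack_bits, compression_ratio) are illustrative.

```python
# Toy illustration only (not the paper's construction): lossless compression as
# a crude, computable upper-bound proxy for the Kolmogorov complexity of a
# sequence of price-change signs.
import random
import zlib

def pack_bits(bits: str) -> bytes:
    """Pack a '0'/'1' string into bytes so the raw size reflects its bit length."""
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits) - 7, 8))

def compression_ratio(bits: str) -> float:
    """Compressed size over raw size; near 1.0 means 'looks incompressible'."""
    raw = pack_bits(bits)
    return len(zlib.compress(raw, 9)) / len(raw)

random.seed(0)
n = 20_000
iid_signs = "".join(random.choice("01") for _ in range(n))   # fair-coin return signs
patterned = ("0" * 50 + "1" * 50) * (n // 100)               # strongly patterned signs

print(f"i.i.d. signs: {compression_ratio(iid_signs):.2f}")   # close to 1.0
print(f"patterned   : {compression_ratio(patterned):.2f}")   # far below 1.0
```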

    Computing absolutely normal numbers in nearly linear time

    Get PDF
    A real number x is absolutely normal if, for every base b ≥ 2, every two equally long strings of digits appear with equal asymptotic frequency in the base-b expansion of x. This paper presents an explicit algorithm that generates the binary expansion of an absolutely normal number x, with the nth bit of x appearing after n · polylog(n) computation steps. This speed is achieved by simultaneously computing and diagonalizing against a martingale that incorporates Lempel-Ziv parsing algorithms in all bases.
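    A quick way to get a feel for the definition above (this only inspects the definition; it is not the paper's nearly-linear-time construction) is to count length-k digit blocks in a finite prefix of an expansion and compare the empirical frequencies with the limiting value b^(-k). The helper block_frequencies and the use of the base-10 Champernowne constant as test data are illustrative choices.

```python
# Minimal sketch of the *definition* only (not the paper's algorithm): count how
# often each length-k digit block occurs in a finite base-b prefix.  For an
# absolutely normal number every length-k block approaches frequency b**(-k) in
# every base; here we just inspect one base on a finite prefix.
from collections import Counter

def block_frequencies(digits: str, k: int) -> dict:
    """Relative frequency of every length-k block in a digit string."""
    blocks = [digits[i:i + k] for i in range(len(digits) - k + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return {block: c / total for block, c in counts.items()}

# The base-10 Champernowne constant 0.123456789101112... is normal in base 10
# (though not known to be absolutely normal); use its first digits as test data.
champernowne = "".join(str(i) for i in range(1, 5000))
freqs = block_frequencies(champernowne, k=1)
for d in sorted(freqs):
    # Each frequency tends to 1/10 = 0.1 in the limit; finite prefixes deviate.
    print(d, round(freqs[d], 4))
```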

    User-friendly tail bounds for sums of random matrices

    Get PDF
    This paper presents new probability inequalities for sums of independent, random, self-adjoint matrices. These results place simple and easily verifiable hypotheses on the summands, and they deliver strong conclusions about the large-deviation behavior of the maximum eigenvalue of the sum. Tail bounds for the norm of a sum of random rectangular matrices follow as an immediate corollary. The proof techniques also yield some information about matrix-valued martingales. In other words, this paper provides noncommutative generalizations of the classical bounds associated with the names Azuma, Bennett, Bernstein, Chernoff, Hoeffding, and McDiarmid. The matrix inequalities promise the same diversity of application, ease of use, and strength of conclusion that have made the scalar inequalities so valuable. Comment: Current paper is the version of record. The material on Freedman's inequality has been moved to a separate note; other martingale bounds are described in Caltech ACM Report 2011-0
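    As a sanity check on how such bounds are used (an illustrative setup, not code from the paper), the sketch below Monte-Carlo samples a sum of independent, zero-mean, bounded self-adjoint matrices and compares the empirical tail of its maximum eigenvalue with a matrix-Bernstein-style bound of the form P(λ_max(Σ X_i) ≥ t) ≤ d · exp(−t² / (2σ² + 2Rt/3)). The Rademacher-sign construction of the summands is an assumption made purely for the demo.

```python
# Monte Carlo sanity check (illustrative only) of a matrix-Bernstein-style tail
# bound: for independent, zero-mean, self-adjoint X_i with ||X_i|| <= R and
# sigma^2 = ||sum_i E[X_i^2]||,
#     P( lambda_max(sum_i X_i) >= t ) <= d * exp( -t^2 / (2*sigma^2 + 2*R*t/3) ).
import numpy as np

rng = np.random.default_rng(0)
d, n, trials = 10, 200, 2000

# Fixed self-adjoint "directions" A_i; the summands are X_i = eps_i * A_i with
# Rademacher signs eps_i, so E[X_i] = 0 and E[X_i^2] = A_i^2.
A = rng.standard_normal((n, d, d))
A = (A + A.transpose(0, 2, 1)) / (2 * np.sqrt(d))        # symmetrize and scale

R = max(np.linalg.norm(Ai, 2) for Ai in A)               # a.s. bound on ||X_i||
sigma2 = np.linalg.norm(sum(Ai @ Ai for Ai in A), 2)     # variance parameter

t = 2.5 * np.sqrt(sigma2)
hits = 0
for _ in range(trials):
    eps = rng.choice([-1.0, 1.0], size=n)
    S = np.einsum("i,ijk->jk", eps, A)                   # S = sum_i eps_i * A_i
    hits += np.linalg.eigvalsh(S)[-1] >= t               # largest eigenvalue >= t?

empirical = hits / trials
bound = d * np.exp(-t**2 / (2 * sigma2 + 2 * R * t / 3))
print(f"empirical tail {empirical:.4f}  <=  Bernstein-style bound {bound:.4f}")
```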

    Kolmogorov complexity

    Get PDF
    In this dissertation, new results on Kolmogorov complexity are discussed. Its first part focuses on the study of Kolmogorov complexity without time bounds. Here we are concerned with the concept of non-monotonic randomness, i.e., randomness characterized by martingales that may bet in non-monotonic order. In this context we introduce a number of randomness classes and then separate them from one another. We also present a systematic overview of various notions of traceability and characterize them via (auto-)complexity notions. Traceabilities are a family of notions expressing that a set is nearly computable. The second part of this document deals with time-bounded Kolmogorov complexity. First, we examine the difference between two ways of describing a word: the complexity of describing it precisely enough that it can be distinguished from other words, and the complexity of describing it precisely enough that the word can actually be generated from the description. This distinction is irrelevant for time-unbounded Kolmogorov complexity; as soon as we introduce time bounds, however, it becomes essential. Next, we introduce the notion of depth and prove a dichotomy result about it whose structure is reminiscent of Kummer's well-known gap theorem. Finally, we consider the important notion of Solovay functions. These are computable upper bounds on Kolmogorov complexity that are infinitely often tight. We use them to characterize Martin-Löf randomness in a certain setting and to give a characterization of jump-traceability.
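    The martingale view of randomness that the first part builds on can be seen in miniature (monotone betting only, so much simpler than the non-monotonic games studied in the thesis): a betting strategy with fair payoffs accumulates unbounded capital exactly on the sequences it exposes as non-random. The majority-vote strategy below is a hypothetical example, not one from the thesis.

```python
# Minimal illustration of a computable (monotone) martingale: bet a fixed
# fraction of the current capital on the majority bit seen so far, with fair
# even-odds payoffs, so capital(w0) + capital(w1) = 2 * capital(w).  The capital
# explodes on a heavily biased sequence and typically shrinks on fair coin flips.
import random

def run_martingale(bits, fraction=0.5):
    """Bet `fraction` of current capital on the majority bit seen so far."""
    capital, ones = 1.0, 0
    for i, b in enumerate(bits):
        guess = 1 if ones > i - ones else 0          # majority vote (ties -> 0)
        stake = fraction * capital
        capital += stake if b == guess else -stake   # fair (even-odds) payoff
        ones += b
    return capital

random.seed(1)
biased = [1 if random.random() < 0.9 else 0 for _ in range(300)]
fair   = [random.randrange(2) for _ in range(300)]
print("biased sequence:", run_martingale(biased))    # huge capital: not random
print("fair coin flips:", run_martingale(fair))      # typically tiny capital
```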

    Computability and Fractal Dimension

    Get PDF
    This thesis combines computability theory and various notions of fractal dimension, mainly Hausdorff dimension. An algorithmic approach to Hausdorff measures makes it possible to define the Hausdorff dimension of individual points instead of sets in a metric space. This idea was first realized by Lutz (2000). Working in the Cantor space of all infinite binary sequences, we study the theory of Hausdorff and other dimensions for individual sequences. After giving an overview over the classical theory of fractal dimension in Cantor space, we develop the theory of effective Hausdorff dimension and its variants systematically. Our presentation is inspired by the approach to algorithmic information theory developed by Kolmogorov and his students. We are able to give a new and much easier proof of a central result of the effective theory: Effective Hausdorff dimension coincides with the lower asymptotic algorithmic entropy, defined in terms of Kolmogorov complexity (stated formally after this abstract). Besides, we prove a general theorem on the behavior of effective dimension under r-expansive mappings, which can be seen as a generalization of Hölder mappings in Cantor space. Furthermore, we study the connections between other notions of effective fractal dimension and algorithmic entropy. Moreover, we are able to show that the set of sequences of effective Hausdorff dimension s has Hausdorff dimension s and infinite s-dimensional Hausdorff measure (for every 0 < s < 1). Next, we study the Hausdorff dimension (effective and classical) of objects arising in computability theory. We prove that the upper cone of any sequence under a standard reducibility has Hausdorff dimension 1, thereby exposing a Lebesgue nullset that has maximal Hausdorff dimension. Furthermore, using the behavior of effective dimension under r-expansive transformations, we are able to show that the effective Hausdorff dimension of the lower cone and the degree of a sequence coincide. For many-one reducibility, we prove the existence of lower cones of non-integral dimension. After giving some 'natural' examples of sequences of effective dimension 0, we prove that every effectively closed set A of positive Hausdorff dimension admits a computable, surjective mapping onto Cantor space. We go on to study the complex interrelation between algorithmic entropy, randomness, effective Hausdorff dimension, and reducibility more closely. For this purpose we generalize effective Hausdorff dimension by introducing the notion of strong effective Hausdorff measure 0. We are able to show that not having strong effective Hausdorff measure 0 does not necessarily allow one to compute a Martin-Löf random sequence, a sequence of highest possible algorithmic entropy. In addition, we show that a generalization of the notion of effective randomness to noncomputable measures yields a very coarse concept of randomness, in the sense that every noncomputable sequence is random with respect to some measure. Next, we introduce Schnorr dimension, a notion of dimension which is algorithmically more restrictive than effective dimension. We prove a machine characterization of Schnorr dimension and show that, on the computably enumerable sets, Schnorr Hausdorff dimension and Schnorr packing dimension do not coincide, in contrast to the case of effective dimension. We also study subrecursive notions of effective Hausdorff dimension.
Using resource-bounded martingales, we are able to transfer the use of r-expansiveness to the resource-bounded case, which enables us to show that the Small-Span Theorem does not hold for dimension in exponential time E. Finally, we investigate the effective Hausdorff dimension of sequences against which no computable nonmonotonic betting strategy can succeed. Computable nonmonotonic betting games are a generalization of computable martingales, and it is a major open question whether the randomness notion induced by them is equivalent to Martin-Löf randomness. We are able to show that the sequences which are random with respect to computable nonmonotonic betting games have effective Hausdorff dimension 1, which implies that, from the viewpoint of algorithmic entropy, they are rather close to Martin-Löf randomness.
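    The coincidence result referred to above is commonly stated as the following identity, where K denotes prefix-free Kolmogorov complexity and X↾n the length-n prefix of a sequence X (standard notation, not quoted verbatim from the thesis):

```latex
% Effective Hausdorff dimension equals the lower asymptotic algorithmic entropy:
\[
  \dim_{\mathrm{H}}^{\mathrm{eff}}(X)
    \;=\; \liminf_{n \to \infty} \frac{K(X \upharpoonright n)}{n}
  \qquad \text{for every sequence } X \in \{0,1\}^{\infty}.
\]
```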

    Stochastic Mechanics and the Unification of Quantum Mechanics with Brownian Motion

    Full text link
    We unify Brownian motion and quantum mechanics in a single mathematical framework. In particular, we show that non-relativistic quantum mechanics of a single spinless particle on a flat space can be described by a Wiener process that is rotated in the complex plane. We then extend this theory to relativistic stochastic theories on manifolds using the framework of second order geometry. As a byproduct, our results suggest that a consistent path-integral-based formulation of a quantum theory on a Lorentzian (Riemannian) manifold requires an Itô deformation of the Poincaré (Galilean) symmetry, arising due to the coupling of the quadratic variation to the affine connection. Comment: 113 pages; preprint of a book published by Springer Nature.
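    Taking the phrase "a Wiener process that is rotated in the complex plane" at face value (an illustrative reading, not the book's full construction), one can simulate a real Wiener path, multiply its increments by a phase e^(iθ), and watch the discretized quadratic variation rotate as e^(2iθ)·T: real for θ = 0 (ordinary diffusion) and purely imaginary for θ = π/2, the regime the abstract associates with quantum mechanics.

```python
# Toy numerical reading of a "complex-rotated" Wiener process: multiply real
# Wiener increments by exp(i*theta).  The discretized quadratic variation of the
# rotated path is approximately exp(2i*theta) * T.
import numpy as np

rng = np.random.default_rng(42)
T, n = 1.0, 100_000
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)        # real Wiener increments

for theta in (0.0, np.pi / 4, np.pi / 2):
    dZ = np.exp(1j * theta) * dW                 # rotated increments
    quad_var = complex(np.sum(dZ * dZ))          # discretized [Z]_T (no conjugate)
    print(f"theta = {theta:5.3f}:  [Z]_T ~ {quad_var:+.3f}  (expect e^(2i*theta) * T)")
```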