13 research outputs found

    The type problem for Riemann surfaces via Fenchel-Nielsen parameters

    A Riemann surface $X$ is said to be of \emph{parabolic type} if it does not support a Green's function. Equivalently, the geodesic flow on the unit tangent bundle of $X$ is ergodic. Given a Riemann surface $X$ of arbitrary topological type and a hyperbolic pants decomposition of $X$, we obtain sufficient conditions for parabolicity of $X$ in terms of the Fenchel-Nielsen parameters of the decomposition. In particular, we initiate the study of the effect of twist parameters on parabolicity. A key ingredient in our work is the notion of a \textit{non-standard half-collar} about a hyperbolic geodesic. We show that the modulus of such a half-collar is much larger than the modulus of a standard half-collar as the hyperbolic length of the core geodesic tends to infinity. Moreover, the modulus of the annulus obtained by gluing two non-standard half-collars depends on the twist parameter, unlike in the case of standard collars. Our results are sharp in many cases. For instance, for zero-twist flute surfaces, as well as half-twist flute surfaces with concave sequences of lengths, our results provide a complete characterization of parabolicity in terms of the length parameters. It follows that parabolicity is equivalent to completeness in these cases. Applications to other topological types, such as surfaces with infinite genus and one end (a.k.a. the infinite Loch Ness monster), the ladder surface, and Abelian covers of compact surfaces, are also studied.
    Comment: 51 pages, 17 figures. Typos corrected. Comparison between glued standard and non-standard collars emphasized on page 10, formula (12). Abstract corrected.
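
    For orientation, a hedged back-of-envelope computation of the modulus of a standard collar (the paper's non-standard construction and its formula (12) are not reproduced here); the width formula is the classical collar lemma, and the sub-cylinder modulus is a standard hyperbolic-geometry computation:

    % Standard collar about a closed geodesic of length \ell (collar lemma width):
    \[ w(\ell) = \operatorname{arcsinh}\!\left(\frac{1}{\sinh(\ell/2)}\right). \]
    % The full hyperbolic cylinder with core length \ell has modulus \pi/\ell;
    % its sub-cylinder of half-width w is an annulus of modulus
    \[ \operatorname{mod} = \frac{\pi - 2\varphi_0}{\ell}, \qquad \sin\varphi_0 = \frac{1}{\cosh w}. \]
    % Hence, for the standard collar, as \ell \to \infty,
    \[ \operatorname{mod} \approx \frac{2\,w(\ell)}{\ell} \approx \frac{4\,e^{-\ell/2}}{\ell} \longrightarrow 0, \]
    % which is exactly the regime where the abstract's non-standard
    % half-collars are claimed to have much larger modulus.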

    Conformal Dimension of the Brownian Graph

    Conformal dimension of a metric space $X$, denoted by $\dim_C X$, is the infimum of the Hausdorff dimension over all of its quasisymmetric images. If the conformal dimension of $X$ equals its Hausdorff dimension, $X$ is said to be minimal for conformal dimension. In this paper we show that the graph of one-dimensional Brownian motion is almost surely minimal for conformal dimension. We also give many other examples of minimal sets for conformal dimension, which we call Bedford-McMullen type sets. In particular, we show that Bedford-McMullen self-affine sets with uniform fibers are minimal for conformal dimension. The main technique in the proofs is the construction of ``rich families of minimal sets of conformal dimension one''. The latter concept is quantified using Fuglede's modulus of measures.
    Comment: 42 pages, 6 figures.
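
    As a hedged pointer to the two definitions in play (these are the standard ones, not anything specific to this paper):

    \[ \dim_C X \;=\; \inf\left\{ \dim_H Y \;:\; Y \text{ a quasisymmetric image of } X \right\}, \]
    % so X is minimal for conformal dimension precisely when \dim_C X = \dim_H X.
    % Fuglede's p-modulus of a family \Sigma of measures on a measure space (X, m):
    \[ \operatorname{Mod}_p(\Sigma) \;=\; \inf\left\{ \int_X \rho^p \, dm \;:\; \rho \ge 0 \text{ Borel},\ \int_X \rho \, d\mu \ge 1 \ \text{for all } \mu \in \Sigma \right\}. \]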

    $Z_3$ and $(\times Z_3)^3$ symmetry protected topological paramagnets

    We identify two-dimensional three-state Potts paramagnets with gapless edge modes on a triangular lattice, protected by $(\times Z_3)^3 \equiv Z_3 \times Z_3 \times Z_3$ symmetry and by the smaller $Z_3$ symmetry. We derive microscopic models for the gapless edge, uncover their symmetries, and analyze their conformal properties. We study the properties of the gapless edge by employing numerical density-matrix renormalization group (DMRG) simulations and exact diagonalization. We discuss the corresponding conformal field theory, its central charge, and the scaling dimension of the corresponding primary field. The two-dimensional models discussed here realize a variety of symmetry-protected topological phases, opening a window for studies of the unconventional quantum criticalities between them.
    Comment: 33 pages, 9 figures.
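
    Since the abstract leans on exact diagonalization, a minimal self-contained sketch may help; it diagonalizes a small one-dimensional three-state quantum Potts chain, chosen for illustration only (an assumption, not the paper's triangular-lattice model or its protected edge):

    # Hedged sketch: exact diagonalization of a small 1D three-state quantum
    # Potts chain. Illustrative only; NOT the paper's triangular-lattice model.
    import numpy as np

    omega = np.exp(2j * np.pi / 3)
    # Z_3 clock and shift operators: sigma is diagonal, tau cyclically shifts.
    sigma = np.diag([1, omega, omega**2])
    tau = np.roll(np.eye(3), 1, axis=0)   # tau |k> = |k+1 mod 3>

    def site_op(op, i, n):
        """Embed a single-site operator at site i of an n-site chain."""
        mats = [op if j == i else np.eye(3) for j in range(n)]
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out

    def potts_hamiltonian(n, J=1.0, f=1.0):
        """H = -J sum_i (sigma_i sigma_{i+1}^dag + h.c.) - f sum_i (tau_i + tau_i^dag)."""
        dim = 3**n
        H = np.zeros((dim, dim), dtype=complex)
        for i in range(n - 1):
            term = site_op(sigma, i, n) @ site_op(sigma.conj().T, i + 1, n)
            H -= J * (term + term.conj().T)
        for i in range(n):
            t = site_op(tau, i, n)
            H -= f * (t + t.conj().T)
        return H

    H = potts_hamiltonian(6)              # 729 x 729 Hermitian matrix
    evals = np.linalg.eigvalsh(H)
    print("ground-state energy:", evals[0])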

    Deep Lake: a Lakehouse for Deep Learning

    Traditional data lakes provide critical data infrastructure for analytical workloads by enabling time travel, running SQL queries, ingesting data with ACID transactions, and visualizing petabyte-scale datasets on cloud storage. They allow organizations to break down data silos, unlock data-driven decision-making, improve operational efficiency, and reduce costs. However, as deep learning takes over common analytical workflows, traditional data lakes become less useful for applications such as natural language processing (NLP), audio processing, computer vision, and applications involving non-tabular datasets. This paper presents Deep Lake, an open-source lakehouse for deep learning applications developed at Activeloop. Deep Lake maintains the benefits of a vanilla data lake with one key difference: it stores complex data, such as images, videos, and annotations, as well as tabular data, in the form of tensors, and rapidly streams the data over the network to (a) the Tensor Query Language, (b) an in-browser visualization engine, or (c) deep learning frameworks without sacrificing GPU utilization. Datasets stored in Deep Lake can be accessed from PyTorch, TensorFlow, and JAX, and integrate with numerous MLOps tools.
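
    A brief usage sketch of the streaming access pattern the abstract describes; the calls (deeplake.load, ds.pytorch) and tensor names follow the open-source deeplake package as of the v3-era API and are assumptions that may differ across versions:

    # Hedged sketch based on the abstract's description; API names and the
    # dataset path are assumptions from the v3-era `deeplake` package.
    import deeplake

    # Load a public dataset stored in Deep Lake's tensor format.
    ds = deeplake.load("hub://activeloop/mnist-train")  # path: assumption

    # Stream batches into a PyTorch dataloader without materializing locally.
    loader = ds.pytorch(batch_size=64, shuffle=True, num_workers=2)
    for batch in loader:
        images, labels = batch["images"], batch["labels"]  # tensor names assumed
        break  # one batch, just to show the access pattern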