
    Packing dimensions of the divergence points of self-similar measures with the open set condition

    Let $\mu$ be the self-similar measure supported on a self-similar set $K$ satisfying the open set condition. In this article, we discuss the packing dimension of the set $\{x\in K : A(\frac{\log\mu(B(x,r))}{\log r})=I\}$ for $I\subseteq\mathbb{R}$, where $A(\frac{\log\mu(B(x,r))}{\log r})$ denotes the set of accumulation points of $\frac{\log\mu(B(x,r))}{\log r}$ as $r\searrow 0$. Our main result settles the conjecture on packing dimension posed by Olsen and Winter \cite{OlsWin} and generalizes the result in \cite{BaeOlsSni}. Comment: 13 pages
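    As a concrete illustration of how the ratio $\frac{\log\mu(B(x,r))}{\log r}$ can fail to converge (a minimal numerical sketch, not from the paper; the weights and block lengths are illustrative choices): take the weighted Cantor measure on the middle-third Cantor set, assigning weights $p$ and $1-p$ to the contractions $x/3$ and $x/3+2/3$. For a point whose ternary expansion alternates between rapidly growing blocks of 0s and 2s, $\mu(B(x,3^{-n}))$ is comparable to the cylinder measure $p^{\#0\text{s}}(1-p)^{\#2\text{s}}$, so the ratio oscillates as $n\to\infty$.

    ```python
    import math

    # Weighted Cantor measure: maps x/3 and x/3 + 2/3 with weights p and 1 - p.
    p = 0.2

    # Build a digit sequence (0s and 2s) with rapidly growing alternating blocks,
    # so the frequency of 0s among the first n digits keeps oscillating.
    digits = []
    block, digit = 2, 0
    while len(digits) < 100_000:
        digits.extend([digit] * block)
        digit = 2 - digit          # switch between 0 and 2
        block *= 4                 # each new block dwarfs everything before it

    # mu(B(x, 3^{-n})) is comparable to the cylinder measure p^{#0s} (1-p)^{#2s},
    # so log mu(B(x, 3^{-n})) / log 3^{-n} tracks the running digit frequencies.
    log_mu = 0.0
    for n, d in enumerate(digits, start=1):
        log_mu += math.log(p if d == 0 else 1 - p)
        if n in (10, 100, 1_000, 10_000, 100_000):
            ratio = log_mu / (n * math.log(1 / 3))
            print(f"n = {n:>6}:  log mu(B(x,3^-n)) / log 3^-n = {ratio:.4f}")
    ```

    For $p=0.2$ the printed ratios swing between roughly $0.2$ and $1.46$, so the set of accumulation points is a nondegenerate interval and the point is a divergence point.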

    Multifractal Analysis of Ergodic Averages in Some Nonuniformly Hyperbolic Systems

    This article is devoted to the study of the multifractal analysis of ergodic averages in some nonuniformly hyperbolic systems. In particular, our results hold for the robust classes of multidimensional nonuniformly expanding local diffeomorphisms and Viana maps. Comment: 15 pages
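    For context (standard definitions, not quoted from the paper): multifractal analysis of ergodic averages studies the level sets of Birkhoff averages of an observable $\varphi$ and measures the size of each level set.

    ```latex
    % Level sets of Birkhoff averages of an observable \varphi : X \to \mathbb{R}
    E(\alpha) = \Big\{ x \in X : \lim_{n\to\infty} \frac{1}{n}
                \sum_{i=0}^{n-1} \varphi\big(f^i x\big) = \alpha \Big\},
    \qquad \alpha \in \mathbb{R}.
    % The multifractal spectrum records the size of each level set,
    % e.g. via Hausdorff dimension or topological entropy:
    \alpha \;\longmapsto\; \dim_H E(\alpha)
    \quad\text{or}\quad
    \alpha \;\longmapsto\; h_{\mathrm{top}}\big(f, E(\alpha)\big).
    ```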

    Multifractal analysis for historic set in topological dynamical systems

    In this article, the historic set is divided into level sets, and we use topological pressure to describe the size of these level sets. We give an application of these results to dimension theory. In particular, we use topological pressure to describe the relative multifractal spectrum of ergodic averages and give a positive answer to a conjecture posed by L. Olsen (J. Math. Pures Appl. {\bf 82} (2003)). Comment: 30 pages
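    For reference (standard terminology, not the paper's statement): the historic set is the set of points whose Birkhoff averages fail to converge, and it is decomposed into level sets according to the set of accumulation points of the averages.

    ```latex
    % Historic (irregular) set of an observable \varphi : X \to \mathbb{R}
    \widehat{X}(\varphi) = \Big\{ x \in X :
        \lim_{n\to\infty} \frac{1}{n} \sum_{i=0}^{n-1} \varphi\big(f^i x\big)
        \ \text{does not exist} \Big\}.
    % Level-set decomposition: for a closed interval [\alpha,\beta] one
    % studies the points whose averages have exactly [\alpha,\beta] as
    % their set of accumulation points, and describes the size of each
    % such level set via topological pressure.
    ```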

    The variational principle of local pressure for actions of sofic groups

    This study establishes the variational principle for local pressure in the sofic context. Comment: 13 pages
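    For context (the classical fact being generalized, not the sofic statement itself): the variational principle for topological pressure of a continuous map $f$ with potential $\varphi$ reads

    ```latex
    P(f,\varphi) \;=\; \sup_{\mu \in M(X,f)}
    \Big( h_\mu(f) + \int_X \varphi \, d\mu \Big).
    % M(X,f): the f-invariant Borel probability measures; h_\mu(f): the
    % measure-theoretic entropy; P(f,\varphi): the topological pressure.
    % The sofic version replaces h_\mu and P by their sofic counterparts,
    % defined relative to a sofic approximation sequence of the group.
    ```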

    Bowen's topological entropy of Cartesian product sets

    This article is devoted to proving the product theorem for Bowen's topological entropy. Comment: 11 pages. arXiv admin note: text overlap with arXiv:1012.1103 by other authors
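    For reference (the standard Carathéodory-type definition, not quoted from the paper): Bowen's topological entropy of an arbitrary, not necessarily compact or invariant, set $Z\subseteq X$ is defined via covers by Bowen balls.

    ```latex
    % Bowen ball of order n and radius eps:
    B_n(x,\varepsilon) = \{\, y \in X : d\big(f^i x, f^i y\big) < \varepsilon,
                            \ 0 \le i < n \,\},
    % Outer measure from covers \{B_{n_i}(x_i,\varepsilon)\} of Z with n_i >= N:
    M(Z,s,N,\varepsilon) = \inf \sum_i e^{-s\, n_i},
    \qquad m(Z,s,\varepsilon) = \lim_{N\to\infty} M(Z,s,N,\varepsilon).
    % Bowen's entropy is the critical exponent, in the limit of small radius:
    h^{B}_{\mathrm{top}}(f,Z) = \lim_{\varepsilon \to 0}\,
        \inf\{\, s : m(Z,s,\varepsilon) = 0 \,\}.
    ```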

    The Historic Set of Ergodic Averages in Some Nonuniformly Hyperbolic Systems

    This article is devoted to the study of the historic set of ergodic averages in some nonuniformly hyperbolic systems. In particular, our results hold for the robust classes of multidimensional nonuniformly expanding local diffeomorphisms and Viana maps. Comment: 18 pages. Comments are welcome. arXiv admin note: text overlap with arXiv:1310.234
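    To make "historic" concrete, here is a minimal sketch (an illustration in the simplest possible setting, not the paper's): for the full shift on two symbols with observable $\varphi(x)=x_0$, a point built from super-exponentially growing alternating blocks of 0s and 1s has Birkhoff averages that oscillate forever, so it belongs to the historic set.

    ```python
    # A historic point for the one-sided full shift on {0,1}: the Birkhoff
    # averages of phi(x) = x[0] never converge along a sequence built from
    # rapidly growing alternating blocks.
    digits = []
    block, symbol = 1, 0
    while len(digits) < 200_000:
        digits.extend([symbol] * block)
        symbol = 1 - symbol        # alternate blocks of 0s and 1s
        block *= 5                 # each block dominates the whole past

    # Running Birkhoff average (1/n) sum_{i<n} phi(sigma^i x) equals the
    # frequency of 1s among the first n symbols; it keeps swinging between
    # values near 0 and values near 1.
    ones = 0
    for n, d in enumerate(digits, start=1):
        ones += d
        if n in (5, 30, 200, 1_000, 8_000, 40_000, 200_000):
            print(f"n = {n:>7}:  average = {ones / n:.4f}")
    ```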

    Topological pressure, mistake functions and average metric

    In this paper, we show that the Pesin pressure of any subset under a mistake function is equal to the classical Pesin pressure of that subset. Our result extends the result of [1] in the additive case, which proved that the topological pressure of the whole system is unchanged under a mistake function. As an application, we show that the Pesin pressure defined by the average metric is equal to the classical Pesin pressure. Comment: 7 pages
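    For reference, a common formulation of the mistake framework (following the usual definitions in the literature; the notation here is an assumption, not taken from the paper): a mistake function $g(n,\varepsilon)$, with $g(n,\varepsilon)/n\to 0$, allows orbit segments to disagree on a sublinear set of times, and the mistake Bowen ball is

    ```latex
    % Mistake Bowen ball: orbits may disagree on at most g(n, eps) of the
    % first n iterates, where g(n, eps)/n -> 0 as n -> infinity.
    B_n(g; x, \varepsilon) = \big\{\, y \in X : \#\{\, 0 \le i < n :
        d(f^i x, f^i y) \ge \varepsilon \,\} \le g(n,\varepsilon) \big\}.
    % The mistake Pesin (Caratheodory) pressure is built from covers by
    % such balls exactly as the classical one is built from Bowen balls;
    % the claim above is that the two pressures coincide on every subset.
    ```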

    Shadowing and mixing on systems of countable group actions

    Let $(X,G,\Phi)$ be a dynamical system, where $X$ is a compact Hausdorff space and $G$ is a countable discrete group. We investigate the shadowing property and mixing properties via the relationship between subshifts and general dynamical systems. For the shadowing property, fix a finite subset $S\subset G$. We prove that if $X$ is totally disconnected, then $\Phi$ has the $S$-shadowing property if and only if $(X,G,\Phi)$ is conjugate to an inverse limit of a sequence of shifts of finite type satisfying the Mittag-Leffler condition. Moreover, if $X$ is a metric space (not necessarily totally disconnected), we prove that if $\Phi$ has the $S$-shadowing property, then $(X,G,\Phi)$ is a factor of an inverse limit of a sequence of shifts of finite type by a factor map which almost lifts pseudo-orbits for $S$. On the other hand, let property $P$ be one of the following: transitivity, minimality, total transitivity, weak mixing, mixing, and the specification property. We prove that if $X$ is totally disconnected, then $\Phi$ has property $P$ if and only if $(X,G,\Phi)$ is conjugate to an inverse limit of an inverse system consisting of subshifts with property $P$ satisfying the Mittag-Leffler condition. Also, in the case of a metric space (not necessarily totally disconnected), if property $P$ is neither minimality nor the specification property, we prove that $\Phi$ has property $P$ if and only if $(X,G,\Phi)$ is a factor of an inverse limit of a sequence of subshifts with property $P$ satisfying the Mittag-Leffler condition. Comment: 23 pages
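    For reference (the standard notion for group actions, stated here for a metric space; the notation is an assumption following common usage): given a finite $S\subset G$, an $(S,\delta)$-pseudo-orbit and $S$-shadowing are defined by

    ```latex
    % (S, delta)-pseudo-orbit: a family (x_g)_{g in G} in X with
    d\big(\Phi_s(x_g),\, x_{sg}\big) < \delta
    \qquad \text{for all } s \in S,\ g \in G.
    % Phi has the S-shadowing property if for every eps > 0 there exists
    % delta > 0 such that every (S, delta)-pseudo-orbit is eps-shadowed
    % by a true orbit, i.e. there exists x in X with
    d\big(\Phi_g(x),\, x_g\big) < \varepsilon
    \qquad \text{for all } g \in G.
    ```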

    Entropy and Emergence of Topological Dynamical Systems

    A topological dynamical system $(X,f)$ induces two natural systems, one on the space of probability measures and the other on the hyperspace. We introduce a concept for these two spaces, called entropy order, and prove that it coincides with the topological entropy of $(X,f)$. We also consider the entropy order of an invariant measure, and a variational principle is established. Comment: Any comments are welcome
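    For context (standard constructions, not the paper's new notion of entropy order): the two induced systems are the push-forward map on the space of Borel probability measures and the induced map on the hyperspace of nonempty closed subsets.

    ```latex
    % Induced system on probability measures (weak* topology):
    f_* : M(X) \to M(X), \qquad (f_*\mu)(A) = \mu\big(f^{-1}A\big).
    % Induced system on the hyperspace (Hausdorff metric / Vietoris topology):
    \bar f : 2^X \to 2^X, \qquad \bar f(K) = f(K) = \{\, f(x) : x \in K \,\}.
    ```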

    AXNet: ApproXimate computing using an end-to-end trainable neural network

    Neural-network-based approximate computing is a universal architecture promising tremendous energy efficiency for many error-resilient applications. To guarantee the approximation quality, existing works deploy two neural networks (NNs): an approximator and a predictor. The approximator provides the approximate results, while the predictor predicts whether the input data is safe to approximate under the given quality requirement. However, it is non-trivial and time-consuming to make these two neural networks coordinate (they have different optimization objectives) by training them separately. This paper proposes a novel neural network structure, AXNet, that fuses the two NNs into a holistic, end-to-end trainable NN. Leveraging the philosophy of multi-task learning, AXNet can tremendously improve the invocation (the proportion of safe-to-approximate samples) and reduce the approximation error. The training effort also decreases significantly. Experimental results show 50.7% more invocation and a substantial reduction of training time compared to the existing neural-network-based approximate computing framework. Comment: Accepted by ICCAD 2018
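    A minimal sketch of the fused two-headed, multi-task design described above (an illustration of the idea in PyTorch, not the authors' published architecture; the layer sizes, loss weighting, self-labeling rule, and all names are hypothetical):

    ```python
    import torch
    import torch.nn as nn

    class AXNetSketch(nn.Module):
        """Shared trunk with two heads: an approximator that regresses the
        target function and a predictor that classifies whether an input is
        safe to approximate. Training both heads jointly (multi-task)
        replaces the separate training of two coordinated networks."""

        def __init__(self, in_dim: int, out_dim: int, hidden: int = 64):
            super().__init__()
            self.trunk = nn.Sequential(
                nn.Linear(in_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
            )
            self.approximator = nn.Linear(hidden, out_dim)  # approximate result
            self.predictor = nn.Linear(hidden, 1)           # safe-to-approximate logit

        def forward(self, x):
            h = self.trunk(x)
            return self.approximator(h), self.predictor(h)

    def joint_loss(y_hat, safe_logit, y, err_bound=0.1, alpha=1.0):
        """Multi-task loss: regression error plus a classification term whose
        labels are derived from whether the current approximation error is
        within the quality bound (a self-labeling heuristic assumed here)."""
        err = (y_hat - y).pow(2).mean(dim=1, keepdim=True)
        safe_label = (err.detach() < err_bound).float()
        approx_loss = err.mean()
        pred_loss = nn.functional.binary_cross_entropy_with_logits(
            safe_logit, safe_label)
        return approx_loss + alpha * pred_loss

    # Toy usage: learn y = sin(x0) + x1 on random data.
    torch.manual_seed(0)
    model = AXNetSketch(in_dim=2, out_dim=1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        x = torch.rand(256, 2) * 4 - 2
        y = (torch.sin(x[:, :1]) + x[:, 1:]).detach()
        y_hat, safe_logit = model(x)
        loss = joint_loss(y_hat, safe_logit, y)
        opt.zero_grad(); loss.backward(); opt.step()
    print(f"final joint loss: {loss.item():.4f}")
    ```

    The design point the sketch illustrates: because both heads share the trunk, gradients from the prediction task and the approximation task shape one representation, which is what lets the fused network be trained end to end instead of coordinating two separately trained NNs.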