50,786 research outputs found

    Entropy of complex relevant components of Boolean networks

    Boolean network models of strongly connected modules are capable of capturing the high regulatory complexity of many biological gene regulatory circuits. We numerically study the previously introduced basin entropy, a measure of the dynamical uncertainty or information storage capacity of a network, as well as the average transient time in random relevant components, as a function of their connectivity. We also demonstrate that basin entropy can be estimated from time-series data and is therefore applicable to non-deterministic network models.
    Comment: 8 pages, 6 figures
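The basin entropy discussed above is the Shannon entropy of the distribution of attractor-basin sizes in state space. A minimal sketch, assuming a random Boolean network with `k` inputs per node and exhaustive enumeration of all states (feasible only for small `n`; the paper's time-series estimator is different, and the parameters `n`, `k`, `seed` here are illustrative):

```python
import random
import math

def random_boolean_network(n, k, seed=0):
    """Random Boolean network: each node reads k random inputs
    through a random Boolean truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    def step(state):  # state: tuple of n bits -> next state
        return tuple(
            tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
            for i in range(n)
        )
    return step

def basin_entropy(step, n):
    """Shannon entropy (bits) of basin sizes, found by iterating
    every one of the 2**n states to its attractor."""
    attractor_of = {}
    for s0 in range(2 ** n):
        state = tuple((s0 >> i) & 1 for i in range(n))
        seen = {}
        # follow the trajectory until it repeats or hits a known basin
        while state not in seen and state not in attractor_of:
            seen[state] = len(seen)
            state = step(state)
        # label the basin by a canonical state on (or leading to) the attractor
        label = attractor_of.get(state, state)
        for s in seen:
            attractor_of[s] = label
        attractor_of.setdefault(state, label)
    sizes = {}
    for label in attractor_of.values():
        sizes[label] = sizes.get(label, 0) + 1
    total = 2 ** n
    return -sum(c / total * math.log2(c / total) for c in sizes.values())

step = random_boolean_network(n=8, k=2, seed=1)
print(basin_entropy(step, 8))
```

As a sanity check, the identity map on 2 nodes makes every one of the 4 states its own fixed point, giving the maximal entropy of 2 bits.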

    The approach for complexity analysis of multivariate time series

    This paper proposes to estimate the complexity of a multivariate time series by a spatio-temporal entropy based on multivariate singular spectrum analysis (M-SSA). To account for both within- and cross-component dependencies across multiple data channels, the high-dimensional block Toeplitz covariance matrix is decomposed as a Kronecker product of a spatial and a temporal covariance matrix, and the multivariate spatio-temporal entropy is defined in terms of the modulus and angle of the complex quantity constructed from the spatial and temporal components of the multivariate entropy. The benefits of the proposed approach are illustrated by simulations on the complexity analysis of multivariate deterministic and stochastic processes.
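A very loose sketch of the idea of pairing a spatial and a temporal entropy into one complex quantity. Everything here is an assumption for illustration: the spatial/temporal covariance estimators, the window length, and the combination `H_spatial + i*H_temporal` are guesses, not the paper's M-SSA-based definitions:

```python
import numpy as np

def spectral_entropy(cov):
    """Shannon entropy (bits) of the normalized eigenvalue
    spectrum of a covariance matrix."""
    lam = np.clip(np.linalg.eigvalsh(cov), 0, None)
    p = lam / lam.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def spatio_temporal_entropy(x, window):
    """x: array of shape (channels, samples).
    Returns (modulus, angle) of H_spatial + i*H_temporal --
    an illustrative stand-in for the paper's construction."""
    c, t = x.shape
    xc = x - x.mean(axis=1, keepdims=True)
    spatial_cov = xc @ xc.T / t  # (c, c) cross-channel covariance
    # temporal covariance from lagged embeddings, averaged over channels
    emb = np.stack([
        np.stack([ch[i:i + window] for i in range(t - window + 1)])
        for ch in xc
    ])  # shape (c, t - window + 1, window)
    temporal_cov = np.einsum('cnw,cnv->wv', emb, emb) / (c * (t - window + 1))
    z = complex(spectral_entropy(spatial_cov), spectral_entropy(temporal_cov))
    return abs(z), float(np.angle(z))

rng = np.random.default_rng(0)
modulus, angle = spatio_temporal_entropy(rng.standard_normal((3, 200)), window=10)
print(modulus, angle)
```

Since both entropies are positive for noisy data, the angle lands strictly between 0 and π/2; the modulus grows with overall spectral flatness in both domains.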

    Complexity of Nondeterministic Functions

    The complexity of a nondeterministic function is the minimum possible complexity of its determinisation. The entropy of a nondeterministic function F is minus the logarithm of the ratio between the number of determinisations of F and the number of all deterministic functions. We obtain an upper bound on the worst-case complexity of a nondeterministic function with restricted entropy. These bounds have strong applications to the problem of algorithm derandomization: many randomized algorithms can be converted to deterministic ones given an effective hitting set with suitable parameters (a set is hitting for a set system if it has a nonempty intersection with every set in the system). Linial, Luby, Saks and Zuckerman (1993) constructed the best known effective hitting set for the system of k-value, n-dimensional rectangles; its size is polynomial in k log n / epsilon. Our bounds on the complexity of nondeterministic functions make it possible to construct an effective hitting set for this system of size almost linear in k log n / epsilon.
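The hitting-set property in the abstract is just "nonempty intersection with every set in the system", which is easy to check by brute force. A minimal sketch on a toy system (the explicit "rectangles" over {0,1}^2 below are illustrative, not the k-value, n-dimensional rectangles of the paper):

```python
def is_hitting(candidate, set_system):
    """True iff `candidate` intersects every set in the system."""
    return all(any(x in s for x in candidate) for s in set_system)

# Toy system of "rectangles": subsets of {0,1}^2 listed explicitly.
system = [
    {(0, 0), (0, 1)},   # first coordinate fixed to 0
    {(0, 0), (1, 0)},   # second coordinate fixed to 0
    {(1, 1)},           # both coordinates fixed to 1
]

print(is_hitting({(0, 0), (1, 1)}, system))  # → True
print(is_hitting({(0, 1)}, system))          # → False
```

The point of constructions like Linial–Luby–Saks–Zuckerman is to obtain such a set whose size is small relative to the (exponentially large) system it must hit.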

    A generalized permutation entropy for noisy dynamics and random processes

    Permutation entropy measures the complexity of a deterministic time series via a symbolic quantization of the data into rank vectors called ordinal patterns, or simply permutations. Reasons for the increasing popularity of this entropy in time series analysis include that (i) it converges to the Kolmogorov–Sinai entropy of the underlying dynamics in the limit of ever longer permutations, and (ii) its computation dispenses with generating partitions and ad hoc partitions. However, permutation entropy diverges when the number of allowed permutations grows super-exponentially with their length, as happens when time series are output by dynamical systems with observational or dynamical noise, or by purely random processes. In this paper, we propose a generalized permutation entropy, belonging to the class of group entropies, that remains finite in that situation, which is actually the one found in practice. The theoretical results are illustrated numerically on random processes with short- and long-term dependencies, as well as on noisy deterministic signals.
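The standard (ungeneralized) permutation entropy that the last abstract builds on can be sketched in a few lines: slide a window of length `order` over the series, map each window to the permutation that sorts it, and take the Shannon entropy of the resulting pattern distribution. The generalized group-entropy version of the paper is not reproduced here:

```python
import math

def permutation_entropy(series, order=3):
    """Shannon entropy (bits) of the distribution of ordinal
    patterns (rank vectors) of length `order` in the series."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern: index order that sorts the window (stable on ties)
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A monotone series realizes a single ordinal pattern, hence zero entropy,
# while a strictly alternating series realizes exactly two patterns (1 bit).
print(permutation_entropy(list(range(100)), order=3))  # → 0.0
print(permutation_entropy([0, 1] * 50, order=3))       # → 1.0
```

The divergence the abstract refers to appears when, for noisy or random data, the number of observed patterns keeps growing as `order` increases, so the entropy per pattern length does not settle to a finite limit.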