
    Expressing the entropy of lattice systems as sums of conditional entropies

    Whether a system is to be considered complex or not depends on how one searches for correlations. We propose a general scheme for calculating entropies in lattice systems that offers high flexibility in how correlations are successively taken into account. Compared to the traditional approach for estimating the entropy density, in which successive approximations build on step-wise extensions of blocks of symbols, we show that one can take larger steps when collecting the statistics necessary to calculate the entropy density of the system. In one dimension this means that, instead of a single sweep over the system in which states are read sequentially, one takes several sweeps with larger steps so that eventually the whole lattice is covered. The information in correlations is thus captured in a different way, and in some situations this leads to considerably faster convergence of the entropy density estimate as a function of the size of the configurations used in the estimate. The formalism is exemplified with both a free energy minimisation scheme for the two-dimensional Ising model and an example of increasingly complex spatial correlations generated by the time evolution of elementary cellular automaton rule 60.
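
    For concreteness, here is a minimal Python sketch of the traditional estimate the abstract compares against: the entropy density approximated by block-entropy increments (conditional entropies) collected from a one-dimensional binary configuration, with an adjustable sampling stride loosely standing in for the "larger steps" idea. The function names, the stride parameter, and the random test configuration are illustrative assumptions, not the authors' scheme.

    import numpy as np
    from collections import Counter

    def block_entropy(symbols, block_len, step=1):
        """Shannon entropy (bits) of length-`block_len` blocks, sampled every `step` sites."""
        counts = Counter(tuple(symbols[i:i + block_len])
                         for i in range(0, len(symbols) - block_len + 1, step))
        total = sum(counts.values())
        probs = np.array([c / total for c in counts.values()])
        return float(-np.sum(probs * np.log2(probs)))

    def entropy_density_estimates(symbols, max_block=6, step=1):
        """Successive conditional-entropy estimates h_n = S(n) - S(n-1) of the entropy density."""
        s_prev, estimates = 0.0, []
        for n in range(1, max_block + 1):
            s_n = block_entropy(symbols, n, step=step)
            estimates.append(s_n - s_prev)
            s_prev = s_n
        return estimates

    # Random binary configuration: the estimates should converge to ~1 bit/site
    rng = np.random.default_rng(0)
    config = rng.integers(0, 2, size=100_000)
    print(entropy_density_estimates(config, step=1))   # sequential sweep
    print(entropy_density_estimates(config, step=3))   # sparser sampling of blocks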

    Frequency Effects on Predictability of Stock Returns

    We propose that predictability is a prerequisite for profitability on financial markets. We look at ways to measure the predictability of price changes using an information-theoretic approach and employ them on all historical data available for NYSE 100 stocks. This allows us to determine whether the frequency at which price changes are sampled affects their predictability. We also examine the relation between the predictability of price changes and the deviation of the price formation processes from iid, as well as the stock's sector. We also briefly comment on the complicated relationship between the predictability of price changes and the profitability of algorithmic trading.
    Comment: 8 pages, 16 figures, submitted for possible publication to the Computational Intelligence for Financial Engineering and Economics 2014 conference
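
    As a rough illustration of an information-theoretic predictability measure of the kind the abstract describes, the sketch below discretizes the log-returns of a price series into a small alphabet and computes a naive plug-in entropy-rate estimate at several sampling steps; a lower rate suggests more predictable price changes. The binning, block length, synthetic price series, and function names are assumptions for illustration, not the paper's exact methodology.

    import numpy as np
    from collections import Counter

    def discretize_returns(prices, n_bins=4):
        """Map log-returns to a small alphabet via quantile binning."""
        returns = np.diff(np.log(prices))
        edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1)[1:-1])
        return np.digitize(returns, edges)

    def block_entropy_rate(symbols, block_len=3):
        """Naive plug-in entropy-rate estimate (bits/symbol) from length-`block_len` blocks."""
        counts = Counter(zip(*(symbols[i:] for i in range(block_len))))
        total = sum(counts.values())
        probs = np.array([c / total for c in counts.values()])
        return float(-np.sum(probs * np.log2(probs)) / block_len)

    # Compare predictability across sampling frequencies on a synthetic price series
    rng = np.random.default_rng(1)
    prices = np.exp(np.cumsum(rng.normal(0, 1e-3, size=50_000)))
    for step in (1, 5, 30):                      # e.g. 1-, 5-, 30-tick sampling
        h = block_entropy_rate(discretize_returns(prices[::step]))
        print(f"sampling step {step}: entropy rate ~ {h:.3f} bits/symbol")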

    Time Resolution Dependence of Information Measures for Spiking Neurons: Atoms, Scaling, and Universality

    The mutual information between stimulus and spike-train response is commonly used to monitor neural coding efficiency, but neuronal computation broadly conceived requires more refined and targeted information measures of input-output joint processes. A first step towards that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate). We calculate these for spike trains generated by a variety of noise-driven integrate-and-fire neurons as a function of time resolution and for alternating renewal processes. We show that their time-resolution dependence reveals coarse-grained structural properties of interspike interval statistics; e.g., τ-entropy rates that diverge less quickly than the firing rate indicate interspike interval correlations. We also find evidence that the excess entropy and regularized statistical complexity of different types of integrate-and-fire neurons are universal in the continuous-time limit in the sense that they do not depend on mechanism details. This suggests a surprising simplicity in the spike trains generated by these model neurons. Interestingly, neurons with gamma-distributed ISIs and neurons whose spike trains are alternating renewal processes do not fall into the same universality class. These results lead to two conclusions. First, the dependence of information measures on time resolution reveals mechanistic details about spike train generation. Second, information measures can be used as model selection tools for analyzing spike train processes.
    Comment: 20 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/trdctim.ht
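
    A minimal sketch of how an entropy rate can be computed as a function of time resolution τ: bin a spike train at resolution τ into a binary sequence and apply a plug-in block-entropy-rate estimate. The gamma interspike-interval parameters, block length, and function names are illustrative assumptions; the paper's estimators, and its other measures such as excess entropy and statistical complexity, are more refined.

    import numpy as np
    from collections import Counter

    def bin_spike_train(spike_times, tau, t_max):
        """Binary sequence at resolution tau: 1 if a bin contains at least one spike."""
        n_bins = int(np.ceil(t_max / tau))
        counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, t_max))
        return (counts > 0).astype(int)

    def entropy_rate(symbols, block_len=6):
        """Plug-in block-entropy-rate estimate in bits per time bin."""
        counts = Counter(zip(*(symbols[i:] for i in range(block_len))))
        total = sum(counts.values())
        probs = np.array([c / total for c in counts.values()])
        return float(-np.sum(probs * np.log2(probs)) / block_len)

    # Hypothetical renewal spike train with gamma-distributed interspike intervals
    rng = np.random.default_rng(2)
    isis = rng.gamma(shape=2.0, scale=0.05, size=10_000)   # mean ISI = 0.1 s
    spike_times = np.cumsum(isis)
    for tau in (0.001, 0.005, 0.02):
        seq = bin_spike_train(spike_times, tau, spike_times[-1])
        print(f"tau = {tau:.3f} s: ~{entropy_rate(seq) / tau:.0f} bits/s")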

    Outlier Detection Techniques For Wireless Sensor Networks: A Survey

    In the field of wireless sensor networks, measurements that significantly deviate from the normal pattern of sensed data are considered outliers. The potential sources of outliers include noise and errors, events, and malicious attacks on the network. Traditional outlier detection techniques are not directly applicable to wireless sensor networks due to the multivariate nature of sensor data and the specific requirements and limitations of wireless sensor networks. This survey provides a comprehensive overview of existing outlier detection techniques specifically developed for wireless sensor networks. Additionally, it presents a technique-based taxonomy and a decision tree to be used as a guideline for selecting a technique suitable for the application at hand, based on characteristics such as data type, outlier type, and outlier degree.
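
    The survey itself is technique-agnostic; as a small illustration of the statistical, distance-based family of methods it covers, the sketch below flags multivariate sensor readings by their Mahalanobis distance from the sample mean. The threshold, the two-dimensional temperature/humidity readings, and the function name are hypothetical, not taken from the survey.

    import numpy as np

    def mahalanobis_outliers(readings, threshold=3.0):
        """Flag readings whose Mahalanobis distance from the sample mean exceeds `threshold`."""
        mean = readings.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(readings, rowvar=False))
        diffs = readings - mean
        d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
        return np.sqrt(d2) > threshold

    # Hypothetical (temperature, humidity) readings with two injected faults
    rng = np.random.default_rng(3)
    normal = rng.multivariate_normal([25.0, 40.0], [[1.0, 0.5], [0.5, 2.0]], size=500)
    faulty = np.array([[40.0, 10.0], [5.0, 90.0]])
    readings = np.vstack([normal, faulty])
    flags = mahalanobis_outliers(readings)
    print(f"{flags.sum()} outliers flagged out of {len(readings)} readings")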