
    Improving partial mutual information-based input variable selection by consideration of boundary issues associated with bandwidth estimation

    Abstract not available. Xuyuan Li, Aaron C. Zecchin, Holger R. Maier

    Reconstructing dynamical networks via feature ranking

    Empirical data on real complex systems are becoming increasingly available. Parallel to this is the need for new methods of reconstructing (inferring) the topology of networks from time-resolved observations of their node-dynamics. The methods based on physical insights often rely on strong assumptions about the properties and dynamics of the scrutinized network. Here, we use the insights from machine learning to design a new method of network reconstruction that essentially makes no such assumptions. Specifically, we interpret the available trajectories (data) as features, and use two independent feature ranking approaches -- Random forest and RReliefF -- to rank the importance of each node for predicting the value of each other node, which yields the reconstructed adjacency matrix. We show that our method is fairly robust to coupling strength, system size, trajectory length and noise. We also find that the reconstruction quality strongly depends on the dynamical regime.
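    A rough sketch of the random-forest variant of this idea (the function name, the one-step-ahead regression target, and all parameter values below are illustrative assumptions, not details taken from the paper):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def reconstruct_adjacency(trajectories, n_trees=200, seed=0):
    """
    trajectories: array of shape (T, N), T time points for N nodes.
    Returns an (N, N) score matrix; entry (j, i) scores the putative link j -> i.
    The one-step-ahead target and the parameter values are illustrative
    assumptions, not choices taken from the paper.
    """
    T, N = trajectories.shape
    scores = np.zeros((N, N))
    for i in range(N):
        others = [j for j in range(N) if j != i]
        X = trajectories[:-1, others]   # predictors: the other nodes at time t
        y = trajectories[1:, i]         # target: node i at time t+1
        rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
        rf.fit(X, y)
        scores[others, i] = rf.feature_importances_   # importance of j for predicting i
    return scores
```

    Ranking or thresholding the entries of the returned score matrix then gives a candidate adjacency matrix; RReliefF could be substituted for the random forest as the second ranking approach.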

    Incremental Mutual Information: A New Method for Characterizing the Strength and Dynamics of Connections in Neuronal Circuits

    Understanding the computations performed by neuronal circuits requires characterizing the strength and dynamics of the connections between individual neurons. This characterization is typically achieved by measuring the correlation in the activity of two neurons. We have developed a new measure for studying connectivity in neuronal circuits based on information theory, the incremental mutual information (IMI). By conditioning out the temporal dependencies in the responses of individual neurons before measuring the dependency between them, IMI improves on standard correlation-based measures in several important ways: 1) it has the potential to disambiguate statistical dependencies that reflect the connection between neurons from those caused by other sources (e.g., shared inputs or intrinsic cellular or network mechanisms) provided that the dependencies have appropriate timescales, 2) for the study of early sensory systems, it does not require responses to repeated trials of identical stimulation, and 3) it does not assume that the connection between neurons is linear. We describe the theory and implementation of IMI in detail and demonstrate its utility on experimental recordings from the primate visual system.
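    A minimal sketch of the underlying idea, measuring the dependency between two discretized responses only after conditioning on the target neuron's own recent past, using a plug-in conditional mutual information estimator (the function names, the binning, and the exact conditioning set are our assumptions; the estimator in the paper may differ):

```python
import numpy as np
from collections import Counter

def plugin_entropy(symbols):
    """Plug-in (maximum-likelihood) entropy of a sequence of discrete symbols, in bits."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def conditional_mi(x, y, delay=1, past=1):
    """
    I(x_t ; y_{t-delay} | x_{t-past}, ..., x_{t-1}) for discrete sequences
    (e.g. binned spike counts), via I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C).
    Conditioning on x's own recent past is the core idea behind IMI; the exact
    conditioning set and estimator used in the paper may differ.
    """
    x, y = list(x), list(y)
    t0 = max(delay, past)
    a = [(x[t],) for t in range(t0, len(x))]                # target response x_t
    b = [(y[t - delay],) for t in range(t0, len(x))]        # putative source y_{t-delay}
    c = [tuple(x[t - past:t]) for t in range(t0, len(x))]   # x's own recent past
    ac = [ai + ci for ai, ci in zip(a, c)]
    bc = [bi + ci for bi, ci in zip(b, c)]
    abc = [ai + bi + ci for ai, bi, ci in zip(a, b, c)]
    return plugin_entropy(ac) + plugin_entropy(bc) - plugin_entropy(abc) - plugin_entropy(c)

# Toy usage: x copies y with a 2-step lag, so the conditional MI at delay=2 is high.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5000)
x = np.roll(y, 2)
print(conditional_mi(x, y, delay=2))
```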

    Configuration model for correlation matrices preserving the node strength

    Correlation matrices are a major type of multivariate data. To examine properties of a given correlation matrix, a common practice is to compare the same quantity between the original correlation matrix and reference correlation matrices, such as those derived from random matrix theory, that partially preserve properties of the original matrix. We propose a model to generate such reference correlation and covariance matrices for the given matrix. Correlation matrices are often analysed as networks, which are heterogeneous across nodes in terms of each node's total connectivity to the other nodes. Given this background, the present algorithm generates random networks that preserve the expectation of each node's total connectivity to the other nodes, akin to configuration models for conventional networks. Our algorithm is derived from the maximum entropy principle. We apply the proposed algorithm to measurement of clustering coefficients and community detection, both of which require a null model to assess the statistical significance of the obtained results. Comment: 8 figures, 4 tables
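    For context, the conventional-network analogue that the abstract alludes to, the standard weighted configuration model that fixes each node's expected strength via E[w_ij] = s_i * s_j / (2W), can be sketched as follows. This is not the authors' maximum-entropy construction for correlation matrices; the function name and the positive-correlation thresholding are our illustrative assumptions:

```python
import numpy as np

def strength_preserving_null(weights):
    """
    Expected weights under the standard weighted configuration model:
    E[w_ij] = s_i * s_j / (2W), where s_i is the strength of node i and
    2W is the total strength. This is the conventional-network analogue,
    not the maximum-entropy model for correlation matrices in the paper.
    """
    w = np.array(weights, dtype=float)
    np.fill_diagonal(w, 0.0)          # ignore self-correlations
    s = w.sum(axis=1)                 # node strengths
    two_w = s.sum()                   # total strength (2W)
    expected = np.outer(s, s) / two_w
    np.fill_diagonal(expected, 0.0)
    return expected

# Example: observed minus expected weights, the ingredient of a
# modularity-style quality function for community detection.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
C = np.corrcoef(X, rowvar=False)
W = np.clip(C, 0.0, None)             # crude choice: keep positive correlations only
np.fill_diagonal(W, 0.0)
excess = W - strength_preserving_null(W)
print(np.round(excess, 3))
```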