5 research outputs found

    Find Your Place: Simple Distributed Algorithms for Community Detection

    Given an underlying graph, we consider the following dynamics: initially, each node locally chooses a value in {−1, 1}, uniformly at random and independently of the other nodes. Then, in each consecutive round, every node updates its local value to the average of the values held by its neighbors, at the same time applying an elementary, local clustering rule that depends only on the current and the previous values held by the node. We prove that the process resulting from this dynamics produces a clustering that exactly or approximately (depending on the graph) reflects the underlying cut in logarithmic time, under various graph models that exhibit a sparse balanced cut, including the stochastic block model. We also prove that a natural extension of this dynamics performs community detection on a regularized version of the stochastic block model with multiple communities. Rather surprisingly, our results provide rigorous evidence for the ability of an extremely simple and natural dynamics to perform community detection, a task that is non-trivial even in a centralized setting.
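
    The dynamics described above is simple enough to sketch directly. Below is a minimal illustration, assuming networkx for graph handling and using the sign of the change between consecutive rounds as the elementary local clustering rule; that rule, the round count, and the planted-partition example are illustrative assumptions, not necessarily the exact protocol analyzed in the paper.

```python
import random
import networkx as nx  # assumed graph library for the sketch

def averaging_dynamics(G, rounds=50):
    """Sketch of the averaging dynamics from the abstract.

    Each node starts with a uniform random value in {-1, +1}; in every
    round it replaces its value with the average of its neighbours'
    values.  The clustering rule here (label = sign of the change between
    consecutive rounds) is an illustrative assumption.
    """
    x = {v: random.choice([-1.0, 1.0]) for v in G.nodes}
    labels = {v: 1 for v in G.nodes}
    for _ in range(rounds):
        x_new = {}
        for v in G.nodes:
            nbrs = list(G.neighbors(v))
            x_new[v] = sum(x[u] for u in nbrs) / len(nbrs) if nbrs else x[v]
        # Elementary local rule: cluster by whether the value rose or fell.
        labels = {v: (1 if x_new[v] >= x[v] else -1) for v in G.nodes}
        x = x_new
    return labels

# Example on a planted partition with a sparse balanced cut.
G = nx.planted_partition_graph(2, 200, p_in=0.1, p_out=0.01, seed=0)
labels = averaging_dynamics(G)
# The two planted blocks (nodes 0..199 and 200..399) should end up with
# opposite majority labels.
print(sum(labels[v] for v in range(200)), sum(labels[v] for v in range(200, 400)))
```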

    GANs schön kompliziert: Applications of Generative Adversarial Networks

    Scientific research progresses via model-building. Researchers attempt to build realistic models of real-world phenomena, ranging from bacterial growth to galactic motion, and study these models as a means of understanding these phenomena. However, making these models as realistic as possible often involves fitting them to experimentally measured data. Recent advances in experimental methods have allowed for the collection of large-scale datasets. Simultaneously, advancements in computational capacity have allowed for more complex model-building. The confluence of these two factors accounts for the rise of machine learning methods as powerful tools, both for building models and fitting these models to large-scale datasets. In this thesis, we use a particular machine learning technique: generative adversarial networks (GANs). GANs are a flexible and powerful tool, capable of fitting a wide variety of models. We explore the properties of GANs that underpin this flexibility, and show how we can capitalize on them in different scientific applications, beyond the image- and text-generating applications they are well known for. Here we present three different applications of GANs. First, we show how GANs can be used as generative models of neural spike trains, and how they are capable of capturing more features of these spike trains compared to other approaches. We also show how this could enable insight into how information about stimuli is encoded in the spike trains. Second, we demonstrate how GANs can be used as density estimators for extending simulation-based Bayesian inference to high-dimensional parameter spaces. In this form, we also show how GANs bridge Bayesian inference methods and variational inference with autoencoders, and we use them to fit complex climate models to data. Finally, we use GANs to infer synaptic plasticity rules for biological rate networks directly from data. We then show how GANs can be used to test the robustness of the inferred rules to differences in data and network initialisation. Overall, we repurpose GANs in new ways for a variety of scientific domains, and show that they confer specific advantages over the state-of-the-art methods in each of these domains.
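
    As background for the applications above, a minimal sketch of the adversarial training loop that defines a GAN may help. The toy 1-D Gaussian target, network architectures, and hyperparameters below are illustrative assumptions, not the models used in the thesis.

```python
# Minimal GAN training loop sketch in PyTorch on a toy 1-D Gaussian target.
import torch
import torch.nn as nn

latent_dim, lr, steps, batch = 8, 1e-3, 2000, 128

G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=lr)
opt_d = torch.optim.Adam(D.parameters(), lr=lr)
bce = nn.BCELoss()

for _ in range(steps):
    real = torch.randn(batch, 1) * 0.5 + 2.0        # samples from the "data"
    fake = G(torch.randn(batch, latent_dim))        # samples from the generator

    # Discriminator step: push real towards 1 and fake towards 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(batch, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Generator step: try to make the discriminator output 1 on fakes.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
```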

    A Cognitive Information Theory of Music: A Computational Memetics Approach

    This thesis offers an account of music cognition based on information theory and memetics. My research strategy is to split the memetic modelling into four layers: Data, Information, Psychology and Application. Multiple cognitive models are proposed for the Information and Psychology layers, and the models that best fit published human data under the minimum description length (MDL) criterion are selected. Then, for the Psychology layer only, new experiments are conducted to validate the best-fit models. In the information chapter, an information-theoretic model of musical memory is proposed, along with two competing models. The proposed model exhibited a better fit with human data than the competing models. Higher-level psychological theories are then built on top of this information layer. In the similarity chapter, I proposed three competing models of musical similarity, and conducted a new experiment to validate the best-fit model. In the fitness chapter, I again proposed three competing models of musical fitness, and conducted a new experiment to validate the best-fit model. In both cases, the correlations with human data are statistically significant. All in all, my research has shown that the memetic strategy is sound, and the modelling results are encouraging. Implications of this research are discussed.
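
    A minimal sketch of what MDL-based model selection looks like, assuming a simple two-part code (data bits plus a parameter-count penalty); the candidate models and numbers are hypothetical, not the thesis's models.

```python
# Two-part minimum description length (MDL) model selection sketch.
import math

def mdl_score(neg_log_likelihood_bits, n_params, n_observations):
    """Data code length plus a crude k/2 * log2(n) model cost."""
    model_bits = 0.5 * n_params * math.log2(n_observations)
    return neg_log_likelihood_bits + model_bits

# Hypothetical candidates: (name, -log2 likelihood of human data, #parameters).
candidates = [
    ("memory model A", 1520.4, 3),
    ("memory model B", 1498.7, 9),
    ("memory model C", 1505.2, 5),
]
n_obs = 400  # assumed number of behavioural observations
best = min(candidates, key=lambda c: mdl_score(c[1], c[2], n_obs))
print("MDL-preferred model:", best[0])
```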

    Blind source separation: the effects of signal non-stationarity


    Oja's algorithm for graph clustering, Markov spectral decomposition, and risk sensitive control

    Given a positive definite matrix M and an integer N_m ≥ 1, Oja's subspace algorithm provides convergent estimates of the first N_m eigenvalues of M along with the corresponding eigenvectors. It is a common approach to principal component analysis. This paper introduces a normalized stochastic-approximation implementation of Oja's subspace algorithm, as well as new applications to the spectral decomposition of a reversible Markov chain. Recall that reversibility means that the stationary distribution satisfies the detailed balance equations (Meyn & Tweedie, 2009); equivalently, the statistics of the process in steady state do not change when time is reversed. Stability and convergence of Oja's algorithm are established under conditions far milder than those assumed in previous work. Applications to graph clustering, Markov spectral decomposition, and multiplicative ergodic theory are surveyed, along with numerical results.
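
    For orientation, here is a minimal sketch of Oja's subspace algorithm in a stochastic-approximation form, assuming numpy; the step-size schedule and QR-based re-orthonormalisation are illustrative choices rather than the specific normalized scheme introduced in the paper.

```python
# Oja's subspace algorithm sketch: streaming samples x_t with E[x x^T] = M,
# a Hebbian-style update, and a QR re-orthonormalisation step.
import numpy as np

def oja_subspace(sample, dim, n_components, n_steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    W = np.linalg.qr(rng.standard_normal((dim, n_components)))[0]
    for t in range(1, n_steps + 1):
        x = sample(rng)                       # one sample with covariance M
        gamma = 1.0 / (100 + t)               # decreasing step size
        W = W + gamma * np.outer(x, x @ W)    # Hebbian update towards M W
        W, _ = np.linalg.qr(W)                # keep the columns orthonormal
    return W                                  # spans the top eigen-subspace of M

# Example: estimate the top-2 eigen-subspace of a random positive definite M.
d = 6
A = np.random.default_rng(1).standard_normal((d, d))
M = A @ A.T + np.eye(d)
L = np.linalg.cholesky(M)
W = oja_subspace(lambda rng: L @ rng.standard_normal(d), d, 2)
print(np.round(W.T @ M @ W, 2))   # ≈ diagonal of the two largest eigenvalues
```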