
    Contour detection by CORF operator

    We propose a contour operator, called CORF, inspired by the properties of simple cells in the visual cortex. It combines, by a weighted geometric mean, the blurred responses of difference-of-Gaussians operators that model cells in the lateral geniculate nucleus (LGN). An operator that has gained particular popularity as a computational model of a simple cell is based on a family of Gabor functions (GFs). However, the GF operator short-cuts the LGN, and its effectiveness in contour detection, which is assumed to be the primary biological role of simple cells, has never been compared with that of alternative operators. We compare the performance of the CORF and GF operators on the RuG and Berkeley data sets of natural scenes with associated ground truths. The proposed CORF operator outperforms the GF operator (RuG: t(39)=4.39, p<10^-4; Berkeley: t(499)=4.95, p<10^-6).
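    As a rough illustration of the construction described in this abstract, the sketch below combines blurred, shifted difference-of-Gaussians responses by a weighted geometric mean. It is a minimal reading of the abstract, not the authors' implementation; the surround-to-center ratio of 2, the half-wave rectification, and the subunit parameters (sigma, blur, shift, weight) are assumptions chosen for illustration only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_response(image, sigma, on_center=True):
    """Half-wave-rectified difference-of-Gaussians response (a simple LGN cell model)."""
    center = gaussian_filter(image, sigma)
    surround = gaussian_filter(image, 2.0 * sigma)   # surround/center ratio of 2 is an assumption
    response = center - surround if on_center else surround - center
    return np.maximum(response, 0.0)

def corf_like_response(image, subunits):
    """Combine blurred, shifted DoG responses by a weighted geometric mean.

    `subunits` is a list of dicts with keys: sigma, on_center, blur, shift, weight.
    """
    eps = 1e-6                                       # avoid log(0)
    log_sum = np.zeros_like(image, dtype=float)
    total_weight = 0.0
    for s in subunits:
        r = dog_response(image, s["sigma"], s["on_center"])
        r = gaussian_filter(r, s["blur"])                    # blur the DoG response
        r = np.roll(r, shift=s["shift"], axis=(0, 1))        # place it at the subunit position
        log_sum += s["weight"] * np.log(r + eps)
        total_weight += s["weight"]
    return np.exp(log_sum / total_weight)                    # weighted geometric mean

# Example: a vertical-edge-selective arrangement of four subunits (illustrative values)
# image = np.random.rand(64, 64)
# subunits = [
#     {"sigma": 1.0, "on_center": True,  "blur": 1.0, "shift": (0, -2), "weight": 1.0},
#     {"sigma": 1.0, "on_center": False, "blur": 1.0, "shift": (0,  2), "weight": 1.0},
# ]
# edge_map = corf_like_response(image, subunits)
```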

    Analog Neural Networks as Decoders

    Analog neural networks with feedback can be used to implement k-Winner-Take-All (KWTA) networks. In turn, KWTA networks can be used as decoders of a class of nonlinear error-correcting codes. By interconnecting such KWTA networks, we can construct decoders capable of decoding more powerful codes. We consider several families of interconnected KWTA networks, analyze their performance in terms of coding-theory metrics, and consider the feasibility of embedding such networks in VLSI technologies.
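    A k-Winner-Take-All stage can be summarised as "keep the k largest inputs". The toy sketch below shows that idealised operation and how it acts as a decoding step if one assumes the codewords form a constant-weight code with exactly k ones per word; the analog feedback dynamics used in the paper and the actual code family are not specified here, so this choice is purely illustrative.

```python
import numpy as np

def kwta(x, k):
    """Idealised k-Winner-Take-All: output 1 for the k largest inputs, 0 elsewhere."""
    winners = np.argsort(x)[-k:]
    out = np.zeros(len(x))
    out[winners] = 1.0
    return out

def decode_constant_weight(received, k):
    """Toy decoder: map a noisy real-valued vector to the binary word of weight k
    whose ones sit on the k largest received values; a single KWTA stage does this."""
    return kwta(received, k)

# Example: codeword (1, 1, 0, 0, 1, 0) with k = 3, observed through additive noise
received = np.array([0.9, 0.7, 0.2, -0.1, 1.1, 0.3])
print(decode_constant_weight(received, k=3))   # -> [1. 1. 0. 0. 1. 0.]
```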

    Spectrum-based deep neural networks for fraud detection

    In this paper, we focus on fraud detection on a signed graph with only a small set of labeled training data. We propose a novel framework that combines deep neural networks and spectral graph analysis. In particular, we use the projection of each node (called its spectral coordinate) in the low-dimensional spectral space of the graph's adjacency matrix as input to deep neural networks. Spectral coordinates capture the most useful topological information of the network. Because spectral coordinates have a small dimension (compared with the dimension of the adjacency matrix derived from the graph), training deep neural networks becomes feasible. We develop and evaluate two neural networks, a deep autoencoder and a convolutional neural network, within our fraud detection framework. Experimental results on a real signed graph show that our spectrum-based deep neural networks are effective in fraud detection.
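    The sketch below shows one plausible way to compute such spectral coordinates: take the eigenvectors of the symmetric adjacency matrix associated with the k largest-magnitude eigenvalues and read off each node's row as its coordinate. Whether the original framework orders eigenvalues this way, scales the coordinates, or uses a different matrix is not stated in the abstract, so treat this as an assumption-laden illustration of the general idea.

```python
import numpy as np

def spectral_coordinates(adjacency, k=2):
    """Map each node into the k-dimensional spectral space of the symmetric
    adjacency matrix; row i of the result is node i's spectral coordinate."""
    eigvals, eigvecs = np.linalg.eigh(adjacency)     # eigenvalues in ascending order
    order = np.argsort(np.abs(eigvals))[::-1][:k]    # k eigenvalues of largest magnitude
    return eigvecs[:, order]                         # n x k matrix of node coordinates

# Toy usage on a 4-node signed graph (+1 friendly edge, -1 hostile edge)
A = np.array([[ 0,  1, -1,  0],
              [ 1,  0,  1, -1],
              [-1,  1,  0,  1],
              [ 0, -1,  1,  0]], dtype=float)
coords = spectral_coordinates(A, k=2)   # low-dimensional input rows for a downstream network
```

    The resulting n x k matrix is what makes training feasible: the network sees a k-dimensional vector per node rather than an n-dimensional adjacency row.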

    Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey

    Dynamic networks are used in a wide range of fields, including social network analysis, recommender systems, and epidemiology. Representing complex networks as structures that change over time allows network models to leverage not only structural but also temporal patterns. However, because the dynamic network literature stems from diverse fields and uses inconsistent terminology, it is challenging to navigate. Meanwhile, graph neural networks (GNNs) have gained a lot of attention in recent years for their ability to perform well on a range of network science tasks, such as link prediction and node classification. Despite the popularity of graph neural networks and the proven benefits of dynamic network models, there has been little focus on graph neural networks for dynamic networks. To address the challenges arising from the fact that this research crosses diverse fields, and to survey dynamic graph neural networks, this work is split into two main parts. First, to resolve the ambiguity of dynamic network terminology, we establish a foundation of dynamic networks with consistent, detailed terminology and notation. Second, we present a comprehensive survey of dynamic graph neural network models using the proposed terminology.
    Comment: 28 pages, 9 figures, 8 tables
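    Two representations commonly used for networks that change over time are discrete-time snapshots and continuous-time edge events; the minimal sketch below shows both as plain data structures. The class names and fields are illustrative and are not taken from the survey's own terminology.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SnapshotDynamicGraph:
    """Discrete-time view: an ordered list of edge-list snapshots, one per time step."""
    snapshots: List[List[Tuple[int, int]]] = field(default_factory=list)

    def add_snapshot(self, edges):
        self.snapshots.append(list(edges))

@dataclass
class EventDynamicGraph:
    """Continuous-time view: a stream of timestamped edge events (u, v, t)."""
    events: List[Tuple[int, int, float]] = field(default_factory=list)

    def add_event(self, u, v, t):
        self.events.append((u, v, t))

# A model over snapshots can learn structural patterns within each snapshot and
# temporal patterns across snapshots; the event view preserves exact timings.
snaps = SnapshotDynamicGraph()
snaps.add_snapshot([(0, 1), (1, 2)])       # time step 0
snaps.add_snapshot([(0, 1), (2, 3)])       # time step 1
```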