
    Towards more realistic network models based on Graph Neural Networks

    Recently, a Graph Neural Network (GNN) model called RouteNet was proposed as an efficient method to estimate end-to-end network performance metrics such as delay or jitter, given the topology, routing, and traffic of the network. Despite its success in making accurate estimations and generalizing to unseen topologies, the model makes some simplifying assumptions about the network and does not consider all the particularities of how real networks operate. In this work we extend the architecture of RouteNet to support different features on forwarding devices; specifically, we focus on devices with variable queue sizes, and we experimentally evaluate the accuracy of the extended RouteNet architecture. This work was supported by the Spanish MINECO under contract TEC2017-90034-C2-1-R (ALLIANCE), the Catalan Institution for Research and Advanced Studies (ICREA), an FI-AGAUR grant from the Catalan Government, and the AGH University of Science and Technology, and by the Polish Ministry of Science and Higher Education with the subvention funds of the Faculty of Computer Science, Electronics and Telecommunications of AGH University. The research was also supported in part by PL-Grid Infrastructure. Peer reviewed. Postprint (author's final draft).
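
    The extension described above amounts to exposing per-device queue information to the model's link/queue state. The snippet below is a minimal sketch of RouteNet-style path/link message passing in which each link's initial state includes its queue size as an extra input feature; it is not the authors' implementation, and the feature layout, hidden size, and GRU-based updates are illustrative assumptions.

    import torch
    import torch.nn as nn

    class RouteNetSketch(nn.Module):
        def __init__(self, hidden=32):
            super().__init__()
            # [capacity, queue_size] per link; queue size is the added device feature
            self.link_init = nn.Linear(2, hidden)
            self.path_init = nn.Linear(1, hidden)        # [traffic] per path
            self.path_cell = nn.GRUCell(hidden, hidden)  # path state updated link by link
            self.link_cell = nn.GRUCell(hidden, hidden)  # link state updated from path messages
            self.readout = nn.Linear(hidden, 1)          # per-path delay estimate

        def forward(self, link_feats, path_feats, paths, iters=4):
            # link_feats: (L, 2), path_feats: (P, 1), paths: list of link-index lists
            h_link = torch.tanh(self.link_init(link_feats))
            h_path = torch.tanh(self.path_init(path_feats))
            for _ in range(iters):
                msgs = [torch.zeros(h_link.size(1)) for _ in range(h_link.size(0))]
                new_paths = []
                for p, links in enumerate(paths):
                    h = h_path[p].unsqueeze(0)
                    for l in links:                       # walk the path, reading link states
                        h = self.path_cell(h_link[l].unsqueeze(0), h)
                        msgs[l] = msgs[l] + h.squeeze(0)  # leave a message on each traversed link
                    new_paths.append(h.squeeze(0))
                h_path = torch.stack(new_paths)
                h_link = self.link_cell(torch.stack(msgs), h_link)
            return self.readout(h_path)                   # one delay estimate per path

    Training such a sketch against measured per-path delays, for example with a mean squared error loss, mirrors the supervised setup the abstract describes.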

    Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey

    Dynamic networks are used in a wide range of fields, including social network analysis, recommender systems, and epidemiology. Representing complex networks as structures that change over time allows network models to leverage not only structural but also temporal patterns. However, as the dynamic network literature stems from diverse fields and uses inconsistent terminology, it is challenging to navigate. Meanwhile, graph neural networks (GNNs) have gained a lot of attention in recent years for their ability to perform well on a range of network science tasks, such as link prediction and node classification. Despite the popularity of graph neural networks and the proven benefits of dynamic network models, there has been little focus on graph neural networks for dynamic networks. To address the challenges arising from the fact that this research crosses diverse fields, and to survey dynamic graph neural networks, this work is split into two main parts. First, to address the ambiguity of dynamic network terminology, we establish a foundation of dynamic networks with consistent, detailed terminology and notation. Second, we present a comprehensive survey of dynamic graph neural network models using the proposed terminology. Comment: 28 pages, 9 figures, 8 tables
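
    A common concrete instantiation of the "structural plus temporal" idea in the discrete-time setting is to encode each network snapshot with a graph convolution and carry node states across snapshots with a recurrent cell. The sketch below only illustrates that general pattern; the layer choices, sizes, and names are assumptions made for illustration, not a model proposed in the survey.

    import torch
    import torch.nn as nn

    def gcn_layer(adj, x, weight):
        # Kipf & Welling style propagation: symmetrically normalize A + I, then transform
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return torch.relu(norm @ x @ weight)

    class SnapshotDynamicGNN(nn.Module):
        def __init__(self, in_dim, hidden):
            super().__init__()
            self.weight = nn.Parameter(0.1 * torch.randn(in_dim, hidden))
            self.gru = nn.GRUCell(hidden, hidden)       # temporal update per node

        def forward(self, snapshots, feats):
            # snapshots: list of (N, N) float adjacency matrices; feats: (N, in_dim)
            h = torch.zeros(feats.size(0), self.gru.hidden_size)
            for adj in snapshots:                       # iterate over time steps
                z = gcn_layer(adj, feats, self.weight)  # structural encoding of one snapshot
                h = self.gru(z, h)                      # fold it into each node's history
            return h                                    # node embeddings for downstream tasks

    Link prediction on the next snapshot can then score node pairs, for example with torch.sigmoid(h[i] @ h[j]), while node classification can read labels off h directly.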

    Deformable Shape Completion with Graph Convolutional Autoencoders

    The availability of affordable and portable depth sensors has made scanning objects and people simpler than ever. However, dealing with occlusions and missing parts is still a significant challenge. The problem of reconstructing a (possibly non-rigidly moving) 3D object from a single or multiple partial scans has received increasing attention in recent years. In this work, we propose a novel learning-based method for the completion of partial shapes. Unlike the majority of existing approaches, our method focuses on objects that can undergo non-rigid deformations. The core of our method is a variational autoencoder with graph convolutional operations that learns a latent space for complete realistic shapes. At inference, we optimize to find the representation in this latent space that best fits the generated shape to the known partial input. The completed shape exhibits a realistic appearance on the unknown part. We show promising results towards the completion of synthetic and real scans of human body and face meshes exhibiting different styles of articulation and partiality. Comment: CVPR 201
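
    The inference step the abstract describes, searching the learned latent space for the code whose decoded shape matches the observed vertices, can be written compactly. The sketch below assumes a pretrained decoder mapping a latent code to full mesh vertex positions and a boolean mask marking the known vertices; the names and hyperparameters are illustrative, not the authors' settings.

    import torch

    def complete_shape(decoder, partial_verts, known_mask, latent_dim=64, steps=500, lr=1e-2):
        # partial_verts: (V, 3) with valid positions where known_mask is True
        z = torch.zeros(latent_dim, requires_grad=True)   # latent code to optimize
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            pred = decoder(z)                             # (V, 3) decoded full shape
            loss = ((pred[known_mask] - partial_verts[known_mask]) ** 2).mean()
            loss.backward()                               # fit only the observed region
            opt.step()
        return decoder(z).detach()                        # the unknown part is filled in by
                                                          # the learned shape prior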

    Synaptic Noise Facilitates the Emergence of Self-Organized Criticality in the Caenorhabditis elegans Neuronal Network

    Avalanches with power-law distributed size parameters have been observed in neuronal networks. This observation might be a manifestation of self-organized criticality (SOC), yet the physiological mechanism of this behavior is currently unknown. Describing synaptic noise as transmission failures mainly originating from the probabilistic nature of neurotransmitter release, this study investigates the potential of this noise as a mechanism for driving the functional architecture of neuronal networks towards SOC. To this end, a simple finite-state neuron model, with activity-dependent and synapse-specific failure probabilities, was built based on the known anatomical connectivity data of the nematode Caenorhabditis elegans. Beginning from random values, it was observed that synaptic noise levels picked out a set of synapses, and consequently an active subnetwork, which generates power-law distributed neuronal avalanches. The findings of this study bring up the possibility that synaptic failures might be a component of the physiological processes underlying SOC in neuronal networks.
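
    The kind of model the abstract describes can be pictured as binary neurons on a fixed wiring diagram whose synapses transmit probabilistically and adapt with use, with avalanche sizes recorded per cascade. The sketch below only illustrates that setup: the random wiring, threshold, and adaptation rule are placeholder assumptions, not the study's parameters or the actual C. elegans connectome.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 280                                          # roughly connectome-sized network
    adj = rng.random((N, N)) < 0.03                  # placeholder random wiring
    p_fail = rng.uniform(0.1, 0.9, size=(N, N))      # synapse-specific failure probabilities
    threshold = 1                                    # incoming spikes needed to activate

    avalanche_sizes = []
    for trial in range(2000):
        active = np.zeros(N, dtype=bool)
        active[rng.integers(N)] = True               # seed a single spike
        size = 0
        while active.any():
            size += int(active.sum())
            # a synapse transmits only if its presynaptic neuron fired and it does not fail
            transmit = adj & active[:, None] & (rng.random((N, N)) >= p_fail)
            nxt = transmit.sum(axis=0) >= threshold
            # toy activity-dependent adaptation: recently used synapses fail more often
            p_fail = np.clip(p_fail + 0.01 * transmit - 0.001, 0.05, 0.95)
            active = nxt & ~active                   # crude refractoriness
            if size > 10 * N:                        # guard against runaway activity
                break
        avalanche_sizes.append(size)

    A power-law-like size distribution would then appear as an approximately straight line in a log-log histogram of avalanche_sizes.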