
    A Degeneracy Framework for Scalable Graph Autoencoders

    In this paper, we present a general framework to scale graph autoencoders (AE) and graph variational autoencoders (VAE). This framework leverages graph degeneracy concepts to train models only from a dense subset of nodes instead of using the entire graph. Together with a simple yet effective propagation mechanism, our approach significantly improves scalability and training speed while preserving performance. We evaluate and discuss our method on several variants of existing graph AE and VAE, providing the first application of these models to large graphs with up to millions of nodes and edges. We achieve empirically competitive results w.r.t. several popular scalable node embedding methods, which emphasizes the relevance of pursuing further research towards more scalable graph AE and VAE.
    Comment: International Joint Conference on Artificial Intelligence (IJCAI 2019)
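    To make the core-then-propagate idea concrete, here is a minimal sketch assuming networkx and NumPy: the embedding model is trained only on the dense k-core, and embeddings are then spread outward to the remaining nodes. `train_autoencoder` is a hypothetical callable standing in for any graph AE/VAE trainer, and the neighbor-averaging step is an illustrative stand-in, not the paper's exact propagation mechanism.

```python
import networkx as nx
import numpy as np

def embed_with_core(G, k, train_autoencoder, dim=16):
    """Illustrative sketch: train an embedding model only on the dense
    k-core of G, then propagate embeddings to the remaining nodes.
    `train_autoencoder` is a hypothetical callable returning a dict
    node -> embedding vector; the neighbor averaging below is a
    simplified stand-in for the paper's propagation mechanism."""
    core = nx.k_core(G, k)                      # dense subgraph used for training
    emb = train_autoencoder(core, dim=dim)      # embeddings for core nodes only

    # Propagate outward: repeatedly assign each not-yet-embedded node the
    # mean of its already-embedded neighbors' vectors.
    remaining = set(G.nodes()) - set(emb)
    while remaining:
        progressed = False
        for v in list(remaining):
            known = [emb[u] for u in G.neighbors(v) if u in emb]
            if known:
                emb[v] = np.mean(known, axis=0)
                remaining.remove(v)
                progressed = True
        if not progressed:                      # disconnected leftovers: fall back to zeros
            for v in remaining:
                emb[v] = np.zeros(dim)
            break
    return emb
```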

    Suggesting new words to extract keywords from title and abstract

    Keywords are a fundamental part of writing a research paper: most papers include them, although not all do, and some papers contain no keywords at all. Keywords are words or phrases that accurately reflect the content of a research paper; they are a compact summary of what the research contains. Well-chosen keywords increase the chance that an article or paper is found and that it reaches the readers who should see it. They matter especially for highly specialized, influential readers who cannot read everything in their fields and instead look for work whose keywords match their interests. In this paper, we extract new keywords by suggesting a set of words, chosen because they appear frequently in research across multiple computing disciplines. Our system takes a specified number of words that come immediately before each suggested word and treats them as new keyword candidates. The system proved effective at finding keywords that correspond, to a reasonable extent, with the keywords chosen by the authors themselves.
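    As an illustration of the described procedure, the sketch below takes a list of suggested trigger words and, for each occurrence in a title or abstract, collects the specified number of preceding words as a candidate keyword phrase. The function and parameter names are hypothetical, the tokenization is deliberately simple, and the exact rule (how many words, whether the trigger itself is included) is an assumption.

```python
import re

def propose_keywords(text, trigger_words, n_before=2):
    """Illustrative sketch: whenever one of the suggested trigger words
    appears in the text, take the n_before words immediately preceding
    it as a candidate keyword phrase. Names and the exact rule are
    assumptions, not the paper's implementation."""
    tokens = re.findall(r"[A-Za-z][A-Za-z-]*", text.lower())
    triggers = {w.lower() for w in trigger_words}
    candidates = []
    for i, tok in enumerate(tokens):
        if tok in triggers and i >= n_before:
            candidates.append(" ".join(tokens[i - n_before:i]))
    return candidates

# Example: "network" is a suggested trigger word; the two preceding
# words become a candidate keyword phrase.
print(propose_keywords("We study a scalable graph neural network model.",
                       ["network"], n_before=2))   # ['graph neural']
```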

    Large-scale clique cover of real-world networks

    The edge clique cover (ECC) problem deals with discovering a set of (possibly overlapping) cliques in a given graph that together cover each of the graph's edges. This problem finds applications ranging from social networks to compiler optimization and stringology. We consider several variants of the ECC problem, using classical quality measures (like the number of cliques) as well as new ones. We describe efficient heuristic algorithms, the fastest of which takes O(m d_G) time for a graph with m edges and degeneracy d_G (also known as the k-core number). For large real-world networks with millions of nodes, such as social networks, an algorithm should have (almost) linear running time to be practical. Our algorithm for finding ECCs of large networks runs in near-linear time in practice because, as our experiments on real-world networks with thousands to several million nodes show, d_G is small.
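    For readers unfamiliar with the problem, the sketch below shows a generic greedy ECC heuristic, not the paper's algorithm: around each uncovered edge it grows a clique from common neighbors and then marks all of that clique's edges as covered. It assumes networkx for the graph structure.

```python
import networkx as nx

def greedy_edge_clique_cover(G):
    """Illustrative greedy heuristic for the edge clique cover problem
    (not the paper's exact algorithm): for each edge not yet covered,
    grow a clique around it from common neighbors, then mark all of
    that clique's edges as covered."""
    covered = set()
    cliques = []
    for u, v in G.edges():
        if frozenset((u, v)) in covered:
            continue
        clique = {u, v}
        # Candidates are vertices adjacent to every current clique member.
        candidates = (set(G[u]) & set(G[v])) - clique
        while candidates:
            w = candidates.pop()
            clique.add(w)
            candidates &= set(G[w])
        cliques.append(clique)
        for a in clique:
            for b in clique:
                if a != b:
                    covered.add(frozenset((a, b)))
    return cliques

# Example: a triangle plus a pendant edge is covered by two cliques.
G = nx.Graph([(1, 2), (2, 3), (1, 3), (3, 4)])
print(greedy_edge_clique_cover(G))   # e.g. [{1, 2, 3}, {3, 4}]
```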