129 research outputs found

    AAANE: Attention-based Adversarial Autoencoder for Multi-scale Network Embedding

    Network embedding represents nodes in a continuous vector space while preserving the structural information of the network. Existing methods usually adopt a "one-size-fits-all" approach to multi-scale structural information, such as the first- and second-order proximity of nodes, ignoring the fact that different scales play different roles in embedding learning. In this paper, we propose an Attention-based Adversarial Autoencoder Network Embedding (AAANE) framework, which promotes collaboration among different scales and lets them vote for robust representations. The proposed AAANE consists of two components: 1) an attention-based autoencoder that effectively captures the highly non-linear network structure and can de-emphasize irrelevant scales during training; 2) an adversarial regularization that guides the autoencoder to learn robust representations by matching the posterior distribution of the latent embeddings to a given prior distribution. This is the first attempt to introduce attention mechanisms into multi-scale network embedding. Experimental results on real-world networks show that the learned attention parameters differ across networks and that the proposed approach outperforms existing state-of-the-art approaches for network embedding. Comment: 8 pages, 5 figures
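    The attention-over-scales idea in this abstract can be sketched in a few lines: combine several proximity matrices with softmax weights so that informative scales dominate the combined representation target. This is a minimal illustrative sketch, not the AAANE implementation — the helper names (proximity_scales, attention_combine) and the fixed logits are assumptions; in AAANE the attention weights are learned jointly with the adversarial autoencoder.

    ```python
    import numpy as np

    def proximity_scales(A, k=2):
        """Row-normalized adjacency powers as k proximity scales
        (first-order, second-order, ...)."""
        P = A / A.sum(axis=1, keepdims=True)
        return [np.linalg.matrix_power(P, i + 1) for i in range(k)]

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def attention_combine(scales, logits):
        """Convex (softmax-attention) combination of the proximity scales."""
        w = softmax(logits)
        M = sum(wi * S for wi, S in zip(w, scales))
        return M, w

    # Toy 4-node undirected graph (adjacency matrix).
    A = np.array([[0., 1., 1., 0.],
                  [1., 0., 1., 0.],
                  [1., 1., 0., 1.],
                  [0., 0., 1., 0.]])
    scales = proximity_scales(A, k=2)
    # Hypothetical attention logits; a learned model would fit these.
    M, w = attention_combine(scales, np.array([0.5, -0.5]))
    ```

    Because each scale is row-stochastic and the attention weights are a convex combination, the combined matrix M remains row-stochastic, so it can serve directly as a reconstruction target for an autoencoder.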

    Quasi-Töplitz functions in KAM theorem

    We define and describe the class of quasi-Töplitz functions. We then prove an abstract KAM theorem in which the perturbation belongs to this class. We apply this theorem to a nonlinear Schrödinger equation on the torus T^d, thus proving the existence and stability of quasi-periodic solutions and recovering the results of [10]. With respect to that paper, we consider only the NLS that preserves the total momentum, and we exploit this conserved quantity to simplify our treatment. Comment: 34 pages, 1 figure
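    For concreteness, a momentum-preserving NLS on the torus of the kind this abstract refers to can be written as follows; the specific form of the nonlinearity is an assumption for illustration, not taken from the abstract:

    ```latex
    \mathrm{i}\,u_t - \Delta u + f(|u|^2)\,u = 0,
    \qquad x \in \mathbb{T}^d = (\mathbb{R}/2\pi\mathbb{Z})^d,
    ```

    where the KAM theorem yields quasi-periodic solutions, i.e. solutions whose time dependence is governed by a finite frequency vector, and conservation of the total momentum reduces the number of resonance conditions one must handle.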