
    Transfer Learning across Networks for Collective Classification

    This paper addresses the problem of transferring useful knowledge from a source network to predict node labels in a newly formed target network. Existing transfer learning research has primarily focused on vector-based data, in which instances are assumed to be independent and identically distributed; how to effectively transfer knowledge across different information networks has not been well studied, mainly because networks may differ in both their node features and the link relationships between nodes. In this paper, we propose a new transfer learning algorithm that transfers common latent structure features across the source and target networks. The proposed algorithm discovers these latent features by constructing label propagation matrices in the source and target networks and mapping them into a shared latent feature space. The latent features capture common structural patterns shared by the two networks and serve as domain-independent features to be transferred between networks. Combining them with domain-dependent node features, we then propose an iterative classification algorithm that leverages label correlations to predict node labels in the target network. Experiments on real-world networks demonstrate that the proposed algorithm successfully achieves knowledge transfer between networks and improves the accuracy of node classification in the target network.
    Comment: Published in the proceedings of IEEE ICDM 201
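    A minimal sketch (not the authors' implementation) of the general shape of such a pipeline, under simplifying assumptions: a row-normalised adjacency matrix stands in for the label propagation matrix, each network is factorised separately with a truncated SVD rather than mapped jointly into one shared space, and a plain logistic regression replaces the iterative classification step that exploits label correlations. All names (propagation_matrix, latent_features, transfer_classify, d_latent) are illustrative.

    # Sketch only: approximates the pipeline described above with off-the-shelf tools.
    import numpy as np
    from scipy.sparse.linalg import svds
    from sklearn.linear_model import LogisticRegression

    def propagation_matrix(adj):
        # Row-normalised adjacency matrix, a simple stand-in for the
        # label propagation matrices described in the abstract.
        deg = adj.sum(axis=1, keepdims=True)
        deg[deg == 0] = 1.0
        return adj / deg

    def latent_features(adj, d_latent=16):
        # Truncated SVD of the propagation matrix as a crude proxy for the
        # domain-independent latent structural features (here each network
        # is factorised on its own, not mapped into a genuinely shared space).
        u, s, _ = svds(propagation_matrix(adj).astype(float), k=d_latent)
        return u * s

    def transfer_classify(adj_src, X_src, y_src, adj_tgt, X_tgt, d_latent=16):
        # Concatenate domain-dependent node attributes with the latent structural
        # features, train on the labelled source network, predict on the target.
        Z_src = np.hstack([X_src, latent_features(adj_src, d_latent)])
        Z_tgt = np.hstack([X_tgt, latent_features(adj_tgt, d_latent)])
        clf = LogisticRegression(max_iter=1000).fit(Z_src, y_src)
        return clf.predict(Z_tgt)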

    Geometric conservation laws for cells or vesicles with membrane nanotubes or singular points

    On the basis of the integral theorems for the mean curvature and the Gauss curvature, geometric conservation laws for cells or vesicles are proved. These conservation laws can describe various special bio-nano structures observed in experiments, such as membrane nanotubes and singular points growing from the surfaces of cells or vesicles. Potential applications of the conservation laws to lipid nanotube junctions that interconnect cells or vesicles are discussed.
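    For orientation (a standard fact, not a result of the paper): the classical integral theorem for the Gauss curvature that statements of this kind build on is the Gauss-Bonnet theorem. For a smooth closed surface M,

        \oint_M K \, \mathrm{d}A = 2\pi \, \chi(M),

    where K is the Gauss curvature, dA the area element and \chi(M) the Euler characteristic of M. The paper's conservation laws generalise identities of this type to membranes carrying nanotubes or singular points; those extensions are not reproduced here.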

    The Small-Scale Peasant Economy in “The Peasant Question in France and Germany”

    Capitalist socialized mass production has continuously squeezed the conditions under which small farmers produce: productivity has risen, the number of bankrupted farmers has grown, and more and more farmers have moved to the cities without being able to gain a foothold there. In this context, Engels held that small-scale peasant household production was no longer suited to the needs of economic development. He wrote “The Peasant Question in France and Germany” to set out his ideas on farmers taking the road of cooperative production, leaving valuable inspiration for the development of farmers, rural areas and agriculture in today’s society.

    Search Efficient Binary Network Embedding

    Traditional network embedding primarily focuses on learning a dense vector representation for each node that encodes network structure and/or node content information, so that off-the-shelf machine learning algorithms can be applied directly to the vector-format node representations for network analysis. However, the learned dense vector representations are inefficient for large-scale similarity search, which requires finding nearest neighbors measured by Euclidean distance in a continuous vector space. In this paper, we propose a search-efficient binary network embedding algorithm, called BinaryNE, that learns a sparse binary code for each node by simultaneously modeling node context relations and node attribute relations through a three-layer neural network. BinaryNE learns binary node representations efficiently through a stochastic gradient descent based online learning algorithm. The learned binary encoding not only reduces the memory needed to represent each node, but also allows fast bit-wise comparisons, supporting much quicker network node search than Euclidean or other distance measures. Our experiments and comparisons show that BinaryNE delivers more than 23 times faster search while providing comparable or better search quality than traditional continuous-vector network embedding methods.
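    As an illustrative sketch (not the BinaryNE model itself), the snippet below shows the kind of bit-wise search that binary codes enable: codes are packed into bytes and ranked by Hamming distance via XOR and popcount, instead of Euclidean distance over dense vectors. The 128-bit code length and all names are assumptions.

    # Sketch only: query-time Hamming search over packed binary node codes.
    import numpy as np

    def pack_codes(binary_codes):
        # Pack an (n_nodes, n_bits) 0/1 matrix into bytes for bit-wise operations.
        return np.packbits(binary_codes.astype(np.uint8), axis=1)

    def hamming_search(packed_db, packed_query, top_k=10):
        # Rank database nodes by Hamming distance to the query code
        # using XOR + popcount rather than Euclidean distance.
        xor = np.bitwise_xor(packed_db, packed_query)   # differing bits
        dist = np.unpackbits(xor, axis=1).sum(axis=1)   # popcount per row
        return np.argsort(dist)[:top_k]

    # Usage with random 128-bit codes (illustrative data only)
    codes = np.random.randint(0, 2, size=(100000, 128))
    db = pack_codes(codes)
    query = pack_codes(codes[:1])
    nearest = hamming_search(db, query, top_k=5)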