
    A Multi-task Learning Strategy for Solving the Cold Start Problem in Graph Learning

    Thesis (M.S.) -- Seoul National University Graduate School: College of Natural Sciences, Interdisciplinary Program in Brain Science, August 2021. Advisor: Byoung-Tak Zhang (μž₯병탁).

    Data in the real world can commonly be represented as graphs: molecules, social networks, the Internet, and even natural language can all be expressed in graph form. Therefore, generating good representations from graph data is critical in machine learning. One common attribute of most real-world graphs is that they are dynamic and can eventually face the cold start problem. A fundamental question is how new cold nodes acquire initial information in order to be adapted into the existing graph. Here we postulate the cold start problem as a fundamental issue in graph learning and propose a new learning setting, "Expanded Semi-supervised Learning." In expanded semi-supervised learning we extend the original semi-supervised learning setting even to new cold nodes that are disconnected from the graph. To this end, we propose the ColdExpand model, which classifies cold nodes based on link prediction with multiple goals to tackle. We experimentally prove that, by adding an additional goal to an existing link prediction method, our method outperforms the baseline in both expanded semi-supervised link prediction (by up to 24%) and node classification (by up to 15%). To the best of our knowledge, this is the first study to address the expansion of semi-supervised learning to unseen nodes.

    Korean-language abstract (translated): In most real-world graph data, nodes and their connections change over time, and the data inevitably face the cold start problem. To solve it, the key question is how newly appearing cold nodes can obtain initial information so that they can be incorporated into the existing graph. This thesis raises the cold start problem in graph learning and proposes a new and realistic learning setting, "expanded semi-supervised learning," in which conventional semi-supervised learning is applied even to newly introduced cold nodes that have no connection to the existing graph. For this setting, the thesis presents ColdExpand, a model that takes link prediction together with classification of cold nodes as multi-task objectives, and demonstrates experimentally that adding this extra objective yields results that substantially outperform previous models on both link prediction and classification of cold nodes.

    Table of contents:
    Chapter 1 Introduction 1
    Chapter 2 Related Works 5
        2.1 Semi-supervised Node Classification 5
        2.2 Link Prediction 6
        2.3 Cold Start Problem 8
        2.4 Multi-task Learning Strategy 9
    Chapter 3 Method 11
        3.1 Problem Definition 11
        3.2 ColdExpand: Semi-Supervised Graph Learning in Cold Start 12
            3.2.1 Link Prediction of Cold Nodes 12
            3.2.2 Node Classification of Cold Nodes 13
    Chapter 4 Experiment 17
        4.1 Dataset 17
        4.2 Baseline 18
            4.2.1 Expanded Semi-supervised Link Prediction 18
            4.2.2 Expanded Semi-supervised Node Classification 18
        4.3 Results 19
            4.3.1 Results of Link Prediction on Cold Nodes 19
            4.3.2 Results of Semi-supervised Node Classification on Cold Nodes 20
            4.3.3 Qualitative Results for Node Classification in Cold Start Environment 23
            4.3.4 ColdExpand also Improves Conventional Semi-supervised Node Classification 23
            4.3.5 Ablation Study 25
    Chapter 5 Discussion 28
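    The abstract above describes a multi-task objective that couples link prediction for cold (initially disconnected) nodes with node classification. The sketch below only illustrates that general idea; it is not the ColdExpand implementation, and the encoder, the dot-product decoder, the mixing weight alpha, and all names are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskColdStartSketch(nn.Module):
    """Hypothetical sketch: a shared encoder with two heads, a link
    predictor that attaches cold nodes to the existing graph and a node
    classifier. Cold nodes have no edges yet, so this toy encoder uses
    node features only."""

    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, hid_dim)
        )
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, x):
        return self.encoder(x)

    def link_logits(self, z_src, z_dst):
        # Dot-product decoder: score for whether an edge exists.
        return (z_src * z_dst).sum(dim=-1)


def multitask_loss(model, x, edge_pairs, edge_labels, cold_idx, node_labels,
                   alpha=0.5):
    """Joint objective: link prediction on candidate node pairs plus
    classification of the cold nodes, mixed by a made-up weight alpha."""
    z = model(x)
    link_scores = model.link_logits(z[edge_pairs[:, 0]], z[edge_pairs[:, 1]])
    loss_link = F.binary_cross_entropy_with_logits(link_scores, edge_labels.float())
    loss_cls = F.cross_entropy(model.classifier(z[cold_idx]), node_labels)
    return alpha * loss_link + (1.0 - alpha) * loss_cls


# Toy usage with random placeholder data: 10 cold nodes among 50 total.
x = torch.randn(50, 16)
edge_pairs = torch.randint(0, 50, (64, 2))
edge_labels = torch.randint(0, 2, (64,))
cold_idx = torch.arange(40, 50)
node_labels = torch.randint(0, 3, (10,))
model = MultiTaskColdStartSketch(in_dim=16, hid_dim=32, num_classes=3)
loss = multitask_loss(model, x, edge_pairs, edge_labels, cold_idx, node_labels)
loss.backward()
```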

    Deep Learning for Link Prediction in Dynamic Networks using Weak Estimators

    Link prediction is the task of evaluating the probability that an edge exists in a network, and it has useful applications in many domains. Traditional approaches rely on measuring the similarity between two nodes in a static context. Recent research has focused on extending link prediction to a dynamic setting, predicting the creation and destruction of links in networks that evolve over time. Though this is a difficult task, the use of deep learning techniques has been shown to yield notable improvements in prediction accuracy. To this end, we propose the novel application of weak estimators, in addition to traditional similarity metrics, to inexpensively build an effective feature vector for a deep neural network. Weak estimators have been used in a variety of machine learning algorithms to improve model accuracy, owing to their capacity to estimate changing probabilities in dynamic systems. Experiments indicate that our approach results in increased prediction accuracy on several real-world dynamic networks.
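    As a rough illustration of the recipe in this abstract (not the authors' code), the sketch below combines static similarity scores for a node pair with a weak-estimator-style running probability updated across graph snapshots, in the spirit of stochastic learning weak estimators, and concatenates them into a feature vector that could feed a neural classifier. The function names, the 0.5 prior, and the memory parameter lam are assumptions.

```python
def similarity_features(graph, u, v):
    """Static similarity scores for a candidate edge (u, v).
    graph: dict mapping each node to its set of neighbours."""
    nu, nv = graph.get(u, set()), graph.get(v, set())
    common = nu & nv
    union = nu | nv
    jaccard = len(common) / len(union) if union else 0.0
    pref_attachment = float(len(nu) * len(nv))  # product of degrees
    return [float(len(common)), jaccard, pref_attachment]


def weak_estimate_update(p, edge_present, lam=0.9):
    """Weak-estimator-style convex update of a running edge probability:
    p <- lam * p + (1 - lam) * x, with x = 1 if the edge appeared in the
    latest snapshot. Keeping lam < 1 lets the estimate track a drifting
    underlying probability instead of converging to a fixed value."""
    x = 1.0 if edge_present else 0.0
    return lam * p + (1.0 - lam) * x


def build_feature_vector(snapshots, u, v, lam=0.9):
    """Feature vector for the pair (u, v): the weak estimate of edge
    presence over the snapshot history plus static similarity scores
    computed on the most recent snapshot."""
    p = 0.5  # uninformative prior before any snapshot is seen
    for graph in snapshots:
        p = weak_estimate_update(p, v in graph.get(u, set()), lam)
    return [p] + similarity_features(snapshots[-1], u, v)


# Toy usage: two snapshots of a small undirected graph as adjacency sets.
snapshots = [
    {"a": {"b"}, "b": {"a"}, "c": set()},
    {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}},
]
print(build_feature_vector(snapshots, "a", "c"))  # -> [~0.505, 0.0, 0.0, 2.0]
```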

    Node Embedding over Temporal Graphs

    In this work, we present a method for node embedding in temporal graphs. We propose an algorithm that learns the evolution of a temporal graph's nodes and edges over time and incorporates these dynamics into a temporal node embedding framework for different graph prediction tasks. We present a joint loss function that creates a temporal embedding of a node by learning to combine its historical temporal embeddings, such that it is optimized for a given task (e.g., link prediction). The algorithm is initialized using static node embeddings, which are then aligned over the representations of a node at different time points, and eventually adapted for the given task in a joint optimization. We evaluate the effectiveness of our approach on a variety of temporal graphs for the two fundamental tasks of temporal link prediction and multi-label node classification, comparing against competitive baselines and algorithmic alternatives. Our algorithm shows performance improvements across many of the datasets and baselines, and is found to be particularly effective for graphs that are less cohesive, with a lower clustering coefficient.
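    The sketch below illustrates only the general recipe described in this abstract, not the authors' implementation: a node's per-snapshot static embeddings are fed in temporal order to a recurrent combiner, and the resulting temporal embedding is trained end to end against a link-prediction objective. The static embedder and the alignment step are omitted; all names and dimensions are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalNodeEmbedderSketch(nn.Module):
    """Hypothetical sketch: combine a node's historical static embeddings
    with an LSTM into one temporal embedding per node, then score node
    pairs with a dot product for link prediction."""

    def __init__(self, static_dim, temporal_dim):
        super().__init__()
        self.rnn = nn.LSTM(static_dim, temporal_dim, batch_first=True)

    def forward(self, history):
        # history: (num_nodes, num_snapshots, static_dim), oldest first.
        _, (h_last, _) = self.rnn(history)
        return h_last.squeeze(0)  # (num_nodes, temporal_dim)

    def link_prediction_loss(self, history, pairs, labels):
        z = self.forward(history)
        scores = (z[pairs[:, 0]] * z[pairs[:, 1]]).sum(dim=-1)
        return F.binary_cross_entropy_with_logits(scores, labels.float())


# Toy usage: 100 nodes, 5 snapshots, 32-dim static embeddings per snapshot.
history = torch.randn(100, 5, 32)
pairs = torch.randint(0, 100, (256, 2))
labels = torch.randint(0, 2, (256,))
model = TemporalNodeEmbedderSketch(static_dim=32, temporal_dim=64)
loss = model.link_prediction_loss(history, pairs, labels)
loss.backward()
```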