57,382 research outputs found
A Multi-task Learning Strategy for Solving the Cold Start Problem in Graph Learning
Thesis (M.S.) -- Graduate School, Seoul National University : College of Natural Sciences, Interdisciplinary Program in Neuroscience, August 2021. Advisor: Byoung-Tak Zhang.

Data in the real world can commonly be represented as graphs: molecules, social networks, the Internet, and even natural language can all be represented as graphs. Therefore, generating fine representations from graph data is critical in machine learning. One common attribute of most real-world graphs is that they are dynamic and can eventually face the cold start problem. A fundamental question is how new cold nodes acquire initial information in order to be adapted into the existing graph. Here we postulate the cold start problem as a fundamental issue in graph learning and propose a new learning setting, "Expanded Semi-supervised Learning." In expanded semi-supervised learning we extend the original semi-supervised learning setting even to new cold nodes that are disconnected from the graph. To this end, we propose the ColdExpand model, which classifies cold nodes based on link prediction with multiple goals to tackle. We experimentally prove that by adding an additional goal to the existing link prediction method, our method outperforms the baselines in both expanded semi-supervised link prediction (by at most 24%) and node classification (by at most 15%). To the best of our knowledge, this is the first study to address the expansion of semi-supervised learning to unseen nodes.

Chapter 1 Introduction 1
Chapter 2 Related Works 5
2.1 Semi-supervised Node Classification 5
2.2 Link Prediction 6
2.3 Cold Start Problem 8
2.4 Multi-task Learning Strategy 9
Chapter 3 Method 11
3.1 ProblemDefinition 11
3.2 ColdExpand: Semi-Supervised Graph Learning in Cold Start 12
3.2.1 Link Prediction of Cold Nodes 12
3.2.2 Node Classification of Cold Nodes 13
Chapter 4 Experiment 17
4.1 Dataset 17
4.2 Baseline 18
4.2.1 Expanded Semi-supervised Link Prediction 18
4.2.2 Expanded Semi-supervised Node Classification 18
4.3 Results 19
4.3.1 Results of Link Prediction on Cold Nodes 19
4.3.2 Results of Semi-supervised Node Classification on Cold Nodes 20
4.3.3 Qualitative Results for Node Classification in Cold Start Environment 23
4.3.4 ColdExpand also Improves Conventional Semi-supervised Node Classification 23
4.3.5 Ablation Study 25
Chapter 5 Discussion 28
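The abstract describes ColdExpand as jointly training link prediction and cold-node classification. The thesis's exact objective is not given here; the following is a minimal, hypothetical sketch of such a multi-task loss, where the loss forms and the mixing weight `alpha` are assumptions for illustration:

```python
import numpy as np

def bce(pred, target, eps=1e-9):
    """Binary cross-entropy over predicted link probabilities."""
    pred = np.clip(pred, eps, 1 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def cross_entropy(logits, labels):
    """Softmax cross-entropy over node-class logits."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -np.mean(log_probs[np.arange(len(labels)), labels])

def multitask_loss(link_scores, link_labels, class_logits, class_labels, alpha=0.5):
    """Weighted sum of the two task losses; alpha is a hypothetical mixing weight."""
    return alpha * bce(link_scores, link_labels) + \
        (1 - alpha) * cross_entropy(class_logits, class_labels)
```

Training on the combined objective means gradients from the classification task also shape the link predictor, which is the mechanism the abstract credits for the improvement on cold nodes.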
Deep Learning for Link Prediction in Dynamic Networks using Weak Estimators
Link prediction is the task of evaluating the probability that an edge exists in a network, and it has useful applications in many domains. Traditional approaches rely on measuring the similarity between two nodes in a static context. Recent research has focused on extending link prediction to a dynamic setting, predicting the creation and destruction of links in networks that evolve over time. Though this is a difficult task, the employment of deep learning techniques has been shown to yield notable improvements in prediction accuracy. To this end, we propose the novel application of weak estimators, in addition to the utilization of traditional similarity metrics, to inexpensively build an effective feature vector for a deep neural network. Weak estimators have been used in a variety of machine learning algorithms to improve model accuracy, owing to their capacity to estimate changing probabilities in dynamic systems. Experiments indicate that our approach results in increased prediction accuracy on several real-world dynamic networks.
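The abstract does not specify which weak estimator the authors use; one well-known family is the stochastic-learning weak estimator (SLWE), which tracks a drifting Bernoulli parameter with a multiplicative update. A minimal sketch, where the learning factor `lam` is an assumed hyperparameter:

```python
def slwe_update(p, x, lam=0.9):
    """SLWE-style update: pull the running estimate p of P(x = 1) toward
    the latest binary observation x. Because lam < 1 discounts old
    observations, the estimate can follow a distribution that changes
    over time (unlike a plain running average)."""
    return lam * p + (1 - lam) * x

# Hypothetical feature for a dynamic network: the estimated probability
# that a given edge is present, updated as snapshots arrive.
p = 0.5
for x in [1] * 50:          # regime where the edge keeps appearing
    p = slwe_update(p, x)
p_high = p
for x in [0] * 50:          # regime change: the edge stops appearing
    p = slwe_update(p, x)
p_low = p
```

Such per-edge estimates can then be concatenated with static similarity metrics to form the feature vector fed to the neural network.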
Node Embedding over Temporal Graphs
In this work, we present a method for node embedding in temporal graphs. We
propose an algorithm that learns the evolution of a temporal graph's nodes and
edges over time and incorporates these dynamics into a temporal node embedding
framework for different graph prediction tasks. We present a joint loss
function that creates a temporal embedding of a node by learning to combine its
historical temporal embeddings, such that it is optimized for a given task (e.g.,
link prediction). The algorithm is initialized using static node embeddings,
which are then aligned over the representations of a node at different time
points, and eventually adapted for the given task in a joint optimization. We
evaluate the effectiveness of our approach over a variety of temporal graphs
for the two fundamental tasks of temporal link prediction and multi-label node
classification, comparing to competitive baselines and algorithmic
alternatives. Our algorithm shows performance improvements across many of the
datasets and baselines and is found particularly effective for graphs that are
less cohesive, with a lower clustering coefficient.
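The abstract's two ingredients, aligning independently trained static snapshot embeddings and then combining a node's history per task, can be sketched as follows. Orthogonal Procrustes alignment is a standard choice for the first step, and fixed softmax weights stand in for the jointly learned combination; both specifics are assumptions, not necessarily the paper's exact method:

```python
import numpy as np

def align(emb_t, emb_ref):
    """Orthogonal Procrustes: rotate snapshot embeddings emb_t (n x d) to
    best match a reference snapshot emb_ref. Independently trained
    embeddings are only defined up to rotation, so alignment is needed
    before their coordinates can be compared across time."""
    u, _, vt = np.linalg.svd(emb_t.T @ emb_ref)
    return emb_t @ (u @ vt)

def combine_history(hist, weights):
    """Combine a node's historical embeddings (T x d) into one temporal
    embedding via a softmax over time steps. In the paper these weights
    are learned jointly with the task loss; fixed here for illustration."""
    w = np.exp(weights - np.max(weights))
    w /= w.sum()
    return w @ hist          # (d,) convex combination of the T snapshots
```

The resulting vector can then be scored by any downstream head, e.g. a dot product for temporal link prediction.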
- …