Neural Topological Ordering for Computation Graphs
Recent work on machine learning for combinatorial optimization has shown
that learning-based approaches can outperform heuristic methods in both
speed and solution quality. In this paper, we consider the problem of finding an
optimal topological order of a directed acyclic graph (DAG), with a focus on the
memory-minimization problem that arises in compilers. We propose an end-to-end
machine-learning approach to topological ordering using an
encoder-decoder framework. Our encoder is a novel attention-based graph neural
network architecture called \emph{Topoformer}, which uses different topological
transforms of a DAG for message passing. The node embeddings produced by the
encoder are converted into node priorities which are used by the decoder to
generate a probability distribution over topological orders. We train our model
on a dataset of synthetically generated graphs called layered graphs. We show
that our model outperforms, or is on par with, several topological-ordering
baselines while being significantly faster on synthetic graphs with up to 2k
nodes. We also train and test our model on a set of real-world computation
graphs, showing performance improvements.

Comment: To appear in NeurIPS 202
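The abstract describes a decoder that turns learned node priorities into topological orders. A minimal sketch of this idea, assuming a deterministic greedy decoding (the paper's decoder defines a probability distribution, so sampling among ready nodes could replace the greedy choice below): Kahn's algorithm with a priority queue, where ties among schedulable nodes are broken by the priority scores. The function name and interface are illustrative, not from the paper.

```python
import heapq

def priority_topological_order(num_nodes, edges, priorities):
    """Greedy priority-based topological sort (a sketch of the decoding step).

    Among all "ready" nodes (those whose predecessors have all been emitted),
    always emit the node with the highest priority score.
    """
    indegree = [0] * num_nodes
    successors = [[] for _ in range(num_nodes)]
    for u, v in edges:
        successors[u].append(v)
        indegree[v] += 1

    # Min-heap keyed by negated priority, so the highest-priority node pops first.
    ready = [(-priorities[v], v) for v in range(num_nodes) if indegree[v] == 0]
    heapq.heapify(ready)

    order = []
    while ready:
        _, u = heapq.heappop(ready)
        order.append(u)
        for v in successors[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                heapq.heappush(ready, (-priorities[v], v))

    assert len(order) == num_nodes, "graph contains a cycle"
    return order

# Diamond DAG 0 -> {1, 2} -> 3: node 2 has the highest priority among
# the ready nodes after 0, so it is scheduled before node 1.
print(priority_topological_order(
    4, [(0, 1), (0, 2), (1, 3), (2, 3)], [0.0, 0.2, 0.9, 0.0]))
# -> [0, 2, 1, 3]
```

Any assignment of priorities yields a valid topological order under this scheme, which is why the learning problem can be reduced to producing one scalar per node.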