Experience-driven Networking: A Deep Reinforcement Learning based Approach
Modern communication networks have become highly complex and dynamic, which
makes them hard to model, predict, and control. In this paper, we develop a
novel experience-driven approach that learns to control a communication
network from its own experience rather than from an accurate mathematical
model, just as a human learns a new skill (such as driving or swimming).
Specifically, we propose, for the first time, to leverage
emerging Deep Reinforcement Learning (DRL) for enabling model-free control in
communication networks; and present a novel and highly effective DRL-based
control framework, DRL-TE, for a fundamental networking problem: Traffic
Engineering (TE). The proposed framework maximizes a widely-used utility
function by jointly learning the network environment and its dynamics, and making
decisions under the guidance of powerful Deep Neural Networks (DNNs). We
propose two new techniques, TE-aware exploration and actor-critic-based
prioritized experience replay, to optimize the general DRL framework
particularly for TE. To validate and evaluate the proposed framework, we
implemented it in ns-3, and tested it comprehensively with both representative
and randomly generated network topologies. Extensive packet-level simulation
results show that 1) compared to several widely-used baseline methods, DRL-TE
significantly reduces end-to-end delay and consistently improves the network
utility, while offering better or comparable throughput; 2) DRL-TE is robust to
network changes; and 3) DRL-TE consistently outperforms a state-of-the-art DRL
method (for continuous control), Deep Deterministic Policy Gradient (DDPG),
which, however, does not offer satisfactory performance.

Comment: 9 pages, 12 figures; accepted as a conference paper at IEEE
Infocom 201
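The abstract names actor-critic-based prioritized experience replay as one of the two TE-specific techniques. A minimal sketch of proportional prioritized replay, with priorities derived from critic TD errors, is shown below; the class name, the `alpha`/`beta` defaults, and the pure-Python ring-buffer implementation are illustrative assumptions, not the paper's actual code:

```python
import random


class PrioritizedReplayBuffer:
    """Proportional prioritized experience replay (sketch).

    Transitions are sampled with probability proportional to
    priority**alpha, where the priority is the magnitude of the
    critic's TD error for that transition.
    """

    def __init__(self, capacity, alpha=0.6, eps=1e-6):
        self.capacity = capacity
        self.alpha = alpha        # how strongly priorities skew sampling
        self.eps = eps            # keeps every priority strictly positive
        self.data = []
        self.priorities = []
        self.pos = 0              # next write position (ring buffer)

    def add(self, transition, td_error):
        p = (abs(td_error) + self.eps) ** self.alpha
        if len(self.data) < self.capacity:
            self.data.append(transition)
            self.priorities.append(p)
        else:
            # overwrite the oldest transition once the buffer is full
            self.data[self.pos] = transition
            self.priorities[self.pos] = p
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size, beta=0.4):
        total = sum(self.priorities)
        probs = [p / total for p in self.priorities]
        idxs = random.choices(range(len(self.data)),
                              weights=probs, k=batch_size)
        n = len(self.data)
        # importance-sampling weights correct the bias introduced
        # by non-uniform sampling; normalized so the max weight is 1
        weights = [(n * probs[i]) ** (-beta) for i in idxs]
        max_w = max(weights)
        weights = [w / max_w for w in weights]
        return [self.data[i] for i in idxs], idxs, weights

    def update_priorities(self, idxs, td_errors):
        # called after a training step with the refreshed TD errors
        for i, e in zip(idxs, td_errors):
            self.priorities[i] = (abs(e) + self.eps) ** self.alpha
```

In an actor-critic loop, the critic update would supply the TD errors both when a transition is stored and when sampled priorities are refreshed via `update_priorities`.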