Accelerating Data Loading in Deep Neural Network Training
Data loading can dominate deep neural network training time on large-scale
systems. We present a comprehensive study on accelerating data loading
performance in large-scale distributed training. We first identify performance
and scalability issues in current data loading implementations. We then propose
optimizations to the data loader design that better utilize CPU resources. We use an
analytical model to characterize the impact of data loading on the overall
training time and establish the performance trend as we scale up distributed
training. Our model suggests that I/O rate limits the scalability of
distributed training, which inspires us to design a locality-aware data loading
method. By utilizing software caches, our method can drastically reduce the
data loading communication volume in comparison with the original data loading
implementation. Finally, we evaluate the proposed optimizations with various
experiments. We achieved more than 30x speedup in data loading using 256 nodes
with 1,024 learners.

Comment: 11 pages, 12 figures, accepted for publication in IEEE International
Conference on High Performance Computing, Data and Analytics (HiPC) 201
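The core idea behind the locality-aware method — serving repeated reads from a per-node software cache so that only cache misses generate storage traffic — can be illustrated with a minimal sketch. This is not the paper's implementation; the names `SoftwareCacheLoader` and `fetch_from_storage` are illustrative stand-ins.

```python
# Minimal sketch (not the paper's implementation) of a locality-aware
# data loader: each node keeps the samples it has already read in an
# in-memory software cache, so later epochs hit the cache instead of
# shared storage, shrinking the data loading communication volume.

class SoftwareCacheLoader:
    def __init__(self, fetch_fn):
        self._fetch = fetch_fn      # fetches one sample from shared storage
        self._cache = {}            # software cache: sample id -> sample
        self.remote_reads = 0       # cache misses, a proxy for I/O volume

    def load(self, sample_id):
        if sample_id not in self._cache:
            self._cache[sample_id] = self._fetch(sample_id)
            self.remote_reads += 1  # only misses touch shared storage
        return self._cache[sample_id]

def fetch_from_storage(sample_id):
    return f"sample-{sample_id}"    # stand-in for a real file read

loader = SoftwareCacheLoader(fetch_from_storage)
shard = [0, 1, 2, 3]
# Two epochs over the same shard: the second epoch is served entirely
# from the cache, so remote reads equal the shard size, not 2x it.
for _ in range(2):
    for i in shard:
        loader.load(i)
print(loader.remote_reads)  # prints 4, not 8
```

In a real distributed setting, each learner would cache only its locally assigned shard, which is what makes the approach locality-aware rather than a plain global cache.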