Deep neural networks have achieved great success thanks to increasing amounts
of data and diverse, effective network designs. However, this success comes
with a heavy computational burden, since training time grows in proportion to
the amount of training data. In addition, obtaining a well-performing model
requires repeated trials of different architecture designs and
hyper-parameters, which can take a large amount of time even with
state-of-the-art (SOTA) hyper-parameter optimization (HPO) and neural
architecture search (NAS) algorithms. In this paper, we propose the Automatic
Selection of Proxy dataset framework (ASP), which aims to dynamically identify
informative proxy subsets of the training data at each epoch, reducing the
training data size and thus the AutoML processing time.
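
The abstract does not specify ASP's selection criterion, so the following is
only a minimal sketch of per-epoch proxy subset selection under an assumed
loss-based importance score; the function name `select_proxy_subset`, the
batch sizes, and the selection ratio are hypothetical, not the authors' method.

```python
import torch
from torch.utils.data import DataLoader, Subset

def select_proxy_subset(model, dataset, ratio, device="cpu"):
    """Score every training example and keep the top `ratio` fraction.

    Assumption: examples with higher current loss are treated as more
    informative. ASP's actual criterion is not given in the abstract.
    """
    model.eval()
    loader = DataLoader(dataset, batch_size=256, shuffle=False)
    scores = []
    with torch.no_grad():
        for x, y in loader:
            logits = model(x.to(device))
            # Per-sample loss serves as the (assumed) informativeness score.
            loss = torch.nn.functional.cross_entropy(
                logits, y.to(device), reduction="none")
            scores.append(loss.cpu())
    scores = torch.cat(scores)
    k = max(1, int(ratio * len(dataset)))
    top_idx = torch.topk(scores, k).indices.tolist()
    return Subset(dataset, top_idx)

# Dynamic per-epoch usage: re-select the proxy subset, then train on it.
# for epoch in range(num_epochs):
#     proxy = select_proxy_subset(model, train_set, ratio=0.1)
#     train_one_epoch(model, DataLoader(proxy, batch_size=128, shuffle=True))
```
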
We verify the effectiveness and generalization of ASP on CIFAR10, CIFAR100,
ImageNet16-120, and ImageNet-1k, across various public model benchmarks. The
experimental results show that ASP obtains better results than other data
selection methods at all selection ratios. ASP also enables much more
efficient AutoML processing, achieving a 2x-20x speedup while finding better
architectures and better hyper-parameters compared to utilizing the entire
dataset.