It is frequently observed that overparameterized neural networks generalize
well. Existing theoretical work on this phenomenon is devoted mainly to
linear settings or fully-connected neural networks. This paper studies the
learning ability of an important family of deep neural networks, deep
convolutional neural networks (DCNNs), under both underparameterized and
overparameterized settings. We establish the first learning rates for
underparameterized DCNNs that are free of the restrictions on parameters or
function-variable structure imposed in the existing literature. We also show that by adding
well-defined layers to a non-interpolating DCNN, we can obtain some
interpolating DCNNs that maintain the good learning rates of the
non-interpolating DCNN. This result is achieved by a novel network deepening
scheme designed for DCNNs. Our work thus provides theoretical verification of how
overfitted DCNNs can generalize well.
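
To convey the flavor of the deepening idea at a high level, the following is a minimal, hypothetical sketch of appending layers to an existing CNN so that the wrapped network computes the same function at initialization. The class name `DeepenedCNN`, the arguments `base_cnn`, `channels`, and `extra_layers`, and the residual zero-initialization are illustrative assumptions only; the paper's actual construction, which makes the deepened DCNN interpolate the training data while preserving its learning rates, is not reproduced here.

import torch
import torch.nn as nn

class DeepenedCNN(nn.Module):
    """Illustrative only: wrap an existing CNN and append residual convolutional
    layers that are zero-initialized, so the deepened network initially computes
    exactly the same function as the original one."""

    def __init__(self, base_cnn: nn.Module, channels: int, extra_layers: int = 2):
        super().__init__()
        self.base = base_cnn  # the original (non-interpolating) network
        self.extra = nn.ModuleList(
            [nn.Conv1d(channels, channels, kernel_size=3, padding=1)
             for _ in range(extra_layers)]
        )
        for conv in self.extra:
            nn.init.zeros_(conv.weight)   # zero init: each appended block starts as the identity
            nn.init.zeros_(conv.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.base(x)                  # output features of the original network
        for conv in self.extra:
            h = h + torch.relu(conv(h))   # residual block: no change at initialization
        return h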