Spoken languages show significant variation across regions and accents.
Despite the high performance of Mandarin automatic speech recognition (ASR),
accented ASR remains a challenging task. In this paper, we introduce
meta-learning techniques for fast accent domain expansion in Mandarin speech
recognition, which extend coverage to new accents without degrading the
performance of Mandarin ASR. Meta-learning, or learning to learn, captures
general relations across multiple domains rather than over-fitting to a single
domain, so we adopt it for the domain expansion task. This more general
learning leads to improved performance on accent domain extension tasks. We
combine meta-learning with freezing of model parameters, which makes
recognition performance more stable across different cases and speeds up
training by about 20%. Our approach significantly outperforms other methods,
by about 3% relative, on the accent domain expansion task. Compared to the
baseline model, it achieves a 37% relative improvement while performance on
the Mandarin test set remains unchanged. In addition, the method also proves
effective on a large amount of data, with a relative performance improvement
of 4% on the accent test set.
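The combination of meta-learning with parameter freezing described above can be sketched in miniature. The following is a hypothetical toy illustration, not the paper's implementation: it uses a Reptile-style meta-update over synthetic linear-regression "domains", and the `inner_sgd` routine, the domains, and the freezing mask are all assumptions made for the sketch.

```python
# Toy sketch (assumption, not the paper's code): Reptile-style meta-learning
# over synthetic "accent domains", with a boolean mask freezing some parameters.
import numpy as np

rng = np.random.default_rng(0)

def inner_sgd(theta, X, y, lr=0.05, steps=20):
    """A few SGD steps on one domain's mean-squared-error loss."""
    th = theta.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ th - y) / len(y)
        th -= lr * grad
    return th

def reptile(theta, domains, frozen, meta_lr=0.5, rounds=100):
    """Reptile meta-update; entries where `frozen` is True stay fixed."""
    for _ in range(rounds):
        X, y = domains[rng.integers(len(domains))]
        adapted = inner_sgd(theta, X, y)     # adapt to one sampled domain
        delta = meta_lr * (adapted - theta)  # move toward the adapted weights
        delta[frozen] = 0.0                  # frozen parameters do not move
        theta = theta + delta
    return theta

# Two toy "domains": a shared first weight, a domain-specific second weight.
X = rng.normal(size=(64, 2))
domains = [(X, X @ np.array([1.0, 0.5])), (X, X @ np.array([1.0, -0.5]))]

theta0 = np.zeros(2)
frozen = np.array([True, False])  # keep the first parameter at its init value
theta = reptile(theta0, domains, frozen)
```

In this sketch the frozen entry stays exactly at its initial value after meta-training, mirroring the idea of protecting Mandarin-relevant parameters while the remaining parameters adapt across accent domains.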