
    The historical characteristics of the source domains in Chinese LIFE metaphor

    This research investigates the diachronic variation of the source domains in the Chinese LIFE metaphor. Close examination of data from historical corpora reveals that the source domain types have gradually diversified in step with social, material, and cultural life. Specifically, the results show that (1) a harsh living environment and an understanding rooted in farming account for Chinese ancestors’ preference for crops as the source domain in their life metaphors, (2) territorial expansion and duplicate metaphysics together explain why the Tang Chinese favored transportation and natural phenomena as source domains in their life metaphors, and (3) the growing material enrichment and cultural diversification of modern times provide the experiential motivation for the full gamut of source domain types in Mandarin life metaphors. The conclusion is that metaphor variation reflects the material and intellectual level of society throughout the ages.

    Convergence Theory of Learning Over-parameterized ResNet: A Full Characterization

    The ResNet architecture has achieved great empirical success since its debut. Recent work established the convergence of learning an over-parameterized ResNet with a scaling factor $\tau = 1/L$ on the residual branch, where $L$ is the network depth. However, it is not clear how learning ResNet behaves for other values of $\tau$. In this paper, we fully characterize the convergence theory of gradient descent for learning an over-parameterized ResNet across the full range of $\tau$. Specifically, hiding logarithmic factors and constant coefficients, we show that for $\tau \le 1/\sqrt{L}$ gradient descent is guaranteed to converge to a global minimum, and in particular that for $\tau \le 1/L$ the convergence is independent of the network depth. Conversely, we show that for $\tau > L^{-1/2 + c}$ the forward output grows at least at rate $L^c$ in expectation, and learning then fails because of gradient explosion at large $L$. This means the bound $\tau \le 1/\sqrt{L}$ is sharp for learning ResNet at arbitrary depth. To the best of our knowledge, this is the first work to study learning ResNet over the full range of $\tau$. Comment: 31 pages
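    To make the effect of the scaling factor concrete, here is a minimal numerical sketch, not the authors' construction: a toy linear residual stack $x \leftarrow x + \tau W x$ with random near-isometric branch weights, where the depth, width, and the choice $c = 0.25$ are illustrative assumptions. Since each block multiplies the expected squared norm by roughly $(1 + \tau^2)$, the output norm behaves like $(1 + \tau^2)^{L/2}$: bounded when $\tau \le 1/\sqrt{L}$, exploding when $\tau = L^{-1/2 + c}$.

```python
import numpy as np

def forward_norm(depth: int, tau: float, width: int = 64, seed: int = 0) -> float:
    """Push a unit-norm input through `depth` toy residual blocks
    x <- x + tau * W x and return the final output norm.
    W has i.i.d. N(0, 1/width) entries, so ||W x|| ~ ||x|| on average."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    x /= np.linalg.norm(x)
    for _ in range(depth):
        W = rng.standard_normal((width, width)) / np.sqrt(width)
        x = x + tau * (W @ x)  # residual update scaled by tau
    return float(np.linalg.norm(x))

L = 1000  # hypothetical depth, chosen for illustration
for label, tau in [("tau = 1/L        ", 1 / L),
                   ("tau = 1/sqrt(L)  ", 1 / np.sqrt(L)),
                   ("tau = L^(-1/2+c) ", L ** (-0.5 + 0.25))]:  # c = 0.25
    print(label, "-> output norm ~", f"{forward_norm(L, tau):.3g}")
```

    In this toy setting the first two choices should stay within a small constant of 1, while the third should blow up by orders of magnitude, mirroring the convergence/explosion boundary at $\tau \le 1/\sqrt{L}$ that the paper proves for trained networks.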