
    Boundary expansions and convergence for complex Monge-Ampere equations

    We study boundary expansions of solutions of complex Monge-Ampère equations and discuss the convergence of such expansions. We prove a global convergence result under the assumption that the entire boundary is analytic. If only a portion of the boundary is assumed to be analytic, the expansions may not converge locally.

    On black hole spectroscopy via adiabatic invariance

    In this paper, we obtain the black hole spectroscopy by combining the adiabaticity of the black hole with the oscillating velocity of the black hole horizon, where the velocity is obtained in the tunneling framework. In particular, we show that, if canonical invariance is required, the adiabatic invariant quantity should take the covariant form $I_{\textrm{adia}}=\oint p_i\,dq_i$. Using it, the horizon area of a Schwarzschild black hole is quantized independently of the choice of coordinates, with an equally spaced spectrum always given by $\Delta \mathcal{A}=8\pi l_p^2$ in both the Schwarzschild and Painlevé coordinates.
    Comment: 13 pages, some references added, to be published in Phys. Lett.
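    As a hedged sketch (not the paper's full tunneling derivation), the equal spacing can be recovered from Bohr-Sommerfeld quantization of the adiabatic invariant, assuming the standard Schwarzschild area relation and that the invariant is proportional to the horizon area, in units $G = c = 1$:

    ```latex
    % Sketch under stated assumptions, not the paper's derivation.
    % Bohr--Sommerfeld quantization of the adiabatic invariant:
    \[
      I_{\mathrm{adia}} = \oint p_i \, dq_i = n\hbar , \qquad n = 1, 2, \dots
    \]
    % For a Schwarzschild black hole (units G = c = 1) the horizon area is
    % A = 16\pi M^2. ASSUMPTION: the adiabatic invariant comes out
    % proportional to the area, I_{adia} = A / (8\pi). Then equal steps
    % \Delta I_{adia} = \hbar between neighboring levels give
    \[
      \Delta \mathcal{A} = 8\pi \hbar = 8\pi l_p^2 ,
    \]
    % since l_p^2 = \hbar G / c^3 reduces to \hbar in these units.
    ```

    The coordinate independence claimed in the abstract would then follow because $\oint p_i\,dq_i$ is a canonical invariant, unchanged under the transformation between Schwarzschild and Painlevé coordinates.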

    Dependency Grammar Induction with Neural Lexicalization and Big Training Data

    We study the impact of big models (in terms of the degree of lexicalization) and big data (in terms of the training corpus size) on dependency grammar induction. We experimented with L-DMV, a lexicalized version of the Dependency Model with Valence, and L-NDMV, our lexicalized extension of the Neural Dependency Model with Valence. We find that L-DMV only benefits from very small degrees of lexicalization and moderate sizes of training corpora. L-NDMV can benefit from big training data and greater degrees of lexicalization, especially when enhanced with good model initialization, and it achieves a result that is competitive with the current state of the art.
    Comment: EMNLP 201