
    On the Volume of Isolated Singularities

    We give an equivalent definition of the local volume Vol_{BdFF}(X,0) of an isolated singularity introduced in [BdFF12] in the Q-Gorenstein case, and we generalize it to the non-Q-Gorenstein case. We prove that if X is Gorenstein, there is a positive lower bound, depending only on the dimension, for the non-zero local volume of an isolated singularity. We also give a non-Q-Gorenstein example with Vol_{BdFF}(X,0)=0 that does not admit a boundary \Delta such that the pair (X,\Delta) is log canonical. (Comment: 12 pages. Final version. To appear in Compos. Math.)
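    As a rough formalization of the lower-bound statement in the abstract (the constant \epsilon(n) is only asserted to exist and to depend on the dimension alone; nothing about its value is claimed here), the dichotomy can be written as:

```latex
% Hedged restatement of the abstract's claim: for a Gorenstein isolated
% singularity, the local volume is either zero or bounded below by a
% dimensional constant \epsilon(n) > 0.
\[
  \exists\, \epsilon(n) > 0 \ \text{such that for every } n\text{-dimensional
  Gorenstein isolated singularity } (X,0):\qquad
  \operatorname{Vol}_{BdFF}(X,0) = 0
  \quad\text{or}\quad
  \operatorname{Vol}_{BdFF}(X,0) \,\geq\, \epsilon(n).
\]
```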

    Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization

    We consider a generic convex optimization problem associated with regularized empirical risk minimization of linear predictors. The problem structure allows us to reformulate it as a convex-concave saddle-point problem. We propose a stochastic primal-dual coordinate (SPDC) method, which alternates between maximizing over a randomly chosen dual variable and minimizing over the primal variable. An extrapolation step on the primal variable is performed to obtain an accelerated convergence rate. We also develop a mini-batch version of the SPDC method, which facilitates parallel computing, and an extension with weighted sampling probabilities on the dual variables, which has better complexity than uniform sampling on unnormalized data. Both theoretically and empirically, we show that the SPDC method performs comparably to or better than several state-of-the-art optimization methods.
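    A minimal sketch of the dual-maximize / primal-minimize / extrapolate loop described above, specialized (as an assumption, since the abstract fixes no loss) to ridge-regularized least squares so both subproblems have closed forms. The step sizes sigma, tau, theta and the exact form of the primal update are illustrative choices, not the constants analyzed in the paper.

```python
import numpy as np

def spdc_ridge(A, y, lam, n_epochs=50, seed=0):
    """Sketch of a stochastic primal-dual coordinate (SPDC) iteration for
    ridge regression: min_w (1/2n) * sum_i (a_i^T w - y_i)^2 + (lam/2)*||w||^2,
    written as the saddle point
    min_w max_alpha (1/n) sum_i (alpha_i * a_i^T w - alpha_i^2/2 - y_i*alpha_i)
                    + (lam/2)*||w||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)           # primal variable
    w_bar = w.copy()          # extrapolated primal point
    alpha = np.zeros(n)       # dual variables, one per example
    u = A.T @ alpha / n       # running average (1/n) * sum_i alpha_i * a_i

    # Illustrative step sizes (assumption); the paper derives its own
    # tau, sigma, theta from the data norms, lam and the loss smoothness.
    R = np.max(np.linalg.norm(A, axis=1))
    sigma = np.sqrt(n * lam) / (2 * R)
    tau = np.sqrt(n / lam) / (2 * R)
    theta = 1.0 - 1.0 / (n + 2 * R * np.sqrt(n / lam))

    for _ in range(n_epochs * n):
        i = rng.integers(n)
        a_i = A[i]

        # Dual step: maximize over the randomly chosen coordinate alpha_i,
        # i.e. a prox step on phi_i*(b) = b^2/2 + y_i*b at a_i^T w_bar
        # (closed form for the quadratic loss).
        alpha_i_new = (sigma * (a_i @ w_bar - y[i]) + alpha[i]) / (sigma + 1.0)

        # Primal step: minimize the regularized model around the current w,
        # using the updated running average (closed form for the L2 term).
        v = u + (alpha_i_new - alpha[i]) * a_i / n
        w_new = (w / tau - v) / (lam + 1.0 / tau)

        # Extrapolation on the primal variable for acceleration.
        w_bar = w_new + theta * (w_new - w)

        # Bookkeeping.
        u = v
        alpha[i] = alpha_i_new
        w = w_new
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    w_true = rng.standard_normal(10)
    y = A @ w_true + 0.01 * rng.standard_normal(200)
    w_hat = spdc_ridge(A, y, lam=0.1)
    print("distance to generating weights:", np.linalg.norm(w_hat - w_true))
```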

    Macro Grammars and Holistic Triggering for Efficient Semantic Parsing

    To learn a semantic parser from denotations, a learning algorithm must search over a combinatorially large space of logical forms for ones consistent with the annotated denotations. We propose a new online learning algorithm that searches faster as training progresses. The two key ideas are using macro grammars to cache the abstract patterns of useful logical forms found thus far, and holistic triggering to efficiently retrieve the most relevant patterns based on sentence similarity. On the WikiTableQuestions dataset, we first expand the search space of an existing model to improve the state-of-the-art accuracy from 38.7% to 42.7%, and then use macro grammars and holistic triggering to achieve an 11x speedup and an accuracy of 43.7%. (Comment: EMNLP 2017.)
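    A toy sketch of the two ideas named in the abstract: caching abstract patterns of logical forms that worked before, and retrieving the most relevant ones for a new sentence by sentence similarity. The macro representation, the placeholder convention, and the token-overlap (Jaccard) similarity are illustrative assumptions, not the paper's implementation.

```python
class MacroCache:
    """Cache of (sentence, macro pattern) pairs with similarity-based retrieval."""

    def __init__(self, top_k=3):
        self.top_k = top_k
        # Each entry: (token set of the triggering sentence, macro pattern).
        self.entries = []

    @staticmethod
    def _tokens(sentence):
        return set(sentence.lower().split())

    def add(self, sentence, macro):
        """Cache the abstract pattern of a logical form that produced the
        correct denotation for this sentence."""
        self.entries.append((self._tokens(sentence), macro))

    def trigger(self, sentence):
        """Holistic triggering: return the macros whose source sentences are
        most similar to the new sentence (here, by Jaccard token overlap)."""
        toks = self._tokens(sentence)

        def jaccard(other):
            return len(toks & other) / max(1, len(toks | other))

        ranked = sorted(self.entries, key=lambda e: jaccard(e[0]), reverse=True)
        return [macro for _, macro in ranked[: self.top_k]]

if __name__ == "__main__":
    cache = MacroCache(top_k=2)
    # Hypothetical macros: the ARG placeholder abstracts away the concrete
    # column or value so the pattern can be reused across questions.
    cache.add("which nation won the most gold medals",
              "argmax(rows, count(ARG))")
    cache.add("how many games were played in 2004",
              "count(filter(rows, year == ARG))")
    print(cache.trigger("which country won the most silver medals"))
```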