
    On Lie Algebras Generated by Few Extremal Elements

    We give an overview of some properties of Lie algebras generated by at most 5 extremal elements. In particular, for any finite graph Γ and any field K of characteristic not 2, we consider an algebraic variety X over K whose K-points parametrize Lie algebras generated by extremal elements. Here the generators correspond to the vertices of the graph, and we prescribe commutation relations corresponding to the nonedges of Γ. We show that, for all connected undirected finite graphs on at most 5 vertices, X is a finite-dimensional affine space. Furthermore, we show that for maximal-dimensional Lie algebras generated by 5 extremal elements, X is a point. The latter result implies that the bilinear map describing extremality must be identically zero, so that all extremal elements are sandwich elements and the only Lie algebra of this dimension that occurs is nilpotent. These results were obtained by extensive computations with the Magma computational algebra system. The algorithms developed can be applied to arbitrary Γ (i.e., without restriction on the number of vertices), and may be of independent interest.
    Comment: 19 pages
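    For context, here is a hedged reminder of the standard definitions behind "extremal" and "sandwich" elements; the identity below follows a common normalization in the literature and is not quoted from this abstract:

        % Let L be a Lie algebra over a field K with char(K) != 2.
        % An element x of L is extremal if ad_x^2 maps L into the line Kx,
        % i.e. there is a linear functional g_x : L -> K such that
        \[
          [x,[x,y]] \;=\; 2\, g_x(y)\, x \qquad \text{for all } y \in L,
        \]
        % and x is a sandwich element exactly when this functional vanishes:
        \[
          [x,[x,y]] \;=\; 0 \qquad \text{for all } y \in L.
        \]

    The abstract's "bilinear map describing extremality" assembles the functionals g_x attached to the generators into a single form; its identical vanishing is what forces all extremal generators to be sandwich elements.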

    Rational motivic path spaces and Kim's relative unipotent section conjecture

    We initiate a study of path spaces in the nascent context of "motivic dga's", under development in doctoral work by Gabriella Guzman. This enables us to reconstruct the unipotent fundamental group of a pointed scheme from the associated augmented motivic dga, and provides us with a factorization of Kim's relative unipotent section conjecture into several smaller conjectures with a homotopical flavor. Based on a conversation with Joseph Ayoub, we prove that the path spaces of the punctured projective line over a number field are concentrated in degree zero with respect to Levine's t-structure for mixed Tate motives. This constitutes a step in the direction of Kim's conjecture.
    Comment: Minor corrections, details added, and major improvements to exposition throughout. 52 pages
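    As background for the reconstruction of the unipotent fundamental group mentioned above, the classical rational-homotopy analogue (Chen's theorem) can be stated as follows; the motivic statement studied in the paper refines this, and the formula below is offered only as the classical model:

        % For a connected pointed space X with cochain dga A and augmentation
        % \varepsilon_x given by restriction to the basepoint x, the coordinate
        % Hopf algebra of the pro-unipotent fundamental group is H^0 of the
        % reduced bar construction:
        \[
          \mathcal{O}\bigl(\pi_1^{\mathrm{un}}(X,x)\bigr)
            \;\cong\; H^0\bigl(\mathrm{Bar}(A, \varepsilon_x)\bigr).
        \]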

    Curriculum Learning by Transfer Learning: Theory and Experiments with Deep Networks

    We provide a theoretical investigation of curriculum learning in the context of stochastic gradient descent when optimizing the convex linear regression loss. We prove that the rate of convergence of an ideal curriculum learning method is monotonically increasing with the difficulty of the examples. Moreover, among all equally difficult points, convergence is faster when using points that incur a higher loss with respect to the current hypothesis. We then analyze curriculum learning in the context of training a CNN. We describe a method that infers the curriculum by way of transfer learning from another network, pre-trained on a different task. While this approach can only approximate the ideal curriculum, we empirically observe behavior similar to that predicted by the theory, namely a significant boost in convergence speed at the beginning of training. When the task is made more difficult, an improvement in generalization performance is also observed. Finally, curriculum learning exhibits robustness against unfavorable conditions such as excessive regularization.
    Comment: ICML 201
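    As an illustration of the easy-to-hard idea studied here, the following is a minimal sketch under stated assumptions, not the authors' implementation: difficulty is scored with a fixed reference hypothesis (an idealized stand-in for the pre-trained transfer network), and a single SGD pass over the sorted examples is compared against a random ordering.

        # A minimal, self-contained sketch (not the authors' code) of curriculum-
        # ordered SGD on a convex linear regression loss. Difficulty of an example
        # is scored here by its loss under a fixed reference hypothesis (w_true),
        # an idealized stand-in for the pre-trained scoring network of the paper.
        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic data: y = <w_true, x> + noise.
        d, n = 10, 1000
        w_true = rng.normal(size=d)
        X = rng.normal(size=(n, d))
        y = X @ w_true + 0.5 * rng.normal(size=n)

        def sgd(order, lr=0.01):
            """One pass of SGD visiting examples in `order`; returns the loss curve."""
            w = np.zeros(d)
            curve = []
            for i in order:
                err = X[i] @ w - y[i]
                w -= lr * err * X[i]                 # gradient of 0.5 * err**2
                curve.append(0.5 * np.mean((X @ w - y) ** 2))
            return np.array(curve)

        difficulty = 0.5 * (X @ w_true - y) ** 2     # per-example loss of the reference
        curriculum = np.argsort(difficulty)          # easy -> hard ordering
        random_order = rng.permutation(n)

        k = n // 10                                  # early phase, where a boost is predicted
        print("early mean loss, curriculum:", sgd(curriculum)[:k].mean())
        print("early mean loss, random:    ", sgd(random_order)[:k].mean())

    On this convex problem the curriculum ordering typically lowers the loss over the first steps, mirroring the predicted early-training boost; the exact numbers depend on the seed, learning rate, and noise level.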