
    Real root finding for equivariant semi-algebraic systems

    Let $R$ be a real closed field. We consider basic semi-algebraic sets defined by $n$-variate equations/inequalities of $s$ symmetric polynomials and an equivariant family of polynomials, all of them of degree bounded by $2d < n$. Such a semi-algebraic set is invariant under the action of the symmetric group. We show that such a set is either empty or it contains a point with at most $2d-1$ distinct coordinates. Combining this geometric result with efficient algorithms for real root finding (based on the critical point method), one can decide the emptiness of basic semi-algebraic sets defined by $s$ polynomials of degree $d$ in time $(sn)^{O(d)}$. This improves the state of the art, which is exponential in $n$. When the variables $x_1, \ldots, x_n$ are quantified and the coefficients of the input system depend on parameters $y_1, \ldots, y_t$, we also show that the corresponding one-block quantifier elimination problem can be solved in time $(sn)^{O(dt)}$.
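    To make the geometric statement concrete, here is a minimal brute-force sketch in Python with SymPy. It is not the paper's algorithm (which relies on the critical point method and is far more efficient), and the helper `symmetric_real_point` is hypothetical and handles equations only, not inequalities. It simply exploits the result above: for a symmetric system it suffices to search among points whose coordinates take at most $k = 2d-1$ distinct values, which reduces the problem to solving small systems in $k$ unknowns.

```python
# Illustrative sketch only: decide non-emptiness of a symmetric polynomial
# system by restricting to points with at most k distinct coordinates,
# as licensed by the paper's geometric result (k = 2d - 1 for degree <= 2d < n).
import itertools
import sympy as sp

def symmetric_real_point(polys, xs, k):
    """Search for a real solution of the symmetric system {p = 0 : p in polys}
    among points whose coordinates take at most k distinct values."""
    n = len(xs)
    ts = sp.symbols(f"t0:{k}", real=True)
    # Since the system is symmetric, it suffices to try the sorted
    # assignments of the k values to the n coordinates.
    for pattern in itertools.combinations_with_replacement(range(k), n):
        reduced = [p.subs(dict(zip(xs, (ts[i] for i in pattern)))) for p in polys]
        for sol in sp.solve(reduced, list(ts), dict=True):
            point = [sol.get(ts[i], sp.S(0)) for i in pattern]
            # Keep only fully determined real points, and verify them
            # against the original system before returning.
            if all(v.is_number and v.is_real for v in point) and \
               all(sp.simplify(p.subs(dict(zip(xs, point)))) == 0 for p in polys):
                return point
    return None  # no real point with <= k distinct coordinates found

# Example: x0^2 + ... + x4^2 = 5 has the constant solution (-1, ..., -1),
# found with k = 1 (degree 2 = 2d with d = 1, so 2d - 1 = 1).
xs = sp.symbols("x0:5", real=True)
print(symmetric_real_point([sum(x**2 for x in xs) - 5], xs, k=1))
```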

    Automatic differentiation in machine learning: a survey

    Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD), also called algorithmic differentiation or simply "autodiff", is a family of techniques similar to but more general than backpropagation for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. AD is a small but established field with applications in areas including computational fluid dynamics, atmospheric sciences, and engineering design optimization. Until very recently, the fields of machine learning and AD have largely been unaware of each other and, in some cases, have independently discovered each other's results. Despite its relevance, general-purpose AD has been missing from the machine learning toolbox, a situation slowly changing with its ongoing adoption under the names "dynamic computational graphs" and "differentiable programming". We survey the intersection of AD and machine learning, cover applications where AD has direct relevance, and address the main implementation techniques. By precisely defining the main differentiation techniques and their interrelationships, we aim to bring clarity to the usage of the terms "autodiff", "automatic differentiation", and "symbolic differentiation" as these are encountered more and more in machine learning settings.
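    As a concrete illustration of one of the techniques the survey defines, here is a minimal forward-mode AD sketch using dual numbers (illustrative Python, not the API of any particular AD library): every value carries a tangent that is propagated by the chain rule, so a function and its exact derivative are evaluated together in a single forward pass.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
import math
from dataclasses import dataclass

@dataclass
class Dual:
    val: float   # primal value f(x)
    der: float   # tangent value f'(x), propagated by the chain rule

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o, 0.0)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o, 0.0)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule through an elementary function: (sin f)' = cos(f) * f'
    return Dual(math.sin(x.val), math.cos(x.val) * x.der) if isinstance(x, Dual) else math.sin(x)

def derivative(f, x):
    """Evaluate f and f' at x in a single forward pass."""
    y = f(Dual(x, 1.0))  # seed the tangent with dx/dx = 1
    return y.val, y.der

# d/dx [x*sin(x) + 2x] at x = 1.0  ->  sin(1) + cos(1) + 2
print(derivative(lambda x: x * sin(x) + 2 * x, 1.0))
```

    Reverse-mode AD (the generalization of backpropagation discussed in the survey) instead records the computation and propagates adjoints backwards, which is more efficient for functions with many inputs and few outputs, such as scalar losses over large parameter vectors.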

    Two Decades of Maude

    Get PDF
    This paper is a tribute to José Meseguer, from the rest of us in the Maude team, reviewing the past, the present, and the future of the language and system with which we have been working for around two decades under his leadership. After reviewing the origins and the language's main features, we present the latest additions to the language and some features currently under development. This paper is not an introduction to Maude, and some familiarity with it and with rewriting logic is indeed assumed.