
    Divided Differences of Implicit Functions

    Under general conditions, the equation $g(x,y) = 0$ implicitly defines $y$ locally as a function of $x$. In this article, we express divided differences of $y$ in terms of bivariate divided differences of $g$, generalizing a recent result on divided differences of inverse functions.
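
    As background for the quantities this abstract refers to, here is a minimal Python sketch of standard univariate divided differences computed by the usual recurrence; it is illustrative only, the function name and sample nodes are hypothetical, and it is not the bivariate construction developed in the article.

        # Univariate divided differences via the recurrence
        # [x_i,...,x_{i+k}]f = ([x_{i+1},...,x_{i+k}]f - [x_i,...,x_{i+k-1}]f) / (x_{i+k} - x_i).
        def divided_differences(xs, ys):
            """Return [x_0]f, [x_0,x_1]f, ..., [x_0,...,x_n]f for samples ys = f(xs)."""
            coeffs = list(ys)
            n = len(xs)
            for k in range(1, n):
                # Update from the bottom so entries of the previous column stay available.
                for i in range(n - 1, k - 1, -1):
                    coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - k])
            return coeffs

        # Example: f(x) = x^2 on the nodes 0, 1, 3 gives [0]f = 0, [0,1]f = 1, [0,1,3]f = 1.
        print(divided_differences([0.0, 1.0, 3.0], [0.0, 1.0, 9.0]))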

    A parabolic free boundary problem with Bernoulli type condition on the free boundary

    Consider the parabolic free boundary problem $\Delta u - \partial_t u = 0$ in $\{u>0\}$, $|\nabla u| = 1$ on $\partial\{u>0\}$. For a realistic class of solutions, containing for example {\em all} limits of the singular perturbation problem $\Delta u_\epsilon - \partial_t u_\epsilon = \beta_\epsilon(u_\epsilon)$ as $\epsilon \to 0$, we prove that one-sided flatness of the free boundary implies regularity. In particular, we show that the topological free boundary $\partial\{u>0\}$ can be decomposed into an {\em open} regular set (relative to $\partial\{u>0\}$), which is locally a surface with Hölder-continuous space normal, and a closed singular set. Our result extends the main theorem of the paper by H.W. Alt and L.A. Caffarelli (1981) to more general solutions as well as to the time-dependent case. Our proof uses methods developed in H.W. Alt and L.A. Caffarelli (1981); however, we replace the core of that paper, which relies on non-positive mean curvature at singular points, by an argument based on scaling discrepancies, which promises to be applicable to more general free boundary or free discontinuity problems.
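
    To make the singular perturbation problem concrete, the following is a minimal 1-D explicit finite-difference sketch of $\partial_t u_\epsilon = \Delta u_\epsilon - \beta_\epsilon(u_\epsilon)$. The scaling $\beta_\epsilon(s) = \beta(s/\epsilon)/\epsilon$ with a bump $\beta$ normalized so that $\int_0^1 \beta = 1/2$ (matching the $|\nabla u| = 1$ condition in the limit), the initial and boundary data, and all grid parameters are illustrative assumptions, not taken from the paper.

        # Sketch under stated assumptions: explicit finite differences for
        # u_t = u_xx - beta_eps(u) in 1-D, with beta_eps(s) = beta(s/eps)/eps.
        import numpy as np

        eps = 0.05
        nx, L = 400, 4.0
        dx = L / nx
        dt = 0.4 * dx**2                # explicit scheme: needs dt <= dx^2 / 2
        x = np.linspace(0.0, L, nx + 1)

        def beta_eps(u):
            s = u / eps
            bump = np.where((s > 0.0) & (s < 1.0), 3.0 * s * (1.0 - s), 0.0)  # integral 1/2
            return bump / eps

        u = np.maximum(1.0 - x, 0.0)    # hypothetical initial data, positive near x = 0
        for _ in range(2000):
            lap = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
            u[1:-1] += dt * (lap - beta_eps(u[1:-1]))
            u[0], u[-1] = 1.0, 0.0      # Dirichlet values at the ends (also an assumption)
            u = np.maximum(u, 0.0)      # crude safeguard against explicit-scheme overshoot

        # Approximate free boundary: first grid point where the solution vanishes.
        print("free boundary near x =", x[np.argmax(u <= 0.0)])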

    Learning Equations for Extrapolation and Control

    We present an approach to identifying concise equations from data using a shallow neural network. In contrast to ordinary black-box regression, this approach allows understanding functional relations and generalizing them from observed data to unseen parts of the parameter space. We show how to extend the class of learnable equations for a recently proposed equation learning network to include divisions, and we improve the learning and model selection strategy to make it useful for challenging real-world data. For systems governed by analytical expressions, our method can in many cases identify the true underlying equation and extrapolate to unseen domains. We demonstrate its effectiveness by experiments on a cart-pendulum system, where only 2 random rollouts are required to learn the forward dynamics and successfully achieve the swing-up task. Comment: 9 pages, 9 figures, ICML 201
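
    To illustrate the kind of network the abstract describes, here is a minimal PyTorch sketch of an equation-learning layer with identity/sin/cos and product units plus an output layer of guarded divisions. The unit counts, class names, and the clamped-denominator guard are illustrative assumptions; the sketch omits the regularization and the improved learning and model-selection strategy the abstract refers to.

        # Sketch only: EQL-style building blocks, not the authors' exact architecture.
        import torch
        import torch.nn as nn

        class EqlLayer(nn.Module):
            """Hidden layer with identity, sin, cos units and pairwise product units."""
            def __init__(self, in_dim, n_unary=4, n_prod=2):
                super().__init__()
                self.n_unary, self.n_prod = n_unary, n_prod
                self.lin = nn.Linear(in_dim, 3 * n_unary + 2 * n_prod)

            def forward(self, x):
                z = self.lin(x)
                u = self.n_unary
                ident = z[:, :u]
                sine = torch.sin(z[:, u:2 * u])
                cosine = torch.cos(z[:, 2 * u:3 * u])
                pairs = z[:, 3 * u:]
                prod = pairs[:, 0::2] * pairs[:, 1::2]   # products of consecutive pre-activations
                return torch.cat([ident, sine, cosine, prod], dim=1)

        class DivisionOutput(nn.Module):
            """Output layer computing numerator / max(denominator, theta) per output."""
            def __init__(self, in_dim, out_dim, theta=1e-2):
                super().__init__()
                self.num = nn.Linear(in_dim, out_dim)
                self.den = nn.Linear(in_dim, out_dim)
                self.theta = theta

            def forward(self, x):
                return self.num(x) / torch.clamp(self.den(x), min=self.theta)

        # Hypothetical usage: two inputs, one hidden layer (3*4 + 2 = 14 features), one output.
        net = nn.Sequential(EqlLayer(in_dim=2), DivisionOutput(in_dim=14, out_dim=1))
        print(net(torch.randn(8, 2)).shape)   # torch.Size([8, 1])

    The clamp is only a stand-in that keeps denominators away from zero so gradients stay finite; the paper's own handling of divisions and its model-selection strategy are not reproduced here.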