
    Theory of Lexicographic Differentiation in the Banach Space Setting

    Derivative information is useful for many problems in science and engineering that require equation solving or optimization. Driven by this utility and by mathematical curiosity, researchers have developed a variety of generalized derivatives over the years. In this thesis, we first examine Clarke's generalized derivative for locally Lipschitz continuous functions between Euclidean spaces, which is, roughly, the smallest convex set containing all derivatives attained near a domain point of interest. In this setting, Clarke's generalized derivative possesses a strong theoretical and numerical toolkit analogous to that of the classical derivative, including nonsmooth versions of the chain rule, the mean value theorem, and the implicit function theorem, as well as nonsmooth equation-solving and optimization methods. However, it is generally difficult to obtain elements of Clarke's generalized derivative in the Euclidean space setting. To address this issue, we use lexicographic differentiation, due to Nesterov, and lexicographic directional differentiation, due to Khan and Barton. These are generalized derivative theories for a subclass of locally Lipschitz continuous functions, the class of lexicographically smooth functions, and they provide a systematic way to find elements of Clarke's generalized derivative in the Euclidean space setting. Lexicographic derivatives are either elements of Clarke's generalized derivative in this setting or at least indistinguishable from them as far as numerical tools are concerned. We outline a process by which a lexicographic derivative can be found once a lexicographic directional derivative is known. Lastly, we present lexicographic differentiation theory for a subclass of locally Lipschitz continuous functions mapping between Banach spaces that have Schauder bases, called, unsurprisingly, the class of lexicographically smooth functions. We provide a proof of Nesterov's result that, as in the Euclidean space setting, lexicographic derivatives in this setting satisfy a sharp calculus rule.
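    For orientation, the sketch below writes out, in standard notation not taken from the thesis itself, the two objects this abstract refers to: Clarke's generalized Jacobian of a locally Lipschitz function (via the usual Rademacher-based definition) and the recursion of directional derivatives underlying lexicographic differentiation, where S_f denotes the full-measure set of points at which f is differentiable and M = [m_1 ... m_n] is a nonsingular directions matrix.

```latex
% Clarke's generalized Jacobian of a locally Lipschitz f : R^n -> R^m at x
% (conv = convex hull, D f = the usual Jacobian, S_f = differentiability set):
\[
  \partial f(x) \;=\; \mathrm{conv}\Bigl\{ \lim_{i\to\infty} \mathrm{D}f(x_i)
      \;:\; x_i \to x,\ x_i \in S_f \Bigr\}.
\]
% Lexicographic differentiation (sketch): for a lexicographically smooth f and
% a nonsingular directions matrix M = [m_1 \; \cdots \; m_n], iterate
% directional derivatives; the final mapping is linear, and its Jacobian is
% the lexicographic derivative associated with x and M.
\[
  f^{(0)}_{x,M} := f'(x;\,\cdot\,), \qquad
  f^{(j)}_{x,M} := \bigl[f^{(j-1)}_{x,M}\bigr]'(m_j;\,\cdot\,), \quad j = 1,\dots,n.
\]
```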

    Constructing a subgradient from directional derivatives for functions of two variables

    For any scalar-valued bivariate function that is locally Lipschitz continuous and directionally differentiable, it is shown that a subgradient may always be constructed from the function's directional derivatives in the four compass directions, arranged in a so-called "compass difference". When the original function is nonconvex, the obtained subgradient is an element of Clarke's generalized gradient, but the result appears to be novel even for convex functions. The function is not required to be represented in any particular form, and no further assumptions are required, though the result is strengthened when the function is additionally L-smooth in the sense of Nesterov. For certain optimal-value functions and certain parametric solutions of differential equation systems, these new results appear to provide the only known way to compute a subgradient. These results also imply that centered finite differences will converge to a subgradient for bivariate nonsmooth functions. As a dual result, we find that any compact convex set in two dimensions contains the midpoint of its interval hull. Examples are included for illustration, and it is demonstrated that these results do not extend directly to functions of more than two variables or to sets in higher dimensions. Comment: 16 pages, 2 figures.
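    As a concrete illustration of the construction described above, the sketch below assembles a compass difference from directional derivatives supplied by the caller, assuming the compass difference is the vector of half-differences of directional derivatives in opposite coordinate directions. The names compass_difference and dir_deriv and the worked example f(x, y) = |x| + max(x, y) are illustrative choices, not taken from the paper.

```python
import numpy as np

def compass_difference(dir_deriv, x):
    """Assemble the compass difference of a bivariate function at x.

    dir_deriv(x, d) is assumed to return the directional derivative f'(x; d).
    Component i is half the difference of the directional derivatives in the
    +e_i and -e_i compass directions.
    """
    e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    return 0.5 * np.array([
        dir_deriv(x, e1) - dir_deriv(x, -e1),
        dir_deriv(x, e2) - dir_deriv(x, -e2),
    ])

# Hand-coded directional derivatives of f(x, y) = |x| + max(x, y), which is
# nonsmooth along x = 0 and x = y (hypothetical example for illustration).
def f_dir_deriv(x, d):
    d_abs = d[0] if x[0] > 0 else (-d[0] if x[0] < 0 else abs(d[0]))
    d_max = d[0] if x[0] > x[1] else (d[1] if x[1] > x[0] else max(d[0], d[1]))
    return d_abs + d_max

# At the kink (0, 0) this prints [0.5 0.5], which is an element of Clarke's
# generalized gradient of f there.
print(compass_difference(f_dir_deriv, np.array([0.0, 0.0])))
```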

    Lexicographic Sensitivity Functions for Nonsmooth Models in Mathematical Biology

    Systems of ordinary differential equations (ODEs) may be used to model a wide variety of real-world phenomena in biology and engineering. Classical sensitivity theory is well established and concerns itself with quantifying the responsiveness of such models to changes in parameter values. Performing a sensitivity analysis yields a variety of insights into a model (and hence into the real-world system it represents); in particular, it can uncover a system's most important aspects for use in the design, control, or optimization of the system. However, while the results of such analysis are desirable, the approach that classical theory offers is limited to ODE systems whose right-hand side functions are at least once continuously differentiable. This requirement is restrictive for many real-world systems in which sudden changes in behavior are observed, since a sharp change of this type often translates to a point of nondifferentiability in the model itself. To contend with this issue, recently developed theory employing a specific class of tools called lexicographic derivatives has been shown to extend classical sensitivity results to a broad subclass of locally Lipschitz continuous ODE systems whose right-hand side functions are referred to as lexicographically smooth. In this thesis, we begin by exploring relevant background theory before presenting lexicographic sensitivity functions as a useful extension of classical sensitivity functions; after establishing the theory, we apply it to two models in mathematical biology. The first concerns a model of glucose-insulin kinetics, in which nondifferentiability arises when a biochemical threshold within the body is crossed; the second models the spread of rioting activity, in which similar nonsmooth behavior is introduced to capture a tipping-point behavior where susceptible individuals begin joining a riot at a faster rate once a threshold riot size is crossed. Simulations and lexicographic sensitivity functions are given for each model, and the implications of our results are discussed.
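    For context, the classical construction that the lexicographic theory generalizes can be sketched in a few lines: for a smooth parametric ODE x' = f(x, p), the sensitivity s = dx/dp solves the forward sensitivity equation s' = (df/dx) s + df/dp alongside the state. The snippet below integrates this augmented system for the toy problem x' = -p x (a hypothetical example, not one of the thesis's models) and compares the result with the closed-form sensitivity.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Augmented system for x' = -p*x, x(0) = 1: the sensitivity s = dx/dp obeys
#   s' = (df/dx)*s + df/dp = -p*s - x,   s(0) = 0.
def augmented_rhs(t, y, p):
    x, s = y
    return [-p * x, -p * s - x]

p = 0.7
sol = solve_ivp(augmented_rhs, (0.0, 5.0), [1.0, 0.0], args=(p,),
                rtol=1e-8, atol=1e-10)

# Closed form: x(t) = exp(-p*t), so dx/dp = -t*exp(-p*t).
t_end = sol.t[-1]
x_end, s_end = sol.y[:, -1]
print(s_end, -t_end * np.exp(-p * t_end))
```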