    Non-differentiable functions defined in terms of classical representations of real numbers

    The present article is devoted to functions from a certain subclass of non-differentiable functions. The arguments and values of the considered functions are represented by the s-adic or nega-s-adic representation of real numbers. The technique for modeling such functions is the simplest among the well-known techniques for constructing non-differentiable functions: the value of the function is obtained from the s-adic or nega-s-adic representation of the argument by a certain change of digits or combinations of digits.
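    A minimal sketch of the digit-change idea described above, assuming base s = 3 and an illustrative substitution that swaps the digits 1 and 2; the base, the substitution, and the function name are my own choices, not the article's specific construction.

```python
# Sketch of a "change of digits" construction: expand x in base s, replace
# each digit via a fixed substitution, and reassemble the new digits into a
# number. Whether the resulting map is non-differentiable depends on the
# substitution; functions of this general type are the article's subject.
S = 3
SWAP = {0: 0, 1: 2, 2: 1}  # hypothetical substitution: swap digits 1 and 2

def digit_change(x, depth=30):
    """Map x in [0, 1) to the number whose s-adic digits are SWAP[digit]."""
    y, frac = 0.0, x
    for k in range(1, depth + 1):
        frac *= S
        d = int(frac)           # k-th s-adic digit of x
        frac -= d
        y += SWAP[d] / S ** k   # reassemble with the substituted digit
    return y

print(digit_change(0.5))        # evaluate the digit-change map at one point
```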

    Generalized monotonicity and convexity of non-differentiable functions

    The relationships between (strict, strong) convexity of non-differentiable functions and (strict, strong) monotonicity of set-valued mappings, between (strict, strong, sharp) pseudo-convexity of non-differentiable functions and (strict, strong) pseudo-monotonicity of set-valued mappings, and between quasi-convexity of non-differentiable functions and quasi-monotonicity of set-valued mappings are studied in this paper. In addition, the relations between generalized convexity of non-differentiable functions and generalized co-coerciveness of set-valued mappings are analyzed.
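    A standard illustration of the convexity–monotonicity correspondence the abstract refers to (a textbook example, not taken from the paper): for the convex but non-differentiable function f(x) = |x|, the subdifferential is a monotone set-valued mapping.

```latex
% f(x) = |x| is convex but not differentiable at 0; its subdifferential
% is a monotone set-valued mapping.
\[
  \partial f(x) =
  \begin{cases}
    \{-1\},   & x < 0,\\
    [-1,\,1], & x = 0,\\
    \{+1\},   & x > 0,
  \end{cases}
  \qquad
  (u - v)(x - y) \ \ge\ 0
  \quad \text{for all } u \in \partial f(x),\ v \in \partial f(y).
\]
```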

    GEL Estimation and Inference with Non-Smooth Moment Indicators and Dynamic Data

    In this paper we demonstrate consistency and asymptotic normality of Generalized Empirical Likelihood (GEL) estimation in dynamic models when the moment indicators are non-differentiable functions of the parameters of interest.
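    A rough illustration, under my own simplifying assumptions (i.i.d. data, a scalar parameter, and the exponential-tilting member of the GEL family), of estimation with a non-differentiable moment indicator; the moment identifies a quantile and is a step function of the parameter. The paper's dynamic-data setting is more general.

```python
# GEL-type estimation with a non-smooth moment indicator (illustrative only).
# Moment: g(y, theta) = 1{y <= theta} - tau, a step function of theta.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
y = rng.normal(size=500)
tau = 0.5  # target quantile level

def moment(theta):
    return (y <= theta).astype(float) - tau

def gel_profile(theta):
    g = moment(theta)
    # Inner problem: sup over lambda of mean(rho(lambda * g)), rho(v) = 1 - exp(v).
    inner = minimize_scalar(lambda lam: -np.mean(1.0 - np.exp(lam * g)),
                            bounds=(-10.0, 10.0), method="bounded")
    return -inner.fun

# Outer problem: minimize the profiled GEL objective over theta.
# A derivative-free search is used because the moment is non-smooth in theta.
outer = minimize_scalar(gel_profile, bounds=(-3.0, 3.0), method="bounded")
print("GEL estimate:", outer.x, "sample median:", np.median(y))
```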

    On Correctness of Automatic Differentiation for Non-Differentiable Functions

    Differentiation lies at the core of many machine-learning algorithms and is well supported by popular autodiff systems such as TensorFlow and PyTorch. These systems were originally developed to compute derivatives of differentiable functions, but in practice they are commonly applied to functions with non-differentiabilities. For instance, neural networks using ReLU define non-differentiable functions in general, yet the gradients of losses involving those functions are routinely computed with autodiff systems. This status quo raises a natural question: are autodiff systems correct in any formal sense when applied to such non-differentiable functions? In this paper, we provide a positive answer to this question. We first use counterexamples to point out flaws in commonly used informal arguments, such as the claim that non-differentiabilities arising in deep learning cause no issues because they form a measure-zero set. We then investigate a class of functions, called PAP functions, that includes nearly all (possibly non-differentiable) functions used in deep learning today. For these PAP functions, we propose a new type of derivatives, called intensional derivatives, and prove that these derivatives always exist and coincide with standard derivatives for almost all inputs. We also show that these intensional derivatives are, in essence, what most autodiff systems compute or attempt to compute. In this way, we formally establish the correctness of autodiff systems applied to non-differentiable functions.
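    A small illustration of the practice described above, assuming PyTorch is available: autodiff returns a definite value for the derivative of ReLU at its kink, where no classical derivative exists.

```python
# ReLU is non-differentiable at 0, yet autodiff reports a value there.
# PyTorch's convention is to return 0 for d/dx relu(x) at x = 0.
import torch

x = torch.tensor(0.0, requires_grad=True)
y = torch.relu(x)
y.backward()
print(x.grad)  # tensor(0.) -- a chosen value at the kink, not a classical derivative
```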

    A Non-Monotone Conjugate Subgradient Type Method for Minimization of Convex Functions

    We suggest a conjugate subgradient type method without any line search for the minimization of convex non-differentiable functions. Unlike the customary methods of this class, it does not require a monotone decrease of the objective function and substantially reduces the implementation cost of each iteration. At the same time, its step-size procedure takes into account the behavior of the method along the iteration points. Preliminary computational experiments confirm the efficiency of the proposed modification.
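    A rough sketch of the general idea (not the authors' specific rule): a subgradient scheme whose search direction combines the current subgradient with the previous direction, driven by a predefined divergent-series step size instead of a line search, so the objective is not forced to decrease at every iteration.

```python
# Subgradient method with a conjugate-style direction and a divergent-series
# step size (no line search, no monotonicity enforcement). The test function
# and parameters are illustrative, not taken from the paper.
import numpy as np

def f(x):
    return abs(x[0]) + 2.0 * abs(x[1] - 1.0)          # minimizer at (0, 1)

def subgrad(x):
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1] - 1.0)])

x = np.array([5.0, -3.0])
d = np.zeros(2)
beta = 0.5                       # weight on the previous direction
for k in range(1, 2001):
    g = subgrad(x)
    d = g + beta * d             # conjugate-style combination of directions
    x = x - (1.0 / k) * d        # predefined step size; f(x) may increase on some steps
print("approx. minimizer:", x, "f(x) =", f(x))
```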

    From Ordients to Optimization: Substitution Effects without Differentiability

    This paper introduces the concept of an ordient for binary relations (preferences), a relative of the concept of a gradient for functions (utilities). The lexicographic order, albeit not representable, has an ordient. Not only do binary relations representable by differentiable functions have an ordient; preferences representable by non-differentiable functions may have one as well. We characterize the constrained maxima of binary relations through ordients and provide an implicit function theorem and an envelope theorem. Ordients have a natural economic interpretation as marginal rates of substitution. We apply our results to the classic problem of maximizing preferences over budget sets.
    Keywords: binary relation; ordient; maxima; envelope theorem; implicit function theorem
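    For orientation, the textbook differentiable benchmark that ordients generalize (a standard fact, not a result of the paper): an interior maximum of a utility u over the budget set {x : p·x ≤ w} is characterized by a gradient condition, i.e. marginal rates of substitution equal price ratios.

```latex
% Differentiable benchmark: interior maximum of u on {x : p . x <= w}.
\[
  \nabla u(x^{\ast}) = \lambda\, p \ \text{ for some } \lambda > 0
  \quad\Longrightarrow\quad
  \frac{\partial u / \partial x_i}{\partial u / \partial x_j}\bigg|_{x^{\ast}}
  = \frac{p_i}{p_j}
  \quad \text{(MRS equals the price ratio).}
\]
```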