
    On the Almost Everywhere Continuity

    The aim of this paper is to provide characterizations of the Lebesgue-almost everywhere continuity of a function f : [a, b] \rightarrow \mathbb{R}. These characterizations make it possible to obtain necessary and sufficient conditions for the Riemann integrability of f.
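    For context, the classical Lebesgue criterion (a well-known result in this area, stated here as background rather than as the paper's own characterization) links the two notions as follows:

```latex
% Lebesgue's criterion for Riemann integrability: a function f on [a,b]
% is Riemann integrable iff it is bounded and its set of discontinuity
% points is Lebesgue-null.
f \in \mathcal{R}([a,b])
\iff
f \text{ is bounded and }
\lambda\bigl(\{x \in [a,b] : f \text{ is discontinuous at } x\}\bigr) = 0 .
```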

    On the multiplier rules

    We establish new first-order necessary conditions of optimality for finite-dimensional problems with inequality constraints, and for problems with both equality and inequality constraints, in the form of John's theorem and of the Karush-Kuhn-Tucker theorem. In comparison with existing results, we weaken the assumptions of continuity and of differentiability.
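    In their classical smooth form (the standard textbook statement, not the weakened version established in the paper), the Karush-Kuhn-Tucker conditions for minimizing $f$ subject to $g_i(x) \le 0$ and $h_j(x) = 0$ read:

```latex
% Stationarity, dual feasibility, and complementary slackness
% at a local minimizer x^* under a suitable constraint qualification:
\nabla f(x^*) + \sum_{i} \lambda_i \nabla g_i(x^*)
             + \sum_{j} \mu_j \nabla h_j(x^*) = 0,
\qquad \lambda_i \ge 0,
\qquad \lambda_i\, g_i(x^*) = 0 \ \text{for all } i .
```

    John's theorem is the variant in which an extra multiplier $\lambda_0 \ge 0$ also weights $\nabla f(x^*)$, without requiring a constraint qualification.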

    Discrete-time Pontryagin principles in Banach spaces

    The aim of this paper is to establish Pontryagin principles in a discrete-time infinite-horizon setting where the state variables and the control variables belong to infinite-dimensional Banach spaces. In comparison with previous results on this question, we remove the conditions of finite codimension of subspaces. To realize this aim, the main idea is the introduction of new recursive assumptions, together with useful consequences of the Baire category theorem and of the Banach isomorphism theorem.

    Infinite Dimensional Multipliers and Pontryagin Principles for Discrete-Time Problems

    The aim of this paper is to provide improvements to Pontryagin principles in the infinite-horizon discrete-time framework when the space of states and the space of controls are infinite-dimensional. We use the method of reduction to finite horizon and several functional-analytic lemmas to realize our aim.

    Maxmin convolutional neural networks for image classification

    Convolutional neural networks (CNN) are widely used in computer vision, especially in image classification. However, the way in which information and invariance properties are encoded in deep CNN architectures is still an open question. In this paper, we propose to modify the standard convolutional block of a CNN in order to transfer more information layer after layer while keeping some invariance within the network. Our main idea is to exploit both the positive and the negative high scores obtained in the convolution maps. This behavior is obtained by modifying the traditional activation function step before pooling. We double the maps with specific activation functions, a scheme called the MaxMin strategy, in order to achieve our pipeline. Extensive experiments on two classical datasets, MNIST and CIFAR-10, show that our deep MaxMin convolutional net outperforms a standard CNN.
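    The MaxMin idea described above — keeping both the positive and the negative high scores of the convolution maps by doubling the channels before pooling — can be sketched as follows. This is a minimal NumPy illustration of the doubling step only (the function name `maxmin_activation` is ours, and the sketch omits the convolution and pooling layers that surround it in the actual architecture):

```python
import numpy as np

def maxmin_activation(conv_maps):
    """Double the feature maps: keep the positive responses via ReLU(x)
    and the (negated) negative responses via ReLU(-x), stacked along
    the channel axis."""
    pos = np.maximum(conv_maps, 0.0)   # positive high scores
    neg = np.maximum(-conv_maps, 0.0)  # negative high scores, sign-flipped
    return np.concatenate([pos, neg], axis=0)  # channel count is doubled

# Toy example: 2 feature maps of size 2x2.
x = np.array([[[1.0, -2.0], [0.5, -0.5]],
              [[-1.0, 3.0], [0.0, -4.0]]])
y = maxmin_activation(x)
print(y.shape)  # (4, 2, 2): twice as many maps; pooling would follow
```

    With a standard ReLU alone, the strong negative scores in `x` would be discarded; here they survive as separate maps, which is what lets the network transfer more information layer after layer.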

    Pontryagin principle for a Mayer problem governed by a delay functional differential equation

    We establish Pontryagin principles for a Mayer optimal control problem governed by a functional differential equation. The control functions are piecewise continuous and the state functions are piecewise continuously differentiable. To do so, we follow the method developed by Philippe Michel for systems governed by ordinary differential equations, and we use properties of the resolvent of a linear functional differential equation.