263 research outputs found

    Variational Analysis of Marginal Functions with Applications to Bilevel Programming

    This paper pursues a twofold goal. First, we derive new results on generalized differentiation in variational analysis, focusing mainly on a broad class of intrinsically nondifferentiable marginal/value functions. We then apply the results established in this direction to derive necessary optimality conditions for the optimistic version of bilevel programs, which occupy a remarkable place in optimization theory and its various applications. We obtain new sets of optimality conditions in both smooth and nonsmooth settings of finite-dimensional and infinite-dimensional spaces.
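    In standard value-function notation (the symbols below are generic placeholders, not taken from the paper), the optimistic bilevel program and the marginal function that makes it intrinsically nonsmooth can be sketched as:

    ```latex
    \min_{x,\,y}\ F(x,y) \quad \text{s.t.}\quad y \in S(x) := \operatorname*{argmin}_{y \in Y} f(x,y),
    \qquad
    v(x) := \min_{y \in Y} f(x,y).
    ```

    The value-function reformulation replaces the constraint $y \in S(x)$ with $f(x,y) \le v(x)$, $y \in Y$; since the marginal function $v$ is typically nondifferentiable even for smooth data, generalized differentiation of such functions is exactly what necessary optimality conditions for bilevel programs require.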

    A Unified Approach to Convex and Convexified Generalized Differentiation of Nonsmooth Functions and Set-Valued Mappings

    In the early 1960's, Moreau and Rockafellar introduced the concept of the \emph{subgradient} for convex functions, initiating the development of theoretical and applied convex analysis. The need to go beyond convexity motivated the pioneering work of Clarke on a generalized differentiation theory for Lipschitz continuous functions. Although Clarke's generalized differentiation theory is applicable to nonconvex functions, convexity still plays a crucial role in Clarke subdifferential calculus. In the mid 1970's, Mordukhovich developed another generalized differentiation theory for nonconvex functions and set-valued mappings in which the "umbilical cord with convexity" no longer exists. The primary goal of this paper is to present a unified approach and shed new light on convex and Clarke generalized differentiation theories using the concepts and techniques from Mordukhovich's developments.
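    For reference, the Moreau--Rockafellar subgradient of a convex function $f$ at a point $\bar{x}$ (a standard definition, quoted here for context rather than from the paper) is the set

    ```latex
    \partial f(\bar{x}) = \{\, v \in X^* : f(x) \ge f(\bar{x}) + \langle v,\, x - \bar{x} \rangle \ \text{for all } x \in X \,\},
    ```

    and both the Clarke and the Mordukhovich subdifferentials reduce to this set when $f$ is convex.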

    Augmented Lagrangians and Marginal Values in Parameterized Optimization Problems

    When an optimization problem depends on parameters, the minimum value in the problem as a function of the parameters is typically far from being differentiable. Certain subderivatives nevertheless exist and can be interpreted as generalized marginal values. In this paper such subderivatives are studied in an abstract setting that allows for infinite dimensionality of the decision space. By means of the notion of proximal subgradients, a new general formula of subdifferentiation is established which provides an upper bound for the marginal values in question and a very broad criterion for local Lipschitz continuity of the optimal value function. Augmented Lagrangians are introduced and shown to lead to still sharper estimates in terms of special multiplier vectors. This approach opens the way to taking higher-order optimality conditions into account in such estimates.
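    A classical finite-dimensional instance of the marginal-value interpretation, used here only as an illustration of the kind of estimate the paper generalizes: for the parameterized problem

    ```latex
    v(u) = \inf_x \{\, f(x) : g(x) \le u \,\},
    ```

    under convexity and a constraint qualification one has $-\bar{\lambda} \in \partial v(0)$ for any Lagrange multiplier $\bar{\lambda} \ge 0$ of the unperturbed problem, so multipliers bound the marginal change in optimal value. The proximal-subgradient and augmented-Lagrangian estimates of the paper are in the spirit of this picture, extended to nonconvex and infinite-dimensional settings.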

    Reflection methods for user-friendly submodular optimization

    Recently, it has become evident that submodularity naturally captures widely occurring concepts in machine learning, signal processing and computer vision. Consequently, there is a need for efficient optimization procedures for submodular functions, especially for minimization problems. While general submodular minimization is challenging, we propose a new method that exploits existing decomposability of submodular functions. In contrast to previous approaches, our method is neither approximate, nor impractical, nor does it need any cumbersome parameter tuning. Moreover, it is easy to implement and parallelize. A key component of our method is a formulation of the discrete submodular minimization problem as a continuous best approximation problem that is solved through a sequence of reflections, and its solution can be easily thresholded to obtain an optimal discrete solution. This method solves both the continuous and discrete formulations of the problem, and therefore has applications in learning, inference, and reconstruction. In our experiments, we illustrate the benefits of our method on two image segmentation tasks.
    Comment: Neural Information Processing Systems (NIPS), États-Unis (2013).
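    As a generic illustration of the reflection machinery the abstract mentions (a Douglas-Rachford-type averaged-alternating-reflections scheme for a convex feasibility problem between two simple sets, not the paper's specific decomposable-submodular algorithm; the sets and starting point below are illustrative assumptions):

    ```python
    import numpy as np

    # Two convex sets in R^2 (illustrative choices):
    #   A: the closed unit ball,  B: the hyperplane {x : <a, x> = b}.
    a = np.array([1.0, 1.0]) / np.sqrt(2.0)  # unit normal of the hyperplane
    b = 0.5

    def proj_ball(x):
        """Project onto the closed unit ball."""
        n = np.linalg.norm(x)
        return x if n <= 1.0 else x / n

    def proj_hyperplane(x):
        """Project onto {x : <a, x> = b} (a has unit norm)."""
        return x - (a @ x - b) * a

    def reflect(proj, x):
        """Reflection through a convex set: R = 2P - I."""
        return 2.0 * proj(x) - x

    # Averaged alternating reflections (Douglas-Rachford iteration):
    #   z <- (I + R_B R_A)/2 applied to z; the "shadow" P_A(z) converges
    #   to a point in the intersection A ∩ B when it is nonempty.
    z = np.array([2.0, -1.0])
    for _ in range(500):
        z = 0.5 * (z + reflect(proj_hyperplane, reflect(proj_ball, z)))

    x = proj_ball(z)  # shadow point lying (approximately) in A ∩ B;
                      # in the submodular setting, the continuous solution
                      # is then thresholded to recover a discrete minimizer
    print(x, np.linalg.norm(x), a @ x)
    ```

    The thresholding step alluded to at the end of the block is what, in the paper's setting, turns the continuous best-approximation solution into an optimal discrete set.
    
    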

    Imposing Economic Constraints in Nonparametric Regression: Survey, Implementation and Extension

    Economic conditions such as convexity, homogeneity, homotheticity, and monotonicity are all important assumptions, or consequences of assumptions, about the economic functionals to be estimated. Recent research has seen a renewed interest in imposing constraints in nonparametric regression. We survey the available methods in the literature, discuss the challenges that present themselves when empirically implementing these methods, and extend an existing method to handle general nonlinear constraints. A heuristic discussion of the empirical implementation of methods that use sequential quadratic programming is provided for the reader, and simulated and empirical evidence on the distinction between constrained and unconstrained nonparametric regression surfaces is presented.
    Keywords: identification, concavity, Hessian, constraint weighted bootstrapping, earnings function