A Nonlinear Projection Neural Network for Solving Interval Quadratic Programming Problems and Its Stability Analysis
This paper presents a nonlinear projection neural network for solving interval quadratic programs subject to box-set constraints in engineering applications. Based on the saddle point theorem, the equilibrium point of the proposed neural network is proved to be equivalent to the optimal solution of the interval quadratic optimization problem. By employing a Lyapunov function approach, the global exponential stability of the proposed neural network is analyzed. Two illustrative examples are provided to show the feasibility and efficiency of the proposed method.
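The core idea can be illustrated with a generic projection neural network for a box-constrained quadratic program. The sketch below is a minimal illustration under assumed dynamics dx/dt = -x + P(x - a(Qx + c)), where P is the projection onto the box; the paper's interval-QP dynamics and stability conditions differ in detail.

```python
import numpy as np

def project_box(x, l, u):
    """Projection onto the box constraint set {x : l <= x <= u}."""
    return np.minimum(np.maximum(x, l), u)

def projection_nn_solve(Q, c, l, u, alpha=0.1, dt=0.05, steps=5000):
    """Integrate the assumed projection-network dynamics
       dx/dt = -x + P[x - alpha*(Qx + c)]
    with forward Euler; the equilibrium is a KKT point of the
    box-constrained QP  min 0.5 x^T Q x + c^T x,  l <= x <= u."""
    x = np.zeros_like(c, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + project_box(x - alpha * (Q @ x + c), l, u))
    return x

# Example: minimize x1^2 + x2^2 - 2*x1 - 2*x2 over [0, 0.5]^2;
# the unconstrained minimizer (1, 1) is projected to (0.5, 0.5).
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -2.0])
x_star = projection_nn_solve(Q, c, l=np.zeros(2), u=0.5 * np.ones(2))
```

At an equilibrium, x = P[x - alpha*(Qx + c)] holds, which is exactly the projection form of the KKT conditions for the box-constrained QP.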
A Max-Norm Constrained Minimization Approach to 1-Bit Matrix Completion
We consider in this paper the problem of noisy 1-bit matrix completion under
a general non-uniform sampling distribution using the max-norm as a convex
relaxation for the rank. A max-norm constrained maximum likelihood estimate is
introduced and studied. The rate of convergence for the estimate is obtained.
Information-theoretical methods are used to establish a minimax lower bound
under the general sampling model. The minimax upper and lower bounds together
yield the optimal rate of convergence for the Frobenius norm loss.
Computational algorithms and numerical performance are also discussed. (33 pages, 3 figures)
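The 1-bit observation model can be made concrete with a small sketch. Assuming a logistic link for illustration (the paper treats a general link function), each observed entry Y_ij equals +1 with probability sigma(M_ij) and -1 otherwise, and estimation maximizes the likelihood over a max-norm ball; here we only evaluate the objective, not solve the constrained problem.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def neg_log_likelihood(M, Y, mask):
    """Negative log-likelihood of 1-bit observations Y in {-1, +1}
    under the assumed logistic link: P(Y_ij = y) = sigmoid(y * M_ij).
    `mask` is a boolean array selecting the sampled entries."""
    p = sigmoid(Y * M)
    return -np.sum(np.log(p[mask]))

# Example: with M = 0, every entry is a fair coin flip, so the
# objective is (number of observations) * log(2).
M = np.zeros((2, 2))
Y = np.ones((2, 2))
mask = np.ones((2, 2), dtype=bool)
nll = neg_log_likelihood(M, Y, mask)
```

The max-norm constraint enters as a convex feasible set over which this (convex in M) objective is minimized; the constrained estimator itself requires a dedicated solver.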
Graph Oracle Models, Lower Bounds, and Gaps for Parallel Stochastic Optimization
We suggest a general oracle-based framework that captures different parallel
stochastic optimization settings described by a dependency graph, and derive
generic lower bounds in terms of this graph. We then use the framework and
derive lower bounds for several specific parallel optimization settings,
including delayed updates and parallel processing with intermittent
communication. We highlight gaps between lower and upper bounds on the oracle
complexity, and cases where the "natural" algorithms are not known to be
optimal.
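One of the settings covered by such dependency-graph frameworks, delayed updates, is easy to sketch. The code below is an illustrative delayed-gradient SGD loop (not the paper's algorithm): gradients are computed at stale iterates and applied a fixed number of steps later.

```python
import numpy as np
from collections import deque

def delayed_sgd(grad, x0, lr=0.05, tau=3, steps=200):
    """SGD where each gradient is applied `tau` iterations after it is
    computed, modeling the delayed-updates setting.  Assumed toy
    dynamics for illustration only."""
    x = np.array(x0, dtype=float)
    buf = deque()
    for _ in range(steps):
        buf.append(grad(x))             # gradient at the current (soon-stale) iterate
        if len(buf) > tau:
            x = x - lr * buf.popleft()  # apply a gradient computed tau steps ago
    return x

# Example: minimize (x - 1)^2; with a modest delay and step size the
# stale-gradient iteration still converges to the minimizer x = 1.
x_final = delayed_sgd(lambda x: 2.0 * (x - 1.0), x0=np.array([0.0]))
```

The delay effectively shrinks the stable step-size range (roughly by a factor of the delay), which is one source of the gaps between upper and lower oracle-complexity bounds in such settings.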
Backstepping PDE Design: A Convex Optimization Approach
Backstepping design for boundary control of linear PDEs is formulated as a convex optimization problem. Some classes of parabolic PDEs and a first-order hyperbolic PDE are studied, with particular attention to non-strict feedback structures. Based on the compactness of the Volterra and Fredholm-type operators involved, their kernels are approximated by polynomial functions. The resulting kernel-PDEs are optimized using Sum-of-Squares (SOS) decomposition and solved via semidefinite programming, with sufficient precision to guarantee the stability of the system in the L2-norm. This formulation allows optimizing extra degrees of freedom where the kernel-PDEs are included as constraints. Uniqueness and invertibility of the Fredholm-type transformation are proved for polynomial kernels in the space of continuous functions. The effectiveness and limitations of the proposed approach are illustrated by numerical solutions of some kernel-PDEs.
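The link between SOS decompositions and semidefinite programming rests on a Gram-matrix representation: a polynomial p is a sum of squares iff p(x) = m(x)^T G m(x) for some positive semidefinite matrix G, with m(x) a vector of monomials. The sketch below only verifies a given univariate certificate; actually solving for G (as the paper does for kernel-PDEs) requires an SDP solver.

```python
import numpy as np

def is_sos_certificate(G, coeffs):
    """Check that the Gram matrix G certifies that the univariate
    polynomial with coefficients `coeffs` (coeffs[k] = coeff of x^k)
    is a sum of squares, using the monomial vector m(x) = (1, x, ..., x^d).
    Requires (i) m(x)^T G m(x) matches the polynomial and (ii) G is PSD."""
    G = np.asarray(G, dtype=float)
    d = G.shape[0] - 1
    # Coefficient of x^k in m(x)^T G m(x) is the sum of G[i, j] over i + j = k.
    expanded = np.zeros(2 * d + 1)
    for i in range(d + 1):
        for j in range(d + 1):
            expanded[i + j] += G[i, j]
    matches = np.allclose(expanded, coeffs)
    psd = bool(np.all(np.linalg.eigvalsh(G) >= -1e-9))
    return matches and psd

# Example: x^4 + 2x^2 + 1 = (x^2 + 1)^2 is SOS, certified by a PSD Gram matrix.
G_good = [[1, 0, 1], [0, 0, 0], [1, 0, 1]]
ok = is_sos_certificate(G_good, [1, 0, 2, 0, 1])
# A Gram matrix that expands correctly but is not PSD fails the check:
G_bad = [[0, 0, 1], [0, 0, 0], [1, 0, 0]]
bad = is_sos_certificate(G_bad, [0, 0, 2, 0, 0])
```

In the SDP formulation one treats G as the decision variable subject to the linear coefficient-matching constraints and G being PSD, which is exactly the feasibility problem an SDP solver handles.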
International Conference on Continuous Optimization (ICCOPT) 2019 Conference Book
The Sixth International Conference on Continuous Optimization took place on the campus of the Technical University of Berlin, August 3-8, 2019. The ICCOPT is a flagship conference of the Mathematical Optimization Society (MOS), organized every three years. ICCOPT 2019 was hosted by the Weierstrass Institute for Applied Analysis and Stochastics (WIAS) Berlin. It included a Summer School and a Conference with a series of plenary and semi-plenary talks, organized and contributed sessions, and poster sessions.
This book comprises the full conference program. It contains, in particular, the scientific program both in survey form and in full detail, along with information on the social program, the venue, special meetings, and more.