SINRD Filter Optimization using Heuristic Algorithm
In this paper, new SINRD designs are explored using an original modelling method. A heuristic algorithm based on this method is developed to reduce the number of holes in an SINRD filter. Simulations are performed with a modal modelling method called WCIP. Optimal results compare successfully with measurements. The number of holes is reduced by 44% with comparable filter performance. Keywords: SINRD filter, heuristic algorithm
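A minimal sketch of one plausible hole-removal heuristic of this kind (the greedy loop, the `evaluate` callback standing in for a WCIP simulation, and the degradation tolerance are all our assumptions, not details from the paper):

```python
def greedy_hole_reduction(holes, evaluate, max_degradation):
    """Greedily drop holes while the simulated filter response stays acceptable.

    holes           -- list of hole descriptors (e.g. positions/diameters)
    evaluate        -- callable returning a scalar performance score for a layout
                       (hypothetically obtained from a WCIP simulation)
    max_degradation -- largest score loss tolerated w.r.t. the full layout
    """
    baseline = evaluate(holes)
    current = list(holes)
    improved = True
    while improved:
        improved = False
        for h in list(current):
            candidate = [x for x in current if x is not h]
            # Keep the removal only if performance stays within the tolerance.
            if baseline - evaluate(candidate) <= max_degradation:
                current = candidate
                improved = True
    return current
```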
On lower bounds using separable terms in interval B&B for one-dimensional problems
Interval Branch-and-Bound (B&B) algorithms are powerful methods that aim for guaranteed solutions of global optimization problems. Lower bounds for a function on a given interval can be obtained directly with interval arithmetic. Lower bounds based on Taylor forms show faster convergence to the minimum as the size of the search interval decreases. Our research focuses on one-dimensional functions that can be decomposed into several terms (sub-functions). The question is whether exploiting this characteristic leads to sharper bounds built from bounds of the sub-functions. This paper deals with functions that are separable into two sub-functions. The use of separability is investigated for the so-called Baumann form and the Lower Bound Value Form (LBVF). It is proven that exploiting additive separability in the LBVF may lead to a combination of linear minorants that is sharper than the original one. Numerical experiments confirm this improvement and also show that not all separable forms always provide sharper lower bounds for additively separable functions. Additional research is needed to obtain better lower bounds for multiplicatively separable functions and to address higher-dimensional problems.
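As a hedged illustration of why separability helps, the sketch below bounds f = f1 + f2 on an interval by summing lower bounds of the sub-functions; the example functions and the naive bounds are our own choices, not the Baumann or LBVF forms studied in the paper:

```python
import math

def lb_square(a, b):
    # Exact minimum of x**2 on [a, b].
    return 0.0 if a <= 0.0 <= b else min(a * a, b * b)

def lb_neg_exp(a, b):
    # Exact minimum of -exp(x) on [a, b] (exp is increasing, so -exp is decreasing).
    return -math.exp(b)

def separable_lower_bound(a, b):
    # f(x) = x**2 - exp(x); min f >= min(x**2) + min(-exp(x)) on [a, b].
    return lb_square(a, b) + lb_neg_exp(a, b)

print(separable_lower_bound(-1.0, 1.0))  # a guaranteed lower bound, not the exact minimum
```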
Robustness Verification of Support Vector Machines
We study the problem of formally verifying the robustness to adversarial examples of support vector machines (SVMs), a major machine learning model for classification and regression tasks. Following a recent stream of works on formal robustness verification of (deep) neural networks, our approach relies on a sound abstract version of a given SVM classifier to be used for checking its robustness. This methodology is parametric in a given numerical abstraction of real values and, analogously to the case of neural networks, needs neither abstract least upper bounds nor widening operators on this abstraction. The standard interval domain provides a simple instantiation of our abstraction technique, which is enhanced with the domain of reduced affine forms, an efficient abstraction of the zonotope abstract domain. This robustness verification technique has been fully implemented and experimentally evaluated on SVMs based on linear and nonlinear (polynomial and radial basis function) kernels, trained on the popular MNIST dataset of images and on the recent and more challenging Fashion-MNIST dataset. The experimental results of our prototype SVM robustness verifier are encouraging: the automated verification is fast, scalable, and shows high percentages of provable robustness on the MNIST test set, in particular compared to the analogous provable robustness of neural networks.
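A minimal sketch of the interval-domain idea for a linear-kernel SVM under an L-infinity perturbation (the function names and perturbation model are our assumptions; the paper's verifier also covers nonlinear kernels and reduced affine forms):

```python
import numpy as np

def score_bounds(w, b, x, eps):
    # Interval bounds of the decision value w . x' + b over the
    # L-infinity ball {x' : |x' - x|_inf <= eps}.
    center = float(np.dot(w, x)) + b
    radius = eps * float(np.sum(np.abs(w)))
    return center - radius, center + radius

def provably_robust(w, b, x, eps):
    # Robust if the sign of the decision value cannot flip inside the ball.
    lo, hi = score_bounds(w, b, x, eps)
    return lo > 0.0 or hi < 0.0
```

For a linear decision function over a box, this interval bound is exact; the interest of tighter domains such as reduced affine forms lies in the nonlinear-kernel case.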
On interval branch-and-bound for additively separable functions with common variables
Interval branch-and-bound (B&B) algorithms are powerful methods that look for guaranteed solutions of global optimisation problems. The computational effort needed to reach this aim increases exponentially with the problem dimension in the worst case. For separable functions this effort is smaller, as lower-dimensional sub-problems can be solved individually. The question is how to design specific methods for cases where the objective function can be considered separable but common variables occur in the sub-problems. This paper is devoted to establishing the basis of B&B algorithms for such separable problems. New B&B rules are presented, based on derived properties for computing bounds. A numerical illustration is elaborated on a test-bed of problems, mostly generated by combining traditional box-constrained global optimisation problems, to show the potential of the derived theoretical basis.
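For context, a bare-bones one-dimensional interval B&B loop might look like the sketch below; the bisection rule, the midpoint incumbent, and the tolerance are generic choices on our part, not the specific rules derived in the paper:

```python
def interval_bnb(f, lower_bound, a, b, tol=1e-6):
    """Minimise f on [a, b], given a rigorous lower_bound(a, b) for f on [a, b]."""
    best = f((a + b) / 2.0)          # incumbent upper bound from the midpoint
    work = [(a, b)]
    while work:
        lo, hi = work.pop()
        if lower_bound(lo, hi) > best:
            continue                 # box cannot contain a better minimiser
        mid = (lo + hi) / 2.0
        best = min(best, f(mid))     # try to improve the incumbent
        if hi - lo > tol:            # bisect boxes that are still wide
            work.append((lo, mid))
            work.append((mid, hi))
    return best                      # best objective value found
```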
Upper Bounding in Inner Regions for Global Optimization under Inequality Constraints
In deterministic continuous constrained global optimization, upper bounding the objective function generally resorts to local minimization at several nodes/iterations of the branch and bound. We propose in this paper an alternative approach when the constraints are inequalities and the feasible space has a non-null volume. First, we extract an inner region, i.e., an entirely feasible convex polyhedron or box in which all points satisfy the constraints. Second, we select a point inside the extracted inner region and update the upper bound with its cost. We describe two original inner region extraction algorithms implemented in our interval B&B called IbexOpt. They apply to nonconvex constraints involving mathematical operators like +, ×, power, sqrt, exp, log, sin. This upper bounding shows very good performance on medium-sized systems proposed in the COCONUT suite.
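A hedged sketch of the second step only, once an entirely feasible box has already been extracted (the box representation and the midpoint choice are illustrative; the paper's contribution lies in the extraction algorithms themselves):

```python
def update_upper_bound(objective, inner_box, best_cost):
    # inner_box is a list of (lo, hi) pairs known to be entirely feasible,
    # so any point inside it is feasible; here we simply try its midpoint.
    midpoint = [(lo + hi) / 2.0 for lo, hi in inner_box]
    return min(best_cost, objective(midpoint))
```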
A Contractor Based on Convex Interval Taylor
Interval Taylor was proposed in the sixties by the interval analysis community for relaxing continuous non-convex constraint systems. However, it generally produces a non-convex relaxation of the solution set. A simple way to build a convex polyhedral relaxation is to select a corner of the studied domain/box as the expansion point of the interval Taylor form, instead of the usual midpoint. The idea was proposed by Neumaier to produce a sharp range of a single function and by Lin and Stadtherr to handle n × n (square) systems of equations. This paper presents an interval Newton-like operator, called X-Newton, that iteratively calls this interval convexification based on an endpoint interval Taylor. This general-purpose contractor uses no preconditioning and can handle any system of equality and inequality constraints. It uses Hansen's variant to compute the interval Taylor form and uses two opposite corners of the domain for every constraint. The X-Newton operator can be implemented rapidly and produces good speedups in constrained global optimization and constraint satisfaction. First experiments compare X-Newton with affine arithmetic.
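To illustrate the endpoint (corner) idea in one dimension: expanding at the left endpoint a of [a, b], where x - a >= 0, the interval Taylor form yields the linear minorant f(x) >= f(a) + d_lo * (x - a), with d_lo a lower bound on f' over [a, b]. A minimal sketch, where the example function and the crude derivative bound are our own:

```python
import math

def corner_taylor_minorant(f, df_lower, a, b):
    """Return coefficients (c0, c1) of a linear lower bound c0 + c1 * (x - a) <= f(x)
    valid on [a, b]: since x - a >= 0 there, a lower bound on f' can serve as slope."""
    return f(a), df_lower(a, b)

# Example: f(x) = sin(x), with f'(x) = cos(x) bounded below by -1 on any interval.
c0, c1 = corner_taylor_minorant(math.sin, lambda a, b: -1.0, 0.0, 1.0)
# c0 + c1 * (x - 0.0) is a valid (if loose) linear minorant of sin on [0, 1].
```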
Compact relaxations for polynomial programming problems
Reduced RLT (rRLT) constraints are a special class of Reformulation-Linearization Technique (RLT) constraints. They apply to nonconvex (both continuous and mixed-integer) quadratic programming problems subject to systems of linear equality constraints. We present an extension to the general case of polynomial programming problems and discuss the derived convex relaxation. We then show how to perform rRLT constraint generation so as to reduce the number of inequality constraints in the relaxation, thereby making it more compact and faster to solve. We present computational results validating our approach.
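As a brief reminder of how RLT constraints arise in the quadratic case (our own illustration, not the paper's polynomial extension): multiplying a linear equality by a variable and linearising the products gives

\[
\sum_i a_i x_i = b
\;\Longrightarrow\;
\sum_i a_i x_i x_j = b\,x_j
\;\Longrightarrow\;
\sum_i a_i w_{ij} = b\,x_j, \qquad w_{ij} := x_i x_j \text{ (linearised)},
\]

and the reduced variant exploits such equality-derived constraints to keep the relaxation more compact.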
A metaheuristic methodology based on the limitation of the memory of interval branch and bound algorithms
Keywords: interval analysis, branch and bound, limited memory, metaheuristic, algorithmic complexity