Robust measurement-based buffer overflow probability estimators for QoS provisioning and traffic anomaly prediction applications
Suitable estimators for a class of Large Deviation approximations of rare
event probabilities based on sample realizations of random processes have been
proposed in our earlier work. These estimators are expressed as non-linear
multi-dimensional optimization problems of a special structure. In this paper,
we develop an algorithm to solve these optimization problems very efficiently
based on their characteristic structure. After discussing the nature of the
objective function and constraint set and their peculiarities, we provide a
formal proof that the developed algorithm is guaranteed to always converge. The
existence of efficient and provably convergent algorithms for solving these
problems is a prerequisite for using the proposed estimators in real-time problems such as call admission control, adaptive modulation and coding with QoS constraints, and traffic anomaly detection in high-data-rate communication networks.
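The abstract's rare-event setting can be illustrated with the standard effective-bandwidth approximation P(Q > b) ≈ exp(-θ*·b) for a queue served at constant rate. The sketch below is not the paper's estimator or its optimization algorithm; it assumes i.i.d. per-slot arrivals and uses a simple grid search, with all names and constants illustrative.

```python
import numpy as np

def overflow_decay_rate(samples, capacity, thetas=np.linspace(0.01, 5.0, 500)):
    """Estimate the large-deviation decay rate theta* in P(Q > b) ~ exp(-theta* * b)
    for a stable queue served at constant rate `capacity`, from i.i.d. per-slot
    arrival samples: theta* is the largest theta with alpha(theta) < capacity,
    where alpha is the empirical effective bandwidth."""
    samples = np.asarray(samples, dtype=float)
    # Empirical effective bandwidth: alpha(theta) = log E[exp(theta * A)] / theta
    alpha = np.array([np.log(np.mean(np.exp(t * samples))) / t for t in thetas])
    feasible = thetas[alpha < capacity]
    return feasible.max() if feasible.size else 0.0

rng = np.random.default_rng(0)
arrivals = rng.exponential(scale=1.0, size=20000)   # mean arrival rate 1.0
theta_star = overflow_decay_rate(arrivals, capacity=2.0)
overflow_est = np.exp(-theta_star * 10.0)           # rough estimate of P(Q > 10)
```

For exponential(1) arrivals and capacity 2, the true decay rate solves -ln(1-θ)/θ = 2, i.e. θ* ≈ 0.8; the empirical estimate fluctuates around that value because the empirical moment generating function is heavy-tailed for larger θ.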
Neural-BO: A Black-box Optimization Algorithm using Deep Neural Networks
Bayesian Optimization (BO) is an effective approach for the global optimization of black-box functions when function evaluations are expensive. Most prior works use Gaussian processes to model the black-box function; however, the use of kernels in Gaussian processes leads to two problems: first, kernel-based methods scale poorly with the number of data points, and second, kernel methods are usually not effective on complex, structured, high-dimensional data due to the curse of dimensionality. We therefore propose a novel black-box optimization algorithm in which the black-box function is modeled by a neural network. Our algorithm does not need a Bayesian neural network to estimate predictive uncertainty and is therefore computationally favorable. We analyze the theoretical behavior of our algorithm in terms of a regret bound using advances in Neural Tangent Kernel (NTK) theory, showing its efficient convergence. We perform experiments on both synthetic and real-world optimization tasks and show that our algorithm is more sample-efficient than existing methods.
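The general pattern of surrogate-based BO without a Gaussian process can be sketched as follows. This is not Neural-BO itself: as a crude stand-in for a trained neural network it uses Bayesian linear regression on fixed random ReLU features, with Thompson sampling as the acquisition rule; all settings are illustrative.

```python
import numpy as np

def neural_surrogate_bo(f, lo, hi, n_init=5, n_iter=20, n_feat=100, seed=0):
    """Toy sketch of BO with a non-GP surrogate: Bayesian linear regression on
    fixed random ReLU features (a crude stand-in for a neural network), with
    Thompson sampling as the acquisition rule. All settings are illustrative."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_feat, 1))                     # random feature weights
    b = rng.uniform(lo, hi, size=n_feat)                 # random feature offsets
    phi = lambda x: np.maximum(0.0, W @ np.atleast_2d(x) - b[:, None]).T
    X = rng.uniform(lo, hi, size=n_init)                 # initial design
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        P = phi(X)                                       # (n, n_feat) design matrix
        A = P.T @ P + 1e-2 * np.eye(n_feat)              # ridge posterior precision
        mean = np.linalg.solve(A, P.T @ y)
        w = rng.multivariate_normal(mean, 0.1 * np.linalg.inv(A))  # Thompson draw
        cand = np.linspace(lo, hi, 200)
        x_next = cand[np.argmin(phi(cand) @ w)]          # minimize sampled surrogate
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    i = np.argmin(y)
    return X[i], y[i]

best_x, best_y = neural_surrogate_bo(lambda x: (x - 2.0) ** 2, lo=0.0, hi=5.0)
```

Thompson sampling sidesteps an explicit uncertainty estimate from a Bayesian neural network, which echoes the abstract's motivation, but the posterior here is only exact for the linear-in-features model.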
Energy-Efficient Power Allocation in OFDM Systems with Wireless Information and Power Transfer
This paper considers an orthogonal frequency division multiplexing (OFDM)
downlink point-to-point system with simultaneous wireless information and power
transfer. It is assumed that the receiver is able to harvest energy from noise,
interference, and the desired signals.
We study the design of power allocation algorithms maximizing the energy
efficiency of data transmission (bit/Joule delivered to the receiver). In
particular, the algorithm design is formulated as a high-dimensional non-convex
optimization problem which takes into account the circuit power consumption,
the minimum required data rate, and a constraint on the minimum power delivered
to the receiver. Subsequently, by exploiting the properties of nonlinear
fractional programming, the considered non-convex optimization problem, whose
objective function is in fractional form, is transformed into an equivalent
optimization problem having an objective function in subtractive form, which
enables the derivation of an efficient iterative power allocation algorithm. In
each iteration, the optimal power allocation solution is derived based on dual
decomposition and a one-dimensional search. Simulation results illustrate that
the proposed iterative power allocation algorithm converges to the optimal
solution, and unveil the trade-off between energy efficiency, system capacity,
and wireless power transfer: (1) In the low transmit power regime, maximizing the system capacity may maximize the energy efficiency. (2) Wireless power transfer can enhance the energy efficiency, especially in the interference-limited regime.
Comment: 6 pages, Accepted for presentation at the IEEE International Conference on Communications (ICC) 201
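The fractional-to-subtractive transformation described in this abstract is the core of Dinkelbach's method for nonlinear fractional programming. A minimal numeric sketch on a toy single-link energy-efficiency problem (the link gain `g`, circuit power `pc`, power cap `p_max`, and grid-search inner solver are illustrative, not the paper's multi-carrier formulation):

```python
import numpy as np

def dinkelbach(num, den, solve_inner, tol=1e-8, max_iter=100):
    """Dinkelbach's method for max_x num(x)/den(x): repeatedly maximize the
    subtractive form num(x) - q*den(x) and update q until the optimum is ~0.
    `solve_inner(q)` must return an argmax of num(x) - q*den(x)."""
    q = 0.0
    x = solve_inner(q)
    for _ in range(max_iter):
        x = solve_inner(q)
        if abs(num(x) - q * den(x)) < tol:   # subtractive optimum ~ 0 => q is optimal
            break
        q = num(x) / den(x)                  # q converges to the optimal ratio
    return x, q

# Toy energy-efficiency example: maximize rate/power for one link, where
# rate(p) = log2(1 + g*p) [bit/s/Hz] and power(p) = p + pc (circuit power).
g, pc, p_max = 4.0, 0.5, 10.0
rate = lambda p: np.log2(1.0 + g * p)
power = lambda p: p + pc

def solve_inner(q):
    # The inner problem is concave in p; a fine grid search suffices here.
    grid = np.linspace(0.0, p_max, 100001)
    return grid[np.argmax(rate(grid) - q * power(grid))]

p_star, ee_star = dinkelbach(rate, power, solve_inner)
```

In the paper the inner subtractive problem is itself solved per iteration via dual decomposition and a one-dimensional search; the grid search above merely plays that role for a single scalar variable.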
SQG-Differential Evolution for difficult optimization problems under a tight function evaluation budget
In the context of industrial engineering, it is important to integrate efficient computational optimization methods into the product development process. Some of the most challenging simulation-based engineering design optimization problems are characterized by a large number of design variables, the absence of analytical gradients, highly non-linear objectives, and a limited function evaluation budget. Although a huge variety of optimization algorithms is available, the development and selection of efficient algorithms for problems with these industrially relevant characteristics remains a challenge. In this communication, a hybrid variant of Differential Evolution (DE) is introduced which combines aspects of Stochastic Quasi-Gradient (SQG) methods within the framework of DE, in order to improve optimization efficiency on problems with the aforementioned characteristics. The performance of the resulting derivative-free algorithm is compared with other state-of-the-art DE variants on 25 commonly used benchmark functions, under a tight function evaluation budget of 1000 evaluations. The experimental results indicate that the new algorithm performs excellently on the 'difficult' (high-dimensional, multi-modal, inseparable) test functions. The operations used in the proposed mutation scheme are computationally inexpensive and can be easily implemented in existing Differential Evolution variants or other population-based optimization algorithms with a few lines of program code, as a non-invasive optional setting. Besides the applicability of the presented algorithm by itself, the described concepts can serve as a useful and interesting addition to the algorithmic operators in the frameworks of heuristics and evolutionary optimization and computing.
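The abstract does not specify the exact SQG operator, so the following is only a loose sketch of the general idea: a DE/rand/1 mutation augmented with a pseudo-gradient estimated from fitness values already present in the population (hence no extra function evaluations). The estimator and the weight `gamma` are illustrative assumptions.

```python
import numpy as np

def sqg_de_mutation(pop, fit, F=0.8, gamma=0.5, rng=None):
    """Illustrative DE/rand/1 mutation plus a stochastic quasi-gradient pull,
    estimated from existing population fitness values. Not the paper's exact
    operator; the estimator and `gamma` are a sketch of the idea."""
    if rng is None:
        rng = np.random.default_rng()
    n, d = pop.shape
    mutants = np.empty_like(pop)
    for i in range(n):
        others = np.delete(np.arange(n), i)
        a, b, c = pop[rng.choice(others, size=3, replace=False)]
        diffs = pop - pop[i]                        # directions to other members
        slopes = (fit - fit[i]) / (np.sum(diffs ** 2, axis=1) + 1e-12)
        g = slopes @ diffs / n                      # points uphill on average
        mutants[i] = a + F * (b - c) - gamma * g    # classic DE step plus downhill pull
    return mutants

# Usage: minimize the 10-D sphere function with greedy selection (crossover omitted).
rng = np.random.default_rng(1)
pop = rng.uniform(-5.0, 5.0, size=(30, 10))
sphere = lambda X: np.sum(X ** 2, axis=1)
fit = sphere(pop)
for _ in range(200):
    mut = sqg_de_mutation(pop, fit, rng=rng)
    mfit = sphere(mut)
    better = mfit < fit
    pop[better], fit[better] = mut[better], mfit[better]
```

The extra term reuses information the population already paid for, which matches the abstract's claim that the scheme can be bolted onto existing DE variants with a few lines of code.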
B2Opt: Learning to Optimize Black-box Optimization with Little Budget
The core challenge of high-dimensional and expensive black-box optimization (BBO) is how to obtain better performance faster at a small function evaluation cost. The essence of the problem is how to design an efficient optimization strategy tailored to the target task. This paper designs a powerful optimization framework to automatically learn optimization strategies from the target task or a cheap surrogate task without human intervention. Current methods are weak at this because of their poor representation of optimization strategies. To address this, 1) drawing on the mechanism of genetic algorithms, we propose a deep neural network framework called B2Opt, which provides a stronger representation of optimization strategies based on survival of the fittest; 2) B2Opt can utilize cheap surrogate functions of the target task to guide the design of efficient optimization strategies. Compared to state-of-the-art BBO baselines, B2Opt achieves multiple orders of magnitude performance improvement at a lower function evaluation cost. We validate our proposal on high-dimensional synthetic functions and two real-world applications. We also find that deep B2Opt variants perform better than shallow ones.
An Efficient Optimization Algorithm for Super High Dimensional Numerical Function Inspired by Cellular Differentiation
Inspired by cellular differentiation behaviors, a new biomimetic optimization algorithm, the cellular differentiation optimization algorithm (CDOA), is proposed. First, a certain number of cells are randomly distributed in the search space, where each cell represents a solution. Then, several cellular differentiation behaviors, such as division, growth, migration, adhesion, and apoptosis, are applied to find the optimal solution according to the activity value of each cell. The proposed algorithm is applied to the optimization of several complex benchmark functions with 20-1000 dimensions. Experimental results show that the proposed cellular differentiation optimization algorithm converges rapidly to the optimum of complex numerical functions of very high dimension, despite its simple procedure and effortless implementation.
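A population algorithm built from the named operators might look like the sketch below. The mapping of biological behaviors to code (fit cells divide and grow locally, unfit cells undergo apoptosis and migrate next to a fit cell) is an illustrative guess, not the paper's exact procedure, and all parameters are made up.

```python
import numpy as np

def cdoa_sketch(f, dim, lo, hi, n_cells=40, n_iter=300, seed=0):
    """Loose sketch of the cellular-differentiation idea: cells with low
    activity (good objective value) divide and grow locally; cells with high
    activity undergo apoptosis and migrate next to a fit cell (adhesion).
    Operator mapping and parameters are illustrative."""
    rng = np.random.default_rng(seed)
    cells = rng.uniform(lo, hi, size=(n_cells, dim))
    act = f(cells)                                    # activity = objective (minimized)
    for t in range(n_iter):
        order = np.argsort(act)                       # fittest (lowest activity) first
        step = (hi - lo) * 0.1 * (1.0 - t / n_iter)   # shrinking growth step
        for i in order[: n_cells // 2]:               # division/growth of fit cells
            child = np.clip(cells[i] + rng.normal(0.0, step, dim), lo, hi)
            cf = f(child[None])[0]
            if cf < act[i]:                           # keep only improving offspring
                cells[i], act[i] = child, cf
        for i in order[n_cells // 2:]:                # apoptosis + migration/adhesion
            j = order[rng.integers(n_cells // 2)]     # re-spawn near a random fit cell
            cells[i] = np.clip(cells[j] + rng.normal(0.0, step, dim), lo, hi)
            act[i] = f(cells[i][None])[0]
    i = np.argmin(act)
    return cells[i], act[i]

sphere = lambda X: np.sum(X ** 2, axis=1)
best, val = cdoa_sketch(sphere, dim=20, lo=-5.0, hi=5.0)
```

Because fit cells only accept improving offspring and the bottom half never overwrites the current best cell, the best activity value is monotonically non-increasing over iterations.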