18 research outputs found

    IMRT Beam Angle Optimization Using Non-descent Pattern Search Methods

    Get PDF

    Sensor-less maximum power extraction control of a hydrostatic tidal turbine based on adaptive extreme learning machine

    Get PDF
    In this paper, a hydrostatic tidal turbine (HTT) is designed and modelled, which uses a more reliable hydrostatic transmission to replace the existing fixed-ratio gearbox transmission. The HTT dynamic model is derived by integrating the governing equations of all components of the hydraulic machine. A nonlinear observer based on an extreme learning machine (ELM) is proposed to predict the turbine torque and tidal speed in real time. A sensor-less double integral sliding mode controller is then designed for the HTT to achieve maximum power extraction in the presence of large parametric uncertainties and nonlinearities. Simscape design experiments are conducted to verify the proposed design, model and control system, and show that the proposed control system efficiently achieves maximum power extraction and performs much better than conventional control. Unlike existing works on ELM, the weights and biases in the ELM are updated online continuously. Furthermore, the overall stability of the controlled HTT system, including the ELM, is proved, and a selection criterion for the ELM learning rates is derived. The proposed sensor-less control system has prominent advantages in robustness and accuracy, and is easy to implement in practice.
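    The abstract's key departure from standard ELM practice is that the network is trained online, sample by sample, rather than in one batch. A minimal sketch of that idea, assuming a scalar input and a simple least-mean-squares update of the output weights only (the paper goes further and also adapts the hidden-layer weights and biases, with a stability-derived learning rate; the learning rate and target function below are purely illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Random, fixed hidden layer: the defining feature of an ELM.
    n_hidden = 20
    W = rng.normal(size=n_hidden)   # input weights (fixed at random)
    b = rng.normal(size=n_hidden)   # hidden biases (fixed at random)
    beta = np.zeros(n_hidden)       # output weights (updated online)

    def hidden(x):
        return np.tanh(W * x + b)

    def predict(x):
        return hidden(x) @ beta

    # Online update: a small gradient (LMS) step on the output weights
    # each time a new sample arrives. eta is an illustrative choice; the
    # paper derives a selection criterion for the learning rates.
    eta = 0.05
    def update(x, y):
        global beta
        h = hidden(x)
        beta = beta + eta * (y - h @ beta) * h

    # Stream samples from an unknown target and track the running error.
    target = lambda x: np.sin(x)
    errors = []
    for t in range(5000):
        x = rng.uniform(-np.pi, np.pi)
        errors.append(abs(target(x) - predict(x)))
        update(x, target(x))

    print(np.mean(errors[:100]), np.mean(errors[-100:]))
    ```

    The error over the last samples should be well below the initial error, illustrating how the surrogate tracks the target continuously instead of requiring a retraining phase.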

    Using generalized simplex methods to approximate derivatives

    Full text link
    This paper presents two methods for approximating a proper subset of the entries of a Hessian using only function evaluations. These approximations are obtained with the techniques called the generalized simplex Hessian and the generalized centered simplex Hessian. We show how to choose the matrices of directions involved in computing these two techniques depending on the Hessian entries of interest. We discuss the number of function evaluations required in each case and develop a general formula for approximating all order-P partial derivatives. Since the methods discussed in this paper require only function evaluations, they are suitable for use in derivative-free optimization methods.
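    As a point of reference for what "approximating a Hessian entry from function evaluations alone" means, here is a minimal sketch using coordinate directions, where the centered construction reduces to the classical second-order centered difference (the paper's generalized versions allow arbitrary direction matrices; the test function and step size below are illustrative):

    ```python
    import numpy as np

    def hessian_entry(f, x, i, j, h=1e-4):
        """Approximate the (i, j) entry of the Hessian of f at x using
        only four function evaluations (centered second-order difference:
        the special case of a centered simplex Hessian with scaled
        coordinate directions)."""
        ei = np.zeros_like(x); ei[i] = h
        ej = np.zeros_like(x); ej[j] = h
        return (f(x + ei + ej) - f(x + ei - ej)
                - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)

    # Illustrative test: f(x) = x0^2 * x1 + sin(x1), whose mixed partial
    # d^2 f / dx0 dx1 = 2 * x0 equals 2 at the point below.
    f = lambda x: x[0]**2 * x[1] + np.sin(x[1])
    x0 = np.array([1.0, 2.0])
    print(hessian_entry(f, x0, 0, 1))
    ```

    Note that only one entry is computed, with four evaluations; this is the flavor of selective approximation the paper generalizes to arbitrary subsets of entries and higher-order derivatives.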

    Trust-Region Methods Without Using Derivatives: Worst Case Complexity and the NonSmooth Case

    Get PDF
    Trust-region methods are a broad class of methods for continuous optimization that have found application in a variety of problems and contexts. In particular, they have been studied and applied to problems where derivatives are unavailable. The analysis of derivative-free trust-region methods has focused on global convergence: they have been proven to generate a sequence of iterates converging to stationarity independently of the starting point. Most of this analysis is carried out in the smooth case, and, moreover, little is known about the complexity or global rate of these methods. In this paper, we start by analyzing the worst-case complexity of general derivative-free trust-region methods for smooth functions. For the nonsmooth case, we propose a smoothing approach, for which we prove global convergence and bound the worst-case complexity effort. For the special case of nonsmooth functions that result from the composition of smooth and nonsmooth/convex components, we show how to improve the existing results in the literature and make them applicable to the general methodology.
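    To fix ideas, a bare-bones derivative-free trust-region loop looks roughly as follows: build a model from function evaluations, step within the current radius, and grow or shrink the radius according to how well the model predicted the actual decrease. This is a generic sketch, not the paper's method: the model gradient comes from forward differences on the trust-region scale, the step is a simple boundary step along the model gradient, and all constants are illustrative.

    ```python
    import numpy as np

    def tr_df_minimize(f, x, delta=1.0, tol=1e-6, max_iter=200):
        """Minimal derivative-free trust-region sketch (illustrative)."""
        n = len(x)
        for _ in range(max_iter):
            # Model gradient from forward differences, scaled to the radius.
            h = max(delta * 1e-2, 1e-8)
            g = np.array([(f(x + h * e) - f(x)) / h for e in np.eye(n)])
            if np.linalg.norm(g) < tol:
                break
            step = -delta * g / np.linalg.norm(g)  # step to the TR boundary
            pred = -g @ step                       # predicted decrease (linear model)
            rho = (f(x) - f(x + step)) / pred      # actual vs. predicted reduction
            if rho > 0.1:
                x = x + step                       # model was useful: accept
                if rho > 0.75:
                    delta *= 2.0                   # very good agreement: expand
            else:
                delta *= 0.5                       # poor agreement: shrink
        return x

    # Illustrative run on a simple quadratic with minimizer (1, -2).
    x_min = tr_df_minimize(lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2,
                           np.array([0.0, 0.0]))
    print(x_min)
    ```

    The worst-case complexity analysis in the paper bounds, for such schemes, how many iterations (and hence function evaluations) are needed to drive a stationarity measure below a given threshold.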

    Quantifying uncertainty with ensembles of surrogates for blackbox optimization

    Full text link
    This work is set in the context of blackbox optimization, where the functions defining the problem are expensive to evaluate and no derivatives are available. A tried and tested technique is to build surrogates of the objective and the constraints in order to conduct the optimization at a cheaper computational cost. This work proposes different uncertainty measures for use with ensembles of surrogates. The resulting combination of an ensemble of surrogates with our measures behaves as a stochastic model and allows the use of efficient Bayesian optimization tools. The method is incorporated into the search step of the mesh adaptive direct search (MADS) algorithm to improve the exploration of the search space. Computational experiments are conducted on seven analytical problems, two multidisciplinary optimization problems and two simulation problems. The results show that the proposed approach solves expensive simulation-based problems with greater precision and lower computational effort than stochastic models.
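    The core idea of ensemble-based uncertainty can be sketched very simply: fit several different surrogates to the same cached evaluations, use their mean as the prediction, and use their disagreement as the uncertainty. The sketch below assumes a one-dimensional blackbox and an ensemble of polynomial fits of different degrees; the paper builds richer ensembles and more refined measures, so this is only a minimal illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Stand-in for an expensive blackbox (illustrative).
    f = lambda x: np.sin(3 * x) + 0.5 * x

    # Cache of evaluated points: the only data a surrogate may use.
    X = rng.uniform(-2, 2, size=12)
    Y = f(X)

    # Ensemble of cheap surrogates: polynomial fits of different degrees.
    ensemble = [np.polynomial.Polynomial.fit(X, Y, deg) for deg in (2, 3, 4, 5)]

    def predict_with_uncertainty(x):
        preds = np.array([p(x) for p in ensemble])
        # Mean = ensemble prediction; spread = member disagreement,
        # used as an uncertainty measure.
        return preds.mean(axis=0), preds.std(axis=0)

    # Inside the sampled region the members agree; far outside they
    # extrapolate differently, so the spread grows.
    xs = np.array([0.0, 5.0])
    mean, spread = predict_with_uncertainty(xs)
    print(spread)
    ```

    A spread that grows away from the data is exactly what lets the ensemble behave like a stochastic model: an acquisition criterion in the MADS search step can then trade off predicted value against this disagreement.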