
    Towards a Better Understanding of the Local Attractor in Particle Swarm Optimization: Speed and Solution Quality

    Particle Swarm Optimization (PSO) is a popular nature-inspired meta-heuristic for solving continuous optimization problems. Although the technique is widely used, the understanding of the mechanisms that make swarms so successful is still limited. We present the first substantial experimental investigation of the influence of the local attractor on the quality of exploration and exploitation. We compare classical PSO in detail with the social-only variant, in which local attractors are ignored. To measure exploration capability, we determine how frequently both variants return results in the neighborhood of the global optimum. To measure the quality of exploitation, we consider only function values from runs that reached a search point sufficiently close to the global optimum, and compare to how many digits such values still deviate from the global minimum value. It turns out that the local attractor significantly improves exploration but sometimes reduces the quality of exploitation. As a compromise, we propose and evaluate a hybrid PSO that switches off its local attractors at a certain point in time. The effects mentioned can also be observed by measuring the potential of the swarm.
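    For reference, a minimal sketch of the update rule being compared: with the cognitive term enabled this is the classical PSO step, and with it disabled it is the social-only variant that ignores local attractors. The parameter defaults are standard choices from the PSO literature, and all names are illustrative.

```python
import numpy as np

def pso_step(x, v, p_local, p_global, use_local_attractor=True,
             w=0.72984, c1=1.496172, c2=1.496172, rng=None):
    """One velocity/position update for a single particle.

    x, v     -- current position and velocity (1-D arrays)
    p_local  -- the particle's own best position (the local attractor)
    p_global -- the best position found by the whole swarm

    With use_local_attractor=False the cognitive term is dropped and
    this reduces to the social-only variant.
    """
    if rng is None:
        rng = np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    cognitive = c1 * r1 * (p_local - x) if use_local_attractor else 0.0
    social = c2 * r2 * (p_global - x)
    v_new = w * v + cognitive + social
    return x + v_new, v_new
```

    The hybrid variant proposed in the abstract would then amount to calling this step with use_local_attractor=True for an initial phase of the run and False thereafter.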

    Novel sampling techniques for reservoir history matching optimisation and uncertainty quantification in flow prediction

    Modern reservoir management has an increasing focus on accurately predicting the likely range of field recoveries. A variety of assisted history matching techniques has been developed across the research community concerned with this topic. These techniques are based on obtaining multiple models that closely reproduce the historical flow behaviour of a reservoir. The resulting set of history-matched models is then used to quantify uncertainty in predicting the future performance of the reservoir and to provide economic evaluations for different field development strategies. The key step in this workflow is to employ algorithms that sample the parameter space in an efficient and appropriate manner. The choice of algorithm affects how fast a model is obtained and how well the model fits the production data. The sampling techniques developed to date include, among others, gradient-based methods, evolutionary algorithms, and the ensemble Kalman filter (EnKF). This thesis has investigated and further developed the following sampling and inference techniques: Particle Swarm Optimisation (PSO), Hamiltonian Monte Carlo, and Population Markov Chain Monte Carlo. The investigated techniques are capable of navigating the parameter space and producing history-matched models that can be used to quantify the uncertainty in the forecasts in a faster and more reliable way. The analysis of these techniques, compared with the Neighbourhood Algorithm (NA), has shown how the different techniques affect the predicted recovery from petroleum systems and the benefits of the developed methods over the NA. The history matching problem is multi-objective in nature, with the production data possibly consisting of multiple types, coming from different wells, and collected at different times. Multiple objectives can be constructed from these data and explicitly optimised in a multi-objective scheme. The thesis has extended PSO to handle multi-objective history matching problems in which a number of possibly conflicting objectives must be satisfied simultaneously. The benefits and efficiency of the proposed multi-objective particle swarm scheme (MOPSO) are demonstrated on synthetic reservoirs. It is demonstrated that the MOPSO procedure can provide a substantial improvement in finding a diverse set of well-fitting models with fewer of the very costly forward simulation runs than the standard single-objective case, depending on how the objectives are constructed. The thesis has also shown how to tackle a large number of unknown parameters by coupling high-performance global optimisation algorithms, such as PSO, with model reduction techniques such as kernel principal component analysis (PCA) for parameterising spatially correlated random fields. The results of the PSO-PCA coupling applied to a recent SPE benchmark history matching problem demonstrate that the approach is applicable to practical problems. A comparison of PSO with the EnKF data assimilation method concluded that both methods obtained comparable results on the example case. This reinforces the need to use a range of assisted history matching algorithms for more confidence in predictions.
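    Extending PSO to several conflicting objectives typically replaces the single global best with an external archive of mutually non-dominated solutions. The sketch below shows the Pareto-dominance test and archive update that such a scheme rests on; it illustrates the general MOPSO idea rather than the thesis's exact algorithm, and all names are illustrative.

```python
import numpy as np

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimisation):
    no worse in every objective and strictly better in at least one."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

def update_archive(archive, candidate):
    """Insert a (position, objectives) pair into the non-dominated archive.

    The archive is the set from which a MOPSO draws its guide particles,
    so it collectively plays the role of the single global best in
    standard single-objective PSO.
    """
    x_c, f_c = candidate
    # Discard the candidate if any archived model already dominates it.
    if any(dominates(f_a, f_c) for _, f_a in archive):
        return archive
    # Otherwise keep it and drop any archived models it dominates.
    kept = [(x_a, f_a) for x_a, f_a in archive if not dominates(f_c, f_a)]
    kept.append(candidate)
    return kept
```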

    An Analysis of Particle Swarm Optimizers

    Many scientific, engineering and economic problems involve the optimisation of a set of parameters. Examples include minimising the losses in a power grid by finding the optimal configuration of its components, or training a neural network to recognise images of people's faces. Numerous optimisation algorithms have been proposed to solve these problems, with varying degrees of success. The Particle Swarm Optimiser (PSO) is a relatively new technique that has been empirically shown to perform well on many of these optimisation problems. This thesis presents a theoretical model that can be used to describe the long-term behaviour of the algorithm. An enhanced version of the Particle Swarm Optimiser is constructed and shown to have guaranteed convergence on local minima. This algorithm is extended further, resulting in an algorithm with guaranteed convergence on global minima. A model for constructing cooperative PSO algorithms is developed, resulting in the introduction of two new PSO-based algorithms. Empirical results are presented to support the theoretical properties predicted by the various models, using synthetic benchmark functions to investigate specific properties. The various PSO-based algorithms are then applied to the task of training neural networks, corroborating the results obtained on the synthetic benchmark functions. Thesis (PhD), Computer Science, University of Pretoria, 2007.
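    The guaranteed-convergence enhancement is usually presented as giving the swarm-best particle its own update rule, so that it keeps sampling a box around the global best instead of stalling when its position, personal best, and global best coincide. The sketch below follows the commonly published GCPSO form; the parameter values and success/failure thresholds are conventional defaults, and the names are illustrative.

```python
import numpy as np

def gcpso_best_particle_step(x, v, p_global, rho, w=0.72984, rng=None):
    """Update rule for the global-best particle in GCPSO.

    The standard PSO update lets this particle's velocity collapse once
    x == p_local == p_global; here the particle is instead reset onto
    p_global with a uniform random perturbation of radius rho, so it
    keeps exploring a box around the swarm best.
    """
    if rng is None:
        rng = np.random.default_rng()
    r = rng.random(x.shape)
    v_new = -x + p_global + w * v + rho * (1.0 - 2.0 * r)
    return x + v_new, v_new

def adapt_rho(rho, successes, failures, s_c=15, f_c=5):
    """Grow rho after a run of successes and shrink it after a run of
    failures, where a success means the swarm best improved this
    iteration."""
    if successes > s_c:
        return 2.0 * rho
    if failures > f_c:
        return 0.5 * rho
    return rho
```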

    Industrial machine structural components’ optimization and redesign

    Doctoral thesis in Leaders for Technological Industries. Laser cutting is a highly flexible process with numerous advantages over competing technologies, which have ensured the growth of its market, totalling 4,300 million United States dollars in 2020. The process is used in many industries, and current trends focus on reduced lead time, higher quality standards, and competitive costs, while ensuring accuracy. Composite materials (namely fibre-reinforced polymers) present attractive mechanical properties for several applications, including the subject of this thesis: industrial machine components. The use of these materials typically leads to machines with higher efficiency, dimensional accuracy, surface quality, and energy efficiency, as well as lower environmental impact. The main goal of this work is to increase the productivity of a laser cutting machine through the redesign of a critical component (the gantry), which is also key to the overall machine accuracy. Beyond that, the work is intended to lay out a generic methodology capable of assisting in the redesign of other critical machine components.
    As the problem deals with two opposing objectives (reducing weight and increasing stiffness) and a large number of variables, the implementation of an optimization routine is a central aspect of the present work. It is of major importance that the proposed optimization method leads to reliable results, demonstrated here by finite element analysis and by experimental validation on a scale prototype. The optimization algorithm selected is a metaheuristic inspired by the behaviour of swarms of animals; particle swarm algorithms have been shown to provide good and fast results on similar optimization problems. The optimization focused on the thickness of each laminate for the different fibre orientations present. The optimization routine produced a near-optimal solution for the laminates analysed and allowed a weight reduction of 43% relative to the current solution, as well as a 25% increase in the maximum allowed acceleration, which translates into machine productivity, while ensuring the same accuracy. The comparison between numerical and experimental results for the prototypes shows good agreement, with occasional divergences, and still validates the finite element model upon which the optimization is based. Portuguese Foundation for Science and Technology - SFRH/BD/51106/2010
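    As a rough illustration of how such a problem can be posed for PSO, the sketch below encodes one laminate thickness per fibre orientation and scores a candidate with a mass proxy plus a stiffness-constraint penalty. The orientations, bounds, and objective are placeholders: in the thesis, the mass and stiffness responses of the gantry come from a finite element model, not from closed-form expressions like these.

```python
import numpy as np

# Candidate encoding: one ply thickness (mm) per fibre orientation.
ORIENTATIONS = (0, 45, -45, 90)   # illustrative layup angles, degrees
BOUNDS = (0.1, 5.0)               # illustrative thickness range, mm

def fitness(thicknesses, max_deflection=0.05):
    """Placeholder objective: a mass proxy plus a penalty whenever a
    stiffness constraint is violated.

    The total-thickness mass proxy and the reciprocal 'deflection' are
    stand-ins for the finite element responses used in the thesis.
    """
    t = np.clip(np.asarray(thicknesses), *BOUNDS)
    mass = np.sum(t)                        # proxy: total laminate thickness
    deflection = 1.0 / (np.sum(t) + 1e-9)   # stand-in stiffness response
    penalty = 1e3 * max(0.0, deflection - max_deflection)
    return mass + penalty
```

    A swarm would then search the 4-dimensional thickness vector with a standard PSO loop, trading the mass proxy against the penalised stiffness constraint.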

    Stochastic time-changed Lévy processes with their implementation

    We focus on the implementation details for Lévy processes and their extension to stochastic volatility models for pricing European vanilla options and exotic options. We calibrated five models to European options on the S&P500 and used the calibrated models to price a cliquet option using Monte Carlo simulation. We provide the algorithms required to value the options when using Lévy processes. We found that these models were able to closely reproduce the market option prices for many strikes and maturities. We also found that the models we studied produced different prices for the cliquet option even though all the models produced the same prices for vanilla options. This highlights the model uncertainty involved in valuing a cliquet option. Further research is required to develop tools to understand and manage this model uncertainty. We recommend proceeding with this research by studying the cliquet option's sensitivity to the model parameters.
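    As an illustration of the kind of Monte Carlo valuation described, the sketch below prices a simple cliquet (the sum of floored and capped monthly returns) under a variance gamma model, one of the Lévy processes commonly calibrated to S&P500 options. The model choice, parameter values, and contract terms are illustrative, not those of the dissertation.

```python
import numpy as np

def price_cliquet_vg(r=0.05, sigma=0.2, theta=-0.14, nu=0.2,
                     n_periods=12, dt=1.0 / 12, local_floor=0.0,
                     local_cap=0.08, n_paths=100_000, seed=0):
    """Monte Carlo price (per unit notional) of a cliquet paying the sum
    of floored/capped periodic returns, under a variance gamma model.

    VG increments are simulated as Brownian motion with drift theta and
    volatility sigma evaluated at a gamma-distributed random time; the
    martingale correction omega keeps exp(-r t) S_t a martingale.
    """
    rng = np.random.default_rng(seed)
    omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu
    # Gamma time change, then conditionally normal VG increments.
    g = rng.gamma(shape=dt / nu, scale=nu, size=(n_paths, n_periods))
    z = rng.standard_normal((n_paths, n_periods))
    log_returns = (r + omega) * dt + theta * g + sigma * np.sqrt(g) * z
    gross = np.exp(log_returns)                        # S_i / S_{i-1}
    period_payoff = np.clip(gross - 1.0, local_floor, local_cap)
    payoff = np.sum(period_payoff, axis=1)
    return np.exp(-r * n_periods * dt) * payoff.mean()
```

    Because the payoff depends only on period-to-period returns, the spot level cancels and the notional can be factored out; the sensitivity study recommended above would amount to repricing while perturbing theta, nu, and sigma.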