
    Evolutionary Computation 2020

    Intelligent optimization is based on the mechanisms of computational intelligence: refine a suitable feature model, design an effective optimization algorithm, and then obtain an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency, and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severely nonlinear problem of one-dimensional geodesic electromagnetic inversion, error and bug finding in software, the 0-1 knapsack problem, the traveling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvement and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.
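
    As a concrete illustration of the kind of intelligent algorithm surveyed here, the sketch below applies a minimal genetic algorithm to a small 0-1 knapsack instance. The instance data, population size, and operator settings are illustrative assumptions and are not taken from the book.

```python
import random

# Illustrative 0-1 knapsack instance (values, weights, capacity are assumptions).
VALUES = [60, 100, 120, 30, 70, 90]
WEIGHTS = [10, 20, 30, 5, 15, 25]
CAPACITY = 60
N_ITEMS = len(VALUES)

def fitness(bits):
    """Total value of the selected items; infeasible selections score 0."""
    weight = sum(w for w, b in zip(WEIGHTS, bits) if b)
    value = sum(v for v, b in zip(VALUES, bits) if b)
    return value if weight <= CAPACITY else 0

def genetic_algorithm(pop_size=30, generations=100, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    # Random initial population of bit strings.
    pop = [[rng.randint(0, 1) for _ in range(N_ITEMS)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Binary tournament selection of two parents.
            p1 = max(rng.sample(pop, 2), key=fitness)
            p2 = max(rng.sample(pop, 2), key=fitness)
            # One-point crossover.
            cut = rng.randrange(1, N_ITEMS)
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation.
            child = [1 - b if rng.random() < p_mut else b for b in child]
            new_pop.append(child)
        pop = new_pop
        best = max(pop + [best], key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    solution, value = genetic_algorithm()
    print("best selection:", solution, "value:", value)
```

    Overweight selections are simply given zero fitness here; a penalty or repair operator is a common alternative design choice.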

    Development of a multi-objective optimization algorithm based on Lichtenberg figures

    This doctoral dissertation presents the most important concepts of multi-objective optimization and a systematic review of the most cited articles on this subject in mechanical engineering in recent years. The state of the art shows a trend towards the use of metaheuristics and of a posteriori decision-making techniques to solve engineering problems. This fact increases the demand on algorithms, which compete to deliver the most accurate answers at the lowest possible computational cost. In this context, a new hybrid multi-objective metaheuristic inspired by lightning and Lichtenberg figures is proposed. The Multi-objective Lichtenberg Algorithm (MOLA) is tested on complex test functions and explicitly constrained engineering problems and compared with other metaheuristics. MOLA outperformed the most used algorithms in the literature: NSGA-II, MOPSO, MOEA/D, MOGWO, and MOGOA. After initial validation, it was applied to two complex problems that cannot be evaluated analytically. The first was a design case: the multi-objective optimization of CFRP isogrid tubes using the finite element method. The optimizations were carried out using two methodologies: i) a metamodel, and ii) finite element model updating. The latter proved to be the better methodology, finding solutions that reduced the mass by at least 45.69%, the instability coefficient by 18.4%, and the Tsai-Wu failure index by 61.76%, while increasing the natural frequency by at least 52.57%. In the second application, MOLA was internally modified and combined with feature selection techniques to become the Multi-objective Sensor Selection and Placement Optimization based on the Lichtenberg Algorithm (MOSSPOLA), an unprecedented Sensor Placement Optimization (SPO) algorithm that maximizes the acquired modal response while minimizing the number of sensors for any structure, a structural health monitoring principle that had not been implemented before. MOSSPOLA was applied to a real helicopter's main rotor blade using the 7 best-known metrics in SPO. Pareto fronts and sensor configurations were generated and compared for the first time. Better sensor distributions were associated with higher hypervolume, and the algorithm found a sensor configuration for each sensor number and metric, including one with 100% accuracy in identifying delamination considering triaxial modal displacements, a minimum number of sensors, and noise for all blade sections.
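
    Two standard multi-objective notions used above, Pareto dominance and hypervolume, can be stated compactly. The sketch below is a minimal illustration of both for a bi-objective minimization problem; the candidate points and the reference point are made-up values, and this is not the MOLA or MOSSPOLA implementation.

```python
def dominates(a, b):
    """True if point a Pareto-dominates point b (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

def hypervolume_2d(front, ref):
    """Area dominated by a 2-D front and bounded by the reference point (minimization)."""
    front = sorted(front)  # ascending in the first objective
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

# Made-up bi-objective values, e.g. (number of sensors, negated modal response quality).
candidates = [(3, 0.9), (4, 0.6), (5, 0.5), (4, 0.8), (6, 0.4)]
front = pareto_front(candidates)
print("Pareto front:", front)
print("hypervolume :", hypervolume_2d(front, ref=(10, 1.0)))
```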

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning, among others. This is partly because the classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, in contrast, guides low-level heuristics to search beyond the local optimality that impairs traditional computational methods. This topic series has collected quality papers proposing cutting-edge methodologies and innovative applications that drive the advances of AMC.
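
    To make the idea of searching beyond local optimality concrete, the following is a minimal simulated annealing sketch on a toy multimodal function; the objective, cooling schedule, and step size are illustrative assumptions rather than anything specific to the papers in this series.

```python
import math
import random

def objective(x):
    """Toy multimodal function with many local minima; global minimum near the origin."""
    return x * x + 10 * math.sin(3 * x)

def simulated_annealing(x0=5.0, t0=10.0, cooling=0.995, steps=5000, seed=1):
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        # Low-level move: a small random perturbation (local neighbourhood).
        cand = x + rng.gauss(0.0, 0.5)
        fc = objective(cand)
        # Always accept improving moves; accept worsening moves with a
        # temperature-dependent probability, which lets the search escape local minima.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # gradually reduce the willingness to accept worse moves
    return best_x, best_f

print(simulated_annealing())
```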

    Optimisation, Optimal Control and Nonlinear Dynamics in Electrical Power, Energy Storage and Renewable Energy Systems

    The electrical power system is undergoing a revolution enabled by advances in telecommunications, computer hardware and software, measurement, metering systems, IoT, and power electronics. Furthermore, the increasing integration of intermittent renewable energy sources, energy storage devices, and electric vehicles, together with the drive for energy efficiency, has pushed power systems to modernise and adopt new technologies. The resulting smart grid is characterised, in part, by a bi-directional flow of energy and information. The evolution of the power grid, as well as its interconnection with energy storage systems and renewable energy sources, has created new opportunities for optimising not only their techno-economic aspects at the planning stage but also their control and operation. However, new challenges emerge in the optimisation of these systems due to their complexity and nonlinear dynamic behaviour, as well as the uncertainties involved. This volume is a selection of 20 papers carefully chosen by the editors from the MDPI topic “Optimisation, Optimal Control and Nonlinear Dynamics in Electrical Power, Energy Storage and Renewable Energy Systems”, which closed in April 2022. The selected papers address the above challenges and exemplify the significant benefits that optimisation and nonlinear control techniques can bring to modern power and energy systems.

    Intelligent Circuits and Systems

    ICICS-2020 is the third conference initiated by the School of Electronics and Electrical Engineering at Lovely Professional University, exploring recent innovations by researchers working on the development of smart and green technologies in the fields of Energy, Electronics, Communications, Computers, and Control. ICICS enables innovators to identify new opportunities for the social and economic benefit of society. The conference bridges the gap between academics, R&D institutions, social visionaries, and experts from all strata of society, allowing them to present their ongoing research and fostering research relations between them. It provides opportunities for the exchange of new ideas, applications, and experiences in the field of smart technologies, and for finding global partners for future collaboration. ICICS-2020 was conducted in two broad categories: Intelligent Circuits & Intelligent Systems, and Emerging Technologies in Electrical Engineering.

    Text Similarity Between Concepts Extracted from Source Code and Documentation

    Context: Constant evolution in software systems often results in their documentation losing sync with the content of the source code. The traceability research field has long aimed to recover links between code and documentation when the two fall out of sync. Objective: The aim of this paper is to compare the concepts contained within the source code of a system with those extracted from its documentation, in order to detect how similar these two sets are. If they are vastly different, the gap might indicate considerable ageing of the documentation and a need to update it. Methods: In this paper we reduce the source code of 50 software systems to sets of key terms, each capturing the concepts of one of the sampled systems. At the same time, we reduce the documentation of each system to another set of key terms. We then use four different set-comparison approaches to detect how similar the sets are. Results: Using the well-known Jaccard index as the benchmark for the comparisons, we discovered that the cosine distance has excellent comparative power, depending on the pre-training of the machine learning model. In particular, the SpaCy and FastText embeddings yield similarity scores of up to 80% and 90%, respectively. Conclusion: For most of the sampled systems, the source code and the documentation tend to contain very similar concepts. Given the accuracy of one pre-trained model (e.g., FastText), it also becomes evident that a few systems show a measurable drift between the concepts contained in the documentation and in the source code.
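
    The two comparison measures named above can be stated compactly. The sketch below computes the Jaccard index over two key-term sets and the cosine similarity over averaged term vectors; the term sets are made-up examples, and the toy embed lookup stands in for the SpaCy or FastText embeddings used in the paper.

```python
import math

def jaccard(a, b):
    """Jaccard index: size of the intersection over size of the union of two term sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
    return dot / norm if norm else 0.0

# Made-up key terms extracted from source code and from documentation.
code_terms = ["parser", "token", "index", "cache", "query"]
doc_terms = ["parser", "token", "search", "cache", "manual"]

# Toy 3-dimensional "embeddings"; in the paper these would come from a
# pre-trained model such as SpaCy or FastText.
embed = {
    "parser": [0.9, 0.1, 0.0], "token": [0.8, 0.2, 0.1], "index": [0.2, 0.9, 0.1],
    "cache": [0.1, 0.8, 0.3], "query": [0.3, 0.7, 0.2], "search": [0.3, 0.8, 0.2],
    "manual": [0.0, 0.1, 0.9],
}

def mean_vector(terms):
    """Average the embeddings of a term set into one document-level vector."""
    vecs = [embed[t] for t in terms]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

print("Jaccard:", jaccard(code_terms, doc_terms))
print("Cosine :", cosine(mean_vector(code_terms), mean_vector(doc_terms)))
```

    The embedding-based score rewards semantically related but non-identical terms (e.g. "query" and "search"), which the purely set-based Jaccard index cannot do.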

    Towards a more efficient use of computational budget in large-scale black-box optimization

    Evolutionary algorithms are general-purpose optimizers that have proven effective in solving a variety of challenging optimization problems. In contrast to mathematical programming models, evolutionary algorithms do not require derivative information and remain effective when the algebraic formula of the given problem is unavailable. Nevertheless, rapid advances in science and technology have given rise to more complex optimization problems than ever, which pose significant challenges to traditional optimization methods. When the available computational budget is limited, the dimensionality of the search space is one of the main contributors to a problem's difficulty and complexity. This so-called curse of dimensionality can significantly affect the efficiency and effectiveness of optimization methods, including evolutionary algorithms. This research studies two topics related to a more efficient use of the computational budget in evolutionary algorithms when solving large-scale black-box optimization problems: the role of population initializers in saving computational resources, and computational budget allocation in cooperative coevolutionary algorithms. Consequently, this dissertation consists of two major parts, each relating to one of these research directions. In the first part, we review several population initialization techniques that have been used in evolutionary algorithms and categorize them from different perspectives. The contribution of each category to improving evolutionary algorithms on large-scale problems is measured. We also study the mutual effect of population size and initialization technique on the performance of evolutionary techniques when dealing with large-scale problems. Finally, assuming that uniformity of the initial population is a key contributor to saving a significant part of the computational budget, we investigate whether achieving a high level of uniformity in high-dimensional spaces is feasible given practical restrictions on computational resources. In the second part of the thesis, we study large-scale imbalanced problems. In many real-world applications, a large problem may consist of subproblems with different degrees of difficulty and importance. In addition, the solution to each subproblem may contribute differently to the overall objective value of the final solution. When the computational budget is restricted, which is the case in many practical problems, investing the same portion of resources in optimizing each of these imbalanced subproblems is not the most efficient strategy. Therefore, we examine several ways to learn the contribution of each subproblem and then dynamically allocate the limited computational resources to each of them according to its contribution to the overall objective value of the final solution. To demonstrate the effectiveness of the proposed framework, we design a new set of 40 large-scale imbalanced problems and study the performance of some possible instances of the framework.
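
    The budget-allocation idea from the second part can be illustrated with a small loop that tracks how much each subproblem's optimizer improved the overall objective and then spends more of the remaining evaluations on the subproblems with the largest observed contribution. Everything below (the weighted quadratic subproblems, the random-search component optimizer, and the proportional allocation rule) is an illustrative assumption, not the framework proposed in the dissertation.

```python
import random

rng = random.Random(0)

# Three imbalanced subproblems: each is a quadratic with a different weight,
# so improving some components pays off far more than improving others.
WEIGHTS = [100.0, 10.0, 1.0]
DIM = 5

def subproblem_cost(i, x):
    """Cost of component i; the overall objective is the sum over components."""
    return WEIGHTS[i] * sum(v * v for v in x)

def improve_component(i, x, evals):
    """Cheap stand-in optimizer: random perturbations, keep the best (uses `evals` evaluations)."""
    best, best_cost = list(x), subproblem_cost(i, x)
    for _ in range(evals):
        cand = [v + rng.gauss(0.0, 0.1) for v in best]
        c = subproblem_cost(i, cand)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost

components = [[rng.uniform(-5, 5) for _ in range(DIM)] for _ in range(len(WEIGHTS))]
costs = [subproblem_cost(i, c) for i, c in enumerate(components)]
contributions = [1.0] * len(WEIGHTS)   # optimistic start: every component gets a chance
budget, round_size = 6000, 600

while budget > 0:
    total = sum(contributions)
    for i in range(len(WEIGHTS)):
        # Allocate this round's evaluations proportionally to the observed contribution.
        share = max(1, int(round_size * contributions[i] / total))
        share = min(share, budget)
        components[i], new_cost = improve_component(i, components[i], share)
        contributions[i] = max(costs[i] - new_cost, 1e-9)  # improvement achieved this round
        costs[i] = new_cost
        budget -= share
        if budget <= 0:
            break

print("final overall objective:", sum(costs))
```

    The small floor on each contribution keeps every subproblem sampled occasionally, so a component whose importance was underestimated early on can still regain budget later.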

    Constitutive surveillance and social media

    Starting from the premise that surveillance is the ‘dominant organising practice’ of our time (Lyon et al. 2012: 1), this thesis establishes a framework of ‘constitutive surveillance’ in relation to social media, taking Facebook as its key example. Constitutive surveillance is made up of four forms: economic, political, lateral, and oppositional surveillance. These four surveillance forms – and the actors who undertake them – intersect, compound, and confront one another in the co-production of social media spaces. The framework of constitutive surveillance is structured around a Foucauldian understanding of power, and the thesis shows how each surveillance form articulates strategies of power for organising, administering, and subjectifying populations. After outlining the four surveillance forms, each chapter unpacks the relationship of one form to social media, building throughout the thesis an extensive critical framework of constitutive surveillance.