A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications
Particle swarm optimization (PSO) is a heuristic global optimization method, originally proposed by Kennedy and Eberhart in 1995. It is now one of the most commonly used optimization techniques. This survey presents a comprehensive investigation of PSO. On the one hand, we review advances in PSO, including its modifications (quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topologies (such as fully connected, von Neumann, ring, star, and random), hybridizations (with genetic algorithms, simulated annealing, tabu search, artificial immune systems, ant colony algorithms, artificial bee colony, differential evolution, harmony search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementations (on multicore, multiprocessor, GPU, and cloud computing platforms). On the other hand, we survey applications of PSO in the following fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. It is hoped that this survey will be beneficial to researchers studying PSO algorithms.
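The canonical global-best PSO update that these variants build on can be sketched as follows (a minimal illustration, not code from the survey; the sphere objective and all parameter values are example assumptions):

```python
import random

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimise f over R^dim with canonical global-best PSO."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_val = [f(p) for p in pos]          # personal best values
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity: inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)
best, val = pso(lambda x: sum(t * t for t in x), dim=5)  # sphere function
```

Most of the surveyed modifications change exactly one of these ingredients: the inertia/coefficient schedule, the neighbourhood that defines `gbest`, or the update rule itself.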
A review of RFID based solutions for indoor localization and location-based classification of tags
Wireless communication systems are widely used for indoor localization of items. In particular, two main application fields can be identified. The former relates to the detection or localization of static items; the latter relates to real-time tracking of moving objects, whose movements can be reconstructed over identified timespans. Among the adopted technologies, Radio-Frequency IDentification (RFID), especially when based on cheap passive RFID tags, stands out for its affordability and reasonable efficiency. This makes RFID suitable for both of the above-mentioned applications, especially when a large number of objects need to be tagged. The reason lies in a suitable trade-off between the low cost of implementing the position-sensing system and its precision and accuracy. However, RFID-based solutions suffer from limited reading range and lower accuracy. Solutions have been proposed by academia and industry; however, a structured analysis of the developed solutions, useful for further implementations, is missing. The purpose of this paper is to highlight and review the recently proposed solutions for indoor localization that make use of passive RFID tags. The paper focuses on both precise and qualitative location of objects. The former relates to (i) the exact position of tags, namely mapping their correct position in a 2D or 3D environment. The latter relates to the classification of tags, namely identifying the area where a tag is located regardless of its specific position.
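One common building block in RSSI-based RFID localization is the log-distance path-loss model, which converts a received signal strength into an approximate tag distance. A minimal sketch follows; the reference power, path-loss exponent, and reference distance are assumed example values, not parameters of any surveyed system:

```python
def rssi_to_distance(rssi_dbm, p0_dbm=-40.0, n=2.0, d0=1.0):
    """Estimate tag distance (metres) from RSSI via the log-distance
    path-loss model: d = d0 * 10^((P0 - RSSI) / (10 * n)).
    p0_dbm: reference RSSI measured at d0 metres (assumed value);
    n: path-loss exponent, ~2 in free space, larger indoors (assumed)."""
    return d0 * 10 ** ((p0_dbm - rssi_dbm) / (10 * n))

# With the assumed parameters, a -60 dBm reading maps to ~10 m.
d = rssi_to_distance(-60.0)
```

Distance estimates from several readers can then be combined (e.g. by trilateration) for the precise-position case, or simply thresholded per reader for the qualitative, area-classification case.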
Robust modeling and planning of radio-frequency identification network in logistics under uncertainties
To realize a higher coverage rate, lower reading interference, and cost efficiency of radio-frequency identification networks in logistics under uncertainties, a novel robust radio-frequency identification network planning model is built and a robust particle swarm optimization is proposed. In the radio-frequency identification network planning model, coverage is established by reference to the probabilistic sensing model of a sensor with uncertain sensing range; reading interference is calculated by a concentric-map-based Monte Carlo method; cost efficiency is described by the number of readers. In robust particle swarm optimization, a sampling method whose sampling size varies with the iterations is put forward to improve the robustness of robust particle swarm optimization within a limited sampling budget. In particular, the exploitation speed in the early phase of robust particle swarm optimization is quickened by a smaller expected sampling size, while the exploitation precision in the late phase is ensured by a larger expected sampling size. Simulation results show that, compared with three other methods, the planning solution obtained by this work is more conducive to enhancing the coverage rate and reducing interference and cost.
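The iteration-dependent sampling idea can be sketched as follows; the linear schedule, its bounds, and the Gaussian noise standing in for uncertain sensing ranges are illustrative assumptions, not the paper's exact formulation:

```python
import random

def sample_size(t, t_max, n_min=5, n_max=50):
    """Expected sampling size grows with the iteration counter t:
    small early (cheap, fast search), large late (precise robust
    fitness estimates). Bounds n_min/n_max are assumed values."""
    return n_min + round((n_max - n_min) * t / t_max)

def robust_fitness(f, x, t, t_max):
    """Robust fitness as the mean of k noisy evaluations, where the
    noise models uncertainty (e.g. uncertain sensing range)."""
    k = sample_size(t, t_max)
    return sum(f(x) + random.gauss(0.0, 0.1) for _ in range(k)) / k
```

Each particle's fitness in the robust PSO would be replaced by `robust_fitness`, so the sampling cost automatically shifts from speed early on to precision near convergence.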
Imbalanced data classification using support vector machine based on simulated annealing for enhancing penalty parameter
For pattern classification and regression problems, the support vector machine (SVM) is an eminent and computationally powerful machine learning method. It has been effectively applied to several concrete problems across an extensive gamut of domains. SVM has a key parameter, the penalty factor C. The choice of this parameter has a substantial impact on the classification precision of SVM, as unsuitable parameter settings may yield substandard classification outcomes. The penalty factor C is required to achieve an adequate trade-off between classification errors and generalisation performance. Hence, formulating an SVM model with appropriate performance requires parameter optimisation. The simulated annealing (SA) algorithm is employed to formulate a hybrid method for selecting SVM parameters. Additionally, the intent is to enhance system efficacy by obtaining the optimal penalty parameter while balancing classification performance. Our experiments with many UCI datasets indicate that the recommended technique can attain enhanced classification precision.
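A generic simulated annealing loop of the kind used to tune C can be sketched as follows. Since the real objective would require training an SVM at every step, the cross-validation error is replaced here by a hypothetical stand-in cost curve over log10(C); the schedule and step size are likewise assumed example values:

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, alpha=0.95, iters=200):
    """Generic SA: random neighbour moves, accept worse moves with
    probability exp(-delta/T), geometric cooling T <- alpha * T."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + random.uniform(-step, step)   # neighbour in log10(C) space
        fc = cost(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha
    return best, fbest

# Hypothetical stand-in for a cross-validation error curve over log10(C);
# in the hybrid method, cost() would train an SVM and return its CV error.
random.seed(1)
log_c, err = simulated_annealing(lambda z: (z - 1.3) ** 2 + 0.05, x0=0.0)
C = 10 ** log_c
```

Searching in log10(C) space is a standard choice because useful C values span several orders of magnitude.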
Immunity-Based Framework for Autonomous Flight in GPS-Challenged Environment
In this research, the artificial immune system (AIS) paradigm is used for the development of a conceptual framework for autonomous flight when vehicle position and velocity are not available from direct sources such as the global navigation satellite systems or external landmarks and systems. The AIS is expected to provide corrections of velocity and position estimations that are only based on the outputs of onboard inertial measurement units (IMU). The AIS comprises sets of artificial memory cells that simulate the function of memory T- and B-cells in the biological immune system of vertebrates. The innate immune system uses information about invading antigens and needed antibodies. This information is encoded and sorted by T- and B-cells. The immune system has an adaptive component that can accelerate and intensify the immune response upon subsequent infection with the same antigen. The artificial memory cells attempt to mimic these characteristics for estimation error compensation and are constructed under normal conditions when all sensor systems function accurately, including those providing vehicle position and velocity information. The artificial memory cells consist of two main components: a collection of instantaneous measurements of relevant vehicle features representing the antigen and a set of instantaneous estimation errors or correction features, representing the antibodies. The antigen characterizes the dynamics of the system and is assumed to be correlated with the required corrections of position and velocity estimation or antibodies. When the navigation source is unavailable, the currently measured vehicle features from the onboard sensors are matched against the AIS antigens and the corresponding corrections are extracted and used to adjust the position and velocity estimation algorithm and provide the corrected estimation as actual measurement feedback to the vehicle’s control system. 
The proposed framework is implemented and tested through simulation in two versions: with corrections applied to the output or to the input of the estimation scheme. For both approaches, the vehicle feature or antigen sets include increments of body-axis components of acceleration and angular rate. The correction feature or antibody sets include vehicle position and velocity and vehicle acceleration adjustments, respectively. The impact on the performance of the proposed methodology produced by essential elements such as the path generation method, the matching algorithm, the feature set, and the IMU grade was investigated. The findings demonstrated that, in all cases, the proposed methodology could significantly reduce the accumulation of dead-reckoning errors and could become a viable solution in situations where direct accurate measurements and other sources of information are not available. The functionality of the proposed methodology and its promising outcomes were successfully illustrated using the West Virginia University unmanned aerial system simulation environment.
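The matching step can be sketched as a nearest-neighbour lookup from stored antigen features to their paired corrections; the memory contents, feature dimensionality, and Euclidean matching rule here are illustrative assumptions, not the paper's exact algorithm:

```python
import math

# Hypothetical AIS memory: (antigen feature vector, correction/antibody) pairs,
# built offline while the navigation sources (e.g. GNSS) are still healthy.
memory = [
    ((0.1, 0.0), (0.02, -0.01)),
    ((0.5, 0.2), (0.10, 0.03)),
    ((0.9, 0.4), (0.22, 0.07)),
]

def lookup_correction(features):
    """Match the current IMU-derived feature vector against the stored
    antigens (nearest neighbour in Euclidean distance) and return the
    paired antibody, i.e. the estimation-error correction to apply."""
    antigen, correction = min(memory, key=lambda cell: math.dist(cell[0], features))
    return correction
```

When the navigation source drops out, the correction returned for the best-matching antigen is fed back into the position/velocity estimator in place of the missing measurement.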
Development of a multi-objective optimization algorithm based on Lichtenberg figures
This doctoral dissertation presents the most important concepts of multi-objective optimization and a systematic review of the most cited articles on this subject in mechanical engineering in recent years. The state of the art shows a trend towards the use of metaheuristics and of a posteriori decision-making techniques to solve engineering problems. This fact increases the demand for algorithms, which compete to deliver the most accurate answers at the lowest possible computational cost. In this context, a new hybrid multi-objective metaheuristic inspired by lightning and Lichtenberg figures is proposed. The Multi-objective Lichtenberg Algorithm (MOLA) is tested using complex test functions and explicit constrained engineering problems and compared with other metaheuristics. MOLA outperformed the most used algorithms in the literature: NSGA-II, MOPSO, MOEA/D, MOGWO, and MOGOA. After initial validation, it was applied to two complex problems that cannot be evaluated analytically. The first was a design case: the multi-objective optimization of CFRP isogrid tubes using the finite element method. The optimizations were performed considering two methodologies: i) using a metamodel, and ii) finite element updating. The latter proved to be the best methodology, finding solutions that reduced the mass by at least 45.69%, the instability coefficient by 18.4%, and the Tsai-Wu failure index by 61.76%, and increased the natural frequency by at least 52.57%. In the second application, MOLA was internally modified and associated with feature selection techniques to become the Multi-objective Sensor Selection and Placement Optimization based on the Lichtenberg Algorithm (MOSSPOLA), an unprecedented Sensor Placement Optimization (SPO) algorithm that maximizes the acquired modal response and minimizes the number of sensors for any structure. Although this is a structural health monitoring principle, it had never been done before.
MOSSPOLA was applied to a real helicopter's main rotor blade using the 7 best-known metrics in SPO. Pareto fronts and sensor configurations were generated and compared for the first time. Better sensor distributions were associated with higher hypervolume, and the algorithm found a sensor configuration for each sensor number and metric, including one with 100% accuracy in identifying delamination considering triaxial modal displacements, a minimum number of sensors, and noise for all blade sections.
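The Pareto-dominance relation underlying MOLA's comparisons (and hypervolume-based assessment) can be sketched as follows, assuming minimization of all objectives; this is a generic illustration, not code from the dissertation:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Filter a set of objective vectors down to the non-dominated ones."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (3, 4) is dominated by (2, 3); the other three points are incomparable.
front = pareto_front([(1, 5), (2, 3), (3, 4), (4, 1)])
```

Multi-objective metaheuristics such as MOLA return this non-dominated set rather than a single solution, leaving the final trade-off to a posteriori decision making.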
Component-wise analysis of metaheuristic algorithms for novel fuzzy-meta classifier
Metaheuristic research has produced promising results in science, business, and engineering problems. However, analysis of metaheuristic performance is mostly conducted at a high level. This leaves several critical questions unanswered due to the black-box issue, which does not reveal why certain metaheuristic algorithms perform better on some problems and not on others. To address the significant gap between theory and practice in metaheuristic research, this study proposes an in-depth analysis approach using a component view of metaheuristic algorithms and diversity measurement for determining exploration and exploitation abilities. This research selected three commonly used swarm-based metaheuristic algorithms – Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and Cuckoo Search (CS) – for component-wise analysis. As a result, the study was able to address the premature-convergence problem in PSO, poor exploitation in ABC, and the imbalanced exploration and exploitation issue in CS. The proposed improved PSO (iPSO), improved ABC (iABC), and improved CS (iCS) outperformed the standard algorithms and variants from the existing literature, as well as Grey Wolf Optimization (GWO) and Animal Migration Optimization (AMO), on ten numerical optimization problems with varying modalities. The proposed iPSO, iABC, and iCS were then employed in the proposed novel Fuzzy-Meta Classifier (FMC), which offered highly reduced model complexity and high accuracy compared to the Adaptive Neuro-Fuzzy Inference System (ANFIS). The proposed three-layer FMC produced efficient rules that achieved nearly 100% accuracy on ten different classification datasets, with a significantly reduced number of trainable parameters and nodes in the network architecture compared to ANFIS.
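A common diversity measure of the kind used to gauge exploration versus exploitation is the mean distance of the population to its centroid; this sketch is a generic illustration, not necessarily the exact measure used in the study:

```python
import math

def swarm_diversity(population):
    """Mean Euclidean distance of individuals to the population centroid.
    High values indicate a spread-out, exploring population; values
    shrinking towards zero indicate exploitation (or premature convergence)."""
    dim = len(population[0])
    n = len(population)
    centroid = [sum(p[d] for p in population) / n for d in range(dim)]
    return sum(math.dist(p, centroid) for p in population) / n
```

Tracking this quantity over iterations, per algorithmic component, is what lets a component-wise analysis attribute premature convergence or poor exploitation to a specific update rule.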
Applications
Volume 3 describes how resource-aware machine learning methods and techniques are used to successfully solve real-world problems. The book provides numerous specific application examples: in health and medicine for risk modelling, diagnosis, and treatment selection for diseases; in electronics, steel production, and milling for quality control during manufacturing processes; and in traffic and logistics for smart cities and for mobile communications.
Navigation of mobile robot in cluttered environment
Nowadays, mobile robots are widely used in many applications, and their navigation is a primary issue in the robotics research field. To be successful, mobile robots must quickly and robustly perform useful tasks in complex, dynamic, known and unknown surroundings. Navigation plays an important role in all mobile robot activities and tasks. Mobile robots are machines that navigate around their environment, extracting sensory information from the surroundings and performing actions depending on the information given by the sensors. The main aim of mobile robot navigation is to provide the shortest and safest path while avoiding obstacles, with the help of a suitable navigation technique such as fuzzy logic. In this work, we build a mobile robot, and then simulations and experiments are carried out in the lab. The simulation and experimental results are compared and found to be in good agreement.
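A fuzzy-logic obstacle-avoidance rule base of the kind described can be sketched as follows; the membership functions, rule outputs, and Sugeno-style defuzzification are illustrative assumptions, not the controller from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def near(d):
    """Degree to which a sonar distance d (metres) means 'obstacle near'."""
    return tri(d, -0.5, 0.0, 1.0)

def far(d):
    """Degree to which d means 'clear'; ramps up between 0.5 m and 1.5 m."""
    return max(0.0, min(1.0, (d - 0.5) / 1.0))

def fuzzy_steer(d_left, d_right):
    """Rules: near on the left -> turn right (+1), near on the right ->
    turn left (-1), both clear -> go straight (0). Defuzzify by the
    weighted average of rule outputs (Sugeno-style)."""
    rules = [
        (near(d_left),                   +1.0),  # turn right
        (near(d_right),                  -1.0),  # turn left
        (min(far(d_left), far(d_right)),  0.0),  # straight
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

With distance readings from left and right sensors, the controller blends the active rules into a single steering command, which is what makes fuzzy navigation robust to noisy sonar data in cluttered environments.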