220 research outputs found

    Genetic Transfer or Population Diversification? Deciphering the Secret Ingredients of Evolutionary Multitask Optimization

    Evolutionary multitasking has recently emerged as a novel paradigm that enables the similarities and/or latent complementarities (if present) between distinct optimization tasks to be exploited autonomously, simply by solving them together under a unified solution representation scheme. An important matter underpinning future algorithmic advancements is to develop a better understanding of the driving force behind successful multitask problem-solving. In this regard, two (seemingly disparate) ideas have been put forward, namely, (a) implicit genetic transfer as the key ingredient facilitating the exchange of high-quality genetic material across tasks, and (b) population diversification, resulting in effective global search of the unified search space encompassing all tasks. In this paper, we present some empirical results that provide a clearer picture of the relationship between the two aforementioned propositions. For the numerical experiments we use Sudoku puzzles as case studies, mainly because outwardly unlike puzzle statements can often have nearly identical final solutions. The experiments reveal that while on many occasions genetic transfer and population diversity may be viewed as two sides of the same coin, the wider implication of genetic transfer, as shall be shown herein, captures the true essence of evolutionary multitasking to the fullest.
    Comment: 7 pages, 6 figures
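The unified-representation mechanism described above can be sketched compactly. Below is a minimal, illustrative multitask EA in which two toy tasks (sphere functions with deliberately similar optima, standing in for the Sudoku instances) share a single population in a unified [0, 1]^D space, and cross-task mating occurs with a random mating probability; all names and parameter values here are assumptions for illustration, not the paper's actual setup:

```python
import random

# Two toy tasks with deliberately similar optima (hypothetical stand-ins for
# the paper's Sudoku instances), minimised in a unified [0, 1]^D search space.
def task_a(x):
    return sum((xi - 0.30) ** 2 for xi in x)

def task_b(x):
    return sum((xi - 0.35) ** 2 for xi in x)

TASKS = [task_a, task_b]
DIM, POP, GENS, RMP = 5, 40, 120, 0.3  # RMP: cross-task mating probability

def multitask_ea(seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(DIM)] for _ in range(POP)]
    skill = [i % 2 for i in range(POP)]  # each individual specialises in one task
    for _ in range(GENS):
        kids = []
        for k in range(POP):
            t = k % 2  # the child will specialise in task t
            own = [i for i in range(POP) if skill[i] == t]
            p1 = pop[rng.choice(own)]
            if rng.random() < RMP:
                # cross-task mating: this is the implicit genetic transfer
                p2 = pop[rng.randrange(POP)]
            else:
                p2 = pop[rng.choice(own)]
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            child = [min(1.0, max(0.0, g + rng.gauss(0, 0.02))) for g in child]
            kids.append((child, t))
        # elitist survival per task, each individual evaluated only on its own task
        merged = list(zip(pop, skill)) + kids
        survivors = []
        for t in (0, 1):
            group = sorted((m for m in merged if m[1] == t),
                           key=lambda m: TASKS[t](m[0]))
            survivors += group[:POP // 2]
        pop = [m[0] for m in survivors]
        skill = [m[1] for m in survivors]
    return [min(TASKS[t](x) for x, s in zip(pop, skill) if s == t) for t in (0, 1)]
```

Because the two optima nearly coincide in the unified space, genes transferred across tasks tend to be useful to both, which is the effect the abstract's experiments probe.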

    Evolutionary Computation 2020

    Intelligent optimization builds on mechanisms of computational intelligence to refine a suitable feature model, design an effective optimization algorithm, and thereby obtain an optimal or satisfactory solution to a complex problem. Intelligent algorithms are key tools for ensuring global optimization quality, fast optimization efficiency and robust optimization performance. Intelligent optimization algorithms have been studied by many researchers, leading to improvements in the performance of algorithms such as the evolutionary algorithm, the whale optimization algorithm, the differential evolution algorithm, and particle swarm optimization. Studies in this arena have also resulted in breakthroughs in solving complex problems, including the green shop scheduling problem, the severely nonlinear problem of one-dimensional geodesic electromagnetic inversion, error and bug finding in software, the 0-1 knapsack problem, the travelling salesman problem, and the logistics distribution center siting problem. The editors are confident that this book can open a new avenue for further improvement and discoveries in the area of intelligent algorithms. The book is a valuable resource for researchers interested in understanding the principles and design of intelligent algorithms.
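Of the algorithms named above, differential evolution is compact enough to sketch end-to-end. The following is a minimal, illustrative DE/rand/1/bin implementation minimising a sphere function; the test function and all parameter values are assumptions chosen for demonstration, not drawn from the book:

```python
import random

def sphere(x):
    """Toy objective: minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def differential_evolution(f, dim=5, np_=20, gens=150, F=0.6, CR=0.9, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(np_)]
    fit = [f(p) for p in pop]
    for _ in range(gens):
        for i in range(np_):
            # DE/rand/1: mutate using three distinct individuals other than i
            a, b, c = rng.sample([k for k in range(np_) if k != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one survivor selection
                pop[i], fit[i] = trial, ft
    best = min(range(np_), key=lambda k: fit[k])
    return pop[best], fit[best]
```

The greedy replacement rule means the population's best fitness never worsens, which is what gives DE its robust convergence on problems like these.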

    Diagnosis of early mild cognitive impairment using a multiobjective optimization algorithm based on T1-MRI data

    Alzheimer’s disease (AD) is the most prevalent form of dementia. Accurate diagnosis of AD, especially in its early phases, is very important for timely intervention. It has been suggested that brain atrophy, as measured with structural magnetic resonance imaging (sMRI), can be an effective marker of neurodegeneration. While classification methods have been successful in the diagnosis of AD, their performance has been very poor in diagnosing those in the early stages of mild cognitive impairment (EMCI). Therefore, in this study we investigated whether optimisation based on evolutionary algorithms (EAs) can be an effective tool for distinguishing patients with EMCI from cognitively normal participants (CNs). Structural MRI data for patients with EMCI (n = 54) and CN participants (n = 56) were extracted from the Alzheimer’s Disease Neuroimaging Initiative (ADNI). Using three automatic brain segmentation methods, we extracted volumetric parameters as input to the optimisation algorithms. Our method achieved a classification accuracy of greater than 93%. This accuracy is higher than that of previously suggested methods for classifying CN and EMCI using single or multiple imaging modalities. Our results show that, with an effective optimisation method, a single modality of biomarkers can be enough to achieve high classification accuracy.
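The pipeline this abstract describes (evolutionary selection over volumetric features feeding a classifier) can be sketched on synthetic data. Everything below is an illustrative assumption: random "regional volumes" stand in for the ADNI measurements, and a simple nearest-centroid classifier stands in for the study's actual classification method:

```python
import random

# Hypothetical stand-in for ADNI volumetric features: 20 synthetic "regional
# volumes" per participant, only 3 of which actually differ between groups.
def make_data(n_per_group=50, n_feat=20, seed=0):
    rng = random.Random(seed)
    X, y = [], []
    for label in (0, 1):  # 0 = CN, 1 = EMCI
        for _ in range(n_per_group):
            row = [rng.gauss(0, 1) for _ in range(n_feat)]
            if label:
                for f in (2, 7, 11):  # the informative regions (assumed)
                    row[f] += 1.5
            X.append(row)
            y.append(label)
    return X, y

def centroid_accuracy(X, y, mask):
    """Training accuracy of a nearest-centroid classifier on masked features."""
    feats = [i for i, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    cents = {}
    for label in (0, 1):
        rows = [[X[k][f] for f in feats] for k in range(len(X)) if y[k] == label]
        cents[label] = [sum(col) / len(col) for col in zip(*rows)]
    correct = 0
    for k in range(len(X)):
        v = [X[k][f] for f in feats]
        d = {l: sum((a - b) ** 2 for a, b in zip(v, cents[l])) for l in (0, 1)}
        correct += (min(d, key=d.get) == y[k])
    return correct / len(X)

def ga_select(X, y, pop=30, gens=40, seed=0):
    """GA over binary feature masks, maximising classifier accuracy."""
    rng = random.Random(seed)
    n = len(X[0])
    P = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda m: -centroid_accuracy(X, y, m))
        elite = P[:pop // 2]
        kids = []
        while len(elite) + len(kids) < pop:
            a, b = rng.sample(elite, 2)
            child = [x if rng.random() < 0.5 else z for x, z in zip(a, b)]
            if rng.random() < 0.3:       # occasional single-bit mutation
                child[rng.randrange(n)] ^= 1
            kids.append(child)
        P = elite + kids
    P.sort(key=lambda m: -centroid_accuracy(X, y, m))
    return P[0], centroid_accuracy(X, y, P[0])
```

On this toy data the GA prunes uninformative regions and the masked classifier reaches high training accuracy; the study's reported 93% was, of course, obtained on real sMRI-derived volumes, not this sketch.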

    Holistic, data-driven, service and supply chain optimisation: linked optimisation.

    The intensity of competition and technological advancement in the business environment has driven companies to collaborate and cooperate as a means of survival. This creates a chain of companies and business components with unified business objectives. However, managing the decision-making processes (such as scheduling, ordering, delivering and allocating) across the various business components while maintaining a holistic objective is a huge business challenge, as these operations are complex and dynamic. The overall chain of business processes is widely distributed across all the supply chain participants, so no individual collaborator has a complete overview of the processes. Increasingly, such decisions are automated and strongly supported by optimisation algorithms: manufacturing optimisation, B2B ordering, financial trading, transportation scheduling and allocation. However, most of these algorithms do not incorporate the complexity associated with interacting decision-making systems like supply chains. It is well known that decisions made at one point in a supply chain can have significant consequences that ripple through linked production and transportation systems. Recently, global shocks to supply chains (COVID-19, climate change, the blockage of the Suez Canal) have demonstrated the importance of these interdependencies and the need to create supply chains that are more resilient and have a significantly reduced impact on the environment. Such interacting decision-making systems need to be considered within an optimisation process, yet the interactions between them are not currently modelled. We therefore believe that modelling such interactions is an opportunity to provide computational extensions to current optimisation paradigms. This research study aims to develop a general framework for formulating and solving holistic, data-driven optimisation problems in service and supply chains.
    This research achieved this aim and contributes to scholarship by, firstly, considering the complexities of supply chain problems from a linked-problem perspective, which leads to a formalism for characterising linked optimisation problems as a model for supply chains. Secondly, the research adopts a method for creating a linked optimisation problem benchmark by linking existing classical benchmark sets: a mix of classical optimisation problems, typically relating to supply chain decision problems, is used to describe different modes of linkage in linked optimisation problems. Thirdly, several techniques for linking fragmented supply chain data have been proposed in the literature to identify data relationships; this thesis explores some of these techniques and combines them in specific ways to improve the data discovery process. Lastly, many state-of-the-art algorithms have been explored in the literature and used to tackle supply chain problems. This research therefore investigates resilient state-of-the-art optimisation algorithms presented in the literature, and then designs suitable algorithmic approaches, inspired by the existing algorithms and the nature of the problem linkages, to address different problem linkages in supply chains. Considering the research findings and future perspectives, the study demonstrates the suitability of algorithms for different linked structures involving two sub-problems, which suggests further investigation of issues such as the suitability of algorithms on more complex structures, benchmark methodologies, holistic goals and evaluation, process mining, game theory and dependency analysis.
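The idea of linking classical benchmark problems can be made concrete with a toy two-stage example. In the sketch below (all data invented for illustration), a 0-1 knapsack chooses which customer orders to accept, and the accepted orders then induce a small routing sub-problem whose travel cost offsets the order profit:

```python
import itertools

ORDERS = [  # (profit, weight, (x, y) customer location) -- illustrative data
    (10, 4, (0, 5)), (8, 3, (5, 0)), (22, 7, (6, 6)), (6, 2, (1, 1)),
]
CAPACITY = 10
DEPOT = (0, 0)

def route_cost(points):
    """Exact TSP over the (tiny) selected stop set, returning to the depot."""
    if not points:
        return 0.0
    best = float("inf")
    for perm in itertools.permutations(points):
        tour = (DEPOT,) + perm + (DEPOT,)
        cost = sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                   for a, b in zip(tour, tour[1:]))
        best = min(best, cost)
    return best

def linked_value(selection):
    """Knapsack profit minus the routing cost that the choice induces."""
    if sum(ORDERS[i][1] for i in selection) > CAPACITY:
        return float("-inf")  # infeasible for the first sub-problem
    profit = sum(ORDERS[i][0] for i in selection)
    return profit - route_cost(tuple(ORDERS[i][2] for i in selection))

def solve():
    # brute force over every feasible selection of the linked problem
    return max((linked_value(s), s)
               for r in range(len(ORDERS) + 1)
               for s in itertools.combinations(range(len(ORDERS)), r))
```

With these numbers, solving the knapsack stage in isolation would accept orders (1, 2) for the maximum profit of 30, but the routing cost those stops induce makes the selection (2, 3) the better choice for the linked problem, which is exactly the kind of cross-problem interaction the thesis formalises.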

    A sustainable ultrafiltration of sub-20 nm nanoparticles in water and isopropanol: experiments, theory and machine learning

    This research focused on ultrafiltration (UF) of particles down to 2 nm using membranes with larger pore sizes in water and isopropanol (IPA), which has the potential to save up to 90% of energy. The study developed an electrospray (ES)-scanning mobility particle sizer (SMPS) method to measure retention efficiencies quickly and effectively for small particles (ZnS, Au and PSL) on polytetrafluoroethylene (PTFE), polyvinylidene fluoride (PVDF) and polycarbonate (PCTE) membranes in different liquids. Theoretical models that could quantitatively explain the experimental results for small particles in medium-polarity organic solvents were also developed. Results showed that the highest efficiency was up to ~80%, with 10 nm Au nanoparticles challenged on 100 nm rated PTFE, which demonstrated the feasibility of the proposed sustainable UF. The theoretical models were validated by the experimental results and indicated that a higher efficiency was possible by enhancing the material properties of the membranes, particles, or liquids. Therefore, optimization of the filtration conditions was performed. A hybrid artificial neural network (ANN) and particle swarm optimization (PSO) model was first applied in this case. The dataset includes all the experimental results and some additional calculated retention efficiencies. The optimization parameters include membrane zeta potential, pore size, particle size, particle zeta potential, and Hamaker constant. The ANN model provided predicted values highly correlated with the target values. The PSO model showed that a filtration efficiency of 99.9% could be achieved by using a 52.2 nm filter with a -20.3 mV zeta potential, 5.5 nm nanoparticles with a 41.4 mV zeta potential, and a combined Hamaker constant
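The hybrid ANN-PSO step can be illustrated schematically. In the sketch below, a hypothetical analytic surrogate stands in for the trained ANN (the real model took five inputs, including particle size, particle zeta potential and the Hamaker constant), and a minimal PSO searches pore size and membrane zeta potential for maximum predicted efficiency; every function and numeric value here is an illustrative assumption, not the study's fitted model:

```python
import math
import random

# Hypothetical smooth surrogate standing in for the paper's trained ANN:
# predicted retention efficiency (%) as a function of pore size [nm] and
# membrane zeta potential [mV]. Peak placed arbitrarily for illustration.
def surrogate_efficiency(pore_nm, zeta_mv):
    return 100.0 * math.exp(-((pore_nm - 50) / 40) ** 2
                            - ((zeta_mv + 20) / 30) ** 2)

def pso_maximize(f, bounds, swarm=15, iters=100, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=lambda p: f(*p))[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive + social velocity update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(bounds[d][1],
                                max(bounds[d][0], pos[i][d] + vel[i][d]))
            if f(*pos[i]) > f(*pbest[i]):
                pbest[i] = pos[i][:]
                if f(*pbest[i]) > f(*gbest):
                    gbest = pbest[i][:]
    return gbest, f(*gbest)

# search pore size 20-200 nm, membrane zeta potential -60 to 0 mV
best, eff = pso_maximize(surrogate_efficiency, [(20, 200), (-60, 0)])
```

The workflow mirrors the abstract's: a regression model predicts efficiency from filtration parameters, and PSO then searches that model for the parameter combination with the highest predicted retention.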