
    Computational Intelligence Sequential Monte Carlos for Recursive Bayesian Estimation

    Recursive Bayesian estimation using sequential Monte Carlo methods is a powerful numerical technique for uncovering the latent dynamics of non-linear, non-Gaussian dynamical systems. Classical sequential Monte Carlo methods suffer from weight degeneracy, in which the number of distinct particles collapses. Traditionally this is addressed by resampling, which replaces high-weight particles with many highly correlated copies. Frequent resampling, however, reduces diversity amongst the particle set, a problem known as sample impoverishment. Existing sequential Monte Carlo variants attempt to resolve these coupled problems but introduce further data-processing overhead, yielding at best marginal performance improvements over the standard sequential Monte Carlo particle filter. We propose a new method, the adaptive path particle filter, for recursive Bayesian estimation of non-linear, non-Gaussian dynamical systems. Our method addresses weight degeneracy and sample impoverishment by embedding a computational intelligence step that adaptively switches paths between generations, using the maximal likelihood as a fitness function. We present preliminary tests on a scalar estimation problem with non-linear, non-Gaussian dynamics and a non-stationary observation model, and on the traditional univariate stochastic volatility problem. Building on these preliminary results, we evaluate the adaptive path particle filter on the stochastic volatility estimation problem, calibrating the Heston stochastic volatility model on six securities via Markov chain Monte Carlo. Finally, we investigate the efficacy of sequential Monte Carlo methods for recursive Bayesian estimation of astrophysical time series. We posit latent dynamics for both regularized and irregular astrophysical time series, calibrating fifty-five quasar time series using the CAR(1) model. We find that the adaptive path particle filter statistically significantly outperforms the standard sequential importance resampling particle filter, the Markov chain Monte Carlo particle filter and, on the Heston model estimation, the particle learning algorithm particle filter. In addition, our quasar MCMC calibration finds the characteristic timescale τ to be first-order stable, in contradiction to the literature, though indicative of a unified underlying structure. We offer detailed analysis throughout, and conclude with a discussion and suggestions for future work.
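    To make the weight-degeneracy/resampling trade-off concrete, the sketch below implements a plain bootstrap (sequential importance resampling) particle filter with resampling triggered by the effective sample size. The scalar transition and observation functions, noise scales and ESS threshold are illustrative assumptions in the spirit of the scalar test problem, not the adaptive path particle filter itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def transition(x):
    # Illustrative non-linear scalar dynamics (an assumption, not the thesis model).
    return 0.5 * x + 25.0 * x / (1.0 + x ** 2)

def sir_particle_filter(observations, n_particles=500, ess_frac=0.5):
    """Bootstrap SIR filter; resamples only when the effective sample size collapses."""
    particles = rng.normal(0.0, 1.0, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    filtered_means = []
    for y in observations:
        # Propagate every particle through the (assumed) transition density.
        particles = transition(particles) + rng.normal(0.0, 1.0, n_particles)
        # Re-weight by the (assumed) Gaussian observation likelihood y ~ N(x^2/20, 1).
        log_lik = -0.5 * (y - particles ** 2 / 20.0) ** 2
        weights *= np.exp(log_lik - log_lik.max())
        weights /= weights.sum()
        # Resampling fights weight degeneracy but induces sample impoverishment,
        # so it fires only when the effective sample size drops too low.
        ess = 1.0 / np.sum(weights ** 2)
        if ess < ess_frac * n_particles:
            idx = rng.choice(n_particles, size=n_particles, p=weights)
            particles = particles[idx]
            weights[:] = 1.0 / n_particles
        filtered_means.append(np.sum(weights * particles))
    return np.array(filtered_means)
```

    The adaptive path particle filter described above replaces this blanket resampling step with a fitness-driven switch between particle paths; the sketch only shows the baseline it is compared against.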

    Intensity based image registration of satellite images using evolutionary techniques

    Image registration is the fundamental image-processing technique for determining the geometrical transformation that gives the most accurate match between a reference and a floating image; its aim is to align the two images. Satellite images to be fused for numerous applications must be registered before use. The main challenge in satellite image registration is finding the optimum transformation parameters. In this work the misalignment is modeled by rigid and affine transformations. An intensity-based satellite image registration technique is used to register the floating image to the native coordinate system, with normalized mutual information (NMI) as the similarity metric for optimizing and updating the transform parameters. Because no assumptions are made regarding the nature of the relationship between the image intensities in the two modalities, NMI is very general and powerful: it can be applied automatically, without prior segmentation, to a large variety of data, and it handles partially overlapping images better than mutual information (MI). To maximize registration accuracy, the NMI is optimized using a genetic algorithm (GA), particle swarm optimization (PSO) and a hybrid GA-PSO. Random initialization and computational complexity make GA burdensome, whereas weak local search ability and premature convergence are the main drawbacks of PSO. The hybrid GA-PSO trades off local against global search to achieve a better balance between convergence speed and computational complexity. The registration algorithm is validated on several satellite data sets. The hybrid GA-PSO outperforms the alternatives in terms of the optimized NMI value and the percentage of mis-registration error.
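    As a rough illustration of the similarity metric being optimized here, the sketch below computes NMI from a joint intensity histogram. The bin count and the (H(A) + H(B)) / H(A, B) formulation are assumptions chosen for simplicity.

```python
import numpy as np

def normalized_mutual_information(reference, floating, bins=64):
    """NMI = (H(A) + H(B)) / H(A, B); higher means better intensity alignment."""
    joint, _, _ = np.histogram2d(reference.ravel(), floating.ravel(), bins=bins)
    pxy = joint / joint.sum()                   # joint intensity distribution
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginal distributions

    def entropy(p):
        p = p[p > 0]                            # ignore empty bins
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())
```

    A GA, PSO or hybrid GA-PSO optimizer would then search the rigid/affine parameter space, warping the floating image at each candidate and maximizing this score.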

    Bio-inspired computation for big data fusion, storage, processing, learning and visualization: state of the art and future directions

    This overview centers on research achievements that have recently emerged from the confluence of Big Data technologies and bio-inspired computation. A manifold of reasons can be identified for the profitable synergy between these two paradigms, all rooted in the adaptability, intelligence and robustness that biologically inspired principles can provide to technologies aimed at managing, retrieving, fusing and processing Big Data efficiently. We delve into this research field by first analyzing the existing literature in depth, with a focus on advances reported in the last few years. This literature analysis is complemented by an identification of new trends and open challenges in Big Data that remain unsolved to date and that can be effectively addressed by bio-inspired algorithms. As a second contribution, this work elaborates on how bio-inspired algorithms need to be adapted for use in a Big Data context, in which data fusion becomes crucial as a preliminary step that allows the processing and mining of several, potentially heterogeneous, data sources. This analysis allows us to explore and compare the scope and efficiency of existing approaches across different problems and domains, with the purpose of identifying new potential applications and research niches. Finally, this survey highlights open issues that remain unsolved to date in this research avenue, alongside a prescription of recommendations for future research.

    This work has received funding support from the Basque Government (Eusko Jaurlaritza) through the Consolidated Research Group MATHMODE (IT1294-19) and the EMAITEK and ELKARTEK programs. D. Camacho also acknowledges support from the Spanish Ministry of Science and Education under grant PID2020-117263GB-100 (FightDIS), the Comunidad Autónoma de Madrid under grant S2018/TCS-4566 (CYNAMON), and the CHIST-ERA 2017 BDSI PACMEL project (PCI2019-103623, Spain).

    Optimistic Variants of Single-Objective Bilevel Optimization for Evolutionary Algorithms

    Single-objective bilevel optimization is a specialized form of constrained optimization in which one of the constraints is itself an optimization problem. These problems are typically non-convex and strongly NP-hard. Recently, there has been increased interest from the evolutionary computation community in modeling bilevel problems, owing to their applicability to real-world decision-making problems. In this work, a partially nested evolutionary approach with a local heuristic search is proposed for solving the benchmark problems, with outstanding results. The approach relies on the concept of intermarriage-crossover, searching for feasible regions by exploiting information from the constraints. A new variant of the commonly used convergence approaches, i.e., optimistic and pessimistic, is also proposed: the extreme optimistic approach. The experimental results demonstrate that the algorithm converges differently to the known optimum solutions under the optimistic variants, and that the optimistic approach outperforms the pessimistic one. A comparative statistical analysis of our approach against other recently published partial-to-complete evolutionary approaches demonstrates very competitive results.
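    A minimal sketch of the nested idea, under the optimistic convention: every upper-level (leader) candidate is evaluated at an approximation of the lower-level (follower) optimum obtained by its own evolutionary search. The toy objectives, truncation selection and parameters are illustrative assumptions, not the paper's intermarriage-crossover scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def F(x, y):  # upper-level (leader) objective, toy example
    return (x - 1.0) ** 2 + y ** 2

def f(x, y):  # lower-level (follower) objective, toy example
    return (y - x) ** 2

def lower_level_best(x, pop=30, gens=40, sigma=0.5):
    """Evolutionary approximation of argmin_y f(x, y) for a fixed leader x."""
    ys = rng.normal(0.0, 2.0, pop)
    for _ in range(gens):
        children = ys + rng.normal(0.0, sigma, pop)
        both = np.concatenate([ys, children])
        ys = both[np.argsort(f(x, both))][:pop]  # truncation selection
    return ys[0]

def optimistic_bilevel(pop=20, gens=30, sigma=0.3):
    xs = rng.normal(0.0, 2.0, pop)
    for _ in range(gens):
        children = xs + rng.normal(0.0, sigma, pop)
        both = np.concatenate([xs, children])
        # Optimistic: each leader candidate is scored at the follower's optimum.
        scores = np.array([F(x, lower_level_best(x)) for x in both])
        xs = both[np.argsort(scores)][:pop]
    x_best = xs[0]
    return x_best, lower_level_best(x_best)
```

    The optimistic convention assumes a cooperating follower, so lower-level ties are broken in the leader's favour; a pessimistic variant would instead score each leader candidate at its worst admissible follower response.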

    A Comprehensive Review of Bio-Inspired Optimization Algorithms Including Applications in Microelectronics and Nanophotonics

    The application of artificial intelligence in everyday life is becoming all-pervasive and unavoidable. Within that vast field, a special place belongs to biomimetic/bio-inspired algorithms for multiparameter optimization, which find use in a large number of areas. Novel methods and advances are being published at an accelerated pace, so that, despite the many surveys and reviews in the field, these quickly become dated, and it is important to keep pace with current developments. In this review, we first consider a possible classification of bio-inspired multiparameter optimization methods, because papers dedicated to that question are relatively scarce and often contradictory. We proceed by describing in some detail the more prominent approaches, as well as those most recently published. Finally, we consider the use of biomimetic algorithms in two related, wide fields: microelectronics (including circuit design optimization) and nanophotonics (including the inverse design of structures such as photonic crystals, nanoplasmonic configurations and metamaterials). We have attempted to keep this broad survey self-contained, so that it can be of use not only to scholars in the related fields but also to all those interested in the latest developments in this attractive area.

    A comprehensive survey on cultural algorithms

    Peer reviewed. Postprint.

    Enhancement of Metaheuristic Algorithm for Scheduling Workflows in Multi-fog Environments

    Whether in computer science, engineering, or economics, optimization lies at the heart of any challenge involving decision-making. Choosing between several options is part of the decision-making process, and our desire to make the "better" decision drives that choice. An objective function or performance index quantifies the goodness of each alternative, and the theory and methods of optimization are concerned with picking the best one. Optimization methods fall into two types: deterministic and stochastic. The former is the traditional approach and works well for small, linear problems, but it struggles with most real-world problems, which are high-dimensional, non-linear and complex by nature. As an alternative, stochastic optimization algorithms are specifically designed to tackle such challenges and are more common nowadays. This study proposes two robust, stochastic, swarm-based metaheuristic optimization methods. Both are hybrid algorithms, formed by combining the Particle Swarm Optimization (PSO) and Salp Swarm Algorithm (SSA). These algorithms are then applied to an important and thought-provoking problem: scientific workflow scheduling in multiple fog environments. Many computing environments, fog computing among them, are plagued by security attacks that must be handled. DDoS attacks are particularly harmful to fog computing environments because they occupy the fog's resources and keep them busy. A fog environment therefore generally has fewer resources available during such attacks, which affects the scheduling of submitted Internet of Things (IoT) workflows. Nevertheless, current systems disregard the impact of ongoing DDoS attacks in their scheduling process, increasing both the number of workflows that miss their deadlines and the number of tasks offloaded to the cloud. Hence, this study proposes a hybrid optimization algorithm as a solution to the workflow scheduling problem across multiple fog computing locations. To deal with the effects of DDoS attacks on fog locations, two discrete-time Markov-chain schemes are used: one calculates the average network bandwidth available in each fog, while the other determines the average number of virtual machines available in each fog. DDoS attacks are addressed at various levels, and the approach predicts their influence on the fog environments. Based on the simulation results, the proposed method can significantly reduce the number of tasks offloaded to cloud data centers as well as the number of workflows with missed deadlines. Moreover, green fog computing is of growing significance in fog environments, as energy consumption plays an essential role in determining maintenance expenses and carbon dioxide emissions. Efficient scheduling can mitigate energy usage by allocating tasks to the most appropriate resources, taking the energy efficiency of each individual resource into account. To address these challenges, the proposed algorithm integrates the Dynamic Voltage and Frequency Scaling (DVFS) technique, which is commonly employed to enhance the energy efficiency of processors. The experimental findings demonstrate that the proposed method, combined with DVFS, yields improved outcomes, including reduced energy consumption. Consequently, this approach emerges as a more environmentally friendly and sustainable solution for fog computing environments.
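    One common way to hybridize the two swarms, sketched below, is to update part of the population with the PSO velocity rule and the rest with the SSA leader/follower rule against a shared global best. The half-and-half split, coefficients and sphere test function are illustrative assumptions, not the exact hybrid proposed here.

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):
    # Toy fitness; a scheduler would instead score a task-to-fog assignment.
    return np.sum(x ** 2, axis=-1)

def hybrid_pso_ssa(dim=10, pop=40, iters=200, w=0.7, c1=1.5, c2=1.5):
    X = rng.uniform(-5.0, 5.0, (pop, dim))
    V = np.zeros((pop, dim))
    pbest, pbest_f = X.copy(), sphere(X)
    gbest = pbest[np.argmin(pbest_f)].copy()
    half = pop // 2
    for t in range(1, iters + 1):
        # PSO half: velocity update toward personal and global bests.
        r1, r2 = rng.random((half, dim)), rng.random((half, dim))
        V[:half] = (w * V[:half] + c1 * r1 * (pbest[:half] - X[:half])
                    + c2 * r2 * (gbest - X[:half]))
        X[:half] += V[:half]
        # SSA half: the leader explores around gbest with a decaying radius,
        # and each follower moves toward the salp ahead of it in the chain.
        c = 2.0 * np.exp(-((4.0 * t / iters) ** 2))
        X[half] = gbest + c * rng.uniform(-1.0, 1.0, dim) * 10.0  # 10 = range width
        for i in range(half + 1, pop):
            X[i] = 0.5 * (X[i] + X[i - 1])
        X = np.clip(X, -5.0, 5.0)
        fit = sphere(X)
        better = fit < pbest_f
        pbest[better], pbest_f[better] = X[better], fit[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()
```

    In the scheduling setting, the fitness would score a candidate assignment by missed deadlines, offloaded tasks and, with DVFS enabled, the energy drawn at each selected voltage/frequency level.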

    Hybridizing Bayesian and variational data assimilation for high-resolution hydrologic forecasting

    The success of real-time estimation and forecasting applications based on geophysical models has been possible thanks to the two main existing frameworks for determining a model's initial conditions: Bayesian data assimilation and variational data assimilation. However, while there have been efforts to unify these two paradigms, existing attempts struggle to fully leverage the advantages of both in facing the challenges posed by modern high-resolution models, mainly related to model indeterminacy and steep computational requirements. In this article we introduce a hybrid algorithm called OPTIMISTS (Optimized PareTo Inverse Modeling through Integrated STochastic Search), which targets non-linear high-resolution problems and brings together ideas from particle filters (PFs), four-dimensional variational methods (4D-Var), evolutionary Pareto optimization, and kernel density estimation in a unique way. Streamflow forecasting experiments were conducted to test which specific configurations of OPTIMISTS led to higher predictive accuracy. The experiments were conducted on two watersheds: the Blue River (low resolution), using the VIC (Variable Infiltration Capacity) model, and the Indiantown Run (high resolution), using the DHSVM (Distributed Hydrology Soil Vegetation Model). By selecting kernel-based non-parametric sampling and non-sequential evaluation of candidate particles, and through the multi-objective minimization of departures from both the streamflow observations and the background states, OPTIMISTS was shown to efficiently produce probabilistic forecasts with accuracy comparable to that obtained from a particle filter. Moreover, the experiments demonstrated that OPTIMISTS scales well to high-resolution cases without imposing a significant computational overhead. With the combined advantages of fast, non-Gaussian, non-linear, high-resolution prediction, the algorithm shows the potential to increase the efficiency of operational prediction systems.
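    The core hybrid step can be caricatured as follows: new candidate states are drawn from a Gaussian-kernel density fitted to the current ensemble, and each candidate is scored on two departures (from the observations and from the background) so that a Pareto front, rather than a single weighted cost, drives selection. The kernel bandwidth, the generic forward operator and the dominance test below are illustrative assumptions, not the published OPTIMISTS implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

def kernel_sample(ensemble, n, bandwidth=0.2):
    """Draw candidate states from a Gaussian KDE fitted to the ensemble."""
    idx = rng.integers(0, len(ensemble), n)
    return ensemble[idx] + bandwidth * rng.standard_normal((n,) + ensemble.shape[1:])

def pareto_front(costs):
    """Boolean mask of non-dominated rows (minimization over both objectives)."""
    keep = np.ones(len(costs), dtype=bool)
    for i in range(len(costs)):
        dominated = (np.all(costs <= costs[i], axis=1)
                     & np.any(costs < costs[i], axis=1))
        if dominated.any():
            keep[i] = False
    return keep

def assimilate(ensemble, background, obs, forward, n_candidates=200):
    # forward: maps an (n, d) array of states to predicted observations.
    cands = kernel_sample(ensemble, n_candidates)
    costs = np.stack([
        np.linalg.norm(forward(cands) - obs, axis=-1),  # observation departure
        np.linalg.norm(cands - background, axis=-1),    # background departure
    ], axis=1)
    return cands[pareto_front(costs)]
```

    Evaluating the observation departure over a whole assimilation window, through the model, is what lends the scheme its 4D-Var flavour, while the kernel sampling and Pareto selection play the roles of the particle filter and the evolutionary optimizer.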