51 research outputs found

    Dynamic Offloading Technique for Latency-Sensitive Internet of Things Applications using Fog Computing

    Get PDF
    Internet of Things (IoT) has evolved as a novel paradigm that provides computation power to the entities connected to it. IoT offers services to multiple sectors such as home automation, industrial automation, traffic management, healthcare, and agriculture. IoT generally relies on cloud data centers for extended analytics, processing, and storage support. The cloud offers a highly scalable and robust platform for IoT applications, but latency-sensitive IoT applications suffer delay issues because the cloud lies in a remote location. Edge/fog computing was introduced to overcome the issues faced by delay-sensitive IoT applications. These platforms lie close to the IoT network, reducing delay and response time. Fog nodes are usually distributed in nature, so data has to be properly offloaded to the available fog nodes using efficient strategies to gain benefit from the integration. Different offloading schemes are available in the literature to address this problem. This paper proposes a novel offloading approach, termed the HB-FS algorithm, that combines two efficient metaheuristic algorithms: the Honey Badger Algorithm (HBA) and the Flamingo Search Algorithm (FSA). HB-FS is executed iteratively, optimizing the objective function in each iteration. The proposed approach is evaluated against several existing metaheuristic algorithms, and the evaluations show that it outperforms them in terms of latency, response time, and execution time. The methodology also achieves a better degree of imbalance, with proper load balancing under different conditions.
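    The degree of imbalance reported above is a standard load-balancing metric in offloading studies, usually defined as the spread of per-node execution times relative to their mean. A minimal sketch under that common definition (the paper's exact variant may differ):

    ```python
    def degree_of_imbalance(loads):
        """Degree of imbalance: (max - min) / mean of per-node loads.

        `loads` holds the total execution time assigned to each fog node;
        lower values indicate better load balancing (0.0 = perfectly even).
        """
        mean_load = sum(loads) / len(loads)
        return (max(loads) - min(loads)) / mean_load

    # Example: three fog nodes carrying total assigned times 8, 10, and 12
    print(degree_of_imbalance([8, 10, 12]))  # -> 0.4
    ```

    An offloading scheme such as HB-FS would evaluate this metric on each candidate task-to-node assignment alongside latency and response time.
    
    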

    An elitism-based multi-objective evolutionary algorithm for min-cost network disintegration

    Get PDF
    Network disintegration or strengthening is a significant problem with wide application in infrastructure construction, social networks, infectious disease prevention, and so on. Most studies, however, assume that the cost of attacking any node is equal. In this paper, we investigate the robustness of complex networks under the more realistic assumption that costs are functions of node degrees. A multi-objective elitism-based evolutionary algorithm (MOEEA) is proposed for the network disintegration problem with heterogeneous costs. By defining a new unit-cost influence measure for the target attack node and combining it with an elitism strategy, information about promising node combinations can be retained. Through an ingenious update mechanism, this information is passed on to the next generation to guide the population toward more promising regions, which improves the convergence rate of the proposed algorithm. A series of experiments was carried out on four benchmark networks and several model networks; the results show that our method performs better than five other state-of-the-art attack strategies. MOEEA can usually find min-cost network disintegration solutions. Moreover, by testing different cost functions, we find that the stronger the cost heterogeneity, the better our algorithm performs.

    Self-Improved Pelican optimization for Task Scheduling in Edge Computing: Neural Network based risk probability estimation

    Get PDF
    The aim of edge computing (EC) is to drastically speed up task response time while using less energy. The computational resources in EC are situated closer to the information-generation sources, resulting in lower network latency and bandwidth utilization compared to cloud computing (CC). In an EC system, the edge server handles and manages task requests and the data generated by adjacent IoT machines. Task scheduling is treated as an optimization problem. This paper therefore proposes a novel task scheduling model that takes risk probability, execution cost, execution time, and makespan into account. A neural network estimates risk probabilities, taking task security and virtual machine security into consideration. This work suggests a new Pareto distribution-based pelican optimization algorithm (PDPOA) model for optimal scheduling of tasks. Results from the proposed system are examined and compared to existing methods via measures including makespan, execution time, execution cost, and risk probability.
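    Makespan, one of the scheduling objectives listed above, is conventionally the finish time of the busiest machine. A sketch under that standard definition (the paper's exact cost model and weighting are not given here):

    ```python
    def makespan(schedule, exec_time):
        """Makespan of a task-to-VM assignment.

        schedule:  dict mapping VM id -> list of task ids assigned to it
        exec_time: dict mapping (task id, VM id) -> execution time
        Returns the finish time of the most heavily loaded VM.
        """
        return max(sum(exec_time[(t, vm)] for t in tasks)
                   for vm, tasks in schedule.items())

    sched = {"vm0": [0, 1], "vm1": [2]}
    times = {(0, "vm0"): 3.0, (1, "vm0"): 2.0, (2, "vm1"): 4.0}
    print(makespan(sched, times))  # -> 5.0 (vm0 finishes last: 3.0 + 2.0)
    ```

    A scheduler such as PDPOA would minimize this quantity jointly with execution cost and the network-estimated risk probability.
    
    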

    RSU-based Joint Congestion-Intrusion Detection System in VANETs Using Deep Learning Technique

    Get PDF
    Vehicular Ad hoc Network (VANET) is a technology that makes it possible to provide many practical services in intelligent transportation systems, but it is also susceptible to several intrusion threats. By identifying unusual network behavior, intrusion detection systems (IDSs) can reduce security vulnerabilities. However, current IDSs can only detect anomalous network behavior in local sub-networks rather than throughout the whole VANET. Hence there is a need for a Joint Congestion and Intrusion Detection System (JCIDS). We designed a JCIDS model that can collect network data cooperatively from vehicles and Roadside Units (RSUs). This paper proposes a new deep learning model to improve the performance of JCIDS, using k-means and a posterior detection step based on coresets to improve detection accuracy and eliminate redundant messages. The protocol combines the efficacy of a Recurrent Neural Network (RNN) and the Honey Badger Algorithm (HBA) on the fundamental AODV protocol with the advantages of JCIDS. First, clusters are formed using vehicles' mobility parameters, such as velocity and distance, to enhance route stability, and the vehicle with the highest route stability is chosen as Cluster Head. Second, efficient intrusion detection is achieved using the RNN, whose optimal weighting factor is selected with the help of the HBA; the RNN performs efficient prediction with the assistance of the HBA. The finest path for data dissemination is selected by considering link lifetime, hop count, and residual energy along the path. As a result, multimedia data streaming improves network lifetime, with reduced packet loss ratio and energy consumption compared to existing DNN and SVM schemes for different node densities and speeds.
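    The cluster-head election step above can be sketched as follows. The exact stability metric is not given in the abstract, so the score here is a hypothetical one: a vehicle is more stable when its relative speeds and distances to neighbours are small.

    ```python
    def stability_score(speed_diffs, distances):
        """Hypothetical route-stability score for one vehicle.

        speed_diffs: relative speeds to neighbouring vehicles (m/s)
        distances:   distances to neighbouring vehicles (m)
        Larger (less negative) score = more stable candidate.
        """
        n = len(speed_diffs)
        return -(sum(abs(s) for s in speed_diffs) / n + sum(distances) / n)

    def elect_cluster_head(vehicles):
        """vehicles: dict id -> (speed_diffs, distances). Highest score wins."""
        return max(vehicles, key=lambda v: stability_score(*vehicles[v]))

    cars = {
        "v1": ([2.0, 3.0], [50.0, 60.0]),   # fast-moving, far from neighbours
        "v2": ([0.5, 1.0], [20.0, 25.0]),   # slow relative motion, close by
    }
    print(elect_cluster_head(cars))  # -> v2
    ```

    The paper's actual metric may weight velocity and distance differently; this only illustrates the selection mechanism.
    
    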

    Metaheuristic optimal design of adaptive composite beams

    Get PDF
    The first author also wishes to thank the support of Project IPL/2019/MOCHVar/ISEL. The integration of active materials in composite structures confers on them the capability of adapting their behaviour when subjected to loads, overcoming a merely passive response. This adaptive character can be further optimized if adequate design variable parameters are identified and selected. This work presents a study on the use of a metaheuristic bio-inspired technique to optimize (non-)skewed adaptive composite beams whose material properties vary along the length. The results illustrate the performance of the analysis-optimization package implemented, as well as the possible range of behaviours, beyond the optimal configurations achieved for the set of case studies.

    Antenna S-parameter optimization based on golden sine mechanism based honey badger algorithm with tent chaos

    Get PDF
    This work proposes a new method to optimize the antenna S-parameter using a Golden Sine mechanism-based Honey Badger Algorithm that employs Tent chaos (GST-HBA). The Honey Badger Algorithm (HBA) is a promising optimization method that, like other metaheuristic algorithms, is prone to premature convergence and lacks diversity in the population. The HBA is inspired by the behavior of honey badgers, which use their sense of smell and honeyguide birds to move toward the honeycomb. Our proposed approach aims to improve the performance of HBA and enhance the accuracy of the optimization process for antenna S-parameter optimization. The approach leverages the strengths of both tent chaos and the golden sine mechanism to achieve fast convergence, population diversity, and a good tradeoff between exploitation and exploration. We begin by testing our approach on 20 standard benchmark functions, and then apply it to a test suite of 8 S-parameter functions. Comparing the outcomes to those of other optimization algorithms, the results show that the suggested algorithm is superior.
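    The tent chaotic map used to diversify the population is typically the piecewise-linear map below (a common formulation; the paper's exact parameterization may differ):

    ```python
    def tent_map(x, mu=2.0):
        """One iteration of the tent map on x in (0, 1)."""
        return mu * x if x < 0.5 else mu * (1.0 - x)

    def tent_sequence(x0, n):
        """Generate n chaotic values in [0, 1], e.g. to seed a
        well-spread initial population instead of uniform random draws."""
        seq, x = [], x0
        for _ in range(n):
            x = tent_map(x)
            seq.append(x)
        return seq

    print(tent_sequence(0.37, 5))  # first value is 2 * 0.37 = 0.74
    ```

    Chaotic initialization like this is a standard way to address exactly the premature-convergence and low-diversity issues the abstract attributes to plain HBA; in floating point, long tent-map runs can degenerate, so seeds with short binary expansions should be avoided.
    
    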

    Application of Group Method of Data Handling and New Optimization Algorithms for Predicting Sediment Transport Rate under Vegetation Cover

    Full text link
    Planting vegetation is one of the practical solutions for reducing sediment transfer rates. Increasing vegetation cover decreases environmental pollution and the sediment transport rate (STR). Since sediments and vegetation interact in complex ways, predicting sediment transport rates is challenging. This study aims to predict the sediment transport rate under vegetation cover using new and optimized versions of the group method of data handling (GMDH). Additionally, this study introduces a new ensemble model for predicting sediment transport rates. Model inputs include wave height, wave velocity, density cover, wave force, D50, the height of vegetation cover, and cover stem diameter. A standalone GMDH model and optimized GMDH models, including the GMDH honey badger algorithm (GMDH-HBA), GMDH rat swarm optimization algorithm (GMDH-RSOA), GMDH sine cosine algorithm (GMDH-SCA), and GMDH particle swarm optimization (GMDH-PSO), were used to predict sediment transport rates. Next, the outputs of the standalone and optimized GMDH models were used to construct an ensemble model. The MAE of the ensemble model was 0.145 m3/s, while the MAEs of GMDH-HBA, GMDH-RSOA, GMDH-SCA, GMDH-PSO, and GMDH at the testing level were 0.176 m3/s, 0.312 m3/s, 0.367 m3/s, 0.498 m3/s, and 0.612 m3/s, respectively. The Nash-Sutcliffe efficiency (NSE) values of the ensemble model, GMDH-HBA, GMDH-RSOA, GMDH-SCA, GMDH-PSO, and GMDH were 0.95, 0.93, 0.89, 0.86, 0.82, and 0.76, respectively. Additionally, this study demonstrated that vegetation cover decreased the sediment transport rate by 90 percent. The results indicated that the ensemble and GMDH-HBA models could accurately predict sediment transport rates. Based on the results of this study, the sediment transport rate can be monitored using the IMM and GMDH-HBA. These results are useful for managing and planning water resources in large basins. (Comment: 65 pages, 10 figures, 5 tables)
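    The Nash-Sutcliffe efficiency (NSE) reported above compares the model's squared error against the variance of the observations; a minimal sketch of the standard formula:

    ```python
    def nse(observed, predicted):
        """Nash-Sutcliffe efficiency: 1 - SSE / SST.

        1.0 is a perfect fit; values <= 0 mean the model predicts
        no better than the mean of the observations.
        """
        mean_obs = sum(observed) / len(observed)
        sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
        sst = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - sse / sst

    obs = [1.0, 2.0, 3.0, 4.0]
    pred = [1.1, 1.9, 3.2, 3.8]
    print(round(nse(obs, pred), 3))  # -> 0.98
    ```

    The paper's ensemble NSE of 0.95 thus means its residual error is only 5% of the observations' natural variability.
    
    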

    Filter-Wrapper Methods For Gene Selection In Cancer Classification

    Get PDF
    In microarray gene expression studies, finding the smallest subset of informative genes from microarray datasets for clinical diagnosis and accurate cancer classification is one of the most difficult challenges in machine learning. Many researchers have devoted their efforts to addressing this problem by using a filter method, a wrapper method, or a combination of both. A hybrid method is a hybridisation of filter and wrapper methods: it benefits from the speed of the filter approach and the accuracy of the wrapper approach. Several hybrid filter-wrapper methods have been proposed to select informative genes. However, hybrid methods encounter a number of limitations associated with the filter and wrapper approaches. The gene subset produced by filter approaches lacks predictiveness and robustness, while the wrapper approach encounters problems of complex interactions among genes and stagnation in local optima. To address these drawbacks, this study investigates filter and wrapper methods to develop effective hybrid methods for gene selection. This study proposes new hybrid filter-wrapper methods based on Maximum Relevancy Minimum Redundancy (MRMR) as the filter approach and an adapted bat-inspired algorithm (BA) as the wrapper approach. First, MRMR hybridisation and BA adaptation are investigated to resolve the gene selection problem. The proposed method is called MRMR-BA
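    The MRMR filter step greedily selects, at each iteration, the gene that maximizes relevance to the class label minus its mean redundancy with the genes already chosen. A sketch using absolute Pearson correlation as a stand-in for the mutual-information measure MRMR implementations usually employ:

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation of two equal-length numeric sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def mrmr(features, labels, k):
        """Greedy MRMR: pick k feature indices maximizing
        relevance(feature, labels) - mean redundancy(feature, selected)."""
        selected = []
        candidates = list(range(len(features)))
        while len(selected) < k and candidates:
            def score(i):
                relevance = abs(pearson(features[i], labels))
                redundancy = (sum(abs(pearson(features[i], features[j]))
                                  for j in selected) / len(selected)) if selected else 0.0
                return relevance - redundancy
            best = max(candidates, key=score)
            selected.append(best)
            candidates.remove(best)
        return selected

    genes = [
        [0.1, 0.9, 0.2, 1.0, 0.8, 0.0],  # strongly predictive
        [0.1, 0.9, 0.2, 1.0, 0.8, 0.0],  # exact duplicate (fully redundant)
        [1, 0, 0, 1, 1, 0],              # weaker but complementary
    ]
    labels = [0, 1, 0, 1, 1, 0]
    print(mrmr(genes, labels, 2))  # -> [0, 2]: the duplicate gene 1 is skipped
    ```

    In MRMR-BA this filter stage would prune the gene pool before the bat-inspired wrapper searches it, which is where the speed benefit the abstract mentions comes from.
    
    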

    Reducing water conveyance footprint through an advanced optimization framework

    Get PDF
    This study investigates the optimal and safe operation of pumping stations in water distribution systems (WDSs) with the aim of reducing the environmental footprint of water conveyance processes. We introduced the nonlinear chaotic honey badger algorithm (NCHBA), a novel and robust optimization method. The proposed method utilizes chaotic maps to enhance exploration and convergence speed, incorporating a nonlinear control parameter to effectively balance local and global search dynamics. Single-objective optimization results on a WDS show that NCHBA outperforms other algorithms in solution accuracy and convergence speed. The application of the proposed approach on a water network with two variable-speed pumps demonstrated a significant 27% reduction in energy consumption. Expanding our focus to the multi-objective optimization of pump scheduling programs in large-scale WDSs, we employ the multi-objective non-dominated sorting nonlinear chaotic honey badger algorithm (MONCHBA). The findings reveal that the use of variable-speed pumps not only enhances energy efficiency but also bolsters WDS reliability compared to the use of single-speed pumps. The results showcase the potential and robustness of the proposed multi-objective NCHBA in achieving an optimal Pareto front that effectively balances energy consumption, pressure levels, and water quality risk, facilitating carbon footprint reduction and sustainable management of WDSs.
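    The pumping-energy objective follows from standard hydraulic power; a sketch under textbook assumptions (water density 1000 kg/m3, constant efficiency), not the paper's exact formulation:

    ```python
    RHO = 1000.0   # water density, kg/m^3
    G = 9.81       # gravitational acceleration, m/s^2

    def pump_power_kw(flow_m3s, head_m, efficiency):
        """Electrical power drawn by a pump: P = rho*g*Q*H / eta, in kW."""
        return RHO * G * flow_m3s * head_m / efficiency / 1000.0

    def energy_kwh(flow_m3s, head_m, efficiency, hours):
        """Energy consumed over a scheduling interval of constant operation."""
        return pump_power_kw(flow_m3s, head_m, efficiency) * hours

    # Example: 0.05 m^3/s against 30 m head at 75% efficiency for 2 hours
    print(round(energy_kwh(0.05, 30.0, 0.75, 2.0), 2))  # -> 39.24 kWh
    ```

    A pump-scheduling optimizer like NCHBA would sum such terms over all pumps and time steps, with variable-speed pumps adding the speed ratio as a decision variable.
    
    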

    An Efficient High-Dimensional Gene Selection Approach based on Binary Horse Herd Optimization Algorithm for Biological Data Classification

    Full text link
    The Horse Herd Optimization Algorithm (HOA) is a new meta-heuristic algorithm based on the behaviors of horses at different ages. The HOA was introduced recently to solve complex and high-dimensional problems. This paper proposes a binary version of the Horse Herd Optimization Algorithm (BHOA) in order to solve discrete problems and select prominent feature subsets. Moreover, this study provides a novel hybrid feature selection framework based on the BHOA and a minimum Redundancy Maximum Relevance (MRMR) filter method. This hybrid feature selection, which is more computationally efficient, produces a beneficial subset of relevant and informative features. Since feature selection is a binary problem, we have applied a new Transfer Function (TF), called the X-shaped TF, which transforms continuous problems into binary search spaces. Furthermore, the Support Vector Machine (SVM) is utilized to examine the efficiency of the proposed method on ten microarray datasets, namely Lymphoma, Prostate, Brain-1, DLBCL, SRBCT, Leukemia, Ovarian, Colon, Lung, and MLL. In comparison to other state-of-the-art methods, such as the Gray Wolf (GW), Particle Swarm Optimization (PSO), and Genetic Algorithm (GA), the proposed hybrid method (MRMR-BHOA) demonstrates superior performance in terms of accuracy and minimum selected features. Also, experimental results prove that the X-Shaped BHOA approach outperforms other methods.
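    A transfer function binarizes a continuous position update by mapping each component to a flip probability. The X-shaped TF is defined in the paper itself; as an illustration of the mechanism only, here is the widely used S-shaped (sigmoid) TF, not the paper's exact function:

    ```python
    import math
    import random

    def s_shaped_tf(v):
        """S-shaped transfer function: maps a continuous velocity/position
        component to a probability in (0, 1)."""
        return 1.0 / (1.0 + math.exp(-v))

    def binarize(position, rng=random):
        """Convert a continuous candidate solution into a binary feature
        mask: bit i is set with probability TF(position[i])."""
        return [1 if rng.random() < s_shaped_tf(v) else 0 for v in position]

    random.seed(0)
    print(binarize([-5.0, 0.0, 5.0]))  # -> [0, 0, 1] with this seed
    ```

    In BHOA, each horse's continuous update would pass through such a TF every iteration, so the herd dynamics of HOA carry over to the discrete gene-selection search space.
    
    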