24 research outputs found

    Genetic algorithm for the design and optimization of a shell and tube heat exchanger from a performance point of view

    A new approach to optimize the design of a shell and tube heat exchanger (STHX) is developed via a genetic algorithm (GA) to obtain the optimal configuration from a performance point of view. The objective is to develop and test a model for optimizing the early design stage of the STHX and to solve the design problem quickly. The GA is implemented to maximize the heat transfer rate while minimizing the pressure drop. It is applied to an oil cooler of type OKG 33/244, and the results are compared with the original data of the STHX. The simulation outcomes reveal that the STHX's operating performance is improved, indicating that a GA can be successfully employed for the design optimization of an STHX from a performance standpoint. The maximum increase in effectiveness achieved using the GA is 57%, while the minimum increase is 47%. Furthermore, the average effectiveness of the heat exchanger is 55%, and the number of transfer units (NTU) has improved from 0.475319 to 1.825664 by using the GA.
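    As a rough illustration of the optimization loop described above, the sketch below runs a tiny genetic algorithm over a few hypothetical STHX design variables (tube diameter, tube count, baffle spacing) with a placeholder fitness that trades a heat-transfer proxy against a pressure-drop proxy; the variable bounds, weights, and fitness are assumptions, not the paper's actual thermal model (which would rely on correlations such as Kern or Bell-Delaware).

```python
# Minimal GA sketch for STHX design with an assumed, placeholder objective.
import random

BOUNDS = {               # hypothetical design-variable bounds
    "tube_diameter_m":  (0.010, 0.025),
    "n_tubes":          (50, 400),     # treated as continuous for simplicity
    "baffle_spacing_m": (0.10, 0.50),
}

def random_design():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def fitness(d):
    # Placeholder: reward a heat-transfer-area proxy, penalise a pressure-drop proxy.
    area_proxy = d["tube_diameter_m"] * d["n_tubes"]
    dp_proxy = d["n_tubes"] / d["baffle_spacing_m"]
    return area_proxy - 1e-4 * dp_proxy

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def mutate(d, rate=0.1):
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            d[k] = random.uniform(lo, hi)
    return d

def run_ga(pop_size=30, generations=50):
    pop = [random_design() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # elitist selection
        parents = pop[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

print(run_ga())
```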

    Smart Bagged Tree-based Classifier optimized by Random Forests (SBT-RF) to Classify Brain- Machine Interface Data

    Brain-Computer Interface (BCI) is a new technology that uses electrodes and sensors to connect machines and computers with the human brain in order to improve a person's mental performance. Human intentions and thoughts are also analyzed and recognized using BCI and translated into Electroencephalogram (EEG) signals. However, certain brain signals may contain redundant information, making classification ineffective. Relevant characteristics are therefore essential for enhancing classification performance, so feature selection is employed to eliminate redundant data before classification in order to reduce computation time. BCI Competition III Dataset IVa was used to investigate the efficacy of the proposed system. A Smart Bagged Tree-based Classifier (SBT-RF) technique is presented to determine the importance of the features for selecting and classifying the data. As a result, SBT-RF improves the mean accuracy on the dataset, decreases computation cost and training time, and increases prediction speed. Furthermore, fewer features mean fewer electrodes, thus lowering the risk of damage to the brain. The proposed algorithm has the greatest average accuracy, approximately 98%, compared to other relevant algorithms in the literature. SBT-RF is compared to state-of-the-art algorithms based on the following performance metrics: confusion matrix, ROC-AUC, F1-score, training time, prediction speed, and accuracy.
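    A loose scikit-learn approximation of the idea, assuming synthetic data in place of BCI Competition III Dataset IVa: a random forest ranks feature importance, the top features are kept, and a bagged tree classifier is trained on the reduced set; the feature count and model sizes are illustrative, not the paper's SBT-RF settings.

```python
# Rank features with a random forest, keep the highest-ranked ones, then train
# a bagged tree classifier on the reduced set. Synthetic data stands in for
# BCI Competition III Dataset IVa; 118 mimics its channel count.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 118))        # assumed: 500 trials, 118 features
y = rng.integers(0, 2, size=500)       # two motor-imagery classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Feature ranking via random-forest importances.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:40]   # keep the 40 best features

# Bagging of decision trees (the default base estimator) on the selected features.
bag = BaggingClassifier(n_estimators=100, random_state=0).fit(X_tr[:, top], y_tr)
print("held-out accuracy:", accuracy_score(y_te, bag.predict(X_te[:, top])))
```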

    Image compression algorithms in wireless multimedia sensor networks: A survey

    Unlike classical wired networks and wireless sensor networks, wireless multimedia sensor networks (WMSNs) differ from their predecessor scalar networks mainly in the following points: the nature and size of the data being transmitted, the significant memory resources required, and the power consumed per node for processing and transmission. The most effective solution to overcome these problems is image compression. Because images contain a massive amount of redundancy resulting from the high correlation between pixels, many compression algorithms have been developed. The main objective of this survey is to study and analyze relevant research directions and the most recent image compression algorithms for WMSNs. The survey characterizes the benefits and shortcomings of recent efforts on such algorithms. Moreover, it identifies open research issues for each compression method and its potential for WMSNs. Reducing the consumed power, and thus extending network lifetime, is considered the main performance metric and is the main target of the investigated solutions.

    Fragmented protein sequence alignment using two-layer particle swarm optimization (FTLPSO)

    This paper presents a fragmented protein sequence alignment method using two-layer PSO (FTLPSO) to overcome the drawbacks of particle swarm optimization (PSO) and improve its performance in solving the multiple sequence alignment (MSA) problem. Standard PSO suffers from trapping in local optima and from its inability to produce good alignments for longer sequences. To overcome these problems, a fragmentation technique is first introduced to divide longer datasets into a number of fragments. A two-layer PSO algorithm, which is able to handle unconstrained optimization problems and increases the diversity of particles, is then applied to align each fragment. The proposed method is tested on BAliBASE benchmarks of different lengths. The numerical results are compared with CLUSTAL Omega, CLUSTAL W2, T-COFFEE, KALIGN, and DIALIGN-PFAM, showing that better alignment scores are achieved using the proposed FTLPSO technique. Further studies on the parameters of the PSO update equation and of the scoring functions used are also presented and discussed.
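    The sketch below shows only a generic PSO core of the kind FTLPSO builds on, applied to a simple continuous test function; the fragmentation step, the particle encoding for alignments, and the sum-of-pairs scoring from the paper are not reproduced, and all coefficients are assumed values.

```python
# Generic PSO loop: velocity update pulls each particle toward its personal
# best and the global best, weighted by inertia w and coefficients c1, c2.
import numpy as np

def pso(f, dim=10, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, (n_particles, dim))        # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

best, score = pso(lambda z: float(np.sum(z ** 2)))    # sphere test function
print("best objective value:", score)
```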

    I2OT-EC: A Framework for Smart Real-Time Monitoring and Controlling Crude Oil Production Exploiting IIoT and Edge Computing

    The oil and gas business has high operating costs and frequently faces significant difficulties due to asset, process, and operational failures. Remote monitoring and management of oil field operations are essential to ensure efficiency and safety. Oil field operations often use SCADA or wireless sensor network (WSN)-based monitoring and control systems; both have numerous drawbacks. WSN-based systems are not uniform and are often incompatible, and they lack transparent communication and coordination. SCADA systems are also costly, rigid, not scalable, and slow to deliver data. Edge computing and the Industrial Internet of Things (IIoT) help to overcome these constraints by establishing an automated monitoring and control system for oil and gas operations that is effective, secure, affordable, and transparent. The main objective of this study is to exploit IIoT and edge computing (EC). The study introduces an I2OT-EC framework with flowcharts, a simulator, and a system architecture. The validity of the I2OT-EC framework is demonstrated through experimental findings and an implementation with an application example, which serve as additional verification and testing and confirm that the framework results are satisfactory. The significant increase of 12.14% in the runtime of the crude well using the proposed framework, coupled with other advantages such as reduced operational costs, decentralization, and a dependable platform, highlights the benefits of this solution and its suitability for the automatic monitoring and control of oil field operations.
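    A minimal, self-contained sketch of the edge-side processing such a framework implies: sensor readings are filtered locally and only alarms plus a summary would be forwarded upstream; the threshold, field names, and window size are illustrative assumptions rather than details of the I2OT-EC framework.

```python
# Edge-side filtering: decide locally which readings become alarms and what
# summary is worth sending upstream, instead of streaming every raw sample.
import random
import statistics

PRESSURE_LIMIT_BAR = 180.0   # hypothetical wellhead pressure limit

def edge_cycle(readings):
    """Return (alarms, summary) for one monitoring window processed at the edge."""
    alarms = [p for p in readings if p > PRESSURE_LIMIT_BAR]
    summary = {
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
        "samples": len(readings),
    }
    return alarms, summary

window = [random.uniform(150, 200) for _ in range(60)]   # simulated sensor reads
alarms, summary = edge_cycle(window)
print(f"{len(alarms)} alarm(s); summary sent upstream: {summary}")
```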

    A Honey Badger Optimization for Minimizing the Pollutant Environmental Emissions-Based Economic Dispatch Model Integrating Combined Heat and Power Units

    Traditionally, the Economic Dispatch Model (EDM) integrating Combined Heat and Power (CHP) units aims to reduce fuel costs by managing power-only, CHP, and heat-only units. Today, reducing pollutant emissions to the environment is of paramount concern. This research presents a novel honey badger optimization algorithm (HBOA) for the EDM integrating CHP units. HBOA is a meta-heuristic search strategy inspired by the honey badger's sophisticated hunting behavior; its dynamic searching activity, which includes digging and honing, is separated into exploration and exploitation phases. In addition, several modern meta-heuristic optimization algorithms are employed for comparison: the African Vultures Algorithm (AVO), the Dwarf Mongoose Optimization Algorithm (DMOA), the Coot Optimization Algorithm (COA), and the Beluga Whale Optimization Algorithm (BWOA). These algorithms are applied to the seven-unit test system under various loading levels with different power and heat loads. Four cases are investigated for each loading level, differing in the objective function and in whether power losses are considered. Considering the pollutant emissions minimization objective without loss considerations, the proposed HBOA achieves reductions of 75.32%, 26.053%, and 87.233% for the three loading levels, respectively, compared to the baseline case. Similarly, it achieves reductions of 73.841%, 26.155%, and 92.595%, respectively, for the three loading levels when power losses are considered. Consequently, the proposed HBOA surpasses AVO, DMOA, COA, and BWOA when the objective is to minimize fuel expenditure, and it significantly reduces pollutant emissions compared to the baseline scenario.
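    As a sketch of the emission-minimization side of the dispatch model, the snippet below evaluates quadratic emission curves for three hypothetical power-only units under a power-balance penalty; a plain random search stands in for the honey badger optimizer, and all coefficients, limits, and the demand value are illustrative, not the seven-unit test system data.

```python
# Emission curves of the form alpha + beta*P + gamma*P^2, with a penalty for
# violating the power balance; CHP feasible-region constraints are omitted.
import numpy as np

rng = np.random.default_rng(1)

ALPHA = np.array([4.09, 2.54, 4.26])              # illustrative coefficients
BETA  = np.array([-5.55e-2, -6.05e-2, -5.09e-2])
GAMMA = np.array([6.49e-3, 5.64e-3, 4.59e-3])
P_MIN = np.array([10.0, 10.0, 10.0])
P_MAX = np.array([125.0, 150.0, 175.0])
DEMAND = 300.0                                    # MW, losses neglected here

def emissions(p):
    return float(np.sum(ALPHA + BETA * p + GAMMA * p ** 2))

def penalised(p):
    return emissions(p) + 1e3 * abs(float(np.sum(p)) - DEMAND)  # balance penalty

best, best_val = None, float("inf")
for _ in range(20000):                            # random search stand-in
    p = rng.uniform(P_MIN, P_MAX)
    val = penalised(p)
    if val < best_val:
        best, best_val = p, val

print("dispatch (MW):", np.round(best, 2), "| emissions:", round(emissions(best), 3))
```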

    A comparative study of soft computing methods to solve inverse kinematics problem

    Robot arms are essential tools in industry today due to their accuracy in high-speed manufacturing. One of the most challenging problems in industrial robotics is solving inverse kinematics. The inverse kinematics problem concerns finding the joint angle values corresponding to a desired Cartesian location. With the development of soft-computing-based methods, it has become easier to solve the inverse kinematics problem quickly and with sufficient accuracy compared with traditional numerical, geometric, and algebraic methods. This paper presents a comparative study of different soft-computing-based methods (Artificial Neural Network, Adaptive Neuro-Fuzzy Inference System, and Genetic Algorithms) applied to the inverse kinematics problem. With the help of a proposed method called the minimized error function, both ANN and ANFIS are able to outperform the other methods. The experimental tests are performed using a 5-DOF robot arm, and analysis of the results confirms the simulation results. Keywords: ANFIS, Forward kinematics, GA, Inverse kinematics, Meta-heuristic, NN, Robot arm, Soft-computing
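    A simplified stand-in for the soft-computing IK idea, assuming a 2-link planar arm instead of the paper's 5-DOF manipulator: forward-kinematics samples are generated and a neural network is trained to map end-effector positions back to joint angles; link lengths and network size are arbitrary choices.

```python
# Train an MLP to invert the forward kinematics of a 2-link planar arm.
import numpy as np
from sklearn.neural_network import MLPRegressor

L1, L2 = 1.0, 0.8                                  # hypothetical link lengths (m)
rng = np.random.default_rng(0)
theta = rng.uniform(0, np.pi, size=(5000, 2))      # sampled joint angles
x = L1 * np.cos(theta[:, 0]) + L2 * np.cos(theta[:, 0] + theta[:, 1])
y = L1 * np.sin(theta[:, 0]) + L2 * np.sin(theta[:, 0] + theta[:, 1])
XY = np.column_stack([x, y])                       # end-effector positions

ik_net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                      random_state=0).fit(XY, theta)

# Sanity check: predict angles for one target, then re-run forward kinematics.
target = XY[:1]
t_hat = ik_net.predict(target)[0]
x_hat = L1 * np.cos(t_hat[0]) + L2 * np.cos(t_hat[0] + t_hat[1])
y_hat = L1 * np.sin(t_hat[0]) + L2 * np.sin(t_hat[0] + t_hat[1])
print("position error (m):", np.hypot(x_hat - target[0, 0], y_hat - target[0, 1]))
```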

    Modelling and practical studying of heat recovery steam generator (HRSG) drum dynamics and approach point effect on control valves

    In this paper, we present a simple procedure for building a model of a heat recovery steam generator (HRSG) evaporator and drum within the MATLAB/Simulink environment. The HRSG is part of a combined cycle power plant located at Talkha power station (130 km north of Cairo, the capital of Egypt). The model captures the response of the water and steam inside the HRSG evaporator and drum, such as drum level, pressure, and steam quality, under different conditions. We discuss some practical concepts in HRSG design and the importance of the HRSG approach point value for drum level stability and control. We also discuss how the HRSG approach point can have a destructive effect on the level control valves of the drum and increase the maintenance cost of the combined cycle power plant. Keywords: Heat recovery steam generator, Evaporator, Model, Drum, Combined cycle
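    The snippet below is a deliberately simplified drum-level mass balance, standing in for the much richer Simulink evaporator/drum model described above; pressure, steam quality, and shrink/swell effects are not modelled, and every parameter value is illustrative.

```python
# Single mass balance: drum level rises or falls with the imbalance between
# feedwater inflow and steam draw-off, integrated over a step in feedwater.
import numpy as np
from scipy.integrate import solve_ivp

RHO = 750.0        # kg/m^3, saturated-water density (illustrative)
AREA = 8.0         # m^2, drum water surface area (illustrative)
STEAM_OUT = 12.0   # kg/s, constant steam draw-off

def feedwater(t):
    return 12.0 if t < 50 else 14.0        # step increase in feedwater flow

def level_dot(t, h):
    return (feedwater(t) - STEAM_OUT) / (RHO * AREA)

sol = solve_ivp(level_dot, (0.0, 200.0), [1.2], max_step=1.0)
print("final drum level (m):", round(float(sol.y[0, -1]), 3))
```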

    An Optimized Quadratic Support Vector Machine for EEG Based Brain Computer Interface

    The Brain Computer Interface (BCI) has a great impact on mankind. Many researchers have been trying to employ different classifiers to decode the human brain's thoughts accurately. To overcome the poor performance of a single classifier, some researchers use combined classifiers, while others remove redundant information from some channels before applying the classifier, since such redundancy may reduce classification accuracy. BCI helps clinicians learn more about brain problems and disabilities, such as stroke, for use in recovery. The main objective of this paper is to propose an optimized High-Performance Support Vector Machine based classifier (HPSVM-BCI) using SelectKBest (SKB). In the proposed HPSVM-BCI, the SKB algorithm is used to select the features of the BCI Competition III Dataset IVa subjects. Then, in the second phase, an SVM with a quadratic kernel (QSVM) is used to classify the data prepared in the previous phase. As well as enhancing the mean accuracy on the dataset, HPSVM-BCI reduces the computational cost and computation time. A major objective of this research is to improve the classification of the BCI dataset. Furthermore, a decreased feature count translates into fewer electrodes, a factor that reduces the risk to the human brain. Comparative studies have been conducted with recent models using the same dataset. The results obtained show that HPSVM-BCI has the highest average accuracy, 99.24% for each subject, with only 40 channels.
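    A pipeline-style sketch of the described approach using scikit-learn, with synthetic data in place of BCI Competition III Dataset IVa: SelectKBest filters the features, then an SVM with a quadratic (degree-2 polynomial) kernel classifies them; the value of k and the data shapes are assumptions.

```python
# SelectKBest feature selection followed by a quadratic-kernel SVM.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 118))        # assumed: 400 trials, 118 EEG channels
y = rng.integers(0, 2, size=400)       # two motor-imagery classes

qsvm = make_pipeline(
    SelectKBest(score_func=f_classif, k=40),   # keep the 40 highest-scoring features
    StandardScaler(),
    SVC(kernel="poly", degree=2),              # quadratic-kernel SVM
)
print("cross-validated accuracy:", cross_val_score(qsvm, X, y, cv=5).mean())
```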

    Brain Strategy Algorithm for Multiple Object Tracking Based on Merging Semantic Attributes and Appearance Features

    The human brain can effortlessly perform vision processes using the visual system, which helps solve multi-object tracking (MOT) problems. However, few algorithms simulate human strategies for solving MOT. Devising a method that simulates human activity in vision has therefore become a good choice for improving MOT results, especially under occlusion. Eight brain strategies have been studied from a cognitive perspective and imitated to build a novel algorithm. Two of these strategies gave our algorithm its novel and outstanding results: rescue saccades and stimulus attributes. First, rescue saccades were imitated by detecting the occlusion state in each frame, representing the critical situation that the human brain saccades toward. Then, stimulus attributes were mimicked by using semantic attributes to re-identify the person in these occlusion states. Our algorithm performs favourably on the MOT17 dataset compared to state-of-the-art trackers. In addition, we created a new dataset of 40,000 images, 190,000 annotations, and 4 classes to train the detection model to detect occlusion and semantic attributes. The experimental results demonstrate that this new dataset achieves outstanding performance with the Scaled-YOLOv4 detection model, reaching 0.89 mAP@0.5.
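    A toy illustration of the occlusion trigger behind the rescue-saccades idea: a frame is flagged as an occlusion state when two detected boxes overlap strongly; the box format and IoU threshold are assumptions for this sketch, not the paper's detector output.

```python
# Flag pairs of detections whose overlap suggests an occlusion state.
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter) if inter else 0.0

def occlusion_pairs(boxes, threshold=0.4):
    """Pairs of detections in one frame that overlap beyond the threshold."""
    return [(i, j) for i in range(len(boxes)) for j in range(i + 1, len(boxes))
            if iou(boxes[i], boxes[j]) > threshold]

frame = [(10, 10, 60, 120), (25, 12, 75, 122), (200, 20, 250, 130)]
print(occlusion_pairs(frame))   # -> [(0, 1)]
```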