53 research outputs found

    Adaptive power control aware depth routing in underwater sensor networks

    Underwater acoustic sensor networks (UASNs) enable a broad spectrum of aquatic applications, including seismic monitoring, ocean mine detection, resource exploration, pollution monitoring, and disaster avoidance. UASNs confront many difficulties, such as low bandwidth, node movement, long propagation delay, three-dimensional deployment, limited energy, and high production and deployment costs caused by the hostile underwater environment. Energy management is a major issue in underwater wireless sensor networks (UWSNs) because of the limited battery power of their nodes. Moreover, the harsh underwater environment requires vendors to design and deploy energy-hungry devices to fulfil the communication requirements and maintain an acceptable quality of service, while increased transmission power levels result in higher channel interference and therefore greater packet loss. Considering these facts, this research presents controlled-transmission-power, sparsity-aware, energy-efficient clustering for UWSNs. The contribution of this technique is threefold. First, it uses an adaptive power control mechanism to use the sensor nodes' batteries effectively and reduce channel interference. Second, thresholds are defined to ensure successful communication. Third, clustering is applied in dense areas to decrease the repetitive transmissions that significantly affect node energy consumption and interference. Additionally, mobile sinks are deployed to gather information locally. The suggested protocol is examined through extensive simulations and validated through comparison with other advanced UWSN strategies. The findings show that the suggested protocol outperforms existing approaches in terms of network lifetime and packet delivery ratio.
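    The abstract does not give the protocol's actual parameters, so the following is only a minimal Python sketch of the general idea of adaptive, threshold-based transmission power selection: a node picks the lowest power level whose estimated received level clears a reception threshold, using Thorp's empirical absorption formula for the acoustic path loss. The power levels, spreading factor, frequency, and threshold are illustrative assumptions, not the authors' settings.

```python
import math

# Illustrative power levels (dB re 1 uPa) and reception threshold; assumed
# values for the sketch, not the protocol's real parameters.
POWER_LEVELS_DB = [160.0, 170.0, 180.0, 190.0]
RX_THRESHOLD_DB = 120.0

def thorp_absorption_db_per_km(freq_khz: float) -> float:
    """Thorp's empirical absorption coefficient for underwater acoustics (dB/km)."""
    f2 = freq_khz ** 2
    return 0.11 * f2 / (1 + f2) + 44 * f2 / (4100 + f2) + 2.75e-4 * f2 + 0.003

def path_loss_db(distance_m: float, freq_khz: float, k: float = 1.5) -> float:
    """Spreading loss plus absorption loss over the given distance."""
    d_km = distance_m / 1000.0
    spreading = 10 * k * math.log10(max(distance_m, 1.0))
    absorption = thorp_absorption_db_per_km(freq_khz) * d_km
    return spreading + absorption

def select_tx_power(distance_m: float, freq_khz: float = 25.0) -> float:
    """Return the lowest power level whose received level clears the threshold."""
    for p in POWER_LEVELS_DB:
        if p - path_loss_db(distance_m, freq_khz) >= RX_THRESHOLD_DB:
            return p
    return POWER_LEVELS_DB[-1]  # fall back to maximum power

if __name__ == "__main__":
    for d in (100, 500, 1000, 3000):
        print(d, "m ->", select_tx_power(d), "dB")
```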

    Performance Evaluation of a B2C Model Based on Trust Requirements and Factors

    This paper evaluates the performance of a newly proposed B2C e-commerce model based on trust requirements and factors in the context of Saudi Arabia. Two categories of trust factors, governmental and nongovernmental, are identified to create the model for determining the feasibility of an efficient online business strategy in the Kingdom. Data are collected over a period of 10 weeks using the designed questionnaire and are then carefully analyzed and interpreted. The end user's standpoint is analyzed to determine the influence of the proposed trust requirements and factors on B2C e-commerce in Saudi Arabia. The reliability of the questionnaire items for each requirement and its factors is quantified using Cronbach's alpha. The questionnaire consists of three parts: a demographic component, questions related to the identified requirements, and additional notes as an open question. Questions are designed according to the requirements and factors of the trust model to demonstrate their possible relationship. Before distribution, the questionnaire's content validity is judged by expert lecturers with relevant specialization to ensure that it is well organized and easy to understand. The questionnaire is randomly distributed to 222 academic and administrative staff (female and male) and university students from the Faculty of Computer Science and Information System in Saudi Arabia; this random selection of 222 respondents supports the statistical validity of the sample. Adaptable government policies, legislation, regulations, protection of consumer rights, and a supportive banking network with lower Internet costs are shown to be significant for e-commerce expansion in the Kingdom. Implementing the proposed model is expected to strengthen consumer confidence and trust and to support e-commerce growth in Saudi Arabia.
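    The paper reports reliability via Cronbach's alpha but does not show the computation; below is a minimal Python sketch of how Cronbach's alpha is typically computed for a block of questionnaire items (rows are respondents, columns are items). The sample responses are invented for illustration.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance of total score)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

if __name__ == "__main__":
    # Invented 5-point Likert responses: 6 respondents x 4 items.
    demo = np.array([
        [4, 5, 4, 4],
        [3, 3, 4, 3],
        [5, 5, 5, 4],
        [2, 2, 3, 2],
        [4, 4, 4, 5],
        [3, 4, 3, 3],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(demo):.3f}")
```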

    Osteo-NeT: An Automated System for Predicting Knee Osteoarthritis from X-ray Images Using Transfer-Learning-Based Neural Networks Approach

    Knee osteoarthritis is a challenging problem affecting many adults around the world. There are currently no medications that cure knee osteoarthritis; the only way to control its progression is early detection. Currently, X-ray imaging is a central technique used for detecting osteoarthritis. However, manual reading of X-ray images is error-prone because it depends heavily on the radiologist's expertise. Recent studies have described automated systems based on machine learning for the effective prediction of osteoarthritis from X-ray images, but most of these techniques still need higher predictive accuracy to detect osteoarthritis at an early stage. This paper presents a method with higher predictive accuracy that can be employed in the real world for the early detection of knee osteoarthritis. We use transfer learning models based on a sequential convolutional neural network (CNN), Visual Geometry Group 16 (VGG-16), and Residual Neural Network 50 (ResNet-50) to detect osteoarthritis from knee X-ray images. In our analysis, all the models achieved predictive accuracy greater than 90%. The best-performing model was the pretrained VGG-16 model, which achieved a training accuracy of 99% and a testing accuracy of 92%.
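    As an illustration of the transfer-learning setup described above, here is a minimal Keras sketch that freezes a pretrained VGG-16 backbone and adds a binary classification head for knee X-ray images. The image size, directory layout, and training hyperparameters are assumptions for the example, not the paper's actual configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

IMG_SIZE = (224, 224)  # assumed input size; matches VGG-16's default

# Pretrained VGG-16 backbone with ImageNet weights, classification head removed.
base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False  # freeze convolutional layers for transfer learning

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # healthy vs. osteoarthritic
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])

# Hypothetical directory layout: knee_xrays/{train,val}/{normal,osteoarthritis}/*.png
preprocess = tf.keras.applications.vgg16.preprocess_input
train_ds = tf.keras.utils.image_dataset_from_directory(
    "knee_xrays/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "knee_xrays/val", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
train_ds = train_ds.map(lambda x, y: (preprocess(x), y))
val_ds = val_ds.map(lambda x, y: (preprocess(x), y))

model.fit(train_ds, validation_data=val_ds, epochs=10)
```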

    A Predictive Checkpoint Technique for Iterative Phase of Container Migration

    Cloud computing is a cost-effective way of delivering numerous services in Industry 4.0. The demand for dynamic cloud services is rising day by day, and because of this the volume of data in transit across the network is large. Virtualization is a key component of the cloud, and cloud servers may be physical or virtual. Containerized services are essential for reducing data transmission, cost, and time, among other things: containers are lightweight virtual environments that share the host operating system's kernel, and the majority of businesses are transitioning from virtual machines to containers. The major factor affecting migration performance is the amount of data transferred over the network, which has a direct impact on migration time, downtime, and cost. In this article, we propose a predictive iterative-dump approach that uses long short-term memory (LSTM) to anticipate which memory pages will be moved, thereby limiting data transmission during the iterative phase. In each iteration, the pages to be migrated to the destination host are shortlisted based on a predictive analysis of memory alterations: pages likely to be dirtied again are predicted from their alteration rate and discarded from the transfer. The results show that the suggested technique surpasses existing alternatives in overall migration time and amount of data transmitted, with a 49.42% decrease in migration time and a 31.0446% reduction in the amount of data transferred during the iterative phase.
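    To make the idea concrete, here is a minimal Keras sketch, under assumed shapes and thresholds, of an LSTM that takes a short history of per-page dirty bits and predicts the probability that each page will be dirtied again; pages above a threshold could be deferred in the current pre-copy round. The window length, threshold, and synthetic data are illustrative, not the paper's actual design.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 8            # assumed: number of past iterations observed per page
SKIP_THRESHOLD = 0.7  # assumed: pages above this re-dirty probability are deferred

# LSTM mapping a history of dirty bits (0/1 per iteration) to the probability
# that the page is dirtied again in the next iteration.
model = models.Sequential([
    layers.Input(shape=(WINDOW, 1)),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic training data standing in for traced page-dirty histories.
rng = np.random.default_rng(0)
histories = rng.integers(0, 2, size=(4096, WINDOW, 1)).astype("float32")
next_dirty = (histories[:, -3:, 0].mean(axis=1) > 0.5).astype("float32")
model.fit(histories, next_dirty, epochs=3, batch_size=64, verbose=0)

def pages_to_send(page_histories: np.ndarray) -> np.ndarray:
    """Return indices of pages worth sending now (unlikely to be re-dirtied)."""
    p_redirty = model.predict(page_histories, verbose=0).ravel()
    return np.where(p_redirty < SKIP_THRESHOLD)[0]

print(pages_to_send(histories[:16]))
```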

    A Novel Approach for Securing Nodes Using Two-Ray Model and Shadow Effects in Flying Ad-Hoc Network

    In recent decades, flying ad-hoc networks (FANETs) have provided unique capabilities in the field of unmanned aerial vehicles (UAVs). This work proposes an efficient algorithm for secure load balancing in a FANET, built from a combination of the firefly algorithm and radio propagation models. Two-ray and shadow-fading models are used to provide the optimal path and to improve data communication between nodes, which helps secure multiple UAVs in high-level applications. The performance of the proposed optimization technique is compared with existing approaches in terms of packet loss, throughput, end-to-end delay, and routing overhead. Simulation results show that the secure firefly algorithm combined with the radio propagation models achieves the lowest packet loss, the highest throughput, the lowest delay, and the lowest overhead compared with other existing techniques and models.
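    The abstract names the two-ray and shadow-fading propagation models without giving their equations; the sketch below shows the standard two-ray ground-reflection received-power formula with an optional zero-mean log-normal shadowing term added in dB. The antenna heights, gains, and shadowing standard deviation are assumed example values.

```python
import math
import random

def two_ray_rx_power_dbm(tx_power_dbm: float, distance_m: float,
                         ht_m: float = 10.0, hr_m: float = 1.5,
                         gt: float = 1.0, gr: float = 1.0,
                         shadow_sigma_db: float = 0.0) -> float:
    """Two-ray ground-reflection model: Pr = Pt * Gt * Gr * ht^2 * hr^2 / d^4,
    plus an optional log-normal shadow-fading term (in dB)."""
    pt_mw = 10 ** (tx_power_dbm / 10)
    pr_mw = pt_mw * gt * gr * (ht_m ** 2) * (hr_m ** 2) / (distance_m ** 4)
    pr_dbm = 10 * math.log10(pr_mw)
    if shadow_sigma_db > 0:
        pr_dbm += random.gauss(0.0, shadow_sigma_db)  # shadow fading
    return pr_dbm

if __name__ == "__main__":
    for d in (100, 500, 1000):
        print(d, "m:", round(two_ray_rx_power_dbm(20.0, d, shadow_sigma_db=4.0), 1), "dBm")
```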

    Patron–Prophet Artificial Bee Colony Approach for Solving Numerical Continuous Optimization Problems

    The swarm-based Artificial Bee Colony (ABC) algorithm has a significant range of applications and is competitive with other algorithms on many optimization problems. However, the ABC's performance towards the global optimum in higher-dimensional settings is not on par with other models, because it struggles to balance intensification and diversification. In this research, two strategies are applied to improve the ABC's search capability in a multimodal search space. The first strategy, Patron–Prophet, is applied in the scout bee phase to introduce a cooperative behaviour; it works on a donor–acceptor concept. In addition, a self-adaptability approach is included to balance intensification and diversification, which helps the ABC search for optimal solutions without premature convergence. The first strategy explores unexplored regions with better insight, and deeper intensification occurs in the discovered areas. The second strategy guards against becoming trapped in local optima and against diversification without sufficient intensification. The proposed model, named PP-ABC, was evaluated on mathematical benchmark functions to prove its efficiency in comparison with other existing models, and standard and statistical analyses show that the proposed algorithm outperforms the compared techniques. The proposed model was also applied to a three-bar truss engineering design problem to validate its efficacy, and the results were recorded.
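    The abstract does not spell out the Patron–Prophet update rule, so the sketch below only illustrates the standard ABC scout phase that it modifies: an abandoned food source is normally replaced by a uniform random point, and a donor–acceptor style variant (an assumption for illustration, not the authors' exact rule) instead biases the new point towards a donor solution such as the current best.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_scout(lower: np.ndarray, upper: np.ndarray) -> np.ndarray:
    """Standard ABC scout phase: replace an abandoned source uniformly at random."""
    return lower + rng.random(lower.shape) * (upper - lower)

def donor_acceptor_scout(abandoned: np.ndarray, donor: np.ndarray,
                         lower: np.ndarray, upper: np.ndarray) -> np.ndarray:
    """Illustrative donor-acceptor style scout (assumption, not the paper's rule):
    the acceptor moves partway towards the donor, plus a small perturbation."""
    mix = rng.random(abandoned.shape)  # per-dimension acceptance ratio
    candidate = abandoned + mix * (donor - abandoned)
    candidate += 0.1 * (upper - lower) * rng.standard_normal(abandoned.shape)
    return np.clip(candidate, lower, upper)

if __name__ == "__main__":
    lo, hi = np.full(5, -5.0), np.full(5, 5.0)
    abandoned = random_scout(lo, hi)  # a stagnated food source
    best = np.zeros(5)                # pretend the global best is at the origin
    print(donor_acceptor_scout(abandoned, best, lo, hi))
```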

    Oppositional Pigeon-Inspired Optimizer for Solving the Non-Convex Economic Load Dispatch Problem in Power Systems

    Economic Load Dispatch (ELD) is a non-convex optimization problem that aims to reduce the total power generation cost while satisfying the demand constraints. Solving the ELD problem is challenging because of its equality and inequality constraints. The Pigeon-Inspired Optimizer (PIO) is a recently proposed swarm intelligence algorithm; it has the benefit of conceptual simplicity and provides good outcomes for various real-world problems, but it suffers from premature convergence and local stagnation. Therefore, we propose an Oppositional Pigeon-Inspired Optimizer (OPIO) algorithm to overcome these deficiencies. The proposed algorithm employs Oppositional-Based Learning (OBL) to enhance the quality of individuals by exploring the global search space. It determines the load dispatch of a power system while sustaining the various equality and inequality constraints, in order to diminish the overall generation cost. In this work, the OPIO algorithm was applied to solve the ELD problem for small- (13-unit, 40-unit), medium- (140-unit, 160-unit) and large-scale (320-unit, 640-unit) test systems. The experimental results demonstrate the efficiency of the proposed OPIO algorithm over the conventional PIO algorithm and other state-of-the-art approaches in the literature, providing improved accuracy, a higher convergence rate, less computation time, and reduced fuel cost.
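    Oppositional-Based Learning itself is a simple, well-defined idea: for each candidate x in [a, b], also evaluate its opposite a + b − x and keep the fitter of the two. The sketch below shows this for an OBL-enhanced population initialization on a toy quadratic fuel-cost function; the cost coefficients and generator bounds are illustrative, not the test systems used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative 3-unit quadratic fuel cost: F = sum(a_i * P_i^2 + b_i * P_i + c_i)
A = np.array([0.008, 0.010, 0.012])
B = np.array([7.0, 6.5, 8.0])
C = np.array([200.0, 180.0, 140.0])
P_MIN = np.array([50.0, 40.0, 30.0])
P_MAX = np.array([300.0, 250.0, 200.0])

def fuel_cost(p: np.ndarray) -> np.ndarray:
    return (A * p ** 2 + B * p + C).sum(axis=-1)

def obl_initialize(pop_size: int) -> np.ndarray:
    """Oppositional-Based Learning initialization: generate a random population,
    form the opposite population x_opp = P_MIN + P_MAX - x, and keep the best
    pop_size individuals from the union of the two."""
    pop = P_MIN + rng.random((pop_size, 3)) * (P_MAX - P_MIN)
    opposite = P_MIN + P_MAX - pop
    union = np.vstack([pop, opposite])
    best_idx = np.argsort(fuel_cost(union))[:pop_size]
    return union[best_idx]

if __name__ == "__main__":
    population = obl_initialize(pop_size=10)
    print("best initial cost:", fuel_cost(population).min().round(2))
```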

    Deep Learning Approach for the Detection of Noise Type in Ancient Images

    Recent innovations in digital image capturing techniques facilitate the capture of stationary and moving objects, and images can easily be captured with high-end digital cameras, mobile phones, and other handheld devices. Most of the time, the captured images differ from the actual objects: they may be contaminated by dark or grey shades and undesirable black spots. There are various reasons for such contamination, including atmospheric conditions, limitations of the capturing device, and human error. Various mechanisms exist to process an image and clean up the contamination so that it matches the original. Image processing applications primarily require accurate detection of the noise type, which is then used as input for image restoration. Filtering techniques, fractional differential gradient methods, and machine learning techniques are available to detect and identify the type of noise; these methods rely primarily on the image content and the spatial-domain information of the given image. Deep learning (DL) can be trained to mimic human intelligence in recognizing image patterns, audio, and text with high accuracy, and a deep learning framework enables correct processing of multiple images for object identification and quick decisions without human intervention. Here, a Convolutional Neural Network (CNN) model is implemented to detect and identify the type of noise in a given image. After multiple internal iterations to optimize the results, the identified noise is classified with 99.25% accuracy using the Proposed System Architecture (PSA), compared with AlexNet, Yolo V5, Yolo V3, RCNN, and CNN. The proposed model proved suitable for the classification of mural images on every performance parameter: the precision, accuracy, f1-score, and recall of the PSA are 98.50%, 99.25%, 98.50%, and 98.50%, respectively. This study contributes to the recovery of mural art.
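    The abstract describes a CNN-based noise-type classifier without giving its architecture; the following is a minimal Keras sketch of such a classifier for a handful of common noise types (Gaussian, salt-and-pepper, speckle, Poisson). The class list, input size, layer sizes, and directory layout are assumptions for illustration, not the paper's PSA.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed noise classes and input size for the sketch.
NOISE_CLASSES = ["gaussian", "salt_and_pepper", "speckle", "poisson"]
IMG_SIZE = (128, 128)

model = models.Sequential([
    layers.Input(shape=IMG_SIZE + (1,)),   # grayscale patches
    layers.Rescaling(1.0 / 255),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(len(NOISE_CLASSES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical directory layout: noisy_patches/{train,val}/{gaussian,...}/*.png
train_ds = tf.keras.utils.image_dataset_from_directory(
    "noisy_patches/train", image_size=IMG_SIZE, color_mode="grayscale",
    batch_size=32, class_names=NOISE_CLASSES)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "noisy_patches/val", image_size=IMG_SIZE, color_mode="grayscale",
    batch_size=32, class_names=NOISE_CLASSES)

model.fit(train_ds, validation_data=val_ds, epochs=10)
```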
