24 research outputs found
Analysis of Blackhole Attack in AODV and DSR
Mobile Ad-Hoc Networks (MANETs) are autonomous, decentralized wireless systems. MANETs are infrastructure-less: their topology is not fixed, and nodes can move freely and leave the network at any time. Each node acts as both a router and a host. Because the configuration is not fixed, any node can communicate with any other node, and nodes exchange packets with each other to establish connections. To establish links, nodes use routing protocols such as Ad-Hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR), and Destination-Sequenced Distance Vector (DSDV). Security is a key issue for the basic operation of a MANET, and many attacks can occur in such networks. One of these is the blackhole attack, in which a malicious node advertises itself as having the optimal route to the destination and then drops all packets instead of forwarding them onward. Here, we demonstrate the blackhole attack in AODV and DSR. Through simulation, we evaluate the performance of the two protocols under the blackhole attack
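The mechanism described above can be sketched in a few lines. This is a minimal illustration, not the paper's simulation: the node names, dictionary fields, and packet counts are hypothetical, but the route-selection rule (AODV prefers the reply with the highest destination sequence number) is why the blackhole wins.

```python
# Illustrative sketch of a blackhole attack on AODV-style route selection.
# All names and values below are hypothetical; only the selection rule
# (highest destination sequence number wins) reflects AODV's behaviour.

def select_route(replies):
    """Pick the route reply with the freshest (highest) destination sequence number."""
    return max(replies, key=lambda r: r["dest_seq"])

def deliver(packets, route, blackhole_nodes):
    """Count packets that survive the chosen next hop."""
    if route["next_hop"] in blackhole_nodes:
        return 0  # the blackhole silently drops every data packet
    return len(packets)

replies = [
    {"next_hop": "B", "dest_seq": 12},   # honest node with a genuine route
    {"next_hop": "M", "dest_seq": 999},  # malicious node advertising an inflated sequence number
]
route = select_route(replies)            # the blackhole wins route selection
delivered = deliver(range(100), route, blackhole_nodes={"M"})
```

Because the malicious reply always carries the largest sequence number, the source routes all traffic through it and the delivery ratio collapses to zero, which is exactly the degradation the simulations measure.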
A New Approach for DDoS attacks to discriminate the attack level and provide security for DDoS nodes in MANET
Mobile Ad Hoc Networks (MANETs) enable mobile hosts to form a communication network without a pre-existing infrastructure. MANETs play an essential role in military applications, as they are designed for on-demand requirements and for situations where setting up a physical network is not possible. Although this provides high flexibility, it also makes MANETs more vulnerable to malicious attacks. At the same time, the mobility and redundancy of these networks inspire new defense strategies. In this paper, we propose a technique to mitigate DDoS attacks in MANETs. We assume that a malicious attacker typically targets specific victims, and will give up if the attack fails to achieve its goals after a certain duration. In our protection mechanism, we exploit the network's high redundancy and select a protection node. Once a DDoS attack has been identified, the suspicious traffic is diverted to the protection node. The victim continues to operate normally, and it is reasonable to expect that the attacker will abandon its futile attempts. Through intensive simulation experiments using NS-2, we have verified the effectiveness of our approach and evaluated the cost and overhead of the framework
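The diversion idea can be illustrated with a small sketch. This is not the paper's NS-2 implementation: the flow fields, the rate threshold, and the node names are assumptions; it only shows the core policy of splitting traffic and steering the suspicious share to a protection node.

```python
# Hypothetical sketch of the protection-node diversion policy.
# Flow records, the rate threshold, and node names are illustrative
# stand-ins for whatever detector the real system uses.

def classify_flows(flows, threshold):
    """Split flows into suspicious (rate above threshold) and normal."""
    suspicious = [f for f in flows if f["pkts_per_s"] > threshold]
    normal = [f for f in flows if f["pkts_per_s"] <= threshold]
    return suspicious, normal

def route_flows(flows, victim, protection_node, threshold):
    """Divert suspicious traffic to the protection node; serve the rest normally."""
    suspicious, normal = classify_flows(flows, threshold)
    return {victim: normal, protection_node: suspicious}

flows = [
    {"src": "client1", "pkts_per_s": 5},     # legitimate traffic
    {"src": "botnet",  "pkts_per_s": 5000},  # flood traffic
]
routed = route_flows(flows, victim="server", protection_node="guard", threshold=100)
```

The victim keeps serving normal flows while the flood is absorbed elsewhere, which is the behaviour the abstract argues discourages the attacker.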
Internet of things and multi-class deep feature-fusion based classification of tomato leaf disease
A deep transfer learning (deep-TL) classification model is proposed to diagnose tomato leaf disease. The main cause of inaccurate classification by a convolutional neural network (CNN) model is the availability of only a small dataset; this model addresses the challenges of small and imbalanced datasets. The proposed Alex support vector machine (SVM) fused hybrid classification (ASFHC) model is based on a feature-fusion technique that avoids overfitting when classifying the type of disease in tomato leaves. The model achieves its best accuracy through data augmentation of the training data. It uses a pre-trained network for feature extraction, modifying the architecture by concatenating the two fully connected layers FC6 and FC7, with a linear SVM classifier for disease classification. A notable aspect of the work is that the model achieves maximum performance even though the dataset is imbalanced. Compared with VGG 16 and VGG 19, the proposed ASFHC model has been evaluated using several metrics, showing a computation time remarkable enough for deployment in the Internet of Things (IoT) domain. The overall accuracy attained by the model is 99.62%
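The fusion-plus-SVM idea reduces to two small operations, sketched below. This is not the paper's trained model: the tiny feature vectors and SVM weights are made-up stand-ins for real FC6/FC7 activations and learned parameters; only the structure (concatenate two layers' features, then apply a linear decision function) mirrors the described pipeline.

```python
# Minimal sketch of feature fusion followed by a linear SVM decision.
# The vectors and weights are hypothetical; real FC6/FC7 activations
# are 4096-dimensional and the SVM weights come from training.

def fuse(fc6, fc7):
    """Concatenate the two fully connected layers' activations into one descriptor."""
    return list(fc6) + list(fc7)

def linear_svm_predict(w, b, x):
    """Sign of the linear decision function w.x + b."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

descriptor = fuse([1.0, -2.0], [0.5])        # fused FC6 + FC7 features
label = linear_svm_predict([1.0, 1.0, 1.0], 0.0, descriptor)
```

Fusing the two layers lets the classifier see both the more generic (FC6) and more task-specific (FC7) representations at once, which is what the abstract credits for robustness on the small, imbalanced dataset.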
Energy Efficient MANET Protocol Using Cross Layer Design for Military Applications
In military applications, mobile ad hoc networks play a very important role because they are specifically designed for on-demand requirements and for situations where setting up a physical network is not possible. This special type of infrastructure-less network tactfully handles serious challenges such as highly dynamic military workstations, devices, and smaller sub-networks on the battlefield. There is therefore high demand for efficient routing protocols that ensure security and reliability for the successful transmission of highly sensitive and confidential military information in defence networks. With this objective, a power-efficient network-layer routing protocol for military applications is designed and simulated using a new cross-layer design approach to substantially increase reliability and network lifetime.
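One common building block of energy-efficient routing, which a cross-layer design could feed with physical-layer battery readings, is max-min route selection: prefer the route whose weakest node has the most residual energy. The sketch below is an assumption-laden illustration of that generic idea, not the protocol the paper designs; node names and energy values are invented.

```python
# Generic max-min energy-aware route selection (illustrative only;
# the paper's actual cross-layer protocol is not specified here).
# `energy` maps node -> residual battery fraction, values invented.

def bottleneck_energy(route, energy):
    """A route is only as strong as its most depleted node."""
    return min(energy[n] for n in route)

def select_route(routes, energy):
    """Pick the route whose bottleneck node has the most residual energy."""
    return max(routes, key=lambda r: bottleneck_energy(r, energy))

energy = {"A": 0.9, "B": 0.2, "C": 0.8, "D": 0.7}
routes = [["A", "B"], ["C", "D"]]
best = select_route(routes, energy)  # avoids the route through nearly-dead B
```

Avoiding routes through nearly-exhausted nodes spreads the drain across the network, which is how such schemes extend overall network lifetime.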
IoT-Fog-Edge-Cloud Computing Simulation Tools: A Systematic Review
The Internet of Things (IoT) promises substantial advancements in sectors such as smart homes and infrastructure, smart health, smart environmental monitoring, smart cities, energy, transportation and mobility, manufacturing and retail, and farming. Cloud computing (CC) offers appealing computation and storage options; nevertheless, cloud-based solutions often come with drawbacks and constraints such as energy consumption, latency, privacy, and bandwidth. To address these shortcomings, Fog Computing (FC) and Edge Computing (EC) were introduced later on. FC is a novel, developing technology that connects the cloud to the network edge, allowing for decentralized computation. EC, in which processing and storage are performed nearer to where data is created, can help address these issues by satisfying particular needs such as low latency or lower energy use. This study provides a comprehensive overview and analysis of IoT-Fog-Edge-Cloud computing simulation tools to assist researchers and developers in selecting the appropriate tool for their studies while working through various scenarios and addressing real-world challenges. The study also closely examines and contrasts various modeling tools to guide future work
A Smart Waste Management System Framework Using IoT and LoRa for Green City Project
Waste management is a pressing concern for society, requiring substantial labor resources and impacting various social aspects. Green cities strive for a net zero-carbon footprint, which includes efficient waste management. A waste management system must deal with three interrelated problems: a) timely checking of bin status to prevent overflow; b) tracking the precise location of bins; and c) finding the optimal route to the filled bins. Existing systems fail to address all three problem areas with a single solution, and are not efficient at determining the exact fill status of a bin together with its precise location. To track bin overflow, the proposed model uses ultrasonic sensors, complemented by LoRa to transmit the exact location of the bins in real time. The proposed model then optimizes waste collection by using the Floyd-Warshall algorithm to determine the shortest path. Leveraging low-cost IoT technologies, specifically LoRa modules for data transfer, our solution offers benefits such as simplicity, affordability, and ease of replacement. By employing the Floyd-Warshall algorithm, with a time complexity of O(n^3), our method efficiently determines the optimal waste pickup route, saving time and resources. This study presents a smart waste management solution utilising Arduino UNO microcontrollers, ultrasonic sensors, and LoRaWAN to measure waste levels accurately. The proposed strategy aims to create clean and pollution-free cities by addressing the waste distribution problems caused by poor collection techniques
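The routing step named above is standard and can be shown directly. Floyd-Warshall computes all-pairs shortest paths in O(n^3); once the distance matrix between the depot and the filled bins is known, the shortest pickup legs can be read off it. The graph below (a depot and two bins with made-up road distances) is purely illustrative.

```python
# Floyd-Warshall all-pairs shortest paths, as used for the pickup route.
# Node 0 is a hypothetical depot; nodes 1 and 2 are filled bins.
INF = float("inf")

def floyd_warshall(dist):
    """Return the all-pairs shortest-path matrix for an n x n distance matrix.

    dist[i][j] is the direct road distance (INF if no direct road).
    Runs in O(n^3) time, matching the complexity stated in the abstract.
    """
    n = len(dist)
    d = [row[:] for row in dist]  # don't mutate the caller's matrix
    for k in range(n):            # allow k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

roads = [
    [0,   4,   7],   # depot -> bin1 = 4, depot -> bin2 = 7 (direct)
    [4,   0,   1],   # bin1 -> bin2 = 1
    [7,   1,   0],
]
shortest = floyd_warshall(roads)  # depot -> bin2 via bin1 costs 5, not 7
```

Here the truck reaches bin 2 via bin 1 for a cost of 5 instead of taking the direct road of cost 7, which is exactly the saving the shortest-path step provides.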
Dynamic Task Migration for Enhanced Load Balancing in Cloud Computing using K-means Clustering and Ant Colony Optimization
Cloud computing efficiently allocates resources, and timely execution of user tasks is pivotal for ensuring seamless service delivery. Central to this endeavour is the dynamic orchestration of task scheduling and migration, which collectively contribute to load balancing within virtual machines (VMs). Load balancing is a cornerstone, empowering clouds to fulfill user requirements promptly. To facilitate the migration of tasks, we propose a novel method that exploits the synergistic potential of K-means clustering and Ant Colony Optimization (ACO). Our approach aims to improve the cloud ecosystem along several critical dimensions, such as the system's makespan, resource utilization efficiency, and workload imbalance mitigation. The core objective of our work revolves around the reduction of makespan, a metric directly tied to overall system performance. By strategically employing K-means clustering, we effectively group tasks with similar attributes, enabling the identification of prime candidates for migration. Subsequently, the ACO algorithm takes the reins, orchestrating the migration process with an inherent focus on achieving global optimization. The multifaceted benefits of our approach are quantitatively assessed through comprehensive comparisons with established algorithms, namely Round Robin (RR), First-Come-First-Serve (FCFS), Shortest Job First (SJF), and a genetic load balancing algorithm. To facilitate this evaluation, we harness the capabilities of the CloudSim simulation tool, which provides a platform for realistic and accurate performance analysis. Our research enhances cloud computing paradigms by harmonizing task migration with innovative optimization techniques. The proposed approach demonstrates its prowess in harmonizing diverse goals: reducing makespan, elevating resource utilization efficiency, and attenuating the degree of workload imbalance.
These outcomes collectively pave the way for a more responsive and dependable cloud infrastructure primed to cater to user needs with heightened efficacy. Our study delves into the intricate domain of cloud-based task scheduling and migration. By synergizing K-means clustering and ACO algorithms, we introduce a dynamic methodology that refines cloud resource management and bolsters the quintessential facet of load balancing. Through rigorous comparisons and meticulous analysis, we underscore the superior attributes of our approach, showcasing its potential to reshape the landscape of cloud computing optimization
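The first stage of the pipeline (cluster tasks by length, then flag the heavy cluster as migration candidates) can be sketched compactly. This is an assumption-laden illustration, not the paper's implementation: tasks are reduced to scalar lengths, the values are invented, and the subsequent ACO placement step is omitted entirely.

```python
# Stage 1 of the described pipeline: K-means (Lloyd's algorithm) on scalar
# task lengths, then select the heaviest cluster as migration candidates.
# Task lengths are hypothetical; the ACO placement stage is not shown.

def kmeans_1d(values, k, iters=20):
    """Cluster scalar values into k groups; return (labels, centers)."""
    lo, hi = min(values), max(values)
    # spread initial centers evenly across the value range
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)] if k > 1 else [lo]
    labels = [0] * len(values)
    for _ in range(iters):
        # assignment step: nearest center
        labels = [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]
        # update step: move each center to the mean of its members
        for c in range(k):
            members = [v for v, lab in zip(values, labels) if lab == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels, centers

def migration_candidates(values, labels, centers):
    """Tasks in the heaviest cluster are candidates for migration off loaded VMs."""
    heavy = max(range(len(centers)), key=lambda c: centers[c])
    return [v for v, lab in zip(values, labels) if lab == heavy]

task_lengths = [1, 2, 1.5, 10, 11, 9]       # invented workload, in MI or seconds
labels, centers = kmeans_1d(task_lengths, k=2)
heavy_tasks = migration_candidates(task_lengths, labels, centers)
```

The long tasks surface as one cluster, and it is those tasks the ACO stage would then place onto under-utilized VMs to shrink the makespan.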
Editorial
Emergence of data communication networks and the evolution of global data communication via the Internet have provided a potential platform for researchers around the globe to disseminate their research findings to the global community
Challenges and Solution for Identification of Plant Disease Using Machine Learning & IoT
The Internet of Things (IoT) is a groundbreaking technology that has been introduced in the field of agriculture to improve the quality and quantity of food production. As agriculture plays a vital role in feeding most of the world's population, the increasing demand for food has led to a rise in food grain production. The identification of plant diseases is a critical task for farmers and agronomists, as it enables them to take proactive measures to prevent the spread of diseases, protect crops, and maximize yields. Traditional methods of plant disease detection involve visual inspections by experts, which can be time-consuming and subject to human error. However, with technological advancements, IoT and Machine Learning (ML) have emerged as promising solutions for automating and improving plant disease identification. This paper explores the challenges and solutions for identifying plant diseases using IoT and ML. The challenges discussed include data collection, quality, scalability, and interpretability. The proposed solutions include using sensor networks, data pre-processing techniques, transfer learning, and explainable AI
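Of the solutions listed, the pre-processing step is the easiest to make concrete. The sketch below shows one generic example, a moving-average filter over raw field-sensor readings before they are fed to a model; the window size and readings are invented, and this is only one of many pre-processing techniques the paper could mean.

```python
# A generic pre-processing step for noisy IoT field-sensor data:
# a simple moving-average filter. Window size and readings are illustrative.

def moving_average(readings, window=3):
    """Smooth a sequence of sensor readings with a sliding mean.

    Returns len(readings) - window + 1 smoothed values.
    """
    return [
        sum(readings[i:i + window]) / window
        for i in range(len(readings) - window + 1)
    ]

humidity = [62, 80, 61, 63, 95, 64]          # raw readings with spikes
smoothed = moving_average(humidity, window=3)  # spikes are damped before ML
```

Smoothing out transient spikes keeps sensor noise from being mistaken for disease-relevant signal, addressing the data-quality challenge the paper raises.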