9 research outputs found

    DATA REPLICATION IN DISTRIBUTED SYSTEMS USING OLYMPIAD OPTIMIZATION ALGORITHM

    Achieving timely access to data objects is a major challenge in big distributed systems such as Internet of Things (IoT) platforms. Minimizing data read and write times in distributed systems has therefore become a high priority for system designers and mechanical engineers. Replication, together with the appropriate placement of replicas on the most accessible data servers, is an NP-complete optimization problem. The key objectives of the current study are to minimize data access time, reduce the number of replicas, and improve data availability. This paper employs the Olympiad Optimization Algorithm (OOA), a novel population-based, discrete heuristic algorithm, to solve the replica placement problem; the algorithm is also applicable to other fields, such as mechanical and computer engineering design problems. The algorithm is inspired by the learning process of student groups preparing for Olympiad exams. The proposed algorithm, which is divide-and-conquer based with local and global search strategies, was used to solve the replica placement problem in a standard simulated distributed system. The European Union Database (EUData), which contains 28 server nodes connected in a complete-graph topology, was employed to evaluate the proposed algorithm. The results show that the proposed technique reduces data access time by 39% with around six replicas, substantially outperforming earlier methods. Moreover, the standard deviation across different executions of the algorithm is approximately 0.0062, lower than that of the other techniques in the same experiments.
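The OOA itself is specific to the paper, but the objective it optimizes can be illustrated with a simple greedy baseline: choose k replica servers so that the average distance from every node to its nearest replica is minimized. The distance matrix and node count below are illustrative assumptions, not the EUData setup, and greedy selection stands in for the paper's population-based search:

```python
def avg_access_time(dist, replicas):
    # mean, over all nodes, of the distance to the nearest replica
    return sum(min(dist[u][r] for r in replicas)
               for u in range(len(dist))) / len(dist)

def greedy_replica_placement(dist, k):
    # repeatedly add the server that most reduces average access time
    replicas = []
    candidates = set(range(len(dist)))
    for _ in range(k):
        best = min(candidates - set(replicas),
                   key=lambda c: avg_access_time(dist, replicas + [c]))
        replicas.append(best)
    return replicas
```

On a 4-node line topology, two replicas placed this way halve the average access time relative to a single central replica; the OOA aims at the same objective with a discrete, Olympiad-inspired search instead of greedy selection.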

    Heart failure patients monitoring using IoT-based remote monitoring system

    Intelligent health monitoring systems are becoming more important and popular as technology advances. Online services are replacing physical infrastructure in several domains, including medical services, and the COVID-19 pandemic has also changed the way medical services are delivered. Intelligent appliances, smart homes, and smart medical systems are some of the emerging concepts. The Internet of Things (IoT) has changed how communication occurs, with data collection aided by smart sensors, and it supports artificial intelligence (AI) methods for better decision-making through efficient data collection, storage, retrieval, and management. This research presents a health monitoring system for heart patients using IoT and AI-based solutions. Activities of heart patients are monitored and reported using the IoT system. For heart disease prediction, an ensemble model, ET-CNN, is presented, which achieves an accuracy score of 0.9524. The experimental data for this system are very encouraging for real-time reporting and for classifying heart patients with high accuracy.
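The abstract does not specify how the ET-CNN ensemble combines its members; one common combination scheme for such ensembles is soft voting, averaging the class probabilities each member outputs. This is a generic sketch under that assumption, not the paper's model:

```python
def soft_vote(prob_lists):
    # prob_lists: one probability vector per ensemble member, all same length.
    # Average probabilities per class, then return the index of the winner.
    n = len(prob_lists)
    avg = [sum(member[i] for member in prob_lists) / n
           for i in range(len(prob_lists[0]))]
    return avg.index(max(avg))
```

With, say, an Extra Trees member and two CNN members each emitting [p_healthy, p_disease], the averaged vector decides the final class.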

    Context-Aware Wireless Connectivity and Processing Unit Optimization for IoT Networks

    A novel approach is presented in this work for context-aware connectivity and processing optimization of Internet of Things (IoT) networks. Unlike state-of-the-art approaches, the proposed approach simultaneously selects the best connectivity and processing unit (e.g., device, fog, or cloud), along with the percentage of data to be offloaded, by jointly optimizing energy consumption, response time, security, and monetary cost. The proposed scheme employs a reinforcement learning algorithm and achieves significant gains compared to deterministic solutions. In particular, the requirements of IoT devices in terms of response time and security are taken as inputs, along with the remaining battery level of the devices, and the developed algorithm returns an optimized policy. The results show that only our method meets the holistic multi-objective optimization criteria, whereas the benchmark approaches may achieve better results on a particular metric at the cost of failing to reach the other targets. Thus, the proposed approach is a device-centric, context-aware solution that accounts for monetary and battery constraints.
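The paper's reinforcement learning formulation is not reproduced here; as a minimal sketch, processing-unit selection can be framed as a single-state bandit where each action (device, fog, cloud) yields a scalar reward folding the multi-objective trade-off into one number. The reward values, action names, and epsilon-greedy strategy below are all illustrative assumptions:

```python
import random

def epsilon_greedy_offload(reward_fn, actions, episodes=500, eps=0.1, seed=0):
    # Bandit-style selection: explore with probability eps, otherwise exploit
    # the action with the best running mean reward.
    rng = random.Random(seed)
    q = {a: 0.0 for a in actions}   # running mean reward per action
    n = {a: 0 for a in actions}     # times each action was taken
    for _ in range(episodes):
        if rng.random() < eps:
            a = rng.choice(actions)
        else:
            a = max(actions, key=lambda x: q[x])
        r = reward_fn(a, rng)
        n[a] += 1
        q[a] += (r - q[a]) / n[a]   # incremental mean update
    return max(actions, key=lambda x: q[x])
```

A deterministic policy would commit to whichever unit looks best under a fixed weighting; the learned policy instead converges on the unit whose observed rewards (combining energy, latency, security, and cost) are best on average.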

    A Survey on Behavioral Pattern Mining from Sensor Data in Internet of Things

    The deployment of large-scale wireless sensor networks (WSNs) for Internet of Things (IoT) applications is increasing day by day, especially with the emergence of smart city services. The sensor data streams generated by these applications are largely dynamic, heterogeneous, and often geographically distributed over large areas. For high-value use in business, industry, and services, these data streams must be mined to extract insightful knowledge, for instance for monitoring (e.g., discovering certain behaviors over a deployed area) or network diagnostics (e.g., predicting faulty sensor nodes). However, due to the inherent constraints of sensor networks and application requirements, traditional data mining techniques cannot be directly used to mine IoT data streams efficiently and accurately in real time. In the last decade, a number of works have been reported in the literature proposing behavioral pattern mining algorithms for sensor networks. This paper presents the technical challenges that need to be considered for mining sensor data. It then provides a thorough review of the mining techniques proposed in the recent literature to mine behavioral patterns from sensor data in IoT; their characteristics and differences are highlighted and compared. We also propose a behavioral pattern mining framework for IoT and discuss possible future research directions in this area. © 2013 IEEE

    Advancements in intrusion detection: A lightweight hybrid RNN-RF model

    Computer networks are vulnerable to numerous attacks, which pose significant threats to data security and the freedom of communication. This paper introduces a novel intrusion detection technique that diverges from traditional methods by leveraging Recurrent Neural Networks (RNNs) for both data preprocessing and feature extraction. The proposed process is based on the following steps: (1) training on the data using RNNs, (2) extracting features from their hidden layers, and (3) applying various classification algorithms. This methodology offers significant advantages and differs greatly from existing intrusion detection practices. The effectiveness of our method is demonstrated through trials on the Network Security Laboratory (NSL) and Canadian Institute for Cybersecurity (CIC) 2017 datasets, where the application of RNNs for intrusion detection shows substantial practical implications. Specifically, we achieved accuracy scores of 99.6% with the Decision Tree, Random Forest, and CatBoost classifiers on the NSL dataset, and 99.8% and 99.9%, respectively, on the CIC 2017 dataset. By training RNNs on the data first and extracting features before applying classification algorithms, our approach marks a major shift in intrusion detection methodologies. This modification of the pipeline underscores the benefits of using RNNs for feature extraction and data preprocessing, meeting the critical need to safeguard data security and communication freedom against ever-evolving network threats.
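The core idea of steps (1)-(3) can be sketched in miniature: run a sequence through a recurrent cell and use the final hidden state as the feature vector handed to a downstream classifier. The toy Elman-style recurrence with random, untrained weights below stands in for the paper's trained RNNs, and nearest-centroid classification stands in for the Decision Tree / Random Forest / CatBoost stage; all sizes and values are illustrative:

```python
import math
import random

rng = random.Random(42)
HID = 4
W_in = [rng.uniform(-1, 1) for _ in range(HID)]
W_h = [[rng.uniform(-0.5, 0.5) for _ in range(HID)] for _ in range(HID)]

def rnn_features(seq):
    # Elman-style recurrence; the final hidden state is the feature vector
    h = [0.0] * HID
    for x in seq:
        h = [math.tanh(x * W_in[j] + sum(W_h[j][k] * h[k] for k in range(HID)))
             for j in range(HID)]
    return h

def nearest_centroid(feat, centroids):
    # stand-in for the downstream classifier (DT/RF/CatBoost in the paper)
    return min(centroids,
               key=lambda c: sum((a - b) ** 2
                                 for a, b in zip(feat, centroids[c])))
```

Traffic sequences from different classes end up at different points in the hidden-state space, which is what makes the extracted features separable for the final classifier.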

    A Two-branch Edge Guided Lightweight Network for infrared image saliency detection

    In the dynamic landscape of saliency detection, convolutional neural networks have emerged as catalysts for innovation but remain largely tailored to RGB imagery, falling short in the context of infrared images, particularly in memory-restricted environments. Existing approaches tend to overlook the wealth of contour information vital for a nuanced analysis of infrared images. Addressing this notable gap, we introduce the novel Two-branch Edge Guided Lightweight Network (TBENet), designed explicitly for robust infrared image saliency detection. The main contributions of this paper are as follows. First, we formulate the saliency detection task as two subtasks: contour enhancement and foreground segmentation. The TBENet is accordingly divided into two specialized branches: a contour prediction branch for extracting the target contour and a saliency map generation branch for separating the foreground from the background. The first branch employs an encoder-decoder architecture to meticulously delineate object contours, serving as a guiding blueprint for the second branch, which integrates spatial and semantic data to create a precise saliency map, refined further by an innovative edge-weighted contour loss function. Second, to enhance feature integration, we propose depthwise multi-scale and multi-cue modules, facilitating sophisticated feature aggregation. Third, a high-level linear bottleneck module is devised to extract rich semantic information, and replacing standard convolutions with depthwise convolutions reduces model complexity. Additionally, we reduce the number of channels in the feature maps at each stage of the decoder to make the model even more lightweight. Last, we construct a novel infrared ship dataset, Small-IRShip, to train and evaluate the proposed model. Experimental results on our Small-IRShip dataset and two publicly available datasets, RGB-T and IRSTD-1k, demonstrate TBENet's superior performance over state-of-the-art methods, affirming its effectiveness in harnessing edge information and incorporating advanced feature integration strategies.
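The exact form of the edge-weighted contour loss is not given in the abstract; one plausible realization, shown here purely as an illustration, is a binary cross-entropy in which pixels on the predicted contour map receive extra weight. The weight factor `w` and the per-pixel formulation are assumptions:

```python
import math

def edge_weighted_bce(pred, target, edge, w=2.0):
    # BCE over flattened pixels; contour pixels (edge == 1) are upweighted.
    # w is an assumed tunable factor, not a value from the paper.
    total, norm = 0.0, 0.0
    for p, t, e in zip(pred, target, edge):
        weight = 1.0 + (w - 1.0) * e          # 1.0 off-contour, w on-contour
        p = min(max(p, 1e-7), 1 - 1e-7)       # clamp for numerical stability
        total += weight * -(t * math.log(p) + (1 - t) * math.log(1 - p))
        norm += weight
    return total / norm
```

The effect is that mistakes along object boundaries cost more than mistakes in flat regions, pushing the saliency branch to respect the contours predicted by the first branch.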

    Indeterminacy-aware prediction model for authentication in IoT.

    The Internet of Things (IoT) has opened a new chapter in data access, bringing clear opportunities as well as major security and privacy challenges. Access control is one of these challenges: existing, conventional access control paradigms do not fit IoT, so access control requires further investigation and remains an open issue. IoT has a number of inherent characteristics, including scalability, heterogeneity, and dynamism, which hinder access control. While the impact of most of these characteristics has been well studied in the literature, we highlight "indeterminacy" in authentication as a neglected research issue. This work stresses that an indeterminacy-resilient model for IoT authentication is missing from the literature. According to our findings, indeterminacy consists of at least two facets: "uncertainty" and "ambiguity". Accordingly, various relevant theories were studied in this work. Our proposed framework is based on well-known machine learning models and Attribute-Based Access Control (ABAC). To implement and evaluate the framework, we first generate datasets in which the location of the users is a main attribute, with the aim of analysing the role of user mobility in the performance of the prediction models. Next, multiple classification algorithms were applied to our datasets to build the best-fit prediction models. Our results suggest that the prediction models are able to determine the class of authentication requests while considering both the uncertainty and the ambiguity in the IoT system.
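One way a prediction model can surface the two facets named above is to abstain instead of forcing a grant/deny decision: flag a request as indeterminate when the classifier's top probability is too low (uncertainty) or when the top two classes are too close (ambiguity). The thresholds, class names, and decision rule below are illustrative assumptions, not the paper's framework:

```python
def decide(probs, grant_threshold=0.7, margin=0.2):
    # probs: class probabilities from a classifier, e.g. {'grant': p, 'deny': q}.
    # Returns the class label, or 'indeterminate' when the model should abstain.
    top, second = sorted(probs.values(), reverse=True)[:2]
    label = max(probs, key=probs.get)
    if top - second < margin:
        return 'indeterminate'   # ambiguity: competing classes too close
    if label == 'grant' and top < grant_threshold:
        return 'indeterminate'   # uncertainty: not confident enough to grant
    return label
```

Abstained requests can then be escalated to a stricter check, which keeps low-confidence grants out of the ABAC pipeline.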

    Concepts for Collaboration among Intelligent Devices to Build Networked Cities
