216 research outputs found

    AdaBoost‑Based Security Level Classification of Mobile Intelligent Terminals

    With the rapid development of the Internet of Things, massive numbers of mobile intelligent terminals are ready to access edge servers for real-time data calculation and interaction. However, this brings a simultaneous risk of private data leakage. As the administrator of all intelligent terminals in a region, the edge server needs to clarify the ability of the managed intelligent terminals to defend against malicious attacks. Therefore, classifying the security level of mobile intelligent terminals before they access the network is indispensable. In this paper, we first propose a safety assessment method to detect the weaknesses of mobile intelligent terminals. Second, we map the evaluation results to security levels. Finally, we propose a security level classification scheme for mobile intelligent terminals based on the AdaBoost algorithm. The experimental results demonstrate that, compared to a baseline that statistically calculates the security level, the proposed method completes security level classification with lower latency and higher accuracy when massive numbers of mobile intelligent terminals access the network at the same time.
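    The abstract gives no implementation details, so the following is only a minimal sketch of the boosting loop an AdaBoost classifier relies on: decision stumps re-weighted over rounds. The terminal features (e.g. days since last patch, open port count) are illustrative assumptions, not from the paper.

```python
import numpy as np

def adaboost_fit(X, y, n_rounds=5):
    """Minimal AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # sample weights, updated each round
    stumps = []                      # (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best = None
        for j in range(d):           # exhaustive search for the best stump
            for t in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.sign(X[:, j] - t + 1e-12)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, pol)
        err, j, t, pol = best
        err = np.clip(err, 1e-10, 1 - 1e-10)    # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # weak-learner weight
        pred = pol * np.sign(X[:, j] - t + 1e-12)
        w *= np.exp(-alpha * y * pred)          # upweight misclassified samples
        w /= w.sum()
        stumps.append((j, t, pol, alpha))
    return stumps

def adaboost_predict(stumps, X):
    """Sign of the alpha-weighted vote of all stumps."""
    agg = np.zeros(X.shape[0])
    for j, t, pol, alpha in stumps:
        agg += alpha * pol * np.sign(X[:, j] - t + 1e-12)
    return np.sign(agg)
```

    In a deployment like the one described, each row would be a terminal's assessed weaknesses and the sign would map to a security level.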

    An Enhanced AdaBoost Classifier for Smart City Big Data Analytics

    The targeted goal of smart cities is to improve the quality of life of their people and to sustain economic development. Smart cities enhance a whole set of utilities, including healthcare, education, transportation, and agriculture, among others. Smart cities depend on an ICT framework that includes Internet of Things technologies. These technologies produce bulk volumes of diverse data, referred to as big data. However, these data have no purpose by themselves: new modules need to be developed to interpret the large amounts of data collected, and one good way to do so is to apply big data analytics methods. Such analytics must be designed and maintained to gain a good understanding of the data and to increase the utilities of the smart city.

    Cyber Physical System Based Smart Healthcare System with Federated Deep Learning Architectures with Data Analytics

    Data shared between hospitals and patients using mobile and wearable Internet of Medical Things (IoMT) devices raises privacy concerns due to the methods used in training. The IoMT and other recent technological advances have transformed the traditional healthcare system into a smart one: improvements in computing power and the spread of information have turned healthcare into a high-tech, data-driven operation. However, mobile and wearable IoMT devices present privacy concerns regarding the data transmitted between hospitals and end users because of the centralized way in which artificial intelligence models are trained. Devices connected to the IoMT network transmit highly confidential information that could be intercepted by adversaries. Because medical cyber-physical systems make electronic health record data portable for clinical research, the rate at which new scientific discoveries can be made has increased. While AI helps improve medical informatics, the current methods of centralised data training and insecure data storage management risk exposing private medical information to unapproved foreign organisations. The federated learning (FL) distributive AI paradigm has opened new avenues for protecting users' privacy in IoMT without requiring access to their data: FL safeguards user privacy by concealing all but the gradients during training. DeepFed, a novel federated deep learning approach, is presented in this research for the purpose of detecting cyber threats to intelligent healthcare CPSs.
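    DeepFed's architecture is not reproduced in the abstract; as a hedged illustration of the gradient-only exchange FL depends on, here is one FedAvg-style round for a shared logistic-regression model. Clients compute gradients on private data and the server averages them; the synthetic data and names are assumptions, not the authors' system.

```python
import numpy as np

def local_gradient(w, X, y):
    """One client's logistic-regression gradient; raw data stays on the client."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def federated_round(w, clients, lr=0.5):
    """FedAvg-style round: the server sees only gradients, never the data."""
    grads = [local_gradient(w, X, y) for X, y in clients]
    return w - lr * np.mean(grads, axis=0)
```

    Real FL systems add weighted aggregation by client size, secure aggregation, and differential privacy on top of this basic exchange.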

    A Novel Chimp Optimized Linear Kernel Regression (COLKR) Model for Call Drop Prediction in Mobile Networks

    Call failure can be caused by a variety of factors, including inadequate cellular infrastructure, undesirable system structuring, busy mobile phone towers, switching between towers, and many more. Outdated equipment and networks worsen call failure, and installing more towers to improve coverage might harm regional ecosystems. In existing studies, a variety of machine learning algorithms have been implemented for call drop prediction in mobile networks, but they face problems of high error rates, low prediction accuracy, system complexity, and long training times. Therefore, the proposed work develops a new framework, named Chimp Optimized Linear Kernel Regression (COLKR), for predicting call drops in mobile networks. For the analysis, Call Detail Records (CDRs) were collected and used in this framework. The attributes are preprocessed and a normalized dataset is constructed using a median regression-based filtering technique. To extract the most significant features for training the classifier with minimum processing complexity, a Chimp Optimization Algorithm (COA) is applied. Then, a new machine learning model known as the Linear Kernel Regression Model (LKRM) is deployed to predict call drops with greater accuracy and less error. For the performance assessment of COLKR, several machine learning classifiers are compared with the proposed model using a variety of measures. Using the proposed COLKR mechanism, call drop detection accuracy is improved to 99.4% and the error rate is reduced to 0.098%, which demonstrates the efficiency and superiority of the proposed system.
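    The exact LKRM formulation is not given in the abstract. Under the assumption that it resembles standard kernel ridge regression with a linear kernel, a closed-form sketch looks like this; the COA feature-selection stage is deliberately omitted, and the toy data stands in for CDR features.

```python
import numpy as np

def fit_linear_kernel_ridge(X, y, lam=1e-3):
    """Closed-form kernel ridge regression with a linear kernel K = X X^T."""
    K = X @ X.T
    return np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual coefficients

def krr_predict(alpha, X_train, X_new):
    """Predict via kernel evaluations between new and training points."""
    return (X_new @ X_train.T) @ alpha
```

    With a linear kernel this is equivalent to ridge regression in the primal, but the dual form generalizes directly to nonlinear kernels if the paper's model uses one.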

    Sequential Feature Selection Using Hybridized Differential Evolution Algorithm and Haar Cascade for Object Detection Framework

    Intelligent systems, an aspect of artificial intelligence, have been developed to improve satellite image interpretation, with several efforts focused on object-based machine learning methods, but they lack an optimal feature selection technique. Existing techniques applied to satellite images for feature selection and object detection have been reported to be ineffective at detecting objects. In this paper, the Differential Evolution (DE) algorithm is introduced as a technique for selecting features and mapping them to a Haar cascade machine learning classifier for optimal detection. A satellite image was acquired and pre-processed, feature engineering was carried out, and the features were mapped using the adopted DE algorithm. The selected features were then trained with the Haar cascade machine learning algorithm. The results show that the proposed technique achieves an accuracy of 86.2%, a sensitivity of 89.7%, and a specificity of 82.2%.
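    The paper's exact DE-to-Haar-cascade mapping is not spelled out, so the snippet below is only a simplified DE/rand/1/bin sketch over binary feature masks: a gene above 0.5 selects a feature, and the fitness function is left pluggable (in the paper it would score a trained Haar cascade, which is out of scope here). The classic forced-crossover gene (j_rand) is omitted for brevity.

```python
import numpy as np

def de_feature_select(fitness, dim, pop=20, gens=40, F=0.8, CR=0.9, seed=0):
    """Simplified DE/rand/1/bin over [0,1]^dim; gene > 0.5 means 'feature selected'."""
    rng = np.random.default_rng(seed)
    P = rng.random((pop, dim))
    scores = np.array([fitness(v > 0.5) for v in P], dtype=float)
    for _ in range(gens):
        for i in range(pop):
            idx = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            a, b, c = P[idx]
            mutant = np.clip(a + F * (b - c), 0.0, 1.0)  # differential mutation
            cross = rng.random(dim) < CR                 # binomial crossover
            trial = np.where(cross, mutant, P[i])
            s = fitness(trial > 0.5)
            if s >= scores[i]:                           # greedy selection
                P[i], scores[i] = trial, s
    return P[np.argmax(scores)] > 0.5                    # best feature mask
```

    In practice the fitness would balance classifier accuracy against the number of selected features, so the cascade trains on a compact subset.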

    Optimization of Energy-Efficient Cluster Head Selection Algorithm for Internet of Things in Wireless Sensor Networks

    The Internet of Things (IoT) now uses the Wireless Sensor Network (WSN) as a platform to sense and communicate data. The increase in the number of embedded and interconnected devices on the Internet has resulted in a need for software solutions to manage them proficiently in an elegant and scalable manner. These devices can also generate massive amounts of data, resulting in a classic big data problem that must be stored and processed. IoT applications produce large volumes of information, raising two major issues in big data analytics. To ensure efficient mining of both spatial and temporal data, sensed samples have to be collected, so this work proposes a new strategy to remove redundancy: all collected data are classified as either relevant or irrelevant in order to choose suitable information even before it is forwarded to the base station or the cluster head. Low-Energy Adaptive Clustering Hierarchy (LEACH) is a cluster-based routing protocol that uses cluster formation. LEACH chooses one head from the network sensor nodes, the Cluster Head (CH), and rotates the role to distribute the energy load. However, the CHs are chosen randomly, with the possibility of all CHs being concentrated in one locality, and such dynamic clustering results in more overhead due to CH changes and advertisements. Therefore, LEACH is not suitable for large networks. Here, Particle Swarm Optimization (PSO) and River Formation Dynamics (RFD) are used to optimize the CH selection. The results show that the proposed method performs better than other methods.
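    For reference, the random CH election being optimized follows LEACH's standard threshold formula T(n) = p / (1 - p (r mod 1/p)). The sketch below implements just that rule; the PSO/RFD optimization layer is not reproduced, and the epoch bookkeeping (excluding nodes that recently served as CH) is simplified away.

```python
import random

def leach_threshold(p, r):
    """Standard LEACH threshold T(n) for desired CH fraction p and round r."""
    return p / (1.0 - p * (r % int(round(1.0 / p))))

def elect_cluster_heads(node_ids, p, r, rng):
    """Each node elects itself CH when a uniform draw falls below T(n).
    Simplification: recent CHs are not excluded, unlike full LEACH."""
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]
```

    The threshold rises toward 1.0 over each epoch, which is what guarantees every node eventually takes a turn as CH; the concentration problem arises because the draws are independent of node position.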

    Ensemble deep learning: A review

    Ensemble learning combines several individual models to obtain better generalization performance. Currently, deep learning models with multilayer processing architectures are showing better performance than shallow or traditional classification models. Deep ensemble learning models combine the advantages of both deep learning models and ensemble learning, such that the final model has better generalization performance. This paper reviews the state-of-the-art deep ensemble models and hence serves as an extensive summary for researchers. The ensemble models are broadly categorised into bagging, boosting, and stacking ensembles; negative-correlation-based deep ensemble models; explicit/implicit ensembles; homogeneous/heterogeneous ensembles; decision fusion strategies; and unsupervised, semi-supervised, reinforcement learning, online/incremental, and multilabel deep ensemble models. The application of deep ensemble models in different domains is also briefly discussed. Finally, we conclude the paper with some future recommendations and research directions.
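    As a small illustration of the decision-fusion strategies the review categorises, here are the two most common ones — hard majority voting and weighted soft averaging — in generic form; the member predictions and weights are placeholders, not tied to any model in the review.

```python
import numpy as np

def majority_vote(member_preds):
    """Hard fusion: per-sample majority over member predictions in {-1, +1}."""
    return np.sign(np.sum(member_preds, axis=0))

def weighted_average(member_probs, weights):
    """Soft fusion: weighted mean of member class probabilities."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), np.asarray(member_probs), axes=1)
```

    Soft fusion generally preserves more information than hard voting because confident and borderline members contribute differently to the final score.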

    Machine Learning-Enabled IoT Security: Open Issues and Challenges Under Advanced Persistent Threats

    Despite its technological benefits, the Internet of Things (IoT) has cyber weaknesses due to vulnerabilities in the wireless medium. Machine learning (ML)-based methods are widely used against cyber threats in IoT networks with promising performance. The advanced persistent threat (APT) is a prominent means for cybercriminals to compromise networks, notable for its long-term and harmful characteristics. However, it is difficult for ML-based approaches to achieve promising detection performance on APT attacks because they constitute an extremely small percentage of normal traffic. Few surveys fully investigate APT attacks in IoT networks, owing to the lack of public datasets covering all types of APT attacks. It is therefore worthwhile to bridge the state of the art in network attack detection with APT attack detection in a comprehensive review article. This survey article reviews the security challenges in IoT networks and presents the well-known attacks, APT attacks, and threat models in IoT systems. Meanwhile, signature-based, anomaly-based, and hybrid intrusion detection systems for IoT networks are summarized. The article highlights statistical insights regarding frequently applied ML-based methods against network intrusion, alongside the number of attack types detected. Finally, open issues and challenges for common network intrusion and APT attacks are presented for future research. Comment: ACM Computing Surveys, 2022, 35 pages, 10 figures, 8 tables
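    The imbalance problem the survey highlights is easy to demonstrate: when APT flows are a tiny fraction of traffic, accuracy rewards a detector that flags nothing, so precision and recall on the attack class are the meaningful metrics. A minimal sketch with synthetic labels (not drawn from any dataset in the survey):

```python
def precision_recall(y_true, y_pred, positive=1):
    """Precision and recall for the rare attack class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

    A detector that predicts "normal" for every flow scores 99.8% accuracy on a 2-in-1000 attack rate yet has zero recall, which is why the reviewed IDS literature reports per-class metrics.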