
    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling wireless-network applications, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to help readers clarify the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. Comment: 46 pages, 22 figures

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    The next wave of wireless technologies is proliferating, connecting things among themselves as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies, such as artificial intelligence, non-terrestrial networks, and new spectra, is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out. Comment: Submitted for review to IEEE CS&

    Edge AI for Internet of Energy: Challenges and Perspectives

    The digital landscape of the Internet of Energy (IoE) is on the brink of a revolutionary transformation with the integration of edge Artificial Intelligence (AI). This comprehensive review elucidates the promise and potential that edge AI holds for reshaping the IoE ecosystem. Commencing with a meticulously curated research methodology, the article delves into the myriad edge AI techniques specifically tailored for IoE. The many benefits, spanning from reduced latency and real-time analytics to the pivotal aspects of information security, scalability, and cost-efficiency, underscore the indispensability of edge AI in modern IoE frameworks. As the narrative progresses, readers are acquainted with pragmatic applications and techniques, highlighting on-device computation, secure private inference methods, and emerging paradigms of AI training on the edge. A critical analysis follows, offering a deep dive into present challenges, including security concerns, computational hurdles, and standardization issues. As the horizon of technology ever expands, the review culminates in a forward-looking perspective, envisaging the future symbiosis of 5G networks, federated edge AI, deep reinforcement learning, and more, painting a vibrant panorama of what the future holds. For anyone vested in the domains of IoE and AI, this review offers both a foundation and a visionary lens, bridging present realities with future possibilities.

    Adversarial Attacks and Defenses in 6G Network-Assisted IoT Systems

    The Internet of Things (IoT) and massive IoT systems are key to sixth-generation (6G) networks due to dense connectivity, ultra-reliability, low latency, and high throughput. Artificial intelligence, including deep learning and machine learning, offers solutions for optimizing and deploying cutting-edge technologies for future radio communications. However, these techniques are vulnerable to adversarial attacks, leading to degraded performance and erroneous predictions, outcomes that are unacceptable for ubiquitous networks. This survey extensively addresses adversarial attacks and defense methods in 6G network-assisted IoT systems. The theoretical background and up-to-date research on adversarial attacks and defenses are discussed. Furthermore, we provide Monte Carlo simulations to validate the effectiveness of adversarial attacks compared to jamming attacks. Additionally, we examine the vulnerability of 6G IoT systems by demonstrating attack strategies applicable to key technologies, including reconfigurable intelligent surfaces, massive multiple-input multiple-output (MIMO)/cell-free massive MIMO, satellites, the metaverse, and semantic communications. Finally, we outline the challenges and future developments associated with adversarial attacks and defenses in 6G IoT systems. Comment: 17 pages, 5 figures, and 4 tables. Submitted for publication

    A Systematic Review of LPWAN and Short-Range Network using AI to Enhance Internet of Things

    Artificial intelligence (AI) has recently seen frequent use, especially in connection with the Internet of Things (IoT). IoT devices, however, do not work alone: they are assisted by Low-Power Wide-Area Networks (LPWAN) for long-distance communication and short-range networks for short distances. Few reviews have examined how AI can help LPWAN and short-range networks, so the authors took the opportunity to conduct this review. This study systematically reviews papers on AI for LPWAN and short-range networks aimed at enhancing IoT performance, and discusses results that can be applied within a specific scope. The authors follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and systematically review all study results in support of their objectives, also identifying opportunities for development and related study. The systematic review yielded 79 suitable papers, which are then discussed. Several technologies are widely used, such as LPWAN in general, with a number of papers originating from China; many reports came from recent conferences, and papers on this topic were concentrated in 2020-2021. The study is expected to inspire experimental work, help readers find relevant scientific papers, and serve as a basis for further reviews.

    Applying distance metrics for anomaly detection of energy-based attacks in IoT sensors / Aplicação de métricas de distancias para detecção por anomalia de ataques baseados em energia em sensores IoT

    The Internet of Things (IoT) has gained significant mindshare in academia and industry over the years. It is usually composed of tiny devices/sensors with low processing power, memory, and available energy. As an emerging technology, many open challenges regarding the security of those devices are described in the literature. In this context, some attacks aim to drain the energy of IoT sensors; they are called energy-based attacks or energy-exhausting attacks. Detecting such attacks with minimal resources has become a challenge. Several intrusion detection proposals require exchanging information between sensors and the base station, demanding data transmission and increasing the energy consumption of sensors. Aware of this problem, we propose a lightweight statistical anomaly detection model that uses energy consumption analysis for the intrusion detection task. Our main contribution is an energy-efficient detection algorithm that is deployed directly at the sensors. It applies statistical distance metrics to discriminate between normal and anomalous energy consumption and does not require data transmission in the network. In this work, we compare three distance metrics to determine which is best for the discrimination phase: Sibson, Euclidean, and Hellinger. Thus, we simulate the detection algorithm and assess the results by applying the F-measure to the detection data. The results show an efficient intrusion detection model, with high F-score values and low energy expenditure on the detection task.
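    Not part of the original abstract: a minimal sketch of the Hellinger-distance discrimination step described above, using simulated energy readings. The baseline/attack distributions, histogram bin layout, and threshold value are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def to_distribution(samples, bins=20, value_range=(0.0, 12.0)):
    """Histogram a window of energy readings into a normalized distribution."""
    hist, _ = np.histogram(samples, bins=bins, range=value_range)
    return hist / hist.sum()

rng = np.random.default_rng(0)
# Profiled normal consumption (hypothetical: ~5 mJ per duty cycle)
baseline = to_distribution(rng.normal(5.0, 0.5, 500))
# A fresh window of normal readings vs. one under an energy-exhausting attack
d_normal = hellinger(baseline, to_distribution(rng.normal(5.0, 0.5, 100)))
d_attack = hellinger(baseline, to_distribution(rng.normal(8.0, 0.5, 100)))

THRESHOLD = 0.3  # hypothetical cutoff, tuned per deployment
print(f"normal window: d = {d_normal:.3f}, anomaly = {d_normal > THRESHOLD}")
print(f"attack window: d = {d_attack:.3f}, anomaly = {d_attack > THRESHOLD}")
```

Because the Hellinger distance is bounded in [0, 1], a fixed threshold is easy to calibrate on-device, which matches the paper's emphasis on avoiding network traffic during detection.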

    Machine Learning Enabled Vital Sign Monitoring System

    Internet of Things (IoT)-based remote health monitoring systems have enormous potential to become an integral part of the future medical system. In particular, these systems can play life-saving roles in treating or monitoring patients with critical health issues. On the other hand, they can also reduce pressure on the health-care system by reducing unnecessary hospital visits. Any health monitoring system must be free from erroneous data, which may arise because of instrument failure or communication errors. In this thesis, machine-learning techniques are implemented to assess the reliability and accuracy of data obtained by IoT-based remote health monitoring. A system is set up in which vital signs, namely blood pressure, respiratory rate, and pulse rate, are collected using Spire Stone and iHealth Sense devices. This data is then sent to an intermediate device and on to the cloud. In this system, it is assumed that the channel for transmitting data (vital signs) from users to the cloud server is error-free. Afterward, the information is extracted from the cloud, and two machine learning techniques, Support Vector Machines (SVM) and K-Nearest Neighbors (KNN), are applied to compare their accuracy in distinguishing correct and erroneous data. The thesis undertakes two different approaches to erroneous data detection. In the first approach, an unsupervised classifier called an Auto Encoder (AE) is used to label data by means of its latent features; the labeled data from the AE is then used as ground truth for comparing the accuracy of supervised learning models. In the second approach, the raw data is labeled based on the correlation between various features, and accuracy is compared between strongly correlated and weakly correlated features. Finally, the two approaches are compared to determine which method performs better at detecting erroneous data in the given dataset.
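    Not from the thesis: a toy sketch of the supervised comparison step it describes, assuming synthetic stand-ins for the labeled vital-sign data (the real data came from Spire Stone and iHealth Sense devices). The distributions, error model, and classifier settings are all illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 600
# Plausible vital signs: [systolic BP (mmHg), respiratory rate (/min), pulse (bpm)]
clean = np.column_stack([
    rng.normal(120, 10, n), rng.normal(16, 2, n), rng.normal(75, 8, n)])
# Erroneous readings, e.g. sensor glitches producing implausible values
errors = np.column_stack([
    rng.uniform(0, 300, n // 4), rng.uniform(0, 60, n // 4),
    rng.uniform(0, 250, n // 4)])
X = np.vstack([clean, errors])
y = np.concatenate([np.zeros(n), np.ones(n // 4)])  # 1 = erroneous

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
svm_acc = SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te)
knn_acc = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr).score(X_te, y_te)
print(f"SVM accuracy: {svm_acc:.3f}, KNN accuracy: {knn_acc:.3f}")
```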

    Big data analytics for large-scale wireless networks: Challenges and opportunities

    © 2019 Association for Computing Machinery. The wide proliferation of various wireless communication systems and wireless devices has ushered in the big data era for large-scale wireless networks. The big data of large-scale wireless networks has the key features of wide variety, high volume, real-time velocity, and huge value, leading to research challenges distinct from those of existing computing systems. In this article, we present a survey of state-of-the-art big data analytics (BDA) approaches for large-scale wireless networks. In particular, we categorize the life cycle of BDA into four consecutive stages: Data Acquisition, Data Preprocessing, Data Storage, and Data Analytics. We then present a detailed survey of technical solutions to the challenges in BDA for large-scale wireless networks according to each stage in the BDA life cycle. Moreover, we discuss open research issues and outline future directions in this promising area.

    Machine Learning for Internet of Things Data Analysis: A Survey

    Rapid developments in hardware, software, and communication technologies have facilitated the emergence of Internet-connected sensory devices that provide observations and data measurements from the physical world. By 2020, it is estimated that the total number of Internet-connected devices in use will be between 25 and 50 billion. As these numbers grow and the technologies mature, the volume of data being published will increase. The technology of Internet-connected devices, referred to as the Internet of Things (IoT), continues to extend the current Internet by providing connectivity and interactions between the physical and cyber worlds. In addition to increased volume, the IoT generates big data characterized by its velocity in terms of time and location dependency, a variety of multiple modalities, and varying data quality. Intelligent processing and analysis of this big data are the key to developing smart IoT applications. This article assesses the various machine learning methods that deal with the challenges presented by IoT data, considering smart cities as the main use case. The key contribution of this study is a taxonomy of machine learning algorithms explaining how different techniques are applied to the data in order to extract higher-level information. The potential and challenges of machine learning for IoT data analytics are also discussed. A use case of applying a Support Vector Machine (SVM) to Aarhus smart city traffic data is presented for a more detailed exploration.
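    Not from the article: a minimal sketch of the kind of SVM use case it mentions, under stated assumptions. Synthetic vehicle-count/average-speed samples stand in for the Aarhus traffic feed, and the free-flowing/congested labels are invented for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n = 400
# Features: [vehicle count per interval, average speed (km/h)] -- illustrative
light = np.column_stack([rng.normal(15, 5, n), rng.normal(55, 8, n)])
heavy = np.column_stack([rng.normal(45, 8, n), rng.normal(20, 6, n)])
X = np.vstack([light, heavy])
y = np.array([0] * n + [1] * n)  # 0 = free-flowing, 1 = congested

# Standardize features before the RBF kernel, then cross-validate
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

On real traffic streams the classes overlap far more than in this sketch, which is where the article's discussion of data quality and modality becomes relevant.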