99 research outputs found

    Dynamic Bandwidth Allocation for Internet of Things System Using Elastic Wireless Local Area Network

    Rapid technological development has triggered increasingly innovative applications. One of them is the Internet of Things (IoT), which makes human work easier and more effective. Along with the development of sensor technology for monitoring and control through IoT systems, a mechanism is needed to manage bandwidth so that an IoT system can function optimally, especially in buildings designated as public areas, where a smart building is supported by various integrated sensors that maintain safety and comfort. This study proposes Elastic WLAN as a model for dynamic bandwidth management in IoT systems. In this model, the bandwidth of each IoT device changes automatically according to the traffic measured for each device connected to the network. To evaluate the performance of the Elastic WLAN mechanism, this study developed a prototype IoT device that implements Elastic WLAN on a Raspberry Pi access point, using two temperature sensors placed in separate locations. The system successfully allocates bandwidth to each IoT device according to the amount of data arriving from each installed temperature sensor: the more data a sensor captures, the more bandwidth the system automatically allocates to it, and vice versa.
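
    The abstract does not give the allocation rule in detail; the following is a minimal Python sketch of the proportional idea it describes, in which each sensor's share of a fixed uplink grows with its measured traffic. The device names, link capacity, and per-device floor are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of traffic-proportional bandwidth allocation.
# All figures below are invented for illustration.

def allocate_bandwidth(traffic_kbps, total_kbps, floor_kbps=8.0):
    """Split total_kbps across devices in proportion to measured traffic,
    guaranteeing each device at least floor_kbps."""
    devices = list(traffic_kbps)
    guaranteed = floor_kbps * len(devices)
    if guaranteed > total_kbps:
        raise ValueError("link too small for the per-device floor")
    spare = total_kbps - guaranteed
    total_traffic = sum(traffic_kbps.values())
    shares = {}
    for dev in devices:
        # With no traffic anywhere, split the spare capacity evenly.
        weight = (traffic_kbps[dev] / total_traffic) if total_traffic else 1 / len(devices)
        shares[dev] = floor_kbps + spare * weight
    return shares

# Two temperature sensors, as in the prototype described above; the second
# is currently reporting far more data and so receives a larger share.
print(allocate_bandwidth({"temp_sensor_1": 12.0, "temp_sensor_2": 48.0},
                         total_kbps=512.0))
```

    Re-running this function on fresh traffic measurements each interval gives the "elastic" behaviour the abstract describes; enforcing the shares on the access point (e.g. via traffic shaping) is left out of the sketch.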

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.
    Comment: 46 pages, 22 figures

    A Framework of Fog Computing: Architecture, Challenges and Optimization

    This is the author accepted manuscript; the final version is available from IEEE via the DOI in this record. Fog Computing (FC) is an emerging distributed computing platform aimed at bringing computation close to its data sources, which can reduce the latency and cost of delivering data to a remote cloud. This feature and related advantages are desirable for many Internet-of-Things applications, especially latency-sensitive and mission-intensive services. The definition and architecture of FC are presented in this article, with comparisons to other computing technologies. The framework of resource allocation for latency reduction combined with reliability, fault tolerance and privacy, and the underlying optimization problems, are also discussed. We then investigate an application scenario and conduct resource optimization by formulating the optimization problem and solving it via a Genetic Algorithm. The resulting analysis generates some important insights on the scalability of FC systems. This work was supported by the Engineering and Physical Sciences Research Council [grant number EP/P020224/1] and the EU FP7 QUICK project under Grant Agreement No. PIRSES-GA-2013-612652. Yang Liu was supported by the Chinese Research Council.
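
    The article's optimization formulation is not reproduced in this abstract, so the sketch below shows only a generic Genetic Algorithm over task-to-fog-node assignments with an assumed total-latency fitness. The latency matrix, population size, and operators are all invented for illustration, not taken from the paper.

```python
import random

random.seed(1)

# Invented problem instance: latency (ms) of running task t on fog node n.
LATENCY = [[12, 30, 21], [25, 9, 17], [14, 22, 8], [19, 11, 27]]  # 4 tasks x 3 nodes
N_TASKS, N_NODES = len(LATENCY), len(LATENCY[0])

def fitness(assignment):
    """Lower total latency is better, so fitness is its negation."""
    return -sum(LATENCY[t][n] for t, n in enumerate(assignment))

def crossover(a, b):
    """One-point crossover of two assignment vectors."""
    cut = random.randrange(1, N_TASKS)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.2):
    """Reassign each task to a random node with probability `rate`."""
    return [random.randrange(N_NODES) if random.random() < rate else g for g in ind]

def genetic_search(pop_size=20, generations=50):
    pop = [[random.randrange(N_NODES) for _ in range(N_TASKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                      # truncation selection
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = genetic_search()
print("assignment:", best, "total latency:", -fitness(best), "ms")
```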

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. After highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions for addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario, along with recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
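
    The paper's exact Q-learning formulation is not reproduced in the abstract; a common toy model for Q-learning-assisted random access, sketched below under assumed sizes and rewards, has each MTC device keep one Q-value per RA slot (a stateless, bandit-style table), earning a positive reward for a collision-free transmission and a negative one otherwise.

```python
import random

random.seed(0)

N_DEVICES, N_SLOTS = 8, 8          # illustrative sizes, not from the paper
ALPHA, EPSILON, FRAMES = 0.1, 0.1, 2000

# One row of Q-values per device, one column per RA slot.
Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]

def pick_slot(q_row):
    """Epsilon-greedy slot choice."""
    if random.random() < EPSILON:
        return random.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=q_row.__getitem__)

for _ in range(FRAMES):
    choices = [pick_slot(Q[d]) for d in range(N_DEVICES)]
    for d, slot in enumerate(choices):
        reward = 1.0 if choices.count(slot) == 1 else -1.0   # collision check
        Q[d][slot] += ALPHA * (reward - Q[d][slot])          # stateless Q update

print("learned slots:", [max(range(N_SLOTS), key=Q[d].__getitem__)
                         for d in range(N_DEVICES)])
```

    With enough frames the devices settle on (mostly) distinct slots, which is the low-complexity congestion-avoidance behaviour this line of work exploits; the full paper's variants refine the update and exploration rules.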

    Intelligent Circuits and Systems

    ICICS-2020 is the third conference initiated by the School of Electronics and Electrical Engineering at Lovely Professional University. It explored recent innovations by researchers working on the development of smart and green technologies in the fields of Energy, Electronics, Communications, Computers, and Control. ICICS enables innovators to identify new opportunities for the social and economic benefit of society. The conference bridges the gap between academia, R&D institutions, social visionaries, and experts from all strata of society, allowing them to present their ongoing research activities and fostering research relations between them. It provides opportunities to exchange new ideas, applications, and experiences in the field of smart technologies and to find global partners for future collaboration. ICICS-2020 was conducted in two broad categories: Intelligent Circuits & Intelligent Systems, and Emerging Technologies in Electrical Engineering.

    Enable Reliable and Secure Data Transmission in Resource-Constrained Emerging Networks

    The increasing deployment of wireless devices has connected humans and objects all around the world, benefiting our daily life and the entire society in many aspects. Achieving this connectivity has motivated the emergence of different types of network paradigms, such as cellular networks, large-scale Internet of Things (IoT), cognitive networks, etc. In these networks, enabling reliable and secure data transmission requires various resources, including spectrum, energy, and computational capability. However, these resources are usually limited in many scenarios, especially when the number of devices is considerably large, with catastrophic consequences for data transmission. For example, given that most IoT devices have limited computational abilities and inadequate security protocols, data transmission is vulnerable to attacks such as eavesdropping and replay attacks, which traditional security approaches are unable to address. On the other hand, in cellular networks, the ever-increasing data traffic has exacerbated the depletion of spectrum along with energy consumption. As a result, mobile users experience significant congestion and delays when they request data from the cellular service provider, especially in crowded areas. In this dissertation, we target reliable and secure data transmission in resource-constrained emerging networks. The first two works investigate new security challenges in the current heterogeneous IoT environment and provide countermeasures for reliable data communication. Specifically, we identify a new physical-layer attack, the signal emulation attack, in heterogeneous environments such as smart-home IoT. To defend against the attack, we propose two defense strategies with the help of a commonly found wireless device. In addition, to enable secure data transmission in large-scale IoT networks, e.g., the industrial IoT, we apply amplify-and-forward cooperative communication to increase the secrecy capacity by incentivizing relay IoT devices. Beyond security concerns in IoT networks, we seek data traffic alleviation approaches to achieve reliable and energy-efficient data transmission for a group of users in a cellular network. The concept of mobile participation is introduced to assist data offloading from the base station to users in the group by leveraging the mobility of users and the social features among them. Following that, we deploy device-to-device data offloading within the group to achieve energy efficiency at the user side while adapting to increasing traffic demands. Finally, we consider an orthogonal topic, dynamic spectrum access (DSA), to alleviate the spectrum scarcity issue in cognitive radio networks, where the spectrum resources available to users are limited. Specifically, we focus on security concerns and propose two physical-layer schemes to prevent spectrum misuse in DSA in both additive white Gaussian noise and fading environments.
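
    For the cooperative-communication part, the secrecy metric involved is the standard wiretap secrecy capacity; the sketch below illustrates it with the textbook end-to-end SNR of a two-hop amplify-and-forward link. The SNR values are invented, and this is only the baseline formula, not the dissertation's incentive mechanism.

```python
import math

def af_end_to_end_snr(snr_sr, snr_rd):
    """Textbook end-to-end SNR of a two-hop amplify-and-forward link
    (source->relay SNR and relay->destination SNR, linear scale)."""
    return (snr_sr * snr_rd) / (snr_sr + snr_rd + 1.0)

def secrecy_capacity(snr_main, snr_eve):
    """Wiretap secrecy capacity [log2(1+SNR_main) - log2(1+SNR_eve)]^+,
    in bit/s/Hz; zero when the eavesdropper's channel is at least as good."""
    return max(0.0, math.log2(1.0 + snr_main) - math.log2(1.0 + snr_eve))

# Invented linear-scale SNRs: ~20 dB relay hops, ~10 dB eavesdropper channel.
snr_main = af_end_to_end_snr(snr_sr=100.0, snr_rd=80.0)
snr_eve = 10.0
print(f"secrecy capacity: {secrecy_capacity(snr_main, snr_eve):.2f} bit/s/Hz")
```

    Incentivizing relays, as the dissertation does, amounts to raising snr_main relative to snr_eve so that this difference, and hence the achievable secrecy rate, grows.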