99 research outputs found
Research on the design of communication systems achieving power-efficient and secure IoT wireless networks
Tohoku University, Nei Kato
Towards efficient support for massive Internet of Things over cellular networks
The usage of Internet of Things (IoT) devices over cellular networks has seen tremendous growth in recent years, and that growth is only expected to increase in the near future. While existing 4G and 5G cellular networks offer several desirable features for this type of application, their design has historically focused on accommodating traditional mobile devices (e.g. smartphones). As IoT devices have very different characteristics and use cases, they create a range of problems for current networks, which often struggle to accommodate them at scale. Although newer cellular network technologies, such as Narrowband-IoT (NB-IoT), were designed around IoT characteristics, they were extensively based on 4G and 5G networks to preserve interoperability and decrease deployment cost. As such, several inefficiencies of 4G/5G were also carried over to the newer technologies.
This thesis focuses on identifying the core issues that hinder the large-scale deployment of IoT over cellular networks, and proposes novel protocols to largely alleviate them. We find that the most significant challenges arise mainly in three distinct areas: connection establishment, network resource utilisation and device energy efficiency.
Specifically, we make the following contributions. First, we focus on the connection establishment process and argue that the current procedures, when used by IoT devices, result in increased numbers of collisions, network outages, and a signalling overhead that is disproportionate to the size of the data transmitted and to the connection duration of IoT devices. Therefore, we propose two mechanisms to alleviate these inefficiencies. Our first mechanism, named ASPIS, targets both the number of collisions and the signalling overhead simultaneously, and provides enhancements that increase the number of successful IoT connections without disrupting existing background traffic. Our second mechanism focuses specifically on collisions during the connection establishment process, and uses a novel Reinforcement Learning approach to decrease their number and allow a larger number of IoT devices to access the network with fewer attempts.
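As a back-of-the-envelope illustration of why contention-based connection establishment collapses at scale, the following sketch computes the chance that a device's random-access attempt survives one contention round. The model is a deliberate simplification (uniform preamble choice, no retries or power ramping), and M = 54 is an assumed contention-based share of LTE's 64 preambles rather than a figure from the thesis.

```python
# Simplified model of one contention-based random-access round:
# each of n devices picks one of m preambles uniformly at random,
# and a device succeeds only if no other device picked its preamble.

M = 54  # assumed contention-based share of the 64 LTE preambles

def success_probability(n: int, m: int = M) -> float:
    """Probability that a given device's preamble choice is unique."""
    return (1.0 - 1.0 / m) ** (n - 1)

for n in (10, 50, 100, 500, 1000):
    p = success_probability(n)
    print(f"{n:5d} contenders: P(success) = {p:.3f}, "
          f"expected successes = {n * p:.1f}")
```

Even this optimistic model shows the expected number of successful attempts peaking and then falling as the number of contenders grows, which is the congestion behaviour the two proposed mechanisms target.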
Second, we propose a new multicasting mechanism to reduce network resource utilisation in NB-IoT networks by delivering common content (e.g. firmware updates) to multiple similar devices simultaneously. Notably, our mechanism is not only more efficient during multicast data transmission, but also frees up resources that would otherwise be perpetually reserved for multicast signalling under the existing scheme.
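To see why multicast delivery matters at fleet scale, here is a toy comparison of repeated unicast versus a single multicast transmission of one firmware image; every number (image size, per-subframe payload, fleet size) is a hypothetical placeholder, not a figure from the thesis.

```python
# Toy comparison: delivering one firmware image to a fleet of
# NB-IoT devices by repeated unicast vs. one multicast transmission.
# All numbers below are hypothetical placeholders.

image_kb = 200            # firmware image size
devices = 1_000           # devices needing the update
kb_per_subframe = 0.085   # assumed downlink payload per 1 ms subframe

subframes_per_copy = image_kb / kb_per_subframe
unicast_total = subframes_per_copy * devices   # one copy per device
multicast_total = subframes_per_copy           # one shared copy

print(f"unicast  : {unicast_total * 1e-3:10.1f} s of downlink subframes")
print(f"multicast: {multicast_total * 1e-3:10.1f} s of downlink subframes")
print(f"saving   : {1 - multicast_total / unicast_total:.1%}")
```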
Finally, we focus on energy efficiency and propose novel protocols designed around the unique usage characteristics of NB-IoT devices, in order to reduce device power consumption. Towards this end, we perform a detailed energy consumption analysis, which we use as the basis for an energy consumption model that enables realistic energy consumption assessment. We then take the insights from our analysis, propose optimisations that significantly reduce the energy consumption of IoT devices, and assess their performance.
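The following minimal state-based energy model illustrates the kind of assessment described here: energy per reporting cycle is the sum of power times time over the radio states. The power draws, durations, and battery size are rough assumptions for illustration, not measurements or values from the thesis.

```python
# Minimal state-based energy model for one NB-IoT reporting cycle.
# (power_mW, seconds_per_cycle); all values are rough assumptions.
STATES = {
    "tx":       (500.0, 2.0),            # uplink transmission
    "rx":       (80.0,  1.0),            # downlink monitoring / ACKs
    "drx_tail": (5.0,   20.0),           # connected-mode DRX tail
    "psm":      (0.015, 3600.0 - 23.0),  # deep sleep, rest of the hour
}

def cycle_energy_j() -> float:
    """Energy per hourly reporting cycle in joules (mW * s = mJ)."""
    return sum(p * t for p, t in STATES.values()) / 1000.0

battery_j = 2.0 * 3.6 * 3600.0   # ~2 Ah cell at 3.6 V
cycles = battery_j / cycle_energy_j()
print(f"energy/cycle: {cycle_energy_j():.2f} J, "
      f"battery life: {cycles / (24 * 365):.1f} years at 1 cycle/hour")
```

A model of this shape makes it easy to see which radio states dominate the budget, and therefore where protocol-level optimisations are likely to pay off.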
Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario, along with recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
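As a concrete sketch of a low-complexity Q-learning scheme of the kind this survey focuses on, the snippet below implements the classic stateless formulation in which each device keeps one Q-value per RA slot, chooses epsilon-greedily, and is rewarded for collision-free transmissions; the parameters and reward shape are illustrative assumptions, not the survey's exact setup.

```python
import random
from collections import Counter

DEVICES, SLOTS, FRAMES = 40, 50, 2000
ALPHA, EPS = 0.1, 0.05   # learning rate, exploration rate

# One row of Q-values per device, one entry per RA slot.
Q = [[0.0] * SLOTS for _ in range(DEVICES)]

def pick_slot(q):
    """Epsilon-greedy slot selection."""
    if random.random() < EPS:
        return random.randrange(SLOTS)
    return max(range(SLOTS), key=q.__getitem__)

for _ in range(FRAMES):
    choices = [pick_slot(Q[d]) for d in range(DEVICES)]
    load = Counter(choices)
    for d, s in enumerate(choices):
        reward = 1.0 if load[s] == 1 else -1.0   # collision feedback
        Q[d][s] += ALPHA * (reward - Q[d][s])    # stateless Q-update

successes = sum(1 for s in choices if load[s] == 1)
print(f"after training: {successes}/{DEVICES} devices collision-free")
```

With enough frames the devices settle into near-orthogonal slot choices, illustrating the kind of congestion relief that learning-assisted channel access aims for.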
User-oriented mobility management in cellular wireless networks
Mobility Management (MM) in wireless mobile networks is a vital process for keeping an individual User Equipment (UE) connected while moving within the network coverage area; it is required to keep the network informed about the UE's mobility (i.e., location changes). The network must identify the exact serving cell of a specific UE for the purpose of data-packet delivery. The two MM procedures necessary to localize a specific UE and deliver data packets to that UE are known as Tracking Area Update (TAU) and Paging, which are burdensome not only to network resources but also to the UE's battery: the UE always initiates the TAU, and the network always initiates Paging. These two procedures are used in current Long Term Evolution (LTE) and next-generation (5G) networks despite the drawback that they consume bandwidth and energy. Because of potentially very high-volume traffic and the increasing density of high-mobility UEs, the TAU/Paging procedures incur significant costs in terms of signaling overhead and power consumption in the battery-limited UE. This problem will become even worse in 5G, which is expected to accommodate exceptional services, such as supporting mission-critical systems (close-to-zero latency) and extending battery lifetime (10 times longer). This dissertation examines and discusses a variety of solution schemes for both TAU and Paging, emphasizing a new key design to accommodate 5G use cases; however, ongoing efforts are still developing new schemes to provide seamless connections to the ever-increasing density of high-mobility UEs.
In this context, and toward achieving the 5G use cases, we propose a novel solution to the MM issues, named gNB-based UE Mobility Tracking (gNB-based UeMT). This solution has four features aligned with achieving 5G goals. First, mobile UEs no longer trigger the TAU to report their location changes, yielding substantial power savings with no signaling overhead. Second, the network elements, the gNBs, instead take over the responsibility of tracking and locating these UEs, so that UE locations are always known. Third, our Paging procedure is markedly improved over the conventional one, providing very fast UE reachability with no Paging messages being sent simultaneously. Fourth, our solution guarantees lightweight signaling overhead with very low Paging delay; our simulation studies show that it achieves about a 92% reduction in the corresponding signaling overhead. To realize these four features, this solution adds no implementation complexity. Instead, it exploits already existing LTE/5G communication protocols, functions, and measurement reports.
Our gNB-based UeMT solution by design has the potential to deal with mission-critical applications. In this context, we introduce a new approach for mission-critical and public-safety communications. Our approach targets emergency situations (e.g., natural disasters) in which the mobile wireless network becomes partially or completely dysfunctional. Specifically, this approach is intended to provide swift network recovery for Search-and-Rescue Operations (SAROs) that search for survivors after large-scale disasters, which we call UE-based SAROs. These SAROs are based on the fact that increasingly almost everyone carries wireless mobile devices (UEs), which serve as human-based wireless sensors on the ground.
Our UE-based SAROs are aimed at accounting for limited UE battery power while providing critical information to first responders, as follows: 1) generate immediate crisis maps for the disaster-impacted areas, 2) provide vital information about where the majority of survivors are clustered/crowded, and 3) prioritize the impacted areas to identify regions that urgently need communication coverage. UE-based SAROs offer first responders a vital tool to prioritize and manage SAROs efficiently and effectively in a timely manner.
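For intuition about where such signaling savings come from, here is a toy message-count comparison between conventional TAU/Paging and network-side tracking of the kind proposed; the rates and tracking-area size are hypothetical placeholders, and the model ignores costs the dissertation does account for, so it does not reproduce the reported 92% figure, only the mechanism behind it.

```python
# Toy signalling-cost model: conventional TAU/Paging vs. network-side
# UE tracking. All rates and sizes are hypothetical placeholders.

ues = 10_000
ta_changes_per_ue_h = 2   # tracking-area crossings per UE per hour
pages_per_ue_h = 1        # incoming sessions per UE per hour
cells_per_ta = 12         # cells a conventional page is broadcast to

# Conventional: a TAU per TA crossing, plus one page message per cell.
conventional = ues * (ta_changes_per_ue_h + pages_per_ue_h * cells_per_ta)

# gNB-tracked: no UE-triggered TAUs (location piggybacks on existing
# signalling), and each page targets the single known serving cell.
gnb_tracked = ues * pages_per_ue_h

print(f"conventional: {conventional:,} msgs/h")
print(f"gNB-tracked : {gnb_tracked:,} msgs/h "
      f"({1 - gnb_tracked / conventional:.0%} reduction)")
```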
D2D-Based Grouped Random Access to Mitigate Mobile Access Congestion in 5G Sensor Networks
The Fifth Generation (5G) wireless service of sensor networks involves significant challenges in coordinating an ever-increasing number of devices accessing shared resources. This has drawn major interest from the research community, as many existing works focus on radio access network congestion control to efficiently manage resources in the context of device-to-device (D2D) interaction in huge sensor networks. In this context, this paper pioneers a study on the impact of D2D link reliability in group-assisted random access protocols, shedding light on the beneficial performance and potential limitations of approaches of this kind against tunable parameters such as group size, number of sensors and reliability of D2D links. Additionally, we leverage the association with a Geolocation Database (GDB) to assist the grouping decisions, drawing parallels with recent regulatory-driven initiatives around GDBs and arguing the benefits of the suggested proposal. Finally, the proposed method is shown, by means of an exhaustive simulation campaign, to significantly reduce the delay over random access channels.
Comment: First submitted to IEEE Communications Magazine on Oct. 28, 2017; accepted on Aug. 18, 2019. This is the camera-ready version.
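As a rough illustration of the trade-off the paper studies, the Monte-Carlo sketch below lets one group leader attempt random access on behalf of its members, with members whose D2D link fails falling back to individual attempts; the group size, sensor count, preamble pool and reliability values are illustrative assumptions, not the paper's simulation parameters.

```python
import random

def trial(sensors=600, group_size=10, p_d2d=0.9, preambles=54):
    """One RA opportunity: leaders contend for their groups; members
    whose D2D delivery failed contend individually."""
    groups = sensors // group_size
    contenders = groups                  # one attempt per group leader
    for _ in range(sensors - groups):    # non-leader members
        if random.random() > p_d2d:      # D2D delivery failed
            contenders += 1              # fall back to own RA attempt
    picks = [random.randrange(preambles) for _ in range(contenders)]
    return sum(1 for s in picks if picks.count(s) == 1)

for p in (1.0, 0.9, 0.7):
    avg = sum(trial(p_d2d=p) for _ in range(200)) / 200
    print(f"D2D reliability {p:.1f}: avg collision-free attempts = {avg:.1f}")
```

Even this crude model shows how quickly the benefit of grouping erodes as D2D reliability drops, which is precisely the sensitivity the paper sets out to quantify.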