10,597 research outputs found
Massive Machine Type Communication with Data Aggregation and Resource Scheduling
To enable massive machine type communication (mMTC), data aggregation is a promising approach to reduce the congestion caused by a massive number of machine type devices (MTDs). In this paper, we consider a two-phase cellular-based mMTC network, where MTDs transmit to aggregators (i.e., aggregation phase) and the aggregated data is then relayed to base stations (i.e., relaying phase). Due to the limited resources, the aggregators not only aggregate data, but also schedule resources among MTDs. We consider two scheduling schemes: random resource scheduling (RRS) and channel-aware resource scheduling (CRS). By leveraging stochastic geometry, we present a tractable analytical framework to investigate the signal-to-interference ratio (SIR) for each phase, thereby computing the MTD success probability, the average number of successful MTDs, and the probability of successful channel utilization, which are the key metrics characterizing the overall mMTC performance. Our numerical results show that, although CRS outperforms RRS in terms of SIR at the aggregation phase, the simpler RRS achieves almost the same overall mMTC performance as CRS in most cases. Furthermore, providing more resources at the aggregation phase is not always beneficial to the mMTC performance. This work was supported by the Australian Research Council's Discovery Project Funding Scheme (Project number DP170100939).
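The two-phase structure described in the abstract implies that an MTD succeeds end-to-end only if both hops succeed. Treating the phases as independent (a simplification for illustration; the paper's stochastic-geometry framework handles the actual SIR statistics), the success probability can be sketched as:

```latex
% End-to-end MTD success probability under an independence
% assumption between the two phases (illustrative notation:
% \theta_1, \theta_2 are the SIR thresholds of the aggregation
% and relaying phases, respectively).
P_{\mathrm{succ}} \approx
  \mathbb{P}\!\left(\mathrm{SIR}_{\mathrm{agg}} > \theta_1\right)
  \cdot
  \mathbb{P}\!\left(\mathrm{SIR}_{\mathrm{rel}} > \theta_2\right)
```

The thresholds and phase labels are chosen here for illustration; the paper derives the per-phase SIR distributions rather than assuming given coverage probabilities.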
Fine-grained performance analysis of massive MTC networks with scheduling and data aggregation
Abstract. The Internet of Things (IoT) represents a substantial shift within wireless communication and constitutes a topic of broad social, economic, and technical impact. It refers to resource-constrained devices communicating with little or no human intervention. However, communication among machines imposes several challenges compared to traditional human type communication (HTC). Moreover, as the number of devices increases exponentially, different network management techniques and technologies are needed. Data aggregation is an efficient approach to handle the congestion introduced by a massive number of machine type devices (MTDs). The aggregators not only collect data but also implement scheduling mechanisms to cope with scarce network resources.
This thesis provides an overview of the most common IoT applications and the network technologies to support them. We describe the most important challenges in machine type communication (MTC). We use a stochastic geometry (SG) tool known as the meta distribution (MD) of the signal-to-interference ratio (SIR), which is the distribution of the conditional SIR distribution given the wireless nodes’ locations, to provide a fine-grained description of the per-link reliability. Specifically, we analyze the performance of two scheduling methods for data aggregation of MTC: random resource scheduling (RRS) and channel-aware resource scheduling (CRS). The results show the fraction of users in the network that achieves a target reliability, which is an important aspect to consider when designing wireless systems with stringent service requirements. Finally, the impact on the fraction of MTDs that communicate with a target reliability when increasing the aggregator density is investigated.
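The meta distribution referred to above has a standard definition in the stochastic-geometry literature, which can be stated as follows (notation chosen here for illustration):

```latex
% Conditional success probability of the typical link, given the
% point process \Phi of node locations:
P_s(\theta) = \mathbb{P}\left(\mathrm{SIR} > \theta \mid \Phi\right)

% Meta distribution: the complementary CDF of P_s(\theta) over
% network realizations, i.e. the fraction of links that meet the
% SIR threshold \theta with reliability at least x:
\bar{F}_{P_s}(\theta, x) = \mathbb{P}\left(P_s(\theta) > x\right),
\qquad x \in [0, 1]
```

Whereas the ordinary SIR distribution averages over both fading and node locations, the meta distribution keeps the locations fixed, which is what yields the fine-grained per-link reliability view the thesis describes.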
Data Aggregation and Packet Bundling of Uplink Small Packets for Monitoring Applications in LTE
In cellular massive Machine-Type Communications (MTC), a device can transmit
directly to the base station (BS) or through an aggregator (intermediate node).
While direct device-BS communication has recently been in the focus of 5G/3GPP
research and standardization efforts, the use of aggregators remains a less
explored topic. In this paper we analyze the deployment scenarios in which
aggregators can perform cellular access on behalf of multiple MTC devices. We
study the effect of packet bundling at the aggregator, which alleviates
overhead and resource waste when sending small packets. The aggregators give
rise to a tradeoff between access congestion and resource starvation and we
show that packet bundling can minimize resource starvation, especially for
smaller numbers of aggregators. Under the limitations of the considered model,
we investigate the optimal settings of the network parameters, in terms of
number of aggregators and packet-bundle size. Our results show that, in
general, data aggregation can benefit the uplink massive MTC in LTE, by
reducing the signalling overhead. Comment: to appear in IEEE Network.
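The overhead amortization that makes bundling attractive can be illustrated with simple back-of-the-envelope arithmetic. The function and all numbers below are hypothetical stand-ins, not the paper's model: they only show how bundling several small payloads under one grant reduces both the number of access attempts and the per-packet signalling cost.

```python
import math

def grants_needed(n_packets, payload, bundle_size, header):
    """Uplink grants and total bytes when small packets are bundled
    at an aggregator. Each grant carries one signalling header of
    `header` bytes plus up to `bundle_size` payloads of `payload`
    bytes. All figures are illustrative, not from the paper."""
    n_bundles = math.ceil(n_packets / bundle_size)
    bytes_sent = n_bundles * header + n_packets * payload
    return n_bundles, bytes_sent

# 1000 small packets of 20 B each, 40 B of signalling per grant:
no_bundle = grants_needed(1000, 20, 1, 40)   # -> (1000, 60000)
bundled = grants_needed(1000, 20, 10, 40)    # -> (100, 24000)
```

With a bundle size of 10, the same traffic needs one tenth of the access attempts and well under half the bytes, which mirrors the congestion/starvation tradeoff the abstract describes.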
SymbioCity: Smart Cities for Smarter Networks
The "Smart City" (SC) concept revolves around the idea of embodying
cutting-edge ICT solutions in the very fabric of future cities, in order to
offer new and better services to citizens while lowering the city management
costs, both in monetary, social, and environmental terms. In this framework,
communication technologies are perceived as subservient to the SC services,
providing the means to collect and process the data needed to make the services
function. In this paper, we propose a new vision in which technology and SC
services are designed to take advantage of each other in a symbiotic manner.
According to this new paradigm, which we call "SymbioCity", SC services can
indeed be exploited to improve the performance of the same communication
systems that provide them with data. Suggestive examples of this symbiotic
ecosystem are discussed in the paper. The dissertation is then substantiated in
a proof-of-concept case study, where we show how the traffic monitoring service
provided by the London Smart City initiative can be used to predict the density
of users in a certain zone and optimize the cellular service in that area. Comment: 14 pages, submitted for publication to ETT Transactions on Emerging Telecommunications Technologies.
Massive Non-Orthogonal Multiple Access for Cellular IoT: Potentials and Limitations
The Internet of Things (IoT) promises ubiquitous connectivity of everything
everywhere, which represents the biggest technology trend in the years to come.
It is expected that by 2020 over 25 billion devices will be connected to
cellular networks, far beyond the number of devices in current wireless
networks. Machine-to-Machine (M2M) communications aims at providing the
communication infrastructure for enabling IoT by facilitating the billions of
multi-role devices to communicate with each other and with the underlying data
transport infrastructure without, or with little, human intervention. Providing
this infrastructure will require a dramatic shift from the current protocols
mostly designed for human-to-human (H2H) applications. This article reviews
recent 3GPP solutions for enabling massive cellular IoT and investigates the
random access strategies for M2M communications, showing that cellular
networks must evolve to handle the new ways in which devices will connect and
communicate with the system. A massive non-orthogonal multiple access (NOMA)
technique is then presented as a promising solution to support a massive number
of IoT devices in cellular networks, where we also identify its practical
challenges and future research directions. Comment: To appear in IEEE Communications Magazine.
Deep Reinforcement Learning for Real-Time Optimization in NB-IoT Networks
NarrowBand-Internet of Things (NB-IoT) is an emerging cellular-based
technology that offers a range of flexible configurations for massive IoT radio
access from groups of devices with heterogeneous requirements. A configuration
specifies the amount of radio resource allocated to each group of devices for
random access and for data transmission. Assuming no knowledge of the traffic
statistics, there exists an important challenge in "how to determine the
configuration that maximizes the long-term average number of served IoT devices
at each Transmission Time Interval (TTI) in an online fashion". Given the
complexity of searching for optimal configuration, we first develop real-time
configuration selection based on the tabular Q-learning (tabular-Q), the Linear
Approximation based Q-learning (LA-Q), and the Deep Neural Network based
Q-learning (DQN) in the single-parameter single-group scenario. Our results
show that the proposed reinforcement learning based approaches considerably
outperform the conventional heuristic approaches based on load estimation
(LE-URC) in terms of the number of served IoT devices. This result also
indicates that LA-Q and DQN can be good alternatives for tabular-Q to achieve
almost the same performance with much less training time. We further advance
LA-Q and DQN via Actions Aggregation (AA-LA-Q and AA-DQN) and via Cooperative
Multi-Agent learning (CMA-DQN) for the multi-parameter multi-group scenario,
thereby solving the problem that Q-learning agents do not converge in
high-dimensional configurations. In this scenario, the superiority of the
proposed Q-learning approaches over the conventional LE-URC approach
significantly improves with the increase of configuration dimensions, and the
CMA-DQN approach outperforms the other approaches in both throughput and
training efficiency.
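The tabular Q-learning baseline the abstract starts from can be sketched in a few lines. The state/action spaces, hyperparameters, and reward below are toy stand-ins (the paper's reward is the number of served IoT devices per TTI); only the epsilon-greedy selection and the one-step Q-learning update are standard.

```python
import random

# Minimal tabular Q-learning sketch for online configuration selection.
# Three candidate resource configurations stand in for the paper's
# single-parameter single-group action space; everything here is
# illustrative, not the paper's actual model.
ACTIONS = [0, 1, 2]
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration

# Q-table over (state, action) pairs, initialized to zero.
Q = {(s, a): 0.0 for s in range(3) for a in ACTIONS}

def choose(state):
    """Epsilon-greedy action selection."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard one-step Q-learning update:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

The linear-approximation and DQN variants the abstract compares replace the explicit Q-table with a parameterized function, which is what removes the convergence problem in the high-dimensional multi-parameter multi-group setting.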