Analysis of LoRaWAN Uplink with Multiple Demodulating Paths and Capture Effect
Low power wide area networks (LPWANs), such as those based on the LoRaWAN
protocol, are seen as enablers of a large number of IoT applications and
services. In this work, we assess the scalability of LoRaWAN by analyzing the
frame success probability (FSP) of a LoRa frame while taking into account the
capture effect and the number of parallel demodulation paths of the receiving
gateway. We have based our model on the commonly used SX1301 gateway chipset,
which is capable of demodulating up to eight frames simultaneously; however,
the results of the model can be generalized to architectures with an arbitrary
number of demodulation paths. We have also introduced and investigated three
policies for Spreading Factor (SF) allocation. Each policy is evaluated in
terms of coverage probability, FSP, and throughput. The overall conclusion is
that the presence of multiple demodulation paths introduces a significant
change in the analysis and performance of LoRa random access schemes.
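As a rough illustration of the demodulation-path limitation described above, the following sketch estimates the fraction of frames that find a free path at an SX1301-like gateway, assuming Poisson frame arrivals and a fixed airtime. This is a simplification of our own making: capture and co-SF interference, which the paper's model does account for, are deliberately left out here.

```python
# Minimal Monte-Carlo sketch: a frame is lost only when all gateway
# demodulation paths are busy on arrival (assumption; the paper's model
# additionally covers capture and co-SF collisions).
import heapq
import random

def frame_success_probability(rate, airtime, n_paths=8, n_frames=100_000, seed=1):
    """Estimate FSP for a gateway with n_paths parallel demodulators."""
    rng = random.Random(seed)
    busy_until = []          # min-heap of times at which paths become free
    t, success = 0.0, 0
    for _ in range(n_frames):
        t += rng.expovariate(rate)             # next Poisson arrival
        while busy_until and busy_until[0] <= t:
            heapq.heappop(busy_until)          # release finished paths
        if len(busy_until) < n_paths:          # a demodulation path is free
            heapq.heappush(busy_until, t + airtime)
            success += 1
    return success / n_frames

# Example (illustrative numbers): ~60 ms frames arriving at 50 frames/s
# through an SX1301-like gateway with 8 paths.
print(frame_success_probability(rate=50.0, airtime=0.06))
```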
Data Aggregation and Packet Bundling of Uplink Small Packets for Monitoring Applications in LTE
In cellular massive Machine-Type Communications (MTC), a device can transmit
directly to the base station (BS) or through an aggregator (intermediate node).
While direct device-BS communication has recently been the focus of 5G/3GPP
research and standardization efforts, the use of aggregators remains a less
explored topic. In this paper we analyze the deployment scenarios in which
aggregators can perform cellular access on behalf of multiple MTC devices. We
study the effect of packet bundling at the aggregator, which alleviates
overhead and resource waste when sending small packets. The aggregators give
rise to a tradeoff between access congestion and resource starvation, and we
show that packet bundling can minimize resource starvation, especially for
smaller numbers of aggregators. Under the limitations of the considered model,
we investigate the optimal settings of the network parameters, in terms of the
number of aggregators and the packet-bundle size. Our results show that, in
general, data aggregation can benefit the uplink of massive MTC in LTE by
reducing the signalling overhead.
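To make the congestion/starvation tradeoff concrete, here is a hypothetical back-of-the-envelope model; the payload and overhead sizes, and the cost function itself, are our illustrative assumptions, not the paper's model.

```python
# Illustrative sketch of the aggregation tradeoff: without aggregation every
# device pays the per-transmission signalling overhead; with aggregation the
# overhead is amortized over each bundle, at the cost of fewer, larger
# access attempts. All numbers below are assumptions for demonstration.
def uplink_cost(n_devices, n_aggregators, bundle_size,
                payload=20, overhead=60):
    """Return (random-access attempts, total bytes) per reporting round."""
    if n_aggregators == 0:                       # direct device-BS access
        return n_devices, n_devices * (payload + overhead)
    per_agg = -(-n_devices // n_aggregators)     # devices per aggregator (ceil)
    bundles = -(-per_agg // bundle_size)         # bundles per aggregator (ceil)
    attempts = n_aggregators * bundles           # access attempts at the BS
    data = n_devices * payload + attempts * overhead
    return attempts, data

for n_agg in (0, 10, 50, 200):
    print(n_agg, uplink_cost(10_000, n_agg, bundle_size=8))
```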
What Can Wireless Cellular Technologies Do about the Upcoming Smart Metering Traffic?
The introduction of smart electricity meters with cellular radio interfaces
puts an additional load on wireless cellular networks. Currently, these
meters are designed for low-duty-cycle billing and occasional system checks,
which generate low-rate sporadic traffic. As the number of distributed
energy resources increases, the household power will become more variable and
thus unpredictable from the viewpoint of the Distribution System Operator
(DSO). It is therefore expected that, in the near future, an increased
number of Wide Area Measurement System (WAMS) devices with Phasor Measurement
Unit (PMU)-like capabilities will appear in the distribution grid, allowing
the utilities to monitor the low-voltage grid quality while providing the
information required for tighter grid control. From a communication
standpoint, the traffic profile will change drastically towards higher data
volumes and higher rates per device. In this paper, we characterize the
current traffic generated by smart electricity meters and supplement it with
the potential traffic requirements brought by introducing enhanced Smart
Meters, i.e., meters with PMU-like capabilities. Our study shows how GSM/GPRS
and LTE cellular systems perform under the current and next-generation smart
meter traffic, where it is clearly seen that the PMU data will seriously
challenge these wireless systems. We conclude by highlighting possible
solutions for upgrading the cellular standards in order to cope with the
upcoming smart metering traffic.
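A quick calculation shows why the traffic profile changes so drastically. With illustrative payload sizes and reporting intervals (our assumptions, not the paper's measured values), the aggregate uplink load of PMU-like meters dwarfs that of billing meters:

```python
# Rough aggregate-load comparison; payloads and intervals are assumptions.
def aggregate_load(n_meters, payload_bytes, interval_s):
    """Average uplink load in kbit/s generated by n_meters periodic reporters."""
    return n_meters * payload_bytes * 8 / interval_s / 1e3

n = 10_000  # meters in one cell (assumption)
print(f"billing meters (1 report/day): {aggregate_load(n, 100, 86_400):10.2f} kbit/s")
print(f"PMU-like meters (1 report/s) : {aggregate_load(n, 100, 1.0):10.2f} kbit/s")
```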
Scatter of mass changes estimates at basin scale for Greenland and Antarctica
During the last decade, the GRACE mission has provided valuable data for determining the mass changes of the Greenland and Antarctic ice sheets. Yet discrepancies still exist in the published mass balance results, and comprehensive analyses of the sources of errors and discrepancies are lacking. Here, we present monthly mass changes together with trends derived from GRACE data at basin scale for both the Greenland and Antarctic ice sheets, and we assess, in a systematic way, the variability and errors for each of the possible sources of discrepancies, taking into account mass inference methods, data sets and background models. We find very good agreement between the monthly mass change results derived from two independent methods, which represents a cross-validation. For the monthly solutions, we find that most of the scatter is caused by the use of the two different data sets rather than the two different methods applied. Besides the well-known GIA trend uncertainty, we find that the geocenter motion and the recent de-aliasing corrections significantly impact the trends, with contributions of +13.2 Gt yr⁻¹ and −20 Gt yr⁻¹, respectively, for Antarctica, which is more affected by these than Greenland. We show differences between the use of release RL04 and the new RL05 and confirm a lower noise content in the new release. The overall scatter of the solutions well exceeds the uncertainties propagated from the data errors and the leakage (as was done in the past); hence we calculate new, sound total errors for the monthly solutions and the trends. We find that the scatter in the monthly solutions caused by applying different estimates of geocenter motion time series (degree-1 corrections) is significant, contributing up to 40% of the total error. For the whole GRACE period (2003–2011), our trend estimate for Greenland is −234 ± 20 Gt yr⁻¹, and −83 ± 36 Gt yr⁻¹ for Antarctica (−111 ± 15 Gt yr⁻¹ in the western part). We also find a clear (with respect to our errors) increase in mass loss over the last four years.
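For readers unfamiliar with how such trends are obtained, the sketch below fits a bias, a linear trend, and annual/semiannual harmonics to a monthly mass series by least squares, which is the standard parametrization for GRACE-type time series; the synthetic data and numbers are illustrative only, not the paper's solutions.

```python
# Least-squares trend fit for a GRACE-like monthly mass series (sketch).
# The synthetic series below stands in for a real basin-scale solution.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 9, 1 / 12)                       # 2003-2011 in years
mass = -234 * t + 50 * np.sin(2 * np.pi * t) + rng.normal(0, 30, t.size)

# Design matrix: [bias, trend, annual cos/sin, semiannual cos/sin]
w = 2 * np.pi
A = np.column_stack([np.ones_like(t), t,
                     np.cos(w * t), np.sin(w * t),
                     np.cos(2 * w * t), np.sin(2 * w * t)])
coef, res, *_ = np.linalg.lstsq(A, mass, rcond=None)

# Formal 1-sigma error of the trend from the residual variance
sigma2 = res[0] / (t.size - A.shape[1])
cov = sigma2 * np.linalg.inv(A.T @ A)
print(f"trend = {coef[1]:.1f} +/- {np.sqrt(cov[1, 1]):.1f} Gt/yr")
```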
Machine Learning Methods for Monitoring of Quasi-Periodic Traffic in Massive IoT Networks
One of the central problems in massive Internet of Things (IoT) deployments
is the monitoring of the status of a massive number of links. The problem is
aggravated by the irregularity of the traffic transmitted over the link, as
traffic intermittency can be disguised as a link failure and vice versa. In
this work we present a traffic model for IoT devices running quasi-periodic
applications, together with both supervised and unsupervised machine learning
methods for monitoring the network performance of IoT deployments with
quasi-periodic reporting, such as smart metering, environmental monitoring and
agricultural monitoring. The unsupervised methods are based on the Lomb-Scargle
periodogram, an approach developed by astronomers for estimating the spectral
density of unevenly sampled time series, as sketched below.
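A minimal sketch of the Lomb-Scargle idea: recover the dominant period of an unevenly sampled quasi-periodic signal. The synthetic 15-minute reporting trace below is an assumption for illustration, and scipy's implementation is used in place of whatever the paper implements.

```python
# Recover a reporting period from unevenly sampled observations using the
# Lomb-Scargle periodogram (scipy.signal.lombscargle).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(42)
true_period = 900.0                          # 15-minute reporting cycle, in s
t = np.sort(rng.uniform(0, 86_400, 500))     # uneven observation times, one day
y = np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.5, t.size)

periods = np.linspace(60, 7200, 4000)        # candidate periods: 1 min .. 2 h
omega = 2 * np.pi / periods                  # angular frequencies to scan
power = lombscargle(t, y - y.mean(), omega)
print(f"estimated reporting period: {periods[np.argmax(power)]:.0f} s")
```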
5G NB-IoT via Low Density LEO Constellations
5G NB-IoT is seen as a key technology for providing truly ubiquitous, global 5G coverage (1,000,000 devices/km²) for machine-type communications in the Internet of Things. A non-terrestrial network (NTN) variant of NB-IoT is being standardized in the 3GPP, which, along with inexpensive and non-complex chipsets, enables the production of competitively priced IoT devices with truly global coverage. NB-IoT allows for narrowband single-carrier transmissions in the uplink, which improves the uplink link budget by as much as 16.8 dB over the 180 kHz downlink. This allows for a range sufficient for ground-to-low-earth-orbit (LEO) communication without the need for complex and expensive antennas in the IoT devices. In this paper we analyze the feasibility of 5G NB-IoT in the context of low-density constellations of small satellites carrying base stations in LEO, and we discuss the required adaptations to NB-IoT.
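The 16.8 dB figure can be sanity-checked from bandwidth alone: NB-IoT's single-tone uplink concentrates the transmit power in one 3.75 kHz subcarrier (a standard NB-IoT uplink mode) instead of spreading it over the 180 kHz carrier, raising the power spectral density by 10·log10(180/3.75) dB.

```python
# Quick check of the 16.8 dB uplink gain quoted above.
import math

downlink_bw = 180e3     # Hz, NB-IoT carrier bandwidth
uplink_tone = 3.75e3    # Hz, NB-IoT single-tone subcarrier spacing
gain_db = 10 * math.log10(downlink_bw / uplink_tone)
print(f"uplink PSD gain: {gain_db:.1f} dB")   # -> 16.8 dB
```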