A Traffic Model for Machine-Type Communications Using Spatial Point Processes
A source traffic model for machine-to-machine communications is presented in
this paper. We consider a model in which devices operate in a regular mode
until they are triggered into an alarm mode by an alarm event. The positions of
devices and events are modeled by means of Poisson point processes, where the
traffic generated by a given device depends on its position and the event
positions. We first consider the case where devices and events are static and
devices generate traffic according to a Bernoulli process, where we derive the
total rate from the devices at the base station. We then extend the model by
defining a two-state Markov chain for each device, which allows for devices to
stay in alarm mode for a geometrically distributed holding time. The temporal
characteristics of this model are analyzed via the autocovariance function,
where the effects of event density and mean holding time are shown.

Comment: Accepted at the 2017 IEEE 28th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC) - Workshop WS-07 on "The Internet of Things (IoT), the Road Ahead: Applications, Challenges, and Solutions"
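The model described in this abstract lends itself to a small numerical sketch. The snippet below is illustrative only, not the paper's analysis: the densities, transmission probabilities, alarm radius `r_alarm`, and exit probability `q_exit` are all assumed values, and the paper derives the total rate at the base station analytically rather than by simulation.

```python
import math
import random

random.seed(0)

def poisson(lam, rng=random):
    """Sample a Poisson variate (Knuth's method; fine for moderate lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_ppp(intensity, side):
    """Homogeneous Poisson point process on a side x side square."""
    n = poisson(intensity * side * side)
    return [(random.uniform(0, side), random.uniform(0, side)) for _ in range(n)]

side = 10.0
devices = sample_ppp(0.5, side)    # device density: assumed value
events = sample_ppp(0.05, side)    # event density: assumed value

p_regular, p_alarm = 0.01, 0.5     # Bernoulli transmit probabilities (assumed)
r_alarm = 1.0                      # alarm-triggering radius (assumed)
q_exit = 0.2                       # per-slot exit prob -> geometric holding time

# a device starts in alarm mode if an event fell within r_alarm of it
alarm = [any(math.dist(d, e) < r_alarm for e in events) for d in devices]

total, T = 0, 100
for _ in range(T):
    for i in range(len(devices)):
        if random.random() < (p_alarm if alarm[i] else p_regular):
            total += 1                     # packet arrives at the base station
        if alarm[i] and random.random() < q_exit:
            alarm[i] = False               # geometric holding time elapses

print(total / T)                           # empirical total packet rate
```

The geometric holding time of the two-state Markov chain appears here as the per-slot exit probability `q_exit`; the empirical rate printed at the end is what the paper characterizes in closed form.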
On the performance of machine-type communications networks under Markovian arrival sources
Abstract. This thesis evaluates the reliability and latency performance of machine-type communication networks composed of a single transmitter and receiver over a Rayleigh fading channel. The source's traffic arrivals are modeled as Markovian processes, namely the Discrete-Time Markov process, the Fluid Markov process, the Discrete-Time Markov Modulated Poisson process, and the Continuous-Time Markov Modulated Poisson process, and delay/buffer overflow constraints are imposed. Our approach is based on the reliability and latency outage probabilities: since the transmitter does not know the channel condition, it transmits information at a fixed rate. The fixed-rate transmission is modeled as a two-state Discrete-Time Markov process, which identifies the reliability level of the wireless transmission. Using effective bandwidth and effective capacity theories, we evaluate the trade-off between reliability and latency and identify the QoS requirements. The impact of different source traffic originating from MTC devices under QoS constraints on the effective transmission rate is investigated.
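For a discrete-time Markov-modulated source like those above, the effective bandwidth has a spectral characterization (Chang's formula): a(θ) = (1/θ) log ρ(P diag(e^{θr})), where ρ is the spectral radius. The sketch below uses an assumed two-state ON-OFF source, not the thesis's parameters:

```python
import numpy as np

def effective_bandwidth(P, rates, theta):
    """Chang's effective bandwidth of a discrete-time Markov-modulated source:
    a(theta) = log(spectral radius of P @ diag(exp(theta * r))) / theta."""
    M = P @ np.diag(np.exp(theta * np.asarray(rates, dtype=float)))
    rho = max(abs(np.linalg.eigvals(M)))
    return float(np.log(rho) / theta)

# Two-state ON-OFF source; transition probabilities and rates are assumed values
P = np.array([[0.9, 0.1],   # OFF -> {OFF, ON}
              [0.3, 0.7]])  # ON  -> {OFF, ON}
rates = [0.0, 1.0]          # bits/slot emitted in each state

for theta in (0.01, 1.0, 5.0):
    print(theta, round(effective_bandwidth(P, rates, theta), 4))
```

As the QoS exponent θ grows, a(θ) moves from the mean rate (0.25 for this chain, whose stationary distribution is (0.75, 0.25)) toward the peak rate of 1.0 — precisely the reliability-latency trade-off the thesis evaluates.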
Learning and Management for Internet-of-Things: Accounting for Adaptivity and Scalability
Internet-of-Things (IoT) envisions an intelligent infrastructure of networked
smart devices offering task-specific monitoring and control services. The
unique features of IoT include extreme heterogeneity, a massive number of
devices, and unpredictable dynamics partially due to human interaction. These
call for foundational innovations in network design and management. Ideally, it
should allow efficient adaptation to changing environments, and low-cost
implementation scalable to a massive number of devices, subject to stringent
latency constraints. To this end, the overarching goal of this paper is to
outline a unified framework for online learning and management policies in IoT
through joint advances in communication, networking, learning, and
optimization. From the network architecture vantage point, the unified
framework leverages a promising fog architecture that enables smart devices to
have proximity access to cloud functionalities at the network edge, along the
cloud-to-things continuum. From the algorithmic perspective, key innovations
target online approaches adaptive to different degrees of nonstationarity in
IoT dynamics, and their scalable model-free implementation under limited
feedback that motivates blind or bandit approaches. The proposed framework
aspires to offer a stepping stone that leads to systematic designs and analysis
of task-specific learning and management schemes for IoT, along with a host of
new research directions to build on.

Comment: Submitted on June 15 to the Proceedings of the IEEE Special Issue on Adaptive and Scalable Communication Network
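The "blind or bandit approaches" mentioned above can be illustrated with the simplest instance, an ε-greedy multi-armed bandit. The sketch is purely illustrative: the arms, the Bernoulli rewards, and the fog interpretation (arms as candidate edge nodes, reward as successful low-latency service) are assumptions, not the paper's formulation.

```python
import random

random.seed(3)

def eps_greedy(means, T=5000, eps=0.1):
    """epsilon-greedy bandit under limited (bandit) feedback:
    only the chosen arm's reward is observed each round."""
    k = len(means)
    counts, est, total = [0] * k, [0.0] * k, 0.0
    for _ in range(T):
        if random.random() < eps:
            a = random.randrange(k)                     # explore
        else:
            a = max(range(k), key=lambda i: est[i])     # exploit
        r = 1.0 if random.random() < means[a] else 0.0  # Bernoulli reward
        counts[a] += 1
        est[a] += (r - est[a]) / counts[a]              # running mean estimate
        total += r
    return est, total / T

# arms as candidate fog nodes, reward = service success (illustrative numbers)
est, avg = eps_greedy([0.2, 0.5, 0.8])
print([round(e, 2) for e in est], round(avg, 3))
```

The point of the example is the limited-feedback structure: the learner never sees the rewards of unchosen arms, which is exactly what motivates bandit rather than full-information online learning in the IoT setting.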
Traffic Prediction Based Fast Uplink Grant for Massive IoT
This paper presents a novel framework for traffic prediction of IoT devices
activated by binary Markovian events. First, we consider a massive set of IoT
devices whose activation events are modeled by an On-Off Markov process with
known transition probabilities. Next, we exploit the temporal correlation of
the traffic events and apply the forward algorithm in the context of hidden
Markov models (HMM) in order to predict the activation likelihood of each IoT
device. Finally, we apply the fast uplink grant scheme in order to allocate
resources to the IoT devices that have the maximal likelihood for transmission.
In order to evaluate the performance of the proposed scheme, we define the
regret metric as the number of missed resource allocation opportunities. The
proposed fast uplink scheme based on traffic prediction outperforms both
conventional random access and time division duplex in terms of regret and
efficiency of system usage, while it maintains its superiority over random
access in terms of average age of information for massive deployments.

Comment: Accepted to IEEE International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC) 202
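In the two-state On-Off case, the HMM forward algorithm described above reduces to a scalar belief recursion. The sketch below is a simplified stand-in: the symmetric observation-noise model (`p_correct`) and all numbers are assumptions, while the known transition probabilities from the paper appear as `p_on` and `p_off`.

```python
def predict_activation(obs, p_on, p_off, p_correct=0.9, prior=0.5):
    """One-step activation prediction P(device ON at t+1 | obs[0..t]) via the
    HMM forward recursion, specialized to a two-state On-Off Markov chain.
    p_on = P(Off->On) and p_off = P(On->Off) are assumed known, as in the
    paper; the symmetric observation noise p_correct is an added assumption."""
    belief = prior                          # P(state = On)
    for o in obs:
        # measurement update (Bayes rule with the noisy observation)
        like_on = p_correct if o == 1 else 1 - p_correct
        like_off = 1 - p_correct if o == 1 else p_correct
        num = like_on * belief
        belief = num / (num + like_off * (1 - belief))
        # time update (one-step Markov prediction)
        belief = belief * (1 - p_off) + (1 - belief) * p_on
    return belief

# fast uplink grant: rank devices by predicted activation likelihood (toy data)
devices = {"d1": [1, 1, 1], "d2": [0, 0, 1], "d3": [0, 0, 0]}
scores = {d: predict_activation(o, p_on=0.1, p_off=0.3) for d, o in devices.items()}
grant = sorted(scores, key=scores.get, reverse=True)[:2]  # 2 resources available
print(grant)
```

Granting resources to the top-scoring devices is the fast-uplink-grant step; each device a grant is "wasted" on while a silent device transmits would count toward the regret metric defined in the abstract.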
Traffic classification and prediction, and fast uplink grant allocation for machine type communications via support vector machines and long short-term memory
Abstract. Current random access (RA) allocation techniques suffer from congestion and high signaling overhead while serving machine-type communication (MTC) applications. Therefore, 3GPP has introduced the need for fast uplink grant (FUG) allocation. This thesis proposes a novel FUG allocation based on support vector machines (SVM) and long short-term memory (LSTM). First, MTC devices are prioritized using an SVM classifier. Second, an LSTM architecture is used to predict the activation time of each device. Both results are used to build an efficient resource scheduler in terms of average latency and total throughput. Furthermore, a set of correction techniques is introduced to overcome classification and prediction errors. The Coupled Markov Modulated Poisson Process (CMMPP) traffic model is applied to compare the proposed FUG allocation to existing allocation techniques. In addition, an extended traffic model based on CMMPP is used to evaluate the proposed algorithm in a denser network. Our simulation results show that the proposed model outperforms existing RA allocation schemes, achieving the highest throughput and the lowest access delay when serving the target massive and critical MTC applications.
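The SVM prioritization stage can be sketched with a minimal linear SVM. The code below uses the Pegasos sub-gradient method on toy features (mean inter-arrival time and payload size, both invented for illustration); the thesis's actual classifier, feature set, and LSTM prediction stage are not reproduced here, and a practical implementation would normally use a library SVM.

```python
import random

random.seed(0)

def pegasos_train(X, y, lam=0.1, epochs=200):
    """Minimal linear SVM trained with the Pegasos sub-gradient method;
    a stand-in for the thesis's SVM prioritizer (which would typically be
    a library implementation such as scikit-learn's SVC)."""
    w, b, t = [0.0] * len(X[0]), 0.0, 0
    for _ in range(epochs):
        for i in random.sample(range(len(X)), len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            w = [(1 - eta * lam) * wj for wj in w]        # regularization step
            if margin < 1:                                # hinge-loss violation
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

# toy features (invented): [mean inter-arrival time (s), payload size (kB)]
# +1 = critical MTC (prioritized), -1 = massive MTC
X = [[0.1, 0.2], [0.2, 0.1], [5.0, 1.0], [4.0, 2.0]]
y = [1, 1, -1, -1]
w, b = pegasos_train(X, y)
scores = [sum(wj * xj for wj, xj in zip(w, x)) + b for x in X]
print([round(s, 2) for s in scores])   # critical devices score higher
```

In the thesis's pipeline these class scores would feed the scheduler together with the LSTM's predicted activation times; only the classification half is sketched here.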
Analysis of buffer allocations in time-dependent and stochastic flow lines
This thesis reviews and classifies the literature on the Buffer Allocation Problem under steady-state conditions and on performance evaluation approaches for queueing systems with time-dependent parameters. Subsequently, new performance evaluation approaches are developed. Finally, a local search algorithm for deriving time-dependent buffer allocations is proposed. The algorithm is based on numerically observed monotonicity properties of the system performance with respect to the time-dependent buffer allocations. Numerical examples illustrate that time-dependent buffer allocations are an effective way of minimizing the average WIP in the flow line while achieving a desired service level.
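The local search idea can be illustrated on a deliberately small stationary example. The sketch below simulates a three-machine Bernoulli flow line with two finite buffers (all parameters invented) and applies a pairwise-interchange local search over the buffer capacities; the thesis's time-dependent parameters and its WIP-minimization-under-service-level objective are omitted in this toy throughput-maximizing version.

```python
import random

def simulate(buffers, p=0.7, T=20000, seed=1):
    """Throughput of a three-machine Bernoulli flow line with two finite
    intermediate buffers (all parameters invented for illustration)."""
    rng = random.Random(seed)
    n = [0, 0]              # jobs currently in buffer 1 and buffer 2
    done = 0
    for _ in range(T):
        if n[1] > 0 and rng.random() < p:                        # machine 3
            n[1] -= 1
            done += 1
        if n[0] > 0 and n[1] < buffers[1] and rng.random() < p:  # machine 2
            n[0] -= 1
            n[1] += 1
        if n[0] < buffers[0] and rng.random() < p:   # machine 1, never starved
            n[0] += 1
    return done / T

def local_search(total=6):
    """Pairwise-interchange local search over the two buffer capacities."""
    alloc = [total // 2, total - total // 2]
    best = simulate(alloc)
    improved = True
    while improved:
        improved = False
        for i in range(2):
            j = 1 - i
            if alloc[i] > 0:
                cand = alloc[:]
                cand[i] -= 1
                cand[j] += 1
                th = simulate(cand)
                if th > best:            # move a buffer slot if it helps
                    alloc, best, improved = cand, th, True
    return alloc, best

alloc, th = local_search()
print(alloc, round(th, 3))
```

The fixed simulation seed makes the evaluation deterministic, so the interchange search terminates at a local optimum; the thesis instead exploits monotonicity properties of the performance measure to guide the same kind of neighborhood moves over time-dependent allocations.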