A Traffic Model for Machine-Type Communications Using Spatial Point Processes
A source traffic model for machine-to-machine communications is presented in
this paper. We consider a model in which devices operate in a regular mode
until they are triggered into an alarm mode by an alarm event. The positions of
devices and events are modeled by means of Poisson point processes, where the
traffic generated by a given device depends on its position and the positions
of events. We first consider the case where devices and events are static and
devices generate traffic according to a Bernoulli process, for which we derive
the total traffic rate from the devices at the base station. We then extend the model by
defining a two-state Markov chain for each device, which allows for devices to
stay in alarm mode for a geometrically distributed holding time. The temporal
characteristics of this model are analyzed via the autocovariance function,
where the effects of event density and mean holding time are shown.
Comment: Accepted at the 2017 IEEE 28th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC), Workshop WS-07 on "The Internet of Things (IoT), the Road Ahead: Applications, Challenges, and Solutions".
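To make the model concrete, here is a minimal Python sketch of the two-state (regular/alarm) traffic model described above. All numerical parameters are illustrative assumptions, and for brevity the position-dependent triggering of the paper is replaced by a uniform per-slot triggering probability; the Poisson point process only determines the device count here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not taken from the paper)
area = 1000.0 * 1000.0     # m^2 observation window
lambda_dev = 1e-4          # device density (devices per m^2)
p_regular = 0.01           # per-slot Bernoulli transmit prob. in regular mode
p_alarm = 0.5              # per-slot transmit prob. in alarm mode
p_trigger = 0.005          # per-slot prob. that a regular device is triggered
p_release = 0.1            # alarm holding time ~ Geometric(p_release)
slots = 1000

n_dev = rng.poisson(lambda_dev * area)   # device count drawn from the PPP
alarm = np.zeros(n_dev, dtype=bool)      # two-state chain: regular/alarm
total_rate = np.empty(slots)

for t in range(slots):
    # Transmissions this slot: Bernoulli with state-dependent probability
    p_tx = np.where(alarm, p_alarm, p_regular)
    total_rate[t] = (rng.random(n_dev) < p_tx).sum()
    # State transitions of the two-state Markov chain
    triggered = ~alarm & (rng.random(n_dev) < p_trigger)
    released = alarm & (rng.random(n_dev) < p_release)
    alarm = (alarm | triggered) & ~released

# Empirical autocovariance of the aggregate traffic at a fixed lag
lag = 10
x = total_rate - total_rate.mean()
print(f"mean rate: {total_rate.mean():.1f} packets/slot, "
      f"autocov(lag={lag}): {(x[:-lag] * x[lag:]).mean():.2f}")
```

With p_release as the alarm-exit probability, the holding time is geometric with mean 1/p_release, which is how the mean holding time enters the temporal correlation of the aggregate traffic.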
Data Aggregation and Packet Bundling of Uplink Small Packets for Monitoring Applications in LTE
In cellular massive Machine-Type Communications (MTC), a device can transmit
directly to the base station (BS) or through an aggregator (intermediate node).
While direct device-BS communication has recently been the focus of 5G/3GPP
research and standardization efforts, the use of aggregators remains a less
explored topic. In this paper we analyze the deployment scenarios in which
aggregators can perform cellular access on behalf of multiple MTC devices. We
study the effect of packet bundling at the aggregator, which alleviates
overhead and resource waste when sending small packets. The aggregators give
rise to a tradeoff between access congestion and resource starvation, and we
show that packet bundling can minimize resource starvation, especially for
smaller numbers of aggregators. Under the limitations of the considered model,
we investigate the optimal settings of the network parameters, in terms of
number of aggregators and packet-bundle size. Our results show that, in
general, data aggregation can benefit the uplink massive MTC in LTE, by
reducing the signalling overhead.
Comment: To appear in IEEE Network.
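As a back-of-the-envelope illustration of why bundling helps, the following sketch compares total bytes on the air with and without bundling; the payload and overhead sizes are assumptions chosen only for illustration, not values from the paper.

```python
# Minimal sketch of overhead amortization through packet bundling.
# All byte counts below are illustrative assumptions.

payload = 20       # bytes per MTC report (assumed)
overhead = 40      # bytes of per-transmission protocol overhead (assumed)
n_devices = 10000

def bytes_on_air(bundle_size: int) -> int:
    """Total bytes sent when packets are bundled `bundle_size` at a time."""
    n_bundles = -(-n_devices // bundle_size)   # ceiling division
    return n_bundles * (overhead + bundle_size * payload)

for k in (1, 5, 20):
    print(f"bundle size {k:2d}: {bytes_on_air(k):8d} bytes on air")
```

Because the fixed per-transmission overhead is amortized over `bundle_size` payloads, larger bundles shrink the overhead share of each transmission, which is the effect the paper trades off against access congestion and starvation.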
An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these data and to make decisions pertaining to the proper
functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. This increase in complexity is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.
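As a toy illustration of the kind of ML-based fault management the survey discusses, the sketch below trains an off-the-shelf classifier on synthetic signal-quality indicators; all feature distributions are invented for the example and carry no relation to real optical-link data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic signal-quality indicators (illustrative assumptions):
# pre-FEC BER, OSNR and received power for healthy vs. degraded links.
n = 2000
healthy = np.column_stack([
    rng.normal(1e-4, 3e-5, n),   # pre-FEC BER
    rng.normal(20.0, 1.5, n),    # OSNR (dB)
    rng.normal(-12.0, 1.0, n),   # Rx power (dBm)
])
degraded = np.column_stack([
    rng.normal(5e-4, 1e-4, n),
    rng.normal(15.0, 2.0, n),
    rng.normal(-18.0, 2.0, n),
])
X = np.vstack([healthy, degraded])
y = np.r_[np.zeros(n), np.ones(n)]   # 0 = healthy, 1 = degraded

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```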
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks hold substantial potential for supporting a broad
range of complex, compelling applications in both military and civilian
fields, where users are able to enjoy high-rate, low-latency, low-cost and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making because of the
complex heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have achieved great success in supporting big
data analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks (HetNets),
cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M)
networks, and so on. This article aims to assist readers in clarifying the
motivation and methodology of the various ML algorithms, so that they can be
invoked for hitherto unexplored services and scenarios in future wireless
networks.
Comment: 46 pages, 22 figures.
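As a minimal, hypothetical example of the reinforcement-learning flavor of decision making reviewed in the article, the sketch below uses a stateless Q-learning (multi-armed bandit) update to pick a transmission channel; the channel success probabilities are assumptions invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stateless Q-learning for dynamic channel selection (illustrative
# only; the success probabilities are unknown to the learner).
p_success = np.array([0.2, 0.5, 0.8])
n_channels = len(p_success)

q = np.zeros(n_channels)    # action-value estimate per channel
alpha, epsilon = 0.1, 0.1   # learning rate and exploration rate

for step in range(5000):
    # epsilon-greedy action selection
    if rng.random() < epsilon:
        a = int(rng.integers(n_channels))
    else:
        a = int(np.argmax(q))
    reward = float(rng.random() < p_success[a])  # 1 if transmission succeeds
    q[a] += alpha * (reward - q[a])              # incremental value update

print("learned values:", np.round(q, 2),
      "-> best channel:", int(np.argmax(q)))
```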
Optimisation of Mobile Communication Networks - OMCO NET
The mini-conference "Optimisation of Mobile Communication Networks" focuses on advanced methods for search and optimisation applied to wireless communication networks. It is sponsored by the Research & Enterprise Fund of Southampton Solent University.
The conference strives to widen knowledge of advanced search methods capable of optimising wireless communication networks. The aim is to provide a forum for the exchange of recent knowledge, new ideas and trends in this progressive and challenging area. The conference will popularise new successful approaches to resolving hard tasks such as minimisation of transmit power, and cooperative and optimal routing.
2D Proactive Uplink Resource Allocation Algorithm for Event Based MTC Applications
We propose a two-dimensional (2D) proactive uplink resource allocation
(2D-PURA) algorithm that aims to reduce the delay/latency in event-based
machine-type communications (MTC) applications. Specifically, when an event of
interest occurs at a device, it tends to spread to the neighboring devices.
Consequently, when a device has data to send to the base station (BS), its
neighbors are highly likely to transmit soon afterwards. Thus, we propose to cluster
devices in the neighborhood around the event, also referred to as the
disturbance region, into rings based on the distance from the original event.
To reduce the uplink latency, we then proactively allocate resources for these
rings. To evaluate the proposed algorithm, we analytically derive the mean
uplink delay, the proportion of resource conservation due to successful
allocations, and the proportion of uplink resource wastage due to unsuccessful
allocations for 2D-PURA algorithm. Numerical results demonstrate that the
proposed method can reduce the mean uplink delay by over 16.5 and 27 percent,
compared with the 1D algorithm and the standard method, respectively.
Comment: 6 pages, 6 figures. Published in the 2018 IEEE Wireless Communications and Networking Conference (WCNC).
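The following sketch illustrates the ring-clustering idea behind 2D-PURA as described in the abstract: devices are grouped into concentric rings by distance from the event, and uplink grants are pre-allocated ring by ring. The ring width, device counts and one-grant-per-device rule are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative setup (assumed values, not the paper's)
event = np.array([0.0, 0.0])                 # event location
devices = rng.uniform(-500, 500, (200, 2))   # device positions (m)
ring_width = 100.0                           # ring width (m, assumed)

# Ring index per device: distance from the event, quantized by ring width
dist = np.linalg.norm(devices - event, axis=1)
ring = (dist // ring_width).astype(int)

# Proactively reserve one uplink grant per device, scheduled ring by ring:
# inner rings (closer to the event) are expected to transmit first.
for r in sorted(set(ring)):
    n = int((ring == r).sum())
    print(f"ring {r}: pre-allocate {n} uplink grants")
```

Scheduling inner rings before outer ones mirrors the intuition in the abstract: the disturbance spreads outward from the event, so nearer devices need their grants sooner.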