An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these network-generated data and to make decisions pertaining
to the proper functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. This increase in complexity is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.
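One of the survey's motivating use cases, fault management driven by signal quality indicators, can be illustrated with a minimal supervised-learning sketch. The features (OSNR, a pre-FEC BER exponent), the sample values, and the two-class labels below are invented for illustration and are not taken from the paper; a nearest-centroid classifier stands in for the broader ML toolbox the survey covers.

```python
# Hedged sketch: a nearest-centroid classifier that flags degraded optical
# links from two signal quality indicators. All numbers are illustrative.

def centroid(points):
    # Per-dimension mean of a list of feature tuples.
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(x, centroids):
    # Assign x to the class whose centroid is nearest (squared Euclidean).
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centroids[c])))

# Illustrative training samples: (OSNR in dB, -log10 of pre-FEC BER)
healthy = [(22.0, 9.0), (21.5, 8.5), (23.0, 9.5)]
degraded = [(14.0, 3.0), (13.5, 2.5), (15.0, 3.5)]
centroids = {"healthy": centroid(healthy), "degraded": centroid(degraded)}

print(classify((14.2, 3.1), centroids))  # a low-OSNR, high-BER link
```

In a real monitoring pipeline the features would come from network telemetry and a richer model (e.g., a neural network, as surveyed in the paper) would replace the centroid rule, but the train-then-classify structure is the same.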
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks hold substantial potential for supporting a broad
range of complex, compelling applications in both military and civilian
fields, where users are able to enjoy high-rate, low-latency, low-cost and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making because of the
complex heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have achieved great success in supporting big
data analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks (HetNets),
cognitive radios (CR), Internet of things (IoT), machine to machine networks
(M2M), and so on. This article aims to assist readers in understanding the
motivation and methodology of the various ML algorithms, so as to invoke them
for hitherto unexplored services and scenarios of future wireless
networks. (Comment: 46 pages, 22 figures)
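The reinforcement learning branch of the survey's taxonomy can be made concrete with a toy cognitive-radio example: an agent learns which channel is most often idle purely from trial-and-error rewards. The per-channel idle probabilities, learning rate, and exploration rate below are invented for illustration.

```python
import random

# Hedged sketch of a stateless (bandit-style) Q-learning agent selecting a
# wireless channel. Channel 2 is assumed idle most often; all probabilities
# are illustrative, not from the article.

random.seed(0)
CHANNELS = 3
FREE_PROB = [0.2, 0.3, 0.9]      # assumed per-channel idle probability
q = [0.0] * CHANNELS             # estimated reward per channel
alpha, eps = 0.1, 0.1            # learning rate, exploration rate

for step in range(2000):
    # Epsilon-greedy: mostly exploit the best estimate, sometimes explore.
    if random.random() < eps:
        a = random.randrange(CHANNELS)
    else:
        a = max(range(CHANNELS), key=q.__getitem__)
    reward = 1.0 if random.random() < FREE_PROB[a] else 0.0
    q[a] += alpha * (reward - q[a])  # incremental value update

best = max(range(CHANNELS), key=q.__getitem__)
print(best)
```

Full reinforcement learning adds state (e.g., observed interference levels) and a Q-table over state-action pairs, but the update rule and the exploration-exploitation trade-off are exactly what this sketch shows.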
Cross-Layer Optimization and Dynamic Spectrum Access for Distributed Wireless Networks
We propose a novel spectrum allocation approach for distributed cognitive radio networks. Cognitive radio systems are capable of sensing the prevailing environmental conditions and automatically adapting their operating parameters in order to enhance system and network performance. Using this technology, our proposed approach optimizes each individual wireless device and its single-hop communication links using the partial operating-parameter and environmental information available from adjacent devices within the wireless network. Assuming stationary wireless nodes, all wireless communication links employ non-contiguous orthogonal frequency division multiplexing (NC-OFDM) in order to enable dynamic spectrum access (DSA). The proposed approach attempts to simultaneously minimize the bit error rate, minimize out-of-band (OOB) interference, and maximize overall throughput using a multi-objective fitness function. Without loss of generality, genetic algorithms (GA) are employed to perform the actual optimization. Two generic optimization approaches, a subcarrier-wise approach and a block-wise approach, are proposed for accessing spectrum. We also propose and analyze several GA-based techniques, such as quantizing variables, using adaptive variable ranges, and multi-objective genetic algorithms, to speed up and improve the results of the proposed combined spectrum utilization/cross-layer optimization approaches, together with several assisting processes and modifications devised to improve the optimization's efficiency and execution time.
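The core idea of a GA driven by a multi-objective fitness function can be sketched in a few lines: evolve a binary subcarrier mask for NC-OFDM, rewarding throughput while penalizing transmission on occupied (primary-user) subcarriers as an interference proxy. This is not the authors' exact algorithm; the band occupancy, population sizes, and objective weights are illustrative.

```python
import random

# Hedged sketch of an elitist GA for NC-OFDM subcarrier selection.
# Fitness is a weighted scalarization of two objectives: maximize
# throughput (active subcarriers) and minimize interference with
# assumed-occupied bands. All parameters are illustrative.

random.seed(1)
N = 16
OCCUPIED = {4, 5, 6, 11}          # assumed primary-user subcarriers

def fitness(mask):
    throughput = sum(mask)
    interference = sum(mask[i] for i in OCCUPIED)
    return 1.0 * throughput - 5.0 * interference  # weighted multi-objective

def mutate(mask, p=0.1):
    # Flip each bit independently with probability p.
    return [b ^ (random.random() < p) for b in mask]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(20)]
for gen in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                # elitist selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(10)]

best = max(pop, key=fitness)
print(fitness(best))  # optimum is 12: all 12 free subcarriers on, none occupied
```

A true multi-objective GA (as named in the abstract) would maintain a Pareto front instead of collapsing the objectives into one weighted sum; the scalarization here is the simplest stand-in for that trade-off.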
Towards Cognitive Radio for emergency networks
Large parts of the assigned spectrum are underutilized, while the increasing number
of wireless multimedia applications leads to spectrum scarcity. Cognitive Radio
is an option for utilizing unused parts of the spectrum that are nominally assigned to primary
services. The benefits of Cognitive Radio are clear when it is used in emergency
situations. Current emergency services rely heavily on public networks. This is not
reliable in emergency situations, where the public networks can become overloaded. The
major limitation of emergency networks is spectrum scarcity, since multimedia data
in the emergency network require substantial radio resources. The idea of applying Cognitive
Radio to the emergency network is to alleviate this spectrum shortage
by dynamically accessing free spectrum resources. Cognitive Radio is able to operate
in different frequency bands and over various wireless channels, and it supports multimedia
services such as voice, data and video. A reconfigurable radio architecture is proposed
to enable the evolution from the traditional software defined radio to Cognitive Radio
- …
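The "dynamically accessing free spectrum" step that both cognitive-radio abstracts rely on starts with spectrum sensing. The simplest sensing method is energy detection: compare the average received energy in a band against a threshold. The signal amplitude, noise level, and threshold below are invented for illustration, not taken from either paper.

```python
import math
import random

# Hedged sketch of energy-detection spectrum sensing, the simplest test a
# Cognitive Radio can run to decide whether a band is free. All levels
# and the threshold are illustrative.

random.seed(2)

def band_energy(samples):
    # Average power of the received samples.
    return sum(s * s for s in samples) / len(samples)

def sense(noise_std=1.0, signal_amp=0.0, n=500):
    # Received samples: optional primary-user tone plus Gaussian noise.
    samples = [signal_amp * math.sin(0.3 * i) + random.gauss(0.0, noise_std)
               for i in range(n)]
    return band_energy(samples)

threshold = 1.5  # assumed detection threshold between the two cases below
idle = sense(signal_amp=0.0)   # noise only: energy near noise variance (~1.0)
busy = sense(signal_amp=2.0)   # tone present: adds ~2.0 of signal power
print(idle < threshold, busy > threshold)
```

Energy detection needs no knowledge of the primary signal's structure, which is why it is a common baseline; its weakness is that the threshold depends on an accurate noise-power estimate, which motivates the more adaptive, ML-assisted sensing discussed in the surveys above.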