Machine Learning in Wireless Sensor Networks: Algorithms, Strategies, and Applications
Wireless sensor networks monitor dynamic environments that change rapidly
over time. This dynamic behavior is either caused by external factors or
initiated by the system designers themselves. To adapt to such conditions,
sensor networks often adopt machine learning techniques to avoid unnecessary
redesign. Machine learning also inspires many practical
solutions that maximize resource utilization and prolong the lifespan of the
network. In this paper, we present an extensive literature review over the
period 2002-2013 of machine learning methods that were used to address common
issues in wireless sensor networks (WSNs). The advantages and disadvantages of
each proposed algorithm are evaluated against the corresponding problem. We
also provide a comparative guide to aid WSN designers in developing suitable
machine learning solutions for their specific application challenges.
Comment: Accepted for publication in IEEE Communications Surveys and Tutorials.
An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these data and to make decisions about the proper
functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. This increase in complexity is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks have substantial potential to support a broad range
of complex, compelling applications in both military and civilian fields,
where users are able to enjoy high-rate, low-latency, low-cost and
reliable information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making because of the
complex heterogeneous nature of the network structures and wireless services.
Machine learning (ML) algorithms have had great success in supporting big data
analytics, efficient parameter estimation and interactive decision making.
Hence, in this article, we review the thirty-year history of ML by elaborating
on supervised learning, unsupervised learning, reinforcement learning and deep
learning. Furthermore, we investigate their employment in the compelling
applications of wireless networks, including heterogeneous networks (HetNets),
cognitive radios (CR), Internet of things (IoT), machine to machine networks
(M2M), and so on. This article aims to help readers understand the motivation
and methodology of the various ML algorithms, so that they may be invoked for
hitherto unexplored services and scenarios in future wireless networks.
Comment: 46 pages, 22 figures.
Transfer Learning for Improving Model Predictions in Highly Configurable Software
Modern software systems are built to be used in dynamic environments using
configuration capabilities to adapt to changes and external uncertainties. In a
self-adaptation context, we are often interested in reasoning about the
performance of the systems under different configurations. Usually, we learn a
black-box model based on real measurements to predict the performance of the
system given a specific configuration. However, as modern systems become more
complex, many configuration parameters may interact, and we end up having to
learn over an exponentially large configuration space. Naturally, this does
not scale when relying on real measurements in the actual changing environment.
We propose a different solution: Instead of taking the measurements from the
real system, we learn the model using samples from other sources, such as
simulators that approximate performance of the real system at low cost. We
define a cost model that transforms the traditional view of model learning into
a multi-objective problem that takes into account not only model accuracy but
also measurement effort. We evaluate our cost-aware transfer learning
solution using real-world configurable software including (i) a robotic system,
(ii) 3 different stream processing applications, and (iii) a NoSQL database
system. The experimental results demonstrate that our approach can achieve
both (a) high prediction accuracy and (b) high model reliability.
Comment: To be published in the proceedings of the 12th International
Symposium on Software Engineering for Adaptive and Self-Managing Systems
(SEAMS'17).
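The cost-aware transfer learning idea above can be sketched concretely. The snippet below is a minimal illustration, not the paper's actual method: it assumes a hypothetical scalar performance metric, a cheap but systematically biased simulator (sim_measure) and an expensive real system (real_measure), trains a base regressor on abundant simulator samples, and learns a small correction from a handful of real measurements. All names and the linear-correction scheme are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

# Hypothetical response surfaces: an expensive real system and a cheap,
# systematically biased simulator (both invented for illustration).
def real_measure(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

def sim_measure(x):
    return 0.8 * np.sin(3 * x[:, 0]) + 0.4 * x[:, 1] ** 2 + 0.3

rng = np.random.default_rng(0)

# Many cheap simulator samples over the configuration space ...
X_sim = rng.uniform(0, 1, size=(500, 2))
base = RandomForestRegressor(random_state=0).fit(X_sim, sim_measure(X_sim))

# ... plus a handful of expensive real measurements to learn a correction.
X_real = rng.uniform(0, 1, size=(10, 2))
correction = LinearRegression().fit(
    base.predict(X_real).reshape(-1, 1), real_measure(X_real))

def predict(x):
    # Transfer model: simulator-trained base plus real-data correction.
    return correction.predict(base.predict(x).reshape(-1, 1))

X_test = rng.uniform(0, 1, size=(200, 2))
mse = np.mean((predict(X_test) - real_measure(X_test)) ** 2)
print(f"MSE with 10 real + 500 simulated samples: {mse:.4f}")
```

A cost model in the paper's sense would then weigh this prediction error against the number of real_measure calls, turning the choice of how many real samples to take into a multi-objective trade-off.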
Guardauto: A Decentralized Runtime Protection System for Autonomous Driving
Due to the broad attack surface and the lack of runtime protection, potential
safety and security threats hinder the real-life adoption of autonomous
vehicles. Although efforts have been made to mitigate some specific attacks,
there are few works on protecting the self-driving system as a whole. This paper
presents a decentralized self-protection framework called Guardauto to protect
the self-driving system against runtime threats. First, Guardauto proposes an
isolation model to decouple the self-driving system and isolate its components
with a set of partitions. Second, Guardauto provides self-protection mechanisms
for each target component, which combines different methods to monitor the
target execution and plan adaption actions accordingly. Third, Guardauto
provides cooperation among local self-protection mechanisms to identify the
root-cause component in the case of cascading failures affecting multiple
components. A prototype has been implemented and evaluated on the open-source
autonomous driving system Autoware. Results show that Guardauto can
effectively mitigate runtime failures and attacks, and protect the control
system with acceptable performance overhead.
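Though Guardauto's actual design is not reproduced here, the flavor of decentralized per-component monitoring with cooperative root-cause diagnosis can be sketched as follows. The component names, the deadline-based health check, the restart action, and the upstream-first blame rule are all illustrative assumptions, not Guardauto's implementation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ComponentMonitor:
    """Local self-protection for one isolated partition (illustrative)."""
    name: str
    deadline_s: float                 # max allowed gap between outputs
    last_output: float = field(default_factory=time.monotonic)
    healthy: bool = True

    def on_output(self):
        # Called whenever the monitored component produces output.
        self.last_output = time.monotonic()
        self.healthy = True

    def check(self):
        # Local detection: a missed deadline marks the component unhealthy.
        if time.monotonic() - self.last_output > self.deadline_s:
            self.healthy = False
        return self.healthy

    def adapt(self):
        # Local adaptation action; a real system might restart the partition.
        print(f"[{self.name}] restarting isolated partition")
        self.on_output()

def root_cause(monitors, dataflow):
    """Cooperative diagnosis: among unhealthy components, blame the one
    furthest upstream in the dataflow, since cascading failures propagate
    downstream (dataflow is listed upstream-first)."""
    unhealthy = {m.name for m in monitors if not m.check()}
    for name in dataflow:
        if name in unhealthy:
            return name
    return None

monitors = [ComponentMonitor("localization", 0.1),
            ComponentMonitor("planning", 0.2),
            ComponentMonitor("control", 0.05)]
dataflow = ["localization", "planning", "control"]

time.sleep(0.25)                      # simulate stalled outputs everywhere
culprit = root_cause(monitors, dataflow)
print("root cause:", culprit)         # -> localization
for m in monitors:
    if m.name == culprit:
        m.adapt()
```

In a cascading failure every downstream monitor trips as well; an upstream-first rule like this is one simple way to keep detection local and decentralized while still converging on a single component to recover.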