Task allocation in group of nodes in the IoT: A consensus approach
The realization of the Internet of Things (IoT) paradigm relies on the implementation of systems of cooperative intelligent objects with key interoperability capabilities. In order for objects to dynamically cooperate in the execution of IoT applications, they need to make their resources available in a flexible way. However, available resources, such as electrical energy, memory, processing, and an object's capability to perform a given task, are often limited. Therefore, resource allocation that ensures the fulfilment of network requirements is a critical challenge. In this paper, we propose a distributed optimization protocol, based on a consensus algorithm, to solve the problem of resource allocation and management in heterogeneous IoT networks. The proposed protocol is robust against link or node failures, making it adaptive in dynamic scenarios where the network topology changes at runtime. We consider an IoT scenario where nodes involved in the same IoT task need to adjust their task frequency and buffer occupancy. We demonstrate that, using the proposed protocol, the network converges to a solution where resources are homogeneously allocated among nodes. Performance evaluations in both simulation and real scenarios show that the algorithm converges with a percentage error of about ±5% with respect to the optimal allocation obtainable with a centralized approach.
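The consensus iteration the abstract alludes to can be sketched as a standard distributed averaging update; the topology, step size, and initial loads below are invented for illustration and are not taken from the paper:

```python
# Illustrative sketch of a distributed average-consensus update, the classic
# primitive behind consensus-based resource allocation (all values hypothetical).

def consensus_step(values, neighbors, eps=0.2):
    """One synchronous consensus iteration over an undirected graph.

    values:    dict node -> current local allocation (e.g. task frequency)
    neighbors: dict node -> list of neighboring nodes
    eps:       step size; must be below 1/max_degree for convergence
    """
    return {
        i: x + eps * sum(values[j] - x for j in neighbors[i])
        for i, x in values.items()
    }

# Ring of four nodes with heterogeneous initial loads.
nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
vals = {0: 8.0, 1: 2.0, 2: 6.0, 3: 4.0}
for _ in range(100):
    vals = consensus_step(vals, nbrs)
# Every node converges to the network average (5.0): a homogeneous allocation,
# reached using only local neighbor exchanges, with no central coordinator.
```

Because each step only uses neighbor values, the iteration keeps working when links or nodes drop out, which is the robustness property the abstract claims.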
Radio Resource Allocation in LTE-Advanced Cellular Networks with M2M Communications
Machine-to-machine (M2M) communications are expected to provide ubiquitous
connectivity between machines without the need of human intervention. To
support such a large number of autonomous devices, the M2M system architecture
needs to be extremely power and spectrally efficient. This article thus briefly
reviews the features of M2M services in the third generation (3G) long-term
evolution and its advancement (LTE-Advanced) networks. Architectural
enhancements are then presented for supporting M2M services in LTE-Advanced
cellular networks. To increase spectral efficiency, the same spectrum is
expected to be utilized for human-to-human (H2H) communications as well as M2M
communications. We therefore present various radio resource allocation schemes
and quantify their utility in LTE-Advanced cellular networks. System-level
simulation results are provided to demonstrate the effectiveness of
M2M communications in LTE-Advanced cellular networks.
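As a toy illustration of the H2H/M2M spectrum-sharing schemes such articles compare, one might sketch a reservation-based resource-block allocator; the policy and all parameters below are hypothetical, not taken from the article:

```python
# Hypothetical sketch of a priority-with-reservation resource-block (RB)
# scheduler for mixed H2H/M2M traffic: H2H users are served first, but a
# fraction of RBs is reserved so M2M devices are never starved.

def allocate_rbs(h2h_demand, m2m_demand, total_rbs, m2m_reserved_frac=0.2):
    """Return (h2h_rbs, m2m_rbs) granted in one scheduling interval."""
    reserved = int(total_rbs * m2m_reserved_frac)
    h2h = min(h2h_demand, total_rbs - reserved)     # H2H capped by reservation
    m2m = min(m2m_demand, total_rbs - h2h)          # M2M takes the remainder
    return h2h, m2m

# 50 RBs under heavy H2H load: M2M still receives its reserved share.
print(allocate_rbs(h2h_demand=60, m2m_demand=30, total_rbs=50))  # (40, 10)
```

The reservation fraction is the knob that trades H2H throughput against M2M access delay, which is the kind of utility trade-off such schemes are evaluated on.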
Application of Machine Learning in Wireless Networks: Key Techniques and Open Issues
As a key technique for enabling artificial intelligence, machine learning
(ML) is capable of solving complex problems without explicit programming.
Motivated by its successful applications to many practical tasks like image
recognition, both industry and the research community have advocated the
applications of ML in wireless communication. This paper comprehensively
surveys the recent advances of the applications of ML in wireless
communication, which are classified as: resource management in the MAC layer,
networking and mobility management in the network layer, and localization in
the application layer. The applications in resource management further include
power control, spectrum management, backhaul management, cache management,
beamformer design and computation resource management, while ML based
networking focuses on the applications in clustering, base station switching
control, user association and routing. Moreover, the literature on each
aspect is organized according to the adopted ML techniques. In addition, several
conditions for applying ML to wireless communication are identified to help
readers decide whether to use ML and which kind of ML techniques to use, and
traditional approaches are also summarized together with a performance
comparison against ML-based approaches, which clarifies the motivation of the
surveyed works for adopting ML. Given the extensiveness of the research
area, challenges and unresolved issues are presented to facilitate future
studies, where ML based network slicing, infrastructure update to support ML
based paradigms, open data sets and platforms for researchers, theoretical
guidance for ML implementation, and so on are discussed.
Comment: 34 pages, 8 figure
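Power control, the first ML application the survey lists, can be illustrated with a minimal stateless Q-learning loop; the action set, reward function, and hyperparameters below are invented for the sketch:

```python
# Toy Q-learning power-control loop, illustrating ML-based resource management
# of the kind the survey covers (reward model and parameters are invented).
import math
import random

random.seed(0)

powers = [0.1, 0.5, 1.0]      # discrete transmit power levels (W)
q = {p: 0.0 for p in powers}  # Q-values for a single-state (bandit) problem

def reward(p):
    # Invented utility: log-throughput gain minus a linear energy cost.
    return math.log(1 + 10 * p) - 1.5 * p

alpha, eps = 0.1, 0.2         # learning rate, exploration probability
for _ in range(2000):
    # Epsilon-greedy action selection, then incremental Q-value update.
    p = random.choice(powers) if random.random() < eps else max(q, key=q.get)
    q[p] += alpha * (reward(p) - q[p])

best = max(q, key=q.get)      # learned operating point (0.5 W here)
```

The same epsilon-greedy template extends to state-dependent problems (channel gains, interference levels) where tabular or deep Q-learning replaces the single row of Q-values.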
Intelligent Wireless Communications Enabled by Cognitive Radio and Machine Learning
The ability to intelligently utilize resources to meet the need of growing
diversity in services and user behavior marks the future of wireless
communication systems. Intelligent wireless communications aims at enabling the
system to perceive and assess the available resources, to autonomously learn to
adapt to the perceived wireless environment, and to reconfigure its operating
mode to maximize the utility of the available resources. The perception
capability and reconfigurability are the essential features of cognitive radio
while modern machine learning techniques project great potential in system
adaptation. In this paper, we discuss the development of the cognitive radio
technology and machine learning techniques and emphasize their roles in
improving spectrum and energy utility of wireless communication systems. We
describe the state-of-the-art of relevant techniques, covering spectrum sensing
and access approaches and powerful machine learning algorithms that enable
spectrum- and energy-efficient communications in dynamic wireless environments.
We also present practical applications of these techniques and identify further
research challenges in cognitive radio and machine learning as applied to
existing and future wireless communication systems.
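The simplest spectrum-sensing approach in this literature, the energy detector, can be sketched as follows; the signal model and threshold are illustrative, not taken from the paper:

```python
# Minimal energy-detector sketch for spectrum sensing: declare the channel
# busy when the average sample energy exceeds a threshold (values invented).
import random

random.seed(1)

def energy_detect(samples, threshold):
    """Return True (channel busy) if the mean sample energy exceeds threshold."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

# Noise-only samples: unit-variance Gaussian noise, expected energy ~= 1.
noise = [random.gauss(0, 1) for _ in range(4000)]
# Primary user present: same noise plus a strong signal, expected energy ~= 5.
busy = [random.gauss(0, 1) + 2.0 for _ in range(4000)]

print(energy_detect(noise, threshold=2.0))  # False: band looks idle
print(energy_detect(busy, threshold=2.0))   # True: primary user detected
```

In practice the threshold is set from the noise floor to hit a target false-alarm probability; the ML techniques the paper discusses refine exactly this kind of decision under uncertain noise statistics.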
Hypergraph Theory: Applications in 5G Heterogeneous Ultra-Dense Networks
Heterogeneous ultra-dense network (HUDN) can significantly increase the
spectral efficiency of cellular networks and cater for the explosive growth of
data traffic in the fifth-generation (5G) communications. Due to the dense
deployment of small cells (SCs), interference among neighboring cells becomes
severe. As a result, effective resource allocation and user association
algorithms are essential to minimize inter-cell interference and optimize
network performance. However, optimizing network resources in HUDN is extremely
complicated as resource allocation and user association are coupled. Therefore,
HUDN requires low-complexity but effective resource allocation schemes to
address these issues. Hypergraph theory has been recognized as a useful
mathematical tool to model the complex relations among multiple entities. In
this article, we show how the hypergraph models can be used to effectively
tackle resource allocation problems in HUDN. We also discuss several potential
research issues in this field.
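A minimal sketch of how a hypergraph can model multi-user interference for channel allocation, assuming the strict rule that no two users sharing a hyperedge may reuse a channel; the greedy heuristic and the instance are invented for illustration:

```python
# Greedy hypergraph coloring for channel allocation: each hyperedge groups
# users whose cumulative interference forbids channel reuse (toy instance).

def greedy_channel_assignment(users, hyperedges):
    """Assign each user the smallest channel unused by its hyperedge peers."""
    channel = {}
    for u in users:
        # Channels already taken by any already-assigned peer of u.
        taken = {channel[v]
                 for e in hyperedges if u in e
                 for v in e if v in channel}
        c = 0
        while c in taken:
            c += 1
        channel[u] = c
    return channel

# Hyperedge {0, 1, 2} captures a three-way interference relation that a plain
# pairwise graph model could not express with a single edge.
assign = greedy_channel_assignment(
    users=[0, 1, 2, 3],
    hyperedges=[{0, 1, 2}, {2, 3}],
)
print(assign)  # {0: 0, 1: 1, 2: 2, 3: 0}
```

User 3 reuses channel 0 because it only conflicts with user 2, which is the kind of spatial reuse the hypergraph formulation makes explicit.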
Relay Assisted Device-to-Device Communication: Approaches and Issues
Enabling technologies for 5G and future wireless communication have attracted
the interest of industry and research communities. One of such technologies is
Device-to-Device (D2D) communication which exploits user proximity to offer
spectral efficiency, energy efficiency and increased throughput. Data
offloading, public safety communication, context aware communication and
content sharing are some of the use cases for D2D communication. D2D
communication can be direct or through a relay depending on the nature of the
channel in between the D2D devices. Apart from the problem of interference, a
key challenge of relay aided D2D communication is appropriately assigning
relays to a D2D pair while maintaining the QoS requirement of the cellular
users. In this article, relay assisted D2D communication is reviewed and
research issues are highlighted. We also propose matching theory with
incomplete information for relay allocation, accounting for the uncertainties
that relay mobility introduces to the setup.
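Relay assignment via matching can be sketched with the classic deferred-acceptance (Gale-Shapley) algorithm; the preference lists below are invented stand-ins for link-quality rankings, and the complete-information setting is a simplification of the incomplete-information variant the abstract proposes:

```python
# Deferred-acceptance sketch for relay allocation: D2D pairs propose to relays
# in order of (hypothetical) estimated link quality; each relay keeps its best
# proposer, yielding a stable one-to-one matching.

def match_relays(pair_prefs, relay_prefs):
    """pair_prefs: pair -> ordered relays; relay_prefs: relay -> ordered pairs."""
    rank = {r: {p: i for i, p in enumerate(ps)} for r, ps in relay_prefs.items()}
    engaged = {}                       # relay -> currently accepted D2D pair
    free = list(pair_prefs)            # pairs still proposing
    nxt = {p: 0 for p in pair_prefs}   # index of next relay to propose to
    while free:
        p = free.pop()
        if nxt[p] >= len(pair_prefs[p]):
            continue                   # list exhausted: pair stays unmatched
        r = pair_prefs[p][nxt[p]]
        nxt[p] += 1
        if r not in engaged:
            engaged[r] = p
        elif rank[r][p] < rank[r][engaged[r]]:
            free.append(engaged[r])    # relay trades up; old pair re-proposes
            engaged[r] = p
        else:
            free.append(p)             # rejected; try the next relay
    return engaged

matching = match_relays(
    pair_prefs={"d1": ["r1", "r2"], "d2": ["r1", "r2"]},
    relay_prefs={"r1": ["d2", "d1"], "r2": ["d1", "d2"]},
)
print(matching)  # {'r1': 'd2', 'r2': 'd1'}
```

Both pairs prefer r1, but r1 ranks d2 higher, so d1 settles for r2; no pair-relay duo would jointly deviate, which is the stability notion matching-based allocation targets.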
The edge cloud: A holistic view of communication, computation and caching
The evolution of communication networks shows a clear shift of focus from
just improving the communications aspects to enabling new important services,
from Industry 4.0 to automated driving, virtual/augmented reality, Internet of
Things (IoT), and so on. This trend is evident in the roadmap planned for the
deployment of the fifth generation (5G) communication networks. This ambitious
goal requires a paradigm shift towards a vision that looks at communication,
computation and caching (3C) resources as three components of a single holistic
system. The further step is to bring these 3C resources closer to the mobile
user, at the edge of the network, to enable very low latency and high
reliability services. The scope of this chapter is to show that signal
processing techniques can play a key role in this new vision. In particular, we
motivate the joint optimization of 3C resources. Then we show how graph-based
representations can play a key role in building effective learning methods and
devising innovative resource allocation techniques.
Comment: to appear in the book "Cooperative and Graph Signal Processing:
Principles and Applications", P. Djuric and C. Richard Eds., Academic Press,
Elsevier, 201
Aqua Computing: Coupling Computing and Communications
The authors introduce a new vision for providing computing services for
connected devices. It is based on the key concept that future computing
resources will be coupled with communication resources, for enhancing user
experience of the connected users, and also for optimising resources in the
providers' infrastructures. Such coupling is achieved by Joint/Cooperative
resource allocation algorithms, by integrating computing and communication
services, and by integrating hardware in networks. This type of computing, in
which computing services are delivered not independently but in dependence on
networking services, is named Aqua Computing. The authors see Aqua Computing as
a novel approach for delivering computing resources to end devices, where the
computing power of the devices is enhanced automatically once they are
connected to an Aqua Computing enabled network. The process of resource
coupling is named computation dissolving. Then, an Aqua Computing architecture
is proposed for mobile edge networks, in which computing and wireless
networking resources are allocated jointly or cooperatively by a Mobile Cloud
Controller, for the benefit of the end-users and/or for the benefit of the
service providers. Finally, a working prototype of the system is shown, and
the gathered results demonstrate the performance of the Aqua Computing prototype.
Comment: A shorter version of this paper will be submitted to an IEEE magazin
White Paper on Critical and Massive Machine Type Communication Towards 6G
Society as a whole, and many vertical sectors in particular, is becoming
increasingly digitalized. Machine Type Communication (MTC), encompassing its
massive and critical aspects, and ubiquitous wireless connectivity are among
the main enablers of such digitization at large. The recently introduced 5G New
Radio is natively designed to support both aspects of MTC to promote the
digital transformation of the society. However, it is evident that some of the
more demanding requirements cannot be fully supported by 5G networks.
Alongside, further development of the society towards 2030 will give rise to
new and more stringent requirements on wireless connectivity in general, and
MTC in particular. Driven by the societal trends towards 2030, the next
generation (6G) will be an agile and efficient convergent network serving a set
of diverse service classes and a wide range of key performance indicators
(KPI). This white paper explores the main drivers and requirements of an
MTC-optimized 6G network, and discusses the following six key research
questions:
- Will the main KPIs of 5G continue to be the dominant KPIs in 6G; or will
there emerge new key metrics?
- How to deliver different E2E service mandates with different KPI
requirements considering joint-optimization at the physical up to the
application layer?
- What are the key enablers towards designing ultra-low power receivers and
highly efficient sleep modes?
- How to tackle a disruptive rather than incremental joint design of a
massively scalable waveform and medium access policy for global MTC
connectivity?
- How to support new service classes characterizing mission-critical and
dependable MTC in 6G?
- What are the potential enablers of long term, lightweight and flexible
privacy and security schemes considering MTC device requirements?
Comment: White paper by http://www.6GFlagship.co
Machine Intelligence Techniques for Next-Generation Context-Aware Wireless Networks
Next-generation wireless networks (i.e., 5G and beyond), which will be
extremely dynamic and complex due to the ultra-dense deployment of
heterogeneous networks (HetNets), pose many critical challenges for network
planning, operation, management and troubleshooting. At the same time,
generation and consumption of wireless data are becoming increasingly
distributed with ongoing paradigm shift from people-centric to machine-oriented
communications, making the operation of future wireless networks even more
complex. In mitigating the complexity of future network operation, new
approaches that intelligently utilize distributed computational resources with
improved context-awareness become extremely important. In this regard, the
emerging fog (edge) computing architecture, which aims to distribute computing,
storage, control, communication, and networking functions closer to end users,
has great potential for enabling efficient operation of future wireless
networks. This promising architecture makes the adoption of artificial
intelligence (AI) principles, which incorporate learning, reasoning, and
decision-making mechanisms, a natural choice for designing a tightly
integrated network. Towards this end, this article provides a comprehensive
survey on the utilization of AI integrating machine learning, data analytics
and natural language processing (NLP) techniques for enhancing the efficiency
of wireless network operation. In particular, we provide comprehensive
discussion on the utilization of these techniques for efficient data
acquisition, knowledge discovery, network planning, operation and management of
the next generation wireless networks. A brief case study utilizing the AI
techniques for this network has also been provided.
Comment: ITU Special Issue N.1 The impact of Artificial Intelligence (AI) on
communication networks and services, (To appear