ILCAS: Imitation Learning-Based Configuration-Adaptive Streaming for Live Video Analytics with Cross-Camera Collaboration
High-accuracy but resource-intensive deep neural networks (DNNs) have been widely adopted by live video analytics (VA), where camera videos are streamed over the network to resource-rich edge/cloud servers for DNN inference. Common video encoding configurations (e.g., resolution and frame rate) have a significant impact on the balance between bandwidth consumption and inference accuracy, so their adaptation scheme has been a focus of optimization. However, previous profiling-based solutions suffer from high profiling cost, while existing deep reinforcement learning (DRL) based solutions may perform poorly because they train the agent with a fixed reward function, which fails to capture the application goals in various scenarios. In this paper, we propose ILCAS, the first imitation learning (IL) based configuration-adaptive VA streaming system. Unlike DRL-based solutions, ILCAS trains the agent with demonstrations collected from an expert, which is designed as an offline optimal policy that solves the configuration adaptation problem through dynamic programming. To tackle the challenge of video content dynamics, ILCAS derives motion feature maps from motion vectors, which allow ILCAS to visually "perceive" video content changes. Moreover, ILCAS incorporates a cross-camera collaboration scheme to exploit the spatio-temporal correlations among cameras for more appropriate configuration selection. Extensive experiments confirm the superiority of ILCAS over state-of-the-art solutions, with a 2-20.9% improvement in mean accuracy and a 19.9-85.3% reduction in chunk upload lag.

Comment: This work has been submitted to the IEEE Transactions on Mobile Computing for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
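The offline expert described above solves the per-chunk configuration choice by dynamic programming. As a rough illustration of that idea (not ILCAS's actual formulation), the sketch below picks one encoding configuration per video chunk to maximize accuracy minus a lag penalty, with a cost for switching configurations between chunks; the reward shape, the lag weight `lam`, and the switching cost are all illustrative assumptions:

```python
# Hypothetical sketch of an offline dynamic-programming "expert" for
# configuration-adaptive streaming. All numbers are illustrative.

def optimal_configs(accuracy, lag, lam=1.0, switch_cost=0.05):
    """accuracy[t][c], lag[t][c]: per-chunk, per-config profiles.
    Returns the reward-maximizing configuration index for each chunk."""
    T, C = len(accuracy), len(accuracy[0])
    # Per-chunk reward: accuracy minus a weighted upload-lag penalty.
    reward = [[accuracy[t][c] - lam * lag[t][c] for c in range(C)] for t in range(T)]
    best = [reward[0][:]]  # best[t][c]: max total reward ending in config c
    back = []              # backpointers for path recovery
    for t in range(1, T):
        row, ptr = [], []
        for c in range(C):
            # Best previous config, charging switch_cost on a change.
            prev = max(range(C),
                       key=lambda p: best[-1][p] - (switch_cost if p != c else 0.0))
            row.append(best[-1][prev] - (switch_cost if prev != c else 0.0) + reward[t][c])
            ptr.append(prev)
        best.append(row)
        back.append(ptr)
    # Recover the optimal configuration sequence by backtracking.
    c = max(range(C), key=lambda k: best[-1][k])
    path = [c]
    for ptr in reversed(back):
        c = ptr[c]
        path.append(c)
    return path[::-1]

# Example with illustrative numbers: accuracy favors config 0 then config 1,
# lag is zero, and the small switching cost does not outweigh the gain,
# so the expert switches once.
path = optimal_configs([[0.9, 0.5], [0.5, 0.9]], [[0.0, 0.0], [0.0, 0.0]])
```

An IL agent would then be trained to reproduce such expert decisions from online observations, avoiding the hand-crafted reward function that DRL approaches require.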
A survey of online data-driven proactive 5G network optimisation using machine learning
In fifth-generation (5G) mobile networks, proactive network optimisation plays an important role in meeting exponential traffic growth and more stringent service requirements, and in reducing capital and operational expenditure. Proactive network optimisation is widely acknowledged as one of the most promising ways to transform the 5G network based on big data analysis and cloud-fog-edge computing, but there are many challenges. Proactive algorithms require accurate forecasting of highly contextualised traffic demand and quantification of its uncertainty, to drive decision making with performance guarantees. Context in Cyber-Physical-Social Systems (CPSS) is often challenging to uncover, unfolds over time, and is even more difficult to quantify and integrate into decision making. The first part of the review focuses on mining and inferring CPSS context from heterogeneous data sources, such as online user-generated content. It examines the state-of-the-art methods currently employed to infer location, social behaviour, and traffic demand through a cloud-edge computing framework, combining them to form the input to proactive algorithms. The second part of the review focuses on exploiting and integrating the demand knowledge for a range of proactive optimisation techniques, including the key aspects of load balancing, mobile edge caching, and interference management. In both parts, appropriate state-of-the-art machine learning techniques (including probabilistic uncertainty cascades in proactive optimisation), complexity-performance trade-offs, and demonstrative examples are presented to inspire readers. This survey couples the potential of online big data analytics, cloud-edge computing, statistical machine learning, and proactive network optimisation in a common cross-layer wireless framework.
The wider impact of this survey includes better cross-fertilising the academic fields of data analytics, mobile edge computing, AI, CPSS, and wireless communications, as well as informing industry of the promising potential in this area.
Big data analytics for large-scale wireless networks: Challenges and opportunities
© 2019 Association for Computing Machinery. The wide proliferation of wireless communication systems and wireless devices has led to the arrival of the big data era in large-scale wireless networks. Big data of large-scale wireless networks has the key features of wide variety, high volume, real-time velocity, and huge value, leading to unique research challenges that differ from those of existing computing systems. In this article, we present a survey of state-of-the-art big data analytics (BDA) approaches for large-scale wireless networks. In particular, we categorize the life cycle of BDA into four consecutive stages: data acquisition, data preprocessing, data storage, and data analytics. We then present a detailed survey of the technical solutions to the challenges in BDA for large-scale wireless networks according to each stage in the BDA life cycle. Moreover, we discuss open research issues and outline future directions in this promising area.
Smart Cities: An In-Depth Study of AI Algorithms and Advanced Connectivity
The goal of smart city development is to improve the quality of life by incorporating technology into daily activities. Artificial intelligence (AI) is critical to the ongoing development of future smart cities. The Internet of Things (IoT) concept connects every internet-enabled device for improved access and control. The application of AI across various domains has transformed ordinary towns into highly equipped smart cities. Machine learning and deep learning algorithms have proven indispensable in a variety of industries, and they are now being integrated into smart city concepts to automate and improve urban activities and operations on a large scale. IoT and machine learning technologies are frequently used in smart cities to collect data from various sources. This article delves deeply into the significance, scope, and developments of AI-based smart cities. It also addresses some of the difficulties and restrictions associated with smart cities powered by AI. The goal of the study is to inspire and encourage academics to create original smart city solutions based on AI technologies.
CPSOR-GCN: A Vehicle Trajectory Prediction Method Powered by Emotion and Cognitive Theory
Active safety systems on vehicles often face problems with false alarms. Most
active safety systems predict the driver's trajectory under the assumption that the driver is always in a normal emotional state, and then infer risks. However, the driver's trajectory uncertainty increases under abnormal emotions. This paper
proposes a new trajectory prediction model: CPSOR-GCN, which predicts vehicle
trajectories under abnormal emotions. At the physical level, the interaction
features between vehicles are extracted by the physical GCN module. At the
cognitive level, SOR cognitive theory is used as prior knowledge to build a
Dynamic Bayesian Network (DBN) structure. The conditional probability and state
transition probability of nodes from the calibrated SOR-DBN quantify the causal
relationship between cognitive factors, which is embedded into the cognitive
GCN module to extract the characteristics of the influence mechanism of
emotions on driving behavior. The CARLA-SUMO joint driving simulation platform
was built to develop dangerous pre-crash scenarios. Methods of recreating
traffic scenes were used to naturally induce abnormal emotions. The experiment
collected data from 26 participants to verify the proposed model. Compared with
the model that only considers physical motion features, the prediction accuracy
of the proposed model is increased by 68.70%. Furthermore, considering the SOR-DBN reduces the trajectory prediction error by 15.93%. Compared with other advanced trajectory prediction models, CPSOR-GCN also achieves lower errors. This model can be integrated into active safety systems to better adapt to the driver's emotions, which could effectively reduce false alarms.

Comment: 15 pages, 31 figures, submitted to IEEE Transactions on Intelligent Vehicles
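The SOR-DBN described above encodes causal links between emotion, cognition, and behavior as conditional and transition probabilities. As a rough illustration (not the paper's calibrated network; the node names and all probabilities below are invented for the example), chaining conditional probability tables along a stimulus-organism-response path yields a marginal distribution over driving behavior:

```python
# Illustrative chaining of conditional probability tables (CPTs) along a
# stimulus -> organism -> response path. Not the paper's calibrated SOR-DBN.

def marginal(prior, *cpts):
    """prior: dict state -> P(state); each cpt: dict parent -> dict child -> P(child|parent).
    Returns the marginal distribution after propagating through all CPTs."""
    dist = dict(prior)
    for cpt in cpts:
        nxt = {}
        for parent, p in dist.items():
            for child, q in cpt[parent].items():
                # Law of total probability: sum P(parent) * P(child | parent).
                nxt[child] = nxt.get(child, 0.0) + p * q
        dist = nxt
    return dist

# Invented example: emotional state -> attention (organism) -> driving behavior.
prior = {"calm": 0.7, "anxious": 0.3}
organism = {"calm":    {"attentive": 0.9, "distracted": 0.1},
            "anxious": {"attentive": 0.4, "distracted": 0.6}}
response = {"attentive":  {"smooth": 0.8, "erratic": 0.2},
            "distracted": {"smooth": 0.3, "erratic": 0.7}}
behavior = marginal(prior, organism, response)
```

In the paper's setting, such quantified causal weights are embedded into the cognitive GCN module as features describing how emotion influences driving behavior.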
Mobile Oriented Future Internet (MOFI)
This Special Issue consists of seven papers that discuss how to enhance mobility management and its associated performance in the mobile-oriented future Internet (MOFI) environment. The first two papers deal with the architectural design and experimentation of mobility management schemes, in which new schemes are proposed and real-world testbed experiments are performed. The subsequent three papers focus on the use of software-defined networking (SDN) for effective service provisioning in the MOFI environment, together with real-world practices and testbed experimentations. The remaining two papers discuss network engineering issues in newly emerging mobile networks, such as flying ad-hoc networks (FANET) and connected vehicular networks.
Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, given the complex heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so that they can be invoked for hitherto unexplored services and scenarios of future wireless networks.

Comment: 46 pages, 22 figures