A Computation Offloading Model over Collaborative Cloud-Edge Networks with Optimal Transport Theory
As novel applications emerge in future network scenarios, the requirements
on network service capabilities for differentiated and bursty services are
diverse. For collaborative computing and resource allocation in edge
scenarios, migrating computing tasks to the edge and the cloud requires a
comprehensive consideration of energy consumption, bandwidth, and delay. Our
paper proposes a flexible and customizable collaboration mechanism based on
computation offloading that meets the diversified requirements of
differentiated networks. This mechanism handles the terminal's differentiated
computing tasks by establishing a collaborative computation offloading model
between the cloud server and the edge server.
Experiments show that our method achieves significant improvements over
conventional optimization algorithms: it reduces the execution time of
computing tasks, improves the utilization of server resources, and decreases
the terminal's energy consumption.
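The abstract does not give the model's formulation, but an optimal-transport view of offloading can be sketched as matching a distribution of task workloads to a distribution of server capacities under a cost that blends delay, bandwidth, and energy. The sketch below is a minimal illustration using entropy-regularized optimal transport (Sinkhorn iterations); the cost values, weights, and marginals are made-up assumptions, not the paper's actual model.

```python
import numpy as np

def sinkhorn(cost, task_load, server_cap, reg=0.1, n_iter=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    cost[i, j]  -- assumed combined cost (delay/energy/bandwidth) of
                   placing task i's workload on server j.
    task_load   -- workload distribution over tasks (sums to 1).
    server_cap  -- capacity distribution over servers (sums to 1).
    Returns a transport plan P whose row sums match task_load and whose
    column sums approximately match server_cap.
    """
    K = np.exp(-cost / reg)            # Gibbs kernel of the cost matrix
    u = np.ones_like(task_load)
    for _ in range(n_iter):
        v = server_cap / (K.T @ u)     # rescale to match column marginals
        u = task_load / (K @ v)        # rescale to match row marginals
    return u[:, None] * K * v[None, :]

# Toy instance: 3 tasks, 2 servers (edge and cloud), hypothetical costs.
cost = np.array([[1.0, 3.0],
                 [2.0, 1.0],
                 [3.0, 2.0]])
load = np.array([0.5, 0.3, 0.2])       # normalized task workloads
cap  = np.array([0.6, 0.4])            # normalized server capacities
P = sinkhorn(cost, load, cap)          # P[i, j]: share of task i sent to server j
```

The entropy regularization makes the plan smooth and the iterations cheap; a smaller `reg` pushes the plan toward the unregularized (sparser) optimal assignment.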
Edge Intelligence: The Confluence of Edge Computing and Artificial Intelligence
Along with the rapid developments in communication technologies and the surge
in the use of mobile devices, a brand-new computation paradigm, Edge Computing,
is surging in popularity. Meanwhile, Artificial Intelligence (AI) applications
are thriving with the breakthroughs in deep learning and the many improvements
in hardware architectures. Billions of data bytes, generated at the network
edge, put massive demands on data processing and structural optimization. Thus,
there exists a strong demand to integrate Edge Computing and AI, which gives
birth to Edge Intelligence. In this paper, we divide Edge Intelligence into AI
for edge (Intelligence-enabled Edge Computing) and AI on edge (Artificial
Intelligence on Edge). The former focuses on providing better solutions
to key problems in Edge Computing with the help of popular and effective AI
technologies, while the latter studies how to carry out the entire process of
building AI models, i.e., model training and inference, on the edge. This paper
provides insights into this new inter-disciplinary field from a broader
perspective. It discusses the core concepts and the research roadmap, which
should provide the necessary background for potential future research
initiatives in Edge Intelligence.
Comment: 13 pages, 3 figures