An Advanced Home ElderCare Service
With welfare costs rising across the developed world, there is a need for new technologies that can help reduce this enormous cost while providing quality eldercare services. This paper presents a middleware-level solution that integrates monitoring and emergency-detection solutions with networking solutions. The proposed system enables efficient integration of a variety of sensors and actuators deployed at home for emergency detection, and provides a framework for creating and managing rescue teams willing to assist elders in emergency situations. A prototype of the proposed system was designed and implemented, and results were obtained from both computer simulations and a real-network testbed. These results show that the proposed system can help overcome some of the current problems and reduce the enormous cost of eldercare services.
Recent trends in IP/NGEO satellite communication systems: transport, routing, and mobility management concerns
Paper included in the KAKENHI (Grants-in-Aid for Scientific Research) report (Project No. 17500030 / Principal Investigator: Nei Kato / Fundamental research on next-generation low-Earth-orbit satellite networks with high affinity for the Internet)
Cost-Efficient Data Backup for Data Center Networks against ε-Time Early Warning Disaster
Data backup in data center networks (DCNs) is critical to minimizing data loss under disasters. This paper considers cost-efficient data backup for DCNs against a disaster with an ε early warning time. Given geo-distributed DCNs and such an ε-time early warning disaster, we
investigate the issue of how to back up the data in DCN nodes under risk to
other safe DCN nodes within the early warning time constraint,
which is significant because it is an emergency data protection scheme against
a predictable disaster and also helps DCN operators to build a complete backup
scheme, i.e., regular backup and emergency backup. Specifically, an Integer
Linear Program (ILP)-based theoretical framework is proposed to identify the
optimal selections of backup DCN nodes and data transmission paths, such that
the overall data backup cost is minimized. Extensive numerical results are also
provided to illustrate the proposed framework for DCN data backup.
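To make the flavor of the ILP formulation concrete, here is a minimal backup-node and path-selection model in the same spirit, written with the PuLP solver library (assumed available); the node sets, costs, rates, and the ε constraint values are invented for illustration and are not the paper's model.

```python
# Toy ILP: choose backup nodes/paths minimizing cost under a warning-time limit.
import pulp

at_risk = ["d1", "d2"]            # DCN nodes under disaster risk
safe    = ["s1", "s2", "s3"]      # candidate backup nodes
data    = {"d1": 40, "d2": 25}    # data volume to protect (TB)
cost    = {("d1", "s1"): 3, ("d1", "s2"): 5, ("d1", "s3"): 4,
           ("d2", "s1"): 2, ("d2", "s2"): 6, ("d2", "s3"): 3}  # cost per TB
rate    = 10                      # achievable backup rate per path (TB/h)
epsilon = 3                       # early warning time (h)

prob = pulp.LpProblem("emergency_backup", pulp.LpMinimize)
x = {(i, j): pulp.LpVariable(f"x_{i}_{j}", lowBound=0)     # TB moved i -> j
     for i in at_risk for j in safe}
y = {(i, j): pulp.LpVariable(f"y_{i}_{j}", cat="Binary")   # is path i -> j used?
     for i in at_risk for j in safe}

# Objective: minimize overall data backup cost.
prob += pulp.lpSum(cost[i, j] * x[i, j] for i in at_risk for j in safe)
# All at-risk data must be backed up somewhere safe.
for i in at_risk:
    prob += pulp.lpSum(x[i, j] for j in safe) == data[i]
# A used path can only carry what fits within the early warning time.
for i in at_risk:
    for j in safe:
        prob += x[i, j] <= rate * epsilon * y[i, j]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (i, j), var in x.items():
    if var.value() and var.value() > 1e-6:
        print(f"{i} -> {j}: {var.value():.1f} TB")
```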
Optimization of Flow Allocation in Asynchronous Deterministic 5G Transport Networks by Leveraging Data Analytics
This research work was supported in part by the European Union’s Horizon 2020 Research and Innovation Program under the “Cloud for Holography and Augmented Reality (CHARITY)” Project under Agreement 101016509, and the 5G-CLARITY Project under Agreement 871428. It is also partially supported by the Spanish national research project TRUE5G: PID2019-108713RB-C53.
Time-Sensitive Networking (TSN) and Deterministic
Networking (DetNet) technologies are increasingly recognized as
key levers of the future 5G transport networks (TNs) due to their
capabilities for providing deterministic Quality-of-Service and
enabling the coexistence of critical and best-effort services. Additionally, they rely on programmable and cost-effective Ethernet-based forwarding planes. This article addresses the flow allocation problem in 5G backhaul networks realized as asynchronous TSN networks, whose building block is the Asynchronous Traffic Shaper. We propose an offline solution, dubbed “Next Generation Transport Network Optimizer” (NEPTUNO), that combines exact optimization methods and heuristic techniques and leverages data analytics to solve the flow allocation problem. NEPTUNO
aims to maximize the flow acceptance ratio while guaranteeing
the deterministic Quality-of-Service requirements of the critical
flows. We carried out a performance evaluation of NEPTUNO
regarding the degree of optimality, execution time, and flow
rejection ratio. Furthermore, we compare NEPTUNO with a
novel online baseline solution for two different optimization goals.
Online methods compute the flow’s allocation configuration right
after the flow arrives at the network, whereas offline solutions
like NEPTUNO compute a long-term configuration allocation for
the whole network. Our results highlight the potential of data
analytics for the self-optimization of the future 5G TNs.
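The sketch below illustrates the general idea of offline flow allocation that admits critical flows first and maximizes acceptance subject to residual link capacity; it is a plain greedy heuristic over an invented topology (networkx assumed available), not NEPTUNO or its optimization model.

```python
# Greedy offline flow allocation: critical flows first, shortest feasible path.
import networkx as nx

G = nx.Graph()
for u, v, cap in [("A", "B", 10), ("B", "C", 10), ("A", "C", 4)]:
    G.add_edge(u, v, cap=cap, used=0)

flows = [  # (name, src, dst, rate, critical?)
    ("f1", "A", "C", 6, True),
    ("f2", "A", "C", 4, False),
    ("f3", "A", "B", 8, False),
]

def residual_ok(path, rate):
    return all(G[u][v]["cap"] - G[u][v]["used"] >= rate
               for u, v in zip(path, path[1:]))

def allocate(path, rate):
    for u, v in zip(path, path[1:]):
        G[u][v]["used"] += rate

accepted = []
for name, s, d, rate, critical in sorted(flows, key=lambda f: not f[4]):
    placed = False
    for path in nx.shortest_simple_paths(G, s, d):   # try shorter paths first
        if residual_ok(path, rate):
            allocate(path, rate)
            accepted.append((name, path))
            placed = True
            break
    if not placed:
        print(f"rejected {name}")

print("accepted:", accepted)
```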
Blockchain and Deep Learning-Based IDS for Securing SDN-Enabled Industrial IoT Environments
The industrial Internet of Things (IIoT) involves the integration of Internet
of Things (IoT) technologies into industrial settings. However, given the high
sensitivity of the industry to the security of industrial control system
networks and IIoT, the use of software-defined networking (SDN) technology can
provide improved security and automation of communication processes. Despite
this, the architecture of SDN can give rise to various security threats.
Therefore, it is of paramount importance to consider the impact of these
threats on SDN-based IIoT environments. Unlike previous research, which focused
on security in IIoT and SDN architectures separately, we propose an integrated
method including two components that work together seamlessly for better
detecting and preventing security threats associated with SDN-based IIoT
architectures. The two components consist of a convolutional neural
network-based Intrusion Detection System (IDS) implemented as an SDN
application and a Blockchain-based system (BS) to empower application layer and
network layer security, respectively. A significant advantage of the proposed
method lies in jointly minimizing the impact of attacks such as command
injection and rule injection on SDN-based IIoT architecture layers. The
proposed IDS exhibits superior classification accuracy in both binary and
multiclass categories.
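As a rough sketch of the IDS component's flavor, the snippet below defines a small 1D-CNN classifier over flow feature vectors in PyTorch (assumed available); the feature length, layer sizes, and class count are illustrative and do not reflect the paper's architecture or its blockchain component.

```python
# Minimal 1D-CNN intrusion classifier over per-flow feature vectors.
import torch
import torch.nn as nn

class FlowCNN(nn.Module):
    def __init__(self, n_features: int = 64, n_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):            # x: (batch, 1, n_features)
        return self.net(x)

if __name__ == "__main__":
    model = FlowCNN(n_features=64, n_classes=2)   # binary: benign vs. attack
    dummy = torch.randn(8, 1, 64)                 # 8 flow feature vectors
    print(model(dummy).shape)                     # torch.Size([8, 2])
```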
Joint Network Slicing, Routing, and In-Network Computing for Energy-Efficient 6G
To address the evolving landscape of next-generation mobile networks,
characterized by an increasing number of connected users, surging traffic
demands, and the continuous emergence of new services, a novel communication
paradigm is essential. One promising candidate is the integration of network
slicing and in-network computing, offering resource isolation, deterministic
networking, enhanced resource efficiency, network expansion, and energy
conservation. Although prior research has explored resource allocation within
network slicing, routing, and in-network computing independently, a
comprehensive investigation into their joint approach has been lacking. This
paper tackles the joint problem of network slicing, path selection, and the
allocation of in-network and cloud computing resources, aiming to maximize the
number of accepted users while minimizing energy consumption. First, we
introduce a Mixed-Integer Linear Programming (MILP) formulation of the problem
and analyze its complexity, proving that the problem is NP-hard. Next, a Water
Filling-based Joint Slicing, Routing, and In-Network Computing (WF-JSRIN)
heuristic algorithm is proposed to solve it. Finally, a comparative analysis
was conducted among WF-JSRIN, a random allocation technique, and two optimal
approaches, namely Opt-IN (utilizing in-network computation) and Opt-C (solely
relying on cloud node resources). The results emphasize WF-JSRIN's efficiency
in delivering highly efficient near-optimal solutions with significantly
reduced execution times, solidifying its suitability for practical real-world
applications.
Comment: Accepted at the 2024 IEEE Wireless Communications and Networking Conference (WCNC 2024).
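The toy loop below conveys the general flavor of admitting users and placing their compute either on in-network nodes or in the cloud while tracking energy; it is a simple greedy sketch with made-up capacities and energy costs, not the WF-JSRIN algorithm or the MILP formulation.

```python
# Toy admission/placement loop: admit users while capacity remains, prefer
# cheaper in-network compute over the cloud, and tally energy consumption.
in_network_cap = 6     # compute units on in-network (programmable) nodes
cloud_cap = 10         # compute units in the cloud
users = [("u1", 3), ("u2", 4), ("u3", 5), ("u4", 2)]  # (name, compute demand)

E_IN, E_CLOUD = 1.0, 2.5   # illustrative energy cost per compute unit
accepted, energy = [], 0.0

for name, demand in sorted(users, key=lambda u: u[1]):   # smallest demand first
    if in_network_cap >= demand:                          # prefer in-network
        in_network_cap -= demand
        energy += E_IN * demand
    elif cloud_cap >= demand:
        cloud_cap -= demand
        energy += E_CLOUD * demand
    else:
        continue                                          # reject the user
    accepted.append(name)

print(f"accepted={accepted}, energy={energy:.1f}")
```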
Deep Reinforcement Learning based Collision Avoidance in UAV Environment
Unmanned Aerial Vehicles (UAVs) have recently attracted attention from both academia and industry due to their use in a tremendous range of emerging applications. Most UAV applications adopt Visual Line of Sight (VLOS) operation due to current regulations. There is a consensus within industry on extending UAVs' commercial operations to cover controlled airspace over urban and populated areas Beyond VLOS (BVLOS), and regulation to enable BVLOS UAV management is ongoing. Regrettably, this comes with unavoidable challenges
related to UAVs’ autonomy for detecting and avoiding static
and mobile objects. An intelligent component should either
be deployed onboard the UAV or at a Multi-Access Edge
Computing (MEC) that can read the gathered data from
different UAV’s sensors, process them, and then make the
right decision to detect and avoid a physical collision. The sensing data can be collected from various sensors, including but not limited to Lidar, depth cameras, video, or ultrasonic sensors. This paper proposes probabilistic and Deep Reinforcement Learning (DRL)-based algorithms for avoiding collisions while reducing energy consumption. The proposed algorithms can run either onboard the UAV or at the MEC, according to the UAV's capacity and the task overhead. We have designed our algorithms to work in any environment without requiring prior knowledge. The proposed solutions have been
evaluated in a harsh environment that consists of many UAVs
moving randomly in a small area without any correlation. The
obtained results demonstrate the efficiency of these solutions for avoiding collisions while reducing energy consumption in both familiar and unfamiliar environments.
This work has been partially funded by the Spanish national project TRUE-5G (PID2019-108713RB-C53).
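To give a feel for reinforcement learning with a collision- and energy-aware reward, the sketch below trains a tabular Q-learning agent on a toy 5x5 grid with one static obstacle; the environment, reward weights, and action set are illustrative assumptions and far simpler than the paper's DRL setting.

```python
# Tabular Q-learning with a reward that penalizes collisions and energy use.
import random

ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]   # moves + hover
OBSTACLE, GOAL, SIZE = (2, 2), (4, 4), 5
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1
Q = {((x, y), a): 0.0 for x in range(SIZE) for y in range(SIZE) for a in ACTIONS}

def step(state, action):
    nxt = (min(max(state[0] + action[0], 0), SIZE - 1),
           min(max(state[1] + action[1], 0), SIZE - 1))
    if nxt == OBSTACLE:
        return state, -10.0, True                        # collision ends episode
    reward = -0.1 * (abs(action[0]) + abs(action[1]))    # energy of maneuvering
    if nxt == GOAL:
        return nxt, reward + 10.0, True
    return nxt, reward, False

for episode in range(5000):
    s = (0, 0)
    for _ in range(50):                                  # cap episode length
        a = (random.choice(ACTIONS) if random.random() < EPS
             else max(ACTIONS, key=lambda a: Q[(s, a)]))
        s2, r, done = step(s, a)
        Q[(s, a)] += ALPHA * (r + GAMMA * max(Q[(s2, b)] for b in ACTIONS)
                              - Q[(s, a)])
        s = s2
        if done:
            break

print(max(ACTIONS, key=lambda a: Q[((0, 0), a)]))        # learned first move
```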
Cooperative Jamming and Relay Selection for Covert Communications
This paper investigates covert communications via cooperative jamming and
relay selection in a wireless relay system, where a source intends to transmit
a message to its destination with the help of a selected relay, and a warden
attempts to detect the existence of wireless transmissions from both the source
and relay, while friendly jammers send jamming signals to prevent the warden from detecting the transmission process. To this end, we first propose two relay
selection schemes, namely random relay selection (RRS) and max-min relay
selection (MMRS), as well as their corresponding cooperative jamming (CJ)
schemes for ensuring covertness in the system. We then provide theoretical
modeling for the covert rate performance under each relay selection scheme and
its CJ scheme and further explore the optimal transmit power controls of both
the source and relay for covert rate maximization. Finally, extensive
simulation/numerical results are presented to validate our theoretical models
and also to illustrate the covert rate performance of the relay system under
cooperative jamming and relay selection.
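A minimal sketch of max-min relay selection (MMRS) versus random relay selection (RRS) is shown below; channel gains are drawn at random for illustration, and the power control and covertness constraints analyzed in the paper are omitted.

```python
# MMRS: pick the relay whose weaker hop (source-relay or relay-destination)
# is strongest; RRS picks a relay uniformly at random as a baseline.
import random

random.seed(1)
relays = ["r1", "r2", "r3", "r4"]
h_sr = {r: random.expovariate(1.0) for r in relays}   # |h_source-relay|^2
h_rd = {r: random.expovariate(1.0) for r in relays}   # |h_relay-dest|^2

def mmrs(relays, h_sr, h_rd):
    return max(relays, key=lambda r: min(h_sr[r], h_rd[r]))

def rrs(relays):
    return random.choice(relays)

best = mmrs(relays, h_sr, h_rd)
print("MMRS picks", best, "bottleneck gain", min(h_sr[best], h_rd[best]))
print("RRS picks", rrs(relays))
```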
Explicit Load Balancing Technique for NGEO Satellite IP Networks With On-Board Processing Capabilities
Paper included in the KAKENHI (Grants-in-Aid for Scientific Research) report (Project No. 17500030 / Principal Investigator: Nei Kato / Fundamental research on next-generation low-Earth-orbit satellite networks with high affinity for the Internet)
Moving Target Defense based Secured Network Slicing System in the O-RAN Architecture
The open radio access network (O-RAN) architecture's native virtualization
and embedded intelligence facilitate RAN slicing and enable comprehensive
end-to-end services in post-5G networks. However, any vulnerability could compromise security; in particular, artificial intelligence (AI) and machine learning (ML) security threats can undermine O-RAN's benefits. This paper proposes a
novel approach to estimating the optimal number of predefined VNFs for each
slice while addressing secure AI/ML methods for dynamic service admission
control and power minimization in the O-RAN architecture. We solve this problem
on two-time scales using mathematical methods for determining the predefined
number of VNFs on a large time scale and the proximal policy optimization
(PPO), a Deep Reinforcement Learning algorithm, for solving dynamic service
admission control and power minimization for different slices on a small-time
scale. To secure the ML system for O-RAN, we implement a moving target defense
(MTD) strategy to prevent poisoning attacks by adding uncertainty to the
system. Our experimental results show that the proposed PPO-based service
admission control approach achieves an admission rate above 80% and that the
MTD strategy effectively strengthens the robustness of the PPO method against
adversarial attacks.
Comment: 6 pages.
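The snippet below sketches the moving-target-defense idea of rotating the decision-making policy so an attacker cannot probe a fixed model; the threshold policies are trivial stand-ins for illustration, not the paper's PPO agents or its MTD implementation.

```python
# Toy MTD: admission decisions are served by a pool of slightly different
# policies, and the active policy is rotated at random per decision.
import random

def make_policy(threshold):
    # Admit a slice request if the predicted load stays below `threshold`.
    return lambda predicted_load: predicted_load < threshold

policy_pool = [make_policy(t) for t in (0.70, 0.75, 0.80)]   # diversified pool

def admit(predicted_load):
    active = random.choice(policy_pool)      # move the "target" each decision
    return active(predicted_load)

for load in [0.65, 0.72, 0.78, 0.85]:
    print(f"load={load:.2f} -> {'admit' if admit(load) else 'reject'}")
```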