Meta-learning applications for machine-type wireless communications
Abstract. Machine-Type Communication (MTC) has emerged as a key enabling technology for 5G wireless networks and beyond, towards 6G. MTC provides two service modes: massive MTC (mMTC), which provides connectivity to a huge number of devices, and Ultra-Reliable Low-Latency Communication (URLLC), which meets stringent reliability and latency requirements to enable industrial and interactive applications. Recently, data-driven learning-based approaches have been proposed to optimize the operation of various MTC applications and to attain the required strict performance metrics. In this work, we propose applying meta-learning alongside other deep-learning models in MTC applications. First, we analyze the model-agnostic meta-learning (MAML) algorithm and its convergence for regression and reinforcement learning (RL) problems. Then, we discuss uncrewed aerial vehicle (UAV) trajectory planning as a case study in mMTC and RL, illustrating the system model and the main challenges, and we propose a MAML-RL formulation to solve the UAV path-learning problem. Moreover, we address the MAML-based few-pilot demodulation problem in massive IoT deployments. Finally, we extend the problem to include interference cancellation with Non-Orthogonal Multiple Access (NOMA), a paradigm shift towards non-orthogonal communication thanks to its potential to scale well in massive deployments. We propose a novel data-driven, meta-learning-aided NOMA uplink model that minimizes channel estimation overhead and does not require perfect channel knowledge. Unlike conventional deep-learning successive interference cancellation (SICNet), meta-learning-aided SIC (meta-SICNet) can share experience across different devices, facilitating learning for newly arriving devices while reducing training overhead. Our results show the superiority of MAML over other deep-learning schemes in addressing these problems.
The simulations also show that MAML successfully solves the few-pilot demodulation problem and achieves better performance in terms of symbol error rate (SER) and convergence latency. Moreover, the analysis confirms that the proposed meta-SICNet outperforms classical SIC and conventional SICNet, achieving a lower SER with fewer pilots.
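The MAML procedure analyzed in the abstract above can be illustrated with a minimal sketch. This is a first-order approximation (FOMAML, which drops the second-derivative term of full MAML) on a toy linear-regression task family; the task distribution, model, and step sizes are illustrative assumptions, not the paper's actual MTC setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    # Gradient of the MSE loss for the linear model y_hat = w * x
    return 2 * np.mean((w * x - y) * x)

def maml_step(w, tasks, alpha=0.05, beta=0.01):
    """One first-order MAML (FOMAML) meta-update over a batch of tasks."""
    meta_grad = 0.0
    for x_s, y_s, x_q, y_q in tasks:
        # Inner loop: one gradient step of task-specific adaptation on support data
        w_adapt = w - alpha * loss_grad(w, x_s, y_s)
        # Outer loop: accumulate the query-set gradient at the adapted parameters
        meta_grad += loss_grad(w_adapt, x_q, y_q)
    return w - beta * meta_grad / len(tasks)

def sample_task():
    # Hypothetical task family: noiseless linear tasks with slopes in [1, 3]
    true_w = rng.uniform(1.0, 3.0)
    x_s, x_q = rng.normal(size=5), rng.normal(size=5)
    return x_s, true_w * x_s, x_q, true_w * x_q

w = 0.0
for _ in range(500):
    w = maml_step(w, [sample_task() for _ in range(4)])
# The meta-learned w settles near the center of the task family,
# so one inner gradient step adapts quickly to any sampled slope.
```

The key design point, which carries over to the few-pilot demodulation setting, is that the meta-update optimizes post-adaptation loss: the initialization is chosen so that a handful of task-specific samples (pilots) suffice for fast adaptation.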
A Comparison of Impairment Abstractions by Multiple Users of an Installed Fiber Infrastructure
We compare three independent impairment abstractions of an installed fiber infrastructure. The abstractions agreed to within 1.3 dB despite being obtained from different nodes using different terminal equipment. Validation using a DWDM virtual topology was within 1.4 dB.
An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these data and take decisions pertaining to the proper
functioning of the networks from the network-generated data. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. Such complexity increase is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.
Fast Radio Burst 121102 Pulse Detection and Periodicity: A Machine Learning Approach
We report the detection of 72 new pulses from the repeating fast radio burst
FRB 121102 in Breakthrough Listen C-band (4-8 GHz) observations at the Green
Bank Telescope. The new pulses were found with a convolutional neural network
in data taken on August 26, 2017, where 21 bursts have been previously
detected. Our technique combines neural network detection with dedispersion
verification. For the current application we demonstrate its advantage over a
traditional brute-force dedispersion algorithm in terms of higher
sensitivity, lower false positive rates, and faster computational speed.
Together with the 21 previously reported pulses, this observation marks the
highest number of FRB 121102 pulses from a single observation, totaling 93
pulses in five hours, including 45 pulses within the first 30 minutes. The
number of data points reveals trends in pulse fluence, pulse detection rate, and
pulse frequency structure. We introduce a new periodicity search technique,
based on the Rayleigh test, to analyze the time of arrivals, with which we
exclude with 99% confidence pe- riodicity in time of arrivals with periods
larger than 5.1 times the model-dependent time-stamp uncertainty. In
particular, we rule out constant periods >10 ms in the barycentric arrival
times, though intrinsic periodicity in the time of emission remains plausible.
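The Rayleigh-test periodicity search described above can be sketched as follows. The statistic Z = (Σcos φ)² + (Σsin φ)²) / n over pulse phases φ at a trial period is the standard Rayleigh test; the arrival times and trial period below are made-up illustrations, not the FRB 121102 data.

```python
import numpy as np

def rayleigh_power(toas, period):
    """Rayleigh test statistic Z for phase clustering at a trial period.

    Under the null hypothesis of random (uniform) phases, Z is
    approximately exponentially distributed, so P(Z > z) ~ exp(-z);
    for a genuine period, Z grows toward n, the number of pulses.
    """
    phases = 2 * np.pi * (toas % period) / period
    n = len(toas)
    return (np.sum(np.cos(phases))**2 + np.sum(np.sin(phases))**2) / n

# Hypothetical example: strictly periodic arrival times vs. random ones
rng = np.random.default_rng(1)
periodic = np.arange(50) * 0.125            # one pulse every 0.125 s (illustrative)
random_t = rng.uniform(0, 50 * 0.125, 50)   # Poisson-like random arrivals

z_periodic = rayleigh_power(periodic, 0.125)  # phases cluster at 0, so Z ≈ n = 50
z_random = rayleigh_power(random_t, 0.125)    # phases spread out, so Z ~ O(1)
```

Scanning Z over a grid of trial periods and comparing the peak against the exponential null distribution yields the confidence with which periodicity can be excluded, as in the 99% exclusion quoted above.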