Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks have substantial potential to support a broad range
of complex and compelling applications in both military and civilian fields,
in which users enjoy high-rate, low-latency, low-cost and reliable
information services. Achieving this ambitious goal requires new radio
techniques for adaptive learning and intelligent decision making, owing to
the complex, heterogeneous nature of network structures and wireless
services. Machine learning (ML) algorithms have had great success in
supporting big data analytics, efficient parameter estimation and
interactive decision making. Hence, in this article, we review the
thirty-year history of ML by elaborating on supervised learning,
unsupervised learning, reinforcement learning and deep learning.
Furthermore, we investigate their employment in the compelling applications
of wireless networks, including heterogeneous networks (HetNets), cognitive
radios (CR), the Internet of Things (IoT), machine-to-machine (M2M)
networks, and so on. This article aims to assist readers in clarifying the
motivation and methodology of the various ML algorithms, so as to invoke
them for hitherto unexplored services and scenarios of future wireless
networks.
Comment: 46 pages, 22 figures
The Challenge of Machine Learning in Space Weather Nowcasting and Forecasting
The numerous recent breakthroughs in machine learning (ML) make it
imperative to carefully ponder how the scientific community can benefit from
a technology that, although not necessarily new, is today living its golden
age. This Grand Challenge review paper is focused on the present and future
role of machine learning in space weather. The purpose is twofold. On one
hand, we discuss previous works that use ML for space weather forecasting,
focusing in particular on the few areas that have seen the most activity:
the forecasting of geomagnetic indices, of relativistic electrons at
geosynchronous orbits, of solar flare occurrence, of coronal mass ejection
propagation time, and of solar wind speed. On the other hand, this paper
serves as a gentle introduction to the field of machine learning tailored to
the space weather community and as a pointer to a number of open challenges
that we believe the community should undertake in the next decade. The
recurring themes throughout the review are the need to shift our forecasting
paradigm to a probabilistic approach focused on the reliable assessment of
uncertainties, and the combination of physics-based and machine learning
approaches, known as the gray-box approach.
Comment: under review
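The probabilistic paradigm the review advocates can be illustrated with a standard verification metric. The sketch below computes the Brier score for binary event forecasts; the event labels and probabilities are hypothetical examples, not data from the paper.

```python
# Minimal sketch of probabilistic forecast verification (hypothetical data),
# illustrating the uncertainty-focused paradigm the review advocates.
# The Brier score is the mean squared error of probabilistic forecasts for
# a binary event (e.g. "geomagnetic storm occurs: yes/no"); lower is better.

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and
    observed binary outcomes (0 = no event, 1 = event)."""
    assert len(probs) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical forecasts: probability that a storm occurs on each day.
forecast_probs = [0.9, 0.1, 0.8, 0.3, 0.2]
observed = [1, 0, 1, 1, 0]

print(round(brier_score(forecast_probs, observed), 3))  # → 0.118
```

A deterministic yes/no forecast scored this way would receive 0 or 1 per day; well-calibrated probabilities let reliability be assessed directly, which is the shift in evaluation the review calls for.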
Spectrum Sensing in Cognitive Radio Using CNN-RNN and Transfer Learning
Cognitive radio has been proposed to improve spectrum utilization in wireless communication, and spectrum sensing is an essential component of cognitive radio. Traditional spectrum sensing methods are based on feature extraction from the signal received at a given point. Developments in artificial intelligence and deep learning offer an opportunity to improve sensing accuracy by using cooperative spectrum sensing and analyzing the radio scene. This research proposes a hybrid convolutional and recurrent neural network model for spectrum sensing, and further enhances sensing accuracy for low-SNR signals through transfer learning. The modelling results show improved spectrum sensing using CNN-RNN compared to other models studied in this field. The complexity of the algorithm is analyzed to show an improvement in its performance.
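The hybrid architecture described above can be sketched as a forward pass: a convolutional stage extracts local features from received samples, a recurrent stage models their temporal evolution, and a sigmoid head outputs the probability that a primary user's signal is present. This is a toy illustration with untrained, randomly initialised weights; the paper's actual layer sizes, training procedure, and transfer-learning setup are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid 1-D convolution of signal x with each kernel, ReLU activation."""
    feats = np.stack([np.convolve(x, k, mode="valid") for k in kernels])
    return np.maximum(feats, 0.0)            # shape: (n_kernels, T')

def rnn_last_state(feats, Wx, Wh, b):
    """Simple (Elman) RNN over the feature sequence; return final hidden state."""
    h = np.zeros(Wh.shape[0])
    for t in range(feats.shape[1]):          # iterate over time steps
        h = np.tanh(Wx @ feats[:, t] + Wh @ h + b)
    return h

def sense(x, params):
    """Return P(primary-user signal present) for a block of received samples."""
    kernels, Wx, Wh, b, w_out = params
    h = rnn_last_state(conv1d(x, kernels), Wx, Wh, b)
    return 1.0 / (1.0 + np.exp(-(w_out @ h)))

# Randomly initialised (untrained) parameters, for illustration only.
n_kernels, hidden = 4, 8
params = (rng.standard_normal((n_kernels, 5)),      # 4 conv kernels, width 5
          rng.standard_normal((hidden, n_kernels)), # input-to-hidden weights
          rng.standard_normal((hidden, hidden)) * 0.1,
          np.zeros(hidden),
          rng.standard_normal(hidden))

x = rng.standard_normal(64)                         # 64 received samples
p = sense(x, params)
print(0.0 <= p <= 1.0)                              # a valid probability
```

In practice the decision "signal present" would be made by thresholding `p`, and the weights would be learned from labelled sensing data, with transfer learning reusing weights trained at higher SNR.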
The 8th International Conference on Time Series and Forecasting
The aim of ITISE 2022 is to create a friendly environment that could lead to the establishment or strengthening of scientific collaborations and exchanges among attendees. Therefore, ITISE 2022 is soliciting high-quality original research papers (including significant works-in-progress) on any aspect of time series analysis and forecasting, in order to motivate the generation and use of new knowledge, computational techniques and methods for forecasting in a wide range of fields.
A NOVEL PATH LOSS FORECAST MODEL TO SUPPORT DIGITAL TWINS FOR HIGH FREQUENCY COMMUNICATIONS NETWORKS
The need for long-distance High Frequency (HF) communications in the 3-30 MHz frequency range seemed to diminish at the end of the 20th century with the advent of space-based communications and the spread of fiber optic-connected digital networks. Renewed interest in HF has emerged as an enabler for operations in austere locations and for its ability to serve as a redundant link when space-based and terrestrial communication channels fail. Communications system designers can create a "digital twin" system to explore the operational advantages and constraints of the new capability. Existing wireless channel models can adequately simulate communication channel conditions with enough fidelity to support digital twin simulations, but only when the transmitter and receiver have a clear line of sight or a relatively simple multi-path reflection between them. With over-the-horizon communications, the received signal depends on refractions of the transmitted signal through ionospheric layers. The time-varying nature of the free electron density of the ionosphere affects the resulting path loss between the transmitter and receiver and is difficult to model over several days. This dissertation examined previous efforts to characterize the ionosphere and to develop HF propagation models, including the Voice of America Coverage Analysis Prediction (VOACAP) tool, to support path loss forecasts. Analysis of data from the Weak Signal Propagation Reporter Network (WSPRnet) showed an average Root Mean Squared Error (RMSE) of 12.9 dB between VOACAP predictions and actual propagation reports on the WSPRnet system. To address the significant error in VOACAP forecasts, alternative predictive models, including the Forecasting Ionosphere-Induced Path Loss (FIIPL) model, were developed and evaluated against one month of WSPRnet data collected at eight geographically distributed sites.
The FIIPL model leveraged a machine learning algorithm, Long Short-Term Memory, to generate predictions that reduced the SNR error to an average of 4.0 dB RMSE. These results could support more accurate 24-hour predictions and provide an accurate model of the channel conditions for digital twin simulations.
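The dissertation's headline figures (12.9 dB for VOACAP, 4.0 dB for FIIPL) are root mean squared errors between predicted and reported values. A minimal sketch of that computation, using made-up SNR values rather than actual WSPRnet measurements:

```python
import math

def rmse(predicted, observed):
    """Root mean squared error, in the same units as the inputs (dB here)."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(predicted))

# Hypothetical SNR predictions (dB) vs. WSPRnet-style spot reports.
predicted = [-12.0, -8.0, -15.0, -10.0]
observed = [-10.0, -9.0, -11.0, -14.0]

print(round(rmse(predicted, observed), 2))  # → 3.04
```

Because RMSE squares each residual, a few large misses (such as VOACAP errors during disturbed ionospheric conditions) dominate the score, which is why reducing the average from 12.9 dB to 4.0 dB represents a substantial gain in forecast fidelity.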
Advisor: Hamid R. Sharif-Kashan