Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks
Future wireless networks hold substantial potential for supporting a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, owing to the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CRs), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks.
Comment: 46 pages, 22 fig
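The reinforcement learning branch of the taxonomy above can be illustrated with a minimal tabular Q-learning sketch on a toy chain environment. This is a generic textbook example, not code from the survey; the states, actions, and hyperparameters are all illustrative.

```python
import random

# Minimal tabular Q-learning on a toy 5-state chain: the agent must move
# right from any start state to reach the goal state 4. All names and
# hyperparameters are illustrative.
N_STATES = 5
ACTIONS = [0, 1]                        # 0 = left, 1 = right
alpha, gamma, eps = 0.5, 0.9, 0.3      # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(s, a):
    """Deterministic chain dynamics with reward 1 on reaching the goal."""
    s2 = max(0, min(N_STATES - 1, s + (1 if a == 1 else -1)))
    return s2, (1.0 if s2 == N_STATES - 1 else 0.0)

random.seed(0)
for _ in range(300):                    # training episodes, random starts
    s = random.randrange(N_STATES - 1)
    while s != N_STATES - 1:
        if random.random() < eps:       # epsilon-greedy exploration
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda b: Q[(s, b)])
        s2, r = step(s, a)
        # temporal-difference update toward r + gamma * max_a' Q(s', a')
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# Greedy policy after training: move right (action 1) from every state
policy = [max(ACTIONS, key=lambda b: Q[(s, b)]) for s in range(N_STATES - 1)]
print(policy)
```

The same update rule underlies the deep RL methods the survey covers; deep Q-learning replaces the table `Q` with a neural network.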
Transfer Learning with Deep Convolutional Neural Network (CNN) for Pneumonia Detection using Chest X-ray
Pneumonia is a life-threatening lung disease caused by either bacterial or viral infection. It can be life-endangering if not acted upon in time, so an early diagnosis of pneumonia is vital. The aim of this paper is to automatically detect bacterial and viral pneumonia using digital X-ray images. It provides a detailed report on advances made in the accurate detection of pneumonia and then presents the methodology adopted by the authors. Four different pre-trained deep Convolutional Neural Networks (CNNs) - AlexNet, ResNet18, DenseNet201, and SqueezeNet - were used for transfer learning. A total of 5,247 bacterial, viral, and normal chest X-ray images underwent preprocessing, and the modified images were used to train the transfer-learning-based classifiers. In this work, the authors report three classification schemes: normal vs. pneumonia, bacterial vs. viral pneumonia, and normal vs. bacterial vs. viral pneumonia. The classification accuracies for these three schemes were 98%, 95%, and 93.3%, respectively. These accuracies are higher than those reported in the literature for each scheme. Therefore, the proposed study can be useful for faster diagnosis of pneumonia by radiologists and can help in the rapid airport screening of pneumonia patients.
Comment: 13 Figures, 5 tables. arXiv admin note: text overlap with arXiv:2003.1314
Seeing the Unobservable: Channel Learning for Wireless Communication Networks
Wireless communication networks rely heavily on channel state information (CSI) to make informed decisions for signal processing and network operations. However, traditional CSI acquisition methods face many difficulties: pilot-aided channel training consumes a great deal of channel resources and reduces the opportunities for energy saving, while location-aided channel estimation suffers from inaccurate and insufficient location information. In this paper, we propose a novel channel learning framework that tackles these difficulties by inferring unobservable CSI from observable CSI. We formulate this framework theoretically and illustrate a special case in which the learnability of the unobservable CSI can be guaranteed. Possible applications of channel learning are then described, including cell selection in multi-tier networks, device discovery for device-to-device (D2D) communications, and end-to-end user association for load balancing. We also propose a neural-network-based algorithm for the cell selection problem in multi-tier networks. The performance of this algorithm is evaluated using a geometry-based stochastic channel model (GSCM). In settings with 5 small cells, the average cell-selection accuracy is 73% - only a 3.9% loss compared with a location-aided algorithm that requires genuine location information.
Comment: 6 pages, 4 figures, accepted by GlobeCom'1
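The core idea, inferring unobservable CSI from observable CSI with a neural network, can be sketched on toy synthetic data. Everything here (the anchor/cell geometry, the log-distance path-loss model, the scikit-learn MLP) is an illustrative stand-in for the paper's GSCM setup, not the authors' algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy cell-selection sketch: from CSI that IS observable (synthetic path
# gains to 3 always-measured macro anchors), learn to predict which of
# 5 small cells would be best -- without ever measuring the small cells
# directly for the test users.
rng = np.random.default_rng(0)
anchors = rng.uniform(0, 100, (3, 2))       # macro anchor positions (toy)
cells = rng.uniform(0, 100, (5, 2))         # small-cell positions (toy)

def pathgain_db(user, nodes):
    """Simple log-distance path-gain model, a stand-in for a GSCM."""
    d = np.linalg.norm(nodes - user, axis=1)
    return -36.7 * np.log10(d + 1.0)

users = rng.uniform(0, 100, (2000, 2))
X = np.array([pathgain_db(u, anchors) for u in users])       # observable CSI
y = np.array([pathgain_db(u, cells).argmax() for u in users])  # best small cell

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(X[:1500], y[:1500])
acc = clf.score(X[1500:], y[1500:])
print(round(acc, 2))
```

Because the three observable path gains implicitly encode the user's position, the network can recover the best small cell far more accurately than chance, which mirrors the learnability argument in the paper.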
A Survey of Prediction and Classification Techniques in Multicore Processor Systems
In multicore processor systems, being able to accurately predict the future provides new optimization opportunities that could not otherwise be exploited. For example, an oracle able to predict a certain application's behavior running on a smartphone could direct the power manager to switch to appropriate dynamic voltage and frequency scaling (DVFS) modes that would guarantee minimum levels of desired performance while saving energy and thereby prolonging battery life. Using predictions enables systems to become proactive rather than continuing to operate in a reactive manner. This prediction-based proactive approach has become increasingly popular in the design and optimization of integrated circuits and of multicore processor systems. Prediction has evolved from simple forecasting to sophisticated machine-learning-based prediction and classification that learns from existing data, employs data mining, and forecasts future behavior. This can be exploited by novel optimization techniques that span all layers of the computing stack. In this survey paper, we discuss the most popular prediction and classification techniques in the general context of computing systems, with emphasis on multicore processors. The paper is far from comprehensive, but it will help readers interested in employing prediction in the optimization of multicore processor systems.
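The DVFS example above can be made concrete with a small sketch that pairs a simple history-based predictor (an exponentially weighted moving average) with a frequency-selection rule. The frequency table, the 80% load threshold, and the workload trace are all illustrative, not taken from any surveyed technique.

```python
# Prediction-driven DVFS sketch: predict the next interval's CPU
# utilization with an exponentially weighted moving average, then pick
# the lowest frequency whose capacity covers the predicted demand at
# no more than 80% load. All numbers are illustrative.
FREQS_MHZ = [600, 1200, 1800, 2400]

def ewma_predict(history, alpha=0.5):
    """Exponentially weighted moving average over past utilization samples."""
    pred = history[0]
    for u in history[1:]:
        pred = alpha * u + (1 - alpha) * pred
    return pred

def choose_freq(pred_util, freqs=FREQS_MHZ):
    """Lowest frequency that meets predicted demand at <= 80% load."""
    demand = pred_util * freqs[-1]          # demand in 'MHz of work'
    for f in freqs:
        if 0.8 * f >= demand:
            return f
    return freqs[-1]

# Utilization ramps up, so the predictor tracks the rise and the
# governor proactively selects a higher frequency.
utilization = [0.10, 0.12, 0.15, 0.60, 0.65, 0.70]
pred = ewma_predict(utilization)
print(choose_freq(pred))
```

A reactive governor would only raise the frequency after observing a slow interval; the predictor lets the switch happen before the demand materializes, which is exactly the proactive behavior the survey motivates.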
Collaboration between a human group and artificial intelligence can improve prediction of multiple sclerosis course. A proof-of-principle study
Background: Multiple sclerosis has an extremely variable natural course. In most patients, the disease starts with a relapsing-remitting (RR) phase, which proceeds to a secondary progressive (SP) form. The duration of the RR phase is hard to predict, and to date predictions of the rate of disease progression remain suboptimal. This limits the opportunity to tailor therapy to an individual patient's prognosis, despite the availability of several therapeutic options. Approaches to improving clinical decisions, such as the collective intelligence of human groups and machine learning algorithms, are widely investigated. Methods: Medical students and a machine learning algorithm predicted the course of disease on the basis of randomly chosen clinical records of patients who attended the Multiple Sclerosis service of Sant'Andrea hospital in Rome. Results: A significant improvement in predictive ability was obtained when predictions were combined with a weight that depends on the consistency of human (or algorithm) forecasts on a given clinical record. Conclusions: In this work we present proof of principle that human-machine hybrid predictions yield better prognoses than machine learning algorithms or groups of humans alone. To strengthen this preliminary result, we propose a crowdsourcing initiative to collect prognoses by physicians on an expanded set of patients.
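The consistency-weighted combination described in the Results can be sketched as follows. The weighting rule here is a plausible illustration of the idea, not the paper's exact formula.

```python
# Hybrid human-machine prediction sketch: combine a group's forecasts
# with a machine forecast, weighting each source by how confident /
# internally consistent it is on the given clinical record. The
# weighting scheme is illustrative, not the paper's formula.
def consistency_weight(probs):
    """Weight binary forecasts by their agreement (0 = split, 1 = unanimous)."""
    mean = sum(probs) / len(probs)
    return abs(mean - 0.5) * 2, mean        # (consistency weight, mean forecast)

def hybrid_predict(human_probs, machine_prob):
    w_h, p_h = consistency_weight(human_probs)
    # For a single machine probability, distance from 0.5 plays the
    # role of consistency.
    w_m = abs(machine_prob - 0.5) * 2
    if w_h + w_m == 0:
        return 0.5                          # neither source is informative
    return (w_h * p_h + w_m * machine_prob) / (w_h + w_m)

# Students largely agree the patient will progress to SP; the machine
# is nearly undecided, so the human consensus dominates.
humans = [0.8, 0.9, 0.7, 0.8]               # P(progression) from 4 students
machine = 0.55
print(round(hybrid_predict(humans, machine), 2))
```

When one source is confident and the other is not, the weighted average leans toward the confident source, which is the mechanism by which the hybrid can outperform either source alone.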
Synergistic combination of systems for structural health monitoring and earthquake early warning for structural health prognosis and diagnosis
Earthquake early warning (EEW) systems are currently operating nationwide in Japan and are in beta-testing in California. Such a system detects an earthquake initiation using online signals from a seismic sensor network and broadcasts a warning of the predicted location and magnitude a few seconds to a minute or so before an earthquake hits a site. Such a system can be used synergistically with installed structural health monitoring (SHM) systems to enhance pre-event prognosis and post-event diagnosis of structural health. For pre-event prognosis, the EEW system information can be used to make probabilistic predictions of the anticipated damage to a structure using seismic loss estimation methodologies from performance-based earthquake engineering. These predictions can support decision-making regarding the activation of appropriate mitigation systems, such as stopping traffic from entering a bridge that has a predicted high probability of damage. Since the time between warning and arrival of the strong shaking is very short, probabilistic predictions must be rapidly calculated and the decision making automated for the mitigation actions. For post-event diagnosis, the SHM sensor data can be used in Bayesian updating of the probabilistic damage predictions with the EEW predictions as a prior. Appropriate Bayesian methods for SHM have been published. In this paper, we use pre-trained surrogate models (or emulators) based on machine learning methods to make fast damage and loss predictions that are then used in a cost-benefit decision framework for activation of a mitigation measure. A simple illustrative example of an infrastructure application is presented
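The two-stage use described above, a pre-event cost-benefit decision driven by EEW-based damage predictions and a post-event Bayesian update using SHM data, can be sketched together. All probabilities, likelihoods, and costs below are illustrative placeholders, not values from the paper.

```python
# Pre-event: activate a mitigation measure (e.g. closing a bridge to
# traffic) when the expected avoided loss exceeds the mitigation cost.
def should_mitigate(p_damage, loss_if_damage, mitigation_cost):
    """Cost-benefit rule: expected avoided loss vs. cost of acting."""
    return p_damage * loss_if_damage > mitigation_cost

# Post-event: Bayes' rule updates the EEW-based prior damage probability
# with the likelihood of the observed SHM sensor data.
def bayes_update(prior, like_data_given_damage, like_data_given_ok):
    """Posterior P(damage | SHM data) from the EEW prior."""
    num = prior * like_data_given_damage
    return num / (num + (1 - prior) * like_data_given_ok)

p_prior = 0.30                      # illustrative EEW-based damage prediction
print(should_mitigate(p_prior, loss_if_damage=1e6, mitigation_cost=2e5))

# Suppose the recorded SHM data (e.g. a drop in modal frequency) are
# 4x more likely under the damage hypothesis than the undamaged one.
p_post = bayes_update(p_prior, like_data_given_damage=0.8, like_data_given_ok=0.2)
print(round(p_post, 2))
```

In the paper's setting the prior itself comes from fast surrogate-model loss predictions rather than a fixed number, but the decision rule and the update have this same structure.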
TeachOpenCADD: a teaching platform for computer-aided drug design using open source packages and data
Owing to the increase in freely available software and data for cheminformatics and structural bioinformatics, research in computer-aided drug design (CADD) is increasingly built on modular, reproducible, and easy-to-share pipelines. While documentation for such tools is available, only a few freely accessible examples teach the underlying CADD concepts, especially to users new to the field. Here, we present TeachOpenCADD, a teaching platform developed by students for students, using open-source compound and protein data as well as basic and CADD-related Python packages. We provide interactive Jupyter notebooks for central CADD topics, integrating theoretical background and practical code. TeachOpenCADD is freely available on GitHub: https://github.com/volkamerlab/TeachOpenCAD
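As a small taste of the kind of exercise such notebooks cover, the following uses RDKit, an open-source cheminformatics package of the sort the platform builds on, to check Lipinski rule-of-five descriptors for a compound. Whether this exact exercise appears in TeachOpenCADD is an assumption; aspirin is used purely as a familiar example.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

# Parse a compound from SMILES and evaluate Lipinski's rule of five,
# a classic drug-likeness filter taught in introductory CADD material.
mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
mw = Descriptors.MolWt(mol)           # molecular weight
logp = Descriptors.MolLogP(mol)       # estimated lipophilicity
hbd = Descriptors.NumHDonors(mol)     # hydrogen-bond donors
hba = Descriptors.NumHAcceptors(mol)  # hydrogen-bond acceptors
ro5_ok = mw <= 500 and logp <= 5 and hbd <= 5 and hba <= 10
print(round(mw, 1), ro5_ok)
```

Exercises like this pair a short theoretical section (here, why drug-likeness filters exist) with a few lines of runnable code, which is the notebook format the abstract describes.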