1,377 research outputs found

    Machine learning for localization in narrowband IoT networks

    Low power wide area networks (LPWANs) are designed for Internet of Things (IoT) applications because of their long-range coverage, low bit rate, and low battery consumption. Among LPWAN technologies, Narrowband IoT (NB-IoT) uses the licensed cellular spectrum and operates over deployed LTE infrastructure. It is emerging as a promising technology because of its characteristics and its deployment advantages over other LPWAN networks. In NB-IoT networks, localization is an essential service for applications such as smart cities, traffic control, and logistics tracking. Outdoor localization is often performed using a Global Navigation Satellite System (GNSS), such as the Global Positioning System (GPS), to report the current device position with an accuracy of a few meters. However, due to GPS's power and size drawbacks, recent work focuses on replacing GPS-based localization with cost- and power-efficient alternatives. This work analyses a database collected over an NB-IoT network deployed in the city of Antwerp, Belgium, and implements a solution for outdoor localization based on Machine Learning (ML) methods for distance estimation. The data analysis starts with a pre-processing step, in which the databases are cleaned and prepared for the ML analysis. The following step merges and cleans the data to obtain an integrated database with a classification into urban and rural areas. The localization solution applies support vector regression, random forest regression, and multi-layer perceptron regression, using the received signal strength indicator (RSSI) and the base station (BS) position details as input parameters, to predict the distance to the IoT nodes and to estimate their current position (latitude and longitude).
This implementation includes hyper-parameter tuning, the training and testing process, and the mathematical calculations needed to obtain the estimated position, with mean and median location estimation errors expressed in meters. The methodology yields mean and median location errors of 280 and 220 meters for the urban area and 920 and 570 meters for the rural area. These accuracy levels make the solution suitable for the most common localization uses in IoT, in place of a GPS device. As a result, this study proposes a new approach for localization in IoT networks. In addition, the implemented solution defines valuable research lines to improve the accuracy levels, optimize the equipment resources, and reduce the final cost of IoT devices.
    Sustainable Development Goals: 9 - Industry, Innovation and Infrastructure; 11 - Sustainable Cities and Communities
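The distance-estimation step described above can be sketched in a few lines. This is an illustrative sketch, not the thesis's code: the training data here is synthetic, generated from an assumed log-distance path-loss model (exponent and noise level are assumptions), and only one of the three compared models (random forest regression) is shown, with RSSI as the sole feature.

```python
# Sketch: learning node-to-BS distance from RSSI with a random forest
# regressor. Synthetic data from an assumed log-distance path-loss model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic training set: distances from 10 m to 5 km; RSSI follows a
# log-distance path-loss model (assumed exponent 3.0, reference level
# -40 dBm at 1 m) plus Gaussian shadowing noise (assumed 4 dB).
dist = rng.uniform(10, 5000, 2000)                  # metres
rssi = -40 - 10 * 3.0 * np.log10(dist) + rng.normal(0, 4, dist.size)

X_train, X_test, y_train, y_test = train_test_split(
    rssi.reshape(-1, 1), dist, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
median_err = np.median(np.abs(pred - y_test))
print(f"median distance error: {median_err:.0f} m")
```

With distances estimated to three or more base stations of known position, the node's latitude and longitude can then be recovered by trilateration (e.g., a least-squares fit), which is the kind of mathematical calculation the abstract's final position-estimation step refers to.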

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity faced by optical networks in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes) enabled by coherent transmission/reception technologies, advanced digital signal processing, and compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature on the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.
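One application of the kind this survey covers is learning to predict, from the adjustable parameters listed above (path length, modulation format, etc.), whether a candidate lightpath will have acceptable quality of transmission. The sketch below is a hypothetical illustration, not taken from the paper: the features, the toy quality model, and the acceptability threshold are all assumptions.

```python
# Hypothetical sketch: a classifier predicting whether a candidate
# lightpath's quality of transmission is acceptable. All data and the
# toy ground-truth model below are synthetic assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 3000
length_km = rng.uniform(50, 3000, n)           # total path length
n_spans = (length_km / 80).astype(int) + 1     # ~80 km amplifier spans
mod_order = rng.choice([2, 4, 16, 64], n)      # BPSK .. 64-QAM

# Toy ground truth: longer paths and denser constellations degrade the
# margin; a positive margin means the lightpath is acceptable.
snr_margin = (30 - 0.008 * length_km - 2.5 * np.log2(mod_order)
              + rng.normal(0, 1.5, n))
ok = (snr_margin > 0).astype(int)

X = np.column_stack([length_km, n_spans, mod_order])
X_tr, X_te, y_tr, y_te = train_test_split(X, ok, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

In a real deployment the labels would come from monitored signal quality indicators rather than a formula, which is exactly the network-generated data the abstract describes.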

    Medical imaging analysis with artificial neural networks

    Given that neural networks have been widely reported in the medical imaging research community, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and of providing a foundation for further research and practical development. Representative techniques and algorithms are explained in detail through inspiring examples illustrating: (i) how a known neural network with a fixed structure and training procedure can be applied to resolve a medical imaging problem; (ii) how medical images can be analysed, processed, and characterised by neural networks; and (iii) how neural networks can be expanded further to resolve problems relevant to medical imaging. The concluding section highlights comparisons among many neural network applications to provide a global view of computational intelligence with neural networks in medical imaging.
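Point (i) above, applying an off-the-shelf network with a fixed structure to a medical imaging problem, can be made concrete with a toy computer-aided-diagnosis sketch. Nothing here comes from the survey itself: the "images" are synthetic 8x8 patches, and the network (scikit-learn's MLPClassifier) and its settings are assumptions chosen for brevity.

```python
# Hypothetical sketch: a fixed, off-the-shelf neural network applied to
# a toy CAD task of classifying image patches as lesion vs. normal.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
patches = rng.normal(0, 1, (n, 8, 8))        # synthetic 8x8 "images"
labels = rng.integers(0, 2, n)
# Toy "lesion": a bright central blob added to the positive patches.
patches[labels == 1, 3:5, 3:5] += 3.0

X = patches.reshape(n, -1)                   # flatten to 64 features
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
acc = net.score(X_te, y_te)
print(f"patch classification accuracy: {acc:.2f}")
```

Real CAD systems differ mainly in scale (larger images, convolutional architectures, clinically validated labels), but the fixed-structure, train-then-apply workflow is the same.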

    Deep learning in remote sensing: a review

    Standing at the paradigm shift towards data-intensive science, machine learning techniques are becoming increasingly important. In particular, as a major breakthrough in the field, deep learning has proven to be an extremely powerful tool in many areas. Shall we embrace deep learning as the key to everything? Or should we resist a 'black-box' solution? There are controversial opinions in the remote sensing community. In this article, we analyze the challenges of using deep learning for remote sensing data analysis, review the recent advances, and provide resources to make deep learning in remote sensing ridiculously simple to start with. More importantly, we advocate that remote sensing scientists bring their expertise into deep learning and use it as an implicit general model to tackle unprecedented, large-scale, influential challenges such as climate change and urbanization.
    Comment: Accepted for publication in IEEE Geoscience and Remote Sensing Magazine.