7,564 research outputs found

    A Comparison of Nature Inspired Algorithms for Multi-threshold Image Segmentation

    Full text link
    In the field of image analysis, segmentation is one of the most important preprocessing steps. One way to achieve segmentation is by means of threshold selection, where each pixel that belongs to a given class is labeled according to the selected threshold, yielding groups of pixels that share visual characteristics in the image. Several methods have been proposed to solve threshold selection problems; in this work, a mixture of Gaussian functions is used to approximate the 1D histogram of a gray-level image, and its parameters are calculated using three nature-inspired algorithms (Particle Swarm Optimization, Artificial Bee Colony Optimization and Differential Evolution). Each Gaussian function approximating the histogram represents a pixel class and therefore a threshold point. Experimental results compare the algorithms both quantitatively and qualitatively and discuss the main advantages and drawbacks of each when applied to the multi-threshold problem. Comment: 16 pages; this is a draft of the final version of the article sent to the journal
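The threshold-selection step this abstract describes can be sketched briefly. Assuming the optimizer (PSO, ABC or DE) has already returned the mean, standard deviation and weight of each Gaussian class, the threshold between two neighboring classes is the gray level at which their weighted densities intersect. The function name and the example parameters below are hypothetical, not taken from the paper:

```python
import numpy as np

def gaussian_intersection(mu1, sigma1, p1, mu2, sigma2, p2):
    """Threshold between two pixel classes: the gray level where the two
    weighted Gaussians of the histogram mixture intersect.

    Solving p1*N(x; mu1, sigma1) = p2*N(x; mu2, sigma2) in log form
    gives a quadratic a*x^2 + b*x + c = 0.
    """
    a = 1 / (2 * sigma1**2) - 1 / (2 * sigma2**2)
    b = mu2 / sigma2**2 - mu1 / sigma1**2
    c = (mu1**2 / (2 * sigma1**2) - mu2**2 / (2 * sigma2**2)
         - np.log((p1 * sigma2) / (p2 * sigma1)))
    roots = np.roots([a, b, c])
    # keep the root lying between the two class means
    lo, hi = sorted((mu1, mu2))
    return next(r.real for r in roots if lo <= r.real <= hi)

# hypothetical parameters, as an optimizer might return them for a
# bimodal gray-level histogram (dark class around 60, bright around 170)
t = gaussian_intersection(60, 12, 0.5, 170, 20, 0.5)
```

For more than two classes, the same intersection is computed between each pair of adjacent Gaussians, giving one threshold per class boundary.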

    Automatic recognition of the digital modulation types using the artificial neural networks

    Get PDF
    As digital communication technologies continue to grow and evolve, so do their applications. This growth has created a need for automated methods of recognizing and classifying the digital modulation type used in a communication system, which is important in many civil and military applications. This paper proposes a recognition system capable of classifying several different digital modulation types (64QAM, 2PSK, 4PSK, 8PSK, 4ASK, 2FSK, 4FSK, 8FSK). It uses an artificial neural network (ANN) to boost performance and increase the noise immunity of the system, and succeeded in recognizing all the digital modulation types under study without any prior information. The proposed system uses 8 signal features to classify these 8 modulation methods, achieving a recognition ratio of at least 68% for experimental signals at a signal-to-noise ratio (SNR) of 5 dB, 89.1% at an SNR of 10 dB, and 91% at an SNR of 15 dB, for a channel with Additive White Gaussian Noise (AWGN).
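As an illustration of the kind of signal features such a recognizer relies on, the sketch below computes one classic feature, the spread of the mean-normalized instantaneous amplitude (often written as a sigma_aa-style statistic), which separates amplitude-keyed classes such as 4ASK from constant-envelope ones such as 2PSK. The synthetic signals and the feature choice are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

rng = np.random.default_rng(0)

def amplitude_spread(signal):
    """Std of the mean-normalized instantaneous amplitude: near zero
    for constant-envelope signals, large for amplitude-keyed ones."""
    a = np.abs(signal)
    return np.std(a / a.mean() - 1)

n = 4000
# hypothetical noiseless baseband symbol streams
psk2 = np.exp(1j * np.pi * rng.integers(0, 2, n))       # 2PSK: unit envelope
ask4 = (2 * rng.integers(0, 4, n) - 3).astype(complex)  # 4ASK levels -3,-1,1,3

# constant-envelope 2PSK gives a near-zero spread; 4ASK does not
assert amplitude_spread(psk2) < 0.01 < amplitude_spread(ask4)
```

A real system would feed several such features (amplitude, phase and frequency statistics) into the ANN, which learns the decision boundaries between the 8 classes.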

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Get PDF
    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions about the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity faced by optical networks in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by coherent transmission/reception technologies, advanced digital signal processing and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing possible new research directions.

    Digital Modulation Type Detection Using a Genetic Algorithm and an Artificial Neural Network for Automatic Modulation Recognition

    Get PDF
    ABSTRACT: Automatic Digital Modulation Recognition (ADMR) has been developed to support the performance of current technologies. This sub-block of the receiver plays an important role in terms of optimization and flexibility. Optimizing a device in hardware would be costly and difficult to modify or upgrade, so a practical solution is a software approach, which is known as Software Defined Radio (SDR). This final project focuses on detecting the digital modulations QPSK, 16QAM, and 64QAM, which are used in WiMAX. The detection system uses spectral and statistical methods for feature extraction, adds a new feature-selection stage based on a Genetic Algorithm, and uses an Artificial Neural Network (ANN) for the decision stage; the transmission channel is a fading channel. The results show that a feature reduction of 50% to 75% improves the accuracy of the system. The optimum feature selection for correctly detecting ≥90% of the modulation types is the same at all three channel speeds: at 3, 30 and 120 km/h, the optimum set consists of four features (STD Frequency, Mean, Variance, PSD Max) at a minimum SNR of 0 dB. Keywords: Automatic Digital Modulation Recognition, Software Defined Radio, Genetic Algorithm, Artificial Neural Network, Fading channel
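The genetic-algorithm feature-selection stage described above can be sketched roughly as follows. The fitness function here is a toy surrogate for classifier accuracy; in the actual system each candidate feature subset would be scored by the ANN's detection accuracy over the fading channel. The informative/noise split and all constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 8 candidate features, of which 4 (indices 0-3)
# are informative -- standing in for STD Frequency, Mean, Variance
# and PSD Max from the study.
N_FEATURES, POP, GENS = 8, 20, 40
INFORMATIVE = np.array([1, 1, 1, 1, 0, 0, 0, 0])

def fitness(mask):
    """Toy surrogate for classifier accuracy: reward selected
    informative features, penalize selected noise features."""
    return np.sum(mask * INFORMATIVE) - 0.5 * np.sum(mask * (1 - INFORMATIVE))

pop = rng.integers(0, 2, (POP, N_FEATURES))  # population of binary masks
for _ in range(GENS):
    scores = np.array([fitness(m) for m in pop])
    # tournament selection: keep the better of two random individuals
    i, j = rng.integers(0, POP, (2, POP))
    parents = pop[np.where(scores[i] >= scores[j], i, j)]
    # one-point crossover between consecutive parents
    children = parents.copy()
    for k in range(0, POP - 1, 2):
        c = rng.integers(1, N_FEATURES)
        children[k, c:], children[k + 1, c:] = parents[k + 1, c:], parents[k, c:]
    # bit-flip mutation with a small per-gene probability
    pop = children ^ (rng.random(children.shape) < 0.05)

best = pop[np.argmax([fitness(m) for m in pop])]
```

Because the fitness surrogate is cheap here, the loop runs instantly; with an ANN in the loop, each generation requires re-evaluating the classifier on every candidate feature subset, which is where most of the cost of this approach lies.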

    Evolutionary robotics and neuroscience

    Get PDF
    No description supplied

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Full text link
    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine networks (M2M), and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. Comment: 46 pages, 22 figures