
    Region-Based Classification of PolSAR Data Using Radial Basis Kernel Functions With Stochastic Distances

    Region-based classification of PolSAR data can be effectively performed by seeking the assignment that minimizes a distance between prototypes and segments. Silva et al. (2013) used stochastic distances between complex multivariate Wishart models which, unlike other measures, are computationally tractable. In this work we assess the robustness of this approach with respect to errors in the training stage, and propose an extension that alleviates such problems. We introduce robustness into the process by combining radial basis kernel functions and stochastic distances with Support Vector Machines (SVM). We consider several stochastic distances between Wishart models: Bhattacharyya, Kullback-Leibler, Chi-Square, Rényi, and Hellinger. We perform two case studies with PolSAR images, both simulated and from actual sensors, and different classification scenarios to compare the performance of the Minimum Distance and SVM classification frameworks. With this, we model the situation of imperfect training samples. We show that SVM with the proposed kernel functions achieves better performance than Minimum Distance, at the expense of more computational resources and the need for parameter tuning. Code and data are provided for reproducibility. Comment: Accepted for publication in the International Journal of Digital Earth.
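The kernel construction described in this abstract can be sketched in a few lines: a stochastic distance d between covariance models is turned into a radial basis kernel K = exp(-d/σ), which an SVM can then consume as a precomputed Gram matrix. The sketch below is a simplification, not the authors' code: it substitutes the symmetrized Kullback-Leibler distance between zero-mean Gaussian models for the full Wishart machinery.

```python
import numpy as np

def skl_distance(S1, S2):
    """Symmetrized Kullback-Leibler distance between zero-mean
    Gaussian models with covariance matrices S1 and S2."""
    p = S1.shape[0]
    return 0.5 * np.trace(np.linalg.solve(S1, S2) +
                          np.linalg.solve(S2, S1)) - p

def rbf_stochastic_kernel(covs, sigma=1.0):
    """Gram matrix K[i, j] = exp(-d(S_i, S_j) / sigma)."""
    n = len(covs)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.exp(-skl_distance(covs[i], covs[j]) / sigma)
    return K

# Toy segment "prototypes": random symmetric positive-definite matrices.
rng = np.random.default_rng(0)
covs = []
for _ in range(4):
    A = rng.standard_normal((3, 3))
    covs.append(A @ A.T + 3 * np.eye(3))

K = rbf_stochastic_kernel(covs)
```

The resulting matrix is symmetric with unit diagonal, so it can be passed to an SVM implementation that accepts precomputed kernels.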

    DETECTING MYOCARDIAL INFARCTIONS USING MACHINE LEARNING METHODS

    Myocardial Infarction (MI), commonly known as a heart attack, occurs when one of the three major blood vessels carrying blood to the heart gets blocked, causing the death of myocardial (heart) cells. If not treated immediately, MI may cause cardiac arrest, which can ultimately cause death. Risk factors for MI include diabetes, family history, and an unhealthy diet and lifestyle. Medical treatments include various types of drugs and surgeries, which can prove very expensive for patients due to high healthcare costs. Therefore, it is imperative that MI is diagnosed at the right time. Electrocardiography (ECG), a process in which the heart's electrical signals are measured by electrodes placed on a patient's limbs and chest, is commonly used to detect MI. In recent years, the availability of medical datasets and the invention of wearable devices have opened new possibilities for early detection of this disease. Wearable devices that measure ECG correctly and have built-in machine learning models can potentially save millions of lives the world over. This research explores traditional machine learning techniques such as Support Vector Machines and Decision Trees, as well as a newer technique, the Capsule Neural Network, for MI detection. Even though the new technique achieves remarkable results, its accuracy is lower than that of the traditional machine learning techniques used for MI detection.
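As a hedged illustration of the traditional classifiers this abstract mentions, the comparison can be set up in a few lines with scikit-learn. The data below is synthetic, standing in for ECG-derived features; it is not the dataset used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for ECG-derived features: two noisy classes
# (e.g. "MI" vs "healthy"); the real study uses labeled ECG records.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (200, 8)),
               rng.normal(1.5, 1.0, (200, 8))])
y = np.array([0] * 200 + [1] * 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)

# Fit each classifier and record its held-out accuracy.
accs = {}
for model in (SVC(kernel="rbf", C=1.0),
              DecisionTreeClassifier(max_depth=5, random_state=0)):
    accs[type(model).__name__] = model.fit(X_tr, y_tr).score(X_te, y_te)
```

Swapping in a Capsule Neural Network would require a deep learning framework; the point here is only the common fit/score workflow shared by the traditional models.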

    Multisensor systems and flood risk management. Application to the Danube Delta using radar and hyperspectral imagery

    At the beginning of the 21st century, flood risk still represents a major world threat (together with storms, floods account for 60% of natural disasters) and climate warming may even accentuate this phenomenon in the future. In Europe, despite all the policies in place and the measures taken during the past decades, large floods have occurred almost every year. The news regularly confirms this reality and the serious threat posed by flood risk in Europe. This paper presents an application to the Danube Delta exploiting ENVISAT/ASAR radar imagery and CHRIS/PROBA hyperspectral imagery for mapping flooded and floodable areas during the events of spring 2006. The use of multisensor systems, such as radar and hyperspectral imagers, contributes to a better comprehension of floods in this wetland and their impacts, and to risk management and sustainable development in the delta. In the Risk management section, this paper addresses the methodological aspects related to the characterization of the flood hazard, whereas the Forecasting section focuses on the knowledge and modeling of land cover. Kernel methods, particularly suited to highlighting spatio-temporal variations (Support Vector Machine), as well as methods based on the principle of fuzzy logic (object-oriented classifications), are implemented in order to obtain the information layer of the spatial data.

    Early Disease Detection Through Nail Image Processing Based on Ensemble of Classifier Models

    Medical science has progressed in many ways and different methods have been developed for the diagnosis of diseases in the human body; one way to identify diseases is through close examination of the fingernails. The main aim of this study is to compare the performance of various classifier models used for the prediction of various diseases. The performance analysis is done by applying image processing and different data mining and machine learning techniques to the extracted nail image through our proposed system, which performs nail analysis using a combination of 13 features (nail color, shape and texture) extracted from the nail image. In this paper we compare different machine learning classifiers, such as Support Vector Machine, Multiclass SVM and K-Nearest Neighbor, through an ensemble of these classifiers with different features, so as to classify patients with different diseases such as Psoriasis, Red Lunula, Beau's Lines, Clubbing, etc. These approaches were tested with image data from hospitals and workplaces. The performance of the different classifiers has been measured in terms of accuracy, sensitivity and specificity.
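The ensemble idea above, combining SVM and K-Nearest Neighbor votes, can be sketched with scikit-learn's VotingClassifier. The data below is synthetic, standing in for the 13 extracted nail features; the classes and feature values are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the 13 nail features (color, shape, texture):
# three well-separated "disease" classes with different feature means.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 1.0, (100, 13)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 100)

# Majority ("hard") vote over an SVM and a k-NN classifier.
ensemble = VotingClassifier(
    estimators=[("svm", SVC()), ("knn", KNeighborsClassifier(5))],
    voting="hard",
)
acc = ensemble.fit(X, y).score(X, y)
```

On real nail images the features would come from an image-processing front end; the voting step itself stays the same.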

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to make decisions pertaining to the proper functioning of the networks. Among these mathematical tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth of network complexity faced by optical networks in the last few years. This increase in complexity is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) that are enabled by the usage of coherent transmission/reception technologies, advanced digital signal processing and compensation of nonlinear effects in optical fiber propagation. In this paper we provide an overview of the application of ML to optical communications and networking. We classify and survey relevant literature dealing with the topic, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing new possible research directions.

    Two-Stage Fuzzy Multiple Kernel Learning Based on Hilbert-Schmidt Independence Criterion

    Multiple kernel learning (MKL) is a principled approach to kernel combination and selection for a variety of learning tasks, such as classification, clustering, and dimensionality reduction. In this paper, we develop a novel fuzzy multiple kernel learning model based on the Hilbert-Schmidt independence criterion (HSIC) for classification, which we call HSIC-FMKL. In this model, we first propose an HSIC Lasso-based MKL formulation, which not only has a clear statistical interpretation (minimally redundant kernels with maximum dependence on the output labels are found and combined), but also enables the global optimal solution to be computed efficiently by solving a Lasso optimization problem. Since the traditional support vector machine (SVM) is sensitive to outliers or noise in the dataset, a fuzzy SVM (FSVM) is used to select the prediction hypothesis once the optimal kernel has been obtained. The main advantage of FSVM is that we can associate a fuzzy membership with each data point, so that different data points can have different effects on the training of the learning machine. We propose a new fuzzy membership function using a heuristic strategy based on the HSIC. The proposed HSIC-FMKL is a two-stage kernel learning approach in which the HSIC is applied in both stages. We perform extensive experiments on real-world datasets from the UCI benchmark repository and the application domain of computational biology, validating the superiority of the proposed model in terms of prediction accuracy.
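The empirical HSIC underlying this formulation has a compact form: with Gram matrices K and L and the centering matrix H = I - 11ᵀ/n, HSIC(K, L) = tr(KHLH)/(n-1)². A minimal numpy sketch with Gaussian kernels (a didactic illustration, not the paper's full HSIC Lasso):

```python
import numpy as np

def gaussian_gram(x, sigma=1.0):
    """Gram matrix of the Gaussian kernel on a 1-D sample x."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(K, L):
    """Empirical HSIC: trace(K H L H) / (n - 1)^2 with H = I - 11^T/n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal(100)
y_dep = x ** 2 + 0.1 * rng.standard_normal(100)  # nonlinearly dependent on x
y_ind = rng.standard_normal(100)                 # independent of x

K = gaussian_gram(x)
h_dep = hsic(K, gaussian_gram(y_dep))
h_ind = hsic(K, gaussian_gram(y_ind))
```

A dependent pair yields a noticeably larger HSIC value than an independent one, which is the property the HSIC Lasso exploits to rank candidate kernels by their dependence on the labels.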

    Urban objects classification using Mueller matrix polarimetry and machine learning

    Detecting and recognizing different kinds of urban objects is an important problem, in particular in autonomous driving. In this context, we studied the potential of Mueller matrix polarimetry for classifying a set of relevant real-world objects: vehicles, pedestrians, traffic signs, pavements, vegetation and tree trunks. We created a database with their experimental Mueller matrices measured at 1550 nm and trained two machine learning classifiers, a support vector machine and an artificial neural network, to classify new samples. The overall accuracy of over 95% achieved with this approach, with either model, reveals the potential of polarimetry, especially combined with other remote sensing techniques, to enhance object recognition.
    This work is supported by the European Structural and Investment Funds in the FEDER component, through the Operational Competitiveness and Internationalization Program (COMPETE 2020) [Project n° 037902; Funding Reference: POCI-01-0247-FEDER-037902] and partially supported by the Portuguese Foundation for Science and Technology (FCT) in the framework of the Strategic Funding UIDB/04650/2020. The authors acknowledge Alexandre Correia and Moisés Duarte (Bosch Car Multimedia Portugal S.A.) and Dr. Rui Pereira and Dr. Stéphane Clain (Minho University) for fruitful discussions on data analysis. The authors also acknowledge the city council of Braga (Portugal) for the supply of samples.

    Mitigation of Nonlinear Impairments by Using Support Vector Machine and Nonlinear Volterra Equalizer

    A support vector machine (SVM) based detection is applied to different equalization schemes for a data center interconnect link using coherent 64 GBd 64-QAM over 100 km standard single mode fiber (SSMF). Without any prior knowledge or heuristic assumptions, the SVM is able to learn and capture the transmission characteristics from only a short training data set. We show that, with the use of suitable kernel functions, the SVM can create nonlinear decision thresholds and reduce the errors caused by nonlinear phase noise (NLPN), laser phase noise, I/Q imbalances and so forth. In order to apply the SVM to 64-QAM we introduce a binary coding SVM, which provides a binary multiclass classification with reduced complexity. We investigate the performance of this SVM and show how it can improve the bit-error rate (BER) of the entire system. After 100 km the fiber-induced nonlinear penalty is reduced by 2 dB at a BER of 3.7 × 10⁻³. Furthermore, we apply a nonlinear Volterra equalizer (NLVE), which is based on the nonlinear Volterra theory, as another method for mitigating nonlinear effects. The combination of SVM and NLVE reduces the large computational complexity of the NLVE and allows more accurate compensation of nonlinear transmission impairments.
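The core idea of an SVM learning nonlinear decision thresholds from received symbols can be sketched on a toy constellation. The example below uses QPSK with additive noise plus an invented power-dependent phase rotation mimicking NLPN; it is far simpler than the paper's 64-QAM setup and its binary coding scheme.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
symbols = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Received symbols: AWGN plus a power-dependent phase rotation,
# a toy stand-in for nonlinear phase noise (NLPN).
labels = rng.integers(0, 4, 2000)
tx = symbols[labels]
noisy = tx + 0.1 * (rng.standard_normal(2000) + 1j * rng.standard_normal(2000))
rx = noisy * np.exp(1j * 0.3 * np.abs(noisy) ** 2)

# Train an RBF-kernel SVM on (I, Q) coordinates; the learned decision
# regions bend to follow the rotated, distorted constellation.
X = np.column_stack([rx.real, rx.imag])
clf = SVC(kernel="rbf", gamma=2.0).fit(X[:1500], labels[:1500])
acc = clf.score(X[1500:], labels[1500:])
```

A linear (hard-threshold) decision grid would misclassify many of the rotated symbols, which is precisely the gap the kernel SVM closes.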

    Assessment of multi-temporal, multi-sensor radar and ancillary spatial data for grasslands monitoring in Ireland using machine learning approaches

    Accurate inventories of grasslands are important for studies of carbon dynamics, biodiversity conservation and agricultural management. For regions with persistent cloud cover the use of multi-temporal synthetic aperture radar (SAR) data provides an attractive solution for generating up-to-date inventories of grasslands. This is even more appealing considering the data that will be available from upcoming missions such as Sentinel-1 and ALOS-2. In this study, the performance of three machine learning algorithms, Random Forests (RF), Support Vector Machines (SVM) and the relatively underused Extremely Randomised Trees (ERT), is evaluated for discriminating between grassland types over two large heterogeneous areas of Ireland using multi-temporal, multi-sensor radar and ancillary spatial datasets. A detailed accuracy assessment shows the efficacy of the three algorithms to classify different types of grasslands. Overall accuracies ≥ 88.7% (with kappa coefficient of 0.87) were achieved for the single frequency classifications and maximum accuracies of 97.9% (kappa coefficient of 0.98) for the combined frequency classifications. For most datasets, the ERT classifier outperforms SVM and RF.
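For readers unfamiliar with Extremely Randomised Trees, scikit-learn exposes the method alongside Random Forests with a nearly identical interface. A minimal comparison on synthetic data (not the SAR datasets of this study) looks like this:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic multi-class problem standing in for multi-temporal
# SAR features over several grassland classes.
X, y = make_classification(n_samples=600, n_features=12, n_informative=8,
                           n_classes=4, random_state=0)

# Cross-validated accuracy of each ensemble.
scores = {}
for name, model in [("RF", RandomForestClassifier(200, random_state=0)),
                    ("ERT", ExtraTreesClassifier(200, random_state=0))]:
    scores[name] = cross_val_score(model, X, y, cv=5).mean()
```

The two ensembles differ only in how splits are chosen: ERT draws split thresholds at random rather than optimizing them, which reduces variance and often trains faster.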

    Application of Machine Learning in Optical Communications Engineering

    Due to increasing data traffic, optical networks are expected to operate at higher system capacities in the future. To this end, coherent transmission is employed, for example, in which the modulation format can be increased; this, however, requires a larger SNR. To achieve it, the optical signal power is raised, whereby the data transmission is degraded by nonlinear impairments. The focus of this work is the development of machine learning models that respond to this nonlinear signal degradation. The Support Vector Machine (SVM) is implemented and used as a classifying decision machine. The results show that the SVM enables improved compensation of both the nonlinear fiber effects and the distortions introduced by the optical system components. The principle of elastic optical networks (EONs) offers a technology for the efficient use of the resources provided by the optical fiber. A key element of this technology is the bandwidth-variable transponder, which allows, for example, the modulation format or the coding scheme to be adapted to the current link conditions. To ensure optimal resource utilization, the use of reinforcement learning (RL) algorithms is investigated. The results show that the RL algorithm is able to adapt to unknown link conditions, while comparable heuristic approaches such as the genetic algorithm must be retrained for each scenario.
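The reinforcement-learning idea in the last paragraph can be illustrated by a tiny multi-armed-bandit sketch: an agent repeatedly picks a modulation format, observes the resulting throughput on an unknown link, and converges on the best choice. This is a didactic epsilon-greedy toy with an invented reward model, not the thesis's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented reward model: net throughput (bits per symbol on success)
# of each modulation format on an unknown link; higher-order formats
# carry more bits but fail more often here.
formats = ["QPSK", "16QAM", "64QAM"]
def throughput(arm):
    success = {"QPSK": 0.99, "16QAM": 0.90, "64QAM": 0.40}[formats[arm]]
    bits = {"QPSK": 2, "16QAM": 4, "64QAM": 6}[formats[arm]]
    return bits if rng.random() < success else 0.0

# Epsilon-greedy bandit: explore with probability eps, else exploit
# the format with the highest running mean reward.
eps, n_arms = 0.1, len(formats)
counts = np.zeros(n_arms)
values = np.zeros(n_arms)
for t in range(3000):
    arm = rng.integers(n_arms) if rng.random() < eps else int(values.argmax())
    r = throughput(arm)
    counts[arm] += 1
    values[arm] += (r - values[arm]) / counts[arm]  # incremental mean

best = formats[int(values.argmax())]
```

If the link conditions change, the same loop keeps adapting its estimates, which is the advantage over a heuristic tuned once per scenario.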