29 research outputs found

    A fuzzified BRAIN algorithm for learning DNF from incomplete data

    The aim of this paper is to address the problem of learning Boolean functions from training data with missing values. We present an extension of the BRAIN algorithm, called U-BRAIN (Uncertainty-managing Batch Relevance-based Artificial INtelligence), conceived for learning DNF Boolean formulas from partial truth tables, possibly with uncertain values or missing bits. The algorithm is obtained from BRAIN by introducing fuzzy sets to manage uncertainty. When no missing bits are present, the algorithm reduces to the original BRAIN.
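    The core idea of handling missing bits with fuzzy sets can be illustrated with a minimal sketch (not the U-BRAIN algorithm itself): a missing bit is treated as a fuzzy membership degree of 0.5, conjunctions use product as fuzzy AND, and the DNF uses max as fuzzy OR. The formula and encoding below are illustrative assumptions.

```python
# Illustrative sketch: evaluating a DNF formula on instances with missing
# bits, treating an unknown bit as fuzzy membership 0.5 (product = AND,
# max = OR). Not the actual U-BRAIN inference procedure.

def fuzzy_literal(value, negated):
    """Truth degree of a literal; None marks a missing bit (degree 0.5)."""
    v = 0.5 if value is None else float(value)
    return 1.0 - v if negated else v

def fuzzy_dnf(instance, terms):
    """Evaluate a DNF formula.

    instance: list of 0/1/None bits.
    terms: list of conjunctions, each a list of (index, negated) literals.
    Returns a truth degree in [0, 1]; with no missing bits this reduces
    to the crisp Boolean evaluation (0.0 or 1.0), mirroring how U-BRAIN
    reduces to BRAIN on complete data.
    """
    degrees = []
    for term in terms:
        d = 1.0
        for idx, neg in term:
            d *= fuzzy_literal(instance[idx], neg)
        degrees.append(d)
    return max(degrees)

# f = (x0 AND NOT x1) OR x2
formula = [[(0, False), (1, True)], [(2, False)]]
print(fuzzy_dnf([1, 0, 0], formula))     # crisp case: 1.0
print(fuzzy_dnf([1, None, 0], formula))  # missing x1: degree 0.5
```

    With complete instances the function returns exactly 0.0 or 1.0, so the fuzzy evaluation is a strict generalization of the crisp one.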

    Shape-based defect classification for Non Destructive Testing

    The aim of this work is to classify the aerospace structure defects detected by eddy current non-destructive testing. The proposed method is based on the assumption that the defect is bound to the reaction of the probe coil impedance during the test. Impedance plane analysis is used to extract a feature vector from the shape of the coil impedance in the complex plane, through a set of geometric parameters. Shape recognition is tested with three different machine-learning classifiers: decision trees, neural networks, and Naive Bayes. The performance of the proposed detection system is measured in terms of accuracy, sensitivity, specificity, precision, and Matthews correlation coefficient. Several experiments are performed on a dataset of eddy current signal samples for aircraft structures. The obtained results demonstrate the usefulness of our approach and its competitiveness against existing descriptors. Comment: 5 pages, IEEE International Workshop
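    The evaluation metrics named above are standard functions of a binary confusion matrix. A minimal sketch, with illustrative counts (the actual feature extraction and classifiers are not reproduced here):

```python
# Accuracy, sensitivity, specificity, precision, and Matthews correlation
# coefficient (MCC) computed from a binary confusion matrix.
import math

def binary_metrics(tp, tn, fp, fn):
    acc = (tp + tn) / (tp + tn + fp + fn)
    sens = tp / (tp + fn)            # sensitivity (recall)
    spec = tn / (tn + fp)            # specificity
    prec = tp / (tp + fp)            # precision
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {"accuracy": acc, "sensitivity": sens,
            "specificity": spec, "precision": prec, "mcc": mcc}

# Illustrative counts, not results from the paper
m = binary_metrics(tp=40, tn=45, fp=5, fn=10)
print(m)
```

    MCC is the most informative of the five on imbalanced data, since it only approaches 1 when all four confusion-matrix cells are favorable.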

    Neural Network Aided Glitch-Burst Discrimination and Glitch Classification

    We investigate the potential of neural-network-based classifiers for discriminating gravitational wave bursts (GWBs) of a given canonical family (e.g. core-collapse supernova waveforms) from typical transient instrumental artifacts (glitches) in the data of a single detector. The further classification of glitches into typical sets is also explored. To provide a proof of concept, we use the core-collapse supernova waveform catalog produced by H. Dimmelmeier and co-workers, and the database of glitches observed in Laser Interferometer Gravitational-wave Observatory (LIGO) data maintained by P. Saulson and co-workers, to construct datasets of (windowed) transient waveforms (glitches and bursts) in additive (Gaussian and compound-Gaussian) noise with different signal-to-noise ratios (SNR). Principal component analysis (PCA) is first applied to reduce data dimensionality, yielding results consistent with, and extending, those in the literature. A multilayer perceptron trained by a backpropagation algorithm (MLP-BP) is then used on a data subset to classify the transients as glitch or burst. A Self-Organizing Map (SOM) architecture is finally used to classify the glitches. The glitch/burst discrimination and glitch classification abilities are gauged in terms of the related truth tables. Preliminary results suggest that the approach is effective and robust throughout the SNR range of practical interest. Prospective applications pertain both to distributed (network, multisensor) detection of GWBs, where some "intelligence" at the single-node level can be introduced, and to instrument diagnostics/optimization, where spurious transients can be identified, classified and, hopefully, traced back to their entry point.
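    The PCA dimensionality-reduction stage of the pipeline can be sketched as follows. The synthetic "transients" (damped sinusoids in Gaussian noise), the window length, and the number of retained components are illustrative assumptions; the MLP-BP and SOM stages are omitted.

```python
# PCA via SVD on synthetic windowed transients: project each waveform
# onto the top-k principal directions of the centered data matrix.
import numpy as np

rng = np.random.default_rng(0)
n, length, k = 200, 64, 8          # waveforms, samples per window, components

# Synthetic transients: damped sinusoids with random phase plus noise
t = np.linspace(0, 1, length)
phases = rng.uniform(0, 2 * np.pi, size=(n, 1))
X = (np.exp(-4 * t) * np.sin(2 * np.pi * 8 * t + phases)
     + 0.1 * rng.standard_normal((n, length)))

# PCA: center the data, take the right singular vectors as components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                  # reduced k-dimensional representation

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(Z.shape, round(float(explained), 3))
```

    The reduced vectors `Z` would then feed the downstream classifier; the explained-variance ratio guides the choice of `k`.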

    Linear Codes Interpolation from Noisy Patterns by means of a Vector Quantization Process

    We present an algorithm that infers a Boolean linear code from noisy patterns received over a noisy channel, under the assumption of a uniform occurrence distribution over the codewords, together with an upper bound on the amount of data required. A vector quantizer is designed from the noisy patterns, and the resulting codebook is taken as the code approximation.
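    The idea can be sketched on a toy example: transmit codewords of a tiny [3,2] binary linear code through a bit-flipping channel, design a vector quantizer (Lloyd's k-means) from the noisy patterns, and round the codebook to bits. The code, noise level, and initialization are illustrative assumptions, not the paper's construction.

```python
# Vector-quantization sketch: recover a [3,2] binary linear code from
# noisy received patterns by k-means clustering and rounding centroids.
import numpy as np

rng = np.random.default_rng(1)

# Codewords: all messages m in {0,1}^2 times the generator matrix G, mod 2
G = np.array([[1, 0, 1],
              [0, 1, 1]])
codewords = np.array([(np.array(m) @ G) % 2 for m in
                      [[0, 0], [0, 1], [1, 0], [1, 1]]])

# Noisy channel: uniform codeword choice, each bit flipped w.p. 0.05
idx = rng.integers(0, 4, size=2000)
noise = rng.random((2000, 3)) < 0.05
patterns = (codewords[idx] ^ noise).astype(float)

# Initialize centroids at the 4 most frequent received patterns
uniq, counts = np.unique(patterns, axis=0, return_counts=True)
centroids = uniq[np.argsort(counts)[-4:]].astype(float)

# Lloyd's algorithm: assign to nearest centroid, recompute means
for _ in range(20):
    d = ((patterns[:, None, :] - centroids[None]) ** 2).sum(axis=2)
    assign = d.argmin(axis=1)
    for j in range(4):
        if (assign == j).any():
            centroids[j] = patterns[assign == j].mean(axis=0)

codebook = centroids.round().astype(int)  # code approximation
print(sorted(map(tuple, codebook)))
```

    At this noise level the rounded codebook coincides with the true code; as the flip probability grows, the codebook degrades gracefully into an approximation.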

    Recognition of splice junctions on DNA sequences by BRAIN learning algorithm

    Motivation: The problem addressed in this paper is the prediction of splice site locations in human DNA. The aims of the proposed approach are explicit splicing rule description, high recognition quality, and robust, stable 'one-shot' data processing.

    Forecasting the spread of SARS-CoV-2 in the campania region using genetic programming

    Coronavirus disease 2019 (COVID-19) is an infectious disease caused by the SARS-CoV-2 virus, responsible for the ongoing global pandemic. Stringent measures have been adopted to face the pandemic, such as complete lockdowns, the shutdown of businesses and trade, and travel restrictions. Nevertheless, such solutions have had a tremendous economic impact. Although recent vaccines seem to reduce the scale of the problem, the pandemic does not appear likely to end soon. Therefore, a forecasting model of the COVID-19 spread is of paramount importance to plan interventions and thus limit the economic and social damage. In this paper, we use Genetic Programming to uncover dependencies of the SARS-CoV-2 spread on past data in a given country. Specifically, we analyze real data from the Campania Region, in Italy. The resulting models prove effective in forecasting the number of new positives 10-15 days in advance, with quite high accuracy. The developed models have been integrated into SVIMAC-19, an analytical forecasting system for the containment, contrast, and monitoring of COVID-19 within the Campania Region.
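    A minimal genetic-programming sketch (not the SVIMAC-19 system): evolve arithmetic expressions that predict the next value of a case-count series from its two most recent lags, using truncation selection with elitism and subtree mutation. The synthetic series, primitive set, and GP settings are all illustrative assumptions.

```python
# Tiny GP for symbolic regression on a lagged series: expression trees
# over {x1, x2, constants, +, -, *}, fitness = mean squared error.
import random

random.seed(42)
OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}
TERMINALS = ['x1', 'x2', 0.5, 1.0, 2.0]   # lags and ephemeral constants

def rand_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), rand_tree(depth - 1), rand_tree(depth - 1))

def evaluate(tree, x1, x2):
    if tree == 'x1': return x1
    if tree == 'x2': return x2
    if isinstance(tree, float): return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x1, x2), evaluate(right, x1, x2))

def mse(tree, data):
    err = 0.0
    for x1, x2, y in data:
        p = evaluate(tree, x1, x2)
        if not (-1e12 < p < 1e12):    # guard against runaway expressions
            return float('inf')
        err += (p - y) ** 2
    return err / len(data)

def mutate(tree):
    """Subtree mutation: replace a random subtree with a fresh one."""
    if random.random() < 0.5 or not isinstance(tree, tuple):
        return rand_tree(2)
    op, left, right = tree
    return (op, mutate(left), right) if random.random() < 0.5 else (op, left, mutate(right))

# Synthetic "spread" series following y_t = 1.1*y_{t-1} - 0.1*y_{t-2}
series = [10.0, 12.0]
for _ in range(30):
    series.append(1.1 * series[-1] - 0.1 * series[-2])
data = [(series[i - 1], series[i - 2], series[i]) for i in range(2, len(series))]

pop = [rand_tree() for _ in range(60)]
initial_best = min(mse(t, data) for t in pop)
for gen in range(30):
    pop.sort(key=lambda t: mse(t, data))
    survivors = pop[:20]              # truncation selection + elitism
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(40)]
best = min(pop, key=lambda t: mse(t, data))
print(round(mse(best, data), 4))
```

    A production system would add crossover, depth limits, and out-of-sample validation; this sketch only shows the evolve-expressions-against-lagged-data loop the abstract describes.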