1,917 research outputs found

    Data fusion approach for eucalyptus trees identification

    Get PDF
    Remote sensing is based on the extraction of data, acquired by satellites or aircraft through multispectral images, that allows their remote analysis and classification. Analysing those images with data fusion techniques is a promising approach for the identification and classification of forest types. Fusion techniques can aggregate various sources of heterogeneous information to generate value-added maps, facilitating forest-type classification. This work applies a data fusion algorithm, denoted FIF (Fuzzy Information Fusion), which combines computational intelligence techniques with multicriteria concepts and techniques, to automatically distinguish Eucalyptus trees in satellite images. The algorithm was customized on a Portuguese area planted with Eucalyptus. After customizing and validating the approach with several representative scenarios to assess its suitability for the automatic classification of Eucalyptus, we tested it on a large tile, obtaining a sensitivity of 69.61%, a specificity of 99.43%, and an overall accuracy of 98.19%. This work demonstrates the potential of the approach to automatically classify specific forest types from satellite images, as this is a novel approach dedicated to the identification of eucalyptus trees.
    UIDB/00066/2020; DSAIPA/AI/0100/2018
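
    For reference, the sensitivity, specificity and overall accuracy quoted above are standard confusion-matrix ratios; the minimal sketch below (with hypothetical pixel counts, not the paper's results) shows how they are computed.

```python
def confusion_metrics(tp, fp, tn, fn):
    """Sensitivity (recall of the eucalyptus class), specificity (recall of
    the non-eucalyptus class) and overall accuracy from pixel counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# hypothetical pixel counts for a classified tile
print(confusion_metrics(tp=80, fp=10, tn=900, fn=30))
```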

    A Multi Views Approach for Remote Sensing Fusion Based on Spectral, Spatial and Temporal Information

    Get PDF
    The objectives of this chapter are to contribute to the understanding of image fusion approaches, including concept definitions, techniques and results assessment. It is structured in five sections. Following this introduction, a definition of image fusion introduces the fundamental concepts involved, and we explain the cases in which image fusion can be useful. Most existing techniques and architectures are reviewed and classified in the third section. In the fourth section, we focus on algorithms based on the multi-view approach and compare and analyse their process models and algorithms, including the advantages, limitations and applicability of each view. The last part of the chapter summarizes the benefits and limitations of multi-view image fusion and gives some recommendations on the effectiveness and performance of these methods. These recommendations, based on a comprehensive study and meaningful quantitative metrics, evaluate the various proposed views by applying them to several environmental applications with remotely sensed images from different sensors. In the concluding section, we close the chapter with a summary and recommendations for future research.

    Algorithms for sensor validation and multisensor fusion

    Get PDF
    Existing techniques for sensor validation and sensor fusion are often based on analytical sensor models. Such models can be arbitrarily complex and consequently Gaussian distributions are often assumed, generally with a detrimental effect on overall system performance. A holistic approach has therefore been adopted in order to develop two novel and complementary approaches to sensor validation and fusion based on empirical data. The first uses the Nadaraya-Watson kernel estimator to provide competitive sensor fusion. The new algorithm is shown to reliably detect and compensate for bias errors, spike errors, hardover faults, drift faults and erratic operation affecting up to three of the five sensors in the array. The inherent smoothing action of the kernel estimator provides effective noise cancellation, and the fused result is more accurate than the single 'best sensor'. A Genetic Algorithm has been used to optimise the Nadaraya-Watson fuser design. The second approach uses analytical redundancy to provide the on-line sensor status output μH∈[0,1], where μH=1 indicates that the sensor output is valid and μH=0 that the sensor has failed. This fuzzy measure is derived from change-detection parameters based on spectral analysis of the sensor output signal. The validation scheme can reliably detect a wide range of sensor fault conditions. An appropriate context-dependent fusion operator can then be used to perform competitive, cooperative or complementary sensor fusion, with a status output from the fuser providing a useful qualitative indication of the status of the sensors used to derive the fused result. The operation of both schemes is illustrated using data obtained from an array of thick-film metal oxide pH sensor electrodes. An ideal pH electrode will sense only the activity of hydrogen ions; however, the selectivity of the metal oxide device is worse than that of the conventional glass electrode. The use of sensor fusion can therefore reduce measurement uncertainty by combining readings from multiple pH sensors having complementary responses. The array can be conveniently fabricated by screen-printing sensors using different metal oxides onto a single substrate.
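
    As a rough illustration of the kernel-based fusion idea, the sketch below fuses one set of readings from a five-sensor array with a Nadaraya-Watson style weighting; the Gaussian kernel, the median reference point and the bandwidth `h` are assumptions for illustration, not the thesis design (which optimises the fuser with a Genetic Algorithm).

```python
import numpy as np

def nadaraya_watson_fuse(readings, h=0.5):
    """Fuse an array of sensor readings with a Nadaraya-Watson style kernel
    weighting: readings far from the bulk of the array receive exponentially
    small weights, so spikes and hardover faults are suppressed."""
    readings = np.asarray(readings, dtype=float)
    centre = np.median(readings)                              # robust reference point
    weights = np.exp(-0.5 * ((readings - centre) / h) ** 2)   # Gaussian kernel
    return float(np.sum(weights * readings) / np.sum(weights))

# five pH electrodes, one of which shows a hardover fault
print(nadaraya_watson_fuse([7.02, 6.98, 7.05, 7.01, 14.0]))   # close to 7.0
```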

    Linear Dimensionality Reduction for Margin-Based Classification: High-Dimensional Data and Sensor Networks

    Get PDF
    Low-dimensional statistics of measurements play an important role in detection problems, including those encountered in sensor networks. In this work, we focus on learning low-dimensional linear statistics of high-dimensional measurement data, along with decision rules defined in the low-dimensional space, in the case when the probability density of the measurements and class labels is not given but a training set of samples from this distribution is. We pose a joint optimization problem for linear dimensionality reduction and margin-based classification, and develop a coordinate descent algorithm on the Stiefel manifold for its solution. Although coordinate descent is not guaranteed to find the globally optimal solution, crucially, its alternating structure enables us to extend it to sensor networks with a message-passing approach requiring little communication. Linear dimensionality reduction prevents overfitting when learning from finite training data. In the sensor network setting, dimensionality reduction not only prevents overfitting, but also reduces power consumption due to communication. The learned reduced-dimensional space and decision rule are shown to be consistent, and their Rademacher complexity is characterized. Experimental results are presented for a variety of datasets, including those from existing sensor networks, demonstrating the potential of our methodology in comparison with other dimensionality reduction approaches.
    National Science Foundation (U.S.) Graduate Research Fellowship Program; United States Army Research Office (MURI funded through ARO Grant W911NF-06-1-0076); United States Air Force Office of Scientific Research (Award FA9550-06-1-0324); Shell International Exploration and Production B.V.
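
    A heavily simplified sketch of the joint idea follows: alternate between a gradient step on a hinge-loss (margin-based) decision rule in the projected space and a gradient step on the projection matrix, re-orthonormalising the projection after each update as a crude stand-in for the Stiefel-manifold coordinate descent described in the abstract. All names, step sizes and the toy data are illustrative assumptions.

```python
import numpy as np

def orthonormalize(A):
    """Re-orthonormalise the columns of A (QR-based retraction onto the Stiefel manifold)."""
    Q, _ = np.linalg.qr(A)
    return Q

def learn_projection_and_classifier(X, y, k=2, iters=200, lr=0.05, lam=1e-2):
    """Jointly learn a d x k projection A (orthonormal columns) and a linear
    decision rule (w, b) in the k-dimensional space by alternating gradient
    steps on a regularised hinge loss."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    A = orthonormalize(rng.standard_normal((d, k)))
    w, b = np.zeros(k), 0.0
    for _ in range(iters):
        Z = X @ A                                   # project to low dimension
        active = y * (Z @ w + b) < 1                # margin violators
        # step on the classifier
        gw = lam * w - (y[active, None] * Z[active]).sum(axis=0) / n
        gb = -y[active].sum() / n
        w, b = w - lr * gw, b - lr * gb
        # step on the projection, then retract back onto the Stiefel manifold
        gA = -np.outer(X[active].T @ y[active], w) / n
        A = orthonormalize(A - lr * gA)
    return A, w, b

# toy usage: 2 informative dimensions hidden in 20-dimensional data
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 20))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(400))
A, w, b = learn_projection_and_classifier(X, y)
print("training accuracy:", np.mean(np.sign(X @ A @ w + b) == y))
```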

    ARTMAP Neural Networks for Information Fusion and Data Mining: Map Production and Target Recognition Methodologies

    Full text link
    The Sensor Exploitation Group of MIT Lincoln Laboratory incorporated an early version of the ARTMAP neural network as the recognition engine of a hierarchical system for fusion and data mining of registered geospatial images. The Lincoln Lab system has been successfully fielded, but is limited to target / non-target identifications and does not produce whole maps. Procedures defined here extend these capabilities by means of a mapping method that learns to identify and distribute arbitrarily many target classes. This new spatial data mining system is designed particularly to cope with the highly skewed class distributions of typical mapping problems. Specification of canonical algorithms and a benchmark testbed has enabled the evaluation of candidate recognition networks as well as pre- and post-processing and feature selection options. The resulting mapping methodology sets a standard for a variety of spatial data mining tasks. In particular, training pixels are drawn from a region that is spatially distinct from the mapped region, which could feature an output class mix that is substantially different from that of the training set. The system recognition component, default ARTMAP, with its fully specified set of canonical parameter values, has become the a priori system of choice among this family of neural networks for a wide variety of applications.
    Air Force Office of Scientific Research (F49620-01-1-0397, F49620-01-1-0423); Office of Naval Research (N00014-01-1-0624)
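
    As a small, purely illustrative sketch of the sampling regime the abstract describes (not the Lincoln Lab pipeline), the code below draws training pixels from an image region that is spatially distinct from the region to be mapped and rebalances the per-class counts so that rare map classes are not swamped by a dominant background class; all function names and sizes are assumptions.

```python
import numpy as np

def spatially_disjoint_split(labels, train_cols):
    """Boolean masks selecting training pixels from a column range that is
    spatially distinct from the region to be mapped."""
    h, w = labels.shape
    cols = np.arange(w)
    train_mask = np.zeros((h, w), dtype=bool)
    train_mask[:, (cols >= train_cols[0]) & (cols < train_cols[1])] = True
    return train_mask, ~train_mask            # train region, map region

def balanced_sample(features, labels, mask, per_class=500, seed=0):
    """Draw up to `per_class` training pixels per class from the masked region."""
    rng = np.random.default_rng(seed)
    X, y = features[mask], labels[mask]       # (n, bands), (n,)
    idx = []
    for c in np.unique(y):
        members = np.flatnonzero(y == c)
        idx.append(rng.choice(members, size=min(per_class, members.size), replace=False))
    idx = np.concatenate(idx)
    return X[idx], y[idx]
```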

    Combination of Evidence in Dempster-Shafer Theory

    Full text link
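
    The entry above carries only the title; as a reminder of what "combination of evidence" means in this setting, here is a minimal sketch of Dempster's rule of combination for two basic probability assignments over a common frame of discernment (the frozenset encoding and the example masses are illustrative).

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping frozensets of
    hypotheses to masses): intersect focal elements, multiply masses, and
    renormalise by 1 - K, where K is the total conflict."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# two sources reporting evidence over the frame {a, b}
m1 = {frozenset('a'): 0.6, frozenset('ab'): 0.4}
m2 = {frozenset('b'): 0.3, frozenset('ab'): 0.7}
print(dempster_combine(m1, m2))
```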

    Multisource Data Integration in Remote Sensing

    Get PDF
    Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled, and the full text of these papers is included. New instruments and new sensors are discussed that can provide a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer) model of this world. Multiple sources may give complementary views of the world: consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are highly reliable and represent a valid source of information for any geographical information system.

    Remote Sensing and Data Fusion for Eucalyptus Trees Identification

    Get PDF
    Satellite remote sensing is supported by the extraction of data/information from satellite or aircraft multispectral images, which allows their remote analysis and classification. Analyzing those images with data fusion tools and techniques seems a suitable approach for the identification and classification of land cover. This land cover classification is possible because fusion/merging techniques can aggregate various sources of heterogeneous information to generate value-added products that facilitate feature classification and analysis. This work proposes to apply a data fusion algorithm, denoted FIF (Fuzzy Information Fusion), which combines computational intelligence techniques with multicriteria concepts and techniques, to automatically distinguish Eucalyptus trees in satellite images. To assess the proposed approach, a Portuguese region that includes planted Eucalyptus is used. This region is chosen because it includes a significant number of eucalyptus trees and, currently, it is hard to distinguish them automatically from other types of trees in satellite images, which makes this study an interesting experiment in using data fusion techniques to differentiate tree types. Further, the proposed approach is tested and validated with several fusion/aggregation operators to verify its versatility. Overall, the results of the study demonstrate the potential of this approach for the automatic classification of land-cover types (e.g. water, trees, roads).
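
    The abstract above mentions testing the approach with several fusion/aggregation operators; purely as an illustration of what such operators look like (not the FIF implementation), the sketch below applies a few standard operators, including an ordered weighted averaging (OWA) operator, to hypothetical per-criterion membership scores for one pixel.

```python
import numpy as np

def owa(scores, weights):
    """Ordered weighted averaging: sort scores in descending order and take
    the dot product with a weight vector that sums to one."""
    ordered = np.sort(np.asarray(scores, dtype=float))[::-1]
    return float(ordered @ np.asarray(weights, dtype=float))

# hypothetical membership degrees of one pixel to the "eucalyptus" class
# under three criteria (e.g. spectral, spatial and temporal evidence)
scores = [0.9, 0.6, 0.7]

print("min (pessimistic)    :", min(scores))
print("max (optimistic)     :", max(scores))
print("arithmetic mean      :", float(np.mean(scores)))
print("OWA, w=[0.5,0.3,0.2] :", owa(scores, [0.5, 0.3, 0.2]))
```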

    Hypothesis Testing Using Spatially Dependent Heavy-Tailed Multisensor Data

    Get PDF
    The detection of spatially dependent heavy-tailed signals is considered in this dissertation. While the central limit theorem, and its implication of asymptotic normality of interacting random processes, is generally useful for the theoretical characterization of a wide variety of natural and man-made signals, sensor data from many different applications are, in fact, characterized by non-Gaussian distributions. A common characteristic observed in non-Gaussian data is the presence of heavy tails or fat tails. For such data, the probability density function (p.d.f.) of extreme values decays at a slower-than-exponential rate, implying that extreme events occur with greater probability. When these events are observed simultaneously by several sensors, their observations are also spatially dependent. In this dissertation, we develop the theory of detection for such data, obtained through heterogeneous sensors. In order to validate our theoretical results and proposed algorithms, we collect and analyze indoor footstep data using a linear array of seismic sensors. We characterize the inter-sensor dependence using copula theory. Copulas are parametric functions which bind univariate p.d.f.s to generate a valid joint p.d.f. We model the heavy-tailed data using the class of alpha-stable distributions. We consider a two-sided test in the Neyman-Pearson framework and present an asymptotic analysis of the generalized likelihood ratio test (GLRT). Both nested and non-nested models are considered in the analysis. We also use a likelihood-maximization-based copula selection scheme as an integral part of the detection process. Since many types of copula functions are available in the literature, selecting the appropriate copula becomes an important component of the detection problem. The performance of the proposed scheme is evaluated numerically on simulated data, as well as on indoor seismic data. With appropriately selected models, our results demonstrate that a high probability of detection can be achieved for false alarm probabilities of the order of 10^-4. These results, using dependent alpha-stable signals, are presented for a two-sensor case. We identify the computational challenges associated with dependent alpha-stable modeling and propose alternative schemes to extend the detector design to a multisensor (multivariate) setting. We use a hierarchical tree-based approach, called vines, to model the multivariate copulas, i.e., to model the spatial dependence between multiple sensors. The performance of the proposed detectors under the vine-based scheme is evaluated on the indoor footstep data, and significant improvement is observed compared with the case when only two sensors are deployed. Some open research issues are identified and discussed.
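
    As a rough sketch of the copula idea (with a Gaussian copula and Student-t marginals standing in for the copula families and alpha-stable marginals used in the dissertation, since alpha-stable densities lack a closed form), the code below computes a log-likelihood-ratio statistic comparing a dependent model of two sensors against an independent one; the marginal terms cancel, leaving only the copula log-density.

```python
import numpy as np
from scipy.stats import norm, t

def gaussian_copula_logpdf(u, v, rho):
    """Log-density of the bivariate Gaussian copula at uniform marginals (u, v)."""
    z1, z2 = norm.ppf(u), norm.ppf(v)
    r2 = rho * rho
    return (-0.5 * np.log(1.0 - r2)
            - (r2 * (z1**2 + z2**2) - 2.0 * rho * z1 * z2) / (2.0 * (1.0 - r2)))

def copula_llr(x1, x2, df=3.0, rho=0.7):
    """Log-likelihood ratio of a 'dependent' model (Gaussian copula) versus an
    'independent' model with identical heavy-tailed Student-t marginals."""
    u, v = t.cdf(x1, df), t.cdf(x2, df)
    return float(np.sum(gaussian_copula_logpdf(u, v, rho)))

# two co-located sensors observing correlated heavy-tailed data (simulated)
rng = np.random.default_rng(1)
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=500)
x1, x2 = t.ppf(norm.cdf(z[:, 0]), 3.0), t.ppf(norm.cdf(z[:, 1]), 3.0)
print("LLR (dependence vs independence):", copula_llr(x1, x2))
```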