
    A MTF-based distance for the assessment of geometrical quality of fused products

    This paper deals with the assessment of the quality of products resulting from the application of fusion methods, which exploit the synergy between multimodal images at a low spatial resolution and images at a higher spatial resolution but with a lower spectral content. It concentrates on the assessment of geometrical quality through the analysis of the quality of selected contours. The first objective of the paper is to present a method for the estimation of the modulation transfer function (MTF). This method applies to excerpts of images containing long, well-contrasted linear features. Its performance is assessed by applying it to several Ikonos images and comparing the results to published works. The second objective is to demonstrate that the method can be exploited to assess the geometrical quality of the fused product.
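
    Estimating an MTF from long, well-contrasted linear features is closely related to the classical edge-based chain ESF → LSF → Fourier transform. The sketch below illustrates that generic chain only, under the assumption of a blurred step sampled across the edge; it is not the estimator proposed in the paper, and the synthetic profile and windowing choices are purely illustrative.

```python
import numpy as np

def mtf_from_edge_profile(edge_profile, sample_spacing=1.0):
    """Estimate a 1-D MTF from an edge spread function (ESF).

    Generic sketch of the chain ESF -> LSF -> |FFT|,
    not the estimator described in the paper.
    """
    esf = np.asarray(edge_profile, dtype=float)

    # Line spread function: derivative of the edge spread function.
    lsf = np.gradient(esf, sample_spacing)

    # Window to limit truncation artefacts before the Fourier transform.
    lsf *= np.hanning(lsf.size)

    # MTF: modulus of the Fourier transform of the LSF, normalised at f = 0.
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]

    freqs = np.fft.rfftfreq(lsf.size, d=sample_spacing)  # cycles per pixel
    return freqs, mtf

# Example with a hypothetical smoothed step; the value at Nyquist
# (0.5 cycles/pixel) is the figure usually quoted for satellite sensors.
x = np.arange(-32, 32)
edge = 0.5 * (1.0 + np.tanh(x / 1.5))
freqs, mtf = mtf_from_edge_profile(edge)
print("MTF at Nyquist ~", round(float(np.interp(0.5, freqs, mtf)), 3))
```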

    A method to better account for modulation transfer functions in ARSIS-based pansharpening methods

    Multispectral (MS) images provided by Earth observation satellites generally have a poor spatial resolution, while panchromatic (PAN) images exhibit a spatial resolution two or four times better. Data fusion is a means to synthesize MS images at a higher spatial resolution than the original by exploiting the high spatial resolution of the PAN; this process is often called pansharpening. The synthesis property states that the synthesized MS images should be as close as possible to those that would have been acquired by the corresponding sensors if they had this high resolution. The methods based on the concept Amélioration de la Résolution Spatiale par Injection de Structures (ARSIS) are able to deliver synthesized images with good spectral quality, but their geometrical quality can still be improved. We propose a more precise definition of the synthesis property in terms of geometry. Then, we present a method that explicitly takes into account the difference in modulation transfer function (MTF) between PAN and MS in the fusion process. This method is applied to an existing ARSIS-based fusion method, namely the à trous wavelet transform with Model 3. Simulated images of the Pléiades and SPOT-5 sensors are used to illustrate the performance of the approach. Although this paper is limited in methods and data, we observe a better restitution of the geometry and an improvement in all indices classically used in pansharpening quality budgets. We also present a means to assess the respect of the synthesis property from an MTF point of view.
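
    The general idea of accounting for the MTF in an MRA-type fusion scheme can be sketched as follows: the PAN image is low-pass filtered with a filter whose gain at the MS Nyquist frequency matches an assumed sensor MTF value, and the removed details are injected into the upsampled MS band. This is a minimal, generic sketch assuming a Gaussian MTF model and a purely additive injection; it is not the ATWT-M3 method of the paper, and `mtf_at_nyquist` and `gain` are placeholder values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def mtf_matched_details(pan, ratio=4, mtf_at_nyquist=0.3):
    """Gaussian low-pass whose gain at the MS Nyquist frequency equals an
    assumed sensor MTF value; the removed high frequencies are the details
    to inject. `mtf_at_nyquist` is a placeholder, not a published figure
    for any particular instrument."""
    pan = np.asarray(pan, dtype=float)
    # Std (in PAN pixels) of a Gaussian whose transfer function equals
    # mtf_at_nyquist at f = 1 / (2 * ratio) cycles per PAN pixel.
    sigma = ratio * np.sqrt(-2.0 * np.log(mtf_at_nyquist)) / np.pi
    pan_low = gaussian_filter(pan, sigma)
    return pan - pan_low

def naive_injection(ms_band, pan, ratio=4, mtf_at_nyquist=0.3, gain=1.0):
    """Minimal additive injection: upsample the MS band and add weighted PAN
    details. Real ARSIS implementations use more elaborate inter-band models;
    `gain` stands in for such a model here."""
    ms_up = zoom(np.asarray(ms_band, dtype=float), ratio, order=3)
    details = mtf_matched_details(pan, ratio, mtf_at_nyquist)
    return ms_up + gain * details
```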

    Comparison of Pansharpening Algorithms: Outcome of the 2006 GRS-S Data Fusion Contest

    In January 2006, the Data Fusion Committee of the IEEE Geoscience and Remote Sensing Society launched a public contest for pansharpening algorithms, which aimed to identify those that perform best. Seven research groups worldwide participated in the contest, testing eight algorithms following different philosophies [component substitution, multiresolution analysis (MRA), detail injection, etc.]. Several complete data sets from two different sensors, namely QuickBird and simulated Pléiades, were delivered to all participants. The fusion results were collected and evaluated, both visually and objectively. Quantitative evaluation of pansharpening was possible owing to the availability of reference originals, obtained either by simulating the data collected by the satellite sensor by means of higher-resolution data from an airborne platform (in the case of the Pléiades data) or by first degrading all the available data to a coarser resolution and keeping the original as the reference (in the case of the QuickBird data). The evaluation results were presented during the special session on Data Fusion at the 2006 International Geoscience and Remote Sensing Symposium in Denver and are discussed in further detail in this paper. Two algorithms outperform all the others, the visual analysis being confirmed by the quantitative evaluation. These two methods share the same philosophy: they basically rely on MRA and employ adaptive models for the injection of high-pass details.
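
    The reduced-resolution evaluation described above (degrade, fuse, then score against the original, which becomes the reference) lends itself to a compact sketch. The ERGAS index below is one standard choice in this literature; the `degrade` and `fuse` callables are assumptions standing in for any particular degradation filter and fusion algorithm.

```python
import numpy as np

def ergas(reference, fused, ratio):
    """ERGAS quality index (Wald): 0 for a perfect synthesis, lower is better.

    reference, fused : arrays of shape (bands, rows, cols)
    ratio            : MS-to-PAN resolution ratio (e.g. 4)
    """
    ref = np.asarray(reference, dtype=float)
    fus = np.asarray(fused, dtype=float)
    rmse = np.sqrt(((ref - fus) ** 2).mean(axis=(1, 2)))
    means = ref.mean(axis=(1, 2))
    return 100.0 / ratio * np.sqrt(np.mean((rmse / means) ** 2))

def reduced_resolution_check(ms, pan, ratio, degrade, fuse):
    """Degrade MS and PAN by `ratio`, fuse the degraded pair, and score the
    result against the original MS, which now acts as the reference.
    `degrade(image, ratio)` and `fuse(ms, pan)` are user-supplied callables."""
    fused = fuse(degrade(ms, ratio), degrade(pan, ratio))
    return ergas(ms, fused, ratio)
```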

    COMPARATIVE ASSESSMENT OF VERY HIGH RESOLUTION SATELLITE AND AERIAL ORTHOIMAGERY


    Enhancement of High-Resolution 3D Inkjet-Printing of Optical Freeform Surfaces Using Digital Twins

    3D inkjet printing is just beginning to take off in the optical field. Advantages of this technique include fast and cost-efficient fabrication without tooling costs. However, there are still obstacles preventing 3D inkjet printing from broad usage in optics, e.g., insufficient form fidelity. In this article, we present the formulation of a digital twin, built by enhancing an optical model with geometrical measurement data. This approach strengthens the high-precision 3D printing process so that it can fulfil optical precision requirements. A process flow between the design of freeform components, fabrication by inkjet printing, the geometrical measurement of the fabricated optical surface, and the feedback of the measurement data into the simulation model was developed, and its interfaces were defined. The evaluation of the measurements allowed for the adaptation of the printing process to compensate for process errors and tolerances. Furthermore, the performance of the manufactured component was simulated and compared with the nominal performance, and the enhanced model could be used for sensitivity analysis. The method was applied to a highly complex helical surface that allows for the adjustment of the optical power by rotation. We show that sensitivity analysis could be used to define acceptable tolerance budgets for the process.
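
    The measurement feedback described in the abstract can be reduced, for illustration, to a simple error pre-compensation step: the deviation between the measured and nominal surfaces, scaled by a gain, is subtracted from the next print command. The sketch below assumes a purely additive height-map correction and a hypothetical systematic process error; it is not the authors' digital-twin model.

```python
import numpy as np

def precompensate(command_height, measured_height, target_height, gain=0.7):
    """One iteration of an error-feedback pre-compensation loop: the next
    print command is corrected by a fraction of the measured deviation from
    the nominal freeform surface. The additive model and `gain` are
    assumptions, not the authors' process model."""
    error = measured_height - target_height        # form deviation from design
    return command_height - gain * error           # corrected height map to print

# Toy usage: a hypothetical printer that systematically over-deposits by 5 %.
target = np.random.default_rng(0).random((64, 64))    # stand-in freeform height map
command = target.copy()
for _ in range(3):
    printed = 1.05 * command                          # fake systematic process error
    command = precompensate(command, printed, target)
print("residual RMS form error:",
      float(np.sqrt(((1.05 * command - target) ** 2).mean())))
```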

    Multisource and Multitemporal Data Fusion in Remote Sensing

    The sharp and recent increase in the availability of data captured by different sensors, combined with their considerably heterogeneous natures, poses a serious challenge for the effective and efficient processing of remotely sensed data. Such an increase in remote sensing and ancillary datasets, however, opens up the possibility of utilizing multimodal datasets jointly to further improve the performance of the processing approaches with respect to the application at hand. Multisource data fusion has therefore received enormous attention from researchers worldwide for a wide variety of applications. Moreover, thanks to the revisit capability of several spaceborne sensors, the temporal information can be integrated with the spatial and/or spectral/backscattering information of the remotely sensed data, moving from a representation of 2D/3D data to 4D data structures, where the time variable adds new information as well as new challenges for information extraction algorithms. There is a huge number of research works dedicated to multisource and multitemporal data fusion, but the methods for the fusion of different modalities have evolved along different paths in each research community. This paper brings together the advances of multisource and multitemporal data fusion approaches across research communities and provides a thorough, discipline-specific starting point for researchers at different levels (i.e., students, researchers, and senior researchers) willing to conduct novel investigations on this challenging topic, by supplying sufficient detail and references.
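
    The 2D/3D-to-4D shift mentioned above amounts, in practice, to stacking the spectral and temporal dimensions into a single array; the dimension ordering and sizes below are arbitrary illustrative choices.

```python
import numpy as np

# A multitemporal, multispectral scene as a single 4D array, illustrating the
# move from 2D/3D representations to 4D structures described in the abstract.
n_dates, n_bands, rows, cols = 12, 6, 512, 512
cube = np.zeros((n_dates, n_bands, rows, cols), dtype=np.float32)

temporal_profile = cube[:, 3, 100, 200]   # one band of one pixel through time
single_date = cube[5]                     # full multispectral image at one date
```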

    STANDARDIZING QUALITY ASSESSMENT OF FUSED REMOTELY SENSED IMAGES


    Conceptual design and specification of a microsatellite forest fire detection system

    The burning of our forests and other forms of biomass is increasingly harming the local, regional and global environment. As evidenced by studies of the earth's atmosphere, biomass burning is a significant global source of greenhouse gases and particulate matter that impact the chemistry of the troposphere and stratosphere. Current remote sensing methods used for monitoring forest fires and other forms of biomass burning rely on sensors primarily designed to measure temperatures near 300 K, the average temperature of the earth's surface. Fires radiate intensely against this low-temperature background, so it is possible to detect fires occupying only a fraction of a pixel. However, sensors used in present remote sensing satellites saturate at temperatures well below the peak temperatures of fires, or have revisit times unsuitable for monitoring the diurnal activity of fires. The purpose of this study is to review past and present space-based sensors used to monitor fire on a global scale and to propose a design intended specifically for fire detection and geo-location. Early detection of forest fires can save lives, prevent losses of property and help reduce the impact on our environment.
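
    The claim that sub-pixel fires are detectable against a 300 K background follows directly from Planck's law: in the mid-infrared a hot source is orders of magnitude brighter than the ambient scene. The sketch below uses illustrative values (4 µm, an 800 K flame, a 1 % pixel fraction) that are not tied to any specific sensor or mission.

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23    # Planck, speed of light, Boltzmann (SI)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1 (Planck's law)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K * temp_k)
    return a / np.expm1(b)

# At 4 um, an 800 K flame is roughly three orders of magnitude brighter than a
# 300 K background, so a fire covering only 1 % of a pixel still multiplies the
# pixel-averaged radiance about 20-fold. (Illustrative values, not mission specs.)
lam = 4.0e-6
b_bg, b_fire = planck_radiance(lam, 300.0), planck_radiance(lam, 800.0)
fraction = 0.01
pixel = fraction * b_fire + (1.0 - fraction) * b_bg
print("pixel radiance relative to background: %.1fx" % (pixel / b_bg))
```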