
    Multisource and Multitemporal Data Fusion in Remote Sensing

    The sharp and recent increase in the availability of data captured by different sensors, combined with their considerably heterogeneous natures, poses a serious challenge for the effective and efficient processing of remotely sensed data. Such an increase in remote sensing and ancillary datasets, however, opens up the possibility of utilizing multimodal datasets jointly to further improve the performance of the processing approaches for the application at hand. Multisource data fusion has therefore received enormous attention from researchers worldwide for a wide variety of applications. Moreover, thanks to the revisit capability of several spaceborne sensors, the temporal information can be integrated with the spatial and/or spectral/backscattering information of the remotely sensed data, moving from a representation of 2D/3D data to 4D data structures, in which the time variable adds new information as well as new challenges for information extraction algorithms. A huge body of research is dedicated to multisource and multitemporal data fusion, but the fusion methods for different modalities have evolved along different paths within each research community. This paper brings together the advances in multisource and multitemporal data fusion approaches across these research communities and provides a thorough, discipline-specific starting point, with sufficient detail and references, for researchers at all levels (i.e., students, researchers, and senior researchers) who wish to conduct novel investigations on this challenging topic.

    Generating a series of fine spatial and temporal resolution land cover maps by fusing coarse spatial resolution remotely sensed images and fine spatial resolution land cover maps

    Studies of land cover dynamics would benefit greatly from the generation of land cover maps at both fine spatial and temporal resolutions. Fine spatial resolution images are usually acquired relatively infrequently, whereas coarse spatial resolution images may be acquired with a high repetition rate but may not capture the spatial detail of the land cover mosaic of the region of interest. Traditional image spatial–temporal fusion methods focus on the blending of pixel spectral reflectance values and do not directly provide land cover maps or information on land cover dynamics. In this research, a novel Spatial–Temporal remotely sensed Images and land cover Maps Fusion Model (STIMFM) is proposed to produce land cover maps at both fine spatial and temporal resolutions using a series of coarse spatial resolution images together with a few fine spatial resolution land cover maps that pre- and post-date the series of coarse spatial resolution images. STIMFM integrates both the spatial and temporal dependencies of fine spatial resolution pixels and outputs a series of fine spatial–temporal resolution land cover maps instead of reflectance images, which can be used directly for studies of land cover dynamics. Here, three experiments based on simulated and real remotely sensed images were undertaken to evaluate STIMFM for studies of land cover change.
These experiments included a comparative assessment against methods based on a single-date image, such as super-resolution approaches (e.g., pixel-swapping-based super-resolution mapping), and against state-of-the-art spatial–temporal fusion approaches that use the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM) and the Flexible Spatiotemporal DAta Fusion model (FSDAF) to predict fine-resolution images, to which a maximum likelihood classifier and an automated land cover updating approach based on integrated change detection and classification were then applied to generate fine-resolution land cover maps. Results show that the methods based on a single-date image failed to predict pixels of changed and unchanged land cover with high accuracy. The land cover maps obtained by classifying the reflectance images output by ESTARFM and FSDAF contained substantial misclassification, and classification accuracy was lower for pixels of changed land cover than for pixels of unchanged land cover. In addition, STIMFM predicted fine spatial–temporal resolution land cover maps from a series of Landsat images and a few Google Earth images, a case to which ESTARFM and FSDAF, which require correlated reflectance bands in the coarse and fine images, cannot be applied. Notably, STIMFM achieved higher accuracy for pixels of both changed and unchanged land cover than the other methods.
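The reflectance-blending idea that ESTARFM and FSDAF build on (and that STIMFM deliberately departs from by outputting land cover maps rather than reflectance images) can be illustrated with a toy sketch. The array sizes and the pixel-replication upsampling below are illustrative assumptions, not the published algorithms:

```python
import numpy as np

def blend_prediction(fine_t1, coarse_t1, coarse_t2):
    """Toy reflectance blending: predict a fine-resolution image at time t2
    by adding the coarse-resolution temporal change to the fine image at t1.
    This is only the basic idea behind STARFM-style fusion models, not the
    actual ESTARFM/FSDAF algorithms and not STIMFM itself."""
    # Upsample the coarse-resolution change to fine resolution by simple
    # pixel replication (real methods use weighted neighbourhoods).
    scale = fine_t1.shape[0] // coarse_t1.shape[0]
    change = np.kron(coarse_t2 - coarse_t1, np.ones((scale, scale)))
    return fine_t1 + change

# Hypothetical example: a 4x4 fine image and 2x2 coarse images.
fine_t1 = np.full((4, 4), 0.20)
coarse_t1 = np.full((2, 2), 0.20)
coarse_t2 = np.array([[0.20, 0.30],
                      [0.20, 0.20]])  # one coarse pixel brightens over time
fine_t2 = blend_prediction(fine_t1, coarse_t1, coarse_t2)
# The predicted change lands only in the fine pixels under the changed
# coarse pixel; classification of such images is a separate second step,
# which is exactly the indirection STIMFM avoids.
```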

    The application of ocean front metrics for understanding habitat selection by marine predators

    Marine predators such as seabirds, cetaceans, turtles, pinnipeds, sharks and large teleost fish are essential components of healthy, biologically diverse marine ecosystems. However, intense anthropogenic pressure on the global ocean is causing rapid and widespread change, and many predator populations are in decline. Conservation solutions are urgently required, yet only recently have we begun to comprehend how these animals interact with the vast and dynamic oceans that they inhabit. A better understanding of the mechanisms that underlie habitat selection at sea is critical to our knowledge of marine ecosystem functioning, and to ecologically-sensitive marine spatial planning. The collection of studies presented in this thesis aims to elucidate the influence of biophysical coupling at oceanographic fronts – physical interfaces at the transitions between water masses – on habitat selection by marine predators. High-resolution composite front mapping via Earth Observation remote sensing is used to provide oceanographic context to several biologging datasets describing the movements and behaviours of animals at sea. A series of species-habitat models reveal the influence of mesoscale (10s to 100s of kilometres) thermal and chlorophyll-a fronts on habitat selection by taxonomically diverse species inhabiting contrasting ocean regions; northern gannets (Morus bassanus; Celtic Sea), basking sharks (Cetorhinus maximus; north-east Atlantic), loggerhead turtles (Caretta caretta; Canary Current), and grey-headed albatrosses (Thalassarche chrysostoma; Southern Ocean). Original aspects of this work include an exploration of quantitative approaches to understanding habitat selection using remotely-sensed front metrics; and explicit investigation of how the biophysical properties of fronts and species-specific foraging ecology interact to influence associations. 
The main findings indicate that front metrics, particularly seasonal indices, are useful predictors of habitat preference across taxa. Moreover, frontal persistence and spatiotemporal predictability appear to mediate the use of front-associated foraging habitats, both in shelf seas and in the open ocean. These findings have implications for marine spatial planning and the design of protected area networks, and may prove useful in the development of tools supporting spatially dynamic ocean management.
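As a rough illustration of the species-habitat modelling described above, the following sketch fits a logistic presence model to synthetic front metrics. The data, the predictor names, and the plain gradient-ascent fit are all hypothetical stand-ins for the thesis's actual species-habitat models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardised predictors for each at-sea location:
# column 0 = thermal front gradient strength, column 1 = a seasonal
# front-persistence index (both invented for illustration).
n = 500
X = rng.normal(size=(n, 2))

# Simulated "truth": foraging presence is more likely at persistent fronts.
logit = 0.5 + 1.5 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Fit a logistic species-habitat model by gradient ascent on the mean
# log-likelihood -- a minimal stand-in for the GLM/GAM-style models
# typically used in the habitat-selection literature.
Xb = np.column_stack([np.ones(n), X])
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w += 0.1 * Xb.T @ (y - p) / n

# w[2] estimates the persistence effect and should come out clearly
# positive, mirroring the finding that frontal persistence mediates
# the use of front-associated foraging habitat.
```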

    Scalable Crop Yield Prediction with Sentinel-2 Time Series and Temporal Convolutional Network

    One of the precepts of food security is the proper functioning of the global food markets. This calls for open and timely intelligence on crop production at an agroclimatically meaningful territorial scale. We propose an operationally suitable method for large-scale in-season crop yield estimation from a satellite image time series (SITS) for statistical production. As an object-based method, it is spatially scalable from the parcel to the regional scale, making it useful for prediction tasks in which the reference data are available only at a coarser level, such as counties. We show that a deep learning-based temporal convolutional network (TCN) outperforms the classical machine learning method random forest and produces more accurate results overall than published national crop forecasts. Our novel contribution is to show that mean-aggregated regional predictions with histogram-based features calculated from farm-level observations perform better than the other tested approaches. In addition, the TCN is robust to the presence of cloudy pixels, suggesting that it can learn cloud masking from the data. Temporal compositing of the information does not improve prediction performance, indicating that with end-to-end learning, less preprocessing in SITS tasks is viable.
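The key ingredient of a TCN, a causal dilated convolution over the time series, can be sketched minimally. The kernel values and the NDVI-like series below are made-up illustrations, not the paper's architecture or data:

```python
import numpy as np

def causal_dilated_conv(x, w, dilation):
    """Causal dilated 1-D convolution, the building block of a temporal
    convolutional network (TCN): the output at time t depends only on
    inputs at t, t-d, t-2d, ..., so future observations never leak into
    an in-season prediction."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # left-padding enforces causality
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

# A toy NDVI-like series over one growing season (hypothetical values).
x = np.array([0.2, 0.3, 0.5, 0.7, 0.8, 0.7, 0.5, 0.3])
w = np.array([0.5, 0.5])                     # 2-tap averaging kernel

y1 = causal_dilated_conv(x, w, dilation=1)   # looks back 1 time step
y2 = causal_dilated_conv(x, w, dilation=4)   # looks back 4 time steps
# Stacking layers with growing dilation is how a real TCN covers a whole
# season's SITS with few layers while preserving causality.
```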