
    Crop growth and yield monitoring in smallholder agricultural systems: a multi-sensor data fusion approach

    Smallholder agricultural systems are highly vulnerable to production risks posed by the intensification of extreme weather events such as drought and flooding, soil degradation, pests, lack of access to agricultural inputs, and political instability. Monitoring the spatial and temporal variability of crop growth and yield is crucial for farm management, national-level food security assessments, and famine early warning. However, agricultural monitoring is difficult in fragmented agricultural landscapes because of the scarcity and uncertainty of data capturing small crop fields. Traditional pre- and post-harvest crop monitoring and yield estimation based on fieldwork is costly, slow, and can be unrepresentative of heterogeneous agricultural landscapes such as the smallholder systems of sub-Saharan Africa. Devising accurate and timely crop phenology detection and yield estimation methods can improve our understanding of the status of crop production and food security in these regions. Satellite-based Earth observation (EO) data play a key role in monitoring the spatial and temporal variability of crop growth and yield over large areas. The small field sizes and variability in management practices in fragmented landscapes require high spatial and high temporal resolution EO data. This thesis develops and demonstrates methods to investigate the spatiotemporal variability of crop phenology detection and yield estimation using Landsat and MODIS data fusion in smallholder agricultural systems in the Lake Tana sub-basin of Ethiopia. The overall aim is to further broaden the application of multi-sensor EO data for crop growth monitoring in smallholder agricultural systems. The thesis addresses two important aspects of crop monitoring applications of EO data: phenology detection and yield estimation. 
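    The phenology-detection side of the problem can be illustrated with the widely used dynamic-threshold idea: the start and end of season are taken as the days when a vegetation-index curve crosses a fixed fraction of its seasonal amplitude. This is a generic sketch on synthetic NDVI values, not the thesis's calibrated implementation; the threshold of 0.5 and the Gaussian-shaped curve are illustrative assumptions.

    ```python
    import numpy as np

    def detect_phenology(ndvi, doy, threshold=0.5):
        """Dynamic-threshold phenology detection: start of season (SOS) is the
        first day, and end of season (EOS) the last day, on which NDVI is at or
        above min + threshold * (max - min) of the seasonal amplitude."""
        level = ndvi.min() + threshold * (ndvi.max() - ndvi.min())
        idx = np.where(ndvi >= level)[0]
        return doy[idx[0]], doy[idx[-1]]

    # Synthetic single-season NDVI curve at 8-day intervals (hypothetical values)
    doy = np.arange(1, 361, 8)
    ndvi = 0.2 + 0.5 * np.exp(-((doy - 200) / 50.0) ** 2)
    sos, eos = detect_phenology(ndvi, doy, threshold=0.5)
    ```

    In practice the threshold fraction is calibrated per crop against observed sowing and harvest dates, which is where the ground information mentioned above comes in.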
First, the ESTARFM data fusion workflow was modified based on local knowledge of crop calendars and land cover to improve crop phenology monitoring in fragmented agricultural landscapes. The approach minimized data fusion uncertainties in predicting temporal reflectance change of crops during the growing season and the reflectance value of fused data was comparable to the original Landsat image reserved for validation. The main sources of uncertainty in data fusion are the small field size and abrupt crop growth changes between the base and prediction dates due to flooding, weeding, fertiliser application, and harvesting. The improved data fusion approach allowed us to determine crop phenology and estimate LAI more accurately than both the standard ESTARFM data fusion method and when using MODIS data without fusion. We also calibrated and validated a dynamic threshold phenology detection method using maize and rice crop sowing and harvest date information. Crop-specific phenology determined from data fusion minimized the mismatch between EO-derived phenometrics and the actual crop calendar. The study concluded that accurate phenology detection and LAI estimation from Landsat–MODIS data fusion demonstrates the feasibility of crop growth monitoring using multi-sensor data fusion in fragmented and persistently cloudy agricultural landscapes. Subsequently, the validated data fusion and phenology detection methods were implemented to understand crop phenology trends from 2000 to 2020. These trends are often less understood in smallholder agricultural systems due to the lack of high spatial resolution data to distinguish crops from the surrounding natural vegetation. Trends based on Landsat–MODIS fusion were compared with those detected using MODIS alone to assess the contribution of data fusion to discern crop phenometric change. Landsat and MODIS fusion discerned crop and environment-specific trends in the magnitude and direction of crop phenology change. 
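The core temporal-change assumption shared by STARFM-family fusion methods (of which ESTARFM is a refinement) can be sketched in a few lines: the reflectance change observed by the coarse sensor between the base and prediction dates is transferred to the fine-resolution image. This deliberately omits ESTARFM's spectral/spatial weighting of similar neighbouring pixels and its use of two base-date pairs; the patch values are hypothetical.

```python
import numpy as np

def fuse_prediction(fine_base, coarse_base, coarse_pred):
    """Simplified STARFM-style prediction: add the coarse-sensor temporal
    change (prediction date minus base date) to the fine base image."""
    return fine_base + (coarse_pred - coarse_base)

# Hypothetical 3x3 red-band reflectance patches
fine_base = np.array([[0.10, 0.12, 0.11],
                      [0.13, 0.15, 0.14],
                      [0.12, 0.11, 0.10]])
coarse_base = np.full((3, 3), 0.12)
coarse_pred = np.full((3, 3), 0.18)   # coarse sensor sees a +0.06 change
fine_pred = fuse_prediction(fine_base, coarse_base, coarse_pred)
```

The uncertainties noted above (small fields, abrupt changes from flooding or harvesting between dates) are exactly the cases where this additive-change assumption breaks down, motivating the crop-calendar-informed modifications.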
The results underlined the importance of high spatial and temporal resolution EO data to capture environment-specific crop phenology change, which has implications for designing adaptation and crop management practices in these regions. The second important aspect of the crop monitoring problem addressed in this thesis is improving crop yield estimation in smallholder agricultural systems. The large input requirements of crop models and the lack of spatial information about the heterogeneous crop-growing environment and agronomic management practices are major challenges to the accurate estimation of crop yield. We assimilated leaf area index (LAI) and phenology information from Landsat–MODIS fusion in a crop model (simple algorithm for yield estimation: SAFY) to obtain reasonably reliable crop yield estimates. The SAFY model is sensitive to the spatial and temporal resolution of the calibration input LAI, the phenology information, and the effective light use efficiency (ELUE) parameter, which needs accurate field-level inputs during model optimization. Assimilating fused EO-based phenology information minimized model uncertainty and captured the large management and environmental variation in smallholder agricultural systems. In the final research chapter of the thesis, we analysed the contribution of assimilating LAI at different phenological stages. The frequency and timing of LAI observations influence the accuracy of LAI assimilation in crop growth simulation models. The use of (optical) EO data to estimate LAI is constrained by limited repeat frequency and cloud cover, which can reduce yield estimation accuracy. We evaluated the relative contribution of EO observations at different crop growth stages for accurate calibration of crop model parameters. 
We found that LAI between jointing and grain filling has the highest contribution to SAFY yield estimation and that the distribution of LAI observations across the key development stages was more useful than their sheer frequency for improving yield estimation. This information on the optimal timing of EO data assimilation is important for developing better in-season crop yield forecasting in smallholder systems.
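    The SAFY-style yield estimate described above boils down to a daily light-use-efficiency loop: dry matter accumulates in proportion to absorbed photosynthetically active radiation times an effective light use efficiency (ELUE). The sketch below shows that loop with purely illustrative parameter values (extinction coefficient, ELUE, climatic efficiency, harvest index) and a synthetic LAI curve; the real model adds phenology-driven leaf growth and senescence and is calibrated per field.

    ```python
    import numpy as np

    def safy_biomass(rad, lai, elue=2.5, climatic_eff=0.48, k=0.5):
        """Minimal dry-matter accumulation in the spirit of SAFY:
        daily biomass gain = ELUE * climatic efficiency * radiation * fAPAR,
        with fAPAR from Beer's law on LAI. Units: rad in MJ m-2 day-1,
        biomass in g m-2. All parameter values here are assumed."""
        fapar = 1.0 - np.exp(-k * lai)          # fraction of absorbed PAR
        return float(np.sum(elue * climatic_eff * rad * fapar))

    # Synthetic 120-day season: constant radiation, bell-shaped LAI (hypothetical)
    days = 120
    rad = np.full(days, 10.0)
    lai = 4.0 * np.sin(np.linspace(0.0, np.pi, days)) ** 2
    biomass = safy_biomass(rad, lai)
    grain_yield = 0.45 * biomass                # harvest index (assumed)
    ```

    Assimilating EO-derived LAI amounts to re-optimizing parameters such as ELUE until the simulated LAI trajectory matches the observed one, which is why observations around jointing to grain filling, when LAI is most informative, matter most.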

    Reconstruction of Daily 30 m Data from HJ CCD, GF-1 WFV, Landsat, and MODIS Data for Crop Monitoring

    With the recent launch of new satellites and the development of spatiotemporal data fusion methods, we are entering an era of high spatiotemporal resolution remote-sensing analysis. This study proposed a method to reconstruct daily 30 m remote-sensing data for monitoring crop types and phenology in two study areas located in Xinjiang Province, China. First, the Spatial and Temporal Data Fusion Approach (STDFA) was used to reconstruct time series high spatiotemporal resolution data from Huanjing satellite charge coupled device (HJ CCD), Gaofen satellite no. 1 wide field-of-view camera (GF-1 WFV), Landsat, and Moderate Resolution Imaging Spectroradiometer (MODIS) data. Then, the reconstructed time series were applied to extract crop phenology using a Hybrid Piecewise Logistic Model (HPLM). In addition, the onset dates of greenness increase (OGI) and greenness decrease (OGD) were calculated from the simulated phenology. Finally, crop types were mapped using the phenology information. The results show that the reconstructed high spatiotemporal data had high quality, with a proportion of good quality (PGQ) higher than 0.95, and that the HPLM approach can simulate time series Normalized Difference Vegetation Index (NDVI) very well, with R2 ranging from 0.635 to 0.952 in Luntai and from 0.719 to 0.991 in Bole. The reconstructed high spatiotemporal data were able to extract crop phenology in single crop fields, providing a much more detailed pattern than time series MODIS data. Moreover, crop types could be classified using the reconstructed time series high spatiotemporal data with overall accuracy equal to 0.91 in Luntai and 0.95 in Bole, which is 0.028 and 0.046 higher, respectively, than those obtained using multi-temporal Landsat NDVI data.
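    The HPLM fits logistic segments piecewise to the green-up and senescence halves of the NDVI curve. A minimal single-segment version of that idea, fitting one logistic to a synthetic green-up series with `scipy.optimize.curve_fit`, is sketched below; all data values are synthetic and the full HPLM additionally handles background NDVI and the senescence segment.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, vmin, vamp, k, t0):
        """One logistic segment: background vmin, amplitude vamp,
        growth rate k, inflection day t0."""
        return vmin + vamp / (1.0 + np.exp(-k * (t - t0)))

    # Synthetic green-up NDVI at 8-day steps with mild noise (hypothetical)
    rng = np.random.default_rng(0)
    t = np.arange(60.0, 200.0, 8.0)
    ndvi = logistic(t, 0.2, 0.5, 0.08, 130.0) + rng.normal(0.0, 0.01, t.size)

    popt, _ = curve_fit(logistic, t, ndvi, p0=[0.1, 0.4, 0.05, 120.0])
    t0_fit = popt[3]   # fitted inflection day
    ```

    In HPLM-based phenology products, OGI/OGD are typically located at curvature extrema of the fitted curve rather than at the inflection point, but the fitting step itself looks like the above.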

    Crop monitoring and yield estimation using polarimetric SAR and optical satellite data in southwestern Ontario

    Optical satellite data have proven an efficient source for extracting crop information and monitoring crop growth conditions over large areas. In local- to subfield-scale crop monitoring studies, both high spatial resolution and high temporal resolution of the image data are important. However, the acquisition of optical data is limited by persistent cloud contamination in cloudy areas. This thesis explores the potential of polarimetric Synthetic Aperture Radar (SAR) satellite data and a spatio-temporal data fusion approach for crop monitoring and yield estimation applications in southwestern Ontario. Firstly, the sensitivity of 16 parameters derived from C-band Radarsat-2 polarimetric SAR data to crop height and fractional vegetation cover (FVC) was investigated. The results show that SAR backscatter is affected by many factors unrelated to the crop canopy, such as the incidence angle and the soil background, and that the degree of sensitivity varies with crop type, growing stage, and the polarimetric SAR parameter. Secondly, the Minimum Noise Fraction (MNF) transformation was applied, for the first time, to multitemporal Radarsat-2 polarimetric SAR data for cropland area mapping with a random forest classifier. An overall classification accuracy of 95.89% was achieved using the MNF transformation of the multi-temporal coherency matrix acquired from July to November. Then, a spatio-temporal data fusion method was developed to generate Normalized Difference Vegetation Index (NDVI) time series with both high spatial and high temporal resolution in heterogeneous regions using Landsat and MODIS imagery. The proposed method outperforms two other widely used methods. Finally, an improved crop phenology detection method was proposed, and the phenology information was then forced into the Simple Algorithm for Yield Estimation (SAFY) model to estimate crop biomass and yield. 
Compared with the SAFY model without the remotely sensed phenology forcing and with a simple light use efficiency (LUE) model, the SAFY model incorporating remotely sensed phenology improved biomass estimation accuracy by about 4% in relative Root Mean Square Error (RRMSE). The studies in this thesis improve the ability to monitor crop growth status and production at the subfield scale.
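The classification step above pairs multi-temporal SAR features with a random forest. A minimal sketch of that pattern, using synthetic features that stand in for coherency-matrix elements over several acquisition dates (the two-class temporal signatures below are invented for illustration), looks like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic multi-temporal features standing in for polarimetric SAR
# parameters over six dates; two hypothetical crop classes whose mean
# temporal signatures (in dB) differ.
rng = np.random.default_rng(42)
n, n_dates = 400, 6
crop_a = rng.normal(np.linspace(-12.0, -6.0, n_dates), 1.0, (n, n_dates))
crop_b = rng.normal(np.linspace(-14.0, -10.0, n_dates), 1.0, (n, n_dates))
X = np.vstack([crop_a, crop_b])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

In the thesis, the inputs are first decorrelated with the MNF transformation, which orders components by signal-to-noise ratio before classification; the classifier call itself is unchanged.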

    Fusing optical and SAR time series for LAI gap filling with multioutput Gaussian processes

    The availability of satellite optical information is often hampered by the natural presence of clouds, which can be problematic for many applications. Persistent clouds over agricultural fields can mask key stages of crop growth, leading to unreliable yield predictions. Synthetic Aperture Radar (SAR) provides all-weather imagery which can potentially overcome this limitation, but given its high and distinct sensitivity to different surface properties, the fusion of SAR and optical data remains an open challenge. In this work, we propose the use of Multi-Output Gaussian Process (MOGP) regression, a machine learning technique that automatically learns the statistical relationships among multisensor time series, to detect vegetated areas over which the synergy between SAR and optical imagery is profitable. For this purpose, we use Sentinel-1 Radar Vegetation Index (RVI) and Sentinel-2 Leaf Area Index (LAI) time series over a study area in the northwest of the Iberian Peninsula. Through a physical interpretation of the trained MOGP models, we show their ability to provide estimations of LAI even over cloudy periods using the information shared with RVI, which guarantees the solution always stays tied to real measurements. Results demonstrate the advantage of MOGP especially for long data gaps, where optical-based methods notoriously fail. The leave-one-image-out assessment applied to the whole vegetation cover shows that MOGP predictions improve standard GP estimations over short gaps (R² of 74% vs. 68%, RMSE of 0.40 vs. 0.44 m² m⁻²) and especially over long gaps (R² of 33% vs. 12%, RMSE of 0.50 vs. 1.09 m² m⁻²).
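    The gap-filling idea can be illustrated with a strong simplification: instead of a multi-output GP with an explicit cross-covariance between the RVI and LAI series, the sketch below fits a single-output GP on (time, RVI) inputs and uses the gap-free RVI to predict LAI through a simulated cloudy period. All series are synthetic and the linear RVI–LAI link is an assumption made for the toy example only.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Synthetic co-evolving series: gap-free SAR-based RVI and optical LAI
    # with a simulated cloudy gap (hypothetical shapes and values).
    t = np.linspace(0.0, 1.0, 40)
    lai = 4.0 * np.sin(np.pi * t) ** 2       # "true" LAI curve
    rvi = 0.125 * lai + 0.2                  # RVI tracks LAI (assumed)

    cloudy = (t > 0.55) & (t < 0.8)          # optical observations lost here
    X = np.column_stack([t, rvi])
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=[0.2, 0.2]) + WhiteKernel(noise_level=1e-4),
        normalize_y=True,
    )
    gp.fit(X[~cloudy], lai[~cloudy])         # train only on clear-sky dates
    lai_filled = gp.predict(X[cloudy])       # fill the gap using time + RVI
    ```

    The MOGP of the paper goes further: it shares latent functions across outputs, so LAI predictions in a gap are constrained by the RVI observations themselves rather than by a hand-chosen input feature.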

    OBSUM: An object-based spatial unmixing model for spatiotemporal fusion of remote sensing images

    Full text link
    Spatiotemporal fusion aims to improve both the spatial and temporal resolution of remote sensing images, thus facilitating time-series analysis at a fine spatial scale. However, several important issues limit the application of current spatiotemporal fusion methods. First, most spatiotemporal fusion methods are based on pixel-level computation, which neglects the valuable object-level information of the land surface. Moreover, many existing methods cannot accurately retrieve strong temporal changes between the available high-resolution image at the base date and the predicted one. This study proposes an Object-Based Spatial Unmixing Model (OBSUM), which incorporates object-based image analysis and spatial unmixing to overcome the two abovementioned problems. OBSUM consists of one preprocessing step and three fusion steps, i.e., object-level unmixing, object-level residual compensation, and pixel-level residual compensation. OBSUM can be applied using only one fine image at the base date and one coarse image at the prediction date, without the need for a coarse image at the base date. The performance of OBSUM was compared with five representative spatiotemporal fusion methods. The experimental results demonstrated that OBSUM outperformed the other methods in terms of both accuracy indices and visual effects over time series. Furthermore, OBSUM also achieved satisfactory results in two typical remote sensing applications. Therefore, it has great potential to generate accurate, high-resolution time-series observations supporting various remote sensing applications.
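    The spatial-unmixing step at the heart of unmixing-based fusion can be sketched as a small least-squares problem: each coarse pixel's reflectance is modelled as the fraction-weighted sum of the unknown mean reflectances of the classes (or objects) inside it, with the fractions known from a fine-resolution classification. The fraction and reflectance values below are hypothetical, and OBSUM additionally applies object- and pixel-level residual compensation on top of this step.

    ```python
    import numpy as np

    # Known class fractions inside each of four coarse pixels (rows),
    # for two land-cover classes (columns); values are illustrative.
    fractions = np.array([[0.7, 0.3],
                          [0.2, 0.8],
                          [0.5, 0.5],
                          [0.9, 0.1]])

    # Simulate coarse observations from "true" class reflectances + noise.
    class_refl_true = np.array([0.12, 0.35])   # unknown in practice
    rng = np.random.default_rng(1)
    coarse = fractions @ class_refl_true + rng.normal(0.0, 0.002, 4)

    # Unmixing: recover per-class mean reflectance by least squares.
    class_refl, *_ = np.linalg.lstsq(fractions, coarse, rcond=None)
    ```

    The recovered class reflectances are then assigned back to the fine pixels of each class, which is what produces a fine-resolution prediction from a single coarse image at the prediction date.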