
Long-Term Prediction of Sea Surface Temperature by Temporal Embedding Transformer With Attention Distilling and Partial Stacked Connection

Sea surface temperature (SST) is one of the most important parameters in the global ocean–atmosphere system, and its long-term changes will have a significant impact on global climate and ecosystems. Accurate prediction of SST, and in particular improved long-term predictive skill, is therefore of great significance for fishery farming, marine ecological protection, and the planning of maritime activities. Because effectively and precisely capturing the long-range dependence between inputs and outputs demands greater predictive capacity from a model, accurate long-term SST prediction is an extremely challenging task. Inspired by the successful application of the transformer and its variants to natural language processing, a problem similar to time-series prediction, we introduce the architecture to SST prediction in the China Sea. The proposed model, a Transformer with temporal embedding, attention Distilling, and Stacked connection in Part (TransDtSt-Part), is developed by embedding temporal information in the classic transformer, combining attention distilling with a partial stacked connection, and performing generative decoding. High-resolution satellite-derived data from the National Oceanic and Atmospheric Administration are used, and long-term SST predictions at daily granularity are produced under both univariate and multivariate settings. With root mean square error and mean absolute error as metrics, TransDtSt-Part outperforms all competitive baselines across five sea regions (subareas of the Bohai Sea, Yellow Sea, East China Sea, Taiwan Strait, and South China Sea) and six prediction horizons (30, 60, 90, 180, 270, and 360 days). Experimental results demonstrate that the performance of the proposed model is encouraging and promising for long-term SST prediction.
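
The core architectural idea named in the abstract is to enrich the classic transformer's value embedding with temporal (calendar) information before encoding. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the class names, the choice of month/day/weekday features, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch: add a calendar embedding to the value embedding of a
# transformer encoder for daily SST sequences. Names and sizes are assumptions.
import torch
import torch.nn as nn

class TemporalEmbedding(nn.Module):
    """Embed month, day-of-month, and weekday indices into d_model-dim vectors."""
    def __init__(self, d_model: int):
        super().__init__()
        self.month_emb = nn.Embedding(13, d_model)    # months 1..12
        self.day_emb = nn.Embedding(32, d_model)      # days 1..31
        self.weekday_emb = nn.Embedding(7, d_model)   # weekdays 0..6

    def forward(self, time_marks: torch.Tensor) -> torch.Tensor:
        # time_marks: (batch, seq_len, 3) integer codes [month, day, weekday]
        return (self.month_emb(time_marks[..., 0])
                + self.day_emb(time_marks[..., 1])
                + self.weekday_emb(time_marks[..., 2]))

class SSTInputEmbedding(nn.Module):
    """Project SST values to d_model and add temporal and positional information."""
    def __init__(self, n_features: int, d_model: int, max_len: int = 512):
        super().__init__()
        self.value_proj = nn.Linear(n_features, d_model)
        self.temporal = TemporalEmbedding(d_model)
        self.position = nn.Embedding(max_len, d_model)

    def forward(self, x: torch.Tensor, time_marks: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features); time_marks: (batch, seq_len, 3)
        pos = torch.arange(x.size(1), device=x.device)
        return self.value_proj(x) + self.temporal(time_marks) + self.position(pos)

# Example: a 90-day univariate SST window feeding a standard transformer encoder.
embed = SSTInputEmbedding(n_features=1, d_model=64)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(64, nhead=4, batch_first=True), num_layers=2)
x = torch.randn(8, 90, 1)                              # illustrative SST values
marks = torch.stack([torch.randint(1, 13, (8, 90)),    # month
                     torch.randint(1, 32, (8, 90)),    # day of month
                     torch.randint(0, 7, (8, 90))],    # weekday
                    dim=-1)
out = encoder(embed(x, marks))                         # (8, 90, 64)
```

In a setup like this, the combined embedding would feed the encoder–decoder stack to which attention distilling, the partial stacked connection, and generative decoding are applied; those components are specific to the paper and are not reproduced here.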