The superiority of the Transformer for long-term time series forecasting (LTSF) has recently been challenged, as simple models have been shown to outperform numerous Transformer-based approaches. This suggests that a notable gap remains in fully exploiting the Transformer's potential for LTSF. Consequently, this study investigates
key issues in applying the Transformer to LTSF, covering temporal continuity, information density, and multi-channel relationships. We introduce the Placeholder-enhanced Technique (PET) to improve the computational efficiency and predictive accuracy of the Transformer in LTSF tasks. Furthermore,
we examine how larger patch strategies and channel-interaction strategies affect the Transformer's performance, specifically Long Sub-sequence Division (LSD) and Multi-channel Separation and Interaction (MSI). Together, these strategies constitute a novel model, termed PETformer. Extensive
experiments demonstrate that PETformer achieves state-of-the-art performance on eight commonly used public LTSF datasets, surpassing all existing models. The insights and enhancement methods presented in this paper offer a valuable reference and a source of inspiration for future research.