@Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting

[[Abstract]]

  • Accurate prediction of the future given past time series data is of paramount importance, since it opens the door to decision making and risk management ahead of time. In practice, the challenge is to build a flexible but parsimonious model that can capture a wide range of temporal dependencies.

  • In this paper, we propose Pyraformer by exploring the [[multiresolution representation]] of the time series.

    • Specifically, we introduce the [[pyramidal attention module]] (PAM) in which

      • the inter-scale tree structure summarizes features at different resolutions

      • and the intra-scale neighboring connections model the temporal dependencies of different ranges.

    • Under mild conditions, the maximum length of the signal traversing path in Pyraformer is a constant (i.e., $\mathcal{O}(1)$) with respect to the sequence length $L$, while its time and space complexity scale linearly with $L$ (see the mask-construction sketch after this list).

  • Extensive numerical results show that Pyraformer typically achieves the highest prediction accuracy in both single-step and long-range forecasting tasks with the least amount of time and memory consumption, especially when the sequence is long.
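To make the pyramidal graph concrete, below is a minimal NumPy sketch of how a PAM-style sparse attention mask could be assembled. This is an illustration under assumptions, not the paper's implementation: the function name `pyramidal_attention_mask`, the default `stride`/`window` values, and the exact parent-child wiring are hypothetical stand-ins for the paper's actual tree construction.

```python
import numpy as np

def pyramidal_attention_mask(L, num_scales=3, stride=2, window=3):
    """Build a boolean mask for a PAM-style sparse attention pattern.

    Scale 0 holds the L raw time steps; every coarser scale keeps
    1/stride as many nodes. Allowed connections:
      * intra-scale: each node attends to `window` neighbors at its scale,
      * inter-scale: each node attends to its parent (and vice versa).
    The number of connections grows linearly in L, while any two leaves
    are linked through at most ~2 * (num_scales - 1) inter-scale hops,
    which is where the O(1) maximum signal traversing path comes from.
    """
    sizes = [L]
    for _ in range(1, num_scales):
        sizes.append(max(1, sizes[-1] // stride))
    offsets = np.cumsum([0] + sizes)      # start index of each scale
    n_nodes = int(offsets[-1])
    mask = np.zeros((n_nodes, n_nodes), dtype=bool)

    for s, size in enumerate(sizes):
        base = offsets[s]
        for i in range(size):
            u = base + i
            # intra-scale neighbors within the local window
            lo = max(0, i - window // 2)
            hi = min(size, i + window // 2 + 1)
            mask[u, base + lo: base + hi] = True
            # inter-scale connection to the parent at the coarser scale
            if s + 1 < num_scales:
                parent = offsets[s + 1] + min(i // stride, sizes[s + 1] - 1)
                mask[u, parent] = True
                mask[parent, u] = True
    return mask

# Connection count grows linearly with L (contrast with L**2 for full attention).
for L in (16, 32, 64):
    print(L, int(pyramidal_attention_mask(L).sum()))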


Author: Ryen Xiang
Posted on: 2022-03-16
Updated on: 2025-03-30
