MP-Transformer: A Hybrid Model Integrating Multi-Period ARIMA and Dynamically Gated Attention for Time-Series Forecasting
DOI:
https://doi.org/10.5755/j01.itc.55.1.43646

Keywords:
MP-Transformer, multi-period ARIMA, dynamically gated attention, time-series forecasting

Abstract
Accurate time-series forecasting is challenging when multiple seasonalities interact with non-linear effects. We present MP-Transformer, a hybrid "decompose-then-refine" framework that couples an interpretable Multi-Period ARIMA baseline with a dynamically gated attention residual learner. The ARIMA component extracts dominant linear trends and multi-scale seasonality via seasonal phase templates with synchronous differencing, yielding an approximately stationary residual series and an interpretable baseline. A Transformer then models the remaining non-linear dynamics using a gated fusion of global attention (for long-range periodic dependencies) and content-driven Top-k local attention (for abrupt short-term variations). Period contributions are learned through non-negative, normalized gating weights. Across multiple real-world datasets, MP-Transformer consistently improves multi-horizon accuracy over statistical, deep, and hybrid baselines. The results demonstrate that combining explicit linear decomposition with implicit residual learning yields robust, data-efficient forecasting and enhanced interpretability.
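The gated fusion described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`global_attention`, `topk_local_attention`, `gated_fusion`), the single-head NumPy formulation, and the softmax gate over the two branches are all illustrative assumptions; they only demonstrate the general idea of combining a global attention branch with a content-driven Top-k local branch under non-negative, normalized gating weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_attention(q, k, v):
    # standard scaled dot-product attention over all positions
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def topk_local_attention(q, k, v, top_k=2):
    # content-driven Top-k attention: per query, keep only the
    # top_k highest-scoring keys and mask out the rest
    scores = q @ k.T / np.sqrt(q.shape[-1])
    thresh = np.sort(scores, axis=-1)[:, -top_k][:, None]
    masked = np.where(scores >= thresh, scores, -np.inf)
    return softmax(masked) @ v

def gated_fusion(q, k, v, gate_logits, top_k=2):
    # non-negative, normalized gating weights: softmax over the
    # two branch logits guarantees g >= 0 and g.sum() == 1
    g = softmax(np.asarray(gate_logits, dtype=float))
    return g[0] * global_attention(q, k, v) + g[1] * topk_local_attention(q, k, v, top_k)

# toy residual sequence: 5 time steps, 4-dimensional representations
rng = np.random.default_rng(0)
q = rng.standard_normal((5, 4))
k = rng.standard_normal((5, 4))
v = rng.standard_normal((5, 4))
out = gated_fusion(q, k, v, gate_logits=[0.0, 0.0])  # equal-weight gates
```

In the full model the gate logits would be produced by a learned network conditioned on the input content, so the balance between the global and local branches adapts per sequence rather than being fixed as in this toy call.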
License
Copyright terms are indicated in the Republic of Lithuania Law on Copyright and Related Rights, Articles 4-37.