MP-Transformer: A Hybrid Model Integrating Multi-Period ARIMA and Dynamically Gated Attention for Time-Series Forecasting

Authors

  • Yunlong Shi School of Information Engineering, Liaoning University of Traditional Chinese Medicine, Shenyang, 110847, China
  • Weijie Zhou School of Information Engineering, Liaoning University of Traditional Chinese Medicine, Shenyang, 110847, China

DOI:

https://doi.org/10.5755/j01.itc.55.1.43646

Keywords:

MP-Transformer, multi-period ARIMA, dynamically gated attention, time-series forecasting

Abstract

Accurate time-series forecasting is challenging when multiple seasonalities interact with non-linear effects. We present MP-Transformer, a hybrid "decompose-then-refine" framework that couples an interpretable Multi-Period ARIMA baseline with a dynamically gated attention residual learner. The ARIMA component extracts dominant linear trends and multi-scale seasonality via seasonal phase templates with synchronous differencing, yielding an approximately stationary residual series and an interpretable baseline. A Transformer then models the remaining non-linear dynamics using a gated fusion of global attention (for long-range periodic dependencies) and content-driven Top-k local attention (for abrupt short-term variations). Period contributions are learned through non-negative, normalized gating weights. Across multiple real-world datasets, MP-Transformer consistently improves multi-horizon accuracy over statistical, deep, and hybrid baselines. The results demonstrate that combining explicit linear decomposition with implicit residual learning yields robust, data-efficient forecasting and enhanced interpretability.
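The gated fusion described above can be illustrated with a minimal NumPy sketch. The function names, the softmax gating over the two attention branches, and the Top-k selection below are illustrative assumptions, not the paper's exact implementation; they show only the general idea of mixing full attention with content-driven sparse attention via non-negative, normalized weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def global_attention(q, k, v):
    # Full attention over all time steps (long-range periodic dependencies).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def topk_local_attention(q, k, v, topk=4):
    # Content-driven sparse attention: each query attends only to its
    # top-k highest-scoring keys (abrupt short-term variations).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    idx = np.argpartition(-scores, topk - 1, axis=-1)[:, :topk]
    masked = np.full_like(scores, -np.inf)
    np.put_along_axis(masked, idx, np.take_along_axis(scores, idx, axis=-1), axis=-1)
    return softmax(masked) @ v

def gated_fusion(q, k, v, gate_logits, topk=4):
    # Non-negative, normalized gating weights: softmax over the two branches
    # guarantees g >= 0 and g.sum() == 1 (illustrative gating, not the paper's).
    g = softmax(gate_logits)
    return g[0] * global_attention(q, k, v) + g[1] * topk_local_attention(q, k, v, topk)

rng = np.random.default_rng(0)
T, d = 16, 8
q, k, v = (rng.standard_normal((T, d)) for _ in range(3))
out = gated_fusion(q, k, v, gate_logits=np.array([0.3, -0.1]))
print(out.shape)  # (16, 8)
```

In a trained model the gate logits would be produced by a small learned network conditioned on the input, so the global/local balance adapts per series.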

Published

2026-04-03

Section

Articles