International Journal of Innovative Research in Engineering & Multidisciplinary Physical Sciences
E-ISSN: 2349-7300 | Impact Factor: 9.907

A Widely Indexed Open Access Peer Reviewed Online Scholarly International Journal


Hybrid Transformer-Based Architecture for Multi-Horizon Time Series Forecasting with Uncertainty Quantification

Authors: Jwalin Thaker

DOI: https://doi.org/10.5281/zenodo.15086753

Short DOI: https://doi.org/g89tmr

Country: United States



Abstract: Time series forecasting remains a critical challenge across numerous domains, with recent transformer-based architectures demonstrating remarkable capabilities in capturing complex temporal dependencies. This paper introduces a novel hybrid architecture that integrates state-of-the-art transformer models—including PatchTST, Temporal Fusion Transformers (TFT) [2], and Informer [3]—with traditional statistical methods to enhance multi-horizon forecasting performance. Our approach leverages specialized multi-head attention mechanisms for temporal data, patch embedding techniques, and probabilistic forecasting components to quantify prediction uncertainty. The proposed architecture adaptively handles varying time horizons while efficiently processing static and dynamic covariates, missing data, and irregular sampling patterns. Extensive experiments across diverse applications—financial markets, energy consumption, supply chain, weather forecasting, and healthcare—demonstrate that our hybrid model consistently outperforms existing methods on both traditional metrics (MAE, RMSE, MAPE) and probabilistic evaluation criteria (CRPS, calibration). Furthermore, we incorporate interpretability layers that provide actionable insights for business decision-making, addressing a significant limitation of black-box deep learning approaches. Our work contributes to advancing the field of time series forecasting by combining the strengths of transformer architectures with uncertainty quantification techniques in a computationally efficient framework.
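The abstract evaluates forecasts on both point metrics (MAE, RMSE, MAPE) and the Continuous Ranked Probability Score (CRPS). As a minimal illustrative sketch (not the authors' evaluation code), the empirical CRPS for an ensemble forecast can be computed from the identity CRPS = E|X − y| − ½·E|X − X′|:

```python
import numpy as np

def crps_ensemble(samples, observation):
    """Empirical CRPS for an ensemble forecast of a scalar observation.

    Uses the estimator CRPS = mean|X - y| - 0.5 * mean|X_i - X_j|,
    where X are ensemble members and y is the realized value.
    """
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - observation))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

def point_metrics(y_true, y_pred):
    """MAE, RMSE, and MAPE (in percent) for point forecasts."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    mape = np.mean(np.abs(err / y_true)) * 100.0
    return mae, rmse, mape
```

For a degenerate (single-member) ensemble, CRPS reduces to the absolute error, which is why it is often described as a probabilistic generalization of MAE.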

Keywords: Time Series Forecasting, Transformer Models, Uncertainty Quantification, Multi-Horizon Prediction, Deep Learning, Machine Learning, Artificial Intelligence
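The patch embedding technique referenced in the abstract (popularized by PatchTST) segments a series into overlapping windows and linearly projects each into the model dimension. A hedged NumPy sketch of that step, with `patch_len`, `stride`, and the projection weights as illustrative parameters rather than the paper's actual configuration:

```python
import numpy as np

def patch_embed(series, patch_len, stride, weight, bias):
    """Split a univariate series into overlapping patches and project
    each patch to d_model dimensions (PatchTST-style patching sketch).

    series: 1-D array of length n
    weight: (patch_len, d_model) projection matrix
    bias:   (d_model,) offset
    Returns: (num_patches, d_model) token sequence for the transformer.
    """
    series = np.asarray(series, dtype=float)
    starts = range(0, len(series) - patch_len + 1, stride)
    patches = np.stack([series[s:s + patch_len] for s in starts])
    return patches @ weight + bias
```

Patching shortens the token sequence by roughly a factor of `stride`, which is the main source of the computational efficiency such architectures claim over point-wise attention.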


Paper Id: 232285

Published On: 2023-12-23

Published In: Volume 11, Issue 6, November-December 2023
