Time Series Forecasting Using Deep Learning: A Comparative Study of LSTM, GRU, and Transformer Models
Abstract
Time series forecasting is especially important in fields such as finance, energy, and healthcare, where accurate predictions inform strategic decisions. Conventional statistical models tend to be ineffective at capturing nonlinear trends and long-term dependencies, which has drawn increasing research attention to deep learning models. The current work presents a comparative analysis of three popular architectures, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Transformer, applied to diverse time series datasets. The study compares predictive accuracy, computational cost, and scalability using standard metrics, revealing the trade-offs between recurrent and attention-based models. The findings show that while LSTM and GRU are highly effective at modelling sequential dependencies, Transformer models provide better parallelization and greater flexibility for complex temporal dynamics. The results highlight the importance of selecting a model according to the application context, the nature of the data, and resource constraints. This comparative study can inform methodology selection in predictive analytics and offer practical guidance to researchers and industry practitioners.
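One concrete facet of the efficiency trade-off mentioned above is parameter count: an LSTM cell uses four gated transformations while a GRU cell uses three, so a GRU of the same hidden size carries roughly 25% fewer parameters. A minimal illustrative sketch (the function names are our own, not from the study):

```python
def lstm_cell_params(input_size: int, hidden_size: int) -> int:
    # An LSTM cell has 4 gated transformations (input, forget, cell, output),
    # each with input weights, recurrent weights, and a bias vector.
    return 4 * (hidden_size * input_size + hidden_size * hidden_size + hidden_size)

def gru_cell_params(input_size: int, hidden_size: int) -> int:
    # A GRU cell has 3 gated transformations (update, reset, candidate),
    # hence ~25% fewer parameters at the same hidden size.
    return 3 * (hidden_size * input_size + hidden_size * hidden_size + hidden_size)

# Example: univariate series, hidden size 64
print(lstm_cell_params(1, 64))  # 16896
print(gru_cell_params(1, 64))   # 12672
```

This counts only a single recurrent cell; full models add embedding and output layers, and Transformer parameter counts depend additionally on the number of attention heads and feed-forward width.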
Article information
Journal
Journal of Computer Science and Technology Studies
Volume (Issue)
5 (1)
Pages
74-89
Published
Copyright
Open access

This work is licensed under a Creative Commons Attribution 4.0 International License.