Research Article

Time Series Forecasting Using Deep Learning: A Comparative Study of LSTM, GRU, and Transformer Models

Authors

  • Dhirman Preet Singh Sachar, Katapult Group Inc., Plano, TX, USA

Abstract

Time series forecasting is especially important in fields such as finance, energy, and healthcare, where accurate predictions inform strategic decisions. Conventional statistical models tend to struggle with nonlinear trends and long-term dependencies, which has drawn increasing research attention to deep learning models. The present work provides a comparative analysis of three popular architectures, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Transformer, applied to diverse time series data. The study compares prediction accuracy, computational cost, and scalability using standard metrics, revealing the trade-offs between recurrent and attention-based models. Findings show that while LSTM and GRU are highly effective at modelling sequential dependencies, Transformer models offer better parallelization and greater flexibility in capturing complex temporal dynamics. The results highlight the importance of selecting a model according to the application context, the nature of the data, and resource constraints. This comparative study can help inform methodology selection in predictive analytics and provide useful guidance to researchers and industry practitioners.
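To make the compared architectures concrete, the sketch below (not taken from the paper, and not the authors' implementation) shows minimal PyTorch forecasters for a univariate series: each model maps a sliding window of past values to a one-step-ahead prediction. Layer sizes, the window length, and the omission of positional encoding in the Transformer are illustrative assumptions for brevity.

```python
# Illustrative sketch (hypothetical, not the paper's code): minimal one-step-ahead
# forecasters for a univariate series, each consuming a window of past values.
import torch
import torch.nn as nn


class LSTMForecaster(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):              # x: (batch, window, 1)
        out, _ = self.rnn(x)
        return self.head(out[:, -1])   # predict next value from the last hidden state


class GRUForecaster(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.rnn(x)
        return self.head(out[:, -1])


class TransformerForecaster(nn.Module):
    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)
        # Positional encoding is omitted here for brevity; a real setup would add it.

    def forward(self, x):              # x: (batch, window, 1)
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])     # use the last position's representation


if __name__ == "__main__":
    window, batch = 24, 8
    x = torch.randn(batch, window, 1)  # synthetic input, shape check only
    for model in (LSTMForecaster(), GRUForecaster(), TransformerForecaster()):
        print(type(model).__name__, model(x).shape)  # -> torch.Size([8, 1])
```

All three models share the same input/output interface, which is one way a comparative study could keep training loops and evaluation metrics identical across architectures.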

Article information

Journal

Journal of Computer Science and Technology Studies

Volume (Issue)

5 (1)

Pages

74-89

Published

2023-03-25

How to Cite

Dhirman Preet Singh Sachar. (2023). Time Series Forecasting Using Deep Learning: A Comparative Study of LSTM, GRU, and Transformer Models. Journal of Computer Science and Technology Studies, 5(1), 74-89. https://doi.org/10.32996/jcsts.2023.5.1.9

Keywords:

Time Series Forecasting, Deep Learning, LSTM, GRU, Transformer, Predictive Analytics, Attention Mechanism