# --------------------------------------------
# CITATION file created with {cffr} R package
# See also: https://docs.ropensci.org/cffr/
# --------------------------------------------
cff-version: 1.2.0
message: 'To cite package "transformerForecasting" in publications use:'
type: software
license: GPL-3.0-only
title: 'transformerForecasting: Transformer Deep Learning Model for Time Series
  Forecasting'
version: 0.1.0
doi: 10.32614/CRAN.package.transformerForecasting
abstract: >-
  Time series forecasting faces challenges due to the non-stationarity,
  nonlinearity, and chaotic nature of the data. Traditional deep learning
  models such as the Recurrent Neural Network (RNN), Long Short-Term Memory
  (LSTM), and Gated Recurrent Unit (GRU) process data sequentially and are
  inefficient for long sequences. To overcome these limitations, we propose a
  transformer-based deep learning architecture that uses an attention
  mechanism for parallel processing, improving prediction accuracy and
  efficiency. This package provides user-friendly code implementing the
  proposed architecture. References: Nayak et al. (2024) and Nayak et al.
  (2024).
authors:
- family-names: Harish Nayak
  given-names: G H
  email: harishnayak626@gmail.com
repository: https://harish11999.r-universe.dev
commit: 1b61751eb1131e422dd8766ec78466f283a295fb
date-released: '2025-03-07'
contact:
- family-names: Harish Nayak
  given-names: G H
  email: harishnayak626@gmail.com