Please use this identifier to cite or link to this record: http://hdl.handle.net/10071/34898
Full metadata record
DC Field | Value | Language
dc.contributor.author | Dos Santos, R. P. | -
dc.contributor.author | Matos-Carvalho, J. | -
dc.contributor.author | Leithardt, V. | -
dc.date.accessioned | 2025-07-29T14:49:07Z | -
dc.date.available | 2025-07-29T14:49:07Z | -
dc.date.issued | 2025 | -
dc.identifier.citation | Dos Santos, R. P., Matos-Carvalho, J. & Leithardt, V. (2025). Deep learning in time series forecasting with transformer models and RNNs. PeerJ Computer Science 11, e3001. https://doi.org/10.7717/peerj-cs.3001. | -
dc.identifier.issn | 2376-5992 | -
dc.identifier.uri | http://hdl.handle.net/10071/34898 | -
dc.description.abstract | Given the increasing need for accurate weather forecasts, the use of neural networks, especially transformer and recurrent neural networks (RNNs), has been highlighted for their ability to capture complex patterns in time series. This study examined 14 neural network models applied to forecast weather variables, evaluated using metrics such as median absolute error (MedianAbsE), mean absolute error (MeanAbsE), maximum absolute error (MaxAbsE), root mean squared percent error (RMSPE), and root mean square error (RMSE). Transformer-based models such as Informer, iTransformer, Former, and patch time series transformer (PatchTST) stood out for their accuracy in capturing long-term patterns, with Informer showing the best performance. In contrast, RNN models such as auto-temporal convolutional networks (TCN) and bidirectional TCN (BiTCN) were better suited to short-term forecasting, despite being more prone to significant errors. Using iTransformer, it was possible to achieve a MedianAbsE of 1.21, MeanAbsE of 1.24, MaxAbsE of 2.86, RMSPE of 0.66, and RMSE of 1.43. This study demonstrates the potential of neural networks, especially transformers, to improve accuracy, providing a practical and theoretical basis for selecting the most suitable models for predictive applications. | eng
dc.language.iso | eng | -
dc.publisher | PeerJ | -
dc.relation | 101183162 | -
dc.relation | UID/00408/2025 | -
dc.rights | openAccess | -
dc.subject | Deep learning | eng
dc.subject | Neural networks | eng
dc.subject | Recurrent neural networks (RNNs) | eng
dc.subject | Transformer models | eng
dc.subject | Predictive applications | eng
dc.subject | Accuracy in forecasting | eng
dc.title | Deep learning in time series forecasting with transformer models and RNNs | eng
dc.type | article | -
dc.peerreviewed | yes | -
dc.volume | 11 | -
dc.date.updated | 2025-07-29T15:45:15Z | -
dc.description.version | info:eu-repo/semantics/publishedVersion | -
dc.identifier.doi | 10.7717/peerj-cs.3001 | -
dc.subject.fos | Domain/Scientific Area::Natural Sciences::Computer and Information Sciences | por
dc.subject.fos | Domain/Scientific Area::Engineering and Technology::Other Engineering and Technologies | por
dc.subject.fos | Domain/Scientific Area::Engineering and Technology::Electrical Engineering, Electronic Engineering, Information Engineering | por
iscte.subject.ods | Quality education | por
iscte.subject.ods | Industry, innovation and infrastructure | por
iscte.subject.ods | Reduced inequalities | por
iscte.identifier.ciencia | https://ciencia.iscte-iul.pt/id/ci-pub-112474 | -
iscte.journal | PeerJ Computer Science | -
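
The abstract above reports five point-forecast error metrics: median absolute error (MedianAbsE), mean absolute error (MeanAbsE), maximum absolute error (MaxAbsE), root mean squared percent error (RMSPE), and root mean square error (RMSE). The Python sketch below shows one conventional way to compute them for a single series with NumPy; the function name, the toy input values, and the RMSPE convention used here (dividing by the true values) are illustrative assumptions and may not match the implementation described in the article.

```python
# Minimal sketch of the five error metrics named in the abstract, using
# standard textbook definitions; the article's exact formulas may differ.
import numpy as np

def forecast_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Point-forecast error metrics for one series (assumes y_true has no zeros)."""
    err = y_true - y_pred
    abs_err = np.abs(err)
    return {
        "MedianAbsE": float(np.median(abs_err)),                     # median absolute error
        "MeanAbsE": float(np.mean(abs_err)),                         # mean absolute error
        "MaxAbsE": float(np.max(abs_err)),                           # maximum absolute error
        "RMSPE": float(np.sqrt(np.mean((err / y_true) ** 2))),       # root mean squared percent error
        "RMSE": float(np.sqrt(np.mean(err ** 2))),                   # root mean square error
    }

# Toy example with made-up temperature readings (illustrative only):
y_true = np.array([21.0, 22.5, 23.1, 24.0, 22.8])
y_pred = np.array([20.4, 22.9, 24.0, 23.2, 22.1])
print(forecast_metrics(y_true, y_pred))
```

Reporting several of these metrics together, as the study does, separates typical-case accuracy (MedianAbsE, MeanAbsE) from worst-case behaviour (MaxAbsE), and scale-dependent error (RMSE) from relative error (RMSPE).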
Appears in collections: ISTAR-RI - Articles in international peer-reviewed scientific journals

Files in this record:
File | Size | Format
article_112474 | 13.8 MB | Adobe PDF

All records in the repository are protected by copyright law, with all rights reserved.