Please use this identifier to cite or link to this item: http://hdl.handle.net/10071/37063
Author(s): Costa, J. L.
Date: 2025
Title: The asymptotic structure of deep neural networks
Journal title: CIM Bulletin
Number: 47
Pages: 23 - 30
Reference: Costa, J. L. (2025). The asymptotic structure of deep neural networks. CIM Bulletin, (47), 23-30.
ISSN: 2183-8062
Abstract: Deep Neural Networks (DNNs) are the main concept at the center of the artificial intelligence revolution we are experiencing. However, some of the reasons behind their effectiveness (for instance, why do they seem to provide "good" solutions, determined by simple optimization algorithms?), as well as the causes of their limitations (for instance, why are they so parameter- and data-expensive?), remain somewhat unclear. A theoretical/mathematical clarification of these issues would therefore be welcome and, in principle, might help us construct a new generation of interpretable, safer, sustainable and, consequently, more reliable AI models. With that in mind, a mathematical approach that has provided some relevant insights is the study of the asymptotic structure of DNNs. In this article, we will start by introducing the basics of DNNs, followed by a presentation of some results concerning the study of the large width limit of these models and a discussion of the implications that such results have for our understanding of supervised machine learning with DNNs.
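The "large width limit" mentioned in the abstract can be illustrated numerically. The sketch below (an assumption of this record, not code from the article) uses NumPy to show the central-limit-theorem effect that underlies such limits: the output of a randomly initialized one-hidden-layer network, with output weights scaled by 1/sqrt(n), behaves approximately like a Gaussian over initializations as the width n grows.

```python
import numpy as np

# Hedged illustration (not taken from the article): at initialization, the
# scalar output of a one-hidden-layer tanh network with i.i.d. Gaussian
# weights is a sum of n weakly dependent terms; with 1/sqrt(n) output
# scaling, its distribution over random initializations approaches a
# Gaussian as the width n grows (the phenomenon behind the large width limit).
rng = np.random.default_rng(0)

def wide_net_output(x, n, rng):
    # Hidden weights W ~ N(0, 1); output weights v ~ N(0, 1/n).
    W = rng.normal(0.0, 1.0, size=(n, x.shape[0]))
    v = rng.normal(0.0, 1.0 / np.sqrt(n), size=n)
    return v @ np.tanh(W @ x)

x = np.array([0.5, -0.3])
# Sample the network output over many independent initializations.
samples = np.array([wide_net_output(x, 2000, rng) for _ in range(3000)])
# The empirical mean is close to 0 and the spread is O(1), consistent with
# convergence to a centered Gaussian process on inputs.
print(round(float(samples.mean()), 3), round(float(samples.std()), 3))
```

Varying `n` (say 10 vs. 2000) and comparing histograms of `samples` against a fitted normal density makes the Gaussian limit visible; the input `x` and the weight variances here are arbitrary illustrative choices.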
Peer reviewed: yes
Access type: Open Access
Appears in Collections:BRU-RN - Artigos em revistas científicas nacionais com arbitragem científica

Files in This Item:
File: article_118149.pdf | Size: 198,74 kB | Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.