Use this identifier to reference this record:
http://hdl.handle.net/10071/36914
| Authors: | Awais, M.; Postolache, O. A.; Oliveira, S. M. |
| Date: | 2026 |
| Title: | Quantum-enhanced learning: Leveraging von Neumann entropy for enhanced graph neural network performance |
| Journal title: | Neural Networks |
| Volume: | 201 |
| Bibliographic reference: | Awais, M., Postolache, O. A., & Oliveira, S. M. (2026). Quantum-enhanced learning: Leveraging von Neumann entropy for enhanced graph neural network performance. Neural Networks, 201, Article 108958. https://doi.org/10.1016/j.neunet.2026.108958 |
| ISSN: | 0893-6080 |
| DOI (Digital Object Identifier): | 10.1016/j.neunet.2026.108958 |
| Keywords: | Graph neural networks; von Neumann entropy; Quantum information theory; Over-squashing; Over-smoothing |
| Abstract: | Graph Neural Networks (GNNs) have established themselves as powerful tools for learning from graph-structured data. However, their reliance on local message-passing mechanisms leads to over-squashing—the compression of exponentially growing neighborhood information into fixed-size vectors—which severely limits long-range dependency modeling. We introduce the Quantum-Inspired Graph Neural Network (QGNN) with a novel Quantum Entanglement Loss (QEL) function that addresses this challenge through a fundamentally different mechanism than existing approaches. Unlike spectral regularization (which enforces smoothness) or maximum entropy methods (which encourage representation diversity), QEL minimizes the von Neumann entropy of the node embedding correlation matrix, thereby concentrating eigenvalues in dominant eigenmodes that preserve global structural patterns. This entropy minimization creates direct information pathways between distant but functionally related nodes, effectively bypassing multi-hop bottlenecks. We evaluate QGNN on both standard benchmarks (Cora, Citeseer, PPI, Electronic Circuits) and the Long Range Graph Benchmark (LRGB) suite, which features graphs with average diameters up to 56.99 (Peptides). On LRGB datasets, QGNN achieves substantial improvements: 37.6% relative MAE reduction on Peptides-struct compared to GCN, 4.0% improvement over Graph Transformers (GraphGPS), and notably, 97% better performance than GCN on node pairs separated by 7+ hops. Despite these gains, QGNN requires only 20–30% additional computational overhead compared to standard GCN, while being 5–6× faster than Graph Transformer approaches. Our results establish entropy-based regularization as a principled and efficient approach for long-range dependency modeling in graphs. |
| Peer reviewed: | Yes |
| Access: | Open Access |
| Appears in collections: | IT-RI - Articles in international scientific journals with peer review |
Files in this record:
| File | Size | Format | |
|---|---|---|---|
| article_117842.pdf | 3.19 MB | Adobe PDF | View/Open |
All records in the repository are protected by copyright law, with all rights reserved.
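The abstract describes QEL as minimizing the von Neumann entropy of the node embedding correlation matrix. The paper's exact formulation is not reproduced in this record, but a minimal sketch of such a loss, assuming embeddings are a dense `(num_nodes, dim)` tensor and the correlation matrix is normalized to unit trace so it acts as a density matrix, could look like:

```python
import torch

def quantum_entanglement_loss(H: torch.Tensor, eps: float = 1e-10) -> torch.Tensor:
    """Von Neumann entropy S(rho) = -sum_i lambda_i log lambda_i of the
    embedding correlation matrix, normalized to unit trace.

    H: (num_nodes, dim) node embedding matrix. Hypothetical helper,
    illustrating the mechanism described in the abstract, not the
    paper's exact loss.
    """
    C = H.t() @ H                         # (dim, dim) correlation (Gram) matrix
    rho = C / (C.trace() + eps)           # unit trace -> density-matrix analogue
    evals = torch.linalg.eigvalsh(rho)    # real eigenvalues of symmetric matrix
    evals = evals.clamp(min=eps)          # guard log(0)
    return -(evals * evals.log()).sum()   # von Neumann entropy

# Minimizing this term concentrates spectral mass in a few dominant
# eigenmodes; e.g. orthonormal embeddings (maximally mixed rho) give
# entropy log(dim), while rank-1 embeddings give entropy near zero.
```

Added as a regularizer (`task_loss + alpha * quantum_entanglement_loss(H)`, with `alpha` a hypothetical weight), gradients flow through the eigendecomposition and push embeddings toward a low-rank, globally correlated structure.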