2024
Conference article · Open Access

GloNets: Globally Connected Neural Networks

Di Cecco A., Metta C., Fantozzi M., Morandin F., Parton M.

Keywords: Machine Learning (cs.LG), Neural and Evolutionary Computing (cs.NE), Neural Network, Deep Learning

Deep learning architectures suffer from depth-related performance degradation, limiting the effective depth of neural networks. Approaches such as ResNet mitigate this problem but do not eliminate it. We introduce Globally Connected Neural Networks (GloNet), a novel architecture that overcomes depth-related issues and can be superimposed on any model, increasing its depth without adding complexity or reducing performance. With GloNet, the network's head uniformly receives information from all parts of the network, regardless of their level of abstraction. This enables GloNet to self-regulate information flow during training, reducing the influence of less effective deeper layers and allowing stable training irrespective of network depth. This paper details GloNet's design, its theoretical basis, and a comparison with similar existing architectures. Experiments show GloNet's self-regulation ability and its resilience to depth-related learning issues such as performance degradation. Our findings suggest GloNet as a strong alternative to traditional architectures like ResNets.
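To make the core idea concrete: instead of feeding the head only the last block's output, GloNet lets every block contribute directly to the representation the head consumes. The PyTorch sketch below is a minimal illustration of that pattern, assuming simple fully connected blocks and sum-based aggregation of block outputs; the layer widths and block structure are placeholders, not the authors' exact configuration.

import torch
import torch.nn as nn

class GloNetSketch(nn.Module):
    """Minimal sketch of a globally connected network: the classification
    head receives an aggregate of ALL block outputs, not just the last one.
    Assumed for illustration: MLP blocks of equal width and summation as
    the aggregation; these are placeholders, not the paper's exact setup."""

    def __init__(self, in_dim=784, width=128, n_blocks=10, n_classes=10):
        super().__init__()
        self.stem = nn.Linear(in_dim, width)
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(width, width), nn.ReLU())
             for _ in range(n_blocks)]
        )
        self.head = nn.Linear(width, n_classes)

    def forward(self, x):
        x = torch.relu(self.stem(x))
        aggregate = torch.zeros_like(x)   # representation seen by the head
        for block in self.blocks:
            x = block(x)
            aggregate = aggregate + x     # every block feeds the head directly
        return self.head(aggregate)

# Quick smoke test: a forward pass on a random batch.
logits = GloNetSketch()(torch.randn(32, 784))
print(logits.shape)  # torch.Size([32, 10])

Because the head sees an aggregate of all block outputs, gradients reach every block directly through that aggregation, and training can shrink the contribution of deeper blocks that add little, which is the self-regulation behavior the abstract describes.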

Source: Lecture Notes in Computer Science, vol. 14641, pp. 53-64

Publisher: Springer


BibTeX entry
@inproceedings{oai:it.cnr:prodotti:492327,
	title = {GloNets: Globally Connected Neural Networks},
	author = {Di Cecco A. and Metta C. and Fantozzi M. and Morandin F. and Parton M.},
	publisher = {Springer},
	doi = {10.1007/978-3-031-58547-0_5},
	eprint = {2311.15947},
	archiveprefix = {arXiv},
	booktitle = {Lecture Notes in Computer Science},
	volume = {14641},
	pages = {53--64},
	year = {2024}
}