2022
Contribution to book (Open Access)

Training convolutional neural networks with competitive Hebbian learning approaches

Lagani G., Falchi F., Gennaro C., Amato G.

Keywords: Computer vision, Competitive learning, Biologically inspired, Neural networks, Machine learning, Hebbian learning

We explore competitive Hebbian learning strategies to train feature detectors in Convolutional Neural Networks (CNNs) without supervision. We consider variants of the Winner-Takes-All (WTA) strategy explored in previous works, namely k-WTA, e-soft-WTA and p-soft-WTA, performing experiments on different object recognition datasets. Results suggest that the Hebbian approaches are effective for training early feature extraction layers and for re-training the higher layers of a pre-trained network, with soft competition generally performing better than the other Hebbian approaches explored in this work. Our findings encourage a path of cooperation between neuroscience and computer science towards a deeper investigation of biologically inspired learning principles.
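To make the k-WTA idea from the abstract concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of an unsupervised k-WTA Hebbian update for a layer of linear feature detectors: only the k most active units respond to a given input, and each winner pulls its weight vector toward that input. The function name, learning rate, and exact update rule are assumptions for illustration.

```python
import numpy as np

def k_wta_hebbian_step(W, x, k=1, lr=0.01):
    """One unsupervised k-WTA Hebbian update (illustrative sketch only).

    W : (num_units, input_dim) weight matrix of linear feature detectors.
    x : (input_dim,) input sample.
    Only the k units with the highest activation are updated; each winner
    moves its weight vector toward the input (a standard competitive
    Hebbian rule with decay toward x; details here are assumptions).
    """
    y = W @ x                         # activations of all units
    winners = np.argsort(y)[-k:]      # indices of the top-k active units
    for i in winners:
        # Hebbian update: winner's weights drift toward the input pattern
        W[i] += lr * y[i] * (x - W[i])
    return W

rng = np.random.default_rng(0)
W0 = rng.normal(size=(10, 5))         # 10 feature detectors, 5-dim input
x = rng.normal(size=5)
W1 = k_wta_hebbian_step(W0.copy(), x, k=2)
```

The soft-WTA variants mentioned in the abstract replace the hard top-k selection with a graded competition (e.g. a softmax over activations scaling each unit's update), so losing units still learn, just more weakly.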

Source: Machine Learning, Optimization, and Data Science, edited by Nicosia G., Ojha V., La Malfa E., La Malfa G., Jansen G., Pardalos P.M., Giuffrida G., Umeton R., pp. 25–40, 2022



Projects (via OpenAIRE)

AI4EU
A European AI On Demand Platform and Ecosystem

AI4Media
A European Excellence Centre for Media, Society and Democracy


BibTeX entry
@inbook{oai:it.cnr:prodotti:465267,
	title = {Training convolutional neural networks with competitive {H}ebbian learning approaches},
	author = {Lagani G. and Falchi F. and Gennaro C. and Amato G.},
	doi = {10.1007/978-3-030-95467-3_2},
	booktitle = {Machine Learning, Optimization, and Data Science},
	editor = {Nicosia G. and Ojha V. and La Malfa E. and La Malfa G. and Jansen G. and Pardalos P.M. and Giuffrida G. and Umeton R.},
	pages = {25--40},
	year = {2022}
}