2022
Contribution to book (Open Access)

Evaluating Hebbian learning in a semi-supervised setting

Lagani G., Falchi F., Gennaro C., Amato G.

Keywords: Deep learning · Bio-inspired · Sample efficiency · Neural networks · Semi-supervised · Hebbian learning

We propose a semi-supervised learning strategy for deep Convolutional Neural Networks (CNNs) in which an unsupervised pre-training stage, performed using biologically inspired Hebbian learning algorithms, is followed by supervised end-to-end backprop fine-tuning. We explore two Hebbian learning rules for the unsupervised pre-training stage: soft-Winner-Takes-All (soft-WTA) and nonlinear Hebbian Principal Component Analysis (HPCA). We apply our approach in sample-efficiency scenarios, where the number of available labeled training samples is very limited and unsupervised pre-training is therefore beneficial. We perform experiments on the CIFAR10, CIFAR100, and Tiny ImageNet datasets. Our results show that Hebbian pre-training outperforms Variational Auto-Encoder (VAE) pre-training in almost all cases, with HPCA generally performing better than soft-WTA.
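To give a flavor of the soft-WTA family of rules mentioned in the abstract, the sketch below shows a generic soft-competitive Hebbian weight update: each unit's weights move toward the input, scaled by the unit's softmax-normalized activation. This is a minimal illustration of the general technique, not the specific formulation used in the paper; the function name, learning rate, and temperature parameter are illustrative assumptions.

```python
import numpy as np

def soft_wta_hebbian_step(W, x, lr=0.01, temperature=1.0):
    """One soft-Winner-Takes-All Hebbian update (illustrative sketch,
    not the paper's exact rule).

    W: (num_units, input_dim) weight matrix; x: (input_dim,) input.
    Units compete via a softmax over their activations; each weight
    vector is pulled toward the input in proportion to how strongly
    its unit won the competition.
    """
    s = W @ x                                   # unit activations
    y = np.exp((s - s.max()) / temperature)     # stabilized softmax
    y /= y.sum()                                # soft competition
    W += lr * y[:, None] * (x[None, :] - W)     # Hebbian pull toward x
    return W

# Toy usage: 4 units learning from random 8-dimensional inputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
for _ in range(100):
    W = soft_wta_hebbian_step(W, rng.normal(size=8))
```

Because the update needs no labels, such a rule can pre-train convolutional filters layer by layer before the supervised fine-tuning stage described above.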

Source: Machine Learning, Optimization, and Data Science, edited by Nicosia G.; Ojha V.; La Malfa E.; La Malfa G.; Jansen G.; Pardalos P.M.; Giuffrida G.; Umeton R., pp. 365–379, 2022

Projects (via OpenAIRE)

AI4EU
A European AI On Demand Platform and Ecosystem

AI4Media
A European Excellence Centre for Media, Society and Democracy

BibTeX entry
@inbook{oai:it.cnr:prodotti:465268,
	title = {Evaluating {Hebbian} learning in a semi-supervised setting},
	author = {Lagani G. and Falchi F. and Gennaro C. and Amato G.},
	editor = {Nicosia G. and Ojha V. and La Malfa E. and La Malfa G. and Jansen G. and Pardalos P. M. and Giuffrida G. and Umeton R.},
	booktitle = {Machine Learning, Optimization, and Data Science},
	pages = {365--379},
	doi = {10.1007/978-3-030-95470-3_28},
	year = {2022}
}