Lagani G, Falchi F, Gennaro C, Amato G
Keywords: Hebbian learning · Deep learning · Neural networks · Biologically inspired
In this paper, we investigate Hebbian learning strategies applied to Convolutional Neural Network (CNN) training. We consider two unsupervised learning approaches: Hebbian Winner-Takes-All (HWTA) and Hebbian Principal Component Analysis (HPCA). The Hebbian learning rules are used to train the layers of a CNN in order to extract features that are then used for classification, without requiring backpropagation (backprop). Experimental comparisons are made with state-of-the-art unsupervised (but backprop-based) Variational Auto-Encoder (VAE) training. For completeness, we consider two supervised Hebbian learning variants (Supervised Hebbian Classifiers, SHC, and Contrastive Hebbian Learning, CHL) for training the final classification layer, which are compared against Stochastic Gradient Descent training. We also investigate hybrid learning methodologies, in which some network layers are trained following the Hebbian approach and others are trained by backprop. We tested our approaches on the MNIST, CIFAR10, and CIFAR100 datasets. Our results suggest that Hebbian learning is generally suitable for training early feature extraction layers, or for retraining higher network layers in fewer training epochs than backprop. Moreover, our experiments show that Hebbian learning outperforms VAE training, with HPCA generally performing better than HWTA.
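To make the HWTA idea concrete, the following is a minimal sketch of a Winner-Takes-All Hebbian update in the competitive-learning style: the most strongly responding unit moves its weight vector toward the input. This is an illustrative variant only, not the authors' exact rule or implementation; the function name `hwta_update` and the toy dimensions are assumptions for the example.

```python
import numpy as np

def hwta_update(W, x, lr=0.1):
    """One Hebbian Winner-Takes-All step (illustrative sketch).

    W: weight matrix of shape (units, features)
    x: input vector of shape (features,)
    The winning unit (largest response W @ x) is nudged toward x;
    all other units are left unchanged.
    """
    winner = int(np.argmax(W @ x))        # unit with the strongest response
    W[winner] += lr * (x - W[winner])     # Hebbian move toward the input
    return winner

# Toy run: 4 units learning from random 8-dimensional inputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
for _ in range(100):
    hwta_update(W, rng.normal(size=8))
```

Because only the winner is updated, the rule is local and needs no backward pass, which is what allows such layers to be trained without backprop.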
Source: Neural Computing and Applications
@article{oai:it.cnr:prodotti:462597,
  title   = {Comparing the performance of Hebbian against backpropagation learning using convolutional neural networks},
  author  = {Lagani G and Falchi F and Gennaro C and Amato G},
  journal = {Neural Computing and Applications},
  year    = {2022}
}