Amato G., Carrara F., Falchi F., Gennaro C., Lagani G.
Keywords: Deep learning · Computer vision · Hebbian learning · Convolutional neural networks
Neural networks are said to be biologically inspired because they mimic the behavior of real neurons. However, several processes in state-of-the-art neural networks, including Deep Convolutional Neural Networks (DCNNs), differ markedly from those found in animal brains. One relevant difference is the training process: state-of-the-art artificial neural networks are trained with backpropagation and Stochastic Gradient Descent (SGD) optimization, yet studies in neuroscience strongly suggest that processes of this kind do not occur in the biological brain. Instead, learning methods based on Spike-Timing-Dependent Plasticity (STDP) or the Hebbian learning rule are regarded as more plausible by neuroscientists. In this paper, we investigate the use of the Hebbian learning rule for training Deep Neural Networks for image classification by proposing a novel weight update rule for shared kernels in DCNNs. We perform experiments on the CIFAR-10 dataset in which we employ Hebbian learning, alongside SGD, to train parts of the model or whole networks for image classification, and we discuss their performance thoroughly, considering both effectiveness and efficiency.
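The abstract contrasts SGD with Hebbian learning but does not reproduce the paper's update rule. As a rough illustration of the Hebbian principle ("neurons that fire together wire together"), the sketch below implements Oja's stabilized variant of the Hebbian rule for a single linear unit in NumPy; the paper's actual rule for shared convolutional kernels is not reproduced here, and the function name and learning rate are illustrative assumptions.

```python
import numpy as np

def hebbian_update(w, x, lr=0.01):
    """One Hebbian step (Oja's rule) for a single linear unit.

    Illustrative sketch only: the paper proposes a specific update
    for shared convolutional kernels, which differs from this.
    """
    y = np.dot(w, x)                  # post-synaptic activation
    # Hebbian term y*x plus a decay term -y^2*w that keeps ||w|| bounded
    return w + lr * y * (x - y * w)

# Toy usage: with inputs drawn along a fixed direction, the weight
# vector converges to that direction (the principal component).
rng = np.random.default_rng(0)
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(1000):
    x = np.array([1.0, 0.2]) * rng.normal()
    w = hebbian_update(w, x)
```

Note that, unlike SGD, this update uses only locally available quantities (the pre- and post-synaptic activations), which is what makes Hebbian-style rules attractive from a biological-plausibility standpoint.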
Source: Image Analysis and Processing - ICIAP 2019, pp. 324–334, Trento, Italy, 9–13 September 2019
Publisher: Springer, Berlin, DEU
@inproceedings{oai:it.cnr:prodotti:411371,
  title     = {Hebbian learning meets deep convolutional neural networks},
  author    = {Amato G. and Carrara F. and Falchi F. and Gennaro C. and Lagani G.},
  booktitle = {Image Analysis and Processing -- ICIAP 2019},
  pages     = {324--334},
  address   = {Trento, Italy},
  publisher = {Springer, Berlin, DEU},
  doi       = {10.1007/978-3-030-30642-7_29},
  year      = {2019}
}