Lagani G., Falchi F., Gennaro C., Fassold H., Amato G.
Keywords: Hebbian learning, Deep learning, Neural networks, Biologically inspired
Recent work on sample-efficient training of Deep Neural Networks (DNNs) proposed a semi-supervised methodology based on biologically inspired Hebbian learning, combined with traditional backprop-based training. Promising results were achieved on various computer vision benchmarks in scenarios of scarce labeled data availability. However, current Hebbian learning solutions can hardly address large-scale scenarios due to their demanding computational cost. To tackle this limitation, in this contribution we investigate a novel solution, named FastHebb (FH), based on the reformulation of Hebbian learning rules in terms of matrix multiplications, which can be executed more efficiently on GPU. Starting from the Soft-Winner-Takes-All (SWTA) and Hebbian Principal Component Analysis (HPCA) learning rules, we formulate their improved FH versions: SWTA-FH and HPCA-FH. We experimentally show that the proposed approach accelerates training by up to 70 times, allowing us to gracefully scale Hebbian learning experiments to large datasets and network architectures, such as ImageNet and VGG.
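To illustrate the core idea behind the reformulation, the sketch below shows how a batched update under the plain Hebbian rule (not the paper's exact SWTA/HPCA variants, and without their normalization terms) collapses a per-sample loop of outer products into a single matrix multiplication, which is the operation GPUs execute efficiently. NumPy is used here purely for illustration; the shapes and names are assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
B, n_in, n_out = 8, 5, 3            # batch size, input dim, number of neurons
X = rng.normal(size=(B, n_in))      # batch of input vectors
W = rng.normal(size=(n_out, n_in))  # synaptic weight matrix
lr = 0.01                           # learning rate

# Naive per-sample Hebbian update: dw_ij += lr * y_i * x_j for each sample
dW_loop = np.zeros_like(W)
for x in X:
    y = W @ x                       # layer activations for one sample
    dW_loop += lr * np.outer(y, x)

# The same accumulated update as one matrix multiplication over the batch
Y = X @ W.T                         # (B, n_out): activations for all samples
dW_matmul = lr * Y.T @ X            # (n_out, n_in): summed Hebbian update

assert np.allclose(dW_loop, dW_matmul)
```

The equivalence holds because summing the outer products `y_b x_b^T` over the batch is exactly the product `Y^T X`; on a GPU the single matmul replaces `B` small kernel launches with one large, well-optimized one.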
Source: NEUROCOMPUTING, vol. 595
@article{oai:iris.cnr.it:20.500.14243/500961,
  title   = {Scalable bio-inspired training of Deep Neural Networks with FastHebb},
  author  = {Lagani G. and Falchi F. and Gennaro C. and Fassold H. and Amato G.},
  journal = {Neurocomputing},
  volume  = {595},
  doi     = {10.1016/j.neucom.2024.127867},
  year    = {2024}
}