[1] K. He, X. Zhang, S. Ren, J. Sun, Deep residual learning for image recognition, in: Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 770-778.
[2] D. Silver, A. Huang, C. J. Maddison, A. Guez, L. Sifre, G. Van Den Driessche, J. Schrittwieser, I. Antonoglou, V. Panneershelvam, M. Lanctot, et al., Mastering the game of Go with deep neural networks and tree search, Nature 529 (7587) (2016) 484.
[3] J. Devlin, M.-W. Chang, K. Lee, K. Toutanova, BERT: Pre-training of deep bidirectional transformers for language understanding, arXiv preprint arXiv:1810.04805 (2018).
[4] Y. Bengio, P. Lamblin, D. Popovici, H. Larochelle, Greedy layer-wise training of deep networks, in: Advances in neural information processing systems, 2007, pp. 153-160.
[5] H. Larochelle, Y. Bengio, J. Louradour, P. Lamblin, Exploring strategies for training deep neural networks, Journal of machine learning research 10 (1) (2009).
[6] J. Weston, F. Ratle, H. Mobahi, R. Collobert, Deep learning via semi-supervised embedding, in: Neural networks: Tricks of the trade, Springer, 2012, pp. 639-655.
[7] D. P. Kingma, S. Mohamed, D. Jimenez Rezende, M. Welling, Semi-supervised learning with deep generative models, Advances in neural information processing systems 27 (2014) 3581-3589.
[8] A. Rasmus, M. Berglund, M. Honkala, H. Valpola, T. Raiko, Semi-supervised learning with ladder networks, in: Advances in neural information processing systems, 2015, pp. 3546-3554.
[9] Y. Zhang, K. Lee, H. Lee, Augmenting supervised neural networks with unsupervised objectives for large-scale image classification, in: International conference on machine learning, 2016, pp. 612-621.
[10] T. Chen, S. Kornblith, M. Norouzi, G. Hinton, A simple framework for contrastive learning of visual representations, in: International conference on machine learning, PMLR, 2020, pp. 1597-1607.
[11] S. Haykin, Neural networks and learning machines, 3rd Edition, Pearson, 2009.
[12] W. Gerstner, W. M. Kistler, Spiking neuron models: Single neurons, populations, plasticity, Cambridge University Press, 2002.
[13] R. C. O'Reilly, Y. Munakata, Computational explorations in cognitive neuroscience: Understanding the mind by simulating the brain, MIT Press, 2000.
[14] J. Weston, S. Chopra, A. Bordes, Memory networks, arXiv preprint arXiv:1410.3916 (2014).
[15] S. Grossberg, Adaptive pattern classification and universal recoding: I. Parallel development and coding of neural feature detectors, Biological cybernetics 23 (3) (1976) 121-134.
[16] T. Kohonen, Self-organized formation of topologically correct feature maps, Biological cybernetics 43 (1) (1982) 59-69.
[17] T. D. Sanger, Optimal unsupervised learning in a single-layer linear feedforward neural network, Neural networks 2 (6) (1989) 459-473.
[18] J. Karhunen, J. Joutsensalo, Generalizations of principal component analysis, optimization problems, and neural networks, Neural Networks 8 (4) (1995) 549-562.
[19] S. Becker, M. Plumbley, Unsupervised neural network learning procedures for feature extraction and classification, Applied Intelligence 6 (3) (1996) 185-203.
[20] C. Pehlevan, T. Hu, D. B. Chklovskii, A Hebbian/anti-Hebbian neural network for linear subspace learning: A derivation from multidimensional scaling of streaming data, Neural computation 27 (7) (2015) 1461-1495.
[21] C. Pehlevan, D. B. Chklovskii, Optimization theory of Hebbian/anti-Hebbian networks for PCA and whitening, in: 2015 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton), IEEE, 2015, pp. 1458-1465.
[22] A. Wadhwa, U. Madhow, Learning sparse, distributed representations using the Hebbian principle, arXiv preprint arXiv:1611.04228 (2016).
[23] A. Wadhwa, U. Madhow, Bottom-up deep learning using the Hebbian principle (2016).
[24] Y. Bahroun, A. Soltoggio, Online representation learning with single and multi-layer Hebbian networks for image classification, in: International Conference on Artificial Neural Networks, Springer, 2017, pp. 354-363.
[25] D. Krotov, J. J. Hopfield, Unsupervised learning by competing hidden units, Proceedings of the National Academy of Sciences 116 (16) (2019) 7723-7731.
[26] G. Amato, F. Carrara, F. Falchi, C. Gennaro, G. Lagani, Hebbian learning meets deep convolutional neural networks, in: International Conference on Image Analysis and Processing, Springer, 2019, pp. 324-334.
[27] G. Lagani, Hebbian learning algorithms for training convolutional neural networks, Master's thesis, School of Engineering, University of Pisa, Italy (2019). URL https://etd.adm.unipi.it/theses/available/etd-03292019-220853/
[28] A. Magotra, J. Kim, Transfer learning for image classification using Hebbian plasticity principles, in: Proceedings of the 2019 3rd International Conference on Computer Science and Artificial Intelligence, 2019, pp. 233-238.
[29] A. Magotra, J. Kim, Improvement of heterogeneous transfer learning efficiency by using Hebbian learning principle, Applied Sciences 10 (16) (2020) 5631.
[30] F. J. A. Canto, Convolutional neural networks with Hebbian-based rules in online transfer learning, in: Mexican International Conference on Artificial Intelligence, Springer, 2020, pp. 35-49.
[31] J. Yosinski, J. Clune, Y. Bengio, H. Lipson, How transferable are features in deep neural networks?, arXiv preprint arXiv:1411.1792 (2014).
[32] P. Földiák, Forming sparse representations by local anti-Hebbian learning, Biological cybernetics 64 (2) (1990) 165-170.
[33] B. A. Olshausen, D. J. Field, Emergence of simple-cell receptive field properties by learning a sparse code for natural images, Nature 381 (6583) (1996) 607.
[34] A. Krizhevsky, G. Hinton, Learning multiple layers of features from tiny images (2009).
[35] J. Wu, Q. Zhang, G. Xu, Tiny imagenet challenge, Tech. rep., Stanford University (2017).
[36] A. Krizhevsky, I. Sutskever, G. E. Hinton, Imagenet classification with deep convolutional neural networks, Advances in neural information processing systems (2012).
[37] B. Schölkopf, A. Smola, K.-R. Müller, Nonlinear component analysis as a kernel eigenvalue problem, Neural computation 10 (5) (1998) 1299-1319.
[38] A. Hyvärinen, J. Karhunen, E. Oja, Independent component analysis, Studies in informatics and control 11 (2) (2002) 205-207.
[39] F. Javed, Q. He, L. E. Davidson, J. C. Thornton, J. Albu, L. Boxt, N. Krasnow, M. Elia, P. Kang, S. Heshka, et al., Brain and high metabolic rate organ mass: contributions to resting energy expenditure beyond fat-free mass, The American journal of clinical nutrition 91 (4) (2010) 907-912.
[40] S. B. Furber, F. Galluppi, S. Temple, L. A. Plana, The SpiNNaker project, Proceedings of the IEEE 102 (5) (2014) 652-665.
[41] X. Wu, V. Saxena, K. Zhu, S. Balagopal, A CMOS spiking neuron for brain-inspired neural networks with resistive synapses and in situ learning, IEEE Transactions on Circuits and Systems II: Express Briefs 62 (11) (2015) 1088-1092.