Abstract
Both synaptic plasticity rules (the so-called Hebbian rules) and Convolutional Neural Networks are based on, or inspired by, well-established Computational Neuroscience models of mammalian vision. Each framework offers theoretical advantages, including online learning in the case of Hebbian learning. For Convolutional Neural Networks, such advantages have translated into remarkable image-classification results over the last decade; Hebbian learning, however, has not shared this success. In this paper, we explore the hypothesis that a wider dataset is necessary for the classification of mono-instantiated objects, that is, objects that can be represented as a single cluster in the feature space. Using 15 mono-instantiated classes, the Adam optimizer reaches its maximum accuracy with fewer examples but requires more epochs, whereas the BCM Hebbian rule demands more examples but retains online (real-time) learning. This result supports the principal hypothesis and highlights how Hebbian learning can find a niche in the mainstream of Deep Learning. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
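For readers unfamiliar with the BCM rule mentioned above, the following is a minimal illustrative sketch of its update step, not the paper's actual implementation: the weight change is Hebbian when postsynaptic activity exceeds a sliding threshold and anti-Hebbian below it, with the threshold tracking a running average of squared activity. All dimensions, learning rates, and the random toy data are assumptions for illustration only.

```python
import numpy as np

def bcm_step(w, x, theta, eta=0.01, tau=0.1):
    """One online BCM update (illustrative sketch, not the paper's code)."""
    y = float(np.dot(w, x))                    # postsynaptic activity
    w = w + eta * y * (y - theta) * x          # BCM plasticity term
    theta = (1 - tau) * theta + tau * y ** 2   # sliding modification threshold
    return w, theta

# Toy online run on random inputs (hypothetical 8-dimensional data).
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=8)
theta = 0.0
for _ in range(100):
    x = rng.normal(size=8)
    w, theta = bcm_step(w, x, theta)
```

Because each input is consumed once as it arrives, the rule learns online, which is the property the abstract contrasts with Adam's multi-epoch training.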