Conference
Record
Recording location
Grenoble
Language:
French
Credits
Com LPNC (Production), Damien Querlioz (Speaker)
Rights holder
lpnc-com
Terms of use
Standard intellectual property law
Cite this resource:
Damien Querlioz. LPNC. (2022, July 12). Neuroscience-Infused Deep Learning. [Video]. Canal-U. https://www.canal-u.tv/136743. (Accessed June 2, 2024)

Neuroscience-Infused Deep Learning

Recorded: July 12, 2022 - Published online: January 20, 2023
Description

Deep learning is reaching spectacular new achievements almost on a weekly basis. This progress has been largely fueled by the power of the backpropagation algorithm, without taking direct inspiration from the brain. In recent years, however, ideas coming from neuroscience, cognitive psychology, and neuromorphic computing have increasingly been applied to the field of deep learning to address some of its most obvious limitations. In this talk, I will show two examples coming from our group.

The ways in which deep neural networks and the brain learn differ profoundly. The flagship algorithm for training neural networks, backpropagation, relies on non-local computation, requiring massive memory use. The data movement caused by non-local computation is the source of the massive energy consumption of training deep neural networks (3.5 gigawatt-hours for the recent Google PaLM language model). The brain, by contrast, learns intrinsically: its synapses evolve directly through the spikes applied by the neurons they connect, using their biophysics. Based on this consideration, the group of Y. Bengio proposed Equilibrium Propagation (1), an energy-based deep learning technique that computes gradients using solely local computation and functions using a free phase and a phase where the network is "nudged" toward the right output. We report recent results showing the equivalence of this technique with non-local backpropagation (2), and its potential to scale to real-life tasks (3).

Another concern with deep neural networks is that they suffer from catastrophic forgetting: when presented with new training data, they rapidly forget tasks learned previously. The brain, by contrast, excels at learning tasks sequentially. The ability of synapses to adjust their level of plasticity, called metaplasticity, has been proposed as a leading way by which the brain might avoid catastrophic forgetting (4). Here, we show that a specific type of deep neural network, binarized neural networks, can be made metaplastic. The resulting neural network mitigates catastrophic forgetting, without the need for explicit task boundaries, and using solely local computation (5). These two projects highlight the potential benefits of the synergy between neuroscience and machine learning research.

1. B. Scellier, Y. Bengio, Front. Comput. Neurosci. 11 (2017).
2. M. Ernoult, J. Grollier, D. Querlioz, Y. Bengio, B. Scellier, Proc. NeurIPS, pp. 7081 (2019).
3. A. Laborieux, M. Ernoult, B. Scellier, Y. Bengio, J. Grollier, D. Querlioz, Front. Neurosci. 15, p. 129 (2021).
4. S. Fusi, P. J. Drew, L. F. Abbott, Neuron 45, 599–611 (2005).
5. A. Laborieux, M. Ernoult, T. Hirtzlin, D. Querlioz, Nature Communications 12, 2549 (2021).
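
As an illustration of the free-phase/nudged-phase mechanism mentioned in the description, the following is a minimal, self-contained sketch of Equilibrium Propagation on a toy layered network. It assumes the formulation of Scellier and Bengio (1): the network first relaxes freely to an energy minimum, is then relaxed again while its output is nudged toward the target with strength beta, and each weight is updated from the difference of the local pre- and post-synaptic activity products measured in the two phases. The layer sizes, hyperparameters, and toy data below are illustrative assumptions, not the setup used in the cited papers.

    # Minimal Equilibrium Propagation sketch (assumed toy setup, not the paper's).
    import numpy as np

    rng = np.random.default_rng(0)

    def rho(s):                      # hard-sigmoid activation, as in Scellier & Bengio
        return np.clip(s, 0.0, 1.0)

    n_in, n_hid, n_out = 4, 8, 2
    W1 = rng.normal(0, 0.5, (n_in, n_hid))
    W2 = rng.normal(0, 0.5, (n_hid, n_out))

    def relax(x, y, beta, steps=50, dt=0.1):
        """Relax hidden/output units by descending the total energy E + beta*C."""
        h = np.zeros(n_hid)
        o = np.zeros(n_out)
        for _ in range(steps):
            # leaky dynamics plus locally weighted inputs from neighboring layers
            dh = -h + rho(x) @ W1 + rho(o) @ W2.T
            do = -o + rho(h) @ W2
            if beta > 0.0:           # nudging term -beta * dC/do with C = 0.5*||y - o||^2
                do += beta * (y - o)
            h = rho(h + dt * dh)
            o = rho(o + dt * do)
        return h, o

    x = rng.random(n_in)             # toy input
    y = np.array([1.0, 0.0])         # toy target
    beta, lr = 0.5, 0.05

    for epoch in range(100):
        h0, o0 = relax(x, y, beta=0.0)     # free phase
        hb, ob = relax(x, y, beta=beta)    # nudged phase
        # Local contrastive updates: difference of pre/post activity products in the two phases
        W2 += lr / beta * (np.outer(rho(hb), rho(ob)) - np.outer(rho(h0), rho(o0)))
        W1 += lr / beta * (np.outer(rho(x), rho(hb)) - np.outer(rho(x), rho(h0)))

    print("output after training:", relax(x, y, beta=0.0)[1])

Note that each update to W1 or W2 reads only the activations of the two layers that the weight connects, so no global error signal has to be transported backward through the network, which is the point of contrast with backpropagation made in the talk.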
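
The metaplastic binarized-network result can be sketched in the same spirit. The snippet below assumes the consolidation rule described in reference (5): each binary weight sign(W_hid) is backed by a hidden real-valued variable, and gradient steps that would push that hidden variable back toward zero (and hence, eventually, flip the binary weight) are attenuated by a factor 1 - tanh^2(m * W_hid), so strongly consolidated synapses resist being overwritten when a new task is learned. The function name, the value of m, and the toy numbers are assumptions for illustration.

    # Minimal sketch of a metaplastic update for binarized synapses (assumed toy setup).
    import numpy as np

    def metaplastic_step(W_hid, grad, lr=0.01, m=1.3):
        """One update of the hidden real-valued weights behind a binarized layer."""
        step = -lr * grad
        # An update is "destabilizing" when it moves the hidden weight toward zero.
        toward_zero = (np.sign(step) != np.sign(W_hid)) & (W_hid != 0.0)
        attenuation = np.where(toward_zero, 1.0 - np.tanh(m * W_hid) ** 2, 1.0)
        return W_hid + attenuation * step

    # Toy usage: only the signs of the hidden weights are used in the forward pass.
    W_hid = np.array([2.0, -0.1, 0.5])       # one well-consolidated, two weak synapses
    grad = np.array([0.8, -0.8, 0.8])        # illustrative gradients from a new task
    W_hid = metaplastic_step(W_hid, grad)
    W_bin = np.sign(W_hid)                   # binary weights seen by the network
    print(W_hid, W_bin)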
