Paper: | PS-2B.2 |
Session: | Poster Session 2B |
Location: | Symphony/Overture |
Session Time: | Friday, September 7, 19:30 - 21:30 |
Presentation Time: | Friday, September 7, 19:30 - 21:30 |
Presentation: | Poster |
Publication: | 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania |
Paper Title: | Ghost Units Yield Biologically Plausible Backprop in Deep Neural Networks |
DOI: | https://doi.org/10.32470/CCN.2018.1211-0 |
Authors: | Thomas Mesnard, Gaëtan Vignoud, Montreal Institute for Learning Algorithms, Canada; Walter Senn, University of Bern, Switzerland; Yoshua Bengio, Montreal Institute for Learning Algorithms, Canada |
Abstract: | In the past few years, deep learning has transformed artificial intelligence research and led to impressive performance on a variety of difficult tasks. However, it remains unclear how the brain could perform credit assignment as efficiently as backpropagation does in deep neural networks. In this paper, we introduce a model that relies on a new type of neuron, the Ghost Unit, which enables the network to backpropagate errors and perform efficient credit assignment in deep structures. While requiring very few biological assumptions, the model achieves strong performance on classification tasks. Error backpropagation occurs through the network dynamics themselves, via biologically plausible local learning rules; in particular, the model does not require separate feedforward and feedback circuits. This model is therefore a step towards understanding how learning and memory are implemented in multilayer cortical structures, and it also raises interesting questions about neuromorphic hardware applications. |
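The abstract describes credit assignment carried out by the network dynamics with purely local learning rules. The Ghost Unit equations themselves are only in the manuscript and are not reproduced here; as a loose, illustrative stand-in, the sketch below implements feedback alignment, a well-known local scheme in which errors travel back through fixed random weights rather than the transposed forward weights, and compares the resulting hidden-layer update direction with exact backpropagation. All variable names, layer sizes, and the learning rate are assumptions made for this example, not values from the paper.

```python
# Illustrative sketch only (NOT the paper's Ghost Unit model): contrast exact backprop
# with a feedback-alignment-style local rule, one standard way to approximate
# backprop without weight transport. Sizes and names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer network: x -> h -> y_hat
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0, 0.5, (n_hid, n_in))
W2 = rng.normal(0, 0.5, (n_out, n_hid))
B = rng.normal(0, 0.5, (n_hid, n_out))   # fixed random feedback weights (not W2.T)

def forward(x):
    h = np.tanh(W1 @ x)
    y_hat = W2 @ h
    return h, y_hat

x = rng.normal(size=n_in)
y = rng.normal(size=n_out)

h, y_hat = forward(x)
e = y_hat - y                              # output error (squared-error loss gradient)

# Exact backprop: the error is sent back through W2.T (requires weight transport).
delta_bp = (W2.T @ e) * (1 - h**2)

# Feedback alignment: the error is sent back through fixed random weights B instead.
delta_fa = (B @ e) * (1 - h**2)

# Both schemes yield purely local updates of the form (post error) x (pre activity).
lr = 0.05
dW2 = -lr * np.outer(e, h)
dW1_fa = -lr * np.outer(delta_fa, x)

cos = delta_bp @ delta_fa / (np.linalg.norm(delta_bp) * np.linalg.norm(delta_fa))
print("cosine similarity between backprop and local (FA) hidden deltas:", cos)
```

In feedback alignment the forward weights tend to align with the fixed feedback weights during training, so the local updates come to point in a direction that correlates with the true gradient; the paper's Ghost Units address the same credit-assignment problem through a different mechanism described in the manuscript.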