Paper: PS-1B.12
Session: Poster Session 1B
Location: Symphony/Overture
Session Time: Thursday, September 6, 18:45 - 20:45
Presentation Time: Thursday, September 6, 18:45 - 20:45
Presentation: Poster
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: Learning long-range spatial dependencies with horizontal gated-recurrent units
DOI: https://doi.org/10.32470/CCN.2018.1116-0
Authors: Drew Linsley, Junkyung Kim, Vijay Veerabadran, Thomas Serre, Brown University, United States
Abstract:
Progress in deep learning has spawned great successes in many engineering applications. As a prime example, convolutional neural networks, a type of feedforward neural network, are now approaching -- and sometimes even surpassing -- human accuracy on a variety of visual recognition tasks. Here, however, we show that these neural networks and their recent extensions struggle in recognition tasks where co-dependent visual features must be detected over long spatial ranges. We introduce the horizontal gated-recurrent unit (hGRU) to learn intrinsic horizontal connections -- both within and across feature columns. We demonstrate that a single hGRU layer matches or outperforms all tested feedforward hierarchical baselines, including state-of-the-art architectures with orders of magnitude more free parameters. We further discuss the biological plausibility of the hGRU in comparison to anatomical data from the visual cortex, as well as human behavioral data on a classic contour detection task.
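To make the idea of gated horizontal connections concrete, the following is a minimal NumPy sketch of a GRU-style recurrent update whose candidate state pools activity from neighboring hidden units through a shared spatial kernel. This is an illustrative toy only, not the authors' implementation: the kernel size, the single shared gate parameter `u_gate`, and the single-channel hidden state are all assumptions made for brevity (the published hGRU uses separate inhibitory and excitatory stages and learned per-channel gain and mix gates).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def horizontal_conv(h, kernel):
    """Zero-padded 2D cross-correlation: each unit pools gated
    activity from its spatial neighborhood (the 'horizontal' input)."""
    k = kernel.shape[0]
    pad = k // 2
    hp = np.pad(h, pad)
    out = np.empty_like(h)
    for i in range(h.shape[0]):
        for j in range(h.shape[1]):
            out[i, j] = np.sum(hp[i:i + k, j:j + k] * kernel)
    return out

def hgru_step(x, h, w_horizontal, u_gate):
    """One recurrent timestep of a simplified, GRU-like update.
    Illustrative assumption: one gate parameter is shared between the
    gain and mix gates; the real hGRU learns these separately."""
    g = sigmoid(u_gate * h)                   # gate neighbor contributions
    c = horizontal_conv(g * h, w_horizontal)  # long-range spatial input
    h_tilde = np.tanh(x + c)                  # candidate hidden state
    m = sigmoid(u_gate * h_tilde)             # mix old and new states
    return (1.0 - m) * h + m * h_tilde

# Toy usage: a 16x16 feature map settles over a few recurrent timesteps.
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))
h = np.zeros((16, 16))
kernel = rng.standard_normal((5, 5)) * 0.1    # assumed 5x5 horizontal kernel
for _ in range(4):
    h = hgru_step(x, h, kernel, u_gate=1.0)
```

Because the candidate state is squashed by `tanh` and mixed convexly with the previous state, the hidden activity stays bounded across timesteps while neighborhood information propagates further with each iteration.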