Paper: GS-5.1
Session: Contributed Talks V
Location: Ormandy
Session Time: Friday, September 7, 11:00 - 12:00
Presentation Time: Friday, September 7, 11:00 - 11:20
Presentation: Oral
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: Neural Population Control via Deep ANN Image Synthesis
DOI: https://doi.org/10.32470/CCN.2018.1222-0
Authors: Pouya Bashivan, Kohitij Kar, James DiCarlo, MIT, United States
Abstract: Specific deep feed-forward artificial neural networks (ANNs) constitute our current best understanding of the primate ventral visual stream and the core object recognition behavior it supports. Here we turn things around and ask: can we use these ANN models to synthesize images that control neural activity? We test this control in cortical area V4 under two control goals. (i) Single neuron state control: "stretch" the maximal firing rate of any single neural site beyond its naturally occurring maximal rate. (ii) Population state control: independently and simultaneously control all neural sites in a population. Specifically, we aimed to push the recorded population into a "one hot" state in which one neural site is active and all others are clamped at baseline. We report that, using ANN-driven image synthesis, we can drive the firing rates of most V4 neural sites beyond naturally occurring levels, and that V4 neural sites with overlapping receptive fields can be partly, but not yet perfectly, independently controlled. These results are the strongest test thus far of deep ANN models of the ventral stream, and they show how these models could be used to set desired brain states.