TY  - JOUR
T1  - Visual motion computation in recurrent neural networks
JF  - bioRxiv
DO  - 10.1101/099101
SP  - 099101
AU  - Pachitariu, Marius
AU  - Sahani, Maneesh
Y1  - 2017/01/01
UR  - http://biorxiv.org/content/early/2017/01/09/099101.abstract
N2  - Populations of neurons in primary visual cortex (V1) transform direct thalamic inputs into a cortical representation which acquires new spatio-temporal properties. One of these properties, motion selectivity, has not been strongly tied to putative neural mechanisms, and its origins remain poorly understood. Here we propose that motion selectivity is acquired through the recurrent mechanisms of a network of strongly connected neurons. We first show that a bank of V1 spatiotemporal receptive fields can be generated accurately by a network which receives only instantaneous inputs from the retina. The temporal structure of the receptive fields is generated by the long-timescale dynamics associated with the high-magnitude eigenvalues of the recurrent connectivity matrix. When these eigenvalues have complex parts, they generate receptive fields that are inseparable in time and space, such as those tuned to motion direction. We also show that the recurrent connectivity patterns can be learnt directly from the statistics of natural movies using a temporally asymmetric Hebbian learning rule. Probed with drifting grating stimuli and moving bars, neurons in the model show patterns of responses analogous to those of direction-selective simple cells in primary visual cortex. These computations are enabled by a specific pattern of recurrent connections that can be tested by combining connectome reconstructions with functional recordings. Author summary: Dynamic visual scenes provide our eyes with enormous quantities of visual information, particularly when the visual scene changes rapidly. Even at modest movement speeds, individual small objects quickly change their location, causing single points in the scene to change their luminance equally fast. Furthermore, our own movements through the world add to the velocities of objects relative to our retinas, further increasing the speed at which visual inputs change. How can a biological system efficiently process such vast amounts of information while keeping track of objects in the scene? Here we formulate and analyze a solution that is enabled by the temporal dynamics of networks of neurons.
ER  -