Abstract
The human brain continuously processes streams of visual input. Yet, a single image typically triggers neural responses that extend beyond one second. To understand how such a slow computation copes with real-time processing, we recorded subjects’ electrical brain activity while they watched ~5,000 rapidly changing images. First, we show that each image can be decoded from brain activity for ~1 sec, and demonstrate that the brain simultaneously represents multiple images at each time instant. Second, dynamical system modeling reveals that these sustained representations can be explained by a specific chain of neural circuits, which consists of (i) a hidden maintenance mechanism and (ii) an observable update mechanism. Third, this neural architecture is localized along the expected visual pathways. Finally, we show that the propagation of low-level representations across the visual hierarchy is a principle shared with deep convolutional networks. Together, these findings provide a general neural mechanism to simultaneously represent successive visual events.
Significance
Our retinas are continuously bombarded with a rich flux of visual input. How our brain continuously processes such a visual stream is a major challenge to neuroscience. Here, we developed techniques to decode and track, from human brain activity, multiple images flashed in rapid succession. Our results show that the brain simultaneously represents multiple successive images at each time instant. These findings are explained by a hierarchy of neural assemblies that continuously propagates multiple visual contents. Overall, this study sheds new light on the biological basis of our visual experience.