PT  - JOURNAL ARTICLE
AU  - James C.R. Whittington
AU  - Rafal Bogacz
TI  - Learning in cortical networks through error back-propagation
AID - 10.1101/035451
DP  - 2015 Jan 01
TA  - bioRxiv
PG  - 035451
4099 - http://biorxiv.org/content/early/2015/12/28/035451.short
4100 - http://biorxiv.org/content/early/2015/12/28/035451.full
AB  - To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of the cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is error back-propagation, which has been used successfully both in machine learning and in modelling the brain's cognitive functions. However, in the back-propagation algorithm, the change in a synaptic weight is a complex function of the weights and activities of neurons not directly connected with the synapse being modified. Hence it has not been known whether it can be implemented by biological neural networks. Here we analyse the relationship between the back-propagation algorithm and the predictive coding model of information processing in the cortex. We show that when the predictive coding model is used for supervised learning, it performs computations very similar to those of the back-propagation algorithm. Furthermore, for certain parameters, the weight changes in the predictive coding model converge to those of the back-propagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of the hierarchy are modified to minimize the error on the output.

Author Summary - When an animal learns from feedback, it should minimize its error, i.e., the difference between the desired and the produced behavioural output. Neuronal networks in the cortex are organized in multiple levels of hierarchy, and to best minimize such errors, the strength of synaptic connections should be modified not only for the neurons that directly produced the output, but also on other levels of the cortical hierarchy. Theoretical work proposed an algorithm for such synaptic weight updates known as error back-propagation. However, in this algorithm the changes in weights are computed on the basis of the activity of many neurons not directly connected with the synapses being modified, so it has not been known whether such a computation could be performed by biological networks of neurons. Here we show how the weight changes required by the back-propagation algorithm could be achieved in a model with realistic synaptic plasticity. Our results suggest how cortical networks may learn efficiently from feedback.
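The abstract's central claim, that relaxing a predictive coding network to equilibrium and then applying purely Hebbian weight updates approximates the back-propagation weight changes, can be illustrated numerically. The sketch below is not the authors' simulation code: the two-weight-layer architecture, tanh nonlinearity, unit error variances, Euler step size, and the small target perturbation are all illustrative assumptions, chosen to sit in the small-output-error regime in which, per the abstract, the two rules converge.

import numpy as np

rng = np.random.default_rng(0)
f, fp = np.tanh, lambda v: 1.0 - np.tanh(v) ** 2

# Tiny chain of value nodes: x0 (input) -> x1 (hidden) -> x2 (output).
# Sizes and weight scales are arbitrary illustration values.
n0, n1, n2 = 3, 4, 2
W1 = rng.normal(0.0, 0.5, (n1, n0))
W2 = rng.normal(0.0, 0.5, (n2, n1))

x0 = rng.normal(size=n0)                 # input pattern (clamped)
h = W1 @ f(x0)                           # feedforward pass
y = W2 @ f(h)
target = y + 0.01 * rng.normal(size=n2)  # target close to the prediction:
                                         # the regime where the rules agree

# --- Back-propagation weight changes for a squared-error loss ---
delta2 = target - y                      # output-layer error
delta1 = (W2.T @ delta2) * fp(h)         # error propagated one level down
dW2_bp = np.outer(delta2, f(h))
dW1_bp = np.outer(delta1, f(x0))

# --- Predictive coding: relax the hidden nodes, then update Hebbianly ---
# Node dynamics follow gradient descent on the summed squared prediction
# errors, with the input and output layers clamped (a standard predictive
# coding formulation; assumed here, not taken verbatim from the paper).
x1 = h.copy()                            # start from the feedforward value
for _ in range(2000):                    # Euler integration to equilibrium
    eps1 = x1 - W1 @ f(x0)               # prediction error, hidden layer
    eps2 = target - W2 @ f(x1)           # prediction error, clamped output
    x1 += 0.1 * (-eps1 + fp(x1) * (W2.T @ eps2))
eps1 = x1 - W1 @ f(x0)
eps2 = target - W2 @ f(x1)
dW2_pc = np.outer(eps2, f(x1))           # Hebbian: error node x presynaptic rate
dW1_pc = np.outer(eps1, f(x0))

# The two sets of weight changes should nearly coincide.
print(np.max(np.abs(dW2_pc - dW2_bp)), np.max(np.abs(dW1_pc - dW1_bp)))

At the equilibrium of the node dynamics the hidden-layer prediction error satisfies eps1 = fp(x1) * (W2.T @ eps2), which for a small output error reduces to the back-propagated term delta1, so the printed differences should shrink as the target is brought closer to the feedforward prediction, consistent with the convergence claim in the abstract.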