Dynamics of networks of spiking neurons
We have introduced models of networks of integrate-and-fire neurons composed of two populations (excitatory and inhibitory) with random connectivity, and analyzed their dynamics [11,13,20]. We used an analytical framework in which each population is described by its distribution of membrane potentials, together with its instantaneous firing probability [19,20]. A crucial feature of this formalism is that it takes into account the temporal fluctuations in the inputs received by neurons, which are essential to capture the phenomenology of electrophysiological recordings in cortex (highly irregular activity at low rates). In network states consistent with these data, mean inputs to neurons are well below the firing threshold, and firing is induced by fluctuations around these mean inputs (the so-called fluctuation-driven regime). The formalism can also account for the emergence of synchronous population oscillations in such networks [19,20]. In the inhibition-dominated regime, these oscillations tend to be fast (frequencies from the gamma to the ultra-fast range). The analysis identified a regime in which synchronous population oscillations coexist with highly irregular firing of single neurons, at rates much lower than the population oscillation frequency. In vivo data recorded during oscillatory episodes in cortex and hippocampus support this stochastic oscillatory model, since single-neuron firing rates are typically lower than the population frequency.
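A minimal simulation illustrates the fluctuation-driven regime described above: a single leaky integrate-and-fire neuron receives a mean input held below threshold, so spikes are triggered only by Gaussian fluctuations around that mean. All parameter values here are illustrative choices, not values taken from the cited studies.

```python
import numpy as np

# Fluctuation-driven regime sketch: mean input mu sits below threshold v_th,
# and firing is caused by noise-driven excursions across threshold.
rng = np.random.default_rng(0)

tau_m = 20.0                 # membrane time constant (ms)
v_th, v_reset = 20.0, 10.0   # spike threshold and reset potential (mV)
mu = 15.0                    # mean input (mV), deliberately sub-threshold
sigma = 5.0                  # fluctuation amplitude (mV)
dt = 0.1                     # time step (ms)
T = 100_000                  # number of steps (10 s of simulated time)

v = 0.0
spike_times = []
for step in range(T):
    noise = sigma * np.sqrt(2.0 * dt / tau_m) * rng.standard_normal()
    v += dt * (mu - v) / tau_m + noise
    if v >= v_th:
        spike_times.append(step * dt)
        v = v_reset

isis = np.diff(spike_times)
rate = len(spike_times) / (T * dt / 1000.0)  # firing rate in spikes/s
cv = isis.std() / isis.mean()                # CV of interspike intervals
```

With the mean input one standard deviation below threshold, the neuron fires at a low rate, and the coefficient of variation of the interspike intervals is far from zero, i.e. the spike train is irregular, as in the cortical recordings mentioned above.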
Stochastic dynamics of single spiking neurons
To understand the factors that shape the neuronal response to transient inputs, we have investigated the dynamics of the instantaneous firing probability in single spiking neuron models of increasing complexity in the presence of noisy inputs [23,25,26,28,29,31,34,52,60,64]. We have characterized extensively how the instantaneous firing rate depends on the statistics of the inputs, in both static and dynamic conditions. Along the way, several simplified models of intermediate complexity between the simple leaky integrate-and-fire model and Hodgkin-Huxley (HH) type models have been introduced. In particular, we introduced a simplified non-linear one-variable model, the exponential integrate-and-fire (EIF) model, that accurately describes the spike generation dynamics of HH-type models. This model allowed us to understand the factors that limit the speed of the neuronal response to fast transient inputs. The EIF model has subsequently been shown to fit well the dynamics of action potential generation in both cortical pyramidal cells and fast-spiking interneurons, and it forms the basis of adaptive EIF models, which are increasingly used in network models, since they combine simplicity, the ability to fit electrophysiological data quantitatively, and the ability to produce a large diversity of electrophysiological behaviors. We have also shown that a population of EIF neurons can be reduced mathematically, in the strong noise regime, to a firing rate model.
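The spike-generation mechanism of the EIF model can be sketched in a few lines: the membrane equation is linear except for an exponential term that takes over near a soft threshold and produces the spike upswing. The parameter values below are illustrative, not fitted to any particular cell.

```python
import numpy as np

# Exponential integrate-and-fire (EIF) sketch:
#   tau_m * dV/dt = -(V - E_L) + delta_T * exp((V - V_T)/delta_T) + I
# The exponential term approximates the sodium spike-initiation current of
# HH-type models; delta_T -> 0 recovers the standard leaky IF model.
E_L = -65.0      # resting potential (mV)
V_T = -50.0      # soft spike-initiation threshold (mV)
delta_T = 1.5    # spike sharpness (mV)
tau_m = 10.0     # membrane time constant (ms)
V_cut = 0.0      # numerical cutoff at which a spike is registered (mV)
V_reset = -70.0  # reset potential after a spike (mV)
dt = 0.01        # time step (ms)
I = 20.0         # constant suprathreshold input (mV)

v = E_L
n_spikes = 0
for _ in range(int(200.0 / dt)):  # 200 ms of simulated time
    dv = (-(v - E_L) + delta_T * np.exp((v - V_T) / delta_T) + I) / tau_m
    v += dt * dv
    if v >= V_cut:               # exponential runaway = spike; reset
        n_spikes += 1
        v = V_reset
```

Once the voltage passes the soft threshold `V_T`, the exponential term dominates and the voltage diverges in finite time, which is why a numerical cutoff plus reset stands in for the downstroke of the action potential.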
Mechanisms of persistent activity in cortical circuits
In parallel with the study of the dynamics of unstructured networks, we have investigated the dynamics of networks that are structured by learning, with the aim of better understanding phenomena observed in the cortex of behaving primates during delayed response tasks, such as selective persistent activity [11,13,21,22,24]. Selective persistent activity has been hypothesized to be the substrate of working memory in cortical circuits. We have built network models of spiking neurons in which persistent activity is due to multi-stability of the network, which arises from a combination of potentiated connectivity between selective sub-groups of pyramidal cells and strong global inhibition. We also studied how temporal associations between stimuli can be learned in such networks, and how association learning can lead to the phenomenon of prospective activity (an increase in activity of neurons selective for stimuli that the animal expects to occur after the delay period).
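The bistability mechanism behind persistent activity can be illustrated with a toy firing-rate model, a deliberately reduced stand-in for the spiking networks above: strong recurrent excitation within a selective sub-population makes both a low (spontaneous) and a high activity state stable, so a transient cue leaves a persistent trace. The gain function and all parameters are illustrative.

```python
import numpy as np

# Bistable rate-model sketch of persistent activity:
#   tau * dr/dt = -r + phi(w * r + I_ext)
# with strong recurrent weight w (potentiated connectivity). A transient
# cue switches the population from the low to the high fixed point, and
# the high state persists after the cue is withdrawn.
def phi(x):
    return 1.0 / (1.0 + np.exp(-(x - 3.0)))  # sigmoidal transfer function

w = 6.0      # recurrent excitatory strength
tau = 10.0   # rate time constant (ms)
dt = 0.1     # time step (ms)

def simulate(stimulus_on):
    r = 0.01  # start in the low, spontaneous-activity state
    for step in range(int(1000.0 / dt)):       # 1 s of simulated time
        t = step * dt
        I_ext = 4.0 if (stimulus_on and 100.0 <= t < 200.0) else 0.0
        r += dt * (-r + phi(w * r + I_ext)) / tau
    return r  # rate long after the stimulus has been removed

r_no_cue = simulate(False)  # remains in the low state
r_cued = simulate(True)     # jumps to, and stays in, the high state
```

The cue only lasts 100 ms, yet the final rates differ: the memory of the stimulus is carried by which of the two stable fixed points the population occupies, which is the essence of the multi-stability mechanism described above.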
Synaptic plasticity rules
We have followed three complementary approaches to advance our understanding of synaptic plasticity. The first approach consists in building a synaptic plasticity model from in vitro data. We introduced a simplified synaptic plasticity model based on post-synaptic calcium concentration, which can reproduce the diversity of spike-timing dependent plasticity (STDP) curves (how synaptic efficacy changes as a function of the timing difference between pre- and post-synaptic spikes) seen in experiments, varying only two parameters that characterize the amplitude of the calcium transients induced by pre- and post-synaptic spikes. The second approach consists in inferring plasticity rules from in vivo data. We have been using electrophysiological data from behaving monkeys (visual responses of IT neurons to large sets of novel and highly familiar stimuli). We introduced a method that allowed us to infer directly the learning rule (more precisely, the dependence of the rule on post-synaptic firing rate) that produces the observed changes in response statistics, from the corresponding two distributions of visual responses (to novel and familiar stimuli, respectively). The third approach is to derive learning rules from optimality principles. I have derived plasticity rules that can approach the maximal storage capacity in several contexts [38,65].
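The calcium-based rule can be sketched qualitatively as follows: each pre- or post-synaptic spike adds a calcium transient, and the weight is potentiated while calcium exceeds a high threshold and depressed while it exceeds a lower one. The parameter values below are illustrative choices that reproduce the standard STDP ordering (pre-before-post potentiates, post-before-pre depresses); they are not the fitted values of the published model.

```python
# Calcium-threshold plasticity sketch: calcium c decays with tau_ca, jumps
# by C_pre / C_post on pre- / post-synaptic spikes, and the weight changes
# at rate gamma_p above theta_p (potentiation) minus gamma_d above theta_d
# (depression). All numbers are illustrative.
tau_ca = 20.0                    # calcium decay time constant (ms)
C_pre, C_post = 0.55, 1.0        # calcium jump per pre / post spike
theta_d, theta_p = 0.9, 1.2      # depression / potentiation thresholds
gamma_d, gamma_p = 50.0, 300.0   # depression / potentiation rates
dt = 0.1                         # time step (ms)

def weight_change(delta_t):
    """Net weight change for one pre/post pairing at lag delta_t (ms);
    delta_t > 0 means the pre-synaptic spike precedes the post spike."""
    t_pre, t_post = (0.0, delta_t) if delta_t > 0 else (-delta_t, 0.0)
    c, dw = 0.0, 0.0
    for step in range(int(200.0 / dt)):
        t = step * dt
        if abs(t - t_pre) < dt / 2:    # pre spike: calcium jump
            c += C_pre
        if abs(t - t_post) < dt / 2:   # post spike: calcium jump
            c += C_post
        dw += dt * (gamma_p * (c > theta_p) - gamma_d * (c > theta_d))
        c -= dt * c / tau_ca           # calcium decay
    return dw

ltp = weight_change(+10.0)  # pre 10 ms before post -> net potentiation
ltd = weight_change(-10.0)  # post 10 ms before pre -> net depression
```

Only the pre-before-post ordering pushes the combined calcium transient above the potentiation threshold for long enough to outweigh depression, which is how a single calcium variable with two thresholds generates a timing-dependent STDP curve.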
Statistics of networks optimizing information storage
What are the consequences of optimizing the amount of stored information, or its robustness, on the statistics of neuronal connectivity? We computed analytically the distribution of synaptic weights in a network that maximizes the amount of stored information, subject to a robustness constraint ensuring that the information can be retrieved in the presence of noise. When information is stored in excitatory synapses, we showed that the distribution contains a large, finite fraction of exactly zero synaptic weights, which can be interpreted either as ‘silent’ synapses or as ‘potential’ synapses. The fraction of ‘silent’ or ‘potential’ synapses can be shown to be always larger than 0.5, and it increases with the degree of robustness of information storage. We also showed that this distribution fits observed distributions of synaptic weights in Purkinje cells and cortical pyramidal cells. We have also computed the joint distributions of pairs of synaptic weights in recurrent networks that maximize the number of fixed-point attractors, or the number of stored sequences of activity. Networks that maximize the number of fixed-point attractors show a strong over-representation of reciprocally connected pairs of excitatory neurons, in agreement with data from several multiple intracellular recording studies in cortical slices, whereas networks that maximize the number of stored sequences show no such over-representation.
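The emergence of exactly zero weights under a sign constraint can be illustrated with a toy perceptron: a unit with purely excitatory (non-negative) weights learns random input/output associations, and updates that would drive a weight negative are clipped at zero. Sizes, load, and learning rate are illustrative; the analytical result quoted above (a silent fraction above one half) concerns the optimal solution near maximal capacity, which this simple learning sketch only approximates.

```python
import numpy as np

# Sign-constrained storage sketch: a perceptron with non-negative weights
# learns P random associations over N inputs. Clipping at zero during
# learning leaves a fraction of weights exactly silent.
rng = np.random.default_rng(1)

N, P = 100, 60                        # inputs and stored associations
X = rng.standard_normal((P, N))       # random input patterns
y = rng.choice([-1.0, 1.0], size=P)   # desired binary outputs

w = np.full(N, 0.1)                   # small uniform excitatory initial weights
lr = 0.05
for epoch in range(500):
    errors = 0
    for mu in range(P):
        if y[mu] * (X[mu] @ w) <= 0:  # pattern mu misclassified
            # perceptron update, clipped to keep weights excitatory
            w = np.maximum(w + lr * y[mu] * X[mu], 0.0)
            errors += 1
    if errors == 0:                   # all associations stored
        break

frac_zero = float(np.mean(w == 0.0))  # fraction of exactly silent synapses
```

The clipping step is what generates a point mass at zero in the weight distribution: weights that the task pushes toward negative values pile up at the excitatory bound, mirroring the ‘silent’ synapses of the analytical calculation.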