Firing rate model (mean-field theory) for a neural population

Mean-field theory for a neural population is the lowest-order macroscopic description of a neural network. By defining the firing-rate probability density of the network, a dynamical equation can be written to describe the system. In the usual case there are two important elements in this equation: the time scale and the gain function. To explain the microscopic correspondence, I introduce the central idea of the derivation.
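As a rough sketch (notation varies between textbooks), the dynamical equation typically takes the first-order form

$$ \tau \frac{dA}{dt} = -A(t) + F\big(h(t)\big), $$

where $A(t)$ is the population activity (firing rate), $\tau$ the time scale, $F$ the gain function, and $h(t)$ the mean input to the population.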

First, one must know that there are two coding strategies in time for a neural system: temporal coding and rate coding. According to Taillefumier and Magnasco's work, there is a transition between them depending on how "noisy" the input is compared to the intrinsic noise that makes the system a random walker. In the rate-coding regime, one can use a firing rate to describe a neuron or a neural network, and a linear-nonlinear model can link the stimulus to the response (the firing rate). In a network, a mean-field theory describes the neural population through its firing rate. In the stationary state, where every neuron fires as a Poisson process with the same rate, the mean firing rate can be found from the gain function. For the dynamical equation there are many derivations; one can read the famous Wilson-Cowan model or Chaps. 12-14 of Gerstner et al.'s book. In a word, the time constant is the natural time scale of a single neuron, and the gain function is related to the degree distribution as well as to the firing rate-input current relation. (In an all-to-all network, the gain function is simply the firing rate-input current relation.)
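To make this concrete, here is a minimal Python sketch (not taken from the slides; the parameter values are arbitrary assumptions) that integrates the rate equation above for a single all-to-all population with a sigmoid gain, so the stationary rate is the fixed point $A^* = F(wA^* + I_{\text{ext}})$:

```python
import numpy as np

def sigmoid_gain(h, f_max=100.0, beta=0.5, theta=10.0):
    """Sigmoid gain function: saturates at f_max for large input."""
    return f_max / (1.0 + np.exp(-beta * (h - theta)))

def simulate_population(I_ext=12.0, w=0.05, tau=0.01, dt=1e-4, t_end=0.5):
    """Euler integration of tau dA/dt = -A + F(w*A + I_ext)."""
    steps = int(t_end / dt)
    A = np.zeros(steps)
    for t in range(1, steps):
        h = w * A[t - 1] + I_ext          # mean input: recurrent + external
        A[t] = A[t - 1] + dt / tau * (-A[t - 1] + sigmoid_gain(h))
    return A

if __name__ == "__main__":
    A = simulate_population()
    print(f"stationary population rate ~ {A[-1]:.2f} Hz")
```

The activity relaxes to the fixed point on the time scale $\tau$, which is the sense in which the time constant is the natural time scale of a single neuron.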

As a practical matter, one treats the dynamical equation as a phenomenological model, and several gain functions are in common use: linear, threshold-linear, or the most physiological one, the sigmoid. Keep in mind that the first two are only low-activity approximations, since a neuron cannot fire infinitely fast; there must be a cutoff in the high-activity regime. Last but not least, Jack D. Cowan established a statistical field theory for neural networks that can also account for fluctuations and for the correlations between cells or between populations. One can find the history in Chap. 2 of his recently published book.
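For reference, the three gain functions mentioned above are commonly written as

$$ F_{\text{lin}}(h) = \alpha h, \qquad F_{\text{thr}}(h) = \alpha\,[h-\theta]_{+}, \qquad F_{\text{sig}}(h) = \frac{F_{\max}}{1 + e^{-\beta (h-\theta)}}, $$

where $[x]_{+} = \max(x, 0)$; only the sigmoid saturates at $F_{\max}$, which is why the first two should be trusted only in the low-activity regime.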

One can find my slide here.
