
Computational Neuroscience course at BCCN-Göttingen 2009 (Part 1)


Last week (Sep 21st – 25th, 2009), I participated in a five-day computational neuroscience course organized by the Bernstein Center for Computational Neuroscience. It was the 7th fall CNS course they have organized at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany. The topics were mostly focused on theoretical approaches to modeling and on spontaneous activity in vivo. Each lecturer spoke for 3 hours, after which students had to study papers and present them.

Day 1: Dr. Jason Kerr: Population imaging in vivo: from the awake to the anesthetized

Using in vivo calcium imaging combined with fast two-photon imaging, he showed how biased we are in observing neural activity. Traditional extracellular recording is biased toward high-firing-rate, well-tuned neurons, because the experimentalist looks for such cells. The hypotheses of rate coding and population tuning are mostly based on this biased sampling of cortical neurons. Similarly, blind patch clamp tends to find neurons with larger somata. He presented a figure (Figure 9 from [1]) showing how different cortical layers naturally have different firing rate profiles; in general, layers 4 and 6 have higher rates. Imaging techniques can sample the population with less bias, and he showed a method to extract action potential timings at a resolution of 64–128 ms from calcium transient signals. (His group has overcome interesting engineering challenges.) They combined patch clamp and calcium imaging to obtain ground-truth training data, then applied the method to other neurons to obtain the population activity (see [2] for details). He also showed a miniature imaging backpack system (a fiberscope) mounted on rodents for semi-freely behaving animals. In the second part of his lecture, he presented analyses of barrel cortex imaging. Interestingly, the correlation structures found in spontaneous and evoked activity were very similar, and no spatial pattern appeared consistently (15 neurons, time-binned with a 128 ms window).
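The general idea of spike inference from calcium transients can be illustrated with a toy sketch. This is not the published method in [2]; it assumes a simple first-order autoregressive model of the calcium signal, and the decay constant `gamma` and detection `threshold` are hypothetical parameters chosen for illustration only.

```python
import numpy as np

def infer_spikes(calcium, gamma=0.9, threshold=0.5):
    """Toy spike inference: invert the AR(1) model c[t] = gamma*c[t-1] + s[t],
    then threshold the residual (the deconvolved 'innovation')."""
    residual = calcium[1:] - gamma * calcium[:-1]
    spikes = np.zeros_like(calcium, dtype=bool)
    spikes[1:] = residual > threshold
    return spikes

# Synthetic test trace: sparse spikes driving an exponential calcium transient.
rng = np.random.default_rng(0)
true_spikes = rng.random(500) < 0.02            # ~2% spike probability per bin
calcium = np.zeros(500)
for t in range(1, 500):
    calcium[t] = 0.9 * calcium[t - 1] + true_spikes[t]
calcium += 0.05 * rng.standard_normal(500)      # measurement noise

est = infer_spikes(calcium)
```

With low noise this recovers the spike train almost perfectly; real data requires the kind of supervised calibration against simultaneous patch-clamp recordings that Kerr described.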

The paper assigned to my group for presentation was about silent neurons in the brain [3]. The paper was straightforward and I found myself mostly agreeing with the authors. Various lines of evidence, including the imaging results, show that more than 60% of the neuronal population of the cortex has a very low spontaneous firing rate and is ignored by conventional extracellular recording techniques. Spike sorting usually requires the same neuron to fire often enough that its spikes form a cluster; this biases the method against low-firing-rate neurons. These neurons may fire highly selectively, as in the auditory or higher visual cortices, but in general their role is not clear. Their existence suggests that temporal codes, rather than rate codes, may be of high importance for understanding them. This was somewhat surprising given the presence of homeostatic gain control mechanisms and the developmental apoptosis of neurons that do not fire (I don't have references for these claims; if anybody knows them, please let me know). Arguments such as maximizing mutual information and reducing the metabolic cost of firing lead to sparser encoding of the input, so temporal codes in higher-order areas can be expected; but when I asked about this, Dr. Kerr gave the example of high-firing-rate neurons in working memory tasks.

Day 2: Dr. Benjamin Lindner: Interspike interval statistics and response properties of neurons in the fluctuation driven regime

Dr. Lindner lectured on statistics and power spectral densities of point processes, and on the diffusion approximation for the integrate-and-fire family of neurons. He also talked about renewal processes and serial correlations. A few facts that I was not familiar with: the inter-spike interval serial correlation coefficient is defined as \rho_k = \frac{\langle (I_{i+k} - \langle I_i \rangle) (I_i - \langle I_i \rangle) \rangle}{\langle (I_i - \langle I_i \rangle)^2 \rangle} and is related to the power spectral density at zero frequency by S(0) = r_0 {CV}^2 \left[ 1 + 2 \sum_{k=1}^\infty \rho_k \right], where r_0 is the mean firing rate and CV is the coefficient of variation of the (marginal) inter-spike interval density. For a renewal process, the power spectral density can be neatly expressed as S(\omega) = r_0 \frac{1 - |\tilde p(\omega)|^2}{|1 - \tilde p(\omega)|^2}, where \tilde p(\omega) is the Fourier transform of the inter-spike interval density.
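These quantities are easy to estimate from data. A minimal sketch, using a synthetic exponential (Poisson/renewal) ISI sequence where ρ_k should vanish and CV should be 1:

```python
import numpy as np

def serial_correlation(isi, k):
    """Estimate rho_k = Cov(I_{i+k}, I_i) / Var(I_i) from an ISI array."""
    d = isi - isi.mean()
    return np.mean(d[k:] * d[: len(d) - k]) / np.mean(d * d)

rng = np.random.default_rng(1)
isi = rng.exponential(scale=0.1, size=100_000)  # renewal process: iid intervals
cv = isi.std() / isi.mean()                     # CV of the ISI density
rho_1 = serial_correlation(isi, 1)
```

For this renewal sequence `rho_1` is near zero and `cv` near one, so the S(0) formula above reduces to S(0) = r_0, the flat low-frequency spectrum of a Poisson process; a non-renewal spike train would show nonzero ρ_k and a correspondingly reshaped S(0).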

Neurons in vivo are subject to various noise sources, mainly synaptic bombardment, and this can be approximated with a diffusion process. He assumed a one-dimensional conductance-based leaky integrate-and-fire neuron with Poisson input passed through first-order synaptic smoothing, and then showed a series of approximations leading to a diffusion approximation [4,5]. This results in a stochastic differential equation of the standard form C\frac{dV}{dt} = -g_0(V-E_0) + \sqrt{2D}\, \zeta(t), where g_0 is the effective conductance, D is the diffusion coefficient, and \zeta(t) is white Gaussian noise. To simulate an Ornstein-Uhlenbeck process of the form \dot{x} = f(x) + \sqrt{2D}\, \zeta(t) in discrete time steps \Delta t, one uses the recursive formula x(t+\Delta t) = x(t) + f(x(t)) \Delta t + \sqrt{2D \Delta t}\, a(t), where a(t) is a zero-mean, unit-variance Gaussian random variable. The probability density of the membrane voltage evolves according to the Fokker-Planck equation \partial_t P(x,t) = \partial_x[ -f(x)P(x,t)] + D\partial_x^2 P(x,t), where the right-hand side is the negative partial derivative of the probability current. By setting the correct boundary conditions, one can solve for the first-passage time distribution (and hence the power spectral density) and the response to periodic stimulation. Perhaps most important is the linear response approximation of the input-output firing rate relation, which works better when the noise is strong: noise linearizes the response [6].
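The recursive formula above is the Euler-Maruyama scheme, and a quick simulation checks it against a known analytical result: for the linear drift f(x) = -x/\tau, the stationary variance of the process is D\tau. The parameter values here are arbitrary, chosen only for illustration.

```python
import numpy as np

# Euler-Maruyama simulation of dx/dt = -x/tau + sqrt(2D) zeta(t),
# following the discrete-time recursion x(t+dt) = x + f(x) dt + sqrt(2 D dt) a.
rng = np.random.default_rng(2)
tau, D, dt = 0.02, 5.0, 1e-4        # membrane time constant (s), diffusion coeff., step
n_steps = 200_000

x = np.zeros(n_steps)
for t in range(n_steps - 1):
    drift = -x[t] / tau             # f(x) for the Ornstein-Uhlenbeck process
    x[t + 1] = x[t] + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal()

var_est = x[n_steps // 2 :].var()   # discard transient, estimate stationary variance
```

The estimate `var_est` should land near D·τ = 0.1; adding a threshold-and-reset rule to this loop turns it into the leaky integrate-and-fire neuron whose first-passage times the Fokker-Planck treatment describes.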

Our group's discussion paper was [7], in which the authors show that a model with interval correlations can increase information transfer, using a very cleverly chosen pair of models. Two models, A and B, are introduced with almost identical properties (firing rate and ISI distribution), except that A has serial correlations. Intuitively, since the correlations decrease the total entropy of the output spike train, the input-output mutual information, which is bounded by this entropy, should decrease. However, the nonlinearity of the neuron redistributes the power of the spontaneous activity across the spectral domain, so lower-frequency input can be passed through the system better. The final result is derived from the linear response approximation for a perfect integrate-and-fire neuron and the lower bound on the mutual information obtained from the coherence. The system is in a high-noise regime, so the decrease in output entropy mostly affects the noise, and an increase in the mutual information (lower bound) is quite plausible. I wonder how general this result is; simulations with more complex neuron models, and actual computation of the mutual information instead of the lower bound, would make an interesting follow-up study.
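The coherence-based lower bound on the mutual information rate used in [7] is I_{LB} = -\int \log_2(1 - C(f))\, df, where C(f) is the stimulus-response coherence. A minimal sketch of estimating it, with the neuron replaced by a trivial stand-in (response = stimulus plus white noise at SNR 1, so C(f) ≈ 1/2 everywhere); the sampling rate and signal lengths are arbitrary assumptions:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
fs = 1000.0                                   # sampling rate in Hz (assumed)
n = 2**16
stimulus = rng.standard_normal(n)
response = stimulus + 1.0 * rng.standard_normal(n)   # stand-in "neuron", SNR = 1

# Welch estimate of the stimulus-response coherence C(f) in [0, 1].
f, C = coherence(stimulus, response, fs=fs, nperseg=1024)

# Lower bound on the information rate: -integral of log2(1 - C(f)) df.
df = f[1] - f[0]
info_rate = -np.sum(np.log2(1.0 - C)) * df    # bits per second
```

With C(f) ≈ 1/2 over the 500 Hz band, the bound comes out near 500 bits/s. The interesting point of [7] is that noise shaping concentrates the spontaneous spectral power away from the signal band, raising C(f), and hence this bound, at low frequencies.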

  1. Sergej V. Girman, Yves Sauve, Raymond D. Lund. Receptive Field Properties of Single Neurons in Rat Primary Visual Cortex. J Neurophysiol, Vol. 82, No. 1. (1 July 1999), pp. 301-311.
  2. Jason N. D. Kerr and Winfried Denk. Imaging in vivo: watching the brain in action. Nature Reviews Neuroscience 9, 195-205 (March 2008) | doi:10.1038/nrn2338
  3. Shy Shoham, Daniel O’Connor, Ronen Segev. How silent is the brain: is there a “dark matter” problem in neuroscience? Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology, Vol. 192, No. 8. (1 August 2006), pp. 777-784  doi:10.1007/s00359-006-0117-6
  4. Lars Wolff, Benjamin Lindner. Method to calculate the moments of the membrane voltage in a model neuron driven by multiplicative filtered shot noise. Physical Review E (Statistical, Nonlinear, and Soft Matter Physics), Vol. 77, No. 4. (2008), 041913. doi:10.1103/PhysRevE.77.041913
  5. A. Burkitt. A Review of the Integrate-and-fire Neuron Model: I. Homogeneous Synaptic Input. Biological Cybernetics, Vol. 95, No. 1. (1 July 2006), pp. 1-19.
  6. L. F. Abbott, Carl van Vreeswijk. Asynchronous states in networks of pulse-coupled oscillators. Physical Review E, Vol. 48, No. 2. (1 Aug 1993), pp. 1483-1490. doi:10.1103/PhysRevE.48.1483
  7. Maurice J. Chacron, Benjamin Lindner, André Longtin. Noise Shaping by Interval Correlations Increases Information Transfer. Physical Review Letters, Vol. 92, No. 8. (25 Feb 2004), 080601. doi:10.1103/PhysRevLett.92.080601

Acknowledgement: Memming thanks Dr. John Harris for his generous travel support.
