
Interesting talks/posters from COSYNE 2010

2010/03/04

COSYNE is one of the best conferences in computational neuroscience and systems neuroscience. As I did last year, I am going to summarize my experience. This year the acceptance rate of the conference was 74% (based on abstracts alone) and around 300 posters were presented. The proceedings of the main meeting abstracts are available online. I'll be updating this post for some time as I remember more things.

Although the GLM was still a strong trend in terms of modeling (I saw more than 10 posters, many of which came from Paninski's group), more work on natural scene statistics (mostly visual), Bayesian analysis, and working memory was presented. Of course I am very biased, since my background knowledge only allows me to understand so much (I am especially lacking on decision theories).

Statistics

II-24. Michael Vidne, Yashar Ahmadian, Jonathon Shlens, Jonathan Pillow, Jayant Kulkarni, Eero P Simoncelli, E.J. Chichilnisky, Liam Paninski. A common-input model of a complete network of ganglion cells in the primate retina.
For a population of neurons, the joint structure can be described by a coupled GLM. In this work, however, they removed the coupling and instead introduced correlated common input to the system to model retinal ganglion cells (RGCs). The model makes more sense physiologically, and it seems to explain the observations better.
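To make the distinction concrete, here is a minimal sketch (not the authors' code; all parameters are made up) of a two-cell common-input model: correlations between the cells arise from a shared Gaussian input rather than from direct coupling filters.

```python
import numpy as np

rng = np.random.default_rng(0)
T, dt = 50000, 0.001                 # number of time bins, bin size (s)

# Shared, correlated Gaussian input to both cells (hypothetical parameters)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])
common = rng.multivariate_normal(np.zeros(2), cov, size=T)

base = np.log(20.0)                  # baseline log-rate (~20 Hz)
rates = np.exp(base + common)        # exponential nonlinearity of the GLM
spikes = rng.poisson(rates * dt)     # conditionally Poisson spike counts

# The cells are correlated even though there is no direct coupling term
print(np.corrcoef(spikes.T)[0, 1])
```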

II-32. Jonathan Pillow. Estimation and assessment of non-Poisson neural encoding models.
Instead of introducing a feedback term in the LNP model to create a non-Poisson model, he replaced the Poisson spiking mechanism with a renewal process. This generalization can also be fit by maximum likelihood. Interestingly, he also demonstrated that it is possible to trick the KS test on the time-rescaling-theorem-based transformation, because not only the rate but also the interval distribution can be warped in this model (IMHO, time rescaling should be done with the exact conditional intensity function to make it work). As an alternative test statistic, he uses a cross-validated log-likelihood, while I would suggest using a point process divergence. This model could be further generalized by allowing a feedback term, but that did not produce a much better fit. It seems the conventional GLM still holds its ground.
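For readers unfamiliar with the test being tricked, here is a minimal sketch of the standard time-rescaling KS test (toy rate function of my own choosing): if the conditional intensity is correct, the rescaled inter-spike intervals are unit-rate exponential, so their exponential transform should be uniform.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(1)
dt = 0.001
t = np.arange(100000) * dt
lam = 10.0 + 5.0 * np.sin(2 * np.pi * t)     # true rate (Hz), toy example
spikes = rng.random(lam.size) < lam * dt     # Bernoulli approximation to Poisson

# Rescale time by the integrated intensity between consecutive spikes
Lambda = np.cumsum(lam * dt)                 # integrated intensity
taus = np.diff(Lambda[spikes])               # rescaled inter-spike intervals
u = 1.0 - np.exp(-taus)                      # should be Uniform(0, 1)
print(kstest(u, "uniform"))                  # large p-value -> model passes
```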

III-59. Arno Onken, Steffen Grünewälder, Matthias H. J. Munk, Klaus Obermayer. A non-stationary copula-based spike count model.
A parametric copula along with a "flashlight transformation" was fit to spike count data from behaving monkey PFC, and the copula was used to infer the dependency structure. They extended their previous work, which only considered the total count in an interval, to a continuous point process. The authors assumed a basis for the rate function of each neuron as well as for the copula function, and assumed a Poisson process for fitting.
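The general copula construction is easy to illustrate. Below is a sketch using a plain Gaussian copula with Poisson marginals (the poster used a different copula family with the flashlight transformation; this only shows how a copula separates the marginals from the dependence).

```python
import numpy as np
from scipy.stats import norm, poisson

rng = np.random.default_rng(2)
rho, rates, n = 0.5, [5.0, 8.0], 10000       # made-up dependence and mean counts

# Sample from the copula: correlated Gaussians mapped through their CDF
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=n)
u = norm.cdf(z)                              # uniform marginals, still dependent

# Apply the inverse Poisson CDF to get dependent spike counts
counts = np.column_stack([poisson.ppf(u[:, i], rates[i]) for i in range(2)])
print(np.corrcoef(counts.T)[0, 1])           # dependence survives the marginals
```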

III-53. Sean Escola, Liam Paninski. Hidden Markov models for the stimulus-response relationships of multistate neural systems.
Impressive work on using GLMs to model both the Markov transition model and the conditional intensity function for each state, and on using the model to analyze state transitions in the system from spike train observations. They showed that when an appropriate state transition model is used, one can find transition-locked spike patterns that were not observed before.
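The core inference is the usual HMM machinery. Here is a minimal sketch with fixed (non-GLM) transitions and state-dependent Poisson spike counts, using the forward algorithm to filter the hidden state; their model makes both the transition probabilities and the per-state intensities stimulus-dependent GLMs.

```python
import numpy as np
from scipy.stats import poisson

A = np.array([[0.99, 0.01], [0.02, 0.98]])   # state transition matrix (toy values)
rates = np.array([2.0, 15.0])                # spikes/bin in each hidden state
counts = np.array([1, 0, 2, 14, 12, 18, 1])  # toy observed spike counts

alpha = np.ones(2) / 2                       # uniform initial state prior
for y in counts:
    alpha = (alpha @ A) * poisson.pmf(y, rates)  # predict, then weight by likelihood
    alpha /= alpha.sum()                         # normalize (filtered posterior)
print(alpha)                                 # P(state | observations so far)
```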

Workshop on The sampling hypothesis: relating neural variability to perception and learning
I had to miss many talks because of my volunteering work at the registration desk, but from the few talks I attended, the main issue seemed to be József Fiser and Máté Lengyel's theory that neuronal activities are samples from the posterior distribution, versus Alex Pouget's probabilistic population code (PPC).
Perhaps the most packed and hottest discussion of COSYNE took place over the recent paper comparing the sampling hypothesis and the PPC.

Neural coding / decoding

Special session honoring Horace Barlow‘s legacy.
He proposed "reduction of redundancy" as a principle of sensory processing a long time ago (1954). Long before unsupervised learning principles based on information theory appeared, he was talking about natural statistics and the neural system as a redundancy reducer. His theories have inspired much of the work we see these days; I feel ashamed that I was ignorant of this great body of work. Olshausen gave a talk about redundancy reduction while maintaining a rich feature space, and about highly overcomplete yet sparse representations in the visual system. He mentioned how dynamic changes in a sparse overcomplete representation can tell you about invariant features.

II-61. Sameet Sreenivasan, Ila Fiete. Near exact correction of path integration errors by the grid cell-place cell system. and Ila Fiete. Neural coding for nearly perfect noise-free integration. (in workshop for Persistent neural activity: mechanisms and functional roles)
Grid cells in the entorhinal cortex (EC) were previously proposed to have combinatorial modulus properties. Here they proposed the grid cells as path integrators, together with an error correcting scheme for their code implemented by a biologically plausible network architecture. Each grid cell encodes the phase of the position estimate as a rate, and interestingly, the coding space represented by the population rate vector (or phase vector) can be shown to contain a 1-dimensional manifold along which small displacements are allowed; hence all other perturbations due to phase noise can be corrected through a recurrent circuit to the place cells in the hippocampus (see their recent paper for details).
This is a point estimation procedure where only a single position is maintained with a fixed noise structure, making it difficult to represent the uncertainty structure of the animal's belief.
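The modulus idea itself is simple to demonstrate. A toy sketch (hypothetical grid periods, noiseless phases, brute-force decoding; nothing like their recurrent network) of how a handful of phases modulo different periods pins down position over a large range:

```python
import numpy as np

periods = np.array([31.0, 37.0, 41.0])       # hypothetical grid periods (cm)
x_true = 523.7
phases = (x_true % periods) / periods        # phase of position in each module

# Brute-force decode: pick the position whose phases best match
grid = np.arange(0, 1000, 0.1)
pred = (grid[:, None] % periods) / periods
# circular distance between predicted and observed phases
err = np.abs(((pred - phases + 0.5) % 1.0) - 0.5).sum(axis=1)
print(grid[np.argmin(err)])                  # recovers ~523.7
```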

T-22. Bingni Brunton, Carlos D Brody. The Poisson clicks task: long time constant of neural integration of discrete packets of evidence.
An interesting experimental paradigm where clicking sounds with homogeneous Poisson statistics were delivered to the auditory system (left and right), and the subject's task was to discriminate the side with the higher rate. They showed that the subject integrates over time to collect evidence, since psychophysical performance increases with duration. Hence, the subject was evidently not using the time to the first click, but either counting or estimating the interval distribution. They built a stochastic model of integration and decision and showed it was consistent with the behavior.
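A minimal sketch of why integration predicts the duration effect (made-up click rates, an ideal noiseless counter rather than their stochastic accumulator): a perfect integrator that compares click counts gets more accurate the longer it listens.

```python
import numpy as np

rng = np.random.default_rng(3)
r_hi, r_lo, trials = 40.0, 20.0, 5000        # hypothetical click rates (Hz)

for dur in [0.1, 0.5, 2.0]:                  # stimulus duration (s)
    n_hi = rng.poisson(r_hi * dur, trials)   # clicks on the higher-rate side
    n_lo = rng.poisson(r_lo * dur, trials)
    correct = n_hi > n_lo                    # decide by comparing click counts
    print(dur, correct.mean())               # accuracy grows with duration
```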

Workshop 6. Linearity and its discontents— Is there life in a post-STRF world?
Locally linear models, Wiener models, and Hammerstein models were presented to overcome the limitations of linear models.
Of most interest to me was the talk by Tatyana Sharpee on "building nonlinear receptive field models using natural stimuli". She used maximization of mutual information to non-parametrically train the linear part of the linear-nonlinear model.
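A toy sketch of that objective, not Sharpee's actual algorithm (which ascends the information gradient; here a crude random search stands in for the optimizer, on a simulated LN neuron of my own construction): choose the filter that maximizes a histogram estimate of the mutual information between the projected stimulus and spiking.

```python
import numpy as np

rng = np.random.default_rng(4)
d, n = 5, 5000
w_true = rng.standard_normal(d)                  # hidden filter of a toy LN neuron
X = rng.standard_normal((n, d))                  # white-noise stimuli
y = rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ w_true)))  # spikes

def mi(w, bins=20):
    """Histogram estimate of I(X.w ; spike)."""
    proj = X @ (w / np.linalg.norm(w))
    edges = np.quantile(proj, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(proj, edges[1:-1]), 0, bins - 1)
    info = 0.0
    for b in range(bins):
        mask = idx == b
        if not mask.any():
            continue
        pb, ps = mask.mean(), y[mask].mean()
        for q, qall in [(ps, y.mean()), (1 - ps, 1 - y.mean())]:
            if q > 0:
                info += pb * q * np.log2(q / qall)
    return info

# Crude optimization: keep the most informative of many random filters
best = max((rng.standard_normal(d) for _ in range(1000)), key=mi)
w_hat = best / np.linalg.norm(best)
print(abs(w_hat @ w_true) / np.linalg.norm(w_true))  # alignment with true filter
```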

Neurophysiology

T-2. Jackie Schiller. Non linear dendritic processing in cortical pyramidal neurons.
Differences among the basal dendrites, the apical Ca-spike region, and the distal dendrites were elegantly demonstrated in cortical pyramidal neurons. Although the basal dendrites are shorter, their integration and back-propagation properties seemed to match those of the distal dendrites when normalized by dendritic length. While the active dendritic properties of the distal and basal synaptic activities are NMDA-based but not Ca-dependent, the tuft region of the apical dendrite was Ca-dependent and not NMDA-dependent.

T-21. Yang Yang, Anthony Zador. Differential sensitivity of different sensory cortices to behaviorally relevant timing differences
They implanted a pair of extracellular stimulating electrodes in various sensory regions of the cortex and gave a 2AFC task to discriminate fine timing differences between the electrodes, where periodic stimulations were delivered. This was to see whether the fine temporal structure widely measured in many sensory front-ends could actually be utilized by the corresponding cortex. Auditory cortex was previously reported by the same group to discriminate a 3 ms timing difference. Interestingly, visual cortex (V1) was capable of 15 ms, while barrel cortex could discriminate 1 ms! It is not clear whether the performance should be attributed to the specific cortex or to the related circuitry (the thalamo-cortical circuit, or local inhibitory circuits, as a few people questioned).
Zador did not show up in time, so Yang Yang used the chance to present her PhD work from her own perspective. She's a very entertaining speaker (I'm her fan!).

Other

Workshop 5. Multi-scale complex dynamics in the brain.
Jochen Triesch. Unsupervised learning in recurrent networks.

He showed that with a very simplified STDP rule, homeostatic firing rates, and weight normalization, the constructed liquid state machine can self-organize to perform unsupervised learning (a sketch of these plasticity ingredients is below). Although the weight trajectories were not stationary, once the weights were fixed, a stable readout could be made. It was a very nice result combining self-organizing principles that lead to unsupervised learning. I would like to see more analysis of such systems.
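A minimal sketch of those two ingredients (made-up constants, a single pairing per synapse rather than a full simulation): additive STDP driven by pre/post spike timing, followed by a homeostatic normalization that keeps each neuron's total input weight fixed.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
W = rng.random((n, n)) * 0.1                 # recurrent weights (toy init)
t_pre = rng.random(n) * 0.05                 # last presynaptic spike times (s)
t_post = rng.random(n) * 0.05                # last postsynaptic spike times (s)

a_plus, a_minus, tau = 0.01, 0.012, 0.02     # hypothetical STDP parameters
dtime = t_post[:, None] - t_pre[None, :]     # post-minus-pre lag at each synapse
dW = np.where(dtime > 0,
              a_plus * np.exp(-dtime / tau),   # pre-before-post: potentiate
              -a_minus * np.exp(dtime / tau))  # post-before-pre: depress
W = np.clip(W + dW, 0, None)

W /= W.sum(axis=1, keepdims=True)            # homeostatic weight normalization
```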
Kilian Koepsell. Detecting functional connectivity in networks of phase-coupled neural oscillators.
Using a Kuramoto model, he proposed a method to infer the connectivity and delays of a large network of coupled oscillators. He used the score matching algorithm to fit the observed phase distribution to the parametric phase distribution predicted by the model. This model can only account for interactions between oscillators of the same frequency.
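For reference, a minimal sketch of the forward model only (toy parameters, no delays): Kuramoto oscillators whose phase velocities are pulled by their neighbors through a coupling matrix K. The talk solved the inverse problem, fitting K from observed phases.

```python
import numpy as np

rng = np.random.default_rng(6)
n, dt, steps = 10, 0.001, 5000
omega = 2 * np.pi * 8.0 + rng.standard_normal(n)   # ~8 Hz natural frequencies
K = 0.5 * rng.random((n, n))                       # hypothetical coupling strengths
theta = rng.uniform(0, 2 * np.pi, n)

for _ in range(steps):
    # each oscillator is pulled by the phase differences to its neighbors
    coupling = (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + coupling) + np.sqrt(dt) * 0.1 * rng.standard_normal(n)

print(np.abs(np.exp(1j * theta).mean()))           # order parameter in [0, 1]
```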

T-28. Jeremy Freeman, Eero P Simoncelli. Metamers of the ventral stream.
Visual information processing uses larger receptive fields in higher-order visual areas along the ventral stream. The size of the receptive field also increases with distance from the center of gaze. They built a psychophysical model and created images that are indistinguishable given the receptive field sizes, in order to determine the level at which differences between visual images are perceived. Such images are called metamers, and looking at them was a very cool experience.

T-31. Sydney Schaefer, Iris Shelly, Kurt Thoroughman. Beside the point: Motor adaptation without feedback error correction in task-irrelevant conditions.
To investigate feedback adaptation in motor control, they created a novel task in which different motor strategies could be learned while some aspects of the movement were task-irrelevant. Error signals would be generated from both the relevant and irrelevant aspects of motor control. (They referred to Jordan & Wolpert 1999 for motor learning models.)

I-56. Christopher Nolan, Gordon Wyeth, Michael Milford. A neural microcircuit using spike timing for novelty detection.
STDP is well known to decrease the latency of causal responses. Hence, if a system learns quickly through STDP, then by comparing the latency of spiking through a pathway with and without STDP, one could tell whether the stimulus is novel. This is a purely hypothetical circuit; however, it might be useful for neuromorphic engineering.
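A minimal sketch of the latency argument (toy numbers, a leak-free integrator readout, and STDP abbreviated to "the earliest inputs got potentiated"): a familiar input order drives the learned pathway to threshold sooner than a shuffled, novel one.

```python
import numpy as np

rng = np.random.default_rng(7)
n, thresh = 100, 5.0
times = np.sort(rng.random(n) * 0.05)        # afferent spike times (s), in order

w = np.ones(n) * 0.2                         # learned pathway weights...
w[:20] *= 3.0                                # ...STDP potentiated the earliest inputs

def latency(order):
    """Time at which the running input, in arrival order, crosses threshold."""
    drive = np.cumsum(w[order])
    idx = np.searchsorted(drive, thresh)
    return times[idx] if idx < n else np.inf

familiar = latency(np.arange(n))             # same input order as during learning
novel = latency(rng.permutation(n))          # shuffled order: a novel stimulus
print(familiar, novel)                       # short latency flags a familiar input
```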

III-52. Yashar Ahmadian, Adam M Packer, Rafael Yuste, Liam Paninski. Designing optimal stimuli to control neuronal spike timing.
How can one deliver an optimal stimulus to make a target neuron spike with an exact temporal pattern? I saw related work at SfN 2009 by Ted Berger's group using a deterministic nonlinear mapping; this work uses a GLM to solve the problem. The problem is of great importance to neural prosthetics.
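A minimal sketch of the GLM formulation, under strong simplifications of my own (a scalar instantaneous stimulus filter, no history term, no stimulus-power constraint): gradient ascent on the Poisson log-likelihood of the target spike train with respect to the stimulus.

```python
import numpy as np

T, dt = 200, 0.001
k = 50.0                              # scalar stimulus filter (stand-in for a full filter)
b = np.log(5.0)                       # baseline log-rate (5 Hz)
target = np.zeros(T)
target[[50, 120, 180]] = 1            # desired spike train

x = np.zeros(T)                       # stimulus to be optimized
for _ in range(500):
    lam = np.exp(b + k * x)           # GLM conditional intensity (Hz)
    # gradient of the Poisson log-likelihood of `target` w.r.t. x
    grad = k * (target - lam * dt)
    x += 5e-4 * grad                  # gradient ascent step

print(x[[50, 120, 180]].round(3), x[0].round(3))  # stimulus peaks at desired spike bins
```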
