Interesting talks/posters from COSYNE 2011

2011/03/08

As I did for the past two COSYNE meetings (2009, 2010), I made a summary of my personal experience of COSYNE 2011. This year, I also tried broadcasting via Twitter together with hooPhilip, ag_smaoineamh, bradleyvoytek, and xandram2110 (hashtag #cosyne). Jim DiCarlo gave a very nice analysis of how the reviewing process went (he injected test abstracts to all the reviewers to normalize their scores! He also showed the expected false positive rate for the selection of talks).

In my biased eyes, there were many posters advocating the diversity/heterogeneity of neuronal populations this year.

II-33. John Cunningham, Mark Churchland, Matthew Kaufman, Krishna V. Shenoy. Extracting rotational structure from motor cortical data
John Cunningham introduced jPCA, a simple yet powerful method that extracts the most strongly rotational pairs of linear projections from a time series. I think this may evolve into a primary visualization tool similar to PCA. The method approximates the data as a linear dynamical system, minimizing the Frobenius norm (the l2 norm of the vectorized matrix) \left\| \dot{\mathbf{X}} - \mathbf{M} \mathbf{X} \right\|_F over the space of skew-symmetric matrices \mathbf{M}. This constrained optimization can be solved easily by a trick (see the sketch below).
From dynamical systems theory, the real parts of the eigenvalues of \mathbf{M} give rise to convergence or divergence, while the imaginary parts indicate rotation around a fixed point. The method was developed to visualize oscillating neuronal dynamics, but it might also be useful for finding limit cycles, or phase-synchronized population dynamics in general.
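
Here is a minimal sketch of that constrained least-squares step in Python (my own reconstruction of the trick, not the authors' code): parametrize \mathbf{M} by its k(k-1)/2 free entries, so the problem becomes ordinary least squares.

```python
import numpy as np

def jpca_skew_fit(X, Xdot):
    """Fit Xdot ~= M @ X in least squares with M skew-symmetric (M = -M.T).
    X, Xdot: (k, T) arrays of states and their time derivatives."""
    k = X.shape[0]
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    cols = []
    for i, j in pairs:
        B = np.zeros((k, k))             # basis matrix E_ij - E_ji
        B[i, j], B[j, i] = 1.0, -1.0
        cols.append((B @ X).ravel())     # effect of this free parameter
    A = np.stack(cols, axis=1)           # (k*T, k(k-1)/2) design matrix
    coef, *_ = np.linalg.lstsq(A, Xdot.ravel(), rcond=None)
    M = np.zeros((k, k))
    for c, (i, j) in zip(coef, pairs):
        M[i, j], M[j, i] = c, -c
    return M
```

The eigenvalues of a skew-symmetric matrix are purely imaginary, so the eigenvector pair of the fitted \mathbf{M} with the largest-magnitude eigenvalues spans the strongest rotational plane, which is what jPCA visualizes.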

III-32. Mark Churchland, John Cunningham, Matthew Kaufman, Stephen I. Ryu, Krishna V. Shenoy. Firing rate oscillations underlie motor cortex responses during reaching in monkey
This is the sister poster that uses jPCA (II-33) to analyze the population dynamics of motor responses. Churchland showed that monkey motor responses do exhibit rotational dynamics in the later phase, and he believes that the earlier stage of motor planning carefully regulates the state of the system to the correct initial condition, which is critical for the later dynamics.

T-6. Tony Zador. Cortical circuits underlying auditory processing
He actually changed the title, and outlined an awesome plan to recover the complete connectome of the brain. The plan is inspired by Brainbow, where neurons are colored via randomized DNA sequences. Instead of visually coloring with fluorescent proteins, his plan is to tag neurons with unique DNA sequence barcodes, and use viruses to trans-synaptically join the barcodes of pairs of connected neurons. Given the cost and speed trends of DNA sequencing technology, we would be able to obtain the full connectome of the brain with high probability pretty soon.
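
The "high probability" part is easy to build intuition for with a toy simulation (entirely my own illustration, not from the talk): with length-L random barcodes, collisions become negligible once 4^L greatly exceeds the number of neurons squared.

```python
import numpy as np

rng = np.random.default_rng(0)
n, L = 50, 12
barcodes = ["".join(rng.choice(list("ACGT"), L)) for _ in range(n)]
adj = rng.random((n, n)) < 0.1                    # "true" connectome
reads = [(barcodes[i], barcodes[j])               # one read per synapse
         for i in range(n) for j in range(n) if adj[i, j]]
lookup = {b: k for k, b in enumerate(barcodes)}
recovered = {(lookup[a], lookup[b]) for a, b in reads}
truth = {(i, j) for i in range(n) for j in range(n) if adj[i, j]}
print(recovered == truth)                         # True barring collisions
```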

T-14. Tim Vogels, Henning Sprekeler, Friedemann Zenke, Claudia Clopath, Wulfram Gerstner. Inhibitory synaptic plasticity generates global and detailed balance of excitation and inhibition
They presented an inhibitory-to-excitatory spike-timing-dependent plasticity rule that puts a network of leaky integrate-and-fire neurons into a balanced state, without changing the excitatory-to-excitatory synapses. In fact, they showed that you can embed local excitatory structure that can later be recalled in this network.
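
For intuition, here is a minimal discrete-time sketch of an inhibitory STDP rule of this flavor (parameter names and values are illustrative, not the ones presented): near-coincident pre/post activity potentiates the inhibitory weight, while lone presynaptic spikes depress it, steering the excitatory neuron toward a target rate.

```python
import numpy as np

rng = np.random.default_rng(0)
eta, tau, alpha, dt, T = 1e-3, 20e-3, 0.2, 1e-3, 10000
pre = rng.random(T) < 0.02       # inhibitory presynaptic spike train
post = rng.random(T) < 0.02      # excitatory postsynaptic spike train
w, x_pre, x_post = 0.5, 0.0, 0.0
for t in range(T):
    x_pre += -dt / tau * x_pre + pre[t]      # exponential spike traces
    x_post += -dt / tau * x_post + post[t]
    if pre[t]:
        w += eta * (x_post - alpha)  # lone pre spikes depress inhibition
    if post[t]:
        w += eta * x_pre             # pre-before-post potentiates it
    w = max(w, 0.0)                  # inhibitory weight stays non-negative
print(w)
```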

I-86. Alex Huk, Miriam Meister. Neuronal heterogeneity in LIP: Banburismus in the brain, or enigma machine?
Alex Huk showed a collection of neurons that are responsive to a memory task and/or a discrimination task. The diversity of the population, in terms of temporal organization, is huge. Together they support the prolonged activity that is usually seen in population-averaged histograms.

II-40. Shimazaki Hideaki, Emery N. Brown. Constructing a joint time-series model of continuous and Bernoulli/Poisson processes using a copula
A copula is a powerful way of capturing dependencies between variables. There have been several applications of copulas to neural data, but always between the same type of data: continuous pairs or discrete pairs. Shimazaki is developing a way of capturing the dependence between continuous variables and a point process. So far he can capture instantaneous coupling by using an empirical conditional joint distribution between the continuous variable and the spikes. This is possible because, essentially, instantaneous spiking can be considered a binary random variable, IMHO.
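
As a toy illustration of the mixed continuous/discrete idea (my own example, not Shimazaki's construction), a Gaussian copula can couple a continuous variable to a Bernoulli spike by thresholding one margin of a correlated latent Gaussian:

```python
import numpy as np
from scipy.stats import norm

# latent bivariate Gaussian with correlation rho supplies the dependence;
# one margin becomes the continuous signal, the other is thresholded to
# give a spike with probability p
rng = np.random.default_rng(0)
rho, p, n = 0.7, 0.1, 5000
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T
cont = norm.cdf(z[:, 0])                  # uniform margin; apply any icdf
spike = (z[:, 1] > norm.ppf(1.0 - p)).astype(float)  # Bernoulli(p) margin
print(spike.mean(), np.corrcoef(cont, spike)[0, 1])
```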

II-70. Olav Stetter, Demian Battaglia, Jordi Soriano, Theo Geisel. State-dependent network reconstruction from calcium imaging signals
Olav presented their recent effort to enhance measures of causality/connectivity; they are analyzing the dynamics of cortical cultures. He was not happy with traditional measures such as Granger causality and transfer entropy, and proposed a set of extensions: (1) high-pass filter the signal, (2) capture instantaneous power as well, (3) use transfer entropy, (4) condition on the population state. The last extension is very interesting, since the network dynamics in this case has clear states (see the sketch of state-conditioned transfer entropy below).
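
A sketch of what state-conditioned transfer entropy could look like for binary time series (a plain plug-in estimator of my own; the authors' estimator differs in its details):

```python
import numpy as np

def transfer_entropy(x, y, mask=None):
    """Plug-in transfer entropy TE(x -> y) for binary time series,
    optionally restricted to the time bins selected by `mask`
    (e.g. bins labeled as belonging to one population state)."""
    if mask is None:
        mask = np.ones(len(x) - 1, dtype=bool)
    trip = np.stack([y[1:], y[:-1], x[:-1]], axis=1)[mask]
    te = 0.0
    for ynext, yprev, xprev in {tuple(r) for r in trip}:
        sel = (trip == (ynext, yprev, xprev)).all(axis=1)
        p_joint = sel.mean()
        p_full = sel.sum() / ((trip[:, 1] == yprev) & (trip[:, 2] == xprev)).sum()
        p_reduced = ((trip[:, 0] == ynext) & (trip[:, 1] == yprev)).sum() \
            / (trip[:, 1] == yprev).sum()
        te += p_joint * np.log2(p_full / p_reduced)
    return te

# y copies x with a one-bin lag, so TE(x -> y) is large in either state
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000)
y = np.roll(x, 1)
state = rng.integers(0, 2, 1000)         # e.g. burst vs. quiet labels
print(transfer_entropy(x, y, mask=state[:-1] == 0))
```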

II-12. Bingni W Brunton, Carlos D Brody. Optimal integration of decision-making evidence in the rat
This is a continuation of the work presented at last year's COSYNE by the same authors. Poisson clicks were presented at a different rate to each ear for a fixed duration of 150-800 ms, and the animal (rat) has to decide which side had more clicks. From the psychophysics alone, they fit a drift-to-bound decision model with 6 parameters (leakage, sensory noise, accumulator noise, bound, facilitation/depression?). From the likelihood over the parameter space, they conclude that the rat has zero accumulator noise, which means rats can count pretty well, and keep the count in memory pretty well too.
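
A toy simulation of a single trial in the spirit of this model (parameter names and values are illustrative, not the authors' fits):

```python
import numpy as np

# one trial: Poisson clicks on each side, leaky noisy accumulator to bound
rng = np.random.default_rng(0)
dur = rng.uniform(0.15, 0.8)                 # trial duration (s)
rate_L, rate_R = 20.0, 40.0                  # click rates (Hz)
t_L = rng.uniform(0, dur, rng.poisson(rate_L * dur))
t_R = rng.uniform(0, dur, rng.poisson(rate_R * dur))
lam, sig_s, sig_a, bound = -0.5, 0.3, 0.0, 4.0   # leak, noises, bound
dt, a = 1e-3, 0.0
for t in np.arange(0.0, dur, dt):
    net = ((t_R >= t) & (t_R < t + dt)).sum() - ((t_L >= t) & (t_L < t + dt)).sum()
    a += lam * a * dt                                   # leaky integration
    a += net * (1 + sig_s * rng.standard_normal())      # noisy click input
    a += sig_a * np.sqrt(dt) * rng.standard_normal()    # accumulator noise
    if abs(a) >= bound:                                 # sticky bound
        break
print('right' if a > 0 else 'left')
```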

III-33. Wieland Brendel, Naoshige Uchida, Ranulfo Romo, Christian K Machens. How to deal with the heterogeneity of neural responses: A demixing method
Given a dataset that can be conditioned on several categorical variables, the goal is to find a common set of basis vectors, each related to one categorical variable at a time, while still explaining the variance. A variant of PCA, somewhat related to Fisher's discriminant, is introduced for this purpose; they call it dPCA. Essentially, the conditional covariance matrices (marginalizing out the other variables) are computed, and the cost function of PCA is weighted by the fraction each covariance contributes to the total.
(update Aug 2011: Christian Machens has a related paper in Frontiers in Computational Neuroscience.)
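
My reading of the marginalization step, as a rough sketch (not the published dPCA algorithm): split the trial-averaged data into additive parts that each depend on one condition, then measure each part's share of the total variance.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 5, 40))        # (time, stimulus, neuron)
mean = X.mean(axis=(0, 1), keepdims=True)
X_t = X.mean(axis=1, keepdims=True) - mean  # depends on time only
X_s = X.mean(axis=0, keepdims=True) - mean  # depends on stimulus only
X_ts = X - mean - X_t - X_s                 # interaction remainder
total = ((X - mean) ** 2).sum()
for name, part in [("time", np.broadcast_to(X_t, X.shape)),
                   ("stim", np.broadcast_to(X_s, X.shape)),
                   ("mix", X_ts)]:
    # the three parts are orthogonal, so these fractions sum to 1
    print(name, round(float((part ** 2).sum() / total), 3))
```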

T-3. Anmo J. Kim, Aurel A. Lazar, Yevgeniy B. Slutskiy. Drosophila projection neurons encode the acceleration of time-varying odor waveforms
Anmo showed the transfer function of early olfactory processing using precisely controlled temporal odor stimulation. Two serial stages of differentiation of the input were demonstrated, one at the olfactory sensory neurons (OSNs) and one at the projection neurons (PNs). Hence, the effective signal output (spike rate) of the PNs was the acceleration of the odor waveform.
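
A cartoon of the cascade (an illustrative model of my own, not the authors' fitted transfer functions): if the odor waveform is u(t), the OSN stage behaves like du/dt and the PN stage like the second derivative.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
u = np.sin(2 * np.pi * 3 * t)            # odor concentration waveform
osn = np.gradient(u, t)                  # first differentiating stage (OSN)
pn = np.gradient(osn, t)                 # second differentiating stage (PN)
accel = -(2 * np.pi * 3) ** 2 * u        # analytic second derivative
print(np.corrcoef(pn[10:-10], accel[10:-10])[0, 1])   # ~ 1
```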

II-76. Alexandra Smolyanskaya, Stephen G Lomber, Richard Born. Individual neurons in MT have significant detect probabilities for motion and depth detection tasks
Using cooling, she could change the tuning properties of visual cortical neurons. Depth tuning and direction tuning both changed, but by different amounts. Interestingly, if I remember correctly, the performance of the neurons (measured with detect probability, which is computed the same way as choice probability) degraded more for depth, and the tuning curves were also worse for depth.

I-42. Il Memming Park, Miriam Meister, Alexander Huk, Jonathan W Pillow. Detailed encoding and decoding of choice-related information from LIP spike trains
My poster was on the first night (hence my sparser coverage of posters from the same day). We analyzed the decision-making process from a statistical modeling perspective.

Workshops

Wolf Singer. The role of oscillatory phenomena in sensory processing
He was much more careful about claiming the usefulness of temporal codes compared to his keynote speech at IJCNN 2007. He gave various examples of dissociation between oscillation/synchrony dynamics and rate codes, but none of them was conclusive evidence that the brain uses them to compute. Curiosity, attention, and expectancy all modulated the oscillatory components, and he talked about a NIPS 2006 paper on a liquid-state-machine-like experiment in visual cortex (Nikolic et al.).

Aysegul Gunduz, Gerwin Schalk. The dynamics of attentional shift reflected in electrocorticography
Aysegul talked about how attention modulates ECoG signal amplitudes in humans, and demonstrated that it is possible to decode spatial attention from ECoG as well. Higher-frequency activity in different areas was consistently involved, forming an "attention network".

On the second day of the workshops, I spent most of my time in "Closing the loop: new techniques for online neural characterization and optimal control". The topics were closely related to optimal experiment design and active learning. Greg Horwitz gave a great opening remark on how the invention of the computer made experimentalists lazy: for example, in the old days, to find V1 receptive fields, people moved a light bulb around while listening to the neuron, but now we mostly stimulate with a computer in open loop. This workshop was very interesting to me. I wish some of the workshops from the second day had been on the first day, though.

Greg Horwitz. Characterizing spatial iso-response surfaces of retinal ganglion cells
In a neural response model where the inputs are combined into a single quantity, the iso-response surface captures how the inputs are combined. For the chromatic tuning of V1 neurons, the widely used model linearly combines the cone outputs and follows with a nonlinearity, which results in a planar iso-response manifold in the space of cone contrasts. However, when he measured adaptively, using feedback to probe the most nonlinear part of the iso-response surface, he found that some cells were better described by quadratic, or even elliptical, surfaces. The adaptive strategy was to model the iso-response surface as a collection of triangles and refine the details by probing the center of the most nonlinear polygon, with interleaved stimuli to avoid adaptation of the neuronal response.
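
The basic ray-based probing idea is easy to sketch (a toy model of my own; the actual method refines a triangulated surface as described above): along each direction in cone-contrast space, find the contrast at which a simulated response crosses the criterion.

```python
import numpy as np
from scipy.optimize import brentq

def response(x):                           # a toy quadratic cell
    return (x @ np.array([1.0, 0.5])) ** 2 + 0.2 * x[1] ** 2

target = 1.0                               # criterion firing rate
for theta in np.linspace(0.0, np.pi, 7):
    d = np.array([np.cos(theta), np.sin(theta)])
    # root of response(c * d) - target along the ray gives one surface point
    c_star = brentq(lambda c: response(c * d) - target, 1e-6, 100.0)
    print(np.round(c_star * d, 3))
```

For a linear cell the printed points lie on a plane; for this quadratic cell they trace a curved surface, which is what the adaptive probing is designed to reveal.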

Yashar Ahmadian. Design of optimal stimuli to control neuronal spiking
Extending his poster from last year, he talked about using a probabilistic model of the system (stimulus to spiking) together with constraints (maximum amplitude, smoothness of the waveform) to optimize a stimulation pattern that produces a spike train timed as precisely as the target. The loss function he used was an instantaneous Dirac delta on the target, which is zero if and only if the target spike occurs within the bin.
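
A sketch of the constrained-optimization idea on a toy LNP model (my own simplification, not the speaker's formulation): projected gradient ascent on the likelihood of the target spike train, with an amplitude box and a roughness penalty.

```python
import numpy as np

rng = np.random.default_rng(0)
T, dt, amax, lam = 200, 1e-3, 2.0, 10.0     # bins, bin size, box, penalty
k = np.exp(-np.arange(20) / 5.0)            # assumed known stimulus filter
target = np.zeros(T)
target[[50, 120, 180]] = 1.0                # desired spike times
s = np.zeros(T)                             # stimulus to be optimized
for it in range(500):
    drive = np.convolve(s, k)[:T]           # filtered stimulus
    rate = np.exp(drive)                    # LNP conditional intensity
    err = target - rate * dt                # d(loglik)/d(drive)
    grad = np.convolve(err[::-1], k)[:T][::-1]   # back-correlate with k
    grad -= lam * np.convolve(s, [-1.0, 2.0, -1.0], 'same')  # smoothness
    s = np.clip(s + 0.1 * grad, -amax, amax)                 # amplitude box
```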

Kechen Zhang. Optimal stimulus design and network structure
He talked about neural network theory for system identification: in a quadratic neural network model, constraints that bound the parameter space to a closed region frequently force at least one parameter onto the boundary. This is because the eigenstructure of the quadratic model produces saddles in the response surface, and because the continuous map from parameter space to response surface forces the boundary of the parameter space to be mapped to the boundary of the constrained response surface. He also talked about conditions under which the network is ambiguous (power gain functions and converging feedforward networks).

See also Mark H. Histed and Jonathan W. Pillow's report about the meeting.
