From Neuron (Sep 2), an interesting combination of mathematical analysis and proof with biology:
Synaptic Connectivity and Neuronal Morphology
Two Sides of the Same Coin
Dmitri B. Chklovskii
Neurons often possess elaborate axonal and dendritic arbors. Why do these arbors exist and what determines their form and dimensions? To answer these questions, I consider the wiring up of a large highly interconnected neuronal network, such as the cortical column. Implementation of such a network in the allotted volume requires all the salient features of neuronal morphology: the existence of branching dendrites and axons and the presence of dendritic spines. Therefore, the requirement of high interconnectivity is, in itself, sufficient to account for the existence of these features. Moreover, the actual lengths of axons and dendrites are close to the smallest possible length for a given interconnectivity, arguing that high interconnectivity is essential for cortical function.
• Design I: Point-to-Point Axons
• Design II: Branching Axons
• Design III: Branching Axons and Dendrites
• Design IV: Branching Axons and Spiny Dendrites
• Proof of Design Optimality
• Comparison with Experiment
• Note Added in Proof
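As a toy illustration of the wiring argument (my own sketch, not a calculation from the paper): compare the total wire needed in Design I, where the soma runs a separate point-to-point axon to each target, with Design II, where a single branching axon visits all targets. The greedy nearest-neighbor path below is only an upper bound on the branching design's length, and the random-target setup is purely hypothetical:

```python
import math
import random

random.seed(0)
soma = (0.0, 0.0)
targets = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(20)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Design I: a separate point-to-point axon from the soma to each target.
point_to_point = sum(dist(soma, t) for t in targets)

# Design II: one branching axon, approximated here by a greedy
# nearest-neighbor path visiting every target (an upper bound on the
# length of an optimal branching tree).
path_len, current, remaining = 0.0, soma, targets[:]
while remaining:
    nxt = min(remaining, key=lambda t: dist(current, t))
    path_len += dist(current, nxt)
    remaining.remove(nxt)
    current = nxt

print(point_to_point, path_len)
```

Even this crude bound comes out far shorter than the point-to-point total, which is the flavor of the argument for why branching arbors exist at all.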
“These lectures were recorded at the Obidos Advanced Course in Computational Neuroscience 2003. This is a yearly summer school in neuroscience and related areas. We are grateful for the speakers’ permission to use their material.”
Haven’t checked these out yet but several look interesting. Larry Abbott has a lecture on spike-timing dependent plasticity, Alain Destexhe has a lecture on synaptic noise in the cortex, and Dan Wolpert has one on sensorimotor learning.
Here’s the link to the lectures.
This looks cool: a machine-learning algorithm for learning to cluster from training data. You specify a set of attributes that can be computed on any local neighborhood (a nearby subset of points), and the algorithm then learns how to use these “features” to cluster.
For more info, search http://www.psi.toronto.edu/ml.html for “learning to cluster”.
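A loose sketch of the general idea, not the Toronto group's actual algorithm: compute simple features on pairs of points, fit a logistic model on training data labeled "same cluster / different cluster," and use the learned weights to score new pairs. The blob data, the distance-plus-bias feature choice, and all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: two separated blobs with known cluster labels.
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

def pair_features(a, b):
    # Hypothetical local features: distance between the points, plus a bias term.
    return np.array([np.linalg.norm(a - b), 1.0])

# Label each pair 1 if the two points share a cluster, 0 otherwise.
F = np.array([pair_features(X[i], X[j])
              for i in range(len(X)) for j in range(i + 1, len(X))])
t = np.array([1.0 if y[i] == y[j] else 0.0
              for i in range(len(X)) for j in range(i + 1, len(X))])

# Learn the feature weights with plain logistic regression (gradient descent).
w = np.zeros(F.shape[1])
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.1 * F.T @ (p - t) / len(t)

def same_cluster(a, b):
    # Score a new pair with the learned weights.
    return 1.0 / (1.0 + np.exp(-pair_features(a, b) @ w)) > 0.5

print(same_cluster(X[0], X[1]), same_cluster(X[0], X[40]))
```

Given such a pairwise "same cluster" predictor, an actual clustering could be read off by linking all pairs the model scores above threshold.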
Clustering produced by the algorithm:
The correct answer:
Recent work in Nature from a group in Germany shows the importance of temporal order in determining whether a stimulus is considered aversive or appetitive. Briefly, fruit flies were given paired odors and electric shocks with a varying interval between the two events; when the odor preceded the shock, it became an aversive stimulus. When shock preceded odor, the odor became appetitive.
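One way to caricature the finding is a timing-dependent valence rule in which the sign of the learned association flips with the order of the two events. The exponential shape and the time constant below are pure guesses for illustration, not values from the paper:

```python
import math

def learned_valence(dt, tau=15.0):
    """Toy valence rule (hypothetical). dt = t_shock - t_odor in seconds.

    Positive dt (odor precedes shock) -> aversive (negative valence);
    negative dt (shock precedes odor) -> appetitive (positive valence);
    the association fades as the interval between the events grows.
    """
    if dt == 0:
        return 0.0
    sign = -1.0 if dt > 0 else 1.0
    return sign * math.exp(-abs(dt) / tau)

# Odor 10 s before shock vs. odor 10 s after shock.
print(learned_valence(10.0), learned_valence(-10.0))
```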
It amazes me that, given the recent interest in reinforcement learning, so many of the “cutting-edge experiments” look awfully similar to the behaviorist literature of almost 100 years ago! Although I don’t know that literature well myself, I am positive someone must have taken Pavlov’s famous experiment and reversed the temporal order (if not with fruit flies, then perhaps with dogs or pigeons). If anyone out there knows of something, please post it below in the comments.
Also, there is a striking similarity to MM Poo’s work on growth cone guidance, in which a chemotactic factor is combined with electrical stimulation. In that work, certain patterns of electrical stimulation were able to reverse the action of particular chemotropic factors from attraction to repulsion or vice versa.
Full article from the German group can be found after the jump.