What is the right level of biological realism to model neuronal systems in order to understand their computational properties? Some recent papers may help shed some light on the subject. Models of the computational properties of local networks of neurons are starting to come into their own. This year has already seen at least two articles published in experimentalist journals based on the same core of theoretical work.

To bring you up to speed, I need to remind you what is going on in the world of experimental neuroscience.

Experimentalists can now record single-cell activity from entire populations of neurons simultaneously. From Briggman, Abarbanel & Kristan (2006):

*By using multi-electrode arrays or optical imaging, investigators can now record from many individual neurons in various parts of nervous systems simultaneously while an animal performs sensory, motor or cognitive tasks. Given the large multidimensional datasets that are now routinely generated, it is often not obvious how to find meaningful results within the data.*

This paper goes on to provide a nice overview of the mathematical methods that researchers are using to grapple with the challenge of understanding the dynamics of the neural systems they record from. The authors make the case that conceptual progress is needed in interpreting the data these recordings yield. How can we understand what computations these neurons are collectively performing?
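To make this concrete, here is a minimal sketch, of my own devising rather than from the paper, of one of the standard methods reviewed by Briggman et al.: projecting a simulated multi-neuron recording onto its principal components to expose a low-dimensional population trajectory hidden in a high-dimensional dataset. All the numbers (50 neurons, a 2-dimensional latent signal) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate activity of 50 neurons over 200 time bins, driven by a shared
# 2-dimensional latent signal plus independent noise.
t = np.linspace(0, 4 * np.pi, 200)
latent = np.column_stack([np.sin(t), np.cos(t)])         # (200, 2) hidden signal
mixing = rng.normal(size=(2, 50))                        # latent -> neurons
rates = latent @ mixing + 0.3 * rng.normal(size=(200, 50))

# PCA via the singular value decomposition of the mean-centered data.
centered = rates - rates.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
variance_explained = S**2 / np.sum(S**2)

# The 50-neuron recording collapses to a trajectory in the top-2 PC plane.
trajectory = centered @ Vt[:2].T

print(f"variance explained by first 2 PCs: {variance_explained[:2].sum():.2f}")
```

The point of the exercise: most of the variance in the 50 recorded channels is captured by two dimensions, which is exactly the kind of structure these dimensionality-reduction methods are designed to find.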

(Incidentally, this topic is being explored in a conference happening this week at the Los Alamos National Laboratory, which, according to one of the conference session chairs, is intended to help shape future directions for the lab. Hopefully there will be webcasts from this conference.)

Theorists have worried about what neurons in local populations are doing for some time. These investigations have given rise to many models of such activity, notably those of Wilson & Cowan (1972) and Hopfield (1982), which viewed a network of neurons as having “attractor states”, invoking the mathematics (for Wilson & Cowan) and the language (for Hopfield) of dynamical systems theory.
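The attractor idea is easy to demonstrate in code. Here is a toy sketch (my own illustration, not taken from either paper) of a Hopfield network: stored patterns become attractor states, and a corrupted cue relaxes back onto the nearest stored memory.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
patterns = rng.choice([-1, 1], size=(3, n))    # three stored binary memories

# Hebbian weights: each stored pattern digs its own basin of attraction.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)

# Corrupt one stored pattern by flipping 20 of its 100 units.
cue = patterns[0].copy()
flip = rng.choice(n, size=20, replace=False)
cue[flip] *= -1

# Synchronous updates until the state stops changing, i.e. reaches an attractor.
state = cue
for _ in range(20):
    new = np.sign(W @ state)
    new[new == 0] = 1
    if np.array_equal(new, state):
        break
    state = new

print("recovered stored pattern:", np.array_equal(state, patterns[0]))
```

With only three patterns in a hundred units the network is well under capacity, so the dynamics reliably fall into the basin of the corrupted memory, which is the sense in which “computation” here means settling into a stable attractor.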

In 2002, Maass, Natschläger and Markram proposed a theory of computation for localized populations of neurons that does not require the population to settle into a stable attractor in order to compute. The idea is that local populations of interconnected neurons of the right kinds can discriminate temporal input patterns *in general*, and that this capability is governed by the network dynamics of those populations. They showed that such neuronal circuits form a generalizable computational architecture for continuous analog signals, known as liquid state machines. These circuit models were also among the first to incorporate short-term synaptic plasticity into a dynamical population model.
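The flavor of the proposal can be sketched without spiking neurons at all. Below is a simplified, rate-based analogue of the liquid state machine idea (an echo-state-style toy of my own, not Maass et al.'s actual spiking model): a fixed random recurrent network holds a fading memory of its input stream in its transient dynamics, and only a simple linear readout is trained, here to recover what the input was five steps in the past.

```python
import numpy as np

rng = np.random.default_rng(2)
n, T, delay = 200, 2000, 5

# Fixed random recurrent weights, scaled so activity neither dies nor explodes.
W = rng.normal(size=(n, n)) / np.sqrt(n) * 0.9
w_in = 0.5 * rng.normal(size=n)

u = rng.uniform(-1, 1, size=T)          # the analog input stream
x = np.zeros(n)
states = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])    # the "liquid" evolves; nothing is trained here
    states[t] = x

# Train only the linear readout (least squares) on the delayed-recall target.
X, y = states[delay:], u[:-delay]
w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
r = np.corrcoef(X @ w_out, y)[0, 1]
print(f"readout vs. 5-step-delayed input, correlation: {r:.2f}")
```

The network never reaches an attractor; the information about past inputs lives in the ongoing transient state, and different linear readouts of the same "liquid" can extract different functions of the input history.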

This idea, elaborated by Maass & Markram in a 2005 book chapter entitled *Theory of the computational function of microcircuit dynamics*, proposes that the liquid state machine architecture is capable of universal computation, just as Turing machines are.

On the heels of these postulates, two recent papers make a serious effort to combine this theoretical paradigm with experimental data. Haeusler & Maass (2007), in Cerebral Cortex, synthesized data from the layers of cerebral cortex to build a sophisticated dynamical model and tested whether it had the kinds of computational properties described by the earlier theoretical work (it did). Karmarkar & Buonomano (2007), in Neuron, explored the notion of “clockless” computing by networks of model neurons to understand how neural systems tell time, and used psychophysical experiments to support their theories.
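The "clockless" intuition is worth a toy demonstration of my own (inspired by, but not taken from, Karmarkar & Buonomano): after a brief input pulse, a recurrent network's state keeps evolving along a reproducible trajectory, so elapsed time since the pulse can be read out linearly from the state itself, with no dedicated clock or counter anywhere in the system.

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 150, 40

# Fixed random recurrent network; the pulse kicks it to a distinctive start state.
W = rng.normal(size=(n, n)) / np.sqrt(n) * 0.95
x = np.tanh(rng.normal(size=n))

states = []
for t in range(T):
    states.append(x)
    x = np.tanh(W @ x)          # free-running dynamics after the pulse
states = np.array(states)

# Linear readout trained to report how long ago the pulse occurred:
# each moment in time corresponds to a distinct point on the trajectory.
times = np.arange(T, dtype=float)
w, *_ = np.linalg.lstsq(states, times, rcond=None)
err = np.abs(states @ w - times).max()
print(f"max timing error over {T} steps: {err:.3f}")
```

Because each elapsed interval maps to a different network state, "what time is it since the stimulus?" becomes just another pattern-classification problem over the population's trajectory.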

A few concrete things are suggested by these works:

- Computational models of neuronal networks that take short-term plasticity and other biological details into account can be constructed. They demonstrate relevant computational properties.
- Testing the computational properties of such networks requires framing experiments in terms of complex analog signal processing.
- The global dynamical properties of local populations of neurons set the general tone of the computations they perform, while the single cell dynamics shape and refine those computations.
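Since short-term plasticity keeps coming up as the key biological detail these models add, here is a minimal sketch of the standard Tsodyks-Markram style of synaptic depression (my own illustration of the kind of synapse dynamics involved, with illustrative parameter values): a regular 20 Hz presynaptic train progressively depletes synaptic resources, so successive responses shrink.

```python
import numpy as np

U, tau_rec = 0.5, 0.8        # release fraction and resource recovery time constant (s)
isi = 0.05                   # inter-spike interval of a 20 Hz train (s)

x = 1.0                      # fraction of synaptic resources currently available
amplitudes = []
for _ in range(10):
    release = U * x                            # response amplitude for this spike
    amplitudes.append(release)
    x -= release                               # resources consumed by the spike
    x = 1 - (1 - x) * np.exp(-isi / tau_rec)   # partial recovery before the next spike

print("response to 1st vs 10th spike:",
      round(amplitudes[0], 3), round(amplitudes[-1], 3))
```

The synapse's response thus depends on the recent history of its inputs, which is precisely what gives these circuit models their sensitivity to temporal input patterns.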

Together, these papers may represent the beginning of a new understanding of the computations that networks of real physiological neurons are capable of. Expectations are high that results from further multi-cellular recordings and from the Blue Brain project will verify and elaborate these ideas. Stay tuned!

Oh great, this must be the dullest work of the decade in neuro, this brilliant combo of maths and neuro has turned up absolutely nothing new, just the same old crap, yeah and some new jargon. Wtf, academic work has finally turned academic!


Once again a very thoughtful essay from Dr. Larson. One very critical question is: is it enough to know the “statistical” connectivity of a brain, or do we need the precise wiring? (Or is the precise wiring just the statistical connectivity with a few tweaks here and there?)


Thanks Guest. By the way, I’m not a Dr. yet, though thanks for the upgrade 😉

The question you raise is an important one indeed, and the field is still coming to grips with it. There may not be just one answer. Although people tend to focus on the cerebral cortex as having the exemplar architecture/connectivity for the whole brain, some of the most critical brain areas have very different architectures. For example, the cortex is a six-layered structure, while the reticular formation in the midbrain seems to be a diffuse network of neurons whose structure cross-cuts many other midbrain nuclei. We may discover that structures like cortex can be described with statistical connectivity, but that the reticular formation can only be explained with much more precise wiring. At the end of the day, it will come down to how well our models of these systems explain and predict the data we collect.
