VS Ramachandran's TED Talk

Although I’ve been a longtime fan of Ramachandran’s excellent book Phantoms in the Brain, this TED talk is like a compressed summary of the highlights of his research. He’s a great speaker, and in 20 minutes he covers my two favorite examples from the book (the Capgras delusion and mirror treatment for phantom limb syndrome). Perhaps the best part of the talk is that, after listening to it, I was more convinced than ever of the statistical nature of sensory perception (i.e., the brain attempts to find the most likely explanation for its sensory observations) and of the integrative nature of the central processing of multiple modalities.
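To make the “most likely explanation” idea concrete, here is a minimal sketch of Bayes-optimal cue combination, the standard toy model of multisensory integration. It is my own illustration with made-up numbers, not anything from the talk: under Gaussian noise, the best combined estimate weights each modality by its reliability (inverse variance).

    # Toy model (my own, not from the talk): fuse a visual and a haptic
    # estimate of the same quantity, e.g. an object's position as seen
    # and as felt. Each cue is weighted by its reliability.

    def fuse(mu_v, var_v, mu_h, var_h):
        """Bayes-optimal combination of two Gaussian estimates."""
        w_v = (1 / var_v) / (1 / var_v + 1 / var_h)   # reliability weight for vision
        mu = w_v * mu_v + (1 - w_v) * mu_h            # fused mean
        var = 1 / (1 / var_v + 1 / var_h)             # fused variance (always smaller)
        return mu, var

    # Vision is sharp (variance 1.0), touch is noisy (variance 4.0):
    mu, var = fuse(mu_v=10.0, var_v=1.0, mu_h=12.0, var_h=4.0)
    print(mu, var)  # 10.4 0.8 -- the estimate leans toward the more reliable cue

Note that the fused variance is smaller than either cue’s alone, which is the usual argument for why integrating multiple modalities pays off.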


Atul Gawande also recently wrote a New Yorker article about treating phantom itch with Ramachandran’s mirror box. I found this part of Gawande’s article on statistical inference in perception most interesting:

You can get a sense of this from brain-anatomy studies. If visual sensations were primarily received rather than constructed by the brain, you’d expect that most of the fibres going to the brain’s primary visual cortex would come from the retina. Instead, scientists have found that only twenty per cent do; eighty per cent come downward from regions of the brain governing functions like memory. Richard Gregory, a prominent British neuropsychologist, estimates that visual perception is more than ninety per cent memory and less than ten per cent sensory nerve signals. When Oaklander theorized that M.’s itch was endogenous, rather than generated by peripheral nerve signals, she was onto something important.

I’m not familiar with this field, but I wonder whether anyone has tried to quantify what percentage of the conscious experience we normally attribute entirely to sensory input is actually recall from memory or inference from past observation. Also, can this percentage change adaptively? Perhaps there are situations where the brain chooses to rely more heavily on memory and others where it relies more on primary sensory input.

NSF/EFRI neuro grants

NSF:ENG:EFRI:Home Page

NSF’s Emerging Frontiers in Research and Innovation (EFRI) office has funded four very futuristic neuroengineering projects:

  1. Deep learning in mammalian cortex
  2. Studying neural networks in vitro with an innovative patch clamp array
  3. Determining how the brain controls the hand for robotics
  4. In vitro power grid simulation using real neurons

Disclaimer: I was involved with the second proposal listed above.

A Computational Neuroanatomy for Motor Control

An extremely interesting trend in neuroscience has been to use the language of control theory to explain brain function. A recent paper by Shadmehr and Krakauer does a very nice job of summarizing this trend and assembling a comprehensive theory of how the brain controls the body. Using control theory, they put forward a mathematically precise description of their theory. And because its building blocks are direct analogues of specific brain regions, such as the basal ganglia, motor cortex, and cerebellum, they can use brain-lesion studies to undergird their claims about each component. From the paper:

The theory explains that in order to make a movement, our brain needs to solve three kinds of problems: we need to be able to accurately predict the sensory consequences of our motor commands (this is called system identification), we need to combine these predictions with actual sensory feedback to form a belief about the state of our body and the world (called state estimation), and then given this belief about the state of our body and the world, we have to adjust the gains of the sensorimotor feedback loops so that our movements maximize some measure of performance (called optimal control).

At the heart of the approach is the idea that we make movements to achieve a rewarding state. This crucial description of why we are making a movement, i.e., the rewards we expect to get and the costs we expect to pay, determines how quickly we move, what trajectory we choose to execute, and how we will respond to sensory feedback.
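To make the state-estimation step concrete, here is a minimal one-dimensional Kalman filter, the textbook algorithm for exactly this computation. It is a generic sketch with made-up noise parameters, not code from the paper: the filter predicts the state from a copy of the motor command (the forward model) and then corrects that prediction with noisy sensory feedback, weighted by the relative uncertainty of each source.

    # Generic 1-D Kalman filter (illustrative only; not from the paper).
    # Brain analogue: predict limb state from an efference copy of the
    # motor command, then blend in delayed, noisy sensory feedback.

    def kalman_step(x_est, p_est, u, y, a=1.0, b=1.0, c=1.0, q=0.01, r=0.5):
        """One predict/correct cycle.
        x_est, p_est : previous state estimate and its variance
        u            : motor command; y : sensory observation
        q, r         : assumed process and sensor noise variances
        """
        x_pred = a * x_est + b * u                # predict from the forward model
        p_pred = a * p_est * a + q
        k = p_pred * c / (c * p_pred * c + r)     # Kalman gain: trust in the senses
        x_new = x_pred + k * (y - c * x_pred)     # correct with sensory feedback
        p_new = (1 - k * c) * p_pred
        return x_new, p_new

    x, p = 0.0, 1.0
    for u, y in [(0.5, 0.7), (0.5, 1.1), (0.0, 1.0)]:
        x, p = kalman_step(x, p, u, y)
        print(round(x, 3), round(p, 3))

The same gain-setting logic, scaled up and paired with a cost function, is what the optimal-control step in their theory adjusts.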

This approach of interpreting brain-lesion studies in the context of a well-thought-out theory ought to be further encouraged.

Your Brain Is A Cartographer

That the brain holds maps of the body’s surface in the primary sensory and motor cortices has been a fascinating but well-known fact in neuroscience since the early work of Wilder Penfield. What is less broadly appreciated is the concept of “peripersonal space”. A new book by Sandra and Matthew Blakeslee describes peripersonal space in the following way:

The maps that encode your physical body are connected directly, immediately, personally to a map of every point in that space and also map out your potential to perform actions in that space. Your self does not end where your flesh ends, but suffuses and blends with the world, including other beings. […] Your brain also faithfully maps the space beyond your body when you enter it using tools. Take hold of a long stick and tap it on the ground. As far as your brain is concerned, your hand now extends to the tip of that stick. […] Moreover, this annexed peripersonal space is not static, like an aura. It is elastic. […] It morphs every time you put on or take off clothes, wear skis or scuba gear, or wield any tool. […] When you eat with a knife and fork, your peripersonal space grows to envelop them. Brain cells that normally represent space no farther out than your fingertips expand their fields of awareness outward, along the length of each utensil, making them part of you.

What I appreciate about this, besides the stretchy comic-book characters it makes me think of, is that it provides a powerful perspective for piecing together a mass of disparate neuroscience data, which the Blakeslees capitalize on.


Next-Gen Walking Robot Better Than Honda Asimo

Recently, a company named AnyBots appears to have built a humanoid walking robot whose gait is much more humanlike than Honda’s Asimo’s (if you aren’t familiar with Asimo, check it out).

The robot is named Dexter, and here’s a link to a blog post explaining why it’s better than Asimo. From the post:

There are of course biped robots that walk. The Honda Asimo is the best known. But the Asimo doesn’t balance dynamically. Its walk is preprogrammed; if you had it walk twice across the same space, it would put its feet down in exactly the same place the second time. And of course the floor has to be hard and flat.

Dynamically balancing—the way we walk—is much harder. It looks fairly smooth when we do it, but it’s really a controlled fall. At any given moment you have to think (or at least, your body does) about which direction you’re falling, and put your foot down in exactly the right place to push you in the direction you want to go. Practice makes it seem easy to us, but it’s a very hard problem to solve. Something as tall as a human becomes irretrievably off balance very rapidly.

This is important because robots that balance dynamically can handle arbitrary terrain and can withstand attempts to knock them over, much like the BigDog robot we reported on earlier, whereas preprogrammed robots like Asimo cannot. If Honda were smart, they’d buy AnyBots ASAP.
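To see why dynamic balance is fundamentally a feedback problem, here is a toy simulation of a linearized inverted pendulum, the textbook stand-in for an upright body, caught mid-fall by a hand-tuned PD controller. This is purely illustrative and has nothing to do with Dexter’s actual control code.

    # Toy "controlled fall" (my own illustration, not Dexter's controller):
    # a linearized inverted pendulum tips over unless feedback torque,
    # proportional to the lean angle and its rate, keeps catching it.

    g, l, dt = 9.81, 1.0, 0.01   # gravity (m/s^2), leg length (m), time step (s)
    kp, kd = 40.0, 10.0          # hand-picked feedback gains for this toy

    theta, omega = 0.1, 0.0      # initial lean angle (rad) and angular velocity
    for _ in range(300):
        u = -kp * theta - kd * omega      # corrective torque (per unit inertia)
        alpha = (g / l) * theta + u       # linearized dynamics: diverges without u
        omega += alpha * dt
        theta += omega * dt

    print(round(theta, 4))  # ~0.0: the fall has been caught

Remove the feedback term u and theta grows without bound, which is the quote’s point: something as tall as a human becomes irretrievably off balance very rapidly.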

Here’s a link to the Slashdot article on this.

Amputee Controls And Feels Bionic Arm as Her Own

(UPDATE 03-05-2007 – Upon closer inspection, it is clear that while the surgery has enabled the woman to feel sensation in the nerves of her missing hand when the surface of her chest is touched, the arm she was fitted with at the time of publication did not relay sensory signals from the arm back to her chest. Once she is fitted with an arm that has the appropriate sensors, however, she will not have to undergo further surgery to get this kind of direct feedback. Thanks to the astute readers who pointed this out.)

The Guardian reports on an article published today in the Lancet about a successful surgical procedure that gave an amputee a bionic arm which both responds to motor commands from her remaining motor nerves and provides feedback to her sensory nerves when it is touched. If there was any doubt left, the worlds of neural prosthetics and brain-machine interfaces have officially collided.

The Lancet article is accompanied by two movies of the woman using the arm, which you should really check out.

Given recent progress in decoding motor signals from the brain and older progress on sensory feedback from neural prosthetics, this was to be expected. Nonetheless, watching this woman use her arm brings the message home in a visceral way. The spooky thesis of MIT CSAIL’s Rodney Brooks that “we will become a merger between flesh and machines” is one step closer to reality today.
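For a sense of what “decoding motor signals” means computationally, here is a toy population-vector decoder in the spirit of Georgopoulos’s classic cortical recordings: each simulated neuron fires most for a preferred movement direction, and the intended direction is read out as the baseline-subtracted, rate-weighted sum of preferred directions. This is my own illustration, not the method used in the Lancet study.

    import numpy as np

    # Toy population-vector decoder (illustrative; not the Lancet method).
    rng = np.random.default_rng(0)
    n = 100
    preferred = rng.normal(size=(n, 2))                      # preferred directions
    preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

    true_dir = np.array([1.0, 0.0])                          # intended movement

    # Cosine tuning: rate = baseline + modulation * cos(angle to preferred)
    rates = 10 + 8 * preferred @ true_dir + rng.normal(scale=1.0, size=n)

    # Read out: weight each preferred direction by its baseline-subtracted rate
    decoded = (rates - rates.mean()) @ preferred
    decoded /= np.linalg.norm(decoded)
    print(decoded)  # close to [1, 0], the intended direction

Real prosthetic decoders are more elaborate, but the core idea, reading intent out of a weighted population of tuned neurons, is the same.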
