A very cool article from the Voice of San Diego on a new open-source, online system for crowdsourcing the assembly of neuroscience data. From the article:
Traditionally, the study of the brain was organized somewhat like an archipelago. Neuroscientists would inhabit their own island or peninsula of the brain, and see little reason to venture elsewhere.
Molecular neuroscientists, who study how DNA and RNA function in the brain, didn’t share their work with cognitive specialists who study how psychological and cognitive functions are produced by the brain, for example.
But there has been an awakening to the idea that the brains of humans and other mammals should be studied like the complex and interrelated systems that they are. Neuroscientists realized that they had to start collaborating across disciplines and sharing their data if they wanted to make advances in their own fields.
Ellisman and his UCSD colleagues have devised a solution: crowdsource a brain. And this week they unveiled their years-long project — the Whole Brain Catalog — at the annual convention of the Society for Neuroscience, the largest gathering of brain experts in the world.
We had read that Dr. Henry Markram of the Blue Brain project had given a talk at TED (Technology, Entertainment, Design), but the video wasn’t released until this month. The talk is geared toward a general audience rather than delving into the specifics of the Blue Brain project, as he has in earlier talks. It is engaging and includes many suggestions about the future of neuroscience and AI.
Watch it online at the TED website.
As has become a hallmark of the Svoboda lab, this new paper in Nature (advance online publication) combines several cutting-edge technologies (rAAV-delivered ChR2, most prominently, along with 2-photon imaging and 1-photon laser stimulation) to do some interesting synaptic physiology.
The subcellular organization of neocortical excitatory connections : Article : Nature.
They used ChR2 (with TTX and 4-AP to block action potentials) to find where on the dendritic tree particular inputs synapse onto L3 and L5 cells and to measure the strength of those inputs. ChR2 depolarizes the input axon locally (60 µm spot diameter) at points of (potential) axodendritic contact. If you’ve heard the term “potential synapse” before, think of this technique as a way of testing potential synapses to see whether an actual synapse is really there.
The technique allowed them to map, on an L3 barrel cortex pyramidal cell, where different thalamic inputs (VPm, POm) and cortical inputs (M1, barrel L2/3, barrel L4) make their synapses:
sCRACM stands for subcellular ChR2-assisted circuit mapping.
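In spirit, an sCRACM map is just a grid of photostimulation spots with the evoked-current amplitude recorded at each spot. Here is a minimal Python sketch of that bookkeeping; the function names and the toy Gaussian "recording" are hypothetical stand-ins for the rig, not the lab's actual acquisition code:

```python
import numpy as np

def scracm_map(grid_shape, measure_epsc):
    """Build a subcellular input map: measure_epsc(row, col) returns the
    peak EPSC amplitude (pA) evoked by photostimulating one grid spot."""
    rows, cols = grid_shape
    amp = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            amp[r, c] = measure_epsc(r, c)
    return amp

# Toy stand-in for the measurement: pretend this input pathway's synapses
# cluster near the center of the dendritic grid.
def fake_epsc(r, c, center=(8, 8), sigma=2.0):
    d2 = (r - center[0]) ** 2 + (c - center[1]) ** 2
    return 40.0 * np.exp(-d2 / (2 * sigma ** 2))  # peak amplitude in pA

input_map = scracm_map((16, 16), fake_epsc)
# The hotspot marks where this pathway (putatively) synapses on the tree.
peak = np.unravel_index(np.argmax(input_map), input_map.shape)
```

A real experiment would replace `fake_epsc` with the photostimulation-plus-whole-cell-recording step; the resulting map is simply peak EPSC amplitude indexed by laser-spot position over the dendritic tree.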
This new technique from Cori Bargmann’s lab is one of the neatest I’ve seen in a while. The authors split GFP into two pieces, expressing one piece presynaptically and the other postsynaptically. This creates functional (i.e., fluorescing) GFP only at sites of synaptic contact, where the protein can reconstitute. They call the technique GFP Reconstitution Across Synaptic Partners (GRASP). Check out an example labeling here:
The neurons express mCherry in the cytoplasm, but GFP appears only at sites of synaptic contact, where the split GFP peptides can be reconstituted into a complete GFP fluorophore.
Video-Rate Far-Field Optical Nanoscopy Dissects Synaptic Vesicle Movement
Just the optical engineering alone here deserves mention: 28 frames per second at 62 nm resolution (well below the diffraction limit of 260 nm for light of the wavelength used)! STED (stimulated emission depletion, developed in Stefan Hell’s group) is ideal for visualizing synaptic vesicles, whose small size (~50 nm) has typically confined them to the domain of electron microscopists. The ability to get high-speed STED allowed the researchers to track individual vesicles and their path dynamics. They conclude that vesicle movement has both motor-driven and diffusive components (i.e., a biased random walk). I’m sure with more time and more analysis there will be a lot of interesting applications for this kind of real-time vesicle tracking. Perhaps in the near future we will have single-vesicle “minis” monitored at multiple sites through microscopy instead of just one or two sites electrophysiologically…
Here’s the resolution difference between STED and confocal for a single vesicle:
And, for those of you with ~$1.25M lying around, you can now purchase a STED setup directly from Leica!
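The resolution gain behind numbers like these follows a simple square-root law: STED sharpens the diffraction-limited spot by a factor of sqrt(1 + I/I_sat), where I is the depletion-beam intensity and I_sat the dye's saturation intensity. A back-of-the-envelope sketch in Python; the wavelength, numerical aperture, and saturation factor below are illustrative assumptions, not values from the paper:

```python
import math

def abbe_limit(wavelength_nm, na):
    """Classical diffraction-limited resolution (Abbe): d = lambda / (2 * NA)."""
    return wavelength_nm / (2 * na)

def sted_resolution(wavelength_nm, na, saturation):
    """Effective STED resolution: d = lambda / (2 * NA * sqrt(1 + I/I_sat))."""
    return wavelength_nm / (2 * na * math.sqrt(1 + saturation))

# Illustrative (assumed) numbers: 532 nm light, NA 1.4 objective,
# depletion beam running at 8x the saturation intensity.
d_conf = abbe_limit(532, 1.4)          # ~190 nm: conventional limit
d_sted = sted_resolution(532, 1.4, 8)  # ~63 nm: well below that limit
```

Pushing the depletion intensity higher keeps shrinking the effective spot, which is why STED resolution is, in principle, unbounded by diffraction.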
(UPDATE 03-05-2007 – Upon closer inspection, it is clear that while the surgery has enabled the woman to feel sensation in the nerves of her missing hand when the surface of her chest is touched, the arm she was fitted with at the time of publication did not relay sensory signals back to her chest. Once she is fitted with an arm that has the appropriate sensors, however, she will not need further surgery to get this kind of direct feedback. Thanks to astute readers for pointing this out.)
The Guardian reports on an article published today in the Lancet about a successful surgical procedure giving an amputee a bionic arm that both responds to motor commands from her remaining motor nerves to control it and provides sensory feedback to sensory nerves when it is touched. If there was any doubt left, the worlds of neural prosthetics and brain-machine interfaces have officially collided.
The Lancet article is accompanied by two movies of the woman using the arm that you should really check out.
Given the recent progress in the decoding of motor signals from the brain and older progress on sensory feedback from neural prosthetics, this was to be expected. Nonetheless, watching this woman use her arm brings the message home in a visceral way. The spooky thesis of MIT CSAIL’s Rodney Brooks that “we will become a merger between flesh and machines” is one step closer today.
It seems Markram is back to getting some interesting results. A recent discovery from the Brain Mind Institute at EPFL shows that the brain adapts to new experience by unleashing a burst of new neuronal connections, of which only the fittest survive. The research further shows that this process of creating, testing, and reconfiguring brain circuits takes place on a scale of just hours, suggesting that the brain rewires itself considerably even during the course of a single day.
The paper can be found here.