Sunday 14 October 2012

Tell me when you feel something



Louis Armstrong New Orleans International Airport at 1am didn’t look good. It was tired, run down and in need of a facelift. After 24 hours of traveling to get there – that’s another story – I felt much the same as it looked. To be fair, it was clearly undergoing some sort of refit. With 40,000 scientists descending on New Orleans for this year’s Society for Neuroscience Meeting, the city clearly wasn’t bothering about first impressions. I guess that’s in keeping with the city they call the “Big Easy”.

SfN finally came up with the long-awaited app to help navigate the meeting. First impressions are that it’s not bad. It has the whole programme loaded, social media feeds and the facility to create your own schedule. This last bit might need a bit of tweaking for next year, as it lacks some of the power-search facilities of the web-based meeting planner. In theory all you now need to carry is something like an iPad, but I still felt more comfortable with my carefully researched itinerary printed on paper, scribbled with “like to see”, “must see” and numerous question marks.

So, SfN is embracing technology, which is apt given the first session I planned to visit on Saturday. Before I get to that, let’s consider a couple of basic options we have for restoring function to someone with a spinal cord injury (SCI). In general terms we can either look to biology to find ways to help the body repair itself, or we can look to technology. For a while now I have noticed a growing field looking to apply high-end technology to replace lost function. Advances in engineering have made sophisticated robotics a reality. Neural prosthetics and brain-computer interfaces are very actively researched, but it is fair to say that the majority of this research has focused on the “easier” issue of replacing motor function. A classic example is using micro-electrode arrays to record and map the neuronal activity in the brain associated with the intention to move a limb, and then using this information to drive a prosthetic limb (or stimulator) via some clever computer wizardry. While there are still problems to overcome, this is feasible and has been demonstrated in humans.
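For readers who like to see the idea made concrete, here is a deliberately simplified sketch of that decoding step, in Python with simulated data. It is not the method used in any particular study – real systems use spike counts from implanted arrays and more sophisticated decoders such as Kalman filters – but it shows the basic principle: learn a mapping from a population of firing rates to an intended movement, then apply that mapping to new neural activity to produce a movement command.

```python
# Toy illustration only (simulated neurons, made-up numbers): decode a 2-D
# intended movement velocity from the firing rates of a small neural
# population using ordinary least-squares.
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_samples = 50, 2000
intended_velocity = rng.standard_normal((n_samples, 2))        # (vx, vy) the subject intends
tuning = rng.standard_normal((2, n_neurons))                   # each neuron's directional tuning
firing_rates = intended_velocity @ tuning + 0.5 * rng.standard_normal((n_samples, n_neurons))

# "Calibration" session: fit a linear decoder mapping firing rates -> velocity.
decoder, *_ = np.linalg.lstsq(firing_rates, intended_velocity, rcond=None)

# "Online" use: a new pattern of firing rates becomes a movement command
# that could, in principle, drive a prosthetic limb or stimulator.
new_rates = rng.standard_normal((1, 2)) @ tuning
predicted_velocity = new_rates @ decoder
print(predicted_velocity)
```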

To really own a limb you need to be able to feel it, or at least feel what it is doing. And for many practical and dextrous tasks, such as writing or manipulating objects, you need feedback; not least so you don’t apply too little pressure and lose your grip, or too much and cause damage. Getting and perceiving sensory feedback from a prosthetic is a bit of a holy grail. G. A. Tabot [Presentation # 15.02] described some work towards this goal in non-human primates. Placing microelectrodes into the regions of the brain involved in sensory perception, the team recorded patterns of neuronal activation after stimulating different areas of the hand. At the same time the animal was trained to look at the part of the hand being stimulated. To prove the principle, they then stimulated the appropriate region of the brain artificially through the same electrodes and found the animal looked at the correct region of the hand: it was reacting as though it had been touched, even though the hand wasn’t actually receiving any stimulation. Using a similar approach they demonstrated the animal could distinguish different levels of force. This paves the way for integrating such stimulation with touch and pressure sensors on prosthetic limbs, and perhaps a more natural experience.
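To give a feel for what that integration might look like, here is a small hypothetical sketch of the “sensor side” of the loop: a made-up pressure sensor on a prosthetic fingertip whose reading is converted into a stimulation amplitude, so that a firmer press would, in principle, produce a stronger artificial touch. All of the thresholds and values are invented for illustration and have nothing to do with the parameters used in the presented work.

```python
# Purely illustrative sketch: map a hypothetical fingertip pressure reading
# to a stimulation amplitude. All numbers are invented for illustration.
def pressure_to_stimulation_uA(pressure_newtons,
                               detection_threshold=0.05,   # below this, no percept
                               max_pressure=5.0,           # sensor saturates here
                               min_current=20.0,           # weakest perceptible stimulus (uA)
                               max_current=100.0):         # safety ceiling (uA)
    """Return a stimulation amplitude in microamps for a given fingertip force."""
    if pressure_newtons < detection_threshold:
        return 0.0
    # Clamp to the sensor range, then scale linearly between min and max current.
    p = min(pressure_newtons, max_pressure)
    fraction = (p - detection_threshold) / (max_pressure - detection_threshold)
    return min_current + fraction * (max_current - min_current)

for force in (0.02, 0.5, 2.0, 6.0):
    print(force, "N ->", round(pressure_to_stimulation_uA(force), 1), "uA")
```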
