Friday, July 10, 2015

Man to Machine: "Think, Decide, Act!"

The kind of acrobatics that robots are called upon to perform in science fiction novels and movies would certainly cost them an arm or a leg. But in SF, imagination is unbound and the limbs regrow in a jiffy. Alas, imagination and reality are not such close neighbours. Currently a reality check is being done on the options available to a robot injured in action on the battlefield, with the aim of designing robots that can adapt like animals (1,2). Living beings possess natural instincts accumulated over time, revised and refined through learning and memory, which let them adapt to unexpected and uncertain situations. Cully et al. equipped their robot with a unique trial-and-error algorithm built on a behaviour-performance map with weighted values. The map catalogues a number of possible failure modes along with the most effective responses to each. Cully et al. envisaged the robot injured in about 19 different ways, such as a broken arm or leg; empowered with this behaviour-performance map, the robot could manage the situation in less than two minutes. The leader of the team, Jean-Baptiste Mouret, says, "I study machine learning and evolutionary computation to design highly adaptive robots." Mouret's website presents several very interesting videos that explain it all.
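The idea can be sketched in a few lines of toy code. This is not the authors' actual implementation, only a hypothetical illustration of the principle: the robot holds a map of candidate behaviours scored in advance, tries the most promising one on the damaged body, and crosses off whatever fails until a trial succeeds. All the gait names and scores below are invented for the example.

```python
# Toy behaviour-performance map: each candidate gait carries a
# performance score predicted before any damage occurred.
behaviour_map = {
    "tripod_gait": 0.90,
    "wave_gait":   0.75,
    "hop_left":    0.60,
    "drag_right":  0.55,
    "crawl":       0.40,
}

def real_performance(behaviour):
    """Stand-in for a physical trial on the damaged robot.
    Gaits that rely on the (hypothetically) broken leg score far
    below their predicted value."""
    penalty = 0.7 if behaviour in ("tripod_gait", "wave_gait") else 0.0
    return behaviour_map[behaviour] - penalty

def adapt(good_enough=0.5):
    """Trial and error guided by the map: always test the behaviour
    with the best remaining prediction, stop once a trial succeeds."""
    estimates = dict(behaviour_map)
    trials = []
    while estimates:
        candidate = max(estimates, key=estimates.get)
        score = real_performance(candidate)
        trials.append((candidate, score))
        if score >= good_enough:
            return candidate, trials
        del estimates[candidate]  # rule out behaviours that failed
    return None, trials

best, history = adapt()
print(best, len(history))  # the robot settles on a compensating gait
```

Because the map ranks behaviours before any trial is run, the robot needs only a handful of physical tests rather than an exhaustive search, which is what makes recovery in minutes plausible.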

Just about a year ago we were amazed at how a paraplegic, equipped with an exoskeleton propped up by an array of electronic devices, proved a point (World Cup 2014: the symbolic first kick). A year later, Aflalo and team show that an intention can actually be translated into an action (3,4). The volunteer, EGS, has been paralysed from the neck down for the past ten years, but the nerve cells in his posterior parietal cortex are still active. The posterior parietal cortex is the main area where the prework for an action happens; in other words, it encodes intentionality. Aflalo et al. detected that EGS's thought processes selectively fire nerve cells in this area. The team took a cue from this and began a process of reading his thoughts; each time, they could check with him whether they had read correctly, which was an added advantage. The research team tapped the signals via tiny silicon chips embedded in the brain and fed them into a robotic arm, or to a computer screen to move a cursor. These silicon chips, arrays of 96 microscopic electrodes, have been approved by the USFDA (United States Food and Drug Administration) for commercialization. A definite step forward in the development of next-generation neuroprosthetic devices. Videos are available at Professor Richard Andersen's webpage.
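How might firing patterns from such an electrode array be turned into cursor motion? A toy sketch follows, using a textbook cosine-tuning model and a least-squares (population-vector style) decoder. It is not the decoder Aflalo et al. used; the tuning model, noise level, and all numbers are assumptions for illustration, with only the 96-electrode count taken from the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 96 recorded units (matching the array size
# mentioned above), each tuned to a random preferred 2-D direction.
n_units = 96
preferred = rng.normal(size=(n_units, 2))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def firing_rates(intended_velocity):
    """Toy cosine-tuning model: each unit fires in proportion to how
    well the intended movement aligns with its preferred direction,
    plus a little measurement noise."""
    drive = preferred @ intended_velocity
    return np.maximum(drive, 0.0) + 0.05 * rng.normal(size=n_units)

def decode(rates):
    """Least-squares decoder: recover the 2-D velocity that best
    explains the observed population firing rates."""
    velocity, *_ = np.linalg.lstsq(preferred, rates, rcond=None)
    return velocity

intent = np.array([1.0, 0.0])            # "move the cursor to the right"
decoded = decode(firing_rates(intent))
print(decoded)                           # points mostly in the +x direction
```

The check-with-the-volunteer loop the post describes is what makes calibrating such a decoder tractable: the intended direction is known on each trial, so the mapping from rates to movement can be fitted directly.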



References:
1. Cully et al., "Robots that can adapt like animals," Nature, 28 May 2015, pp. 503-507.
2. Adami, "Robots with instincts," Nature, 28 May 2015, pp. 426-427.
3. Aflalo et al., "Decoding motor imagery from the posterior parietal cortex of a tetraplegic human," Science, 22 May 2015, pp. 906-910.
4. Pruszynski & Diedrichsen, "Reading the mind to move the body," Science, 22 May 2015, pp. 860-861.