Source – army-technology.com
The latest round of grants under the US Department of Defense University Research Instrumentation Program lists Brain Computer Interfacing as an area of funded research. While military interest in mind control is hardly new, Dr Gareth Evans finds out how far the technology has come and whether we may be nearing a breakthrough via a new generation of brain computer interfaces.
Jonathan D. Moreno, the author of ‘Mind Wars: Brain Research and National Defense’, and who has served on a number of US Presidential committees, once commented that when you are seeking inspiration for the military developments of the future, science fiction is a good place to start.
With a number of the defence applications currently being investigated for brain computer interfacing (BCI) appearing to have been borrowed from some of the most successful sci-fi movies of recent years, it seems that statement has never been more true.
Military interest in mind-control is nothing new. The US Defense Advanced Research Projects Agency (DARPA) has been funding a variety of BCI projects since the early 1970s – with other nations running similar programmes too – and the technology has come a long way in those intervening 40 years.
If the current rounds of research go according to plan, it may not be long before brain-activated devices begin to revolutionise the way we go to war.
Mind reading in the military
The principles behind BCI could hardly be simpler – sensors detect the electrical signals of the user’s brain, which are then rendered into a computer-usable form; actually achieving such an interface successfully is, however, rather less straightforward.
There are three fundamental techniques: electro-encephalography (EEG), invasive direct connections and electro-corticography (ECoG), also known as intracranial EEG – a sort of half-way house involving electrodes placed on the brain’s exposed surface, rather than hardwired into the brain itself.
Although the two more invasive methods produce better results, for obvious reasons, EEG-based helmets and headwear remain the military standard – though fully connected cyborgs and super-soldiers may themselves, of course, become a reality one day.
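Since EEG remains the military standard, it is worth seeing what the first step of such an interface looks like in practice: estimating the power of a characteristic frequency band (here the 8–12Hz alpha rhythm) in a window of raw samples. The sketch below is purely illustrative – a synthetic sine wave stands in for real electrode data, and the function name and numbers are invented for this example:

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Naive DFT power estimate in [f_lo, f_hi] Hz for one EEG channel.
    samples: list of voltage readings; fs: sampling rate in Hz."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

# Synthetic 10Hz "alpha" oscillation, sampled at 128Hz for one second.
fs = 128
alpha = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
# The alpha band (8-12Hz) should dominate the beta band (18-30Hz).
print(band_power(alpha, fs, 8, 12) > band_power(alpha, fs, 18, 30))  # → True
```

A real system would feed band-power features like this, drawn from many channels, into a trained classifier; the naive DFT here simply makes the arithmetic explicit.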
Controlling mobile robot agents is one area where BCI appears to hold much promise, and perhaps most notably with its potential to revolutionise reconnaissance in hostile terrain, which was originally the primary function of aerial drones, before they were widely weaponised.
Telepresence – allowing a human operator to have an at-a-distance presence in a remote environment via a brain-actuated robot – is a relatively new field of research, most of the work to date having been done with operator and robot in close proximity to each other.
Nevertheless, there have already been some significant advances made, including a museum guide robot and most recently, the University of Minnesota’s successful demonstration of a thought-controlled mini-helicopter capable of being piloted through obstacles with around 90% accuracy.
It remains early days for the technology, however, and three major engineering challenges need to be addressed before soldiers will be seamlessly guiding their remote presences around.
Firstly, current non-invasive BCIs are slow and somewhat unreliable; secondly, they tend to make high cognitive demands on the user; and finally, variable communication delays are a significant problem, especially for tele-operation via the internet. If these issues can be overcome, the future of BCI telepresence looks very bright, but until then most of the systems currently being developed adopt a ‘shared-control’ approach, equipping the robot agent with a degree of intelligence that allows it to work semi-autonomously.
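As a rough illustration of that shared-control idea, the hypothetical sketch below blends a noisy operator command with an autonomous avoidance behaviour, handing more authority to the robot as an obstacle gets closer. The thresholds, the blending rule and all names are invented for this example, not taken from any fielded system:

```python
def shared_control(bci_cmd, obstacle_dist, safe_dist=1.0, min_dist=0.2):
    """Blend a noisy BCI steering command with autonomous obstacle avoidance.
    bci_cmd: (forward, turn), each in [-1, 1]; obstacle_dist: metres to the
    nearest obstacle ahead. The robot takes over progressively as it closes."""
    forward, turn = bci_cmd
    if obstacle_dist <= min_dist:
        autonomy = 1.0   # robot fully in charge: stop and turn away
    elif obstacle_dist >= safe_dist:
        autonomy = 0.0   # operator fully in charge
    else:
        autonomy = (safe_dist - obstacle_dist) / (safe_dist - min_dist)
    # Autonomous fallback behaviour: halt forward motion, steer away.
    auto_forward, auto_turn = 0.0, 1.0
    return ((1 - autonomy) * forward + autonomy * auto_forward,
            (1 - autonomy) * turn + autonomy * auto_turn)

# Open space: the operator's intent passes straight through.
print(shared_control((0.8, 0.0), obstacle_dist=3.0))  # → (0.8, 0.0)
# Obstacle close by: forward motion is damped and the robot steers away.
print(shared_control((0.8, 0.0), obstacle_dist=0.4))
```

The same blending also masks variable communication delays to a degree: while the operator’s command is stale or absent, the autonomous term keeps the robot safe.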
If telepresence via BCI nods towards the fictional world of James Cameron’s ‘Avatar’, then the potential for marrying neural interfaces with robotic exoskeletons is shaping up to be about as close to the movie ‘Iron Man’ as real life is ever likely to get.
Augmenting human strength and endurance with wearable suits has been an area of military research for decades, and the technology has come a long way since the first rather crude and clunky attempts, with the likes of Raytheon’s XOS2 now bringing exoskeletons to the brink of deployment.
Said to be robust enough to enable its wearer repeatedly to lift 200lb loads – or punch through three inches of wood – yet agile and responsive enough to negotiate stairs or ramps with ease, the XOS2 robotic suit allows one soldier to do the work of three, with clear implications at a time when troop numbers are being reduced.
The XOS2 is undoubtedly a major step forward, but it still combines a range of mechanical sensors, actuators and controllers, and its high-performance hydraulics mean it remains limited by the need for a tethered power supply. Many believe that the next big challenge is for robotic exoskeletons to achieve self-power and BCI control – and that breakthrough might just benefit injured service personnel first.
A hint of what that may look like came in late 2010, when Berkeley Bionics – since renamed Ekso Bionics – first unveiled eLEGS, its personal exoskeleton designed to help patients with spinal cord injuries walk again.
As the people behind the human universal load carrier (HULC), the company already had an established pedigree in the field, and eLEGS now brings a new dimension to paraplegic mobility, including hitherto unprecedented knee flexion and, once the user is fully trained, walking speeds of up to 2mph.
Crucially, however, unlike previous systems for those who have lost function in their lower limbs, which were either unpowered, or relied on a power tether, eLEGS is self-powered, and in addition, has the necessary artificial intelligence to truly emulate human gait, rather than a shuffle.
Interestingly, eLEGS draws on military design – illustrating the massive cross-over potential between medical and military research in this area – and with BCI technology already beginning to permeate electric wheelchair control, it cannot be long before exoskeletons, too, are run by thought.
BCI also has major potential military applications beyond the field of robotics – and improved communications is high on the list. DARPA’s ongoing project ‘Silent Talk’ is another idea that has distinctly sci-fi overtones, and aims to allow front-line soldiers to communicate telepathically.
Calling for research to “allow user-to-user communication on the battlefield without the use of vocalized speech through analysis of neural signals”, and kicked off with an initial $4m budget, the project focuses on identifying and isolating the distinctive neural flashes that occur in the brain when people talk to themselves.
Capture these via an EEG ‘thought helmet’ and turn them into a form that can be beamed directly to other similarly equipped soldiers in theatre, and you have what the US Army neuroscientist, Dr. Elmar Schmeisser, has likened to a radio without a microphone. “You’ll press the button on your harness, you’ll think, then you’ll throw the button off.”
It would represent a major advance in battlefield communication, not least because the messages would be clear and free of background noise and, since there are no spoken words to intercept or overhear, would offer an unparalleled level of security.
There are, however, some serious technical challenges to overcome first. Every individual’s EEG is unique; the system will need to be calibrated for each soldier, and be capable of isolating the signature of word-precursor thoughts from the rest of his or her neural activity. That is something one of the researchers described as the electronic equivalent of recognising and unravelling a particular strand of spaghetti from a plateful of pasta.
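One simple way to picture that per-soldier calibration is a template-matching scheme: record EEG features while the user silently rehearses each command word, average them into per-word templates, then label new activity by its nearest template. The sketch below is a toy illustration with invented data and names – a stand-in for, not a description of, the actual Silent Talk approach:

```python
def train_centroids(calibration):
    """Per-user calibration: average the feature vectors recorded while the
    soldier silently rehearses each command word, giving one template per word."""
    centroids = {}
    for word, trials in calibration.items():
        dim = len(trials[0])
        centroids[word] = [sum(t[i] for t in trials) / len(trials) for i in range(dim)]
    return centroids

def classify(centroids, features):
    """Return the command word whose template is closest (squared Euclidean
    distance) to the incoming EEG feature vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda word: dist(centroids[word], features))

# Toy calibration data: two-dimensional features per silently rehearsed word.
calib = {
    "advance": [[1.0, 0.1], [0.9, 0.2]],
    "hold":    [[0.1, 1.0], [0.2, 0.9]],
}
templates = train_centroids(calib)
print(classify(templates, [0.8, 0.3]))  # → advance
```

Real neural features are far noisier and higher-dimensional than this, which is why the researchers’ spaghetti analogy – and the fierce mathematics behind separating the strands – is apt.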
Getting to that point could take 20 years. As Schmeisser puts it: “The mathematics behind this is fierce. It is really difficult” – and that might be putting it mildly.