The power of mobility 3.0: I think, therefore I drive!
Technology serving people, for a brighter future: the brain-computer interface could allow people who are currently unable to drive to do so independently.
Cornelia sits in front of a screen in a dark room, her head fitted with a futuristic headset. Her eyes seem to bore into the display, where a dot darts across the screen. Suddenly, music starts to play out of nowhere. It is not a trick, though: Cornelia started the music with nothing but the power of her thoughts. How did she do it? To answer that question, we need to go back in time. Everything started with Cornelia’s cousin Markus, who was diagnosed with multiple sclerosis several years ago and is now paralysed from the neck down. Despite this, he runs his own car repair shop and, with the help of eye-tracking software and special computer programs, even carries out the administrative work himself.
Mind-based controls
“Markus would give anything to be able to drive a car again and be independent”, Cornelia explains. “I wondered whether it would be possible to operate a car with an integrated eye-tracking system combined with a brain-computer interface. I based my degree dissertation, in the Design Interior Interface area, on this topic and carried it out with the support of Audi researchers”. The brain-computer interface enables people with cognitive and motor impairments to communicate with their environment via mental commands. The sensors of a mobile EEG register the electrical activity of the nerve cells; a computer then translates these signals into commands and passes them on to a device, for example a wheelchair, home automation controls, or even a car.
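To make that signal path concrete, here is a minimal, purely illustrative Python sketch of the chain just described: a window of EEG samples is classified into a trained mental command, which is then forwarded to a connected device. The channel layout, the toy classifier and the handler names are assumptions made for the example, not Audi's or any EEG vendor's actual software.

```python
# Hypothetical sketch of the EEG -> command -> device chain described above.
from dataclasses import dataclass
from typing import Callable, Dict, Sequence


@dataclass
class EEGWindow:
    """A short window of raw EEG samples, one list of values per electrode channel."""
    channels: Sequence[Sequence[float]]


def classify(window: EEGWindow) -> str:
    """Map a window of brain activity to a trained mental command.

    A real system would use a calibrated machine-learning model; here we
    simply compare the average signal energy of two channel groups.
    """
    left = sum(abs(v) for v in window.channels[0]) / len(window.channels[0])
    right = sum(abs(v) for v in window.channels[1]) / len(window.channels[1])
    return "left" if left > right else "right"


def dispatch(command: str, handlers: Dict[str, Callable[[], None]]) -> None:
    """Forward the decoded command to the connected device
    (wheelchair, home automation, infotainment, ...)."""
    handlers.get(command, lambda: None)()


# Example: start music when the "right" command is recognised.
handlers = {
    "right": lambda: print("Playing music"),
    "left": lambda: print("Previous track"),
}
window = EEGWindow(channels=[[0.1, 0.2, 0.1], [0.4, 0.5, 0.6]])
dispatch(classify(window), handlers)
```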
Audi Aicon
In the Audi Aicon, for instance, a self-driving concept car presented at the IAA 2017, the passenger can operate the graphical interface with an eye-tracking system in addition to touch and voice commands. Infrared sensors detect which area of the display the passenger is looking at, and that area is then enlarged. At this point a simple touch is all that is required to activate it, but Markus’ disability is too severe to allow even this. The solution lies with the brain-computer interface.
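As a rough illustration of that gaze-to-control step, the following sketch maps a gaze point reported by the eye-tracker to a display region and enlarges it so that a simple touch can activate it. The region names, coordinates and scale factor are invented for the example and do not describe the Aicon's real interface.

```python
# Hypothetical sketch: map a gaze point to a display region and enlarge it.
from typing import Dict, Optional, Tuple

Rect = Tuple[float, float, float, float]  # x, y, width, height

REGIONS: Dict[str, Rect] = {
    "media": (0, 0, 400, 300),
    "climate": (400, 0, 400, 300),
    "navigation": (0, 300, 800, 300),
}


def region_under_gaze(gaze: Tuple[float, float]) -> Optional[str]:
    """Return the name of the display area the passenger is looking at."""
    gx, gy = gaze
    for name, (x, y, w, h) in REGIONS.items():
        if x <= gx < x + w and y <= gy < y + h:
            return name
    return None


def enlarge(rect: Rect, factor: float = 1.5) -> Rect:
    """Grow the focused region around its centre so it is easier to hit."""
    x, y, w, h = rect
    cx, cy = x + w / 2, y + h / 2
    return (cx - w * factor / 2, cy - h * factor / 2, w * factor, h * factor)


focused = region_under_gaze((520.0, 120.0))  # passenger looks at "climate"
if focused:
    print(focused, "->", enlarge(REGIONS[focused]))
```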
Graphical User Interface
That is the theory; the reality turned out to be much more complex. Calibrating the brain-computer interface is, in fact, anything but simple. Learning a single command can take up to two weeks, because it requires a precise and consistent thought. As a first step, Cornelia meditated to balance her brain activity before concentrating on a single concept corresponding to a command. All of this was then integrated into a graphical user interface. For her degree dissertation, Cornelia developed seven commands and worked them out graphically and conceptually: left, right, up, down, rotate clockwise, rotate counter-clockwise, and typing. Stringing different commands together one after the other requires maximum concentration, but after a while it becomes second nature. A bit like riding a bike.
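To give an idea of how seven such commands could drive a graphical interface, here is a small, hypothetical sketch that maps each recognised mental command to a cursor action. The cursor model, step sizes and the simplified "typing" action are illustrative assumptions, not Cornelia's actual dissertation code.

```python
# Hypothetical mapping of the seven trained mental commands to GUI actions.
from typing import Callable, Dict


class Cursor:
    """A toy on-screen cursor driven by decoded mental commands."""

    def __init__(self) -> None:
        self.x, self.y, self.angle = 0, 0, 0
        self.text = ""

    def move(self, dx: int, dy: int) -> None:
        self.x += dx
        self.y += dy

    def rotate(self, degrees: int) -> None:
        self.angle = (self.angle + degrees) % 360

    def type_char(self, char: str = "_") -> None:
        # A real "typing" command would open an on-screen keyboard;
        # here we simply append a placeholder character.
        self.text += char


cursor = Cursor()

# One handler per mental command learned during calibration.
COMMANDS: Dict[str, Callable[[], None]] = {
    "left": lambda: cursor.move(-10, 0),
    "right": lambda: cursor.move(10, 0),
    "up": lambda: cursor.move(0, -10),
    "down": lambda: cursor.move(0, 10),
    "rotate_cw": lambda: cursor.rotate(15),
    "rotate_ccw": lambda: cursor.rotate(-15),
    "typing": lambda: cursor.type_char(),
}

# A short sequence of decoded thoughts, executed one after the other.
for thought in ["right", "right", "down", "rotate_cw", "typing"]:
    COMMANDS[thought]()

print(cursor.x, cursor.y, cursor.angle, repr(cursor.text))
```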
An independent future
Once its use has been learned, the brain-computer interface offers a high level of operating safety. If the system is also linked to an eye-tracker, there is no need to wait for the relevant area of the control panel to light up: it can be targeted immediately, which speeds up operation. In the not-too-distant future, Markus may be able to call an Audi to his home via his smartphone and eye-tracking, drive his wheelchair up to and into the car, and then manage all of its functions, including the infotainment and air conditioning. There is obviously still a lot to do before this becomes reality, but the first steps have been taken.
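The gain from combining the two inputs can be sketched very simply: the eye-tracker names the control being looked at, and a single trained mental command confirms it, so no scanning or highlighting sequence is needed. The "confirm" label and the function names below are assumptions made purely for the sake of the example.

```python
# Hypothetical sketch: gaze selects the control, a mental command confirms it.
from typing import Optional


def gaze_target(gaze_region: Optional[str]) -> Optional[str]:
    """The eye-tracker already reports which control is being looked at."""
    return gaze_region


def activate_on_confirm(target: Optional[str], mental_command: str) -> Optional[str]:
    """Trigger the gazed-at control only when the BCI reports 'confirm'."""
    if target is not None and mental_command == "confirm":
        return target
    return None


# Markus looks at the air-conditioning tile and thinks the confirm command.
print(activate_on_confirm(gaze_target("climate"), "confirm"))  # -> climate
```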
Source: Audi Blog