Remember the old days of tech support where all we had was a telephone connection?
Sorting out even the simplest of user problems was an endless series of questions and mostly confused answers. The session usually started with "On your screen can you see ..." followed up by the likes of "OK, now press F7 ... no, not F and 7, press the key labelled F7 ... wait; what's on the screen now?" (then there's my favorite, "No, sir, there isn't an 'any' key, it means press any key at all ...").
Today, online remote assistance has become commonplace and has made techs' lives much simpler. For example, products such as TeamViewer make it very easy for a user with a problem to let a technician see what's on their screen or even take over their machine.
While that's great for today, what about tomorrow? Things are only going to get more complicated, and in the future we'll probably need to get as close to the user as possible. So, can we improve on remote assistance apps? Yep, we most likely can, and late last year researcher Rajesh P. N. Rao of the University of Washington demonstrated what could be the next step: Direct Brain-to-Brain Communication in Humans.
The researchers used an electroencephalograph (EEG) to non-invasively record brain signals from the scalp of one person (the sender). A computer decoded the patterns of neural activation and sent data across the Internet to stimulate the brain of another, remote person (the receiver) via non-invasive transcranial magnetic stimulation (TMS).
The task that the subjects must cooperatively solve via brain-to-brain communication is a computer game ... The task involves saving a "city" ... from getting hit by rockets fired by a "pirate ship" ... To save the city, the subjects must fire a "cannon" ... If the "fire" button is pressed before the moving rocket reaches the city, the rocket is destroyed ..., the city is saved, and the trial ends. To make the task more interesting, on some trials, a friendly "supply plane" may appear instead of a pirate rocket and move leftwards towards the city ... The subjects must avoid firing the cannon at the supply plane.
What made this experiment so interesting was that only the sender could see the game and only the receiver could press the fire button. The link between perception and action was the sender's computer-decoded intention to press the button: that decoded intention caused the TMS system to stimulate the part of the receiver's brain that controls finger and wrist movement, which, in turn, pressed the fire button.
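The study doesn't publish its decoding code, but conceptually the sender side boils down to thresholding the power of a motor-imagery EEG signal and relaying a single bit to the receiver's TMS rig. Here's a minimal, purely illustrative Python sketch of that idea; all names, thresholds, and the fake "signal" are my own assumptions, not the study's actual pipeline (the real system calibrated per subject and used far more sophisticated signal processing):

```python
import statistics

FIRE_THRESHOLD = 2.0  # hypothetical cutoff relative to baseline; not from the study


def band_power(samples):
    """Crude stand-in for EEG band power: the variance of a signal window."""
    return statistics.pvariance(samples)


def decode_intent(window, baseline_power, threshold=FIRE_THRESHOLD):
    """Return True if the window's power deviates enough from the resting
    baseline to count as an imagined hand movement (the 'fire!' intention)."""
    return band_power(window) > threshold * baseline_power


def relay(intent, stimulate):
    """Sender side: if an intention is decoded, send the one-bit message
    that triggers TMS stimulation on the receiver's side."""
    if intent:
        stimulate()  # in the real rig, this would be a network call to the TMS controller
        return "FIRE"
    return "HOLD"
```

A quiet window stays below threshold and nothing happens; a high-amplitude "motor imagery" burst crosses it and the stimulation callback fires. The whole brain-to-brain link, reduced to its essence, is just that one bit in transit.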
And the experiment was a huge success: two of the trials achieved roughly a 90 percent success rate, while another was 100 percent successful!
Just imagine a few years from now, when even more complex systems produce even more complex problems for tech support to solve: you strap on your EEG cap, the remote user puts on her TMS rig, and you really take over the user's system ... as well as the user ...