Computer, you read my mind.
Researchers at MIT have developed a computing system that can learn to interpret words someone is thinking – but not actually saying out loud.
It’s like voice-recognition software, but without the voice.
A description of the project from MIT says the device works by using electrodes that pick up neuromuscular signals from the jaw and face that are triggered when a person verbalizes words internally. The system, called AlterEgo, includes a bone-conduction headset that can also relay silent information back to the user through vibrations in the bones of the face and inner ear.
“AlterEgo aims to combine humans and computers,” MIT says in a video about the project, “such that computing, the internet, and AI would weave into human personality as a ‘second self,’ and augment human cognition and abilities.”
“The device is thus part of a complete silent-computing system that lets the user undetectably pose and receive answers to difficult computational problems,” MIT says on its website. “In one of the researchers’ experiments, for instance, subjects used the system to silently report opponents’ moves in a chess game and just as silently receive computer-recommended responses.”
“The motivation for this was to build an IA device — an intelligence-augmentation device,” Arnav Kapur, a graduate student at the MIT Media Lab, says in the post. “Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?”
“We basically can’t live without our cellphones, our digital devices,” said Pattie Maes, a professor of media arts and sciences and Kapur’s thesis advisor. “But at the moment, the use of those devices is very disruptive. If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword, and the whole thing requires that I completely shift attention from my environment and the people that I’m with to the phone itself.
“So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present.”
The research is described in a paper presented at the Association for Computing Machinery’s ACM Intelligent User Interfaces (IUI) conference.