Facebook’s Mind-Reading Interface Could Be Closer to Reality
Facebook has been financing an R&D initiative, first announced at its F8 conference in 2017, to build a brain-computer interface (BCI) that could eventually be used with its future augmented reality glasses. That mind-reading interface now seems to be getting closer to reality: the company has just released its first major update on the research project, reporting that researchers are now able to “decode small sets of full, spoken words and phrases from activity in real time.”
The breakthrough, published in the journal Nature Communications, was realized through an algorithm capable of reading the thoughts of participants suffering from brain injury. The ultimate goal of the research is a system that can decode silent speech without implanting electrodes in a person’s brain.
Today we’re sharing an update on our work to build a non-invasive wearable device that lets people type just by imagining what they want to say. Our progress shows real potential in how future inputs and interactions with AR glasses could one day look. https://t.co/ilk192GwAR
— Boz (@boztank) July 30, 2019
Researchers from the University of California, San Francisco have been partnering with Facebook on the project, and they say they have reached a major milestone toward the goal of creating a mind-reading interface. The researchers worked with patients who had already undergone brain surgery for epilepsy.
According to Facebook, working with brain surgery patients was not as non-invasive as it had hoped. To get around this, Facebook Reality Labs is exploring less invasive methods, one of which detects brain activity by monitoring oxygen levels in the brain with a portable, wearable device built from consumer-grade parts. The mind-reading device is still bulky, unreliable, and slow, according to Facebook, but the researchers are hopeful it may one day find its way into Facebook’s virtual reality headsets or its planned augmented reality wearables.
In its statement on the latest breakthrough, Facebook noted that measuring oxygenation does not by itself decode imagined sentences. But if the system could decode even a few commands, such as ‘select’, ‘home’, or ‘delete’, it could provide completely new ways of interacting with today’s virtual reality systems as well as future augmented reality glasses.
Facebook believes it can leverage the commercialization of optical technologies built for LiDAR and smartphones to create small, convenient brain-computer interface devices capable of measuring neural signals close to what implanted electrodes can record today. One day, these could be developed to the point where they can decode silent speech.
The new breakthrough adds an interesting angle to Facebook’s planned augmented reality glasses, about which little is known so far. In 2017, Michael Abrash, chief scientist at Oculus, stated that the AR glasses were still five to ten years away. Facebook hasn’t revealed much about the project, but information has gradually trickled out via patent filings. If Facebook also plans to incorporate mind-reading capabilities into the glasses, we are likely looking at a timeline of at least 10 years, which would give Facebook time to develop the BCI peripheral technology as well.
The development of mind-reading capabilities, even at a rudimentary level, raises ethical concerns, and Facebook Reality Labs says it is already weighing the ethical implications of the technology it is developing. So far, the company has yet to provide specifics on how it might address them. It is a lofty vision, but one that now seems closer to reality than previously thought.