It first appeared on March 9 as a tweet on Andrew Bosworth’s timeline, the tiny corner of the Internet that offers a rare glimpse into the mind of a Facebook executive these days. Bosworth, who leads Facebook’s augmented and virtual reality research labs, had just shared a blog post outlining the company’s 10-year vision for the future of human-computer interaction. Then, in a follow-up tweet, he shared a photo of an as yet unseen wearable device. Facebook’s vision for the future of interacting with computers apparently would involve strapping something that looks like an iPod Mini to your wrist.
Facebook already owns our social experience and some of the world’s most popular messaging apps—for better or, notably, worse. Anytime the company dips into hardware, then, whether that’s a very good VR headset or a video chatting device that follows your every move, it gets noticed. And it sparks not only intrigue but also questions: Why does Facebook want to own this new computing paradigm?
In this case, the unanswered questions are less about the hardware itself and more about the research behind it—and whether the new interactions Facebook envisions will only deepen our ties to Facebook. (Answer: probably.) In a media briefing earlier this week, Facebook executives and researchers offered an overview of this tech. In simplest terms, Facebook has been testing new computing inputs using a sensor-filled wrist wearable.
It’s an electromyography device, which means it translates electrical motor nerve signals into digital commands. When it’s on your wrist, you can just flick your fingers in space to control virtual inputs, whether you’re wearing a VR headset or interacting with the real world. You can also “train” it to sense the intention of your fingers, so that actions happen even when your hands are totally still.
This wrist wearable doesn’t have a name. It’s just a concept, and there are different versions of it, some of which include haptic feedback. Bosworth says it could be five to 10 years before the technology becomes widely available.
All of this is tied to Facebook’s plans for virtual and augmented reality, technologies that can sometimes leave the user feeling a distinct lack of agency when it comes to their hands. Slip on a VR headset and your hands disappear completely. By picking up a pair of hand controllers, you can play games or grasp virtual objects, but then you lose the ability to take notes or draw with precision. Some AR or “mixed reality” headsets like Microsoft’s HoloLens have cameras that track spatial gestures, so you can use certain hand signals and the headset will interpret those signals … which sometimes works. So Facebook has been using this EMG wearable in its virtual reality lab to see if such a device might enable more precise hand-computer interactions.
But Facebook has visions for this wrist tech beyond AR and VR, Bosworth says. “If you really had access to an interface that allowed you to type or use a mouse—without having to physically type or use a mouse, you could use this all over the place.” The keyboard is a prime example, he says; this wrist computer is just another means of intentional input, except you can carry it with you everywhere.
Bosworth also suggested the kitchen microwave as a use case—while clarifying that Facebook is not, in fact, building a microwave. Home appliance interfaces are all different, so why not program a device like this to understand, simply, when you want to cook something for 10 minutes on medium power?
In the virtual demo Facebook gave earlier this week, a gamer was shown wearing the wrist device and controlling a character in a rudimentary video game on a flat screen, without having to move his fingers at all. These kinds of demos tend to (pardon the pun) gesture toward mind-reading technology, which Bosworth insisted this is not. In this case, he said, the mind is generating signals identical to the ones that would make the thumb move, but the thumb isn’t moving. The device is recording an expressed intention to move the thumb. “We don’t know what’s happening in the brain, which is full of thoughts, ideas, and notions. We don’t know what happens until someone sends a signal down the wire.”
Bosworth also emphasized that this wrist wearable is different from the invasive implants that were used in a 2019 brain-computer interface study that Facebook worked on with the University of California at San Francisco; and it’s different from Elon Musk’s Neuralink, a wireless implant that could theoretically allow people to send neuroelectrical signals from their brains directly to digital devices. In other words, Facebook isn’t reading our minds, even if it already knows a heck of a lot about what’s going on in our heads.
Researchers say there’s still a lot of work to be done in the area of using EMG sensors as virtual input devices. Precision is a big challenge. Chris Harrison, the director of the Future Interfaces Group in the Human-Computer Interaction Lab at Carnegie Mellon University, points out that each individual human’s nerves are a little bit different, as are the shapes of our arms and wrists. “There’s always a calibration process that has to happen with any muscle-sensing system or BCI system. It really depends on where the computing intelligence is,” Harrison says.
And even with haptic feedback built into these devices, as Facebook is doing with some of its prototypes, there’s the risk of visuo-haptic mismatches, where the user’s visual experience—whether in AR, VR, or real space—does not correlate with the haptic response. These points of friction can make human-computer interactions feel frustratingly unreal.
Even if Facebook can overcome these obstacles in its research labs, there’s still the question of why Facebook—largely a software company—wants to own this new computing paradigm. And should we trust it, a hugely powerful tech company with a track record of sharing user data in “exchange for other equally or more valuable things,” as WIRED’s Fred Vogelstein wrote in 2018? A more recent report in MIT Technology Review highlights how a team at Facebook assembled to tackle “responsible AI” was undermined by leadership’s relentless quest for growth.
Facebook executives said this week that these new human-computer interaction devices will perform as much computing as possible “on device,” which means the information isn’t sent to the cloud; but Bosworth won’t commit to how much data ultimately might be shared with Facebook or how that data will be used. The whole thing is a prototype, so there’s nothing substantive to tease apart yet, he says.
“Sometimes these companies have cash piles large enough to basically invest in these huge R&D projects, and they’ll take a loss on such things if it means they can be front-runners in the future,” says Michelle Richardson, director of the Data and Privacy Project at the nonprofit Center for Democracy and Technology. “But with companies of any size, any product, once it’s built, it’s so difficult to overhaul it. So anything that can start the conversation on this before the devices are built is a good thing.”
Bosworth says Facebook wants to lead this next paradigm shift in computing because the company sees tech like this as fundamental to connecting people. If anything, this past year has shown us the importance of connecting—of feeling like you’re in person, Bosworth says. He also seems to believe he can earn the required trust by not “surprising” customers. “You say what you do, you set expectations, and you deliver on those expectations over time,” he says. “Trust arrives on foot and leaves on horseback.” Rose-colored AR glasses, activated.
This story originally appeared on wired.com.