Ctrl-labs’ armband lets you control computer cursors with your mind
Controlling a mouse pointer with your mind may sound like science fiction, but Ctrl-labs, a startup based in New York City, is working hard to make it a reality.
I recently swung by the company’s new digs in Manhattan — a high-rise suite overlooking Herald Square, a few blocks south of the Theater District. It had been two weeks since Ctrl-labs’ employees had moved into the Midtown office, lead scientist Adam Berenzweig told me, and the smell of fresh paint still hung in the air.
“We haven’t finished unpacking the furniture,” he said.
Ctrl-labs can afford the upgrade. In May, the company raised $28 million in a round led by Lux Capital and GV (formerly Google Ventures), the venture capital arm of Alphabet (Google’s parent company). The two join a long, growing list of high-profile backers that includes the Amazon Alexa Fund, Paul Allen’s Vulcan Capital, Peter Thiel’s Founders Fund, Tim O’Reilly, Slack founder and CEO Stewart Butterfield, Warby Parker CEO Dave Gilboa, and others.
What convinced those tech luminaries to fund the three-year-old neuroscience and computing startup, I’d soon find out, feels a little bit like magic.
Finding the neural link
Thomas Reardon, the founder and CEO of Ctrl-labs (formerly Cognescent), was something of a child prodigy. He took graduate-level math and science courses at MIT while in high school and spearheaded a project at Microsoft that became Internet Explorer. A few years later, he enrolled in Columbia University's classics program, and he went on to study neuroscience and behavior there, earning his Ph.D.
It was in 2015 at Columbia that Reardon, along with fellow neuroscientists Patrick Kaifosh and Tim Machado, conceived of Ctrl-labs and its lofty mission statement: “to answer the biggest questions in computing, neuroscience, and design.” After three years of research and development, the team produced its first product: an armband that reads signals passing from the brain to the hand.
The armband — a bound-together collection of small circuit boards, each soldered to gold contacts meant to adhere tightly to forearm skin — is very much in the prototype stages. A ribbon cable connects the contacts to a Raspberry Pi in an open plastic enclosure, which in turn connects wirelessly to a PC running Ctrl-labs’ software framework.
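For a sense of how simple that data path is, here's a rough Python sketch of the Pi side of such a pipeline. To be clear, this is an illustration under assumptions, not Ctrl-labs' software: the transport, the framing, the address, and the read_samples() stub are all invented.

```python
# Hypothetical sketch of the prototype's described data path: frames read
# on a Raspberry Pi are streamed wirelessly to a PC. Transport, framing,
# sample rate, and read_samples() are assumptions for illustration only.
import socket
import struct
import time

PC_ADDR = ("192.168.1.10", 9000)  # assumed address of the receiving PC

def read_samples():
    """Stand-in for reading one 16-channel frame from the armband's ADC."""
    return [0.0] * 16

def stream():
    # Open a TCP connection to the PC and push packed frames continuously.
    with socket.create_connection(PC_ADDR) as sock:
        while True:
            frame = read_samples()
            # 16 little-endian floats per frame; the PC unpacks the same way.
            sock.sendall(struct.pack("<16f", *frame))
            time.sleep(1 / 2000)  # assumed 2 kHz sample rate
```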
It’s deceptively unsophisticated.
Berenzweig thinks of the armband as an interface, much like a keyboard or mouse. But unlike most peripherals, it uses differential electromyography (EMG), a technique whose underlying effect was first observed in 1666 by the Italian physician Francesco Redi, to translate mental intent into action.
How does it do that? By measuring the changes in electrical potential caused by impulses traveling from the brain to the hand muscles along motor neurons. This information-rich pathway in the nervous system comprises two parts: upper motor neurons, which connect directly to the brain's motor center, and lower motor neurons, whose axons map to individual muscles and muscle fibers. Impulses travel the length of that pathway and, via neurotransmitters released at the muscle, switch individual fibers on and off, the biological equivalent of binary ones and zeros.
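To make that binary analogy concrete, here's a toy Python illustration (emphatically not Ctrl-labs' pipeline) that treats motor-unit firing as on-or-off events by thresholding a noisy synthetic trace. Every number in it is invented.

```python
# Toy illustration of the "biological binary" idea: a motor unit is
# either firing or it isn't. All data and thresholds here are synthetic.
import numpy as np

rng = np.random.default_rng(seed=0)

fs = 2000                       # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)   # one second of samples

# Synthetic EMG: baseline noise plus a few action-potential-like spikes.
emg = 0.05 * rng.standard_normal(t.size)
for st in [0.2, 0.45, 0.7]:          # spike times in seconds
    i = int(st * fs)
    emg[i:i + 20] += np.hanning(20)  # crude spike shape

# Binary view: above threshold means the unit is firing.
threshold = 0.3                      # assumed detection threshold
firing = (np.abs(emg) > threshold).astype(int)

# Collapse consecutive above-threshold samples into discrete events.
events = np.flatnonzero(np.diff(firing) == 1) / fs
print("detected firing events (s):", np.round(events, 2))
```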
The armband is quite sensitive to these impulses. Before Berenzweig kicked off a demo, he made sure to put distance between the armband and a nearby metal pushcart.
“It acts like an antenna,” he said, “so it’s susceptible to interference.” (Newer versions of the armband don’t have this problem.)
Sixteen electrodes monitor the motor neuron signals amplified by the muscle fibers of motor units, and a machine learning algorithm trained with Google's TensorFlow distinguishes between the individual pulses of each nerve.
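Ctrl-labs hasn't published its model, but a minimal sketch of what a TensorFlow classifier over 16-channel EMG windows could look like, with the architecture, window size, and gesture classes all assumed, runs along these lines:

```python
# Minimal sketch of a TensorFlow classifier over 16-channel EMG windows.
# The article says the algorithm is trained with TensorFlow; everything
# else here (window size, classes, architecture) is an assumption.
import numpy as np
import tensorflow as tf

WINDOW = 200   # assumed samples per window
CHANNELS = 16  # one per electrode
CLASSES = 5    # e.g., five finger movements (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
    # 1-D convolutions can pick up brief, spike-like pulses per channel.
    tf.keras.layers.Conv1D(32, kernel_size=9, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(64, kernel_size=9, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data; real training would use labeled EMG recordings.
x = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
y = np.random.randint(0, CLASSES, size=256)
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
```

A convolutional front end is one plausible choice because the pulses Berenzweig described are brief and localized in time; the company's actual model may be entirely different.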
Berenzweig, who had put on an armband before I arrived, showed me an EKG-like graph on a PC, with a colored line for each contact. As he lifted a digit, one of the lines showed a slight tremor. Then he let his hand rest at his side, motionless. The line showed a tremor once again.
The wondrous thing about EMG, Berenzweig explained, is that it works independently of muscle movement; generating a brain activity pattern that Ctrl-labs' tech can detect requires no more than a neuron firing down an axon, or what neuroscientists call an action potential.
That puts it a class above wearables that use electroencephalography (EEG), a technique that measures electrical activity in the brain through contacts pressed against the scalp. EMG devices draw on the cleaner, clearer signals of motor neurons, and as a result are limited only by the accuracy of the software's machine learning model and the snugness of the contacts against the skin.
That’s not to suggest EMG devices have been perfected. In 2013, Waterloo, Ontario-based Thalmic Labs began shipping an EMG armband — the Myo — that can detect muscle movements, recognize gestures and joint motion, and map neural signals to keys on a keyboard and video game hotkeys. But many of the device’s less-than-stellar reviews mention the inconsistency of its gesture recognition.
Ctrl-labs prototyped its machine learning algorithms with Myo before developing its own hardware, and Berenzweig personally owns one. But the current iteration of Ctrl-labs’ armband is far more precise than the Myo and can work anywhere on the forearm or upper arm. Future versions will work on the wrist.
Berenzweig explained this to me as he typed a few commands into a Linux terminal and fired up the first demo. A likeness of a human hand appeared onscreen, and as Berenzweig moved his fingers, the digital doppelganger mirrored their movements.
Then he strapped the armband onto my arm. I had worse luck: the thumb on the computerized hand reflected the motions of my own, but the index and pinkie fingers remained stiff. Berenzweig had me recalibrate the system by angling my wrist slightly, but to no avail.
He chalked this glitch up to the demo’s generalized machine learning model. Experimental versions of the software, he said, are performing much better.
In a second demo, I watched as Berenzweig moved a computer cursor toward a target. Unlike the first, this demo actively trains a neural net on the user's movements, tuning the system to each person's neural idiosyncrasies.
When my turn came again, I wasn't exactly sure how to control it. But after a trepidatious start in which the cursor made maddening laps around the target, coming close but never quite touching it, the algorithm, and with it my precision, improved drastically. Within just a few seconds, moving the cursor with thought became almost second nature, and I was able to steer it up, down, left, and right by thinking about moving, but not actually moving, my hand.
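Ctrl-labs hasn't described that training loop, but the experience suggests a form of online adaptation in which each movement toward the target supplies a label for a small gradient step. Here's a hedged sketch under exactly those assumptions; the model size, feature dimension, and update rule are all guesses.

```python
# Hedged sketch of per-user online adaptation, as the cursor demo
# suggests: the direction toward the target serves as a training label
# after every movement. Shapes and the update scheme are assumptions.
import numpy as np
import tensorflow as tf

FEATURES = 64  # assumed size of a per-window EMG feature vector

# Tiny regressor from EMG features to a 2-D cursor velocity.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(FEATURES,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2),  # (dx, dy)
])
opt = tf.keras.optimizers.Adam(1e-3)
loss_fn = tf.keras.losses.MeanSquaredError()

cursor = np.zeros(2, dtype="float32")
target = np.array([1.0, 0.5], dtype="float32")

for step in range(100):
    feats = np.random.randn(1, FEATURES).astype("float32")  # stand-in EMG
    intended = (target - cursor)[None, :]  # direction the user wants to go
    with tf.GradientTape() as tape:
        pred = model(feats, training=True)
        loss = loss_fn(intended, pred)
    # One small gradient step per movement: the "learning" I felt happen.
    grads = tape.gradient(loss, model.trainable_variables)
    opt.apply_gradients(zip(grads, model.trainable_variables))
    cursor += 0.1 * pred.numpy()[0]  # move the cursor by predicted velocity
```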
Berenzweig believes this kind of algorithmic learning, which is crucial to the system’s accuracy, could be gamified in other ways. “We’re trying to find the right way to approach it,” he said.
An eye on VR — and smartphones
Ctrl-labs’ armband won’t be relegated to the lab for much longer. By the end of this year, the company plans to ship a developer kit in small quantities and make available software that will expose the band’s raw signals. The final design is in flux, and at least a few devices will be manufactured in-house.
Pricing hasn’t been decided, though Berenzweig said the developer kit will initially cost more than the eventual commercial model.
Around the corner from the demo and adjacent to a room with a MakerBot (which the team uses to quickly prototype shells), Berenzweig showed me a poster board of concepts and potential form factors. Some looked not unlike Android Wear smartwatches. And while the developer kit will have to be tethered to a PC for some processing, he said, the processing overhead is modest enough that all of the hardware will eventually be self-contained.
As for what Ctrl-labs expects its early adopters to build with it and for it, video games top the list — particularly virtual reality games, which Berenzweig thinks are a natural fit for the sort of immersive experiences EMG can deliver. (Imagine swiping through an inventory screen with a hand gesture, or piloting a fighter jet just by thinking about the direction you want to fly.)
But Ctrl-labs is also thinking smaller. Not too long ago, the company demonstrated to Wired a virtual keyboard that maps finger movements to PC inputs, allowing a wearer to type messages by tapping on a tabletop. And at the 2018 O’Reilly AI conference in New York City, Reardon spoke about text messaging apps for smartphones and smartwatches that let you peck out replies one-handed. Berenzweig, for his part, has experimented with control schemes for tabletop robotic arms.
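None of those demos' internals are public, but in spirit a virtual keyboard reduces to a tap classifier plus a finger-to-key table. A hypothetical Python sketch, with the mapping and the classify_tap() stub invented purely for illustration:

```python
# Hypothetical sketch in the spirit of the virtual-keyboard demo.
# The finger-to-key mapping and classify_tap() are invented, not
# Ctrl-labs' design.
FINGER_TO_KEY = {
    "thumb": " ",
    "index": "e",
    "middle": "t",
    "ring": "a",
    "pinkie": "o",
}

def classify_tap(emg_window):
    """Stand-in for a real EMG tap classifier; returns a finger label."""
    return "index"

def type_from_taps(windows):
    # Each classified tap emits the character assigned to that finger.
    return "".join(FINGER_TO_KEY[classify_tap(w)] for w in windows)

print(type_from_taps([None] * 3))  # -> "eee"
```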
“You know how early versions of Windows used to ship with Minesweeper, and Windows sort of became known for it? We need to find our Minesweeper,” he said.
One field of research Ctrl-labs won’t be investigating is health care — at least not at first. While Berenzweig agrees that the tech could be used to help stroke victims and people with degenerative neural diseases like amyotrophic lateral sclerosis (ALS), he says those aren’t applications the company is actively exploring. Ctrl-labs is loath to submit its hardware for approval by the U.S. Food and Drug Administration, a potentially years-long process. (Reardon’s stated goal is to get a million people using the armband within the next three to four years.)
“We’re focusing on consumers right now,” Berenzweig said. “We think it has medical use cases, but we want it to be a consumer product.”
By the time Ctrl-labs hits retail store shelves, it’ll likely have competition. Thalmic Labs is developing a second-generation EMG armband, and Neuralink Corp., a new venture funded by SpaceX and Tesla head Elon Musk, aims to develop mass-market implants that treat mood disorders and help physically disabled people regain mobility.
Not to be outdone, Facebook is researching a kind of telepathic transcription that taps the brain’s speech center. At the MIT Media Lab conference in September 2017, project lead Mark Chevillet told the audience that the company plans to detect brain signals using noninvasive sensors and diffuse optical tomography. Effectively, this would allow a user to type words simply by thinking them.
Berenzweig is convinced that Ctrl-labs’ early momentum, plus the robustness of its developer tools, will help it gain an early lead in the brain-machine interface race.
“Speech evolved specifically to carry information from one brain to another. This motor neuron signal evolved specifically to carry information from the brain to the hand to be able to effect change in the world, but unlike speech, we have not really had access to that signal until this,” he told Wired in September 2017. “It’s as if there were no microphones and we didn’t have any ability to record and look at sound.”