When I tell someone in customer service, “Thank God I got a human being,” I mean this as a compliment. We have all seen how maddeningly stupid a programmed AI can be. What those of us outside tech tend to forget—unless we have been the victim of an algorithm run amok—is how easily the damage can exceed frustration.
People have muttered doom for decades, imagining scenarios that could be scripted into a dystopian SF film. But a lot of science fiction is now with us as fact, and the machine errors track right along with human failings. Early on, Google’s image recognition technology labeled Black people as gorillas. In a 2018 study, commercial facial-analysis systems misclassified dark-skinned women at error rates as high as 34 percent but light-skinned men at only 0.8 percent. Twitter trolls quickly taught Microsoft’s chatbot Tay to be racist and misogynistic; the second iteration was stiffly politically correct and judgmental. A ProPublica investigation found that an algorithm used by judges and parole officers in Florida was flagging Black defendants as more prone to recidivism than they actually were, and White defendants as less prone than they actually were.
Have you ever noticed how often robots themselves are white? Or how often AI researchers and venture capitalists are White males? Their power is slipping away. They need a new planet, fresh genius, a way to be invulnerable.
And we are all their guinea pigs.
Consider Elon Musk. Last year, he tweeted news of an upcoming demo that would “blow everybody’s mind”: neurons firing inside a living brain implanted with his Neuralink.
The Neuralink is designed to let us control a computer or mobile device with our minds. Delicate filaments loaded with electrodes are inserted into the brain—by a robotic system, because they are too fine to be placed by a human hand. The filaments connect to an implant that is controlled remotely from outside the body.
In this case, the living brain belonged to a small pig named Gertrude who, fascinated by something off-camera, refused to enter the enclosure. The streamed demo was perfect reality TV, with Musk making increasingly panicked suggestions, laughing nervously, saying, “All right, this might take a second . . . ”
Warm, solid pink flesh had outwitted his machine.
The crowd waited, silent. After long minutes, an assistant managed to coax Gertrude forward with a treat. And it turned out this was all that was meant to happen: We were to see Gertrude, “a happy healthy pig,” and hear beeps, signals from a disc the size of a silver dollar that rested atop the folds of her brain. Whenever she touched something with her velvety little snout, the electricity spiked and the beeps sped up. The proof was in the pig.
Proof of exactly what, though? In the near future, Musk predicts, people will plug in before they fall asleep, powering up their implants. He expects his Neuralink to solve a host of brain and spine problems, including memory loss, addiction, extreme pain, seizures, brain damage, and strokes. Eventually, he says, it will achieve “symbiosis with artificial intelligence,” which he believes will otherwise destroy us.
At the moment, though, his magic microwires are at best temporary; they cannot weather the “corrosive” environment of a living brain. If researchers cannot fix that glitch, those of us lucky enough to be wired up will have to climb back into his gleaming white (of course) helmet for another robotic surgery, and another, and another . . .
Meanwhile, another wee glitch: Our brain chip could be hacked.
Call it digital death. Or brainjacking. Scientists in Belgium found that a wireless brain implant, even the kind currently used medically for deep brain stimulation, could easily be hacked using off-the-shelf materials. A Neuralink would leave us just as vulnerable, and good old-fashioned human malevolence would do the rest. By taking remote control of your brain implant, someone could change the voltage, thereby interrupting sensation, changing behavior, causing disability, or even killing you.
There goes the happy, healthy piggie.
Read more by Jeannette Cooperman here.