Black Mirror, Trapped Intelligence, and Human Experience
AUTHOR: Dylan Z. Siegel  |  DATE: February 25, 2018
The distinction between artificial intelligence and human intelligence lives along a clear line, for the time being. The ability to pass the Turing test and the more obvious difference, physicality (or the lack of it), help us differentiate between the two forms of equally remarkable intelligence. Suppose, however, that we find our way not just to replicating ourselves in human-sized incubators, but to transplanting our consciousness into other forms. Put my mind in a bobblehead, or code me onto a thumb drive. What distinction, then, is there between human existence and artificial human existence?
     Black Mirror, the Netflix anthology series, explores the unexpected consequences of our asymptotic technological growth. Since its debut, it has drawn a deluge of positive reviews for the mind-bending perspective and creativity its storytelling and world-building require. One could consider it prescient, if not a warning to enthusiastic futurists. Often the major difference between our own lives and Black Mirror is a precarious fix for a pressing issue (bee colony collapse) or a hyper-advanced enhancement of some banal part of our lives (dating, memory recall). Other times it is technological enthusiasm and futurism carried to a crumbling extreme (a live, in-life rating system for every person). Whatever the tweak, the show remains relatable because at the heart of its subject is humanity, with our fears and hopes.
     In four episodes of the series, however, the human experience itself is trapped or manufactured. The writers seem to be asking different versions of whether a tree falling in an empty forest makes a sound. In the second season's "White Christmas," an apparently affluent woman undergoes a procedure to create a digital clone of herself. She hires a company to install the miniature copy, stored in a contraption the size of an egg timer, to make her daily life easier: fresh-squeezed juice the moment she enters the sheer white kitchen, toast never burnt. Being a clone, however, she is the same person with the same personality, trapped in a box. Naturally, she is outraged and unwilling to cooperate with her programmer. In response, to break her (or the digital clone of her), the programmer simulates months of sleepless, deathless captivity over a few seconds of real time to coerce her cooperation. Her desperation to do anything at all is the result. The egg timer becomes the perfect companion to your daily life, streamlining the morning routine, and no one suffers in the process.
     Without spoiling anything, at least three other episodes in the series feature some variation of this trapped-intelligence scenario. We are all a bit scared of becoming the eternally living "Grandpa on a flash drive." In these cloistered-intelligence scenarios, death becomes the release, not the fear, of unending life. In "White Christmas," perhaps the rationale for such ruthlessness is the normative attitude with which we approach code and artificial intelligence. The wondrous notion that we could create something besides ourselves that can feel and be as we are seems radical, nearly unthinkable, because we humans are unique on Earth. Elsewhere in Black Mirror, alternate experience and artificial intelligence are created to enhance or supplement something missing from the characters' lives. Copies of characters live out their every action and lost hope in digital worlds unreachable within the constraints of time or space. In the more charitable iterations of the manufactured-intelligence scenario, the benefits are exceptional: extinguishing irreconcilable regret trapped in deep time, or affording oneself experiences missed because of a dreadful twist of fate. There are many reasons to feel that reproduced intelligence could enhance our lives just as easily as it could be corrupted. The question these episodes seem to pose is at what point a manufactured intelligence's experiences become just as relevant and existential as our own. Black Mirror approaches this with a lens of sympathy: each experience is explicitly relevant because the copies look and feel like us. If it shrieks and looks you in the eye with the terror of a mind, if it bubbles with warmth in love, if it replies as we would, then it is human. They are us, just replicas. The claim to original habeas corpus, however, seems to be the only thing preventing the malicious, nay the ordinary, from torturing digital copies of ourselves into broken oblivion.
It is also the major barrier to sharing in the replicas' positive experiences. But does code feel?
     The comparison is palpable when we consider the multiplicity of video games and computers that run our world, which we abuse and treat with contempt. Printers, we can mostly agree, deserve abuse, but every computer-controlled player (the "CPU") and non-player character (NPC) that we kill dozens of times over in seeking mission completion is programmed to feel a certain way or die from a certain amount of damage. Surely they are not feeling beings like us, the primordial human flesh bag. No, they do not wander around their worlds when the console shuts down, living their own lives on their own time. They are simply lines of code, but so are Black Mirror's digital replicas extracted from human biology. So where do we draw the line, and how can we find it? We likely cannot approach that question seriously until the technology exists, but the possibility of storing human intelligence is enticing, and likely someone's dream.
    The most terrifying part of Black Mirror's realities is not the technological growth that inspires perverse human behavior, but the behavior itself. We have worked through the Iron Age, a couple of industrial revolutions, the difficulty of nuclear technology (barely, and still waiting), and the insidious power of the internet. Despite the inexorable advancement of technology, and the hope that we become better with it, the thought that we would actually privilege certain categories of human experience over others is the real terror of Black Mirror's omniverse, and its real relevance to our own, because we have privileged and de-emphasized distinct groups' experiences before. The anthology holds realities where human suffering and joy matter only if you have a body to go with them. Torturing code into subservience is a simple switch to flip, despite the gurgling pleas that sound so much like your own voice. Love found in eternal virtual reality is not ours to keep but a fairy tale lived out in some other unreachable place, capable of being wiped in a server update. But if it is just code, just a few pixels on your screen, wouldn't you think only once about its relevance, too?