Early in the 21st century, Amazon advanced delivery evolution into a near-automated stage – with employees becoming virtually identical to robots. They were used in warehouses as underpaid and exploited labour for the hazardous work of capitalism, consumerism and convenience. After revelations of inhumane working conditions, injuries and deaths, Amazon was declared one of the most dangerous employers in the world. AI programs were developed to monitor, report on and fire any failing human workers.
This was not called illegal exploitation or capitalist torture.
This was called ‘business’.
Here, at the edge of dystopia, time has finally caught up to Ridley Scott’s Blade Runner (1982). Critiques of Scott’s accuracy in near-future prophecy usually focus on the lack of hovercars and the fact that android sophistication isn’t much more advanced than the ice-cream scooping robot team in Federation Square. The other possibility, of course, is that we can’t see the future clearly because we are looking at it from the other side.
The most striking aspects of the film in the collective memory are probably the future-noir aesthetic and the soundtrack by Vangelis. If you watched the first cinematic release from 1982, you may also remember the lacklustre Harrison Ford voice-over and an inexplicably soppy ending for a cyberpunk dystopia. Ford played the titular ‘blade runner’, Rick Deckard, hunting rebellious Tyrell Corporation androids (replicants) through the rain-soaked streets and towering megastructures of future-LA. He retires (kills) them one by one, except for Rachael (Sean Young), with whom he falls in love and escapes.
Blade Runner was an 80s vision of the future based on a 60s vision of the future: Philip K. Dick’s 1968 novel, Do Androids Dream of Electric Sheep? Scott’s contrasts of pessimistic retro-futurism, jazz electronica, and cyberpunk noir cause the past and future to collide with melancholic effect. It’s a weary, lived-in futurism. The music is haunted with fragments of crooning or lone arias. The design is full of historical echoes, faded grandeur, pyramids and Frank Lloyd Wright’s Mayan Revival interiors alongside zeppelins and Coca-Cola and Budweiser branding. It’s a world in deterioration, a collapse of time. This dystopic vision becomes practically its own verb tense: a future-past-imperfect.
Roger Ebert wasn’t as big a fan of the original Blade Runner as he was of the Final Cut (2007), the vision of the film as Scott originally intended. But then, Ebert had the benefit of hindsight in observing Blade Runner’s successive waves of influence:
This is a seminal film, building on older classics like Metropolis (1926) or Things to Come, but establishing a pervasive view of the future that has influenced science fiction films ever since. Its key legacies are: Giant global corporations, environmental decay, overcrowding, technological progress at the top, poverty or slavery at the bottom – and, curiously, almost always a film noir vision. Look at Dark City, Total Recall, Brazil, 12 Monkeys or Gattaca and you will see its progeny.
Unlike the studio ending, in the Final Cut you are not meant to feel content. The status quo has not changed, the philosophical questions have no answer, the humans may not be human. Ford clashed with Scott over this ending, which strongly implied that Deckard was a replicant:
That was the main area of contention between Ridley and myself … I thought the audience deserved one human being on screen that they could establish an emotional relationship with.
This adds tension to Deckard’s outburst about Rachael: ‘How can it not know what it is?’ and to her question as to whether he had ever taken the Voight-Kampff test (the official method of ascertaining humanity by provoking empathetic response). Empathy is at the core of Blade Runner, Dick’s original novel and posthumanist thought in general.
Unlike in Dick’s novel, Scott’s androids are not incapable of empathy: they are simply created without it, and their shortened life-span of four years is a safeguard to prevent them from developing emotions that would make them indistinguishable from humans. Why? Because that would undermine their status as Other. Proximity to humanity would make their slavery and murder ethically questionable. In his 2015 retrospective of Blade Runner, Michael Newton observed: ‘We are told that the replicants can do everything a human being can do, except feel empathy. Yet how much empathy do we feel for faraway victims or inconvenient others?’ Scott’s insistence that we connect with the fear and sorrow of the android may ultimately be because we were never meant to identify as human: we are the replicant.
Shortly after Blade Runner’s release, Donna Haraway published ‘A Cyborg Manifesto’ (1985), in which she presented the ‘cyborg’ as an identity for an increasingly technologically augmented world.
… certain dualisms have been persistent in Western traditions; they have all been systemic to the logics and practices of domination of women, people of color, nature, workers, animals—in short, domination of all constituted as others, whose task is to mirror the self. Chief among these troubling dualisms are self/other, mind/body, culture/nature, male/female, civilized/primitive, reality/appearance, whole/part, agent/resource, maker/made, active/passive, right/wrong, truth/illusion, total/partial, God/man… High-tech culture challenges these dualisms in intriguing ways. It is not clear who makes and who is made in the relation between human and machine. It is not clear what is mind and what is body in machines that resolve into coding practices … we find ourselves to be cyborgs, hybrids, mosaics, chimeras. Biological organisms have become biotic systems, communications devices like others. There is no fundamental, ontological separation in our formal knowledge of machine and organism, of technical and organic.
Blade Runner’s legacy is this same anxiety of human/AI identity that bequeathed us Ghost In The Shell (1995), Her (2013) and Ex Machina (2014), among other media.
Normally, video encoding happens through an automated process using a compression standard developed by humans, who decide what the parameters should be — how much data should be compressed into what format, and how to package and reduce different kinds of data like aspect ratio, sound, metadata… Artist and researcher Terence Broad wanted to teach an artificial neural network how to achieve this video encoding process on its own, without relying on the human factor.
With each attempt, the ‘autoencoder’ clarified and refined the image, an AI repeatedly playing out Deckard’s struggle with his own humanity. Broad published his findings in 2017 under the title Autoencoding Blade Runner: Reconstructing Films with Artificial Neural Networks. He noted the unique ‘hazy, dreamlike’ reconstruction of the ‘disembodied gaze’. An autoencoder dreams of Ridley Scott’s dream of Do Androids Dream Of Electric Sheep? As a loose simulation of the human brain and nervous system, the autoencoder is a hybrid, perhaps a monstrosity. Algorithms are created in our image, learning from us as children learn from parents.
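The principle of an autoencoder can be pictured with a toy example. The sketch below is not Broad’s actual pipeline (he trained a far more sophisticated variational model on real Blade Runner frames); it is a minimal single-hidden-layer network on synthetic ‘frames’, showing the loop his work describes: squeeze each frame through a narrow bottleneck, reconstruct it, measure the error, and nudge the weights to shrink that error. All names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "film frames": 8x8 greyscale images flattened to 64 values,
# generated from a 4-dimensional latent source so there is genuine
# structure for the network to compress. (Broad used actual frames.)
latent = rng.random((32, 4))
mixing = rng.normal(0.0, 1.0, (4, 64))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

frames = sigmoid(latent @ mixing)   # shape (32, 64), values in (0, 1)

n_input, n_hidden = 64, 8           # bottleneck: 64 pixels -> 8 numbers
lr = 0.5
W_enc = rng.normal(0.0, 0.1, (n_input, n_hidden))
W_dec = rng.normal(0.0, 0.1, (n_hidden, n_input))

errors = []
for epoch in range(500):
    code = sigmoid(frames @ W_enc)   # encode: compress each frame
    recon = sigmoid(code @ W_dec)    # decode: reconstruct the frame
    err = recon - frames
    errors.append(float(np.mean(err ** 2)))
    # Backpropagate the mean-squared reconstruction error.
    d_recon = err * recon * (1.0 - recon)
    d_code = (d_recon @ W_dec.T) * code * (1.0 - code)
    W_dec -= lr * (code.T @ d_recon) / len(frames)
    W_enc -= lr * (frames.T @ d_code) / len(frames)

print(f"MSE: {errors[0]:.4f} -> {errors[-1]:.4f}")
```

Each pass yields a blurrier, ‘hazier’ version of the input than a human-designed codec would: the network keeps only what it has learned matters, which is precisely the dreamlike quality Broad observed.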
The replicant ‘villain’ Roy Batty (Rutger Hauer), on his mission to confront his creator Eldon Tyrell (Joe Turkel), is Frankenstein’s creature questioning his maker about the purpose and pain of a brief existence. The monster’s fury is relatable: humans fear death, and whether it’s four years, forty or four-score, arriving into life with your return ticket already punched is a reasonable source of anxiety. And what did Tyrell owe his creations? Eric Schwitzgebel argues that, if we were to make sentient AI, we would owe them even greater moral obligations than we do human strangers. We can perhaps see how Tyrell deserves his grisly fate for betraying his children.
Ford’s throwaway quip that ‘[Blade Runner is] a film about whether you can have a meaningful relationship with your toaster’ might seem like his standard cockiness, but it’s oddly close to the mark. The questions of Blade Runner aren’t merely philosophical, but emotional. They are not cold queries about the definition of life, but concerns with how love and connection define us as a network rather than as individuals. We’ve always been fascinated by automata, and interested in the potential for artificial forms of life to inspire empathy, sexual attraction, and love.
Another aspect of Dick’s original novel is a posthumanist concept: the unity and continuum of all living things. The religion of Dick’s sci-fi future, Mercerism, connects people to simultaneous empathy experiences through a machine. The visions centre on the figure of Wilbur Mercer, an illusory Christ-figure who, in Sisyphean fashion, toils up an endless hill until he tumbles down into a pit of dead animals, feeling their extinguished life and bodies as though they are his own. It is the connection of all life and suffering, shared sorrow and pain. These humans derive meaning from their reverence for living things and their ability to care for them, right down to the spider.
But what of those incapable of this connection? Consider, perhaps, Scott Morrison and his empathy consultant, trying to pass the Voight-Kampff test of the public eye, or the anthropocentric blind spots and silences in our empathy. Our country burns, and we mourn the rising human death toll – but what of our long, terrible destruction of our landscape and fauna? What of the life extinguished, the species lost, the land poisoned? Collectively, we lack ecological empathy, which means we do not see the suffering of the natural world as our own pain: we are distinct from it, holding ourselves at the pinnacle of a hierarchy, a dominion over nature. Extinction is not our problem. The bees are not our burden. Meat is not murder: it is, perhaps, retirement.
Sci-fi dystopian futures are now. We teeter on the edge of collapse. Erosion of freedoms, deepening inequality, authoritarianism, failing international pacts of unity, overpopulation, waters that rise and land that burns and the people who flee to survive so they can die in the back of lorries, abandoned like stolen merchandise. Humans are dehumanised, made monster, less valued than Siri or Alexa: people of colour, queer people, women, the disabled and mentally ill and any person whose body is controlled or institutionalised or made utterly illegal.
Those of us who are such Others are not the human protagonists. We’re the toaster.