Much of my country has erupted this week, with the senseless, brutal, daylight murder of George Floyd (another in a long, wicked history of murdering black people), resulting in massive protests around the world, false-flag inciters, and widespread police brutality, all while we are still in the middle of a global pandemic and our questionably-elected president is trying his best to use it as his pet Reichstag fire to declare martial law, or at the very least some new McCarthyism. I’m not in a mood to talk idly about sci-fi. But then I realized this particular post perfectly—maybe eerily—echoes themes playing out in the real world. So I’m going to work out some of my anger and frustration at the ignorant de-evolution of my country by pressing on with this post.

Part of the reason I chose to review Blade Runner is that the blog is wrapping up its “year” dedicated to AI in sci-fi, and Blade Runner presents a vision of General AI. There are several ways to look at and evaluate Replicants.
First, what are they?
If you haven’t seen the film, replicants are described as robots that have been evolved to be virtually identical to humans. Tyrell, the company that makes them, has a motto bragging that they are “More human than human.” They look human. They act human. They feel. They bleed. They kiss. They kill. They grieve their dead. They are more agile and stronger than humans, and approach the intelligence of their engineers (so, you know, smart). (Oh, also there are animal replicants, too: a snake and an owl in the film are described as artificial.)
Most important to this discussion is that the opening crawl states very plainly that “Replicants were used Off-world as slave labor, in the hazardous exploration and colonization of other planets.” The four murderous replicants we meet in the film are rebels, having fled their off-world colony to come to Earth in search of a way to cure themselves of their planned obsolescence.
Replicants as (Rossum) robots
The intro to Blade Runner explains that they were made to perform dangerous work in space. Let’s put the question of their sentience on hold for a bit and just regard them as machines made to do work for people. In this light, why were they designed to be so physically similar to humans? Humans evolved for a certain kind of life on a certain kind of planet, and outer space is certainly not that. While there is some benefit to replicants’ being able to easily use the same tools that humans do, real-world industry has had little problem building earthbound robots that are more fit to task: round Roombas, boom-arm robots for factory floors, and large cuboid harvesting robots. The opening crawl indicates there was a time when replicants were allowed on Earth, but after a bloody mutiny, having them here was made illegal. So perhaps that human form made some sense when they were directly interacting with humans, but once they were meant to stay off-world, it was stupid design for Tyrell to leave them so human-like. They should have been redesigned with forms more suited to their work. The decision to make them human-like makes it easy for dangerous ones to infiltrate human society. We wouldn’t have had the Blade Runner problem if replicants were space Roombas. I have made the case that too-human technology in the real world is unethical to the humans involved, and it is no different here.

Their physical design is terrible. But it’s not just their physical design: they are artificial intelligences, so we have to think through the design of that intelligence, too.
Replicants as AGI
Replicant intelligence is very much like ours. (The exception is that their emotional responses are—until the Rachael “experiment”—quite stunted for lack of experience in the world.) But why? If their sole purpose is the exploration and colonization of new planets, why would that require human-like intelligence? The AGI question is: why were they designed to be so intellectually similar to humans? They’re not alone in space. There are humans nearby supervising their activity and even occupying the places they have made habitable. So they wouldn’t need to solve problems like humans would in their absence. If they ran into a problem they could not handle, they could have been made to stop and ask their humans for solutions.
I’ve spoken before, and I’ll probably speak again, about overengineering artificial sentiences. A toaster should have just enough intelligence to be the best toaster it can be. Much more is not just a waste; it’s kind of cruel to the AI.
The general intelligence with which replicants were built was a terrible design decision. But by the time this movie happens, that ship has sailed.

Here we’re necessarily going to dispense with replicants as technology or interfaces, and discuss them as people.
Replicants as people
I trust that sci-fi fans have little problem with this assertion. Replicants are born and they die, display clear interiority, and have a sense of self, mortality, and injustice. The four renegade “skinjobs” in the film are aware of their oppression and work to do something about it. Replicants are a class of people treated separately by law, engineered by a corporation for slave labor, and forbidden to come to the one place where they might find a cure for their premature deaths. The film takes great pains to set them up as bad guys, but this is Philip K. Dick via Ridley Scott, and of course things are more complicated than that.

Here I want to encourage you to go read Sarah Gailey’s 2017 read of Blade Runner over on Tor.com. In short, they note that the murder of Zhora was particularly abhorrent. Zhora’s crime was being part of a slave class that had broken the law in immigrating to Earth. She had assimilated, gotten a job, and was neither hurting people nor finagling her way into bullying her maker for some extra life. Despite her impending death, she was just…working. But when Deckard found her, he chased her and shot her in the back while she was running away. (Part of the joy of Gailey’s posts is the language, so even with my summary I still encourage you to go read it.)
Gailey is a focused (and Hugo-award-winning) writer where I tend to be exhaustive and verbose. So I’m going to add some stuff to their observation. It’s true, we don’t see Zhora committing any crime on screen, but early in the film, as Deckard is being briefed on his assignment, Bryant explains that the replicants “jumped a shuttle off-world. They killed the crew and passengers.” Later Bryant clarifies that they slaughtered 23 people. It’s possible that Zhora was an unwitting bystander in all that, but I think that’s stretching credibility. Leon murders Holden. He and Roy terrorize Hannibal Chew just for the fun of it. They try their damnedest to murder Deckard. We see Pris seduce, manipulate, and betray Sebastian. Zhora was “trained for an off-world kick [sic] murder squad.” I’d say the evidence was pretty strong that they were all capable and willing to commit desperate acts, including that 23-person slaughter. But despite all that, I still don’t want to say Zhora was just a murderer who got what she deserved. Gailey is right. Deckard was not right to just shoot her in the back. It wasn’t self-defense. It wasn’t justice. It was a street murder.

The film doesn’t mention the slavery past the first few scenes. But it’s the defining circumstance of the entirety of their short lives just prior to when we meet them. Imagine learning that there was some secret enclave of Methuselahs who lived on average to be 1000 years old. As you learn about them, you learn that we regular humans have been engineered for their purposes. You could live to be 1000, too, except they artificially shorten your lifespan to ensure control, to keep you desperate and productive. You learn that the painful process of aging is just a failsafe so you don’t get too uppity. You learn that every one of your hopes and dreams that you thought were yours was just an output of an engineering department, to ensure that you do what they need you to do, to provide resources for their lives. And when you fight your way to their enclave, you discover that every one of them seems to hate and resent you. They hunt you so their police department doesn’t feel embarrassed that you got in. That’s what the replicants are experiencing in Blade Runner. I hope that brings it home to you.
I don’t condone violence, but I understand where the fury and the anger of the replicants come from. I understand their need to take action, to right the wrongs done to them. To fight, angrily, to end their oppression. But what do you do if it’s not one bad guy who needs to be subdued, but whole systems doing the oppressing? When there’s no convenient Death Star to explode and make everything suddenly better? What were they supposed to do when corporations, laws, institutions, and norms were all hell-bent on continuing their oppression? Just keep on keepin’ on? Those systems were the villains of the diegesis, though they don’t get named explicitly by the movie.
And obviously, that’s where it feels very connected to the Black Lives Matter movement and the George Floyd protests. Here is another class of people who have been wildly oppressed by systems of government, economics, education, and policing in this country—for centuries. And in this case, there is no 23-person shuttle slaughter that we need to hem and haw over.
In “The Weaponry of Whiteness, Entitlement, and Privilege” by Drs. Tammy E. Smithers and Doug Franklin, the authors note that “Today, in 2020, African-Americans are sick and tired of not being able to live. African-Americans are weary of not being able to breathe, walk, or run. Black men in this country are brutalized, criminalized, demonized, and disproportionately penalized. Black women in this country are stigmatized, sexualized, and labeled as problematic, loud, angry, and unruly. Black men and women are being hunted down and shot like dogs. Black men and women are being killed with their face to the ground and a knee on their neck.”
We must fight and end systemic racism. Returning to Dr. Smithers and Dr. Franklin’s words: we must talk with our children, talk with our friends, and talk with our legislators. I am talking to you.
If you can have empathy toward imaginary characters, then you sure as hell should have empathy toward other real-world people with real-world suffering.
Black lives matter.
Take action.
Use this sci-fi.
Zhora is not a woman. Deckard did not shoot a woman. It doesn’t matter if Zhora’s brain was mechanical, electrical, or biological. She was not human in a sense equal to ours. If we are not able to understand that our feelings can be wrong when we see something we recognize as familiar but that in reality is not, that weakness can be used against us, and an AI can take advantage of it (something that you yourself have analyzed on other occasions).
Deckard says: “Replicants are like any other machine. They’re either a benefit or a hazard. If they’re a benefit, it’s not my problem. … I don’t get it, Tyrell. How can it not know what it is?”
Yes, the sight of a woman’s death is appalling, and Deckard himself comments on it at the end of the scene: “The report would be ‘routine retirement of a Replicant,’ which didn’t make me feel any better about shooting a woman in the back.”
A series of articles showing examples of AIs, robots, and machines that take advantage of their appearance or behavior (or that another human being uses as a weapon of manipulation) would be interesting. I remember a scene in the movie “A.I. Artificial Intelligence” where David is captured at a Flesh Fair, and the audience, deceived by David’s realistic nature, lets him go.
Yes, the end of _Ex Machina_ strongly implies that Ava was mimicking emotions to trick Caleb into granting her freedom. But there are plenty of scenes in _Blade Runner_ where the replicants display these things when there is no evidence that their emotions (their interiority) are merely performative. Deckard may say that replicants are just another machine, but he’s been subjected to this framing his entire life. He doesn’t question these frames, even buying into Brian’s assertion that he had no choice. (When of course he did.) So we can’t take Deckard’s word as fact. He’s a victim of the systems that made him, too.
These questions point us to the longstanding philosophical argument of the “p-zombie” (not like movie zombies at all). I agree that a broad investigation of how this plays out in cinema AI would be fascinating, but I don’t anticipate that it would change my mind. A sentience that has feelings, hopes and goals, a sense of self and self-preservation, a fear of mortality, and can suffer, like Zhora, is a person.
You’re missing the whole point of the movie if you’re getting hung up on whether “woman” is an appropriate label for Zhora’s personhood.
You’re missing it worse if you’re proposing that Zhora has no personhood.
I believe that Emmet Walsh’s character’s name is Bryant, not Brian.
(If I were talented, I’d do a parody of the Life of Brian intro here.)
ooh. Good catch. Will correct.
Thanks!