The Gendered AI series filled out many more posts than I’d originally planned. (And there were several more posts on the cutting room floor.)
I’ll bet some of my readership are wishing I’d just get back to the bread-and-butter of this site, which is reviews of interfaces in movies. OK. Let’s do it. (But first go vote up Gendered AI for SxSW20. Takes a minute, helps a ton!)
Since we’re still in the self-declared year of sci-fi AI here on scifiinterfaces.com, let’s turn our collective attention to one of the best depictions of AI in cinema history, Colossus: The Forbin Project.
Release Date: 8 April 1970 (USA)
Dr. Forbin leads a team of scientists who have created an AI with the goal of preventing war. It does not go as planned.
In 8th grade, I went on our class trip to Washington D.C. The hotel we were staying at had kids from all over the country, and one night they held a dance. I had changed into sweats and a t-shirt and was dancing away with my friends when a boy walked up behind me, tapped me on the shoulder, and said, “Fairy!”
When I turned around and the boy realized I was a girl, he got a confused look on his face, mumbled something and walked off. I was left feeling angry and hurt.
Humans have a strong pull to identify gender not just in people, but in robots, animals, and even smart speakers. (Whether that is wrong or right is another matter that I don’t address here, but many people are uncomfortable when gender is ambiguous.)
Even robots, which could easily be genderless, are assigned a gender.
Noessel has further broken down gender assignment into types: social, bodily, and biological. I find the “social” category particularly interesting, which he defines as follows:
Characters are tagged as socially male or female if the only cues are the voice of the actor or other characters’ use of gendered pronouns to refer to it. R2D2 from Star Wars, for example, is referred to as “him” or “he” many times, even though he has no other gender markers, not even voice. For this reason, R2D2 is tagged as “socially male.”
Disturbingly, Noessel found that the gender ratio was skewed most for this category, at 5 male characters for every 1 female.
I believe that much of the time, when writers create an AI character, it is male by default, unless there is something important about being female. For example, if the character is a love interest or mother, then it must be female; otherwise, by default, it’s male. This aligns with the “Men Are Generic, Women Are Special” theory from TV Tropes, which states:
This leads to the Smurfette Principle, in which a character’s femaleness is the most important and interesting thing about her, often to exclusion of all else. It also tends to result in works failing The Bechdel Test, because if there’s a potential character who doesn’t have to be any particular gender, the role will probably be filled by a male character by default.
Having designed and researched voice interfaces for twenty years, I’d like to add some perspective on how gender is applied to AI in our current technology.
In the real world
One exception to this rule is voice assistants, such as Siri, Cortana, and Alexa. The majority of voice assistants have a female voice, although some allow you to change the default to a male voice. On the other hand, embodied robots (such as Jibo (pictured below), Vector, Pepper, and Kuri) are more often gendered as male.
When a robot is designed, gender does not have to be immediately assigned. In a voice assistant, however, it’s the most apparent characteristic.
In his book Wired for Speech, Clifford Nass wrote that individuals generally perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems.
If voice-only assistants are predominantly given female voices, why are robots any different?
Why are robots different?
One reason is androcentrism: the default for many things in society is male, and whatever differs from that default must be marked in some way. When people see a robot with no obviously “female” traits (such as long hair, breasts, or, in the case of Rosie from the Jetsons, an apron) they usually assign a male gender, as this study found. It’s similar for cartoons such as stick figures, and animals in animated movies. Animals are often given unrealistic bodies (such as a nipped-in waist), a hairbow, or larger, pink lips to “mark” them as female.
It would not be surprising if designers felt that to make a robot NOT male, they would have to add exaggerated features. Imagine if, after R2D2 was constructed, George Lucas said “let’s make R2D2 female”. Despite the fact that nothing would have to be changed (apart from the “he” pronoun in the script), I have no doubt the builders would have scrambled to “female-ize” R2D2 by adding a pink bow or something equally unnecessary.
In addition, male characters in fictional works are often more defined by their actions, and female characters by their looks and/or personalities. In this light, it makes sense that a more physical assistant would be more likely to be male.
There are some notable exceptions to this, mainly in the area of home health robots (such as Mabu). It is interesting to note that although Mabu has a physical form, “her” body doesn’t move; only the head and eyes do, and the body serves mainly as a holder for an iPad. Again, she’s an assistant.
One may ask, what’s the harm in these gendered assistants? One problem is the continued reinforcement of women as always helpful, pleasant, organized, and never angry. They’re not running things; they’re simply paving the way to make your life easier. But if you want a computer that’s “knowledgeable”—such as IBM’s Watson that took on the Jeopardy! Challenge—the voice is male. These stereotypes have an impact on our relationships with real people, and not for the better. There shouldn’t be a “default” gender, and it’s time to move past our tired stereotypes of women as the gender that’s always helpful and accommodating.
As fans of sci-fi, we should become at least sensitized, and more hopefully, vocal and active, about this portrayal of women, and do our part to create more equal technology.
Thanks to all who donated to compensate underrepresented voices! I am donating the monies I’ve received to the Geena Davis Institute on Gender in Media. This group “is the first and only research-based organization working within the media and entertainment industry to engage, educate, and influence content creators, marketers and audiences about the importance of eliminating unconscious bias, highlighting gender balance, challenging stereotypes, creating role models and scripting a wide variety of strong female characters in entertainment and media that targets and influences children ages 11 and under.” Check them out.
Recall from the germane distribution post that the germane tag is about whether the gender is important to the plot. (Yes, it’s fairly subjective.)
If an AI character makes a baby via common biological means, or their sex-related organs play a critical role, then the gender of the character is highly germane. Rachael in the Blade Runner franchise gestates a baby, so her having a womb is critical, and as we’ve seen in the survey, gender stacks, so her gender is highly germane.
If an AI character has a romantic relationship with a mono-sexual partner, or is themselves mono-sexual, or they occupy a gendered social role that is important to the plot, the character is listed as slightly germane. For example, all you’d have to do is, say, make Val Com bisexual or gay, and then they could present as female and nothing else in the plot of Heartbeeps would need to change to accommodate it.
If the character’s gender could be swapped to another gender and it not change the story much, then we say that the character’s gender is not germane. BB-8, for instance, could present as female, and nothing in the canon Star Wars movies would change.
I need to clarify that I’m talking about plot—what happens in the show—rather than story—which entails the reasons it is told and its effects—because given the nature of identity politics, a change in gender presentation would often change how the story is received and interpreted by the audience.
All the characters in Alien, for instance, were written unisex, to be playable by actors of any sex or gender presentation. So while it “didn’t matter” that Ripley was cast as Sigourney Weaver, it totally did matter, because she was such a bad-ass female character whose gender was immaterial to the plot (we hadn’t had many of those at that point in cinematic history). She was just a bad-ass who happened to be female, not female because she “needed” to be. So, yes, it does matter. But diegetically, had she been Alan Ripley, the plot and character relationships of Alien would not need to change. He still damned well better save Jonesy.
So what do we see when we look at the germane-ness of AI characters in a mostly-binary way?
Sure enough, when gender matters to the plot—slightly or highly—the character’s gender presentation skews female: 5.47% more female than expected, or about 7% more likely than presenting male. When gender presentation does not matter, that value is flipped, being around 7% more male than female, and around 9% more other than female.
The sample size for highly germane is vanishingly small, and one would expect the coupling to include a male, so the under-noise values for that category are not too surprising. But the other categories? Holy cow.
Put another way…
When gender matters to the plot, AI characters more often present as female. Otherwise, they’re more often male or not gendered at all.
That is shitty. It’s like Hollywood thinks men are the default gender, and I know I just said it, but I’m going to say it again—that’s shitty. Hey, Hollywood. Women are people.
You may be wondering how this is different from the earlier subservience posts. Recall that the subservience studies look at gender presentation of AI as it relates to their own degree of freedom. Are most AIs free-willed? Yes. Do free-willed AI tend to present as boys more often than as girls or other? Yes. But these tell us nothing about the gender relationship of the subservient AIs to their masters’ genders. It would be one thing if all the male-presenting AIs were “owned” by male-presenting owners. It would be another if female-presenting AIs were owned much more often by male-presenting masters. This post exposes those correlations in the survey. Chart time!
Data nerds (high fives) may note that unlike every other correlations chart in the series, these numbers don’t balance. For instance, looking at the Male AI in the left chart, -1.63 + 3.97 + 3.97 = 6.31. Shouldn’t they zero out? If we were looking at the entire survey, they would. But in this case, free-willed AI only muddy this picture, so those AIs are omitted, making the numbers seem wonky. Check the live sheet if you’re eager to dig into the data.
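To make the balancing point concrete, here’s a minimal Python sketch (with made-up counts, not the survey’s actual data) of how deviation-from-expected cells behave: over the full table, each row’s deviations sum to zero, but once a category like free-willed is omitted, the remaining deviations no longer balance.

```python
# Toy illustration (hypothetical counts, NOT the survey's real data) of why
# deviation-from-expected cells sum to zero over a full table, but not after
# a category is omitted.

genders = ["male", "female", "other"]
categories = ["free-willed", "slavish", "improvisational"]

# Rows: AI gender presentation; columns: subservience category.
counts = {
    "male":   {"free-willed": 40, "slavish": 10, "improvisational": 5},
    "female": {"free-willed": 15, "slavish": 8,  "improvisational": 4},
    "other":  {"free-willed": 20, "slavish": 12, "improvisational": 6},
}

total = sum(sum(row.values()) for row in counts.values())
row_totals = {g: sum(counts[g].values()) for g in genders}
col_totals = {c: sum(counts[g][c] for g in genders) for c in categories}

def deviation(g, c):
    """Observed cell share minus the share expected from the marginals."""
    observed = counts[g][c] / total
    expected = (row_totals[g] / total) * (col_totals[c] / total)
    return observed - expected

for g in genders:
    full = sum(deviation(g, c) for c in categories)
    partial = sum(deviation(g, c) for c in categories if c != "free-willed")
    print(f"{g}: full row sums to {full:+.4f}, "
          f"without free-willed it sums to {partial:+.4f}")
```

The full-table rows zero out (up to floating-point error); drop the free-willed column from the display and the leftover cells look “wonky,” exactly as described above.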
This is two charts in one.
The left chart groups the data by genders of master. Turns out if you have a female-presenting master, you are unlikely to be male- or female-presenting. (Recall that there are only 5 female-presenting masters in the entire Gendered AI survey, so the number of data points is low.) If you present as male, you’re more likely to be master of a gendered AI. Otherwise, you are more likely to be master of a male-presenting AI.
The right chart is the same data, but pivoted to look at it from genders of AI. That’s where the clusters are a little more telling.
If you are a female-presenting AI, you are more likely to have a male-presenting master.
If you are a non-binary AI, you are more likely to have a female-presenting master.
If you are a male AI, you have anything but a female-presenting master.
The detailed chart doesn’t reveal anything more than we see from this aggregate, so it isn’t shown.
The notion of people owning people is revolting, but the notion of owning an AI is still not universally reviled. (With nods to the distinctions of ANI and AGI.) That means that sci-fi AI serves as a unique metaphor for taboo questions of gender and ownership. The results are upsetting for their social implications, of course. And sci-fi needs to do better. Hey, maybe this gives you an idea…
And yet this isn’t the most upsetting correlations finding in the study. I saved that for last, which is next, which is when we look at gender and germaneness. Gird your loins.
The Gendered AI series looks at sci-fi movies and television to see how Hollywood treats AI of different gender presentations. For example, do female-presenting AIs get different bodies than male-presenting AIs? (Yes.) Are female AIs more subservient? (No.) What genders are the masters of AI? This particular post is about gender and goodness. If you haven’t read the series intro, related goodness distributions, or correlations 101 posts, I recommend you read them first. As always, check out the live Google sheet for the most recent data.
n.b. If you’re looking at the live sheet, you may note it says “alignment” rather than “goodness” in the dropdown and sheets. Sorry about the D&D roots showing. But by this, I mean a rough, highly debatable scale of saintliness to villainy.
Gender and goodness
What do we see when we look at the correlations of gender and level of goodness? There are three big trends.
The aggregate picture shows a tendency for female-presenting AIs to be closer to neutral, rather than extreme.
It shows a tendency for male-presenting AIs to be very good or very evil.
It shows a slight tendency for nonbinary-presenting AIs to be slightly evil, but not full-bore.
When we look into the detailed chart, some additional trends appear.
Biologically- and bodily-presenting female AIs tend toward somewhat evil, but not very evil.
Socially female AIs (voice or pronouns only) tend toward neutral.
Genderless AIs spike at somewhat evil.
Genderfluid characters (noting that this occurs mostly as a tool of deception) spike at very evil, like, say, Skynet.
AIs showing multiple genders tend toward neutral, like Star Trek TOS’s Exo III androids, or somewhat evil, like Mudd’s androids.
The Gendered AI series looks at sci-fi movies and television to see how Hollywood treats AI of different gender presentations. For example, are female AIs generally shown as smarter than male AIs? Are certain AI genders more subservient? What genders are the masters of AI? This particular post is about gender and category of intelligence. If you haven’t read the series intro, related category distributions, or correlations 101 posts, I recommend you read them first. As always, check out the live Google sheet for the most recent data.
What do we see when we look at the correlations of gender and level of intelligence? First up, the overly-binary chart, and what it tells us.
Gender and AI Category
You’ll recall that levels of AI are one of the following…
Super: Super-human command of facts, predictions, reasoning, and learning. Technological gods on earth.
General: Human-like, able to learn arbitrary new domains to human-like limits.
Narrow: Very smart in a limited domain, but unable to learn arbitrary new domains.
The relationships are clear even if the numbers are smallish.
When AI characters are of a human-like intelligence, they are more likely to present gender.
When AI characters are either superintelligent or only displaying narrow intelligence, they are less likely to present gender.
My feminist side is happy that superintelligences are more often female and other than male, but the numbers are so small that it could be noise.
If you check the details in the Sheet, you’ll see the detailed numbers don’t reveal any more intense counterbalancing underneath the wan aggregate numbers.
Chris: I posted a question on Twitter, “Other than that SNL skit, have there been queer sci-fi AI in television or movies?” Among the responses is this awesome one from Terence Eden, where he compiled the answers and wrote a whole blog post about it. The following is slightly-modified from the original post on his blog. Consider this a parade of sci-fi AI, to help you nerds celebrate Pride.
Terence: Let’s first define what we mean by queer. This usually means outside of binary gender and/or someone who is attracted to the same sex—what’s commonly referred to as LGBT+. Feel free to supply your own definition.
As for what we mean by AI, let’s go with “mechanical or non-biological autonomous being.” That’s probably wide enough—but do please suggest better definitions.
So is a gay/lesbian robot one who is attracted to other robots? Or to humans with a similar gender? Let’s go with yes to all of the above.
Wait. Do robots have gender?
Humans love categorising things – especially inanimate objects. Some languages divide every noun into male and female. Why? Humans gonna human.
The Gendered AI series looks at sci-fi movies and television to see how Hollywood treats AI of different gender presentations. For example, are female AIs given a certain type of body more than male AIs? Are certain AI genders more subservient? What genders are the masters of AI? This particular post is about gender and subservience. If you haven’t read the series intro, related subservience distributions, or correlations 101 posts, I recommend you read them first. As always, check out the live Google sheet for the most recent data.
Recall from the distributions post that subservience is cruder than we would like. Part of what we’re interested in is the social subservience: specifically whether female-presenting AI more often demur or take a deferential, submissive tone. The measurements I show here are more coarse than that, because the nuanced measurements are very open to debate, and can change over the course of a show. What I felt confident about tagging was first free-willed vs. subservient; and then, for those that had to obey a master, whether they could only act as instructed (slavish), whether they seemed to register and resist their servitude (reluctant) or not (improvisational). Still, even with the crude metric, there’s stuff to see.
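As a sketch of how this tagging could be modeled, here’s a hypothetical data-model version of the scheme described above. The field names, enum values, and the example row are my own illustration of the post’s categories, not the survey’s actual Google-sheet columns.

```python
# Hypothetical model of the survey's subservience tagging scheme, following
# the categories described in the post (illustrative only).
from dataclasses import dataclass
from enum import Enum

class Subservience(Enum):
    FREE_WILLED = "free-willed"          # no master to obey
    SLAVISH = "slavish"                  # can only act as instructed
    RELUCTANT = "reluctant"              # registers and resists its servitude
    IMPROVISATIONAL = "improvisational"  # obeys, but improvises within orders

@dataclass
class AICharacter:
    name: str
    gender_presentation: str  # e.g. "male", "female", "other"
    subservience: Subservience

# Example tag, per the Black Mirror example discussed below (illustrative,
# not an authoritative survey row):
clone = AICharacter("USS Callister crew member", "female", Subservience.RELUCTANT)
```

The point of separating free-willed from the three subservient sub-tags is that the latter only apply when a character has a master at all.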
What do we see when we look at the correlations of gender and subservience? First up, the trinary chart, and what it tells us.
The numbers are small here, at a max of 4.1% away from perfect, but we can still note the differences.
If an AI is free-willed, it is slightly more likely to be male than female, and male much more than other.
If it has a master, but is free to improvise actions within constraints and orders, it is more likely to be other than male: ungendered (the majority), multi-gendered, or genderfluid.
Female-presenting AI do not appear to have significant disproportions of subservience. Those pink bars are all pretty small, all hovering near perfect distribution, and in the one place they’re not (slavish obedience), they’re underrepresented. Those characters tend to have a machine embodiment and therefore no gender, but it still means there is no bias toward or against female-presenting AIs in this correlation.
Now this probably breaks your gut sense of what you’ve seen in shows. What about Ex Machina! What about Maria! What about Ship’s computer in Star Trek? What about…? I’m not sure what to tell you, as these results thwart my expectations as well. But these are the numbers. It may just be that those examples of subservient female sci-fi AIs stand out for us more, given oppressive norms in the real world.
There’s not a lot more to be pulled from the detailed view of the data, either.
Note that the examples of characters with reluctant obedience to a master are dominated by the unfortunate, neurocloned crew of the U.S.S. Callister from Black Mirror. (Each of whom is reluctantly subservient.) Other than that example, there are three female-presenting characters and one male-presenting character. We would have more confidence in the results with a bigger sample size.
The Gendered AI series looks at sci-fi movies and television to see how Hollywood treats AI of different gender presentations. For example, are female AIs given a certain type of body more than male AIs? Are certain AI genders more subservient? What genders are the masters of AI? This particular post is about gender and embodiment. If you haven’t read the series intro, related embodiment distributions, or correlations 101 posts, I recommend you read them first. As always, check out the live Google sheet for the most recent data.
What do we see when we look at the correlations of gender and embodiment? First up, the overly-binary chart, and what it tells us.
I see three big takeaways.
When AI appears indistinguishable from human, it is female significantly more often than male. When AI presents as female, it is much more likely to be embodied as indistinguishable from a human than an anthropomorphic or mechanical robot. Hollywood likes its female-presenting AIs to be human-like.
Anthropomorphic robots are more likely to be male than female. Hollywood likes its male-presenting AIs to be anthropomorphic robots.
If an AI is mechanical, it is more likely to be “other.” (Having no gender, multiple genders, or genderfluid.)
These first two biases make me think of the longstanding male-gaze popular-culture trope that pairs a conventionally-attractive female character with a conventionally-unattractive male. (Called “Ugly Guy Hot Wife” on TV Tropes.)
Recent research from Denmark hints that these may be the most effective forms for engaging children (and adults?) in the audience: a study of learning outcomes with VR teachers found that girls learned best from a young, female-presenting researcher, and boys learned best when that teacher presented as a drone. The study did not venture a hypothesis as to why this is, or whether it is desirable. These were the only two options tested with the students, so much more work is needed to test which combinations of presentation, embodiment, and superpowers (the drone hovered) are the most effective. And we still have to discuss the ethics and possible long-term effects of such tailoring. But still, it’s interesting in light of this finding.
Not a surprise
When AI is indistinguishable from human, it is less likely to have a gender other than male or female.
If an AI presents with no gender, it is embodied as a mechanical robot. Little surprise there.
Mechanical robots are more likely to be neither male nor female.
When we look more closely at the numbers, it gets a little weirder. This makes for a very complicated graph, so I’ll use a screen grab from the sheets as the image.
Of course we would not expect many socially gendered characters to be indistinguishable from a human, but you’ll note that socially male is much higher than socially female, and that’s because while there are no characters that are both [socially female + indistinguishable from human], there is one tagged [socially male + indistinguishable from human], and that’s Ruk, from the Star Trek (The Original Series) episode “What Are Little Girls Made Of?”
Bucking other trends toward male-ness, [disembodied + female-voiced] AIs appear 8 times as often as [disembodied + male-voiced] AIs, of which there is only one example: JARVIS from the MCU.