A Default Gender?

By guest blogger Cathy Pearl

In 8th grade, I went on our class trip to Washington D.C. The hotel we were staying at had kids from all over the country, and one night they held a dance.  I had changed into sweats and a t-shirt and was dancing away with my friends when a boy walked up behind me, tapped me on the shoulder, and said, “Fairy!”

“I think we both know the answer to that.” —Cortana, Halo: Combat Evolved

When I turned around and the boy realized I was a girl, he got a confused look on his face, mumbled something and walked off.  I was left feeling angry and hurt.

Humans have a strong pull to identify gender not just in people, but in robots, animals, and even smart speakers.  (Whether that is wrong or right is another matter that I don’t address here, but many people are uncomfortable when gender is ambiguous.)

Even robots, which could easily be genderless, are assigned a gender.

Author Chris Noessel has accumulated an amazing set of data examining hundreds of characters in science fiction, and has found, among many other things, that of the 327 AI characters he looked at, about twice as many are male as female.

Social Gender

Noessel has further broken down gender assignment into types:  social, bodily, and biological. I find the “social” category particularly interesting, which he defines as follows:

Characters are tagged as socially male or female if the only cues are the voice of the actor or the gendered pronouns other characters use to refer to them. R2D2 from Star Wars, for example, is referred to as “him” or “he” many times, even though he has no other gender markers, not even voice. For this reason, R2D2 is tagged as “socially male.”

Disturbingly, Noessel found that the gender ratio was skewed most for this category, at 5 male characters for every 1 female.

I believe that much of the time, when writers create an AI character, it is male by default, unless there is something important about being female.  For example, if the character is a love interest or mother, then it must be female; otherwise, by default, it’s male. This aligns with the “Men Are Generic, Women Are Special” theory from TV Tropes, which states:

This leads to the Smurfette Principle, in which a character’s femaleness is the most important and interesting thing about her, often to the exclusion of all else. It also tends to result in works failing The Bechdel Test, because if there’s a potential character who doesn’t have to be any particular gender, the role will probably be filled by a male character by default.

TV Tropes

Having designed and researched voice interfaces for twenty years, I’d like to add some perspective on how gender is applied to our current AI technology.

In the real world

One exception to this rule is voice assistants, such as Siri, Cortana, and Alexa.  The majority of voice assistants have a female voice, although some allow you to change the default to a male voice. On the other hand, embodied robots (such as Jibo (pictured below), Vector, Pepper, and Kuri) are more often gendered as male.

When a robot is designed, gender does not have to be immediately assigned.  In a voice assistant, however, it’s the most apparent characteristic.

In his book Wired for Speech, Clifford Nass wrote that individuals generally perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems.

If voice-only assistants are predominantly given female voices, why are robots any different?

Why are robots different?

One reason is androcentrism: the default for many things in society is male, and whatever differs from that default must be marked in some way. When people see a robot with no obviously “female” traits (such as long hair, breasts, or, in the case of Rosie from the Jetsons, an apron) they usually assign a male gender, as this study found. The same goes for cartoons such as stick figures and for animals in animated movies: animals are often given unrealistic bodies (such as a nipped-in waist), a hair bow, or larger, pink lips to “mark” them as female.

It would not be surprising if designers felt that to make a robot NOT male, they would have to add exaggerated features. Imagine if, after R2D2 was constructed, George Lucas said “let’s make R2D2 female”.  Despite the fact that nothing would have to be changed (apart from the “he” pronoun in the script), I have no doubt the builders would have scrambled to “female-ize” R2D2 by adding a pink bow or something equally unnecessary. 

“There. Perfect!” (This is actually R2-KT. Yes, she was created to be the female R2-D2.)

In addition, male characters in fictional works are often more defined by their actions, and female characters by their looks and/or personalities.  In this light, it makes sense that a more physical assistant would be more likely to be male.

There are some notable exceptions to this, mainly in the area of home health robots (such as Mabu). It is interesting to note that although Mabu has a physical form, “her” body doesn’t move, only the head and eyes, and it serves mainly as a holder for an iPad. Again, she’s an assistant.

So what?

One may ask, what’s the harm in these gendered assistants? One problem is the continued reinforcement of women as always helpful, pleasant, organized, and never angry.  They’re not running things; they’re simply paving the way to make your life easier. But if you want a computer that’s “knowledgeable”—such as IBM’s Watson that took on the Jeopardy! Challenge—the voice is male.  These stereotypes have an impact on our relationships with real people, and not for the better. There shouldn’t be a “default” gender, and it’s time to move past our tired stereotypes of women as the gender that’s always helpful and accommodating. 

As fans of sci-fi, we should at least become sensitized to this portrayal of women, and better yet, vocal and active about it, and do our part to create more equal technology.


My donation

Thanks to all who donated to compensate underrepresented voices! I am donating the monies I’ve received to the Geena Davis Institute on Gender in Media. This group “is the first and only research-based organization working within the media and entertainment industry to engage, educate, and influence content creators, marketers and audiences about the importance of eliminating unconscious bias, highlighting gender balance, challenging stereotypes, creating role models and scripting a wide variety of strong female characters in entertainment and media that targets and influences children ages 11 and under.” Check them out.


Gendered AI: Germane-ness Correlations

The Gendered AI series looks at sci-fi movies and television to see how Hollywood treats AI of different gender presentations. For example…

  • Do female- and male-presenting AIs get different bodies? Yes.
  • Are female AIs more subservient? No.
  • How does gender correlate to an AI’s goodness? Males are extremists.
  • Men are more often masters of female AIs. Women are more often masters of non-binary AIs. Male AIs shy away from having women masters. No, really.

This last correlations post investigates the complicated question of which genders are assigned when gender is not germane to the plot. If you haven’t read the series intro, related germane-ness distributions, or correlations 101 posts, I recommend you read them first. As always, check out the live Google sheet for the most recent data.

Recall from the germane distribution post that the germane tag is about whether the gender is important to the plot. (Yes, it’s fairly subjective.)

  • If an AI character makes a baby via common biological means, or their sex-related organs play a critical role, then the gender of the character is highly germane. Rachael in the Blade Runner franchise gestates a baby, so her having a womb is critical, and as we’ve seen in the survey, gender stacks, so her gender is highly germane.
  • If an AI character has a romantic relationship with a mono-sexual partner, or is themselves mono-sexual, or they occupy a gendered social role that is important to the plot, the character is listed as slightly germane. For example, all you’d have to do is, say, make Val Com bisexual or gay, and then they could present as female and nothing else in the plot of Heartbeeps would need to change to accommodate it.
  • If the character’s gender could be swapped to another gender without changing the story much, then we say that the character’s gender is not germane. BB-8, for instance, could present as female, and nothing in the canon Star Wars movies would change.
Yes, this matters.

I need to clarify that I’m talking about plot—what happens in the show—rather than story—which entails the reasons it is told and its effects—because given the nature of identity politics, a change in gender presentation would often change how the story is received and interpreted by the audience.

All the characters in Alien, for instance, were written unisex, to be playable by actors of any sex or gender presentation. So while it “didn’t matter” that Ripley was cast as Sigourney Weaver, it totally did matter because she was such a bad-ass female character whose gender was immaterial to the plot (we hadn’t had a lot of those at this point in cinematic history). She was just a bad-ass who happened to be female, not female because she “needed” to be. So, yes, it does matter. But diegetically, had she been Alan Ripley, the plot and character relationships of Alien would not need to change. He still damned well better save Jonesy.

So what do we see when we look at the germane-ness of AI characters in a mostly-binary way?

Sure enough, when gender matters to the plot—slightly or highly—the character’s presentation skews female by 5.47%, making it about 7% more likely than presenting male. When the gender presentation does not matter, that skew flips, running around 7% more male than female, and around 9% more other than female.
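For readers who want to poke at the sheet themselves, these kinds of figures can be read as deviations of each germane-ness group from the survey-wide gender mix. Here is a minimal sketch of that calculation in pandas, using a made-up miniature of the character sheet (the column names and data are hypothetical, and this is an illustration of the general approach, not the survey’s exact methodology):

```python
import pandas as pd

# Hypothetical miniature of the survey: one row per AI character,
# tagged with gender presentation and germane-ness (made-up data).
df = pd.DataFrame({
    "gender":  ["male", "male", "female", "male", "female", "other"],
    "germane": ["not",  "not",  "high",   "not",  "slight", "not"],
})

# Share of each gender within each germane-ness level...
observed = pd.crosstab(df["germane"], df["gender"], normalize="index")

# ...minus the survey-wide share of each gender.
baseline = df["gender"].value_counts(normalize=True)
deviation = (observed - baseline) * 100  # percentage points

print(deviation.round(2))
```

A positive cell means that gender is over-represented in that germane-ness group relative to the survey as a whole; a negative cell means it is under-represented.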

The sample size for highly germane is vanishingly small, and one would expect the coupling to include a male, so the under-noise values for that category are not too surprising. But the other categories. Holy cow.

Put another way…

AI characters more often present as female only when they need to be.

Otherwise, they’re more often male or not gendered at all.

That is shitty. It’s like Hollywood thinks men are the default gender, and I know I just said it, but I’m going to say it again—that’s shitty. Hey, Hollywood. Women are people.

Ayup.

Gendered AI: Gender of Master Correlations

The Gendered AI series looks at sci-fi movies and television to see how Hollywood treats AI of different gender presentations. For example…

  • Do female-presenting AIs get different bodies than male-presenting AIs? Yes.
  • Are female AIs more subservient? No.
  • How does gender correlate to an AI’s goodness? Males are extremists.

This particular post asks who the masters of AIs are. If you haven’t read the series intro, related master distributions, or correlations 101 posts, I recommend you read them first. As always, check out the live Google sheet for the most recent data.

Barbarella (a female-presenting human) is master of Alphy (an AI whose voice presents male). This is, statistically, an unlikely and unrepresentative relationship, but spot on for the late-01960s feminist bent of Barbarella.

You may be wondering how this is different from the earlier subservience posts. Recall that the subservience studies look at gender presentation of AI as it relates to their own degree of freedom. Are most AIs free-willed? Yes. Do free-willed AI tend to present as boys more often than as girls or other? Yes. But these tell us nothing about the gender relationship of the subservient AIs to their master’s gender. It would be one thing if all the male-presenting AIs were “owned” by male-presenting owners. It would be another if female-presenting AIs were owned much more often by male-presenting masters. This post exposes those correlations in the survey. Chart time!

Data nerds (high fives) may note that unlike every other correlations chart in the series, these numbers don’t balance. For instance, looking at the Male AI in the left chart, -1.63 + 3.97 + 3.97 = 6.31. Shouldn’t they zero out? If we were looking at the entire survey, they would. But in this case, free-willed AI only muddy this picture, so those AIs are omitted, making the numbers seem wonky. Check the live sheet if you’re eager to dig into the data.

This is two charts in one.

The left chart groups the data by genders of master. Turns out if you have a female-presenting master, you are unlikely to be male- or female-presenting. (Recall that there are only 5 female-presenting masters in the entire Gendered AI survey, so the number of data points is low.) If you present as male, you’re more likely to be master of a gendered AI. Otherwise, you are more likely to be master of a male-presenting AI.

Your AI may not be happy about it, though.

The right chart is the same data, but pivoted to look at it from genders of AI. That’s where the clusters are a little more telling.

  • If you are a female-presenting AI, you are more likely to have a male-presenting master.
  • If you are non-binary AI, you are more likely to have a female-presenting master.
  • If you are a male AI, you have anything but a female-presenting master.

The detailed chart doesn’t reveal anything more than we see from this aggregate, so it isn’t shown.

The notion of people owning people is revolting, but the notion of owning an AI is still not universally reviled. (With nods to the distinctions of ANI and AGI.) That means that sci-fi AI serves as a unique metaphor for taboo questions of gender and ownership. The results are upsetting for their social implications, of course. And sci-fi needs to do better. Hey, maybe this gives you an idea…

And yet this isn’t the most upsetting correlations finding in the study. I saved that for last, which is next, which is when we look at gender and germaneness. Gird your loins.