Design fiction in sci-fi

As so many of my favorite lines of thought have begun, this one started with a provocative question lobbed at me across social media. Friend and colleague Jonathan Korman tweeted to ask, above a graphic of the Black Mirror logo, “Surely there is another example of pop design fiction?”

I replied on Twitter, but my reply there was rambling and unsatisfying, so I’m re-answering here with an eye toward being more coherent.

What’s Design Fiction?

If you’re not familiar, design fiction is a practice of creating speculative artifacts to raise issues. Anthony Dunne and Fiona Raby catalyzed the practice while leading the Design Interactions programme at the Royal College of Art.

“It thrives on imagination and aims to open up new perspectives on what are sometimes called wicked problems, to create spaces for discussion and debate about alternative ways of being, and to inspire and encourage people’s imaginations to flow freely. Design speculations can act as a catalyst for collectively redefining our relationship to reality.”

Anthony Dunne and Fiona Raby, Speculative Everything: Design, Fiction, and Social Dreaming

Dunne & Raby tend to lean toward provocation more than clarity (“sparking debate” is a stated goal, as opposed to “identifying problems and proposing solutions”). Where to turn for a less shit-stirring description? As in many related fields, there are lots of competing definitions and splintering. John Spacey has listed 26 types of Design Fiction over on Simplicable. But I am drawn to the more practical definition offered by the Making Tomorrow handbook.

Design Fiction proposes speculative scenarios that aim to stimulate commitment concerning existing and future issues.

Nicolas Minvielle et al., Making Tomorrow Collective

To me, that feels like a useful definition and clearly indicates a goal I can get behind. Your mileage may vary. (Hi, Tony! Hi, Fiona!)

Some examples should help.

Dunne & Raby once designed a mask for dogs called Spymaker, so that the lil’ scamps could help lead their owners to unsurveilled locations in an urban environment.

Julijonas Urbonas, while at the RCA, conceived and designed a “euthanasia coaster” which would subject its passengers to enough Gs to kill them through cerebral hypoxia. While he designed its clothoid inversions and even built a simple physical model, the idea has been recapitulated in a number of other media, including the 3D rendering you see below.

This commercial example from Ericsson is a video with a mild narrative about appliances having a limited “social life.”

Corporations create design fictions from time to time to illustrate their particular visions of the future. Such examples sit at the edge of the space, since we can be sure they would not be released if they ran significantly counter to the corporation’s goals. They’re rarely about the “wicked” problems invoked above and tend more toward gee-whiz-ism, to coin a deroganym.

How does it differ from sci-fi?

Design Fiction often focuses on artifacts rather than narratives. The euthanasia coaster has no narrative beyond what you bring or apply to it, but I don’t think this lack of narrative is a requirement. For my money, the point of design fiction is to explore the novum more than any particular narrative around it. What are its consequences? What are its causes? What kind of society would need to produce it and why? Who would use it and how? What would change? What would lead there, and do we want to do that? Contrast Star Wars, which isn’t about the social implications of lightsabers as much as it is space opera about dynasties, light fascism, and the magic of friendship.

Adorable, ravenous friendship.

But, I don’t think there’s any need to consider something invalid as design fiction if it includes narrative. Some works, like Black Mirror, are clearly focused on their nova and their implications and raise all the questions above, but are told with characters and plots and all the usual things you’d expect to find.

So what’s “pop” design fiction?

As a point of clarification, in Korman’s original question, he asked after pop design fiction. I’m taking that not to mean the Pop Art movement of the 1950s–60s, which Black Mirror isn’t, but rather “accessible” and “popular,” which Black Mirror most definitely is.

So not this, even though it’s also adorable. And ravenous.

What would distinguish other sci-fi works as design fiction?

So if sci-fi can be design fiction, what would we look for in a show to classify it as design fiction? It’s a sloppy science, of course, but here’s a first pass. A show can be said to be design fiction if it…

  • Includes a central novum…
  • …that is explored via the narrative: What are its consequences, direct and indirect?
  • Corollary: The story focuses on a primary novum, and not a mish-mash of them. (Too many muddle the thought experiment.)
  • Corollary: The story focuses on characters who are most affected by the novum.
  • Its explorations include the personal and social.
  • It goes where the novum leads, avoiding narrative fiats that sully the thought experiment.
  • Bonus points if it provides illustrative contrasts: Different versions of the novum, characters using it in different ways, or the before and after.

With this stake in the ground, it probably strikes you that some subgenres lend themselves to design fiction and others do not. Anthology series, like Black Mirror, can focus on different characters, nova, and settings each episode. Series and franchises like Star Wars and Star Trek, in contrast, have narrative investments in characters and settings that make it harder to really explore nova on their own terms, but it is not impossible. The most recent season of Black Mirror points toward a unified diegesis and recurring characters, which means Brooker may be leaning the series away from design fiction. Meanwhile, I’d posit that the eponymous Game from Star Trek: The Next Generation S05E06 is an episode that acts as a design fiction. So it’s not cut-and-dried.

“It’s your turn. Play the game, Wil Wheaton.”

What makes this even more messy is that you are asking a subjective question, i.e. “Is this focused on its nova?”, or even “Does this intend to spur some commitment about the nova?”, which amounts to second-guessing what you think the maker’s intent was. As I mentioned, it’s messy, and it runs against the normal critical stance of this blog. But there are some examples that lean more toward yes than no.

Jurassic Park

Central novum: What if we use science to bring dinosaurs back to life?

Commitment: Heavy prudence and oversight for genetic sciences, especially if capitalists are doing the thing.

Hey, we’ve reviewed Jurassic Park on this very blog!

This example leads to two observations. First, the franchises that follow successful films are much less likely to be design fiction. I’d argue that every Jurassic X sequel has simply repeated the formula and not asked new questions about that novum. More run-from-the-teeth than do-we-dare?

Second is that big-budget movies are almost required to spend some narrative calories discussing the origin story of novae at the cost of exploring multiple consequences of the same. Anthology series are less likely to need to care about origins, so are a safer bet IMHO.

Minority Report

Central novum: What if we could predict crime? (Presuming Agatha is a stand-in for a regression algorithm and not a psychic drug-baby mutant.)

Commitment: Let’s be cautious about prediction software, especially as it intersects civil rights: It will never be perfect and the consequences are dire.

Blade Runner

Central novum: What if general artificial intelligence were made to look indistinguishable from humans, and kept as an oppressed class?

Commitment: Let’s not do any of that. From the design perspective: Keep AI on the canny rise.

Hey, I reviewed Blade Runner on this very blog!

Ex Machina

Central novum: Will we be able to box a self-interested general intelligence?

Commitment: No. It is folly to think so.

Colossus: The Forbin Project

Central novum: What if we deliberately prevented ourselves from pulling the plug on a superintelligence, and then asked it to end war?

Commitment: We must be extremely careful what we ask a superintelligence to do, how we ask it, and the safeguards we provide ourselves if we find out we messed it up.

Hey, I lovingly reviewed Colossus: The Forbin Project on this very blog!

Person of Interest

Central novum: What if we tried to box a good superintelligence?

Commitment: Heavy prudence and oversight for computer sciences, especially if governments are doing the thing.

Not reviewed, but it won an award for Untold AI

This is probably my favorite example. Even though it is a long-running series with recurring characters, I argue that the leads are all highly derived, narratively, from the novum, so it still counts strongly.

But are they pop?

Each of these is more-or-less accessible and mainstream, even if their actual popularity and interpretations vary wildly. So, yes, from that perspective.

Jurassic Park is at the time of writing the 10th highest-grossing sci-fi movie of all time. So if you agree that it is design fiction, it is the most pop of all. Sadly, that is the only property I’d call design fiction on the entire highest-grossing list.

So, depending on a whole lot of things (see…uh…above) the short answer to Mr. Korman’s original question is yes, with lots of ifs.

What others?

I am not an exhaustive encyclopedia of sci-fi, try though I may. Do you agree with the list above? What did I miss? If you comment with additions, be sure to list, as I did with these, the novum and the commitment.

Spacesuits in Sci-fi: Wrapping up

If you recall from the first entry in this series of posts, the Spacesuits content was originally drafted as a chapter in the Make It So book, but had to be cut for length. Now here we are, six posts and 6,700 words later. So it was probably a good call on the part of the publisher. 🙂 But now that we’re here, what have we learned?

Let’s recap. Spacesuit interfaces have to help the wearer survive the perils of space, meet their biological needs, move around, communicate with others and mission control, and be identifiable to others.

Spacesuits are a particularly interesting example of the relationship of sci-fi and design because a tiny fraction of sci-fi audiences will ever experience being in one, but many people have seen them being used in real-world circumstances. This gives them a unique representational anchor in sci-fi, and the survey reveals this. Sci-fi makers base their designs on that surface style, with occasional additions or extensions depending on the fashionable technology of the time. These additions rarely make it to the real world because they’re often made without consideration of the real constraints of keeping a human alive in space. But they are still cool.

And the winner is…?

If I had to name the franchise that gets it right the most, it’s probably Star Trek. Keep in mind that this has been far from an exhaustive survey (“Yeah, like where is The Expanse?” I hear me cry), and the Star Trek franchise is vast and decades old, with most of its stories set on spacecraft. Extravehicular activity is a natural fit for the show, and there’s been lots of it. So if it were just a numbers game rather than a question of quality design, we would expect it to win. I’m not dismissing it for that, though. The work done on Star Trek: Discovery has been beautiful, and lucky for us, the franchise has consistently showcased the most inspiring examples and most(ly) functional interfaces as well. If you’re looking for inspiration, maybe start there.

What lessons can we learn?

As a particular kind of wearable, spacesuit interfaces reinforce all the principles I originally outlined for Ideal Wearables way back in 2014. They must be…

  • Sartorial
  • Social
  • Easy to access and use
  • Tough to accidentally activate
  • Equipped with apposite inputs and outputs

…all pushed through the harder constraints listed at the top of the article. We have some additional lessons about where to put interfaces on spacesuits given those constraints, but it seems pretty well tied to this domain and difficult to generalize. That is, unless climate change has us all donning environmental suits just to enjoy our own planet once it’s a few degrees Centigrade warmer. Wait, I did not mean to go that dark. Even though climate change is a massive crisis and we should commit to halting it and reversing it if possible. (Hey check out these cool tree-planting drones.)

Let’s instead focus on a mild prognostication. I expect that we’ll be seeing more sci-fi spacesuits in the near future, partly because space travel has been on a kick lately with the high-profile and branding-conscious missions of SpaceX. Just this week Crew Dragon carried four civilians on the first all-civilian commercial flight into space. (Not the first civilians in space; according to Harvard professor Jonathan McDowell, that honor belongs to the Soyuz TMA-3 mission in 2003, but that was still a government operation.) For better or for worse, part of how SpaceX is making its name is by bringing a new, cool aesthetic to space travel.

So people are seeing spacesuits again (though am I right…no extravehicular activities?), and that means spacesuits will be on the minds of studios and writers, who will give them their own fantastic spin, which will in turn inspire real-world designers, etc. etc. Illustrators and industrial designers are already posting some amazing speculative designs of late, and I look forward to more inspiring designs to come.

I think my spaceship knows which way to go

You may have noticed that this post comes an uncommonly long time after the prior post. I had cut down my publishing cadence at the start of the pandemic to once every other week because stress, and even that has been difficult to keep up. But now we are heading into fall, and the winter holidays and a cluster of family birthdays and whatnot usually keep me busy through March. Plus I’m about to start hosting a regular session with Ambition Group about AI Mastery for Design Leaders, and as a first-time curriculum, it’s going to demand much of me on top of my full-time job. (You didn’t think I did scifiinterfaces professionally, did you? This is a hobby.) And I’m making some baby steps in publishing my own sci-fi short stories. Keep an eye on Escape Pod and Dark Matter Magazine over the fall if you want to catch those. (I’ll almost certainly tweet about them, too.) I want to work on others.

Which is all to say that I’m on the verge of being overcommitted and burnt out, so I’m going to do myself a favor and take a break from posting here for a while. Sadly, I don’t have any guest posts in the works. Who would be crazy enough to critique sci-fi interfaces during a climate crisis, ongoing fascist movements, and a global pandemic?

I do have big plans for a major study of the narrative uses of sci-fi interfaces, which I hope to conduct with some time off over the winter holiday. That will probably be as huge as the Untold AI and the Gendered AI series. I have nascent notions of using that study as a last bit of material to collect into a 10-year retrospective follow-up to Make It So (let me know if that sounds appealing). And I’m committed to another round of Fritz awards for 2022. So more is coming, and I’ll be back before you know it.

But for a while, over and out, readers. And don’t forget while I’m gone…

Stop watching sci-fi. Start using it.

—Me

Sci-fi Spacesuits: Identification

Spacesuits are functional items, built largely identically to each other, adhering to engineering specifications rather than individualized fashion. A resulting problem is that it might be difficult to distinguish between multiple, similarly-sized individuals wearing the same suits. This visual identification problem might be small in routine situations:

  • (Inside the vehicle:) Which of these suits is mine?
  • What’s the body language of the person currently speaking on comms?
  • (With a large team performing a manual hull inspection:) Who is that approaching me? If it’s the Fleet Admiral I may need to stand and salute.

But it could quickly become vital in others:

  • Whose body is that floating away into space?
  • Ensign Smith just announced they have a tachyon bomb in their suit. Which one is Ensign Smith?
  • Who is this on the security footage cutting the phlebotinum conduit?

There are a number of ways sci-fi has solved this problem.

Name tags

Especially in harder sci-fi shows, spacewalkers have a name tag on the suit. The type is often so small that you’d need to be quite close to read it, and weird convention has these tags in all-capital letters even though lower-case is easier to read, especially in low light and especially at a distance. And the tags are placed near the breast of the suit, so the spacewalker would also have to be facing you. So all told, not that useful on actual extravehicular missions.

Faces

Screen sci-fi usually gets around the identification problem by having transparent visors. In B-movies and sci-fi illustrations from the 1950s and 60s, the fishbowl helmet was popular, but of course it offered little protection, little light control, and weird audio effects for the wearer. Blockbuster movies were mostly a little smarter about it.

1950s Sci-Fi illustration by Ed Emshwiller
c/o Diane Doniol-Valcroze

Seeing faces allows other spacewalkers/characters (and the audience) to recognize individuals and, to a lesser extent, how their faces sync with their voice and movement. People are generally good at reading the kinesics of faces, so there’s a solid rationale for trying to make transparency work.

Face + illumination

As of the 1970s, filmmakers began to add interior lights that illuminate the wearer’s face. This makes lighting them easier, but face illumination is problematic in the real world. If you illuminate the whole face including the eyes, then the spacewalker is partially blinded. If you illuminate the whole face but not the eyes, they get that whole eyeless-skull effect that makes them look super spooky. (Played to effect by director Scott and cinematographer Vanlint in Alien, see below.)

Identification aside: Transparent visors are problematic for other reasons. Permanently-and-perfectly transparent glass risks the spacewalker being damaged by infrared light or blinded by sudden exposure to nearby suns, or explosions, or engine exhaust ports, etc. etc. This is why NASA helmets have the gold layer on their visors: it lets in visible light and blocks nearly all infrared.

Astronaut Buzz Aldrin walks on the surface of the moon near the leg of the lunar module Eagle during the Apollo 11 mission.

Image Credit: NASA (cropped)

Only in 2001 does the survey show a visor with a manually-adjustable translucency. You can imagine that this would be safer if it were automatic. Electronics can respond much faster than people, changing in near-real time to keep sudden environmental illumination within safe human ranges.

You can even imagine smarter visors that selectively dim regions (rather than the whole thing), to just block out, say, the nearby solar flare, or to expose the faces of two spacewalkers talking to each other, but I don’t see this in the survey. It’s mostly just transparency and a hope that nobody realizes those eyeballs would get fried.
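A minimal sketch of what that automatic logic might look like, in Python, assuming a hypothetical ambient light sensor and an electrochromic visor that accepts a tint level between 0 (clear) and 1 (dark):

    # Sketch: keep the light reaching the wearer's eyes within a safe band.
    # The sensor, actuator, and thresholds here are all invented for illustration.

    SAFE_LUX = 2_000      # assumed comfortable upper bound for perceived illuminance
    MAX_TINT = 0.98       # never go fully opaque; the wearer still needs to see

    def update_tint(ambient_lux: float, current_tint: float) -> float:
        """Return a new tint level (0 = clear, 1 = dark) for this control cycle."""
        if ambient_lux <= SAFE_LUX:
            target = 0.0
        else:
            # Fraction of light to block so that what gets through stays safe.
            target = min(1.0 - (SAFE_LUX / ambient_lux), MAX_TINT)
        # Darken immediately (protects the eyes); lighten gradually (avoids flicker).
        if target > current_tint:
            return target
        return max(target, current_tint - 0.05)

    # Example: a nearby engine plume spikes the sensor to 200,000 lux.
    tint = update_tint(200_000, current_tint=0.0)   # jumps straight to 0.98

Selective, per-region dimming would be the same loop run over a grid of independently addressable zones, with the sensor reporting brightness per zone.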

So, though seeing faces helps solve some of the identification problem, transparent enclosures don’t make a lot of sense from a real-world perspective. But it’s immediate and emotionally rewarding for audiences to see the actors’ faces, and with easy cinegenic workarounds, I suspect identification-by-face is here in sci-fi for the long haul, at least until a majority of audiences experience spacewalking for themselves and realize how much of an artistic convention this is.

Color

Other shows have taken the notion of identification further, and distinguished wearers by color. Mission to Mars, Interstellar, and Stowaway did this similarly to the way NASA does it, i.e. with colored bands around the upper arms and sometimes thighs.

Destination Moon, 2001: A Space Odyssey, and Star Trek (2009) provided spacesuits in entirely different colors. (Star Trek even equipped the suits with matching parachutes, though for the pedantic, let’s acknowledge these were “just” upper-atmosphere suits.) The full-suit color certainly makes identification easier at a distance, but seems like it would be more expensive and introduce albedo differences between the suits.

One other note: if the visor is opaque and characters are only relying on the color for identification, it becomes easier for someone to don the suit and “impersonate” its usual wearer to commit spacewalking crimes. Oh. My. Zod. The phlebotinum conduit!

According to the Colour Blind Awareness organisation, colour blindness (color vision deficiency) affects approximately 1 in 12 men and 1 in 200 women in the world, so color identification is not without its problems, and might need to be combined with bold patterns to be more broadly accessible.

What we don’t see

Heraldry

Blog from another Mog Project Rho tells us that books have suggested heraldry as space suit identifiers. And while it could be a device placed on the chest like medieval suits of armor, it might be made larger, higher contrast, and wraparound to be distinguishable from farther away.

Directional audio

Indirect, but if the soundscape inside the helmet can be directional (like a personal surround sound system), then different voices can come from the direction of the speaker, helping uniquely identify them by position. If there are two close together and no others to be concerned about, their apparent directions can be shifted to increase their spatial distinction. When no one is speaking, leitmotifs assigned to each of the other spacewalkers, with volumes corresponding to distance, could help maintain field awareness.
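As a rough sketch of how that might work, assuming the suit knows each crewmate’s position and the wearer’s facing (all the names and numbers here are invented):

    import math

    # Sketch: place a crewmate's voice (or leitmotif) in the wearer's soundscape.
    # Coordinates are meters in the wearer's horizontal frame, +y to their left.

    def stereo_gains(listener_pos, facing_rad, source_pos, ref_dist=10.0):
        """Return (left_gain, right_gain) for audio coming from source_pos."""
        dx = source_pos[0] - listener_pos[0]
        dy = source_pos[1] - listener_pos[1]
        dist = math.hypot(dx, dy) or 0.001
        bearing = math.atan2(dy, dx) - facing_rad    # angle relative to facing
        pan = max(-1.0, min(1.0, -math.sin(bearing)))  # -1 = hard left, +1 = hard right
        left = math.cos((pan + 1) * math.pi / 4)     # constant-power panning
        right = math.sin((pan + 1) * math.pi / 4)
        loudness = min(1.0, ref_dist / dist)         # farther away = quieter
        return left * loudness, right * loudness

    # A crewmate 20 m off to the wearer's left lands fully in the left ear,
    # at half loudness relative to the 10 m reference distance.
    print(stereo_gains((0, 0), 0.0, (0, 20)))        # -> (0.5, 0.0)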

HUD Map

Gamers might expect a map in a HUD that showed the environment and icons for people with labeled names.

Search

If the spacewalker can have private audio, shouldn’t she just be able to ask, “Who’s that?” while looking at someone and hear a reply or see a label on a HUD? It would also be very useful if a spacewalker could ask for lights to be illuminated on the exterior of another’s suit. Especially useful if that other someone is floating unconscious in space.

Mediated Reality Identification

Lastly, I didn’t see any mediated reality assists: augmented or virtual reality. Imagine a context-aware and person-aware heads-up display that labeled the people in sight. Technological identification could also incorporate in-suit biometrics to avoid the spacesuit-as-disguise problem. The helmet camera confirms that the face inside Sergeant McBeef’s suit is actually that dastardly Dr. Antagonist!

We could also imagine that the helmet could be completely enclosed, but be virtually transparent. Retinal projectors would provide the appearance of other spacewalkers—from live cameras in their helmets—as if they had fishbowl helmets. Other information would fit the HUD depending on the context, but such labels would enable identification in a way that is more technology-forward and cinegenic. But, of course, all mediated solutions introduce layers of technology that also introduce more potential points of failure, so they are not a simple choice for the real world.
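Here’s a sketch of how those pieces might hang together, with a hypothetical telemetry feed standing in for the helmet camera, view tracking, and in-suit biometric reader (names borrowed from the scenarios above):

    # Sketch: label suited figures in view, and flag any suit whose registered
    # wearer doesn't match the in-suit biometrics (the suit-as-disguise problem).

    from dataclasses import dataclass

    @dataclass
    class SuitTelemetry:
        suit_id: str
        registered_wearer: str
        biometric_id: str        # who the in-suit sensors say is actually inside
        bearing_deg: float       # where this suit sits in the wearer's view

    def hud_labels(suits, field_of_view_deg=90.0):
        """Return (bearing, text) labels to draw for every suit currently in view."""
        labels = []
        for suit in suits:
            if abs(suit.bearing_deg) > field_of_view_deg / 2:
                continue         # outside the view frustum; don't clutter the HUD
            if suit.biometric_id == suit.registered_wearer:
                labels.append((suit.bearing_deg, suit.registered_wearer))
            else:
                labels.append((suit.bearing_deg,
                               f"WARNING: {suit.suit_id} occupant is not "
                               f"{suit.registered_wearer}"))
        return labels

    crew = [
        SuitTelemetry("EMU-2", "Sgt. McBeef", "Dr. Antagonist", bearing_deg=10.0),
        SuitTelemetry("EMU-3", "Ens. Smith", "Ens. Smith", bearing_deg=-30.0),
    ]
    for bearing, text in hud_labels(crew):
        print(f"{bearing:+.0f} deg  {text}")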

Oh, that’s right, he doesn’t do this professionally.

So, as you can read, there’s no slam-dunk solution that meets both cinegenic and real-world needs. Given that so much of our emotional experience is informed by the faces of actors, I expect to see transparent visors in sci-fi for the foreseeable future. But it’s ripe for innovation.

Sci-fi Spacesuits: Audio Comms


A special subset of spacesuit interfaces is the communication subsystems. I wrote a whole chapter about Communications in Make It So, but spacesuit comms bear special mention, since they’re usually used in close physical proximity but still must be mediated by technology, the channels for detailed control are clumsy and packed, and these communicators are often being overseen by a mission control center of some sort. You’d think this is rich territory, but spoiler: There’s not a lot of variation to study.

Every single spacesuit in the survey has audio. This is so ubiquitous and accepted that, after 1950, no filmmaker has felt the need to explain it or show an interface for it. So you’d think that we’d see a lot of interactions.

Spacesuit communications in sci-fi tend to be many-to-many with no apparent means of control. Not even a push-to-mute if you sneezed into your mic. It’s as if the spacewalkers were in a group, merely standing near each other in air, chatting. No push-to-talk or volume control is seen. Communication with Mission Control is automatic. No audio cues are given to indicate distance, direction, or source of the sound, or to select a subset of recipients.

The one seeming exception to the many-to-many communication is seen in the reboot of Battlestar Galactica. As Boomer is operating a ship above a ground crew, shining a light down on them for visibility, she has the following conversation with Tyrol.

  • Tyrol: Raptor 478, this is DC-1, I have you in my sights.
  • Boomer: Copy that, DC-1. I have you in sight.
  • Tyrol: Understood.
  • Boomer: How’s it looking there? Can you tell what happened?
  • Tyrol: Lieutenant, don’t worry…about my team. I got things under control.
  • Boomer: Copy that, DC-1. I feel better knowing you’re on it.

Then, when her copilot gives her a look about what she has just said, she says curtly to him, “Watch the light, you’re off target.” In this exchange there is clear evidence that the copilot has heard the first conversation, but it appears that her comment is addressed to him alone and not meant for the others to hear. Additionally, we do not hear chatter going on between the ground crew during this exchange. Unfortunately, we do not see any of the conversationalists touch a control to give us an idea about how they switch between these modes. So, you know, still nothing.

More recent films, especially in the MCU, have seen all sorts of communication controlled by voice with the magic of General AI…pause for gif…


…but as I mention more and more, once you have a General AI in the picture, we leave the realm of critique-able interactions. Because an AI did it.

In short, sci-fi just doesn’t care about showing audio controls in spacesuits, and isn’t likely to start caring anytime soon. As always, if you know of something outside my survey, please mention it.

For reference, in the real world, a NASA astronaut has direct control over the volume of audio that she hears, using potentiometer volume controls. (Curiously the numbers on them are not backwards, unlike the rest of the controls.)

A spacewalker uses the COMM dial switch mode selector at the top of the DCM to select between three different frequencies of wireless communication, each of which broadcasts to each other and the vehicle. When an astronaut is on one of the first two channels, transmission is voice-activated. But a backup, “party line” channel requires push-to-talk, and this is what the push-to-talk control is for.

By default, all audio is broadcast to all other spacewalkers, the vehicle, and Mission Control. To speak privately, without Mission Control hearing, spacewalkers don’t have an engineered option. But if one of the radio frequency bands happens to be suffering a loss of signal to Mission Control, she can use this technological blind spot to talk with some degree of privacy.
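To make those modes concrete, here’s a toy model of the logic described above; the structure and names are mine, not NASA’s:

    # Toy model of the comm modes described above: two voice-activated channels
    # plus a push-to-talk "party line" backup. Illustration only, not NASA logic.

    CHANNELS = {
        "A": {"vox": True},        # voice-activated
        "B": {"vox": True},        # voice-activated
        "party": {"vox": False},   # backup channel, push-to-talk only
    }

    def should_transmit(channel: str, voice_detected: bool, ptt_pressed: bool) -> bool:
        """Decide whether the suit radio keys up on the selected channel."""
        if CHANNELS[channel]["vox"]:
            return voice_detected      # mic opens whenever the wearer speaks
        return ptt_pressed             # the party line needs the push-to-talk control

    # A sneeze on channel A goes out to everyone; on the party line it doesn't.
    assert should_transmit("A", voice_detected=True, ptt_pressed=False)
    assert not should_transmit("party", voice_detected=True, ptt_pressed=False)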

Sci-fi Spacesuits: Moving around

Whatever it is, it ain’t going to construct, observe, or repair itself. In addition to protection and provision, suits must facilitate the reason the wearer has dared to go out into space in the first place.

One of the most basic tasks of extravehicular activity (EVA) is controlling where the wearer is positioned in space. The survey shows several types of mechanisms for this. First, if your EVA never needs you to leave the surface of the spaceship, you can go with mountaineering gear or sticky feet. (Or sticky hands.) We can think of maneuvering through space as similar to piloting a craft, but the controls and displays have to be made wearable, like wearable control panels. We might also expect to see some tunnel-in-the-sky displays to help with navigation. We’d also want to see some AI safeguard features, to return the spacewalker to safety when things go awry. (Narrator: We don’t.)

Mountaineering gear

In Stowaway (2021) astronauts undertake unplanned EVAs with carabiners and gear akin to what mountaineers use. This makes some sense, though even this equipment needs to be modified for use with astronauts’ thick gloves.

Stowaway (2021): Drs. Kim and Levinson prepare to climb to the propellant tank.

Sticky feet (and hands)

Though it’s not extravehicular, I have to give a shout out to 2001: A Space Odyssey (1968), where we see a flight attendant manage her position in microgravity with special shoes that adhere to the floor. It’s a lovely example of a competent Hand Wave. We don’t need to know how it works because it says, right there, “Grip shoes.” Done. Though props to the actress Heather Downham, who had to make up a funny walk to illustrate that it still isn’t like walking on earth.

2001: A Space Odyssey (1968)
Pan Am: “Thank god we invented the…you know, whatever shoes.”

With magnetic boots, seen in Destination Moon, the wearer simply walks around and manages the slight awkwardness of having to pull a foot up with extra force, and having it snap back down on its own.

Battlestar Galactica added magnetic handgrips to augment the control provided by magnetized boots. With them, Sergeant Mathias is able to crawl around the outside of an enemy vessel, inspecting it. While crawling, she holds grip bars mounted to circles that contain the magnets. A mechanism for turning the magnet off is not seen, but like these portable electric grabbers, it could be as simple as a thumb button.

Iron Man also had his Mark 50 suit form stabilizing suction cups before cutting a hole in the hull of the Q-Ship.

Avengers: Infinity War (2018)

In the electromagnetic version of boots, seen in Star Trek: First Contact, the wearer turns the magnets on with a control strapped to their thigh. Once on, the magnetization seems to be sensitive to the wearer’s walk, automatically lessening when the boot is lifted off. This gives the wearer something of a natural gait. The magnetism can be turned off again to be able to make microgravity maneuvers, such as dramatically leaping away from Borg minions.

Star Trek: Discovery also included this technology, but with what appears to be gestural activation and cool glowing red dots on the sides and back of the heel. The back of each heel has a stack of red lights that count down to when they turn off, as, I guess, a warning to anyone around them that they’re about to be “air”-borne.
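A minimal sketch of that gait-sensitive behavior, assuming a heel load sensor and an electromagnet with adjustable power (thresholds invented for illustration):

    # Sketch: ease off a boot's electromagnet as the wearer lifts that heel,
    # so walking feels closer to a natural gait. Numbers are invented.

    FULL_POWER = 1.0        # magnet strength while the foot is planted
    SWING_POWER = 0.1       # residual hold while the foot is in swing phase
    PLANTED_NEWTONS = 50.0  # heel load above which we treat the foot as planted

    def magnet_power(heel_load_newtons: float, enabled: bool) -> float:
        """Map heel load to electromagnet power for one boot."""
        if not enabled:
            return 0.0                           # wearer switched the maglocks off
        if heel_load_newtons > PLANTED_NEWTONS:
            return FULL_POWER                    # planted foot holds hard
        return SWING_POWER                       # lifting foot mostly lets go

    print(magnet_power(400.0, enabled=True))     # -> 1.0
    print(magnet_power(5.0, enabled=True))       # -> 0.1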

Quick “gotcha” aside: neither Destination Moon nor Star Trek: First Contact bothers to explain how characters are meant to be able to kneel while wearing magnetized boots. Yet this very thing happens in both films.

Destination Moon (1950): Kneeling on the surface of the spaceship.
Star Trek: First Contact (1996): Worf rises from operating the maglock to defend himself.

Controlled Propellant

If your extravehicular task has you leaving the surface of the ship and moving around space, you likely need a controlled propellant. This is seen only a few times in the survey.

The manned maneuvering unit, or MMU, seen in the film Mission to Mars is based loosely on NASA’s MMU. A nice thing about the device is that unlike the other controlled propellant interfaces, we can actually see some of the interaction and not just the effect. The interfaces are subtly different in that the Mission to Mars spacewalkers travel forward and backward by angling the handgrips forward and backward rather than with a joystick on an armrest. This seems like a closer mapping, but also seems more prone to error from accidental touches or bumping into something.

The plus side is an interface that is much more cinegenic, where the audience is more clearly able to see the cause and effect of the spacewalker’s interactions with the device.

If you have propellant in a Mohs 4 or 5 film, you might need to acknowledge that propellant is a limited resource. Over the course of the same (heartbreaking) scene shown above, we see an interface where one spacewalker monitors his fuel, and another where a spacewalker realizes that she has traveled as far as she can with her MMU and still return to safety.

Mission to Mars (2000): Woody sees that he’s out of fuel.
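That “have I gone too far?” realization is, at its heart, a small calculation the suit could run continuously. A rough sketch, assuming the suit knows its remaining propellant and the spacewalker’s state relative to the ship (every number below is made up):

    # Sketch: is there still enough propellant to stop drifting and fly home?
    # Crude one-dimensional estimate; a real system would track full 3D state,
    # oxygen margins, and generous safety factors.

    def can_still_return(vel_away_mps, dv_per_kg, prop_kg,
                         cruise_mps=0.5, margin=1.25):
        """True if remaining propellant covers killing the drift plus the trip back."""
        dv_available = dv_per_kg * prop_kg
        dv_stop = abs(vel_away_mps)      # burn 1: cancel the drift away from the ship
        dv_return = 2 * cruise_mps       # burns 2 & 3: accelerate home, then brake
        return dv_available >= (dv_stop + dv_return) * margin

    # Drifting away at 0.4 m/s with 1.2 kg of propellant at 2 m/s of delta-v per kg:
    print(can_still_return(vel_away_mps=0.4, dv_per_kg=2.0, prop_kg=1.2))   # -> True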

For those wondering, Michael Burnham’s flight to the mysterious signal in the Star Trek: Discovery pilot uses propellant, but it is managed and monitored by controllers on Discovery, so it makes sense that we don’t see any maneuvering interfaces for her. We could dive in and review the interfaces the bridge crew uses (and try to map that onto a spacesuit), but we only get snippets of these screens and see no controls.

Iron Man’s suits employ some phlebotinum propellant that lasts forever, fits inside his tailored suit, and is powerful enough to achieve escape velocity.

Avengers: Infinity War (2018)

All in all, though sci-fi seems to understand the need for characters to move around in spacesuits, very little attention is given to the interfaces that enable it. The Mission to Mars MMU is the only one with explicit attention paid to it, and that’s quite derived from NASA models. It’s an opportunity for filmmakers, should the needs of the plot allow, to give this topic some attention.

Sci-fi Spacesuits: Interface Locations

A major concern of the design of spacesuits is basic usability and ergonomics. Given the heavy material needed in the suit for protection and the fact that the user is wearing a helmet, where does a designer put an interface so that it is usable?

Chest panels

Chest panels are those that require only that the wearer look down to manipulate them. These are in easy range of motion for the wearer’s hands. The main problem with this location is that there is a hard trade-off between visibility and bulkiness.

Arm panels

Arm panels are those that are—brace yourself—mounted to the forearm. This placement is within easy reach, but does mean that the arm on which the panel sits cannot be otherwise engaged, and it seems like it would be prone to accidental activation. This is a greater technological challenge than a chest panel to keep components small and thin enough to be unobtrusive. It also provides some interface challenges to squeeze information and controls into a very small, horizontal format. The survey shows only three arm panels.

The first is the numerical panel seen in 2001: A Space Odyssey (thanks for the catch, Josh!). It provides discrete and easy input, but no feedback. There are inter-button ridges to kind of prevent accidental activation, but they’re quite subtle and I’m not sure how effective they’d be.

2001: A Space Odyssey (1968)

The second is an oversimplified control panel seen in Star Trek: First Contact, where the output is simply the unlabeled lights underneath the buttons indicating system status.

The third is the mission computers seen on the forearms of the astronauts in Mission to Mars. These full color and nonrectangular displays feature rich, graphic mission information in real time, with textual information on the left and graphic information on the right. Input happens via hard buttons located around the periphery.

Side note: One nifty analog interface is the forearm mirror. This isn’t an invention of sci-fi; it actually appears on real-world EVAs. It costs a lot of propellant or energy to turn a body around in space, but spacewalkers occasionally need to see what’s behind them and the interface on the chest. So spacesuits have mirrors on the forearm to enable a quick view with just arm movement. This was showcased twice in the movie Mission to Mars.

HUDs

The easiest place to see something is directly in front of your eyes, i.e. in a heads-up display, or HUD. HUDs are seen frequently in sci-fi, and increasingly in sci-fi spacesuits as well. One example is Sunshine. This HUD provides a real-time view of each individual to whom the wearer is talking while out on an EVA, and a real-time visualization of dangerous solar winds.

These particular spacesuits are optimized for protection very close to the sun, and the visor is limited to a transparent band set near eye level. These spacewalkers couldn’t look down to see any interfaces on the suit itself, so the HUD makes a great deal of sense here.

Star Trek: Discovery’s pilot episode included a sequence that found Michael Burnham flying 2000 meters away from the U.S.S. Discovery to investigate a mysterious MacGuffin. The HUD helped her with wayfinding, navigating, tracking time before lethal radiation exposure (a biological concern, see the prior post), and even doing a scan of things in her surroundings, most notably a Klingon warrior who appears wearing unfamiliar armor. Reference information sits on the periphery of Michael’s vision, but the augmentations are mapped to her view. (Noting this raises the same issues of binocular parallax seen in the Iron HUD.)

Iron Man’s Mark L armor was able to fly in space, and the Iron HUD came right along with it. Though not designed/built for space, it’s a general AI HUD assisting its spacewalker, so worth including in the sample.

Avengers: Infinity War (2018)

Aside from HUDs, what we see in the survey is similar to what we find in real-world extravehicular mobility units (EMUs), i.e. chest panels and arm panels.

Inputs illustrate paradigms

Physical controls range from the provincial switches and dials on the cigarette-girl foldout control panels of Destination Moon, to the simple and restrained numerical button panel of 2001, to the strangely unlabeled buttons of Star Trek: First Contact’s arm panels (above), and the ham-handed touch screens of Mission to Mars.

Destination Moon (1950)
2001: A Space Odyssey (1968)

As the pictures above reveal, the input panels reflect the familiar technology of the time of the creation of the movie or television show. The 1950s were still rooted in mechanistic paradigms, the late 1960s interfaces were electronic pushbutton, the 2000s had touch screens and miniaturized displays.

Real world interfaces

For comparison and reference, NASA’s EMU has a control panel on the front, called the Display and Control Module, where most of the controls for the EMU sit.

The image shows that the inputs are very different from what we see in film and television. The controls are large for easy manipulation even with thick gloves, distinct in type and location for confident identification, analog to allow for a minimum of failure points and in-field debugging and maintenance, and well-protected from accidental actuation with guards and deep recesses. The digital display faces up for the convenience of the spacewalker. The interface text is printed backwards so it can be read with the wrist mirror.

The outputs are fairly minimal. They consist of the pressure suit gauge, audio warnings, and the 12-character alphanumeric LCD panel at the top of the DCM. No HUD.

The gauge is mechanical and standard for its type. The audio warnings are a simple warbling tone when something’s awry. The LCD panel provides information about 16 different values that the spacewalker might need, including estimated time of oxygen remaining, actual volume of oxygen remaining, pressure (redundant to the gauge), battery voltage or amperage, and water temperature. To cycle up and down the list, she presses the Mode Selector Switch forward and backward. She can adjust the contrast using the Display Intensity Control potentiometer on the front of the DCM.
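The interaction itself is simple enough to model: a list of status values and a switch that steps through it. A sketch in Python (the items, order, and formats here are illustrative, not the actual sequence of 16):

    # Sketch: cycling the DCM's 12-character status display with the mode selector.
    # The items, order, and formats are illustrative, not NASA's actual list.

    STATUS_ITEMS = [
        ("O2 TIME", lambda s: f"{s['o2_minutes']} MIN"),
        ("O2 QTY",  lambda s: f"{s['o2_percent']} PCT"),
        ("PRESS",   lambda s: f"{s['suit_psi']:.1f} PSI"),
        ("BATT",    lambda s: f"{s['batt_volts']:.1f} V"),
        ("H2O TMP", lambda s: f"{s['water_temp_c']} C"),
    ]

    def step(index: int, direction: int) -> int:
        """Mode selector pressed forward (+1) or backward (-1); wrap around."""
        return (index + direction) % len(STATUS_ITEMS)

    def render(index: int, sensors: dict) -> str:
        """Produce the 12-character line shown on the alphanumeric display."""
        label, fmt = STATUS_ITEMS[index]
        return f"{label} {fmt(sensors)}"[:12]

    sensors = {"o2_minutes": 312, "o2_percent": 78,
               "suit_psi": 4.3, "batt_volts": 16.8, "water_temp_c": 7}
    i = 0
    print(render(i, sensors))    # oxygen time remaining
    i = step(i, +1)
    print(render(i, sensors))    # oxygen quantity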

A NASA image tweeted in 2019.

The DCMs referenced in the post are from older NASA documents. In more recent images on NASA’s social media, it looks like there have been significant redesigns to the DCM, but so far I haven’t seen details about the new suit’s controls. (Or about how that tiny thing can house all the displays and controls it needs to.)

Sci-fi Spacesuits: Biological needs

Spacesuits must support the biological functioning of the astronaut. There are probably damned fine psychological reasons not to show astronauts their own biometric data while on stressful extravehicular missions, but there is the issue of comfort. Even if temperature, pressure, humidity, and oxygen levels are kept within safe ranges by automatic features of the suit, there is still a need for comfort and control inside that range. If the suit is to be worn a long time, there must be some accommodation for food, water, urination, and defecation. Additionally, the medical and psychological status of the wearer should be monitored to warn of stress states and emergencies.

Unfortunately, the survey doesn’t reveal any interfaces being used to control temperature, pressure, or oxygen levels. There are some for low oxygen level warnings and testing conditions outside the suit, but these are more outputs than interfaces where interactions take place.

There are also no nods to toilet necessities, though in fairness Hollywood eschews this topic a lot.

The one example of sustenance seen in the survey appears in Sunshine, where we see Captain Kaneda take a sip from his drinking tube while performing a dangerous repair of the solar shields. This is the only food or drink seen in the survey, and it is a simple mechanical interface, held in place by material strength in such a way that he needs only to tilt his head to take a drink.

Similarly, in Sunshine, when Capa and Kaneda perform EVA to repair broken solar shields, Cassie tells Capa to relax because he is using up too much oxygen. We see a brief view of her bank of screens that include his biometrics.

Remote monitoring of people in spacesuits is common enough to be a trope, but it has already been discussed in the Medical chapter of Make It So; see that chapter for more on biometrics in sci-fi.

Crowe’s medical monitor in Aliens (1986).

There are some non-interface biological signals for observers. In the movie Alien, as the landing party investigates the xenomorph eggs, we can see that the suit outgases something like steam—slower than exhalations, but regular. Though not presented as such, the suit certainly confirms for any onlooker that the wearer is breathing and the suit functioning.

Given that sci-fi technology glows, it is no surprise to see that lots and lots of spacesuits have glowing bits on the exterior. Though nothing yet in the survey tells us what these lights might be for, it stands to reason that one purpose might be as a simple and immediate line-of-sight status indicator. When things are glowing steadily, it means the life support functions are working smoothly. A blinking red alert on the surface of a spacesuit could draw attention to the individual with the problem, and make finding them easier.

Emergency deployment

One nifty thing that sci-fi can do (but we can’t yet in the real world) is deploy biology-protecting tech at the touch of a button. We see this in the Marvel Cinematic Universe with Star-Lord’s helmet.

If such tech were available, you’d imagine that it would have some smart sensors to know when it must automatically deploy (sudden loss of oxygen or dangerous impurities in the air), but we don’t see it. But given this speculative tech, one can imagine it working for a whole spacesuit and not just a helmet. It might speed up scenes like this.
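If we were to speculate about that sensor logic, it might look something like this sketch; the thresholds and sensor readings are invented for illustration:

    # Sketch: automatic helmet deployment on sudden loss of pressure or oxygen,
    # or on dangerous impurities in the air. All thresholds are invented.

    SAFE_PRESSURE_KPA = 70.0     # rough lower bound before deploying
    SAFE_O2_FRACTION = 0.16      # below this, the air can't sustain the wearer
    MAX_TOXIN_PPM = 50.0         # any monitored contaminant above this triggers
    FAST_DROP_KPA_PER_S = 5.0    # decompression this fast means don't wait

    def should_deploy(pressure_kpa, o2_fraction, toxin_ppm, pressure_drop_kpa_per_s):
        """Return True if the helmet should snap shut right now."""
        return (pressure_kpa < SAFE_PRESSURE_KPA
                or pressure_drop_kpa_per_s > FAST_DROP_KPA_PER_S
                or o2_fraction < SAFE_O2_FRACTION
                or toxin_ppm > MAX_TOXIN_PPM)

    # A fast leak: pressure still okay for the moment, but dropping fast enough to act.
    print(should_deploy(85.0, 0.21, 0.0, pressure_drop_kpa_per_s=8.0))   # -> True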

What do we see in the real world?

Are there real-world controls that sci-fi is missing? Let’s turn to NASA’s space suits to compare.

The Primary Life-Support System (PLSS) is the complex spacesuit subsystem that provides the life support to the astronaut, and biomedical telemetry back to control. Its main components are the closed-loop oxygen-ventilation system for cycling and recycling oxygen, the moisture (sweat and breath) removal system, and the feedwater system for cooling.

The only “biology” controls that the spacewalker has for these systems are a few on the Display and Control Module (DCM) on the front of the suit. They are the cooling control valve, the oxygen actuator slider, and the fan switch. Only the first is explicitly to control comfort. Other systems, such as pressure, are designed to maintain ideal conditions automatically. Other controls are used for contingency systems for when the automatic systems fail.

Hey, isn’t the text on this thing backwards? Yes, because astronauts can’t look down from inside their helmets, and must view these controls via a wrist mirror. More on this later.

The suit is insulated thoroughly enough that the astronaut’s own body heats the interior, even in complete shade. Because the astronaut’s body constantly adds heat, the suit must be cooled. To do this, the suit cycles water through a Liquid Cooling and Ventilation Garment, which has a fine network of tubes held closely to the astronaut’s skin. Water flows through these tubes and past a sublimator that cools the water with exposure to space. The astronaut can increase or decrease the speed of this flow, and thereby the degree to which their body is cooled, with the cooling control valve, a recessed radial valve with fixed positions between 0 (the hottest) and 10 (the coolest), located on the front of the Display and Control Module.

The spacewalker does not have EVA access to her biometric data. Sensors measure oxygen consumption and electrocardiograph data and broadcast it to the Mission Control surgeon, who monitors it on her behalf. So whatever the reason is, if it’s good enough for NASA, it’s good enough for the movies.


Back to sci-fi

So, we do see temperature and pressure controls on suits in the real world, which underscores their absence in sci-fi. But, if there hasn’t been any narrative or plot reason for such things to appear in a story, we should not expect them.

Sci-fi Spacesuits: Protecting the Wearer from the Perils of Space

Space is incredibly inhospitable to life. It is a near-perfect vacuum, lacking air, pressure, and warmth. It is full of radiation that can poison us, light that can blind and burn us, and a darkness that can disorient us. If any hazardous chemicals such as rocket fuel have gotten loose, they need to be kept safely away. There are few of the ordinary spatial cues and tools that humans use to orient and control their position. There is free-floating debris, ranging from bullet-like micrometeorites to gas and rock planets that can pull us toward them to smash into their surfaces or burn up in their atmospheres. There are astronomical bodies such as stars and black holes that can boil us or crush us into a singularity. And perhaps most terrifyingly, there is the very real possibility of drifting off into the expanse of space to asphyxiate, starve (though biology will be covered in another post), freeze, and/or go mad.

The survey shows that sci-fi has addressed most of these perils at one time or another.

Alien (1979): Kane’s visor is melted by a facehugger’s acid.

Interfaces

Despite the acknowledgment of all of these problems, the survey reveals only two interfaces related to spacesuit protection.

Battlestar Galactica (2004) handled radiation exposure with a simple, chemical output device. As CAG Lee Adama explains in “The Passage,” the badge, worn on the outside of the flight suit, slowly turns black with radiation exposure. When the badge turns completely black, a pilot is removed from duty for radiation treatment.

This is something of a stretch because it has little to do with the spacesuit itself, and is strictly an output device. (Noting that proper interaction requires human input and state changes.) The badge is not permanently attached to the suit, and it is used inside a spaceship while the pilot is wearing a flight suit. The flight suit is meant to act as a very short-term extravehicular mobility unit (EMU), but is not a spacesuit in the strict sense.

The other protection related interface is from 2001: A Space Odyssey. As Dr. Dave Bowman begins an extravehicular activity to inspect seemingly-faulty communications component AE-35, we see him touch one of the buttons on his left forearm panel. Moments later his visor changes from being transparent to being dark and protective.

We should expect to see few interfaces, but still…

As a quick and hopefully obvious critique, Bowman’s tinting function shouldn’t need a manual interface. It should be automatic (not even agentive), since events can happen much faster than human response times. And, now that we’ve said that part out loud, maybe it’s true that protection features of a suit should all be automatic. Interfaces to pre-emptively switch them on or, for exceptional reasons, manually turn them off, should be the rarity.

But it would be cool to see more protective features appear in sci-fi spacesuits. An onboard AI detects an incoming micrometeorite storm. Does the HUD show how much time is left? What are the wearer’s options? Can she work through scenarios of action? Can she merely speak which course of action she wants the suit to take? If a wearer is kicked free of the spaceship, the suit should have a homing feature. Think Doctor Strange’s Cloak of Levitation, but for astronauts.
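To make that thought experiment concrete, here’s a sketch of what a suit’s protection agent might do with such a warning; the actions, timings, and protection values are all invented:

    # Sketch: a speculative protection agent ranks responses to an incoming threat
    # by whether they can be completed before impact. Everything here is invented.

    ACTIONS = [
        # (name, seconds needed to complete, relative protection 0..1)
        ("Shelter behind the hull",      40.0, 0.95),
        ("Return through the airlock",  180.0, 1.00),
        ("Harden suit and brace",         5.0, 0.40),
    ]

    def recommend(seconds_to_impact: float):
        """Return feasible actions, best protection first, for the HUD to display."""
        feasible = [a for a in ACTIONS if a[1] <= seconds_to_impact]
        if not feasible:
            return [("Harden suit and brace", 5.0, 0.40)]   # last resort
        return sorted(feasible, key=lambda a: a[2], reverse=True)

    # Storm detected 60 seconds out: the airlock is out of reach, hull shelter wins.
    for name, seconds, protection in recommend(60.0):
        print(f"{name:<28} {seconds:>5.0f}s  protection {protection:.0%}")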

As always, if you know of other examples not in the survey, please put them in the comments.

Spacesuits in Sci-fi

“Why cannot we walk outside [the spaceship] like the meteor? Why cannot we launch into space through the scuttle? What enjoyment it would be to feel oneself thus suspended in ether, more favored than the birds who must use their wings to keep themselves up!”

—The astronaut Michel Ardan in Round the Moon by Jules Verne (1870)

When we were close to publication on Make It So, we wound up being way over the maximum page count for a Rosenfeld Media book. We really wanted to keep the components and topics sections, and that meant we had to cut the section on things. Spacesuits was one of the chapters I drafted about things. I am presenting that chapter here on the blog. n.b. This was written ten years ago in 2011. There are almost certainly other more recent films and television shows that could serve as examples. If you, the reader, notice any…well, that‘s what the comments section is for.

Sci-fi doesn’t have to take place in interplanetary space, but a heck of a lot of it does. In fact, the first screen-based science fiction film is all about a trip to the moon.

Le Voyage dans la Lune (1902): The professors suit up for their voyage to the moon by donning conical caps, neck ruffles, and dark robes.

Most of the time, traveling in this dangerous locale happens inside spaceships, but occasionally a character must travel out bodily into the void of space. Humans—and pretty much everything (no not them) we would recognize as life—cannot survive there for very long at all. Fortunately, the same conceits that sci-fi adopts to get characters into space can help them survive once they’re there.

Establishing terms

An environmental suit is any suit that helps the wearer survive in an inhospitable environment. Environmental suits began with underwater diving suits, and later high-altitude suits. For space travel, pressure suits are to be worn during the most dangerous times, i.e. liftoff and landing, when an accident may suddenly decompress a spacecraft. A spacesuit is an environmental suit designed specifically for survival in outer space. NASA refers to spacesuits as Extravehicular Mobility Units, or EMUs. Individuals who wear the spacesuits are known as spacewalkers. The additional equipment that helps a spacewalker move around space in a controlled manner is the Manned Maneuvering Unit, or MMU.

Additionally, though many other agencies around the world participate in the design and engineering of spacesuits, there is no convenient way to reference them and their efforts as a group, so Aerospace Community is used as a shorthand. This also helps to acknowledge that my research and interviews were primarily with sources from NASA.

The design of the spacesuit is an ongoing and complicated affair. To speak of “the spacesuit” as if it were a single object ignores the vast number of iterations and changes made to the suits between each cycle of engineering, testing, and deployment, much less between different agencies working on their own designs. So, for those wondering, I’m using the Russian Orlan spacesuit currently being used on the International Space Station and shuttle missions as the default design when speaking about modern spacesuits.

Spacesuit Orlan-MKS at MAKS-2013(air show) (fragment) CC BY-SA 4.0

What the thing’s got to do

A spacesuit, whether in sci-fi or the real world, has to do five things.

  1. Protect the wearer from the perils of interplanetary space.
  2. Accommodate the wearer’s ongoing biological needs.
  3. Help them move around.
  4. Facilitate communication between them, other spacewalkers, and mission control.
  5. Identify who is wearing the suit for others.

Each of these categories of functions, and the related interfaces, is discussed in the following posts.

Sci-fi Interfaces Q&A with Perception Studio

First, congratulations to Perception Studio for the excellent work on Black Panther! Readers can see Perception’s own write up about the interfaces on their website. (Note that the reviewers only looked at this after the reviews were complete, to ensure we were looking at end-result, not intent. Also all images in this post were lifted from that page, with permission, unless otherwise noted.)

John LePore of Perception Studio reached out to me when we began to publish the reviews, asking if he could shed light on anything. So I asked if he would be up for an email interview when the reviews were complete. This post is all that wonderful shed light.

What exactly did Perception do for the film?

John: Perception was brought aboard early in the process for the specific purpose of consulting on potential areas of interest in science and technology. A brief consulting sprint evolved into 18 months of collaboration that included conceptual development and prototyping of various technologies for use in multiple sequences and scenarios. The most central of these elements was the conceptualization and development of the vibranium sand interfaces throughout the film. Some of this work was used as design guidelines for various vfx houses while other elements were incorporated directly into the final shots by Perception. In addition to the various technologies, Perception worked closely on two special sequences in the film—the opening ‘history of Wakanda’ prologue, and the main-on-end title sequence, both of which were based on the technological paradigm of vibranium sand.

What were some of the unique challenges for Black Panther?

John: We encountered various challenges on Black Panther, both conceptual and technical. An inspiring challenge was the need to design the most advanced technology in the Marvel Cinematic Universe, while conceptualizing something that had zero influence from any existing technologies. There were lots of challenges around dynamic sand, and even difficulty rendering when a surge in the crypto market made GPUs hard to come by!

One of the things that struck me about Black Panther is the ubiquity of (what appear to be) brain-computer interfaces. How was it working with speculative tech that seemed so magical?

John: From the very start, it was very important to us that all of the technology we conceptualized was grounded in logic, and had a pathway to feasibility. We worked hard to hold ourselves to these constraints, and looked for every opportunity to include signals for the audience (sometimes nuanced, sometimes obvious) as to how these technologies worked. At the same time, we know the film will never stop dead in its tracks to explain technology paradigm #6. In fact, one of our biggest concerns was that any of the tech would appear to be ‘made of magic’.

Chris: Ooh, now I want to know what some of the nuanced signals were!

John: One of the key nuances that made it from rough tests to the final film was that the vibranium sand ‘bounces’ to life with a pulse. This is best seen in the tactical table in the Royal Talon at the start of the film. The ‘bounce’ was intended to be a rhythmic cue to the idea of ultrasonic soundwaves triggering the levitating sand.

Similarly, you can find cymatic patterns in numerous effects in the film.

Did you know going in that you’d be creating something that would be so important to black lives?

John: Sometimes on a film it is hard to imagine how it will be received. On Black Panther, all the signals were clear that the film would be deeply important: from our early peeks at concept art of Wakanda, to witnessing the way Marvel Studios supported Ryan Coogler’s vision. The whole time working on the film the anticipation kept growing, and at the core of the buzz was an incredibly strong black fandom. Late in our process, the hype was still increasing—it was becoming obvious that Black Panther could be the biggest Marvel film to date. I remember working on the title sequence one night, a couple months before release, and Ryan played (over speakerphone) the song that would accompany the sequence. We were bugging out— “Holy shit that’s Kendrick!”… it was just another sign that this film would be truly special, and deeply dedicated to an under-served audience.

How did working on the film affect the studio?

John: For us it’s been one of our proudest moments— it combined everything we love in terms of exciting concept development, aesthetic innovation and ambitious technical execution. The project is a key trophy in our portfolio, and I revisit it regularly when presenting at conferences or attracting new clients, and I’m deeply proud that it continues to resonate. 

Where did you look for inspiration when designing?

John: When we started, the brief was simple: Best tech, most unique tech, and centered around vibranium. With a nearly open canvas, the element of vibranium (only seen previously as Captain America’s shield) sent us pursuing vibration and sound as a starting point. We looked deeply into cymatic patterns and other sound-based phenomena like echo-location. About a year prior, we were working with an automotive supplier on a technology that used ultrasonic soundwaves to create ‘mid-air haptics’… tech that lets you feel things that aren’t really there. We then discovered that the University of Tokyo was doing experiments with the same hardware to levitate styrofoam particles with limited movement. Our theory was that with the capabilities of vibranium, this effect could levitate and translate millions of particles simultaneously.

Beyond technical and scientific phenomena, there was tremendous inspiration to be taken from African culture in general. From textile patterns, to colors of specific spices and more, there were many elements that influenced our process.

What thing about working on the film do you think most people in audiences would be surprised by?

John: I think the average audience member would be surprised by how much time and effort goes into these pieces of the film. There are so many details that are considered and developed, without explicitly figuring into the plot of the film. We consider ourselves fortunate that film after film Marvel Studios pushes to develop these ideas that in other films are simply ‘set dressing’.

Chris: Lastly, I like finishing interviews with these questions.

What, in your opinion, makes for a great fictional user interface?

John: I love it when you are presented with innovative tech in a film and just by seeing it you can understand the deeper implications. Having just enough information to make assumptions about how it works, why it works, and what it means to a culture or society. If you can invite this kind of curiosity, and reward this fascination, the audience gets a satisfying gift. And if these elements pull me in, I will almost certainly get ‘lost’ in a film…in the best way. 

What’s your favorite sci-fi interface that someone else designed? (and why)

John: I always loved two that stood out to me for the exact reasons mentioned above.

One is Westworld’s tablet-based Dialog Tree system. It’s not the most radical UI design etc, but it means SO much to the story in that moment, and immediately conveys a complicated concept effortlessly to the viewer.

from Westworld Season 01 Episode 06, “The Adversary”

Another see-it-and-it-makes-sense tech concept is the live-tracked projection camera system from Mission Impossible: Ghost Protocol. It’s so clever, so physical, and you understand exactly how it works (and how it fails!). When I saw this in the theatre, I turned to my wife and whispered, “You see, the camera is moving to match the persp…” and she glared at me and said “I get it! Everybody gets it!” The clever execution of the gadget and scene made me, the viewer, feel smarter than I actually was!

from Mission: Impossible – Ghost Protocol (2011)

What’s next for the studio?

The Perception team is continuing to work hard in our two similar paths of exploration—film and real-world tech. This year we have seen our work appear in Marvel’s streaming shows, with more to come. We’ve also been quite busy in the technology space, working on next-generation products from technology platforms to exciting automobiles. The past year has been busy and full of changes, but no matter how we work, we continue to be fascinated and inspired by the future ahead.