Well, that was a solstice. As noted, I took time off from the blog to make progress on some other things. Those things aren’t done yet (I’m making fine progress on them, thank you for asking), but it’s time for the Fritzes! Following are the candidates for the 2022 Fritz awards, recognizing excellence in sci-fi interfaces across the prior year.
Note: There are some movies that might have been nominated but were only released in cinemas in 2021, and as of the time of this post do not have a home streaming option. I have immunocompromised people in my family, a child too young to be vaccinated, I’m not an accelerationist, I ain’t famous enough for studios to send me Oscar-esque review copies, and the drive-in experience sucks for air. So these films—notably including the MCU’s Spider-Man: No Way Home—were not considered. Sorry, but global pandemic not sorry. Still, I’m happy with what did make the cut.
Best Believable
These movies’ interfaces adhere to solid HCI principles and believable interactions. They engage us in the story world by being convincing. The nominees for Best Believable are Needle in a Timestack, Stowaway, and Swan Song.
Best Narrative
These movies’ interfaces blow us away with wonderful visuals and the richness of their future vision. They engross us in the story world by being spectacular. The nominees for Best Narrative are The Mitchells vs. The Machines, Reminiscence, and The Matrix Resurrections.
Audience choice: Robot
It could be my bias from working on and teaching about AI, but I noticed a preponderance of interesting robots last year. So for this year there’s a new category of Audience Choice, and that’s for Robot! Look for an upcoming post with a link to vote on your favorite. The candidates are the unnamed bartender from Cosmic Sin (who gets maybe seconds of screen time, but is interesting nonetheless), Jeff from Finch, Eric and Deborahbot 5000 from The Mitchells vs. The Machines, Bubs from Space Sweepers, and Steve from the unsettling Settlers.
Audience choice: Movie
All of the movies nominated for other awards will be presented for an Audience Choice award. Watch this space for when the ballot is open. In the meantime, if like me you want to see all the candidates so you can be elated or outraged at results, start watching now.
Best Interfaces
The movies nominated for Best Interfaces manage the extraordinary challenge of being believable and helping to paint a picture of the world of the story. They advance the state of the art in telling stories with speculative technology. The nominees for Best Interfaces are Oxygen, Space Sweepers, and Voyagers.
In prior years I’ve done custom edits of the nominees’ interfaces, but those supercuts keep getting yoinked from YouTube despite obvious Fair Use, and I don’t have the time or willpower to fight it, so we’ll all have to make do with trailers (above) that don’t include interfaces plus individual posts with screenshots (coming soon). Thank the IP lawyers for that.
Winners will be announced near the end of March. And while I don’t have any idea how I’d find a single address to send physical awards to, I’d like to try for that this year.
As always, please remember that the award looks at the interfaces in the movies rather than the movies overall.
If you recall from the first entry in this series of posts, the Spacesuits content was originally drafted as a chapter in the Make It So book, but had to be cut for length. Now here we are, six posts and 6,700 words later. So it was probably a good call on the part of the publisher. 🙂 But now that we’re here, what have we learned?
Spacesuits are a particularly interesting example of the relationship of sci-fi and design because a tiny fraction of sci-fi audiences will ever experience being in one, but many people have seen them being used in real-world circumstances. This gives them a unique representational anchor in sci-fi, and the survey reveals this. Sci-fi makers base their designs on the surface style with occasional additions or extensions, depending on the fashionable technology of the time. These additions rarely make it to the real world because they’re often made without consideration of the real constraints of keeping a human alive in space. But they’re still cool.
And the winner is…?
If I had to name the franchise that gets it right the most, it’s probably Star Trek. Keep in mind that this has been far from an exhaustive survey (“Yeah, like, where is The Expanse?,” I hear me cry), and the Star Trek franchise is vast and decades old, with most of its stories set on spacecraft. Extravehicular activity is a natural fit for the show and there’s been lots of it. I’m not dismissing that: if it were just a numbers game rather than a question of quality design, we would expect it to win. But the work done on Star Trek: Discovery has been beautiful, and lucky for us, the franchise has consistently showcased the most inspiring examples and most(ly) functional interfaces as well. If you’re looking for inspiration, maybe start there.
What lessons can we learn?
As a particular kind of wearables, spacesuit interfaces reinforce all the principles I originally outlined for Ideal Wearables way back in 2014. They must be…
Easy to access and use
Tough to accidentally activate
Have apposite inputs and outputs
…all pushed through the harder constraints listed at the top of the article. We have some additional lessons about where to put interfaces on spacesuits given those constraints, but it seems pretty well tied to this domain and difficult to generalize. That is, unless climate change has us all donning environmental suits just to enjoy our own planet in a few degrees Centigrade. Wait, I did not mean to go that dark. Even though climate change is a massive crisis and we should commit to halting it and reversing it if possible. (Hey, check out these cool tree-planting drones.)
Let’s instead focus on a mild prognostication. I expect that we’ll be seeing more sci-fi spacesuits in the near future, partly because space travel has been on a kick lately with the high-profile and branding-conscious missions of SpaceX. Just this week a Crew Dragon took the first commercial flight of four civilians into space. (Not the first civilians in space: according to Harvard professor Jonathan McDowell, that honor belongs to the Soyuz TMA-3 mission in 2003, but that was still a government operation.) For better or for worse, part of how SpaceX is making its name is by bringing a new, cool aesthetic to space travel.
So people are seeing spacesuits again (though am I right…no extravehicular activities?) and that means spacesuits will be on the minds of studios and writers, and they will give them their own fantastic spin, which will in turn inspire real-world designers, etc. etc. Illustrators and industrial designers are already posting some amazing speculative designs of late, and I look forward to more inspiring designs to come.
I think my spaceship knows which way to go
You may have noticed that this post comes an uncommonly long time after the prior post. I had cut down my publishing cadence at the start of the pandemic to once every other week because stress, and even that has been difficult to keep up. But now we are heading into fall, and the winter holidays and a cluster of family birthdays and whatnot usually keep me busy through March. Plus I’m about to start hosting a regular session with Ambition Group about AI Mastery for Design Leaders, and as a first-time curriculum, it’s going to demand much of me on top of my full-time job. (You didn’t think I did scifiinterfaces professionally, did you? This is a hobby.) And I’m making some baby steps in publishing my own sci-fi short stories. Keep an eye on Escape Pod and Dark Matter Magazine over the fall if you want to catch those. (I’ll almost certainly tweet about them, too.) I want to work on others.
Which is all to say that I’m on the verge of being overcommitted and burnt out, and so I’m going to do myself a favor and take a break from posting here for a while. Sadly, I don’t have any guest posts in the works. Who would be crazy enough to critique sci-fi interfaces during a climate crisis, ongoing fascist movements, and a global pandemic?
I do have big plans for a major study of the narrative uses of sci-fi interfaces, which I hope to use time off in the winter holiday to conduct. That will probably be as huge as the Untold AI and the Gendered AI series. I have nascent notions of using that study as a last bit of material to collect into a 10-year retrospective follow-up to Make It So (let me know if that sounds appealing). And I’m committed to another round of Fritz awards for 2022. So more is coming, and I’ll be back before you know it.
But for a while, over and out, readers. And don’t forget while I’m gone…
Spacesuits are functional items, built largely identically to each other, adhering to engineering specifications rather than individualized fashion. A resulting problem is that it might be difficult to distinguish between multiple, similarly-sized individuals wearing the same suits. This visual identification problem might be small in routine situations:
(Inside the vehicle:) Which of these suits is mine?
What’s the body language of the person currently speaking on comms?
(With a large team performing a manual hull inspection:) Who is that approaching me? If it’s the Fleet Admiral I may need to stand and salute.
But it could quickly become vital in others:
Whose body is that floating away into space?
Ensign Smith just announced they have a tachyon bomb in their suit. Which one is Ensign Smith?
Who is this on the security footage cutting the phlebotinum conduit?
There are a number of ways sci-fi has solved this problem.
Especially in harder sci-fi shows, spacewalkers have a name tag on the suit. The type is often so small that you’d need to be quite close to read it, and weird convention has these tags in all-capital letters even though lower-case is easier to read, especially in low light and especially at a distance. And the tags are placed near the breast of the suit, so the spacewalker would also have to be facing you. So all told, not that useful on actual extravehicular missions.
Screen sci-fi usually gets around the identification problem by having transparent visors. In B-movies and sci-fi illustrations from the 1950s and 60s, the fishbowl helmet was popular, but of course it offered little protection, little light control, and weird audio effects for the wearer. Blockbuster movies were mostly a little smarter about it.
Seeing faces allows other spacewalkers/characters (and the audience) to recognize individuals and, to a lesser extent, how their faces synch with their voice and movement. People are generally good at reading the kinesics of faces, so there’s a solid rationale for trying to make transparency work.
Face + illumination
As of the 1970s, filmmakers began to add interior lights that illuminate the wearer’s face. This makes lighting them easier, but face illumination is problematic in the real world. If you illuminate the whole face including the eyes, then the spacewalker is partially blinded. If you illuminate the whole face but not the eyes, they get that whole eyeless-skull effect that makes them look super spooky. (Played to effect by director Scott and cinematographer Vanlint in Alien, see below.)
Identification aside: Transparent visors are problematic for other reasons. Permanently-and-perfectly transparent glass risks the spacewalker getting damaged by infrared light or blinded by sudden exposure to nearby suns, or explosions, or engine exhaust ports, etc. etc. This is why NASA helmets have the gold layer on their visors: it lets in visible light and blocks nearly all infrared.
Only in 2001 does the survey show a visor with a manually-adjustable translucency. You can imagine that this would be safer if it were automatic. Electronics can respond much faster than people, changing in near-real time to keep sudden environmental illumination within safe human ranges.
You can even imagine smarter visors that selectively dim regions (rather than the whole thing), to just block out, say, the nearby solar flare, or to expose the faces of two spacewalkers talking to each other, but I don’t see this in the survey. It’s mostly just transparency and hope nobody realizes these eyeballs would get fried.
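For the technically inclined, the logic of such a region-dimming visor fits in a few lines. To be clear, this is purely a toy sketch of the idea: the SAFE_LUX threshold, the sensor grid, and the function name are all my inventions, not anything from the survey.

```python
# Hypothetical sketch: per-region electrochromic visor dimming.
# Assumes a grid of luminance sensors (in lux) with a matching grid
# of independently dimmable visor cells.

SAFE_LUX = 10_000  # invented comfort ceiling for transmitted light


def dim_levels(lux_grid):
    """Return a 0.0-1.0 opacity for each visor cell.

    Cells at or below SAFE_LUX stay fully transparent; brighter cells
    are dimmed proportionally so transmitted light stays near SAFE_LUX.
    """
    return [
        [0.0 if lux <= SAFE_LUX else 1.0 - SAFE_LUX / lux for lux in row]
        for row in lux_grid
    ]


# A nearby flare saturates only the upper-right cell; the rest of the
# visor (and the face of a fellow spacewalker) stays clear.
grid = [[8_000, 900_000],
        [7_500,   9_000]]
```

The point of the sketch is the selectivity: only the cell looking at the flare goes nearly opaque, while the others stay at zero.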
So, though seeing faces helps solve some of the identification problem, transparent enclosures don’t make a lot of sense from a real-world perspective. But it’s immediate and emotionally rewarding for audiences to see the actors’ faces, and with easy cinegenic workarounds, I suspect identification-by-face is here in sci-fi for the long haul, at least until a majority of audiences experience spacewalking for themselves and realize how much of an artistic convention this is.
Other shows have taken the notion of identification further, and distinguished wearers by color. Mission to Mars, Interstellar, and Stowaway did this similar to the way NASA does it, i.e. with colored bands around upper arms and sometimes thighs.
Destination Moon, 2001: A Space Odyssey, and Star Trek (2009) provided spacesuits in entirely different colors. (Star Trek even equipped the suits with matching parachutes, though for the pedantic, let’s acknowledge these were “just” upper-atmosphere suits.) The full-suit color certainly makes identification easier at a distance, but seems like it would be more expensive and introduce albedo differences between the suits.
One other note: if the visor is opaque and characters are only relying on the color for identification, it becomes easier for someone to don the suit and “impersonate” its usual wearer to commit spacewalking crimes. Oh. My. Zod. The phlebotinum conduit!
According to the Colour Blind Awareness organisation, colour blindness (color vision deficiency) affects approximately 1 in 12 men and 1 in 200 women in the world, so color identification is not without its problems, and might need to be combined with bold patterns to be more broadly accessible.
What we don’t see
Blog-from-another-mog Project Rho tells us that books have suggested heraldry as spacesuit identifiers. And while it could be a device placed on the chest like medieval suits of armor, it might be made larger, higher contrast, and wraparound to be distinguishable from farther away.
Indirect, but if the soundscape inside the helmet can be directional (like personal surround sound), then different voices can come from the direction of each speaker, helping uniquely identify them by position. If two are close together and there are no others to be concerned about, their directions can be shifted apart to increase their spatial distinction. When no one is speaking, leitmotifs assigned to each spacewalker, with volumes corresponding to distance, could help maintain field awareness.
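As a toy illustration of how that directional soundscape might work, here’s a constant-power stereo pan scaled by distance. Everything here (the function name, the 50-meter reference range) is a made-up sketch for illustration, not any real comms spec.

```python
# Hypothetical sketch: place a crewmate's voice in the helmet's
# stereo field by bearing, and scale volume by distance.
import math


def spatialize(bearing_deg, distance_m, ref_m=50.0):
    """Return (left_gain, right_gain) for a voice at the given
    bearing (0 = dead ahead, +90 = hard right) and distance.

    Uses a constant-power pan so loudness stays even as the source
    sweeps across the field; volume falls off beyond ref_m meters.
    """
    # Map bearing -90..+90 degrees onto a pan angle of 0..90 degrees.
    theta = math.radians((bearing_deg + 90.0) / 2.0)
    attenuation = min(1.0, ref_m / max(distance_m, 1.0))
    return (math.cos(theta) * attenuation,
            math.sin(theta) * attenuation)
```

A voice dead ahead comes out equally in both ears; one hard to the right comes out almost entirely in the right ear, at a volume that hints at how far away its owner is floating.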
Gamers might expect a map in a HUD that showed the environment and icons for people with labeled names.
If the spacewalker can have private audio, shouldn’t she just be able to ask, “Who’s that?” while looking at someone and hear a reply or see a label on a HUD? It would also be very useful if a spacewalker could ask for lights to be illuminated on the exterior of another’s suit. Very useful if that other someone is floating unconscious in space.
Mediated Reality Identification
Lastly, I didn’t see any mediated-reality assists: augmented or virtual reality. Imagine a context-aware and person-aware heads-up display that labeled the people in sight. Technological identification could also incorporate in-suit biometrics to avoid the spacesuit-as-disguise problem. The helmet camera confirms that the face inside Sergeant McBeef’s suit is actually that dastardly Dr. Antagonist!
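A toy sketch of that biometric check might look like the following. The roster, suit IDs, names, and function are all invented for illustration, of course, not from any film in the survey.

```python
# Hypothetical sketch: HUD labels each suit by its transponder, but
# also compares the helmet camera's face match against the roster of
# registered wearers to catch the spacesuit-as-disguise trick.

ROSTER = {"SUIT-07": "Sgt. McBeef", "SUIT-12": "Ens. Smith"}


def hud_label(suit_id, face_match):
    """Return the HUD label for a suit, flagging one whose occupant's
    face doesn't match the registered wearer.

    face_match is the name from face recognition, or None if the
    visor is opaque / the face can't be verified.
    """
    registered = ROSTER.get(suit_id, "UNKNOWN SUIT")
    if face_match is None:
        return f"{registered} (unverified)"
    if face_match == registered:
        return registered
    return f"ALERT: {face_match} wearing {registered}'s suit"
```

With something like this, our phlebotinum-conduit saboteur gets caught the moment the security footage runs the check.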
We could also imagine that the helmet could be completely enclosed, but be virtually transparent. Retinal projectors would provide the appearance of other spacewalkers—from live cameras in their helmets—as if they had fishbowl helmets. Other information would fit the HUD depending on the context, but such labels would enable identification in a way that is more technology-forward and cinegenic. But, of course, all mediated solutions introduce layers of technology that also introduce more potential points of failure, so not a simple choice for the real world.
So, as you can read, there’s no slam-dunk solution that meets both cinegenic and real-world needs. Given that so much of our emotional experience is informed by the faces of actors, I expect to see transparent visors in sci-fi for the foreseeable future. But it’s ripe for innovation.
A special subset of spacesuit interfaces is the communication subsystem. I wrote a whole chapter about Communications in Make It So, but spacesuit comms bear special mention, since they’re usually used in close physical proximity but still must be mediated by technology, the channels for detailed control are clumsy and packed, and these communicators are often being overseen by a mission control center of some sort. You’d think this is rich territory, but spoiler: There’s not a lot of variation to study.
Every single spacesuit in the survey has audio. This is so ubiquitous and accepted that, after 1950, no filmmaker has felt the need to explain it or show an interface for it. So you’d think that we’d see a lot of interactions.
Spacesuit communications in sci-fi tend to be many-to-many with no apparent means of control. Not even a push-to-mute if you sneezed into your mic. It’s as if the spacewalkers were in a group, merely standing near each other in air, chatting. No push-to-talk or volume control is seen. Communication with Mission Control is automatic. No audio cues are given to indicate distance, direction, or source of the sound, or to select a subset of recipients.
The one seeming exception to the many-to-many communication is seen in the reboot of Battlestar Galactica. As Boomer is operating a ship above a ground crew, shining a light down on them for visibility, she has the following conversation with Tyrol.
Raptor 478, this is DC-1, I have you in my sights.
Copy that, DC-1. I have you in sight.
How’s it looking there? Can you tell what happened?
Lieutenant, don’t worry…about my team. I got things under control.
Copy that, DC-1. I feel better knowing you’re on it.
Then, when her copilot gives her a look about what she has just said, she says curtly to him, “Watch the light, you’re off target.” In this exchange there is clear evidence that the copilot has heard the first conversation, but it appears that her comment to him is addressed to him and not for the others to hear. Additionally, we do not hear chatter going on between the ground crew during this exchange. Unfortunately, we do not see any of the conversationalists touch a control to give us an idea about how they switch between these modes. So, you know, still nothing.
More recent films, especially in the MCU, have seen all sorts of communication controlled by voice with the magic of General AI…pause for gif…
…but as I mention more and more, once you have a General AI in the picture, we leave the realm of critique-able interactions. Because an AI did it.
In short, sci-fi just doesn’t care about showing audio controls in sci-fi spacesuits, and isn’t likely to start caring anytime soon. As always, if you know of something outside my survey, please mention it.
For reference, in the real world, a NASA astronaut has direct control over the volume of audio that she hears, using potentiometer volume controls. (Curiously the numbers on them are not backwards, unlike the rest of the controls.)
A spacewalker uses the COMM dial switch mode selector at the top of the DCM to select between three different frequencies of wireless communication, each of which broadcasts to each other and the vehicle. When an astronaut is on one of the first two channels, transmission is voice-activated. But a backup, “party line” channel requires push-to-talk, and this is what the push-to-talk control is for.
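For the technically inclined, the mode logic described above can be sketched as a tiny state machine. The class and method names are mine; only the three-channel, voice-activated-versus-push-to-talk behavior comes from the NASA description.

```python
# Sketch of the DCM comm-mode logic: three frequencies, the first two
# voice-activated (VOX), and a "party line" backup channel that
# requires push-to-talk.

class SuitComms:
    VOX_CHANNELS = {1, 2}
    PARTY_LINE = 3

    def __init__(self):
        self.channel = 1
        self.ptt_pressed = False

    def select(self, channel):
        """Turn the COMM dial switch mode selector."""
        if channel not in (1, 2, 3):
            raise ValueError("COMM selector has only three positions")
        self.channel = channel

    def transmitting(self, voice_detected):
        """True when the mic is live on the current channel."""
        if self.channel in self.VOX_CHANNELS:
            return voice_detected        # VOX: speaking keys the mic
        return self.ptt_pressed          # party line: push-to-talk only
```

Trivial as it is, this little model already has more observable control logic than any spacesuit comm interface in the survey.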
By default, all audio is broadcast to all other spacewalkers, the vehicle, and Mission Control. To speak privately, without Mission Control hearing, spacewalkers don’t have an engineered option. But if one of the radio frequency bands happens to be suffering a loss of signal to Mission Control, she can use this technological blind spot to talk with some degree of privacy.
Whatever it is, it ain’t going to construct, observe, or repair itself. In addition to protection and provision, suits must facilitate the reason the wearer has dared to go out into space in the first place.
One of the most basic tasks of extravehicular activity (EVA) is controlling where the wearer is positioned in space. The survey shows several types of mechanisms for this. First, if your EVA never needs you to leave the surface of the spaceship, you can go with mountaineering gear or sticky feet. (Or sticky hands.) We can think of maneuvering through space as similar to piloting a craft, but the outputs and interfaces have to be made wearable, like wearable control panels. We might also expect to see some tunnel in the sky displays to help with navigation. We’d also want to see some AI safeguard features, to return the spacewalker to safety when things go awry. (Narrator: We don’t.)
In Stowaway (2021), astronauts undertake unplanned EVAs with carabiners and gear akin to what mountaineers use. This makes some sense, though even this equipment needs to be modified for use with astronauts’ thick gloves.
Sticky feet (and hands)
Though it’s not extravehicular, I have to give a shout out to 2001: A Space Odyssey (1968), where we see a flight attendant manage their position in microgravity with special shoes that adhere to the floor. It’s a lovely example of a competent Hand Wave. We don’t need to know how it works because it says, right there, “Grip shoes.” Done. Though props to the actress Heather Downham, who had to make up a funny walk to illustrate that it still isn’t like walking on earth.
With magnetic boots, seen in Destination Moon, the wearer simply walks around and manages the slight awkwardness of having to pull a foot up with extra force, and have it snap back down on its own.
Battlestar Galactica added magnetic handgrips to augment the control provided by magnetized boots. With them, Sergeant Mathias is able to crawl around the outside of an enemy vessel, inspecting it. While crawling, she holds grip bars mounted to circles that contain the magnets. A mechanism for turning the magnet off is not seen, but like these portable electric grabbers, it could be as simple as a thumb button.
Iron Man also had his Mark 50 suit form stabilizing suction cups before cutting a hole in the hull of the Q-Ship.
In the electromagnetic version of boots, seen in Star Trek: First Contact, the wearer turns the magnets on with a control strapped to their thigh. Once on, the magnetization seems to be sensitive to the wearer’s walk, automatically lessening when the boot is lifted off. This gives the wearer something of a natural gait. The magnetism can be turned off again to be able to make microgravity maneuvers, such as dramatically leaping away from Borg minions.
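A toy sketch of that gait-sensitive behavior: the electromagnet eases off when a heel-pressure sensor shows the foot lifting, and cuts out entirely when the wearer disables the boots. The thresholds and names are invented for illustration.

```python
# Hypothetical sketch: gait-sensitive magnetic boots. Each boot maps
# its heel-pressure reading to an electromagnet duty cycle.

FULL_HOLD = 1.0   # duty cycle when the boot is planted
STEP_HOLD = 0.2   # residual hold while the foot is in the air


def magnet_duty(heel_pressure, boots_enabled=True):
    """Map heel pressure (0.0 = foot lifted, 1.0 = full weight)
    to an electromagnet duty cycle."""
    if not boots_enabled:      # e.g. leaping away from Borg minions
        return 0.0
    if heel_pressure < 0.1:    # foot lifting: ease off for the stride
        return STEP_HOLD
    return FULL_HOLD
```

The residual hold is the design question: too much and the wearer fights the floor with every step, too little and an errant kick sends them tumbling.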
Star Trek: Discovery also included this technology, but with what appears to be a gestural activation and cool glowing red dots on the sides and back of the heel. The back of each heel has a stack of red lights that count down to when they turn off, as, I guess, a warning to anyone around them that they’re about to be “air” borne.
Quick “gotcha” aside: neither Destination Moon nor Star Trek: First Contact bothers to explain how characters are meant to be able to kneel while wearing magnetized boots. Yet this very thing happens in both films.
If your extravehicular task has you leaving the surface of the ship and moving around space, you likely need a controlled propellant. This is seen only a few times in the survey.
The manned maneuvering unit, or MMU, seen in the film Mission to Mars is based loosely on NASA’s MMU. A nice thing about the device is that, unlike the other controlled-propellant interfaces, we can actually see some of the interaction and not just the effect. The interfaces are subtly different in that the Mission to Mars spacewalkers travel forward and backward by angling the handgrips forward and backward rather than with a joystick on an armrest. This seems like a closer mapping, but also seems more prone to error by accidental touching or bumping into something.
The plus side is an interface that is much more cinegenic, where the audience is more clearly able to see the cause and effect of the spacewalker’s interactions with the device.
If you have propellant in a Mohs 4 or 5 film, you might need to acknowledge that propellant is a limited resource. Over the course of the same (heartbreaking) scene shown above, we see an interface where one spacewalker monitors his fuel, and another where a spacewalker realizes that she has traveled as far as she can with her MMU and still return to safety.
For those wondering, Michael Burnham’s flight to the mysterious signal in that pilot uses propellant, but is managed and monitored by controllers on Discovery, so it makes sense that we don’t see any maneuvering interfaces for her. We could dive in and review the interfaces the bridge crew uses (and try to map that onto a spacesuit), but we only get snippets of these screens and see no controls.
Iron Man’s suits employ some Phlebotinum propellant that lasts forever, can fit inside his tailored suit, and is powerful enough to achieve escape velocity.
All in all, though sci-fi seems to understand the need for characters to move around in spacesuits, very little attention is given to the interfaces that enable it. The Mission to Mars MMU is the only one with explicit attention paid to it, and that’s quite derived from NASA models. It’s an opportunity for filmmakers, should the needs of the plot allow, to give this topic some attention.
A major concern of the design of spacesuits is basic usability and ergonomics. Given the heavy material needed in the suit for protection and the fact that the user is wearing a helmet, where does a designer put an interface so that it is usable?
Chest panels are those that require that the wearer only look down to manipulate. These are in easy range of motion for the wearer’s hands. The main problem with this location is that there is a hard trade off between visibility and bulkiness.
Arm panels are those that are—brace yourself—mounted to the forearm. This placement is within easy reach, but does mean that the arm on which the panel sits cannot be otherwise engaged, and it seems like it would be prone to accidental activation. This is a greater technological challenge than a chest panel to keep components small and thin enough to be unobtrusive. It also provides some interface challenges to squeeze information and controls into a very small, horizontal format. The survey shows only three arm panels.
The first is the numerical panel seen in 2001: A Space Odyssey (thanks for the catch, Josh!). It provides discrete and easy input, but no feedback. There are inter-button ridges to kind of prevent accidental activation, but they’re quite subtle and I’m not sure how effective they’d be.
The second is an oversimplified control panel seen in Star Trek: First Contact, where the output is simply the unlabeled lights underneath the buttons indicating system status.
The third is the mission computers seen on the forearms of the astronauts in Mission to Mars. These full color and nonrectangular displays feature rich, graphic mission information in real time, with textual information on the left and graphic information on the right. Input happens via hard buttons located around the periphery.
Side note: One nifty analog interface is the forearm mirror. This isn’t an invention of sci-fi, as it is actually on real world EVAs. It costs a lot of propellant or energy to turn a body around in space, but spacewalkers occasionally need to see what’s behind them and the interface on the chest. So spacesuits have mirrors on the forearm to enable a quick view with just arm movement. This was showcased twice in the movie Mission to Mars.
The easiest place to see something is directly in front of your eyes, i.e. in a heads-up display, or HUD. HUDs are seen frequently in sci-fi, and increasingly in sci-fi spacesuits as well. One example is Sunshine. This HUD provides a real-time view of each other individual to whom the wearer is talking while out on an EVA, and a real-time visualization of dangerous solar winds.
These particular spacesuits are optimized for protection very close to the sun, and the visor is limited to a transparent band set near eye level. These spacewalkers couldn’t look down to see any interfaces on the suit itself, so the HUD makes a great deal of sense here.
Star Trek: Discovery’s pilot episode included a sequence that found Michael Burnham flying 2000 meters away from the U.S.S. Discovery to investigate a mysterious Macguffin. The HUD helped her with wayfinding, navigating, tracking time before lethal radiation exposure (a biological concern, see the prior post), and even doing a scan of things in her surroundings, most notably a Klingon warrior who appears wearing unfamiliar armor. Reference information sits on the periphery of Michael’s vision, but the augmentations occur mapped to her view. (Noting this raises the same issues of binocular parallax seen in the Iron HUD.)
Iron Man’s Mark L armor was able to fly in space, and the Iron HUD came right along with it. Though not designed/built for space, it’s a general AI HUD assisting its spacewalker, so worth including in the sample.
Aside from HUDs, what we see in the survey is similar to what exists in real-world extravehicular mobility units (EMUs), i.e. chest panels and arm panels.
Inputs illustrate paradigms
Physical controls range from the provincial switches and dials on the cigarette-girl foldout control panels of Destination Moon to the simple and restrained numerical button panel of 2001, to strangely unlabeled buttons of Star Trek: First Contact’s arm panels (above), and the ham-handed touch screens of Mission to Mars.
As the pictures above reveal, the input panels reflect the familiar technology of the time of the creation of the movie or television show. The 1950s were still rooted in mechanistic paradigms, the late 1960s interfaces were electronic pushbutton, the 2000s had touch screens and miniaturized displays.
Real world interfaces
For comparison and reference, NASA’s EMU has a control panel on the front, called the Display and Control Module (DCM), where most of the controls for the EMU sit.
The image shows that inputs are very different than what we see as inputs in film and television. The controls are large for easy manipulation even with thick gloves, distinct in type and location for confident identification, analog to allow for a minimum of failure points and in-field debugging and maintenance, and well-protected from accidental actuation with guards and deep recesses. The digital display faces up for the convenience of the spacewalker. The interface text is printed backwards so it can be read with the wrist mirror.
The outputs are fairly minimal. They consist of the pressure suit gauge, audio warnings, and the 12-character alphanumeric LCD panel at the top of the DCM. No HUD.
The gauge is mechanical and standard for its type. The audio warnings are a simple warbling tone when something’s awry. The LCD panel provides information about 16 different values that the spacewalker might need, including estimated time of oxygen remaining, actual volume of oxygen remaining, pressure (redundant to the gauge), battery voltage or amperage, and water temperature. To cycle up and down the list, she presses the Mode Selector Switch forward and backward. She can adjust the contrast using the Display Intensity Control potentiometer on the front of the DCM.
The DCMs referenced in the post are from older NASA documents. In more recent images on NASA’s social media, it looks like there have been significant redesigns to the DCM, but so far I haven’t seen details about the new suit’s controls. (Or about how that tiny thing can house all the displays and controls it needs to.)
Spacesuits must support the biological functioning of the astronaut. There are probably damned fine psychological reasons not to show astronauts their own biometric data while on stressful extravehicular missions, but there is the issue of comfort. Even if temperature, pressure, humidity, and oxygen levels are kept within safe ranges by automatic features of the suit, there is still a need for comfort and control inside that range. If the suit is to be worn a long time, there must be some accommodation for food, water, urination, and defecation. Additionally, the medical and psychological status of the wearer should be monitored to warn of stress states and emergencies.
Unfortunately, the survey doesn’t reveal any interfaces being used to control temperature, pressure, or oxygen levels. There are some for low oxygen level warnings and testing conditions outside the suit, but these are more outputs than interfaces where interactions take place.
There are also no nods to toilet necessities, though in fairness Hollywood eschews this topic a lot.
The one example of sustenance seen in the survey appears in Sunshine, where Captain Kaneda takes a sip from his drinking tube while performing a dangerous repair of the solar shields. This is the only food or drink seen in the survey, and it is a simple mechanical interface, held in place by material strength in such a way that he needs only to tilt his head to take a drink.
Similarly, in Sunshine, when Capa and Kaneda perform EVA to repair broken solar shields, Cassie tells Capa to relax because he is using up too much oxygen. We see a brief view of her bank of screens that include his biometrics.
Remote monitoring of people in spacesuits is common enough to be a trope, but it has already been discussed in the Medical chapter of Make It So; see that chapter for more on biometrics in sci-fi.
There are some non-interface biological signals for observers. In the movie Alien, as the landing party investigates the xenomorph eggs, we can see that the suit outgasses something like steam—slower than exhalations, but regular. Though not presented as such, the suit certainly confirms for any onlooker that the wearer is breathing and that the suit is functioning.
Given that sci-fi technology glows, it is no surprise to see that lots and lots of spacesuits have glowing bits on the exterior. Though nothing yet in the survey tells us what these lights might be for, it stands to reason that one purpose might be as a simple and immediate line-of-sight status indicator. When things are glowing steadily, it means the life support functions are working smoothly. A blinking red alert on the surface of a spacesuit could draw attention to the individual with the problem, and make finding them easier.
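That status-light logic would be trivial to implement. A minimal sketch, with invented states and colors (nothing in the survey specifies them):

```python
# Hypothetical mapping from life-support status to an exterior suit
# lamp, serving as a line-of-sight status indicator for crewmates.
# States and colors are invented for illustration.

def suit_light(status: str) -> tuple[str, str]:
    """Return (color, pattern) for the exterior status lamp."""
    patterns = {
        "nominal": ("white", "steady"),      # all life support working
        "caution": ("amber", "slow blink"),  # degraded, not yet urgent
        "critical": ("red", "fast blink"),   # draw attention immediately
    }
    # Fail conspicuous: an unknown state is treated as critical.
    return patterns.get(status, ("red", "fast blink"))
```

Note the fail-conspicuous default: if the suit cannot classify its own state, the safest display is the one that attracts help.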
One nifty thing that sci-fi can do (but we can’t yet in the real world) is deploy biology-protecting tech at the touch of a button. We see this in the Marvel Cinematic Universe with Starlord’s helmet.
If such tech were available, you’d imagine that it would have some smart sensors to know when it must automatically deploy (sudden loss of oxygen or dangerous impurities in the air), but we don’t see this. Still, given this speculative tech, one can imagine it working for a whole spacesuit and not just a helmet. It might speed up scenes like this.
What do we see in the real world?
Are there real-world controls that sci-fi is missing? Let’s turn to NASA’s space suits to compare.
The Primary Life-Support System (PLSS) is the complex spacesuit subsystem that provides life support to the astronaut and biomedical telemetry back to control. Its main components are the closed-loop oxygen-ventilation system for cycling and recycling oxygen, the moisture (sweat and breath) removal system, and the feedwater system for cooling.
The only “biology” controls that the spacewalker has for these systems are a few on the Display and Control Module (DCM) on the front of the suit. They are the cooling control valve, the oxygen actuator slider, and the fan switch. Only the first is explicitly to control comfort. Other systems, such as pressure, are designed to maintain ideal conditions automatically. Other controls are used for contingency systems for when the automatic systems fail.
The suit is insulated thoroughly enough that the astronaut’s own body heats the interior, even in complete shade. Because the astronaut’s body constantly adds heat, the suit must be cooled. To do this, the suit cycles water through a Liquid Cooling and Ventilation Garment, which has a fine network of tubes held closely to the astronaut’s skin. Water flows through these tubes and past a sublimator that cools the water with exposure to space. The astronaut can increase or decrease the speed of this flow, and thereby the degree to which his body is cooled, via the cooling control valve, a recessed radial valve with fixed positions between 0 (the hottest) and 10 (the coolest), located on the front of the Display and Control Module.
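As a toy model of that valve, assuming a simple linear response between position and coolant flow (the real valve’s flow curve isn’t documented here):

```python
# Illustrative sketch only: maps the cooling control valve's fixed
# positions (0 = hottest, 10 = coolest) to a coolant flow fraction.
# A linear response is assumed purely for illustration.

def coolant_flow_fraction(valve_position: int) -> float:
    """Fraction of maximum coolant flow for a given valve setting."""
    if not 0 <= valve_position <= 10:
        raise ValueError("valve has fixed positions 0 through 10")
    return valve_position / 10.0
```

The fixed detents matter more than the exact curve: with thick gloves and no fine motor feedback, discrete positions are far easier to set confidently than a continuous dial.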
The spacewalker does not have EVA access to her biometric data. Sensors measure oxygen consumption and electrocardiograph data and broadcast it to the Mission Control surgeon, who monitors it on her behalf. So whatever the reason is, if it’s good enough for NASA, it’s good enough for the movies.
Back to sci-fi
So, we do see temperature and pressure controls on suits in the real world, which underscores their absence in sci-fi. But, if there hasn’t been any narrative or plot reason for such things to appear in a story, we should not expect them.
Space is incredibly inhospitable to life. It is a near-perfect vacuum, lacking air, pressure, and warmth. It is full of radiation that can poison us, light that can blind and burn us, and a darkness that can disorient us. If any hazardous chemicals such as rocket fuel have gotten loose, they need to be kept safely away. There are few of the ordinary spatial cues and tools that humans use to orient and control their position. There is free-floating debris, ranging from bullet-like micrometeorites to gas and rock planets that can pull us toward them to smash into their surfaces or burn up in their atmospheres. There are astronomical bodies such as stars and black holes that can boil us or crush us into a singularity. And perhaps most terrifyingly, there is the very real possibility of drifting off into the expanse of space to asphyxiate, starve (though biology will be covered in another post), freeze, and/or go mad.
The survey shows that sci-fi has addressed most of these perils at one time or another.
Despite the acknowledgment of all of these problems, the survey reveals only two interfaces related to spacesuit protection.
Battlestar Galactica (2004) handled radiation exposure with a simple chemical output device. As CAG Lee Adama explains in “The Passage,” the badge, worn on the outside of the flight suit, slowly turns black with radiation exposure. When the badge turns completely black, a pilot is removed from duty for radiation treatment.
This is something of a stretch because it has little to do with the spacesuit itself, and is strictly an output device. (Noting that proper interaction requires human input and state changes.) The badge is not permanently attached to the suit, and is used inside a spaceship while the pilot wears a flight suit. The flight suit is meant to act as a very short-term extravehicular mobility unit (EMU), but is not a spacesuit in the strict sense.
The other protection related interface is from 2001: A Space Odyssey. As Dr. Dave Bowman begins an extravehicular activity to inspect seemingly-faulty communications component AE-35, we see him touch one of the buttons on his left forearm panel. Moments later his visor changes from being transparent to being dark and protective.
We should expect to see few interfaces, but still…
As a quick and hopefully obvious critique, Bowman’s function shouldn’t have an interface. It should be automatic (not even agentive), since events can happen much faster than human response times. And, now that we’ve said that part out loud, maybe it’s true that protection features of a suit should all be automatic. Interfaces to pre-emptively switch them on, or to manually turn them off for exceptional reasons, should be the rarity.
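The automatic version of Bowman’s visor could be a trivial control loop: poll a light sensor far faster than a human could react, and darken the visor the instant illuminance crosses a hazard threshold. The threshold value and sensor interface below are invented for illustration:

```python
# Toy sketch of an automatic (not even agentive) visor: the suit
# polls a light sensor and darkens the visor the moment illuminance
# crosses a hazard threshold -- no button press required. The
# threshold and states are hypothetical.

HAZARD_LUX = 100_000  # invented threshold for eye-damaging glare

def visor_opacity(illuminance_lux: float) -> str:
    """Return the visor state for a given light-sensor reading."""
    return "dark" if illuminance_lux >= HAZARD_LUX else "transparent"
```

Bowman’s forearm button would then survive only as a manual override, layered on top of the automatic behavior rather than replacing it.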
But it would be cool to see more protective features appear in sci-fi spacesuits. An onboard AI detects an incoming micrometeorite storm. Does the HUD show how much time is left? What are the wearer’s options? Can she work through scenarios of action? Can she merely speak which course of action she wants the suit to take? If a wearer is kicked free of the spaceship, the suit should have a homing feature. Think Doctor Strange’s Cloak of Levitation, but for astronauts.
As always, if you know of other examples not in the survey, please put them in the comments.
“Why cannot we walk outside [the spaceship] like the meteor? Why cannot we launch into space through the scuttle? What enjoyment it would be to feel oneself thus suspended in ether, more favored than the birds who must use their wings to keep themselves up!”
—The astronaut Michel Ardan in Round the Moon by Jules Verne (1870)
When we were close to publication on Make It So, we wound up being way over the maximum page count for a Rosenfeld Media book. We really wanted to keep the components and topics sections, and that meant we had to cut the section on things. Spacesuits was one of the chapters I drafted about things. I am re-presenting that chapter here on the blog. n.b. This was written ten years ago in 2011. There are almost certainly other, more recent films and television shows that can serve as examples. If you, the reader, notice any…well, that’s what the comments section is for.
Sci-fi doesn’t have to take place in interplanetary space, but a heck of a lot of it does. In fact, the first screen-based science fiction film is all about a trip to the moon.
Most of the time, traveling in this dangerous locale happens inside spaceships, but occasionally a character must travel out bodily into the void of space. Humans—and pretty much everything (no, not them) we would recognize as life—cannot survive there for very long at all. Fortunately, the same conceits that sci-fi adopts to get characters into space can help them survive once they’re there.
An environmental suit is any suit that helps the wearer survive in an inhospitable environment. Environmental suits began with underwater diving suits, and later high-altitude suits. For space travel, pressure suits are worn during the most dangerous phases, i.e. liftoff and landing, when an accident may suddenly decompress a spacecraft. A spacesuit is an environmental suit designed specifically for survival in outer space. NASA refers to spacesuits as Extravehicular Mobility Units, or EMUs. Individuals who wear the spacesuits are known as spacewalkers. The additional equipment that helps a spacewalker move around space in a controlled manner is the Manned Maneuvering Unit, or MMU.
Additionally, though many other agencies around the world participate in the design and engineering of spacesuits, there is no convenient way to reference them and their efforts as a group, so Aerospace Community is used as a shorthand. This also helps to acknowledge that my research and interviews were primarily with sources from NASA.
The design of the spacesuit is an ongoing and complicated affair. To speak of “the spacesuit” as if it were a single object ignores the vast number of iterations and changes made to the suits between each cycle of engineering, testing, and deployment, much less between different agencies working on their own designs. So, for those wondering, I’m using the NASA EMU currently used on International Space Station and space shuttle missions as the default design when speaking about modern spacesuits.
What the thing’s got to do
A spacesuit, whether in sci-fi or the real world, has to do five things.
It has to protect the wearer from the perils of interplanetary space.
It has to accommodate the wearer’s ongoing biological needs.
It has to help the wearer move around.
It has to facilitate communication between the wearer, other spacewalkers, and mission control.
It has to identify who is wearing the suit for others.
Each of these categories of functions, and the related interfaces, are discussed in following posts.
The Fritzes award honors the best interfaces in a full-length motion picture in the past year. Interfaces play a special role in our movie-going experience, and are a craft all their own that does not otherwise receive focused recognition. Awards are given for Best Believable, Best Narrative, Audience Choice, and Best Interfaces (overall). A group of critics and creators was consulted to watch the nominated films, compare their merits, and cast votes.
As we all know, 2020 was a strange year—being the first big year of the COVID pandemic—and cinema was greatly affected. The number of sci-fi films was low, and the number of interfaces in those films was often small compared to prior years. But that does not mean they were without quality, and here I’m happy to celebrate the excellent work of the candidates and the winners.
These movies’ interfaces adhere to solid HCI principles and believable interactions. They engage us in the story world by being convincing. The nominees for Best Believable are Minor Premise, Project Power, and Proximity.
The winner of the Best Believable award for 2021 is Project Power.
Project Power’s novum is a speculative street drug called Power that can either explode you or give you temporary superpowers derived from animals’ abilities. Frank Shaver is a policeman who is a user and has befriended his young dealer, Robin. Art, an ex-soldier who goes by the name Major, teams up with Shaver and Robin to work their way through the Power dealer network, stop distribution, and find Major’s daughter Tracy, who plays a key role in the whole thing. On the way they learn that Power was created by a private defense contractor, Teleios, which is using New Orleans as a Tuskegee-like testing ground, and they work to bring it down.
The interfaces we see belong to Teleios, and they tell a story of surveillance, control, social justice, and cutting-edge genetic engineering. While cool and reserved, the interfaces are believable and help engage us in the company’s psychotic scheme. It’s a Netflix original, so you can catch the movie there.
These movies’ interfaces blow us away with wonderful visuals and the richness of their future vision. They engross us in the story world by being spectacular. The nominees for Best Narrative are Love and Monsters, Underwater, and World of Tomorrow Episode Three: The Absent Destinations of David Prime.
The winner of the Best Narrative award for 2021 is World of Tomorrow Episode Three: The Absent Destinations of David Prime.
World of Tomorrow Episode Three: The Absent Destinations of David Prime
In a far and bleakly dystopian future, David Prime is alone in his spaceship when he discovers a hidden memory from a future lover named Emily 9, which sets him off on a trek to retrieve memories from his multiple, future, cloned selves. The instructions he needs to follow come from a technology 400 years in the future, the size of which requires that he offload increasingly important “cognitive apps.” David’s glitchy, intrusive-ad-infested head-mounted viewscreen interface tells of a world where genetic engineering is a schlock product (“HOLOGRAMS THAT YELL AT YOU! (HOTT WILD DISCRETE PARTYLOVE)”), human minds are little more than extended smartphones, time travel is used mostly for murder, and human experience is wholly mediated. See it on Vimeo.
All of the movies nominated for other awards were presented for an Audience Choice award. Across social media, the readership was invited to vote for their favorite, and the results tallied. The winner of the Audience Choice award for 2021 is LX 2048.
Adam Bird is dealing with a broken family, a wrecked world, a failing career, and on top of it all, a diagnosis of heart failure. To get a new heart transplanted from a clone, he must approach his estranged wife Reena and ask her to request her Insurance Spouse ahead of his death. She agrees, but bitterly arranges a virtual assassination for Adam before getting accidentally killed herself. When his clone shows up at his door, he must face off against a better version of himself. It’s a dense thriller that asks: What if your dream lover prefers a dream version of you? What if humanity was only a chrysalis?
The interfaces are simple and often subtle, but tell of a high-tech world trapped by virtual escapism, the complications of technological personhood, and the threat that our creations will obviate us. You can watch LX 2048 on many streaming services.
The movies nominated for Best Interfaces manage the extraordinary challenge of being believable and helping to paint a picture of the world of the story. They advance the state of the art in telling stories with speculative technology. The nominees for Best Interfaces are Archive, LX 2048, and The Midnight Sky.
The winner of the Best Interfaces award for 2021 is Archive.
George is an engineer reactivating a remote, mothballed industrial facility for a corporation called ARM. George is using the facility’s assets to work on general artificial intelligence and a robot housing that would be indistinguishable from human. He is relying on a technology called Archive, which offers its clients interactions with a virtual simulation of deceased persons for up to 200 hours, while the archive lasts. But he’s hiding both how far he’s gotten with his work and the fact that he’s not building just any human, but specifically a recreation of Jules, his deceased wife. He and his three prototypes must try to reactivate the facility, keep the corporation in the dark, keep a tech gang called the Otaku at bay, and deal with the dark interpersonal strife of the prototypes—with the resources and time he has left.
The interfaces are striking in their high-contrast palette, tight grid, and bold typography. The interface style extends throughout the costumes, the sets, and props. The interfaces tell of a setting that is lonely, corporatist, and isolated, and hides a dark secret at the center of it all. You can see Archive on several streaming services.
Congratulations to all the candidates and the winners. Thank you for helping advance the art and craft of speculative interfaces in cinema.