Sci-fi Spacesuits: Interface Locations

A major concern of the design of spacesuits is basic usability and ergonomics. Given the heavy material needed in the suit for protection and the fact that the user is wearing a helmet, where does a designer put an interface so that it is usable?

Chest panels

Chest panels are those the wearer need only look down to see and manipulate. They sit within easy range of motion of the wearer’s hands. The main problem with this location is the hard trade-off between visibility and bulkiness.

Arm panels

Arm panels are those that are—brace yourself—mounted to the forearm. This placement is within easy reach, but it does mean that the arm wearing the panel cannot be otherwise engaged, and it seems prone to accidental activation. Keeping components small and thin enough to be unobtrusive is a greater technological challenge here than with a chest panel. The format also poses interface challenges: information and controls must be squeezed into a very small, horizontal space. The survey shows only three arm panels.

The first is the numerical panel seen in 2001: A Space Odyssey (thanks for the catch, Josh!). It provides discrete and easy input, but no feedback. There are inter-button ridges meant to prevent accidental activation, but they’re quite subtle, and I’m not sure how effective they’d be.

2001: A Space Odyssey (1968)

The second is the oversimplified control panel seen in Star Trek: First Contact, where the output is simply unlabeled lights beneath the buttons, indicating system status.

The third is the mission computer seen on the forearms of the astronauts in Mission to Mars. These full-color, nonrectangular displays feature rich, graphic mission information in real time, with textual information on the left and graphic information on the right. Input happens via hard buttons located around the periphery.

Side note: One nifty analog interface is the forearm mirror. This isn’t an invention of sci-fi; it appears on real-world spacesuits used for EVAs. It costs a lot of propellant or energy to turn a body around in space, but spacewalkers occasionally need to see what’s behind them, as well as the interface on the chest. So spacesuits have mirrors on the forearm to enable a quick view with just an arm movement. This was showcased twice in the movie Mission to Mars.

HUDs

The easiest place to see something is directly in front of your eyes, i.e. in a heads-up display, or HUD. HUDs appear frequently in sci-fi, and increasingly in sci-fi spacesuits as well. One example is Sunshine. This HUD provides a real-time view of each other individual to whom the wearer is talking while out on an EVA, and a real-time visualization of dangerous solar winds.

These particular spacesuits are optimized for protection very close to the sun, and the visor is limited to a transparent band set near eye level. These spacewalkers couldn’t look down to see any interfaces on the suit itself, so the HUD makes a great deal of sense here.

Star Trek: Discovery’s pilot episode included a sequence that found Michael Burnham flying 2,000 meters away from the U.S.S. Shenzhou to investigate a mysterious MacGuffin. The HUD helped her with wayfinding, navigating, tracking time before lethal radiation exposure (a biological concern, see the prior post), and even scanning things in her surroundings, most notably a Klingon warrior who appears wearing unfamiliar armor. Reference information sits on the periphery of Michael’s vision, but the augmentations appear mapped to her view. (This raises the same issues of binocular parallax seen in the Iron HUD.)

Iron Man’s Mark L armor was able to fly in space, and the Iron HUD came right along with it. Though not designed/built for space, it’s a general AI HUD assisting its spacewalker, so worth including in the sample.

Avengers: Infinity War (2018)

Aside from HUDs, what we see in the survey is similar to what we find in real-world extravehicular mobility units (EMUs), i.e. chest panels and arm panels.

Inputs illustrate paradigms

Physical controls range from the provincial switches and dials on the cigarette-girl foldout control panels of Destination Moon, to the simple and restrained numerical button panel of 2001, to the strangely unlabeled buttons of Star Trek: First Contact’s arm panels (above), to the ham-handed touch screens of Mission to Mars.

Destination Moon (1950)
2001: A Space Odyssey (1968)

As the pictures above reveal, the input panels reflect the familiar technology of the time the movie or television show was created. The 1950s were still rooted in mechanistic paradigms; the late-1960s interfaces were electronic pushbutton; the 2000s had touch screens and miniaturized displays.

Real world interfaces

For comparison and reference: NASA’s EMU has a control panel on the front, called the Display and Control Module (DCM), where most of the suit’s controls sit.

The image shows that these inputs are very different from what we see in film and television. The controls are large for easy manipulation even with thick gloves, distinct in type and location for confident identification, analog to allow for a minimum of failure points and for in-field debugging and maintenance, and well protected from accidental actuation by guards and deep recesses. The digital display faces up for the convenience of the spacewalker. The interface text is printed backwards so it can be read with the wrist mirror.

The outputs are fairly minimal. They consist of the pressure suit gauge, audio warnings, and the 12-character alphanumeric LCD panel at the top of the DCM. No HUD.

The gauge is mechanical and standard for its type. The audio warnings are a simple warbling tone when something’s awry. The LCD panel provides information about 16 different values that the spacewalker might need, including estimated time of oxygen remaining, actual volume of oxygen remaining, pressure (redundant to the gauge), battery voltage or amperage, and water temperature. To cycle up and down the list, she presses the Mode Selector Switch forward and backward. She can adjust the contrast using the Display Intensity Control potentiometer on the front of the DCM.
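As a rough illustration of that cycling interaction, here is a minimal Python sketch of how the Mode Selector Switch and Display Intensity Control might be modeled. The specific value names, their order, and the method names are assumptions for illustration only; they are not drawn from NASA documentation.

```python
# Illustrative sketch of the DCM's value-cycling behavior as described above.
# Value names, ordering, and this API are assumptions, not NASA's actual firmware.

DISPLAY_VALUES = [
    "O2 TIME EST",    # estimated time of oxygen remaining
    "O2 VOLUME",      # actual volume of oxygen remaining
    "SUIT PRESS",     # pressure (redundant to the mechanical gauge)
    "BATT VOLTS",
    "BATT AMPS",
    "H2O TEMP",
    # ...the remaining values of the 16 are omitted for brevity
]

class DCMDisplay:
    def __init__(self, values):
        self.values = values
        self.index = 0          # which value is currently shown
        self.intensity = 0.5    # set by the Display Intensity Control potentiometer

    def mode_switch(self, direction):
        """Press the Mode Selector Switch forward (+1) or backward (-1)."""
        self.index = (self.index + direction) % len(self.values)
        return self.render()

    def set_intensity(self, level):
        """Adjust contrast via the potentiometer (0.0 to 1.0)."""
        self.intensity = max(0.0, min(1.0, level))

    def render(self):
        # The real panel is a 12-character alphanumeric LCD, so output is truncated.
        return self.values[self.index][:12]

dcm = DCMDisplay(DISPLAY_VALUES)
print(dcm.mode_switch(+1))  # next value in the list
print(dcm.mode_switch(-1))  # back to the previous one
```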

A NASA image tweeted in 2019.

The DCMs referenced in the post are from older NASA documents. In more recent images on NASA’s social media, it looks like there have been significant redesigns to the DCM, but so far I haven’t seen details about the new suit’s controls. (Or about how that tiny thing can house all the displays and controls it needs to.)

Motion Detector

Johnny, with newly upgraded memory, goes straight to the hotel room where he meets the client’s scientists. Before the data upload, he quickly installs a motion detector on the hotel suite door. This is a black box that he carries clipped to his belt. He uses his thumb to activate it as he takes hold of it, and two glowing red status lights appear.


Once the device is placed on the door, there is just one glowing light. We don’t see exactly how Johnny controls the device, but for something this simple, just one touch button would be sufficient.


A little later, after the brain upload (discussed in the next post), the motion detector goes off when four heavily armed Yakuza arrive outside the door. The single light starts blinking, and there’s a high-pitched beep similar to a smoke alarm, but quieter.
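The device’s behavior is simple enough to capture as a tiny state machine. The sketch below is speculative, based only on what is shown on screen; the state names and transitions are my own inference, not anything established by the film.

```python
# Speculative sketch of the motion detector's apparent states, inferred from the scene.

from enum import Enum, auto

class DetectorState(Enum):
    OFF = auto()
    IN_HAND = auto()     # thumb activation: two glowing red status lights
    ARMED = auto()       # placed on the door: one glowing light
    TRIGGERED = auto()   # motion outside: light blinks, quiet high-pitched beep

class MotionDetector:
    def __init__(self):
        self.state = DetectorState.OFF

    def thumb_activate(self):
        self.state = DetectorState.IN_HAND

    def place_on_door(self):
        self.state = DetectorState.ARMED

    def motion_detected(self):
        # Only alert if the device has actually been armed on the door.
        if self.state is DetectorState.ARMED:
            self.state = DetectorState.TRIGGERED
```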

Alien VPs

In the volumetric projection chapter of Make It So, we note that sci-fi makers take pains to distinguish the virtual from the real most often with a set of visual treatments derived from the “Pepper’s Ghost” parlor trick, augmented with additional technology cues: translucency, a blue tint, glowing whites, supersaturated colors for wireframed objects, clear pixels and/or flicker, with optional projection rays.

Prometheus has four types of VPs that adhere to this style in varying degrees. Individual displays (with their interactions) are discussed in other posts. This collection of posts compares their styles. This particular post describes the alien VPs.


The two alien VPs are quite different from the human VPs in appearance and behavior. The first thing to note is that they adhere to the Pepper’s Ghost style more readily, with glowing blue-tinted whites and transparency. Beyond that they differ in precision and implied technology.

Precision VPs

The first style of alien VP appears in the bridge of the alien vessel, where projection technology can be built into the architecture. The resolution is quite precise. When the grapefruit-sized Earth gets close to the camera in one scene, it appears to have infinite resolution, even though this is some teeny tiny percentage of the whole display.


Glowing Pollen

The other alien VP tech is made up of small, blue-white voxels that float, move in space, obey some laws of physics, and provide a crude level of resolution. These appear in the caves of the alien complex where display tech is not present in the walls, and again as “security footage” in the bridge of the alien ship. Because the voxels obey some laws of physics, it’s easier to think of them as glowing bits of pollen.


Pollen behavior

These voxels appear not to be projections of light in space but actual motes that float through the air. When David activates the “security footage” in the alien complex, a wave of this pollen appears and flows past him. It does not pass through him, but collides with him, each collided mote taking a moment to move around him and regain its roughly correct position in the display. (How it avoids getting in his mouth is another question entirely.) The motes even produce a gust of wind that disturbs David’s bleached coif.

Pollen inaccuracy

The individual lines of pollen follow smooth arcs through the air, but the lines appear slightly offset from one another.


This style is beautiful and unique, and it conveys a 3D display technology that can reach places where there’s no projector in line of sight. The sci-fi makers of this speculative technology use this inaccuracy to distinguish it from other displays. But if a precise understanding of the shapes being described is useful to its viewers, of course it would be better if the voxels were more precisely positioned in space. That’s a minor critique. The main critique of this display comes when its style gets fed back into the human displays as an arbitrary choice, as I’ll discuss in the next post about the human-tech, floating-pixel displays.

Human VPs


As noted above, Prometheus has four types of VPs that adhere to the Pepper’s Ghost style in varying degrees. Individual displays (with their interactions) are discussed in other posts. This collection of posts compares their styles. This particular post describes the human VPs.


Blue-box displays

One type of human-technology VP is the blue-box display, seen in:

  • David’s language program
  • Holloway and Shaw’s mission briefing
  • The display in Shaw’s quarters

These adhere more closely to the Pepper’s Ghost style, being contained in a translucent blue cuboid with saturated surface graphics and a grid pattern on the sides.

Weyland-Yutani VP

The other type of human display is the Weyland-Yutani VP. These have translucency and supersaturated wireframes, but they lack the other conventional Pepper’s Ghost cues. Instead they add two new visual cues to signal their virtualness to the audience: scaffolded transitions and edge embers.

When a Weyland-Yutani VP is turned on, it does not simply blink into view. It builds. First, shapes are sketched in space as a tessellated surface, made of yellow-green lines forming large triangles that roughly describe the forthcoming object or its extents. These triangles have a faint smoky-yellow pattern on their surface. Some of the lines have yellow clouds and bright red segments along their lengths. Additionally, a few new triangles extend to a point in space where another piece of the projection is about to appear. Then the triangles disappear, replaced with a fully refined image of the 3D object. The refined image may flicker once or twice before settling into persistence. The whole scaffolding effect is staggered across time, providing an additional sense of flicker to the transition.


Motion in resolved parts of the VP begins immediately, even as other aspects of the VP are still transitioning on.

When a VP is turned off, this scaffolding happens in reverse, as elements decay into tessellated yellow wireframes before flickering out of existence.
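For interaction designers who want to play with the idea, here is a minimal sketch of one way such a staggered, scaffold-then-refine transition could be sequenced (the turn-off would simply run the same schedule in reverse). The element names, durations, and flicker counts are invented for illustration; this is not how the film’s effects were built.

```python
# Illustrative sketch: staggering a scaffold-to-refined transition across a VP's
# elements, per the description above. Element names and timings are invented.

import random

PHASES = ["tessellated scaffold", "refined (flickering)", "settled"]

def scaffold_transition(elements, stagger=0.25, scaffold_time=0.4, flicker_time=0.2):
    """Yield (time, element, phase) events for a staggered turn-on transition."""
    events = []
    for i, element in enumerate(elements):
        t = i * stagger                      # each element starts a little later
        events.append((t, element, PHASES[0]))
        events.append((t + scaffold_time, element, PHASES[1]))
        flickers = random.randint(1, 2)      # "may flicker once or twice"
        events.append((t + scaffold_time + flickers * flicker_time, element, PHASES[2]))
    return sorted(events)

# Hypothetical elements of a Weyland-Yutani VP scene:
for time, element, phase in scaffold_transition(["floor", "desk", "dog", "Weyland"]):
    print(f"t={time:.2f}s  {element}: {phase}")
```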

Edge embers

A line of glowing, flickering, sliding, yellow-green points illustrates the extents of the VP area, where a continuous surface like flooring is clipped at the limits of the display. These persist for the duration of the playback.

A growing confidence in audiences

This slightly different strategy for distinguishing VPs from the real world indicates the filmmaker’s confidence that audiences are growing familiar enough with this trope that fewer cues are needed during the display. In this case the translucency and subtle edge embers are the only persistent cues, pushing the major signals of the scaffolding and surface flicker to the transitions.

If this trend continues and sci-fi makers become overconfident, it may confuse some audiences, but at the same time give the designers of the first real-world VPs more freedom with their appearance. They wouldn’t have to look like Star Wars’.

Something new: Projected Reflectance

One interesting detail is that when we see Vickers standing in the projection of Weyland’s office, she casts a slight reflection in the volumetric surface. It implies a technology capable of projecting not just luminance, but reflectivity as well. The ability to project volumetric mirrors hasn’t appeared before in the survey.


Lesson: Transition by importance

Another interesting detail is that when the introduction to the Mission briefing ends, the environment flickers out first, then the 2D background, then Weyland’s dog, then finally Weyland.

This order isn’t by position, brightness, motion, or even surface area (the dog confounds that). It is by narrative importance: foreground, background, tertiary character, primary character. The fact that the surrounding elements fade first keeps your eyes glued to the last motion (kind of like watching the last bit of sun at sunset), which in this order is the most important thing in the feed, i.e. the human in view. If a staggered-element fade-out becomes a norm in the real world for video conferencing (or eventually VP conferencing), this cinematic order is worth remembering.
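For anyone tempted to try this in a real conferencing tool, here is a minimal sketch of scheduling a fade-out by importance rather than by geometry. The importance scores and the timing interval are assumptions for illustration; only the element names and their ordering come from the scene described above.

```python
# Illustrative sketch: order a staggered fade-out by narrative importance rather
# than by position or size. Scores and timing are assumptions for illustration.

def fadeout_schedule(elements, interval=0.4):
    """Return (delay, name) pairs: least important elements fade first,
    the most important (e.g. the human in view) fades last."""
    ordered = sorted(elements, key=lambda e: e["importance"])
    return [(i * interval, e["name"]) for i, e in enumerate(ordered)]

scene = [
    {"name": "Weyland",       "importance": 4},  # primary character, fades last
    {"name": "dog",           "importance": 3},  # tertiary character
    {"name": "2D background", "importance": 2},
    {"name": "environment",   "importance": 1},  # fades first
]

for delay, name in fadeout_schedule(scene):
    print(f"fade out {name} at t+{delay:.1f}s")
```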