Early in the film, when Shaw sees the MedPod for the first time, she comments to Vickers that, “They only made a dozen of these.” As she caresses its interface in awe, a panel extends as the pod instructs her to “Please verbally state the nature of your injury.”


The MedPod is a device for automated, generalized surgical procedures, operable by the patient him- (or her-, kinda, see below) self.

When in the film Shaw realizes that she’s carrying an alien organism in her womb, she breaks free from crewmembers who want to contain her, and makes a staggering beeline for the MedPod.

Once there, she reaches for the extended touchscreen and presses the red EMERGENCY button. Audio output from the pod confirms her selection, “Emergency procedure initiated. Please verbally state the nature of your injury.” Shaw shouts, “I need cesarean!” The machine informs her verbally that, “Error. This MedPod is calibrated for male patients only. It does not offer the procedure you have requested. Please seek medical assistance else–”


I’ll pause the action here to address this. What sensors and actuators are this gender-specific? Why can’t it offer gender-neutral alternatives? Sure, some procedures might need anatomical knowledge of particularly gendered organs (say…emergency circumcision?), but given…

  • the massive amount of biological similarity between the sexes
  • the need for any medical device to deal with a high degree of biological variability in its subjects anyway
  • the fact that most procedures are gender neutral

…this is a ridiculous interface plot device. If Dr. Shaw can issue a few simple system commands that work around this limitation (as she does in this very scene), then the machine could have just done without the stupid error message. (Yes, we get that it’s a mystery why Vickers would have her MedPod calibrated to a man, but really, that’s a throwaway clue.) Gender-specific procedures can’t take up so much room in memory that it was simpler to cut the potential lives it could save in half. You know, rather than outfit it with another hard drive.

Aside from the pointless “tension-building” wrong-gender plot point, there are still interface issues with this step. Why does she need to press the emergency button in the first place? The pod has a voice interface. Why can’t she just shout “Emergency!” or even better, “Help me!” Isn’t that more suited to an emergency situation? Why is a menu of procedures the default main screen? Shouldn’t it be a prompt to speak, and have the menu there for mute people or if silence is called for? And shouldn’t it provide a type-ahead control rather than a multi-facet selection list? OK, back to the action.
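A type-ahead control of the sort suggested here is trivial to build. Here is a minimal Python sketch; the ranking rule (prefix matches before substring matches) and the procedure names in the usage note are my own illustration, not anything from the film:

```python
def type_ahead(query, procedures, limit=5):
    """Return procedures matching the typed fragment, best matches first.

    Prefix matches rank above substring matches, so a panicked, partial
    entry still surfaces the most likely procedure at the top.
    """
    q = query.strip().lower()
    if not q:
        return procedures[:limit]
    prefix = [p for p in procedures if p.lower().startswith(q)]
    substr = [p for p in procedures if q in p.lower() and p not in prefix]
    return (prefix + substr)[:limit]
```

Typing "ces" against a hypothetical procedure list like `["Cesarean section", "Appendectomy", "Abdominal surgery, penetrating"]` would immediately narrow the menu to the one relevant entry, which beats hunting through a multi-facet selection list mid-crisis.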

Desperate, Shaw presses a button that grants her manual control. She states, “Surgery abdominal, penetrating injuries. Foreign body. Initiate.” The screen confirms these selections amongst the options shown. (They read “DIAGNOS, THERAP, SURGICAL, MED REC, SYS/MECH, and EMERGENCY.”)

The pod then swings open saying, “Surgical procedure begins,” and tilting itself for easy access. Shaw injects herself with anesthetic and steps into the pod, which seals around her and returns to a horizontal position.

Why does Shaw need to speak in this stilted speech? In a panicked or medical emergency situation, proper computer syntax should be the last thing on a user’s mind. Let the patient shout the information however they need to, like “I’ve got an alien in my abdomen! I need it to be surgically removed now!” We know from the Sonic chapter that the use of natural language triggers an anthropomorphic sense in the user, which imposes some other design constraints to convey the system’s limitations, but in this case, the emergency trumps the needs of affordance subtleties.
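For the curious, here is roughly what “let the patient shout however they need to” could look like under the hood: a minimal keyword-spotting sketch. The intents and cue phrases are entirely invented for illustration; a real triage system would need a serious medical vocabulary, fuzzy matching, and confidence scoring before acting on anything.

```python
# Hypothetical mapping of spoken cues to procedures. Order matters:
# earlier intents win when multiple cues match.
INTENT_KEYWORDS = {
    "foreign_body_removal": ["alien", "foreign body", "get it out", "inside me"],
    "hemorrhage_control": ["bleeding", "blood loss"],
    "airway_assist": ["can't breathe", "choking"],
}

def infer_procedure(utterance):
    """Match free-form, panicked speech against known emergency intents."""
    text = utterance.lower()
    for procedure, cues in INTENT_KEYWORDS.items():
        if any(cue in text for cue in cues):
            return procedure
    return None  # no match: fall back to a clarifying question
```

Shaw’s natural outburst, “I’ve got an alien in my abdomen! I need it to be surgically removed now!”, resolves just fine; no FORTRAN-esque syntax required.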

Once inside the pod, a transparent display on the inside states, “EMERGENCY PROC INITIATED.” Shaw makes some touch selections, which run a diagnostic scan along the length of her body. The terrifying results display for her to see, with the alien body differentiated in magenta to contrast with her own tissue, displayed in cyan.



Shaw shouts, “Get it out!!” It says, “Initiating anesthetics” before spraying her abdomen with a bile-yellow local anesthetic. It then says, “Commence surgical procedure.” (A note for the grammar nerds here: Wouldn’t you expect a machine to maintain a single part of speech for consistency? The first, “Initiating…” is a gerund, while the second, “Commence,” is an imperative.) Then, using lasers, the MedPod cuts through tissue until it reaches the foreign body. Given that the lasers can cut organic matter, and that the xenomorph has acid for blood, you have to hand it to the precision of this device. One slip could have burned a hole right through her spine. Fortunately it has a feather-light touch. Reaching in with a speculum-like device, it removes the squid-like alien in its amniotic sac.

OK. Here I have to return to the whole “ManPod” thing. Wouldn’t a scan have shown that this was, in fact, a woman? Why wouldn’t it stop the procedure if it really couldn’t handle working on the fairer sex? Should it have paused to have her sign away insurance rights? Could it really mistake her womb for a stomach? Wouldn’t it, believing her to be a man, presume the whole womb to be a foreign body and try to perform a hysterectomy rather than a delicate caesarian? ManPod, indeed.


After removing the alien, it waits around 10 seconds, showing it to her and letting her yank its umbilical cord, before she presses a few controls. The MedPod seals her up again with staples and opens the cover to let her sit up.

She gets off the table, rushes to the side of the MedPod, and places all five fingertips of her right hand on it, quickly twisting her hand clockwise. The interface changes to a red warning screen labeled “DECONTAMINATE.” She taps this to confirm and shouts, “Come on!” (Her vocal instruction does not feel like a formal part of the procedure and the machine does not respond differently.) To decontaminate, the pod seals up and a white mist fills the space.

OK. Since this is a MedPod, and it has something called a decontamination procedure, shouldn’t it actually test to see whether the decontamination worked? The user here has enacted emergency decontamination procedures, so it’s safe to say that this is a plague-level contagion. That doesn’t say to me: Spray it with a can of Raid and hope for the best. It says, “Kill it with fire.” We just saw, 10 seconds ago, that the MedPod can do a detailed, alien-detecting scan of its contents, so why on LV-223 would it not check to see if the kill-it-now-for-God’s-sake procedure had actually worked, and warn everyone within earshot that it hadn’t? Because someone needs to take additional measures to protect the ship, and take them, stat. But no, MedPod tucks the contamination under a white misty blanket, smiles, waves, and says, “OK, that’s taken care of! Thank you! Good day! Move along!”
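The missing spray-verify-escalate loop is not hard to express. A sketch, with the sensor, sprayer, and alarm left as hypothetical callbacks, and the retry count pulled out of thin air:

```python
def decontaminate(scan, spray, alarm, max_cycles=3):
    """Closed-loop decontamination: spray, re-scan, escalate on failure.

    scan()  -> True if contamination is still detected (hypothetical sensor)
    spray() -> runs one decontamination cycle
    alarm() -> warns everyone within earshot that containment failed
    """
    for _ in range(max_cycles):
        spray()
        if not scan():
            return True  # verified clean
    alarm()  # still contaminated after all cycles: do NOT pretend it worked
    return False
```

The whole point is the return trip through `scan()`: the pod already demonstrated it has the sensor, so closing the loop costs nothing and buys the crew the one warning that matters.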

For all of the goofiness that is this device, I’ll commend it for two things. The first is for pushing forward the notion of automated medicine. Yes, in this day and age, it’s kind of terrifying to imagine devices handling something as vital as life-saving surgery, but people in the future will likely find it terrifying that today we’d rather trust an error-prone, bull-in-a-china-shop human with the task. And, after all, the characters have entrusted their lives to an android while they were in hypersleep for two years, so clearly that’s a thing they do.

Second, the gestural control to access the decontamination is well considered. It is a large gesture, requiring no great finesse on the part of the operator to find and press a sequence of keys, and one that is easy to execute quickly and in a panic. I’m not sure what percentage of procedures need the back-up safety of a kill-everything-inside mode, but presuming one is ever needed, this is a fine gesture to initiate that procedure. In fact, it could have been used in other interfaces around the ship, as we’ll see later with the escape pod interface.
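For what it’s worth, detecting that five-finger clockwise twist is straightforward: average the rotation of each fingertip about the touch centroid. A sketch, assuming standard math axes (y up) and a 45° threshold of my own choosing:

```python
import math

def twist_angle(before, after):
    """Mean rotation of the fingertips about their shared centroid, in
    degrees, with clockwise reported as positive. `before` and `after`
    are same-order lists of (x, y) touch points."""
    n = len(before)
    cx = sum(x for x, _ in before) / n
    cy = sum(y for _, y in before) / n
    total = 0.0
    for (bx, by), (ax, ay) in zip(before, after):
        d = math.atan2(ay - cy, ax - cx) - math.atan2(by - cy, bx - cx)
        d = (d + math.pi) % (2 * math.pi) - math.pi  # wrap to [-180°, 180°)
        total += d
    return -math.degrees(total / n)  # negate so clockwise is positive

def is_decon_gesture(before, after, threshold_deg=45.0):
    """Five fingers down, twisted clockwise past the threshold."""
    return len(before) == 5 and twist_angle(before, after) >= threshold_deg
```

Averaging across all five fingers is what makes the gesture forgiving: no single fingertip has to land precisely, which is exactly the property you want in a panic.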

I have the sense that in the original script, Shaw had to do what only a few very bad-ass people have been willing to do: perform life-saving surgery on themselves in the direst circumstances. Yes, it’s a bit of a stretch since she’s primarily an anthropologist and astronomer in the story, but give a girl a scalpel, hardcore anesthetics, and an alien embryo, and I’m sure she’ll figure out what to do. But pushing this bad-assery off to an automated device, loaded with constraints, ruins the moment and changes the scene from potentially awesome to just awful.

Given the inexplicable man-only settings, requiring a desperate patient to recall FORTRAN-esque syntax for spoken instructions, and the failure to provide any feedback about the destruction of an extinction-level pathogen, we must admit that the MedPod belongs squarely in the realm of goofy narrative technology and nowhere near the real world as a model of good interaction design.

Remote Monitoring

The Prometheus spacesuits feature an outward-facing camera on the chest, which broadcasts its feed back to the ship, where the video is overlaid with the current wearer’s name, along with inscrutable iconographic and numerical data around the periphery. The suit also has biometric sensors, continuously sending its wearer’s vital signs back to the ship. On the monitoring screen, a waveform in the lower left appears similar to an EKG, but is far too smooth and regular to be an actual one. It is more like an EKG icon. We only see it change shape or position within its bounding box once, to register that Weyland has died, when it turns to a flat line. This supports its being iconic rather than literal.


In addition to the iconic EKG, a red selection rectangle regularly changes position across a list in the upper left-hand corner of the monitor screens. One of three cyan numbers near the top occasionally changes. Otherwise, the peripheral data on these monitoring screens does not change throughout the movie, making it difficult to evaluate its suitability.

The monitoring panel on Prometheus features five of the monitoring feeds gathered on a single translucent screen. One of these feeds has the main focus, being placed in the center and scaled to double the size of the other monitors. How the monitoring crewperson selects which feed to act as the main focus is not apparent.


Vickers has a large, curved, wall-sized display on which she’s able to view David’s feed at one point, so these video feeds can be piped to anyone with authority.


David is able to turn off the suit camera at one point, which Vickers back on the Prometheus is unable to override. This does not make sense for a standard-issue suit supplied by Weyland, but it is conceivable that David has a special suit or has modified the one provided to him during transit to LV-223.


The surface of LV-223, as one imagines with the overwhelming majority of alien planets, is inhospitable to human life. For life support and protection, the crew wears suits complete with large clear “fishbowl” helmets that give a full range of view. A sensor strip arcs across the forehead of the bowl, with all manner of sensors broadcasting information back to the ship. Crew also wear a soft fabric cap beneath the bowl with their name clearly stitched into a patch that sits above the forehead. (Type nerds: The face is modular, something similar to Eurostile. The name is set in all caps. This is par for sci-fi typography, but poor for legibility at a distance.)




Sadly for the crewmembers (and the actors as well), the air inside these bowls is not well filtered and circulated. The inside surface fogs up quite easily from the wearer’s breath.


Audio is handled intuitively, with all microphones between spacesuits being active all the time, with an individual’s volume relative to his or her proximity to the listener. Janek is at one point able to stand in front of the ship and address everyone inside it, knowing that the helmet microphones are monitored at all times.
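This proximity mixing is easy to model: each always-on mic gets a gain that falls off with distance from the listener. A sketch with an inverse-square falloff and an audibility floor; all the constants are illustrative, not from the film:

```python
import math

def channel_gain(speaker_pos, listener_pos, ref_dist=1.0, floor=0.05):
    """Mix gain for one crew member's always-on mic: full volume inside
    ref_dist meters, inverse-square falloff beyond, clamped to a floor so
    distant voices stay faintly present. Positions are (x, y, z) in meters."""
    d = math.dist(speaker_pos, listener_pos)
    if d <= ref_dist:
        return 1.0
    return max(floor, (ref_dist / d) ** 2)
```

The floor is the interesting design choice: it keeps every open channel just barely audible, which is how Janek can count on being heard from outside the ship.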

Lights, Cameras

There are lights inside the helmet, placed over the forehead and pointing down to make the wearer’s face visible to others nearby, as well as to anyone remote-monitoring the wearer with a backward-facing camera. A curious feature of the suits is that they also include yellow lights that highlight the wearer’s neck. What is the purpose of these lights? Certainly they show off Michael Fassbender’s immaculate jawline, but diegetically, it’s unclear what the purpose of these things is. It is only after the scientists remove their helmets in the alien environment—against the direct orders of the Captain—that it becomes clear the spacesuits were designed with this in mind. This way the spacesuits can be operated helmetlessly while maintaining identification lights for other crew. The odds of this feature ever being used on an alien planet are astronomically low, but the designers accommodated it anyway.


Sleeve computers

Holloway’s left sleeve has two small screens. The left one of these displays inscrutably small lines of cyan text. The right one has a label of PT011, with a 3×3 array of two-digit hexadecimal numbers beneath it.


A few of the hexadecimal pairs have highlight boxes around them. Looking at this grid, Holloway is able to report to the others that, “Look at the CO2 levels. Outside it’s completely toxic, and in here there’s nothing. It’s breathable.” It’s inscrutable, but believably shorthand for vital bits of information, understandable to well-trained wearers. For inputs to the sleeve computer, he has four momentary buttons along the bottom and a rotary side-mounted dial. Using these controls, Holloway is able to disable his safety controls and remove the helmet.

Flight instrument panels

There are a great many interfaces seen on the bridge of the Prometheus, and like most flight instrument panels in sci-fi, they are largely about storytelling and less about use.


The captain of the Prometheus is also a pilot, and has a captain’s chair with a heads-up display. This HUD has real-time wireframe displays of the spaceship in plan view, presumably for glanceable damage feedback.


He also can stand afore at a waist-high panel that overlooks the ship’s view ports. This panel has a main screen in the center, grouped arrays of backlit keys to either side, a few blinking components, and an array of red and blue lit buttons above. We only see Captain Janek touch this panel once, and do not see the effects.


Navigator Chance’s instrument panel below consists of four 4:3 displays with inscrutable moving graphs and charts, one very wide display showing a topographic scan of terrain, one dim panel, two backlit reticles, and a handful of lit switches and buttons. Yellow lines surround most dials and group clusters of controls. When Chance “switches to manual,” he flips the lit switches from right to left (nicely accomplishable with a single wave of the hand) and the switches’ lights come on to confirm the change of state. This state would also be visible from a distance, useful for all crew within line of sight. Presumably, this is a dangerous state for the ship to be in, though, so some greater emphasis might be warranted: a blinking warning, audio feedback, or possibly both.


Captain Janek has a joystick control for manual landing control. It has a line of light at the top rear-facing part, but its purpose is not apparent. The degree of differentiation in the controls is great, and they seem to be clustered well.


A few contextless flight screens are shown. One for the scientist known only as Ford features 3D charts, views of spinning spaceships, and other inscrutable graphs, all of which are moving.


A contextless view shows the points of metal detected overlaid on a live view from the ship.


There is a weather screen as well that shows air density. Nearby there’s a push control, which Chance presses and keeps held down when he says, “Boss, we’ve got an incoming storm front. Silica and lots of static. This is not good.” Though we never see how the control routes his message, it’s curious how such a thing could work. Would it be an entire-ship intercom, or did Chance somehow specify Janek as a recipient with a single button?


Later we see Chance press a single button that illuminates red, after which the screens nearby change to read “COLLISION IMMINENT,” and an all-ship prerecorded announcement begins to repeat its evacuation countdown.


This single button is perhaps the most egregious of the flight controls. As Janek says to Shaw late in the film, “This is not a warship.” If that’s the case, why would Chance have a single control that automatically knows to turn all screens red with the Big Label and provide a countdown? And why should the crew ever have to turn this switch on? Isn’t a collision one of the most serious things that could happen to the ship? Shouldn’t it be hard to, you know, turn off?

Mission Briefing

Once the Prometheus crew has been fully revived from their hypersleep, they gather in a large gymnasium to learn the details of their mission from a prerecorded volumetric projection. To initiate the display, David taps the surface of a small tablet-sized handheld device six times, and looks up. A prerecorded VP of Peter Weyland appears and introduces the scientists Shaw and Holloway.

This display does not appear to be interactive. Weyland does mention and gesture toward Shaw and Holloway in the audience, but they could have easily been in assigned seats.

Cue Rubik’s Space Cube

After his introduction, Holloway places an object on the floor that looks like a silver Rubik’s Cube with a depressed black button in the center-top square.


He presses a middle-edge button on the top, and the cube glows and sings a note. Then a glowing-yellow “person” icon appears at the place he touched, confirming his identity and that it’s ready to go.

He then presses an adjacent corner button. Another glowing-yellow icon appears underneath his thumb, this one a triangle-within-a-triangle, and a small projection grows from the side. Finally, by pressing the black button, all of the squares on top open by hinged lids, and the portable projection begins. A row of 7 (or 8?) “blue-box” style volumetric projections appear, showing their 3D contents with continuous, slight rotations.

Gestural control of the display

After describing the contents of each of the boxes, he taps the air toward either end of the row (there is a sparkle sound to confirm the gesture) and brings his middle fingers together in a prayer position. In response, the boxes slide to the center as a stack.

He then twists his hands in opposite directions, keeping the fingerpads of his middle fingers in contact. As he does this, the stack merges.


Then a forefinger tap summons an overlay that highlights a star pattern on the first plate. A middle finger swipe to the left moves the plate and its overlay off to the left. The next plate automatically highlights its star pattern, and he swipes it away. Next, with no apparent interaction, the plate dissolves in a top-down disintegration-wind effect, leaving only the VP spheres that illustrate the star pattern. These grow larger.

Holloway taps the topmost of these spheres, and the VP zooms through interstellar space to reveal an indistinct celestial sphere. He then taps the air again (nothing in particular is beneath his finger) and the display zooms to a star. Another tap zooms to a VP of LV-223.



After a beat of about 9 seconds, the presentation ends, and the VP of LV-223 collapses back into its floor cube.

Evaluating the gestures

In Chapter 5 of Make It So we list the seven pidgin gestures that Hollywood has evolved. The gestures seen in the Mission Briefing confirm two of these: Push to Move and Point to Select, but otherwise they seem idiosyncratic, not matching other gestures seen in the survey.

That said, the gestures seem sensible. On tapping the “bookends” of the blue boxes, Holloway’s finger pads come to represent the extents of the selection, so bringing them together is a reasonable gesture to indicate stacking. The twist gesture seems to lock the boxes in place, breaking the connection between them and his fingertips. This twist turns his hand like a key in a lock, so it has a physical analogue.

It’s confusing that a tap would perform four different actions (highlight star patterns in the blue boxes, zoom to the celestial sphere, zoom to the star, zoom to LV-223), but there is no indication that this is a platform for manipulating VPs so much as it is presentation software. With this in mind, he could arbitrarily assign any gesture to simply “advance the slide.”
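In other words, the cube could work like any presentation tool: one generic gesture bound to “whatever comes next” in a scripted sequence. A sketch, with step names invented from what we see on screen:

```python
# Hypothetical presentation script: each entry is the next scripted step.
PRESENTATION_SCRIPT = [
    "highlight_star_pattern",
    "zoom_celestial_sphere",
    "zoom_star",
    "zoom_LV223",
]

class Briefing:
    """Same gesture, different effect: a tap just advances the script."""

    def __init__(self, script):
        self.script = list(script)
        self.step = 0

    def on_tap(self):
        if self.step >= len(self.script):
            return None  # script exhausted; ignore further taps
        action = self.script[self.step]
        self.step += 1
        return action
```

This is why the four-actions-for-one-tap reading isn’t actually confusing: the tap never meant “do X,” it meant “next.”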

Floating-pixel displays

In other posts we compared the human and alien VPs of Prometheus. They were visually distinct from each other, with the alien “glowing pollen” displays being unique to this movie.

There is a style of human display in Prometheus that looks similar to the pollen. Since the users of these displays don’t perceive these points in 3D, it’s more precise to call it a floating-pixel style. These floating-pixel displays appear in three places.

  • David’s Neurovisor for peering into the dreams of the hypersleeping Shaw. (Note this may be 3D for him.)
  • The landing-sequence topography displays
  • The science lab scanner, used on the alien head




There is no diegetic reason offered in the movie for the appearance of an alien 3D display technology in human 2D systems. When I started to try and explain it, it quickly drifted away from interaction design and into fan theory, so I have left it as an exercise for the reader. But there remains a question about the utility of this style.

Poor cues for understanding 3D

Floating, glowing points are certainly novel to our survey as a way to describe 3D shapes for users. And in the case of the alien pollen, it makes some sense. Seeing these in the world, our binocular vision would help us understand the relationships of each point as well as the gestalt, like walking around a Christmas tree at night.

But in 2D, simple points are not ideal for understanding 3D surfaces, especially when the pixels are all the same apparent size. We normally use small cues of scale to help us understand an object’s relative distance from us. Though the shape can be kind of inferred through motion, the style still creates a great deal of visual noise. It also hurts when the points are too far apart: it doesn’t give us a gestalt sense of surface.
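One cheap fix would be the very depth cue the same-size pixels discard: scale each point’s rendered size by its distance. A sketch with a simple 1/z falloff; the constants are illustrative:

```python
def point_radius(depth, base_radius=3.0, near=1.0):
    """Perspective depth cue for a 2D point-cloud render: points at the
    near plane draw at base_radius pixels, and shrink as 1/depth beyond
    it, so nearer points read as nearer. Depths are in scene units."""
    return base_radius * near / max(depth, near)
```

Even this crude cue restores some of the scale gradient that lets the eye assemble a surface from scattered points.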



I couldn’t find any scientific studies of the readability of this style, so this is my personal take on it. But we can also look to the real world, namely to the history of maps, where cartographers have wrestled with similar problems to show topography. Centuries of their trial and error have resulted in four primary techniques for describing 3D shapes on a 2D surface: hachures, contour lines, hypsometric tints, and shaded relief.


These styles utilize lines, shades, and colors to describe topography, and notably not points. Even modern 3D modeling software uses tessellated wireframes instead of floating points as a lightweight rendering technique. To my knowledge, only geographic information systems display anything similar, and that’s only when the user wants to see actual data points.

These anecdotal bits of evidence combine with my observations of these interfaces in Prometheus to convince me that while the style is stylistically unique (and therefore useful to the filmmakers), it’s seriously suboptimal for real-world adoption.