Kusanagi is able to mentally activate a feature of her skintight bodysuit and hair(?!) that renders her mostly invisible. It does not seem to affect her face by default; after the suit activates, she waves her hand over her face to hide it. We do not see how she activates or deactivates the suit in the first place, but she seems able to do so at will. Since this ability is not based on any existing human biological capacity, a manual control mechanism would need some biological or cultural referent. The gesture she uses—covering her face with open-fingered hands—makes the most sense, since even performed with just a hand it signals, “I can see you but you can’t see me.”
In the film we see Ghost Hacker using the same technology embedded in a hooded coat he wears. He activates it by pulling the hood over his head. Like the face-hiding gesture, this one makes a great deal of physical sense: donning a hood hides your most salient physical identifier, your face, so having it activate the camouflage is a simple synecdochic extension.
The spider tank features the same technology on its surface, which we learn is delicate: a rain of falling glass is enough to disable it.
This tech is less than perfect, distorting the background behind it and occasionally flashing during vigorous physical activity. And of course it cannot hide the effects the wearer creates in the environment, as we see with splashes in the water and citizens in a crowd being bumped aside.
Since this imperfection runs counter to the wearer’s goal, I’d design a silent signal, perhaps haptic feedback, to let the wearer know when they’re moving too fast for the suit’s processors to keep up, reinforcing whatever visual artifacts they themselves can see.
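To make that suggestion concrete, here is a minimal sketch of such a pacing signal, assuming a controller that knows the wearer’s current speed and the renderer’s limit. The class, field name, and threshold are all invented for illustration.

```python
# Illustrative sketch only: one way a camouflage controller might pace its
# wearer. The speed limit and the scaling are hypothetical.

from dataclasses import dataclass

@dataclass
class CamoFeedback:
    max_trackable_speed: float = 1.4   # m/s the renderers can keep up with (assumed)

    def haptic_level(self, wearer_speed: float) -> float:
        """Return a 0.0-1.0 haptic intensity warning the wearer to slow down."""
        if wearer_speed <= self.max_trackable_speed:
            return 0.0                  # camouflage is holding; stay silent
        overshoot = wearer_speed - self.max_trackable_speed
        return min(1.0, overshoot / self.max_trackable_speed)

# e.g. moving at 2.1 m/s produces a 0.5-intensity pulse on the suit's haptics
pulse = CamoFeedback().haptic_level(2.1)
```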
UPDATE: When this was originally posted, I used the incorrect concept “metonym” to describe these gestures. The correct term is “synecdoche,” and the post has been updated to reflect that.
When the Mangalores meet with Zorg to deliver (what they think are) the stones, their leader Aknot is wearing a human disguise. The exact nature of the speculative technology is difficult to determine. (In fact, it’s entirely arguable that this is a biological ability, but it’s more useful to presume it’s not.)
Zorg tells Aknot, “What is that you? What an ugly face. It doesn’t suit you. Take it off.” Aknot strains his chin upward and shakes his head rapidly. As he does so, the disguise fades to reveal his true face.
Presuming it’s a technology, the gesture is a nice design choice for the interaction. It’s not a gesture that’s likely to be performed accidentally, and it has a nice physical metaphor—that of shaking off water. The physicality makes it easy to remember. Plus, being a head gesture, it can be deployed in the field even when carrying a weapon, making it possible to drop the disguise and show your identity to comrades without the risk of lowering your guard. It does temporarily limit the wearer’s ability to sense danger, but I suspect Mangalores care more about keeping their finger on the trigger.
Of course it raises the question of whether what results from the shake is just another disguise, but verifying that would require some external system of multifactor authentication, separate from the gesture itself.
Early in the film, when Shaw sees the MedPod for the first time, she comments to Vickers, “They only made a dozen of these.” As she caresses its interface in awe, a panel extends and the pod instructs her, “Please verbally state the nature of your injury.”
The MedPod is a device for automated, generalized surgical procedures, operable by the patient him- (or her-, kinda, see below) self.
When in the film Shaw realizes that she’s carrying an alien organism in her womb, she breaks free from crewmembers who want to contain her, and makes a staggering beeline for the MedPod.
Once there, she reaches for the extended touchscreen and presses the red EMERGENCY button. Audio output from the pod confirms her selection: “Emergency procedure initiated. Please verbally state the nature of your injury.” Shaw shouts, “I need a cesarean!” The machine informs her verbally, “Error. This MedPod is calibrated for male patients only. It does not offer the procedure you have requested. Please seek medical assistance else–”
I’ll pause the action here to address this. What sensors and actuators are this gender-specific? Why can’t it offer gender-neutral alternatives? Sure, some procedures might need anatomical knowledge of particularly gendered organs (say…emergency circumcision?), but given…
the massive amount of biological similarity between the sexes
the need for any medical device to deal with a high degree of biological variability in its subjects anyway
the fact that most procedures are gender neutral
…this is a ridiculous interface plot device. If Dr. Shaw can issue a few simple system commands that work around this limitation (as she does in this very scene), then the machine could have just done without the stupid error message. (Yes, we get that it’s a mystery why Vickers would have her MedPod calibrated to a man, but really, that’s a throwaway clue.) Gender-specific procedures can’t take up so much room in memory that it was simpler to cut the potential lives it could save in half. You know, rather than outfit it with another hard drive.
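To sketch the alternative this argument implies: rather than a device-wide “calibrated for male” flag, each procedure could declare the anatomy it actually needs and be checked against the patient scan. This is an illustrative sketch only; the procedure names and anatomy labels are invented.

```python
# Hypothetical per-procedure requirements, checked against what the scanner
# actually finds, instead of a single device-wide gender flag.

PROCEDURE_REQUIREMENTS = {
    "cesarean":             {"uterus"},
    "appendectomy":         set(),          # no sex-specific anatomy required
    "foreign_body_removal": set(),
}

def can_offer(procedure: str, scanned_anatomy: set[str]) -> bool:
    """A procedure is offered if the scan shows the anatomy it needs."""
    return PROCEDURE_REQUIREMENTS.get(procedure, set()) <= scanned_anatomy

# A scan of Shaw would show a uterus, so the cesarean should be offered
# no matter how Vickers "calibrated" the pod.
print(can_offer("cesarean", {"uterus", "appendix"}))   # True
```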
Aside from the pointless “tension-building” wrong-gender plot point, there are still interface issues with this step. Why does she need to press the emergency button in the first place? The pod has a voice interface. Why can’t she just shout “Emergency!” or even better, “Help me!” Isn’t that more suited to an emergency situation? Why is a menu of procedures the default main screen? Shouldn’t it be a prompt to speak, and have the menu there for mute people or if silence is called for? And shouldn’t it provide a type-ahead control rather than a multi-facet selection list? OK, back to the action.
Desperate, Shaw presses a button that grants her manual control. She states, “Surgery abdominal, penetrating injuries. Foreign body. Initiate.” The screen confirms these selections among the on-screen options (which read DIAGNOS, THERAP, SURGICAL, MED REC, SYS/MECH, and EMERGENCY).
The pod then swings open saying, “Surgical procedure begins,” and tilting itself for easy access. Shaw injects herself with anesthetic and steps into the pod, which seals around her and returns to a horizontal position.
Why does Shaw need to speak in this stilted syntax? In a panicked medical emergency, proper computer syntax should be the last thing on a user’s mind. Let the patient shout the information however they need to, like “I’ve got an alien in my abdomen! I need it to be surgically removed now!” We know from the Sonic chapter that the use of natural language triggers an anthropomorphic sense in the user, which imposes other design constraints to convey the system’s limitations, but in this case the emergency trumps the need for affordance subtleties.
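As a rough illustration of how forgiving such input could be, here is a toy intent matcher that maps free-form shouted phrases to candidate procedures. The keyword table is invented, and a real system would need far more sophisticated language understanding.

```python
# Minimal sketch of forgiving intent matching in place of rigid command syntax.
# All trigger phrases and intent names are hypothetical.

EMERGENCY_INTENTS = {
    "surgery_foreign_body": {"alien", "foreign body", "get it out", "inside me"},
    "surgery_abdominal":    {"abdomen", "stomach", "cesarean", "caesarean"},
    "hemorrhage_control":   {"bleeding", "blood everywhere"},
}

def match_intents(utterance: str) -> list[str]:
    """Return every procedure whose trigger phrases appear in the utterance."""
    text = utterance.lower()
    return [intent for intent, phrases in EMERGENCY_INTENTS.items()
            if any(phrase in text for phrase in phrases)]

# "I've got an alien in my abdomen! I need it to be surgically removed now!"
# matches both the foreign-body and abdominal-surgery intents, no syntax needed.
print(match_intents("I've got an alien in my abdomen! I need it to be surgically removed now!"))
```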
Once inside the pod, a transparent display on the inside states “EMERGENCY PROC INITIATED.” Shaw makes some touch selections, which run a diagnostic scan along the length of her body. The terrifying results are displayed for her to see, with the alien body differentiated in magenta against her own tissue, shown in cyan.
Shaw shouts, “Get it out!!” It says, “Initiating anesthetics” before spraying her abdomen with a bile-yellow local anesthetic. It then says, “Commence surgical procedure.” (A note for the grammar nerds here: Wouldn’t you expect a machine to stick to a single verb form for consistency? The first, “Initiating…,” is a participle, while the second, “Commence,” is an imperative.) Then, using lasers, the MedPod cuts through tissue until it reaches the foreign body. Given that the lasers can cut organic matter, and that the xenomorph has acid for blood, you have to hand it to the precision of this device. One slip could have burned a hole right through her spine. Fortunately it has a feather-light touch. Reaching in with a speculum-like device, it removes the squid-like alien in its amniotic sac.
OK. Here I have to return to the whole “ManPod” thing. Wouldn’t a scan have shown that this was, in fact, a woman? Why wouldn’t it stop the procedure if it really couldn’t handle working on the fairer sex? Should it have paused to have her sign away insurance rights? Could it really mistake her womb for a stomach? Wouldn’t it, believing her to be a man, presume the whole womb to be a foreign body and try to perform a hysterectomy rather than a delicate cesarean? ManPod, indeed.
After removing the alien, the pod waits around 10 seconds, showing the creature to her and letting her yank its umbilical cord, before she presses a few controls. The MedPod then seals her up again with staples and opens the cover to let her sit up.
She gets off the table, rushes to the side of the MedPod, and places all five fingertips of her right hand on it, quickly twisting her hand clockwise. The interface changes to a red warning screen labeled “DECONTAMINATE.” She taps this to confirm and shouts, “Come on!” (Her vocal instruction does not feel like a formal part of the procedure and the machine does not respond differently.) To decontaminate, the pod seals up and a white mist fills the space.
OK. Since this is a MedPod, and it has something called a decontamination procedure, shouldn’t it actually test to see whether the decontamination worked? The user here has enacted emergency decontamination procedures, so it’s safe to say that this is a plague-level contagion. That doesn’t say to me: spray it with a can of Raid and hope for the best. It says, “Kill it with fire.” We just saw, 10 seconds ago, that the MedPod can do a detailed, alien-detecting scan of its contents, so why on LV-223 would it not check whether the kill-it-now-for-God’s-sake procedure had actually worked, and warn everyone within earshot if it hadn’t? Because someone would need to take additional measures to protect the ship, and take them, stat. But no, the MedPod tucks the contamination under a white misty blanket, smiles, waves, and says, “OK, that’s taken care of! Thank you! Good day! Move along!”
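Here is the shape of the verify-and-escalate loop I’m arguing for, sketched with hypothetical pod subsystems (none of these calls come from the film, obviously):

```python
# Sketch only: rescan after each decontamination pass and alert the ship if
# anything survives. Every method on `pod` is an invented stand-in.

MAX_PASSES = 3

def decontaminate_and_verify(pod) -> bool:
    for attempt in range(1, MAX_PASSES + 1):
        pod.run_decontamination_cycle()
        scan = pod.scan_interior()          # same scanner that found the alien
        if not scan.foreign_organisms:
            pod.announce("Decontamination complete.")
            return True
        pod.announce(f"Warning: contaminant still detected (pass {attempt}).")
    # Out of attempts: don't pretend everything is fine; wake the ship up.
    pod.lock_seals()
    pod.broadcast_alarm("Biohazard containment failure in MedPod. Keep clear.")
    return False
```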
For all of the goofiness that is this device, I’ll commend it for two things. The first is for pushing forward the notion of automated medicine. Yes, in this day and age it’s kind of terrifying to imagine devices handling something as vital as life-saving surgery, but people in the future will likely find it terrifying that today we’d rather trust an error-prone, bull-in-a-china-shop human with the task. And, after all, the characters have entrusted their lives to an android while they were in hypersleep for two years, so clearly that’s a thing they do.
Second, the gestural control for accessing decontamination is well considered. It is a large gesture, requiring no great finesse on the operator’s part and no hunting for and pressing a sequence of keys, and it is easy to execute quickly and in a panic. I’m not at all sure what percentage of procedures need the back-up safety of a kill-everything-inside mode, but presuming one is ever needed, this is a fine gesture to initiate that procedure. In fact, it could have been used in other interfaces around the ship, as we’ll see later with the escape pod interface.
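For the curious, here is roughly how a touch surface might recognize that gesture, assuming touches arrive as (x, y) points in y-up coordinates; the 45-degree threshold and every name here are invented.

```python
# Sketch of a five-finger clockwise-twist recognizer: big, deliberate,
# hard to trigger by accident. Thresholds and conventions are assumptions.

import math

ROTATION_THRESHOLD = math.radians(45)   # how far the hand must twist (assumed)

def twist_angle(start_points, end_points):
    """Mean angular change of the touch points about their starting centroid."""
    cx = sum(x for x, _ in start_points) / len(start_points)
    cy = sum(y for _, y in start_points) / len(start_points)
    deltas = []
    for (x0, y0), (x1, y1) in zip(start_points, end_points):
        a0 = math.atan2(y0 - cy, x0 - cx)
        a1 = math.atan2(y1 - cy, x1 - cx)
        deltas.append(math.remainder(a1 - a0, math.tau))  # wrap to (-pi, pi]
    return sum(deltas) / len(deltas)

def is_decon_gesture(start_points, end_points) -> bool:
    """Five fingers down, twisted clockwise past the threshold."""
    if len(start_points) != 5 or len(end_points) != 5:
        return False
    # In y-up coordinates a clockwise twist is a negative angular change.
    return twist_angle(start_points, end_points) <= -ROTATION_THRESHOLD
```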
I have the sense that in the original script, Shaw had to do what only a few very bad-ass people have been willing to do: perform life-saving surgery on themselves in the direst circumstances. Yes, it’s a bit of a stretch since she’s primarily an archaeologist in the story, but give a girl a scalpel, hardcore anesthetics, and an alien embryo, and I’m sure she’ll figure out what to do. But pushing this bad-assery off to an automated device, loaded with constraints, ruins the moment and changes the scene from potentially awesome to just awful.
Given the inexplicable male-only calibration, the requirement that a desperate patient recall FORTRAN-esque syntax for spoken instructions, and the failure to provide any feedback about the destruction of an extinction-level pathogen, we must admit that the MedPod belongs squarely in the realm of goofy narrative technology and nowhere near the real world as a model of good interaction design.
Once the Prometheus crew has been fully revived from their hypersleep, they gather in a large gymnasium to learn the details of their mission from a prerecorded volumetric projection. To initiate the display, David taps the surface of a small tablet-sized handheld device six times, and looks up. A prerecorded VP of Peter Weyland appears and introduces the scientists Shaw and Holloway.
This display does not appear to be interactive. Weyland does mention and gesture toward Shaw and Holloway in the audience, but they could have easily been in assigned seats.
Cue Rubik’s Space Cube
After his introduction, Holloway places an object on the floor that looks like a silver Rubik’s Cube with a depressed black button in the center-top square.
He presses a middle-edge button on the top, and the cube glows and sings a note. Then a glowing-yellow “person” icon appears at the place he touched, confirming his identity and that the cube is ready to go.
He then presses an adjacent corner button. Another glowing-yellow icon appears underneath his thumb, this one a triangle-within-a-triangle, and a small projection grows from the side. Finally, by pressing the black button, all of the squares on top open on hinged lids, and the portable projection begins. A row of 7 (or 8?) “blue-box” style volumetric projections appears, showing their 3D contents with continuous, slight rotations.
Gestural control of the display
After describing the contents of each of the boxes, he taps the air toward either end of the row (a sparkle-sound confirms the gesture) and brings his middle fingers together, as if in a prayer position. In response, the boxes slide to the center as a stack.
He then twists his hands in opposite directions, keeping the fingerpads of his middle fingers in contact. As he does this, the stack merges.
Then a forefinger tap summons an overlay that highlights a star pattern on the first plate. A middle finger swipe to the left moves the plate and its overlay off to the left. The next plate automatically highlights its star pattern, and he swipes it away. Next, with no apparent interaction, the plate dissolves in a top-down disintegration-wind effect, leaving only the VP spheres that illustrate the star pattern. These grow larger.
Holloway taps the topmost of these spheres, and the VP zooms through interstellar space to reveal an indistinct celestial sphere. He then taps the air again (nothing in particular is beneath his finger) and the display zooms to a star. Another tap zooms to a VP of LV-223.
After a beat of about 9 seconds, the presentation ends, and the VP of LV-223 collapses back into its floor cube.
Evaluating the gestures
In Chapter 5 of Make It So we list the seven pidgin gestures that Hollywood has evolved. The gestures seen in the Mission Briefing confirm two of these: Push to Move and Point to Select, but otherwise they seem idiosyncratic, not matching other gestures seen in the survey.
That said, the gestures seem sensible. In tapping the “bookends” of the blue boxes, Holloway’s finger pads come to represent the extents of the selection, so bringing them together is a reasonable gesture to indicate stacking. The twist gesture seems to lock the boxes in place, breaking the connection between them and his fingertips. It turns his hand like a key in a lock, so it has a physical analogue.
It’s confusing that a tap would perform four different actions (highlight star patterns in the blue boxes, zoom to the celestial sphere, zoom to the star, zoom to LV-223), but there is no indication that this is a general platform for manipulating VPs rather than presentation software. With that in mind, he could arbitrarily assign any gesture to simply “advance the slide,” as sketched below.
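A sketch of that point: the deck’s steps can be authored ahead of time, and the gesture recognizer only needs to answer “advance or not.” The step names below are invented from the scene.

```python
# Illustration only: in presentation software, one gesture can mean "next step."

PRESENTATION_STEPS = [
    "highlight_star_pattern_plate_1",
    "highlight_star_pattern_plate_2",
    "zoom_to_celestial_sphere",
    "zoom_to_star",
    "zoom_to_LV223",
]

class BriefingDeck:
    """Steps are scripted in advance; any recognized tap just advances them."""
    def __init__(self, steps):
        self.steps = steps
        self.index = -1

    def on_gesture(self, gesture):
        # Any air-tap advances the deck; all other gestures are ignored.
        if gesture == "air_tap" and self.index + 1 < len(self.steps):
            self.index += 1
            return self.steps[self.index]
        return None

deck = BriefingDeck(PRESENTATION_STEPS)
deck.on_gesture("air_tap")   # -> "highlight_star_pattern_plate_1"
```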
The second interface David uses to monitor those in hypersleep is the Neuro-Visor, a helmet that lets him perceive their dreams. The helmet is round, solid, and white. The visor itself is yellow and back-lit, the same greenish-yellow seen underneath the hypersleep beds, which clearly establishes the connection between the devices for a new user. When we see David’s view from inside the visor, it is a cinematic, fully immersive 3D projection of events in the sleeper’s dreams, presented in the “spot elevations” style that predominates throughout the film (more on this display technique later).
Later in the movie we see David using this same helmet to communicate with Weyland, who is in a hypersleep chamber but somehow conscious enough to have a back-and-forth dialogue with David. We don’t see either David’s or Weyland’s perspective in the scene.
As an interface, the helmet seems straightforward. He has one Neuro-Visor for all the hypersleep chambers, and to pair the device to a particular one, he simply touches the surface of the chamber near the hypersleeper’s head. Cyan interface elements on that translucent surface confirm the touch and presumably allow some degree of control of the visuals. To turn the Neuro-Visor off, he simply removes it from his head. These are simple and intuitive gestures that make the Neuro-Visor one of the best and most elegantly designed interfaces in the movie.
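A quick sketch of how little state that pairing model needs; the class and method names here are hypothetical.

```python
# Touch-to-pair, doff-to-disconnect, sketched as a tiny state machine.

class NeuroVisor:
    def __init__(self):
        self.paired_chamber = None

    def on_chamber_touched(self, chamber):
        """Touching a hypersleep chamber pairs the visor to that sleeper."""
        self.paired_chamber = chamber
        chamber.show_confirmation(color="cyan")   # the cyan elements seen on screen

    def on_removed_from_head(self):
        """Taking the visor off is the whole 'power off' interaction."""
        self.paired_chamber = None
```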
Much of the control of Morbius’ highly technological house is given to wave-over switches. These devices sit on the tops of tables, and illuminate briefly when they detect a hand moving above them. Each is keyed to a different job.
Alta must beam Robbie several times before he arrives.
One of these devices summons Robbie. After waving her hand several times over it, Alta expresses her frustration when Robbie finally arrives. “Where have you been?” she asks, “I’ve beamed and beamed.” The exchange hints at the need for some feedback from Robbie that he has heard the summons and is coming.
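A sketch of that missing feedback loop, with hypothetical robot and beacon objects: acknowledge the summons, show that it was heard, and report an ETA, so Alta isn’t left waving at a mute switch.

```python
# Illustration only: every call below is an invented stand-in.

def summon(robot, beacon) -> None:
    eta = robot.acknowledge_summons()        # hypothetical: ETA in seconds, or None
    if eta is None:
        beacon.blink(pattern="no_response")  # no answer: Alta knows to worry, not re-wave
        return
    beacon.blink(pattern="acknowledged")     # the summons was heard
    beacon.display_eta(seconds=eta)
```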
With a wave of his hand Morbius activates the security shutters around his house.
Another hand-wave device opens and closes the protective metal shutters that seal Morbius’ house.
To see the outdoors better, Morbius waves off the indoor lights.
Another allows him to turn off the interior lights, which he does in the film to see what is approaching so noisily.
One interesting fact about these interfaces is that they lack any affordance signaling their cause and effect. How is a new user meant to know to wave? How would she know what would happen as a result? Though this is generally inadvisable, we can imagine that Morbius only ever planned for himself and Alta to use them.