So, we've been talking about how JARVIS is lying to Tony, and really all of this was to get us back here. If you accept that JARVIS is doing almost all the work, and Tony is an onboard manager, then it excuses almost all of the excesses of the interface.
Distracting 3D, transparent, motion graphics of the tower? Not a problem. Tony is a manager, and wants to know that the project is continuing apace.
Random-width rule line around the video? Meh, it’s more distracting visual interest.
“AUDIO ANAL YSIS” (kerning, people!) waveform that visually marks whether there is audio he could hear anyway? Hey, it looks futuristic.
The fact that the video stays bright and persistent in his vision when he's a) not looking at it and b) piloting a weaponized environmental suit through New York City? Not an issue because JARVIS is handling the flying.
That it has no apparent controls for literally anything (pause/play, end call, volume, brightness)? Not a problem, JARVIS will get it right most of the time, and will correct anything at a word from Tony.
That the suit could have flown itself to the pipe and handled the welding and pipe-cuffing itself, freeing Tony to continue Tony Starking back in his office? It's because he's a megalomaniac and can't not.
If JARVIS were not handling everything, and this were a placebo interface, well, I can think of at least 6 problems.
The first computer interface we see in the film occurs at 3:55. It’s an interface for housing and monitoring the tesseract, a cube that is described in the film as “an energy source” that S.H.I.E.L.D. plans to use to “harness energy from space.” We join the cube after it has unexpectedly and erratically begun to throw off low levels of gamma radiation.
The harnessing interface consists of a housing, a dais at the end of a runway, and a monitoring screen.
Fury walks past the dais they erected just because.
The housing & dais
The harness consists of a large circular housing that holds the cube and exposes one face of it towards a long runway that ends in a dais. Diegetically this is meant to be read more as engineering than interface, but it does raise questions. For instance, if they didn't already know it was going to teleport someone here, why was there a dais there at all, at that exact distance, with stairs leading up to it? How's that harnessing energy? Wouldn't you expect a battery at the far end? If they did expect a person, as it seems they did, then the whole destroying-swaths-of-New-York-City thing might have been avoided if the runway had ended instead in the Hulk-holding cage that we see later in the film. So…you know…a considerable flaw in their unknown-passenger teleportation landing strip design. Anyhoo, the housing is also notable for keeping part of the cube visible to users near it, and holding it at a particular orientation, which plays into the other component of the harness—the monitor.
On each of the sleep pods in which the Odyssey crew sleep, there is a display for monitoring the health of the sleeper. It includes some biometric charts, measurements, a body location indicator, and a countdown timer. This post focuses on that timer.
To show the remaining time until Julia wakes, the pod's display pops up a countdown that shows hours, minutes, and seconds. It shows the final seconds in red while also beeping for each second. It pops up over the monitoring interface.
Julia’s timer reaches 0:00:01.
The thing with pop-ups
We all know how it goes with pop-ups—pop-ups are bad and you should feel bad for using them. Well, in this case it might actually not be that bad.
The viewer
Although the sleep pod display's main function is to show biometric data of the sleeper, the system pops up a countdown to show the remaining time until the sleeper wakes up. And while the display has some degree of redundancy in how it shows the data—e.g. heart rate appears both as a graph and as a number—the design of the countdown brings two downsides for the viewer.
Position: it's placed right in the middle of the screen.
Size: it's roughly a quarter of the whole size of the display.
Between the two, it partially covers both the pulse graphics and the numbers, which can be vital (as in life-threatening) information of use to the viewer.
As far as Carmen is concerned, the shuttle is small fries. Her real interest is in piloting a big ship, like the Rodger Young.
On her first time at the helm as Pilot Trainee, she enters the bridge, reports for duty, and takes the number 2 chair. As she does, she reaches out to one of two panels and flips two green toggle switches simultaneously down, and immediately says, “Identify.”
In response her display screen (a cathode ray tube, guys, complete with bowed-glass surface!)—which had been reading STATION STANDBY in alternating red and yellow capitals—very quickly flashes the legend VOICE IDENTITY CONFIRMED in white letters before displaying a waveform with the label ANALYZING VOICEPRINT, ostensibly of her voice input. Then, having confirmed her identity, it displays her IDENTIFICATION RECORD, including her name, portrait, mission status, current assignment, and a shouty all-caps red-letter welcome message at the bottom: WELCOME ABOARD ENSIGN. There are tables of tuples along the bottom and top of these screens but they're unreadable in my copy.
She then reaches to the panel of physical controls again, and flips a red toggle switch before pressing two out of a 4×4 grid of yellow-orange momentary buttons. She sits back in her seat, and turns to see the ridiculously-coiffed Zander in the adjacent chair. Plot ensues.
Some challenges with this setup.
Input
It looks like those vertical panels of unlabeled switches and buttons are all she’s got for input. Not the most ergonomic, if she’s expected to be entering data for any length of time or under any duress.
Output
Having the display in front of her makes a great deal of sense, since most of the things she’s dealing with as either a pilot or navigator are not just out the front viewport.
Workflow
The workflow for authentication is a little strange, and mismatched for the screens we see.
A toggle switch might make sense if its meaning was "I am present." But we can imagine lots of other ways the system might sense that she is present passively, and not require her to flip the switch manually.
Why would it analyze the voiceprint after the voice identity was confirmed? It would have made more sense for the first screen to prompt her to provide a voiceprint (something like PROVIDE VOICEPRINT), with some visual confirmation that it's currently recording and sensitive to her voice. Then, when she finishes speaking the sample, the next screen can say ANALYZING VOICEPRINT with the recorded waveform, and the final screen can read VOICE IDENTITY CONFIRMED before moving on. I can't readily apologize for the way it's structured now. Fortunately it zips by so fast that most folks will just get it.
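For clarity, here's the ordering I'm arguing for, written out as a simple sequence. The screen names are mine, loosely based on the legends we see on screen, and the whole thing is just a sketch of the flow, not anything the film specifies:

```python
# A sketch of the proposed authentication screen flow (my naming and
# ordering for the first three screens). The film shows CONFIRMED
# before ANALYZING, which is the mismatch described above.

PROPOSED_FLOW = [
    "PROVIDE VOICEPRINT",        # prompt, with a live recording indicator
    "ANALYZING VOICEPRINT",      # show the captured waveform while matching
    "VOICE IDENTITY CONFIRMED",  # success state
    "IDENTIFICATION RECORD",     # name, portrait, status, assignment
]

FILM_FLOW = [
    "VOICE IDENTITY CONFIRMED",
    "ANALYZING VOICEPRINT",
    "IDENTIFICATION RECORD",
]
```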
The waveform
That waveform, by the way, is not for the word "identify." I opened the screen cap, isolated the "waveform", tweaked it in Photoshop for levels, and expanded it.
I ran this image through the demo of a program called PhotoSounder. What played from my speakers was more like astronomy recordings than a voice. Admittedly, it’s audio interpreted from a very low-rez version of the waveform, but seriously, more data is not going to help resolve that audio spookiness into human language.
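If you're curious what this kind of image-to-sound conversion involves, here's a minimal sketch in Python (using Pillow and NumPy). To be clear, this is not how PhotoSounder works internally, just a crude illustration of reading a waveform picture back out as audio, and the filenames are hypothetical:

```python
# Minimal sketch: treat the brightness of each image column as an
# amplitude envelope and synthesize a crude audio file from it.
# NOT PhotoSounder's actual method; just an illustration.
# "waveform_screencap.png" and "reconstructed.wav" are hypothetical names.

import numpy as np
import wave
from PIL import Image

img = Image.open("waveform_screencap.png").convert("L")  # grayscale
pixels = np.asarray(img, dtype=np.float32) / 255.0

# Average brightness per column serves as a rough amplitude envelope.
envelope = pixels.mean(axis=0)
envelope = envelope / (envelope.max() or 1.0)

# Stretch the envelope over ~2 seconds of a 440 Hz tone.
sample_rate = 44100
duration = 2.0
t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
stretched = np.interp(t, np.linspace(0, duration, envelope.size), envelope)
signal = (stretched * np.sin(2 * np.pi * 440 * t) * 32767).astype(np.int16)

with wave.open("reconstructed.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)  # 16-bit samples
    wav.setframerate(sample_rate)
    wav.writeframes(signal.tobytes())
```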
Props to the interface designers for NOT showing the waveform of sounds in the Rodger Young’s database. It would be explanatory, of course, to immediately see the freshly recorded one being compared against the one in the database. But it would not be very secure. A malefactor would just be able to screen cap or photograph the database version, interpret the waveform like I did for the sound above, and play it back for the system for a perfect match.
Multifactor authentication
Additional props to whoever specced the password button presses after the login. She might be setting a view she wants to see, but I prefer it to mean the system is using multifactor authentication. She’s providing a password. Sure, it’s a weak one—2 hexadecimal characters—but it’s better than nothing, and would even help with the hacking I described in the above section.
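For a sense of just how weak: a quick back-of-the-envelope count of that keyspace, assuming the two presses are order-sensitive and repeats are allowed (the scene doesn't confirm either), comes out tiny.

```python
# Keyspace of a 2-character "password" entered on a 4x4 (16-button) grid.
# Assumes order matters and repeats are allowed -- the scene doesn't say.
buttons = 16
length = 2
keyspace = buttons ** length
print(keyspace)  # 256 possible codes -- trivially brute-forceable
```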
The welcome message
Finally, the welcome message feels a little out of place. Is this the only place she encounters the computer system? The literal sense of "welcome aboard" is to welcome someone aboard, which would be most appropriate only when they, you know, come aboard, which surely was some time ago. Carmen at least had to drop her stuff off in quarters. It's also used by individuals who have been aboard a while to welcome newcomers the first time they greet them. But that anthropomorphizes this interface, which, judging from this interaction and the several we'll see next, would be dangerously overpromising.
The Prometheus spacesuits feature an outward-facing camera on the chest, which broadcasts its feed back to the ship, where the video is overlaid with the current wearer's name and inscrutable iconographic and numerical data along the periphery. The suit also has biometric sensors, continuously sending its wearer's vital signs back to the ship. On the monitoring screen, a waveform in the lower left appears similar to an EKG, but is far too smooth and regular to be an actual one. It is more like an EKG icon. We only see it change shape or position along its bounding box once, to register that Weyland has died, when it turns to a flat line. This supports its being iconic rather than literal.
In addition to the iconic EKG, a red selection rectangle regularly changes across a list in the upper left hand corner of the monitor screens. One of three cyan numbers near the top occasionally changes. Otherwise the peripheral data on these monitoring screens does not change throughout the movie, making it difficult to evaluate its suitability.
The monitoring panel on Prometheus features five of the monitoring feeds gathered on a single translucent screen. One of these feeds has the main focus, being placed in the center and scaled to double the size of the other monitors. How the monitoring crewperson selects which feed to act as the main focus is not apparent.
Vickers has a large, curved, wall-sized display on which she’s able to view David’s feed at one point, so these video feeds can be piped to anyone with authority.
David is able to turn off the suit camera at one point, which Vickers back on the Prometheus is unable to override. This does not make sense for a standard-issue suit supplied by Weyland, but it is conceivable that David has a special suit or has modified the one provided to him during transit to LV-223.
During David's two-year journey, part of his time is spent "deconstructing dozens of ancient languages to their roots." We see one scene illustrating a pronunciation part of this study early in the film. As he's eating, he sees a volumetric display of a cuboid appear high in the air opposite his seat at the table. The cuboid is filled with a cyan glow in which a "talking head" instructor takes up most of the space. On the left is a column of five still images of other artificially intelligent instructors. Each image has two vertical sliders on the left, but the meaning of these sliders is not made clear. In the upper right is an obscure diagram that looks a little like a constellation, with some inscrutable text below it.
On the right side of the cuboid projection, we see some other information in pinks, blues, and cyans. This information appears to be text, bar charts, and line graphs. It is not immediately usable to the learner, so perhaps it is material about the entire course, for when the lessons are paused: notes about progress towards a learning goal, advice for further study, or next steps. Presuming this is a general-purpose interface rather than a custom one made just for David, this information could be the student's progress notes for an attending human instructor.
We enter the scene with the AI saying, “…Whilst this manner of articulation is attested in Indo-European descendants as a purely paralinguistic form, it is phonemic in the ancestral form dating back five millennia or more. Now let’s attempt Schleicher’s Fable. Repeat after me.”
In the lower part of the image is a waveform of the current phrase being studied. In the lower right is the written text of the phrase being studied, in what looks like a simplified phonetic alphabet. As the instructor speaks this fable, each word is highlighted in the written form. When he is done, he prompts David to repeat it.
After David repeats it, the AI instructor smiles, nods, and looks pleased. He praises David’s pronunciation as “Perfect.”
This call and response seems par for modern methods of language learning software, even down to “listening” and providing feedback. Learning and studying a language is ultimately far more complicated than this, but it would be difficult to show much more of it in such a short scene. The main novelty that this interface brings to the notion of language acquisition seems to be the volumetric display and the hint of real-time progress notes.
The android David tends to the ship and the hypersleeping crew during the two-year journey.
The first part of the interface for checking in on the crew is a cyan-blue touch screen labeled “HYP.SL” in the upper left hand corner. The bulk of this screen is taken up with three bands of waveforms. A “pulse” of magnification flows across the moving waveforms from left to right every second or so, but its meaning is unclear. Each waveform appears to show a great deal of data, being two dozen or so similar waveforms overlaid onto a single graph. (Careful observers will note that these bear a striking resemblance to the green plasma-arc alien interface seen later in the film, and so their appearance may have been driven stylistically.)
To the right of each waveform is a medium-sized number (in Eurostile) indicating the current state of the index. They are color-coded for easy differentiation. In contrast, the lines making up the waveform are undifferentiated, so it's hard to tell whether the graph shows multiple data points plotted on a single graph, or a single data point across multiple time spans. Whatever the case, the more complex graph would make identifying a recent trend more complicated. If it's useful to summarize the information with a single number on the right, it would be good to show what's happening to that single number across the length of the graph. Otherwise, you're pushing that trendspotting off to the user's short-term memory and risking missing opportunities for preventative measures.
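To illustrate that suggestion, here's a minimal sketch in Python with NumPy and Matplotlib. All of the data is invented (we can't read the film's), so it only shows the design point: a couple dozen overlaid raw traces versus the single summary index plotted over the same span.

```python
# Sketch of the suggestion above: in addition to overlaying many raw
# waveforms, plot the single summary index over time so a viewer can
# spot a trend at a glance. All data here is invented.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)  # 60 seconds of history

# Two dozen similar, overlaid raw traces (as on the HYP.SL screen).
raw = [np.sin(t * 2 + rng.normal(0, 0.2)) + rng.normal(0, 0.1, t.size)
       for _ in range(24)]

# A single summary index, drifting slowly -- the number shown at right.
summary = 72 + 0.05 * t + rng.normal(0, 0.3, t.size)

fig, (top, bottom) = plt.subplots(2, 1, sharex=True)
for trace in raw:
    top.plot(t, trace, color="teal", alpha=0.2, linewidth=0.5)
top.set_title("Overlaid raw traces (hard to spot a trend)")

bottom.plot(t, summary, color="orange")
bottom.set_title("Summary index over time (trend is obvious)")
bottom.set_xlabel("seconds")
plt.tight_layout()
plt.show()
```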
Another small diagram in the lower left is a force-directed, circular edge-bundling diagram, but as this and the other controls on the screen are inscrutable, we cannot evaluate their usefulness in context.
After observing the screen for a few seconds, David touches the middle of the screen, a wave of distortion spreads from his finger for half a second, and we hear a "fuzz" sound. The purpose of the touch is unclear. Since it makes no discernible change in the interface, it could be what I've called one free interaction, but this seems unlikely since such cinematic attention was given to it. My only other guess is that it registers David's presence there, like a guard tour patrol system or watchclock that ensures he's doing his rounds.
All telecommunications in the film are based on either a public address or a two-way radio metaphor.
Commander Adams addresses the crew.
To address the crew from inside the ship, Commander Adams grabs the microphone from its holder on the wall. Its long handle makes it easy to grab. He speaks into the lit, transparent circle mounted to one end, and his voice is automatically broadcast across the ship.
Commander Adams lets Chief Quinn know he’s in command of the ship.
Quinn listens for incoming signals.
The two-way radio on his belt is routed through the communications officer back at the ship. To use it, he unclips the small cylindrical microphone from its clip, flips a small switch at the base of the box, and pulls the microphone on its tether close to his mouth to speak. When the device is active, a small array of lights on the box illuminates.
Confirming their safety by camera, Chief Quinn gets an eyeful of Alta.
The microphone also has a video camera within it. When Chief Quinn asks Commander Adams to activate the viewer, he does so by turning the device such that its small end faces outwards, at which time it acts as a camera, sending a video signal back to the ship, to be viewed on the Viewplate.
The Viewplate is used frequently to see outside the ship.
Altair IV looms within view.
The Viewplate is a large video screen with rounded edges that is mounted to a wall off the bridge. To the left of it three analog gauges are arranged in a column, above two lights and a stack of sliders. These are not used during the film.
Commander Adams engages the Viewplate to look for Altair IV.
The Viewplate is controlled by a wall mounted panel with a very curious placement. When Commander Adams rushes to adjust it, he steps to the panel and adjusts a few horizontal sliders, while craning around a cowling station to see if his tweaks are having the desired effect. When he’s fairly sure it’s correct, he has to step away from the panel to get a better view and make sure. There is no excuse for this poor placement.