Cyberspace: the hardware

And finally we come to the often-promised cyberspace search sequence, my favourite interface in the film. It starts at 36:30 and continues, with brief interruptions to the outside world, to 41:00. I’ll admit there are good reasons not to watch the entire film, but if you are interested in interface design, this will be five minutes well spent. Included here are the relevant clips, lightly edited to focus on the user interfaces.

Video: The cyberspace search.

Video: Board conversation, with Pharmakom tracker and virus.

First, what hardware is required?

Johnny and Jane have broken into a neighbourhood computer shop, which in 2021 will have virtual reality gear just as today even the smallest retailer has computer mice. Johnny clears miscellaneous parts off a table and then sits down, donning a headset and datagloves.

jm-30-hardware-a

Headset

Headsets haven’t really changed much since 1995, when this film was made. Barring some breakthrough in neural interfaces, they remain the best way to block off the real world and immerse a user in the virtual world of the computer. It’s mildly confusing to a current-day audience to hear Johnny ask for “eyephones”, which in 1995 was the name of a particular VR headset rather than the popular “iPhone” of today.

Brain Scanning

The second half of the film is all about retrieving the data from Johnny’s implant without the full set of access codes. Johnny needs to get the data downloaded soon or he will die from the “synaptic seepage” caused by squeezing 320G of data into a system with 160G capacity. The bad guys would prefer to remove his head and cryogenically freeze it, allowing them to take their time over retrieval.

1 of 3: Spider’s Scanners

The implant cable interface won’t allow access to the data without the codes. Bypassing this protection requires three increasingly complicated brain scanners, two of them medical systems and the third a LoTek hacking device. Although the implant stores data, not human memories, all of these brain scanners work in the same way as the non-invasive “reading from the brain” interfaces described in Chapter 7 of Make It So.

The first system is owned by Spider, a Newark body modification specialist. Johnny sits in a chair, with an open metal framework surrounding his head. There’s a bright strobing light, switching on and off several times a second.

jm-20-spider-scan-a

Nearby, a monitor shows a large rotating image of his head and skull, and three smaller images on the left labelled Scans 1 to 3.

Brain Upload

Once Johnny has installed his motion detector on the door, the brain upload can begin.

3. Building it

Johnny starts by opening his briefcase and removing various components, which he connects together into the complete upload system. Some of the parts are disguised, and the whole sequence is similar to an assassin in a thriller film assembling a gun out of harmless-looking pieces.

jm-6-uploader-kit-a

It looks strange today to see a computer system with so many external devices connected by cables. We’ve become accustomed to one-piece computing devices with integrated functionality, and to keyboards, mice, cameras, printers, and headphones that connect wirelessly.

Cables and other connections are not always considered as interfaces, but “all parts of a thing which enable its use” is the definition according to Chris. In the early-to-mid 1990s, most computer users were well aware of the potential for confusion and frustration in such interfaces. A personal computer could have connections to a monitor, keyboard, mouse, modem, CD drive, and joystick – and every single device would use a different type of cable. USB, while not perfect, is one of the greatest improvements ever made in user interfaces.

Little boxes on the interface

StarshipT-undocking01

After recklessly undocking we see Ibanez using an interface of…an indeterminate nature.

Through the front viewport Ibanez can see the cables and some small portion of the docking station. That’s not enough for her backup maneuver. To help her with that, she uses the display in front of her…or at least I think she does.

Undocking_stabilization

The display is a yellow wireframe box that moves “backwards” as the vessel moves backwards. It’s almost as if the screen displayed a giant wireframe airduct through which they moved. That might be useful for understanding the vessel’s movement when visual data is scarce, such as navigating in empty space with nothing but distant stars for reckoning. But here she has more than enough visual cues to understand the motion of the ship: if the massive space dock were not enough, there’s that giant moon thing just beyond. So I think understanding the vessel’s basic motion in space isn’t the priority while undocking. More important is to help her understand the position of collision threats, and I cannot explain how this interface does that in any but the feeblest of ways.

If you watch the motion of the screen, it stays perfectly still even as you can see the vessel moving and turning. (In that animated gif I steadied the camera motion.) So what’s it describing? The ideal maneuver? Why doesn’t it show her a visual signal of how well she’s doing against that goal? (Video games have nailed this. The “driving line” in Gran Turismo 6 comes to mind.)

Gran Turismo driving line
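For the curious, the computation behind such a guidance cue is simple. Here is a minimal sketch, not anything implied by the film: the positions, thresholds, and colors are all hypothetical. It measures how far the ship has drifted from an ideal straight exit path and maps that deviation to a driving-line-style color.

```python
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def sub(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def dot(self, other: "Vec3") -> float:
        return self.x * other.x + self.y * other.y + self.z * other.z

    def length(self) -> float:
        return math.sqrt(self.dot(self))

def cross_track_error(ship_pos: Vec3, path_start: Vec3, path_end: Vec3) -> float:
    """Distance from the ship to the ideal (straight) undocking line."""
    path = path_end.sub(path_start)
    to_ship = ship_pos.sub(path_start)
    # Project the ship onto the ideal line, then measure the leftover offset.
    t = to_ship.dot(path) / path.dot(path)
    closest = Vec3(path_start.x + t * path.x,
                   path_start.y + t * path.y,
                   path_start.z + t * path.z)
    return ship_pos.sub(closest).length()

def guidance_color(error_m: float) -> str:
    """Map deviation to the kind of color cue a driving line uses."""
    if error_m < 1.0:
        return "blue"      # on the ideal line
    if error_m < 5.0:
        return "yellow"    # drifting, correct soon
    return "red"           # well off the line, collision risk rising

# Example: ship has drifted 3.2 m off the planned exit corridor.
err = cross_track_error(Vec3(3.2, 0.0, -40.0), Vec3(0, 0, 0), Vec3(0, 0, -200))
print(f"{err:.1f} m off the ideal path -> show it in {guidance_color(err)}")
```

Nothing fancier than projecting the ship’s position onto the planned path is needed for the interface to show her, moment to moment, how well she’s doing against the goal.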

If it’s not helping her avoid collisions, the high-contrast motion of the “airduct” is a great deal of visual distraction for very little payoff. That wouldn’t be interaction so much as a neurological distraction from the task at hand. So I even have to dispense with my usual New Criticism stance of accepting it as if it were perfect. Because if this were the intention of the interface, it would be encouraging disaster.

StarshipT-undocking17

The ship does have some environmental sensors, since when it is 5 meters from the “object,” i.e. the dock, a voiceover states this fact to everyone on the bridge. Note that it’s not panicked, even though that’s, relatively speaking, a peach-skin away from a hull breach and bajillions of credits of damage. No, the voice just says it, like it was remarking about a penny it happened to see on the sidewalk. “Three meters from object” is said with the same dispassion moments later, even though that’s a loss of 40% of the prior distance. “Clear” is spoken with the same dispassion, even though it should be saying, “Court Martial in process…” Even the tiny little rill of an “alarm” that plays under the scene sounds more like your sister failing to respond to her Radio Shack alarm clock in the next room than, as it should be, a throbbing alert.
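Making the callout carry some of that urgency is trivial. Here is a minimal sketch, assuming the proximity sensor reports distance in meters; the thresholds, urgency tiers, and wording are my own invention, not anything from the film:

```python
def proximity_callout(distance_m: float, clear: bool = False) -> tuple[str, str]:
    """Return (urgency, spoken line) for a hull-proximity announcement.

    The point is that tone should scale with risk: the same flat voice
    for 5 m, 3 m, and "clear" throws away the information the crew
    actually needs.
    """
    if clear:
        return ("info", "Clear of dock.")
    if distance_m > 20:
        return ("info", f"{distance_m:.0f} meters from object.")
    if distance_m > 10:
        return ("caution", f"Caution: {distance_m:.0f} meters from object.")
    if distance_m > 4:
        return ("warning", f"Warning: {distance_m:.0f} meters from object. Reduce closure rate.")
    return ("alarm", f"Collision alarm: {distance_m:.1f} meters from object!")

# The two distances called out in the scene, plus a couple for context.
for d in (25, 12, 5, 3):
    urgency, line = proximity_callout(d)
    print(urgency.upper().ljust(8), line)
```

With something like this, the 5-meter call would already be a warning and the 3-meter call a full alarm, instead of two identically bored remarks.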

StarshipT-undocking24

Since the interface does not help her, actively distracts her, and underplays the severity of the danger, is there any apology for this?

1. Better: A viewscreen

Starship Troopers happened before the popularization of augmented reality, so we can forgive the film for not adopting that technology, even though it might have been useful. AR might have been a lot for the film to explain to a 1997 audience. But the movie was made long after the popularization of the viewscreen forward display in Star Trek. Of course the film is embracing a unique aesthetic, but focusing on utility: replace the glass in front of her with a similar viewscreen, and you can even virtually shift her view to the back of the Rodger Young. If she is distracted by the “feeling” of the thrusters, perhaps a second screen behind her will let her swivel around to pilot “backwards.” With this viewscreen she’s got some (virtual) visual information about collision threats coming her way. Plus, you could augment that view with precise proximity warnings and, yes, if you want, airduct animations showing the ideal path (similar to what they did in Alien).

2. VP

The viewscreen solution still puts some burden on her as a pilot to translate 2D information on the viewscreen into 3D reality. Sure, that’s often the job of a pilot, but can we make that part of the job easier? Note that Starship Troopers was also created after the popularization of volumetric projections in Star Wars, so that might have been a candidate, too, with some third-person display nearby that shows her the 3D information in an augmented way that is fast and easy for her to interpret.

3. Autopilot or docking tug-drones

Yes, this scene is about her character, but if you were designing for the real world, this is a maneuver that an agentive interface can handle. Let the autopilot handle it, or adorable little “tug-boat” drones.
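Handed to an agentive system, the control problem itself is modest. A toy sketch follows, assuming the ship can read its lateral offset from the exit corridor and its sideways drift and can command thrusters directly; the proportional-derivative loop and its gains are my own illustration, not anything the film shows:

```python
def autopilot_undock_step(offset_m: float, drift_mps: float,
                          kp: float = 0.4, kd: float = 1.2) -> float:
    """One control tick of a toy undocking autopilot.

    Returns a lateral thruster command that pushes the ship back toward
    the exit corridor's centerline while damping its sideways drift --
    the sort of maneuver an agentive system (or a tug drone) could own
    so the pilot never has to eyeball it.
    """
    return -(kp * offset_m + kd * drift_mps)

# Tiny simulation: ship starts 3 m off the corridor centerline.
offset, drift, dt = 3.0, 0.0, 0.5
for tick in range(10):
    thrust = autopilot_undock_step(offset, drift)
    drift += thrust * dt          # treat thrust as acceleration, unit mass
    offset += drift * dt
    print(f"t={tick * dt:4.1f}s offset={offset:5.2f} m drift={drift:5.2f} m/s")
```

The interesting design question isn’t the math; it’s whether the story wants Ibanez or the machine to own the maneuver.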

StarshipT-undocking25

Audio Syringe

Prometheus-247

David uses this device when Shaw begins to double over from the pain of the alien growing in her womb. It is a palm-sized cylinder with a large needle sticking out one end and a yellow button on the other. To administer it, he jams it into her shoulder, depressing the yellow button with his thumb, and holds it there until the spraying sound coming from it ceases.

Prometheus-246

This is not a hypospray (as described in Chapter 12, Medicine, of the book), which would not have a needle, so where is the sound coming from? It might be an audio augmentation to let the administrator know the dose is being delivered. This would be a reasonable sound, as it gives a sense of pressure releasing. But there should be some clear signal, like a soft double-beep, when the dosage is complete, lest the device be removed too soon because the audio was misinterpreted.
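The feedback logic being argued for here is tiny. A sketch, with invented names, dose values, and timings, of how an injector might pair an in-progress sound with a distinct completion cue:

```python
import time

def administer_dose(dose_ml: float, flow_ml_per_s: float = 0.5) -> None:
    """Simulate the injector's audio feedback.

    While the dose is dispensing, play the continuous pressure-release
    sound; when the full dose has gone in, play a distinct double beep
    so the administrator knows it is safe to withdraw the needle.
    """
    delivered = 0.0
    step = 0.25  # seconds per tick
    while delivered < dose_ml:
        delivered = min(dose_ml, delivered + flow_ml_per_s * step)
        print(f"[spray] {delivered:.2f}/{dose_ml:.2f} ml delivered")
        time.sleep(step)
    print("[beep-beep] dose complete, safe to remove")

administer_dose(1.0)
```

Two distinguishable signals, one for “working” and one for “done,” is all it would take to remove the ambiguity.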

HYP.SL

The android David tends to the ship and the hypersleeping crew during the two-year journey.

The first part of the interface for checking in on the crew is a cyan-blue touch screen labeled “HYP.SL” in the upper left-hand corner. The bulk of this screen is taken up with three bands of waveforms. A “pulse” of magnification flows across the moving waveforms from left to right every second or so, but its meaning is unclear. Each waveform appears to show a great deal of data, being two dozen or so similar waveforms overlaid onto a single graph. (Careful observers will note that these bear a striking resemblance to the green plasma-arc alien interface seen later in the film, and so their appearance may have been driven stylistically.)

HYP.SL

To the right of each waveform is a medium-sized number (in Eurostile) indicating the current state of that index. They are color-coded for easy differentiation. In contrast, the lines making up the waveform are undifferentiated, so it’s hard to tell whether the graph shows multiple data points plotted on a single graph or a single data point tracked across time. Whatever the case, the more complex graph makes identifying a recent trend more complicated. If it’s useful to summarize the information with a single number on the right, it would be good to show what’s happening to that single number across the length of the graph. Otherwise, you’re pushing that trendspotting off to the user’s short-term memory and risking missed opportunities for preventative measures.
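Under the hood, that amounts to very little. A sketch, assuming each band really is a couple dozen channel readings sampled over time (the data and function names here are made up): collapse the channels into one summary value per time step, then show its recent slope next to the instantaneous number.

```python
from statistics import mean

def summary_series(channels: list[list[float]]) -> list[float]:
    """Collapse overlaid channel readings into one number per time step
    (here, the mean across channels)."""
    return [mean(samples_at_t) for samples_at_t in zip(*channels)]

def recent_trend(series: list[float], window: int = 10) -> float:
    """Simple per-step slope over the last `window` samples -- the thing
    the display could plot so the viewer isn't doing trendspotting from
    short-term memory."""
    tail = series[-window:]
    return (tail[-1] - tail[0]) / (len(tail) - 1)

# Fake vitals: two dozen similar channels, all drifting slowly downward.
channels = [[100 - 0.3 * t + ch * 0.1 for t in range(60)] for ch in range(24)]
series = summary_series(channels)
print(f"current value: {series[-1]:.1f}")
print(f"trend: {recent_trend(series):+.2f} per sample")
```

A single summary line plotted over the same span as the waveforms, with that slope annotated, would make a slow decline visible at a glance.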

Another small diagram, in the lower left, is a force-directed, circular edge-bundling diagram, but as this and the other controls on the screen are inscrutable, we cannot evaluate their usefulness in context.

After observing the screen for a few seconds, David touches the middle of the screen, a wave of distortion spreads from his finger for half a second, and we hear a “fuzz” sound. The purpose of the touch is unclear. Since it makes no discernible change in the interface, it could be what I’ve called one free interaction, but this seems unlikely since such cinematic attention was given to it. My only other guess is that it registers David’s presence, like a guard tour patrol system or watchclock that ensures he’s doing his rounds.
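If the touch really is a presence check, the logic behind it would be tiny. A sketch, with invented names, of the watchclock-style record such a touch might create:

```python
from datetime import datetime, timezone

ROUNDS_LOG: list[dict] = []

def log_checkpoint(station: str, crew_member: str) -> dict:
    """Record that someone physically touched this station on their rounds,
    like punching a watchclock at a guard-tour checkpoint."""
    entry = {
        "station": station,
        "crew_member": crew_member,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    ROUNDS_LOG.append(entry)
    return entry

# David touching the HYP.SL panel would simply drop one entry in the log.
print(log_checkpoint(station="HYP.SL", crew_member="David"))
```

It would explain why the interface itself doesn’t need to change: the touch is for the ship’s records, not for David.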