Brain Scanning

The second half of the film is all about retrieving the data from Johnny’s implant without the full set of access codes. Johnny needs to get the data downloaded soon or he will die from the “synaptic seepage” caused by squeezing 320G of data into a system with 160G capacity. The bad guys would prefer to remove his head and cryogenically freeze it, allowing them to take their time over retrieval.

1 of 3: Spider’s Scanners

The implant cable interface won’t allow access to the data without the codes. Bypassing this protection requires three increasingly complicated brain scanners: two of them medical systems and the final one a LoTek hacking device. Although the implant stores data, not human memories, all of these brain scanners work in the same way as the Non-invasive, “Reading from the brain” interfaces described in Chapter 7 of Make It So.

The first system is owned by Spider, a Newark body modification specialist. Johnny sits in a chair, with an open metal framework surrounding his head. There’s a bright strobing light, switching on and off several times a second.

jm-20-spider-scan-a

Nearby a monitor shows a large rotating image of his head and skull, and three smaller images on the left labelled as Scans 1 to 3.

jm-20-spider-scan-b

The largest image resembles a current-day MRI or CT display. It is drawn on a regular flat 2D display rather than as a 3D holographic projection, so it does not qualify as a volumetric projection, even though a current-day computer graphics programmer might call it such. The topmost Scan 1 is the head viewed from above in the same rendering style. Scan 2 in the middle shows a bright spot around the implant, and Scan 3 shows a circuit board, presumably the implant itself. The background is blue, which so far has been common but not as predominant as it is in other science fiction interfaces. Chris suggests this is because blue LEDs were not common in 1995, so the physical lights we see are red and green, and likewise the onscreen graphics use many bright colors.

jm-20-spider-scan-c

Occasionally a purple bar slides across the main image. It perhaps represents some kind of processing update, but since the image is already rotating, that seems superfluous. At one point the color of the main image changes to red, with a matching red sliding bar, but we don’t know why. All the smaller images rotate or flash regularly, with faint ticking sounds as they do.

From this system, Spider is able to tell Johnny that there is a problem with his implant and it must be painful. (Understandably, Johnny is not impressed with this less than helpful diagnosis.) Unlike either the scanner at Newark Airport or the LoTek binoculars, there are no obvious messages or indicators providing this information. But this is a specialised piece of medical technology rather than a public access system, so presumably Spider has sufficient expertise to interpret the displays without needing large popup text.

2 of 3: Hospital Scanner

Spider takes Johnny to a hospital for a more thorough scan. Here the first step is attaching a black flexible strip with various cables around his head. His implant cable is also connected.

jm-21-hospital-scan-b

There isn’t a clear shot of the entire system, but behind Johnny is a CRT monitor and to his left, our right, is a bank of displays that look like electronic oscilloscopes. Since embedded body electronics are common in the world of Johnny Mnemonic, that is probably exactly what they are intended to be. Spider adjusts some controls on these.

jm-21-hospital-scan-c

The oscilloscopes show no text, just green lines and shapes. The CRT behind Johnny is now showing the same head image that we saw at the end of the previous scan.

jm-21-hospital-scan-d

In front of the oscilloscopes is a PC keyboard from the 1990s. In 2021 this will look even older, but this entire hospital is portrayed as a shoestring operation relying on donations and salvage. Spider types on the keyboard, and the CRT changes to show a lot of scrolling text.

jm-21-hospital-scan-e

This is enough for Spider to announce that the “data” is the cure for NAS, the worldwide epidemic disease whose symptoms Jane is showing. Again it’s not clear how he can determine this, as the data is still protected by the access codes. Perhaps the scrolling text is unencrypted metadata in the implant that is more easily retrieved. Given the apparently hazardous life of a mnemonic courier, it would make sense to attach the equivalent of a sticky label to the implant, briefly describing the contents and who they should be delivered to.
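That “sticky label” idea can be sketched as a record format that prefixes the encrypted payload with a small unencrypted header, readable without the access codes. This is purely hypothetical, not anything shown in the film; all names are invented for illustration:

```python
import json

def make_implant_record(encrypted_payload: bytes, description: str, recipient: str) -> bytes:
    """Hypothetical courier record: an unencrypted 'shipping label'
    followed by the encrypted body."""
    label = json.dumps({
        "description": description,   # e.g. what the cargo is
        "recipient": recipient,       # intended delivery destination
        "size_bytes": len(encrypted_payload),
    }).encode("utf-8")
    # A fixed-width length prefix lets a scanner find the label
    # without touching the encrypted body at all.
    return len(label).to_bytes(4, "big") + label + encrypted_payload

def read_label(record: bytes) -> dict:
    """Read only the unencrypted label, as Spider's scanner might."""
    label_len = int.from_bytes(record[:4], "big")
    return json.loads(record[4:4 + label_len].decode("utf-8"))
```

A scanner built this way could report the contents and destination while the payload itself stays locked behind the codes.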

(This is also the point where one has to ask why this valuable data is encrypted and protected to begin with. Using a mnemonic courier for distribution makes sense, to avoid content filters on the Internet. But now the data is here in Newark, with the intended recipients, so why is it so hard to get at? The best answer I can think of is that the scientists wanted to ensure that the mnemonic courier couldn’t keep the data for themselves and sell it to the highest bidder.)

The third of the three brain interfaces warrants its own post, coming up next. 

Drone Status Feed

Oblivion-Desktop-Overview-002
Oblivion-Desktop-DroneMonitor-001

As Vika is looking at the radar and verifying visuals on the dispatched drones with Jack, the symbols for drones 166 and 172 begin flashing red. An alert begins sounding, indicating that the two drones are down.

Oblivion-Desktop-DroneCoordinates

Vika wants to send Jack to drone 166 first. To do this she sends Jack the drone coordinates by pressing and holding the drone symbol for 166, at which point the coordinate data is displayed. She then drags the coordinates with one finger to the Bubbleship symbol and releases. The coordinates immediately display on Jack’s HUD as a target area showing the direction he needs to go.

Simple interactions

Overall, the sequence of interactions for this type of situation is pretty simple and well thought out. Sending coordinates is as simple as:

  1. Tap and hold on the symbol of the target (in this case the drone) using one finger
  2. A summary of coordinates data is displayed around the touchpoint (drone symbol)
  3. Drag data over to the symbol of the receiver (in this case the Bubbleship)

Then on Jack’s side, the position of the coordinates target on his HUD adjusts as he flies toward the drone. Can’t really get much simpler than that.
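The three steps above could be modeled as a small drag-and-drop handler. This is a hypothetical sketch; all class, symbol, and method names are invented for illustration:

```python
class TacticalMap:
    """Hypothetical model of the TETVision tap-hold / drag-to-send gesture."""

    def __init__(self):
        self.symbols = {}        # symbol id -> coordinates
        self.held_data = None    # coordinates picked up by tap-and-hold
        self.sent = []           # (receiver id, coordinates) transmission log

    def tap_and_hold(self, symbol_id):
        # Steps 1-2: holding a symbol surfaces its coordinate data
        self.held_data = self.symbols[symbol_id]
        return self.held_data

    def drag_to(self, receiver_id):
        # Step 3: releasing over a receiver transmits the held data
        if self.held_data is not None:
            self.sent.append((receiver_id, self.held_data))
            self.held_data = None
```

One nice property of this model is that the payload travels with the gesture, so there is no separate “confirm send” step to slow Vika down.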

However…

When Vika initially powers up the desktop, the drone status feed already shows drones 166 and 172 down. This is fine, except that the alert sound and blinking icons on the TETVision don’t appear until Jack has already reached the hydro-rigs, a significant lag between the drone status feed and the TETVision feed. A slight delay in the alert sound on startup would be understandable, since an immediate alarm might suggest something was wrong with the TETVision system itself. But the TETVision drone icons should at the very least already be blinking red on load.
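That last point can be sketched as a startup reconciliation step: the visual cue fires immediately, while the audible alert gets a short grace period. Everything here is hypothetical (class names, the grace period), invented to show the distinction:

```python
class TETVision:
    """Hypothetical stand-in for the TETVision display."""

    def __init__(self):
        self.blinking = set()   # drone ids currently blinking red
        self.alert_at = None    # earliest scheduled alert-sound time

    def blink_red(self, drone_id):
        self.blinking.add(drone_id)

    def schedule_alert_sound(self, when):
        # keep only the earliest pending alert
        if self.alert_at is None or when < self.alert_at:
            self.alert_at = when

STARTUP_ALERT_DELAY_S = 2.0  # assumed grace period, not from the film

def on_startup(drone_statuses, tetvision, now=0.0):
    """Reconcile the status feed with the display on power-up."""
    for drone_id, status in drone_statuses.items():
        if status == "down":
            tetvision.blink_red(drone_id)   # visual cue: immediate on load
            tetvision.schedule_alert_sound(now + STARTUP_ALERT_DELAY_S)
```

Run against Vika’s startup state, both down drones blink at once and the sound follows a couple of seconds later, rather than minutes later.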

Monitoring drone 166

Oblivion-Desktop-DroneMonitor-005

As Jack is repairing drone 166, Vika watches the drone status feed on her desktop. The drone status feed is a dedicated screen to the right of the TETVision feed.

Oblivion-Desktop-DroneMonitor-000

It is divided into two main sections, the drone diagnostic matrix to the left and the drone deployment table to the right.

The drone deployment table lists all drones currently working the security perimeter, with an overview for each: drone ID, a diagram, and operational status. The drone diagnostic matrix shows data such as fuel status and drone positioning along the perimeter, as well as a larger, detailed diagram of the selected drone.

Oblivion-Desktop-DroneMonitor

By looking at the live diagnostics diagram, Vika is able to tell Jack immediately that the central core is out of alignment. As soon as Jack finishes repairing the central core, the diagram updates to show the core back in alignment, and an alert sound pings.

How does the feed know which drone to focus on?

Since there is no direct interaction with this monitor shown in the film, it is assumed to be an informational display. So, how does the feed know which drone to focus on for diagnostics?

One possibility is that Jack transmits data from the ground through his mobile drone programmer handset, which is covered in another post. A better, more agentive alternative: when Vika sends a drone’s coordinates to the Bubbleship, the drone status feed automatically focuses its diagnostics on that drone.

Clear messaging in real-time…almost

Overall, the messaging for the drone status feed is clear and simple. As seen in the drone deployment table, the dataset for an operational drone includes the drone ID number and a rotating view of the drone schematic. If the drone is down, the ID number fades and the schematic is replaced with a flashing red message stating that the drone is offline. When the drone is repaired, the display immediately updates to show that everything is operational again.
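That per-row behavior could be expressed as a tiny render function. This is a hypothetical sketch of the logic just described, not anything from the film’s production:

```python
def render_drone_row(drone_id, operational):
    """Hypothetical deployment-table row: operational drones show a
    rotating schematic; down drones fade the ID and flash a warning."""
    if operational:
        return {"id": drone_id, "view": "rotating_schematic", "flash": False}
    return {"id": f"{drone_id} (faded)", "view": "DRONE OFFLINE", "flash": True}
```

Because the row is a pure function of the drone’s status, the display updates the instant the status feed does, which is exactly the behavior the film shows after the repair.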

This is one of the basic fundamentals of good user interface design. Don’t let the UI get in the way and distract the user.

Keep it simple.