Zed-Eyes

In the world of “White Christmas”, everyone has a networked brain implant called Zed-Eyes that enables heads-up overlays onto vision, personalized audio, and modifications to environmental sounds. The control hardware is a thin metal circle around a metal click button, separated by a black rubber ring. People can buy the device with different color rings, as we alternately see metal, blue, and black versions across the episode.

To control the implant, a person slides a finger (the thumb is easiest) around the rim of a tiny touch device. Because it responds to sliding across its surface, let’s say the device uses a sensor similar to the one used in The Entire History of You (2011) or the IBM Trackpoint.

A thumb slide cycles through a carousel menu. Sliding can happen both clockwise and counterclockwise. It even works through gloves.

Cycling through the HUD carousel menu (animated GIF).

The button selects or executes the highlighted option. The complete list of carousel menu options we see in the episode is: Search, Camera, Music, Mail, Call, Magnify, Block, and Map. The particular options change across scenes, so the menu is either context-aware or customizable. We will look at some of the particular functions in later posts. For now, let’s discuss the “platform” that is Zed-Eyes.

Analysis

There’s not much to discuss about the user interface. The carousel is a mature, if constrained, interface model familiar to anyone who has used an iPod. We know the constraints and benefits of such a system, and the Zed-Eyes content seems to fit this kind of interface well.
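To make that interaction model concrete, here is a minimal sketch of such a carousel in Python. The class, method names, and behavior are my own illustration, not anything specified by the show; the option list is simply the set we see across the episode.

```python
# A minimal, illustrative model of the Zed-Eyes carousel menu.
# Class and method names are assumptions, not anything shown in the episode.

class CarouselMenu:
    def __init__(self, options):
        self.options = list(options)  # swapped per scene: context-aware or customized
        self.index = 0                # currently highlighted option

    def slide(self, clockwise=True):
        """A thumb slide around the rim moves the highlight one step."""
        step = 1 if clockwise else -1
        self.index = (self.index + step) % len(self.options)
        return self.options[self.index]

    def click(self):
        """The center button selects/executes whatever is highlighted."""
        return f"execute: {self.options[self.index]}"

# Example, using the full set of options seen across the episode.
menu = CarouselMenu(["Search", "Camera", "Music", "Mail",
                     "Call", "Magnify", "Block", "Map"])
menu.slide()                  # clockwise       -> "Camera"
menu.slide(clockwise=False)   # counterclockwise -> back to "Search"
menu.click()                  # -> "execute: Search"
```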

Hardware

The main issue with the hardware is that it must be very, very easy to lose or misplace. It would make sense for Zed-Eyes to help you locate the device when you misplace it, but we don’t see a hint of this in the show.

I think the little watch-battery form factor is a bad design. It’s easy to lose, hard to find, and requires a lot of precision to use. Since this exists in a world with very high-fidelity image recognition and visual processing, it would be better to get rid of the input hardware altogether.

Let the user swipe with their thumb across their index finger (or really, any available surface) and have the HUD read that as input. To distinguish real-world interactions that should not have consequence—like swiping dust off a computer—from input meant for the HUD, it could track the user’s visual focal point. When the user’s eyes focus on the empty space in the air right above where they’re swiping, the system knows swiping is meant to affect the interface.
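As a rough sketch of that gating logic, assuming the implant already reports detected thumb swipes and an estimate of where the eyes are converged, the decision might look like this (all types, names, and thresholds below are illustrative guesses, not anything from the show):

```python
# A rough sketch of gaze-gated swipe input. Assumes the implant provides
# (a) detected thumb-swipe gestures and (b) a gaze/vergence estimate.
# Types, names, and thresholds are all illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Swipe:
    hand_position: tuple   # (x, y, z) of the swiping hand, in meters
    direction: str         # "clockwise" or "counterclockwise"

@dataclass
class Gaze:
    focal_point: tuple         # (x, y, z) the eyes are converged on
    on_physical_object: bool   # True if focus lands on a real surface

def is_hud_input(swipe: Swipe, gaze: Gaze, max_offset: float = 0.15) -> bool:
    """Treat the swipe as HUD input only when the eyes are focused on the
    empty space just above the swiping hand; otherwise assume it is an
    ordinary real-world gesture (e.g. brushing dust off a keyboard)."""
    if gaze.on_physical_object:
        return False
    dx, dy, dz = (g - h for g, h in zip(gaze.focal_point, swipe.hand_position))
    above_hand = 0 < dy < max_offset                      # focus just above the hand
    nearby = abs(dx) < max_offset and abs(dz) < max_offset
    return above_hand and nearby

# Example: focus hovering ~10 cm above the swiping hand counts as HUD input.
swipe = Swipe(hand_position=(0.3, -0.4, 0.5), direction="clockwise")
gaze = Gaze(focal_point=(0.3, -0.3, 0.5), on_physical_object=False)
print(is_hud_input(swipe, gaze))   # -> True
```

The specific geometry doesn’t matter; the point is that intent is inferred from where attention is focused rather than from touching a dedicated object.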

With this kind of interaction there would be no object to lose, and it would of course save whatever entity provides this service the cost of the hardware and its maintenance.

We must note that such a design might not play well cinematically, as viewers might not understand what was happening at first. But the hardware itself is not critical to understanding the plot-critical effects of using the technology.

Cyborgs in social space

A last question is about the invisibility of the technology. This can cause problems when a user who can hear is functionally deaf because they are listening to loud music, and the people around them can’t tell. Someone could be speaking to the user and believe their non-response is disrespect. It could cause safety problems as, say, a bicyclist barrels toward them on a sidewalk, ringing their bell and expecting the user to move. And it can enable privacy abuse, as a user can take pictures in circumstances that should be private.

Joe, the moment he is taking a picture of Beth.

One solution would be to make the presence of the tech and its interactions quite visible: glowing pupils and large, obvious gestural controls, for example. But in a world where everyone has the technology, Zed-Eyes can simply limit photography to permitted places and times, and according to the preferences of the people in the photograph. If someone is listening to music and functionally deaf, a real-time overlay could inform the people around them: “This guy is listening to music.” If a place is private, the picture option could be disabled, with feedback to the user: “Sorry, pictures are not allowed here.”
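As a small sketch of what that policy layer might look like, imagine something like the following. The fields, messages, and function names are all my own assumptions for illustration.

```python
# An illustrative sketch of permission-gated photos and status overlays.
# Nothing here is confirmed by the episode; names and policies are assumptions.

from dataclasses import dataclass

@dataclass
class Person:
    name: str
    allows_photos: bool = True
    listening_to_music: bool = False

@dataclass
class Place:
    name: str
    photos_permitted: bool = True

def can_take_photo(place: Place, subjects: list) -> tuple:
    """Allow a photo only if the place permits it and every person who
    would appear in the frame permits it; otherwise return feedback."""
    if not place.photos_permitted:
        return False, "Sorry, pictures are not allowed here."
    refusers = [p.name for p in subjects if not p.allows_photos]
    if refusers:
        return False, f"Sorry, {', '.join(refusers)} does not permit photos."
    return True, ""

def status_overlay(person: Person) -> str:
    """A real-time overlay other users would see near this person."""
    return "Listening to music" if person.listening_to_music else ""

# Example
beth = Person("Beth", allows_photos=False)
print(can_take_photo(Place("office party"), [beth]))
# -> (False, "Sorry, Beth does not permit photos.")
```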

The visibility we want for ubiquitous technology can be virtual, and provide feedback to everyone involved.

11 thoughts on “Zed-Eyes”

  1. very nice comments. i really agree with you that the hardware is completely unnecessary and super easy to lose; on the other hand, having a haptic element is a big plus compared to today’s non-haptic tech (which has to mimic little vibrations to make up for that lack of contact)

  2. Maybe it doesn’t matter that the controller is easy to lose, because firstly controllers are cheap and secondly you can use any controller with your implant?

    The implant is hooked into your neural system, so it can sense when you are touching a controller. The implant automatically “pairs” with it to receive input through your nerves or just skin conductivity.

    • The similar device in The Entire History of You works sort of this way. When Liam wants to see something from Ffion’s recordings, he hands her a controller and she has to manipulate it. It’s not absolutely clear whether the controllers are interchangeable or not.

  3. I wish they’d address this in the show, I imagine that there’s something physical they can do, maybe blink three times quickly or something, that then causes their HUD to show where the device is…

  4. Pingback: Zed-Eyes: Block | Sci-fi interfaces

  5. What if the little dongle is not an active device at all, but just a sort of QR object or motion capture marker that the zed-eye system recognizes as a virtual input device? Any appropriately sized/colored object is recognized as a controller when it is seen in the user’s hand. The system is watching the movements of the user’s thumb against the disk, not getting a transmission from the disk itself. It’s nice to have a little clicker or spinny wheel on there like on a fidget toy, to keep your fingers entertained, but a simple cardboard disk or inert plastic chip would work just as well.

    Why have a physical object at all, and not just read finger movements? Well, it signals to others that you are interacting with your zed-eye system, and maybe it cuts down on inadvertent inputs, since it requires focus on the hands and a non-hand object of particular size and shape.
