Identity Processing Program of America

After his initial arrest, Joe is led by a noose stick (and a police officer speaking some devolved version of copspeak) to a machine to get an identity tattoo. Joe sits in the chair and a synthesized voice says, “Welcome to the Identity Processing Program of America. Please insert your forearm into the forearm receptacle.” Joe does as instructed and it locks his arm into place. A screen in front of him shows the legend “Identity Processing Program of America” superimposed over a USA pattern made up of company names and amputated Carl’s Jr. star logos. Five rectangles across the top are labeled: System, Identity, Verify, Imprint, and Done.

Idiocracy_ippa02
It prompts him to “…speak your name as it appears on your current federal identity card, document number G24L8.” Joe says, “I’m not sure if—” The machine interprets this as input and blinks the name as it says, “You have entered the name ‘Not Sure.’ Is this correct, Not Sure?”

Idiocracy_ippa05
Joe tries to correct it, saying, “No…it’s not correct.” On the word “correct” it dings and continues, “Thank you. ‘Not’ is correct.” “Not” stops blinking in the interface.

“…Is ‘Sure’ correct?” Joe patiently tries to correct it. “No, it’s not. My name is Joe.” It blurts out some error beeps. “You’ve already confirmed your first name is Not. Please confirm your last name, ‘Sure.’”

Joe: My…My last name is not Sure.
Kiosk: Thank you, Not Sure.
Joe: No. What I mean is my name is Joe.
Kiosk: Confirmation is complete. Please wait while I tattoo your new identity on your arm.
The machine begins to shake and make noises. Joe says, “Wait a second. Can we start over? Can I cancel this?” He sees a progress bar, labeled “Tattoo In Progress…”

Idiocracy_ippa06

“Can we cancel this and just go back to the beginning? They’re gonna tat—Ow. Could I speak to your supervisor? Ow!” While he’s trying to wrench his arm free, the machine instructs him, “Please hold still for your photograph.” It flashes an unflattering picture of him and the clamp on his arm releases. He removes his arm to see the new tattoo. The screen shows him his identity card.

Idiocracy_ippa08
In exasperation from the whole ordeal he mumbles, “Oh, that’s fuckin’ great.”

This scene is played for vaudevillian yuks, but it does illustrate some problems with conversational design. If you’re interested in this topic, let me make an early shout-out to the book Conversational Design by Erika Hall, published earlier this year.

conversational_design.jpg

Understanding intent

When it hears Joe say “I’m not sure…” it takes it as a literal answer. It does not recognize that Joe isn’t answering the question but is operating one meta-level up: he has a question about the question being asked. Humans are pretty good at recognizing when another human is breaking the usual logic of adjacency pairs and not providing an answer to the question. (This was discussed in Make It So, in Chapter 5, “Gestural Interfaces,” in relation to Minority Report.) Computers have a harder time of it. If this kiosk understood it, it would know that he’s not answering the question, and resolve what conversation analysis calls “the expansion” before returning to the question. (Disclosure: I work there and know the guy who wrote that.)

Aside: Douglas Hofstadter, in his mind-expanding book Gödel, Escher, Bach, writes about the trick question “Have you stopped beating your spouse?” for which neither “yes” nor “no” is a good answer, yet they are the only “correct” answers according to the binary frame of the question. In that text he introduces the Eastern answer “mu” (or “wu” in Chinese languages), which means roughly “the answer does not fit the question.” So it can be said that computers have a hard time understanding mu.

mu-seal.png


Designers of digital assistants have to wrangle with this, but it’s rarely a problem that the individual designer can solve alone. Language and naming are informal, slippery notions as far as computers are concerned, so it’s understandably a hard problem. It’s entirely possible that someone has chosen “Not Sure” as a name, but it’s highly unlikely. And that’s another problem.

Understanding likelihood

Understanding intent might be a little easier if the computer could recognize that “Not” and “Sure” are unlikely values for a name. (Even in Idiocracy, where names tend to be brands like Lexus, Frito, and Biggiez. More on this later.) If it knew that, it would have low confidence that it “heard” correctly, and shift into a repair or at least a clarification mode: “‘Not’ would be a very uncommon name. Let me be extra careful here…” It could even shift into a more deliberate mode of input, like a keyboard, or ask him to spell his name out (or, you know, cancel the whole thing).
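To make the idea concrete, here’s a minimal sketch of how a kiosk might score name tokens against a crude likelihood prior and drop into a repair mode when confidence is low. This is my own invention, not anything from the film or from a real product: the name list, threshold, and prompts are all hypothetical.

```python
# Hypothetical sketch: route low-likelihood name guesses into a
# repair/clarification mode instead of confirming them outright.
COMMON_NAMES = {"joe", "rita", "frito", "lexus"}   # stand-in for a frequency list
FUNCTION_WORDS = {"not", "sure", "no", "yes", "maybe"}

def name_confidence(token: str) -> float:
    """Crude prior: function words are almost never names; common names score high."""
    t = token.lower()
    if t in FUNCTION_WORDS:
        return 0.05
    if t in COMMON_NAMES:
        return 0.95
    return 0.5  # unknown token: plausible, neither confirm nor reject outright

def next_prompt(tokens: list[str], threshold: float = 0.4) -> str:
    low = [t for t in tokens if name_confidence(t) < threshold]
    if low:
        # Low confidence: shift to a more deliberate input mode.
        return f"'{low[0]}' would be a very uncommon name. Please spell your name."
    return f"You have entered the name '{' '.join(tokens)}.' Is this correct?"

print(next_prompt(["Not", "Sure"]))    # falls into the spell-it-out repair mode
print(next_prompt(["Joe", "Bauers"]))  # proceeds to the normal confirmation
```

Real systems derive confidence from the speech recognizer and language model rather than a hand-built list, but the design move is the same: below some threshold, stop confirming and start clarifying.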

Catching “not”

When the kiosk asks for confirmation, it hears Joe say, “No, it’s not correct,” registers the keyword “correct,” but misses the function word “not,” which completely flips the meaning.

Again, avoiding this speech-to-text error would be a developer’s task, but dealing with the back-and-forth would definitely fall to a designer. When clarifying low-confidence input, users should be able to provide discrete high-confidence feedback.
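As a sketch of the difference, here’s a toy confirmation parser (entirely hypothetical; real systems use trained NLU intents rather than keyword lists) that checks for negators before affirmers, so a single “not” flips the reading instead of being ignored.

```python
# Hypothetical sketch: don't keyword-match "correct" while ignoring negation.
import re

NEGATORS = {"no", "not", "isn't", "wrong", "incorrect"}
AFFIRMERS = {"yes", "yeah", "yep", "right", "correct"}

def parse_confirmation(utterance: str) -> str:
    """Return 'yes', 'no', or 'unclear' for a confirmation turn."""
    words = re.findall(r"[a-z']+", utterance.lower())
    negated = any(w in NEGATORS for w in words)
    affirmed = any(w in AFFIRMERS for w in words)
    if negated:        # a single "not" flips the meaning, so check it first
        return "no"
    if affirmed:
        return "yes"
    return "unclear"   # e.g. "My name is Joe" -- neither; ask again

print(parse_confirmation("No, it's not correct"))  # -> no
```

The kiosk in the film effectively implements only the `affirmed` branch, which is how “No, it’s not correct” becomes a yes.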

Joe could, for instance, be shown the kiosk’s (stupid) understanding of his input, and—since this has a pretty permanent consequence—the kiosk could wait for his confirmation, providing a simple option to redo the input so he can try some other tactic for getting “Joe Bauers” in there until he gets it right.
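A minimal sketch of that confirm-or-redo loop might look like this (the actions and return strings are all invented for illustration): nothing irreversible happens until an explicit confirmation arrives.

```python
# Hypothetical sketch: gate an irreversible action behind explicit
# confirmation, with redo and cancel paths always available.
def identity_flow(turns):
    """turns: iterable of ('input', name) or ('confirm'/'redo'/'cancel', None)."""
    name = None
    for action, value in turns:
        if action == "input":
            name = value
        elif action == "redo":
            name = None                  # throw away the bad parse, start over
        elif action == "cancel":
            return "cancelled"
        elif action == "confirm" and name:
            return f"tattooing {name}"   # only now is the action permanent
    return "awaiting input"

print(identity_flow([("input", "Not Sure"), ("redo", None),
                     ("input", "Joe Bauers"), ("confirm", None)]))
```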
But, of course, this is Idiocracy, and Joe is stuck with it.

Idiocracy_ippa09
Sorry, Joe.

But we’re not

I mean, Republicans have done all they can to suppress votes that don’t favor them. They don’t care about democracy or the will of the American people as much as they do about staying in power to serve their 1% overlords. But research shows that people who have a plan to vote are more likely to actually do it, and if we all do it, we can overwhelm them with sheer numbers.
There are lots of tools to help you make a plan, but let’s send some traffic to our friends at Planned Parenthood. They’ve been under a lot of pressure during this administration. Maybe throw them a few shekels while you’re there. Not for the election, but because you’re a good person. And vote all of them out.

20161101-GOTV-Make-Plan-Tool-BLOG.png

Cyberspace: Beijing Hotel

After selecting its location from a map, Johnny is now in front of the virtual entrance to the hotel. The virtual Beijing has a new color scheme, mostly orange with some red.

jm-33-hotel-a

The “entrance” is another tetrahedral shape made from geometric blocks. It is actually another numeric keypad. Johnny taps the blocks to enter a sequence of numbers.

The tetrahedral keypad

jm-33-hotel-b

Note that there can be more than one digit within a block. I mentioned earlier that it can be difficult to “press” with precision in virtual reality due to the lack of tactile feedback. Looking closely, here the fingers of Johnny’s “hands” cast a shadow on the pyramid, making depth perception easier.

The Memory Doubler

In Beijing, Johnny steps into a hotel lift and pulls a small package out of his pocket. He unwraps it to reveal the “Pemex MemDoubler.”

jm-4-memdoubler-a

Johnny extends the cable from the device and plugs it into the implant in his head. The socket glows red once the connection is made.

jm-4-memdoubler-b-adjusted


The Drone

image01

Each drone is a semi-autonomous flying robot armed with large cannons, heavy armor, and a wide array of sensor systems. When in flight mode, the weapon arms retract. The arms extend when the drone senses a threat.

image02

Each drone is identical in make and temperament, distinguishable only by large white numbers on its “face.” The armored shell is about a meter in diameter (just smaller than Jack). Internal power is supplied by a small battery-like device that contains enough energy to start a nuclear explosion inside a skyscraper-sized hydrogen distiller. It is not obvious whether the weapons are energy- or projectile-based.

The HUD

The Drone Interface is a HUD that shows the drone’s vision and secondary information about its decision-making process. The HUD appears on all video from the drone’s primary camera. Labels appear in legible human English.

Video feeds from the drone can be in one of several modes that vary according to what kind of searching the drone is doing. We never see the drone use more than one mode at once. These modes include visual spectrum, thermal imaging, and a special ‘tracking’ mode used to follow Jack’s bio signature.

Occasionally, we also see the drone’s primary objective on the HUD, as an overlay on the main view that reads “TERMINATE” or “CLEAR.”

image00

The Excessive Machine

When Durand-Durand captures Barbarella, he places her in a device which he calls the “Excessive Machine.” She sits in a reclining seat, covered up to the shoulders by the device. Her head rests on an elaborate red leather headboard. Durand-Durand stands at a keyboard, built into the “footboard” of the machine, facing her.

The keyboard resembles that of an organ, but with transparent vertical keys beneath which a few colored lights pulse. Long silver tubes stretch from the sides of the device to the ceiling. Touching the keys (they do not appear to depress) produces the sound of a full orchestra and causes motorized strips of metal to undulate in a sine wave above the victim.

When Durand-Durand reads the strange sheet music and begins to play “Sonata for Executioner and Various Young Women,” the machine (via means hidden from view) removes Barbarella’s clothing piece by piece, ejecting them through a tube in the side of the machine near the floor. Then in an exchange Durand-Durand reveals its purpose…

Barbarella: It’s sort of nice, isn’t it?
Durand-Durand: Yes. It is nice. In the beginning. Wait until the tune changes. It may change your tune as well.
Barbarella: Goodness, what do you mean?
Durand-Durand: When we reach the crescendo, you will die of pleasure. Your end will be swift, but sweet, very sweet.

As Durand-Durand intensifies his playing, Barbarella writhes in agony/ecstasy. But despite his most furious playing, he does not kill Barbarella. Instead his machine fails dramatically, spewing fire and smoke out of the sides as its many tubes burn away. Barbarella is too much woman for the likes of his technology.

I’m going to disregard this as a device for torture and murder, since I wouldn’t want to improve such a thing, and that whole premise is kind of silly anyway. Instead I’ll regard it as a BDSM sexual device, in which Durand-Durand is a dominant, seeking to push the limits of an (informed, consensual) submissive using this machine. It’s possible that part of the scene is a demonstration of prowess on a standardized, difficult-to-use instrument. If so, then a critique wouldn’t matter. But if not: since the keys don’t move, the only variables he’s controlling are touch duration and the vertical placement of his fingers. (Horizontal position on each key seems unlikely to matter.) I’d want to provide the player some haptic feedback to detect and correct slipping finger placement, letting him or her maintain attention on the sub, who is, after all, the point.

Alien Stasis Chambers

The alien stasis chambers have recessed, backlit touch controls. The shape of each looks like a letterform. (Perhaps in the Proto-Indo-European language that David was studying at the start of the film?) David is able to run his fingers along and tap these character shapes in particular sequences to awaken the alien sleeping within.

vlcsnap-00003

The writing/controls take up quite a bit of room, on both the left and right sides of the chamber near the occupant’s head. It might seem a strange decision to have controls placed this way, since a single user might have to walk around the chamber to perform tasks. But a comparison of the left and right sides shows that the controls are identical, and so are purposefully redundant. This way it doesn’t matter which side of the chamber a caretaker is on; he can still operate the controls. Two caretakers might have challenges “walking over” each other’s commands, especially given the missing feedback (see below).

Prometheus-295

Having the writing/controls spread over such a large area does seem error-prone. In fact, in the image above, you can see that David’s left hand is resting with two fingers “accidentally” in the controls. (His other hand was doing the button pressing.) Of course this could be written off as “the technology is not made for us, it’s made for an alien race,” but the movie insists these aliens and humans share matching DNA, so apart from being larger in stature, they’re not all that different.

Two things seem missing in the interface. The first is simple feedback. When David touches the buttons, they do not provide any signal that his touch has been received. If he didn’t apply enough pressure to register his touch, he wouldn’t have any feedback to know that until an error occurred. The touch walls had this feedback, so it seems oddly missing here.

The second thing missing is some status indicator for the occupant. Unless that information is available on wearable displays, having it hidden forces a caretaker to go elsewhere for the information or rely solely on observation, which seems far beneath the technological capabilities seen so far in the complex. See the Monitoring section in Chapter 12 of Make it So for other examples of medical monitoring.

MedPod

Early in the film, when Shaw sees the MedPod for the first time, she comments to Vickers that, “They only made a dozen of these.” As she caresses its interface in awe, a panel extends as the pod instructs her to “Please verbally state the nature of your injury.”

Prometheus-087

The MedPod is a device for automated, generalized surgical procedures, operable by the patient him- (or her-, kinda, see below) self.

When in the film Shaw realizes that she’s carrying an alien organism in her womb, she breaks free from crewmembers who want to contain her, and makes a staggering beeline for the MedPod.

Once there, she reaches for the extended touchscreen and presses the red EMERGENCY button. Audio output from the pod confirms her selection, “Emergency procedure initiated. Please verbally state the nature of your injury.” Shaw shouts, “I need cesarean!” The machine informs her verbally that, “Error. This MedPod is calibrated for male patients only. It does not offer the procedure you have requested. Please seek medical assistance else–”

Prometheus-237

I’ll pause the action here to address this. What sensors and actuators are this gender-specific? Why can’t it offer gender-neutral alternatives? Sure, some procedures might need anatomical knowledge of particularly gendered organs (say…emergency circumcision?), but given…

  • the massive amounts of biological similarity between the sexes
  • the need for any medical device to deal with a high degree of biological variability in its subjects anyway
  • the fact that most procedures are gender-neutral

…this is a ridiculous interface plot device. If Dr. Shaw can issue a few simple system commands that work around this limitation (as she does in this very scene), then the machine could have just done without the stupid error message. (Yes, we get that it’s a mystery why Vickers would have her MedPod calibrated to a man, but really, that’s a throwaway clue.) Gender-specific procedures can’t take up so much room in memory that it was simpler to cut the potential lives it could save in half. You know, rather than outfit it with another hard drive.

Aside from the pointless “tension-building” wrong-gender plot point, there are still interface issues with this step. Why does she need to press the emergency button in the first place? The pod has a voice interface. Why can’t she just shout “Emergency!” or even better, “Help me!” Isn’t that more suited to an emergency situation? Why is a menu of procedures the default main screen? Shouldn’t it be a prompt to speak, and have the menu there for mute people or if silence is called for? And shouldn’t it provide a type-ahead control rather than a multi-facet selection list? OK, back to the action.

Desperate, Shaw presses a button that grants her manual control. She states “Surgery abdominal, penetrating injuries. Foreign body. Initiate.” The screen confirms these selections amongst options on screen. (They read “DIAGNOS, THERAP, SURGICAL, MED REC, SYS/MECH, and EMERGENCY”)

The pod then swings open saying, “Surgical procedure begins,” and tilting itself for easy access. Shaw injects herself with anesthetic and steps into the pod, which seals around her and returns to a horizontal position.

Why does Shaw need to speak in this stilted speech? In a panicked or medical emergency situation, proper computer syntax should be the last thing on a user’s mind. Let the patient shout the information however they need to, like “I’ve got an alien in my abdomen! I need it to be surgically removed now!” We know from the Sonic chapter that the use of natural language triggers an anthropomorphic sense in the user, which imposes some other design constraints to convey the system’s limitations, but in this case, the emergency trumps the needs of affordance subtleties.

Once inside the pod, a transparent display on the inside states that, “EMERGENCY PROC INITIATED.” Shaw makes some touch selections, which runs a diagnostic scan along the length of her body. The terrifying results display for her to see, with the alien body differentiated in magenta to contrast her own tissue, displayed in cyan.

Prometheus-254

Prometheus-260

Shaw shouts, “Get it out!!” It says, “Initiating anesthetics” before spraying her abdomen with a bile-yellow local anesthetic. It then says, “Commence surgical procedure.” (A note for the grammar nerds here: Wouldn’t you expect a machine to maintain a single part of speech for consistency? The first, “Initiating…” is a gerund, while the second, “Commence,” is an imperative.) Then, using lasers, the MedPod cuts through tissue until it reaches the foreign body. Given that the lasers can cut organic matter, and that the xenomorph has acid for blood, you have to hand it to the precision of this device. One slip could have burned a hole right through her spine. Fortunately it has a feather-light touch. Reaching in with a speculum-like device, it removes the squid-like alien in its amniotic sac.

OK. Here I have to return to the whole “ManPod” thing. Wouldn’t a scan have shown that this was, in fact, a woman? Why wouldn’t it stop the procedure if it really couldn’t handle working on the fairer sex? Should it have paused to have her sign away insurance rights? Could it really mistake her womb for a stomach? Wouldn’t it, believing her to be a man, presume the whole womb to be a foreign body and try to perform a hysterectomy rather than a delicate caesarian? ManPod, indeed.

Prometheus-265

After removing the alien, it waits around 10 seconds, showing it to her and letting her yank its umbilical cord, before she presses a few controls. The MedPod seals her up again with staples and opens the cover to let her sit up.

She gets off the table, rushes to the side of the MedPod, and places all five fingertips of her right hand on it, quickly twisting her hand clockwise. The interface changes to a red warning screen labeled “DECONTAMINATE.” She taps this to confirm and shouts, “Come on!” (Her vocal instruction does not feel like a formal part of the procedure and the machine does not respond differently.) To decontaminate, the pod seals up and a white mist fills the space.

OK. Since this is a MedPod, and it has something called a decontamination procedure, shouldn’t it actually test to see whether the decontamination worked? The user here has enacted emergency decontamination procedures, so it’s safe to say that this is a plague-level contagion. That doesn’t say to me: spray it with a can of Raid and hope for the best. It says, “Kill it with fire.” We just saw, 10 seconds ago, that the MedPod can do a detailed, alien-detecting scan of its contents, so why on LV-223 would it not check to see if the kill-it-now-for-God’s-sake procedure had actually worked, and warn everyone within earshot that it hadn’t? Because someone needs to take additional measures to protect the ship, and take them, stat. But no, MedPod tucks the contamination under a white misty blanket, smiles, waves, and says, “OK, that’s taken care of! Thank you! Good day! Move along!”

For all of the goofiness that is this device, I’ll commend it for two things. The first is for pushing forward the notion of automated medicine. Yes, in this day and age, it’s kind of terrifying to imagine devices handling something as vital as life-saving surgery, but people in the future will likely find it terrifying that today we’d rather trust the task to an error-prone, bull-in-a-china-shop human. And, after all, the characters have entrusted their lives to an android while they were in hypersleep for two years, so clearly that’s a thing they do.

Second, the gestural control to access the decontamination is well considered. It is a large gesture, requiring no great finesse on the part of the operator to find and press a sequence of keys, and one that is easy to execute quickly and in a panic. I’m not sure what percentage of procedures need the back-up safety of a kill-everything-inside mode, but presuming one is ever needed, this is a fine gesture to initiate that procedure. In fact, it could have been used in other interfaces around the ship, as we’ll see later with the escape pod interface.

I have the sense that in the original script, Shaw had to do what only a few very bad-ass people have been willing to do: perform life-saving surgery on themselves in the direst circumstances. Yes, it’s a bit of a stretch since she’s primarily an anthropologist and astronomer in the story, but give a girl a scalpel, hardcore anesthetics, and an alien embryo, and I’m sure she’ll figure out what to do. But pushing this bad-assery off to an automated device, loaded with constraints, ruins the moment and changes the scene from potentially awesome to just awful.

Given the inexplicable man-only settings, requiring a desperate patient to recall FORTRAN-esque syntax for spoken instructions, and the failure to provide any feedback about the destruction of an extinction-level pathogen, we must admit that the MedPod belongs squarely in the realm of goofy narrative technology and nowhere near the real world as a model of good interaction design.