Carl’s Junior

In addition to its registers, OmniBro also makes fast-food vending machines. The one we see in the film is a free-standing kiosk with five main panels, one for each of the angry star’s severed arms. A nice touch that flies by in the edit is that the roof of the kiosk is a giant star, but one of the arms has broken off and fallen onto a car. Its owners have clearly just abandoned it, and things have been like this long enough for the car to rust.

Idiocracy_omnibro09.png

A description

Each panel in the kiosk has:

  • A small screen and two speakers just above eye level
  • Two protruding, horizontal slots of unknown purpose
  • A metallic nozzle
  • A red laser barcode scanner
  • A 3×4 panel of icons (similar in style to what’s seen in the St. God’s interfaces) in the lower left. Sadly we don’t see these buttons in use.

But for the sake of completeness, the icons are, in western reading order:

  • No money, do not enter symbol, question
  • Taco, plus, fries
  • Burger, pizza, sundae
  • Asterisk, up-down, eye

The bottom has an illuminated dispenser port.

Idiocracy_omnibro20

In use

Joe approaches the kiosk and, hungry, watches to figure out how people get food. He hears a transaction in progress, with the kiosk telling the customer, “Enjoy your EXTRA BIG ASS FRIES.” She complains, saying, “You didn’t give me no fries. I got an empty box.”

She reaches into the take-out port and fishes inside to see if the food just got stuck. The kiosk asks her, “Would you like another EXTRA BIG ASS FRIES?” She replies loudly into the speaker, “I said I didn’t get any.” The kiosk ignores her and continues, “Your account has been charged. Your balance is zero. Please come back when you can afford to make a purchase.” The screen shows her balance as a big dollar sign with a crossout circle over it.

Frustrated, she bangs the panel, and a warning screen pops up, reading, “WARNING: Carl’s Junior frowns upon vandalism.”

Idiocracy_omnibro27
She hits it again, saying, “Come on! My kids’re starving!” (Way to take it super dark, there, Judge.) Another screen reads, “Please step back.”

Idiocracy_omnibro28

A mist sprays from the panel into her face as the voice says, “This should help you calm down. Please come back when you can afford to make a purchase! Your kids are starving. Carl’s Junior believes no child should go hungry. You are an unfit mother. Your children will be placed in the custody of Carl’s Junior.”

She stumbles away, and the kiosk wraps up the whole interaction with the tagline, “Carl’s Junior: Fuck you. I’m eating!” (This treatment of brands, it should be noted, is why the film never got broad release. See the New York Times article, or, if you can’t get past the paywall, the Mental Floss listicle, number seven.)

Joe approaches the kiosk and sticks a hand up the port. The kiosk recognizes the newcomer and says, “Welcome to Carl’s Junior. Would you like to try our EXTRA BIG ASS TACO, now with more MOLECULES?” Then the cops arrive.

***

Critique

Now, I don’t think Judge is saying that automation is stupid. (There are a few automated technologies in the film that work just fine.) I think he’s noting that poorly designed—and inhumanely designed—systems are stupid. It’s a reminder for all of us to consider the use cases where things go awry, and design for graceful degradation. (Noting the horrible pun so implied.) If we don’t, people can lose money. People can go hungry. The design matters.

Idiocracy_omnibro29

Spoiler alert: If you’re worried about the mom, the police arrive in the next beat and arrest him, so at least she’s not arrested.

I have questions

The interface inputs raise a lot of questions that are just unanswerable. Are there only four things on the menu? Why are they distributed amongst other categories of icons? Is “plus” the only customization? Does it mean another of the same thing I just ordered, or a larger size? What have I ordered already? How much is my current total? Do I have enough to pay for what I have ordered? There are all sorts of purchase-path best practices being violated or left unaddressed by the scene. Of course. It’s not a demo. A lot of sci-fi scenes involve technology breaking down.

Graceful degradation

Just to make sure I’m covering the bases here, let me note what I hope is obvious. No automation system or narrow AI is perfect. Designers and product owners must presume that there will be times when the system fails, and the system itself does not know it. The kiosk thinks it has delivered EXTRA BIG ASS FRIES, but it’s wrong. It’s delivered an empty box. It still charged her, so it’s robbed her.

We should always be testing, finding, and repairing these failure points in the things we help make. But we should also design an easy recourse for when the automation fails and doesn’t know it. This could be a human attendant (or even a button that connects to a remote human operator) who could check the video feed to see that the woman is telling the truth, mark that panel as broken, and use overrides to get her EXTRA BIG ASS FRIES from one of the functioning panels, or refund her money to, I guess, go get a tub of Flaturin instead? (The terrible nutrition of Idiocracy is yet another layer for some speculative sci-fi nutrition blog to critique.)
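The logic of that recourse can be sketched in a few lines. This is a minimal, hypothetical model—nothing in the film tells us how the kiosk is actually built, and every name here (`DropSensor`, `Account`, the return codes) is invented for illustration. The point is the ordering: check funds before dispensing, confirm delivery before charging, and escalate to a human when the machine can’t confirm its own success.

```python
# Hypothetical kiosk logic: charge only for food the machine can
# confirm it delivered, and escalate to a human operator otherwise.

from dataclasses import dataclass


@dataclass
class Account:
    balance: float


class DropSensor:
    """Stand-in for a beam sensor in the dispenser port."""

    def __init__(self, tripped: bool):
        self.tripped = tripped

    def item_detected(self) -> bool:
        return self.tripped


def dispense(account: Account, price: float, sensor: DropSensor) -> str:
    if account.balance < price:
        # Prevention beats remedy: refuse before dispensing, not after.
        return "insufficient_funds"
    # ...actuate the dispenser here...
    if not sensor.item_detected():
        # The machine failed and knows it: don't charge, flag the
        # panel as broken, and summon a human for recourse.
        return "dispense_failed_escalate_to_operator"
    account.balance -= price
    return "ok"
```

In the film’s scenario, the drop sensor would not trip, the mom would keep her money, and a human would be on the way—instead of a face full of calming mist.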

Idiocracy_omnibro25

Again, privacy. Again, respectfulness.

The financial circumstances of a customer are not the business of any other customer. The announcement and unmistakable graphic could be an embarrassment. Adding the disingenuous 🙁 emoji when it was the damned machine’s fault only adds insult to injury. We have to make sure not to get cute when users are faced with genuine problems.

Benefit of the doubt

Another layer of the stupid here is that OmniBro has the sensors to detect frustrated customers. (Maybe it’s a motion sensor in the panel or dispenser port. Possibly emotion detection in the voice input.) But what it does with that information is revolting. Instead of presuming that the machine has made some irritating mistake, it presumes a hostile customer, and not only gasses her into a stupor while it calls the cops, it is somehow granted the authority to take her children as indentured servants for the problems it helped cause. If you have a reasonable customer base, it’s better for the customer experience, for the brand, and for the society in which it operates to give customers the benefit of the doubt rather than the presumption of guilt.

Prevention > remedy

Another failure of the kiosk is that it discovers that she has no money only after it believes it has dispensed EXTRA BIG ASS FRIES. As we see elsewhere in the film, the OmniBro scanners work accurately at a huge distance even while the user is moving along at car speeds. It should be able to read customers in advance to know that they have no ability to pay for food. It should prevent problems rather than try (and, as it does here, fail) to remedy them. At the most self-serving level, this helps avoid the potential loss or theft of food.

At a collective level, a humane society would still find some way to not let her starve. Maybe it could automatically deduct from a basic income. Maybe it could provide information on where a free meal is available. Maybe it could just give her the food and assign a caseworker to help her out. But the citizens of Idiocracy abide a system where, instead, children can be taken away from their mothers and turned into indentured servants because of a kiosk error. It’s one thing for the corporations and politicians to be idiots. It’s another for all the citizens to be complicit in that, too.

Idiocracy_omnibro30

Fighting American Idiocracy

Since we’re on the topic of separating families: since the fascist, racist “zero-tolerance” policy was enacted as a desperate attempt to do something in light of Trump’s failed and ridiculous border-wall promise, around 3,000 kids were horrifically and forcibly separated from their families. Most have been reunited, but as of August there were at least 500 children still detained, despite the efforts of many dedicated resisters. The 500 include, according to the WaPo article linked below, 22 kids under 5. I can’t imagine the permanent emotional trauma of being ripped from their families. The Trump administration chose to pursue scapegoating to rile a desperate, racist base. The government had no reunification system. The Trump administration ignored Judge Sabraw’s court-ordered deadline to reunite these families. The GOP largely backed him on this. They are monsters. Vote them out. Early voting is open in many states. Do it now so you don’t miss your chance.

ACLU.png

The Excessive Machine

Barbarella-104

When Durand-Durand captures Barbarella, he places her in a device he calls the “Excessive Machine.” She sits in a reclining seat, covered up to the shoulders by the device. Her head rests on an elaborate red leather headboard. Durand-Durand stands at a keyboard, built into the “footboard” of the machine, facing her.

Barbarella-103

The keyboard resembles that of an organ, but with transparent vertical keys beneath which a few colored lights pulse. Long silver tubes stretch from the sides of the device to the ceiling. Touching the keys (they do not appear to depress) produces the sound of a full orchestra and causes motorized strips of metal to undulate in a sine wave above the victim.

Barbarella-097 Barbarella-099

When Durand-Durand reads the strange sheet music and begins to play “Sonata for Executioner and Various Young Women,” the machine (via means hidden from view) removes Barbarella’s clothing piece by piece, ejecting it through a tube in the side of the machine near the floor. Then, in an exchange, Durand-Durand reveals its purpose…

Barbarella: It’s …sort of nice, isn’t it?
Durand-Durand: Yes. It is nice. …In the beginning. Wait until the tune changes. It may change your tune as well.
Barbarella: Goodness, what do you mean?
Durand-Durand: When we reach the crescendo, you will die… of pleasure. Your end will be swift, but sweet, very sweet.

As Durand-Durand intensifies his playing, Barbarella writhes in agony/ecstasy. But despite his most furious playing, he does not kill Barbarella. Instead his machine fails dramatically, spewing fire and smoke out of the sides as its many tubes burn away. Barbarella is too much woman for the likes of his technology.

Barbarella-106

I’m going to disregard this as a device for torture and murder, since I wouldn’t want to improve such a thing, and the whole premise is kind of silly anyway. Instead I’ll regard it as a BDSM sexual device, in which Durand-Durand is a dominant seeking to push the limits of an (informed, consensual) submissive using this machine. It’s possible that part of the scene is a demonstration of prowess on a standardized, difficult-to-play instrument. If so, then a critique wouldn’t matter. But if not… Since the keys don’t move, the only variables he’s controlling are touch duration and the vertical placement of his fingers. (Horizontal position along each key seems unlikely to matter.) I’d want to provide the player some haptic feedback to detect and correct slipping finger placement, letting him or her maintain attention on the sub who is, after all, the point.

A disaster-avoidance service

The key system in The Cabin in the Woods is a public service, and all of its technological components can be understood as part of this service. It is, of course, not a typical consumer service, for several reasons. As with the CIA, FBI, and CDC, the people who most benefit from this service (humanity at large) are barely aware of it, if at all. These protective services only work by forestalling a negative event like a terrorist attack or a plague. Unlike those real-world threats, if Control fails in its duties, there is no crisis management as a next step. There’s only the world ending. It is also atypical in that it is an ancient service that has built itself up over ages around a mystical core.

So who are the users of the service? The victims are not. They are intentionally kept in the dark, and it is seen as a crisis when Marty learns the truth.

Given that interaction design requires awareness of the service in question, as well as inputs and outputs to steer variables toward a goal, it stands to reason that the organization in the complex is the primary user. More particularly, it is Sitterson and Hadley, the two “stage managers” in charge of the control room for the event, who are the real users. Understanding their goals, we can begin an analysis. Fittingly, it’s complex:

  • Forestall the end of the world…
  • by causing the (non-Virgin) victims to suffer and die before Dana (who represents the Virgin archetype)…
  • at the hand of a Horrible Monster selected by the victims themselves…
  • marking each successful sacrifice with a blood ritual…
  • while keeping the victims unaware of the behind-the-scenes truth.

Sitterson and Hadley dance in the control room.

Part of a larger network with similar goals

This operation is not the only one underway. There are at least six others, each working with its particular archetypes and rituals, around the world: Berlin, Kyoto, Rangoon, Stockholm, Buenos Aires, and Madrid.

To monitor these other scenarios, there are two banks of CRT monitors high up on the back wall, each monitor dedicated to a different scenario. Notably, these are out of the stage managers’ line of attention when their focus is on their own scenario.

The CRT monitors display other scenarios around the world.

The digital screens on the main console are much more malleable, however, and can be switched to display any of the analog video feeds if special attention needs to be paid to one.

The amount of information that the stage managers need about any particular scenario is simple: the current state of an ongoing scenario, and whether a concluded one has succeeded or failed. We don’t see any scenario succeed in this movie, so we can’t evaluate that output signal. Instead, they all fail. When one fails, a final image is displayed on its CRT with a blinking red legend “FAIL” superimposed across it, so it’s clear when you look at the screen (and catch it in the “on” part of the blink) what its status is.

Sitterson watches the Kyoto scenario fail.

Hadley sees that other scenarios have all failed.

One critique of this simple pass/fail signal is that it is an important signal that might be entirely missed if the stage managers’ attention were riveted forward, on problems in their own scenario. Another design option would be to alert Sitterson and Hadley to the moment of change with a signal in their peripheral attention, like a flash or a brief buzz. But signaling a change of state might not be enough. The new state, e.g., 4 of 7 failed, ought to be persistent in their field of vision as they continue their work, if the signal is considered an important motivator.

The design of alternate, persistent signals depends on rules we do not have access to. Are more successful scenarios somehow better? Or is it a simple OR-chain, with just one success meaning success overall? Presuming the latter, strips of lighting around the big screens could become increasingly bright red, for instance, or a seven-sided figure mounted around the control room could have wedges turn red as those scenarios failed. Such environmental signals would make the information glanceable, and remind the stage managers of the increasing importance of their own scenario. The signals could also turn green at the first success, letting them know that the pressure is off and that what remains of their own scenario can be run as a drill.
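The OR-chain reading can be sketched in a few lines. This is a speculative model, not anything the film specifies—the class and method names are all invented. It captures the three signals discussed above: a momentary alert fired only on a change of state, a persistent glanceable summary (“4 of 7 failed”), and a pressure-off flag that flips at the first success anywhere.

```python
# Hypothetical status board for the seven parallel ritual scenarios.
# Success rule is a simple OR-chain: one success anywhere suffices.

class ScenarioBoard:
    def __init__(self, scenarios):
        # None = still running, True = succeeded, False = failed
        self.status = {name: None for name in scenarios}
        self.alerts = []  # stand-in for a peripheral flash or brief buzz

    def report(self, name, succeeded: bool):
        if self.status[name] is None:  # alert only on a change of state
            self.status[name] = succeeded
            self.alerts.append((name, "SUCCESS" if succeeded else "FAIL"))

    def summary(self) -> str:
        """Persistent legend for the ambient display, e.g. '4 of 7 failed'."""
        failed = sum(1 for s in self.status.values() if s is False)
        return f"{failed} of {len(self.status)} failed"

    def pressure_off(self) -> bool:
        # One success means the remaining scenarios are effectively drills.
        return any(self.status.values())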

There is a Prisoner’s Dilemma argument to be made that stage managers should not have the information about the other scenarios at all, in order to keep each operation running at peak efficiency, but this would not have served the narrative as well.