The FloorMaster

As Joe wanders through the (incredibly depressing) lobby of St. God’s Memorial Hospital, everything is at once familiar and wrong. One of those wrong things is a floor-cleaning robot labeled The FloorMaster. It loudly announces “YOUR FLOOR IS NOW CLEAN!” while bumping over and over into a toe kick under a cabinet. (It also displays the same phrase on a display panel.) The floor immediately below its path is, in fact, spotless, but the surrounding floor is so filthy it is opaque with dirt, and littered with syringes and trash marked with unsettling stains.

There are no bananas for scale, but I’m guessing it’s half a meter square. It has a yellow top with green sides and highlights. It has bumpers, some greebles, and an amber display screen on top. “The FloorMaster” logo is printed on its side.

Narratively awesome

The wonderful thing about this device is that it quickly tells us many things at once. First, the FloorMaster shows that the technology in this world is, itself, kind of stupid. Today’s Roombas “know” to turn a bit when they bump into a wall; it’s one of the basic ways they avoid this very scenario. (How society managed to make it this far without imploding or, hell, exploding is a mystery.)
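For reference, the core of that bump-and-turn behavior is tiny. Here’s a minimal sketch in Python (every name in it is hypothetical, not any vendor’s actual API) of the escape maneuver that keeps a robot from grinding against the same toe kick forever:

```python
import random

class BumpAndTurnBot:
    """A minimal sketch of the bump-and-turn escape behavior.
    Illustration only; not any real robot vacuum's API."""

    def __init__(self):
        self.heading_degrees = 0.0

    def on_bump(self):
        # A collision means "this direction is blocked." Back off a
        # little and pick a new, random heading before driving again.
        self.drive(distance_cm=-5)
        self.heading_degrees = (self.heading_degrees
                                + random.uniform(30, 180)) % 360

    def drive(self, distance_cm):
        # Motor control stubbed out; a real robot would move here.
        pass
```

Even that dumb randomness is usually enough to wander out of a dead end, which is exactly what the FloorMaster can’t do.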

It also shows that the people around the machine are failing to notice or do anything about the stuck robot. Either they are too dull to notice, or this is just so common that it’s not worth doing anything about.

It also shows how stupid capitalism has become (it’s a running theme of St. God’s and the rest of the movie). It calls itself the floor master, but in no way has it mastered your floors. In no way are your floors clean, despite what the device itself is telling and blinking at you. And CamelCase brand names are so 1990s, much less 2505.


Realistically stupid

So, I wrote this whole book about agents, i.e., technologies that persistently respond to triggers with behaviors that serve people. It’s called Designing Agentive Technologies: AI That Works for People. One of my recurring examples, both in that book and when I speak publicly about it, is the Roomba, so I have a bookload of opinions on how this thing should be designed. I don’t want to simply copy+paste that book here. But know that Chapter 9 is all about handoff and takeback between an agent and a user, and ideally this machine would be smart enough to detect when it is stuck and reach out to the user for help.
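To gesture at the idea without reprinting the chapter, here is a sketch in Python of an agent that notices it is stuck and asks for help instead of declaring victory. The thresholds and the notification channel are entirely my assumptions:

```python
import time

# Hypothetical thresholds: many bumps at the same spot within a short
# window is a decent proxy for "I am stuck."
BUMP_LIMIT = 10
WINDOW_SECONDS = 30.0

class HandoffMonitor:
    def __init__(self, notify_user):
        self.notify_user = notify_user  # e.g., text a human, flash the panel
        self.recent_bumps = []          # list of (timestamp, position) pairs

    def record_bump(self, position):
        now = time.time()
        # Keep only recent bumps at this same position.
        self.recent_bumps = [(t, p) for (t, p) in self.recent_bumps
                             if now - t < WINDOW_SECONDS and p == position]
        self.recent_bumps.append((now, position))
        if len(self.recent_bumps) >= BUMP_LIMIT:
            # Handoff: stop announcing "YOUR FLOOR IS NOW CLEAN!" and
            # ask a human to take over.
            self.notify_user(f"I'm stuck near {position}. Please help.")
            self.recent_bumps.clear()
```

The particular numbers don’t matter; what matters is that the agent models its own failure and degrades into a request for help rather than a lie about success.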


I would be remiss not to note that, as with the floor-sweeping robots of The Fifth Element, the safety of people around an underfoot robot is important. This is especially true in a hospital setting, where people may be in a fragile state and not as alert as they would ordinarily be. So unless this thing was programmed to run only when there was no one around, it seems like a stupid thing to have in a hospital. OK, chalk another point up to its narrative virtues.

Fighting US Idiocracy

Speaking of bots, there is a brilliant bot that you can sign up for to help us resist American idiocracy. It’s the resistbot, and you can find it on Facebook Messenger, Twitter, and Telegram. It provides easy ways to find out who represents you in Congress and to deliver messages to them in under two minutes. It’s not as influential as an in-person visit or call, but as part of your arsenal, it helps with reminders for action. Join!


A Deadly Pattern

The Drones’ primary task is to patrol the surface for threats and eliminate them. The Drones are always on guard, responding swiftly and violently against anything they perceive as a threat.

An explosion in a dimly lit library with debris scattered around and a hovering robotic device amidst the chaos.

During his day-to-day maintenance, Jack often encounters active drones. Initially, the drones always regard him as a threat and offer him a brief window of time to speak his name and tech number (for example, “Jack, Tech 49”) to authenticate. The drone then compares this speech against some database, shown on its HUD as a zoomed-in image of Jack’s mouth and a vocal frequency.


Occasionally, we see that Jack’s identification doesn’t immediately work. In those cases, the drone gives him a second chance to confirm his identity.
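Piecing together what the film shows, this is a classic challenge-response flow. Here’s a sketch of it in Python; the voiceprint comparison, the tolerance, and the retry limit are all my assumptions, since the film specifies none of them:

```python
from dataclasses import dataclass

MAX_ATTEMPTS = 2  # assumption: we only ever see one retry on screen

@dataclass
class VoicePrint:
    name: str            # e.g., "Jack"
    tech_number: str     # e.g., "Tech 49"
    frequency_hz: float  # stand-in for the vocal signature on the HUD

def matches(sample: VoicePrint, expected: VoicePrint,
            tolerance_hz: float = 5.0) -> bool:
    """Compare the spoken challenge against the database record."""
    return (sample.name == expected.name
            and sample.tech_number == expected.tech_number
            and abs(sample.frequency_hz - expected.frequency_hz) <= tolerance_hz)

def challenge(listen, expected: VoicePrint) -> str:
    """listen() stands in for the drone's brief response window."""
    for _ in range(MAX_ATTEMPTS):
        if matches(listen(), expected):
            return "stand down"
    return "eliminate"  # the zero-tolerance default discussed below
```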

A man with a glowing headband appears concerned, reaching out with his hand, in a dimly lit environment.

Although the film never shows it, failing to properly identify himself would almost certainly get Jack killed immediately. We never see any backup mechanism, and when Jack’s response doesn’t immediately work, we see him get very worried. He knows what happens when a drone detects a threat.

Zero Error Tolerance

This pattern is deadly because it offers very little tolerance for error. The Drone does allow Jack a second attempt at his vocal pattern, but it is unclear how many total chances he gets.

On a website, if I enter my password wrong too many times, it will lock me out. With this system, entering the wrong password too many times will get Jack killed.

There are many situations where Jack may not be able to immediately respond:

  • Falling off his bike and knocking himself out
  • Focusing on repairing one drone when a second drone swoops in to check out the situation
  • Going into severe shock after breaking a limb
  • etc…

As we see in the crashed shuttle scene, the Drones have no hesitation in killing unconscious targets. This means that Jack has a strong chance of being killed by his Drone protector in some of the situations where he needs help the most.

A futuristic robotic sphere with the number 166 displayed on its surface, emitting flames and smoke from its underside.

A more effective method could be a passive recognition system. We already know that the drone can remotely detect Jack’s biosignature, and that the Tet has full access to the Drone’s HUD feed.

The Drone could then be set to never attack Jack unless the Tet gives a very specific override. Or, alternatively, the Drone could be hard-wired to never attack Jack at all (though this would complicate the movie’s plot). In any situation where it looks like the Drone might attack anyway, the remote software Vika uses could act as a secondary switch, providing a backup confirmation message.
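Sketched in Python, with the biosignature check, the Tet override, and Vika’s console all as assumed inputs, the important part is the priority order of the checks:

```python
def should_fire(target_biosig: str, friendly_biosigs: set,
                tet_override: bool, operator_confirms) -> bool:
    """Passive-recognition sketch: friendlies are safe by default.
    Attacking one requires an explicit Tet override, and even then a
    human operator (Vika's console) gives a final confirmation."""
    if target_biosig in friendly_biosigs:
        if not tet_override:
            return False              # hard default: never fire on a friendly
        return operator_confirms()    # backup switch gates any override
    return True                       # unrecognized targets: current behavior

# Example: Jack is detected and no override has been issued,
# so the Drone holds its fire without Jack saying a word.
assert should_fire("biosig-49", {"biosig-49", "biosig-52"},
                   tet_override=False,
                   operator_confirms=lambda: False) is False
```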

That said, we must acknowledge what this system excels at: keeping Jack nervous and afraid of active drones. While they help him, he knows that they can turn on him at any moment. This serves the Tet by keeping Jack cowed, obedient, and always looking over his shoulder.

Ethical Ramifications

The Drones are built as autonomous sentries, able to protect extraordinarily expensive infrastructure against attack. They need to be able to eliminate any threat quickly and efficiently. Militaries today face the exact same issues. Even though they have pledged (for now) not to build autonomous kill systems, modern military planners may find value in having a robot perform a drudging, dangerous task like patrolling remote infrastructure.

The question Oblivion asks best is, “What should constitute a threat?”

A desolate landscape with wreckage and flames, suggesting a recent disaster or battle scene.

Drones fire mercilessly on unarmed civilians and armed enemy militia, but do not attack armed friendly soldiers (Jack). This already implies some level of advanced threat analysis, even if we abhor the choices the Drone makes.

The Future

Military planners will need to answer the same question: how does the algorithm determine a threat? With human labor becoming more and more expensive, both monetarily and emotionally, the push for autonomous drone systems will become even stronger in future conflicts.

There is still enough time to research and test potential concepts before we have to make a decision on autonomous drones.

Interaction Design Lessons:

  1. Don’t threaten civilians and non-combatants.
  2. Give clear feedback about limits and consequences when a deadly pattern is about to be activated.
  3. Give users a second chance.