The Cloak of Levitation, Part 3: But is it agentive?

Full_cover

So I mentioned in the intro to this review that I was drawn to review Doctor Strange (with my buddy and co-reviewer Scout Addis) because the Cloak displays some interesting qualities in relation to the book I just published. Buy it, read it, review it on amazon.com, it’s awesome.

That sales pitch done, I can quickly cover the key concepts here.

  • A tool, like a hammer, is a familiar but comparatively dumb category of thing that only responds to a user’s input. Tool has been the model of the thing we’re designing in interaction design for, oh, 60 years, but it is being mostly obviated by narrow artificial intelligence, which can be understood as automatic, assistive, or agentive.
  • Assistive technology helps its user with the task she is focused on: drawing her attention, providing information, making suggestions, maybe augmenting her precision or force. If we think of the hammer again, an assistive hammer might draw her attention to the best angle to strike the nail, or use an internal gyroscope to gently correct her off-angle strike.
  • Agentive technology does the task for its user. Again with the hammer, she could tell hammerbot (a physical agent, but there are virtual ones, too) what she wants hammered and how. Her instructions might be something like: hammer a ha’penny nail every decimeter along the length of this plinth. As it begins to pound away, she can turn her attention to mixing paint or whatever. (A rough sketch of this kind of delegation follows this list.)
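
To make the delegation concrete, here is a minimal sketch of what handing that task to hammerbot might look like in code: the user expresses the goal once, and the agent works out the individual strikes. Everything here (the HammerTask class, its fields, the 120 cm plinth) is invented for illustration, not drawn from any real product.

```python
from dataclasses import dataclass

@dataclass
class HammerTask:
    """A goal the user hands off, rather than a stream of manual inputs."""
    nail_type: str      # e.g. "ha'penny"
    spacing_cm: float   # one nail every N centimeters
    length_cm: float    # length of the plinth to cover

    def nail_positions(self):
        """Positions (in cm) where the agent will drive nails, derived from the goal."""
        position = 0.0
        while position <= self.length_cm:
            yield position
            position += self.spacing_cm

# The carpenter states the goal once, then turns her attention to mixing paint.
task = HammerTask(nail_type="ha'penny", spacing_cm=10.0, length_cm=120.0)
for pos in task.nail_positions():
    print(f"Drive a {task.nail_type} nail at {pos:.0f} cm")
```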

When I first introduce people to these distinctions, I step one rung up on Wittgenstein’s Ladder and talk about products that are purely agentive or purely assistive, as if agency were a quality of the technology. (Thanks to TU prof P.J. Stappers for distinguishing these as ontological and epistemological approaches.) The Roomba, for example, is almost wholly agentive as a vacuum. It has no handle for you to grab, because it does the steering and pushing and vacuuming.

roomba_r2_d2_1

Yes, it’s a real thing you can own.

Once you get these basic ideas in your head, we can take another step up the Ladder together and clarify that agency is not necessarily a quality of the thing in the world. It’s subtler than that. It’s a mode of relationship between user and agent, one which can change over time. Sophisticated products should be able to shift their agency mode (between tool, assistant, agent, and automation) according to the intentions and wishes of their user. Hammerbot is useful, but still kind of dumb compared to its human. If there’s a particularly tricky or delicate nail to be driven, our carpenter might ask for hammerbot’s assistance, but really, she’ll want to handle that delicate hammering herself.
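
One hedged way to picture that shifting relationship is as an explicit mode that the user, not the product, moves the device between. The sketch below is purely illustrative: the AgencyMode values simply mirror the four modes named above, and the Hammerbot class and its methods are hypothetical.

```python
from enum import Enum, auto

class AgencyMode(Enum):
    TOOL = auto()        # user does the work; product only responds to direct input
    ASSISTANT = auto()   # user does the work; product advises or corrects
    AGENT = auto()       # product does the work; user sets goals and reviews
    AUTOMATION = auto()  # product does the work; user is out of the loop

class Hammerbot:
    def __init__(self):
        self.mode = AgencyMode.AGENT

    def set_mode(self, mode: AgencyMode, reason: str = ""):
        """Mode changes follow the user's stated intent, not the product's."""
        self.mode = mode
        print(f"Switching to {mode.name}: {reason}")

bot = Hammerbot()
# Routine nailing: delegate it and go mix paint.
bot.set_mode(AgencyMode.AGENT, "hammer the plinth while I mix paint")
# A delicate nail: the carpenter takes over and asks only for assistance.
bot.set_mode(AgencyMode.ASSISTANT, "guide my angle on this tricky nail")
```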

Which brings us back to the Cloak. Continue reading

“Real-time” Interplanetary Chat

While recording a podcast with the guys at DecipherSciFi about the twee(n) love story The Space Between Us, we spent some time kvetching about how silly it was that many of the scenes involved Gardner, on Mars, in a real-time text chat with a girl named Tulsa, on Earth. It’s partly bothersome because throughout the rest of the movie, the story tries for a Mohs sci-fi hardness of, like, 5.5, somewhere between Real Life and Speculative Science, so it can’t really excuse itself through the Applied Phlebotinum that, say, Star Wars might use. The rest of the film feels like it’s trying to have believable science, but during these scenes it just whistles, looks the other way, and hopes you don’t notice that the two lovebirds are breaking the laws of physics as they swap flirt emoji.

Hopefully unnecessary science brief: Mars and Earth are far away from each other. Even if the transmissions travel between them at light speed, a reply takes much longer than the roughly 1 second of response time required to feel “instant.” How much longer? It depends. The planets orbit the sun at different speeds, so they aren’t a constant distance apart. At their closest, it takes light 3 minutes to travel between Mars and Earth, and at their farthest (while not being blocked by the sun) it takes about 21 minutes. A round trip is double that. So nothing akin to real-time chat is going to happen.
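
If you want to check the arithmetic, here is a quick back-of-the-envelope script. The distances are approximate published figures for the closest and farthest Earth-to-Mars separations (the farthest chosen to match the ~21-minute figure above), not anything taken from the film.

```python
SPEED_OF_LIGHT_KM_S = 299_792  # kilometers per second

# Approximate Earth-to-Mars distances in km; actual values vary with the orbits.
CLOSEST_KM = 54.6e6    # a rare close approach
FARTHEST_KM = 378e6    # near the far side of its orbit, not blocked by the sun

for label, distance_km in [("closest", CLOSEST_KM), ("farthest", FARTHEST_KM)]:
    one_way_min = distance_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"{label}: one-way {one_way_min:.1f} min, round trip {2 * one_way_min:.1f} min")

# closest: one-way ~3.0 min, round trip ~6.1 min
# farthest: one-way ~21.0 min, round trip ~42.0 min
```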

But I’m a designer, a sci-fi apologist, and a fairly talented backworlder. I want to make it work. And perhaps because of my recent dive into narrow AI, I began to realize that, well, in a way, maybe it could. It just requires rethinking what’s happening in the chat. Continue reading

R.S. Revenge Comms

Note: In honor of the season, with Rogue One opening this week and the reviews of Battlestar Galactica: The Mini-Series behind us, I’m reopening the Star Wars Holiday Special reviews, starting with the show-within-a-show, The Faithful Wookiee. Refresh yourself on the plot if it’s been a while.

Faithful-Wookiee-02

On board the R.S. Revenge, the purple-skinned communications officer announces he’s picked up something. (Genders are a goofy thing to ascribe to alien physiology, but the voice actor speaks in a masculine register, so I’m going with it.)

faithful-wookiee-01-surrounds

He attends to a monitor, below which is a panel with several dials and controls. To the right of the monitor screen are five physical controls:

  • A stay-state toggle switch
  • A stay-state rocker switch
  • Three dials

The lower two dials have rings under them on the panel that accentuate their color.

Map View

The screen is a dark purple overhead map of the impossibly dense asteroid field in which the Revenge sits. A light purple grid divides the space into 48 squares. The screen has text all over it, written in a constructed orthography unmentioned on Wookieepedia. In the upper center and upper right are unchanging labels. A triangular label sits in the lower left. In the lower right corner, text appears and disappears too fast for (human) reading. The middle right side of the screen is labeled in large characters, but they also change too rapidly to make much sense of them.

revengescreen

Continue reading

Tony Stark is being lied to (by his own creation)

Before I surface from the deep-dive examination of the Iron Man HUD, there’s one last bit of meandering philosophy and fan theory I’d like to propose, one that touches on our future relationship with technology.

The Iron Man is not Tony Stark. The Iron Man is JARVIS. Let me explain.

Tony can’t fire weapons like that

vlcsnap-2015-09-15-05h12m45s973

The first piece of evidence is that most of the weapons he uses are unlikely to be fired by him. Take the repulsor rays in his palms. I challenge readers to strap a laser perpendicular to each of their palms and reliably target moving objects that are actively trying to avoid getting hit, while, say, roller skating through an obstacle course. Because that’s what he’s doing as he flies around incapacitating Hydra agents and knocking around Ultrons. The weapons are not designed for Tony to operate manually with any accuracy. But that’s not true for the artificial intelligence.

Iron Targeting 02 Continue reading

Iron Man HUD: 2nd-person view

The HUD itself displays a number of core capabilities across the Iron Man movies prior to its appearance in The Avengers. Cataloguing these capabilities lets us understand (or backworld) how he interacts with the HUD, equipping us to look for its common patterns and possible conflicts. In the first-person view, we saw it looked almost entirely like a rich agentive display, but with little interaction. Now, let’s look at that gorgeous 2nd-person view.

When, in the first film, Tony first puts the faceplate on and says to JARVIS, “Engage heads-up display”…

IronMan1_HUD00

…we see things from a narrative-conceit, 2nd-person perspective, as if the helmet were huge and we were inside the cavernous space with him, seeing only Tony’s face and the augmented reality interface elements.

IronMan1_HUD07

You might be thinking, “Of course it’s a narrative conceit. It’s not real. It’s in a movie.” But what I mean by that is that even in the diegesis, the Marvel Cinematic Universe, this is not something that could be seen. Let’s move through the reasons why. Continue reading

Dat glaive: Enthrallment

Several times throughout the movie, Loki places the point of the glaive on a victim’s chest near their heart, and a blue fog passes from the stone to infect them: an electric blackness creeps upward along their skin from their chest until it reaches their eyes, which turn fully black for a moment before settling into the same ice blue as the glaive’s stone, and we see that the victim is now enthralled into Loki’s servitude.

Enthralling_Hawkeye

You have heart.

The glaive is very, very terribly designed for this purpose. Continue reading

Odyssey Communications

Desktop_2014_07_15_19_32_16_746

The TET is far enough away from Earth that the crew goes into suspended animation for the initial travel to it. This initial travel is either automated or controlled from Earth. After waking up, the crew speak conversationally with their mission controller, Sally.

This conversation between Jack, Vika, and [actual human] Sally happens over a small 2D video communication system. The display in the middle of the Odyssey’s control panel shows Sally and a small section of Mission Control, presumably back on Earth. Sally confirms with Jack that the readings Earth is getting remotely from the Odyssey match what is actually happening on site.

Desktop_2014_07_15_19_29_53_919

Soon after, mission control is able to respond immediately to Jack’s initial OMS burn and let him know that he is over-stressing the ship as he tries to escape the TET. Jack is then able to make adjustments (cutting thrust) before the stress damages the Odyssey.

FTL Communication

Communication between the Odyssey and Earth happens in real time. When you look at the science of it all, this is more than a little surprising. Continue reading