Lumpy’s Brilliant Cartoon Player

I am pleased to report that with this post, we are over 50% of the way through this wretched, wretched Holiday Special.

SWHS-Cartoon-Player-07

Description

After Lumpy tries to stop stormtroopers from going upstairs, an Imperial Officer commands Malla to keep him quiet. To do so, she does what any self-respecting mother of a pre-teen in the age of technology does, and sits him down to watch cartoons. The player is a small, yellow device that sits flat on an angled tabletop, like a writing desk.

Two small silver buttons stack vertically on the left, and an upside-down plughole strainer sits on the right. A video screen sits above these controls. Since no one else in his family wants to hear the cartoon introduction of Boba Fett, he dons a pair of headphones, which are actually kind of stylish in that the earpieces are square and perforated, but not beveled. There are some pointless animations that start up, but then the cartoon starts and Lumpy is, in fact, quiet for the duration. So, OK, point one Malla.

SWHS-Cartoon-Player-08

Why no budding DJ has glommed onto this for an album cover is beyond me.

Analysis

We only see Lumpy press down onto the surface of the device from the far side, so it’s mostly conjecture about how the interface works. The same goes for the media. But we do know the basic needs of video: Start, stop, and volume. And a single click-stop dial could handle all that, even if kind of poorly.

We also don’t know whether the device has media inserts—like a Blu-Ray player—or is more like a television with fixed streams of ongoing content to pick from, or like a Netflix requiring a search of a practically infinite on-demand catalogue. But that sink drain thing looks like it’s meant to be a channel selector, and this was 1978, so let’s presume it was a television model with a few-year prescient Walkman personal-media bent. In fact, there’s a handle visible in the shot posted below, so let’s give this thing some credit for presaging miniaturization to the point of mobility. It must have blown some kids’ minds back then.

And, sure, this interface could manage the task at hand, even if it’s missing some feedback for exactly which channel is being watched, what the current volume is, what that second click-stop dial does, or why it has an affordance for turning when Lumpy clearly pushes it.

SWHS-Cartoon-Player-09.png

Apology

What I’m most interested in, though, is the crappy, crappy production quality of the thing. While it’s easy and admittedly fun to decry this as rushed through the prop department in about 30 minutes, I’m going to use my old friend apologetics to wonder if maybe Lumpy himself put this together. Not like a science fair project, but as an off-the-shelf product. Wouldn’t it be awesome to give a kid a blank box with a video screen, and let him place any object he found on top of it to use as a control device? A thimble could become the on-off switch. A jack could become the channel selector. A Matchbox car could become the volume control. This would diegetically explain the dopey sink strainer, and give Lumpy an awesome opportunity to think about the affordances of the things around him and the relationships-of-parts he could use to control abstract variables like volume, power, playback speed, etc. Maybe he could even assign objects to favorite videos. This stone in that crayon circle means that video. It would be a dream to foster interaction design thinking.
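To make the apologia concrete, here is a toy sketch of how such a product’s software might bind arbitrary recognized objects to abstract controls. Everything here—the object labels, the control names, the class itself—is my own invented illustration, not anything actually in the Special:

```python
# Hypothetical sketch of an "any object becomes a control" media player.
# Assumes some upstream vision system has already recognized the object
# and reports it as a string label plus a gesture ("push", "turn", etc.).

CONTROLS = ["power", "channel", "volume"]

class ObjectControlPanel:
    def __init__(self):
        self.bindings = {}  # object label -> control name

    def assign(self, object_label, control):
        """Let the kid declare what a found object means."""
        if control not in CONTROLS:
            raise ValueError(f"unknown control: {control}")
        self.bindings[object_label] = control

    def handle(self, object_label, gesture):
        """Map a gesture on a bound object to a control action.
        Unbound objects are simply inert."""
        control = self.bindings.get(object_label)
        if control is None:
            return None
        return (control, gesture)

# Lumpy sets up his player with whatever was lying around.
panel = ObjectControlPanel()
panel.assign("thimble", "power")
panel.assign("sink_strainer", "channel")
panel.assign("matchbox_car", "volume")
```

The design choice worth noting: the mapping lives in software, so the physical affordance (a strainer turns, a thimble presses) and the abstract variable it controls are decoupled, which is exactly the mismatch the post complains about with the turn-affording dial that gets pushed.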

Sure, you might be thinking, but this would take cameras of an eye-like quality, and perfect image recognition attached to a near general artificial intelligence. Too bad they don’t have anything like that in Star Wars, yeah?

r2d2-and-c3po-star-wars

Of course one imagines such a device might be prohibitively expensive for a smuggler’s Life Day budget, and moreover this is giving the Star Wars Holiday Special waaaaay too much credit, but these are the truffles I actually do hope to find in rooting around all this muck for you.

Also to drop this. Contact me with demos.

DJLumpy.png

Live fire exercise

StarshipTroopers-Gunner-Practice-19

After the capture the flag exercise, the recruits advance to a live ammo exercise. In this one, the recruits have weapons loaded with live ammo and surge in waves over embankments. They wear the same special vests they did in the prior exercise that detect when they are hit with a laser, flashing briefly with red lights on the front and back and thereafter delivering a debilitating shock to the wearer until the game is over. As they approach the next embankment, dummies automatically rise up and fire lasers randomly towards the recruits. The recruits shoot to destroy the dummies, making it safe to advance to the next embankment.

War game equipment

StarshipTroopers-Gunner-Practice-01

The recruits practice their war skills with capture the flag games. Each participant carries visible-laser weapons (color coded to match the team color) to fire at members of the other team, and wears a special vest that detects when it is hit with a laser, flashing briefly with red lights on the front and back and thereafter delivering a debilitating shock to the wearer until the game is over.
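The vest’s described behavior (a laser hit triggers a brief flash, then a continuous shock until the game ends) amounts to a tiny state machine. Here is a speculative sketch; the class names and the one-second flash duration are my assumptions, not anything specified in the film:

```python
import enum

class VestState(enum.Enum):
    ACTIVE = 1    # in play, waiting to be hit
    FLASHING = 2  # hit registered, red lights on front and back
    SHOCKING = 3  # debilitating shock until the game is over

class TrainingVest:
    """Toy model of the war-game vest described above."""
    FLASH_SECONDS = 1.0  # assumed duration of the brief flash

    def __init__(self):
        self.state = VestState.ACTIVE

    def on_hit(self, now):
        """A laser hit only matters while the wearer is still in play."""
        if self.state is VestState.ACTIVE:
            self.state = VestState.FLASHING
            self.flash_until = now + self.FLASH_SECONDS

    def tick(self, now):
        """Advance time; the flash gives way to the shock."""
        if self.state is VestState.FLASHING and now >= self.flash_until:
            self.state = VestState.SHOCKING

    def on_game_over(self):
        self.state = VestState.ACTIVE
```

Note that once shocking, further hits are ignored; only the game ending releases the wearer, which matches the grim pedagogy of the scene.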


5E-opedia: Search

TheFifthElement-eye

Leeloo learns about the facts of the human race which she is destined to save through an online encyclopedia available to her in many places: in Cornelius’ home, the spaceship to Fhloston Paradise, and aboard Zorg’s ship. Three modes are seen for it. Today we discuss the third mode, which is to search for an in-depth topic.

Search

When Leeloo experiences full-scale combat with Zorg and the Mangalores aboard Fhloston Paradise, she grows curious about war. On the route back to Earth aboard Zorg’s ship, she once again returns to the online encyclopedia she’s been referencing throughout the film. When she sits down, it just so happens that the system is in the middle of the W topics. It is amid “we*” and “wh*” words. “Weapon” is at the top, so maybe that’s what Zorg was looking for.

TheFifthElement-W TheFifthElement-weapon

To access a particular topic not on screen, she simply begins typing. She types “WAR,” the letters filling the screen in green all-caps, and the entry for war begins playing. This entry is different from the prior one seen on martial arts. This one is simply a series of around four dozen still images, presented serially, culminating in an image of the French test of an atomic weapon at Mururoa Atoll.

TheFifthElement-wkey

TheFifthElement-war

TheFifthElement-war-043

Two small nuances to note. The first is that we don’t see a list of possible search results. Like Wikipedia, there is a main entry for war, and it presumes that’s the one she means. If it’s wrong, she can interrupt. That’s a smart default that will work in most cases.

The second is that we don’t see or hear Leeloo hit an “enter” key after she finishes typing “war.” (The other keys each emit a small beep.) How did the system know she wasn’t continuing on to “warrior” or “warship”? A smart system would be able to interpret the pause after the “r” as a likely end, once it passes an outer threshold for her typical typing speed, and begin to show her the “war” entry. Then, if she continued to add another letter just outside that threshold, it could evaluate the string. If it might be a continuation, like typing an “s” for “warship,” it could pause the display and wait. If a continuation wouldn’t make any sense, like “warx,” it could presume she was entering a new word beginning with “x” or help her recover in case it was just a plain old typo.
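That pause-as-enter heuristic can be sketched in a few lines. This is my own toy illustration of the logic described above, not anything from the film; the tiny dictionary and the threshold multiplier are assumptions:

```python
# Hypothetical sketch: treat a long pause (relative to the typist's own
# inter-key timing) after a complete dictionary word as an implicit Enter.

DICTIONARY = {"war", "warrior", "warship", "weapon"}  # assumed corpus

class PauseSearch:
    def __init__(self, pause_factor=2.5):
        self.buffer = ""
        self.timestamps = []          # seconds, one per keystroke
        self.pause_factor = pause_factor

    def _typical_gap(self):
        """Estimate this user's typical inter-key interval."""
        gaps = [b - a for a, b in zip(self.timestamps, self.timestamps[1:])]
        return sum(gaps) / len(gaps) if gaps else 0.3  # fallback guess

    def key(self, ch, t):
        """Record one keystroke at time t."""
        self.timestamps.append(t)
        self.buffer += ch

    def should_commit(self, now):
        """Commit (play the entry) once the pause since the last key
        exceeds the user's typical gap by the pause factor, and the
        buffer is itself a complete word."""
        if not self.timestamps:
            return False
        pause = now - self.timestamps[-1]
        threshold = self._typical_gap() * self.pause_factor
        return pause > threshold and self.buffer.lower() in DICTIONARY
```

A real system would also check whether the buffer is a prefix of longer entries (the “warship” case) before committing, and could un-commit gracefully if another key arrived late, but the core timing heuristic is just this.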

Interestingly, this is kind of the way Google Instant search works. Did the designers for The Fifth Element accidentally invent it 13 years ahead of Google?

Despite that cool possibility, I have to ding this entry for not really explaining anything. Some aren’t really about war but about terror, such as the image of the burning cross at a KKK rally. But even for the others, yes, they are horrific images. And they are a stinging reminder of the horrors that accompany war. But they really only work for someone with the prior knowledge of what they describe. Steve McCurry’s haunting image of a tank in Kuwait, for instance, inspires despair only if you know the full background story of that war, and this sequence certainly does not provide it to Leeloo.

TheFifthElement-war-033

Ultimately, regardless of the mode this encyclopedia is in, it is a cinematic conceit that we should not take as a good example of rapid learning for the real world.

5E-opedia: Watch and learn

WPmode1

Leeloo learns about the facts of the human race which she is destined to save through an online encyclopedia available to her in many places: in Cornelius’ home, the spaceship to Fhloston Paradise, and aboard Zorg’s ship. Three modes are seen for it. Today we discuss the first mode, which is just play-and-watch.

Watch-and-learn

When we first see her using the (unnamed) encyclopedia, she is simply watching four columns of words quickly scroll by. The words are arranged alphabetically, top-to-bottom before continuing to the next column. There is a large, blinking, lower-case letter reversed out of a white square in the lower left. Near the middle of the screen, a thick bracket emphasizes four of the words from the screen, and red lines connect each of the words to a large image on the right side of the screen. The words and pictures fly by at a rate that’s impossible for us mere humans to follow, but Cornelius assures David that she’s “learning our history…the last 5,000 years she’s missed. She’s been asleep for quite a while, you know.”

fifthelement-151

This mode is all passive. When, partway through the scene, Leeloo goes to grab a turkey from the kitchen to bring back and eat, the screen keeps moving with no one in front of it. We see a menu of capital letters, with the selection moving from “A” to “M” by itself.

The Tyranny of Pause

Here I’m going to have to break the usual stance with which I review interfaces. That is, I usually treat each interface as if it were exactly as it should be in the world of the movie or TV show, trying to willfully ignore its speculative nature. (That’s how we can make it relevant to our real-world work.) But here, there’s just too much that’s broken in the content to make any sense. You only notice it when you slow the movie down to read and examine the screens, so this is totally unfair. But then again, the DVD format had been in the world for two years by the time The Fifth Element came out, so there’s not a great excuse on the playback-technology front. They could presume it would eventually be paused.

First off, the words in the lists are repeated. The first column is identical to the third. The second is identical to the fourth. I can’t imagine a good reason why this would help a reader. I was hoping maybe there was some autostereogram thing going on, but no. It just adds noise.

Secondly, the vast majority of images have little to do with the words to which they’re connected. “Me” points to a halved cantaloupe filled with blackberries. (See the image above.) “Maunder” points to an image of a woman’s softly parted, lipsticked lips. What on earth is Leeloo meant to learn from that? (Also I think it’s high time we bring back maunder into common usage.)

TheFifthElement-maunder

Worse, the images repeat. So the same picture of a chimpanzee connects to both “meadow” and “matriarch.” The same picture of a nose (flipped horizontally once) connects to “nav(vv?)y,” “nefarious,” and “negate.”

Also the same word may appear multiple times, connected to different pictures. “Maw” points once to a mouth, which is sensible, but once to a full-body portrait of a model in a little black dress. I’m all for polysemy and homonymy, but this just makes no sense.

Only once does the connection make absolute sense, as “Napoleon” points to François Gérard’s 1804 portrait of “Napoleon Bonaparte, Emperor of France, at Malmaison.”

TheFifthElement-Napoleon

Now I understand the tough challenge the interface designers faced: They probably had zero time, the damned thing had to look…well…encyclopedic, it had to make Leeloo look like a learning machine, and no one had the budget or time to create new media for all these entries. Plus Larry Lessig wouldn’t found the Creative Commons organization for another four years. But…still.

Sometimes the words are arbitrarily cut off, as in “mayonnais” and “masturbat.” (Both the prurient-minded and those prone to conspiracies about subliminal influence may note that the word “masturbat~” appears three times, twice as a full word, and once cut off in this way.)

So, in short, unless she’s studying the Dada Encyclopedia, this display just makes no sense.

Um…learning?

Even if the images didn’t repeat, the words didn’t repeat, the images made sense for the words, and words appeared fully spelled out, it’s a ridiculous display for what Leeloo is trying to do. At best, this might be able to teach her the written words for some concrete things. But even this is doubtful. How does she know that “Napoleon” as a word means a particular individual rather than the word for a painted portrait, or the name of the uniform being depicted? Without going too far into history or learning theory, we can say that Leeloo would need exposure to some propositional language to understand history as interrelated events occurring across time, and an alphabetical list of words and pictures just isn’t enough.

Alien Astrometrics

Prometheus-222

When David is exploring the ancient alien navigation interfaces, he surveys a panel, and presses three buttons whose bulbous tops have the appearance of soft-boiled eggs. As he presses them in order, electronic clucks echo in the cavern. After a beat, one of the eggs flickers, and glows from an internal light. He presses this one, and a seat glides out for a user to sit in. He does so, and a glowing pollen volumetric projection of several aliens appears. The one before David takes a seat in the chair, which repositions itself in the semicircular indentation of the large circular table.

Prometheus-204

The material selection of the egg buttons could not be a better example of affordance. The part that’s meant to be touched looks soft and pliable, smooth and cool to the touch. The part that’s not meant to be touched looks rough, like immovable stone. At a glance, it’s clear what is interactive and what isn’t. Among the egg buttons there are some variations in orientation, size, and even surface texture. It is the bumpy-surfaced one, the first to draw David’s touch, that ultimately activates the seat.

The VP alien picks up and blows a few notes on a simple flute, which brings that seat’s interface fully to life. The eggs glow green and emit green glowing plasma arcs between certain of them. David is able to place his hand in the path of one of the arcs and change its shape as the plasma steers around him, but it does not appear to affect the display. The arcs themselves appear to be a status display, but not a control.

After the alien manipulates these controls for a bit, a massive, cyan volumetric projection appears and fills the chamber. It depicts a fluid node network mapped to the outside of a sphere. Other node network clouds appear floating everywhere in the room along with objects that look like old Bohr models of atoms, but with galaxies at their center. Within the sphere three-dimensional astronomical charts appear. Additionally huge rings appear and surround the main sphere, rotating slowly. After a few inputs from the VP alien at the interface, the whole display reconfigures, putting one of the small orbiting Bohr models at the center, illuminating emerald green lines that point to it and a faint sphere of emerald green lines that surround it. The total effect of this display is beautiful and spectacular, even for David, who is an unfeeling replicant cyborg.

Prometheus-226

At the center of the display, David observes that the green-highlighted sphere is the planet Earth. He reaches out towards it, and it falls to his hand. When it is within reach, he plucks it from its orbit, at which point the green highlights disappear with an electronic glitch sound. He marvels at it for a bit, turning it in his hands, looking at Africa. Then after he opens his hands, the VP Earth gently returns to its rightful position in the display, where it is once again highlighted with emerald, volumetric graphics.

Prometheus-229

Finally, in a blinding flash, the display suddenly quits, leaving David back in the darkness of the abandoned room, with the exception of the small Earth display, which is floating over a small pyramid-shaped protrusion before flickering away.

After the Earth fades, David notices the stasis chambers around the outside of the room. He realizes that what he has just seen (and interacted with) is a memory from one of the aliens still present.

Prometheus-238

Prometheus-239

Hilarious and insightful YouTube poster CinemaSins asks in the video “Everything Wrong with Prometheus in 4 minutes or Less,” “How the f*ck is he holding the memory of a hologram?” Fair question, but not unanswerable. The critique only stands if you presume that the display must be passive and must play uninterrupted like a television show or movie. But it certainly doesn’t have to be that way.

Imagine if this is less like a YouTube video, and more like playback through a game engine, a kind of holodeck StarCraft. Then it’s entirely possible to pause the action in the middle of playback and investigate parts of the display, before pressing play again and letting it resume its course. But that playback is a live system. It would be possible to run it afresh from the paused point with changed parameters as well. This sort of interrupt-and-play model would be a fantastic learning tool for sensemaking of 4D information. Want to pause playback of the signing of the Magna Carta and pick up the document to read it? That’s a “learning moment” and one that a system should take advantage of. I’d be surprised if—once such a display were possible—it wouldn’t be the norm.

Starmetheus

The only thing I see that’s missing in the scene is a clear signal about the different state of the playback:

  1. As it happened
  2. Paused for investigation
  3. Playing with new parameters (if it was actually available)

David moves from 1 to 2, but the only change of state is the appearance and disappearance of the green highlight VP graphics around the Earth. This is a signal that could easily be missed, and wasn’t present at the start of the display. Better would be some global change, like a shift in overall color, to indicate the different state. A separate signal might compare As it Happened with the results of Playing with new parameters, but that’s a speculative requirement of a speculative technology. Best to put it down for now and return to what this interface is: one of the richest, loveliest, and most promising examples of sensemaking interactions seen on screen. (See what I did there?)
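For what it’s worth, those three playback states fit a small state machine. A speculative sketch, with all names and behavior my own invention rather than anything shown in the film:

```python
import enum

class Mode(enum.Enum):
    AS_IT_HAPPENED = 1   # faithful replay of the recording
    PAUSED = 2           # frozen for investigation
    NEW_PARAMETERS = 3   # diverged: simulating from an edited state

class Playback:
    """Toy interrupt-and-play model. Pausing freezes the replay so the
    viewer can poke at the scene; editing anything while paused forks
    the replay into a live simulation (NEW_PARAMETERS)."""
    def __init__(self, frames):
        self.frames = list(frames)   # recorded world states
        self.cursor = 0
        self.mode = Mode.AS_IT_HAPPENED

    def step(self):
        """Return the current frame; advance only when not paused."""
        if self.mode is Mode.PAUSED:
            return self.frames[self.cursor]
        frame = self.frames[self.cursor]
        self.cursor = min(self.cursor + 1, len(self.frames) - 1)
        return frame

    def pause(self):
        self.mode = Mode.PAUSED

    def modify(self, new_frame):
        """Edit the paused state; playback is now a fork, not a memory."""
        self.frames[self.cursor] = new_frame
        self.mode = Mode.NEW_PARAMETERS

    def resume(self):
        if self.mode is Mode.PAUSED:
            self.mode = Mode.AS_IT_HAPPENED
```

The point of the `Mode` enum is exactly the missing signal complained about above: the system always knows which of the three states it is in, so surfacing that state globally (a color shift, say) is cheap.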

For more about how VP might be more than a passive playback, see the lesson in Chapter 4 of Make It So, page 84, VP Systems Should Interpret, Not Just Report.

VP language instructor

During David’s two-year journey, part of his time is spent “deconstructing dozens of ancient languages to their roots.” We see one scene illustrating a pronunciation part of this study early in the film. As he’s eating, he sees a volumetric display of a cuboid appear high in the air opposite his seat at the table. The cuboid is filled with a cyan glow in which a “talking head” instructor takes up most of the space. On the left is a column of five still images of other artificially intelligent instructors. Each image has two vertical sliders on the left, but the meaning of these sliders is not made clear. In the upper right is an obscure diagram that looks a little like a constellation with some inscrutable text below it.

On the right side of the cuboid projection, we see some other information in pinks, blues, and cyans. This information appears to be text, bar charts, and line graphs. It is not immediately usable to the learner, so perhaps it is material about the entire course, for when the lessons are paused: notes about progress towards a learning goal, advice for further study, or next steps. Presuming this is a general-purpose interface rather than a custom one made just for David, this information could be the student’s progress notes for an attending human instructor.

We enter the scene with the AI saying, “…Whilst this manner of articulation is attested in Indo-European descendants as a purely paralinguistic form, it is phonemic in the ancestral form dating back five millennia or more. Now let’s attempt Schleicher’s Fable. Repeat after me.”

In the lower part of the image is a waveform of the current phrase being studied. In the lower right is the written text of the phrase being studied, in what looks like a simplified phonetic alphabet. As the instructor speaks this fable, each word is highlighted in the written form. When he is done, he prompts David to repeat it.

akʷunsəz dadkta,
hwælna nahast
təm ghεrmha
vagam ugεntha,