Wakandan Med Table

When Agent Ross is shot in the back during Klaue’s escape from the Busan field office, T’Challa stuffs a kimoyo bead into the wound to staunch the bleeding, but the wound is still serious enough that the team must bring him back to Wakanda for healing. They float him to Shuri’s lab on a hover-stretcher.

Here Shuri gets to say the juicy line, “Great. Another white boy for us to fix. This is going to be fun.”
Sorry about the blurry screen shot, but this is the most complete view of the bay.

The hover-stretcher gets locked into place inside a bay. The bay is a small room in the center of Shuri’s lab, open on two sides. The walls are covered in a gray pattern suggesting a honeycomb. A bas-relief volumetric projection displays some medical information about the patient like vital signs and a subtle fundus image of the optic nerve.

Shuri holds her hand flat and raises it above the patient’s chest. A volumetric display of nine of his thoracic vertebrae rises up in response. One of the vertebrae is highlighted in bright red. A section of the wall display shows the same information in 2D, cyan with orange highlights. That display section slides out from the wall to draw observers’ attention. Hexagonal tiles flip behind the display for some reason, but produce no change in the display.

Shuri reaches her hands up to the volumetric vertebrae, pinches her forefingers and thumbs together, and pulls them apart. In response, the space between the vertebrae expands, allowing her to see the top and bottom of the body of the vertebra.

She then turns to the wall display, and reading something there, tells the others that he’ll live. Her attention is pulled away with the arrival of W’Kabi, bringing news of Killmonger. We do not see her initiate a treatment in the scene. We have to presume that she did it between cuts. (There would have to be a LOT of confidence in an AI’s ability to diagnose and determine treatment before they would let Griot do that without human input.)

We’ll look more closely at the hover-stretcher display in a moment, but for now let’s pause and talk about the displays and the interaction of this beat.

A lab is not a recovery room

This doesn’t feel like a smart environment in which to hold a patient. We can bypass a lot of the usual hospital concerns of sterilization (it’s a clean room) or readily-available equipment (since they are surrounded by programmable vibranium dust controlled by an AGI) or even risk of contamination (something something AI). I’m mostly thinking about the patient having an environment that promotes healing: natural light, quiet or soothing music, plants, furnishings, and serene interiors. Having him there certainly means that Shuri’s team can keep an eye on him, and provide some noise that may act as a stimulus, but don’t they have actual hospital rooms in Wakanda?

Why does she need to lift it?

The VP starts in his chest, but why? If it had started out as a “translucent skin” illusion, like we saw in Lost in Space (1998, see below), then that might make sense. She would want to lift it to see it in isolation from the distracting details of the body. But it doesn’t start this way, it starts embedded within him?!

The “translucent skin” display from Lost in Space (1998)

It’s a good idea to have a representation close to the referent, to make for easy comparison between them. But to start the VP within his opaque chest just doesn’t make sense.

This is probably the wrong gesture

In the gestural interfaces chapter of Make It So, I described a pidgin that has been emerging in sci-fi, consisting of seven “words.” The last of these is “Pinch and Spread to Scale.” Now, there is nothing sacred about this gestural language, but it has echoes in the real world as well. For one example, Google’s VR painting app Tilt Brush uses “spread to scale.” So as an increasingly common norm, it should only be violated with good reason. In Black Panther, Shuri uses spread to mean “spread these out,” even though she starts the gesture near the center of the display and pulls out at a 45° angle. This speaks much more to scaling than to spreading. It’s a mismatch, and I can’t see a good reason for it. Even if it’s “what works for her,” gestural idiolects hinder communities of practice, and so should be avoided.

Better would have been pinching on one end of the spine and hooking her other index finger to spread it apart without scaling. The pinch is quite literal for “hold” and the hook quite literal for “pull.” This would let scale be scale, and “hook-pull” to mean “spread components along an axis.”

Model from https://justsketch.me/

If we were stuck with the footage of Shuri doing the scale gesture, then it would have made more sense to scale the display, and fade the white vertebrae away so she could focus on the enlarged, damaged one. She could then turn it with her hand to any arbitrary orientation to examine it.

An object highlight is insufficient

It’s quite helpful for an interface that can detect anomalies to help focus a user’s attention there. The red highlight for the damaged vertebra certainly helps draw attention. Where’s the problem? Ah, yes. There’s the problem. But it’s more helpful for the healthcare worker to know the nature of the damage, what the diagnosis is, to monitor the performance of the related systems, and to know how the intervention is going. (I covered these in the medical interfaces chapter of Make It So, if you want to read more.) So yes, we can see which vertebra is damaged, but what is the nature of that damage? A slipped disc should look different than a bone spur, which should look different than one that’s been cracked or shattered from a bullet. The red-highlight display helps for an instant read in the scene, but fails on close inspection and would be insufficient in the real world.

This is not directly relevant to the critique, but it’s interesting that spinal VPs have been around since 1992: Star Trek: The Next Generation, “Ethics” (Season 5, Episode 16).

Put critical information near the user’s locus of attention

Why does Shuri have to turn and look at the wall display at all? Why not augment the volumetric projection with the data that she needs? You might worry that it could obscure the patient (and thereby hinder direct observations) but with an AGI running the show, it could easily position those elements to not occlude her view.

Compare this display, which puts a waveform directly adjacent to the brain VP. Firefly, “Ariel” (Episode 9, 2002).

Note that Shuri is not the only person in the room interested in knowing the state of things, so a wall display isn’t bad, but it shouldn’t be the only augmentation.

Lastly, why does she need to tell the others that Ross will live? If there was significant risk of his death, there should be unavoidable environmental signals: klaxons or medical alerts. So unless we are to believe T’Challa has never encountered a single medical emergency before (even in media), this is a strange thing for her to have to say. Of course we understand she’s really telling us in the audience that we don’t need to wonder about this plot development any more, but it would be better, diegetically, if she had confirmed the time-to-heal, like, “He should be fine in a few hours.”

Alternatively, it would be hilarious turnabout if the AI Griot had simply not been “trained” on data that included white people, and “could not see him,” which is why she had to manually manage the diagnosis and intervention, but that would have massive impact on the remote piloting and other scenes, so isn’t worth it. Probably.

Thoughts toward a redesign

So, all told, this interface and interaction could be much better fit-to-purpose. Clarify the gestural language. Lose the pointless flipping hexagons. Simplify the wall display for observers to show vitals, diagnosis and intervention, as well as progress toward the goal. Augment the physician’s projection with detailed, contextual data. And though I didn’t mention it above, of course the bone isn’t the only thing damaged, so show some of the other damaged tissues, and some flowing, glowing patterns to show where healing is being done along with a predicted time-to-completion.

Stretcher display

Later, when Ross is fully healed and wakes up, we see a shot of the med table from above. Lots of cyan and orange, and *typography shudder* stacked type. Orange outlines seem to indicate controls, though they bear symbols rather than full labels, which we know are better for learnability and infrequent reuse. (Linguist nerds: Yes, Wakandan is alphabetic rather than logographic.)

These feel mostly like FUIgetry, with the exception of a subtle respiration monitor on Ross’ left. But it shows current state rather than tracked over time, so still isn’t as helpful as it could be.

Then when Ross lifts his head, the hexagons begin to flip over, disabling the display. What? Does this thing only work when the patient’s head is in the exact right space? What happens when they’re coughing, or convulsing? Wouldn’t a healthcare worker still be interested in the last-recorded state of things? This “instant-off” makes no sense. Better would have been just to let the displays fade to a gray to indicate that it is no longer live data, and to have delayed the fade until he’s actually sitting up.
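To be concrete about the suggested alternative: it’s a staleness policy, in which brief sensor dropouts (a cough, a lifted head) are tolerated, and anything longer switches the display to a grayed, last-recorded state rather than blanking it. A minimal Python sketch; the grace period and data shapes are entirely made up for illustration:

```python
GRACE_PERIOD_S = 2.0  # invented threshold: ride through coughs and head-lifts

def display_state(last_reading, seconds_since_contact):
    """Decide how the med-table display should render its patient data.

    Returns a (mode, data) tuple. Note the last reading is always kept,
    never discarded, so healthcare workers can still see recorded state.
    """
    if seconds_since_contact < GRACE_PERIOD_S:
        return ("live", last_reading)        # full color, treated as current
    return ("stale-gray", last_reading)      # grayed out: no longer live data
```

The key design choice is that loss of contact changes presentation, not content; an “instant-off” destroys exactly the information a clinician would want next.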

All told, the Wakandan medical interfaces are the worst of the ones seen in the film. Lovely, and good for a quick narrative hit, but bad models for real-world design, or even close inspection within the world of Wakanda.


MLK Day Matters

Each post in the Black Panther review is followed by actions that you can take to support black lives.

Today is Martin Luther King Day. Normally there would be huge gatherings and public speeches about his legacy and the current state of civil rights. But the pandemic is still raging, and with the Capitol in Washington, D.C. having seen just last week an armed insurrection by supporters of outgoing and pouty loser Donald Trump (in case that WP article hasn’t been moved yet, here’s the post under its watered-down title), there are worries about additional racist terrorism and violence.

So today we celebrate virtually, by staying at home, re-experiencing his speeches and letters, and listening to the words of black leaders and prominent thinkers all around us, reminding us of the arc of the moral universe, and all the work it takes to bend it toward justice.

With the Biden team taking the reins on Wednesday, and Kamala Harris as our first female Vice President of color, things are looking brighter than they have in 4 long, terrible years. But Trump would have gotten nowhere if there hadn’t been a voting bloc and party willing to indulge his racist fascism. There’s still much more to do to dismantle systemic racism in the country and around the world. Let’s read, reflect, use whatever platforms and resources we are privileged to have, and act.

Agent Ross’ remote piloting

Remote operation appears twice during Black Panther. This post describes the second, in which CIA Agent Ross remote-pilots the Talon in order to chase down cargo airships carrying Killmonger’s war supplies. The prior post describes the first, in which Shuri remotely drives an automobile.

In this sequence, Shuri equips Ross with kimoyo beads and a bone-conducting communication chip, and tells him that he must shoot the cargo ships down before they cross beyond the Wakandan border. As soon as she tosses a remote-control kimoyo bead onto the Talon, Griot announces to Ross in the lab, “Remote piloting system activated,” and creates a piloting seat out of vibranium dust for him. Savvy watchers may wonder at this, since Okoye pilots the thing by meditation and Ross would have no meditation-pilot training, but Shuri explains to him, “I made it American style for you. Get in!” He does, grabs the sparkly black controls, and gets to business.

The most remarkable thing to me about the interface is how seamlessly the Talon can be piloted by vastly different controls. Meditation brain control? Can do. Joystick-and-throttle? Just as can do.

Now, generally, I have a beef with the notion of hyperindividualized UI tailoring—it prevents vital communication across a community of practice (read more about my critique of this goal here)—but in this case, there is zero time for Ross to learn a new interface. So sure, give him a control system with which he feels comfortable to handle this emergency. It makes him feel more at ease.

The mutable nature of the controls tells us that there is a robust interface layer that is interpreting whatever inputs the pilot supplies and applying them to the actuators in the Talon. More on this below. Spoiler: it’s Griot.
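One way to picture such an interpretation layer: each control scheme is a frontend that normalizes its raw input into a shared piloting-intent structure, and the ship’s actuators only ever see intents. This is a speculative Python sketch; every name, field, and mapping here is my invention for illustration, not anything specified in the film:

```python
from dataclasses import dataclass

@dataclass
class PilotIntent:
    """Normalized piloting intent, independent of the input device."""
    pitch: float      # -1.0 (dive) .. 1.0 (climb)
    roll: float       # -1.0 (left) .. 1.0 (right)
    throttle: float   # 0.0 .. 1.0
    fire: bool = False

class JoystickFrontend:
    """'American style': map stick axes and trigger to intents."""
    def read(self, raw) -> PilotIntent:
        return PilotIntent(pitch=raw["stick_y"], roll=raw["stick_x"],
                           throttle=raw["throttle"], fire=raw["trigger"])

class MeditationFrontend:
    """Thought-command: decoded brain signals arrive already intent-shaped."""
    def read(self, raw) -> PilotIntent:
        return PilotIntent(**raw)

def apply_to_actuators(intent: PilotIntent) -> dict:
    """The Talon-side layer consumes intents, never raw device input."""
    return {"elevons": intent.pitch, "ailerons": intent.roll,
            "engines": intent.throttle, "weapons": intent.fire}
```

Because everything downstream of `PilotIntent` is frontend-agnostic, swapping meditation controls for a joystick changes nothing about how the Talon flies, which is exactly the seamlessness the film shows.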

Too sparse HUD

The HUD presents a simple circle-in-a-triangle reticle that lights up red when a target is in sights. Otherwise it’s notably empty of augmentation. There’s no tunnel in the sky display to describe the ideal path, or proximity warnings about skyscrapers, or airspeed indicator, or altimeter, or…anything. This seems a glaring omission since we can be certain other “American-style” airships have such things. More on why this might be below, but spoiler: It’s Griot.

What do these controls do, exactly?

I take no joy in gotchas. That said…

  • When Ross launches the Talon, he does so by pulling the right joystick backward.
  • When he shoots down the first cargo ship over Birnin Zana, he pushes the same joystick forward as he pulls the trigger, firing energy weapons.

Why would the same control do both? It’s hard to believe it’s modal. Extradiegetically, this is probably an artifact of actor Martin Freeman’s just doing what feels dramatic, but for a real-world equivalent I would advise against having physical controls have wholly different modes on the same grip, lest we risk confusing pilots on mission-critical tasks. But spoiler…oh, you know where this is going.

It’s Griot

Diegetically, Shuri is flat-out wrong that Ross is an experienced pilot. But she also knew that it didn’t matter, because her lab has him covered anyway. Griot is an AI with a brain interface, and can read Ross’ intentions, handling all the difficult execution itself.

This would also explain the lack of better HUD augmentation. That absence seems especially egregious considering that the first cargo ship was flying over a crowded city at the time it was being targeted. If Ross had fired in the wrong place, the cargo ship might have crashed into a building, or down to the bustling city street, killing people. But, instead, Griot quietly, precisely targets the ship for him, to ensure that it would crash safely in nearby water.

This would also explain how wildly different interfaces can control the Talon with similar efficacy.

A stained-glass image of William of Ockham. A modern blackletter caption reads, “It was always Griot.”

So, Occam’s-apology says, yep, it’s Griot.

An AI-wizard did it?

In the post about Shuri’s remote driving, I suggested that Griot was also helping her execute driving behind the scenes. This hearkens back to both the Iron HUD and Doctor Strange’s Cloak of Levitation. It could be that the MCU isn’t really worrying about the details of its enabling technologies, or that this is a brilliant model for our future relationship with technology. Let us feel like heroes, and let the AI manage all the details. I worry that I’m building myself into a wizard-did-it pattern, inserting AI for wizard. Maybe that’s worth another post all its own.

But there is one other thing about Ross’ interface worth noting.

The sonic overload

When the last of the cargo ships is nearly at the border, Ross reports to Shuri that he can’t chase it, because Killmonger-loyal dragon flyers have “got me trapped with some kind of cables.” She instructs him to, “Make an X with your arms!” He does. A wing-like display appears around him, confirming its readiness.

Then she shouts, “Now break it!” he does, and the Talon goes boom shaking off the enemy ships, allowing Ross to continue his pursuit.

First, what a great gesture for this function. Ordinarily, Wakandans are piloting the Talon, and each of them would be deeply familiar with this gesture, and even prone to think of it when executing a Hail Mary move like this.

Second, when an outsider needed to perform the action, why didn’t she just tell Griot to do it? If there’s an interpretation layer in the system, why not speak directly to that controller? It might be so the human knows how to do it themselves next time, but this is the last cargo ship he’s been tasked with chasing, and there’s little chance of his officially joining the Wakandan air force. The emergency will be over after this instance. Maybe Wakandans have a principle that they are first supposed to engage the humans before bringing in the machines, but that’s heavy conjecture.

Third, I have a beef about gestures—there are often zero affordances to tell users what gestures they can do, and what effects those gestures will have. If Shuri was not there to answer Ross’ urgent question, would the mission have just…failed? Seems like a bad design.

How else could he have known he could do this? If Griot is on board, Griot could have mentioned it. But avoiding the wizard-did-it solutions, some sort of context-aware display could detect that the ship is tethered to something, and display the gesture on the HUD for him. This violates the principle of letting the humans be the heroes, but would be a critical inclusion in any similar real-world system.
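A context-aware affordance layer like that could be as simple as a table mapping detected ship states to the gestures they unlock, surfaced as HUD hints whenever a state is recognized. A sketch in Python; every state name and gesture here is invented for illustration:

```python
# Map detected ship states to (gesture, effect) affordance hints.
# All states and gestures below are hypothetical examples.
GESTURE_AFFORDANCES = {
    "tethered": ("cross-arms-then-break", "Sonic overload: shake off attached cables"),
    "stall":    ("palms-down-push", "Recover: pitch down and regain airspeed"),
}

def hud_hints(detected_states):
    """Return HUD hint lines for every detected state with an available gesture."""
    hints = []
    for state in detected_states:
        if state in GESTURE_AFFORDANCES:
            gesture, effect = GESTURE_AFFORDANCES[state]
            hints.append(f"[{state.upper()}] {gesture} -> {effect}")
    return hints
```

With something like this, a tethered Talon would show the X-arms gesture on the HUD unprompted, and the mission wouldn’t hinge on Shuri happening to be on comms.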

Any time we are faced with “intuitive” controls that don’t map 1:1 to the thing being controlled, we’re faced with similar problems. (We’ve seen the same problems in Sleep Dealer and Lost in Space (1998). Maybe that’s worth its own write-up.) Some controls won’t map to anything. More problematic is that there will be functions which don’t have controls. Designers can’t rely on having a human cavalry like Shuri there to save the day, and should take steps to find ways that the system can inform users of how to activate those functions.

Fit to purpose?

I’ve had to presume a lot about this interface. But if those things are correct, then, sure, this mostly makes it possible for Ross, a novice to piloting, to contribute something to the team mission, while upholding the directive that AI Cannot Be Heroes.

If Griot is not secretly piloting, and that directive is not really a thing, then the HUD needs more work, I can’t diegetically explain the controls, and they need to develop just-in-time suggestions to patch the gap of the mismatched interface.


Black Georgia Matters

Each post in the Black Panther review is followed by actions that you can take to support black lives. As this critical special election is still coming up, this is a repeat of the last one, modified to reflect passed deadlines.

The state flag of Georgia, whose motto clearly violates the doctrine of separation of church and state.
Always on my mind, or at least until July 06.

Despite outrageous, anti-democratic voter suppression by the GOP, for the first time in 28 years, Georgia went blue for the presidential election, verified with two hand recounts. Credit to Stacey Abrams and her team’s years of effort to get out the Georgian—and particularly the powerful black Georgian—vote.

But the story doesn’t end there. Though the Biden/Harris ticket won the election, if the Senate stays majority red, Moscow Mitch McConnell will continue the infuriating obstructionism with which he held back Obama’s efforts in office for eight years. The Republicans will, as they have done before, ensure that nothing gets done.

To start to undo the damage the fascist and racist Trump administration has done, and maybe make some actual progress in the US, we need the Senate majority blue. Georgia is providing that opportunity. Neither of the wretched Republican incumbents got 50% of the vote, resulting in a special runoff election January 5, 2021. If these two seats go to the Democratic challengers, Warnock and Ossoff, it will flip the Senate blue, and the nation can begin to seriously right the sinking ship that is America.

Photograph: Erik S Lesser/EPA

What can you do?

If you live in Georgia, vote blue, of course. You can check your registration status online. You can also help others vote. Important dates to remember, according to the Georgia website:

  • 14 DEC Early voting begins
  • 05 JAN 2021 Final day of voting

Residents can also volunteer to become a canvasser for either of the campaigns, though it’s a tough thing to ask in the middle of the raging pandemic.

The rest of us (yes, even non-American readers) can contribute either to the campaigns directly using the links above, or to Stacey Abrams’ Fair Fight campaign. From the campaign’s web site:

We promote fair elections in Georgia and around the country, encourage voter participation in elections, and educate voters about elections and their voting rights. Fair Fight brings awareness to the public on election reform, advocates for election reform at all levels, and engages in other voter education programs and communications.

We will continue moving the country into the anti-racist future regardless of the runoff, but we can make much, much more progress if we win this election. Please join the efforts as best you can even as you take care of yourself and your loved ones over the holidays. So very much depends on it.

Black Reparations Matter

This is timely, so I’m adding this on as well rather than waiting for the next post: A bill is in the house to set up a commission to examine the institution of slavery and its impact and make recommendations for reparations to Congress. If you are an American citizen, please consider sending a message to your congresspeople asking them to support the bill.

Image, uncredited, from the ACLU site. Please contact me if you know the artist.

On this ACLU site you will find a form and suggested wording to help you along.

Kimoyo Beads

One of the ubiquitous technologies seen in Black Panther is the kimoyo bead. They’re liberally scattered all over the movie like tasty, high-tech croutons. These marble-sized beads are made of vibranium and are more core to Wakandans’ lives than cell phones are to ours. Let’s review the six uses seen in the film.

1. Contact-EMP bombs

We first see kimoyo beads when Okoye equips T’Challa with a handful to drop on the kidnapper caravan in the Sambisa forest. As he leaps from the Royal Talon, he flings these, which flatten as they fall, and guide themselves to land on the hoods of the caravan. There they emit an electromagnetic pulse that stops the vehicles in their tracks. It is a nice interaction that does not require much precision or attention from T’Challa.

2. Comms

Wakandans wear bracelets made of 11 kimoyo beads around their wrists. If they pull the comms bead and place it in the palm, it can project very lifelike volumetric displays as part of realtime communication. It is unclear why the bead can’t just stay on the wrist and project at an angle to face the user’s line of sight, as it does when Okoye presents to tribal leaders (below).

We see a fascinating interaction when T’Challa and W’Kabi receive a call at the same time, and put their bracelets together to create a conference call with Okoye.

The scaled-down version of the projection introduces many of the gaze matching problems identified in the book. Similarly to those scenes in Star Wars, we don’t see the conversation from the other side. Is Okoye looking up at giant heads of T’Challa and W’Kabi? Unlikely. Wakanda is advanced enough to manage gaze correction in such displays.

Let me take a moment to appreciate how clever this interaction is from a movie maker’s perspective. It’s easy to imagine each of them holding their own bead separately and talking to individual instances of Okoye’s projection. (Imagine being in a room with a friend and both of you are on a group call with a third party.) But in the scene, she turns to address both T’Challa and W’Kabi. Since the system is doing body-and-face gaze correction, the two VP displays would look slightly different, possibly confusing the audience into thinking these were two separate people on the call. Wakandans would be used to understanding these nuances, but we poor non-Wakandans are not.

Identical Okoyes ensure that (at least) one of the displays is looking at something weird. It’s confusing.
Having gaze correction so both Okoyes are looking at T’Challa when she’s talking to him makes it look like there are two different characters. It’s also confusing.

The shared-display interaction helps bypass these problems and make the technology immediately understandable and seamless.

Later Shuri also speaks with Okoye via communication bead. During this conversation, Shuri removes another bead, and tosses it into a display to show an image and dossier of Killmonger. Given that she’s in her lab, it’s unclear why this gesture is necessary rather than, say, just looking toward a display and thinking, “Show me,” letting the AI Griot interpret from the context what to display.

A final communication happens immediately after, as Shuri summons T’Challa to the lab to learn about Killmonger. In this screenshot, it’s clear that the symbol for the comms bead is an asterisk or star, which mimics the projection rays of the display, and so has some nice semantics to help users learn which symbols do what.

3. Presentation

In one scene, Okoye gives the tribe rulers a sitrep using her kimoyo beads as a projector. Here she is showing the stolen Wakandan artifact. Readers of the book will note the appearance of projection rays, which are standard sci-fi signals that what is seen is a display. A lovely detail in the scene is how Okoye uses a finger on her free hand to change the “slide” to display Klaue. (It’s hard to see the exact gesture, but it looks like she presses the projection bead.) We know from other scenes in the movie that the beads are operated by thought-command. But that would not prevent a user from including gestures as part of the brain pattern that triggers an event, and it would make a nice second-channel confirmation, as discussed in the UX of Speculative Brain-Computer Inputs post.

4. Remote piloting

When T’Challa tours Shuri’s lab, she introduces him to remote-access kimoyo beads. They are a little bigger than regular beads and have a flared, articulated base. (Why can’t they just morph in mid-air like the ones we see in the kidnapper scene?) These play out in the following scene, when the strike team needs to commandeer a car to chase Klaue’s caravan. Okoye tosses one on the hood of a parked car, its base glows purple, and thereafter Shuri hops into a vibranium-dust simulacrum of the car in her lab and remotely operates it.

A quick note: I know that the purple glow is there for the benefit of the audience, but it certainly draws attention to itself, which it might not want to do in the real world.

In the climactic battle of the tribes with Killmonger, Shuri prints a new bracelet and remote control bead for Agent Ross. She places the bracelet on him to enable him to remote pilot the Royal Talon. It goes by very quickly, and the scene is lit quite sparsely, but the moment she puts it on him, you can see that the beads are held together magnetically.

5. Eavesdropping

When Agent Ross is interrogating the captured Klaue, we get a half-second shot to let us know that a kimoyo bead has been placed on his shoulder, allowing T’Challa, Okoye, and Nakia to eavesdrop on the conversation. The output is delivered by a flattened bone-conducting speaker bead behind their left ears.

6. Healing

Later in the scene, when Killmonger’s bomb grievously wounds Agent Ross in his spine, T’Challa places one of Nakia’s kimoyo beads onto the wound, stabilizing Ross long enough to ferry him to Wakanda where Shuri can fully tend to him. The wound conveniently happens to be kimoyo-bead sized, but I expect that given its shape-shifting powers, it could morph to form a second-skin over larger wounds.


I wondered if kimoyo beads were just given to Wakandan royalty, but it’s made clear in the scene where T’Challa and Nakia walk through the streets of Birnin Zana that every citizen has a bracelet. There is no direct evidence in the film, but given the pro-social-ness throughout, I want to believe that all citizens have free access to the beads, equipping each of them to participate equitably in the culture.

So, most of the interaction is handled through thought-command with gestural augmentation. This means that most of our usual concerns of affordances and constraints are moot. The one thing that bears some comment is the fact that there are multiple beads on the bracelet with different capabilities. How does a user know which bead does what?

As long as the beads can do their job in place on the wrist, I don’t think it matters. As long as all of the beads are reading the user’s thoughts, only the one that can respond need respond. The others can disregard the input. In the real world you’d need to make sure that one thought isn’t interpretable as multiple things, a problem discussed on my team at IBM as disambiguation. Or, if it is, you must design an interaction where the user can help disambiguate the input, or tell the system which meaning they intend. We never see this edge case in Black Panther.
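One way to picture that rule: every bead advertises the intents it can handle, a shared dispatcher collects matches for each incoming thought, and only asks the user to clarify when more than one bead claims it. A speculative Python sketch; the bead names and capabilities are invented for illustration:

```python
# Hypothetical bead capability registry: which intents each bead handles.
# "project" deliberately appears twice to show the ambiguous case.
BEAD_CAPABILITIES = {
    "comms":  {"call", "project"},
    "remote": {"pilot", "drive"},
    "med":    {"stabilize", "project"},
}

class AmbiguousIntent(Exception):
    """Raised when multiple beads claim the same thought-intent."""
    def __init__(self, candidates):
        super().__init__(f"Did you mean: {', '.join(candidates)}?")
        self.candidates = candidates

def dispatch(intent):
    """Route a thought-intent to the single bead that can handle it."""
    matches = [bead for bead, caps in BEAD_CAPABILITIES.items() if intent in caps]
    if len(matches) == 1:
        return matches[0]      # unambiguous: only that bead responds
    if not matches:
        return None            # no bead understands; silently disregard
    raise AmbiguousIntent(matches)  # ask the user which meaning they intend
```

The `AmbiguousIntent` path is the interaction the film never shows: the system surfacing the collision and letting the user pick, rather than guessing.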

It seems that some of the beads have specialized functions that cannot be performed by the others. Each bead has several symbols engraved into it, the indentations of which glow white for easy identification. The glow is not persistent across all uses, so it must be either context-aware and/or a setting that users can think to change. But even when not lit, the symbols are clear, and clearly distinguishable, so once the user learns the symbols, the labeling should help.


Black Votes Matter

Today is an important day in the United States. It’s election day 2020, one of the most important days in U.S. politics, ever. Among Trump’s litany of outrageous lies across his presidency is this whopper: “I have done more for Black Americans than anybody, except for the possible exception of Abraham Lincoln.” (Pause for your spit take and cleaning your screen.)

As infuriating and insulting as this statement is emotionally (like, fuck you for adding “possible” in there, like it’s somehow possible that you’ve done more than freed our black citizens from slavery, you maggot-brained, racist, malignant narcissist), let’s let the Brookings Institution break down why, if you believe Black Lives Matter, you need to get out there and vote blue all the way down the ticket.

https://www.cnn.com/2020/11/02/us/ocoee-massacre-100th-anniversary-trnd/index.html

You should read that whole article, but some highlights/reminders:

  • Trump ended racial sensitivity training, and put a ban on trainings that utilize critical race theory
  • Hate crimes increased over 200% in places where Trump held a campaign rally in 2016
  • He dismissed the Black Lives Matters movement, said there were “fine people” among white supremacist groups, and rather than condemning the (racist, not gay) Proud Boys, told them to “stand by.”
  • Not a single one of his 53 confirmed appeals court judges is black.
  • The criminal mishandling of the COVID-19 pandemic has killed twice as many black Americans as it has white Americans. (Don’t forget he fired the pandemic response team.)

If you are reading this on election day, and have not already done so, please go vote blue. Know that if you are in line when the polls officially close, they have to stay open for the entire line to vote. If you have voted, please help others in need. More information is below.

If you are reading this just after election day, we have every evidence that Trump is going to try and declare the election rigged if he loses (please, please let it be when he loses to a massive blue wave). You can help set the expectation among your circle of friends, family, and work colleagues that we won’t know the final results today. We won’t know it tomorrow. We may have a better picture at the end of the week, but it will more likely take until late November to count everyone’s vote, and possibly until mid December to certify everyone’s vote.

And that’s what we do in a liberal democracy. We count everyone’s vote, however long that takes. To demand it in one day during a pandemic is worse than a toddler throwing an “I want it now” tantrum. And we are so very sick of having a toddler in this position.

By Christian Bloom

The Royal Talon piloting interface

Since my last post, news broke that Chadwick Boseman has passed away after a four-year battle with cancer. He kept his struggles private, so the news was sudden and hard-hitting. The fandom is still reeling. Black people, especially, have lost a powerful, inspirational figure. The world has also lost a courageous and talented young actor. Rise in Power, Mr. Boseman. Thank you for your integrity, bearing, and strength.

Photo CC BY-SA 2.0, by Gage Skidmore.

Black Panther’s airship is a triangular vertical-takeoff-and-landing vehicle called the Royal Talon. We see its piloting interface twice in the film.

The first time is near the beginning of the movie. Okoye and T’Challa are flying at night over the Sambisa forest in Nigeria. Okoye sits in the pilot’s seat in a meditative posture, facing a large forward-facing bridge window with a heads-up display. A horseshoe-shaped shelf around her is filled with unactivated vibranium sand. Around her left wrist, her kimoyo beads glow amber, projecting a volumetric display around her forearm.

She announces to T’Challa, “My prince, we are coming up on them now.” As she disengages from the interface, retracting her hands from the pose, the kimoyo projection shifts and shrinks. (See more detail in the video clip, below.)

The second time we see it is when they pick up Nakia and save the kidnapped girls. On their way back to Wakanda we see Okoye again in the pilot’s seat. No new interactions are seen in this scene, though we linger on the shot from behind, with its glowing seatback looking like some high-tech spine.

Now, these brief glimpses don’t give a review a lot to go on. But for the sake of completeness, let’s talk about that volumetric projection around her wrist. I note that it is a lovely echo of Dr. Strange’s interface for controlling the Time Stone inside the Eye of Agamotto.

Wrist projections are going to be all the rage at the next Snap, I predict.

But we never really see Okoye look at this VP or use it. Cross-referencing the Wakandan alphabet, those five symbols at the top translate to 1 2 K R I, which doesn’t tell us much. (It doesn’t match the letters seen on the HUD.) It might be a visual do-not-disturb signal to onlookers, but if there’s other meaning that the letters and petals are meant to convey to Okoye, I can’t figure it out. At worst, I think having the wrist movements of one hand emphasized in your peripheral vision with a glowing display is a dangerous distraction from piloting. Her eyes should be on the “road” ahead of her.

The image has been flipped horizontally to illustrate how Okoye would see the display.

Similarly, we never get a good look at the HUD, or see Okoye interact with it, so I’ve got little to offer other than a mild critique that it looks full of pointless ornamental lines, many of which would obscure things in her peripheral vision, which is where humans need the most help detecting things other than motion. But modern sci-fi interfaces generally (and the MCU in particular) are in a baroque period, and this is partly how audiences recognize sci-fi-ness.

I also think that requiring a pilot to maintain full lotus is a little much, but certainly, if there’s anyone who can handle it, it’s the leader of the Dora Milaje.

One remarkable thing to note is that this is the first brain-input piloting interface in the survey. Okoye thinks what she wants the ship to do, and it does it. I expect, given what we know about kimoyo beads in Wakanda (more on these in a later post), that she is sending thoughts to the bracelet, and the beads are conveying the instructions to the ship. As a way to show Okoye’s self-discipline and Wakanda’s incredible technological advancement, this is awesome.

Unfortunately, I don’t have good models for evaluating this interaction. And I have a lot of questions. As with gestural interfaces, how does she avoid a distracted thought from affecting the ship? Why does she not need a tunnel-in-the-sky assist? Is she imagining what the ship should do, or a route, or something more abstract, like her goals? How does the ship grant her its field awareness for a feedback loop? When does the vibranium dashboard get activated? How does it assist her? How does she hand things off to the autopilot? How does she take it back? Since we don’t have good models, and it all happens invisibly, we’ll have to let these questions lie. But that’s part of us, from our less-advanced viewpoint, having to marvel at this highly-advanced culture from the outside.


Black Health Matters

Each post in the Black Panther review is followed by actions that you can take to support black lives.

Thinking back to the terrible loss of Boseman: Fuck cancer. (And not to imply that his death was affected by this, but also:) Fuck the racism that leads to worse medical outcomes for black people.

One thing you can do is to be aware of the diseases that disproportionately affect black people (diabetes, asthma, lung scarring, strokes, high blood pressure, and cancer) and be aware that no small part of these poorer outcomes is racism, systemic and individual. Listen to Dorothy Roberts’ TED talk, calling for an end to race-based medicine.

If you’re the reading sort, check out the books Black Man in a White Coat by Damon Tweedy, or the infuriating history covered in Medical Apartheid by Harriet Washington.

If you are black, in Boseman’s memory, get screened for cancer as often as your doctor recommends it. If you think you cannot afford it and you are in the USA, this CDC website can help you determine your eligibility for free or low-cost screening: https://www.cdc.gov/cancer/nbccedp/screenings.htm. If you live elsewhere, you almost certainly have a better healthcare system than we do, but a quick search should tell you your options.

Cancer treatment is equally successful for all races. Yet black men have a 40% higher cancer death rate than white men and black women have a 20% higher cancer death rate than white women. Your best bet is to detect it early and get therapy started as soon as possible. We can’t always win that fight, but better to try than to find out when it’s too late to intervene. Your health matters. Your life matters.

Dr. Strange’s augmented reality surgical assistant

We’re actually done with all of the artifacts from Doctor Strange. But there’s one last kind-of interface that’s worth talking about, and that’s when Strange assists with surgery on his own body.

After being shot with a soul-arrow by the zealot, Strange is in bad shape. He needs medical attention. He recovers his sling ring and creates a portal to the emergency room where he once worked. Stumbling with the pain, he manages to find Dr. Palmer and tell her he has a cardiac tamponade. They head to the operating theater and get Strange on the table.

DoctorStrange_AR_ER_assistant-02.png

When Strange passes out, his “spirit” is ejected from his body as an astral projection. Once he realizes what’s happened, he gathers his wits and turns to observe the procedure.

DoctorStrange_AR_ER_assistant-05.png

When Dr. Palmer approaches his body with a pericardiocentesis needle, Strange manifests so she can sense him and recommends that she aim “just a little higher.” At first she is understandably scared, but once he explains what’s happening, she gets back to business, and he acts as a virtual coach.


Jefferson Projection

SWHS-musicVP-01

When Imperial troopers intrude to search the house, one of the bullying officers takes interest in a device sitting on the dining table. It’s the size of a sewing machine, with a long handle and a set of thumb toggles along the top, like old cassette-tape-recorder buttons.

Saun convinces the officer to sit down, stretches the thin script with a bunch of pointless fiddling of a volume slider and pantomimed delays, and at last fumbles the front of the device open. Hinged at the bottom like a drawbridge, it exposes a small black velvet display space. Understandably exasperated, the officer stands up to shout, “Will you get on with it?” Saun presses a button on the opened panel, and the searing chord of an electric guitar can be heard at once.

SWHS-musicVP-04

Inside the drawbridge-space a spot of pink light begins to glow, mesmerizing the officer who, moments ago, was bent on brute intimidation but now spends the next five minutes and 23 seconds grinning dopily at the volumetric performance by Jefferson Starship.

During the performance, six lights blink in a pattern in the upper right-hand corner of the display. When the song finishes, the device goes silent. No other interactions are seen with it.

tappa

Many questions. Why is there a whole set of buttons to open the thing? Is this the only thing it can play? If not, how do you select another performance? Is it those unused buttons on the top? Why are the buttons unlabeled? Is Jefferson Starship immortal? How is it that they have barely aged in the long, long time since this was recorded? Or was this volumetric recording somehow sent back in time? Where is the button that Saun pressed to start the playback? If there was no button, and it was the entire front panel, why doesn’t it turn on and off while the officer taps (see above)? What do the little lights do other than distract? Why is the glow pink rather than Star-Wars-standard blue? Since volumetric projections are most often free-floating, why does this appear in a lunchbox? Since there already exist ubiquitous display screens, why would anyone haul this thing around? How does this officer keep his job?

Perhaps it’s best that these questions remain unanswered. For if anything were substantially different, we would risk losing this image, of the silhouette of the lead singer and his microphone. Humanity would be the poorer for it.

SWHS-musicVP-09

The holocircus

To distract Lumpy while she tends to dinner, Malla sits him down at a holotable to watch a circus program. She leans down to one of the four control panels inset around the table’s edge, presses a few buttons, and the program begins.

SWHS-holotable02

In the program small volumetric projections of human (not Wookiee) performers appear on the surface of the table and begin a dance and acrobatic performance to a soundtrack that is, frankly, ear-curdling.


Grabby hologram

After Pepper tosses off the sexy bon mot “Work hard!” and leaves Tony to his Avengers initiative homework, Tony stands before the wall-high translucent displays projected around his room.

Amongst the videos, diagrams, metadata, and charts of the Tesseract panel, one item catches his attention. It’s the 3D depiction of the object, the tesseract itself, one of the Infinity Stones from the MCU. It is a cube rendered in a white wireframe, glowing cyan amidst the flat objects otherwise filling the display. It has an intense, cold-blue glow at its center. Small facing circles surround the eight corners, from which thin cyan rule lines extend a couple of decimeters and connect to small, facing, inscrutable floating-point numbers and glyphs.

Avengers_PullVP-02.png

Wanting to look closer at it, he reaches up and places fingers along the edge as if it were a material object, and swipes it away from the display. It rests in his hand as if it were a real thing. He studies it for a minute and flicks his thumb forward to quickly switch the orientation 90° around the Y axis.

Then he has an Important Thought and the camera cuts to Agent Coulson and Steve Rogers flying to the helicarrier.

So regular readers of this blog (or, you know, fans of blockbuster sci-fi movies in general) may have a Spidey-sense that this feels somehow familiar as an interface. Where else do we see a character grabbing an object from a volumetric projection to study it? That’s right, that seminal insult-to-scientists-and-audiences alike, Prometheus. When David encounters the Alien Astrometrics VP, he grabs the wee earth from that display to nuzzle it for a little bit. Follow the link if you want that full backstory. Or you can just look and imagine it, because the interaction is largely the same: see display, grab glowing component of the VP, and manipulate it.

Prometheus-229

Two anecdotes are not yet a pattern, but I’m glad to see this particular interaction again. I’m going to call it grabby holograms (capitulating a bit on adherence to the more academic term volumetric projection). We grow up having bodies and moving about in a 3D world, so the desire to grab and turn objects to understand them is quite natural. It does require that we stop thinking of displays as untouchable, uninterruptible movies and more like toy boxes, and it seems like more and more writers are catching on to this idea.

More graphics or more information?

Additionally, the fact that this object is the only 3D object in its display is a nice affordance that it can be grabbed. I’m not sure whether he can pull the frame containing the JOINT DARK ENERGY MISSION video to study it on the couch, but I’m fairly certain I knew that the tesseract was grabbable before Tony reached out.

On the other hand, I do wonder what Tony could have learned by looking at the VP cube so intently. There’s no information there. It’s just a pattern on the sides. The glow doesn’t change. The little glyph sticks attached to the edges are fuigets. He might be remembering something he once saw or read, but he didn’t need to flick it like he did for any new information. Maybe he has flicked a VP tesseract in the past?

Augmented “reality”

Rather, I would have liked to have seen those glyph sticks display some useful information, perhaps acting as leaders that connected the VP to related data in the main display. One corner’s line could lead to the Zero Point Extraction chart. Another to the lovely orange waveform display. This way Tony could hold the cube and glance at its related information. These are all augmented reality additions.

Augmented VP

Or, even better, he could do some things that are possible with VPs that aren’t possible with AR. He should be able to scale it to be quite large or small. Create arbitrary sections, or plan views. Maybe fan out depictions of all objects in the SHIELD database that are similarly glowy, stone-like, or that remind him of infinity. Maybe…there’s…a…connection…there! Or better yet, have a copy of JARVIS study the data to find correlations and likely connections to consider. We’ve seen these genuine VP interactions plenty of places (including Tony’s own workshop), so they’re part of the diegesis.

Avengers_PullVP-05.png

In any case, this simple setup works nicely, in which interaction with a cool medium helps underscore the gravity of the situation, the height of the stakes. Note to selves: The imperturbable Tony Stark is perturbed. Shit is going to get real.


Stark Tower monitoring

Since Tony disconnected the power transmission lines, Pepper has been monitoring Stark Tower in its new, off-the-power-grid state. To do this she studies a volumetric dashboard display that floats above glowing shelves on a desktop.

Avengers-Reactor-output03

Volumetric elements

The display features some volumetric elements, all rendered as wireframes in the familiar Pepper’s Ghost (I know, I know) visual style: translucent, edge-lit planes. A large component to her right shows Stark Tower, with red lines highlighting the power traveling from the large arc reactor in the basement through the core of the building.

The center of the screen has a similarly-rendered close up of the arc reactor. A cutaway shows a pulsing ring of red-tinged energy flowing through its main torus.

This component makes a good deal of sense, showing her the physical thing she’s meant to be monitoring, not in a photographic way, but in a way that helps her quickly locate any problems in space. The torus cutaway is a little strange, since if she’s meant to be monitoring it, she should monitor the whole thing, not a version with a quarter of it cut away.

Flat elements

The remaining elements in the display appear on a flat plane.