As a rule, I don’t review lethal weapons on scifiinterfaces.com. The Panther Glove Guns appear to be remote-bludgeoning beams, so this one kind of sneaks by. Also, I’ll confess in advance that there’s not a lot here that affords critique.
We first see the glove guns in the 3D printer output with the kimoyo beads for Agent Ross and the Dora Milaje outfit for Nakia. They are thick weapons that fit over Shuri’s hands and wrists. I imagine they would be very useful to block blades and even disarm an opponent in melee combat, but we don’t see them in use this way.
The next time we see them, Shuri is activating them (though we don’t see how). The panther heads thrust forward, their mouths open wide, and the “necks” glow a hot blue. When the door before her opens, she immediately raises them at the guards (who are loyal to the usurper Killmonger) and fires.
A light-blue beam shoots out of the mouths of the weapons, knocking the guards off the platform. Interestingly, one guard is lifted up and thrown to his 4 o’clock. The other is lifted up and thrown to his 7 o’clock. It’s not clear how Shuri instructs the weapons to produce different and particular knock-down effects. But we’ve seen all over Black Panther that brain-computer interfaces (BCI) are a thing, so it’s diegetically possible she’s simply imagining where she wants them thrown, and then pulling a trigger, or clenching her fist around a rod, or just thinking “BAM!” to activate. The force-bolt strikes each guard right where it needs to so that, like a billiard ball, he gets knocked in the desired direction. As with all(?) brain-computer interfaces, there is no interaction to critique.
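If you want to play with the billiard-ball idea, here’s a toy sketch of the geometry an AI targeting assist might use. This is purely speculative and in no way canonical; the function name, the 2D simplification, and the surface-point model are all mine, not the film’s.

```python
import math

def aim_offset(target_center, desired_dir, radius):
    """Pick the strike point on a (circular) target so a force-bolt,
    pushing through that point, shoves the body in the desired direction.
    Billiard-ball style: you hit the side opposite the direction of travel."""
    dx, dy = desired_dir
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm  # unit vector toward the desired landing spot
    cx, cy = target_center
    # Strike the surface point opposite the desired direction of travel.
    return (cx - ux * radius, cy - uy * radius)

# To throw a guard toward his 4 o'clock, the weapon would aim the bolt
# at the surface point on his 10-o'clock side.
```

The BCI would supply `desired_dir` (Shuri imagining where the guard should land), and the weapon does the rest.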
After she dispatches the two guards, still wearing the gloves, she throws a control bead onto the Talon. The scene is fast and blurry, and it’s unclear how she holds and releases the bead from the glove. Was it in the panther’s jaw the whole time? Could be another BCI, of course: she just thought about where she wanted it, flung her arm, and let the AI decide when to release it for perfect targeting. The Talon is large and she doesn’t seem to need a great deal of accuracy with the bead, but for more precise operations, AI targeting would make more sense than, say, letting the panther heads disintegrate on command so she would have free use of her hands.
Later, after Killmonger dispatches the Dora Milaje, Shuri and Nakia confront him by themselves. Nakia gets in a few good hits, but is thrown from the walkway. Shuri throws some more bolts his way, though he doesn’t appear to even notice. I note that the panther gloves would be very difficult to aim, since there’s no continuous beam providing feedback and she doesn’t have a gun sight to help her. So, again—and I’m sorry, because it feels like cheating—I have to fall back on an AI assist here. Otherwise it doesn’t make sense.
Then Shuri switches from one blast at a time to a continuous beam. It seems to be working, as Killmonger kneels from the onslaught.
This is working! How can I eff it up?
But then for some reason she—with a projectile weapon that is actively subduing the enemy and keeping her safe at a distance—decides to close ranks, allowing Killmonger to knock the glove guns aside with a spear tip, freeing himself, and to destroy the gloves with a clutch of his Panther claws. I mean, I get that she was furious, but I expected better tactics from the chief nerd of Wakanda. Thereafter, the gloves spark when she tries to fire them. So ends this print of the Panther Glove Guns.
As with all combat gear, it looks cool for it to glow, but we don’t want coolness to help an enemy target the weapon. So if it was possible to suppress the glow, that would be advisable. It might be glowing just for the intimidation factor, but for a projectile weapon that seems strange.
The panther head shapes remind an opponent that she is royalty (note that no other Wakandan combatants have ranged weapons) and fighting in Bast’s name, which, if you’re in the business of theocratic warfare, is fine, I guess.
It’s worked so well in the past. More on this aspect later.
So, if you buy the brain-computer interface interpretation, AI targeting assist, and theocratic design, these are fine, with the cinegenic exception of the attention-drawing glow.
Black History Matters
Each post in the Black Panther review is followed by actions that you can take to support black lives.
When The Watchmen series opened with the Tulsa Race Massacre, many people were shocked to learn that this event was not fiction, reminding us just how much of black history is erased and whitewashed for the comfort of white supremacy (and fuck that). Today marks the beginning of Black History Month, and it’s a good opportunity to look back and (re)learn of the heroic figures and stories of both terror and triumph that fill black struggles to have their citizenship and lives fully recognized.
Library of Congress, American National Red Cross Photograph Collection
There are lots of events across the month. The African American History Month site is a collaboration of several government organizations (and it feels so much safer to share such a thing now that the explicitly racist administration is out of office and facing a second impeachment):
The Library of Congress
National Archives and Records Administration
National Endowment for the Humanities
National Gallery of Art
National Park Service
Smithsonian Institution and United States Holocaust Memorial Museum
Today we can take a moment to remember and honor the Greensboro Four.
On this day, February 1, 1960: Through careful planning and enlisting the help of a local white businessman named Ralph Johns, four Black college students—Ezell A. Blair, Jr., Franklin E. McCain, Joseph A. McNeil, David L. Richmond—sat down at a segregated lunch counter at Woolworth’s in Greensboro, North Carolina, and politely asked for service. Their request was refused. When asked to leave, they remained in their seats.
Police arrived on the scene, but were unable to take action due to the lack of provocation. By that time, Ralph Johns had already alerted the local media, who had arrived in full force to cover the events on television. The Greensboro Four stayed put until the store closed, then returned the next day with more students from local colleges.
Their passive resistance and peaceful sit-down demand helped ignite a youth-led movement to challenge racial inequality throughout the South.
A last bit of amazing news to share today is that Black Lives Matter has been nominated for the Nobel Peace Prize! The movement was co-founded by Alicia Garza, Patrisse Cullors and Opal Tometi in response to the acquittal of Trayvon Martin’s murderer, got a major boost with the outrage following and has grown to a global movement working to improve the lives of the entire black diaspora. May it win!
When Agent Ross is shot in the back during Klaue’s escape from the Busan field office, T’Challa stuffs a kimoyo bead into the wound to staunch the bleeding, but the wounds are still serious enough that the team must bring him back to Wakanda for healing. They float him to Shuri’s lab on a hover-stretcher.
Here Shuri gets to say the juicy line, “Great. Another white boy for us to fix. This is going to be fun.”

Sorry about the blurry screen shot, but this is the most complete view of the bay.
The hover-stretcher gets locked into place inside a bay. The bay is a small room in the center of Shuri’s lab, open on two sides. The walls are covered in a gray pattern suggesting a honeycomb. A bas-relief volumetric projection displays some medical information about the patient like vital signs and a subtle fundus image of the optic nerve.
Shuri holds her hand flat and raises it above the patient’s chest. A volumetric display of 9 of his thoracic vertebrae rises up in response. One of the vertebrae is highlighted in a bright red. A section of the wall display shows the same information in 2D, cyan with orange highlights. That display section slides out from the wall to draw observers’ attention. Hexagonal tiles flip behind the display for some reason, but produce no change in the display.
Shuri reaches her hands up to the volumetric vertebrae, pinches her forefingers and thumbs together, and pulls them apart. In response, the space between the vertebrae expands, allowing her to see the top and bottom of the body of the vertebra.
She then turns to the wall display and, reading something there, tells the others that he’ll live. Her attention is pulled away with the arrival of W’Kabi, bringing news of Killmonger. We do not see her initiate a treatment in the scene. We have to presume that she did it between cuts. (There would have to be a LOT of confidence in an AI’s ability to diagnose and determine treatment before they would let Griot do that without human input.)
We’ll look more closely at the hover-stretcher display in a moment, but for now let’s pause and talk about the displays and the interaction of this beat.
A lab is not a recovery room
This doesn’t feel like a smart environment in which to hold a patient. We can bypass a lot of the usual hospital concerns of sterilization (it’s a clean room) or readily-available equipment (since they are surrounded by programmable vibranium dust controlled by an AGI) or even risk of contamination (something something AI). I’m mostly thinking about the patient having an environment that promotes healing: natural light, quiet or soothing music, plants, furnishings, and serene interiors. Having him there certainly means that Shuri’s team can keep an eye on him, and provide some noise that may act as a stimulus, but don’t they have actual hospital rooms in Wakanda?
Why does she need to lift it?
The VP starts in his chest, but why? If it had started out as a “translucent skin” illusion, like we saw in Lost in Space (1998, see below), then that might make sense. She would want to lift it to see it in isolation from the distracting details of the body. But it doesn’t start this way, it starts embedded within him?!
The “translucent skin” display from Lost in Space (1998)
It’s a good idea to have a representation close to the referent, to make for easy comparison between them. But to start the VP within his opaque chest just doesn’t make sense.
This is probably the wrong gesture
In the gestural interfaces chapter of Make It So, I described a pidgin that has been emerging in sci-fi, consisting of 7 “words.” The last of these is “Pinch and Spread to Scale.” Now, there is nothing sacred about this gestural language, but it has echoes in the real world as well. For one example, Google’s VR painting app Tilt Brush uses “spread to scale.” So, as an increasingly common norm, it should only be violated with good reason. In Black Panther, Shuri uses spread to mean “spread these out,” even though she starts the gesture near the center of the display and pulls out at a 45° angle. This speaks much more to scaling than to spreading. It’s a mismatch, and I can’t see a good reason for it. Even if it’s “what works for her,” gestural idiolects hinder communities of practice, and so should be avoided.
Better would have been pinching on one end of the spine and hooking her other index finger to spread it apart without scaling. The pinch is quite literal for “hold” and the hook quite literal for “pull.” This would let scale be scale, and “hook-pull” to mean “spread components along an axis.”
If we were stuck with the footage of Shuri doing the scale gesture, then it would have made more sense to scale the display, and fade the white vertebrae away so she could focus on the enlarged, damaged one. She could then turn it with her hand to any arbitrary orientation to examine it.
An object highlight is insufficient
It’s quite helpful for an interface that can detect anomalies to help focus a user’s attention there. The red highlight for the damaged vertebra certainly helps draw attention. Where’s the problem? Ah, yes. There’s the problem. But it’s more helpful for the healthcare worker to know the nature of the damage, what the diagnosis is, to monitor the performance of the related systems, and to know how the intervention is going. (I covered these in the medical interfaces chapter of Make It So, if you want to read more.) So yes, we can see which vertebra is damaged, but what is the nature of that damage? A slipped disc should look different than a bone spur, which should look different than one that’s been cracked or shattered by a bullet. The bright-red highlight helps for an instant read in the scene, but fails on close inspection and would be insufficient in the real world.
This is not directly relevant to the critique, but it’s interesting that spinal VPs have been around since 1992: Star Trek: The Next Generation, “Ethics” (Season 5, Episode 16).
Put critical information near the user’s locus of attention
Why does Shuri have to turn and look at the wall display at all? Why not augment the volumetric projection with the data that she needs? You might worry that it could obscure the patient (and thereby hinder direct observations) but with an AGI running the show, it could easily position those elements to not occlude her view.
Compare this display, which puts a waveform directly adjacent to the brain VP. Firefly, “Ariel” (Episode 9, 2002).
Note that Shuri is not the only person in the room interested in knowing the state of things, so a wall display isn’t bad, but it shouldn’t be the only augmentation.
Lastly, why does she need to tell the others that Ross will live? If there was significant risk of his death, there should be unavoidable environmental signals: klaxons or medical alerts. So unless we are to believe T’Challa has never encountered a single medical emergency before (even in media), this is a strange thing for her to have to say. Of course we understand she’s really telling us in the audience that we don’t need to wonder about this plot development any more, but it would be better, diegetically, if she had confirmed the time-to-heal, like, “He should be fine in a few hours.”
Alternatively, it would be hilarious turnabout if the AI Griot had simply not been “trained” on data that included white people, and “could not see him,” which is why she had to manually manage the diagnosis and intervention, but that would have massive impact on the remote piloting and other scenes, so isn’t worth it. Probably.
Thoughts toward a redesign
So, all told, this interface and interaction could be much better fit-to-purpose. Clarify the gestural language. Lose the pointless flipping hexagons. Simplify the wall display for observers to show vitals, diagnosis and intervention, as well as progress toward the goal. Augment the physician’s projection with detailed, contextual data. And though I didn’t mention it above, of course the bone isn’t the only thing damaged, so show some of the other damaged tissues, and some flowing, glowing patterns to show where healing is being done along with a predicted time-to-completion.
Stretcher display
Later, when Ross is fully healed and wakes up, we see a shot of the med table from above. Lots of cyan and orange, and *typography shudder* stacked type. Orange outlines seem to indicate controls, though they bear symbols rather than full labels, even though labels are better for learnability and infrequent use. (Linguist nerds: yes, Wakandan is alphabetic rather than logographic.)
These feel mostly like FUIgetry, with the exception of a subtle respiration monitor on Ross’ left. But it shows current state rather than state tracked over time, so it still isn’t as helpful as it could be.
Then when Ross lifts his head, the hexagons begin to flip over, disabling the display. What? Does this thing only work when the patient’s head is in the exact right space? What happens when they’re coughing, or convulsing? Wouldn’t a healthcare worker still be interested in the last-recorded state of things? This “instant-off” makes no sense. Better would have been just to let the displays fade to a gray to indicate that it is no longer live data, and to have delayed the fade until he’s actually sitting up.
All told, the Wakandan medical interfaces are the worst of the ones seen in the film. Lovely, and good for quick narrative hit, but bad models for real-world design, or even close inspection within the world of Wakanda.
MLK Day Matters
Each post in the Black Panther review is followed by actions that you can take to support black lives.
Today is Martin Luther King Day. Normally there would be huge gatherings and public speeches about his legacy and the current state of civil rights. But the pandemic is still raging, and with the Capitol in Washington, D.C. having seen an armed insurrection by supporters of outgoing and pouty loser Donald Trump just last week (in case that WP article hasn’t been moved yet, here’s the post under its watered-down title), there are worries about additional racist terrorism and violence.
So today we celebrate virtually, by staying at home, re-experiencing his speeches and letters, and listening to the words of black leaders and prominent thinkers all around us, reminding us of the arc of the moral universe, and all the work it takes to bend it toward justice.
As we reflect on the contributions of Dr. King, let us build on his legacy by securing the promise of justice for all.
By ensuring our voices were heard in the streets and at the ballot box, we renewed our fight to make the dream a reality. But the work continues. #MLKDay
— The Martin Luther King, Jr. Center (@TheKingCenter) January 18, 2021
With the Biden team taking the reins on Wednesday, and Kamala Harris as our first female Vice President of color, things are looking brighter than they have in 4 long, terrible years. But Trump would have gotten nowhere if there hadn’t been a voting bloc and party willing to indulge his racist fascism. There’s still much more to do to dismantle systemic racism in this country and around the world. Let’s read, reflect, and use whatever platforms and resources we are privileged to have to act.
Like so much of the tech in Black Panther, this wearable battle gear is quite subtle, but critical to the scene, and much more than it seems at first. When Okoye and Nakia are chasing Klaue through the streets of Busan, South Korea, she realizes she would be better positioned on top of their car than within it.
She holds one of her spears out of the window, stabs it into the roof, and uses it to pull herself out on top of the swerving, speeding car. Once there, she places her feet into position, and the moment the sole of her foot touches the roof, it glows cyan for a moment.
She then holds onto the stuck spear to stabilize herself, rears back with her other spear, and throws it forward through the rear window and windshield of some minions’ car, where it sticks in the road before them. Their car strikes the spear and gets crushed. It’s a kickass moment in a film of kickass moments. But by all means, let’s talk about the footwear.
Now, the effect the shoes have in the world of the story is not explicit. But we can guess, given the context, that we are meant to believe the shoes grip the car roof, giving her a firm enough anchor to stay on top of the car and not tumble off when it swerves.
She can’t just be stuck
I have never thrown a javelin or a hyper-technological vibranium spear. But Mike Barber, PhD scholar in biomechanics at Victoria University and the Australian Institute of Sport, wrote this article about the mechanics of javelin throwing, and it seems that throwing force is not achieved by sheer strength of the rotator cuff alone. Rather, the thrower builds force across their entire body and whips the momentum around their shoulder joint.
Ilgar Jafarov, CC BY-SA 4.0, via Wikimedia Commons
Okoye is a world-class warrior, but doesn’t have superpowers, so…while I understand she does not want the car to yank itself from underneath her with a swerve, it seems that being anchored in place, like some Wakandan air tube dancer, will not help her with her mighty spear throwing. She needs to move.
It can’t just be manual
Imagine being on a mechanical bull jerking side to side—being stuck might help you stay upright. But imagine it jerking forward suddenly, and you’d wind up on your butt. If it jerked backwards, you’d be thrown forward, and it might be much worse. All are possibilities in the car chase scenario.
If those jerking motions happened to Okoye faster than she could react and release her shoes, it could be disastrous. So it can’t be a thing she needs to manually control. Which means it needs to be some blend of manual, agentive, and assistant. Autonomic, maybe, to borrow the term from physiology?
So…
To really be of help, it has to…
monitor the car’s motion
monitor her center of balance
monitor her intentions
predict the future motions of the cars
handle all the cybernetics math (in the Norbert Wiener sense, not the sci-fi sense)
know when it should just hold her feet in place, and when it should signal for her to take action
know what action she should ideally take, so it knows what to nudge her to do
These are no mean feats, especially in real-time. So, I don’t see any explanation except…
An A.I. did it.
AGI is in the Wakandan arsenal (cf. Griot helping Ross), so this is credible given the diegesis, but I did not expect to find it in shoes.
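The loop implied by that list of feats could be sketched like so. To be clear, this is a speculative toy, not anything from the film: the function names, the 0.2 threshold, and the three output modes are all my inventions, and the “sensor” inputs are stand-ins for the real cybernetics math.

```python
def predict_car_motion(recent_accels):
    """Stub predictor: assume the last observed lateral jerk continues.
    A real system would model the driver, the road, and the chase."""
    return recent_accels[-1]

def balance_margin(wearer_balance, predicted_jerk):
    """Stub: the wearer's margin of stability shrinks as predicted jerk grows."""
    return max(0.0, wearer_balance - abs(predicted_jerk))

def autonomic_step(recent_accels, wearer_balance, wearer_intent):
    """One tick of the speculative shoe loop: hold the soles locked,
    release them, or nudge the wearer to brace or reposition."""
    margin = balance_margin(wearer_balance, predict_car_motion(recent_accels))
    if wearer_intent == "move":
        return "release"   # never fight a deliberate step (or spear throw)
    if margin < 0.2:
        return "nudge"     # signal the wearer that trouble is coming
    return "hold"          # keep the soles gripped to the roof
```

Run at high frequency, this is the difference between shoes that trap Okoye and shoes that collaborate with her.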
An interesting design question is how it might deliver warning signals about predicted motions. Is it tangible, like vibration? Or a mild electrical buzz? Or a writing-to-the-brain urge to move? The movie gives us no clues, but if you’re up for a design challenge, give it a speculative design pass.
Wearable heuristics
As part of my 2014 series about wearable technologies in sci-fi, I identified a set of heuristics we can use to evaluate such things. A quick check against those shows that the shoes fare well. They are quite sartorial, and since they look like shoes, they are social as well. As a brain interface, they are supremely easy to access and use. Two of the heuristics raise questions, though.
Wearables must be designed so they are difficult to accidentally activate. It would have been very inconvenient for Okoye to find herself stuck to the surface of Wakanda while trying to chase Killmonger later in the film, for example. It would be safer to ensure deliberateness with some mode-confirming physical gesture, but there’s no evidence of it in the movie.
Wearables should have apposite I/O. The soles glow. Okoye doesn’t need that information. I’d say in a combat situation it’s genuinely bad design to require her to look down to confirm any modes of the shoes. They’re worn; she will immediately feel whether her shoes are fixed in place. I can’t name exactly how an enemy might use the knowledge of whether she is stuck in place, but on general principle, the less information we give to the enemy, the safer she’ll be. So if this were real-world, we would seek to eliminate the glow. That said, we know that undetectable interactions are not cinegenic in the slightest, so for the film this is a nice “throwaway” addition to the cache of amazing Wakandan technology.
Black Georgia Matters and Today is the Day
Each post in the Black Panther review is followed by actions that you can take to support black lives.
Today is the last day in the Georgia runoff elections. It’s hard to overstate how important this is. If Ossoff and Warnock win, the future of the country has a much better likelihood of taking Black Lives Matter (and lots of other issues) more seriously. Actual progress might be made. Without it, the obstructionist and increasingly-frankly-racist Republican party (and Moscow Mitch) will hold much of the Biden-Harris administration back. If you know of any Georgians, please check with them today to see if they voted in the runoff election. If not—and they’re going to vote Democrat—see what encouragement and help you can give them.
If their absentee ballot has not been registered, they can go to the polls and tell the workers there that they want to cancel their absentee ballot and vote in person. Help them know their poll at My Voter Page: https://www.mvp.sos.ga.gov/MVP/mvp.do
Last year in January I launched an annual awards program for excellence in cinematic sci-fi interfaces. If you didn’t catch it then, here are the inaugural winners. Then the rest of 2020 happened. A lot of movies were postponed since reasonable audiences were staying out of cinemas during the pandemic.
I’m still going to do the Fritzes this year, formally announcing nominees on 15 MAR and publishing the results on 25 APR (timed with the Oscars), but with this post I’m asking whether my readership knows (and can recommend) other movies that I might not have heard about. Of the movies I saw last year, only 2 (that’s right, two) had interfaces significant enough to evaluate. And one of those had only 1 interface in it. In total. So maybe you know more?
2020 sci-fi movies with negligible-or-no interfaces
Here’s a list of those movies that I watched which had negligible or no interfaces to review. (Note some of these were awful, so this is not a recommendations list, just an accounting.)
Here’s the list of sci-fi movies I’m aware of that were released in 2020 that I have yet to watch. Now it could be that I’ve just watched things in an unfortunate order, and the ones I watched happened to be the ones without much to review. Maybe (hopefully?) that’s true. But just in case…
If you have seen any of these and can tell me that they have no or negligible interfaces (saving me the 90ish minutes per film), please let me know that, too.
The Ask
So…does anyone know of other 2020 films with great interfaces that I should see? Please let me know on social media or in the comments.
Television addendum
In contrast to the suspiciously-interfaceless moviescape, I’ve seen a lot of great interface work in television. Agents of S.H.I.E.L.D., The Expanse, Devs, Lovecraft Country, The Mandalorian, Star Trek Discovery, and Star Trek Picard, to name a few. So much great work. Awarding television shows is some complexity I haven’t ventured into yet, but if I wind up with a paucity of examples in film, I might have to start alternating years for TV & movies. Or start up a new set of awards just for TV.
Star Trek Discovery, S03E05 “Die Trying” screen shot. And yes, I know.
I presume my readership are adults. I honestly cannot imagine this site has much to offer the 3-to-8-year-old. That said, if you are less than 8.8 years old, be aware that reading this will land you FIRMLY on the naughty list. Leave before it’s too late. Oooh, look! Here’s something interesting for you.
For those who celebrate Yule (and the very hybridized version of the holiday that I’ll call Santa-Christmas to distinguish it from Jesus-Christmas or Horus-Christmas), it’s that one time of year where we watch holiday movies. Santa features in no small number of them, working against the odds to save Christmas and Christmas spirit from something that threatens it. Santa accomplishes all that he does by dint of holiday magic, but increasingly, he has magic-powered technology to help him. These technologies are different for each movie in which they appear, with different sci-fi interfaces, which raises the question: Who did it better?
Unraveling this stands to be even more complicated than usual sci-fi fare.
These shows are largely aimed at young children, who haven’t developed the critical thinking skills to doubt the core premise, so the makers don’t have much pressure to present wholly-believable worlds. The makers also enjoy putting in some jokes for adults that are non-diegetic and confound analysis.
Despite the fact that these magical technologies are speculative just as in sci-fi, makers cannot presume that their audience are sci-fi fans who are familiar with those tropes. And things can’t seem too technical.
The sci in this fi is magical, which allows makers to do all-sorts of hand-wavey things about how it’s doing what it’s doing.
Many of the choices are whimsical and serve to reinforce core tenets of the Santa Claus mythos rather than any particular story or worldbuilding purpose.
But complicated-ness has rarely cowed this blog’s investigations before, why let a little thing like holiday magic do it now?
Ho-Ho-hubris!
A Primer on Santa
I have readers from all over the world. If you’re from a place that does not celebrate the Jolly Old Elf, a primer should help. And if you’re from a non-USA country, your Saint Nick mythos will be similar but not the same one that these movies are based on, so a clarification should help. To that end, here’s what I would consider the core of it.
Santa Claus is a magical, jolly, heavyset old man with white hair, mustache, and beard who lives at the North Pole with his wife Mrs. Claus. The two are almost always Caucasian. He can alternately be called Kris Kringle, Saint Nick, Father Christmas, or Klaus. The Clement Clarke Moore poem calls him a “jolly old elf.” He is aware of the behavior of children, and tallies their good and bad behavior over the year, ultimately landing them on the “naughty” or “nice” list. Santa brings the nice ones presents. (The naughty ones are canonically supposed to get coal in their stockings, though in all my years I have never heard of any kids actually getting coal in lieu of presents.) Children also hang special stockings, often on a mantle, to be filled with treats or smaller presents. Adults encourage children to be good in the fall to ensure they get presents. As December approaches, children write letters to Santa telling him what presents they hope for. Santa and his elves read the letters and make all the requested toys by hand in a workshop. Then, on the evening of 24 DEC, he puts all the toys in a large sack and loads it into a sleigh led by 8 flying reindeer. Most of the time there is a ninth reindeer up front with a glowing red nose, named Rudolph. Santa dresses in a warm red suit fringed with white fur, big black boots, a thick black belt, and a stocking hat with a furry ball at the end. Over the evening, as children sleep, he delivers the presents to their homes, where he places them beneath the Christmas tree for them to discover in the morning. Families often leave out cookies and milk for Santa to snack on, and sometimes carrots for the reindeer. Santa often tries to avoid detection, for reasons that are diegetically vague.
There is no single source of truth for this mythos, though the current core text might be the 1823 C.E. poem, “A Visit from St. Nicholas” by Clement Clarke Moore. Visually, Santa’s modern look is often traced back to the depictions by Civil War cartoonist Thomas Nast, which the Coca-Cola Corporation built upon for their holiday advertisements in 1931.
Both these illustrations are by Nast.
There are all sorts of cultural conversations to have about normalizing a magical panopticon, what effect hiding the actual supply chain has, and what perpetuating this myth trains children for; but for now let’s stick to evaluating the interfaces in terms of Santa’s goals.
Santa’s goals
Given all of the above, we can say that the following are Santa’s goals.
Sort kids by behavior as naughty or nice
Many tellings have him observing actions directly
Manage the lists of names, usually on separate lists
Manage letters
Reading letters
Sending toy requests to the workshop
Storing letters
Make presents
Travel to kids’ homes
Find the most-efficient way there
Control the reindeer
Maintain air safety
Avoid air obstacles
Find a way inside and to the tree
Enjoy the cookies / milk
Deliver all presents before sunrise
For each child:
Know whether they are naughty or nice
If nice, match the right toy to the child
Stage presents beneath the tree
Avoid being seen
We’ll evaluate the Santa interfaces against these goals.
This is the Worst Santa, but the image is illustrative of the weather challenges.
Typical Challenges
Nearly every story tells of Santa working with other characters to save Christmas. (The metaphor that we have to work together to make Christmas happen is appreciated.) The challenges in the stories can be almost anything, but often include…
Inclement weather (usually winter, but Santa is a global phenomenon)
Air safety
Air obstacles (Planes, helicopters, skyscrapers)
Ingress/egress into homes
Home security systems / guard dogs
The Contenders
Imdb.com lists 847 films tagged with the keyword “santa claus,” which is far too many to review. So I looked through “best of” lists (two are linked below) and watched those films for interfaces. There weren’t many. I even had to blend CGI and live-action shows, which I’m normally hesitant to do. As always, if you know of any additional shows that should be considered, please mention them in the comments.
After reviewing these films, the ones with Santa interfaces came down to four, presented below in chronological order.
The Santa Clause (1994)
This movie deals with the lead character, Scott Calvin, inadvertently taking on the “job” of Santa Claus. (If you’ve read Piers Anthony’s Incarnations of Immortality series, this plot will feel quite familiar.)
The sleigh he inherits has a number of displays that are largely unexplained, but little Charlie figures out that the center console includes a hot chocolate and cookie dispenser. There is also a radar and, far away from it, push buttons for fog, planes, rain, and lightning. There are several controls with Christmas-bell icons associated with them, but the meaning of these is unclear.
Santa’s hat in this story has headphones and the ball has a microphone for communicating with elves back in the workshop.
This is the oldest of the candidates. Its interfaces are quite sterile and “tacked on” compared to the others, but they were novel for their time.
Fred Claus (2007)
This movie tells the story of Santa’s ne’er-do-well brother Fred, who has to work in the workshop for one season to work off bail money. While there, he winds up helping forestall foreclosure by an underhanded supernatural efficiency expert, and un-estranging himself from his family. A really nice bit in this critically-panned film is that Fred helps Santa understand that there are no bad kids, just kids in bad circumstances.
Fred is taken to the North Pole in a sled with switches that are very reminiscent of the ones in The Santa Clause. A funny touch is the “fasten your seatbelt” sign like you might see in a commercial airliner. The use of Lombardic Capitals font is a very nice touch given that much of modern Western Santa Claus myth (and really, many of our traditions) come from Germany.
The workshop has an extensive pneumatic tube system for getting letters to the right crafts-elf.
This chamber is where Santa is able to keep an eye on children. (Seriously panopticony. They have no idea they’re being surveilled.) Merely by reading a child’s name and address, Santa summons a volumetric display of the child within the giant snowglobe. The naughtiest children’s names are displayed on a digital split-flap display, including their greatest offenses. (The nicest are as well, but we don’t get a close-up of it.)
The final tally is put into a large book that one of the elves manages from the sleigh while Santa does the actual gift-distribution. The text in the book looks like it was printed from a computer.
Arthur Christmas (2011)
In this telling, the Santa job is passed down patrilineally. The oldest Santa, Grandsanta, is retired. The dad, Malcolm, is the current acting Santa, and he has two sons. One is Steve, a by-the-numbers type into military efficiency and modern technology. The other son, Arthur, is an awkward fellow who has a semi-disposable job responding to letters. Malcolm currently pilots a massive mile-wide spaceship from which ninja elves do the gift distribution. They have a lot of tech to help them do their jobs. The plot involves Arthur working with Grandsanta, using his old sleigh to get a last forgotten gift to a young girl before the sun rises.
To help manage loud pets in the home who might wake up sleeping people, this gun has a dial for common pets that delivers a treat to distract them.
Elves have face scanners which determine each kid’s naughty/nice percentage. The elf then enters this into a stocking-filling gun, which affects the contents in some unseen way. A sweet touch: when one elf scans a kid who reads as quite naughty, the elf scans his own face to get a nice reading instead.
The S-1 is the name of the spaceship sleigh at the beginning (at the end it is renamed after Grandsanta’s sleigh). Its bridge is loaded with controls, volumetric displays, and even a Little Tree air freshener. It has a cloaking display on its underside which is strikingly similar to the MCU S.H.I.E.L.D. helicarrier cloaking. (And this came out the year before The Avengers, I’m just sayin’.)
The North Pole houses the command-and-control center, which Steve manages. Thousands of elves manage workstations here, and there is a huge shared display for focusing and informing the team at once when necessary. Smaller displays help elf teams manage certain geographies. Its interfaces mostly fall to comedy and trope, but are germane to the story beats.
One of the crisis scenarios that this system helps manage is for a “waker,” a child who has awoken and is at risk of spying Santa.
Grandsanta’s outmoded sleigh is named Eve. Its technology is much more from the early 20th century, with switches and dials, buttons and levers. It’s a bit janky and overly complex, but gets the job done.
One notable control on S-1 is this trackball with dark representations of the continents. It appears to be a destination selector, but we do not see it in use. It is remarkable because it is very similar to one of the main interface components in the next candidate movie, The Christmas Chronicles.
The Christmas Chronicles (2018)
The Christmas Chronicles follows two kids who stow away on Santa’s sleigh on Christmas Eve. His surprise when they reveal themselves causes him to lose his magical hat and wreck his sleigh. They help him recover the items, finish his deliveries, and (well, of course) save Christmas just in time.
Santa’s sleigh enables him to teleport to any place on earth. The main control is a trackball location selector. Once he spins it and confirms that the city readout looks correct, he can press the “GO” button for a portal to open in the air just ahead of the sleigh. After traveling for a bit in an aurora-borealis realm filled with famous landmarks, another portal appears. They pass through this and appear at the selected location. A small magnifying glass above the selection point helps with precision.
Santa wears a watch that measures not time, but Christmas spirit, which ranges from 0 to 100. In the bottom half, chapter rings and a magnifying window seem designed to show the date, with 12 and 31 sequential numbers, respectively. It’s not clear why it shows mid-May. A hemisphere in the middle of the face looks like it’s almost a globe, which might be a nice way to display and change time zone, but that may be wishful thinking on my part.
Santa also has a tracking device for finding his sack of toys. (Apparently this has happened enough times to warrant such a thing.) It is an intricate filigree over cool green and blue glass. A light within blinks faster the closer the sphere is to the sack.
Since he must finish delivering toys before Christmas morning, the dashboard has a countdown clock with Nixie-tube numbers showing hours, minutes, and milliseconds. They ordinarily glow cyan, but when time runs out, they turn red and blink.
This Santa also manages his list in a large book with lovely handwritten calligraphy. The kids whose gifts remain undelivered glow golden to draw his attention.
The hard problem here is that there are a lot of apples-to-oranges comparisons to make. Even though the mythos seems pretty locked down, each movie takes liberties with one or two aspects. As a result, not all these Santas are created equal. Calvin’s elves know he is completely new to his job and will need support. Christmas Chronicles Santa has perfect memory, magical abilities, and handles nearly all the delivery duties himself, unless he’s enacting a clever scheme to impart Christmas wisdom. Arthur Christmas has intergenerational technology and Santas who may not be magic at all, but fully know their duty from their youths, and who rely on a huge army of shock-troop elves to make things happen. So it’s hard to name just one. But absent a point-by-point detailed analysis, there are two that really stand out to me.
The weathered surface of this camouflage button is delightful (Arthur Christmas).
Coverage of goals
The Arthur Christmas movie has, by far, the most interfaces of any of the candidates, and the most coverage of the Santa-family’s goals. Managing noisy pets? Check. Dealing with wakers? Check. Navigating the globe? Check. As far as thinking through speculative technology that assists its Santa, this film has the most.
Keeping the holiday spirit
I’ll confess, though, that extradiegetically, one of the purposes of annual holidays is to mark the passage of time. By trying to adhere to traditions as much as we can, time and our memory are marked by those things that we cannot control (like, say, a pandemic keeping everyone at home and hanging with friends and family virtually). So for my money, the thoroughly modern interfaces that flood Arthur Christmas don’t work that well. They’re so modern they’re not…Christmassy. Grandsanta’s sleigh Eve points to an older tradition, but it’s also clearly framed as outdated in the context of the story.
Gorgeous steampunkish binocular HUD from The Christmas Chronicles 2, which was not otherwise included in this post.
Compare this to The Christmas Chronicles, with its gorgeous steampunk-y interfaces that combine a sense of magic and mechanics. These are things that a centuries-old Santa would have built and used. They feel rooted in tradition while still helping Santa accomplish as many of his goals as he needs (in the context of his Christmas adventure with the stowaway kids). These interfaces evoke a sense of wonder, add significantly to the worldbuilding, and are what I’d rather have as a model for magical interfaces in the real world.
Of course it’s a personal call, given the differences, but The Christmas Chronicles wins in my book.
Ho, Ho, HEH.
For those that celebrate Santa-Christmas, I hope it’s a happy one, given the strange, strange state of the world. May you be on the nice list.
Remote operation appears twice during Black Panther. This post describes the second, in which CIA Agent Ross remote-pilots the Talon in order to chase down cargo airships carrying Killmonger’s war supplies. The prior post describes the first, in which Shuri remotely drives an automobile.
In this sequence, Shuri equips Ross with kimoyo beads and a bone-conducting communication chip, and tells him that he must shoot the cargo ships down before they cross beyond the Wakandan border. As soon as she tosses a remote-control kimoyo bead onto the Talon, Griot announces to Ross in the lab “Remote piloting system activated” and creates a piloting seat out of vibranium dust for him. Savvy watchers may wonder at this, since Okoye pilots the thing by meditation and Ross would have no meditation-pilot training, but Shuri explains to him, “I made it American style for you. Get in!” He does, grabs the sparkly black controls, and gets to business.
The most remarkable thing to me about the interface is how seamlessly the Talon can be piloted by vastly different controls. Meditation brain control? Can do. Joystick-and-throttle? Just as can do.
Now, generally, I have a beef with the notion of hyperindividualized UI tailoring—it prevents vital communication across a community of practice (read more about my critique of this goal here)—but in this case, there is zero time for Ross to learn a new interface. So sure, give him a control system with which he feels comfortable to handle this emergency. It makes him feel more at ease.
The mutable nature of the controls tells us that there is a robust interface layer that is interpreting whatever inputs the pilot supplies and applying them to the actuators in the Talon. More on this below. Spoiler: it’s Griot.
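To make that interpretation-layer claim concrete, here’s a minimal sketch of how such a layer could work. Everything here is hypothetical illustration (the names PilotIntent, StickAndThrottle, MeditationBCI, and actuate are invented, not anything from the film): each control scheme reduces its inputs to a common pilot intent, and one shared layer turns intent into actuator commands.

```python
# Sketch of an input-abstraction layer: wildly different control
# schemes produce the same "pilot intent," which one shared layer
# maps to the craft's actuators. All names are hypothetical.

from dataclasses import dataclass
from typing import Protocol

@dataclass
class PilotIntent:
    pitch: float     # -1..1
    yaw: float       # -1..1
    throttle: float  #  0..1
    fire: bool

class ControlScheme(Protocol):
    def read(self) -> PilotIntent: ...

class StickAndThrottle:
    """'American style' controls: deflections map directly to intent."""
    def __init__(self, stick_x: float, stick_y: float,
                 throttle: float, trigger: bool):
        self._state = (stick_x, stick_y, throttle, trigger)
    def read(self) -> PilotIntent:
        x, y, t, trig = self._state
        return PilotIntent(pitch=y, yaw=x, throttle=t, fire=trig)

class MeditationBCI:
    """Okoye-style control: decoded imagined maneuvers become intent."""
    def __init__(self, decoded: PilotIntent):
        self._decoded = decoded  # decoding itself is out of scope here
    def read(self) -> PilotIntent:
        return self._decoded

def actuate(scheme: ControlScheme) -> dict:
    """The shared layer: same actuator commands for any scheme."""
    i = scheme.read()
    return {"elevons": i.pitch, "rudder": i.yaw,
            "engines": i.throttle, "weapons": i.fire}

# Two very different schemes, one craft, identical commands:
print(actuate(StickAndThrottle(0.2, -0.5, 0.8, False)))
print(actuate(MeditationBCI(PilotIntent(-0.5, 0.2, 0.8, False))))
```

The design payoff is that swapping pilots (or control hardware) never touches the actuator code, which is exactly what the Talon’s instantly reconfigured cockpit implies.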
Too sparse HUD
The HUD presents a simple circle-in-a-triangle reticle that lights up red when a target is in its sights. Otherwise it’s notably empty of augmentation. There’s no tunnel-in-the-sky display to describe the ideal path, no proximity warnings about skyscrapers, no airspeed indicator, no altimeter, no…anything. This seems a glaring omission since we can be certain other “American-style” airships have such things. More on why this might be below, but spoiler: It’s Griot.
What do these controls do, exactly?
I take no joy in gotchas. That said…
When Ross launches the Talon, he does so by pulling the right joystick backward.
When he shoots down the first cargo ship over Birnin Zana, he pushes the same joystick forward as he pulls the trigger, firing energy weapons.
Why would the same control do both? It’s hard to believe it’s modal. Extradiegetically, this is probably an artifact of actor Martin Freeman just doing what felt dramatic, but for a real-world equivalent I would advise against giving physical controls wholly different modes on the same grip, lest we risk confusing pilots on mission-critical tasks. But spoiler…oh, you know where this is going.
It’s Griot
Diegetically, Shuri is flat-out wrong that Ross is an experienced pilot. But she also knew that it didn’t matter, because her lab has him covered anyway. Griot is an AI with a brain interface, and can read Ross’ intentions, handling all the difficult execution itself.
This would also explain the lack of better HUD augmentation. That absence seems especially egregious considering that the first cargo ship was flying over a crowded city at the time it was being targeted. If Ross had fired in the wrong place, the cargo ship might have crashed into a building, or down to the bustling city street, killing people. But, instead, Griot quietly, precisely targets the ship for him, to ensure that it would crash safely in nearby water.
This would also explain how wildly different interfaces can control the Talon with similar efficacy.
So, Occam’s apology says: yep, it’s Griot.
An AI-wizard did it?
In the post about Shuri’s remote driving, I suggested that Griot was also helping her execute driving behind the scenes. This hearkens back to both the Iron HUD and Doctor Strange’s Cloak of Levitation. It could be that the MCU isn’t really worrying about the details of its enabling technologies, or that this is a brilliant model for our future relationship with technology. Let us feel like heroes, and let the AI manage all the details. I worry that I’m building myself into a wizard-did-it pattern, inserting AI for wizard. Maybe that’s worth another post all its own.
But there is one other thing about Ross’ interface worth noting.
The sonic overload
When the last of the cargo ships is nearly at the border, Ross reports to Shuri that he can’t chase it, because Killmonger-loyal dragon flyers have “got me trapped with some kind of cables.” She instructs him to, “Make an X with your arms!” He does. A wing-like display appears around him, confirming its readiness.
Then she shouts, “Now break it!” he does, and the Talon goes boom shaking off the enemy ships, allowing Ross to continue his pursuit.
First, what a great gesture for this function. Ordinarily it is Wakandans piloting the Talon, and each of them would be deeply familiar with this gesture, and even prone to think of it when executing a hail-Mary move like this.
Second, when an outsider needed to perform the action, why didn’t she just tell Griot to do it? If there’s an interpretation layer in the system, why not speak directly to that controller? It might be so the human knows how to do it themselves next time, but this is the last cargo ship he’s been tasked with chasing, and there’s little chance of his officially joining the Wakandan air force. The emergency will be over after this instance. Maybe Wakandans have a principle that they should first engage the humans before bringing in the machines, but that’s heavy conjecture.
Third, I have a beef about gestures—there are often zero affordances to tell users what gestures they can do, and what effects those gestures will have. If Shuri had not been there to answer Ross’ urgent question, would the mission have just…failed? Seems like a bad design.
How else could he have known he could do this? If Griot is on board, Griot could have mentioned it. But avoiding wizard-did-it solutions, some sort of context-aware display could detect that the ship is tethered to something, and display the gesture on the HUD for him. This violates the principle of letting the humans be the heroes, but would be a critical inclusion in any similar real-world system.
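A minimal sketch of that recommendation, with entirely hypothetical state names, gestures, and effects (none of this is from the film): a context monitor maps detected ship conditions to gesture hints the HUD can render.

```python
# Sketch of a context-aware gesture-hint layer for a HUD. The
# condition names, gestures, and effects below are all hypothetical
# illustrations of the pattern, not anything shown in the film.

from dataclasses import dataclass

@dataclass(frozen=True)
class GestureHint:
    gesture: str   # what the pilot should do
    effect: str    # what the system will do in response

# Map detectable ship states to the gesture that resolves them.
HINTS = {
    "tethered_by_cables": GestureHint(
        gesture="Cross arms in an X, then break outward",
        effect="Sonic overload: sheds tethers and nearby attackers"),
    "target_locked": GestureHint(
        gesture="Clench fist",
        effect="Fire energy weapons at locked target"),
}

def hud_hints(detected_states: set[str]) -> list[str]:
    """Render a HUD prompt line for each detected, hint-worthy state."""
    return [f"{s.upper()}: {HINTS[s].gesture} -> {HINTS[s].effect}"
            for s in sorted(detected_states) if s in HINTS]

# If the ship's sensors report a tether, the pilot sees the move
# without needing Shuri on comms:
print(hud_hints({"tethered_by_cables"}))
```

The point of the pattern is that the affordance travels with the system rather than with an expert human, which is exactly the gap the scene exposes.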
Any time we are faced with “intuitive” controls that don’t map 1:1 to the thing being controlled, we’re faced with similar problems. (We’ve seen the same problems in Sleep Dealer and Lost in Space (1998). Maybe that’s worth its own write-up.) Some controls won’t map to anything. More problematic is that there will be functions which don’t have controls. Designers can’t rely on having a human cavalry like Shuri there to save the day, and should take steps to find ways that the system can inform users of how to activate those functions.
Fit to purpose?
I’ve had to presume a lot about this interface. But if those things are correct, then, sure, this mostly makes it possible for Ross, a novice to piloting, to contribute something to the team mission, while upholding the directive that AI Cannot Be Heroes.
If Griot is not secretly driving, and that directive not really a thing, then the HUD needs more work, I can’t diegetically explain the controls, and they need to develop just-in-time suggestions to patch the gap of the mismatched interface.
Black Georgia Matters
Each post in the Black Panther review is followed by actions that you can take to support black lives. As this critical special election is still coming up, this is a repeat of the last one, modified to reflect passed deadlines.
Always on my mind, or at least until July 06.
Despite outrageous, anti-democratic voter suppression by the GOP, for the first time in 28 years, Georgia went blue for the presidential election, verified with two hand recounts. Credit to Stacey Abrams and her team’s years of effort to get out the Georgian—and particularly the powerful black Georgian—vote.
But the story doesn’t end there. Though the Biden/Harris ticket won the election, if the Senate stays majority red, Moscow Mitch McConnell will continue the infuriating obstructionism with which he held back Obama’s efforts in office for eight years. The Republicans will, as they have done before, ensure that nothing gets done.
To start to undo the damage the fascist and racist Trump administration has done, and maybe make some actual progress in the US, we need the Senate majority blue. Georgia is providing that opportunity. Neither of the wretched Republican incumbents got 50% of the vote, resulting in a special runoff election January 5, 2021. If these two seats go to the Democratic challengers, Warnock and Ossoff, it will flip the Senate blue, and the nation can begin to seriously right the sinking ship that is America.
Residents can also volunteer to become a canvasser for either of the campaigns, though it’s a tough thing to ask in the middle of the raging pandemic.
The rest of us (yes, even non-American readers) can contribute either to the campaigns directly using the links above, or to Stacey Abrams’ Fair Fight campaign. From the campaign’s web site:
We promote fair elections in Georgia and around the country, encourage voter participation in elections, and educate voters about elections and their voting rights. Fair Fight brings awareness to the public on election reform, advocates for election reform at all levels, and engages in other voter education programs and communications.
We will continue moving the country into the anti-racist future regardless of the runoff, but we can make much, much more progress if we win this election. Please join the efforts as best you can even as you take care of yourself and your loved ones over the holidays. So very much depends on it.
Black Reparations Matter
This is timely, so I’m adding this on as well rather than waiting for the next post: A bill is in the House to set up a commission to examine the institution of slavery and its impact, and make recommendations for reparations to Congress. If you are an American citizen, please consider sending a message to your congresspeople asking them to support the bill.
Image, uncredited, from the ACLU site. Please contact me if you know the artist.
On this ACLU site you will find a form and suggested wording to help you along.
Remote operation appears twice during Black Panther. This post describes the first, in which Shuri remotely operates an automobile during a chase sequence. The next post describes the other, in which Ross remotely pilots the Talon.
In the scene, Okoye has dropped a remote control kimoyo bead onto a car in Singapore. (It’s unclear why this is necessary. During the chase, Klawe tells his minion the car is made of vibranium, which tells us it’s Wakandan. Wouldn’t remote control be built in? But I digress…)
T’Challa, leaving the Singaporean casino, shouts, “Shuri!” Shuri, in her lab in Wakanda, hears the call. The lab’s AI, Griot, says, “Remote driving system activated.” The vibranium dust / programmable matter of the lab forms a seat and steering wheel for her that match the controlled car’s. A projection of the scene around the controlled car gives her a live visual to work with. She pauses to ask, “Wait. Which side of the road is it?” T’Challa shouts, “For Bast’s sake, just drive!” She floors the gas pedal in her lab, and we see the gas pedal of the controlled car depress in Singapore. There ensues a nail-biting car chase.
Now, I don’t want to de-hero-ize our heroes, but let’s face it, Griot must be doing a significant portion of the driving here. Here’s my rationale: The system has a feedback loop that must shuttle video data from Singapore to Wakanda, then Shuri has to respond, and her control signal must be digitized and sent back from Wakanda to Singapore, continuously. Presuming some stuff, that’s a distance of 7633 kilometers / 4743 miles. If that signal were unimpeded light and Shuri’s response time instantaneous, the round trip alone would take on the order of 50 milliseconds. Sure, this is speculatively-advanced, but it’s still technology, and there are analog-to-digital, digital-to-analog, encryption, and decryption conversions to be managed, signal boosts along the way, and the impedance of whatever network these signals are riding—real-world network latency over that kind of route runs to hundreds of milliseconds. Plus, as awesome as Shuri is, her response time is longer than zero. The feedback loop would be way longer than the 100-millisecond budget required to feel like instantaneous response.
Without presuming some physics-breaking stuff, there will be a significant lag between what’s happening around the actual car and Shuri’s remote reaction getting back to that car. In a high-speed chase like this, the lag would prove disastrous, and the only way I can apologize my way around it is if Griot spun up some aspect of himself in the kimoyo bead sitting on the car, and that aspect is doing the majority of the stunt driving. For all the excitement that Shuri is feeling, she is likely just providing broad suggestions about what she thinks should happen, and Griot is doing the rest. (Long-time readers will note this would be similar to the relationship I describe between JARVIS and Tony Stark.) Shuri is just an input. An important one—and one that would dislike being disregarded—but still, an input.
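The feedback-loop arithmetic can be sketched as a quick latency budget. The speed-of-light figure follows from the distance in the text; the overhead numbers (conversion, encryption, network hops, and a ~200 ms human reaction time) are illustrative guesses, not measurements:

```python
# Back-of-the-envelope latency budget for remote driving between
# Wakanda and Singapore, per the distance given in the post. The
# overhead figures are illustrative assumptions, not measurements.

C_KM_PER_S = 299_792.458   # speed of light in vacuum
DISTANCE_KM = 7_633        # Wakanda <-> Singapore (per the post)

def round_trip_light_ms(distance_km: float) -> float:
    """Round-trip time for an unimpeded light-speed signal, in ms."""
    return 2 * distance_km / C_KM_PER_S * 1000

light_ms = round_trip_light_ms(DISTANCE_KM)  # ~51 ms

# Hypothetical per-loop overheads (milliseconds):
overheads_ms = {
    "a/d and d/a conversion": 10,
    "encryption/decryption": 10,
    "network hops and signal boosts": 100,
    "Shuri's reaction time": 200,  # human choice reaction is ~200+ ms
}

total_ms = light_ms + sum(overheads_ms.values())
PERCEPTION_BUDGET_MS = 100  # classic threshold for "instantaneous"

print(f"light alone: {light_ms:.0f} ms; full loop: {total_ms:.0f} ms")
print("feels instantaneous?", total_ms <= PERCEPTION_BUDGET_MS)
```

Even the physically perfect case eats half the perception budget, and any realistic overhead blows far past it, which is the core of the Griot-must-be-driving argument.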
HUD notes
The HUD bears two quick notes about its display.
The hexagonal shapes in the background house the video projection.
First, the video feed around the remote operator is 2D photorealistic video projected onto a sphere. Modern racing games mostly use the 2D displays of televisions as well, and they’re enjoyable, but I should think that immersion and responses would be better with a three-dimensional volumetric display instead, improving the visual data with parallax. That would be difficult to convey on screen for the audience, but I don’t think impossible.
Second, when Klawe’s minions cause a pile-up in an intersection, Shuri’s view shows the scene with the obstacles overlaid in red. As a bit of assistance, that shows us several things: Griot is watching the scene, and able to augment the display in real time. She would find more of this context- and goal-aware augmentation useful. For instance, she wouldn’t have had to ask which side of the road Singaporeans drive on. (It’s the left, by the way, like the UK. Her steering wheel, if it was to match the car’s, should have been on the right. Nearly all of the driving in the scene happens on the wrong side of the road to feel “correct” to right-driving audiences.)
Haptics
It’s also really interesting to note that the seat provides strong haptic feedback. When T’Challa dumps a minion from the SUV in front of the car, the controlled car speed-bumps over the body. Shuri’s seat matches the bump, and she asks T’Challa, “What was that?” (This is a slightly unbelievable moment. Her focus is on the scene, and her startle response could not help but alert her to a dark shape symmetrically expanding.) We know from motion simulators that tilting a seat up and down can strongly mimic momentum as if traveling, so I’m guessing that Shuri’s very much feeling the chase.
We are not shown what happens when T’Challa makes that sharp emergency turn and lifts the real car by around 35 degrees, but Griot must have supplied her with a just-in-time seatbelt if she was angled similarly.
When Klawe manages to shoot his arm-cannon at the remotely-controlled car, destroying it, for some reason Shuri’s vibranium dust simply…collapses, dropping her rudely to the floor. This had to be added in to the design of the system, and I cannot for the life of me figure out why this would be a good thing. Just…no.
Fit to purpose?
Shuri’s remote driving interface gives her well-mapped controls with rich sensory feedback, low latency, and at least the appearance of control, even if Griot is handling all the details. The big critiques are that Griot must be “there” quietly doing most of the work, that the HUD could provide a richer augmentation to help her make better real-time suggestions, and the failure event should not risk a broken coccyx.
Black Georgia Matters
Each post in the Black Panther review is followed by actions that you can take to support black lives.
Looking back at these posts, I am utterly floored at the number of things that have occurred in the world that are worth remarking on with each post. Floyd’s murder. Boseman’s passing. Ginsberg’s passing and hasty, hypocritical replacement. The national election. And while there is certainly more to say about anti-racism in general, for this post let’s talk about Georgia.
Always on my mind, or at least until July 06.
Despite outrageous, anti-democratic voter suppression by the GOP, for the first time in 28 years, the state went blue for the presidential election, verified with two hand recounts. Credit to Stacey Abrams and her team’s years of effort to get out the Georgian—and particularly the powerful black Georgian—vote.
But the story doesn’t end there. Though the Biden/Harris ticket won the election, if the Senate stays majority red, Moscow Mitch McConnell will continue the infuriating obstructionism with which he held back Obama’s efforts in office for eight years. The Republicans will, as they have done before, ensure that nothing gets done.
To start to undo the damage the fascist and racist Trump administration has done, and maybe make some actual progress in the US, we need the Senate majority blue. Georgia is providing that opportunity. Neither of the wretched Republican incumbents got 50% of the vote, resulting in a special runoff election January 5, 2021. If these two seats go to the Democratic challengers, Warnock and Ossoff, it will flip the Senate blue, and the nation can begin to seriously right the sinking ship that is America.
Residents can also volunteer to become a canvasser for either of the campaigns, though it’s a tough thing to ask in the middle of the raging pandemic.
The rest of us (yes, even non-American readers) can contribute either to the campaigns directly using the links above, or to Stacey Abrams’ Fair Fight campaign. From the campaign’s web site:
We promote fair elections in Georgia and around the country, encourage voter participation in elections, and educate voters about elections and their voting rights. Fair Fight brings awareness to the public on election reform, advocates for election reform at all levels, and engages in other voter education programs and communications.
If you don’t want to donate money directly, you can join a letter writing campaign to help get out the vote, via the Vote Forward campaign.
We will continue moving the country into the anti-racist future regardless of the runoff, but we can make much, much more progress if we win this election. Please join the efforts as best you can even as you take care of yourself and your loved ones over the holidays. So very much depends on it.
One of the ubiquitous technologies seen in Black Panther is the kimoyo bead. They’re liberally scattered all over the movie like tasty, high-tech croutons. These marble-sized beads are made of vibranium and are more core to Wakandans’ lives than cell phones are to ours. Let’s review the 6 uses seen in the film.
1. Contact-EMP bombs
We first see kimoyo beads when Okoye equips T’Challa with a handful to drop on the kidnapper caravan in the Sambisa forest. As he leaps from the Royal Talon, he flings these, which flatten as they fall, and guide themselves to land on the hoods of the caravan. There they emit an electromagnetic pulse that stops the vehicles in their tracks. It is a nice interaction that does not require much precision or attention from T’Challa.
2. Comms
Wakandans wear bracelets made of 11 kimoyo beads around their wrists. If they pull the comms bead and place it in the palm, it can project very lifelike volumetric displays as part of realtime communication. It is unclear why the bead can’t just stay on the wrist and project at an angle to face the user’s line of sight, as it does when Okoye presents to tribal leaders (below).
We see a fascinating interaction when T’Challa and W’Kabi receive a call at the same time, and put their bracelets together to create a conference call with Okoye.
The scaled-down version of the projection introduces many of the gaze-matching problems identified in the book. As with those scenes in Star Wars, we don’t see the conversation from the other side. Is Okoye looking up at giant heads of T’Challa and W’Kabi? Unlikely. Wakanda is advanced enough to manage gaze correction in such displays.
Let me take a moment to appreciate how clever this interaction is from a movie maker’s perspective. It’s easy to imagine each of them holding their own bead separately and talking to individual instances of Okoye’s projection. (Imagine being in a room with a friend and both of you are on a group call with a third party.) But in the scene, she turns to address both T’Challa and W’Kabi. Since the system is doing body-and-face gaze correction, the two VP displays would look slightly different, possibly confusing the audience into thinking these were two separate people on the call. Wakandans would be used to understanding these nuances, but us poor non-Wakandans are not.
This is confusing.
This is also confusing.
The shared-display interaction helps bypass these problems and make the technology immediately understandable and seamless.
Later Shuri also speaks with Okoye via communication bead. During this conversation, Shuri removes another bead, and tosses it into a display to show an image and dossier of Killmonger. Given that she’s in her lab, it’s unclear why this gesture is necessary rather than, say, just looking toward a display and thinking, “Show me,” letting the AI Griot interpret from the context what to display.
A final communication happens immediately after, as Shuri summons T’Challa to the lab to learn about Killmonger. In this screenshot, it’s clear that the symbol for the comms bead is an asterisk or star, which mimics the projection rays of the display, and so has some nice semantics to help users learn which symbols do what.
3. Presentation
In one scene, Okoye gives the tribal leaders a sitrep using her kimoyo beads as a projector. Here she is showing the stolen Wakandan artifact. Readers of the book will note the appearance of projection rays that are standard sci-fi signals that what is seen is a display. A lovely detail in the scene is how Okoye uses a finger on her free hand to change the “slide” to display Klawe. (It’s hard to see the exact gesture, but it looks like she presses the projection bead.) We know from other scenes in the movie that the beads are operated by thought-command. But that would not prevent a user from including gestures as part of the brain pattern that triggers an event, and it would make a nice second-channel confirmation, as discussed in the UX of Speculative Brain-Computer Inputs post.
4. Remote piloting
When T’Challa tours Shuri’s lab, she introduces him to remote access kimoyo beads. They are a little bigger than regular beads and have a flared, articulated base. (Why can’t they just morph mid-air like the ones we see in the kidnapper scene?) These play out in the following scene when the strike team needs to commandeer a car to chase Klawe’s caravan. Okoye tosses one onto the hood of a parked car, its base glows purple, and thereafter Shuri hops into a vibranium-sand simulacrum of the car in her lab and remotely operates it.
A quick note: I know that the purple glow is there for the benefit of the audience, but it certainly draws attention to itself, which it might not want to do in the real world.
In the climactic battle of the tribes with Killmonger, Shuri prints a new bracelet and remote control bead for Agent Ross. She places the bracelet on him to enable him to remote pilot the Royal Talon. It goes by very quickly, and the scene is lit quite sparsely, but the moment she puts it on him, you can see that the beads are held together magnetically.
5. Eavesdropping
When Agent Ross is interrogating the captured Klawe, we get a half-second shot to let us know that a kimoyo bead has been placed on his shoulder, allowing T’Challa, Okoye, and Nakia to eavesdrop on the conversation. The output is delivered by a flattened bone-conducting speaker bead behind their left ears.
6. Healing
Later in the scene, when Killmonger’s bomb grievously wounds Agent Ross in his spine, T’Challa places one of Nakia’s kimoyo beads onto the wound, stabilizing Ross long enough to ferry him to Wakanda where Shuri can fully tend to him. The wound conveniently happens to be kimoyo-bead sized, but I expect that given its shape-shifting powers, it could morph to form a second-skin over larger wounds.
I wondered if kimoyo beads were just given to Wakandan royalty, but it’s made clear in the scene where T’Challa and Nakia walk through the streets of Birnin Zana that every citizen has a bracelet. There is no direct evidence in the film, but given the pro-social-ness throughout, I want to believe that all citizens have free access to the beads, equipping each of them to participate equitably in the culture.
So, most of the interaction is handled through thought-command with gestural augmentation. This means that most of our usual concerns of affordances and constraints are moot. The one thing that bears some comment is the fact that there are multiple beads on the bracelet with different capabilities. How does a user know which bead does what?
As long as the beads can do their job in place on the wrist, I don’t think it matters. As long as all of the beads are reading the user’s thoughts, only the one that can respond need respond. The others can disregard the input. In the real world you’d need to make sure that one thought isn’t interpretable as multiple commands, a problem discussed on my team at IBM as disambiguation. If it is, you must design an interaction where the user can help disambiguate the input, or tell the system which meaning they intend. We never see this edge case in Black Panther.
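This route-or-disambiguate pattern can be sketched in a few lines of Python. Everything here is hypothetical: the bead names, the intent sets, and the routing function are invented purely for illustration.

```python
# Hypothetical sketch: each bead registers the intents it can handle.
# A decoded thought is routed to the single bead that claims it; when
# more than one bead claims it, we ask the user rather than guess.

def route_intent(thought, beads):
    """Return the bead that should act, or a disambiguation prompt."""
    candidates = [b for b in beads if thought in b["intents"]]
    if len(candidates) == 1:
        return f"{candidates[0]['name']} handles '{thought}'"
    if not candidates:
        return f"no bead responds to '{thought}'"
    names = ", ".join(b["name"] for b in candidates)
    return f"ambiguous '{thought}': did you mean {names}?"

beads = [
    {"name": "comms", "intents": {"call", "project"}},
    {"name": "remote", "intents": {"project", "pilot"}},
]

print(route_intent("call", beads))     # → comms handles 'call'
print(route_intent("project", beads))  # → ambiguous 'project': did you mean comms, remote?
```

The key design choice is the last branch: an ambiguous thought produces a question back to the user, which is exactly the kind of interaction the film never shows.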
It seems that some of the beads have specialized functions that cannot be performed by the others. Each has several symbols engraved into it, the indentations of which glow white for easy identification. The glow is not persistent across all uses, so it must be either context-aware and/or a setting that users can think to change. But even when not lit, the symbols are clear, and clearly distinguishable, so once the user learns the symbols, the labeling should help.
Black Votes Matter
Today is an important day in the United States. It’s election day 2020, among the most important days in U.S. politics, ever. Among Trump’s litany of outrageous lies across his presidency is this whopper: “I have done more for Black Americans than anybody, except for the possible exception of Abraham Lincoln.” (Pause for your spit take and cleaning your screen.)
As infuriating and insulting as this statement is emotionally (like, fuck you for adding “possible” in there, like it’s somehow possible that you’ve done more than freed our black citizens from slavery, you maggot-brained, racist, malignant narcissist) let’s let the Brookings Institution break down why, if you believe Black Lives Matter, you need to get out there and vote blue all the way down the ticket.
You should read that whole article, but here are some highlights/reminders:
Trump ended racial sensitivity training, and put a ban on trainings that utilize critical race theory
Hate crimes increased over 200% in places where Trump held a campaign rally in 2016
He dismissed the Black Lives Matters movement, said there were “fine people” among white supremacist groups, and rather than condemning the (racist, not gay) Proud Boys, told them to “stand by.”
Not a single one of his 53 confirmed appeals court judges is black.
The criminal mishandling of the COVID-19 pandemic has killed twice as many black Americans as it has white Americans. (Don’t forget he fired the pandemic response team.)
If you are reading this on election day, and have not already done so, please go vote blue. Know that if you are still in line when the polls officially close, they have to stay open until everyone in line has voted. If you have voted, please help others in need. More information is below.
If you are reading this just after election day, we have ample evidence that Trump is going to try to declare the election rigged if he loses (please, please let it be when he loses to a massive blue wave). You can help set the expectation among your circle of friends, family, and work colleagues that we won’t know the final results today. We won’t know it tomorrow. We may have a better picture at the end of the week, but it will more likely take until late November to count everyone’s vote, and possibly until mid-December to certify everyone’s vote.
And that’s what we do in a liberal democracy. We count everyone’s vote, however long that takes. To demand it in one day during a pandemic is worse than a toddler throwing an “I want it now” tantrum. And we are so very sick of having a toddler in this position.
before we get into the Kimoyo beads, or the Cape Shields, or the remote driving systems…
before I have to dismiss these interactions as “a wizard did it” style non-designs…
before I review other brain-computer interfaces in other shows…
…I wanted to check on the state of the art of brain-computer interfaces (or BCIs) and see how our understanding had advanced since I wrote the Brain interface chapter in the book, back in the halcyon days of 2012.
Note that I am deliberately avoiding the tech side of this question. I’m not going to talk about EEG, PET, MRI, and fMRI. (Though they’re linked in case you want to learn more.) Modern BCI technologies are evolving too rapidly to bother with an overview of them. They’ll change in the real world by the time I press “publish,” much less by the time you read this. And sci-fi tech is most often a black box anyway. But the human part of the human-computer interaction model changes much more slowly. We can look to the brain as a relatively unalterable component of the BCI question, leading us to two believability questions of sci-fi BCI.
How can people express intent using their brains?
How do we prevent accidental activation using BCI?
Let’s discuss each.
1. How can people express intent using their brains?
In the see-think-do loop of human-computer interaction…
See (perceive) has been a subject of visual, industrial, and auditory design.
Think has been a matter of human cognition as informed by system interaction and content design.
Do has long been a matter of some muscular movement that the system can detect, to start its matching input-process-output loop. Tap a button. Move a mouse. Touch a screen. Focus on something with your eyes. Hold your breath. These are all ways of “doing” with muscles.
The “bowtie” diagram I developed for my book on agentive tech.
But the first promise of BCI is to let that doing part happen with your brain. The brain isn’t a muscle, so what actions are BCI users able to take in their heads to signal to a BCI system what they want it to do? The answer to this question is partly physiological, about the way the brain changes as it goes about its thinking business.
Ah, the 1800s. Such good art. Such bad science.
Our brains are a dense network of bioelectric signals, chemicals, and blood flow. But it’s not chaos. It’s organized. It’s locally functionalized, meaning that certain parts of the brain are predictably activated when we think about certain things. But it’s not like the Christmas lights in Stranger Things, with one part lighting up discretely at a time. It’s more like an animated proportional symbol map, with lots of places lighting up at the same time to different degrees.
The sizes and shapes of what’s lighting up may change slightly between people, but a basic map of healthy, undamaged brains will be similar to each other. Lots of work has gone on to map these functional areas, with researchers showing subjects lots of stimuli and noting what areas of the brain light up. Test enough of these subjects and you can build a pretty good functional map of concepts. Thereafter, you can take a “picture” of the brain, and you can cross-reference your maps to reverse-engineer what is being thought.
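As a toy illustration of that cross-referencing step, here is a nearest-pattern decoder in Python. The three-element “activation” vectors and the concept names are invented stand-ins for real voxel data; actual decoders operate on tens of thousands of dimensions with far more sophisticated statistics.

```python
# Toy sketch of "cross-referencing the map": store one activation
# pattern per concept, then classify a new pattern by finding the
# stored concept with the highest cosine similarity. All data invented.
import math

concept_map = {
    "light":  [0.9, 0.1, 0.2],
    "water":  [0.1, 0.8, 0.3],
    "danger": [0.2, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def decode(activation):
    """Return the stored concept whose pattern best matches."""
    return max(concept_map, key=lambda c: cosine(concept_map[c], activation))

print(decode([0.85, 0.15, 0.25]))  # → light
```

A noisy reading close to the “light” pattern still decodes as “light,” which is the whole point: the map doesn’t need an exact match, just a best one.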
Right now those pictures are pretty crude and slow, but so were the first actual photographs in the world. In 20–50 years, we may be able to wear baseball caps that provide a much more high-resolution, real time inputs of concepts being thought. In the far future (or, say, the alternate history of the MCU) it is conceivable to read these things from a distance. (Though there are significant ethical questions involved in such a technology, this post is focused on questions of viability and interaction.)
Similarly, the brain maps we have cover only a small percentage of an average adult vocabulary. Jack Gallant’s semantic map viewer (pictured and linked above) shows the maps for about 140 concepts, and estimates of an average active vocabulary run around 20,000 words, so we’re looking at a tenth of a tenth of what we can imagine (not even counting the infinite composability of language). But in the future we will not only have more concepts mapped, more confidently, but we will also have idiographs for each individual, like the personal dictionary in your smart phone.
All this is to say that our extant real world technology confirms that thoughts are a believable input for a system. This includes linguistic inputs like “Turn on the light” and “activate the vibranium sand table” and “Sincerely, Chris” and even imagining the desired change, like a light changing from dark to light. It might even include subconscious thoughts that have yet to be formed into words.
2. How do we prevent accidental activation?
But we know from personal experience that we don’t want all our thoughts to be acted on. Take, for example, those thoughts you have when you’re feeling hangry, or snarky, or dealing with a jerk-in-authority. Or those texts and emails that you’ve composed in the heat of the moment but wisely deleted before they got you in trouble.
If a speculative BCI is being read by a general artificial intelligence, it can manage that just like a smart human partner would.
He is composing a blog post, reasons the AGI, so I will just disregard his thought that he needs to pee.
And if there’s any doubt, an AGI can ask. “Did you intend me to include the bit about pee in the post?” Me: “Certainly not. Also BRB.” (Readers following the Black Panther reviews will note that AGI is available to Wakandans in the form of Griot.)
If AGI is unavailable to the diegesis (and it would significantly change any diegesis of which it is a part) then we need some way to indicate when a thought is intended as input and when it isn’t. Having that be some mode of thought feels complicated and error-prone, like when programmers have to write regex expressions that escape escape characters. Better I think is to use some secondary channel, like a bodily interaction. Touch forefinger and pinky together, for instance, and the computer understands you intend your thoughts as input.
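The second-channel idea can be sketched concretely: a decoded thought only becomes a command while the confirmation gesture is held. The event names and the pinch gesture below are hypothetical, chosen to match the forefinger-and-pinky example above.

```python
# Sketch of gesture-gated BCI input: thoughts stream in continuously,
# but only those arriving while the confirmation gesture ("pinch") is
# active are treated as commands. Everything else is discarded.

def filter_commands(events):
    """Keep only thoughts that arrive while the gesture gate is open."""
    gate_open = False
    commands = []
    for kind, value in events:
        if kind == "gesture":
            gate_open = (value == "pinch")  # pinch means "this is input"
        elif kind == "thought" and gate_open:
            commands.append(value)
    return commands

events = [
    ("thought", "I need coffee"),        # idle musing, ignored
    ("gesture", "pinch"),
    ("thought", "lights on"),            # intended command
    ("gesture", "release"),
    ("thought", "that meeting ran long"),  # ignored again
]

print(filter_commands(events))  # → ['lights on']
```

Note that the gate is stateful rather than momentary, so a user can hold the gesture through a multi-part command without re-confirming each fragment.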
So, for any BCI that appears in sci-fi, we would want to look for the presence or absence of AGI as a reasonableness interpreter, and, barring that, for some alternate-channel mechanism for indicating deliberateness. We would also hope to see some feedback and correction loops to understand the nuances of the edge-case interactions, but these are rare in sci-fi.
Even more future-full
This all points to the question of what seeing/perceiving via a BCI might be. A simple example might be a disembodied voice that only the user can hear.
A woman walks alone at night. Lost in thoughts, she hears her AI whisper to her thoughts, “Ada, be aware that a man has just left a shadowy doorstep and is following, half a block behind you. Shall I initialize your shock shoes?”
What other than language can be written to the brain in the far future? Images? Movies? Ideas? A suspicion? A compulsion? A hunch? How will people know what are their own thoughts and what has been placed there from the outside? I look forward to the stories and shows that illustrate new ideas, and warn us of the dark pitfalls.
All of these build on the given that vibranium is a very powerful substance and that Wakanda’s scientists have managed to gain a very, very sophisticated control over it.
In the Talon
This table is about a meter square, and raised off the floor around knee-height. As Okoye and T’Challa approach the traffickers in the Sambisa Forest, T’Challa approaches the table and it springs to life, showing him a real-time model of the traffickers’ vehicle train. T’Challa picks up the model of the small transport truck and with a finger, wipes off its roof, revealing that there are over a dozen people huddled within. One of the figures glows amber. (It’s Nakia.) He places the truck back into the display, and the display collapses back to inert sand.
A quick critique of this interaction. The sand highlights Nakia for T’Challa, but why did it wait for him to find her truck and wipe off the top of it to look inside? It knew his goal (find Nakia), could clearly scan inside the vehicles, and understood the context (she’s in one of those trucks), so it should not have waited for him to pick up each vehicle and scrape off its roof to check which one she was in. The interface should have drawn his attention to the truck it knew she was in. This is a “stoic guru” mistake that I’ve critiqued before. You know, the computer knows all, but only tells you when you ask it. It would be much more sensible for the transport truck to be glowing from the moment the table goes live, as in the comp below.
Designers: Don’t wait for users to ask just the right thing at the right time.
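The proactive alternative is simple to express: if the system knows the goal and can scan the scene, it should compute its highlights before the user asks anything. This sketch is entirely hypothetical, with invented vehicle data standing in for whatever the table’s sensors actually report.

```python
# Sketch of the "don't be a stoic guru" fix: surface the match the
# moment the display activates, instead of waiting to be queried.

def initial_highlights(goal_person, vehicles):
    """Return the vehicles to glow as soon as the table goes live."""
    return [v["id"] for v in vehicles if goal_person in v["occupants"]]

convoy = [
    {"id": "lead car",        "occupants": ["driver"]},
    {"id": "transport truck", "occupants": ["driver", "Nakia", "captives"]},
    {"id": "rear car",        "occupants": ["driver", "guard"]},
]

print(initial_highlights("Nakia", convoy))  # → ['transport truck']
```

The point isn’t the lookup itself but when it runs: at display activation, driven by the known mission goal, not in response to a user poking at each model.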
Otherwise, this is a good high-tech use of the sand table for the more common meaning of “sand table,” which is a 3-dimensional surface for understanding a theatre of conflict. It doesn’t really help him run through scenarios, testing various tactics, but T’Challa is a warrior king, he can do all that in his head.
The interaction also nicely blurs the line between display and gestural interactive tool, in the same way that the Prometheus astrometrics display did. Like that other example, it would be useful for the display to distinguish when it is representing reality, and when the display is being interrupted or modified. Also, T’Challa is nice enough to put the truck back where it “belongs,” but the design would also need to handle how to respond when T’Challa puts the truck back in the wrong place, or, say, crushes the truck model with his hand in fury.
In Prometheus it was an Earth, not a truck, but still focused on Africa.
Shuri’s lab
The largest table we see in the movie is in Shuri’s lab. After Black Panther challenges Killmonger and engages in battle outside the capital city, Shuri, Nakia, and Agent Ross rush down to the lab. As they approach an edge-lit hexagonal table, the vibranium sand lowers to reveal 3D-printed armor and weaponry for Shuri and Nakia to join the fight. (Though it’s not like modern 3D printing, these are powered weapons and kimoyo beads, items with very sophisticated functionality.)
Shuri outfits Ross with kimoyo beads from the print and takes off to join the fight. In the lab, the table creates a seat for Ross to remote-pilot the Royal Talon. Up on the flight deck, Shuri throws a control bead onto the Talon, and an AI in the lab named Griot announces to Agent Ross, “Remote piloting system activated.” (Hey, Trevor Noah, we hear you there!)
Around the seat, a volumetric projection of the Talon appears around him, including a 360° display just beyond the windshield that gives him a very immersive remote flying experience. We hear Shuri’s voice explain to Ross “I made it American Style for you. Get in!”
Ross sits down, grabs joystick controls, and begins remote-chasing down the cargo ships that are carrying munitions to Killmonger’s War Dogs around the world. (The piloting controls and HUD for Ross are a separate issue, and will be handled in their own post.)
The moment that Ross pilots the Talon through the last cargo ship, the volumetric projection disappears and the piloting seat returns to sand, ungraciously plopping Ross down to the floor of the lab.
It is in this shot that we realize that the dark tiles of the lab’s floor are all recessed vibranium sand tables. I can count seven in the shot. So the lab is full of them.
Display material
Let’s talk for a bit about the display choices. Vibranium can change to display any color and a shape down to a fine level of detail. See the screen cap below for an example of perfectly lifelike (if scaled) representation.
This is a vibranium-powered volumetric display. It raises the gaze matching issues we’ve seen before.
So why would it be designed so that in most cases, the display is sparkly and black like black tourmaline? Wouldn’t the truck that T’Challa picks up be most useful if it was photographically rendered? Wouldn’t the remote piloting chair be more comfortable if it had pleather- and silicone-like surfaces?
Extradiegetically, I understand the reason is because art direction. We want Wakandan tech to be visibly different than other tech in the MCU, and having it look like vibranium dust ties it back to that key plot element.
But, per the stance of this blog, I try to look for a diegetic reason. It might be a deliberate reminder of the resource on which their technological fortunes are built. And as the Okoye VP above shows, they aren’t purists about it. When detail is needed, it’s included. So perhaps this is it. That implies a great deal of sophistication on the part of the displays to know when photorealism is needed and when it is not, but the presence of Griot there tells us that they have something approaching general AI.
Missing interactions
So, just like I had to do for the Royal Talon, I have to throw my hands up about reviewing the interactions with the sand tables, because we don’t see the interactions that would give these results.
How were the mission goals communicated to the Royal Talon table? Is it programmed to activate when someone approaches it, or did T’Challa issue a mental command? How did Shuri specify those weapons and that armor? What did she do to make the ship “American style” for Ross? Is that a template? Was it Griot’s interpretation of her intention? Why did the remote piloting seat vanish the moment the mission was complete? Was this something Shuri set up in advance, or Griot’s way of telling Agent Ross to GTFO for his own safety? How does someone in the lab instruct a floor tile to leap up and become a table and do stuff? It’s almost certainly via mental commands through the kimoyo beads, but that’s conjecture. The film really provides little evidence.
On the one hand, this is appropriate for us mere non-Wakandans observing the most technologically advanced society on earth. Much of it would feel like inexplicable magic to us.
On the other, sci-fi routinely introduces us to advanced technologies, and doesn’t always eschew the explanatory interactions, so the absence is notable here. It’s magic.
Black Lives Matter
Each post in the Black Panther review is followed by actions that you can take to support black lives.
In the last post we grieved Chadwick Boseman’s passing. This week we’re grieving the loss of Ruth Bader Ginsburg. May her memory be a blessing. With her loss, the GOP is ratcheting up its outrageous hypocrisy by reversing a precedent that they themselves established when Obama was president. The “Moscow Mitch Rule” (oh, oops, sorry) “McConnell Rule” was that new Justices should not be appointed within a year of a general election, so the people’s voice can be taken into account. Of course, the bastards are just ignoring that now and trying to ram through one of their own before election day. This Justice will certainly be a conservative, and we know with this administration that means reactionary, loyal to tiny-hand Twittler, and racist as a Jim Crow law.
There are a few arrows in citizen’s quivers to stop this. One is to convince at least 4 Republican Senators to reject this outright hypocrisy, put country over party, and adhere to the McConnell rule.
To help put pressure where it might work, you can leave voicemails with Republican Senators who may be mulling whether to put country over party. Those 6 Senators’ names and numbers are below. Here’s a script for your message:
Hello, my name is ______. In 2016, Mitch McConnell created the principle of not confirming a Supreme Court Justice in an election year until after the next inauguration. For the legitimacy of the Court in the eyes of the people, I’m asking Senator ________ to uphold that principle by refusing to confirm a new Justice until after a new President is installed. Thank you.
—You, hopefully
Lisa Murkowski, Alaska; (202) 224-6665
Mitt Romney, Utah: (202) 224-5251
Susan Collins, Maine: (202) 224-2523
Martha McSally, Arizona: (202) 224-2235
Cory Gardner, Colorado: (202) 224-5941
Chuck Grassley, Iowa: (202) 224-3744
I’ve made my calls and left my messages. Can you do the same to stop the hypocritical Trumpian power grab that would tip the Supreme Court for generations?