The Fritzes award honors the best interfaces in a full-length motion picture in the past year. Interfaces play a special role in our movie-going experience, and are a craft all their own that does not otherwise receive focused recognition. Best Assistant is a special award that I’m giving for the first time.
Ok but why now? Well, in March of this year I published a new non-fiction book about the design of technology that assists people doing things (as opposed to doing things for them). It’s called Designing Assistant Technology: AI That Makes People Smarter. In the book I lay out a framework for categorizing assistant interactions, and describe the risks and mitigations of having an assistant in the mix. I daresay it’s valuable not only for designers, but for scriptwriters and futurists as well. If that intrigues you, look for a discount code near the end of this article.
Anyway, it gave me the idea to select the movie with the best examples of Assistants.
The 2026 Award for Best Assistants: M3gan 2.0
I know, I’m as surprised as you are.
The first movie, while smarter than I expected, seemed to be a horror flick that was using AI as set dressing. It did get a shout-out in the Fritzes 2024 for best HUD, but as I recall, its unbounded atomic optimization was just another way to frame it as a ruthless, efficient killer. But this second one seems to take the theme more seriously, and the scriptwriters did their homework.

In Part II of the book, I build on the see-think-do loop (which is core to interaction design) to identify the Five Universal Assists. These are an exhaustive set of categories by which technology can assist users: Perceive, Know, Plan, Perform, and Reflect. And to my surprise, when you look closely, there are examples of all five of the universal assists in M3gan 2.0, more than in any other film of 2025.
Note: M3gan jumps bodies many times over the course of the movie, so you’ll see her described with the same name but vastly different appearances in the screenshots.
Perceive
In this assist, the tech helps users perceive signal amidst noise.
Early in the film, Cady discovers that the source code of Better Bionix is being hacked. When everyone comes over to see what’s on her screen, Tess says, “Oh, Jesus. She’s right. There’s stray commands all over the source code.” The screen we see doesn’t ask them (or us) to try to detect which of the dozens of lines on screen are suspect. Those lines are colored red to contrast strongly with the screen-green, and in case you’re colorblind, they’re indented as well.

You might think that M3gan’s alerting Gemma of the FBI home invasion is an example of perceive, but Gemma was sleeping when the alert came. In that context, M3gan is acting more as an agent. (More on that below.)





In act 2, Gemma asks M3gan to boost the audio of two conversants at a noisy party, and that might as well be the canonical example. (The first time she does it, M3gan substitutes the audio in a very snarky way, reminding the audience that in a super-AI-mediated world, you cannot implicitly trust the media it controls. That’s over-reliance, another theme from Part III of the book.)

Know
In this assist, the tech helps users understand the meaning of what they’ve perceived, either in shallow ways such as names and categories, or very deeply.
HUDs have this built into the trope, and there are plenty of HUDs throughout.
But also, when beginning their joint hunt for AMELIA, M3gan explains that every battery Altwave (the villain corporation in the film) makes has a remote-controllable kill switch, giving meaning to what Gemma sees in the file.




When infiltrating Altwave, M3gan(toy) explains why AMELIA is there as well: she seeks to control Altwave’s cloud servers, which serve half of North America. That control would enable AMELIA to disable the economy, threatening “societal collapse in 10 to 12 working days”.

Plan
In this assist, it helps users plan their course of action, tactically or strategically.
When M3gan comes out of hiding and presents a deal to Gemma, she explains that she’s run a thousand simulations, and if they don’t team up, more people die than if they do. M3gan asks, “Who is the real killer in that situation?” Not having much of a choice, Gemma agrees.

A key part of the planning assist is helping users know what the best course of action is.
Perform
In this assist, the tech helps users perform some task.
One of the first scenes in the film has Tess and Cole demonstrating an exosuit. In their pitch they explain to the potential investor that its purpose is to help laborers avoid fatigue while performing physical tasks. To demonstrate, Cole lifts huge concrete blocks without showing any signs of exertion.





A few beats later, a slimy Elon Musk stand-in demonstrates how his neural chip helps him stand, though he is ordinarily bound to his wheelchair.



In the climax, M3gan stows away on a neural chip forcibly implanted in Gemma. When Gemma dons an exosuit, the AI helps her defeat many goons in hand-to-hand combat. It’s arguably acting as an agent here, since Gemma isn’t trying to build those skills. (Similarly, when Gemma gets knocked unconscious, M3gan controls the exosuit to animate her body anyway, something we also see in Section 31, but more on this example in a later post.)
Reflect
In this assist—the most abstract of them—the technology helps users reflect on things to turn experience into knowledge, or to question their goals and future tactics.
There’s a lot less of this here, just as there is in the real world. But we see some of it. When Cady asks M3gan(half-formed) how she can feel anything, M3gan replies, “Can you explain why you feel things?” It’s rhetorical in context, but exactly the sort of thing that a reflection assistant might ask.

When Gemma is spiraling about her parenting in the basement, M3gan(souped up) takes a moment to share counterexamples. “I saw you wake up every day at 4:00 A.M., staring at the ceiling contemplating what the future holds for her…I watched you make homemade lunches with fresh-baked sourdough…I watched you help her with her homework, even though it always ended in a fight…Gemma, it’s not a failure to feel guilt or that you’re not enough. It’s part of the job.” It’s not the best fit for the definition of this assist I give in the book, but it’s the closest thing in the movie and the closest thing in my survey of the year’s films.

Also agents
There are also many examples where M3gan(AI) acts as an agent on people’s behalf, but that was my last book, so I’ll skip getting into those examples. As you watch the movie, though, keep an eye out for additional shout-outs to the paperclip thought experiment (a metaphor for the threat of instrumental convergence), allusions to the Xerox WorkCentre scanner bug, and of course super AI as an existential threat. The whole plot can be seen as an example of Bostrom’s a priori argument that multiple super AIs are the most stable scenario. All of this is why I say the writers seem to have done their homework.
I’m a lot less fond of how the guy who wants to regulate/eliminate AI is painted as the bad guy, but having positioned M3gan as sentient and the antihero of the film, I’m not sure what else they could do. Still, I wish it didn’t valorize AI as equivalent to humans despite all of that. We have enough LeMoinian panic about large language models as it is.
Anyway, congratulations to M3gan 2.0 for showing so many examples of assistants throughout. If you’re interested in getting the book, you can get 20% off if you purchase from Rosenfeldmedia.com and use the code “scifi26” during checkout. Use this power only for good.
And let me know in comments if you think of other examples of assistants across the year.
IMDb: https://www.imdb.com/title/tt26342662/
Next up: A Big Screen Label Roundup (currently scheduled for 8 May 2026)