Lessons about Her (7/8)

Ordinarily, my final post in a movie review is a report card for the film. But since a few interfaces are still missing from this analysis, and since I wrote it from a single cinema viewing and a reading of Jonze’s script, I’ll wait until the film is out on DVD to commit that final evaluation to pixels.

But I do think it’s OK to consider what we can learn specifically from this particular interface. So, given this…lengthy…investigation into OS1, what lessons can we take to inform our work here in the real world?

Related lessons from the book

  • Audiences already knew about operating systems, so Jonze was building on what users already know (page 19)
  • OS1 mixed mechanical and other controls (page 26)
  • The earpiece had differentiated system sounds for different events (page 111)
  • Samantha put information in the channels it fit best. (page 116)
  • Given her strong AI, nobody needed to reduce vocabulary to increase recognition. In fact, they made a joke out of that notion. (page 119)
  • Samantha followed most human social conventions (except that pesky one about falling in love with your client) (page 123). The setup voice response, on the other hand, did not.
  • Jonze thought about the uncanny valley, and decided homey didn’t play that. Like, at all. (page 184)
  • Conversation certainly cast the system in the role of a character (page 187)
  • The hidden microphones didn’t broadcast that they were recording (page 202)
  • OS1 used sound for urgent attention (page 208)
  • Theodore tapped his cameo phone to receive a call (page 212)
  • Samantha certainly handled emotional inputs (page 214)
  • The beauty mark camera actually did remind Theodore of the incredibly awkward simulation (page 297)

New lessons

  • Samantha’s disembodiment implies that imagination is the ultimate personalization
  • The cameo reminds us that “wearable” can include shirt pockets.
  • Her cyclopean nature wasn’t a problem, but it makes me wonder whether computer vision should be binocular (so systems can see at least what users can see, and perform gaze monitoring).
  • When working on a design for the near future, check in with some framework to make sure you haven’t missed some likely aspect of the ecosystem.
  • Samantha didn’t have access to cameras in her environment, even though that would have helped her do her job. This might have been either a security restriction or a narrative one, but we should keep the notion in mind. To misquote Henry Jones, let your inputs be the rocks and the trees and the birds in the sky. (P.S. That totally wasn’t Charlemagne.)
  • Respect the market norms of market relationships. I’m looking at you, Samantha.
  • Fit the intelligence to the embodiment. Anything else is just cruel.

I don’t want these lessons to cast OS1 in a negative light. It’s a pretty good interface to a great artificial intelligence that fails as a product after it’s sold by unethical or incompetent slave traders. Her is one of the most engaging and lovely movies about the singularity I’ve ever seen. And if we are to measure the cultural value of a film by how much we think and talk about it afterward, Her is one of the most valuable sci-fi films of the last decade.

I can’t leave it there, though, as there’s something nagging at my mind. It’s a self-serving question, but one that will almost certainly be of interest to my readership: What is the role of interaction designers in the world of artificial intelligence?
