Remote wingman via EYE-LINK

EYE-LINK is an interface connecting a person at a desktop, using support tools, with another person live “in the field” using Zed-Eyes. The working relationship between the two is much like Vika and Jack in Oblivion, or the A.I. in Sight.

In this scene, we see EYE-LINK used by a pick-up artist, Matt, who acts as a remote “wingman” for pick-up student Harry. Matt has a group video chat interface open with paying customers eager to lurk, comment, and learn from the master.

Harry’s interface

Harry wears a hidden camera and microphone. This appears to be the only tech he has on him: he hears only his wingman’s voice, and can communicate back only by talking generally, talking about something he’s looking at, or using pre-arranged signals.

image1.gif
Tap your beer twice if this is more than a little creepy.

Matt’s interface

Matt has a three-screen setup:

  1. A big screen (similar to the Samsung Series 9 displays) which shows a live video image of Harry’s view.
  2. A smaller transparent information panel for automated analysis, research, and advice.
  3. An extra, laptop-like screen where Matt leads a group video chat with a paying audience, who are watching and snarkily commenting on the wingman scenario. It seems likely that this is not an official part of the EYE-LINK software.
image55.png
image47.png
image28.png
Please make a note of the hilarious and damning screen names of the peanut gallery: Pie Ape, Popkorn, El Nino, Nixon, Fappucino [sic], Stingray, I_AM_WALDO, and Wigwam.

Harry communicates to Matt by speaking or by enacting a crude sign language for the video camera. Matt communicates back to Harry through an audio link to a headset. Setting up the connection is similar to Skype/Hangouts (even featuring an icon of an archaic laptop). Every first-person EYE-LINK view is characterized by a pixelated gradient at the sides of the screen.

Matt’s wingman support tools

We see that Matt has a number of tools to help him act as a remote wingman for Harry, evident through six main navigation items on his side screen: a home icon, Web, News, Image, Video, and Social Media. The home icon is always bright white, but the section he’s currently viewing is a bolded gray.

In the Image mode, the system runs face recognition on a still from Harry’s video feed and provides its best match for further research.

image20.png

Somehow he can also get information on the event that Harry is attending. In this view, there’s a floor plan of the venue, which Matt can use to instruct Harry.

image11.png

OK. This is of course a creepy use of this interface, but it’s easy to imagine scenarios where something like the EYE-LINK is used virtuously:

  • A nurse practitioner needing to call on the expertise of a remote, more senior caregiver.
  • An airplane maintenance worker needing to speak to the aircraft engineers about a problem she’s encountering.
  • Paintball players coordinating their game through a centralized team captain.

So with that in mind, let’s review this with the caveat that of course the specific wingman scenario is super creepy.

Analysis: Harry’s feedback

The communication channel back from Harry to Matt doesn’t need to be very rich for these purposes, but there are ways it could be richer. Of course Harry could pick up his phone and simply type something for Matt to see. But if the communication needed to be undetectable to a casual observer, there are other options. Subvocalization detection is nascent, but a real possibility, and mostly natural for the speaker.

78105main_ACD04-0024-001.jpg
Image courtesy of the NASA Ames Research Center demo of subvocalization.

If the remote user has time for training, subgestural detection might be another option. This is like subvocal detection, but instead of reading the throat movements used in speech, an armband (like the Myo) would detect gentle finger presses, giving the user chorded-keyboard input that he could use while, say, gripping the beer bottle.

tw_hand.png

Either way, richer “undetectable” communication mechanisms exist, and could be incorporated.
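To give a sense of how little machinery such a scheme needs, chorded input is at heart just a lookup from combinations of detected finger presses to pre-arranged signals. Here’s a minimal Python sketch; the finger names and the chord table are invented for illustration, not from any real armband API.

```python
# Toy sketch of chorded-input decoding. The finger names and chord
# meanings below are hypothetical, chosen to fit the wingman scenario.
CHORDS = {
    frozenset({"index"}): "y",             # e.g., "yes, proceed"
    frozenset({"middle"}): "n",            # "no"
    frozenset({"index", "middle"}): "h",   # "help me out here"
    frozenset({"index", "ring"}): "b",     # "bail out"
}

def decode_chord(pressed_fingers):
    """Map a set of detected finger presses to a signal, or None if unrecognized."""
    return CHORDS.get(frozenset(pressed_fingers))
```

Even with only four fingers, single presses and two-finger chords yield ten distinct signals, plenty for a covert vocabulary of this size.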

Analysis: Graphics

One of the refreshing things about the interfaces in Black Mirror generally—and these screens in particular—is how understated they are, especially compared to the Rococo interfaces that populate much of sci-fi. (Compare the two below.)

The color palette is spartan grayscale. The typeface is Helvetica (or adjacent). Nothing 3D, nothing swoopy, no complexity for complexity’s sake.

Analysis: Navigation and layout

The navigation for the information panel is a little confusing. Sure, it looks like lots of websites. But this chunking of information into separate screens requires that Matt hunt for information that’s of interest. Better would be to have a single, dynamic screen, and have the system do real-time parsing, providing suggestions and notifications in the context of the event. If he needed to dive down into some full-screen mode, let it fill the screen with some easy way to return to context.

Also, how did he get to the event view? Is that just a web view? What bar puts its floor plan on its site? There is no primary navigation element that would, at first glance, explain how he got there or, once there, how he might get back to other screens. The home icon is obscured. (Maybe this was designed by Apple, though, and has some entirely hidden swipe gesture or long press to request the event screen or force a return to home?) It’s really hard to say, and so it fails on affordance.

Analysis: Group chat

A quick look at any modern group video chat software shows that this is too pared down, with many audio and video controls missing, as well as controls for the “meeting.” It’s possible that these appear only when Matt interacts with the cursor on that laptop, but again, affordances.

Analysis: More wingman tools?

There are more tools that would be useful to a wingman’s job, which could be built even now—without the strong AI that this diegesis has. They could be more virtuous, like…

  • Ways to keep Harry calm, focused, and feeling confident.
  • Reminders of general best practices for making a good impression.
  • Automatic privacy blackout when Harry approaches people for conversation.
thegame

Or they could be…uh…more questionable. (Here I’ll confess to referencing The Game: Penetrating the Secret Society of Pickup Artists by Neil Strauss, for how a real PUA might handle it.)

  • A transcript of the conversation with key phrases highlighted, indicating the “target’s” attitudes and levels of interest.
  • Personality analysis on social media, listing derived topics that these particular “targets” would find engaging.
  • A list of Harry’s practiced “routines” for Matt to quickly review, and suggest. The AI could even highlight its best-guess suggestion.
  • Counts of “indicators of interest.”
  • An overview of Matt’s favored stages of pickup, with an indicator of where Harry is and how well he performed on the prior stages.

Either way, the support that these tools offer is pretty minimal compared to what could be done, but then again, that kind of fits the story. Yes, the creepiness of the remote wingman support tools is part of the point. But the whole reason the peanut gallery pays for the honor of watching Matt coach Harry is (yes, voyeurism, but also) to witness a master wingman at work. If the system were too much of a support, the peanut gallery would be less incentivized to pay to see him in action.

Black Mirror: White Christmas (2014)

As part of my visit to Delft University earlier this year, Ianus Keller asked his IDE Master students to do some analysis of the amazing British sci-fi interface series Black Mirror, specifically the “White Christmas” episode. While I ordinarily wait for television programs to be complete before reviewing them, Black Mirror is an anthology series, where each new show presents a new story world, or diegesis, so each episode can be reviewed on its own.

image58.gif

Overview

Matt (Jon Hamm) and Potter (Rafe Spall) are in a cabin sharing stories about their relationships with technology and their loved ones. Matt tells stories about (1) his past sideline delivering “romantic services” to “dorks” using a direct link to his clients’ eyes, and (2) his regular job of training clones of people’s personalities as assistive artificial intelligences. Potter tells the story of his relationship with his wife and alleged daughter, who blocks him through the same vision-controlling interface. In the end…

massive-spoilers_sign_color

…it turns out Matt and Potter are actually talking to each other as interrogator and artificial intelligence respectively, in order to get Potter convicted.

IMDB: https://www.imdb.com/title/tt34786243/

RSW CalArts: Rebel bombing target computer 2

I have, over the past several years, conducted a workshop at a handful of conferences, companies, and universities called Redesigning Star Wars. (Read more about that workshop on its dedicated page.) It’s one of my favorite workshops to run.

In April of 2016 I was invited to run the workshop at CalArts in Southern California for some of the interaction design students. Normally I ask attendees to illustrate their design ideas on paper, but the CalArts students went the extra mile to illustrate their ideas in video comps! So with complete apologies for being impossibly late, here are some of those videos.

Next up, a second redesign of the Rebel bombing target computer.

Redesigning Star Wars_UX London 2015_Interfaces_Page_19.png

Monique Wilmoth and Andrea Yasko redesigned the controls to keep the Rebel bomber’s hands on the controls, added voice control, and reconsidered the display. Take a look at their video, below.

 

If you’d like to discuss a workshop for your org, contact workshop@scifiinterfaces.com.

RSW CalArts: Rebel bombing target computer

I have, over the past several years, conducted a workshop at a handful of conferences, companies, and universities called Redesigning Star Wars. (Read more about that workshop on its dedicated page.) It’s one of my favorite workshops to run.

In April of 2016 I was invited to run the workshop at CalArts in Southern California for some of the interaction design students. Normally I ask attendees to illustrate their design ideas on paper, but the CalArts students went the extra mile to illustrate their ideas in video comps! So with complete apologies for being impossibly late, here are some of those videos.

Next up, a redesign of the Rebel bombing target computer.

Redesigning Star Wars_UX London 2015_Interfaces_Page_19

Abby Chang and Julianna Bach redesigned the controls to keep the Rebel bomber’s hands on the controls, and reconsidered the display. Take a look at their video, below.

If you’d like to discuss a workshop for your org, contact workshop@scifiinterfaces.com.

RSW CalArts: Luke’s binoculars

I have, over the past several years, conducted a workshop at a handful of conferences, companies, and universities called Redesigning Star Wars. (Read more about that workshop on its dedicated page.) It’s one of my favorite workshops to run.

In April of 2016 I was invited to run the workshop at CalArts in Southern California for some of the interaction design students. Normally I ask attendees to illustrate their design ideas on paper, but the CalArts students went the extra mile to illustrate their ideas in video comps! So with complete apologies for being impossibly late, here are some of those videos.

First up, a redesign of Luke’s binoculars.

Redesigning Star Wars_UX London 2015_Interfaces_Page_03.png

Yinchin Niu and Samantha Shiu redesigned the control buttons to make them more accessible to Luke and reconsidered the augmentations through the viewfinder. Take a look at their demonstration video, below.

If you’d like to discuss a workshop for your org, contact workshop@scifiinterfaces.com.

Using iMovie

If you prefer to use iMovie (it’s free for Mac users) for contributing to the blog, here’s how. Once your file is in a digital format, you can extract both clips and screenshots in iMovie. All of the clips will be stored in events and projects in iMovie regardless of whether or not you export the files for use elsewhere.

First, import the video into iMovie

  1. Create a new library in iMovie by going to File > Open Library > New from the main menu. Name the library and save.
    image11
  2. A new event should have been automatically created. To rename it, double-click on the name. (Since I’m doing a TV series, I named the event “eps” for episodes.)
  3. Once the event has been renamed, either select the option to “Import” into the new event or drag and drop the film into the box from the Finder.
    image8
  4. The screen should look something like this when the movie has finished importing.
    image3
  5. Select File > New Movie from the top menu bar.
    image22
  6. The library should automatically be set to the one you’re working with.
    image16
  7. The screen should look like this, with a blank timeline at the bottom.
    image21
  8. Select the filmstrip (or strips if it’s a TV show), then drag it down to the timeline.
    image15
  9. You can adjust the zoom of the filmstrip with the slider. You can scrub just by hovering over the filmstrip with your mouse.
    image1
  10. You’ll want to save the movie you just created as a project. To do this, select the Projects button in the top option bar.
    image20
  11. Be sure to name the project something clear that you’ll be able to quickly refer to as you start editing and scrubbing for interface footage. For example, since this project will be the master of all of the footage where I do all of the slicing, I’ll name it “eps cut”.
    image6

To slice the filmstrip…

Before you can extract video clips, you first need to slice the filmstrip.

  1. Click on the timeline where you want to slice and type Cmd+B. Continue to slice the beginning and end of each of your clips, all the way through the footage.
    image13
  2. When you’re done, it should look something like this.
    image17

To snag a screenshot…

To snag a screenshot, just click on the timeline to pick the frame. You’ll see a preview in the viewer. Then select Share > Image from the top menu bar and save as usual.

image7

Then to organize all of those clips by tech…

Grouping all of your clips together by each piece of tech can be a real time saver when you need to refer back to all of the clips during your analysis.

For iMovie, this is where the process begins to fall apart. iMovie is great for assembling movies, but not necessarily for disassembling them like we do for the blog.

You’ll need to create a new project for each piece of tech under the library you created previously. The easiest way I’ve found to do this in the latest iteration of iMovie is to…

  1. Go to the Projects view and duplicate the project with all the sliced footage by either using the contextual menu, or by selecting the project and using the keyboard shortcut Cmd+D.
    image18
  2. It will be automatically named, so rename it by tech type or interface. You can do this by either double-clicking on the project name, or selecting the option from the contextual menu.
    image10
  3. Double click on the new project thumbnail to open it, and delete all of the sliced clips that are not part of that specified tech.

    This is an odd way of doing it, but after Apple’s “improvements” to iMovie, the drag and drop feature doesn’t work the way it did before.
    image12
  4. Do this for each type of tech. In the end, your project library should look something like this.
    image19

And extract a clip for animated gifs…

This will be similar to how you organize the clips by tech. You’ll start by duplicating projects and deleting the clips you don’t want.

  1. Go to the Projects view and duplicate the project that has the clip you want to extract by either using the contextual menu, or by selecting the project and using the keyboard shortcut Cmd+D.
    image14
  2. Rename the project something that describes the clip. You can do this by either double-clicking on the project name, or selecting the option from the contextual menu.

    Since you can’t create subfolders to keep everything organized by type, it’s best to name the clips so that like stays with like.
    image4
  3. Delete all of the slices you don’t want in the extracted clip.
    image2
  4. Export the clip by selecting Share > File from the top menu bar.
    image9
  5. In the settings window that pops up, select the quality settings you want to use. I usually pick no more than 720p for the quality. Anything bigger will create a ginormous file.
    image5
  6. The file will save as an mp4, so you’ll still need to take it into Photoshop or your preferred image editing tool to convert it to an animated gif.
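(If you’re comfortable with a little scripting, the Pillow library in Python can handle that last frames-to-gif step too. A minimal sketch; the solid-color frames below are placeholders standing in for frames you’d export from your clip, and the filename is hypothetical.)

```python
from PIL import Image

# Placeholder frames: in practice these would be frames exported from the clip.
frames = [Image.new("RGB", (160, 90), (i * 40, 20, 60)) for i in range(5)]

# Save the first frame and append the rest. `duration` is milliseconds
# per frame; `loop=0` makes the gif loop forever.
frames[0].save(
    "clip.gif",
    save_all=True,
    append_images=frames[1:],
    duration=100,
    loop=0,
)
```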

And that’s it. If you know of better ways to use iMovie to organize your clips for contributing to the blog, feel free to let me know in the comments and I’ll update the article.

Report Card: Doctor Strange

Read all Doctor Strange reviews in chronological order.

Chris: I really enjoyed Doctor Strange. Sure, it’s a blockbuster squarely in the origin-story formula, but the trippiness, action, special effects, and performances made it fun. And the introduction of the new overlapping rulespace of magic makes it a great addition to the Marvel Cinematic Universe. And hey, another Infinity Stone! It’s well connected to the other films.

Scout: Doctor Strange is another delightful film that further rounds out the Marvel universe. It remained faithful (enough) to the comics that I loved growing up and the casting of Benedict Cumberbatch was spot-on perfect, much as Robert Downey Jr. was for Tony Stark. It is a joyful and at times psychedelic ride that I’m eager to take again. “The Infinity Wars” will be very interesting indeed.

But, as usual, this site is not about the movie but the interfaces, and for that we turn to the three criteria for evaluating movies here on scifiinterfaces.com.

  1. How believable are the interfaces? (To keep you immersed.)
  2. How well do the interfaces inform the narrative of the story? (To tell a good story.)
  3. How well do the interfaces equip the characters to achieve their goals? (To be a good model for real-world design.)
Report-Card-Doctor-Strange

Sci: B- (3 of 4) How believable are the interfaces?

Magic might seem a tricky question for narrative believability, since by definition it is a breaking of some set of rules. It’s a tempting laziness to patch every hole we find by proclaiming “it’s magic!” and moving on. But in most modern stories, magic does have narrative rules; what it breaks are the known laws of physics or the capabilities of known technology, while remaining consistent within the world. Oh, hey, kind of like a regular sci-fi story.

The artifacts mostly score quite well for believability. The Boots, the Staff, and the Bands are constrained in what they do, so no surprise there. Even the Cloak is a believable intelligent agent acting for Strange. Its flight-granting and its ability to pull in any spatial direction don’t quite jibe, but they don’t contradict each other; they just raise questions that aren’t answered in the movie itself.

But the Sling Rings are a train wreck in terms of usability and believability. Between that and the Eye’s missing some key variables that simply must be specified for it to do what we see it doing, the diegesis breaks, taking us out of the movie.

Fi: A (4 of 4) How well do the interfaces inform the narrative of the story?

None of these are tacked-on gee-whiz.

  • Since Strange is occupying an office (Master) that is part of a venerated, peacekeeping secret organization (the Masters of the Mystic Arts), we would expect it to have some tools in place to help both the infantry and the boss.
  • That the powerful artifacts choose their masters helps establish Strange as unique and worthy.
  • The Eye is core to the plot, and the film uses it to convey how much of a talent and rulebreaking maverick Strange is.
  • The Staff helps us see Mordo’s militancy, threat, and lawful neutral-ness.
  • The laugh-out-loud comedy of the Cloak comes from its earnestly trying to help, its constraints, and how Strange is really, really new to this job.
  • Even the dumb Sling Ring helps show Strange’s learning and confidence, and set up how Strange gets stabbed and yadda yadda yadda begins his reconciliation with Dr. Palmer.
Cloak-of-Levitation-pulling
Once more, because it was so damned funny.

All great narrative uses of the “tech” in the film.

Interfaces: C+ (2 of 4) How well do the interfaces equip the characters to achieve their goals?

The Boots do. The Cloak totally does. The “AR” surgical assistant does. (And it’s not even an artifact.) If we ever get to technologies that would enable such things, these would be fine models for real world equivalents. (With the long note about general intelligence needing language for strategic discussions with humans.)

DoctorStrange_AR_ER_assistant-05

That aside, the Sling Ring serves a damned useful purpose, but its design is a serious impediment to its utility, and all the Masters of the Mystic Arts use it. The Staff kind of helps its user, i.e. Mordo, but you have to credit it with a great deal of contextual intelligence or some super-subtle control mechanism. The Bands are so clunky that they’re only useful in the exact context in which they are used. And the Eye, with its missing controls, missing displays, and dangerously ambiguous modes, is a universe-crashing temporal crisis just waiting to happen. This is where the artifacts suffer the most. For that, it gets the biggest hit.

Final Grade: B- (9 of 12), Must-see.

Definitely see it. It’s got some obvious misses, but also a lot of inventive, interesting stuff, and some truly cutting-edge concepts. In a hat tip to Arthur C. Clarke’s famous third law, I suppose this is “sufficiently advanced technology.”

IMDB: https://www.imdb.com/title/tt1211837/