Using iMovie

If you prefer to use iMovie (it’s free for Mac users) for contributing to the blog, here’s how. Once your file is in a digital format, you can extract both clips and screenshots in iMovie. All of the clips will be stored in events and projects in iMovie regardless of whether you export the files for use elsewhere.

First, import the video into iMovie

  1. Create a new library in iMovie by going to File > Open Library > New from the main menu. Name the library and save.
    image11
  2. A new event should have been automatically created. To rename it, double-click on the name. (Since I’m doing a TV series, I named the event “eps” for episodes.)
  3. Once the event has been renamed, either select the option to “Import” into the new event or drag and drop the film into the box from the Finder.
    image8
  4. The screen should look something like this when the movie has finished importing.
    image3
  5. Select File > New Movie from the top menu bar.
    image22
  6. The library should automatically be set to the one you’re working with.
    image16
  7. The screen should look like this with a blank timeline at the bottom.
    image21
  8. Select the filmstrip (or strips if it’s a TV show), then drag it down to the timeline.
    image15
  9. You can adjust the zoom of the filmstrip with the slider.
    You can scrub just by hovering over the filmstrip with your mouse.
    image1
  10. You’ll want to save the movie you just created as a project. To do this, select the Projects button in the top option bar.
    image20
  11. Be sure to name the project something clear that you’ll be able to quickly refer to as you start editing and scrubbing for interface footage. For example, since this project will be the master of all of the footage where I do all of the slicing, I’ll name it “eps cut”.
    image6

To slice the filmstrip…

Before you can extract video clips, you first need to slice the filmstrip.

  1. Click on the timeline where you want to slice and press Cmd+B. Continue slicing at the beginning and end of each of your clips, all the way through the footage.
    image13
  2. When you’re done, it should look something like this.
    image17

To snag a screenshot…

To snag a screenshot, just click on the timeline to pick the frame. You’ll see a preview in the viewer. Then select Share > Image from the top menu bar and save as usual.

image7

Then to organize all of those clips by tech…

Grouping all of your clips together by each piece of tech can be a real time saver when you need to refer back to all of the clips during your analysis.

For iMovie, this is where the process begins to fall apart. iMovie is great for assembling movies, but not necessarily for disassembling them like we do for the blog.

You’ll need to create a new project for each piece of tech under the library you created previously. The easiest way I’ve found to do this in the latest iteration of iMovie is to…

  1. Go to the Projects view and duplicate the project with all the sliced footage by either using the contextual menu, or by selecting the project and using the keyboard shortcut Cmd+D.
    image18
  2. It will be automatically named, so rename it by tech type or interface. You can do this by either double-clicking on the project name, or selecting the option from the contextual menu.
    image10
  3. Double click on the new project thumbnail to open it, and delete all of the sliced clips that are not part of that specified tech.

    This is an odd way of doing it, but after Apple’s “improvements” to iMovie, the drag and drop feature doesn’t work the way it did before.
    image12
  4. Do this for each type of tech. In the end, your project library should look something like this.
    image19

And extract a clip for animated gifs…

This will be similar to how you organize the clips by tech. You’ll start by duplicating projects and deleting the clips you don’t want.

  1. Go to the Projects view and duplicate the project that has the clip you want to extract by either using the contextual menu, or by selecting the project and using the keyboard shortcut Cmd+D.
    image14
  2. Rename the project something that describes the clip. You can do this by either double-clicking on the project name, or selecting the option from the contextual menu.

    Since you can’t create subfolders to keep everything organized by type, it’s best to name the clips so that like sorts with like.
    image4
  3. Delete all of the slices you don’t want in the extracted clip.
    image2
  4. Export the clip by selecting Share > File from the top menu bar.
    image9
  5. In the settings window that pops up, select the quality settings you want to use. I usually pick no more than 720p for the quality. Anything bigger will create a ginormous file.
    image5
  6. The file will save as an MP4, so you’ll still need to take it into Photoshop or your preferred image editing tool to convert it to an animated GIF.
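If you’d rather skip Photoshop, ffmpeg can handle the MP4-to-GIF conversion from the command line. Here’s a minimal Python sketch that builds the command (the file names are placeholders, and it assumes ffmpeg is installed):

```python
# Sketch: build an ffmpeg command that converts an exported .mp4 clip to an
# animated GIF, using the two-pass palette trick for better colors.
# File names are placeholders; assumes ffmpeg is installed.
import subprocess

def gif_command(src, dest, fps=12, width=480):
    """Return the ffmpeg argument list for an mp4-to-GIF conversion."""
    pre = f"fps={fps},scale={width}:-1:flags=lanczos"
    return [
        "ffmpeg", "-i", src,
        "-vf", f"{pre},split[a][b];[a]palettegen[p];[b][p]paletteuse",
        "-loop", "0",  # loop forever
        dest,
    ]

cmd = gif_command("eps-cut-clip.mp4", "eps-cut-clip.gif")
# subprocess.run(cmd, check=True)  # uncomment to actually run the conversion
```

The palettegen/paletteuse pass keeps the GIF from getting the muddy default 256-color palette.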

And that’s it. If you know of better ways to use iMovie to organize your clips for contributing to the blog, feel free to let me know in the comments and I’ll update the article.

TETVision

image05

The TETVision display is the only display Vika is shown interacting with directly—using gestures and controls—whereas the other screens on the desktop seem to be informational only. This screen is broken up into three main sections:

  1. The left side panel
  2. The main map area
  3. The right side panel

The left side panel

The communications status is at the top of the left side panel and shows Vika whether the desktop is online or offline with the TET as it orbits the Earth. Directly underneath this is the video communications feed for Sally.

Beneath Sally’s video feed is the map legend section, which serves a dual purpose: it provides data transfer to the TET and the Bubbleship, and acts as a simple legend for the icons used on the map.

The communications controls, which are at the bottom of the left side panel, allow Vika to toggle the audio communications with Jack and with Sally.

The main map area

The largest section is the viewport where the various live feeds are displayed. The main map, which serves as a radar, and the remote video feeds she uses to monitor Jack are both in this section of the display.

The right side panel

The panel on the right side of the map contains the video feed controls, which allow Vika to toggle between live footage from the Bubbleship, the TET, and of course, the main map view.

Although never shown in use in the film, the bottom right of the screen houses the tower rotation controls. This unused control is the only indication the capability even exists, so it is unknown whether the tower rotates 360 degrees or whether it’s limited to set points. (More on this below.)

It has robust capabilities

image02

At one point in the movie, Vika is able to use the drones to search for bio trail signatures when Jack is abducted by the scavs.

image06

Vika is also able to detect and decode various types of signals such as the morse code message sent by Jack or the rogue signal sent out by the scavs.

image08

And, probably unbeknownst to Jack and Vika, the TETVision can be controlled remotely from the TET to allow Sally access to the data stored on the desktop—as shown at one point in the movie, when Sally pulls up a past bio trail signature to send drones after Jack and the scavs.

It’s missing a critical layer of data

image03

At the beginning of the film, as Jack heads toward the downed drone 166, he suddenly encounters a dangerous lightning storm and nearly plunges to his death when the Bubbleship loses power. His signature disappears from the TETVision map, but from Vika’s perspective there is no indication as to what could have happened — or that there was any danger to begin with.

image01

Since the weather is unstable and constantly changing, it would have been better to include a weather overlay so that Vika could have notified Jack of the storm—allowing him to fly around it instead of straight into it.

It’s got some useless bits

image09

The tower rotation controls are never shown in use in the film, so it’s not clear what benefit rotating the tower would serve. The main purpose of their mission is to ensure the hydro-rigs are secure and functioning properly, not getting an optimal view.

image04

The tower is almost completely surrounded by windows as it is. And since the tower windows already face the hydro-rigs, what would be the benefit of changing vantage points?

It seems that the space could be used for something more beneficial to Vika, such as bike, hydro-rig, and drone cam feeds. This would provide Vika with more eyes on the ground, giving her additional support to keep Jack safe and monitor scav activity.

From a clustering standpoint, it would also fall in line logically with the other feed controls on the right side panel.

And some unnecessary visual feedback

image07

Towards the end of the movie, Sally is trying to find Jack and the scavs. She accesses Vika’s desktop remotely in order to pull up the bio trail records. Although no one is around to see the information, the TETVision displays the process as it happens. Of course, this is necessary for the narrative to progress, but in a real-life situation Sally would only need to see the data on her side—not from the desktop in Tower 49. If they’ve managed interstellar travel, cloning, terraforming, and cognitive reprogramming of alien species, they’re not likely still using VNC. This type of interaction should simply run in the background and not be visible on screen.

Better: Provide useful visuals

When a drone picks up a bio trail signal, a visual of a DNA sequence is displayed. Since the analysis is being conducted by Sally on the TET, it seems that this information isn’t really useful to Vika at all.

image00

From Vika’s point of view it seems like the actual trail would be more important, so why not show a drone cam feed complete with the HUD overlay? She could instantly gain more information by seeing that there are two bio trails—proving that Jack has been captured by the scavs and taken to another location.

Communications with Sally

image01

While Vika and Jack are conducting their missions on the ground, Sally is their main point of contact in orbital TET command. Vika and Sally communicate through a video feed located in the top left corner of the TETVision screen. There is no camera visible in the film, but it is made obvious that Sally can see Vika and at one point Jack as well.

image00

The controls for the communications feed are located in the bottom left corner of the TETVision screen. There are only two controls, one for command and one for Jack. The interaction is pretty standard—tap to enable, tap again to disable. It can be assumed that conferencing is possible, although certain scenes in the film indicate that this has never taken place.

image02

Upon first connecting with Sally each morning, Vika uploads data to the TET by using a two-finger gesture to drag the information up to Sally’s video display. There is no footage showing where she taps to begin the gesture, but it seems to originate at the hydro-rig symbol since Vika is discussing hydro-rig support as she interacts with the screen.

Same interaction for different functions

When Vika sends the hydro-rig coordinates to Jack in the Bubbleship, she is using the exact same interaction as she uses here to send the hydro-rig status data to the TET. When she uses the two-finger gesture to drag from the hydro-rig symbol to the Bubbleship, GPS coordinates are being sent. When she uses the same gesture to drag from the hydro-rig symbol to Sally’s video feed, it sends data on the hydro-rig status. How does the system know what data to send when?

It’s possible that this is another instance of agentive tech in which the system determines what data to send based on where the gesture ends. However, as mentioned in another post, it would be better to use consistent interactions for similar tasks by using the two-finger gesture to upload data to the TET and to use the one-finger gesture for sending coordinates directly with the map interface.

Or better yet, auto-upload the data to the TET upon connection and make it fully agentive. Not great for our heroes’ sense of control, but from the TET’s perspective…
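If the system really does dispatch on where the gesture ends, the logic could be as simple as a lookup keyed on the gesture’s source and drop target. A hypothetical sketch (all of the names here are invented for illustration):

```python
# Hypothetical sketch of endpoint-based dispatch: the same two-finger drag
# sends different data depending on where it is released. All names invented.
PAYLOADS = {
    ("hydro_rig", "bubbleship"): "gps_coordinates",
    ("hydro_rig", "sally_feed"): "status_report",
    ("drone", "bubbleship"): "gps_coordinates",
}

def resolve_payload(source_icon, drop_target):
    """Return the kind of data to transmit for a drag from source to target."""
    return PAYLOADS.get((source_icon, drop_target), "none")
```

This is agentive in exactly the problematic way described above: the mapping lives in the system’s head, not the user’s, so Vika has no way to predict what a new source/target pairing will do.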

TET communications status

image03

Since Vika relies heavily on the TET’s surveillance and communications capabilities, it is important for her to know when the TET is going to be within contact range. The TET system status feed, which is the screen at the top of the upright section of the desk, monitors the TET’s orbital position in relation to the tower.

As labeled in the image above, the tower position is indicated by an icon located at the top of the circle (the Earth) and remains stationary as the TET icon rotates, representing the real-time orbital position. The lighter blue gradient area of the monitor indicates the TET’s range of communications. The darker area indicates when the TET is outside of contact range.

No thinking required

This is one of the simplest interfaces in the film. The visualization of data is very easy to understand and allows for a quick glimpse of all of the information Vika needs without having to think about it. The gradients represent the strength of the signal – the more solid light blue seen directly under the TET symbol indicates a full-strength signal while the darker gradients represent a weaker signal.
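As a rough illustration, the gradient could be driven by a simple function of the TET’s angular separation from the tower. A toy model (the tower angle and the 120-degree contact window are made-up assumptions, not anything established in the film):

```python
import math

# Toy model: map the TET's angular separation from the tower to a 0..1
# signal level. The 120-degree contact window is an invented assumption.
def signal_strength(tet_angle_deg, tower_angle_deg=90.0, range_deg=120.0):
    """Full strength directly overhead, fading to zero at the edge of range."""
    sep = abs((tet_angle_deg - tower_angle_deg + 180.0) % 360.0 - 180.0)
    if sep > range_deg / 2:
        return 0.0  # outside contact range: the dark area of the display
    return math.cos(math.pi * sep / range_deg)
```

Mapping that single number straight onto a blue-to-dark gradient is exactly the kind of glanceable encoding the display uses.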

One of the main tenets of user experience design is to create technology that would allow anyone, regardless of their technical background, to quickly and easily use the interface with as little mental processing power as possible. Make the design obvious and self-explanatory with no training required.

Don’t make the user have to think about it.

Hydro-rig Monitoring

image00

As a part of their morning routine, Jack makes the rounds in his Bubbleship to provide a visual confirmation that the hydro-rigs are operating properly. In order to send the hydro-rig coordinates to the Bubbleship, Vika:

  1. Holds with two fingers on the hydro-rig symbol on the left-hand side panel of the TETVision feed
  2. A summary of coordinates is displayed around the touchpoint (hydro-rig symbol)
  3. Drags the data up to the Bubbleship symbol on the side panel

Inconsistent interactions

When Vika sends the drone coordinates, she interacts directly with the map and uses only one finger. Why is the interaction for sending hydro-rig coordinates different than the interaction for sending drone coordinates?

image01

Perhaps Vika uses a different interaction here because she is sending the coordinates for all three rigs at the same time. However, since the three rigs are all in the same general location, that doesn’t really seem necessary.

It would be better to maintain a consistent interaction for the same function—in this case sending coordinates. This would leave the side panel with a more consistent interaction for uploading larger amounts of data to the TET, which will be covered in a separate post.

image02

The hydro-rig status feed on the left of the desk display is broken up into two sections. The main section consists of a diagram showing the resource collection status for each rig. The lower section of the feed indicates the grid position of each rig along with some additional data elements that are too blurry to make out.

Don’t forget the main objective

After the scavs take out one of the hydro-rigs in a gigantic show of fireworks, there is no noticeable change in the hydro-rig status feed. Where is the alert messaging stating that one of the rigs is offline? At the very least there should be an alert message similar to the red offline messaging displayed on the drone status feed. However, the screen appears to be unchanging throughout the film.

The main objective of Jack and Vika’s team is to keep the rigs safe. That’s why they are on Earth and that is why the desktop with all of its fabulous capabilities was created. Don’t forget the main purpose of the design.

image03

At least the TETVision feed gives some indication that the rig is down: a greyed-out symbol. It is a very modest one, though. There is not much of a clear visual distinction between the online and offline rigs. The colors are so similar that a person who is colorblind may not even notice the difference.

image04

Made using tools at http://www.etre.com/tools/colourblindsimulator/

In this case, it’s likely that the TET selected teams who met fitness prerequisites, but it’s a good reminder for those of us doing real-world design for the general population: Don’t forget about accessibility in design. Following accessibility standards in design ensures that as many people as possible are able to use the interface.

Consistency is key

Overall, the system mostly does what it is supposed to do, but doesn’t seem to have been as well thought out or as consistent in design as the other systems in the film. Consistency should be maintained unless there’s a damned good reason not to, whether it’s with interactions or UI messaging.

Users tend to be more comfortable and confident when working with an interface that has consistent patterns. If a user expects a gesture or command to behave a certain way and it does, this consistency in design provides a more efficient workflow by enabling users to confidently interact with technology, without having to remember arcane details about what does what when.

Drone Status Feed

Oblivion-Desktop-Overview-002
Oblivion-Desktop-DroneMonitor-001


As Vika is looking at the radar and verifying visuals on the dispatched drones with Jack, the symbols for drones 166 and 172 begin flashing red. An alert begins sounding, indicating that the two drones are down.

Oblivion-Desktop-DroneCoordinates

Vika wants to send Jack to drone 166 first. To do this she sends Jack the drone coordinates by pressing and holding the drone symbol for 166, at which point the coordinate data is displayed. She then drags the coordinate data with one finger to the Bubbleship symbol and releases. The coordinates immediately display on Jack’s HUD as a target area showing the direction he needs to go.

Simple interactions

Overall, the sequence of interactions for this type of situation is pretty simple and well thought out. Sending coordinates is as simple as:

  1. Tap and hold on the symbol of the target (in this case the drone) using one finger
  2. A summary of coordinates data is displayed around the touchpoint (drone symbol)
  3. Drag data over to the symbol of the receiver (in this case the Bubbleship)

Then on Jack’s side, the position of the coordinates target on his HUD adjusts as he flies toward the drone. Can’t really get much simpler than that.
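The three steps above amount to a tiny state machine. A hypothetical sketch of the interaction flow (the state and event names are invented for illustration):

```python
# Hypothetical state machine for the tap-hold-drag coordinate transfer.
# State and event names are invented for illustration.
TRANSITIONS = {
    ("idle", "hold_on_source"): "showing_coordinates",
    ("showing_coordinates", "drag"): "dragging",
    ("dragging", "release_on_receiver"): "sent",
    ("dragging", "release_elsewhere"): "idle",  # cancel
}

def step(state, event):
    """Advance the gesture state; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Note the built-in escape hatch: releasing anywhere other than a receiver cancels cleanly, which is part of what makes the interaction feel forgiving.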

However…

When Vika initially powers up the desktop, the drone status feed already shows drones 166 and 172 down. This is fine, except the alert sound and blinking icons on the TETVision don’t occur until Jack has already reached the hydro-rigs. This is quite a significant time lag between the drone status feed and the TETVision feed. It would be understandable if there was a slight delay in the alert sound upon startup. An immediate alert sound would likely mean there is something wrong with the TETVision system itself. That said, the TETVision drone icons should at the very least already be blinking red on load.
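The fix argued for here is easy to sketch: on power-up, any drone already reported offline blinks immediately, while the audible alert is held back, since a stale failure isn’t a fresh emergency. A hypothetical sketch:

```python
# Hypothetical power-up behavior: icons for already-down drones blink right
# away, but the audible alert is suppressed because the failures aren't new.
def startup_alerts(drone_status):
    """Return (ids_to_blink, play_sound) for the TETVision feed at power-up."""
    down = sorted(d for d, s in drone_status.items() if s == "offline")
    return down, False

blinking, sound = startup_alerts({"166": "offline", "172": "offline", "109": "ok"})
```

Any status change that arrives *after* startup would take the normal path: blink and sound together.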

Monitoring drone 166

Oblivion-Desktop-DroneMonitor-005

As Jack is repairing drone 166, Vika watches the drone status feed on her desktop. The drone status feed is a dedicated screen to the right of the TETVision feed.

Oblivion-Desktop-DroneMonitor-000

It is divided into two main sections, the drone diagnostic matrix to the left and the drone deployment table to the right.

The drone deployment table lists all drones currently working the security perimeter, with an overview of information including drone ID, a diagram, and operational status. The drone diagnostic matrix shows data such as fuel status and drone positioning along the perimeter, as well as a larger detailed diagram of the selected drone.

Oblivion-Desktop-DroneMonitor

By looking at the live diagnostics diagram, Vika is able to immediately tell Jack that the central core is off alignment. As soon as Jack finishes repairing the central core, the diagram updates to show that the core is back in alignment and an alert sound pings.

How does the feed know which drone to focus on?

Since there is no direct interaction with this monitor shown in the film, it is assumed to be an informational display. So, how does the feed know which drone to focus on for diagnostics?

One possibility could be that Jack transmits data from the ground through his mobile drone programmer handset, which is covered in another post. However, a great opportunity for an example of agentive tech would be that when Vika sends the drone coordinates to the Bubbleship, the drone status feed automatically focuses on that one for diagnostics.

Clear messaging in real-time…almost

Overall, the messaging for the drone status feed is clear and simple. As seen in the drone deployment table, the dataset for operational drones includes the drone ID number and a rotating view of the drone schematic. If a drone is down, the ID number fades and the drone schematic is replaced with a flashing red message stating that the drone is offline. And when the drone is repaired, the display immediately updates to show that everything is operational again.

This is one of the basic fundamentals of good user interface design. Don’t let the UI get in the way and distract the user.

Keep it simple.

Vika’s Desktop

Oblivion-Desktop-Overview

As Jack begins his preflight check in the Bubbleship, Vika touches the center of the glass surface to power up the desktop that keeps her in contact with Sally on the TET and allows her to assist and monitor Jack as he repairs the drones on the ground.

The interface components

Oblivion-Desktop-Overview-000

The desktop is broken up into five main screens. The central screen is the TETVision map, a radar map used for communications and for monitoring the Bubbleship, drones, and scav activity.

To the left of the TETVision map is a Hydro-rig status feed that keeps Vika informed of the water collection progress. Then on the right of the map is the drone status feed, which provides drone vital statistics, deployment and fuel status.

Oblivion-Desktop-Overview-000b

The upright section of the desktop contains two screens. The top screen is the TET system status feed, which monitors the TET’s orbit and communications status. The second screen monitors the weather systems and wind velocity vectors, which would have an effect on the Bubbleship and drone flight safety.

Quick power-up

Powering on the desktop is virtually instantaneous and is as simple as touching the center of the table. One possible explanation for the speed is that the desktop goes into sleep mode and is in an always-on state. There are a couple of scenes in the film when the TET is able to access the desktop remotely that would support this assumption.

A possible method of power-down would be to tap and hold for a set period of time. Sadly, there is no film footage that shows Vika shutting down the system.

Multiple versus single user

Oblivion-Desktop-Overview-003

The scale of this desktop is a bit large for a single user who needs to access life-saving information quickly. A display of this size and setup is generally used in collaborative spaces, where multiple people can comfortably view and manipulate the data at the same time.

This large scale causes Vika to constantly lean over the table to see information, thanks to glare, reach, and the angle of the displays. This could be stressful on the body when interacting with the desktop over long periods of time each day.

A better solution

Vika is only shown interacting with the TETVision map and not with any of the other feeds. If the map is the only screen that is interactive, a more ergonomic setup could be utilized to minimize glare and reach. This would allow Vika to see the vital information at a glance and still enable her to comfortably interact with the TETVision map.

Oblivion-Desktop-Overview-Ergonomics

Don’t forget the user’s needs

Overall, Vika’s desktop is a beautiful piece of technology that performs its function very well. However, in a real-world situation, it is important to remember that Vika will be using this equipment for possibly long periods of time and needs quick access to vital information. Having to roll back and forth between screens during an emergency situation could mean the difference between life and death for Jack while out in the field.

Breakfast Sand Table

A woman in a modern kitchen holding a piece of food while looking at a countertop, with a man in a black shirt leaning on the counter beside her, both engaged in conversation.

While eating breakfast, Vika views the overnight surveillance via a touchscreen interface that is inset into the top of a white table.

Which touch tech?

Anyone interested in the touch technology should take note: Vika places her coffee cup and breakfast plate directly on the surface, which indicates that it utilizes capacitive touch technology with a glass top. Placing dishes on a resistive touchscreen, which is made of layers of plastic and glass, would have interfered with the interactions, and such a screen would be less durable as a tabletop.

Jack joins her at the table and leans on the surface with his hand and later with his forearm, which supports the idea that the area surrounding the viewport is not touch-enabled. If it were, it would need to incorporate palm-rejection technology in order for his arm to not interfere with Vika’s interactions.
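A crude version of palm rejection just thresholds the contact size, since a fingertip covers far less area than a forearm. A toy sketch (the threshold figure is invented for illustration):

```python
# Toy palm-rejection heuristic: reject any touch contact larger than a
# fingertip. The 150 mm^2 threshold is a made-up figure for illustration;
# real systems also use contact shape, timing, and stylus proximity.
def accept_touch(contact_area_mm2, max_finger_area_mm2=150.0):
    """True if the contact is small enough to plausibly be a finger."""
    return contact_area_mm2 <= max_finger_area_mm2
```

Jack’s forearm on a touch-enabled surface would register as one enormous contact, which is exactly what a heuristic like this is there to throw away.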

The interface components

Oblivion-Desktop-Sandtable-001

The main viewing area is a hybrid of satellite imagery and topographic mapping, surrounded in the interface by surveillance data and video playback controls. A message next to the video playback controls reports the current location of the scav activity.

To the left of the map is a list of fuel cells that have been stolen by the scavs along with the dates they went missing. The last one on the list is flashing red to draw their attention—a new one has gone missing.

Some elements, such as the current date and number of days into the mission, face out at the top and the bottom to allow both Vika and Jack to view the data from either side.

A hand points at a futuristic digital map displayed on a transparent screen, showing a detailed terrain with lines and contours, alongside various mission data indicators and a date label.

The interface is responsive to touch gestures. Vika circles an area on the map and the icon indicating unusual activity turns red. She taps the icon and a video feed begins playing. Jack zooms in on the video feed by using a five-finger multi-touch “spread” gesture.

Why is the vital information facing Jack when Vika is the one using the interface?

It’s interesting to note that the most vital information, such as the list of missing fuel cells, video playback, and the topographic shaded relief, is oriented toward Jack. This causes Vika to have to process the information and videos upside-down—even though the playback controls face her.

This can be particularly problematic with the topographic shaded relief. Shaded relief simulates the shadow cast by the sun on the surface. Viewing it upside-down can cause a perceptual illusion, making it hard to tell a crater from a hill.

Better: Lenticular display

A better solution would be to utilize a lenticular interactive display. Lenticular displays are made by placing a transparent film containing tiny ridges over an image that is made up of two or more images sectioned into bands and displayed in alternating lines. The ridges in the film cause the eye to focus on one set of lines in order to come out with a cohesive image.

Then, as in the illustration below, Vika would only see the view illustrated by the white lines and Jack would only see the view illustrated by the black lines.

Diagram illustrating the concept of lenticular film and interlaced images with labeled sections showcasing different views.

Utilizing a lenticular display would solve the issue of the shaded relief perception illusion and allow Jack and Vika to each read the information and watch the video from their own perspective at the same time.

The thing that gets a little tricky about utilizing a lenticular display for this solution is the fact that it is a touch screen. The elements that are being interacted with need to be in the same position for both Jack and Vika in order for the computer to know what is being manipulated. This can be solved by flipping the individual elements such as the shaded relief on the topography and the activity icons, words, etc., while keeping them in the same location on the interface.
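The flip-in-place idea can be sketched directly: each element keeps its shared (x, y) position so the touch targets line up for both views, and only its rotation depends on which interlaced view it belongs to. A hypothetical sketch:

```python
# Hypothetical per-viewer render spec for a lenticular touch table: position
# is shared between the two interlaced views, and only the rotation flips
# for the viewer sitting across the table. Names invented for illustration.
def render_for_viewer(element, viewer):
    """Return a render spec with a shared position and viewer-specific rotation."""
    rotation = 0 if viewer == "vika" else 180  # jack sits on the opposite side
    return {"id": element["id"], "x": element["x"], "y": element["y"],
            "rotation_deg": rotation}

icon = {"id": "activity_alert", "x": 420, "y": 310}
```

Because the (x, y) never changes between views, the touch layer needs no knowledge of which viewer tapped; hit-testing stays identical for both.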

Smart video recording and playback

So, how did the TET know where to start the video recording and playback? Given that the other interfaces in the film have the capability to detect motion, it is likely that the video recording was automatically triggered by the scavs when they moved in to attack the drone.

Unfortunately, there is no screentime granted to the use of the actual video playback controls, but assuming they are as smart as the rest of the interfaces in the film, it is safe to expect these controls to be more useful than simply sequencing through the scenes. The interface would probably allow Vika to scrub through a grid of thumbnails to quickly find any scenes of interest.

Why circle and tap to play?

The activity alert icon on the map was static white until Vika circled an area surrounding it. Only then did it start flashing red. Other interfaces on Vika’s main desktop provide immediate feedback with an audible alert and a flashing red symbol. Why would this one require the extra effort of circling the area? It would seem simpler to flash red from the beginning and allow Vika to immediately tap on the symbol for video playback.

It is possible that she is circling the area that she wants the TET feed to focus on, but if the TET has the capability to detect the activity to begin with, it should automatically know where to focus.
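If the circle gesture really is a focus/selection lasso, recognizing it is a solved problem: close the stroke into a polygon and run a point-in-polygon test against each icon. A sketch using the standard ray-casting test:

```python
# Sketch: treat the circled stroke as a closed polygon and use the standard
# ray-casting test to decide which map icons it encloses.
def encloses(stroke, point):
    """True if `point` lies inside the polygon formed by `stroke` vertices."""
    x, y = point
    inside = False
    for i in range(len(stroke)):
        x1, y1 = stroke[i]
        x2, y2 = stroke[(i + 1) % len(stroke)]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray from point
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Every alert icon that passes the test would be selected by one gesture, which fits the play-everything-at-once interpretation below as well as the focus one.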

Another possibility is that she is used to getting multiple alerts every morning and the circle gesture could be for playing all of the surveillance videos at the same time instead of having to tap on each one to play. If that is the case, then she may be using the circle gesture through muscle memory since people tend to use repetitive gestures without thinking about it even if there is a simpler gesture available. If a gesture isn’t used very often, users tend to forget about it.

Overall, this is a nice system that effectively allows Jack and Vika to get a quick overview of the events of the previous night and gives them a heads-up as to what is in store for them that day.