
FILMMAKING TECHNOLOGY

Academy of Motion Picture Arts and Sciences

88th Scientific and Technical Awards

Presentation Night

The Scientific and Technical Awards Committee's Presentation Night is an opportunity for each technology to be presented, as an overview, to the entire Committee for awards consideration.

Presentation: ILM Digital Dailies & Shot Management System

Presenter: Russell Darling

Time: 6min

 

Industrial Light and Magic's Automated Digital Dailies and Shot Management System is a collection of hardware and custom software used for the real-time transfer, storage, management and retrieval of images. More specifically, it provides the ability to review high-quality image and video-based media. Thanks to the system, ILM’s creative and technical staff had near-instantaneous access to the digital images rendered every day. From this system, they could view, edit, and transfer images completely “on demand” in any screening room or dailies review station, as well as at their desktop.

The system was highly extensible. A centralized server was the “brain” of the system. You could add any number of output devices (digital cinema projectors, professional broadcast monitors, computer display monitors, film printers, color grading systems) and input devices (digital files, VTRs, digital motion picture cameras, etc.). The characteristics and capabilities of each device were stored in a database, and the server would then enable those devices to be used where appropriate.
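
As a rough illustration of that device model, the sketch below shows how a capability-driven device registry might look in Python; the class and method names are hypothetical, not ILM's actual software.

```python
# Hypothetical sketch of a capability-driven device registry, loosely modeled
# on the description above; names and structure are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Device:
    name: str
    kind: str                                        # e.g. "projector", "vtr", "monitor"
    capabilities: set = field(default_factory=set)   # e.g. {"hd-1080p24", "sdi"}


class DeviceRegistry:
    """Central database of input/output devices and what each one can do."""

    def __init__(self):
        self._devices = []

    def register(self, device):
        self._devices.append(device)

    def devices_for(self, required):
        """Return every registered device that satisfies the requested capabilities."""
        return [d for d in self._devices if required <= d.capabilities]


registry = DeviceRegistry()
registry.register(Device("screening-room-1", "projector", {"hd-1080p24", "sdi", "dual-link"}))
registry.register(Device("artist-desktop", "monitor", {"hd-1080p24"}))

# The central server would route a playback request to whichever devices fit it.
print([d.name for d in registry.devices_for({"hd-1080p24", "sdi"})])  # ['screening-room-1']
```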

We believe that the ILM Digital Dailies System was the first and most advanced computer-based system to provide review of full-resolution uncompressed HD dailies, with an advanced artist-friendly user interface, and the ability to both play back and record digital media whilst fully integrating with a visual effects pipeline.

Playback Features

  • Provides high-quality uncompressed images with guaranteed frame-rate playback.

  • Variable frame-rate playback with synchronized audio scrubbing.

  • Fully adjustable timeline, including toggling handles on/off.

  • Playback of frames, loops, and segments in a non-linear fashion.

  • Ability to play back individual shots, sequences (defined by editorial EDLs or user-defined), or the entire movie (defined by an editorial EDL), conformed in real time during playback.

  • Automatic shot version selection, with the ability to automatically select the shot version that is furthest developed (e.g. a composite instead of animation, because it is the more refined version of the shot; if a composite isn't available, drop down to the TD render, then animation, and so on); see the sketch after this list.

  • Ability to compare different versions of shots, both within the same department (e.g. animation v8 vs. animation v9) and with other departments (TD render vs. composite, etc.).

  • Automatic colorspace conversion and default LUT application, along with the ability to change LUTs.

  • Supports remote dailies collaboration

  • Ability to view shots on any supported device or playback location (desktop, screening room, video kiosk station, etc.). The system did not require that shots be allocated or pre-cached to fixed viewing locations; users could request to review any shots from any viewing location.

  • Network broadcast of playback session information, including the current playing position, so that other tools (such as the desktop artist dailies notification widget) could monitor it.
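
The automatic version selection described above is essentially a fallback search over a preference order. Below is a minimal Python sketch of that idea, with an assumed preference order and data layout rather than ILM's actual logic.

```python
# Assumed preference order, most refined stage first; not ILM's actual rules.
PREFERENCE = ["composite", "td_render", "animation"]

def pick_version(available):
    """available maps a stage name to its latest version, e.g. {"animation": "v9"}.
    Return the most refined stage that has a version, falling back down the list."""
    for stage in PREFERENCE:
        if stage in available:
            return stage, available[stage]
    return None

print(pick_version({"animation": "v9", "td_render": "v4"}))  # ('td_render', 'v4')
```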

Editorial Integration

  • Automatic creation of OMFs and DNxHD AAF files from CG renders, for import into Avid Media Composer

  • Video device connectivity (VTRs, etc.) through RS-422 machine control

Digital Media Support

  • HD uncompressed video playback/recording using serial digital interface (SDI)

  • SD 480i (640x480) – 4:3 and 16:9

  • HD 1080p/24 (1920x1080), 4:2:2, 4:4:4 dual-link

  • QuickTime, Avid OMF/AAF/DNxHD

  • DPX frame-based image playback

Audio

  • Synchronized, scrubbable audio playback.

  • Hardware connectivity (ProTools, etc).

Control Connectivity

  • Ethernet network control for software client tools

  • RS-422 machine control protocol (jog/shuttle remote controllers, VTRs, etc.)

Color Management

  • 2D / 3D LUTs

  • Truelight color cube used to map the HD color gamut to the film gamut
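
For context, a 3D LUT (color cube) maps each input RGB value by trilinear interpolation between the nearest lattice entries. The sketch below illustrates that general technique only; it is not Truelight code.

```python
# Generic illustration of applying a 3D LUT with trilinear interpolation.
def apply_3d_lut(rgb, lut, size):
    """rgb: (r, g, b) in [0, 1]; lut[i][j][k] -> (r, g, b) lattice entries; size: lattice size."""
    coords = [c * (size - 1) for c in rgb]            # scale channels into lattice coordinates
    base = [min(int(c), size - 2) for c in coords]    # lower lattice corner
    frac = [c - b for c, b in zip(coords, base)]      # fractional position inside the cell
    out = []
    for channel in range(3):
        value = 0.0
        # Blend the 8 surrounding lattice points.
        for di in (0, 1):
            for dj in (0, 1):
                for dk in (0, 1):
                    weight = ((frac[0] if di else 1 - frac[0]) *
                              (frac[1] if dj else 1 - frac[1]) *
                              (frac[2] if dk else 1 - frac[2]))
                    value += weight * lut[base[0] + di][base[1] + dj][base[2] + dk][channel]
        out.append(value)
    return tuple(out)
```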

Annotation

  • Interactive on-screen pointer for remote collaborative sessions

  • Head slate with main details (does not play during looping).

  • Togglable on-screen burn-ins (providing the ability to see a pristine image when desired) of shot names, frame counters, artist name, notes, and mask overlays (lines, hard/soft)

Production Integration

  • Supported full end-to-end digital production (acquisition from digital motion picture cameras, editorial interchange, post-production, digital dailies, digital mastering, digital cinema playback)

  • Fully automated transfer of rendered frames to media/video servers

  • Review tools designed with simple/artist-friendly user interfaces (don't need to be an editor to work with system)

  • Image acquisition from the renderfarm, digital motion picture cameras, VTRs, and scanned film.

  • Scheduled job queuing support for image conversion, colorspace conversion, file transfers.

Extended Feature-set

  • Digital motion picture camera image acquisition - load images from a high-definition video source, perform image processing and conversion, and deliver the images to CG artists in an organized and highly automated manner (used to acquire the 1.2 million frames that comprised every digitally photographed background plate and element that went into the creation of Star Wars: Episode II).

  • Conforming and mastering - tracking every frame of every shot, conforming with the Avid editorial cut, interfacing with a color-grading system, and producing a digital cinema master along with frames to be filmed out for a traditional film print release (first used on Star Wars: Episode II).

User Interface Tools

  • AddShot – artist tool that assigns an auto-generated “daily number” and prepares the system to receive frames from the CG pipeline.

  • ShotVu – primary user interface for the HD video playback component of the system.

  • MediaView – primary user interface for the desktop computer dailies review component of the system.

  • Dailies Notifier - desktop tool to provide information and alerts about active dailies sessions.

  • Editorial technician tool-box (wallaby, wombat) used to control:

  • Image acquisition (HD ingest and conversion).

  • Real-time transfer to video servers, conversion, and delivery to CG file servers.

  • Conforming and mastering.

Back-end Servers

  • broker server - manages requests and dispatches commands to connected devices and other special-purpose servers (see the sketch after this list)

  • media database – manages information related to shots and their versions, editorial cut sequences, playback devices, etc

  • HD video server(s) - provides the ability to play back and record uncompressed HD digital video over SDI
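
A minimal sketch of the broker pattern described in this list, assuming a simple handler-per-device-type design; the names are illustrative, not the actual server interfaces.

```python
# Hypothetical sketch: the broker accepts requests and routes each command to
# whichever back-end handles that kind of device or special-purpose server.
class Broker:
    def __init__(self):
        self._handlers = {}          # device kind -> callable handler

    def register_handler(self, kind, handler):
        self._handlers[kind] = handler

    def dispatch(self, kind, command, **kwargs):
        handler = self._handlers.get(kind)
        if handler is None:
            raise ValueError(f"no handler registered for {kind!r}")
        return handler(command, **kwargs)


broker = Broker()
broker.register_handler("hd-video-server", lambda cmd, **kw: f"video server: {cmd} {kw}")
print(broker.dispatch("hd-video-server", "play", shot="abc_010", version="v9"))
```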

PyCon 2007, the fifth annual conference of the Python community, took place February 23-25 in Dallas, Texas. 

Talk: Python for Visual Effects and Animation Pipelines: A Case Study of Tippett Studio's JET

 

Presenter: Russell Darling (Tippett Studio)

 

Time: 30min 
 

JET is a Python-based system comprised of software tools and scripts used to implement a visual effects and animation pipeline.

 

A Visual Effects and Animation Pipeline is an "assembly line" of software used to organize, automate and facilitate the creation of computer-generated imagery.  JET is a system comprised of software tools and scripts used to implement such a pipeline in the distributed computing environment at Tippett Studio (http://www.tippett.com), an Academy Award-winning visual effects and animation studio specializing in the creation of computer-generated imagery for movies and television commercials.  The entire JET system, including artist tools and the underlying engine, is implemented in Python.

The primary JET tool is highly customizable, featuring XML-based user interface templates that can be modified to suit specific types of artists (animators, painters, technical directors, etc.) or production needs without modifying the core application software.  This means that a user will only have access to those options which are relevant to their department and production.
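
As a rough illustration of that idea, the sketch below filters XML-defined options by department; the tag and attribute names are invented for the example, not JET's actual template schema.

```python
# Illustration only: expose just the UI options relevant to a given department,
# driven by an XML template rather than hard-coded application logic.
import xml.etree.ElementTree as ET

TEMPLATE = """
<interface>
  <option name="render_layers" departments="td lighting"/>
  <option name="playblast"     departments="animation"/>
  <option name="denoise"       departments="compositing td"/>
</interface>
"""

def options_for(department):
    """Return only the UI options whose template entry lists this department."""
    root = ET.fromstring(TEMPLATE)
    return [opt.get("name")
            for opt in root.findall("option")
            if department in opt.get("departments", "").split()]

print(options_for("td"))         # ['render_layers', 'denoise']
print(options_for("animation"))  # ['playblast']
```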

JET uses modular template "chunks" to perform each of the tasks in the pipeline (rendering, compositing, etc.).  The templates are designed to be "plug and play" so that the artist can select any combination and have them automatically configure themselves to work with each other by forming the appropriate connections in the pipeline. The templates are implemented as Python objects and are centrally located. This provides for better software maintenance, allowing for a quicker response when dealing with bug fixes and change requests.
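
A minimal sketch of how such plug-and-play chunks might be modeled: Python objects that declare their inputs and outputs so the pipeline can order and connect them automatically. The classes and ordering logic here are assumptions, not JET's implementation.

```python
class Chunk:
    """Base template: each chunk declares what it consumes and what it produces."""
    inputs = ()
    outputs = ()


class RenderChunk(Chunk):
    outputs = ("beauty_pass",)


class CompositeChunk(Chunk):
    inputs = ("beauty_pass",)
    outputs = ("final_frame",)


def connect(chunks):
    """Order chunks so every input is produced by an earlier chunk (naive resolution)."""
    ordered, produced, pending = [], set(), list(chunks)
    while pending:
        for chunk in pending:
            if set(chunk.inputs) <= produced:
                ordered.append(chunk)
                produced.update(chunk.outputs)
                pending.remove(chunk)
                break
        else:
            raise ValueError("chunks have inputs that nothing produces")
    return ordered


# Whatever order the artist selects the chunks in, they wire up correctly.
pipeline = connect([CompositeChunk(), RenderChunk()])
print([type(c).__name__ for c in pipeline])  # ['RenderChunk', 'CompositeChunk']
```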

JET is not only implemented entirely in Python, but it is also used to automatically generate Python scripts.  These custom scripts form unique pipelines for each computer graphics job to run on the render farm.
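
As a simple illustration of that code-generation step, the sketch below emits a standalone Python script that runs each pipeline command in order; the commands and file names are placeholders, not JET's real output.

```python
# Placeholder example of writing a per-job Python script for the render farm.
def write_job_script(commands, path="job_0001.py"):
    """Write a standalone Python script that runs each pipeline step in order."""
    lines = ["import subprocess", ""]
    for cmd in commands:
        lines.append(f"subprocess.run({cmd!r}, shell=True, check=True)")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_job_script([
    "render --shot shot_010 --out beauty.exr",
    "composite --in beauty.exr --out final.exr",
])
```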

In this case study, we propose to demonstrate the JET tool-set from both a user and implementation point-of-view, as well as to discuss the following topics:

Problem:

  • legacy code and outdated/limited pipeline

  • simple and easy user interface for artists while also supporting a more advanced interface for technical users

  • cross-platform

  • rapid implementation

  • a small software group cannot support a unique and diverse set of tools that basically do the same thing

  • interface with commercial VFX software (Maya, RenderMan, Shake, etc.)


Design/Architecture:

  • dynamic "chunk" templates

  • core JET engine (user interface builder, script reader/writer)

  • flexible/customizable graphical user interface

  • dependency tree


Why Python?:

  • ease of customization

  • automated code generation

  • cross-platform

  • Python objects ideal for "chunk" templates


Problems:

  • performance for large jobs

  • QT/GUI issues

  • error handling


Future:

  • next evolution of the visual effects pipeline