Dann Cahn on editing I Love Lucy with the "Three-Headed Monster," a Moviola that played the film from all three camera angles simultaneously and (hopefully) in sync with an optical track for sound.
Interview with Walter Murch, who received the Camerimage festival’s special Award to the Editor with Unique Visual Sensitivity. He edited sound on American Graffiti and The Godfather: Part II, earned his first Academy Award nomination for The Conversation, won his first Oscar for Apocalypse Now, and won an unprecedented double Oscar for Best Sound and Best Film Editing for his work on The English Patient.
What has three letters, many aliases and is of major significance to the sound community? You guessed it: ADR, aka Automated Dialog Replacement, aka Additional Dialog Recording, aka Dubbing, aka Looping. All of these monikers refer to the process of re-recording dialog that cannot be salvaged from a production. To make one thing clear: there is nothing automated about it. ADR is an art.
Article in International Photographer, the forerunner of International Cinematographer Magazine.
In a 1969 interview, cast and crew remember working with the mad genius Orson Welles on his groundbreaking first film.
Republished from the DGA’s Action magazine (May-June, 1969)
The Lady from Shanghai (1947)
Update: The new Orson Welles documentary MAGICIAN opens in Los Angeles and New York City on December 10th!
More info: http://cohenmedia.net/films/magician
Magician: The Astonishing Life and Work of Orson Welles looks at the remarkable genius of Orson Welles on the eve of his centenary – the enigma of his career as a Hollywood star, a Hollywood director (for some a Hollywood failure), and a crucially important independent filmmaker. From Oscar-winner Chuck Workman.
With: Simon Callow, Christopher Welles Foder, Jane Hill Sykes, Norman Lloyd, Ruth Ford, Julie Taymor, Peter Bogdanovich, James Naremore, Steven Spielberg, Henry Jaglom, Elvis Mitchell, Beatrice Welles-Smith, Walter Murch, Costa-Gavras, Oja Kodar, Joseph McBride, Wolfgang Puck, Jonathan Rosenbaum, Michael Dawson, Paul Mazursky, Frank Marshall
Disney Research demonstrated Automatic Editing of Footage from Multiple Social Cameras at SIGGRAPH.
Video cameras that people wear to record daily activities are creating a novel form of creative and informative media. But this footage also poses a challenge: how to expeditiously edit hours of raw video into something watchable. One solution, according to Disney researchers, is to automate the editing process by leveraging the first-person viewpoints of multiple cameras to find the areas of greatest interest in the scene.
The method they developed can automatically combine footage of a single event shot by several such “social cameras” into a coherent, condensed video. The algorithm selects footage based both on its understanding of the most interesting content in the scene and on established rules of cinematography.
“The resulting videos might not have the same narrative or technical complexity that a human editor could achieve, but they capture the essential action and, in our experiments, were often similar in spirit to those produced by professionals,” said Ariel Shamir, an associate professor of computer science at the Interdisciplinary Center, Herzliya, Israel, and a member of the Disney Research Pittsburgh team.
Whether attached to clothing, embedded in eyeglasses or held in hand, social cameras capture a view of daily life that is highly personal but also frequently rough and shaky. As more people begin using these cameras, however, videos from multiple points of view will be available of parties, sporting events, recreational activities, performances and other encounters.
“Though each individual has a different view of the event, everyone is typically looking at, and therefore recording, the same activity – the most interesting activity,” said Yaser Sheikh, an associate research professor of robotics at Carnegie Mellon University. “By determining the orientation of each camera, we can calculate the gaze concurrence, or 3D joint attention, of the group. Our automated editing method uses this as a signal indicating what action is most significant at any given time.”
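The gaze-concurrence idea described above can be illustrated with a simplified sketch: treat each camera as a ray (position plus gaze direction) and estimate the point the group is jointly looking at as the point minimizing the sum of squared distances to those rays. This is a toy stand-in for the paper's actual estimate, not the researchers' implementation; the function name and inputs are assumptions for illustration.

```python
import numpy as np

def joint_attention_point(positions, directions):
    """Estimate a 3D 'joint attention' point from several cameras.

    positions:  (N, 3) array of camera centers.
    directions: (N, 3) array of gaze (optical-axis) vectors.

    Returns the point minimizing the sum of squared distances to the
    N gaze rays, treated as infinite lines -- a simplified stand-in
    for the paper's gaze-concurrence signal.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(positions, directions):
        d = d / np.linalg.norm(d)
        # (I - d d^T) projects a vector onto the plane normal to d,
        # so |(I - d d^T)(x - p)|^2 is the squared distance from x to the ray.
        M = np.eye(3) - np.outer(d, d)
        A += M
        b += M @ p
    return np.linalg.solve(A, b)
```

With several cameras whose gaze rays all pass near the same spot (a ball, a performer), the solve recovers that spot; in the paper this signal then drives which camera's view is "most significant" at each moment.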
In a basketball game, for instance, players spend much of their time with their eyes on the ball. So if each player is wearing a head-mounted social camera, editing based on the gaze concurrence of the players will tend to follow the ball as well, including long passes and shots to the basket.
The algorithm chooses which camera view to use based on which has the best quality view of the action, but also on standard cinematographic guidelines. These include the 180-degree rule – shooting the subject from the same side, so as not to confuse the viewer with the abrupt reversals of action that occur when switching views between opposite sides.
Avoiding jump cuts between cameras with similar views of the action and avoiding very short-duration shots are among the other rules the algorithm obeys to produce an aesthetically pleasing video.
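The selection rules described above (cut to the best view, but enforce a minimum shot length and avoid jump cuts between similar views) can be sketched as a tiny greedy pass. The paper formulates this as a global optimization; the following is only a toy illustration, and the function name, thresholds, and inputs are assumptions.

```python
def select_shots(quality, similarity, min_shot_len=12):
    """Greedy per-frame camera selection under simple editing rules.

    quality:     quality[t][c] -- score of camera c's view at time t
                 (in the paper this would combine gaze concurrence and
                 image quality; here it is simply given).
    similarity:  similarity[a][b] -- view overlap between cameras a and b;
                 cutting between very similar views (> 0.9 here) is
                 treated as a jump cut and avoided.
    min_shot_len: minimum number of frames before another cut is allowed.

    Returns the chosen camera index for each time step.
    """
    cut_at = 0
    current = max(range(len(quality[0])), key=lambda c: quality[0][c])
    out = [current]
    for t in range(1, len(quality)):
        best = max(range(len(quality[t])), key=lambda c: quality[t][c])
        can_cut = (t - cut_at) >= min_shot_len       # shot held long enough
        jump_cut = similarity[current][best] > 0.9   # views too alike to cut between
        if best != current and can_cut and not jump_cut:
            current, cut_at = best, t                # make the cut
        out.append(current)
    return out
```

Even this greedy version shows the trade-off the researchers describe: the edit follows the highest-quality view of the action, but only when doing so does not violate the cinematographic constraints.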
The computation necessary to achieve these results can take several hours. By contrast, professional editors using the same raw camera feeds took an average of more than 20 hours to create a few minutes of video.
The algorithm also can be used to assist professional editors tasked with editing large amounts of footage.
Other methods available for automatically or semi-automatically combining footage from multiple cameras appear limited to choosing the most stable or best lit views and periodically switching between them, the researchers observed. Such methods can fail to follow the action and, because they do not know the spatial relationship of the cameras, cannot take into consideration cinematographic guidelines such as the 180-degree rule and jump cuts.
Automatic Editing of Footage from Multiple Social Cameras
Arik Shamir (DR Boston), Ido Arev (Efi Arazi School of Computer Science), Hyun Soo Park (CMU), Yaser Sheikh (DR Pittsburgh/CMU), Jessica Hodgins (DR Pittsburgh)
ACM Conference on Computer Graphics & Interactive Techniques (SIGGRAPH) 2014 – August 10-14, 2014
Paper [PDF, 25MB]
A very good documentary about the EditDroid. When I worked at Disney Imagineering, we were using laser disc players in a lot of places, including EPCOT, starting in the 1980s. We were also innovators and early adopters of nonlinear editing and video-to-film matchback.
From the film’s website:
The EditDroid was one of the first nonlinear electronic editing systems and used several laser disc players loaded with the raw footage of a film. Its simple computer interface was unique for its time. After a short period of success the EditDroid disappeared from the film scene, and George Lucas sold the machine’s patents to a small company called Avid.
I highly recommend the book Droidmaker: George Lucas And the Digital Revolution. It is also available as an iBook and on Kindle.
This book ventures into territory never explored, as Rubin, a former member of the Lucasfilm Computer Division, reconstructs the events in Hollywood, in Silicon Valley, and at Lucas’ private realm in Marin County, California, to track the genesis of modern media. With unprecedented access to images and key participants from Lucasfilm, Pixar and Zoetrope, from George Lucas and the executives who ran his company to the small team of scientists who made the technological leaps, Rubin weaves a tale of friendships, a love of movies, and the incessant forward movement of technology. This is a compelling story that takes the reader into an era of technological innovation almost completely unknown.
A very good video about "Jerry's Noisy Toy." Here is a more advanced version, seen in the behind-the-scenes film "Man in Motion."
Particle Fever follows six brilliant scientists during the launch of the Large Hadron Collider, marking the start-up of the biggest and most expensive experiment in the history of the planet, pushing the edge of human innovation.