Category: VFX

Face2Face: Real-time Face Capture

A team of researchers is working on a project called Face2Face, described as “real-time face capture and reenactment of RGB videos.”

Read more

Disney’s Augmented Reality Characters from Colored Drawings

Given the proliferation and popularity of digital devices, real-world activities like coloring can seem unexciting, and children become less engaged in them. Augmented reality holds unique potential to impact this situation by providing a bridge between real-world activities and digital enhancements.

Read more

“What Lives Inside” Episodes

I previously wrote about What Lives Inside here.

This is the future, folks: computer companies producing a movie to be shown on a streaming service.

Get ready to be taken to a world beyond your imagination. From Academy Award Winner Robert Stromberg, Dell presents What Lives Inside. Starring Academy Award Winner J.K. Simmons, Colin Hanks and Catherine O’Hara. Premiering March 25th only on Hulu.

Here is the episode playlist and some behind-the-scenes footage.

Here is the making-of video.


Intel Dell What Lives Inside – Behind the Scenes by CGMeetup


Intel Dell What Lives Inside – VFX Breakdown by CGMeetup

“What Lives Inside” Official Trailer and behind the scenes

This is the future, folks: computer companies producing a movie to be shown on a streaming service.

Get ready to be taken to a world beyond your imagination. From Academy Award Winner Robert Stromberg, Dell presents What Lives Inside. Starring Academy Award Winner J.K. Simmons, Colin Hanks and Catherine O’Hara. Premiering March 25th only on Hulu. Find out more here.

From Fast Co.Create.

What Lives Inside is the fourth installment of Intel’s “Inside Films” series, dating back to a partnership with Toshiba and agency Pereira & O’Dell that started in 2011 with Inside, starring Emmy Rossum and directed by D.J. Caruso. It was followed by 2012’s The Beauty Inside starring Topher Grace, and 2013’s The Power Inside starring Harvey Keitel.

This year’s film, divided into four episodes, is about the son (Hanks) of an absentee father (Simmons), a well-known and acclaimed children’s puppeteer widely celebrated for his creativity, who finds himself on a journey of self-discovery after his father’s death. The son discovers a mysterious world of his dad’s creation and finds himself on an adventure that will soon unlock his own creativity.


Pereira & O’Dell chief creative officer PJ Pereira says one of the biggest challenges this year was finding another fresh way of bringing the same premise—Intel’s tagline “It’s what’s inside that counts”—to life. “We had to find a role to make the product not the subject of the story we are telling, but a character,” says Pereira. “Because characters are what the audience will remember and love months after the campaign is gone.”

In addition to Oscar-winning talent, each film in the Inside Films series featured a social element, soliciting submissions from people for the chance to see their photos and videos in the film, or even audition for a part. This year, it worked a bit differently. “This time, because the central theme is their creativity, that’s what is on display. Their drawings, as if they were all kids that have submitted ideas to the character played by J.K. Simmons,” says Pereira.

Just six weeks after Stromberg issued a challenge online, the film received thousands of creature submissions and more user-generated content than the two previous films combined. “This project always felt more like a film than an ad, with its longer format, incredible cast and extensive visual effects,” says Stromberg, who won art direction Oscars for Alice in Wonderland and Avatar. “The whole interactive angle is also super interesting to me. We’ve had over 6,000 submissions of art work, which is crazy! I just think that’s a much better indicator of engagement than throwing a project into testing. I love how it lets people be an active part of the final product. Any time I can be a part of inspiring others to get in touch with their creative side, only inspires me more as an artist.”

The film debuts on Hulu, with new episodes weekly for four weeks, then starting May 6 the full series will be available on WhatLivesInside.com and YouTube.

VFX Legend Douglas Trumbull talks about the Future of Film … and Kubrick.

From the Sept. 12 issue of The Hollywood Reporter.

Trumbull drives me a short distance from his home to a full-size soundstage and escorts me into a screening room that he has constructed to meet his ideal specifications: a wide wall-to-wall and floor-to-ceiling curved screen, with surround sound, steeply rigged stadium seating and a 4K high-resolution projector. As I put on specially designed 3D glasses and settle into a seat, he tells me, with an unmistakable hint of nervousness, “You’re one of the first people on the planet to see this movie.”

Ten minutes later, the lights come back up and I sit in stunned silence. The short that I have just seen, UFOTOG (a blending of the words “UFO” and “fotog,” the latter slang for press photographer), is stunning not because of its story — we’ve all seen movies about UFOs — but because it shows, as it was designed to do, what movies can look like if theaters, studios and filmmakers embrace the MAGI process through which Trumbull brought it to the screen: bigger, brighter, clearer and with greater depth-of-field than anything ever seen in a cinema before.

All of the aforementioned conditions are part of the MAGI equation, but the most essential element is the rate of frames per second at which a film is projected. In the beginning, the Lumiere brothers projected films at 18 fps, slow enough to result in the appearance of flickering —  hence the early nickname for the movies, “the flickers” or “the flicks.” That figure eventually increased to 24 fps, and has remained there, for the most part, ever since.

In 2012, Peter Jackson dared to release The Hobbit‘s first installment at 48 fps, which was supposed to create a heightened sense of realism, but which instead struck many as strange-looking and some even as nauseating. Many deemed the experiment a failure. Trumbull disagreed. He felt that if a digitally shot film was projected even faster — markedly faster, as in 120 fps, via a bright projector and onto a big screen — then the movie screen itself would seemingly disappear and serve effectively as a window into a world on the other side that would appear as real as the world in which one sits.


To the Moon and Beyond featured a 70 mm circular image projected onto a dome screen and took viewers on a journey “from the Big Bang to the microcosm in 15 minutes.” Two of the thousands who saw it were Stanley Kubrick, the filmmaker, and Arthur C. Clarke, the writer, who came away from it convinced that an A-level sci-fi film — which eventually became 2001: A Space Odyssey — was possible. Kubrick contracted Graphic Films to produce conceptual designs for the project, but, once it got off the ground, moved it to London, at which point 23-year-old Trumbull cold-called the director and got a job on the film. His greatest contribution to it was devising a way to create a believable “Star Gate” effect, representing “the transformation of a character through time and space to another dimension.” Even though Kubrick alone claimed screen credit and an Oscar for the film’s VFX, Trumbull instantly became a name in the business.


A few years later, he made his directorial debut with Silent Running (1972), a well-received film that landed him deals at Fox, MGM and Warner Bros. — but all of them “unraveled for stupid reasons.” By 1975, “desperate because you can’t live on development deals,” he and Richard Yuricich proposed the creation of the Future General Corporation, through which they would try to identify ways to improve the technology used to make films. Paramount agreed to sponsor the endeavor — which, to them, was a tax write-off — in return for 80 percent ownership. Within the first nine months of its existence, Trumbull says, “We invented Showscan [a manner of projecting films at 60 fps]. We invented the first simulator ride. We invented the 3D interactive videogame. And we invented the Magicam process [by which actors can perform in front of a blue screen, onto which nonexistent locations can be projected to create virtual realities].” And yet, in the end, Paramount “saw no future in the future of movies” and failed to support their efforts, devastating Trumbull, who was under exclusive contract to the studio for the next six years. (The studio’s one gesture that he did appreciate: loaning him out to Columbia to do the special effects for Close Encounters of the Third Kind.)

Trumbull got out of his Paramount contract in 1979 thanks to Star Trek: The Motion Picture. The original effects team that had been engaged for the highly anticipated film couldn’t handle the job, something the studio realized only six months before its long-scheduled Christmas release date. The studio begged Trumbull to take over, and he agreed to do so — provided he was paid a considerable fee and released from his contract. He got what he requested and, to the detriment of his health, also got the job done on time.

Newly a free agent, Trumbull continued to take on special effects jobs for others — for instance, Ridley Scott‘s Blade Runner (1982) — but his primary focus was on directing a film of his own that would demonstrate the capabilities of Showscan. For the project, which he called Brainstorm, he secured a top-notch cast, led by Natalie Wood, and a major distributor, MGM. Production got underway and was almost completed when, on Nov. 29, 1981, tragedy struck: Wood drowned under circumstances that remain mysterious to this day. Since Wood had only a few small scenes left to shoot, Trumbull felt that he could easily finish the film, but MGM, which was in dire financial straits, filed what he deemed a “fraudulent insurance claim” because “they wanted to get out of it.”

Doug Trumbull on motion simulator base for “In Search of the Obelisk” (1993) VistaVision ridefilm at the Luxor Las Vegas.

Photo courtesy of Mice Chat.

Then, in 1990, he was approached about making a Back to the Future ride for Universal Studios venues in Florida, Hollywood and Japan. Others had been unable to conquer it, but he made it happen — and in a groundbreaking way: “It took you out of your seat and put you into the movie. You were in a DeLorean car. You became Marty McFly. You became a participant in the movie. The movie was all around you.” It ran for 15 years, he says, but was “dismissed as a theme park amusement.” He felt it was something more. “This was a moment where, for the first time in history, you went inside a movie.” Even though others failed to see larger possibilities, he says, “That kinda kept me going for a long time because it validated that we could be here in the Berkshires and make breakthroughs that no one else was able to do in Hollywood or anywhere else.”

In 2009, James Cameron‘s Avatar, a digitally shot 3D production that grossed a record $2.8 billion worldwide, changed everything. Its success spurred, at long last, filmmakers to transition en masse to digital photography and theaters to transition en masse to digital projection — at which point Trumbull made a crucial discovery. He realized that digital projectors flash at 144 fps — more than twice as fast as Showscan’s 60 fps — but films were still being made at 24 fps, with each frame just flashing multiple times. “Could we do a new frame every flash?” he wondered. If so, he reasoned, it might just give people a reason to put down their smartphones, tablets and laptops and actually buy a ticket to see a movie in a theater.
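
The arithmetic behind that question is simple enough to sketch (a back-of-the-envelope illustration, not a description of any particular projector or 3D system):

```python
# Illustrative arithmetic only; real projectors and 3D systems vary.
PROJECTOR_FLASHES_PER_SEC = 144  # flash rate Trumbull cites for digital projectors
FILM_FPS = 24                    # the standard rate since the late 1920s
MAGI_FPS = 120                   # Trumbull's MAGI capture/playback rate

flashes_per_frame = PROJECTOR_FLASHES_PER_SEC // FILM_FPS
print(f"At 24 fps, each frame is flashed {flashes_per_frame} times.")  # 6

# Trumbull's "new frame every flash" idea: at 120 fps nearly every flash
# carries fresh image data, sampling motion 5x more densely than 24 fps.
print(f"MAGI shows {MAGI_FPS // FILM_FPS}x as many unique frames per second.")
```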

After years of work on his farm, Trumbull is finally ready to unveil UFOTOG. Its first public presentation will take place on Sept. 11 as part of the Toronto International Film Festival’s Future of Cinema conference (at which Trumbull will also give a keynote address), and it will also screen days later at the IBC Conference in Amsterdam. At both venues, he says, his message will be rather straightforward: “It’s not rocket science, guys. It’s just a different shape, a different size, a different brightness and a different frame rate. Abandon all that crud that’s leftover from 1927. We’re in the digital age. Get with it.”

The cost of these changes, he insists, will be rather negligible: projectors are already equipped to handle faster frame rates, and would require only slightly more data and render time; theaters are already adopting brighter projectors that employ laser illumination, which uses a longer-lasting light source to produce twice the amount of light; and theaters, he believes, will soon recognize that they are in the “real estate business” and that it is in their interest to have fewer total screens but more big screens, for which the public has demonstrated a willingness to pay a premium.

Trumbull’s main objective, though, is “to show the industry what it is possible to do” with MAGI. He says he’s “dying to show” UFOTOG to filmmakers such as Jackson, Cameron and Christopher Nolan, whom he regards as kindred souls. But mostly, he wants to challenge the industry one more time, warning it, “If you want people to come to theaters, you better do something different.”


Hyperlapse from Instagram

Product designer Chris Connolly and software engineers Thomas Dimson and Alex Karpenko.

Photo: Ariel Zambelich/WIRED

From Wired.

Today at 10am PST, Instagram is lifting the veil on Hyperlapse, one of the company’s first apps outside of Instagram itself. Using clever algorithmic processing, the app makes it easy to use your phone to create tracking shots and fast time-lapse videos that look as if they’re shot by Scorsese or Michael Mann. What was once only possible with a Steadicam or a $15,000 tracking rig is now possible on your iPhone, for free. (Instagram hopes to develop an Android version soon, but that will require changes to the camera and gyroscope APIs on Android phones.) And that’s all thanks to some clever engineering and an elegantly pared-down interaction design. The product team shared their story with WIRED.

The Inspirations
By day, Thomas Dimson quietly works on Instagram’s data, trying to understand how people connect and spread content using the service. Like a lot of people working at the company, he’s also a photo and movie geek—and one of his longest-held affections has been for Baraka, an art-house ode to humanity that features epic tracking shots of peoples all across the world. “It was my senior year, and my friend who was an architect said, ‘You have to see it, it will blow you away,’” says Dimson. He wasn’t entirely convinced. The movie, after all, was famous for lacking any narration or plot. But watching the film in his basement, Dimson was awestruck. “Ever since, it’s always been in the back of my mind,” he says.

A sample shot from Baraka

By 2013, Dimson was at Instagram. That put him back in touch with Alex Karpenko, a friend from Stanford who had sold his start-up to Instagram in 2013. Karpenko and his firm, Luma, had created the first-ever image-stabilization technology for smartphone videos. That was obviously useful to Instagram, and the company quickly deployed it to improve video capture within the app. But Dimson realized that it had far greater creative potential. Karpenko’s technology could be used to shoot videos akin to all those shots in Baraka. “It would have hurt me not to work on this,” says Dimson.

Once you start using the app, you quickly see that replay speed itself becomes a novel, alluring tool: For pets and people, replaying at about 1x gives you the sense that you’re creating a tracking shot like that Copacabana scene in Goodfellas. The higher replay speeds work better for shooting the sky out your airplane window, the scenery scrolling past during a train ride, or anything else that’s moving slowly or at a distance. Where Instagram’s filters are all about changing color and light, Hyperlapse uses a simple speed slider as its main creative decision.

All of those choices must be built in up front with traditional camera rigs. Usually, capturing even a brief tracking shot requires intricate choreography between where you’ll move with the camera and what your subjects will be doing when you film them. Time-lapse set-ups are even more intense, requiring that a camera be set up on a track and programmed to move at a steady speed. Both of those art forms are hardly spontaneous, and spontaneity is supposed to be Instagram’s calling card.

Hyperlapse, by contrast, lets you create a tracking shot in less than a minute. “This is an app that lets you be in the moment in a different way,” says Instagram co-founder Mike Krieger. “We did that by taking a pretty complicated image processing idea, and reducing it to a single slider. That’s super Instagram-y.”


The Technology behind Hyperlapse from Instagram

Yesterday we released Hyperlapse from Instagram—a new app that lets you capture and share moving time lapse videos. Time lapse photography is a technique in which frames are played back at a much faster rate than that at which they’re captured. This allows you to experience a sunset in 15 seconds or see fog roll over hills like a stream of water flowing over rocks. Time lapses are mesmerizing to watch because they reveal patterns and motions in our daily lives that are otherwise invisible.
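
In code, that capture/playback mismatch is the whole trick. A minimal sketch (the function and the numbers are mine, purely illustrative):

```python
def time_lapse(frames, speedup):
    """Keep every `speedup`-th frame; played back at the capture frame
    rate, the result runs `speedup` times faster than real time."""
    return frames[::speedup]

# e.g. 15 minutes of 30 fps video (27,000 frames) at 60x -> 15 seconds
condensed = time_lapse(list(range(27000)), speedup=60)
print(len(condensed))  # 450 frames, i.e. 15 s at 30 fps
```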

Hyperlapses are a special kind of time lapse where the camera is also moving. Capturing hyperlapses has traditionally been a laborious process that involves meticulous planning, a variety of camera mounts and professional video editing software. With Hyperlapse, our goal was to simplify this process. We landed on a single record button and a post-capture screen where you select the playback rate. To achieve fluid camera motion we incorporated a video stabilization algorithm called Cinema (which is already used in Video on Instagram) into Hyperlapse.

In this post, we’ll describe our stabilization algorithm and the engineering challenges that we encountered while trying to distill the complex process of moving time lapse photography into a simple and interactive user interface.

Cinema Stabilization

Video stabilization is instrumental in capturing beautiful fluid videos. In the movie industry, this is achieved by having the camera operator wear a harness that separates the motion of the camera from the motion of the operator’s body. Since we can’t expect Instagrammers to wear a body harness to capture the world’s moments, we instead developed Cinema, which uses the phone’s built-in gyroscope to measure and remove unwanted hand shake.
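
The post doesn’t spell out Cinema’s internals, but gyro-based stabilization generally works by integrating the gyroscope’s angular-velocity samples into a per-frame camera orientation, low-pass filtering that orientation, and warping each frame by the rotation-only homography between the measured and smoothed camera paths. A rough sketch of the idea (the names and the naive matrix smoothing are my assumptions, not Instagram’s code):

```python
import numpy as np
import cv2  # OpenCV, used here only for the perspective warp

def stabilize(frames, gyro_rotations, K, alpha=0.9):
    """frames: list of images; gyro_rotations: per-frame 3x3 rotation
    matrices integrated from gyroscope samples; K: 3x3 camera intrinsics."""
    smoothed = gyro_rotations[0]
    out = []
    for frame, R in zip(frames, gyro_rotations):
        # Low-pass filter the orientation to get the desired smooth path.
        # (A real implementation would slerp quaternions; blending matrices
        # like this is only approximately a rotation.)
        smoothed = alpha * smoothed + (1 - alpha) * R
        R_corr = smoothed @ R.T                 # measured -> smoothed
        H = K @ R_corr @ np.linalg.inv(K)       # pure-rotation homography
        out.append(cv2.warpPerspective(frame, H, frame.shape[1::-1]))
    return out
```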

For more in-depth info and technical videos, go here: http://instagram-engineering.tumblr.com/post/95922900787/hyperlapse

The Congress, a film by Ari Folman.

The recent controversy over what Andy Serkis said about Dawn of the Planet of the Apes and Scarlett Johansson’s performance in Her are both indications of the rise of the “virtual actor.”

After seeing the fascinating trailer, I am looking forward to seeing this film. Please note the Fleischer and Kubrick comments in the interview selections below.

From the website for The Congress, a film by Ari Folman.

Robin Wright, a Hollywood actress who once held great promise (“The Princess Bride”, “Forrest Gump”), receives an unexpected offer in mid-life: Miramount Studios wants to scan her entire being into their computers and purchase ownership of her image for an astronomical fee. After she is scanned, the studio will be allowed to make whatever films it wishes with the 3-D Robin, including all the blockbusters she chose not to make during her career. As if that were not inducement enough, the studio promises to keep the new 3-D Robin forever young in the movies. She will always be thirty-something, a stunning beauty who never grows old. In return, Robin will receive tons of money, but she’ll be forbidden to appear on any kind of stage for all eternity. Despite her deep internal resistance, Robin eventually signs the contract, since she understands that in the economy of scanned actors it’s her only way to stay in the business; even more crucially, it means she can give her son Aaron, who suffers from a rare disorder, the best treatment money can buy. The contract is valid for 20 years.

Twenty years later, Robin arrives at Abrahama, the animated city created by Miramount Nagasaki, once the Hollywood studio that signed Robin and now the exclusive creator of the cinematic dream-world that controls all our emotions, from love and longings to ego and deathly anxieties. Miramount Nagasaki’s chemistry is everywhere, from the air conditioning to the water sources. During the intervening two decades, the corporation has turned Robin Wright from a Hollywood actress with unfulfilled potential into an international superstar and fantasy. On-screen, she has remained forever young. In the animated world of the future, Miramount Nagasaki is celebrating a huge gathering in the heart of the desert, “The Futurist Congress.” At the event, Miramount Nagasaki’s genius scientists — once creators of movies, now computer programmers who have evolved into chemists and pharmacists — will declare the next stage in the chemical evolution: free choice! From now on, every viewer can create movies in his own imagination, thanks to chemical selection. Robin Wright is now a mere chemical formula that every person can consume by taking the correct prescription, then staging whatever story they desire: Snow White, personal family dramas, or porn. It’s all in the brain, all through chemicals.

The animated Robin Wright is an “elderly” woman of 66. When she arrives at the congress as the guest of honor, no one recognizes her as the stunning beauty admired by all, a star whose image is broadcast on screens in every corner of the congress. She is lonely, about to become a chemical formula, when out of nowhere, Miramount Nagasaki’s utopian plan is suddenly derailed: the thinking men, the resisters, the rebels who have been fighting the deceptive regime of the pharmaceutical world unite and turn the Futurist Congress into a fatally violent arena. The struggle for clarity of thought becomes a war of independence for the right to imagine. Out of the forgetting and the loss, Robin suddenly regains the ability to choose. Will she go back to living in the world of truth, a gray world devoid of chemistry, where she is an aging, anonymous actress caring for her sick 30-year-old son? Or will she surrender to the captivating lie of the chemical world and remain forever young?

The Congress by Ari Folman


In his novel The Futurological Congress, the great science fiction writer Stanislaw Lem foresaw a worldwide chemical dictatorship run by the leading pharmaceutical companies. Written in the late nineteen-sixties, the book depicted drug manufacturers’ complete control of our entire range of emotions, from love and longings, to jealousy and deadly fear. Lem, considered sci-fi’s greatest prophet and philosopher (alongside Philip K. Dick), could not have realized how prescient he was in predicting the start of the third millennium.

Into the psychochemical whirlwind foreseen by Lem, the film adaptation of his novel introduces the current cinematic technologies of 3-D and motion capture, which threaten to eradicate the cinema we grew up on. In the post-“Avatar” era, every filmmaker must ponder whether the flesh and blood actors who have rocked our imagination since childhood can be replaced by computer-generated 3-D images. Can these computerized characters create in us the same excitement and enthusiasm, and does it truly matter?

The film, entitled The Congress, takes 3-D computer images one step further, developing them into a chemical formula that every customer may consume through prescription pills, thereby compiling in their minds the movies they have always wanted to see, staging their fantasies, and casting the actors they adore. In this world, these beloved creatures of stage and cinema become futile relics, lacking in content, remembered by no one. Where, then, do these actors go after selling their souls and identities to the studio devil?

The Congress comprises quasi-documentary live-action sequences that follow one such actress, Robin Wright, as she accepts an offer to be scanned and signs a contract selling her identity to the studio, then transitions into an animated world that depicts her tribulations after selling her image, up until the moment when the studio turns her into a chemical formula. Only the mesmerizing combination of animation, with the beautiful freedom it bestows on cinematic interpretation, and quasi-documentary live-action can illustrate the transition made by the human mind between psychochemical influence and deceptive reality. The Congress is primarily a futuristic fantasy, but it is also a cry for help and a profound cry of nostalgia for the old-time cinema we know and love.

INTERVIEW WITH ARI FOLMAN

THE CONGRESS presents a strongly dystopic vision of Hollywood and big studio movies – is that also how you view that part of the industry? Does your film reflect a fear for the future of cinema?

While searching for a suitable location in LA to shoot the scanning room scene, I was shocked to learn that such a room already exists. Actors have been scanned for a number of years now – this technology is already here. Flesh and blood actors are not really needed in this “post-Avatar era”. I guess it’s economics now that dictates whether the next generation of films will be with scanned actors, or with a completely new generation of actors “built from scratch”. As an optimist, I think the choice for a human actor will win out, and I hope The Congress is our small contribution toward that goal.

So many details in THE CONGRESS are “futuristic” yet still very current – do you see any positive aspects of living in another reality, behind an online avatar for example? Do you think it approaches the film’s idea of choosing your own reality?

I think the chemical world outlined in Lem’s novel and in the film is a fantasy, but at the same time it’s still a major fear for those of us who travel in our imagination and our dreams. I have always had the feeling that everybody, everywhere lives in parallel universes: one where we function in real time, and the other where our mind takes us – with or without our control. Combining the two worlds into one is, for me, the biggest goal of being a filmmaker.

The film is unique but features what seems like an encyclopedia of significant references in terms of cinema and otherwise. Were there key films or other influences that served as guides or inspirations as you made this movie?

The animated part is a tribute to the great Fleischer Brothers’ work from the ’30s. It’s hand-drawn, made in 8 different countries, and took two and a half years to create 55 minutes of animation. It was by far the toughest mission of my life as a director. The team back home, led by the director of animation, Yoni Goodman, worked 24/7 to ensure the animation from a number of different studios kept the characters consistent from scene to scene. During the process we discovered that sleep is for mortals and animation for the insane! Elsewhere in the movie I try to pay tribute to my idol Stanley Kubrick twice: once with a reference to Dr. Strangelove and another to 2001: A Space Odyssey, still my favorite sci-fi movie ever.


For more behind-the-scenes material, go here.

Lucid Dreams of Gabriel – Teaser

From Variety.

Disney and Swiss pubcaster SRF unveil experimental short at Locarno fest.

At the Locarno Film Festival, the Disney lab and SRF jointly unveiled an impressive experimental short titled “Lucid Dreams of Gabriel” (see teaser), which for the first time displayed local frame-rate variation, local pixel timing, super-slow-motion effects, and a variety of artistic shutter functions showcasing the “Flow-of-Time” technique.

The project was created by the Disney Research lab in tandem with the formidable computer graphics lab at the Swiss Federal Institute of Technology Zurich (ETH) with SRF providing studio space, personnel, and other resources.

“We wanted to control the perception of motion that is influenced by the frame rate (how many images are shown per second) as well as by the exposure time,” said Markus Gross, vice president of research at Disney Research and director of Disney Research Zurich, at the presentation.

Use of the new technologies in the short, which is a surreal non-linear story about a mother achieving immortality in her son’s eyes after an accident in the spectacular Engadin Alpine valley, allowed director Sasha A. Schriber to avoid using green screen and to make the transition from reality (at 24 frames per second) to a supernatural world (at 48 frames per second).

“Lucid Dreams Of Gabriel,” an experimental short film created by Disney Research in collaboration with ETH Zurich, was shot at 120 fps/RAW with all effects invented and applied in-house at Disney Research Zurich. We sought to produce a visual effects framework that would support the film’s story in a novel way. Our technique, called “The Flow-Of-Time,” includes local frame rate variation, local pixel timing and a variety of artistic shutter functions.

Effects include:
•High dynamic range imaging
•Strobe and rainbow shutters
•Global and local framerate variations
•Flow motion effects
•Super slow motion
•Temporal video compositing

The following scenes of the teaser, indicated by the timecode, demonstrate different components of our new technology:

Shots with a dark corridor and a window (0:08); a man sitting on a bed (0:16):
Our new HDR tone-mapping technique makes use of the full 14 bit native dynamic range of the camera to produce an image featuring details in very dark as well as very bright areas at the same time. While previous approaches have been mostly limited to still photography or resulted in artifacts such as flickering, we present a robust solution for moving pictures.
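
The post doesn’t give details beyond this, but the flicker it mentions typically comes from letting the tone-mapping parameters jump from frame to frame. Here is a hedged sketch of one way to build a temporally stable global operator (Reinhard-style compression with a low-pass-filtered key value; this is entirely my own construction, not Disney’s method):

```python
import numpy as np

def tonemap_video(frames, alpha=0.95):
    """frames: linear-light float arrays using the full 14-bit range (0..16383).
    Yields 8-bit frames; the per-frame key value is temporally smoothed so
    the mapping cannot flicker from frame to frame."""
    key = None
    for f in frames:
        frame_key = np.exp(np.mean(np.log(f + 1.0)))  # log-average luminance
        key = frame_key if key is None else alpha * key + (1 - alpha) * frame_key
        scaled = f / key
        mapped = scaled / (1.0 + scaled)              # compress highlights,
        yield np.clip(mapped * 255.0, 0, 255).astype(np.uint8)  # keep shadows
```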

A hand holding a string of beads (0:14):
As we experimented with novel computational shutters, the classic Harris-shutter was extended to make use of the full rainbow spectrum instead of the traditional limitation to just red, green, and blue. For this scene, the input was rate converted using our custom technology, temporally split and colored, then merged back into the final result.
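
As a rough illustration of that generalized Harris shutter: split a temporal window of frames into N slices, tint each slice with a hue along the spectrum, and merge. A sketch under my own assumptions (grayscale input, simple averaging, at least `slices` frames per window):

```python
import colorsys
import numpy as np

def rainbow_shutter(frames, slices=12):
    """frames: list of HxW grayscale arrays covering one exposure window.
    Returns one RGB image; earlier moments tinted red, later ones violet."""
    stack = np.asarray(frames, dtype=np.float32)     # (N, H, W)
    out = np.zeros((*stack.shape[1:], 3), dtype=np.float32)
    for i, chunk in enumerate(np.array_split(stack, slices)):
        hue = 0.75 * i / max(slices - 1, 1)          # 0.0 = red ... 0.75 = violet
        tint = np.array(colorsys.hsv_to_rgb(hue, 1.0, 1.0), dtype=np.float32)
        out += chunk.mean(axis=0)[..., None] * tint  # average slice, then color
    return np.clip(out / slices, 0, 255).astype(np.uint8)
```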

The double swings scene (0:20):
Extending our experiments with computational shutters, this scene shows a variety of new techniques composed into a single shot. Fully utilizing the original footage shot at 120 fps, the boy has been resampled at a higher frame rate (30 fps) with a short shutter, resulting in an ultra-crisp, almost hyper-real appearance, while the woman was drastically resampled at a lower frame rate (6 fps) with an extreme shutter that is physically not possible, adding a strong motion blur to make her appear more surreal.
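
A hedged sketch of that per-subject treatment: pick output frames at the target rate from the 120 fps stream, and average a window of source frames around each pick to synthesize the shutter, one frame for a crisp look, dozens for surreal blur. The function and its parameters are my assumptions, not Disney’s pipeline:

```python
import numpy as np

def resample_with_shutter(frames, src_fps, dst_fps, shutter_frames):
    """Pick frames at dst_fps from a src_fps stream; averaging
    `shutter_frames` source frames per pick synthesizes motion blur
    (a synthetic shutter may exceed the physical 360-degree limit)."""
    step = src_fps / dst_fps
    out = []
    for i in np.arange(0, len(frames), step):
        lo = int(i)
        hi = min(lo + max(shutter_frames, 1), len(frames))
        out.append(np.mean(frames[lo:hi], axis=0))
    return out

# boy: crisp, near-hyper-real; woman: heavy, physically impossible blur
# boy_out   = resample_with_shutter(boy_frames,   120, 30, shutter_frames=1)
# woman_out = resample_with_shutter(woman_frames, 120,  6, shutter_frames=40)
```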

Car driving backwards and a flower (0:30); a train (0:36):
For these scenes, we were experimenting with extreme computational shutters. The theoretical motion blur for the scenes was extended with a buoyancy component and modified through a physical fluid simulation, resulting in physically impossible motion blur. As shown, it is possible to apply this effect selectively on specific parts of the frame, as well as varying the physical forces.

Super slow motion closeup of the boy (0:44); a handkerchief with motion blur and super slow motion (0:47); an hourglass (0:50):
These shots show the classical application of optical flow: slow motion. However, with our new technology we have been able to achieve extremely smooth pictures with virtually no artifacts, equivalent to a shutter speed at 1000 fps. At the same time, artificial motion blur equivalent to a shutter of far more than 360 degrees can be added to achieve a distinct “stroby” look, if desired, while maintaining very fluid motion in all cases. We are also able to speed up or slow down parts of the scene, e.g. to play the background in slow motion while the foreground runs at normal speed. All of these effects can be applied on a per-pixel basis, thus giving full freedom to the artist.
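
Disney’s retimer is clearly far more robust than anything this short, but the skeleton of flow-based slow motion (estimate optical flow between neighboring frames, then warp to synthesize in-betweens) can be sketched with OpenCV’s Farnebäck flow. This is a simple stand-in of mine, not their method:

```python
import cv2
import numpy as np

def interpolate(frame_a, frame_b, t):
    """Synthesize a frame at fractional time t in (0, 1) between two
    8-bit grayscale frames by warping frame_a along t * flow(a -> b).
    (A crude approximation: no occlusion handling or bidirectional flow.)"""
    flow = cv2.calcOpticalFlowFarneback(frame_a, frame_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = frame_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x - t * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - t * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)

# nine in-betweens per 120 fps pair approximates 1200 fps-equivalent motion
# betweens = [interpolate(a, b, t) for t in np.linspace(0.1, 0.9, 9)]
```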

Additional info on the film:

“Lucid Dreams Of Gabriel” is a surrealistic and non-linear story about a mother achieving immortality through her son, unconditional love, and the fluidity of time.

Producer: Markus Gross
DOP: Marco Barberi
Script & Director: Sasha A. Schriber
Camera & lenses: Arri Alexa XT with Zeiss prime lenses
Original language: English
Length: 11 minutes

UFOTOG, a new film by Douglas Trumbull

I have always been a great admirer of Douglas Trumbull. Here is the press release about his newest project.

SEATTLE, April 30, 2014 — Academy Award winner Douglas Trumbull announced today his forward-looking ten-minute demonstration movie UFOTOG. Produced at TRUMBULL STUDIOS in western Massachusetts, the experimental sci-fi adventure, written and directed by Trumbull in 4K 3D at 120 frames per second, demonstrates his new process called MAGI, which explores a new cinematic language that invites the audience to experience a powerful sense of immersion and impact that is not possible using conventional 24 fps or 3D standards.

UFOTOG is a dramatic short story about a lone man attempting to photograph UFOs. Trumbull felt that it would be ideal to premiere UFOTOG at Paul Allen’s iconic Seattle Cinerama Theater as the headlining event of the annual Science Fiction Film Festival on Sunday, May 11, 2014, in conjunction with special screenings of 2001: A SPACE ODYSSEY and CLOSE ENCOUNTERS OF THE THIRD KIND, both of which have alien contact stories. In addition, Trumbull will introduce his movie BRAINSTORM, which marked the beginning of his quest for immersive cinema, on Friday, May 9.

Trumbull has embarked on a project to write, produce, and direct using the MAGI technology for the production of his own feature films at Trumbull Studios. Through his comprehensive understanding of the needed technological advances, Trumbull has constructed a laboratory/stage/studio where he can shoot 120 fps 4K 3D live action within virtual environments, and see the results on the large screen adjacent to the shooting space. Trumbull announced: “This way, we can explore and discover a new landscape of audience excitement, and do it inexpensively and quickly – we are pushing the envelope to condense movie production time, intending to make films at a fraction of current blockbuster costs, yet with a much more powerful result on the screen.”

Trumbull Studios partnered with Christie to explore the potential of 3D 4K 120 fps projection, using the latest Christie Mirage 4K35 projection system. A special Mirage system will be installed in the Seattle Cinerama Theater for the premiere of UFOTOG, and the theater will offer the first public installation of Christie’s new laser illumination system in the fall of this year.

Trumbull Studios is committed to Eyeon Software, which has enabled the production of UFOTOG with the unheard-of impact of 3D in 4K at 120 fps, using Eyeon Fusion, Generation, Connection, and Dimension.

Trumbull’s pioneering work also included his Academy Award-winning 70mm, 60-frames-per-second SHOWSCAN process, which was widely acclaimed by industry professionals and led to the development of the film BRAINSTORM, which was to debut the process worldwide, with Trumbull directing. Yet in the days of celluloid film, with the attendant high 70mm print costs and projector upgrades, the process did not get traction. Now, with digital projectors regularly operating at 144 frames per second for 3D, implementing much higher frame rates and increasing resolution is proving to be a cost-effective way to improve movie impact and profitability.

One objective of Trumbull’s initiative is to demonstrate to the film industry that the successful future of the movie-going experience needs to be a “special event” on larger screens, at high brightness, and with ultra-high frame rate 2K and 4K presentations that cannot be emulated on television, laptops, tablets, or smartphones. “Today, the multiplex is in your pocket…” says Trumbull, “…so younger audiences are enjoying the benefits of low cost and convenience via downloading and streaming, causing tidal shifts in the entertainment industry, and particularly in theatrical exhibition. Theaters must offer an experience that is so powerful and overwhelming that people will see the reward of going out to a movie.”

Trumbull is legendary for his ground-breaking visual effects work on films such as 2001: A SPACE ODYSSEY, CLOSE ENCOUNTERS OF THE THIRD KIND, and BLADE RUNNER, as well as his directorial achievements on SILENT RUNNING and BRAINSTORM and special venue projects such as BACK TO THE FUTURE – THE RIDE, and a trilogy of giant screen high frame rate attractions at the Luxor Pyramid in Las Vegas. Trumbull has more recently been pursuing what he believes can usher in a powerful transformation of cinema itself. At a time when the major studios have embarked on a business model to produce only tent pole-franchise-superhero-comic book action films, theater attendance is in decline. Trumbull believes that a jolt of high technology energy is needed to improve the impact of these expensive productions via photographic and exhibition technology that fully delivers the production value that is presently being throttled down by 24 fps, 2K resolution identical to television, and low brightness 3D on small screens.

Trumbull Studios includes equipment provided by Christie, Dolby Laboratories, RealD, Eyeon, Stewart Filmscreen, Composite Components, Abel Cine, Vision Research, nVidia, 3Ality Technica, Codex, Motion Analysis, Virident, Limelight Productions, and many more. Facilities include a shooting stage, production offices, multiple workshops, screening rooms, editorial, compositing, and sound mixing.

UFOTOG was written and directed by Douglas Trumbull, produced by Julia Hobart Trumbull and Steve Roberts, executive producers Donald Rosenfeld and Andreas Roald, starring Ryan Winkles, director of photography Richard Sands, original music by Claes Nystrom, produced at Trumbull Studios, with special production services provided by Eyeon.

RELATED LINKS
http://www.douglastrumbull.com

Life After PI

“Life After Pi” is a short documentary about Rhythm & Hues Studios, the L.A.-based visual effects company that won an Academy Award for its groundbreaking work on “Life of Pi” – just two weeks after declaring bankruptcy. The film explores the rapidly changing forces impacting the global VFX community and the film industry as a whole.

Read more