Monday, April 8, 2013

Panelists Examine The Hobbit’s 48fps Journey


SMPTE President Wendy Aylsworth—who is SVP of Technology at Warner Bros. Technical Operations—kicked off a discussion of high frame rates by describing the effort to get theaters ready to project Peter Jackson’s The Hobbit: An Unexpected Journey at 48fps.
She related that when Jackson revealed his decision in April 2011 to make his Hobbit films at a high frame rate (HFR), the projection equipment didn’t exist to support the frame rate; software and hardware upgrades would be needed.
After much preparatory work, the months before the film’s debut involved a busy schedule of installations, testing, and certification of the 4,000 auditoriums that would be ready for the December release. Screen brightness in 3D ranged from 4.5 to 7 foot-lamberts (fL), with two screens at 10 fL.
Odeon & UCI Cinemas, Europe’s biggest cinema chain, converted 100 screens—including 45 in the UK and 36 in Spain—to accommodate HFR for The Hobbit in seven weeks.
“We did 3D theaters because it was the only way to monetize this,” explained EVP of digital development Drew Kaza. “Clearly there’s no way to go out and charge extra, so we only have the 3D premium. We branded HFRs with 3D.”
He reported that box office performance was “excellent,” but not HFR-driven. For instance, 57% of the box office for The Hobbit in the UK was 3D, but 70-85% of that came from IMAX or premium large-format auditoriums.
Joe Miraglia of ArcLight/Pacific Theatres similarly charged a premium for 3D, not for HFR. He reported that 49% of the chain's Hobbit gross came from 3D, and 73% of that was HFR.
Audience reactions to 48fps were mixed, Kaza said, though he believes this to be generational to some degree. Response from moviegoers under 25 was more favorable, while older audience members were generally not keen. “I think it was largely age driven,” Kaza said.
Emphasizing that he believes Jackson is right, Kaza asserted, “This is the beginning of a new format. That means committing fully and marketing it passionately.”

Jon Landau Highlights Technology Summit on Cinema


A conversation with Oscar-winning producer Jon Landau highlighted the NAB Technology Summit on Cinema, co-produced by SMPTE. The Sunday session was moderated by Variety's David S. Cohen.

During the discussion Landau covered a range of topics, including:

--The Avatar sequels will include underwater performance capture
“We want to take advantage of technology to make the next two movies even more emotionally engaging and visually tantalizing and to wrap up the story arc of our two main characters,” said Landau. “We have kept a team of digital artists on from Avatar in order to test how we can create performance capture underwater. We could simulate water [in computer graphics] but we can't simulate the actor's experience, so we are going to capture performance in a tank.”

--On High Frame Rates
“Nobody should dictate to a filmmaker whether they should make films at 24, 48 or 60fps since the technology now exists and can be presented with the same cinema equipment.”

--On the 3D market
“3D is evolutionary, not revolutionary, and it will take time to come to market. But look at Russia and China, where the 3D screens market is phenomenal. In emerging markets communities are going to theaters for the first time and are experiencing film in 3D – that's what they think of as a cinema experience. To show them a 2D presentation is a step back for them.”

--On 2D-to-3D conversion
“It will never be a comparable choice to native 3D shooting,” said Landau. “As good as conversion can get, it's two and three quarters 3D and never true 3D.”

--On Technology’s Impact on Storytelling
“Stories that could be conceived a few years ago could not be realized because the technology did not exist. Now, if someone can dream it, someone can find a way to realize it. But let's not lose sight of why people go to the movies. They don't watch for technology; they watch because they make an emotional connection to a story.”
He added: “We all go to the theater and suspend our disbelief even while we see live actors performing on stage. Perhaps the role of the director working at high frame rates and higher resolutions will have to evolve to be more like that of a stage director.”

SMPTE Launches Centennial Predict-the-Future Contest


SMPTE Executive Director Barbara Lange announced the launch of the Society’s Centennial Predict-the-Future Contest, Sunday at the NAB Technology Summit on Cinema, co-produced by SMPTE.
Also at the conference, SMPTE Engineering VP Pat Griffis of Dolby reported that a discount is available this week for SMPTE’s upcoming Entertainment Technology in the Internet Age conference. Details are available at the Society’s NAB booth.
For the Predict-the-Future Contest, participants’ projections regarding the state of television and cinema technology in 2016, the 100th anniversary of SMPTE’s founding, will be collected until Nov. 1, and the winners — identified by renowned analysts — will be recognized during the gala SMPTE Centennial Event in 2016.
“In 1916 C. Francis Jenkins filed the SMPTE founding papers, establishing our organization and its commitment to advancing theory and development in the motion-imaging field,” said Peter Ludé, past president of SMPTE. “Over the past century, SMPTE’s pioneering work has helped to guide the development and implementation of motion-imaging technologies and systems. The Centennial Predict-the-Future Contest celebrates this achievement by looking ahead to further advances in an exciting and dynamic industry.”
“Over the past 100 years, the hundreds of standards, recommended practices, and engineering guidelines developed by SMPTE have shaped all areas of motion imaging, from television production, filmmaking, digital cinema, and audio recording to information technology and medical imaging,” said Charles Jablonski, chair of the SMPTE Centennial Committee. “The Centennial Predict-the-Future Contest offers anyone the opportunity to express their expert opinions about the state of motion-imaging technology as the organization moves forward into its next century of innovation.”
Open to all interested participants, the Centennial Predict-the-Future Contest asks 15 questions about the future of television and cinema technology. The list of questions, compiled from SMPTE member suggestions, is available at www.smpte100.org. At the contest entry deadline on Nov. 1, the answers submitted will be sealed until 2016 when a select group of analysts and their firms will identify the correct answers to the contest questions and determine the winners. Prizes will be awarded to those who answer questions correctly.
The Society is currently planning other events to celebrate its 100th anniversary, including a gala banquet event. Stay tuned to the SMPTE website to learn of further developments.

Technology Summit on Cinema Discusses "Tidal Wave of Pixels"


“It will be as easy to do an 8K show a decade from now as it is to do 2K HD production now,” predicted Jim Houston, principal of Starwatcher Digital, though he made clear that getting to that point is predicated on sound workflow design.
The question of how best to handle the 'tidal wave of pixels' emerging from technologies capable of generating HFR, wider color gamuts, greater dynamic range, and higher resolutions is taxing the tech teams at post facilities, he said.
“From our perspective in the post world, we are not fearful of any new proposals - they can all be handled given enough time and money - although some are too expensive to be practical just yet.”
Managing greater volumes of data comes down to efficient workflow design, for which Houston outlined some principles.
“In essence, workflow is about how you move media around, to do so in the most cost effective way and to communicate every step clearly,” he said. “People get lost if you don't have a blueprint of it around.”
Footage is still a big cost, and discipline in shooting ratios would be a big help, he suggested. The cloud is not yet practical for post at these data levels, but it is a great information-management resource.
“Information should be freely available in the cloud but that doesn't happen today as often as it should,” he said.
Stressing the importance of storage, he said: “When you don't have enough media available, the long-term cost for a $100 million show will be much greater than the upfront cost of a $7,000 drive. Make sure you have multiple copies of media.”
As an indication of the volume of data facilities are being asked to tackle, dailies commonly run on the order of 2.5TB per day for a standard 2K show. By contrast, an 8K show at 120fps might average 187TB of new data per day.
“Consider that with a typical dailies workflow of 2.5TB a day you are looking at two hours of transfer time, in 4K it's eight hours, and in HFR 8K you could be looking at 16 hours of transfer per day of shooting,” he said. “So it's clear you are going to get backed up with work very quickly.
“If you look at how the data rates of network bandwidth and optical transmission are increasing, then to do an 8K show as easily as it is today to do a show in 2K or 4K, we are looking at on the order of 10-12 years.”
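As a rough sanity check on those numbers (an illustrative back-of-the-envelope sketch, not something presented in the session), transfer time is simply data volume divided by effective throughput. Houston's 2.5TB-in-two-hours figure implies an effective link of roughly 2.8 Gb/s:

    # Illustrative transfer-time arithmetic in Python. The link speeds are
    # assumptions inferred from the figures quoted above, not stated in the talk.
    def transfer_hours(volume_tb, link_gbps):
        """Hours to move volume_tb terabytes over a link of link_gbps gigabits/s."""
        bits = volume_tb * 1e12 * 8            # decimal terabytes -> bits
        return bits / (link_gbps * 1e9) / 3600

    print(transfer_hours(2.5, 2.8))    # 2K dailies: ~2.0 hours
    print(transfer_hours(10.0, 2.8))   # 4K at roughly 4x the data: ~7.9 hours
    print(transfer_hours(187.0, 100))  # HFR 8K dailies on a 100 Gb/s pipe: ~4.2 hours

On that arithmetic, clearing 187TB within a working day requires links well beyond what most facilities run today, which is consistent with Houston's 10-12 year horizon.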
FotoKem Chief Strategy Officer Mike Brodersen said the facility was currently handling a 4K multi-camera show and generating a volume of data “that is already becoming difficult”.
“Instead of a normal show average of 1-2TB a day, on some days we are managing 6-7TB, so how we would deal with 8K I just don't know. That said, we are testing 8K images from an F65 for one client just to see what we can achieve.”
With an 8K debayer pattern available from Sony in the summer, “there will be a director out there who will decide to use it,” noted Houston.
Brodersen went on to describe the set-to-near-location post workflow for Oz The Great and Powerful, a Red Epic project that averaged 2TB a day, with up to 11TB on days of heavy shooting.
“On-set media management became the launching point for everything we did in the remote post environment,” he said. “In some cases a single take would fill a DIT cart with media, so the speed at which the media needed to be ingested and verified in a second pass was vital.
“We treat digital media like original negative,” he emphasized. “It's checked and visually QCed. There is no room for error. Getting it in quickly and getting it right is a challenge.”
Real-world data wrangling of giant proportions was outlined in a case study by Peter Anderson, ASC, of Universal Studios’ King Kong 360 3D theme park attraction.
For this 91-second immersive ride, Weta Digital delivered four files of 8K x 1.5K, each of 5,559 DPX frames in full color space, totaling 22,236 8K frames.
The sequence was played out over 16 projectors on a 300-foot-wide screen viewed by guests on a tram just 35 feet away.
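The delivery figures hang together arithmetically (a quick illustrative check; the playout frame rate was not stated in the session and is inferred here):

    files = 4
    frames_per_file = 5559
    total_frames = files * frames_per_file    # 22,236 frames, matching the delivery
    fps = frames_per_file / 91                # ride duration of 91 seconds
    print(total_frames, round(fps, 1))        # 22236, ~61.1 fps per stream

In other words, each of the four 8K x 1.5K streams plays back at roughly 60 frames per second, i.e. HFR playout.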

Saturday, April 6, 2013

Sandrew Discusses the Good, the Bad and the Ugly of 3D Conversion


Conversion is maturing, stated Legend3D Founder Barry Sandrew, Saturday at the Technology Summit on Cinema. “It offers filmmakers a dynamic creative canvas for creating 3D when placed in the right hands.”
Sandrew spent time elaborating on some of the “common errors” in conversion when what he considered to be substandard skills or corner cutting were applied.
“We all know bad when we see it, but it's difficult to pinpoint what's wrong. Choosing a conversion house is a very critical thing and finding one that will do the work properly will make your production go a lot smoother. There are some basic rules to conversion and it's relatively easy to see if a house is violating them or doesn't understand them.”
To emphasize his point, Sandrew showed DCP examples of deliberately exaggerated, badly converted material to highlight problems such as incorrect eye-lines, occlusion artifacts, and incorrect grouping.
By comparison, he showed some of Legend3D's recent work. The 'ugly' in Sandrew's session was real-time conversion, where, he argued, the main problem is that no algorithm has yet managed to cater for the changing dynamics of depth within a live shot.
He also suggested that some people's lack of appreciation for 3D - or inability to judge conversion correctly - comes down to the frailties of human eyesight. Failure to correct nearsightedness so that both eyes are in balance, for example, would render any stereo experience inaccurate.
“You need to have two perfectly balanced eyes to accurately assess disparity,” he said. “The most important aspect to watching 3D is our own visual system yet oddly it's the last thing we think about calibrating.”  

SMPTE Examines Distributed Postproduction for Cinema


Postproduction workflows are evolving from a localized, tightly integrated and highly controlled process to one where production and post may occur collaboratively and with creative individuals scattered around the globe.
In the session “Distributed Post Production for Cinema: Technologies, Issues and Business Opportunities,” held Saturday at the Technology Summit on Cinema, Mark Lemmons, CTO of T3Media, encouraged delegates to 'think big' and recognize that cloud technology can benefit production today, from raw storage to transcoding.
“We need to recognize that some big challenges are being tackled right now in the cloud, including archive-integrated production,” he said. Lemmons highlighted work T3Media completed for Paramount Pictures in which film scans of 4K DPX files were placed in long-term storage in a cloud hosted by T3Media but protected behind the studio's firewall.
"The cloud can be used today for storage of high value and high resolution files," he said. "I would though encourage questions to be asked of cloud platform providers about the long term costs storage."
Gary Thomson, former VP of Engineering at EFILM and now CTO of Blendo, described how he oversaw the successful establishment of a secondary DI facility for Fox Studios, including waveform monitoring and color correction, 8 miles from its main Hollywood campus.
"What do filmmakers and creatives want when they work in diverse locations?" he asked rhetorically. "The main thing is that they want to finish an entire film with audio co-located with visuals. You can't practically move an audio facility but you can replicate a DI suite in a virtual environment fairly easily and inexpensively using dark fibre connectivity."
The ability to work internationally and remotely with very high-resolution media continues to be demonstrated in a series of projects at CineGrid, a community of networked collaborators that uses the 10Gb fiber network Global Lambda Integrated Facility (GLIF), originally designed for high-energy physics research.
Co-founder Laurin Herr explained that since 2005 CineGrid has tested point-to-point transmissions of HD and 4K, including 4K bidirectional live telepresence. It has also conducted a live shoot using a Dalsa 4K camera in Prague, with the footage debayered in San Francisco and sent back to Prague for grading while the colorist was located in Toronto.
Recent experiments have included remote collaboration with media at 4K 60p, real-time stereographic treatments, real-time film restoration, and live uncompressed 4K 10-bit streaming over IP between Prague and San Diego.
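For context on what an uncompressed 4K 10-bit stream asks of the network (an illustrative calculation; the session did not give resolution, color sampling, or frame rate, so DCI 4K, 4:4:4 RGB, and 24fps are assumptions here):

    width, height = 4096, 2160        # DCI 4K (assumed)
    bits_per_pixel = 10 * 3           # 10-bit samples, 4:4:4 RGB (assumed)
    fps = 24
    gbps = width * height * bits_per_pixel * fps / 1e9
    print(round(gbps, 1))             # ~6.4 Gb/s, within a single 10Gb GLIF lightpath
    # At 60p the same stream would need ~15.9 Gb/s, more than one 10Gb link.

On those assumptions, the 4K 60p experiments would need either compression or more than one lightpath.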
Herr showed footage of a demonstration held last December in which a director in San Diego directed an actor against green screen in Amsterdam, with live composites rendered in real time by a supercomputer at another site in Amsterdam, all in uncompressed HD over IP.
“One of the main learnings in all of this is the importance of the human networks that underpin the fiber networks,” said Herr. “Collaboration is a human activity; the social dynamics are very important to consider and not trivial to solve.”
In remote collaboration, he said, it's not enough to have face-to-face telepresence; it's also necessary to have additional channels for context, such as screens that make it clear both parties are looking at and describing the same thing.
“U.S. government departments have already begun rolling out 100Gb networks, and while there are still places where there are bottlenecks, such as in the last mile, the direction of travel is clear,” Herr stated. “Media students will have to learn how to work over these networks because that is the way the world will work in the future.”

SMPTE Eyeing Standards for Immersive Audio, HFRs

Emerging immersive audio is now firmly on SMPTE's agenda. In an overview of the activities being undertaken by the newly established Technology Committee on Cinema Sound Systems (25CSS), Brian Vessa, executive director of digital audio mastering at Sony Pictures Entertainment, declared audio to be 50 percent of the experience, “yet little attention has been paid to audio during digital projection rollout.”
During the NAB Technology Summit on Cinema, co-produced by SMPTE, he reported that SMPTE's 25CSS study group is setting out to address this. “Making a movie soundtrack sound consistent in multiple venues is a big challenge. One of the issues is that many sound systems in theatres are two decades old, system upgrades are rare and the acoustics are variable. The majority of cinema theatre calibration is done by technicians without a lot of training and the industry's current standards are also becoming dated.”
The study group is to lay the groundwork for a new audio standard that will more closely model how people actually hear, he explained.
It will need to be a standard that can be applied in a straightforward way in the field. A project to incorporate immersive audio content – such as Dolby's Atmos or Barco's Auro 3D – into the new standard will start this July. “The industry doesn't want a variety of ways to deliver immersive audio to cinemas,” he said.
John Hurst, CTO of CineCert, provided an overview of the standardization work ongoing at SMPTE's 21DC committee, including on higher-frame-rate DCPs, “where there are concrete moves to standardize HFR,” and the addition of stereoscopic rendering for subtitles in a draft document to be published in June.