Thursday, October 24, 2013

SMPTE 2013: Canon Previews Sophia Loren In 'The Human Voice'

During Thursday’s session on cinematography, Canon Senior Fellow Larry Thorpe previewed some clips from The Human Voice, a 30-minute short lensed with the Canon C500 in 2K 12-bit, using Cooke S4/i Primes and Canon Zooms.
Photographed by Oscar-nominated cinematographer Rodrigo Prieto, ASC, AMC, the film stars Sophia Loren and was directed by the actress’ son, Edoardo Ponti.
It was filmed last summer in Rome and Naples.
The footage was screened during Thorpe's presentation on lens considerations for digital cinematography.

Also Thursday afternoon, NHK’s Hiroshi Shimamoto reported that the company had developed a 120 fps 8K 12-bit CMOS image sensor, and recently built an 8K compact camera that uses this sensor.

Tonight, the conference will wrap with the Honors and Awards ceremony, hosted by David Wood, followed by the second annual SMPTE Jam.

I've looked at screens and clouds from both sides now....

The ‘Demo room’ was a great achievement for SMPTE 2013, and we may look back on it in twenty years’ time in the same way we now look back on the early-1980s SMPTE conference demos of digital TV and analogue HDTV. So, what was on show?

One thing was an appraisal of what you actually get for your money: a UHD-1 TV showing native UHD-1 content compared with the same content up-scaled from 1080p. There was a difference, but it was not huge.

Another thing was the impact of HEVC compression on image quality. Estimates of at least a two-to-one bit-rate saving compared to AVC seem to be well founded.

UHD-1 images with frame rates of 50 and 60 Hz were on show, and delegates could see the kind of quality that the new HDMI 2.0 will ‘let through’ to the display.

There were also demonstrations of live UHD-1 content.

For the morning of the last day of the Conference, I also attended the stream of sessions on ‘clouds’. At previous SMPTE conferences there has been an air of reluctance by the studios to use ‘clouds’ to store their most precious content, because they do not want even the remotest risk of harming it. This year, at least apparently, confidence in, and use of, clouds has grown. The use of ‘hybrid systems’ (where you use your own storage while you have some left, and turn to a cloud when you don’t) was a popular theme, and may represent the shape of things to come. Such things can be done automatically. Clouds may be a competitive market, so one of the skills broadcasters may need is to ‘shop around’ for cloud space.
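As a toy illustration of that automatic spill-over, here is a minimal sketch (nothing to do with any real vendor's API; the class and method names are invented for this example):

```python
class HybridStore:
    """Toy model of a hybrid storage policy: prefer local
    capacity, spill over to a pay-per-GB cloud when full."""

    def __init__(self, local_capacity_gb):
        self.local_capacity_gb = local_capacity_gb
        self.local_used_gb = 0
        self.cloud_used_gb = 0

    def store(self, asset_name, size_gb):
        # Use our own storage while we have some left...
        if self.local_used_gb + size_gb <= self.local_capacity_gb:
            self.local_used_gb += size_gb
            return (asset_name, "local")
        # ...and turn to a cloud automatically when we don't.
        self.cloud_used_gb += size_gb
        return (asset_name, "cloud")

store = HybridStore(local_capacity_gb=100)
print(store.store("dailies_day1", 60))   # fits locally
print(store.store("dailies_day2", 60))   # spills to the cloud
```

A real system would of course also handle retrieval, eviction and cost comparison between competing cloud providers.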

Deploying Video Platforms in the Cloud

Thursday’s program began with a session on deploying video platforms in the cloud, with speakers sharing their experiences and insights.

The messages included:

--Bhavik Vyes of Amazon Web Services detailed ways to build flexible and scalable media workflows in the cloud, using AWS as the platform.
This came with a reminder that the increasing size of data files will place stress on the workflow. “It’s getting harder,” he admitted, “especially with the arrival of 4K.”

--“Go cloud first; but not cloud everything,” urged Andrew Sinclair of News Corp.
He related that News Corp. has a “cloud first” policy. “We are looking for a SaaS platform,” he said, noting that it gives them flexibility since they don’t have to completely commit to it. “Beyond that we look for PaaS (Platform as a Service), down to IaaS (Infrastructure as a Service), and then physical infrastructure.”

--Robert Jenkins, CEO of Cloudsigma, discussed the notion of the “hybrid cloud," suggesting that companies would generally leverage their existing investment and combine it with a public cloud. “It's a project-based industry so pretty much every production tends to be unique,” he said. “We have to accept that this isn’t a cookie-cutter approach.”
He warned: “a lot of public cloud pricing is broken. … If you buy it for too long it’s too expensive. With the right price, the public cloud can be competitive.”

Wednesday, October 23, 2013

SMPTE Conference: Studying Acceptance of HFR 3D

Wolfgang Ruppel, professor at the RheinMain University of Applied Sciences in Germany, presented the results of a study that the university conducted on the subjective acceptance of HFRs for stereoscopic 3D.
This was achieved using James Cameron and Lightstorm Entertainment’s “medieval feast” themed test footage comparing 24, 48 and 60 fps; and Circus, an animated short produced at RheinMain University, also rendered in 24, 48, and 60fps.
Ruppel reported that the overall results found a preference for HFRs amongst the participants. More specifically, the findings suggest that the perceived difference between 48 fps/eye and 60 fps/eye varied with the speed of motion, and that 60 fps/eye made the most noticeable difference in mid- to fast-motion scenes.
A second test examined down-conversion from 48 fps/eye and 60 fps/eye to 24 fps/eye, compared with footage natively shot with 24 fps/eye.
The conclusion: downconversion appears to work well.
Have you seen these comparisons? What do you think?
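For what it’s worth, the basic frame-dropping idea behind such down-conversion can be sketched in a few lines (my own illustrative sketch; the study’s actual method was not described at this level of detail):

```python
def downconvert(frames, src_fps, dst_fps):
    """Naive frame-rate down-conversion by frame dropping: keep the
    source frame nearest each destination sample instant. 48->24 fps
    reduces to taking every other frame; 60->24 yields an uneven
    cadence, which is one reason native 24 fps can differ from a
    down-converted result."""
    step = src_fps / dst_fps
    n_out = int(len(frames) / step)
    return [frames[int(i * step)] for i in range(n_out)]

print(downconvert(list(range(12)), 48, 24))  # [0, 2, 4, 6, 8, 10]
print(downconvert(list(range(10)), 60, 24))  # [0, 2, 5, 7]
```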

SMPTE 2013 - A historic event for UHDTV

Such was the attraction of the special UHDTV exhibition, and the main exhibition area, that attendance in the main sessions was more modest today than yesterday. There was still interesting technology to hear about, though.

These included developments in the ATSC on a system for providing emergency warnings to users of the ATSC’s mobile system. If there is a hurricane coming, your mobile TV can switch itself on and give you all the details. It seems, though, that take-up by the industry has so far been modest. That’s the problem with things you only need very occasionally, but when you need them, you really need them.

There was also a presentation about technology to allow a broadcaster to increase the number of channels carried in a multiplex, which could silently and secretly be used for carrying content to a consumer or a daughter broadcaster. The idea is that you discreetly reduce the bit rate of the normal channels for a while, and slip in the extra channel or content. No new equipment is needed. Could it be that the technical quality we broadcast could be raised or lowered depending on the audience size, like adaptive internet streaming in reverse?
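To make the arithmetic concrete, here is a toy sketch of the bit-budget shuffle (the channel names and rates are invented, and real statistical multiplexers are far subtler than a proportional shave):

```python
def allocate_mux(total_kbps, channels, extra_kbps=0):
    """Toy multiplex budget: shave the regular channels
    proportionally to free up extra_kbps for an opportunistic
    extra service, keeping the overall multiplex rate constant.
    Illustrative arithmetic only."""
    regular_budget = total_kbps - extra_kbps
    base_total = sum(channels.values())
    scale = regular_budget / base_total
    plan = {name: round(rate * scale) for name, rate in channels.items()}
    if extra_kbps:
        plan["opportunistic"] = extra_kbps
    return plan

channels = {"news": 8000, "sport": 6000, "movies": 5400}  # kbps, made up
print(allocate_mux(19400, channels, extra_kbps=2000))
```

Every regular channel temporarily loses a little bit rate, the total stays fixed, and the ‘hidden’ service rides along in the freed-up capacity.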

For the UHDTV exhibition itself, there were many UHDTV monitors, comparisons of quality after compression, and more. Maybe SMPTE 2013 will go down in history as the moment the story of UHDTV finally broke?

SMPTE Exhibition Hall Is Now Open

The SMPTE conference's exhibition hall opened on Tuesday and was also the setting for the evening's opening reception.
Numerous stands have technology that underscores the themes of Monday’s symposium on next generation image formats.
Among them was a Cisco display presented during the reception, showing imagery using its 4K HEVC compression at 6 Mbps (roughly 1000:1 compression of the original file) with 4:2:0 chroma subsampling.
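A quick back-of-envelope check shows where a figure like 1000:1 comes from (the 8-bit and 60 fps parameters below are my assumptions, not figures quoted at the demo):

```python
# Back-of-envelope check of the ~1000:1 claim (assumed parameters:
# 3840x2160, 4:2:0 chroma subsampling, 8 bits/sample, 60 fps --
# these specifics were not quoted at the demo).
width, height = 3840, 2160
samples_per_pixel = 1.5      # 4:2:0: one luma sample + quarter-rate chroma
bits_per_sample = 8
fps = 60

uncompressed_bps = width * height * samples_per_pixel * bits_per_sample * fps
ratio = uncompressed_bps / 6_000_000     # versus the 6 Mbps HEVC stream
print(f"uncompressed: {uncompressed_bps/1e9:.2f} Gbps, ratio ~{ratio:.0f}:1")
# -> uncompressed: 5.97 Gbps, ratio ~995:1
```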
A sampling of the exhibitions:
At the Daystrom stand, Fusion-io is showing playback of 4K uncompressed imagery at 120fps on a Sony 84-inch 4K display, from an HP Z820 workstation equipped with four of its ioFX cards with 1.6 TB capacity.
Fusion-io’s director of visual computing Vincent Brisebois noted, “You can’t get that with SSD and RAID.”
The Fusion-io cards can be purchased standalone or bundled with certain models of HP’s workstations including the Z820.
NVIDIA is exhibiting a beta version of Red’s REDCINE-X Pro, which now has GPU acceleration from the NVIDIA Quadro K6000. At the stand, it is being used for 5K playback in real time.
Also on the floor, Snell is showing its flagship broadcast automation system Morpheus and Momentum, a workflow-automated media asset management and resource-planning tool.
Dolby meanwhile is demonstrating the new model of its Professional Reference Monitor, the lighter and slimmer PRM-4220;  and its glasses-free Dolby 3D system.
The exhibition hall will be open through Thursday.

Networked production – not quite utopia?

A lot of the focus on Day 2 was on the transition to IT-based/networked media, the subject of several sessions.
The trend (of course) for production centres is towards networks, where everyone in the production centre is interconnected. But is this quite the utopia it sounds? There will, we learned, still be headaches for the engineers who have to plan and install them.

The phrase "off the shelf" for Networked Media products needs qualification. You cannot achieve the performance we need simply by combining different systems and switches. When considering the data transfer capacity of the studio links, you need to factor in the issues of buffering and synchronisation. Also, compression systems are not ‘reversible’, so expect generation issues. In other words, there may be no such thing as ‘off the shelf IT equipment’ for us.

For those, like your blogger, who still love 3DTV, there was a small exhibit with an autostereoscopic display at the exhibition. Oh, how the mighty have fallen. The demonstrator told me that six views are created from two original views and displayed on adjacent strips on the display. The result is reasonable if not perfect. The stereoscopic phenomenon where, when you move your head, the objects in the scene appear to shift laterally is still there outside a modest viewing position. But, hey, we should be thankful for small steps forward?

Tuesday, October 22, 2013

Multi-View Production: New Ways To Capture Depth

Tuesday afternoon's SMPTE conference session on multi-view production included a preview of two developing camera systems that introduce new ways of capturing depth on set. Both were recently tested in the field.
Session chair Howard Lukk, Disney’s vp production technology, described the first as a “hybrid 3D” approach as it combines elements of both native 3D production and conversion. It's a prototype trifocal camera system that is effectively a rig holding one main camera (Alexa) and two small satellite cameras. This allows the filmmakers to photograph the imagery on set while at the same time generate depth information that can be used to create a stereo version in postproduction. This camera system was developed by Fraunhofer Heinrich Hertz Institute with Walt Disney Studios and ARRI.
The second, a “motion scene camera,” was introduced by ARRI’s principal engineer Dr. Johannes Steurer. This is a motion picture camera with a time-of-flight sensor for generating the depth information.
The day ended with the annual opening night reception in the exhibit hall.

Gewecke Keynote: Professionals Can "Redesign" Entertainment

“Over the next 5-10 years we have an opportunity to redesign what entertainment experiences will be … This is a time for us to create new projects and services.”
That was the message of Thomas Gewecke, chief digital officer and executive vp for strategy and business development at Warner Bros. Entertainment, who delivered a keynote Tuesday at the SMPTE conference.
One way that Warners is responding to change in the foreseeable future is by making its content as accessible as possible. Noting that Warners has more than 250 distribution partners in more than 100 countries, Gewecke said, “It needs to be in as many places as possible so consumers have a legitimate way to access it.”
Looking at additional ways to buy, Gewecke described the Cineplex Superticket program in Canada that now allows movie-goers to pre-order a digital copy of a film that they are seeing, while they are at the theater. “It’s helping them make the purchase at a time when they are most aware of the movie,” he said.
On potential new opportunities, Gewecke pointed out, “it’s difficult to reproduce the serendipity of walking around looking at things [in a store]. ... We are at the very beginning of reproducing that experience in the digital space. There are many technologies that could drive that.”
During his address, he also expressed interest in the potential of creating “communities of interest,” using as an example the recent Veronica Mars Kickstarter campaign (more than 91,000 people contributing to the project, which raised $5.7 million to fund a movie based on the TV series).
The takeaway: “It was about quantifying the number of fans. They created a community and effectively [learned the level of interest] before making the movie.”
Concluded Gewecke: “We think it’s critical to continue to assume that content is being changed by technology. We need a sense of urgency and it’s important that we act to drive innovation.”


MovieLabs, the grouping of Hollywood studios, revealed its own wish-list for UHD on the afternoon of the first-day symposium at SMPTE 2013. It’s quite a stack, and it seems to go beyond the features available in the UHD TVs in the shops now. The engineers in the audience were delighted when they heard of the high quality aspirations of the movie studios. A little later in the afternoon, another speaker practically got a standing ovation when he said ‘let’s forget 8 bit/sample’ (and go for higher bit depths). Maybe as a group of engineers, we are happiest when providing the highest image quality that technology permits. But is this a fault?

One of the points of ‘stress’ in the movie studios’ wish list is the suggestion that UHDTV movies shown on TV sets should use the same colorimetry that is used for making movies, termed the ‘XYZ’ system. Having the same system for making movies and for showing them on TVs would have benefits, and the XYZ system is very flexible. But for TV broadcasting the price may be higher bit rates without any actual quality gain for events like sports. Maybe tests made in the coming months will shed light on this?

Monday, October 21, 2013

4K/UHD TV - Will it be a Hit with Consumers?

“Aren’t we living in the most interesting time? I truly believe we are,” said SMPTE president Wendy Aylsworth on Monday as she welcomed attendees to the symposium on next-generation formats that kicked off the annual SMPTE Technical Conference and Exhibition. “Advances in wireless technology allow consumers to get content anywhere, anytime … Home technology is allowing HFR, HDR, UHD – and these can greatly increase the content quality.”
But do consumers want UHD TVs, which are currently priced anywhere from $700 to $40,000?
This is the question that Insight Media president Chris Chinnock asked the first panel of the business track, and this question was addressed throughout the day.
“Nobody has been making money for a while, but I see some reason to be optimistic,” said Peter Keith, vp and senior research analyst at Piper Jaffray. “When I see a product, I ask ‘is it bigger and does the picture look better?’ 4K checks both those boxes.”
But there are many considerations in this discussion, and while initiatives in areas such as satellite and OTT 4K services are in the works, a key challenge of course is getting 4K content to the home.
Steve Venuti of HDMI Licensing believes 4K upconversion features on new displays could help to jumpstart the market. Added Samsung marketing exec Dan Schinasi: “The reality is that most of what we watch will be upconverted for the foreseeable future.”
But on another panel, Bryan Burns, the former ESPN exec who is now president and CEO of Forward Direction Group, offered his view on the subject of upconversion, saying, “I hope we don't hear, ‘if there was only more content [4K would gain momentum].’ If there’s a chip that upconverts everything for you, it makes it harder for producers to invest in creating the content.”
Speakers also addressed the need for clear terminology with which to communicate the new capabilities to consumers. The Consumer Electronics Association uses Ultra HD as the industry term for 4K, but the ITU, for instance, has both 4K and 8K flavors. “CEA has come up with a definition for the U.S., but they don't follow it in Europe [and other international markets]. There’s still a way to go,” said Paul Gagnon, director of global TV research for NPD Display Search.
In conjunction with this program, a 4K/UHD demo room has been organized to showcase 4K/UHD in both professional and consumer markets. It will remain open through Oct. 23.

4K in Movies and TV—Where Does it Make Sense?

There are “no major barriers” to 4K production using currently available gear, asserted Larry Thorpe, senior fellow at Canon USA, who kicked off a Monday session exploring the potential of 4K for movies and TV.
Steve Weinstein, CTO of Deluxe Entertainment Services Group, said the issue surrounding 4K at his company is that “our storage has been blown out.
“This makes a distributed workflow much harder to do,” he said. “You end up working on proxies. We are looking at cloud transcoding to move this around. … The toolsets seem to be handling 4K reasonably well.”
He added that Deluxe is seeing increased labor in visual effects, requiring more “personnel and time.”
Offering a broadcaster’s perspective, Fox Sports vp of field operations Jerry Steinberg said the network currently has no plans to deliver 4K. “We spent millions going to HD and never got an extra dime from advertisers,” he said. “It seems today [4K broadcasting] is a monumental task with not a lot of return.”
But answering the question “where does it make sense?,” Steinberg related that Fox has been using Sony F65 4K cameras for HD sport coverage, for instance to extract a portion of a shot for a replay. “We were able to tell the story with clarity. This Super Bowl, I would probably have six 4K cameras.”


The first day of the SMPTE conference symposium split into two streams: one on UHDTV technology and the other on business aspects.

Your blogger joined the technology stream, where two issues predominated, both mostly concerning the lower UHDTV quality level, UHD-1 (otherwise known as 4K).

The first was about what ‘brightness’ should be assumed for UHDTV displays. Display brightness used to be measured in ‘candelas/sq metre’, but a new shorter term, the ‘nit’ (which, by the way, means ‘idiot’ in colloquial English), is coming into fashion. There were two schools of thought. One was that UHDTV should be more like the real brightness range we see in nature, with 10-20,000 nits. The other was that if we did that, when a TV show was set on the beach, viewers in their living rooms would need to wear sun-glasses. This is linked to the ‘opto-electronic transfer characteristic’ we need in the UHDTV standard. The debate will no doubt continue in the ITU later in the year. Let’s hope for compromise.
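For readers wondering how the transfer characteristic and the peak-brightness question interact, here is a deliberately simplified power-law example (the actual candidate UHDTV curves under debate are more sophisticated than a bare gamma, and the parameters here are only illustrative):

```python
def gamma_eotf(code, peak_nits, gamma=2.4):
    """Map a normalised code value (0..1) to display luminance in
    nits via a simple power-law EOTF. Illustrative only -- the real
    UHDTV transfer characteristics being debated are more complex."""
    return peak_nits * (code ** gamma)

# The same mid-grey code value lands on very different luminances
# depending on the display peak the standard assumes:
for peak in (100, 1000, 10000):      # SDR-ish through HDR-ish peaks
    print(peak, round(gamma_eotf(0.5, peak), 1))
```

This is why the peak-brightness assumption and the transfer characteristic have to be settled together: the curve only tells you relative luminance until a peak is fixed.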

The second was about the number of images per second needed (the frame rate). Though the UHDTV sets going on the market today accept frame rates of up to 60 frames/second (the rate used for HDTV today), evidence was presented showing that there would be a really dramatic improvement in image quality for sports if the frame rate were increased to 100 or 120 frames/second. This quality jump could even outshine the jump due to the increased resolution in going from HDTV to UHD-1. However true this is, it looks like we may have to wait some years before receiver decoders can cope with rates higher than 60 Hz.

The discussion showed that we may have some way to go before all the i’s are dotted for UHD-1.

Sunday, October 20, 2013

SMPTE 2013 Symposium - Better Pixels but no 3DTV.

I'm not quite sure what 'better pixels' means (and what, therefore, 'worse pixels' might be) but it is the talking point at the SMPTE 2013 Symposium here in Los Angeles. It seems to be a shorthand way of saying that more resolution alone may not be enough to sell UHDTV, and that more goodies are needed, such as higher dynamic range (giving the picture more sparkle) and higher frame rates (making the linebackers sharper when they run). Looking at the programme, it seems it will be impossible not to emerge from the Symposium an expert in UHDTV. Some may see UHDTV as the great meeting place where movie production and TV production meet.....all will be revealed in the next few days. By the way, I've sold my 3DTV glasses, because it doesn't look like they will be needed.