Wednesday, December 5, 2012

SMPTE Member Eddie Veale Receives a Sound Fellowship from the Association of Professional Recording Services (APRS)


Pioneer of the first professional home recording studio, acoustic consultant and designer Eddie Veale has been recognized for his contribution to the world of audio, receiving a Sound Fellowship from the Association of Professional Recording Services (APRS).

Veale, principal of Stevenage-based Veale Associates, was presented with the prestigious award, the first to be granted to an acoustician, by music legend Sir George Martin at a November luncheon held to recognize outstanding contributions to the art, science, or business of recording.

In his own acceptance speech, fellow recipient Pete Townshend of The Who thanked Veale for the innovative approach to studio design that led to the creation of Townshend’s floating Dutch barge studio, built by Veale in the early ’80s and still in use today.

Veale’s career began back in 1960, working on noise control in passenger aircraft for de Havilland. A chance meeting with John Lennon resulted in Veale designing and building the country’s first professional home recording studio at Tittenhurst Park.

For Veale, who subsequently became an engineer on the iconic Imagine album, a new career unfolded. Today, he is acclaimed as one of the UK’s most influential studio designers, responsible for the private studios of George Harrison, Gus Dudgeon, Eric Clapton, and Mike Oldfield among others, and for studios at Sarm, The Mill, Roundhouse, Dean Street, Trident, Decca Records, GMG Radio, Virgin, Channel 4, and Carlton TV.

Other Veale firsts include:
  • Creation of a sweet spot the width of the console
  • High-level control room reference loudspeaker monitor system
  • Quadraphonic joystick pan pot
  • Technology for measuring room acoustics
  • First presenter-driven radio station
  • First Rock ‘n’ Roll film dubbing theatre, for Advision

Veale also introduced the Moog and ARP synthesizers to Europe, a move which undoubtedly revolutionised the sound of music.

Accepting his APRS Sound Fellowship, Veale said: “I am greatly honored to receive this award, particularly as it marks a first in the acoustic world. It is more than I could have imagined, when I set out in business all those years ago, to be recognised alongside such prestigious names as those already afforded the honour of the Fellowship. Many thanks.”

Thursday, October 25, 2012

Tech Committee Focused on Cinema Sound


Sony Pictures Entertainment’s Brian Vessa provided an overview of the work of SMPTE’s B-chain study group and urged participation in the resulting technical committee, TC-25CSS, which he said is the first SMPTE committee solely focused on cinema sound. The presentation was given 25 October during SMPTE’s conference and exhibition.
The current standards for calibration of the sound produced in cinemas are based on work started in the 1970s, employing acoustical real-time analysis of pink noise injected into the cinema sound system at the normalized output stages, termed the B-chain.
A B-chain study group was formed in March 2010 to examine these standards and make recommendations as they are impacted by current technology. Its findings suggest that performance in this area varies greatly and that overall quality has much to do with the skill of the technicians.
A SMPTE technical committee has since been formed, and proposed activities include formation of an ad hoc group to create recommended practices for the measurement and calibration of B-chain systems, standardization of a digital pink noise test signal, and a study of the B-chain of new immersive sound systems such as Dolby Atmos, Barco's Auro 3D, and Iosono.
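For readers curious what a digital pink noise test signal involves, here is a minimal sketch, not the committee’s actual work: it simply shapes white noise to a 1/f power spectrum in Python, with the sample rate, duration, and level chosen arbitrarily for illustration.

```python
# Illustrative sketch only: one way to synthesize a digital pink noise
# test signal by shaping white noise to a 1/f power spectrum.
# Sample rate, duration, and target level are assumptions, not values
# from any SMPTE recommended practice.
import numpy as np

def pink_noise(duration_s=10.0, fs=48000, rms_dbfs=-20.0, seed=0):
    n = int(duration_s * fs)
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)

    # Pink noise has power ~ 1/f, so scale the amplitude spectrum by 1/sqrt(f)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    freqs[0] = freqs[1]                 # avoid division by zero at DC
    spectrum /= np.sqrt(freqs)
    pink = np.fft.irfft(spectrum, n)

    # Normalize to a target RMS level in dBFS
    target_rms = 10 ** (rms_dbfs / 20.0)
    pink *= target_rms / np.sqrt(np.mean(pink ** 2))
    return pink

signal = pink_noise()
print(signal.shape, np.sqrt(np.mean(signal ** 2)))
```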
Vessa, who chaired the study group, urged involvement in TC-25CSS. “We have to get everyone in the audio community together if we are going to make a difference,” he said, adding that this includes the participation of mixers, sound designers, manufacturers, and exhibitors.
The group has already begun to liaise with the AES and other organizations.
The first official meeting will be held Dec. 6.

The picture tells the story at SMPTE?


On the last afternoon of the SMPTE Conference, an interesting insight into the practicality of UHD-1 (aka 4K or Ultra HD) came from a French consortium of companies, the ‘4ever’ project, which is investigating how practical it will be to broadcast the format on terrestrial television in France.

The case for broadcasting UHD-1 has, they suggested, two pillars: more information in the picture (the picture now tells more of the story) and more emotional involvement by viewers.

Their project uses the ‘Quad HD’ format (because it is the only form available), with an interesting new audio idea. They plan for 8 audio channels, used as an ‘adaptive’ system rather than assuming a fixed number of speakers around the room. The system can adjust itself to whatever speaker configuration the viewer has in the room (there is some similarity to the cinema Atmos system here); a toy sketch of the idea follows below.
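To make the ‘adaptive’ idea a little more concrete, here is a toy sketch, not the 4ever project’s actual renderer: it assumes eight nominal channel positions and simply routes each one to the nearest loudspeaker the viewer happens to have. The azimuth values and the nearest-speaker strategy are illustrative assumptions only.

```python
# Toy illustration only: how an "adaptive" renderer might map 8 nominal
# channel positions onto whatever loudspeakers the viewer actually has.
# Channel azimuths, speaker layouts, and the nearest-speaker strategy are
# all assumptions; the 4ever project's real renderer is not described here.
CHANNEL_AZIMUTHS = [0, 30, -30, 90, -90, 110, -110, 180]   # degrees, 8 feeds

def angular_distance(a, b):
    d = abs(a - b) % 360
    return min(d, 360 - d)

def adapt_to_layout(speaker_azimuths):
    """Return, for each source channel, the index of the closest speaker."""
    mapping = []
    for ch_az in CHANNEL_AZIMUTHS:
        closest = min(range(len(speaker_azimuths)),
                      key=lambda i: angular_distance(ch_az, speaker_azimuths[i]))
        mapping.append(closest)
    return mapping

# Hypothetical living rooms: a stereo pair versus a 5-speaker layout.
print(adapt_to_layout([30, -30]))                 # everything folds to L/R
print(adapt_to_layout([0, 30, -30, 110, -110]))   # surrounds get their own feeds
```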

They find that three cameras are available today, from Red, JVC, and Sony. None is perfect for day-to-day television production, but the JVC image seems likely to allow the greatest compression.

They have looked at the new compression technology, HEVC, which they characterized as offering a bit-rate gain of about 50% for roughly 10x the encoder complexity; decoder complexity increases by 2-3x.

The companies in the group argued in MPEG for the inclusion of 10 bits/sample in the first HEVC specification, to be issued next year, and they believe they have won the battle.

Their initial finding is that HEVC compression will put the capacity needed for UHD-1 well within the reach of digital terrestrial television, and that in France it will be possible to add new terrestrial services using DVB-T2, UHD-1, and HEVC in the coming years.
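As a rough back-of-envelope illustration of what the claimed 50% gain could mean for a DVB-T2 multiplex, with the reference bit rates below being the author’s assumptions rather than figures from the 4ever presentation:

```python
# Back-of-envelope only: the AVC reference bit rate and DVB-T2 mux capacity
# are assumed figures, not values quoted by the 4ever project.
assumed_avc_uhd1_mbps = 40.0          # hypothetical AVC bit rate for UHD-1
hevc_gain = 0.5                       # ~50% bit-rate saving claimed for HEVC
hevc_uhd1_mbps = assumed_avc_uhd1_mbps * (1 - hevc_gain)

dvb_t2_mux_mbps = 33.0                # rough useful capacity of one DVB-T2 multiplex (assumption)
print(f"UHD-1 with HEVC: ~{hevc_uhd1_mbps:.0f} Mbit/s; "
      f"fits in a ~{dvb_t2_mux_mbps:.0f} Mbit/s DVB-T2 mux: "
      f"{hevc_uhd1_mbps <= dvb_t2_mux_mbps}")
```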

Do you think they will be the first?
David Wood   

Pirates of the Olympics?



The Olympics had a dedicated session on the last day at the SMPTE conference.  This blogger reported on the 3D coverage in an earlier blog, but other interesting issues came up in the session.  

The practical details of the NBCU ‘multiscreen’ services were given. These are the services provided to mobiles, PCs, and tablets. The production and distribution facilities were massive, and the results for users were creative and elegant. For example, tablet users could watch the live feed on the left and more information (or ads) on the right. How much the environment has changed since the last Games because of the massive use of tablets today! The NBCU service was a real ‘tour de force’. But just for 17 days?

Even more novel information came from the presentation on the copy protection measures that NBCU took to protect their content from piracy during the games.  This information is difficult to find, so you should download the presentation if you want more. 

NBCU paid 1.18 billion dollars for the US rights to show the Games alone, so you can imagine they paid more than scant attention to protecting the content. In brief, they derive a digital ‘fingerprint’ for each short segment of the content: a digital word, computed from the segment, that is different from that of any other segment.

This is used to check whether anyone is illegally streaming their content, by deriving the fingerprint for any suspected content and comparing the two. If they match, they know they have a pirate and ask the portal to take down the segment. A team of ‘internet investigators’ monitors the web 24/7.
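A toy illustration of the matching step might look like the sketch below; the per-frame luminance signature stands in for whatever proprietary fingerprint NBCU actually computes, and the correlation threshold is an arbitrary assumption.

```python
# Toy illustration only: a naive per-segment "fingerprint" built from
# frame luminance averages, compared by normalized correlation.
# NBCU's real system is proprietary; the signature, names, and threshold
# here are assumptions for illustration.
import numpy as np

def segment_fingerprint(frames):
    """frames: iterable of 2-D arrays (grayscale frames of one short segment)."""
    return np.array([frame.mean() for frame in frames])

def is_match(fp_reference, fp_suspect, threshold=0.95):
    """Compare two equal-length fingerprints by normalized correlation."""
    a = (fp_reference - fp_reference.mean()) / (fp_reference.std() + 1e-9)
    b = (fp_suspect - fp_suspect.mean()) / (fp_suspect.std() + 1e-9)
    return float(np.mean(a * b)) >= threshold

# Hypothetical usage: fingerprint the broadcast segment once, then test any
# suspected stream segment against it (here, a near-copy with slight noise).
reference = segment_fingerprint(np.random.rand(250, 90, 160))
suspect = reference + np.random.normal(0, 0.001, reference.shape)
print(is_match(reference, suspect))   # True -> flag for takedown review
```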

I suppose this will be the ‘shape of things to come’ for much content in the years ahead.  Do you agree?

David Wood

Wednesday, October 24, 2012

A little bit more on clouds and stuff...


As the last blog explains, if you’ve looked at clouds from both sides now, you would have enjoyed the whole half day of presentations about them. There are cloud evangelists and sceptics. One of the main (and best) evangelists is Al Kovalick, who managed the session.

If you are doing nothing about clouds yet, it was easy to feel inadequate after, for example, hearing about a new Amazon service, ‘Glacier’, promising a level of content security beyond that achievable in your own building. Content security has been one of the main brakes on the wider use of clouds for professional content.

 Al held that everyone should be asking their suppliers when they will be offering a product that can run in the cloud, and how much the product has been tested in the cloud.  

From where your blogger sat, there were nevertheless muttering dissenters in the audience who remain cautious about whether the time of clouds has finally come.

Another ‘world changer’ idea came from Gary Demos today. His presentation was on the ‘unfolding merger of television and movie technology’. I think this amounted to the hypothesis that the television industry should change to the ACES system used by the movie industry, which would allow higher quality to be preserved in the production chain. In principle it’s true, but my guess is that it’s a little unlikely; for example, many broadcasters believe that future television programme production in a progressive-scanning world will move down from 4:2:2 to 4:2:0, rather than ‘up’ from 4:2:2 to the 4:4:4 used in movie production.

What do you think?

A Cloudspotter’s Guide To Migration


“This is big business, and if we think we are the tail wagging the dog, we are wrong,” Al Kovalick (Media Systems Consulting) said of the cloud.
He cited IDC research projecting that the web apps market will reach $40.5 billion by 2014, and that in 2012, 80% of all new applications would be designed to run in the cloud.
Asserted Kovalick: “We need to leverage it; we need to migrate to the cloud.”
Production collaboration represents one opportunity, as the cloud could provide applications including ingest, download, MAM, review and approval, storage/archive, editing, transcoding and rendering.
What is still missing is real-time control and the capability to support live events, “especially when it is 4K,” Kovalick said, adding that the low-hanging fruit includes HR services, billing, traffic/scheduling, and MAM.
“Just about anybody who makes a product could put it in the cloud [now],” Kovalick believes, though he pointed out that current business models encourage ownership rather than a pay-as-you-go model. “That is why you haven’t been seeing it. It’s too cheap, it’s too affordable. The manufacturers can’t survive (with this model). I think this will change, somewhat slowly.”

In close and personal – the 3D Olympics


3D is certainly alive!  There were many interesting presentations on 3DTV, but none more so than Jim DeFilippis’ presentation on 3D at the 2012 London Olympic Games. Just nine months before the Games were to begin, a decision was taken to go for extensive coverage in 3D: 18 different sports were slated, along with a TV channel dedicated to 3DTV. The companies OBS, NBCU, BBC, Sky Italia, Foxtel, and Panasonic got together and made the impossible happen.

Jim’s visual 3D material from the Games is eye-popping, and a must-see for broadcasters yet to be convinced about 3D. 3DTV is notorious for needing extreme care to match the left- and right-eye images perfectly, and for the care needed across scene cuts. Imagine having to provide the 3D channel, live, with however many different cameras from different companies. They did it.

The Games brought lots of experience about which types of sports events are ‘3D-friendly’ and which are less so. 3D works best when the main object in the scene is close to the camera: the effect is most striking and natural when the principal object is within a few metres of us, and the same applies to 3DTV. When you see the material it is clear that some sports, such as diving and gymnastics, are an absolute gift for 3D. Others, with teams and a larger canvas, are less effective in 3D.

Some interesting rules about 3D programme making also emerged from the 3D content: maximum disparity tolerances of -1% and +4%, with captions at +0.5% looking best (a quick worked conversion follows below).
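Assuming, as is common 3DTV practice though not stated in the presentation, that those percentages are fractions of picture width, they translate into pixels as follows for a 1920-pixel-wide image:

```python
# Quick conversion only: assumes the disparity percentages are fractions
# of picture width, which is common 3DTV practice but an assumption here.
picture_width_px = 1920

for label, pct in [("max negative (in front of screen)", -1.0),
                   ("max positive (behind screen)", 4.0),
                   ("captions", 0.5)]:
    print(f"{label}: {pct:+.1f}% -> {picture_width_px * pct / 100:+.0f} px")
# -1.0% -> -19 px, +4.0% -> +77 px, +0.5% -> +10 px on a 1920-wide image
```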

Overall, the advice in the title of this blog, to get the camera ‘in close and personal’, is probably the number one macro rule for making effective 3DTV.

FCC Update on IP-Delivery of Closed Captions


Earlier this year the FCC adopted rules for IP delivery of closed captions in video programming, and these regulations are starting to go into effect. On Wednesday at the SMPTE conference, the FCC’s Alison Greenwald Neplokh provided an overview of what content owners, distributors, and manufacturers need to know.
The basics state that the content owners must provide captions to the distributors, the distributors must pass through closed captions, program providers must ensure that the end user can view the captions, and the physical devices (including bundled software) must ensure that end users can view the captions.
The FCC adopted SMPTE Timed Text (SMPTE-TT) as a safe harbor.
Closed captions need to be a priority for manufacturers if they are updating devices with new features. For devices already on the market, waivers or exemptions are options if certain conditions exist. For instance, device makers can get an exemption if meeting the requirement proves technically infeasible (the device can’t be modified with "reasonable effort or expense") or if support is not achievable for screens smaller than 13 inches. Manufacturers could also get a waiver if the device is designed primarily for purposes other than receiving and playing back programming.
Some outstanding petitions could result in revisions to the FCC rules. For instance, the Consumer Electronics Association is asking that the FCC rules not cover removable media players, and Telecommunications for the Deaf and Hard of Hearing wants the regulations to include video clips (including news).

Tuesday, October 23, 2012

SMPTE Exhibition "Excellent As Always"


An encouraged AmberFin CTO Bruce Devlin took some time out from the SMPTE exhibition, which opened 23 October and will be open through Thursday, to talk about some of the progress he is seeing in the adoption of file-based workflows and the SMPTE standards that support them.
“SMPTE is so important in getting vendors’ [technologies] to talk to each other in a cooperative environment for the benefit of media companies,” Devlin said.
He was encouraged to hear SMPTE 436—which supports carriage of VBI and VANC in MXF—discussed on Tuesday in a conference session. And he noted that Wednesday’s presentation of SMPTE Timed Text Format (SMPTE-TT), a caption interchange standard, is another important topic on the agenda.
AmberFin is on the exhibition floor showing its iCR, which has implemented SMPTE 436 and SMPTE-TT.
Also exhibiting this week is Digital Rapids, which is on hand with Kayak, its new workflow platform, and StreamZ Live Broadcast, a live encoder developed to support both multi-screen streaming and broadcast operations. “Excellent as always,” said Digital Rapids product manager Darren Gallipeau of SMPTE’s exhibition. Like Devlin, Gallipeau said he looks forward to exploring file-based workflows this week.
The list of 77 exhibitors also includes Cinegy, which is on hand with the latest version of its Cinegy Multiviewer as well as the new Cinegy Player, a standalone tool for viewing broadcast files. At SMPTE, the company is distributing complimentary Cinegy Player licenses.
The exhibition will be open Wednesday from 10:30-18:30, and Thursday from 10:30-14:00.

“If you can’t find it, it’s not worth a damn thing”



Thus spoke Darryl Jefferson, responsible for the incredibly successful US Olympics coverage at NBCUniversal, at the SMPTE industry lunch on 23 October 2012. His reference was to the need for meticulous metadata labelling of all the material they had, something absolutely symptomatic of the new age of file-based programme production.

We didn’t really need the statistics from Darryl, because everyone knows that the Olympic Games in summer 2012 were the most successful television event in the history of mankind, but we got them anyway.

What can you say about 82 million television viewers (just in the US)?  Maybe even more phenomenal than the television audience, which we might expect to be high, was the huge US Internet audience.  There were 57 million unique visitors, and at times the Olympic streams were consuming 35-50% of the entire US internet bandwidth.    3000 highlight packages were produced.   

There were 2,500 staff working in London, with 200 cameras providing pictures in addition to those from the IOC’s OBS.

Part of the production success was the use of a media asset management system, which allowed the NBCU teams in New York, Florida (for the Spanish-language services), and elsewhere to find and call up items instantly from the available material.

Darryl claimed that the event showed ‘Television is not dead’. Maybe so, but it appears to have an ever-growing room-mate.

Any video content from the dawn of time at the touch of a button?



Anthony Wood, CEO of Roku, gave the opening keynote on the first day of the SMPTE conference (23 October 2012). Anthony is a successful and charming Silicon Valley entrepreneur, complete with the handyman uniform of sneakers, worn jeans, and suspenders (see photo). The name Roku comes from the Japanese word for ‘six’, by the way; it was Anthony’s sixth company.

His basic point was that the future of television will be via Internet streaming rather than other delivery means. This is the service that the Roku player, or set-top box, provides at an equipment retail cost of just 50 USD. It provides a ‘gateway’ to the services of content providers such as Netflix, and competes with other gateway boxes such as Apple TV.

The Roku service will also be available via a ‘dongle’ for insertion into the USB port of a TV set. This will be a disposable dongle, which can be updated with new and more sophisticated software as time goes by, without the viewer having to wait for his next TV set.

Roku offers access to a large number of services, which fall into popularity bands like a ‘long tail’. Top gun at the moment is Netflix.

Anthony was asked what broadcast stations should be doing to continue to be successful in a future age of Internet gateways like Roku. He suggested they should be working out how to gain the rights to include premium content in their Internet services, which should be offered via an Internet gateway as well as broadcast. It seems that the rights to much premium content today reside with the networks, rather than with the broadcasters who use network content for broadcasting.

So what is the ‘end point’ of the current trend?  For Anthony it will be a world where “any video content in history is available via Internet”.  

Do you think he is right?

David Wood.