Thursday, October 27, 2011

SMPTE Members Honored


SMPTE Fellow Andy Setos received the Charles F. Jenkins Lifetime Achievement Award, Wednesday night during the Academy of Television Arts & Sciences’ 2011 Engineering Awards.
He received a standing ovation as he accepted the award, appropriately named after one of SMPTE’s founders. Setos thanked the Academy, its Engineering Committee, friends and colleagues.
He summed up: “Recently I had a couple of reporters ask me, of all the things I did, which one was my favorite. Was it doing MTV? Giving reality to David Hill’s desire to put the score on the screen? Was it HDTV, or Blu-ray, or Digital Cinema? No, none of those. My most favorite are the things that I will be doing next.”

This week during SMPTE, Setos shared his views on several conference topics:

“Doing live sports with 3D is awfully challenging. I think a lot of good work has been done. But it is very hard; I don’t think anybody has really got the magic formula that is going to make it universal, yet.”

“I think from a home standpoint, until 3D becomes glasses free—which is technically practical and there is work going on, but it has not matured yet to be a product for group viewing—there is going to be a limit to how much (3D) content people can watch.
"Once we get past glasses, I think things will get more prolific.”

"I think second screen is going to become very interesting. We tried to do this a long time ago, the challenge was inventing the second screen. … But now, second screens are practical. And that is an opportunity. The Internet can do some things really well, but it can’t do real time, high value content—not when it’s popular. It was never intended for that.
“The strength of Internet protocol is breadth and choice. The strength of reserved-bandwidth systems is that they can provide rich, high value imagery and sound to as many people as are out there connected.”


Thursday evening, SMPTE will present its Honors and Awards ceremony.
Dr. Edwin Catmull will receive the SMPTE Progress Medal, Graeme Ferguson and Roman Kroitor will be awarded the John Grierson International Gold Medal, Joshua Pines will receive the Technicolor/Herbert T. Kalmus Medal, Max Bell will accept the Samuel L. Warner Memorial Medal, Bruce Devlin will be awarded the David Sarnoff Medal, Linda J. Brown will receive the Kodak Educational Award, and Douglas Trumbull will accept the SMPTE Presidential Proclamation.
The SMPTE Journal Award will be presented to Regunathan Radhakrishnan and Kent B. Terry for their article, "Detection and Correction of Lip-Sync Errors Using Audio and Video Fingerprints.” The SMPTE Journal Certificate of Merit will be awarded to Sean McCarthy for the article, "A Biological Framework for Perceptual Video Processing and Compression” and to David Wood for the article "Understanding Stereoscopic Television and Its Challenges.”
David Horowitz will receive a Citation for Outstanding Service to the Society, and Michael A. Dolan has earned the Excellence in Standards Award.
Catmull, Todd Brunhoff, Gary Demos, Michael Karagosian and Paul Michael Stechly were named SMPTE Fellows.
Alexander Michael Pagliaro, a senior at Rochester Institute of Technology, was awarded the Louis Wolf Memorial Scholarship.

Wednesday, October 26, 2011

Sen. Dodd Urges Industry to Address Content Theft

Sen. Chris Dodd, chairman and CEO of the Motion Picture Association of America, encouraged Hollywood and Silicon Valley to come together—and urged both communities to join him in the fight against content theft, Wednesday at the SMPTE Conference in Hollywood.
“I will fight as hard as I possibly can to see that we bring an end to content theft,” asserted Sen. Dodd, who started his work at the MPAA roughly eight months ago. “It must be a collective effort.
“Content theft is stealing,” he said. “It is not a victimless crime; 373,000 people have lost their jobs in this country because of content theft. Some $16 billion in earnings are gone; $3 billion in revenue for taxes are gone because of content theft.”
Sen. Dodd pointed to search engines as one area of this problem that needs to be addressed. “When you knowingly provide a venue to an audience that wants to see (the content), you are an accessory to the crime. There is nothing neutral about providing a venue that tells someone how they can steal something.”
As to uniting stakeholders, he said: “Hollywood and Silicon Valley mutually benefit from their partnership. One of my top priorities at the MPAA will be to grow and strengthen that partnership. We cannot survive without each other. But together we can give our mutual customers experiences that defy imagination.”

Tuesday, October 25, 2011

SMPTE Keynote: Scott Ross Talks VFX; 2D-to-3D Conversion


SMPTE opened Tuesday with a keynote address from industry veteran Scott Ross, whose topics ranged from the VFX business to 2D-to-3D conversion to standards.
Ross is a co-founder and former CEO of Digital Domain, whose credits include Titanic. Earlier, he served as the general manager of George Lucas’ Industrial Light & Magic and as senior VP of LucasArts. He currently works as a professional consultant and producer and serves on the boards of multiple technology and entertainment companies.
Among his points:

On the VFX Industry:
Tackling the challenging business climate, Ross looked back and traced the rise—and many falls—of VFX companies.
“Frankly, it is very difficult running these businesses,” he said, citing uncertain workflows, difficulty with pricing and competition driven by government subsidies.
He added: “Studios continue to ask for lower prices, higher quality, and more work in shorter periods of time.”
Ross suggested as a strategy that facilities move from service providers toward content producers. He commented that factors such as broadband could lead to the notion of “global content producers.”

On 2D-to-3D conversion:
“I believe 2D-to-3D conversion is analogous to the VFX industry,” Ross said. “We are now seeing the first generation of conversion companies, but it takes a lot of people—yes, tools are important, but it takes a lot of people to do that.
“It is going to be much too expensive to have that work done in the US. It’s a service, it will go where (studios can get) high quality and lower costs. We are naïve to think that will come only from the US.”

On Standards:
“There are also no standards (in 2D-to-3D conversion). There are tools, just as in the VFX business, being developed that could talk to each other. Those organizations need to look to (SMPTE) to set those standards.”

Monday, October 24, 2011

IIF ACES Addressed At SMPTE Symposium


AMPAS’ SciTech Council co-chair Ray Feeney urged the industry to do some internal testing of the developing IIF (Image Interchange Format) ACES (Academy Color Encoding Specification) “before it becomes a project requirement.”
His remark came Monday at the SMPTE Symposium, dubbed “The Large-Sensor Imaging Revolution,” which was developed in collaboration with the American Society of Cinematographers.
IIF ACES was discussed as part of the conversation about workflow issues.
“The system has been proven to work," Feeney said of IIF ACES, which addresses workflow with emphasis on color consistency. "Some pieces still need refinement. We are making some very significant progress.”
He pointed out that with the IIF ACES pipeline, there are many pieces that need to be addressed in the workflow, “not just the major pieces. There are clean-up tools, restoration tools, conforming tools—and every facility has a different setup.”
“Looking at ACES as an archival format is a very good thing,” added speaker and moderator Michael Most of Level 3 Post. “It is a documented, open format. … It is about eliminating restrictions.”
Kicking off the day, SMPTE Executive Director Barbara Lange welcomed attendees to the symposium and shared the news that the Society has won two 2011 Technology & Engineering Emmy Awards from the National Academy of Television Arts and Sciences.
One Emmy recognizes SMPTE’s work in local cable ad-insertion technologies that now help broadcast facilities – particularly cable head ends and unattended stations – to switch as easily between digital programming and advertising as they did with similar materials in the analog domain. The second award will be presented to the Society for its end-to-end system for describing a program’s aspect ratio and to allow users to control the ratio displayed.
The SMPTE conference officially kicks off at 9:30am on Tuesday at the Renaissance Hollywood Hotel with opening remarks followed by a keynote address from visual effects veteran Scott Ross, co-founder and former CEO of Digital Domain.

Sunday, April 10, 2011

Trumbull Tackles Higher Frame Rates

Continuing James Cameron’s dialog about higher frame rates, Douglas Trumbull, the inventor of ShowScan, delivered a keynote at the DCS during which he suggested that when it comes to frame rates, it is not “one size fits all.”

“It’s not just about higher frame rates,” he said of creative filmmaking. He suggested that filmmakers use what is appropriate to a given movie. “There is no one size fits all.”

But like Cameron, he emphasized the benefits of higher frame rates include reduction of strobing and sharper images.

He reported that he plans to helm a 3D feature that would be shot using his ShowScan Digital system that uses 24 fps but allows embedded 60 frames per second sequences. He has applied for a patent on the system.

To test this process, he recently shot a music video in New York, using Vision Research’s Phantom 65 with Zepar 3D lens system, shooting 120 fps. SpeedeGrade was used for color grading.

It has not yet gone through full postproduction, but the intent is to test a new workflow, developed with suppliers including Vision Research, maker of the Phantom; and equipment supplier AbelCine.

During his address, he asserted that brightness remains a key issue that needs to be addressed in 3D.

Footage from the music video--Dana Fuchs’ Golden Eyes--will be displayed at the AbelCine booth during NAB.

First Images From Sony Prototype 4K Camera Screened at DCS

The Arrival, a short lensed with a prototype of Sony’s new digital cinematography camera that is capable of recording 16-bit 4K and higher, got a lot of interest when it was screened during the second day of the DCS.

During a session on “Advances in Image and Sound: More Pixels,” ASC Technology Committee chair Curtis Clark, who wrote, directed and shot the short, spoke not just about the camera, but the workflow.

The short was lensed outdoors and in the Bradbury building in downtown Los Angeles. "The resolution and detail is just remarkable," Clark said of the camera's capability. "The detail draws you into the image. You can comfortably let your images lead the narrative."

“It is critically important to extract the best possible images from these new 4K pipelines,” Clark said. “If we do something to degrade the image, then what you think is 4K may not be 4K. … It needs to be 4K end to end, and that needs to include color management that is consistent and reliable.”

ASC president Michael Goi shared some concern about today’s generation of 2K archival material. “Some consider 2K as their master archival element,” he explained. “Going forward, I think in less than 5 years, those 2K masters are going to look very soft.”

He concluded: “We want the best presentation and the best tools. Aesthetically, we need to approach these technologies in a way that gives us that toolbox. That is something the ASC is concerned with.”

DCS Explores IIF and Greater Bit Depth

Bill Bennett, ASC, opened a morning session during the DCS titled “Better Pixels from Greater Bit Depth,” with a series of beautiful images he shot in 12-bit with the ARRI Alexa, including imagery shot directly into the sun over the ocean.

During the session Ray Feeney, co-chair of AMPAS’ SciTech Committee, discussed the emerging IIF (Image Interchange Framework) aimed at seamless image interchange.

He suggested that today’s image interchange is “totally insufficient,” noting that there are new cameras, new display devices, and more and varied distribution formats. “It is about any format, anywhere, anytime,” he said. “We are reformatting for a wider variety of distribution mechanisms than ever before. Each has a different sweet spot.

“Existing standards are outdated,” he asserted.

He suggested that if a change is to be made, that it be to 16-bit with its ability to increase dynamic range and color gamut. IIF enables workflows including 16-bit 4K.

For those in attendance at NAB, Feeney announced a few IIF related presentations scheduled during the week. These include:

--TV series “Justified” Monday, 2pm, Room N115.

--Sony’s new 4K camera imagery through the IIF pipeline. Monday, Tuesday and Wednesday, 11 am, Sony booth.

Saturday, April 9, 2011

Streather Emphasizes Need for 3D Training Courses

“I find it astonishing that there is no permanent 3D training course at any film school in the UK,” said Phil Streather, CEO of UK production company Principal Large Format, Saturday during the DCS.

Streather is one of the UK’s leading stereo 3D educators, having worked with Sky and training body Skillset to arrange the country’s first 3D primer for cinematographers and producers.

“We do need to find some accessible and permanent training solutions because there are still far too few production and craft personnel who understand the principles,” he said.

Presenting a case study of one of his most recent projects, the National Geographic documentary Meerkats 3D, produced by Oxford Scientific Films in association with PLF, Streather suggested that the best 3D created volume and that the best way to achieve that was to get as physically close to the subject as possible.

Streather added that the only way to get physically close to subjects like insects and small mammals was to go in close with a wide angle lens.

“In the same way that you collapse depth using zooms in 2D so you collapse depth using zooms in 3D,” he said. “You may have objects separated in background and foreground but they will look like cardboard. It’s also important to use motion parallax to create depth cues for objects moving in foreground and background of the frame.”

Streather added: “For me the idea of 3D being a window into another world is wrong. I don’t want a window or things bumping up against a screen plane. I want objects to pass in front, pass through and live behind the frame effortlessly.”

Sky Accelerates 3D Program Production

British pay-TV satellite broadcaster BSkyB launched the world’s first 3DTV channel last October. Already Sky 3D has amassed 70,000 subscriptions, half of the estimated total number of 3D-ready TVs installed in UK homes.

The man responsible for driving this technical innovation on behalf of Sky is Chris Johns, the broadcaster’s chief engineer who was first alerted to the potential of 3DTV delivery over satellite in demonstrations at NAB three years ago.

“We are now looking to bridge the content gap by accelerating 3D program production,” he told the audience at the DCS. However such acceleration should not lead to corner cutting on the creation of a high quality and comfortable 3D viewing experience, Johns said. “If you deliver cut price 3D people won’t buy into it.”

No matter the genre (Sky has experimented with most of them, illustrating the point with clips ranging from live sports to documentary programming, live opera, dance and studio entertainment shows), there is a core set of lessons that Sky wants to educate the market about.

“A producer’s first project in 3D turns out, in most cases, to be theme park 3D which emphasises the wow factor but is actually very demanding on the eyes,” said Johns. “At Sky we want viewers to be able to watch 3D – sports or opera for example - for 2-3 hours or more at a time. That means creating the 3D in a natural way – retaining the theme park effects but limiting them for special occasions and for certain effects.”

Sky ensures that everyone who creates content for its channel adheres to a set of guidelines, available on its website. These are not specifications, stressed Johns, in as much as they are not defined rules, but guidelines to create a foundation for everyone to work from.

These include keeping the depth budget to 3% into the screen and 1% out of the screen and minimising the amount of 2D-3D conversion.
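For anyone wanting to apply those guidelines in practice, the depth budget translates directly into pixel disparity limits once a horizontal resolution is fixed. A minimal sketch, assuming (as the guidelines are usually read) that the percentages refer to screen width:

```python
# Hedged sketch: convert Sky's stated depth budget (3% positive
# parallax into the screen, 1% negative parallax out of it) into
# pixel disparity limits for a given horizontal resolution.
# The percentage-of-screen-width interpretation is an assumption here.

def depth_budget_pixels(h_res, positive_pct=3.0, negative_pct=1.0):
    """Return (max_behind_screen, max_in_front) disparity in pixels."""
    return (h_res * positive_pct / 100.0, h_res * negative_pct / 100.0)

pos, neg = depth_budget_pixels(1920)
print(f"1920-wide frame: up to {pos:.0f} px behind the screen, "
      f"{neg:.0f} px in front")
```

For a 1080-line broadcast frame that works out to roughly 58 pixels of positive and 19 pixels of negative parallax, which is why the limits feel conservative compared with theme-park 3D.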

“We don’t advocate conversion,” he said. “It can be done but only at great expense and not on a $10k box in realtime. You need to have editorial control and you have to do it with care or don’t do it at all.”

It’s perfectly practical to use a wide variety of rigs and camera equipment provided the tech choice is tuned to the particular project. For live sports, where there is no second chance, Sky uses 3Ality rigs, which Johns acknowledged are expensive but provide the production team with full control over all the lens and rig parameters.

“Yet using manual convergence on camera set-ups is fine if, for example, you are shooting a drama,” he said. “Integrated cameras can also produce great results - if used correctly. The important thing is to understand what technology you are using and what it can achieve. The same goes for postproduction. Understand what transfer methods there are for getting material out of a digital camera all the way through the chain before you start. Don’t short change yourself on preparing the workflow because it will mean additional cost in post when you find you cannot correct misalignments.”

He advised covering any 3D material in 5.1 multichannel because the sound adds to the 3D effect by positioning objects in the picture, if done correctly, and even helps the viewer predict the direction from which things come out of the frame: “That applies not just to sport but across all genres,” he said.

Sky is investing heavily in commissioning 3D content and hopes its initiative, along with that of 3net, will push the industry forward and create a virtuous circle; “So that when a viewer turns on the TV they have a choice to watch in SD, HD or 3D,” said Johns. “When their first choice becomes 3D then we will have achieved our aims.”

2011 DCS Opens in Las Vegas

SMPTE president and Sony exec Peter Lude opened the DCS 2011, “Advances in image and sound: 3D, 4K and Beyond,” explaining “we are trying to extend this beyond cinema, though we recognize that many innovations start with cinema.”

He cited a number of key topics as technology continues to march forward, including higher frame rates, higher spatial resolution, high dynamic range, expanded color gamut by display technologies, stereoscopic 3D, and volumetric 3D.

Lude announced that in June, SMPTE will publish a book titled “3D Cinema and Television Technology: The First 100 Years,” with over 400 pages and more than 33 papers. Pre-orders will be accepted at the SMPTE booth during NAB.

Barbara Lange, executive director of SMPTE, discussed some additional Society news. For one, SMPTE will launch a new conference, a global summit on emerging media technologies, which will be held in Geneva during 2012. EBU’s David Wood will chair the event.

A digital library will be introduced by SMPTE, enabling the community to learn by accessing information from the desktop. A new SMPTE web site is also launching.

A new SMPTE professional report series is on the way, with the first publications slated for availability in June.

Disney’s vp production technology Howard Lukk will chair SMPTE’s second annual international conference on stereoscopic 3D for media and entertainment, June 21-22 in New York.

Upcoming regional seminars on file-based workflows are scheduled to take place in Montreal, Toronto and Atlanta.

Discounts for many of these events and opportunities are available at the SMPTE booth in the South Hall, during the NAB Show.

During welcome remarks, Andrew Stucker of sponsor Sony told attendees: “It is well known in the industry that there are a few places where smart people openly exchange ideas,” citing the DCS, as well as the HPA Tech Retreat and ASC Technology Committee meetings. “You are in for a real treat.”

He completed his remarks by recognizing colleagues and friends from Japan and their “determination in the face of insurmountable disaster.”

Monday, February 7, 2011

Multi-camera HD mobile capture system

My last big project here at the UK's University of Surrey was fitting out a bluescreen motion-capture studio with eight Thomson Viper cameras, together with timecode and audio. Dell Poweredge boxes fitted with DVS cards give us about 40TB of storage.
This was back in 2004 and the range of hardware and cabling for HDTV was nothing like as prevalent as it is today.
Time moves on - now there is a requirement for a mobile version of our capture equipment for use on film/SFX sets and small sporting venues. This will consist of two suitably equipped 19" rack flight cases, eight Vipers in single/dual/Filmstream mode. Low-end Canon prosumer camcorders will also be available for less rigorous work. The disk-recorders should be able to store dual-stream recording for at least 80 minutes.
I didn't mention - due to the analytical nature of our work, we cannot use any compression!
So at this stage we've chosen the capture-record machines and I'm now accumulating a small pile of gear such as the SPG, Timecode generator, Rasterizer, DAs, viewing matrix, etc., ready for installation.
At the moment I'm expecting capture-box delivery Nov/Dec time; then I can get on with the project's design and installation.

5th October 2009
Happened upon The Battle of River Plate (1956) on TV this afternoon. First time it's been shown in HD here in the UK. The original VistaVision sparkles through with nice saturated colours from the Technicolor print. Well done Channel Four!

15th October 2009
Just ordered 200 BNC connectors and 100m of cable - oh I'm going to be busy when these capture units finally turn up!
Started wiring diagrams using Word's drawing function, not ideal but it worked for my previous studio project.
I shall use the primitives (U-links, DAs etc.) from these old drawings as well.
Should really use Autocad or similar but don't have time to learn it!
TVLogic monitor arrived but must find a suitable flightcase for it. Pictures look OK for confidence use but I still prefer CRTs.

12th November 2009
Capture boxes delivery now late. Manufacturer can't get the solid-state storage to work reliably on very long clips. They suggest we now utilise conventional disk-drives.

23rd November 2009
Just completed the eighth and final conversion of our Viper cameras to external synchs. for the above project.
The Viper HDSDI back normally only accepts external refs. via the multiway connector.
Using the thick multicore cables in the field would have been expensive and cumbersome, therefore I decided to convert each camera to accept synchs. via a separate BNC connector.
I did not want to drill or modify the case, so a small ABS box was bolted underneath the HDSDI back, just behind the shoulder pad.
Wiring involved virtually dismantling the HDSDI back to access the rear of the multiway socket.
It is physically possible to break into the circuit at a more convenient point, but that would bypass the common-mode/EMC input filters.

24th November 2009
It's annoying when you try to do the best thing technically - and fail!
Last week I needed to temporarily split a short feed of 800mV HDSDI to two separate monitors. I couldn't justify a $600 DA so decided to build a small 3 x 27 ohm star-splitter using copper board and the smallest resistors I could find.
Did it work? - no.
Next I tried using a 'goalpost' type BNC T-piece for the two-way split - this of course results in a double termination and a return loss nightmare.
Again, did it work? - YES!
Final result:- Botch 1 : The Better Way 0
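For the record, the textbook numbers behind that splitter: a symmetric matched resistive star with N ports (input included) in a Z0 system wants a series resistor of Z0 * (N - 2) / N per leg, which for 75-ohm HDSDI and three ports is 25 ohms, so 27 ohms was simply the nearest common value; the price is about 6 dB between any two ports. A quick sketch of the arithmetic:

```python
import math

def star_splitter(z0, n_ports):
    """Series resistor per leg and port-to-port loss (dB) for a matched
    symmetric resistive star splitter; n_ports includes the input."""
    r = z0 * (n_ports - 2) / n_ports
    loss_db = 20 * math.log10(n_ports - 1)
    return r, loss_db

r, loss = star_splitter(75, 3)   # 75-ohm HDSDI, one input, two outputs
print(f"ideal leg resistor {r:.0f} ohms, {loss:.1f} dB per path")
# 25 ohms ideal; 27 ohms is the nearest standard E12 value
```

That 6 dB of loss on an 800 mV signal is a plausible reason the "correct" splitter failed where the double-terminated T-piece, ugly as it is, still delivered enough amplitude for the monitors' equalisers to lock.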

22nd December 2009
Glory be - the four DVS ProntoXways (http://www.dvs.de/products/video-systems/pronto-family.html) arrived last week. They are mounted in two 16U shockproof KALMS flight-cases and weigh approx. 80 kg/177 pounds each* (this is a transportable, not portable, system!).
I will have to re-design some parts of the project as there is less space than envisaged; also the racks are powered individually by 16A cords. As each rack, when completed, will additionally be interconnected with data/audio/synch./HDSDI cables etc., I didn't want to create a potential earth-loop. As planned, rack #2's mains-power will be sourced from rack #1, its earth coming from a star-point located within the latter.
Christmas is nearly upon us now so I won't have a chance to progress until early next month, although I am currently sitting here re-drawing my plans! Season's greetings!
* this will of course increase when I've fitted the rest of the equipment.

26th December 2009
Just thinking of other uses for this system:- stereoscopic x 4 channels, digital Cinerama?

17th February 2010
Happy New Year! (Indulge me - my tardiness is due to a busy last few weeks!)
Xways are now being tested for reliable storage over several hours and, most importantly for us, no dropped frames. Each machine has 2 independent channels, each with its own Windows GUI running the DVS ClipStore software. There are 4 machines and therefore 8 desktops, one controlling each channel. To keep the amount of hardware down there are 2 KVM switches feeding 2 Lindy monitor/KB/mouse terminals.
Setting up each machine takes a while due to having to keep switching desktops. A source of potential confusion is that you have to set each channel to its own storage disk, e.g. Channel 1 to V: and Channel 2 to W:
It is possible to erroneously set both channels to the same disk - this causes frames to drop eventually as the drives get more full.
Obviously I am keen to get some kind of script running to preconfigure each machine and minimise operator error!
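A minimal sketch of the kind of sanity check such a script could run: given each channel's configured storage disk, refuse to proceed if two channels share one. The mapping format below is hypothetical; reading the real DVS ClipStore configuration would need its own glue code.

```python
# Hypothetical channel-to-disk mapping check. The dict format is an
# assumption for illustration, not the actual ClipStore config format.

def check_unique_disks(channel_disks):
    """channel_disks: dict like {'Channel 1': 'V:', 'Channel 2': 'W:'}.
    Returns the list of disks assigned to more than one channel."""
    seen = {}
    for channel, disk in channel_disks.items():
        seen.setdefault(disk, []).append(channel)
    return [disk for disk, chans in seen.items() if len(chans) > 1]

clashes = check_unique_disks({'Channel 1': 'V:', 'Channel 2': 'V:'})
print(clashes)  # both channels on V: would eventually drop frames
```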

23rd February 2010
Just had a rummage around the disk-drives under the front covers. After this amount of time it might sound like a lack of curiosity on my part but in my defence I also have a sick DMX lighting controller, a burned-up DA, and a multiline to-do list sitting on my desk!
Each machine has 2 x 2.8TB RAID 5 storage which is enough for over two hours of 4:4:4 per channel.
The individual drives are Seagate Cheetah 15K.7 (http://www.seagate.com/www/en-us/products/servers/cheetah/cheetah_15k.7/) mounted in removable harnesses for easy hot-swapping.
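The "over two hours" figure checks out on the back of an envelope, assuming 1920x1080 RGB 4:4:4 at 25 fps with each 10-bit sample held in a 16-bit container (a common uncompressed disk layout; the actual DVS on-disk format is an assumption here):

```python
# Rough capacity check, under the stated assumptions:
# 1920 x 1080 pixels, 3 samples/pixel, 2 bytes/sample, 25 fps.

def hours_of_capacity(tb, width=1920, height=1080,
                      bytes_per_sample=2, fps=25):
    bytes_per_frame = width * height * 3 * bytes_per_sample
    bytes_per_hour = bytes_per_frame * fps * 3600
    return tb * 1e12 / bytes_per_hour

print(f"{hours_of_capacity(2.8):.1f} hours")  # roughly 2.5 hours
```

That is about 311 MB/s sustained per channel, which also explains why dropped frames, not raw capacity, were the thing being tested.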

23rd March 2010
Just watched Niagara (1953) on TV. This film seems to have noticeable colour-fringing in some scenes. Was this print struck from shrunken matrices? I wouldn't have thought the Technicolor guys would have let it pass at source so I wonder what has happened since?
Haven't seen the DVD version yet - any comments?

2nd April 2010
For the purposes of sane documentation I needed to uniquely identify each of the two equipment racks. To keep the operating ergonomics and inter-unit cabling tidy it was decided that when set up on location the cases would always appear as left and right respectively (does this make sense?).
It was difficult to keep referring to "the one on the left with the SPG" or, "the right-hand one with slightly less stuff inside".
It had to be something easily memorable, for example - Laurel & Hardy, Huey and Dewey, Ant & Dec*, etc.
I'm afraid it came out really dull:- left side = Port = Red, right side = Starboard = Green.
Therefore they are now individually known as either the Red or Green rack!

* for non-UK readers they are popular TV presenters here.

7th April 2010
It's been suggested here at the research-centre that we should investigate multi-camera setups without the benefit of synchronised shutters ie., no genlock. This would be advantageous for witness camera applications in green-screen studios - one less cable to rig/trip over.
A couple of years ago I built a device which consisted of a 'rolling LED' strip which was synchronised with SD synchs. When individual frames were captured and subsequently analysed the illuminated LED(s) would show at what point the camera shutter was open (and for how long).
I now need to redesign and improve this box for HDTV experiments but this is going to have to follow this current project, so we're talking >June perhaps.....

23rd April 2010
Increasingly irritated with the 3rd-party supplied flight-case racking system. Instead of conventional equally-spaced cage-nut holes it has a conduit slot running from top to bottom. This means having to source and fit (hard-to-find) fiddly spring nuts at the desired locations.
Unfortunately "at the desired locations" can mean anywhere in the slot, so you not only have to space them the correct distance apart vertically, but also ensure that the fitted unit is horizontally level. I've never before had to use a spirit-level to fit 19" rack equipment!
An additional nuisance is that the rack is quite deep which, while not so important for the short stuff, becomes a problem when fitting a Tektronix TG700 SPG. This unit has a sliding mount which fits not only at the front, but also has to be fixed at the rear of the rack as well. This is accomplished in normal situations by using two supplied slotted metal strips which can be adjusted to the desired depth.
Yes, you may have guessed it, S**'s Law - the Tek's metalwork was 1" too short.
Luckily there is a vertical conduit facing inwards at the back of the flight-case (I'm not sure why but mustn't look a gift-horse etc....). If I bend out the Tek's metal strip 90 degrees outwards then it might just meet. Well after a session involving a vice and a large pair of pliers I managed to straighten out the whole thing.
Offering up the TG700's sliding mechanism showed that the aforementioned Law was still in force; now the slot was 1/10th" short and was also slightly too small for the M5 screw needed to fit it to the conduit's spring-nut.
This outcome causeth me to rend my garments and utter bad words. It being a Friday afternoon I decided that someday I will laugh about this and should leave it until next week.
Stay tuned.

30th April 2010
Recomposed this week. TG700's metalwork placed in vice, end holes cut through into U-shape and then offered up back to the rackmounts - perfect fit - job done.
The disk-recorders are now being used for a couple of weeks (on a limited basis) for an upcoming European Union project. Rather than lose any assembly time I decided to separately build the wiring loom for the termination panels at the rear of each rack.
Each unit will have two rows of 20 x BNC which provide interconnection with inputs from the cameras (links A and B), outputs for TLS reference synch., and LTC.
Eight of the lower BNCs will mate with a main interconnecting 'umbilical-cord' which carries HDSDI signals and control data signals.
The rear termination cable for each camera BNC is 1694 type. After a long cable run the signal will be of low amplitude so any further losses must be minimal. The other cables used within the racks are much less critical and are of a smaller, slightly more lossy type.

5th July 2010
Well my last 'more soon' turned into 'much much later' - it all turned very busy, very rapidly. A European Union sponsored research project turned up at the last moment, they needed my mobile HD capture system and all 8 Viper cameras in a special inline configuration.
The camera views needed would give 4 x stereo and 8 x 1 for later use with a pseudo-holographic monitor.
The only silver-lining was the fact that they would be using the existing studio's infrastructure.
As usual, the fact that the units were not yet complete didn't come into the equation, so there was a lot of last-minute 'temporary' cabling.
They wouldn't have the viewing matrix, rasterizer or patch panels - the DAs for the synch. and LTC were wired and cable-tied to the inside of the racks.
Unusually for an EU project the requirement was for 720/50p. The existing Tektronix TG700 doesn't do this standard so an AJA GEN10 SPG was pressed into service.
To cut a long story short the capture sessions were very successful and I think everyone was pleased with the results.
Just as that storm had passed, it was decided that London-based Framestore would like to use the mobile units for witness-camera tests! As this would be in the field we would need 24 x 30m cables (return HDSDI, TLS and LTC for 8 cameras). There was no time for making neat multiway cables so I ordered a drum of BD SD19 coax, a lower-loss version of 1694.
Of course the devil is in the detail: I also needed BNC connectors plus strain-relief boots of various colours. No time for neatness - each camera's cable was assembled from 3 coax cables wrapped at 1m intervals with cable-ties. Not tidy, and a pain to spool up, but time was against us.
As the units would have to endure a journey in a truck I had to anchor down my 'temporary' system with many more cable-ties and velcro-wraps!
I spent 2 whole days measuring cable and making over 60 BNC terminations!
To make matters worse I would be on vacation the week the units were to be used, therefore I had to train up a non-video engineering person in how it all went together. All the cables were copiously labelled.
As with the last project I was told that it all went well with no serious problems - in fact users' feedback is vital in getting the finished units just right.
Now that things have quietened down I can hopefully finish this project soon(?)

6th July 2010
'Proper' cable for the cameras is now on order for future jobs - Belden 7710A (3 x 1694A inside a 20mm PVC sheath). This will give us adequate performance over 30m, with a predicted loss of approx. 9 dB @ 1.5 GHz. One of my future projects will be to try and measure the loss over the various BNC connections; I know it's small but I just like to know these things.
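As a rough back-of-envelope check, coax attenuation at these frequencies is dominated by skin effect and so scales roughly with the square root of frequency (and linearly with length). A small sketch, using only the ~9 dB over 30 m at 1.5 GHz figure quoted above as the reference point - purely illustrative, not a substitute for the datasheet curves:

```python
import math

# Reference point from the cable on order: ~9 dB over 30 m at 1.5 GHz
REF_LOSS_DB = 9.0
REF_LEN_M = 30.0
REF_FREQ_GHZ = 1.5

def estimated_loss_db(length_m, freq_ghz):
    """Rough coax loss estimate: proportional to length, and
    (skin-effect dominated) roughly proportional to sqrt(frequency)."""
    per_metre = REF_LOSS_DB / REF_LEN_M  # ~0.3 dB/m at 1.5 GHz
    return per_metre * length_m * math.sqrt(freq_ghz / REF_FREQ_GHZ)

# e.g. a 50 m run at 1.5 GHz, and the same 30 m at 750 MHz
print(round(estimated_loss_db(50, 1.5), 1))   # 15.0
print(round(estimated_loss_db(30, 0.75), 1))  # 6.4
```

Real cables also have a frequency-linear dielectric-loss term, so treat this as a lower bound at the high end.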
There is a possibility of dirt ingress and of the PVC splitting at the 'Y' point where the 3 individual cables emerge from the sheath. A neat (but expensive) solution is Tyco Electronics' 462A023-25-0 heatshrink boot/transition.
I've tried an initial test and, once you've got the hang of heating it into shape with the heatgun, it does do a good job of sealing the cable.

7th July 2010
Not many cartoons about HDTV but take a look at http://www.xkcd.com/732/
I shall say nothing. . . . . . . .

8th July 2010
I'm no football (soccer) fan but whilst watching last night's semi-final of the World Cup 2010 it occurred to me how far we've come in television engineering. Live high-definition TV pictures from South Africa where you could almost pick out the facial features of the crowd. I remember back in the early 1970s watching analogue pictures of the World Cup from South America through a fog of spatial and motion artifacts - cutting-edge 525-to-625-line standards-converters built out of discrete TTL logic!
I believe the BBC had an analogue converter built out of ultrasonic delay-lines, but that's just ancient history!
Looking back at literature of the time it's interesting to note how much successful R&D into spatial and temporal components of the video signal was being done. This was at a time when microprocessors were way too slow for video - all signal processing was achieved using 5 volt TTL adders/multipliers plus some early RAM chips.

24th September 2010
Time scoots by and I'm falling behind - not my fault, these units are just too darned popular!
I've fitted the viewing matrix and rasterizer and made the 10-way interconnection cable, but haven't been able to get near the machines!
Note to up and coming project engineers - never let the users near your installation until you've finally finished it!
In the meantime I've been assembling the eight 3 x multiways with the boots, BNCs etc., mentioned above.
Highlight was melting the Tyco fittings mentioned above into shape in a room that was already 27C (80F). These heatshrink boots take a fair amount of heat, so even with a 2kW heatgun they take at least two minutes to become pliable. When shrunk the fittings are still very hot, so they need cooling under a fan before being wound onto the cable drum.

At least now I could use my latest toy, the Harris HDStar to test the cables.

This is a convenient test generator/monitor for SD and HD. The in and out BNCs are at the top of the unit, making cable checks really quick and easy.
I love this little box although it's quite heavy on the Li-Ion battery; also the recharge rate seems spectacularly slow.

23rd December 2010
Christmas break at last, so now I have a chance to update the blog. Someone used the cable reels mentioned above in the studio and decided that it's a nuisance having to unravel the whole 30 metres in order to get to the BNCs at the far end. I pondered earlier on whether to terminate the ends with just BNC tails or to fit bulkhead sockets into the cable reel's centre plate. Looks like I'll have to do the latter after all - that will mean making 8 x 2 metre multiway cables and fitting BNCs, boots and transition joints. A job for the new year - deep joy!!

29th December 2010
Channel 4 here in the UK is, at this moment of writing, showing David Lean's This Happy Breed (1944) - what a stunning print and a fine example of 3-strip Technicolor from the British plant.
Talking of Technicolor I recently caught the end of Anchors Aweigh (1945) and noticed some horrible RGB misregistration in some shots. It looked camera-specific as it would alternate with 'good' registration between shots within the same scene.

14th January 2011
Oh dear, a cloud has appeared in the paradise of ProntoXway ownership. Number one machine had some flakey bootups and has now died completely.

17th January 2011
New system drive received from DVS so the machine is removed from its flight rack.
Wow - it's !*@@*!! heavy, but once it's put on my (straining) worktop I'll open her up and take a look around inside.

18th January 2011
Of course the ProntoXway is typically German top-dollar engineering - can't fault it (except the dud hard-drive of course!). From the top I can see a large motherboard kept cool by a small hurricane from the fans mounted across the case. System-drive C: is mounted in a bracket at the front, held in by a screw and mounting lugs, so it's a breeze to change.
OK, disk now replaced so just have to re-image with the GHOST disk supplied by DVS.

4th February 2011
NOTE: Old/seasoned IT hands may wish to skip this entry.......
Oh brother, what a saga that GHOST reimage turned into. I loaded the disk and the usual Linux screed raced down the screen. When prompted for the device to install the Windows OS onto, you have (in Linux-speak) a choice of sda, sdb, or sdc. The prompt suggested ">sda?" which, in a hurry, I accepted - big mistake! As I found out later, the Xway's devices sda, sdb, and sdc relate to RAID array #1, RAID array #2, and the 4GB system drive (C:) respectively - therefore sdc should have been selected.
To cut a long saga short, I now had Windows installed on a RAID drive which would normally be used for media files only. Windows will not allow you to destroy itself through Disk Manager, so somehow I had to get 'outside' the system to flatten the RAID drive. Thankfully I found the wonderful SystemRescueCd on the net, which creates a bootable Linux CD with a myriad of disk-maintenance tools, some of which work with NTFS.
Using this wonderful software I was able to wipe the RAID drive and load the image onto the correct drive - sdc (aka C:).
Moral of the story is clear - before pressing the Enter key, read the options given to you in the small print!

5th March 2011
Readers of this blog may be interested in an article that I have written for the Spring 2011 issue of the UK's Guild of Television Cameramen's magazine 'Zerb'. The article concentrates on a multiview research project held at the University of Surrey in Summer 2010, and hints at the (still) distant prospect of holographic TV. The aforementioned mobile capture system and Viper cameras were used for this session.
Copies for non GTC members can be ordered from zerb.production.07@gtc.org.uk