Tuesday, May 15, 2012

The new normal


The emerging media technologies forum closed with a keynote from Peter Hinssen of Across Technology which proved to be both witty and wise.
He started out with some visual experiments, to make the point that culture affects what we see. In the West we are culturally inclined to see what is in front of our faces, while Far Eastern cultures tend to appreciate the shape and structure of the background.
It was the set-up for talking about technological advances becoming the new normal. And although we feel we are immersed in digital technology today, Hinssen thinks we are only halfway there, at the mid-point of the S-curve.
As we cross into the second half of the digital S-curve we will start to talk about the benefits, not the features. Then it will become the new normal.
Consumerisation means that we have better technology at home than at work. Hinssen defined work as "the brief period during the day when I have to use old technology".
He suggested that the current battlefield is mobile technology. In Europe there is 127% mobile phone penetration; 60% of smartphone users take them to bed; 23% would rather lose their wedding ring than their smartphone.
He looked at Apple, and pointed out that it did not innovate – there were MP3 players long before the iPod. But where it is supremely effective is in constantly driving for simplicity, looking at every single product from the consumer’s viewpoint. That is why it is able to seize markets.
From there Hinssen moved on to our instant society: instant messaging, instant gratification. Businesses struggle to keep up with this, finding it hard to be agile enough to develop products and services to meet the new expectations in this era of now. The Rand Corporation defines this as the VUCA world: a business and political environment of volatility, uncertainty, complexity and ambiguity.
The millennial generation suffers from continuous partial attention, he said. They can multi-task, but only on a superficial level. They lack the ability to go deep.
One of the key themes of the forum has been the vital importance of data. To illustrate this, he described the Nike+, which was launched as a way of driving the jogger’s iPod to choose appropriate music to match the pace. But it also tracks the course of your run, where you go and where you stop. Aggregating the data means Nike knows favourite tracks, and where people stop. Is it selling common rest points to Starbucks or juice bars? Certainly it is.
He also cited LinkedIn. 98% of Microsoft executives are on LinkedIn, he suggested. Who knows more about Microsoft’s business plans: Microsoft or LinkedIn?
The biggest problem of innovation is not a shortage of clever people; it is the organisation which surrounds them. And research shows that as businesses grow, their productivity falls: if a company triples in size, its productivity falls by 50%. Bigger organisations stick to small-company organisation charts, and Hinssen recommended that all delegates return to their companies and burn their org charts.
The old ideas of absolute control have to be abandoned. Businesses have to be organised as networks, as organisms. Innovation flows faster in a network.
Hinssen said that if you want to really promote innovation, you have to allow that failure is an option. You have to try things quickly and see what works. You innovate not just by having clever people, but by having the ability to kill quickly those ideas that are not working.
In the new normal there is no business side and information side: we all work in information now. Technology has never been more exciting – you have to let that sense of excitement radiate through the whole organisation.

A view of journalism


News will always remain a major driver for broadcasting, so it was appropriate that the emerging technologies forum included a presentation from a journalist. Glenn Zorpette talked about his experiences and his ideas for the future.
He started by saying that the rapid and sweeping changes affecting journalism are all driven by technology. The web is a big source of change, because it has conditioned audiences to expect the same five things: video, podcasts, blogs, articles and slide shows. That is a problem for in-depth publications like expert journals (Zorpette is now the editor of IEEE Spectrum, which publishes 2,500-word articles). They do not work online.
His suggestion is that you use video as the honeypot which gets people on to the site. Getting them onto the site, and keeping them there for a few minutes, is what advertisers need.
But if you have to include video, what kind of video do readers want? He joked that people just want videos of cats. But even on a hardcore technology publication like IEEE Spectrum, the most popular clip is of a GPS-connected cat, earning hundreds of thousands of hits.
80% of the video they use is provided for them by researchers, or by their PR people. This simple reposting gets traffic to the website. Other content comes from Skype interviews and shoots using low cost cameras, as well as more professionally produced videos. It depends on the nature of the content: sometimes you need to get the video online immediately, he claimed.
He suggested that low cost cameras are good enough. Backpack 4G connectivity, such as the LiveU system, can replace the satellite uplink truck.
Zorpette’s view is that we are all aggregators now. Writers blog and tweet, then expand those posts into bigger pieces, and lift content from other writers’ blogs and tweets. Is this good for journalism? It is hard to tell, he said.

Is the future bright?


Chuck Dages from Warner Bros. opened up a session called Pulling it all together by charting some of the disruptive technologies in its history. Talkies were introduced, he said, to reduce operating costs: if the sound was on the movie then the theatres saved the cost of a piano player. He also mentioned the impact of colour in 1939, with The Wizard of Oz, and the coming of scripted television with the 1954 series Cheyenne.
In 2006 Warner Bros. had 1.3 billion transactions for its products. By 2011 – with the coming of smartphones and tablets, among other factors – that had risen to two billion. The challenge is to accommodate that 50% growth in content deliveries without a matching growth in delivery costs.
He said that as a business Warner Bros. has learned through this, and through other revolutions like music and photography, that consumers want access wherever and whenever they are. They also want to release the value of the library: there are 10 billion DVDs on shelves in homes, and consumers want to take that content with them. And consumers want tools to be able to manage their digital libraries.
From Technicolor Christophe Diot said he agreed entirely with Chuck. He added the point that a regular movie is now delivered in 200 different versions. He also claimed that Disney collects nine petabytes a day in user data – far more than they can possibly use, at least with today’s technology.
He said that the last 10 years have been the research years, the time when the infrastructure and the technology were developed. Now we are entering the consumer years, when this technology will be used to deliver the experience that audiences want. It is a time of dematerialisation, when we make life easier for the consumer by making consumption seamless and transparent, and letting the consumer pay the right price to the studio for the content.
Studios then have the ability to determine the balance of content, quality and price without it becoming commoditised. The content and the technology should work together to deliver the right experience at the right price, so simply that anyone can understand it.
He reiterated the importance of metadata, and outlined three kinds of metadata: the technical metadata, the user metadata, and the crowd-sourced internet metadata. This third class is what will drive discovery through recommendations.
The third panellist was Chin Siang Lim from Singapore Media, who suggested that having too firm a vision of the future is dangerous because it traps you into a fixed path. And it will be boring!
He then offered a couple of hints of what might be to come. First, he noted that the amount of data is rising hugely. One solution might be that all the content in the world could be available to every consumer, and at the point of consumption you pay for the resolution appropriate to the device you choose. IPv6 will allow everyone to have their own server to achieve this.
His second hint concerned automated production. With enough fast cloud computing anyone can make their own content. Lim pointed to Apple’s Siri as the foretaste of what will come. Today it gives you recommendations for restaurants, tomorrow it might run scientific experiments or create videos. It might be the way to achieve the idea of interactive content which adapts to the feelings of the audience.
If you can produce content that will expand the minds of the audience, then not only will that help societies grow, it will help your audiences and revenues grow.
One comment from the floor picked up on the idea of adaptive content, and asked if this would be a way of presenting alternative cuts of a movie, perhaps an adult version for late night screenings. Lim suggested that this would happen if the audiences demanded it.
Diot saw a couple of problems: creative people do not like to see any changes to their movie; and the movie would need to be built in a different way, with metadata inserted into the file to allow the dynamic interaction. But technology is advancing, particularly in the home, where it is possible to detect who is in the room so you can protect the adult version.
Another comment from the floor asked what, given the general agreement that metadata is the key to the future, this means for the industry. Do we need more data analysts?
Dages said that it was important to collect, collate and parse metadata at every stage of the process. Data wranglers are now a part of a movie crew.
There was a debate on how metadata could be standardised, and even whether it would be worthwhile. Diot felt that because much of it is transient, and linked to cultural and social thinking, it is not even necessary to keep it long term, let alone standardise it.
From the floor there was the suggestion that artificial intelligence might, in just a few years’ time, be capable of automatically generating all the metadata we need, in the way that Shazam does in cataloguing music.
The issue was then raised of how the next generation of staff – creative and technical – can be attracted into the industry. Given that, at least according to the young people’s panel yesterday, there is a strong feeling against the corporate world, will they want to work for big producers?
Diot said that Technicolor is addressing this concern through workshops and mentoring schemes to develop the passion for quality and creativity. He said they can teach the details of technology, but schemes like this encourage interest. Dages added that the film festival scene is mushrooming, to reflect the fact that many more people are taking advantage of the ready availability of tools to create content.
Chairman of the session David Wood recalled Alvin Toffler’s theory that change accelerates – future shock. What does this mean for the coming years in our industry?
Lim thought it was a case of survival of the fittest: only those who can keep up with the pace of change will benefit. Dages suggested that there was another piece of human psychology that came into play here: we all develop our selected subset of favourites. We may have access to hundreds of radio channels in our cars but only listen to a couple of them.
Diot said he was a researcher, and understood the curve by which the utility of new concepts stabilises over time. At first a concept might be too complex for the majority, but we adapt in time. But he felt that predicting that time – and putting a date on events in the future – was not possible. In a show of hands, there was overwhelming agreement that the future is bright.

Show me the money


The first day of the emerging media technologies conference was largely about the way that new acquisition and delivery techniques are changing the way we present and consume content. Which is all well and good, but are there new business models emerging to fund it? That was the theme of the first session on the second day. Taking the chair was Tom Morrod, senior broadcast analyst at IHS Screen Digest.
Morrod’s first point was that total consumer leisure spending is still focussed on traditional content types. New media and online spending is still down in the noise in comparison: 1 or 2% only. More than half goes to pay TV, 10% to gaming, and substantial revenues still go to cinema and DVD.
Advertising revenue, though, has shifted towards internet and online. Television is still the largest single market, but internet is now running it close. But that revenue is largely associated with online video, so in a way it is returning to the same content owners and providers.
Consumer spending is surging ahead, but only on pay television. Advertising is flat – a growth of only around 20% from 1999 to 2011 – but online advertising is rising fast. Morrod also showed a chart of the decline of the music industry, showing that unless it is properly managed a rise in online services will not necessarily be good for business.
The final point he made was that consumers are spending on devices: the typical household will now include tablets and smartphones as well as computers, televisions and DVD players. The results are two-fold: first that the share of viewing hours taken by the television screen will fall rapidly, and second that it is placing huge – maybe unreasonable – demands on IP traffic.
Morrod then introduced John Yip from RTHK. In these times of technology change, he suggested that we are in a David and Goliath situation: the Davids are the new, agile companies, and the Goliaths the big monolithic broadcasters who may be slow to change.
To provide some quantitative method of determining whether a new service is viable, he introduced his RPMO analytical model, a simple way of analysing business trends. This is a conceptual equation which balances Regulatory, Pricing, Marketing and Other factors across macroeconomic forces to derive a numeric value for driving force. He even provided a link to an Excel calculator on the RTHK website. Yip’s claim is that it is a pretty accurate way of predicting the growth in market penetration of a new service.
Through examples he showed that the model derives values for R, P, M and O, and by taking the geometric mean you get a number between 0 and 5 for the proposal: 5 is a certain success, 0 is hopeless. Because the score is a geometric mean, if any one factor scores zero it brings the final total to zero.
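For illustration only, here is a minimal sketch of the idea as described above, assuming each of the R, P, M and O factors is itself rated on a 0–5 scale and combined with an unweighted geometric mean; the precise factor definitions and any weightings in Yip’s Excel calculator are not reproduced here.

```python
from math import prod

def rpmo_score(r, p, m, o):
    """Geometric mean of four RPMO factors, each assumed to be rated 0-5.

    Hypothetical illustration: because the combination is a geometric
    mean, any single factor scored at zero drives the overall score to zero.
    """
    factors = [r, p, m, o]
    if any(not 0 <= f <= 5 for f in factors):
        raise ValueError("each factor must be rated between 0 and 5")
    return prod(factors) ** (1 / len(factors))

# A service that scores well on pricing and marketing but is blocked
# by regulation still comes out at zero.
print(rpmo_score(4, 3, 5, 4))  # ~3.94 -- promising
print(rpmo_score(0, 5, 5, 5))  # 0.0  -- the regulatory factor kills it
```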
Martin Guillaume from IBM quoted from a study of 1,700 CEOs, which contrasted the answers of the top-performing 20% of businesses with those of the remaining 80%. On the question of innovating the industry model, the average response was 18%, the top-performing group was at 35%, but media CEOs were at 45%, showing a clear appetite for new business models. Similar results were found for new sources of revenue. While this innovation and new revenue is being forced on the media industry, it is clear that a significant number of businesses are taking positive steps.
Turning to where the money will come from, Guillaume noted that traditional broadcasters have had to adapt to multiple delivery platforms, not least because audiences can ignore advertising by focusing on the second screen. You have to deliver new experiences to develop new business models, and to make the multi-screen experience more immersive.
For content owners, the rules which have governed the value of content have to be rethought. It should be possible to derive and justify the value of each piece of content, and then to optimise the way that revenues are earned around it.
As an example, he showed a mock-up of a tablet app for football, which derived data in real time from the game – player stamina, for instance – and modelled it on the screen. That opens up new revenue opportunities, from gaming and advertising insertion to direct links to gambling and subscription data.
The key point to bear in mind is that, as modes of consumption change, the value of rights shifts, and you need to find new ways of measuring and analysing it. So metadata is the cornerstone of new business models.
The final speaker was Charles Sevior from Australian company Charan Group Consulting. He concurred with the prevailing wisdom that, as viewers are uniformly keen to adopt new devices for content consumption, broadcasters need to find the best ways of harnessing them to improve the return on investment for their advertisers.
He then turned to the demands on broadcast infrastructure. The growth of channels and consumption platforms has a multiplying effect on scale and capacity, and the challenge is to decide when to invest: early to allow agility, or just in time to get the benefit of the latest technology.
One of the ways in which that is being handled is the move to generic IT. Standard software environments are increasingly well-behaved, he suggested, allowing them to deliver broadcast standards of reliability and stability. That provides agility by replacing traditional broadcast interconnectivity with a three-layer model of network I/O, processing and storage.
He showed an interesting slide which reminded the audience that the whole of the broadcast industry, according to the IABM research, is worth around $25 billion, which is about the same size as EMC alone, and dwarfed by Windows and Apple. His conclusion was that the broadcast industry should increase alliances with major IT vendors and qualified media IT system integrators. That would allow us to focus our ingenuity and investment on differentiating software and specialised hardware.
In the panel discussion, Sevior described the current series of The Voice on Channel Nine in Australia, where alongside conventional premium-rate telephone voting the audience can download songs from iTunes, which counts as two votes for the performer. They see this as a source of new revenue, as well as a practical link to social networking.
Guillaume talked about real-time analytics of social media as a way of determining what audiences think about a show as it is being broadcast. Comments on Facebook, Twitter and blogs can be parsed as they are posted to evaluate the programme and its talent.
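As a purely hypothetical illustration of that idea – the posts, keywords and scoring below are invented, not drawn from the panel or tied to any real platform API – a real-time evaluation can be reduced, at its simplest, to scoring each incoming comment as it arrives:

```python
import re

# Toy sentiment tally over a stream of posts about a programme.
# A production system would use real platform feeds and far richer
# language analysis than a keyword count.
POSITIVE = {"love", "brilliant", "amazing", "great"}
NEGATIVE = {"boring", "awful", "terrible", "switch"}

def score_post(text: str) -> int:
    """Crude per-post score: positive keywords minus negative keywords."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

stream = [
    "Loving the new judge, brilliant telly",
    "This episode is boring, about to switch over",
    "Amazing performance just now!",
]

running_total = 0
for post in stream:
    running_total += score_post(post)
    print(f"{running_total:+d} after: {post!r}")
```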
Yip said that one of the benefits of media consumption on mobile devices is that you can then use location-based services. Advertising can be tailored not just for the individual but where that individual is at the time, maybe suggesting nearby shops or restaurants.
Morrod then moved the conversation away from the new revenue opportunities towards the costs: what investment will be required in infrastructure and content creation, and will it be justified? Sevior felt that there was no new money around, and he was unconvinced that bold new ideas would be commercially sound.
Guillaume suggested that broadcasters and content creators need to challenge technology suppliers to come up with new solutions, maybe based on cloud, which can be delivered quickly and easily to allow these new models to be tested in the real world. If an idea fails then it can be quickly killed without significant cost; if it succeeds then it can be scaled up rapidly.
Finally, the debate moved onto so-called t-commerce, the idea that a mix of television advertising and interactive services could become a huge new stream of revenue, potentially coming close to replacing bricks and mortar stores entirely. Sevior felt that was technically a huge challenge, and the rewards were not clear: it required a significant change in consumer behaviour, too.

Monday, May 14, 2012

Emerging technologies, day two

The delegates are grabbing a final coffee and comparing notes on last night's excellent dinner at the Chateau des Eaux Vives, overlooking the lake in Geneva. Day two of the SMPTE/EBU conference on emerging media technologies is about to get under way.
Once again, leading international commentator and writer Dick Hobbs and his Mac are at the back of the room, ready to bring readers of the SMPTE blog first thoughts from the conference.

The content of the future


Sooner or later, all conversations about the broadcast industry come back to the universal truth that content is king. The final session of the first day of the emerging media technologies conference addressed the issue of content.
Sheau Ng of NBC Universal felt that convergence was now an old word. Now we have a proliferation of new methods of communicating and a proliferation of devices. That paints a new picture of the future for content. We are beginning to connect some of the dots, to build the new paradigms for the creative community, he said.
What we have to do is push ourselves into new ways of telling stories, to find routes to re-engage with audiences. We can now present data in multiple ways synchronously, to the main screen and the extra screens. It is up to the producers to find new ways of using those tools.
Chair of the session Anthony Rose of Zeebox talked about a programme on UK Channel 4 called Style the Nation. This was aimed at teenage girls, and paid for by a clothing company. The accompanying app allows the audience both to vote on the clothes discussed on the programme and to buy them online. It turns the conventional broadcasting model upside down.
Jean Philip de Tender is a channel controller at VRT, but described himself as a story evangelist. New technologies allow you to tell stories in a better way, he agreed. With second screens you have the ability to organise a dialogue with your audience, to understand how they are responding to what you tell them.
He recalled that he had once made a programme about terminal cancer, which naturally triggered strong emotions. This was an opportunity to start a conversation, an ideal use for connected communications.
In response to a question, he was clear that, while the broadcaster need not own the second screen, it was vital that the content was linked to provide real integration and convergence.
Ken Kerschbaumer of SVG brought the sports broadcasters’ viewpoint. If you have the rights to a sport you have a way of keeping the audience locked in and a naturally dramatic story line. It is a story best told live. So sports fans do not cut the cord. They want to watch their favourite sport as it happens.
The discussion moved on to discovering content. What will the next iteration of EPGs look like? Ng felt that there was much to be said for the old up and down buttons, which worked when there was just a handful of channels. The next step is to take that idea and put behind the up and down buttons suggestions for the sorts of programmes that an individual or family might want to watch at that time of day. Some clever algorithm somewhere will predict tailored programming for you.
Tender remained a firm believer in the strong brand: the broadcaster setting out what audiences will want to watch. He also expressed the idea of a show having “talk value”, making audiences watch a programme as it is transmitted because everyone will be talking about it afterwards.
Programme scheduling exists today, and will exist tomorrow, asserted Ng. But within the next decade there will be the sort of functionality that can provide a personalised schedule. It has to understand not only the user but also the content, including new content which the audience has yet to see. How can that new content be advertised?
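As a very rough sketch of what "understanding the user and the content" could mean in practice – the genres, catalogue and viewing history below are invented, and this is not Ng's method – a minimal content-based recommender might rank unseen programmes by how well their tags match what a household has already watched:

```python
from collections import Counter

# Hypothetical evening viewing history for one household.
history = ["drama", "drama", "comedy", "documentary", "drama"]
profile = Counter(history)  # simple taste profile by genre

# Invented catalogue of new programmes, tagged with genres.
catalogue = {
    "New crime serial": ["drama", "crime"],
    "Panel quiz": ["comedy", "entertainment"],
    "Wildlife special": ["documentary", "nature"],
}

def score(tags):
    # Sum of how often the household has already watched each genre tag.
    return sum(profile[t] for t in tags)

for title, tags in sorted(catalogue.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{score(tags)}  {title}")
```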
Audiences increasingly expect to find content for nothing, and broadcasters work with YouTube as a way of building brands. Kerschbaumer pointed to the sports bodies, like Formula 1 and MLB, which control their content very tightly and do not have YouTube or Facebook presences. Does this mean that audiences will reject them because they cannot see them for nothing?
In conclusion, Rose asked how content owners generate blockbusters if there is no schedule. Ng said you will always have surprises, like The Hunger Games, but for most content it is the release date that matters.
Kerschbaumer said the beauty of sport is that you always know when and where it is happening. And Tender said it always begins and ends with a good story.