
Hype-ometer: Comparing the lack of rate-adaptive buzz with the 3D buzz

The more the hype, the less the genuine importance?

Rate-adaptive technologies occupy very little media space but will radically transform the Internet and broadcasting industries.

3D, on the other hand, is yet again making all the headlines (yes, it has done so several times over the decades), but I'm convinced that a few years from now this new surge of interest will be remembered as just another blip on the radar, if it is remembered at all.

Counting Google hits is by no means science, and '3D' is written the same way in most languages whereas "rate adaptive" is English only, but getting 170 million hits for the search "3D cinema OR TV" (limited to English-language pages) while the search "rate adaptive cinema OR TV" gets 240 thousand does give a sense of the hype imbalance between the two topics.

Walking through the booths at IBC, for example, you'll see 3D plastered everywhere you look, used to entice passers-by to stop. Counting the occurrences of "rate adaptive" on booth walls will be much easier!

First let’s look at one reason why 3D is generating such a fuss at places like IBC.

The global economy is apparently picking up after a recession, and cyclical industries like electronics need something new to push. 3D serves that purpose well. For set makers, there is now an ever-wider array of high-tech parameters to make people think they need a new TV. On top of the traditional TV specs like screen resolution, size, contrast, colour management etc., there is now Internet connectivity, widgets, OTT services and home networking.

This list of features is also used to differentiate from the competition. In the end, even if 3D never really takes off in the living room, the likes of Sony, Philips, LG, Sharp or Samsung will have benefited from the hype. 3D is just one of many, many features and can only help sales.

Device makers make cool devices, but, Apple apart, they don't deliver cool services. For the 3D revolution to happen, content needs to start flowing.

Apart from the set makers, the other group with a vested interest in making 3D take off is the service providers, who feel it would give them added value.

Now for some reasons why this might remain hype and never make it to the mass market.

Today's side-by-side trials by satellite operators actually have to reduce resolution. They basically split the frame and send the left-eye and right-eye views on each half of it. The feedback I've had has been disappointing: viewers notice the drop in resolution from HD, and 3D does not compensate for it.
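
To put numbers on it: packing two views side by side into a single 1920×1080 frame leaves each eye with only 960×1080, half the horizontal resolution of proper HD.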

3D will have to be broadcast in full HD resolution to have an outsider's chance of delivering on its promise. To do this, a bandwidth increase of 20% to 100% will be required. DSL and the unmanaged Internet will drop out of the race, at least for live content. So the only stakeholders I see pushing hard to get the 3D bandwagon rolling are satellite operators (and in some cases FTTx and cable operators) for whom bandwidth is less of a blocking point. That's why Sky has several 3D initiatives and has been showing some impressive demos for over a year: they rightly see 3D, if it takes off, as keeping them one stage ahead of the game.
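
As a rough illustration (the baseline figure is my assumption, not a quoted number): if an HD channel takes about 8 Mbps, full-resolution 3D would need somewhere between roughly 10 and 16 Mbps. A satellite transponder can absorb that; much of today's DSL cannot, at least not reliably for live content.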

But even the biggest marketing muscle cannot make people adopt something that brings them no benefit.

When a movie conceived specifically for 3D comes along, people will notice. But beyond aesthetics, 3D doesn't answer a need expressed by users, or even one yet imagined by marketers.

HD improves the experience of any content, whereas 3D is beneficial only to content specially designed and created for it. It's a funny contradiction that the sex industry was one of the last to take up HD (indeed, who wanted more gynaecology?) but may take back its role as innovator for 3D (having a beautiful body pass right by your fingertips can have more effect than just seeing it in 2D). Beyond that early adoption, though, 3D will remain niche for most of us for a few years yet, because it doesn't answer any of our needs. It might even remain niche forever, as 3D photography has for over a hundred years.

Aesthetics alone can, however, make an impact if 3D becomes part of our culture. For that, 3D will have to permeate all aspects of production, starting with the design of the content; this is under way, but will take years.

***

Rate adaptive is a lot less sexy to talk about than 3D. Indeed there isn’t all that much to show, except maybe to geeks who understand what’s happening under the hood.

Rate-adaptive technologies will, however, enable the delivery of services people have wanted from the Internet from the outset. This month's Wired magazine cover reads "The Web is Dead"; inside, you'll see they mean that it's video in particular that is killing it. Delivering video over the Internet remains a challenge, though. In the media space, one of the rare companies saying this out loud was Verimatrix, with last year's white paper "Adaptive Rate Streaming: Pay-TV at an Inflection Point". They don't seem so focussed on the subject any more (at least not on their website in the run-up to IBC); I'll try to find out why and keep you posted.

An early implementation of rate adaptive technology from Move Networks led the market by several years and almost made it to the mainstream when they were rolling out web streaming services with major US studios. Somehow they failed in the last stretch. Positioning and marketing must be to blame, because the technology is beautiful. They have now moved out of the B2C space and head-ends and are concentrating on enabling TV delivery for corporate customers.

Rate-adaptive technology is picking up speed in the consumer market. It is one of the latest exciting things from Microsoft, Apple and even Adobe, and of course all the encoder manufacturers, like Envivio, now support it too.

Anthony Rose, CTO of the UK's Project Canvas, recently said it is "essential for Quality of Experience on a range of Internet bandwidth".

He's right, and there should be more fuss about it in the media: it's really much more significant than 3D.

Bad quality and unreliability have been real killers of both user aspirations and business models for Web streaming efforts since even before the Internet bubble days.

The traditional pay-TV model may survive in a renewed shape, but even the most conservative execs in the industry agree there is a major shake-up under way, and OTT is one of its names. Content owners frown upon many OTT ventures, but to reassert control they are themselves investing heavily in TV-Everywhere initiatives, so that consumers have access to premium content from anywhere. That means pushing content across unmanaged networks.

Google’s entry into the market reinforces the feeling of unstoppable change.

As the MP3 and music industry debacle showed, people want more freedom in how they consume content. Companies, TV content creators included, need to make money. Rate-adaptive technology is the key enabler for satisfying both.

User surveys invariably show that consumers are happy to pay for content, as long as the technology is seamless and doesn't get in the way. Price points will find their natural equilibrium on their own.

Finally, a little technical perspective [geeks only from this point]: what do you actually need to deliver rate-adaptive streaming in, say, an STB? A recent LinkedIn post by Amino's CTO Dominique Le Foll puts it nicely in a nutshell.

The highest-level requirement is to adapt quickly and automatically to changes in available bandwidth. Beyond that:

  • All components of the stream must of course remain synchronised (video, audio, teletext / closed captions).
  • All features requiring significant processing (demux, decode, etc.) must be supported by hardware.
  • Trick modes must be supported: fast forward, rewind, pause, etc.
  • The technology licence must be affordable and content protection must be possible.

I also agree with Dominique that streaming should be based on HTTP. Not only is it the easiest route through the NAT in the end-user's home, it also means that in some implementations of the technology (e.g. Move Networks) streams can make use of cheap HTTP caches throughout the Internet, which is akin to getting multicast in the network for free. Tuning the initial buffering level to achieve a good TV user experience will be tricky, but who said it would be easy?
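
To make the adaptation loop concrete, here is a minimal sketch in Python of the kind of HTTP rate-adaptive client logic described above: download a segment, measure the throughput, and pick the next quality level accordingly. It is purely illustrative; the bitrate ladder, the 80% safety margin, the buffer threshold and the function names are my own assumptions rather than any vendor's actual implementation, and a real STB would do the demux and decode in hardware, keep audio, video and subtitles in sync, and fetch real segments over HTTP.

```python
# A minimal, illustrative HTTP rate-adaptive loop.
# Everything here (the bitrate ladder, thresholds, function names) is an
# assumption for the sake of the sketch, not any vendor's implementation.

import time
from dataclasses import dataclass


@dataclass
class Rendition:
    name: str
    bitrate_bps: int  # encoded bitrate of this quality level


# A hypothetical bitrate ladder, lowest to highest quality.
LADDER = [
    Rendition("low", 700_000),
    Rendition("sd", 1_500_000),
    Rendition("hd", 4_000_000),
]

SEGMENT_SECONDS = 2     # duration of each HTTP-delivered chunk
SAFETY_MARGIN = 0.8     # only count on ~80% of the measured throughput
MIN_BUFFER_SECONDS = 6  # below this, protect against stalls first


def fetch_segment(rendition: Rendition) -> float:
    """Stand-in for an HTTP GET of one media segment.

    Returns the measured download throughput in bits per second.
    A real client would issue a plain HTTP request (so any ordinary web
    cache on the path can serve it) and time the transfer."""
    time.sleep(0.01)     # pretend to download something
    return 3_000_000.0   # pretend we measured 3 Mbps


def choose_rendition(throughput_bps: float, buffer_seconds: float) -> Rendition:
    """Pick the best quality level the current conditions can sustain."""
    if buffer_seconds < MIN_BUFFER_SECONDS:
        return LADDER[0]  # buffer is running low: avoid a stall first
    usable = throughput_bps * SAFETY_MARGIN
    best = LADDER[0]
    for rendition in LADDER:
        if rendition.bitrate_bps <= usable:
            best = rendition
    return best


# Simulated playback loop: download a segment, re-measure, adapt.
buffer_seconds = 10.0
current = LADDER[0]
for _ in range(5):
    throughput = fetch_segment(current)
    buffer_seconds += SEGMENT_SECONDS - 0.01  # gained a segment, spent download time
    current = choose_rendition(throughput, buffer_seconds)
    print(f"buffer={buffer_seconds:.1f}s, next segment at '{current.name}'")
```

Even in this toy version, the benefit of staying HTTP-based is visible: each segment is an ordinary GET, so any standard web cache between the head-end and the home can serve it.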


STBs: from CAPEX to Cash-in

A connected TV powered by Awox

For the last six years, I've been going around trade shows hearing and saying that the big bad wolf in IPTV economics is the STB, which typically represents up to 70% of total capital expenditure, or CAPEX in Telco-speak.

As OTT and social media are accelerating the arrival of a new technical and business environment, my premise is that the huge threat is becoming just as big an opportunity. This year's IPTV World Forum gave me more food for thought when I spoke to Awox, which has a foot in the operator set-top box market and also a smaller one in off-the-shelf devices.

The problem

Let me first go back to the initial problem, one I've had to surmount several times from within operator deployments.

Typically we are talking about a total cost of ownership for a single set-top box (packaged with remote, cables and CAS, then delivered, installed and maintained) of, say, 150€. If we have a million subscribers, the math is simple. We need 10% spare boxes for repairs and to ship to new subscribers, so the capital required would be 165M€, all for one happy operator to pay.
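
Spelled out: 1,000,000 boxes × 1.1 (to cover the 10% of spares) × 150€ = 165M€.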

All major Telco deployments have had to cross this difficult chasm. To make things worse, IP-based boxes were initially far more expensive than satellite or cable ones. In finance terms, one way of easing the pain is to remember that, unlike head-ends, STBs are a marginal cost: you only pay for boxes as you deploy them to customers who, hopefully, are in turn paying for a service.

Why did all of the early operators, and many coming to market today, want to do something as financially bizarre as owning the STB?

The first reasons were security and control.

From the outset, operators needed to obey stringent security rules set out by rights holders to be given access to their content. Before considering interactive services, an operator must at least deliver plain vanilla pay-TV. For that they must have access to the premium content that people want to watch. Therefore they must adhere to the strictest security constraints imposed by content owners. A few years ago it seemed only natural that to get into such a business, one could only play by the rules. So like cable and satellite operators, who have always owned the STB and the smartcard therein, early IPTV operators did the same and most are still doing so.

But ten years on from the launch of the first commercial IPTV trials, a consensus is emerging (there is a good Farncombe white paper on this subject here). Operators only need to own a smartcard for broadcast networks that lack an inherent return path, like satellite or digital terrestrial. For IP networks, where each STB can establish an individual link with a security server, software-based security is sufficient. A smartcard is no longer required, and thus this first reason is vanishing.

Telcos, and especially incumbents, have long had a phobia about letting anything they don't control onto their networks. They usually have a team of security gurus who must give their blessing before any new device can be deployed. Looking back a few decades, PTTs always jealously guarded their PSTN networks from non-vetted devices, even plain vanilla telephones. As a teenager in the early eighties in Europe (Paris and London), I remember the thrill of plugging in an illegally 'smuggled' phone from the USA, made of transparent plastic with coloured LEDs, at a time when BT, DT or FT only supplied cream or brown handsets. In the deregulated 2010 landscape, all operators have so little control over the last mile of their networks that it seems silly to pretend that owning the STB still makes a difference, and even incumbents that own the last mile are lost when it comes to managing the home network.

Awox has experienced this gradual change first hand. They got through France Telecom’s red tape with their Internet Live-radio devices currently available to Orange subscribers in France.

Service operators have always worried about stickiness. In today's Internet world, where the competition is only a mouse-click away, it's no surprise to Awox that many Telcos have gone for a "walled garden" approach. Indeed, Awox have already been through those trials and tribulations with Orange, helping the operator offer OTT services from within its walled garden. But operators still maintain that owning the STB is part of the secret to owning the subscriber, or at least locking him or her in.

Until recently, the lack of standards has meant that operators have had to develop a new portal for most new devices. This has provided yet another argument for those proponents of a tightly controlled device policy, which again ends up meaning that operators want to own the STB.

In the early days, decision-makers considered technology the hard nut to crack. Getting digital video through IP networks and keeping the service up and running did indeed turn out to be really hard. But the technological difficulties were overcome in the end, and the make-or-break issue for IPTV turned out to be content and features. It's been a while since anyone has risked the tired old "content is king" slogan, but it was dominant for a long time. If that 165M€ could have been spent on content rather than STBs, there might well be even more competition from IPTV operators today.

Let's leave the past there. What has changed so that 2010 might be different?

Costs can come down:

As a device vendor Awox sees itself helping move the STB away from its current CAPEX-devouring Achilles heel position, in particular through the use of standards.

Throughout the tech industry, standards have been the best way to lower costs; Linux versus Windows is one such example. Awox is one of the IPTV ecosystem's DLNA champions. Olivier Carmona, the CMO, pointed out that this is particularly true for advanced home networking: many components can be commoditized, so that in a fully DLNA home network a low-end 30€ hard disk simply plugged into an STB becomes a ridiculously cheap NAS. Looking further down the road, Awox have contributed DTCP/IP SINK & DTCP/IP SOURCE to the spec so that DLNA systems will be able to distribute premium content within the home. It's no longer science fiction for that same 30€ hard disk to enable PVR functionality from a DLNA-enabled pay-TV service. This is yet another initiative that goes against traditional STB middleware vendors.

Other reasons:

  • Content owners were badly bruised by the MP3 music phenomenon (I almost wrote debacle there). However, the story is still unfolding and some musicians are living well. Musicians, like the big film studios, have now acknowledged that they must innovate. They are already willing to release content into new distribution channels and even consider entirely new business models.
  • Users have got used to the Internet as a source of content, even if they don't yet get premium TV from it. They expect ready access to whatever is considered free, like YouTube.
  • New initiatives to deliver premium content are still searching for their business models. Some, like Hulu, are bound to find some kind of stability in 2010. In the same vein, many TV stations are eager for a chance to reach out directly to the world's hundreds of millions of broadband subscribers.
  • In this area, the never-ending success of Apple has shown that people, beyond early adopters, will pay if the product, including digital content, is truly desirable.
  • Until now, TV-based widgets have been a gimmick. Indeed, if you want stock quotes in your living room you will use your laptop, smartphone or a tablet. But finally, demos at IBC 2009 (and more at CES, then NAB this year) are showing some really useful widgets. The secret ingredient seems to be interaction with the content itself, which NDS's Oona concept illustrates well.
  • Early adopters have shown that they are prepared to pay for a physical device, as long as it is desirable. Take-up of expensive devices like the Slingbox is good evidence. Some pundits predict the latest TiVo box will reinvent TV yet again in 2010.
  • The advent of home networks has led users to expect some control over what goes into their sitting rooms. DLNA, championed by Awox, will accelerate this further. Empowering users with a wider and constantly renewed choice of devices makes them happy. The marketing message is that the pain of paying is replaced by the power of choice.
  • Operators are scrambling to deliver sexy new 2.0 features. Big companies are rarely successful at this kind of catch-up game. I eagerly await some real figures from Verizon's much-touted Fios Twitter and Facebook implementations to see if we have reached a turning point (I heard at IPTV World Forum in March that only 10% of the user base knew about the social media features).

There are two ways of looking at the OTT box market. Some say that the huge variety of devices, ranging from FetchTV to Myka or Roku through Apple TV, has not yet made a huge impact. I think the glass is half full: there is such a strong and vibrant offer out there, as well as a real demand, that I have no doubt it's just a question of time (in quarters, not years) before one meets the other and we see one of the OTT services turn its huge mind-share into an equivalent market share and then ARPU. TiVo has already shown what success can look like, albeit at a modest scale. If a box were operator-endorsed, that could only help, and the TiVo reincarnation in the UK market with Virgin backing could create a de facto standard.

Google's entry into the TV space is only a question of time. Apple, too, will eventually get it right, and both giants will get a slice of the sitting-room pie. Again, the only sensible way forward for operators is openness, as Martin Peronnet, CEO of Monaco Telecom, pointed out during IPTV World Forum. He cited the way the iPhone's Appstore has diverted ARPU from operators and said, "never again".

With their internal processes, operators are never quick enough to get the time-to-market right on their own. Many big operators are publishing specifications of network APIs. This is, for example, the case with the Orange Telco 2.0 initiative described by Stephan Hadinger during the last World Broadband Forum. The end game is for end users always to have the best-of-breed, sexiest new devices, ones they want enough to pay for. A lightweight certification process could guarantee that basic services all work; any new over-the-top services would be the vendor's responsibility.

Getting rid of a huge financial burden is rewarding enough. But the 165M€ of cost discussed above could become extra revenue instead. Indeed, why would you want a better, newer device if you were not going to use it more often? Even if much of the content revenue goes to over-the-top suppliers, those extra hours will always enable some marginal revenue opportunities. Nothing stops operators from jumping onto any success story as it emerges and delivering their own service, either OTT or in a walled garden. OTT services are bound to flow through different parts of the home network, where Awox's staunch DLNA support makes all the more sense.

In the model my premise suggests, if some technology turns out to be a dead end, that becomes the subscriber's problem. Customers of leading-edge technology expect this to happen from time to time; no one sued their vendor over Betamax or HD DVD, after all.

Sleek new devices are coming to market anyway. Operators must become better at encouraging their customers to use devices over which they still have some influence because they will not retain control for much longer.

Olivier Carmona commented: "Operators don't want the living room and its content-related revenues hijacked by an OTT supplier. Getting the sleekest, newest devices available into subscribers' sitting rooms seems a good proactive strategy." Beyond the technology, which I agree is cool, the true innovation is in the new relationship operators can have with their subscribers.

The whole industry claims the ability to link broadcast content with the interactive experience from the web. With an open standards DLNA approach, Awox believes that it is important to make only the link that best suits the user, the moment, the content and the available hardware.

Operators should consider launching new devices, or peripherals for existing devices, that customers go out and buy in the stores; after all, it will take them at least two years to make a decision ;o)