
UHD will change living room TV forever, if we fix customer-facing interop issues first (Update: Atmos working)

HDR and some NGA are here (well almost). Demos will blow your mind and ears, but beware - it can take a geek a couple of hours to get it working at home.

Here is my personal account, as a simple user, of my road to UHD nirvana in my living room. I wrote this in early September 2016 and updated it at the end of the month, when I finally got Dolby Atmos working (it was worth the wait).

My setup

When I moved to my new flat in central Paris 8 months ago, I immediately got access to an Orange Fibre connection with speeds of up to 800 Mbps. With all the work I’ve been doing on UHD as a member of the Ultra HD Forum, I saw this as an opportunity to test some streaming services in the real world of my sitting room.

I’ve had my Samsung SUHD TV (UE55JS8500) for 6 months now. My first demos were still images from the UHD Zoo app, as it took me a while to find a 4K video that I could actually stream to my TV. Of course, Netflix was available, and although some 4K series have stunning shots that show off all the new pixels, many don’t, even though they sit in the 4K section.

After completing a white paper on object-based sound and getting excited about DTS:X (see here), I went for a mid-to-high-range Onkyo A/V receiver (TX-RZ810B) that was already Dolby Atmos-capable and would be software-upgradable to DTS:X.

Having spent 1,200 € on the receiver, I was no longer ready to splash out on the high-end Atmos speaker system I’d been eyeing. Amazon had a 400 € set of speakers available for next-day delivery, so I went for an Onkyo SKS-HT588(B) system, knowing that once my setup was stable, I’d have to invest in real speakers.

Sound first

A first hurdle for many viewers will be that the TV set-top box is usually set to stereo sound by default. So before getting anything like 5.1 output from that source, one must find the appropriate sub-menu and set the HDMI audio output to what, in my case, Orange calls Home cinema.

The other option (yes, it’s well known that two options make things simpler!) is to use the optical output from the STB, which is always in pass-through mode, and then configure the AV receiver to associate that audio with the video source, the Orange STB in my case.

The Orange TV service has had a single Dolby Atmos sports transmission, but I couldn’t find any next-gen audio in the VoD library, so to get some fancy sound demos it was back to the Internet, where with some difficulty I found Dolby demo files on http://www.demo-world.eu/2d-demo-trailers-hd/.

It turned out none of my devices or software was able to send the Dolby Atmos soundtrack to my AV receiver. I found on an obscure geek chat that the Kodi video player could handle Atmos. So I installed that onto my Mac and, eureka, the Dolby demos played on my TV (connected to my Mac over HDMI). It sounded beautiful, but the word ‘Atmos’ never appeared on the Onkyo receiver, so I’m guessing it just treated the stream as Dolby 7.1. But hey, it sounded really immersive, with the rain falling literally overhead, so who cares? [See the update at the end of this blog; I did finally get Atmos to work from the Orange VoD store.]

Now for some HDR video

Amazon and Netflix have some UHD content, but from their own interfaces it is so far impossible to tell whether there’s any HDR, and I understand Netflix chooses the HDR mode dynamically. So for the next demo, physical media seemed like the only way.

Which UHD Blu-ray player?

After waiting a year for UHD-capable Blu-ray players, I decided to go and get one of the two available in France (the Samsung at 500 € or the Hitachi at 800 €). But when I got to the retail store, the sales guy suggested I get an Xbox One S for 400 €. That would also hopefully get my sixteen-year-old interested, so I went for that option.

Back home with the Xbox unpacked, my next objective was to get UHD Blu-ray discs to play and, at long last, see some real HDR.

Inside the Xbox settings, there is no mention (yet) of HDR itself when trying to select it. You need to know that we are looking for 10-bit colour, and so must select the 30 (!?) bits per pixel option. I later used the top option of 36 bits (i.e. 12 bits per channel), which the TV accepted fine, and the video looked a bit better; strangely, I had more a sense of very high resolution than of amazing colours. There was no HDR wow effect with the Man of Steel Blu-ray I got for free with my Xbox; it just looked very nice.
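
For anyone else hunting through that menu, the arithmetic behind the labels is simple: the Xbox (like HDMI) counts bits per pixel across all three colour channels, while HDR specs count bits per channel. A tiny Python sketch, just to make the mapping concrete:

    # Why "30 bits per pixel" means 10-bit colour: the menu counts all
    # three RGB channels together, while HDR specs count bits per channel.
    for bits_per_pixel in (24, 30, 36):
        bits_per_channel = bits_per_pixel // 3   # R, G and B each get a third
        levels = 2 ** bits_per_channel           # distinct shades per channel
        print(f"{bits_per_pixel} bpp = {bits_per_channel} bits/channel ({levels} levels)")
    # 24 bpp =  8 bits/channel (256 levels)  -> SDR
    # 30 bpp = 10 bits/channel (1024 levels) -> the HDR10 minimum
    # 36 bpp = 12 bits/channel (4096 levels) -> the top option my TV also accepted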

I then got into the Xbox One S’s advanced video parameters, and all of a sudden the word HDR appears. So, going into the 4K TV submenu (note the confusion: it should really be a UHD sub-menu, as we’re talking about HDR and a bit of HFR too), I was all excited to see all the new possibilities.

But let’s not get ahead of ourselves; there was a last hurdle to cross. The Xbox’s Blu-ray player said I had the wrong kind of TV for UHD. It turned out that the AV receiver through which my HDMI signal was passing was not HDCP 2.2-enabled.

In the Inputs sub-menu of the Onkyo AV receiver, I discovered that only HDMI 1 through 3 were HDCP 2.2-capable. That required pulling the TV away from the wall yet again and reassigning the Xbox One S to one of the first three ports (and, of course, reassigning whatever was already there that didn’t need HDCP 2.2 somewhere else). I’ll spare you the screenshot of doing that in the AV receiver’s menus.

Finally, on my Samsung TV I had to hunt down the 14th menu item of the main Picture menu, called HDMI UHD Color, which everyone else calls HDR.

Then, within that Samsung TV submenu, I turned on the HDMI ports that are connected to HDR sources. For each value you change here, the TV needs to reboot (no kidding, it really does).

A couple of hours after I had started, my Samsung TV finally told me I had succeeded: full UHD with HDR, AKA UHD Color. I was of course too hot and bothered at that stage to want to watch anything, but when I have since shown off my new 4K/HDR/NGA setup, the immersive audio demos have persistently drawn the most wows. Hmmm, maybe I should have just bought a new stereo… nah, just kidding 😉

Wrapping up

Putting my professional hat on, I’m still a true believer in UHD and all its promises, but despite having often written about “this being the year for UHD”, I do see a potential blocking point in these customer-facing issues. I trust that folks at the Ultra HD Forum and the UHD Alliance will get to grips with these interoperability teething problems so that the true benefits of UHD aren’t confined to the tech-savvy. I see a great opportunity for operators and their call centres to fix wiring problems today, but also for the CPE suppliers to work on processing HDR, and one day NGA, locally. UHD has to be plug and play to truly take off.

[Update Sept 29 2016: I spent 10 € on a digital copy of Salt (Angelina Jolie) from the Orange VoD store, which had about 9 other movies with Atmos at the time of writing. So yes, I finally got my expensive A/V receiver to actually recognise an Atmos audio stream and generate the right output. The sense of immersion is clearly improved; you really can't tell what's coming out of which speaker any more, and I heard sounds that seemed to come from "in front of a given speaker".

If, on an imaginary quality scale, 1 is bad mono (i.e. the phone), then the jump to good stereo brings a real wow-effect, scoring maybe 5; moving from stereo to 5.1 is another similar wow-effect, say doubling the score to 10. Object-based sound on top of 5.1 (or 5.1.2 in my case) brings another really noticeable improvement, but less of a wow-effect, so I'd subjectively say my current system scores 12 on my imaginary scale.]


Enterprise may drive Internet of Things boom

The Internet of Things (IoT) has reached a critical stage in its evolution, where it seems to be caught between two tipping points, waiting for the final explosion that will come with joined-up applications connecting different domains. The first tipping point came around 2014 with proven single-domain applications and the arrival of big players: Staples in retail, British Gas among energy utilities and ADT in premises security. That was also the year Google acquired smart thermostat leader Nest. The big data centre systems companies also piled in, though more on the enterprise side, such as IBM with a $3 billion investment early in 2015 in its Watson IoT centre based in Munich.

Since then, though, the sheen has come off IoT a little, with mixed signals from the leading players. Google in particular has struggled, rather as it did initially with Android TV, with Nest failing to bring out promised new products and recently calling time on Revolv, the smart home hub for wireless control of end devices that it acquired amid much fanfare in October 2014 and shut down in May 2016. It now looks like Google is pursuing a more distributed approach, promoting direct interoperability among its own Nest devices without any intermediate hub, but that is not yet completely clear.

Another big US technology company, Intel, has also found the IoT sector harder going than it expected, with its IoT Group reporting reduced revenue growth and a 25% year-on-year slump in operating income, down to $132 million for 2015. The common theme here is the failure of IoT to break out of its silos, leaving both companies connecting mostly their own things.

British Gas has fared better largely because as an energy utility it started with the expectation that it would be confined to its own domain for a while before branching out into other smart home sectors such as security and environmental control. The company instead is focusing on developing the analytics tools it believes will enable wider success in a future joined up IoT and has been investing in real time processing of the large data sets generated by its Hive connected thermostat. Hive allows users to control their boilers and central heating systems remotely by phone, which generates 30,000 messages a second amounting to 40 TB of static data so far, distributed across 30 nodes. Like Google, British Gas has created a dedicated IoT subsidiary called Connected Home, which has built an open source software stack running on the Apache Cassandra distributed database to process data both in real time and offline.
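
To make that architecture concrete, here is a minimal sketch of what a Hive-style telemetry write path on Cassandra could look like, using the standard Python driver. The table layout and names are my own illustration, not Connected Home's actual schema:

    # Illustrative telemetry write path on Cassandra (schema is hypothetical).
    from datetime import datetime, timezone
    from cassandra.cluster import Cluster

    cluster = Cluster(["10.0.0.1"])        # contact point: one of the ~30 nodes
    session = cluster.connect("telemetry")

    # Partition by (device, day) so partitions stay bounded; cluster by
    # timestamp so both real-time reads and offline scans stay cheap.
    session.execute("""
        CREATE TABLE IF NOT EXISTS thermostat_readings (
            device_id text, day date, ts timestamp,
            temperature_c double, target_c double, boiler_on boolean,
            PRIMARY KEY ((device_id, day), ts)
        ) WITH CLUSTERING ORDER BY (ts DESC)""")

    insert = session.prepare("""
        INSERT INTO thermostat_readings
            (device_id, day, ts, temperature_c, target_c, boiler_on)
        VALUES (?, ?, ?, ?, ?, ?)""")

    now = datetime.now(timezone.utc)
    session.execute(insert, ("hive-0042", now.date(), now, 19.5, 21.0, True))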

British Gas then is preparing for IoT’s second tipping point, which will come with joined up services that exploit synergy between different domains. IBM shares this conviction from its enterprise-focused perspective, drawing heavily on its cognitive computing work at its Thomas J. Watson Research Centre in New York, with one line being analysis of data from multiple remote sensors for predictive diagnostics. IBM is already enabling Pratt & Whitney to monitor 4,000 commercial engines and obtain early warning of faults that cause costly service outages if left unfixed until later, even if they are not safety critical.

Telcos are of course also intent on capitalizing on the IoT from their position as broadband providers to homes. One early mover is Paris-based SoftAtHome, in which three major telcos are investors: Orange of France, Swisscom and UAE-based Etisalat. The software developer has extended its home operating platform with CloudAtHome to enable centralized control of devices, with potential for integration between domains. All such initiatives must support the key wireless protocols, such as Wi-Fi, Bluetooth and Zigbee, that IoT devices such as thermostats use to communicate. SoftAtHome uses a hybrid model combining some form of home hub and data repository with cloud-based processes. Such a hybrid approach aims to deliver the required flexibility, security (and privacy), performance and functional breadth. Flexibility comes from being able to deploy processes in the cloud or at home as appropriate, while keeping sensitive data within the local repository helps ensure security and privacy. Performance may require some processes to run locally to keep latency down, while some features may need cloud components.

A close look at this cloud/home distribution shows that in some cases the cloud should be partitioned between remote processes that may be executed in a distant data centre (what is usually called the cloud) and intermediate ones that might be best run at the network edge. This is known as Fog Computing, where some storage and processing takes place more locally perhaps in a DSLAM or even a street cabinet. The argument is that as IoT takes off, a lot of the initial data collection and analytics will be best performed at a Fog level before in some cases being fed back to the cloud after aggregation.

Fog could also work well for enterprise IoT, where it might serve as a campus-level control and aggregation layer within a larger cloud-based infrastructure. It could also play a role as enterprise IoT becomes customer-facing rather than mainly concerned with internal or supply chain operations. This could be a third IoT tipping point, bringing together enterprise and consumer IT, if a recent survey from Gartner is to be believed. This found that while only 18 per cent of today’s enterprise IoT deployments are focused on customer experience, this will jump to 34 per cent over the year to Q1 2017. That represents roughly a threefold absolute jump, given that Gartner is forecasting the number of enterprises with IoT deployed somewhere to soar from 29 per cent now to 43 per cent in a year’s time. Gartner also expects IoT to expand into new service-related industry segments such as insurance, beyond the heavier industries like manufacturing, utilities and logistics where it is concentrated now.

Such enterprise IoT forecasts have a history of proving more accurate than some of the over-hyped consumer predictions. This means that if consumer IoT does continue to stall, it may be dragged forward by enterprises seeking competitive advantage as well as new revenues, as we are seeing to an extent with the likes of British Gas.


@nebul2’s NAB 2016 Journal (UHD, HDR, VR, All-IP)

Las Vegas was again focused on UHD in 2016, at least through my eyes. The four keywords I came away with were: 1. UHD (again), 2. HDR, but also 3. VR and 4. All-IP production. Of course other things like drones were important, but I'm not a real journalist; I don't know how to write about things I don't know.

[Photo: the LVCC parking lot, Sunday morning]

We got in from Europe on the Saturday evening, and this year I was on a budget, so we stayed in an Airbnb apartment, my colleague Marta and I. It turned out to be just behind the main LVCC parking lot. In the photo you can see what the parking lot looked like on the Sunday morning, before the show was really underway.

Size and growth of the industry

On the Sunday I sat through part of the "Media Technology Business Summit" run by Devoncroft and learned a bit about the industry trends:

  • Counting from the radio-only shows, this year’s NAB is the 94th annual show, so I suppose in 6 years we’ll have a big bonanza for the 100th. I wonder if we’ll have something like Augmented Reality in 8K by then.
  • Devoncroft sees the global media technology market as worth $49bn in 2015, with the US media industry having pushed revenue per user to the limit. 3,000 vendors make up their industry panel; the 2009-2015 CAGR was 1.9%, with 2014-2015 OpEx spend at -4.2% and CapEx spend at -4.4%.
  • Despite the OTT craze and the loss of traditional subs, ESPN still gets $7/month from linear subscriptions but only $0.42/month from OTT viewers, so hold on to your hats, linear pay-TV ain’t dead quite yet. Beyond sports, Devoncroft argues that even though there is growth, digital revenues are insufficient to replace linear ones. The big issue is how the ad market can transition.
  • 4K and UHD make up the third most important topic for respondents of Devoncroft's 2016 Big Broadcast Survey, the results of which will soon be released. But demand for UHD is less about “more pixels” than about “better pixels”. So according to Devoncroft, as for Ericsson, the HDR vs. 4K debate is all but over.

Virtual and Augmented Reality

I then popped into an Augmented Reality (AR) conference where Gary Acock and Juan Salvo were discussing how to add live content to the Unreal video game engine. AR is seen as bringing the real world into Virtual Reality (VR). Stitching 360° video is still apparently a “pretty unpleasant experience”, and French startup VideoStitch was mentioned as one of the key players working on fixing this. Currently, 360° production design is limited by how effectively you can stitch video. But with AR there are also inherent UX limitations, like parallax issues with head movement, or camera movement when there’s no head movement. With AR one needs to always know where the head is and how it's positioned, as head movements affect the content that is being created.

The amount of data to process for VR can be well over 1 TB/hour, so the coming (?) VR/AR revolution needs powerful GPUs and CPUs.

AR, VR and immersive experiences in general are still moving targets in 2016. But neither AR nor VR is isolated from the broadcast experience anymore. Indeed, VR is seen less as an isolating and lonely experience and more as a new way of engaging, a bit like coming to a conference and interacting with social media on a smartphone at the same time. Content is still king and creating compelling content remains the goal; AR and VR are just more tools. As we still don't have toolsets like an « Adobe for AR/VR », we need to jerry-rig existing tools.

A VR demo that was not at NAB intrigued me. Fraunhofer’s Stephan Steglich told me about FAME. It’s the simple idea of navigating 360° video with a remote control. Two key advantages are removing the isolation of having to wear something over the eyes, and moving all the processing to the cloud, allowing for future-proof deployments. It sounded convincing, but I’ll wait for a compelling demo before forming an opinion.

Showstoppers

I had been told great things about Showstoppers at CES, where it is a big event; at my first experience of it at NAB, it was a much more focused affair, where great food and wine seemed to be as attractive to the media as the companies on show.

German manufacturer Sennheiser was showing off its latest MKE440 DSLR microphone, which it says is the first mini-shotgun to capture an HQ stereo sound image in one take. I was more taken by the beautiful design of the prototype VR microphone that goes under a VR camera.

I met up with V-Nova’s Fabio Murra, who was showing their two OTT deployments based on the Perseus codec. FastFilmz launched on March 26 in India, offering SVoD to a mobile-only Tamil customer base with a potential of 120m subs. There were 350 titles at launch, and according to V-Nova, Perseus made the business case possible in southern India, where only 2G is available in some areas, offering 64-128 kbps of bandwidth. The demo I saw was watchable at 120 kbps using 14 fps (though I had to point that out myself). The Perseus codec is described as “hybrid on top of H264”, with a metadata stream on top of H264. I’ll be looking to dig into this a bit more, as I no longer understand exactly what this means after a heated discussion with several analysts. Content is protected with DRM, though I couldn’t find out whose.

I only glimpsed the other demo, of a 4K STB using OTT delivery. It was showing Tears of Steel at 4 Mbps and looked fine, but without any wow effect, at least for what was on screen then; or maybe I was just too far away from the small screen.

V-Nova had already announced a contribution deal with Eutelsat and promised another one for the next day (which turned out to be Sky Italia).

The Japanese company Brother, which I wrongly thought of as just a printer maker (does any Japanese company do only one thing?), was displaying « Airscouter », a surprising head-mounted monitor designed for cameramen shooting in difficult positions. You see a 720p image in the corner of one eye. It was a bit disconcerting, and I guess it is limited to some very specific use cases. I felt a bit nauseous with it on my head, but it really does work, and I maybe felt a little like Iron Man.

Ultra HD Forum

Monday was taken up with Ultra HD Forum activities for me. We had our own press conference in the morning, and in the afternoon I made a tiny presentation during the Pilot press conference in the Futures Park. I discussed the Forum’s reason for being, its history, our Plugfest #1, the Guidelines 2016 and the general « work in progress » state of live UHD.

« Pilot » is the new name for « NAB Labs », which was started in 2012. We were among 30 exhibitors in Futures Park, which aims to promote « edge of the art » concepts that are not yet commercialized. ATSC 3.0 was the star, with 15 companies focusing on that alone. The other exhibits were very diverse, ranging from commercial R&D to government and academic research. NHK’s 8K Super Hi-Vision was prominent as usual, and the Japanese public broadcaster is still scheduled to launch commercially in 2018, « so people can enjoy the 2020 Japanese Olympics » in glorious 8K HDR with HFR.

Security and analytics

Monday night was over-booked and I chose the Verimatrix media dinner. I had some animated discussions on UHD and the extent to which HDR might be the only big game-changer (I still believe in 4K, but am feeling more and more lonely on that front). Tom Munro, the CEO, gave me a great update on the company’s strategy and its move towards analytics, which I now understand can be a logical progression for a security vendor: if the financial transactions are precious enough to secure, then private usage data is worthy of the same efforts. More on that in a dedicated blog soon.

The satellite industry is on the edge of a cliff; might UHD save it?

On Tuesday I got myself to the Satellite industry day. I have this vision of the industry (at least the broadcast and telecoms parts of it) sitting on the edge of a cliff, wondering when fiber, 5G and delinearization will push it off the edge.

Despite a great lineup, with Caleb Henry of Via Satellite magazine, Steve Corda, VP of business development at SES, Markus Fritz of Eutelsat, Dan Miner of AT&T and Peter Ostapiuk of Intelsat, the opening panel didn’t really give me any new ideas for tackling that problem.

AT&T in particular sees similarities between the move from SD to HD and that from HD to UHD, but Intelsat sobered the audience by asking how the content industry will make money from the upgrade to UHD. SES’s Steve Corda made it scarier still, reminding the audience that during the upgrade from SD to HD there was no competition from OTT, as there is now, with most early UHD coming from OTT suppliers.

The satellite industry panel agreed that demand for UHD channels is growing, especially from their cable operator clients, and that the bottleneck is still available content. AT&T's Dan Miner noted that a key change in OTT delivery in the coming 18 months is that US data plans will enable TV Everywhere on cellular networks.

The consensus is that to have a monetizable UHD offering you need a bouquet of at least 2 channels, and ideally at least 5, including sports.

When the panel went round enumerating their live 4K services, I counted about a dozen UHD linear channels, as many demo channels, and a few event-based channels.

One of Viasat’s founders, Mark Dankberg, gave an inspirational talk reassuring the audience that the satellite industry’s future is safe, at least if they copy Viasat. The merger of AT&T and DirecTV is, to him, an indicator that satellite without broadband is no longer viable in the long term. Viasat started in 1986 in defense; during the ’90s it got into VSAT data networking, just on the B2B side. Dankberg believes high-orbit geostationary is still the way to go (instead of mid or low earth orbit (LEO)) because it’s the best way to optimize resources with thousands of beams. He points out that 95% of demand is in 15% of the geography; LEO satellites, which orbit the whole earth, can't concentrate capacity like that. I was enthused by his talk and hoped to get home and write a blog about it, but when I looked through my notes I realized that in the end there wasn’t any new information, just the charisma and communicative beliefs of an industry veteran.

TV Middleware on Android

Beenius, the middleware guys from Slovenia that I’ve written about a few times, caught me in the south hall, so I went to have a look.

Demonstrating their new version 4.2 core product, Beenius told me that the EPG is dead, but still went ahead and showed me theirs. Navigation is via genres, with favorite channels on top of a carousel that mixes live and VoD. Recommendation currently uses their own algorithms but can be based on ThinkAnalytics, with « trending » content on the second line.

The company is very Google-centric, although it still has a Linux offering with a hybrid DVB solution. They clarified to me how Google Play apps can be controlled by the TV operator, with three different approaches:

  1. Preinstalled apps and an open Google Play
  2. A « walled garden », where the user chooses apps from the operator’s list, typically a dozen or so including YouTube, Netflix, etc.
  3. Apps already embedded into the UI, which is also a closed model.

VoD also benefits from integrated recommendation, but is open to extra info from the Web, such as IMDb content.

Beenius haven’t had much interaction with 4K yet, although they say they are ready. As with any competitive TV middleware you can fling content from screen to screen.

The operator-controlled UI can be updated from a central server, so that a new version of the app gets automatically pushed to the STB via Google Play as soon as it is closed and reopened. Playing in the Google arena has enabled a full-featured app for Android-powered smart TVs; Beenius just needs Google to finally get it right in the living room.

Automatically generated HDR

Ludovic Noblet of the French research institute b<>com showed me a tool to up-convert SDR content to HDR. He sees it as a gap-filler for legacy setups; it is already available for offline use, with a real-time version planned for IBC 2016. The current version introduced a latency of just 3 frames and was convincing, even if it didn’t carry the amazing wow-effect of some native HDR content. He was very secretive about the first customers but seemed very confident.

The pull of social media

On the last day I had a quick stop at Texas Instruments’ tiny booth, simply because they engaged with me on Twitter ;o)

The LMH1219 is a 12G-SDI chip that enables SDI cables to run up to 110m without signal attenuation, instead of the usual 20-30m. Its UltraScale processing equalizes and improves the signal. The TI chip is agnostic to metadata, so it should work fine with HDR, for example.

Another hardware innovation they showed me was a single chip that can work in receive (cable EQ) or drive (TX) mode, making BNC connectors more versatile, as they needn't be just IN or OUT but can be either. The device isn’t available yet, nor does it have a product name; launch is expected in Q1 2017.

Note that I didn’t interact with any of the all-IP production vendors; I just noted it as a buzzing theme in conferences and on booth signage.

Oh, and this is what the Convention Centre car park looked like from our apartment window by 9:30 am, Monday through Wednesday:

[Photo: the LVCC parking lot during the show]

That’s all for now folks.


Virtualization approaches final frontiers in the home

Virtualization has been around almost as long as business computing, having been invented by IBM in the 1970s so that “big iron” mainframes could mimic smaller machines for economies of scale. Later, after personal computers arrived, it reached the desktop with products like SoftPC, allowing Apple Mac computers to run Microsoft’s Windows operating system and associated applications.

Another 10 years on, the scope of virtualization expanded during the noughties to allow separation of hardware and software outside the data center, in networking equipment such as routers and firewalls, and then finally the TV industry joined the party. Even that was not the end of the story, since virtualization is now beating a path into the home, not just for the gateway or set-top boxes, but right to the ultimate client, whether a user’s PC or even an Internet of Things (IoT) device like a thermostat.

Over time the motivations have evolved subtly, so that virtualization became more about being able to exploit lower cost and more flexible commodity hardware than getting the best value out of a few large computers and exploiting their superior capabilities in various areas such as resilience and security. But now as virtualization comes together with the cloud there is another dimension, which is to enable much greater flexibility over where both hardware and software are deployed.

This shift to virtualization around the cloud has been aided by major standardization efforts, especially the open source initiative OpenFlow, which defines the interface between the control and forwarding layers of an SDN (Software Defined Network) architecture. SDN enables traditional networking functions, notably routing from node to node across IP networks, to be split between packet forwarding, which can be done locally on commodity hardware, and the higher level control logic, which can run remotely somewhere in the cloud if desired. OpenFlow then enables a physical device in the home, such as a gateway, to be “bridged” to its virtual counterpart within the network.
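
The division of labour OpenFlow standardizes is easy to picture in code. Here is a deliberately toy Python model of the split (the class names and logic are illustrative only, not the actual OpenFlow protocol): a dumb forwarder in the home keeps a flow table, punts unknown flows to a controller in the cloud, then handles everything else locally.

    # Toy model of the SDN control/forwarding split (illustrative only).
    class Controller:
        """Cloud-side control logic: decides where traffic should go."""
        def __init__(self, policy):
            self.policy = policy                 # e.g. flow -> output port

        def decide(self, flow):
            return self.policy.get(flow, "default-port")

    class Forwarder:
        """Gateway-side data plane: fast table lookups only."""
        def __init__(self, controller):
            self.controller = controller
            self.flow_table = {}                 # local cache of decisions

        def handle(self, flow):
            if flow not in self.flow_table:      # table miss: ask the controller
                self.flow_table[flow] = self.controller.decide(flow)
            return self.flow_table[flow]         # later packets stay local

    gw = Forwarder(Controller({"iot-sensor": "quarantine-port"}))
    print(gw.handle("iot-sensor"))   # first packet consults the cloud
    print(gw.handle("iot-sensor"))   # subsequent packets are switched at home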

The key point here is that not all home gateway functions should be hived off to the cloud, since for example sensitive personal data may be best stored at home perhaps on a NAS (Network Attached Storage) device. It may also be that some processes will run more effectively locally for performance or security reasons, including some associated with the IoT. Virtualization combined with the cloud via OpenFlow allows this flexibility such that functions as well as underlying hardware can be located optimally for given services without incurring a cost penalty.

Just as IBM broke the ground for virtualization in the data center, we are now seeing virtualization reach into the home. Orange founded the French software company SoftAtHome in 2007 so it could deploy hardware-independent home gateways. Other vendors have since joined the fray, with Alcatel-Lucent (now Nokia) among the leaders with its vRGW (virtualized Residential Gateway) portfolio. Nokia, like SoftAtHome, argues that with their products operators can turn up new and innovative services faster, while reducing CAPEX and OPEX for existing and new services. Updates can be applied centrally without having to replace hardware or visit homes, as has been common practice in the data center for some years.

Not surprisingly, then, some technology vendors have come into the virtualized home gateway area from the enterprise arena. One of these is Japanese IT giant NEC with its networking software subsidiary NetCracker, which took Austrian incumbent Telekom Austria through an in-depth trial of virtualized customer premises equipment (vCPE). This integrated SDN technology with virtual network functions (VNFs) through a common service and network orchestration platform, which also involved technology from other vendors. The telco cited as a key benefit the ability to have a single point of delivery for home media and entertainment content.

Now virtualization is approaching its next frontier in the IoT arena, where the motivation shifts yet again. One challenge for IoT is to be able to configure generic devices for a range of applications, rather than having to make dedicated hardware for each one. This is again about being able to use off-the-shelf hardware for a range of services, but this time the commoditization must occur down at the chip level. This calls for embedded virtualization, so that small single-chip devices such as sensors can be remotely programmed and repurposed in the field. Apart from flexibility and cost reduction, embedded virtualization will confer greater security and real-time performance, since operations are executed within a single SoC (System on Chip). Even this is not entirely new, since embedded virtualization has emerged in other sectors such as the automotive industry, where again there is a need for field upgradeability, given that vehicles as a whole now have a longer life cycle than many of the underlying software-based components.

The real challenge for broadband operators will be to capitalize on end to end virtualization extending across the home network, which presents an opportunity to key vendors like Nokia and SoftAtHome to smooth the path.


Measurement key to monetizing mobile video

Measuring mobile video audiences and associated ad engagement is one of the greatest challenges facing the pay TV industry, with big rewards for getting it right. Mobile video has surged over the last year, with phones and tablets accounting for 46 per cent of all online viewing globally during Q4 2016, up from 34 per cent a year earlier, according to video technology vendor Ooyala. Ad spending is moving with the eyeballs and in the UK for example more of it will be on mobile than mainstream TV for the first time this year, £4.58 billion ($7 billion) against £4.18 billion ($6.39 billion), according to eMarketer.

While some pay TV operators may have reasonable visibility over viewing on desktops, mobile devices raise complexity to another dimension. On desktops, access to web sites and services is almost all via browsers, but on mobiles these account for only a minority of viewing. It is true that the majority of web sites are accessed from mobiles via the browser too, since individual users obviously only have room for a certain number of apps on their devices. But apps account for the great majority of time spent on mobiles, and also for most traffic, because users tend to hang out in just a few places. Those places are accessed via apps rather than the browser, including the likes of Facebook, Google Maps and WeChat. However, an interesting and relevant trend for operators during 2016, highlighted by analyst group Forrester, is that users are increasingly turning towards aggregation apps to access the content they want.

When access is predominantly via a browser, as on the desktop PC, cookies can be used to track viewing activity and measure ad engagement. But cookies do not work well in the mobile world, because activity is partitioned between the mobile browser and the various apps, isolated from each other via sandboxing, which is a fundamental property of both dominant mobile OSs, Android and Apple's iOS. Web sites accessed within apps open via dedicated custom browsers, which means they cannot interact with persistent cookies on the device, and that precludes use of proven desktop measurement tools. On iOS devices the situation is just as bad even for sites accessed via the mobile browser, because Apple prohibits the use of third-party cookies.

There are also higher level challenges for mobile TV advertising such as defining how long people should watch an ad for it to count as having been viewed, given that attention spans are shorter on small screens. The situation is similar for the actual TV content, where the value of mobile viewing can depend on context, being particularly high when there is synergy with the big screen for example to resume watching something started earlier.

The overall challenge then is to integrate audience measurement and analytics across all screens including mobile to deliver consistent information that takes account of differences in context and engagement across the different platforms. There are now plenty of tools available for tracking activity on the mobile side, but integrating them within a coherent end to end measurement and analytics system is highly complex. Some big operators are attempting to do this in-house but increasingly even they are turning to specialist TV audience companies to enable the integration.

One example is UK-based TV analytics firm Genius Digital, offering two services which can be combined or stand alone. The first is Real Time Data Collection for reporting viewing data across all devices, based on multiscreen libraries that can be embedded into mobile or web applications to enable monitoring of video consumption, profile management, and performance and quality management on JavaScript, iOS and Android devices. The second is the Multiscreen Data Service (MDS), designed to extract viewing data from apps, even those from third parties. A key benefit of this approach lies in marrying viewing information from these different apps, each of which will normally use different metrics, to provide consistent information about engagement with channels or specific programs, for integration with traditional set-top box return path data.

Another TV analytics company TVbeat, also UK based, has moved in a similar direction, in this case through a partnership with a dedicated TV app company Metrological. This has enabled TVbeat to meld set top data with mobile device return path and app consumption information from Metrological’s Application Platform.

Such developments ease the pain of mobile audience measurement for pay TV operators, and we expect more operators that have previously relied solely on in-house development to at least consider working with one of the specialist analytics companies that are in a better position to aggregate data from many sources. With mobiles accounting for a rapidly increasing proportion of both viewing and ad budgets, operators need to fold mobile measurement into their existing actionable data analytics.


The State of #HDR in Broadcast and OTT – CES 2016 update

By Yoeri Geutskens

This article was first published in December 2015, but has been updated post-CES 2016 (corrections on Dolby Vision, UHD Alliance's "Ultra HD Premium" specification and the merging of Technicolor and Philips HDR technologies).

A lot has been written about HDR video lately, and from all of this perhaps only one thing becomes truly clear – that there appear to be various standards to choose from. What’s going on in this area in terms of technologies and standards? Before looking into that, let’s take a step back and look at what HDR video is and what’s the benefit of it.

Since 2013, Ultra HD or UHD has emerged as a major new consumer TV development. UHD, often also referred to as ‘4K’, has a resolution of 3,840 x 2,160 – twice the horizontal and twice the vertical resolution of 1080p HDTV, so four times the pixels. UHD has been pushed above all by TV manufacturers looking for new ways to entice consumers to buy new TV sets. To appreciate the increased resolution of UHD one needs a larger screen or a smaller viewing distance, but it serves a trend towards ever larger TV sizes.

While sales of UHD TV sets are taking off quite briskly, the rest of the value chain isn’t following as fast. Many involved feel the increased spatial resolution alone is not enough to justify the required investments in production equipment. Several other technologies promising further enhanced video are around the corner, however. They are:

  • High Dynamic Range or HDR
  • Deep Color Resolution: 10 or 12 bits per subpixel
  • Wide Color Gamut or WCG
  • High Frame Rate or HFR: 100 or 120 frames per second (fps)

As for audio, a transition from conventional (matrixed or discrete) surround sound to object-based audio is envisaged for the next generation of TV.

Of these technologies, the first three are best attainable in the short term. They are also interrelated.

So what does HDR do? Although it’s using rather different techniques, HDR video is often likened to HDR photography as their aims are similar: to capture and reproduce scenes with a greater dynamic range than traditional technology can, in order to offer a more true-to-life experience. With HDR, more detail is visible in images that would otherwise look either overexposed, showing too little detail in bright areas, or underexposed, showing too little detail in dark areas.

HDR video is typically combined with a feature called Wide Color Gamut or WCG. Traditional HDTVs use a color space referred to as Rec.709, which was defined for the first generations of HDTVs which used CRT displays. Current flat panel display technologies like LCD and OLED can produce a far wider range of colors and greater luminance, measured in ‘nits’. A nit is a unit for brightness, equal to candela per square meter (cd/m2). To accommodate this greater color gamut, Rec.2020 color space was defined. No commercial display can fully cover this new color space but it provides room for growth. The current state of the art of color gamut for displays in the market is a color space called DCI-P3 which is smaller than Rec.2020 but substantially larger than Rec.709.
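
To put rough numbers on "substantially larger", here is a quick Python sketch comparing the triangles that the three sets of published primaries span in the CIE 1931 xy diagram. Triangle area in xy is only a crude proxy for perceivable gamut, but it illustrates the ordering:

    # Compare gamut triangle areas in CIE 1931 xy space (a crude proxy).
    def triangle_area(p):
        (x1, y1), (x2, y2), (x3, y3) = p
        return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

    gamuts = {  # published red, green, blue primary chromaticities
        "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
        "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
        "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    }

    ref = triangle_area(gamuts["Rec.709"])
    for name, primaries in gamuts.items():
        print(f"{name}: {triangle_area(primaries) / ref:.2f}x the Rec.709 area")
    # Rec.709: 1.00x, DCI-P3: ~1.36x, Rec.2020: ~1.89x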

To avoid color banding issues that could otherwise occur with this greater color gamut, HDR/WCG video typically uses a greater sampling resolution of 10 or 12 bits per subpixel (R, G and B) instead of the conventional 8 bits, so 30 or 36 bits per pixel rather than 24.

Color/luminance volume: BT.2020 (10,000 nits) versus BT.709 (100 nits); Yxy
Image credit: Sony

The problem with HDR isn’t so much on the capture side nor on the rendering side – current professional digital cameras can handle a greater dynamic range and current displays can produce a greater contrast than the content chain in between can handle. It’s the standards for encoding, storage, transmission and everything else that needs to happen in between that are too constrained to support HDR.

So what is being done about this? A lot, in fact. Let’s look at the technologies first. A handful of organizations have proposed technologies for describing HDR signals for capture, storage, transmission and reproduction. They are Dolby, SMPTE, Technicolor, Philips, and BBC together with NHK. Around the time of CES 2016, Technicolor and Philips have announced they are going to merge their HDR technologies.

Dolby’s HDR technology is branded Dolby Vision. One of the key elements of Dolby Vision is the Perceptual Quantizer EOTF, which has been standardized by SMPTE as ST 2084 (see box: SMPTE HDR Standards) and mandated by the Blu-ray Disc Association for the new Ultra HD Blu-ray format. The SMPTE ST 2084 format can actually contain more picture information than today's TVs can display, but because the information is there, the content has the potential to look better as new, improved display technologies come to market. Dolby Vision and HDR10 use the same SMPTE ST 2084 standard, making it easy for studios and content producers to master once and deliver to either HDR10 or, with the addition of dynamic metadata, Dolby Vision. The dynamic metadata is not an absolute necessity, but using it guarantees the best results when played back on a Dolby Vision-enabled TV. HDR10 uses static metadata, which ensures it will still look good – far better than Standard Dynamic Range (SDR). Even using no metadata at all, SMPTE ST 2084 can work at an acceptable level, just as other proposed EOTFs without metadata do.
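
Since ST 2084 keeps coming up, it may help to see how small the PQ EOTF actually is. A Python sketch using the constants from the published standard (simplified to full-range code values; real video signals usually use the narrower 64-940 range):

    # SMPTE ST 2084 (PQ) EOTF: 10-bit code value -> absolute luminance in nits.
    def pq_eotf(code, bit_depth=10):
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        e = code / (2 ** bit_depth - 1)              # normalized signal, 0..1
        p = e ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    for code in (0, 512, 1023):
        print(f"code {code:4d} -> {pq_eotf(code):8.2f} nits")
    # Code 0 is absolute black, code 512 lands around 92 nits and the top
    # code maps to 10,000 nits: PQ is display-referred and absolute.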

For live broadcast, Dolby supports both single- and dual-layer 10-bit distribution methods and has come up with a single workflow that can simultaneously deliver an HDR signal to the latest generation of (and future) TVs and a derived SDR signal to support all legacy TVs. The signal can be encoded in HEVC or AVC. Not requiring dual workflows will be very appealing to all involved in content production, and the system is flexible enough to let the broadcaster choose where to derive the SDR signal. If it’s done at the head-end, they can choose to simply simulcast it as another channel, or convert the signal to a dual-layer single-stream signal at the distribution encoder for transmission. Additionally, the HDR-to-SDR conversion can be built into set-top boxes for maximum flexibility without compromising the SDR or HDR signals. Moreover, the SDR distribution signal that’s derived from the HDR original using Dolby’s content mapping unit (CMU) is significantly better in terms of detail and color than one that’s captured natively in SDR, as Dolby demonstrated side by side at IBC 2015. The metadata is only produced and multiplexed into the stream at the point of transmission, just before or in the final encoder – not in the baseband workflow. Dolby uses 12-bit color depth for cinematic Dolby Vision content to avoid any noticeable banding, but the format is actually agnostic to color depth and works with 10-bit video as well. In fact, Dolby recommends 10-bit color depth for broadcast.

High-level overview of Dolby Vision dual-layer transmission for OTT VOD;
other schematics apply for OTT live, broadcast, etc. 
Image credit: Dolby Labs Dolby Vision white paper

Technicolor has developed two HDR technologies. The first takes a 10-bit HDR video signal from a camera and delivers a video signal that is compatible with SDR as well as HDR displays. The extra information that is needed for the HDR rendering is encoded in such a way that it builds on top of the 8-bit SDR signal but SDR devices simply ignore the extra data.

Image credit: Technicolor

The second technology is called Intelligent Tone Management and offers a method to ‘upscale’ SDR material to HDR, using the extra dynamic range that current-day capture devices can provide but traditional encoding cannot handle, and providing enhanced color grading tools to colorists. While it remains to be seen how effective and acceptable the results are going to be, this technique has the potential to greatly expand the amount of available HDR content.

Having a single signal that delivers SDR to legacy TV sets (HD or UHD) and HDR to the new crop of TVs is also the objective of what BBC’s R&D department and Japan’s public broadcaster NHK are working on together.  It’s called Hybrid Log Gamma or HLG. HLG’s premise is an attractive one: a single video signal that renders SDR on legacy displays but HDR on displays that can handle this. HLG, BBC and NHK say, is compatible with existing 10-bit production workflows and can be distributed using a single HEVC Main 10 Profile bitstream.

Depending on whom you ask, HLG is either the best thing since sliced bread or a clever compromise that accommodates SDR as well as HDR displays but gives suboptimal results and looks great on neither. The Hybrid Log Gamma name refers to the fact that the OETF is a hybrid that applies a conventional gamma curve to low-light signals and a logarithmic curve to the high tones.
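
The hybrid shape is easy to see in code. A sketch of the HLG OETF as specified in ITU-R BT.2100, with its published constants:

    import math

    # HLG OETF (BT.2100): square-root curve for the darker part of the range,
    # logarithmic curve for the highlights.
    A = 0.17883277
    B = 1 - 4 * A                  # 0.28466892
    C = 0.5 - A * math.log(4 * A)  # 0.55991073

    def hlg_oetf(e):
        """Map normalized scene light e in [0, 1] to a signal value in [0, 1]."""
        if e <= 1 / 12:
            return math.sqrt(3 * e)              # conventional gamma branch
        return A * math.log(12 * e - B) + C      # logarithmic branch

    # The branches meet at e = 1/12 (signal 0.5), which is why the signal
    # remains watchable on an SDR display while still carrying highlights.
    for e in (0.0, 1 / 12, 0.5, 1.0):
        print(f"scene {e:.4f} -> signal {hlg_oetf(e):.4f}")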

Hybrid Log Gamma and SDR OETFs; image credit: T. Borer and A. Cotton, BBC R&D

Transfer functions:

  • OETF: function that maps scene luminance to digital code value; used in HDR camera;
  • EOTF: function that maps digital code value to displayed luminance; used in HDR display;
  • OOTF: function that maps scene luminance to displayed luminance; a function of the OETF and EOTF in a chain. Because of the non-linear nature of both OETF and EOTF, the chain’s OOTF also has a non-linear character.

Image credit: T. Borer and A. Cotton, BBC R&D

The EOTF for Mastering Reference Displays, conceived by Dolby and standardized by SMPTE as ST 2084, is 'display-referred'. With this approach, the OOTF is part of the OETF, requiring implicit or explicit metadata.

Hybrid Log Gamma (HLG), proposed by BBC and NHK, is a 'scene-referred' system which means the OOTF is part of the EOTF. HLG does not require mastering metadata so the signal is display-independent and can be displayed unprocessed on an SDR screen.

The reasoning is simple: bandwidth is scarce, especially for terrestrial broadcasting but also for satellite and even cable, so transmitting the signal twice in parallel, in SDR and HDR, is not an attractive option. In fact, most broadcasters are far more interested in adding HDR to 1080p HD channels than in launching UHD channels, for exactly the same reason. Adding HDR is estimated to consume up to 20% extra bandwidth at most, whereas a UHD channel gobbles up the bandwidth of four HD channels. It’s probably no coincidence HLG technology has been developed by two broadcast companies that have historically invested a lot in R&D. Note however that the claimed backwards compatibility of HLG with SDR displays only applies to displays working with Rec.2020 color space, i.e. Wide Color Gamut. This more or less makes its main benefit worthless.
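
To put that bandwidth argument in numbers: assuming an illustrative 5 Mbps HEVC 1080p channel (the base rate is my assumption; the 20% HDR overhead and the four-to-one UHD ratio are the figures above):

    hd = 5.0              # Mbps, assumed HEVC 1080p SDR channel
    hd_hdr = hd * 1.20    # HDR adds at most ~20%
    uhd = hd * 4          # a UHD channel takes roughly four HD channels' worth

    print(f"HD+HDR: {hd_hdr:.0f} Mbps vs plain UHD: {uhd:.0f} Mbps")
    # 6 Mbps vs 20 Mbps: for one UHD channel's bandwidth a broadcaster could
    # carry three HD channels and still add HDR to all of them.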

ARIB, the Japanese organization that’s the equivalent of DVB in Europe and ATSC in North America, has standardized upon HLG for UHD HDR broadcasts.

The DVB Project meanwhile has recently announced that UHD-1 Phase 2 will actually include a profile that adds HDR to 1080p HD video – a move advocated by Ericsson and supported by many broadcasters. Don’t expect CE manufacturers to start producing HDTVs with HDR, however. Such innovations are likely to end up only in the UHD TV category, where the growth is and where any innovation outside of cost reductions takes place.

This means consumers will need a HDR UHD TV to watch HD broadcasts with HDR. Owners of such TV sets will be confronted with a mixture of qualities – plain HD, HD with HDR, plain UHD and UHD with HDR (and WCG), much in the same way HDTV owners may watch a mix of SD and HD television, only with more variations.

The SMPTE is one of the foremost standardization bodies active in developing official standards for the proposed HDR technologies. See box ‘SMPTE HDR standards’.

SMPTE HDR Standards

ST 2084:2014 - High Dynamic Range EOTF of Mastering Reference Displays

  • defines 'display referred' EOTF curve with absolute luminance values based on human visual model
  • called Perceptual Quantizer (PQ)

ST 2086:2014 - Mastering Display Color Volume Metadata supporting High Luminance and Wide Color Gamut images

  • specifies mastering display primaries, white point and min/max luminance

Draft ST 2094:201x - Content-dependent Metadata for Color Volume Transformation of High-Luminance and Wide Color Gamut images

  • specifies dynamic metadata used in the color volume transformation of source content mastered with HDR and/or WCG imagery, when such content is rendered for presentation on a display having a smaller color volume

One other such body is the Blu-ray Disc Association (BDA). Although physical media have been losing some popularity with consumers lately, few people are blessed with a broadband connection fast enough to handle proper Ultra HD video streaming, with or without HDR. Netflix requires at least 15 Mbps sustained average bitrate for UHD watching but recommends at least 25 Mbps. The new Ultra HD Blu-ray standard meanwhile offers up to 128 Mbps peak bit rate. Of course one can compress Ultra HD signals further, but the resulting quality loss would defeat the entire purpose of Ultra High Definition.

Ultra HD Blu-ray may be somewhat late to the market, with some SVOD streaming services having beaten it to it, but the BDA deserves praise for not rushing the new standard to launch without HDR support. Had they done that, the format may very well have been declared dead on arrival. The complication, of course, was that there was no single agreed-upon standard for HDR yet. The BDA has settled on the HDR10 Media Profile (see box) as mandatory for players and discs, with Dolby Vision and Philips’ HDR format optional for players as well as discs.

HDR10 Media Profile

  • EOTF: SMPTE ST 2084
  • Color sub-sampling: 4:2:0 (for compressed video sources)
  • Bit depth: 10 bit
  • Color primaries: ITU-R BT.2020
  • Metadata: SMPTE ST 2086, MaxFall (Maximum Frame Average Light Level), MaxCLL (Maximum Content Light Level)

Referenced by:

  1. Ultra HD Blu-ray spec (Blu-Ray Disc Association)
  2. HDR-compatible display spec (CTA; former CEA)
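
To make the profile concrete, here is a sketch of HDR10's static metadata modelled as a simple data structure, grouping the fields listed above (ST 2086 mastering display information plus the two content light levels). The class layout is my own illustration, not from any standard API:

    from dataclasses import dataclass

    @dataclass
    class MasteringDisplayMetadata:          # SMPTE ST 2086
        red_primary: tuple[float, float]     # CIE 1931 xy chromaticities
        green_primary: tuple[float, float]
        blue_primary: tuple[float, float]
        white_point: tuple[float, float]
        max_luminance: float                 # nits
        min_luminance: float                 # nits

    @dataclass
    class HDR10StaticMetadata:
        mastering_display: MasteringDisplayMetadata
        max_cll: int                         # Maximum Content Light Level, nits
        max_fall: int                        # Maximum Frame-Average Light Level, nits

    # e.g. a P3-primaries master graded on a 1,000-nit reference display:
    meta = HDR10StaticMetadata(
        MasteringDisplayMetadata((0.680, 0.320), (0.265, 0.690), (0.150, 0.060),
                                 (0.3127, 0.3290), 1000.0, 0.0001),
        max_cll=1000, max_fall=400)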

UHD Alliance ‘Ultra HD Premium’ definition

Display:
  • Image resolution: 3840×2160
  • Color bit depth: 10-bit signal
  • Color palette: signal input of BT.2020 color representation; display reproduction of more than 90% of P3 color space
  • High dynamic range: SMPTE ST 2084 EOTF, plus a combination of peak brightness and black level that is either more than 1,000 nits peak brightness and less than 0.05 nits black level, or more than 540 nits peak brightness and less than 0.0005 nits black level

Content:
  • Image resolution: 3840×2160
  • Color bit depth: minimum 10-bit signal depth
  • Color palette: BT.2020 color representation
  • High dynamic range: SMPTE ST 2084 EOTF; mastering displays recommended to exceed 1,000 nits brightness, with less than 0.03 nits black level and a minimum of DCI-P3 color space

Distribution:
  • Image resolution: 3840×2160
  • Color bit depth: minimum 10-bit signal depth
  • Color palette: BT.2020 color representation
  • High dynamic range: SMPTE ST 2084 EOTF

The UHD Alliance mostly revolves around Hollywood movie studios and is focused on content creation and playback, guidelines for CE devices, branding and consumer experience. At CES 2016, the UHDA announced a set of norms for displays, content and distribution to deliver UHD with HDR, and an associated logo program. The norm is called ‘Ultra HD Premium’ (see box). Is it a standard? Arguably, yes. Does it put an end to any potential confusion over different HDR technologies? Not quite – while the new norm guarantees a certain level of dynamic range, it does not specify any particular HDR technology, so all options are still open.

The Ultra HD Forum meanwhile focuses on the end-to-end content delivery chain including production workflow and distribution infrastructure.

In broadcasting we’ve got ATSC in North America defining how UHD and HDR should be broadcast over the air with the upcoming ATSC 3.0 standard (also used in South Korea) and transmitted via cable. Here, the SCTE comes into play as well. Japan has the ARIB (see above) and for most of the rest of the world, including Europe, there’s the DVB Project, part of the EBU, specifying how UHD and HDR should fit into the DVB standards that govern terrestrial, satellite and cable distribution.

In recent news, the European Telecommunications Standards Institute (ETSI) has launched a new Industry Specification Group (ISG) “to work on a standardized solution to define a scalable and flexible decoding system for consumer electronics devices from UltraHD TVs to smartphones” which will look at UHD, HDR and WCG. Founding members include telcos BT and Telefónica. The former already operates a UHD IPTV service; the latter is about to launch one.

Then there are CTA (Consumer Technology Association, formerly known as CEA) in the US and DigitalEurope dealing with guidelines and certification programs for consumer products. What specifications does a product have to support to qualify for ‘Ultra HD’ branding? Both have formulated answers to that question. It has not been a coordinated effort but fortunately they turn out to almost agree on the specs. Unity on a logo was not as feasible, sadly. The UHD Alliance has just announced they’ve settled on a definition of Ultra HD they’ll announce at CES, January 4th, 2016. One can only hope this will not lead to yet more confusion (and more logos) but I’m not optimistic.

By now, the CTA has also issued guidelines for HDR. DigitalEurope hasn’t yet. It’d be great for consumers, retailers and manufacturers alike if the two organizations could agree on a definition as well as a logo this time.

Ultra HD display definitions

CTA definition:
  • Resolution: at least 3840×2160
  • Aspect ratio: 16:9 or wider
  • Frame rate: supporting 24p, 30p and 60p
  • Chroma subsampling: not specified
  • Color bit depth: minimum 8-bit
  • Colorimetry: BT.709 color space; may support wider colorimetry standards
  • Upconversion: capable of upscaling HD to UHD
  • Digital input: one or more HDMI inputs supporting HDCP 2.2 or equivalent content protection
  • Audio: not specified
  • Logo: CTA Ultra HD logo

DigitalEurope definition:
  • Resolution: at least 3840×2160
  • Aspect ratio: 16:9
  • Frame rate: 24p, 25p, 30p, 50p, 60p
  • Chroma subsampling: 4:2:0 for 50p, 60p; 4:2:2 for 24p, 25p, 30p
  • Color bit depth: minimum 8-bit
  • Colorimetry: minimum BT.709
  • Upconversion: not specified
  • Digital input: HDMI with HDCP 2.2
  • Audio: PCM 2.0 stereo
  • Logo: DigitalEurope Ultra HD logo

CTA definition of HDR-compatible:

A TV, monitor or projector may be referred to as a HDR Compatible Display if it meets the following minimum attributes:

  1. Includes at least one interface that supports HDR signaling as defined in CEA-861-F, as extended by CEA-861.3.
  2. Receives and processes static HDR metadata compliant with CEA-861.3 for uncompressed video.
  3. Receives and processes HDR10 Media Profile* from IP, HDMI or other video delivery sources. Additionally, other media profiles may be supported.
  4. Applies an appropriate Electro-Optical Transfer Function (EOTF), before rendering the image.

CEA-861.3 references SMPTE ST 2084 and ST 2086.

What are consumers, broadcasters, TV manufacturers, technology developers and standardization bodies to do right now?

I wouldn’t want to hold any consumer back, but I couldn’t blame them if they decided to postpone purchasing a new TV a little longer, until standards for HDR have been nailed down. Similarly, for broadcasters and production companies it only seems prudent to postpone making big investments in HDR production equipment and workflows.

For all parties involved in technology development and standardization, my advice would be as follows. It’s inevitable we’re going to see a mixture of TV sets with varying capabilities in the market – SDR HDTVs, SDR UHD TVs and HDR UHD TVs, and that’s not even taking into consideration near-future extensions like HFR.

Simply ignoring some of these segments would be a very unwise choice: cutting off SDR UHD TVs from a steady flow of UHD content for instance would alienate the early adopters who bought into UHD TV already. The CE industry needs to cherish these consumers. It’s bad enough that those Brits who bought a UHD TV in 2014 cannot enjoy BT Sport’s Ultra HD service today because the associated set-top box requires HDCP 2.2 which their TV doesn’t support.

It is not realistic to cater to each of these segments with separate channels either. Even if the workflows can be combined, no broadcaster wants to spend the bandwidth to transmit the same channel in SDR HD and HDR HD, plus potentially SDR UHD and HDR UHD.

Having separate channels for HD and UHD is inevitable but for HDR to succeed it’s essential for everyone in the production and delivery chain that the HDR signal be an extension to the broadcast SDR signal and the SDR signal be compatible with legacy Rec.709 TV sets.

Innovations like Ultra HD resolution, High Dynamic Range, Wide Color Gamut and High Frame Rate will not come all at once with a big bang but (apart from HDR and WCG which go together) one at a time, leading to a fragmented installed base. This is why compatibility and ‘graceful degradation’ are so important: it’s impossible to cater to all segments individually.

What is needed now is alignment and clarity in this apparent chaos of SDOs (Standards Defining Organizations). Let’s group them along the value chain:

SDOs along the value chain:
  • Production: SMPTE, ITU-R
  • Compression: MPEG, VCEG
  • Broadcast: ATSC, SCTE, EBU/DVB, ARIB, SARFT
  • Telecom: ETSI
  • Media/Streaming: BDA, DECE (UV), MovieLabs
  • CE: CTA, DigitalEurope, JEITA

Within each segment, the SDOs need to align because having different standards for the same thing is counterproductive. It may be fine to have different standards applied, for instance if broadcasting uses a different HDR format than packaged media; after all, they have differing requirements. Along the chain, HDR standards do not need to be identical but they have to be compatible. Hopefully organizations like the Ultra HD Forum can facilitate and coordinate this between the segments of the chain.

If the various standardization organizations can figure out what HDR flavor to use in which case and agree on this, the future is looking very bright indeed.

Yoeri Geutskens has worked in consumer electronics for more than 15 years. He writes about high-resolution audio and video. You can follow him on Ultra HD and 4K on Twitter at @UHD4k.