
Pre-NAB 2018 thoughts and questions

Although my first NAB was over 15 years ago, I’m still keen to get out to Vegas this year - OK not for Vegas the place, which gives me the creeps, but to catch up with the people, the trends and the tech. It remains one of my favorite conferences despite its gargantuan scale. Here are some of the questions I'll be looking to shed light on this year.

UHD

When I can get time off the Ultra HD Forum booth that I’ll be busy on, I’ll be looking into how the first generation of mature UHD technologies are doing. The debate as to whether 4K resolution was needed for a true UHD experience was all the rage just after the trailblazers deployed UHD around 2014. Now that the paint has dried on the static metadata-based HDR solutions (HDR10/PQ and HLG), the battle seems over. Proponents of 1080p/HDR are grinning and claiming they have won this round: we are already seeing some such content appearing on Netflix… For me the jury is still out, but I’ll be nosing around to gauge people’s true intentions here.

But what’s next for UHD? I’ll be gauging the readiness of the next set of technologies and, as my friend Ian Nock says, how they might be deployed without breaking what’s already there. In the dynamic metadata space, Dolby Vision is already out there. Does there have to be a winner and a loser with HDR10+, or is there room in the market for both? As an audiophile I’ll be keen to find Next Gen Audio demos and, here again, fathom whether the existence of several standards (Dolby Atmos, MPEG-H, DTS:X, …) is holding things back.

Encoding

If one of my friends from the encoding space is kind enough to explain to me what's going on, I'll try to catch up on the encoding wars, which have confused me: there are too many competing stakeholders to understand on my own. HEVC was supposed to represent a smooth transition from H.264; now I don't know who to believe. The moving parts range from imploding patent pools to Google, Apple, Amazon and Microsoft, without forgetting the streaming people like DASH, the H.26x crowd, disruptive start-ups, etc. Thierry, help! Decode it for me, tell me what's going on.

VR360

Having just published an eBook on VR360 that doesn’t predict 2018 is the year of lift-off but does explain why it’s the year to get involved, I’ll be eagerly looking at how much we got right and whether the hype has finally hit bottom, so we can now start to do business… I’ll do my best to get to the VR-IF masterclass on day 1 and, if I'm lucky, get an update from Rob Koenen.

Streaming Delay

I’ve been commissioned to do some work on OTT streaming delay, under the assumption that operators really care about reducing it. I’ve been very surprised that, in my investigations so far, this is not the case. Sure, they’d like to reduce delay, but it’s low down their priority list. It’s got me wondering whether, as OTT streaming becomes more prevalent, the “norm” might, quite a few years from now, become a 10-20s delay, where whatever broadcast is left gets delayed so as to be synced with the crowd… probably science fiction, but I’ll test out the idea.
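To make the 10-20s figure concrete, here is a back-of-the-envelope latency budget for segmented OTT streaming; every number below is my own illustrative assumption, not a measurement from any operator:

```python
# Hypothetical glass-to-glass latency budget for segmented OTT streaming
# (HLS/DASH-style). All figures are illustrative assumptions.
SEGMENT_SECONDS = 2.0

budget = {
    "capture + encode": 2.0,
    "packaging": SEGMENT_SECONDS,          # must wait for a full segment
    "CDN propagation": 1.0,
    "player buffer": 3 * SEGMENT_SECONDS,  # players commonly hold ~3 segments
    "decode + render": 0.5,
}

total = sum(budget.values())
print(f"Estimated end-to-end delay: {total:.1f} s")
```

Even with these fairly generous assumptions, the player buffer alone dominates, which is why the delay question is more about player behaviour than network speed.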

Driven by Data, at last?

When I joined France Telecom (now Orange) in 2001, I remember a meeting where it was explained to me that our unique access to amazing data on subscribers and what they did meant we would become the kings of data-driven UX, data-driven decision making and data-driven just about everything … That vision of analytics was spot on, just too early and focussed on the wrong kind of operator. The Silicon Valley giants now dominate the world with data and AI (which we didn’t see coming back then). So, have we truly entered the data era where other operators can get some of the pie? The recent Facebook/election scandals seem to say so. I’ll be looking around at where vendors in the ecosystem are on the holy grail of data. Is the market taking off for real or is it still vendor fantasy?


My 10 does-the-emperor-have-any-clothes-on questions for IBC17

To get the most out of my annual pilgrimage to Amsterdam, I’ve sat down and had a think about the big questions I don't believe we have answers on in late 2017.

I came up with 10, which only represent what I've been working on, not necessarily the complete picture. Clearly we need to take ourselves less seriously sometimes. I for one would never trust an expert who has straightforward answers to all these questions, because the honest truth is that we don't know.

From new to old topics:

1.   What will mainstream HDR look like in 2018?

Continue reading My 10 does-the-emperor-have-any-clothes-on questions for IBC17


UHD will change living room TV forever, if we fix customer-facing interop issues first (Update: Atmos working)

HDR and some NGA are here (well almost). Demos will blow your mind and ears, but beware - it can take a geek a couple of hours to get it working at home.

Here is my personal account, as a simple user, of my road to UHD nirvana in my living room. I wrote this in early September 2016 and updated it at the end of the month, as I finally got Dolby Atmos working (and it was worth the wait).

My setup

When I moved to my new flat in central Paris 8 months ago, I immediately got access to an Orange Fibre connection with speeds of up to 800 Mbps. With all the work I’ve been doing on UHD as a member of the Ultra HD Forum, I saw this as an opportunity to test some streaming services in the real world of my sitting room.

I’ve had my Samsung SUHD TV (UE55SJ8500) for 6 months now. First demos were with still images from the UHD Zoo app, as it took me a while to get a 4K video that I could effectively stream to my TV. Of course, Netflix was available, and although some 4K series have some stunning shots that show up all the new pixels, many don’t, even though they’re in the 4K section.

After having completed a white paper on object-based sound and getting excited about DTS:X (see here), I went for a mid-to-high-range Onkyo A/V receiver (TX-RZ810) that was already Dolby Atmos-capable and will be software-upgradable to DTS:X.

Having spent 1,200 € on the receiver, I was no longer ready to splash out on the high-end Atmos speaker system I’d been eyeing. Amazon had a 400€ set of speakers available for next-day delivery, so I went for an Onkyo SKS-HT588(B) system, knowing that once my system is stable, I’ll have to invest in real speakers.

Sound first

A first hurdle for many viewers will be that the TV set-top box is usually set by default to stereo sound. So before getting anything like 5.1 output from that source, one must find the appropriate sub-menu and set the HDMI audio output to what, in my case, Orange calls “Home cinema”.


The other option (yes, it’s well known that two options make things simpler!) is to use an optical output from the STB, which is always in pass-through mode, and then configure the AV receiver to associate that audio with the video from the Orange STB, in my case.

The Orange TV service has had a single Dolby Atmos sports transmission, but I couldn’t find any next-gen audio in the VoD library, so to get some fancy sound demos it was back to the Internet, where, with some difficulty, I found Dolby demo files on http://www.demo-world.eu/2d-demo-trailers-hd/.

It turned out none of my devices or software was able to send the Dolby Atmos soundtrack to my AV receiver. I found on an obscure geek chat that the Kodi video player could decode Atmos. So I installed that onto my Mac and Eureka! The Dolby demos played on my TV (connected with HDMI to my Mac). It sounded beautiful, but the word ‘Atmos’ never appeared on the Onkyo receiver, so I’m guessing it just considered it Dolby 7.1. But hey, it sounded really immersive, with the rain falling literally overhead, so who cares? [See update at end of blog: I did finally get Atmos to work from the Orange VoD store.]

Now for some HDR video

Amazon and Netflix have some UHD content, but on their own interfaces it is so far impossible to tell whether there’s any HDR, and I understand Netflix chooses the HDR mode dynamically; so for the next demo, it seems physical media is the only way.

Which UHD Blu-ray player?

After waiting for UHD-capable Blu-ray players for a year, I decided to go and get one of the two available in France (the Samsung at 500€ or the Hitachi at 800€). But when I got to the retail store, the sales guy suggested I get an Xbox One S for 400€. That would also hopefully get my sixteen-year-old interested, so I went for that option.

Back home with the Xbox unpacked, my next objective was to get UHD Blu-ray discs to play and at long last see some real HDR.


Inside the Xbox parameters, to select HDR, there is no mention (yet) of HDR itself. You need the knowledge that we are looking for 10-bit colour, and so must select the 30 (!?) bits per pixel option. (I later used the top 36-bit (12-bit per channel) option, which the TV accepted fine, and the video looked a bit better; strangely, I had a better sense of very high resolution rather than amazing colours.) There was no HDR wow effect with the Man of Steel Blu-ray I got for free with my Xbox; it just looked very nice.


I then got into the Xbox One S’ advanced video parameters, and all of a sudden the word HDR appears. And so now, going into the 4K TV submenu (note the confusion: it should be a UHD sub-menu, as we’re talking HDR and a bit of HFR too), I was all excited to see all the new possibilities:


But let’s not run away with ourselves; there was a last hurdle to cross. The Xbox’s Blu-ray player said I had the wrong kind of TV for UHD. It turned out that the AV receiver through which my HDMI signal was passing was not HDCP 2.2-enabled.


In the Inputs sub-menu of the Onkyo AV receiver, I discovered that only HDMI 1 through 3 were HDCP 2.2-capable. That required pulling the TV away from the wall yet again and reassigning the Xbox One S to one of the first three ports (and of course reassigning whatever was already there that didn’t need HDCP 2.2 somewhere else). I’ll spare you the screenshot of doing that in the AV receiver’s menus.


Finally, on my Samsung TV I had to hunt down to the 14th menu item of the main Picture menu, called HDMI UHD Color, which everyone else calls HDR.


Then within the Samsung TV submenu I turned on the HDMI ports that are connected to HDR sources. For each value you change here, the TV needs to reboot (no kidding, it really does).


A couple of hours after I had started, my Samsung TV finally told me I’d succeeded: full UHD with HDR, AKA UHD Color. I was of course too hot and bothered at this stage to want to watch anything, but when I have since shown off my new 4K/HDR/NGA setup, I’ve consistently got the most wows for the immersive audio demos. Hmmm, maybe I should have just bought a new stereo… nah, just kidding 😉

Wrapping up

Putting my professional hat on, I’m still a true believer in UHD and all its promises, but despite having often written about “this being the year for UHD”, I do see a potential blocking point in these customer-facing issues. I trust that the Ultra HD Forum and UHD Alliance folks will get to grips with these interoperability teething problems so that the true benefits of UHD aren’t confined to the tech-savvy. I see a great opportunity for operators and their call centres to fix wiring problems today, but also for the CPE suppliers to work on processing HDR and, one day, NGA locally. UHD has to be plug-and-play to truly take off.

[Update Sept 29 2016: I spent 10€ on a digital copy of Salt (Angelina Jolie) from the Orange VoD store, which had about 9 other movies with Atmos at time of writing - so yeah! I finally got my expensive A/V receiver to actually recognise an Atmos audio stream and generate the right output. The sense of immersion is clearly improved: you really can't tell what's coming out of which speaker any more, and I heard sounds that seemed to come from "in front of a given speaker".

If on an imaginary quality scale 1 is bad mono (i.e. the phone), then a jump to good stereo would bring a real wow-effect, probably scoring maybe 5; moving from stereo to 5.1 is another similar wow-effect, say doubling the score to 10. Object-based sound on top of 5.1 (or 5.1.2 in my case) brings another really noticeable improvement, but less of a wow-effect, so I'd subjectively say my current system scores 12 on my imaginary scale.]


@nebul2’s NAB 2016 Journal (UHD, HDR, VR, All-IP)

Las Vegas was again focused on UHD in 2016, at least through my eyes. The four keywords I came away with were 1: UHD (again), 2: HDR, but also 3: VR and 4: All-IP production. Of course other things like drones were important, but I'm not a real journalist; I don't know how to write about things I don't know.

NAB Parking Day1

We got in from Europe on the Saturday evening, and this year I was on a budget, so we stayed in an Airbnb apartment with my colleague Marta. It turned out to be just behind the main LVCC parking lot. You can see on the right what the parking lot looked like on Sunday morning, when you arrive before the show is really underway.

Size and growth of the industry

On the Sunday I sat for a moment through the "Media Technology Business Summit" run by Devoncroft and learned a bit about the industry trends:

  • Having started with radio shows, this year’s NAB is the 94th annual show, so I suppose in 6 years we’ll have a big bonanza; I wonder if we’ll have something like Augmented Reality in 8K by then.
  • Devoncroft sees the global media market being worth $49bn in 2015, with the US media industry having pushed revenue per user to the limit. 3,000 vendors make up their industry panel; 2009-2015 CAGR was 1.9%, with 2014-2015 OpEx spend at -4.2% and CapEx spend at -4.4%.
  • Despite the OTT craze and losing traditional subs, ESPN still gets $7/month from linear subscriptions, but only $0.42/month from OTT viewers, so hold onto your hats, linear pay-TV ain’t dead quite yet. Beyond sports, Devoncroft argues that even though there is growth, digital revenues are insufficient to replace linear ones. The big issue is how the ad market can transition.
  • 4K and UHD make up the third most important topic for respondents of Devoncroft's 2016 Big Broadcast Survey, the results of which will soon be released. But demand for UHD is less about “more pixels” than “better pixels”. So according to Devoncroft, like Ericsson, the HDR vs. 4K debate is all but over.

Virtual and Augmented Reality

I then popped into an Augmented Reality (AR) conference where Gary Acock and Juan Salvo were discussing how to add live content to the UnReal video gaming engine. AR is seen as bringing the real world into Virtual Reality (VR). Stitching 360° video is still apparently a “pretty unpleasant experience”, and French startup VideoStitch was mentioned as one of the key players working on fixing this. Currently, 360° production design is limited by how effectively you can stitch video. But with AR there are also inherent UX limitations, like parallax issues with head movement, or camera movement when there’s no head movement. With AR one needs to always know where the head is and how it's positioned, as head movements affect the content being created.

The amount of data to process for VR can be well over 1 TB/hour, so the coming (?) VR/AR revolution needs powerful GPUs and CPUs.

AR, VR and any immersive experience are still moving targets in 2016. But neither AR nor VR are isolated from the broadcast experience anymore. Indeed VR is less of an isolating and lonely experience, but a new way of engaging, a bit like coming to a conference and interacting with social media on a smartphone at the same time. Content is still king and creating compelling content remains the goal where AR & VR are just other tools. As we still don't have toolsets like an « Adobe for AR/VR » we need to jerry-rig existing tools.

A VR demo that was not at NAB intrigued me. Fraunhofer’s Stephan Steglich told me about FAME. It’s the simple idea of navigating the 360° video with a remote control. Two key advantages are removing the isolation of having to wear something over the eyes, and moving all the processing to the cloud, allowing for future-proof deployments. It sounded convincing, but I’ll wait for a compelling demo before forming an opinion.

Showstoppers

Sennheiser Mic

I had been told great things about the CES Showstoppers being a big event; at my first experience of it at NAB, it was a very focused affair where the great food and wine seemed as attractive to the media as the companies to visit.

German manufacturer Sennheiser was showing off their latest MKE440 DSLR microphone, which they say is the first mini-shotgun to capture an HQ stereo sound image in one take. I was more taken by the beautiful design of the prototype VR microphone that goes under a VR camera.

I met up with V-Nova’s Fabio Murra, who was showing their two OTT deployments based on their Perseus codec. FastFilmz launched on March 26 in India, offering SVoD to a mobile-only Tamil customer base with a potential of 120m subs. There were 350 titles at launch, and according to V-Nova, Perseus made the business case possible in southern India, where only 2G is available in some areas, offering 64-128 kbps of bandwidth. The demo I saw was watchable at 120 kbps using 14 fps (I had to point that out though). The Perseus codec is described as “hybrid on top of H264”, with a metadata stream on top of H264. I’ll be looking to dig into this a bit more, as I no longer understand exactly what that means after a heated discussion with several analysts. Content is protected with DRM; I couldn’t find out whose.

I only glimpsed the other demo, of a 4K STB using OTT delivery. It was showing Tears of Steel at 4 Mbps and looked fine, but without any wow effect, at least for what was on screen then; or maybe I was just too far away from the small screen.

V-Nova had already announced a contribution deal with Eutelsat and promised another one for the next day (which turned out to be Sky Italia).

Brother

The Japanese company Brother, which I wrongly thought of as a printer maker (does any Japanese company do only one thing?), was displaying « Airscouter », a surprising head-mounted monitor designed for cameramen in difficult positions. You see a 720p-resolution image in the corner of one eye. It was a bit disconcerting, and I guess limited to some very specific use cases. I felt a bit nauseous with it on my head, but it does really work, and maybe felt like what Iron Man might feel.

Ultra HD Forum

Monday was taken up with Ultra HD Forum activities for me. We had our own press conference in the morning, and in the afternoon I made a tiny presentation during the Pilot press conference in the Futures Park. I discussed the forum’s reason for being, its history, our Plugfest #1, the Guidelines 2016 and the general « work in progress » aspect of live UHD.

« Pilot » is the new name for « NAB Labs », which was started in 2012. We were among 30 exhibitors in Futures Park, which aims to promote « edge of the art » concepts that are not yet commercialized. ATSC 3.0 was the star, with 15 companies focusing on that alone. The other exhibits were very diverse, ranging from commercial R&D to government and academic research. NHK 8K Super Hi-Vision was prominent as usual, and the Nippon public broadcaster is still scheduled to launch commercially in 2018 « so people can enjoy the 2020 Japanese Olympics » in glorious 8K HDR with HFR.

Security and analytics

Monday night was over-booked, and I chose the Verimatrix media dinner. I had some animated discussions on UHD and the extent to which HDR might be the only big game-changer (I still believe in 4K but am feeling more and more lonely on that front). Tom Munro, the CEO, gave me a great update on the company strategy and on the move towards analytics, which I now understand can be a logical progression for a security vendor. If the financial transactions are precious enough to secure, then private usage data is worthy of the same efforts. More on that in a dedicated blog soon.

Satellite industry on edge of a cliff and might UHD save it?

On Tuesday I got myself to the Satellite industry day. I have this vision of the industry (at least the broadcast and telecoms parts of it) sitting on the edge of a cliff, wondering when fiber, 5G and delinearization will push it off the edge.

Despite a great lineup with Caleb Henry of Via Sat Mag, Steve Corda, VP Bizdev at SES, Markus Fritz of Eutelsat, Dan Miner of AT&T and Peter Ostapiuk of Intelsat, the opening panel didn’t really give me any new ideas to tackle that problem.

AT&T in particular sees similarities between the move from SD to HD and that from HD to UHD, but Intelsat sobered the audience by asking how the content industry will make money from the upgrade to UHD. SES’s Steve Corda made it scarier still, reminding the audience that during the upgrade from SD to HD we didn't have competition from OTT, as we do now, with most early UHD coming from OTT suppliers.

The satellite industry panel agreed that demand for UHD channels is growing, especially from their cable-operator clients, and that the bottleneck is still available content. AT&T's Dan Miner noted that a key change in OTT delivery in the coming 18 months is that US data plans will enable TV Everywhere on cellular networks.

The consensus is that to have a monetizable UHD offering you need a bouquet of at least 2 channels, ideally at least 5, including sports.

When the panel went round enumerating their live 4K services, I counted about a dozen UHD linear channels, as many demo channels, and a few event-based channels.

One of Viasat’s founders, Mark Dankberg, gave an inspirational talk reassuring the audience that the satellite industry’s future is safe, at least if they copy Viasat. The merger of AT&T and DirecTV is, to him, an indicator that satellite without broadband is no longer viable in the long term. Viasat started in 1986 in defense; during the '90s they got into VSAT data networking, just on the B2B side. Dankberg believes high-orbit geostationary is still the way to go (instead of mid- or low-orbit (LEO)), because it’s the best way to optimize resources with thousands of beams. He points out that 95% of demand is in 15% of the geography; LEO satellites that orbit the earth can't concentrate on that. I was enthused by his talk and hoped to get home and write a blog about it, but when I looked through my notes I realized that in the end there wasn’t any new information, just the charisma and communicative beliefs of an industry veteran.

TV Middleware on Android

Beenius, the middleware guys from Slovenia that I’ve written about a few times, caught me in the south hall, so I went to have a look.

In demonstrating their new version 4.2 core product, Beenius told me that the EPG is dead, but still went ahead and showed me theirs. Navigation is via genres, with favorite channels on top of a carousel that mixes live and VoD. Recommendation currently uses their own algorithms but can be based on ThinkAnalytics, with « Trending » content on the second line.


The company is very Google-centric, although they still have a Linux offering with a hybrid DVB solution. They clarified to me how GooglePlay apps can be controlled by the TV operator, with three different approaches:

  1. Preinstalled apps and an open GooglePlay
  2. A « walled garden », where the user chooses apps from the operator’s list, typically among a dozen including YouTube, Netflix, etc.
  3. Apps already embedded into the UI, which is also a closed model.

VoD also benefits from integrated recommendation, but is open to extra info from the Web, such as IMDb content.

Beenius haven’t had much interaction with 4K yet, although they say they are ready. As with any competitive TV middleware you can fling content from screen to screen.

The operator-controlled UI can be updated from a central server, so that a new version of the app gets automatically pushed to the STB via GooglePlay as soon as it's closed and reopened. Playing in the Google arena has enabled a full-featured app for Android-powered smart TVs; Beenius just needs Google to finally get it right in the living room.

Automatically generated HDR

Ludovic Noblet of the French research institute B<>Com showed me a tool to up-convert SDR content to HDR. He sees it as a gap-filler for legacy setups; it is already available for offline use, with a real-time version planned for IBC 2016. The current version introduces a latency of just 3 frames and was convincing, even if it didn’t carry that amazing wow-effect of some native HDR content. He was very secretive about the first customers but seemed very confident.

The pull of social media

On the last day I had a quick stop at Texas Instruments’ tiny booth, simply because they engaged with me on Twitter ;o)

The LMH1219, the 12G-SDI device shown above, enables SDI cables to run up to 110m without signal attenuation, instead of the usual 20-30m. Its UltraScale processing equalizes and improves the signal. The TI chip is agnostic to metadata, so it should work fine with HDR, for example.
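As a sanity check on why such links are called "12G", here is a rough payload calculation for 2160p60 video; the 10-bit 4:2:2 sampling is my assumption of the usual SDI case, and blanking overhead is ignored:

```python
# Rough active-payload calculation for UHD 2160p60 over a single SDI link,
# assuming 10-bit 4:2:2 sampling (one luma sample plus, on average, one
# chroma sample per pixel).
width, height, fps = 3840, 2160, 60
bits_per_sample, samples_per_pixel = 10, 2

payload_bps = width * height * fps * bits_per_sample * samples_per_pixel
print(f"Active video payload: {payload_bps / 1e9:.2f} Gb/s")
```

That works out to roughly 10 Gb/s of active video; with blanking included, the nominal 12G-SDI line rate is 11.88 Gb/s, hence the name.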

Another hardware innovation they showed me was a single chip for receive (cable EQ) or drive (TX) mode that makes BNC connectors more versatile: they needn't be just IN or OUT but can be either. The device isn’t available yet, nor does it have a product name. Launch is expected in Q1 2017.

Note that I didn’t interact with any of the All-IP production vendors, but just noted it as a buzzing theme in conferences and on booth signage.

NAB Day 3

Oh and the Convention Centre car park looked like this from our apartment window by 9:30 am Monday through Wednesday:

That’s all for now folks.


The State of #HDR in Broadcast and OTT – CES 2016 update

By Yoeri Geutskens

This article was first published in December 2015, but has been updated post-CES 2016 (corrections on Dolby Vision, UHD Alliance's "Ultra HD Premium" specification and the merging of Technicolor and Philips HDR technologies).

A lot has been written about HDR video lately, and from all of this perhaps only one thing becomes truly clear – that there appear to be various standards to choose from. What’s going on in this area in terms of technologies and standards? Before looking into that, let’s take a step back and look at what HDR video is and what’s the benefit of it.

Since 2013, Ultra HD or UHD has come up as a major new consumer TV development. UHD, often also referred to as ‘4K’, has a resolution of 3,840 x 2,160 – twice the horizontal and twice the vertical resolution of 1080p HDTV, so four times the pixels. UHD has been pushed above all by TV manufacturers looking for new ways to entice consumers to buy new TV sets. To appreciate the increased resolution of UHD, one needs to have a larger screen or a smaller viewing distance but it serves a trend towards ever larger TV sizes.

While sales of UHD TV sets are taking off quite briskly, the rest of the value chain isn’t following quite as fast. Many involved feel the increased spatial resolution alone is not enough to justify the required investments in production equipment. Several other technologies promising further enhanced video are around the corner, however. They are:

  • High Dynamic Range or HDR
  • Deep Color Resolution: 10 or 12 bits per subpixel
  • Wide Color Gamut or WCG
  • High Frame Rate or HFR: 100 or 120 frames per second (fps)

As for audio, a transition from conventional (matrixed or discrete) surround sound to object-based audio is envisaged for the next generation of TV.

Of these technologies, the first three are best attainable in the short term. They are also interrelated.

So what does HDR do? Although it’s using rather different techniques, HDR video is often likened to HDR photography as their aims are similar: to capture and reproduce scenes with a greater dynamic range than traditional technology can, in order to offer a more true-to-life experience. With HDR, more detail is visible in images that would otherwise look either overexposed, showing too little detail in bright areas, or underexposed, showing too little detail in dark areas.

HDR video is typically combined with a feature called Wide Color Gamut or WCG. Traditional HDTVs use a color space referred to as Rec.709, which was defined for the first generations of HDTVs that used CRT displays. Current flat-panel display technologies like LCD and OLED can produce a far wider range of colors and greater luminance, measured in ‘nits’. A nit is a unit of brightness, equal to one candela per square meter (cd/m2). To accommodate a greater color gamut, the Rec.2020 color space was defined. No commercial display can fully cover this new color space, but it provides room for growth. The current state of the art of color gamut for displays in the market is a color space called DCI-P3, which is smaller than Rec.2020 but substantially larger than Rec.709.
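To give a feel for the relative sizes, the triangle spanned by each standard's RGB primaries in the CIE 1931 xy chromaticity diagram can be compared with the shoelace formula; the primary coordinates below are the published values for each standard:

```python
# Compare gamut triangle areas in CIE 1931 xy chromaticity space
# using the shoelace formula for the area of a triangle.
def tri_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# (R, G, B) chromaticity coordinates from each standard.
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, ap3, a2020 = tri_area(rec709), tri_area(dci_p3), tri_area(rec2020)
print(f"DCI-P3 is {ap3 / a709:.2f}x Rec.709; Rec.2020 is {a2020 / a709:.2f}x")
```

In this xy plot, Rec.2020's triangle is nearly twice the area of Rec.709's, with DCI-P3 in between (the ratios come out differently in other chromaticity spaces, so treat this as an illustration, not a colorimetric claim).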

To avoid color banding issues that could otherwise occur with this greater color gamut, HDR/WCG video typically uses a greater sampling resolution of 10 or 12 bits per subpixel (R, G and B) instead of the conventional 8 bits, so 30 or 36 bits per pixel rather than 24.
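The arithmetic behind those per-pixel figures is simple; a quick sketch:

```python
# Code values per channel at common video bit depths. Fewer quantization
# steps across the same brightness range means coarser steps and more
# visible banding, which matters more as HDR/WCG stretches that range.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} values per subpixel, "
          f"{3 * bits} bits per pixel")
```

So moving from 8 to 10 bits quadruples the number of shades per channel, which is what keeps smooth gradients from banding over the wider gamut.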


Color/luminance volume: BT.2020 (10,000 nits) versus BT.709 (100 nits); Yxy
Image credit: Sony

The problem with HDR isn’t so much on the capture side nor on the rendering side – current professional digital cameras can handle a greater dynamic range and current displays can produce a greater contrast than the content chain in between can handle. It’s the standards for encoding, storage, transmission and everything else that needs to happen in between that are too constrained to support HDR.

So what is being done about this? A lot, in fact. Let’s look at the technologies first. A handful of organizations have proposed technologies for describing HDR signals for capture, storage, transmission and reproduction. They are Dolby, SMPTE, Technicolor, Philips, and BBC together with NHK. Around the time of CES 2016, Technicolor and Philips have announced they are going to merge their HDR technologies.

Dolby’s HDR technology is branded Dolby Vision. One of the key elements of Dolby Vision is the Perceptual Quantizer (PQ) EOTF, which has been standardized by SMPTE as ST 2084 (see box: SMPTE HDR Standards) and mandated by the Blu-ray Disc Association for the new Ultra HD Blu-ray format. The SMPTE ST 2084 format can actually contain more picture information than today's TVs can display, but because the information is there, the content has the potential to look better as new, improved display technologies come to market. Dolby Vision and HDR10 use the same SMPTE 2084 standard, making it easy for studios and content producers to master once and deliver to either HDR10 or, with the addition of dynamic metadata, Dolby Vision. The dynamic metadata is not an absolute necessity, but using it guarantees the best results when played back on a Dolby Vision-enabled TV. HDR10 uses static metadata, which ensures it will still look good - far better than Standard Dynamic Range (SDR). Even using no metadata at all, SMPTE 2084 can work at an acceptable level, just as other proposed EOTFs without metadata do.
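The PQ EOTF at the heart of ST 2084 is a published formula; here is a minimal Python sketch using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
# to absolute display luminance in cd/m2 (nits), up to 10,000 nits.
# Constants are the published values from the standard.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(e):
    """Return luminance in nits for a normalized PQ code value e."""
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

print(f"Full-scale signal: {pq_eotf(1.0):.0f} nits")
```

A full-scale code value maps to 10,000 nits, the ceiling the curve was designed for, which is precisely the headroom that lets content outlive today's displays.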

For live broadcast Dolby supports both single- and dual-layer 10-bit distribution methods and has come up with a single workflow that can simultaneously deliver an HDR signal to the latest generation and future TVs and a derived SDR signal to support all legacy TVs. The signal can be encoded in HEVC or AVC. Not requiring dual workflows will be very appealing to all involved in content production, and the system is flexible enough to let the broadcaster choose where to derive the SDR signal. If it’s done at the head-end, they can choose to simply simulcast it as another channel or convert the signal to a dual-layer, single-stream signal at the distribution encoder for transmission. Additionally, the HDR-to-SDR conversion can be built into set-top boxes for maximum flexibility without compromising the SDR or HDR signals. Moreover, the SDR distribution signal that’s derived from the HDR original using Dolby’s content mapping unit (CMU) is significantly better in terms of detail and color than one that’s captured natively in SDR, as Dolby demonstrated side by side at IBC 2015. The metadata is only produced and multiplexed into the stream at the point of transmission, just before or in the final encoder – not in the baseband workflow. Dolby uses 12-bit color depth for cinematic Dolby Vision content to avoid any noticeable banding, but the format is actually agnostic to different color depths and works with 10-bit video as well. In fact, Dolby recommends 10-bit color depth for broadcast.

High-level overview of Dolby Vision dual-layer transmission for OTT VOD;
other schematics apply for OTT live, broadcast, etc. 
Image credit: Dolby Labs Dolby Vision white paper

Technicolor has developed two HDR technologies. The first takes a 10-bit HDR video signal from a camera and delivers a video signal that is compatible with SDR as well as HDR displays. The extra information that is needed for the HDR rendering is encoded in such a way that it builds on top of the 8-bit SDR signal but SDR devices simply ignore the extra data.

Image credit: Technicolor

The second technology is called Intelligent Tone Management and offers a method to ‘upscale’ SDR material to HDR, using the extra dynamic range that current-day capture devices can provide but traditional encoding cannot handle, and providing enhanced color grading tools to colorists. While it remains to be seen how effective and acceptable the results are going to be, this technique has the potential to greatly expand the amount of available HDR content.

A single signal that delivers SDR to legacy TV sets (HD or UHD) and HDR to the new crop of TVs is also the objective of Hybrid Log Gamma (HLG), which the BBC’s R&D department and Japan’s public broadcaster NHK are developing together. HLG’s premise is an attractive one: one video signal that renders SDR on legacy displays but HDR on displays that can handle it. HLG, the BBC and NHK say, is compatible with existing 10-bit production workflows and can be distributed in a single HEVC Main 10 Profile bitstream.

Depending on whom you ask, HLG is either the best thing since sliced bread or a clever compromise that accommodates SDR as well as HDR displays but looks great on neither. The Hybrid Log Gamma name refers to the fact that the OETF is a hybrid, applying a conventional gamma curve to low-light signals and a logarithmic curve to the highlights.
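The hybrid curve can be written down directly. This sketch follows the normalization later standardized in ITU-R BT.2100, with scene-referred light normalized to [0, 1]:

```python
import math

def hlg_oetf(e):
    """HLG OETF: scene-referred linear light e in [0, 1] to a
    non-linear signal value in [0, 1] (ITU-R BT.2100 normalization).
    Below the break point a square-root (gamma-like) curve is used;
    above it, a logarithmic curve compresses the highlights."""
    a = 0.17883277
    b = 1.0 - 4.0 * a                # 0.28466892
    c = 0.5 - a * math.log(4.0 * a)  # 0.55991073
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return a * math.log(12.0 * e - b) + c
```

The constants make the two branches meet at e = 1/12 with a signal value of exactly 0.5, which is what lets the lower part of the curve behave like a conventional SDR gamma on legacy displays.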

Hybrid Log Gamma and SDR OETFs; image credit: T. Borer and A. Cotton, BBC R&D

Transfer functions:

  • OETF: function that maps scene luminance to digital code value; used in HDR camera;
  • EOTF: function that maps digital code value to displayed luminance; used in HDR display;
  • OOTF: function that maps scene luminance to displayed luminance; a function of the OETF and EOTF in a chain. Because of the non-linear nature of both OETF and EOTF, the chain’s OOTF also has a non-linear character.

BBC HLG workflow schematic; image credit: T. Borer and A. Cotton, BBC R&D

The EOTF for Mastering Reference Displays, conceived by Dolby and standardized by SMPTE as ST 2084, is 'display-referred'. With this approach, the OOTF is part of the OETF, requiring implicit or explicit metadata.
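The ST 2084 curve itself is public and compact enough to sketch. This is the PQ EOTF, mapping a normalized code value to absolute luminance; the constants are the ones defined in the standard:

```python
def pq_eotf(n):
    """SMPTE ST 2084 (PQ) EOTF: non-linear code value n in [0, 1]
    to absolute display luminance in cd/m^2 (nits), up to 10,000."""
    m1 = 2610.0 / 16384.0           # 0.1593017578125
    m2 = 2523.0 / 4096.0 * 128.0    # 78.84375
    c1 = 3424.0 / 4096.0            # 0.8359375
    c2 = 2413.0 / 4096.0 * 32.0     # 18.8515625
    c3 = 2392.0 / 4096.0 * 32.0     # 18.6875
    p = n ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)
```

Being display-referred, the output is an absolute luminance: code value 1.0 means 10,000 nits regardless of the display, which is why metadata (or tone mapping) is needed on displays that cannot reach the mastered levels.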

Hybrid Log Gamma (HLG), proposed by BBC and NHK, is a 'scene-referred' system which means the OOTF is part of the EOTF. HLG does not require mastering metadata so the signal is display-independent and can be displayed unprocessed on an SDR screen.

The reasoning is simple: bandwidth is scarce, especially for terrestrial broadcasting but also for satellite and even cable, so transmitting the signal twice in parallel, in SDR and HDR, is not an attractive option. In fact, most broadcasters are far more interested in adding HDR to 1080p HD channels than in launching UHD channels, for exactly the same reason. Adding HDR is estimated to consume up to 20% extra bandwidth at most, whereas a UHD channel gobbles up the bandwidth of four HD channels. It’s probably no coincidence HLG technology has been developed by two broadcast companies that have historically invested a lot in R&D. Note however that the claimed backwards compatibility of HLG with SDR displays only applies to displays working with Rec.2020 color space, i.e. Wide Color Gamut. This more or less makes its main benefit worthless.
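A back-of-the-envelope calculation makes the broadcasters' reasoning concrete. The 8 Mbps HD channel bitrate below is an assumed figure for illustration; the 20% and 4× factors come from the estimates above:

```python
hd_channel_mbps = 8.0        # assumed bitrate of one compressed HD channel
hdr_overhead = 0.20          # "up to 20% extra" bandwidth for adding HDR
uhd_factor = 4               # one UHD channel ~ four HD channels

hd_hdr_mbps = hd_channel_mbps * (1 + hdr_overhead)  # HD + HDR
uhd_mbps = hd_channel_mbps * uhd_factor             # plain UHD

print(f"HD+HDR: {hd_hdr_mbps:.1f} Mbps vs UHD: {uhd_mbps:.1f} Mbps")
```

Under these assumptions, adding HDR costs about 1.6 Mbps per channel while moving to UHD costs an extra 24 Mbps – which is exactly why many broadcasters prefer HDR on 1080p first.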

ARIB, the Japanese organization that’s the equivalent of DVB in Europe and ATSC in North America, has standardized upon HLG for UHD HDR broadcasts.

The DVB Project meanwhile has recently announced that UHD-1 Phase 2 will actually include a profile that adds HDR to 1080p HD video – a move advocated by Ericsson and supported by many broadcasters. Don’t expect CE manufacturers to start producing HDTVs with HDR, however. Such innovations are likely to end up only in the UHD TV category, where the growth is and where any innovation beyond cost reduction takes place.

This means consumers will need an HDR UHD TV to watch HD broadcasts with HDR. Owners of such TV sets will be confronted with a mixture of qualities – plain HD, HD with HDR, plain UHD and UHD with HDR (and WCG) – much in the same way HDTV owners may watch a mix of SD and HD television, only with more variations.

The SMPTE is one of the foremost standardization bodies active in developing official standards for the proposed HDR technologies. See box ‘SMPTE HDR standards’.

SMPTE HDR Standards

ST 2084:2014 - High Dynamic Range EOTF of Mastering Reference Displays

  • defines 'display referred' EOTF curve with absolute luminance values based on human visual model
  • called Perceptual Quantizer (PQ)

ST 2086:2014 - Mastering Display Color Volume Metadata supporting High Luminance and Wide Color Gamut images

  • specifies mastering display primaries, white point and min/max luminance

Draft ST 2094:201x - Content-dependent Metadata for Color Volume Transformation of High-Luminance and Wide Color Gamut images

  • specifies dynamic metadata used in the color volume transformation of source content mastered with HDR and/or WCG imagery, when such content is rendered for presentation on a display having a smaller color volume

One other such body is the Blu-ray Disc Association (BDA). Although physical media have been losing popularity with consumers lately, few people are blessed with a broadband connection fast enough to handle proper Ultra HD video streaming, with or without HDR. Netflix requires at least 15 Mbps sustained average bitrate for UHD viewing but recommends at least 25 Mbps. The new Ultra HD Blu-ray standard meanwhile offers up to 128 Mbps peak bit rate. Of course one can compress Ultra HD signals further, but the resulting quality loss would defeat the entire purpose of Ultra High Definition.
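To see what those bitrates mean in practice, compare the average compressed bits available per pixel (the 60 fps frame rate is an assumption for the sake of the example):

```python
def bits_per_pixel(mbps, width=3840, height=2160, fps=60):
    """Average compressed bits available per pixel per frame."""
    return mbps * 1_000_000 / (width * height * fps)

streaming = bits_per_pixel(25)   # Netflix's recommended UHD bitrate
disc = bits_per_pixel(128)       # Ultra HD Blu-ray peak bitrate
```

At 25 Mbps a 2160p60 stream gets roughly 0.05 bits per pixel, versus about 0.26 at the disc’s peak rate – a five-fold difference in headroom for the encoder.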

Ultra HD Blu-ray may be somewhat late to the market, with some SVOD streaming services having beaten it there, but the BDA deserves praise for not rushing the new standard to launch without HDR support. Had they done that, the format might very well have been declared dead on arrival. The complication, of course, was that there was no single agreed-upon standard for HDR yet. The BDA has settled on the HDR10 Media Profile (see box) as mandatory for players and discs, with Dolby Vision and Philips’ HDR format optional for players as well as discs.

HDR10 Media Profile

  • EOTF: SMPTE ST 2084
  • Color sub-sampling: 4:2:0 (for compressed video sources)
  • Bit depth: 10 bit
  • Color primaries: ITU-R BT.2020
  • Metadata: SMPTE ST 2086, MaxFALL (Maximum Frame-Average Light Level), MaxCLL (Maximum Content Light Level)

Referenced by:

  1. Ultra HD Blu-ray spec (Blu-Ray Disc Association)
  2. HDR-compatible display spec (CTA; former CEA)
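The two light-level metadata items lend themselves to a short sketch. Per their usual definitions, MaxCLL is the brightest single pixel anywhere in the content and MaxFALL the highest frame-average light level; the function name and the tiny frames below are made up for illustration:

```python
def content_light_levels(frames):
    """frames: iterable of frames, each a list of per-pixel light levels
    in nits (in practice the maximum of the R, G and B components).
    Returns (MaxCLL, MaxFALL) as carried in HDR10 static metadata."""
    maxcll = 0.0   # brightest single pixel in the whole title
    maxfall = 0.0  # highest frame-average light level
    for frame in frames:
        maxcll = max(maxcll, max(frame))
        maxfall = max(maxfall, sum(frame) / len(frame))
    return maxcll, maxfall

# Two tiny 2-pixel 'frames': a moderately bright one, then one with a highlight
cll, fall = content_light_levels([[100.0, 400.0], [50.0, 1000.0]])
```

Because these are single values for the entire title (here 1000 nits and 525 nits), they are static by nature – which is exactly the limitation that dynamic, per-scene metadata systems set out to address.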

UHD Alliance ‘Ultra HD Premium’ definition

Image resolution

  • Display: 3840×2160
  • Content: 3840×2160
  • Distribution: 3840×2160

Color bit depth

  • Display: 10-bit signal
  • Content: minimum 10-bit signal depth
  • Distribution: minimum 10-bit signal depth

Color palette

  • Display: signal input of BT.2020 color representation; display reproduction of more than 90% of the P3 color space
  • Content: BT.2020 color representation
  • Distribution: BT.2020 color representation

High dynamic range

  • Display: SMPTE ST 2084 EOTF, plus a combination of peak brightness and black level – either more than 1000 nits peak brightness and less than 0.05 nits black level, or more than 540 nits peak brightness and less than 0.0005 nits black level
  • Content: SMPTE ST 2084 EOTF; mastering displays recommended to exceed 1000 nits peak brightness, with less than 0.03 nits black level and a minimum of the DCI-P3 color space
  • Distribution: SMPTE ST 2084 EOTF

The UHD Alliance mostly revolves around Hollywood movie studios and is focused on content creation and playback, guidelines for CE devices, branding and consumer experience. At CES 2016, the UHDA announced a set of norms for displays, content and ‘distribution’ to deliver UHD with HDR, plus an associated logo program. The norm is called ‘Ultra HD Premium’ (see box). Is it a standard? Arguably, yes. Does it put an end to any potential confusion over different HDR technologies? Not quite – while the new norm guarantees a certain level of dynamic range, it does not specify any particular HDR technology, so all options are still open.

The Ultra HD Forum meanwhile focuses on the end-to-end content delivery chain including production workflow and distribution infrastructure.

In broadcasting we’ve got ATSC in North America defining how UHD and HDR should be broadcast over the air with the upcoming ATSC 3.0 standard (also adopted in South Korea) and transmitted via cable, where the SCTE comes into play as well. Japan has ARIB (see above), and for most of the rest of the world, including Europe, there’s the DVB Project, co-founded by the EBU, specifying how UHD and HDR should fit into the DVB standards that govern terrestrial, satellite and cable distribution.

In recent news, the European Telecommunications Standards Institute (ETSI) has launched a new Industry Specification Group (ISG) “to work on a standardized solution to define a scalable and flexible decoding system for consumer electronics devices from UltraHD TVs to smartphones” which will look at UHD, HDR and WCG. Founding members include telcos BT and Telefónica. The former already operates a UHD IPTV service; the latter is about to launch one.

Then there are the CTA (Consumer Technology Association, formerly known as CEA) in the US and DigitalEurope, dealing with guidelines and certification programs for consumer products. What specifications does a product have to support to qualify for ‘Ultra HD’ branding? Both have formulated answers to that question. It was not a coordinated effort, but fortunately they turn out to almost agree on the specs. Unity on a logo was not as feasible, sadly. The UHD Alliance, for its part, settled on a definition of Ultra HD which it announced at CES on January 4th, 2016. One can only hope this will not lead to yet more confusion (and more logos), but I’m not optimistic.

By now, the CTA has also issued guidelines for HDR. DigitalEurope hasn’t yet. It’d be great for consumers, retailers and manufacturers alike if the two organizations could agree on a definition as well as a logo this time.

Ultra HD display definition

Resolution

  • CTA: at least 3840×2160
  • DigitalEurope: at least 3840×2160

Aspect ratio

  • CTA: 16:9 or wider
  • DigitalEurope: 16:9

Frame rate

  • CTA: supporting 24p, 30p and 60p
  • DigitalEurope: 24p, 25p, 30p, 50p, 60p

Chroma subsampling

  • CTA: not specified
  • DigitalEurope: 4:2:0 for 50p and 60p; 4:2:2 for 24p, 25p and 30p

Color bit depth

  • CTA: minimum 8-bit
  • DigitalEurope: minimum 8-bit

Colorimetry

  • CTA: BT.709 color space; may support wider colorimetry standards
  • DigitalEurope: minimum BT.709

Upconversion

  • CTA: capable of upscaling HD to UHD
  • DigitalEurope: not specified

Digital input

  • CTA: one or more HDMI inputs supporting HDCP 2.2 or equivalent content protection
  • DigitalEurope: HDMI with HDCP 2.2

Audio

  • CTA: not specified
  • DigitalEurope: PCM 2.0 stereo

Logo

  • CTA and DigitalEurope each use their own Ultra HD logo

CTA definition of HDR-compatible:

A TV, monitor or projector may be referred to as an HDR Compatible Display if it meets the following minimum attributes:

  1. Includes at least one interface that supports HDR signaling as defined in CEA-861-F, as extended by CEA-861.3.
  2. Receives and processes static HDR metadata compliant with CEA-861.3 for uncompressed video.
  3. Receives and processes HDR10 Media Profile* from IP, HDMI or other video delivery sources. Additionally, other media profiles may be supported.
  4. Applies an appropriate Electro-Optical Transfer Function (EOTF), before rendering the image.

CEA-861.3 references SMPTE ST 2084 and ST 2086.

What are consumers, broadcasters, TV manufacturers, technology developers and standardization bodies to do right now?

I wouldn’t want to hold any consumer back, but I couldn’t blame them if they decided to postpone purchasing a new TV a little longer until the standards for HDR have been nailed down. Similarly, for broadcasters and production companies it only seems prudent to postpone big investments in HDR production equipment and workflows.

For all parties involved in technology development and standardization, my advice would be as follows. It’s inevitable we’re going to see a mixture of TV sets with varying capabilities in the market – SDR HDTVs, SDR UHD TVs and HDR UHD TVs, and that’s not even taking into consideration near-future extensions like HFR.

Simply ignoring some of these segments would be a very unwise choice: cutting off SDR UHD TVs from a steady flow of UHD content for instance would alienate the early adopters who bought into UHD TV already. The CE industry needs to cherish these consumers. It’s bad enough that those Brits who bought a UHD TV in 2014 cannot enjoy BT Sport’s Ultra HD service today because the associated set-top box requires HDCP 2.2 which their TV doesn’t support.

It is not realistic to cater to each of these segments with separate channels either. Even if the workflows can be combined, no broadcaster wants to spend the bandwidth to transmit the same channel in SDR HD and HDR HD, plus potentially SDR UHD and HDR UHD.

Having separate channels for HD and UHD is inevitable but for HDR to succeed it’s essential for everyone in the production and delivery chain that the HDR signal be an extension to the broadcast SDR signal and the SDR signal be compatible with legacy Rec.709 TV sets.

Innovations like Ultra HD resolution, High Dynamic Range, Wide Color Gamut and High Frame Rate will not come all at once with a big bang but (apart from HDR and WCG which go together) one at a time, leading to a fragmented installed base. This is why compatibility and ‘graceful degradation’ are so important: it’s impossible to cater to all segments individually.

What is needed now is alignment and clarity in this apparent chaos of SDOs (Standards Defining Organizations). Let’s group them along the value chain:

  • Production: SMPTE, ITU-R
  • Compression: MPEG, VCEG
  • Broadcast: ATSC, SCTE, EBU/DVB, ARIB, SARFT
  • Telecom: ETSI
  • Media/Streaming: BDA, DECE (UV), MovieLabs
  • CE: CTA, DigitalEurope, JEITA

Within each segment, the SDOs need to align, because having different standards for the same thing is counterproductive. Across segments it may be fine to apply different standards – if broadcasting uses a different HDR format than packaged media, for instance; after all, they have differing requirements. Along the chain, HDR standards do not need to be identical, but they have to be compatible. Hopefully organizations like the Ultra HD Forum can facilitate and coordinate this between the segments of the chain.

If the various standardization organizations can figure out what HDR flavor to use in which case and agree on this, the future is looking very bright indeed.

Yoeri Geutskens has worked in consumer electronics for more than 15 years. He writes about high-resolution audio and video. You can follow his Ultra HD and 4K coverage on Twitter at @UHD4k.

Posted on 1 Comment

@nebul2’s 14 reasons why 2015 will be yet another #UHD #IBCShow

Ultra HD or 4K has been a key topic of my pre- and post-IBC blogs for over 5 years. I’ve recently joined the Ultra HD Forum, serving on the communications working group. That’s a big commitment and investment, as I don’t have any large company paying my bills. I’m making it because I believe the next 18 months will see UHD go from the subject of trials and precursor launches by big operators to something no operator can be without. Time to get off the fence. I once wrote that the 3D emperor didn’t have any clothes on; well, the UHD emperor is fully clothed.

Of course much still needs to be achieved before we see mass adoption. I don’t know if HDR and 4K resolution will reach market acceptance one at a time or both together, and yes, I don’t know which HDR specification will succeed. But I know it’s all coming.

Below is a list of 14 key topics ordered by my subjective (this is a blog remember) sense of comfort on each. I start with areas where the roadmap to industrial strength UHD delivery is clear to me and end with those where I’m the most confused.

Note on vocabulary: 4K refers to a screen resolution for next-gen TV, whereas UHD includes that spatial resolution (UHD Phase 2 documents even refer to an 8K resolution) but also covers frame rate, HDR and next-generation audio.

So as I wander round IBC this year – or imagine I’m doing that, as I probably won’t have time – I’ll look into the following 14 topics with growing interest.

1. Broadcast networks (DVB)

I doubt I’ll stop by the big satellite booths for example, except of course for free drinks and maybe to glimpse the latest live demos. The Eutelsat, Intelsat or Astras of this world have a pretty clear UHD story to tell. Just like the cableCos, they are the pipe and they are ready, as long as you have what it takes to pay.

2. Studio equipment (cameras etc.)

As a geek, I loved the Canon demos at NAB, both of affordable 4K cameras and their new ultra sensitive low-light capabilities. But I won’t be visiting any of the studio equipment vendors, simply because I don’t believe they are on the critical path for UHD success. The only exception to this is the HDR issues described below.

 3. IP network; CDN and Bandwidth

Bandwidth constricts UHD delivery; it would be stupid to claim otherwise. All I’m saying is that with this issue so high on the list, everything is clear in the mid-term. We know how fast high-speed broadband (over 30 Mbps) is arriving in most markets. In the meantime, early adopters without access can buy themselves a UHD Blu-ray player by Christmas this year and use progressive download services. The Ultra HD Alliance has already identified 25 online services, several of which support PDL. Once UHD streams get to the doorstep or the living room, there is still the issue of distributing them around the home. But several vendors like AirTies are addressing that specific issue, so again, even if it isn’t fixed yet, I can see how it will be.

 4. Codecs (HEVC)

The angst around NAB this year when V-nova came out with a bang has subsided. It seems now that even if such a disruptive technology does come through in the near-term, it will complement not replace HEVC for UHD delivery.

The codec space dropped from a safe number 2 on my list down to 4 after the very recent scares over royalties from the HEVC Advance group, which wants 0.5% of content owners’ and distributors’ gross revenue. Industry old-timers have reassured me that this kind of posturing is normal and that the market will settle down naturally at acceptable rates.

 5. Head-ends (Encoders, Origins, etc.)

I always enjoy demos and discussion on the booths of the likes of Media Excel, Envivio, Harmonic, Elemental or startup BBright and although I’ll try to stop by, I won’t make a priority of them because here again, the mid-term roadmaps seem relatively clear.

I’ve been hearing contradictory feedback on the whole cloud-encoding story that has been sold to us for a couple of years already. My theory – to be checked at IBC – is that encoding in the cloud really does make sense for constantly changing needs and where there is budget. But for tier-2 operators running on a shoestring – and there are a lot of them – the vendors are still mainly shifting appliances. It’s kind of counterintuitive, because you’d expect the whole cloud concept of pooling resources to work better for the smaller guys. I must be missing something here, so do ping me with info and I’ll update this section.

 6. 4K/UHD resolutions

While there is no longer any concern on what the screen resolutions will be, I am a little unclear as to the order in which they will arrive. With heavyweights like Ericsson openly pushing for HDR before 4K, I’m a little concerned that lack of industry agreement on this could confuse the market.

 7. Security for UHD

Content owners and security vendors like Verimatrix all agree that better security is required for UHD content. I see no technical issues here – just that if the user experience is adversely affected in any way (remember the early MP3 years), we could see the incentive for illegal file transfer grow, just when legal streaming seems to be taking off at last.

 8. TV sets & STBs

Well into second half of my list, we’re getting into less clear waters.

When it’s the TV set that is doing the UHD decoding, we’re back at the product cycle issue that has plagued smart TVs. It’s all moving too fast for a TV set that people still would like to keep in the living room for over 5 years.

On the STB side, we’ve seen further consolidation since last year’s IBC. Pace, for example, is no more; Cisco is exiting STBs, etc. It seems that only players with huge scale will survive. Operators like Swisscom or Orange can make hardware vendors’ lives harder by commoditizing their hardware, using software-only vendors such as SoftAtHome to deliver advanced features.

 9. Frame rates

This is a really simple one, but consensus is needed. At 4K screen resolution the eye/brain is more sensitive to artifacts. Will refresh rates standardize at 50Hz or 60Hz? Will we really ever need 120Hz?

It’s clear that doubling the frame rate does not double the required bandwidth, as clever compression techniques come into play. But I haven’t seen a consensus on what the bandwidth implication of higher frame rates will actually be.

10. Next Gen Audio

There are only a few contenders out there, and all have compelling solutions. I’m pretty keyed up on DTS’s Headphone:X streamed with Unified Streaming packagers because I’m helping them write an eBook on the subject. Dolby is, of course, a key player here, but for me it’s not yet clear how multiple solutions will cohabit. Nor is it clear if and when we’ll move from simple channel-based audio to scene-based or object-based audio. Will open-source projects like Ambiophonics play a role, and what about binaural audio?

11. HDR

High Dynamic Range is about better contrast. The brain also perceives more detail when contrast is improved, so it’s almost like getting more pixels for free. But the difficulty with HDR, and why it’s near the bottom of my list, is that there are competing specifications. And even once a given specification is adopted, its implementation on a TV set can vary from one CE manufacturer to another. A final reservation I have is the extra power consumption HDR entails, which goes against current CE trends.

12. Wide Color Gamut

As HDR brings more contrast to pixels, WCG brings richer and truer colors. Unlike with HDR, the issue isn’t which spec to follow, as WCG is already catered for in HEVC, for example. No, it’s more about when to implement it and how color mapping will be unified across display technologies and vendors.

 13. Work flows

Workflow from production through to display is a sensitive issue because it is heavily dependent on skills and people, so it’s not just a matter of choosing the right technology. To produce live UHD content including HDR, there is still no industry-standard way of setting up a workflow.

 14. UHD-only content

The pressure to recoup investments in HD infrastructure makes the idea of UHD content that is unsuitable for HD downscaling taboo. From a business perspective, most operators consider UHD an extension or add-on rather than something completely new. There is room for a visionary to come and change that.

Compelling UHD content, where the whole screen is in focus (video rather than cinema lenses), gives filmmakers a new artistic dimension to work with. There is enough real estate on screen to offer multiple user experiences.

In the world of sports, a UHD screen could offer a fixed view of a whole football pitch, for example – though if that video were viewed on an HD screen, the ball probably wouldn’t be visible. Ads that we have to watch dozens of times could be made more fun in UHD, as there could be different stories going on in different parts of the screen; it would almost be an interactive experience…