
Consolidation in the air at Broadband World Forum 2016


Bonding touted as a solution to boost bandwidth for fixed and mobile services

Major trade shows can provide useful bellwethers of a given industry, and the recent Broadband World Forum 2016 highlighted two notable trends embracing both the fixed and mobile space, one business related and the other technical. For the former, consolidation was a major theme, only accentuated by the announcement of AT&T's bid for Time Warner just after the show had ended. But there was also a sentiment that consolidation should not be allowed to proceed so far that it inhibits competition and consumer choice, which are essential for any thriving market in our mixed global economy.

Enterprise may drive Internet of Things boom


The Internet of Things (IoT) has reached a critical stage in its evolution where it seems to be caught between two tipping points, waiting for the final explosion after the arrival of joined-up applications connecting different domains. The first tipping point came around 2014 with proven single-domain applications and the arrival of big players such as Staples in retail, British Gas among energy utilities and ADT in premises security. That was also the year Google acquired smart thermostat leader Nest. The big data centre systems companies also piled in, though more on the enterprise side, such as IBM with a $3 billion investment early in 2015 in its Watson IoT centre based in Munich.

Since then, though, the sheen has come off IoT a little, with mixed signals from the leading players. Google in particular has struggled, rather as it did initially with Android TV, with Nest failing to bring out promised new products and recently calling time on Revolv, the smart home hub for wireless control of end devices that it acquired amid much fanfare in October 2014 but withdrew in May 2016. It now looks like Google is pursuing a more distributed approach, promoting direct interoperability among its own Nest devices without any intermediate hub, but that is not yet completely clear.

Another big US technology company, Intel, has also found the IoT sector harder going than expected, with its IoT Group reporting reduced revenue growth and a 25% year-on-year slump in operating income, down to $132 million, for 2015. The common theme here is the failure of IoT to break out of its silos, so that both companies were left connecting their own things.

British Gas has fared better, largely because as an energy utility it started with the expectation that it would be confined to its own domain for a while before branching out into other smart home sectors such as security and environmental control. The company is instead focusing on developing the analytics tools it believes will enable wider success in a future joined-up IoT, and has been investing in real-time processing of the large data sets generated by its Hive connected thermostat. Hive allows users to control their boilers and central heating systems remotely by phone, which generates 30,000 messages a second, amounting to 40 TB of static data so far, distributed across 30 nodes. Like Google, British Gas has created a dedicated IoT subsidiary, called Connected Home, which has built an open source software stack running on the Apache Cassandra distributed database to process data both in real time and offline.
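To make that pattern concrete, here is a minimal sketch in Python using the standard Cassandra driver of how thermostat telemetry might be written to and read back from such a cluster. The keyspace, table, column names and addresses are hypothetical; this illustrates the general approach, not Connected Home's actual stack.

```python
# Minimal sketch: thermostat telemetry stored in Apache Cassandra.
# Hypothetical keyspace/table names; not British Gas / Connected Home code.
from datetime import datetime, timezone
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.1", "10.0.0.2"])   # a couple of the cluster's nodes
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS hive_demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS hive_demo.readings (
        device_id text, ts timestamp, temperature double, setpoint double,
        PRIMARY KEY (device_id, ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

# Real-time path: one of the tens of thousands of messages per second.
session.execute(
    "INSERT INTO hive_demo.readings (device_id, ts, temperature, setpoint) "
    "VALUES (%s, %s, %s, %s)",
    ("thermostat-42", datetime.now(timezone.utc), 20.5, 21.0),
)

# Offline path: pull recent readings for one device for batch analytics.
rows = session.execute(
    "SELECT ts, temperature FROM hive_demo.readings "
    "WHERE device_id = %s LIMIT 100",
    ("thermostat-42",),
)
for row in rows:
    print(row.ts, row.temperature)
```

The design choice worth noting is the partition key: keying the table by device makes both the high-volume write path and the per-device offline queries cheap on a cluster of commodity nodes.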

British Gas, then, is preparing for IoT's second tipping point, which will come with joined-up services that exploit synergy between different domains. IBM shares this conviction from its enterprise-focused perspective, drawing heavily on its cognitive computing work at the Thomas J. Watson Research Center in New York, one line of work being the analysis of data from multiple remote sensors for predictive diagnostics. IBM is already enabling Pratt & Whitney to monitor 4,000 commercial engines and obtain early warning of faults that cause costly service outages if left unfixed until later, even if they are not safety critical.

Telcos are of course also intent on capitalizing on the IoT from their position as broadband providers to homes. One early mover is Paris-based SoftAtHome, in which three major telcos are investors: Orange of France, Swisscom and Etisalat, based in the United Arab Emirates. The software developer has extended its home operating platform with CloudAtHome to enable centralized control of devices, with potential for integration between domains. All such initiatives must support the key wireless protocols, such as Wi-Fi, Bluetooth and Zigbee, that IoT devices such as thermostats use to communicate. SoftAtHome uses a hybrid model combining some form of home hub and data repository with cloud-based processes. Such a hybrid approach aims to deliver the required flexibility, security (and privacy), performance and functional breadth. Flexibility comes from being able to deploy processes in the cloud or at home as appropriate, while keeping sensitive data within the local repository helps ensure security and privacy. Performance may require some processes to run locally to keep latency down, while some features may need cloud components.

A close look at this cloud/home distribution shows that in some cases the cloud should be partitioned between remote processes that may be executed in a distant data centre (what is usually called the cloud) and intermediate ones that might be best run at the network edge. This is known as Fog Computing, where some storage and processing takes place more locally perhaps in a DSLAM or even a street cabinet. The argument is that as IoT takes off, a lot of the initial data collection and analytics will be best performed at a Fog level before in some cases being fed back to the cloud after aggregation.
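A minimal sketch of that fog pattern, assuming a purely hypothetical cloud endpoint and payload format: raw readings are summarised at the edge node (a DSLAM, street cabinet or home hub) and only the small aggregate is forwarded upstream.

```python
# Illustrative sketch of fog-level aggregation: raw readings are summarised
# locally and only the aggregate crosses the WAN to the cloud.
# The endpoint URL and payload fields are hypothetical.
import statistics
import requests

CLOUD_ENDPOINT = "https://example-cloud.invalid/iot/aggregates"  # placeholder

def aggregate_and_forward(device_id: str, readings: list[float]) -> None:
    if not readings:
        return
    summary = {
        "device_id": device_id,
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }
    # Only the compact summary is sent upstream; raw samples stay at the edge.
    requests.post(CLOUD_ENDPOINT, json=summary, timeout=5)

aggregate_and_forward("thermostat-42", [20.1, 20.4, 20.3, 20.6])
```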

Fog could also work well for enterprise IoT, where it might serve as a campus-level control and aggregation layer within a larger cloud-based infrastructure. It could also play a role as enterprise IoT becomes customer facing rather than mainly concerned with internal or supply chain operations. This could be a third IoT tipping point bringing together enterprise and consumer IT, if a recent survey from Gartner is to be believed. This found that while only 18 percent of today's enterprise IoT deployments are focused on customer experience, the figure will jump to 34 percent over the year to Q1 2017. That represents roughly a threefold absolute jump, given that Gartner is also forecasting the proportion of enterprises with IoT deployed somewhere to soar from 29 percent now to 43 percent in a year's time. Gartner also expects IoT to expand into new service-related industry segments such as insurance, beyond the heavier industries like manufacturing, utilities and logistics where it is concentrated now.

Such enterprise IoT forecasts have a history of proving more accurate than some of the over-hyped consumer predictions from analysts. This means that if consumer IoT does continue to stall, it may be dragged forward by enterprises seeking competitive advantage as well as new revenues, as we are seeing to an extent with the likes of British Gas.


Virtualization approaches final frontiers in the home


Virtualization has been around almost as long as business computing, having been invented by IBM in the 1960s so that “big iron” mainframes could mimic multiple smaller machines for economies of scale. Later, after personal computers arrived, it reached the desktop with products like SoftPC, allowing Apple Mac computers to run Microsoft's Windows operating system and associated applications.

Another ten years on, the scope of virtualization expanded during the noughties to allow separation of hardware and software outside the data center, in networking equipment such as routers and firewalls, and then finally the TV industry joined the party. Even that was not the end of the story, since virtualization is now beating a path into the home, not just for the gateway or set-top boxes, but right to the ultimate client, whether a user's PC or even an Internet of Things (IoT) device like a thermostat.

Over time the motivations have evolved subtly, so that virtualization became more about exploiting lower-cost, more flexible commodity hardware than about getting the best value out of a few large computers and their superior capabilities in areas such as resilience and security. But now, as virtualization comes together with the cloud, there is another dimension: enabling much greater flexibility over where both hardware and software are deployed.

This shift to virtualization around the cloud has been aided by major standardization efforts, especially OpenFlow, the open standard that defines the interface between the control and forwarding layers of an SDN (Software Defined Network) architecture. SDN enables traditional networking functions, notably routing from node to node across IP networks, to be split between packet forwarding, which can be done locally on commodity hardware, and the higher-level control logic, which can run remotely somewhere in the cloud if desired. OpenFlow then enables a physical device in the home, such as a gateway, to be “bridged” to its virtual counterpart within the network.
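As a conceptual illustration of that split (not the actual OpenFlow wire protocol or any vendor's API), the forwarding element in the home can be reduced to a simple match-action flow table populated by a remote controller, while anything the table cannot answer is punted up to the control plane:

```python
# Conceptual sketch of the SDN control/forwarding split described above.
# Rule format and actions are invented for illustration only.
from dataclasses import dataclass

@dataclass
class FlowRule:
    match_dst: str   # e.g. a destination prefix
    action: str      # e.g. "forward:lan", "send_to_cloud", "drop"

class GatewayDataPlane:
    """Runs on commodity hardware in the home; holds no routing logic of its own."""
    def __init__(self) -> None:
        self.flow_table: list[FlowRule] = []

    def install_rule(self, rule: FlowRule) -> None:
        # Called by the remote controller over the southbound interface.
        self.flow_table.append(rule)

    def handle_packet(self, dst: str) -> str:
        for rule in self.flow_table:
            if dst.startswith(rule.match_dst):
                return rule.action
        return "punt_to_controller"   # unknown flows go up to the control plane

# The controller, running "somewhere in the cloud", pushes policy down.
gw = GatewayDataPlane()
gw.install_rule(FlowRule(match_dst="192.168.1.", action="forward:lan"))
gw.install_rule(FlowRule(match_dst="10.", action="send_to_cloud"))
print(gw.handle_packet("192.168.1.20"))   # -> forward:lan
```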

The key point here is that not all home gateway functions should be hived off to the cloud, since for example sensitive personal data may be best stored at home perhaps on a NAS (Network Attached Storage) device. It may also be that some processes will run more effectively locally for performance or security reasons, including some associated with the IoT. Virtualization combined with the cloud via OpenFlow allows this flexibility such that functions as well as underlying hardware can be located optimally for given services without incurring a cost penalty.

Just as IBM broke the ground for virtualization in the data center, we are now seeing virtualization reach into the home. Orange founded the French software company SoftAtHome in 2007 so it could deploy hardware-independent home gateways. Other vendors have since joined the fray, with Alcatel-Lucent (now Nokia) among the leaders with its vRGW (virtualized Residential Gateway) portfolio. Nokia, like SoftAtHome, argues that with its products operators can turn up new and innovative services faster, while reducing CAPEX and OPEX for existing and new services. Updates can be applied centrally without having to replace hardware or visit homes, as has been common practice in the data center for some years.

Not surprisingly, then, some technology vendors have come into the virtualized home gateway area from the enterprise arena. One of these is Japanese IT giant NEC with its networking software subsidiary NetCracker, which took Austrian incumbent Telekom Austria through an in-depth trial of virtualized customer premises equipment (vCPE). This integrated SDN technology with virtual network functions (VNFs) through a common service and network orchestration platform that also involved technology from other vendors. The telco cited as a key benefit the ability to have a single point of delivery for home media and entertainment content.

Now virtualization is approaching its next frontier in the IoT arena, where the motivation shifts yet again. One challenge for IoT is to be able to configure generic devices for a range of applications rather than having to make dedicated hardware for each one. This is again about being able to use off-the-shelf hardware for a range of services, but this time the commoditization must occur down at the chip level. This calls for embedded virtualization, so that small single-chip devices such as sensors can be remotely programmed and repurposed in the field. Apart from flexibility and cost reduction, embedded virtualization will confer greater security and real-time performance, since operations are executed within a single SoC (System on Chip). Even this is not entirely new, since embedded virtualization has already emerged in other sectors such as the automotive industry, where again there is a need for field upgradeability, given that vehicles as a whole now have a longer life cycle than many of the underlying software-based components.

The real challenge for broadband operators will be to capitalize on end-to-end virtualization extending across the home network, which presents an opportunity for key vendors like Nokia and SoftAtHome to smooth the path.


“HaLow” sets stage for multi-channel Wi-Fi


The Wi-Fi Alliance's announcement of the low-power version IEEE 802.11ah, dubbed “HaLow”, was dismissed by some analysts as being too late to make a significant impact in the fast-growing Internet of Things sector. That view is wrong and seriously discounts the power and momentum behind Wi-Fi, to the extent that HaLow has already received extensive coverage in the popular as well as the technical press. It is already far closer to being a household name than other longstanding contenders as wireless protocols for IoT devices, such as Zigbee and Z-Wave.

It is true that certification of HaLow compliant products will not begin until 2018, but with IoT surging forward on a number of fronts including the smart car, digital home and eHealth, SoC vendors such as Qualcomm are likely to bring out silicon before that. There are good reasons for expecting HaLow to succeed, some relating to its own specifications and others more to do with the overall evolution of Wi-Fi as a whole.

Another factor is the current fragmentation among existing contenders, with a number of other protocols vying alongside Zigbee and Z-Wave. This may seem to be a reason for not needing yet another protocol, but it actually means none of the existing ones has gained enough traction to repel a higher-profile invader.

More to the point, though, HaLow has some key benefits over the others, one being its affinity with IP and the Internet through being part of Wi-Fi. Zigbee has responded by collaborating with another wireless protocol developer, Thread, to incorporate IP connectivity. But HaLow has other advantages, including greater range and the ability to operate in challenging RF environments. There is already a sense in which the others are having to play catch-up, even though they have been around for much longer.

It is true that Bluetooth now has its low energy version to overcome the very limited range of the main protocol, but even this is struggling to demonstrate adequate performance over larger commercial sites. The Wi-Fi Alliance claims that HaLow is highly robust and can cope with most real sites from large homes having thick walls containing metal, to concrete warehouse complexes.

 

The big picture is that Wi-Fi is looking increasingly like a multi-channel protocol operating at a range of frequencies to suit differing use cases. To date we have two variants, 2.4 GHz and 5 GHz, which tend to get used almost interchangeably, with the latter doubling up to provide capacity when the former is congested. In future, though, there will be four channels, still interchangeable but tending to be dedicated to different applications, combining to yield a single coherent standard that covers all the bases and perhaps vies with LTE outdoors for connecting various embedded IoT and M2M devices.

HaLow comes in at around 900 MHz, which means it has less bandwidth but greater coverage than the higher frequency Wi-Fi bands and has been optimized to cope well with interference both from other radio sources and physical objects. Then we have the very high frequency 802.11ad or WiGig standard coming along at 60 GHz enabling theoretical bit rates of 5 Gbps or more, spearheaded by Qualcomm, Intel and Samsung. WiGig is a further trade-off between speed and coverage and it will most likely be confined to in-room distribution of decoded ultra HD video perhaps from a gateway or set top to a big screen TV or home cinema.

Then the 5 GHz version might serve premium video to other devices around the home, while 2.4 GHz delivers general Internet access. That would leave HaLow to take care of wearables, sensors and other low-power devices that need coverage but only modest bit rates. As it happens, HaLow will outperform all the other contenders on capacity except Bluetooth, with which it will be roughly on a par.
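To illustrate how the four bands could divide the work, here is a toy band-selection rule in Python; the thresholds and device profiles are invented for illustration and do not come from any standard or vendor implementation.

```python
# Illustrative only: mapping device profiles to the four Wi-Fi "channels"
# discussed above. Thresholds are made up for the sake of the example.
def pick_band(bitrate_mbps: float, battery_powered: bool, same_room: bool) -> str:
    if bitrate_mbps > 1000 and same_room:
        return "60 GHz (802.11ad / WiGig): in-room ultra HD video"
    if battery_powered and bitrate_mbps < 1:
        return "900 MHz (802.11ah / HaLow): sensors and wearables"
    if bitrate_mbps > 20:
        return "5 GHz: premium video around the home"
    return "2.4 GHz: general Internet access"

print(pick_band(bitrate_mbps=0.1, battery_powered=True, same_room=False))
print(pick_band(bitrate_mbps=3000, battery_powered=False, same_room=True))
```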

 

HaLow will be embraced by key vendors in the smart home and IoT arena, such as Paris-based SoftAtHome, which already supports the other key wireless protocols in its software platform through its association with relevant hardware and SoC vendors. SoftAtHome can insulate broadband operators from underlying protocols so that they do not have to be dedicated followers of the wireless wars.

AirTies is another vendor with a keen interest as one of the leading providers of Wi-Fi technology for the home, already aiming to deliver the levels of coverage and availability promised by HaLow in the higher 2.4 GHz and 5 GHz bands. It does this by creating a robust mesh from multiple Access Points (APs), to make Wi-Fi work more like a wired point to point network while retaining all the flexibility of wireless.

 

All these trends point towards Wi-Fi becoming a complete quad-channel wireless offering, enabling operators to be one-stop shops for the digital home of the future, as well as addressing many IoT requirements outside it.

At the same time it is worth bearing in mind that the IoT and its relative, M2M, form a very large canvas, extending to remote outdoor locations, some of which are far more challenging for RF signals than almost any home. In any case, while HaLow may well see off all-comers indoors, it will only be a contender outdoors in areas close to fixed broadband networks. That is why there is so much interest in Heterogeneous Networks (HetNets) combining Wi-Fi with LTE, and also why there are several other emerging wireless protocols for longer-distance IoT communications.

One of these others is Long Range Wide Area Network (LoRaWAN), a low power wireless networking protocol announced in March 2015, designed for secure two way communication between low-cost battery-powered embedded devices. Like HaLow it runs at sub-GHz frequencies, but in bands reserved for scientific and industrial applications, optimized for penetrating large structures and subsurface infrastructures within a range of 2km. LoRaWAN is backed by a group including Cisco and IBM, as well as some leading Telcos like Bouygues Telecom, KPN, SingTel and Swisscom. The focus is particularly on harsh RF environments previously too challenging or expensive to connect, such as mines, underwater and mountainous terrain.

Another well-backed contender is Narrowband LTE (NB-LTE), announced in September 2015 with Nokia, Ericsson and Intel behind it, where the focus is more on long-range, power-efficient communications to remote embedded sensors on the ground. So it still looks like being a case of horses for courses given the huge diversity of RF environments where IoT and M2M will be deployed, with HaLow a likely winner indoors but coexisting with others outside.


@nebul2’s 14 reasons why 2015 will be yet another #UHD #IBCShow


Ultra HD or 4K has been a key topic of my pre- and post-IBC blogs for over five years. I've recently joined the Ultra HD Forum, serving on the communications working group. That's a big commitment and investment, as I don't have any large company paying my bills. I'm making it because I believe the next 18 months will see the transition from UHD as the subject of trials and precursor launches for big operators to something no operator can be without. Time to get off the fence. I once wrote that the 3D emperor didn't have any clothes on; well, the UHD emperor is fully clothed.

Of course much still needs to be achieved before we see mass adoption. I don’t know if HDR and 4K resolution will reach market acceptance one at a time or both together, and yes, I don’t know which HDR specification will succeed. But I know it’s all coming.

Below is a list of 14 key topics ordered by my subjective (this is a blog remember) sense of comfort on each. I start with areas where the roadmap to industrial strength UHD delivery is clear to me and end with those where I’m the most confused.

Note on vocabulary: 4K refers to a screen resolution for next-gen TV, whereas UHD includes that spatial resolution (one even sees UHD phase 2 documents refer to an 8K resolution) but also frame rate, HDR and next-generation audio.

So as I wander round IBC this year, or imagine I'm doing that, as I probably won't have time, I'll look into the following 14 topics with growing interest.

1. Broadcast networks (DVB)

I doubt I’ll stop by the big satellite booths for example, except of course for free drinks and maybe to glimpse the latest live demos. The Eutelsat, Intelsat or Astras of this world have a pretty clear UHD story to tell. Just like the cableCos, they are the pipe and they are ready, as long as you have what it takes to pay.

2. Studio equipment (cameras etc.)

As a geek, I loved the Canon demos at NAB, both of affordable 4K cameras and their new ultra-sensitive low-light capabilities. But I won't be visiting any of the studio equipment vendors, simply because I don't believe they are on the critical path for UHD success. The only exceptions to this are the HDR issues described below.

3. IP network: CDN and bandwidth

Bandwidth constrains UHD delivery; it would be stupid to claim otherwise. All I'm saying is that by putting this issue so high on the list, everything is clear in the mid-term. We know how fast high-speed broadband (over 30 Mbps) is arriving in most markets, and the rough sums sketched below show why that threshold matters. In the meantime, early adopters without access can buy themselves a UHD Blu-ray by Christmas this year and use progressive download services. The Ultra HD Alliance has already identified 25 online services, several of which support PDL. Once UHD streams get to the doorstep or the living room, there is still the issue of distributing them around the home. But several vendors like AirTies are addressing that specific issue, so again, even if it isn't fixed, I can see how it will be.
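Here are those rough sums. The UHD bitrate is an assumed figure for illustration (HEVC UHD streams are commonly quoted in the 15-25 Mbps range), not a number from the Ultra HD Alliance:

```python
# Back-of-envelope check (assumed figures, for illustration only): a 30 Mbps
# line leaves little headroom for a live UHD stream, while progressive
# download of a two-hour title is comfortably feasible.
uhd_bitrate_mbps = 20          # assumed HEVC UHD bitrate
line_speed_mbps = 30           # "high-speed broadband" threshold in the text
movie_hours = 2

headroom_mbps = line_speed_mbps - uhd_bitrate_mbps
file_size_gb = uhd_bitrate_mbps * movie_hours * 3600 / 8 / 1000
download_hours = file_size_gb * 8 * 1000 / line_speed_mbps / 3600

print(f"Streaming headroom: {headroom_mbps} Mbps")
print(f"2-hour UHD title: ~{file_size_gb:.0f} GB, "
      f"~{download_hours:.1f} h to download at {line_speed_mbps} Mbps")
```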

4. Codecs (HEVC)

The angst around NAB this year when V-nova came out with a bang has subsided. It seems now that even if such a disruptive technology does come through in the near term, it will complement rather than replace HEVC for UHD delivery.

The codec space dropped from a safe 2 on my list down to 4 with the very recent scares over royalties from the HEVC Advance group, which wants 0.5% of content owners' and distributors' gross revenue. Industry old-timers have reassured me that this kind of posturing is normal and that the market will settle down naturally at acceptable rates.

5. Head-ends (encoders, origins, etc.)

I always enjoy demos and discussion on the booths of the likes of Media Excel, Envivio, Harmonic, Elemental or startup BBright, and although I'll try to stop by, I won't make a priority of them because, here again, the mid-term roadmaps seem relatively clear.

I've been hearing contradictory feedback on the whole cloud-encoding story that has been sold to us for a couple of years already. My theory – to be checked at IBC – is that encoding in the cloud really does make sense for constantly changing needs and where there is budget. But for T2 operators running on a shoestring – and there are a lot of them – the vendors are still mainly shifting appliances. It's kind of counterintuitive, because you'd expect the whole cloud concept of mutualizing resources to work better for the smaller guys. I must be missing something here, so do ping me with info and I can update this section.

6. 4K/UHD resolutions

While there is no longer any concern on what the screen resolutions will be, I am a little unclear as to the order in which they will arrive. With heavyweights like Ericsson openly pushing for HDR before 4K, I’m a little concerned that lack of industry agreement on this could confuse the market.

7. Security for UHD

Content owners and security vendors like Verimatrix have all agreed that better security is required for UHD content. I see no technical issues here - just that if the user experience is adversely affected in any way (remember the early MP3 years), we could see the incentive for illegal file transfer grow, just when legal streaming seems to be taking off at last.

8. TV sets & STBs

Well into the second half of my list, we're getting into less clear waters.

When it’s the TV set that is doing the UHD decoding, we’re back at the product cycle issue that has plagued smart TVs. It’s all moving too fast for a TV set that people still would like to keep in the living room for over 5 years.

On the STB side, we've seen further consolidation since last year's IBC. Pace, for example, is no more; Cisco is exiting STBs, and so on. It seems that only players with huge scale will survive. Operators like Swisscom or Orange can make hardware vendors' lives harder by commoditizing their hardware, using software-only vendors such as SoftAtHome to deliver advanced features.

9. Frame rates

This is a really simple one but for which consensus is needed. At a 4K screen resolution the eye/brain is more sensitive to artifacts. Will refresh rates standardize at 50Hz or 60Hz? Will we really ever need 120Hz?

It's clear that doubling a frame rate does not double the required bandwidth, as clever compression techniques come into play. But I haven't seen a consensus on what the bandwidth implication of greater frame rates will actually be.

10. Next Gen Audio

There are only a few contenders out there, and all have compelling solutions. I'm pretty keyed up on DTS's HeadphoneX streamed with Unified Streaming packagers because I'm helping them write an eBook on the subject. Dolby is, of course, a key player here, but for me it's not yet clear how multiple solutions will cohabit. Nor is it clear how, if and when, we'll move from simple channel-based audio to scene-based or object-based audio. Will open-source projects like Ambiophonics play a role, and what about binaural audio?

11. HDR

High Dynamic Range is about better contrast. Also, the brain perceives more detail when contrast is improved, so it's almost like getting more pixels for free. But the difficulty with HDR, and why it's near the bottom of my list, is that there are competing specifications. And even once a given specification is adopted, its implementation on a TV set can vary from one CE manufacturer to another. A final reservation I have is the extra power consumption it will entail, which goes against current CE trends.

12. Wide Color Gamut

As HDR brings more contrast to pixels, WCG brings richer and truer colors. Unlike with HDR, the issue isn't about which spec to follow, as it is already catered for in HEVC, for example. No, it's more about when to implement it and how the color mapping will be unified across display technologies and vendors.

13. Workflows

Workflow from production through to display is a sensitive issue because it is heavily dependent on skills and people. So it's not just a matter of choosing the right technology. To produce live UHD content including HDR, there is still no industry-standard way of setting up a workflow.

14. UHD-only content

The pressure to recoup investments in HD infrastructure makes the idea of UHD content that is unsuitable for HD downscaling taboo. From a business perspective, most operators consider UHD as an extension or add-on rather than something completely new. There is room for a visionary to come and change that.

Compelling UHD content, where the whole screen is in focus (video rather than cinema lenses), gives filmmakers a new artistic dimension to work with. There is enough real estate on screen to offer multiple user experiences.

In the world of sports, a UHD screen could offer a fixed view of a whole football pitch, for example. But if that video were seen on an HD screen, the ball probably wouldn't be visible. Ads that we have to watch dozens of times could be made more fun in UHD, as there could be different stories going on in different parts of the screen; it would almost be an interactive experience…


IBC 2013, impressions of a 4K OTT show

Although OTT has been an IBC topic for a few years, this year we actually saw a plethora of end-to-end platforms that actually worked, often purely in the cloud. The range of suppliers was impressive, from Israeli start-ups like Vidmind to multinationals like Siemens or the pioneer KIT digital, now reborn as Piksel. There was also much more talk of real-world deployments. Underlying technologies are of course needed to enable OTT, and adaptive bit rate (ABR) was omnipresent, with most - but not all - stakeholders betting on the convergent MPEG-DASH flavour. OTT ecosystems can still be daunting and, as we predicted in last year's white paper written for VO, Broadpeak and Harmonic, multi-vendor pre-integration was a trending theme.

This year's IBC was, as expected, all about the forthcoming Ultra HD/4K resolution, which will now be enabled by HEVC and the new HDMI 2.0 announced at IFA. HEVC was shown in a few real-world setups as opposed to last year's lab demos, although there weren't yet any consumer-grade decoding solutions. Many demos painfully showed that frame rate is an issue, as Thierry Fautier pointed out to me here. The jerky 25 FPS demos clearly made the point that it's going to be at least 50 FPS or higher resolutions just won't take off. The 8K Super Hi-Vision demo by NHK in IBC's future zone blew my mind. With such an immersive experience, I doubt we'll be wasting any more time with 3D in the living room.

Although less prominent, but nevertheless significant, like the tip of an iceberg, the smart home continued its slow forward march, with for example a demo of Cisco's Snowflake that dimmed the lights during a movie's night scenes. Several vendors like ADB or Nagra were talking about media hubs in the home. Big data was in a lot of discussions, and I was pretty amazed by the power of solutions like Genius Digital's analysis of viewing statistics and how they can bring immediate gains. Of course I too loved Wyplay's huge blue frog in hall 5, representing their new open source initiative, which needs to be analysed in the light of the US-centric RDK project pushed by Comcast. As every year, I spent some time with a company slightly out of my usual focus; this year Livewire Digital showed me how professional newsgathering can meet BYOD.

Some things I had expected (described here) but didn't see much of included HTML5, which wasn't promoted as the mother of all UI technologies as I thought it would be. Also, despite Google's recent successful Chromecast launch, dongles were not really visible at IBC (I'm told Qualcomm had one on their booth).

Finally, it occurs to me, tidying up my notes, that the true implication of the BYOD phenomenon hasn't really been addressed head-on. Of course the show and conference were full of things to say about tablets and smartphones, but nobody seems to be looking at the deep business model transformation underway. When I learnt to do a TV launch business model, barely over a decade ago, the STB represented 70% of the project CAPEX if you hit a million subs. So in the future will a TV rollout cost 30% of what it used to, with the subscriber subsidising the operator for the other 70%?

This is about my tenth IBC. On the jury for best booth, to which I was invited again this year (thanks Robin Lince), we realised that as IBC matures in the age of the Internet and social media, the show is less about learning what the latest trend or product is, or even what people think about them; we usually know all that before even coming.
Face-to-face networking and building relations are the deeper motivation. In follow-up posts I'll report on the 17 companies I spoke to this year at IBC: Brightcove, Envivio, Axinom, Visiware, Vidmind, Wyplay, Genius Digital, Astec, Axentra, Gravity, Akamai, Rovi, Cisco, Livewire Digital, Tara Systems, Verimatrix and SoftAtHome.