
AI, a new battleground for home entertainment

What goes around in the technology sphere sometimes comes around many times before it finally works and delivers value to consumers or enterprises. That is certainly true for the related fields of AI (Artificial Intelligence), Machine Learning and voice processing. These all seem to be coming together now, over 60 years after they were first proposed around 1954 by the US government as vital tools in the Cold War against the old Soviet Union. The ambition then was to translate and interpret Soviet technical documents and scientific reports almost instantly by exploiting the ground-breaking work on grammar by linguist Noam Chomsky. Despite huge investment from the US this failed abysmally: although these early systems could cope to some extent with grammar, machines could not get to grips with context or metaphor, famously rendering "the spirit is willing but the flesh is weak" as "the vodka is good but the meat is rotten" after translation from English to Russian and back again.


Consolidation in air at Broadband World Forum 2016

Bonding touted as solution to boost bandwidth for fixed and mobile services

Major trade shows can provide useful bellwethers of a given industry, and the recent Broadband World Forum 2016 highlighted two notable trends spanning both the fixed and mobile space, one business related and the other technical. For the former, consolidation was a major theme that will only be accentuated by the announcement of AT&T's bid for Time Warner, which came after the show had ended. But there was also a sentiment that consolidation should not be allowed to proceed so far that it inhibits competition and consumer choice, which are essential for any thriving market in our mixed global economy.


Operators should take charge of systems integration for control and agility

Systems integration has become a major challenge for pay TV operators as they embrace IP infrastructures and online delivery, with a requirement to become as agile as Internet companies while containing costs and dealing with legacy. For many the choice of integrator has become more critical even than that of the infrastructure’s core components, because getting it wrong can derail the whole enterprise and risk losing valuable ground at a time when new entrants are arriving and competitors are innovating faster than ever before in TV’s history. Against this background I have just co-written with Ben Schwarz an eBook focusing on SI in pay TV and drilling down into the core issues through interviews with leading operators, analysts and of course integrators themselves. What emerged was a clear picture of best practice and priorities both in selecting an SI and then managing the subsequent relationship.


Enterprise may drive Internet of Things boom

The Internet of Things (IoT) has reached a critical stage in its evolution where it seems to be caught between two tipping points, waiting for the final explosion after the arrival of joined up applications connecting different domains. The first tipping point came around 2014 with proven single domain applications and the arrival of big players such as Staples in retail, British Gas in energy and ADT in premises security. That was also the year Google acquired smart thermostat leader Nest. The big data centre systems companies also piled in, but more on the enterprise side, such as IBM with a $3 billion investment early in 2015 in its Watson IoT centre based in Munich.

Since then though the sheen has come off IoT a little, with mixed signals from the leading players. Google in particular has struggled, rather as it did initially with Android TV, with Nest failing to bring out promised new products and recently calling time on Revolv, its smart home hub for wireless control of end devices, which was launched amid much fanfare in October 2014 but then withdrawn in May 2015. It now looks like Google is pursuing a more distributed approach, promoting direct interoperability among its own Nest devices without any intermediate hub, but that is not yet completely clear.

Another big US technology company, Intel, has also found the IoT sector harder going than it expected, with its IoT Group reporting reduced revenue growth and a 25% year-on-year slump in operating income, down to $132 million, for 2015. The common theme here is the failure of the IoT to break out of its silos, so that both companies were left connecting only their own things.

British Gas has fared better, largely because as an energy utility it started with the expectation that it would be confined to its own domain for a while before branching out into other smart home sectors such as security and environmental control. In the meantime the company is focusing on developing the analytics tools it believes will enable wider success in a future joined up IoT, and has been investing in real time processing of the large data sets generated by its Hive connected thermostat. Hive allows users to control their boilers and central heating systems remotely by phone, which generates 30,000 messages a second, amounting to 40 TB of static data so far, distributed across 30 nodes. Like Google, British Gas has created a dedicated IoT subsidiary, called Connected Home, which has built an open source software stack running on the Apache Cassandra distributed database to process data both in real time and offline.

British Gas then is preparing for IoT’s second tipping point, which will come with joined up services that exploit synergy between different domains. IBM shares this conviction from its enterprise-focused perspective, drawing heavily on its cognitive computing work at its Thomas J. Watson Research Centre in New York, with one line of work being analysis of data from multiple remote sensors for predictive diagnostics. IBM is already enabling Pratt & Whitney to monitor 4,000 commercial engines and obtain early warning of faults that, even if not safety critical, cause costly service outages if left unfixed.
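The essence of this kind of predictive diagnostics is spotting sensor readings that deviate from a recent baseline before they escalate into outages. The following is a minimal sketch of one common approach, a rolling-window z-score test; the function name, window size and threshold are illustrative assumptions, not anything from IBM's or Pratt & Whitney's actual systems.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling baseline.

    Returns the indices of readings whose z-score against the preceding
    window exceeds the threshold -- a crude early-warning signal of the
    kind predictive diagnostics builds on.
    """
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append(i)
        recent.append(value)
    return anomalies

# Steady, slightly cyclic exhaust-temperature readings with a spike at index 30.
temps = [600.0 + (i % 5) * 0.5 for i in range(30)] + [680.0]
print(detect_anomalies(temps))  # → [30]
```

Real engine monitoring would of course use far richer models across many correlated sensors, but the principle of comparing live data against a learned baseline is the same.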

Telcos are of course also intent on capitalizing on the IoT from their position as broadband providers to homes. One early mover is Paris based SoftAtHome, in which three major telcos are investors: Orange of France, Swisscom of Switzerland and Etisalat of the United Arab Emirates. The software developer has extended its home operating platform with CloudAtHome to enable centralized control of devices, with potential for integration between domains. Any such initiative must support the key wireless protocols, such as Wi-Fi, Bluetooth and Zigbee, that IoT devices such as thermostats use to communicate. SoftAtHome uses a hybrid model combining a home hub and data repository with cloud-based processes, an approach that aims to deliver the required flexibility, security (and privacy), performance and functional breadth. Flexibility comes from being able to deploy processes in the cloud or at home as appropriate; keeping sensitive data within the local repository ensures security and privacy; and performance may require some processes to run locally to keep latency down, while other features may need cloud components.
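The placement trade-off such a hybrid platform makes can be reduced to a few rules of thumb. Here is a hypothetical sketch of that decision logic; the function, parameters and the 50 ms latency cut-off are invented for illustration and do not describe SoftAtHome's actual product.

```python
# Hypothetical placement logic for a hybrid home/cloud service platform.
# All names and thresholds are illustrative assumptions.

def place_process(handles_sensitive_data, max_latency_ms, needs_cloud_scale=False):
    """Decide where a home-service process should run: 'home' or 'cloud'."""
    if handles_sensitive_data:
        return "home"   # keep personal data in the local repository
    if max_latency_ms < 50:
        return "home"   # tight latency budget rules out a cloud round trip
    if needs_cloud_scale:
        return "cloud"  # heavy analytics belong in the data centre
    return "cloud"      # default: cheaper to run and update centrally

print(place_process(True, 1000))        # → home  (privacy wins)
print(place_process(False, 20))         # → home  (latency wins)
print(place_process(False, 500, True))  # → cloud (scale wins)
```

The point is that placement is a per-process decision, not a platform-wide one, which is exactly the flexibility the hybrid model is meant to deliver.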

A close look at this cloud/home distribution shows that in some cases the cloud should itself be partitioned between remote processes executed in a distant data centre (what is usually called the cloud) and intermediate ones best run at the network edge. This is known as Fog Computing: some storage and processing takes place more locally, perhaps in a DSLAM or even a street cabinet. The argument is that as IoT takes off, a lot of the initial data collection and analytics will be best performed at the Fog level before, in some cases, being fed back to the cloud after aggregation.
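What "aggregation before feeding back to the cloud" means in practice can be sketched very simply: the edge node collapses a stream of raw readings into compact per-sensor summaries, so only a fraction of the data travels upstream. The names and message format below are illustrative assumptions.

```python
# Sketch of fog-style aggregation: raw sensor messages are summarised at an
# edge node (e.g. in a DSLAM or street cabinet) and only the compact summary
# is forwarded to the cloud. Message format and names are illustrative.

def aggregate_at_edge(messages):
    """Collapse raw (sensor_id, value) messages into per-sensor summaries."""
    summaries = {}
    for sensor_id, value in messages:
        s = summaries.setdefault(sensor_id, {"count": 0, "total": 0.0,
                                             "min": value, "max": value})
        s["count"] += 1
        s["total"] += value
        s["min"] = min(s["min"], value)
        s["max"] = max(s["max"], value)
    # Only count/mean/min/max per sensor travel upstream, not every reading.
    return {sid: {"count": s["count"],
                  "mean": s["total"] / s["count"],
                  "min": s["min"], "max": s["max"]}
            for sid, s in summaries.items()}

raw = [("t1", 20.0), ("t1", 22.0), ("t2", 18.5), ("t1", 21.0)]
print(aggregate_at_edge(raw))
```

Four raw messages become two summary records; at IoT scale that reduction is the whole case for doing the work at the fog level.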

Fog could also work well for enterprise IoT, where it might serve as a campus level control and aggregation layer within a larger cloud based infrastructure. It could also play a role as enterprise IoT becomes customer facing rather than mainly concerned with internal or supply chain operations. This could be a third IoT tipping point bringing together enterprise and consumer IT, if a recent survey from Gartner is to be believed. This found that while only 18 per cent of today’s enterprise IoT deployments are focused on customer experience, this will jump to 34 per cent over the year to Q1 2017. That represents roughly a threefold absolute jump, given that Gartner is also forecasting the number of enterprises with IoT deployed somewhere to soar from 29 per cent now to 43 per cent in a year’s time. Gartner also expects IoT to expand into new service related industry segments such as insurance, beyond the heavier industries like manufacturing, utilities and logistics where it is concentrated now.
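The "threefold absolute jump" follows directly from multiplying Gartner's two percentages together, as this quick check shows:

```python
# Checking the "threefold absolute jump" implied by the Gartner figures:
# share of ALL enterprises running customer-facing IoT = adoption * focus.
now = 0.29 * 0.18        # 29% of enterprises have IoT, 18% of that is customer-facing
in_a_year = 0.43 * 0.34  # forecast: 43% adoption, 34% customer-facing

print(round(now * 100, 1))        # → 5.2  (per cent of all enterprises today)
print(round(in_a_year * 100, 1))  # → 14.6 (per cent in a year's time)
print(round(in_a_year / now, 1))  # → 2.8  (i.e. roughly threefold)
```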

Such enterprise IoT forecasts have a history of proving more accurate than some of the over-hyped consumer predictions. This means that if consumer IoT does continue to stall, it may be dragged forward by enterprises seeking competitive advantage as well as new revenues, as we are seeing to an extent with the likes of British Gas.


Virtualization approaches final frontiers in the home

Virtualization has been around almost as long as business computing, having been invented by IBM in the 1960s and 1970s so that “big iron” mainframes could mimic smaller machines for economies of scale. Later, after personal computers arrived, it reached the desktop with products like SoftPC, which allowed Apple Mac computers to run Microsoft’s Windows operating system and associated applications.

Another ten years on, the scope of virtualization expanded during the noughties to allow separation of hardware and software outside the data center, in networking equipment such as routers and firewalls, and then finally the TV industry joined the party. Even that was not the end of the story, since now virtualization is beating a path into the home, not just for the gateway or set top box, but right to the ultimate client, whether a user’s PC or even an Internet of Things (IoT) device like a thermostat.

Over time the motivations have evolved subtly. Virtualization became less about getting the best value out of a few large computers, with their superior resilience and security, and more about being able to exploit lower cost, more flexible commodity hardware. Now, as virtualization comes together with the cloud, there is another dimension: enabling much greater flexibility over where both hardware and software are deployed.

This shift to virtualization around the cloud has been aided by major standardization efforts, especially OpenFlow, the open standard that defines the interface between the control and forwarding layers of an SDN (Software Defined Network) architecture. SDN enables traditional networking functions, notably routing from node to node across IP networks, to be split between packet forwarding, which can be done locally on commodity hardware, and the higher level control logic, which can run remotely in the cloud if desired. OpenFlow then enables a physical device in the home, such as a gateway, to be “bridged” to its virtual counterpart within the network.
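The forwarding/control split SDN formalises can be illustrated with a toy model: a dumb forwarding element that only matches packets against a flow table, while a (possibly cloud-hosted) controller installs the rules. This is a conceptual sketch only; the field names are simplified and bear no relation to the actual OpenFlow wire protocol.

```python
# Toy model of the SDN split: forwarding is a pure local table lookup,
# while rule installation is the controller's job. Names are illustrative.

class Switch:
    def __init__(self):
        self.flow_table = []  # list of (match_dict, action) rules

    def install_rule(self, match, action):
        """Called by the (possibly remote, cloud-hosted) controller."""
        self.flow_table.append((match, action))

    def forward(self, packet):
        """Pure local matching -- no control logic lives here."""
        for match, action in self.flow_table:
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "send_to_controller"  # table miss: ask the control plane

sw = Switch()
sw.install_rule({"dst_ip": "10.0.0.2"}, "output:port2")
print(sw.forward({"dst_ip": "10.0.0.2", "src_ip": "10.0.0.1"}))  # → output:port2
print(sw.forward({"dst_ip": "10.0.0.9"}))  # → send_to_controller
```

The table-miss path is the interesting part: it is how a cheap local device defers every decision it cannot make itself to intelligence running elsewhere, which is precisely what makes the home gateway "bridgeable" to a virtual counterpart in the network.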

The key point here is that not all home gateway functions should be hived off to the cloud, since for example sensitive personal data may be best stored at home perhaps on a NAS (Network Attached Storage) device. It may also be that some processes will run more effectively locally for performance or security reasons, including some associated with the IoT. Virtualization combined with the cloud via OpenFlow allows this flexibility such that functions as well as underlying hardware can be located optimally for given services without incurring a cost penalty.

Just as IBM broke the ground for virtualization in the data center, we are now seeing virtualization reach into the home. Orange founded the French software company SoftAtHome in 2007 so it could deploy hardware independent home gateways. Other vendors have since joined the fray, with Alcatel-Lucent (now Nokia) among the leaders with its vRGW (virtualized Residential Gateway) portfolio. Nokia, like SoftAtHome, argues that with its products operators can turn up new and innovative services faster while reducing CAPEX and OPEX for existing and new services. Updates can be applied centrally without having to replace hardware or visit homes, as has been common practice in the data center for some years.

Not surprisingly then, some technology vendors have come into the virtualized home gateway area from the enterprise arena. One of these is Japanese IT giant NEC with its networking software subsidiary NetCracker, which helped Austrian incumbent Telekom Austria through an in-depth trial of virtualized customer premises equipment (vCPE). This integrated SDN technology with virtual network functions (VNFs) through a common service and network orchestration platform that also involved technology from other vendors. The telco cited as a key benefit the ability to have a single point of delivery for home media and entertainment content.

Now virtualization is approaching its next frontier in the IoT arena, where the motivation shifts yet again. One challenge for IoT is to be able to configure generic devices for a range of applications rather than having to make dedicated hardware for each one. This is again about being able to use off the shelf hardware for a range of services, but this time the commoditization must occur down at the chip level. This calls for embedded virtualization so that small single chip devices such as sensors can be remotely programmed and repurposed in the field. Apart from flexibility and cost reduction, embedded virtualization will confer greater security and real time performance since operations are executed within a single SoC (System on Chip). Even this is not entirely new, since embedded virtualization has emerged in other sectors such as the automotive industry, where again there is a need for field upgradeability, given that vehicles as a whole now have a longer life cycle than many of the underlying software based components.

The real challenge for broadband operators will be to capitalize on end to end virtualization extending across the home network, which presents an opportunity to key vendors like Nokia and SoftAtHome to smooth the path.


Measurement key to monetizing mobile video

Measuring mobile video audiences and associated ad engagement is one of the greatest challenges facing the pay TV industry, with big rewards for getting it right. Mobile video has surged over the last year, with phones and tablets accounting for 46 per cent of all online viewing globally during Q4 2016, up from 34 per cent a year earlier, according to video technology vendor Ooyala. Ad spending is moving with the eyeballs and in the UK for example more of it will be on mobile than mainstream TV for the first time this year, £4.58 billion ($7 billion) against £4.18 billion ($6.39 billion), according to eMarketer.

While some pay TV operators may have reasonable visibility over viewing on desktops, mobile devices raise complexity to another dimension. On desktops access to web sites and services is almost all via browsers, but on mobiles browsers account for only a minority of viewing. It is true that the majority of web sites are accessed via the browser on mobiles too, since individual users only have room for a certain number of apps on their devices. But apps account for the great majority of time spent on mobiles, and also for most traffic, because users tend to hang out in just a few places, accessed via apps rather than the browser, including the likes of Facebook, Google Maps and WeChat. An interesting and relevant trend for operators during 2016, highlighted by analyst group Forrester, is that users are increasingly turning to aggregation apps to access the content they want.

When access is predominantly via a browser, as on the desktop PC, cookies can be used to track viewing activity and measure ad engagement. But cookies do not work well in the mobile world because activity is partitioned between the mobile browser and the various apps, which are isolated from each other via sandboxing, a fundamental property of both the dominant mobile OSs, Android and Apple’s iOS. Web sites accessed within apps open via dedicated custom browsers, which means they cannot interact with persistent cookies on the device, precluding use of proven desktop measurement tools. In the case of iOS devices the situation is just as bad even for sites accessed via the mobile browser, because Apple prohibits use of third party cookies.

There are also higher level challenges for mobile TV advertising such as defining how long people should watch an ad for it to count as having been viewed, given that attention spans are shorter on small screens. The situation is similar for the actual TV content, where the value of mobile viewing can depend on context, being particularly high when there is synergy with the big screen for example to resume watching something started earlier.

The overall challenge then is to integrate audience measurement and analytics across all screens including mobile to deliver consistent information that takes account of differences in context and engagement across the different platforms. There are now plenty of tools available for tracking activity on the mobile side, but integrating them within a coherent end to end measurement and analytics system is highly complex. Some big operators are attempting to do this in-house but increasingly even they are turning to specialist TV audience companies to enable the integration.

One example is UK based TV analytics firm Genius Digital, which offers two services that can be combined or used stand alone. The first, Real Time Data Collection, reports viewing data across all devices; it is based on multiscreen libraries that can be embedded into mobile or web applications to enable monitoring of video consumption, profile management, and performance and quality management on JavaScript, iOS and Android devices. The second, Multiscreen Data Service (MDS), is designed to extract viewing data from apps, even those from third parties. A key benefit of this approach lies in marrying viewing information from these different apps, each of which will normally use different metrics, to provide consistent information about engagement with channels or specific programs for integration with traditional set top box return path data.
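The core difficulty such services tackle is that each app reports viewing in its own metrics, and the events must be normalised into one schema before engagement can be compared across apps. Here is a hypothetical sketch of that normalisation step; the app names, field names and metrics are invented for illustration and do not describe Genius Digital's actual data formats.

```python
# Hypothetical sketch of multiscreen normalisation: two apps report the same
# viewing in different metrics, which are mapped onto a common schema
# (seconds watched) before aggregation. All names here are invented.

def normalise(event):
    """Map an app-specific viewing event onto a common schema."""
    if event["source"] == "app_a":   # this app reports minutes watched
        return {"programme": event["show"], "seconds": event["mins"] * 60}
    if event["source"] == "app_b":   # this app reports start/stop timestamps
        return {"programme": event["title"],
                "seconds": event["stop_ts"] - event["start_ts"]}
    raise ValueError(f"unknown source: {event['source']}")

events = [
    {"source": "app_a", "show": "News at Ten", "mins": 12},
    {"source": "app_b", "title": "News at Ten", "start_ts": 1000, "stop_ts": 1480},
]

# Aggregate engagement per programme across both apps.
totals = {}
for e in map(normalise, events):
    totals[e["programme"]] = totals.get(e["programme"], 0) + e["seconds"]
print(totals)  # → {'News at Ten': 1200}
```

Once every source speaks the same schema, the normalised events can be merged with set top box return path data in exactly the way the text describes.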

Another TV analytics company, TVbeat, also UK based, has moved in a similar direction, in this case through a partnership with dedicated TV app company Metrological. This has enabled TVbeat to meld set top box data with mobile device return path and app consumption information from Metrological’s Application Platform.

Such developments ease the pain of mobile audience measurement for pay TV operators, and we expect more of those that have previously relied solely on in-house development to at least consider working with one of the specialist analytics companies better placed to aggregate data from many sources. With mobiles accounting for a rapidly increasing proportion of both viewing and ad budgets, operators need to fold mobile into their existing actionable data analytics.