
Virtualization approaches final frontiers in the home

Virtualization has been around almost as long as business computing itself, having been invented by IBM in the 1960s so that “big iron” mainframes could mimic smaller machines for economies of scale. Later, after personal computers arrived, it reached the desktop with products like SoftPC, which allowed Apple Mac computers to run Microsoft’s Windows operating system and associated applications.

Another 10 years on, during the noughties, the scope of virtualization expanded to allow separation of hardware and software outside the data center, first in networking equipment such as routers and firewalls, and then finally the TV industry joined the party. Even that was not the end of the story, since virtualization is now beating a path into the home, not just for the gateway or set-top box, but right to the ultimate client, whether a user’s PC or even an Internet of Things (IoT) device like a thermostat.

Over time the motivations have evolved subtly, so that virtualization became less about getting the best value out of a few large computers, with their superior capabilities in areas such as resilience and security, and more about exploiting lower-cost and more flexible commodity hardware. But now, as virtualization comes together with the cloud, there is another dimension: enabling much greater flexibility over where both hardware and software are deployed.

This shift to virtualization around the cloud has been aided by major standardization efforts, especially OpenFlow, the open standard that defines the interface between the control and forwarding layers of an SDN (Software Defined Network) architecture. SDN enables traditional networking functions, notably routing from node to node across IP networks, to be split between packet forwarding, which can be done locally on commodity hardware, and the higher-level control logic, which can run remotely somewhere in the cloud if desired. OpenFlow then enables a physical device in the home, such as a gateway, to be “bridged” to its virtual counterpart within the network.
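To make that split concrete, here is a minimal, purely illustrative Python sketch of the idea, not the real OpenFlow protocol or any vendor API: a controller that could run anywhere computes the forwarding rules, while a deliberately dumb forwarder in the home only matches packets against whatever table it has been given. All class and field names are hypothetical.

```python
# Purely illustrative sketch of the SDN split; not the real OpenFlow
# protocol or any vendor API. All names are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class FlowRule:
    match_dst: str  # destination IP prefix to match ("" matches anything)
    out_port: int   # port to forward matching packets to

class Controller:
    """Control logic: could run remotely, e.g. in the cloud."""
    def compute_rules(self) -> list[FlowRule]:
        return [
            FlowRule(match_dst="192.168.1.", out_port=1),  # keep LAN traffic local
            FlowRule(match_dst="", out_port=2),            # default: uplink to the WAN
        ]

class Forwarder:
    """Packet forwarding: runs on commodity hardware in the home."""
    def __init__(self) -> None:
        self.table: list[FlowRule] = []

    def install(self, rules: list[FlowRule]) -> None:
        # In a real SDN this table would arrive over the OpenFlow channel.
        self.table = rules

    def forward(self, dst_ip: str) -> int:
        # First matching rule wins; no control decisions are made here.
        for rule in self.table:
            if dst_ip.startswith(rule.match_dst):
                return rule.out_port
        raise LookupError("no matching rule")

gateway = Forwarder()
gateway.install(Controller().compute_rules())
print(gateway.forward("192.168.1.10"))  # -> 1 (stays in the home)
print(gateway.forward("8.8.8.8"))       # -> 2 (goes out to the WAN)
```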

The key point here is that not all home gateway functions should be hived off to the cloud, since, for example, sensitive personal data may be best stored at home, perhaps on a NAS (Network Attached Storage) device. It may also be that some processes will run more effectively locally for performance or security reasons, including some associated with the IoT. Virtualization combined with the cloud via OpenFlow allows this flexibility, such that functions as well as underlying hardware can be located optimally for given services without incurring a cost penalty, as the sketch below illustrates.
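As a hedged illustration of that placement flexibility, the following sketch tags each gateway function with where it might best run and why. The function names and the split itself are hypothetical assumptions for illustration, not any operator’s actual policy.

```python
# Hypothetical placement policy for virtualized home gateway functions.
# "local" keeps a function on in-home hardware; "cloud" moves it into the
# operator's data center. The reasons echo the ones discussed above.

PLACEMENT = {
    "personal_data_store": {"where": "local", "why": "privacy: keep on the home NAS"},
    "iot_automation":      {"where": "local", "why": "latency and availability"},
    "firewall":            {"where": "local", "why": "enforce policy at the edge"},
    "parental_controls":   {"where": "cloud", "why": "centrally updated rule sets"},
    "media_transcoding":   {"where": "cloud", "why": "elastic compute on demand"},
    "ui_and_portal":       {"where": "cloud", "why": "update without truck rolls"},
}

def functions_running(where: str) -> list[str]:
    """List the gateway functions assigned to a given location."""
    return [name for name, p in PLACEMENT.items() if p["where"] == where]

print("Local:", functions_running("local"))
print("Cloud:", functions_running("cloud"))
```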

Just as IBM broke the ground for virtualization in the data center, we are now seeing virtualization reach into the home. Orange founded the French software company SoftAtHome in 2007 so it could deploy hardware-independent home gateways. Other vendors have since joined the fray, with Alcatel-Lucent (now Nokia) among the leaders with its vRGW (virtualized Residential Gateway) portfolio. Nokia, like SoftAtHome, argues that with its products operators can turn up new and innovative services faster, while reducing CAPEX and OPEX for existing and new services. Updates can be applied centrally without having to replace hardware or visit homes, as has been common practice in the data center for some years.

Not surprisingly, then, some technology vendors have come into the virtualized home gateway area from the enterprise arena. One of these is Japanese IT giant NEC, whose networking software subsidiary NetCracker helped Austrian incumbent Telekom Austria through an in-depth trial of virtualized customer premises equipment (vCPE). This integrated SDN technology with virtual network functions (VNFs) through a common service and network orchestration platform, which also involved technology from other vendors. The telco cited as a key benefit the ability to have a single point of delivery for home media and entertainment content.

Now virtualization is approaching its next frontier in the IoT arena, where the motivation shifts yet again. One challenge for IoT is to be able to configure generic devices for a range of applications rather than having to make dedicated hardware for each one. This is again about being able to use off-the-shelf hardware for a range of services, but this time the commoditization must occur down at the chip level. This calls for embedded virtualization, so that small single-chip devices such as sensors can be remotely programmed and repurposed in the field. Apart from flexibility and cost reduction, embedded virtualization will confer greater security and real-time performance, since operations are executed within a single SoC (System on Chip). Even this is not entirely new, since embedded virtualization has already emerged in other sectors such as the automotive industry, where again there is a need for field upgradeability, given that vehicles as a whole now have a longer life cycle than many of the underlying software-based components.
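A toy Python sketch of that repurposing idea follows, under the assumption of a hypervisor layer that stays fixed on the sensor SoC while the guest application image above it can be swapped remotely; every name in it is hypothetical.

```python
# Toy model of field repurposing under embedded virtualization: the
# hypervisor layer on the SoC stays fixed, while the guest application
# image is swapped remotely. Everything here is a hypothetical sketch.

class SensorSoC:
    def __init__(self, device_id: str) -> None:
        self.device_id = device_id
        self.guest_app = "thermostat-v1"  # image shipped from the factory

    def reflash_guest(self, new_image: str) -> None:
        """Swap the guest image; the hypervisor beneath is untouched."""
        print(f"{self.device_id}: {self.guest_app} -> {new_image}")
        self.guest_app = new_image

# Repurpose a whole fleet of identical chips for a new application.
fleet = [SensorSoC(f"sensor-{i:03d}") for i in range(3)]
for device in fleet:
    device.reflash_guest("air-quality-v2")  # same hardware, new purpose
```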

The real challenge for broadband operators will be to capitalize on end-to-end virtualization extending across the home network, which presents an opportunity for key vendors like Nokia and SoftAtHome to smooth the path.


First 4 trends spotted at IBC14

I didn’t write any predictions about what would be important this year, but while it’s still fresh, here are my first impressions.

Last year the cloud was a key buzzword, and Amazon with it; this year it’s been replaced by virtualization, basically the same technology, but with the possibility of running all those virtual machines in a service provider’s own data center. It is supposed to eventually lower costs and make things like redundancy management easier, but I’ve yet to be convinced that it’s really such a big deal. I’ll try to stop by some of the encoding booths like Envivio, Harmonic or Elemental to check whether it’s really just a generalization of the concept of software-based vs. hardware-based encoding. I’ll also try to get back to the Amazon Web Services stand in hall 3, where they’re explaining how Netflix uses AWS, with special tools developed to optimize service availability.

4K has of course been around for several years yet still manages to buzz. I’ve been told to go see Samsung’s giant curved display in hall 1. The main difference from last year is that there’s hardly a booth without a 4K display or two, most now at 60fps, and more and more UIs, like the one on display at SoftAtHome’s booth, are now native 4K.

OTT is still very present, even if it too has lost its novelty now that so many commercial deployments are out there. OTT ecosystem vendors are repositioning frantically as value is eroded. Some, like Piksel, seem to be keeping their end-to-end positioning, while others, like Siemens with its Swipe service, are also bringing out specific components to sell as services. Enhanced ABR is also appearing to help reduce OPEX by finding tricks to use only as much bandwidth as is actually required (see the sketch below). All of the CDN crowd, for example Limelight, Anevia, Broadpeak or MediaMelon (who don’t have a booth), have things to show in this area.
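Here is a minimal Python sketch of that “only as much bandwidth as required” idea, assuming a player that knows its display resolution and its measured throughput; the bitrate ladder and the 20% headroom figure are illustrative assumptions, not any vendor’s actual algorithm.

```python
# Minimal sketch of an "enhanced ABR" rendition picker: instead of always
# grabbing the highest bitrate the pipe allows, cap the choice at what the
# display can actually use. Ladder values are illustrative assumptions.

# (bitrate in kbps, vertical resolution) for each available rendition
LADDER = [(800, 360), (1800, 480), (3500, 720), (6000, 1080), (15000, 2160)]

def pick_rendition(throughput_kbps: float, display_height: int) -> tuple[int, int]:
    # Keep renditions the connection can sustain (with ~20% headroom)...
    affordable = [r for r in LADDER if r[0] <= throughput_kbps * 0.8]
    # ...then drop anything sharper than the screen can show: that extra
    # bandwidth is pure waste, which is what enhanced ABR tries to avoid.
    useful = [r for r in affordable if r[1] <= display_height] or affordable[:1]
    return max(useful, default=LADDER[0])

print(pick_rendition(20000, 720))   # big pipe, 720p screen -> stops at 720p
print(pick_rendition(20000, 2160))  # same pipe, 4K screen -> 2160p
print(pick_rendition(2500, 1080))   # constrained pipe -> best affordable
```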

IoT and the connected and/or smart home have been around for years at other shows, but have only now made it to IBC. Managing the home network is becoming more challenging for many reasons. One that struck me most is that we are seeing a greater proportion of homes with 100M+ broadband connections, but with in-home effective throughputs down to just a few megabits, often not enough to stream over Wi-Fi. There were quite a few solutions at IBC, like AirTies’ home Wi-Fi meshing.

Some trends though are clearly on the way out. I noted for example that it’s already out of fashion to talk about embedded apps now that HTML5 is a no-brainer, and any mention of the Smart TV is positively 2013.

More soon, stay tuned...