
Enterprise may drive Internet of Things boom


The Internet of Things (IoT) has reached a critical stage in its evolution, caught between two tipping points and waiting for the final explosion that will come with joined-up applications connecting different domains. The first tipping point came around 2014, with proven single-domain applications and the arrival of big players such as Staples in retail, British Gas in energy and ADT in premises security. That was also the year Google acquired smart thermostat leader Nest. The big data centre systems companies piled in too, though more on the enterprise side, such as IBM with a $3 billion investment early in 2015 in its Munich-based Watson IoT centre.

Since then, though, the sheen has come off the IoT a little, with mixed signals from the leading players. Google in particular has struggled, rather as it did initially with Android TV: Nest failed to bring out promised new products and recently called time on Revolv, its smart home hub for wireless control of end devices, which it acquired amid much fanfare in October 2014 only to switch the service off in May 2016. Google now looks to be pursuing a more distributed approach, promoting direct interoperability among its own Nest devices without any intermediate hub, but that is not yet completely clear.

Another big US technology company, Intel, has also found the IoT sector harder going than expected, with its IoT Group reporting slowing revenue growth and a 25% year-on-year slump in operating income, down to $132 million for 2015. The common theme is the IoT's failure to break out of its silos, which left both companies connecting only their own things.

British Gas has fared better, largely because as an energy utility it started out expecting to be confined to its own domain for a while before branching into other smart home sectors such as security and environmental control. In the meantime the company is focusing on the analytics tools it believes will enable wider success in a future joined-up IoT, investing in real-time processing of the large data sets generated by its Hive connected thermostat. Hive allows users to control their boilers and central heating systems remotely by phone, generating 30,000 messages a second and amounting to 40 TB of stored data so far, distributed across 30 nodes. Like Google, British Gas has created a dedicated IoT subsidiary, called Connected Home, which has built an open source software stack running on the Apache Cassandra distributed database to process data both in real time and offline.
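To give a feel for what such a stack involves, here is a minimal sketch of time-series thermostat telemetry stored in Cassandra. The keyspace, table and column names are hypothetical illustrations, not Connected Home's actual schema.

```python
# Minimal sketch of time-series telemetry in Apache Cassandra.
# Requires the DataStax driver: pip install cassandra-driver
# All names (keyspace, table, columns, device ids) are hypothetical.
from datetime import datetime, timezone
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # contact point of the Cassandra ring
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS smart_home
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS smart_home.thermostat_readings (
        device_id  text,
        day        text,      -- bucket rows by day to bound partition size
        ts         timestamp,
        temp_c     double,
        heating_on boolean,
        PRIMARY KEY ((device_id, day), ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

now = datetime.now(timezone.utc)
session.execute(
    "INSERT INTO smart_home.thermostat_readings "
    "(device_id, day, ts, temp_c, heating_on) VALUES (%s, %s, %s, %s, %s)",
    ("hive-0001", now.strftime("%Y-%m-%d"), now, 20.5, True),
)

# Latest readings for one device on one day, newest first
rows = session.execute(
    "SELECT ts, temp_c FROM smart_home.thermostat_readings "
    "WHERE device_id = %s AND day = %s LIMIT 10",
    ("hive-0001", now.strftime("%Y-%m-%d")),
)
for row in rows:
    print(row.ts, row.temp_c)
```

Bucketing partitions by device and day is a common Cassandra pattern: it spreads a heavy write load like Hive's 30,000 messages a second across the cluster while keeping per-device reads cheap.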

British Gas, then, is preparing for the IoT's second tipping point, which will come with joined-up services that exploit synergies between different domains. IBM shares this conviction from its enterprise-focused perspective, drawing heavily on cognitive computing work at its Thomas J. Watson Research Centre in New York, one line of which is the analysis of data from multiple remote sensors for predictive diagnostics. IBM is already enabling Pratt & Whitney to monitor 4,000 commercial engines and obtain early warning of faults which, even when not safety critical, cause costly service outages if left unfixed.

Telcos are of course also intent on capitalizing on the IoT from their position as broadband providers to homes. One early mover is Paris-based SoftAtHome, in which three major telcos are investors: Orange of France, Swisscom, and Etisalat of the United Arab Emirates. The software developer has extended its home operating platform with CloudAtHome to enable centralized control of devices, with potential for integration between domains. All such initiatives must support the key wireless protocols, such as Wi-Fi, Bluetooth and Zigbee, that IoT devices such as thermostats use to communicate. SoftAtHome uses a hybrid model combining some form of home hub and data repository with cloud-based processes. The hybrid approach aims to deliver the required flexibility, security (and privacy), performance and functional breadth. Flexibility comes from being able to deploy processes in the cloud or at home as appropriate, while keeping sensitive data within the local repository helps ensure security and privacy. Performance may require some processes to run locally to keep latency down, while some features may need cloud components.
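The placement logic such a hybrid model implies can be sketched in a few lines. The task categories and the 100 ms threshold below are invented for illustration; this is not SoftAtHome's actual rule set.

```python
# Toy sketch of hybrid home/cloud placement: privacy-sensitive and
# latency-critical work stays on the home hub, the rest goes to the
# cloud. Categories and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    sensitive: bool         # touches private household data
    latency_budget_ms: int  # how quickly a result is needed

def place(task: Task) -> str:
    if task.sensitive:
        return "home hub"   # keep private data in the local repository
    if task.latency_budget_ms < 100:
        return "home hub"   # a round trip to a data centre is too slow
    return "cloud"          # heavy or non-urgent work runs centrally

for t in (Task("unlock-front-door", True, 50),
          Task("monthly-energy-report", False, 60_000)):
    print(f"{t.name} -> {place(t)}")
```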

A closer look at this cloud/home distribution shows that in some cases the cloud itself should be partitioned between remote processes executed in a distant data centre (what is usually called the cloud) and intermediate ones best run at the network edge. This is known as fog computing, where some storage and processing takes place more locally, perhaps in a DSLAM or even a street cabinet. The argument is that as the IoT takes off, much of the initial data collection and analytics will be best performed at the fog level, in some cases being fed back to the cloud after aggregation.
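A minimal sketch of that fog-level aggregation: raw readings are summarized at the edge and only the compact result travels upstream. The function names and sample window are invented for illustration.

```python
# Toy sketch of fog-style pre-aggregation: an edge node (in a DSLAM or
# street cabinet, say) reduces a window of raw sensor readings to one
# compact summary before anything is sent to the distant cloud.
from statistics import mean

def summarize(readings: list[float]) -> dict:
    """Collapse a window of raw readings into a compact summary."""
    return {"count": len(readings), "mean": mean(readings),
            "min": min(readings), "max": max(readings)}

def send_upstream(summary: dict) -> None:
    print("to cloud:", summary)          # stand-in for an HTTP POST

window = [20.1, 20.3, 19.8, 20.0, 20.4]  # e.g. one minute of temperatures
send_upstream(summarize(window))         # five readings become one record
```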

Fog could also work well for enterprise IoT, where it might serve as a campus-level control and aggregation layer within a larger cloud-based infrastructure. It could also play a role as enterprise IoT becomes customer facing rather than mainly concerned with internal or supply chain operations. This could be a third IoT tipping point, bringing together enterprise and consumer IT, if a recent survey from Gartner is to be believed. This found that while only 18 percent of today's enterprise IoT deployments are focused on customer experience, the figure will jump to 34 percent over the year to Q1 2017. That represents roughly a threefold jump in absolute terms, given that Gartner also forecasts the share of enterprises with IoT deployed somewhere to soar from 29 percent now to 43 percent in a year's time. Gartner also expects the IoT to expand into new service-related industry segments such as insurance, beyond the heavier industries like manufacturing, utilities and logistics where it is concentrated now.
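The "threefold" arithmetic is worth spelling out, since it combines the two forecasts:

```python
# Back-of-envelope check of the "threefold" claim: the share of ALL
# enterprises with customer-facing IoT is (adoption rate) multiplied
# by (share of deployments focused on customer experience).
now  = 0.29 * 0.18   # ~5.2% of all enterprises today
soon = 0.43 * 0.34   # ~14.6% forecast for Q1 2017
print(f"{now:.1%} -> {soon:.1%} ({soon / now:.1f}x)")  # 5.2% -> 14.6% (2.8x)
```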

Such enterprise IoT forecasts have a history of proving more accurate than some of the over-hyped consumer predictions from analysts. So if consumer IoT does continue to stall, it may be dragged forward by enterprises seeking competitive advantage as well as new revenues, as we are already seeing to an extent with the likes of British Gas.


The Big Data emperor will need Big Change within companies, that is if he has any clothes on. Summit report Part III


There was a great turnout at TM Forum's inaugural Big Data event in January. It was small enough to enable proper networking, but the packed room made it feel like something more than just hype or buzz is happening around Big Data.

Some of the clear benefits Big Data brings straight away

A key benefit eBay has gained from Big Data analytics, having started with Hadoop in 2010, is greater flexibility. One example of what it can now do better is work out how much to bid on specific keywords like "iPad", a decision that often has to be made in near real time. Big Data also helps eBay manage the key differences in word meaning from market to market.
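The shape of such a near-real-time bid decision can be sketched as a simple expected-value calculation. The figures and per-market tables below are invented for illustration, not eBay's actual model.

```python
# Toy sketch of a keyword bid ceiling: bid up to the expected profit a
# click is worth in a given market. All numbers are invented, and the
# same keyword deliberately prices differently per market.
conversion = {                       # (keyword, market) -> conversion rate
    ("ipad", "US"): 0.031,
    ("ipad", "DE"): 0.018,           # same word, different buying behaviour
}
margin = {("ipad", "US"): 42.0,      # profit per converted sale, in $
          ("ipad", "DE"): 38.0}

def max_bid(keyword: str, market: str) -> float:
    """Expected profit per click = conversion rate x margin per sale."""
    key = (keyword, market)
    return conversion[key] * margin[key]

print(f"US bid ceiling: ${max_bid('ipad', 'US'):.2f}")  # ~$1.30
print(f"DE bid ceiling: ${max_bid('ipad', 'DE'):.2f}")  # ~$0.68
```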

Bell Canada was one of the more upbeat operators on Big Data. James Gazzola made a case for "advertising forensics", where the operator could use analytics to determine which ads are actually watched. Bell hopes that these insights, once mastered, could be monetized. Gazzola went on to point out that as Bell Canada serves 66% of Canadians, analytics could show what's happening in all of Canada. That sent a slight shiver down my back as I wondered whether the journey from network planning to user analytics actually terminates at a station called Big Brother, but oops, this is the part on benefits. So, back to more down-to-earth issues: Gazzola told the audience that voice traffic used to be relatively predictable, but that data traffic driven by smartphones is anything but. Big Data is what Bell is looking at to help plan future network capacity.

Google's presentation was disappointing. I don't really blame Google speakers, because expectations are always unrealistically high: there's so much we crave to know about Google. Matt McNeil, from Google's Enterprise division, was asked if they have any big telco clients for Big Data yet. His wooden answer that "we're talking to several" showed the limits of the company's transparency policy. But during his sales pitch, Matt got quite excited explaining that it would cost you $5 million to build the capacity for which Google charges just $500 per hour on a Hadoop-powered Big Data analysis platform. When McNeil showed off http://bigquery.cloud.google.com, the exciting fact that "Led Zeppelin" is more controversial than "Hitler" got me a bit concerned that maybe all this Big Data stuff really was hype after all. I suppose we need practice finding more telling examples because, as Matt himself said, "this year will be a trough of disillusionment for Big Data".

Big Data is about Big Change

Peter Crayfourd, who recently left Orange, pointed out that becoming truly customer-centric can be scary: such an approach may uncover many unhappy customers. And becoming truly customer-centric will take at least five years. All speakers at the Big Data conference seemed to agree that user-centric KPIs based on averages are to be shunned, because every user is unique. That sounds fine in theory, but CFOs are going to be staying up late working out how to live without the concept of ARPU.

The eye-opening presentation from Belgacom's Emmanuel Van Tomme stayed on the customer-centricity theme but made the clearest case so far that change management is key to Big Data implementation. Emmanuel was the first telco speaker I've heard talk about a great new metric called VAP, or Very Annoyed People, who can now be identified with Big Data analytics.

Many speakers converged on the theme of change management as THE key challenge for Big Data going forward. The general message was that while Hadoop is ready to deliver, people are not yet, and their organizations even less so.

Thinking of the bigger telcos conjures up an image of oil tankers trying to steer away from network metrics towards usage metrics. Looking solely at the agility dimension, I couldn't help wondering whether they can survive against speedboats like Amazon or Google.

As the conference was wrapping up I gleaned an interesting metric: subscribers are 51% more willing to share their data if they have control over whether or not to share it in the first place! It's one of those Doh!-I-knew-that statistics you feel you should have come up with, but didn't.

Earlier it had been pointed out during one of the panel sessions that to make Big Data work for Telcos, subscribers must entrust ALL their data to the operator. For them to agree to this, the outbound sales & marketing link must be cut. It’s probably wiser to have one unhappy head of sales than many unhappy customers.

But things aren’t so simple

The limitations of KPIs

Peter Crayfourd illustrated the danger of KPIs with the voice continuity metric. In his example it was 96% when calculated over 15 minutes, so if that's what you're tracking, all is hunky-dory. But in the same network environment, when the same metric is calculated over 45 days, the result is usually 0%. Crayfourd went on to explain how averages can be dangerous within an organization: someone with their head in the oven and feet in the freezer has a perfectly good average temperature! Matt Olson from US operator CenturyLink pointed out that in the user experience (UX) domain simple maths just doesn't work: UX isn't the sum of the parts but some more complex function thereof.
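The 15-minute versus 45-day gap is just compounding at work. A quick check, under the simplifying assumption that 15-minute windows succeed independently:

```python
# Why a 96% continuity KPI per 15-minute window collapses to ~0% over
# 45 days: the long period only succeeds if EVERY window succeeds.
# Assumes independent windows, a deliberate simplification.
p_window = 0.96
windows = 45 * 24 * 4                    # 4,320 fifteen-minute windows
p_45_days = p_window ** windows
print(f"P(no interruption in 45 days) = {p_45_days:.1e}")  # ~3e-77, i.e. 0%
```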

Listening to the UX-focused presentations, one got the feeling that the Big Data story might just be a pretext for some new players to pull the carpet from under the feet of the QoE market stakeholders. They've been saying this for almost a decade... Big Data is a means, not an end.

The cost of Big Data & Hadoop

For eBay, Hadoop may be cheaper to set up, but it is so much less efficient to run than structured data systems that the TCO currently seems about the same as for other enterprise solutions.

Google, eBay and even Microsoft made compelling presentations about the nuts and bolts of Big Data, then tried to resell their capabilities to the service providers in the room. TM Forum could have been a bit more ambitious and tried to get more head-on strategic discussions going about how the big pure-play OTT giants are actually eating telcos' and other service providers' lunch. Maybe a lively debate to set up in Cannes?

Does the Emperor have any clothes on?

UK hosting company MEMSET's Kate Craig-Wood isn't sure at all. Kate said that Big Data techniques are only needed in the very few cases where many hundreds of terabytes are involved and near-real-time results are required. She does concede that the concepts born from Big Data are useful for all.

MEMSET's co-founder went on to explain how a simple open source SQL database called SQLite successfully delivered interesting analysis on hundreds of billions of data points where MySQL had fallen over. She had to simplify and reorganize the data, and importing it took four days, but once that was done she got her query answered in minutes. Ms Craig-Wood went on to say that the SQL community is working flat out to solve scalability issues, going so far as to say "I'd put my money on SQL evolving to solve most of the Big Data problems". There's so much SQL expertise out there!
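Her load-then-query pattern is worth sketching: relax durability during the import, insert in large batches, and only build indexes once the data is in. The schema and file names below are invented; this illustrates the general technique, not MEMSET's actual pipeline.

```python
# Minimal sketch of bulk-loading SQLite for offline analysis: trade
# durability for load speed, batch the inserts, index after loading.
import csv
import sqlite3

con = sqlite3.connect("analytics.db")
con.execute("PRAGMA journal_mode = OFF")   # no rollback journal during load
con.execute("PRAGMA synchronous = OFF")    # don't fsync every transaction
con.execute("CREATE TABLE points (ts INTEGER, metric TEXT, value REAL)")

with open("points.csv", newline="") as f:  # hypothetical 3-column dump
    reader = csv.reader(f)
    while True:
        batch = [row for _, row in zip(range(100_000), reader)]
        if not batch:
            break
        con.executemany("INSERT INTO points VALUES (?, ?, ?)", batch)
        con.commit()                       # one commit per 100k-row batch

# Index AFTER the load: far cheaper than maintaining it per insert
con.execute("CREATE INDEX idx_points_metric_ts ON points (metric, ts)")
con.commit()

# Queries now run against the indexed table
for row in con.execute(
        "SELECT metric, COUNT(*), AVG(value) FROM points GROUP BY metric"):
    print(row)
```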

Perhaps the most controversial part of this refreshing Big Data debunking session from Kate Craig-Wood of MEMSET was when she said: "I don't believe in data scientists, most DevOps will do fine, and Hadoop isn't that complex anyway". She has a point: we're at the pinnacle of the hype cycle.

Caution

Less extreme, but still on the side of caution, were the sensible questions from Telefonica, which is experimenting with Big Data. The Spanish operator is still wary of the "high entrance cost" and the uncertain final price tag, or TCO. So far the telco has built both a centralized cloud instance of its platform and separate instances for each of its operating companies in different markets. Telefonica's Daniel Rodriguez Sierra gave an amusing definition of Big Data: simply those queries we can't handle with current technology.

Verizon Wireless also reaffirmed the need for caution, pointing out that to implement Big Data and reap any benefit from it you need an agile, trial-and-error approach. That's a tall order for any incumbent telco. The US mobile operator admitted that it was being wooed by the likes of Google, Amazon and eBay, which would all love to resell their analytics capabilities to Verizon. But staunch resistance is the party line, as Verizon Wireless has the scale (and pockets) to decide that the core data is too strategic to be outsourced. In terms of scale, Verizon Wireless has 100 million subscribers and 57,000 towers that generate a petabyte of data, or 1.25 trillion objects, per day, currently crunched with 10,000 CPUs. Verizon's Ben Parker was pleasantly open, saying that an "army of lawyers is happily supplied with plenty of privacy work now we're capturing info on all data packets".

Governance was mentioned too frequently across several presentations not to raise an alarm bell in my mind. It seems that those who have actually got their hands dirty with Big Data are finding themselves embarked on projects that are difficult to control.

In the end

I was really impressed by the commitment operators are making to Big Data on the one hand, while on the other they clearly expressed reservations, or at least warned that we're just at the beginning of what's going to be a long journey.

For further reading, here are three other write-ups of the event that I commend:

There’s a mini video interview of Peter Crayfourd here: http://vimeo.com/58535980

Part I of this report (interview of TM Forum's Strategy Officer) is here.

Part II, a discussion with Guavus and Esri, is here.


Google TV off to a bad start


Google is undisputedly good at advertising and search. I've been convinced for a while now that Google and TV make sense together; see this IPTV News interview from 18 months ago.

If Google had decided to enable a business model for companies from, say, Roku to NDS using its advertising capability linked to search, I'd be totally confident in the venture, even though success might still have taken time to arrive.

But by embracing the whole middleware environment with a complete solution, Google has bitten off more than it can chew, even for a company its size. Large companies from Intel to Microsoft (including Apple) have all failed in their initial entries into the TV market. Different reasons apply in each case, but one commonality is the size and lack of agility of these companies, which always want to fix the whole problem instead of concentrating on their strengths. In spite of still branding its products as betas, Google has now become such a behemoth that its legendary light-footedness has all but gone.

As Mike Elgan points out in his entertaining Computerworld blog, the TV experience is mostly stuff you don't want. The lean-forward Web experience is one of finding a needle you do want in a haystack that you don't. TV's problem is more about sweeping out all the rubbish. This is where the traditional pay TV business fits in, although it is not clear whether this is cause or consequence.

Google may have some flashy (or should that read HTML5?) animations explaining what Google TV is. However, just reiterating that they'll deliver the best of the Web and TV together is not reason enough to believe it will happen.

So... what actually needs fixing for the Web and TV to merge?

1. Reliability or stability of the set-top boxes (or the equivalent inside the connected TV)
2. Ease of use of the user interface
3. Navigation within all the newly available content

Starting with the last item on this list, Google's premise seems to be that it will be in a better position to resolve the difficult issue of content navigation. It does indeed have a unique selling point here with its search technology. But the other blocking points need to be fixed first. I have six separate devices in my living room, all with the latest firmware, and I can crash any of them, sometimes with just a few button presses. Android, the operating system that will power Google TV, is still pretty shaky, and that is a no-go in my book. The lack of robustness of the demos at Google's I/O event, which amused many of the bloggers present, is telling in this respect.

Working up the list: despite its relative failure to date, Apple TV introduced the poster-art concept that all modern TV UIs now mimic with variable success. Nobody has yet provided an adequate solution to navigating Web-scale amounts of content from a lean-back TV viewing posture. Should Steve Jobs upgrade Apple TV's status from its official "hobby project" to something more strategic, then if anyone can fix this usability issue, second on my list, Apple can.

As for the first blocking point, Google delivered Android for the smartphone at breakneck speed, but in so doing confused the market with a multiplicity of unstable versions. This is almost the opposite of iOS on the iPhone. Apple's closed approach furthermore ensures both a seamless user experience and a certain level of quality, whereas Google's open approach can open a Pandora's box of faulty or incompatible apps. For robustness in the TV space, one would more naturally look to the NDSs or OpenTVs of the world to fix this issue.

If I were Eric Schmidt, I'd put Android for TV back into the R&D labs for another couple of years. Then I'd choose a partner or, to be truer to Google's philosophy, publish open APIs for anyone to monetise OTT content through an ad system designed for hybrid TV. Going for the gold during a gold rush, the way Google is doing now, is a risky business, and it may well fail. If it just sold the shovels, Google would be sure to succeed, and it could always buy back into the whole TV experience when the dust settles.

Combining the Web with TV, which is the Google TV bottom line, has been tried more times than any industry expert can count.

If it finally succeeds because big HD screens let you read text in the living room and devices let you interact with cloud-based services, maybe with voice control or gesture-based interfaces, then surely it's the set makers that stand to win. I don't see how the current Google-Sony-Logitech alliance could withstand the strains of success.

If the glue that finally sticks the Web and TV together turns out to be a reshaped entertainment and media ecosystem, with OTT becoming the norm and content flowing directly to TVs through bit-pipes, then we would see a fragmentation of the content industry. Google could then dominate this space just as it does the Internet, thanks to its search/advertising model. However, the advent of file sharing and the MP3 saga have woken the sleepy content industry. I don't believe it will let Google reach such prominence here. Even if I'm wrong and it does, what revenues does Google derive from MP3 file sharing, legal or otherwise?

Quality live TV and film are still associated with subscription services. Throughout the rise of the Internet over the last decade, the pay TV industry has only gotten stronger, with rising numbers of subscribers. TiVo tried to pioneer a new model but has seen its active subscriber base drop from 3.3 million to 2.5 million in the last 18 months.

An executive from the TV industry once told me that young, enthusiastic techies like myself had been explaining to him for over ten years how new technology would radically change the TV business. That conversation took place over five years ago! His point was, and I suppose still is, that for fifteen years waves of technical change have only reinforced the basic pay TV model. Even the still-topical world recession hasn't dented its subscriber numbers.

Let me revisit the content navigation issue once more. Beyond the sheer mass of available content, where Google's search will solve problems, the issue is also going to be about home networks. There's no point having a great search engine if it cannot index the content on all the different devices in the home. Google is no better equipped than others to achieve seamless home networking; in fact some, like the proponents of the DLNA standard, are probably better equipped for this.

I'll end with the lack of clarity in Google TV's positioning.

Google champions the search paradigm, where revenue is generated from advertising. With its Android operating system, Google is also moving into the iTunes/App Store model, where revenue is generated from the sale of apps. It's not clear to me how Google will be able to play both hands simultaneously on the TV.

A blog entry by Vintner, "If Google TV were a bicycle, I'm a fish", also points to this lack of clarity. This fun read states that Google is no longer a start-up and that pushing technology is no longer enough, even if it's cool technology.

Indeed, there's already too much technology in the crowded TV space. What the industry desperately needs are viable business models that enable OTT content flows to complement, rather than replace, existing pay TV platforms. So Google, please put your TV technology back in the R&D lab where it belongs and bring us the tools to find and monetise video from the Web on the TV.