There was a great turnout at TM Forum’s inaugural Big Data event in January. It was small enough to enable proper networking, but the packed room made it feel like something more than just hype or buzz is happening around Big Data.
Some of the clear benefits Big Data brings right away
A key benefit eBay has gotten out of Big Data analytics since starting with Hadoop in 2010 is greater flexibility. An example of what they can do better now is working out how much to bid on specific keywords like “iPad”, because the decision often has to be made in near real-time. Big Data also helps eBay manage the key differences in word meaning from market to market.
Bell Canada was one of the more upbeat operators on Big Data. James Gazzola made a case for Advertising Forensics, where the operator could use analytics to determine which ads are actually watched. Bell hopes that these insights, once mastered, could be monetized. Gazzola went on to point out that as Bell Canada serves 66% of Canadians, analytics could show what’s happening in all of Canada. That sent a slight shiver down my back as I wondered if the journey from network planning to user analytics actually terminated at a station called Big Brother, but oops, this is the part on benefits. So back to more down-to-earth issues: Gazzola told the audience that voice traffic used to be relatively predictable, but that data traffic driven by smartphones is anything but. Big Data is what Bell is looking at to help plan future network capacity.
Google’s presentation was disappointing. I don’t really blame Google speakers, because the expectations are always unrealistically high: there’s so much we crave to know about Google. Matt McNeil, from Google’s Enterprise division, was asked if they have any big Telco clients for Big Data yet. His wooden answer that “we’re talking to several” showed the limits of the company’s transparency policy. But during his sales pitch, Matt got quite excited explaining that capacity costing you $5M to build yourself is something Google charges just $500 per hour for, on a Hadoop-powered Big Data analysis platform. When McNeil showed off http://bigquery.cloud.google.com, the exciting fact that “Led Zeppelin” is more controversial than “Hitler” got me a bit concerned that maybe all this Big Data stuff really was hype after all. I suppose we need practice finding more telling examples, because as Matt said himself, “this year will be a trough of disillusionment for Big Data”.
Big Data is about Big change
Peter Crayfourd, who recently left Orange, pointed out that becoming truly customer-centric can be scary: such an approach may uncover many unhappy customers. And becoming truly customer-centric will take at least 5 years. All speakers at the Big Data conference seemed in agreement that user-centric KPIs based on averages are to be shunned because users are so very unique! That sounds fine in theory, but CFOs are going to need to stay up late working out how to live without the concept of ARPU.
The eye-opening presentation from Belgacom’s Emmanuel Van Tomme stayed on the customer-centricity theme but made the clearest case so far that change management is key to Big Data implementation. Emmanuel was the first Telco guy I’ve heard talk about a great new metric called VAP, or Very Annoyed People. They can now be identified with Big Data analytics.
Many speakers converged on the theme of “change management” as THE key challenge for Big Data going forward. The general message was that while Hadoop is ready to deliver, people are not yet, and their organizations even less so.
Thinking of the bigger Telcos conjures up an image of oil tankers trying to steer away from network metrics towards usage metrics. Looking solely at the agility dimension, I couldn’t help wondering if they could hold their own against speedboats like Amazon or Google.
As the conference was wrapping up I gleaned an interesting metric: subscribers are 51% more willing to share their data if they have control over whether or not to share it in the first place! It’s one of those Doh!-I-knew-that statistics you feel you should have come up with, but didn’t.
Earlier it had been pointed out during one of the panel sessions that to make Big Data work for Telcos, subscribers must entrust ALL their data to the operator. For them to agree to this, the outbound sales & marketing link must be cut. It’s probably wiser to have one unhappy head of sales than many unhappy customers.
But things aren’t so simple
The limitations of KPIs
Peter Crayfourd illustrated the danger of KPIs with the voice continuity metric. In his example it was 96% when calculated over 15 minutes, so if that’s what you’re tracking, all is hunky-dory. But in the same network environment, when the same metric is calculated over 45 days the result is usually 0%. Crayfourd went on to explain how averages can be dangerous within an organization: someone with their head in the oven and feet in the freezer has a good average temperature! Matt Olson from US operator CenturyLink pointed out that in the User eXperience (UX) domain simple maths just don’t work: UX isn’t the sum of the parts but some more complex function thereof.
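A quick back-of-envelope calculation shows why the measurement window matters so much. Assuming (my simplification, not Crayfourd’s actual model) that each 15-minute window independently achieves 96% continuity, the probability of a flawless 45-day run collapses to essentially zero:

```python
# Sketch of the window-size effect on a "96%" KPI.
# Assumption: independent 15-minute windows, each succeeding with p = 0.96.
WINDOW_SUCCESS = 0.96
WINDOWS_PER_DAY = 24 * 60 // 15   # 96 fifteen-minute windows per day
DAYS = 45

# Probability that every single window over 45 days is flawless.
p_flawless = WINDOW_SUCCESS ** (WINDOWS_PER_DAY * DAYS)
print(f"P(no drop over {DAYS} days) = {p_flawless:.2e}")
```

With 4,320 windows in 45 days, the result is a vanishingly small number, which is exactly why the same network reports 96% on one timescale and 0% on another.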
Listening to the UX-focused presentations, one got the feeling that the Big Data story might just be a pretext for some new players to come and pull the rug from under the feet of the QoE market stakeholders. They’ve been saying this for almost a decade … Big Data is a means, not an end.
Cost of Big Data & Hadoop
For eBay, Hadoop may be cheaper to set up, but it’s so much less efficient to run than structured data that the TCO currently seems the same as with other enterprise solutions.
Google, eBay and even Microsoft made compelling presentations about the nuts and bolts of Big Data and then tried to resell their capabilities to the service providers in the room. TM Forum could have been a bit more ambitious and tried to get more head-on strategic discussions going on how the big pure-play OTT giants are actually eating Telcos’ and other service providers’ lunch. Maybe a lively debate to set up in Cannes?
Does the Emperor have any clothes on?
UK hosting company MEMSET’s Kate Craig-Wood isn’t sure at all. Kate said that Big Data techniques are only needed in the very few cases where many hundreds of terabytes are involved and near real-time results are required. She does concede that the concepts born from Big Data are useful for all.
MEMSET’s co-founder went on to explain how a simple open-source SQL-based DBMS called SQLite successfully delivered interesting analysis on hundreds of billions of data points where MySQL had fallen over. She had to simplify and reorganize the data, and importing it took 4 days, but once that was done she got her query answered in minutes. Ms Craig-Wood went on to say that the SQL community is working flat out to solve scalability issues, going so far as to say “I’d put my money on SQL evolving to solve most of the Big Data problems”. There’s so much SQL expertise out there!
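The pattern she described, flattening the data into one simple table, bulk-importing it, then running indexed aggregate queries, can be sketched at toy scale with Python’s built-in SQLite bindings (the schema and data here are my own illustrative assumptions, not MEMSET’s):

```python
# Minimal sketch of the "simplify, bulk-import, then query" SQLite workflow.
# All table/column names are hypothetical; real workloads would use a disk file.
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (sensor_id INTEGER, value REAL)")

# Bulk import: executemany inside a single transaction is what makes
# large loads tractable (one commit, not one per row).
rows = ((random.randrange(100), random.random()) for _ in range(100_000))
with conn:
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)

# An index on the grouping column keeps the analytical query fast.
conn.execute("CREATE INDEX idx_sensor ON events (sensor_id)")

top = conn.execute(
    "SELECT sensor_id, COUNT(*) AS n, AVG(value) FROM events "
    "GROUP BY sensor_id ORDER BY n DESC LIMIT 5"
).fetchall()
print(top)
```

The same slow-load-then-fast-query trade-off she reported (4 days to import, minutes to answer) falls out of this design: all the work goes into the one-off reorganization and indexing.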
Perhaps the most controversial part of this refreshing Big Data debunking session from Kate Craig-Wood of MEMSET was when she said that “I don’t believe in data scientists, most DevOps will do fine, and Hadoop isn’t that complex anyway”. She has a point: we’re at the pinnacle of the hype cycle.
Less extreme, but still on the side of caution, were the sensible questions from Telefónica, which is experimenting with Big Data. The Spanish operator is still cautious about the “high entrance cost” and the uncertain final price tag or TCO. So far the Telco has built both a centralized cloud instance of its platform and separate instances for each of its operating companies in different markets. Telefónica’s Daniel Rodriguez Sierra gave an amusing definition of Big Data as simply those queries we can’t handle with current technology.
Verizon Wireless also reaffirmed the need for caution, pointing out that to implement Big Data and reap any benefit from it you need an agile trial-and-error approach. That’s a tall order for any incumbent Telco. The US mobile operator admitted that it was being wooed by the likes of Google, Amazon and eBay, which would all love to resell their analytics capability to Verizon. But staunch resistance is the party line, as Verizon Wireless has the scale (and pockets) to decide that the core data is too strategic to be outsourced. In terms of scale, Verizon Wireless has 100M subs and 57K towers that generate a petabyte of data, or 1.25 trillion objects, per day, currently crunched with 10K CPUs. Verizon’s Ben Parker was pleasantly open, saying that an “army of lawyers is happily supplied with plenty of privacy work now we’re capturing info on all data packets”.
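Those numbers are worth turning into per-unit rates to appreciate the scale. A back-of-envelope calculation (assuming decimal units, i.e. 1 PB = 10^15 bytes, which is my convention, not Verizon’s stated one):

```python
# Back-of-envelope on the quoted Verizon figures: 1 PB and 1.25 trillion
# objects per day, processed by 10,000 CPUs. Decimal units assumed.
SECONDS_PER_DAY = 86_400

objects_per_day = 1.25e12
cpus = 10_000
bytes_per_day = 1e15  # 1 PB, decimal convention

objects_per_cpu_per_sec = objects_per_day / cpus / SECONDS_PER_DAY
ingest_gb_per_sec = bytes_per_day / SECONDS_PER_DAY / 1e9

print(f"~{objects_per_cpu_per_sec:,.0f} objects/s per CPU")
print(f"~{ingest_gb_per_sec:.1f} GB/s sustained ingest")
```

That works out to roughly 1,400 objects per second per CPU and over 11 GB/s of sustained ingest, around the clock, which puts the “agile trial and error” challenge in perspective.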
Governance was mentioned too frequently during several presentations not to raise an alarm bell in my mind. It seems that those who’ve actually got their hands dirty with Big Data find themselves embarked on projects that are difficult to control.
In the end
I was really impressed by the commitment operators are making to Big Data on the one hand, while on the other clearly expressing reservations, or at least warning that we’re just at the beginning of what’s going to be a long journey.
For further reading here are three other write-ups of the event that I commend:
There’s a mini video interview of Peter Crayfourd here: http://vimeo.com/58535980
Part I of this report (interview of TM Forum’s Strategy Officer) is here.
Part II, a discussion with Guavus and Esri, is here.