
Operators plan massive carrier grade Wi-Fi expansion


Huge pent-up appetite for carrier-grade Wi-Fi has been confirmed by recent research commissioned by customer experience specialist Amdocs, suggesting that both cable operators and MNOs (Mobile Network Operators) will deploy it at massive scale over the next three years. The research, conducted by Real Wireless and Rethink Technology Research, identifies how operators in both camps are responding to rapidly growing dissatisfaction among their customers with the Wi-Fi performance and reliability they get, in public hot spots in particular. The fault does not really lie with Wi-Fi itself, which has actually improved by leaps and bounds, but with the failure to keep up with escalating expectations. People now expect levels of availability for Internet access that used to be confined to enterprise data networks, and Wi-Fi has come into the firing line as the new de facto “last mile” of the broadband access infrastructure.

So as Amdocs pointed out, service providers are seeing that “best-effort” Wi-Fi is becoming less profitable and that a guaranteed higher quality of experience (QoE) is needed for emerging revenue-generating services such as TV everywhere and online gaming. Yet as we all know, Wi-Fi QoS in public places like hotels and trains is all too often poor and inconsistent, too susceptible to data traffic congestion as well as varying spectral conditions.

Carrier Wi-Fi implies guaranteed QoS for specific services such as TV, which in turn depends on traffic management techniques to meet varying requirements for bandwidth and latency, giving some IP packets priority while holding back other packets associated with less urgent applications like email. Above all, carrier Wi-Fi requires strong tools for network planning and management to ensure that QoS can be maintained even at peak times. In the Amdocs survey, two thirds of respondents identified the lack of such tools as one of the top three risk factors that might deter or delay investment in carrier-grade Wi-Fi.
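To make the prioritization idea concrete, here is a minimal sketch, in Python, of how an application can ask for priority treatment by marking its packets with a DSCP value (Expedited Forwarding for video, best effort for bulk traffic like email). Whether the marks are honored end to end is entirely up to the network operator's traffic management, and the addresses and ports below are illustrative only.

```python
import socket

# DSCP values (RFC 2474 / RFC 3246): Expedited Forwarding for
# latency-sensitive traffic, default (best effort) for the rest.
DSCP_EF = 46
DSCP_DEFAULT = 0

def make_marked_socket(dscp: int) -> socket.socket:
    """Create a UDP socket whose outgoing packets carry the given DSCP mark.
    DSCP occupies the top 6 bits of the IP TOS byte, hence the shift by 2."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)
    return sock

# Hypothetical endpoints, for illustration only.
video_sock = make_marked_socket(DSCP_EF)
video_sock.sendto(b"video frame", ("192.0.2.10", 5004))

email_sock = make_marked_socket(DSCP_DEFAULT)
email_sock.sendto(b"mail chunk", ("192.0.2.20", 2525))
```

On the network side, routers and access points can then queue EF-marked packets ahead of best-effort ones at congestion points, which is the mechanism behind the guaranteed QoS described above.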

Fortunately such tools are now available from a clutch of vendors that specialize in carrier Wi-Fi, having cut their teeth on offloading traffic from cellular networks to broadband via Wi-Fi. One of them, Aptilo, now emphasizes the importance of integrating Wi-Fi at the service management level with existing backend OSS/BSS operational systems as a foundation for policy enforcement and new revenue generation. Another, Birdstep, has been focusing increasingly on the bigger picture of heterogeneous networks (HetNets) that combine Wi-Fi with cellular, with the catchline of “Experience Continuity” to describe the goal of delivering optimum QoS to users wherever they are and whatever device they have.

Of the two operator categories covered by Amdocs in its research, HetNets are of greatest interest to MNOs, but carrier-grade Wi-Fi itself is a major goal for many cable operators seeking to give their subscribers access to premium TV content on the road and to underpin their quad-play offers. The interesting aspect of the research is the suggestion that operators will seize on carrier Wi-Fi technology almost as soon as it comes out. As a result, the prediction is for penetration of carrier-grade Wi-Fi hotspots to increase from 14% at the end of 2014 to 72% by 2018.

This will not be a case of technology leaking gradually out to the market as it comes along, but of being pulled hard by consumer demand. Just as high-speed broadband Internet access has come to be taken for granted, carrier Wi-Fi will quickly follow.


Real UHD deployments in 2016 says Thierry Fautier, UHD Forum will help


Most of the video ecosystem is agreed on one thing: Ultra HD or 4K will happen, but none of us agree yet on when and how. What is clear is that standards will play a key role in determining the timeframe. In previous cases, say with DASH for example, an industry body sitting above competing standards has been the most effective way to speed things up. It seems like two separate initiatives are coalescing independently, which may be a good thing. CES 2015 was the place to be, and many UHD issues were addressed. To get a clearer picture, I spoke to someone at the heart of it all. Here is my interview with Thierry Fautier, VP of Video Strategy at Harmonic Inc.

Q: First of all Thierry, can you confirm that UHD was a prominent theme in Las Vegas this year?

A: Most certainly, Ultra HD was one of the most prominent topics at CES 2015. This was the first major show since some key announcements of Ultra HD services in late 2014:
- UltraFlix and Amazon, which offer OTT services on connected TVs,
- DirecTV, which announced a push-VoD satellite service (through its STB, which stores and then streams, with decoding in the Samsung UHD TV),
- Comcast, which announced a VoD streaming service delivered directly through the Samsung TV, with content from NBC.

Q: But these services require UHD decoding on a Smart TV?

A: Yes, that is a first takeaway from CES: TVs are the ones decoding UHD for now; STBs will start doing so from the second half of 2015.

Q: Beyond the few services just described, what signs did you see that UHD might really start becoming available to all from 2015?

A: Several. For example, the announcement that Warner Bros has decided to publish UHD titles using Dolby's Vision process. Netflix also announced that its Marco Polo series would be re-mastered in HDR (without announcing which technology). So on the content and services side, things are moving on HDR.

Q: Do you see HDR as one of the first challenges to solve for UHD to succeed?

A: I do. The plethora of HDR demonstrations by all the UHD TV manufacturers was impressive. I will not go into the details of the technologies used; it would take too much time and they may change (due to the standardization effort around HDR). The only thing I would say is that there is a consensus in the industry that UHD TV production will be at around 1,000 nits (against 10,000 for the MovieLabs spec) [a nit is a measure of luminance; a typical skylight lets in about 100 million nits and a fluorescent light about 4,000 nits]. On the technology side, LG is the outsider with its OLED technology, shown at 77 inches, while the rest of the industry seems to be focusing on quantum dot (Samsung announced in 2014 that it was abandoning OLED).
This suggests that we will have HDR in 2015; the real question is which spec HDR will be based on. You now understand the eagerness of the studios to standardize HDR.

Q: So is HDR a complete mess?

A: HDR is actually already in the process of standardization, but the various efforts are only loosely synchronized:

- ITU issued a call for technology, which was answered by Dolby, BBC, Philips and Technicolor.
- EBU/DVB is working on a standardization of HDR mainly for live broadcast applications. The goal is to finalize the spec in 2015.
- SMPTE is defining the parameters required for the production of HDR content. A first set of specs (ST 2084 for the HDR EOTF and ST 2086 for metadata) has already been ratified (see the sketch of the ST 2084 curve after this list).
- MPEG is currently defining what to add to the existing syntax to support HDR in a single layer. The outcome is expected in July 2015.
- Blu-ray is finalizing its HDR (single layer) specification and also hopes to freeze it mid-2015, optimistically hoping to launch services in time for Christmas 2015. Blu-ray is working in coordination with MPEG and SMPTE. Note that Blu-ray will then follow specifications for streaming/download under UltraViolet.
- The Japanese stakeholders, through NHK, announced they would now develop their own HDR for 8K.
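As an aside, the 1,000-versus-10,000-nit debate is easier to follow with the ST 2084 “PQ” curve in hand. Here is a minimal sketch, using the published PQ constants, of the inverse EOTF that maps absolute luminance to a video signal; the takeaway numbers are straightforward arithmetic, not figures from the interview.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF, which maps
# absolute luminance (0..10,000 nits) to a normalized signal value.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Return the normalized PQ signal (0..1) for a luminance in nits."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for peak in (100, 1000, 4000, 10000):
    v = pq_encode(peak)
    print(f"{peak:>6} nits -> PQ signal {v:.3f} (10-bit code ~{round(v * 1023)})")
```

Running this shows that 1,000 nits already lands at roughly 75% of the PQ signal range, which helps explain why the industry feels comfortable starting there while the MovieLabs spec reserves headroom up to 10,000 nits.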

So you see the diversity of the proposals that exist; the new “Ultra HD Alliance” should bring some order here. The clue I can give is that to have a Blu-ray UHD service in 2015, it must be done with chips that are already in production in 2015. I think we will see more clearly at NAB (April), and that by IFA (September) everything will be decided, at least for the short term, aligned hopefully with DVB/EBU Ultra HD-1 Phase 2.

Q: I gather what is now called the “Ultra HD Alliance” is actually something different from what I described in my last blog, and that its first challenge is getting HDR sorted out?

A: Indeed Ben, the Ultra HD Alliance is a group of 10 companies, primarily from Hollywood and the world of TV, in addition to Netflix and DirecTV on the operators' side. The first goal of this group is to get HDR (High Dynamic Range) specifications under control (see diagram below), as well as quality measurement at the output of the UHD TV. In this regard, Netflix will launch a certification of the quality of HDR streaming; HD to start, and we can imagine that this will be extended to UHD. Note that no manufacturers have yet been invited, which is surprising as they are the ones who will actually do most of the job!

Q: So the organization we spoke about last time is something else?

A: Yes. Harmonic, together with a group of 40 other companies, has proposed creating an Ultra HD Forum to take care of the complete UHD chain from end to end, including OTT, QoS, push VoD, nVoD, adaptive streaming, live and on-demand. After various meetings that took place at CES, discussions are ongoing to ensure that the two groups (UHD Alliance and UHD Forum) work closely together.

Q: So, as in other areas, would you see the need for at least two governing bodies to manage UHD standards?

A: In the short term, yes. The UHD Alliance is focusing on a single blocking factor at the moment, i.e. HDR/WCG/audio, but will have a broader marketing and evangelization remit. The UHD Forum, on the other hand, is starting out with an ambition of end-to-end ecosystem impact. In the longer term there is no reason the two entities might not merge, but from where we stand today it seems most efficient to have two bodies with different focuses.

Q: Does HDR make sense without HFR (High Frame Rate)?

A: Well, I'd say on the chip side there is still a challenge, as twice the computing power is required; HDMI is also a limiting factor as bandwidth increases. Early services might get away with just a 25% increase. Most encoder providers are not yet convinced that the effort will produce improvements justifying the disruption brought by the doubling of the frame rate. We have been asking for formal 60/120 fps testing, but we'll need to wait for the new generation of cameras, especially in sport, as opposed to the currently used cameras, often set to the low shutter speeds inherited from the film world where 24 fps is the norm. At IBC'14, Harmonic, together with Sigma Designs, was showing encoding of UHD p50 and up-conversion to 100 fps in a Loewe Ultra HD TV set, with motion-compensated frame-rate up-conversion powered by Sigma Designs. Visitors from the EBU saw the demonstration and were pleased with the result. This will be one of the most contentious topics in the months to come, as the value might not be able to counterbalance the impact on the ecosystem.
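For intuition about what frame-rate up-conversion involves, here is a deliberately naive sketch that doubles a clip's frame rate by blending adjacent frames. Real products, like the motion-compensated up-conversion in the Sigma Designs demo, estimate motion vectors and interpolate along them; this toy version only illustrates the data flow and why the compute cost roughly doubles.

```python
import numpy as np

def double_frame_rate(frames: list) -> list:
    """Naive up-conversion: insert the average of each consecutive pair of
    frames. Plain blending like this produces ghosting on fast motion,
    which is exactly why real up-converters use motion compensation."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        blended = (a.astype(np.uint16) + b.astype(np.uint16)) // 2
        out.append(blended.astype(np.uint8))
    out.append(frames[-1])
    return out

# Illustrative 3-frame UHD clip at 50 fps -> interpolated toward 100 fps.
clip = [np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
        for _ in range(3)]
print(len(clip), "->", len(double_frame_rate(clip)), "frames")
```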

Q: What about the chipset makers?

A: I visited Broadcom, ViXS, STM and Sigma Designs, who all had demos at different maturity levels supporting different types of HDR. They are all waiting for a standard for HDR.

Q: So to wrap up, can you zoom out from the details and give us the overall picture for UHD deployment?

A: Ultra HD is a technology that will revolutionize the world of video. Making UHD happen requires a complete rethinking of the workflow, from video capture and production through to presentation. This will take several years. I'm not even talking about the spectrum issues involved in getting this onto the DTT network....
As you can see, the specifications are still in flux when we talk about “real Ultra HD”; the technologies are being put in place and should be ready in 2016, enabling large-scale live interoperability testing during the Rio Olympics, along with the first OTT or Blu-ray Disc services that support HDR and WCG (Wide Color Gamut).

(Disclaimer 1: Thierry is a friend and is passionate about Ultra HD; he was an invited speaker on UHD at both NAB and IBC last year. Disclaimer 2: Although I have written for Harmonic in the past, I'm not under any engagement from them.)

[Diagram: UHD2]

To be continued....


Ultra HD ecosystem getting organized, alliance on the way


I attended the French HD Forum meeting on UHD last week in Paris, which was hosted by Eutelsat. France prides itself on being innovative, often with government or strong regulator incentives. How this actually works out is a matter for politicians, as in the case of the Minitel, which predated the Internet. There is still no consensus on whether it was a good thing for France, with French people becoming used to eCommerce before the term even existed, or whether on the contrary it made France miss the first Internet wave.

When it comes to TV standards, similar debates rage. Much ink was spilt over the terrestrial switch-over, which was completed here in 2011. The transition from SD to HD was always a political hot potato and is still underway, with spectrum scarcity in the current DVB-T1 setup restricting HD to 5 of the 23 FTA channels.

Unsurprisingly, when the French get talking about UHD, there's palpable tension between all the differing agendas. Does it make more sense to finish upgrading the end-to-end environment to HD before playing around with UHD, or on the contrary would it be more economical to avoid two upgrades and go straight to the ultimate target of full UHD? Should TV stations wait for customer demand or try to stimulate it with UHD services as early as 2015?

Beyond these legitimate debates, there is also some confusion that is artificially created by a lack of information and sharing across the Ultra HD video ecosystem.

The risk of confusion

As UHD TV gradually tips over its peak of inflated expectations, the TV industry at large, through the diversity of its reactions, will undoubtedly lead it down into the trough of disillusionment. Some TV stations publicly doubt whether 4K will ever be a sound business proposition, while satellite operators and many technology vendors have bet their future on UHD's success. Sometimes, even within the same industry group, UHD is being pulled in several different directions at once, as for example between the different UHD specifications of the ITU, EBU, Digital Europe and CEA. Some key differences and commonalities are:

| Feature     | ITU                   | EBU (phase 1) | CEA           | Digital Europe |
|-------------|-----------------------|---------------|---------------|----------------|
| Resolution  | 4320p                 | 2160p         | 2160p         | 2160p          |
| Frame rate  | 120/60                | 60            | 60/30/24      | 60/30/24       |
| Color space | BT 2020               | BT 709        | —             | —              |
| HDMI        | N/A                   | —             | 2.0           | 2.0            |
| Bit depth   | 10/12                 | 10            | 8 minimum     | 8 minimum      |
| HDR         | Under standardization | Phase 2       | Not mentioned | Not mentioned  |

The standardization of UHD has so far been much less chaotic than it was for, say, 3D technology at a similar stage.

Some clear standards are already emerging from:

  • the telecoms sector with the (ITU-R) recommendation from ITU’s Study Group 6 (more at: http://www.tvtechnology.com/news/0086/itu-issues-uhdtv-standards-recommendations-/213615#sthash.DXG9J7bU.dpuf),
  • the video technology space, which is also active, with MPEG-HEVC having published a specification in January 2013 that can be used for UHD and that is now actively looking at HDR,
  • the consumer electronics industry that provided a vital part of the Ultra HD requirements with the standardization of HDMI 2.0,
  • the broadcasters, with the DVB/EBU ultra high definition broadcast format (UHD-1 Phase 1) specification for example.

But UHD's success will rely on much more than just increased bandwidth and resolution, and many of the other elements are still under discussion, for example the required increases in both color sensitivity and contrast with HDR (High Dynamic Range), or in refresh rates with HFR (High Frame Rate). Norms for carrying higher-definition audio with a greater number of channels have been standardized by ETSI with AC-4, which is actively promoted by Dolby. The MPEG standards body is currently in the process of creating an object-based audio encoding standard with MPEG-H. The IP encapsulation techniques defined by SMPTE (2022-6) are still to be universally accepted by the industry.

To succeed faster, and at a lower cost for early adopters, UHD doesn't need yet another body defining standards, but one that explains them, helps ensure their interoperability and promotes successful business cases.

After the failure of 3D, the industry needs to regroup around UHD to ensure its success, in much the same way the DASH Industry Forum (dashif.com) has rallied all the DASH energies.

The Ultra HD ecosystem is quite complex, and we provide here (courtesy of Harmonic) an end-to-end diagram for Ultra HD:

[Diagram: Ultra HD end-to-end ecosystem]

To speed up the process of getting through the trough of disillusionment, or maybe it is to cross the chasm, I learnt in Paris that a few market-leading companies are in talks to set up an alliance. Its intended scope is to cover all parts of the content lifecycle from production to display, encompassing contribution, distribution, post-production and play-out. The alliance's stated goal will be to promote interoperable specifications, propagate effective business models, provide forecasting and share all successful application models.

The alliance would identify, describe and share specifications relevant to all parts of the distribution chain in close collaboration with standardization bodies.

Interoperability will be a key driver for all the alliance's work: defining the system-level interop points, organizing interop plugfests, and publishing and promoting the results.

The Alliance would also deliver business models for both live and on-demand content, sharing any industry success stories and ensuring any mistakes are only ever made once.

An Ultra HD Alliance would promote existing industry reports but also pool real market data from its members and use projections to obtain the most accurate forecasts for critical market dynamics. The number of deployed UHD-capable CPE, the readiness of live TV workflows and the extent of UHD VoD assets would be closely monitored and projected. The alliance also intends to show how UHD can be used in different application domains such as VoD, live TV, linear play-out, push VoD, etc., presenting the benefits of UHD over HD with operator feedback.

To successfully promote Ultra HD, the alliance would be represented at trade shows and conferences. The alliance's website would encourage interaction through blogging and social media. Webinars and various publications, including whitepapers, would also help shorten UHD's time-to-market.

The alliance would be open to companies from all parts of the ecosystem. Content providers, broadcasters, production houses, operators, playout companies, encoder vendors, audio specialists, security providers, chipset makers and UHD device manufacturers would all be able to join. Other organisations such as the HD Forum, EBU, DVB, etc. would be welcome too.

The setup of the alliance is still at the stage of informal talks, but the first formal meeting will take place at CES in Las Vegas in January 2015.

Stay tuned for an update after the show (previous 4K blog on 7 Reasons why UHD/4K makes sense here)


Live OTT streaming – Industry feedback from CDN World Summit London 2014


Last week I led a round table on the future of live OTT TV and its implications for CDNs during the last session of Informa's CDN World Summit in London.

My first point to open the debate was on QoE. I pointed out that mobile telephones are a giant step backwards in terms of voice QoE and service availability compared to good old fixed lines. However, we're all happy to redial dropped calls, lose coverage or ask our correspondents to repeat themselves on our mobile phones, because with mobility we gained so much more than we lost in service QoE. I then suggested users might accept a similar trade-off and embrace lower QoE for OTT TV than for broadcast, in exchange for lower costs, mobility, greater choice and personalisation. The reactions around the table made me think I'd just insulted the Queen. There was emphatic disagreement. TV is TV and will always be TV, said the TV operators, and nobody dared take issue. I guess that's what happened in the boardrooms of railway companies when airplanes arrived. One of the non-TV-operator participants did agree that maybe - except for sports - QoE might be traded off for greater choice. At this point, the challenge of content navigation was brought up, for search and recommendation.

That got us talking about “long-tail live TV” and whether it might ever make sense, i.e. being able to watch a unique live stream that you really care about. That access might make you so grateful that even if the quality wasn't always pristine you'd still be happy. This idea is buoyed up in an OTT rather than a broadcast context. Indeed, all the TV markets I've worked in, even when they have many hundreds of channels available, invariably have 10 or fewer channels that any one community is prepared to pay for. One of the key promises of OTT is to abolish market boundaries, typically those defined by a satellite footprint. All those start-ups targeting diasporas are going to find tough competition as the big guys come into their nascent markets more and more.

From a financial modelling point of view, the satellite broadcasters around the table were pretty excited by the fact that for live OTT, if you have a tail-end channel that nobody is watching, your Opex goes down to zero. This for them was the real opportunity in live OTT.

Consensus was easy to reach on the fact that live OTT TV brings mobility; however, nobody was yet clear about a killer use case where this really matters. Watching videos on the tube or train is still very much a download experience, and rarely a legal one at that.

When I brought up the question of when, rather than if, Netflix starts live streaming, nobody felt ready to pick up the gauntlet. I'll keep that for another day.

Our debate wound up with an interesting discussion on the blurring of boundaries between linear and on-demand content. Typically, a shopping channel can be played out from an automated server, with people being able to interact and turn a multicast stream into a unicast one. The final feedback from two operators around the table was that multicast is only really a panacea for large telcos that own a network. For the rest of us, the cost-benefit analysis turns out much worse in the real world than on the drawing board of business planning.
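To make the multicast/unicast distinction concrete, here is a minimal Python sketch of a client that joins an IP multicast group (one shared stream for all viewers) and falls back to a plain unicast fetch when no multicast traffic arrives. The group address, port and URL are illustrative only.

```python
import socket
import struct
import urllib.request
from typing import Optional

MCAST_GROUP, MCAST_PORT = "239.1.2.3", 5004          # illustrative group
UNICAST_URL = "http://example.com/live/channel.ts"   # illustrative fallback

def receive_multicast(timeout: float = 5.0) -> Optional[bytes]:
    """Join the multicast group and wait for one datagram of the stream.
    Every viewer of a multicast channel shares the same network traffic,
    which is why Opex barely grows with the audience - on networks that
    actually support multicast."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    mreq = struct.pack("4s4s", socket.inet_aton(MCAST_GROUP),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(timeout)
    try:
        data, _ = sock.recvfrom(65536)
        return data
    except socket.timeout:
        return None   # no multicast on this network segment
    finally:
        sock.close()

packet = receive_multicast()
if packet is None:
    # Unicast fallback: each viewer pulls their own copy of the stream,
    # so delivery cost scales with the audience.
    packet = urllib.request.urlopen(UNICAST_URL).read(1316)
print(f"got {len(packet)} bytes")
```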

This left me with the clear impression that there are still problems out there looking for solutions, not the other way round, for a change. As many network and service operators want to build their own solutions rather than rely on the global CDN operators, we'll probably see a major player emerge from the likes of Anevia with its edge caching, Broadpeak with its nano-CDN, MediaMelon with its QoE analysis, or Octoshape.


Disruptors are everywhere – do you see them?


Senior consultant Hugh Massam escaped from the energy sector after serving a 15-year sentence. Here he offers some thoughts post-IBC.

‘There is nought so powerful as a good idea whose time has come.’ The question is: do you see the great ideas and disruptors that your competitors see?

When we’re busy developing products, jumping into new markets and maybe sitting in meetings about dense tech or legal issues, it can be all too easy to miss a competitor that’s about to eat your lunch. It was clear to me from three days at IBC that some spaces – such as middleware – are ponds filled with hungry competitors. The ones that spot a seismic shift first – or even make it happen – will be the survivors.

Netflix has been creating such a shift, giving sleepless nights to many TV executives. Reliable live streaming over the Internet will probably be even more of a disruptor, and the content delivery suppliers at IBC are promising it very soon: certainly within five years, maybe in as little as two.

At Wolfpack we are often asked to benchmark clients' brands, and even this simple exercise can provide startling insights and opportunities, pointing out gaps in the market or areas where competition is already fierce. From there we partner with them as marketeers and industry insiders to help position products and promote them, avoiding the many sand-traps that can befall a product along the way. But what we also do is maintain an outside perspective and look for the real seismic shifts or threats.

Asleep at the wheel? …a story from the energy sector

Looking wider than IBC, the supply of electricity is a former state function and, in terms of NPD (new product development), was asleep for decades. In some markets both are still true. Even applying basic marketing to it is fairly new, and the marketing sophistication of some former state-owned behemoths in this space is still decades behind a sector like broadcast. And let's face it, not many marketers are hogtied by the need to keep supplying product to customers with an astronomical cost-to-serve (a wire to a farmhouse in the Outer Hebrides, for example) or to people who patently can't pay (in practice most energy companies rarely cut people off). Both happen with electricity.

But in the wider energy sector there is innovation, and it bleeds right into people's home networks. Besides the obvious shift to local and renewable generation, there's seismic action in storing power and in supplying it to your laptop - from innovators like Moixa, for example.

A conversation in the energy-supply space is likely to take a bunch of things for granted: houses need to run off the mains; home electricity networks use standard 110 or 220 volt sockets; solar panels are great, but they are best used through an inverter (solar panels generate DC, an inverter converts this to AC), supplying AC in real time to the building they are on or to the grid; and once the sun goes down, solar power is not relevant to evenings - the peak time of use - because batteries are too expensive. These are all fundamental assumptions about the market which players like Moixa are challenging.

Small solar phone chargers are old news. So why can't a home's solar panels - already generating DC - directly power small DC devices like phones, laptops and lights through a local network? Why can't you charge a battery from your solar panels in the day, or from the grid at 1am, and store that power to peak-shave by taking demand off the grid in the evening? The grid provider will pay you handsomely to reduce peak load on demand - ‘demand management' is a $1.8B market in the UK alone.
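A back-of-the-envelope sketch of the peak-shaving economics helps here. Every number below is a purely hypothetical assumption for illustration, not a Moixa or market figure: charge a battery off-peak, discharge it during the evening peak, and compare tariffs.

```python
# Hypothetical peak-shaving arithmetic - all prices and sizes are
# illustrative assumptions.
BATTERY_KWH = 2.0             # usable storage
OFF_PEAK_PRICE = 0.08         # $/kWh, 1am grid charging
PEAK_PRICE = 0.30             # $/kWh, evening consumption avoided
ROUND_TRIP_EFFICIENCY = 0.90  # fraction of stored energy recovered

energy_out = BATTERY_KWH * ROUND_TRIP_EFFICIENCY
daily_saving = energy_out * PEAK_PRICE - BATTERY_KWH * OFF_PEAK_PRICE
print(f"Daily saving: ${daily_saving:.2f}, yearly: ${daily_saving * 365:.0f}")
# Any demand-management payment from the grid operator for shaving
# peak load on request would come on top of this.
```

With these made-up numbers the battery arbitrage alone is worth only around $140 a year, which is why the demand-management payments mentioned above matter so much to the business case.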

The Economist last year spotted that emerging standards, which will soon boost a USB cable's power-carrying capability up to 100 watts (from around 10 today), will change the game too.

Players like Moixa saw the obvious: an explosion in low-power DC devices in the home, and a rise in the number of solar panels producing DC on roofs. They are working hard on dull details like DC voltages and plugs, which have never been standardised.

And at the big end - many people still assume electricity plants are large, should be available 24/7/365, and need to be built permanently near their fuel supply, but players like Karadeniz Energy and Aggreko are making massive inroads (and profits) supplying floating power plants and rented power generators. And the largest market for the latter may surprise you: the Middle East, not Africa.

So even in a slow-moving market like electricity generation, disruptors are there. The real question is – how often are you checking on the top three potential disruptors in your market?

 

Hugh Massam is a Senior Consultant with Wolfpack and also the Principal Consultant at E Equals Limited, an energy communications agency based in Cambridge, UK. @eequalsuk. 


First 4 trends spotted at IBC14


I didn't write any predictions on what would be important this year, but while it's still fresh, here are my first impressions.

Last year the Cloud was a key buzzword, and so was Amazon; this year they have been replaced by virtualization, basically the same technology but with the possibility of running all those virtual machines in a service provider's own data center. It is supposed to eventually lower costs and make things like redundancy management easier, but I've yet to be convinced it's really such a big deal. I'll try to stop by some of the encoding booths like Envivio, Harmonic or Elemental to check out whether it's really just a generalization of the concept of software-based versus hardware-based encoding. I'll also try to get back to the Amazon Web Services stand in hall 3, where they're explaining how Netflix uses AWS with special tools developed to optimize service availability.

4K has of course been around for several years yet still manages to create buzz. I've been told to go see Samsung's giant curved display in hall 1. The main difference from last year is that there's hardly a booth without a 4K display or two, most now at 60 fps, and more and more UIs, like the one on display at SoftAtHome's booth, are now native 4K.

OTT is still very present, even if it too has lost its novelty now that so many commercial deployments are out there. OTT ecosystem vendors are repositioning frantically as value erodes. Some, like Piksel, seem to be keeping their end-to-end positioning, while others, like Siemens with its Swipe service, are also bringing out specific components to sell as services. Enhanced ABR is also appearing, to help reduce Opex by finding tricks to use only as much bandwidth as is actually required. Everyone in the CDN crowd, for example Limelight, Anevia, Broadpeak or MediaMelon (who don't have a booth), has something to show in this area.
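As a rough illustration of the “only as much bandwidth as required” idea, here is a minimal sketch of an ABR selection rule that picks the highest rendition fitting the measured throughput, then caps it at the quality the device can actually display. The bitrate ladder and safety margin are invented for the example, not any vendor's actual algorithm.

```python
# Hypothetical bitrate ladder (kbps) - illustrative values only.
LADDER_KBPS = [400, 800, 1600, 3000, 6000]

def pick_rendition(throughput_kbps: float,
                   display_cap_kbps: float,
                   safety_margin: float = 0.8) -> int:
    """Classic ABR rule of thumb: stream the highest bitrate that fits
    within a safety margin of the measured throughput. The 'enhanced'
    twist is the display cap - never send 6 Mbps to a small screen that
    cannot show the difference, which is pure wasted bandwidth (and Opex).
    """
    budget = min(throughput_kbps * safety_margin, display_cap_kbps)
    candidates = [r for r in LADDER_KBPS if r <= budget]
    return candidates[-1] if candidates else LADDER_KBPS[0]

# A phone on a fast connection: throughput alone would allow 6000 kbps,
# but the small screen caps the useful rendition at 1600 kbps.
print(pick_rendition(throughput_kbps=8000, display_cap_kbps=1600))  # -> 1600
```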

IoT and the connected and/or smart home have been around for years at other shows, but have only just made it to IBC. Managing the home network is becoming more challenging for many reasons. The one that struck me most is that we are seeing a greater proportion of homes with 100M+ broadband connections but in-home effective throughputs down to just a few megabits, often not enough to stream over Wi-Fi. There were quite a few solutions at IBC, like AirTies' home Wi-Fi meshing.

Some trends, though, are clearly on the way out. I noted for example that it's already out of fashion to talk about embedded apps now that HTML5 is a no-brainer, and any mention of the SmartTV is positively 2013.

More soon, stay tuned...