
10 questions on UHD and how not to seem like a 4K Luddite

Despite being a bit of a geek, I feel I need to brush up my 4K basics to avoid seeming lame on such a hot topic. 2013 has already seen UHD/4K bloom at CES; now I need to be ready for IBC. So I had a long chat with Thierry Fautier, who happens to be pretty knowledgeable on this. Here’s the transcript of our talk, in case you too want to seem less lame on the subject.

1. What exactly are 4K & UHD?

4K stands for “4 thousand”, from the screen resolution of 4096 × 2160 pixels, and is the new standard defined by the movie industry. The frame rate is still 24 fps, and the bit depth remains 8 bits. UHD stands for Ultra High Definition and is supported by the broadcast and TV industries. It differs from 4K with its greater color depth of 10 or 12 bits (which is a huge dynamic color range increase). The aspect ratio is brought back to the TV’s 16:9 ratio, so it sports a 3840 × 2160 resolution. Frame rate is still a big debate. Some broadcasters argue that 120 fps is needed for football and that color depth should be 12 bits. The ITU specification gives a range of values, but the industry needs to rapidly agree and settle on some figures so that interoperability can be assured. Thierry’s company Harmonic believes that 4K with 10-bit color depth @ 60 fps is a "good time-to-market and cost compromise" for the ecosystem. The following diagram gives a sense of the quantity of data each screen size involves:

From Wikimedia (4K article)
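To put rough numbers on that diagram, here is a quick back-of-the-envelope sketch (mine, not Thierry's) of uncompressed data rates for the formats quoted above. It assumes full 4:4:4 sampling and no compression, so it only illustrates relative scale, not what anyone would actually transmit:

```python
# Back-of-the-envelope, uncompressed data rates for the formats discussed above.
# Figures are illustrative only: real chains use chroma subsampling and heavy
# compression, so delivered bitrates are far lower.

def raw_mbps(width, height, bits_per_sample, fps, samples_per_pixel=3):
    """Uncompressed video bandwidth in megabits per second (4:4:4 assumed)."""
    return width * height * bits_per_sample * samples_per_pixel * fps / 1e6

formats = {
    "1080p HD, 8-bit, 60 fps": (1920, 1080, 8, 60),
    "4K DCI, 8-bit, 24 fps":   (4096, 2160, 8, 24),
    "UHD, 10-bit, 60 fps":     (3840, 2160, 10, 60),
    "UHD, 12-bit, 120 fps":    (3840, 2160, 12, 120),
}

for name, args in formats.items():
    print(f"{name:26s} ~{raw_mbps(*args):8.0f} Mbit/s uncompressed")
```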

2. Will/should 4K and UHD merge?

No, the cinema workflow will stay separate, even if you'll always get movies on your TV. Broadcasters will not accept such low frame rates on their own productions.

3. What time frame do you see for adoption?

The 2016 Olympic games will see the beginning of mass adoption. So we need field trials throughout 2015. That in turn means products must be on the market some time in 2014, which implies that we have to sort out the specs in 2013. The Brazil football World Cup in 2014 will show a spike of interest, but it’s too soon for any real impact. By the end of 2014, however, there will be a range of TV sets and mass production can probably start in 2015. A recent Consumer Electronics Association study expects just 1M 4K screens in the US in 2015.

4. "HD ready" or 720p preceded "Full HD" or 1080p. Will we see a similar 2K or something here?

It seems that the UHD logo will be properly protected, so consumers should avoid confusion. Services like Netflix will target intermediate formats, and we will probably see an intermediate phase before UHD is launched, with 1080p50/60. Indeed, there is more and more content produced at 50/60 fps, and workflows can support this. Once UHD STBs that can easily decode 1080p50/60 are deployed, operators will be able to deliver an optimized HD quality that will look much better on a 4K screen than today's 1080i would.

5. Are there any short-term stumbling blocks for CPE?

The current HDMI 1.4 standard limits 4K to an unacceptable 30 fps. HDMI 2.0 is needed for 60 fps, and this will be the true kick-starter for 4K adoption. CES 2014 should see the consecration of HDMI 2.0.

6. Is the compression ratio linear (i.e. will UHD require exactly 8 times the bandwidth of HD)?

No. Today’s HD streams are compressed to 6 Mbps at constant bit rate. By the time it’s ready for mass adoption, UHD @ 60 fps with 10-bit color depth should require just under 20 Mbps.

7. Will UHD require HEVC or can it make sense to use H.264?

Without doubt, HEVC is required for UHD to make it economically viable on existing infrastructure.

8. Apple created the marketing term Retina display. What would be the UHD screen size to call it that?

Early testing shows that there is no benefit below a 65-inch screen. But we are framing the problem incorrectly. Try to watch HD on a 65-inch screen. You will see artifacts, so if you want a screen above 65 inches you need UHD!

9. In general, what's the new screen-size vs. optimal viewing distance?

I argue with my colleagues in the UHD community who dream of people sitting 1.5 m away from the screen. In reality I believe people will stay 3 m away, so again the key factor is large screen size. Very large screens will be THE key success factor for UHD adoption. The figure below shows screen size as a function of viewing distance for various resolutions. [Author's note: At its simplest it means that with a 50" screen you need to be 5 feet or 1.5 metres from the screen. For the more standard 10-foot or 3-metre viewing distance, to really feel the 4K effect in your gut you need an 85" screen.]
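For the curious, here is a rough sketch of the geometry behind such charts (my own back-of-the-envelope, not the chart author's method). It assumes 20/20 vision resolves about one arcminute per pixel and a 16:9 panel; published charts pick slightly different thresholds, which is why the exact figures vary:

```python
import math

ARCMIN = math.radians(1 / 60)   # ~1 arcminute, the usual 20/20 visual acuity figure

def screen_height_m(diagonal_inches, aspect=(16, 9)):
    """Height in metres of a screen with the given diagonal and aspect ratio."""
    w, h = aspect
    return diagonal_inches * 0.0254 * h / math.hypot(w, h)

def max_useful_distance_m(diagonal_inches, lines):
    """Distance beyond which a viewer with 20/20 vision can no longer resolve
    the individual pixels of a screen with `lines` vertical pixels."""
    pixel = screen_height_m(diagonal_inches) / lines
    return pixel / math.tan(ARCMIN)

for diag in (50, 65, 85):
    d1080 = max_useful_distance_m(diag, 1080)   # beyond this, 1080p already looks "perfect"
    d2160 = max_useful_distance_m(diag, 2160)   # inside this, even UHD pixels are visible
    print(f'{diag}" screen: UHD adds visible detail inside ~{d1080:.1f} m, '
          f'and is fully resolved at ~{d2160:.1f} m')
```

On these assumptions, an 85-inch panel stops gaining anything from UHD beyond roughly 3.4 m, which is in the same ballpark as the 85-inch / 3-metre figure in the note above.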
10. How will the upgrade from HD to UHD compare to the one we've been through from SD to HD, in terms of:

a) content production / post-production

This will be a hard transition for broadcasters this time because there are no connector specs yet. But the cinema industry has been digitally mastering in 4K for a while, so there are plenty of 4K movies ready for release.

b) content acquisition / preparation

This should be fine, as much acquisition is already in 4K.

c) encoding

Except for some early prototypes, 4K encoding is not yet available in real time, mainly due to lack of CPU power. IBC 2014 should have some products, but they might not yet be cost-effective.

d) transport / broadcast

I see no network issue for the satellite and cable guys; indeed several successful demos have already been done (like the Eutelsat demo still available on 10A). For Telcos, UHD will be dedicated to fiber delivery, and terrestrial will probably need to wait until around 2016-18 for DVB-T to be ready for 4K.

e) decoding

Broadcom chipsets able to decode 4K/10-bit/60 fps will be widely available by 2014, so the first mass-produced STBs will be ready by 2015.

f) content protection

This discussion has only just started. For now, Sony’s 4K content uses Marlin DRM, which is the only commercial service currently available.

g) pricing

The very first devices will probably carry a premium on encoders and STBs of a factor of around 3-4 on the price tag vs. HD, just like we saw in the early HD days vs. SD.

h) customer proposition

People aren't “dying for new screens” right now, but 4K could be a driver. The industry must convince consumers that much larger screens, where HD sucks, are a good thing. Otherwise 4K on a small screen isn't appealing enough. It's all about the large screen and being closer to it for a much more immersive sensation, without disturbing the brain the way 3D did (at least with glasses). Content and economic constraints will see 4K start life as a VOD experience, as audiences will be too narrow to justify broadcast. This is where Telcos and Cable MSOs come into play, and I’m looking forward to talking to some of them about this at IBC 2013.

Disclaimer: I have no ongoing commercial relation with Harmonic; I just had easier access to Thierry than to Envivio, Ateme, Ericsson, Elemental or any of the other reputable vendors in the space. And BTW I’m looking to do a similar debunking piece on HEVC, so ping me if you’d like to be my interviewee this time. BTW this 4K/UHD topic is one of the hot topics I identified for this year's IBC here.

Update (November 11)

Kudos to Elemental, who proudly announced the first real-time 4K transmission last week together with telco K-Opticom at 20 Mbps - I'm told 12 Mbps could have worked. It was for the Osaka marathon, perhaps not the most exacting of sports for TV, so the 30 frames per second limitation was probably not too much of an issue. Details are in their press release here. Harmonic is showing 4K decoded at 60 fps on a true CE device for the first time this week at Inter BEE, but although the target frame rate is there, it's still 8-bit color; Harmonic says the rest of the workflow isn't ready for 10-bit yet anyway... It seems the fps debate is closed, as even the Elemental people told me 60 is right for sports, but it looks like there's room for a future blog exploring the 8 vs. 10-bit color issue. Stay posted.


My pre @IBCShow 2013 hot topics

Here’s my take on what the key hot topics of IBC 2013 might be and the questions they raise for me.

Safe bets

Four topics are really way hotter than any others at the moment.

1.    4K/UHD

Will the cinema standard merge with the broadcasting one? Will there be an intermediary 2K, like we had “HD Ready” before “full HD”? [I tried to answer some of these questions with Thierry Fautier's help here]

2.    HEVC

Are we in for the same long wait as when H.264 was first supposed to come, or have things really accelerated? It used to take a decade to halve bandwidth requirements. Last year's UHD/4K demos required 35 to 40 Mbps; how long will it take to compress down to the promised 10 Mbps?

3.    OTT

Technology, ecosystems, devices

  • Is there a future for OTT STBs?
  • Will DASH finally be the ABR to standardize them all?
  • Has the interest in connected TVs peaked?

OTT Business & content disruption

  • What does Netflix or YouTube commissioning content mean to the industry?
  • Is the second screen becoming the TV? Is now the time for mass adoption of play-along apps?
  • Is cord cutting a temporary phenomenon or the beginning of the end?
  • Oh and I suppose Social TV fits in here, but I'm not expecting it to trend much in 2013.

4.    Big Data, privacy, customer intelligence or the new clothes of recommendation

Content recommendation platform vendors have been screaming into the wind for half a decade already. All of a sudden the industry is listening to their message, but not from them. The Big Data crowd have stolen the limelight. It's ever so hard to form an opinion when something is so very hyped, but it is common knowledge that most operators still have a long way to go to start benefiting from the gold mine of customer data they’re sitting on. Content recommendation is probably just the tip of the iceberg.

Outsiders that might get traction in 2013

New subject: Dongles

Despite set makers' fantasies, the connected TV still isn’t a reality in terms of usage. But with those millions of out-of-date screens out there, could HDMI dongles like Google’s latest offering finally make that change?

A ten-year-old story that may at last be true: the time is coming for IP. Another 4 points:

1.     The rebirth of IPTV

I used to write about the death of IPTV, so I got the timing wrong. Well, actually I may have gotten the whole story wrong. As OTT services seem to be more than a fleeting fancy, Telcos are realising that all that expensive multicast IP technology could actually make a difference. Maybe they won’t have to sue money out of the global players like Apple or Netflix, but will actually be able to cut deals with them in exchange for guaranteed last-mile delivery.

2.     Targeted advertising

Companies have come and gone on this subject. My take was that although the targeting tech sort of worked, there were never big enough segments to personalise to, so making an ad just cost too much. That may at last be changing with the scale available to some operators.

3.     Guaranteeing service, offloading, DPI, Net neutrality

Technology is now here to enable an operator to offload video streams from 4G to Wi-Fi, either because it's free YouTube stuff and the Wi-Fi is free, or on the contrary because it's part of a pay TV subscription that the Telco is getting a cut from and the Wi-Fi has no guaranteed quality.

4.     4G & Fiber

New high-speed networks really are finally here and accessible to significant segments of the market. This is not an IBC subject per se, but it is the fuel behind this whole IP set of trends.

See you in Amsterdam, and here or elsewhere to see how wrong I was ;o)


9 new trends to help my visit to the TV Connect 2013 show floor

Many of us whinged and whined about the name change from IPTV World Forum to IP&TV World Forum because the names were too difficult to tell apart (if you are still looking, it was the addition of the ‘&’ in IP&TV). By naming the event TV Connect, the organizers have now moved far enough away for the new name to stick. Now, though, it must be differentiated from the “Connected TV” events.

This year’s event is too big to simply attend. I put on my thinking hat to ponder where things are going over the next three years, in order to decide who and what to see at Olympia. The trends below are new impressions of things I’m just beginning to understand, not the obvious ones like drowning in content.

For each trend I’ve suggested, in this blue font, which exhibitor I’ll be looking to see at the show. Please add a comment if you think I’ve missed something important, which of course I have.

Trend 1: Moore’s law looks like slowing down at last

My 2-year-old gaming PC still plays all the latest games! Who’d have imagined Apple still selling the iPhone 4 from its website three years after its initial release?

At the same time, while the advantages of broadband up to a “good DSL” speed (i.e. from ~10 Mbps) seem obvious, many operators are struggling to sell “fibre” speeds (from ~100 Mbps and above) unless there’s no price increase.

Raw processing power is no longer enough in the TMT sector to reach the mass market beyond geeks & early adopters, and soon raw bandwidth won’t be enough either. Services must serve a deeper purpose. OK, how can that be done?

At TV Connect I'll be looking at how the numerous device makers (just for the letter A there are already Amino, Airties, ABox42, …) have improved the packaging and User eXperience of their products without necessarily changing all that much under the hood since last year.

Trend 2: Analytics everywhere

Big Data is a trendy topic currently at the height of its hype cycle, but it also represents a genuinely new approach. After over a decade of promises, the ability to ingest richer data and process it in near real time is finally here. At last, operators can focus on user experience rather than just connectivity.

I’ll try to scratch under the surface of the “Big Data” words I expect to see plastered onto many booths.

Trend 3: Colliding segments of QoE, UX and Security

The User eXperience (UX) domain has naturally become linked with Quality of Experience/Service and monitoring. So I’ll be looking at how the QoS/QoE/monitoring vendors are embracing overall User eXperience. I’ve written earlier about security companies as potential candidates for a stake in this new game, as they know exactly what is being watched by whom when it comes to premium content. In the age of abundance we have entered, a key challenge is content navigation, which also means UI design, search and recommendation.

I suppose VO and Nagra come to mind first as having merged much of this, but I'll also be checking, in no particular order: Witbe, Veveo, Verimatrix, Conax, Red Bee, Mariner, Jinni, Ineoquest, Genius Digital, Agama, …

Trend 4: CDNs going local and the Cloud coming to a TV near you

Other areas where there still seems to be some low-hanging fruit for improving the User eXperience include the distribution of heavy (HD) content across networks. All operators with a fixed-line network are racing to bring out their own CDNs.

Broadpeak seems to be the only CDN specialist at the show.

Some Cloud services like Dropbox or Network PVRs seem obvious. The jury is still out on others as the early disappointment of Connected TV has shown. OTT service delivery platforms (SDPs) will be another thing to look out for.

In the fog we’re all stumbling around in, I’ll try to see which of the one-stop-shops like Kit Digital, Siemens, Cisco, Ericsson or Nagra have the more powerful fog lights. Of course for a best-of-breed approach you’ll need to stop by at almost all the booths.

Trend 5: declining long-term value of Pay TV?

In the early 90s nothing worked better in the home than the fixed-line telephone. Twenty years later, the availability and reliability of basic telephone services, whether mobile or fixed, have significantly dropped. Subscribers have been happy to trade reliability and what we used to call quality for lower prices and mobility. A similar trend can be seen with pay TV services. Early “cord-cutters” are showing that trade-offs are possible here too. Subscribers will probably trade old-fashioned TV quality for better variety, lower prices and better content navigation.

To keep the value in TV, some operators will use bundling, or mash up TV-type services with social media and communication services.

The companies I’d talk to to get a handle on this would be those at the forefront of social media, like Accedo, or those already close to operators’ triple play, like SoftAtHome.

Trend 6: device wars growing fiercer

In what my friend Sebastian Becker calls a new rendition of “The Empire Strikes Back”, many European cablecos have launched powerful boxes that have little to envy from a PC’s spec sheet, for example Numericable's LaBox. At the same time, Google is still happily ploughing millions into various device-centric Google TV projects, Sony says the PS4 will revolutionize media in the living room, nobody understands what Microsoft is saying, new OTT devices still crop up in shops ranging from powerful all-in-one boxes to tiny USB or HDMI sticks, … and the list goes on.

So short term, should I need to advise any operators on device architecture, I’ll go for being agnostic.

To get some clarity on this I’d drop in to the OIPF booth to see how standards are helping.

Trend 7: SD → HD → 4K

I saw Sony’s 4K screen at IBC and am a true geek on this one. The 4K industry drive will succeed because it just feels so right in the gut, where 3D with spectacles in the living room never could. 4K or ultra-HD will start to impact on us within three years.

I’ll be keen to see who at the show is already on the ball with 4K, although it’ll be harder to get the timing right on this than just being the first, too-early mover.

Trend 8: Capex can really shrink at last

I have written over a dozen business cases for TV rollouts around the world, and if you’re small, the killer Capex item is the head-end, but if you’re large, it’s the STB.

For the former, new centralized digital “headend in the sky” services offer substantial Capex savings. You just send files to be encoded, streamed, or whatever your head-end requirements are.

As for the larger operators, the STB can still be a killer cost, as are fancy devices like the LGI Horizon box. People are actually happy, though, to spend hundreds of dollars on devices that are even more powerful than any STB. Once the empire has finished striking back, I sense a trend towards an overall lowering of STB costs.

I’ll drop by the usual suspects here for an update on head ends (Elemental, Envivio, Harmonic, Ateme, etc.) but also try to understand where Avail TVN is at.

Trend 9: Hello TV, Goodbye TV

If the 8 previous trends have a dose of gut feeling, this one - pure conjecture - feels right. I have come to realize that many of us work in the market sector we call TMT. Before I looked it up, I assumed that one of the T’s was for TV. Maybe I’m spending Too Much Time on this, but the acronym actually stands for Technology, Media and Telecoms, no TV anywhere.

So could TV have been just a passing thing? Before IPTV there wasn’t any TV on IP networks, and now in the age of multiscreen galore and OTT, is “TV” already being pushed back out of IP networks in favour of just “video”? Maybe one day there’ll just be Sports, News and Video left so beyond the three year time-frame of this blog we can all come back to the 2017 event which will be rebranded the SNV World Forum.


WolfPack will soon be online

Click here to see our new website

The WolfPack is here, fully focussed on our first clients. www.wolfpackcoms.com will take you somewhere else, which we hope will be exciting. In the meantime we believe no website is better than a lame one, so we'll stay hosted by CTOiC.net until then. Stay tuned. You can see who we are below:

 


The Big Data emperor will need Big Change within companies, that is if he has any clothes on. Summit report Part III

There was a great turnout at TM Forum’s inaugural event on Big Data in January. It was small enough to enable proper networking, but the packed room made it feel like something more than just hype or buzz is happening around Big Data.

Some of the clear benefits Big Data brings at once

A key benefit eBay has gotten out of Big Data analytics since starting with Hadoop in 2010 is greater flexibility. An example of what they can do better now is working out how much to bid on specific keywords like “iPad”, because the decision often has to be made in near real time. Big Data helps eBay manage the key differences in word meaning from market to market.

Bell Canada was one of the more upbeat operators on Big Data. James Gazzola made a case for Advertising Forensics, where the operator could use analytics to determine which ads are actually watched. Bell hopes that these insights, once mastered, could be monetized. Gazzola went on to point out that as Bell Canada serves 66% of Canadians, analytics could show what's happening in all of Canada. That sent a slight shiver down my back as I wondered if the journey from network planning to user analytics actually terminated at a station called Big Brother, but oops, this is the part on benefits. So back to more down-to-earth issues: Gazzola told the audience that voice traffic used to be relatively predictable, but that data traffic driven by smartphones is anything but. Big Data is what Bell is looking at to help plan future network capacity.

Google’s presentation was disappointing. I don’t really blame the Google speakers, because expectations are always unrealistically high: there’s so much we crave to know about Google. Matt McNeil, from Google’s Enterprise division, was asked if they have any big Telco clients for Big Data yet. His wooden answer that "we're talking to several" showed the limits of the company’s transparency policy. But during his sales pitch, Matt got quite excited explaining that “it'll cost you $5M to build the capacity Google charges just $500 per hour for”, on a Hadoop-powered Big Data analysis platform. When McNeil showed off http://bigquery.cloud.google.com, the exciting fact that “Led Zeppelin” is more controversial than “Hitler” got me a bit concerned that maybe all this Big Data stuff really was hype after all. I suppose we need practice finding more telling examples because, as Matt said himself, “this year will be a trough of disillusionment for Big Data”.

Big Data is about Big change

Peter Crayfourd, who recently left Orange, pointed out that becoming truly customer-centric can be scary. Such an approach may uncover many unhappy customers. But becoming truly customer-centric will take at least 5 years. All speakers at the Big Data conference seemed to agree that user-centric KPIs based on averages are to be shunned, because users are so very unique! That sounds fine in theory, but CFOs are going to need to stay up late working out how to live without the concept of ARPU.

The eye-opening presentation from Belgacom's Emmanuel Van Tomme stayed on the customer-centricity theme but made the clearest case so far that change management is key to Big Data implementation. Emmanuel was the first Telco guy I’ve heard talk about a great new metric called VAP, or Very Annoyed People. They can now be identified with Big Data analytics.

Many speakers converged on the theme of "change management" as THE key challenge for Big Data going forward. The general message was that while Hadoop is ready to deliver, people are not yet, and their organizations even less so.

Thinking of the bigger Telcos conjures up an image of oil tankers trying to steer away from network metrics towards usage metrics. Looking solely at the agility dimension, I couldn’t help wondering if they can survive speedboats like Amazon or Google.

As the conference was wrapping up I gleaned an interesting metric: subscribers are 51% more willing to share their data if they have control over whether or not to share it in the first place! It’s one of those Doh!-I-knew-that statistics you feel you should have come up with, but didn’t.

Earlier it had been pointed out during one of the panel sessions that to make Big Data work for Telcos, subscribers must entrust ALL their data to the operator. For them to agree to this, the outbound sales & marketing link must be cut. It’s probably wiser to have one unhappy head of sales than many unhappy customers.

But things aren’t so simple

The limitations of KPIs

Peter Crayfourd illustrated the danger of KPIs with the voice-continuity metric. In his example it was 96% when calculated over 15 minutes, so if that’s what you're tracking, all is hunky-dory. But in the same network environment, when the same metric is calculated over 45 days, the result is usually 0%. Crayfourd went on to explain how averages can be dangerous within an organization: someone with their head in the oven and feet in the freezer has a good average temperature! Matt Olson from US operator CenturyLink pointed out that in the User eXperience (UX) domain simple maths just doesn't work: UX isn't the sum of the parts but some more complex function thereof.
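To make the 15-minute vs. 45-day point concrete, here is a toy calculation of my own; it assumes drops in different 15-minute windows are independent, which real networks obviously violate, but the order of magnitude is what matters:

```python
# Toy illustration of the windowed-KPI trap: a metric that looks healthy per
# 15-minute window can collapse to ~0% when measured over 45 days.

fifteen_min_continuity = 0.96        # 96% of 15-minute windows have no voice drop
windows_per_45_days = 45 * 24 * 4    # number of 15-minute windows in 45 days

# Probability that all 4,320 windows are drop-free (independence assumed).
forty_five_day_continuity = fifteen_min_continuity ** windows_per_45_days
print(f"Chance of 45 days with no drop at all: {forty_five_day_continuity:.2e}")
# -> effectively 0%, even though every 15-minute report says 96%
```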

Listening to the UX-focussed presentations, one got the feeling that the Big Data story might just be a pretext for some new guys to come and pull the rug from under the feet of the QoE market stakeholders. They’ve been saying this for almost a decade … Big Data is a means, not an end.

Cost of Big Data & Hadoop

For eBay, Hadoop may be cheaper to set up, but it’s so much less efficient to run than structured data that the TCO currently seems the same as with other enterprise solutions.

Google, eBay and even Microsoft made compelling presentations about the nuts and bolts of Big Data and then tried to sell their capabilities to the service providers in the room. TM Forum could have been a bit more ambitious and tried to get more head-on strategic discussions going on how the big pure-play OTT giants are actually eating Telcos' and other service providers' lunch. Maybe a lively debate to set up in Cannes?

Does the Emperor have any clothes on?

UK hosting company MEMSET's Kate Craig-Wood isn’t sure at all. Kate said that Big Data techniques are only needed in a very few cases where many hundreds of terabytes are involved and near real-time results are required. She does concede that the concepts born from Big Data are useful for all.

MEMSET’s co-founder went on to explain how a simple open-source SQL-based DBMS called SQLite successfully delivered interesting analysis on hundreds of billions of data points where MySQL had fallen over. She had to simplify and reorganize the data, and importing it took 4 days, but once that was done she got her query answered in minutes. Ms Craig-Wood went on to say that the SQL community is working flat out to solve scalability issues, going as far as saying "I'd put my money on SQL evolving to solve most of the Big Data problems". There's so much SQL expertise out there!
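Out of curiosity, here is a minimal sketch of the pattern Kate described, not MEMSET's actual pipeline: simplify the records up front, bulk-load them into SQLite in large transactions, index once after the import, then let plain SQL aggregates do the analysis. The table layout and toy data are invented for illustration:

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")   # use a file path for real data sets
conn.execute("CREATE TABLE samples (ts INTEGER, node_id INTEGER, value REAL)")

def bulk_load(rows, batch_size=100_000):
    """Insert (ts, node_id, value) tuples in large transactions for speed."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", batch)
            conn.commit()
            batch.clear()
    if batch:
        conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", batch)
        conn.commit()

# Toy data stands in for the hundreds of billions of real points.
bulk_load((t, random.randint(1, 50), random.random()) for t in range(1_000_000))

# Index once after loading, then ordinary SQL aggregates do the analysis.
conn.execute("CREATE INDEX idx_node ON samples(node_id)")
worst = conn.execute(
    "SELECT node_id, AVG(value) AS avg_v FROM samples "
    "GROUP BY node_id ORDER BY avg_v LIMIT 5").fetchall()
print(worst)
```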

Perhaps the most controversial part of this refreshing Big Data debunking session from Kate Craig-Wood of MEMSET was when she said “I don't believe in data scientists, most DevOps will do fine, and Hadoop isn't that complex anyway”. She has a point: we're at the pinnacle of the hype cycle.

Caution

Less extreme, but still on the side of caution, were the sensible questions from Telefonica, which is experimenting with Big Data. The Spanish operator is still cautious about the “high entrance cost” and the uncertain final price tag or TCO. So far the Telco has built both a centralized cloud instance of its platform and separate instances for each of its operating companies in different markets. Telefonica’s Daniel Rodriguez Sierra gave an amusing definition of Big Data as simply those queries we can't handle with current technology.

Verizon Wireless also reaffirmed the need for caution, pointing out that to implement Big Data and reap any benefit thereof you need an agile, trial-and-error approach. That’s a tall order for any incumbent Telco. The US mobile operator admitted that it was being wooed by the likes of Google, Amazon and eBay, which would all love to sell their analytics capability to Verizon. But staunch resistance is the party line, as Verizon Wireless has the scale (and pockets) to decide that the core data is too strategic to be outsourced. In terms of scale, Verizon Wireless has 100M subs and 57K towers that generate a petabyte of data, or 1.25 trillion objects, per day, currently crunched with 10K CPUs. Verizon’s Ben Parker was pleasantly open, saying that an "army of lawyers is happily supplied with plenty of privacy work now we're capturing info on all data packets".
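For a sense of scale, a quick back-of-the-envelope pass over the figures quoted above (purely illustrative arithmetic on the stated numbers, nothing from Verizon beyond them):

```python
# Rough arithmetic on the quoted Verizon Wireless figures: 1 PB / 1.25 trillion
# objects per day, 10K CPUs, 100M subscribers. Illustrative only.

objects_per_day = 1.25e12
bytes_per_day   = 1e15            # one petabyte
cpus            = 10_000
subscribers     = 100e6
seconds_per_day = 86_400

print(f"~{bytes_per_day / objects_per_day:.0f} bytes per object")
print(f"~{objects_per_day / (cpus * seconds_per_day):,.0f} objects per CPU per second")
print(f"~{objects_per_day / subscribers:,.0f} objects per subscriber per day")
```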

Governance was mentioned too frequently during several presentations not to raise an alarm bell in my mind. It seems that those who’ve actually got their hands dirty with Big Data are finding themselves embarked on projects that are difficult to control.

In the end

I was really impressed by the commitment operators are making to Big Data on the one hand, while on the other they clearly express reservations, or at least warn that we’re just at the beginning of what’s going to be a long journey.

For further reading here are three other write-ups of the event that I commend:

There’s a mini video interview of Peter Crayfourd here: http://vimeo.com/58535980

Part I of this report (interview of TM Forum's Strategy Officer) is here.

Part II, a discussion with Guavus and Esri, is here.


Part II of The Big Data Summit organized by TM Forum in Amsterdam: the demos.

In this short blog I’m just reporting on the demos I saw at the show in January. Part III will be on the conference content itself, which was very interesting. For a first of its kind, having 4 exhibitors was a reasonable achievement. I didn’t get to talk to the folks at Amdocs, whose booth only had brochures, nor with the Lavastorm guys, as they were too busy for me both times I tried. I did get to the Guavus and Esri booths, so here’s what I took away.

Guavus was the sole sponsor of the event – although still only a silver sponsor, so the TM Forum sales people must be tough cookies ;o)

Guavus Logo

Guavus is a private, 350-person company headquartered in San Mateo, CA, with offices throughout the US as well as in the UK, Singapore, Montreal and India, where they also have R&D teams. Like a few others, they claim to have been delivering Big Data analytics since 2006, before the name even existed.

It’s always a delicate balancing act to ride a hype wave like this Big Data Tsunami. You need to be seen to have been doing it for ages, but then again you also have to acknowledge its novelty otherwise you can’t join in on the orgy of industry news.

The CEO, Anukool Lakhina, founded the company after working at Sprint Labs. He realized there was a scalability hurdle that the traditional model for storing data and doing business intelligence analytics was not going to be able to cross. He raised some money and started working on a solution. The core algorithms developed then are currently patent pending.

Guavus now works with 2 of the big US Telcos, as well as Bell Canada through the recent acquisition of Neuralitic. StarHub in Singapore is also a major client that came about through the Neuralitic acquisition.

The company’s primary focus is the Telco space, because that's where the core data resides. But as an aggressive young company, Guavus is already looking at other segments and has a few confidential proofs of concept underway.

I asked Suzanne McCormac, Senior Director of Marketing Communications, if Big Data could save Telcos from falling into the commodity oblivion of the dumb pipe. “They’re sitting on a gold mine  - if they can just figure it out they have the opportunity to compete with the OTT players because they have better data from billing, CRM etc. There is a fantastic window of opportunity for them here”.

I asked Suzanne why Guavus, a US-focussed company, came all the way out to Amsterdam for the show: “The TMF Big Data Summit in Amsterdam is a key event for Guavus given the company's global expansion plans. We expect to announce several more CSP deployments outside of the US in 2013."

Despite being one of only four exhibitors, Guavus had no demo, but I’m told they’ll have a lot to show at this year's Mobile World Congress in Barcelona.

Esri Logo

The only real demo I saw at the show was from Esri, a Geographical Information System (GIS) company with a strong emphasis on being environmentally friendly and sustainable. That they clearly are, as Esri has been around since 1969. The company is, atypically, still privately held. Headquarters are in Redlands, California. Randy Frantz, Esri’s Telecoms & LBS Industry Manager, was at the show and told me Esri is now the world’s largest GIS software supplier, with over 3,000 employees and 350,000 clients (his business card uses the LBS acronym without explaining it, so if you’re as forgetful as I am, let me remind you it stands for Location Based Services).

The demos were all graphical analyses of various data points that had a geographical component. Randy showed me several instances of dynamic charting where all sorts of graphs and colours automatically updated on the screen as you moved around navigating through the data. So, for example, clicking on one part of the network automatically updated the QoS/QoE data around the screen. One demo also integrated Esri’s display capabilities with IBM’s Business Intelligence software, Cognos.

I got a clear impression that having Esri as part of a solution, say in an operator’s NOC, would make for an extremely powerful UI. Although Esri can undoubtedly power great monitoring interfaces, the competitive edge I sensed was more for troubleshooting types of applications, where interactivity is key. Such a top-of-the-range solution, as pointed to by the Big Data demos I saw, clearly targets top-tier operators that could justify the cost.

If you missed it, part I of this series is an interview of Nik Willetts, TM Forum's Chief Strategy Officer. It's here.

Part III is a report on the conference content, it's here.