Monday, 15 December 2014

Good news, bad news and wrong news on IT spending

I've got some good news for all you cloud service providers, security technology companies, storage suppliers, big data companies and those of you in the mobile industry: 2015 is going to be a bumper year.

"IT decision makers’ spending on security technologies will increase 46 percent in 2015, with cloud computing increasing 42 percent and business analytics investments up 38 percent. Enterprise investments in storage will increase 36 percent, and for wireless and mobile, 35 percent."

That, at least, is what was said in this story on the web site of Forbes Magazine, a publishing house that claims "iconic status in the lexicon of American media."

That's the good news. The bad news is that the rest of IT is going to have a very tough time in 2015. The Forbes article also reported: "The average IT budget will increase by 4.3 percent in 2015."

As its source for this astonishing information, the article cites research undertaken by IDG, publisher of Computerworld and numerous other IT magazines.

The truth is that this is a load of baloney. The author of the Forbes article misinterpreted IDG's report of its research. What IDG found was that 46 percent of respondents said spending on security would increase in 2015; it did not ask by what percentage. Similarly, 42 percent said spending on cloud would increase, and so on.
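
To see how easy the slip is, here's a minimal sketch in Python - with made-up numbers, not IDG's data - contrasting the statistic IDG reported (the share of respondents expecting an increase) with the kind of figure Forbes printed (the size of an increase):

```python
# Toy survey data, purely illustrative - these are NOT IDG's responses.
# Each respondent says whether security spending will rise, and by how
# much they expect their overall IT budget to change (percent).
responses = [
    {"security_will_increase": True,  "budget_change_pct": 6.0},
    {"security_will_increase": True,  "budget_change_pct": 3.5},
    {"security_will_increase": False, "budget_change_pct": 4.0},
    {"security_will_increase": True,  "budget_change_pct": 2.0},
    {"security_will_increase": False, "budget_change_pct": 6.0},
]

# What IDG reported: the SHARE of respondents expecting an increase.
share_up = sum(r["security_will_increase"] for r in responses) / len(responses) * 100

# What IDG also reported, separately: the average change in the TOTAL IT budget.
avg_budget_change = sum(r["budget_change_pct"] for r in responses) / len(responses)

print(f"{share_up:.0f}% of respondents expect security spending to rise")
print(f"Average IT budget change: {avg_budget_change:.1f}%")
# Reading the first number as "security spending will rise by 60 percent"
# is exactly the mistake the Forbes article made with IDG's 46 percent.
```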

Trouble is, that error has been widely spread by numerous others who have picked up and quoted or referenced the Forbes report unquestioningly. I searched on the exact phrase of the Forbes article headline "Computerworld's 2015 Forecast Predicts Security, Cloud Computing And Analytics Will Lead IT Spending" and got 50 hits.

What's really astonishing is the lack of response from both Forbes and IDG to this error. On its web site Forbes offers an email address through which to submit corrections to its published articles. I did so on 4 December. No response; as of 15 December the article is still there, unchanged.

I got a half-hearted response from IDG that acknowledged only one error, but IDG does not seem to be sufficiently concerned about the misrepresentation of its research to pursue Forbes and ensure that the error is corrected.

An IDG PR person responded to my email by saying: "I have reached out to Louis [Columbus, the author of the Forbes article] as that point of data was misinterpreted. Forty six percent of respondents anticipate their security budget will increase in the next year, but we did not collect by what percent."



Friday, 14 November 2014

All quiet on the Optus OSS front

It was, I am sure, no coincidence that NEC announced a major OSS deal with Optus on the day that Optus announced its Q2 results, as part of parent SingTel's Q2 results announcement.

Optus is trying to play down the importance of a deal that will have far-reaching effects on the company for years to come and, if successful, will greatly improve Optus' agility and competitiveness. And if not successful...

NEC subsidiary NetCracker is providing a complete operational support system upgrade to Optus. NetCracker products will support all Optus services - consumer and business - across the entire service fulfilment chain from service order management to network configuration and activation.

The project will take three years to complete and is believed to be one of the biggest of its kind being undertaken by a major telco anywhere in the world.

Optus is saying nothing. Rather than crow about what the deal could mean in terms of efficiency, improved customer service, etc, it is keeping its head down. Not that its low profile will make any impact where the news really matters - with Optus' major competitors, who will be monitoring the project as closely as they can for signs of its impact on Optus' market activities, and for any signs of hiccups.

Gartner has just released its Magic Quadrant for OSS. It has NetCracker as the highest ranked player in the leaders quadrant, marginally ahead of Amdocs and Ericsson. Oracle and IBM are also in the leaders quadrant.

More important, though, here is what Gartner has to say about OSS in general: "OSS data is of strategic importance to measure the impact on specific operational, customer and business goals. OSS helps to improve operational efficiency, drive down costs and improve customer experience. ... [OSS data is used] to link operational technical planning with actual customer data, and hence is used by lines of business, marketing, strategic planning, etc."

And like almost every other area of IT, OSS is struggling to keep up with disruption. "The evolution of digital services and of convergent services poses complex challenges for OSS. Challenges include machine to machine (M2M), value-added services (VAS) and B2B (enterprise), as well as the need to provide more competitive personalized services (many of which consist of third party content and components)," Gartner says.

It's believed to be 10 years since Optus undertook a major OSS upgrade, so you can be pretty sure its existing systems are under stress.

And I have not mentioned one of the biggest disruptions to telcos that is approaching at great speed: the combination of software defined networking and network functions virtualisation.

As Gartner says: "The arrival of NFV and SDN will be disruptive to [telcos'] existing OSS architectures, potentially requiring an upgrade or a new generation of OSSs. Network-facing OSSs — such as provisioning, fault and event, and performance management — must be re-architected to support network and service orchestration functions for hybrid infrastructures."

In short, it would be no exaggeration to say that the future of Optus depends on the success of this project.


Wednesday, 12 November 2014

In search of the Internet of Everything

The Guardian's web site this week carried a lengthy article that sought to arrive at a clear definition of the 'Internet of Things'. While it explored the topic, it did not do much to clarify the definition.

"You could be forgiven for believing that the Internet of things (IoT) is a well-defined term and that everyone is on the same page," it said. "But you would be mistaken to say the least, given the huge variety of intelligent connected devices that this term refers to."

That doesn't make much sense: the term surely is meant to be all-embracing, to include every kind of connected 'thing'. The Guardian then confused the issue even further by equating the term to the Internet of Everything (IoE). "In fact, the thing about the IoT is that it could mean almost anything. In some ways it is better to think of it as the Internet of everything."

I'm not sure whether it was Cisco that coined the term Internet of Everything, but Cisco has largely been responsible for it gaining currency. (According to Wikipedia, Cisco launched its first global re-branding in six years in 2013 with its 'Tomorrow starts here' and 'Internet of Everything' advertising campaigns). In so doing Cisco managed to muddy the waters as to the distinction between IoT and IoE.

And if you look at some of Cisco's IoE statements, there really seems to be little distinction. Cisco defines the Internet of Everything as "The bringing together of people, process, data and things to make networked connections more relevant and valuable than ever before, turning information into actions that create new capabilities, richer experiences, and unprecedented economic opportunity for businesses, individuals, and countries."

That does go beyond simply a network of connected things, but I don't think it makes the distinction clear enough. It's expressed much better in this blog from Cisco's chief futurist, Dave Evans, in the Huffington Post, devoted to elucidating the distinctions between IoT and IoE.

He reveals that IoT now has an 'official' definition from one of the highest authorities on the English language, the Oxford Dictionary: "a proposed development of the Internet in which everyday objects have network connectivity, allowing them to send and receive data." Good try, and it brings in another dimension: changes to the Internet itself, rather than simply connected devices that exploit it.

It is almost certain that changes to the Internet will be needed to cope with the billions of things expected to be connected to it. The IEEE has set up a new working group, P2413, on the Internet of Things, and I quoted here one of its members saying that new Internet standards were urgently needed because, "with 50 billion connected devices by 2020, the Internet as we know it today is not ready for that."
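
Addressing alone illustrates the problem. A back-of-envelope calculation in Python - IPv6 adoption being just one of the changes involved, and the 50 billion figure being the forecast quoted above, not my own:

```python
# Back-of-envelope arithmetic: IPv4 cannot even give each forecast
# device a unique public address - one reason the argument goes that
# the Internet as we know it today is not ready.
ipv4_addresses = 2 ** 32       # about 4.3 billion unique addresses
ipv6_addresses = 2 ** 128      # about 3.4e38 unique addresses
forecast_devices = 50e9        # the 2020 forecast quoted above

print(f"IPv4 address space: {ipv4_addresses:,}")
print(f"Forecast connected devices: {forecast_devices:,.0f}")
print(f"IPv4 shortfall: {forecast_devices - ipv4_addresses:,.0f}")
print(f"IPv6 address space: {ipv6_addresses:.2e}")
```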

Changes to the Internet aside, Evans argues that IoT is just one of the four dimensions - people, process, data and things - that constitute IoE. "If we take a closer look at each of these dimensions, and how they work together, we'll begin to see the transformative value of IoE," he says.

His blog is well worth a read, but he nicely sums up the distinction in his concluding paragraph. IoE is not about those four dimensions in isolation from each other. "Each amplifies the capabilities of the other three. It is in the intersection of all of these elements that the true power of IoE is realised."

In other words, big data (aka data analytics), needed to make sense of the masses of data that the 'things' will generate, will be just as much a part of IoE as the devices that produce the data and the networks that interconnect them, as will the applications and innovations that emerge to exploit billions of connected devices of all kinds.





Tuesday, 28 October 2014

Cloud is coming, faster than you might think

IDC has been banging on about its 'third platform' for several years now, but maybe it takes one company's real-world experience to bring home the importance of this shift and the rate at which it is happening.

IDC believes IT is moving rapidly to a paradigm based on cloud computing, mobile, big data and social, and that this shift will be highly disruptive. It recently set out its views in IDC Predictions 2014: Battles for Dominance — and Survival — on the 3rd Platform, which you can buy from IDC for the princely sum of $5, or get for free here, courtesy of SAP.

CIO magazine gave it a rave review. "While understated, its analysis and predictions provide as much drama as any novel, and its denouement is the kind of cliffhanger that makes it, as the saying goes, unputdownable. Simply stated: You must read this report and think about what it means for your company's future. This is true whether you're an IT user, a vendor or, for that matter, a company that thinks of itself as in another business altogether."

So, back to that real-world experience. Here's how fast one software company - call centre systems vendor Interactive Intelligence - has seen its sales shift from the on-premises version of its software to the cloud version.

In mid-2011 I asked Brendan Maree, the head of Interactive Intelligence for Australia and New Zealand, how fast he thought sales would shift from the on-premises version of the company's software to the cloud version. He predicted that up to 50 percent of revenue could come from the cloud version within three years.

He was right, and he was wrong. Last year, globally, 50 percent of sales were for the cloud version, according to founder and CEO, Don Brown, but in ANZ the figure was 87 percent.

Maree says that the shift to cloud has been rapid and almost complete: "This is our fourth year into the cloud business in Australia and New Zealand. We introduced a cloud offering in 2009. In the first year five percent of orders were for the cloud version. The next year it was 11 percent of revenue. It moved to 26 percent the following year and then jumped to 87 percent last year. This year we have done only two deals for on-premises systems, both quite small."

He added: "There was a time last year when cloud created a lot of work for us. The sales engineers had to do two designs because prospective customers weren't sure about cloud. But all the tenders now, and all the discussions, are about cloud. It's like people have forgotten about premises based systems."

Globally, Brown said, revenues from cloud had gone from five percent four years ago to 60 percent this year, and projections for next year were 70 percent.

There was no suggestion from either Maree or Brown that the company has been aggressively promoting the cloud version at the expense of the on-premises version. Rather, it has simply responded to market demand.

What this rapid transition illustrates is the need for enterprises to be alert to these paradigm shifts and adapt accordingly. Brown related two stories of discussions with potential customers - one a low-margin fashion retailer and the other a large, long-established insurance company - that do not appear to have realised what is happening.

The fashion retailer had "run its previous [premises based] contact centre into the ground; amortised it to dust" and was quite happy to do the same with a replacement system. The insurance company "almost threw me out of the room when I mentioned cloud," Brown said. "It was like I was insulting the operational capacity of their data centre."


There may well be good reasons for a company to opt for a premises-based solution rather than a cloud-based one, but it's clearly not a decision that should be taken because that was how it was done in the past, or in ignorance of a transition that is turning the world upside down at great speed.

Friday, 24 October 2014

Here comes the Quantified Selfstra

The launch of Telstra's eHealth initiative this week was a curious affair. Most of the focus was on ReadyCare, Telstra's joint venture with Swiss company Medgate set up to provide general practitioner services over the phone. It's at least six months from becoming reality.

In contrast, no mention was made of the close to 20 ehealth offerings detailed in a glossy brochure handed out at the event, all of which, it seems, are services that are already in operation. So quite likely most of what you've read in the news about Telstra Health represents only the tip of the iceberg. You can find most of these offerings on this web page.

When I asked about these, Telstra told me that the providing entities were a mixture of companies acquired by Telstra, companies in which it has taken equity and those whose products and services it has licensed.

So there is certainly more, much more, to Telstra Health than Telstra has talked about so far, and some of it likely centres around how Telstra can leverage the personal monitoring devices that will, inevitably, be a major component of many ehealth initiatives.

According to Frost & Sullivan, "wearable technology has gained considerable traction especially in the health and wellness industry." That's stating the obvious, but the observation was made in a press release announcing an F&S report, 'Sensor Technology Innovations Enabling Quantified-Self', in which F&S said: "The market for quantified-self technologies – apps that enable people to track and quantify aspects of their daily lives – is currently in the embryonic stage. However, explosive growth is expected in coming years."

It added: "As healthcare is one of the main industries impacted by the quantified-self movement, acquiring accurate data and ensuring seamless interoperability are key challenges. In addition, data sharing among health services and pharmaceutical firms raises privacy concerns. Healthcare companies must ensure that data collected from clients is not shared without direct consent."

Assured privacy and security will have to underpin everything Telstra does in eHealth, or else the initiative will be dead in the water. As a company that clearly intends to be active in all areas of eHealth, Telstra will also need interoperability to feature strongly in its offerings. In short, Telstra is well placed to take a lead position in the quantified self market.

Similarly, F&S observes: "To get the healthcare industry further involved in quantified-self, enhancing the connectivity of wearable devices with technology companies to support data exchange will also be crucial." That's another function Telstra Health will be well-placed to fulfil.

But just what is the quantified self? According to F&S: "Quantified-self facilitates the tracking of diet, sleep, heart rate, activity, exercise and moods and allows individuals to gain better insights on physiological parameters that were never examined earlier.”
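
In practice that means reducing streams of raw sensor readings into summaries an individual can act on. A minimal sketch, with invented field names and readings:

```python
from statistics import mean

# Hypothetical wearable output for one day - the fields and values are
# invented for illustration, not taken from any particular device.
heart_rate_samples = [62, 58, 71, 96, 143, 130, 88, 64]  # beats per minute
sleep_hours = 6.4
steps = 9432

# The 'quantified' part: raw samples become a handful of numbers a
# person can track and compare from day to day.
daily_summary = {
    "resting_hr": min(heart_rate_samples),
    "average_hr": round(mean(heart_rate_samples), 1),
    "peak_hr": max(heart_rate_samples),
    "sleep_hours": sleep_hours,
    "steps": steps,
}
print(daily_summary)
```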

The movement, and the term, was created by two editors from Wired Magazine, Gary Wolf and Kevin Kelly. You’ll find a detailed explanation of it in this 2009 article by Wolf.


There is already a global movement for self monitoring and quantified self (http://www.quantifiedself.org). It has over 100 Meetup groups around the world and one each in Sydney and Melbourne with over 400 members between them.

Tuesday, 21 October 2014

Telstra's store revamp - in search of consistency

When Telstra CEO, David Thodey, trumpeted the innovations in customer service embedded in the company's new $112m flagship retail outlet in the old Darrell Lea chocolate shop at 396 George Street, Sydney, he talked about giving customers the ability to start an interaction in one store, review the history on their own PC, tablet etc and, eventually, when the retail refresh has been fully rolled out, pick up the conversation in another store.

Trouble is, although many of those stores might look, and to some extent feel, like Telstra stores, they aren't. ASX-listed Vita Group alone operates 109 Telstra-branded retail stores and Telstra Business Centres around Australia.

I asked Thodey whether the same seamless experience would be available across these licensed stores - because of course customers are blithely unaware of ownership distinctions. Not initially, he said, but this was something Telstra was working on.

Consistency of customer experience in retail environments is certainly something that has been highlighted as important, and its importance is reflected in the long history of love-hate relationships between Australian mobile operators and the owned-versus-licensed retail outlet model.

Last year KPMG published the results of a global mystery shopping exercise for prepaid SIMs. It wasn't interested in prepaid SIMs per se, but they were a convenient and almost universal product that could be used to contrast and compare the 'customer experience' across different channels - online, in store, etc - different outlets - owned v licensed v independent - and different countries.

Consistency loomed large. "In the early days of prepaid, operators are often happy to have their products sold from all retail channels: licensees, non-exclusive, supermarkets, convenience stores, etc," KPMG said. "The focus has now changed to the extent that some operators are buying back the franchise and licensee stores so they can have more control over the customer experience. ... It is vitally important that operators deliver a consistent experience across all retail outlets regardless of ownership."

KPMG went on to note that "Two leading Australian telcos in April 2013 both moved to end long-standing agreements with sub distributors of their prepaid offerings to allow for more control of the channel."

Optus last year shed deals with AllPhones and TeleChoice, cutting its retail outlets by over 300 stores, and announced plans to open another 30 of its own stores. Optus MD of sales, Rohan Ganeson, was reported as saying: “We believe [that] investing more in our stores, investing more online and investing into our people will deliver [the] results we need them to, but also deliver a greater experience for our customers overall. We think the renewed focus will give a cleaner, more engaging…customer experience — much more enjoyable — rather than going in and having a phone flogged to you.”

Vodafone has had a view of retail that has varied with its ownership. Back in 2003 it jettisoned the last of its retail stores announcing a deal to outsource the operation of its 58 shops to privately held phone retailers Digicall Australia and First Mobile. Then in 2009, following the merger with Hutchison, the new company announced that it would bring all its 208 Vodafone-branded retail outlets in-house.

VHA said: “We do see it as a very important strategic shift which will allow us to own and operate our national Vodafone-branded retail channel. We also see it as a very shrewd move competitively because it will better position VHA with a more cohesive and consistent Vodafone customer experience.”

Of course what these stores sell has changed dramatically over the past decade. And it was clear from the Telstra store that it's not just about mobiles. It's about smart homes, wearable devices, Internet-connected domestic appliances and more. The store even features a wall-sized video screen portraying products and technologies that are, to varying degrees, 'futureware'.

To communicate the full potential of what is available to the connected consumer will increasingly require significant investment in large demonstration-type facilities like the new Telstra store. Maintaining a consistent experience across multiple outlets will be a huge challenge.


Monday, 20 October 2014

Salesforce Wave and the future of data analytics

Forbes Magazine concluded a lengthy report on Dreamforce - Salesforce's mammoth conference held in San Francisco last week - with the comment that neither of the major product announcements - data analytics tool Wave and mobile app development platform Lightning - was "as finished, or nearly as polished, as the four-day corporate love-fest that Salesforce has become the master of hosting."

I'd come to the same conclusion. Like everything else at Dreamforce, the demonstrations of Wave were slick and seamless, but I also came to the conclusion that what was demonstrated was without doubt the future of data analytics and that, however far short of that vision Wave falls today, it will get closer, and probably quite rapidly.

Firstly, Salesforce has developed Wave on the basis that the primary user interface will be a mobile device - not only because mobile devices are now the favoured means of interfacing to cloud-based IT services, but because it believes that access to analytics capability needs to be available in real time to the people at the coalface - sales people trying to win deals, distribution managers trying to determine delivery volumes, etc - not just to data analytics specialists in back rooms.

Secondly, ease of use is the number one priority. Data analytics needs to be available directly to those who need the answers, not just to specialists, with the results displayed in easy-to-understand graphics in any one of many different formats and flicked instantly from mobile to tablet or desktop.

Thirdly, there needs to be access to data from many different sources to support a wide range of queries, many of which might be hard to anticipate in advance.

In one demonstration, a rep from a financing company convinces a builder of luxury boats to take on millions of dollars of finance to build more of a particular model by showing that (a) inventory of that model is very low and (b) demand for it is likely to be high, based on the volume of positive chatter about it on social media sites.
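
The underlying logic of that demo is simple enough to sketch; what Wave adds is instant, graphical access to it from a phone. Here is a hypothetical re-creation in Python - the model name, numbers and thresholds are invented, and this is not Salesforce's actual API:

```python
# Two hypothetical, already-integrated data sets: stock on hand and the
# volume of positive social media chatter, per boat model.
inventory = {"Luxe 48 flybridge": 3}             # hulls in stock
positive_mentions = {"Luxe 48 flybridge": 1240}  # posts this month

LOW_STOCK = 10   # below this, supply is tight
HIGH_BUZZ = 500  # above this, demand looks strong

model = "Luxe 48 flybridge"
if inventory[model] < LOW_STOCK and positive_mentions[model] > HIGH_BUZZ:
    print(f"{model}: low stock ({inventory[model]}) and strong demand "
          f"({positive_mentions[model]} mentions) - a case for financing a new build run")
```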

Salesforce maintains that none of the demonstrations was set up, that all were real systems. That's no doubt true, but what determines the usefulness of tools like Wave is not just the user interface - and Salesforce seems to have done a very good job of making it easy to use - but the data sets that have been integrated into it.

The choice of data sets will inevitably be determined by the types of queries anticipated, and diminishing returns will make it uneconomic to cater for queries that are needed only infrequently.

But the vision is spot on. Being able to easily ask any question of any relevant data set - even one that might previously never have seemed relevant - compare the answers with queries on other data sets, and get quantified results presented graphically will revolutionise many aspects of many industries.

The realisation of that vision will do for the world of quantitative data analysis what Google did for qualitative information research. Remember the pre-Internet, pre-Google era, when online research was the exclusive domain of librarians and researchers who understood the arcane technology of online databases like Dialog?

Launched in 1966, Dialog claimed to be "the world's first online information retrieval system to be used globally with materially significant databases." Its usefulness was limited by the range of its datasets and the skills needed to use it. And that is pretty much where data analytics is at today.