Monday, 15 December 2014

Good news, bad news and wrong news on IT spending

I've got some good news for all you cloud service providers, security technology companies, storage suppliers, big data companies and those of you in the mobile industry: 2015 is going to be a bumper year.

"IT decision makers’ spending on security technologies will increase 46 percent in 2015, with cloud computing increasing 42 percent and business analytics investments up 38 percent. Enterprise investments in storage will increase 36 percent, and for wireless and mobile, 35 percent."

That, at least, is what was said in this story on the web site of Forbes Magazine, a publishing house that claims "iconic status in the lexicon of American media."

That's the good news. The bad news is that the rest of IT is going to have a very tough time in 2015. The Forbes article also reported: "The average IT budget will increase by 4.3 percent in 2015."

The article cites, as the source of this astonishing information, research undertaken by IDG, publisher of Computerworld and numerous other IT magazines.

The truth is that this is a load of boloney. The author of the Forbes article misinterpreted IDG's report of its research. What IDG found was that 46 percent of respondents said that spending on security would increase in 2015 - it did not ask by what percentage. Similarly, 42 percent said spending on cloud would increase, and so on.
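To see the difference, here is a toy illustration (invented yes/no responses, not IDG's actual data): the statistic IDG reported counts respondents, and no budget-growth percentage can be computed from it.

```python
# Toy survey: each respondent says whether security spending will rise in 2015.
# (Hypothetical responses for illustration - not IDG's dataset.)
responses = [True, True, False, True, False, False, True, False, True, True]

# What IDG actually reported: the SHARE of respondents expecting an increase.
share = 100 * sum(responses) / len(responses)
print(f"{share:.0f}% of respondents expect security spending to increase")

# What Forbes reported it as - "spending will increase 46 percent" - would
# require a "by how much?" answer from each respondent. The survey never
# asked that, so no average growth figure exists in the data.
```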

Trouble is, that error has been widely spread by numerous others who have picked up and quoted or referenced the Forbes report unquestioningly. I searched on the exact phrase of the Forbes article headline "Computerworld's 2015 Forecast Predicts Security, Cloud Computing And Analytics Will Lead IT Spending" and got 50 hits.

What's really astonishing is the lack of response by both Forbes and IDG to this error. On its web site Forbes offers an email address through which to submit corrections to its published articles. I did so on 4 December. There has been no response, and as of 15 December the article is still there, unchanged.

I got a half-hearted response from IDG that acknowledged only one error, but IDG does not seem to have been sufficiently concerned about the misrepresentation of its research to pursue Forbes and ensure that the error is corrected.

An IDG PR person responded to my email by saying: "I have reached out to Louis [Columbus, the author of the Forbes article] as that point of data was misinterpreted. Forty six percent of respondents anticipate their security budget will increase in the next year, but we did not collect by what percent."



Friday, 14 November 2014

All quiet on the Optus OSS front

It was, I am sure, no coincidence that NEC announced a major OSS deal with Optus on the day that Optus announced its Q2 results, as part of parent SingTel's Q2 results announcement.

Optus is trying to play down the importance of a deal that will have far-reaching effects on the company for years to come and, if successful, will greatly improve Optus' agility and competitiveness. And if not successful...

NEC subsidiary NetCracker is providing a complete operational support system upgrade to Optus. NetCracker products will support all Optus services - consumer and business - across the entire service fulfilment chain from service order management to network configuration and activation.

The project will take three years to complete and is believed to be one of the biggest of its kind being undertaken by a major telco anywhere in the world.

Optus is saying nothing. Rather than crow about what the deal could mean in terms of efficiency, improved customer service, etc it is keeping its head down. Not that its low profile will make any impact where the news really matters: to Optus' major competitors, who will be monitoring the project as closely as they can for signs of its impact on Optus' market activities, and for any signs of hiccups.

Gartner has just released its Magic Quadrant for OSS. It has NetCracker as the highest ranked player in the leaders quadrant, marginally ahead of Amdocs and Ericsson. Oracle and IBM are also in the leaders quadrant.

More important, though, is what Gartner has to say about OSS in general: "OSS data is of strategic importance to measure the impact on specific operational, customer and business goals. OSS helps to improve operational efficiency, drive down costs and improve customer experience. ... [OSS data is used] to link operational technical planning with actual customer data, and hence is used by lines of business, marketing, strategic planning, etc."

And like almost every other area of IT, OSS is struggling to keep up with disruption. "The evolution of digital services and of convergent services poses complex challenges for OSS. Challenges include machine to machine (M2M), value-added services (VAS) and B2B (enterprise), as well as the need to provide more competitive personalized services (many of which consist of third party content and components)," Gartner says.

It's believed to be 10 years since Optus undertook a major OSS upgrade, so you can be pretty sure its existing systems are under stress.

And I have not mentioned one of the biggest disruptions to telcos that is approaching at great speed: the combination of software defined networking and network functions virtualisation.

As Gartner says: "The arrival of NFV and SDN will be disruptive to [telcos'] existing OSS architectures, potentially requiring an upgrade or a new generation of OSSs. Network-facing OSSs — such as provisioning, fault and event, and performance management — must be re-architected to support network and service orchestration functions for hybrid infrastructures."

In short, it would be no exaggeration to say that the future of Optus depends on the success of this project.


Wednesday, 12 November 2014

In search of the Internet of Everything

The Guardian's web site this week carried a lengthy article that sought to arrive at a clear definition of the 'Internet of Things'. While it explored the topic it did not do much to clarify the definition.

"You could be forgiven for believing that the Internet of things (IoT) is a well-defined term and that everyone is on the same page," it said. "But you would be mistaken to say the least, given the huge variety of intelligent connected devices that this term refers to."

That doesn't make much sense: the term surely is meant to be all-embracing, to include every kind of connected 'thing'. The Guardian then confused the issue even further by equating the term to the Internet of Everything (IoE). "In fact, the thing about the IoT is that it could mean almost anything. In some ways it is better to think of it as the Internet of everything."

I'm not sure whether it was Cisco that coined the term Internet of Everything, but Cisco has largely been responsible for it gaining currency. (According to Wikipedia, Cisco launched its first global re-branding in six years in 2013 with its 'Tomorrow starts here' and 'Internet of Everything' advertising campaigns). In so doing Cisco managed to muddy the waters as to the distinction between IoT and IoE.

And if you look at some of Cisco's IoE statements, there really seems to be little distinction. Cisco defines the Internet of Everything as "The bringing together of people, process, data and things to make networked connections more relevant and valuable than ever before, turning information into actions that create new capabilities, richer experiences, and unprecedented economic opportunity for businesses, individuals, and countries."

That does go beyond simply a network of connected things, but I don't think it makes the distinction clear enough. It's expressed much better in this blog from Cisco's chief futurist, Dave Evans, in the Huffington Post, devoted to elucidating the distinctions between IoT and IoE.

He reveals that IoT now has an 'official' definition from one of the highest authorities on the English language, the Oxford Dictionary: "a proposed development of the Internet in which everyday objects have network connectivity, allowing them to send and receive data." Good try, and it brings in another dimension: changes to the Internet itself, rather than simply connected devices that exploit it.

It is almost certain that changes to the Internet will be needed to cope with the billions of things expected to be connected to it. The IEEE has set up a new working group, P2413, on the Internet of Things, and I quoted here one of its members saying that new Internet standards were urgently needed because, "with 50 billion connected devices by 2020, the Internet as we know it today is not ready for that."

Changes to the Internet aside, Evans argues that IoT is just one of the four dimensions - people, process, data and things - that constitute IoE. "If we take a closer look at each of these dimensions, and how they work together, we'll begin to see the transformative value of IoE," he says.

His blog is well worth a read, but he nicely sums up the distinction in his concluding paragraph. IoE is not about those four dimensions in isolation from each other. "Each amplifies the capabilities of the other three. It is in the intersection of all of these elements that the true power of IoE is realised."

In other words, the big data analytics needed to make sense of the masses of data that the 'things' will generate will be just as much a part of IoE as the devices that produce the data and the networks that interconnect them, as will the applications and innovations that emerge to exploit billions of connected devices of all kinds.


Tuesday, 28 October 2014

Cloud is coming, faster than you might think

IDC has been banging on about its 'third platform' for several years now, but maybe it takes one company's real-world experience to bring home the importance of this shift and the rate at which it is happening.

IDC believes IT is moving rapidly to a paradigm based on cloud computing, mobile, big data and social, and that this shift will be highly disruptive. It recently set out its views in IDC Predictions 2014: Battles for Dominance —and Survival — on the 3rd Platform, which you can buy from IDC for the princely sum of $5, or get for free here, courtesy of SAP.

CIO magazine gave it a rave review. "While understated, its analysis and predictions provide as much drama as any novel, and its denouement is the kind of cliffhanger that makes it, as the saying goes, unputdownable. Simply stated: You must read this report and think about what it means for your company's future. This is true whether you're an IT user, a vendor or, for that matter, a company that thinks of itself as in another business altogether."

So, back to that real-world experience. Here's how fast one software company - call centre systems vendor Interactive Intelligence - has seen its sales shift from the on-premises version of its software to the cloud version.

In mid-2011 I asked Brendan Maree, the head of Interactive Intelligence for Australia and New Zealand, how fast he thought sales would shift from the on-premises version of the company's software to the cloud version. He predicted that up to 50 percent of revenue could come from the cloud version within three years.

He was right, and he was wrong. Last year, globally, 50 percent of sales were for the cloud version, according to founder and CEO, Don Brown, but in ANZ the figure was 87 percent.

Maree says that the shift to cloud has been rapid and almost complete: "This is our fourth year into the cloud business in Australia and New Zealand. We introduced a cloud offering in 2009. In the first year five percent of orders were for the cloud version. The next year it was 11 percent of revenue. It moved to 26 percent the following year and then jumped to 87 percent last year. This year we have done only two deals for on-premises systems, both quite small."

He added: "There was a time last year when cloud created a lot of work for us. The sales engineers had to do two designs because prospective customers weren't sure about cloud. But all the tenders now, and all the discussions, are about cloud. It's like people have forgotten about premises based systems."

Globally, Brown said, revenues from cloud had gone from five percent four years ago to 60 percent this year, and projections for next year were 70 percent.

There was no suggestion from either Maree or Brown that the company has been aggressively promoting the cloud version at the expense of the on-premises version. Rather, that it has simply responded to market demand.

What this rapid transition illustrates is the need for enterprises to be alert to these paradigm shifts and adapt accordingly. Brown related two stories of discussions with potential customers - one a low margin fashion retailer and the other a large long-established insurance company - that do not appear to have realised what is happening.

The fashion retailer had "run its previous [premises based] contact centre into the ground; amortised it to dust" and was quite happy to do the same with a replacement system. The insurance company "almost threw me out of the room when I mentioned cloud," Brown said. "It was like I was insulting the operational capacity of their data centre."


There may well be good reasons for a company to opt for a premises based solution rather than cloud-based, but it's clearly not a decision that should be taken because that was how it was done in the past, or in ignorance of a transition that is turning the world upside down at great speed.

Friday, 24 October 2014

Here comes the Quantified Selfstra

The launch of Telstra's eHealth initiative this week was a curious affair. Most of the focus was on ReadyCare, Telstra's joint venture with Swiss company Medgate set up to provide general practitioner services over the phone. It's at least six months from becoming reality.

In contrast, no mention was made of close to 20 ehealth offerings detailed in a glossy brochure handed out at the event, all of which, it seems, are services that are already in operation. So quite likely most of what you've read in the news about Telstra Health represents only the tip of the iceberg. You can find most of them on this web page.

When I asked about these, Telstra told me that the providing entities were a mixture of companies acquired by Telstra, companies in which it has taken equity and those whose products and services it has licensed.

So there is certainly more, much more, to Telstra Health than Telstra has talked about so far, and some of it likely centres on how Telstra can leverage the personal monitoring devices that will, inevitably, be a major component of many ehealth initiatives.

According to Frost & Sullivan, "wearable technology has gained considerable traction especially in the health and wellness industry." That's stating the obvious, but it was made in a press release announcing an F&S report, 'Sensor Technology Innovations Enabling Quantified-Self', in which F&S said: "The market for quantified-self technologies – apps that enable people to track and quantify aspects of their daily lives – is currently in the embryonic stage. However, explosive growth is expected in coming years."

It added: "As healthcare is one of the main industries impacted by the quantified-self movement, acquiring accurate data and ensuring seamless interoperability are key challenges. In addition, data sharing among health services and pharmaceutical firms raises privacy concerns. Healthcare companies must ensure that data collected from clients is not shared without direct consent."

Assured privacy and security will underpin everything Telstra does in eHealth, or else the initiative will be dead in the water. As a company that clearly intends to be active in all areas of eHealth, interoperability will also feature strongly in its offerings. In short Telstra is well placed to take a lead position in the quantified self market.

Similarly, F&S observes: "To get the healthcare industry further involved in quantified-self, enhancing the connectivity of wearable devices with technology companies to support data exchange will also be crucial." That's another function Telstra Health will be well-placed to fulfil.

But just what is the quantified self? According to F&S: "Quantified-self facilitates the tracking of diet, sleep, heart rate, activity, exercise and moods and allows individuals to gain better insights on physiological parameters that were never examined earlier.”

The movement, and the term, was created by two editors from Wired Magazine, Gary Wolf and Kevin Kelly. You’ll find a detailed explanation of it in this 2009 article by Wolf.


There is already a global movement for self monitoring and quantified self (http://www.quantifiedself.org). It has over 100 Meetup groups around the world and one each in Sydney and Melbourne with over 400 members between them.

Tuesday, 21 October 2014

Telstra's store revamp - in search of consistency

When Telstra CEO, David Thodey, trumpeted the innovations in customer service embedded in the company's new $112m flagship retail outlet in the old Darrell Lea chocolate shop at 396 George Street, Sydney, he talked about giving customers the ability to start an interaction in one store, review the history on their own PC, tablet etc and, eventually, when the retail refresh has been fully rolled out, pick up the conversation in another store.

Trouble is, although many of those stores might look and to some extent feel like Telstra stores, they aren't. ASX listed Vita Group alone operates 109 Telstra branded retail stores and Telstra Business centres around Australia.

I asked Thodey whether the same seamless experience would be available across these licensed stores - because of course customers are blithely unaware of ownership distinctions. Not initially, he said, but this was something Telstra was working on.

Consistency of customer experience in retail environments is certainly something that has been highlighted as important, and its importance is reflected in the long history of love-hate relationships between Australian mobile operators and the owned versus licensed retail outlet model.

Last year KPMG published the results of a global mystery shopping exercise for prepaid SIMs. They weren't interested in prepaid SIMs per se, but they were a convenient and almost universal product that could be used to contrast and compare the 'customer experience' across different channels - online, in store, etc - different outlets - owned v licensed v independent - and different countries.

Consistency loomed large. "In the early days of prepaid, operators are often happy to have their products sold from all retail channels: licensees, non-exclusive, supermarkets, convenience stores, etc," KPMG said. "The focus has now changed to the extent that some operators are buying back the franchise and licensee stores so they can have more control over the customer experience. ... It is vitally important that operators deliver a consistent experience across all retail outlets regardless of ownership."

KPMG went on to note that "Two leading Australian telcos in April 2013 both moved to end long-standing agreements with sub-distributors of their prepaid offerings to allow for more control of the channel."

Optus last year shed deals with AllPhones and TeleChoice, cutting its retail outlets by over 300 stores, and announced plans to open another 30 of its own stores. Optus MD of sales, Rohan Ganeson, was reported as saying: “We believe [that] investing more in our stores, investing more online and investing into our people will deliver [the] results we need them to, but also deliver a greater experience for our customers overall. We think the renewed focus will give a cleaner, more engaging…customer experience — much more enjoyable — rather than going in and having a phone flogged to you.”

Vodafone has had a view of retail that has varied with its ownership. Back in 2003 it jettisoned the last of its retail stores announcing a deal to outsource the operation of its 58 shops to privately held phone retailers Digicall Australia and First Mobile. Then in 2009, following the merger with Hutchison, the new company announced that it would bring all its 208 Vodafone-branded retail outlets in-house.

VHA said: "We do see it as a very important strategic shift which will allow us to own and operate our national Vodafone-branded retail channel. We also see it as a very shrewd move competitively because it will better position VHA with a more cohesive and consistent Vodafone customer experience."

Of course what these stores sell has changed dramatically over the past decade. And it was clear from the Telstra store that it's not just about mobiles. It's about smart homes, wearable devices, Internet-connected domestic appliances and more. The store even features a wall-sized video screen portraying products and technologies that are, to varying degrees, 'futureware'.

To communicate the full potential of what is available to the connected consumer will increasingly require significant investment in large demonstration-type facilities like the new Telstra store. Maintaining a consistent experience across multiple outlets will be a huge challenge.


Monday, 20 October 2014

Salesforce Wave and the future of data analytics

Forbes Magazine concluded a lengthy report on Dreamforce - Salesforce's mammoth conference held in San Francisco last week - with the comment that neither of the major product announcements - data analytics tool Wave and mobile app development platform Lightning - was "as finished, or nearly as polished, as the four-day corporate love-fest that Salesforce has become the master of hosting."

I'd come to the same conclusion. Like everything else at Dreamforce the demonstrations of Wave were slick and seamless, but I also concluded that what was demonstrated was without doubt the future of data analytics and that, however far short of it Wave falls in reality today, it will get closer, probably quite rapidly.

Firstly, Salesforce has developed Wave on the basis that the primary user interface will be a mobile device, not only because mobile devices are now the favoured means of interfacing to cloud-based IT services but because it believes that access to analytics capability needs to be available in real time to the people at the coalface - sales people trying to win deals, distribution managers trying to determine delivery volumes, etc - not just to data analytics specialists in back rooms.

Secondly, ease of use is the number one priority. Data analytics needs to be available directly to those who need the answers, not just to specialists, with the results displayed in easy-to-understand graphics in any one of many different formats and flicked instantly from mobile to tablet or desktop.

Thirdly, there needs to be access to data from many different sources to support a wide range of queries, many of which might be hard to anticipate in advance.

In one demonstration a rep from a financing company convinces a builder of luxury boats to take on millions of dollars of finance to build more of a particular model by showing (a) inventory of that model is very low and (b) demand for it is likely to be high based on the volume of positive chatter about it on social media sites.
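Stripped of Wave's interface, the logic of that demonstration is just a query across two data sets. Here's a minimal sketch in plain Python, with invented model names and numbers (Salesforce's actual API looks nothing like this):

```python
# Hypothetical data sets - illustrative only.
inventory = {"Luxura 42": 3, "Luxura 36": 57}            # units in stock per model
positive_mentions = {"Luxura 42": 1840, "Luxura 36": 95} # social media chatter

# The rep's pitch: low stock plus high positive chatter suggests demand
# will outstrip supply, so financing more builds makes sense.
for model, stock in inventory.items():
    buzz = positive_mentions.get(model, 0)
    if stock < 10 and buzz > 1000:
        print(f"{model}: only {stock} in stock but {buzz} positive mentions "
              "- a case for financing more builds")
```

The query itself is trivial; the value lies in the data sets behind it, which is exactly the caveat that follows.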

Salesforce maintains that none of the demonstrations were set up, that all were real systems. That's no doubt true, but what determines the usefulness of tools like Wave is not just the user interface - and Salesforce seems to have done a very good job of making it very easy to use - but the data sets that have been integrated into it.

The choice of data sets will inevitably be made in anticipation of the types of queries likely to be asked, and cost makes catering for infrequently needed queries a case of diminishing returns.

But the vision is spot on. Being able to easily ask any question of any relevant data set, even those that might previously never have seemed relevant, comparing the answers with queries on other data sets and getting quantified results presented graphically will revolutionise many aspects of many industries.

The realisation of that vision will do for the world of quantitative data analysis what Google did for qualitative information research. Remember the pre-Internet and pre-Google era, when online research was the exclusive domain of librarians and researchers who understood the arcane technology of online databases like Dialog?

Launched in 1966, Dialog claimed to be "the world's first online information retrieval system to be used globally with materially significant databases." Its usefulness was limited by the range of its datasets and the skills needed to use it. And that is pretty much where data analytics is at today.


Tuesday, 19 August 2014

Telcos doomed by dumb pipes? Don’t be so sure.

The death of telcos, starved of revenues and relegated to being providers of dumb pipes while over the top providers cream off all the revenues, has been predicted for far longer than the likes of Google have been around.

In the wake of both Telstra and SingTel Optus releasing annual results last week the issue has once again come to the fore, in a lengthy article in the Australian Financial Review last Saturday. It painted a graphic picture of "a future in which humans are making more phone calls, sending more messages and downloading more content than ever before. And yet the big phone companies, such as Telstra, that for more than 100 years have made it happen are reduced to utilities providing little more than a network of 'dumb pipes'."

The message was reinforced a couple of days later with the release by IDC of its Australia Mobile Services 2014–2018 Forecast and Analysis report, accompanied by a press release quoting senior market analyst, Amy Cheah, saying that connectivity is no longer enough to provide revenue growth. "While operators must continue to invest in network capabilities to protect their core revenue they must adapt their strategy to become more like OTTs [over the top service providers]; to create new streams of revenue growth by creating new business and deliver new customer experiences."

That's another piece of advice that has been repeated ad nauseam for several years. It was accompanied by some more, from IDC research manager, Siow-Meng Soh, who said: "Mobile operators need to form the right partnerships and train their sales force to be able to sell mobile solutions to different verticals instead of just selling connectivity — which is increasingly being commoditised."

He added: "Some of these applications are machine-to-machine (M2M) applications targeted at specific verticals (eg, utilities, manufacturing, and transportation)."

Both these articles, and I have seen many over the years, start from the premise that the pipes are dumb, that value can be added only through what they carry, and that the challenge for the owners and operators of those pipes is to capture some of that value instead of letting the OTT providers have it all.

But what happens when that network ceases to be dumb and becomes smart, very smart indeed? Last week, while Telstra and Optus were announcing their results, over in the US TMC was staging a conference in Las Vegas on software defined networking (SDN) and network functions virtualisation (NFV). On its web site there's a Q&A with conference chairman and TMC CEO, Rich Tehrani. It's well worth a read; here are a few snippets.

He's asked how the specifications for NFV will impact the market. "These specifications describe how carriers can grow their revenue by providing virtualised services to their enterprise customers by placing solutions in the customer's cloud. By providing virtualised customer equipment, carriers will be able to cost-effectively compete with OTT cloud-communications vendors more effectively."

He's asked what the opportunity will be for telcos. "Massive – huge – incredible – I haven't seen an opportunity in the telco ecosystem space this big since 1998 when I launched Internet Telephony Magazine and subsequently watched circuit switched products become legacy thanks to packet-switched. This exact sort of transformation is going to happen again. ... The OTT providers and other competitors know this and are certainly not slowing down their assault."

It's less than two years since a bunch of telcos, including Telstra, presented the first white paper on NFV. Since then progress has been astonishingly rapid. Telstra is already trialling virtualised CPE functionality with Ericsson, and yesterday Ericsson announced that it would continue to supply optical transmission equipment to Telstra, along with gear from its partner, Ciena. All their press release could talk about - albeit very vaguely - was how the technology would enable Telstra to implement SDN and NFV.

In other words those dumb pipes are about to become very smart indeed, and the telcos will be the brains. Reports of their death are, in all likelihood, greatly exaggerated.


Monday, 18 August 2014

Vodafone Australia going after M2M market

On 10 September Vodafone Australia is holding a press briefing "Machine-to-Machine – Vodafone insights on adoption, growth and future markets" at which it promises to look at "the global [M2M] landscape based on insights from Vodafone’s M2M Adoption Barometer 2014" and share "new Ovum research, commissioned by Vodafone, [that] will more closely examine expectations for the Australian market." This could be the one area where Australia's beleaguered number three mobile player can gain an edge on Telstra and Optus.

Vodafone Group was an early entrant into the global M2M market. In January 2009 it launched a global M2M service platform to help companies deploy and manage large wireless M2M projects, supported by a team charged with growing the company’s M2M business worldwide. The move came from Vodafone Global Enterprise, a business unit set up in 2007 to serve the mobile communications needs of multinationals across multiple countries.

That initiative appears to have paid off. In June Machina Research, a UK based company that claims to be "the world's leading provider of strategic advice on the newly emerging M2M, IoT and big data markets" published its annual review of the M2M operations of the major global telcos, saying that Vodafone had taken the top spot.

The report's author, Matt Hatton, said: “Vodafone’s scale, growth and customer wins are testament to its ongoing lead in the sector. If we just look at the numbers, this year it overtook AT&T and Verizon to become the biggest global M2M provider, in terms of SIMs. It was also the fastest growing of the fifteen we studied."

He added: "We’re not just counting numbers of connections to determine current, or future, success. We expect initiatives such as its SOBE product, its plans for licensing the GDSP platform, and the added bonus of funds from Project Spring to provide further impetus in the next few years.”

SOBE (Simple Out-of-Box Experience) is designed to make it easier for users of smart devices to sign a prepaid data contract on the fly. For example, the Amazon Kindle Fire comes equipped with Vodafone's SOBE technology. GDSP (Global Data Services Platform) is an online service designed to help Vodafone clients manage all their M2M connections and activate, block and disable devices. Project Spring is a global network investment program.

In a further move into the M2M market, Vodafone Group announced in July that it would expand its long-standing partnership with Japanese telco, NTT Docomo, to include the delivery of M2M services to global enterprises. However the two gave no details of their new collaboration.

Also in July Vodafone announced plans to acquire Italian company Cobra Automotive Technologies, a provider of security and telematics products to the automotive and insurance industries. Vodafone said the move was in line with its strategy to expand its M2M capability beyond connectivity. “Cobra’s telematics products and expertise will enable Vodafone to provide a more comprehensive range of end-to-end services to automotive customers.”

So back to the 10 September event. I guess we'll have to wait until then for Ovum's research on the Australian M2M market, but Vodafone's barometer was published in July. The headline is that the percentage of companies saying that they are using M2M has grown by over 80 percent since Vodafone's first study a year ago. Vodafone says: "Over a fifth of companies that we spoke to already have at least one M2M-based solution in place. Fifty five percent of organisations that we talked to said that they expect to have an M2M solution in place within two years."


Vodafone's success in the M2M market will be good news for NetComm Wireless (ASX: NTC). In October 2012 it signed an agreement with Vodafone Global Enterprise to supply M2M modems, saying: "This is an ongoing supply agreement and the size of revenues will directly relate to the success that Vodafone has in securing M2M contracts."

Friday, 15 August 2014

Digital security? No such thing, says Gartner

Good headline eh? A beat up? Don't be so sure. Gartner has just released its 2014 Hype Cycle for Emerging Technologies, claiming that it "maps the journey to digital business." It puts digital security at the very start of that journey and on a slow road to the destination.

This journey is important. All the analysts are saying that businesses must become digital to survive. I spent most of yesterday at Forrester's CIO Summit in Sydney at which that message was driven home time and again.

It's neatly summed up in the introduction to a new Forrester report The State Of Digital Business In Asia Pacific In 2014, due for release later this month. "Regional CIOs must incorporate digital as a core technology imperative. CIOs who ignore the impact of digital disruption do so at their own peril. The only way to weather dynamic industry changes is to incorporate systems that help your organisation win, serve, and retain customers."

I'm sure you're familiar with the Hype Cycle. A new technology starts with an innovation trigger, rises relatively rapidly to the 'peak of inflated expectations' before descending equally rapidly to the 'trough of disillusionment' and then, assuming it survives, rises relatively slowly up the 'slope of enlightenment' to the 'plateau of productivity'.

There are close to 50 emerging technologies spread across this hype cycle, each individually coded with Gartner's estimate of how long that technology will take to reach the plateau of productivity.

The one that caught my eye is right at the start of the cycle with a 5 to 10 year time frame to productivity. It's labelled 'digital security'. Surely not? The issues with security are well publicised but it is an established and generally successful technology, if you take it to be a blanket term covering the whole gamut of techniques and technologies used to secure data in the digital world. That at least is how Wikipedia defines it (although that article is flagged as having multiple issues).

Without access to Gartner's full report on the hype cycle it's hard to know exactly what Gartner means by 'digital security', but references in Gartner blogs etc suggest it to be a fairly general umbrella term.

And indeed it seems that Gartner is forecasting the end of security as we know it. In Gartner's top 10 predictions for 2014 there is one slide (slide 13) which predicts that: "By 2020 enterprises and governments will fail to protect 75 percent of sensitive data [and] will declassify and grant broad/public access to it."

It goes on to say: "Enterprises and government should accept that sharing many seemingly sensitive data is neither dangerous or unprofitable, politically and economically." It argues that the growth of data will exceed protective mechanisms and that the best form of protection is having nothing to protect.

Trouble is, the same technologies that can be used to exfiltrate data can be used to infiltrate and disrupt systems that do much more than store and process data: systems that control things - electricity supply, lifts, life support systems, etc.

Another of Gartner's 2014 predictions is that "by 2024 at least 10 percent of activities potentially injurious to human life will require mandatory use of a non-overridable smart system."

Non-overridable of course does not mean secure and non-hackable. But hey, Gartner reckons digital security will have reached the plateau of productivity by then, so we can all relax.

Friday, 8 August 2014

Does this space drive break the law of physics?

There's been quite a bit of coverage on news services these last few days of a new technology that appears to violate the laws of physics and that could revolutionise space travel. I've read a few of these reports and found the explanations a bit confusing. Discussing it with people over dinner last night also suggested that there's a bit of confusion as to just why it is so significant. So I thought I would try and explain it in simple terms.

The technology is called the EmDrive. It's been in development by British company SPR Ltd for several years amid much scepticism. It's hit the headlines this month because NASA engineers, apparently, have studied it, confirmed it works as advertised and published their findings in a conference paper. However they have not, reports say, explained exactly how it works, simply that it does work.

This has produced headlines like this one: "'Impossible' space drive tested by NASA foretells future of deep-space travel". The story went on to say: "NASA has conducted long-awaited experiments to prove that the fabled space drive, capable of generating its own thrust and breaking a fundamental law of physics, works. If the find survives fresh scrutiny, space ship construction will be revolutionised."

So, why is this so exciting and why does it appear to violate the laws of physics? Here's the problem. You're sat in your spaceship, stationary out in space. The laws of physics say that if you want to move forward you have to chuck something out the back end. The more you can throw and the faster you can throw it the faster you will go forward.

You can have an infinite supply of energy with which to throw stuff, but if you ain't got anything to throw you're going nowhere. This creates another problem. The mass you want to throw out the back later has to move forward with you initially. So the further and faster you want to go the more mass you need to carry and the more you need to throw out the back (or the faster you need to throw it) to even get started. It's a case of diminishing returns.

This is why you need a massive space rocket to launch a small satellite. (In this case the source of energy, the rocket fuel, also provides the mass that gets shoved out the back to move the rocket forward.)
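That diminishing-returns trade-off has a formal statement, the Tsiolkovsky rocket equation (standard physics, not something the news reports quoted):

$$\Delta v = v_e \ln\frac{m_0}{m_f}$$

where $\Delta v$ is the change in the ship's velocity, $v_e$ is the speed at which mass is thrown out the back, $m_0$ is the initial mass (ship plus propellant) and $m_f$ is the final, dry mass. Because the mass ratio sits inside a logarithm, each extra increment of speed demands exponentially more propellant.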

This state of affairs is formally expressed as the law of conservation of momentum. It states that, in a closed system such as a spaceship in space, the total momentum - the sum of mass x velocity over everything in the system - remains constant. Put simply, if you want to get a mass of 1kg moving one way at 10km/minute you need 1kg moving in the opposite direction at 10km/minute, or 100gm at 100km/minute. And that 100gm must come from the spaceship.
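In symbols, for the example above (a worked restatement using the same numbers):

$$m_1 v_1 = m_2 v_2:\qquad 1\,\mathrm{kg}\times 10\,\mathrm{km/min} = 0.1\,\mathrm{kg}\times 100\,\mathrm{km/min}$$

Both sides carry 10 kg·km/min of momentum in opposite directions, so the total momentum of the closed system stays at zero, exactly as it was before anything moved.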

What the EmDrive claims to do, according to its web site, is to "convert electrical energy directly into thrust" - that is, to move the object forward without throwing anything out the back end. "Thrust is produced by the amplification of the radiation pressure of an electromagnetic wave propagated through a resonant waveguide assembly."
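For scale, conventional radiation pressure is well understood, and tiny (this is textbook electromagnetism, not SPR's claim): an electromagnetic wave of power $P$ reflecting off a surface pushes on it with force

$$F = \frac{2P}{c} \approx \frac{2\times 1000\,\mathrm{W}}{3\times 10^{8}\,\mathrm{m/s}} \approx 6.7\,\mu\mathrm{N}\quad\text{for }P = 1\,\mathrm{kW}$$

which is why the claimed 'amplification' inside a resonant cavity - and whether it can yield any net thrust on the cavity as a whole - is the entire controversy.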
At first blush this would seem to be in conflict with the law of conservation of momentum. SPR claims this is not the case. According to the FAQs on its web site: "The EmDrive does not violate any known law of physics. ... The EmDrive cannot violate the conservation of momentum. The electromagnetic wave momentum is built up in the resonating cavity, and is transferred to the end walls upon reflection. The momentum gained by the EmDrive plus the momentum lost by the electromagnetic wave equals zero. The direction and acceleration that is measured, when the EmDrive is tested on a dynamic test rig, comply with Newton's laws and confirm that the law of conservation of momentum is satisfied."

I have no idea what that means. But I guess eventually this will turn out to be either the breakthrough of the century, or the con of the century.


Thursday, 7 August 2014

The soon-to-be-global smartphone player you've never heard of...

...And almost certainly can't pronounce. Its name is Xiaomi. It is only four years old and has just, according to market analyst Canalys, soared to the top of China's smartphone market, which in Q2 of 2014 accounted for 37 percent of global smartphone shipments, some 108.5 million units according to Canalys.

"In little over a year, Xiaomi has risen from being a niche player to become the leading smart phone vendor in the world’s largest market, overtaking Samsung in volume terms in Q2," Canalys said.

"Xiaomi took a 14 percent share in China, on the back of 240 percent year-on-year growth. With Lenovo, Yulong, Huawei, BBK, ZTE, OPPO and K-Touch, the eight Chinese vendors in the top 10 together accounted for a total of 70.7 million units and a 65 percent market share."

Ninety-seven percent of Xiaomi's sales were in Mainland China, but that's unlikely to be the case for very long. According to Canalys, Xiaomi is now looking to expand into other markets, with Indonesia, Mexico, Russia, Thailand and Turkey in its sights for the second half of the year.

Canalys analyst Jessica Kwee says Xiaomi faces considerable challenges in trying to crack the global market, and she has some advice for the company: "Xiaomi needs to build its international brand, and will need to localise its services offering with MIUI [its version of Android] for the different markets into which it expands, else its differentiation, value proposition and service-oriented revenue streams will be eroded. And it must tailor its marketing and largely online sales channels accordingly."

She concludes: "Xiaomi does have the potential to be a disruptive force beyond China and international vendors should take note." I'd say that is a very significant understatement.

Xiaomi's MIUI Android-based OS is already gathering a global following, with 26 global 'fan sites' including one for Australia. You can download and install the MIUI OS on numerous Android phones. There was also at one time a Xiaomi store in Australia importing and selling the company's products, though not connected with Xiaomi. However it has closed down.

Rival forecaster IDC seems to see the threat to other smartphone vendors as being far greater than Canalys does. It has just released its Worldwide Quarterly Mobile Phone Tracker for Q2 of 2014. The top five vendors, from one to five, were Samsung, Apple, Huawei, Lenovo and LG. But program director Ryan Reith doesn't expect that ranking to remain. "Right now we have more than a dozen vendors that are capable of landing in the top five next quarter," he says. "A handful of these companies are currently operating in a single country, but no one should mistake that for complacency – they all recognise the opportunity that lies outside their home turf."

And to ram home that message, senior research manager, Melissa Chau, adds: "As the death of the feature phone approaches more rapidly than before, it is the Chinese vendors that are ready to usher emerging market consumers into smartphones. The offer of smartphones at a much better value than the top global players but with a stronger build quality and larger scale than local competitors gives these vendors a precarious competitive advantage."

More to the point, Canalys and IDC are at odds on the global smartphone vendor rankings. According to Canalys, Xiaomi had 14 percent of a total of 108.5 million smartphone shipments in China in Q2, which works out at about 15.2 million units - and remember that China accounted for only 97 percent of its sales. According to IDC, number three global vendor Huawei shipped 20.3 million units in Q2; number four vendor Lenovo 15.8 million; and number five vendor LG 14.5 million. So, depending on whose figures are correct, Xiaomi could already be at or very near number four among global smartphone vendors.
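As a back-of-the-envelope check using only the rounded figures quoted above (a sketch, not a definitive ranking - the shares are rounded in the source reports):

```python
# All figures in millions of units, Q2 2014, as quoted above.
china_market = 108.5                 # Canalys: total China smartphone shipments
xiaomi_china = 0.14 * china_market   # Canalys: Xiaomi's 14% share -> ~15.2M
xiaomi_global = xiaomi_china / 0.97  # 97% of Xiaomi's sales were in China

idc = {"Huawei": 20.3, "Lenovo": 15.8, "LG": 14.5}  # IDC: global vendors 3-5

print(f"Xiaomi implied global shipments: {xiaomi_global:.1f}M")
for vendor, units in idc.items():
    print(f"  {vendor}: {units}M")
# Output puts Xiaomi at ~15.7M - within rounding distance of Lenovo's 15.8M,
# which is why the number-four claim is plausible but not certain.
```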


I'd say all those vendors should do more than take note of Xiaomi. They should be very afraid.

Wednesday, 6 August 2014

The data centre of the future could be built from smartphones

There's an article just published in Wired magazine, titled: "The Data Centers of Tomorrow Will Use the Same Tech Our Phones Do." The data centre of the future, it says, "is about eliminating all vestiges of the proprietary hardware used in networking and storage in favour of commodity components available through the mobile supply chain. It’s about this commodity hardware performing the function of proprietary systems today."

According to Wired, the data centre of the future will be "a bunch of dirt cheap, cell-phone-like machines—all connected together with sophisticated software—instead of those power-sucking, refrigerator-sized boxes."

This indeed is the next logical step beyond the architecture espoused by Cloudera, which has its origins in Google: masses of commodity 'pizza boxes' comprising CPUs, memory and storage, where software-managed redundancy and data replication replace expensive hardware reliability, and which I wrote about last week. Cloudera CTO and founder Amr Awadallah claims that storage costs with its technology can be as little as 1/100th those of conventional systems.

The vast majority of CPU chips in data centre servers are supplied by Intel, but as equipment densities in data centres increase, power and cooling become bigger issues. Much of the demand for capacity in data centres is coming from those mobile devices: many of the billions of apps on those billions of devices rely heavily on resources in the cloud.

Nowhere are power issues more critical than in smartphones, where the ever increasing demands of applications and higher speed communications are pushing the limits of battery and semiconductor technologies.

And who holds the lion's share of the smartphone processor market? Not Intel, but ARM. In 2013 it claimed to have a 90 percent share in smartphones; a 95 percent share in feature and voice phones; and a 50 percent share in mobile computing devices including tablets, netbooks and laptops.

Some analysts think that Intel will start to make inroads into the mobile device market. Morningstar senior analyst Andy Ng wrote in June "While Intel has had limited success in penetrating the smartphone and tablet processor market so far, it has only begun to use its manufacturing technology advantage in its Atom product line. Intel's new Silvermont Atom chips are the first Atoms manufactured using cutting-edge process technology, as the firm used older-generation technologies for prior Atom products. As a result, we think Intel will have opportunities to achieve some success in mobile device processors as it fully harnesses its moat in that market."


Intel will be working very hard to achieve that success before its dominance of the data centre market is threatened by the evolution of the data centre into "a bunch of dirt cheap, cell-phone-like machines—all connected together with sophisticated software—instead of those power-sucking, refrigerator-sized boxes," or even instead of low cost Intel-powered pizza boxes.

Tuesday, 5 August 2014

Speed bumps on the road to digital transformation

Cloudera founder and CTO Amr Awadallah's account of the challenges the company and its customers face as they strive to exploit the full potential of the Cloudera Enterprise Data Hub technology is yet another example of the problems enterprises face as they embark on the journey of 'digital transformation'.


When I interviewed Awadallah last week he showed me a slide depicting how Cloudera sees customers progressively embracing and extending their use of the Cloudera Enterprise Data Hub technology. "The journey starts from using the system for better operational efficiency, just for cheap storage," he explained. "Then they start to use it for what we call 'extract, transform and load': transforming the data from its unstructured form into a structured form that they can use inside their database. That's a big bottleneck today for a lot of organisations."
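As a toy illustration of that 'extract, transform and load' bottleneck (generic Python with made-up log lines - the real thing runs on Hadoop-scale tooling, not a script like this):

```python
# Raw, semi-structured log lines (hypothetical sample data).
raw_logs = [
    "2014-07-28 09:14:02 user=alice action=login status=ok",
    "2014-07-28 09:15:09 user=bob action=purchase status=fail",
]

def transform(line):
    """Turn one free-form log line into a structured record."""
    date, time, *pairs = line.split()
    record = {"timestamp": f"{date} {time}"}
    record.update(kv.split("=", 1) for kv in pairs)
    return record

# The 'load' step: in production these rows would land in database tables.
structured = [transform(line) for line in raw_logs]
for row in structured:
    print(row)
```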

This is followed by data warehouse optimisation. Awadallah claims that the cost of storage in conventional data warehouse technology means that much data is relegated to the archives and for all practical purposes lost. "The archive is the graveyard of data," he says.

Based on the claim that the Cloudera approach offers storage at 1/30th to 1/100th the cost of a data warehouse, he quips: "We offer economy class storage. Data warehouse is first class. With a data warehouse it's fly or die."

Once organisations have achieved this level of usage, he says, they can start to exploit the real potential of the technology, analysing both structured and unstructured data, applying sophisticated data science techniques and finally applying converged analytics. This, he says "is when you have achieved enlightenment as an organisation. Where you have a single place with all your data and your workloads all come to the data as opposed to the data going to the workloads."

He claims that this is typically a four year journey for an organisation. "It can be a ten-year journey and for some organisations, it can be a one-year journey." Cloudera is only six years old, so none of its customers has yet completed the journey, but many are at the 'data science' stage, Awadallah says.

Not the least of the challenges for an organisation is how it makes the transition from the technology being in the domain of IT to the domain of lines of business - a transition shown in Awadallah's slide as a dotted line between enterprise 'data warehouse optimisation' and 'agile exploration'.

I asked him how Cloudera gets its customers across that line - how they make the transition from Cloudera technology being the domain of IT to being the domain of lines of business.

The answer, it seems is not IT pushing the business along the journey, telling them how they can make better use of the technology. Nor is it the lines of business pressuring IT to provide the additional functionality. "IT does not know how to do the sale. In fact IT is afraid of this as there are new skills they have to learn," he says. "At a minimum it forces them to do something new. IT sometimes fights this change."

This places Cloudera in an awkward position: from having an established relationship with IT, it moves to selling the benefits of its technology to the business, potentially undermining established relationships. "We try to make friends with the IT guys," Awadallah says. "We give them training so they are less resistant."