Cusumano’s view – Cloud Computing and SaaS as New Computing Platforms.

Cusumano, M. (2010). “Cloud Computing and SaaS as New Computing Platforms.” Communications of the ACM 53(4): 27-29. http://doi.acm.org/10.1145/1721654.1721667
This is an interesting and well-argued analysis of the concept of Cloud and SaaS as a platform. The paper concentrates on lock-in and network effects and the risks they pose given the dominance of certain players in the market, in particular Salesforce, Microsoft, Amazon and Google.
Direct network effects (the more people own telephones, the more valuable each telephone becomes) and indirect network effects (the more popular one platform is with developers, the more attractive that platform becomes for other developers and users) are key to understanding the development of Cloud. Central to the article's potential importance is its analysis of how integrated web services (and thus integrated software platforms) might create conflicts of interest, network effects and hence risks.
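A rough way to see why these effects compound: under Metcalfe-style assumptions, a network's value grows with the number of possible connections, so each new user (or developer) raises the value for everyone already on the platform. A minimal sketch, where the quadratic and linear value functions are illustrative assumptions of mine rather than anything from the article:

```python
def direct_network_value(users: int) -> int:
    """Direct effect (telephones): value ~ number of possible
    pairwise connections between users, i.e. n*(n-1)/2."""
    return users * (users - 1) // 2

def indirect_network_value(users: int, developers: int, apps_per_dev: int = 3) -> int:
    """Indirect effect: each developer ships apps, and each app adds
    value for every user -- so user value scales with developer count."""
    return users * developers * apps_per_dev

# Doubling the user base more than doubles the direct value,
# which is what gives incumbent platforms their pull:
assert direct_network_value(200) > 2 * direct_network_value(100)
```

The same super-linear growth applies in reverse, which is why losing users or developers to a rival platform can snowball.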
Cusumano’s analysis of Microsoft’s involvement in the market is compelling (particularly given his history in this area and detailed knowledge of the firm).
I do worry, however, that the paper’s exclusive focus on current players (and hence its interest in traditional concerns about network effects and dominance) downplays the key role of integrators and small standardisation/integration services which are emerging with the aim of reducing the impact of these network effects. Unlike traditional software (where the cost of procurement, installation, commissioning and use is very high), moving between clouds is easy if the underlying application is Cloud-provider-independent. This means there is considerable pressure from users to develop a cloud-independent service model (since everyone understands the risks of lock-in).
The future might thus be an open-source platform which is wrapped to slot into other cloud platforms… a meta-cloud, perhaps, which acts on behalf of users to enable easy movement between providers. This is something Google is keen to stress at its cloud events.
I look forward to seeing the book on which the article is based.

How Cloud Computing Changes IT Outsourcing — Outsourcing — InformationWeek

via How Cloud Computing Changes IT Outsourcing — Outsourcing — InformationWeek.

This article provides a useful look at the outsourcing relationship and compares it with Cloud contracts. In particular: “Cloud computing blurs the lines between what had been conventional outsourcing and internal operations, and it will test IT’s management and control policies”. The article points out that companies are not ready for the challenges of Cloud growth, with its survey suggesting only “17% say they directly monitor the performance and uptime of all of their cloud and SaaS applications”, with a “shocking 59% relying on their vendors to monitor themselves”.

This is indeed shocking. As companies contemplate moving their operations to the Cloud, they are perhaps being lulled into a false sense of security by vendors’ promises. But as demand grows, these vendors’ facilities will be stretched and less certain.

On contracts, the article points out that cloud computing contracts are a hybrid of outsourcing, software and leasing, and represent major contractual commitments.

Finally, the more obvious points about business strategy are made – a cloud provider may be less interested in driving innovation and major technological change, as it is not as aligned to a business’s core capabilities and objectives.

Cloud Computing Presentation at GridPP Collaboration Meeting

Yesterday I gave an introductory presentation on Cloud Computing from a business perspective to a Grid meeting at Royal Holloway University. The slides are available on the GridPP website here: http://www.gridpp.ac.uk/gridpp24/CloudComputingGridPP24.ppt

Green Computing and the Cloud – SETI@home

Cloud computing hides the environmental impact of computing from the user. When we search using Google, our own PC doesn’t suddenly start to cough – the fan doesn’t ramp up, our laptop doesn’t burn through the table. But somewhere in Google’s data centres, processors are using energy to undertake the search. Google is aware of this and tries hard to reduce this cost and its environmental impact.

There is a corollary of this though. When we use peer-to-peer software, our own processor uses more power and more electricity, but we seldom notice. While perhaps tiny individually, in aggregate this can be significant. And unlike Google, few of us think about it, or try to use renewable energy to reduce the resulting CO2 emissions.

Let me demonstrate with a quick back-of-the-envelope calculation.

SETI@home (the peer-to-peer application searching for ET) has 5.2 million participants and has produced an aggregate two million years of computing time. Taking an example of power usage for basic computers, the difference between an idle computer and an in-use computer (i.e. one where SETI is doing its processing) would be around 20 watts (though perhaps more). Given SETI has run for 17,520,000,000 hours, that works out at about 350,400 Megawatt hours, or 350.4 Gigawatt hours.

The UK average consumption of gas and electricity is about 22,338 kWh per household (in 2007). In the ten years since it started, SETI@home has used about as much energy as a town of about 15,000 households would use in gas and electricity in an entire year!

Interestingly, assuming US consumer energy costs (since most participants will be in homes in the US) at about 8c per kWh, this is about $28 million of electricity! The key point is that this is only about 50c per year per participant – scarcely enough to make them change their SETI screensaver, but highly significant in the aggregate.
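For anyone who wants to check the envelope, the whole calculation above fits in a few lines (the 20-watt delta, the ten-year lifespan and the 8c tariff are the same rough assumptions used in the text):

```python
# Back-of-the-envelope check of the SETI@home figures above.
YEARS_OF_COMPUTE = 2_000_000      # aggregate computing time contributed
HOURS_PER_YEAR = 8760
EXTRA_WATTS = 20                  # in-use vs idle power delta (rough assumption)

hours = YEARS_OF_COMPUTE * HOURS_PER_YEAR        # 17,520,000,000 hours
kwh = hours * EXTRA_WATTS / 1000                 # watt-hours -> kWh
gwh = kwh / 1_000_000                            # ~350.4 GWh

UK_HOUSEHOLD_KWH = 22_338         # UK gas + electricity per household, 2007
households = kwh / UK_HOUSEHOLD_KWH              # ~15,700 households

US_CENTS_PER_KWH = 8
total_dollars = kwh * US_CENTS_PER_KWH / 100     # ~$28 million
PARTICIPANTS = 5_200_000
PROJECT_YEARS = 10
cents_per_participant_year = total_dollars * 100 / PARTICIPANTS / PROJECT_YEARS
# works out at roughly 54 cents per participant per year
```

The striking part is the last line: a cost invisible to any one participant, yet tens of millions of dollars in aggregate.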

And SETI@home has yet to discover anything alien!

G-Cloud – A talk by John Suffolk (hosted by Computer Weekly)

A couple of weeks ago I attended a talk by the UK Government’s CIO, John Suffolk (see here for more information on his role). At the talk John outlined his idea for a “G-Cloud” (government cloud) with the primary aim of reducing IT costs within government. Central government has around 130 data-centres and an estimated 9000 server rooms, with local government and quasi-government obviously adding to this figure. Reducing and consolidating these through Cloud Computing would offer significant efficiencies and cost savings. Indeed, given that around 5% of contract costs are simply for bidding/procurement, procuring fewer resources would automatically save money.

John outlined different “cloud-worlds” which he sees as important opportunities for cost saving through cloud computing in government.

1) “The testing world” – by using cloud computing to provide test-kits and environments, it is possible to reduce the huge number of essentially idle servers kept simply for testing. Utilisation of such servers is estimated at just 7%.
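To see why 7% utilisation matters: if the same test workload ran on shared cloud capacity at a healthier utilisation level, the vast majority of those physical servers would simply disappear. The 60% target figure below is my own illustrative assumption, not one John gave:

```python
import math

def servers_after_consolidation(current_servers: int,
                                current_util: float,
                                target_util: float) -> int:
    """Servers needed to carry the same workload at a higher utilisation
    (rounded up, since you can't run a fraction of a server)."""
    workload = current_servers * current_util
    return math.ceil(workload / target_util)

# 1,000 test servers at 7% utilisation, consolidated to run at 60%:
print(servers_after_consolidation(1000, 0.07, 0.60))  # prints 117
```

Roughly nine in ten machines gone, before counting the procurement and hosting overheads saved with them.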

2) “The shared world” – Many of the services offered by government require the same standardised and shared services. While these must be hosted internally they offer savings by using Cloud ideas. http://www.direct.gov for example has two data-centres at present – but could these also be used for similar services in other areas?

3) “Web Services world” – this was less clear in the talk, but centred on the exploitation of cloud offerings through web services. For example, could an “App-Store” be developed to aid government in the simple procurement of tested and assured services? Could such an App-Store provide opportunities for SMEs to supply software into government through easier procurement processes (which currently preclude many SMEs from trying)?

This idea of an App-Store is interesting. It would essentially provide a wrapper around an application, making transparent across government the pricing of the application, the contracting vehicle required to purchase it, the security level it is assured for, and details of who in government is using it. Finally, deployment tools would be included to allow applications to be rolled out simply.

John acknowledged that many details need ironing out, particularly issues of European procurement rules (and the UK’s obsession with following them to the letter of the law). While government might like to pay-per-use and contract at Crown level (so licences can be moved from department to department rather than creating new purchases), this would be a change in the way software is sold and might affect R&D, licence issues, maintenance etc.

The App-Store would be a means to crack the problem of procurement and the time it takes, and so drive costs down for both sides.

What was clear, however, was the desire to use the cloud for the lower levels of the application stack – to “disintermediate applications”, because “we don’t care about underlying back-end, only care about the software service”. Government can use a common bottom of the stack.

Indeed, it was discussed that a standard design for a government desktop-PC might be an “application” within the App-Store, thereby centralising this design and saving the huge cost of individual designs per department (see http://www.cabinetoffice.gov.uk/media/317444/ict_strategy4.pdf#page=23 for more details).

Finally, the cloud offers government the opportunity to scale operations to meet demand (for example, MyGov pages when new announcements are made, or Treasury when the budget is announced). However, such a scalable service has its own costs and might not be justified in the budgeting. While we look to the cloud to stop web-sites going down, there is also a cost to providing such scalable support for the few days a year it is needed – cloud or no cloud.

Thank you to Computer Weekly for inviting me to this event!

Google Atmosphere

Google Atmosphere Event

I attended this great event at which Nicholas Carr, Werner Vogels and Geoffrey Moore presented.

My notes from the meeting follow (these are not verbatim – mistakes may have been made).

—–

Nicholas Carr:

Drawing on his book (http://www.nicholasgcarr.com/bigswitch/), he argued that the mainframe was an “impersonal computer utility” and that “the power works [that is, the belts and pulleys of steam power] in the 1900 factories is the ERP/Oracle/SAP solutions of today”.

Christensen’s innovator’s dilemma is introduced to argue that Cloud is a disruptive technology which will punch through our existing models of IT trajectory.

The rest of the talk was very much aligned with the book – though no less useful because of this. I will not summarise this here though.

Werner Vogels, CTO/VP of Amazon, argued that their involvement is not about selling unused server capacity (as is often suspected of a company with massive demand at certain peak periods). Rather, it is because Cloud capitalises on their focus on providing scaled reliability at low cost/high volume. This is the essence of all Amazon business, he argues.

As such he states that the 2008 Gartner definition misses the point – Cloud is about “on demand services” and “pay as you go”.

Amazon provides an enterprise platform for eCommerce (which Marks and Spencer and Mothercare are using), so it is not new to enterprise applications. By using it, these companies can scale up their sales operations during key periods.

Amazon also announced its provision of Virtual Private Clouds (subnets of a company’s data-centre hosted by a cloud provider and accessed by VPN).

Another interesting example of the use of AWS was the Guardian newspaper’s review of MPs’ expenses. Its competitor (the Daily Telegraph) had had lots of time to do detailed analysis. In contrast, the Guardian hosted the data on the cloud and then invited individuals (the wisdom of the crowd) to look at their own MPs’ expenses. This led to very quick response and analysis.

Finally, Indiana Speedway provides multimedia streaming of its races using AWS while a race is running – at other times the service remains dormant.

Other companies using AWS: ESPN, Playfish, Autodesk, Pfizer (using VPC), NetFlix and LiveStream. Finally, Intuit (a tax service which scales on April tax day).

In response to questions about security, the answer was “how secure is the corporate data centre?” – the cloud can respond to a Denial of Service attack better than most corporates can. Security innovation is moving ahead in the cloud – e.g. Subnet Direction.

Amazon is providing tools to let users know the location of their data, and it has provided separated data-centres so that EU data need not be stored outside the EU.

—–

More Soon