Fordism and the Cloud

In a recent article, Dustin Owens (2010) argues that elasticity defines the benefit of cloud computing. He states: “Elasticity could bring to the IT infrastructure what Henry Ford brought to the automotive industry with assembly lines and mass production: affordability and substantial improvements on time to market” (p. 48). The article is useful and focuses primarily on security, but it was the comparison with Henry Ford which got me thinking.

It is a bold statement and deserves further analysis. In particular, it is unclear what Ford brought the motor industry – certainly increased penetration, increased usage and increased availability, all of which are positive. Arguably, though, it also brought urban sprawl, oil-dependence, reduced wages and Taylorism. One can imagine similar problems with the cloud. Urban sprawl might find its parallel in a flattening of organisational sizes and an increase in risk: companies whose data-centre provides competitive advantage will see a growing landscape of small competitors capitalising on the cloud to compete – the sprawl. Our oil-dependence will similarly remain – cloud computing hides the electricity and environmental impact of our actions in data-centres hidden from view. Purchasers are unlikely to know what percentage of their costs are electricity – and are unlikely to care. Finally, reduced wages and Taylorism. Prior to Ford, those involved in developing cars were highly skilled and worked across the line – Ford reduced this movement of skill, and Frederick Taylor developed this into scientific management. One can see similarities in the cloud provider, with increased specialism among staff within these very large data-centres. With this come risks – the generalist is better able to respond to radical architectural change and innovation. The generalist also has a more interesting job (a failing of scientific management). All this is speculation at the moment – I await my first Model T Cloud.

On another level, however, this comparison is interesting because it is worth remembering that Ford was overtaken by General Motors for a simple reason: Henry Ford was against debt and demanded people pay in cash, whereas GM realised that borrowing to buy a GM car was beneficial. With the car your earning potential rose, as you could travel for work and were thus better able to afford the car.

In cloud computing the same might also be true. The argument behind cloud computing has always been that start-up ventures do not need to purchase expensive IT equipment – they can focus on OpEx, not CapEx. Similarly, SaaS offers reduced CapEx costs. But one might also imagine a growth of financial services industries around the cloud: for example, providing short-term loans to allow small companies to ramp up websites in response to incredible demand (and perhaps insuring against this); or allowing small media enterprises to rent cycles for rendering films as a percentage of future profits. Finally, and perhaps most importantly, if computing is a commodity we are likely to see spot-markets and commodity markets spawn. Can we imagine CDOs developing based not on sub-prime mortgages but on poorly used processor cycles within the cloud… or imagine insuring against a denial-of-service attack, such that when one occurs you can ramp up website services to respond but not have to pay for the processor cycles! I can see many small companies taking out such insurance for their websites (if anyone profits from this – then donations received with thanks 🙂 ).

——

Owens, D. (2010) “Securing Elasticity in the Cloud”, Communications of the ACM 53(6): 48-51. doi: http://doi.acm.org/10.1145/1743546.1743565

An Over Simplistic Utility Model

Brynjolfsson, E., P. Hofmann, et al. (2010) “Economic and Business Dimensions: Cloud Computing and Electricity – Beyond the Utility Model”, Communications of the ACM 53(5): 32-34.

—-

This paper argues that technical issues associated with innovation, scale, and geography will confront those attempting to capitalise on utility computing. They take the utility model of computing (i.e. that cloud computing is analogous to the electricity market) and identify key challenges.

In particular they identify the following technical challenges:

1)    The pace of innovation in IT – managing this pace of change requires creative expertise and innovation (unlike utilities such as electricity which, they argue, are stable).

2)    The limits of scale – parallelisable problems are only a subset of problems; scalability of databases has limits within architectures; and APIs (e.g. using SQL) are difficult for high-volume transaction systems. Further, large companies can benefit from private clouds, with little advantage, and greater risk, in moving to the public cloud.

3)    Latency – the speed of light limits communication, so latency remains a problem. For many applications, performance, convenience and security considerations will demand local resources. [While not mentioned in the article, it is interesting to note that this problem is being attacked by http://www.akamai.com/, who specialise in reducing the problems of network latency through their specialist network.]

They also identify the following business challenges:

1)    Complementarities and co-invention: “Computing is still in the midst of an explosion of innovation and co-invention. Firms that simply replace corporate resources with cloud computing, while changing nothing else, are doomed to miss the full benefits of the new technology” (p. 34). It is the reinvention of services which is key to the success of the cloud. IT-enabled businesses reshape industries – e.g. Apple quadrupled revenue by moving from perpetual licence to pay-per-use in iTunes, but this demanded tight integration of ERP and billing which would have been difficult within the cloud given their volumes.

2)    Lock-in and Interoperability: Regulation controlled energy monopolies, and electrons are fungible. Yet for computing to operate like electricity will require “radically different management of data than what is on anyone’s technology roadmap”. Information is not electrons – cloud offerings will not be interchangeable. “Business processes supported by enterprise computing are not motors or light-bulbs”.

3) Security – we are not as concerned about electrons as we are about information. Electricity needs no regulators, laws or audits of its content; information does, and new security issues will need to be faced (see Owens (2010) for an interesting debate on security).

—-

Owens, D. (2010) “Securing Elasticity in the Cloud”, Communications of the ACM 53(6): 48-51. doi: http://doi.acm.org/10.1145/1743546.1743565

Cusumano’s view – Cloud Computing and SaaS as New Computing Platforms

Cusumano, M. (2010). “Cloud Computing and SaaS as New Computing Platforms.” Communications of the ACM 53(4): 27-29. http://doi.acm.org/10.1145/1721654.1721667
This is an interesting and well-argued analysis of the concept of cloud and SaaS as a platform. The paper concentrates on lock-in and network effects, and the risk they pose given the dominance of certain players in the market, in particular Salesforce, Microsoft, Amazon and Google.
Direct network effects (the more telephones people have, the more valuable they become) and indirect network effects (the more popular a platform is with developers, the more attractive the platform is for other developers and users) are key to understanding the development of the cloud. Central to the article’s potential importance is the analysis of how integrated web services (and thus integrated software platforms) might create conflicts of interest, network effects and hence risks.
Cusumano’s analysis of Microsoft’s involvement in the market is compelling (particularly given his history in this area and detailed knowledge of the firm).
I do worry, however, that the paper’s exclusive focus on current players (and hence the interest in traditional concerns about network effects and dominance) downplays the key role of integrators and small standardisation/integration services which are emerging with the aim of reducing the impact of these network effects. Unlike traditional software (where the cost of procurement, installation, commissioning and use is very high), mobility between clouds is easy if the underlying application is cloud-provider-independent. This means there is considerable pressure from users to develop a cloud-independent service model (since everyone understands the risks of lock-in).
The future might thus be an open-source platform which is wrapped to slot into other cloud platforms… a meta-cloud, perhaps, which acts on behalf of users to enable easy movement between providers. This is something Google is keen to stress at its cloud events.
I look forward to seeing the book on which the article is based.

How Cloud Computing Changes IT Outsourcing — Outsourcing — InformationWeek

via How Cloud Computing Changes IT Outsourcing — Outsourcing — InformationWeek.

This article provides a useful look at the outsourcing relationship and compares it with cloud contracts. In particular: “Cloud computing blurs the lines between what had been conventional outsourcing and internal operations, and it will test IT’s management and control policies”. The article points out that companies are not ready for the challenges of cloud growth, with their survey suggesting only “17% say they directly monitor the performance and uptime of all of their cloud and SaaS applications”, and a “shocking 59%” relying on their vendors to monitor themselves.

This is indeed shocking. As companies contemplate moving their operations to the cloud they are perhaps being lulled into a false sense of security by vendors’ promises. But as demand grows these vendors’ facilities will be stretched, and less certain.

On contracts, the article points out that a cloud computing contract is a hybrid of outsourcing, software licensing and leasing, and is a major contractual commitment.

Finally, the more obvious points about business strategy are made – pointing out that a cloud provider may be less interested in driving innovation and major technological change, as it is not aligned to the business’s core capabilities and objectives.

Green Computing and the Cloud – SETI@home

Cloud computing hides the environmental impact of computing from the user. When we search using Google our own PC doesn’t suddenly start to cough – the fan doesn’t ramp up, our laptop doesn’t burn through the table. But somewhere in Google processors are using energy to undertake the search. Google is aware of this and tries hard to reduce this cost and its environmental impact.

There is a corollary to this, though. When we use peer-to-peer software our processor uses more power and more electricity, but we seldom notice. While tiny individually, in aggregate this can be significant. And unlike Google, few of us think about it, or try to use renewable energy to reduce the resulting CO2 emissions.

Let me demonstrate with a quick back-of-the-envelope calculation.

SETI@home (the peer-to-peer application searching for ET) has 5.2 million participants and has produced an aggregate two million years of computing time. Taking an example of power usage for basic computers, the difference between an idle computer and an in-use computer (i.e. one where SETI is doing its processing) would be around 20 watts (though perhaps more). Given SETI has accumulated 17,520,000,000 hours of processing (two million years × 8,760 hours per year), that works out at about 350,400 megawatt-hours, or 350.4 gigawatt-hours.

The UK average consumption of gas and electricity is about 22,338 kWh per household (in 2007). So in the ten years since it started, SETI@home has used about as much energy as a town of some 15,700 households would use in gas and electricity in an entire year!

Interestingly, assuming US consumer energy costs (since most participants will be in homes in the US) of about 8¢ per kWh, this is about $28 million of electricity! The key point is that this is only about 50¢ per year per participant – scarcely enough to make them change their SETI screensaver, but highly significant in aggregate.
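For anyone who wants to check the arithmetic, here is the back-of-the-envelope calculation as a short Python sketch. The 20-watt idle-vs-busy delta, the 2007 UK household figure and the 8¢/kWh price are the same rough assumptions as above, not measured values:

```python
# Back-of-the-envelope check of the SETI@home energy figures.
COMPUTE_YEARS = 2_000_000     # aggregate computing time contributed
PARTICIPANTS = 5_200_000      # number of SETI@home participants
EXTRA_WATTS = 20              # assumed idle-to-busy power delta per PC
UK_HOUSEHOLD_KWH = 22_338     # avg UK gas + electricity per household, 2007
US_PRICE_PER_KWH = 0.08       # assumed US consumer price, dollars
PROJECT_YEARS = 10            # roughly how long SETI@home had been running

hours = COMPUTE_YEARS * 8_760                  # 17.52 billion hours
energy_kwh = hours * EXTRA_WATTS / 1_000       # 350.4 million kWh
energy_gwh = energy_kwh / 1_000_000            # 350.4 GWh

households = energy_kwh / UK_HOUSEHOLD_KWH     # ~15,700 households' annual use
total_cost = energy_kwh * US_PRICE_PER_KWH     # ~$28 million
per_participant_year = total_cost / PARTICIPANTS / PROJECT_YEARS  # ~$0.54

print(f"{energy_gwh:.1f} GWh, ~{households:,.0f} households, "
      f"${total_cost / 1e6:.1f}M, ${per_participant_year:.2f}/participant/yr")
```

Running it prints roughly “350.4 GWh, ~15,687 households, $28.0M, $0.54/participant/yr” – which is where the figures above come from.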

And SETI@home has yet to discover anything alien!

G-Cloud – A talk by John Suffolk (hosted by Computer Weekly)

A couple of weeks ago I attended a talk by the UK Government’s CIO, John Suffolk (see here for more information on his role). At the talk John outlined his idea for a “G-Cloud” (government cloud), with the primary aim of reducing IT costs within government. Central government has around 130 data-centres and an estimated 9,000 server rooms, with local government and quasi-government obviously adding to this figure. Reducing and consolidating these through cloud computing would offer significant efficiencies and cost savings. Indeed, given that 5% of contract costs go simply on bidding/procurement, procuring fewer resources would automatically save money.

John outlined different “cloud-worlds” which he sees as important opportunities for cost saving through cloud computing in government.

1) “The testing world” – by using cloud computing to provide test-kits and environments it is possible to reduce the huge number of essentially idle servers kept simply for testing. For such servers utilisation is estimated at 7%.

2) “The shared world” – many of the services offered by government require the same standardised and shared services. While these must be hosted internally, they offer savings by using cloud ideas. http://www.direct.gov, for example, has two data-centres at present – but could these also be used for similar services in other areas?

3) “Web services world” – this was less clear in the talk, but centred on the exploitation of cloud offerings through web services. For example, could an “App-Store” be developed to aid government in simple procurement of tested and assured services? Could such an App-Store provide opportunities for SMEs to supply software to government through easier procurement processes (which currently preclude many SMEs from trying)?

This idea of an App-Store is interesting. It would essentially provide a wrapper around an application to make transparent across government the pricing of an application, the contracting vehicle required to purchase it, the security level it is assured for, and details of who in government is using it. Finally, deployment tools would be included to allow applications to be rolled out simply.

John acknowledged that many details need ironing out, particularly issues of European procurement rules (and the UK’s obsession with following them to the letter of the law). While government might like to pay-per-use and contract at Crown level (so licences can be moved from department to department rather than creating new purchases), this would be a change in the way software is sold and might affect R&D, licence issues, maintenance etc.

The App-Store would be a means to crack the problem of procurement and the time it takes, and so drive costs down for both sides.

What was clear, however, was the desire to use the cloud for the lower levels of the application stack – to “disintermediate applications”, because “we don’t care about the underlying back-end, only about the software service”. Government can use a common bottom of the stack.

Indeed, it was discussed that a standard design for a government desktop PC might be an “application” within the App-Store, so centralising this design and saving the huge costs of individual design per department (see http://www.cabinetoffice.gov.uk/media/317444/ict_strategy4.pdf#page=23 for more details).

Finally, the cloud offers government the same opportunities to scale operations to meet demand (for example MyGov pages when new announcements are made, or the Treasury when the budget is announced). However, such a scalable service also has costs and might not be justified in the budgeting. While we look to the cloud to stop websites going down, there is a cost to providing such scalable support for the few days a year it is needed – cloud or no cloud.

Thank you to Computer Weekly for inviting me to this event!

Google Atmosphere

Google Atmosphere Event

I attended this great event at which Nicholas Carr, Werner Vogels and Geoffrey Moore presented.

My notes on the meeting were as follows (these are not verbatim – mistakes may have been made).

—–

Nicholas Carr:

Drawing on his book (http://www.nicholasgcarr.com/bigswitch/) he argued that the mainframe was an “impersonal computer utility”, and that “the power works [that is, the belts and pulleys of steam power] in the 1900 factories are the ERP/Oracle/SAP solutions of today”.

Christensen’s innovator’s dilemma was introduced to argue that cloud is a disruptive technology which will punch through our existing models of IT trajectory.

The rest of the talk was very much aligned with the book – though no less useful because of this. I will not summarise this here though.

Werner Vogels, CTO/VP of Amazon, argued that their involvement is not about selling unused server capacity (as often suspected of a company which has massive demand at certain peak periods). Rather, it is because the cloud capitalises on their focus on providing scaled reliability at low cost/high volume. This, he argues, is the essence of all Amazon business.

As such, he stated that the 2008 Gartner definition misses the point – cloud is about “on-demand services” and “pay as you go”.

Amazon provides an enterprise platform for eCommerce (which Marks and Spencer and Mothercare are using); it is not new to enterprise applications. By doing this these companies can scale up their sales operations during key periods.

Amazon also announced its provision of Virtual Private Clouds (subnets of a company’s data-centre hosted by a cloud provider and accessed by VPN).

Another interesting example of the use of AWS was the Guardian newspaper’s review of MPs’ expenses. Their competitor (the Daily Telegraph) had had lots of time to do detailed analysis. In contrast, the Guardian hosted the data on the cloud and invited individuals (the wisdom of the crowd) to look at their own MP’s expenses. This led to very quick response and analysis.

Finally, the Indiana Speedway provides multimedia streaming of its races using AWS while a race is running – at other times the service remains dormant.

Other companies using AWS: ESPN, Playfish, Autodesk, Pfizer (using VPC), Netflix and Livestream. Finally Intuit (a tax service which scales on April’s tax day).

In response to questions about security, the answer was “how secure is the corporate data centre?” – and use of the cloud lets one respond to a denial-of-service attack better than corporates can. Security innovation is moving ahead in the cloud – e.g. Subnet Direction.

Amazon is providing tools to let users know the location of their data, and has provided separate data-centres so that EU data is not stored outside the EU.

—–

More Soon