Globalization Today “Cloud as Technology – What Kind of Transformation”

Read our latest article on Cloud Technology (based on our earlier Accenture reports) in Globalization Today –

(pages 26-33)

Our Fourth Report: Innovation: Cloud and the Future of Business: From Costs to Innovation –

Our fourth report on Cloud Computing is now available… This report looks at the future of business, mapping out the concept of the Cloud Corporation, and discusses the fragmentation and redevelopment of the technology supply industry. In particular we discuss how the industry may become layered and increasingly specialised – with organisations benefiting from the business agility and better alignment this will create.

Click on the image below for the full report.

Cloud and the Future of Business: From Costs to Innovation - Part Four: Innovation

Third Report – The Impact of Cloud Computing

The third report in our series for Accenture is now available by clicking the image below:

Cloud and the Future of Business: From Costs to Innovation - Part Three: Impact



In this report we consider the potential short- and long-term impact of Cloud Computing on stakeholders. Using our survey of over 1,000 executives, supported by qualitative interviews with key Cloud stakeholders, we assess this impact on organisational performance, outsourcing and the supply industry in both the short and long term.

The 7 capabilities of Cloud Computing – a review of a recent MISQE article on Cloud

Iyer, B., and Henderson, J. “Preparing for the Future: Understanding the Seven Capabilities of Cloud Computing,” MIS Quarterly Executive (9:2) 2010, pp 117-131.


In this article Bala Iyer and John Henderson analyse vendor offerings to identify “seven capabilities” of Cloud services that organizations should consider before implementing Cloud. For them Cloud consists of a stack of IaaS, PaaS and the Application. Interestingly they include Collaboration within their stack – reflective of their focus on Cloud’s association with mash-ups as the service provided to the user. This seems useful but confuses things a little – it is not clear to what extent these components of the stack are integrated or exploited within a particular cloud offering.

The seven capabilities are then described:

1) Controlled Interface – the capacity for the integrated infrastructure to be responsive to change; in particular the capability of APIs to allow the innovation of applications and services on top of the platform, and the demands of the platform owner in managing/controlling that innovation. This seems a very important point, as platform owners’ business models depend upon the exploitation of the platform – ranging from an open platform (like MS-Windows), where Microsoft makes money from selling the initial product licence, to closed platforms (like Apple’s iPhone), where money is extracted from application purchases on top.

2) Location Independence – the capacity for services and information assets to be controlled/exploited without reference to their location. This hooks into a range of themes – from the technical architecture of systems and their capacity for integration, to legislative demands for locations and safe-harbouring of information.

3) Sourcing Independence – this is connected with the concern for lock-in and the desire of organisations to move their applications between cloud platforms. They usefully highlight, however, that lock-in should be evaluated within the company firewall as much as outside it: companies should evaluate their ability to move between any IT sources, and their IT services should be independent of the platform used.

4) Ubiquitous Access – this refers to the ability of a cloud service to be accessed from differing devices and platforms globally. However they rightly extend this to include access to application programming interfaces, not simply web-site portal pages.

5) Virtual Business Environments – similar to the Virtual Machine, this perspective virtualises and integrates tools which support specific major business capabilities. Another way to look at it is as a suite of cloud services and workflows which allow the realisation of business processes/functions within a cloud-type environment. By considering such VBEs the paper hints at Business Process as a Service and the possibility of cloud services which transcend basic service provision and link directly to business process – allowing the scalability and elasticity of cloud to drive business-process innovation.

6) Addressability and Traceability – this calls for the ability to verify the history, location and application of data in the cloud for traceability and compliance purposes. I would however argue that it is not simply a matter of ensuring traceability, but of being able to manage the traces recorded. Our inherent assumption that traceability is always desirable is incorrect – as Apple is learning through the problems caused by its desire to trace and record Wi-Fi and location data on iPhones, or the legal challenges and penalties against Google for its (albeit unintended) recording of users’ WiFi signals within its mapping activity in Europe. Let’s remember that sometimes it is better to forget.

7) Rapid Elasticity – the self-service capability of scaling services up and down. Here the authors make an interesting point, highlighting the need for elasticity in the IT service AND in the contract: simply having scalable services, with pricing which does not reflect that scalability, is challenging.
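This last point is easy to illustrate with some simple arithmetic. The sketch below (all figures and tariffs invented for illustration) compares a pay-per-use "elastic" contract against a cheaper fixed rate which must nevertheless be billed at peak capacity – showing why contract elasticity matters as much as service elasticity:

```python
# Hypothetical illustration: elastic pricing vs a fixed contract for a
# workload whose demand varies by hour. All figures are invented.

hourly_demand = [10, 12, 15, 80, 95, 40, 12, 10]  # server-hours needed, per hour

ELASTIC_RATE = 0.50  # pay-per-use price per server-hour (higher unit price)
FIXED_RATE = 0.30    # cheaper rate, but capacity is billed at the peak

# Elastic contract: pay only for what is actually used each hour.
elastic_cost = sum(d * ELASTIC_RATE for d in hourly_demand)

# Fixed contract: capacity (and billing) must cover the peak at all times.
peak = max(hourly_demand)
fixed_cost = peak * FIXED_RATE * len(hourly_demand)

print(f"elastic: ${elastic_cost:.2f}, fixed-at-peak: ${fixed_cost:.2f}")
```

Despite the higher unit price, the elastic contract is cheaper here because demand is spiky – exactly the situation in which scalable services with inflexible pricing lose their value.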

These are important dimensions of the cloud – and add to the corpus of our knowledge. What is useful is that they are drawn from an analysis of vendor offerings, and further that they provide a road-map for strategy. I would urge those interested to get hold of the paper, which goes into much more detail on strategic approaches to Cloud and the need for specific IT skills to manage such services. What is particularly refreshing about the article is its focus on the mashing together of services – treating Cloud as a patchwork of services rather than focusing too heavily on the individual components.

Cloud Computing – it’s so ‘80s.

For Vint Cerf[1], the father of the internet, Cloud Computing represents a return to the days of the mainframe, when service bureaus rented their machines by the hour to companies who used them for payroll and other similar tasks. Such comparisons focus on the architectural similarities between centralised mainframes and Cloud Computing – cheaply connecting to an expensive resource “as a service” through a network. But cloud is more about providing already low-cost computing at even lower cost, in bulk, through data-centres. A better analogy than the mainframe, then, is the introduction of the humble micro-computer and the revolution it brought to corporate computing in the early 1980s.

When micros were launched many companies operated mini or mainframe computers which were cumbersome, expensive and needed specialist IT staff to manage them[1]. Like Cloud Computing today, when compared with these existing computers the new micros offered ease of use, low cost and apparently low risk, which appealed to business executives seeking to cut costs, or to SMEs unable to afford minis or mainframes[2]. Usage exploded, and in the period from the launch of the IBM PC in 1981 to 1984 the proportion of companies using PCs increased dramatically from 8% to 100%[3] as the cost and opportunity of the micro became apparent. Again, as with the cloud[4], these micros were marketed directly to business executives rather than IT staff, and were accompanied by a narrative that they would enable companies to dispense with heavy mainframes and the IT department for many tasks – doing them quicker and more effectively. Surveys from that time suggested accessibility, speed of implementation, response-time, independence and self-development were the major advantages of the PC over the mainframe[5] – easily recognisable in the hyperbole surrounding cloud services today. Indeed Nicholas Carr’s recent pronouncement of the end of corporate IT[6] would probably have resonated well in the early 1980s, when the micro looked set to replace the need for corporate IT. Indeed, in 1980 over half the companies in one sample claimed no IT department involvement in the acquisition of PCs[3].

But problems emerged from this wholesale, uncontrolled adoption of the micro, and by 1984 only 2% of those sampled did not involve the IT department in PC acquisition[3]. The proliferation of PCs meant that in 1980 as many as 32% of IT managers were unable to estimate the proportion of PCs within their company[3], and few could provide any useful support for those who had purchased them.

Micros ultimately proved cheap individually but expensive en masse[2] as their use exploded and new applications for them were discovered. In addition to this increased use, IT professionals worried about the lack of documentation (and thus poor opportunity for maintenance), poor data-management strategies, and security issues[7]. New applications proved incompatible with others (“the time-bomb of incompatibility”[2]), and differing system platforms (e.g. CP/M, UNIX, MS-DOS, OS/2, Atari, Apple…) led to redundancy and communication difficulties between services, and to the failure of many apparently unstoppable software providers – household names such as Lotus, Digital Research, WordStar, Visi and dBase[8].

Ultimately it was the IT department which brought sense to these machines and began to connect them together for useful work using compatible applications – with the emergence of companies such as Novell and Microsoft to bring order to the chaos[8].

Drawing lessons from this history for Cloud Computing is useful. The strategic involvement of IT services departments is clearly required. Such involvement should focus not on the current cost-saving benefits of the cloud, but on the strategic management of a potentially escalating use of Cloud services within the firm. IT services must get involved in the narrative surrounding the cloud – ensuring their message is neither overly negative (and thus appearing to have a vested interest in the status quo) nor overly optimistic, as potential problems exist. Either way, the lessons of the microcomputer are relevant again today. Indeed Keen and Woodman argued in 1984 that companies needed the following four strategies for the micro:

1) “Coordination rather than control of the introduction.

2) Focusing on the longer-term technical architecture for the company’s overall computing resources, with personal computers as one component.

3) Defining codes for good practice that adapt the proven disciplines of the [IT industry] into the new context.

4) Emphasis on systematic business justification, even of the ‘soft’ and unquantifiable benefits that are often a major incentive for and payoff of using personal computers”[2]

It would be wise for companies contemplating a move to the cloud to consider this advice carefully – replacing “personal computer” with “cloud computing” throughout.

(c)2011 Will Venters, London School of Economics. 

[1] P. Ceruzzi, A History of Modern Computing. Cambridge, MA: MIT Press, 2002.

[2] P. G. W. Keen and L. Woodman, “What to do with all those micros: First make them part of the team,” Harvard Business Review, vol. 62, pp. 142-150, 1984.

[3] T. Guimaraes and V. Ramanujam, “Personal Computing Trends and Problems: An Empirical Study,” MIS Quarterly, vol. 10, pp. 179-187, 1986.

[4] M. Benioff and C. Adler, Behind the Cloud: The Untold Story of How Went from Idea to Billion-Dollar Company and Revolutionized an Industry. San Francisco, CA: Jossey-Bass, 2009.

[5] D. Lee, “Usage Patterns and Sources of Assistance for Personal Computer Users,” MIS Quarterly, vol. 10, pp. 313-325, 1986.

[6] N. Carr, “The End of Corporate Computing,” MIT Sloan Management Review, vol. 46, pp. 67-73, 2005.

[7] D. Benson, “A Field Study of End User Computing: Findings and Issues,” MIS Quarterly, vol. 7, pp. 35-45, 1983.

[8] M. Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry. Cambridge, MA: MIT Press, 2003.

List of 100 Cloud Vendors

A student of mine forwarded me the following “top 100” cloud vendors list. I take such ratings with a pinch of salt, but it is useful to see a list of companies who are significant players in this market. As a resource for ideas on different vendors’ offerings it may prove useful – once you wade through the advertising to read it, that is!


Cloud and the Future of Business: From Costs to Innovation

I have not been updating this blog for a while as I have been busy writing commercial papers on Cloud Computing. The first of these, for Accenture, has just been published and is available here

The report outlines our “Cloud Desires Framework”, in which we aim to explain the technological direction of Cloud in terms of four dimensions of the offerings – Equivalence, Abstraction, Automation and Tailoring.

Equivalence: The desire to provide services which are at least equivalent in quality to that experienced by a locally running service on a PC or server.

Abstraction: The desire to hide unnecessary complexity of the lower levels of the application stack.

Automation: The desire to automatically manage the running of a service.

Tailoring: The desire to tailor the provided service for specific enterprise needs.

(c) Willcocks, Venters, Whitley 2011.

By applying these dimensions to the different types of cloud service (SaaS, PaaS, IaaS and hosted services – often ignored, but crucially Cloud-like) it is possible to distinguish the different benefits of each from the “value-add” differences. Crucially, the framework allows simple comparison between services offered by different companies by focusing on the important desires rather than on unimportant technical differences.
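One way to picture such a comparison is as a simple scoring matrix over the four desires. The sketch below is purely illustrative – the offering names, scores and weights are invented, not drawn from the report:

```python
# Hypothetical sketch: the Cloud Desires Framework as a comparison matrix.
# Offering names and 0-5 scores are invented for illustration only.

DESIRES = ["equivalence", "abstraction", "automation", "tailoring"]

offerings = {
    "IaaS offering A": {"equivalence": 4, "abstraction": 2, "automation": 3, "tailoring": 4},
    "PaaS offering B": {"equivalence": 3, "abstraction": 4, "automation": 4, "tailoring": 3},
    "SaaS offering C": {"equivalence": 3, "abstraction": 5, "automation": 5, "tailoring": 2},
}

def compare(offerings, weights):
    """Rank offerings by a weighted score across the four desires."""
    scored = {
        name: sum(scores[d] * weights[d] for d in DESIRES)
        for name, scores in offerings.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# An enterprise that cares most about tailoring weights that desire highest.
weights = {"equivalence": 1, "abstraction": 1, "automation": 1, "tailoring": 3}
for name, score in compare(offerings, weights):
    print(f"{name}: {score}")
```

The point of the framework, reflected here, is that the comparison is made along the desires themselves rather than along each vendor's technical particulars.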

Take a look at the report – and let me know what you think!


This is a company to watch – they have two products:

VPN-Cubed provides a virtual network on top of the network of a cloud provider. This enables users to keep a standard networking layer which remains consistent even if the cloud provider’s network changes (e.g. IP address changes).

Elastic Server allows real-time assembly and management of software components. This allows the quick creation of easy-to-use applications which can readily be deployed to various cloud services.

However, it is the fact that together these services allow virtual machines and cloud services to be moved between cloud IaaS providers without significant migration work which is important. If their products live up to the promise then users can move to the cheapest cloud provider with ease, driving costs down to commodity-supplier levels… and creating the spot market for cloud.
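The spot-market idea can be sketched very simply: once workloads are genuinely portable between providers, a broker need only place them with whichever provider is currently cheapest. The provider names and prices below are invented for illustration:

```python
# Hypothetical sketch of the "spot market" idea: if virtual machines can be
# moved between IaaS providers cheaply, a broker simply places the workload
# with the cheapest provider each period. Providers and prices are invented.

def cheapest_provider(spot_prices):
    """Return (provider, price) for the lowest current price per VM-hour."""
    return min(spot_prices.items(), key=lambda kv: kv[1])

spot_prices = {"provider-a": 0.12, "provider-b": 0.09, "provider-c": 0.11}

provider, price = cheapest_provider(spot_prices)
print(f"place workload with {provider} at ${price:.2f}/VM-hour")
```

Of course the hard part is not the selection but the portability itself – which is precisely what products like VPN-Cubed and Elastic Server promise to supply.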