Latest article published: Strategic Outsourcing: An International Journal | Cloud Sourcing and Innovation: Slow Train Coming? A Composite Research Study

The latest article from our long-running Cloud Computing research stream has just been published…

Leslie Willcocks, Will Venters and Edgar A. Whitley (2013) “Cloud Sourcing and Innovation: Slow Train Coming? A Composite Research Study”, Strategic Outsourcing: An International Journal, Vol. 6, Iss. 2.

ABSTRACT:

Purpose – Although cloud computing has been heralded as driving the innovation agenda, there is growing evidence that cloud is actually a “slow train coming”. The purpose of this paper is to seek to understand the factors that drive and inhibit the adoption of cloud particularly in relation to its use for innovative practices.

Design/methodology/approach – The paper draws on a composite research base including two detailed surveys and interviews with 56 participants in the cloud supply chain undertaken between 2010 and 2013. The insights from this data are presented in relation to a set of antecedents to innovation and a cloud sourcing model of collaborative innovation.

Findings – The paper finds that while some features of cloud computing will hasten the adoption of cloud and its use for innovative purposes by the enterprise, there are also clear challenges that need to be addressed before cloud can be successfully adopted. Interestingly, our analysis highlights that many of these challenges arise from the technological nature of cloud computing itself.

Research limitations/implications – The research highlights a series of factors that need to be better understood for the maximum benefit from cloud computing to be achieved. Further research is needed to assess the best responses to these challenges.

Practical implications – The research suggests that enterprises need to undertake a number of steps for the full benefits of cloud computing to be achieved. It suggests that collaborative innovation is not necessarily an immediate consequence of adopting cloud computing.

Originality/value – The paper draws on an extensive research base to provide empirically informed analysis of the complexities of adopting cloud computing for innovation.

Our Fifth Report is out – Management implications of the Cloud

The fifth report in our Cloud Computing series for Accenture has just been published. This report looks at the impact Cloud Computing will have on the management of the IT function, and thus on the skills needed by all involved in the IT industry. The report begins by analysing the impact Cloud might have in comparison to existing outsourcing projects. It then considers the core capabilities which must be retained in a “cloud future”, how these capabilities might be managed, and the role of systems integrators in managing the Cloud.

Please use the comments form to give us feedback!

Cloud and the future of Business 5 – Management.

Understanding the Business Impacts of Cloud Computing | The European Business Review

Understanding the Business Impacts of Cloud Computing | The European Business Review.

Read an article on the business impacts of Cloud Computing that I jointly wrote with colleagues at the LSE, published in The European Business Review.

Cloud Computing – it’s so ’80s.

For Vint Cerf[1], the father of the internet, Cloud Computing represents a return to the days of the mainframe, when service bureaus rented their machines by the hour to companies who used them for payroll and other similar tasks. Such comparisons focus on the architectural similarities between centralised mainframes and Cloud Computing – cheaply connecting to an expensive resource “as a service” through a network. But cloud is less about renting an expensive resource and more about providing already low-cost computing (albeit in bulk, through data-centres) at even lower cost. A better analogy than the mainframe, then, is the introduction of the humble micro-computer and the revolution it brought to corporate computing in the early 1980s.

When micros were launched, many companies operated using mini or mainframe computers, which were cumbersome, expensive and needed specialist IT staff to manage them[1]. Like Cloud Computing today, the new micros, when compared with these existing computers, offered ease of use, low cost and apparently low risk, which appealed to business executives seeking to cut costs, or to SMEs unable to afford minis or mainframes[2]. Usage exploded: in the period from the launch of the IBM PC in 1981 to 1984, the proportion of companies using PCs increased dramatically from 8% to 100%[3] as the cost and opportunity of the micro became apparent. Again, as with the cloud[4], these micros were marketed directly to business executives rather than IT staff, and were accompanied by a narrative that they would enable companies to dispense with heavy mainframes and the IT department for many tasks – doing them quicker and more effectively. Surveys from that time suggested accessibility, speed of implementation, response time, independence and self-development were the major advantages of the PC over the mainframe[5] – easily recognisable in the hyperbole surrounding cloud services today. Indeed, Nicholas Carr’s more recent pronouncement of the end of corporate IT[6] would probably have resonated well in the early 1980s, when the micro looked set to replace the need for corporate IT. Indeed, in 1980 over half the companies in one sample claimed no IT department involvement in the acquisition of PCs[3].

But problems emerged from the wholesale, uncontrolled adoption of the micro, and by 1984 only 2% of those sampled did not involve the IT department in PC acquisition[3]. The proliferation of PCs meant that in 1980 as many as 32% of IT managers were unable to estimate the proportion of PCs within their company[3], and few could provide any useful support for those who had purchased them.

Micros ultimately proved cheap individually but expensive en masse[2] as their use exploded and new applications for them were discovered. Beyond this increased use, IT professionals worried about the lack of documentation (and thus poor opportunity for maintenance), poor data management strategies, and security issues[7]. New applications proved incompatible with others (“the time-bomb of incompatibility”[2]), and different system platforms (e.g. CP/M, UNIX, MS-DOS, OS/2, Atari, Apple …) led to redundancy and communication difficulties between services, and to the failure of many apparently unstoppable software providers – household names such as Lotus, Digital Research, WordStar, Visi and dBase[8].

Ultimately it was the IT department which brought sense to these machines and began to connect them together for useful work using compatible applications – with the emergence of companies such as Novell and Microsoft to bring order to the chaos[8].

Drawing lessons from this history for Cloud Computing is useful. The strategic involvement of IT services departments is clearly required. Such involvement should focus not on the current cost-saving benefits of the cloud, but on the strategic management of a potentially escalating use of Cloud services within the firm. IT services must get involved in the narrative surrounding the cloud – ensuring their message is neither overly negative (and thus appearing to have a vested interest in the status quo) nor overly optimistic, as potential problems exist. Either way, the lessons of the microcomputer are relevant again today. Indeed, Keen and Woodman argued in 1984 that companies needed the following four strategies for the micro:

1) “Coordination rather than control of the introduction.

2) Focusing on the longer-term technical architecture for the company’s overall computing resources, with personal computers as one component.

3) Defining codes for good practice that adapt the proven disciplines of the [IT industry] into the new context.

4) Emphasis on systematic business justification, even of the ‘soft’ and unquantifiable benefits that are often a major incentive for and payoff of using personal computers”[2]

It would be wise for companies contemplating a move to the cloud to consider this advice carefully – replacing “personal computer” with “Cloud Computing” throughout.

(c)2011 Will Venters, London School of Economics. 

[1] P. Ceruzzi, A History of Modern Computing. Cambridge, MA: MIT Press, 2002.

[2] P. G. W. Keen and L. Woodman, “What to do with all those micros: First make them part of the team,” Harvard Business Review, vol. 62, pp. 142-150, 1984.

[3] T. Guimaraes and V. Ramanujam, “Personal Computing Trends and Problems: An Empirical Study,” MIS Quarterly, vol. 10, pp. 179-187, 1986.

[4] M. Benioff and C. Adler, Behind the Cloud: The Untold Story of How Salesforce.com Went from Idea to Billion-Dollar Company and Revolutionized an Industry. San Francisco, CA: Jossey-Bass, 2009.

[5] D. Lee, “Usage Patterns and Sources of Assistance for Personal Computer Users,” MIS Quarterly, vol. 10, pp. 313-325, 1986.

[6] N. Carr, “The End of Corporate Computing,” MIT Sloan Management Review, vol. 46, pp. 67-73, 2005.

[7] D. Benson, “A Field Study of End User Computing: Findings and Issues,” MIS Quarterly, vol. 7, pp. 35-45, 1983.

[8] M. Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry. Cambridge, MA: MIT Press, 2003.

G-Cloud – A talk by John Suffolk (hosted by Computer Weekly)

A couple of weeks ago I attended a talk by the UK Government’s CIO, John Suffolk (see here for more information on his role). At the talk John outlined his idea for a “G-Cloud” (government cloud), with the primary aim of reducing IT costs within government. Central government has around 130 data-centres and an estimated 9,000 server rooms, with local government and quasi-government bodies obviously adding to this figure. Reducing and consolidating these through Cloud Computing would offer significant efficiencies and cost savings. Indeed, given that 5% of contract costs are simply for bidding and procurement, having fewer procurement exercises would automatically save money.

John outlined different “cloud-worlds” which he sees as important opportunities for cost saving through cloud computing in government.

1) “The testing world” – by using cloud computing to provide test kits and environments, it is possible to reduce the huge number of essentially idle servers kept simply for testing; utilisation of such servers is estimated at just 7%. (A sketch of this idea follows the list.)

2) “The shared world” – many of the services offered by government require the same standardised and shared services. While these must be hosted internally, they offer savings through the use of Cloud ideas. http://www.direct.gov, for example, has two data-centres at present – but could these also be used for similar services in other areas?

3) “Web Services world” – this was less clear in the talk, but centred on the exploitation of cloud offerings through web services. For example, could an “App-Store” be developed to aid government in the simple procurement of tested and assured services? Could such an App-Store provide opportunities for SMEs to supply software to government through easier procurement processes (which currently preclude many SMEs from even trying)?
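Returning to the first of these worlds, below is a minimal sketch of what “test environments on demand” might look like in practice: a server is created only when a test run needs it and destroyed as soon as the run finishes, so nothing sits idle in between. The use of AWS EC2 via boto3, the image ID and the instance type are all illustrative assumptions of mine, not anything proposed in the talk.

```python
# Minimal sketch: provision a short-lived test server, run the tests,
# then tear the server down so it is billed only while actually in use.
# The AMI ID and instance type below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

def run_tests_on_ephemeral_server(run_tests):
    """Create a test instance, run the supplied test harness, clean up."""
    instance = ec2.run_instances(
        ImageId="ami-00000000",    # hypothetical pre-built test image
        InstanceType="t3.medium",  # assumed size for a test workload
        MinCount=1,
        MaxCount=1,
    )["Instances"][0]
    instance_id = instance["InstanceId"]
    try:
        return run_tests(instance_id)  # caller supplies the test harness
    finally:
        # Destroy the environment the moment the tests finish, rather
        # than leaving a dedicated test server idle at ~7% utilisation.
        ec2.terminate_instances(InstanceIds=[instance_id])
```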

This idea of an App-Store is interesting. It would essentially provide a wrapper around an application, making transparent across government the pricing of the application, the contracting vehicle required to purchase it, the security level for which it is assured, and details of who in government is already using it. Finally, deployment tools would be included to allow applications to be rolled out simply.
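To make that wrapper concrete, the sketch below models one catalogue entry carrying exactly the metadata described above. The field names and example values (including the “IL2” assurance level) are my own hypothetical illustration, not an actual G-Cloud schema.

```python
# Hypothetical sketch of the metadata an App-Store entry might carry,
# making pricing, contracting and assurance transparent across government.
from dataclasses import dataclass, field

@dataclass
class AppStoreEntry:
    name: str
    price_per_user_month: float      # transparent, published pricing
    contracting_vehicle: str         # how the application may be bought
    assured_security_level: str      # the level it is accredited to handle
    departments_using: list[str] = field(default_factory=list)
    deployment_tool: str = ""        # how it is rolled out to users

entry = AppStoreEntry(
    name="Example Case Management",             # invented application
    price_per_user_month=4.50,                  # invented price
    contracting_vehicle="crown-level licence",  # invented vehicle
    assured_security_level="IL2",               # invented assurance level
    departments_using=["Department A", "Department B"],
    deployment_tool="standard-installer",
)
print(entry)
```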

John acknowledged that many details need ironing out, particularly issues of European procurement rules (and the UK’s obsession with following them to the letter of the law). While government might like to pay per use and contract at Crown level (so licences can be moved from department to department rather than creating new purchases), this would be a change in the way software is sold and might affect R&D, licensing issues, maintenance, etc.

The App-Store would be a means to crack the problem of procurement and the time it takes, and so drive costs down for both sides.

What was clear, however, was the desire to use the cloud for the lower levels of the application stack – to “disintermediate applications”, because “we don’t care about the underlying back-end, only about the software service”. Government can use a common bottom of the stack.

Indeed, it was discussed that a standard design for a government desktop PC might itself be an “application” within the App-Store, centralising this design and saving the huge costs of each department producing its own (see http://www.cabinetoffice.gov.uk/media/317444/ict_strategy4.pdf#page=23 for more details).

Finally, the cloud offers government the opportunity to scale operations to meet demand (for example, MyGov pages when new announcements are made, or the Treasury site when the budget is announced). However, such a scalable service also has a cost and might not be justified in the budgeting. While we look to the cloud to stop websites going down, there is a cost to providing such scalable support for the few days a year it is needed – cloud or no cloud.
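A back-of-the-envelope calculation makes the point; all the figures below are invented purely for illustration, but they show why even the scalable option still has to be budgeted for.

```python
# Invented figures comparing year-round peak provisioning with scaling
# up only on the handful of peak days (e.g. budget announcements).
HOURLY_COST = 0.50     # assumed cost of one server-hour (GBP)
BASELINE_SERVERS = 4   # capacity needed on a normal day
PEAK_SERVERS = 40      # capacity needed on an announcement day
PEAK_DAYS = 5          # days per year at peak demand

HOURS_PER_YEAR = 24 * 365

# Option 1: provision for the peak all year round (the pre-cloud way).
always_on = PEAK_SERVERS * HOURS_PER_YEAR * HOURLY_COST

# Option 2: run baseline capacity, scaling up only on peak days.
surge_hours = (PEAK_SERVERS - BASELINE_SERVERS) * 24 * PEAK_DAYS
scaled = (BASELINE_SERVERS * HOURS_PER_YEAR + surge_hours) * HOURLY_COST

print(f"Provisioned for peak all year: £{always_on:,.0f}")   # £175,200
print(f"Baseline plus peak-day surge:  £{scaled:,.0f}")      # £19,680
print(f"Cost of the surge itself:      £{surge_hours * HOURLY_COST:,.0f}")  # £2,160
```

Scaling is far cheaper than provisioning for the peak all year, but the surge capacity itself is not free – it still appears as a line in the budget, cloud or no cloud.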

Thank you to Computer Weekly for inviting me to this event!