Presenting Cloud Desires and Realities Webcast

Tomorrow I am giving a free webcast on my recent work on cloud computing:

http://www.brighttalk.com/webcast/288/53101

I hope to see you there!

Best wishes

Will. 

ABSTRACT: Cloud computing has become central to current discussions about corporate information technology. To assess the impact and potential that cloud may have on your enterprise, it is important to evaluate the claims made critically and to review them against your organisation’s reality. Drawing on extensive research, this talk will walk through my “cloud desires framework” (recently published in the prestigious Journal of Information Technology). The framework is structured around a series of technological and service ‘desires’, i.e. characteristics of cloud that are important for cloud users.


Book Chapter Out: The Participatory Cultures Handbook

Amazon.com: The Participatory Cultures Handbook (9780415506090): Aaron Delwiche, Jennifer Jacobs Henderson: Books.

I co-authored (with Sarah Pearce) a chapter in this book focusing on the culture of particle physicists at CERN as they developed the world’s largest grid computing infrastructure for the LHC. The chapter considers the different collaborative and management practices involved in such a large endeavour and offers lessons for others building information infrastructure within a global collaboration.

Pre-order a copy now!


Clouds and Coffee: User affordance and information infrastructure

Some desktop coffee machines (e.g. figure 1) are now connected to the Internet (Pritchard, 2012). Such devices are enrolled within increasingly complex information infrastructures involving cloud services. This form of entanglement creates mazes of unexpected heterogeneous opportunities and risks (Latour, 2003), yet users’ ability to perceive such opportunity and risk is limited by their lack of visceral understanding of this entanglement. It is this understanding of the cloud by the user which is the focus of this blog post. Such a coffee maker “calls out” (Gibson, 1979) to users with a simple offering: its ability to make coffee. Its form attests to this function, with buttons for espresso and latte, nozzles for dispensing drinks, and trays to catch the drips. To any user experienced in modern coffee, this machine affords (Norman, 1990; Norman, 1999) the provision of coffee in its form and function, keeping its information infrastructure hidden from view – only an engineer can tell that this machine is communicating.

Yet such assemblages of plastic, metal and information technology are “quasi-objects” (Latour, 2003) – complicated cases requiring political assemblies, no longer “matters of fact” but instead “states of affairs” (Latour, 2003). Such a coffee maker is a drinks-dispensing service (representing a service-dominant logic (Vargo & Lusch, 2004; Vargo, 2012)), provided through an assembly of material and immaterial objects whose boundary and ultimate purpose remain unclear. While the device above only communicates about its maintenance, other machines may go further. Such a machine’s user, hankering for an espresso to get him through a boring conference, may be kept unaware that the infrastructure is monitoring his choices to influence global coffee production, that it is ensuring the output is sufficiently tepid and dull to damage his economic productivity, or that the device is recording and transmitting his every word. He may be annoyed to discover his coffee is stronger than his female colleagues’ because gender profiling based on image recognition decides the “right” coffee for him. He may be horrified that the device ceases to work at the very moment of need because of a fault in contract payments within the accounts department – perhaps caused by their tepid, weak coffee.

Similarly, companies involved in providing the coffee and milk for such machines might become enrolled in this reconfiguration (Normann, 2001) of the coffee service, an enrolment which could reconfigure the knowledge asymmetries within the existing market. Suddenly an engineering company which previously made plastic and metal coffee machines is in a position to understand coffee demand better than coffee growers or retailers. The machine itself could negotiate automatically on local markets for its milk provision, compare material prices with similar machines in other markets, and even alter the price of coffee for consumers based on local demand. Through the enrolment of information infrastructures within a coffee service, knowledge of the coffee market shifts.

All this has already happened to the market for music (increasingly controlled by a purveyor of sophisticated walkmen using a cloud service) and, more recently, to ebooks (increasingly controlled by a book retailer and its sophisticated book readers). Now imagine the emergence of the smart city, with huge numbers of devices from street lights to refrigerators connected to the cloud. How will the users of such smart cities understand what they are interacting with – the quasi-objects they used to consider objects? How will such objects afford their informational uses alongside their more usual functions?

At the centre of this reconfiguration of material objects is a computer system residing in the cloud, aggregating information. It is this aggregation of data from devices which may be central to the lessons of the cloud for smart cities.
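As a purely illustrative sketch of that aggregation (the event data and field names are my own invention, not drawn from any real machine), consider how a cloud service might turn individual drink orders into a market-wide view of demand that no single machine, grower or retailer could see on its own:

```python
from collections import Counter

# Toy model: each connected coffee machine reports a drink order as a
# small event to the cloud service (hypothetical machines and cities).
events = [
    {"machine": "lobby-1", "city": "London", "drink": "espresso"},
    {"machine": "lobby-1", "city": "London", "drink": "latte"},
    {"machine": "hq-3", "city": "Berlin", "drink": "espresso"},
]

# The cloud aggregates the events into a view of demand per city.
demand = Counter((e["city"], e["drink"]) for e in events)
for (city, drink), count in demand.items():
    print(f"{city}: {count} x {drink}")
```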

(© Will Venters 2012).


Gibson JJ (1979) The Ecological Approach to Visual Perception. Houghton Mifflin, London.

Latour B (2003) Is Re-modernization Occurring-And If So, How to Prove It? Theory, Culture & Society 20(2), 35-48.

Norman D (1990) The Design of Everyday Things. The MIT Press, London.

Norman D (1999) Affordance, Conventions, and Design. Interactions ACM 6(3), 38-43.

Normann R (2001) Reframing Business: When the map changes the landscape. John Wiley & Sons Ltd, Chichester.

Pritchard S (2012) Mobile Comms: Coffee and TV. IT Pro, Dennis Publishing Ltd, London.

Vargo S and Lusch R (2004) Evolving to a New Dominant Logic for Marketing. The Journal of Marketing 68(1), 1-17.

Vargo SL (2012) Service-Dominant Logic: Reflections and Directions. Unpublished PowerPoint, Warwick, UK.

Our Fifth Report is out – Management implications of the Cloud

The fifth report in our Cloud Computing series for Accenture has just been published. This report looks at the impact Cloud Computing will have on the management of the IT function, and thus on the skills needed by all involved in the IT industry. The report begins by analysing the impact Cloud might have in comparison to existing outsourcing projects. It then considers the core capabilities which must be retained in a “cloud future”, how these capabilities might be managed, and the role of systems integrators in managing the Cloud.

Please use the comments form to give us feedback!

Cloud and the Future of Business 5 – Management.

Cloud Computing – it’s so ‘80s.

For Vint Cerf[1], the father of the internet, Cloud Computing represents a return to the days of the mainframe, when service bureaus rented their machines by the hour to companies who used them for payroll and other similar tasks. Such comparisons focus on the architectural similarities between centralised mainframes and Cloud Computing – cheaply connecting to an expensive resource “as a service” through a network. But cloud is really about provisioning already low-cost computing (in bulk, through data centres) at even lower cost. A better analogy than the mainframe, then, is the introduction of the humble micro-computer and the revolution it brought to corporate computing in the early 1980s.

When micros were launched, many companies operated using mini or mainframe computers which were cumbersome, expensive and needed specialist IT staff to manage them[1]. Like Cloud Computing today, the new micros, compared with these existing computers, offered ease of use, low cost and apparently low risk, which appealed to business executives seeking to cut costs, or to SMEs unable to afford minis or mainframes[2]. Usage exploded: in the period from the launch of the IBM PC in 1981 to 1984, the proportion of companies using PCs increased dramatically from 8% to 100%[3] as the cost and opportunity of the micro became apparent. Again, as with the cloud[4], these micros were marketed directly to business executives rather than IT staff, and were accompanied by a narrative that they would enable companies to dispense with heavy mainframes and the IT department for many tasks – doing them more quickly and effectively. Surveys from that time suggested accessibility, speed of implementation, response time, independence and self-development were the major advantages of the PC over the mainframe[5] – easily recognisable in the hyperbole surrounding cloud services today. Indeed, Nicholas Carr’s current pronouncement of the end of corporate computing[6] would probably have resonated well in the early 1980s, when the micro looked set to replace the need for corporate IT. Indeed, in 1980 over half the companies in one sample claimed no IT department involvement in the acquisition of PCs[3].

But problems emerged from this wholesale, uncontrolled adoption of the micro, and by 1984 only 2% of those sampled did not involve the IT department in PC acquisition[3]. The proliferation of PCs meant that in 1980 as many as 32% of IT managers were unable to estimate the proportion of PCs within their company[3], and few could provide any useful support for those who had purchased them.

Micros ultimately proved cheap individually but expensive en masse[2] as their use exploded and new applications for them were discovered. In addition to this increased use, IT professionals worried about the lack of documentation (and thus poor opportunity for maintenance), poor data management strategies, and security issues[7]. New applications proved incompatible with others (“the time-bomb of incompatibility”[2]), and different system platforms (e.g. CP/M, UNIX, MS-DOS, OS/2, Atari, Apple …) led to redundancy and communication difficulties between services, and to the failure of many apparently unstoppable software providers – household names such as Lotus, Digital Research, WordStar, VisiCalc and dBase[8].

Ultimately it was the IT department which brought sense to these machines and began to connect them together for useful work using compatible applications, with the emergence of companies such as Novell and Microsoft bringing order to the chaos[8].

Drawing lessons from this history for Cloud Computing is useful. The strategic involvement of IT services departments is clearly required. Such involvement should focus not on the current cost-saving benefits of the cloud, but on the strategic management of a potentially escalating use of cloud services within the firm. IT services must get involved in the narrative surrounding the cloud, ensuring their message is neither overly negative (and thus appearing to defend a vested interest in the status quo) nor overly optimistic, since potential problems exist. Either way, the lessons of the microcomputer are relevant again today. Indeed, Keen and Woodman argued in 1984 that companies needed the following four strategies for the micro:

1) “Coordination rather than control of the introduction.

2) Focusing on the longer-term technical architecture for the company’s overall computing resources, with personal computers as one component.

3) Defining codes for good practice that adapt the proven disciplines of the [IT industry] into the new context.

4) Emphasis on systematic business justification, even of the ‘soft’ and unquantifiable benefits that are often a major incentive for and payoff of using personal computers”[2]

It would be wise for companies contemplating a move to the cloud to consider this advice carefully – replacing “personal computer” with “cloud computing” throughout.

(c)2011 Will Venters, London School of Economics. 

[1] P. Ceruzzi, A History of Modern Computing. Cambridge, MA: MIT Press, 2002.

[2] P. G. W. Keen and L. Woodman, “What to do with all those micros: First make them part of the team,” Harvard Business Review, vol. 62, pp. 142-150, 1984.

[3] T. Guimaraes and V. Ramanujam, “Personal Computing Trends and Problems: An Empirical Study,” MIS Quarterly, vol. 10, pp. 179-187, 1986.

[4] M. Benioff and C. Adler, Behind the Cloud: The untold story of how salesforce.com went from idea to billion-dollar company and revolutionized an industry. San Francisco, CA: Jossey-Bass, 2009.

[5] D. Lee, “Usage Patterns and Sources of Assistance for Personal Computer Users,” MIS Quarterly, vol. 10, pp. 313-325, 1986.

[6] N. Carr, “The End of Corporate Computing,” MIT Sloan Management Review, vol. 46, pp. 67-73, 2005.

[7] D. Benson, “A field study of End User Computing: Findings and Issues,” MIS Quarterly, vol. 7, pp. 35-45, 1983.

[8] M. Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A history of the software industry. Cambridge, MA: MIT Press, 2003.

Cloud and the Future of Business: From Costs to Innovation

I have not been updating this blog for a while as I have been busy writing commercial papers on Cloud Computing. The first of these, for Accenture, has just been published and is available here:

http://www.outsourcingunit.org/publications/cloudPromise.pdf

The report outlines our “Cloud Desires Framework”, in which we aim to explain the technological direction of Cloud in terms of four dimensions of the offerings: Equivalence, Abstraction, Automation and Tailoring.

Equivalence: The desire to provide services which are at least equivalent in quality to that experienced by a locally running service on a PC or server.

Abstraction: The desire to hide unnecessary complexity of the lower levels of the application stack.

Automation: The desire to automatically manage the running of a service.

Tailoring: The desire to tailor the provided service for specific enterprise needs.

(c) Willcocks, Venters, Whitley 2011.

By applying these dimensions to the different types of cloud service (SaaS, PaaS, IaaS and hosted services (often ignored, but crucially cloud-like)) it is possible to separate the core benefits of each from the “value-add” differences. Crucially, the framework allows simple comparison between services offered by different companies by focusing on the important desires rather than on unimportant technical differences; a toy illustration of such a comparison is sketched below.
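As a purely illustrative sketch (the offerings, vendors and 0–5 scores below are hypothetical, my own invention and not taken from the report), the framework can be thought of as profiling each offering along the four desires so that services can be compared like for like:

```python
# Toy encoding of the Cloud Desires Framework: score each offering on
# the four desires (hypothetical scores, invented for illustration),
# then print the profiles side by side for comparison.
desires = ["equivalence", "abstraction", "automation", "tailoring"]

offerings = {
    "SaaS (vendor A)":   {"equivalence": 4, "abstraction": 5, "automation": 5, "tailoring": 2},
    "PaaS (vendor B)":   {"equivalence": 3, "abstraction": 4, "automation": 4, "tailoring": 3},
    "IaaS (vendor C)":   {"equivalence": 5, "abstraction": 2, "automation": 3, "tailoring": 5},
    "Hosted (vendor D)": {"equivalence": 5, "abstraction": 1, "automation": 2, "tailoring": 5},
}

for name, scores in offerings.items():
    profile = "  ".join(f"{d}={scores[d]}" for d in desires)
    print(f"{name:18} {profile}")
```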

Take a look at the report – and let me know what you think!

CohesiveFT

This is a company to watch http://www.cohesiveft.com/ – they have two products:

VPN-Cubed provides a virtual network overlaid on the network of a cloud provider. This enables firms to keep a standard networking layer which remains consistent even if the cloud provider’s network changes (e.g. IP address changes).

Elastic Server allows real-time assembly and management of software components. This allows the quick creation of easy-to-use applications which can be deployed to various cloud services.

However, what is important is that together these services allow virtual machines and cloud services to be moved between IaaS providers without significant rework. If the products live up to this promise then users can move to the cheapest cloud provider with ease, driving costs down to commodity-supplier levels… and creating the spot market for cloud. A sketch of the underlying overlay idea follows.
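To illustrate the overlay-network idea in the abstract (this is my own toy model, not CohesiveFT’s actual API or product behaviour), applications can address one another by stable overlay addresses while a lookup layer tracks whichever endpoint the current provider assigns:

```python
# Toy model of an overlay network: workloads keep stable overlay IPs;
# only the mapping to the provider's current endpoint ever changes.
overlay_registry = {
    "10.0.0.1": "ec2-54-12-34-56.compute.example.com",  # hypothetical endpoint
    "10.0.0.2": "203.0.113.17",                         # hypothetical endpoint
}

def resolve(overlay_ip: str) -> str:
    """Map a stable overlay address to the provider's current endpoint."""
    return overlay_registry[overlay_ip]

def migrate(overlay_ip: str, new_endpoint: str) -> None:
    """Move a workload to another provider: only the registry changes;
    configuration keyed on overlay addresses is untouched."""
    overlay_registry[overlay_ip] = new_endpoint

# Moving a VM to a cheaper provider changes the underlying endpoint...
migrate("10.0.0.1", "vm-1.cheap-cloud.example.net")
# ...but consumers still reach it via the same overlay address.
print(resolve("10.0.0.1"))
```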

Accenture Outlook: The coming of the cloud corporation

I have written, with two colleagues, an article for Accenture’s Outlook journal which introduces the idea of the Cloud Corporation:

Accenture Outlook: The coming of the cloud corporation.

The article discusses various trends in outsourcing which will impact upon Cloud (and vice versa).

Cloud computing remains focused on cost cutting achieved through new technology; however, lessons from the past suggest that this is only a minor part of the disruptive innovation which Cloud may offer. In particular, we should not ask “what is cloud computing?” but rather “why is cloud computing?” – in essence, exploring the pressures on innovation today which resonate with the idea of utility computing.

While the cost saving is an important incremental innovation on existing practices, it is cloud’s capacity to allow new forms of organisational collaboration which offers the potential of radical innovation. Moving the data centre outside the organisation asks us to re-evaluate the relationship between the data centre and the organisation. Is it “ours” to hoard and control, or are parts of it to be shared, opened and exploited by others (partners, customers, suppliers etc.)? In turn, does this opening of the relationship between the organisation and its information recast the organisation itself?


Using rented computing to crack passwords.

Cloud computing is open to everyone – good or bad. Here we see someone renting computing power for a couple of dollars to crack an SHA-1 password. Imagine a competitor using a few thousand pounds’ worth of computing power to crack your passwords… or a disgruntled employee launching an attack with some of their severance pay. See the following article from the Register for more information.

German hacker uses rented computing to crack hashing algorithm • The Register.
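To see why an unsalted SHA-1 password hash falls so cheaply, here is a minimal brute-force sketch (the target digest and tiny search space are illustrative; the reported attack used rented GPU instances, which test hashes many orders of magnitude faster than this):

```python
import hashlib
import itertools
import string

# Hypothetical stolen digest: the unsalted SHA-1 hash of "secret".
target = hashlib.sha1(b"secret").hexdigest()

# Exhaustively try every lowercase password of up to six characters.
# Rented GPU instances can test billions of SHA-1 candidates per second,
# so small unsalted search spaces like this fall in moments.
for length in range(1, 7):
    for chars in itertools.product(string.ascii_lowercase, repeat=length):
        candidate = "".join(chars)
        if hashlib.sha1(candidate.encode()).hexdigest() == target:
            print("Cracked:", candidate)
            raise SystemExit
```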

Why you can’t move a mainframe with a cloud • The Register

Why you can’t move a mainframe with a cloud • The Register.


This is a detailed technical analysis of the market for mainframes, discussing the infrastructure issues of moving mainframes to the cloud, or the cloud to mainframes. The issues discussed are somewhat perennial – the “greying workforce”, and the shift to cheaper platforms such as Linux and Java. But, as the article attests, it is the sheer reliability and stability of mainframes which keeps them going – something those who proclaim the cloud will prevail must understand and respond to. With guaranteed uptimes measured in years for transaction processing, we cannot yet really envisage the cloud running the core applications of our information economy.