Double trouble – why cloud is a question of balance | My New Blog on Cloud Pro

I have been invited to blog on Cloud Pro – don’t worry, I will keep posting here as well – but if you want to read my first posting see:

Double trouble – why cloud is a question of balance | Cloud Pro.

Big Data – Charm or Seduction [invited article by Mike Cushman]

“The Allure of Big Data”, the 14th Social Study of ICT workshop at LSE on 25 April 2014, pointed to answers to some questions but left others unaddressed. Two in particular were left hanging: ‘How new is Big Data?’ and ‘What is Big Data?’

How new is Big Data?

Like many themes in the fields of Management and Information Systems, it is both new and not new, and both the ‘gee-whiz’ and ‘we’ve seen it all before’ reflexes are incomplete.

In important respects Big Data is a re-packaging and re-selling of Data Warehousing, Data Mining, Knowledge Management, e-science and many other items from the consultants’ catalogues of past decades. Each of these, especially KM, is a re-badging of previous efforts. But to say only that is to miss that the growth of processing power and cheap, ever-cheaper, storage is producing changes in the uses to which accumulations of data can be put. In addition, previous iterations did not have available to them the current quantities of social media content and the GPS attributes generated by the growth of mobile computing. The development of innovative algorithms to analyse this growth in the quantity and types of data affords new possibilities, even if many of them, though far from all, just look like expanded versions of the old routines.

What is Big Data?

Much of the discussion at the workshop was compromised by the lumping of too many distinct phenomena under one heading. Big Data is not one thing and this is a preliminary attempt at a typology of Big Data.

  • Big Data is the business. Companies like Google and Facebook essentially are their ability to analyse the data provided by their users in return for free provision of services. Discussions about such companies should lead to discussion about the role of advertising in the economy and society. While newspapers and magazines have always been dependent upon advertising revenue, this revenue is far more central to Google and its peers.
  • Big Data for marketing. The collection of customer data through loyalty cards allows retailers to design promotions at a national, store and individual customer level and makes CRM systems far more powerful.
  • Big Data for cost control. The collection of data on every aspect of the business allows the elimination of unnecessary cost, making cost accounting far more effective and supporting lean manufacturing approaches.
  • Big Data for workforce management. Employers now have access to far more data about employees’ histories and performance. This has led to the spread of both performance related pay and more intrusive disciplinary codes.
  • Big Data for performance ranking and comparison. It has become accepted that heterogeneous organisations can be listed in meaningful league tables with standardised measures as easily as football teams can. The result of a football match is unambiguous, subject to moderately competent refereeing; the performance of a school, university or hospital is less easily agreed. LSE moves alarmingly up and down national and international rankings according to the measures and weightings selected by a particular newspaper. Big Data is the key cement of the conceit that these league tables are a sensible activity and that they are sufficiently meaningful to obliterate the harm they do. Because Big Data and the tables are assumed to be necessary, the data must be constructed and collected regardless of cost and disruption – so the Research Excellence Framework is allowed to dominate university life, and only education measurable in GCSEs is understood to be a valuable product of school efforts.
  • Big Data for product development. The collection of data about products in use in industries like motor manufacturing can feed back into product design to eliminate design faults and weaknesses and better meet customer demands.
  • Big Data for science. The growth in computing capacity is necessary for data-rich experiments like those at CERN but also the collection of far greater quantities of observational data in both hard science like meteorology and in social science leading to the production of new scientific knowledge.
  • Big Data for policy development. Policy in areas like housing, transport, education and health have always depended on large data sets like the national census and the general household survey (the degree of faithfulness of any particular policy to the data that is claimed to support it has always, and will always, be a matter for political argument). Whether the development of bigger data will improve policy development or only intensify politicisation of data use is a matter for conjecture.
  • Big Data for surveillance. There has long been a recognition that states collect data on their citizens. Each state loudly announces the data collection practices of its opponents while, generally, concealing its own. ‘Totalitarian’ states have been more willing to publicise their surveillance in order to intimidate their populations; ‘liberal democracies’ try to minimise knowledge about their own practices, claiming it is only ‘them’ about whom dossiers are compiled – criminals, terrorists, subversives and paedophiles. The admitted categories have always been elastic according to political priorities, so may be widened to include groups such as trade unionists, benefit claimants, or immigrants, refugees and aliens. While groups are added, there is great institutional resistance to slimming down the list. Edward Snowden revealed that even ‘liberal democracies’ regard every citizen as potentially hostile and a surveillance subject ‘just in case’.

There are continuing ethical and privacy concerns about Big Data. These are made more complex and irresolvable because Big Data is too often discussed as one thing. Regarding it as many distinct phenomena, with each domain having its own ethical and privacy requirements, will allow more clarity.


Mike Cushman

29 April 2014

Mike Cushman is a retired colleague from the LSE who also specialises in Information Systems and their social and organisational implications. 

Cloud computing security – lessons from Bletchley Park

Today I’m at Bletchley Park, home of the code-breakers in the Second World War and the perfect location for a workshop* on cloud computing security. I thought I would share some of the most interesting points that emerged today:

Focus on security audits:

  • A speaker from the US Department of Homeland Security called for improvements in CIOs’ ability to move to the cloud while maintaining security. In doing this they argued the need for better auditing – security audits, privacy impact audits, and performance audits. They argue the “goal is to develop test and deploy cloud computing to facilitate end-to-end trust”. Silverline was proposed as part of this move.

Cloud security as a religious debate:

  • Prof. Ahmad Sadeghi argued that cloud security is “a religious debate”. While cloud security is presented as new, much of the work was already achieved in utility computing and IBM mainframes. The problem, he argues, is that for cloud providers the focus is upon optimisation, not on security. This lack of focus on security is a significant problem for BYOD (Bring Your Own Device), since an employee backing up data with iCloud on their iPhone may inadvertently be placing company data (e.g. calendar data on who they are meeting) on a less secure site.

Hardware Solutions to the problem of cloud security:

  • The problem with cloud security is ensuring that everything from the CPU up through the operating system stack, the hypervisor, and the users’ virtual machines is secure. Without this there is a risk of the systems administrator, or another virtual machine, attacking a user’s virtual machine. Prof. Sadeghi explained that one solution to this problem is being developed by Intel through SGX (Software Guard Extensions). This is a hardware-based cloud security solution maintaining an “enclave” area of memory which is secure from the operating system upwards – if you trust the CPU, you can trust the whole server. The implementation is complex, but suffice it to say that many of the attack challenges are resolved, allowing data to be kept in highly secure parts of the cloud.


Simplicity and cloud computing

In my recent co-authored book on cloud computing [1] we argue that one of the primary desires driving the adoption of computing as a service (as opposed to as a product, such as software and hardware organised by the purchaser) was the desire for simplicity. We even adopted the term “Simplicity as a Service” to describe the disentanglement of complexity offered by new pay-as-you-go computing services associated with cloud computing, through, for example, more standardised contracting. Indeed one of the primary motivations for many moves to the cloud is to simplify. Yet we stumble quickly upon a problem – while the term simplicity [2] is widely used in relation to cloud computing, we have very little understanding of what this simplicity actually means. Understanding simplicity better may help us better understand our procurement of this type of service.

In this short essay I want to unpick the concept of simplicity, then apply it back to the issue of cloud computing. I consider simplicity from three directions, which I roughly define as Modularic, Aesthetic and Systemic simplicity.

Modularic Simplicity

Is simplicity a concern for simpler mechanisms to provide the same service (i.e. a quartz watch is simpler than a Swiss automatic chronograph, yet both tell the time)? To be simpler must a device have fewer components? Or perhaps simplicity lies in the interrelation between components – the interfaces? If we consider simplicity in these terms we can seek to examine the modularity of objects – understanding how a service is composed of different services, and examining their underlying structures [3]. This is important for cloud computing, in which various technical services are often interconnected to provide service – Netflix, for example, integrates various Amazon cloud services with its iPad app, and with movie content, to provide its service. Through decoupling services’ modularity, the complexity of the constellation of modules can perhaps be better understood. Structures such as “hierarchies” are also used to keep things “simple”, and understanding such structures would help.

In this way simplicity is a calculation roughly based on counting components and their interfaces. Yet this seems, well… rather simplistic! For as Aristotle highlighted, wholes are “more than the sum of their parts” – there is emergence and emergent behaviour. But more than this, there is variation in the simplicity of the components themselves.
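As a toy sketch of this counting view (my own illustration – the metric, names and numbers are hypothetical, not drawn from the modularity literature cited here), one could score a constellation of modules by counting its components and the interfaces between them:

```python
# A toy "modularic simplicity" score: count components plus the
# interfaces (dependencies) between them. Lower = "simpler" under
# this naive counting view. All names here are illustrative.

def modularic_score(dependencies: dict[str, set[str]]) -> int:
    """Sum of components and the interfaces connecting them."""
    components = len(dependencies)
    interfaces = sum(len(deps) for deps in dependencies.values())
    return components + interfaces

# A streaming service wired together from cloud modules
# (loosely echoing the Netflix example above).
service = {
    "app": {"api"},
    "api": {"storage", "cdn"},
    "storage": set(),
    "cdn": {"storage"},
}
print(modularic_score(service))  # 4 components + 4 interfaces = 8
```

Even this tiny sketch hints at the limits discussed above: the score says nothing about emergent behaviour, nor about how intricate each component is internally.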

Aesthetic Simplicity

One problem with “modularic simplicity” is that even the simplest modular objects can vary considerably in their “simplicity”. Take two objects made of clay – a brick and a pottery vase. If both weigh the same they likely have the same number of atoms within them. Yet most people would agree the brick is simpler. The vase’s atoms are in a structure which introduces intricacy and difference despite the material itself being identical. Similarly, two apparently similar digital MP3 files – seemingly random series of 0s and 1s – can vary considerably in their simplicity when realised as music: a flute solo versus a prog-rock band.
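This intuition can be made concrete using compressibility as a rough proxy for structural intricacy (my own illustration, in the spirit of Kolmogorov complexity – the data and names are invented, not from the sources cited here). Two byte strings of identical “weight” can differ enormously in how much structure they contain:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size / original size: regular data scores low,
    intricate data scores near (or even above) 1.0."""
    return len(zlib.compress(data)) / len(data)

# Same amount of "material" (4000 bytes), very different structure:
brick = b"clay" * 1000   # uniform and repetitive, like the brick
vase = os.urandom(4000)  # irregular and intricate, like the vase

print(compression_ratio(brick))  # tiny - highly compressible
print(compression_ratio(vase))   # near 1.0 - barely compressible
```

The count of bytes is identical; what differs is the arrangement – which is exactly why counting components alone misses something.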

Simplicity, then, is not inherent in the material, and any attempt to calculate simplicity by counting components and their relationships will be somewhat problematic. What then makes the vase more complex? As humans, perhaps we evaluate simplicity through our interpretation – an aesthetic concept of simplicity. This is certainly the perception of many designers, and it reflects the design aspirations of Apple. From its first sales brochure’s proclamation that “Simplicity is the ultimate sophistication” [4], the company has championed the idea that computing should feel “simple” for humans – in particular, that the human should (in the words of its chief designer) “feel we can dominate [physical products]. As you bring order to complexity, you find a way to make the product defer to you. Simplicity… isn’t just visual style,… minimalism or the absence of clutter.” For Apple and its vice-president of design Jonathan Ive, simplicity is about the removal of the unessential – and the reassertion of the whole (that is, the form of the final product) over the parts (that is, the components which make up that whole) – but wholly centred around the human user.

This concept is also represented in Ockham’s razor [5] – the assumption that simpler explanations are better, despite the lack of any irrefutable logical principle that this is the case (though simpler explanations are more easily tested).

A human interpretation is required – cloud computing is considered “simple” in relation to its use in doing something for humans. It can only be evaluated at the level of its use (just as an iPhone is only simple when held in the hand and used – not when taken apart and examined from within, where its myriad complexity becomes evident).

Systemic Simplicity

If modularic simplicity places the “thing” at the centre of simplicity, and if, in contrast, aesthetic simplicity places humans at the heart of defining what is simple, then perhaps we can define simplicity in terms of the interrelationship between things and people – a kind of socio-technical perspective on simplicity? A view of simplicity in terms of the complex social and technical arrangements of life through which we get things done – such as, for example, organisations.

In many ways this simplicity might be defined by its absence – the lack of simplicity of modern organisations and their technical arrangements. The role of managers is thus often seen to be seeking to organise things to be “simpler”. Yet most organisations are never simple, and to aspire to make them so may be problematic. Miller [6] argued that “organisations lapse into decline precisely because they amplify and extend a single strength or function while neglecting most others. Ultimately, a rich and complex organisation becomes excessively simple – it turns into a monolithic, narrowly focused version of its former self, converting a formula for success into a path towards failure.” [7] For Miller, simplicity is an overwhelming preoccupation with a single goal, strategic activity, department or world-view – and making things simple through simplifying the organisation is therefore often problematic. This suggests that understanding what can be simplified and what cannot requires a rich appreciation of the complexity of the organisation.

Indeed the origins of cybernetics [8] and complexity theory highlight that management must meet the complexity of a situation with a similar level of complexity in their response to it [9]. This means that a manager’s response to organisational complexity cannot simply be to simplify their own actions if they cannot similarly understand or simplify the environment within which the organisation resides.
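Ashby’s point can be sketched numerically (a minimal toy model of my own – the function name and numbers are illustrative, not from Ashby): a regulator can hold outcomes steady only if its repertoire of responses matches the variety of the disturbances it faces.

```python
import math

# Toy sketch of Ashby's law of requisite variety: "only variety can
# absorb variety". If disturbances outnumber the regulator's distinct
# responses, the outcome cannot be held to a single state.

def minimum_outcome_variety(disturbances: int, responses: int) -> int:
    """Lower bound on the distinct outcomes the system must exhibit."""
    return math.ceil(disturbances / responses)

# A manager with 3 standard responses facing 12 kinds of disturbance:
print(minimum_outcome_variety(12, 3))   # 4 - outcomes still vary
# Matching the environment's variety regains control:
print(minimum_outcome_variety(12, 12))  # 1 - a single outcome is possible
```

In other words, simplifying the manager’s repertoire without simplifying (or understanding) the environment guarantees that some disturbances go unabsorbed.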

Simplifying without this understanding is, however, often what managers seek to do. And in making things simple they often rely on relatively simple models of the organisation to help them make these decisions. Whether it is the organisation chart, the process diagram or the UML model, their attempts to derive simplicity are focused on such simplifications. As Stafford Beer [10] reminded us, managers become bewitched by the paper representations of their organisations as a “surrogate world we manage”, losing contact with the messiness of their world [11] and assuming simplicity in the world rather than seeking to simplify the world.

Beer goes further, highlighting that “if a simple process is applied to complicated data, then only a small portion of that data will be registered, attended to, and made unequivocal. Most of the input will remain untouched and will remain a puzzle”.

This is not to say that we should not attempt to simplify our understanding of organisations into models and representations, but that we must carefully acknowledge these models as “simple”, and ensure that we remain attuned to their alignment with the complexity of that which they represent.

When we buy cloud computing services which aim to change our organisation in some way we must be careful that we are not selecting the computing model based on a simplistic understanding of what the organisation is trying to achieve.

What can management and cloud computing learn from this?

From these three conceptualisations of simplicity we can draw some lessons for organisational managers and for cloud computing:

1) Simplicity isn’t always inherent in devices or technology; it relates to their interpretation and representation. We should seek to model simplicity in ways which reflect this.

2) Simplifying computing systems must be met with an understanding of the level of complexity of the task they are for. Selecting too simple a service is problematic [12].

3) Simplicity does not necessarily mean less complex. Rather, it can relate to the use of the service at the interface being observed. In procuring a service we should be attuned to the lack of simplicity at different levels.

© 2014 W.Venters.

[1] Willcocks, L., W. Venters and E. Whitley (2013). Moving to the Cloud Corporation. Basingstoke, Palgrave Macmillan.

[2] I acknowledge the contribution of PA consulting in raising with me a concern for better understanding simplicity.

[3] Baldwin, C. and K. Clark (2000). Design Rules: The power of modularity. Cambridge,MA, MIT Press.

[4] Isaacson, W. (2011). Steve Jobs, Little Brown. Page 343.


[6] Miller, D. (1993). “The Architecture of Simplicity.” The Academy of Management Review 18(1): 116-138.

[7] Miller, D. (1993). “The Architecture of Simplicity.” The Academy of Management Review 18(1): 116-138.

[8] Ashby, W. R. (1956). An introduction to cybernetics. London, Methuen & Co Ltd. Churchman, C., R. Ackoff and E. Arnoff (1957). Introduction to Operations Research. New York, Wiley.

[9] This is inherent in Ashby’s law of “requisite variety” – though different terms are used.

[10] Beer, S. (1984). “The Viable System Model: Its provenance, development, methodology and pathology.” Journal of the Operational Research Society 35: 7-36.


[11] Pickering, A. (2013). Living in the Material World. In F.-X. de Vaujany and N. Mitev (eds), Materiality and Space: Organizations, Artefacts and Practices. Palgrave Macmillan.

[12] I discuss this in much more detail through the term “Variety” in Venters, W. and E. Whitley (2012). “A Critical Review of Cloud Computing: Researching Desires and Realities.” Journal of Information Technology 27(3): 179-197.


I’m presenting at “The Exchange 2013 – Knowledge Peers”


I’m excited to be presenting at “The Exchange 2013 – Knowledge Peers” on 28th November. Not only is it at the Kia Oval (which I drive past regularly, so I am looking forward to getting the tour inside), but their focus is also on networking with small and medium-sized organisations. I am of the opinion that cloud computing will offer more valuable and exciting opportunities for SMEs than for large organisations, so I am looking forward to connecting with many more small organisations at the event.

I hope you can join me there!