CIOs keen to drive consequential innovation

A couple of weeks ago I chaired a Global CIO Institute conference, hosting a dinner, various talks, and round table discussions with CIOs.

What struck me during all these interactions was the marked contrast between these CIOs at the coalface and the topics obsessed over in LinkedIn, academic, and journalistic discussions. While the CIOs are interested in topics like digital transformation, AI, robotics, data lakes and lakehouses, the API economy, and the rise of ChatGPT (the usual LinkedIn fare), these were not what drove them. Their interest was much more in safely driving consequential innovation within their company's line of business.

Of significant interest within this was the need to manage various forms of risk. Risk was not to be "avoided" – or, as Robin Smith (CISO at Aston Martin) put it, we need to promote "positive risk taking" for innovation. All intervention generates risk. For some this manifested as a need for guard-rails around IT innovation, so that creative and innovative staff were not constrained by the risk of a catastrophic failure. This was particularly true as low-code and citizen development expand. For CIOs, developing a culture of innovation demanded systems that allowed innovations to fail safely and elegantly.

Risk-taking behaviour within innovation was only one risk they faced. Sobering conversations concerned external sources of risk and the need for business resilience in the face of pandemic, war, and cyber-security challenges. Any innovation in digital technology increases the attack surface through which companies can be targeted. This demands ever more sophisticated (and expensive) technical countermeasures, but also cultural change. While attention is drawn to the use of AI (like ChatGPT) for good, nefarious actors are thinking about how such tools might be used for ill. For example, attackers can use emails, telephone calls, and deep-fake video calls to sound, and even look, like a company's CEO or top customer asking for help[1]. How can CIOs ensure their staff do not fall foul of these and various more technical scams? How can trust be established if identity is hard to prove? What happens when AI is applied to exploring possible attacks through public APIs?

Also of significant concern was keeping the lights on with their ever more demanding and heterogeneous estate of products, platforms, and systems. One speaker pointed to the XKCD cartoon below, which captures this so well. The law of unintended consequences dominated many of their fears, particularly as organisations move towards exploiting such new technologies in various forms.

(Source: XKCD (CC), with thanks.)

What was clear, and remains clear, is that we need a view of the enterprise technology landscape that balances risk and reward. While commentators can ignore the complexity of legacy infrastructure, burgeoning and bloated cloud computing estates, and the risks involved in adding more complexity to them, those tasked with managing the enterprise IT estate cannot.

These thoughts are obviously not scientific and are entirely anecdotal. The CIOs I met were often selected to attend, the conversations were steered by the agenda, and so on. But they did remind me why CIOs are not as obsessed with ChatGPT as everyone might think.

[1] An executive from Okta gave an example of this: "Binance exec says scammers made a deep fake hologram of him" • The Register

Header Image “Business Idea” by danielfoster437 is licensed under CC BY-NC-SA 2.0.

What is Fog Computing?

I read an interesting article on Fog Computing and thought readers might like a short précis:

Applications such as health monitoring or emergency response require near-instantaneous response, such that the delay caused by contacting and receiving data from a cloud data-centre can be highly problematic. Fog computing is a response to this challenge. The basic idea is to shift some of the computing from the data-centre to devices closer to the edge of the network – moving the cloud to the ground (hence "fog computing"). The computing work is shared between the data-centre and various local IoT devices (e.g. a local router or smart gateway).

“Fog computing is a paradigm for managing a highly distributed and possibly virtualized environment that provides compute and network services between sensors and cloud data-centers” (Dastjerdi et al. 2016)

While cloud computing (using large data-centres) is perfect for analysis of Big Data "at rest" (i.e. analysing historical trends, where large volumes of data and cheap processing are required), fog computing may be much better for dynamic analysis of "data in motion" (data concerning immediate, ongoing actions which require a rapid analytical response). For example, an augmented reality application cannot wait for a distant data-centre to respond when a user's head is turned. Similarly, safety-critical and business-critical applications, such as remote health-care monitoring or remote diagnostics, cannot rely on the permanent availability of internet connections (as those in York know, after floods knocked out their internet for days this year).

Privacy concerns are also relevant. By moving data analysis to the edge of the network (e.g. a device or local mobile phone), which is often owned and controlled by the data source, the user may have more control over their data. For example, an exercise tracker might aggregate and process its GPS and fitness data on a local mobile phone rather than automatically uploading it to a distant server. It might also undertake data trimming, reducing the bandwidth and load on the cloud. This is particularly relevant as the number of connected devices grows into the billions. This gain should be balanced against the challenge of managing an increasing number of devices which must be secured to hold sensitive data safely.
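To make the data-trimming idea concrete, here is a minimal sketch of edge-side aggregation. All names are illustrative (there is no real `EdgeAggregator` API): a local device buffers raw readings and forwards only a compact summary to the cloud, rather than every raw data point.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EdgeAggregator:
    """Buffers raw sensor readings on a local device and forwards
    only a compact summary to the cloud (data trimming)."""
    batch_size: int = 60                           # e.g. one reading per second, one upload per minute
    readings: list = field(default_factory=list)   # raw data held locally
    uploads: list = field(default_factory=list)    # stand-in for the cloud endpoint

    def record(self, heart_rate: float) -> None:
        self.readings.append(heart_rate)
        if len(self.readings) >= self.batch_size:
            self._flush()

    def _flush(self) -> None:
        # One small summary replaces batch_size raw readings.
        summary = {
            "count": len(self.readings),
            "mean": mean(self.readings),
            "max": max(self.readings),
        }
        self.uploads.append(summary)   # in practice: an HTTPS POST to the data-centre
        self.readings.clear()

agg = EdgeAggregator(batch_size=60)
for i in range(120):
    agg.record(70 + (i % 5))
print(len(agg.uploads))  # 2 summaries uploaded instead of 120 raw readings
```

The raw readings never leave the device; bandwidth falls by a factor of the batch size, and the cloud sees only aggregates, which is also where the privacy benefit comes from.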

Another challenge is the climate impact this new architecture poses. While data-centres are increasingly efficient in their processing, and often rely on clean energy sources, moving computing to less efficient devices at the edge of the network might create a problem. We are effectively trading latency against CO2 production.

For more information see:

Dastjerdi, A. V., Gupta, H., Calheiros, R. N., Ghosh, S. K., and Buyya, R. 2016. "Fog Computing: Principles, Architectures, and Applications," in Internet of Things: Principles and Paradigms. Elsevier / Morgan Kaufmann.

(Image Ian Furst (cc))

Netskope’s approach to Shadow IT security

On Wednesday last week I attended “Cloud Expo Europe” at London’s ExCeL centre. One particularly interesting product was Netskope (also a finalist in the UK Cloud Awards), which addresses the challenge of Shadow IT – employees’ use of cloud services which are not sanctioned by the corporate IT department.

According to Accenture (2013), “78% of cloud procurement comes from Strategic Business Units (SBUs), and only 28% from centralized IT functions”. Without some form of control, the data-protection and compliance challenges of this can prove huge. Users are also poorly skilled in making rational decisions about the safety of company data, and products like Netskope address this by examining firewall logs or running proxy servers, providing an easy interface so IT departments can enforce cloud access policies. The product analyses users’ access patterns and sends alerts, encrypts content on upload, blocks cloud transactions, and quarantines content for review by Legal or IT. It essentially monitors and stops employees doing anything risky.
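The log-analysis approach described above can be sketched in a few lines. This is a toy illustration, not Netskope's actual implementation: the domains, log format, and risk database below are all hypothetical stand-ins for the kind of cloud-service registry such products ship.

```python
# Illustrative sketch of log-based shadow-IT discovery: match the domains
# found in proxy/firewall log lines against a (hypothetical) cloud-service
# risk database, then decide whether to allow, alert, or block.
from urllib.parse import urlparse

RISK_DB = {  # hypothetical entries; real products catalogue thousands of services
    "approved-crm.example.com": {"risk": "low"},
    "freefileshare.example.net": {"risk": "high"},
}

def assess(log_line: str) -> str:
    user, url = log_line.split()          # assumed log format: "<user> <url>"
    domain = urlparse(url).hostname
    entry = RISK_DB.get(domain)
    if entry is None:
        return f"ALERT {user}: unknown cloud service {domain}"
    if entry["risk"] == "high":
        return f"BLOCK {user}: {domain} breaches policy"
    return f"ALLOW {user}: {domain}"

logs = [
    "alice https://approved-crm.example.com/contacts",
    "bob https://freefileshare.example.net/upload",
]
for line in logs:
    print(assess(line))
```

Note that the interesting intellectual property is not this matching logic but the risk database itself – which is exactly the point made below about where the product's real value lies.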

For me, the value of this product is its database of different cloud services, with detailed information on their safety and compliance. The product is, however, also really frustrating. At its heart is the assumption that the job of the IT professional is to monitor, control, and police employees. This puts IT in opposition to the other business functions. Why couldn’t this product instead have started from a different assumption – that employees are, mostly, just trying to do their work as efficiently as possible? While a few are bad, most are just ignorant of the risks. Netskope would have been fantastic if it had instead helped reduce this ignorance rather than policing users’ failures. Had it provided an employee portal allowing employees to evaluate cloud services prior to adoption, it would have promoted their effective use and allowed users to make rational decisions about adoption. The IT department would be in a facilitating role rather than a policing role, and employees would feel in control (rather than in fear). The safety would be just the same (with Netskope policing policy), but with users feeling part of that effort. Productivity gains might also be achieved as users are freed to try new, valuable IT services, knowing they are doing so safely and with management approval.

This isn’t to criticise Netskope for what it does do – but to call upon new approaches to thinking about the role of IT and the CIO in this cloud-future.

Strategy Security in the cloud – comments on Athens Cloud Computing Conference

(Image: Stefan Riepl (CC), with thanks.)

Attending the Cloud Computing Conference in Athens today, I was struck by the audience’s overarching interest in security. This is entirely understandable, and certainly should be a primary concern for IT directors, whose job is to keep the company safe in this dangerous digital world. As fellow speaker Ian Murphy discussed, hacking is available “as a service” today, and for little money hackers can be directed towards any organisation whose security protocols are substandard. This point was reiterated by Amar Singh.

What worries me, though, is that organisational strategy is not also considered a significant security concern in the face of the cloud. For me, IT directors should take a leading position in considering the strategic risks to their organisation from cloud-based services ripping the heart out of their business. Without considering how its business model might be undermined by cloud-based digital services, a company looks like a vacuum-valve or cathode-ray-tube manufacturer obsessing about whether its product can be stolen in production and delivery!

My rather random list of possible risks would include:

1) Disintermediation – Don Tapscott discussed many years ago how intermediary businesses can be lost as customers circumvent or replicate the intermediary and go direct. Cloud provides simple tools to create this type of business.

2) Cost collapse – Many businesses rely on cost inhibiting entry into their marketplaces. Automation, cloud, data abundance, and pay-as-you-go infrastructure can collapse the cost of entering some of these marketplaces. An example is animation, where small studios can now produce full feature films using cloud rendering services. In future, digital technologies are likely to do the same to many other areas of business which are today considered capital intensive.

3) Globally local – Prior to Uber, most people working in taxi services could not imagine that the value of their business would shift to include services provided from North America. Yet such platforms – like eBay and Airbnb – radically change the business model through their intense focus on value creation for users and their creation of brokerage services within a dual-sided market. (See Eisenmann, T. R., Parker, G., and Van Alstyne, M. W. (2006). “Strategies for Two-Sided Markets.” Harvard Business Review (10) for more on this type of business model.)

4) Service quality – Many existing companies struggle to respond to customers’ needs. Using cloud services, small businesses can emerge which provide much better ease of use and service by starting with a cloud-only strategy, uninhibited by existing legacy IT.

This is just a rather random list – with time I will try to develop these ideas into something more coherent! I welcome readers’ contributions.

My interview for the Financial Times on Cloud Regulation…

Follow this link for the video of an interview I did for the Financial Times on the regulation of cloud computing:

Understanding Cloud Computing – Financial Times.


Double trouble – why cloud is a question of balance | My New Blog on Cloud Pro

I have been invited to Blog on CloudPro – don’t worry I will keep posting here as well – but if you want to read my first posting see:

Double trouble – why cloud is a question of balance | Cloud Pro.

Cloud sourcing and innovation: slow train coming? FREE JOURNAL ARTICLE

An article I wrote with Edgar Whitley and Leslie Willcocks for the journal Strategic Outsourcing has been awarded the “Outstanding Paper of 2014” award. This means the article is freely available from the following website (articles usually cost $32, so quite a saving!). Please feel free to download a copy today:

Emerald Insight | Cloud sourcing and innovation: slow train coming?: A composite research study.


Purpose – Although cloud computing has been heralded as driving the innovation agenda, there is growing evidence that cloud computing is actually a “slow train coming”. The purpose of this paper is to seek to understand the factors that drive and inhibit the adoption of cloud computing, particularly in relation to its use for innovative practices.

Design/methodology/approach – The paper draws on a composite research base including two detailed surveys and interviews with 56 participants in the cloud supply chain undertaken between 2010 and 2013. The insights from this data are presented in relation to a set of antecedents to innovation and a cloud sourcing model of collaborative innovation.

Findings – The paper finds that while some features of cloud computing will hasten the adoption of cloud, and its use for innovative purposes by the enterprise, there are also clear challenges that need to be addressed before cloud can be adopted successfully. Interestingly, the analysis highlights that many of these challenges arise from the technological nature of cloud computing itself.

Research limitations/implications – The research highlights a series of factors that need to be better understood for the maximum benefit from cloud computing to be achieved. Further research is needed to assess the best responses to these challenges.

Practical implications – The research suggests that enterprises need to undertake a number of steps for the full benefits of cloud computing to be achieved. It suggests that collaborative innovation is not necessarily an immediate consequence of adopting cloud computing.

Originality/value – The paper draws on an extensive research base to provide empirically informed analysis of the complexities of adopting cloud computing for innovation.


BBC News – Microsoft ‘must release’ data held on Dublin server

The following news article – reported on the BBC but repeated elsewhere – is perhaps the most important issue for cloud computing today (particularly in the consumer space). Our post-Snowden world is being shaped by legal arguments in the USA which have profound implications for the use of global cloud services. If Microsoft is forced to hand over data from its Dublin data-centres, then companies concerned about the US gaining access to their data will have to avoid US companies entirely. Watch this space!

BBC News – Microsoft ‘must release’ data held on Dublin server.

The 7 deadly sins of cloud computing –

A thoughtful article which takes a road less travelled than the usual hysteria-type articles on cloud security…

The 7 deadly sins of cloud computing –

Virtualizing Networking and the Cloud Corporation

“The truth is, in 10 years, you’re not going to have highly skilled, highly paid people working with networking hardware.”.

via Mavericks Invent Future Internet Where Cisco Is Meaningless | Wired Enterprise |

In a fascinating article (sent to me by Ayesha Khanna – thanks) Wired’s Cade Metz explores the growth of a company which abstracts and virtualises networks through software.

Why is this interesting to me? Because I have argued that as cloud computing moves the data-centre from inside organisations to the cloud, we are likely to see cloud ecosystems emerge in which companies integrate cloud-provided services to create new forms of potentially more collaborative organisations – something I termed “the cloud corporation”. For example, a drinks company and an ice-cream company might integrate elements of their cloud-based ERP systems to develop and sell a new type of iced drink. But achieving this would require relaxations in their security and networking – the cloud-based ERP sits on the public internet, and users must leave the corporate network to interact with it.

However, virtualising the network suggests the opportunity to dynamically create a new type of network, flexibly defined in software, which integrates elements of the ice-cream, drinks, and ERP companies’ networks into a wholly private – albeit virtualised – network shared between them all. Significantly, achieving this would be a simple reconfiguration of the network software of these companies, rather than involving the installation and messy configuration of VPN appliances, various routers, and so on. Creating new types of collaborative business thus becomes all about the configuration of cloud-based software… no hardware involved. One step on the way to a kind of plug-and-play corporate collaboration.
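The “network as configuration” idea can be illustrated with a toy model. This is not any real SDN vendor’s API – the class and company names are invented for illustration – but it captures the essential point: the shared private overlay is just a named set of admitted endpoints, created and changed in software.

```python
# Toy model of a software-defined overlay: a virtual network is nothing
# more than configuration data naming which endpoints belong to it.
class VirtualNetwork:
    def __init__(self, name: str):
        self.name = name
        self.members = set()   # (company, endpoint) pairs admitted to the overlay

    def admit(self, company: str, endpoint: str) -> None:
        """Admitting an endpoint is a pure configuration change: no
        VPN appliances or router reconfiguration involved."""
        self.members.add((company, endpoint))

    def can_talk(self, a: tuple, b: tuple) -> bool:
        # Traffic is permitted only between endpoints inside the overlay.
        return a in self.members and b in self.members

# The drinks and ice-cream firms plus their ERP provider share one
# private, virtualised network for the iced-drink venture.
collab = VirtualNetwork("iced-drink-venture")
collab.admit("drinksco", "erp-gateway")
collab.admit("icecreamco", "erp-gateway")
collab.admit("erpvendor", "tenant-42")

print(collab.can_talk(("drinksco", "erp-gateway"), ("erpvendor", "tenant-42")))  # True
print(collab.can_talk(("drinksco", "erp-gateway"), ("drinksco", "laptop-17")))   # False
```

Dissolving the collaboration when the venture ends would be equally trivial – delete the configuration object – which is what makes the plug-and-play framing plausible.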