Anti-competitive Artificial Intelligence (AI) – [FT.com]

Yesterday’s FT carries a fascinating article (available here) on the role algorithms may increasingly play in price-rigging and collusion. Whereas previously humans colluded to fix prices, today’s profit-maximising algorithms may end up colluding in ways that are hard to detect and difficult to stop. Indeed, a recent OECD report states:

“Finding ways to prevent collusion between self-learning algorithms might be one of the biggest challenges that competition law enforcers have ever faced… [Algorithms and Big Data] may pose serious challenges to competition authorities in the future, as it may be very difficult, if not impossible, to prove an intention to co-ordinate prices, at least using current antitrust tools”.

While algorithmic trading has proliferated in financial services (reported in many popular books such as “Dark Pools”), it is their increasing use in consumer marketplaces which concerns the article’s authors – airline booking, hotels, and online retailing.

The problem for regulation is that “All of the economic models are based on human incentives and what we think humans rationally will do” (Terrell McSweeny, US FTC), while an AI algorithm which “learns” that its most profitable course of action is price coordination is poorly represented in our understanding.

“What happens if the machines realise it is in their interest to systematically and quickly raise prices in a co-ordinated way without deviating?” (Terrell McSweeny)

Indeed we might ask whether an algorithm which uses huge databases of historical demand and supply data, and detailed data of the competitive marketplace, to arrive at its most profitable price in the milliseconds of a webpage loading is acting competitively in keeping with market principles or against the consumer (who could never undertake similar analysis and therefore faces huge information asymmetry challenges).

An interesting example in the article is an app that tracks petrol pricing: because the app instantly alerts competitors that a price has been cut (and they can match the cut before demand shifts), it removes the incentive for anyone to discount.
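The mechanism can be sketched in a toy simulation (numbers invented for illustration, not from the article): when a rival needs a period to match a price cut, undercutting wins demand and prices erode; when matching is instant, the cut never pays and prices stay high.

```python
def simulate(reaction_lag, periods=50, start=100.0, cut=1.0):
    """Two identical sellers start at the same price. Each period, a
    seller undercuts by `cut` only if the rival takes at least one
    period to match (so the cut briefly captures extra demand)."""
    price = start
    for _ in range(periods):
        if reaction_lag > 0:
            # Undercutting captures the rival's demand until matched,
            # so someone always has an incentive to cut.
            price -= cut
        # With reaction_lag == 0 the price-tracking app lets the rival
        # match instantly: demand never shifts, so no one discounts.
    return price

print(simulate(reaction_lag=1))  # slow rivals: price erodes to 50.0
print(simulate(reaction_lag=0))  # instant matching: price stays at 100.0
```

On this crude model, perfect price transparency is what sustains the high price, which is precisely the article’s point.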

The article even states: “the availability of perfect information, a hallmark of free market theory, might harm rather than empower consumers”.

 

(Image (cc) Keith Cooper – thanks)

Building Mobility-as-a-Service in Berlin: The rhythms of information infrastructure coordination for smart cities

[The following article was jointly written with my PhD Student Ayesha Khanna. The article was published today on LSE Business Review http://blogs.lse.ac.uk/businessreview/ and is syndicated here with their agreement]

The 21st century has seen a growing recognition of the importance of cities in the world: not only does over half of humanity live in cities, but cities contribute 60 per cent of global GDP, consume 75 per cent of the world’s resources and generate 75 per cent of its carbon emissions. There is little doubt that the enlarging footprint of cities, with the rapid rate of urbanization in the developing world, will be where “the battle for sustainability will be won or lost” and, for those engaged in “smart-cities” initiatives, the focus of winning this battle is through the use of digital technology to efficiently manage resources. One of the key sectors for such smart-cities initiatives is transportation.

Transportation infrastructures today rely heavily on private car ownership, which is powered by fossil fuels, and public transportation, both of which operate independently of each other. Policy makers believe radical innovation in this sector is needed to move it to a more sustainable system of mobility.

To achieve the goal of sustainable, seamless, and efficient mobility, an infrastructure would be required that would allow residents to move away from private ownership to a combination of car-sharing and public transport. For example, such an intermodal chain of mobility might include taking a rented bicycle to the bus station, a bus to a stop near the office, and then a car-sharing service to the office, covering every step from origin to the last mile. Powered by renewable energy, electric vehicles could make this journey entirely green.

In order to create such a mobility infrastructure, all the services offered (buses, trains, car-sharing systems, charging stations, and payments) would have to be integrated using digital technology, in order to provide an urban resident with an easy way to map and take an intermodal journey using her smartphone. This change would transform transportation as we know it today into Mobility-as-a-Service, but it requires considerable innovation in the various heterogeneous digital computer-based systems (what we might term the information infrastructures) underpinning the physical transportation infrastructure. (For a more detailed account of the ideas of information infrastructure see Hanseth, O. and E. Monteiro, 1998.)

Framing an Academic Project

Academic research on how such mobility information infrastructures would grow from the constituent disparate systems that currently exist in silos has been nascent, especially on the topic of the coordination efforts required. Part of the reason is that many required elements of such infrastructures do not currently exist, and that cities are only just beginning to prototype them.

In our research, we use a theory of digital infrastructure coordination as a framework to unravel the forces that influence the development of a mobility focused information infrastructure, extending it to focus particularly on the influence of temporal rhythms within the coordination. Understanding this has important implications for policy makers seeking to better support smart-cities initiatives. Our research took us to Berlin and a project which was prototyping an integrated sustainable mobility system there.

The BeMobility Case Study

The BeMobility project, which ran from September 2009 to March 2014, was started as part of a concerted effort by the German government to become a market leader and innovator in electric mobility. A public-private partnership between the government and over 30 private and academic stakeholders, the goal of BeMobility was to prototype an integrated mobility services infrastructure that would be efficient, sustainable and seamless for Berlin residents. Germany’s largest railway operator, Deutsche Bahn, was chosen as the lead partner of the project, with the think-do tank InnoZ (an institute focused on future mobility research) as the project coordinator and intermediary. Organizations participating in the project ranged from energy providers like Vattenfall, to car manufacturers such as Daimler, to researchers from the Technical University of Berlin.

The project, despite facing many challenges, was able to prototype a transportation infrastructure which integrated electric car sharing with Berlin’s existing public transport system. In the second phase of the project, it further integrated this infrastructure with a micro-smart power-grid, providing insights into how such mobility services could be powered by renewable energies. While the integration effort was both at the hardware and software levels, our research studied the coordination efforts related to information infrastructure in particular.

“Integration of all this information is what we now call Mobility-as-a-Service. BeMobility was one of the first projects in the world to attempt to do it.” – Member of BeMobility Project

Findings and Discussion

Our analysis showed that individuals and organizations respond to coordination efforts based on a combination of historical cycles of funding, product development and market structures, and anticipated patterns of technology disruption, innovation plans and consumer behaviour. People’s actions in contributing to an integrated infrastructure are tempered not only by these past and future rhythms, but also by the limits of the technologies they encounter. Some of these limitations are physical in nature, such as the inability to integrate data due to a lack of specific computing interfaces, and some are political, such as blocked access to databases due to concerns about competitive espionage and customer privacy.

Our findings also surfaced the power of the intermediary as coordinator. Contrary to the limited perception of a coordinator as a project manager and accountant for a government funded project, we saw InnoZ emerge as a key driver of the information infrastructure integration. One of the most powerful tools for the intermediary was its role in mapping future rhythms of technology development. It achieved this by showcasing prototypes of different types of electric vehicles, charging stations, solar panels, and software systems, at InnoZ’s campus.

This campus itself acted as a mini-prototype where both hardware and software integration could be first implemented and tested. The ability to physically demonstrate how the micro-smart grid could connect with the car-sharing system to enable sustainable energy for electric cars, for example, both surprised and motivated other stakeholders to take the imminent possibility of a sustainable mobility infrastructure more seriously.

Ultimately, business stakeholders were especially concerned about the commercial viability of such radical innovation. Here too the intermediary proactively shaped their thinking by conducting its own extensive social science research on the behavioural patterns of current and future users. For example, by showing that young urban residents were more interested in car-sharing than private ownership of cars, InnoZ made a strong case for why an integrated infrastructure could also be a good business investment.

Implications

As more cities experiment with Mobility-as-a-Service, understanding the influence of rhythms on coordinating information infrastructure is helpful for policymakers. Insights that would be useful to policymakers include:

  • Keeping a budget for building an innovation lab, where cutting-edge technologies can be tested and integration efforts showcased, will lead to more engagement with stakeholders.
  • Working closely with the intermediary to conduct social research on the mobility habits of millennial urban dwellers will incentivise stakeholders by demonstrating a market for the smart infrastructure.
  • Anticipating the disciplinary inertia imposed by legacy systems and organizational practices, and including in the working group stakeholders whose temporal rhythms include innovative product cycles more in line with the goals of the integrated infrastructure, will help counter that inertia.

This study also contributes to the academic literature on information infrastructure development by providing insights on the role of time in coordinating integration efforts. It responds to a gap in the understanding of the evolution of large-scale multi-organizational infrastructures, specifically as they relate to mobility.

♣♣♣

Notes:

Will Venters is an Assistant Professor in the Department of Management at the London School of Economics and Political Science. His research focuses on the distributed development of widely distributed computing systems. His recent research has focused on digital infrastructure, cloud computing and knowledge management systems. He has researched various organisations including government-related organisations, the construction industry, telecoms, financial services, health, and the Large Hadron Collider at CERN. He has undertaken consultancy for a wide range of organisations, and has published articles in top journals including the Journal of Management Studies, MIS Quarterly, Information Systems Journal, Journal of Information Technology, and Information Technology and People (where he is also an associate editor). http://www.willventers.com

Ayesha Khanna is a digital technology and product strategy expert advising governments and companies on smart cities, future skills, and fintech. She spent more than a decade on Wall Street advising product innovation teams developing large scale trading, risk management and data analytics systems. Ayesha is CEO of LionLabs, a software engineering and design firm based in Singapore. She has a BA (honors) in Economics from Harvard University, an MS in Operations Research from Columbia University and is completing her PhD on smart city infrastructures at the London School of Economics.

Photo by Mueller felix (CC- thanks)

(cc) Kevin Dooley

Evolving your business alongside cloud services – V3 writeup of my talk at Cloud Expo Yesterday

I gave a talk at Cloud Expo at the London Excel centre yesterday on the need for a much more dynamic perspective towards cloud computing. V3.co.uk have written an article providing an excellent summary of the talk if you are interested:
http://www.v3.co.uk/v3-uk/news/2454551/enterprises-must-be-ready-to-evolve-alongside-cloud-services

Dr Will Venters, assistant professor of information systems at the London School of Economics, explained that companies integrating cloud services into their IT infrastructure need to establish fluid partnerships with multiple vendors, as opposed to purchasing a static product….

Drugs enter the digital age – Details of a research project I’m part of…

A team of us at the LSE have just won £700k to look at the complex digital processes and infrastructures surrounding future medicine delivery. The following is taken from the press release (link below).

The world’s health sector has gone digital, with electronic prescriptions, digitised supply chains and personalised medicine the new buzz words.

Earlier this year, the US biotech company Proteus announced that it had raised US$172 million for its pioneering tablets containing embedded microchips. These swallowable devices collect and report biometric data and can tell if a patient has taken their medication correctly.

In a similar breakthrough, Google has recently announced a prototype contact lens which measures glucose in a user’s tears and communicates this information to a mobile phone so that patients can better manage their medication.

Both innovations illustrate the hybrid devices that medicines have now become – and highlight the cumbersome and mostly paper-based current systems that are still being used to deliver medicines.

Dr Tony Cornford from LSE’s Department of Management hopes to make some headway in this area by spending the next two years exploring digital innovations in how drugs are supplied and used.

A £700,000 grant from Research Councils UK will allow Dr Cornford and a team of co-investigators from LSE, the University of Leeds, UCL, Brunel and the Health Foundation to map emerging new fields, such as electronic prescribing systems, intelligent medicines supply chains, new diagnostic and monitoring procedures, and personalised medicines based on individual genomic profiles.

Read the full article at: Drugs enter the digital age.

CWF: Will Venters – EM360 Podcast | Enterprise Management 360°

I was interviewed by Enterprise Management 360 at the cloud world forum – the podcast of the interview is now available on their site:

CWF: Will Venters – EM360 Podcast | Enterprise Management 360°.

Double trouble – why cloud is a question of balance |My New Blog on Cloud Pro

I have been invited to Blog on CloudPro – don’t worry I will keep posting here as well – but if you want to read my first posting see:

Double trouble – why cloud is a question of balance | Cloud Pro.

Simplicity and cloud computing

In my recent co-authored book on cloud computing [1] we argue that one of the primary desires driving the adoption of computing as a service (as opposed to as a product, such as software and hardware organised by the purchaser) was the desire for simplicity. We even adopted the term “Simplicity as a Service” to describe the disentanglement of complexity offered by the new pay-as-you-go computing services associated with cloud computing through, for example, more standardised contracting. Indeed, one of the primary motivations for many moves to the cloud is to simplify. Yet we quickly stumble upon a problem: while the term simplicity[2] is widely used in relation to cloud computing, we have very little understanding of what this simplicity actually means. Understanding simplicity better may help us better understand our procurement of this type of service.

In this short essay I want to unpick the concept of simplicity, and then apply it back to the issue of cloud computing. I consider simplicity from three directions, which I roughly define as Modularic, Aesthetic and Systemic simplicity.

Modularic Simplicity

Is simplicity a concern for simpler mechanisms providing the same service (i.e. a quartz watch is simpler than a Swiss automatic chronograph, yet both tell the time)? To be simpler must a device have fewer components? Or perhaps simplicity lies in the interrelation between components – the interfaces? If we consider simplicity in these terms we can seek to examine the modularity of objects – understanding how a service is composed of different services, and examining their underlying structures[3]. This is important for cloud computing, in which various technical services are often interconnected to provide a service – Netflix, for example, integrates various Amazon cloud services with my iPad’s app, and with movie content, to provide its service. By decoupling services into modules, the complexity of the constellation of modules can perhaps be better understood. Structures such as “hierarchies” are also used to keep things “simple”, and understanding such structures would help.

In this way simplicity is a calculation roughly based on counting components and their interfaces. Yet this seems, well… simplistic! For, as Aristotle highlighted, wholes are “more than the sum of their parts” – there is emergence and emergent behaviour. But more than this, there is variation in the simplicity of the components themselves.
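As a sketch, this counting view can be written down directly. The service names below are purely hypothetical; the score is exactly the naive components-plus-interfaces measure just described:

```python
# Hypothetical dependency graph for a streaming service's constellation
# of modules (names invented for illustration).
services = {
    "app":           ["streaming_api", "auth"],
    "streaming_api": ["storage", "encoder"],
    "auth":          ["storage"],
    "storage":       [],
    "encoder":       [],
}

def naive_complexity(deps):
    """The 'modularic' measure: count components plus the interfaces
    between them. Fewer parts and fewer couplings score as simpler."""
    components = len(deps)
    interfaces = sum(len(targets) for targets in deps.values())
    return components + interfaces

print(naive_complexity(services))  # 5 components + 5 interfaces = 10
```

Such a score captures structure but, as the next paragraphs argue, misses both emergence and the varying simplicity of the parts themselves.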

Aesthetic Simplicity

One problem with “modularic simplicity” is that even the simplest modular objects can vary considerably in their “simplicity”. Take two objects made of clay – a brick and a pottery vase. If both weigh the same they likely contain the same number of atoms. Yet most people would agree the brick is simpler: the vase’s atoms are in a structure which introduces intricacy and difference despite the material being identical. Similarly, two apparently similar digital MP3 files – seemingly random series of 0s and 1s – can vary considerably in their simplicity when realised as music: a flute solo versus a prog-rock band.
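One rough, computable stand-in for this intuition (my illustration, not a measure the essay proposes) is compressibility: two byte strings of identical length can differ enormously in how much structure they contain.

```python
import random
import zlib

random.seed(0)  # reproducible "prog-rock" bytes

# Two byte strings with the same "number of atoms" (1024 bytes each):
flute = bytes([60]) * 1024                                 # one sustained value, crudely
prog  = bytes(random.randrange(256) for _ in range(1024))  # dense variation

assert len(flute) == len(prog)  # identical size...
# ...yet the uniform string compresses to a fraction of the varied one:
print(len(zlib.compress(flute)) < len(zlib.compress(prog)))  # True
```

Counting the parts tells us nothing here; the difference lies in their arrangement, which is what an aesthetic view of simplicity tries to capture.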

Simplicity, then, is not inherent in the material, and any attempt to calculate simplicity by counting components and their relationships will be somewhat problematic. What, then, makes the vase more complex? As humans, perhaps we evaluate simplicity through our interpretation – an aesthetic concept of simplicity. This is certainly the perception of many designers, and reflects the design aspirations of Apple. From its first sales brochure’s proclamation that “Simplicity is the ultimate sophistication”[4], the company has championed the idea that computing should feel “simple” for humans – in particular, that the human should (in the words of its chief designer) “feel we can dominate [physical products]. As you bring order to complexity, you find a way to make the product defer to you. Simplicity… isn’t just visual style,… minimalism or the absence of clutter.” For Apple and its vice-president of design Jonathan Ive, simplicity is about the removal of the unessential – and the reassertion of the whole (the form of the final product) over the parts (the components which make up that whole) – but wholly centred around the human user.

This concept is also represented in Ockham’s razor[5] – the assumption that simpler explanations are better, despite there being no irrefutable logical principle that this is the case (though simpler explanations are more easily tested).

A human interpretation is required – cloud computing is considered “simple” in relation to its use in doing something for humans. It can only be evaluated at the level of its use (just as an iPhone is only simple when held in the hand and used – not when taken apart and examined from within, where its myriad complexity becomes evident).

Systemic Simplicity

If modularic simplicity places the “thing” at the centre of simplicity, and if, in contrast, aesthetic simplicity places humans at the heart of defining what is simple, then perhaps we can define simplicity in terms of the interrelationship between things and people – a kind of socio-technical perspective towards simplicity: a view of simplicity in terms of the complex social and technical arrangements of life through which we get things done – such as, for example, organisations.

In many ways this simplicity might be defined by its absence – the lack of simplicity of modern organisations and their technical arrangements. The role of managers is thus often seen as seeking to organise things to be “simpler”. Yet most organisations are never simple, and to aspire to make them so may be problematic. Miller[6] argued that “organisations lapse into decline precisely because they amplify and extend a single strength or function while neglecting most others. Ultimately, a rich and complex organisation becomes excessively simple – it turns into a monolithic, narrowly focused version of its former self, converting a formula for success into a path towards failure.”[7] For Miller, simplicity is an overwhelming preoccupation with a single goal, strategic activity, department or world-view – and making things simple by simplifying the organisation is therefore often problematic. This suggests that understanding what can be simplified, and what cannot, requires a rich appreciation of the complexity of the organisation.

Indeed, the origins of cybernetics[8] and complexity theory highlight that management must meet the complexity of a situation with a similar level of complexity in its response[9]. A manager’s response to organisational complexity cannot simply be a simplification of their own actions if they cannot similarly understand or simplify the environment within which the organisation resides.
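Ashby’s law of requisite variety can be put in miniature. This toy bound (my sketch, not Ashby’s own formulation) shows why a regulator with few responses cannot absorb a richly varied environment:

```python
from math import ceil

def min_surviving_variety(n_disturbances, n_responses):
    """Requisite variety in toy form: even if each response perfectly
    neutralises a whole block of disturbances, distinct disturbances
    sharing one response produce distinct outcomes, so at least
    ceil(D / R) different outcomes reach the organisation."""
    return ceil(n_disturbances / n_responses)

# A manager with 3 standard responses facing 12 kinds of situation
# cannot reduce the outcomes below 4 distinct states:
print(min_surviving_variety(12, 3))  # 4
```

Only a regulator with as many responses as the environment has disturbances can hold the outcome to a single state, which is the substance of the point above.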

Simplifying without this understanding is, however, often what managers seek to do. And in making things simple they often rely on relatively simple models of the organisation to help them make decisions. Whether it is the organisation chart, the process diagram or the UML model, their attempts to derive simplicity are focused on such simplifications. As Stafford Beer[10] reminded us, managers become bewitched by the paper representations of their organisations as a “surrogate world we manage”, losing contact with the messiness of their world[11] and assuming simplicity in the world rather than seeking to simplify the world.

Beer goes further to highlight that “if a simple process is applied to complicated data, then only a small portion of that data will be registered, attended to, and make unequivocal. Most of the input will remain untouched and will remain a puzzle”.

This is not to say that we should not attempt to simplify our understanding of organisations into models and representations, but that we must acknowledge these models as “simple”, and ensure that we remain attuned to their alignment with the complexity of that which they represent.

When we buy cloud computing services which aim to change our organisation in some way we must be careful that we are not selecting the computing model based on a simplistic understanding of what the organisation is trying to achieve.

What can management and cloud computing learn from this?

From these three conceptualisations of simplicity we can draw some lessons for organisational managers and for cloud computing:

1) Simplicity isn’t always inherent in devices or technology; it relates to their interpretation and representation. We should seek to model simplicity in ways which reflect this.

2) Simplifying computing systems must be met with an understanding of the level of complexity of the task they are for. Selecting too simple a service is problematic[12].

3) Simplicity does not necessarily mean less complex. Rather, it can relate to the use of the service at the interface being observed. In procuring a service we should be attuned to the lack of simplicity at different levels.

© 2014 W.Venters.

[1] Willcocks, L., W. Venters and E. Whitley (2013). Moving to the Cloud Corporation. Basingstoke, Palgrave Macmillan.

[2] I acknowledge the contribution of PA consulting in raising with me a concern for better understanding simplicity.

[3] Baldwin, C. and K. Clark (2000). Design Rules: The power of modularity. Cambridge,MA, MIT Press.

[4] Isaacson, W. (2011). Steve Jobs, Little Brown. Page 343.

[5] http://en.wikipedia.org/wiki/Occam’s_razor

[6] Miller, D. (1993). “The Architecture of Simplicity.” The Academy of Management Review 18(1): 116-138.

[7] Miller, D. (1993). “The Architecture of Simplicity.” The Academy of Management Review 18(1): 116-138.

[8] Ashby, W. R. (1956). An introduction to cybernetics. London, Methuen & Co Ltd. Churchman, C., R. Ackoff and E. Arnoff (1957). Introduction to Operations Research. New York, Wiley.

[9] This is inherent in Ashby’s law of “requisite variety” – though different terms are used.

[10] Beer, S. (1984). “The Viable System Model: Its provenance, development, methodology and pathology.” Journal of the Operational Research Society 35: 7-36.

 

[11] Pickering, A. (2013). Living in the material world. Materiality and Space: Organizations, Artefacts and Practices. F.-X. de Vaujany and N. Mitev, Palgrave Macmillan.

[12] I discuss this in much more detail through the term “Variety” in Venters, W. and E. Whitley (2012). “A Critical Review of Cloud Computing: Researching Desires and Realities.” Journal of Information Technology 27(3): 179-197.