Remedies for Dysfunctional Complexity

Text Analysis | Maximilian Braun

A Text Analysis of “The Coming Software Apocalypse” by James Somers

Synopsis

What is at stake, for whom, and who is held responsible in a software apocalypse? Maximilian Braun reviews James Somers’ “The Coming Software Apocalypse”, drawing on STS notions of responsibility and responsiveness.

Keywords: infrastructure studies, platform studies, software, programming, code is law

 

From Macro to Micro: Software as an Important Infrastructure
Dense is the new big: after decades of ever more boundary-pushing construction endeavours, our senses now confront the micro and nano scale of mankind’s technical innovation capabilities. Unlike the skyscrapers and roads of yesterday (Somers 2017: 3), which unfolded right before our eyes, Moore’s law (Moore 1965), centralized storage systems and distributed network technologies enable the establishment of unobservable, more intrusive architectures that guide our interactions with the reality surrounding us. An abundance of software technologies renegotiates our understanding of coded programs: from convenient, situated solutions to a necessary infrastructure that forms an essential part of our everyday life. However, this new level of technological density comes at a cost: whereas former infrastructures mostly comprised perceivable physical components, the realm of software confronts us with an uncanny opacity.

The reviewed text argues that coded structures have reached a degree of complexity that makes it impossible to fully avoid potentially dysfunctional behaviour, even for experienced programmers. The tragedies around the “911 outage” (Somers 2017: 1) and Toyota’s “unintended acceleration incidents” (ibid.: 4) speak for themselves. However, a possible remedy is depicted as residing in the simple formula “Inventing on Principle” (ibid.: 7). This way of designing software adheres to the following imperative: make Integrated Development Environments (IDEs) and Software Development Kits (SDKs) – in brief, programming frameworks that require a developer to hack thousands of lines of abstract code into a textual editor – a thing of the past! Instead, create development frameworks that are “truly responsive” (ibid.: 9), i.e. that provide instant feedback by simulating how the system will behave when certain parameters change, preferably “with knobs, buttons and sliders that the user learns to play like an instrument” (ibid.: 8).
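
To make this concrete, the following minimal Python sketch – a toy model with purely hypothetical names, not Victor’s actual demo – imitates such instant feedback: turning the gravity “knob” immediately re-renders the jump, with no edit-compile-run cycle in between.

    # A minimal sketch of "truly responsive" tuning, assuming a toy physics
    # model: every parameter change immediately recomputes the visible result.
    # All names are hypothetical.

    def jump_trajectory(gravity: float, jump_velocity: float, steps: int = 12) -> list[float]:
        """Simulate the height of a jump over discrete time steps."""
        height, velocity, trajectory = 0.0, jump_velocity, []
        for _ in range(steps):
            trajectory.append(round(height, 2))
            velocity -= gravity               # gravity pulls the jump back down
            height = max(0.0, height + velocity)
        return trajectory

    def render(trajectory: list[float]) -> None:
        """Crude 'separate window': draw each height as a bar of asterisks."""
        for h in trajectory:
            print("*" * int(h * 2))

    # The 'slider': each new value triggers instant feedback.
    for gravity in (1.0, 2.5):                # e.g. turning the gravity knob up
        print(f"--- gravity = {gravity} ---")
        render(jump_trajectory(gravity=gravity, jump_velocity=5.0))

A higher gravity value visibly shortens the jump the moment it is set – a crude stand-in for the live simulation window Somers describes.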

So far, so easy. But all this is a double-edged sword. We are not just talking about defective products or disappointing services. It is about safety, security and the reliability of regulations that protect our integrity when living in an infrastructure full of such complex systems. So, rather than merely introducing simpler ways of designing software and software infrastructures, we should think about who is held prospectively responsible for them, who determines their scope of function and whether these questions may only be addressed to the creators of code.

Superfluous Developers and Modularized Responsibility?
How can responsibility be ascribed for systems whose “complexity is invisible to the eye” (Somers 2017: 3)? We might find answers in the transboundary experiment between infrastructure and platform studies, two once opposing, now partly complementary domains of research related to Science and Technology Studies (STS). Plantin et al. (2018) try to reconcile the two disciplines: infrastructures are described as “heterogeneous systems and networks connected via sociotechnical gateways”, whereas platforms comprise a “programmable, stable core system” and “modular, variable complementary components” (ibid.: 1). Bringing the issue back to questions of responsibility, ensuring stability still requires a special sort of human contribution, i.e. multiple instances of that type of engineer “whose work quietly keeps the internet running” (Somers 2017: 14).

In the light of the differentiation above, we might argue that Somers opts for a “platformization of infrastructures” (Plantin et al. 2018: 1) when he summons Bret Victor’s conviction that the developer’s role is to make herself superfluous (Somers 2017: 10). Software tools created with such an intention resemble platforms that combine stability and variability with finite application possibilities, just as Victor demonstrated in his Mario-like jump’n’run framework (ibid.: 8). Here, jumping and running suffice as core functionalities, and the effects of certain adjustments (e.g. higher gravity) are shown to the programmer directly in a separate window, without having to replay the game.
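
A minimal sketch of this pattern – with hypothetical names that do not correspond to Victor’s actual framework – could combine a stable core that only knows how to run and jump with a registry where modular, variable components plug in:

    # A sketch of the platform pattern Plantin et al. describe: a small,
    # stable core plus modular complementary components. All names are
    # hypothetical illustrations, not an existing API.

    from typing import Callable

    class PlatformCore:
        """Stable core: only running and jumping are built in."""

        def __init__(self) -> None:
            self._modules: dict[str, Callable[[str], str]] = {}

        def run(self) -> str:
            return "running"

        def jump(self) -> str:
            return "jumping"

        def register(self, name: str, module: Callable[[str], str]) -> None:
            """Complementary components plug in without touching the core."""
            self._modules[name] = module

        def invoke(self, name: str, state: str) -> str:
            return self._modules[name](state)

    core = PlatformCore()
    core.register("double_jump", lambda state: state + " twice")  # variable add-on
    print(core.invoke("double_jump", core.jump()))                # -> "jumping twice"

The core stays untouched while complementary components come and go – precisely the combination of stability and variability that Plantin et al. attribute to platforms.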

It might be a feasible approach to further platformize infrastructures and thereby split complementary from vital core components, but deliberation must not fall short in this process. The main question is whether existing structures of “organized irresponsibility” (Beck 1995) are amenable enough to render them responsible. They must be equipped with a prospective view of responsibility, with responsiveness (i.e. adaptive and deliberative capacities) as a key dimension (Somers 2017: 9). Otherwise, solutions fall short when dealing with hazards that cannot be tackled within a retrospective, knowledge-based responsibility framework (cf. Owen et al. 2013). The outcome would rather resemble Toyota’s line of reasoning after the unintended acceleration incidents: it could be poorly designed floor mats, sticky pedals or a driver error, so why should it be our software (Somers 2017: 4)?

The Maintenance of Ever-evolving Programs
The bane and boon of software is its flexibility, which seems to effectively outperform all benefits of solid and reliable hardware. Well-dried solder joints and firmly mounted electrical components on circuit boards are no longer an obstacle to the diverse application possibilities of the manifold software instances running on and between chips, memories and APIs. Instead, updates have become the maintenance of today.

Somers mixes up different levels of abstraction in coding (e.g. high-level languages such as C and JavaScript versus Programmable Logic Controller (PLC) languages that work in a quite different fashion), which makes it hard to derive concrete solutions from his accounts. Also, additional functional requirements (like throwing pizzas or digging holes in the case of the Mario example) will always require coding skills – and there the story begins anew. This leads to the same vicious procedure outlined above, only that it is not a monolithic program with “feature after feature piling on top of” it (Somers 2017: 5), but the framework itself. The maintenance work and the unobtrusive slippery slope of adding features to keep the system meeting new requirements is merely shifted from the program itself to its interfaces.
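
To illustrate that gap in abstraction levels, the following sketch – written in Python purely for comparison, since real PLCs are programmed in IEC 61131-3 languages such as ladder logic or structured text – contrasts the cyclic scan-loop execution model of a PLC with the call-driven style of high-level code:

    # Hedged illustration: PLC programs run an endless scan cycle (read
    # inputs, evaluate logic, write outputs), while high-level programs are
    # typically call- or event-driven. Names here are hypothetical.

    def plc_scan_cycle(read_inputs, write_outputs, cycles: int = 3) -> None:
        """Deterministic scan loop, the execution model of a typical PLC."""
        for _ in range(cycles):               # real PLCs loop indefinitely
            pedal, brake = read_inputs()
            throttle = pedal and not brake    # one 'rung' of interlock logic
            write_outputs(throttle)

    # High-level, call-driven equivalent: logic runs only when invoked.
    def on_pedal_event(pedal: bool, brake: bool) -> bool:
        return pedal and not brake

    inputs = iter([(True, False), (True, True), (False, False)])
    plc_scan_cycle(lambda: next(inputs), lambda t: print("throttle:", t))
    print("throttle:", on_pedal_event(True, False))

Reasoning about a deterministic scan cycle differs fundamentally from reasoning about call-driven code, which is why remedies designed for one level of abstraction do not automatically transfer to the other.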

Questions of responsibility should specifically address the issue of modularization. It must be ensured that programs remain controllable and maintainable, are subject to regular updates and fall under the responsibility of a dedicated software development team. Otherwise, one runs the risk that the frameworks themselves become the very untameable monsters they were meant to avoid. Also, it cannot hold that the knowledge and competences needed to assess the difficulties arising from complex software systems are not represented in the committees in charge of soft-lawing practices (ibid.: 13). If “software ‘is eating the world’” (ibid.: 2), then at least our regulatory institutions should know how its metabolism works.

 

References

Beck, Ulrich (1995): Ecological Politics in an Age of Risk. Cambridge: Polity Press.

Moore, Gordon E. (1965): Cramming More Components onto Integrated Circuits. In Electronics 38 (8), pp. 114–117.

Owen, Richard; Stilgoe, Jack; Macnaghten, Phil; Gorman, Mike; Fisher, Erik; Guston, Dave (2013): A Framework for Responsible Innovation. In Richard Owen, John Bessant, Maggy Heintz (Eds.): Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. Chichester, UK: John Wiley & Sons, pp. 27–50.

Plantin, Jean-Christophe; Lagoze, Carl; Edwards, Paul N.; Sandvig, Christian (2018): Infrastructure Studies Meet Platform Studies in the Age of Google and Facebook. In New Media & Society 20 (1), pp. 293–310.

Somers, James (2017): The Coming Software Apocalypse. In The Atlantic, 26 September 2017. Washington: The Atlantic Monthly Group LLC.