Over the past year we have experienced something similar in the financial system: a dramatic and unpredictable cascade of events that has produced the economic equivalent of a global blackout. As governments struggle to fix the crisis, experts have weighed in on the causes of the meltdown, from excess leverage, to lax oversight, to the way executives are paid.
Although these explanations can help account for how individual banks, insurers, and so on got themselves into trouble, they gloss over a larger question: how these institutions collectively managed to put trillions of dollars at risk without being detected. Ultimately, therefore, they fail to address the all-important issue of what can be done to avoid a repeat disaster.
Answering these questions properly requires us to grapple with what is called "systemic risk." Much like the power grid, the financial system is a series of complex, interlocking contingencies. And in such a system, the biggest risk of all - that the system as a whole might fail - is not related in any simple way to the risk profiles of its individual parts. Like a downed tree, the failure of one part of the system can trigger an unpredictable cascade that can propagate throughout the entire system.
There is no need to retrace the whole of his article, though I do intend to return to his conclusions.
Overall, he has covered the bases fairly well, provided that you are happy with a two-base hit. The idea he has used is apropos and makes for an interesting read.
The “Out” call at third, though, comes because he has missed an extremely important factor; one which does not in any way contradict what he has said, but which in fact augments the premises he started with.
It starts with the “complex system”. Now I, and anyone else with half a brain, know what Watts is talking about here: essentially a system with a large number of interrelated and interacting nodes, where many of the relationships and interactions are not clear or well defined.
Being an old-time computer-wallah, and I mean OLD, I find that “systems” has a flavour which turns my brain to parallels from that kind of world. So, with your patience, I will indulge.
“Complex” systems can be built from extremely simple parts; the complexity can arise from a number of different factors.
The most critical of these must be the system “output”. Many moons back I taught myself to write QBasic, a language used by a number of accounting systems as well as the general home hobbyist. One of the programs I wrote with it used a model from Scientific American. It was two fairly simple formulas of two factors and a trigonometric (if I remember rightly, “atan”) function. By taking the “answers” from the first calculation and feeding them back into the formulas, it was possible to print to the screen (in colour, eventually) a veritable Persian carpet. Given the same two seed values, the result was always the same: the same carpet was generated. Choosing a different pair of seed values, though, could produce a totally different carpet, or nothing at all.
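The exact formulas from that old QBasic program are long gone, so as a sketch of the same idea the snippet below substitutes the well-known Clifford map (my stand-in, not the original Scientific American model): two simple formulas of two factors, with each pair of answers fed straight back in as the next inputs.

```python
import math

def carpet(seed_x, seed_y, a=-1.4, b=1.6, c=1.0, d=0.7, steps=10_000):
    """Iterate a two-variable map, feeding each result back in as the
    next input. The Clifford map here is a stand-in for the lost
    QBasic formulas; a, b, c, d play the role of the fixed constants.
    Returns the list of (x, y) points that would be plotted as the
    'carpet'."""
    x, y = seed_x, seed_y
    points = []
    for _ in range(steps):
        # two simple formulas of two factors, re-fed on every pass
        x, y = (math.sin(a * y) + c * math.cos(a * x),
                math.sin(b * x) + d * math.cos(b * y))
        points.append((x, y))
    return points

# Same seed values, same carpet; a slightly different pair of seeds
# sends the iteration down an entirely different trajectory.
same = carpet(0.1, 0.1) == carpet(0.1, 0.1)
different = carpet(0.1, 0.1) != carpet(0.1001, 0.1)
```

Scatter-plotting the returned points (for instance with matplotlib) is what produces the carpet-like pattern on screen; the point here is only that the whole picture is determined by two seed values and a trivially simple rule.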
Now that original model came out of the first days of “chaos calculation” and reiterative functions. Who remembers Mandelbrot and fractals these days, other than Wikipedia? It was used to illustrate that a reiterative function could, in time, produce results that varied hugely from the initial calculations, and even from the immediately prior calculations.
The point here is that the output from the systems Watts is discussing is comparatively, deceptively, simple. It is intended to be, so that users (the people who invest money in the system) can understand the levels of risk impinging on their investments and the value those investments have earned. It is the Persian carpet hung on the wall: an agglomeration of pretty colours and an apparently organised pattern.

Understanding the detail that lies behind it, the seed values and the process, is beyond any one person. The output is prepared by accumulating detail into increasingly generalised categories, by people who “understand” what they are calculating but cannot necessarily grasp the enormity of the detail and risk handed to them as inputs by others, each of whom works to prepare summarised descriptions of the detailed data for which they are individually responsible.