WikiLeaks: Critical Foreign Dependencies

[ cross-posted from Zenpundit ]

The most interesting part of the WikiLeaks-posted State Department Request for Information: Critical Foreign Dependencies, it seems to me, is the part that ties in with Zen’s recent post Simplification for Strategic Leverage.

Zen referenced Eric Berlow’s recent TED talk to the effect that sometimes a complex network can be made effectively simple by reducing it to the graph of nodes and links within one, two or three degrees of the node you care about and wish to influence.

“Simplicity often lies on the other side of complexity”, Dr Berlow says, and “The more you step back, embrace complexity, the better chance you have of finding simple answers, and it’s often different than the simple answer that you started with.”
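To make that reduction concrete, here is a minimal sketch in Python with networkx (my own toy illustration, not anything from Berlow’s talk): keep only the nodes and links within a couple of degrees of the node you care about.

```python
import networkx as nx

# A toy "complex" network: a random graph standing in for the full
# web of interdependencies.
G = nx.erdos_renyi_graph(n=200, p=0.03, seed=42)

# The node we care about and wish to influence.
focus = 0

# Keep only the nodes within two degrees (hops) of the focal node,
# plus the links among them: the "effectively simple" local view.
local_view = nx.ego_graph(G, focus, radius=2)

print(f"full network: {G.number_of_nodes()} nodes, {G.number_of_edges()} links")
print(f"within 2 degrees of node {focus}: {local_view.number_of_nodes()} nodes, "
      f"{local_view.number_of_edges()} links")
```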

*

This resonates neatly with a few things I’ve been thinking and talking about for some time now.

1. There’s the need for visualization tools that don’t operate with as many nodes as there are data points in a database like Starlight — I’ve been wanting to reduce the conceptual “load” that analysts or journos face from thousands, sometimes tens of thousands of nodes, to the five, seven, maybe ten or twelve nodes that the human mind can comfortably work with:

What I’m aiming for is a way of presenting the conflicting human feelings and understandings present in a single individual, or regarding a given topic in a small group, in a conceptual map format, with few enough nodes that the human mind can fairly easily see the major parallelisms and disjunctions, as an alternative to the linear format, always driving to its conclusion, that the white paper represents. Not as big as a book, therefore, let alone as vast as an enormous database that requires complex software like Starlight to graphically represent it, and not solely quantitative… but something you could sketch out on a napkin, showing nodes and connections, in a way that would be easily grasped and get some of the human and contextual side of an issue across.
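One hypothetical way to get from thousands of database nodes down to something napkin-sized (again a sketch only, with networkx standing in for whatever an analyst actually runs) is to cluster the graph into a handful of communities and draw only the clusters and the links between them:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# A stand-in for a large analyst database: a few hundred nodes.
G = nx.connected_caveman_graph(8, 30)   # 8 tight clusters of 30 nodes each

# Find a small number of communities...
communities = greedy_modularity_communities(G)

# ...and collapse each community into a single "napkin" node, keeping a
# link between two napkin nodes wherever any of their members were linked.
napkin = nx.quotient_graph(G, [set(c) for c in communities], relabel=True)

print(f"original: {G.number_of_nodes()} nodes, {G.number_of_edges()} links")
print(f"napkin map: {napkin.number_of_nodes()} nodes, {napkin.number_of_edges()} links")
```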

2. There’s the fact that the cause is typically non-obvious from the effect. In the words of Jay Forrester, the father of stocks and flows modeling:

From all normal personal experience, one learns that cause and effect are closely related in time and space. A difficulty or failure of the simple system is observed at once. The cause is obvious and immediately precedes the consequence. But in complex systems, all of these facts become fallacies. Cause and effect are not related in either time or space… the complex system is far more devious and diabolical than merely being different from the simple systems with which we have experience. Though it is truly different, it appears to be the same. In a situation where coincident symptoms appear to be causes, a person acts to dispel the symptoms. But the underlying causes remain. The treatment is either ineffective or actually detrimental. With a high degree of confidence we can say that the intuitive solutions to the problems of complex social systems will be wrong most of the time.
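Forrester’s point about cause and effect drifting apart in time shows up even in a toy stocks-and-flows chain (my own illustrative numbers, nothing from Forrester’s models): cut off the upstream flow and the downstream stock barely registers it for many steps.

```python
# Toy stocks-and-flows chain: orders -> backlog -> inventory -> deliveries.
# Orders stop at step 10, but deliveries drift down only slowly afterwards,
# so the "effect" is separated from the "cause" in time.

orders_rate = 10.0
backlog, inventory = 100.0, 100.0
DRAIN = 0.1   # fraction of each stock passed downstream per step

for t in range(40):
    if t == 10:
        orders_rate = 0.0          # the cause: orders stop here
    shipped = DRAIN * backlog      # backlog feeds inventory
    delivered = DRAIN * inventory  # inventory feeds deliveries
    backlog += orders_rate - shipped
    inventory += shipped - delivered
    if t % 5 == 0:
        print(f"step {t:2d}: orders={orders_rate:4.1f}  backlog={backlog:6.1f}  "
              f"inventory={inventory:6.1f}  delivered={delivered:5.2f}")
```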

3. There’s the need to map the critical dependencies of the world, which became glaringly obvious to me when we were trying to figure out the likely ripple effects that a major Y2K rollover glitch – or panic – might cause.

Don Beck of the National Values Center / Spiral Dynamics Group captured the possibility nicely when he characterized Y2K as “like a lightning bolt: when it strikes and lights up the sky, we will see the contours of our social systems.” Well, the lightning both struck and failed to strike: a team from the Mitre Corporation produced a voluminous report on what the material and social connectivity of the world boded in case of significant Y2K computer failures; we got our first major glimpse of the world weave; and very few of the possible cascading effects actually came to pass.

I still think there was a great deal to be gleaned there — as I’m quoted as saying here, I’m of the opinion that “a Y2K lessons learned might be a very valuable project, and even more that we could benefit from some sort of grand map of global interdependencies” — and agree with Tom Barnett, who wrote in The Pentagon’s New Map:

Whether Y2K turned out to be nothing or a complete disaster was less important, research-wise, than the thinking we pursued as we tried to imagine – in advance – what a terrible shock to the system would do to the United States and the world in this day and age.
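What such a grand map of global interdependencies would let you ask is precisely the Y2K question: if this node fails, what can the shock reach? A toy sketch, with entirely hypothetical nodes and dependencies of my own invention:

```python
import networkx as nx

# A tiny, hypothetical sliver of a dependency map: an edge A -> B
# means "B depends on A", so trouble at A can ripple toward B.
deps = nx.DiGraph([
    ("power grid", "telecom"),
    ("power grid", "water pumps"),
    ("telecom", "banking"),
    ("banking", "payroll"),
    ("water pumps", "hospitals"),
])

# The Y2K-style question: if one node fails, what might the shock reach?
failed = "power grid"
ripple = nx.descendants(deps, failed)
print(f"a failure at '{failed}' could ripple out to: {sorted(ripple)}")
```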

4. There’s the fact that such a mapping will necessarily criss-cross back and forth across the so-called Cartesian divide between body & mind (materiel and morale, wars and rumors of wars, banks and panics):

You will find I favor quotes and anecdotes as nodes in my personal style of mapping — which lacks the benefits of quantitative modeling, the precision with which feedback loops can be tracked, but more than compensates in my view, since it includes emotion, human identification, tone of voice.

The grand map I envision skitters across the so-styled “Cartesian divide” between mind and brain. It is not and cannot be limited to the “external” world, it is not and cannot be limited to the quantifiable, it locates powerful tugs on behavior within imagination and powerful tugs on vision within hard, solid fact.

Doubts in the mind and runs on the market may correlate closely across the divide, and we ignore the impacts of hope, fear, anger and insight at our peril.
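If it helps to see what such a map might look like in the machine, here is a toy encoding of my own (nothing canonical about it): quantities and quotes as nodes in the same graph, linked wherever one tugs on the other.

```python
import networkx as nx

# A map whose nodes sit on both sides of the "Cartesian divide":
# hard numbers and human voices in the same graph.
G = nx.Graph()
G.add_node("bank run", withdrawals_per_hour=12_000)                   # quantifiable
G.add_node("doubt", quote="I just don't trust them with my savings")  # anecdotal
G.add_edge("doubt", "bank run", relation="feeds")

for node, data in G.nodes(data=True):
    print(node, data)
```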

*

Getting back to the now celebrated WikiLeak, which even al-Qaida has noticed, here’s the bit — it’s really just an aside — that fascinates me:

Although they are important issues, Department is not/not seeking information at this time on second-order effects (e.g., public morale and confidence, and interdependency effects that might cascade from a disruption).
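In graph terms, and purely as a toy sketch of my own rather than anything in the cable, first-order effects are an asset’s direct dependents and second-order effects are the dependents of those dependents; a survey that stops at the first hop never sees the second.

```python
import networkx as nx

# Hypothetical dependency map; an edge A -> B means "B depends on A".
deps = nx.DiGraph([
    ("undersea cable", "telecom"),
    ("telecom", "banking"),
    ("telecom", "air traffic control"),
    ("banking", "public confidence"),
])

asset = "undersea cable"

first_order = set(deps.successors(asset))
second_order = {n for f in first_order for n in deps.successors(f)} - first_order - {asset}

print(f"first-order effects of losing '{asset}': {sorted(first_order)}")
print(f"second-order effects (what the cable sets aside): {sorted(second_order)}")
```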

It seems to me that the complex models which Starlight provides, and which Eric Berlow pillories, overshoot on one side of the problem – but avoiding all second-order effects?

One cause, one effect, no unintended consequences?

What was it that Dr Berlow just said? “if you focus only on that link, and then you black box the rest, it’s actually less predictable than if you step back, consider the entire system”…

Avoid all second-order effects?

If you ask me, that’s overshooting on the other side.