The Architecture of Complexity — Herbert Simon on Watchmaking, Hierarchies, and Decomposable Systems


Stacking dolls, hierarchies, and complex systems

Herbert Simon suggests that the architecture of complexity is usually hierarchic. In complex societies, for example, people organize themselves into groups and groups of groups to make communities, companies, economies, governments, and whatnot. 

Similarly, in biology, we can treat the cell as a building block of life. Cells form tissues, tissues form organs, and organs form the animal. Some animals organize further into social hierarchies. And an ecology springs forth from their flurry of interactions.

Symbolic systems exhibit similar traits. The English language, for instance, uses letters to make words, words to make phrases, and phrases to make sentences, paragraphs, and so on. And we can find comparable structures in music, programming, and mathematics. As Herbert Simon and Mie Augier explain:

“Most complex systems — be they natural, human or artificial — are hierarchical in structure. This does not refer to their internal relations of power or authority, but to the fact that they are divided into parts, and the parts into parts, and so on, like an elaborate collection of Chinese boxes.”

Mie Augier & Herbert Simon. (2000). Commentary on the Architecture of Complexity.

Notice, in particular, the nuance in Augier and Simon’s simile: the building blocks of complex systems are not like simple bricks stacked into a pyramid. Combination and aggregation may create something new and different, much like the Chinese boxes in their example or the Russian stacking dolls in the picture above.

The watchmakers Hora and Tempus

But why should we expect to find complex hierarchies and building blocks in nature, society, and technology in the first place? 

To answer this, Simon tells the story of two blind watchmakers: Hora and Tempus.1 Tempus, ever the perfectionist, made watches in the same way kids build toy arches: every part had to come together at once to make one functioning watch.

Unfortunately, every time Tempus was distracted, perhaps by a phone call or doorbell, the incomplete watch fell to pieces and he had to start again from scratch. You can imagine how unproductive watchmaking might be for Tempus on particularly disruptive days.

Hora, by contrast, took a different tack. She built her watches from stable subassemblies. Unlike Tempus, she lost little progress every time she was distracted; at worst, she lost the subassembly she was working on, then picked it up again and got on with it.
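
To make the arithmetic behind the story concrete, here is a minimal simulation sketch in Python. The parameters (a 100-part watch, ten-part subassemblies, a 5% chance of interruption per operation) are illustrative choices of mine, not Simon’s original figures.

```python
import random

def expected_operations(parts, p_interrupt, trials=2000):
    """Monte-Carlo estimate of how many assembly operations it takes to
    finish one unit of `parts` pieces when any interruption scraps the
    partially assembled unit."""
    total = 0
    for _ in range(trials):
        ops, done = 0, 0
        while done < parts:
            ops += 1
            if random.random() < p_interrupt:
                done = 0            # interruption: the unfinished unit falls to pieces
            else:
                done += 1
        total += ops
    return total / trials

p = 0.05  # chance of being interrupted on any single operation (illustrative)

# Tempus: one monolithic 100-part watch, assembled in a single run.
tempus = expected_operations(100, p)

# Hora: ten stable 10-part subassemblies, plus a final assembly joining
# those ten subassemblies -- eleven small ten-part units in total.
hora = 11 * expected_operations(10, p)

print(f"Tempus needs ~{tempus:,.0f} operations per finished watch")
print(f"Hora needs   ~{hora:,.0f} operations per finished watch")
```

Running the sketch, the monolithic strategy costs Tempus many times more work per finished watch, because an interruption never costs Hora more than the handful of operations in her current subassembly.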

As Simon explains:

“Complex systems will evolve from simple systems much more rapidly if there are stable intermediate forms than if there are not. The resulting complex forms in the former case will be hierarchic… Among possible complex forms, hierarchies are the ones that have the time to evolve.”

Herbert Simon. (1962). The Architecture of Complexity. 

1 Simon never said that Hora and Tempus were blind, but I feel the description strengthens the parallel to evolution in biology and economics (and to Richard Dawkins’ commentary in The Blind Watchmaker).

Building blocks and foggy mazes

Simon says we can draw a similar parallel to evolutionary biology. While there is no master plan at work, you can imagine how a mixture of random mutation and natural selection might first favor simpler, stable systems before arriving at more complex structures.

Our modern institutions, likewise, did not emerge spontaneously. Democratic systems, market economies, the theory of special relativity, and the Internet, for example, were products of trial and error, adaptive learning, occasional brilliance, and some good luck. They were generations in the making.

“Human problem-solving, from the most blundering to the most insightful, involves nothing more than varying mixtures of trial and error and selectivity. The selectivity derives from various rules of thumb, or heuristics, that suggest which paths should be tried first and which leads are promising. We do not need to postulate processes more sophisticated than those involved in organic evolution to explain how enormous problem mazes are cut down to quite reasonable size.”

Herbert Simon. (1962). The Architecture of Complexity. 
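
As a loose illustration of that selectivity (my own sketch, not an example from the essay), compare a blind breadth-first search of an open grid with a greedy search guided by a simple rule of thumb, the Manhattan distance to the goal:

```python
from collections import deque
import heapq

def neighbors(cell, size):
    x, y = cell
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < size and 0 <= ny < size:
            yield (nx, ny)

def blind_search(start, goal, size):
    """Breadth-first search: pure trial and error, no sense of direction."""
    frontier, seen, explored = deque([start]), {start}, 0
    while frontier:
        cell = frontier.popleft()
        explored += 1
        if cell == goal:
            return explored
        for nxt in neighbors(cell, size):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)

def heuristic_search(start, goal, size):
    """Greedy best-first search: a rule of thumb (Manhattan distance to the
    goal) decides which paths get tried first."""
    frontier, seen, explored = [(0, start)], {start}, 0
    while frontier:
        _, cell = heapq.heappop(frontier)
        explored += 1
        if cell == goal:
            return explored
        for nxt in neighbors(cell, size):
            if nxt not in seen:
                seen.add(nxt)
                dist = abs(nxt[0] - goal[0]) + abs(nxt[1] - goal[1])
                heapq.heappush(frontier, (dist, nxt))

size, start, goal = 50, (0, 0), (49, 49)
print("blind search:    ", blind_search(start, goal, size), "cells explored")
print("heuristic search:", heuristic_search(start, goal, size), "cells explored")
```

The heuristic doesn’t change what can be reached; it only changes which paths get tried first, and that alone collapses the number of cells explored. That is the sense in which selectivity cuts the problem maze down to size.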

Sampling rates and hierarchic levels

While the number of possible permutations of building blocks in complex systems is enormous, the probable permutations can be influenced by selection devices, feedback mechanisms, and the history and emergent properties of the system itself. The evolution of competition in nature and business offers classic examples. Vision or camouflage, for instance, may thrive in some environments but not others. And the very emergence or non-emergence of such traits will bias the fitness landscape and the selection of subsequent building blocks down the line.

Interestingly, in Complex Adaptive Systems: A Primer, John Holland suggests that “there is a strong relation between the level in the hierarchy and the amount of time it takes for competitions to be resolved”. He suggests that ecologies and economies “work on a much longer timescale” than the organism or the company does. We can find similar examples in other domains too. Your go-to phrase or song tends to change more frequently than your choice of language or favorite music genre.

Nearly decomposable systems

Likewise, Simon writes about the “near decomposability” of many complex systems, in which “intra-component linkages are generally stronger than inter-component linkages”. Think, for example, about relations in a firm: employees on the same team are more likely to correspond with one another than with employees in another division or office. The point is that when we think about systems, it is important to look at the level and frequency of interactions within and between components.

As Augier and Simon explain:

“Molecules are divided into atoms, and these into elementary particles; multi-celled organisms are divided into organs and tissues, and these into cells; and so on. The components at each level are not independent of each other, but there is much denser and more rapid interaction within the components at any level than between components at that level. Such systems are said to be nearly decomposable.”

Mie Augier & Herbert Simon. (2000). Commentary on the Architecture of Complexity.
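
One way to picture near decomposability (a toy sketch of mine, not Simon’s own formalism) is an interaction matrix that is nearly block-diagonal: strong couplings inside each component, faint couplings between components.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
blocks, block_size, weak = 3, 4, 0.05   # e.g., three teams of four people
n = blocks * block_size

# Start with faint inter-component linkages everywhere...
m = weak * rng.random((n, n))

# ...then overwrite the diagonal blocks with strong intra-component linkages.
intra = np.zeros((n, n), dtype=bool)
for b in range(blocks):
    lo, hi = b * block_size, (b + 1) * block_size
    m[lo:hi, lo:hi] = rng.random((block_size, block_size))
    intra[lo:hi, lo:hi] = True

print(f"mean intra-component strength: {m[intra].mean():.2f}")
print(f"mean inter-component strength: {m[~intra].mean():.2f}")
```

Read each block as a team in the firm example above: most interaction happens inside it, so in the short run each team can be analyzed almost in isolation, while the weak cross-links matter mainly for the slower, aggregate behavior of the whole system.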

System-environment interactions

Of course, a systems-only view provides an incomplete picture. We have to look at interactions between the system and its environment as well. For economic systems, game theory can be helpful: it encourages us to think about payoffs, move sets, and related features of the system. And incentive structures like the iterated prisoner’s dilemma, the game of chicken, and the preemption game can shape the course of system interactions.
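
For a concrete taste of that framing (a minimal sketch of mine, not a model from Simon or Holland), here is the iterated prisoner’s dilemma with two simple strategies:

```python
# Payoffs (row player, column player) for one round of the prisoner's dilemma:
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(history):
    """Cooperate first, then mirror the opponent's previous move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    history, score_a, score_b = [], 0, 0
    for _ in range(rounds):
        a = strategy_a(history)
        b = strategy_b([(theirs, mine) for mine, theirs in history])  # opponent's view
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        history.append((a, b))
    return score_a, score_b

print("tit-for-tat vs tit-for-tat:  ", play(tit_for_tat, tit_for_tat))
print("tit-for-tat vs always-defect:", play(tit_for_tat, always_defect))
```

Repeated play is what lets the incentive structure shape behavior over time: tit-for-tat sustains cooperation against itself and quickly limits its losses against a persistent defector.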

Holland also reminds us that while environments tend to contain exploitable niches and regularities, “there is [usually] no super process that can outcompete all others”. Indeed, ecologies tend to arise in complex systems. They may include generalists that average over many environments and specialists that excel in particular niches.

Even chess grandmasters tend to develop play styles and distinct biases to simplify their decision-making process. And in more complex domains like the economy, it isn’t surprising to find systems that “are always far from any optimum or equilibrium situation”. People are forever adapting and responding to their current and anticipated environment.

Reducing and measuring complexity

With all that said, hierarchy and near decomposability may give us a starting point for describing and understanding the architecture of complexity. Simon likens the process to an amateur’s attempt at drawing the human face: most of us begin “in a hierarchic fashion”, focusing on broad lines and shapes rather than minute details.

As Simon notes:

“If there are important systems in the world that are complex without being hierarchic, they may to a considerable extent escape our observation and understanding. Analysis of their behavior would involve such detailed knowledge and calculation of the interactions of their elementary parts that it would be beyond our capacities of memory or computation.”

Herbert Simon. (1962). The Architecture of Complexity. 

To end, it’s worth noting that hierarchies provide us with only one description of complexity. While we don’t have room to discuss them here, other measures include size (e.g., genome size), logical depth, algorithmic information content, statistical complexity, Shannon entropy, computational capacity, and fractal dimension.2

For a more general framing device, I like Seth Lloyd’s three questions in his Measures of Complexity: (1) “How hard is it to describe?” (2) “How hard is it to create?” and (3) “What is its degree of organization?” Indeed, our answers to these very questions may tell us something about the stacking dolls of which we are a part.
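
To give just one of those measures some texture, here is a back-of-the-envelope sketch of Shannon entropy, which offers a rough, partial answer to Lloyd’s first question (“How hard is it to describe?”):

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Average information per symbol, in bits."""
    counts = Counter(sequence)
    n = len(sequence)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("aaaaaaaaaaaa"))                    # 0.0: one symbol, nothing to describe
print(shannon_entropy("abababababab"))                    # 1.0: two symbols, used evenly
print(shannon_entropy("the architecture of complexity"))  # higher: a more varied alphabet
```

Entropy on its own is a blunt instrument (pure randomness scores highest of all), which is one reason the literature offers so many complementary measures.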

2 For more on measures of complexity, I recommend Melanie Mitchell’s wonderful book Complexity: A Guided Tour.

Sources and further reading