Bandwagon jumping
Many decades ago, the energy and defense corporation General Atomic began early development of a new high-temperature gas-cooled reactor (HTGR). At the time, many experts believed that the HTGR could be more efficient, more stable, and safer than the standard light-water reactor. Yet, despite its promise, the HTGR failed to capture market share. Adoption of light-water reactors outstripped every other type from the mid-1970s onward.
In his book, Infinite in All Directions, the physicist Freeman Dyson attributes the evolution of reactors to a “bandwagon-jumping” effect. You see, light-water technology had a “flying start” inside American submarines. The USS Nautilus, the first nuclear-powered submarine, prototyped a pressurized-water reactor in the 1950s, and that design soon became the industry standard. So when demand for civilian reactors arose, most buyers and sellers were primed for light-water. The HTGR, by contrast, was still in its infancy. A full-scale reactor had not yet been built, and “nothing could be done in less than twelve years.”
Now, the bandwagon effect is not irrational per se. Governments and industry are right to want the most affordable, efficient, and reliable option as soon as possible, and with as little risk as possible. So light-water won because of its head start along the learning curve. But as Dyson notes, “by jumping on [the bandwagon] too soon, the nuclear industry deprived itself of alternative technologies.” This is not to say that we should have invested in the HTGR or some other alternative. It is simply a reminder of the trajectories that development can take, and the opportunity costs that follow. Passage through one door closes another. As Dyson writes, “bandwagon-jumping is not always bad. Only, before you jump on, you should look carefully to see whether the bandwagon is moving in the direction you want to go.”
Quick is beautiful
It is also a reminder that “quick is beautiful”, says Dyson. Technologies that require decades of development and implementation are subject to immense unpredictability: everything from the vagaries of government funding to the prices of inputs to the arrival of superseding technologies. “An industry which takes twelve years to react will be perpetually too late”, Dyson argues. “And the people running the industry will experience sensations of paralysis and demoralization.”
Here, I cannot help but think of the scientists and engineers who committed a good part of their lives to the unfinished Superconducting Super Collider (SSC) near Waxahachie, Texas. With planning going as far back as 1976, they had envisioned a large particle accelerator to study the deepest mysteries of physics. But in 1993, after more than $2 billion of spending on consulting, planning, digging, constructing, and whatnot by the Department of Energy and the State of Texas, Congress axed the project, leaving local Texans with 30 kilometers of incomplete tunnels, empty buildings, and question marks.
False scale economies
It doesn’t help, either, that large enterprises are typically beset with false economies of scale. It is easy to neglect the costs of dysfunction, mismanagement, bureaucracy, and inflexibility. Much of the sluggishness and inertia in our political and economic centers, Dyson believes, is due to such illusions.
Moreover, enormous ventures tend to mean enormous committees. Boards, bureaus, commissions, panels, task forces, and round tables are set up to oversee their vision. Unfortunately, “the big missions are [typically] overemphasized” under such a build-up of hierarchy, Dyson observes. The fine print and rapid learning are lost to grandstanding from the ivory tower. Dyson believes that the space sciences and related ventures “will flourish only if we can move away from grand one-shot missions like the Hubble Space Telescope… toward smaller and more flexible missions in the style of Hipparcos.”
This is reminiscent of the lean manufacturing methods that boosted post-Second World War Japanese automakers into global pre-eminence, and of the lean-startup philosophies of Silicon Valley. As a bulwark against bureaucracy and false scale economies, their chief tenets emphasize, among other things, validated learning, just-in-time production, minimum viable products, autonomous ‘nervous systems’, waste reduction, management on the floor, fast failures, and faster pivots. To them, quick and lean was not just beautiful; it was survival.
Lock-in and reversibility
We should wonder, however, just how irreversible the effects of bandwagon-jumping and false economies of scale can be. Can we reverse our position on a technology once it achieves critical mass? From the wheel to the computer, the answer seems to be no. Many technologies and building blocks are enduring. The nineteenth-century Luddites, for example, could not halt the march of mechanization and technological displacement.
For good and bad, the “great inventions”, Dyson writes, “result in a permanent expansion of our horizons.” Now, this can be an existential point, especially when it comes to the development and management of our nuclear arsenal. Today, “nuclear states admit to owning about 13,000 warheads.” Surely an intentional or accidental launch would spell the end of modernity. Might we find a way to reverse course? Or must we live with the unstable stability borne by the threat of mutual destruction?
Right now, nuclear disarmament is probably unthinkable. “But the fact that a historic transition is unimaginable before it happens does not imply that it will never happen”, writes Dyson. “History is full of examples of transitions which upset deeply entrenched institutions and deeply held beliefs.” In the sphere of technology, he proffers two examples.
Carts and camels
The first dates back to a curious reversal in the Middle East around 500 AD. As the historian Richard Bulliet explains, “in ancient times, the Middle East teemed with carts and wagons and chariots, but they were totally driven out by the camel.” This may seem strange given that the opposite was true in Europe. But there was an invisible feedback loop at work. You see, in the beginning, camels in these Arab regions were only marginally more viable than the oxcart for freight and trade. But there was a crucial difference. The productivity of carts depended on roads and craftsmen. So as some people switched from cart to camel, demand for road and cart maintenance fell. This pushed others to switch from cart to camel as well, leading to further deterioration in roads and the supply of craftsmen. Before long, “not only the oxcart, but even the memory of its existence, disappeared from the Arab world”, writes Dyson. And “the word for a wheeled vehicle vanished from the Arab language for a thousand years.”
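To make that feedback loop concrete, here is a toy model of my own sketching (not Bulliet’s or Dyson’s, and every number in it is an assumption): road quality depends on how much cart traffic remains, and cart users defect to camels whenever camels pay better on the roads as they currently stand.

```python
# A toy model (not Bulliet's or Dyson's) of the cart-to-camel feedback loop.
# Roads are maintained in proportion to remaining cart traffic, and cart users
# defect to camels whenever camels out-perform carts on the current roads.

camel_utility = 1.0   # assumed constant payoff of hauling freight by camel
cart_base = 1.3       # payoff of a cart on perfectly maintained roads
camel_share = 0.15    # camels begin as a small minority option
road_quality = 1.0    # 1.0 = fully maintained roads, 0.0 = roads gone

for year in range(51):
    cart_utility = cart_base * road_quality
    if year % 10 == 0:
        print(f"year {year:2d}: camel share {camel_share:.2f}, "
              f"cart payoff {cart_utility:.2f}")
    # Road upkeep is paid for by cart traffic; fewer carts mean worse roads.
    road_quality = max(0.0, road_quality - 0.05 * camel_share)
    # Some remaining cart users switch whenever camels currently pay better.
    if camel_utility > cart_utility:
        camel_share = min(1.0, camel_share + 0.2 * (1.0 - camel_share))
```

Run it and the system sits quietly for a few decades while the roads slowly decay, then tips abruptly into an all-camel equilibrium: the lock-in dynamic in miniature.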
Firearms and katanas
For a second example, we travel to sixteenth-century Japan, after its first contact with European visitors. From these exchanges, the Japanese learned to make and use firearms, and they did so for many decades. The feudal lord Oda Nobunaga, for example, employed hundreds of musketeers to defeat his enemies. But after the Battle of Sekigahara in 1600, and the unification of Japan under the shōgun Tokugawa Ieyasu, gun manufacture and use were curtailed by the shogunate. This also appealed greatly to the sensibilities of the samurai, who felt threatened by the rise of guns, for “even a lowly peasant, armed with one of these matchlocks, could kill the noblest, most heroic warrior, and without even an introduction.” During this period, the katana returned to the military, while guns were relegated to hunting. This ethos held only until the nineteenth century, when Commodore Matthew Perry and the United States Navy arrived to practice gunboat diplomacy with Japan.
Inescapable dilemmas
History suggests that technology usually follows an inexorable path. But there are instances in which social, political, and economic forces are powerful enough to reverse course. Might a similar reversal apply to nuclear arms? It is hard to say. But a practical response, Dyson notes, must recognize that “it is not possible to replace something with nothing”; and that each nuclear state “must be allowed an equal share with our own in the shaping of [such a] program.”
Still, we must wonder whether the prisoner’s dilemma is ultimately inescapable. Indeed, both the Middle East and Japan eventually returned to their respective worlds of wheels and firearms. Unless humans destroy each other or die out from some other cause, technology might truly be a one-way street.
Forecast and fiction
Forecast and fiction are two ways to think about the prospects of technological change. The difference, Dyson notes, is that “science fiction does not pretend to predict… It deals in possibilities, not in probabilities.” After all, “neither cleverness nor stupidity is predictable.” And the most radical changes typically go unseen by both camps.
Dyson points, for example, to the polymath John von Neumann. Despite his brilliance and major contributions to mathematics, physics, economics, and computer science, von Neumann mispredicted the future of computers. In particular, he dreamed that computers would solve the problem of long-range weather prediction. What von Neumann underestimated, however, was the degree of chaos and complexity that rules our weather, and how sensitive some nonlinear systems can be to minor perturbations. Today, the “forecast lead time of midlatitude instantaneous weather is around ten days, which serves as the practical predictability limit.”
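As a loose illustration (my own sketch, not anything from Dyson or von Neumann), the minimal Python example below integrates Edward Lorenz’s 1963 convection model twice, with initial states that differ by one part in a billion, and watches the two “forecasts” drift apart.

```python
# Sensitive dependence on initial conditions, illustrated with Lorenz's 1963
# convection model. Two nearly identical initial states are integrated with a
# simple Euler scheme; their separation grows until the forecasts disagree.

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one Euler step of size dt."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def separation(a, b):
    """Euclidean distance between two states."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

a = (1.0, 1.0, 1.0)          # the "true" initial state
b = (1.0 + 1e-9, 1.0, 1.0)   # the same state, mismeasured by one part in a billion

dt, steps_per_unit = 0.001, 1000
for t in range(0, 41, 5):
    print(f"t = {t:2d}   separation = {separation(a, b):.3e}")
    for _ in range(5 * steps_per_unit):
        a, b = lorenz_step(a, dt), lorenz_step(b, dt)
```

The separation grows roughly exponentially until it is as large as the attractor itself, at which point the perturbed run says nothing about the original: the essence of the practical predictability limit mentioned above.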
Von Neumann was also “wrong about the evolution of the computer itself”, Dyson adds. “He was thinking of computers as big, expensive and rare, to be cared for by teams of experts and owned by prestigious institutions.” He did not imagine the rise of companies like Texas Instruments, Hewlett-Packard, or Intel Corporation. As Dyson notes, “the real wave of the future was to make computers small, cheap, and widely available.”
Persistent patterns
Now, Dyson’s assessment is perhaps a tad unfair to von Neumann, whose architectures and algorithms contributed fundamentally to the development of the modern computer. Indeed, few people in the first half of the twentieth century could foresee its bewildering potential.
But that is exactly Dyson’s point. “It was stupid of von Neumann” or anyone, he writes, “to push computers toward one grandiose objective and ignore the tremendous diversity of more modest applications to which computers would naturally adapt themselves.” While von Neumann and his contemporaries dreamed of applications for meteorology and defense, everyday folk clamored for Pac-Man and Space Invaders.
The future is difficult to predict because technology and engineering form a rainforest of possibilities. Dyson believes “the pattern resembles in some ways the rise and fall of species in the evolution of plants and animals.”
The dinosaurs, for example, were hugely successful during the Mesozoic era. Many of them grew large and set in their ways over millions of years, until an asteroid smashed into the Gulf of Mexico, triggering an impact winter and the Cretaceous extinction. Suddenly, the rules and hierarchies of ecology were upended. Those who had grown inflexible perished, while vacant niches were quickly filled by the furtive survivors.
Indeed, as Dyson writes: “when a technology has grown so big and sluggish that it can no longer bend with the winds of change, it is ripe for extinction… Birds and dinosaurs were cousins, but birds were small and agile while dinosaurs were big and clumsy… The future belongs to the birds.”
Sources and further reading
- Dyson, Freeman. (1988). Infinite in All Directions.
- Schelling, Thomas C. (1966). Arms and Influence.
- Alchian, Armen. (1950). Uncertainty, Evolution, and Economic Theory.
- Grove, Andrew. (1996). Only the Paranoid Survive.
- List, John. (2022). The Voltage Effect.
- Lorenz, Edward. (1972). Predictability.
- Kremer, Michael. (1993). Population Growth and Technological Change.