Complex Organizations was a serial of sorts. The first edition (1972) argued for the importance of bureaucracy in organizations - in contrast, most notably, to the human relations school that was then dominant - and for the importance of organizations in shaping society. True to the subtitle, A Critical Essay, its chapter on the institutional school argued that power played a larger role than the institutional school, and its leading figure, my mentor Philip Selznick, admitted. The 1979 installment expanded and sharpened the 1972 chapter on the environment; critically examined a new boy on the block, the population-ecology school; but gushed over another new entry, network analysis. In the last installment, in 1986, the serial finally found the implicit theme of the previous ones and pushed for a power analysis of both the internal operation of organizations and especially their role in society. I have not moved from a power analysis since.
But back in 1979 I had been asked for a position paper on the accident at the Three Mile Island nuclear power plant in Pennsylvania. Something other than power was at work here, I quickly decided, and it was a dimension of organizational structure (complexity) and process (tight coupling) that had not been explored. The immense complexity of some industrial organizations and their tight internal connections occasionally allowed even some small local failures, inevitable in complex systems, to cascade through the system and bring it down. If the system also had catastrophic potential, perhaps it should not exist. In 1984, I published Normal Accidents, concerned with accidents in a variety of risky systems.
Normal Accidents drew attention to two different forms of organizational structure that Herbert Simon had pointed to years before: vertical integration and what we now call modularity. Examining risky systems in the accident book, I focused upon the unexpected interactions of different parts of the system that no designer could have anticipated and no operator could comprehend or interdict. The difficulty of anticipating unexpected interactions appeared to stem from a system design that emphasized efficiency, speedy construction, and speedy operation. In contrast, an equally complex design was a network of modules. This design anticipated the inevitable failure of some parts of a system, e.g., because of design mistakes, operator errors, faulty parts or supplies, poor procedures, or an unfriendly environment. It sought to decompose the vertically integrated system into modules, such that failures in one module would not cascade through the system, a design that loosened the coupling of parts. Modularity also allowed inventive designs within each module and allowed testing of independent modules rather than having to test the whole system.
Eight years later I published a book on a totally different topic: the 19th Century origins of the United States' distinctive form of capitalism, Organizing America. But there was a connection with the accident book, though I did not make it at the time. The first mass production organizations in the new country were the textile mills of New England. They were fully integrated systems: huge companies with highly specialized tasks, mass-producing cheap fabrics with a frozen technology. The firms owned the towns, the water supply that drove the mills, and, in effect, the workers and their families. Highly centralized, very efficient, and very profitable, they were all giants, though vulnerable to economic downturns and to disastrous fires and floods.
In contrast, the textile industry around Philadelphia, as large as the New England one, was made up of small firms, decentralized, technologically innovative, emphasizing high quality and producing more expensive fabrics. The firms were small, family owned, and constantly changing their ownership and employees. The towns in New England were barren of infrastructure, while those in Pennsylvania had good streets, technical schools, no worker dormitories, and generally good public services. If a firm did not get enough business to survive, its workers merely moved to a nearby firm that did get orders; owners became workers, as workers became owners; innovations spread rapidly as the experimenters - not locked into a huge mass production machine - moved about. "Renting rooms with power" resembled "changing jobs without changing your car pool," as in Silicon Valley a century later.
One important reason for the different industrial systems was that the laws of Massachusetts favored capital accumulation, while Pennsylvania's laws favored investments in public services such as transport; thus less capital was available for private ventures. With concentrated capital the Boston wealthy could build huge mills; with distributed capital, only small firms could prosper in Philadelphia. Prosper they did, until the end of the 19th Century, when both New England and Philadelphia textile firms went into a slow decline because of foreign competition and cheaper labor in the southern states.
The US railroad industry started out, necessarily, as a number of small organizations, geographically dispersed and mostly public or public-private in ownership. But by the middle of the 19th Century they were being privatized, and then mergers and consolidations took place until, by 1910, the hundreds had been reduced to only four. Most of the profits came from long distance hauling. Accordingly, the railroads structured the US economy around national centers instead of regional centers. Regional, decentralized development - "modules," so to speak - was favored by some in Congress, such as Senator Robert La Follette Sr., who anticipated growing inequality if we allowed national centers of grain, meats, steel, or furniture. The efficiency and safety of some regional railroad lines were greater than those of the national lines, but the giant railroad corporations bought them out and even neglected their innovations.
The theme of consolidation, increasing the political power of corporations, was expanded when I looked at the growing vulnerabilities of systems with catastrophic potential in The Next Catastrophe (2007, 2011). The consequences of natural disasters, industrial accidents, and terrorist attacks were all being magnified by increasing accumulations of hazardous substances in industries, concentrations of populations in risky settings, and concentrations of political power in parts of our critical infrastructures such as communications, finance, and transport. We needed vast decentralizations to meet the challenges of climate change, terrorism, and infrastructure interdependencies, yet the opposite has been occurring. Some systems, such as existing nuclear power plants, which could not be modularized or decentralized, simply should be abandoned because of their catastrophic potential - a conclusion I had reached in 1984 when I anticipated a major nuclear power plant disaster in the next decade. We had two in the next three decades: Chernobyl in 1986 and Fukushima in 2011.
Neither of these two disasters was a "normal accident" and thus unpreventable. A normal accident is one in which everyone tries very hard to play safe, but the unexpected interaction of two or more failures (because of interactive complexity) causes a cascade of failures (because of tight coupling). The combination of complexity and coupling will bring down the system despite all safety efforts. In November 2012, 20 months after the accident at Fukushima, the responsible utility, TEPCO, finally reversed itself and admitted that it had been warned of the earthquake and tsunami dangers but chose to ignore them, and admitted many other management failures, such as evading regulations, poor training of workers, and lack of emergency preparedness. They did not play safe. As with many other industrial disasters that I have written about, such as Bhopal, Chernobyl, the Exxon Valdez, and BP's Gulf of Mexico oil spill, it was an accident waiting to happen because of management failures. It was aided by complexity and coupling but not caused by them.
The cause of another meltdown, the 2008 financial crisis that shook the world's financial markets and the world economy, is still the subject of debate, but it is clear that warnings were ignored for years. They came from risk assessors within the financial firms, from government agencies, from some big investors and gurus such as Warren Buffett, from thousands of newspaper and magazine articles, and from a few hedge funds that stood to profit from the collapse by betting against firms with heavy subprime mortgage holdings. Even Goldman Sachs knew enough to bet against the loans it was selling to clients just before the collapse.
Deregulation and the increasing complexity of the financial system made it easy to "game" the system and engage in fraudulent behavior, and its tight coupling allowed the failures to cascade. But it was not the case that no one could anticipate the failures. Firms were making so much money on the new financial instruments, such as derivatives, that the warnings were ignored and fraud was easy. The system had been concentrating since the deregulations of the late 1990s, magnifying the risks. There had been a degree of modularity in the larger system when commercial and investment banking functions were forced to separate following the Great Depression of the early 1930s. But in 1999 the two activities were permitted to take place in the same firm. Within the firms, the short-run, risky, and highly profitable investment activity prevailed over the longer-term, low-risk activity. The takeover was aided by the predominance of shareholder value interests (short run) over the broader stakeholder interests (long run). The latter had ruled since the prosperous post-World War II period, but shareholder value interests gained prominence in the 1990s as firms turned to the stock market instead of commercial banks for investment capital.
The themes of Organizing America, Normal Accidents, and The Next Catastrophe are linked: multiple, independent producers will distribute power and wealth more broadly; consolidation will concentrate wealth and power. Modular systems enhance safety by making complexity less interactive and coupling less tight; vertically integrated systems do the reverse. It will not be hard to apply these concepts to the "last catastrophe," global warming, which I am currently working on.