A nearly 25-year-old theory posited by a late Danish physicist offers insight into how the U.S. can soften the blows of catastrophic events, according to a recently released book by Ted Lewis, executive director of the Naval Postgraduate School’s Center for Homeland Defense and Security.
Lewis argues in "Bak’s Sand Pile: Strategies for a Catastrophic World" that the systems on which society depends – from the electric grid to the Internet to the water supply – are susceptible to breakdown not so much because of external forces as because their highly optimized designs leave no slack to absorb minor disruptions.
The book builds on the seminal work of renowned physicist Per Bak and his colleagues Chao Tang and Kurt Wiesenfeld, whose 1987 paper used a sand pile analogy as a prelude to Bak’s theory of self-organized criticality (SOC). Self-organized criticality holds that the failure of a complex system is inevitable, and that it stems more from the system’s own design than from any external force.
"The reason these catastrophes are happening more often is because in modern society technology has made it possible to optimize everything," Lewis said.
Bak and his colleagues conducted a sand pile experiment. As grains of sand fell, a cone-shaped pile emerged and grew in breadth and height. As grains continued to fall, small "landslides" occurred as criticality built up in the pile. The size and timing of the landslides are unpredictable, much like so many natural and human-caused collapses. Bak’s sand pile became a metaphor for the real world.
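The mechanics are simple enough to sketch in a few lines of code. Below is a minimal Python sketch of the Bak-Tang-Wiesenfeld sandpile; the grid size, toppling threshold of four, and number of dropped grains are illustrative assumptions, not figures from the book. Each cell accumulates grains, and any cell that reaches the threshold "topples," sending one grain to each neighbor and sometimes setting off a chain reaction.

```python
import random

# Minimal sketch of the Bak-Tang-Wiesenfeld sandpile (1987).
# SIZE and THRESHOLD are illustrative choices, not values from the book.
SIZE, THRESHOLD = 20, 4
grid = [[0] * SIZE for _ in range(SIZE)]

def relax(grid):
    """Topple every over-threshold cell until the pile is stable.
    Returns the avalanche size: total topplings from one dropped grain."""
    topples = 0
    unstable = True
    while unstable:
        unstable = False
        for r in range(SIZE):
            for c in range(SIZE):
                if grid[r][c] >= THRESHOLD:
                    grid[r][c] -= 4  # cell sheds four grains
                    topples += 1
                    unstable = True
                    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                        if 0 <= nr < SIZE and 0 <= nc < SIZE:
                            grid[nr][nc] += 1  # one grain to each neighbor
                        # grains pushed past the edge leave the system
    return topples

sizes = []
for _ in range(20_000):
    grid[random.randrange(SIZE)][random.randrange(SIZE)] += 1  # drop one grain
    sizes.append(relax(grid))

quiet = sum(1 for s in sizes if s == 0)
huge = sum(1 for s in sizes if s > 100)
print(f"{quiet} drops caused no topplings; {huge} triggered 100+ topplings")
```

Most single grains do nothing; a rare one sweeps across much of the pile. A histogram of the recorded avalanche sizes follows the heavy-tailed, power-law distribution that is the signature of self-organized criticality, and nothing about the triggering grain distinguishes it from any other.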
Ever since the paper was published, science writers have sought to apply the concept to various calamities, but Lewis believes those works have missed the mark by not delving further into self-organized criticality to understand why catastrophic system failures occur, whether at a nuclear plant in Japan or in the financial meltdown of the past decade.
"Actually, it turns out that self-organization is the really interesting part," Lewis notes. "The sand pile is sort of the superficial stuff and the self-organized criticality is sort of the interesting stuff. The idea of self-organized criticality is the central theme of the book. I didn’t think other people had done it justice, so that’s why I wrote the book. Once you get into that you start looking for evidence of self-organized criticality in everything. It’s pretty easy to find it."
Lewis sees the concept at work in events like Hurricane Katrina, where an overly optimized and aging levee system exacerbated the disaster, or in the Gulf oil spill, where technology facilitated deepwater drilling but what should have been a minor system disturbance ended in catastrophe. Technology and cost efficiency can come at the expense of resiliency: hospitals are designed to maximize profitability, so they lack the capacity to care for a large influx of patients after a catastrophic event; the power grid lacks resilience because it has been designed for efficient operation.
"This thing called self-organized criticality builds up over time, until the system gets to its tipping point," Lewis explained. "Then, when even the smallest thing happens that ordinarily would not cause a disaster ends up causing a big flare-up, a big consequence."
With this understanding, policymakers could take action to combat self-organized criticality, although doing so may be politically difficult: addressing the phenomenon means giving up cost and efficiency benefits up front.
"If you want a resilient society, you have to ‘un-optimize’ these systems," Lewis observes. "Of course, nobody wants to do that because that costs extra money." But economics is not the only factor driving up self-organized criticality. In fact, many existing regulatory policies contribute to tipping points in modern society.
"I would go so far as to say policies are more dangerous than terrorists," Lewis said. "They have created more vulnerability and risk than terrorists have." The 2003 power blackout that affected 55 million people was caused by self-organized criticality that is a direct result of FERC policy. Another example is the 1996 Telecommunications Act that allowed the communications industry to interconnect networks between separate companies. That was efficient and economical, but it decreased the redundancy needed for stability.
While policy tweaks are one way to address the issue, another is more creative thinking about infrastructure. In a chapter titled "Blackout USA," for example, Lewis outlines a plan to run natural gas, electricity, and fuel conduits underground along the rights-of-way of the nation’s interstate highway system. Such an undertaking would not only make distribution systems more resilient, it would also bypass the syndrome popularly known as "Not in My Backyard." Moreover, a public works endeavor of that scale could spark the economy by creating jobs.
The paradox is that part of the solution lies within the cause of the problem.
"Self-organized criticality causes the problem, but it also solves the problem because it shows you where the critical points are," Lewis said.
For more information about the Naval Postgraduate School’s Center for Homeland Defense and Security, visit the center’s website.