Why does it feel like the world is falling apart? | Brian Klaas
Estimated read time: 1:20
Summary
In this thought-provoking discussion, Brian Klaas explores how modern humans face unprecedented global instability despite having local stability in daily life. Unlike our hunter-gatherer ancestors, we navigate a world where small fluctuations lead to significant impacts, a concept rooted in complex systems theory. Klaas challenges traditional assumptions of linear dynamics, clear causes, and predictable patterns, emphasizing the unpredictable nature of today's rapidly changing world. He underscores the dangers of relying on past models and forecasts, given our current system's inherent instability, akin to a precariously balanced sandpile.
Highlights
Our ancestors dealt with a world of local instability and global stability; today, it's reversed.
Convenience in daily life masks the broader instability in global systems.
Today's interconnected world means small disturbances can cause massive upheaval.
The illusion of predictability stems from our routine lives, but black swans remind us of uncertainty.
We engineer instability by maximizing efficiency, akin to building a sandpile ready to collapse.
Recognizing potential non-linear outcomes can better prepare us for unforeseen events.
Critical slowing down hints at potential system collapses.
Resilience, with its emphasis on practicality and safety, could be the antidote to chaos.
Separating necessary from unnecessary control allows us to navigate the future better.
Key Takeaways
We've flipped from local instability and global stability to local stability and global instability, making our world feel unpredictable.
The pace of change means small events can have massive consequences.
Traditional models of understanding cause and effect often fail in today's world.
Complex systems are adaptive, unlike simple, linear systems.
Non-linear dynamics mean the past doesn't predict the future.
Black swan events highlight our illusion of control.
We should aim for resilience over pure efficiency to prevent disasters.
Understanding complex systems can guide us through modern chaos.
Questioning stability helps us face the unknown.
Overview
In an era where our daily routines offer stability and predictability, Brian Klaas argues that this masks the true nature of our world's global instability. Through the lens of complex systems theory, he explains how small changes can lead to substantial impacts, challenging the traditional views of linearity in cause and effect. This new dynamic demands adaptive thinking to understand the real threats facing our modern society.
Klaas further delves into the misconceptions that arise from relying on past patterns to predict future outcomes. An overreliance on such models can be dangerous, as shown through historical shifts like the Arab Spring. Using the metaphor of a sandpile, he illustrates how our pursuit of efficiency may bring societies to the brink of collapse, emphasizing the need for caution and adaptability in our increasingly interconnected and chaotic world.
Additionally, Klaas discusses the unpredictability of 'black swan events' and their implications for perceived control in social systems. By examining concepts like critical slowing down, he advocates for a focus on resilience over mere stability, urging us to question assumed regularities and to prepare for future uncertainties. This requires acknowledging what is beyond our control while strategically navigating the complexities of modern life.
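The "critical slowing down" idea mentioned above can be illustrated numerically. The sketch below is a toy model, not anything from Klaas's talk: it simulates a simple damped system and measures how long it takes to recover from a shock as its feedback parameter `a` approaches the tipping point at 1. The parameter names and thresholds are arbitrary illustrative choices.

```python
def recovery_time(a, shock=1.0, tol=0.05):
    """Steps for x_{t+1} = a * x_t to decay from `shock` back below
    `tol` of the shock size. The closer `a` is to 1, the slower the
    system snaps back -- the essence of critical slowing down."""
    x, steps = shock, 0
    while abs(x) > tol * shock:
        x *= a
        steps += 1
    return steps

# A stable system (a = 0.5) recovers quickly; near the tipping
# point (a = 0.99) the same shock takes far longer to die out.
for a in (0.5, 0.8, 0.95, 0.99):
    print(a, recovery_time(a))
```

In real ecosystems and climate systems, researchers watch for exactly this signature: perturbations that take longer and longer to fade are a red flag that the system is nearing a transition.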
Chapters
00:00 - 00:30: The Dynamics of Modern Existence The chapter discusses how modern humans experience a reversal in the dynamics of existence compared to historical peoples. Historically, the environment provided local instability but was globally stable, such as in the lives of hunter-gatherers. Modern existence, however, is characterized by local stability within our daily lives but is subject to global instability.
00:30 - 01:00: Global Instability and Non-linear Dynamics The chapter discusses the concept of global instability and non-linear dynamics. It highlights how despite the predictable nature of daily life, represented by consistent experiences like ordering products online or visiting a Starbucks anywhere in the world, changes are happening at an unprecedented speed. These rapid changes mean that when things go wrong, the repercussions are both profound and immediate, showcasing the dangers associated with such global instability.
01:00 - 01:30: Complex Systems and Social Change The chapter discusses the engineering of a volatile world where certain elements like Starbucks remain unchanged while significant societal structures like democracies are collapsing and rivers are drying up. It highlights the attempt to understand phenomena in social systems using old paradigms characterized by linear dynamics, where typically a small cause results in a small effect and a large cause results in a large effect.
01:30 - 02:00: Understanding Complex vs. Complicated Systems The chapter 'Understanding Complex vs. Complicated Systems' discusses how complex systems differ from complicated ones. It explains that complicated systems have linear relationships, meaning that changes produce proportional effects. Complex systems, on the other hand, operate non-linearly, where small changes can lead to significant impacts. Complex systems theory seeks to model these non-linear interactions and acknowledges that what may appear as noise can substantially alter the system's behavior. It emphasizes the need for non-linear thinking to understand and model such systems effectively.
02:00 - 02:30: Assumptions of Cause and Effect This chapter explores the concept of assumptions regarding cause and effect in understanding social change. It discusses how, historically, the belief in clear-cut causes has been prevalent but not always accurate in deciphering complex events. It uses examples like the dropping of the atomic bomb on Japan to illustrate that, had circumstances been different (e.g., no Battle of Midway, no Albert Einstein), the outcomes may have varied, highlighting the complexity in attributing single causes to events.
02:30 - 03:00: Patterns of the Past vs. Present In the chapter titled 'Patterns of the Past vs. Present', the narrative explores the intricacy of historical events and the influence of complex systems. It discusses how certain monumental events, like the atomic bombing of August 6th, 1945, are products of multiple interlinked causes rather than isolated incidents. This challenges traditional models, which often fail to account for the numerous factors leading to a single effect. The discourse advocates for the understanding that complex systems demand more comprehensive analytical approaches to grasp the interactions between different elements and to truly understand the systems as a whole.
03:00 - 03:30: Limitations of Predictive Models This chapter explores the differences between complex and complicated systems. It uses the analogy of a Swiss watch to illustrate the distinction: a complicated system like the watch has many interlocking parts but does not adapt if a part fails, unlike complex systems which can adapt to changes and errors within their components. This illustrates the limitations of predictive models that assume understanding a system's structure is enough to predict its behavior.
03:30 - 04:00: Complex Systems in Human Society In the chapter 'Complex Systems in Human Society,' complex systems are distinguished from non-adaptive systems. Unlike a watch that stops working when a part fails, complex systems like traffic are adaptive - when one element changes, such as someone braking, others in the system respond and adapt, affecting the system as a whole. This highlights the necessity of understanding interactions within systems, not just individual components, which is a core principle in social science.
04:00 - 04:30: Self-Organized Criticality and the Sandpile Model The chapter discusses the concept of self-organized criticality and the sandpile model. It highlights how patterns of cause and effect may not always apply as expected due to rapidly changing world dynamics. Philosopher David Hume's ideas are referenced to underline the problem of assuming past patterns are applicable in the present and future. An example involving the analysis of authoritarian regimes is mentioned.
04:30 - 05:00: Efficiency vs. Resilience The chapter discusses the concept of resilience and stability in Middle Eastern regimes, as analyzed in a specific book. Initially, the book posits that these regimes are highly resilient and stable. However, shortly after publication, the Arab Spring occurs, leading to the rapid collapse of these regimes. This raises the question of whether the theory was incorrect or if the world had changed between the book's writing and the collapse of these regimes.
05:00 - 05:30: The Mirage of Regularity The chapter titled 'The Mirage of Regularity' explores the reliance on predictive models in modern life. It discusses how these models, including AI, are built on the assumption that past patterns can predict future outcomes. However, the world is constantly changing, making these assumptions not only incorrect but potentially dangerous if the underlying shifts are not recognized.
05:30 - 06:00: Radical Uncertainty and the Limits of Prediction The chapter discusses the challenges of understanding and predicting complex systems, emphasizing the uniqueness and variability among their components. It highlights that human society, with its 8 billion individuals, is a prime example of a complex system where interaction and adaptation are key, pointing to the inherent limitations in testing and predicting such systems. The chapter also suggests that certain concepts within complex systems theory can offer valuable insights into how our lives develop.
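The sandpile dynamic described in the chapters above comes from the Bak–Tang–Wiesenfeld model of self-organized criticality. Here is a minimal sketch of that model; the grid size, toppling threshold, and number of dropped grains are arbitrary illustrative choices, not anything specified in the talk.

```python
import random

def drop_grain(grid, n, threshold=4):
    """Drop one grain at a random site, then topple every site that
    reaches the threshold, shedding one grain to each neighbour.
    Returns the avalanche size (total number of topplings)."""
    i, j = random.randrange(n), random.randrange(n)
    grid[i][j] += 1
    unstable = [(i, j)]
    avalanche = 0
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < threshold:
            continue
        grid[x][y] -= threshold
        avalanche += 1
        if grid[x][y] >= threshold:      # may still be over the limit
            unstable.append((x, y))
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < n and 0 <= ny < n:  # edge grains fall off the pile
                grid[nx][ny] += 1
                if grid[nx][ny] >= threshold:
                    unstable.append((nx, ny))
    return avalanche

random.seed(1)
n = 20
grid = [[0] * n for _ in range(n)]
sizes = [drop_grain(grid, n) for _ in range(20000)]
# Identical tiny causes, wildly different effects: most grains do
# nothing, while a few trigger system-wide avalanches.
print("largest avalanche:", max(sizes))
```

A histogram of `sizes` would show the heavy-tailed, power-law-like distribution characteristic of self-organized criticality: the big avalanche is impossible to time but inevitable in aggregate, which is exactly the point Klaas makes about engineered instability.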
Transcription
00:00 - 00:30
- Modern humans experience a different world and a different dynamic of our existence than anyone who has ever come before us. And the reason for that is because we've inverted the dynamics of how our lives unfold. So all of the past people, the hunter-gatherers that came before us, they lived in a world that was defined by what I call local instability, but global stability. Their day-to-day lives in their local environment were unpredictable. Now we have flipped that world. We experience local stability, but global instability.
00:30 - 01:00
We have extreme regularity in our daily lives. We can order products online and expect exactly when they're going to arrive. We can go to Starbucks anywhere in the world and it's going to taste roughly the same. But our world is changing faster than it ever has before. The consequence of that is that when things do go wrong, the ripple effects are much more profound and much more immediate. And this is where that sort of aspect of global instability is really dangerous.
01:00 - 01:30
We've actually engineered a volatile world where Starbucks is completely unchanging from year to year, but democracies are collapsing and rivers are drying up. So when we tried to understand why things happen in social systems, the old way of thinking often involved what are called linear dynamics. In other words, a small cause has a small effect and a big cause has a big effect,
01:30 - 02:00
and the relationship between them is linear. Now, the way the world actually works is non-linear, and this means that sometimes a very small change can produce a very big effect. And complex systems theory understands this and tries to incorporate it into its models. It connects the dots. It requires non-linear thinking. It requires the idea of modeling a world in which a tiny shift or a little fluctuation that seems to be noise can actually radically shift the way
02:00 - 02:30
that the future unfolds. In the history of trying to understand social change, there have been a few key assumptions that were incorrect but were sometimes useful to try to make sense of a very complex world. And one of the first assumptions that we start with is this idea that when you look at why something happens, that there's going to be a clear-cut cause behind it. So, for example, you think about the atomic bomb being dropped on Japan. If all sorts of things had been slightly different, this would not have played out the same way. So if the Battle of Midway had not happened, or if Albert Einstein had not been born,
02:30 - 03:00
or if uranium deposits had not been forged by geological forces, all of these things had to exist just as they were for that event to happen on exactly August 6th, 1945. And yet our models don't do well in trying to understand an infinite number of causes producing a single effect. The second assumption that we have, which complex systems theory challenges and tells us is wrong, is that if you just understand the components of an individual system,
03:00 - 03:30
you will therefore understand the entire system. And the reason that's incorrect is because complex systems are different from what are called complicated systems. It sounds like a very small nuance, but it's a very important one. So a Swiss watch, for example, is complicated but not complex. And the reason it's complicated is because it has a million interlocking parts; it's got all these little things in it that need to work for the watch to actually perform its function. But it doesn't adapt within the system if something breaks. So if part of the watch breaks down,
03:30 - 04:00
the watch just stops working. A complex system is different: it's adaptive. We can imagine this in our own lives because if you're in traffic, for example, and somebody pumps the brakes, it's not like everybody just keeps going and slams into them. They also pump the brakes and it affects the entire system. This adaptation that we have in a complex system means that you can't just understand the constituent parts. You have to understand exactly how they interact with each other. The third big assumption, and this is something that's totally central to social science and a lot of understandings of change in the past,
04:00 - 04:30
is that if something was a pattern of cause and effect in the past, it will also be a pattern of cause and effect in the present or in the future. And the philosopher David Hume identified this problem many hundreds of years ago. Now, the problem is that the world is more rapidly changing than ever before, so that means that the patterns of the past are least likely to apply to our current moment or to the future. For example, there's an analysis of authoritarian regimes
04:30 - 05:00
and how they work and why they're so stable. And this analysis says that Middle Eastern regimes are extremely resilient, they're extremely stable. And then about a year or two after this book was published, the Arab Spring happens and all these regimes collapse in the span of like three months. Now, the immediate reaction is to think this theory was wrong. But the second way of thinking about the world is that the world actually changed from the point that the book was written to the point when the regimes collapsed. And now the theory is incorrect,
05:00 - 05:30
but not because it was wrong in the past, but simply because the world has shifted. So much of modern life runs on models that assume that past patterns are predictive of future ones. AI is also like this, right? AI is learning based on past patterns and then trying to develop machine learning-driven models that can help us navigate uncertainty in the future. But if the underlying world has shifted, then that analysis is not just going to be wrong, it's going to be dangerous.
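This danger can be made concrete with a toy regression; the numbers and the "regime shift" below are invented purely for illustration. A trend fitted on pre-shift data extrapolates confidently, and wrongly, once the underlying process changes.

```python
# Years 0..9: a stable regime where y = 2x + 1 holds exactly.
xs = list(range(10))
ys = [2 * x + 1 for x in xs]

# Ordinary least-squares fit on the past data only.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# At year 10 the underlying process shifts to y = -3x + 60.
predicted_year_15 = slope * 15 + intercept   # the model's confident forecast
actual_year_15 = -3 * 15 + 60                # what the shifted world delivers
print(predicted_year_15, actual_year_15)     # 31.0 vs 15
```

The fit itself is flawless on the data it saw; the error comes entirely from assuming the world that generated that data still exists.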
05:30 - 06:00
And the problem is we can't really test this. Complex systems involve diverse parts where not everything is uniform. They interact with each other and, crucially, they adapt to each other. Human society is obviously a complex system. We're not all the same, we're not uniform. We are a giant complex system of 8 billion interacting individuals. Now, there are a few other ideas within complex systems theory that I think are particularly useful for understanding how our lives unfold
06:00 - 06:30
partly between order and chaos. And we're in the middle there; that's where complexity thrives. So the first one is called the sandpile model, which is a subset of what is called self-organized criticality. You have a grain of sand, and you add another grain of sand and another grain of sand, and eventually you build up a pile. Now, at some point, the pile of sand is going to get so tall and so unstable that it will be on what physicists sometimes call the edge of chaos. And at this point,
06:30 - 07:00
a single grain of sand can cause the entire pile to collapse. Now, that collapse can be produced by something so tiny and yet have such huge effects that it explains the non-linear dynamics of that system. This is something that I think we have engineered in modern society because of what are called basins of attraction. A basin of attraction is a state that a system will tend to move toward as the system evolves over time. A great way of thinking about this is traffic on the highway.
07:00 - 07:30
If you are driving down the highway, there is a speed limit. The speed limit is a basin of attraction for that system. Not everyone is going to drive 60 miles an hour when the speed limit is 60 miles an hour, but most people will drive somewhere near it. Additionally, the cars are going to be roughly evenly spaced. Of course, there might be a jerk who's tailgating, there might be someone who's way far behind you. But for the most part, there's sort of this order that emerges from a lot of interconnected individuals driving cars, producing a relatively predictable system. And that's supposed to be good for us
07:30 - 08:00
because efficiency is the main driver of so much of our modern social systems. But this means that the basin of attraction for our world is on the edge of chaos; it's at the absolute limit of the sandpile, so that when the sand falls, it's more likely to cause an avalanche. A black swan is a term that was coined by Nassim Nicholas Taleb, and what it basically refers to is a highly consequential rare event that was unpredictable. It's usually something that you couldn't actually foresee.
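One way to make the black swan idea concrete is to contrast thin-tailed and heavy-tailed randomness; the distributions and parameters below are my illustrative choices, not Taleb's or Klaas's. In a thin-tailed world no single event matters much, while in a heavy-tailed world one rare event can dominate everything that came before it.

```python
import random

random.seed(42)

# Thin-tailed world: 10,000 ordinary, Gaussian-sized events.
normal = [abs(random.gauss(0, 1)) for _ in range(10_000)]
# Heavy-tailed world: mostly small events, but occasional monsters.
heavy = [random.paretovariate(1.1) for _ in range(10_000)]

# What share of the total impact comes from the single biggest event?
share_normal = max(normal) / sum(normal)
share_heavy = max(heavy) / sum(heavy)
print(f"thin tails: {share_normal:.5f}  heavy tails: {share_heavy:.5f}")
```

In the Gaussian world the largest of ten thousand events is a rounding error in the total; in the Pareto world the single largest event carries a visible fraction of all the impact, which is why averages and past experience mislead.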
08:00 - 08:30
One of the things that produces an early warning sign in natural systems is something called critical slowing down, which is a new branch of science that I think will be developed further in the coming years and will be an important way of having an early warning system for black swan events. When critical slowing down happens, you have fluctuations that don't go back to equilibrium. They start to become really erratic, and scientists can measure how long it takes for a system to snap back to normal. This early warning system potentially can provide us
08:30 - 09:00
with a red flag that things are a little bit unstable. This is where I think we make a profound error in understanding social disasters. We tend to say, "Oh, well, that was just a black swan, and now we've gone back to the normal way the world works." The problem with that is you've misunderstood the origin of those black swans. You've developed a sandpile that's so tall that the avalanche is inevitable. It's part of the system; it will absolutely happen at some point. And so this is where I think the lesson for us
09:00 - 09:30
is to prioritize efficiency and optimization slightly less, to build a slightly smaller sandpile, so that it's more resilient. So when the sort of randomness or the flukes of life, the noise of life, does happen, we don't end up with disaster. When we think about our daily lives and we navigate our world, we tend to have what I call the mirage of regularity. And it's very easy to be seduced by this because things unfold in these highly predictable ways.
09:30 - 10:00
And because of this, we start to think, "Okay, the world is controllable." This is, however, a mirage. And I think this is where the sandpile model and some of the other ideas from complex systems theory are so important for correcting that mirage and making us understand that it is just an illusion. The mirage of regularity is one where, when you think about the 21st century, we have been able to delude ourselves into thinking we can make forecasts, right? But every forecast has been invalidated by these black swan events. I mean, just imagine going back
10:00 - 10:30
and reading the economic forecasts for what the world in 2020 would look like, written in late 2019. I mean, those forecasts were so unbelievably wrong. And I think one of the things that's worth thinking about is that we could never imagine some of these dynamics until they happen to us. There are things that we call radical uncertainty, where we can't simply understand the future in any possible way because the idea doesn't even occur to us. We can't tame the world; we can't predict accurately.
10:30 - 11:00
And so we have to separate the questions we have to answer from the questions we don't have to answer. If we parcel out the parts of the world that we have to try to navigate through this uncertainty, we'll make fools of ourselves slightly less often, and that will be a good thing for our societies.