Exploring Complexity Ep. 2

Why the world feels more complex—and why that feels hard. Why more problems are complex problems. Why organizations struggle with complexity.

Episode 2: Why the World Feels More Complex (and Why That Feels Hard)

Exploring Complexity is where we explore the complexity of minds meeting machines by combining complexity science, artificial intelligence, and the human sciences.


Transcript

Dave Edwards 0:06
Welcome back to Exploring Complexity from Artificiality. We're here to talk about the complexity of minds meeting machines and bringing together complexity science, artificial intelligence and the human sciences.

In this episode, we want to talk about why the world feels more complex, and why that feels harder. So, Helen, why don't you start off talking a little bit about why you think the world feels more complex?

Helen Edwards 0:32
Well, we can't definitively prove that it is more complex, but there are good reasons to believe that it is in many respects, and it certainly feels that way. There are a few reasons that we think really matter here. The first is that there's a lot of deep social change going on. If you look at charts of population growth, CO2 emissions, climate change, technological change, they all show exponential hockey sticks at certain points. Now, that's not to say that exponential growth can continue forever, it can't, which is part of why complexity matters as a topic for us to understand. But these revolutionary changes are happening, and they're deeply embedded inside the social systems in which we operate. There is just so much change happening, and at the same time we're all interconnected.

Dave Edwards 1:43
So why does that revolutionary change feel more complex? Why does it make life in the world feel more complex? What is it about those revolutions?

Helen Edwards 1:51
Well, I think there are many things. The fact is that a lot of these changes are happening coincidentally, at the same time, and the pace of change is very high. That increases the sense of uncertainty, which makes everything feel more complex. So there's that emotional overlay. But there are other things that really matter as well. One of them is the pressure this places on resources. All of those changes we discussed before place a lot of pressure on resources. And the more pressure there is on resources, the more competition there is between people, and the more that competition builds a sense of division between self and other. It creates more us-and-them tension, which changes the way we all interact: we're all interacting all the time, clashing over scarce resources in a way we didn't used to have to do. Another really important aspect is information overload. We've got this constant stream of information coming at us all of the time, and what that does is make us feel like our knowledge today is instantly out of date. So we end up with a paradox: on one hand, we've got access to all the knowledge in the world; on the other hand, it's going to be out of date tomorrow. We're out of date in the next second, or we might have missed something because there could be another piece of information. So we're grasping for a sense of what is a true fact, or what do I believe. That dynamic, I think, is deeply troubling to people. And in an organizational context, it makes us more likely to create false certainty. That false certainty is a huge problem, because it never works; it doesn't live up to its promise.
It becomes something that ultimately crumbles and erodes our confidence even further, which increases how complex these systems feel. What do we even feel we know? Then there's complex connectivity. We've networked ourselves, and when we networked our opportunities, we networked all our problems as well. You never get away from that network, and it's coming at you all of the time. This is where we go back to what we've discussed before: the internet as the source of most of the complexity that's arisen in our lived experience, this open connectivity where everyone is connected to everyone at all times. Human evolution has mostly been in small groups and tribes, so being exposed to this level of connectivity is really difficult for us; it's not something we evolved for. There's an escalating cadence of change as well. The faster the rate of change, the faster what we know becomes outdated. That's a real paradox: how do you value what you know today, hold on to that knowledge and use it, when tomorrow it could be outdated? That sort of epistemic fragility adds to our sense of things being complex. To go back to the last video we did, it's actually harder to tell the difference between simple and complicated and complex when you're not even sure how much you know. And the final one is simply about human behavior. We're trained to think that organizations have a purpose and a vision, and off we go. But real organizations are filled with real people who have very different motivations and values at different times of the day, in different contexts, for different reasons.
And as you move people in an organization from one context to another, those motivations can change, which complexifies the way we have to deal with innovation and problems in organizations.

Dave Edwards 6:32
We are not machines.

Helen Edwards 6:34
We are not machines. Now, there is a countervailing argument, because introducing a tension is a nice thing to do in complexity, and it says that maybe everything is actually getting more simple. David Krakauer put that challenge to a group of business people recently, saying that everything is getting more simple because it is actually more predictable: we have less diversity, because we have the same algorithms running a small number of social networks. And I think he's right about that. If you start looking for places that are becoming more simple because there's less resiliency and redundancy and diversity, you can see it all over the place when it comes to algorithmic solving of certain problems, or when you think about patterns of norms, the colonization of the internet by certain ways of doing things. If everyone in the world used ChatGPT, that would absolutely be a more simple arrangement, if you like. And in ecosystems, we've lost so much diversity, and we're losing so much diversity, that some things that used to be impossible to predict are now, unfortunately, predictable. Not in a good way.

Dave Edwards 8:06
I understand where David's coming from, especially when you look at AI. If an organization standardizes on a single tool, like if everybody starts using Microsoft Copilot within their organization, there's a simplification aspect to that, for sure. And I can understand his point at a very large system level, when you really step back and look across a huge population. But I'm with you: I'm on the side that, for individuals, it feels more complex. As you're going through these key points, I'm thinking about why we talk about the complexity of minds meeting machines, the complexity of humans adapting to AI, which is what we're getting at with that phrase. Think about it: there's this revolutionary change where suddenly there's a tool that is intelligent and is seemingly everywhere, because something like 6 billion people on the planet are connected to the internet and can therefore touch these tools. It's a huge revolutionary change with potentially massive impact. Some say it could kill us all. It's definitely in that revolutionary-change category. There are scarce resources that people are worried about; those are called jobs. Jobs are scarce, and there's a question about whether this is going to make jobs more scarce. Some of the folks in the AI space like to say you're not going to lose your job to an AI, you're going to lose it to a person who's using AI, which I think is a terrible phrase anyway. But it gets to the point that that job, no matter what, is scarce: you're going to have it, or somebody else is going to have it, or it's not going to exist. That pressure is there. And we have an incredible information overload.
Hopefully all of you out there are not in the game of watching every single news bit that happens in the AI space, because it is a huge information overload: there's some new model that comes out, some new change, some new something or other, and there are so many newsletters and LinkedIn talking heads parroting it all the time. That's difficult. Obviously, we're all connected, which means all of this stuff is happening really fast, as long as you open your eyes and ears and pay attention to it. The cadence does seem to be quite a lot faster, too. The change in AI, the rate of change, the change in the tools, the change in the usage, the change in how many companies are talking about it on their quarterly earnings calls: that rate of change is faster than anything I've seen. I built my career studying the internet, and then we did clean tech together, and all of those huge moves were so much slower than this change. And human behavior is the thing that actually underlies so much of our individual experience of complexity. I think that's amplified in this world of AI, because the tools operate differently based on each individual's behavior. We talk about this a lot: software is fundamentally changed. Before this, every piece of software operated exactly as it was programmed. If it didn't, it was called a bug; you sent it back to engineering and you got it fixed. But these tools are unpredictable by design. I shouldn't say by nature, that's a little bit of anthropomorphizing, but they are unpredictable by design. With just your own usage, if you ask the tool the same question multiple times, the answers vary, and the way you prompt versus the way I prompt creates a whole new level of variation.
It feels like an amplification of that human behavior.

Helen Edwards 12:00
And it feels like everything just suddenly got squared, raised to a power, simply because it's one thing for you and I to have an interaction, but the math of you and your ChatGPT and your Claude having an interaction, and my ChatGPT and my Claude, and then all six of us getting together, that's like a combinatorial explosion, up to a certain point. Now, an interesting way of adopting an idea or a mental model from complexity is to be more sensitive to what we call attractor basins: states that we're attracted into and just can't break out of. All of these models at the moment are pushing us a little bit into that. So it challenges us, not just as individuals, to think about how we get out and purely explore, versus exploit what we already know, which is one of those classic dichotomies, explore versus exploit. It also challenges us as a system to do that: as a partnership, as a couple, or if we're in a bigger organization doing something, how do all of those people get out of their joint attractor basin, and be much more conscious and mindful that exploring is something we have to do, with machines helping us and pushing us to do more of it? That's a quite different model than the alternative, which is sitting back and saying: how do we perfect what we know, how do we perfect an innovative creation around it, and what do we do to protect our boundaries around what we already know and what we value?
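As an aside, the "combinatorial explosion" Helen describes can be made concrete: the number of possible pairwise interactions among n agents grows as n(n-1)/2, so adding a ChatGPT and a Claude for each person multiplies the interaction space. A minimal Python sketch (the agent counts are illustrative, not from the episode):

```python
from math import comb

# Pairwise interaction count grows quadratically with the number of agents:
# n agents (people plus their AI assistants) yield C(n, 2) possible pairs.
def pairwise_interactions(n: int) -> int:
    return comb(n, 2)

# Two people: 1 possible interaction.
# Add a ChatGPT and a Claude for each (six agents): 15 possible pairs.
print(pairwise_interactions(2))  # 1
print(pairwise_interactions(6))  # 15
```

Counting only pairs understates it; once you include larger groupings, the number of possible coalitions grows exponentially.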
And I think that's what happens with a complexity mindset, which is really part of what this whole video series is about: how do you become more sensitive to different ways of thinking about these age-old challenges, how do you take some of the insights from complexity science, as well as what we're seeing around what AI does differently every day, and just have a better set of intuitions about which way this is going?
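The explore/exploit dichotomy Helen mentions has a classic algorithmic illustration in the epsilon-greedy strategy from reinforcement learning. This sketch is a generic illustration of the idea, not anything specific to the episode:

```python
import random

# Epsilon-greedy: with probability epsilon we explore (try a random option);
# otherwise we exploit (pick the option with the best average payoff so far).
def choose(averages: list[float], epsilon: float = 0.1) -> int:
    if random.random() < epsilon:
        return random.randrange(len(averages))  # explore
    return max(range(len(averages)), key=lambda i: averages[i])  # exploit

random.seed(0)
picks = [choose([0.2, 0.8, 0.5], epsilon=0.1) for _ in range(1000)]
# Most picks exploit the best-known option (index 1); a small fraction explore.
print(picks.count(1) > 800)
```

Setting epsilon to zero gives the pure "perfect what we know" stance; a system stuck in an attractor basin behaves like an agent whose epsilon has drifted to zero.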

Dave Edwards 14:41
Alright, so you mentioned organizations. Last question for this video: we've talked about how the world feels much more complex, and there is definitely complexity in adapting to AI as an individual and as an organization. Why is complexity so hard for organizations to manage?

Helen Edwards 15:01
Well, there's the inescapable fact that organizations have to hit their forecasts; that's really what it comes down to. You make a commitment about what you're going to deliver as an organization, to customers, to shareholders, to employees, and you have to deliver on it. So there's an implicit assumption that there's a point you can put in space, and that you can then figure out, with your models and your machinery, how to get there. A lot of this comes from Taylorism, this idea of organizations as machines. From time to time, people have tried to adopt a view of the organization as an organism, and certain parts of that have changed things: we've certainly adopted ways of doing things with experimental groups and units that are walled off and able to innovate on their own. There are lots of organizational responses like that. But at their very nature, organizations are top down, and complexity is bottom up. We talked about emergence in the last video: you can understand every little piece of an organization, every individual, every piece of software, and yet what actually happens is different. So the tension in organizations is: how do you allow for emergence, how do you manage and organize for emergence, and how do you let it happen when it's fundamentally uncomfortable?

Dave Edwards 16:42
Okay, great. Well, thanks for joining us. Please click onto the next video, where we're going to be talking about complexity science and how we apply complexity science. Thanks a lot.

Transcribed by https://otter.ai
