Embracing Complexity

by Tim Sullivan

Artwork: Jen Stark, Radial Reverie, 2008, Hand-cut paper on wood backing, 20″ x 20″

Mauboussin: I look at it through a specific lens, that of complex adaptive systems.

Meaning…?

A complex adaptive system has three characteristics. The first is that the system consists of a number of heterogeneous agents, and each of those agents makes decisions about how to behave. The most important dimension here is that those decisions will evolve over time. The second characteristic is that the agents interact with one another. That interaction leads to the third—something that scientists call emergence: In a very real way, the whole becomes greater than the sum of the parts. The key issue is that you can’t really understand the whole system by simply looking at its individual parts.

Can you give us a concrete example?

A canonical example of a complex adaptive system is an ant colony. Each individual ant makes decisions about its role: Am I foraging? Am I doing midden work? Each one also interacts with the other ants, and a lot of that interaction is local. What emerges from their behavior is an ant colony.

If you examine the colony on the colony level, forgetting about the individual ants, it appears to have the characteristics of an organism. It’s robust. It’s adaptive. It has a life cycle. But the individual ant is working with local information and local interaction. It has no sense of the global system. And you can’t understand the system by looking at the behavior of individual ants. That’s the essence of a complex adaptive system—and the thing that’s so vexing. Emergence disguises cause and effect. We don’t really know what’s going on.
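
To make the idea of emergence a little more concrete, here is a minimal sketch in Python, with invented agents and parameters, of how a stable colony-level division of labor can arise from purely local rules; it illustrates the principle rather than modeling real ant behavior.

```python
import random

# A toy task-allocation model in the spirit of a complex adaptive system.
# Each ant follows one purely local rule: sample a few other ants and, with
# some probability, switch toward the task it observes less often. No ant
# knows the colony-wide split, yet a stable global ratio emerges.

N_ANTS = 1000
STEPS = 200
SAMPLE = 5          # how many other ants each ant observes per step
SWITCH_PROB = 0.2   # chance of acting on that local observation

random.seed(42)
# 1 = foraging, 0 = midden work; start from a lopsided state
ants = [1 if random.random() < 0.9 else 0 for _ in range(N_ANTS)]

for step in range(STEPS):
    new_ants = ants[:]
    for i in range(N_ANTS):
        neighbors = random.sample(range(N_ANTS), SAMPLE)
        foragers_seen = sum(ants[j] for j in neighbors)
        if random.random() < SWITCH_PROB:
            # Local rule: drift toward the task that looks underrepresented
            new_ants[i] = 0 if foragers_seen > SAMPLE / 2 else 1
    ants = new_ants
    if step % 50 == 0:
        print(f"step {step:3d}: share foraging = {sum(ants) / N_ANTS:.2f}")
```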

Why is an ant colony the first example you think of?

Complex adaptive systems are one of nature’s big solutions, so biology is full of great examples. Ant colonies are solving very complicated, very challenging problems with no leadership, no strategic plan, no Congress.

Once you’re aware of how the structure works, though, you’ll see these systems everywhere—the city of Boston, the neurons in your brain, the cells in your immune system, the stock market. The basic features—heterogeneous agents, interaction, and an emergent global system—are consistent across domains.

Why should businesspeople pay attention?

So what could a biologist or an ant specialist or a honeybee specialist possibly tell us about running businesses? The answer is, a whole lot more than you might guess, if you are willing to make some connections. This to me is an essential way to think—especially in the 21st century.

Consider capital markets. Rather than looking at them through the rational-expectations model, or even using the no-arbitrage assumption—the idea that you won’t find any $100 bills on the sidewalk because somebody has already picked them up—you can look at them through a complex adaptive systems model, which empirically fits how the markets work. But complexity doesn’t lend itself to tidy mathematics in the way that some traditional, linear financial models do.

What are the dangers of misunderstanding complexity?

In the late 1800s rangers at Yellowstone National Park brought in the U.S. Cavalry to help improve the game population by hand-feeding elk. The elk population swelled, and the elk began eating the aspen trees that the beavers were using to build their dams; the beaver dams caught the spring runoff, which is what allowed the trout to spawn. So more elk meant fewer aspens, fewer beaver dams, and ultimately fewer trout.

That one choice, feeding the elk, led to a series of cascading events that were completely unanticipated. People seek to improve complex adaptive systems, sometimes with disastrous consequences. It doesn’t take a lot of effort to make the leap from elk to the economy. People really have the best of intentions. But there is no way they can anticipate the ultimate results.

The question, then, is this: What conditions have to be in place to actually solve these kinds of challenging problems? That spills over to organizations very quickly.

What prevents us from dealing effectively with complexity?

The biggest issue, in my mind, is that humans are incredibly good at linking cause and effect—sometimes too good. Ten thousand years ago most cause and effect was pretty clear. And our brains evolved to deal with that.

But it means that when you see something occur in a complex adaptive system, your mind is going to create a narrative to explain what happened—even though cause and effect are not comprehensible in that kind of system. Hindsight’s a beautiful thing.

Also, we have a tendency to think that certain causes will lead to particular effects. That’s our Yellowstone story. And we just don’t know. I think that’s the biggest single bias.

What else gets in the way?

First, we tend to listen to experts, although it’s been well documented that expert predictions are quite poor. But they’re authoritative, so we listen to them, even when we know that these people are predicting something that’s fundamentally hard to predict. The individual who comes across as more authoritative is actually more believable. People are much more comfortable deferring to the person in the pinstripe suit with the PowerPoint slides.

Second, we’re reluctant to share private information, so we aggregate information poorly. In one study, researchers gave every team member the same information about three job candidates, and also gave each member a unique piece of information about one candidate. If the team members had pooled all the unique information, they would have chosen the best-qualified candidate; if they used only the information they had in common, they would pick the wrong one. A vast majority of the time, the teams selected a suboptimal candidate. Why? Because they talked about the shared information and held back the unique information. Committees are not optimized to share private information. So even in organizations where the information exists, it’s not being surfaced.
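
The effect is easy to see with a toy example. In the sketch below, where the candidates and point values are invented rather than taken from the study’s actual materials, pooling only the shared facts points to the wrong candidate, while pooling the unique facts as well points to the right one.

```python
# A toy version of the hidden-profile setup described above. Each fact is
# worth one point; the candidates and point values are invented. Candidate C
# is best overall, but most of C's strengths are held as unique information,
# so a discussion limited to shared facts favors the wrong candidate.

shared_facts = {"A": 3, "B": 2, "C": 1}   # facts every team member knows
unique_facts = {"A": 0, "B": 1, "C": 4}   # facts known to single members only

def best(scores):
    return max(scores, key=scores.get)

by_shared_only = best(shared_facts)
by_all_info = best({c: shared_facts[c] + unique_facts[c] for c in shared_facts})

print("Choice using only shared information:", by_shared_only)   # A
print("Choice after pooling unique information:", by_all_info)   # C
```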

What do these naturally occurring systems teach us about how to deal with complexity?

When information is diverse and aggregation and incentives are healthy, you get very good answers to problems. That’s what nature is doing, and that’s what we have to learn to do more effectively.

How do you translate that specifically into management?

Let’s start with diversity. I don’t necessarily mean social identity diversity, which is what we typically think of. Not that it isn’t important, but I’m talking about cognitive diversity—how people think, their training, their experience, their personalities. Scott Page [the author and University of Michigan professor] has shown that diverse groups are better at making predictions, for instance. Cognitive diversity—intentionally putting together different points of view that will challenge one another—is essential for hiring and for building teams.
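
One way to see why, under simple assumptions, is Page’s “diversity prediction theorem”: when a group’s forecast is the average of its members’ forecasts, the group’s squared error equals the average individual squared error minus the variance of the individual predictions, so diversity directly reduces collective error. The short check below uses invented numbers.

```python
# The "diversity prediction theorem," checked numerically with invented data:
#   (crowd error)^2 = average individual squared error - prediction diversity
# where the crowd forecast is the simple average and diversity is the
# variance of the individual forecasts.

truth = 100.0
predictions = [80.0, 95.0, 105.0, 130.0]   # forecasts from a diverse group

crowd = sum(predictions) / len(predictions)
crowd_sq_error = (crowd - truth) ** 2
avg_sq_error = sum((p - truth) ** 2 for p in predictions) / len(predictions)
diversity = sum((p - crowd) ** 2 for p in predictions) / len(predictions)

print(f"crowd squared error       = {crowd_sq_error:.2f}")
print(f"avg individual sq. error  = {avg_sq_error:.2f}")
print(f"prediction diversity      = {diversity:.2f}")
assert abs(crowd_sq_error - (avg_sq_error - diversity)) < 1e-9
```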

Even if you have a diverse team, if you’re bowling everybody over with your point of view, that’s not going to be of great value. Leaders have to step back and let those diverse views surface.

This doesn’t come naturally to executives. Often we try to hire smart people. We try to put smart people on teams. But we don’t think enough about how much diversity can contribute. The key is to find smart people who think differently.

How do you go about this at Legg Mason?

Our view basically is that a lot of the key day-to-day things people need to do to function are things we can train almost anybody to do. So that’s not the complicated part. The complicated part is finding people with a high level of intellectual curiosity and with different skills and experiences. But we have to be mindful of it every day, because our natural inclination is to hang out with people who are mostly like us.

How can you manage information aggregation?

Most managers, even executives, aren’t naturally good at drawing out others’ opinions. In fact, many organizations end up surrounding their executives with people who mostly want to please those executives rather than give them a frank assessment of what’s going on. One solution is the “team of rivals” idea—the great executive who can surround himself or herself with people who will offer up different points of view, challenging the consensus within the organization.

There has also been a burst of activity in the past 10 or 15 years around the notion of using prediction markets within organizations as a way of aggregating employees’ knowledge—to make more-robust forecasts, for instance. Very few companies are harnessing this “wisdom of crowds” in any meaningful way. It could be an effective way of accessing the information in people’s heads that they’re not talking about. The key is to make sure that as a leader you’re not just tapping this unshared information but actually extracting it from everybody and putting it on the table to be evaluated. And that’s where a lot of organizations fail.
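
As an illustration only, not a description of any company’s actual system, one common mechanism for running an internal prediction market is Hanson’s logarithmic market scoring rule, in which employees’ trades move prices that can be read as crowd probabilities. The liquidity parameter, outcomes, and trade below are made up.

```python
import math

# A minimal internal prediction market using Hanson's logarithmic market
# scoring rule (LMSR), shown only as an illustration of how trades can be
# aggregated into probabilities. The liquidity parameter, outcomes, and
# trade below are all made up.

class LMSRMarket:
    def __init__(self, outcomes, b=100.0):
        self.b = b                             # liquidity parameter
        self.q = {o: 0.0 for o in outcomes}    # shares outstanding per outcome

    def _cost(self):
        return self.b * math.log(sum(math.exp(x / self.b) for x in self.q.values()))

    def prices(self):
        z = sum(math.exp(x / self.b) for x in self.q.values())
        return {o: math.exp(x / self.b) / z for o, x in self.q.items()}

    def buy(self, outcome, shares):
        """Charge the trader the change in the cost function."""
        before = self._cost()
        self.q[outcome] += shares
        return self._cost() - before

market = LMSRMarket(["project ships on time", "project ships late"])
print(market.prices())                          # starts at 50/50
cost = market.buy("project ships late", 60)     # an employee bets on a delay
print(f"trade cost: {cost:.2f}")
print(market.prices())                          # prices now reflect that belief
```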

Short of setting up a prediction market, what’s a practical way to do that?

Frank Bryan, a political scientist, has done a lot of work on Vermont town hall meetings. Their moderators are taught to follow certain protocols to ensure that private information is shared. For instance, following Robert’s Rules of Order, no one can speak twice till everyone who wants to has spoken once. That’d be a great rule to institute in every company. A simple rule like that would change a lot of dynamics very quickly in most places.
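
A sketch of what that rule looks like as a procedure, with invented names, might be a simple speaking queue that gives the floor to first-time speakers before anyone gets a second turn.

```python
# A toy moderation queue enforcing the rule above: no one gets a second turn
# until everyone who has asked to speak has had a first one. Names are invented.

class SpeakingQueue:
    def __init__(self):
        self.requests = []        # names, in the order they asked to speak
        self.has_spoken = set()

    def request(self, name):
        self.requests.append(name)

    def next_speaker(self):
        # Give the floor to the earliest requester who has not yet spoken.
        for name in self.requests:
            if name not in self.has_spoken:
                self.requests.remove(name)
                self.has_spoken.add(name)
                return name
        # Everyone still waiting has already spoken once; allow second turns.
        return self.requests.pop(0)

queue = SpeakingQueue()
for name in ["Ana", "Ben", "Ana", "Carla"]:
    queue.request(name)
print([queue.next_speaker() for _ in range(4)])   # ['Ana', 'Ben', 'Carla', 'Ana']
```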

Anything else?

Small experiments with controls are a terrific aid, and they’re cheaper than ever. In his new book, Everything Is Obvious, Duncan Watts has a great line from Gary Loveman, of Harrah’s Entertainment. There are only two ways to get fired at Harrah’s: One is to steal from the company, and the other is to run an experiment without a control.

That kind of approach allows Loveman to mimic nature. Harrah’s is evolving through mutation and selection. And that’s how you navigate when feedback is ambiguous or hard to come by.
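
For a sense of what an experiment with a control means in practice, here is a minimal sketch with made-up numbers: a change is rolled out to a random treatment group, a control group keeps the status quo, and the lift is checked against noise with a standard two-proportion test.

```python
import math
import random

# A minimal "small experiment with a control," using made-up numbers: a change
# is shown to a random treatment group, a control group keeps the status quo,
# and the lift is compared with a two-proportion z-test instead of a story.

random.seed(7)
control = [1 if random.random() < 0.10 else 0 for _ in range(2000)]    # baseline offer
treatment = [1 if random.random() < 0.12 else 0 for _ in range(2000)]  # new offer

p1 = sum(control) / len(control)
p2 = sum(treatment) / len(treatment)
pooled = (sum(control) + sum(treatment)) / (len(control) + len(treatment))
se = math.sqrt(pooled * (1 - pooled) * (1 / len(control) + 1 / len(treatment)))
z = (p2 - p1) / se

print(f"control: {p1:.3f}   treatment: {p2:.3f}   z = {z:.2f}")
# |z| above roughly 2 suggests the difference is unlikely to be noise alone.
```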

Think about an investor in the stock market who buys stock that immediately goes up or down a little bit. Was buying the right decision? You don’t know the answer for years. It’s the same with decision making generally.

How do you think more broadly about strategy in a complex environment?

There’s an HBR article that I loved called “Strategy as Simple Rules” [January 2001]. The idea is that in complex adaptive systems, which are rapidly changing environments, we don’t really know how things are going to unfold, so it’s difficult to make forecasts or budgets that go many years into the future.

The authors, Kathleen Eisenhardt and Donald Sull, recommend creating a set of decision rules, somewhere between a half dozen and a dozen, that are virtually immutable: These are the things the organization stands for and that will guide your decisions. Then you pretty much let people decide on the fly in the field what they think makes sense given what they see. They’re never to violate the basic rules, but they have a lot of flexibility to actually decide from moment to moment. I like that framework. I think that’s a really valuable way to go.
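
A rough sketch of that framework, with rules invented purely for illustration, is a short list of fixed screens; anything that passes them is left to judgment in the field.

```python
# A sketch of "strategy as simple rules": a handful of fixed screens, with
# everything else left to judgment in the field. The rules and the opportunity
# below are invented for illustration only.

RULES = [
    ("improves the customer experience", lambda opp: opp["improves_customer_experience"]),
    ("pays back within 3 years",         lambda opp: opp["payback_years"] <= 3),
    ("can be staffed now",               lambda opp: opp["team_available"]),
]

def screen(opportunity):
    violations = [name for name, ok in RULES if not ok(opportunity)]
    if not violations:
        return "pursue, at the team's discretion"
    return "rejected: violates " + ", ".join(violations)

print(screen({"improves_customer_experience": True,
              "payback_years": 2,
              "team_available": True}))
```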

A company that has embodied that approach is Amazon.com. You don’t necessarily know where they’re going to go next, but they have very specific ways of thinking about their decisions. They’re willing to kill things that don’t work. And they’re willing to investigate things that you might not think would be logical for them to do. But the strategy is simple. I think they’re a good embodiment of that whole principle.

What are some rules of thumb for getting yourself into the right mind-set to deal with complexity?

First, it’s important to constantly learn and expose yourself to diverse points of view. But it’s work to do that. I mean, there’s a certain joy in it, but it’s work. It means you must allocate X% of your time. Typically the work is reading, but it’s also speaking to people who are interesting and exposing yourself to realms that you’re not familiar with.

If you’re an investor and you’re only looking at your Bloomberg terminal and reading Forbes and Fortune and the Wall Street Journal, it’s unlikely you’re going to get a lot of really interesting ideas. You need to read and think more broadly. Great investors most definitely do that already.

I also recommend constantly looking out in the world to find places where diversity is in jeopardy. Where do people all seem to have the same belief or the same point of view? That in and of itself doesn’t mean you should bet against it, but if it leads to a mispricing or an opportunity, then that’s where you should step in.

This is almost turning your understanding of complexity on its head and saying, “If this is what leads to good results, what leads to bad results? And let’s try to take advantage of those things.”