Smart Swarm: Using Animal Behaviour to Organise Our World

experts who made little effort to coordinate their work would do so poorly. They did even worse, in fact, than teams that had no experts at all.

      “We filmed all the teams and watched them several times,” Woolley says. “What seems to happen is that, when two of the people are experts and two are not, there’s a status thing that goes on. The two that aren’t experts defer to the two that are, when in fact you really need information from all four to answer the problem correctly.”

      Why was this disturbing? Because that’s how many analytic teams function in real life, Woolley says, whether they’re composed of intelligence agents interpreting data, medical personnel making a diagnosis, or financial teams considering an investment. Smart people with special skills are often put together to make important decisions, but they’re frequently left on their own to figure out how to apply those skills as a group. Because they’re good at what they do, many talented people don’t feel it’s necessary to collaborate. They don’t see themselves as a group. As a result, they often fail to make the most of their collective talents and end up making a poor decision.

      “We’ve done a bunch of field research in the intelligence community and I can tell you that no agency, not the Defense Department, not the CIA, not the FBI, not the state police, not the Coast Guard, not drug enforcement, has everything they need to figure out what’s going on,” Hackman told a workshop on collective intelligence at MIT. “That means that most antiterrorism work is done by teams from multiple organizations with their own strong cultures and their own ways of doing things. And the stereotypes can be awful. You see the intelligence people looking at the people from law enforcement saying, You guys are not very smart, all you care about is your badge and your gun. We know how to do this work, okay? And the law enforcement people saying, You guys wouldn’t recognize a chain of evidence if you tripped over it. All you can do is write summa cum laude essays in political science at Princeton. That’s the level of stereotyping. And they don’t get over it, so they flounder.”

      Personal prejudice is a poor guide to decision making, of course. But it’s only one in a long list of biases and bad habits that routinely hinder our judgment. During the past fifty years, psychologists have identified numerous “hidden traps” that subvert good decisions, whether they’re made by business executives, political leaders, or consumers at the mall. Many can be traced to the sort of mental shortcuts we use every day to manage life’s challenges—the rules of thumb we apply unconsciously because our brains, unlike those of ants or bees, weren’t designed to tackle problems collectively.

      Consider the trap known as “anchoring,” which results from our tendency to give too much weight to the first thing we hear. Suppose someone asks you the following questions:

      Is the population of Chicago greater than 3 million?

      What’s your best estimate of Chicago’s population?

      Chances are, when you answer the second question, you’ll be basing it on the first. You can’t help it. That’s the way your brain is hardwired. If the number in the first question had been 10 million, your answer to the second would have been significantly higher. Late-night TV commercials exploit this kind of anchoring. “How much would you pay for this slicer-dicer?” the announcer asks. “A hundred dollars? Two hundred? Call now and pay only nineteen ninety-five.”

      Then there’s the “status quo” trap, which stems from our preference not to rock the boat. All things being equal, we prefer options that keep things the way they are, even if there’s no logic behind that choice. That’s one reason mergers often run into trouble, according to John Hammond, Ralph Keeney, and Howard Raiffa, who described “The Hidden Traps in Decision Making” in the Harvard Business Review. Instead of taking swift action to restructure a company following a merger, combining departments and eliminating redundancies, many executives wait for the dust to settle, figuring they can always make adjustments later. But the longer they wait, the more difficult it becomes to change the status quo. The window of opportunity closes.

      Nobody likes to admit a mistake, after all. Which leads to the “sunk-cost” trap, in which we choose courses of action that justify our earlier decisions—even if they no longer seem so brilliant. Hanging on to a stock after it has taken a nosedive may not show the best judgment. Yet many people do exactly that. In the workplace, we might avoid admitting to a blunder—hiring an incompetent person, for example—because we’re afraid it will make us look bad in the eyes of our superiors. But the longer we let the problem drag on, the worse it can be for everyone.

      As if these flaws weren’t enough, we also ignore facts that don’t support our beliefs. We overestimate our ability to make accurate predictions. We cling to inaccurate information even after it has been disproved. And we accept the most recent bit of trivia as gospel. As individuals, in short, we tend to make a lot of mistakes with even simple decisions. Throw a problem at us that involves interactions of multiple variables and you’re asking for trouble.

      Yet increasingly, analysts say, that’s exactly what business leaders are dealing with. “Managers have long relied on their intuition to make strategic decisions in complex circumstances, but in today’s competitive landscape, your gut is no longer a good enough guide,” writes Eric Bonabeau, who is now chief scientist at Icosystem, a consulting company near Boston. Often managers rise to the top of their organizations because they’ve been able to make tough decisions in the face of uncertainty, he writes. But when you’re dealing with complexity, intuition “is not only unlikely to help, it is often misleading. Human intuition, which arguably has been shaped by biological evolution to deal with the environment of hunters and gatherers, is showing its limits in a world whose dynamics are getting more complex by the minute.”

      We aren’t very good at making difficult decisions in complex situations, in other words, because our brains haven’t had time to evolve. “We have the brains of cavemen,” Bonabeau says. “That’s fine for problems that don’t require more than a caveman’s brain. But many other problems require a little more thinking.”

      One way to handle such problems, as we’ve seen, is to harness the cognitive diversity of a group. When Jeff Severts asked his prediction market to estimate the probability of the new Best Buy store opening on time, he tapped into a wide range of perspectives, and the result was an unbiased assessment of the situation. In a way, that’s what most of us would hope would happen, since society counts on groups to be more reliable than individuals. That’s why we have juries, committees, corporate boards, and blue-ribbon panels. But groups aren’t perfect either. Unless they’re carefully structured and given an appropriate task, groups don’t automatically produce the best solution. As decades of research have demonstrated, groups have many bad habits of their own.

      Take their tendency to ignore useful information. When a group discusses an issue, it can spend too much time going over stuff everybody already knows, and too little time considering facts or points of view known only by a few. Psychologists call this “biased sampling.” Let’s say your daughter’s PTA is planning a fund-raiser. The president asks everybody at the meeting for ideas about what to sell. The group spends the whole time talking about cookies, because everybody knows how to make them, even though many people might have special family recipes for cupcakes, fudge, or other goodies that could sell just as well. Because these suggestions never come up, the group may squander its own diversity.

      Many mistakes made by groups can be traced to rushing a decision. Instead of taking time to put together a full range of options, a group may settle on a choice prematurely, then spend time searching for evidence to support that choice. Perhaps the most notorious example of rushing a decision is a phenomenon that psychologist Irving Janis described as groupthink, in which a tightly knit team blunders into a fiasco through a combination of unfortunate traits, including a domineering leader, a lack of diversity among team members, a disregard of outside information, and a high level of stress. Such teams develop an unrealistic sense of confidence about their decision making and a false sense of consensus. Outside opinions are dismissed. Dissension is perceived as disloyalty. Janis was thinking, in particular, of John F. Kennedy’s reckless decision to back the Bay of Pigs invasion of Cuba in 1961, when historians say that President Kennedy and a small circle of advisors acted in isolation without serious analysis or

