
Range
Author: David Epstein

 

* * *

 

   • • •

   In the course of studying problem solving in the 1930s, Karl Duncker posed one of the most famous hypothetical problems in all of cognitive psychology. It goes like this:


Suppose you are a doctor faced with a patient who has a malignant stomach tumor. It is impossible to operate on this patient, but unless the tumor is destroyed the patient will die. There is a kind of ray that can be used to destroy the tumor. If the rays reach the tumor all at once at a sufficiently high intensity, the tumor will be destroyed. Unfortunately, at this intensity the healthy tissue that the rays pass through on the way to the tumor will also be destroyed. At lower intensities the rays are harmless to healthy tissue, but they will not affect the tumor either. What type of procedure might be used to destroy the tumor with the rays, and at the same time avoid destroying the healthy tissue?

 

   It’s on you to excise the tumor and save the patient, but the rays are either too powerful or too weak. How can you solve this? While you’re thinking, a little story to pass the time: There once was a general who needed to capture a fortress in the middle of a country from a brutal dictator. If the general could get all of his troops to the fortress at the same time, they would have no problem taking it. Plenty of roads that the troops could travel radiated out from the fort like wheel spokes, but they were strewn with mines, so only small groups of soldiers could safely traverse any one road. The general came up with a plan. He divided the army into small groups, and each group traveled a different road leading to the fortress. They synchronized their watches, and made sure to converge on the fortress at the same time via their separate roads. The plan worked. The general captured the fortress and overthrew the dictator.

   Have you saved the patient yet? Just one last story while you’re still thinking: Years ago, a small-town fire chief arrived at a woodshed fire, concerned that it would spread to a nearby house if it was not extinguished quickly. There was no hydrant nearby, but the shed was next to a lake, so there was plenty of water. Dozens of neighbors were already taking turns with buckets throwing water on the shed, but they weren’t making any progress. The neighbors were surprised when the fire chief yelled at them to stop, and to all go fill their buckets in the lake. When they returned, the chief arranged them in a circle around the shed, and on the count of three had them all throw their water at once. The fire was immediately dampened, and soon thereafter extinguished. The town gave the fire chief a pay raise as a reward for quick thinking.

   Are you done saving your patient? Don’t feel bad, almost no one solves it. At least not at first, and then nearly everyone solves it. Only about 10 percent of people solve “Duncker’s radiation problem” initially. Presented with both the radiation problem and the fortress story, about 30 percent solve it and save the patient. Given both of those plus the fire chief story, half solve it. Given the fortress and the fire chief stories and then told to use them to help solve the radiation problem, 80 percent save the patient.

   The answer is that you (the doctor) could direct multiple low-intensity rays at the tumor from different directions, leaving healthy tissue intact, but converging at the tumor site with enough collective intensity to destroy it. Just like how the general divided up troops and directed them to converge at the fortress, and how the fire chief arranged neighbors with their buckets around the burning shed so that their water would converge on the fire simultaneously.

   Those results are from a series of 1980s analogical thinking studies. Really, don’t feel bad if you didn’t get it. In a real experiment you would have taken more time, and whether you got it or not is unimportant. The important part is what it shows about problem solving. A gift of a single analogy from a different domain tripled the proportion of solvers who got the radiation problem. Two analogies from disparate domains gave an even bigger boost. The impact of the fortress story alone was as large as if solvers were just straight out told this guiding principle: “If you need a large force to accomplish some purpose, but are prevented from applying such a force directly, many smaller forces applied simultaneously from different directions may work just as well.”

   The scientists who did that work expected that analogies would be fuel for problem solving, but they were surprised that most solvers working on the radiation problem did not find clues in the fortress story until they were directed to do so. “One might well have supposed,” the scientists wrote, that “being in a psychology experiment would have led virtually all subjects to consider how the first part [of the study] might be related to the second.”

   Human intuition, it appears, is not very well engineered to make use of the best tools when faced with what the researchers called “ill-defined” problems. Our experience-based instincts are set up well for Tiger domains, the kind world Gentner described, where problems and solutions repeat.

   An experiment on Stanford international relations students during the Cold War provided a cautionary tale about relying on kind-world reasoning—that is, drawing only on the first analogy that feels familiar. The students were told that a small, fictional democratic country was under threat from a totalitarian neighbor, and they had to decide how the United States should respond. Some students were given descriptions that likened the situation to World War II (refugees in boxcars; a president “from New York, the same state as FDR”; a meeting in “Winston Churchill Hall”). For others, it was likened to Vietnam (a president “from Texas, the same state as LBJ,” and refugees in boats). The international relations students who were reminded of World War II were far more likely to choose to go to war; the students reminded of Vietnam opted for nonmilitary diplomacy. That phenomenon has been documented all over the place. College football coaches rated the same player’s potential very differently depending on what former player he was likened to in an introductory description, even with all other information kept exactly the same.

   With the difficult radiation problem, the most successful strategy employed multiple situations that were not at all alike on the surface, but held deep structural similarities. Most problem solvers are not like Kepler. They will stay inside of the problem at hand, focused on the internal details, and perhaps summon other medical knowledge, since it is on the surface a medical problem. They will not intuitively turn to distant analogies to probe solutions. They should, though, and they should make sure some of those analogies are, on the surface, far removed from the current problem. In a wicked world, relying upon experience from a single domain is not only limiting, it can be disastrous.

 

* * *

 

   • • •

   The trouble with using no more than a single analogy, particularly one from a very similar situation, is that it does not help battle the natural impulse to employ the “inside view,” a term coined by psychologists Daniel Kahneman and Amos Tversky. We take the inside view when we make judgments based narrowly on the details of a particular project that are right in front of us.

   Kahneman had a personal experience with the dangers of the inside view when he assembled a team to write a high school curriculum on the science of decision making. After a full year of weekly meetings, he surveyed the entire team to find out how long everyone thought the project would take. The lowest estimate was one and a half years, the highest two and a half years. Kahneman then asked a team member named Seymour, a distinguished curriculum expert who had seen the process with other teams, how this one compared.
