Maximum Brainpower in the Real World
Why Brainpower Is Not About How We Do on Tests But How We Do in Life
Why Experts Know Nothing
John Smith is a tall, lanky Londoner with a flair for silk ties. He works in an office amid piles of coffee cups and soft-drink cans—anything with caffeine. He stares all day long at a wall of computer screens. John makes millions of dollars a day, and he has no idea how.
John’s story is the story of the expert—the person who has so much ability and experience that he or she finds quick solutions to complicated problems that vex the rest of us. We all know experts. The auto mechanic who in ten minutes fixes an engine that stumps everyone else in the garage. The cook who with a single sip identifies the one missing ingredient that will make a good soup great. The engineer who draws by hand an airfoil design as efficient as one produced by a computer. The doctor who plucks the correct diagnosis out of thin air, unsupported by tests and scans (think television’s Dr. House, only with a nicer bedside manner).
Experts process a problem through the vast amounts of knowledge and experience they have accumulated, and out pops an answer—usually the correct one, sometimes an unexpected one. I met John, a currency trader, before the introduction of the euro, at a time when every European country still had its own currency. Most large companies hold and trade different currencies in order to pay bills in local currencies and to avoid being caught with excessive holdings in case one currency’s value declines. Financial institutions actively trade in currency as they would in any stock or commodity, in the hopes of making money by determining whether a particular currency will go up or down.
John was one of those institutional mavericks who was exceedingly good at what he did. His specialty was buying and selling U.S. dollars against the German mark. Known as “the money factory,” he brought in millions of dollars in profits every month. The large international financial institution where he worked feared that competitors would steal him away, or that he’d simply retire. He had made so much money for himself and for the company over the years that he no longer needed to work.
How did this trader do it? How did he regularly bet on the right currency? The company leaders decided that perhaps a smart psychologist might be able to tell them, which is where I came in. By gaining insights into John’s intellectual methods and analytical processes, they hoped to protect themselves against his possible departure and to learn how they might train others to achieve similar results.
For days I sat next to John on the trading floor amid his computers and clutter. There were other traders in the middle of the floor, calling out offers to buy or sell. It was very, very busy, and very, very dull. Hour after hour, we watched the screens as the dollar and mark went up and down against each other. I saw every uptick and downtick that he did. I heard the same announcements. We watched news on economic and political activity all over the world on the “tickers” streaming by on the screens. Over the course of several days, he made modest trades, but nothing spectacular. Very boring stuff. A currency trader must be well paid to sit through this kind of tedium all of his life.
Then, one day, he was suddenly electrified. He held up his hand, flashing his fingers to signal “five,” then “three.” I scanned the screens. I zeroed in on every word the sellers on the floor were saying. Nothing obvious had happened, and yet—within a few seconds—John had bought eight million marks! Before long, the value of the mark began to rise. By the close of trading, this gentleman and his company had made a very tidy profit.
By then, John and I had become friendly. Over a few pints of Guinness after work, I asked him why he had bought marks at that particular moment. I expected a technical analysis relating to economic news or monetary policy. Or perhaps, like an experienced chess player who has seen about every gambit imaginable, John had detected a series of clever moves by other traders and was able to exploit them for his own gain. His answer?
“I suddenly felt,” he said, “that the mark wanted to go up.” It was just that: a feeling. “It feels like the mark wants to go up.”
We talked for quite a while. I don’t believe he was putting me on, or putting me off. He genuinely could not explain his instinct to buy marks at that particular moment. It was a feeling. He could not say how or where he could “feel” it. He just could. He was influenced by something he couldn’t verbalize. Perhaps tension crept into the voices of the brokers on the trading floor. Perhaps there was some pattern in the movements of rates that he picked up unconsciously. Perhaps, at some unconscious level, he recalled a similar pattern from a few weeks or months before. Perhaps his own large purchase, coming out of nowhere, created a mini-stampede that drove the price of the mark higher. But people bought and sold marks all day, often in large volumes. How did he know when the time was right to make his move? He didn’t know. The effort to understand his process was a total waste of time.
My shadowing project ended a short time later when the bank was sold, but I do not believe I would have learned anything more if I had remained by John’s side for a year, or even ten. I would have experienced a great deal of boredom for nothing. He was right about when to trade currency. That is all there was to say.
John’s inability to explain his strategy is no isolated example. Gerd Gigerenzer, in his book Gut Feelings, writes about police officers who can instantly spot a drug carrier in a crowd on the basis of nothing more than a “hunch.” Malcolm Gladwell, in his book Blink, describes art experts who can, at a glance, tell whether a statue is an authentic work or a forgery. They cannot explain how they know. Gigerenzer speaks of a gut feeling—an unconscious, immediate “fast and frugal” aspect of cognition that is strong enough to act upon even though we are not fully aware of why. Gladwell calls this ability “thin slicing,” describing it as an automated, accelerated ability to find patterns in a small amount of information. Experts are able to reduce a complex situation to a simple one very fast through some process of unconscious inference.
This chapter explores the cognitive processes of experts such as John to figure out how their expertise emerges. It is the beginning of our story about how the brain works, how it is shaped by experience, and how experience can be good or bad depending on our response.
Of Soldiers and Roadside Bombs
People who work in harm’s way learn to rely on the indefinable instincts that mark the expert, the “feelings” that do not come from the conscious verbal pathways that we commonly identify with “thinking.” These instincts probably stem from the more primitive pathways of the brain, the parts of the limbic system—the “old brain”—that kept our ancient forebears alive when there was no time to evaluate situations in a calm and reflective manner. Police officers and soldiers in particular learn to “trust their gut” on dangerous streets and roadways.
In Iraq and Afghanistan, improvised explosive devices (IEDs, or roadside bombs) have caused more than half of all American and coalition deaths, plus untold civilian casualties. Several hundred other IED attacks occur every month elsewhere in the world. The United States alone has been spending more than two billion dollars a year trying to counter IEDs, using heavier armor, spy aircraft, ground robots, and a variety of electronic devices to discover and disable them. Yet only two approaches to finding roadside bombs have proven consistently reliable, and neither is sophisticated. The first is getting information from locals about where the IEDs are planted. The second is relying on soldiers who spot the devices from moving vehicles.
It turns out that a small number of combat veterans have an uncanny knack for detecting these bombs. Some of the signs of an IED require nothing more than close observation. A normally busy street is deserted. A pile of debris appears along a road that was recently cleared. Vehicles are left in unusual places for no apparent reason. At times, however, a “bomb-sniffing” soldier simply has a “feeling”—like our currency trader—that something has changed. Once a bomb is discovered it is possible to speculate on what small clue the observer might have spotted, but at the crucial moment, it is usually only a single soldier—often the same one—who is able to detect something that’s amiss. Why?
One clue may come from experiments involving pigeons.
Of Pigeons, Paintings, and Photos
Birds as a whole, and pigeons in particular, have great perceptual skills. Pigeons can distinguish differences in geometric shapes, colors, and patterns, and between classes of items such as cats, chairs, cars, and flowers. They can recognize themselves in mirrors, as only a few other species can. In one study, pigeons learned to discriminate between paintings by Picasso and Monet. After training, the birds were able to identify the painters’ work, correctly categorizing paintings that they had never seen before. Eventually, they even learned to pick out paintings that were either cubist (Picasso’s style) or impressionist (Monet’s). Their success was identical to that of college students who were given the same amount of instruction. Not bad for bird brains.
Humans have recognized the military value of these birds for some time. First employed as battlefield messengers, pigeons were used for aerial surveillance in World War I. (These efforts proved problematic because the heavy cameras strapped to the birds often forced them to walk home.) Aided by an extra set of color-sensing cones in their retinas (which mammals lack), pigeons have keen eyesight, and it was this ability that first turned them into intelligence operatives.
Even in an age of high-tech wizardry and high-resolution imaging, one of the most difficult intelligence challenges is identifying military equipment or personnel in photographs. Targets of interest are often hidden. Among dozens or hundreds of photographs covering many square miles of territory, where does an analyst begin to look? Knowing that pigeons had been trained to distinguish “tree” from “not tree,” some intelligence agents decided to go one step further and determine whether pigeons could differentiate between “natural” and “manmade.”
The birds were shown two sets of photographs. One set contained only natural objects; the other contained manmade objects interspersed with natural ones. Simply put: natural versus manmade. If the photo was natural and the bird pecked the left lever, it received a food reward. If the photo included manmade objects and the bird pecked the right lever, it received a food reward. Any other choice left the bird without a treat.
Before long, the pigeons were able to correctly identify “natural” versus “manmade.” They continued to distinguish the two even as the manmade objects became less and less obvious. Eventually, with their keen eyesight, the pigeons were able to detect manmade objects as inconspicuous in the photographs as deliberately hidden military hardware would be in a desert or forest. The point, though, is not that these birds have impressive eyesight. It’s that they can learn to generalize: to identify new examples within one class or to distinguish members of different classes. The pigeon brain weighs only about two grams, a tiny fraction of the size of a human brain. Yet, lo and behold, this little brain can learn a very subtle rule, even though that rule is never explicitly defined! What is the definition of “manmade” that a bird can understand? Are manmade patterns, even irregular ones, far more symmetrical to a pigeon brain than natural patterns? Can it discern differences in texture or reflectivity? We do not know. We know only that brains far less sophisticated than ours can see, conceptualize, and react to things that are virtually imperceptible.
The pigeon’s impressive perception lies below the level of verbalizing thought. It may be that humans have similar abilities buried deep within the brain. This would explain why some soldiers can see things in their mind’s eye that others cannot. Like pigeons, certain people may be able to recognize “danger” among all the clutter of a ramshackle street or a rural setting. The soldiers who are best at spotting IEDs usually grew up either in perilous urban neighborhoods or in rural areas where they hunted. In other words, they were raised in environments in which they needed to be constantly vigilant. One soldier, a deer hunter from Michigan, spotted a three-inch clothespin at thirty yards and recognized it as a bomb trigger.
The question this raises, oddly enough, is whether the military should try to teach bomb-finding skills to soldiers who lack this natural ability. Like most other experts, the soldiers who are good at finding bombs may not be able to articulate why. More important, the other soldiers might be better off developing their own “sixth sense” than learning four or five particular things to look for. The very lack of training may open the mind to new stimuli. By engaging their conscious minds with a memorized list of possible signs of IEDs, soldiers may be disengaging their unconscious minds, reducing the very visual and mental alertness that might best serve them.
This point is supported by an experiment devised by Donald Spence. Subjects were told that the study was nothing more than a test of memory. In reality, it was a test of the brain’s ability to make unconscious connections. Subjects were presented with a list of words and asked to recount the ones they remembered. Spence arranged the words in such a way that about half of them had a common but undeclared denominator. The common denominator was the word “cheese,” which was itself on the list along with other associated words—“ripe,” “smell,” “blue,” and so on. The list also contained words not normally associated with cheese.
Spence found an intriguing dichotomy. If a subject recalled the actual word “cheese,” he or she recalled fewer related words overall, and fewer after identifying “cheese” than before. But if the subject did not recall the word “cheese,” he or she remembered more cheese-related words. The result seems counterintuitive. Wouldn’t we recall more related words after hitting upon the central idea? No—because after finding the solution, the brain stops searching.
Excerpted from Maximum Brainpower by Shlomo Breznitz and Collins Hemingway. Copyright © 2012 by Shlomo Breznitz. Excerpted by permission of Ballantine Books, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.