CAREENING INTO THE FUTURE
At 3:16 p.m. on 19 July, 1989, the jet's tail engine blew apart. Twelve thousand meters above the U.S. Midwest, shards of the engine's fan rotor cut through the rear of the aircraft, shredding its hydraulic systems. As fluid bled from hydraulic tubing, the pilots in the front of the plane lost command of the rudder, elevators, and ailerons essential to stabilizing and guiding the craft. Immediately, the plane twisted into a downward right turn. United Airlines Flight 232 from Denver to Chicago--with 296 people aboard--was out of control.
By itself, the failure of the tail engine was not catastrophic: the DC-10 had two other engines, one under each wing. But cockpit gauges showed a complete loss of hydraulic quantity and pressure. When the first officer tried to halt the right turn, the plane didn't respond. As the rightward bank became critical, the captain took over, pulling back on the control column and turning the wheel hard left--but still there was no response. In a last-ditch effort to regain command, he cut power to the left engine and boosted power to the right one. The right wing slowly came up, and the plane rolled back to a horizontal position. The right turn stopped.
Yet the situation remained critical. The plane was no longer turning, but it was still losing altitude. The captain sent crew members to look out of the windows in the passenger cabin. They saw that the inboard ailerons were slightly up, the spoilers were locked down, and the horizontal stabilizers were damaged. None of the main flight-control surfaces were moving. And it appeared that the airframe might have suffered structural damage severe enough to cause it to break apart in flight.
Back in the cockpit, the captain and first officer worked the flight controls feverishly--they still believed they could change the plane's trajectory. But their efforts produced no obvious effect. The captain also manipulated the thrust of the two remaining engines, sometimes giving extra power to the left engine, sometimes to the right engine. This action did have a noticeable effect. It helped keep the plane level and countered its tendency to turn right. But changes in engine thrust gave the captain only minimal control. In fact, from the perspective of the passengers, the plane was moving in three dimensions simultaneously: it was rolling from side to side and pitching up and down, as if riding long waves across the sky.
A flight attendant opened the cockpit door to say that an off-duty United Airlines pilot, seated in first class, had offered to help. He was a "check airman" who flew with flight crews to assess their performance. The captain acknowledged that the unexpected assistance was urgently needed, because he was finding it impossible to work the flight and thrust controls simultaneously. When the airman entered the cockpit, the captain briefed him on the aircraft's critical situation in a staccato of abbreviated phrases. "Tell me what you want, and I'll help you," he replied. The captain asked him to take over the thrust controls. Grasping an engine throttle in each hand, the check airman then knelt on the floor between the captain's and first officer's seats, and--with his eyes fixed on the flight instruments--began to manipulate the power of the two wing engines.
About fifteen minutes had passed since the explosion. The nearest airport was at Sioux City, Iowa. But the plane had lost nearly 7,000 meters of altitude and--despite the best efforts of the check airman--was still describing a series of clockwise circles over the Iowa countryside.2 In various parts of the United States, clusters of people had gathered around microphones and speakers to follow United 232's plight and to offer suggestions. The crew particularly wanted to hear from the United Airlines System Aircraft Maintenance (SAM) facility in San Francisco.
Second Officer to United Airlines Chicago Dispatch: "We need any help we can get from SAM, as far as what to do with this. We don't have anything. We don't [know] what to do. We're having a hard time controlling it. We're descending. We're down to 17,000 feet. We have . . . ah, hardly any control whatsoever."
But the SAM engineers didn't have a clue how to help. They had never before heard of a simultaneous failure of all three hydraulic systems. They kept asking, in disbelief, if there really was no hydraulic quantity or pressure. And they asked the second officer to flip back and forth through the pages of a thick flight manual, to no avail. The crew's frustration with ground support rose.
Captain to Second Officer: "You got hold of SAM?"
Second Officer: "Yeah, I've talked to 'im."
Captain: "What's he saying?"
Second Officer: "He's not telling me anything."
Captain: "We're not gonna make the runway, fellas, we're gonna have to ditch this son of a [bitch] and hope for the best."
Almost thirty minutes into the crisis, SAM had finally assembled a team of engineers around the speaker and asked the second officer for yet another full report. He provided a detailed run-down of the aircraft's status. After a period of radio silence, SAM again asked, "United 232, one more time, no hydraulic quantity, is that correct?" The second officer replied in exasperation, "Affirmative! Affirmative! Affirmative!" The engineers on the ground, the crew decided, could offer no help. United 232 was on its own.
Yet, at almost exactly the same time, the check airman accomplished a miracle. He managed to bring the plane around in a single broad turn to the left, lining up the plane for the shortest runway at the Sioux City airport. This was the only left turn the plane was to make following the explosion. The captain called the head flight attendant forward and explained the procedures for an emergency landing.
Captain: "We're going to try to put into Sioux City, Iowa. It's gonna be tough . . . gonna be rough."
Flight Attendant: "So we're going to evacuate?"
Captain: "Yeah. We're going to have the [landing] gear down, and if we can keep the airplane on the ground and stop standing up [i.e. stop right side up] . . . give us a second or two before you evacuate. 'Brace, brace, brace,' will be the signal . . . it'll be over the PA system: 'Brace, brace, brace'."
Flight Attendant: "And that will be [the signal] to evacuate?"
Captain: "No, that'll be to brace for the landing. And then if we have to evacuate, you'll get the command signal to evacuate. But I really have my doubts you'll see us standing up, honey. Good luck, sweetheart."
Thirty-five kilometers from the airport and at 1,300 meters altitude, the plane was still roughly lined up for the runway. Sioux City air traffic control suggested a slight left turn to produce a better approach and to keep the plane away from the city. "Whatever you do, keep us away from the city," the captain implored. Almost immediately afterward, as if in defiance, the plane began its tightest rightward turn, a complete 360-degree circle. The crew desperately tried to bring the nose around to face the runway again. As the aircraft rolled to a severe angle, the check airman exclaimed, "I can't handle that steep of bank . . . can't handle that steep of bank!" For five excruciatingly slow minutes, the plane turned in a circle. Working the throttles, the check airman leveled the wings once more and got the plane back to its original course.
Sioux City control: "United 232 heavy: the wind's currently three six zero at one one. Three sixty at eleven. You're cleared to land on any runway."
The runway that they were heading towards was closed and covered with equipment. Two minutes before touchdown, airport workers scrambled frantically to clear the equipment away. It was also short, at just over 2,000 meters; and, without hydraulic pressure, the plane had no brakes. But Sioux City control assured the captain that there was a wide, unobstructed field at the end. The cockpit crew struggled with the controls through the flight's last seconds.
Captain: "Left turns! Left turns! Close the throttles."
First Officer: "Close 'em off."
Captain: "Right turn. Close the throttles."
First Officer: "Pull 'em off!"
Check Airman: "Nah. I can't pull 'em off or we'll lose it. That's what's turning ya!"
Unidentified voice: "OK."
First Officer: "Left throttle . . . left! Left! Left! Left! Left! Left! Left! . . . Left! Left! Left!"
[Ground proximity alarm sounds]
First Officer: "We're turning! We're turning! We're turning!"3
Unidentified voice: "God!"
The plane hit the ground at the runway's leading edge, just to the left of the centerline. The right landing gear touched the ground first, then the right wing. As the plane skidded across the runway to the right it lost its right engine, chunks of its right wing, and its tail engine. It plowed across the grass, lost its left engine and tail section, and hit the pavement of another runway. The cockpit nose broke off. The remainder of the fuselage cartwheeled away and exploded in flames, coming to rest upside down in the middle of a field.
Of the 296 people aboard, 111 died, including one flight attendant. The entire cockpit crew survived.
At first, United 232's experience seems to be no more than an isolated, harrowing event in the skies of the United States--a tale of heroism in the face of terror, of discipline and skill in the face of unforeseen catastrophe. It was a dramatic front-page story that astonished everyone and had them talking for a day. Then it receded from public consciousness and was enveloped by the rational, bureaucratic procedures of accident investigators.
But when I read about United 232, something struck a deeper chord. The event could serve, I suspected, as a crude but vivid metaphor for the situation we are all facing, individually and collectively.
When the plane's tail engine disintegrated, the flight crew immediately faced a staggeringly complex task. Multiple, simultaneous, and interdependent emergencies converged in the cockpit. Some were recognized and understood, some were misunderstood, and some didn't even cross the crew's threshold of consciousness. As the crew members tried to make sense of their instruments and the data they received via their eyes and ears, problems cascaded into other problems with almost overwhelming speed. The crew was swept along by a tightly coupled chain of cause and effect. For forty-four harrowing minutes the captain and his officers assessed a prodigious flow of incoming information, made countless inquiries and observations, and issued dozens of commands. Even with extra help from the check airman, it was all they could do to keep the plane aloft and roughly on course to a crash landing.
Of course, our daily lives don't have nearly the same drama or urgency. But most of us feel, at least on occasion, that we are losing control; that issues and emergencies, problems and nuisances and information--endless bits of information--are converging on us from every direction; and that our lives are becoming so insanely hectic that we seem always behind, never ahead of events. Unexpected connections among places and peoples, and among macro and micro events--connections that we barely understand in their true dimensions--weave themselves around us. Most of us also sense that, just beyond our view, immense, uncomprehended, and unpredictable forces are operating, such as economic globalization, mass migrations, and changes in Earth's climate. Sometimes these forces are visible; more often they flit like shadows through our consciousness and then disappear again, behind the haze of our day-to-day concerns.
Yet the flight of United 232 is more than a vivid metaphor for a world of converging complexities and connections, of decision-making at high speeds in conditions of high uncertainty, and of the difficulty of managing in such circumstances. It is also a metaphor for crisis--for the sharp, unexpected, blinding events that sometimes send us reeling. Investigation after the crash revealed that the engine explosion was caused by a fatigue crack in the tail engine's stage-1 fan disk, a large doughnut of titanium alloy out of which the blades of the jet engine's fan radiate. The crack started near the center of the disk, at the site of a tiny metallurgical flaw formed when it had been cast seventeen years before. Slowly, imperceptibly, during 38,839 hours of flying time (about 7 billion revolutions of the disk), this flaw turned into a crack, and the crack grew in length.4 At the time of the disk's last inspection, in April 1988, it was over a centimeter long and should have been noticed by United's inspectors. But it was not.5 And 2,170 flight hours later, in a split second, the crack shot outwards to the edge of the disk, and the disk blew apart.
In our personal lives we sometimes see similarly sudden and shocking events: physical or mental disease unexpectedly affects a loved one, companies we deal with abruptly go bankrupt, and computers, televisions, and cars suddenly break down. Within the larger society, stock markets crash, revolutions break out, and floods devastate communities. The simple mental models in our heads, the models that guide our daily behavior, are built around assumptions of regularity, repetition of past patterns, and extrapolation into the future of slow, incremental change. These mental models are the autopilots of our daily lives. But no matter how much we plan, build buffering institutions and technologies, buy insurance, and develop forecasts and predictions, reality constantly surprises us. Sometimes these are happy surprises; sometimes they are not. Rarely are our reactions neutral.
United 232 also offers some reassuring lessons about our ability to react. Faced with sudden calamity, the crew members used their wits and their courage to save almost two-thirds of the lives aboard. The U.S. National Transportation Safety Board (NTSB) declared that "under the circumstances, the UAL flight crew performance was highly commendable and greatly exceeded reasonable expectations." The situation they faced was unprecedented: they hadn't trained for it; no airline crew had ever trained for it. Such a disaster was thought too unlikely or too catastrophic to justify specific training. The pilot and his officers therefore had to invent, on the spot, a method for controlling the plane. They also had to assess the plane's damage, choose a place to land, and prepare their passengers for a crash landing.
The moment the engine exploded, crew members had to meet a sharply higher requirement for ingenuity--that is, for practical solutions to the problem of flying the aircraft in new conditions.
The decision-making process they used to meet this ingenuity requirement has been studied in minute detail by Steven Predmore, currently a manager of human-factors analysis at Delta Airlines in Atlanta.6 Predmore examined the transcript produced from United 232's cockpit voice recorder. He reduced the thirty-four minutes of recorded conversation to a series of "thought units"--that is, "utterances that deal with a single thought, action, or issue." He then classified these thought units by type, speaker, and target, which allowed him to analyze the crew's response to the emergency. Since thought units are almost the same as individual pieces of information, his technique created "a rough index of the rate of information transfer" among crew members.
Predmore found that the number of thought units averaged about thirty per minute with peaks of fifty to sixty per minute. In other words, with every one or two ticks of a watch's second hand, a chunk of information flew across the cockpit. (For comparison, during demanding moments of routine flight, aircraft crew members rarely transmit more than fifteen thought units per minute.) The thought units came in all forms: commands by the captain; advocacy of actions by junior officers; observations about the state of the plane; requests for information concerning, for example, possible landing sites; statements of intent; and expressions of emotional support. Much of the communication was in parallel, with independent, simultaneous conversations overlapping and intersecting. The highest rate of flow during United 232's flight--almost one thought unit per second--occurred fifteen minutes after the explosion. "This represents the point," Predmore says in the doctoral dissertation he completed at the University of Texas in 1992, "where the check airman enters the cockpit after his visual damage inspection, and he is immediately brought into the loop with regard to damage to the flight control systems, corrective action that is ongoing, decisions about where to land, and instruction on the manipulation of the throttles."
For the entire duration of the crisis, the crew members were close to a human being's peak cognitive load: they were processing information, making decisions, and supplying ingenuity about as quickly as humanly possible. The load was extreme in part because the crew was enveloped by uncertainty and lacked a clear understanding of the aircraft's state: expert analysis later showed that the cockpit's flight controls were completely useless. "Both the captain and first officer were fighting the stick to maintain control," Predmore told me, when I interviewed him eight years after the crash, "but it turns out that they could have actually let go of the stick entirely. The only thing that helped was the check airman's use of differential engine thrust." Given the circumstances, however, the captain and first officer were right to keep manipulating the controls. "It's easy to look back and say it was a simple problem, because there was actually very little they could do to control the aircraft," Predmore went on. "But they didn't know that there was little they could do. They had some sense that they had flight control, but they didn't know which systems were working."
Ingenuity requirements were so high that the check airman quickly became indispensable. "The demands created by the use of differential engine thrust to control the aircraft" made it almost impossible for either the captain or the first officer to attend to other tasks.10 The addition of the check airman to the crew allowed the captain to assign priority to tasks, divide and delegate them among crew members, and monitor the crew's performance. "The captain was an amazing leader," Predmore noted. The check airman's extra help, combined with the captain's effective organizing of the resources he had, allowed the crew to act like a precisely coordinated team; they became almost a single mind, and supplied enough ingenuity to direct the plane to a crash landing.
United 232 was also blessed with a great deal of luck. The crisis occurred during daylight, in good weather, and in reasonable proximity to an airport. "Had any one of these factors been different, the outcome would have been different," Predmore concluded. "The captain said the number one factor was luck. After the accident, they reprogrammed the scenario into a flight simulator, and on thirty-five attempts, they couldn't get anywhere near the runway."11 In fact, based on these simulator exercises, the Safety Board concluded that "landing at a predetermined point and airspeed on a runway was a highly random event. . . . [Such] a maneuver involved many unknown variables and was not trainable, and the degree of controllability during the approach and landing rendered a simulator training exercise virtually impossible."
Is our world becoming too complex to manage? Can all societies supply the ingenuity they need to meet the challenges they face? Sometimes it seems that we are collectively careening into the future, very much as United 232 careened to a crash landing. Must we, in response to the challenges before us, turn ourselves and our societies into analogues of the United 232 crew? Must we become tightly integrated decision-making units, hypercharged with adrenaline and fighting to stay on top of events?
My long-standing interest in how societies adapt to complex stresses dates back more than two decades. For many years this interest remained unfocused and uncrystallized in my mind; it was barely more than a background concern that created a fragile connection between disparate current events that I read about in newspapers. But a specific intellectual challenge I faced after I finished my doctorate at MIT in 1989 led me to put boundaries around my interest in social adaptation and give it more definition.
I grew up in a rural area outside Victoria, British Columbia, in the 1950s and 1960s. My father worked as a forester, and my mother was an artist and illustrator of wildlife, so as a young boy I learned to love the outdoors and take an interest in environmental issues. Both my parents were also attuned to the ebb and flow of current affairs, and we spent hours talking about what was going on in the world. It may have been watching the unfolding, televised horrors of the Vietnam War, or perhaps our family discussions of the century's history, especially of events surrounding World War II, but something instilled in me a deep curiosity about the causes of human violence. When I arrived at university, many years later, I focused my studies on a phenomenon that truly bewildered me--the nuclear arms race between the United States and the Soviet Union.
Eventually I entered the political science program at MIT, where I continued my studies of international relations and defense and arms control policy. But gradually I shifted away from these issues to return to the deeper, underlying question: What makes people fight each other? I was especially intrigued by the processes behind "group-identity" conflicts, which involve a stark distinction between "us" and "them." This category includes violence that centers on nationalist, ethnic, racial, or other ethnocentric identities. In my dissertation, I carefully analyzed and tested several of the best theories of group-identity conflict.
In my first research project after graduation, I combined my two chief interests: conflict and the environment. I decided to explore whether violence inside poor countries could be traced to critical environmental problems. Many poor countries in the developing world suffer from severe pollution, scarcity of fresh water, erosion of cropland, deforestation, and depletion of fisheries. Could this environmental stress, I asked myself, increase the risk of insurgencies, ethnic clashes, urban riots, and coups d'état? For about seven years, with the help of a wonderfully talented group of researchers and advisors from fifteen countries, I worked to answer this question, with considerable success.13 But once I was deep into the issue, I found that environmental problems cannot, by themselves, cause violence. They must combine with other factors, usually the failure of economic institutions or government. Some societies, it turns out, adapt quite smoothly to environmental stress, while others succumb to confusion and deterioration. Why, I wondered, did some succeed and others fail?
Over time, I came to the conclusion that a central feature of societies that adapt well is their ability to produce and deliver sufficient ingenuity to meet the demands placed on them by worsening environmental problems. Basically, I proposed, societies that adapt well are those able to deliver the right kind of ingenuity, at the right time and place, to prevent environmental problems from causing severe hardship and, ultimately, violence.
Within this rudimentary theory, I defined ingenuity as ideas that can be applied to solve practical technical and social problems, such as the problems that arise from water pollution, cropland erosion, and the like. Ingenuity includes not only truly new ideas--often called "innovation"--but also ideas that, though not fundamentally novel, are nonetheless useful. Social theorists have long known that something like ingenuity is key to social well-being and economic prosperity. Experts in history, economics, organizational theory, and cognitive science recognize that an adequate flow of the right kind of ideas is vital, and that we need to understand the factors that govern this flow. For example, Paul Romer, a Stanford economist who pioneered the field of New Growth Theory, argues that ideas are a factor of economic production just like labor and capital.14 For him and like-minded economists, ideas have intrinsic productive power and are responsible for a significant part of economic growth.
Taking Paul Romer's argument as a starting point, I began to think of ingenuity as consisting of sets of instructions that tell us how to arrange the constituent parts of our social and physical worlds in ways that help us achieve our goals. We need copious ingenuity to address the commonplace challenges around us. Every day, for instance, an average city receives thousands of tons of food and fuel, tens of millions of liters of water, and hundreds of thousands of kilowatt hours of electricity. Huge quantities of wastes are removed, hospitals provide health services, knowledge is transmitted from adults to children in schools, police forces protect property and personal safety, and hundreds of committees and councils from the community to the city level deal with governance. The amount of ingenuity needed to run this system is, of course, not the same as the amount needed to create it, because at any one time an enormous array of routines and standard operating procedures guides people's actions.15 But our urban system, with its countless elements, is the product of the incremental accretion of human ingenuity. It was created, over time, by millions of small ideas and a few big ones.
I soon realized that ingenuity comes in two distinct kinds: the kind used to create new technologies, like irrigation systems that conserve scarce water, or custom-engineered grains that grow in eroded soil, and the more crucial kind used to reform old institutions and social arrangements and build new ones, including efficient markets, competent and honest governments, and productive schools and universities. I called these two kinds technical and social ingenuity.
Technical ingenuity helps us solve problems in our physical world--such as requirements for shelter, food, and transportation. Social ingenuity helps us meet the challenges we face in our social world. It helps us arrange our economic, political, and social affairs and design our public and private institutions to achieve the level and kind of well-being we want. The crisis aboard United 232 nicely illustrates the difference between the two kinds of ingenuity. The captain initially used differential engine thrust to control the plane--an example of technical ingenuity, because differential thrust was a strictly technological solution to the problem. But he recognized that the load of tasks in the cockpit was simply too great for the three people there; a technological solution, by itself, was not enough. So he accepted the check airman's unexpected offer of help and created a new allocation of tasks within the cockpit--an example of social ingenuity that allowed him to devote time to integrating the crew's performance. It also helped him think farther into the future.
Social ingenuity, I came to understand, is a critical prerequisite to technical ingenuity. We need social ingenuity to design and set up well-functioning markets; and we need market incentives to produce an adequate flow of new technologies. Astute political leaders bargain, create coalitions, and use various inducements to put new institutional arrangements into place; competent bureaucrats plan and implement public policy; and people in communities, towns, and households build local institutions and change their behavior to solve the problems they face: they are all supplying social ingenuity.
I was making good progress, I thought, in understanding ingenuity's role in our ability to adapt to environmental stress. But once I had grasped the fact that ingenuity was key, a host of new questions arose. Two in particular drew my attention, and would remain with me as I explored this issue in coming years. First: is humanity's requirement for ingenuity rising as its environmental problems increase, and if so, how fast and why? My research so far had strongly suggested that the ingenuity requirement goes up as environmental problems worsen, because societies need more sophisticated technologies and institutions to reduce pollution and to conserve, replace, and share scarce natural resources.16 Second: can human societies supply enough ingenuity at the right times and places to meet this rising requirement, and if not, why not?
Our supply of ingenuity, I soon recognized, involves both the generation of good ideas and their implementation within society. It's not enough for a scientist, community, or society simply to think up an idea to solve an environmental problem; the idea must also be put into practice--the hybrid corn must be planted, the new farming credit system must be set up and operated, the community must educate itself to change its behaviors--before the ingenuity can be said to be fully supplied. I soon discovered that many of the critical obstacles occur not when the ingenuity is generated (there is usually no shortage of good ideas) but when people try to implement new ideas. The biggest obstacle is often political competition among powerful groups, which stalls or prevents key institutional reform.
In 1995, I brought all these elements and questions together in an article I published in a leading academic journal.17 I suggested that many of the societies around the world that are currently experiencing severe environmental problems, from China and Pakistan to Egypt and Haiti, are locked in a race between their soaring requirement for ingenuity to solve these problems and their uncertain ability to deliver it. If a society loses this race--if, in other words, it cannot supply sufficient ingenuity to meet its needs--it develops an ingenuity gap between requirement and supply. Societies with severe ingenuity gaps can't adapt to or mitigate environmental stress. And mass migrations, riots, insurgency, and other forms of social breakdown often result.
Excerpted from The Ingenuity Gap by Thomas Homer-Dixon. Excerpted by permission of Vintage, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.