Deadly decisions : how false knowledge sank the Titanic, blew up the shuttle and led America into war / Christopher Burns.
Record details
- ISBN: 9781591026600
- ISBN: 1591026601
- Physical Description: 360 pages : illustrations ; 24 cm
- Publisher: Amherst, N.Y. : Prometheus Books, 2008.
Content descriptions
Bibliography Note: Includes bibliographical references (pages 337-346) and index.
Search for related items by subject
Subject: Errors.
Available copies
- 1 of 2 copies available at Evergreen Indiana.
Holds
- 0 current holds with 2 total copies.
Location | Call Number / Copy Notes | Barcode | Shelving Location | Status | Due Date |
---|---|---|---|---|---|
Porter County PL - Valparaiso Public Library | 001.96 BUR (Text) | 33410010234039 | Adult Nonfiction | Checked out | 04/17/2024 |
Starke Co PL - Schricker Main Library (Knox) | 001.96 BUR (Text) | 30032010455324 | ADULT NON-FICTION | Available | - |
DEADLY DECISIONS
HOW FALSE KNOWLEDGE SANK THE TITANIC, BLEW UP THE SHUTTLE, AND LED AMERICA INTO WAR
By CHRISTOPHER BURNS
Prometheus Books
Copyright © 2008 Christopher Burns. All rights reserved.
ISBN: 978-1-59102-660-0
Contents
INTRODUCTION
1. FALSE KNOWLEDGE
    Titanic: Unsinkable
    Medicine, Money, and War: Unthinkable
2. VIRTUAL TRUTH
    Three Mile Island: The Opposite Is True
    Ignore, Deny, Forget
    Shuttle Challenger: No One Is to Blame
    A Conspiracy of Silence
3. MEMBERS OF THE MIND
    The Truthful Brain
    Some Things Must Be Believed to Be Seen
    USS Vincennes: The System Worked Fine
    Faster Than the Speed of Thought
4. CHALLENGING TRUTH
    Betsy Lehman: Truth to Power
    The Truthful Team
    Inspiration, Reason, and Consensus
    Les Philosophes
5. THE INFORMATION WAR
    The Stag Hunt
    Blinking Red
    Facts, Values, and Concepts
    9/11: Reality Hits Home
    What Went Wrong?
6. DEADLY DECISIONS
    The Information Bubble
    Katrina: Warners and Warnees
    Uncertainty Absorption
    Mrs. Aristotle's Teeth
    Six Tests for Truth
    The Truth/Action Paradox
7. THE COMING EPIDEMIC
    Avian Flu: The Facts
    Believing WHO?
    Considering Disaster
8. WALTZING INTO WAR
    Faking the News
    Manufacturing Truth
    There Is Less to This Than Meets the Eye
    Invading Iraq: The Smoking Gun
    Mission Accomplished
9. TRUTH SYSTEMS
    The Truthful Organization
    Truth Systems Crumble
    Can Democracy Survive?
    The Machinery of Knowledge
    So?
    The Future Isn't What It Used to Be
ENDNOTES
SELECTED BIBLIOGRAPHY
INDEX
Chapter One
FALSE KNOWLEDGE
It was nearly midnight on April 14, 1912, when the White Star liner Titanic, racing along on its maiden voyage across the North Atlantic, struck an iceberg many times larger than the ship itself. We now know from extraordinary new photographs that the seam along the starboard hull was ripped open by the impact. The ocean poured in, breaching the double hull, flooding the mail room, the squash court, and the third-class berths. It spilled over the "watertight" bulkheads to snuff out the giant steam boilers and pull the ship down in one of the greatest disasters of modern times.
In the wake of this tragedy, the British government ordered that ice patrols commence along the major shipping lanes to guard against another disaster. They should have called for truth patrols instead.
TITANIC: UNSINKABLE
There was nothing surprising about icebergs in the North Atlantic. All day the Titanic had received detailed information about the ice ahead, delivered directly to Captain Edward Smith and to other officers on the bridge. Two hours before the incident, the Mesaba sent the Titanic a warning of icebergs at 42° 25'N 50° 14'W, almost precisely where the accident later occurred. At least one report was given to Bruce Ismay, president of the White Star Line. Eager to set a new record for transatlantic crossing, Ismay calmly stuffed the note in his pocket. In spite of frequent and specific warnings, he made the decision to let the ship race ahead at full speed.
Ismay believed three things to be true: first, he knew from his own experience with other ships that the lookout would give him actual sightings of any iceberg in time to steer around it. Second, his team of engineers assured him that even if the Titanic struck an iceberg submerged or otherwise difficult to see, the ship would not sink. And third, if there should be an accident of any kind, there was a tight and mutually supportive community of ships nearby that would come to his aid. A truth patrol would have noted that he was wrong on all counts.
The night was not dark and stormy, but clear, cloudless, and spangled with stars; conditions for timely warning seemed excellent. But the lookout actually assigned to watch for ice couldn't see that well. Frederick Fleet had not been given an eye exam in five years and, in spite of his frequent requests, had not been provided binoculars. More important, the unprecedented size and speed of the Titanic were so great that, unlike other ships in Ismay's experience, this one could not be stopped or significantly turned in less than a mile with engines full astern. Even a lookout with excellent vision could not have spotted the iceberg at that distance.
Nor was it unsinkable. The Titanic was built to remain afloat even with its first four watertight compartments flooded, but the new design had not been tested. Ismay even overruled the worried engineers who built the ship, cutting the number of lifeboats from thirty-two to sixteen, the minimum required by the British Board of Trade. He later observed that the reason the Titanic carried any lifeboats at all was so they could rescue passengers and crew from other ships.
And finally, the Titanic was for all practical purposes alone. Ten miles away, the Leyland liner Californian had slowed down in the dangerous ice field when third officer Charles Victor Groves saw the lights from a large passenger liner racing up from the east. As he watched, the liner seemed to stop as if in trouble. He went to the radio room, found that the operator had gone to bed, and tried to operate the radio himself but was unable to make contact.
For the next two hours, Groves and his fellow officers studied the strange behavior of the Titanic. She seemed to float awkwardly on the sea, firing white rockets normally considered a signal of distress. When they reported this to Stanley Lord, captain of the Californian, he told them to try to make contact by Morse lamp, but this proved unsuccessful. Finally, when it seemed that the lights of the nearby ship were beginning to disappear, the officers went again to Captain Lord, who was lying down in his cabin. "Were they all white rockets?" he asked. And hearing that they were, he went back to sleep.
At the conclusion of the British inquiry, Lord Mersey, chairman of the inquiry committee, wrote: "When she first saw the rockets, the Californian could have pushed through the ice to the open water without any serious risk and so have come to the assistance of the Titanic. Had she done so, she might have saved many if not all of the lives that were lost."
Why didn't Lord take the distress signals seriously? He testified later that he believed the Titanic was "unsinkable," that the distress signals seemed ambiguous, and that he was in a dangerous ice field himself. His officers concluded from the same evidence that they were witnessing a disaster of unprecedented proportions, but they stood at the rail and kept their silence.
It is part of the Titanic legend that David Sarnoff, the twenty-one-year-old employee of the Marconi Wireless Telegraph Company, stayed at his primitive radio for seventy-two hours, receiving and passing on the names of those lost at sea, in the first major demonstration of the revolutionary new wireless telegraphy. It seems ironic that we should find there, at the birth of the technology that epitomizes our age, the specter of an ancient and enduring problem. The Titanic sank because of multiple failures to manage information correctly: failure by Captain Smith to heed warnings, failure by Bruce Ismay to credit evidence that was contrary to his ambitions, failure by Captain Lord of the Californian to act in the face of doubt. The Titanic sank because of false knowledge.
The errors that surrounded the sinking of the Titanic are not unusual. A number of recent disasters on a comparable scale indicate that the problem may be growing worse. Early warnings are often brushed aside by men and women in the thrall of their own dreams. Critical information is frequently delayed or lost in organizations blundering through a crisis. And at the last moment, when the situation could be saved, communication systems fail and dissenters often fall silent.
Every new age begins with a drum roll of novel disasters and ends in a fog of nostalgia. One of the unexpected problems of the Industrial Age, for example, was finding low-cost labor to come in from the farms and cottages to tend the mills, a task that precipitated bloody riots and decades of dissent. Now, with the onset of the Information Age, our information systems seem to offer new power over a complex world, but then those systems fail us. The warning is late, the message is confusing, the signal never gets through. Failures are attributed to some cause over which we can comfortably claim no control: equipment malfunction, system complexity, inadequate training, bad weather. Even our new disasters are understood in terms of old conditions.
It is customary in business management literature to say that, in such cases, the decision system failed. The one responsible for the final evaluation of alternatives was distracted, deluded, indecisive, unable to handle all the data, or emotionally ill-equipped for the stress. And when none of these conditions can be confirmed with certainty, those who rake through the ruins of an accident say he made the best decision he could, given the information available to him at the time. No one is to blame. The solution is to build more decision support systems, get more data, make better presentations, order extra computer checks.
But the reality is otherwise. Some of the information, though wrong, looks right. It fits neatly with the rest of the data. It comes from a trusted source. It has been reviewed and approved by several hierarchies of analysts. The danger is that we live in a world where the evidence and analysis always make sense, but where, through personal and social processes we scarcely understand, the information gathered by the organization has come to include false knowledge. Everyone is to blame.
Our ability to determine the accuracy of information is increasingly inadequate. Smart men and women in the best organizations, surrounded by data, make the wrong decisions. We race ahead like Ismay on the bridge, charmed by the possibilities but betrayed by the facts. We are vaguely aware of the need for speedy and relevant information but insensitive to the limits of cognition and almost entirely ignorant of how information is distorted as it travels through an organization. We cling to the idea that a reasonable man in a position of authority should make these complex decisions alone, and we ignore the fact that the whole community (the engineers, the lookout, the signalman, the radio operator, the officers on the deck of the Californian) has already narrowed his choices and set the future in motion. We are trying to manage communities, businesses, and nations on the basis of garbled reports and unreliable message systems without having achieved the ability to test for truth, not as some philosophical matter but as the basis for action in a complex world.
Even direct observation can often be misleading. Beyond a certain speed and level of complexity, the world we observe is a false one. The Titanic was effectively traveling blind: by the time Fleet spotted the iceberg, it was already too late to turn the ship. The collision had occurred downstream in time and nothing in his power could then undo it. Until that moment, a man's ability to observe nature had been approximately equal to his power to react. He could run from a volcano, dodge a screaming artillery shell, and steer a ship through a storm. But new technology has changed the rules. Some armies in history have marched faster than their supply trains and starved to death on the eve of victory. Fighter pilots joke that beyond the speed of sound there is no point looking out the window: all you see is the past. At Mach 2 another fighter coming directly at you appears as a speck a mile away, but within one second-faster than many pilots can react-the planes have passed each other or collided.
A hundred years ago, the truth was easier to establish. One could visit the mills, go to the market, see the laboratory, and reach a conclusion based on personal experience. Information was a record of market transactions, a notebook of observations from the mill, or a report of distant events. It could be confirmed by personal observation. But now, as organizations expand to undertake broader and more specialized tasks, financial trading systems have become the marketplace, software development is the mill floor, reports are the events. And the decision center of an organization is often too far from the relevant reality to check for accuracy. The real world we need to observe is deep in the reactor core, in the night sky on the other side of the world, or in the purchasing behavior of millions who will buy the product. We discover too late that our observations are not timely enough, or complete enough, or accurate enough to be relied upon. We design products for consumers we have never met. We prescribe medicine we have not tested ourselves for patients we scarcely know to treat illnesses diagnosed by others. We aim our missiles at nations whose language we cannot read or speak, and we are betting our future health and productivity on genetic devices and nanotechnology we can no longer see or feel.
MEDICINE, MONEY, AND WAR: UNTHINKABLE
How shall we test the quality of complex information? How was Ismay to decide the truth of the message "There are icebergs ahead"? In the middle of a modern management situation, it is rarely possible to consult some long-standing authority. Events change quickly, and authority has a habit of hiding in generalities. Nor can we always measure the truth of a message by comparing it rationally to other information available. Too often the other information comes from the same source. The liar's art, after all, is to weave from ambiguous and dissonant data a "reasonable tale." It was certainly contrary to Ismay's style, and to the style of his times, to gather his officers together and evaluate the problem. And if he had, what good would it have done? The officers might have been more cautious, but they were not better informed.
Consider three fields where information is the primary reality-medicine, finance, and international relations.
In the field of medicine, the inability to see errors in the patient's diagnosis or treatment has led to a startling increase in patient deaths. In 2000, the Journal of the American Medical Association published the results of a study showing 225,000 deaths a year as a result of medical error: 12,000 from unnecessary surgery, 7,000 from medication errors in the hospital, 20,000 from other hospital errors, and 80,000 from infections occurring while the patient was hospitalized. One hundred and six thousand deaths resulted from "non-error" negative effects of drugs. The total number of deaths caused by medical error in the United States is roughly equal to the number of casualties from two jumbo jets crashing every day.
It is possible that doctors are trained to be more attentive to the problems of false knowledge than engineers, lawyers, financial analysts, and intelligence professionals, as we shall see. In modern medicine, truth is their life's work. But the mistakes they make are more visible, and the results more personal. A study published in 2008 by the New England Healthcare Institute concluded that 8.8 percent of hospital patients in a sample of eight Massachusetts hospitals suffered from "preventable adverse drug reactions," ranging from a change in respiratory rate to a fever or a seizure to anaphylactic shock. "Conservative estimates show that nationwide, adverse drug events result in more than 770,000 hospital injuries and deaths each year."
And that's in the hospital, where a team of professionals works within the discipline of protocol, where outcomes are measured, and where insurance companies have a special interest in preventing mistakes. This doesn't take into account the errors committed by individual physicians treating patients in their own offices. Dr. Vincent DeVita, director of the National Cancer Institute at the US National Institutes of Health, said that of the 462,000 cancer deaths in the United States that year, at least 20 percent occurred because the doctor didn't have, or wouldn't use, information that was immediately available to him or her. In each case, the doctor reached a conclusion based on experience and prescribed the indicated therapy. In each case, the patient died. DeVita went further: the number of deaths "could be cut by as much as 50 percent" if doctors would (or could) take the time to look beyond their desktop for prevention and treatment knowledge already available. Although the information can be found in journals, textbooks, online databases, and university research departments, many doctors cannot or will not use it. Instead they rely on their own memory and judgment, heavily influenced by the desire to heal, and are reluctant to use diagnostic tools that might define the disease in terms they don't recognize.
According to a study at the Harvard School of Public Health, 20,000 heart attack deaths a year could have been prevented over the last decade if doctors had accepted research findings when they were first published. Simple discoveries such as the value of aspirin in preventing second heart attacks were ignored for a decade because they were published in a statistics journal doctors don't read. It isn't that they couldn't get access to the journals; access is easy. The doctors couldn't come to grips with information that seemed foreign and contradictory, information that challenged their competence or caused confusion, information of unknown and indeterminable quality.
Herbert Simon, the Nobel economist, writing about how decisions are made, suggested that there are limits to reason. Individuals tend to identify a new situation as representative of a class of conditions they are familiar with. And as they decide how to respond, they remember successful or vivid responses in the past. People do what has worked before. But when the information is incomplete, people have to imagine what might be missing. When the experience is insufficient to explain what is going on, people have to search for other less familiar responses and make judgments about the efficacy of each one. Usually, Simon says, they choose the first combination of situation and remedy that seems to fit the facts. Not the best one, just the one that is, in Simon's language, most "available." In other words, they guess.
(Continues...)
Excerpted from DEADLY DECISIONS by CHRISTOPHER BURNS Copyright © 2008 by Christopher Burns. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.