TOO GRAVE A RISK
                                    by Alan F. Phillips, M.D.

    When a risk has knowingly been taken in order to achieve some
result, and the result has been achieved without the feared mishap
occurring, it is normal to conclude that it was worth taking the risk.
The success is attributed to good judgment.  Acceptance of the risk, if
it is thought of at all, tends to be seen as courageous and commendable.

    The 40-year nuclear confrontation between the U.S.A. and Russia, the
"Cold War" of 1947 to 1987, entailed the risk of a nuclear war.  It is
widely believed in North America that the breakup of the Soviet Union
showed that the Cold War was won by boldly facing the enemy without
blinking.  That may be so, but in fact the risk was too grave to be
justified by any conceivable success.  It was no less than a risk of
complete destruction of our civilization, of worldwide starvation, and
possibly extermination of the human race.

    It was by good luck that the world survived the 40 years of Cold
War, and we must not count on our good luck holding for ever.
---------------

    A huge system like the American nuclear deterrent is bound to have
mistakes in its design and construction.  During prolonged operation
they show up.  As well as those mistakes, the system is bound to suffer
equipment failures and human errors of many kinds.  It would have been
surprising if serious mishaps had not occurred from time to time, and
naturally they did.  A false alarm is particularly dangerous
in a confrontation that relies on "deterrence" at long range.  Suppose
the U.S. system suffers a false alarm.  The preparations to retaliate
are seen by the Russian early warning systems, and are perceived as
preparations for an unprovoked attack.  The Russians in turn prepare to
retaliate, and those preparations are seen by the U.S. as confirmation
of the original false alarm; and thus a mishap could build up to the
ultimate disaster.

    Gradually the planners became aware of the power of nuclear weapons
and the awful disaster that an inadvertent war would be.  Increasingly
elaborate safeguards against accidental or unauthorized use were built
into nuclear weapons and their delivery systems.  The planners also knew
that failure of any feature of the system might delay or prevent
retaliation, so they built in plenty of redundancy, particularly in the
realm of communication.  But they were not quick to learn from mistakes,
and they refused to follow the logic of both theory and experience:
that the risk of inadvertent war could not be made zero, and therefore
the whole doctrine of Mutually Assured Destruction was unacceptable.
-----------------

    Here is an example of a design error.  During the night of 24
November, 1961, at U.S. Strategic Air Command Headquarters in Omaha,
Nebraska, all communication links with NORAD at Colorado Springs, and
with the three early warning radar sites in the far north, suddenly went
dead.  For General Thomas Power in Omaha there seemed to be only one
likely explanation:  enemy action.  Accidental failure of the multiple
telephone and telegraph links, all at the same time, was almost
inconceivable.  Air Force bases were put on full alert and B-52 nuclear
bomber crews started their engines, with instructions not to take off
without further orders.  The Omaha staff managed to make radio contact
with a B-52 aircraft in flight 3,000 miles away over Greenland.  The
B-52 crew contacted the Thule Early Warning site, and reported back that
no attack had taken place.  (Imagine those messages, vital to the
survival of civilization, heard only through the static of long-distance
air-to-ground radio, 1961 quality!)  So General Power, still mystified,
canceled the alert.  He could not know that all the redundant routes for
telephone and telegraph between NORAD and his headquarters ran through
one relay station in Colorado.  At that relay station a motor had
overheated and caused a small fire which interrupted all the lines.
--------------------

    Electrical construction commonly contains wiring errors, which in
civilian life are usually found quite easily when some function does not
work.  Then they can be corrected after only minor
inconvenience.  At around midnight on 25 October 1962, during the Cuban
Missile Crisis, a guard at the Duluth base saw a figure climbing the
fence.  He shot at it, and activated the "sabotage alarm".  This
automatically set off sabotage alarms at all bases in the area.  At Volk
Field, Wisconsin, the alarm was wrongly wired, and instead the klaxon
sounded that ordered nuclear-armed fighter aircraft to take off.  The pilots
believed World War III had begun, and might have used their nuclear
weapons.  The duty officer at Volk Field kept his cool.  He called
Duluth.  There was no war.  By this time aircraft at Volk Field were
starting down the runway.  A car raced from the command center and managed
to signal the aircraft to stop. When things had settled down it was
found that the original intruder had been a bear, not a saboteur.  Once
again, good and quick thinking by the military officer in charge
prevented a possible disaster.
----------------

    The sort of thing that might start a war would be some unfortunate
coincidence: two or three mishaps creating such a confused situation that
no-one managed to guess that it was a false alarm.

    In 1956, on Guy Fawkes Day (5 November), British and French forces
were attacking Egypt at the Suez Canal.  The Soviet Government was
threatening to fire rockets at London and Paris.  (These were not
nuclear-armed rockets, of course, at that date; but NATO could respond
with nuclear bombers and attack Russian cities.)  That night, among the
many messages received by the U.S. military headquarters in Europe were
these four:
      (i) unidentified aircraft were flying over Turkey and the Turkish
air force was on alert
     (ii) 100 Soviet fighter planes were flying over Syria
     (iii) a British Canberra bomber had been shot down over Syria
     (iv) the Russian fleet was moving through the Dardanelles.
These four reports added up to a strong presumption that a war with
Russia was starting; but none of them actually meant what it seemed to
mean.  A flight of swans over Turkey had been misidentified as
aircraft.  The Soviet airplanes over Syria were a routine escort
(nowhere near 100 planes) for the Syrian president, who was returning
from a visit to Moscow.  The Canberra bomber was forced down by mechanical
problems; and the Russian navy was engaged in a scheduled routine
exercise in which a fleet sailed through the Dardanelles.
---------------

    Computers can play dreadful tricks, and can fail at the most
inconvenient moments.  There were many computers in the nuclear weapons
system from the earliest days of computing, when failures were common.
The two following examples fortunately did not happen at any very bad
moment.

    At the main Command Centers the warning displays were two windows
that showed the number of missiles, Intercontinental Ballistic Missiles
and Submarine-Launched Ballistic Missiles respectively, that were
currently in view by the early warning radar.  Normally each window
showed a string of four zeros.  At 2:25 a.m. on 3 June, 1980, the
windows started showing various numbers of missiles, always the figure 2
in place of one or more of the zeros.  Preparations for retaliation were
instituted: nuclear bomber crews started their engines, Pacific
Command's Airborne Command Post was actually launched, and the concrete
covers of Minuteman missile silos were rolled back.  This last
action would have been visible to Soviet satellites, and so could have
resulted in corresponding Alert procedures on their side.

    It wasn't difficult to determine that the system was malfunctioning,
because the patterns of numbers were not rational.  While the cause of
the false alarm was still being investigated three days later, the same
thing happened, and again preparations were made for retaliation.  Not
having been there at the time, I cannot judge with certainty; but to
take the same irrational display as a serious warning the second time,
and again risk escalation to a crisis or a war, seems stupid.  The cause
was a single faulty computer chip that was failing in random fashion,
and giving the same deceptive display at all the Command Posts.

    My last example was probably the nearest we have been to nuclear
holocaust.  Fortunately it happened in 1979, when nothing much else was
going on.  Just suppose this had happened during the Cuban Missile
Crisis.  At 8:50 a.m. on 9 November, duty officers at the four principal
U.S. command centers all saw on their displays a consistent pattern
indicating a large number of Soviet missiles in a full-scale attack on
the U.S.A.  Emergency preparations for retaliation were put in hand.
Missile silo lids were rolled back.  Air Force planes took off,
including the president's National Emergency Airborne Command Post, but
without the president.  It seems they could not find him in time.  With
commendable speed NORAD was able to contact the early warning radar and
learn that no missiles had been detected.  Also, the sensors on
satellites were functioning that day and saw no missiles.  In only 6
minutes the threat assessment conference was terminated.  It is said
that if the alarm had lasted 15 minutes, intercontinental missiles would
have been launched, and nothing could have brought them back.

    What had happened was that a technician was trying out a war games
tape on one of the back-up computers.  Two operational computers failed
one after the other, and each one automatically switched in its backup.
The second backup was the one the technician was using.  U.S. Senator
Charles Percy happened to be in NORAD Headquarters at the time and is
reported to have said there was absolute panic.  Be that as it may, some
officers acted coolly enough to make the correct checks and cancel the
alert in time to save the world from disaster.

    After that alarm an off-site testing facility was constructed, so
that exercise tapes did not have to be run on a system that could
possibly be in military operation.  But they ought to have done so long
before.  There were two or three incidents where war games tapes caused
confusion back at the time of the Cuban Missile Crisis, 17 years
earlier.

    People sometimes act foolishly, sometimes more wisely than we could
ever expect.  You can't foresee everything that might happen.
---------------

    What was the real risk of nuclear war?  Any risk of it was too much,
but did we need just "a bit of good luck" to survive, or were we
incredibly lucky?  You can't tell for sure but you can make some
guesses.  And on the basis of those guesses you can make probability
calculations that give conclusions which are certain to the extent that
the guesses are right.

    I collected reports of twenty mishaps, including the five described
here, that could possibly have started a train of events or
misunderstandings and caused a nuclear war.  Let us make a very
conservative guess, and suppose that each of the twenty represented, on
the average, a risk of 1 in 100 of disaster.  That means the chance of
surviving any one of them is 99 to 1 in your favour.  It is a simple and
certain deduction from probability theory that the chance of surviving a
series of risky events is equal to the probability of surviving the
first one, multiplied by the probability of surviving the second,
multiplied by the probability of surviving the third, and so on.  Unity
represents certainty of surviving, zero means certainty of being
destroyed, and a 1 in 100 risk of disaster is exactly a 0.99 probability
of surviving.  Multiplying twenty factors of 0.99 together gives 0.82,
or 82%.  That is just about the
same as the chance of surviving a single pull of the trigger at Russian
roulette played with a 6-shooter.

    No doubt the Russian system had just as many defects and accidents,
perhaps more.  Say twenty mishaps on the Russian side too: another pull
of the trigger?  Multiplying 0.82 by 0.82 gives a 67% chance of
surviving, or a 1 in 3 risk of death.
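
    These figures are easy to verify.  Here is a minimal sketch in
Python (the helper function and its name are mine, purely for
illustration; the risk figures are the guesses made above, not measured
data):

        # Chance of surviving a series of independent risky events,
        # each carrying the same guessed chance of disaster.
        def survival_probability(event_risk, num_events):
            return (1.0 - event_risk) ** num_events

        print(survival_probability(0.01, 20))  # twenty mishaps at 1 in 100: ~0.82
        print(survival_probability(0.01, 40))  # forty mishaps, both sides: ~0.67
        print(5.0 / 6.0)                       # one pull at Russian roulette: ~0.83

    The multiplication assumes that the mishaps were independent of one
another, which is itself part of the guess.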

    My selection of twenty mishaps represents only a small proportion of
the mishaps in accessible U.S. records.  There must also have been
accidents that were never reported because individuals did not want to
admit error, or wanted to keep up the good reputation of their military
unit.  Others may still be "classified" and the records inaccessible.

    My calculation depends upon the pure guess that the average risk may
have been 1 in 100, and upon the probable under-estimate of only twenty
events on each side.  Some of those events must have been worse,
particularly the computer tape debacle.  Suppose that some of them were
as bad as a 1 in 10 risk of disaster.  It would only take seven events
as bad as 1 in 10 to give the world less than a 50:50 chance of
survival.
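
    That claim is easy to check: seven factors of 0.9 multiply to just
under one half.  Here is a one-line check in Python (the figures are the
suppositions above):

        # Seven independent events, each a 1 in 10 risk of disaster:
        print(0.9 ** 7)    # ~0.478: less than a 50:50 chance of survival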
---------------

    My conclusion is that, although most of the time great prudence was
exercised by military commanders at all operational levels, the risks of
the Cold War were by no means justified.  Nuclear deterrence is far too
dangerous, and has to be abandoned.  During the Cold War we ought to
have been more scared than we were.
---------------

    The danger has not gone away.  The Russians still have something
like half of the original Soviet arsenal of ballistic missiles, with
nuclear warheads in place.  Similarly, in the U.S.A. about half the maximum
arsenal is still deployed.  On both sides the alert status is not much
lower than it was during the Cold War.  Retaliation to a perceived
attack might be started in a few minutes, without taking time for
thought and consultation, and without even waiting for the first nuclear
explosion.

    Suppose there were an unauthorized launch of an intercontinental
missile, which is not impossible in the present chaotic state of the
Russian military; suppose Chicago and its inhabitants were destroyed by
a direct hit with a one megaton bomb.  What would the U.S. response be?
Would they retaliate and destroy a Russian city?  If so, would that be
the end of the matter, or could the exchange escalate to a nuclear war?

Recommended reading:
    Phillips, Alan F.  1998.  "Twenty Mishaps That Might Have Caused
Nuclear War."  Available by e-mail on request.
    Sagan, Scott D.  1993.  The Limits of Safety.  Princeton, N.J.:
Princeton University Press.
------------------------

The Author:
Alan F. Phillips, b.1920; graduated with honours in physics, University
of Cambridge, England; military research on radar in World War II; then
qualified in medicine (University of Edinburgh), and specialized in
Radiation Oncology; practised radiation oncology in England, U.S.A., and
Canada; retired in 1984 and studied nuclear weapons and nuclear war;
became an activist for peace.
                                            16 March, 1998
------------------

Alan F. Phillips, M.D.
Physicians for Global Survival, Canada
<aphil@icom.ca>


1998.  Permission to reprint is granted provided acknowledgment is made to:
The Canadian Centres for Teaching Peace