SAGE like Forrest Gump
Les Earnest (les at cs.stanford.edu)
I began work at MIT Lincoln Lab in late 1956 and was
assigned to share an office with Paul Sinesi. When I asked
what he did, he replied that he was working on SAGE, an air defense system that would work as long as the attacking bombers did not jam its radars.
Having been trained as an aviation electronics officer, I found that answer puzzling, since I knew that hostile bombers normally attempt to jam radars. Being new to the organization, however, I assumed that I would later learn how they dealt with this problem.
In fact, as I subsequently observed, they didn't deal with it, and all tests and demonstrations were carefully designed to avoid it. Thus SAGE was a gigantic fraud on taxpayers: a "peacetime defense system" that would have malfunctioned in an actual attack.
This presented me with a dilemma. I had come to MIT because it was a hotbed of advanced computer technology but helping create a military-industrial-political fraud was unethical. Nevertheless I continued to work on SAGE for a time and learned that it had other major defects.
These problems were widely discussed in-house at the time but were kept secret from the public.
Thus SAGE had several things in common with the mythical Forrest Gump: it was very fast, financially successful, and incredibly stupid.
Because I had come from a military aviation background I was assigned to do weapons integration for SAGE, which involved specifying intercept guidance calculations for various tactical approaches by manned interceptors and for direct interceptions by BOMARC missiles. I had to negotiate with several aircraft company engineering representatives regarding how the guidance commands could either be displayed to pilots or optionally coupled to their autopilots so that the pilot could focus on launching air-to-air missiles at bombers. I also designed the display consoles used by Intercept Directors to select tactics and oversee interceptions.
I subsequently learned that MIT had tried hard to duck out of the responsibility for weapons integration on the grounds that it should be done by the Air Force, but for some reason the Air Force did not want to do that.
Good to go
Each BOMARC missile was to use a rocket booster to get airborne and a ramjet to cruise at high altitude under SAGE control to the vicinity of its target. It then used its Doppler radar to locate the target aircraft more accurately so that it could dive at it and detonate. BOMARCs were based in hardened structures and, when a given missile received a launch command from SAGE, sent via land lines, the roof would roll back, the missile would erect, and if it had received a complete set of initial guidance commands, it would launch in the specified direction.
It was clearly important to ensure that the electronic guidance systems in these missiles were working properly, so the Boeing engineers included a test feature that would generate a set of synthetic launch commands so that the missile electronics could be monitored for correct operation. When in test mode, of course, the normal sequence of erecting and launching the missile was suppressed.
However, when we reviewed the BOMARC launch control system, one of our engineers noticed a rather serious defect. After the launch command system was tested, each missile was left in a state of readiness for launch. If the "Test" switch was then returned to "Operate" without individually resetting the control system in each missile that had been tested, they would all immediately erect and fire! Needless to say, that "feature" was modified soon after we mentioned it to Boeing. The fact that it wasn't caught by the manufacturer suggested that safety oversight of their engineering practices was inadequate.
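The defect amounted to a missing state reset on the mode transition. Here is a toy sketch of the defective logic and the fix; the class and field names are my own invention, not Boeing's, and the real controls were of course hardware, not software:

```python
class Missile:
    """Toy model of one missile's launch-control state (hypothetical names)."""
    def __init__(self):
        self.ready_to_fire = False
        self.launched = False

    def receive_launch_command(self, test_mode):
        # A launch command arms the missile even during a test.
        self.ready_to_fire = True
        if not test_mode:
            self.launch()

    def launch(self):
        self.launched = True


def switch_to_operate(missiles, reset_first):
    """Model returning the 'Test' switch to 'Operate'."""
    for m in missiles:
        if reset_first:
            m.ready_to_fire = False   # the fix: reset each tested missile
        elif m.ready_to_fire:
            m.launch()                # the defect: immediate erect-and-fire


# Exercise the defect: three missiles armed during a test all fire
# the moment the switch returns to Operate.
missiles = [Missile() for _ in range(3)]
for m in missiles:
    m.receive_launch_command(test_mode=True)
switch_to_operate(missiles, reset_first=False)
assert all(m.launched for m in missiles)
```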
Another problem showed up in the packet radio system used to guide both manned interceptors and missiles. The packet formats were carefully specified for each kind of command, principally heading, altitude, and speed, and the creation of the ground transmitters and airborne receivers was assigned to two different contractors, but when the system was tested it didn't work. It turned out that the specifications neglected to say whether the high-order or low-order bit was to be transmitted first, and the two contractors made different assumptions. That too got fixed, at some expense.
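The mismatch is easy to reproduce. In this sketch, one side serializes with the high-order bit first while the other deserializes assuming the first bit received is the low-order bit; the 6-bit heading command is a made-up example, not the actual packet format:

```python
def to_bits_msb_first(value, width):
    """Serialize an integer into a list of bits, high-order bit first."""
    return [(value >> (width - 1 - i)) & 1 for i in range(width)]

def from_bits_lsb_first(bits):
    """Deserialize assuming the FIRST bit received is the LOW-order bit."""
    return sum(bit << i for i, bit in enumerate(bits))

heading = 0b110100                       # hypothetical 6-bit heading command
sent = to_bits_msb_first(heading, 6)     # transmitter's assumption
received = from_bits_lsb_first(sent)     # receiver's assumption
# The receiver reconstructs the bit-reversed value, not the original:
assert received == 0b001011
assert received != heading
```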
Inadvertent erection and another embarrassment
In 1960, I was somehow assigned the responsibility of leading a study group to get approval for putting nuclear warheads on the second-generation BOMARC ground-to-air missiles. This involved proving to a government nuclear safety board that the system posed no significant risk of inadvertent launch or detonation.
The SAGE system used land lines to transmit launch commands to the missile sites and, since these lines were duplexed, a black box at each missile site was set up to detect when the primary line went bad so that it could switch to the backup. However, on examination we noticed that if both lines went bad concurrently, the system would remain connected to the backup line, and the amplifiers would then pick up and amplify whatever noise was there and interpret it as a stream of random bits.
Jack Dominitz, a member of our team, did a Markov analysis to determine the expected time that it would take for a random bit stream to generate a Fire command for one of the missiles. He found that it was a little over two minutes and, when such a command was received, the missile would erect and prepare to launch. However, unless the missile also received a full set of guidance commands during the launch window of about five minutes, it would automatically abort. Fortunately he was able to show that getting a complete set of acceptable guidance commands within this time frame was extremely improbable, so this failure mode did not present a nuclear safety threat, though it could be a bit frightening.
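I no longer have the details of Jack's analysis, but the core of such a calculation, the expected number of fair random bits before a given code word first appears, can be sketched with the classical pattern-correlation result that is equivalent to the Markov-chain computation. The 16-bit "Fire" code word and the line rate below are made-up stand-ins, not the real command format:

```python
def expected_bits_until_pattern(pattern):
    """Expected number of fair random bits until `pattern` first occurs:
    the sum of 2**i over every length i at which the pattern's prefix
    equals its suffix (a standard renewal/Markov-chain result)."""
    k = len(pattern)
    return sum(2**i for i in range(1, k + 1) if pattern[:i] == pattern[k - i:])

# Self-overlapping patterns take longer to appear on average:
assert expected_bits_until_pattern("1111") == 30
assert expected_bits_until_pattern("1010") == 20

# A hypothetical 16-bit "Fire" code word at an assumed line rate:
fire_code = "1011001110001101"
seconds = expected_bits_until_pattern(fire_code) / 600  # bits/second, assumed
```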
The official name of the first BOMARC model was IM-99A, so I wrote a short classified report about this problem titled "Inadvertent erection of the IM-99A." While that title raised a few eyebrows, it was destined to get much more attention than I expected because its prediction came true a couple of weeks after it was published. Both phone lines went bad at one site, which caused a missile to suddenly erect, start the launch sequence, and abort. Needless to say, this scared the hell out of the site staff and a few other people, but I believe it was not made public at the time.
The Air Force was suitably impressed with our prediction and I was immediately called upon to chair a committee having the honor of fixing the problem. The fix was rather easy: just disconnect when both lines are bad. With good engineering practice, of course, this kind of problem wouldn't have happened. However, the world is an imperfect place.
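The corrected decision rule is trivial to state. A minimal sketch follows; the real equipment was analog hardware, so this is only an illustration of the logic, with names of my own choosing:

```python
def select_line(primary_ok, backup_ok):
    """Line selection for the duplexed launch-command circuit (sketch)."""
    if primary_ok:
        return "primary"
    if backup_ok:
        return "backup"
    # The fix: with both lines bad, disconnect entirely rather than
    # amplifying line noise into a stream of random command bits.
    return "disconnected"

assert select_line(False, False) == "disconnected"
```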
Now, with 50 years hindsight, I realize that both our study group and the government nuclear safety committee overlooked another possibility, namely that a malevolent programmer might have been able to launch a missile all by himself. There was no certainty that such a scheme would have worked inasmuch as the SAGE software was reviewed by multiple people who might have questioned any odd-looking code.
Nevertheless we should have considered that possibility and taken steps to ensure that it didn’t happen. The reason we didn’t was that there was no such thing as a malevolent programmer in that era (1950s and ‘60s) – we were all honest, upright, and altruistic, so the idea that a programmer might sneak in evil code was inconceivable. Later experiences on the Internet revealed other possibilities.