AN EMPHASIS ON LOGIC
Though the phrase has come to mean practically all things to all programmers,
essentially it describes a systematic, mathematical approach to the creation of
software. ln particular, it calls for dividing programs into small, logically arranged
tasks, as Pascal does. One specific aim of structured programming is to reduce
the use of the so-called unconditional jump, or GOTO statement. Most major
languages use the GOTO in order to transfer control of processing from
one place in a program to another point perhaps several pages distant. Though
a handy tool for the programmer, the GOTO statement almost always makes
a program more difficult to read and thus increases the chance that errors
will go undetected.
By stressing rigorous organization, advocates of structured programming
hoped to limit the problems created by the ever-increasing complexity of software.
Programs such as those required by systems that control air traffic - and
later space satellites - were growing so large that they took years to complete;
they had to be written in sections by teams of programmers, none of whom had a
grasp of how everything fit together. Too often the result was software that cost
millions of dollars, lagged months behind schedule and came on-line containing
thousands of errors. The problem became so severe that computer scientists
started referring to it as "the software crisis."
Nowhere was this mounting crisis more critical than in the U.S. military establishment,
the world's largest consumer of computer hardware and software. By
1973, when officials began to pay serious attention to the problem, the Department
of Defense was spending nearly half of its $7.5 billion computer budget to
develop and maintain software. The cost of computer hardware, by contrast, was
declining despite dramatic improvements in the computers' power and memory.
The software problem was most acute in weaponry and other so-called
embedded computer systems. Such a system consists of a computer embedded
in a weapon or machine - the tiny computer in a ballistic missile, for example,
or the bigger ones controlling the communications on an airplane or ship.
(Examples of embedded systems in nonmilitary applications include the microprocessors
in automobiles or microwave ovens as well as the ones in robots on an
industrial assembly line.)
Programs for embedded military systems often run to tens of thousands of lines
of code. Expensive to write, such programs are even more costly to maintain.
Over a typical lifetime of up to 20 years, they must undergo repeated modifications
to keep up with the system's changing requirements. And program bugs in a
system controlling a ballistic missile or an air-defense network could obviously
have disastrous consequences.
No small part of the problem was the incredible hodgepodge of languages
in which embedded-system software was written. Surveys during the early
1970s found no fewer than 450 high-level languages and dialects employed
in coding such programs. (Some estimates, which also counted assembly languages,
ran as high as 1,500.) Many were obscure languages developed for a
single job because none of the major general-purpose languages could meet
the job's special needs. These needs might include unusual input/output requirements
and real-time control - the ability to monitor and respond to constantly changing conditions.
One result of the proliferation of languages in the military was massive duplication
of effort. Each service had its own favorite languages, which were
incompatible with those of the other services; a program written in an air
force language, for example, had to be completely rewritten in a different language
for use by the army or the navy. This, together with the related problems
of training programmers to make them literate in more than one language,
and of developing separate compilers for many applications, added
up to runaway costs.
In January 1975, the Pentagon set out to impose order on the linguistic chaos. It
established a large committee known as the High Order Language Working
Group (HOLWG), with representatives from all the military services as well as
from three U.S. allies in the North Atlantic Treaty Organization - France, West
Germany and the United Kingdom. HOLWG's mandate was to find languages -
preferably only a few of them - suitable for programming every new embedded
computer system that came on-line.
THE AIR FORCE'S COMPUTER JOCKEY
HOLWG's chairman, and the driving force behind the effort to straighten out the
software mess, was Air Force Lieutenant Colonel William Whitaker. A brilliant
student who dreamed of becoming a jet pilot, Whitaker had breezed through his
undergraduate studies in physics at Tulane University in two years. He then
achieved the highest academic grades in the history of his air force flight
school, only to wash out as a pilot because he just could not get the hang
of controlling an airplane. Despite this disappointment, Whitaker stayed in
the air force and became thoroughly versed in computer science during a 16-year
stint in nuclear weapons research at Kirtland Air Force Base, near Los
Alamos, New Mexico. While rising to the post of chief scientist of the Air Force
Weapons Laboratory there, he personally accounted for some 30,000 hours of
computer processing time.
During that period, Whitaker came to know the frustrations of language incompatibility
all too well. He remembered one program in particular that had to
be rewritten five times as his computers again and again were replaced by newer
models. Though HOLWG's mandate did not require the creation of a single
common language, Whitaker had that in mind from the beginning. "He believed,
when no one else believed, that there was a need for a common language,"
recalled one close observer, "and then he made it happen."
The way Whitaker made it happen was a sharp departure from all of the
language-design procedures that had gone before, either in or out of the military.
Instead of appointing a committee to haggle endlessly and then settle upon a
language, HOLWG - at Whitaker's urging - sought the guidance of a long list of
computer users within the military and programming experts outside.
The users were asked to help define the necessary requirements for a common
language. The task of drafting these general specifications fell to David Fisher,
a civilian researcher at the Institute for Defense Analyses. Fisher brought to
the job a solid background in the theory and practice of programming; he had
taught at two universities and had designed military software at the Burroughs
Corporation. He already had conducted studies of the Defense Department's
software costs and understood its tangle of computer languages so thoroughly
that he could usually pinpoint in an instant which department installation
used which dialect of which language, and what the dialect's particular features
were meant to achieve.
A STRAWMAN'S FATE
In April 1975, three months after the formation of HOLWG, Fisher's first draft of
requirements for a common language was circulated to reviewers in the military,
industry and academia under the code name Strawman. The choice of name was
significant, indicating that Fisher and Whitaker intended this document, as someone
put it, "to have the stuffing knocked out of it" by the reviewers, who would
then suggest improvements.
The pounding was not long in coming, and Strawman was revised in response
to the critical comments. This cycle of draft, review and revision continued
through five additional sets of requirements over the following three years,
eventually reflecting evaluations by more than 80 review teams in the U.S. and
Europe. Each succeeding document bore a name that measured progress toward
a hardening of the requirements: Woodenman, Tinman, Ironman, Revised Ironman
and, the final standard, Steelman.
The list of requirements lengthened, reaching nearly 100 by the Tinman phase,
until it became clear that no existing language could fill them all. The armed
services issued an interim list of seven languages, including FORTRAN and
COBOL, approved for programming embedded systems. But subsequent appraisals
of these and a score of others made clear that none could satisfy more
than 75 percent of the specified requirements.
Under Whitaker's prodding - "he ran the project with an iron fist," an observ-
er noted, "in a velvet glove, of course" - HOLWG came to agree that the requirements
could be met only by creating an entirely new language. To achieve
this, the committee decided to stage an unprecedented international competition.
In May 1977, while the specifications were still evolving, the committee
requested proposals from the world's top language designers, with the understanding
that the proposals would be based on one of three languages: PL/I,
ALGOL 68 or Pascal. Fifteen design teams responded, and most of their proposals
were based on Pascal, demonstrating the dramatic impact of the new concept of structured programming.
HOLWG selected four of the proposals for funding during a six-month preliminary
design phase. The contractors, all of whom proposed Pascal-based
designs, were two Massachusetts companies, SofTech and lntermetrics; a California
firm, SRI International; and Cii Honeywell Bull, the Paris-based subsidiary
of an American company, Honeywell Corporation. Though each design team's entry
received a color code name to preserve its anonymity during
the review process, the predilections of the contractors were so familiar that
astute reviewers were able to match the teams with their respective designs in
a matter of minutes.
In 1978, after evaluation by nearly 400 reviewers, two of the four designs -
Red (Intermetrics) and Green (Cii Honeywell Bull) - were selected for a final
showdown. The year-long phase of refinement that followed was unusually
intense. A member of the Red team remembered falling asleep at night crying
from fatigue. "Red was the more conservative language, Green the more briefly
described, avant-garde language," one of the competitors said. "But both languages
changed during this final phase: Red becoming more avant-garde, Green
becoming more conservative as it was fleshed out."
The winner, announced in May of 1979, was Cii Honeywell Bull. The
Green team's victorious entry was christened Ada. The name honored
Augusta Ada, Countess of Lovelace, the 19th-century mathematician and writer
who is often credited with being the world's first programmer because of her
interpretive writings about Charles Babbage's Analytical Engine in the predawn
history of computing.
The victory was a personal triumph for Jean Ichbiah, who headed the Green
team. Born in Paris in 1940, Ichbiah trained as a civil engineer at the prestigious
Ecole Polytechnique. Later, the French government awarded him a fellowship for
further study in the United States. He became so captivated by computer programming
while taking his Ph.D. at M.I.T. that he had difficulty completing his
thesis on the optimal arrangement of subway systems. Soon thereafter, Ichbiah
joined Cii, a new French company that later merged into Cii Honeywell Bull, and
in 1972 he designed his first programming language, LIS, for
Langage d'Implementation de Systemes. LIS was strongly influenced by Pascal
and was the seed from which Ada sprang.
During the design competition, Ichbiah, who spoke no fewer than five human
languages and had a brown belt in judo, drove himself even harder than he drove
his 10-person international team, which included members from the U.S., the
United Kingdom and West Germany as well as France. He sometimes worked
100 hours a week perfecting the design. Often he let his intuition guide him
in making a decision, relying on esthetic considerations, for example, before
developing a logical rationale. The result, wrote an admiring member of the
runner-up Red team, was not "a language designed by a committee" but
one "designed by a small team with a strong leader."
MODULES FOR EASY MAINTENANCE
Ada's most distinctive aspect was an extreme approach to structured programming.
The language permitted programs to be written in packages - self-contained
modules that can be produced by different programmers and then
fitted together. A package can be designed, tested, debugged and then stored in a
library for later use in a program as if it were a piece of off-the-shelf software. This
modular scheme, Ada's advocates have argued, creates programs that are reliable,
easy to read and easy to maintain, saving thousands of hours and hundreds
of millions of dollars.
But Ada's fans concede that the language pays a price for its readability and
other advantages. Ada has so many features, designed to meet the government's
Steelman specifications, that it is exceedingly difficult to learn. In addition, an
Ada compiler occupies many times the memory space needed by compilers
for its root language, Pascal. Ada's size and complexity bothered critics such
as Pascal's author, Niklaus Wirth, and C.A.R. Hoare, his old colleague from the
ALGOL 68 controversy. Hoare, who served with Wirth on the SRI International
team that was eliminated in the semifinals of the design competition, worried
aloud that "gadgets and glitter prevail over fundamental concerns of safety and
economy." He even publicly raised the specter of missiles going awry because of
an undetected flaw in an Ada compiler.
Wirth put his concern a different way. "It throws too many things at the
programmer," he said. "I don't think you can just learn a third of Ada and be fine.
There are places where you tread on one of these spots which you haven't
learned about, and it backfires on you."
In defense of his language, Ada's chief architect, Jean Ichbiah, expressed his
"admiration and respect" for Wirth but added: "There are times when Wirth
believes in small solutions for big problems. I don't believe in that sort of miracle.
Big problems need big solutions!"
Other advocates have contended that the only alternative to a large, complex
language like Ada for writing big software projects is a proliferation
of small, simple and incompatible languages - the very situation that Ada
was meant to remedy.
Predictably, creating compilers that would allow Ada programs to run efficiently
on the Defense Department's various machines was no easy task. The job
was made even more difficult by the Pentagon's determination that Ada remain
unadulterated by dialects, extensions or subsets. Under the department's Ada
trademark, any proposed compiler must conform to uncommonly rigid standards:
No one can call their product an Ada compiler unless it is first officially validated
in a battery of some 2,000 tests.
THE SURVIVORS AT WORK
Despite these hurdles, successful compilers eventually appeared, and Ada began
to make its presence felt. In 1983, the Defense Department directed that
all new "mission-critical" applications be written in Ada. "Mission-critical"
refers to computerized communications and weapons systems, such as the
enormous programs contemplated for the Strategic Defense Initiative anti-missile
network. The Pentagon has predicted that by the end of the decade,
85 percent of new mission-critical software - five billion dollars' worth -
will be written in Ada.
Beyond its military applications, which included adoption as NATO's standard
programming language, Ada has made modest headway. One lifesaving
program that takes advantage of Ada's real-time capabilities monitors the condition
of hospital patients connected to kidney dialysis machines. And although
critics of the language remain vocal, Ada's absolute uniformity makes it irresistible
to many managers of large programs.
Other major languages have gone through the tedious process of standardization,
under the auspices of the American National Standards Institute (ANSI), in
an attempt to rein in their dialects. But no other recent language has been so
vigorously standardized from the outset, before dialects could even begin to
proliferate. Thus, Ada in the 1980s has come close to guaranteeing true
portability: A program can be written for one computer with the near-certainty that it can
be recompiled and run correctly on other machines. This alone makes Ada an
important programming tool for big projects, bringing order to at least a portion of
the turbulent world of computer languages.