Why Ada isn't Popular (Mathew Heaney)

Rick Thorne writes:

> In truth, I like Ada too.  I'm a former Ada man myself, as I believe I
> mentioned.  There are great things about the language, and since we're all
> aware of them there's no point in going into them.  I also prefer Beta to
> VHS.

Mathew Heaney responds:

This is actually a specious comparison.  Though Beta had superior picture
quality, it had only half (or a third?) of the playing time of VHS.  The
market decided that playing time was more important, and so chose VHS.

(Credit goes to Robert Dewar for pointing this out in a previous post.)

Arguments about the Beta format being "better" usually omit the
contribution of playing time to the decision to choose VHS.

So I don't buy the argument that Ada has "failed" even though it is
"better." 

I would say that Ada is not as popular for a few reasons, among them:

1) Tony Hoare severely criticized Ada in his Turing Award lecture,
   saying (literally) that the future of mankind was at stake if we were
   to use Ada, and that Ada was "doomed to succeed."  Who's gonna argue
   with Hoare?  If he said it, it must be true, right?

   In retrospect, his criticisms seem a little, well, dated.  One of the
   things he said would cause life on Earth to end was using exceptions!
   Although exceptions can be misused, that's true of all language
   features, and nowadays, everyone seems to think exceptions are a
   Pretty Good Idea.

   People sometimes "forget" to mention that Hoare's lecture was
   directed at an early version of the language.  Ada wasn't
   standardized until 1983, and Hoare's speech took place in 1980.  The
   language was in fact made simpler between its 1980 draft and its 1983
   final version.

2) The world wasn't ready for another large language.  Parnas, Hoare,
   and Dijkstra were all critical of the language, noting especially its
   size, and when guys like that talk, people listen.  I suspect
   (perhaps I am re-writing history) that Hoare's speech influenced the
   ACM canvassers, who *rejected* the language during the ballot, citing
   its size as a concern.

   People (like P. J. Plauger) sometimes lump together "large"
   languages, putting Ada in the same bucket as PL/I and Algol 68.  I
   don't think this is a fair comparison, because Ada is a consistent
   language, reflecting the vision of its architect (even if you don't
   happen to like that vision).  In his paper "If C++ is the answer,
   what is the question?", Plauger gave a lame criticism of Ada,
   complaining that programmers could nest packages to any level they
   wanted.  Huh?

   I've seen Ada put in the imperative group with C and Fortran, with
   the fact that Ada doesn't use distinguished-receiver syntax (writing
   Op (Obj, ...) rather than Obj.Op (...)) offered as proof of its
   emphasis on procedural programming.  Even Brown University professor
   Peter Wegner seems to ignore the fact that Ada has abstract data
   types, labeling Ada merely "object-based" because it "uses packages
   as the unit of decomposition."  Huh?

3) Most programmers think that getting run-time errors, and then using a
   debugger to find and fix those errors, is the normal way to program.
   They aren't aware that many of those errors can be detected by the
   compiler.  And those who are aware don't necessarily like that,
   because repairing bugs is challenging, and, well, sorta fun.

   You are not giving a programmer good news when you tell him that
   he'll get fewer bugs, and that he'll have to do less debugging.
   Basically, we still live in the dark ages of programming, not unlike
   the time engineers were learning about boiler technology by figuring
   out why a boiler exploded, scalding people to death (remember the
   Therac-25?).  People will probably have to die in order for "software
   engineering" to be a true engineering profession, instead of the
   buzzword that it is today.  Sad but true.
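
   To make that concrete, here's a minimal sketch (the types and names
   are mine, purely for illustration) of errors an Ada compiler rejects
   before you ever reach a debugger -- the first assignment below
   simply does not compile:

      procedure Demo is
         type Day_Of_Month is range 1 .. 31;
         type Month_Number is range 1 .. 12;
         D : Day_Of_Month := 15;
         M : Month_Number := 6;
      begin
         M := D;                  --  rejected at compile time:
                                  --  Day_Of_Month and Month_Number
                                  --  are distinct types
         M := Month_Number (D);   --  explicit conversion compiles, but
                                  --  raises Constraint_Error at run
                                  --  time because 15 > 12
      end Demo;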

4) Early compilers were way, way too expensive, and compilers were (and
   still are today) very difficult to implement.  As a language
   designer, Jean Ichbiah didn't concentrate enough on language
   implementation issues.  (By contrast, Tucker Taft is a
   compiler-writer's language designer.  Suffice it to say, things
   would be very different had Red, the version proffered by Ben
   Brosgol, then at Intermetrics, been chosen.)

   The obvious repercussion of this is that there weren't any cheap
   compilers (say, in the US$50 - US$100 range) that you could run on
   your PC at home, so no one could experiment with the language.  Ada
   essentially missed the boat in the PC revolution, and so was never
   able to develop the grass-roots support that Pascal and C had
   (because those languages were relatively easy to implement, and were
   therefore much more readily available).

   Again, we see that there are many issues that factor into a decision
   to purchase a compiler (just as there are when buying tapes for your
   VCR).
   You can tell the client about how he's going to have fewer bugs
   (superior picture quality), but forget to mention that the compiler
   will cost US$3000 (has less playing time).

   The market chose availability and cost of compilers over quality of
   language.  This might not be a very smart decision, because the cost
   of human labor to find and fix bugs is way, way, way more expensive
   than any compiler, but since we don't use metrics in this industry,
   decision-makers don't know that.

5) There is an entire industry devoted to selling tools to repair the
   defects in the C language (tools for finding memory leaks, type
   errors, etc).  Guys like Les Hatton have a vested interest in keeping
   things exactly as they are, because their livelihood depends on
   people using error-prone languages.  Those people just aren't going
   to stand on the sidelines while you tell programmers that if they use
   Ada, they can throw away all their other tools too.

6) Ada didn't have type extension and dynamic binding, and so missed the
   boat on the object technology revolution.  You just weren't cool in
   the 80's if you didn't use an object-oriented language.

   Why wasn't Ada83 object oriented?  It depends on who you ask.
   According to Bertrand Meyer (perhaps he talked to Jean?), Jean
   --who had been writing Simula compilers, and was thus familiar with
   the paradigm-- thought that dynamic binding would have been too
   radical for the conservative DoD, who after all were the ones
   commissioning the language, and so he figured they wouldn't go for
   it.  According to others, Jean in fact didn't want type extension and
   dynamic binding, because he didn't think they were necessary.

   (Although, ironically, it was Jean who did push for inheritance of
   operations.  In retrospect, I think this turned out to be a bad
   language design decision, because very, very few Ada programmers
   --even longtime ones-- really understand how the inheritance model of
   Ada83 works, and therefore don't use it.  One "programmer" who did
   understand this model was Tucker Taft, who made this the cornerstone
   of the mechanism to add type extension to the language.)
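
   For those who never saw it, here's a minimal sketch of that Ada83
   model (my own example): operations declared with a type in its
   package are inherited, implicitly, when you derive from the type.

      package Counters is
         type Counter is range 0 .. 10_000;
         procedure Reset (C : in out Counter);
         procedure Bump  (C : in out Counter);
      end Counters;

      package body Counters is
         procedure Reset (C : in out Counter) is
         begin
            C := 0;
         end Reset;

         procedure Bump (C : in out Counter) is
         begin
            C := C + 1;
         end Bump;
      end Counters;

      with Counters;
      package Gauges is
         --  Gauge implicitly inherits Reset and Bump from Counter;
         --  no new bodies are written.  Ada95's tagged types build
         --  type extension on exactly this mechanism.
         type Gauge is new Counters.Counter;
      end Gauges;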

   You have to understand the climate of the times.  Ada was largely a
   reaction to languages like Fortran.  They [the commissioners -- the
   DoD] wanted once and for all to determine everything at compile
   time, so that errors like a subprogram call whose parameters don't
   agree with the parameters of its declaration could be eliminated
   outright.  Certainly not an unreasonable desire!
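
   Here's a minimal sketch of exactly that kind of mismatch (the
   procedure and its names are mine, just for illustration).  In old
   Fortran the bad call could compile, link, and misbehave at run time;
   in Ada it never gets past the compiler:

      procedure Flight is
         procedure Set_Altitude (Feet : in Float; Valid : in Boolean) is
         begin
            null;   --  stub body, enough for the sketch
         end Set_Altitude;
      begin
         Set_Altitude (10_000.0, True);   --  OK: agrees with the
                                          --  declaration
         Set_Altitude (True, 10_000.0);   --  rejected at compile time:
                                          --  the arguments don't match
                                          --  the declared profile
      end Flight;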

   At that time, Smalltalk was the popular object-oriented language, and
   method lookup really was slow, but only because the language was
   interpreted.  Sadly, many people then and even today overlook this,
   and conclude that "object-oriented programming makes your program run
   slow," which squelched the idea for inclusion in a deterministic,
   real-time language. (Example: at SIGAda *this* year (1998) someone
   got up to the microphone to ask the presenter a question, explaining
   that he did real-time systems, and he wanted to know if he should be
   nervous about object-oriented programming!  Some rumors just die
   hard.)

   Of course we know now that dynamic binding is nearly as efficient as
   static binding.  The Smalltalk legacy lives on, however, and reuse
   via inheritance came to be seen as the Measure Of All Good Things.  

   But there is a dark side to this, called the "fragile base class"
   problem.  Deep inheritance hierarchies create a lot of coupling
   between abstractions, setting up a tension between reuse and
   information hiding.  An abstraction is basically exposing its
   representation by announcing that it inherits from another
   abstraction, and we should all know the kind of maintenance headaches
   you have when you don't practice information hiding.

   Thankfully, the tide seems to be turning, and people are beginning to
   realize that type extension is not so great after all, and that
   "mere" aggregation is often preferable.  Deep inheritance hierarchies
   as a re-use mechanism may be fine for by-reference languages like
   Smalltalk and Eiffel, but leaf-classes in a by-value language like
   Ada95 or C++ become VERY SENSITIVE to the representation of the
   ancestor classes, which means massive re-compilations are often
   required any time you touch a base class.  (This is the sort of
   problem we had for other reasons in Ada83, which motivated the
   inclusion of child packages in Ada95.)

   If you are an Ada95 or C++ programmer who programs "the pure
   object-oriented way" by creating deep inheritance hierarchies, then
   YOU ARE MAKING A HUGE MISTAKE.  You're going to spend all your time
   just compiling.
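
   Here's a minimal sketch of the aggregation alternative in Ada95 (the
   packages and names are mine, just for illustration): the outer
   abstraction *has* the inner one as a hidden component, instead of
   *being* an extension of it:

      package Queues is
         type Queue is limited private;
         procedure Enqueue (Q : in out Queue; X : in Integer);
      private
         type Int_Array is array (1 .. 100) of Integer;
         type Queue is limited record
            Count : Natural := 0;
            Data  : Int_Array;
         end record;
      end Queues;

      package body Queues is
         procedure Enqueue (Q : in out Queue; X : in Integer) is
         begin
            Q.Count := Q.Count + 1;
            Q.Data (Q.Count) := X;
         end Enqueue;
      end Queues;

      with Queues;
      package Schedulers is
         type Scheduler is limited private;
         procedure Submit (S : in out Scheduler; Job : in Integer);
      private
         type Scheduler is limited record
            Pending : Queues.Queue;   --  "has a" Queue, not "is a" Queue
         end record;
      end Schedulers;

      package body Schedulers is
         procedure Submit (S : in out Scheduler; Job : in Integer) is
         begin
            Queues.Enqueue (S.Pending, Job);
         end Submit;
      end Schedulers;

   No client of Schedulers can even name Queue, much less depend on its
   representation, so reworking Queues never breaks client source the
   way reworking a base class can break every leaf class.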

   Funny story: A new programmer just started using Ada, and posted a
   question to this newsgroup.  He had been reading in Mike Feldman's
   introductory book about the abstract data types in Ada, and remarked
   that ADTs reminded him of object-oriented programming.  He wanted to
   know what the difference was between the two.

   Good question.

   Guys like Pete Coad who write things like "Ada is not
   object-oriented. PERIOD." miss the whole point, which is that the
   important thing is language support for user-defined abstractions.
   Ada83 had that, and then some.  If you use a "pure" language like
   Smalltalk or Eiffel, your entire world is inheritance hierarchies,
   and so you think any language without deep inheritance hierarchies
   must be lacking.  
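
   For anyone who never saw one, here's a minimal sketch of an
   Ada83-style abstract data type (my own example, not Feldman's): a
   private type with hidden state and exported operations, which is
   exactly why that new programmer couldn't tell it apart from
   object-oriented programming:

      package Bank is
         type Account is private;
         procedure Deposit (A : in out Account; Amount : in Natural);
         function  Balance (A : Account) return Natural;
      private
         type Account is record
            Funds : Natural := 0;   --  clients can never touch this
         end record;
      end Bank;

      package body Bank is
         procedure Deposit (A : in out Account; Amount : in Natural) is
         begin
            A.Funds := A.Funds + Amount;
         end Deposit;

         function Balance (A : Account) return Natural is
         begin
            return A.Funds;
         end Balance;
      end Bank;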

   (Aside: I hate the term "pure object-oriented," because it makes a
   lot of naive programmers think that "pure" must be "better."  This is
   the same reason I don't like how Wegner created a hierarchy from
   "object-based" to "class-based" to "object-oriented," because
   programmers are going to think "object-oriented" is better than
   "object-based."  These shouldn't be in a hierarchy, because they are
   just alternate implementation techniques; one is better than another
   only to the extent that it helps you solve a problem.  Just look at
   the singleton pattern.  In a "pure" language, you have to jump
   through hoops to create the singleton, which is an object-based
   abstraction.)
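
   (To see what I mean, here's a minimal sketch, with names of my own:
   in Ada the singleton is naturally an object-based abstraction -- a
   package with state, no hoops required:)

      package Logger is
         procedure Log (Msg : in String);
         function  Messages_Logged return Natural;
      end Logger;

      package body Logger is
         Count : Natural := 0;   --  exactly one copy of this state
                                 --  exists in the whole program

         procedure Log (Msg : in String) is
         begin
            Count := Count + 1;
            --  a real body would also record Msg somewhere
         end Log;

         function Messages_Logged return Natural is
         begin
            return Count;
         end Messages_Logged;
      end Logger;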

   What also happened during the 80's is that the term "object-oriented"
   changed meaning.  It used to refer to a data-centric style of
   programming that emphasized abstraction, and what you do to an
   abstraction, in contrast to a procedural style, which emphasized
   strictly what you do.  Given that definition, people were happy to
   call Ada an object-oriented language.

   For whatever reason (probably due to Wegner), the term
   object-oriented came to mean language support for type extension and
   dynamic binding, and if your language didn't have that, then you
   couldn't call it object-oriented.  And so Smalltalk programmers like
   Pete Coad could criticize Ada for not being truly
   "object-oriented."

   But this is like saying you can only call them "jeans" if they have a
   zipper fly.  If your blue cotton pants made by Levi have only a
   button fly, and not a zipper, then they're not really jeans.

   I hope you see how ridiculous this nomenclature issue is.
   Object-oriented is a paradigm, a way of thinking.  Ada83 had direct
   language support for modeling in terms of abstractions, which is the
   sine qua non of object-oriented programming.

   I like having type extension and dynamic binding in Ada95, but
   you'd be wrong to think that this changes my style of programming
   much.  The most important additions to the language were better
   support for real-time programming, and hierarchical name-spaces (child
   packages).  The tagged type stuff is just frosting on the cake.
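
   Still, here's a minimal sketch of that frosting (my own example), to
   show what the Ada95 additions look like -- type extension, plus
   dynamic binding through a class-wide type:

      package Shapes is
         type Shape is abstract tagged null record;
         function Area (S : Shape) return Float is abstract;
      end Shapes;

      with Shapes;
      package Circles is
         type Circle is new Shapes.Shape with record
            Radius : Float;
         end record;
         function Area (C : Circle) return Float;   --  overrides
      end Circles;

      package body Circles is
         function Area (C : Circle) return Float is
         begin
            return 3.14159 * C.Radius * C.Radius;
         end Area;
      end Circles;

      with Ada.Text_IO, Shapes;
      procedure Report (S : in Shapes.Shape'Class) is
      begin
         --  dispatches at run time to the Area of S's actual type
         Ada.Text_IO.Put_Line (Float'Image (Shapes.Area (S)));
      end Report;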

7) The mandate.

   I don't think the government was trying to "bully us" with the mandate,
   they were just trying to manage the process.  But by mandating that Ada
   be used for all systems --even those for which it wasn't necessarily
   suitable-- they diluted the value of the language in those systems where
   it really is an advantage to use Ada.

   If you didn't like the policy, that's fine, but don't throw the baby out
   with the bath-water.

   At the time, the US DoD was the number one consumer of software, and
   they had huge software costs that were only growing.  They had to get
   their costs down (hundreds of languages were being used), and the
   success rate up (many systems weren't even being delivered), and one way
   they chose to do that was to commission the design of a programming
   language that the DoD could use as its standard language for building
   real-time, embedded systems.

   I think their intentions were good, but the management of that process
   wasn't so good, and many programmers share the sentiment that the gov't
   was trying to ram Ada down their throats.

   I don't blame you or any other programmer for being offended by this
   policy, but don't blame Ada the language.  I myself used to scream "I
   can do anything I need to in Fortran.  Why do I need Ada?"  But as I
   started to use Ada over the next few weeks and months, I gradually began
   to understand what the language was buying me.

   Judge the language based on its own merits, separately from any opinion
   you may have about how the DoD commissions software systems.  If the
   gov't does something stupid, why blame Ada?

   As someone pointed out a few years ago, Ada is a large woman, but once
   you get your arms around her, you learn to really love her.

> C++/Java and others have considerable strengths of their own that make
> Ada unnecessary.  YES - unnecessary.  C++ and Java are perfect forms
> of protest.  They were developed by a handful of people (not a
> government bureaucracy like Ada was) AND they're incredible languages,
> whether or not YOU agree.

This is a common misconception.  The language was commissioned (paid
for) by the DoD, but it certainly wasn't designed by a "government
bureaucracy."  Ada was designed by Jean Ichbiah, then of Honeywell/Bull,
with input from a group of reviewers comprising members of industry and
academia.

But be careful not to construe this as "design by committee."  As John
Goodenough pointed out in HOPL-II, Jean vetoed committee decisions that
went 12-to-1 against him.

(Another story: I met Jean Sammet at this year's SIGAda conference, and
I asked her about her experience during the Ada design process.  She
told me that she disagreed with many of Ichbiah's decisions, and still
thinks he was wrong.)

So the moral of the story is, don't blame the gov't for putative errors
in the language.  If you want someone to blame, then blame Jean Ichbiah.

I'm reading a great book now called Why People Believe Weird Things, by
Michael Shermer, in which the author explains what rational thinking is,
and how skepticism is a process.  Basically, people believe something
because they want to, not because of any scientific arguments you make.

There are guys out there who dislike Ada, but they do so because they
want to, not because of any rational analysis of its merits or flaws.
Sometimes even their arguments are factually incorrect, like saying that
"Ada was designed by committee," ignoring the fact that Jean vetoed
language design arguments that were 12-to-1 against him.  It's not
unlike creationists who repeat the "fact" that evolution violates the
2nd law of thermodynamics.  (No, it does not, as any freshman physics
book will tell you.)

I've explained the reasons why I think Ada is not as popular as C++, and
I'd like to hope that they will convince Ada's detractors that Ada isn't
so bad after all.  But as Robert Dewar pointed out, a person who has
made an irrational decision probably isn't going to be swayed by
rational arguments!

Read Shermer, and you'll understand.

