The myth of better programming languages

I like Andy Hunt, and I like Ruby, but his post perpetuates a myth that I think is harmful. He says of Ruby, "First, more than any other language I've used, it stays out of your way. You don't have to spend any effort 'satisfying the compiler'.... I can type in an absurd amount of code in Ruby and have it work the first time. Not 2-3 passes .... It just works."

What I object to is a small thing: his use of the word "you" in the first sentence ("out of your way...You don't..."). I have no problem with the second part of the quote ("I can type..."). The myth is that programmers share a psychology and that, therefore, the tool that "fits" my mind will "fit" your mind if only you give it a try. Now, that may prove to be the case for particular instances of "I" and "you," and I applaud reasoned advocacy of whatever-the-heck you love in life, but I've come to believe that there is not a shared psychology of computer programming.

Please don't reduce my point to Microsoft's insulting "Mort, Elvis, Einstein" scheme, which combines the same sweeping generality I'm condemning with a heap of condescension (do you think anyone working in Redmond classifies themselves as "Mort"?). I'm talking about programmers who are perfectly capable of tackling the same problems with the same productivity.

What I'm suggesting is that there is not a "best" programming language, nor perhaps are there even "better" programming languages once you get beyond a certain level of functionality. Certain programming languages are better at certain tasks, without a doubt (if you want to scrape a Web page, use a language in the Perl family; if you want to keep your CPU toasty-warm, use C++ and assembler).

Are there languages that "stay out of [my] way," and in which my code "works the first time...It just works"? Absolutely. In my career, I've felt that way about Basic, Prolog, PAL, C++, Java, and, lately, C# (although I hit an anonymous delegate thing the other day that I still can't parse). I used to write lengthy Prolog programs on the bus and type them in when I got to work. For a brief while, I thought that would be true for everyone, if only they gave the language a chance. In retrospect, I was fortunate to fall for a language that so few loved.

Lisp is the king of languages touted as "if only you gave it a chance." But what Lisp advocates fail to acknowledge is that Lisp was given a chance by virtually everyone exposed to computer science in the 70s and 80s. Lisp and Basic are the most abandoned languages in the field. Lisp was the second or third language I learned (after Basic, and pretty much simultaneously with Fortran) and I worked with it professionally in the late 80s and early 90s. Back then, it didn't fit my psychology. Now, though, in the Jolt Awards, my vote for best book of the year will go to Peter Seibel's "Practical Common Lisp" and I've caught myself thinking about burying a Lisp interpreter in an upcoming project.

Ruby is the belle of the ball currently, largely because of Rails. It's a recurring theme in programming language popularity that the differences between languages, which are very real, are masked by libraries and toolsets. Smalltalk, for instance, may or may not "fit" your mind, but the Smalltalk browser and workspace were, without a doubt, years beyond other IDEs. Ruby may or may not "fit" your mind, but Rails is without a doubt the most influential framework in several years. Java brought unit-testing into the mainstream, VB brought GUI builders, etc.

The shame is that when advocates conflate the benefits of their libraries and tools with the psychological aspects of their language, they focus attention in the wrong place. What you get is "Hey, let's port Rails to .NET," or whatever, when what is needed is more discussion of what makes Ruby "fit" certain approaches. Especially frustrating is that we don't even have a decent vocabulary for discussing language differences. People talk about "dynamic languages," and that, to many, means "dynamic typing," which, to many, means "implicit typing." And implicit vs. explicit typing so dominates the discussion of programming languages that IT MAKES ME WANT TO F***ING SHOOT MYSELF!!!!! THERE ARE OTHER ISSUES, PEOPLE!!!!!!

Okay, sorry.
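For what it's worth, here's the distinction I was shouting about, sketched in TypeScript only because it shows both axes compactly (it's not one of the languages under discussion): implicit typing means you don't write the annotations; dynamic typing means the checking doesn't happen until run time. They're independent questions.

```typescript
// Implicit typing: no annotation written, but the type is still static.
let count = 3;              // inferred as number
// count = "three";         // rejected at compile time -- the checker still knows the type

// Dynamic typing, approximated here with `any`: the type is purely a run-time matter.
let anything: any = 3;
anything = "three";         // fine; nothing is checked until (or unless) it blows up
anything = { works: true };
```

Ruby and Lisp happen to be both implicit and dynamic, which is exactly why the two ideas get run together.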

But, for instance, Prolog relies on backtracking over sequences of predicates: true, true, false ... oh, back up, try something else ... true, true, true, and so on. Is that something that "fits" your mind or not? I haven't tried any languages that implement predicate dispatch, but I'm thoroughly convinced such languages would appeal to me.
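If that rhythm is hard to picture, here's its shape in a deliberately dumbed-down TypeScript sketch: plain generate-and-test, no unification or real choice points, with made-up predicates purely for illustration.

```typescript
// Try each candidate against a sequence of predicates. The first `false`
// abandons the candidate ("back up") and moves on to the next one.
type Predicate = (n: number) => boolean;

function solve(candidates: number[], predicates: Predicate[]): number | undefined {
  for (const candidate of candidates) {
    // true, true, ..., false? Back up and try something else.
    if (predicates.every(p => p(candidate))) {
      return candidate; // true, true, true: done.
    }
  }
  return undefined; // no candidate satisfied every predicate
}

// Made-up predicates, just to show the rhythm.
console.log(solve([1, 2, 3, 4, 5], [n => n % 2 === 0, n => n > 3])); // prints 4
```

Prolog, of course, does this pervasively and for you; the point is just that some minds find that way of stating a problem natural and some don't.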

Another example: as I've been reinvestigating Lisp, I've found myself actively liking Lots of Irritating Superfluous Parentheses. Now I'm like, "Yeah, why should I type a ')' to end a call, a ';' to end a statement, and a '}' to end a block? They're not 'superfluous'; there are rarely more parentheses than there would be operators and punctuators in another language." But is seeing different delimiters important to you for quickly understanding structure? (For me, I think a major reason for my change in attitude is that after 17 years of OO, flow control in my programs is now governed much more by the structure of the object graph than by the values of local variables.)

[...Wow. I didn't intend to rant like this... Okay, finishing it off abruptly...]

The important thing is to realize that different languages engage your mind in different ways. There are languages that, you will find, allow you to be profoundly more productive at solving certain problems. Search for those languages, and do not conflate tool and environmental benefits (equally interesting, equally worth pursuing) with language benefits. Also realize that your own mental approach to programming is always evolving and may change the way that a certain language strikes you.

Okay. Off to walk the dog.