Tuesday, December 28, 2010


I've been thinking about languages a lot lately. Which is kind of a joke, given the title of my blog, but I actually mean "I've been thinking about them more than usual". Specifically, my thoughts have been dominated by the Blub hierarchy as proposed by Paul Graham.

I'm not sure what's on top.

PG claims "Lisp", I've seen many that shout "Ruby", I've met many that claim "Haskell". In fact, if you participate in programming discussion for any length of time, there's a pretty good chance that you'll meet someone for every language (other than C) claiming it's Omega. It's ridiculous, of course. All most of them are really claiming is "This is the most powerful language I know how to work with", which is not the same thing as "the most powerful language". It's easy to see that trying to compare in any supposedly objective sense would cause giant hatestorms and various infighting. So people are perhaps justified in making statements like

"You don't compare a hammer with a screwdriver, but you use the one that fits your task & way of thinking/education/needed level of abstraction the most. Also, since the one doing the comparison is biased by the fact that he knows at least one of the two, or at least prefers one of the two, it is hard to find a truly objective criteria for comparing them (exceptions exist)." -Rook, programmers.SE

when discussing language comparison. That was an answer from a question about whether language comparison was useful. As of this writing, it has been closed and re-opened twice, and the original asker has accepted (then unaccepted, then accepted again) a joke answer. This is perhaps more telling of the culture of programmers.SE than of the question, but it doesn't seem like an uncommon response. People duck comparisons precisely because languages are half tools and half religions, and no one wants a crusade declared. But, well, you need to compare.

"A language is a tool. That said, I've seen really, really crappy tools before. No one wants to work with a hammer whose head is liable to fly off and hit another carpenter in the stomach. Likewise, if you noticed your fellow worker's hammer was in that shape, you'd probably steer clear of them when they were using it. It's also important to really understand which tool it is. You can't use a screwdriver and hammer interchangeably (though some try desperately). Hell you can't even use all hammers interchangeably; you need a sledge for some things, a mallet for others and a tack for yet others. If you use the inappropriate tool, then at best, you'll do a poorer job, at worst you'll injure yourself or a co-worker." -me, programmers.SE

Graham goes further, stating that not only can you compare languages in terms of power, but that, as an obvious corollary, there is such a thing as an empirically best language. As a note, I agree with him, but "which religion is best?" is a question you just don't discuss in polite society, so I haven't pushed the idea on any forum I frequent. It makes sense though. No one would disagree that Assembly < Cobol < Python on the power scale (I'm defining "power" as a non-specific mix of expressiveness, terseness, maintainability, readability and flexibility). And even admitting that simple truth exposes you to the idea that there's a ladder, or tree, or at least concentric circles of languages with one (or a relatively small group) taking the prime position.


Graham puts Lisp there, but he's making the same claim that any Ruby ardent or avid Haskeller is expressing: "Of all the languages I know, this one is the most powerful". The thing is, I haven't heard many convincing arguments to the contrary. The best argument aimed at Lisp these days is that it's slow, and even then, slow in what sense? It can certainly do the job of server software, or even local desktop/console software on today's powerful systems. Remember, Lisp was called slow back when 1GHz was the sort of processing power you paid many thousands of dollars for. I have more than that right now in my $300 netbook. We're fast approaching an age where a phone you get for free with a subscription is more powerful. "Slow" just isn't a good enough strike against a language to discount it anymore. Other than that, people complain about the parentheses, which is an empty complaint at best, and typically a trolling attempt. The only good argument against Lisp as Omega comes from an unlikely source.

"I don't think it's necessarily Omega. The Haskellers and MLers say 'Well, from where we sit, Common Lisp looks like Blub. You just don't understand the brilliance of strong inferred typing'. And they may be right. Of course, Common Lispers look at Haskell and say 'Well, Haskell's really Blub, because you guys don't have macros'. It may be the case that there is no Omega, or that Common Lisp and Haskell are on different branches of the lattice, and someone's gonna find a way to unify them and a few other good ideas and make Omega." -Peter Seibel, Practical Common Lisp Talk at Google

It's an interesting fact that practitioners of either language can point to lack of features in the other. That has some pretty obvious corollaries as well.

  1. There may be such a thing as the most powerful language right now, but it may involve trade-offs (I don't know what it is, but one exists. I'll call it "Alpha" so as not to offend anyone).
  2. There is such a thing as the language that will be the best for the next 10 to 100 years (This one may or may not exist in some form today; it might be unified from several current languages as Seibel alludes. I'll use his name and call it "Omega").
  3. There is such a thing as the most powerful language that could exist on current machine architectures (This one almost certainly doesn't exist yet, and may never be embodied in an implementation. It's just the limit, in the calculus sense, of what we can hope to achieve with a language along the axes of expressiveness, terseness, maintainability, readability and flexibility. This one I'll call 0).

I'm not sure what Alpha is. I'm not sure anyone knows, because as I've said, people tend to bind that variable to whichever is the most powerful language they currently know. 0 is far away, and I won't even try talking about it today, because I don't have anywhere near enough information to make a decent guess at what it'll look like. So what does Omega look like? Well, Graham effectively says it's Arc (or what Arc will evolve into). Others variously substitute their own languages. There's a sizable community which thinks it's Haskell. Some ardents think it's Lisp. A few would like you to believe it's Java, despite the recent turbulence between Oracle and Google. And there are a couple of personalities in the industry who are vigorously pushing either Ruby or C#. Yegge echoes Seibel pretty closely

"[T]he Wizard will typically write in one of the super-succinct, "folding languages" they've developed on campus, usually a Lisp or Haskell derivative." -Steve Yegge, Wizard School

It's a line from one of his humorous, fictional pieces wherein he describes a Hogwarts-like school that churns out wonder-kid programmers, but it still seems like a vote for the Haskell/Common Lisp unification theory. It might happen. If it does, it'll be a race between the Haskellers and Lispers to out-evolve one another. In order to converge, Haskell needs to strap on prefix notation and macros, make IO easy (rather than merely possible), and blur the line between run-time, read-time and compile-time. Lisp needs declarative matching definitions, laziness, currying (possibly eliminating the separate function namespace), strong types and a few small syntactic constructs (function composition and list destructuring leap to mind first). Lisp has a longer list to run through, but keep in mind that because it has macros, almost all of them can theoretically be added by you as you need them, rather than by CL compiler writers as they decide it's worth it.
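To make that macro point concrete: laziness, one of the items on Lisp's list, can be sketched in portable Common Lisp in a few lines. This is a minimal, illustrative delay/force pair (the names mirror Scheme's; nothing here comes from an actual library):

```lisp
;; A lazy cell: DELAY wraps an expression in a memoizing thunk,
;; FORCE runs it (at most once) and returns the cached value.
(defmacro delay (expr)
  `(let ((cached nil) (evaluated nil))
     (lambda ()
       (unless evaluated
         (setf cached ,expr
               evaluated t))
       cached)))

(defun force (thunk)
  (funcall thunk))

;; The wrapped expression only runs when forced, and only once.
(defparameter *p* (delay (* 6 7)))
(force *p*) ; => 42
```

A full lazy-sequence layer would build cons cells out of such thunks, but the point stands: it's user-level code, not a change to the compiler.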

It's also worth noting that the last point in Haskell's list is a pretty tricky proposition. How do you blur read/compile/run time when one of your goals is to have a complete type system? Well, REPLs for Haskell exist, so I assume it's possible, but making it part of the language core doesn't seem to be a priority at the moment (and probably won't be for a while, due to the performance hits it imposes and the perception those hits still have among programmers at large). That's not the only hard bit either language would have, though. How do you implement full currying and optional/default/keyword/rest arguments? Haskell purports to solve the problem by defaulting to currying, and giving you the option of passing a hash-table (basically) as an argument to implement flexibility. Lisp gives you &rest, &body and &key plus very simple default argument declaration, but "solves" the currying issue by making currying explicit. Neither language's solution satisfies, because sometimes you want flexible arguments (and counter-arguing by saying "well, if you need them, you've factored your application wrong" is missing the point; expressiveness is a measure of power, remember, and having to think about the world in a particular way is a strike against you in that sense), and sometimes you want implicit currying (this is perhaps most obvious when writing in Haskell's point-free style, and if you've never done so, I doubt I could convince you).
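Here's what the Lisp side of that trade looks like in practice: keyword and default arguments come free with the lambda list, while partial application has to be asked for by name. The curry helper below is my own illustration (equivalents exist in utility libraries such as alexandria):

```lisp
;; Flexible arguments via the lambda list: keywords with defaults.
(defun greet (name &key (greeting "hello") (mark "!"))
  (concatenate 'string greeting ", " name mark))

;; Explicit currying: partial application is a helper, not a default.
(defun curry (f &rest args)
  (lambda (&rest more)
    (apply f (append args more))))

(greet "world")                  ; => "hello, world!"
(greet "world" :greeting "hi")   ; => "hi, world!"
(funcall (curry #'+ 5) 10)       ; => 15
```

In Haskell the last line would just be `(+ 5) 10`; in Lisp the flexibility of `greet`'s argument list is what makes that implicit form unavailable.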

As a common lisper, there are a bunch of things I'd like to steal from Haskell, if I could. The pattern-matching definitions are certainly useful in some places, list destructuring would help, and function composition seems useful (though this is, like defmacro, the sort of construct you have to understand first in order to find places where it would greatly simplify things). I'll check later, but I have a sneaking suspicion that someone has already lifted all of the above into a library somewhere on github or savannah. Even if not, list destructuring and function composition seem like they'd be easy enough to implement: the former as a call to destructuring-bind, the latter as a simple fold macro.
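Sketches of both, with the usual caveat that these are toys (this COMPOSE only handles single-argument functions named by symbols):

```lisp
;; List destructuring is a direct call to DESTRUCTURING-BIND:
(destructuring-bind (a (b c) &rest more) '(1 (2 3) 4 5)
  (list a b c more)) ; => (1 2 3 (4 5))

;; Function composition as a simple fold over the expansion:
(defmacro compose (&rest fns)
  (let ((arg (gensym "ARG")))
    `(lambda (,arg)
       ,(reduce (lambda (fn acc) (list fn acc))
                fns :from-end t :initial-value arg))))

;; (compose 1+ abs) expands to (lambda (#:arg) (1+ (abs #:arg)))
(funcall (compose 1+ abs) -5) ; => 6
```

The fold runs right-to-left, so the rightmost function is applied first, matching Haskell's `(.)`.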

From the other side, there are already two projects underway: Liskell is a competing compiler to GHC that accepts prefix notation and outputs the same machine code, and Lisk is a pre-processor for GHC that takes specific prefix notation forms and converts them programmatically back to Haskell source code before invoking the compiler. Lisk's creator has talked briefly about macros, but the project is early enough along that nothing much more specific is out there right now (I'm watching his github repo with interest though).

I haven't a clue how to place my own bet. I tried starting this paragraph both with "My bet's on Lisp..." and "My bet's on Haskell...", but each beginning got to a logical dead end within two sentences. It doesn't seem like one can completely absorb the other. But, if Haskell + Lisp makes Omega, we'll see what it looks like shortly (by which I mean ~10 years) because cross-pollination is already happening, and it's not a far jump from there to full-on unification. Or maybe things get bloodier as the preview to Barski's Land of Lisp implies, who knows.

Either way, we'll see soon enough.

EDIT: rocketnia posted a particularly thoughtful response to the above post at the Arc Forum. He points out that there may not be an Alpha, Omega and 0, but rather "[L]ocal optima that can't be unified into Omega". I could have sworn I addressed this point (and acknowledged it, but stated that I was more interested in the unification idea today), but my only mention of it is "...with one (or a relatively small group) taking the prime position." Apologies. He also explains a lot about how macros might coexist with a strong type system.


  1. Does Haskell actually need Macros though?

  2. I'm not sure.

    To be fair here, accurately answering your question would involve considering every possible programming situation, then figuring out whether (and if so, by what amount) it could be simplified with a macro, THEN figuring out whether the percentage and magnitude of "yes"es justifies breaking the type system. I doubt anyone (including those who would like to gloss over the lack of macros with "well, we don't REALLY need them") has that much time on their hands.

    A priori, it seems you get many of the advantages of defmacro by being fully functional, fully lazy and compiled, but I'm not sure you get all of them, and I don't think you can replicate reader macros the same way.

  3. Just a plug: I've been working on an arc-like interpreter (http://github.com/akkartik/wart) that provides pervasive destructuring and keyword args for any function argument. (Arc provides pervasive destructuring.) A couple of random insights from this process:

    a) When rest args are present, they take priority over optional args. Requiring keywords for optional args seems like a good idea when there's a macro body, for example.

    b) Generic functions dispatch on the type of the *last* arg. That seems like the best fit for lisp's existing functions.

  4. I think the problem is that the main contenders for Alpha and Omega have different mixes of expressiveness, terseness, maintainability, readability, flexibility, etc. So one language might be better in a situation where readability is more important, but another language might be better when you need maintainability more.

    I think the main problem with Lisp is actually the macros. Everyone has a different set of macros, making their Lisp code different than everyone else's. So it takes longer to get up to speed on the "variant" of Lisp+macros being used. To my mind, the macros also typically make the code too terse, with a lot of "concept" within a few lines.

    On the other hand, I see a progression of main-line languages (Java, Perl, Python, Ruby, Scala) towards Lisp, or at least functional programming constructs. I think this is good. But the reason folks don't go all the way to Lisp is that languages like Ruby are more readable and more approachable. Or more precisely, the ability to think and code in procedural and object-oriented constructs is often useful.

    I think the future of languages lies more on the Haskell side than Lisp, due to the type inference and more advanced concepts. But I also think that it's even harder to learn than Lisp. So I think that some combination of Lisp and Haskell, along with a more readable syntax and other paradigms, will likely end up as Omega. I'm also keeping my eye on Reia and Factor.

  5. @Kartik - Sounds interesting.

    @Craig - Pardon the delay; your comment was flagged as spam for some reason.

    The macros issue is dealt with throughout Lisp literature. Typically, the answer to your comment is that a macro is complex, but macroless code that achieves the same thing would probably be more complex. It's accepted that the trade-off is less readability for more maintainability and terseness (incidentally, this is why I refused to put specific weights on the criteria of "power"; as a Lisp user, I probably value readability lower than I should, and maintainability/flexibility higher than is good for me). I'm not sure what else to say other than that I think it's a worthy trade.

    I'm having trouble parsing your third paragraph; both procedural and object-oriented programming are possible in Lisp. I'm using "readability" to mean more-or-less "How likely is an expert [language] programmer to understand a typical [language] codebase at first reading?", so that bit makes sense, but can you please clarify what you mean by "approachable"? I ask because it sounds suspiciously like "how close is it to C syntax?", which is a fine criterion for a popular language, but I was discussing power.

    The future is pretty hazy; I don't know that it'll fall onto the side of Haskell, and even if it does, it likely won't be because of the type inference. I haven't seen evidence that a mandatory strong type system (inferred or explicit) is axiomatically a good thing. There are plenty of assertions to this effect, but I've yet to read a follow-up defending those assertions, or even a breakdown of the trade-offs that doesn't devolve into religious warfare.

  6. Clojure is a nice example of a Lisp dialect that has clearly been influenced by Haskell. Its lazy sequences are especially noteworthy in this respect. It also eliminates the separate namespace for functions, makes function composition a little easier than in CL, and offers pervasive destructuring and pattern matching. It's worth taking a look at as an example of what such a combination could look like even if you don't end up using it.

    I want to be clear that I'm not claiming that Clojure is Omega, or even Alpha. I really wish it had CL's method combination (it does have multimethods) and a way to locally or globally cause a function that was not originally generic to be treated that way.

  7. @Zak - True. The thought had crossed my mind (based on the things some of my associates have to say about it) that Clojure would make a decent example of Lisp being influenced by other languages (though from what I've heard, it seems to be closer to a Scheme than a CL), but I don't have enough experience to discuss it semi-meaningfully. It's on my to-do list along with another three languages I've been meaning to jump into.

  8. The world runs on C/C++. The OS you are using right now, the control software for your car and the compilers/interpreters that make it possible for you to use less practical languages are all programmed in C/C++. Not LISP. Not Haskell. I always find it amusing when people argue that it is impossible to write good software in C/C++ when in fact the world runs on software written in those exact languages.

    And I also find it ironic that, despite the often repeated claims that (1) LISP/Haskell/functional programmers are somehow cleverer than the rest and (2) the languages are much more productive, I look around and LISP/Haskell has had virtually zero impact on the world outside of the language geek community. Another way to put it is that there is absolute, unquestionable empirical proof that C/C++ has what it takes to run the software the world relies on. Because it does. There is no such empirical proof of LISP or Haskell. So everything claimed about LISP or Haskell or any other language is just that: a claim or wishful thinking not backed up by empirical evidence. Show me a real practical OS or JVM written in LISP or Haskell and you will get my attention.

    Look...I love programming languages and have programmed in many functional languages including LISP, Haskell and Clojure. And by the way I write functional language compilers for a living. But I still prefer C/C++. Why? Because it is the foundation of everything else. The solid rock holding up the ivory towers of language purists.

  9. @mb - I'm sorry, I must have missed the part of my article where I claim that it's impossible to write good software in C/C++, and the one that claims that C/C++ are marginal languages, and the one that claims Lisp/Haskell hackers are smarter than C/C++ hackers.

    Can you please point them out to me?