Bill de hÓra's post on the languages he's used in the past year contains the provocative thought "...some people are looking at things like HTTP and Ant and CSS and wondering whether they are really programming languages....[T]hey are either replacing or reducing the raw coding I used to do....[A]t least, if you start thinking about HTTP as a language, the job of using it becomes far easier...."
To me, the definition of a programming language has always been simple: Is it Turing complete? More practically, I skip any type of formal analysis and look for control structures and recursion. By my criteria, not only is HTTP not a language, SQL (as it's generally used) isn't either. This definition has two advantages: it's pretty bulletproof from a theoretical standpoint and, pragmatically, something feels "right" about assigning looping and recursion as the uniquely "programming-language-y" thing. Lots of things are complex or reduce complexity, and lots of things have state that evolves over time, but solving problems by looping is what makes a solution feel like programming.
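As a concrete illustration of that criterion (my own sketch, not from the post): the Ackermann function is a classic computation that demands genuine recursion, exactly the kind of control structure a plain declarative SQL SELECT, as commonly written without recursive extensions, cannot express.

```python
# A function that passes the "control structures and recursion" test:
# Ackermann grows so fast that no fixed, loop-free expression captures it.
def ackermann(m, n):
    """Classic deeply recursive, non-primitive-recursive function."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print(ackermann(2, 3))  # prints 9
```

A query language that only filters and joins fixed relations has no way to say "keep calling yourself until this bottoms out," which is why it fails the test even though it is enormously useful.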
On the other hand, this view seems a little out-of-touch, if not anachronistic. Much of the "action" traditionally associated with language development (conceptual mappings, problem-solving models, etc.) has shifted to framework development. Part of this is the evolution from "libraries" (generally, a group of related functions at the same abstraction level, such as a library for trigonometry or statistical analysis) to "frameworks" (a set of components at different abstraction levels covering an entire problem domain). Objects unified the vocabulary for discussions of behavior, data, and structure. Once that was established, patterns gave us a channel for professional discourse that previously might have required a shared language background (indeed, they're called "pattern languages").
Further, there's a problem with actually writing a programming language. As Bjarne Stroustrup puts it, "On the order of 200 new languages are developed each year and...about 200 languages become unsupported each year." So on the one hand, you have the great value of expressing domain reasoning directly in code; on the other, you have a task that requires a large effort and is almost certainly doomed. Perhaps one is smarter to simply embrace studying frameworks.
But ultimately I cannot believe that that's the right answer: frameworks are great, but I don't think they shift your reasoning in the way that a language does. I was fortunate enough to spend some time in my early professional years shifting back and forth between C and Prolog: two radically different languages. It was obvious to me that different parts of my reasoning were engaged by the different languages; I could be exhausted in C and fresh as a daisy in Prolog and vice versa. (In retrospect, it's probably fair to say that in C I was dealing with "housekeeping" and in Prolog I was solving "higher-level" problems, but in those days, low-level coding for data acquisition, transformation, and speed-ups was not optional.)
Stroustrup advocates Semantically Enhanced Libraries as the route forward. I note an echo of the modular compiler meme that Harry Pierson has mentioned (essentially: start from a complete language and extend/restrict it rather than start with a grammar and fire up lex/yacc). Those familiar with Lisp will naturally point out that extending/restricting the base language is precisely what Lisp macros do; the disadvantage being that the language being extended/restricted is Lisp (which I don't mean in a snide way, but simply in the way of acknowledging that the market has declined to embrace Lisp over the course of forty-five years).
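To make the "start from a complete language and extend it" idea concrete without writing Lisp, here is a rough analogy in Python (my own illustrative sketch; the names `LogAdds` and `logged_add` are invented). Like a macro, it transforms the host language's own syntax tree instead of firing up lex/yacc on a new grammar:

```python
# Macro-style language extension, sketched with Python's ast module:
# we take a program in the complete host language, rewrite its syntax
# tree (here, every `a + b` becomes a call to a logging helper), and
# run the transformed program.
import ast

SOURCE = "result = 2 + 3"

class LogAdds(ast.NodeTransformer):
    """Rewrite `a + b` into `logged_add(a, b)`."""
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            return ast.Call(
                func=ast.Name(id="logged_add", ctx=ast.Load()),
                args=[node.left, node.right],
                keywords=[],
            )
        return node

def logged_add(a, b):
    print(f"adding {a} and {b}")
    return a + b

tree = ast.fix_missing_locations(LogAdds().visit(ast.parse(SOURCE)))
namespace = {"logged_add": logged_add}
exec(compile(tree, "<macro>", "exec"), namespace)
print(namespace["result"])  # the rewritten program still computes 5
```

The plumbing is exactly what Lisp macros hide: because the whole language is already there, "extending" it is a tree-to-tree transformation rather than a compiler project.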
Given my belief that the shift from single- to multi- to many-core is going to be the issue in programming within a half-decade, I naturally think we need tools for exploring new semantics. To me, it still seems that the best tools for that are new languages, not new constructions built on sinking foundations.