My rant in reaction to Paul Caplan's response to Angelika Langer's OP:
> I would like to hear more about
> what is wrong with the languages of today
Of all the activities imaginable, computer programming is the one in which computers should have the greatest productivity impact. And yet compared to activities that have enjoyed huge productivity gains via computers in the past 20 years (say, image manipulation), the productivity gains in computer programming are trivial. Give a 500MHz P3 to one professional graphics designer and a 3GHz P4 to another and compare their productivity: you will see a difference, because this is a task/profession that has managed to leverage the computer itself. Give the same disparate hardware to two comparably talented programmers and what productivity difference will you see? None, or so little as to be immeasurable.
Similarly, give one designer the current feature set of a preferred professional tool (say, Photoshop) and give another the feature set that same tool had 5 years ago, and you'll see a difference. In programming? Doubtful (with the notable exception of a refactoring IDE such as IDEA).
More concretely, "computer programs" are almost invariably viewed as linear text streams that are converted into machine instructions in some O(lines of source code) manner. 99+% of the world's code is written imperatively. And object-orientation, though almost universally accepted as the preferred structuring mechanism for software systems, has turned out not to be universally superior for learning, comprehension, or reuse.
Persistence, business rules, interfaces: all are areas in which the way we specify, create, and maintain systems invariably trades productivity off against maintainability. If you want to do things fast, you might have some chance to use a tool that presents the problem as something other than a text stream (e.g., a visual builder). But those tools invariably create overly coupled representations of the solution.
Pattern matching is absolutely fundamental to human problem-solving, but where's the computer support for pattern matching in the task of software development? That is, why can't a programming language leverage the fact that the vast majority of computer programs are built from examples?
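To make the complaint concrete, here's a toy sketch in Haskell (my own example, not anything from the original thread) of the closest thing today's languages offer: a function written as a list of input examples, which the compiler stitches into executable code. Imagine this scaled up to whole systems and you have something like the missing tool.

```haskell
-- Toy illustration (hypothetical example): the function is literally a
-- list of input/output examples, and the compiler turns the cases into
-- executable code.
digitValue :: Char -> Int
digitValue 'I' = 1
digitValue 'V' = 5
digitValue 'X' = 10
digitValue 'L' = 50
digitValue 'C' = 100
digitValue 'D' = 500
digitValue 'M' = 1000
digitValue _   = 0  -- any non-numeral character counts as nothing

main :: IO ()
main = print (map digitValue "MMIV")  -- [1000,1000,1,5]
```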
Test-driven design and functional programming: the whole world of TDD is based on either minimizing side effects or making them explicit. Well, if you program side-effect-free, you should have programmatic support for it, i.e., use a functional language, which has all sorts of implications for behind-the-scenes implementation. And if you rely on side effects, the world of unit-testing tools is in conflict with language provisions for visibility (although in .NET, at least, you can get around visibility with sufficient security permissions).
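For instance (a minimal sketch of my own, in Haskell, since that's the kind of functional language the argument points at): a side-effect-free function needs no mocks, no fixtures, and no visibility hacks; calling it is the test.

```haskell
import Control.Monad (unless)

-- Pure: same input, same output, every time -- so a unit test is
-- nothing more than a function call compared against an expectation.
wordCount :: String -> Int
wordCount = length . words

main :: IO ()
main = do
  let checks =
        [ wordCount ""           == 0
        , wordCount "one"        == 1
        , wordCount "two  words" == 2
        ]
  unless (and checks) (error "wordCount: a check failed")
  putStrLn "all checks passed"
```

No test harness had to pry open private state to do that; the side-effect-free contract did all the work.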
Typing, multithreading, resource management: in all of these areas, there's an enormous gap between standard and best practices. Just as managed memory and built-in exception mechanisms are, for most programmers, effective "solutions" to common problems, these things should be part of the development/deployment infrastructure.
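A sketch of what "part of the infrastructure" could mean for resource management, again in Haskell and again my own example rather than anything from the original thread: acquisition and release are bound together by the language, so cleanup happens even when the code in between throws, just as memory is collected without the programmer freeing it.

```haskell
import Control.Exception (bracket)
import System.IO (IOMode (WriteMode), hClose, hPutStrLn, openFile)

main :: IO ()
main =
  bracket
    (openFile "scratch.txt" WriteMode)  -- acquire the resource
    hClose                              -- release runs even on exception
    (\h -> hPutStrLn h "cleanup is the runtime's job, not the caller's")
```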
Sheesh, I haven't even started on the issue of multiple representations and semi-structured data....
What's wrong with today's languages? Everything.