Does computational power add to programmer performance? I said no in this post, and Josh sarcastically agreed: "I agree whole-heartedly, if you can be an engineering-puritan and use a simple text editor and compiler, without complex build/make/ant scripts which do far more than just compile. It seems to me that with the evolution of the industry itself, software programming is much more than that now. I can't imagine a programmer today (well, I suppose I don't want to anyway) who builds an application with the luxury of not having to actually test it. The hardware requirements to run the necessary testing/environment software are what really start to cost, in my experience. In my case, I have to test my software with 3rd-party (sometimes homemade) stress-testing and unit-testing software--and these are not thin. With web/network applications, there's also the overhead of running a .NET or J2EE app/web server."
Well, first, I think that the percentage of programmers who actually follow the best practice of executing a test suite as part of their compile cycle is something far short of 10% of the programming populace. But I'll concede the point that the hyphen in the compile-debug process is a place where a faster computer can be felt. Still, I ask: how much productivity gain do you get from a four-fold increase in processing power? A 2GHz machine versus a 500MHz one (say) -- what productivity increase would you expect to see?
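To make my skepticism concrete, here's a back-of-the-envelope sketch (my illustrative numbers, not Josh's): only the machine-bound slice of a programmer's day gets faster, so the overall gain is bounded by how big that slice is. The 15% figure below is a hypothetical assumption, purely for illustration.

    # Back-of-the-envelope estimate (illustrative assumptions, not measurements):
    # if only a fraction of a programmer's time is spent waiting on the machine,
    # a 4x faster CPU only speeds up that fraction (an Amdahl's-law-style bound).

    def overall_speedup(machine_bound_fraction: float, cpu_speedup: float) -> float:
        """Overall productivity multiplier when only part of the work scales with CPU."""
        return 1.0 / ((1.0 - machine_bound_fraction) + machine_bound_fraction / cpu_speedup)

    # Hypothetical numbers: 15% of the day waiting on compile/test; 2GHz vs 500MHz ~ 4x.
    gain = overall_speedup(machine_bound_fraction=0.15, cpu_speedup=4.0)
    print(f"Overall gain: {gain:.2f}x")  # ~1.13x, i.e. roughly a 13% improvement

Under those assumptions, quadrupling the clock speed buys you something like a tenth more output, not four times more -- which is the gap I'm getting at.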