First of all, happy New Year!
Now, for the first post of the year, I would like to quote the great
Donald Knuth: "We should forget about small efficiencies, say about 97% of the time:
premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code;
but only after that code has been identified."
I've been advocating this for a long time, but I keep stumbling upon developers who disregard this rule of thumb. When faced with performance issues, most tend to rely on intuition: read the source code and try to make informed guesses about which code is critical. Most forget about code profilers, which can accurately identify the critical code. I believe this is not only their fault, but also a fault of the education system, which has failed to teach and emphasize the relevance of such tools.
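To illustrate what a profiler gives you, here is a minimal sketch using Python's built-in cProfile (the post doesn't tie itself to any language or tool, and both functions below are hypothetical stand-ins for real application code): the function a reader might guess is slow is not the one that dominates the runtime.

```python
# Minimal sketch: let a profiler find the critical code instead of guessing.
# cProfile/pstats are from the Python standard library; parse_records and
# aggregate are hypothetical examples, not code from the post.
import cProfile
import pstats


def parse_records(n):
    # Code a reviewer might *guess* is slow (it isn't).
    return [str(i).zfill(8) for i in range(n)]


def aggregate(records):
    # Code that actually dominates the runtime (quadratic scan).
    total = 0
    for r in records:
        total += records.count(r)
    return total


def main():
    records = parse_records(5000)
    return aggregate(records)


if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    main()
    profiler.disable()

    # Sort by cumulative time so the true hot spot rises to the top.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```

Sorting by cumulative time puts the real hot spot (the quadratic aggregate in this sketch) at the top of the report, no guessing required.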
Here's another quote (from
Continuous Delivery): "Don't guess; measure". My experience tells me that, most of the time, our informed guesses miss the real culprits of our performance issues. So the best approach is to clearly identify the culprits and deal with them. What's the point of spending time fixing one issue when another issue, by itself, may represent a 50% improvement? We rarely have the time to polish our code until we deem it as good and as fast as we'd like, so in such situations a programmer should be pragmatic (which I believe to be one of the most important qualities in a programmer; by the way, if you haven't read
The Pragmatic Programmer yet, it's time to start thinking about it).
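And to make "Don't guess; measure" concrete, here is a small sketch of timing a proposed change against the current code before deciding it is worth pursuing; again, Python's standard timeit is just an illustration, and both implementations are hypothetical.

```python
# Hedged sketch of "Don't guess; measure": time the current implementation
# against the proposed optimization before committing to it. Both functions
# are hypothetical examples, not code from the post.
import timeit

data = list(range(100_000))


def baseline():
    # Hypothetical current implementation.
    total = 0
    for x in data:
        total += x
    return total


def candidate():
    # Hypothetical proposed optimization.
    return sum(data)


t_baseline = timeit.timeit(baseline, number=200)
t_candidate = timeit.timeit(candidate, number=200)
print(f"baseline : {t_baseline:.3f}s")
print(f"candidate: {t_candidate:.3f}s")
print(f"speedup  : {t_baseline / t_candidate:.1f}x")
```

If the measured speedup is negligible, that time is better spent on the issue that actually shows up at the top of the profile.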
Happy coding in 2013!