
Is premature optimization really the root of all evil?

It's important to keep in mind the full quote:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.

What this means is that, in the absence of measured performance issues, you shouldn't optimize just because you think you will get a performance gain. There are obvious optimizations (like not doing string concatenation inside a tight loop), but anything that isn't a trivially clear optimization should be avoided until it can be measured.
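
As a hypothetical illustration of the "obvious" case mentioned above (function names and sizes are invented for the example): repeated string concatenation in a loop re-copies the accumulated string on every iteration, while collecting the pieces and joining once does the work in a single pass.

    def build_slow(parts):
        result = ""
        for piece in parts:
            result += piece        # re-copies everything accumulated so far, every iteration
        return result

    def build_fast(parts):
        return "".join(parts)      # one pass over all the pieces at the end

    pieces = ["x"] * 100_000
    assert build_slow(pieces) == build_fast(pieces)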

The biggest problems with "premature optimization" are that it can introduce unexpected bugs and can be a huge time waster.

I'm surprised that this question is five years old and yet nobody has posted more of what Knuth had to say than a couple of sentences. The couple of paragraphs surrounding the famous quote explain it quite well. The paper being quoted is called "Structured Programming with go to Statements", and while it's nearly 40 years old, is about a controversy and a software movement that no longer exist, and uses examples in programming languages that many people have never heard of, a surprisingly large amount of what it says still applies.

Here's a larger quote (from page 8 of the pdf, page 268 in the original):

The improvement in speed from Example 2 to Example 2a is only about 12%, and many people would pronounce that insignificant. The conventional wisdom shared by many of today's software engineers calls for ignoring efficiency in the small; but I believe this is simply an overreaction to the abuses they see being practiced by penny-wise-and-pound-foolish programmers, who can't debug or maintain their "optimized" programs. In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal; and I believe the same viewpoint should prevail in software engineering. Of course I wouldn't bother making such optimizations on a one-shot job, but when it's a question of preparing quality programs, I don't want to restrict myself to tools that deny me such efficiencies.

There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail.

Another good bit from the previous page:

My own programming style has of course changed during the last decade, according to the trends of the times (e.g., I'm not quite so tricky anymore, and I use fewer go to's), but the major change in my style has been due to this inner loop phenomenon. I now look with an extremely jaundiced eye at every operation in a critical inner loop, seeking to modify my program and data structure (as in the change from Example 1 to Example 2) so that some of the operations can be eliminated. The reasons for this approach are that: a) it doesn't take long, since the inner loop is short; b) the payoff is real; and c) I can then afford to be less efficient in the other parts of my programs, which therefore are more readable and more easily written and debugged.
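
A hypothetical sketch of that "inner loop phenomenon" (the function and data here are invented for illustration): restructure the code so that invariant work is hoisted out of the hot loop and some operations simply disappear from it.

    import math

    def distances_naive(points, origin):
        # unpacks origin and looks up math.sqrt on every iteration
        return [math.sqrt((x - origin[0]) ** 2 + (y - origin[1]) ** 2)
                for x, y in points]

    def distances_hoisted(points, origin):
        ox, oy = origin        # invariant unpacking hoisted out of the loop
        sqrt = math.sqrt       # attribute lookup done once instead of per point
        return [sqrt((x - ox) ** 2 + (y - oy) ** 2) for x, y in points]

Whether the hoisted version is actually faster should, of course, be measured; the point is only that the loop body now contains fewer operations.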


Premature micro-optimizations are the root of all evil, because micro-optimizations leave out context. They almost never behave the way they are expected to.

Some good early optimizations, in order of importance:

  • Architectural optimizations (application structure, the way it is componentized and layered)
  • Data flow optimizations (inside and outside of the application); see the sketch after this list
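
As a hypothetical sketch of a data-flow optimization (the fetch functions are assumed stand-ins for whatever I/O layer is in use): fetching records in one batched call avoids a round trip per item.

    def load_profiles_chatty(user_ids, fetch_user):
        # one database/network round trip per id
        return [fetch_user(uid) for uid in user_ids]

    def load_profiles_batched(user_ids, fetch_users):
        # a single round trip for the whole batch
        return fetch_users(user_ids)

The win here comes from how data moves through the system, not from any single line of code, which is why it pays off so early.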

Some mid-development-cycle optimizations:

  • Data structures: introduce new data structures that have better performance or lower overhead if necessary (a sketch follows below)
  • Algorithms (now it's a good time to start deciding between quicksort3 and heapsort ;-) )
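
A hypothetical sketch of such a data-structure change (names invented for the example): membership tests against a list cost O(n) each, so converting the haystack to a set once drops the loop from O(n·m) to roughly O(n + m).

    def find_common_slow(needles, haystack_items):
        return [n for n in needles if n in haystack_items]   # linear scan per needle

    def find_common_fast(needles, haystack_items):
        haystack = set(haystack_items)                       # one-time conversion
        return [n for n in needles if n in haystack]         # average O(1) lookup per needle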

Some end-of-development-cycle optimizations:

  • Finding code hotspots (tight loops that should be optimized)
  • Profiling-based optimizations of the computational parts of the code (see the profiling sketch after this list)
  • Micro-optimizations, which can be done now because they are done in the context of the whole application and their impact can be measured correctly
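
As a hypothetical sketch of profiling-guided work using the standard library's cProfile (process_records is an invented placeholder workload): measure first, then decide what is worth optimizing.

    import cProfile
    import pstats

    def process_records(records):
        return sorted(sum(r) for r in records)

    data = [list(range(100)) for _ in range(10_000)]

    profiler = cProfile.Profile()
    profiler.enable()
    process_records(data)
    profiler.disable()

    # show the ten most expensive calls by cumulative time
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)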

Not all early optimizations are evil. Micro-optimizations are evil if done at the wrong time in the development life cycle, because they can negatively affect architecture, hurt initial productivity, be irrelevant performance-wise, or even have a detrimental effect at the end of development due to different environment conditions.

If performance is a concern (and it always should be), think big. Performance is about the bigger picture, not about questions like "should I use int or long?". Work top-down on performance rather than bottom-up.
