Writing software for 10+ years. Eschew hype; focus on performance.

Living in Switzerland 🇨🇭 since 2017.

"Premature optimization is the root of all evil" is bunk

Whenever the subject is performance, I've heard the famous quote time and time again, from various sources and at every level of the organization:

"Premature optimization is the root of all evil." -- Donald Knuth, 1974

Usually said by two kinds of people, in my experience:

  1. Those who don't care about performance, or think performance is a hardware + compiler problem, not theirs
  2. Those who care more about business value, and consider thinking "excessively" about performance a detriment to that value

I want to debunk both of those ideas, and also explore the meaning of the original quote, in context.

Knuth's quote

I encourage you to read the original quote in the context of its paper.

Structured Programming with go to Statements, by Donald E. Knuth, 1974

Here's a large excerpt. I've bolded the famous quote; the italics are from the original.

There is no doubt that the grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified. It is often a mistake to make a priori judgments about what parts of a program are really critical, since the universal experience of programmers who have been using measurement tools has been that their intuitive guesses fail. After working with such tools for seven years, I've become convinced that all compilers written from now on should be designed to provide all programmers with feedback indicating what parts of their programs are costing the most; indeed, this feedback should be supplied automatically unless it has been specifically turned off.

Lex Fridman also interviewed Knuth on his podcast, and they touch on this quote; it's worth a listen.

What did Knuth mean?

Well, he certainly didn't mean "don't optimize on the first pass". He said: don't optimize tiny parts of your code (like a single loop) unless you know it's a so-called "hot spot".

He didn't say "don't think about performance".

Computers back then, compared to modern ones, were like snails next to a hyperspace spaceship. For context, here's the cutting edge in computing back then:

Mini-computers, these bad boys don't even have a screen. I honestly don't know how you'd interact with them. And notice they're an entire desk, not mini at all.

Those bad boys had kilobytes, I repeat, KILOBYTES of memory. You couldn't buffer even a fraction of a second of a modern YouTube video with that. (YouTube streams 4K video at around 20 Mbps, i.e. roughly 2.5 MB per second.)

Similarly the cutting-edge mainframes had 1-4 MB of memory.

Floppy disks weren't around yet, if you even know what those are. I barely got to see them in my childhood in the early 2000s in Mexico, so I know how they feel in the hand.

The average program you download from the web nowadays is like 300 MB for just the binary, to say nothing of the monstrous amounts of memory it tends to require.

The point I'm trying to make is, back then you couldn't say "don't think about performance" and mean it, because computers were damn slow. You couldn't achieve things if you didn't think about performance somewhat.

But what was a problem was people spending hours optimizing some small portion of the code that they didn't even know needed optimization!

You would be a fool, however, to not think about the overall program's structure.

Premature abstraction

The pendulum swing got hit with an UNO reverse card. It was supposed to swing back towards optimizing everything, but it didn't.

Knuth's line is still quoted 50 years later, although somewhat out of context.

Yet somehow people manage to do exactly what this quote says not to do, and they do it even harder, while convincing themselves that they're not doing it.

What did the Java and OOP boom bring? Clean Code(tm), SOLID, the Gang of Four. And absolute zealots who don't know how to solve the problem at hand, but would sure love to solve an imaginary problem in the future.

I'm talking about premature abstraction. It's another form of premature optimization.

I've never seen somebody prematurely think about performance. It's always at precisely the right moment (the first time you're thinking about the problem), or too late. But never prematurely.

I have very often seen people think about how to abstract things into an interface before they've even made prototypes to find out if their solution even solves the problem. I rarely see prototypes, for that matter. There's always a time crunch, not enough time to find out if what we're doing is what we need to do.

Now I want to address the two kinds of people I brought up in the introduction of this blog post.

Performance is a hardware + compiler problem

My only question is: says who?

Do the experts, the guys that actually have to write performant code, do they say that?

Or is it the dude who writes super slow code?

Talk with or listen to any programmer whose job involves high performance code. High performance meaning "it needs to run at 60 FPS or our customers will complain".

Most web people are disqualified, because they often can't even keep 30 FPS on a page with static text, even with millions in revenue behind them (GitHub PR diff pages, for example). They leave the performance to the hardware + compiler; they didn't take it upon themselves to do it.

How about JIRA? Super old software, and Atlassian had $4.4 BILLION in revenue in 2024. Still slow, AND poorly designed to boot. They leave the performance up to the compiler and the hardware. How do I know? Because a list of tickets, or a single ticket, shouldn't take more than 10 ms to load from a database (the ticket labels are indexed, for crying out loud), yet they take seconds.

Next, how about the fact that if you program how you would like to program (or how Clean Code(tm) encourages you to), your compiler can't avoid the machine code jumping through memory 5 times just to execute 1 + 1 (your business logic). Memory access is the slowest part of the machine, and OOP and its sister practices are practically optimized to cause cache misses.

What do the people that write fast software, say about this issue? Listen to them.

Thinking "excessively" about performance is a detriment to business value

I totally agree, if by "excessively" you mean writing Assembly when you just need to render some stuff on the screen. And yet: the ffmpeg folks being obsessively excessive about performance, writing pure Assembly, is how the web runs right now. You cannot have modern video hosting like YouTube, or social media like Instagram or TikTok, without ffmpeg. Those guys are obsessed.

They don't even get paid or sponsored much for their work. It's taken for granted.

If what you mean by excessive is "I just need a todo list app, why are you writing it in Assembly?", then yeah, I'm with you.

If what you mean is not even being aware of what is slow and what is fast, then that's just ignorance.

Casey Muratori already did a write-up he called Performance Excuses Debunked, where he shares many, many business cases where it turns out, oops, thinking about performance from the beginning would've been cheaper!

You "don't need" performance, until you do, at which point you need a total rewrite, because performance is rarely a hot-spot issue; it's usually an architectural one.

It would be cheaper if programmers were simply educated in how to write fast code on the first pass, so they didn't have to think about it much; they'd just do it "pretty much right" the first time. This isn't about predicting the future; it's about being good programmers.

Conclusion

"Premature optimization is evil" is a bunk concept: the people who say it often prematurely abstract; Knuth didn't quite mean it the way people use it; and if you don't think about performance at all, you are simply doing a poor job, as a programmer and as a business.

Who cares if Knuth said it, if it doesn't work?

Too many words. I hope you found them insightful. You might notice I have a lot to say to people who have wasted 20 years of Moore's Law by writing shitty code.

I swear, old computers running old software felt faster than modern software on modern hardware. And I know it's not the hardware's fault; the hardware folks are desperately trying to keep up with how shitty software is becoming.