The most powerful optimization technique in any programmer's toolbox is to do nothing.

We can get mathematically specific about this. It is almost
never worth doing optimizations that reduce resource use by merely a
constant factor; it's smarter to concentrate effort on cases in which
you can reduce average-case running time or space use from
O(*n*²) to O(*n*) or O(*n* log *n*),^{[112]} or similarly reduce from
a higher order. Linear performance gains tend to be rapidly swamped by
Moore's Law.^{[113]}
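
The sort of gain worth pursuing can be sketched concretely. Below is a
hypothetical illustration (the function names are mine, not from the
text): two ways to ask whether a list contains a duplicate, one
O(*n*²) and one O(*n*) on average.

```python
def has_duplicates_quadratic(items):
    # O(n^2): compares every pair of elements.
    for i, a in enumerate(items):
        for b in items[i + 1:]:
            if a == b:
                return True
    return False

def has_duplicates_linear(items):
    # O(n) on average: each set lookup and insert is
    # (amortized) constant time.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

No amount of micro-tuning the first version closes the gap with the
second once the input grows; the asymptotic change is what pays.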

Another very constructive form of doing nothing is to not write
code. The program can't be slowed down by code that isn't there. It
can be slowed down by code that *is* there but
less efficient than it could be — but that's a different
matter.

^{[112] }For readers unfamiliar with O
notation, it is a way of indicating how the average running time of an
algorithm changes with the size of its inputs. An O(1) algorithm runs
in constant time. An O(*n*) algorithm runs in a time that is predicted
by `A n + C`, where A and C are unspecified constants; what matters is
that the running time grows linearly with *n* rather than, say, as its
square.

^{[113] }The eighteen-month doubling time usually quoted for
Moore's Law implies that you can collect a 26% performance gain
(2^{6/18} ≈ 1.26) just by buying new hardware in six months.