A colleague of mine today committed a class called ThreadLocalFormat, which basically moved instances of Java Format classes into a thread local, since they are not thread safe and "relatively expensive" to create. I wrote a quick test and calculated that I could create 200,000 instances a second, then asked him whether he was creating that many, to which he answered "nowhere near that many". He's a great programmer, and everyone on the team is highly skilled, so we had no problem understanding the resulting code, but it was clearly a case of optimizing where there was no real need. He backed the code out at my request. What do you think? Is this a case of "premature optimization", and how bad is it really?
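For context, a minimal sketch of what a class like the one described might look like. The class name comes from the question; the date pattern and method names here are assumptions for illustration. Each thread lazily gets its own SimpleDateFormat, since SimpleDateFormat is documented as not thread safe:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

// Hypothetical reconstruction of the ThreadLocalFormat idea: cache one
// SimpleDateFormat per thread instead of creating a new one per call.
public class ThreadLocalFormat {
    // Each thread that calls get() receives its own instance,
    // so no instance is ever shared across threads.
    private static final ThreadLocal<SimpleDateFormat> FORMAT =
        ThreadLocal.withInitial(() -> new SimpleDateFormat("yyyy-MM-dd"));

    public static String format(Date date) {
        return FORMAT.get().format(date);
    }

    public static void main(String[] args) {
        // Epoch rendered in the default time zone; exact value varies by zone.
        System.out.println(format(new Date(0L)));
    }
}
```

Note that on modern Java the simpler fix is `java.time.format.DateTimeFormatter`, which is immutable and thread safe, so no per-thread caching is needed at all.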
Design – Is premature optimization really the root of all evil?
architecture, design, optimization, quality-attributes
Related Topics
- SIMD Programming – Maintenance Cost of SIMD Programming Code Base
- Optimization – Get All Data vs Get Partial Data Optimization
- Domain Driven Design – Can Two Aggregates Have the Same Root?
- Performance Optimization – Dealing with Misconceptions About ‘Premature Optimization is the Root of All Evil’
- C++ – Is Passing Arguments as Const References Premature Optimization?
Best Answer
It's important to keep in mind the full quote:

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%." (Donald Knuth)
What this means is that, in the absence of measured performance issues, you shouldn't optimize because you *think* you will get a performance gain. There are obvious optimizations (like not doing string concatenation inside a tight loop), but anything that isn't a trivially clear optimization should be avoided until it can be measured.
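To illustrate the "obvious optimization" mentioned above: repeated string concatenation in a loop is quadratic, because each `+=` copies the entire accumulated string, while StringBuilder appends in place. A small sketch (class and method names are illustrative):

```java
// Both methods build the same string "01234..."; only the cost differs.
public class ConcatDemo {
    static String concatNaive(int n) {
        String s = "";
        for (int i = 0; i < n; i++) {
            s += i;          // copies the whole string on every iteration: O(n^2)
        }
        return s;
    }

    static String concatBuilder(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append(i);    // appends into a growable buffer: amortized O(1)
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(concatNaive(5));    // prints "01234"
        System.out.println(concatBuilder(5));  // prints "01234"
    }
}
```

This is the kind of change that needs no profiler: the asymptotic difference is visible from the code itself.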
The biggest problems with "premature optimization" are that it can introduce unexpected bugs and can be a huge time waster.
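The "quick test" from the question, measuring how many Format instances can be created per second, can be sketched roughly like this (the exact methodology the asker used is not given; this is one plausible version, and the numbers will vary by machine):

```java
import java.text.SimpleDateFormat;

// Rough throughput test: how many SimpleDateFormat instances can this
// machine create in one second? Measure first, optimize second.
public class CreationRate {
    public static long instancesPerSecond() {
        long deadline = System.nanoTime() + 1_000_000_000L; // run for ~1 second
        long count = 0;
        while (System.nanoTime() < deadline) {
            new SimpleDateFormat("yyyy-MM-dd"); // the "relatively expensive" object
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(instancesPerSecond() + " instances/second");
    }
}
```

A crude loop like this is enough to sanity-check a claim of "expensive to create"; for trustworthy numbers (warm-up, dead-code elimination), a harness such as JMH is the usual tool.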