izard: (Default)
[personal profile] izard
When I start a new software optimization project for a new customer, there is about a 60% chance that I can perform a very easy piece of magic: run VTune on the working system, spot an easy-to-fix bottleneck, implement or suggest a quick fix, and get a 1.5x-10x speedup. If the speedup is not good enough, I keep working to find a few more easy-to-fix places, getting an extra 1.1-1.2x speedup from each. After that it becomes difficult to gain extra performance: each incremental step yields just a 1-2% improvement and requires days or weeks of careful work. Showing "easy magic" is the most spectacular part of performance work, and a too-easy way to impress the customer's management. However, there is no reason to be proud of it.
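A toy illustration (my own hypothetical example, not from any real customer project) of the kind of one-line hotspot that produces such dramatic speedups: a linear membership scan inside a hot loop, which a profiler like VTune flags immediately, and which a one-line switch to a hash-based container fixes.

```python
import time

def dedupe_slow(items):
    """O(n^2): membership test scans the `seen` list on every
    iteration -- the kind of hotspot a profiler flags at once."""
    seen = []
    out = []
    for x in items:
        if x not in seen:      # linear scan, grows with len(seen)
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    """Same logic; the one-line fix is a set with O(1) lookups."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:      # hash lookup, constant time
            seen.add(x)
            out.append(x)
    return out

if __name__ == "__main__":
    data = list(range(20000)) * 2
    t0 = time.perf_counter()
    slow = dedupe_slow(data)
    t1 = time.perf_counter()
    fast = dedupe_fast(data)
    t2 = time.perf_counter()
    assert slow == fast
    print(f"slow: {t1 - t0:.3f}s  fast: {t2 - t1:.3f}s")
```

On a list of this size the set version is typically orders of magnitude faster, which is exactly the shape of a "run profiler, fix one line, report a big multiplier" win.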

This "easy magic" is possible only when performance was never a priority for the development team (which is an acceptable approach for most real-world projects). About 30% of my customers treat performance as one of their key design metrics and allocate roughly 1 engineer in 20 to work on the system's performance full time (more often it is a role shared by several of the most experienced developers). When I start working with this kind of customer, showing them "easy magic" is usually impossible, so I start from step 2: find several ways to get a 1.1x improvement, spending from a few days to two weeks on each. This is possible because I have worked on software performance issues full time for the last 6 years and have already encountered most types of bottleneck in existence. Another reason is that I remember quite well what is written in the Intel optimization manuals for all recent CPU architectures.
My first mentor in software performance, Chris Elford, told me that optimizing software is like solving puzzles. However, the way I feel now, I am more of a database than a puzzle solver.

Several times there has been a software performance guru on a team I was trying to help with performance. In that case I join them and we go straight to step 3: spend weeks finding ways to gain a few extra percent of execution speed (or latency, response time, or whatever the performance metric is). Here the local performance expert on the team I am consulting is always in a winning position: he is an expert in both their software and performance, and is usually smarter than me. The only things I bring are familiarity with a tool set and knowledge of some internal CPU details.

The reason I am writing this post is that tomorrow I have a deadline for an optimization project where the local expert is smarter than me. And he knows all the right tools (I showed him my favorite tools on our previous optimization project). So this will be the first time I fail a project completely.

Usually when I talk with customers (especially management) I try to pretend to be an expert in software performance. In fact I just remember a few recipes, have a bit of luck, and know the right combination of tools. Perhaps it is impostor syndrome?
